ansh: I'm just looking at this with monkey now, in the office
ansh
alastairp: I found some such cases in the database. I'll find them again
alastairp
the one thing that we both saw at first is that the 3 sections (list of reviews, list of this entity in the other db, list of related entities) are difficult to visually separate
I think the best solution here would be to add some more whitespace and open a ticket for aerozol to make it look better
ansh
Yes, that would fix the issue for now
alastairp
I'm going to open a ticket
for now let's not overthink it, and just add some more whitespace, and leave it as-is
so that we can merge it
yvanzo: hi! I want to do a push for updating translations on CB after all of ansh's work. should we remain with transifex for now?
in this case, the "this entity also appears" is currently in the table, as `<tr><td colspan=3></td></tr>`
rather than your version which has a separate table per list of reviews
and it also explicitly says "no reviews" in the case there aren't any, which was one of the things that made the earlier Ann Leckie example a bit confusing as to where each section starts and ends
one thing we're not sure about is what it will look like if a) the main entity you are viewing has no reviews, but a related one does, or b) neither the main entity nor any related entity has reviews
monkey
^what he said
It turns out we're too smart for our own good and your way of doing it is both simpler and more future-proof (with the CB-442 redesign we will soon lose the table elements)
reosarevok: Your track example was really great. I just discovered that there is an undocumented limit on the size of the query parameter sent to the Spotify API which, if exceeded, results in a 404
144 characters max
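For illustration, a minimal sketch of guarding against that undocumented limit when calling Spotify's search endpoint; the function name and the truncation fallback are assumptions, not the actual crawler code:

```python
# Minimal sketch (assumed code): keep the Spotify search `q` parameter under
# the observed ~144-character limit so an over-long query doesn't 404.
import requests

SPOTIFY_SEARCH_URL = "https://api.spotify.com/v1/search"
MAX_QUERY_LENGTH = 144  # observed limit, not documented by Spotify


def search_track(token: str, artist: str, track: str) -> dict:
    query = f"artist:{artist} track:{track}"
    if len(query) > MAX_QUERY_LENGTH:
        # crude fallback: drop the artist filter and keep only the track name
        query = f"track:{track}"[:MAX_QUERY_LENGTH]
    response = requests.get(
        SPOTIFY_SEARCH_URL,
        params={"q": query, "type": "track", "limit": 10},
        headers={"Authorization": f"Bearer {token}"},
    )
    response.raise_for_status()
    return response.json()
```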
reosarevok
hah!
lucifer
mayhem: deployed the album crawler and, surprisingly, nothing pending...
mayhem
so if I listened to stuff now it should start crawling them?
mayhem plays more music
huh, but listens should come in and start causing items to be downloaded. but that hasn't happened in a few minutes, so something seems off.
lucifer
even before I took it off, the artist id crawler only had 1 or 2 ids in pending
for approx 30 mins or so.
mayhem
ok, I guess it fetches everything and then goes back to sleep.
maybe print out how many albums have been fetched since start?
lucifer
can see that in redis.
118 so far in 5 mins
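As a hedged sketch of the counter being discussed here, something along these lines would work; the Redis key name is an assumption, not the real one:

```python
# Hypothetical sketch of the Redis counter; the key name is assumed.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# crawler side: bump the counter each time an album is fetched
r.incr("spotify_crawler:albums_fetched")

# monitoring side: read the running total ("118 so far in 5 mins")
print(r.get("spotify_crawler:albums_fetched"))
```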
mayhem
question: if you find an artist id, do you automatically download all albums for that artist?
this is already getting stored in grafana as well, if you want to graph it
oh well, the fanout setup is wrong which broke the mapper.
heyarne[m], thanks for the report. will fix.
heyarne[m]
🌄 amazing
lucifer
mayhem: my understanding of fanout messages was incomplete. we can reuse the exchange but still need a unique name for this queue to ensure both the mapper and the cache get the listens. "spotify_metadata" for a new queue name?
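A sketch with pika of the point being made: on a fanout exchange every consumer needs its own queue, otherwise the mapper and the metadata cache would compete for the same messages. The exchange name and the mapper queue name below are assumptions; "spotify_metadata" is the queue name proposed above.

```python
# Sketch (assumed names): one fanout exchange, one queue per consumer,
# so both the mapper and the metadata cache receive a copy of every listen.
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# the shared fanout exchange that listens are published to (name assumed)
channel.exchange_declare(exchange="incoming_listens", exchange_type="fanout")

# each consumer binds its own durable queue and gets its own copy
for queue_name in ("listen_mapper", "spotify_metadata"):
    channel.queue_declare(queue=queue_name, durable=True)
    channel.queue_bind(exchange="incoming_listens", queue=queue_name)
```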
heyarne[m]: can you try listening to that song again and see if it gets mapped now?
alastairp: monkey: I think keeping the extra whitespace would be good for now. The prototype looks good, but I am also not fully sure about those two cases, because if there are no reviews for the main entity, it would be kinda unclear for the users
Should we say 'No reviews' for the related entities btw?
heyarne[m]
<lucifer> "heyarne: can you try listening..." <- Yup! That was fast
lucifer
awesome!
mayhem
lucifer: that sounds fine to me.
`Pending IDs in Queue: 211652`
oy, that is clearly working ok.
lucifer
mayhem: yeah but something still seems off, 2000 ids in 30 mins is clearly too slow.
mayhem
alastairp: do you have a current email for Paul Lamere?
fetched or discovered ids?
lucifer
fetched
alastairp
mayhem: the only one I have in my inbox is @echonest.com, which I suspect no longer exists
lucifer
discovered is > 200k
mayhem
alastairp: ok, I'll dm him on twitter, that should work.
not sure what the rate limits are, lucifer, so hard to say.
are we fetching one album at a time or multiples at a time?
lucifer
1 album at a time.
mayhem
there is an endpoint for fetching multiples in one call -- let's use that.
lucifer
mayhem: yes, tracks are cut off after 50. I guess we could detect that and do separate queries in that case though.
mayhem
Yes, a pita, but bound to be faster.
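A hedged sketch of the batching idea: Spotify's "Get Several Albums" endpoint accepts up to 20 IDs per call, and the embedded track listing is cut off at 50 items, so longer albums need follow-up requests to the album-tracks endpoint. The function name and batching strategy are assumptions, not the real crawler code.

```python
# Assumed sketch: batch album lookups 20 at a time and page past the
# 50-track cut-off in the embedded track listing.
import requests

API_BASE = "https://api.spotify.com/v1"


def fetch_albums(token: str, album_ids: list[str]) -> list[dict]:
    headers = {"Authorization": f"Bearer {token}"}
    albums = []
    # batch IDs 20 at a time instead of one request per album
    for i in range(0, len(album_ids), 20):
        batch = album_ids[i:i + 20]
        resp = requests.get(
            f"{API_BASE}/albums",
            params={"ids": ",".join(batch)},
            headers=headers,
        )
        resp.raise_for_status()
        for album in resp.json()["albums"]:
            tracks = album["tracks"]["items"]
            # the embedded listing maxes out at 50 tracks; page the rest
            while len(tracks) < album["tracks"]["total"]:
                page = requests.get(
                    f"{API_BASE}/albums/{album['id']}/tracks",
                    params={"limit": 50, "offset": len(tracks)},
                    headers=headers,
                )
                page.raise_for_status()
                tracks.extend(page.json()["items"])
            album["tracks"]["items"] = tracks
            albums.append(album)
    return albums
```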
reosarevok
loujine, jesus2099 (and probably most other userscript makers): the new React relationship editors are out in beta. They'll be there for a while (longer than the usual week)
They'll almost certainly break several of your scripts - feel free to ask bitmap or me if you need help with something (well, bitmap knows a lot more since it's his code, but I can try to help)
Aheno: thanks! I did see, I'm just recovering from a little collection of injury and illness. But should get onto it today
Also fair warning - everything here usually moves at a slow pace. So I would get used to playing the loong game. And doing a lot of editing/scanning while you wait!
mayhem
lucifer: oh heh, lol. well, we'll find out soon enough why it is slow.