apparently the rust implementation listenbrainz-mpd opts to go around its elbow to get to its thumb by making an api request, when mpd already exports MBIDs in metadata so we can just query that instead, lol
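For context, MPD's protocol exposes MusicBrainz IDs as ordinary tags on `currentsong` when the underlying files carry them, so a scrobbler can read MBIDs locally. A minimal illustration against a canned response (the UUIDs are invented, and a real client would read this from the MPD socket rather than a string):

```shell
# Canned example of an MPD 'currentsong' reply; a real client reads this from
# the MPD socket. MUSICBRAINZ_* are standard MPD tag names; the UUIDs are fake.
response='file: music/track.flac
Title: Example Track
MUSICBRAINZ_ARTISTID: 11111111-1111-1111-1111-111111111111
MUSICBRAINZ_TRACKID: 22222222-2222-2222-2222-222222222222
OK'

# The MBIDs are right there in the metadata, no web API lookup required:
printf '%s\n' "$response" | grep '^MUSICBRAINZ_'
```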
monkey
"opts to go around its elbow to get to its thumb" <3
opal
the funnier thing here is how LB does do its own lookup server-side if that info is missing
so it kinda pisses me off that the rust impl is even doing that
mayhem
that page of LB enabled apps is finally no longer depressing. great to see that. :)
Well, I do hope they don't weigh the hand luggage :P But they barely ever do
outsidecontext
if it's still too much chocolate you just need to eat it right at the check-in until it fits
reosarevok
It is known
mayhem
swiss weighs them in ZRH -- at least prior to the pandemic.
there is a very simple solution: when you see the dude with the scale appear, walk to the next gate and watch for the actual boarding process to start. once people start boarding, the guy takes off, so then it's safe to return.
totes not speaking from experience, of course.
yvanzo
“anything to declare?” “just some brown powder, good product, cast into bars…”
reosarevok
yvanzo, bitmap: ping :)
yvanzo
pang
akshaaatt: thanks, I clarified it is about UI
opal
i had to pay $200 for a friend's overweight baggage simply because she was too disarrayed to, you know, pack a third bag
reosarevok
yvanzo: did you have anything in particular you wanted to check? :)
Or should we just look at stuff in general?
yvanzo
reosarevok: yes but it's not ready :x
reosarevok
haha
Well, we can wait a bit? How not-ready is it, an hour or a week? :D
yvanzo
let's review and merge as much as we can :)
if we freeze today and release on Monday, then we can merge anything during the summit.
Pratha-Fish
reosarevok: looks like there's an easy way out of the whole sql dumps :D
yvanzo
(unlikely to be today)
reosarevok
Sure
Pratha-Fish: oh?
Pratha-Fish
I can just dump the actual data from the tables instead of including their schema in the dump as well.
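The distinction described above maps onto pg_dump's `--data-only` flag. A sketch with placeholder database and table names; the commands are only echoed here, since no PostgreSQL server is assumed:

```shell
# Placeholder names; these commands are echoed, not executed.
schema_and_data='pg_dump -d musicbrainz_db -t area -t url -t l_area_url -f pg_area_dump.sql'
data_only='pg_dump --data-only -d musicbrainz_db -t area -t url -t l_area_url -f pg_area_data.sql'

echo "schema + data: $schema_and_data"
echo "data only:     $data_only"
```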
yvanzo
reosarevok: anything you particularly want us to check?
Pratha-Fish
The only issue is I wouldn't be able to test it on my current internet connection until around 2nd Oct
Focussing on the query and transformation scripts RN :D
reosarevok
yvanzo: I think "small things now, bigger things in person where we can hack at them" might make sense?
Pratha-Fish: it didn't import on an empty DB either?
Pratha-Fish
reosarevok: I didn't try importing into an empty db as it would just defeat the purpose of it all :D
reosarevok
Why?
I thought the idea was to have a basic area DB you could test stuff against
Pratha-Fish
like, just setting up the areas wouldn't have enabled the musicbrainz server to run on it. Yes, we could query areas from it, but adding and verifying areas would've been a mess
reosarevok
Hmm
Pratha-Fish
But now that you mention it, we don't really have to check if the add_area functionality works or not, as it's already being tested in the musicbrainz_bot
reosarevok
You'd still need to import things in the right order (area and url before l_area_url for example)
But maybe it'd work your way, might as well try I guess
Pratha-Fish
Let's give it a shot anyway :))
reosarevok
Just keep in mind that you want to add it in that order
area, url, link at least need to have the data in first
There might be ID conflicts though, but I guess you can try
Pratha-Fish
Sure, I'll try it out
Not sure if I'd be able to reorder all the SQL commands in the dump tho. It's too big to even open in VSCode 🥲
outsidecontext
is something up with the spotify import for LB? It hasn't imported any listens today so far for me
mayhem
disconnect your spotify service and reconnect, outsidecontext
we really need to have LB notifications so that we can tell users that there was a hiccup.
outsidecontext
I set it to "disabled" and back to "Activate both". let's see if this heals it
mayhem: all listens from today just got imported! thanks for the help
I'll remember to try this reconnect should this happen again
mayhem
`2023-09-26 14:44:14,673 listenbrainz.webserver INFO imported 50 listens for outsidecontext`
indeed. :P
Pratha-Fish
reosarevok: apparently the dump ain't importing even on an empty db. There's no way to run the schema commands sequentially as the file is just too large to edit
reosarevok
Did you try deactivating the foreign keys?
Pratha-Fish
Here's the error I am getting BTW
reosarevok: I tried the foreign key method as well, but apparently there isn't a single "foreign key" switch that turns off fk constraints for all tables
reosarevok: nothing much, only the invalid command \N errors throughout
The snippet I posted was just the tail of it
reosarevok
Apparently the first line of all is the real error, and then \N is crap (see stack)
admin/sql/DropFKConstraints.sql drops all FKs fwiw
(we have similar files for other constraints if needed)
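The drop-constraints route can be sketched end to end. `admin/sql/DropFKConstraints.sql` is from the chat; `CreateFKConstraints.sql` is the assumed re-creation counterpart; and the commands are echoed rather than run, since no MusicBrainz checkout or database is assumed:

```shell
# mb_area and pg_area_dump.sql are the names from the chat; commands are
# echoed, not executed, so this sketch runs without a database.
drop_fks='psql -d mb_area -f admin/sql/DropFKConstraints.sql'     # drop all FK constraints first
restore='psql -d mb_area -f pg_area_dump.sql'                     # import without FK ordering problems
create_fks='psql -d mb_area -f admin/sql/CreateFKConstraints.sql' # assumed re-creation counterpart

printf '%s\n' "$drop_fks" "$restore" "$create_fks"
```

With the FKs gone, the import order of area, url, link, and l_area_url no longer matters, which is the point of the detour.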
Pratha-Fish
nice!
reosarevok
Worst case scenario, by the 2nd we'll all be in BCN together and can try to figure out something for you, though
If none of all this works
Pratha-Fish
I'll try my best to figure it out if possible! But yes, if you guys can figure something out, it would be great :)
lucifer
Pratha-Fish: you need to use pg_restore to restore the dump
Pratha-Fish
I think I had also set up some other flags while making the dumps. Specifying the matching options on import could help too
lucifer: I read somewhere you have to restore .sql files with psql or something, as restoring with pg_restore didn't work the last time I tried
I'll try once again tho
lucifer
Pratha-Fish: the dump (at least the one from the command I shared) can't be restored using psql because it's not a plain SQL-statement dump. you need to create a different kind of dump for that
Pratha-Fish
lucifer: For this particular command: `pg_restore -d mb_area pg_area_dump.sql`
I am getting this error: `pg_restore: error: input file appears to be a text format dump. Please use psql.`
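That error is the whole story: plain-text dumps (pg_dump's default format) are replayed with psql, while only custom, directory, or tar format archives go through pg_restore. Custom-format archives start with the magic bytes `PGDMP`, which gives a quick way to tell the two apart. A self-contained sketch using fake files, so no database is needed (the file names are made up):

```shell
# Fake stand-ins for the two dump kinds (no PostgreSQL needed for this check).
printf 'PGDMP' > custom.dump                               # custom-format archives start with the PGDMP magic
printf -- '--\n-- PostgreSQL database dump\n' > plain.sql  # plain dumps are just SQL text

for f in custom.dump plain.sql; do
  if head -c 5 "$f" | grep -q 'PGDMP'; then
    echo "$f: custom format, restore with pg_restore"
  else
    echo "$f: plain SQL, restore with psql -f"
  fi
done
```

For a plain `.sql` dump like the one above, `psql -d mb_area -f pg_area_dump.sql` is the matching restore command, which is exactly what the error message is asking for.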