alastairp: also the PR is ready and up on test.lb for testing.
alastairp
set to false?
lucifer
yes
alastairp
lucifer: fyi, review submitted - not sure if you saw it
one bug for local development
reosarevok
yvanzo, bitmap: banners added on prod and beta
bitmap
moin
yvanzo
hi
lucifer
alastairp: oh duh, makes sense. how about also storing the musicbrainz_up value in a config variable and looking at that instead?
if item is missing or false, don't do anything.
alastairp
lucifer: sounds good. So if config entry is False we block the login page, and in all other cases if the engine is None we have the existing guards to just not get the data?
lucifer
alastairp: yup.
alastairp
perfect
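(A minimal sketch of the guard being agreed on here, assuming a Flask-style app and a config flag named IS_MUSICBRAINZ_UP; the names and templates are assumptions, not the actual PR code:)

```python
# Hypothetical sketch of the behaviour discussed above; names are assumptions.
from flask import Blueprint, current_app, render_template

login_bp = Blueprint("login", __name__)

@login_bp.route("/login/")
def login():
    # If the config entry is explicitly False, block the login page.
    if current_app.config.get("IS_MUSICBRAINZ_UP", True) is False:
        return render_template("mb-database-down.html"), 503
    return render_template("login.html")

def load_mb_metadata(engine, mbids):
    # Existing guard: if the MB database engine was never created,
    # return no data instead of erroring.
    if engine is None:
        return {}
    ...  # otherwise query the MusicBrainz database as before
```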
reosarevok
yvanzo, bitmap: when do you think we should start with aretha / patton?
also bitmap: could you document on that doc how to put stuff on read-only? ^ that way we can refer to it next year if you are not around for whatever reason
yvanzo
Then a couple hours.
bitmap
it's just setting DB_READ_ONLY to 1, but no need to do that this time
yvanzo
I’m reviewing the steps related to sir atm.
bitmap
I changed the steps so that we perform the backup before the upgrade (see my latest comment on the doc)
reosarevok
I understand that in case of a disaster, we'd lose the edit data that is not replicated?
bitmap
not really, since we save the last replication packet before applying the schema changes. it'd just be kinda annoying to replay those changes on the prod DB
reosarevok
bitmap: but edits aren't replicated, are they?
bitmap
oh right, I was thinking applied edits
reosarevok
Is it worth the risk?
How long is a dump?
I guess in aretha, long, given those discs...
bitmap
all of the edits would still be in the prod DB unless we delete those somehow
reosarevok
What is the worst possible disaster you can imagine? Having to reimport raw tags?
bitmap
a backup takes about 50 minutes to an hour
reosarevok
(which are not replicated so we'd lose an hour of them but probably not the worst)
Also, and it might be a stupid question: given we have pink and floyd, is there a way we can make a complete copy of pink, data and all, before upgrading, or is that absurd? docker black magic or something
bitmap
we don't touch the tag_raw tables at all so I'm not really worried about that
reosarevok
Oh, I guess, we just generate stuff based on them
So that's not an issue
I guess we could horribly corrupt artist_credit somehow
But that one is replicated
bitmap
well, we already ran the upgrade against a copy of the prod DB and artist_credit was fine
reosarevok
Sure :)
I'm just thinking "what's the worst that can possibly happen", I guess
yvanzo
zas: for moving aretha to ansible, do you need us to move docker volumes elsewhere?
zas
yvanzo: nope, it shouldn't be needed as aretha is already on 20.04, but it will require a reboot
lucifer
zas: when will aretha be rebooted?
bitmap
reosarevok: re: pink, if you mean copying the pgdata dir from the filesystem, it can't really be done because the files won't be consistent
unless there are no writes/replication happening
reosarevok
Ok :) I expected it wasn't a thing, was just wondering
zas
during the downtime, but I don't exactly know when. bitmap, when do you think it would be possible?
reosarevok
Quickly copying as soon as we start going down, and dumping it later if we make a mess, or something, dunno :D
Anyway, nvm
yvanzo
bitmap: Which time would be good to start rebuilding the search index? It is rather DB intensive.
lucifer
alastairp: updated PR with a new config variable.
alastairp
I saw it, looking
lucifer
ah great, thanks
bitmap
yvanzo: can it run against pink? I think soon after the upgrade should be fine
alastairp
lucifer: btw, the KEY_JSON template feels a bit scary to me, though I'm not sure the best way to deal with it
yvanzo
bitmap: yes, we should hold off sir-prod during that time.
bitmap
zas: I stopped json dumps and barman cron already, maybe we can do it now?
lucifer
alastairp: agreed. any particular issue with KEY_JSON though? it feels similar to other existing ones to me.
zas
bitmap: ok, I'll proceed then
reosarevok
bitmap: marked json-dump as down on the doc
bitmap
I also stopped musicbrainz-production-cron temporarily so it doesn't try to copy anything to aretha, but we can restart it for the 15 utc run
alastairp
I guess the trick is to work out how to have a field in the consul config file (which is JSON) that then turns into Python in the config file. I see why we did it, but I don't like it because it means that you have to interpret JSON inside the config file (even when you just want to read it, for example)
again, not important for today
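(For illustration, roughly the pattern being described, with placeholder values; the real key name and template mechanism may differ:)

```python
import json

# The consul template renders the value into the config file as a JSON-encoded
# string (placeholder content here), so anything that wants the actual data
# has to parse JSON on top of reading the config.
KEY_JSON = '{"type": "service_account", "project_id": "example-project"}'
KEY = json.loads(KEY_JSON)
```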
lucifer
i see, makes sense.
tested both locally and on test.
alastairp
lucifer: looks good. let's add IS_MUSICBRAINZ_UP to the config template anyway
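(Presumably something along these lines ends up in the config template, defaulting to up; the exact form may differ:)

```python
# Assumed shape of the new config entry: only an explicit False blocks the
# login page, everything else falls back to the engine-is-None guards.
IS_MUSICBRAINZ_UP = True
```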
lucifer
👍
alastairp
do you also want to add the labs checks to this PR?
lucifer
separate PR is better i think
alastairp
fine
can we re-enable MB database on test?
I want to log in, then disable MB again, then check that everything works
lucifer
i have deployed on beta as well just now
so can test that workflow there (i was going to do the same)
alastairp
I'm logged in on beta now, thanks
lucifer
yup beta working fine for me as well.
do you want to test new user sign up on beta as well before merging?
i switched beta to false while logged in; i remain logged in and LB works fine as well
reosarevok
alastairp: I wanna know who has listened to NGGYU and how much
atj
zas: just checking
alastairp
reosarevok: me, but I submitted the listen using the API demo, rather than actually listening to it...
reosarevok
No, I mean, as in, give me an all time ranking on the site :D
alastairp
oh, sure. I guess we could do that
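(For reference, a hedged sketch of submitting a single listen through the ListenBrainz API, roughly what the API demo does; the token and metadata below are placeholders:)

```python
import time
import requests

# Submit one listen to the ListenBrainz API; token and metadata are placeholders.
response = requests.post(
    "https://api.listenbrainz.org/1/submit-listens",
    headers={"Authorization": "Token <user-token>"},
    json={
        "listen_type": "single",
        "payload": [{
            "listened_at": int(time.time()),
            "track_metadata": {
                "artist_name": "Rick Astley",
                "track_name": "Never Gonna Give You Up",
            },
        }],
    },
)
response.raise_for_status()
```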
lucifer
alastairp: i am around for 15-20 mins but will then be afk till schema change time probably. lmk if you need me for something.
alastairp
reosarevok: sounds like you're volunteering to help with artist/recording pages on LB after today's schema change
lucifer: should be OK. 2h20m, right?
lucifer
yup
alastairp
we can go ahead with a deploy around about then once I get this PR done
reosarevok
I would say how hard can that be, but I have no idea how the whole huge DB thing you have works, so probably quite :p
alastairp
reosarevok: i suspect that we actually have most of this in place, including the relevant stats
plus monkey's work on the now listening page, etc
lucifer
the stats aren't in place yet but i can probably add that by wednesday fwiw
bitmap
yvanzo: should we replace `sudo /root/docker-server-configs/scripts/start_services.sh` with `sudo sh -c 'cd /root/docker-server-configs; git pull origin master; ./scripts/start_services.sh'`?
a git pull will be needed to obtain the new DBDefs files
yvanzo
Yes
bitmap
ok
zas
bitmap: we have to fix a small issue with ansible before I can proceed with aretha, slightly delayed. The move should take 10-15 minutes if everything runs smoothly. I'll tell you when done.
bitmap
sure, np
BrainzGit
[musicbrainz-server] 14mwiencek opened pull request #2533 (03master…runexport-enable-dbmirror2): RunExport: remove --nodbmirror2 from ExportAllTables https://github.com/metabrainz/musicbrainz-serve...
yvanzo
reosarevok, bitmap: Is it worth saving logs of caa-indexer and wikidata-bot?
reosarevok
wd-bot, probably not
bitmap
I'd say no to caa-indexer too.
atj
hopefully the ansible issue is resolved now
yvanzo
Thanks, removed these steps
zas
I'll start with aretha
bitmap
yvanzo: reosarevok: would one of you like to drive the main parts of the upgrade today? I'll still be around to help, but it might be a good idea to give someone else a turn
yvanzo
np
BrainzGit
[listenbrainz-server] 14alastair opened pull request #1991 (03master…labs-api-musicbrainz-down): Return empty results if MusicBrainz database connection isn't available https://github.com/metabrainz/listenbrainz-serv...
reosarevok
I'm happy to as well, but quickly having dinner first
zas
rebooting aretha
akshaaatt
atj: is it fine if I convert the containers dashboard project to a react project?