Inspected hip containers for maintenance and pg/ws containers for load issue.
Plus usual stuff: SpamBrainz, small fixes, tickets triage, support...
Finito, reosarevok?
ruaok
yvanzo: have you tried the VM I uploaded before I left?
it was a complete one, but I never tried it.
reosarevok
Hi! More support, more editing and fixing
yvanzo
ruaok: yes, it is corrupted, I deleted it.
ruaok
alas, thanks
reosarevok
Trying to do some last minute Wikimedia Eesti business
ruaok
yvanzo: let's chat quickly about this after the meeting.
reosarevok
And moved back to the city
So should be around more often again
And have more time to deal with stuff :)
bukwurm: you?
bukwurm
Hey everyone!
Freso
(People still up: Leo_Verto, kartikeyaSh, dragonzeron. Anyone else, please let me know ASAP.)
bukwurm
This week I worked on migrating entity creation from the bookbrainz-site to bb-data
Many exciting new features have been added to the language in the last few years, primarily dealing with asynchronous code.
So I refactored the existing code and made it fit with the existing work in progress on bb-data
Apart from that, I worked on recent imports and on the routes handling discard/upgrade on the bb-site
That's it from me this week. Leo_Verto ?
Leo_Verto
Thank you!
Finally got through my exams, worked on polishing spambrainz code (type hinting is really cool) and started work on docker images.
I've also been trying to figure out how to make GCE preemptible instances work for Keras.
Finally, my shitty cable connection suddenly decided to stop getting an upstream, so I'm stuck with mobile internet. Yay for ISPs!
(also zas, if possible I'd love to talk to you after the meeting)
fin.
kartikeyaSh, go!
zas
sure
kartikeyaSh
hi
Freso
(Only dragonzeron left on my list. Last chance for anyone else to let me know they want to go!)
kartikeyaSh
This week I wrote code to create recording clusters for incoming recordings. I'll write tests for it and create a PR ASAP.
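The clustering idea can be sketched roughly like this (a minimal illustration only, not the actual MessyBrainz code; the record shape and the `msid`/`recording_mbid` field names are assumptions for this sketch):

```python
from collections import defaultdict

def cluster_recordings(recordings):
    """Group incoming recordings by recording MBID.

    Recordings that share an MBID land in the same cluster; recordings
    without an MBID are left unclustered.
    """
    clusters = defaultdict(list)
    for rec in recordings:
        mbid = rec.get("recording_mbid")
        if mbid:
            clusters[mbid].append(rec["msid"])
    return dict(clusters)

recordings = [
    {"msid": "m1", "recording_mbid": "b1a9c0e9"},
    {"msid": "m2", "recording_mbid": "b1a9c0e9"},  # same recording, new submission
    {"msid": "m3"},  # no MBID yet, cannot be clustered
]
clusters = cluster_recordings(recordings)
```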
I also worked on setting up a VM for executing the code written so far for part one of the GSoC project. This code was run on the data in the only available MessyBrainz datadump: http://ftp.osuosl.org/pub/musicbrainz/messybrainz/
While creating clusters, I found that some recording_json entries contain MBIDs but no titles, and others have MBID keys pointing to empty strings; these must have been inserted before ListenBrainz added a validation check for such mistakes. I had to handle those cases.
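Those malformed entries can be filtered out before clustering; a hypothetical sketch (the key names in `mbid_keys` are assumptions, not the actual recording_json schema):

```python
def clean_mbid_fields(recording):
    """Drop MBID keys whose value is an empty string.

    Some legacy rows, inserted before validation existed, have keys like
    'recording_mbid' set to '', which would otherwise be mistaken for
    real MBIDs during clustering.
    """
    mbid_keys = ("recording_mbid", "artist_mbids", "release_mbid")  # assumed names
    return {k: v for k, v in recording.items()
            if not (k in mbid_keys and v == "")}

raw = {"title": "Song", "recording_mbid": "", "release_mbid": "abc123"}
cleaned = clean_mbid_fields(raw)
```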
And then I learned how to use the EXPLAIN statement to speed up queries by creating appropriate indexes. An important note: the VACUUM ANALYZE statement must be executed after loading a datadump into Postgres, so that query execution plans are created optimally.
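The project work is Postgres-specific, but the index-plus-statistics workflow can be demonstrated self-contained with SQLite's EXPLAIN QUERY PLAN (the table and column names here are made up; SQLite's plain ANALYZE stands in for Postgres's VACUUM ANALYZE):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE recording (msid TEXT, recording_mbid TEXT)")
conn.executemany("INSERT INTO recording VALUES (?, ?)",
                 [(f"m{i}", f"b{i % 100}") for i in range(1000)])

# Without an index, the planner has to scan the whole table.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT msid FROM recording WHERE recording_mbid = 'b7'"
).fetchall()

conn.execute("CREATE INDEX idx_recording_mbid ON recording (recording_mbid)")
conn.execute("ANALYZE")  # refresh planner statistics, like VACUUM ANALYZE in Postgres

# With the index in place, the plan switches to an index search.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT msid FROM recording WHERE recording_mbid = 'b7'"
).fetchall()
```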
iliekcomputers
:)
kartikeyaSh
This dump contains 9,335,675 recordings. I created clusters for the recordings that contain a recording MBID, created artist clusters for the artist_credits that contain artist MBIDs, and did the same for releases. It took approximately 3 hours to create recording clusters, but only about 20 seconds each to create artist clusters and release clusters, because only about 10-12k recordings have artist MBIDs and release MBIDs.
But fetching release clusters using recording MBIDs was slow: it took approximately 24 hours, since we queried the MusicBrainz database again and again for the same information with different MBIDs. If we copied over the 4 tables involved in the join on the MusicBrainz database, we wouldn't have to join those tables again and again, and the process would speed up.
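Copying the joined tables locally amounts to caching the lookup; the effect can be sketched with a simple memoized fetch (the `fetch_release_from_db` function is a stand-in for the real 4-table MusicBrainz join, not actual project code):

```python
from functools import lru_cache

calls = 0  # counts how often the "database" is actually hit

def fetch_release_from_db(recording_mbid):
    """Stand-in for the expensive join against the MusicBrainz tables."""
    global calls
    calls += 1
    return "release-for-" + recording_mbid

@lru_cache(maxsize=None)
def fetch_release(recording_mbid):
    # Cached wrapper: a repeated MBID never reaches the database again,
    # which is the same effect as keeping a local copy of the joined tables.
    return fetch_release_from_db(recording_mbid)

for mbid in ["a", "b", "a", "a", "b"]:
    fetch_release(mbid)
# 5 lookups, but only 2 distinct MBIDs hit the database
```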
fin!
dragonzeron: go
dragonzeron
Ok
Leo_Verto
I see I'm not the only one pre-typing my reviews :P
CatQuest
Leo_Verto: I always do that :P
Freso wonders if dragonzeron is still around…
kartikeyaSh
Leo_Verto: can't let people wait while I write long reviews like these
:)
Freso
dragonzeron: Ping?
dragonzeron
So I have been working on adding ISRCs as usual; however, I started using VGMdb to find additional information about albums that I was not able to find elsewhere, so I have been working on that, as well as removing Likedis auto edits
Freso
Ah.
dragonzeron
and I also said Ok before hand
Freso
Oh, I missed that. Sorry.
dragonzeron
understandable
I have also been going through albums that have Amazon covers, grabbing the covers from there and uploading them to the Cover Art Archive
so yeah thats it
Freso
Alright, thanks dragonzeron and everyone for your reviews. :)
Looks like we have no other items on the agenda, so we'll close the meeting with this.
Thanks for your time everyone!
</BANG>
ruaok
thanks Freso
iliekcomputers, zas: stick around for a minute.
iliekcomputers
thanks Freso
zas
sticking around...
kartikeyaSh
thanks Freso
ruaok
I see that the EX51 at Hetzner has 64 GB RAM and 2×4 TB drives.
no setup cost.
I propose that we order an EX51 to put into our empty slot at hetzner.
samj1912
bitmap: can you do a quick test run of Picard 2.0.2 (see if it works without code signing issues)?
ruaok
then we move into that and when we're ready we let go of one of boingo/prince.
4TB of disk (in RAID-0) should be sufficient, no?
CatQuest
and thanks Freso, hopefully next time I'll be well and joining in again? (would have done it this week but I am so ill :()
iliekcomputers
ruaok: should be sufficient def
ruaok
zas?
iliekcomputers
spike uses around 1 TB :)
zas
ruaok: ok for me
ruaok
so then boingo/prince as app server and the new machine as DB server.