[musicbrainz-android] yellowHatpro opened pull request #127 (brainzplayer…bp-implementation): Service completion, Service callbacks and notification stuff created, basic service setup complete https://github.com/metabrainz/musicbrainz-andro...
yellowhatpro
Moin!!!
akshaaatt: so our brainzplayer's service is complete for our playback use cases. Now for some time I will be focusing on some UI stuff and implementing functions in a view model
currently I am targeting playable entities, i.e. Songs, and after a basic player implementation, when we are able to play songs, I will work on browsables like Albums and Playlists.
antlarr has quit
antlarr joined the channel
antlarr has quit
antlarr joined the channel
KevlarNoir has quit
skelly37 joined the channel
adhawkins has quit
adhawkins joined the channel
yvanzo
O’Moin
mayhem
moin moin!
skelly37
moin
akshaaatt
moin!
Amazing work yellowhatpro ! 💯✨
ansh
alastairp: Hi! Finally done with my exams.
For getting the data from the BB database, should I use direct queries, or add the necessary columns to the BB views?
alastairp
ansh: congrats, hope that they went well
ansh: I was just talking to monkey about that last week. what fields are you missing?
it's probably better that we add the columns that you need to the view
ansh
name, author_name, label, etc.
monkey
Hi! If you make a list of the fields missing per entity, I can see if they can be added
(to the views)
ansh
Okay, I'll make a list of it
Pratha-Fish
moin :)
alastairp: Here's the CSV file with MLHD_artist_mbids that don't belong to artist_credit_list
I definitely think having the default alias name and sort name on all entities is useful. Same for disambiguation
Author credit name + BBID will be for both Edition and Edition Group
But then again you won't care about the Edition entity IIRC
ansh
Yes
alastairp: Can you give me permissions for the docker/pg_custom folder? I am getting permissions denied.
alastairp
ah, sure
ansh: try that
Pratha-Fish: great, thanks. I think the next thing we were going to add to this was the canonical recording MBID, right?
monkey
ansh: Not sure how to construct a useful author credit in the view: getting the name and BBID of the first author in the author_credit is one thing, but it does not allow multiple authors (i.e. something like that: https://critiquebrainz.org/review/39a9ad67-05a5...)
You might have to fetch the author_credit separately by its id and reconstruct the name and links from it
alastairp
monkey: one option would be to use array_agg, but I'm not sure if it's worth doing this query every time for the view, or if it will adversely affect query time
I think it'd probably be OK to query the credit separately if it's needed
monkey
Never used array_agg myself, that's why, thanks :)
We could do that indeed
alastairp
monkey: is this a view or materialized view?
monkey
not materialized
alastairp
because if it's a regular view then I'd definitely profile array_agg first
because I wouldn't be surprised if it makes things much slower
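A minimal sketch of the aggregation idea being discussed, using SQLite's group_concat as a stand-in for Postgres array_agg (the table and column names here are made up for illustration, not the actual BookBrainz schema):

```python
import sqlite3

# In-memory stand-in tables; names are hypothetical, not the BB schema.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE author (bbid TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE author_credit_name (credit_id INTEGER, author_bbid TEXT, position INTEGER);
    INSERT INTO author VALUES ('a1', 'Alice'), ('a2', 'Bob');
    INSERT INTO author_credit_name VALUES (1, 'a1', 0), (1, 'a2', 1);
""")

# Collapse all authors of one credit into a single value, analogous to
# array_agg(...) in Postgres (group_concat joins strings instead of
# building an array).
row = con.execute("""
    SELECT group_concat(name, ' & ')
    FROM (SELECT a.name
          FROM author_credit_name acn
          JOIN author a ON a.bbid = acn.author_bbid
          WHERE acn.credit_id = 1
          ORDER BY acn.position)
""").fetchone()
print(row[0])  # e.g. 'Alice & Bob' (SQLite does not guarantee the order)
```

In Postgres the equivalent would be `array_agg(a.name ORDER BY acn.position)`; as noted above, it is worth profiling that against the plain view before baking it in.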
So what do we do next on the artist conflation issue?
CatQuest
hey Pratha-Fish !
ansh
alastairp: that worked, thanks
alastairp
Pratha-Fish: cool. can we do the following, then:
Pratha-Fish
CatQuest Hi :D
alastairp
Pratha-Fish: oh - the drive file, is that all rows of the same mlhd files that you loaded?
Pratha-Fish
alastairp: yes
alastairp
and the other one that you linked above - uploaded to irccloud - is the subset of this same file which has mismatching mlhd_artist/artist_credit_mbid ?
Pratha-Fish
Yes, that's right
alastairp
ok, great!
Pratha-Fish
The first one contains all mismatches
the 2nd one contains all of them, correct and incorrect ones
alastairp
can we make one more small change, then -
can you generate the 2nd file again, but add the canonical recording mbid as a new column
Pratha-Fish
Yes, that should be easy :)
alastairp
so recording-MBID should be the value from the mlhd file
one other really tiny thing - can you make the header of that csv file be all lower-case, and use _ instead of - ?
artist-MBID -> artist_mbid
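That header normalization is a one-liner in Python; a small sketch using an in-memory sample in place of the real file (the column names are the ones mentioned above, the data rows are invented):

```python
import csv
import io

# Sample standing in for the real CSV; data values are made up.
src = io.StringIO("artist-MBID,recording-MBID,timestamp\nabc,def,123\n")
out = io.StringIO()

reader = csv.reader(src)
writer = csv.writer(out)

# Lower-case the header and swap '-' for '_'; copy data rows unchanged.
header = [col.lower().replace("-", "_") for col in next(reader)]
writer.writerow(header)
writer.writerows(reader)

print(out.getvalue().splitlines()[0])  # artist_mbid,recording_mbid,timestamp
```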
Pratha-Fish
Oh right, I should've done that from the get-go
Are there any more changes required?
alastairp
cool, once that file is generated we'll be able to use it to compare the mapper lookup
Pratha-Fish
Sounds good
I'll try to get the file ready in 30mins
ansh
alastairp: Again getting the same error.
alastairp
ansh: ok, let's debug what's happening here, it's likely that something in your configuration is changing the permissions when you run it
can you paste the database section of your docker-compose.yml?
[bookbrainz-site] tr1ten opened pull request #852 (master…async-language-select): Feat(language-select): Asynchronously load options in language select https://github.com/metabrainz/bookbrainz-site/p...
alastairp
ansh: can I log in as you on wolf for a moment to test?
ansh
yes sure
alastairp
ansh: OK, I have a theory. it's possible that the db container "remembers" that it used to be owned by root, and if we shut it down, chown, and then start it up again, it goes back and corrects it to the original value
so I'm going to stop and delete the db container, try the chown, and then start it up again
drwxr-xr-x 2 anshg1214 anshg1214 4096 Jun 27 11:17 pg_custom
now I started it again, and the permissions have stayed the same
now I stopped it, and started it again, and it's remaining owned by you
I just learned something new about docker, thanks :)
ansh: should be good to go now. let me know again if something weird happens
Pratha-Fish: that's great, thank you!
ansh
Sure! Thanks :)
skelly37 has quit
Pratha-Fish
alastairp: BTW can we implement the "file-sharing through public URL on wolf" thing that you mentioned earlier?
alastairp
Pratha-Fish: ok, let me have a look
Pratha-Fish
My internet is kinda crap, so it takes ages to upload files to drive as they're getting larger and larger
alastairp
btw, of course if you're sharing files to us and they're already on wolf, just tell me the file path of it on the server and I can look myself :)
Pratha-Fish
thanks :)
alastairp
just say "I generated a new file, it's at /home/snaek/mlhd-output/mlhd-sample-metadata-mapping.csv"
and I can take a look
Pratha-Fish
Here's my present working directory:
/home/snaek/MLHD/unk_ids
alastairp
mayhem: we're now at the point where we can test some MLHD recording/artist-credit pairs on the mapping lookup. I recall that you suggested we write something to run locally/in batch against the lookup code instead of hitting the dataset hoster API
mayhem
what exactly should the lookup do?
alastairp: slightly off topic: a Kiwi traveling to Spain doesn't need a visa, correct?
alastairp
artist credit + recording name -> canonical/mapped MBID
mayhem: correct
mayhem
alastairp: thx on the visa q.
alastairp
not sure if the electronic Schengen permit is active yet?
mayhem
not yet, coming later this year (more likely next year)
but that wouldn't involve visa letters which is what I am working on now.
for that lookup, hitting the labs endpoint or the LB /metadata endpoint is actually the best.
how many lookups do you need to do?
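The lookup itself maps an artist credit name plus a recording name to a canonical MBID. A rough sketch of how the request payloads for a batch run against a labs-style endpoint might be assembled; the URL and field names here are assumptions for illustration, not a documented contract, and the sketch stops short of actually sending anything:

```python
import json

# Hypothetical endpoint; check the actual labs docs for the real URL.
LOOKUP_URL = "https://labs.api.listenbrainz.org/mbid-mapping/json"  # assumption

def build_batches(rows, batch_size=100):
    """Group (artist_credit_name, recording_name) pairs into JSON payloads
    of at most batch_size entries each."""
    batch = []
    for artist_credit_name, recording_name in rows:
        batch.append({
            "artist_credit_name": artist_credit_name,  # field names assumed
            "recording_name": recording_name,
        })
        if len(batch) == batch_size:
            yield json.dumps(batch)
            batch = []
    if batch:
        yield json.dumps(batch)

# Invented sample rows; the real input would come from the MLHD-derived CSV.
rows = [("Artist A", "Song 1"), ("Artist B", "Song 2")]
payloads = list(build_batches(rows, batch_size=2))
print(len(payloads))  # 1
```

Each payload would then be POSTed to the endpoint (or fed to the lookup code locally, as suggested above), and the returned canonical MBIDs compared against the CSV column.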
alastairp
Pratha-Fish: how many lines is the complete file?
Pratha-Fish
alastairp: 376038
CatQuest
[14:00] <alastairp> not sure if the electronic Schengen permit is active yet?
I read that as "electronic shenanigan" at first
:D
Pratha-Fish
alastairp: and ~34k of them are incorrect
alastairp
Pratha-Fish: we're not too worried about the incorrect artist credits for now; we just want to compare the results of our two mapping techniques
Pratha-Fish
okie
alastairp
oh, that file also includes the timestamp, right? so does it include duplicate artist_mbid/recording_mbid lines?
Pratha-Fish
alastairp: Nah, it doesn't include any duplicate rows
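That claim is easy to verify mechanically; a small sketch that counts duplicate artist_mbid/recording_mbid pairs in a CSV (the sample rows here are invented for illustration; the real data would be read from the generated file on wolf):

```python
import csv
import io
from collections import Counter

# Stand-in for the generated file; rows are made up.
src = io.StringIO(
    "artist_mbid,recording_mbid,timestamp\n"
    "a1,r1,100\n"
    "a1,r1,200\n"
    "a2,r2,300\n"
)

# Count each (artist_mbid, recording_mbid) pair, ignoring the timestamp.
pairs = Counter(
    (row["artist_mbid"], row["recording_mbid"])
    for row in csv.DictReader(src)
)
duplicates = {pair: n for pair, n in pairs.items() if n > 1}
print(duplicates)  # {('a1', 'r1'): 2}
```

An empty `duplicates` dict on the real file would confirm there are no repeated pairs.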