Hmm, in the commands I posted, you'll probably want to replace the user (-U flag) with postgres instead of bookbrainz
gr0uch0mars joined the channel
akhilesh
Yes, it's required. How do I replace the user with postgres instead of bookbrainz?
Mr_Monkey
`-U postgres`
instead of `-U bookbrainz`
akhilesh
ok
ayerhart_ has quit
Mr_Monkey: Running now, Thanks!
Mr_Monkey
Great :)
ayerhart joined the channel
akhilesh
Mr_Monkey: Which issue are you working on?
in BB
Mr_Monkey
Currently on ListenBrainz, but about to go back to tackling entity merging
akhilesh
I am working on the date validation tests; then I think we should move to author_credit
also, it is important
Mr_Monkey ^
Mr_Monkey
Agreed
That gives me a day or two to think of a plan and what needs to be done
akhilesh
By then, I will try to finish the date-related work.
ayerhart has quit
D4RK-PH0ENiX has quit
D4RK-PH0ENiX joined the channel
Matthew_ joined the channel
D4RK-PH0ENiX has quit
Matthew_
Hi. I'm struggling to get full indexing working properly on Solr. I note a lot of errors in logs about missing fields when running reindex from mb-sir. It seems that the submodules for mb-solr are very old and aren't aware of some schema 25 fields. Does anyone know what revisions I should be using to reindex against a schema 25 database? Or should I be using a < schema 25 version of mb-sir to do the indexing? Thanks!
yvanzo
Hi Matthew_: are errors about label/isni only?
Matthew_
Hi yvanzo. I'm mostly seeing issues with artists - haven't got any further than the 'a's ;) I'm seeing 'unknown field 'primary_alias'' a lot.
yvanzo
schema 25 should still work with previous versions of mb-solr/sir
Matthew_: Ok, which versions of mb-solr and sir do you run?
sir master is ready to work with schema 25, but mb-solr is still work-in-progress
We still run previous versions of mb-solr/sir in production.
Matthew_
For mb-solr, I've tried 'master'; should I be using v3.0? For sir, I don't see any branch/tag that leaps out as stable, so I'm also on master
OK. Thanks, yvanzo. What versions/revisions would you recommend for those components?
yvanzo
mb-solr v3.0
and sir commit 5109d2606eb84a444b1c15dd6efe70571e67afd7
Matthew_
Great. Thanks, yvanzo!
yvanzo
Matthew_: you will have to patch sir's requirements.txt because mbdata 2017.6.2 is no longer available from PyPI
Matthew_
Thanks yvanzo. What version should I be replacing it with? most recent? 25.0.3?
yvanzo
Same version, but it should be retrieved from GitHub instead, as follows:
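(A hedged sketch of what such a requirements.txt change can look like, using pip's `git+` syntax for installing a pinned version from a Git repository; the repository path and tag name below are illustrative assumptions, not the exact line yvanzo recommended.)

```
# In sir's requirements.txt, replace the PyPI pin
#   mbdata==2017.6.2
# with a direct Git reference to the same version, along these lines
# (repository path and tag are guesses):
git+https://github.com/acoustid/mbdata.git@v2017.6.2#egg=mbdata
```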
ferbncode: Finals could always be better, but it wasn't too bad.
Lotheric_ joined the channel
ohrstrom joined the channel
Lotheric has quit
alastairp
how do I submit acoustids with picard? on mac, I have fpcalc bundled, I added an API key, the album has mbids
the icon is grayed out
CallerNo6
alastairp, and you've hit the `scan` button I assume
alastairp
no, it wasn't clear that you have to scan before you can submit (in this case I know that the album has no fingerprints, so I assumed that scan wasn't necessary)
also, scan only works if there are files on the left, but since they all have mbids, they all go straight to the right
anyway, I found the scan button now, doing that.
CallerNo6
no, it's not very clear. it trips me up every time I do it.
You can select and drag a set of files back to the left to do further operations on them at any time.
alastairp
thanks
looks like acoustid is down anyway, I'll finish tomorrow
CallerNo6
argh. anyway, iirc the submit button will also be ghosted if there's already a match (same mbid and acoustid).
Lotheric_ is now known as Lotheric
ferbncode
spellew: Cool, just let me know whenever you're done fixing the timelines. :) Also, hit me with any questions that you encounter o/
Matthew_ has quit
ohrstrom has quit
spellew
ferbncode: Sure thing
ohrstrom joined the channel
ohrstrom has quit
iliekcomputers
alastairp: re comments on PR, if the rows aren't ordered, suppose the current max is 3, the query then returns rows 8, 9, 10 without returning 4, 5, 6, that'd be a bug.
alastairp
iliekcomputers: mmm, sure. but it only has to return 4, 5, 6 eventually. if it computes 8, 9, 10, then the left join against highlevel will have non-null rows, so they won't be selected, and maybe 4, 5, 6 will appear
iliekcomputers
but once 10's hl data is calculated, the new max would become 10, so the query would only return ids >= 10.
that's the way I wrote it, not sure if there's something I'm missing.
alastairp
mmm, yes. this is the bit that I was missing
you're right. if we want to have a limit we need to do it in order
OK, I think we should try splitting it into 2 queries too, then; as ruaok pointed out last week, doing ORDER BY with the JSON table in the picture makes queries quite slow
do you understand the query plans that I asked for?
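(To make the ordering point above concrete, here is a hedged sketch using an in-memory SQLite stand-in for the two tables. The `lowlevel`/`highlevel` table names and the sample ids are assumptions for illustration, not the actual AcousticBrainz schema or the query under review.)

```python
# Sketch of the two strategies discussed: an unordered "id > current max"
# query can permanently skip earlier unprocessed rows, while a LEFT JOIN
# with ORDER BY always yields the earliest pending ids.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE lowlevel  (id INTEGER PRIMARY KEY);
    CREATE TABLE highlevel (id INTEGER PRIMARY KEY);
""")
conn.executemany("INSERT INTO lowlevel (id) VALUES (?)",
                 [(i,) for i in range(1, 11)])
# Rows 1-3 already processed; row 10 got processed out of order.
conn.executemany("INSERT INTO highlevel (id) VALUES (?)",
                 [(1,), (2,), (3,), (10,)])

# "Greater than current max" approach: once id 10 is in highlevel,
# max(highlevel.id) = 10, so the pending ids 4-9 are never returned.
max_hl = conn.execute("SELECT MAX(id) FROM highlevel").fetchone()[0]
skipped = conn.execute(
    "SELECT id FROM lowlevel WHERE id > ?", (max_hl,)).fetchall()
print(skipped)  # [] -- rows 4-9 are silently skipped

# LEFT JOIN + ORDER BY approach: a LIMIT cannot starve earlier rows,
# because missing highlevel rows are selected in id order.
pending = conn.execute("""
    SELECT ll.id FROM lowlevel ll
    LEFT JOIN highlevel hl ON hl.id = ll.id
    WHERE hl.id IS NULL
    ORDER BY ll.id LIMIT 3
""").fetchall()
print([r[0] for r in pending])  # [4, 5, 6]
```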