Gentlecat: it's done, I was just working on the taghelper, although it's not required for places. Should I keep it for the release group work instead?
Gentlecat
keep what?
sorry, I'm not sure what you mean
ferbncode
I didn't frame it correctly. Should I save the taghelper function (for showing release group and other entity tags) for later PRs?
Gentlecat
if that part is done, let's merge and try to deploy it (I can help with that)
ferbncode
ohh.. sure, I will update the PR and remove the WIP tag then. :D
Gentlecat
I'd rather we merge smaller parts quickly than have huge PRs that take forever to finish and review
ferbncode
Gentlecat: sure, understood. (and updated PR)
Gentlecat
give me a sec, I'll post some comments
ferbncode
Gentlecat: okay, thanks :)
sam____ joined the channel
sam_____ joined the channel
sam_____ has quit
aeromarine joined the channel
sam____ has quit
github joined the channel
github
[critiquebrainz] ferbncode opened pull request #113: Fix missing table name in function users.tokens (master...token-name-2) https://git.io/vQ3QT
github has left the channel
github joined the channel
[critiquebrainz] gentlecat closed pull request #113: Fix missing table name in function users.tokens (master...token-name-2) https://git.io/vQ3QT
github has left the channel
aeromarine has quit
agentsim joined the channel
agentsim has quit
arbenina_ joined the channel
iliekcomputers
GSoC first evaluations open up today btw :)
agentsim joined the channel
agentsim has quit
Nyanko-sensei joined the channel
D4RK-PH0ENiX has quit
alastairp
ruaok: I've spent a bit of time looking at your counting PR, but it's a bit over my head
I'm not sure of the best way to give feedback to you on it
ruaok
Import your listens. I'll watch your logs. :-)
alastairp
well, I mean, I did a successful import locally, and it counted fine
so it appears to work...
importing
ruaok
And it isn't mission critical... It is generating metadata, so if it's wrong, we can fix it.
alastairp
ok
zas
Moin
Freso
🙋
TOPIC: MetaBrainz Community and Development channel | MusicBrainz non-development: #musicbrainz | Picard meeting (Tues, 18:00 UTC): Working document https://goo.gl/anENrc | MetaBrainz meeting agenda (now: 17 UTC!): reviews, MB User Survey (Leo_Verto/Quesito), spam review (ruaok), Picard meetings (Freso)
iliekcomputers
ruaok: no hiccups yet?
on the beta? :)
alastairp
hmmm
listenbrainz count is still 2 less than last.fm
drsaunder joined the channel
iliekcomputers
alastairp: i'm almost certain that those are duplicates with the same timestamp
alastairp
yeah, it's likely
I guess I should scrape my history and see if I can find them
unfortunately it's difficult to catch this during the import process and report it
drsaunder has quit
Nyanko-sensei has quit
D4RK-PH0ENiX joined the channel
ruaok
iliekcomputers: looking now.
alastairp: I'm with iliekcomputers on this one -- I think it is related to the fact that we're more serious about data ingestion than last.fm was early on.
PeterJones joined the channel
samj1912 has quit
no errors in the influx writer.
nothing I can see in the bigquery-writer
zas: docker and long log files don't play well together, do they?
zas
what do you mean ?
ruaok
lemmy:~->docker logs -f listenbrainz-web
just hangs.
worked a minute ago.
PeterJones has left the channel
drsaunder joined the channel
alastairp
ruaok: yeah, I agree
marc2k3 joined the channel
ruaok
still, I share your frustration that this is a very opaque process. we're going to get other people complaining about this, and it would be nice to have some tools that allow us to pinpoint exactly where things went wrong and why.
marc2k3
hi guys, looks like the certificate for community.metabrainz.org has expired. Firefox 54 is blocking me outright and IE11 lets me in if I ignore the warning
we *think* we fixed the bug that was affecting you.
agentsim joined the channel
zas
hmmm it happened again wtd
marc2k3
I don't think I can try the alpha import any more as I tested with my last.fm account instead. I had 2 missing scrobbles out of 21,000-odd, and after parsing my entire last.fm history myself, I did find duplicates, as you'd already predicted.
agentsim has quit
the duplicates weren't identical tracks but tracks that followed each other on an album that ended up with the same timestamp
alastairp
interesting. as long as the title and artist are different, I thought that the same timestamp wouldn't matter
ruaok
if the metadata is different, but timestamps the same then LB should accept them.
iliekcomputers
the deduplication we went back to (after reverting the time range one) didn't check for track name or artist name, only timestamp.
influx doesn't allow more than one row with the same timestamp and tags
afaik
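For context, InfluxDB treats points that share the same measurement, tag set, and timestamp as one point, so a later write silently replaces the earlier one. A minimal sketch of that identity rule (the measurement and tag names here are hypothetical, not the actual ListenBrainz schema):

```python
# Emulate InfluxDB point identity: a point is keyed by
# (measurement, tag set, timestamp), and writing a second point
# with the same key silently overwrites the first one.

def write_points(storage, points):
    for point in points:
        key = (point["measurement"],
               frozenset(point["tags"].items()),
               point["time"])
        storage[key] = point["fields"]  # silent overwrite on collision
    return storage

storage = {}
write_points(storage, [
    {"measurement": "listen", "tags": {"user": "alastairp"},
     "time": 1498744000, "fields": {"track": "Song A"}},
    # same user + same timestamp, different track -> same key, overwritten
    {"measurement": "listen", "tags": {"user": "alastairp"},
     "time": 1498744000, "fields": {"track": "Song B"}},
])
print(len(storage))  # only one point survives
```

This is why two different tracks with the same timestamp can collapse into one listen if nothing track-specific ends up in the tags.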
ruaok goes to read the code
ruaok
yes, influx is going to eat the exact dupes, but we should not be calling dupes on the basis of timestamp alone.
that might explain why CatQuest gets so many dropped
iliekcomputers
let's add the artist_msid and recording_msid check to it then
CatQuest
oh, my old scrobbles don't have them, I think :/
but. I have a limited pool of maybe 2-3k songs that I've played over and over
ruaok
iliekcomputers: yes, lets do that.
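The proposed fix could be sketched like this: treat an incoming listen as a duplicate only when the timestamp, artist_msid, and recording_msid all match, rather than the timestamp alone. This is a hypothetical illustration of that check, not the actual ListenBrainz code:

```python
# Deduplicate incoming listens against existing ones using
# (listened_at, artist_msid, recording_msid) as the duplicate key,
# so two different tracks at the same timestamp are both kept.

def dedup(existing, incoming):
    seen = {(l["listened_at"], l["artist_msid"], l["recording_msid"])
            for l in existing}
    unique = []
    for listen in incoming:
        key = (listen["listened_at"],
               listen["artist_msid"],
               listen["recording_msid"])
        if key not in seen:
            seen.add(key)
            unique.append(listen)
    return unique

existing = [{"listened_at": 100, "artist_msid": "a1", "recording_msid": "r1"}]
incoming = [
    {"listened_at": 100, "artist_msid": "a1", "recording_msid": "r1"},  # true dupe
    {"listened_at": 100, "artist_msid": "a2", "recording_msid": "r2"},  # same ts, kept
]
print(len(dedup(existing, incoming)))
```

Under timestamp-only dedup both incoming listens would be dropped; with the msid check only the exact duplicate is.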
CatQuest
import finished btw.. and I still have a diff of ~1000
ruaok
in theory once this check is in place, a new import should get more listens imported.