#metabrainz


      • Etua joined the channel
      • Etua has quit
      • Etua joined the channel
      • akashgp09 joined the channel
      • Etua has quit
      • Etua joined the channel
      • Etua has quit
      • akashgp09 has quit
      • ruaok
        moooin!
• lucifer: zas: what is the status of aretha / williams?
      • yvanzo
        mo’’in’
      • lucifer
        morning
      • jasondk: cool, thanks!
      • ruaok: from what i see, barman has been moved but everything else is on williams.
      • ruaok
zas: can we work on this today, please?
      • lucifer: what is the status with the dumps and importing data to spark from the 15th?
      • lucifer
        sentry's clear and the dumps also imported fine yesterday. so i think it was a one time connectivity issue.
      • let's generate the parquet dumps for testing?
      • ruaok
        yes, let's. I'll get that moving.
      • we're still running old cron right?
      • lucifer
        yes
      • ruaok
        k
      • lucifer
        we should do one full dump and 2 inc dumps. so that we can test all 3 scenarios. full dump import, first inc dump import and subsequent inc dump imports.
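The three import scenarios lucifer lists can be enumerated as a quick sketch (the function name is hypothetical, not actual ListenBrainz tooling):

```python
# Hypothetical sketch of the dump test plan; names are assumptions,
# not the real ListenBrainz import code.
def plan_test_dumps(n_incremental=2):
    """One full dump plus n incremental dumps covers all three cases:
    full import, first incremental import, subsequent incremental imports."""
    return ["full"] + [f"incremental-{i}" for i in range(1, n_incremental + 1)]

plan = plan_test_dumps()  # ['full', 'incremental-1', 'incremental-2']
```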
      • ruaok
        ok
      • lucifer
ruaok: also, how do continuous aggregates work? do they refresh counts for the current year or only older than 1 year?
      • ruaok
        let me adjust the dummy dump code.
      • lucifer
        👍
      • Etua joined the channel
      • ruaok
        the cont aggs are currently set to refresh only 1 year automatically. then the cron job wakes up and processes 2 or 3 years in one go, until all time is done.
      • this keeps the disk space usage low.
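The catch-up behaviour ruaok describes can be sketched roughly like this (the year range, batch size, and function name are all assumptions, not the actual LB cron code): the automatic policy covers only the most recent year, and each cron run refreshes a few older years until the full history is done.

```python
# Hypothetical sketch of the catch-up cron described above; the year
# range, batch size, and names are assumptions, not LB code.
EARLIEST_YEAR = 2005   # assumed start of listen history
BATCH = 3              # "2 or 3 years in one go"

def next_batch(oldest_done_year):
    """Given the oldest year already refreshed, return the next batch
    of older years to process, newest first; [] when all time is done."""
    return [y for y in range(oldest_done_year - 1,
                             oldest_done_year - 1 - BATCH, -1)
            if y >= EARLIEST_YEAR]

# The automatic policy has already handled the most recent year (2021
# here), so the first cron run starts just below it:
first = next_batch(2021)   # [2020, 2019, 2018]
```

Processing a bounded batch per run is what keeps peak disk usage low: only a few years of aggregates are rebuilt at a time.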
      • lucifer
makes sense. for some reason the counts on my profile have been broken for ~3 days now. i was hoping the cron job would fix it but it didn't.
      • ruaok
        ok, I'll look into that in a bit once I get the spark stuff going
      • like that?
      • lucifer
        thanks.
      • yes looks good.
      • zas
        ruaok: moooin, yes, I'm transferring ftp data atm, we can start to move services, machine looks stable
      • ruaok
        zas: cool and you can take care of the ssh keys? do we need to install new ones or reuse the existing ones?
      • zas
ssh keys for what? I guess we can copy them over
      • ruaok
for LB and other services to copy things to FTP.
      • would be ideal if they could be copied over.
      • lucifer: dumps started.
      • lucifer: do you know if the listen counts are broken for only you or for all people?
      • lucifer
        ruaok: they are broken for at least me. i checked a few other profiles and it looked okay for them. for me the counts were 0 (now ~100) so i noticed.
      • ruaok
        have you deleted your listen count key in redis and then gone back to your profile page to force a recount?
      • redis should be the first place to look at this.
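The "delete the key to force a recount" flow ruaok suggests can be sketched like this (a plain dict stands in for Redis, and the key and function names are hypothetical):

```python
# Sketch of the force-recount flow; a dict stands in for Redis, and
# all names here are assumptions, not the real ListenBrainz code.
cache = {"listen_count:lucifer": 0}   # broken/stale cached count

def listen_count(user, cache, count_from_db):
    """Return the cached listen count, recounting from the DB on a miss."""
    key = f"listen_count:{user}"
    if key not in cache:
        cache[key] = count_from_db(user)   # cache miss -> full recount
    return cache[key]

def force_recount(user, cache):
    """Delete the cached key so the next profile view triggers a recount."""
    cache.pop(f"listen_count:{user}", None)

force_recount("lucifer", cache)
fixed = listen_count("lucifer", cache, lambda user: 8000)  # recounted
```

With a real Redis client, the `pop` would be a `DEL` on the user's count key, and the next profile page load would repopulate it.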
      • lucifer
        no i haven't done that yet. because i thought it might be useful to know why this happened in the first place.
      • i'll try forcing the recount.
      • ruaok
it's going to be hard to figure out why -- we should really replace the redis stuff with the solution that I suggested -- keeping the counts in TS and be done with it.
      • (without relying on cont agg)
      • lucifer
        agreed. +1
      • ruaok
        ok, to get back into the swing of things and since i like pain, I'll add the listen_mbid_mapping table to the dumps.
      • I suppose that ought to be a new dump file...
      • lucifer
        yeah, new file makes sense.
      • forcing recount fixed the issue.
      • ruaok
        ok, keep your eye on it and see if it gets borked again.
      • lucifer
        👍
      • ruaok
        I'm doing the same.
      • alastairp
        hello, buenos dias
      • back to it this week
      • ruaok
        hiya!
      • guten morgen!
      • lucifer: my listen counts are incrementing.
      • peterhil joined the channel
      • lucifer
        hi!
      • alastairp
      • > Distance of the Race in 2021 is around 2450km which makes it possible to finish it in around a week.
      • lucifer
ruaok: yes, the listen counts increment for me as well. the issue is my listens were around 8000 last week, then they reset to 0 for some reason. since then they have kept incrementing as i listen, reaching ~100 today.
      • ruaok
        yeah, sounds like your redis key was lost. for whatever reason.
      • lucifer
        possibly.
      • ruaok
        alastairp: those rules. ouch.
      • alastairp
        winner rolled in to arc de triomf 2 days ago
      • after 6 days of riding
      • ruaok
sheeesh. that's freaking insane.
      • ~400km a day? over mountains? insane.
      • alastairp
        I guess some people like doing that to themselves...
      • ruaok
        did they get a PCR test for the borders?? 🤭
      • BrainzGit
        [musicbrainz-android] 14amCap1712 merged pull request #78 (03master…onboarding): Onboarding Tweaks https://github.com/metabrainz/musicbrainz-andro...
      • akashgp09 joined the channel
      • peterhil has quit
      • Etua has quit
      • MrClon has quit
      • alastairp
        question for react experts: as a matter of style do you always have exactly 1 component per file, or if you have small components that are only used as a helper in a larger one will you include them in the same file?
      • tandy[m]
        im trying to submit this listen, but im getting an error 500, any ideas?
      • ruaok
500? hmmm, lemme look.
      • tandy[m]
        i think its because the json has escape characters
      • ruaok
        I doubt it -- then you'd get a 400 error.
      • did you retry submitting it?
      • it may be a transient error.
      • tandy[m]
        no i didnt retry, but i have tried sending the submission a couple times
      • ```Error: unhandled exception: ERROR 500 Data requested not available [HttpRequestError]```
      • thats the error
      • ruaok
        please submit it again, so I can look for the error in the logs. tell me right after you submit.
      • peterhil joined the channel
      • tandy[m]
        done
      • the user id is tandy1000
      • if that helps
      • monkey
        alastairp: I'm not a purist, so if the helper component is small and not used anywhere else I think it's acceptable to keep it in the same file as the main component.
      • If there are more helper components I usually create a separate helpers file for them.
      • ruaok
        oh hai monkey !
      • monkey
        Hai !
      • ruaok
        tandy[m]: I see the error. let me look closer.
      • feeling back to 100%, monkey ?
      • monkey
        Yep, all good now
      • ruaok
        great.
      • lucifer: what do you make of tandy[m] 's error? https://sentry.metabrainz.org/metabrainz/listen...
the listen pasted looks ok, but we can't parse it.
      • monkey
I'm going to deploy PR #1539 to test.LB unless anyone is actively using it.
      • Some BrainzPlayer improvements
      • yvanzo
        alastairp: same as monkey for MBS
      • tandy[m]
@ruaok the json is generated by some unmarshalling code, so i might have missed something in my custom types
      • ruaok
        the JSON looks ok to me, but somehow we get an array parsed from the JSON, not a dict.
      • tandy[m]
hmm, i used this guide to write the types from which a submission payload is generated
      • monkey
        ruaok: I see the JSON starts with a curly bracket, and I think maybe the code expects an array instead
      • Would that be possible?
      • Not sure why that would result in a 500, but…
      • ruaok
        shouldn't be, no. That would make our docs wrong. :)
      • monkey
        Ignore me :)
      • ruaok
        thanks for looking. :)
      • monkey
        I can't see the whole JSON, was trying to see if there could be some issue with the formatting
      • Can you see the whole JSON somewhere ruaok ? I only see a truncated version
      • ruaok
      • monkey
        Thanks
It's annoying that we can't see the whole data in the sentry issue
      • lucifer
        sorry had fallen asleep. 😅
      • i have seen that error before tandy[m]. my hunch is on an encoding issue. let me look deeper.
      • monkey
        lucifer: Hi! I have another one for your plate :) There are some uncommitted changes in docker-server-configs on kiss that I don't want to mess up
      • lucifer
        monkey: hi! just took a look, you can safely discard that. those are probably from ruaok updating cron earlier today for testing.
      • monkey
        OK, thanks :)
      • lucifer
tandy[m]: your entire json is wrapped in extra `''`, so it's not a correct json payload. i am not sure where those come from, but the first place to look would be your listen submission code.
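A minimal illustration of the double-encoding lucifer describes (field values abbreviated; the submit-listens endpoint expects a JSON object at the top level, so a string-wrapped body fails to parse as one):

```python
import json

listen = {"listen_type": "single",
          "payload": [{"track_metadata": {"artist_name": "A",
                                          "track_name": "B"}}]}

body = json.dumps(listen)     # correct: serialize the dict once
double = json.dumps(body)     # bug: serialize the already-serialized string

assert isinstance(json.loads(body), dict)    # server sees an object
assert isinstance(json.loads(double), str)   # server sees a quoted string
```

Serializing the payload twice is the usual source of "extra quotes" around an otherwise valid-looking JSON body.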
      • akshaaatt[m]
Hi lucifer ! Thanks for reviewing the PR. I am available in case you want to discuss anything.
      • I even got some good feedback from the community regarding the design, which I intend to incorporate soon.
      • lucifer
akshaaatt[m]: hi! sounds nice. i am trying to test your picard addition PR again and it does not seem to work this time. can you try again and see if it works for you?
ruaok: the alert you set up seems to be working nicely ;)
      • akshaaatt[m]
        Sure!
      • kepstin has quit
      • JuniorJPDJ has quit
      • elomatreb[m] has quit
      • akshaaatt[m] has quit
      • yyoung[m] has quit
      • tandy[m] has quit
      • MagnusSvensson[m has quit
      • ruaok
        oh good, I hadn't had a chance to test it yet.
      • did you stop the metric writer to test it? or did something else happen?
      • lucifer
it hung itself up. i only came to know of it from the alerts.
      • ruaok
        ok, that begs the question: why does it stop?