hmmm kiki's cpu temperature increased by 15°C at 1AM UTC... I guess hetzner still has crappy cpu fans in stock...
ruaok
I bet. sigh. haven't we replaced all of them.. yet? at least once?
zas
It seems replacement parts are no better... but I think that's the first time on kiki though
I'll switch to herb and create a ticket for hetzner techs to have a look
btw, it's a bit messy, since we have a few things running on kiki that don't run on herb (git2consul, and various mb stats)
I'll open a ticket at hetzner
aidanlw17
hi alastairp, are you around to talk?
zas
the issue with traffic starting at 8:16 UTC is probably due to a network issue between hetzner cloud (solr cloud) and other machines. MB website behavior when this happens isn't great, causing very long delays:
alastairp: great, since this is the last week of gsoc I just wanted to touch base about what I need to finish up etc.
ruaok
Zas: skimmed it.
aidanlw17
alastairp: the eval will be ready for review today at some point. For the rest of the week, I'm not sure that I should start a new large task
although I feel that there is still a _ton_ of work that can be done on the similarity project
alastairp
absolutely, I agree
I think we should focus on integration this week
what was it that you mentioned yesterday about mistakes in similarity?
aidanlw17
ok cool, that's what I was thinking too
Mr_Monkey
akhilesh: Just pushed some code regarding browse request query params validation. I opted for one utility for all entity types. I will now write down an exhaustive list of possible relationships between entities (in short, the array passed to `validateBrowseRequestQueryParameters` for each entity, and a reason for each).
As for other query params: I'm thinking about how to validate other filter params. Not sure how to go about it yet. It will probably involve passing a function used as a filter in `getBrowsedRelationships`.
`limit` and `count` are straightforward: if they are present, use their value, if not use a default.
Any other query params will be ignored (for example mistyped or non-existing ones)
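The parameter-handling rules described above (use `limit`/`count` if present, fall back to defaults, silently drop unrecognized params) can be sketched like this. This is an illustrative Python sketch, not the actual BookBrainz code; the function name, known-parameter set, and default values are all assumptions.

```python
# Hypothetical sketch of the query-param handling described above.
# Default values and the set of known params are illustrative only.
DEFAULTS = {"limit": 10, "count": 25}
KNOWN_PARAMS = {"limit", "count"}

def parse_query_params(query: dict) -> dict:
    """Return known params with defaults applied; ignore everything else."""
    parsed = dict(DEFAULTS)
    for key, value in query.items():
        if key in KNOWN_PARAMS:
            parsed[key] = value
        # mistyped or non-existent params are dropped silently
    return parsed

print(parse_query_params({"limit": 5, "limitt": 9}))
# {'limit': 5, 'count': 25}
```

Note the mistyped `limitt` is simply ignored rather than raising an error, matching the behavior described above.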
alastairp
with the changes that you made to the original PR, did you rebase the others with those changes?
Currently I'm just submitting two recordings, calling add_metrics, and checking the calls that are made to submit_similarity_by_id to ensure that the data is all included
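A minimal sketch of that mock-based approach: call the function under test and assert on the calls it makes to the submit function. This is not the actual AcousticBrainz test; `add_metrics` here is a stand-in with an assumed signature, and the metric payload is dummy data.

```python
# Sketch: verify that add_metrics hands every recording's data to the
# submit callable, by replacing the callable with a Mock and inspecting
# its recorded calls. All names and signatures are illustrative.
from unittest import mock

def add_metrics(submit, recording_ids):
    """Stand-in for the real add_metrics: computes per-recording
    metrics and passes them to the submit callable."""
    for rec_id in recording_ids:
        submit(rec_id, {"mfccs": [0.1, 0.2]})  # dummy metric vector

def test_add_metrics_submits_all_recordings():
    submit = mock.Mock()
    add_metrics(submit, ["rec-1", "rec-2"])
    assert submit.call_count == 2
    submitted_ids = [call.args[0] for call in submit.call_args_list]
    assert submitted_ids == ["rec-1", "rec-2"]

test_add_metrics_submits_all_recordings()
```

In the real test the submit function (`submit_similarity_by_id` in the conversation above) would be patched with `mock.patch` at its import location rather than passed in explicitly.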
For next time, do you have thoughts on what would have been a better structure (for the PRs)?
alastairp
I'd like to also see a test for the sql statement that gets this data
I think the better way of doing PRs would be to not base one off another
so always make a new one from master
but perhaps this means that we should be a bit more proactive in merging earlier work, so that it can be used in subsequent branches
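The branching workflow suggested above, sketched with a throwaway repo so it is self-contained: merge earlier work into master first, then base each new PR branch on an up-to-date master rather than stacking branches. The branch name is illustrative; `git init -b` needs git 2.28+.

```shell
# Sketch of basing every PR branch on master instead of a sibling branch.
set -e
cd "$(mktemp -d)" && git init -q -b master .
git commit -q --allow-empty -m "master tip (stands in for merged earlier PRs)"
# in the real repo you would first run: git checkout master && git pull
git checkout -q -b add-sql-tests     # illustrative branch name
git branch --show-current            # prints: add-sql-tests
```

Stacked branches aren't wrong, but as noted above they force each follow-up PR to be rebased whenever review changes land in its base.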
aidanlw17
How can we write a test specific to the sql statement? Should I make the same query in a test and check what is fetched against what is expected?
makes sense, I think that would work
especially if the work was split into smaller PRs
alastairp
there's a method in db/data.py that gets the data? and this is the query that you fixed?
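The shape of such a query test, sketched against an in-memory SQLite database for self-containment (the real project queries Postgres through `db/data.py`): insert known rows, call the data-access function, and compare the fetched rows against the expected result. Table and function names here are assumptions, not the real schema.

```python
# Illustrative test of a data-access query: seed known rows, run the
# query, and assert the result matches exactly. Uses sqlite3 from the
# standard library; schema and names are hypothetical.
import sqlite3

def get_similarity_metrics(conn):
    """Stand-in for the db/data.py function under test."""
    cur = conn.execute("SELECT id, metric FROM similarity ORDER BY id")
    return cur.fetchall()

def test_get_similarity_metrics():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE similarity (id INTEGER, metric TEXT)")
    conn.executemany("INSERT INTO similarity VALUES (?, ?)",
                     [(1, "mfccs"), (2, "bpm")])
    assert get_similarity_metrics(conn) == [(1, "mfccs"), (2, "bpm")]

test_get_similarity_metrics()
```

This catches regressions in the SQL itself (column order, joins, filters), which pure mock-based tests cannot.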
along with some changes to the readme, and the similarity docs as well
cool. I'm gonna start that today, is there anything specific you'd like to see in it?
alastairp
no, that's completely up to you. what you did, why we did it, things you learned, things that surprised you, what it's like working with metabrainz
aidanlw17
perfect! I'm excited to write and reflect about it
Mr_Monkey
akhilesh: I also now pushed some code stub showing how to use the browse request filters for `/author`. You'll need to use the same mechanism for each endpoint.
bitmap: will you be available tomorrow, i.e. 24 hrs from now?
I have a lot of queries piled up, I'm almost through with conversions of edit forms.
will spend my day time tomorrow converting the remaining 2 entity edit forms. then I'll ask about all the bugs tomorrow night (guess it'll be morning for you 🤔)
travis-ci
metabrainz/picard#4822 (master - 88df4e0 : Philipp Wolfer): The build passed.
> if it's low overhead then I'd just do it every time because you don't want to have a situation where you run a 3 hour job and then it fails just because you forgot to run a different command first