we don't actually link the user's apple account with LB because that needs us to load the apple music player script.
there is no proper oauth api to do that.
i was wondering if on clicking this button, we could load the music player dynamically
and then have it perform the oauth, save the token back to LB via an api call.
monkey
Yes, loading the library dynamically shouldn't be an issue
lucifer
currently, this happens when the user tries to play songs on LB frontend with the apple music player.
i think the workflow outlined above is better, thoughts?
monkey
Yes, definitely a better idea IMO to try to do as much setup in one go when the user enables the feature
lucifer
cool, makes sense. one more question.
when the linking expires because the tokens expire, we'll probably get some 403s. to have the user relink it, do we just send them back to that settings page?
and ask them to relink?
monkey
Was that the issue that the token expires really quickly?
lucifer
i think in the best case it will last 6 months.
monkey
Can we detect this specific case (organic expiration of token)? If so we should maybe try to handle it in BrainzPlayer when we encounter a 403. But as a first version, I think it'll be simpler to link to the settings page for the user to reauth
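A minimal sketch of the first-version approach described above: wrap the API call and, on a 403, point the user at the settings page to re-authorize. The function name and settings URL are hypothetical, not the real LB route.

```javascript
// Hypothetical settings route where the user can relink their account.
const RELINK_URL = "/settings/music-services/";

// Wraps a fetch implementation; on a 403 (likely an organically expired
// token) it flags that the user should be sent back to the settings page
// instead of trying to refresh the token in place.
async function fetchWithRelink(fetchImpl, url, options) {
  const response = await fetchImpl(url, options);
  if (response.status === 403) {
    return { relinkRequired: true, redirectTo: RELINK_URL, response };
  }
  return { relinkRequired: false, response };
}
```

A later version could special-case organic expiry inside BrainzPlayer itself, but redirecting keeps the first iteration simple.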
lucifer
makes sense
thanks!
monkey
Of course
lucifer
pixelpenguin: hey!
monkey
Generally, I've had a few ideas brewing for improving how we manage third-party music playing settings.
I think a new BrainzPlayer settings page will be necessary, and will likely replace the current music service page
lucifer
monkey: yup indeed, that would make sense.
monkey
Do you reckon we could save separate tokens for reading data and for playing music? For example for Spotify we could allow users to link their account to read history, and separately have another token for BrainzPlayer.
Would probably simplify some of the scopes mess we have to deal with
lucifer
that might be possible but I am unsure. i think it would vary service to service.
monkey
Right.
lucifer
one possibility could be to create multiple oauth apps, one for BP and one for LB main.
monkey
Right, I guess that makes sense
Possibly. Not sure it would be necessary though. I'll keep brainstorming.
lucifer
yup sure
monkey
Oh, lucifer, one more thing if you've never done it: to load a script dynamically, you can write some JS to inject a script element into the page (like you would with HTML) and the browser will fetch and run the library
lucifer
monkey: yes makes sense, i did see the utils functions to do it with react.
on this page, might need to write more JS from scratch because it's not react-ified yet.
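For reference, a from-scratch sketch of the injection technique monkey describes (the helper name is made up; the MusicKit CDN URL in the usage note is to the best of my knowledge the v3 one, so verify it):

```javascript
// Dynamically load a third-party script by injecting a <script> element.
// Resolves once the browser has fetched and executed the library.
function loadScript(src) {
  return new Promise((resolve, reject) => {
    // Avoid injecting the same script twice.
    const existing = document.querySelector(`script[src="${src}"]`);
    if (existing) {
      resolve(existing);
      return;
    }
    const script = document.createElement("script");
    script.src = src;
    script.async = true;
    script.onload = () => resolve(script);
    script.onerror = () => reject(new Error(`Failed to load ${src}`));
    document.head.appendChild(script);
  });
}
```

Usage would then be roughly `await loadScript("https://js-cdn.music.apple.com/musickit/v3/musickit.js")` before starting the OAuth flow and posting the resulting token back to LB.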
atj_: not yet, just so I could measure how each config change affects things, but planning on it today
atj
bitmap: do you use `wal_sync_method = 'fdatasync'` by default?
bitmap
yes, it's set to fdatasync
atj
oh, it's the default on Linux
pixelpenguin
<lucifer> "pixelpenguin: hey!" <- heyy, what's up
jasje
akshaaatt: update dev branch?
with my pr
work all good
ship it
bitmap
atj_: setting full_page_writes=off doesn't seem to affect the benchmark at all, probably because the test doesn't write enough data or run long enough to trigger an automatic checkpoint
were there any other settings I should tweak? jimmy has the same amount of RAM as pink/floyd right?
atj
bitmap: you can disable wal_init_zero
and wal_recycle
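Collected as a postgresql.conf fragment, the settings discussed above look roughly like this. Note this is a benchmarking configuration: `full_page_writes = off` risks data corruption after a crash on most filesystems and is only safe to consider on CoW filesystems like ZFS.

```
# postgresql.conf fragment -- benchmarking setup discussed above
wal_sync_method = fdatasync   # the default on Linux
full_page_writes = off        # no visible effect in the short pgbench run
wal_init_zero = off           # skip zero-filling new WAL segments (useful on ZFS)
wal_recycle = off             # don't recycle WAL segments (useful on ZFS)
```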
bitmap
ok
atj
the other tweaks are at the ZFS level so we'll need to arrange a time to collaborate on that
but performance seems reasonable at this stage?
bitmap
yeah
I could start sending standby traffic to there (currently handled by pink)
atj
best hold off on that for now
I think some ZFS tweaks may have a decent impact on performance, but that will require creating a new FS, stopping the container and moving the data across
basically we need to create a dedicated FS for PG data and WAL
lucifer
pixelpenguin: Hi! let's discuss setting up select options in the Dataset Hoster UI. so like we currently have input fields for text and datetime, we want to add a field for select input that should be prepopulated with various options. these options can be hardcoded as an enum at the start.
in the final version, we want the options in the enum to be dynamic if possible (for example, loaded from redis or the database)
but either way you can start implementing the part with static options first and then we can look into dynamic options.
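A sketch of the static-options-first approach: the option values, function names, and field name below are invented for illustration, not DSH's real API. Keeping the lookup behind a function makes it easy to swap in a redis or database fetch later.

```javascript
// Hardcoded enum of options for the first version (values are placeholders).
const SELECT_OPTIONS = Object.freeze(["option_a", "option_b", "option_c"]);

// Later this could become async and read the options from redis or the DB,
// without changing the rendering code below.
function getSelectOptions() {
  return [...SELECT_OPTIONS];
}

// Render the options as a plain HTML <select> for the query input form.
function renderSelect(name, options) {
  const items = options
    .map((o) => `  <option value="${o}">${o}</option>`)
    .join("\n");
  return `<select name="${name}">\n${items}\n</select>`;
}
```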
for the partial query ticket, i will have a discussion with mayhem and then you should work on it next.
pixelpenguin
Thanks for the detailed explanation of the implementation, I agree with it.
I can get started on it
Is there anything else I should look out for?
Or any other task in the current PR which I should work on first?
lucifer
pixelpenguin: i had looked at the current PR, it lgtm. i have not merged it yet just in case some bug comes up and we need to do a new bugfix release of DSH. the current PR introduces multiple incompatible changes and that would make it harder to do so.
so we should probably merge it all at once.
so no other changes required on the existing work, but add the changes for these remaining 2 tickets to that PR as well.
bitmap
atj: ok, let me know if you create the new FS, I can help move the data over. though I'm not sure how well I can measure the impact of those changes with pgbench alone. I don't see any disk ops stats for jimmy in grafana