#bookbrainz


      • anonn has quit
      • MohamedAli00949 joined the channel
      • MohamedAli00949
        Hi there
      • I am trying to run the project locally, but I ran into this error and can't resolve it despite many attempts.
      • ```bash
      • ERROR in ./stylesheets/style.scss (./stylesheets/style.scss.webpack[javascript/auto]!=!../../node_modules/css-loader/dist/cjs.js!../../node_modules/resolve-url-loader/index.js!../../node_modules/sass-loader/dist/cjs.js??ruleSet[1].rules[0].use[3]!./stylesheets/style.scss)
      • Module Error (from ../../node_modules/sass-loader/dist/cjs.js):
      • Unexpected token '?'
      •  @ ./stylesheets/style.scss
      • ERROR in ./stylesheets/style.scss (./stylesheets/style.scss.webpack[javascript/auto]!=!../../node_modules/css-loader/dist/cjs.js!../../node_modules/resolve-url-loader/index.js!../../node_modules/sass-loader/dist/cjs.js??ruleSet[1].rules[0].use[3]!./stylesheets/style.scss)
      • Module build failed (from ../../node_modules/resolve-url-loader/index.js):
      • Error: resolve-url-loader: error processing CSS
      •   PostCSS received undefined instead of CSS string
      •   at new Input (/home/mohamed/Downloads/bookbrainz-site/node_modules/postcss/lib/input.js:24:13)
      •     at encodeError (/home/mohamed/Downloads/bookbrainz-site/node_modules/resolve-url-loader/index.js:274:12)
      •     at onFailure (/home/mohamed/Downloads/bookbrainz-site/node_modules/resolve-url-loader/index.js:215:14)
      •  @ ./stylesheets/style.scss
      • ERROR in ./stylesheets/style.scss
      • Module build failed (from ../../node_modules/mini-css-extract-plugin/dist/loader.js):
      • HookWebpackError: Module build failed (from ../../node_modules/resolve-url-loader/index.js):
      • Error: resolve-url-loader: error processing CSS
      •     at symbolIterator (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3485:9)
      •     at done (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3527:9)
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:4873:8
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:3352:32
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/HookWebpackError.js:68:3
      •     at Hook.eval [as callAsync] (eval at create (/home/mohamed/Downloads/bookbrainz-site/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:4:1)
      •     at Cache.store (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Cache.js:107:20)
      •     at ItemCacheFacade.store (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/CacheFacade.js:137:15)
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:3352:11
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Cache.js:93:5
      •     at Hook.eval [as callAsync] (eval at create (/home/mohamed/Downloads/bookbrainz-site/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:4:1)
      •     at Cache.get (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Cache.js:75:18)
      •     at ItemCacheFacade.get (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/CacheFacade.js:111:15)
      •     at Compilation._codeGenerationModule (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:3322:9)
      •     at codeGen (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:4861:11)
      •     at symbolIterator (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3482:9)
      •     at timesSync (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:2297:7)
      •     at Object.eachLimit (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3463:5)
      •     at Hook.eval [as callAsync] (eval at create (/home/mohamed/Downloads/bookbrainz-site/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:13:1)
      •     at Hook.CALL_ASYNC_DELEGATE [as _callAsync] (/home/mohamed/Downloads/bookbrainz-site/node_modules/tapable/lib/Hook.js:18:14)
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:4993:43
      •     at symbolIterator (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3482:9)
      •     at timesSync (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:2297:7)
      •     at Object.eachLimit (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3463:5)
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:4958:16
      •     at symbolIterator (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3485:9)
      •     at timesSync (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:2297:7)
      •     at Object.eachLimit (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3463:5)
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:4926:15
      •     at symbolIterator (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3485:9)
      •     at done (/home/mohamed/Downloads/bookbrainz-site/node_modules/neo-async/async.js:3527:9)
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:4873:8
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Compilation.js:3352:32
      •     at /home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/HookWebpackError.js:68:3
      •     at Hook.eval [as callAsync] (eval at create (/home/mohamed/Downloads/bookbrainz-site/node_modules/tapable/lib/HookCodeFactory.js:33:10), <anonymous>:4:1)
      •     at Cache.store (/home/mohamed/Downloads/bookbrainz-site/node_modules/webpack/lib/Cache.js:107:20)
      • ```
      • MohamedAli009496 joined the channel
      • If you can help me, that would be nice
      • MohamedAli00949 has quit
      • MohamedAli009496 has quit
      • MohamedAli00949 joined the channel
      • MohamedAli00949 has quit
      • insane_22 joined the channel
      • insane_22 has quit
      • insane_22 joined the channel
      • insane_22 has quit
      • MohamedAli00949 joined the channel
      • MohamedAli00949 has quit
      • insane_22 joined the channel
      • insane_22
        Hello people, I have a question. In the current setup we use Redis 'for sessions and to cache requests to external APIs' like Wikimedia. At the same time, it's a popular practice to add a Redis caching layer in front of your Elasticsearch queries. Is there any specific reason not to do so?
      • insane_22 has quit
      • insane_22 joined the channel
      • insane_22 has quit
      • munishk joined the channel
      • insane_22 joined the channel
      • insane_22 has quit
      • anonn joined the channel
      • munishk has quit
      • Tarun_0x0 joined the channel
      • Tarun_0x0 has quit
      • Tarun_0x0 joined the channel
      • insane_22 joined the channel
      • monkey[m] joined the channel
      • monkey[m]
        Hi MohamedAli00949 (IRC) did you modify stylesheets/style.scss ? The error says it found an unexpected question mark somewhere, but I can't replicate the issue myself.
      • Are you running everything in Docker? Have you made modifications to other files?
      • insane_22 (IRC): No reason, no. Didn't know about using Redis to cache queries
      • Tarun_0x0 has quit
      • insane_22 has quit
      • insane_22 joined the channel
      • insane_22
        monkey, in my view this could be achieved by storing the results of database queries and checking for them before having to search again. Isn't it?
      • After googling, I saw that it's used in some places and wanted to validate whether it would even be needed in our case
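The cache-aside pattern being discussed can be sketched like this. A plain `Map` stands in for Redis and `runSearch` for the actual Elasticsearch call, so all names and the 60-second TTL here are illustrative, not BookBrainz code:

```javascript
// Cache-aside sketch: check the cache before hitting the search backend.
// A Map with expiry timestamps stands in for Redis; with Redis you would
// use GET / SET with an EX option and the same key scheme.
const cache = new Map();
const TTL_MS = 60 * 1000; // how long a cached result stays valid (illustrative)

let backendHits = 0;
function runSearch(query) {
  backendHits++; // counts real backend calls, to show the cache working
  return `results for "${query}"`; // stand-in for an Elasticsearch query
}

function cachedSearch(query) {
  const key = `search:${query}`;
  const entry = cache.get(key);
  if (entry && entry.expires > Date.now()) {
    return entry.value; // cache hit: no backend call
  }
  const value = runSearch(query); // cache miss: query the backend, then store
  cache.set(key, { value, expires: Date.now() + TTL_MS });
  return value;
}

cachedSearch('harry potter');
cachedSearch('harry potter'); // second call is served from the cache
```

As yvanzo notes later in this discussion, the hard part of this pattern is invalidation; a TTL like the one above only bounds staleness, it does not remove it.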
      • Actually, to get a better idea of the way things are done, I was looking into the MusicBrainz search server project, which uses Solr search. It uses "RabbitMQ which gathers data to be indexed from the database, and finally builds searchable documents and sends these to the Solr search server". I was thinking about using Redis for the same, and about other ways by which we can improve upon things.
      • wait, are monkey and monkey[m] two different people? O_O
      • monkey[m]
        Nope, just me switching between computers :)
      • IMO using RabbitMQ is the way to go
      • Regardless of caching query results
      • Also I think a good start would be tuning the debounce for the search inputs, currently it's too low and we're doing a bunch of searches as we type
      • If you want to fix that too it's an easy one with good resource savings
      • I'd have to read up on using Redis for caching query results: got some docs I can read?
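The debounce tuning mentioned above can be sketched as follows. This is a generic trailing-edge debounce helper, not the site's actual search input code; the 300 ms delay is an illustrative value:

```javascript
// Generic trailing-edge debounce: collapses a burst of calls into a single
// invocation after `delay` ms of quiet. Raising the delay means fewer
// search requests fire while the user is still typing.
function debounce(fn, delay) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);            // a new call cancels the pending one
    timer = setTimeout(() => fn(...args), delay);
  };
}

// Simulated usage: five rapid "keystrokes" result in a single search,
// issued with the latest term only.
let calls = 0;
const search = debounce((term) => {
  calls++;
  console.log(`searching: ${term}`);
}, 300);

for (const term of ['h', 'ha', 'har', 'harr', 'harry']) {
  search(term);
}
```

With no debounce (or a very short delay), each keystroke above would have triggered its own search request.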
      • yvanzo
        We are actually thinking of using PostgreSQL directly for search indexing in MusicBrainz.
      • It seems to be manageable without RabbitMQ.
      • At least it is for indexing cover art.
      • This is an implementation detail and it probably doesn’t matter much at BB scale.
      • insane_22 has quit
      • insane_22 joined the channel
      • Tarun_0x0 joined the channel
      • insane_22 has quit
      • insane_22 joined the channel
      • Also MB search results are cached through OpenResty which uses nginx.
      • insane_22
        Thank you yvanzo for your replies
      • I'm a bit familiar with RabbitMQ, but I have no idea about the use cases, implementation, or anything else of OpenResty.
      • Also, I would like to know if there are any reasons not to use Redis.
      • yvanzo
        The main issue with using RabbitMQ is that `pg_amqp` is unmaintained.
      • For using PostgreSQL directly, see https://github.com/metabrainz/sir/pull/108
      • It seems feasible to use Redis/KeyDB as message broker.
      • I would still advise starting with the simplest solution (a PG table) at first, and having metrics through Prometheus to monitor scalability.
      • bitmap
        interestingly, for musicbrainz, all of the message data that sir needs is already stored in postgres (under dbmirror2.pending_data). so we have triggers pushing the same data to rabbitmq for no real reason
      • Tarun_0x0 has quit
      • yvanzo
        (For others, dbmirror2 is more recent than the search architecture.)
      • bitmap
        yeah, we just haven't made use of it yet
      • monkey[m]
        Interesting
      • bitmap
        I would concur with the use of a PG table though. postgres is very capable for messaging nowadays if you don't need advanced routing
      • and it's one less service to manage
      • yvanzo
        Also avoiding the need of a bridge PG-RabbitMQ or PG-Redis makes it much simpler to maintain.
      • bitmap
        you can safely have multiple consumers take items from the table with `SELECT ... FOR UPDATE SKIP LOCKED`
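A minimal sketch of the queue pattern described here, assuming a hypothetical `search_queue` table (the table and column names are illustrative, not from the BookBrainz schema):

```sql
-- Hypothetical queue table fed by triggers on entity changes.
CREATE TABLE search_queue (
    id      bigserial PRIMARY KEY,
    payload jsonb NOT NULL
);

-- Each indexer claims a batch of items. SKIP LOCKED makes concurrent
-- consumers skip rows already claimed by another transaction instead
-- of blocking on them, so multiple workers can drain the same table.
BEGIN;
DELETE FROM search_queue
WHERE id IN (
    SELECT id FROM search_queue
    ORDER BY id
    LIMIT 10
    FOR UPDATE SKIP LOCKED
)
RETURNING payload;
COMMIT;
```

If the indexing work fails, rolling back the transaction returns the claimed rows to the queue.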
      • Tarun_0x0 joined the channel
      • insane_22 has quit
      • Tarun_0x0 has quit
      • Tarun_0x0 joined the channel
      • insane_22 joined the channel
      • insane_22
        all of this makes sense to me (except for "Prometheus to monitor scalability", I need to see how that works); I'll also have a better look at the MB changes (dbmirror2 and the PR that yvanzo sent)
      • IMO, all of us are on the same page about this: using Postgres directly to send trigger messages to the search index rebuilder
      • Also, about having a query-caching layer in front of search (or Solr search, the upcoming project for the summer): in MusicBrainz, as yvanzo said, it's done "through OpenResty which uses nginx"; I had earlier thought of using Redis to do this
      • "Also I think a good start would be tuning the debounce for the search inputs, currently it's too low and we're doing a bunch of searches as we type" - monkey[m] Isn't this already done? Here:- https://github.com/metabrainz/bookbrainz-site/b...
      • yvanzo
        insane_22: Prometheus can be used to report metrics such as the number of updates per minute. See for example: https://stats.metabrainz.org/d/000000078/sir-an...
      • (By the way, MB search has some metrics but unfortunately not this one at the moment.)
      • The issue with caching search results would be to know when to invalidate the cache.
      • Tarun_0x0 has quit
      • insane_22 has quit
      • insane_22 joined the channel
      • Tarun_0x0 joined the channel
      • insane_22 has quit
      • Tarun_0x0 has quit
      • Tarun_0x0 joined the channel
      • monkey[m] has quit
      • Tarun_0x0 has quit