implement index sharding
we have an index that grew to 16MB uncompressed. brotli can get it down to ~800KB, but that takes several minutes and a lot of CPU and RAM; gzip takes far less time but only gets it down to 2MB.
the issue is that we can't take advantage of incremental builds (#1): the index changes whenever a post is added or modified, so it gets recompressed on every build.
to prevent this, i think we could "shard" the index into several smaller files, grouped by year-month (maybe), so the only files recompressed on each build are those containing a modified post (most likely the current year-month's).
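a rough sketch of what the build side could look like. the post shape (`slug`/`title`/`body`/`date`) and the `index-YYYY-MM.json` filenames are assumptions, not something we have today:

```javascript
// group posts into year-month shards; post shape is an assumption
function groupByMonth (posts) {
  const groups = {}
  for (const post of posts) {
    const key = post.date.slice(0, 7) // "YYYY-MM"
    ;(groups[key] = groups[key] || []).push(post)
  }
  return groups
}

// one lunr index per shard; only shards containing a changed post
// would need to be rebuilt, re-serialized and re-compressed
function writeShards (posts) {
  const lunr = require('lunr')
  const fs = require('fs')
  for (const [key, group] of Object.entries(groupByMonth(posts))) {
    const idx = lunr(function () {
      this.ref('slug')
      this.field('title')
      this.field('body')
      group.forEach(p => this.add(p))
    })
    fs.writeFileSync(`index-${key}.json`, JSON.stringify(idx))
  }
}
```

the grouping key is the first 7 characters of an ISO date, which keeps the "current month" shard the only one that churns on a normal build.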
then search.js needs to fetch the full list of indexes, and since lunr doesn't support merging indexes, we would need a lunr instance for each one and run the queries in parallel. then we would have to merge the results so we can order them by score (or maybe not?)
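the client side could look something like this. the shard url list is assumed to come from somewhere (a manifest, a templated array, whatever), and note that lunr scores are computed from each shard's own idf statistics, so the merged ordering is only approximate, which might be the "or maybe not?" part:

```javascript
// one lunr instance per shard, queried in parallel (browser sketch;
// assumes lunr is loaded on the page and shardUrls comes from a manifest)
async function searchAll (query, shardUrls) {
  const perShard = await Promise.all(shardUrls.map(async url => {
    const data = await fetch(url).then(r => r.json())
    return lunr.Index.load(data).search(query)
  }))
  return mergeByScore(perShard)
}

// scores come from each shard's own idf stats, so global ordering
// across shards is approximate rather than exact
function mergeByScore (resultLists) {
  return resultLists.flat().sort((a, b) => b.score - a.score)
}
```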
this last bit is weird because downloading several smaller files can take longer overall than downloading one big one. though we can fetch them in parallel and even start showing results as each shard arrives :O
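the "start showing results" idea could be sketched like this, rendering each shard's hits as soon as its download settles instead of waiting for all of them. `loadShard` (url → promise of a lunr index) is a made-up shape injected so the network layer is easy to stub, not anything lunr provides:

```javascript
// append hits to the page as each shard resolves; `loadShard` and
// `onResults` are hypothetical hooks, not lunr api
async function searchProgressive (query, shardUrls, loadShard, onResults) {
  await Promise.all(shardUrls.map(async url => {
    const idx = await loadShard(url)
    onResults(url, idx.search(query)) // render this shard's hits now
  }))
}
```

the downside is the same as above: results arrive unordered across shards, so the UI would either re-sort as shards land or just accept per-shard ordering.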