Bloom – Super fast and highly configurable cache for REST APIs
I read pretty far into the readme before I realized this doesn't somehow use Bloom filters.
How does it compare to Varnish?
Vastly simpler: I understood what Bloom is doing after 10 minutes of reading, whereas I had a hard time even grasping all of Varnish's scope in the same amount of time.
Written in Rust. Uses Redis. Uses custom headers instead of Cache-Control. Cache invalidation seems more flexible, though maybe Varnish can do this too once you've understood all of VCL.
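For illustration, the upstream API tells Bloom how to cache via response headers, something roughly like the sketch below. The header names and values are from my recollection of the README, so treat them as assumptions and double-check there; the idea is that the bucket tags the response for targeted purges later, and the TTL sets the cache lifetime in seconds.

```http
HTTP/1.1 200 OK
Content-Type: application/json
Bloom-Response-Bucket: user:42
Bloom-Response-TTL: 600
```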
It seems written for an API developer who doesn't want to write/reinvent a cache layer for their REST API, versus Varnish, which is written for a sysadmin building something larger and more generic, less tied to one specific app, who wants a tool that's old, reliable, and has every feature ever thought of.
> an API developer not wanting to write own/reinvent cache layer for their REST API
For Laravel projects at least, a package like spatie/laravel-responsecache makes it super easy to handle caching for GET API routes. I'm sure there are similar packages for other frameworks often used in API development.
I really like Bloom; I'd just rather handle it at the application layer, where I can get the finest level of customization (assuming there's a suitable package to abstract the most tedious work away).
Seems you could keep the associated code to a minimum, and easily maintainable, by using model events to trigger cache updates.
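A minimal sketch of that idea, framework-agnostic. All the names here (Cache, Post, the saved-listener hook) are hypothetical stand-ins for, e.g., Laravel's model events and cache facade, not any real API:

```python
class Cache:
    """Minimal in-memory stand-in for a response cache (e.g. Redis)."""
    def __init__(self):
        self._store = {}

    def put(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)

    def forget(self, key):
        self._store.pop(key, None)


class Post:
    # Listeners fired after every save, mimicking a model "saved" event.
    saved_listeners = []

    def __init__(self, pk, title):
        self.pk, self.title = pk, title

    def save(self):
        for listener in self.saved_listeners:
            listener(self)


cache = Cache()

# One listener evicts the cached API response whenever the model changes.
Post.saved_listeners.append(lambda post: cache.forget(f"/api/posts/{post.pk}"))

cache.put("/api/posts/1", '{"title": "old"}')
post = Post(1, "updated")
post.save()
print(cache.get("/api/posts/1"))  # None: entry was evicted on save
```

The nice property is that invalidation lives in one place (the model event), so route handlers stay unaware of the cache.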
Personally, I'd rather have a little more code than a new dependency (and the resources Bloom takes from each API worker it's installed on). But in situations where it's non-trivial or inadvisable to do it at the application layer, it seems Bloom could be quite useful.
Curious: if you're already running NGINX in front, why not just use proxy_cache?
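For reference, a minimal proxy_cache setup is only a few lines; the tricky parts the other comments hint at are keying on the Authorization header (so authenticated responses aren't shared across users) and purging. Values below are illustrative, and `backend` is a placeholder upstream:

```nginx
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=api_cache:10m
                 max_size=1g inactive=10m;

server {
    listen 80;

    location /api/ {
        proxy_pass http://backend;
        proxy_cache api_cache;
        # Include the Authorization header in the cache key so each
        # authenticated user gets their own cached copy.
        proxy_cache_key "$scheme$request_method$host$request_uri$http_authorization";
        proxy_cache_valid 200 10s;
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

What stock NGINX doesn't give you (without the commercial purge module or Lua) is targeted invalidation on data updates, which is exactly the gap the Bloom README argues a dedicated middleware should fill.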
Reminds me of https://varnish-cache.org/
What's the difference between this and varnish or nginx acting like a reverse proxy?
Found this at https://crates.io/crates/bloom-server:
> A simpler caching approach could have been to enable caching at the Load Balancer level for HTTP read methods (GET, HEAD, OPTIONS). Although simple as a solution, it would not work with a REST API. REST API serve dynamic content by nature, that rely heavily on Authorization headers. Also, any cache needs to be purged at some point, if the content in cache becomes stale due to data updates in some database.
> NGINX Lua scripts could do that job just fine, you say! Well, I firmly believe Load Balancers should be simple, and be based on configuration only, without scripting. As Load Balancers are the entry point to all your HTTP / WebSocket services, you'd want to avoid frequent deployments and custom code there, and handoff that caching complexity to a dedicated middleware component.