Deno KV Is in Open Beta
(deno.com)

I poked around with this a few months ago to figure out how it works locally. The answer is SQLite: https://til.simonwillison.net/deno/deno-kv
I'm finding the business model aspect of Deno KV absolutely fascinating.
const kv = await Deno.openKv();
That's a Deno core API. It works fine in the open source version of Deno using a local SQLite database file. But as soon as you deploy your application to their proprietary hosted service, that core API feature gets massively more powerful: it's no longer a SQLite database, it's now a globally distributed key/value store backed by FoundationDB, replicated around the world.
It looks like they've extended that idea further with the latest version - you can now do this:
export DENO_KV_ACCESS_TOKEN="personal access token"
const kv = await Deno.openKv(
"https://api.deno.com/databases/your-database/connect",
);
And your local code is now able to manipulate that remote FoundationDB database as well.

I'm having trouble thinking of a precedent for this - an open source project whose core API is effectively a lead generator for their proprietary cloud service.
I'm not entirely sure how I feel about it. I think I like it: open source projects need a business model, and the openKv() method is still a supported, useful part of the open source offering.
Kind of fascinating pattern though.
UPDATE: I just found this page of docs https://github.com/denoland/deno/blob/be1fc754a14683bf640b7b... - which describes the "KV Connect" protocol they are using. It looks like this levels the playing field, in that anyone could implement their own alternative backend to Deno Deploy if they wanted to.
This firmly establishes me on the "I think this is cool" side of the fence.
This is the enshittification of open source.
It has been happening for a while with a bunch of startups like Supabase claiming to be "open source" and marketing themselves as such, while making it really hard to self-host for a long time.
It wasn't just them either.
I would see with disgust a bunch of startups use "open source" as their marketing tactic, no matter how hard it was to setup or run without their hosted service.
It is also a perverse incentive: the harder the open source system is to run and maintain, the more you will gravitate toward their cloud. Supposedly open source companies raising a ton of VC money also sits strangely with the open source ethos.
Deno KV is basically the next jump in that chain.
Richard Stallman was right once again, as usual.
> This is the enshittification of open source.
You can simply not use it, no?
I am not going to make any sweeping generalizations across all products. But at least in the case of Deno KV, there doesn't seem to be lock-in. So if you were running something self-hosted for KV persistence, it will continue to work unmodified.
> I would see with disgust a bunch of startups use "open source" as their marketing tactic.
Again, not sure which bunch of startups. But I am not seeing that with this product. Seems more like a survival strategy to add some cashflow behind the developers.
I am curious what you think Open Source should be (or should not be). I think it's fair that running a service in the cloud should cost something. And self-hosting it, I think it's fair that it requires a bit more effort than using the hosted service.
One does not have to use/not use something to have valid criticisms of it/opinions about it [1]
1 - https://i.kym-cdn.com/entries/icons/original/000/036/647/Scr...
Isn't that the reverse? It's against the sentiment that being part of a system means that you implicitly agree with it, and any complaints you make about it are void.
That meme doesn't further your point because the peasant is "using" society and understands the problems with it.
My point with the meme link is that you can respond to any complaint about anything with a 'gotcha' argument that's ultimately invalid.
Seems like you can similarly respond to any complaint with that meme too, which is also invalid. In reality, there is a lot more nuance than can be described by pithiness.
Does this change your mind at all? https://github.com/denoland/deno/blob/be1fc754a14683bf640b7b...
It looks to me like they've documented the KV Connect protocol they invented to support this feature, in enough detail that anyone else could build an alternative backend for it.
This has helped me feel completely OK with how they're handling this. They get an advantage in that they've already built an extremely robust proprietary backend, but I find that acceptable given the documented protocol.
I think this is great, except I feel odd that it's just hanging around on the Deno global instead of being e.g. imported like any other database client.
If their protocol is indeed open and usable with your own backend, then that library should be able to work for anyone. And if they need some fancy native performance then maybe they could intercept that import when running code on Deno Deploy?

    import KV from "https://deno.land/kv" // for example

Treating their hosted service as "part of the runtime" - which is what the Deno global tends to be for - is the only remaining ick factor for me.
Yeah, this is the only thing I don't like. Having it readily available as a module import would not change anything (except one line of code) and it feels more decoupled. But in practice I don't see a huge difference, it's not like it's polluting the global namespace with dozens of functions.
> This is the enshittification of open source.
I'm really confused by this statement. What exactly is being degraded in their service? Or in their API? Or in the underlying tech they are compatible with?
All I see here is an open source system that you can manage and deploy yourself, with a 100% compatible API for a cloud service that handles that for you, should you decide to pay money to have that problem solved for you.
I find your comment to be quite dramatic.
Good Open Source is something like PostgreSQL: It's completely free, you can host it yourself, or you can pay someone to host it, there are multiple competing service providers, most of them contribute to the project, and everyone has access to almost all of the source code. Anyone can start offering a compatible service with minimal investment. If you run into a problem, the source code for everything is public, and you can often fix it yourself.
And then there is something like DenoKV: There's an Open Source version that is designed mostly for prototyping or small scale deployments, and a closed source hosted version designed for production. If you want to use it you have to pay one company and there are no competitors. You are locked in. Theoretically, a competitor could create a compatible product, but the required effort is huge, creating a big barrier to entry. And even if competitors do show up, any new features introduced by the proprietary service will take a long time to trickle down to competing services. If you run into a problem, you have to hope the vendor fixes it.
I am definitely ready to call out shams, sellouts, and "enshittification." But I don't see how this one option is any of that.
I'm building a back end with Deno and MariaDB, and pretty psyched about it so far. I haven't built a back end since I did one with PHP 5 in 2011 or so. Thus far I've found it easier to get something working than I did with the tools years ago. So much so that I'd consider throwing these guys a bone by paying for a service.
But... I don't know that a key/value store suffices for what I want to do, which will involve relational queries.
Stallman was right and he provided an alternative. Go use it. Stick to GPLd software safely nested under the auspices of a foundation.
Don't expect Deno or any other corporate entity solely focused on profit seeking to give a single shit about anything other than profit.
It's all marketing, it's all spin, and it can't be any other way, that's how the system is structured.
Cloudflare KV is basically the exact same. Even the same name.
Cloudflare’s pricing seems more reasonable, though a 1:1 comparison is impossible. Cloudflare reads and writes in every region for the same price, while Deno scales price with the number of replica regions. Also, CF is billed per read/write operation (and supports listing), while Deno is billed per kB read/written.
That all said, if your values are bigger than ~2kB Cloudflare is almost certainly cheaper. And the list (with metadata) operation is quite powerful.
However, single-region means you don’t have to care about synchronization nearly as much, which can be quite annoying. The end user experience will suffer in areas far from your write zone though.
A better comparison might actually be Cloudflare D1, a hosted SQL with per-kB fees, but that’s still beta.
Yes, Deno KV is much closer to D1 than to Cloudflare KV.
Except D1 gives you a full powered SQL interface and DKV gives you “get”, “set”, and transactions.
D1 is also faaaaaar cheaper. Like 3+ orders of magnitude cheaper.
> I'm having trouble thinking of a precedent for this - an open source project that has a core API which is effectively a lead generator for their proprietary cloud service.
Is this not Vercel’s entire business model? I don’t think they invented it either.
I suppose it's a variation on the "freemium" model.
I like this too... it's a practical API, but there's no way you could actually provide this production API for free. So instead of an open source project pulling back stuff you'd expect to be free (because it has zero marginal cost), they are adding stuff that makes sense to be paid.
NextJS / Vercel is not the same thing but shares similar approach
I'm wondering if this might become interesting for a Sandstorm-like use case, where you can write a personal web app, publish it on GitHub, and other people can deploy it easily to their own Deno Deploy account. (Or they can self-host if they prefer.)
Why does a KV store need SQLite? Does it provide transactionality?
What would you use instead?
By the time you've implemented even a basic key-value store with on-disk storage you've probably written a bunch of code that would be unnecessary if you had used SQLite.
LMDB?
SQLite is superb choice as a local storage solution, you don't need a transactionality requirement to use it as a KV store.
People choose NoSQL databases primarily for scaling reasons, which is not the problem here.
I continue to watch Deno with excitement. I haven't had a good use case to play with it yet (all my free programming time has gone into my side business and I'm not ready to chance it on Deno yet) but I'll keep looking.
I find the way they handle secondary indexes very interesting. I mean under the hood I think DynamoDB does pretty much the same thing (stores the data multiple times) but instead of explicitly writing the data multiple times you define fields on the data that the secondary indexes use so the data is written there at the same time it's written to the primary (I could be a little mistaken, I'm working at a higher abstraction layer so I don't think about that). I can't decide which approach I like more. I will say that I don't think I'd need anything but my own abstraction layer to work with Deno KV vs DynamoDB. That said I still think DynamoDB is way more powerful overall.
As always I'm rooting for Deno to succeed.
Deno gives me hope for web development. The security model, no bundler, and general pace of progress are great. They've drastically improved Node interop, but if they could close the gaps (and sort out extensionless imports) so many more projects could finally jump ship over to Deno.
I worked on some of this - happy to answer questions :)
When you run `kv = await Deno.openKv()` locally it opens a SQLite database. On Deno Deploy it opens a connection to FoundationDB. How does that mechanism work? Is it using the same URL mechanism as the new Deno.openKv(URL) thing?
Yes, sort of - on Deno Deploy the authentication doesn't come from a token in env vars, but from intrinsic security tickets baked into the Deno Deploy system. Also, it's a bit faster on first connect, because compared to KV Connect we can skip the metadata exchange[1] - the information it provides is already present in Deno Deploy through other means. Both the backend service and the frontend API (JS->KV) are the same though :)
[1]: https://github.com/denoland/deno/blob/be1fc754a14683bf640b7b...
Typo in your blog post: "first processed at the primary pegion"
What's the cost of additional read regions?
Storage cost + write ops are multiplied by region count
Is this part of a business model for Deno?
Yes, the hosted KV solution built into Deno Deploy is one of the ways Deno makes money.
Is "Deno KV" a feature of Deno the runtime or Deno the hosting provider? The docs aren't clear about what it actually is, and that makes me a bit wary when deciding to use it.
It's both. I wrote some notes on that here: https://til.simonwillison.net/deno/deno-kv
Related:
Deno KV - https://news.ycombinator.com/item?id=35743446 - April 2023 (11 comments)
How has using FoundationDB been?

What have been the biggest pros? The biggest cons? Would you use it again (one alternative could have been TiKV)?
Genuine question, are there folks hyped about Deno that don't come from a nodejs background?
Me. I'm excited about what they're doing with their security stuff (I love that it can only access files that were explicitly allowed) and it feels like there's a ton of good ideas in there generally. My notes so far: https://simonwillison.net/tags/deno/
Yes.
congrats to the deno team on launching this :)
Thanks <3
Congrats on the launch!
I don't know enough about this to make any real nuanced comments, but I hope they clean up this import cuz this is ugly:
> import { Semaphore } from "https://deno.land/x/semaphore@v1.1.2/semaphore.ts";
Should just be:
> import { Semaphore } from "deno/utils";
Or something like that.
Deno has various mechanisms[0] for abbreviating your imports if/how you want to, but I like the fact that the canonical paths are plain, full URLs
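One such mechanism is an import map - for example, an "imports" section in deno.json (a sketch; the alias name is up to you):

```json
{
  "imports": {
    "semaphore": "https://deno.land/x/semaphore@v1.1.2/semaphore.ts"
  }
}
```

With that in place you can write `import { Semaphore } from "semaphore";` while the canonical full URL lives in exactly one file.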
Yeah, it's actually a _feature_ if you've never been lost in the shit show that is node_modules before. I highly prefer Deno's style.
What syntax would you suggest for importing a specific version of a library?
import ... from "deno@1.1/utils"
idk i just think there's many options besides a full url in your code
In Deno you can load your code from any URL on the Internet, instead of relying on a namespace within a single package manager.
Unfortunately, having a URL is only the beginning. After that it gets complicated, because the "source" you get by following a URL in Deno (or a web browser) has often been put through the blender, repackaged and minified by some CDN. It's hard to read and VS Code's debugger doesn't handle it well. Sometimes I'm left just looking at type definitions and it's unclear where the source code for the implementation is at all. Aren't source maps supposed to help here?
So, I look at the repo instead, but now I don't know how it matches up with the code I'm actually running.
I find Go modules to be easier to understand. They are similarly distributed (many source repos will work), but when you navigate to the source code, you get the actual source.
Yes, people do all kinds of things to their code, but this makes it possible for you to download your own.