The Missing App Economy (2022)
It should be obvious to everyone now that centralized platforms (like Apple's App Store) might be generous and foster innovation during their growth stages, but then become extractive and limit innovation in later stages. After all, they have total control of the platforms. This is why decentralization matters.[1]
The problem is that "decentralized" protocols often end up becoming de facto centralized because they lack key economic features.[2] And centralized platforms fill the gaps by offering those economic features — at the cost of giving them total control of the network.[3]
This is why the most innovative and most interesting web applications occur in the browser, which is open and permissionless — and where web domains are mostly decentralized in nature. The only way to ensure a platform retains something close to its original economic properties is to ensure it remains decentralized, but it's hard to resist centralization.
I know HN hates it, but there is a class of protocols that use cryptography and Byzantine fault-tolerant consensus mechanisms and are designed to resist centralization... and they have explicit and verifiable economic properties. They're known as crypto, and involve blockchains. And this appreciation for decentralized-by-design systems is colloquially called web3.
[1]: https://cdixon.org/2018/02/18/why-decentralization-matters
[2]: https://www.youtube.com/watch?v=WGfS6pPJ5jo
[3]: https://knightcolumbia.org/content/protocols-not-platforms-a...
I don't see hate for protocols or cryptography on HN. But I do see
(a) skepticism about how decentralized (as opposed to merely distributed) these systems are in practice, especially where the topic is financialized tech, or even an end run around all the hard-won lessons society has encoded about finance; and
(b) a sense that the decentralization offered by things like the web is essentially "good enough": capabilities are strong, personal and entrepreneurial latitude is pretty wide, and obstacles are few.
EDIT: Thanks for this comment — I edited the original for clarity.
I'm talking about crypto as in Ethereum: protocols designed to resist centralization. HN wrongly thinks of crypto as a financial system (which it can be, for good and bad reasons, thanks to its permissionlessness), instead of a family of methods for building composable, decentralized protocols with built-in economics (blockchains).
But why do these protocols need built-in economics? No user is going to want to have a wallet that slowly drains as they use the web -- it's just not convenient and will be thoroughly rejected by the mainstream.
Compare this to something like torrenting - people give, people take. Sure, indexing is centralized, but once you have the index, it's all p2p. That's the direction decentralization should be taking, instead of the next generation of microtransactions where they're required for operation.
You already have a wallet that slowly drains while you use the internet. Web3 wallets are just better integrated into it.
Also, there is a misconception that the cryptocurrency-based web is like dial-up. It isn't. What it is:
1. a way to decouple the ownership of digital goods from the service that gave them to you;
2. a way to actually do privacy-preserving computing;
3. an open layer for building added-value things that can interact with assets and services others have created (think web 2.0 in the bookmarklets era, but your bookmarklets can talk with any site).
Which of my wallets is draining as I watch YouTube/TikTok/Twitch or read HN/Reddit? I have guesses about what you could mean (data, attention, etc.), but some clarification would be nice.
1. There's no question that ownership is going to be a hot-button issue going forward, but I don't see how decentralization solves that problem (e.g. NFTs being spread across ten different chains).
2. How does it preserve privacy? If I share data with a third party, what's to stop them from copying it over and ignoring my request to cut it off?
3. What is that added value? If you're talking about data portability that's certainly an argument in favor of decentralization, but that has a mess of competing & conflicting standards to wade through (see #1)
Good questions.
I actually meant your normal wallet or bank account. They have a tendency to go toward zero even if we don't do anything, because we are already paying monthly to be connected to the modern world. But yeah, attention and data are also good examples of one's value draining away while using the web.
Re 1: most L1 blockchains have bridges to each other now. This means that I can move assets from one ecosystem to another quite easily. If one discounts the Ethereum network, moving across chains is actually quite economical as well.
Re 2: in web2.0 sharing your data with another party was the only way to play. In the decentralized internet it is not. There is a lot of interesting work going on in this domain actually.
Re 3: I'll use an example from the gaming world, but the concepts transfer to non-gaming contexts one to one. If Activision decides to create digital collectibles in their games today, I as an indie dev cannot create a game where a user can log in and have their Activision stuff available in my game. In web3 these assets are encoded as state on an open platform, so anybody can integrate them into their applications. In the case where NFTs are first-class chain state, as in Cardano, you are even guaranteed that you can trade your NFT on whatever platform you choose, be that an indie game or the originating marketplace. Is it clearer now what the added value is?
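The interoperability argument above can be sketched as a toy: assets recorded in one shared, open registry can be read by any application, not just the one that minted them. All names here are hypothetical, and a plain object stands in for replicated, signed chain state.

```javascript
// Toy stand-in for open, shared chain state: one registry, many readers.
class AssetRegistry {
  constructor() {
    this.owners = {}; // assetId -> owner address
  }
  mint(assetId, owner) {
    this.owners[assetId] = owner;
  }
  ownerOf(assetId) {
    return this.owners[assetId];
  }
}

// One shared registry, written to by a big studio's game...
const registry = new AssetRegistry();
registry.mint("sword#1", "alice");

// ...and read, without permission, by an unrelated indie game:
function loadPlayerItems(registry, player) {
  return Object.entries(registry.owners)
    .filter(([, owner]) => owner === player)
    .map(([assetId]) => assetId);
}

console.log(loadPlayerItems(registry, "alice")); // -> [ 'sword#1' ]
```

The point of the sketch is only that the state lives outside any one application, so integration needs no gatekeeper.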
> I actually meant your normal wallet or bank account.
Unless I'm on a mobile provider plan with data caps, the cost of consuming an additional Web page is zero.
Built-in economics doesn't mean users pay per use. A popular design path is to try to build systems where data that goes on the blockchain is very limited[1] and easily subsidized. It's also very easy to design systems where data writes and computation are subsidized by clients or the system itself.[2],[3],[4]
The reality is that in current networks, which are totally financial but only implicitly and in a way that is opaque to most users, computation and data costs are still subsidized. Crypto networks open a huge design space that simply makes the economic flows explicit — and not necessarily borne by users.
[1]: https://www.varunsrinivasan.com/2022/01/11/sufficient-decent...
[2]: https://medium.com/coinmonks/ethereum-meta-transactions-101-...
> But why do these protocols need built-in economics?
I believe that, in this era where the economy matters, blockchains have built-in economics for two reasons:
1. They all started by copying Bitcoin, which is by nature an economic system (this is the stated goal of the Bitcoin paper).
2. Monetary reward is a common way to attract a sustainable number of active users who keep the network alive, with no intention worse than making a quick buck. In contrast to volunteer networks such as Tor, where a lot (perhaps even a majority) of nodes are run by government agencies with the goal of identifying users, popular blockchains are still run properly, since users focus on making money rather than on destroying the network.
>No user is going to want to have a wallet that slowly drains as they use the web -- it's just not convenient and will be thoroughly rejected by the mainstream.
You don't need the mainstream, you just need the cool kids. The mainstream will follow---or not.
As for why anyone would want this, there is a very compelling reason---no ads.
But I already have no ads and no payments, so no thanks.
> Ethereum, which are designed to resist centralization
https://cryptoslate.com/research-ethereum-is-neither-decentr...
> permissionlessness
The existing internet is sufficiently permissionless for the vast majority of potential activities. Where it isn't, you start to reach into the boundaries past which there are often reasons for permissions to exist.
> HN wrongly thinks of crypto as a financial system... instead of a family of methods for making composable, decentralized protocols with built-in economics
A system with "built-in economics" is a financial system.
> HN wrongly thinks of crypto as a financial system (which it can do, for good and bad reasons thanks to its permissionlessness),
That's how it's implemented in real life, so that's what people's takeaway is going to be. What was the last dApp anybody used that wasn't a pay-to-play game, or crypto marketplace?
You may as well say communism is theoretically sound, but people will continue to get hung up on the fact that its real-world implementations have always led to despotic leadership.
> might be generous and foster innovation during their growth stages, but then become extractive and limit innovation in later stages.
Winner-takes-all is something like a law of nature.
While I'm very pro-competition, trust busting (anti-monopoly), and nurturing young small businesses...
I suspect that churn matters. Innovation and creativity requires the Powers That Be give the old ant farm a good shaking every few years.
I have no idea what that'd look like. The only historical analogy I have are Debt Jubilees.
https://en.wikipedia.org/wiki/Debt_jubilee
https://en.wikipedia.org/wiki/Jubilee_(biblical)
Said another way...
I'd be fine with maximally rights granting patents, all but sovereign corporate charters, and other final boss economic and political arrangements, provided that it all gets burned down every 7 years.
Shaking the water bowl is what keeps it oxygenated enough for the goldfish to thrive.
It's called dweb. Sometimes people do get it confused with 'web3' but that was just the name of the massive 2-billion-dollar shit a16z took on everything.
Dwebcamp just extended ticket sales if you want to go talk to real dweb people: https://dwebcamp.org/
I'm familiar with dweb, but it seems that they're just preparing to relearn the lessons of the early web (federation doesn't work; you need mechanisms to maintain Byzantine fault tolerant consensus; explicit economic incentives are required if you don't want the system to develop out-of-protocol centralizing economics; etc.).
So I'm just working on a CSS thing that barely qualifies as dweb. But the people I talk to that are into building solutions are well aware of the problems and tradeoffs. I don't disagree with your assessment. But the relearning is like our way of mining for genuinely new lessons. We're quixotic but totally self-aware about it.
Honestly, I prefer a native app to a web app, because the web browser is basically an OS running inside an OS, and there's an annoying amount of impedance mismatch / poor coupling between the two. Native (non-Electron) apps on the Mac just integrate with the rest of the OS's capabilities (text selection and drag/drop work consistently and interoperate properly; keystrokes, shortcuts, the system context menu, and other integrations just make using the machine smoother). This is also true under iOS.
I don't like apps that are just a wrapper to a web view, but they do protect in one way: if you fumblefinger when entering an address in the browser it can autocomplete to a phisher, while my bank's app is in a known place on my screen.
I'd argue bookmarks achieve much of the same result, but I see your point.
Cross-platform, as much as I'd like it to work, just doesn't seem to ever feel native. I'd very much like to see a transpilation-based cross-platform development framework that generates real Swift/Kotlin/Javascript for iOS/Android/Web, but I'm not sure if it's possible with all of the nice developer experience goodies that developers would come to expect (namely hot reloading) -- but maybe!
People have been trying to make that work since the 80s.
The different interfaces don’t have the same metaphors or semantics so cross-platform UI toolkits are always restricted to worst-case lowest common denominator capabilities.
You see this outside UX too — few programs are written to pure POSIX because usually you want to do more than that core.
> I don't like apps that are just a wrapper to a web view...
Devs are just too aware that they do it to maximise profits, not to maximise UX quality. It's like Linux guys building console tools and pretending they're better than a (good) UI.
Well, if one is an advocate of PWAs, wrapping it in an app store app should provide an equivalently good and perhaps more user-accessible experience, no?
So my point wasn't to point at the junk apps like, say, the CVS or ATT apps which are simply a web view to a shitty website that is itself just a poorly written skimcoat over a mainframe.
I'm talking about people who put a lot of effort into the web site and then turned it into an app as well to make it easier to run. I'm saying that those provide a crummy experience in practice as well, because in practice I think the web browser experience is rarely that good for anything more than simple functionality.
There are apps today that are using a mix of web and local capabilities.
- photopea.com - import local image, make edits, export back to local storage
- tldraw.com - same
- demo.logseq.com - best example I've found yet. No login required, you can import a local folder of Markdown files, and have the Logseq web app interpret them as a set of backlinked notes.
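A rough sketch of the Logseq-style pattern described above, under the assumption that local Markdown files arrive via the browser's File System Access API (currently mostly Chromium-only, e.g. window.showDirectoryPicker()); the parsing core is shown as a pure function so it runs anywhere:

```javascript
// Extract [[wiki-style]] backlinks from Markdown text -- the core of
// interpreting a local folder of notes as a linked graph, entirely
// client-side.
function extractBacklinks(markdown) {
  const links = [];
  const re = /\[\[([^\]]+)\]\]/g;
  let m;
  while ((m = re.exec(markdown)) !== null) links.push(m[1]);
  return links;
}

// Build a page -> linked-pages index from a map of filename to contents.
// In a browser, `files` would be populated by iterating a directory
// handle from the File System Access API.
function buildIndex(files) {
  const index = {};
  for (const [name, text] of Object.entries(files)) {
    index[name] = extractBacklinks(text);
  }
  return index;
}

const index = buildIndex({
  "a.md": "See [[b]] and [[c]].",
  "b.md": "Back to [[a]].",
});
console.log(index); // { 'a.md': [ 'b', 'c' ], 'b.md': [ 'a' ] }
```

Nothing here requires a server: the app reads, interprets, and writes local files, which is exactly what makes these web apps feel closer to native ones.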
To paraphrase the article: with the Google-Apple mobile duopoly, consumers are stuck between a rock (web apps) and a hard place (native apps).
It isn't in Apple's interest to allow web apps on iOS to have feature parity with native apps, because that's where Apple's moat is.
It isn't in Google's interest to allow mobile web browsers the freedom to behave like desktop browsers (extensions and filesystem APIs), because web ads are how Google establishes its moat.
The solution seems obvious: a phone with a barebones OS that can run a desktop-class browser. This is what the MokoPhone and Nokia N900 tried many years ago. Even Palm called its mobile offering webOS, because they knew that competing on app store counts was a dead end.
I'd say Safari is not bad at all on an iPad. I remember a lot of people carping because bad ideas from PWAs weren't implemented (spam spamifications, half-baked Rube Goldberg schemes that accomplish half of what Netscape Netcaster did back in the day...), but really, all the real web apps, and even demos like visualizations of NeRF models, work on my iPad.
> spam spamifications
As much as I dislike notifications, they're becoming a necessity in a world of decommoditized email. If I could know that my mail would actually get through when I send it to a willing recipient (as opposed to the almighty Google System Lords deciding that my mail should be dropped because their chicken entrails said it looked like spam that day) then there'd be no need to reinvent email as notifications... but here we are.
One of the drivers for email newsletters, ironically, is that bloggers don't trust Google to send visitors their way repeatedly, so they harass their users incessantly hoping to bypass the Google search monopoly although this runs headlong into the Google email monopoly.
To be fair though I remember working for firms that were struggling mightily to deliver mail to AOL in 2005.
“Spam spamifications” as in, notifications?
Good luck building a competitive direct messaging client without “spam spamifications”.
PWAs aren’t ideal, but building to a single platform with different screen sizes is far more efficient and achievable for upstarts than building multiple separate apps in different languages targeting multiple platforms. Arbitrarily limiting features like “spam spamifications” just gives an arbitrary advantage to the well funded over the independent and bootstrapped.
Would the librem line of phones be an early attempt at what you describe?
No, because they are designed for developers and privacy nuts first, not average customers. Devices like this will never achieve a level of polish necessary for general uptake. Heck, the Moko and N900 died precisely because they were too niche to justify the price.
I recommend spending some time on OnlyFans to see what social networks can look like without the App Store.
Video fundraisers, paid messages, creator subscriptions, tips, interactive live streams and tiered subscriptions all work together to allow creators to flourish financially.
Apple's 30% rules cripple YouTube's ability to imitate these features, and even restrict the viability of these features for platforms like Patreon, which have a critical chunk of their engagement coming from iOS.
Nearly everything in that space comes with a sea of confounding factors; the biological imperative is a fickle thing. I find it akin to gambling, in that folks with impulse-control disorders are most susceptible to making rash and often ill-advised decisions.
The point I'm trying to make is that you can't sensibly extrapolate lessons from there and apply them to a different space. Sadly. Because it is an absolute cash cow, and I reckon it probably makes up something like a quarter of Stripe's income.
Social interaction can be just as much of a biological imperative as reproduction.
I bet many people would be willing to pay $500 for a FaceTime call with MrBeast. Or pay $50 to choose which makeup their vlogger tries on. Maybe even $100 for their favourite comedian to make a joke about them.
What you're describing is parasocial interaction, not social interaction.
One person was willing to pay a lot more: https://www.essentiallysports.com/esports-news-mrbeast-recal...
I was initially horrified when I first read it, but then realised it would be like someone with $100k paying $25 for a birthday present.
> Video fundraisers, paid messages, creator subscriptions, tips, interactive live streams and tiered subscriptions
As a user, that doesn’t sound great to be honest.
Are you a user of this particular product though?
I personally don't play video games anymore because single-player is largely an afterthought, and they no longer come in a box costing $20-50. That was my mental model of the games business, but the industry has moved on to DLCs, microtransactions, subscriptions etc. And seeing as how gaming is bigger than ever, the majority appears to have accepted the pricing changes that I did not.
Similarly, those that pay for porn have likely had to adjust to the changing realities of that industry.
My son had loads of fun with Hogwarts Legacy, and that’s before any DLCs or multiplayer. Same with Stray, Unravel, and other titles. I agree there’s lots of games that I just don’t consider playing because of the reasons you mentioned.
Monetisation always hurts users, it’s just a matter of picking the least painful poison.
You probably don’t enjoy VPN adverts in the middle of your YouTube videos either, but people need to get paid somehow.
Well there’s YT premium. One transaction per month and you’re set.
Most of that is already on the app store anyways, besides the parts that don't make sense (ie "video fundraisers").
I'm trying to think where any of this is explicitly banned. You can do subscriptions with various tiers, various one-time consumable purchases, and various one-time permanent purchases. And things like games make this a bit easier with the ability to buy generic in-game currency that you can spend within the internal game store. On top of that, if the service is not "digital" you can use your own payment infra and skip the 30% cut, like Uber or buying physical goods from an e-commerce store.
Like video fundraisers, paid messages, creator subscriptions, tips, interactive paid streams and tiered subscriptions can be covered by consumable credit purchases and credit subscriptions. What are the explicit rules or history of rejections banning this?
The App Store doesn't prevent any of this (though, the App Store won't distribute porn) - it just requires a 30% cut. What's OnlyFan's cut of all those transactions?
YouTube can imitate these features on its website, where it can surely do a better job due to scale. Twitch does this; arguably, OnlyFans is largely modelled on Twitch.
> What's OnlyFan's cut of all those transactions?
Onlyfans takes 20%.
If Apple had a stripe connect like API which charged 30% then the business activities I mentioned would probably be viable, if painfully expensive.
To make social payments at present, a company would have to take In-App Purchase revenue into its own bank account and then distribute those funds to creators. This usually requires some kind of money-transfer licence, depending on jurisdiction. There'd also be the cost of performing KYC checks on all creators and covering another layer of currency conversion (again depending on jurisdiction).
That kind of setup is possible at the moment, but hard to sustain without adding at least another 10%+ to the fee.
And the operations side of managing chargebacks, payment timings and refunds through the IAP platform stacked on top of your own money transfer business would be very painful.
Those businesses are already viable, without App Store, as evidenced by Twitch and OnlyFans.
App Store deserves plenty of criticism, Youtube's failure to produce a viable ecosystem for accepting tips for videos is not Apple's fault.
Does Twitch allow tips inside its iOS app? Ignore the rest of comment if so.
I agree theses businesses are viable however their iOS App activities are severely limited, thus creating the missing app economy.
Surprisingly enough, yes, Twitch does allow tips (Bits) inside its iOS app. They're 30% more expensive than on the web https://help.twitch.tv/s/article/watching-twitch-on-ios-devi.... Same with subs https://help.twitch.tv/s/article/subscriptions-on-mobile
Google says OnlyFans takes 20%. However, if you look at a SaaS business, the general metric for a well-run one is a 50% cost ratio. That means paying 30% instead of 1-ish percent for payments actually eats about 60% of the profits of a well-run SaaS business. I suspect stripping/sex work has lower costs than engineers, marketing, CS, computers, etc.
Which goes a long way to explaining the dearth of professional tools in the app stores.
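For what it's worth, the margin arithmetic above works out roughly as claimed; a quick back-of-the-envelope check with round, assumed numbers (not OnlyFans' or Apple's actual figures):

```javascript
// Assume $100 revenue and the "well-run SaaS" ~50% cost ratio.
const revenue = 100;
const operatingCosts = 50;

// Profit after a ~1% payment fee vs. after a 30% platform cut.
const profitAt1pct = revenue * (1 - 0.01) - operatingCosts;  // 49
const profitAt30pct = revenue * (1 - 0.30) - operatingCosts; // 20

// Share of profit consumed by the larger cut.
const shareLost = (profitAt1pct - profitAt30pct) / profitAt1pct;
console.log(shareLost.toFixed(2)); // -> 0.59, i.e. roughly 60% of profit
```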
> just requires a 30% cut
30% is anything but just.
I find that people making these arguments need a villain to blame for their missing $x million. Apple's privacy protections are an easy target for why they haven't made any of the $320 billion Apple has distributed to developers since the App Store's creation.
Sure, I'm glad it works that way for them. However, I'm glad Apple's rules exist so I don't have to deal with those and low-quality apps. I'm in the 98% majority that's fine with not having xxx or crap apps in the app stores.
Every time I use the android play store I'm just overwhelmed by the number of low quality crap.
Apple’s rules on social payments primarily affect content creators and other big tech orgs like FB/YT/TT.
Whether Apple allows adult content is a separate matter from whether Apple takes a 30% cut from bloggers and video makers who produce digital content in exchange for payment.
One of the biggest problems for "apps" and entrepreneurs is distribution. Everything this article is talking about was true when HTML5 launched (2010 era) and so many mobile device APIs became web standards.
What the App store has, that is hard to reproduce, is distribution.
The web already cracked the distribution problem.
JavaScript in the browser used to be the only choice for frontend code; however, WASM already ships in all current browsers except IE and Opera Mini (a minuscule percentage of browsers in use today).
It is just a matter of time before WASM becomes the cross-platform ABI of choice for app development, with the binary deployed as a WASM package over HTTP/HTTPS and written in whatever language you like.
Apple's App Store moat will dry up when people (especially devs) realize that they can get close to native performance from web apps delivered this way, and that the vanity of an icon on the phone/tablet screen isn't worth forking over 30% to big Apple.
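As a minimal illustration of that delivery model: below is a complete WebAssembly module (a function adding two i32s, hand-assembled into its 41 binary bytes) instantiated through the standard WebAssembly JS API. In a browser the bytes would arrive over HTTPS, typically via WebAssembly.instantiateStreaming(fetch(url)).

```javascript
// A full, valid .wasm binary: exports add(a: i32, b: i32) -> i32.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm" magic + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: one func, type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" -> func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

WebAssembly.instantiate(wasmBytes).then(({ instance }) => {
  console.log(instance.exports.add(2, 3)); // -> 5
});
```

The same bytes run unchanged in every browser that supports WASM and in Node, which is the "one binary, any language, delivered over HTTP" story in miniature.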
It's not about the technologies of distribution.
Apple's moat is the 1B users who go to the app store to search for apps. No amount of WASM is going to get developers in front of a billion qualified purchasers.
I shipped an app in the App store. Over a few years, I made mid-six-figures from it. I never once begrudged Apple their 30% because without the app store I would have made next to nothing, certainly not enough to justify building and supporting the app. What was I going to do, set up a web page and hope my SEO skills were good enough to get a small trickle of users?
I often wonder if any of the people outraged at Apple's 30% (now often 15%) cut have actually tried to market an application. That is a tiny, tiny price to pay for the upside.
> the vanity of an icon on the phone/tablet screen isn't worth forking 30% to big apple.
I think the biggest problem here is that these icons on the home screen are great retention drivers, and if you go through an app store the OS creates one for you, while through the browser you have to actively decide to do it. It's also funny how the iPhone initially only had web shortcuts for the home screen (and arguably invented the PWA), only to change course immediately once they noticed that apps are their moat.
Surely, the concept of a web app icon on the desktop/tablet is not a new idea to anyone here.
The problem with the web is that there's really no dispute mechanism, and you don't know if you can trust an unknown website. That doesn't mean you can't make it work; it's just more friction, which costs money.
Isn’t that the purpose of credit cards and other payment processors like PayPal?
Can you dispute subscriptions through credit cards, or just the individual transaction? Plus, you don't know if your credit card info is just going to end up on the dark web. I think that's part of the reason people don't try new apps and such as much.
I'm in the process of pulling an app off the app stores. Yeah, there will be more friction, but a person finding my app organically, and my having a more direct relationship with them, means more to me and has been more valuable than the low-conversion sales I get from casual app store browsing.
I'd love to hear more about this.
Contact info in my profile =D