Facebook Considered Charging for Access to User Data (wsj.com)
How is this different than any other company in any other industry? https://en.m.wikipedia.org/wiki/Information_broker
This is credit agencies' entire business model and you can't even opt out/close your account
I believe the major difference here is that credit bureaus are open about their business models. Can’t say the same about the ways in which Facebook monetizes data.
How is my credit score calculated?
It's the same in many ways, but that's no defense for Facebook, quite the contrary.
edit: it's even the same in that opting out of Facebook is not realistic in many cases. Even if you go ahead and block all their servers (uMatrix! <3), you'll find the web broken in many exciting ways.
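For reference, the blunt hosts-file version of "block all their servers" looks roughly like this (illustrative only; the real list of Facebook-owned domains is far longer, and plain hosts entries don't catch subdomains such as the fbcdn.net CDN hosts):

    # /etc/hosts -- crude DNS-level blocking of the main Facebook domains
    0.0.0.0 facebook.com
    0.0.0.0 www.facebook.com
    0.0.0.0 graph.facebook.com
    0.0.0.0 connect.facebook.net
    0.0.0.0 fbcdn.net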
I'd actually be down with this, but only if Facebook gave users the ability to lock their own data down at a fair market price. That way the user pays for a product, or decides to be the product. It's clean and cut-and-dried, and there wouldn't be any illusions of a "free" service.
The problem is market price would be less than a penny per megabyte of data.
EDIT: Okay, I admit I pulled this figure out of my behind but my point is data is incredibly cheap.
That's a completely useless metric.
200 bytes can easily contain the name, address, phone number, occupation, age, and list of interests of one person.
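For illustration (made-up person, arbitrary field names), a compact record like this comes in well under 200 bytes:

    import json

    # Hypothetical record -- the point is only the size, not the format.
    record = {
        "name": "Jane Doe",
        "addr": "12 Example St, Springfield",
        "phone": "+1-555-0100",
        "job": "teacher",
        "age": 34,
        "interests": ["hiking", "chess", "gardening"],
    }

    encoded = json.dumps(record, separators=(",", ":")).encode("utf-8")
    print(len(encoded))  # ~150 bytes for this example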
The first 200 bytes are free then.
Do you measure your privacy in megabytes?
Maybe he has a data capped VPN? ;)
Data delivery is cheap. The data itself can be very, very expensive.
That would be good under this model, wouldn't it, because you could buy back your privacy at a very low price.
I don't believe this to be the case though.
Then I would have no problem paying, and Facebook would be glad to make 10x what they would have otherwise.
Facebook currently makes $60 per user in the US, a little less for the EU, and basically nothing per user in the rest of the world.
It seems doubtful that many people would pay $60 per year for Facebook, and certainly not $600 if they were to make 10x (minus the overhead of running ad sales).
I'd be happy to subscribe to FB at $5/month if I had no advertisements, no promoted posts, and no spammy Messenger entries, with them emphatically not selling my data.
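Which, for what it's worth, lines up almost exactly with the $60-per-user figure above; a quick back-of-the-envelope check, using only the numbers quoted in this thread:

    # Back-of-the-envelope check with the figures quoted in this thread.
    us_ad_revenue_per_user_per_year = 60   # claimed above, USD
    subscription_per_month = 5             # proposed ad-free price, USD

    subscription_per_year = 12 * subscription_per_month
    print(subscription_per_year)  # 60 -- roughly replaces the US ad revenue per user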
The amount that Facebook makes from storing my PI is not exactly the same as the amount it makes from showing ads. Contextual ads are not completely worthless, they ran the web for years.
Maybe Facebook shouldn't be a sustainable business, then?
Winning lottery numbers for the next draw are less than 8 bytes.
Isn't their ad product already effectively charging for access to the user data? In a slightly more obfuscated way, sure, but broadly the same?
I don't think so. I think the more correct way to frame it would be that they are charging for access to the space in a user's feed. Like renting billboard space, except the user data makes it very easy to understand where you should put that billboard.
In fact, the adtech industry quite literally refers to these spaces as "inventory" - https://wiki.appnexus.com/display/adnexusdocumentation/Inven...
No, Facebook sells the attention of relevant users, a product that Facebook crafts using user data as a tool. That's not the same as selling user data, in the same way that, say, investment houses aren't selling Microsoft Excel.
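A crude sketch of the distinction (hypothetical names, nothing to do with Facebook's actual ad API): the advertiser hands over targeting criteria and a creative, the matching against profiles happens inside the platform, and the profile data itself never crosses that boundary:

    from dataclasses import dataclass

    # Hypothetical, simplified model of "selling attention, not data".

    @dataclass
    class UserProfile:      # stays inside the platform
        user_id: int
        age: int
        interests: set

    @dataclass
    class Campaign:         # what the advertiser actually submits
        ad_creative: str
        min_age: int
        wanted_interests: set
        bid_per_impression: float

    def serve_ads(users, campaign):
        """Return (user_id, ad) impressions; profiles are consulted
        but never handed to the advertiser."""
        return [
            (u.user_id, campaign.ad_creative)
            for u in users
            if u.age >= campaign.min_age and u.interests & campaign.wanted_interests
        ]

    users = [
        UserProfile(1, 29, {"hiking", "photography"}),
        UserProfile(2, 17, {"gaming"}),
    ]
    campaign = Campaign("Buy boots", 18, {"hiking"}, 0.002)
    print(serve_ads(users, campaign))  # [(1, 'Buy boots')] -- no profile ever leaves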
Yes, but that's just enough obfuscation for them to deny any claim that they do charge. From the free content I was able to see of this WSJ article (not a subscriber), they considered charging businesses in addition to users. Perhaps in the same way they charge for access to user data in France, where some Facebook users can only see, for example, the first friend photo; the rest of the friend photos then appear behind a paywall. I saw this just last week, actually, in Indonesia, from a tourist trying to show me some photos of his girlfriend back home, only he couldn't, because the mobile app hid the images of his friends behind a content upgrade.
So what, we've all considered bad ideas.
most of us don't have a track record of making and implementing decisions that
1. expose millions of people to Russian propaganda and disinformation
2. expose very sensitive personal data of millions and millions of individuals to dubious actors, in violation of company policy (Cambridge Analytica)
3. hire right-wing dirty tricksters to smear the opposition as being financed by the "evil Jew" George Soros
4. drag their feet when internal security professionals identify (1)
and that's just off the top of my head. Facebook has a consistent pattern of egregious misbehavior. This article is just another brick in the wall indicating profoundly bad judgement. Only by the grace of god did they not implement such a policy of selling user data.
I'm not in love with the monopoly power of Google (I use DuckDuckGo), and they clearly do some dumb shit, like the government-approved censorship engine (Dragonfly), but it basically seems like Brin, Page, Schmidt, and Sundar Pichai are not expressly malevolent and seem to be trying to act in a way that balances shareholder interests with the interests of users. Zuckerberg and Sandberg still seem to be operating under the mantra of "they trust me; the dumb fucks".
honestly that quote should have been the end of facebook.
Even if customers pay for a service, isn't it likely that eventually the service will also want to sell their data anyway in order to increase margins?
The potential reputation hit is harder to justify if you've got robust other income streams.
Indeed, and that's certainly reassuring; the option may still gain traction over time though, even for the most forthright of businesses.
I wish that was the case. The number of third party trackers on web shops is sometimes infuriating.
They already did this experiment years ago. They had a product called PYLON with DataSift. Rumor had it people thought it was a cool idea, in theory, but few people had any practical uses for it.
The real question is, did they decide not to because they realized it was unethical? Or because they felt it was too risky to public perception?
I don't get this obsession with ethics (in terms of correct motives) from companies. Why does the thought process behind the behavior matter, when it comes to a public company, over and above the behavior itself? The fact that the company is responsive to public opinion should be the goal.
Act too unethically and, with popular opinion against you, you'll quickly find legislation against your interests.
> I don't get this obsession with ethics (in terms of correct motives) from companies. Why does the thought process behind the behavior matter when it comes to a public company, over and above the behavior itself?
Because people want to evaluate the company's trustworthiness (or more precisely, the trustworthiness of its leaders and culture).
In this case, if they refrained from selling data due to ethical principles, you can put more trust in them, at least for a little while. If they only refrained because they didn't think they could get away with it now, they'll probably sell your data the first chance they get, so you can't trust them as much.
However, Facebook has already clearly demonstrated its true colors many, many times, so it's kind of a waste of effort to reevaluate it.
Most likely they realized it wasn't as profitable when compared to other options.
Profit sharing would be interesting.
When the service is free, it isn't the product. You are the product.
Like the Google employees, Facebook employees would have done something if they were any good.
Sorry for being harsh.
Reference: https://www.cnbc.com/2018/11/27/read-google-employees-open-l...
Hanging more good quotes on the walls of Facebook's office doesn't make you any good; you are part of an organisation that is creating more problems in the world, and you are the reason, unless you act and do your part.