Election's Over, Facebook Gets Back to Spreading Misinformation

vanityfair.com

60 points by malloreon 5 years ago · 21 comments

zaroth 5 years ago

As I recall, the “misinformation” label was erroneously applied; heck, even the “Russian disinformation” trope was trotted out to suppress some pretty damaging factual information this election cycle.

These systems are just outright abused for political purposes. It’s just gaslighting to claim they serve some higher purpose.

I don’t think it’s illegal for platforms to censor information for obliquely political purposes, but I personally find it extremely distasteful.

  • staticautomatic 5 years ago

    As far as I can tell, FB does not actually have a public definition of misinformation. Not long ago, I wanted to see how they defined it and tried to look it up. Searching led me to “fake news” and I learned that they call “fake news” “false news” and that if you search across Facebook.com and fb.com for “false news is” you get about 6,000 results, none of which are a definition of false news. If anyone has found a definition of these terms somewhere on Facebook please link!

root_axis 5 years ago

In the same way that it's Facebook's prerogative to spread misinformation on their website, it's the user's prerogative to understand that they shouldn't trust what they read on Facebook. There is no way to stop the spread of misinformation on social media; attempting to curtail it to some degree is a PR move, not an actual solution to the problem. Ultimately this is a cultural issue: we need to train the next generation to understand that anything read on social media should be implicitly distrusted. Citing something you read on social media should be regarded as the intellectual equivalent of citing something you saw in a movie once. Yes, true things can be depicted in a movie, but nobody accepts the movie as a source for itself; if a film claims to depict true events, the expectation is that reliable sources won't contradict it.

  • tweetle_beetle 5 years ago

    > In the same way that it's Facebook's prerogative to spread misinformation on their website, it's the user's prerogative to understand that they shouldn't trust what they read on Facebook.

    The asymmetry between Facebook and its users is enormous. One side has the power to spend vast sums on advertising to attract new users, maintain a clean brand image with PR, monitor non-users' actions across the internet and beyond, and algorithmically manipulate the emotions and beliefs of its users.

    What power do you or I have to justify splitting the moral responsibility with them 50/50? More education is always a good thing, but it's too late to tell 1.62 billion daily active users (Q3 2019) "You should know better". Facebook has become too good at what it does.

    I don't know what the answer is. But whatever it is, letting Facebook carry on doing what it does and expecting society to change around it isn't going to work.

    • root_axis 5 years ago

      For the vast majority of people that use Facebook, it's a minuscule sliver of their day, not some all-encompassing monolith that controls their thoughts. Most people spend a few minutes leaving comments or posting photos; they don't rely on Facebook as a tool to understand complex issues. As I said, this is a cultural problem, not a technology problem. You might as well be complaining about Fox News; people have to make their own decisions about which sources of information are reliable.

underseacables 5 years ago

I tried to buy ads for a nonprofit I volunteer for (drive donor traffic) and Facebook said they have a ban on social issue advertising, in place SINCE the election. Seems rather strange to ban such advertising after the election.

  • dillondoyle 5 years ago

    A HUGE amount of political spend runs through 'non profit' 501c___ shadow entities. Hard to define the political line, and the FEC regs and 'magic words' are pretty stupid. Way easier to just ban it all for now.

    FB opened GA; I might guess (hope) they open it all after the inauguration.

sthnblllII 5 years ago

Private mega-corporations like Facebook should never be deciding what information is "mis"-information, especially during an election. We as Americans should decide what speech we want to be legal or illegal and use proper transparent legal channels to enforce these collective decisions, whether that speech happens on the street or on-line.

  • soneil 5 years ago

    I want to agree; corporations should not be the arbiter of what is and isn’t true.

    But they do want to promote content to us, because their entire business model depends on keeping us in a never-ending scrolling algorithmic stupor.

    So there’s the catch-22. I don’t want them promoting falsehoods. And most falsehoods only survive by being more interesting, more salacious, more engaging than the truth - so unless the system has some bias against them, it will prefer them.

    If there’s a magical solution to not promoting falsehoods without deciding what’s false - there’s probably a lot of money waiting for you.

    • sthnblllII 5 years ago

      >I don’t want them promoting falsehoods

      Have your political beliefs been 100% completely static your entire life? Have you ever realized you were wrong about something, and that your previous belief was a falsehood?

  • root_axis 5 years ago

    Facebook doesn't "decide what is misinformation". When you need to figure out if something is true, you don't check Facebook as a source, because everyone knows it's not an arbiter of truth; it's not productive to speak as if it were one.

  • dasil003 5 years ago

    Media companies have always had to decide what is misinformation, and there is an entire discipline called fact checking to support it.

    Now obviously social media is not the same as traditional media, but they have the same reach, and with orders of magnitude less latency and more virality. I'm not sure how you propose the government and legal system can stay on top of this—by the time the courts hear about something weeks, months and years later the damage has been done and the news cycle has moved on.

    It's not ideal for Facebook to be the arbiter of truth (nor do they want that responsibility), but operationally any solution has to be deeply embedded within their systems.

    • rndmind 5 years ago

      Actually, the government did have stringent regulations that all broadcast news corporations had to adhere to. This was called the Fairness Doctrine: https://en.wikipedia.org/wiki/FCC_fairness_doctrine

      That was until 1987, when the Reagan administration eliminated the Fairness Doctrine... thanks, Reagan!

      • tracker1 5 years ago

        Unfortunately, even if it were still a thing, it more than likely wouldn't apply to online, print, or cable news sources.

        • rndmind 5 years ago

          Something should be drafted regarding online news sources, or perhaps classify them as websites that get > 50 million unique visitors a month... something like that

    • sthnblllII 5 years ago

      >Media companies have always had to decide what is misinformation

      >I'm not sure how you propose the government and legal system can stay on top of this

      In my country (the United States), we have a principle in our culture called "freedom of speech". It basically means that we do not propose or tolerate any mechanisms for general political speech to be "kept on top of" (or "fact checked", or declared "misinformation"). People are free to decide for themselves what to believe, and when you think something is wrong, you publish an argument against it. It may be different in your country.

    • tracker1 5 years ago

      I think the biggest issue is that the "facts" the fact checkers present often aren't factually based. Not to mention, the fact-checking orgs will vouch for their own news orgs' articles (which are monetized, to boot), even when those mix facts with hyperbolic opinion.

      It's also less than ideal that they body-check facts from right leaning news sources far more than left leaning. FTR, I'm Libertarian not typical left/right.

      I posted a humorous meme, and it was tagged with a couple different "get the facts" messages, which was just plain stupid and kind of funny, even drawing in comments to that effect.

      In the end, it's hard to trust most of the media sources without getting news/POV from multiple left/right/foreign sources. I'm starting to think that anything in news reporting that is opinion over facts should carry a big red "OPINION, NOT NEWS!" label in order to receive any protection from libel, including on videos for a couple of seconds. I also think that any social media org that has more than 25% of a country as its users should probably have to comply with the same free speech norms as a government agency would.

      I work in the elections space, and while some things seem really fishy, you have to take all of it with a grain of salt, considering how many of the accusations I've seen are pure BS from people who don't know/understand what's happening. I mean, I'm pretty sure there was some fishy stuff going on in at least some places, but I just don't know what to believe anymore... and that's almost worse than just picking a side.

      I think having peer review opportunities would go a long way compared to some of the paid and algorithmic review options. Even if you had a hidden karma score, like here and other sites, and users were randomly chosen to review, judge, and vote on content posts... maybe not private messages, but say public posts or group posts... just a general thumbs up/down. The quorum would be comprised of, say, 3-5 reviewers randomly selected from each of left/right/other for a decent mix. If you get a 55% quorum for violation, the post then goes to FB for review/action.

      Reviews are done by those in the same locality of the person posting. Private groups or message posts go directly to FB monitors... but there's no reason not to have a community judge for itself if something is over the line.
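The balanced-panel quorum scheme the comment describes could be sketched roughly as follows. This is a minimal illustration, not anything Facebook actually runs; the names `select_panel` and `needs_escalation`, the pool labels, and the defaults (3 reviewers per group, 55% threshold) are all hypothetical stand-ins for the numbers the comment suggests:

```python
import random

def select_panel(pools, per_group=3, rng=random):
    """Draw an equal number of reviewers at random from each
    pool (e.g. left / right / other) to form a balanced panel."""
    return [reviewer
            for group in pools.values()
            for reviewer in rng.sample(group, per_group)]

def needs_escalation(votes, threshold=0.55):
    """Escalate a post to staff review when the share of
    'violation' votes (1 = violation, 0 = fine) meets the
    quorum threshold. No votes means no escalation."""
    if not votes:
        return False
    return sum(votes) / len(votes) >= threshold
```

For example, a panel of five that votes 3-2 for "violation" (60%) would clear the 55% bar and be escalated, while a 2-3 split (40%) would not.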

      • mr_toad 5 years ago

        > It's also less than ideal that they body-check facts from right leaning news sources far more than left leaning.

        You mean statements like this?

cohnjesse 5 years ago

Facebook will never change the way they serve content as long as their business model depends on it.
