Ditch the media literacy cynicism and get to work


“In reality, many forms of both radicalization and infiltration would be more difficult with a media literate audience — particularly if those with the most influence had better skills and habits around assessing reputation and intent.”

A few days ago, YouTube star PewDiePie recommended that his 75 million subscribers follow a YouTube account associated with promoting alt-right and anti-Semitic content. He chose the account based on several video essays on films there, but the signs were all there for anyone to read: visual “jokes” about the death of Heather Heyer in Charlottesville, a link to a Gab account that featured comments about the “Jewish question”, and even, apparently, an Adolf Hitler speech in one of the older videos. As a result of his recommendation, the channel’s following grew by 15,000 subscribers.

When I talk to people about media literacy, doubters often express some version of what I call the “homeostatic fallacy”: the idea that ultimately we all just share and read things that confirm our beliefs, with no net effect on anything. It’s often portrayed as a hard truth — this is the reason media literacy can’t work, silly rabbit! But it’s actually a profoundly comforting belief to those who embrace it: the more things change, the more they remain the same.

Yet both my experience in the classroom and the increasing frequency of events like the one above give the lie to this analysis. I don’t know how many of those 15,000 subscribers came in knowing they were subscribing to a channel that was likely to push such content to their feed, but my guess is most signed up for Death Note anime analysis, not anti-Semitism. Had they known the nature of the channel, many would not have subscribed. It’s also the case that PewDiePie, who lost major content partnerships when he was accused of anti-Semitism previously, risks losing millions of dollars of income with mistakes like this. He likely wishes he had vetted the channel better. And in our classrooms we find a lack of skills to be a far greater driver of mistakes than worldview — when students are taught basic vetting skills we find little discernible effect of tribalism at all.

Of course, perspectives shift. Once a person subscribes to a page or channel, what Claire Wardle calls the drip, drip, drip of radical content begins to wear at one’s worldview. But this process so often seems to begin through a series of small mistakes, little neglects that eventually lead to more permanent results. In reality, many forms of both radicalization and infiltration would be more difficult with a media literate audience — particularly if those with the most influence had better skills and habits around assessing reputation and intent.

In some ways, the homeostatic fallacy served us well the past few years. It reminded folks of the complex reasons why people might share things that weren’t true. It pointed to the resilience of bad ideas in the face of correction. And it formed a useful counterpoint to naive Cartesianism, which saw bad information primarily as bad input leading directly to bad conclusions, an idea that is now rightfully dead and buried.

But as we watch this slow, uncontrolled skid of a year head towards the gas pumps, it’s probably best we bury the homeostatic fallacy as well. That “bias” part of confirmation bias has always meant something more specific than many realize — the tendency of errors to fall more towards one side or another of an equation. Leave the bias aside: if you reduce the errors, you reduce the drift. And maybe, just maybe, the skid comes to a stop.

Mike Caulfield heads the Digital Polarization Initiative at the American Democracy Project.