Meta in Myanmar, Part III: The Inside View

erinkissane.com

212 points by joeyh 2 years ago · 116 comments

3np 2 years ago

Part I HN thread: https://news.ycombinator.com/item?id=37709284

theptip 2 years ago

> the most generous number from the disclosed memos has Meta removing 5% of hate speech on Facebook. That would mean that for every 2,000 hateful posts or comments, Meta removes about 100: 95 automatically and 5 via user reports.

It's really hard to contextualize these numbers. What are the comparable rates for other media?

What % of hate speech on Mastodon is taken down? Twitter? YouTube? Discord? Comparing with old-school mob formation technology, what about leaflets handed out? Political rallies?

Meta is an obvious target because they have such a high % of total online speech, and by virtue of this make a potentially impactful single intervention point to make things better. And these articles do make it seem they have room at the margin to improve. But imagine a world with multiple social networks per country - do we think they would be better or worse at policing this stuff (and crucially, the places where it matters are the developing, less-regulated countries, often with governments that are at best uninterested in preventing the ethnic conflicts, if not actively promoting them)? I can see hand-wavey arguments in both directions, but what I really want is data analyzing the question.

  • intended 2 years ago

    The term is Prevalence. I have been looking for years, and have essentially shifted careers to find it. I didn't even have a name for it until recently. There is also no way to get the data you are looking for. As someone described it - this is akin to asking how much crime happens every day.

    You could get an approximation - for well-resourced languages, though.
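
    To make "approximation" concrete: prevalence is typically estimated by hand-labeling a random sample of content views and putting a binomial confidence interval around the observed rate. A minimal sketch with made-up numbers (the sample size and label count below are illustrative, not from this thread), and it presumes human labelers fluent in the language, which is exactly what low-resource languages lack:

    ```python
    import math

    sample_size = 10_000  # randomly sampled content views
    violating = 38        # views a human labeler marked as hate speech

    p_hat = violating / sample_size
    # 95% normal-approximation confidence interval
    margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / sample_size)

    print(f"Estimated prevalence: {p_hat:.2%} ± {margin:.2%}")
    # Estimated prevalence: 0.38% ± 0.12%
    ```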

    Meta launched Hindi classifiers in 2021 and Bengali in 2022, covering 0.6bn and 0.24bn people, respectively. I know how poor the lexicons I have seen in the wild are, and how hard it is to stay ahead of terminology.

    Meta is probably doing more to stay ahead than others. However the ARPU for a user in the US vs India is stark.

    Platforms aren’t going to release the necessary data for even a rough approximation. Reddit was a potential data source to learn from, but now we have the API rules, so that is also closed off. Lucky us.

    Not to mention how supremely unprepared society is to actually understand or discuss the data without making it worse.

    Meta has more resources and frankly they regained my trust years ago. However, business is business. Twitter and Reddit have shown that in this current cycle there are no consequences for neglecting Trust and Safety. T&S teams have been reduced globally. T&S teams are being targeted and vilified.

    Next year, there are ~50 elections happening, including in the USA and India. The network suits certain narratives - the more insular, fear-driven ones. Trust and Safety teams are themselves being exposed.

    It's already looking like a perfect storm.

  • arczyx 2 years ago

    Rather than the 5% number (which may be the best they can do with the current state of the art), what feels more damning to me is that they found a method that can reduce misinformation (by way more than 5%), and then decided to roll it back.

    > And this method works. In Myanmar, “reshare depth demotion” reduced “viral inflammatory prevalence” by 25% and cut “photo misinformation” almost in half.

    > In a reasonable world, I think Meta would have decided to broaden use of this method and work on refining it to make it even more effective. What they did, though, was decide to roll it back within Myanmar as soon as the upcoming elections were over.

    • p_j_w 2 years ago

      The free market has spoken, and it will gladly accept helping encourage a genocide in the name of profits.

      • hef19898 2 years ago

        One of the reasons real Communism (the USSR variety, not what people in the US think Communism is) has such a hard time in capitalist societies: it is the only form of authoritarianism in which corporations cannot make profits.

        • logicchains 2 years ago

          >It is the only form of authoritarianism in which corporations cannot make profits

          It's still the same people making the profits though. Regardless of what political/economic system you have, sociopaths will always gravitate towards whatever position allows them to acquire the most wealth and power. The only difference is whether they have the legal right to send men with guns to lock you up or kill you if you disobey them.

        • p_j_w 2 years ago

          The fact that you lack the ability for creativity and nuance to such a degree that you scream “HAH, Stalin!” anytime someone criticizes the free market says more about you than the point you’re arguing against.

          • hef19898 2 years ago

            I actually criticized the free market for being totally fine with, e.g., fascism, or any kind of atrocity as long as profits are to be made...

            • fnordpiglet 2 years ago

              Surely you aren’t asserting the Soviet Union committed no atrocities and was an angelic society of benevolent leaders blessing the harmonious proletariat?

              In any society composed of humans, the shitburgers will find a way to shit on the largest groups of people they can using whatever instrument of power is available. In the US it's dollars; in the USSR, it's influence and power.

              • p_j_w 2 years ago

                GP isn't asserting that. They're saying that the western establishment will gladly tolerate right-wing authoritarian governments because they allow private companies to run roughshod over everyone in the name of profit.

            • p_j_w 2 years ago

              Oh, I misread, sorry. In that case you're completely correct. The right holding up the Chiang Kai-shek, Pinochet, and Park dictatorships as good is a great example.

        • megaman821 2 years ago

          Does this "real" Communism exist anywhere, or should it be more aptly named fantasy Communism?

    • blackoil 2 years ago

      What was the false positive rate? Blocking all posts about Israel and Palestine will bring down misinfo but that is rarely acceptable.

      • droopyEyelids 2 years ago

        We're talking about the case where there was an actual genocide happening, and Facebook was aware of it.

  • erie 2 years ago

    'Facebook has been the internet in Myanmar. Entering the country in 2010, Facebook initially allowed its app to be used without incurring data charges, so it gained rapid popularity. It would come pre-loaded on phones bought at mobile shops and was a cultural fit.'

  • BlueTemplar 2 years ago

    Discord (and maybe Mastodon??) don't operate through engagement, so I'm not sure why you put them on the list?

    • anon84873628 2 years ago

      The Facebook apologists always conveniently forget that Facebook's algorithms viralize content and decide what people see, which is the precise reason for their culpability.

  • PaulHoule 2 years ago

    It is almost laughable to think that Mastodon could do any better in a case like Myanmar.

    First there would be the delay of the rest of the community understanding what was going on in an unusual language. Second there is the difficulty of deciding what to do and implementing an effective filtering policy. Third since there are multiple instances this can't be done in a consistent way across the network. Fourth they can't stop the Myanmar government from setting up its own Mastodon instances, blocking access to outside instances, etc.

    You'd better believe that if Modi keeps going down the road he is on, India will kick out all western social media companies if they try to block his campaign of dehumanizing non-Hindus.

    From my perspective there is a lot of harmful talk that goes on social media that falls short of hate speech. In terms of my own feed, I have keyword blocks against most Republican primary candidates (not Nikki Haley though), and there are many other topics that I think people are not capable of having a civil and productive conversation about, or that I just don't want to be exposed to. A lot of stuff on the slippery slope to "hate speech" is blocked by this, and it is OK when I do it because it is my own feed, but you'd better believe there are lots of people who think they have something so important to say that it would be a crime to "censor" their shit (for lack of a better word) if the block affected more than just me.

    Unfortunately the discussion around A.I. is dominated by positions like: (1) ChatGPT is the God at the end of the universe that Teilhard de Chardin told us about, or (2) it was just fine that Google stole everybody's content for years and years, but now that ChatGPT is out we should suddenly be outraged at this rip-off.

    The truth about it is that machine learning algorithms over text have terrible performance, even the good ones. The performance curves look something like

    https://scikit-learn.org/0.23/auto_examples/model_selection/...

    and for my RSS reader I am happy to approach an AUC of 0.8; an AUC of > 0.95 for most tasks is getting into science fiction territory. And that's nowhere near being able to clean up hate speech at the rate that it should be cleared.

    The short of it is that if you want to get rid of a lot more than "5%" of the outright "hate speech" you will also get rid of a lot of "dislike speech" and "angry speech" and "annoyed speech" as well as some "innocent speech".
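
    To make the base-rate problem concrete, here is a back-of-envelope sketch. All the numbers are illustrative assumptions (the 1% prevalence and the classifier operating point are mine, not from any study): when violating posts are rare, even an operating point roughly consistent with an AUC of 0.8 means most of what you remove is not hate speech at all.

    ```python
    def precision(tpr: float, fpr: float, base_rate: float) -> float:
        """Share of flagged posts that are truly hateful, via Bayes' rule."""
        true_pos = tpr * base_rate
        false_pos = fpr * (1 - base_rate)
        return true_pos / (true_pos + false_pos)

    # Assume 1% of posts are hateful and an operating point that catches
    # 70% of them while wrongly flagging 15% of everything else (roughly
    # what an ROC curve with AUC ~0.8 allows).
    print(f"{precision(0.70, 0.15, 0.01):.1%} of flagged posts are truly hateful")
    # 4.5%; the other ~95% is angry, annoyed, or innocent speech.
    ```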

    To be really frank, my filter deletes anything about "cis" and "trans" because I think maybe 10% of what people say on the subject is outright hateful, 80% of it is emotionally negative, and maybe 10% of the time somebody has something neutral or positive to say. My take is that losing a small amount of content I might want to see, in exchange for being free of a lot of negativity, is a bargain I am willing to make. A platform that imposed that kind of judgement on the community could be a much more emotionally positive place than anything you've seen, but it is also going to get all sorts of criticism from people for doing so.

    • BlueTemplar 2 years ago

      > It is almost laughable to think that Mastodon could do any better in a case like Myanmar.

      > First there would be the delay of the rest of the community understanding what was going on in an unusual language. Second there is the difficulty of deciding what to do and implementing an effective filtering policy. Third since there are multiple instances this can't be done in a consistent way across the network.

      This (except the last bit) is missing the point about what Mastodon is for.

      Why would "the rest of the community" even be entitled to decide what to do about it? Not that there would be a lot of overlap with the Burmese community anyway, so it wouldn't have much of an effect.

      We already had an example BTW, with Gab splitting off from Mastodon, but that was in a heavily English-speaking context.

      What would have made at least a symbolic difference is that some of these Mastodon instances would have been operated by the anti-hatred groups described in part I, and they would not have needed to beg Facebook, unsuccessfully, to do moderation (with 0-4 moderators from other continents).

      But in the end this is mostly a spurious discussion: hate speech only effectively gets stomped out when a judge hands out sentences to the haters, but here the law enforcement that should have handled them was the one preparing the genocide...

      (Not that Facebook isn't condemnable for heating it up.)

      • PaulHoule 2 years ago

        Certainly a lot of people seem to believe that many Mastodon instances are better moderated than commercial social media.

        Certainly some kinds of right-wing hate are suppressed, but there is very little pushback about widespread misuse of the word "fascist" to describe your local police department, people in women's sports who use period trackers to prevent ACL tears, mainstream milquetoast politicians, or just about anyone.

        On the other hand a lot of it seems like BS: there are two Mastodon instances that seem to be at war with Hacker News. I think I understand why one of them is mad at HN; I don't know if the second is mad out of sympathy with the first or has some other grievance.

        Many mastodonsters were concerned that Threads would be poorly moderated (it might allow posts about how period tracking prevents ACL tears in female athletes, I guess), but Threads has pretty much dropped off everybody's radar.

        ---

        One angle is that a system like this should have "defense in depth", like you try to create a culture where hateful speech is not welcome, where people don't try it because they know they'll get disapproval, where people know they'll get confronted if they call Keir Starmer a fascist, where people know you just don't boost hateful things, etc.

        In that sense you might just want to cultivate a lot of positive and neutral stuff and encourage people seeking negativity to go elsewhere. I think this is what Instagram was going for with Threads and it might be one reason why people should be asking "whatever happened to Threads?" but they don't because they've forgotten all about it.

        • cultureswitch 2 years ago

          The problem with Facebook, Twitter, even Reddit to a large extent, is that it is hard or impossible for users to self-segregate.

          Mastodon offers the possibility of truly disconnected communities who do their own thing while sharing a protocol. What's acceptable in one community is not in another, and that's fine because users can pick and choose what they participate in and what they see. However, this model is not perfect in Mastodon either, as people who believe crusading is a virtue actively seek to undermine communities they don't participate in.

          With Facebook, however, the profit structure of the company and the way the website works push it to categorize users to receive content based not on user preference but on algorithmic optimization. I may intentionally not join any motorist group, but I still get shown content that gets a lot of engagement from idiotic suburbanites.

  • cultureswitch 2 years ago

    Less developed countries are typically more willing to and more abusive in controlling expression than more developed ones. That is by no means a rule though.

    The question, of course, is whether there is a political will to "police hate speech". In reality this already dubiously noble goal is as flexible as the definition of "terrorist organisation".

kaycebasques 2 years ago

The discussion about how Meta purportedly twisted the content moderation discussions around 2020 was a revelation for me. This quote from Meta captures the alleged spin well:

> We proactively detect 99 percent of the hate speech removed from Facebook in Myanmar

If the accusation is correct, they are not saying that they capture 99% of all Myanmar hate speech on Facebook. I remember that the big news agencies reported it that way. Meta is actually saying "of the 5% that we do catch, 99% was caught automatically." The key phrasing is "99% of the hate speech removed".
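
Spelling out the arithmetic makes the sleight of hand easy to see. A quick check using only figures already quoted in this thread (the ~5% overall removal rate and the "99% of the hate speech removed" claim); the 2,000-post population is the same hypothetical used in the quote upthread:

```python
total_hate = 2_000      # hypothetical population of hateful posts
removal_rate = 0.05     # ~5% removed overall (whistleblower estimate)
proactive_share = 0.99  # "99% of the hate speech removed" was proactive

removed = total_hate * removal_rate    # 100 posts
proactive = removed * proactive_share  # 99 posts

print(f"removed: {removed:.0f} (proactively: {proactive:.0f}), "
      f"still up: {total_hate - removed:.0f}")
# removed: 100 (proactively: 99), still up: 1900
```

Both statements can be true at once: 99% proactive detection among removals, while ~95% of hate speech stays up.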

  • billjings 2 years ago

    Oh, wow.

    In other words, this number actually means, "We do almost no non-automated detection of hate speech."

    Smart spin, huh...

  • ipaddr 2 years ago

    The author is playing word games. The author states Facebook has good estimates for the amount of hate content, then says 99% (of terrorist material) of an unknown amount.

    Do they have good estimates or is this an unknown amount?

    • kaycebasques 2 years ago

      The article cites Frances Haugen's whistleblower disclosures:

      > …we’re deleting less than 5% of all of the hate speech posted to Facebook. This is actually an optimistic estimate—previous (and more rigorous) iterations of this estimation exercise have put it closer to 3%, and on V&I [violence and incitement] we’re deleting somewhere around 0.6%…we miss 95% of violating hate speech.

      https://facebookpapers.com/wp-content/uploads/2021/11/Hate-s...

      • ipaddr 2 years ago

        Is that unknown? It's an estimate the author believes here and forgets below.

        • intended 2 years ago

          Facebook states they remove 99% of hate speech.

          Their internal teams say they could be missing 95% of it.

          The reality is that Facebook has no idea how much hate speech they have. They remove 99% of whatever they detect.

          • ipaddr 2 years ago

            Facebook removed 95% of the hate speech it identified via automation and 5% manually, and the hate speech it could identify was between 3 and 5% of the total.

            Facebook also said they removed 99% of terrorist/ISIS postings, of which the author claims the total amount is unknown. But we know the % of hate speech because of Facebook's ability to estimate (as stated by the author, who was confident in it). Now we are led to believe that terrorist content can't be estimated by Facebook.

            Either we accept they can estimate, and take the 99% of terrorist content removal at face value (as a share of the total amount), or we must ignore Facebook's hate content estimates.

arczyx 2 years ago

So Facebook was presented with a trolley problem that had people's lives on one track and more profit on the other, and chose the latter, just as we can expect from big corporations nowadays.

Not an unexpected decision overall, but it did surprise me how low Facebook can stoop. Like, they didn't even bother to fix their stuff in Myanmar after all these years with Myanmar being the poster child of their hate speech problem!

> In 2022, Global Witness came back for one more look at Meta’s operations in Myanmar, this time with eight examples of real hate speech aimed at the Rohingya—actual posts from the period of the genocide, all taken from the UN Human Rights Council findings I’ve been linking to so frequently in this series. They submitted these real-life examples of hate speech to Meta as Burmese-language Facebook advertisements.

> Meta accepted all eight ads.

  • cm2012 2 years ago

    I promise you Meta doesn't give a shit about the $8 in ads it's getting from Myanmar per year. This is more an issue of a giant corporation not noticing that some users in a foreign country speaking a foreign language were doing horrible things with their platform. At worst it's negligence. If it wasn't for the Facebook association, most HN users would not even know Myanmar was in a civil war at all.

    (Keep in mind that after this, Facebook hired 20,000+ content moderators at great expense, whom they still employ to monitor for stuff like this.)

    • arczyx 2 years ago

      > At worst it's negligence

      According to the article, Facebook did make the deliberate decision not to dedicate resources to Myanmar and other non-western countries, so I don't think it's negligence. "Greed" is more appropriate.

      > Guy Rosen, Facebook’s VP of Integrity, told Sophie Zhang that the only coordinated fake networks Facebook would take down were the ones that affected the US, Western Europe, and “foreign adversaries.”

      • cm2012 2 years ago

        That's frankly sensible? There are three hundred+ countries on earth; to monitor all of their "fake networks" would take an organization like the CIA's. It's like saying it's greed that Apple doesn't solve the family issues of all their workers. It's not reasonable.

        • afavour 2 years ago

          If you can’t operate responsibly in 300+ markets then don’t operate in 300+ markets? Seems sensible to me.

          If an airline doesn’t have enough pilots to service 20 routes they reduce their routes, not let passengers fly the plane.

          • nradov 2 years ago

            What does operating responsibly mean? Everyone seems to have a different definition of that.

        • AndrewKemendo 2 years ago

          Then they shouldn't operate there if they can't do so without their users being harmed as a result

        • rakoo 2 years ago

          > it's greed that Apple doesn't solve the family issues of all their workers. It's not reasonable.

          Of course it's greed, and of course it's reasonable. Apple is directly responsible for the work conditions of its workers and the contractors they use, absolutely. If they can't be responsible then they can close shop. There is no obligation for them to have a business.

        • PaulHoule 2 years ago

          I wouldn’t think the CIA would be capable of doing such a thing and if it tried it would devolve into scandal one way or another.

        • taway1237 2 years ago

          Maybe they should consider not operating in countries where their unmonitored presence may lead to genocide?

          There's no obligation to operate everywhere. Instead, as discussed in previous parts, they invested heavily to get a monopoly in third world countries.

          • cultureswitch 2 years ago

            I have a hard time following this reasoning. I would definitely consider Facebook partly responsible for this genocide if they had deliberately made an editorial decision to spread the genocidal rhetoric.

            But they didn't. The situation was as you put it mostly unmonitored. In other words, the spread of genocidal content was organic. This is the equivalent of blaming the phone company or the ISP for what the users do with the communication channels.

        • Hikikomori 2 years ago

          This comment makes sense after reading his history.

          • cm2012 2 years ago

            Yes, a history of trying to get to the truth of matters even if it's not popular in the zeitgeist.

            Look at fsociety's comment in this same thread to see just how much misinformation is out there (which he helpfully corrects)

            It just so happens a lot of misinformation is anti-corporate since many people want to believe every bad thing they hear about them. So I end up being one of the few commenters who say, actually, the big company did try here.

            This is important because if every big company is supposed to be equally bad, there's no reputational penalty for the actually bad ones.

            • Hikikomori 2 years ago

              Funny, talking about the truth when your comment history is neck deep in right wing conspiracy and talking points.

              The point people are making here is not that Facebook should have done the impossible and created a perfect tech solution to moderate content; it's that they should not have operated in these countries at all if they couldn't do it safely. They did not even try: instead they ignored internal reports on these problems for years and actively decided to make the problem worse with their single-minded focus on growth. And, as you put it, for only a few $ in advertisements, to fuel the fires of genocide.

              • cm2012 2 years ago

                Considering I've voted only democratic in every election in my life, and volunteered to help progressive candidates manage their campaigns in swing states, you're waaaay off base on my politics.

        • arczyx 2 years ago

          I really don't want to ask this since it runs contrary to HN rules, but did you really read the article?

          No one expects Facebook to remove all misinformation/hate speech/etc. That is unreasonable. But they had literally 1 data scientist to monitor the whole non-western world.

          In a year where their annual profit is $23.9bn.

          Assuming 1 data scientist is paid $500k/year, surely it is sensible to dedicate $20mn, less than 0.1% of their annual profit, to hire 40 more such data scientists and literally increase their effectiveness on this topic by 4000% (see the quick check below)? At the very least, this is what I expect from a company that actually has a speck of morals.

          > A strategic response manager told me that the world outside the US/Europe was basically like the wild west with me as the part-time dictator in my spare time. He considered that to be a positive development because to his knowledge it wasn’t covered by anyone before he learned of the work I was doing.
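
          A quick sanity check of that arithmetic, using the comment's own assumed figures (the $500k salary is an assumption, the $23.9bn profit is the figure cited above):

          ```python
          annual_profit = 23.9e9  # annual profit figure cited above
          salary = 500_000        # assumed cost per data scientist
          hires = 40

          budget = hires * salary  # $20M
          print(f"${budget / 1e6:.0f}M = {budget / annual_profit:.3%} of annual profit")
          # $20M = 0.084% of annual profit
          ```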

          • fsociety 2 years ago

            When you write “they have literally 1 data scientist to monitor the whole non-western world”, I assume you mean Sophie Zhang. I agree with a lot of what she said, and I mean a lot, but you have a misconception.

            The integrity org, back when I was there, had thousands of folks and was aggressively hiring. They were not a lone island and were also supported by other orgs like data infra, FAIR, security, privacy, product teams and more.

            Site integrity, Sophie’s team, was a small piece of the work done there. And they, like any team, relied heavily on other teams. You have all this unbelievable tooling internally, and internal teams are incentivized to get you to use it.

            These issues were not from want of trying. The reality is that problems like this at scale are incredibly difficult. In my opinion, Frances trivialized a lot of this in her whistleblowing, but it makes a good news story so what can you do.

            It pains me when people trivialize it as “just do the machine learnings to fix it all” or “if these tech companies actually gave a damn this wouldn’t be an issue”. It’s an incredibly hard problem, and much like security it will never be “solved”.

            That doesn't excuse technology having a bad effect on the world. We need to be better, and for the most part as a society we are trying hard as hell. It's also why the work is interesting and impactful. More people should get into it.

            • bostik 2 years ago

              I'll provide a slightly different argument.

              I talked to FB people (recruiters, managers) in the site integrity team back in 2018. Admittedly I had some reservations up front, but during the on-site day I made up my mind to never even consider FB as a potential employer again. During a chat session between interviews, I asked about the prospect of doing proactive education: essentially, detecting users who had been caught up in influence operations and then surfacing a note to them that they had been targeted by such activities, so that they could make educated decisions themselves.

              The senior manager I was talking with at the time was visibly taken aback by the very idea. "We don't do that!"

              From that experience I drew the inference that FB are fundamentally, as an organisation, incapable of doing the right thing. I suspect it's less about the cost, and more about the prospect of openly accepting accountability for what their platform is really used for.

            • anon84873628 2 years ago

              The series of articles makes it very clear that there are simple, non-high tech things Facebook could have done to mitigate the problem.

              Insisting that it's hard and requires advanced ML is complete BS. Self-delusion by FB'ers to give themselves an excuse for their complicity.

      • jncfhnb 2 years ago

        Purposely ignoring a problem is negligence.

    • duped 2 years ago

      > I promise you Meta doesn't give a shit about the $8 in ads its getting from Myanmar per year.

      They certainly did care about the growth of daily active users, even if it meant they expanded to markets where they couldn't responsibly moderate their platform.

      • blululu 2 years ago

        User growth and profits are not the same thing.

        A company as big as Facebook is going to have different groups with different objectives that often do not see eye to eye or even know about each other. A company as profitable as Facebook is going to have projects that are motivated by things other than profits.

        There is most likely a team at Facebook whose goal it is to get as many users as possible in developing markets. These markets do not generate profits from online ads. Facebook will never disclose its cost per user but in general online ads only make money from affluent users. I have heard that even mid income countries like Mexico or Romania are not profitable for ads. A place where people make an average of $3 a day is wildly unprofitable for Facebook and even the most optimistic development projections will not make them profitable for several decades. The OP is spot on that the amount of money that Facebook will ever receive from Myanmar is negligible.

        Based on conversations with people at Facebook, I generally get the sense that people there (especially at the top) have a strong sense of mission and a belief that their product makes the world a better place (I disagree with this, but that doesn't change the fact). I would guess that the project tasked with increasing users in developing countries is viewed more as philanthropy than business.

    • kaycebasques 2 years ago

      > Facebook hired 20,000+ content moderators at great expense who they still employ to monitor for stuff like this

      Part II of this series addresses this stat. During the peak genocidal years it seems there were only a handful of mods assigned to Myanmar, and none of them lived in Myanmar. To this day it sounds like there still may not be more than a hundred people moderating Myanmar content. Which is in line with the 20K number: 195 countries * 100 moderators per country = 19.5K mods total.

      I guess my main idea here is that when I look at the back-of-envelope numbers it's easy to see how even 100 moderators would have trouble keeping up with the content that millions of people produce. So you really do need those automated moderation tools to be effective.

      Can someone please continue the thought experiment by estimating how much content each mod can review in a day, then extrapolating up to a week or month, and comparing to an estimate of how much content was actually posted on Facebook in a similar timeframe? I believe Part I or II provides an estimate like that. A first pass is sketched below.
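
      Every figure in this sketch is an illustrative assumption (review speed, headcount, user base, posting rate), not a number from the series, but the orders of magnitude are the point:

      ```python
      SECONDS_PER_REVIEW = 30      # assumed time to judge one post
      HOURS_PER_SHIFT = 8
      MODERATORS = 100             # upper-bound estimate from above

      capacity = MODERATORS * (HOURS_PER_SHIFT * 3600 // SECONDS_PER_REVIEW)

      USERS = 20_000_000           # rough Facebook-in-Myanmar user base
      ITEMS_PER_USER_PER_DAY = 1   # assume one post or comment per user

      volume = USERS * ITEMS_PER_USER_PER_DAY
      print(f"capacity {capacity:,}/day vs volume {volume:,}/day "
            f"= {capacity / volume:.2%} coverage")
      # capacity 96,000/day vs volume 20,000,000/day = 0.48% coverage
      ```

      Even with generous assumptions, human review covers well under 1% of daily volume, so the automated tools really do have to carry the load.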

  • BlueTemplar 2 years ago

    Why "nowadays"? In which century did big corporations behave responsibly?

    Of course Facebook didn't fix anything in Myanmar - the new junta government is both the one running the troll farms and the one responsible for the previous genocide!

gmerc 2 years ago

The Senate hearing was really the final confirmation anyone needed that our elected representatives are woefully outmatched.

Calling out the simple statistical sleight of hand would have been a surefire way of establishing control and creating the impression of being hard on big tech that they wanted. It was delivered on a platter.

Instead of that we got "Senator, we run ads" and stupid gotcha games trying to prove FB was biased against conservatives, when it was painfully clear that Joel Kaplan had fully taken over the bias department in favor of his friends (preventing misinformation action on the news surface because, shocker, it came 90% from conservative staples like the Drudge Report).

photochemsyn 2 years ago

This series is an interesting view into Meta's internal policies, but historically it lacks a great deal of context and doesn't explain the roots of the conflict that led to the mass exodus of Rohingya people from Myanmar. In fact, the author is almost entirely silent on a key issue that exacerbated the situation: Saudi-Pakistani efforts to run a regime change operation based in the Rohingya region. See Reuters, 2016:

https://www.reuters.com/article/us-myanmar-rohingya/myanmars...

> "“Though not confirmed, there are indications he went to Pakistan and possibly elsewhere, and that he received practical training in modern guerrilla warfare,” the group said. It noted that Ata Ullah was one of 20 Rohingya from Saudi Arabia leading the group’s operations in Rakhine State. Separately, a committee of 20 senior Rohingya emigres oversees the group, which has headquarters in Mecca, the ICG said."

https://en.wikipedia.org/wiki/Ataullah_abu_Ammar_Jununi

That certainly doesn't justify the response of the Myanmar government, but it helps explain it to some degree - imagine the US political response if, say, US border facilities were attacked by an Iran-based group that killed a dozen US border guards.

Entirely neglecting this background is just bad journalism, and also distorts the role of Meta. It's highly unlikely, given this information, that a rigorous censorship strategy on Meta would have changed the outcome much if at all. A more rational conclusion is that the Myanmar government viewed the Rohingya people as a potential base for Saudi/Pakistani-backed terror groups to launch attacks within Myanmar, and that was the root cause of the expulsion, and that social media-based propaganda efforts were not that important in understanding what happened.

I'm also in favor of free speech over censorship - it's very easy to demonstrate in an open conversation that mindless hatred of others based on their ethnic/religious/racial background is simply the product of ignorance and immaturity.

  • catlover76 2 years ago

    > I'm also in favor of free speech over censorship - it's very easy to demonstrate in an open conversation that mindless hatred of others based on their ethnic/religious/racial background is simply the product of ignorance and immaturity.

    Clearly it's not; this myth needs to stop being pushed. Everything we have observed from the internet age is that such hatred metastasizes even more easily because of the internet and social media, and that almost all discourse is worse.

  • anon84873628 2 years ago

    >mindless hatred of others

    The whole reason Facebook is a useful tool for hate groups is because it allows them to cultivate a fake justification for hate that goes beyond "mindless". Inventing crimes, doctoring photos, promoting false narratives and conspiracy theories, etc. It radicalizes moderate people by making it seem like the out group is an existential threat to the in group. We don't hate them because of their skin color or religion, we hate them because they're trying to kill us first.

    Open dialogue can't fix this because it's much harder to disprove bullshit than to generate it.

  • alxmng 2 years ago

    Agreed. Myanmar has been at civil war since WWII. The Tatmadaw viewed Arakan Muslims as enemies for decades before Facebook. In the 1980s they had policies against them. And that's just one conflict in Myanmar. The Tatmadaw has been killing Shan, Karen, and other ethnic groups for 70 years. The Karen conflict is one of the longest-running conflicts in the world. They're bombing Karen and Sagaing villages right now. Facebook is inconsequential to Myanmar's bloody politics.

    It's true there are problems with Facebook. But without Facebook the Tatmadaw would still have laws against Rohingya, still have a policy of driving them out, still would engage in fighting ARSA, still would fund and promote groups like Ma Ba Tha, etc.

    • blix 2 years ago

      The conflict in Myanmar isn't a civil war any more than the conflict between settlers and natives in the USA was a civil war.

      The Myanmar (Bamar), Karen, Shan, etc. populations were all there before the British drew the borders of 'Burma.' The conflict goes back way farther than WWII. The main difference now is that the Myanmar are winning.

      • alxmng 2 years ago

        The conflicts exist along ethnic lines, but all major armed ethnic groups had formed a government together at the end of WWII, and even today many of the groups openly support a federal union.

        Settlers and natives have nothing to do with it. All the major armed ethnic peoples have existed in Myanmar since before WWII. The "settlers" were the British, and they left.

        • blix 2 years ago

          > formed a government together at end of WWII

          I am not convinced this ever happened in any meaningful sense. As you note, it turned into armed conflict almost immediately. This was a British-induced transient state that was unable to sustain itself in the region.

          > The “settlers” were the British and they left.

          The Myanmar state/people are also colonizers/conquerors. You don't need to come across an ocean to see territory that you don't control and decide to try to take it by force. Facebook and ethno-nationalism are simply new tools in a long-running conflict.

  • BlueTemplar 2 years ago

    They did mention in the previous parts ARSA killing border guards and executing 99 civilians (?)

  • g-b-r 2 years ago

    > I'm also in favor of free speech over censorship

    At issue here is the PROMOTION of misinformation, lies and genocidal speech (in users' feeds), not the fact that they haven't deleted it.

  • p_j_w 2 years ago

    > I'm also in favor of free speech over censorship - it's very easy to demonstrate in an open conversation that mindless hatred of others based on their ethnic/religious/racial background is simply the product of ignorance and immaturity.

    Demonstrating that fact is a really poor substitute for taking some kind of action against an ongoing genocide.

    • photochemsyn 2 years ago

      Practically, the best way to have avoided the problem might have been to encourage Saudi Arabia and Pakistan to NOT engage in the export of radical Wahhabist ideology and the funding of regime change operations in other states.

      Kind of like the best way to have avoided the war in Ukraine would have been for the USA to NOT have engaged in a regime change op in 2014, and the best way to avoid the current Libyan disaster would have been for NATO to have NOT engaged in a regime change operation in 2011, and so on.

      • p_j_w 2 years ago

        Regardless of whether the things you assert caused the genocide or not, there was a genocide ongoing, and your free speech and rational argumentation were completely impotent to stop it.

        • cultureswitch 2 years ago

          Do you believe Facebook could have stopped or even affected what happened in any meaningful way? Because I've got a bridge to sell you.

          • p_j_w 2 years ago

            If they had taken action, then fewer people would undoubtedly have gotten swept up, yea, absolutely 100%. I have to say though, "I don't think it would've had much impact anyway, so they should get a pass for knowingly profiting off of aiding and abetting a genocide" is not an argument I thought someone would ever seriously advance. Congratulations for that.

  • Aunche 2 years ago

    I get the sense that people like the author care less about the Rohingya genocide than they enjoy the ammunition it provides against Facebook. I'm sure you could cherry-pick examples of Rohingya escaping persecution because Facebook offered better communication. Nobody is giving Facebook credit for doing good, and rightfully so, but why should they get the full blame for escalating hate speech?

sensanaty 2 years ago

I'll preface this by saying the articles are of excellent quality, obviously well researched and to me an exemplary piece of research and journalism.

With that being said, are we really supposed to be holding entities like Meta/Facebook accountable for the actions of governments? You can read through my comments here on HN and I'm sure you'll notice I'm not one to generally have nice or even neutral things to say about megacorp entities like Meta, but I just fail to see how any of this is really on them.

If it wasn't Facebook where misinfo was being spread, it would've been Twitter, or Mastodon, or whatever other social media platform. And even if there were no social media platforms, disinformation, propaganda, and unfortunately even genocides have all been happening for a lot longer than Meta's been a thing, and I seriously doubt that the lack of channels like Facebook would've prevented or even changed anything.

Why are we suddenly deciding that Meta/Twitter/Google/Pick your poison are to be the arbiters and keepers of truth and justice?

  • intended 2 years ago

    The article explains it.

    They are being held responsible for their claims. For their product decisions. For their choices to ignore their own researchers.

    No one is “suddenly” anything. People have been crying out about this for a decade+.

    Please read the article. The answers are directly there.

  • g-b-r 2 years ago

    Besides what "intended" said, it's not likely at all that there would have been another analogous social network had Facebook not existed.

    They were the ones who first adopted most of the dynamics that led to these things.

throwaway4good 2 years ago

Is Facebook still allowed in Myanmar?

  • alxmng 2 years ago

    The military blocks Facebook (and Twitter), but everyone uses it via VPN (including members of the military). Online commerce is popular, and the majority of online commerce happens on Facebook with cash-on-delivery.

mycologos 2 years ago

OK, I spent ~an hour reading parts I [1], II [2], and III (TFA). My perception going in was that comment moderation is very hard, and essentially impossible at scale, and that Facebook/Meta cannot credibly be accused of literally causing genocide. Here is how my opinion shifted some after reading this series:

The articles present a lot of evidence to support the notion that Facebook was well aware of clear-cut incitement to violence years before its escalation into large-scale genocide in late 2016. It isn't spelled out explicitly, but I think there are two things that make Facebook's initial apparent non-response easier to understand as incompetence rather than evil:

* In the aftermath of the Arab spring, technology and social networks enjoyed a honeymoon in public perception that is hard to remember over a decade later. People, including (especially?) people in tech, could genuinely believe that making it easier for people to talk to each other was basically good, full stop. I'd suggest that this honeymoon only really ended after the 2016 US presidential election, which caused a lot of soul searching about information and echo chambers, and got even worse after the Cambridge Analytica scandal [3, this is a graph image from NBC news]

* As part I of the article notes, "[t]he information landscape in Myanmar is so unstable that accounts of any given incident conflict, often in major ways". This dynamic seems to appear in a lot of places where well-off westerners interact with much poorer and more chaotic countries (Rwanda is a clear parallel). Part I points out that the "Burmese government" itself seems to make statements both fanning anger against the Rohingya and then trying to walk them back when it boils over into disorder

So I think it was possible to be a basically well-intentioned (naive) Facebook employee in the leadup to 2016 and hear of strange, almost genocidal sounding posts in another language in a place you don't understand and still not think "this is a break-glass moment and I need to escalate and make a new path inside my corporation for this to be taken seriously". In hindsight, this was wrong, but I think we're forgetting that what happened next didn't seem possible, because we still thought of the internet as an exceptional new dawn in mass communication. A lot is necessary for people to process something as a novel, serious emergency.

However, it's hard to maintain this judgment over the ensuing years, as senior figures at Facebook are repeatedly presented with evidence that people are deliberately distributing misinformation (e.g., making fake accounts, and other clear-cut evidence that doesn't require a fact checker and research) and mostly wave it away unless it's big enough to provoke media and investor consternation. By the end of the 2010s, I think Facebook was well past the "this sort of thing wasn't something we could imagine" stage of a crisis, and into a less defensible position along the lines of "this is reality, it cannot really be moderated, but we can pretend to moderate".

One last point re: moderation. I have no idea how Meta hires its Burmese content moderators, but it must be tricky, since as far as I can tell most people in Burma are anti-Rohingya. I would like to be wrong about this, but what should be factually trustworthy sources like the Brookings Institution write things like:

> Most of the Myanmar population, especially the Buddhist majority, feels that the Rohingya don’t belong in their country ... [t]he straightforward solution would be to help the Rohingya return to their homes in Myanmar and live in peace and freedom. Sadly, this solution looks impossible in the near term because of the nationalist sentiment of the Buddhist majority

[1] https://erinkissane.com/meta-in-myanmar-part-i-the-setup

[2] https://erinkissane.com/meta-in-myanmar-part-ii-the-crisis

[3] https://media-cldnry.s-nbcnews.com/image/upload/t_fit-560w,f...

[4] https://www.brookings.edu/articles/on-the-ground-in-myanmar-...

kaycebasques 2 years ago

> There are a lot of posts about Rohingya men raping, forcibly marrying, beating, and murdering Buddhist women.

(That quote comes from the description of the mechanics of the military's coordinated propaganda campaign.)

I would like to share a PSA in the truest sense of that acronym. I suggest building an automatic bullshit indicator in your brain whenever you see any content along the lines of "they're harming our women" from anyone. It may literally be the oldest trick in the propaganda book. Both sides used it extensively in WWI [1]. It pops up a lot in the history of American slavery / racism as the idea that often directly sparked lynchings. I'm sure you can think up lots of other examples. Sorry if this is already obvious but over the years I keep seeing more and more examples of it and IMO this tactic doesn't seem to be as well-known as the technique of depicting the victims as an infestation of animal pests.

[1] Example of USA propaganda from WWI https://firstamendment.mtsu.edu/article/committee-on-public-...

  • blackoil 2 years ago

    I was just now reading very similar posts on Reddit about Hamas and Israel.

g-b-r 2 years ago

We need a Nuremberg trial for Facebook (without the death sentences, which I don't support for anyone)

  • gizmo385 2 years ago

    What exactly is the target of the comparison here? Facebook's existence? Its behavior? And whatever that thing is, to be clear, you are comparing it to the crimes against humanity committed by the Nazis during World War 2? Regardless of whether you like or dislike Facebook, that comparison seems patently absurd.

    • g-b-r 2 years ago

      It, for example, facilitated a genocide and the rise or confirmation of authoritarian and bloody regimes, and kept consciously doing it, if you read the article.

      The reference to Nuremberg was more for an example of an international trial than for the Nazis, although facilitating a smaller-scale genocide does somehow seem to have a relationship with the Nazis...

      Given the havoc they willingly caused to nations worldwide, an international trial for Zuckerberg and the others responsible does not seem so inconceivable to me...

      And that's not considering a lot of other probably less important things they did

      • cultureswitch 2 years ago

        How did Facebook facilitate a genocide? Did they provide material support? Did they make an editorial decision to stoke genocidal messages? It doesn't appear any of that is the case. It appears Facebook was used as a tool of communication by people who wanted to commit a genocide. These people also used Android and Windows (and countless other products), which technically could also censor speech. Why aren't you saying they should be put on trial too?

        Is your argument that Facebook as a whole is specifically engineered to incite tribal violence, and so should be banned on the same basis as radar scanners? Because in that case I think you have an argument to make which isn't completely incoherent.

  • BlueTemplar 2 years ago

    We need to fix our antitrust:

    https://news.ycombinator.com/item?id=37457766

  • CatWChainsaw 2 years ago

    Okay, but then life in prison without the possibility of parole, internet use, visitors, or access to excess finances, because why would a successful conviction and sentencing suddenly endow Zuck with the morality and ethics not to just facilitate more genocides the second he's given the chance?

czhu12 2 years ago

People have been quite good at committing genocide well before Facebook came around. I suspect even if Facebook didn't exist, the situation in Myanmar would not be materially improved.

jowea 2 years ago

The previous parts do say that Facebook was not doing well even when it was just disorganized masses and some provocateurs, but am I the only one who feels it is weird that Facebook, a private company, is expected to "fight" state organizations? Is this consequence of free market capitalism really how the US should do things?
