Effective Altruists Use Threats and Harassment to Silence Their Critics

realtimetechpocalypse.com

31 points by konmok 17 days ago · 20 comments

Trasmatta 17 days ago

Turns out you can justify all sorts of reprehensible behavior when you convince yourself it's for "the greater good"

They learned the wrong lesson from Death Note

imtringued 17 days ago

> I’m referring to his claim that “it’s hard to see how” 7 to 10 degrees C of global warming “could lead directly to civilisational collapse.” He proceeds to assert that, while “climatic instability is generally bad for agriculture,” his “best guess” is that “even with fifteen degrees of warming, the heat would not pass lethal limits for crops in most regions.”

Do these people not understand that crops need water? Higher temperatures mean higher evaporation rates. Vast swathes of Iran have become inhospitable due to water mismanagement. That will lead to millions of refugees fleeing the country. Climate change is like poverty in this respect. If you're poor in water, you can't afford to make any mistakes.

Longtermism is a curse to long-term thinking. You're not allowed to think about the next ten thousand years of humanity, because apparently that's too short a window.

Not just that. This type of thinking contradicts optimal control theory. Your model needs to produce an uninterrupted chain from the present to the future. Longtermism chops the present off, which means the initial state lies in the future. You end up with an unknown initial state, to which the Longtermists respond with hacks: they add a minimal set of constraints back. That minimal set is the avoidance of extinction, which is to say they are fine with almost everything else.

Based on that logic, you'd think that Longtermists would be primarily concerned with colonizing planets in the solar system and building resilient ecosystems on Earth so that they can be replicated on other planets or in space colonies, but you see no such thing. Instead they got their brains fried by the possibility of runaway AI [0], and the Earth is treated as a disposable consumable.

[0] The AI they worry about is extremely narrow. Tesla doors that can't be opened in an emergency due to battery loss don't count as runaway AI, but if you had to beg the Tesla car AI to open the door and the AI refused, that would be worthy of AI safety research. They see no problem, however, with the inappropriate use of AI in places where it shouldn't be used in the first place.

Arnt 17 days ago

Didn't that book suggest that a single building used 20% of the water in South America? Amazingly sloppy.

I really do think that people should be careful about what they say in public and measure their words. And further, I think that the author of that book ought to be silent on that particular subject.

  • konmokOP 17 days ago

    Your comment kinda proves the article's point, don't you think? I mean, obviously your comment doesn't constitute a threat or harassment, but it does demonstrate the weird double standard and unbalanced scrutiny that the article describes.

    • Arnt 17 days ago

      No double standard.

      On one hand, I think that people should check before publication and not publish shit. That goes for posting on the internet, and also about publishing books.

      Separately and orthogonally, I think that someone who doesn't check before publication and publishes shit should refrain from complaining about other people's shit, even though other people's shit really is shit.

      • konmokOP 16 days ago

        Sure, fine. I'm just highlighting that you chose to call out Karen Hao for her mistake (which she admitted and corrected), but not Will MacAskill or any of the other big EA names that have made egregious and dishonest claims. If it's not a double standard, Will also ought to be silent on this subject, right?

        That's what I mean by unbalanced scrutiny.

        • Arnt 11 days ago

          IMNSHO she didn't admit and correct the mistake — yet. She admitted one mistake and has corrected none, and the one she admitted was IMNSHO not the severe one.

          She made several mistakes, of which I'll describe two. One (modestly serious) was to confuse units and compute the wrong number. The second (against my religion) was to publish without sanity-checking. You and I both know she didn't check, because her estimate for the average water use of one building was 20% of the water use of the continent. Any sort of check would uncover that mistake.

          We in the rational camp are supposed to behave differently from Alex Jones, and part of that is to check before we publish.

          She's "making arrangements with her editor to rectify the situation". If she fixes every reported error, not just one of them, I'll have a lot of respect for her.

          • konmokOP 11 days ago

            You dodged the question. That's not very rational of you :)

            • Arnt 10 days ago

              Was the question about Will whatshisname? I'd rather not mention his possible transgressions, since I hadn't even heard the name until this thread.

              Post a story about him to HN and I'll either comment or miss the thread, both are possible.

  • rendx 17 days ago

    Interesting how you seem to see nothing inherently wrong in the quoted calls for violence against people of a different opinion, yet chose to critique only the person who admitted a mistake without aggression toward anyone else, and to demand that they be (forever?) silent about a topic they seem interested in.

    Why would you ever want to demand that someone "stay silent" about anything? Taking away somebody's voice is the lowest of the low. You do not have to read it or interact with it if you don't like it. And how would you want to be treated when you make a mistake? Can't you see how that leads straight to a world of zero progress, where people are afraid to do anything because it could turn out to be a mistake and they will be shunned for it by those who happen to have the most power? Are you not aware of the research into how harmful punishment is to learning and to the advancement of society?

    Williams, K. D., & Nida, S. A. (2022). Ostracism and social exclusion: Implications for separation, social isolation, and loss. Current Opinion in Psychology, 47, 101353. https://doi.org/10.1016/j.copsyc.2022.101353

    Knapton, H. M. (2014). The recruitment and radicalisation of Western citizens: Does ostracism have a role in homegrown terrorism? Journal of European Psychology Students, 5(1), 38–48. https://doi.org/10.5334/jeps.bo

yongjik 17 days ago

This blog post could have been better without the long intro featuring Timnit Gebru. It just reads as a boring "someone finds a mistake in a book, someone else quotes it sarcastically, a bunch of others call it out with 'bro why so butthurt'" story. You'll find better stories in r/subredditdrama.

As it reads now, I'm not sure whether this is an objective critique of EA or the gripes of someone who orbited the same social space and had a public falling-out.

konmokOP 17 days ago

I find this really frustrating because I like the idea of "make a lot of money, then give most of it away to make the world better for everyone". But it seems like most of the people who proudly call themselves "effective altruists" are just heartless tech bros that toss their money into useless AGI cults.

  • themafia 17 days ago

    How about just "build a good company and give most of the profits to the workers."

    I just saved you several steps and opportunities for graft and corruption. Let's call it "immediate altruism."

    • konmokOP 17 days ago

      Well, that doesn't really align with my interests, education, personality, or skills[1]. I do appreciate that criticism, but I'm looking for ways to give back that don't require abandoning my chosen career. I think there's a middle ground, basically.

      [1]: What I mean is, I don't want to build my own company, and if I did, it would be in a very niche area that wouldn't directly benefit the people that most need help.

      • themafia 17 days ago

        > Well, that doesn't really align with my interests, education, personality, or skills

        Ah, well for you, we have "regular altruism." Just pick a charity and send them money or donate your time to volunteer efforts in your community.

        > What I mean is

        Completely understandable. I was responding to the idea that being a cutthroat capitalist who treads on your customers and workers to make a bunch of money, and then exporting some fraction of it into "effective altruism", is probably missing the point of altruism entirely. I think it creates more suffering than it solves.

    • listenallyall 17 days ago

      Why the workers and not, say, the customers? Workers bear little risk; they get paid a salary regardless of the company's fortunes (unless the company is so awful it goes out of business). The customers who believed in the company enough to give it money seem more worthy of future compensation (via profit-sharing, as per your example).

      • themafia 17 days ago

        > Why the workers and not the customers, let's say?

        Workers represent more of an investment in time and training. Therefore they represent long term value. Customers are fickle, as they should be, but if I get beat on prices today they're gone tomorrow.

        > customers who believed in the company enough to give them money

        You seem to be describing a donor, or possibly a member of a co-op. A customer simply receives an object of value in exchange for their money. As long as they're getting good value on a quality product, their belief in the company is not material.

        • listenallyall 16 days ago

          > Workers represent more of an investment in time and training

          Not really: post a job opening and you'll likely get plenty of applicants, many of whom are indeed qualified. Taking the time to vet them and choose one is a benefit of having too many options. Getting customers is harder; you have to advertise and market your product, and "acquisition cost" is a real thing.

          > As long as they're getting a good value on a quality product

          But especially early on, how do new customers know the product is quality? Someone has to be the first to eat at a restaurant or to hire you to paint their house. Even with established companies (ordering clothes online when you can't actually feel the material, picking a dentist when you don't yet know how they will treat you, letting Uber decide who will drive you to the airport, judging how a pair of skis will perform by looking at them on a carpeted floor), most customer purchases and decisions are made with far-from-perfect information. Customers just have to put faith in the seller or service provider, and that's what I'm suggesting is worth future compensation.

          > if I get beat on prices today they're gone tomorrow

          If that's the case you really haven't built much of a business: you're just selling commodities, and your employees have failed to differentiate your company from your competitors.

    • dfe 17 days ago

      This is a time-tested winning strategy that too few corporate owners embrace.

      When you look at some of the most well-known industrial companies, their founders basically did this.

      Difficulty: give away too much of the company while trying to raise capital, and most investors won't let you do this. Of course, you aren't really the owner anymore at that point, are you?

      I think that's the allure of effective altruism. You founded a company or were early enough in a company to have enough shares to sell to investors. Those investors want big returns. The company is now at their mercy, but hey, they gave you a pile of cash so you can spend it on feeling good.

  • plastic-enjoyer 17 days ago

    EA is a neat philosophy to make greed and fraud seem principled.
