How to spot cult leader personalities
sashachapin.substack.com

There's lots and lots written about cults and cult leaders, which is worth reading, such as Combating Cult Mind Control by Steven Hassan or, if you prefer a more traditional take, "Thought Reform and the Psychology of Totalism" by Robert Jay Lifton. Or any of the dozens of other good, authoritative takes on the subject by people who have spent their careers formally studying such phenomena, which draw lines between all kinds of different groups and political movements that actually have lots in common.
What's wrong with this blog post is that the author does not seem to have read any books at all on the subject, which IMO is fairly irresponsible considering how much there is to read about it and how critical it is that "cult behavior" be widely understood in order for society to progress. It seems to be off-the-cuff musings about the topic, alarmingly even entertaining the ideas that some "cult leaders" are "good" (no) or that they can be "reformed" (irrelevant, but also REALLY doubtful considering what it means to be a cult leader; cult leaders generally remain leaders well after they've been sentenced to prison for decades, their followers show up to visit, etc.; there is no "reform" here, sorry), and overall a lot of muddy, uninformed, made-up thinking that will only get more people into cults.
This post is about "cult leader personalities", not literal "cult leaders".
I imagine we've all had the experience of meeting such magnetic personalities, but probably only a tiny percentage have met an actual cult leader.
use a different word then
> What's wrong with this blog post is that the author does not seem to have read any books at all on the subject,
Lol and you didn't read the blog post apparently
lol I read the whole post and it does not refer to any research or books for its assertions
Cult leader behavior: belief that no field of study matters until one's own attention is turned to it :)
Also known as physicist behaviour, heh. https://xkcd.com/793/
The author is on point, and he didn't say 'cult leaders' are good, he said that people with that personality type can be, which of course they can.
I think that most of our 'great leaders' (good ones) that we look up to flirt with this mindset for periods of time in their lives.
Not all cult leaders maintain their disposition after being arrested.
The author was fairly concise actually in identifying core attributes, and especially with the meta cognition bit.
I've known a few of these guys (usually guys, but not always), and I agree they are a very generic type. I also think they can be tremendously beneficial to society as well, even if they aren't to the individuals in their lives. (Steve-Jobs-types, etc.)
I still find them fascinating, because they tend to exist in interesting environments. It's like going to Africa, but not wanting to see any lions.
Thank you for the book recommendations!
If you wouldn’t mind, could you share what you found most interesting while reading the books you mentioned?
Well, of course, that people can be made to fully believe in things that are demonstrably false. That's no big deal today, as we see it everywhere, but back in 1987, when "Dianetics" was on TV all the time, including a whole list of completely false claims, I needed some help to understand exactly what that was about, and how maybe I didn't need to throw my life away at the local Scientology center as that book would have you believe. As a young absolute literalist, I needed some help to learn that people can be made to think basically anything, so if someone is claiming to you, "I can levitate - no really, I can float around the room," they are likely incorrect (yet still saying those words that are easily disproven).
Good observations and I've observed more than a few in computerland, especially in startups.
However, it's not true that cult leaders are always grandiose. They can be strategically vulnerable too!
When confronted -- especially by someone too smart to be fooled by their facade -- they can take that critic aside, in a private setting, and suddenly express vulnerability, maybe even fragility. The critic is taken aback, maybe even worries that they've gone too far, but also feels honored to have become a confidant of the leader. The critic may now even feel compelled to help cover for the leader's failings!
But this, too, is an act. Remember, as Sasha said, such people are always being strategic, and strategic vulnerability is part of how they operate.
Some related thoughts, sorted by brevity:
(1) All pathologies are healthy behaviors "gone wrong" - wrong time, or wrong place, or (mostly) wrong magnitude.
(2) I wondered for a very long time: what's the difference between being convinced, and being manipulated? Stumbled across something on the 'net that finally gave an answer: Manipulation begins with diminishment, and (bringing it back to cults), "isolation" is one of the powerful forms of manipulation.
(3) Recently been thinking about "responsibility" in terms of the physics of a 'bounce' -
Drop a ball on sand, it goes thud; the force involved is the weight of the ball. Drop a bouncy ball on concrete, the force is ~2x (stopping the downward motion, then enough for the upward motion).
When someone comes to you and says: "This thing you did had this negative impact on [me/them/us]", if someone rejects any possible responsibility for it, that's a "bounce". It ends up looking like any of the forms of "pushing your reality onto others" - "you do this to me", gaslighting, etc etc etc.
> Drop a ball on sand, it goes thud; the force involved is the weight of the ball. Drop a bouncy ball on concrete, the force is ~2x (stopping the downward motion, then enough for the upward motion).
The force in both scenarios is exactly the same. In the first the force goes into displacing the sand while in the second - since the concrete is a rigid lattice - the force goes into deforming the ball, which causes it to bounce back due to its elasticity. Objects with no elasticity (like another piece of concrete) will not bounce.
Beware of physical metaphors.
Hmmm good point about the sand deformation but I think you're off on "exactly the same".
AFAICT the impulse to bring the ball to rest is exactly the same. As you point out, either the sand or the ball's structure (or the ground's structure, such as a trampoline) absorbs it. After that, however, one ball remains at rest and the other accelerates back up, and the impulse producing that acceleration is also exerted on the ground (keeping in mind Newton's third law). If the ball returns to 80% of the original height, its rebound speed is √0.8 ≈ 89% of the impact speed (energy scales with height, speed with its square root), so that's 100% of the impulse to bring it to rest (same in both situations), plus an additional ~89% to re-"throw" it, which is only present in the bounce. So a "thud" involves 100% of the impulse, and a "bounce" roughly 189%.
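The impulse arithmetic above can be sketched numerically (a minimal illustration under the assumption that rebound speed scales with the square root of rebound height; the function name is made up):

```python
import math

def impulse_ratio(rebound_height_fraction):
    """Total impulse the ground exerts in a bounce, relative to a dead 'thud'.

    Stopping the ball takes impulse m*v; a bounce adds m*v_up, where
    v_up = sqrt(rebound_height_fraction) * v, since kinetic energy (~ v^2)
    converts back into height on the way up.
    """
    return 1.0 + math.sqrt(rebound_height_fraction)

# A ball that rebounds to 80% of its drop height:
print(round(impulse_ratio(0.8), 2))  # -> 1.89, i.e. ~189% of the 'thud' impulse
```

A perfectly dead drop (`rebound_height_fraction = 0`) gives exactly 1.0, matching the plain "thud" case.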
No, we judge people by actions, not by random categorizations or 'fits the type' or 'Polarizing Charisma'. Here's how to spot a cult leader properly: they manipulate people and exploit them by taking their money. Otherwise they're not a cult leader.
There is post-facto judgment of those who have done harm, and then there are heuristics we can use to help us avoid people who would do harm before it's been done.
What is the false positive rate of these heuristics?
My impression is this is a common phenomenon in the software engineering community, probably mainly because a significant fraction of the community is young and/or inexperienced (e.g. the number of programmers doubles every 5 years, which means 50% of programmers have less than 5 years' experience).
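The doubling arithmetic above is easy to sanity-check with a toy calculation (hypothetical headcount, assuming steady exponential growth and no attrition; the function name is made up):

```python
def fraction_junior(total_now, doubling_period_years=5, cutoff_years=5):
    """Fraction of programmers with under `cutoff_years` of experience,
    assuming headcount doubles every `doubling_period_years` and nobody
    leaves the field: the newcomers are the current total minus the
    total as it stood `cutoff_years` ago."""
    total_then = total_now / 2 ** (cutoff_years / doubling_period_years)
    return (total_now - total_then) / total_now

# With any current headcount, a 5-year doubling time puts half the field
# under 5 years of experience:
print(fraction_junior(1_000_000))  # -> 0.5
```

The result is independent of the headcount you plug in; only the ratio of cutoff to doubling period matters.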
There are plenty of examples from streamers to serious programmers but the most common is probably the benevolent dictator for life personalities for numerous software projects.
> There are plenty of examples from streamers to serious programmers but the most common is probably the benevolent dictator for life personalities for numerous software projects
Can you provide any examples from either of these?
I feel like you may be confusing "people with strong opinions" with "narcissistic cult leader."
Richard Stallman is a classic example of this. Brilliant man who produced many valuable things while having a strong philosophical basis underpinning them. As a person he has also engaged in concerning behavior that was hard to separate from his contributions to the Open Source community.
Not sure if it belongs, but I think that epistemic certainty should be on the list as something distinct from personal humility. You can command a room by saying something like "I'm not all that smart, smart people overcomplicate this, but this is so simple..."
I tend to be an "only true wisdom is in knowing you know nothing" sort of guy, but when I run into someone who isn't like that, who seems to have it all figured out and says it with conviction, it definitely pulls at me. It does one of these:
1. Makes me think that I've overcomplicated it
2. Makes me think I have to convince him he's wrong (which is very hard to do unless you're as laser-focused as he is and he's engaging in good faith)
3. Makes me think it's not worth the effort because of what the other two options are
If you follow 1 or 3, you're ceding the argument. 2 is hard to do well.
"This is so simple..."
Reminds me of "One weird trick." Really they're two specimens of the same phenomenon. Some people essentially spam and clickbait the world with ads for themselves.
An accurate description of a good many politicians.
Charismatic, polarizing and tactical.
I think politics only works for people with these traits.
Yah, but one really stands out.
Is cult leader an extreme version of narcissistic personality? A lot of overlap it seems.
I was reading it thinking it sounds like a description of a narcissist/psychopath
This just comes across to me as a nice description of the dark-triad personality traits, which actually exist in all types of organisations and social situations and are not exclusive to cult leaders at all. Below are some interesting resources on this topic that I have read, as I have an interest in this subject. You may have read them already, but I thought I would pop them below since there were no resources in the article.
Hogan, J., Hogan, R., & Kaiser, R. B. (2011). Management derailment. In S. Zedeck (Ed.), APA Handbook of Industrial and Organizational Psychology (vol. 3, pp. 555–575). American Psychological Association.
Tourish, D. (2018). Dysfunctional leadership in corporations. In P. Garrard (Ed.), The Leadership hubris epidemic: Biological roots and strategies for prevention (pp. 137–162). Palgrave Macmillan.
Tourish, D. (2013). The dark side of transformational leadership: A critical perspective. Routledge. Chapter 3: Coercive persuasion, power and corporate cults
Alvesson, M., & Blom, M. (2019). Beyond leadership and followership: Working with a variety of modes of organizing. Organizational Dynamics, 48(1), 28–37.
However, as was pointed out in the post it is never just a cult leader, there are always those who are complicit, and additionally there are those who just want someone to follow, which is really what gives cult leaders their "power" or their "authority". This is where the concept of relational leadership is interesting:
Hughes, R. L., Ginnett, R. C., & Curphy, G. J. (2014). Leadership: Enhancing the Lessons of Experience. McGraw-Hill Higher Education. Chapter 1: What do we mean by leadership
Haslam, S. A., & Reicher, S. D. (2016). Rethinking the psychology of leadership: From personal identity to social identity. Daedalus, 145(3), 21–34.
Cunliffe, A. L., & Eriksen, M. (2011). Relational leadership. Human Relations, 64(11), 1425–1449.
I don't think there is necessarily anything wrong with the traits described here; rather, there can be a problem with the way certain people abuse their power once they have it. The burning question that bugs me in life is why so many humans just follow the leader, even when the leader is a total piece of shit.
Context is very important. With close friends anyone can say anything and it’s brushed off. In a tense business meeting where the difference in big money is how two groups speak and act, it makes sense to be calculating.
It's not talking about people who are "calculating," it is talking about people who switch between negging you and complimenting you in a way that causes you to think about them too much.
I was confused reading this. Is the author talking about literal suicide cult leaders like Jones, Koresh, and Applewhite, or "cult of personality" types like Jobs and Elon?
I have a LOT of problems with Koresh and the Branch Davidians, but he def wasn't leading a "suicide cult." They were actually the opposite...they believed that in the End Times they needed to stick it out on their property as long as possible even if it came to violence. They stockpiled weapons believing that it would come down to a final showdown between them and the antichrist's forces. Obviously they were unbiblical, and the child abuse is absolutely unforgivable, but I take issue with people calling the Branch Davidians a suicide cult when that was against their beliefs.
That being said, I think the author is talking about cult of personality types. People with a personality that gears towards being the leaders of a cult, or getting a cult-like status. That is how it reads to me anyway.
It's obviously an n-dimensional space. I know a lot of the people that Sasha knows, so I can guess some of the people he was talking about here, and it's more like Jared Leto and his handful of weird groupies who are in a deeply unhealthy social and intellectual dynamic. So not really suicide cults, but uncomfortably close, not successful or mainstream enough to be Jobs, not really growth oriented or external facing enough to be Scientology. But a dubious social bubble around certain personality types.
..."cult of personality" types like Jobs and Elon?
I haven't met Elon, but the impression I got from videos is very different from Jobs. He has a cult following indeed, but is it really because of his magnetism or PUA tricks like in the article?
Did you read the part where he said they’re not always a net negative?
That does not help clarify which group you are talking about.
Is Jones a net positive or neutral?
I’ve always been fascinated by personality, and spent a decent amount of time in and out of school studying it.
One thing that gets me about the way this author talks is it’s condescending. As though he’s an enlightened observer staring down over his spectacles at this type of person. (Like the final quote about cult leaders being mundane or boring…) Sorry to say, but that person you’re talking down on is probably way more clever and motivated than you are. He’s putting you in the bucket not the other way around!
But the thing that fascinates me is how personalities evolved in general. Like why do people have these set ways of being?
The apparent answer is that personalities evolved as a symbiotic trait. If I have a few people in my village: one’s an asshole to defend from enemies, one’s passionate and emotional to rear children, one’s a narcissist who wants to unite and lead us (under him), one’s a thinker who will improve our hunting and killing tools, and a few people just don’t think that much and view life as hard and are willing to “just go along with things”…
Before you know it we have a mini society. A village of people who work together automatically bc it’s just who they are and it works for (most of) them
So the question is: are cult leaders actually something that thousands of years ago was a benefit? Has society changed so that now they just don’t fit like they otherwise would in a healthy village? Or do parasitic personalities also evolve that have always been a detriment to the rest of us?
I feel like this is something everyone does to some extent, I don't know. To me it makes more sense to focus on someone's propensity toward behaviors you find undesirable rather than labelling them with a bad name.
the author seems to actually be upset by self centered ladder climbers, not actual cult leaders. post is lame
blow/muratori
It's interesting how many of these traits line up perfectly with the actions of Eliezer Yudkowsky. Especially the parts about lacking humility to the point where they think their ideology will save the universe from certain doom. Or making people who follow his words say that everyone who thinks he's crazy just "doesn't get it" and is tragically unable to absorb the wisdom or goodness.
Please let's not cross into personal attack on HN. I gather that's not your intent, but threads can too easily turn that way unintentionally.
A substantive discussion of this topic isn't going to work if it veers into dissecting particular personalities.
I see your point, but EY is a huge figure - I think that more leeway should be given for criticism, in the same way people can criticize Tim Cook or Sam Altman.
I don't think hugeness is the criterion here, since personal attacks wouldn't be ok on HN for those characters either.
The question is, where's the line that crosses into getting too personal? Criticism of someone's views is fine, criticism of their actions is fine (though people tend to slip into worse while conducting it), criticism of personality traits starts to be a different story.
This is not a binary thing. I can see good aspects to the kind of discussion you were starting there—I just think the bad aspects outweigh the good.
A lot of this is about the difference between personal conversation and online conversation. One can have a fun and harmless personal conversation about this kind of thing with friends, but the same thing broadcasted to thousands of strangers (and other commenters) online leads to different dynamics, a lot of which get pretty ugly and are best nipped in the bud.
That's not interesting at all, because it was written partially to target Yudkowsky, with the goal that you won't "Pascal's mug"* yourself eg https://twitter.com/sashachapin/status/1657063187655843841
Keep in mind that a lot of tech-adjacent writing these days is just AI debates (and this is why he is not engaging with the actual cult literature or providing examples); this is not the place to debate the object-level arguments, but I will say I disapprove of Chapin writing like this, and not owning up to the real purpose of this essay and the Bay Area dynamics he's criticizing. It is deceptive in precisely the way you inadvertently illustrate.
* His use is wrong, incidentally, both in the original abstruse decision-theory sense of the phrase as coined by Yudkowsky, ironically enough, and in the vulgarized sense of 'you should ignore small probabilities of very bad things' (because we are now far beyond some 'small' probability of AI, and AI risk is now considered so probable people like Geoff Hinton are quitting their jobs so they can speak out about it https://www.lesswrong.com/posts/bLvc7XkSSnoqSukgy/a-brief-co... )
Gwern, this seems like mind-reading. The real purpose of this essay was to muse about a set of character traits I find interesting, not participate in an AI debate. Most of the people I was thinking of are not involved in AI. (I'm much more a part of the meditation world than the AI world, and there's a LOT more cultiness there; and a more thorough survey of my twitter would have revealed that I think about meditation more than AI.)
Yudkowsky was definitely a case I thought of, but not the specific target of this article. AND, Yud is actually an example of one of these personalities that I think is probably net positive! Even if I have a lot of objections to the way he's presented himself and his specific arguments, I think he's doing a good job with moving the Overton window of taking AI seriously.
I grant that I probably do not fully understand Pascal's Mugging. But this tweet was a subtweet of someone who said that she was working on AI because an aligned AI would definitely end factory farming; whether or not this is true, this seems like the kind of thinking that will drive people crazy (all concerns must be suborned to the One Great Cause.)
I'm saying more that it's "interesting" in an argumentative way, not that it's actually of interest about how this is about EY. It should be interesting to followers of EY because all of these traits are things that cult leaders do, and are things that EY does, because EY is (intentionally or not) running a doomsday cult. It's impressive how much reach he's been able to get, to the point where his ideas of control and morality have seeped into the most important technology of our time, but it's still a cult founded on shaky conjecture nonetheless. I think it's pretty clear if you are interested in AI who this article is about, although I do agree it should have been much more explicitly followed up with "Here's how EY is these things".
It's made me very sad to see extremely smart people, who I once looked up to and really found very interesting (you're included in here!) fall victim to the irrational fear of AI spurred on by Eliezer. You can create as much literature as you want, and you can create as many hypothetical scenarios as possible, but it's not going to change the fact that dedicating your life into controlling AI so that you can "logically" justify your own life is nonsense.
It's not that I don't get it. I just disagree. I don't think AI will create a science fiction horror show to end all of humanity. I don't think AI will cause humans to stop being humans. I don't think AI will create the end of the world. Yes, I am not 100% sure. There is a 0.01% chance of this all happening. But that doesn't mean that we need to pay attention to it just because multiplying 0.0001 by 1,000,000 yields a big number. I think basing your morality and existence on trolley problems and mental experiments is abhorrent. And no, I can't give you an exact spelled out reason as to why. Do I have to justify basic human morality?
You're an extremely smart person. I love your blog posts, and I love the depth and time you put into them. It's incredible how disappointing it is for me to see you become another "rationalist" obsessed with a fantasy of AI doom.
Edit for your edit: Someone who is wealthy and has nothing to lose quitting his job to join the cult of AI doomerism is, in no way, proof that AI xrisk is meaningful. You're using someone else coming to a conclusion as a reason you are correct. I trust you are smart enough to know why this is awful logic.
> It should be interesting to followers of EY because all of these traits are things that cult leaders do
It's not really because it's glib pattern-matching. It's roughly "Is the NIH a cult?" https://www.michaeleisen.org/blog/?p=1217 level. Go compare Eliezer to say, NXIVM, to see what a real cult looks like.
> It's made me very sad to see extremely smart people, who I once looked up to and really found very interesting (you're included in here!) fall victim to the irrational fear of AI spurred on by Eliezer. You can create as much literature as you want, and you can create as many hypothetical scenarios as possible, but it's not going to change the fact that dedicating your life into controlling AI so that you can "logically" justify your own life is nonsense...You're an extremely smart person. I love your blog posts, and I love the depth and time you put into them. It's incredible how disappointing it is for me to see you become another "rationalist" obsessed with a fantasy of AI doom.
I feel I should point out that you have not seen me 'become another rationalist' because this is what I have been since around 2004, long before I started any writing or even was 'gwern'. 'AI doom' has always driven my writing (even if I sometimes wander quite far afield before eg. concluding that nootropics are a dead end for intelligence augmentation that might help with AI risk - on the bright side, I think DL scaling may have helped answer why nootropics have been such a bust in general). I just thought we'd have way more time.
> Someone who is wealthy and has nothing to lose quitting his job to join the cult of AI doomerism is, in no way, proof that AI xrisk is meaningful. I trust you are smart enough to know why this is awful logic.
'some wealthy dude' is not the relevant description here for Geoff Hinton. I trust you are smart enough to see the problem with your even glibber comment there.
Yes, a 100 times this! Thank you for putting this so nicely into words! Too bad there's no HN gold or whatever for me to award you!
The author sounds like he has some jealousy issues to get sorted.
I found the last section shocking. Wanting to joust, jealous of this personality type? That's... really strange.
I came away from the article thinking, "Where is he finding all these cult leaders?"
I didn’t read it that way at all, and felt that section ring particularly true. “Joust” really just seems like another word for willing to engage in well-intentioned debate. The realization the author likely made was that the so-called “cult leaders” were not at all engaging in good faith, despite them and their followers insisting otherwise. Some really great examples of this would be Jordan Peterson or Eric Weinstein. The moment you think you can engage with these sorts in good faith is the moment you’ve fallen into their trap.