There's a downside to loosening up your mental resistance to mind-changing: you become more susceptible to cult indoctrination.

You need look no further than the Rationalist community, which has internalized this to such a degree that cults are endemic to it. Sure, there are positives to being open to changing one's beliefs, but like all advice, it's contextual. Some people probably do need to loosen up, but they are the least likely to do so. Those who hold their beliefs too loosely could stand to tighten that knot a little more.

Cult indoctrination could be explained by this, but it could also be explained by the fact that a certain number of formerly gifted kids, ostracised during childhood and with low social skills, tend to gravitate toward the rationalist community. I do believe those people are more likely to be indoctrinated.

From my reading about the Zizians, they don't seem to change their minds easily either; instead they have tended toward very radical opinions that progressively become more extreme.

I argue that holding opinions that progressively become more extreme is in fact changing one's mind. It might not be the kind of mind-changing we immediately imagine, but it is mind-changing nonetheless.

I'm not trying to be clever; the fact that this flies under the radar just means we might be looking for "changing minds" in one form when it's mostly occurring in another.

People who feel ostracised or underappreciated tend to make good marks for cults and extremist groups in general. Another commenter pointed out that changing an opinion is a more emotional process than we'd like to assume.

Can you elaborate a bit more on the rationalist community’s perceived cults? I’ve only dipped my toes into places like LessWrong, so I am curious what you see there.

Rationalism is essentially a tech-flavored self-help movement, and the people who gravitate towards self-help in general tend to be emotionally vulnerable and strongly susceptible to cult techniques (there's a reason so many cults start out as self-help movements).

On top of that, given the tech-flavored nature of Rationalism, its adherents seem to gravitate towards strongly utilitarian ethics (evil can be justified if done for a greater good) and an almost messianic relationship with artificial superintelligence (a good so great it can justify a lot of evil).

Finally, it seems to me that Rationalism is especially prone to producing tedious writers, which creates insularity (by making the movement impenetrable to non-insiders) and lots of schisms over minor disputes that, thanks to that insularity, end up festering into something rather more cult-like, demanding more immediate and drastic action... like the Zizians.

To add a little nuance, and a bit of a detour from the original topic: some Rationalists (I'm thinking of Scott Alexander) tend to spend a lot of brainpower on the negative aspects of AI too; think of the alignment problem.

The category of events with near-infinite positive or negative outcomes and zero to few prior examples, where it's difficult to establish a base rate (prior), appears to attract them the most. Conversely, an imagined demonic relationship with a yet-to-be-realized unaligned AI produces a particular existential paranoia that permeates other enclaves of Rationalist discourse.

[deleted]

Probably referring to the Ziz cult, which was born out of the rationalist community and recently murdered a bunch of innocent people.

Among others.

Yeah I see your point but the median person probably falls on the side of needing to loosen up.

An open mind is like a fortress with its gates unbarred and unguarded.

Is this where we are now?

Shockingly, in a world where both eating too much food and too little will kill you, as will too much or too little water, heat or oxygen, the solutions are rarely found at the extremes of any continuum.

Creative, but no.

I wonder which is the cause and which is the effect. If Rationalism promises mind-changing, I bet it attracts people obsessed with mind-changing. Rationalism promises a chance to touch the eternal Truth, or at least to come closer to it, so naturally people who seek such a truth will try to become rationalists.

Overall, this can easily lead to a greater-than-average concentration of people susceptible to cults.

You see, I was engaged in lesswrong.com activities 10+ years ago, and I didn't become more "cultist". Probably even less so. If I look at the changes in me that happened from reading Yudkowsky and talking with other people who read him, I'd say those changes were coming in any case; the lesswrong stuff played its role and influenced the outcomes, but even before my lesswrong period I was:

1. Interested in arguments and how they work or fail to work.
2. Constantly trying to dismantle laws, social norms, and moral rules to answer "why do they exist and how do they benefit society?", "how do they work?" Some of them I rejected as stupid and pointless.
3. Interested in science overall and psychology in particular.

I learned a lot in that time about how arguments work, and I was excited to see Yudkowsky's take on the subject. His approach doesn't work in reality, only with other rationalists, but I like it nevertheless.

OTOH, I have to say that Yudkowsky himself has a lot of the traits of a cult leader. His texts are written as if they were his own unique ideas. He sometimes refers to Socrates or some other figure, but it doesn't help: his texts read as if he were a genius who invented a new philosophical system from the ground up. I didn't know the history of philosophy well enough to see how far from the truth that picture is. The bells began to ring in my head when I got to "Death Spirals", where Yudkowsky talked about cults and why lesswrong is not a cult. That is highly suspicious in itself, but his arguments were not good enough for me, maybe because they were worse than usual or maybe because I was more critical than usual. "Death Spirals" failed to convince me that lesswrong is not a cult; on the contrary, it made me wonder "a cult or not a cult?" all the time.

And this question led me to search for information everywhere, not just on lesswrong. Then I found a new "sport": finding Yudkowsky's ideas in the writings of thinkers from the 19th century or earlier. Had he conceived even one truly original idea? This activity was much more fun for me than lesswrong, and after that I had no chance whatsoever of becoming part of a cult centered on Rationality.

The point I'm trying to make is Yudkowski's Rationality doesn't deliver its promises, people get not what was promised but what they had already. Rationality changes them somehow, but I believe that it is not the reason, just a trigger for changes that would come in any case.

> And this question led me to search for information everywhere, not just on lesswrong. Then I found a new "sport": finding Yudkowsky's ideas in the writings of thinkers from the 19th century or earlier. Had he conceived even one truly original idea? This activity was much more fun for me than lesswrong, and after that I had no chance whatsoever of becoming part of a cult centered on Rationality.

Do you have any interesting references? :)

I don't remember Yudkowsky well enough to point at the missing references. But I can point to my latest find.

Have you heard of Charles Sanders Peirce[1]? He said, basically, that the truth is what people are ready to bet on. Not something hand-wavy like "the scientific method" or "millions of flies" or anything else; it is what real people are ready to rely on. Yudkowsky is in favor of betting; moreover, he tends to measure "truthiness" by bets. He is much in favor of the scientific method, but if you look closely, that is because you can bet on scientific knowledge.

BTW, about his belief in the scientific method: all or almost all of the psychology Yudkowsky refers to has been debunked, sometimes thoroughly. The Stanford prison experiment, for example, was staged; it was like a play in a theater, with Zimbardo whispering from behind the curtains, "more brutality, please". It is a separate issue with Yudkowsky that he can't distinguish good science from bad science, even when the bad science was debunked decades ago. He (like his version of Harry Potter) believes that if he vows his allegiance to Science and knows what an integral is, then he is a scientist. He talks a lot about training one's mind, but this training doesn't include reading basic textbooks in the branch of science he is interested in.

You see, "Zimbardo experiment" technically speaking is not an experiment. There are no two groups with varying stimuli to compare outcomes. If we tried to classify it, it can probably be classified as "observation", the lowest tier of research approaches (experiment is the highest one, though meta-research is probably even higher, but it is about reading works of others instead of asking questions to Reality directly). It doesn't allow us to reason about causes and effects. It is something that undergraduates in social sciences learn in their first year.

[1] https://en.wikipedia.org/wiki/Charles_Sanders_Peirce

So I'm open to changing my mind on this, but — having already been familiar with the evidence you posted below and having been adjacent to these circles for a long time — I'm very skeptical of the general claim that cults are endemic to the Rationalist community, and even more so of the specific claim that it has anything to do with Rationalists holding beliefs loosely.

The Zizians are absolutely a cult. But did they get there by changing their beliefs too easily?

I think that's a really tough case to make: one of their chief characteristics is their extreme slavishness to certain radical views. These weren't people who jumped around ideologically. Several of the Zizians (of whom there were never many) also weren't rationalists first. Where's the case that this is a result of Rationalist influence, or in particular that holding beliefs loosely was the problem? A handful of (the many) ex-rationalists forming a cult doesn't seem like strong evidence.

Leverage was certainly a high-demand social circle, and some people came out with some damage. I know others who were involved briefly, got no cult vibes, had no issues, and had a good experience with Leverage programs. Note also that a number of the "cult" claims came from Ziz and Ziz's friends, who even separately from Ziz's influence have not tended to be particularly stable people — this doesn't mean they're wrong, but I do update a bit based on that. And Vassar definitely had a penchant for spotting vulnerable people near crisis and suggesting that they take drugs, which is generally stupid and harmful.

I don't think it's particularly useful to call Leverage a "cult" even if there's some overlap, but if it is one, is that because of Rationalists' willingness to change their minds? Again, I'm very skeptical. Vassar looked for people who were a little bit crazy/unstable, and did influence them to change their minds. But he didn't do this because he was looking to prey on them; he did it because those were the people who understood him, because he was also a bit crazy/unstable! And he often engaged in ways that don't seem cultish at all.

Alternatively, what other explanatory factors are there for two cults closely adjacent to Rationalism?

1. Base rates. Have you been to the Bay Area? Cults are everywhere. Seriously, I suspect Rationalists are well below the base rate here.

2. Very smart people who are also atypical thinkers seem to be more susceptible to mental health issues, and in many cases these people come from otherwise-vulnerable groups (e.g. almost all of the Zizians, many of the Leverage people). You definitely get some high-octane crazy, and groups of people who can follow certain types of reasoning can insulate themselves in a mental cul-de-sac, then get stuck there because their blind spots block the exit and few others can follow the reasoning well enough to come in and get them.

3. Young people are easily influenced. As one Lesswrong commenter put it, "the rationalist community is acting as a de facto school and system of interconnected mentorship opportunities."

There's a lot of related discussion on these topics catalogued here, with Rationalists carefully dissecting these issues from various angles to see what the risks are and how they can make the community more resilient to them: https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experie...