evolution-is-just-a-theorem
argumate

This is a fun read, and the comments are also amusing.

My worst experience with rationalists (and possibly some of their worst experiences with me) were when romance/sex conflict came up. It turns out people are really bad at being rational when that happens.

What Duncan is hungry for is for the world to be better, and he thinks as a contingent fact that being the chief of this particular tribe is the best way for him to do that.

read the whole thing, as it were.

shedoesnotcomprehend

[vaguely hysterical laughter] nope nope nope nope nope

this entire post is making my things-to-run-away-from-very-fast alarms go off like a car alarm in a hailstorm

(seriously, though, if the ~rationalist community~ produces two or three things of this approximate flavor I am going to start being very very careful about engaging with it, because holy cult, batman (no pun intended))


evolution-is-just-a-theorem

I think there are more charitable interpretations of some of the things, but yes this is very dangerous and stupid.

Also, like, #notallrationalists.

(Also, Duncan is curriculum director at CFAR, not that that's really more of a qualification than being curriculum director of a sixth-grade class)

shedoesnotcomprehend

Thanks – I stand corrected on a point of fact, there.

And yes, absolutely #notallrationalists; I’m vaguely rationalist-adjacent or I wouldn’t feel half so strongly about this. (I’d be interested in joining Alicorn’s prospective (and much saner) group house, actually, if I weren’t getting ready to head to grad school.)

It is a fair point that I am definitely not being maximally charitable, and I agree that there could definitely be reasonable and innocent justifications for a number of the things in the post. When you take them all together, though…

So, yeah, dangerous and stupid and makes me nervous.

evolution-is-just-a-theorem

Most of my concern comes from the level of overconfidence/lack of self-awareness on display.

As you point out, his listed qualifications are, uh… well, they leave something to be desired.

He makes a bunch of noise about how he’s aware of the skulls and claims to be taking precautions, but there aren’t actually very many concrete precautions listed, and there are a *lot* of concrete scary things (‘A Dragon is responsible for being triggered’).

Plus there’s a bunch of stuff that just seems… entirely aesthetic. Like, I’m glad you liked Ender’s Game but I’m pretty sure Orson Scott Card didn’t actually have special insight into the best way to organize groups.

slatestarscratchpad

I would never participate in the linked concept and I think it will probably fail, maybe disastrously.

But I also have a (only partially endorsed) squick reaction to the comments against it. I guess I take it as more axiomatic than other people do that if someone wants to try something weird, and is only harming themselves, then making fun of them for it makes you a bully.

Making fun of the weird authoritarian group house idea just has too many echoes of making fun of people in poly relationships, or people who home school their children, or in age gap relationships, or who have weird sexual fetishes. There’s a veneer of “I worry for the sake of these people; perhaps they are harming themselves”, but I can’t shake the feeling that underneath it there’s this “I cringe at these weird losers who don’t even understand how low status they should feel.”

As I see it, everyone involved in this is doing a public service in sacrificing some of their time and comfort to test a far-out idea that probably won’t work, but might. I don’t want to promote a social norm of “nobody is allowed to do weird experiments around social norms where everyone involved consents”.

I can definitely think of ways it could harm the participants, in the same way I can think of ways that poly relationships, home schooling, age gap relationships, and sexual fetishes can harm the participants. I think it’s fair to point these out to potential participants (and the leader) so they’re forewarned, but I also feel like there’s a missing mood in the LW comments I’m actually reading.

Also, Duncan’s taking the wrong strategy by denying it’s a cult. His pitch should be “Hey, cults seem pretty good at controlling their members, let’s get together a bunch of people who are interested in using cult techniques to become the best people they can be by their own values, and see if we can make it work.” Not my cup of tea, but My Kink Is Not Your Kink, etc.

EDIT: A friend points out that it’s important this has a very clear door marked EXIT and really good mechanisms for making this as uncostly as possible, just in case. I agree with that, even if it makes the commitment mechanisms a little harder.

evolution-is-just-a-theorem

So I am making fun of Duncan, but this is because I already went over my thoughts on this several times IRL and didn’t have the energy to reformulate them fully.

However, I am not claiming “this is bad because it’s weird”. I’m claiming “this is bad because it has an unreasonably high chance* of turning into an abusive clusterfuck”. Vulnerable populations exist, even (or especially) within the rationalist community; the fact that everyone involved is participating voluntarily is *not enough*.

Like, I have concrete criticisms that are unrelated to how weird it is. Here’s a sample:

  • Duncan seems extremely overconfident. The fact that his qualification list (which is supposed to distinguish him as unique and highly qualified) contains a number of items that are neither unique nor particularly relevant to this project is scary, because it indicates that he isn’t even aware of what’s required.
  • The overall response to criticism (both preemptively in the post and in the comments) has been mostly of the form “Yes we’re aware of this and taking every precaution” with very little mention of the actual precautions.
  • A number of rules could easily be used to prevent dissent (e.g. “A Dragon will take responsibility for its actions, emotional responses, and the consequences thereof… if angry or triggered will not blame the other party.”). This is dumb and wrong: being triggered does not necessarily imply the other party is to blame, but it is certainly possible for the other party to be at fault.
  • Really high exit costs (must find a replacement roommate first; also this is the Bay, so housing is extremely non-trivial). These are apparently standard for the Bay; I think they should be lower than average Bay Area exit costs. (I have no idea what you think the “really good mechanisms” are for making exit as uncostly as possible. Where is that coming from?)
  • Point 5 of Section 2 just completely misunderstands the threat model. Duncan seems to think that abuse comes from maintaining an abusive set of norms, and misses the part where sometimes abuse comes from tyrants trying to maintain their power. He also makes the claim that people will naturally leave on their own if things aren’t working out: anyone who knows anything about abusive relationships knows that leaving is not a free action.
  • Point 6 of Section 2 makes a bunch of claims about protections without actually going into detail. Saying “transparency” is well and good, but literally anyone can do that.
  • Speaking highly of Leverage. Leverage is not what I would call a successful experiment. If someone wants to emulate them I question their ability to even recognize success. (This point is last because it’s the weakest, I really don’t want to get into an argument about Leverage).

Also, counter to one of your points: there is totally a community trying experiments like this. They’re called the intentional-community community, and they’ve banned communities of this sort (i.e. one dictator in charge of everything) because that type of community tends to fail hard.

I will say in Duncan’s favor that in the comments he (eventually) goes into some detail about precautions, and mentions having check-ins with outsiders. This was one of the main things suggested by people I know who are more familiar with cults than I am, and I did update in his favor after seeing that.

To reiterate: I’m not against this because it’s weird. I am against this for specific reasons, mostly related to implementation details. I can even give criteria for what would make me support it:

  • Run by someone with extensive therapeutic / social moderation experience.
  • No legal or financial barriers to exit.
  • Regular (>monthly) check-ins with outsiders chosen by the participants.
  • Not modeled after the army, which is not well known for preserving the psychological well-being of its members.
  • If modeled on a sci-fi novel, a reasonable justification for why this sci-fi novel serves as a good model is given.
  • Run by someone who responds reasonably well to criticism. (This is a washy criterion and I could easily move the goal posts on it. All I can do is say that I have a consistent idea in my head of what this looks like and that I wouldn’t just make it stricter if someone met it).

I have additional criticisms related to the odds of success, but these are much less relevant. If people want to throw away their time, money and energy on an experiment that won’t succeed**, that’s their business.

* 15%. Will consider actual bets if we can come up with a decent criterion for determining the outcome.

** Or rather one that makes fixable mistakes that decrease its odds of success.

slatestarscratchpad

I wasn’t criticizing you so much as the comments on LW. To give an example of them:

> Before anything else, the original post is disgusting. I suggest that Duncan should kill himself, not because I believe that telling people to kill themselves is an “instrumentally rational” argumentative position, but rather because I’m disgusted by his continued existence. I’m asserting that if I could reshape the world at will, he would not be part of my world. The fact that he persists in existing is an affront to my sense of what’s right. Some people do believe that Duncan is fucked up in the head and is externalizing his personal issues, which involves some (sub)conscious drive toward power and the formation of a cult-like organization most attractive to people suffering from the same mental health problems. Some people do believe that “Dragon Army” will be deeply harmful to the participants on a deep-seated, instinctual level. Duncan’s attempts to persuade these people that “Dragon Army” is fine by writing even more bullshit in the style of his original post belies a deep misunderstanding of what exactly is wrong here.

After looking through the thread further, there’s less of it than I remember (and most of it is one person), so apologies if I tarred all critics with that brush.

But I’m still not sure I agree with you. Your criteria seem to be a combination of insurmountable barriers (Duncan does a bit of social moderation stuff already, so it sounds like you want it led by an actual therapist, in which case, good luck) and diluting it into something more like a normal group home (yes, it would be safer if it weren’t like the army, but the whole point is that the army is a unique kind of thing with powerful social bonding effects, and they want to try that model).

I guess in the end, when a bunch of people want to do this, it feels like us not-interested people imposing so many restrictions that the whole thing falls apart and they can’t do it at all is a little bit BETA-MEALR. And I just have this really strong social intuition that third parties telling people what social arrangements they are or aren’t allowed to have, for their own good, while making fun of them, has the potential to be really bad.

I do understand your concerns and I’ll think about them more. And to clarify, I don’t disagree with your odds, and would strongly recommend against anyone actually joining this group house.

Source: argumate