slatestarscratchpad
argumate

This is a fun read, and the comments are also amusing.

My worst experiences with rationalists (and possibly some of their worst experiences with me) were when romance/sex conflict came up. It turns out people are really bad at being rational when that happens.

What Duncan is hungry for is for the world to be better, and he thinks as a contingent fact that being the chief of this particular tribe is the best way for him to do that.

read the whole thing, as it were.

shedoesnotcomprehend

[vaguely hysterical laughter] nope nope nope nope nope

this entire post is making my things-to-run-away-from-very-fast alarms go off like a car alarm in a hailstorm

(seriously, though, if the ~rationalist community~ produces two or three things of this approximate flavor I am going to start being very very careful with engaging with it, because holy cult, batman (no pun intended))


evolution-is-just-a-theorem

I think there are more charitable interpretations of some of the things, but yes this is very dangerous and stupid.

Also, like, #notallrationalists.

(Also Duncan is curriculum director at CFAR, not that this is really more of a qualification than curriculum director of a sixth grade class)

shedoesnotcomprehend

Thanks – I stand corrected on a point of fact, there.

And yes, absolutely #notallrationalists; I’m vaguely rationalist-adjacent or I wouldn’t feel half so strongly about this. (I’d be interested in joining Alicorn’s prospective (and much saner) group house, actually, if I weren’t getting ready to head to grad school.)

It’s a fair point that I’m not being maximally charitable, and I agree that there could be reasonable and innocent justifications for a number of the things in the post. When you take them all together, though…

So, yeah, dangerous and stupid and makes me nervous.

evolution-is-just-a-theorem

Most of my concern comes from the level of overconfidence/lack of self awareness on display.

As you point out, his listed qualifications are, uh… well they leave something to be desired.

He makes a bunch of noise about how he’s aware of the skulls and claims to be taking precautions, but there aren’t actually very many concrete precautions listed, and there are a *lot* of concrete scary things (‘A Dragon is responsible for being triggered’).

Plus there’s a bunch of stuff that just seems… entirely aesthetic. Like, I’m glad you liked Ender’s Game but I’m pretty sure Orson Scott Card didn’t actually have special insight into the best way to organize groups.

slatestarscratchpad

I would never participate in the linked concept and I think it will probably fail, maybe disastrously.

But I also have a (only partially endorsed) squick reaction to the comments against it. I guess I take it as more axiomatic than other people that if people want to try something weird, and are only harming themselves, that if you make fun of them for it, you’re a bully.

Making fun of the weird authoritarian group house idea just has too many echoes of making fun of people in poly relationships, or people who home school their children, or in age gap relationships, or who have weird sexual fetishes. There’s a veneer of “I worry for the sake of these people; perhaps they are harming themselves”, but I can’t shake the feeling that underneath it there’s this “I cringe at these weird losers who don’t even understand how low status they should feel.”

As far as I can see, everyone involved in this is doing a public service in sacrificing some of their time and comfort to test a far-out idea that probably won’t work, but might. I don’t want to promote a social norm of “nobody is allowed to do weird experiments around social norms, even where everyone involved consents”.

I can definitely think of ways it could harm the participants, in the same way I can think of ways that poly relationships, home schooling, age gap relationships, and sexual fetishes can harm the participants. I think it’s fair to point these out to potential participants (and the leader) so they’re forewarned, but I also feel like there’s a missing mood in the LW comments I’m actually reading.

Also, Duncan’s taking the wrong strategy by denying it’s a cult. His pitch should be “Hey, cults seem pretty good at controlling their members, let’s get together a bunch of people who are interested in using cult techniques to become the best people they can be by their own values, and see if we can make it work.” Not my cup of tea, but My Kink Is Not Your Kink, etc.

slatestarscratchpad

Also, my least favorite among the LW comments are the ones that are like “You guys don’t realize that you’re just autistic and everyone else solves these problems easily” or “you’re never going to get anywhere until you find non-losers to live with”.

I’m sure there are very successful people who are already totally happy with their lives and social skills. Good for these people. It seems like part of the goal of a community, and definitely part of the goal of a weird social engineering cult experiment, is to figure out ways to help people who aren’t already naturally great.

I don’t think this is going to be the best way to do this, but I feel like if you don’t even realize this is a desirable goal then we’ve parted company way earlier than that assessment.

Source: argumate