nostalgebraist

At various points Bostrom (like Yudkowsky) implicitly or explicitly uses a definition of intelligence that is something like “ability to achieve one’s goals.”  This is nicely clean, but problematic, because it doesn’t take into account the fact that some goals may have hard upper limits where others don’t.

In particular, this seems to apply to things having to do with social behavior.  I can imagine beings that are qualitatively better than humans at math, information recall, etc., since there are already orders of magnitude of variation in these abilities among humans.  (John von Neumann is a good example of a person who seems to have really been “superhuman” in these kinds of areas.)  However, social abilities like “ability to manipulate others” do not seem unbounded in the same way.  There are some people who are good at manipulation, and many of us have developed kinds of wariness to protect ourselves from them, but manipulation doesn’t seem to be a “skill” like math ability that spans orders of magnitude.  Roughly speaking, there are no “super-manipulators” out there who can manipulate ordinarily wary people (though perhaps not “super-wary” people).

For instance, one of the most effective ways to get people to do your bidding is to start a cult: there are plenty of chilling stories about the level of devotion that cultists have had to their various leaders.  However, it’s not at all clear that it is possible to be any better at cult-creation than the best historical cult leaders — to create, for instance, a sort of “super-cult” that would be attractive even to people who are normally very disinclined to join cults.  (Insert your preferred Less Wrong joke here.)  I could imagine an AI becoming L. Ron Hubbard, but I’m skeptical that an AI could become a super-Hubbard who would convince us all to become its devotees, even if it wanted to.  If social abilities like this are subject to hard upper bounds that have already been nearly achieved, then there’s no potential for AIs to achieve their goals better by becoming superhuman at these abilities, which makes it problematic to just postulate an AI that’s “superhuman at achieving its goals.”

slatestarscratchpad

A couple of disagreements. First of all, I feel like the burden of proof should be heavily upon somebody who thinks that something stops at the most extreme level observed. Socrates might have theorized that it’s impossible for it to get colder than about 40°F, since that’s probably as low as it ever gets outside in Athens. But when we found the real absolute zero, it was through careful experimentation and theoretical grounding, which gave us a good reason to place it at that point. While I agree it’s possible that the best manipulator we know of is also the hard upper limit of manipulation ability, I haven’t seen any evidence for that, so I default to thinking it’s false.

(Lots of fantasy and science fiction does a good job of intuition-pumping what a super-manipulator might look like; I especially recommend R. Scott Bakker’s Prince of Nothing.)

But more important, I disagree that L. Ron Hubbard is our upper limit for how successful a cult leader can get. L. Ron Hubbard might be the upper limit for how successful a cult leader can get before we stop calling them a cult leader.

The level above L. Ron Hubbard is Hitler. It’s difficult to overestimate how sudden and surprising Hitler’s rise was. Here was a working-class guy, not especially rich or smart or attractive, rejected from art school, and he went from nothing to dictator of one of the greatest countries in the world in about ten years. If you look into the stories, they’re really creepy. When Hitler joined, the party that would later become the Nazis had a grand total of fifty-five members and was taken about as seriously as modern Americans take Stormfront. There are records of conversations among Nazi leaders from when Hitler joined the party, saying things like “Oh my God, we need to promote this new guy, everybody he talks to starts agreeing with whatever he says, it’s the creepiest thing.” There are stories of people who hated Hitler going to a speech or two just to see what all the fuss was about and ending up pledging their lives to the Nazi cause.  Even while he was killing millions and trapping the country in a difficult two-front war, he had what historians estimate as a 90% approval rating among his own people, along with rampant speculation that he was the Messiah. Yeah, sure, there was lots of preexisting racism and discontent he took advantage of, but there’s been lots of racism and discontent everywhere forever, and there’s only been one Hitler. If he’d been a little bit smarter, or more willing to listen to the generals who were, he would have had a pretty good shot at conquering the world. 100% with social skills.

The level above Hitler is Mohammed. I’m not saying he was evil or manipulative, just that he was a genius’ genius at creating movements. Again, he wasn’t born rich or powerful, and he wasn’t particularly scholarly. He was a random merchant. He didn’t even get the luxury of joining a group of fifty-five people. He started by converting his own family to Islam, then his friends, got kicked out of his city, converted another city, and then came back at the head of an army. By the time of his death at age 62, he had conquered Arabia and was its unquestioned, God-chosen leader. By what would have been his eightieth birthday, his followers were in control of the entire Middle East and good chunks of Africa. Fifteen hundred years later, one-fifth of the world’s population still thinks of him as the most perfect human being ever to exist and makes a decent stab at trying to conform to his desires and opinions in all things.

The level above Mohammed is the one we should be worried about.