Pronouns: she/her or they/them.
I got interested in effective altruism back before it was called effective altruism, back before Giving What We Can had a website. Later on, I got involved in my university EA group and helped run it for a few years. Now I’m trying to figure out where effective altruism can fit into my life these days and what it means to me.
I don't think I agree with either the idea of recruiting people from elite colleges or the idea of recruiting "Internet weirdos". I'm not against inviting in either of those kinds of people, but why target them specifically? I prefer a version of the EA movement that is more wholesome, populist, inclusive, and egalitarian.
I don't mean populist in the typical political sense used these days of being against institutions, against experts, highly distrustful, framing things as good people vs. bad people, or adopting the "paranoid style". I mean populist in the sense of believing average, everyday, ordinary people are good, have a lot to contribute, are diverse and heterogeneous, are often talented, wise, intelligent, and moral, and are often full of surprises. A belief in people, in the average person, in the median person, in the diversity of people who are never quite captured by an average or a median.
I don't like the somewhat more traditional, more institutionalist elitism you sometimes see in EA, and I don't like the idiosyncratic, anti-institutionalist nerd elitism of the rationalist community, where people seem to think the best people by far, and maybe the only people really worth a damn, are themselves, or people just like them. I'm a weird person, and I've often had to fight to find a place in the world, but I think it's the wrong lesson to learn to say, "People treated me badly because I was different and acted like I was inferior just because I wasn't like them... now I finally see the truth... it's normal people who are inferior and it's people like me who are better than everyone else!" Good job: God or karma or whatever sent you a trial so you'd have a chance to become more enlightened and learn compassion, and instead you're repeating the cycle of samsara. Better luck next life.
It's possible there are all kinds of ways to reach people from different walks of life that would be a good idea. I'm just highly suspicious of any claim that there's a superior kind of person (one suspiciously similar to whoever is making the claim) and that outreach should be focused specifically on that kind of person.
Rate limiting on the EA Forum is too strict. Given that people karma downvote because of disagreement, rather than because of quality or civility — or they judge quality and/or civility largely on the basis of what they agree or disagree with — there is a huge disincentive against expressing unpopular or controversial opinions (relative to the views of active EA Forum users, not necessarily relative to the general public or relevant expert communities) on certain topics.
This is a message I saw recently:
You aren't just rate limited for 24 hours once you fall below the recent karma threshold (which can be triggered by one comment that is unpopular with a handful of people); you're rate limited for as many days as it takes you to gain 25 net karma on new comments — which might take a while, since you can only leave one comment per day and people might keep downvoting your unpopular comment. (Unless you delete it, which I think I've seen people do, but which I won't do myself, because I'd rather be rate limited than self-censor.)
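As best I can tell, the logic works something like the rough Python sketch below. To be clear: the names, the structure, and the value of the karma floor are my reconstruction from observed behavior, not the forum's actual code; only the one-comment-per-day limit and the 25-net-karma recovery threshold come from what I've experienced.

```python
# Rough sketch of the rate-limiting behavior described above. The names and
# the value of RECENT_KARMA_FLOOR are guesses, not the forum's real code;
# the 1-comment-per-day limit and the 25-karma recovery threshold are from
# observed behavior.

RECENT_KARMA_FLOOR = 0  # assumed: recent-karma level below which the limit triggers
RECOVERY_KARMA = 25     # net karma on new comments needed to lift the limit
MAX_DAILY_COMMENTS_WHILE_LIMITED = 1

class RateLimiter:
    def __init__(self) -> None:
        self.limited = False
        self.karma_since_limited = 0

    def on_recent_karma_change(self, recent_karma: int) -> None:
        # One comment that's unpopular with a handful of people can drag
        # recent karma below the floor, regardless of account age or total karma.
        if not self.limited and recent_karma < RECENT_KARMA_FLOOR:
            self.limited = True
            self.karma_since_limited = 0

    def on_vote_on_new_comment(self, delta: int) -> None:
        # Only karma on comments posted after the limit began counts toward
        # recovery, so fresh downvotes push the exit further away.
        if self.limited:
            self.karma_since_limited += delta
            if self.karma_since_limited >= RECOVERY_KARMA:
                self.limited = False

    def can_comment(self, comments_posted_today: int) -> bool:
        # While limited, you get one comment per day, no matter how old the
        # account is or how much total karma it has.
        if not self.limited:
            return True
        return comments_posted_today < MAX_DAILY_COMMENTS_WHILE_LIMITED
```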
The rate limiting system is a brilliant idea for new users or users who have less than 50 total karma — the ones who have little plant icons next to their names. It's an elegant, automatic way to stop spam, trolling, and other abuses. But my forum account is 2.5 years old and I have over 1,000 karma. I have 24 posts published over 2 years, all with positive karma. My average karma per post/comment is +2.3 (not counting the default karma that all posts/comments start with; this is just counting karma from people's votes).
Examples of comments of mine that have been downvoted to net -1 karma or lower include a methodological critique of a survey that was later accepted as correct and led to the research report of an EA-adjacent organization being revised. In another case, a comment was downvoted to negative karma when it was only an attempt to correct the misuse of a technical term in machine learning — a correction anyone can verify with a few fairly quick Google searches. People are absolutely not just downvoting comments that are poor quality or rude by any reasonable standard. They are downvoting things they disagree with or dislike for some other reason. (There are many other examples like the ones I just gave, including everything from directly answering a question to clarifying a point of disagreement to expressing a fairly anodyne and mainstream opinion that at least some prominent experts in the relevant field agree with.) Given this, karma downvoting as an automatic moderation tool with thresholds this sensitive just discourages disagreement.
One of the most important cognitive biases to look out for in a context like EA is group polarization, which is the tendency of individuals' views to become more extreme once they join a group, even if each of the individuals had less extreme views before joining (i.e., they aren't necessarily being converted by a few zealots who already had extreme views). One way to mitigate group polarization is to have a high tolerance for internal disagreement and debate. I think the EA Forum does have that tolerance on certain topics, and within a certain window of accepted opinion on most topics, but on other topics the window is quite narrow if you compare it to, say, the general population or expert opinion.
For example, according to one survey, 76% of AI experts believe it's unlikely or very unlikely that LLMs will scale to AGI, yet the prevailing opinion among EA Forum users seems to be the opposite. Some EA Forum users don't seem to consider the majority expert view worth taking seriously. To me, that looks like group polarization in action. It's one thing to disagree with expert opinion with some degree of uncertainty and epistemic humility; it's another to treat expert opinion as beneath serious discussion.
I don't know what specific tweaks to the rate limiting system would be best. Maybe just turn it off altogether for users with over 500 karma (and rely on reporting posts/comments and moderator intervention to handle real problems), or as Jason suggested here, have the karma threshold trigger manual review by a moderator rather than automatic rate limiting. Jason also made some other interesting suggestions for tweaks in that comment and noted, correctly:
Strong downvoting by a committed group is the most obvious way to manipulate the system into silencing those with whom you disagree.
This actually works. I am reluctant to criticize or express disagreement with the ideas of certain organizations and books because of rate limiting, and rate limiting is the #1 thing that makes me feel like giving up on trying to engage in intellectual debate and discussion and just quitting the EA Forum.
I may be slow to reply to any comments on this quick take due to the forum's rate limiting.
This isn't directly related to the points raised in the post or in any of the top-level comments so far, but I can't help but wonder: would efforts to quash tobacco go any better than efforts to quash illegal drugs? Is the crux of the matter really whether a drug is net harmful — if it is, try to abolish it; if it isn't, leave it alone? What about considering the most effective forms of harm reduction?
Whether this is true or not depends on what specifically you mean by it. If by “attention-grabbing content” you mean:
Then I think it’s not true. I think investing in those mediums would just be throwing good money after bad.
On the other hand, if you have in mind content like the highest-quality video essays on YouTube, such as:
The bad news is that saying “just go and make high-quality video essays” is like saying “just go and make high-quality movies”. Literally, some of these videos require a similar amount of work to making a microbudget indie movie. The main creator might work on one full-time for over a year, and production often involves an editor and, in some cases, people to help with other aspects like music, visual effects, or voiceover. (ContraPoints lamented the fact that she spent longer writing the script for her video about Twilight than Stephenie Meyer spent writing Twilight.) The people who make these videos have honed their craft over years of experience.
Jenny Nicholson’s older videos, which are shorter and have less razzle-dazzle, are much lighter on production, but they still take a lot of time and effort. Hank Green does 2-3 vlogs per week on his own channel and the vlogbrothers channel, which I like a lot, and they’re relatively quick to make (they have to be, to come out so frequently). But Hank Green has this amazing radio host, podcaster, YouTube vlogger, TV personality-type charisma that is rare, and it’s not the kind of thing where you can just say go do that. If you think you can go do that, then definitely do it! But it’s not something just anyone can do, and no one can do it easily.
Another positive example from YouTube: Kirk Honda, a clinical psychology professor whose channel is called Psychology in Seattle. He does unbelievably good educational videos on mental health, psychology, personality disorders, therapy, abusive relationships, and so on, and the spoonful of sugar that makes the medicine go down is that he’s often reacting to reality TV. But this is not a gimmick. He says that for ethical reasons, clinical psychology professors don’t have recordings of real people interacting that they can show to their students, and without real (or realistic) examples, it’s hard to convey what you’re talking about. Reality TV is a great teaching tool because the Hollywood studios have already decided to do the arguably unethical thing and exploit people’s anguish for money, so he can just comment on it and use it to explain psychology concepts. These videos seem relatively quick to produce, but again you have the Hank Green problem: Kirk Honda has charisma, wisdom, eloquence, and charm, and not just anyone can do that, and no one can do it easily.
Podcasts are another medium that is incredibly popular, but also hard to do well. Effective altruism already has the 80,000 Hours Podcast, which is by and large a great success. The winning formula for shows like that is booking fascinating people with fascinating things to say and letting them talk for a long time, plus good audio and video production and an interviewer who can lightly steer the fascinating person into saying more fascinating things. This is the easiest formula to replicate, but 80,000 Hours is already doing it. Not that I discourage people from starting new podcasts, or that I think the 80,000 Hours Podcast can’t be improved — my suggestion to them was to shake things up with guests who say things other than what people in EA are used to hearing, e.g., Richard Sutton, Yann LeCun, Jeff Hawkins, Edan Meyer. Toby Ord’s episode of the 80,000 Hours Podcast was an amazing example of this. Toby Ord is as central to EA as it gets, but the cold water he’s dumped on AI scaling very much goes against what most people in EA are saying about AI.
I’ve worked on 3 podcasts as hobby projects, and let me tell you, unless you’re already an expert, audio production is surprisingly hard to figure out. If you’ve got a budget of ~$200 per episode (take this with a grain of salt and double-check my recollection/estimate), you can take the vast majority of this complexity away by renting a podcast recording studio for 1-2 hours. I think there are probably multiple spaces you can rent in any large city in a developed country. (This requires that everyone on the podcast is physically in the studio. I don’t have a solution for making it easier to record podcasts remotely.) Editing the audio isn’t nearly as complicated and just takes time and work, but if you’ve got more money to spend, you can also hire an editor to take that complexity off your hands as well.
The right way to think about this is that you’re trying to make good art. Easier said than done. However, I think it’s worth thinking about, and it’s worth trying for people who are willing to put in what it takes to give it a good shot. I just want to discourage people from thinking about making “content” or “social media content” as opposed to art.
Update: the Forecasting Research Institute has changed the language in the report in response to this critique! (It seems like titotal played an important role in this. Thank you, titotal.)
On page 32, the report now gives the survey results in the same intersubjective resolution/metaprediction wording the question was asked in, rather than as an unqualified probability:
By 2030, the average expert thinks that 23% of LEAP panelists will say the state of AI most closely mirrors an (“rapid”) AI progress scenario that matches some of these claims.
This is awesome! I’m very happy to see this. Thanks to the Forecasting Research Institute for making this change.
I see that this EA Forum post has also been updated in the same way, so thanks for that as well. Great to see.