Pronouns: she/her or they/them.
I got interested in effective altruism back before it was called effective altruism, before Giving What We Can even had a website. Later on, I got involved in my university EA group and helped run it for a few years. Now I'm trying to figure out where effective altruism fits into my life and what it means to me.
"Perilously close" has no legal definition, so what you are asserting is a matter of opinion, not a matter of fact. My intention in using "perilously close" was to convey that such statements have a similar kind of danger to statements that would meet the legal definition of incitement to violence, even though they are perfectly legal.
You know that I did not say people who make such statements bear no moral responsibility for how their words are interpreted, so I'm not sure what your intention is in making that false statement.
Since you have not signaled good faith, I won't engage further.
I didn't say it was an incitement to violence; I said it was perilously close to one. What I mean is that the person making such statements can completely avoid legal liability for them, and can plausibly deny any moral responsibility if violence occurs. Yet the actual effect on a very small minority of listeners, those who aren't in a headspace where they can safely process this kind of inflammatory proclamation, might plausibly be to encourage violence.
The important question is not what kind of speech is illegal; it is what kind of speech might be taken as encouragement (or discouragement) of the kind of violence or threatened violence that just happened, whether or not that is the speaker's intention. I'm not making a claim about what's illegal; I'm making a claim about what kinds of public statements are responsible or irresponsible.
You have to look beyond what was literally, directly said to what the most extreme listeners might infer from those remarks, or might feel encouraged to do. Saying that people should burn down AI labs and that the employees should be jailed for attempted murder is not literally, directly calling for violence against the employees of AI companies, but it is easy for someone in an extremist mindset who is emotionally unwell to take what was said that extra step further.
And there is no point arguing about what is true in theory or in principle when this violence is already happening, or being threatened.
I regret that this has happened, but there has apparently just been a confirmation of what I said four days ago about how incendiary such comments could be from the point of view of people who are emotionally troubled, and how there is a real risk of physical harm to innocent people. These are not "social status games", and statements to that effect are irresponsible.
No, this wouldn’t be politically viable. Taxes are unpopular, the vast majority of people (including both voters and politicians) have no idea what EA is, and many people don’t like EA and would be vocally opposed to something like this.
However, there may be vaguely similar ideas that could work to a limited degree. I remember hearing about an idea where employers could nudge their employees to donate 1% of their paycheque to charity and give them a list of recommended options. Something like that could work: a nudge toward voluntary giving that reduces the friction of giving, the mental load of figuring it out, and the admin work of arranging it.
To the extent that governments fund EA priorities, it will be on a case-by-case basis (e.g. X amount of money to Y foreign aid initiative), rather than through a general "EA" bucket that money goes into.