I know I'm super late to the party here, but I only just discovered this sub. I'd like to share a few thoughts, as this topic means a lot to me.
> We even met face to face with Erik Martin to express our concerns. I was in talks with Yishan Wong for a while. The only thing to come out of these talks was fuckall. From what I've read here, I see this going in a similar direction of half assed promises and zero change.
I've been right there. I've also met with Erik and other staff, and talked a fair bit with some others. Even three years ago, when I was still new to this site, I saw clearly the shit it was becoming home to. I thought about it, wrote about it (annoyingly, the link is currently dead and I'm waiting to hear back from the maintainer), and talked about it with other users, other mods, and ultimately, I thought, with the staff.
Some small, superficial things I noted all that time ago have finally, just in the last six months, been addressed. Systemically, nothing.
From the beginning, reddit has thrived by taking in the scum that other forums didn't want. They actively encourage that horrible behavior, giving abusers a shieldwall built from a misinterpretation of constitutional rights, or blaming the people they abuse. For reddit to start curbing this behavior, they would need to do a complete 180 on how they brand themselves. I sincerely believe they do not want to make changes that would interfere with this vision of themselves.
This is a problem because, despite a lot of admin rhetoric about subreddit individuality, this is a single site with a single experience. It's still about reddit, and reddit still functions as a single entity. But even if that weren't the case, it would still be a problem. It's a problem (and I wish I had the link up, as I don't even have an archive of what I wrote) because of how communities like this work, fundamentally.
When abusive users enter a community, they turn other users away. Users who are more sensitive, more cynical, or less tolerant of poor behavior leave. These are exactly the users most concerned with preventing abuse, so when they leave, the community as a whole becomes less focused on stopping abusive behavior.
In a vote-based community like any subreddit, community voting patterns control comment and post visibility. When there are fewer members who strongly disapprove of abusive behavior, that behavior is less likely to be strongly downvoted. That means it is more visible.
More visible abusive behavior gradually turns off more current community members, and it turns away new or potential members. Those who stay or join have to be more tolerant or accepting of abusive behavior. This gradually shifts the tone of the community towards greater acceptance of abuse, as the people who won't accept it leave because they find the community increasingly intolerable. (Intolerable to their own tastes, that is, not necessarily more intolerant; this effect works with any kind of influence, not just this one.)
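Just to make that spiral concrete, here's a toy sketch. Every number and decay rate in it is invented purely for illustration, nothing is measured from reddit; the point is only the direction of the trend: losing the users who downvote abuse makes abuse more visible, which drives more of those users away.

    # Toy model of the feedback loop described above. All numbers are made up
    # for illustration; nothing here is measured from reddit.

    def simulate(rounds=10, members=1000, strict_fraction=0.30):
        """Each round, abuse gets downvoted in proportion to the share of members
        who object to it. The more visible abuse remains, the more of those
        members leave, which makes abuse even more visible the next round."""
        strict = members * strict_fraction    # members who downvote abuse on sight
        tolerant = members - strict           # members who shrug it off
        for r in range(1, rounds + 1):
            visible = tolerant / (strict + tolerant)  # fraction of abuse left visible
            strict *= 1.0 - 0.5 * visible     # visible abuse drives strict members away
            tolerant *= 0.98                  # tolerant members churn slowly regardless
            print(f"round {r:2d}: visible abuse ~ {visible:.0%}, "
                  f"abuse-averse members ~ {strict:.0f}")

    simulate()

Run it and the abuse-averse population collapses within a handful of rounds while visible abuse climbs towards 100%. That's the whole point: the loop feeds itself.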
Reddit is connected. The whole site shares the same user namespace, the same authentication, the same messaging. Communities are designed to mix and mingle in users' perceptions: content from multiple communities is deliberately blended on many listings, including the most common way people view content, the frontpage.
Because the site is connected, the effect described above works on the whole web of subreddits, on the reddit community itself. If one subreddit cracks down on abusive behavior, it means nothing, because across the rest of the site abusive users are still tolerated. Even a user who confines themselves to a single, well-controlled subreddit knows they are surrounded by danger. That is taxing and unpleasant, and it pushes them away. In every other subreddit, abusive users are accepted as an inevitable part of the site experience. Even when they're downvoted, they're never truly invisible. They remain, and some users are turned away.
This is especially bad when large, popular, high-visibility subreddits harbor abusive users. I'm looking at you, virtually un-moderated defaults like /r/videos and /r/wtf. There, the influence of abusive users is broader because their visibility is higher. Instead of merely cowering in their corner, abusive users make visible excursions, and the tone of the site shifts, gradually.
This is all compounded hugely by the site's massive overall growth rate. The problem, in many communities, is that new members come in faster than the community can assimilate them. That is, faster than newcomers can adjust to the norms of behavior, and faster than the community can weed out problem members. This is what destroys a community. New users arrive who are more accepting of abusive behavior; they see it, they see it tolerated, and they come to tolerate it themselves. This drives a vicious cycle.
I don't even like the whole social justice movement. I dislike SRS and the fempire. But this is a problem that transcends my petty, personal concerns. It's a problem that impacts not only my experience of the site as a user, but also my experience as a moderator of multiple large subreddits. It's something I'm getting really fucking tired of dealing with.
> The admins are going to add a few mod tools that you can only get right now through third party browser addons. It won't be enough to do anything useful and the addons will still be better.
As one of the primary developers of those tools, I assure you that we will not cease development so long as there is any way to improve upon what the site natively provides. Unlike the admins, we are in a unique position as the tools' primary users to create what is most needed, and to work with the moderation community to determine where our own blind spots are. We do this because we think it's worth doing, not because it is our job (not that the two are mutually exclusive).
> They are not going to ban hate groups. They aren't going to ban anyone.
And nothing will change, because the real solution is to stay on top of abusive behavior and deal with it swiftly and decisively. That applies within all communities, whether individual subreddits or the site as a whole.
To prevent users from coming to accept abusive behavior, moderation is required to fill the natural gaps in the community's self-policing, to provide direction and set an example. Removing abusive content and removing abusive users sets the right tone. When users see that abusive behavior is not tolerated, they themselves do not tolerate it. When they very rarely see abusive behavior, it becomes something rare and unusual, worthy of note. Take reporting as an example: if there is a lot of abusive behavior, there is little incentive and little motivation to report it. What's the point, when more will just crop up? But when abusive behavior is rare, its presence is unusual and merits action.
This is why action must be taken at more than the individual subreddit level: the naïve, self-aggrandizing approach of non-involvement is not only morally unacceptable, it is demonstrably ineffective in practice.
> I am thoroughly unconvinced that they have the backbone to do anything that will make reddit a pleasant place for women and minorities
As am I, and yet I try anyways.
> Changing reddit would be a huge commitment. It would involve taking on a whole new department just to police content and users.
To be clear, I'm not suggesting or supporting direct, paid-staff moderation. To me, policing content and users means working with the community, working with moderators, to improve the site (and I don't mean half-assed compromise, I mean functional work). It's not practicable for employees to do any meaningful sitewide moderation themselves, given the scale of the site now. It is practicable for them to put that responsibility on existing community moderators, and to act as necessary when an issue's scope exceeds that. Specifically, I'm talking about mandating that subreddits enforce rules against abusive behavior, starting with the biggest examples, the defaults. I'm talking about working with community moderators to crack down on abusive users and eliminate them from the site altogether. About removing moderators who do not react to abuse, and actively banning entire communities focused on perpetrating hate. About not giving them a home on reddit.
> Ban the Chimpire. All of those racist subs? Ban them and their users forever.
Please. Do this. Do this, and work with the moderation community to actually address abusive users. Do this, and work with us to track down and ban these users when they show up.