Why is /r/photoplunder, a subreddit with 44,000 subscribers dedicated to exposing and sharing nude photos of women without their consent, still up and running in spite of Reddit’s rules regarding non-consensual pornography?
According to Reddit’s new CEO Steve Huffman, the controversial forum platform is turning over a new leaf. Last week, Huffman took to the site to announce a new set of rules and standards for all subreddits. The new rules, Huffman stated, were the first of many steps Reddit would take to make its community a more open, welcoming, and respectful place.
“As Reddit has grown, we’ve seen additional examples of how unfettered free speech can make Reddit a less enjoyable place to visit, and can even cause people harm outside of Reddit,” he explained. “Earlier this year, Reddit took a stand and banned non-consensual pornography. This was largely accepted by the community, and the world is a better place as a result.”
Indeed, back in March, Reddit updated its privacy rules to prohibit the sharing of any “photograph, video, or digital image of [a person] in a state of nudity, sexual excitement, or engaged in any act of sexual conduct without that person’s explicit consent.”
But there’s at least one place on Reddit where non-consensual pornography is still welcomed.
The /r/photoplunder subreddit (which is two years old and used to be called /r/photobucketplunder) purports to be a forum for sourcing “interesting” photos of women (and men) from photo-hosting services like Photobucket. In practice, it’s a dumping ground for found nudes. Typically, users of the subreddit lurk on Photobucket’s “Recent Uploads” tab, looking for people who have uploaded nude photos and, perhaps unwittingly, set them to be publicly viewable in the feed. They then save those photos, re-upload them to another site like Imgur to avoid drawing Photobucket’s attention, and post them to the subreddit, where they routinely receive thousands of views.
In a message pinned to the top of /r/photoplunder, moderator thegreenmeanie writes: “There are no stolen pictures tolerated here. Pics here were uploaded to a public forum and then saved. There is no hacking, or anything like that involved.” This is probably technically true, and perhaps the reason the subreddit has been allowed to stay up.
But just because nude photos weren’t obtained through a hack doesn’t mean they were shared consensually. It’s likely that most of the Photobucket users whose nudes appear on /r/photoplunder set the wrong privacy settings on those photos by mistake, and have no idea they’re being shared among thousands of Redditors. Many posts on the subreddit contain phrases like “recently went private” or “recently deleted,” indicating that the photos’ owners took them down once they realized they’d been exposed.
In a 2014 thread, one /r/photoplunder user explained how the subreddit’s system worked:
Sometimes you have girls out there freeing up the memory on their phones or cameras, and forgetting the settings they have set up on PB.
In other words, the subreddit is in direct violation of Reddit’s rules against non-consensual pornography. So how is /r/photoplunder skirting a ban?
Following the rollout of Reddit’s new rules, a number of the more problematic subreddits were immediately shut down to set an example. /r/FatPeopleHate, a subreddit devoted to fat-shaming, was one of the largest to be banned under the new regime. At its height, /r/FatPeopleHate was host to more than 150,000 members, and many of them lashed out in anger at their forum being shuttered. (Thousands of its users immediately flocked to Voat, a Reddit clone that allows for their particular flavor of hate.)
YouTube user wagsmytag uploaded a seven-minute video to his channel arguing that Reddit was in the wrong for closing /r/FatPeopleHate, claiming the subreddit was actually a forum for promoting healthy living.
And in the past, Reddit has made a point of shutting down particularly odious subreddits like /r/jailbait (sexualized photos of minors), while others, like /r/creepshots (surreptitiously captured NSFW photos of women), were shut down only under mounting pressure from online hacktivists.
But other controversial subreddits, like /r/Coontown (hatred for black people), have been allowed to stay up because of a tenuous technicality: they’re playing within the bounds of Reddit’s rules. While /r/Coontown is filled with racist sentiment, its moderators are active in shutting down any calls for violence against black people. From Reddit’s perspective, a community of racists is welcome on its site — just as long as members of the community aren’t organizing to actively harass people.
So what made /r/FatPeopleHate worthy of a Reddit ban and not /r/photoplunder?
Like /r/FatPeopleHate, /r/photoplunder goes out of its way to target people its users find interesting (albeit for different reasons) and shares their photos without consent, potentially hurting its targets in the process. Where /r/FatPeopleHate’s most prominent problem was vicious harassment, /r/photoplunder’s is the sexual exploitation of women, something Reddit has dragged its heels on addressing in the past.
Reddit did not respond to multiple requests for comment, and none of /r/photoplunder’s moderators responded to Fusion’s request for an interview. It seems as if most of /r/photoplunder’s mods haven’t been active on Reddit at all in months; the forum has simply been allowed to run unsupervised as a trading ground for involuntary porn of unsuspecting people.
Reddit is notoriously tight-lipped about the process it uses to determine which subreddits get shut down, but decisions like these appear to be made on a case-by-case basis. If a particularly noxious subreddit grows large enough and is brought to the attention of Reddit’s higher-ups, there’s a chance the site’s administrators might do something about it.
What good are Reddit’s rules if the users posting on subreddits become smart enough to maintain low profiles while still making the network a toxic place?