Each video might appear innocent on its own: a home movie of a kid in a two-piece swimsuit or a nightie. But each has three common traits:
• the girl is mostly unclothed or briefly nude
• she is no older than age 8
• her video is being heavily promoted by YouTube’s algorithm
Any user who watched one kiddie video would be directed by YouTube's algorithm to dozens more — each selected out of millions of otherwise-obscure home movies by an incredibly sophisticated piece of software that YouTube calls an artificial intelligence. The families had no idea.
We talked to one mother, in Brazil, whose daughter had posted a video of herself and a friend playing in swimsuits. YouTube’s algorithm found the video and promoted it to users who watched other partly-clothed prepubescent children. Within a few days of posting, it had 400,000 views.
We talked to child psychologists, sexual trauma specialists, psychologists who work with pedophiles, academic experts on pedophilia, network analysts. They all said YouTube has built a vast audience — maybe unprecedented — for child sexual exploitation, with grave risks for kids.
YouTube, to its credit, said it has been working nonstop on this issue since a similar problem was first reported in February. YT also removed some of the videos immediately after we alerted the company, though not others that we did not specifically flag.
YouTube’s algorithm also changed immediately after we notified the company, no longer linking the kiddie videos together. Strangely, however, YouTube insisted that the timing was a coincidence. When I pushed, YT said the timing might have been related, but wouldn’t say it was.
I asked YouTube: why not just turn off recommendations on videos of kids? Your system can already identify videos of kids automatically. The recommendation algorithm is driving this whole child exploitation phenomenon. Switching it off would solve the problem and keep kids safe.
Initially, YouTube gave me a comment saying that it was trending in that direction. Experts were thrilled, calling it a potentially huge positive step. Then YouTube “clarified” its comment: creators rely on recommendations to drive traffic, it said, so recommendations would stay on.
On a personal note, I found reporting this emotionally straining, far more so than I'd anticipated. Watching the videos made me physically ill and I've been having regular nightmares. I only mention it because I cannot fathom what this is like for parents whose kids are swept up.
As I reported last year with @kbennhold, YouTube’s algorithm does something similar with politics. We found it directing large numbers of German news consumers toward far-right extremist videos, with real-world implications. https://www.nytimes.com/2018/09/07/world/europe/youtube-far-right-extremism.html
For more on what happens when YouTube and other social networks route an ever-growing global share of human social relations through engagement-maximizing algorithms, read our essay on “the Algorithmification of the Human Experience”: https://static.nytimes.com/email-content/INT_5362.html?nlid=78801897
End of conversation
New conversation
Possibly dumb question: Why is the algorithm driving users to videos that aren't monetized (I'm assuming home movies aren't monetized)?
Don’t have any data in front of me, but I would assume (1) keeping eyes continuously on YouTube is more important than every view being monetized, (2) these videos probably encourage longer viewing sessions, and (3) they probably end up driving viewers to monetized videos.
Yep. The key word is “engagement”: the longer you can keep a user on your site, the more likely they are to end up on revenue-generating portions of it.
So, effectively: Google is taking people's home movies, repackaging them as softcore child porn by putting them together in never-ending algorithmic "playlists," and promoting them on its website for profit. Very cool!
YT deliberately pushed family content, then allowed their algorithm to suggest it to people who had other motives. https://twitter.com/loudmouthjulia/status/1135556972231704578?s=21
End of conversation
New conversation
Yes, it is horrifying. Just one more reason to keep photos of children off social media.
Off of publicly available social media, indeed. The internet is a creepy place.
End of conversation
New conversation
Old enough to remember when Google’s motto was #DontBeEvil
It astounds me that anyone would use social media to share pictures or videos of their children. Use a cloud storage service, and provide access only to your family and close friends.
Let's not blame the victims.
The victims are the children. The parents are culpable participants.
End of conversation