I was born in 1990. There was no shortage of male role models when I was growing up. Male actors, the lead singers of my favourite bands, comedians: men were given the respect they earned. Now it's all about bringing men down and saying "Yeah, women can do stuff too!!" Yes, nobody was denying that. But now it seems that the only men you see in the media are morons or bastards. Not only that, but it seems that feminism has killed anything fun.
Comedy has gone all innocent, and therefore boring. The best comedians have either gone silent, retired, or gone all PC. The latest movies and TV shows are all about women and minorities overcoming some sort of oppression, with the white man (or men in general) as the villain, or else they're about a man who's a mean son of a bitch to everyone. Most teachers are female (so probably feminist, or at least into the SJW bullshit). I just feel so bad for the young boys coming up today, essentially being told that they're all inherently pieces of shit. It's no wonder that the number of male suicides keeps going up. It's no wonder that people are having less sex than ever before. And it's no wonder that men are going MGTOW.
You guys probably know all of this; it just annoys me.