I've been looking at news and research for years now. I love how nobody has EVER been able to give a real answer to my biggest question: "Name one right that men have that women don't" (in America, of course). I readily admit that in almost any other country there is some form of social, political, or intellectual discrimination, but this is NOT at all the case in the USA.
The only "right" or "privilege" that men have that women don't is the right to work menial jobs like coal mining or construction, as well as contributing to over 85% of workplace-related deaths and injuries. I'm sorry, but in America, socially, politically, intellectually, and most of all occupation-wise, women are completely equal to men.
Inb4 "
r/iamverysmart"; I'm not trying to prove my vast intelligence over everybody else because I recognize, obviously, that many many people in the world are much smarter than me. You,
/u/TaPontym, are probably smarter than me in many ways. But a lot of people are uneducated on social issues and relations in America. They are swayed by biased media, and don't know how some things really are in ways other than anecdotal evidence, which is completely irrelevant 99% of the time. I'm truly sorry if I seem like a sexist, mysogynistic asshole, because I have absolutely NOTHING against women whatsoever. It's really only the first-world feminists, that are deluded into believing women in America face real social discrimination any more than men do, that set me off a bit. I'll accept the assured barrage of downvotes I'll receive for this.