So first off, I'm not really the most knowledgeable about political terms, especially American ones. I'm not American, and where I live "liberal" isn't really thrown around a lot.
So I've looked into what "liberal" means, and it seems to promote freedom and tolerance for all, as well as secular government and a social democratic economy. Those are all ideas I support, given that a lot of the most prosperous and well-off countries in the world would fit that definition.
But in US politics, "liberal" seems to be used more as a derogatory term by both sides of the political spectrum (e.g. "fucking liberals"). Liberals seem to get unfairly lumped in with the thugs who riot after every police shooting and demonize white people, and with the portion of feminists who act like all men are horrible sex monsters.
But by the definition above, neither of those groups would really count as liberal.
Liberals sound like they'd be the most tolerant, forgiving, and nicest people in politics, just going by what I've found. But they seem to get bullied a lot, to the point where even the mere mention of "liberal" will get some people riled up.
So I just need someone to explain to me why liberals get such a bad reputation, and why the term gets thrown around so much whenever someone who might lean slightly left does something deplorable. Acts of violence and the like seem to go against everything I've read about what being liberal means.