Feminism is defined as the advocacy of women’s rights on the grounds of political, social, and economic equality to men. Isn’t that absolutely ridiculous? How dare women speak up after being oppressed for centuries? Feminism is obviously one big joke that no one should take seriously.
I definitely don’t need feminism because I am perfectly fine with being pulled out of class to change clothes so that I don’t distract the boys. I’m content with pretending to be on the phone when I walk down the street alone at night. I don’t even mind earning roughly 80 cents for every dollar my male counterparts make. I think it’s amazing that doing anything “like a girl” has become an insult. I’m fine with being told to look a certain way in order to impress men. I don’t need feminism because I’m happy with my abilities and skills being doubted.
It’s evident that gender inequality is a thing of the past, just like racism, police brutality, and world hunger.
However, on a serious note: no matter one’s gender, religion, or race, shouldn’t we all be equal in politics, economics, social relations, and life itself?