Feminism today is too often dismissed as a dirty word. When most people picture a feminist, they imagine a hairy, man-hating woman who is probably the way she is because of a bad breakup. That is far from the truth. Feminism doesn't mean women want to be the ruling sex; it means wanting gender equality and being seen as equal to men.
One topic of conversation on a few lips recently is the debate over whether or not to free the female nipple. (Emphasis on female, because of course the male nipple can be shown in public, despite the only difference being a matter of extra tissue.) I, a 15-year-old, can understand that the reason women can't walk as freely as men, or even breastfeed their children in public, is that the female body has been sexualised by pornography. Most people reading this were probably breastfed, so why is this such a taboo subject? Some say it would lead to more rapes, yet it is legal in most European countries, and those are among the safest places in the world for women.
Whenever I bring up the subject of 'free the nipple' with others, all I hear is "I don't want to see it," but once I explain that we're simply raised to think it is wrong, they tend to understand more clearly. Breasts are there to feed babies: nothing more and nothing less. I would also like to point out that breasts are not a sex organ, so the argument that they are equivalent to the vagina and penis doesn't hold up. So why is it so wrong?
It is true that we still live in a man's world, where men are paid more than women and are still allowed to do more. People need to stop treating feminism as a dirty word and start standing up for gender equality. Women in the past died to get us where we are today, so we owe it to them to finally be seen as equal.
I hope this post has made people think, more than anything.
Lots of Love,
On Marz
x