Feminism is equality
Feminism is “the belief that men and women should have equal rights and opportunities.” We live in a world where the genders are far from equal, and that inequality harms men and women alike. While we believe feminism is a positive movement that continues to bring beneficial social change, some people still aren’t convinced.