How Feminism Destroyed The US Black Family

Written by keith
Feminism is a range of social movements, political movements, and ideologies that aim to define, establish, and achieve the political, economic, personal, and social equality of the sexes. Feminism incorporates the position that societies prioritize the male point of view and that women are treated unfairly within those societies. Efforts to change that include fighting gender stereotypes and seeking to establish educational and professional opportunities for women equal to those available to men. The video below is by Shahrazad Ali.
