Feminism in America is no longer necessary. Women have all the same rights that men do! We’re all equal now!
Look, sexism and racism are awful, but we can’t get rid of them. People will always have offensive opinions, but that’s on them. I’m not sexist and I’m not racist, but America’s feminists need to STOP COMPLAINING. There are women in the Middle East who will be EXECUTED if they show their ANKLES. That is where feminism is necessary. NOT here.
Okay, let’s address the wage gap. IT DOESN’T EXIST. Georgetown University economist Anthony Carnevale found that women tend to choose lower-paying majors, such as drama and theater arts or counseling psychology. On top of that, many women leave the workforce for a period of time to raise their children.
Also, I believe that men and women have distinct roles in the family. It is the man’s job to be prophet, priest, king, and warrior. Men are supposed to provide for their families, pray for their families, and protect their families. Women are supposed to support and pray for their husbands. That is what the Bible says.