I took Women in American History this semester at my university. I’ve never been so excited to attend a lecture at 9 a.m. in the history of my education. My professor came with disclaimers:
“#1: I study the history of sex and sexuality. We will be discussing sex this semester. What choices you make in your personal life are your own. When I speak about sex, I am doing so historically.
#2: I study the history of medicine. We will be discussing medical issues, especially those related to reproduction and including contraceptives and abortion. Again, what choices you make in your personal life are your own. When I speak about these topics, I am doing so historically.
#3: Most importantly, what I say in class on medical history should never be taken as medical advice. On all matters of your physical or mental health, you should consult a medical doctor.”
I liked her before class even began. This class taught me things I could never have learned from a textbook. Why is it that women so rarely take center stage in history? Why is it that the Rosie the Riveter we know and love isn't the original? Why don't people know about the women who were so integral to the Civil Rights Movement?
I know that before this class, I wouldn't have been able to answer any of these questions. Hell, I didn't even know what ERA stood for. I'm grateful that I have found the feminist inside of me. (And no, I don't fit the stereotype.)
Women deserve the opportunity to be equal to men in every respect. Sure, there are real biological differences, but those should never hinder a woman from accomplishing her goals, nor should anyone, woman or man, criticize her for wanting to break the mold.