[QUOTE]Feminism is a broad term. By and large, it is a political movement that began in the late 19th century and pushes for the elimination of gender roles in society.[/QUOTE]
I didn't want to hijack Aaron's thread on feminism/homosexuality, but I wanted to reply and then focus on feminism, so here goes. I do agree that, to a small degree, homosexuality has become more acceptable due to radical feminism. On the other hand, I'm all for equal social and political rights: the right to vote, to hold political office, equal pay for equal work, etc.

Gender roles...I don't think they're all that existent, beyond the basic, natural common sense that women are the ones who carry and nurse children. Staying at home after nursing doesn't seem like a gender role...how are the men to teach the children and raise them properly (homeschooling is a man's job, right?) if the woman stays home?

PS: What I'm saying in this post is not necessarily what I believe, and if this discussion continues, please don't take what I may say as an exact reflection of my views. I'd simply like to explore the topic more, as I haven't solidified all my thoughts on it.