"Femiphobia," my term for the fear of feminism, is widespread. Feminism has been labeled radical, but is it really? The American Heritage Dictionary defines feminism as the "belief in the social, political and economic equality of the sexes." That is not radical at all.
I'm going to assume that most women, and many men, believe in equality between the sexes. If you still think women should be submissive and devote their lives to having babies, stop reading now and re-evaluate your whole point of existence.
I doubt, however, that many people still hold this old-fashioned view. After reaching several milestones toward equality, women have become more insistent about their rights, taking stances on issues that matter to them.
But many people don't believe women are oppressed in the first place. Until recently, I had always considered myself disadvantaged as a black person, not as a woman. After looking deeper into the issue, though, I began to realize that the two work together, although sexual inequality is sometimes even less logical.
For instance, if minority inequality is defended by the claim that outnumbered groups are bound to be disadvantaged, what is the defense for subjugating a group that comprises more than half the population?
Feminist ideology is diverse, although it is rarely portrayed that way. Nowadays the "typical" feminist is identified the way I described earlier: a woman who thinks all men are evil and that women would be better off without men at all. Castration, supposedly, is a solution the feminist would applaud.
This stereotype, no doubt fueled by the media, has served as a very powerful means of keeping women from identifying themselves as feminists. It also stalls women's progress: stripping the comfort from female bonding disrupts unity among women, and unity is essential to progress.
Now let's set the record straight. Feminism and man-hating are not synonymous. Of course some feminists do hate men, but they are a small percentage. Besides, you can find plenty of non-feminists who hate men.
One might even argue that feminism makes relations with men healthier, not worse. It teaches a woman to respect herself as more than a sex object. That's a lesson worth having, and living by it compels men to respect her as well.