I think the word "feminism" is very misleading.
I get that it's meant to give both men and women equal rights, since unfortunately there are sexist men out there who see women as nothing but sex objects and kitchen wives (and I've dealt with friends like that), but the way it's worded sounds more like women trying to be dominant over men.
It also doesn't help that there are "feminists" out there who take it way too far (and don't even get me started on those annoying "feminists" who claim they want "equal rights" but then use their sex as an easy cop-out to get out of a situation).