I say western feminism because, in other countries where women are severely repressed, feminism can be quite useful for enacting change. But here in the good old USA, feminism seems to have other motives. Since women are repressed by the media, not the government, it's more of a "social" change women are seeking in this country, not political change as in other countries.
One problem with western feminism today is that the idea seems to come from a sense of entitlement. "I'm a woman, I'm JUST as good as a man, and you must automatically treat me with respect!" However, should we hand out respect just because it's demanded?
I'm a firm believer that respect should be earned, not expected. If women want equality with men, they should take the time to earn mutual respect and to compromise, not demand respect from men simply because other men treat women badly. That just reverses the sexism. Respect: it's an honor, and something everyone should earn.
To me, it's admirable when a woman pursues a career, has her own financial goals, and can take care of herself. But to imply that a woman who DOES enjoy homemaking, being a supportive wife, and taking care of her family is somehow the WEAK one? That's just more garbage floating around this western, "politically correct" society.
A woman can be strong and hardworking even if she embraces "typical femininity" and follows traditional gender roles. In fact, I believe a woman who selflessly sacrifices for the benefit of her family and loved ones is far stronger than modern feminists might imagine!
A few years ago, someone might have said we were closer to gender equality than ever before. Now many people would argue that we're not even close. So which is it? Have we really gone backwards? Have more problems come up in recent years? Or is it possible these "new" problems were simply exacerbated by the media?



