I don't think it's a gender thing anymore, it's more like a power thing now. Women in power have proven to be just as capable of victimizing people as anybody else. Just look at the tidal wave of pervert female teachers getting caught diddling the kids. Also consider the fact that 30% of the people convicted of human trafficking are women. http://www.thedailybeast.com/witw/articles/2013/08/07/when-women-are-found-trafficking-other-women.html
Like I said...it's a man's world. Find me a country where the men have to cover themselves up like it's Halloween every day, or where the men are barred from school while the women aren't. Women are second-class citizens, no more worthy than a goat, and this shit is state-sanctioned. There are exceptions, as with anything else in life, but be real: one look at the stats on violence against women perpetrated by men (and by women to a lesser extent) will prove this more than any post of mine ever could.
Throughout history, in nearly every society and every race, women have been subjected to the rule of men. Whether the domination is cultural, physical, or sexual, their agency in society is always as an appendage to a male societal actor. Gender equality is the exception, not the norm, and where women do make advances it's mostly a modern phenomenon. Remember the old adage, "Behind every strong man is a stronger woman"?
I pretty much thought this was a widely accepted and understood viewpoint. Not that it's a bad thing... if women had the chance, they'd be the ones with all the power.
Bingo. If they had equality, they wouldn't have to manipulate a man to be a social actor. Men benefit by having women kowtow to them for anything resembling influence, and women are just doing what they have to do to survive.
All of this is true in certain cultures, but it really isn't the case anymore in Western civilization. There are no laws in Western society that favor men over women. In fact, in the U.S. and the U.K., the only gender-based laws that exist FAVOR women over men.
In an earlier post, I mentioned not only political, but cultural and physical domination. The culture of the US and UK is still extremely male-centric, as it is in all of Western Civilization. The laws that provide preferential treatment for women in given situations are largely in place as an acknowledgment of that subjugation and an effort to remedy past discrimination. Laws are also necessary when the culture and actual conditions don't allow for equal treatment, basically legislating (coercing) what people are unwilling to do otherwise. The Scandinavian countries and Japan have taken probably the most aggressive steps toward gender equality in recent years, but the work is not finished, not by a long shot.
Or when the recitation of a few laws means it's over. Not even close. Male dominance is hard-wired into the fabric of the species, from the caveman to today. It has a longer pedigree than white dominance, is accepted as a universal, and is even internalized by some women as the way things ought to be.
No. No matter how PC people want to be, not all injustices are the same. Anyone who thinks ww in America suffer the same as bm is either horribly naive or willfully ignorant. Neither of which I have patience for.
I agree. They are both there, but qualitatively different. The domination of women is an urge to control, not destroy. But I think men around the world exert themselves in this way with the women in their societies. Inter-male race hostility has a violent or raging tinge to it.