I’ve noticed an interesting trend in books, movies and TV
shows. When I was growing up in the 1980s, you never saw a woman fighting a man
in hand-to-hand combat…well, maybe with the exception of Charlie’s Angels. It just wasn't cool. However,
in the last five to ten years, it is noticeably more common to see a woman and
a man duking it out, sometimes to a bloody end.
Clearly, the era of the damsel in distress is over, and
that is a good thing! Women aren’t
helpless, and frankly never have been, and I’m glad to see them given credit for
their strength and ingenuity.
That being said (and don’t call me names for pointing
this out), men and women are not built the same.
Often, by virtue of their gender, men are physically stronger. How often do we hear of a man suffering a
terrible beating at a woman’s hands that puts him in the hospital? We just don’t, because most often the man is
strong enough to ward off a direct physical attack.
So here’s the point.
When our daughters watch men and women directly fighting (and often
killing) each other, does that change their perception of what a woman can
physically do, and is that what we want to teach them?
I have mixed feelings on this issue. I want my daughter to be tough, to know what
she wants, and to be able to defend herself in any situation, but I also want her to realize that not
being the same as a man does not mean she’s weak. I want her to celebrate the fact that men and
women have different strengths and virtues and can learn from each other. I want
her to know that it is okay to need someone, especially if that someone needs
you in return.
So how will what she watches on screen and reads about affect how
she sees herself? Which messages will she embrace, and which will she reject? Only
time will tell.
What do you think? Do the trends in popular culture affect
us, and how will this one change our daughters, our friends, and the little girls who will one day
grow up to lead us? I'd <3 to hear from you.