Thursday, December 15, 2011

I Don't Get It

I received an email about the new sitcom that's supposed to start on ABC soon called Work It.  The basis of it is:
According to ABC, the show centers on two unemployed men who have "learned the hard way that the current recession is more of a 'man-cession' and their skills aren't in high demand." One finds out that a pharmaceuticals company is hiring sales reps, but only female sales reps. He goes to the interview dressed in heels, a skirt, and make-up and gets hired as a woman.

When I first saw the commercial for it, I laughed and immediately thought of that 80s sitcom Bosom Buddies with Tom Hanks.

For those who don't remember, the premise of Bosom Buddies was that these two guys looking for a place to live in NYC found that the only affordable thing was an apartment building that only rented to women.  So, they dressed as women, got the apartment, and hilarity ensued.  No one caused an uproar about it then, at least not that I can remember.  It was a funny show.  That's how Work It looks to me.

I won't say who the email came from, but it has its panties in a twist because "The premise of 'Work It' reinforces false and damaging stereotypes about transgender people."  I don't see how.  The two guys in question are obviously dudes, and I believe they're even married in the show.  They are not trying to really be women.  The way this economy is now, people go to great lengths just to land a job, and I feel it's a satire poking fun at the way the world is now.  I just don't see how it's damaging to the transgender community.  I am well aware that members of the transgender community face very real adversities in the workplace, as well as in all other areas of life.  But I don't see how this satire of a show is a damaging stereotype of transgender people.  It's almost as if they are missing the entire point of the show.  Why haven't the extreme feminists come around and said that this show is sexist beyond belief and shows the inequality in the workplace between men and women, that women are seen as sex objects and that's why the hypothetical company only wants female salespeople?  At what point did we as a society cross over into taking everything so offensively?  When are we going to get to the point of simply accepting human beings for who they are and not what they look like, who they love, what they're wearing, or the damn reality TV show they're on?  I just don't get it anymore...