Everywhere you look in Hollywood, new personalities are coming “out” every day. Whether in real life or on television, being gay has never been as accepted or as popular. In “The Goode Family”, there is a lesbian couple. In “Dirty Sexy Money”, a transsexual is having an affair with one of the main characters. And in “Mad Men”, Salvatore Romano marries a woman while pining for a male colleague.
Clearly, the message is, “if you’re gay, the time to come out is now.” Hiding in the closet is no longer tolerated. Some people who display gay tendencies but don’t want to be gay are being pressured to admit that they are gay. But is this type of culture really healthy? Is today’s society actually encouraging people to become gay? What if a gay person wants to stop his “gayness”? Should the media deny him this right?
People are being encouraged to accept themselves as they “really” are. In fact, countless people have observed that they have never seen so many gay, bisexual, lesbian, and transgender individuals in their lives.
Hollywood has done much to make gays and lesbians accepted in everyday American life. Even as fictional gay characters promote “gayness”, real-life gay individuals such as Neil Patrick Harris, Ellen DeGeneres, and Lindsay Lohan promote this way of life.
From a certain perspective, it would seem that the media is inadvertently promoting gays and lesbians. In fact, this campaign has been going on for so many years that people cannot help but listen. Gay rights are now one of the hottest topics in Hollywood and even in the courts.
Hollywood can say what it wants. Everyone has freedom of choice; people are either gay or they aren’t. However, instead of focusing so much attention on gay couples and scandals, it would be better to encourage men to claim their masculinity and women to claim their femininity.
After all, despite everything else, if a male is born a male, he ought to have some masculine characteristics inside him. Instead of coaxing his feminine side out, encouraging “gay” individuals to become more masculine might be better for him and for society. The same is true for females. Instead of telling them that being “masculine” is right, Hollywood should encourage women to claim their femininity, or not interfere at all.