A recent New York Times article highlighted a shift in the American economy: men are now entering jobs that were once dominated by women. The phrase “women’s work” is becoming obsolete as men go to school for dental assisting, dental hygiene, nursing, early childhood education, and other fields.
I find the phrases “pink collar jobs” and “pink collar work” perfectly ridiculous. (I’m also not fond of the terms “blue collar jobs” and “white collar jobs,” but that’s a conversation for a separate article.) Attaching a person’s gender to an occupation is archaic. What’s between our legs should not be relevant to our education and employment. Ability and passion for the industry should dictate a career choice, not gender.
The deconstruction of gender stereotypes across our society and our industries is thrilling. Diversity in the American job market benefits everyone. Men, are you working in a field that was once dominated by women? If not, would you consider entering one of these occupations?
photo: 8bitjoystick / flickr