Go up to any random woman from any walk of life, and I bet you anything her answer to “what’s the most annoying thing men say to you?” will be one short word: smile. I cannot tell you how many men/boys/pre-adolescent male creatures have told me to do it.
“you’d look so much better if you smiled”
“there’s no harm in smiling”
“smile for once”
Trust me, I’ve heard it all. To be clear, I wouldn’t label myself as the bitchy/downcast type. Even though friends and family have told me I have a ‘resting-bitch-face’, I’m not a person who hates sunshine and everything happy and positive in life. But I have never understood why we as females, and our femininity with us, somehow have to be synonymous with smiling?
Smiling does not make me a better person, it does not make me prettier (contrary to popular belief), and most importantly it will not make me more likely to give you my number, so why do men do it? Personally, I believe it’s to do with a man’s masculinity and confidence, or rather, a lack thereof. A man with confidence will never tell you to “smile” as a way of suggesting that you’re somehow lacking in femininity, making you slightly embarrassed and giving himself a psychological testosterone shot in the process. No, a real man will just make normal human-to-human conversation with you. Is that too much to ask?
It sounds so weird typing this, but I actually remember the first time a man told me to smile (why, out of all the billions of memories in my 21-year-old brain, I remember that one, I have no idea). It was a homeless man who was asking my Dad for change; he complimented the sunflower top I was wearing and told me, “smile beautiful”. At the time I remember blushing in response and ducking my head, but if I had known then that the homeless man would set the scene for thousands of random men to come, I would’ve walked away quicker.
From then on, through my teens and now into my early twenties, I’ve had to deal with it on a ridiculous scale. Men on the street, men at construction sites, men in bars, men in clubs, men in shopping centres. It’s almost disheartening that I, and I’m sure many other women, have had to put up with the stupid comment in every kind of social setting.
In a nutshell, it’s nearing 2018 and yet there are some men (aka the majority) who still feel they have some kind of authority over women they’ve never met before, as if what’s between their legs allows them to invade a female stranger’s privacy. Telling me to “smile” is not cute; it’s actually condescending and dumb. Why would anyone want to walk around smiling 24/7? (Fun fact: a guy actually asked me why I was smiling so much in a club once.) Most importantly, it’s very, very sexist. Women don’t need to smile to make men feel comfortable anymore. We’re not homebodies who wait for our men to come home with dinner in the oven; we’re out and about and we’ve got things to do. Who the hell has time to smile?
So now, whereas before I would awkwardly force a smile and avoid eye contact, when a male stranger tells me to smile (which, as it happens, was just yesterday evening), I say “no”. I don’t quicken my pace in fear of his response; I keep my ‘resting-bitch-face’ intact, and with one short word I correct his assumption that women are put in this society to please men. And if I’m intimidating because of it, then so what? I’d much rather be intimidating than please every Tom, Dick and Harry who comes along and tells me what to do with my face.
Have you ever been told to “smile” before? (Dumb question, lol.) Comment below with your story.