What’s a phrase women hate hearing from men?

– You’d look better with a smile.

– You’re not going to get a husband without a smile.

– You should smile.

It’s specifically older men who seem to think women should always be smiling; it just seems to be a sticking point for them. I’m not even miserable when it’s been said to me, just not actively smiling at nothing … or at being around them?