What is something that society/your culture made you insecure about, which you didn’t/don’t think should be an insecurity?

All topics related to genitals and sexuality were taboo where I come from. It really gave me a lot of insecurities growing up, and you're basically in the dark when you start your sexual life

In the end, I made tons of mistakes and had to “learn” on my own

Wish my parents had talked with me more about sex stuff in general