It’s so sad that adults often shied away from telling or teaching us about our bodies when we were younger. I remember starting my period and my mum just said “now if you sleep with a man, you’re going to get pregnant” – no education whatsoever on what was happening to my body and why.
What are some facts you wish you’d been taught about your body/sexual health when you were younger, no matter how small or trivial?
For me, it’s little things: no one at home told me we don’t pee from the vagina (I only learned that at school), or that you don’t have to wash the vagina and it’s not supposed to smell like roses.