This could also be called “Lies my church told me” or “Lies society tells me” because I hear it EV.ER.Y.WHERE and it gets old.
People raised female get force-fed this idea that their entire worth is in their uterus and their ability to procreate – that if they are infertile or simply do.not.want.children – something is terribly, horribly wrong with them, or they are bad people.
I’m here to tell…
I’ve never felt pressured to have kids, but because I am uncomfortable around them and know I don’t want them, my family has labeled me as a “kid hater.”
A few months ago I decided to see a doctor about making my decision permanent. And since I don’t want kids, I don’t feel I should be forced to deal with any of the monthly crap, either. I have never felt so judged, demeaned, and belittled in my life, because that doctor was a giant asshole who thought I should just “deal with things the way they were” and “let my husband worry about the permanent thing.” It really hurt my confidence, but it has made me more resolved than ever to live inside my body how I want to, not how biology or some idiot doctor thinks I should.
^This. Every time I broached the subject of sterilization, or something more long-term/effective than the pill, with my nurses, I was patronized and just felt horrible afterwards. They didn’t help with the feeling-not-broken bit, and it hurt.
In other news, Planned Parenthood is the best. For the first time, they actually treated me like a person and didn’t try to talk me out of long-term bc (I have the implant). I actually felt human and not like some kind of monster, or a child who changes their mind every 10 seconds.