
As A Woman, Have You Ever Felt Like A Doctor Didn’t Take Your Pain Seriously?

*starts typing aggressively*


Recently, several articles — in the Independent, Variety, and the Huffington Post — have discussed how doctors seem to take women's pain less seriously than men's.

As someone who has personally experienced this, I'm asking you... has this happened to you?

Maybe you went to the ER in excruciating pain and were told you were being ~dramatic~.

Or maybe you saw a doctor about recurring pain, only to have them brush it off as no big deal.

Or maybe a doctor straight up told you that you were making something up.

It happens... so tell us, has it happened to you?

Let us know in the comments below and you might be featured in a BuzzFeed Community post.
