Recently, outlets like the Independent, Variety, and the Huffington Post have published articles about how doctors seem to take women's pain less seriously than men's.
As someone who has personally experienced this, I'm asking you... has this happened to you?
Maybe you went to the ER for excruciating pain and were told you were being dramatic.
Or maybe you went to the doctor for recurring pain, but they made it seem like it wasn't a big deal.
Or maybe a doctor straight up told you that you were making something up.
It happens... so tell us, has it happened to you?
Let us know in the comments below and you might be featured in a BuzzFeed Community post.