Recently there have been several articles talking about how it seems as if doctors take women's pain less seriously than men's.
Maybe you went to the ER for excruciating pain and were told you were being ~dramatic~.
Or maybe you went to the doctor for recurring pain, only to have them brush it off like it was no big deal.
Or maybe a doctor straight up told you that you were making it all up.
It happens... so tell us, has it happened to you?
Let us know in the comments below and you might be featured in a BuzzFeed Community post.