Community · Posted on Mar 5, 2017

As A Woman, Have You Ever Felt Like A Doctor Didn't Take Your Pain Seriously?

*starts typing aggressively*

by Lara Parker, BuzzFeed Staff

Recently there have been several articles about how doctors seem to take women's pain less seriously than men's. (Independent, Variety, Huffington Post)

As someone who has personally experienced this, I'm asking you... has this happened to you?

Maybe you went to the ER for excruciating pain and were told you were being ~dramatic.~

[GIF: Bravo / Via realitytvgifs.tumblr.com]

Or maybe you went to the doctor for recurring pain, but the doctor made it seem like it wasn't a big deal.

[GIF: Bravo / Via realitytvgifs.tumblr.com]

Or maybe a doctor straight up told you that you were making something up.

[GIF: Bravo / Via realitytvgifs.tumblr.com]

It happens... so tell us, has it happened to you? Let us know in the comments below and you might be featured in a BuzzFeed Community post.