Let's be honest: going to the doctor or seeing a mental health professional can be nerve-wracking. Not only are you discussing your health and potential health issues, but you're also being incredibly vulnerable with someone you typically don't know very well.
Unfortunately, as much as we want our doctors and therapists to advocate for us, some dismiss or disrespect our bodies or our mental health.
And since this happens all too often, we want to know: Was there ever a time a doctor, therapist, or other medical professional dismissed or disrespected your body or mental health?
For instance, did a doctor claim you didn't need antidepressants because "everything was in your head"?
Or maybe a therapist dismissed your cultural experience because it didn't "match" their own?
Perhaps a doctor brushed off your chronic pain because you weren't exercising or eating healthily enough, when that, in fact, had nothing to do with what was going on.
Or maybe they blamed your body weight for the pain you were feeling and refused to schedule any tests because of their own "diagnosis," even when you repeatedly asked them to.
Also, how did you feel when your doctor, therapist, or other medical professional said this to you? Did you respond in a certain way? And if so, how did they react when you spoke up?
On another note, how did this experience affect you in the long run? Because of it, do you now take certain steps to advocate for yourself? Or is there anything else you'd like other patients to know moving forward?