All too often, movies and TV shows get dismissed as JUST consumable culture, but they're so much more than that. They can move us, validate us, and inspire us to be our true selves.
In some cases, they can even motivate us to see the world differently, or to live better lives.
So, what's a quote/moment in film or TV that changed the way you see the world, or even changed the way you see yourself?
Maybe Cristina from Grey's Anatomy reminded you that you are (and always have been) enough after a particularly terrible breakup.
Or maybe seeing Wonder Woman take on "No Man's Land" filled you with a strength you never knew you had.
Share a movie or TV moment (or quote) and how it changed you in the drop box below for a chance to be featured in an upcoming BuzzFeed Community post or video!