If you're like me, then you LOOOOVE documentaries.
They can be about anything: wildlife conservation, social justice issues, niche cultures, crimes... ANYTHING. And you love them all!
But has there ever been one documentary that especially made you care about or change your perspective on something?
Like maybe you watched National Geographic's Gender Revolution: A Journey with Katie Couric and you finally learned the difference between a transgender woman and a non-binary person, therefore becoming more knowledgeable and empathetic.
Or maybe you watched Food, Inc. and felt moved enough to go vegetarian or vegan.
Oooooooor perhaps you watched Discovering Bigfoot and now you're a true believer.
Tell us what documentary changed you and why in the DropBox below for a chance to be featured in an upcoming BuzzFeed Community post or video!