Growing up as a girl, you quickly learn that representations of women in movies can be more than a little unfair. But you also know there are some movies that make you really damn proud to be a woman.
Maybe you've seen a documentary that inspired you to go out into the world and make a difference.
Or perhaps it was a film based on a true story.
Maybe even what you thought would be a light-hearted romantic comedy showed you just how badass women really are.
We want to know your feminist movie recommendations. Tell us all about the movies you love in the comments, and your submission could be featured in a future BuzzFeed Community post!