With everything happening in the world on a day-to-day basis, Earth Day may not be the top thing on your radar. That said, there's no overstating the importance of raising awareness of what's happening to our planet.
Since 1970, the holiday has brought national attention to the need for environmental conservation. We've lived through enough 'unprecedented events' to know that we're not in great shape, and every step we take toward making the world a better place (in the physical sense here, but really, in every sense) is a crucial one.
Here are some of the documentaries and films that teach us important lessons about the world around us.
1. Kiss the Ground
2. Our Planet
3. Ice on Fire
4. The Year Earth Changed
5. The Biggest Little Farm
6. Jane
7. Fantastic Fungi
8. Chasing Coral
9. A Plastic Ocean
10. The True Cost
11. Mission Blue
12. The Lorax
13. Happy Feet
14. WALL-E
15. No Impact Man
16. Planet Earth
17. Wings of Life
18. Hostile Planet
19. Watermark
20. Public Trust: The Fight for America's Public Lands
What other Earth-focused documentaries are among your favorites? Share in the comments!