I don't think I'm wrong in saying that some of life's most important skills aren't taught in school.
Like, I never learned what compound interest is (and I'm still not sure I totally understand it, LOL!).
And no teacher ever stressed just how important my credit score is — and all the things that can affect it.
I didn't understand how important mental health was until it directly affected me.
There are other topics that schools don't cover either, especially in sex ed. I remember learning all about erections, but female anatomy was never explained beyond the basics of a period.
Anyway, I'm sure you have things to add as well. Let us know in the comments and you could be featured in an upcoming BuzzFeed Community post!