If you're majoring in "Gender Studies" or some bullshit like that, then YES, schools are teaching you to be a loser. I'd even broaden that to cover those majoring in Sociology, Anthropology, etc., depending on the school.
I'm of the belief that public school teaches you from the youngest age that "your job IS school," which is why we have so many "professional students" these days. Folks who just want to get paid to go learn stupid shit... not how to fix a car, not how to build a house, not how to be useful in society... NO, they learn about "oppression" and the "patriarchy" and "white male privilege" and the "gender binary" and "cultural appropriation" and so on.