Dana Goldstein, author of The Teacher Wars: A History of America’s Most Embattled Profession, spoke to Fresh Air about how teachers have become both “resented and idealized” over 200 years of history.
In the interview Goldstein explains how teaching became a woman’s profession:
“A lot of people are surprised to learn that back in 1800, 90 percent of American teachers were actually male. Today we know that 76 percent [of them] are female, so how did this huge flip happen?
The answer is that as school reformers began to realize in the 1820s that schooling should be compulsory — that parents should be forced to send their kids to school, and public education should be universal — they had to come up with a way to do this basically in an affordable manner, because raising taxes was just about as unpopular back then as it is now. So what we see is this alliance between politicians and education reformers in the early 19th century to redefine teaching as a female profession.
They do this in a couple ways: First, they argue that women are more moral in a Christian sense than men. They depict men as alcoholic, intemperate, lash-wielding, horrible teachers who are abusive to children. They make this argument that women can do a better job because they’re more naturally suited to spend time with kids, on a biological level. Then they are also quite explicit about the fact that [they] can pay women about 50 percent as much — and this is going to be a great thing for the taxpayer.”