I have this debate all the time. CareerBuilder put out an article saying a teacher can make $70K in their first year, and I laughed.
Americans think teachers are super rich, so in their eyes we don't deserve any extra pay or anything else that would actually help make learning better.
I've never taught in the States because I don't have a teaching degree. But I taught abroad for six years, and there I felt respected. I hesitate to get a teaching degree here because of all the abuse and the shit pay. I tell my bf all the time that I'm not going to make the money he thinks I will as a teacher.
I have no idea if teaching is worth it anymore. It doesn't look like it to me.