U.S. school excuses challenged
Many Americans, including me, are skeptical of efforts to portray our public schools as failures compared to the rest of the world. The late Gerald W. Bracey, my favorite contrarian education expert, exposed exaggerations, false assumptions and deceptive graphics that made us look worse than we were.
But a new book edited by Marc S. Tucker, president of the National Center on Education and the Economy, offers convincing evidence that we are running out of excuses. The book, “Surpassing Shanghai: An Agenda for American Education Built on the World’s Leading Systems,” is so unsettling I am devoting two columns to it. Today I examine apparent flaws in our rejection of international comparisons. Thursday I outline what Tucker says we must do to catch the Shanghainese, Japanese, Finnish and other top-performing education systems.
Tucker’s analysis is based on results of the 2009 Program for International Student Assessment (PISA) test. It compares 15-year-olds around the world in reading, math and science. Like all standardized exams, it has its flaws.
Here are some common excuses for poor U.S. performance and why Tucker thinks they are wrong. I also have included commentary from Brookings Institution scholar Tom Loveless, an expert on PISA.
1. Our scores are lower because so many of our children are from immigrant families speaking different languages. Tucker says “the reading performance of children without an immigrant background in the United States is only marginally better than the performance of all students. It turns out that Canada, New Zealand, Australia and Hong Kong, all with percentages of immigrant students equal to or greater than the United States, all out-perform the United States in reading.” Loveless says Tucker needs to prove that immigrants in those countries are as poor and culturally deprived as U.S. immigrants.
2. Our suburban kids do fine, but our national average PISA results are dragged down by urban schools that serve low-income students. In fact, Tucker says, the U.S. suburban average is only slightly above the average for all developed nations in the Organization for Economic Cooperation and Development, which sponsors PISA.
3. If top-performing countries had to educate as many disadvantaged students as we do, they would not perform as well. PISA has results for what it calls “resilient” students, those who are in the bottom quarter of an index of economic, social and cultural status but who score in the top quarter of the PISA achievement measures. The higher the proportion of such students in a country, the theory goes, the better its schools are doing at educating the students who are most difficult to teach. The percentage of resilient students in the United States is below the PISA average. Twenty-seven countries, including Mexico, are ahead of us. Loveless wonders if this says anything besides “countries that score higher than us score higher than us.”
4. If we spent more on education, we would have better results. In fact, Tucker could find only one OECD country, Luxembourg, that spends more per pupil than we do, even though we score only average in reading and below average in math and science. The key factor, he says, is what we spend the money on. If we measure teacher compensation by how much teachers are paid compared to other professions requiring the same years of education, only three OECD countries pay their teachers less than we do.
5. If we emphasize reducing class sizes, our students will do better. The PISA data show otherwise. Countries that give higher priority to raising teacher salaries than to reducing class sizes have better achievement results than countries like ours that do the opposite. Loveless says he is sympathetic to this argument and the previous one, but would like to see evidence of causality.
Much of the data Tucker used is at oecd.org. On Thursday, I will present his recommendations, and whether they make sense for the United States.
05:00 AM ET, 12/11/2011