10 things about top 10 global school rankings
The results of international school tests in reading, taken every five years, have been published - with a strong showing for the two participating UK education systems, England and Northern Ireland.
As well as this Progress in International Reading Literacy Study (Pirls), there are also the Timss maths tests and the OECD's Pisa tests, as well as numerous higher education tables. What do these global rankings show?
1. England and Northern Ireland are in the top 10 of a global schools ranking - with Northern Ireland in joint sixth place, in nudging distance of education superstars such as Finland.
It's an impressive performance, with England in joint eighth place in the Progress in International Reading Literacy Study - known as Pirls - which is taken in primary schools every five years.
2. Russia? The top of global rankings such as Pirls and Pisa usually features a limited cast list - Singapore, Finland, South Korea and particularly clever parts of China tend to dominate. But this year Russia is in the gold medal position.
The academics running the tests say this shouldn't be a surprise. Russia has done well before in these tests and has been changing its schools, with a big push on academic excellence and a more rigorous emphasis on standards.
3. Who takes these tests? These global rankings are based on samples of pupils representing the different range of regions, peoples and types of school, whether it's somewhere the size of Luxembourg or the United States.
For the Pirls tests, England's result was based on a sample of about 5,000 students in 170 schools, while top-rated Russia's result was based on about 4,600 pupils in 206 schools. The sample for the United States was 4,425, or the equivalent of fewer than 100 per state.
4. Comparing like with like? There is something mesmerising about a ranking: it's a hierarchy uncluttered by any complicating factors. Rankings are blazing headlights on the motorway rather than a torch in the study.
But that means not noticing details, such as pupils in the Pirls test being different ages. The flying Finns near the top of the table were on average about a year older than the lower-ranked French or Italians. That's a big difference in primary school.
5. Who should take the credit? It's an iron rule that current governments are responsible for all success, previous governments for all failure. Also, it's a free buffet for drawing conclusions that suit your own views.
England's success could be an argument for a rigorous national curriculum testing system, phonics and league tables. Northern Ireland's could be attributed to not having Sats, schools divided on religious lines and the demands of selective secondary schools.
6. Pick your facts, choose your headline: The same rankings can generate entirely different narratives. The Pirls results have rightly been seen as impressive performances from schools in England and Northern Ireland, well above average by international standards.
But rankings can be used selectively. In absolute terms this year's results put England in 10th place, but because there is no meaningful statistical difference between England and the two countries above it, the Pirls organisers have said this is the equivalent of joint eighth.
In the previous tests five years ago, England was ranked 11th. But the Pirls organisers say that if the same approximation were applied retrospectively, England would have been joint sixth. So did the results improve or dip? You could produce entirely different interpretations from the same evidence.
7. They make a big impact: Even if you don't believe in education league tables, they make things change around them.
The Programme for International Student Assessment tests run by the Organisation for Economic Co-operation and Development have been seen as driving education policy and stirring education ministers to measure themselves against international standards.
In Germany, this became known as "Pisa shock", when a country that thought it had the world's best education system discovered it was some way behind many of its Asian competitors.
When the Pisa tests were still finding their feet, the US tried to squash their uncomfortable message about its deeply divided schools - and the OECD's Andreas Schleicher has said that it was the intervention of Ted Kennedy that stopped the US from blocking their publication.
8. You rank only what you can measure: It's no coincidence that global rankings focus on maths, science and reading. They are much more straightforward to test and mark than more complicated, culturally defined subjects such as history or literature.
But does that mean that less value is attached to subjects that won't see countries climbing up league tables?
9. Nothing is inevitable: It's no accident that countries such as Singapore, South Korea and Finland are at the top of global rankings. They quite deliberately pursued long-term, multi-generational policies to create excellent school systems, with the aim of rising up the economic food chain.
The OECD has rejected the idea that some countries have a "culture" of education. It uses the examples of Singapore and South Korea to show a country can go from widespread illiteracy and poverty to having some of the highest education standards in the world.
10. Are they fair? It might seem one-sided to compare a wealthy European school system with a developing country, or a huge sprawling country with a compact city state. But the argument of league tables is that it doesn't matter whether it's fair - it's the reality of a globalised world.
Young people in very different and unequal settings are in the same economic race - and their chances of success will be heavily dependent on their access to education.
And if you don't get the right result in an education league table... there will be another one along soon.