Methodological issues in the production of government statistics aren’t generally headline news.
But the headline international migration statistics are one of the most politically important and controversial sets of statistics published by the Office for National Statistics.
So the decision of the UK Statistics Authority that they can no longer be described as “National Statistics” – that is, they are no longer “fully compliant” with the Code of Practice for Statistics – will be an exception. And the accompanying explanation by the ONS, which includes significant revisions to the historical data on migration, even more so.
So what’s going on? It is now almost four years since I first pointed to the divergence between migration from the EU as measured in the migration statistics, and the number of EU citizens registering for National Insurance numbers; I concluded that “there are actually considerably more such recent migrants than the official immigration statistics actually suggest.”
And since then the inconsistencies have only grown.
In particular, as I wrote last June, the official labour force and population statistics, based on the Labour Force Survey, which try to estimate broadly who is resident in the UK at any one time, tell a very different story to the migration statistics, based largely on the International Passenger Survey.
According to the former, the increase in the number of EU citizens resident in the UK far exceeds the numbers who have, according to the latter, migrated here. For non-EU citizens, the reverse is the case. I concluded “it is reasonably clear that in the recent past EU migration has been significantly higher, and non-EU migration significantly lower, than we thought.”
I speculated then that the most likely explanation of these divergences was that “some EU citizens who arrive here thinking they’re only going to stay for a short time – and hence don’t tell the survey that they’re immigrants – end up staying.” Meanwhile, the overestimation of non-EU migration is likely to be down to undercounting students who leave at the end of their studies, as the university sector has long argued. ONS’s paper today comes to broadly the same conclusions.
So today’s announcement represents a welcome recognition of these issues by ONS, and an acceptance that the current migration statistics are no longer fit for purpose.
Rather than relying solely on the International Passenger Survey, they are “triangulating” estimates, using not just the IPS but also the information from the LFS and administrative data from the DWP, HMRC and the Home Office. The result is a significant set of revisions to the previously published estimates of both EU and non-EU net migration.
Cumulative EU migration has been revised upwards by about 240,000 over the 2009-16 period, while non-EU migration was overestimated by about 170,000 over the 2012-18 period, mostly because student emigration was undercounted.
This is, however, very much work in progress – as the re-designation of the statistics as “experimental” shows.
It will be a while before the new system beds in, and perhaps even longer before we fully understand what it is telling us. Indeed, we will probably have to wait for the results of the 2021 Census, sometime in 2022 or thereafter, before we really understand what was actually happening to migration to the UK in the 2010-2016 period.
What, then, are the broader lessons from this?
First, ONS – after what I and others working in this area regarded as a somewhat slow start – have done the right thing.
They have taken on board the points we have raised about the unreliability of the statistics, and have responded, not just by accepting that there was a problem, but by trying to address it using additional data sources. They have done so in an open and consultative way, despite the very real political sensitivities involved. And they have, sensibly in my view, implemented an interim solution rather than stonewalling until a new system is fully operational.
But there are also some political and policy issues.
David Cameron’s target to reduce net migration to the “tens of thousands” would have been economically illiterate if it had been based on reliable statistics, but this new information cements its position as one of the most idiotic government pledges in recent history.
Worse still was the impact on policy on student visas. It was these statistics, which appeared to show that far more non-EU students were arriving in the UK than ever left, that Theresa May used to justify her deeply damaging approach to overseas students – a policy which has reduced the UK’s market share in one of our most competitive and highest-value export sectors, as well as doing less quantifiable but equally important harm to the UK’s global image.
This also has implications for our post-Brexit immigration policy.
Much has been made of the potential for greater “control” over migration flows after the end of free movement. This often assumes implicitly that politicians in Westminster, or bureaucrats in Whitehall, are well placed to engage in the central planning of the UK labour market. The fact that our estimates of such flows – both of EU citizens and those from outside the EU, where in principle we already have such control – can be so inaccurate illustrates the difficulties involved.
Finally, we should be careful not to overreact.
Doubtless, some more excitable politicians and commentators will react to this news by claiming that “we have no idea who is in the country” and that the answer is to beef up border controls so that we “count people in and out of the country”; indeed it has been reported that Priti Patel has dispatched officials to Singapore to work out how to do just that.
But in fact we do not measure “who is in the country”, or plan the provision of schools, hospitals and transport, with the migration statistics – it is the population statistics which serve that purpose. And while there may be issues with them as well, that is for another day.
By Jonathan Portes, senior fellow at the UK in a Changing Europe.