Bernard Casey examines the methodology of the Office for National Statistics’ Labour Force Survey in light of recent concerns about the quality of UK labour market data.
In the last few weeks, there has been much discussion about the quality of UK labour market data, including at a recent UK in a Changing Europe event on the state of the UK’s labour market. This came after the Office for National Statistics (ONS) found it necessary to delay the publication of unemployment numbers and rates to make further revisions.
Figures for unemployment, together with those for employment and labour force inactivity (people of working age but outside the labour market), are vital for measuring the level of demand in the economy. And this in turn shapes decisions that affect everyone – these numbers are eagerly watched by those making interest rate decisions.
One UKICE panellist argued that the response rate (RR) on the Labour Force Survey (LFS) – the source of the relevant figures – had fallen dramatically from 50% to 15%. He attributed this to Covid and the fact that, rather than knocking on doors, the ONS started contacting people online and by telephone.
How the LFS works
The problems, however, are not merely Covid-related. The LFS is a quarterly ‘micro-survey’ of households that also collects information about the individuals within them. Respondents are surveyed over five successive quarters, providing insights into labour market transitions.
Like all surveys, the LFS suffers from being unable to contact everyone sampled. Some people refuse to take part, and some are lost when recontacted in follow-up quarters. An initial failure to contact does not mean no further efforts are made, and the ONS compensates for non-response by reweighting answers on the basis of what is known about the population as a whole – giving greater weight to answers from groups whose response rates (RRs) are lower, and lesser weight to those whose rates are higher.
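The reweighting described above can be sketched in a few lines. This is an illustrative example only – the group categories and shares below are invented for the sketch, not actual LFS figures or the ONS’s actual weighting method, which is considerably more sophisticated.

```python
# Illustrative sketch of non-response reweighting: responses from
# under-represented groups are weighted up, and those from
# over-represented groups down, so the weighted sample matches
# known population shares. All figures here are made up.

population_share = {"16-24": 0.15, "25-49": 0.45, "50-64": 0.25, "65+": 0.15}
respondent_share = {"16-24": 0.08, "25-49": 0.42, "50-64": 0.30, "65+": 0.20}

# weight = population share / share actually achieved in the sample
weights = {g: population_share[g] / respondent_share[g] for g in population_share}

for group, w in sorted(weights.items()):
    print(f"{group}: weight {w:.2f}")
```

On these invented numbers, the hard-to-reach youngest group receives a weight above one, while the over-represented oldest group receives a weight below one.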
Initially, the LFS was conducted face-to-face (FtF). However, the ONS started using telephone only (TO) surveying in the late 1980s, finding the results to be satisfactory. Wave 1 interviews were FtF, but subsequent ones were TO. Post-2010, TO began to be used for first contact with a respondent, too. This was part of an effort to cut down costs.
Response rates were indeed lower by the early 2010s. However, they had been dropping well before that. In summer 1993, they had been 79%. Before the onset of Covid they had fallen to 36%. But, as chart 1 shows, there was no obvious kink in the data post-2010 – rather, a steady decline. Explanations for this include changing living conditions (more multiple occupancy and more single-person living), changing job conditions (fewer standard working weeks), increasing mobile phone use, and a growing distrust of unsolicited callers for fear of hoaxes.
The LFS during Covid
With Covid, a kink can be seen. The rate of decline of the response rate accelerated – from 36% in spring 2020 to 15% in summer 2023.
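A rough back-of-envelope calculation using the figures quoted in the text makes the acceleration concrete. The year spans are approximations (summer 1993 to spring 2020, then spring 2020 to summer 2023), so the rates are indicative only.

```python
# Rough arithmetic on the response-rate figures quoted in the text,
# comparing the long pre-Covid decline with the Covid-era decline.

pre_covid_fall = 79 - 36        # percentage points, summer 1993 -> spring 2020
pre_covid_years = 2020 - 1993   # roughly 27 years
covid_fall = 36 - 15            # percentage points, spring 2020 -> summer 2023
covid_years = 3.25              # roughly three and a quarter years

pre_rate = pre_covid_fall / pre_covid_years    # points lost per year, pre-Covid
covid_rate = covid_fall / covid_years          # points lost per year, Covid era

print(f"pre-Covid decline:  {pre_rate:.1f} points per year")
print(f"Covid-era decline:  {covid_rate:.1f} points per year")
```

On these approximate spans, the decline runs at well under two percentage points per year before Covid, but several times faster afterwards.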
The switch to telephone interviewing has been cited as one reason. However, from April 2021 a field strategy referred to as ‘Knock to Nudge’ (KtN) was introduced. Interviewers visited sampled addresses for which no phone numbers could be obtained and encouraged residents to provide a number and arrange a telephone appointment. This improved response rates, as well as follow-up among people who are otherwise ‘harder to reach’.
Further attempts to compensate included a temporary doubling of survey size.
Chart 2 outlines recent developments. The initial fall in responses was particularly precipitous. The changes made, including KtN and boosting the sample, seem to have stemmed the rate of decline somewhat, but this was temporary. In the ‘post-pandemic’ period, the same rate of fall that occurred in spring 2020 reappeared. The slope of the line post-pandemic is two and a half times as steep as pre-pandemic.
What about elsewhere?
The US Current Population Survey (CPS) also tracks people over time. Its RR had been higher than that of the LFS – nearly 90% in summer 2013 compared to the LFS’s 49%. But it, too, was on a slow downward trajectory – reaching 82% before Covid struck. By August 2020, it was 70%. It recovered thereafter to 80%, but the long-term downward path quickly set in again. After that peak, its rate of decline was three times that of the pre-Covid period, and by summer 2023 it was back down to 70%.
Data from Europe is less clear. Eurostat, which publishes the results of member states’ five-wave labour force surveys, produces its own quality reports. The report covering 2020, the latest available, describes how, as a consequence of Covid, data collection was ‘severely hampered in most countries’. However, any Covid impact appears to have been temporary. Of the four countries whose subsequent interim reports were examined – Germany, France, the Netherlands and Sweden – only Germany’s made any further comment, and merely to report that ‘[d]ue to Covid-19 and technical issues it was not possible to access sufficient information on households who failed to respond.’ In Germany, participation is obligatory.
Consequences and further explanations
The ONS has been sufficiently concerned about LFS quality that it is developing plans for revisions – a Transformed Labour Force Survey (TLFS). This would boost sample size, intensify follow-ups, and concentrate greater efforts on reaching those who are hard to contact. The TLFS is thus likely to be more costly.
Cost concerns might well explain the longer-term deterioration in response rates, but Covid had an additional impact. The resources the ONS had were thrown into its Coronavirus Infection Survey – a survey that relied heavily on FtF contacts. That survey was widely praised. But its costs were huge – £391mn in 2021, and £945mn in total by December 2022. In 2022, the budget for statistical services for the UK government was rather under £500mn. The price of the Covid survey’s ‘success’ might well have been the ‘failure’ of the LFS.
By Bernard H Casey, retired academic, former member of staff, OECD and former consultant to the European Commission. He can be contacted via www.soceconres.eu.