Friday 9 July 2021
They take too long, they cost too much, they don’t lead to change: just some of the strong opinions that public inquiries can generate, from journalists and MPs to academics and practitioners. But what does the public think about them? In fact, how much does the average person actually know about public inquiries?
The government’s announcement of an inquiry into the Covid-19 pandemic makes these questions more relevant than ever; never before have so many people been personally affected by an event which will be subjected to the scrutiny of an independent investigation under the Inquiries Act 2005. In theory, everybody in the UK has an interest in the remit, running and results of this inquiry, scheduled to begin next year.
Consequently, Crest has surveyed a nationally representative sample of almost 1,000 UK adults, testing their knowledge of public inquiries through a series of true or false questions and exploring their views on how public inquiries should be conducted.
The full dataset and charts for this blog are available here.
Below we set out five key findings and consider what the implications may be.
Understanding of how public inquiries work in practice is weak
A majority of our sample knew that an inquiry involves lawyers asking questions: 60% recognised it is a legal process, and the same proportion correctly recognised that it has powers to compel people to give evidence under oath. Nearly two thirds (64%) recognised that a public inquiry is independent of the government. After that, it got pretty hazy.
More than a quarter wrongly thought that an inquiry has a jury, while another third weren’t sure whether it does or not. Nearly half wrongly thought an inquiry can fine people, and nearly as many wrongly believed it can award compensation. Another quarter even thought, wrongly, that inquiries can jail people.
Strikingly, people who told us they understood public inquiries “very well” only answered half the true/false questions correctly on average.
This lack of understanding of the mechanics of an inquiry is perhaps unsurprising; because inquiries exist in response to exceptional events, relatively few people experience one at first hand. But the gap between assumed knowledge and actual knowledge suggests that a significant number of people may have unrealistic expectations of what an inquiry can do and be disappointed over time.
There appears to be a consensus on how inquiries should work in principle
We also explored what people think a public inquiry should prioritise when deciding how it should be run, by asking our sample to score a series of considerations from 1 to 10 (with 1 being the lowest level of priority and 10 the highest level of priority).
These considerations were:
‘timeliness i.e. completing its work quickly’
‘minimising cost to the taxpayer’
‘getting the evidence it needs to reach conclusions’
‘independence from government’
‘gaining the trust and confidence of the people most affected by the event/events’
The chart below shows the percentage of our sample who gave each of these considerations a score of 9 or 10 - meaning they thought it should be a very high priority.
Far fewer people thought ‘timeliness’ and ‘minimising cost to the taxpayer’ should be very high priorities than the other considerations. Critics of public inquiries invariably cite duration and cost as prima facie evidence of weakness in the model. However, the public appear unlikely to consider either to be the most meaningful measures of an inquiry’s performance.
And consider this. We also asked how much people agreed or disagreed with the following statement: ‘A public inquiry should investigate the event or events as thoroughly as possible, even if this means the inquiry takes longer and costs more than was originally anticipated.’
The verdict was clear: 75% agreed, with 45% strongly agreeing. Of the rest, 20% were neutral and only 5% disagreed.
It is understandable that commentators and politicians grow frustrated at the stop-start flow of information from inquiries (and undoubtedly more could and should be done to explain why this is sometimes necessary) and the sums of money involved along the way. But three quarters of the public seem to think it is preferable for an inquiry to spend the time and money it needs to do a proper job rather than cutting corners to meet a predetermined deadline or keep to a budget set in stone.
There was a broad consensus about the ultimate purpose of an inquiry
We asked people to rank the following outcomes for an inquiry in order of importance:
to establish the facts
to make recommendations to prevent a repeat
to help people affected by the event/events feel listened to and get closure
to restore confidence in government or another institution, e.g. the police or the NHS
to apportion blame or responsibility to individuals or organisations
Nearly two thirds (63%) said that the most important outcome was ‘to establish the facts’.
Majorities in all age groups and ethnicities said this was the most important outcome, as did majorities both among those who said they had lost a close relative or friend to Covid-19 and among those who had not.
Making recommendations to prevent the event from happening again was most important for 14% of people, while helping people affected by the event/events feel listened to was most important for 12%. Just 6% put apportioning blame or responsibility at the top.
More than two fifths (43%) said apportioning blame was the least important outcome, while more than a fifth (22%) said it was the second least important. The public - including those who have lost relatives or friends to Covid-19 - appear unlikely to believe a public inquiry is the right arena for a blame game.
The public is supportive of inquiries investing in measures which help people most affected to take part in the process
Sixty-nine percent agreed with inquiries holding hearings at specially designed venues, even if this means the inquiry costs more and takes longer than using a general purpose venue. Seventy percent agreed with inquiries having a forum to consult the people most affected by the event(s) on how to work with them during an inquiry. Sixty-five percent agreed that counselling and other forms of psychological support should be available for witnesses, while 59% agreed these services should be available to other people attending hearings. And nearly half (45%) agreed an inquiry should provide wifi and laptops to the people most affected so they could follow hearings online, with only 18% disagreeing.

This is not a blank cheque, however. More than half (53%) agreed the inquiry has a responsibility to provide best value to taxpayers, even if this means not being able to accommodate all the needs of the people most affected. People expect inquiries to go the extra mile for the bereaved, the traumatised and people harmed in other ways - but not at any cost, it seems.
There is a broad consensus that ethnic diversity among the Chair and Panel is important
Nearly two thirds of people (64%) agreed it was important to ensure ethnic diversity among the Chair and Panel of an inquiry, with women (72%) more likely to think this than men (56%). We found that support for ethnic diversity was higher among non-white respondents (71%) than white respondents (63%) and higher among people who have lost a close relative or friend to Covid-19 (71%) than those who had not (62%).
Overall, our findings raise a number of questions for the Covid-19 inquiry. How will it manage expectations, given interest in its work may outstrip understanding of how it will be done? How will it engage with such a potentially large pool of people without becoming unwieldy or losing focus? And how can it understand the needs of the different groups of people bereaved or harmed by Covid-19 in diverse ways? Fortunately, with a provisional start date of spring 2022, this inquiry may get the time (considerably more time than many other inquiries) to think through these challenges and come up with some practical solutions.
Jon Clements is Crest’s Director of Development and has advised a number of statutory public inquiries in England and Wales.
Fieldwork was conducted between 3 and 10 June 2021. A sample of 996 adults aged 16 and over was recruited through the Dynata panel to be nationally representative of the UK in terms of age, gender and geographical distribution. The survey was written and scripted by Crest Advisory and hosted on the Dynata Insights Platform.