
Getting the message? What adults think about the use by children of end-to-end encryption



Joe Caluori, Head of Policy and Research | Patrick Olajide, Analyst

Monday 20 June 2022

As part of our research into the relationship between the use of social media and serious youth violence, funded by the Dawes Trust, Crest Advisory conducted an online poll. We surveyed a nationally representative sample of 1,011 UK adults about their knowledge of and views on the use of end-to-end encrypted messaging in social media apps by children. Joe Caluori and Patrick Olajide set out the background to the poll, the findings and their significance.

Our research into serious violence has found that there is great unease about the links between social media and serious youth violence among parents and people who work with children, such as teachers, social workers and police officers. They fear that as well as being exposed to familiar dangers which have migrated online, children are facing new threats which adults struggle to understand, let alone protect them from.

The Online Safety Bill, which is being debated in Parliament, may give MPs a once-in-a-generation opportunity to provide children with some protection, by setting new rules for technology companies. The key question is whether legislators can reach a consensus on the balance between the responsibilities of tech companies, parents, law enforcement agencies and regulators when different age groups have such contrasting experiences of using social media.

One problem is the rapid pace of development of social media technology. The philosophy of ‘move fast and break things’ often results in new functionalities being raced onto the market ahead of ethical, moral or legal considerations. An example of that is the introduction of ‘end-to-end encrypted messaging’ (E2EE).

With E2EE, messages are encrypted (sent in code) and are only decrypted on the devices of the sender and the intended recipient. Social media companies cannot read the messages during or after transmission and law enforcement agencies cannot read them without accessing hardware or remotely hacking accounts, which is rarely done outside of counter-terror operations.

Before E2EE, the norm in social media messaging was standard encryption. The contents of messages could be seen only by the senders and recipients but the encryption was managed by the company providing the messaging service so they could access the contents of the messages if they needed to.
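The practical difference can be illustrated with a deliberately simplified sketch. This is a toy illustration only, not real cryptography and not how any actual messaging app works (real E2EE services use protocols such as the Signal protocol, with key exchange and authenticated encryption); the function names are invented for the example. The point it shows is structural: with E2EE, the key lives only on the two devices, so a relay server that carries the ciphertext has nothing it can decrypt.

```python
import hashlib

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Toy XOR stream cipher -- for illustration only, NOT secure."""
    # Derive a deterministic keystream from the key, block by block.
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    # XOR plaintext with the keystream; XOR-ing again reverses it.
    return bytes(p ^ s for p, s in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR cipher: encryption and decryption are the same operation

# The key is shared between the two devices only -- the company's
# relay server carries the ciphertext but never holds the key.
device_key = hashlib.sha256(b"known only to the two devices").digest()

ciphertext = toy_encrypt(device_key, b"meet at 6pm")   # sender's phone
# ...ciphertext passes through the provider's servers unreadable...
recovered = toy_decrypt(device_key, ciphertext)        # recipient's phone
print(recovered)  # b'meet at 6pm'
```

Under standard encryption the service provider manages the keys, so it occupies the position of a party holding `device_key` and can read the messages; under E2EE it occupies the position of the relay, which sees only ciphertext.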

Social media services such as WhatsApp, Messenger (for both Facebook and Instagram), Snapchat, Telegram and Signal already offer end-to-end encrypted messaging services, lauding the benefits of privacy and security. Some services have a minimum age for those wishing to use them: for WhatsApp, Facebook and Instagram, it is 16; for Snapchat it is 13. But Telegram and Signal do not appear to state the minimum age of their users.

Senior figures with responsibility for law enforcement, including the Home Secretary, Priti Patel, Metropolitan Police Assistant Commissioner Neil Basu and Rob Jones, interim Director General of the National Economic Crime Centre, have warned of the impact of E2EE messaging on counter-terror policing and preventing and detecting child sexual abuse. But there has been comparatively little discussion about the risk that E2EE messaging could lead to a rise in serious youth violence. We conducted a survey to find out how adults in the UK feel about the use of social media by children - just as their elected representatives make critical decisions in Parliament on what should and should not be included in the Online Safety Bill.

Public views

In keeping with other studies, we found that smartphone and tablet ownership among children and young people is the norm, even for those of primary school age. Of parents with children under 11 who answered our survey, 75% said their child owns a personal smartphone or tablet. That rises to 96% among parents with secondary school-aged children (11-16).

When asked how many hours a day children and young people aged 11-16 should be allowed access to the internet through their device, the most common response was a limit of two hours, selected by just over a quarter of respondents (26%). Interestingly, just 16% of parents with children aged 11-16 selected this option, perhaps reflecting the realities of enforcing limits with teenagers. [1]

Parental controls

We wanted to find out how parents try to control their children’s use of social media. Just over half of parents (51%) said that they held the registration of their child’s device, with the same proportion saying they had installed an app that allowed them to control their child’s access to the web. Just under half (49%) said they manually check their child’s device; 39% said they take away their child’s devices at certain times; 16% said they turn off the home wifi to prevent web access.

In our wider research programme into the relationship between social media and violence we found that parents and professionals working with children have little if any knowledge of what end-to-end encryption is - let alone which apps use it and which do not. We showed survey respondents a brief description of E2EE [2] before asking whether children under 16 should have access to this technology. Opinions were sharply divided: 38% of all adults polled agreed under 16s should be able to use E2EE; 39% disagreed, though it’s worth noting the relatively high number, 22%, who answered ‘don’t know’.

Surprisingly, parents were more likely to agree that children under 16 should be able to use E2EE messaging (50%), as were adults under 35 (52%). But it is significant that a large majority of adults (75%) agree that if children under 16 have unrestricted access to end-to-end encrypted messaging technology it will “increase the risk of them being groomed and sexually or criminally exploited” with two-thirds saying “arguments and disputes on social media apps between children are more likely to get out of control and result in violence”.

It seems most parents are realistic about their children using E2EE messaging, even though they acknowledge the risks of doing so. That could be because parents believe their own child won’t be adversely affected, an explanation that emerged during focus groups we conducted.

However, even parents who say children should not be able to use E2EE are apparently unable to stop their own child from doing so: 53% of parents in this category report that their children use apps which offer end-to-end encrypted messaging.

Rights and responsibilities

Online harms are a high stakes issue for parents. According to our poll a worrying 43% said their child had seen online content connected to bullying. Almost a third (31%) said their child had seen content showing fights involving children; around one in five (19%) said that their child had seen threats to beat up another child or content involving weapons such as knives (18%); 14% said their child had seen content about gangs.

That raises the question: what do parents and the public think is the best overall approach to keeping children safe when they use social media?

In our survey, views were split, but Government legislation to impose a ban on children using E2EE was the top choice (35%). Nearly a quarter of all adults (24%) said social media companies should police their own products, with the same percentage saying parents should take responsibility by using technological tools to control their children’s use of social media. Almost one in five (18%) preferred education for parents, carers and children. [3]

Responses from parents were similar to those for all adults.

What are the attitudes that underpin these views? We tested public statements by prominent groups and individuals to see how favourably they were received.

There was strong overall support (85%) for social media companies having a ‘duty of care’ towards children using their products, with 61% ‘strongly’ backing such an approach. But an overwhelming majority (78%) agreed that social media companies ‘cannot be trusted’ to keep children safe online because they put commercial considerations before safeguarding. Just 27% said there are already strict laws in place to prevent harmful activity so additional measures are not required. That points towards a high level of potential support for legislation and regulation over self-policing or leaving it to parents.

However, when we tested statements saying E2EE should not be banned for privacy reasons, almost two-thirds agreed (64%), with 59% opposing a ban on civil liberty grounds.

Attitudes towards social media, and E2EE in particular, are confused. The benefits of E2EE are clear to adults and they are reluctant to lose them - but they do not trust social media companies to police themselves in order to protect children.

The Online Safety Bill

As part of our survey, we canvassed views on a range of potential measures which could be included in the Online Safety Bill. Usually, when the public are asked to rate policy proposals without them having to consider what kind of contribution or ‘trade off’ they will have to make, there will be high levels of support, so these poll findings should be treated with caution. However, in the world of social media, where users naturally expect free access, it was a useful exercise nonetheless.

The findings suggest that the Government may find favour for measures in the Bill which take an interventionist approach to the use of social media by children. Over half of the adults polled (51%) backed a requirement to allow users to prevent anonymous or unauthorised accounts from contacting people online. A similar proportion (48%) strongly agree that there should be a minimum age for users backed by software such as facial recognition which can carry out checks - mandatory age verification. Nearly two in five (39%) strongly agree the legislation should force companies to embed computer code in their systems to prevent content from being shared in bulk or block virtual private networks (VPNs), through which encrypted content can be exchanged between individuals and within groups.

Despite civil liberty and privacy concerns expressed in earlier questions, 38% back allowing software which uses artificial intelligence, AI bots, to read messages and search for and report harmful content. This proposal has strong support (46%) among parents of children aged 11-16. Nearly half of adults (47%) would allow police access to material on request from social media companies, with 48% strongly agreeing with the idea of taxing their profits to fund police activity online.

Over half (51%) strongly agree that the law should consider material sent using E2EE messaging in the same way as harmful content shared on open media sources; very few disagreed. That is potentially significant. If the law is changed in this way, so that the regulator Ofcom is tasked with overseeing content that is harmful, but not necessarily illegal, that will affect the business case for tech companies. They may have to tighten up their use of E2EE by children so that children do not see such material. One difficulty for MPs as they consider these possible measures is defining what is and isn't ‘legal’ (especially around free/hateful speech) and what is ‘harmful’: that depends on the context and the characteristics of those accessing such content. [4]


Our poll indicates that the public are divided over whether or not children should be able to use E2EE and how to keep them safe when they use social networks. But people are apparently willing to put privacy and civil liberty concerns aside to give the Government responsibility for ensuring there are safeguards; that is probably because of low levels of trust in the companies themselves.

But social media companies are unlikely to want to invest huge sums in technology to enforce new rules on the use of E2EE by children. Nor will they be keen to allocate funds to pay substantial fines they might incur as a penalty for allowing ‘harmful’ content in messages. Therefore, moves to constrain the use of E2EE through legislation will be strongly resisted by the tech sector. If new rules are introduced, there is also the question of how feasible it will be to enforce them, particularly if tech firms don’t install the software to identify harmful content, if policing and the regulator lack sufficient resources to investigate and if parents don’t know what they are looking for and how to report it.

As the Online Safety Bill continues its passage through Parliament, MPs will need to ask searching and detailed questions of social media giants to satisfy themselves that children in the future can safely use end-to-end encryption. In time, E2EE may prove to be an acid test for the ability of politicians to ‘future proof’ legislation governing the use of new communications technologies that offer great opportunities but also risk creating unintended and unforeseen threats.

Crest’s final report into the relationship between social media and violence will be published in July 2022.

To keep up-to-date with all our reports, analysis and insights sign up for our newsletter.

BBC Panorama interviewed Joe about this research as part of the documentary A Social Media Murder: Olly's Story, which you can watch on BBC iPlayer.



[1] Full question: For how many hours on a typical day do you think a child between the ages of 11-16 should be allowed to access the internet on their device(s)?

[2] ‘End-to-end encryption’ is a technology which allows users to send messages through social media apps and other messaging apps securely, so the content of messages can only be viewed by the sender or the receiver(s) of messages. No third party can view the content, including the company who owns the social media platform or the police.

[3] Full question: What in your opinion is the best overall approach to keeping children safe when they use social media?

Do you think any of the other possible approaches suggested should also be implemented?

  • Pass a law to ban tech companies from allowing children to use end-to-end encrypted messaging. If the law is broken then the companies responsible will face significant fines.

  • Social media companies should police their own products to prevent the harmful use of messaging involving children.

  • Parents and carers should buy technological tools such as parental control apps to limit use of social media by their children.

[4] Full question: The Government has announced an ‘Online Safety Bill’ which would include new laws for social media companies if it is agreed by Parliament. To what extent do you agree or disagree that the following proposals should be included in the Bill?

Full proposals:

  • Mandatory option for social media users to prevent anonymous or unauthenticated accounts from contacting them online.

  • Mandatory age verification for users backed by technology such as facial recognition or other forms of authentication.

  • Social media companies should have to give the police access to the content of end-to-end encrypted messages on request.

  • The law should treat harmful content shared through end-to-end encrypted messages in the same way as harmful content which is shared on ‘open’ social media sources which can be viewed by followers.

  • Media literacy campaigns on the message “report don’t share” in respect of illegal content.

  • Artificial intelligence ‘bots’ programmed by tech companies should search end-to-end encrypted messages to identify, delete and report harmful content.

  • Design features built into the social media platforms themselves, such as limiting the sharing of material or refusing access via virtual private networks (VPNs).

  • A tax on the profits of social media companies should be used to fund additional specialist police officers and analysts to keep children safe online.
