
Online safety laws vs privacy – it’s all kicking off in the UK

The UK’s Online Safety Bill, which aims to make social media safer for users, is causing concern among app developers over potential privacy breaches and the undermining of encryption. The leaders of major messaging apps are threatening to withdraw from the UK unless the bill is revised.
by Usercentrics
Jun 7, 2023

Is the UK government about to undermine encryption in apps? Many developers think so. And they are not happy.

In 1952, the United Nations decided to make an important statement about a grave danger facing the world. The danger was, of course, Superman.


The UN published a report that concluded: “By undermining or warping the traditional values of each country, the Superman myth is becoming a kind of international monster.”


It sounds ridiculous now. But every generation has its moral panic about the newest evil that is endangering its children. In the 1950s it was comic books; in the 1980s, video nasties. Today, the threat is coming from digital content and social media.


Now, we’re not saying whether this concern is correct or misplaced. But there’s no doubt that the clamour for action is rising. Governments and regulators all over the world are being urged to ‘do something’.


In many regions, they are. It then falls to content creators and developers to adapt to these new laws. And it is becoming clear that many of them are not prepared to. They are fighting back.


The most vivid example comes from the UK, which is currently in the process of passing its Online Safety Bill.


Here’s how the government describes the regulation:


The Online Safety Bill is a new set of laws to protect children and adults online. It will make social media companies more responsible for their users’ safety on their platforms.


It will protect children by making social media platforms:

  • Remove illegal content quickly or prevent it from appearing in the first place. This includes removing content promoting self-harm
  • Prevent children from accessing harmful and age-inappropriate content
  • Enforce age limits and age-checking measures
  • Ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • Provide parents and children with clear and accessible ways to report problems online when they do arise

Encryption removal by the back door? The fightback begins


Many of the proposals are uncontroversial. But not all. For example, there is the question of what counts as ‘harmful’ content. How can this be defined? Who defines it? And won’t it simply mean that social media firms take the safe option and ban anything they have the slightest worry about?


But there’s another equally problematic issue: encryption.


Here’s the challenge. In many popular chat apps, messages are end-to-end encrypted, so they can’t be read if they are intercepted in transit – or even by the company running the service. This is a key benefit of these products, since it protects users’ privacy.
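
To make that benefit concrete, here is a minimal sketch of the idea behind end-to-end encryption, written in Python with the PyNaCl library. It is an illustration only: real messaging apps layer far more on top (the Signal protocol, forward secrecy, key verification), and the message text here is invented.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# Illustration only: real apps add much more (forward secrecy, key
# verification, group messaging), but the core property is the same.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; only the public halves are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message that only Bob can decrypt.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at 6?")

# The service in the middle only ever sees `ciphertext`, which it cannot read.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"Meet at 6?"
```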


While the bill does not propose an outright ban on encryption, it does allow the UK telecoms regulator, Ofcom, to demand that messaging apps use ‘client-side scanning’ technology – software that inspects message content on the user’s device, before it is encrypted, to detect illegal images.
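
What would client-side scanning look like in practice? Roughly, content is checked on the user’s device before it is ever encrypted. The sketch below is purely hypothetical: the hash list, the reporting hook and the use of plain SHA-256 are all invented for illustration (real proposals tend to rely on perceptual image hashes), but it shows why critics argue the scanning step, not the encryption, becomes the weak point.

```python
# Purely illustrative sketch of client-side scanning: content is checked on
# the user's device *before* it is encrypted. The hash list and report hook
# are hypothetical; real proposals typically use perceptual hashes rather
# than plain SHA-256, but the privacy concern is the same either way.
import hashlib

# Fingerprints that would be distributed to the device by some authority
# (hypothetical, left empty here).
KNOWN_ILLEGAL_HASHES: set[str] = set()


def report_match(digest: str) -> None:
    # In a real deployment this hook would notify the provider or an
    # authority; it is exactly this step that critics say breaks the
    # end-to-end promise.
    print(f"match reported: {digest}")


def scan_before_encrypt(attachment: bytes) -> bool:
    """Return True if the attachment may be encrypted and sent."""
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in KNOWN_ILLEGAL_HASHES:
        report_match(digest)
        return False  # block or flag the message instead of sending it
    return True
```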


It’s clear that the messaging app companies see this as encryption removal by the back door. They’re not happy. Matthew Hodgson, CEO of the messaging app company Element, spelled out the problem.


He said: “Adding any content scanning technology to a secure messaging platform fatally undermines its end-to-end encryption – you cannot have communication which is both private and scanned by a third party system: it is a contradiction in terms.”


In April, Hodgson teamed up with the heads of OPTF/Session, Signal, Threema, Viber, WhatsApp and Wire to articulate their concerns in an open letter.


They said: “The UK government is currently considering new legislation that opens the door to trying to force technology companies to break end-to-end encryption on private messaging services. The law could give an unelected official the power to weaken the privacy of billions of people around the world.


“We don’t think any company, government or person should have the power to read your personal messages and we’ll continue to defend encryption technology.”

A different internet for the UK?


This is the heart of the matter. The proposals assume that the ability to read encrypted messages will only be used for good. But in reality, any such capability would likely also be exploited by cybercriminals, rogue employees, abusive spouses and others.


Weeks after submitting their open letter, the signatories went further and stated that they would be willing to withdraw from the UK if the law is not amended. This raised the prospect of a UK-only internet that is different from the rest of the world’s.


Will Cathcart, head of WhatsApp, told the Guardian: “98 percent of our users are outside the UK. They do not want us to lower the security of the product, and it would be odd for us to (do that) in a way that would affect those 98 percent of users.”


The dispute around the UK Online Safety Bill is yet another example of the rising influence of regulators in the app space.


Now, as it happens, this legislation is a little different from what has gone before. Why? Because the UK bill threatens to open up access to private data, whereas most previous regulation has sought to protect it. Examples include the California Consumer Privacy Act (CCPA), Brazil’s General Data Protection Law (LGPD), the ePrivacy Directive and, of course, the European Union’s General Data Protection Regulation (GDPR).


Still, the controversy serves as a reminder to developers to be mindful of the need for compliance at all times.

Ignore regulation at your peril


This is, of course, where Usercentrics can help. We know that failure to handle mobile app consent properly can lead to a breakdown in trust between consumer and developer. It can also lead to significant financial penalties. That’s why we recommend that developers use a consent management platform.

We also recommend observing some important best practices, such as the following (a minimal sketch of the first two points appears after the list):

  • Present your consent request at the right moment
  • Give the consumer the choice to decline
  • Use clear, friendly language
  • Use disclosure prompts that look like your app, not like the standard iOS system dialog
  • Be transparent, clear and specific
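
To illustrate the first two points, here is a hypothetical sketch of consent gating in Python. The file name and function names are invented for illustration and are not the Usercentrics SDK or any real API; the principle is simply that nothing which processes personal data runs until the user has made an explicit choice, and the absence of a recorded choice is treated as a ‘no’.

```python
# Hypothetical sketch of consent gating; the file name and functions are
# invented for illustration and are not the Usercentrics SDK or any real API.
import json
from pathlib import Path

CONSENT_FILE = Path("consent_choices.json")


def record_choice(purpose: str, granted: bool) -> None:
    """Persist the user's decision, including an explicit 'declined'."""
    choices = json.loads(CONSENT_FILE.read_text()) if CONSENT_FILE.exists() else {}
    choices[purpose] = granted
    CONSENT_FILE.write_text(json.dumps(choices))


def has_consent(purpose: str) -> bool:
    """Only an explicit, recorded 'yes' counts; no record means no consent."""
    if not CONSENT_FILE.exists():
        return False
    return json.loads(CONSENT_FILE.read_text()).get(purpose) is True


def start_analytics_if_permitted(start_sdk) -> None:
    """Defer initialising any tracking SDK until consent has been granted."""
    if has_consent("analytics"):
        start_sdk()
```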

Regulation that impacts app developers is on the rise all over the world. But achieving privacy compliance need not be a headache. Usercentrics can help. So talk to one of our experts.
