The United Kingdom has acknowledged technological challenges in its planned crackdown on illegal online content after encrypted messaging providers such as WhatsApp threatened to withdraw their services from the country.
Ofcom can only require tech companies to scan platforms for unlawful content, such as images of child sexual abuse, if doing so is "technically feasible," culture minister Stephen Parkinson told the House of Lords on Wednesday as the chamber examined the government's Online Safety Bill. He said the regulator would work closely with businesses to develop and source innovative solutions.
"If appropriate technology that meets these requirements does not exist, Ofcom cannot require its use," Parkinson added. Ofcom "cannot compel companies to use proactive technology on private communications in order to comply" with the bill's safety requirements.
The remarks aim to ease internet companies' fears that scanning their platforms for unlawful content could jeopardize user privacy and encryption, giving hackers and spies access to private communications. In March, Meta Platforms' WhatsApp threatened to pull out of the UK.
"Today appears to be a case of the Department for Science, Innovation, and Technology offering some wording to messaging companies in order for them to save face and avoid the embarrassment of having to back down from their threats to leave the UK, their second largest market in the G7," said Andy Burrows, a tech accountability campaigner who previously worked for the National Society for the Prevention of Cruelty to Children.
Child Protection
The six-year legislative effort to make the internet safer is nearing completion in Parliament. Parkinson said Ofcom would be empowered to order companies to "develop or source a new solution" to comply with the law.
"It's right that Ofcom should be able to require technology companies to use their considerable resources and expertise to develop the best possible protections for children in encrypted environments," he added.
Meredith Whittaker, president of encrypted messaging app Signal, had earlier welcomed a Financial Times report that the government was backing down from its standoff with technology companies, which cited anonymous officials as saying there is no service that can scan messages without jeopardizing privacy.
However, security minister Tom Tugendhat and a government spokeswoman both stated that the policy had not changed.
Feasibility
"As has always been the case, as a last resort, on a case-by-case basis, and only when stringent privacy safeguards have been met, it will enable Ofcom to direct companies to either use or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content - which we know can be developed," the spokeswoman said.
On Tuesday, ministers met with representatives from major technology companies such as TikTok and Meta in Westminster.
The government has previously used language about technical feasibility. Parkinson told Parliament in July that "Ofcom can only require the use of technology on an end-to-end encrypted service when it is technically feasible."
The National Society for the Prevention of Cruelty to Children, a prominent supporter of the UK crackdown, said the government's statement "reinforces the status quo in the bill and the legal requirements on tech companies remain the same."
Accredited Technology
Ultimately, the text of the bill leaves it to the government to determine what is technically feasible.
According to the July draft of the legislation, once the bill becomes law, Ofcom can issue a notice ordering a company to "use accredited technology" to identify and remove child sexual abuse or terrorist content, or face fines. Because the process of identifying and accrediting technology does not begin until the bill becomes law, no accredited technology currently exists.
Previous attempts to solve the problem have focused on so-called client-side or device-side scanning. However, Apple Inc. shelved such a system, which would have scanned photographs on smartphones for signs of child sexual abuse, in 2021 after strong opposition from privacy groups who argued it would pave the way for other forms of surveillance.
According to Andy Yen, founder and CEO of privacy-focused VPN and messaging company Proton, "as it stands, the bill still permits the imposition of a legally binding obligation to ban end-to-end encryption in the UK, undermining citizens' fundamental rights to privacy, and leaves the government defining what is 'technically feasible.'"
"For all the good intentions of today's statement, without additional safeguards in the Online Safety Bill, all it takes is a future government to change its mind and we're right back where we started," he added.