The amendment to the Online Safety Bill will require tech companies to use their “best efforts” to develop new technology that identifies and removes child sexual abuse and exploitation (CSAE) content.

It comes as Mark Zuckerberg’s Facebook Messenger and Instagram apps prepare to introduce end-to-end encryption amid strong opposition from the UK government, which has branded the plans “unacceptable”.

Priti Patel, a longtime critic of Zuckerberg’s plans, said the law change balanced the need to protect children with the privacy of online users.

The Home Secretary said: “Child sexual abuse is a sickening crime. We must all work to ensure that criminals are not allowed to run wild online and tech companies must play their part and take responsibility for keeping our children safe.

“Privacy and security are not mutually exclusive – we need both, and we can have both, and that’s what this amendment delivers.”

Child safety campaigners have warned that strong encryption will prevent law enforcement and technology platforms from viewing illegal messages by ensuring that only the sender and the recipient can see their content – a process known as end-to-end encryption.

However, officials said the amendment is not an attempt to stop the rollout of such services, and that any technology developed must be effective and proportionate.

Zuckerberg’s Meta business, which also owns the encrypted messaging service WhatsApp, has delayed introducing encryption on Messenger and Instagram until 2023.

Screening private messages for child abuse material has proved controversial, with critics warning of negative consequences for users’ privacy.

One controversial method that could be considered by Ofcom, the communications watchdog overseeing the implementation of the bill, is client-side scanning. Apple has delayed plans to introduce the technology, which would involve scanning images on users’ devices for child sexual abuse material before they are uploaded to the cloud. The company had proposed a technique that would compare photos against known child abuse images when users chose to upload them to the cloud.

Under the proposed amendment, Ofcom will be able to require technology companies to develop or deploy new technology that can help find abusive material as well as stop it from spreading. The amendment strengthens an existing clause in the bill which already gives Ofcom the power to require the deployment of “accredited technology”. The change will now require companies to use their “best efforts” to develop or deploy “new” technology if the existing technology is not suitable for their platform.

If a company does not comply, Ofcom will be able to impose fines of up to £18m or 10% of a company’s global annual turnover – whichever is higher.

The Online Safety Bill returns to parliament next week after being scrutinised by a committee of MPs and is expected to become law around the end of the year or in early 2023.

There are between 550,000 and 850,000 people in the UK who pose a sexual risk to children, according to the National Crime Agency.

“We need tech companies to be there on the front line with us and these new measures will ensure that,” said Rob Jones, NCA director general for child sexual abuse.
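
For readers unfamiliar with the mechanism at the centre of the dispute, the sketch below illustrates the principle of end-to-end encryption described above: only the sender and the intended recipient hold the keys needed to read a message, so the platform relaying it cannot inspect the content. It is a minimal illustration using the open-source PyNaCl library and is not a representation of Meta’s or WhatsApp’s actual implementation.

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"hello")

# The server only ever sees `ciphertext`. Only Bob, holding his private key,
# can recover the original message.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"hello"
```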
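
The client-side scanning approach mentioned above broadly works by checking images on the user’s device against a database of known abuse material before they are uploaded. The sketch below is a deliberately simplified illustration using exact SHA-256 hashes and a hypothetical hash list; real proposals, including Apple’s, rely on perceptual hashing and privacy-preserving matching rather than plain file hashes.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse images, as might be supplied
# by a child-protection body. Placeholder value for illustration only.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def flagged_before_upload(image_path: Path) -> bool:
    """Return True if the image's hash matches a known entry, meaning the
    client would block or report the upload instead of sending the file."""
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES
```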