The European Commission wants to put communications on messaging apps under constant surveillance
Threat to privacy: the stated aim is preventing child abuse, but artificial intelligence would end up monitoring all users.
The European Commission is proposing a regulation that would require service providers to monitor all photos and videos sent in private communications and oblige them to report images that could be linked to child abuse. However, such a plan would require all private conversations to be constantly monitored.
The debate around the regulation has been a complex one, pitting the need to stop child abuse against the right to privacy, writes The Brussels Times. Although the vote on the proposal was postponed indefinitely in June, it is now back on the agenda.
The European Commission proposed in 2022 to introduce a regulation - also called 'chat control' - to help prevent and combat Child Sexual Abuse (CSA). The plan would oblige technology companies to identify and report child abuse images (photos, videos, etc.) in private communications. The proposal was rejected by the European Parliament, but the then Belgian presidency reintroduced an amended version. Although Belgium withdrew the proposal from the agenda in the last days of its presidency, the current Hungarian presidency has reintroduced the plan.
The original proposal called for service providers to be obliged to use artificial intelligence to check messages. Privacy campaigners feared that the algorithm could easily misjudge whether an image actually depicts abuse. Parents who send pictures of their children to each other, for example, could face accusations of sharing illegal material.
The current option, according to Politico, is a slight and temporary mitigation, where the use of AI is voluntary for service providers and does not oblige them to search for new material.
However, automated searches of private communications, including end-to-end encrypted conversations, would still be required, so the proposal still threatens privacy and could effectively break end-to-end encryption. Notably, messaging platforms such as WhatsApp and Signal would have to check uploaded content before a message is encrypted and sent. This would apply to every private message sent by EU citizens, in order to identify any images that may be associated with child sexual abuse. Users would have to consent to such monitoring; those who refuse would no longer be able to send photos or videos.
The images found would be compared against a database of known abuse images that the EU is already building, which would include "indicators of online child sexual abuse that providers will be required to use to comply with the detection obligations". The databases are expected to be in place before the chat control regulation takes effect.
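The scheme described above can be sketched as follows. Everything in this snippet is illustrative: the indicator set, the function names, and the use of exact cryptographic hashes are all assumptions for clarity - deployed matching systems typically use perceptual hashes (e.g. PhotoDNA-style), which tolerate re-encoding and resizing, and which is precisely where the false-positive risk critics cite comes from. The key point is that the check runs on the plaintext before any end-to-end encryption is applied.

```python
import hashlib

# Hypothetical stand-in for the EU indicator database the article
# describes; real systems would use perceptual rather than exact hashes.
KNOWN_INDICATORS = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def client_side_scan(image_bytes: bytes) -> bool:
    """Return True if the image matches a known indicator.

    Note: this runs on the unencrypted image *before* the message is
    encrypted and sent, which is why critics argue the scheme defeats
    the purpose of end-to-end encryption.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_INDICATORS

def send_image(image_bytes: bytes) -> str:
    if client_side_scan(image_bytes):
        return "reported"          # flagged and reported, not delivered
    return "encrypted-and-sent"    # placeholder for the normal E2E path

print(send_image(b"example-known-image-bytes"))  # reported
print(send_image(b"holiday-photo"))              # encrypted-and-sent
```

Because the scan happens client-side on plaintext, encrypting the channel afterwards offers no protection against the scan itself - this is the structural objection raised by Signal and privacy campaigners.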
The proposal also includes an obligation for providers of web hosting or inter-personal communication services to assess the risks of such material on their platform, to put in place mitigation measures and to report the results to the relevant EU authority. Software application stores would also be obliged to assess whether any of the applications they offer could be used to solicit children for sexual purposes.
The regulation would also be accompanied by the establishment of a European Centre to prevent and counter child sexual abuse (‘the EU Centre’), which would help service providers comply with their obligations, enforce the rules, and identify, remove and report material. Among other things, the EU Centre would facilitate access to detection technology and make available the indicators (a database) of abuse. In turn, service providers, in cooperation with the EU Centre, would have to contribute to the development of accurate and reliable technologies to identify new child sexual abuse material.
All of these proposals would create a situation equivalent to one where there is no secrecy of correspondence, i.e. each letter is checked before it is sent, and the postal service that forwarded the letter would have the obligation to assess whether its service could be used illegally. This is the dream of authoritarian or dictatorial regimes, where, under the pretext of fighting heinous crimes, citizens can be monitored and controlled and any form of dissent can be nipped in the bud.
If the chat control regulation is adopted in its 'new form', ordinary citizens will probably be deprived of secure encrypted messaging, and that will mean the end of privacy. The messaging app Signal, for example, has written that end-to-end encryption either protects everyone or no one, and that in the current complex geopolitical circumstances, breaking it would be catastrophic.
The goal of combating child sexual abuse is a noble one, but regulation in this form sets a dangerous precedent and does not serve the original purpose of preventing and combating child abuse, as it would drive those involved to act even more covertly.
How does discovering images of children go any way towards eliminating or diminishing the ACTUAL abuse of children?
Is it not more likely to work if law enforcement officers, posing as users of such images, infiltrate the online posters of said images, with the power to locate and arrest the persons directly in their various jurisdictions?
Literally nothing whatsoever to do with combatting child abuse. Only the dumb NPCs believe this (70% of the population).