2025-09-25 · 17 mins
The European Union wants to force tech companies to scan your private messages & images, even in your favorite encrypted apps.
The 🇪🇺 European Union is advancing legislation that could fundamentally change how we communicate online. ChatControl would require all messaging platforms to automatically scan their users' private messages and images.
Yes, even encrypted ones like Signal, WhatsApp and Telegram. No, you can't opt out.
This isn't just another privacy policy update you can ignore. If passed, this EU regulation (a regulation is the strongest and most binding legal instrument in EU law) would apply automatically in all member states, with no wiggle room for national interpretation. It would even override constitutional protections for communication privacy and establish unprecedented mass surveillance of private communications.
The official justification? Fighting child sexual abuse material (CSAM). Protecting children is undeniably crucial, but the proposed methods would eliminate digital privacy for 450 million Europeans and set a global precedent for mass surveillance.
This surveillance trend extends beyond Europe: 🇨🇭 Switzerland is advancing metadata retention requirements, the 🇬🇧 UK is implementing comprehensive age verification systems and now the 🇪🇺 EU proposes to scan every private message. Each initiative is positioned as child protection policy, but the implications reach far beyond their stated goals.
ChatControl is what critics call the EU's proposed Regulation to Prevent and Combat Child Sexual Abuse, also known as CSAR (Child Sexual Abuse Regulation).
The proposal builds on surveillance techniques already deployed by major tech companies. Meta analyzes all Facebook Messenger conversations and unencrypted WhatsApp data (profile photos, group descriptions). Apple announced similar scanning for iCloud content in 2021, though they later suspended the program.
This turns voluntary corporate surveillance into mandatory government-ordered scanning. A temporary 2021 EU regulation allowed platforms to scan content voluntarily for three years. That authorization expired in 2024, which is why CSAR was proposed. The temporary regulation merely permitted scanning; CSAR would make detection obligatory under certain conditions.
There's also the Roadmap for Lawful Access to Data, which has an even bigger goal: making all our digital data readable by authorities upon request. We'll dive deeper into this broader surveillance agenda later.
CSAR casts an extremely wide net. The regulation would apply to all interpersonal communication service providers, not just obvious targets like Signal, WhatsApp, or Telegram, but also:
Email providers
Dating apps
Gaming platforms with chat features
Social media platforms
File hosting services (Google Drive, iCloud, Dropbox…)
App stores
Even small community hosting services run by associations
This means virtually any digital service that allows people to communicate or share content would fall under surveillance requirements. The scope extends far beyond what most people imagine when they hear "messaging apps".
ChatControl relies on Client-Side Scanning. Your device becomes a monitoring station that analyzes your content before encryption happens.
This represents a fundamental shift away from targeted surveillance based on court orders or reasonable suspicion. Unlike airport security (where you consent to specific, limited searches for immediate safety), ChatControl would automatically scan all private communications of all citizens, all the time.
This effectively reverses the presumption of innocence by treating everyone as a potential criminal. It's the digital equivalent of permanently installing monitoring devices in every home and office, well… just in case.
How does ChatControl work?
The system would automatically scan for three categories of content before encryption:
Known illegal content: Images or videos already catalogued by authorities as CSAM. Your device creates hash fingerprints of your content and compares them against databases of known illegal material.
Unknown potential content: Photos or videos that might constitute CSAM but haven't been previously identified. AI algorithms analyze visual elements (like exposed skin) to flag potentially problematic content based on statistical models.
Grooming behavior: Text analysis using AI to identify communication patterns that match predefined indicators of adults soliciting children. This involves scanning the actual content of your private conversations.
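The "known content" check above can be sketched in a few lines. This is a minimal illustration with a hypothetical hash database; real deployments use perceptual hashes such as Microsoft's PhotoDNA, which match visually similar images rather than exact bytes, but the matching step looks the same.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal material.
# A cryptographic hash is used here purely to illustrate matching;
# real systems use proprietary perceptual hashes that survive
# resizing and re-compression.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(content: bytes) -> str:
    """Compute a hash fingerprint of a file's raw bytes."""
    return hashlib.sha256(content).hexdigest()

def is_known_match(content: bytes) -> bool:
    """Compare the fingerprint against the database of known material."""
    return fingerprint(content) in KNOWN_HASHES
```

The key property: the device never needs the database of actual images, only their fingerprints, which is why this check can run locally on every phone.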
If something gets flagged, it automatically gets reported to authorities. No human checks it first; that would be impossible given the billions of daily messages. This would be mandatory for all messaging platforms in 🇪🇺 Europe.
Claims that client-side scanning is compatible with encryption are misleading.
ChatControl doesn't break encryption, it bypasses it entirely. While your messages still get encrypted during transmission, the system defeats the purpose of end-to-end encryption by examining your content before it gets encrypted. True E2EE means only you and your recipient can read messages: no government, no company, no algorithm should peek inside. This surveillance violates that principle by inserting monitoring at the source.
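The ordering is the whole problem, and it can be shown in a few lines. Every name below is illustrative, not any real app's API: the point is simply that the scan step runs on the plaintext, so the cipher's strength is irrelevant to what gets flagged.

```python
# Minimal sketch of why client-side scanning sits outside E2EE: the
# detector runs on the plaintext *before* the encrypt step, so the
# encryption never protects you from the scanner.

def scan(plaintext: bytes) -> bool:
    """Stand-in for the on-device detector."""
    return b"flagged-pattern" in plaintext

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Stand-in for E2EE (repeating-key XOR as a placeholder cipher)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

def send_message(plaintext: bytes, key: bytes) -> bytes:
    if scan(plaintext):              # 1. scanning sees the cleartext
        print("reported to authorities")
    return encrypt(plaintext, key)   # 2. encryption happens afterwards
```

Swap in AES and Signal's ratchet for the placeholder cipher and nothing changes: step 1 still happens first.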
Privacy-focused companies like Proton point out this approach might be worse than encryption backdoors. Backdoors give authorities access to communications you share with others. This system examines everything on your device, whether you share it or not.
Your encrypted messaging app becomes spyware. Supporters claim this protects privacy because scanning happens locally, but surveillance built into your device makes it impossible to escape.
The proposal would create a centralized EU Centre on Child Sexual Abuse to receive all reports, but EU institutions wouldnโt control the scanning technology itself.
Service providers would face additional obligations beyond scanning. They would need to conduct risk assessments to evaluate and minimize the potential for illegal content sharing on their platforms. This requires collecting detailed information about their users (age groups, content types) that many privacy-focused services deliberately avoid gathering.
The regulation also pushes for mandatory age verification systems. While some privacy-preserving age verification concepts exist (like zero-knowledge proofs), no viable, scalable solutions have been deployed at internet scale. Current implementations either compromise user privacy through identity collection or lack the accuracy needed for legal compliance.
Rules for thee, but not for me: while ordinary Europeans would have all their messages scanned, the proposed legislation includes exemptions for government accounts used for "national security purposes, maintaining law and order or military purposes". Convenient.
ChatControl fits into a broader political strategy. Since the crypto wars of the 1990s, certain states have argued that privacy-protecting technologies, especially encryption, obstruct police investigations. These technologies are designed to do exactly that: protect everyone's ability to control their expression and communication.
The European Commission's Roadmap for Lawful Access to Data aims to make all digital data accessible to authorities by 2030. This involves systematically weakening encryption rather than simply bypassing it.
Edward Snowden's revelations over a decade ago led to widespread adoption of encryption and an institutional consensus supporting the right to encrypted communication. But governments remain frustrated by their inability to access private communications. We're seeing a return to authoritarian positions that use terrorism, organized crime and child exploitation as justifications for undermining encryption.
🇩🇰 Danish Minister of Justice Peter Hummelgaard, chief architect of the current ChatControl proposal, recently stated: "We must break with the totally erroneous perception that it is everyone's civil liberty to communicate on encrypted messaging services." Well, there you have it folks: encrypted communication isn't a civil liberty anymore. You cypherpunks were wrong all along. /s
Similarly in 🇫🇷 France, both Bernard Cazeneuve and Emmanuel Macron have explicitly stated their desire to control encrypted messaging, seeking to pierce the privacy of millions who use these services.
CSAR provides the perfect opportunity for member states to finally design and implement a generalized surveillance tool for monitoring population communications. Crossing this threshold means eliminating all confidentiality from communications using digital infrastructure.
These scanning systems have a big accuracy problem. When content gets flagged, it's wrong most of the time. 🇮🇪 Irish law enforcement confirms that only 20.3% of 4,192 automated reports actually contained illegal material, meaning 79.7% were false positives.
This is fine.
Even with a hypothetical 99% overall accuracy (which current systems don't achieve), scanning billions of daily messages would generate millions of false accusations that overwhelm police resources.
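The arithmetic behind that claim is plain base-rate math. The message volume and prevalence below are assumptions chosen for illustration, not official figures, but any realistic numbers produce the same shape: because illegal content is vanishingly rare, even a 99%-accurate scanner drowns real hits in false alarms.

```python
# Base-rate arithmetic behind the false-accusation problem.
# All figures here are illustrative assumptions, not official data.

daily_messages = 5_000_000_000   # assumed EU-wide daily message volume
illegal_rate = 1e-6              # assumed share of truly illegal content
accuracy = 0.99                  # hypothetical 99% detection accuracy

illegal = daily_messages * illegal_rate          # 5,000 messages
innocent = daily_messages - illegal

true_positives = illegal * accuracy              # correctly caught
false_positives = innocent * (1 - accuracy)      # innocents flagged

precision = true_positives / (true_positives + false_positives)
print(f"{false_positives:,.0f} innocent messages flagged per day")
print(f"precision: {precision:.3%}")             # well under 1%
```

Under these assumptions roughly 50 million innocent messages get flagged daily, and about 0.01% of flags point at real material: the same base-rate effect the Irish figures above show at small scale.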
Innocent content regularly triggers these systems: family photos, teenage conversations, educational materials and medical communications. Consider this real case: a father was automatically reported to police after sending photos of his child's medical condition to their doctor. Google's algorithms flagged this legitimate medical consultation as potential abuse, permanently closed his account and refused all appeals. His digital life was destroyed by an algorithm that couldn't distinguish between medical care and criminal activity.
For the third time in three years, over 600 cryptographers, security researchers and scientists across 35 countries have co-signed an open letter explaining why this mass scanning project is "technically unfeasible", constitutes a "danger to democracy" and would "completely compromise" the security and privacy of all European citizens.
The letter emphasizes that client-side scanning cannot distinguish between legal and illegal content without fundamentally breaking encryption and creating vulnerabilities that malicious actors can exploit.
Meanwhile, the Commission has provided no serious studies demonstrating the effectiveness, reliability or appropriateness of these intrusive measures for actually protecting children. Industry claims appear to have taken precedence over evidence-based policy-making.
Genuine security emerges through thoughtful design where security measures and civil liberties function as complementary forces, not opposing ones.
The fundamental flaw in ChatControl becomes clear when examining how easily determined actors can circumvent these scanning systems. Criminals don't need sophisticated techniques to bypass client-side scanning; the methods are well-documented public knowledge already employed by malicious actors.
Layered Encryption
Encrypt files with standard tools like GPG before messaging. Hell, even a basic Caesar cipher would be sufficient to bypass detection. Since client-side scanning occurs after user encryption but before transport encryption, pre-encrypted content looks like random data to detection algorithms. Recipients decrypt locally with shared keys.
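To see just how low the bar is, here is that Caesar-style bypass spelled out. This is not an endorsement of Caesar as a cipher (real actors would use GPG); the point is that any reversible transform applied before the app sees the data produces bytes that no hash database or AI model will match.

```python
# Even a trivial byte shift applied *before* the messaging app sees
# the data defeats client-side scanning: the scanner receives bytes
# that match nothing in its databases or models.

def caesar(data: bytes, shift: int) -> bytes:
    """Shift every byte by `shift`, wrapping modulo 256."""
    return bytes((b + shift) % 256 for b in data)

original = b"any file contents"
obfuscated = caesar(original, 13)    # done before the app scans it
recovered = caesar(obfuscated, -13)  # recipient reverses the shift

assert obfuscated != original  # scanner sees unrelated-looking bytes
assert recovered == original   # content survives the round trip
```

Since client-side scanning runs after this user-applied transform but before transport encryption, the detection layer only ever sees noise.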
External Platform Bypass
Upload content to any third-party platform (Dropbox, OneDrive, anonymous file hosts, or obscure hosting services) and share links instead of files. The scanner sees innocent text containing a URL while the actual content sits untouched on external servers.
Custom Messaging Clients
Open-source protocols like XMPP and Matrix allow custom client development. Modified clients can automatically implement cloud storage and encryption workflows transparently. Users experience normal messaging while completely evading surveillance infrastructure.
Digital Steganography
Steganographic techniques embed data within innocent images. Family photos can carry hidden payloads invisible to both human operators and AI systems. Tools like OpenStego make this accessible to average users.
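The LSB idea those tools rely on fits in a few lines. This sketch hides a payload in the low bit of each byte of a generic carrier buffer; in a real image the carrier bytes would be pixel values, and flipping the lowest bit is visually imperceptible.

```python
# Minimal least-significant-bit (LSB) steganography sketch: the
# payload's bits replace the low bit of successive carrier bytes.

def embed(carrier: bytearray, payload: bytes) -> bytearray:
    """Hide `payload` in the LSBs of a copy of `carrier`."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(carrier), "carrier too small"
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the low bit
    return out

def extract(carrier: bytes, n_bytes: int) -> bytes:
    """Read `n_bytes` of hidden payload back out of the LSBs."""
    bits = [b & 1 for b in carrier[: n_bytes * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes)
    )

cover = bytearray(range(256)) * 2        # stand-in for pixel data
stego = embed(cover, b"hidden")
assert extract(stego, 6) == b"hidden"
```

Each payload byte changes at most eight carrier bytes by one unit of brightness, which is exactly why neither human reviewers nor content hashes notice anything.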
Platform Migration
Criminal networks can shift to decentralized platforms, peer-to-peer networks or services outside EU jurisdiction. Tor-based messaging, blockchain communications or servers in non-compliant countries remain beyond ChatControl's reach.
ChatControl catches only amateur criminals who directly attach problematic content to messages. Professional networks already employ these evasion techniques as standard practice. EU legislation won't make them forget how computers work.
The system fails at protecting children while succeeding at mass civilian monitoring. It's not a bug, it's a feature.
The child protection narrative masks concerning business interests. The European Commission based its CSAR proposal primarily on claims from industry players rather than independent research.
Commercial surveillance companies would manage the technology with guaranteed access to the European market. Organizations like Thorn (co-founded by actor Ashton Kutcher), Microsoft's PhotoDNA and other tech companies develop these detection systems while simultaneously lobbying for regulations that would require their adoption across Europe.
These companies develop the detection technologies and lobby for laws mandating their adoption, creating a profitable feedback loop. The proposal would secure privileged market positions for surveillance companies across hundreds of millions of European users. Pretty nice, isn't it?
These systems would be:
Unverifiable: Operating without meaningful external examination or accountability.
Legally powerful: Capable of starting criminal proceedings through algorithmic decisions.
Proprietary: Built on closed-source code with methods hidden from public view.
We cannot audit what these algorithms would actually do. While companies claim they only detect illegal content, the closed-source nature makes verification impossible. The same scanning infrastructure could easily be repurposed to track political dissidents or journalists, monitor specific keywords or phrases, flag content based on changing political priorities or share data with intelligence agencies beyond stated purposes.
Corporate media offers control of internet speech wrapped in "protect the kids" packaging.
Commissioner Ylva Johansson consistently emphasizes this narrative in her communications:
"[Privacy defenders make a lot of noise], but someone has to speak for the children."
"Won't somebody please think of the children?" - The Simpsons
"Think of the children" is a well-documented political rhetoric technique that appeals to emotion rather than evidence. While child protection is genuinely important, this approach frames any opposition as being against child welfare, making nuanced discussion more difficult.
This creates a false choice. Privacy isn't a luxury for troublemakers, it's a fundamental right that protects journalists, whistleblowers, activists and ordinary people from unwarranted intrusion.
Critics aren't opposing child protection. We're questioning whether undermining privacy rights for 450 million 🇪🇺 Europeans is the most effective approach when targeted alternatives exist that preserve rights.
Understanding how 🇪🇺 EU member states position themselves on this legislation is crucial, as their votes will determine whether ChatControl becomes reality.
Countries that support ChatControl (12): 🇧🇬 Bulgaria • 🇭🇷 Croatia • 🇨🇾 Cyprus • 🇩🇰 Denmark • 🇫🇷 France • 🇭🇺 Hungary • 🇮🇪 Ireland • 🇱🇹 Lithuania • 🇲🇹 Malta • 🇵🇹 Portugal • 🇷🇴 Romania • 🇪🇸 Spain
Countries that oppose ChatControl (7): 🇦🇹 Austria • 🇨🇿 Czech Republic • 🇪🇪 Estonia • 🇫🇮 Finland • 🇱🇺 Luxembourg • 🇳🇱 Netherlands • 🇵🇱 Poland
Countries still undecided (8): 🇧🇪 Belgium • 🇩🇪 Germany • 🇬🇷 Greece • 🇮🇹 Italy • 🇱🇻 Latvia • 🇸🇰 Slovakia • 🇸🇮 Slovenia • 🇸🇪 Sweden
🇦🇹 Austria: Constitutional and privacy concerns.
🇨🇿 Czech Republic: Prime Minister explicitly rejects proposals that would allow widespread monitoring of citizens' private digital communications.
🇪🇪 Estonia: Acknowledges sincere concerns about child exploitation, but opposes undermining end-to-end encryption and forcing mass surveillance.
🇫🇮 Finland: Cannot support the latest compromise proposal because it contains a constitutionally problematic identification order.
🇱🇺 Luxembourg: Rejects broad surveillance measures like client-side scanning and insists that EU regulation must ensure proportional, targeted detection to protect citizens' fundamental rights.
🇳🇱 Netherlands: Strong privacy protection stance.
🇵🇱 Poland: Opposition to mass surveillance measures.
🇧🇪 Belgium: The N-VA party calls ChatControl a "monster that invades your privacy and cannot be tamed". Despite this, Belgium backed Denmark's compromise during September meetings. Mixed signals from Brussels.
🇩🇪 Germany: Won't break encryption but wants to find middle ground. They're trying to craft their own compromise instead of rejecting ChatControl outright. Germany's fence-sitting could be decisive.
🇬🇷 Greece: Still figuring out the technical details. No clear stance yet.
🇮🇹 Italy: Has concerns about expanding the scope to cover new CSAM detection. Rome seems hesitant about how far this thing could reach.
🇱🇻 Latvia: The government likes what it sees on paper but worries about political backlash after the summer's attention. Classic politicians hedging their bets.
🇸🇰 Slovakia: Playing the wait-and-see game. No commitment either way.
🇸🇮 Slovenia: Dealing with constitutional headaches around privacy. Another country wrestling with the legal implications.
🇸🇪 Sweden: Stockholm is still reading the fine print. Taking their time to decide.
Current situation: Country positions continue shifting regularly since September 12. With 12 countries supporting, 7 opposing, and 8 undecided, ChatControl supporters still fall short of the 65% EU population threshold needed for a qualified majority. The opposition maintains enough demographic weight to block the proposal for now, but the situation remains fluid as the interim regulation approaches expiration.
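For reference, the Council's qualified-majority arithmetic behind that tally can be captured directly. Under Article 16(4) TEU, adoption needs at least 55% of member states (15 of 27) together representing at least 65% of the EU population; the population share in the example call is an illustrative assumption, not an official figure.

```python
# Sketch of the Council's qualified-majority voting (QMV) rule.
# Real population shares would come from official Eurostat figures.

def qualified_majority(n_supporting: int, pop_share: float,
                       n_members: int = 27) -> bool:
    """True if both QMV thresholds are met."""
    enough_states = n_supporting >= 0.55 * n_members  # 15 of 27 states
    enough_people = pop_share >= 0.65                 # 65% of population
    return enough_states and enough_people

# 12 supporters covering an assumed ~45% of the EU population:
# fails both criteria, so the proposal cannot pass as things stand.
print(qualified_majority(12, 0.45))
```

A blocking minority additionally needs at least four member states, a bar the seven opponents comfortably clear; the population criterion is why a handful of large undecided countries like Germany can tip the outcome either way.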
ChatControl Proposal Introduced
The European Commission unveils the original ChatControl proposal, requiring all email and messaging providers to scan communications for child sexual abuse material.
Danish Presidency Takes Charge
🇩🇰 Denmark assumes the EU Council Presidency and immediately reintroduces ChatControl as a top legislative priority, targeting October 14, 2025 for adoption.
Support Momentum Builds
Fifteen EU member states back the ChatControl proposal, reversing earlier resistance. 🇫🇷 France has shifted its position and now supports the proposal. 🇩🇪 Germany remains the crucial undecided vote.
Opposition Wave Begins
🇨🇿 Czech Prime Minister Petr Fiala announces total opposition on behalf of the entire coalition government.
Constitutional Concerns
🇫🇮 Finland rejects the compromise proposal due to constitutionally problematic detection requirements.
Blocking Minority Secured
🇩🇪 Germany, 🇱🇺 Luxembourg, and 🇸🇰 Slovakia officially oppose breaking encryption. This creates the blocking minority needed to stop the proposal.
Estonia Joins Opposition
🇪🇪 Estonia acknowledges child exploitation concerns but opposes undermining end-to-end encryption and mass surveillance.
Germany Wavers
🇩🇪 Germany refrains from taking a definitive stance during the LEWP meeting, despite previous encryption concerns. Its position becomes uncertain.
Three Countries Flip
🇧🇪 Belgium, 🇱🇻 Latvia, and 🇮🇹 Italy have moved away from supporting the proposal and are now undecided. Country positions have continued changing regularly since September 12.
The effects of these proposals go beyond individual privacy concerns.
Cybersecurity gets compromised
Adding deliberate vulnerabilities to encryption creates weaknesses that anyone can exploit. Any backdoor for authorized access becomes a potential entry point for criminals and foreign intelligence services. In February 2024, the European Court of Human Rights already determined that mandating weakened encryption "cannot be regarded as necessary in a democratic society".
Innovation suffers
🇪🇺 European cybersecurity companies would face an impossible situation in global markets. How could they credibly sell security solutions when regulations require them to build in access mechanisms that undermine those very protections?
"Buy our ultra-secure encrypted stuff!" (Terms and conditions apply, government backdoors included.)
Tech companies will leave Europe
Privacy-focused services that moved to 🇪🇺 Europe after the Snowden revelations are already signaling they might leave. Signal has explicitly said it would stop operating in 🇪🇺 Europe rather than compromise its security.
Even 🇨🇭 Switzerland, traditionally seen as a privacy haven, is facing severe legislative pressures that are forcing tech companies to relocate. Proton has confirmed it has begun moving some of its physical infrastructure out of Switzerland due to "legal uncertainty" over the proposed surveillance law amendments. Lumo, their AI chatbot, became the first product to relocate, moving to Germany instead of Switzerland specifically because of these legislative concerns.
The Swiss OSCPT (Ordinance on the Surveillance of Correspondence by Post and Telecommunications) revision would require VPNs and messaging apps to identify users and retain data for up to six months, plus decrypt communications upon authority request. As Proton's CEO Andy Yen explained, these are proposals that "have been outlawed in the EU" but could soon become reality in Switzerland.
Other privacy-focused providers like Tuta have expressed similar concerns and contingency plans to leave 🇨🇭 Switzerland if the surveillance laws pass.
Europe might become dependent on US surveillance
I'm not so sure on this one, but by outsourcing surveillance technology to American companies, 🇪🇺 Europe may create dangerous dependencies. These companies operate under 🇺🇸 US jurisdiction and the CLOUD Act, potentially allowing Washington to access data collected on 🇪🇺 European citizens. Under the pretense of child protection, the EU risks handing surveillance keys to foreign powers.
Social behavior changes
When people know they're being watched, they change how they communicate. People start self-censoring, avoiding certain topics and carefully choosing their words even in private conversations.
This is called the chilling effect. Rights don't disappear overnight: they erode gradually as people change their behavior to avoid potential problems.
Here's how you can contribute to defending our digital freedoms:
Share this article and educate your network: Use hashtags like #ChatControl or #StopScanningMe. Forward resources to friends, family and colleagues.
Sign the petition against ChatControl at change.org.
Stay informed and follow updates: @[email protected], x.com/nonchatcontrol, patrick-breyer.de and fightchatcontrol.eu.
Contact your national representatives and MEPs to convince your country to oppose ChatControl, if it doesn't already.
Join campaigns and support organizations: stopscanningme.eu for local actions, EFF and EDRi for digital rights advocacy.
Adopt privacy tools and infrastructure: Use Signal and other privacy-respecting alternatives. Host your own services or support privacy-focused providers.
The irony is kinda painful: the continent that built GDPR to protect digital privacy now designs ChatControl to dismantle it systematically. What was once a fundamental right could become mandatory surveillance.
ChatControl represents a historic choice for 🇪🇺 Europe. Either we become the first democracy to normalize mass surveillance of private communications or we defend the digital rights that made Europe a global privacy leader.
You were the chosen one! - Star Wars
This decision deserves close attention: authoritarian regimes worldwide are watching, ready to justify their own programs with: "Eh, if Europe does it, why shouldn't we?"
The next chapter unfolds on October 14, 2025, when the government vote is officially scheduled.
If you've found this article meaningful, you can support me or follow me on your favorite platform.
Author
Metalhearf
IT Architect, Hacker & Passionate Tinkerer
The European Union wants to force tech companies to scan your private messages & images, even in your favorite encrypted apps.
The ๐ช๐บ European Union is advancing legislation that could fundamentally change how we communicate online. ChatControl would require all messaging platforms to automatically scan their usersโ private messages and images.
Yes, even encrypted ones like Signal, WhatsApp and Telegram. No, you canโt opt out.
This isnโt just another privacy policy update you can ignore. If passed, this EU regulation (strongest and most binding legal instrument in EU law) would automatically apply to all member states without any wiggle room for national interpretation. It would even override constitutional protections for communication privacy and establish unprecedented mass surveillance of private communications.
The official justification? Fighting child sexual abuse material (CSAM). Protecting children is undeniably crucial, but the proposed methods would eliminate digital privacy for 450 million Europeans and set a global precedent for mass surveillance.
This surveillance trend extends beyond Europe: ๐จ๐ญ Switzerland is advancing metadata retention requirements, the ๐ฌ๐ง UK is implementing comprehensive age verification systems and now the ๐ช๐บ EU proposes to scan every private message. Each initiative is positioned as child protection policy, but the implications reach far beyond their stated goals.
ChatControl is what critics call the EUโs proposed Regulation to Prevent and Combat Child Sexual Abuse, also known as CSAR (Child Sexual Abuse Regulation).
The proposal builds on surveillance techniques already deployed by major tech companies. Meta analyzes all Facebook Messenger conversations and unencrypted WhatsApp data (profile photos, group descriptions). Apple announced similar scanning for iCloud content in 2021, though they later suspended the program.
This turns voluntary corporate surveillance into mandatory government-ordered scanning. A temporary 2021 EU regulation allowed platforms to scan content voluntarily for three years. That authorization expired in 2024, which is why CSAR was proposed. The temporary regulation merely permitted scanning; CSAR would make detection obligatory under certain conditions.
Thereโs also the Roadmap for Lawful Access to Data which has an even bigger goal: making all our digital data readable by authorities upon request. Weโll dive deeper into this broader surveillance agenda later.
CSAR casts an extremely wide net. The regulation would apply to all interpersonal communication service providers, not just obvious targets like Signal, WhatsApp, or Telegram, but also:
Email providers
Dating apps
Gaming platforms with chat features
Social media platforms
File hosting services (Google Drive, iCloud, DropBoxโฆ)
App stores
Even small community hosting services run by associations
This means virtually any digital service that allows people to communicate or share content would fall under surveillance requirements. The scope extends far beyond what most people imagine when they hear messaging apps.
ChatControl relies on Client-Side Scanning. Your device becomes a monitoring station that analyzes your content before encryption happens.
This represents a fundamental shift away from targeted surveillance based on court orders or reasonable suspicion. Unlike airport security (where you consent to specific, limited searches for immediate safety), ChatControl would automatically scan all private communications of all citizens, all the time.
Effectively reversing the presumption of innocence by treating everyone as a potential criminal. Itโs the digital equivalent of permanently installing monitoring devices in every home and office, wellโฆ just in case.
How does Chat Control work?
The system would automatically scan for three categories of content before encryption:
Known illegal content: Images or videos already catalogued by authorities as CSAM. Your device creates hash fingerprints of your content and compares them against databases of known illegal material.
Unknown potential content: Photos or videos that might constitute CSAM but havenโt been previously identified. AI algorithms analyze visual elements (like exposed skin) to flag potentially problematic content based on statistical models.
Grooming behavior: Text analysis using AI to identify communication patterns that match predefined indicators of adults soliciting children. This involves scanning the actual content of your private conversations.
If something gets flagged, it automatically gets reported to authorities. No human checks it first, that would be impossible given the billions of daily messages. This would be mandatory for all messaging platforms in ๐ช๐บ Europe.
Claims that client-side scanning is compatible with encryption are misleading.
ChatControl doesnโt break encryption, it bypasses it entirely. While your messages still get encrypted during transmission, the system defeats the purpose of end-to-end encryption by examining your content before it gets encrypted. True E2EE means only you and your recipient can read messages: no government, no company, no algorithm should peek inside. This surveillance violates that principle by inserting monitoring at the source.
Privacy-focused companies like Proton point out this approach might be worse than encryption backdoors. Backdoors give authorities access to communications you share with others. This system examines everything on your device, whether you share it or not.
Your encrypted messaging app becomes spyware. Supporters claim this protects privacy because scanning happens locally, but surveillance built into your device makes it impossible to escape.
The proposal would create a centralized EU Centre on Child Sexual Abuse to receive all reports, but EU institutions wouldnโt control the scanning technology itself.
Service providers would face additional obligations beyond scanning. They would need to conduct risk assessments to evaluate and minimize the potential for illegal content sharing on their platforms. This requires collecting detailed information about their users (age groups, content types) that many privacy-focused services deliberately avoid gathering.
The regulation also pushes for mandatory age verification systems. While some privacy-preserving age verification concepts exist (like zero-knowledge proofs), no viable, scalable solutions have been deployed at internet scale. Current implementations either compromise user privacy through identity collection or lack the accuracy needed for legal compliance.
Rules for thee, but not for me: While ordinary Europeans would have all their messages scanned, the proposed legislation includes exemptions for government accounts used for "national security purposes, maintaining law and order or military purposes". Convenient.
ChatControl fits into a broader political strategy. Since the crypto wars of the 1990s, certain states have argued that privacy-protecting technologies, especially encryption, obstruct police investigations. These technologies are designed to do exactly that: protect everyone's ability to control their expression and communication.
The European Commission's Roadmap for Lawful Access to Data aims to make all digital data accessible to authorities by 2030. This involves systematically weakening encryption rather than simply bypassing it.
Edward Snowden's revelations over a decade ago led to widespread adoption of encryption and an institutional consensus supporting the right to encrypted communication. But governments remain frustrated by their inability to access private communications. We're seeing a return to authoritarian positions that use terrorism, organized crime and child exploitation as justifications for undermining encryption.
🇩🇰 Danish Minister of Justice Peter Hummelgaard, chief architect of the current ChatControl proposal, recently stated: "We must break with the totally erroneous perception that it is everyone's civil liberty to communicate on encrypted messaging services." Well, there you have it folks: encrypted communication isn't a civil liberty anymore. You cypherpunks were wrong all along. /s
Similarly in 🇫🇷 France, both Bernard Cazeneuve and Emmanuel Macron have explicitly stated their desire to control encrypted messaging, seeking to pierce the privacy of millions who use these services.
CSAR provides the perfect opportunity for member states to finally design and implement a generalized surveillance tool for monitoring population communications. Crossing this threshold means eliminating all confidentiality from communications using digital infrastructure.
These scanning systems have a big accuracy problem. When content gets flagged, it's wrong most of the time. 🇮🇪 Irish law enforcement confirms that only 20.3% of 4,192 automated reports actually contained illegal material, meaning 79.7% were false positives.
This is fine.
Even with a hypothetical 99% overall accuracy (which current systems don't achieve), scanning billions of daily messages would generate millions of false accusations that overwhelm police resources.
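The base-rate arithmetic behind that claim can be sketched with a few lines of Python. All the input numbers below are illustrative assumptions chosen for order of magnitude, not official EU figures:

```python
# Base-rate arithmetic for mass scanning (illustrative numbers only).
messages_per_day = 10_000_000_000   # assumed EU-wide daily message volume
illegal_rate = 1 / 1_000_000        # assume 1 in a million messages is illegal
accuracy = 0.99                     # hypothetical 99% accuracy, better than real systems

illegal = messages_per_day * illegal_rate
legal = messages_per_day - illegal

true_positives = illegal * accuracy         # illegal content correctly flagged
false_positives = legal * (1 - accuracy)    # innocent content wrongly flagged

precision = true_positives / (true_positives + false_positives)
print(f"total flags per day:  {true_positives + false_positives:,.0f}")
print(f"false flags per day:  {false_positives:,.0f}")
print(f"share of flags that are wrong: {1 - precision:.2%}")
```

Because illegal content is so rare relative to total traffic, even this generous 99% accuracy produces roughly a hundred million false flags per day, and nearly every flag is wrong — the classic base-rate fallacy.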
Innocent content regularly triggers these systems: family photos, teenage conversations, educational materials and medical communications. Consider this real case: a father was automatically reported to police after sending photos of his child's medical condition to their doctor. Google's algorithms flagged this legitimate medical consultation as potential abuse, permanently closed his account and refused all appeals. His digital life was destroyed by an algorithm that couldn't distinguish between medical care and criminal activity.
For the third time in three years, over 600 cryptographers, security researchers and scientists across 35 countries have co-signed an open letter explaining why this mass scanning project is "technically unfeasible", constitutes a "danger to democracy" and would "completely compromise" the security and privacy of all European citizens.
The letter emphasizes that client-side scanning cannot distinguish between legal and illegal content without fundamentally breaking encryption and creating vulnerabilities that malicious actors can exploit.
Meanwhile, the Commission has provided no serious studies demonstrating the effectiveness, reliability or appropriateness of these intrusive measures for actually protecting children. Industry claims appear to have taken precedence over evidence-based policy-making.
Genuine security emerges through thoughtful design where security measures and civil liberties function as complementary forces, not opposing ones.
The fundamental flaw in ChatControl becomes clear when examining how easily determined actors can circumvent these scanning systems. Criminals don't need sophisticated techniques to bypass client-side scanning; the evasion methods below are well-documented public knowledge already in use.
Layered Encryption
Encrypt files with standard tools like GPG before messaging them. Hell, even a basic Caesar cipher would be sufficient to bypass detection. Since client-side scanning occurs after any user-applied encryption but before transport encryption, pre-encrypted content looks like random data to detection algorithms. Recipients decrypt locally with shared keys.
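A toy illustration of why even a Caesar cipher defeats fingerprint matching: detectors in the PhotoDNA family compare content against fingerprints of known files, and any reversible byte transformation yields a completely different fingerprint. (Real perceptual hashes tolerate small image edits, but not encryption; SHA-256 stands in here for a generic fingerprint function, and the cipher is deliberately trivial — this is not real cryptography.)

```python
import hashlib

def caesar(data: bytes, shift: int) -> bytes:
    """Toy Caesar cipher over raw bytes — illustration only, not real crypto."""
    return bytes((b + shift) % 256 for b in data)

original = b"any file content the scanner has a fingerprint for"
shifted = caesar(original, 3)

# The scanner matches fingerprints of known files; a trivial byte shift
# produces an entirely different digest, so nothing matches.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(shifted).hexdigest())

# The recipient reverses the shift locally and recovers the content.
assert caesar(shifted, -3) == original
```

With real tools the principle is identical: a file pre-encrypted with `gpg --symmetric` arrives at the scanner as high-entropy data that matches nothing in any fingerprint database.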
External Platform Bypass
Upload content to any third-party platform (Dropbox, OneDrive, anonymous file hosts, or obscure hosting services) and share links instead of files. The scanner sees innocent text containing a URL while the actual content sits untouched on external servers.
Custom Messaging Clients
Open-source protocols like XMPP and Matrix allow custom client development. Modified clients can automatically implement cloud storage and encryption workflows transparently. Users experience normal messaging while completely evading surveillance infrastructure.
Digital Steganography
Steganographic techniques embed data within innocent images. Family photos can carry hidden payloads invisible to both human operators and AI systems. Tools like OpenStego make this accessible to average users.
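A minimal sketch of the least-significant-bit (LSB) technique such tools rely on, operating on a bytearray standing in for raw pixel data (real tools like OpenStego handle actual image formats, and the helper names here are invented for illustration):

```python
def embed(carrier: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits in the least significant bit of each carrier byte."""
    out = bytearray(carrier)
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(out):
        raise ValueError("carrier too small for payload")
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # overwrite only the low bit
    return out

def extract(carrier: bytearray, n_bytes: int) -> bytes:
    """Read the payload back out of the low bits, LSB-first per byte."""
    bits = [b & 1 for b in carrier[: n_bytes * 8]]
    return bytes(
        sum(bits[i * 8 + j] << j for j in range(8)) for i in range(n_bytes)
    )

pixels = bytearray(range(256)) * 4          # stand-in for raw image pixels
stego = embed(pixels, b"hidden")
assert extract(stego, 6) == b"hidden"
# Each byte changes by at most 1, which is visually imperceptible in an image:
assert all(abs(a - b) <= 1 for a, b in zip(pixels, stego))
```

Because the carrier image remains statistically and visually ordinary, neither a human reviewer nor a CSAM classifier trained on image content has anything to flag.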
Platform Migration
Criminal networks can shift to decentralized platforms, peer-to-peer networks or services outside EU jurisdiction. Tor-based messaging, blockchain communications or servers in non-compliant countries remain beyond ChatControl's reach.
ChatControl catches only amateur criminals who directly attach problematic content to messages. Professional networks already employ these evasion techniques as standard practice. EU legislation won't make them forget how computers work.
The system fails at protecting children while succeeding at mass civilian monitoring. It's not a bug, it's a feature.
The child protection narrative masks concerning business interests. The European Commission based its CSAR proposal primarily on claims from industry players rather than independent research.
Commercial surveillance companies would manage the technology with guaranteed access to the European market. Organizations like Thorn (co-founded by actor Ashton Kutcher) and Microsoft (maker of PhotoDNA) develop these detection systems while simultaneously lobbying for regulations that would require their adoption across Europe.
These companies develop the detection technologies and lobby for laws mandating their adoption, creating a profitable feedback loop. The proposal would secure privileged market positions for surveillance vendors across hundreds of millions of European users. Pretty nice, isn't it?
These systems would be:
Unverifiable: Operating without meaningful external examination or accountability.
Legally powerful: Capable of starting criminal proceedings through algorithmic decisions.
Proprietary: Built on closed-source code with methods hidden from public view.
We cannot audit what these algorithms would actually do. While companies claim they only detect illegal content, the closed-source nature makes verification impossible. The same scanning infrastructure could easily be repurposed to track political dissidents or journalists, monitor specific keywords or phrases, flag content based on changing political priorities or share data with intelligence agencies beyond stated purposes.
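The repurposing risk is structural, not hypothetical: a fingerprint-matching scanner has no notion of what its blocklist represents, so quietly swapping the list changes what gets reported, with no change to the code anyone could audit. A sketch under invented names and payloads:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in for a content fingerprint (real systems use perceptual hashes)."""
    return hashlib.sha256(data).hexdigest()

def scan(message: bytes, blocklist: set) -> bool:
    """Flag a message if its fingerprint appears in the supplied list.
    The scanner is content-agnostic: it cannot tell what the list targets."""
    return fingerprint(message) in blocklist

# Today's mandate: a list said to contain known illegal material.
csam_list = {fingerprint(b"<known illegal file>")}
# Tomorrow's quiet update: the same pipeline, a different list.
dissident_list = {fingerprint(b"leaked government memo")}

msg = b"leaked government memo"
print(scan(msg, csam_list))       # False: not flagged
print(scan(msg, dissident_list))  # True: flagged by the very same code
```

Since the blocklists themselves are secret (publishing CSAM fingerprints is impossible by design), neither users nor independent auditors can verify what the deployed lists actually contain.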
Corporate media offers control of internet speech wrapped in "protect the kids" packaging.
Commissioner Ylva Johansson consistently emphasizes this narrative in her communications:
"[Privacy defenders make a lot of noise], but someone has to speak for the children."
"Won't somebody please think of the children?" - The Simpsons
"Think of the children" is a well-documented rhetorical technique that appeals to emotion rather than evidence. While child protection is genuinely important, this framing casts any opposition as being against child welfare, making nuanced discussion more difficult.
This creates a false choice. Privacy isn't a luxury for troublemakers; it's a fundamental right that protects journalists, whistleblowers, activists and ordinary people from unwarranted intrusion.
Critics aren't opposing child protection. We're questioning whether undermining privacy rights for 450 million 🇪🇺 Europeans is the most effective approach when targeted alternatives exist that preserve those rights.
Understanding how 🇪🇺 EU member states position themselves on this legislation is crucial, as their votes will determine whether ChatControl becomes reality.
Countries that support ChatControl (12): 🇧🇬 Bulgaria • 🇭🇷 Croatia • 🇨🇾 Cyprus • 🇩🇰 Denmark • 🇫🇷 France • 🇭🇺 Hungary • 🇮🇪 Ireland • 🇱🇹 Lithuania • 🇲🇹 Malta • 🇵🇹 Portugal • 🇷🇴 Romania • 🇪🇸 Spain
Countries that oppose ChatControl (7): 🇦🇹 Austria • 🇨🇿 Czech Republic • 🇪🇪 Estonia • 🇫🇮 Finland • 🇱🇺 Luxembourg • 🇳🇱 Netherlands • 🇵🇱 Poland
Countries still undecided (8): 🇧🇪 Belgium • 🇩🇪 Germany • 🇬🇷 Greece • 🇮🇹 Italy • 🇱🇻 Latvia • 🇸🇰 Slovakia • 🇸🇮 Slovenia • 🇸🇪 Sweden
🇦🇹 Austria: Constitutional and privacy concerns.
🇨🇿 Czech Republic: Prime Minister explicitly rejects proposals that would allow widespread monitoring of citizens' private digital communications.
🇪🇪 Estonia: Acknowledges sincere concerns about child exploitation, but opposes undermining end-to-end encryption and forcing mass surveillance.
🇫🇮 Finland: Cannot support the latest compromise proposal because it contains a constitutionally problematic identification order.
🇱🇺 Luxembourg: Rejects broad surveillance measures like client-side scanning and insists that EU regulation must ensure proportional, targeted detection to protect citizens' fundamental rights.
🇳🇱 Netherlands: Strong privacy protection stance.
🇵🇱 Poland: Opposition to mass surveillance measures.
🇧🇪 Belgium: The N-VA party calls ChatControl a "monster that invades your privacy and cannot be tamed". Despite this, Belgium backed Denmark's compromise during September meetings. Mixed signals from Brussels.
🇩🇪 Germany: Won't break encryption but wants to find middle ground. They're trying to craft their own compromise instead of rejecting ChatControl outright. Germany's fence-sitting could be decisive.
🇬🇷 Greece: Still figuring out the technical details. No clear stance yet.
🇮🇹 Italy: Has concerns about expanding the scope to cover new CSAM detection. Rome seems hesitant about how far this thing could reach.
🇱🇻 Latvia: The government likes what they see on paper but worries about political backlash after summer attention. Classic politicians hedging their bets.
🇸🇰 Slovakia: Playing the wait-and-see game. No commitment either way.
🇸🇮 Slovenia: Dealing with constitutional headaches around privacy. Another country wrestling with legal implications.
🇸🇪 Sweden: Stockholm is still reading the fine print. Taking their time to decide.
Current situation: Country positions continue shifting regularly since September 12. With 12 countries supporting, 7 opposing, and 8 undecided, ChatControl supporters still fall short of the 65% EU population threshold needed for a qualified majority. The opposition maintains enough demographic weight to block the proposal for now, but the situation remains fluid as the interim regulation approaches expiration.
ChatControl Proposal Introduced
The European Commission unveils the original ChatControl proposal, requiring all email and messaging providers to scan communications for child sexual abuse material.
Danish Presidency Takes Charge
🇩🇰 Denmark assumes the EU Council Presidency and immediately reintroduces ChatControl as a top legislative priority, targeting October 14, 2025 for adoption.
Support Momentum Builds
Fifteen EU member states back the ChatControl proposal, reversing earlier resistance. 🇫🇷 France has shifted its position and now supports the proposal. 🇩🇪 Germany remains the crucial undecided vote.
Opposition Wave Begins
🇨🇿 Czech Prime Minister Petr Fiala announces total opposition on behalf of the entire coalition government.
Constitutional Concerns
🇫🇮 Finland rejects the compromise proposal due to constitutionally problematic detection requirements.
Blocking Minority Secured
🇩🇪 Germany, 🇱🇺 Luxembourg, and 🇸🇰 Slovakia officially oppose breaking encryption. This creates the blocking minority needed to stop the proposal.
Estonia Joins Opposition
🇪🇪 Estonia acknowledges child exploitation concerns but opposes undermining end-to-end encryption and mass surveillance.
Germany Wavers
🇩🇪 Germany refrains from taking a definitive stance during the LEWP meeting, despite previous encryption concerns. Position becomes uncertain.
Three Countries Flip
🇧🇪 Belgium, 🇱🇻 Latvia, and 🇮🇹 Italy have moved away from supporting the proposal and are now undecided. Country positions continue changing regularly since September 12.
The effects of these proposals go beyond individual privacy concerns.
Cybersecurity gets compromised
Adding deliberate vulnerabilities to encryption creates weaknesses that anyone can exploit. Any backdoor for authorized access becomes a potential entry point for criminals and foreign intelligence services. In February 2024, the European Court of Human Rights already determined that mandating weakened encryption "cannot be regarded as necessary in a democratic society".
Innovation suffers
🇪🇺 European cybersecurity companies would face an impossible situation in global markets. How could they credibly sell security solutions when regulations require them to build in access mechanisms that undermine those very protections?
"Buy our ultra-secure encrypted stuff!" (Terms and conditions apply, government backdoors included.)
Tech companies will leave Europe
Privacy-focused services that moved to 🇪🇺 Europe after the Snowden revelations are already signaling they might leave. Signal has explicitly said it would stop operating in 🇪🇺 Europe rather than compromise its security.
Even 🇨🇭 Switzerland, traditionally seen as a privacy haven, is facing severe legislative pressures that are forcing tech companies to relocate. Proton has confirmed it has begun moving some of its physical infrastructure out of Switzerland due to "legal uncertainty" over the proposed surveillance law amendments. Lumo, their AI chatbot, became the first product to relocate, moving to Germany instead of Switzerland specifically because of these legislative concerns.
The Swiss OSCPT (Ordinance on the Surveillance of Correspondence by Post and Telecommunications) revision would require VPNs and messaging apps to identify users and retain data for up to six months, plus decrypt communications upon authority request. As Proton's CEO Andy Yen explained, these are proposals that "have been outlawed in the EU" but could soon become reality in Switzerland.
Other privacy-focused providers like Tuta have expressed similar concerns and contingency plans to leave 🇨🇭 Switzerland if the surveillance laws pass.
Europe might become dependent on US surveillance
I'm not so sure on this one, but by outsourcing surveillance technology to American companies, 🇪🇺 Europe may create dangerous dependencies. These companies operate under 🇺🇸 US jurisdiction and the CLOUD Act, potentially allowing Washington to access data collected on 🇪🇺 European citizens. Under the pretense of child protection, the 🇪🇺 EU risks handing surveillance keys to foreign powers.
Social behavior changes
When people know they're being watched, they change how they communicate. People start self-censoring, avoiding certain topics and carefully choosing their words, even in private conversations.
This is called the chilling effect. Rights don't disappear overnight: they erode gradually as people change their behavior to avoid potential problems.
Here's how you can contribute to defending our digital freedoms:
Share this article and educate your network: Use hashtags like #ChatControl or #StopScanningMe. Forward resources to friends, family and colleagues.
Sign the petition against ChatControl at change.org.
Stay informed and follow updates: x.com/nonchatcontrol, patrick-breyer.de and fightchatcontrol.eu.
Contact your national representatives and MEPs to convince your country to oppose ChatControl, if it hasn't already taken that position.
Join campaigns and support organizations: stopscanningme.eu for local actions, EFF and EDRi for digital rights advocacy.
Adopt privacy tools and infrastructure: Use Signal and other privacy-respecting alternatives. Host your own services or support privacy-focused providers.
The irony is kinda painful: the continent that built GDPR to protect digital privacy now designs ChatControl to dismantle it systematically. What was once a fundamental right could become mandatory surveillance.
ChatControl represents a historic choice for 🇪🇺 Europe. Either we become the first democracy to normalize mass surveillance of private communications, or we defend the digital rights that made Europe a global privacy leader.
You were the chosen one! - Star Wars
This decision deserves close attention: authoritarian regimes worldwide are watching, ready to justify their own programs with "Eh, if Europe does it, why shouldn't we?"
The next chapter unfolds on October 14, 2025, when the government vote is officially scheduled.
If you've found this article meaningful, you can support me or follow me on your favorite platform.
Author
Metalhearf
IT Architect, Hacker & Passionate Tinkerer
Cyber Union