It’s hard to find privacy in today’s society. Everything we do, say, or search on the internet is tracked and recorded, and much of that data is then commercialized. End-to-end encryption (E2EE) is the one technology that truly keeps users’ communications private.
However, recent moves by many world governments may compromise that last stronghold. Several governments have proposed legislation that would force encrypted platforms to perform CSAM scanning of user communications.
Although these proposals seem straightforward, they have larger implications for the future of encryption and privacy.
What is CSAM Scanning?
Child Sexual Abuse Material (CSAM) scanning tools are systems that platforms use to identify and act on CSAM shared through their services. These tools typically use AI and hash-matching software to compare fingerprints of the content you share against a database of known CSAM.
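To make the matching step concrete, here is a minimal sketch of hash-based detection. It is an assumption-laden toy: production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas this example uses exact SHA-256 hashes, and the byte strings are purely hypothetical placeholders.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# Real systems use perceptual hashes; exact SHA-256 is used here only
# to illustrate the "compare against a known list" idea.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def scan_upload(content: bytes) -> bool:
    """Return True if the uploaded content matches a known fingerprint."""
    fingerprint = hashlib.sha256(content).hexdigest()
    return fingerprint in KNOWN_HASHES

print(scan_upload(b"holiday-photo-bytes"))    # False: no match, upload proceeds
print(scan_upload(b"known-bad-image-bytes"))  # True: flagged for review
```

The key design point is that the platform never needs to "understand" the image; it only compares fingerprints against a curated list.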
If a match is found, the tool flags the content for a human moderator, who is in charge of dealing with the issue. If the moderator deems the content illegal, they notify child safety advocacy groups and law enforcement.
These groups, such as the National Center for Missing and Exploited Children (NCMEC), would then further investigate the incident. Knowingly distributing and viewing CSAM is illegal and puts children around the world at risk. In 2021, according to Homeland Security, the US Secret Service “identified and/or rescued 1,177 child victims in child exploitation investigations.”
It’s easy to understand why government agencies want to increase the likelihood of catching perpetrators of this crime. It is a reprehensible activity that has been a problem for years. According to NCMEC, online companies were the most frequent reporters of online enticement.
Most enticements happened as a result of explicit content shared on online platforms that led to plans to meet up with the child. It’s clear that this is a problem, but is CSAM scanning enforcement the solution?
What is End-to-End Encryption?
End-to-end encryption, typically built on strong ciphers such as AES, is currently the strongest form of privacy protection on the internet. E2EE works by scrambling a message with an encryption algorithm before it ever leaves the sender’s device.
The algorithm makes the content unreadable, so the message is protected from eavesdroppers while it travels to the recipient. Even if a hacker intercepts the message, decrypting it without the key is computationally infeasible.
Once the message arrives at its destination, the recipient uses a private key to unscramble the message back into readable text. Communication is thus protected at every stage, which offers the strongest available security against cyberattacks.
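The scramble-transmit-unscramble cycle can be sketched in a few lines. This is a deliberately simplified toy, not secure cryptography: it derives a keystream from SHA-256 in counter mode and XORs it with the message, whereas real E2EE systems use vetted ciphers such as AES-GCM and negotiate keys with an authenticated key exchange. The key and message values are illustrative assumptions.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the key (SHA-256 in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream scrambles the message into unreadable bytes.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse: the same key unscrambles the text

shared_key = b"negotiated-out-of-band"  # real E2EE agrees on this via key exchange
ciphertext = encrypt(shared_key, b"meet at noon")
assert ciphertext != b"meet at noon"                       # unreadable in transit
assert decrypt(shared_key, ciphertext) == b"meet at noon"  # recipient recovers it
```

Notice that anyone without `shared_key` sees only the scrambled bytes; that property, held end to end, is exactly what the proposed scanning mandates would have to weaken.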
Many people with legitimate reasons use E2EE. Bankers and clients, medical professionals and patients, and journalists and sources are some examples of people who rely on end-to-end encryption.
They need enhanced privacy to protect sensitive information. Many business professionals need to keep their communications private and out of the hands of cybercriminals who would exploit these communications.
Even everyday users have the right to keep certain communications private, no matter what they are communicating.
The Problem with CSAM Scanning and Encryption
E2EE has no known practical breaks. The US government itself relies on strong encryption to secure classified information, and platforms such as WhatsApp use E2EE to keep their customers safe from cyber threats.
The proposed legislation would force encrypted platforms to scan for CSAM, but doing so also puts everyday communications at risk. Because E2EE cannot practically be cracked, it is largely impervious to cybercriminals.
However, once you open the door to certain types of scanning, you create a path that cybercriminals can exploit. This is why people have resisted adding backdoors to E2E encryption: a backdoor built for one party can be found and abused by attackers. In the same way, mandated scanning introduces a vulnerability that scammers could use to break into encrypted systems. Moreover, the entire point of E2EE is to protect communications from prying eyes; scanning those messages violates the very privacy that users entrust to these companies.
These bills also contain broad language that could later be used to further monitor people’s messages. The scans may start by looking for CSAM, but the same legal machinery opens up the possibility of governments forcing platforms to monitor all communications, illegal or otherwise.
This is a slippery slope that can mean further privacy restrictions in the future.
Proposed Legislation
There are several proposed bills aimed at protecting children against online exploitation. Each of these bills has the same issue: they seek to protect children at the cost of user privacy.
By compromising E2E encryption, the government would put users at risk of cyber attacks and take away the right to keep some communications away from prying eyes. Here is an overview of some of the controversial bills to date.
Online Safety Bill
One of these proposed bills is a U.K. amendment to the Online Safety Bill (OSB). This bill makes social media companies legally responsible for protecting children on their platforms by removing illegal content, preventing underage children from accessing certain content, and providing more transparency about the risks of child abuse online.
Companies that do not follow these new rules can be fined either £18 million or 10% of their annual global revenue, whichever is greater. This incentivizes companies to install CSAM scanners to avoid accumulating fines.
However, within this proposed bill there is a “spy clause” that puts E2E encryption at risk. Clause 122 of the OSB would require service providers to scan users’ messages and media files for CSAM.
This includes platforms that offer encryption. The bill met considerable pushback: companies such as WhatsApp and Signal threatened to leave the UK if the law was finalized.
After a few rewrites, the bill passed in late September 2023 with the spy clause still intact. The UK government stated that they want big tech companies to develop their own systems for scanning illegal content that still allows users privacy.
This seems to be a compromise that companies are willing to accept. Building their own scanners is preferable to a one-size-fits-all CSAM scanner that may compromise the integrity of their encryption.
EARN IT Act
The US has its own version of this kind of bill. The EARN IT Act is a piece of legislation first introduced in the US Congress in 2020. The bill would amend Section 230 of the Communications Decency Act, which offers liability protections to companies that provide communication services.
This bill pressures social media platforms to perform CSAM scanning. In February 2022, the Senate Judiciary Committee advanced the bill. It puts companies that use encryption at risk of liability if they do not monitor and filter for illegal content.
This kind of monitoring undermines the essence of E2E encryption, which is meant to give users some measure of privacy in a world of constant monitoring.
STOP CSAM Act
In April 2023, the US Congress introduced the STOP CSAM Act, which would make companies liable if their users store, send, or create CSAM.
The government could fine these companies up to $1 million for these crimes, whether the companies knew about them or not. Additionally, victims of these crimes could sue companies for allowing offenders to operate on their platforms.
Since the act states that companies would be liable for “the intentional, knowing, reckless or negligent promotion” of child exploitation, companies have fewer protections against being sued, even if they are unaware of the criminal activity happening on their platforms. These reduced protections force companies’ hands, making CSAM scanning tools all but mandatory to avoid fines and lawsuits.
Does E2EE Promote Child Exploitation?
The ongoing debate is whether end-to-end encryption facilitates, or even promotes, child exploitation. Some people argue that by making communication unreadable to the provider and law enforcement, platforms that use E2EE encourage criminals to use their services to commit crimes.
However, this puts a lot of emphasis on the company’s responsibility for every user’s actions. Just because a platform uses E2EE, that doesn’t mean it promotes criminality. People choose how they use services, but that doesn’t mean the service itself is faulty.
Many people use E2EE for legitimate and understandable reasons. However, governments appear to be targeting these protections, and some are leaning toward an outright ban on E2E encryption in the future.
Would CSAM Scanning Stop the Problem?
Although legislators believe that requiring platforms to scan for CSAM will solve the child exploitation problem, it is not the blanket solution that proponents believe it to be.
Reports do show that online platforms like Facebook Messenger generate large numbers of CSAM reports, but these figures need careful examination: the counts are often inflated by multiple reports of the same content.
CSAM scanners, like any other tool, are imperfect. They flag content based on certain parameters, and those parameters do not always flag the right content: innocent people can be falsely accused of sharing CSAM.
Criminals can also mask their activities and implicate innocent people; they know how to misdirect law enforcement’s efforts. CSAM is a problem that existed long before the internet and E2EE.
However, unfortunately, even if every platform is scanned for CSAM and encryption is compromised, criminals will adapt and find ways to hide their illegal activities.
How Are Companies Fighting Back?
Apple has gone back and forth on issues of privacy over the years. In 2021, the company announced a new feature that would scan iCloud photos for CSAM. However, after considerable backlash, it backtracked and formally abandoned the plan in December 2022.
Apple and WhatsApp, along with some 80 civil society organizations, publicly objected to the U.K.’s Online Safety Bill in an open letter, pressure that contributed to rewrites allowing companies to build their own CSAM scanners.
Companies are also working on pre-encryption (client-side) scanners that would use AI to detect and flag illegal child abuse content on the device, before a message is encrypted, and then report it to moderators for further action.
This would comply with legislation without compromising end-to-end encryption. However, this is an ongoing issue that big tech companies are trying to combat as governments continue to pass legislation that further restricts people’s privacy.
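The pre-encryption approach described above can be sketched as a simple client-side pipeline: scan first, and only content that passes proceeds into the normal E2EE send path. This is a hypothetical illustration, not any company's actual implementation; the fingerprint database, callbacks, and byte strings are all assumed placeholders.

```python
import hashlib

# Hypothetical fingerprint list of known illegal content (see hedges above).
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def send_message(content: bytes, encrypt_and_send, report_to_moderator) -> bool:
    """Client-side pipeline: scan on-device first, then encrypt.

    The server only ever receives ciphertext; flagged content is reported
    before encryption, so E2EE itself is left intact.
    """
    if hashlib.sha256(content).hexdigest() in KNOWN_HASHES:
        report_to_moderator(content)  # flagged before it is ever encrypted
        return False
    encrypt_and_send(content)         # clean content enters E2EE as usual
    return True

# Stand-in callbacks: real code would encrypt and transmit, or file a report.
sent, flagged = [], []
send_message(b"hello", sent.append, flagged.append)                 # delivered
send_message(b"known-bad-image-bytes", sent.append, flagged.append) # flagged
```

The design choice worth noting is that all scanning happens before encryption, on the user's device, which is precisely why critics argue it still amounts to reading messages even though the encrypted channel itself is untouched.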
Sekur’s Encryption
At Sekur, we believe in giving our clients privacy. That’s why our platform is 100% private. We provide encrypted Swiss Hosted Email, VPN, and instant messaging. With Sekur, you can communicate privately and securely with both Sekur and non-Sekur users alike. We do not data mine and we are free from big tech hosting.
Our proprietary technology is a multi-layered 2048-bit encrypted tunnel that allows users to communicate only on our secure Swiss servers. As of now, Swiss privacy laws are some of the most protective in the world.
Swiss law requires businesses to obtain users’ permission before storing and processing personal data, providing some of the strongest protection for users’ information. Switzerland has remained politically neutral since 1815 and is not part of the EU, which has supported further restrictions on E2E encryption.
When you use Sekur, you will be able to navigate securely under the protection of Switzerland’s strict data privacy laws.
Conclusion
Although legislators are proposing CSAM scanning bills to prevent and end child abuse crimes, these new privacy acts may have bigger implications for the general public. There are several reasons people use end-to-end encryption, and not all of them are for criminal reasons.
These laws would end up punishing everyone for the crimes of a few. Once you open the door for governments to scan private content for one purpose, that door stays open for further actions that compromise your privacy altogether.
It is important to protect your privacy and access to E2EE. Let your government officials know that you want them to find alternative ways to stop child abusers.