Lloyd Richardson, technology director at the Canadian Centre for Child Protection, says there are few penalties beyond "bad press" for platforms that fail to remove CSAM quickly. "I think you'd be hard-pressed to find a country that has fined an electronic service provider for being slow to remove CSAM, or for failing to remove it at all," he says.
The amount of CSAM circulating worldwide increased dramatically during the pandemic, as both children and predators spent more time online than ever before. Child protection experts, including Thorn, an anti-child-trafficking organization, and INHOPE, a global network of 50 CSAM hotlines, predict that the problem will only get worse.
So what can be done about it? The Netherlands may offer some pointers. The country still has a significant CSAM problem, partly because of its national infrastructure, its geographical position, and its role as a hub for global internet traffic. But it has made real progress: it went from hosting 41% of the world's known CSAM in late 2021 to 13% by late March 2022, according to the IWF.
Much of that progress is down to the fact that when a new government came to power in the Netherlands in 2017, it made tackling CSAM a priority. In 2020 it published a report naming and shaming internet hosting providers that failed to remove such content within 24 hours of being alerted to its presence.
It seems to have worked, at least in the short term. The Dutch CSAM hotline EOKM has found that hosting providers are now willing to act quickly, taking material down within 24 hours of being notified.
However, EOKM chief executive Arda Gerkens believes that instead of solving the problem, the Netherlands has simply pushed it elsewhere. "It looks like a successful model, because the Netherlands cleaned it up. But it didn't go away - it moved. And that worries me," she says.
The solution, child protection experts argue, will come in the form of legislation. The US Congress is currently considering a new law, the EARN IT (Eliminating Abusive and Rampant Neglect of Interactive Technologies) Act, which would open the door to lawsuits against services that host CSAM on their networks and push providers to scan user content for such material.
Critics say the bill threatens user privacy and encryption. But John Sheehan of the National Center for Missing and Exploited Children counters that technology companies are currently prioritizing the privacy of those who distribute CSAM on their platforms over the safety of those victimized by it.
Even if US lawmakers fail to pass the EARN IT Act, forthcoming legislation elsewhere promises to hold technology platforms accountable for illegal content, including CSAM. Under the UK's Online Safety Bill and the EU's Digital Services Act, technology giants could face billions of dollars in fines if they fail to adequately tackle illegal content once the laws come into force.
The new rules will apply to social media networks, search engines, and video platforms operating in the UK or Europe, meaning US-based companies such as Facebook, Apple, and Google will have to comply. "There's a lot of global movement around this," Sheehan says. "It will have a ripple effect around the world."
"I would rather we didn't have to legislate," Farid says. "But we've waited 20 years for them to find a moral compass. This is the last resort."