Images online live forever, but that could change for the victims of child sexual exploitation. Internet companies that could make that change happen should make a serious commitment to do so.
On Sunday, The New York Times reported the story of two sisters who were sexually exploited as children by their own father. Graphic pictures were taken. The father is in prison. The pictures are still out there in cyberspace.
The sisters, now young women, live with the fear that they will be recognized and harassed by creeps who can still see those images. Internet companies can identify and scrub pictures and videos of child pornography. If artificial intelligence can drive trucks, compare DNA samples, and identify millions of faces in China, it can certainly do this work.
The tech industry has been ramping up efforts to identify child abuse online. But even flagging the 45 million images found so far, the number reported by The Times, isn't enough. Companies could be more diligent in their efforts to stop the storage and sharing of these photos and videos. They could begin now to find ways to stop an even more disgusting behavior: the live streaming of the sexual exploitation of children.
Tech companies already mine users' private information to make giant profits. Protecting the victims of child sexual exploitation seems like a far better reason to enforce their user agreements.
The kind of scrubbing these AI technologies would require raises real privacy issues. Society will need to debate and agree upon the kinds of images that constitute child sexual abuse.
Rape is certainly definable. The line between innocent fashion posts or private family photos of cute kids and prepubescent children in sexually provocative poses should be nearly as obvious.
In the early days of the Internet and social media, creators and users failed to fully appreciate the consequences of content living online forever. Now everyone knows better.
Blocking images of child sexual exploitation should be a top priority for tech companies.