“When we look at intimate image abuse, the vast majority of tools and weaponry have come from the open source space,” says Ajder. But they often start with well-intentioned developers, he says. “Someone creates something they think is interesting or cool, and someone with bad intentions recognizes its malicious potential and weaponizes it.”
Some, such as the repository disabled in August, have purpose-built communities around them for explicit use. The model positioned itself as a tool for deepfake porn, Ajder claims, and became a “funnel” for abuse, which is predominantly aimed at women.
Other videos uploaded to the porn streaming site by an account crediting AI models downloaded from GitHub showed the faces of popular deepfake targets, celebrities Emma Watson, Taylor Swift and Anya Taylor-Joy, as well as others less famous but very real women, superimposed in sexual situations.
The creators freely described the tools they used, including two that GitHub has scrubbed but whose code survives in other repositories.
Perpetrators in search of deepfakes congregate online in many places, including secret community forums on Discord and prominently on Reddit, undermining attempts at deepfake prevention. One Redditor offered their services using software from the archived repository on September 29. “Could someone do my cousin,” asked another.
Torrents of the main repository, which was banned by GitHub in August, are also available in other corners of the web, showing how difficult it is to monitor open source deepfake software across the board. Other deepfake porn tools, such as the app DeepNude, have been similarly taken down before new versions appeared.
“There are so many models, so many different forks in the models, so many different versions, it can be hard to keep track of them all,” says Elizabeth Seger, director of digital policy at UK-based cross-party think tank Demos. “Once a model is made open source and publicly available for download, there’s no way to do a public rollback of it,” she adds.
A deepfake porn creator with 13 manipulated explicit videos of female celebrities credited a prominent GitHub repository marketed as an “NSFW” version of another project that encouraged responsible use and explicitly asked users not to use it for nudity. “Learn all available Face Swap AI from GitHUB, not using online services,” their profile on the tube site says, cheekily.
GitHub had already disabled this NSFW version when WIRED identified the deepfake videos. But other repositories labeled as “unlocked” versions of the model were available on the platform on January 10, including one with 2,500 “stars.”
“It is technically true that once [a model is] out there it cannot be reversed. But we can still make it more difficult for people to gain access,” says Seger.
If left unchecked, she adds, the potential for harm from deepfake porn isn’t just psychological. Its knock-on effects include the intimidation and manipulation of women, minorities and politicians, as seen in political deepfake cases targeting female politicians around the world.
But it’s not too late to get the problem under control, and platforms like GitHub have options, Seger says, including intervening at the time of upload. “If you put a model on GitHub and GitHub said no, and all the hosting platforms said no, it becomes harder for a normal person to get that model.”
Curbing deepfake porn made with open source models also relies on politicians, tech companies, developers and, of course, creators of offending content themselves.
At least 30 US states also have some legislation addressing deepfake porn, including bans, according to nonprofit Public Citizen’s legislative tracker, although definitions and policies differ and some laws cover only minors. Deepfake creators in the UK will also soon feel the force of the law after the government announced on January 7 the criminalization of creating sexually explicit deepfakes, as well as sharing them.