Deepfakes: The Morals of Swapping Faces

Machine learning and artificial intelligence have been the hottest topics in the tech world for quite a while now. With the power to emulate human thought, computers can now tackle a whole new range of problems: safely navigating cars through major cities, predicting emergency room waiting times, and even playing the ancient Chinese game of Go better than any human alive. But with any technology there is always the chance, perhaps even the inevitability, that it will be used for less-than-favorable purposes. Such is the case with FakeApp.

The idea is simple: take a source video of a person and replace their face with someone else’s. A machine learning algorithm analyzes the source video along with a large collection of photos of the target face, and learns how to map the target face onto the source video. The resulting videos are called “deepfakes”. FakeApp itself is easy to find online if you’re interested, but the goal of this post is to examine the implications of this technology rather than to explain how it works.
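For the curious, tools in this family are commonly built around an autoencoder with one shared encoder and a separate decoder per identity: train each decoder to reconstruct its own face, then at swap time route the source face through the target’s decoder. The sketch below illustrates only that data flow; the weights are random stand-ins and the names and dimensions are illustrative assumptions, not FakeApp’s actual architecture.

```python
import numpy as np

# Illustration of the shared-encoder / two-decoder idea behind face swapping.
# Both identities share one encoder; each identity has its own decoder.
# Swapping = encode a frame of person A, then decode with person B's decoder.

rng = np.random.default_rng(0)
FACE_DIM = 64 * 64    # a flattened 64x64 grayscale face crop (assumed size)
LATENT_DIM = 128      # shared latent representation (assumed size)

# Stand-ins for trained weights (random here, for shape illustration only).
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01  # decoder for identity A
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01  # decoder for identity B

def encode(face):
    # Shared encoder: compress any face into the common latent space.
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    # Identity-specific decoder: reconstruct a face from the latent code.
    return W_dec @ latent

frame_a = rng.standard_normal(FACE_DIM)  # a face crop from the source video

# Training would teach each decoder to reconstruct its own identity;
# at swap time we route A's latent code through B's decoder instead.
swapped = decode(encode(frame_a), W_dec_b)
print(swapped.shape)  # (4096,)
```

In the real training process the two reconstruction losses are what force the shared latent space to capture pose and expression while each decoder supplies identity, which is why the swap preserves the source video’s movements.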

First, we should discuss feasibility. FakeApp is free to download, and there are plenty of mirrors, so even if one file-sharing site removes it, it will be available somewhere else. To get convincing results, however, the user needs a large collection of photos of the target face from many different angles. They also need a powerful computer to train the model; otherwise the process takes too long to be practical. Even then, training takes hours. In short, anyone with a good computer, a lot of pictures, and a lot of time could make deepfakes.

The technology is very new. It was launched by Reddit user deepfakes (who has since been banned) on December 11th, 2017. This user posted a few videos on various subreddits (communities within Reddit), piquing interest before eventually releasing his tool. Since its release, it has been used for some very funny projects, including one that swaps Nic Cage’s face into a variety of famous scenes from different movies.

However, deepfakes are most commonly used for a more questionable purpose: fake pornography. Deepfake creators gather large photo datasets of famous people (generally actresses) and swap them onto pornographic videos featuring similar-looking performers. This raises both moral and legal concerns.

The primary question is one of consent. Obviously, the famous person did not consent to having their face used in a pornographic video. But is consent needed? The famous person isn’t actually doing anything; their likeness is simply being used in an unflattering way. They could attempt to sue for defamation, but celebrities rarely win with this argument, because being in the public eye is the nature of their career.

Then there is the question of whether this should be allowed to happen at all. On one hand, one could argue that it isn’t really hurting anyone; on the other, there is something fundamentally wrong about making it look like someone did something compromising that they never actually did.

And therein lies the major issue. While deepfakes so far have mainly involved celebrities, there is no reason someone with enough photos of a coworker or an ex couldn’t make a deepfake with their likeness. Such deepfakes, if convincing enough, could be used for revenge or blackmail. This prompts the same moral and legal questions, but the context changes when the target is a private citizen. While the victim could sue for defamation, the legal territory around deepfakes is still murky. Since they were never actually in a compromising position, they would have to carefully build a case showing clear evidence of defamation or other concrete harm. By that point, the person who created the deepfake may already have gotten what they wanted.

As of late, many platforms have banned deepfakes and worked to remove them from their sites. These include Discord (a chat service targeted at gamers), YouTube, Twitter, Reddit (where it all began), and even Pornhub. However, deepfakes will never fully go away. Creators have their own secretive communities where they share videos, photo collections, and tricks, and it is fairly straightforward for anyone to download the app and make a deepfake themselves. Perhaps in the near future we’ll see the first deepfake-related case go to court, or perhaps the creators will retreat to their private communities and remain there without provoking attention. Regardless, this application of machine learning is remarkable both in its relative simplicity and in the depth of the moral and ethical issues it manages to raise.


Sources:

https://motherboard.vice.com/en_us/article/gydydm/gal-gadot-fake-ai-porn

https://www.theverge.com/2018/1/30/16945494/deepfakes-porn-face-swap-legal

https://www.theverge.com/2017/12/12/16766596/ai-fake-porn-celebrities-machine-learning

2 thoughts on “Deepfakes: The Morals of Swapping Faces”

  1. Alex Cayley says:

I think this is honestly such a fascinating topic. I saw the post on Reddit the other day announcing the removal of r/deepfakes due to Reddit’s policy on non-consensual images and videos. I have to say, personally I agree with this. At a point where this specific software is relatively new, only just starting to emerge, and the results are already fairly realistic, it is important to prevent it from developing further and becoming a more widespread tool that produces even more realistic, indistinguishable results.
    Once this software can produce extremely realistic results with limited computing power and easy access, I believe laws and regulations will most likely be enacted to prevent the production or use of such material. The problem, of course, is how you would ever tell whether an image had been faked, which is kind of a paradox.

One issue that I have thought about is that this technology could definitely be utilized in other legal areas. Today, video (and, to an extent, image) evidence is treated as nearly indisputable in a court of law. So could people use this technology to alter who committed a crime, or frame an innocent individual, by modifying the face of a person in a video? If used widely, it might cause video evidence to eventually stop being treated as fully concrete.

I think a scary thought is that even though websites are removing the “deepfake” videos now, at some point we won’t be able to tell whether a faked explicit celebrity video is real or not. A video could simply be posted as a “Celebrity Sex Tape” and people would most likely assume it is real. So the problem (as I mentioned above) is that once the technology becomes so advanced and successful that the results are completely realistic, websites will not be able to distinguish between what is fake and what is not.

  2. Morgan says:

In general, the realm of technology is advancing at absurd speeds. Pretty soon machines are going to be replacing all of us and there will be no need for humans to work (other than the people who make the technology). I can just imagine a bunch of robots walking around performing tasks for humans… kind of like the movie I, Robot, haha. But in all actuality, this probably won’t be the case for years and years.
In terms of the topic you wrote about in your blog, deepfakes, I am appalled that these are a real thing! First of all, I cannot believe that we have this kind of technology. I knew that on movie sets, some sort of green-screen technology was used to make people look like monsters and things of that sort. It was even used to recreate the face of Paul Walker, the leading actor of the Fast and Furious movie series, after he passed away in the middle of shooting the seventh movie. To finish the film, they recruited Paul’s brother and used face-replacement technology to make him resemble Paul almost seamlessly.
    On the professional scale, I think this practice is perfectly acceptable. The producers of Fast and Furious 7 aimed to honor Paul Walker’s legacy and finish his final movie in the long-running series, and since the technology and resources exist, it made sense to use them.
    Beyond the professional world, I am very surprised that this sort of technology exists. I would never have imagined that a simple app could perform such a reality-altering task. When this technology enters public access is where I think the issue lies. People are too often unconcerned with the outcomes of their actions, and putting the face of an unsuspecting individual on the body of, say, a pornography performer is irresponsible to say the least. The person whose face is being replicated could suffer personal and professional backlash for such a video even though they did not in any way participate in the acts shown on tape. The fact that someone thought our society could handle an algorithm that can perform these tasks reflects the immaturity of a portion of the American people.
