I recently watched the New York Times’ Weekly episode “Deepfakes – Believe at Your Own Risk” and while I have been concerned about the implications of undetectable forged audio and video for quite some time (both as an information security issue and a societal issue), this video really drove the dangers home to me in a visceral way. I highly recommend that you spend 30 minutes and watch this excellent reporting.
I felt compelled to write about this issue, both to share these concerns with my peers and to organize my thoughts on how to frame this problem. My initial reactions follow.
Deepfake technology feels like the equivalent of a weapon of mass destruction (at least to me). Many less sophisticated communications technologies have led to death and destruction in the wrong hands (see WhatsApp in India). Deepfakes have the potential to subvert democratically elected governments, incite people or nations to violence, and tear at the trust that is foundational to a properly functioning society. And yet, there are no effective controls on their development and use (and there probably cannot be).
Deepfakes truly mark the beginning of a “post-truth” society where liars will always have the upper hand. Liars can choose to put words into the mouths of their opponents to discredit them, or to disavow lies and inconvenient words as truly “fake news.” The power of audio and video evidence to indict or exculpate will be destroyed.
It is amazing to me to see the almost total lack of any serious introspection on the part of the people at Dessa as to whether they should be building perfect deepfake tools. The engineering team seems to be totally focused on “what CAN we do” rather than “what SHOULD we do.” This is the same kind of simplistic, technology- and profit-driven thinking that has gotten us to the point where social media tools provide our enemies with convenient ways to get our electorate to do their work for them. Yes, Dessa’s CEO noted that he didn’t sell the code to the CIA, but…
Any nation state actor (or criminal organization) that isn’t trying to develop their own deepfake tools or hack companies like Dessa is just not doing their job. I hope that Dessa and others who insist on working to bring deepfake technology to market have damn good cybersecurity, although I doubt that even the best countermeasures would stand up to nation state attack when the prize is so valuable.
What is left of the democratic system is under dire threat from cyber attacks and deepfakes. I would not be at all surprised if, days before the 2020 US presidential election, at least one sensational viral video or audio clip is “leaked” in an effort to sway voters at the last moment. And we will have no way to know whether the candidate actually said whatever incendiary words are attributed to them. When the White House is occupied by a pathological liar of questionable mental stability, backed by a core of supporters without critical thinking skills, this could lead to a constitutional crisis such as we have never seen before.
The private sector is under threat as well. Deepfakes can be used to destroy personal and corporate reputations, subvert the justice system, manipulate the stock markets, and serve as part of totally convincing social engineering schemes to gain access to information and networks. We information security people need to be thinking about how our companies can be protected against and react to deepfake-enabled attacks. At one level, this is a classic exercise in crisis communications, but at a whole new level, where relying on exculpatory evidence is getting harder and harder.
Action to deal with the problem of deepfakes is needed now, and I don’t think we can depend on the traditional means (legislation, government action) that we have relied on in the past. The very people and institutions we depend on for those measures are also those who stand to benefit from the post-truth era. Efforts to detect and call out deepfakes have to come from the scientific and journalistic communities. I just hope that there are enough AI folks and data scientists out there who are motivated by a recognition that a continuing retreat from truth has the potential to crack the foundations of our society.
I have to admit that the Times documentary left me very pessimistic as to where deepfakes are going to take us as a society. I have to do some more thinking and research on this topic and I am sure I will be returning to it in future blog posts. I would love to hear from readers – what are your thoughts on deepfakes?