Manipulated Reality, Menaced Democracy: An Assessment of the DEEP FAKES Accountability Act of 2019

2020 Legislation Competition Winner

By: Daniel Lipkowitz

March 5, 2020

Imagine waking up on Election Day. Before heading out to cast your vote for the next President, you turn on the news. A breaking report shows footage of one of the candidates, in an apparently private conversation, announcing that, if elected, they would use the power of the office to pardon criminals for the right price. The candidate’s campaign issues a press release denying the accusation and claiming the video is a forgery. However, the person depicted in the video is indistinguishable from the candidate disputing it, down to the freckles on their face. What do you believe?

The computerized manipulation of content is nothing new. Deepfakes are unique, however, in that they use machine learning to replicate facial movements with a realism and fluidity that previously required painstaking, tailored effort. As AI systems advance, less and less data is needed to render more and more convincing forgeries.

While deepfakes enhance virtual reality programs and have allowed Hollywood to interpolate a young Carrie Fisher into a recent Star Wars movie, the technology also presents serious risks. The use of deepfakes to alter elections, commit fraud, defame individuals, and even incite violence is no longer hypothetical. Deepfake technology can degrade trust in the institutions that individuals rely upon most. When any video of a government official, member of the press, or even our closest friends and family can be puppeteered, the validity of every video will be questioned.

H.R. 3230 – The DEEP FAKES Accountability Act

Congresswoman Yvette Clarke’s DEEP FAKES Accountability Act, H.R. 3230, attempts to prevent the sinister application of deepfakes to spread disinformation. The legislation requires altered videos and audio to contain a watermark and an explicit text or audio statement notifying viewers that the content is a false depiction. A creator’s failure to implement these requirements is a crime.[1] The bill also allows victims of harmful deepfake portrayals to seek compensation in civil court through a private right of action and permits in rem litigation against the deepfake content itself to have the content declared materially false.[2]

It is unlikely that H.R. 3230 will prevent the creation and distribution of what it calls “advanced technological false personation records.”[3] Watermarks are easily removable, and it is extremely difficult to track down the creators of harmful false content. Nevertheless, the bill provides an important foundation for tackling the issue. First, it will allow benevolent users to create deepfake content with a clear notion of what is acceptable under criminal and civil law. Second, current criminal and tort law is ill-suited to deal with the unprecedented harms caused by deepfakes. This legislation gives victims legal recourse in the rare cases where a malicious actor is identified. That recourse is particularly salient for victims of deepfake-fueled humiliation.

The discussion of deepfakes introduces a kaleidoscope of issues and conflicts. With the legislation’s benefits in mind, I will focus on two modifications to H.R. 3230 that would strengthen the legislation and could feasibly enhance bipartisan support for the bill in Congress. The first recommendation aims to account for dangerous applications of deepfake technology that we have yet to conceive. The second addresses the most pressing application of deepfakes that we can foresee: the technology’s potential to pervert our democratic process.

Unforeseen Applications of Deepfake Technology 

H.R. 3230 lists three types of intent and one situational factor that would subject a creator of deepfake content to penalty.[4] The list admirably attempts to cover all potential harms deepfakes could inflict. Yet rapid advancements in deepfake technology could implicate a whole host of crimes that H.R. 3230 does not mention. One example is the use of digital manipulation to create child pornography, which is distinct from slander or nonconsensual pornography.[5] In Ashcroft v. Free Speech Coalition, the Supreme Court recognized the confusion digital manipulation could introduce into child pornography prosecutions. Justice Thomas noted that “technology may evolve to the point where it becomes impossible to enforce actual child pornography laws because the Government cannot prove that certain pornographic images are of real children.”[6] Addressing deepfakes’ implications for child exploitation in H.R. 3230 would likely invite strong bipartisan support and help move the legislation through the committee process. H.R. 3230 has been referred to the House Judiciary Committee, and “Preventing Crimes Against Children” is a top priority for Judiciary Committee Republicans.

It is important to identify and penalize the current disturbing trends in deepfake use. Nevertheless, if this legislation is to serve as a foundation for future legislation, it must not only be corrective but also pioneering. H.R. 3230 implicates existing criminal law by criminalizing deepfake production “in the course of criminal conduct related to fraud, including securities fraud and wire fraud, false personation, or identity theft.”[7] However, the legislation should acknowledge the unknown future applications of deepfake technology by permitting criminal or civil liability for the production of a deepfake in the course of any criminal or tortious act. The Malicious Deep Fake Prohibition Act of 2018, introduced in the Senate, reached all content that “would facilitate criminal or tortious conduct.”[8] Recently introduced legislation in Massachusetts has adopted a similar standard.[9] Using the rubric of existing criminal and tort law would increase not only H.R. 3230’s reach but also its clarity. Rather than teeing up debate over potentially vague terms such as “humiliate” or “diplomatic conflict,” which have no settled legal definition, prosecutions could refer to an established canon of law.[10]

Deepfake Technology and the U.S. Electoral Process

Some dangers posed by deepfake technology are easily recognizable. The most striking is its potential to corrupt our electoral process. Deepfakes do not simply disparage individual political officials; they call all depictions of politicians into question. This delegitimization of the electoral process exacerbates existing distrust and division. H.R. 3230 advances protections against foreign interference in our elections by providing extraterritorial federal jurisdiction over an offense “if the defendant or the depicted person is a citizen or permanent resident of the United States.”[11] If jurisdiction over a defendant cannot be established, the bill enables in rem litigation to declare the content materially false.[12]

However, political interference will not always be orchestrated by a foreign power. H.R. 3230 fails to cover domestic actors who propagate misleading depictions of political figures. To evade complicated questions surrounding free speech, the legislation may have intentionally excluded individuals who are not acting on behalf of a foreign power. But given the outsized impact deepfakes have in the political arena and the heightened public awareness of the issue, Congress should consider adding language that covers domestic actors.

There are no precedential court decisions specifically addressing the regulation of deepfakes and, therefore, no definitive understanding of the constitutionality of limiting deepfake production. Yet established limits on false speech about political candidates may provide some guidance. A deepfake, like any other form of speech, can likely be prohibited if it falsely depicts a political candidate, does not include a disclaimer, and is made “with knowledge that it was false or with reckless disregard of whether it was false or not.”[13] Notably, H.R. 3230 targets deepfake content creators rather than distributors. This narrow scope avoids complicated questions surrounding the culpability of those who share deepfakes.

One possible way to create accountability for all deepfake-driven political interference would be to prohibit any person or entity, within a certain period of time before an election, from creating intentionally damaging deepfake audio or visual media of a candidate, unless the media met H.R. 3230’s disclosure requirements. This approach was first introduced in the California legislature.[14]

Emphasizing the legislation’s applicability to election security may also garner additional Democratic support. One of the Democrats’ top priorities this congressional term has been election security. The very first bill brought to the floor, H.R. 1, was a package of various election reforms.[15] H.R. 1 will likely be reintroduced in the next Congress, as the Senate has shown no interest in taking it up this Congress. H.R. 1 could therefore act as a promising vehicle for H.R. 3230: with its election security measures accented, H.R. 3230 is more likely to be incorporated into H.R. 1.

Conclusion

Deepfakes are not the only form of pernicious computerized manipulation. In 2019, a video of Speaker of the House Nancy Pelosi was slowed down and its audio pitch adjusted to make it appear as though the Speaker slurred her words and was mentally impaired. That video used simple video editing software rather than deepfake technology, yet it achieved a similarly damaging effect. H.R. 3230 will not eliminate malicious deepfakes, and even if it did, it would not eliminate other forms of computerized misinformation. But at the very least, H.R. 3230 makes it a crime to create deepfake content without notice of its speciousness. Having such a crime on the books provides a foundation upon which justice can be sought and additional preventative measures can be built.


Daniel Lipkowitz, J.D. Class of 2022, N.Y.U. School of Law.

Suggested Citation: Daniel Lipkowitz, Manipulated Reality, Menaced Democracy: An Assessment of the DEEP FAKES Accountability Act of 2019, N.Y.U. J. Legis. & Pub. Pol’y Quorum (2020).

1. Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019, H.R. 3230, 116th Cong. (2019).
2. Id. at sec. 2 § 1041(g), sec. 4(a).
3. Id. at sec. 2 § 1041(n)(1).
4. Id. at sec. 2 § 1041(f)(1)(A).
5. See Daniel S. Armagh, Virtual Child Pornography: Criminal Conduct or Protected Speech?, 23 Cardozo L. Rev. 1993, 1994-96 (2002).
6. Ashcroft v. Free Speech Coal., 535 U.S. 234, 259 (2002) (Thomas, J., concurring).
7. Supra note 1, at sec. 2 § 1041(f)(1)(iii).
8. Malicious Deep Fake Prohibition Act of 2018, S. 3805, 115th Cong. sec. 2 § 1041(b)(1) (2018).
9. An Act to Protect Against Deep Fakes Used to Facilitate Criminal or Tortious Conduct, B. 3366, 191st Leg., Reg. Sess. (Mass. 2019).
10. Supra note 1, at sec. 2 § 1041(f)(1).
11. Id. at sec. 2 § 1041(m).
12. Id. at sec. 4(a).
13. N.Y. Times Co. v. Sullivan, 376 U.S. 254, 280 (1964).
14. A.B. 730, 2019-2020 Leg., Reg. Sess. § 4(a) (Cal. 2019).
15. For the People Act of 2019, H.R. 1, 116th Cong. (2019).