PUBLISHED: April 10, 2019

Ward: Mining public health strategies to inoculate against damage from “deep fakes”

Jeff Ward

Associate Clinical Professor Jeff Ward JD/LLM ’09 posits that public health offers inspiration for policy approaches to combat the social and legal harms threatened by the digital generation of disinformation.

Ward, director of the Duke Center on Law & Technology and the Law School’s associate dean for technology and innovation, examined some of the social and legal challenges posed by advances in artificial intelligence — specifically generative adversarial networks (GANs) — in a November talk at Stanford University’s CodeX Center. Demonstrating how “generative” systems replicate data to create digital artifacts (photos, speeches, or videos, for example) to try to fool “discriminative” systems designed to guess whether the creations are real or fake, Ward observed that the systems are becoming both increasingly expert and increasingly autonomous at their tasks. And increasingly sophisticated “deep fakes” can have very real consequences: pornographic videos can be fabricated, or electoral candidates depicted making racist comments, from snippets of an individual’s voice and photos pulled from a social media account; automated tweets from bots can generate and spread false statistics about vaccination harms; and GAN-generated evidence can find its way into court proceedings.
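The adversarial dynamic Ward describes can be caricatured in a few lines of Python. This is only an illustrative sketch, not how real systems work: actual GANs pit neural networks against each other and train both by gradient descent, while here a toy “generator” simply searches for one-dimensional fakes that fool a toy “discriminator.”

```python
import random

# Toy illustration of the generative-vs.-discriminative loop: the
# discriminator learns what "real" data looks like, and the generator
# adjusts its fakes until the discriminator can no longer tell them apart.

def real_sample():
    # "Real" data clusters around 0.8.
    return random.gauss(0.8, 0.05)

class Discriminator:
    def __init__(self):
        self.center = 0.5  # initial guess of where real data lives

    def is_real(self, x, tol=0.15):
        # Call a sample "real" if it lies close to the learned center.
        return abs(x - self.center) < tol

    def learn(self, real_xs):
        # Shift the notion of "real" toward the mean of observed real data.
        self.center = sum(real_xs) / len(real_xs)

class Generator:
    def __init__(self):
        self.center = 0.2  # fakes start far from the real data

    def fake_sample(self):
        return random.gauss(self.center, 0.05)

    def learn(self, disc, trials=100):
        # Search for a center whose fakes fool the discriminator most often.
        def fool_rate(c):
            return sum(disc.is_real(random.gauss(c, 0.05))
                       for _ in range(trials)) / trials
        candidates = [random.uniform(0, 1) for _ in range(50)]
        self.center = max(candidates, key=fool_rate)

disc, gen = Discriminator(), Generator()
for _ in range(5):
    disc.learn([real_sample() for _ in range(100)])
    gen.learn(disc)

# After a few rounds the generator's fakes cluster near the real data,
# and the discriminator can no longer reliably separate real from fake.
print(round(gen.center, 2))
```

The unsettling point of the talk is visible even in this toy: once the generator converges, the discriminator’s “real” label is no longer informative, which is exactly the erosion of trust Ward warns about.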

“What happens when you supercharge the data — the fakes — such that it’s harder and harder for even a sophisticated audience to determine whether it’s real or fake?” asks Ward, who also cautioned judges to be alert to deep fakes in court in a recent Judicature essay. “This is the most frightening thing to me — that it undermines our ability to believe in anything, in any context.”

The uncertainty surrounding a deep fake’s source, coupled with free speech protections, makes traditional legal and regulatory approaches that focus on punishing or controlling sources and intermediaries insufficient, he said, pointing to a promising model in the way public health campaigns target intermediaries and receptors to fight the spread of infectious diseases. “How do we combat the flu? Our first response is flu shots and vaccines.” He likened disinformation to influenza: “Receptors are important.”

A public health approach would try to raise awareness and understanding of both the threat at hand and the proposed intervention, and would build structures for organized action, Ward said. In the case of threats posed by GANs, the focus might shift from merely regulating industry to educating community stewards and influencers, and it would build upon liability regimes to include coping strategies for systemic resilience.

“I have faith in the ability of people — the receptors — to be an important part of the defense to disinformation,” Ward said.

Watch: “One giant leap for machinekind: Generative adversarial networks and the next age of tech regulation.”