A coalition of tech companies set up to combat deepfakes has released the first version of its technical specification for digital provenance.
The Coalition for Content Provenance and Authenticity (C2PA), which counts Adobe, Microsoft, Arm, Intel, Truepic and the BBC among its members, says the standard will let content creators and editors produce media that cannot secretly be tampered with.
It allows them to selectively disclose information about who has created or changed digital content and how it has been altered. Platforms can define what information is associated with each type of asset – for example, images, videos, audio, or documents – along with how that information is presented and stored, and how evidence of tampering can be identified.
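The core mechanism described above — binding a record of who made what changes to the content itself, so that any undisclosed edit becomes detectable — can be sketched in a few lines. This is an illustrative toy only, not the actual C2PA format: the field names, the JSON layout and the HMAC-based signature below are invented for demonstration, whereas the real specification defines its own manifest structure and signing scheme.

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key"  # stand-in for a creator's real private key


def make_manifest(asset_bytes: bytes, creator: str, actions: list) -> dict:
    """Bind a content hash and an edit history together, then sign the result."""
    claim = {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "creator": creator,
        "actions": actions,  # e.g. ["created", "cropped"]
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}


def verify(asset_bytes: bytes, manifest: dict) -> bool:
    """Tamper evidence: any change to the asset or the claim breaks verification."""
    claim = manifest["claim"]
    if hashlib.sha256(asset_bytes).hexdigest() != claim["asset_sha256"]:
        return False
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])


image = b"original pixels"
m = make_manifest(image, "alice@example.org", ["created"])
print(verify(image, m))           # untouched asset verifies: True
print(verify(image + b"x", m))    # any silent edit is detectable: False
```

Disclosed edits work the same way: the editor appends to `actions` and re-signs, so the history travels with the asset while undisclosed changes invalidate the signature.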
“We have long believed that secure media provenance is the best way to relay high-integrity, authentic digital content online,” says Jeff McGregor, CEO of Truepic.
“An open standard in which any platform, website, app, or organization can ingest, preserve, and publish that content to consumers will be critical to achieving trust at internet scale.”
In one famous deepfake that did the rounds in 2018, Barack Obama was apparently seen discussing fake news and criticizing Donald Trump. Others have involved pornographic images, while criminals have used deepfakes to impersonate company officials for the purposes of fraud.
In a report earlier this month, the Royal Society found that most people can’t detect a deepfake, even when they’ve been warned that the content they’re watching may have been digitally altered.
When shown deepfake videos of Tom Cruise created by VFX artist Chris Ume, nearly eight out of ten participants failed to identify them as fake.
Last summer, Facebook teamed up with Michigan State University on a reverse-engineering research method that detects and attributes deepfakes, even ones the detector has never seen before.
It uses fingerprint estimation to predict the network architecture and loss functions of an unknown generative model, based on a single generated image.
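The intuition behind such fingerprints is that generative models tend to leave subtle, consistent high-frequency residues in their output. A crude way to surface a residue of that kind is to subtract a smoothed copy of an image from the original. To be clear, this toy high-pass filter is not the Facebook/MSU estimator, which learns the fingerprint with a neural network and then regresses architecture and loss-function properties from it; the sketch below only illustrates what a "residual fingerprint" is.

```python
def box_blur(img, w, h):
    """3x3 mean filter over a flat grayscale list (row-major); borders kept as-is."""
    out = img[:]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            s = sum(img[(y + dy) * w + (x + dx)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y * w + x] = s / 9
    return out


def residual(img, w, h):
    """High-pass residual: original minus its blurred copy."""
    blurred = box_blur(img, w, h)
    return [a - b for a, b in zip(img, blurred)]


# A flat region leaves no residual; a pixel-level artefact survives the subtraction.
w = h = 8
flat = [100.0] * (w * h)
spiked = flat[:]
spiked[3 * w + 3] += 9.0  # a faint generator-style artefact
print(max(abs(v) for v in residual(flat, w, h)))    # 0.0
print(max(abs(v) for v in residual(spiked, w, h)))  # 8.0
```

In the real method the residual is estimated per-image and then fed to classifiers that predict properties of the unknown generator, which is what makes attribution possible from a single image.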
The success of the C2PA specification will depend on the extent to which it’s taken up by content creators and understood by the general public.
“As the C2PA pursues the implementation of open digital provenance standards, broad adoption, prototyping and communication from coalition members and other external stakeholders will be critical to establish a system of verifiable integrity on the internet,” says Leonard Rosenthol, chair of the C2PA technical working group and senior principal scientist at Adobe.