Artificial intelligence makes it easy to create convincing manipulated photos, videos and audio, known as deepfakes. To counter this, researchers at ETH Zurich developed a sensor chip that cryptographically signs images, video and audio inside the device at the moment the signal is recorded. The signature identifies the camera or recorder, records when the content was captured and shows whether it has been changed later.
The signatures can be published in a publicly accessible, immutable ledger, for example a blockchain, so anyone can verify a file by comparison. The team argues that if data are signed at capture, any later manipulation leaves detectable traces. One developer explains that altering the data would require physically attacking the chip itself, which would take considerable technological effort and would make mass production of manipulated content for social media practically impossible.
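The capture-and-verify flow described above can be sketched in Python. This is a minimal illustration, not the researchers' actual scheme: an HMAC with a hypothetical per-device secret stands in for the chip's cryptographic signature, and a plain list stands in for the public ledger. A real design would use an asymmetric key pair so anyone could verify a file without holding the signing secret.

```python
import hmac
import hashlib

# Stand-in for a public, append-only ledger (the article suggests a blockchain).
LEDGER = []

# Hypothetical per-device secret; the real chip would hold an asymmetric key
# so verifiers never need access to the signing secret.
DEVICE_KEY = b"secret-baked-into-the-chip"

def sign_at_capture(device_id: str, timestamp: str, data: bytes) -> str:
    """Sign the device id, capture time and raw data, then publish the signature."""
    msg = device_id.encode() + b"|" + timestamp.encode() + b"|" + data
    sig = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    LEDGER.append({"device": device_id, "time": timestamp, "sig": sig})
    return sig

def verify(device_id: str, timestamp: str, data: bytes, sig: str) -> bool:
    """Recompute the signature and check it against the published ledger entry."""
    msg = device_id.encode() + b"|" + timestamp.encode() + b"|" + data
    expected = hmac.new(DEVICE_KEY, msg, hashlib.sha256).hexdigest()
    published = any(entry["sig"] == sig for entry in LEDGER)
    return hmac.compare_digest(expected, sig) and published

photo = b"\x89PNG...raw pixels"
sig = sign_at_capture("camera-42", "2025-01-01T12:00:00Z", photo)
print(verify("camera-42", "2025-01-01T12:00:00Z", photo, sig))         # unchanged file: True
print(verify("camera-42", "2025-01-01T12:00:00Z", photo + b"!", sig))  # tampered file: False
```

Even in this toy version, changing a single byte of the file, the timestamp or the device id invalidates the signature, which is the property the article describes: manipulation after capture leaves detectable traces.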
The technology can in principle be built into many cameras and recording sensors. Social media platforms could verify content automatically when users upload it, while journalists, researchers and public authorities could authenticate material with simple tools. The idea began as a side project in a bioengineering lab; the group had planned such a sensor as early as 2017.
The chip described in the paper is a working prototype that shows technical feasibility, but more development is needed before commercial use. The researchers have filed a patent application and are exploring how to reduce costs for camera and sensor manufacturers. The research appears in Nature Electronics and was supported by the SNSF and the SBFI through the SwissChips initiative.
Difficult words
- deepfake — convincing fake media created by AI
- sign — add a cryptographic mark to data
- signature — cryptographic proof of origin and integrity
- immutable — cannot be changed or deleted later
- ledger — record of transactions or entries in order
- authenticate — verify that material is genuine and unchanged
- manipulation — intentional change of media to mislead
- prototype — early working model used to test feasibility
- patent — legal protection for an invention or idea
- sensor — device that detects or measures physical signals
Tip: hover, focus or tap highlighted words in the article to see quick definitions while you read or listen.
Discussion questions
- How might automatic verification by social media platforms change the spread of manipulated media? Give reasons.
- What practical challenges could camera manufacturers face when adding this signing chip to devices?
- Should journalists and public authorities rely on such signatures when verifying source material? Why or why not?
Related articles
Digital harassment of women journalists in Indonesia
Online attacks against female journalists and activists in Indonesia have become more visible in the last five years. Victims report doxing, edited photos, DDoS and other abuse, while legal protection and platform responses remain limited.
New acid-free way to recycle lithium-ion batteries
Researchers at Rice University developed a two-step FJH-ClO process that separates lithium and other metals from spent batteries. The lab-scale method recovers valuable materials with less energy, fewer chemicals and less wastewater.
AI expands sexual and reproductive health access in Latin America
Research groups in Peru and Argentina use AI tools to give sexual and reproductive health information to young and marginalised people. Experts praise potential but warn of bias and call for better data, rules and oversight.