Truth

Proof over appearance.

Core claim: there is no reliable way to determine whether a digital photo or video is real by looking at it alone.

Generative AI can produce convincing media without visible artifacts. If authenticity matters, it must be proven, not inferred.

This page separates facts, implications, and limits—so confidence doesn’t outpace what can be checked.

One-sentence summary

Trust is a property of process: authenticity comes from verifiable origin and handling, not visual persuasion.

What can be verified

Replace “Does this look fake?” with a checkable question: “Can its history be proven?”

Origin

Where the file came from and when it was captured, provided the proof is present and intact.

Integrity

Whether the captured bytes were altered after the proof was created.

Custody

Whether there is a continuous, auditable record of handling from capture to delivery.
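The custody property above can be made concrete. As an illustration only (not any particular product's format), a custody record can be kept as a hash chain, where each handling event commits to the hash of the previous one, so an edited or dropped entry is detectable:

```python
import hashlib
import json

def append_event(chain: list, actor: str, action: str) -> list:
    """Append a handling event linked to the previous entry's hash."""
    prev = chain[-1]["hash"] if chain else "genesis"
    entry = {"actor": actor, "action": action, "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return chain + [entry]

def chain_intact(chain: list) -> bool:
    """Recompute every link; any edited or dropped entry breaks continuity."""
    prev = "genesis"
    for e in chain:
        body = {k: v for k, v in e.items() if k != "hash"}
        if e["prev"] != prev:
            return False
        if e["hash"] != hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest():
            return False
        prev = e["hash"]
    return True

log = append_event([], "camera", "capture")
log = append_event(log, "vault", "ingest")
assert chain_intact(log)        # untouched chain verifies
assert not chain_intact(log[1:])  # a dropped entry breaks continuity
```

A real system would also sign each entry; the chain alone only proves internal consistency, not who wrote it.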

How provenance works (plain language)

  1. Capture: media is recorded on a real device.
  2. Bind: the device creates a cryptographic record at capture time.
  3. Protect: the record is signed and attached to the media.
  4. Verify: any change breaks the proof and is detectable.

Verification doesn’t require trusting the image. It requires checking the proof.
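The bind/protect/verify loop can be sketched in a few lines. This uses a shared-secret HMAC purely for illustration; a real capture device would sign with a hardware-backed asymmetric key (e.g. Ed25519), and all names here are hypothetical:

```python
import hashlib
import hmac
import json

# Illustrative only: real devices keep a private key in secure hardware.
DEVICE_KEY = b"device-secret-key"

def bind(media: bytes, captured_at: str) -> dict:
    """Steps 2-3: create and sign a capture-time record (the 'proof')."""
    record = {
        "sha256": hashlib.sha256(media).hexdigest(),
        "captured_at": captured_at,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(media: bytes, record: dict) -> bool:
    """Step 4: any change to the bytes or the record breaks the proof."""
    payload = json.dumps(
        {k: v for k, v in record.items() if k != "sig"}, sort_keys=True
    ).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    sig_ok = hmac.compare_digest(record.get("sig", ""), expected)
    hash_ok = record.get("sha256") == hashlib.sha256(media).hexdigest()
    return sig_ok and hash_ok

photo = b"raw capture bytes"
proof = bind(photo, "2025-01-01T12:00:00Z")
assert verify(photo, proof)             # intact: the proof checks out
assert not verify(photo + b"x", proof)  # edited bytes: the proof breaks
```

Note that the verifier never inspects what the image looks like; it only checks that the bytes and the signed record still agree.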

Scope and limits

Provenance restores verifiability, not certainty about context or intent.

What provenance can do

  • Prove a file matches a specific capture event
  • Show whether it was modified after capture
  • Provide an auditable chain of custody

What it cannot do

  • Prove moral truth or completeness
  • Reveal intent, framing, or omissions
  • Replace judgment and corroboration

How to verify (a simple path)

A verification path reduces the asymmetry between making a claim and checking it, even if most people never run the check.

If you have a file and need proof

  1. Look for provenance: does the file include a signed manifest / proof bundle?
  2. Verify the signature: check that the manifest is signed by a valid key.
  3. Verify hash continuity: ensure the hashes match the media bytes.
  4. Review the output: trust tier, capture conditions, and any breaks in continuity.
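The four steps above can be sketched as a single check that produces a reviewable report. The manifest layout, key, and trust_tier field are illustrative assumptions, not any specific proof-bundle format, and the HMAC stands in for a real signature scheme:

```python
import hashlib
import hmac
import json

# Illustrative stand-in for a validated signer's key.
TRUSTED_KEY = b"publisher-signing-key"

def check(media: bytes, manifest: dict) -> dict:
    """Walk the four steps: presence, signature, hash continuity, review."""
    if not manifest:
        return {"present": False}  # step 1 fails: no proof to check
    body = json.dumps(
        {k: v for k, v in manifest.items() if k != "sig"}, sort_keys=True
    ).encode()
    expected = hmac.new(TRUSTED_KEY, body, hashlib.sha256).hexdigest()
    return {
        "present": True,
        "signature_valid": hmac.compare_digest(
            manifest.get("sig", ""), expected
        ),
        "hash_matches": manifest.get("sha256")
        == hashlib.sha256(media).hexdigest(),
        "trust_tier": manifest.get("trust_tier", "unknown"),  # hypothetical field
    }

# Build a well-formed manifest to exercise the checker.
media = b"example media bytes"
m = {"sha256": hashlib.sha256(media).hexdigest(), "trust_tier": "captured"}
m["sig"] = hmac.new(
    TRUSTED_KEY, json.dumps(m, sort_keys=True).encode(), hashlib.sha256
).hexdigest()

report = check(media, m)
assert report["signature_valid"] and report["hash_matches"]
```

The point of returning a report rather than a yes/no is step 4: a human still reviews the trust tier and any breaks in continuity.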

Want capture that generates these proofs automatically? See Epistemic Recorder.

FAQ

Short answers, minimal jargon. When a term matters, it is defined.

Can AI detectors tell whether something is fake?

Detectors output probabilities, not durable proof. They often disagree, and their behavior changes as generative models change. Use provenance when the question is high-stakes.

What do you mean by “provenance”?

Provenance is a verifiable record of origin and handling: where media came from, when it was captured, and whether it has been modified. It is checked by verifying cryptographic proofs, not by judging appearance.

Does provenance mean the content is true?

No. Provenance can prove origin and integrity, but it cannot prove intent, framing, or completeness. It restores verifiability, not moral certainty.

Are watermarks or Content Credentials enough?

They can help when present and intact, but they are optional and can be removed or stripped. Verification-grade systems assume hostile conditions and provide an auditable chain of custody from capture.

How do I verify something in practice?

Use a verifier that checks signed manifests and hash continuity. Epistemic Suite supports verification at ingestion (Vault) or locally (Verifier CLI).