“This looks like AI.” That phrase kills conversations. A writer posts an illustration. A photographer shares a shot. A creator uploads audio—and the first response in the replies is skepticism masquerading as inquiry. The accusation has become reflexive, not because AI is everywhere, but because we’ve built a system where proving your work is human-made is now harder than claiming it isn’t.
The Verge’s recent piece surfaces a real tension: generative AI tools won’t label themselves. They have no incentive to. But creators—the ones facing actual displacement risk—have every incentive to prove authenticity. Yet we don’t have a mechanism for it.
The Labeling Asymmetry Problem
Here’s what’s broken: platforms refuse to reliably label obvious AI content. At the same time, human creators have no standardized way to prove their work came from a human hand. You’re stuck defending yourself after the fact, not certifying yourself beforehand.
This creates a perverse incentive structure. If a creator uploads a photograph and someone in the comments says “this is AI,” the burden shifts to the creator to disprove it. They have to explain their workflow, their equipment, their process—all retroactively. The accuser has nothing to prove.
Meanwhile, actual AI-generated content? It stays unlabeled unless a platform makes the decision to flag it manually. Most don’t do so consistently, and none do it at scale.
A Fair Trade Model for Human Work
The Verge’s comparison to Fair Trade certification is precise. Fair Trade works because it’s a certification visible on the product before purchase: it signals that a specific standard was met. Under this model, a human creator—a writer, illustrator, photographer, or audio engineer—could attach a similar badge or certificate that says: “This work was created by a human.”
What would this require?
- Standardized metadata: Embed human-creation verification into file headers, similar to how digital signatures work in contracts. Not unhackable, but trustworthy enough for casual verification.
- Platform-level support: Integrate the label into upload workflows. When a creator posts, they certify their work is human-made. Platforms display it. Simple claim, visible verification.
- Consequences for false claims: If someone falsely certifies human creation, there’s accountability. This is the missing piece. Right now, claiming AI content is human has almost no friction.
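To make the metadata idea concrete, here is a minimal sketch of what a signed human-creation claim could look like. Everything here is an assumption for illustration: the record schema, the field names, and the use of an HMAC signature (a real scheme would use public-key signatures, in the spirit of C2PA-style provenance manifests, so that anyone can verify without holding the signing key).

```python
import hashlib
import hmac
import json
import time

# Hypothetical schema: a minimal "human-created" certification record.
# HMAC stands in for a real public-key signature to keep this self-contained.

def certify(file_bytes: bytes, creator_id: str, key: bytes) -> dict:
    """Build a signed claim that `file_bytes` was human-created."""
    claim = {
        "creator": creator_id,
        "assertion": "human-created",
        "content_sha256": hashlib.sha256(file_bytes).hexdigest(),
        "timestamp": int(time.time()),
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return claim

def verify(claim: dict, file_bytes: bytes, key: bytes) -> bool:
    """Check the signature, and that the content hash still matches the file."""
    sig = claim.get("signature")
    if not isinstance(sig, str):
        return False
    body = {k: v for k, v in claim.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(sig, expected)
            and body["content_sha256"] == hashlib.sha256(file_bytes).hexdigest())
```

The point of the sketch is the shape of the trade: the creator makes one structured claim at upload time, and anyone downstream can check both that the claim wasn’t altered and that it refers to exactly this file. Tamper with either, and verification fails.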
Why This Actually Matters for Creators
The displacement risk is not theoretical. Stock photography sites are flooded with AI-generated images. Writing platforms host AI-written articles. Music distribution has AI-composed tracks. A human creator competing against these needs a way to signal difference at the point of consumption, not after suspicion.
Without a labeling system, creators are forced into an impossible position: they have to argue about their own authenticity constantly, or they exit the space. The certification approach flips this. The creator makes one claim upfront; the platform honors it; the audience sees it.
Why Platforms Won’t Do This Alone
Here’s the friction: platforms benefit from ambiguity. A stock photo site with unlabeled AI images gets cheaper inventory and higher margins. A writing platform with AI content gets faster scaling. If certification is voluntary, platforms with financial incentives to host AI content won’t adopt it.
This means regulation or industry standard-setting has to drive adoption. The EU’s AI Act and similar frameworks are starting to address labeling, but the focus is backward-facing—labeling AI content as AI. The inverse label—certifying human creation—isn’t getting the same attention.
What You Should Do Now
If you’re creating work online, start documenting your process. Screenshots of your tools, process videos, timestamps—keep a record. It’s not a perfect solution, but when someone asks “prove it was human,” you have material to show.
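One lightweight way to keep that record is to hash your drafts as you go, so you can later show a dated trail of work in progress. This is a sketch, not an established tool: the log filename and entry format are made up for the example.

```python
import hashlib
import json
import pathlib
import time

def log_snapshot(path: str, logfile: str = "provenance.jsonl") -> dict:
    """Append a timestamped SHA-256 fingerprint of a draft to a local log."""
    data = pathlib.Path(path).read_bytes()
    entry = {
        "file": str(path),
        "sha256": hashlib.sha256(data).hexdigest(),
        "logged_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Run it each time you save a meaningful revision. The log doesn’t prove a human made the work, but a chain of intermediate drafts with matching hashes is far harder to fake after the fact than a single finished file.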
Push your platform to adopt human-creation metadata. Most don’t have it yet. If enough creators demand it, the calculus changes. You’re not asking for perfection—you’re asking for a standard that makes certification possible.
And be specific when you see AI-generated work without labels. Don’t just call it out; note what the label was missing. Make the absence visible. That friction is how change happens.