There's a certain poetic irony when the CEO of OpenAI — the man who arguably did more than anyone to put generative AI in the hands of the masses — becomes the subject of a deepfake screening right here in San Francisco.

A screening focused on deepfaking Sam Altman recently made the rounds locally, and if you weren't paying attention, you should have been. Not because it's some gotcha moment against Altman personally, but because it crystallizes the single most important question the AI industry refuses to answer honestly: who's responsible when this stuff goes wrong?

Let's be clear about the landscape. San Francisco is ground zero for the AI boom. Billions of dollars are flowing through SoMa offices and Mission District co-working spaces to build tools that can convincingly fabricate anyone's face, voice, and likeness. The technology is breathtaking. It's also terrifying. And the people building it have spent far more time fundraising than they have proposing meaningful guardrails.

Altman himself has called for AI regulation — often in the same breath as launching products that make regulation harder. It's the classic Silicon Valley two-step: move fast, break things, then show up to Congress looking concerned.

Here's where the libertarian in us gets uncomfortable, though. Government regulation of AI deepfakes is almost certainly going to be clumsy, overbroad, and expensive. Washington doesn't exactly have a stellar track record of understanding technology written after 1998. But the alternative — trusting billion-dollar companies to self-police technology that can undermine elections, destroy reputations, and defraud ordinary people — doesn't exactly inspire confidence either.

The real answer is what it always is: clear property rights, enforceable consent laws around likeness, and actual legal consequences for bad actors. Not a new federal agency. Not an AI czar. Just straightforward accountability.

Sam Altman getting deepfaked is funny until it's your face. And in San Francisco, that future is closer than you think.