For Sale: One Iris. Slightly Used. Fair Market Rate.
- Tanya Zhuk
- Aug 14
- 4 min read
I walk the line between being excited for technology to prosper — and mourning the loss of humanity to that same technology.

As a Sci-Fi fangirl, I’m thrilled to see the tech we once thought super-futuristic, out-of-reach, or even impossible actually coming true in the foreseeable future.
I want to play with my smartphone, cruise in an autonomous car, and get dog food recipes from my AI buddy. I just don’t want to pay for it with my humanity.
I. The Cost of Being Human in a Machine Economy
Brands aren’t just looking for customers. They’re hunting for the most verifiable humans with the highest spending power.
In the U.S., we’ve been obsessed with building brands, knowing what’s best, and keeping up with the Joneses. But it’s the brands that need us. They need our support, our voice — and most importantly, our wallets.
And as much as you don’t want bots following you around the web tracking your clicks, views, and purchases, the brands don’t want deepfakes threatening their carefully orchestrated investments either.
These bots — the ones that make our lives easier — can also distort reality and drive brand spend even higher. Why? Because in a world of bots and fakes, verified humans are a luxury item.
II. Worldcoin and the Price of Your Iris
Sam Altman wants to give everyone access to AI’s rewards — whether you’re a tech billionaire or a barefoot kid in Kenya (his metaphor, not mine).
All you have to do? Let his team take a picture of your eye.
Why? Because in the future, the more provably human you are, the more valuable your data becomes.
But valuable to whom?
Look, I went through cancer several years ago. I don’t mind being poked and prodded for a good reason, or trading my personal data for an equitable return. I want speedy access at the airport — so yes, Clear, TSA, and the DMV already have my iris, my fingerprints, and probably more identifiers I’ve forgotten.
Actually, come to think of it, I’m an immigrant. My fingerprints were probably logged with the FBI when I became a citizen. They even required a specific angle of my face to capture my ear — oh yeah, that too.
III. The False Promise of Digital Equity
Let’s say you’re like me and don’t care. You agree to have your iris scanned.
Do you really think you’ll get digital equality?
Equality compared to whom? That kid in Kenya, the one who can’t afford a new pair of shoes, let alone an AI-enabled smartphone that currently retails for $1,000 USD?
How equitable is it for him? Or for someone living in the Andes or rural Indonesia, when no one around them has access either, or any urgent need for it?
Onboarding into a system that was never built for them is not equity.
It makes me angry to see how detached these tech millionaires are when they pitch us their dream of “equality.” The further they float with their utopias, the wider the gap grows between the haves and the have-nots.
In the age of surveillance capitalism, we’re not just handing over data; we’re outsourcing authority. As Shoshana Zuboff asked: “Who decides who decides?”
When we let platforms scan our bodies, log our behaviors, and verify our existence — we’re not just opting in. We’re conceding.
Consent becomes infrastructure. And power consolidates in the hands of the architects.
It’s no longer a question of what we’re allowed to access, but of who gets to define the terms of access in the first place.
“This isn’t participation. It’s programmable compliance with a friendly UX.”
IV. Who Gets Rich Off Your Biometrics?
Let’s say Worldcoin gives you 16 tokens. And in 20 years, they’re worth $16,000 USD. “The system works!” you think. Now that kid (no longer a kid) can finally buy an iPhone and “access the future.”
He logs on.
What happens next?
Will he be offered a stake in a company? An influencer deal? A discount code?
Most likely, he just enters the marketplace. And Worldcoin (the company that built the tools, owns the servers, and monetizes the infrastructure) profits. Not him.
And no matter how many ethical policies are in place, once you click accept on the terms, those policies might as well never have existed.
Because let’s face it: when you don’t accept, you get faulty service, dropped connections, or unexplained “errors.” But when you do accept (including sharing your biometrics with third parties and enabling Face ID recognition across apps), everything works like a dream. Right?
V. The Addiction Loop
So how did we get here?
We built it.
We wanted it.
And we fed it.
With every click, swipe, and scroll, it watched, tracked, and stored — then predicted and pleased us. And slowly, it made us addicts.
We can’t break the machine because we built the machine.
Our freedom of choice built it.
Our exceptional greed and insatiable hunger for more, better, faster, now.
I just think of Ronny Chieng’s Amazon bit in his stand-up special. He was funny. The reality? A lot less funny.
“Your data was never the product. You were.”
VI. What We Should Be Asking Instead
I can’t tell you how to live your life. I have enough doubt about what the future will bring.
My iris is scanned.
My DNA is stored.
My data is somewhere — everywhere.
The future is as clear as a Magic 8-ball: fuzzy, shakable, and weirdly fun to consult.
Where will we be in five years? Probably right here.
But will we still recognize ourselves when we look into the machine?
