
This is another long-ish read, but I do hope you find the time to delve into it. If you want a quick summary, a video explainer to watch, or an AI to chat with about the content, just click the button below, which takes you to a NotebookLM (not some sketchy link; well, except that it is powered by Google).
“Men accept servility in order to acquire wealth; as if they could acquire anything of their own when they cannot even assert that they belong to themselves.” — Étienne de La Boétie
Identity, Biometrics, and the Question of Who Holds the Keys
A man stands in a short line at an airport.
No dogs bark, no guards shout. A soft screen glows above the gate: Seamless Biometric Check‑In. A camera reads his face. A light flashes green. The barrier slides aside. The ritual feels civilized, almost weightless.
Other lines live in recent memory.
Prisoners at Auschwitz walking past a table while a needle pushes numbers into skin. Those numbers decide who eats, who works, who dies. No password reset exists for an inked forearm.
Technology changed. The basic logic did not.
The first essay argued that trust is the only real value layer. Money, law, institutions, and brands all borrow their meaning from it. This essay looks at a narrower question:
What happens to trust when identity, data, and economic life depend on systems that can mark, track, and sort human beings without any real way to refuse?
The tools are new. The double edge is not. Blockchain and identity technologies can harden the scaffolding that keeps the Jenga tower of society from falling. The same tools can also sharpen the blades of control, effectively removing our choice. No choice means no trust, and no trust means no value in life, as the first article showed.

People have marked other human beings for a long time. This is not a new idea by any stretch.
Ancient Greek and Roman writers describe slaves branded with hot irons. Letters burned into the forehead or arm signaled “fugitive” or “thief”. A person could change name or city. The mark stayed.
Slave systems in the Atlantic world revived and refined those practices. Bodies carried the symbols of ships, traders, crowns, and plantations. A human being became a walking ledger entry. The owner read value from scars. The person wearing those marks read a simple truth: escape would never fully wipe the slate clean.
Industrial genocide added bureaucracy to cruelty.
“The essence of totalitarian government… is to make functionaries and mere cogs in the administrative machinery out of men, and thus to dehumanize them.”
— Hannah Arendt
Nazi administrators turned people into numbered entries in a camp management system and tattooed those numbers onto arms at Auschwitz. The tattoo anchored each person to a card, a list, a file. Killing became a matter of updating records.
Less openly murderous systems relied on paper instead of ink.
Apartheid South Africa forced Black people to carry passes and identity documents at all times. Police could stop anyone and demand proof of racial category and permission to stand on that street. Failure meant fines, prison, expulsion. Daily life turned into one long audition for the right to exist outside a cage.
The details differ. The pattern repeats:
Someone else defines what counts as your identity.
Someone else writes the master copy into a system you do not control.
Someone else can demand proof at any moment.
Refusal carries the threat of exclusion or violence.
Trust in that setting does not describe free cooperation. It describes managed obedience.
Digital identity schemes change the surface, not the structure.
Aadhaar in India ties a twelve‑digit number to fingerprints, iris scans, and a photograph. That number has become entangled with welfare, pensions, banking, SIM cards, and school enrollment. Official language praises inclusion and efficiency. Critics describe something closer to conditional citizenship: life works smoothly only while the central database agrees that you are who you say you are.
Breaches and leaks have exposed Aadhaar‑linked information many times. Once biometric templates or ID numbers drift into the wild, no one can revoke their own eyes or hands. A single compromised system quietly creates a permanent vulnerability in millions of bodies.
China’s social credit experiments and wider surveillance architecture combine identity numbers, financial records, travel data, and facial recognition. Algorithms score “trustworthiness”. Cameras and databases watch daily life. The project claims to strengthen social trust by punishing fraud and rewarding good behavior. The lived experience looks much closer to behavioural obedience, enforced by ever‑present risk of invisible downgrades.
Europe presents a softer shape of the same problem. The European Digital Identity Wallet is framed as voluntary for individuals. Member States must make at least one wallet available. Large services and very large platforms will be required to accept the wallet when people choose to present it for strong authentication. The wallet itself emphasizes data minimization and selective disclosure, which is meaningfully better than centralized biometric dossiers. The practical pressure still points in one direction: convenience becomes default, default becomes expectation, and expectation becomes the path most people cannot realistically refuse.
The United States built a different kind of identity spine. Social Security numbers began as an administrative tool for tracking earnings and benefit eligibility, not as an identity credential. Institutions reached for the number anyway because it was convenient, then built workflows around it, then treated it as required for work, credit, banking, housing, and verification. Companies copied SSNs into countless databases and used them to stitch profiles together. Every copy widened the attack surface, and every breach made the number less credible as "security". As trust in the number fell, pressure rose for a stronger binder: driver's licenses, passports, and "verification" schemes that lean on face photos and other biometrics. The creep rarely arrives by decree; it arrives by default. A number becomes a key, the key becomes a dependency, the dependency becomes a gate, and the gate invites a permanent biometric upgrade unless minimal disclosure, revocable keys, and real exit get built into the system.
Corporations run softer versions of the same logic.
“At its core, surveillance capitalism is parasitic and self-referential.”
— Shoshana Zuboff
Banks, platforms, and telecoms collect oceans of data under the banner of KYC and "safety". They store not just names and addresses but contact graphs, click histories, purchase patterns, locations, IP addresses, device types, and anything else they can pull together. That data lives in private silos that ordinary citizens, the users, cannot inspect or meaningfully correct. Access to essential services then depends on remaining in good standing with entities whose risk models remain opaque. That arrangement is closer to "trust me, bro" than to structural trust.
None of these systems hate trust. Each one simply asks for too much of it, and in the wrong place.
They all say some version of this:
Trust us with the raw material of your identity and your life. Trust us to keep it safe. Trust us not to share it too widely. Trust us not to overreact when power changes hands or panic hits.
History shows us no reason to accept that bargain.
Blockchains arrived with a promise: neutral, tamper‑resistant rails that do not belong to any single institution. A Bitcoin transaction does not care whether the sender lives under a dictatorship or a democracy. An Ethereum smart contract does not check passports before executing code.
Those traits make blockchains powerful tools for rebuilding trust.
A public ledger lets anyone verify that a payment has cleared.
A smart contract enforces terms without asking either party to trust the other’s accountant.
A censorship‑resistant network allows people to transact even when banks or states decide they should not.
The same traits make blockchains dangerous when combined with the wrong identity layer.
Everything recorded on a public chain persists. Every transfer, every interaction with a contract, every approval sits there as a permanent entry. If a clear, stable identity sits behind those entries, the ledger can become a perfect life‑log.
Attach a real‑name identity system or a mandatory biometric registry to an on‑chain world and you have built something unsettling:
A financial history that no one can ever erase.
A map of every association, donation, and collaborative act.
A behavioural dossier that no regime needs to reconstruct from scraps; it simply queries a node.
A trust architecture without privacy turns neutral infrastructure into a ready‑made panopticon. A person might still choose to interact with that system. The choice would not look like free trust; it would look like resignation.
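To make the danger concrete, here is a toy sketch of how trivially a real-name registry turns a neutral ledger into a dossier. The addresses, names, and amounts are entirely hypothetical; the point is that the "query a node" step is a few lines of code once identity is attached.

```python
from collections import defaultdict

# Toy public ledger: every transfer is permanent and visible to all.
transfers = [
    {"from": "0xA1", "to": "0xB2", "amount": 50, "memo": "donation"},
    {"from": "0xA1", "to": "0xC3", "amount": 20, "memo": "union dues"},
    {"from": "0xD4", "to": "0xA1", "amount": 90, "memo": "salary"},
]

# Hypothetical real-name registry: the single table that turns a
# neutral ledger into a life-log.
registry = {
    "0xA1": "Ada Example",
    "0xB2": "Relief Fund",
    "0xC3": "Workers Union",
    "0xD4": "Acme Corp",
}

def dossier(ledger, names):
    """Group every entry touching each named person: one query, full history."""
    out = defaultdict(list)
    for tx in ledger:
        for addr in (tx["from"], tx["to"]):
            out[names.get(addr, addr)].append(tx)
    return dict(out)

d = dossier(transfers, registry)
# d["Ada Example"] now holds every donation, due, and salary, forever.
```

No forensic reconstruction, no subpoenas, no scraps: the dossier falls out of a dictionary join.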
A panopticon is the final triumph of power: when domination becomes internal, and the prisoner becomes their own jailer.
The double edge shows here with painful clarity.
Blockchains can reduce the amount of trust placed in fallible institutions by replacing “just believe us” with verifiable rules. Blockchains can also concentrate power in the hands of whoever controls identity and analytics.
The difference depends on how identity and data interact with the rails.
Our ability to choose the system, and to trust the decentralized network to supply the security, makes all the difference.
Biometrics promise convenience and safety. They deliver on convenience.
“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”
— Edward Snowden
A fingerprint unlocks a phone with one touch. A face scan opens a bank app. An iris scan might one day grant access to payments, voting, or universal income through a sleek orb. The sales pitch sounds simple: biometrics prove that you stand in front of the screen, not a bot.
The reality cuts another way.
Bodies do not rotate.
Passwords, keys, and even plastic ID cards can be revoked and replaced. That property underpins genuine self‑sovereignty. If a key leaks, you burn it and mint a new one. A biometric “key” never truly burns. It remains attached to your face, your palm, your eyes, visible to every camera and scanner you pass.
Bodies are easy to coerce.
Cryptography circles have an old joke about the “five‑dollar wrench attack”: no need to break the cipher when a cheap wrench and a threat persuade someone to reveal a password. A system that makes a fingerprint or iris the final key for banking, identity, or voting turns every wrench wielded by a criminal or a state agent into a master hacking tool.
Storage does not fix this.
Even if biometric templates live on a personal device or inside a hardware wallet, authorities or attackers can still force the body to cooperate. Even if zero‑knowledge proofs hide the raw data, the underlying dependency remains: basic participation in society now depends on a feature of your body that you can neither meaningfully disguise nor revoke.
Trust in that context stops describing mutual confidence. Trust, as the first article argued, is willingly chosen vulnerability: risk accepted, not mere expectation. It is the condition that makes anything else valuable in practice. Once scanners and registries remove choice, what remains is not trust but the hope that those controlling the system will use their power gently.
If blockchains become the backbone of finance and identity, biometrics tie flesh directly into that backbone. The combination turns a promising trust technology into a very efficient control grid.
Biometric risk is one edge of the sword. Data custody is the other.
Identity without data can be thin and brittle. Data without identity can be noisy. Combined, they form leverage.
“Who owns the data? Does the data about my DNA, my brain and my life belong to me, to the government, to a corporation, or to the human collective?”
— Yuval Noah Harari
When a state runs a central biometric ID system, the state holds leverage. When a corporation runs a giant identity graph and a stack of behavioral data, that corporation holds leverage. When a chain of oracles and custodians bridges tokenized assets with their real‑world counterparts, those actors hold leverage.
In every case, the same questions determine whether trust grows or shrinks:
Who can see what?
Who can correlate which datasets?
Who can freeze, erase, or rewrite which entries?
Who can be compelled by law or violence to hand over control?
Trust as the deepest value layer rarely fails because human beings became too trusting in general. It fails when trust lands in the wrong place.
The more data and power gather in one set of hands, the more tempting that point becomes for capture. Elections change. Boards change. Laws change. A database that looked benign in one decade can become a weapon in the next.
Decentralized systems do not magically erase these dangers. They offer a different way to distribute them.
A blockchain that does not have a super‑administrator cannot quietly delete an address on political grounds. A protocol that lets people prove membership or age without disclosing full identity gives strangers enough to cooperate without handing them a dossier. A design that lets users move between wallets and providers without losing access reduces the chance that a single gatekeeper can lock them out of economic life, or life in general.
No structure removes the need for trust. Some structures make it easier to place trust in ways that can be withdrawn, re‑negotiated, and verified.
The argument so far can sound abstract. The landscape no longer is.
You all know I like simple tests to help illustrate a point, so try this one: how hard is it to live a normal life without enrolling? A second test follows: how easily can you revoke, rotate, or exit once you are inside?
Those two questions define what "escape" means in practice, and what freedom means when lived.
Below is a snapshot of several large systems and the kind of pressure they create. Some are compulsory by law. Some remain voluntary on paper yet become mandatory through convenience and mandated acceptance. The control creep is real even if rarely seen or felt; the compulsory "trust" is like the slowly rising water temperature boiling the frog alive. If only it realized the danger and jumped out of the pot. Perhaps this chart helps you decide whether you would like to remain in the pot.

Sources (for the figure): EU Regulation 2024/1183 and implementing regulations (Dec 2024), national ID laws, Aadhaar documentation and court constraints, and global ID trackers (compiled Feb 2026).
The point is not to rank countries as “good” or “bad”. The point is to see the direction of travel: identity systems increasingly fuse with services, and services increasingly fuse with daily survival.
The story is not all doom.
The same tools that make permanent registries and life-logs possible also allow different arrangements, where trust lands in healthier places and can be withdrawn. Trust in its true sense: a willing vulnerability, a wager that the other side will not exploit you.
Some patterns already exist in the wild:
Many keys, not one true name.
A person can hold several pseudonymous identities, each with its own keys and history. Losing one does not erase their place in the world. No authority owns the master switch. Tools like Frame and Brume push this back toward normal life, where “separation of selves” is not a crime scene but a kind of dignity: one wallet for work, one for play, one for savings, one for giving, none of them required to confess the whole person.
Private value transfer without publishing your entire life.
Money should not force confession. People have a simple human need to pay without broadcasting an entire social graph. Shielded transfer systems and privacy pools exist for that purpose. Railgun, Privacy Pools, and zkbob point in the same direction: everyday economic life should not become a permanent public diary, searchable forever by anyone who cares to look.
Stealth addresses: receiving without becoming a target.
A public address becomes a label people can follow forever. Stealth address designs break that habit by generating one-time receiving addresses so recipients don’t expose a persistent identifier. Umbra and Fluidkey live here. They turn “receiving money” back into a private act, rather than a beacon that invites profiling, extortion, or social coercion.
Credentials instead of dossiers.
Trust grows when people can prove a narrow fact without surrendering a file cabinet. “I am over eighteen.” “I live in this region.” “I passed this exam.” Privacy-preserving credentials and proofs move identity away from “hand over your whole life” and toward “show only what matters.” ZKPassport, Reclaim, and Rarimo sit along this path. They sketch a world where trust becomes precise and proportional, not absolute and invasive.
Local data, global proofs.
Sensitive information—health records, contact graphs, even biometrics when they must exist—should live close to the person or the community, not in one hungry database. Chains do not need the raw material. They can hold small proofs that certain checks passed, and nothing more. This is the difference between an identity system that asks for trust and one that demands surrender.
Private coordination, not just private payments.
Trust is not only economic. Group choice becomes coercible when everyone can see everyone else’s vote. Private voting systems aim to protect the inner act of choosing by making bribery and intimidation harder. MACI belongs in this family. It treats governance as something that should remain free inside the mind, not legible to whoever holds the biggest stick.
Network privacy: hide metadata, not just content.
Even perfect on-chain privacy can fail if network observers can link activity through IP addresses, RPC logs, and timing fingerprints. Mixnets and hardened transport matter because they protect the seams—those quiet places where identity leaks when no one is watching. Nym and HOPR exist because privacy without transport privacy is often a costume with the seams exposed.
Uncensorable base, private rooms above.
Public ledgers can provide shared truth about settlement and state, while upper layers use encryption and privacy-preserving tools to keep individual lives from turning into open books. Fair ordering systems like Shutter and Fairblock push in the same direction: even the queue should not become a weapon. The base stays common and verifiable. The rooms above remain human.
Private-by-default infrastructure is arriving.
The deeper edge of the sword lives in the base layers. Privacy-focused L2s and rollups like Aztec, Miden, TEN, and INTMAX aim to make confidentiality a default property rather than a bolt-on habit. Confidential compute pushes the same frontier from another side: systems like Zama, Fhenix, and TACEO work toward a world where computation can be verified without turning every input into public property.
Exit as a first-class feature.
Trust has value only when it can be withdrawn. Wallets, identities, and agents can be designed so that people move between providers without losing everything. No company and no government should get to say, “Without us, you no longer exist.” Exit is what keeps trust from degrading into obedience.
None of these patterns solves the problem by itself. Together, they sketch the other edge of the sword: a way to use cryptography and shared ledgers to reduce blind trust in institutions without turning human beings into perfectly legible objects.
The question is whether we have the courage to build toward those patterns, instead of choosing whatever makes administration easiest.
Trust has moral weight because it rests on choice.
A person decides to rely on another, fully aware that they could have chosen differently. A community extends trust to an institution while knowing it can withdraw that trust through exit or reform. That possibility of withdrawal keeps trust from collapsing into submission.
Systems that tie identity to unchangeable bodily features and central databases attack that possibility at its root.
A branded slave in the ancient world never truly left the master’s mark behind.
A person with a camp tattoo could not peel off their association with a file.
A citizen whose right to move depends entirely on biometric clearance or a central score does not stand in a position of free trust.
Digital versions of the same pattern feel smoother. They deserve the same scrutiny.
A trust‑worthy architecture needs at least three features:
Revocable keys – people can rotate credentials and devices without losing their place in society.
Optional participation – no single identity system becomes the irreplaceable gate to every basic need.
Limited visibility – no actor can see, correlate, and weaponize every aspect of an individual’s life.
Decentralized blockchains, used carefully, can help build such a trust architecture. They provide shared, tamper‑resistant ledgers that no one actor owns. They make it possible to prove that something happened without revealing every detail of who did it and why. They allow money, contracts, and even proofs of personhood to live on rails that people can inspect and, if necessary, fork.
The same technology, used carelessly, can hard‑wire human dependence into an inescapable graph.
The double‑edged sword cuts both ways.
This series looks at how to keep the blade facing the right direction.
The next essay maps the scaffolding in broad strokes:
Money and real value: from fiat abstractions to crypto‑native settlement and tokenized commodities that reference actual food, metals, energy, and housing.
Proof of personhood without orbs: ways to say “a real human is here” without scanning bodies into permanent registries.
Self‑sovereign identity and minimal disclosure: credentials you hold and can rotate, proofs that reveal only what others need to know.
Privacy as a condition for freedom: chains and tools that create rooms without microphones while still allowing verifiable collaboration.
Verifiable computation and honest ledgers: zero‑knowledge proofs, rollups, and validator networks that turn “just trust us” into “check the math”.
Trust in AI agents: protocols for identity, reputation, and constraints when software begins to act economically on our behalf.
Each of these topics grows directly out of the basic claim that trust is the only real value layer. Technology will not replace trust. At best, technology can protect trust from the worst things we have already done with power, data, and one another.
The task ahead is simple to state and hard to execute:
Build systems where trust flows between free people and free communities, not from marked bodies toward distant centralized control databases.
If blockchains and identity tools help us do that, they deserve to be that base layer of trust. They reduce the amount of blind faith we place in unaccountable intermediaries. The deeper trust remains what it has always been: the courage to rely on each other, backed by the right to walk away when we must.

This is another long-ish read but I do hope that you find the time to delve into it. If you want a quick summary, a video explainer to watch, or an AI to chat with about the content just click the button below which takes you to a notebooklm, (not some sketchy link, well except that it is powered by google).
“Men accept servility in order to acquire wealth; as if they could acquire anything of their own when they cannot even assert that they belong to themselves.” — Étienne de La Boétie
Identity, Biometrics, and the Question of Who Holds the Keys
A man stands in a short line at an airport.
No dogs bark, no guards shout. A soft screen glows above the gate: Seamless Biometric Check‑In. A camera reads his face. A light flashes green. The barrier slides aside. The ritual feels civilized, almost weightless.
Other lines live in recent memory.
Prisoners at Auschwitz walking past a table while a needle pushes numbers into skin. Those numbers decide who eats, who works, who dies. No password reset exists for an inked forearm.
Technology changed. The basic logic did not.
The first essay argued that trust is the only real value layer. Money, law, institutions, and brands all borrow their meaning from it. This essay looks at a narrower question:
What happens to trust when identity, data, and economic life depend on systems that can mark, track, and sort human beings without any real way to refuse?
The tools are new. The double edge is not. Blockchain and identity technologies can harden the scaffolding that keeps the Jenga tower of society from falling. The same tools can also sharpen the blades of control, effectively removing our choice. No choice, no trust, no trust, no value in life, as the first article shows.

People have marked other human beings for a long time. This is not a new idea by any stretch.
Ancient Greek and Roman writers describe slaves branded with hot irons. Letters burned into the forehead or arm signaled “fugitive” or “thief”. A person could change name or city. The mark stayed.
Slave systems in the Atlantic world revived and refined those practices. Bodies carried the symbols of ships, traders, crowns, and plantations. A human being became a walking ledger entry. The owner read value from scars. The person wearing those marks read a simple truth: escape would never fully wipe the slate clean.
Industrial genocide added bureaucracy to cruelty.
“The essence of totalitarian government… is to make functionaries and mere cogs in the administrative machinery out of men, and thus to dehumanize them.”
— Hannah Arendt
Nazi administrators turned people into numbered entries in a camp management system and tattooed those numbers onto arms at Auschwitz. The tattoo anchored each person to a card, a list, a file. Killing became a matter of updating records.
Less openly murderous systems relied on paper instead of ink.
Apartheid South Africa forced Black people to carry passes and identity documents at all times. Police could stop anyone and demand proof of racial category and permission to stand on that street. Failure meant fines, prison, expulsion. Daily life turned into one long audition for the right to exist outside a cage.
The details differ. The pattern repeats:
Someone else defines what counts as your identity.
Someone else writes the master copy into a system you do not control.
Someone else can demand proof at any moment.
Refusal carries the threat of exclusion or violence.
Trust in that setting does not describe free cooperation. It describes managed obedience.
Digital identity schemes change the surface, not the structure.
Aadhaar in India ties a twelve‑digit number to fingerprints, iris scans, and a photograph. That number has become entangled with welfare, pensions, banking, SIM cards, and school enrollment. Official language praises inclusion and efficiency. Critics describe something closer to conditional citizenship: life works smoothly only while the central database agrees that you are who you say you are.
Breaches and leaks have exposed Aadhaar‑linked information many times. Once biometric templates or ID numbers drift into the wild, no one can revoke their own eyes or hands. A single compromised system quietly creates a permanent vulnerability in millions of bodies.
China’s social credit experiments and wider surveillance architecture combine identity numbers, financial records, travel data, and facial recognition. Algorithms score “trustworthiness”. Cameras and databases watch daily life. The project claims to strengthen social trust by punishing fraud and rewarding good behavior. The lived experience looks much closer to behavioural obedience, enforced by ever‑present risk of invisible downgrades.
Europe presents a softer shape of the same problem. The European Digital Identity Wallet is framed as voluntary for individuals. Member States must make at least one wallet available. Large services and very large platforms will be required to accept the wallet when people choose to present it for strong authentication. The wallet itself emphasizes data minimization and selective disclosure, which is meaningfully better than centralized biometric dossiers. The practical pressure still points in one direction: convenience becomes default, default becomes expectation, and expectation becomes the path most people cannot realistically refuse.
The United States built a different kind of identity spine. Social Security numbers began as an administrative tool to track earnings and benefit eligibility, not as an identity credential. Institutions reached for the number anyway because it was convenient, then built workflows around it, then treated it as required for work, credit, banking, housing, and verification. Companies copied SSNs into countless databases and used them to stitch profiles together. Every copy widened the attack surface, and every breach made the number less credible as “security”, the trust lessened so the pressure rises for a stronger binder—driver’s licenses, passports, and “verification” schemes that lean on face photos and other biometrics. The creep rarely arrives by decree; it arrives by default. A number becomes a key, the key becomes a dependency, the dependency becomes a gate, and the gate invites a permanent biometric upgrade unless minimal disclosure, revocable keys, and real exit get built into the system.
Corporations run softer versions of the same logic.
“At its core, surveillance capitalism is parasitic and self-referential.”
— Shoshana Zuboff
Banks, platforms, and telecoms collect oceans of data under the banner of KYC and “safety”. They store not just names and addresses but contact graphs, click histories, purchase patterns, locations, IP addresses, device type and every other type of data they can pull together. That data lives in private silos that ordinary citizens, the users, cannot inspect or meaningfully correct. Access to essential services then depends on remaining in good standing with entities whose risk models remain opaque. These types of risks are more, “Trust me bro”, than structural trust.
None of these systems hate trust. Each one simply asks for too much of it, and in the wrong place.
They all say some version of this:
Trust us with the raw material of your identity and your life. Trust us to keep it safe. Trust us not to share it too widely. Trust us not to overreact when power changes hands or panic hits.
History shows us no reason to accept that bargain.
Blockchains arrived with a promise: neutral, tamper‑resistant rails that do not belong to any single institution. A Bitcoin transaction does not care whether the sender lives under a dictatorship or a democracy. An Ethereum smart contract does not check passports before executing code.
Those traits make blockchains powerful tools for rebuilding trust.
A public ledger lets anyone verify that a payment has cleared.
A smart contract enforces terms without asking either party to trust the other’s accountant.
A censorship‑resistant network allows people to transact even when banks or states decide they should not.
The same traits make blockchains dangerous when combined with the wrong identity layer.
Everything recorded on a public chain persists. Every transfer, every interaction with a contract, every approval sits there as a permanent entry. If a clear, stable identity sits behind those entries, the ledger can become a perfect life‑log.
Attach a real‑name identity system or a mandatory biometric registry to an on‑chain world and you have built something unsettling:
A financial history that no one can ever erase.
A map of every association, donation, and collaborative act.
A behavioural dossier that no regime needs to reconstruct from scraps; it simply queries a node.
A trust architecture without privacy turns neutral infrastructure into a ready‑made panopticon. A person might still choose to interact with that system. The choice would not look like free trust; it would look like resignation.
A panopticon is the final triumph of power: when domination becomes internal, and the prisoner becomes their own jailer.
The double edge shows here with painful clarity.
Blockchains can reduce the amount of trust placed in fallible institutions by replacing “just believe us” with verifiable rules. Blockchains can also concentrate power in the hands of whoever controls identity and analytics.
The difference depends on how identity and data interact with the rails.
Our ability to choose the system, and to trust the decentralized network to supply its security, makes all the difference.
Biometrics promise convenience and safety. They deliver on convenience.
“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”
— Edward Snowden
A fingerprint unlocks a phone with one touch. A face scan opens a bank app. An iris scan might one day grant access to payments, voting, or universal income through a sleek orb. The sales pitch sounds simple: biometrics prove that you stand in front of the screen, not a bot.
The reality cuts another way.
Bodies do not rotate.
Passwords, keys, and even plastic ID cards can be revoked and replaced. That property underpins genuine self‑sovereignty. If a key leaks, you burn it and mint a new one. A biometric “key” never truly burns. It remains attached to your face, your palm, your eyes, visible to every camera and scanner you pass.
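The asymmetry can be made concrete. Below is a minimal, purely illustrative sketch in Python: a toy identity whose credential is a rotatable secret. The class and its methods are invented for this illustration (real systems use asymmetric keys and signatures); the point is that `rotate()` exists for a cryptographic key and cannot exist for a biometric one.

```python
import secrets

class Identity:
    """A toy identity whose credential is a rotatable secret key.
    Illustrative only: real systems use asymmetric keys and signatures."""

    def __init__(self):
        self.active_key = secrets.token_hex(32)
        self.revoked = set()

    def rotate(self):
        """Burn the old key, mint a new one. The identity survives rotation."""
        self.revoked.add(self.active_key)
        self.active_key = secrets.token_hex(32)

    def accepts(self, presented_key: str) -> bool:
        return presented_key == self.active_key and presented_key not in self.revoked

alice = Identity()
leaked = alice.active_key          # the key leaks in a breach
alice.rotate()                     # response: revoke and replace
assert not alice.accepts(leaked)   # the old key is now worthless
assert alice.accepts(alice.active_key)

# A biometric "key" has no rotate(): the template is the body,
# and a leaked template stays valid for life.
```

The breach above is an inconvenience, not a catastrophe, precisely because revocation is possible. Delete `rotate()` and you have the biometric situation.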
Bodies are easy to coerce.
Cryptography circles have an old joke about the “five‑dollar wrench attack”: no need to break the cipher when a cheap wrench and a threat persuade someone to reveal a password. A system that makes a fingerprint or iris the final key for banking, identity, or voting turns every wrench wielded by a criminal or a state agent into a master hacking tool.
Storage does not fix this.
Even if biometric templates live on a personal device or inside a hardware wallet, authorities or attackers can still force the body to cooperate. Even if zero‑knowledge proofs hide the raw data, the underlying dependency remains: basic participation in society now depends on a feature of your body that you can neither meaningfully disguise nor revoke.
Trust, as the first article argued, is willingly chosen vulnerability: risk accepted, not mere expectation, and the condition that makes anything else valuable in practice. Once scanners and registries remove choice, “trust” stops describing mutual confidence and becomes nothing more than the hope that those controlling the system will use their power gently.
If blockchains become the backbone of finance and identity, biometrics tie flesh directly into that backbone. The combination turns a promising trust technology into a very efficient control grid.
Biometric risk is one edge of the sword. Data custody is the other.
Identity without data can be thin and brittle. Data without identity can be noisy. Combined, they form leverage.
“Who owns the data? Does the data about my DNA, my brain and my life belong to me, to the government, to a corporation, or to the human collective?”
— Yuval Noah Harari
When a state runs a central biometric ID system, the state holds leverage. When a corporation runs a giant identity graph and a stack of behavioral data, that corporation holds leverage. When a chain of oracles and custodians bridges tokenized assets with their real‑world counterparts, those actors hold leverage.
In every case, the same questions determine whether trust grows or shrinks:
Who can see what?
Who can correlate which datasets?
Who can freeze, erase, or rewrite which entries?
Who can be compelled by law or violence to hand over control?
Trust as the deepest value layer rarely fails because human beings became too trusting in general. It fails when trust lands in the wrong place.
The more data and power gather in one set of hands, the more tempting that point becomes for capture. Elections change. Boards change. Laws change. A database that looked benign in one decade can become a weapon in the next.
Decentralized systems do not magically erase these dangers. They offer a different way to distribute them.
A blockchain that does not have a super‑administrator cannot quietly delete an address on political grounds. A protocol that lets people prove membership or age without disclosing full identity gives strangers enough to cooperate without handing them a dossier. A design that lets users move between wallets and providers without losing access reduces the chance that a single gatekeeper can lock them out of economic life, or life in general.
No structure removes the need for trust. Some structures make it easier to place trust in ways that can be withdrawn, re‑negotiated, and verified.
The argument so far can sound abstract. The landscape no longer is.
You all know I like simple tests to help illustrate a point, so try this one: how hard is it to live a normal life without enrolling? A second test follows: how easily can you revoke, rotate, or exit once you are inside?
Those two questions define what “escape” means in practice, and what freedom means when lived.
Below is a snapshot of several large systems and the kind of pressure they create. Some systems are compulsory by law. Some remain voluntary on paper yet become mandatory through convenience and near-universal acceptance. The control creep is real even if rarely seen or felt; the compulsory “trust” is like slowly rising water boiling the frog alive. If only it realized the danger, it would jump out of the pot. Perhaps this chart helps you decide whether you would like to remain in the pot.

Sources (for the figure): EU Regulation 2024/1183 and implementing regulations (Dec 2024), national ID laws, Aadhaar documentation and court constraints, and global ID trackers (compiled Feb 2026).
The point is not to rank countries as “good” or “bad”. The point is to see the direction of travel: identity systems increasingly fuse with services, and services increasingly fuse with daily survival.
The story is not all doom.
The same tools that make permanent registries and life-logs possible also allow different arrangements, where trust lands in healthier places and can be withdrawn. Trust that follows the true definition: a “willing vulnerability”, a wager that the other side will not exploit you.
Some patterns already exist in the wild:
Many keys, not one true name.
A person can hold several pseudonymous identities, each with its own keys and history. Losing one does not erase their place in the world. No authority owns the master switch. Tools like Frame and Brume push this back toward normal life, where “separation of selves” is not a crime scene but a kind of dignity: one wallet for work, one for play, one for savings, one for giving, none of them required to confess the whole person.
Private value transfer without publishing your entire life.
Money should not force confession. People have a simple human need to pay without broadcasting an entire social graph. Shielded transfer systems and privacy pools exist for that purpose. Railgun, Privacy Pools, and zkbob point in the same direction: everyday economic life should not become a permanent public diary, searchable forever by anyone who cares to look.
Stealth addresses: receiving without becoming a target.
A public address becomes a label people can follow forever. Stealth address designs break that habit by generating one-time receiving addresses so recipients don’t expose a persistent identifier. Umbra and Fluidkey live here. They turn “receiving money” back into a private act, rather than a beacon that invites profiling, extortion, or social coercion.
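The mechanism behind this is a Diffie-Hellman key exchange: sender and recipient derive a shared secret that only they can compute, and hash it into a fresh address. The sketch below uses a toy integer group rather than the elliptic curves real protocols like Umbra use, and all names are invented for illustration.

```python
import hashlib
import secrets

# Toy Diffie-Hellman group (illustrative only; real stealth-address
# schemes use elliptic curves, not this Mersenne-prime group).
P = 2**127 - 1
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def one_time_address(recipient_pub: int, sender_priv: int) -> str:
    """Sender derives a fresh address only the recipient can recognize."""
    shared = pow(recipient_pub, sender_priv, P)   # Diffie-Hellman secret
    return hashlib.sha256(str(shared).encode()).hexdigest()[:40]

def scan(sender_pub: int, recipient_priv: int) -> str:
    """Recipient scans announced sender keys and re-derives the address."""
    shared = pow(sender_pub, recipient_priv, P)   # same secret, other side
    return hashlib.sha256(str(shared).encode()).hexdigest()[:40]

bob_priv, bob_pub = keypair()        # Bob publishes one public meta-key
alice_priv, alice_pub = keypair()    # Alice uses a fresh ephemeral key per payment

addr = one_time_address(bob_pub, alice_priv)
assert scan(alice_pub, bob_priv) == addr   # only Bob can link addr to himself
```

Because Alice generates a new ephemeral key for every payment, each payment lands at a different address; an outside observer sees unrelated one-time labels, while Bob alone can scan and claim them.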
Credentials instead of dossiers.
Trust grows when people can prove a narrow fact without surrendering a file cabinet. “I am over eighteen.” “I live in this region.” “I passed this exam.” Privacy-preserving credentials and proofs move identity away from “hand over your whole life” and toward “show only what matters.” ZKPassport, Reclaim, and Rarimo sit along this path. They sketch a world where trust becomes precise and proportional, not absolute and invasive.
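A toy sketch of “show only what matters”, using plain hash commitments rather than the real zero-knowledge machinery systems like ZKPassport rely on. The idea: an issuer vouches for per-attribute commitments instead of raw data, and the holder later opens exactly one commitment. All names here are invented for illustration.

```python
import hashlib
import secrets

def commit(value: str, salt: str) -> str:
    """Bind a value to a commitment without revealing it."""
    return hashlib.sha256(f"{salt}:{value}".encode()).hexdigest()

# Issuer attests to per-attribute commitments, not to the raw attributes.
attributes = {"name": "A. Citizen", "region": "EU", "over_18": "true"}
salts = {k: secrets.token_hex(16) for k in attributes}
credential = {k: commit(v, salts[k]) for k, v in attributes.items()}

# The verifier holds only the credential (the commitments).
# The holder discloses one attribute: its value plus its salt.
key, value, salt = "over_18", attributes["over_18"], salts["over_18"]

assert commit(value, salt) == credential[key]   # the opening checks out
assert value == "true"                          # the narrow fact: over eighteen
# Name and region stay hidden: their salts were never revealed.
```

Real credential systems add signatures over the commitments and zero-knowledge proofs so even the disclosed value can stay hidden behind a predicate, but the shape is the same: a file cabinet of commitments, opened one drawer at a time.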
Local data, global proofs.
Sensitive information—health records, contact graphs, even biometrics when they must exist—should live close to the person or the community, not in one hungry database. Chains do not need the raw material. They can hold small proofs that certain checks passed, and nothing more. This is the difference between an identity system that asks for trust and one that demands surrender.
Private coordination, not just private payments.
Trust is not only economic. Group choice becomes coercible when everyone can see everyone else’s vote. Private voting systems aim to protect the inner act of choosing by making bribery and intimidation harder. MACI belongs in this family. It treats governance as something that should remain free inside the mind, not legible to whoever holds the biggest stick.
Network privacy: hide metadata, not just content.
Even perfect on-chain privacy can fail if network observers can link activity through IP addresses, RPC logs, and timing fingerprints. Mixnets and hardened transport matter because they protect the seams—those quiet places where identity leaks when no one is watching. Nym and HOPR exist because privacy without transport privacy is often a costume with the seams exposed.
Uncensorable base, private rooms above.
Public ledgers can provide shared truth about settlement and state, while upper layers use encryption and privacy-preserving tools to keep individual lives from turning into open books. Fair ordering systems like Shutter and Fairblock push in the same direction: even the queue should not become a weapon. The base stays common and verifiable. The rooms above remain human.
Private-by-default infrastructure is arriving.
The deeper edge of the sword lives in the base layers. Privacy-focused L2s and rollups like Aztec, Miden, TEN, and INTMAX aim to make confidentiality a default property rather than a bolt-on habit. Confidential compute pushes the same frontier from another side: systems like Zama, Fhenix, and TACEO work toward a world where computation can be verified without turning every input into public property.
Exit as a first-class feature.
Trust has value only when it can be withdrawn. Wallets, identities, and agents can be designed so that people move between providers without losing everything. No company and no government should get to say, “Without us, you no longer exist.” Exit is what keeps trust from degrading into obedience.
None of these patterns solves the problem by itself. Together, they sketch the other edge of the sword: a way to use cryptography and shared ledgers to reduce blind trust in institutions without turning human beings into perfectly legible objects.
The question is whether we have the courage to build toward those patterns, instead of choosing whatever makes administration easiest.
Trust has moral weight because it rests on choice.
A person decides to rely on another, fully aware that they could have chosen differently. A community extends trust to an institution while knowing it can withdraw that trust through exit or reform. That possibility of withdrawal keeps trust from collapsing into submission.
Systems that tie identity to unchangeable bodily features and central databases attack that possibility at its root.
A branded slave in the ancient world never truly left the master’s mark behind.
A person with a camp tattoo could not peel off their association with a file.
A citizen whose right to move depends entirely on biometric clearance or a central score does not stand in a position of free trust.
Digital versions of the same pattern feel smoother. They deserve the same scrutiny.
A trustworthy architecture needs at least three features:
Revocable keys – people can rotate credentials and devices without losing their place in society.
Optional participation – no single identity system becomes the irreplaceable gate to every basic need.
Limited visibility – no actor can see, correlate, and weaponize every aspect of an individual’s life.
Decentralized blockchains, used carefully, can help build such a trust architecture. They provide shared, tamper‑resistant ledgers that no one actor owns. They make it possible to prove that something happened without revealing every detail of who did it and why. They allow money, contracts, and even proofs of personhood to live on rails that people can inspect and, if necessary, fork.
The same technology, used carelessly, can hard‑wire human dependence into an inescapable graph.
The double‑edged sword cuts both ways.
This series looks at how to keep the blade facing the right direction.
The next essay maps the scaffolding in broad strokes:
Money and real value: from fiat abstractions to crypto‑native settlement and tokenized commodities that reference actual food, metals, energy, and housing.
Proof of personhood without orbs: ways to say “a real human is here” without scanning bodies into permanent registries.
Self‑sovereign identity and minimal disclosure: credentials you hold and can rotate, proofs that reveal only what others need to know.
Privacy as a condition for freedom: chains and tools that create rooms without microphones while still allowing verifiable collaboration.
Verifiable computation and honest ledgers: zero‑knowledge proofs, rollups, and validator networks that turn “just trust us” into “check the math”.
Trust in AI agents: protocols for identity, reputation, and constraints when software begins to act economically on our behalf.
Each of these topics grows directly out of the basic claim that trust is the only real value layer. Technology will not replace trust. At best, technology can protect trust from the worst things we have already done with power, data, and one another.
The task ahead is simple to state and hard to execute:
Build systems where trust flows between free people and free communities, not from marked bodies toward distant centralized control databases.
If blockchains and identity tools help us do that, they deserve to be that base layer of trust. They reduce the amount of blind faith we place in unaccountable intermediaries. The deeper trust remains what it has always been: the courage to rely on each other, backed by the right to walk away when we must.
DeFi to ReFi to Regenerate connecting the dots for abundance in all the ways.
The Truth about Trust 2: The Double Edge Sword