
Generative AI is shifting the web from an attention economy to an intention economy. The consequence is an informationally dense 'thin web' that threatens to fragment shared meaning through atomization. As content becomes abundant and agents become first-class actors, the platform-economy model, powered by artificial scarcity of data, centralized moderation, surveillance, and gatekeeping, begins to break down. That business model collapses when its main resource, information, is no longer scarce; what remain scarce are interpretation and intention.
When information is nearly free, value is derived not from controlling access, but from how information is curated, shared, and acted on. Today's centralized control and curation makes platforms the arbiters of shared meaning, with a primary goal of maximizing profit, not social welfare. Letting AI agents infer intent is fickle: it often relies on invasive surveillance, and results vary widely across models and contexts. If the future web is to be fair and open, data must be transformed from a platform-controlled resource into a programmable commons.
The concept behind Fangorn is simple: Data access should be enforced by cryptographic proof, not platform permissions.
Instead of trusting platforms to provide permission to access your data, we're building access control based on proof. Fangorn enables secrets that enforce their own access rules. We call this intent-bound data.
Intent-bound data is encrypted information that can only be decrypted by someone who mathematically proves they meet some condition, without revealing what they're accessing, and without relying on any trusted intermediaries.
Fangorn is programmable secrets infrastructure for the Agentic web: encrypted information that enforces its own access rules, consumed by humans and agents alike.
At a high level, Fangorn combines:
client-side encryption
zero-knowledge proofs for access control
trustless, on-chain verification
decentralized threshold encryption and storage
Consider one of the most prevalent, yet simplest, use cases: password-protected data.
Today, this generally requires trusting a server:
to store and verify the password (or a hash of it).
to decide if and when you get access, based on opaque platform policies.
not to log, leak, revoke, or misuse your data.
With Fangorn:
Data is always encrypted client-side; no server ever sees the plaintext.
The password never leaves your device.
Encrypted data exists in decentralized storage, making it instantly portable.
No intermediary can access the secret or decrypt anything.
No logs to leak, no unfair revocations.
If you can prove you know the password, you can decrypt the data. Otherwise, it remains locked.
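The client-side half of this flow can be sketched with Python's standard library. This is a toy, not Fangorn's protocol: it derives an encryption key from the password with PBKDF2 and uses a SHA-256 counter-mode keystream as a stand-in stream cipher. There is no zero-knowledge proof, no authentication tag, and no threshold network here; the point is only that the password and plaintext never leave the client, and anyone who knows the password can decrypt the published blob locally.

```python
import hashlib
import os

def _keystream(key: bytes, n: int) -> bytes:
    # Expand the key into n bytes via SHA-256 in counter mode (toy stream cipher).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(password: str, plaintext: bytes) -> dict:
    # Key derivation and encryption happen entirely client-side.
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))
    # Only salt + ciphertext are published; neither reveals password or plaintext.
    return {"salt": salt, "ciphertext": ct}

def decrypt(password: str, blob: dict) -> bytes:
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), blob["salt"], 100_000)
    ks = _keystream(key, len(blob["ciphertext"]))
    return bytes(a ^ b for a, b in zip(blob["ciphertext"], ks))

blob = encrypt("hunter2", b"meet at dawn")
assert decrypt("hunter2", blob) == b"meet at dawn"
```

A wrong password derives a different key, so decryption simply yields garbage rather than asking any server for permission.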
Today, sharing data on a platform is subject to the platform’s policies, incentives, and mistakes. Access can be revoked, personal data can be leaked, and “ownership” is really a temporary license. Fangorn eliminates any need to trust platforms with enforcement. But the problem we face isn't storage: decentralized storage already exists, and adoption lags because, without privacy-preserving access control, using it means making everything public.
Fangorn fills this gap by separating storage from access, proof from identity, and stripping enforcement powers from platforms. It unlocks new classes of applications where data is shareable without surrendering control, is consumable by AI agents without surveillance, and whose access is governed by math instead of terms of service.
What You Can Build
Agentic Dark Pools
AI agents executing trades based on private signals that only decrypt under specific market conditions (e.g. RSI thresholds).
ZK-Gated Content
Content that unlocks only when creator-controlled rules are met, independent of platform policies.
Timelocked Data
Secrets that become public after a predetermined time (e.g. a dead man's switch).
Token-Gated Assets
Access control determined on-chain through token ownership (e.g. with perpetual royalties enforced by math, not platforms).
Peer-to-Peer Games with Hidden State
Sealed-bid auctions, rock-paper-scissors, and other trustless games requiring private inputs.
Cross-Organizational Data Sharing and Collaborative AI Training
Many parties can encrypt private data into a 'data lake' that only decrypts for a specific TEE, model trainer, or data clean room.
These are early ideas and examples, not an exhaustive list.
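The hidden-state games above are classically built on commit-reveal: each player publishes a hash of their move plus a random nonce, and only reveals the move once everyone has committed. The sketch below shows that classical building block in plain Python; it is illustrative only, since Fangorn's approach replaces the manual reveal step with proof-gated decryption.

```python
import hashlib
import secrets

def commit(move: str) -> tuple[bytes, bytes]:
    """Commit to a move without revealing it: publish the hash, keep the nonce secret."""
    nonce = secrets.token_bytes(16)
    commitment = hashlib.sha256(nonce + move.encode()).digest()
    return commitment, nonce

def reveal_ok(commitment: bytes, nonce: bytes, move: str) -> bool:
    """Check that a revealed (nonce, move) pair matches the earlier commitment."""
    return hashlib.sha256(nonce + move.encode()).digest() == commitment

# Player commits to "rock"; the opponent sees only the 32-byte commitment.
c, n = commit("rock")
assert reveal_ok(c, n, "rock")       # honest reveal verifies
assert not reveal_ok(c, n, "paper")  # a swapped move is rejected
```

The nonce prevents the opponent from brute-forcing the small move space from the commitment alone.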
Fangorn emerged out of several years of research and prototyping at Ideal Labs, where we learned what it takes to move from cryptographic theory to production-ready systems.
In December 2025, we won first place in the Polkadot 2.0 hackathon for a proof-of-concept threshold encryption network. The project demonstrated a novel approach to decentralized key management and programmable access conditions, and the win validated that we can design and implement complex cryptographic systems. However, the experience also revealed a harsh reality: engineering bespoke, novel cryptography is slow, risky, and often orthogonal to real-world usability.
In January 2026, we began rearchitecting the system for a pragmatic, production-ready deployment. Rather than reinventing core primitives, we focused on building on battle-tested infrastructure and modern zero-knowledge tooling, with privacy, composability, and developer experience as fundamental concerns from day one.
The first supported access-control condition is deliberately simple: password-protected data whose access reveals nothing about the content being accessed, to anyone. This forms the foundation of a broader, extensible gadgets framework, enabling data to be encrypted behind a composition of gadgets such as timelocks, membership checks, token ownership, and additional conditions under development. Fangorn aims to build a modular, extensible zk-gadgets SDK and framework, where new zk access-control conditions can be developed, registered, and verified trustlessly on-chain.
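The composition idea can be illustrated with plain predicates. Everything here is hypothetical: in Fangorn, gadgets would be zero-knowledge circuits verified on-chain, not Python functions, and the gadget names below are invented for the sketch. The point is the shape: small conditions combined with AND/OR into one access rule.

```python
import hashlib
import time

# Hypothetical "gadgets": each takes a request context dict and returns True/False.
def password_gadget(expected_hash: str):
    return lambda ctx: (
        hashlib.sha256(ctx.get("password", "").encode()).hexdigest() == expected_hash
    )

def timelock_gadget(unlock_at: float):
    return lambda ctx: ctx.get("now", time.time()) >= unlock_at

# Combinators: compose gadgets into a single access condition.
def all_of(*gadgets):
    return lambda ctx: all(g(ctx) for g in gadgets)

def any_of(*gadgets):
    return lambda ctx: any(g(ctx) for g in gadgets)

# Data locked behind "correct password AND past the unlock time".
condition = all_of(
    password_gadget(hashlib.sha256(b"open sesame").hexdigest()),
    timelock_gadget(1_700_000_000),  # arbitrary unix timestamp for the demo
)
```

New gadgets slot in as more predicates; in the real system each would come with a circuit and an on-chain verifier rather than a lambda.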
Join us as we build!
If you're reading this, you're early. Fangorn's ideas are ambitious, and the first tools are almost ready. If you believe the future of the web should be built on proof, not permission, we'd like you to build with us.
💬 Discord
