Investing in Pluralis Research

J Hackworth, Jesse Walden and Daniel Barabander

Variant’s founding thesis is that crypto enables an internet where users become owners—solving coordination problems by aligning incentives through shared ownership. One coordination problem crypto excels at solving is the “resource problem,” which occurs when software development requires costly external resources beyond an individual’s means. 

Our seed investment in Pluralis Research is premised on using crypto ownership to solve the resource problem of foundation AI models. Let’s unpack that.

The Resource Problem of Foundation AI Models and the Cost of the Corporate Form 

It costs tens of millions of dollars in compute and data to pretrain a foundation model. Foundation models therefore suffer from a severe resource problem that to date has only been solved through the “corporate form”: closing off the software into a corporate entity and raising a massive amount of capital to get over the resource hump. As a result, only a few companies have been able to raise the capital necessary to develop foundation models—all built behind closed doors. Even Meta and DeepSeek, which have released their foundation model weights publicly, build their models privately with their own resources and determine when to make them available to the world.

Using the corporate form to solve the resource problem comes with significant costs. Normatively, if only a few entities can build foundation models, they will dictate access to information and opportunity, setting us up for a dystopian hellscape of a magnitude Orwell couldn’t have dreamed of. But there is also a major opportunity cost: our models will forever remain suboptimal because of constraints on talent and resource supply.


1. Talent Constraints

Today, to build foundation models, you must work inside one of the major corporations whose research labs steward their development and training. These corporations have limited seats and highly selective hiring processes, indexing heavily on candidates with fancy pedigrees. This necessarily excludes a whole swath of high-quality model designers with nontraditional backgrounds. Additionally, many high-quality model designers do not want to work inside one of these organizations for personal, ethical, or geographic reasons. When the talent pool is global, the corporate form excludes a significant chunk of talent. This forecloses a diversity of ideas and new thinking about how to build foundation models.

These corporations also constrain the collaboration of talent across and outside their walled gardens. Model designers cannot iterate on each other’s work fluidly, which contradicts how machine learning development has historically occurred. This hampers innovation and limits our ability to reach the next frontier of model design.

2. Resource Supply Constraints

Under the current system, AI compute and data resources are fragmented and trapped inside the walled gardens of the few entities that can afford them. Each major model provider hoards these resources, creating redundant silos that limit collective progress. Because each entity is closed, there is also no way for nontraditional resource providers to contribute things like consumer GPUs and data on individuals’ devices. This locks out a significant chunk of valuable resources that are otherwise sitting idle.

Current research suggests that, all else being equal, more compute and data in pretraining means better models. By failing to tap into the full supply available, we leave our models’ performance artificially stunted.

Pluralis Uses Crypto Ownership to Solve the Resource Problem

If we used crypto ownership instead of the corporate form to solve the resource problem of foundation models, we would lift these constraints on talent and resource supply and build better models. This is exactly what Pluralis does.

Pluralis is building a decentralized training protocol where model designers can come to the protocol with their model ideas while compute and data providers can contribute the resources necessary to train models. These “protocol models” are open, developed in public, and owned by a network of resource contributors. 

The project uses the two core features of crypto ownership to get over the resource problem of foundation models. First, ownership in the form of a token goes directly to the compute and data providers who speculatively contribute resources to train a model. This token represents upside in the model in the form of a claim on the future inference revenue it will generate. Second, ownership is made meaningful even though the software remains open, because the value of the software—the weights produced by training—lives on a distributed network. To do this, the project exploits a key property of neural networks: their computations are local, which means the weights can be split up across the nodes of the network through model parallelism.
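To make the idea concrete, here is a minimal, illustrative sketch (not Pluralis’s actual protocol or code) of model parallelism: a toy network whose layers are sharded across independent “nodes,” so that only activations cross node boundaries and no single participant ever holds the full set of weights.

```python
# Minimal sketch of model parallelism (illustrative only, not Pluralis's protocol):
# each "node" holds one layer's weights privately; a forward pass hops node to node,
# passing only activations. No single participant can reconstruct the full model.
import numpy as np

rng = np.random.default_rng(0)

class NodeShard:
    """One participant in the network, holding a single layer's weights."""
    def __init__(self, in_dim, out_dim):
        self.W = rng.normal(scale=0.02, size=(in_dim, out_dim))  # private to this node
        self.b = np.zeros(out_dim)

    def forward(self, activations):
        # Only activations cross the network boundary; the weights never leave the node.
        return np.maximum(activations @ self.W + self.b, 0.0)  # ReLU

# A toy 3-layer model split across 3 independent nodes.
nodes = [NodeShard(16, 64), NodeShard(64, 64), NodeShard(64, 8)]

x = rng.normal(size=(4, 16))   # a batch of inputs
for node in nodes:             # activations hop from node to node
    x = node.forward(x)
print(x.shape)                 # (4, 8): a full forward pass, yet no node held all the weights
```

The point is structural: because the weights themselves are partitioned across the network, the model can be developed in the open while ownership of it remains meaningful.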

This second feature represents a really thorny problem because it requires solving challenges at the intersection of distributed systems and AI. This intersection is largely unexplored because big tech incumbents have not needed to explore it; as centralized entities, they can assume homogenous and performant GPUs that are largely centrally hosted with high-speed interconnects. This is at odds with the permissionless and globally distributed system that Pluralis requires, which involves heterogeneous GPUs with highly variable speed and memory, interconnects of varying bandwidth and reliability, and large distances between nodes. 
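To get a feel for why interconnect bandwidth is the crux, here is a rough back-of-the-envelope calculation; the model size and link speeds below are illustrative assumptions, not figures from Pluralis.

```python
# Illustrative back-of-the-envelope: why naively syncing full model weights
# is workable inside a datacenter but infeasible over consumer links.
param_count = 7e9                        # assume a 7B-parameter model
bytes_per_param = 2                      # fp16 weights
payload = param_count * bytes_per_param  # ~14 GB per full weight sync

datacenter_bw = 400e9 / 8                # ~400 Gb/s datacenter interconnect, in bytes/s
consumer_bw = 100e6 / 8                  # ~100 Mb/s home connection, in bytes/s

print(f"Datacenter link: {payload / datacenter_bw:.2f} s per full sync")
print(f"Consumer link:   {payload / consumer_bw / 3600:.2f} h per full sync")
```

Numbers like these are why techniques built for the datacenter do not transfer directly to a globally distributed, consumer-grade network, and why communication efficiency and compression matter so much for this research.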

The Pluralis team is the right team to tackle this novel research problem. Led by founder Alexander Long, the team currently consists of five machine learning PhDs who all left AI research at Amazon to find better ways to build models. They have authored foundational machine learning papers and are world experts in research areas relevant to distributed training, such as implicit neural representations (highly relevant for compression). They are approaching the problem from first principles rather than assuming a centralized stack, building on recent research that suggests training over lower-bandwidth networks is feasible. We were deeply impressed by the progress the team has already made in its novel research around distributed model parallelism.

We believe the models built on Pluralis will be better than our current models built under the corporate form because they will not be subject to existing talent and resource supply constraints. By drawing a line of ownership directly between the model designers who need resources and the providers who have them, Pluralis will serve as a marketplace connecting model-design talent with compute and data providers. The chance to share in the upside of models designed by talent previously trapped inside labs will motivate new parties to serve as resource providers, just as the prospect of Bitcoin ownership forged a new mining industry. This will aggregate resource supply at levels simply not available inside walled gardens, leading to bigger and better models. And because the models will be publicly developed, model designers will be able to iterate on each other’s work, leading to more innovation in model development.

We could not be more excited to invest in Alexander and the Pluralis team as they use crypto’s superpower of ownership to solve one of the most important and challenging resource problems in software: making better AI that is by the people, for the people.


All information contained herein is for general information purposes only. It does not constitute investment advice or a recommendation or solicitation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. It should not be relied upon for accounting, legal or tax advice or investment recommendations. You should consult your own advisers as to legal, business, tax, and other related matters concerning any investment. None of the opinions or positions provided herein are intended to be treated as legal advice or to create an attorney-client relationship. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by Variant. While taken from sources believed to be reliable, Variant has not independently verified such information. Any investments or portfolio companies mentioned, referred to, or described are not representative of all investments in vehicles managed by Variant, and there can be no assurance that the investments will be profitable or that other investments made in the future will have similar characteristics or results. A list of investments made by funds managed by Variant (excluding investments for which the issuer has not provided permission for Variant to disclose publicly as well as unannounced investments in publicly traded digital assets) is available at https://variant.fund/portfolio. Variant makes no representations about the enduring accuracy of the information or its appropriateness for a given situation. This post reflects the current opinions of the authors and is not made on behalf of Variant or its Clients and does not necessarily reflect the opinions of Variant, its General Partners, its affiliates, advisors or individuals associated with Variant. The opinions reflected herein are subject to change without being updated. All liability with respect to actions taken or not taken based on the contents of the information contained herein are hereby expressly disclaimed. The content of this post is provided "as is;" no representations are made that the content is error-free.
