Sam Discusses Covenant’s Decentralized AI Stack on Hash Rate Pod


In an industry defined by scale, secrecy, and billion-dollar data centers, Covenant is attempting something that borders on the unthinkable: making frontier-model pre-training accessible to the entire world.

On episode 145 of Hash Rate Pod, Mark Jeffrey sat down with Covenant Labs’ Sam Dare to explore what may be one of the most ambitious projects in decentralized AI.

What emerged was not just a technical discussion, but a sweeping look at the future of AI infrastructure, the limits of centralized “colossi,” and the emergence of Templar (Subnet 03), Grail (Subnet 81), and Basilica (Subnet 39) as a vertically integrated ecosystem built on Bittensor.

Sam explained that Covenant’s mission is rooted in a simple but radical premise: the intelligence of the future should not be controlled by a handful of data centers, but produced collectively across the internet.

Mark pressed on this repeatedly, noting that the industry has long treated pre-training as a sacred, closed-door capability reserved for trillion-dollar giants. The revelation was that this constraint is beginning to break.

A New Model of Frontier-Level Training

The core of Covenant’s architecture rests on its three interconnected subnets:

a. Templar for pre-training,

b. Grail for post-training and evaluation, and

c. Basilica for decentralized compute.

Together, Sam noted, they form a self-reinforcing system capable of training large-scale models over dispersed compute rather than colocated superclusters. 
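To make that division of labor concrete, here is a minimal sketch of how the three subnets compose into one pipeline. The function names, interfaces, and placeholder values below are hypothetical illustrations, not Covenant’s actual APIs.

```python
# Illustrative sketch only: hypothetical names and interfaces, not Covenant's real APIs.
from dataclasses import dataclass


@dataclass
class ComputeLease:
    gpus: int
    price_per_gpu_hour: float  # sourced from Basilica's decentralized market


def rent_from_basilica(gpus: int) -> ComputeLease:
    """Basilica (Subnet 39): supply decentralized compute to the rest of the stack."""
    return ComputeLease(gpus=gpus, price_per_gpu_hour=1.50)  # placeholder price


def pretrain_on_templar(lease: ComputeLease, corpus: str) -> str:
    """Templar (Subnet 03): distributed pre-training over dispersed compute."""
    return f"base-model(corpus={corpus}, gpus={lease.gpus})"


def refine_on_grail(base_model: str) -> str:
    """Grail (Subnet 81): post-training and evaluation of the pre-trained model."""
    return f"refined({base_model})"


# The flywheel: compute feeds pre-training, which feeds post-training and evaluation.
lease = rent_from_basilica(gpus=1024)
model = refine_on_grail(pretrain_on_templar(lease, corpus="open-web-text"))
print(model)  # refined(base-model(corpus=open-web-text, gpus=1024))
```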

Mark pushed on the feasibility of such an approach, raising the long-standing skepticism that distributed training is “too slow.” Sam countered by pointing to Templar’s recent runs, including 72B-parameter models (currently the largest ever trained across the open internet), and emphasized that the system now performs within striking distance of centralized labs.

Mark characterized it succinctly: Covenant is turning the internet itself into a data center, replacing the monolithic colossus with a decentralized, Napster-like structure. And the numbers appear to support it. Sam estimated the system sits at roughly 60% of frontier-model capability, with rapid progress closing the gap faster than originally expected.

The Democratization of Pre-Training

Much of their discussion revolved around why pre-training has remained inaccessible for so long. The barrier is not GPUs alone, Sam explained, but the entire industrial ecosystem behind them: the real estate, the power infrastructure, the proprietary data pipelines, and the enormous operating overhead. Only a few global entities can justify these investments.

Templar disrupts these dynamics by enabling organizations to run custom pre-training at a fraction of traditional cost. Mark called this “a personal colossus for everyone,” a framing Sam didn’t dispute.

In practice, this means universities, startups, enterprises, and even governments could soon build domain-specific foundational models without needing access to hyperscale compute.

This shift is what Mark identified as the heart of Covenant’s thesis: pre-training is where intelligence is formed, and access to that process is becoming open rather than gated.

Commercialization: From Research Project to AI Stack

A major theme Sam emphasized is that Covenant is transitioning from pure research into commercialization. After a year of proving the science, the team is now opening enterprise APIs, launching custom pre-training services, and preparing to build applications on top of their own models. 

Mark underscored how unusual this is in the decentralized space, where many projects excel at research but struggle to convert it into revenue-driven products.

Covenant intends to change that dynamic.

The model is the product, but not the only one. Covenant is building everything from training code to evaluation frameworks to inference tools, creating a flywheel where internal research strengthens external offerings and vice versa.

Token Value and the Post–TAO Flow World

Mark repeatedly returned to the economic heart of the system: why subnet tokens matter. The new TAO Flow environment forces subnets to justify their existence through real demand.

For Templar, Sam framed the value clearly: all future revenue from custom training will flow back into the token through buybacks.

But he emphasized something deeper: Covenant is constructing a unified, vertically integrated AI stack, and future mechanisms will tie the value of its three subnets closer together.

Mark described this as buying a stake in “the decentralized AI lab of the future,” a framing that Sam allowed was directionally accurate.
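As a back-of-the-envelope illustration of the buyback mechanic Sam described, revenue earned from custom training would be converted into open-market purchases of the subnet token. The function and figures below are hypothetical; the episode does not specify how buybacks would be executed.

```python
# Hypothetical sketch of a revenue buyback; figures are illustrative only.
def buy_back(revenue_usd: float, token_price_usd: float) -> float:
    """Convert custom-training revenue into open-market purchases of the subnet token."""
    return revenue_usd / token_price_usd


# e.g. $500,000 of training revenue at a $2.00 token price would buy 250,000 tokens,
# returning that value to the subnet rather than letting it leave the ecosystem.
print(buy_back(500_000, 2.00))  # 250000.0
```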

Basilica and the Redesign of Decentralized Compute

Their conversation took a sharp turn into compute economics. Many compute subnets today operate with distorted incentives: miners overpaid to remain online, compute undersold to attract users, and token emissions wasted when miners cannot operate profitably. Sam sees this as unsustainable and outlined Basilica’s contrasting philosophy: reward miners only when they provide compute cheaper than centralized alternatives, and burn the rewards when they fail to meet that threshold.

Mark called this “flipping the economics right-side up.” Basilica positions itself not as a subsidized market, but as a competitive one that aligns real costs with real demand.
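A minimal sketch of that incentive rule follows, assuming a simple per-epoch settlement against a single centralized price benchmark. The exact benchmark, epoch structure, and burn mechanics are not specified in the episode, so everything below is an assumption.

```python
# Hypothetical illustration of "reward only below the centralized price, otherwise burn".
def settle_epoch(miner_price: float, centralized_price: float, emission: float) -> dict:
    """Pay the miner its epoch emission only if it undercuts the centralized
    baseline (e.g. USD per GPU-hour); otherwise burn the emission."""
    if miner_price < centralized_price:
        return {"reward": emission, "burned": 0.0}
    return {"reward": 0.0, "burned": emission}


# A miner offering $1.20/GPU-hour against a $1.80 baseline earns the emission;
# at $2.10 the emission is burned instead of being paid out.
print(settle_epoch(1.20, 1.80, emission=100.0))  # {'reward': 100.0, 'burned': 0.0}
print(settle_epoch(2.10, 1.80, emission=100.0))  # {'reward': 0.0, 'burned': 100.0}
```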

The Three Orders Function as One Machine

Throughout the discussion, it became clear that Covenant’s vision is not three subnets, but a single organism. Training breakthroughs in Templar feed directly into Grail, and evaluation tools flow downstream into Basilica. Compute improvements reinforce training efficiency. Research, economics, and infrastructure iterate together.

Sam described this as building a unified machine; Mark framed it as building an AI economy. Both descriptions fit.

Looking Two Years Into the Future

When pressed by Mark to articulate the success case, Sam laid out a surprisingly clear view: Templar becomes the world’s most cost-effective pre-training platform, used by institutions worldwide; Grail becomes the default environment for refinement and evaluation; Basilica becomes a sustainable decentralized compute cloud; and the entire stack becomes synonymous with open, permissionless frontier intelligence.

Mark noted that if such a system succeeds, it will not merely compete with centralized AI; it will redefine what the AI landscape looks like.

A Founder Learning at the Edge

One of the quiet through-lines of the conversation was Sam’s willingness to examine his own journey. He described Covenant not as a path of certainty, but as a steady reduction of cluelessness: making decisions until the next horizon becomes visible, then advancing again. Mark pointed out that such transparency is rare in a space dominated by secrecy.

That openness may be part of what makes Covenant’s work compelling. This is frontier research, built in public, moving quickly, and anchored in clear economic and philosophical boundaries.

A New Paradigm in the Making

As the discussion concluded, one theme stood above the rest: the era of centralized frontier AI is hitting physical and economic limits, and decentralized pre-training (once viewed as impossible) is emerging as a viable, competitive alternative.

Covenant stands at the center of that shift.

If their trajectory continues, frontier intelligence will not remain confined within a small circle of elite institutions. It will become something trained across the open internet, owned collectively, and accessible to anyone with the imagination to use it.

A new paradigm is forming, and Covenant is building the architecture beneath it.
