
For a long time, “Decentralized AI Training” was viewed as a science experiment, cool in theory but impossible in practice. The narrative was simple: “You can’t train massive AI models across the internet. It’s too slow.” But this week, that narrative shifted.
Jack Clark, co-founder of Anthropic (the company behind Claude AI) and former policy director at OpenAI, published an analysis highlighting Covenant AI’s Templar network as the largest active decentralized training setup in the world.
This caught fire on X when Bittensor co-founder const_reborn shared the news on January 5, 2026. The Bittensor community responded enthusiastically, with one prominent member calling Covenant AI’s developers “best in the space.” It’s a big deal when someone from the inner circle of Big Tech AI acknowledges what decentralized projects are building.
What Jack Clark Actually Said
Jack Clark’s analysis looked at whether decentralized AI training could scale to match frontier models like Grok 4. The answer? It’s technically possible, but the numbers show just how far behind decentralized efforts still are.

Right now, the biggest decentralized training runs are achieving about 6e22 to 6e23 FLOPs of total compute. That sounds like a lot until you realize it's about 1,000 times less than what was used to train Grok 4. FLOPs (floating-point operations) measure how much computing work goes into a training run, and more compute generally means a more capable model.
Covenant AI’s Templar network is currently running at roughly 9e17 FLOPs per second. Compare that to top centralized AI datacenters, which can hit around 3e20 effective FLOPs per second. That makes Templar about 300 times smaller than what the giants are working with.
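To see what those figures imply, here's a quick back-of-the-envelope check using only the numbers quoted above (all rough estimates, not official benchmarks):

```python
# Scale comparison built entirely from the estimates quoted in this article.
templar_flops_per_sec = 9e17       # Templar's reported throughput
frontier_flops_per_sec = 3e20      # effective throughput of a top centralized datacenter

biggest_decentralized_run = 6e23   # upper end of today's decentralized runs (total FLOPs)
grok4_estimate = biggest_decentralized_run * 1_000   # ~1,000x larger, per the analysis

print(f"throughput gap: {frontier_flops_per_sec / templar_flops_per_sec:.0f}x")   # ~333x
years = grok4_estimate / templar_flops_per_sec / (86_400 * 365)
print(f"one frontier-scale run at Templar's speed: ~{years:.0f} years")            # ~21 years
```

At today's throughput, a single frontier-scale run would take Templar roughly two decades, which is exactly why the efficiency multipliers discussed below matter.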
But here's why the Bittensor community is excited despite those numbers: decentralized training runs are now big enough to be measured against frontier labs at all. Industry insiders are paying attention. And the gap is closing.
Why This Recognition Matters
When someone like Jack Clark writes about your project, it means you’ve crossed a threshold. Clark isn’t just any tech blogger. He helps shape AI policy and regulation. He’s been at the center of the biggest AI developments of the past decade.
For decentralized AI to be taken seriously, it needs this kind of validation from people who understand the technology deeply and have credibility with policymakers, investors, and researchers. Getting mentioned in Clark’s analysis puts Covenant AI and Bittensor on the map in a way that community hype on X never could.
The Bittensor community has been building for years while most people focused on centralized AI labs. Now that work is getting recognized at the highest levels of the industry.
What Covenant AI Is Actually Building
Covenant AI operates on Bittensor’s network, which rewards people who contribute computing power to train AI models. Instead of one company with a massive data center, thousands of people around the world run computers that work together to train AI.

They run three main projects that work together. Templar handles the actual training of large AI models. Basilica provides a marketplace where people can rent computing power. Grail focuses on improving models after they’re initially trained. Together, these create a complete system for building AI from start to finish, all decentralized.
Their biggest achievement so far is training Covenant72B, a 72 billion parameter model. That might not sound like much compared to models with hundreds of billions of parameters, but it’s the largest AI model ever trained in a completely decentralized way over the public internet. People contributing computing power are scattered across the globe, coordinating through blockchain incentives rather than working for one company.
How They’re Closing the Gap
The 1,000-times gap sounds impossible to close, but Covenant AI has a plan: stacking several technologies whose efficiency gains multiply together.
DiLoCo reduces the bandwidth needed for distributed training by roughly 500 times. Bandwidth is the core problem when computers all over the world try to train one AI model together, because normally they have to exchange updates constantly. DiLoCo's trick is to let each worker take many optimization steps on its own data and only occasionally synchronize a compact summary of its progress, as sketched below.
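To make that concrete, here is a minimal, self-contained sketch of the DiLoCo-style pattern (many local steps, infrequent synchronization through an outer optimizer) on a toy linear-regression problem. This illustrates the general technique only; it is not Covenant AI's code, and every hyperparameter is made up for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 4 workers fit the same linear model, each on its own data shard.
DIM, WORKERS, LOCAL_STEPS, ROUNDS = 10, 4, 50, 20
INNER_LR, OUTER_LR, MOMENTUM = 0.05, 0.7, 0.9

true_w = rng.normal(size=DIM)
shards = []
for _ in range(WORKERS):
    X = rng.normal(size=(100, DIM))
    shards.append((X, X @ true_w + 0.01 * rng.normal(size=100)))

def grad(w, X, y):
    """Gradient of mean squared error for a linear model."""
    return 2 * X.T @ (X @ w - y) / len(y)

global_w = np.zeros(DIM)
velocity = np.zeros(DIM)   # the outer optimizer's momentum buffer

for _ in range(ROUNDS):
    deltas = []
    for X, y in shards:
        w = global_w.copy()                # each worker starts from the shared params
        for _ in range(LOCAL_STEPS):
            w -= INNER_LR * grad(w, X, y)  # many cheap local steps, zero communication
        deltas.append(global_w - w)        # "outer gradient": how far this worker moved

    # The only thing that crosses the network each round is one averaged delta.
    outer_grad = np.mean(deltas, axis=0)
    velocity = MOMENTUM * velocity + outer_grad
    global_w -= OUTER_LR * (outer_grad + MOMENTUM * velocity)  # Nesterov-style update

# Syncing once per 50 local steps cuts communication frequency 50x in this demo;
# with more local steps between syncs, the reduction grows accordingly.
print("distance from true weights:", np.linalg.norm(global_w - true_w))
```

The bandwidth saving comes from the loop structure itself: workers exchange one averaged delta per round instead of one gradient per step.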
They're also using quantization and sparsification, two ways to compress the data that computers need to share with minimal loss in accuracy: quantization stores each number in fewer bits, while sparsification sends only the most important values and skips the rest. When you multiply all these improvements together (fewer synchronizations, smaller payloads each time), you start closing that 1,000-times gap much faster than you'd expect.
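Here is an equally minimal illustration of both compression ideas applied to a stand-in gradient tensor. The top-1% cutoff and 8-bit format are arbitrary demo choices, not Templar's actual scheme:

```python
import numpy as np

rng = np.random.default_rng(1)
gradient = rng.normal(size=1_000_000).astype(np.float32)  # stand-in for one gradient tensor

# Sparsification: keep only the top 1% of entries by magnitude.
k = len(gradient) // 100
top_idx = np.argpartition(np.abs(gradient), -k)[-k:]
values = gradient[top_idx]

# Quantization: squeeze the surviving 32-bit floats into 8-bit ints plus one scale factor.
scale = np.abs(values).max() / 127.0
q_values = np.round(values / scale).astype(np.int8)

# What actually gets sent: k int8 values, k int32 indices, and one float scale.
sent_bytes = q_values.nbytes + top_idx.astype(np.int32).nbytes + 4
print(f"compression: {gradient.nbytes / sent_bytes:.0f}x")   # ~80x for these choices

# The receiver reconstructs a sparse, dequantized gradient.
recon = np.zeros_like(gradient)
recon[top_idx] = q_values.astype(np.float32) * scale
```

Systems that drop values this aggressively usually pair the trick with error feedback, accumulating the discarded residual locally and adding it back next round, so accuracy holds up over many steps.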
Jack Clark’s analysis, drawing from research by Epoch AI, noted that bandwidth won’t be a bottleneck for decentralized scaling in the near future. That’s huge because bandwidth has been the main argument against decentralized AI training ever working at scale.
The Bigger Picture
What Covenant AI is proving is that you don’t need billions in venture capital and a massive data center to train serious AI models. You need smart engineering, good incentive design, and a community willing to contribute computing power.
This matters for several reasons. First, it democratizes AI development. Right now, only a handful of companies can afford to train frontier AI models. If decentralized approaches work, anyone could contribute and benefit. Second, it creates competition for the tech giants. When alternatives exist, Big Tech can’t dictate all the terms. Third, it aligns with the crypto ethos of decentralization, where no single entity controls the infrastructure.
Covenant AI started as Templar AI in late 2024 when they launched Bittensor Subnet 3. At the time, most people thought decentralized training was impossible. The coordination problems seemed too hard. The bandwidth requirements seemed too high. The incentive alignment seemed too complex.
But they proved it could work. They trained a 1.2 billion parameter model using about 200 GPUs distributed around the world. That opened the door. By mid-2025, they’d expanded into a full ecosystem with Basilica and Grail. Now they’re training 72 billion-parameter models and getting recognized by Anthropic co-founders.
The community response to Jack Clark's analysis shows how hungry the decentralized AI space is for legitimacy. Comments like "only gonna get bigger" and the generally celebratory tone reveal that people working on these projects know they're building something important, even if mainstream tech hasn't noticed yet.
What Happens Next
Covenant AI is continuing to push the boundaries. They’re working on even larger models, improving their training infrastructure, and releasing research papers accepted at major conferences like NeurIPS. They hold weekly community calls where they share updates and technical details.
The goal isn’t just to match what centralized labs are doing—it’s to prove that decentralized approaches have unique advantages. Verifiable training, permissionless participation, and transparent development are all things centralized labs can’t or won’t offer.
For the Bittensor community and TAO holders, this recognition from someone at Anthropic’s level is exactly the kind of institutional attention that helps the ecosystem mature. It signals that decentralized AI isn’t just a crypto meme—it’s being watched by the people shaping the future of the industry.
