
Bittensor has never been easy to explain. The project sits at the intersection of decentralized compute, open-source AI, and crypto-economic theory — a combination that tends to lose people before the first sentence is finished. But in a recent long-form conversation, Jesus Martinez, one of the more prominent voices in the Bittensor community, delivered what may be the clearest distillation of the project’s thesis, its current state, and why he believes the window for early participation is closing fast.
His core argument is simple: centralized AI is economically fragile, open-source technology has a long history of winning, and Bittensor has just crossed a threshold that makes the whole experiment real. Here is the case he laid out.
The Centralized AI Problem
Martinez opened with a provocation: centralized AI has already failed; it just hasn’t admitted it yet.
The evidence, as he sees it, is financial. OpenAI is burning billions annually and recently shut down Sora, its video generation model, because the economics didn’t work. The company is now promising investors 17.5 percent returns while raising capital from geopolitically unstable regions. Meanwhile, DeepSeek demonstrated that open-source approaches can compete with the largest closed labs at a fraction of the cost.
This is not, Martinez argued, a new pattern. He cited a conversation with Mark Jeffrey, a partner at Stillcore Capital whose fund is reportedly targeting one percent of the total TAO supply. Jeffrey drew parallels to previous cycles where centralized incumbents lost to open alternatives. AOL dominated the early internet until open protocols rendered it irrelevant. Linux now runs an estimated 99 percent of the world’s servers and powers every Android phone on the planet. Bitcoin itself was dismissed as a scam for years before becoming the most significant financial innovation of the century.
The throughline is that no centralized player, no matter how well-funded, can ultimately compete against the collective resources of the entire world. A gigafactory in Texas is impressive. Every computer on the planet is more impressive.
How Bittensor Works
For listeners unfamiliar with the project, Martinez broke the mechanics into a few core concepts.
TAO is the base asset, the native token of the Bittensor network. It has a hard cap of 21 million tokens, identical to Bitcoin, with the same halving schedule. TAO completed its first halving in December 2025, placing it at roughly 2013 in Bitcoin-equivalent supply terms. The token is required for virtually everything on the network: staking, mining, and creating subnets.
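Since TAO inherits Bitcoin’s cap and halving schedule, the supply curve is easy to sketch. Below is a minimal illustration assuming a pure Bitcoin-style geometric schedule; the function name and structure are illustrative, not the network’s actual code.

```python
# Illustrative Bitcoin-style halving arithmetic applied to TAO's 21M hard cap.
# Each halving era issues half of what remains toward the cap, so cumulative
# issuance after k halvings is cap * (1 - 2**-k). A sketch, not network code.

HARD_CAP = 21_000_000

def issued_after_halvings(halvings: int, cap: int = HARD_CAP) -> float:
    """Total supply issued once `halvings` halving events have completed."""
    return cap * (1 - 2 ** -halvings)

# After the first halving (December 2025), roughly half the cap is issued,
# the same point Bitcoin reached around 2012-2013:
print(issued_after_halvings(1))  # 10500000.0
```

The geometric schedule is why each successive halving matters less in absolute terms: the second halving adds only a quarter of the cap, the third an eighth, and so on.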
Subnets are the operational units of the network. There are currently 128 of them, each functioning as an AI startup with its own team, its own task, and its own competitive dynamics. Miners within each subnet do the actual work (providing compute, training models, delivering data, and so on) while validators score their output. Only the best-performing miners get paid. If a subnet isn’t producing something genuinely useful, its economics collapse and it gets replaced. The system is, by design, ruthless.
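The scoring-and-payout loop can be sketched in a few lines. This is a deliberately simplified illustration of the pattern (validators score, top miners split emissions), not Bittensor’s actual weighting mechanism, Yuma Consensus; the miner names and the top-k cutoff are hypothetical.

```python
# Hypothetical sketch of score-weighted miner payouts: only the top-scored
# miners share the emission, pro rata by score. Not the real Yuma Consensus.

def payout(scores: dict[str, float], emission: float, top_k: int = 2) -> dict[str, float]:
    """Split `emission` among the `top_k` highest-scored miners, pro rata."""
    winners = sorted(scores, key=scores.get, reverse=True)[:top_k]
    total = sum(scores[m] for m in winners)
    return {m: emission * scores[m] / total for m in winners}

scores = {"miner_a": 0.9, "miner_b": 0.6, "miner_c": 0.1}
print(payout(scores, emission=100.0))
# miner_a and miner_b split the reward; miner_c earns nothing
```

The cutoff is what makes the system “ruthless”: a miner who is merely mediocre does not earn a smaller reward, it earns nothing.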
Martinez offered a one-line summary that has been gaining traction in the community: TAO is the S&P 500 for AI. It sits in the liquidity pool opposite every subnet token, meaning that if any individual subnet explodes in value, TAO benefits directly. Hold the base asset, and you gain passive exposure to the aggregate performance of the entire ecosystem. Go into individual subnet tokens and you take on more risk for potentially higher returns — individual stocks versus the index.
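The “S&P 500” effect follows from basic automated-market-maker math. A minimal constant-product sketch, with made-up reserve numbers, shows that buying a subnet token both locks more TAO in the pool and raises the token’s TAO-denominated price; Bittensor’s real pools may differ in detail.

```python
# Simplified constant-product (x * y = k) pool with TAO on one side and a
# subnet ("alpha") token on the other. Reserves are invented for illustration.

def buy_subnet_token(tao_in: float, tao_reserve: float, alpha_reserve: float):
    """Swap TAO into the pool for alpha tokens, preserving x * y = k."""
    k = tao_reserve * alpha_reserve
    new_tao = tao_reserve + tao_in
    new_alpha = k / new_tao
    alpha_out = alpha_reserve - new_alpha
    return alpha_out, new_tao, new_alpha

tao_r, alpha_r = 10_000.0, 100_000.0          # starting price: 0.1 TAO per alpha
alpha_out, tao_r, alpha_r = buy_subnet_token(1_000.0, tao_r, alpha_r)
print(round(tao_r / alpha_r, 4))  # new price per alpha in TAO, higher than 0.1
```

Demand for any subnet token flows through TAO: buyers must bring TAO to the pool, which is why holding the base asset gives passive exposure to every subnet’s upside.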
The comparison to traditional crypto projects was pointed. Martinez contrasted TAO’s tokenomics with Chainlink, which generates substantial oracle revenue but whose token, he argued, suffers from a lack of meaningful holder incentives. TAO’s design, inherited from Bitcoin’s proven model but redirected toward an incentive layer for AI work, avoids that disconnect. The token isn’t just governance or speculation; it’s the economic fuel the network runs on.
Templar: The Proof That Changed Everything
Before Subnet 3’s breakthrough, the network was essentially running on promise. The token existed, the architecture existed, but no subnet had done anything dramatic enough to force the outside world to pay attention.
Templar changed that. The team completed the first-ever 72-billion-parameter decentralized large language model training run. Not a fine-tune, but full pre-training from scratch, coordinated across permissionless GPU contributors on the open internet. It was an achievement that critics within the centralized AI world had explicitly said was impossible. Anthropic’s CEO has since referenced it publicly multiple times in quick succession.
Martinez was careful to temper expectations: the resulting model is not yet competitive with the latest offerings from OpenAI, Anthropic, or Google. But that isn’t the point. The point is that the architecture works. If the network can coordinate a 72-billion-parameter run, it can coordinate a larger one. And unlike a centralized lab constrained by the capacity of its own data centers, a decentralized network can scale its contributor base essentially without limit.
The implications ripple outward. Bittensor now has something almost no other crypto project can claim: proof. Not a roadmap, not a promise, not a theoretical framework, but a published paper and a trained model.
Subnets on the Ground
To make the abstract concrete, Martinez walked through several subnets that illustrate the range of what the network is producing.
Targon (Subnet 4) is a decentralized compute marketplace where anyone can contribute GPU power and earn rewards. The subnet uses collateral requirements and slashing mechanisms to ensure quality — if a contributor provides bad compute or goes offline, they lose their stake. Prices run dramatically below centralized alternatives, with some offerings at 70 to 90 percent below AWS equivalents. Martinez noted that Targon co-authored a paper with Intel on encrypted compute, making it what he described as the only decentralized compute provider in the world capable of guaranteeing data privacy at the cryptographic level. For institutions — hedge funds protecting proprietary strategies, government entities handling sensitive operations — this addresses a trust problem that centralized cloud providers fundamentally cannot solve. Targon’s co-founder recently reported triple-digit week-over-week growth.
Quasar (Subnet 24) is tackling long-range context degradation in large language models — the problem where extended conversations lose coherence as token counts climb. The team, which Martinez described as two young founders with an ambitious technical vision, has published a paper on a method they believe can enable lossless conversation at extreme lengths. Miners contribute compute to train the underlying model, and the early results have drawn attention within the community.
IOTA offers a consumer-facing entry point. Martinez demonstrated on-screen how anyone with a capable MacBook can download a “Train at Home” application on macOS, contribute compute to real model training tasks, and earn IOTA token rewards automatically. The returns are modest — roughly a dollar a day on a MacBook Pro in his testing — but the accessibility is the point. Anyone, anywhere, can participate in the network’s work without specialized hardware or institutional access.
The barrier to entry on the other side of the market is equally low compared to incumbents. Martinez contrasted the experience of procuring compute from AWS — which he said can require multi-year futures contracts and lengthy approval processes — with simply purchasing compute on Targon with no intermediary and no gatekeeping.
The Investment Case
Martinez did not shy away from the financial angle. He framed the opportunity in starkly asymmetric terms: TAO is a roughly $3 billion asset operating in a $4 trillion industry. If open-source AI captures even a modest share of that market, the upside from current valuations is enormous.
He cited Mark Jeffrey’s price target of $3,000 per TAO by year-end — a figure that would imply a market cap in the $30 to $50 billion range. Martinez acknowledged this is aggressive but argued it is not unreasonable given the pace at which the AI industry moves and the catalysts now in place.
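The arithmetic behind that range is straightforward if roughly half the 21-million cap is circulating after the first halving; the circulating figure is an assumption here, since the article does not state one.

```python
# Quick check of the implied market cap at Jeffrey's price target, assuming
# ~half the 21M cap (one completed halving) is circulating. Illustrative only.

price_target = 3_000            # USD per TAO, Jeffrey's year-end figure
circulating = 10_500_000        # assumed: ~half of the 21M cap after one halving
print(price_target * circulating)  # 31500000000, i.e. $31.5B, inside the $30-50B range
```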
The deeper structural argument is about access. The most valuable AI companies in the world — OpenAI, Anthropic, and their peers — have been funded through insider rounds that the public never sees. By the time an IPO arrives, the valuation is already in the trillions. Bittensor, by contrast, is fully accessible on major exchanges. Anyone can buy TAO on Coinbase or Kraken and gain exposure to the only credible decentralized alternative to the closed AI labs.
Currently, only about 20 percent of all TAO is staked into subnets. The remaining 80 percent represents latent capital that could flow into subnet liquidity pools as awareness and confidence grow — a dynamic that could meaningfully tighten supply.
Staking yields reflect the ecosystem’s early stage. TAO’s base staking yield has compressed from roughly 150 percent APY three years ago to around 5 percent today. But individual subnets still offer significantly higher returns — Templar was yielding approximately 40 percent at the time of the conversation — a premium that rewards early participants willing to do the research and take the risk.
The Bigger Picture
Stepping back from individual subnets and token prices, Martinez situated Bittensor within a broader thesis about what crypto is actually for.
The crypto market, he argued, has been clarifying. The top-performing tokens in recent months have been privacy coins and genuinely decentralized projects — assets that hew closest to the original cypherpunk vision of peer-to-peer systems beyond the reach of centralized control. The speculative altcoin casino that defined the last cycle, epitomized by the memecoin frenzy on Solana, has largely burned out. What remains is a smaller but more serious cohort of builders working on things that matter.
Bittensor fits squarely in that camp. It is attempting to ensure that the most transformative technology of the era, artificial intelligence, is not controlled by a handful of companies that can censor output, sell user data, or collapse under their own financial weight. The network’s permissionless design means that a genius with an idea and no funding can come to Bittensor, define a task, attract miners, and build something real. And every participant — from a MacBook owner earning a dollar a day to a fund deploying millions into subnet tokens — shares in the upside of whatever the collective builds.
Whether the experiment succeeds at the scale its advocates envision remains genuinely uncertain. Not all 128 subnets will survive. The technology is early and the ecosystem is still rough around the edges. But Martinez’s closing observation lingered: in a crypto landscape starved for projects that have actually built something, Bittensor has papers, products, and proof. For a $3 billion asset in a $4 trillion industry, that may be all the edge an early participant needs.
This article is based on a conversation between Jesus Martinez and Lewis Jackson on YouTube. It has been condensed and organized for clarity.