Const Explains Bittensor: Rebuilding AI With Bitcoin’s DNA

During a Hackquest event at Tsinghua University in Beijing, Const, the co-founder of Bittensor, explained why Bittensor exists, how it generalizes Bitcoin’s mining model, and why incentive-driven networks may outperform centralized AI labs.

His talk focuses on Bittensor’s fundamentals, not token prices.

Watch below:

Speaker (Const) Background

  • Studied AI and mathematics in Canada, graduated in 2015.
  • Worked as a DARPA contractor building neuromorphic chips.
  • Discovered Bitcoin in 2016, later worked at Google Brain.
  • Quit to start Bittensor nearly a decade ago.
  • Currently based in Peru.
  • Emphasizes this is not an investment pitch, but a technical and philosophical framework.

Why AI Changed After 2012

  • Pre-2010 AI relied on handcrafted features, and progress with methods like SVMs had stagnated.
  • AlexNet in 2012 introduced representation learning via gradient descent.
  • Models stopped being told what to look for and instead learned representations themselves.
  • This shift ended the AI winter and unlocked modern deep learning.

The Universal Pattern of Intelligence

  • All learning systems follow the same loop: state → objective → feedback → adaptation.
  • This appears in neural networks, reinforcement learning, genetic algorithms, and even natural systems.
  • Examples include slime molds solving mazes, tree growth, lightning paths, and river deltas.
  • Intelligence is framed as energy-seeking, adaptive behavior.
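The loop described above can be sketched as a few lines of code. This is a purely illustrative example (1-D gradient descent); the function name and numbers are not from the talk.

```python
# A minimal sketch of the state -> objective -> feedback -> adaptation loop,
# shown here as 1-D gradient descent. All names and values are illustrative.

def adaptive_loop(steps: int = 100, lr: float = 0.1) -> float:
    x = 10.0                     # state: the system's current configuration
    objective = lambda s: s * s  # objective: a quantity the system minimizes
    for _ in range(steps):
        feedback = 2 * x         # feedback: gradient of the objective at the state
        x = x - lr * feedback    # adaptation: move the state against the feedback
    return x

print(adaptive_loop())  # converges toward the objective's minimum at x = 0
```

The same four-stage skeleton fits the other examples in the list: swap the gradient for fitness scores and you have a genetic algorithm, or for chemical signals and you have a slime mold.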

Bitcoin as a Self-Adaptive Computer

  • Bitcoin is presented not as money, but as a global, self-adaptive computer.
  • It follows the same adaptive loop used in learning systems.
  • Miners adapt hardware and strategy based on feedback from the network.

Bitcoin’s Scale and Efficiency

  • Bitcoin is the largest supercomputer in the world.
  • Produces roughly 450,000 exaflops of compute.
  • Consumes electricity comparable to Thailand.
  • Achieves this for $50–300 billion, versus $1 trillion for ~1,000 exaflops in traditional compute.
  • This makes Bitcoin 700–9,000 times more efficient at producing hashes.
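The comparison above can be checked with back-of-the-envelope arithmetic. Using exactly the figures quoted ($50–300B for 450,000 exaflops vs. $1T for 1,000 exaflops), the ratio works out to roughly 1,500–9,000×; the talk’s 700× lower bound presumably rests on slightly different assumptions.

```python
# Back-of-the-envelope check of the cost-per-exaflop figures quoted above.
btc_exaflops = 450_000
btc_cost_low, btc_cost_high = 50e9, 300e9  # $50B-$300B total
trad_exaflops = 1_000
trad_cost = 1e12                           # ~$1T total

btc_low = btc_cost_low / btc_exaflops      # cheapest case per exaflop
btc_high = btc_cost_high / btc_exaflops    # most expensive case per exaflop
trad = trad_cost / trad_exaflops           # $1B per exaflop

ratio_low = trad / btc_high                # ~1,500x
ratio_high = trad / btc_low                # ~9,000x
print(f"{ratio_low:,.0f}x to {ratio_high:,.0f}x more cost-efficient")
```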

Why Bitcoin Works So Well

  • Permissionless and borderless participation.
  • Operates 24/7 with no downtime.
  • No HR, no credentials, no bias.
  • Rewards only performance, not background.
  • Capital flows directly to those doing useful work.

Incentive Computing

  • Introduced as a new computing paradigm.
  • Similar in importance to machine learning or reinforcement learning.
  • The question posed:
    Can Bitcoin’s incentive structure be used for useful work instead of hashes?

What Bittensor Is

  • Bittensor generalizes Bitcoin’s structure:
    • Miners perform any type of work.
    • Validators evaluate any type of work.
    • The network emits rewards based on performance.
  • It functions like a programming framework for incentive systems, similar to PyTorch for neural networks.
  • Consensus is reached on who contributes the most value.
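The miner/validator/emission structure above can be sketched as a toy market. This is an illustration only, not Bittensor’s actual Yuma Consensus algorithm; all names and numbers are hypothetical.

```python
# Toy sketch of the incentive structure described above: validators score
# miners' work, and the network emits rewards in proportion to consensus.
# Illustrative only -- not Bittensor's actual consensus mechanism.

def emit_rewards(scores: dict[str, list[float]], emission: float) -> dict[str, float]:
    """scores maps each miner to the scores assigned by each validator."""
    # consensus here is simply each miner's average score across validators
    consensus = {m: sum(v) / len(v) for m, v in scores.items()}
    total = sum(consensus.values())
    # each miner's share of the emission is proportional to its consensus score
    return {m: emission * c / total for m, c in consensus.items()}

rewards = emit_rewards(
    {"miner_a": [0.9, 0.8], "miner_b": [0.1, 0.2]},
    emission=100.0,
)
print(rewards)  # miner_a, with the higher consensus score, earns most of the emission
```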

Coding Intelligence Breakthrough – Ridges

  • A coding subnet (Ridges) incentivized AI coding agents, evaluated on SWE-bench.
  • Over three months, performance surpassed Claude and OpenAI models.
  • The system did not define the solution, only incentives.
  • A previously unknown miner built a 7,000-line autonomous coding agent.
  • Top performers earned up to $60,000 per day.

Decentralized Model Training – Gradients

  • Training frontier models normally requires tens of thousands of GPUs.
  • A Bittensor subnet called Gradients creates a market for gradient computation.
  • Participants are paid for producing gradients that reduce loss faster than others.
  • This enables fully decentralized training of large models across the internet.
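The market described above, in which participants are paid for loss-reducing gradients, can be sketched with a toy selection rule. This is a hypothetical illustration, not the actual Gradients subnet protocol; the function and participant names are invented.

```python
# Toy sketch of a gradient market: participants submit candidate gradients,
# and the one that reduces the loss the most wins the reward.
# Illustrative only -- not the actual Gradients subnet protocol.

def loss(w: float) -> float:
    return (w - 3.0) ** 2  # toy objective with its minimum at w = 3

def select_winner(w: float, lr: float, proposals: dict[str, float]) -> str:
    """Return the participant whose gradient yields the lowest post-step loss."""
    return min(proposals, key=lambda p: loss(w - lr * proposals[p]))

# "honest" submits the true gradient 2 * (w - 3) = 14 at w = 10; the others do not
winner = select_winner(w=10.0, lr=0.1,
                       proposals={"honest": 14.0, "noisy": 5.0, "lazy": 0.0})
print(winner)  # the true gradient reduces the loss fastest -> "honest"
```

Because payment depends only on measured loss reduction, honest gradient computation is the profit-maximizing strategy, which is what lets the training run span untrusted machines across the internet.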

GPU Mining and Physical Resources

  • Miners contribute verifiable GPU resources.
  • GPUs are rented at globally lowest rates due to open competition.
  • Demonstrates incentive computing applied to physical infrastructure.

Inference at Global Scale – Chutes

  • Miners provide inference endpoints evaluated on speed and reliability.
  • Chutes (a Bittensor subnet) became the largest open-source inference provider on OpenRouter.
  • At times, served more DeepSeek inferences than DeepSeek itself.

Robotics and Beyond

  • Incentive computing extends to robotics and control systems.
  • Drone control models are evaluated in simulations and rewarded by performance.
  • The system can optimize software, hardware, data, and physical behavior.

Broad Ecosystem Applications (other existing subnets)

  • Stock trading signal generation.
  • Sports betting and prediction markets.
  • Vision language models for sports analytics.
  • AutoML systems.
  • Multiple real-world domains powered by the same incentive structure.

Bottom line

Bittensor reframes AI development as an economic coordination problem, not an engineering one. By generalizing Bitcoin’s incentive model, it enables global, permissionless optimization that can outperform centralized labs across software, hardware, and intelligence itself.
