Decentralized AI Milestone: Covenant + Gradients Create LLM Training Pipeline on Bittensor


SUMMARY: This episode of Covenant’s TGIF community chat walked through a major milestone in building an end-to-end, open-weights AI model fully trained on Bittensor. The team explained that they reached checkpoint two, released a working chat app, and proved that large-scale pre-training, post-training, and deployment can happen across collaborating subnets.

In this upgrade, Templar handled pre-training, while Gradients enabled instruction fine-tuning, chat capability, and a major context-length expansion without requiring additional compute. Together, these tools show how decentralized infrastructure can turn the internet into a training factory and significantly reduce the cost of building frontier models.

While the model is still early and unaligned, the achievement signals real progress toward producing state-of-the-art AI natively on Bittensor.

By: Covenant Labs
