Bittensor Covenant-72B Wins Praise From Chamath and Jensen Huang, TAO Jumps 24%

Bittensor’s decentralized network has produced Covenant-72B, a 72-billion-parameter large language model trained by more than 70 independent contributors with no central server, drawing public praise from venture capitalist Chamath Palihapitiya and Nvidia CEO Jensen Huang. TAO, Bittensor’s native token, surged 24% after a video of their comments circulated on social media.

Covenant-72B: 72 Billion Parameters, Zero Central Infrastructure

Covenant-72B was built on Bittensor’s Subnet 3, known as Templar, using approximately 1.1 trillion training tokens. The model was assembled by over 70 contributors worldwide who pooled compute resources through the Bittensor protocol, with no central server coordinating the process.

That scale puts Covenant-72B in the same parameter tier as models from major centralized AI labs. Benchmark results place it in direct competition with established centralized models, including Meta’s similarly sized Llama models.

The distinction is not just the model’s size but how it was produced. Centralized training runs from companies like Meta, Google, and OpenAI rely on massive proprietary data centers. Covenant-72B was trained across a distributed network of independent participants, each contributing excess compute, with no single entity controlling the training pipeline.

For readers tracking the evolution of decentralized infrastructure projects, this represents a concrete proof point: distributed compute can now produce frontier-scale AI models, not just smaller experimental ones.

Chamath Calls It ‘Crazy,’ Huang Says Open and Proprietary AI Are ‘A and B’

Chamath Palihapitiya spotlighted Bittensor on the All-In Podcast, describing the distributed training approach as “a pretty crazy technical accomplishment.”

“They managed to train a 4 billion parameter LLaMA model, totally distributed, with a bunch of people contributing excess compute.”

A clarification is warranted: Chamath’s quote references an earlier Bittensor milestone involving a 4-billion-parameter model. The current headline achievement, Covenant-72B, is a far larger 72-billion-parameter model representing the project’s latest and most significant training run.

Jensen Huang’s comments carry arguably greater strategic weight. The Nvidia CEO framed the relationship between open and proprietary AI not as a competition but as a coexistence.

“These two things are not A or B; it’s A and B. There is no question about it.”

Huang went further, describing AI models as “a technology, not a product,” and noting that industries with deep domain expertise need open models to capture and control that knowledge. He added that “every startup we’re investing in now is open source first, and then going to the proprietary model.”

This framing matters because it comes from the CEO of the company that supplies the vast majority of AI training hardware. When Nvidia signals that decentralized, open-source AI is complementary to proprietary systems rather than a threat, it lends institutional credibility to the thesis behind projects like Bittensor. Most competing coverage of this story focused on the TAO price spike and basic specs; few outlets highlighted that Nvidia’s own leadership is publicly validating the open AI model.

It is worth noting that Huang’s comments were not made exclusively about Bittensor. His remarks addressed the broader open-source and decentralized AI landscape, and the primary source article contextualized them alongside Bittensor’s milestone.

TAO Surges 24% as Volume Hits $406M in an Extreme Fear Market

TAO jumped 24% immediately after the video of Palihapitiya and Huang circulated across social media. The token was trading at $284.73 at press time, with a market capitalization of approximately $2.73 billion, placing it at rank #35 among cryptocurrencies.

The 24-hour trading volume reached $405.95 million, producing a volume-to-market-cap ratio of roughly 15%, a level that typically signals high-conviction trading activity rather than routine speculation.
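As a quick sanity check, the volume-to-market-cap ratio can be reproduced in a few lines (a minimal sketch using the figures quoted in this article at press time, not live market data):

```python
# Reproduce the volume-to-market-cap ratio from the article's press-time snapshot.
volume_24h = 405.95e6   # 24-hour trading volume in USD (from the article)
market_cap = 2.73e9     # market capitalization in USD (from the article)

ratio = volume_24h / market_cap
print(f"volume/market-cap ratio: {ratio:.1%}")
```

Dividing the two figures gives about 14.9%, consistent with the "roughly 15%" cited above.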

The move is especially notable given the broader market environment. The Fear & Greed Index sat at 11, deep in “Extreme Fear” territory, yet TAO posted gains of 31.44% over seven days and 48.58% over 30 days. That kind of divergence from overall market sentiment, particularly during a period when macroeconomic headwinds continue to weigh on risk assets, suggests the rally was driven by project-specific catalysts rather than broad market momentum.

TAO’s circulating supply stands at 9.6 million of a 21 million maximum, a scarcity profile that mirrors Bitcoin’s fixed-supply design. The combination of a hard technical milestone, mainstream AI leader endorsement, and constrained supply created a concentrated catalyst that the market priced in rapidly.
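The supply and valuation figures above are internally consistent, which can be verified with simple arithmetic (again a sketch over the article's own numbers, not a live data feed):

```python
# Cross-check the article's quoted price, supply, and market cap.
price = 284.73          # TAO price in USD at press time (from the article)
circulating = 9.6e6     # circulating supply (from the article)
max_supply = 21e6       # maximum supply, mirroring Bitcoin's cap

implied_cap = price * circulating
print(f"implied market cap: ${implied_cap / 1e9:.2f}B")      # matches ~$2.73B
print(f"share of max supply issued: {circulating / max_supply:.0%}")
```

Price times circulating supply lands on roughly $2.73 billion, matching the reported market cap, with about 46% of the maximum supply in circulation.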

Whether the current price level holds will depend on whether Bittensor can follow Covenant-72B with continued benchmark performance and broader adoption of its decentralized training infrastructure. The next concrete test will be how Covenant-72B performs as independent researchers evaluate it against centralized alternatives in real-world applications.

Disclaimer: This article is for informational purposes only and does not constitute financial or investment advice. Cryptocurrency and digital asset markets carry significant risk. Always do your own research before making decisions.