Elon Musk’s Colossus Supercomputer Launches with 200,000 GPUs and 300 Megawatts of Power Capacity

Elon Musk’s xAI has launched its supercomputer, Colossus, in Memphis, Tennessee.
The system is powered by 200,000 NVIDIA GPUs, making it the world’s largest AI training cluster.
Colossus is housed in a former Electrolux factory, which was adapted to meet the project’s needs.
The location was chosen for its existing infrastructure and potential for rapid expansion.

Powering the Beast

The supercomputer uses 150 megawatts from the local grid and another 150 megawatts from Tesla’s Megapacks.
These massive batteries provide energy stability during high demand or power outages.
Total power demand is expected to reach 300 megawatts in the next phase, roughly enough to supply 300,000 homes.
xAI uses the system to train its language models, including Grok, which is integrated into the X platform, formerly known as Twitter.

A Boost for Musk’s Companies

The processing power of Colossus will also benefit other Musk companies, such as Tesla and SpaceX.
The project initially used natural gas turbines but is now shifting towards more sustainable solutions.
A new substation and extensive use of Megapacks will reduce the data center’s carbon footprint and improve energy resilience.
NVIDIA CEO Jensen Huang has praised the project’s scale and speed.
xAI aims to scale to 1 million GPUs in a later phase, a step that would make Memphis a major AI hub.
