Elon Musk Buys Thousands of GPUs for Twitter's Generative AI Project

(Image credit: Nvidia)

Despite advocating for an industry-wide halt to AI training, Elon Musk has reportedly kicked off a major artificial intelligence project within Twitter. The company has already purchased approximately 10,000 GPUs and recruited AI talent from DeepMind for the project that involves a large language model (LLM), reports Business Insider.

One source familiar with the matter said that Musk’s AI project is still in its early stages. However, according to another person, the purchase of a significant amount of additional compute power signals his commitment to advancing it. The exact purpose of the generative AI is unclear, but potential applications include improving search functionality and generating targeted advertising content.

At this point, it is unknown exactly what hardware Twitter procured. However, the company has reportedly spent tens of millions of dollars on these compute GPUs despite its ongoing financial problems, which Musk has described as an ‘unstable financial situation.’ These GPUs are expected to be deployed in one of Twitter’s two remaining data centers, with Atlanta being the most likely destination. Interestingly, Musk closed Twitter’s primary data center in Sacramento in late December, which obviously lowered the company’s compute capabilities.

In addition to buying GPU hardware for its generative AI project, Twitter is hiring additional engineers. Earlier this year, the company recruited Igor Babuschkin and Manuel Kroiss, engineers from AI research firm DeepMind, a subsidiary of Alphabet. Musk has been actively seeking talent in the AI industry to compete with OpenAI’s ChatGPT since at least February.

OpenAI used Nvidia’s A100 GPUs to train its ChatGPT bot and continues to use these machines to run it. Nvidia has since launched the A100’s successor, the H100 compute GPU, which is several times faster at around the same power. Twitter will likely use Nvidia’s Hopper H100 or similar hardware for its AI project, though we are speculating here. Considering that the company has yet to determine what its AI project will be used for, it is hard to estimate how many Hopper GPUs it may need.

When big companies like Twitter buy hardware, they procure thousands of units and negotiate special rates. Meanwhile, when purchased individually from retailers like CDW, Nvidia’s H100 boards can cost north of $10,000 per unit, which gives an idea of how much the company might have spent on hardware for its AI initiative.
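As a rough back-of-the-envelope illustration, assuming the reported figure of roughly 10,000 GPUs and the ~$10,000 retail price per board: 10,000 × $10,000 would come to about $100 million at list prices, so even a steep volume discount would leave the bill in the tens of millions of dollars, consistent with the reported spend.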

Anton Shilov
Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and the latest fab tools to high-tech industry trends.

  • AI is the next scam.

    After decades of Tesla using “AI” and “machine learning”, the cars are still horrendous at self-parking. Companies not relying on “AI” for parking have been able to properly park for years.

  • Am I dumb, or is AI the next-gen business after mining?

  • bniknafs9 said:

    Am I dumb, or is AI the next-gen business after mining?

    Yes and no. The word AI has definitely become the hot keyword: simply put AI in your company name, throw it out in a conference call, or say you’re doing something (anything) with it, and you will get a stock pop. Most companies that claim they are doing AI work aren’t, or are simply leveraging existing ML tools to extend some existing functionality rather than doing anything that will move the needle sales-wise.

    However, AI in computer vision has been a big step forward, and AI in the form of LLMs has also been able to do things that were difficult to do with lines of code (not impossible, but difficult and expensive). In this way AI is a next-gen business, but with some caveats. Companies jamming LLMs into every product and calling it a revolution, though, are largely just cashing in on the hype.

    In my personal opinion, the companies that best leverage AI will be the winners, but I don’t think there will be a single company that gets associated with AI the way search is with Google or streaming is with Netflix. People are trying to do it with ChatGPT, but ChatGPT is just one type of AI; there are many, many forms of AI/ML that are good at specific tasks, and I think the idea of one model to rule them all is unlikely at this point in the cycle/evolution.

  • PlaneInTheSky said:

    AI is the next scam.

    It’s not.

    AI has uses; it just needs to mature properly.

    Especially in fields like medicine, where there’s a ton of info, having an AI take your input and offer outputs can save you time.
    Same for coding, since bug testing your code is time-consuming and having AI do it for you quickly is helpful.

    Whatever Musk wants it for, though, is for sure a waste of money, effort, and power.
