Elon Musk’s chatbot has started training, OpenAI will produce chips, and GPT-4o is at risk of losing to the new Llama 3.1 405B — the top 3 AI news stories of the week

Our latest AI Digest covers the biggest breaking AI news of the week. Anywhere Club community leader, Viktar Shalenchanka, comments on key stories.

Published in AI · 24 July 2024 · 2 min read

#1 — The new Grok AI is already in training

Elon Musk has announced that xAI has begun training its chatbot, Grok, on a new supercluster in Tennessee, USA. The cluster is equipped with 100,000 Nvidia H100 GPUs — four times as many GPUs as, and 2.5 times the compute of, what OpenAI used to train GPT-4. AI market leaders have yet to disclose the capacity of their current clusters. One thing is clear: Musk has entered the AI race in earnest — Grok 3 is worth following.

#2 — OpenAI will be creating chips

GPUs for training and running neural networks are a major expense for data centers. Recent figures suggest that the cost of Nvidia’s H100 starts at approximately $25,000 per GPU, although prices vary from country to country. Last year, Sam Altman announced that OpenAI would set up its own AI chip manufacturing. The Information published the latest news about it:

  • The chip design will not be ready until 2026.
  • OpenAI is hiring former Google engineers who know how to make chips.
  • OpenAI is discussing manufacturing ventures with Intel, Samsung, TSMC, and Broadcom.
  • Altman believes that the demand for AI computing power will increase, so production should be set up now.

If OpenAI starts manufacturing its own chips, it will remain one of the most influential AI companies in the world.

#3 — Llama 3.1 405B is available

It’s here: Llama 3.1 405B (yes, with 405 billion parameters). The weights and benchmark results have already appeared online. According to tests covering areas such as school-level problem solving, reasoning, code writing, and science knowledge, the new Llama may outperform GPT-4o, the current leader in the AI industry. Support for new languages was added: the model now covers English, French, German, Hindi, Italian, Portuguese, Spanish, and Thai. We await OpenAI’s counter-move — they risk losing to the open-source model on performance!
