China’s AI firms are cleverly innovating around chip bans
Tweaks to software blunt the shortage of powerful hardware
TODAY’S TOP artificial-intelligence (AI) models rely on large numbers of cutting-edge processors known as graphics processing units (GPUs). Most Western companies have no trouble acquiring them. Llama 3, the newest model from Meta, a social-media giant, was trained on 16,000 H100 GPUs from Nvidia, an American chipmaker. Meta plans to stockpile 600,000 more before year’s end. xAI, a startup founded by Elon Musk, has built a data centre in Memphis powered by 100,000 H100s. And though OpenAI, the other big model-maker, is tight-lipped about its GPU stash, it had its latest processors hand-delivered by Jensen Huang, Nvidia’s boss, in April.