
Frontier, the world’s fastest supercomputer, trained a one-trillion-parameter large language model using only 3,000 of its 37,888 AMD Instinct MI250X GPUs — roughly 8 percent of the machine

In other words, Frontier can train a large language model on the scale of GPT-4 while using just 8% of its GPUs.
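The 8 percent figure follows directly from the reported GPU counts; a quick sanity check (the variable names here are illustrative, not from the source):

```python
# Fraction of Frontier's MI250X GPUs reportedly used for the
# trillion-parameter training run: 3,000 out of 37,888.
gpus_used = 3_000
gpus_total = 37_888

fraction = gpus_used / gpus_total
print(f"{fraction:.1%}")  # → 7.9%
```

At one decimal place the fraction comes out to 7.9%, which the headline rounds up to 8 percent.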
