AMD’s data center Instinct MI300X GPU can compete with Nvidia’s H100 in AI workloads, and the company has finally posted an official MLPerf 4.1 result, at least for the Llama 2 70B LLM. Performance is roughly in line with the H100 but noticeably behind the H200 and B200.