The compute demands of AI data centers are set to increase dramatically, from around 100 zettaFLOPS today to more than 10 yottaFLOPS* over the next five years (roughly a 100x increase), ...
Built on the AMD CDNA™ 3 architecture, AMD Instinct MI325X accelerators are designed to deliver exceptional performance and efficiency for demanding AI tasks spanning foundation model training, fine-tuning ...
At its Advancing AI event, the chip designer said the 288-GB HBM3e capacity of the forthcoming Instinct MI355X and MI350X data center GPUs helps the processors provide better or similar AI performance ...
ASE Holdings, the world's largest outsourced semiconductor assembly and test (OSAT) provider, has transitioned its core IT infrastructure to AMD platforms, deploying EPYC processors across its servers ...
Micron Technology, Inc. announced its integration of the HBM3E 36GB 12-high memory product into AMD's upcoming Instinct™ MI350 Series solutions, emphasizing both power efficiency and performance ...
TL;DR: AMD announced at CES 2026 that its next-generation Instinct MI500 AI accelerators, launching in 2027, will feature TSMC's advanced 2nm process, CDNA 6 architecture, and next-gen HBM4E memory.
TL;DR: AMD's new Instinct MI430X GPU, based on CDNA 5 architecture and equipped with 432GB HBM4 memory at 19.6TB/sec bandwidth, targets HPC and large-scale AI workloads. Deployed in top supercomputers ...
Supermicro introduces the latest addition to its AI-accelerated solutions: a new 10U air-cooled server incorporating AMD Instinct MI355X GPUs, delivering breakthrough performance for AI and ...