AMD AI Chip Unveiling - Assembly - Salesforce Research
AMD Gives Peek at Upcoming Line of AI Processors in Challenge to Rival Nvidia
finance.yahoo.com
(Bloomberg) -- Advanced Micro Devices Inc. showcased its upcoming line of artificial intelligence processors, aiming to help data centers handle a crush of AI traffic and challenge Nvidia Corp.'s dominance in the burgeoning market.
More Context
What is AMD's latest chip that has been unveiled?
videocardz.com: Instinct MI300X GPU
servethehome.com: Instinct MI300
wepc.com: MI300X
crn.com: Instinct MI300X, EPYC 97X4
finance.yahoo.com: AI superchip
neowin.net: 128-core EPYC 97X4 Series
insidehpc.com: 4th Generation EPYC
latestly.com: EPYC 97X4
techradar.com: 144-Core EPYC Bergamo
wccftech.com: Instinct MI300 APUs
pcmag.com: Instinct MI300X
seekingalpha.com: MI300 series
The company's Instinct MI300 series will include an accelerator that can speed processing for generative AI -- the technology used by ChatGPT and other chatbots -- AMD said during a presentation in San Francisco on Tuesday. The product, called MI300X, is part of a lineup that was unveiled at the CES conference in January.
Like much of the chip industry, AMD is racing to meet booming demand for AI computing. Popular services that rely on large language models -- algorithms that crunch massive amounts of data in order to answer queries and generate images -- are pushing data centers to the limit. And so far, Nvidia has had an edge in supplying the technology needed to handle these workloads.
"We are still very, very early in the life cycle of AI," AMD Chief Executive Officer Lisa Su said at the event.
The total addressable market for data center AI accelerators will rise fivefold to more than $150 billion in 2027, she said. "It's going to be a lot."
Still, the presentation failed to dazzle investors, who already have sky-high expectations for AI growth. They had bid up AMD shares 99% this year through Monday's close, but the stock drifted down more than 3% during Tuesday's event.
Executives from Amazon.com Inc.'s AWS and Meta Platforms Inc. joined Su on stage to talk about using new AMD processors in their data centers. The chipmaker also announced the general availability of the latest version of its Epyc server processors and a new variant called Bergamo that is aimed at cloud computing uses.
The MI300X accelerator is based on AMD's CDNA 3 technology and uses as much as 192 gigabytes of memory to handle workloads for large language models and generative AI, the Santa Clara, California-based company said.
More Context
What are the other applications that the chip could be used for?
crn.com: AI, cloud expansion
neowin.net: business and cloud
videocardz.com: cloud native and technical computing
servethehome.com: NICs, storage, and even memory
wepc.com: high-performance computing (HPC) and AI workloads
finance.yahoo.com: help data centers handle a crush of AI traffic and challenge Nvidia Corp.'s dominance in the burgeoning market
latestly.com: software enablement for generative AI (artificial intelligence)
benzinga.com: health care to 5G networks and data centers
zdnet.com: large language models
wccftech.com: various core IPs, memory interfaces, interconnects
beststocks.com: high-performance computing, graphics, and visualization technologies
prnewswire.com: technical computing
Key customers will start sampling the technology in the third quarter, with full production starting in the fourth, AMD said. Another model, the Instinct MI300A, is going out to customers now.