A.I. chip, Maia 200, calling it “the most efficient inference system” the company has ever built. Microsoft claims the chip ...
Running both phases on the same silicon creates inefficiencies, which is why decoupling the two opens the door to new ...
Microsoft’s new Maia 200 inference accelerator enters this overheated market with a chip that aims to cut the price ...
Microsoft has announced the launch of its latest chip, the Maia 200, which the company describes as a silicon workhorse ...
Maia 200 packs 140+ billion transistors, 216 GB of HBM3E, and a massive 272 MB of on-chip SRAM to tackle the efficiency crisis in real-time inference. Hyperscalers prioritiz ...
Cisco Systems, Inc. is upgraded to a hold rating, reflecting a mixed near-term outlook but strong long-term AI infrastructure positioning. CSCO's Q4 results showed respectable profitability ...
“The rapid release cycle in the AI industry has accelerated to the point where barely a day goes past without a new LLM being announced. But the same cannot be said for the underlying data,” notes ...
The market for serving up predictions from generative artificial intelligence, what's known as inference, is big business, with OpenAI reportedly on course to collect $3.4 billion in revenue this year ...
Sandisk is advancing proprietary high-bandwidth flash (HBF), collaborating with SK Hynix, targeting integration with major GPU makers.
Anthropic is known for its stringent safety standards, which it has used to differentiate itself from rivals like OpenAI and ...