
In light of rising DRAM prices, we’re updating this article from June. As of late 2025, server DRAM prices are surging. Memory makers are prioritizing higher-margin HBM for AI accelerators over DDR5. Consequently, DDR5 DIMMs are perhaps the largest and fastest-rising component cost of a server. At the same time, OEMs are passing these costs… continue reading

IBM’s Power11 processor introduces technologies like external memory buffers, AI acceleration, and SMT that will become common in other systems. Power11 refines the Power10 design, achieving a 14–50% speedup in IBM’s applications and offering up to 256 cores in a chassis. continue reading

Leading up to Nvidia’s recent earnings call, concerns mounted that the company would confirm fears that the AI bubble would soon burst. Too many companies are spending prodigiously to build the next data center to create the next foundation model. Profitability is well beyond the time horizon, and their circular funding arrangements are unsustainable. Nvidia… continue reading