
Mike Demler joins Bob and Joe to discuss AI-chip juggernaut Nvidia capturing key Groq staff and licensing the struggling startup's innovative NPU technology.

In light of rising DRAM prices, we're updating this article from June. As of late 2025, server DRAM prices are surging. Memory makers are prioritizing higher-margin HBM for AI accelerators over DDR5. Consequently, DDR5 DIMMs are perhaps the largest and fastest-rising component cost of a server. At the same time, OEMs are passing these costs…

IBM’s Power11 processor introduces technologies like external memory buffers, AI acceleration, and SMT that will become common in other systems. Power11 refines the Power10 design, achieving a 14–50% speedup in IBM’s applications and offering up to 256 cores in a chassis.