— TheValueist (@TheValueist) December 26, 2025
Evolution of the AI Memory Hierarchy: HBM, DRAM, & SRAM Over
All replies:
• The Global Semiconductor Industry in One Giant Chart (chyang98, 12/26/2025 09:58:34)
• Groq's LPU forgoes HBM in favor of SRAM, essentially trading very high hardware cost for extreme bandwidth and latency performance (红泥小火炉2022, 12/26/2025 20:59:56)
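The SRAM-versus-HBM tradeoff in that reply can be illustrated with simple arithmetic. The figures below are approximate and assumed for illustration only: on-chip SRAM per accelerator on the order of a few hundred megabytes (Groq's first-generation LPU is commonly cited at roughly 230 MB), versus tens of gigabytes per HBM stack. Holding a large model entirely in SRAM therefore requires many chips, which is the "extremely high hardware cost" side of the trade:

```python
# Back-of-envelope sketch (assumed, approximate figures):
# why SRAM-only inference needs far more chips than HBM-based inference.
SRAM_PER_CHIP_GB = 0.23   # ~230 MB on-chip SRAM per chip (illustrative)
HBM_PER_GPU_GB = 80       # e.g. one HBM-equipped GPU (illustrative)
MODEL_GB = 70             # a 70B-parameter model at 8-bit precision

sram_chips = MODEL_GB / SRAM_PER_CHIP_GB
hbm_gpus = MODEL_GB / HBM_PER_GPU_GB

print(f"SRAM-only: ~{sram_chips:.0f} chips just to hold the weights")
print(f"HBM-based: ~{hbm_gpus:.1f} GPUs to hold the weights")
```

Under these illustrative numbers, hundreds of SRAM-based chips are needed where a single HBM-equipped GPU suffices for capacity; the payoff is that on-chip SRAM offers far higher bandwidth and lower access latency than off-chip HBM.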