AI Datacenter Storage: Insights from Bernstein Webinar
Investing.com — Recent analysis from Bernstein indicates that the market potential for storage in AI datacenters is relatively minor compared with that for server infrastructure, particularly because large language models (LLMs) have comparatively modest storage requirements.
In a recent webinar featuring David Hall, the former Vice President of Infrastructure at Lambda, Bernstein gathered insights on current dynamics in the AI cloud sector.
Hall estimates that storage expenses account for just “8-12% of the cost of a model-training GPU cluster,” which aligns with Bernstein’s assertion that storage is not a primary focus in AI infrastructure compared to servers.
The need for enhanced storage is more pronounced in models that process images and videos due to their larger datasets. However, Bernstein points out that the general demand for storage in AI datacenters remains lower than that for other critical components like server systems.
Additionally, Bernstein notes that selecting a storage provider comes down to striking the right balance between features and affordability.
During the webinar, Hall noted that Lambda currently works with storage providers such as DDN, Vast, and WEKA, and has passed over alternatives such as NetApp, Dell, and Pure Storage on the grounds that those options did not deliver the full feature set its operations require.
Hall also elaborated on the longevity and upgrade cycles of GPUs in AI applications. According to Bernstein, Hall said GPUs typically have a lifespan of “7-9 years,” meaning even fully depreciated units can still deliver useful performance.
He further noted that the latest Blackwell GPUs deliver performance improvements of “60-200%” over Nvidia’s Hopper models, though not every application requires cutting-edge hardware; many workloads run efficiently on older GPU generations such as Ampere or P-series models.
Bernstein’s analysis reiterated Nvidia’s stronghold in the software domain, underscoring the significance of CUDA and cuDNN as essential differentiators in the AI landscape. While new startups with tailored GPUs pose a potential threat to Nvidia’s market share, Bernstein asserts that the foundation of successful AI applications lies in robust software.
In conclusion, as the competitive framework for GPUs continues to evolve, the software layer is projected to remain “the most important component” influencing market dynamics.