Primate Labs Launches Geekbench AI to Measure NPU Performance
Neural Processing Units (NPUs) are becoming increasingly common in mainstream chipsets from major manufacturers like Intel and AMD. Historically, NPUs were found primarily in smartphones, tablets, and Apple Silicon Macs. As demand grows for running generative AI, image editing, and chatbots locally, however, NPU performance is set to play a larger role in consumer purchasing decisions.
Primate Labs, the creators of Geekbench, recognize this shift. Best known for benchmarking CPU and GPU performance, the company has spun its machine-learning benchmark off into a dedicated project, now renamed Geekbench AI (previously Geekbench ML), to assess the inference performance of NPUs. With companies like Microsoft launching initiatives such as Copilot+ and attention focused on boosting NPU capabilities, Primate Labs is advancing Geekbench AI to version 1.0. The rebranding taps into the current AI wave while positioning the tool as a resource for developers and consumers alike.
John Poole of Primate Labs highlighted the importance of adaptability in AI workloads, stating, “Just as CPU-bound workloads vary in how they can leverage multiple cores or threads for performance improvement, AI workloads encompass diverse precision levels based on requirements and available hardware.” Geekbench AI provides performance summaries across workload tests that use single-precision, half-precision, and quantized data types, reflecting the range of precisions developers actually target.
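To make those precision levels concrete, here is a minimal NumPy sketch (not Geekbench’s own code) showing the same weight tensor stored as single-precision, half-precision, and symmetrically quantized INT8 values, and how much numerical error each lower-precision copy introduces:

```python
import numpy as np

# Hypothetical weight tensor at the three precision levels the benchmark exercises.
weights_fp32 = np.random.randn(4, 4).astype(np.float32)  # single precision
weights_fp16 = weights_fp32.astype(np.float16)            # half precision

# Simple symmetric INT8 quantization: map values into the [-127, 127] range.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)

# Dequantize to measure how much fidelity the quantized copy gives up.
dequantized = weights_int8.astype(np.float32) * scale
print("max abs error, fp16:", np.abs(weights_fp32 - weights_fp16.astype(np.float32)).max())
print("max abs error, int8:", np.abs(weights_fp32 - dequantized).max())
```

Lower precision reduces memory traffic and lets NPUs use faster half-precision or integer math, which is why a benchmark that reported only one data type would miss much of the picture.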
Beyond measuring speed, Geekbench AI places emphasis on accuracy, which is vital for machine learning tasks that demand reliable outcomes, such as identifying and sorting images in a database.
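As a rough illustration of the kind of accuracy check that can sit alongside a timing measurement, the following sketch (with made-up labels, not Geekbench’s scoring method) computes top-1 accuracy for a batch of image classifications against known ground truth:

```python
import numpy as np

# Hypothetical ground-truth class labels and a model's predictions for eight images.
ground_truth = np.array([3, 1, 4, 1, 5, 9, 2, 6])
predictions  = np.array([3, 1, 4, 0, 5, 9, 2, 5])

# Top-1 accuracy: the fraction of images whose predicted class matches the label.
top1_accuracy = (predictions == ground_truth).mean()
print(f"top-1 accuracy: {top1_accuracy:.2%}")  # 75.00% for this example
```

A fast NPU that loses too much accuracy at lower precision is not necessarily a win, which is why Geekbench AI reports both.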
The new application is designed to work with several AI frameworks: OpenVINO on Windows and Linux, ONNX on Windows, Qualcomm’s QNN on Snapdragon-powered PCs, and Apple’s Core ML on macOS and iOS. Depending on the device’s hardware, the benchmark can run its workloads on the CPU, GPU, or NPU.
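As a loose illustration of how one of those frameworks chooses a backend, here is a minimal ONNX Runtime sketch (not Geekbench’s own code; the model filename and the preference order are assumptions) that prefers an NPU or GPU execution provider when available and falls back to the CPU otherwise:

```python
import onnxruntime as ort

# List the execution providers this ONNX Runtime build can actually use.
available = ort.get_available_providers()
print("available providers:", available)

# Assumed preference order: Qualcomm QNN (NPU), DirectML (GPU on Windows), then CPU.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

# "model.onnx" is a placeholder path; any ONNX model would do here.
session = ort.InferenceSession("model.onnx", providers=providers)
print("session is using:", session.get_providers())
```

The same idea applies to the other frameworks: the benchmark targets whichever compute unit the platform exposes rather than assuming every machine has an NPU.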
On Windows, where NPU support is still maturing and APIs like Microsoft’s DirectML are still in development, Geekbench AI currently supports Intel and Qualcomm NPUs, with AMD support planned for future releases.
“We’re hoping to add AMD NPU support in a future version once we have more clarity on how best to enable them from AMD,” Poole remarked in response to inquiries.
Geekbench AI is available for free on Windows, macOS, Linux, iOS/iPadOS, and Android. A Pro version adds command-line tools, the option to run tests without uploading results to the Geekbench Browser, and other exclusive features. As AI technology continues to evolve, Primate Labs plans to release regular updates to keep pace with new hardware, frameworks, and requirements.
“AI is nothing if not fast-changing,” Poole noted in the update announcement. “Expect new releases and ongoing enhancements as market demands and AI functionalities develop.”
Source: arstechnica.com