CIOs and IT leaders call it the most disruptive technology yet, and now it’s moving rapidly into the mainstream.
Artificial intelligence (AI), an increasingly crucial piece of the technology landscape, has arrived.
More than 91 percent of businesses surveyed have ongoing — and increasing — investments in artificial intelligence.
Deploying AI workloads at speed and scale, however, requires software and hardware working in tandem across data centers and edge locations.
Foundational IT infrastructure, such as GPU- and CPU-based processors, must deliver major leaps in capacity and performance to run AI efficiently. Without higher performance levels, AI workloads could take months or even years to run. With them, organizations can accelerate AI advancements.
Dell Technologies’ recent developments in hardware and software solutions are designed to keep pace with AI software capabilities and do just that: advance AI.
More specifically, next-gen offerings from Dell Technologies provide 8-10x performance improvements according to MLCommons® MLPerf™ benchmarks. The upgraded Dell Technologies solution portfolio includes a range of GPU-optimized servers for AI training and CPU-powered servers for enterprise-wide AI inferencing, both of which are essential, co-existing elements of AI deployment.
MLCommons MLPerf Results
For benchmarking, the updated MLCommons MLPerf Inference v3.0 suite was used; the latest results are shown here. Benchmarks include categories such as image classification, object detection, natural language processing, speech recognition, recommender systems and medical image segmentation.
While the inference benchmark rules did not change significantly, Dell Technologies expanded its submission with the new generation of Dell PowerEdge servers, including the new PowerEdge XE9680, XR7620 and XR5610 servers, along with new accelerators from its partners. Submissions were made with VMware and NVIDIA AI Enterprise software on NVIDIA accelerators, as well as Intel-based CPU-only results.
The results for Dell Technologies’ next-gen processors are extraordinary for the highly demanding use cases of AI training, generative AI model training and tuning, and AI inferencing. Compared to previous generations of hardware, the results show a significant uptick in performance:
- 8-10x improvement in performance.
- 6-8x improvement in performance.
More detailed results can be seen here.
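To make the benchmark categories above more concrete, the sketch below shows, in rough terms, what an inference benchmark measures: per-query latency and overall throughput for a model under test. It is a simplified illustration under assumed names (`benchmark`, `predict`) and is not the official MLPerf LoadGen harness.

```python
# Illustrative sketch only: measures latency and throughput for a generic
# predict() callable, in the spirit of MLPerf Inference metrics.
# This is NOT the official MLPerf LoadGen harness.
import time
import statistics

def benchmark(predict, samples, warmup=10):
    """Run warmup iterations, then time each inference call."""
    for s in samples[:warmup]:
        predict(s)                      # warm caches / JIT / GPU kernels

    latencies = []
    start = time.perf_counter()
    for s in samples:
        t0 = time.perf_counter()
        predict(s)
        latencies.append(time.perf_counter() - t0)
    total = time.perf_counter() - start

    return {
        "throughput_samples_per_s": len(samples) / total,
        "p50_latency_ms": statistics.median(latencies) * 1e3,
        "p99_latency_ms": sorted(latencies)[int(0.99 * len(latencies)) - 1] * 1e3,
    }

# Example with a dummy callable standing in for a real image-classification model:
if __name__ == "__main__":
    dummy_model = lambda x: sum(x)          # placeholder, not a real network
    data = [[0.1] * 1024 for _ in range(1000)]
    print(benchmark(dummy_model, data))
```

The real suite adds strict rules around accuracy targets, query scenarios (offline, server, single-stream) and result validation, which is what makes published MLPerf numbers comparable across vendors.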
AI in Action
AI deployments in data centers and at edge locations demand a highly interdependent ecosystem of advanced software and hardware capabilities, including a mix of GPU- and CPU-based processors.
Each industry and organization can tailor infrastructure based on unique needs, preferences and requirements.
Consider, for example, a pharmaceutical company using AI modeling and simulation for drug discovery.
Modern drug development depends on chemists finding highly active molecules that also test negative for neurotoxicity. There are trillions of compounds to consider and evaluate.
Each search can take almost two months and cost thousands of dollars, limiting the number of searches and tests that can be conducted. Using AI, simulations can examine many more molecules far faster and at lower cost, opening a new world of possibilities.
To accelerate drug discovery (there are thousands of diseases and only hundreds of cures), pharmaceutical companies need powerful processors to handle large and diverse data sets efficiently and effectively.
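As a rough illustration of that workflow, the sketch below filters scored compounds on two criteria: predicted activity and predicted neurotoxicity. The `Candidate` class, the `screen` function and the thresholds are hypothetical placeholders; in practice the scores would come from GPU-accelerated models evaluating billions of structures.

```python
# Hedged sketch of the screening workflow described above: keep only compounds
# predicted to be highly active AND non-neurotoxic. Names and thresholds are
# illustrative assumptions, not a Dell or vendor API.
from dataclasses import dataclass

@dataclass
class Candidate:
    smiles: str           # molecular structure in SMILES notation
    activity: float       # predicted activity score, 0..1
    neurotoxicity: float  # predicted neurotoxicity probability, 0..1

def screen(candidates, min_activity=0.8, max_toxicity=0.1):
    """Return candidates worth synthesizing and testing in the lab."""
    return [
        c for c in candidates
        if c.activity >= min_activity and c.neurotoxicity <= max_toxicity
    ]

# Tiny hand-made list standing in for model-scored compounds:
hits = screen([
    Candidate("CCO", activity=0.92, neurotoxicity=0.03),
    Candidate("c1ccccc1", activity=0.65, neurotoxicity=0.02),
    Candidate("CCN(CC)CC", activity=0.88, neurotoxicity=0.40),
])
print([c.smiles for c in hits])   # -> ['CCO']
```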
Retailers typically use AI differently than pharmaceutical companies do. Retail use cases often revolve around video imagery used to enhance security, bolster intrusion detection and support self-checkout capabilities. To build out capabilities in these areas, retailers need more powerful GPU-optimized processors to handle image-based data streams.
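A minimal sketch of that pattern follows: frames from a camera feed are batched and passed to an image model, the kind of loop that benefits directly from GPU-optimized servers. The `infer_stream` function, the placeholder feed and the dummy model are illustrative assumptions, not a specific vendor SDK.

```python
# Illustrative sketch of the retail pattern described above: pull frames from
# a video stream, batch them, and run them through an image model for tasks
# such as intrusion detection or self-checkout item recognition.
from typing import Callable, Iterable, List

def infer_stream(frames: Iterable, model: Callable, batch_size: int = 8) -> List:
    """Batch incoming frames to keep an accelerator busy, then run inference."""
    results, batch = [], []
    for frame in frames:
        batch.append(frame)
        if len(batch) == batch_size:
            results.extend(model(batch))   # one accelerator call per batch
            batch = []
    if batch:                              # flush the final partial batch
        results.extend(model(batch))
    return results

# Dummy stand-ins for a camera feed and a detection model:
fake_frames = range(20)
fake_model = lambda batch: [f"label_for_frame_{f}" for f in batch]
print(infer_stream(fake_frames, fake_model)[:3])
```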
Advancing AI
Emerging generative AI use cases, such as digital assistants and co-pilots for software development, represent the next frontier of AI. That’s why at Dell Technologies, innovation never rests. When it comes to technology infrastructure, Dell Technologies and its partners are constantly innovating to reach new performance levels and help redefine what is possible.
The exponential performance increase in NVIDIA GPU-optimized servers and the infusion of AI inferencing capabilities in Intel® Xeon®-based servers are creating the required AI foundation. With these results, Dell Technologies can help organizations fuel AI transformations precisely and efficiently with new AI training and inferencing software, generative AI models, AI DevOps tools and AI applications.
***
Dell Technologies. To help organizations move forward, Dell Technologies is powering the AI journey, including enterprise generative AI. With best-in-class IT infrastructure and solutions to run AI workloads, plus advisory and support services that map out AI initiatives, Dell is enabling organizations to boost their digital transformation and accelerate intelligent outcomes.
Intel. The compute required for AI models has put a spotlight on performance, cost and energy efficiency as top concerns for enterprises today. Intel’s commitment to the democratization of AI and sustainability will enable broader access to the benefits of AI technology, including generative AI, via an open ecosystem. Intel’s AI hardware accelerators, including new built-in accelerators, provide performance and performance per watt gains to address the escalating performance, price and sustainability needs of AI.