DPUs Aren’t Just For The Cloud

Cloud data centers represent the pinnacle of system compute architecture, and rightfully so. They handle some of the most compute-intensive tasks, from scientific modeling for COVID vaccines to developing AI models from billions of data points. As a result, cloud data centers require a huge investment in hardware, software, and infrastructure, which in turn drives a need for performance, efficiency, and return on investment (ROI) optimization. These requirements are pushing data center architectures toward processors and accelerators optimized for each type of workload. Just as we have seen the rise of custom processors and the use of Graphics Processing Units (GPUs), Digital Signal Processors (DSPs), Field Programmable Gate Arrays (FPGAs), Neural Processing Units (NPUs), and custom accelerators for processing the data coming into or through the data center, so too are we seeing the rise of even more specialized accelerators called Data Processing Units (DPUs) or Infrastructure Processing Units (IPUs) – referred to simply as DPUs for the rest of this article – to execute specific workloads within a data center, including security, network management, storage management, and other operational functions.

While the use of coprocessors for internal acceleration is nothing new, the current crop of DPUs takes the concept of a coprocessor to a new level. A DPU is not just a separate processor; it is an accelerator that may have dedicated and/or programmable processing elements to handle critical data center tasks, leaving the main processing elements, typically server processors, dedicated to revenue-generating tasks. At the same time, the DPU is tightly integrated into the system architecture and works closely with the host processors and workload accelerators. As a result, DPUs can provide a significant uplift to data center processing; according to data provided by Nvidia, that uplift could be as much as 2X. This raises the question: if cloud data centers can benefit from DPUs, can enterprise servers also benefit from DPUs? And the answer is yes!

The term server can mean different things to different people and different industries, but a server is really just a network-attached compute resource that typically does not support direct access through user interface devices, such as a keyboard, mouse, and monitor. A server is accessed over the network by client services that have a user interface or by other servers. Unlike many cloud servers, which may be dedicated to performing specific functions or processing certain data types, enterprise servers may be tasked with performing a wide range of functions for various groups within an organization. Think of an enterprise server as the digital Swiss Army Knife of an organization.

An enterprise server may run an inventory system for manufacturing, serve as a customer relationship management (CRM) platform for sales, run an invoicing system for purchasing, host design applications for engineering, or provide office productivity tools across the organization. Even with this broad set of server requirements, a DPU is beneficial, and in some ways even more so than in a cloud environment. The diverse applications running on an enterprise server all require some basic functions, including storing and retrieving data, managing the flow of data over a network, and ensuring the security of that data, which may be coming from and accessed by a wider array of sources and consumers. These are precisely the functions that can be better managed with a DPU. Additionally, DPUs, such as Nvidia’s BlueField, are equipped with accelerators that may be used to accelerate other functions, such as running data analytics or AI algorithms for usage, efficiency, or network maintenance, or for maintaining other enterprise resources such as storage or precise time synchronization. Integrating DPUs into an enterprise server can increase the overall performance of the platform while reducing the need for additional servers, by offloading overhead tasks that may not be well suited to traditional CPU-based host processors.

It’s important to note that DPU hardware and software are still evolving, and rapidly. There is an industry-wide effort to develop and implement DPUs, but there are, and will continue to be, different solutions from different silicon and server vendors. Currently, silicon providers AMD (Xilinx and Pensando), Broadcom, Fungible, Intel, Nvidia, and Marvell all offer products that fit into the DPU category but vary greatly in architecture and functionality.

The introduction of the DPU is changing data center system architecture going forward, and as Nvidia’s Jensen Huang noted in a GTC Q&A session, this is just the beginning for DPUs. As the use of DPUs grows, certain functions may be broken off into more specialized accelerators, just as we are seeing now with workload accelerators. We may have one DPU for storage, one for networking, and one for security. These functions may be implemented in separate chips, or they could be realized as individual dies stacked into a single package. The best way to implement these accelerators will continue to evolve along with semiconductor manufacturing and packaging technology, but it is clear that DPUs are here to stay. TIRIAS Research believes that DPUs will have a significant impact not just on cloud data center design, but also on enterprise server design.
