How GPUs Are Changing the Data Center
by Stephanie Faris on Tuesday, February 21 12:00
At one time, intense image processing was confined to video gaming, where innovators consistently pushed the limits of consumer electronics. But today's consumers use their devices to watch videos and interact with graphics-heavy apps, making Graphics Processing Units (GPUs) an integral part of the most popular computers and mobile devices on the market. In recent years, GPUs have also made their way into the data center, giving network administrators the processing power they need to handle today's compute-intensive workloads. As one analyst explained, GPUs are ideal for data analytics, which has become one of the biggest challenges for IT administrators. Businesses now work with large volumes of data on a regular basis, and server teams must build an infrastructure that can support it.
Graphics chips have been used in video games since the 1970s, but the modern GPU dates to 1999, when NVIDIA released the GeForce 256, a single-chip processor capable of billions of calculations per second. From that point forward, GPUs became the standard for 3-D graphics, rendering digital images far more efficiently than general-purpose Central Processing Units (CPUs). Now manufacturers are advertising GPU-optimized server solutions built to handle the complex, data-parallel operations businesses run today. These solutions deliver far more throughput per node, making them ideal for data centers that cater to multiple clients. Over time, though, GPU-optimized components will become the norm even for businesses with on-site servers, especially as more companies incorporate data science into their day-to-day decisions.
Moving into the Data Center
The biggest argument for GPUs in the data center is efficiency. A single two-way node with four interconnected NVIDIA P100s can offer the same processing power a network administrator would otherwise get from 32 CPU-only nodes. Not only does this free up space in server rooms and data centers, it also delivers substantial cost savings. Data centers that serve multiple businesses can pass those savings on to customers, while businesses that maintain their own on-site server rooms will see welcome relief in their annual budgets.
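As a back-of-the-envelope illustration of that consolidation claim, the sketch below works through the node counts from this section. Only the 32-to-1 node ratio comes from the article; the per-node rack-space and power figures are hypothetical placeholders, not vendor specifications:

```python
# Consolidation math based on the 32-node claim above.
# The per-node figures below are illustrative assumptions, not measurements.

CPU_NODES = 32  # CPU-only nodes needed for the workload (from the article)
GPU_NODES = 1   # one two-way node with four P100s (from the article)

RACK_UNITS_PER_NODE = 2    # hypothetical: 2U per server
WATTS_PER_CPU_NODE = 400   # hypothetical average draw per CPU node
WATTS_PER_GPU_NODE = 1600  # hypothetical: host CPUs plus four ~250 W GPUs

consolidation_ratio = CPU_NODES / GPU_NODES
rack_units_saved = (CPU_NODES - GPU_NODES) * RACK_UNITS_PER_NODE
watts_saved = CPU_NODES * WATTS_PER_CPU_NODE - GPU_NODES * WATTS_PER_GPU_NODE

print(f"{consolidation_ratio:.0f}x node consolidation")
print(f"{rack_units_saved}U of rack space freed")
print(f"{watts_saved} W lower steady-state draw (under the assumptions above)")
```

Even with generous assumptions about the GPU node's power draw, the arithmetic shows why consolidation dominates the cost discussion: fewer nodes means less rack space, less power, and less cooling to pay for.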
Best of all, a growing number of applications can benefit from GPU power at the server level, and that number will only increase as software becomes more sophisticated. Eventually, so many applications will demand that kind of power that businesses that don't accommodate it will find themselves left behind. And once businesses realize their network infrastructure can scale to handle these more intensive applications, they can invest in software that is optimized for the GPU, increasing network performance while saving money. As a result, data centers and server rooms will be better able to serve the professionals and consumers who access them every day. For server administrators who haven't yet investigated what GPUs can bring to their environment, now is a great time to research the technology.