NVIDIA GPUs are both AI’s brain and its beating heart

In an article for Forbes about the march of AI, machine learning and the technology that underpins both, author Stephen McBride refers to NVIDIA as “America’s most important company”. But why?

Speed is key to training machine learning models cost-effectively, and CPUs, which process instructions largely one at a time, are too slow for the job. GPUs made by NVIDIA and companies like it, which run thousands of operations in parallel, are therefore essential components of any machine learning system. Individually, GPUs are more powerful and more expensive than CPUs, but you need far fewer of them, so systems can be much smaller and less expensive overall than their CPU-driven predecessors. They also consume less power overall.
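A rough way to see why parallelism matters is to compare scalar-style code, which performs one multiply-add at a time, with a vectorised call that hands the whole computation to parallel hardware in one go. The sketch below (a CPU-side analogy using NumPy; the sizes and names are illustrative, not from the article) computes the same toy neural-network layer both ways:

```python
import numpy as np

# Toy "layer": multiply a batch of inputs by a weight matrix.
rng = np.random.default_rng(0)
batch = rng.standard_normal((64, 256))    # 64 inputs, 256 features each
weights = rng.standard_normal((256, 10))  # 10 outputs

# Scalar-style triple loop: one multiply-add at a time.
slow = np.zeros((64, 10))
for i in range(64):
    for j in range(10):
        for k in range(256):
            slow[i, j] += batch[i, k] * weights[k, j]

# Vectorised path: the entire batch in a single matrix multiply,
# dispatched to optimised parallel routines.
fast = batch @ weights

assert np.allclose(slow, fast)  # identical results, vastly different speed
```

A GPU takes the same idea much further, spreading that one matrix multiply across thousands of cores, which is why training workloads dominated by such operations run so much faster on GPUs.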

[Image: Artificial Neural Network]

In the first half of 2020, while the world unravelled, NVIDIA’s fortunes soared as its GPUs found their way into at least 90% of all AI projects and AI-related sales topped $2.8 billion. NVIDIA’s new A100 processors are wowing customers and commentators alike. This “supercomputer in a box” is “the most powerful chip system ever created”, equivalent to the grunt of 300 conventional rack-mount servers, according to McBride:

In fact, just one A100 packs the same computing power as 300 data center servers. And it does it for one-tenth the cost, takes up one-sixtieth the space, and runs on one-twentieth the power consumption of a typical server room. A single A100 reduces a whole room of servers to one rack.

If you plan to take a serious look at how machine learning and intelligent automation can help your business in 2021, and you would like to speak to real experts in the field, from NVIDIA, HPE and their many partners, about how to get started and the skills and technology you will need, get in touch and we will arrange a call.

Source: Forbes