Why business is booming for the GPU

'GPUaaS' may be an acronym too far, but many are turning cloud-wards for their big data-crunching capabilities.
21 November 2018


Despite a dip in recent months in sales of the GPUs (graphics processing units) designed explicitly for cryptocurrency mining, Nvidia's growth rates in the sector remain well above those of its nearest competitors.

In the third quarter of 2018, the GPU maker achieved US$3.18 billion in sales, a nearly 21 percent increase on the previous year, while income was up 46.8 percent to hit US$1.23 billion.

So, if cryptocurrency mining and Ethereum-based blockchain technologies (such as smart contracts, or more conventional fintech apps) are on the decline, what is continuing to drive Nvidia's growth?

Today's GPUs are used for more than mere gaming. Any processing activity that relies on large numbers of fast but simple calculations (so-called 'grid-based computing') is better undertaken on a GPU.

Rather than the six or eight complex cores of a typical CPU, GPUs possess many hundreds or thousands of simpler cores capable of working in parallel to solve problems quickly, as the short sketch after the list below illustrates. GPUs are coming into their own in many new and emerging markets, such as:

  • Robotics — analysis of sensor data and other inputs requires high-speed processing.
  • Data analysis — environmental modeling, molecular and biotechnological analysis, and so-called big data processing, for example.
  • Self-driving vehicles — GPUs are used to analyze video and other sensor information at high speed to enable lightning-fast decision-making.
  • Artificial intelligence (AI) tasks, such as facial recognition — the iterative processes required to teach machine-learning routines are usually offloaded onto GPU arrays.

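The advantage in all of these cases comes from spreading simple work across thousands of threads at once. The snippet below is a minimal, illustrative CUDA sketch of that idea (the kernel name vector_add and the values used are hypothetical, not taken from any vendor's code): each GPU thread adds one pair of array elements, so a million additions happen in parallel rather than in a sequential loop.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one element: thousands of simple cores
// work on the array simultaneously instead of one CPU core looping.
__global__ void vector_add(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                     // one million elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *out;
    cudaMallocManaged(&a, bytes);              // unified memory keeps the sketch short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);

    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;                         // threads per block
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover every element
    vector_add<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %f\n", out[0]);           // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```

On a CPU, the same job would be a loop running on a handful of cores; on a GPU, the hardware scheduler spreads those million threads across all of its simpler cores, which is exactly the pattern that robotics, analytics, autonomous-driving, and machine-learning workloads exploit.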
As an increasing number of software and service providers investigate the possibilities that AI can deliver, cloud providers of all sizes are rushing to offer GPU computing as a service. While readers will be well acquainted with the GPU instances available on platforms like Amazon's EC2 and Oracle Cloud, there are several smaller, arguably more agile cloud companies offering GPU arrays.

While some companies are looking to source their computing power from the cloud, some industry verticals prefer to use GPUs on-premises. In an interview this year, Nvidia's CEO, Jensen Huang, pointed to healthcare as one area in which machine- and deep-learning methods can model intricate neural networks to help cure diseases or to provide better diagnostic imaging.

Conversely, some companies prefer cloud provision of GPU compute so that the CSPs (cloud service providers) bear the burden of increased regulation and compliance requirements. In short, it's quicker and cheaper to farm the work out.

In a more business-oriented setting, companies like SAP are using GPU-powered AI to automate processes such as employee approvals, payment processing, and sales discounting. General Electric, meanwhile, uses it to aid predictive maintenance and to process data coming from the industrial Internet of Things.

What is beyond doubt is that deep learning and AI are changing the way we design hardware, conceptualize software (training software to write software, for instance), and create the new architectures that make all of this possible.

There is a danger that 'GPU-powered' will become the same kind of shorthand for 'cutting edge' that bedevils the use of 'AI' and 'machine learning'. However, while not every business has the capability or willingness to exploit exascale GPU infrastructures, their availability to anyone who can pay on a per-second basis will make GPU-powered code the data-crunching norm within the next 10 years.