
Intel, Nvidia, Qualcomm Compete for Artificial Intelligence Hardware

Computers and computing technology have evolved from mere adding machines into systems capable of thinking on their own. In recent years, the world has seen computers that can translate written language, understand commands from spoken sentences and identify objects through advanced machine learning techniques. It is no secret that the industry's chipmakers are racing head-to-head to be first in artificial intelligence (AI) hardware.

Although consumer technology products such as Google Translate and Apple's Siri can already operate in real time, the complex mathematical models on which they depend require substantial power, energy and time. This has pushed chipmakers such as Intel, mobile computing kingpin Qualcomm and graphics powerhouse Nvidia, along with various tech startups, to compete in developing specialized hardware that makes modern deep learning cheaper and faster.

The significance of such chips in the development and training of AI algorithms can hardly be overstated. Nvidia CEO Jen-Hsun Huang explained that training a computer to perform a new task can take months, but with these chips it could take only days. He compared it to having a time machine, as reported by Fast Company.

Nvidia is mainly associated with the video cards that help gamers play the latest first-person shooters at the highest achievable resolution. Beyond that, the company has been focusing on adapting its graphics processing unit chips, or GPUs, to data center number crunching and serious scientific computation.

Nvidia has made GPU technology more general purpose by taking it beyond graphics, according to Ian Buck, the company's vice president and general manager.

GPUs are built to draw video game graphics and other real-time images quickly. What they specifically excel at are mathematical operations such as matrix multiplications and other tasks made up of large numbers of elementary computations. Scientists have found this same capability valuable for other mathematical applications, including modeling the behavior of complex biomolecular structures and running climate simulations.
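To give a rough sense of why this workload parallelizes so well, here is a minimal sketch in plain NumPy (running on the CPU; the matrix sizes are hypothetical and chosen only for illustration, not drawn from any vendor's toolkit). Every entry of the result is an independent sum of products, which is exactly the structure a GPU's thousands of simple arithmetic units can exploit at once:

    import numpy as np

    # Hypothetical sizes, for illustration only.
    A = np.random.rand(1024, 1024)
    B = np.random.rand(1024, 1024)

    # Each entry C[i, j] = sum over k of A[i, k] * B[k, j] depends on
    # no other entry of C, so all of them can in principle be computed
    # simultaneously -- the kind of parallelism GPUs are designed for.
    C = A @ B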

More recently, GPUs have proven adept at training deep neural networks, mathematical structures loosely modeled on the human brain. That work, too, depends heavily on repeated, parallel matrix calculations.
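To see the connection, consider a minimal sketch of one neural-network layer, again in plain NumPy with hypothetical layer sizes (a real training system would use a GPU framework rather than this CPU code). The core of the layer is a matrix multiplication followed by a simple elementwise nonlinearity, and training repeats this step, along with a matching backward pass, enormous numbers of times:

    import numpy as np

    def layer_forward(x, weights, bias):
        # One layer of a neural network: a matrix multiply plus an
        # elementwise nonlinearity (ReLU). Deep networks stack many
        # such layers, and training repeats them over and over.
        return np.maximum(0, x @ weights + bias)

    # Hypothetical batch of 64 inputs, each with 784 features,
    # feeding a layer of 256 units.
    x = np.random.rand(64, 784)
    w = np.random.rand(784, 256) * 0.01
    b = np.zeros(256)
    activations = layer_forward(x, w, b)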
