
Dec 09, 2016 09:41 PM EST

Computers and computing technology have evolved from mere adding machines into systems capable of thinking on their own. In recent years, the world has seen computers that can translate written language, understand spoken commands and identify objects using advanced machine learning techniques. It's no secret that the computing industry's chipmakers are racing head-to-head to be first in artificial intelligence (AI) hardware.

Although consumer technology products such as Google's Translate and Apple's Siri can operate in real time, the complex mathematical models on which they depend require considerable power, energy and time. This has pushed chipmakers like Intel, mobile computing kingpin Qualcomm and graphics powerhouse Nvidia, along with various tech startups, to compete in developing specialized hardware that makes modern deep learning cheaper and faster.

It is hard to overstate the significance of such chips in the development and training of AI algorithms. Nvidia CEO Jen-Hsun Huang noted that computers can require months of training to perform a new task, but with these chips it can take only days. He compared it to having a time machine, as reported by Fast Company.

Nvidia is best known for the video cards that help gamers play the latest first-person shooters at the highest achievable resolution. Beyond that, the company has been adapting its graphics processing unit chips, or GPUs, to data center number crunching and serious scientific computation.

Nvidia has made GPU technology more general purpose by taking it outside of graphics, as reported by Nvidia's vice president and general manager Ian Buck.

GPUs were designed to draw video game graphics and other real-time images quickly. They are specialized for mathematical operations such as matrix multiplication and other workloads built from large numbers of elementary computations. Scientists have found these same strengths valuable for other mathematical applications, such as modeling the properties of complex biomolecular structures and running climate simulations.
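
To see why matrix multiplication suits this kind of hardware, here is a minimal sketch in Python with NumPy (running on the CPU here; the matrix sizes are arbitrary choices for this illustration, not anything specific to Nvidia's products). Every element of the result is an independent dot product, which is exactly the kind of elementary, repeatable computation that thousands of GPU cores can execute at once.

```python
import numpy as np

# Two arbitrary square matrices; the sizes are illustrative only.
A = np.random.rand(1024, 1024)
B = np.random.rand(1024, 1024)

# The whole multiplication in one call.
C = A @ B

# Each element C[i, j] is just a dot product of row i of A and
# column j of B -- independent of every other element, so all of
# them can be computed in parallel.
i, j = 0, 0
c_ij = sum(A[i, k] * B[k, j] for k in range(A.shape[1]))
assert np.isclose(C[i, j], c_ij)
```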

More recently, GPUs have proven adept at training deep neural networks, mathematical structures loosely modeled on the human brain that likewise depend on repeated, highly parallel matrix calculations.
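
As a rough illustration of why neural network training reduces to this kind of math, here is a minimal sketch of one training loop for a toy two-layer network in Python with NumPy (all layer sizes, learning rate and variable names are arbitrary assumptions for this example, not any particular production system). Both the forward pass and the backward pass are dominated by matrix multiplications, the operation GPUs accelerate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: 64 samples with 100 features each.
X = rng.normal(size=(64, 100))
y = rng.normal(size=(64, 1))

# Randomly initialized weights for a two-layer network.
W1 = rng.normal(size=(100, 32)) * 0.1
W2 = rng.normal(size=(32, 1)) * 0.1

lr = 0.01  # learning rate, chosen arbitrarily
for step in range(100):
    # Forward pass: two matrix multiplications plus a nonlinearity.
    h = np.maximum(X @ W1, 0)          # ReLU hidden layer
    pred = h @ W2
    loss = np.mean((pred - y) ** 2)

    # Backward pass: yet more matrix multiplications.
    grad_pred = 2 * (pred - y) / len(y)
    grad_W2 = h.T @ grad_pred
    grad_h = grad_pred @ W2.T
    grad_W1 = X.T @ (grad_h * (h > 0))  # mask by ReLU derivative

    # Gradient descent update.
    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
```

Repeating this loop millions of times over far larger matrices is what takes months on conventional hardware, and it is precisely this repetition of parallel matrix work that specialized AI chips compress into days.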
