Arm puts the smarts in small things

New chips enable machine learning for the internet of things, writes Martin Garner, COO at CCS Insight.

As artificial intelligence (AI) reaches a growing number of products, from the data centre to connected devices, its applications are creating massive computing requirements. Smartphones already use machine learning for voice recognition, image processing and other tasks, but the spread of smaller devices for the internet of things is opening up even greater possibilities for the technology.

Last week, Arm, a UK-based semiconductor design company owned by SoftBank, unveiled two new computer chip designs for AI.

The Cortex-M55 processor is a relatively low-cost, power-efficient chip designed to run sensors and perform simple computational tasks. The Ethos-U55 neural processing unit is intended to run alongside the Cortex-M55, accelerating applications that rely on neural networks — a key technique in machine learning.

Neural processing units are chips designed specifically for running neural networks. Arm describes the Ethos-U55 as the industry’s first micro neural processing unit, in other words, one small and power-efficient enough to run on the smallest electronic devices.

In the past, chips this small generally lacked the computing power to perform machine learning efficiently. Instead, most of those tasks had to be handled by more powerful processors, such as Arm’s Cortex-A designs found in smartphones.
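For a sense of what on-device inference involves in practice, here is a minimal, hypothetical sketch using TensorFlow Lite for Microcontrollers, a framework commonly used on Cortex-M-class chips. This is an illustration, not Arm’s own software: the model data, operator list and memory sizes are placeholder assumptions, and on an Ethos-U-equipped design the model would typically first be compiled with Arm’s Vela tool so that supported layers run on the neural processing unit.

```cpp
// Hypothetical sketch: running a small neural network on a
// Cortex-M-class microcontroller with TensorFlow Lite for
// Microcontrollers (TFLM). The constructor signature follows recent
// TFLM releases; older versions also required an error reporter.
#include <cstdint>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Placeholder: a .tflite model converted offline and baked into flash.
extern const unsigned char g_model_data[];

namespace {
// Scratch memory for tensors; the size is tuned per model.
constexpr int kArenaSize = 16 * 1024;
alignas(16) uint8_t tensor_arena[kArenaSize];
}  // namespace

int main() {
  const tflite::Model* model = tflite::GetModel(g_model_data);

  // Register only the operators the model needs, keeping the binary small.
  tflite::MicroMutableOpResolver<3> resolver;
  resolver.AddFullyConnected();
  resolver.AddSoftmax();
  resolver.AddReshape();

  tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                       kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) return 1;

  // Feed one (placeholder) quantised sensor reading into the input tensor.
  TfLiteTensor* input = interpreter.input(0);
  input->data.int8[0] = 0;

  // Run the network and read back the first output score.
  if (interpreter.Invoke() != kTfLiteOk) return 1;
  TfLiteTensor* output = interpreter.output(0);
  int8_t score = output->data.int8[0];
  (void)score;
  return 0;
}
```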

Working in sync

The company says that with the two new chips working in sync, it has run machine learning tasks up to 480 times faster than with its previous Cortex-M chips. Used together, the two chips are also up to 25 times more energy efficient, which is critical for the many devices that depend on battery power.

The new chips will allow AI applications to be brought to areas such as farming, healthcare and voice assistants. In farming, low-cost sensors equipped with machine learning could be used to calibrate inputs to crops precisely. In smart speakers, such as those running Amazon’s Alexa, more functions could be performed directly on the device. Today these devices process only the wake word locally and relay everything else to remote data centres, raising concerns about cybersecurity and eavesdropping.

Getting AI to function on relatively low-power devices at the network edge — rather than in the cloud, where most AI workloads are run today — changes the data security and privacy equation. Sending less data to the cloud brings clear benefits, but all parties need to be confident that the edge devices themselves are sufficiently secure and private. Processing data locally also allows the faster response times that are critical in applications such as voice assistants, where any lag can harm the user experience.

Not alone

When it started out, Arm supplied chip designs aimed at mobile devices to component suppliers and device makers such as Qualcomm and Apple. The company now has a far more diverse set of customers and is a leading provider of processor designs for the embedded computers used in many areas, such as cars and factory machines.

Arm isn’t alone in supplying designs for low-cost, power-efficient sensors and electronics. Huawei has also invested in neural processors, but has packaged them in chips for higher-end mobile phones, such as its Kirin 970. MediaTek, likewise, has created chips designed to run AI applications on smart TVs.

The push to add AI functionality to smaller, more power-constrained devices is putting pressure on component manufacturers to keep pace with the evolving needs of device makers. Products based on Arm’s new designs are expected to reach the market in 2021, enabling technology companies to run AI applications locally on small devices without sacrificing performance or size.

www.ccsinsight.com