Movidius Announces Fathom Deep Learning Accelerator Compute Stick

Movidius, a leader in low-power machine vision technology, today announced the Fathom Neural Compute Stick – the world’s first deep learning acceleration module – along with the Fathom deep learning software framework. Together, these tools will allow powerful neural networks to be moved out of the cloud and deployed natively in end-user devices.

The new Fathom Neural Compute Stick is the world’s first embedded neural network accelerator. With the company’s ultra-low power, high performance Myriad 2 processor inside, the Fathom Neural Compute Stick can run fully trained neural networks at under 1 watt of power. Thanks to standard USB connectivity, the Fathom Neural Compute Stick can be connected to a range of devices and enhance their neural compute capabilities by orders of magnitude.

Neural networks are used in many revolutionary applications such as object recognition, natural speech understanding, and autonomous navigation for cars. Rather than having engineers program explicit rules for machines to follow, these self-teaching systems process vast amounts of data offline and generate their own rule-sets. Neural networks significantly outperform traditional approaches in tasks such as language comprehension, image recognition and pattern detection.
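
To make that idea concrete, here is a minimal, self-contained sketch of a rule being learned from labelled data rather than hand-coded: a single-neuron perceptron in plain Python, with toy data standing in for the vast offline datasets described above. Everything in it (the data, the hidden threshold rule, the learning rate) is illustrative and has nothing to do with Movidius tooling.

```python
# Toy illustration: a rule is learned from labelled examples instead of
# being programmed explicitly. All data and parameters are illustrative.
import random

# Generate labelled examples of a rule we never hand-code into the model:
# the label is 1 when x1 + x2 > 1.0, otherwise 0.
examples = []
for _ in range(200):
    x1, x2 = random.random(), random.random()
    examples.append(((x1, x2), 1 if x1 + x2 > 1.0 else 0))

# A single-neuron "network" trained with the classic perceptron update.
w1, w2, b = 0.0, 0.0, 0.0
lr = 0.1
for _ in range(20):                      # a few passes over the data
    for (x1, x2), label in examples:
        pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
        err = label - pred
        w1 += lr * err * x1
        w2 += lr * err * x2
        b += lr * err

# The learned weights approximate the hidden rule x1 + x2 > 1.0.
print("learned rule: %.2f*x1 + %.2f*x2 + %.2f > 0" % (w1, w2, b))
```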

When connected to a PC, the Fathom Neural Compute Stick behaves as a neural network profiling and evaluation tool, meaning companies will be able to prototype faster and more efficiently, reducing time to market for products requiring cutting-edge artificial intelligence.

“As a participant in the deep learning ecosystem, I have been hoping for a long time that something like Fathom would become available,” said Dr. Yann LeCun, Founding Director of the New York University Data Science Center. “The Fathom Neural Compute Stick is a compact, low-power convolutional net accelerator for embedded applications that is quite unique. As a tinkerer and builder of various robots and flying contraptions, I’ve been dreaming of getting my hands on something like the Fathom Neural Compute Stick for a long time. With Fathom, every robot, big and small, can now have state-of-the-art vision capabilities.”

Fathom allows developers to take their trained neural networks out of the PC-training phase and automatically deploy a low-power optimized version to devices containing a Myriad 2 processor. Fathom supports the major deep learning frameworks in use today, including Caffe and TensorFlow.
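
To illustrate what that hand-off could look like in practice, the sketch below freezes a trained TensorFlow model into a single graph file on the PC and then passes it to an offline converter. The freezing calls are standard TensorFlow 1.x APIs; the checkpoint paths, the output node name, and the fathom_convert command are all hypothetical placeholders, not Movidius’s published tooling.

```python
# Sketch of the PC-side hand-off, under the assumptions stated above.
import subprocess
import tensorflow as tf
from tensorflow.python.framework import graph_util

with tf.Session() as sess:
    # Restore a previously trained model (checkpoint paths are examples).
    saver = tf.train.import_meta_graph("model.ckpt.meta")
    saver.restore(sess, "model.ckpt")

    # Fold the trained weights into the graph as constants so the whole
    # network lives in one self-contained protobuf file.
    frozen = graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), output_node_names=["output"])
    with tf.gfile.GFile("model_frozen.pb", "wb") as f:
        f.write(frozen.SerializeToString())

# Hypothetical offline step: hand the frozen graph to a converter that
# emits a low-power binary for the Myriad 2 inside the Fathom stick.
# "fathom_convert" is an illustrative name, not the actual tool.
subprocess.run(["fathom_convert", "--graph", "model_frozen.pb",
                "--output", "model_myriad.bin"], check=True)
```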

“Deep learning has tremendous potential — it’s exciting to see this kind of intelligence working directly in the low-power mobile environment of consumer devices,” said Pete Warden, Google’s AI Technical Lead. “With TensorFlow supported from the outset, Fathom goes a long way towards helping tune and run these complex neural networks inside devices.”

Fathom Features

  • Plugged into existing systems (an ARM host plus a USB port), Fathom can accelerate performance on deep learning tasks by 20x to 30x. For example, plug it into a “dumb” drone and you can then run neural network applications on it.
  • It contains the latest Myriad 2 MA2450 chip – the same one Google is using in its undisclosed next-generation deep learning devices.
  • It’s ultra-low power (under 1.2W) is ideal for many mobile and smart devices. This is roughly 1/10th of what competitors can achieve today.
  • It can take TensorFlow and Caffe networks trained on a PC and run them on embedded silicon at under 1 W. In images per second per watt, Fathom is roughly 2x Nvidia on similar tests (see the sketch after this list).
  • Fathom takes machine intelligence out of the cloud and into actual devices. It can run deep neural networks in real time on the device itself.
  • With Fathom, you can finally bridge the gap between training (i.e. server-side on GPU blades) and inferencing (running on the user’s device without a cloud connection). Customers can rapidly convert a PC-trained network and deploy it to an embedded environment – meaning they can put deep learning into end-user products far faster and more easily than before.
  • Application example: plug Fathom into a GoPro and turn it into a camera with deep learning capabilities.
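
The images/second/watt figure cited in the list above is simply throughput divided by power draw. The snippet below shows how one might compute and compare that efficiency metric from bench measurements; every number in it is a placeholder, not a published benchmark result.

```python
# Sketch: compute an images/second/watt efficiency figure from bench
# measurements. All numbers below are placeholders, not measured results.

def images_per_second_per_watt(images, seconds, watts):
    """Throughput per watt: (images / seconds) / watts."""
    return (images / seconds) / watts

# Hypothetical measurements of two devices running the same network.
fathom = images_per_second_per_watt(images=1500, seconds=10.0, watts=1.0)
gpu = images_per_second_per_watt(images=30000, seconds=10.0, watts=40.0)

print("Fathom: %.1f images/s/W" % fathom)
print("GPU:    %.1f images/s/W" % gpu)
print("Ratio:  %.2fx" % (fathom / gpu))
```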

Availability

General availability will be in Q4 of this year, with pricing under $100 per unit.

Source: insideBigData