ARM Announces Project Trillium, a New Dedicated AI Processing Family
AI and machine learning are the hottest topics in computing today. Dozens of companies are attempting to establish themselves in the market, from Google and Amazon to Intel and Nvidia. Now, ARM is throwing its own hat into the ring with its just-unveiled Project Trillium, a new line of processors specifically dedicated to AI.
Project Trillium has three components. The ARM ML (Machine Learning) Processor is a new mobile chip designed for efficient, low-power machine learning workloads.
Here’s ARM’s description:
The solution consists of state-of-the-art optimized fixed-function engines to provide best-in-class performance within a constrained power envelope.
Additional programmable layer engines support the execution of non-convolution layers, and the implementation of selected primitives and operators, along with future innovation and algorithm generation. The network control unit manages the overall execution and traversal of the network and the DMA moves data in and out of the main memory.
The ARM OD processor is the second piece of the puzzle. OD stands for Object Detection, and the chip is designed to recognize people, including specific parts of the body, overall positioning, and similar information. ARM claims “rich and detailed metadata allow even more information to be extracted from each frame,” which honestly sounds less like a feature and more like an attempt to improve the surveillance state. ARM all but says so itself; the first entry in its Key Benefits section reads: “Cutting-edge people detection running on mobile or embedded cameras.”
It’s unfortunate to see ARM getting into the global panopticon business. I’ve been taking photos and videos (poorly) on a phone for over a decade, and nothing about that experience calls for per-frame people detection. That “rich metadata” capability isn’t for end users; it’s a way to allow businesses to invade even more of your life.
The third piece of the puzzle is a set of open source software development tools. ARM describes it as providing a bridge between existing neural network frameworks and ARM’s various processors. The ARM NN SDK can target ARM Mali GPUs, Cortex-A CPUs, the new ARM ML processor, and even existing Cortex-M embedded CPUs. The first release of the SDK will support Caffe, with TensorFlow support arriving soon thereafter.
ARM NN sits between those existing frameworks and ARM hardware, translating a trained network into a format optimized for the target processor. ARM is also releasing a version explicitly tuned for Android devices.
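To make that concrete, here’s a minimal sketch of the workflow ARM’s SDK documentation describes: parse a Caffe model, optimize it for a compute backend, and load it into the Arm NN runtime. This is an illustration only; the model file, tensor names, and shapes are hypothetical, and the exact API signatures may differ between SDK releases.

```cpp
// Sketch of the Arm NN C++ flow: parse -> optimize -> load.
// File names, tensor names, and shapes below are illustrative assumptions.
#include <armnn/ArmNN.hpp>
#include <armnnCaffeParser/ICaffeParser.hpp>

int main()
{
    // Parse an existing Caffe model into an Arm NN network graph.
    auto parser = armnnCaffeParser::ICaffeParser::Create();
    armnn::INetworkPtr network = parser->CreateNetworkFromTextFile(
        "model.prototxt",                                        // hypothetical model file
        { { "data", armnn::TensorShape({ 1, 3, 224, 224 }) } },  // assumed input name/shape
        { "prob" });                                             // assumed output layer

    // Create the runtime and optimize the graph for a compute backend,
    // e.g. CpuAcc (Cortex-A/NEON) or GpuAcc (Mali).
    armnn::IRuntime::CreationOptions options;
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);
    armnn::IOptimizedNetworkPtr optimized = armnn::Optimize(
        *network, { armnn::Compute::CpuAcc }, runtime->GetDeviceSpec());

    // Load the optimized network; inference then runs by binding input and
    // output tensors and calling runtime->EnqueueWorkload().
    armnn::NetworkId networkId;
    runtime->LoadNetwork(networkId, std::move(optimized));

    return 0;
}
```

The notable design point is that the same parsed network can be re-optimized for whichever backend a given device actually has, which is what lets one model target Mali GPUs, Cortex-A CPUs, or the new ML processor without retraining.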
The ARM ML processor and the ARM NN software both look like strong offerings and a smart move for the company, but we can’t honestly say the same for the ARM OD processor.
For more, read PCMag’s AI processor explainer.
Continue reading
Google’s AI-Focused Tensor Processing Units Now Available in Beta
Google is ready to open up its Cloud TPU platform to developers and researchers looking to test machine learning workloads, and it has a new Cloud TPU design that’s more powerful than the chips we’ve previously discussed.
MediaTek Unveils New P60 Helio With Integrated AI Processing
MediaTek is unveiling its new Helio P60 SoC at Mobile World Congress, with substantially improved CPU, GPU, and AI performance.
MIT Neural Network Accelerates MRI Image Processing by 1,000 Times
It can take hours for a computer to match all the locations in a 3D map, but researchers from MIT have developed an algorithm that could cut that time to less than a second.
New Samsung Exynos 9820 Includes Neural Processing Unit
Samsung has announced its new Exynos 9820, with significantly improved single-core performance and lower overall power consumption.