Nvidia Announces Jetson Orin Nano, Updates Isaac Nova Orin Dev Kit

During the Nvidia keynote at GTC today, the company unveiled its new Jetson Orin Nano system-on-module (SOM) and announced updates to its Nova Orin robot reference platform.

At $199, the new Orin Nano is more expensive than the original Jetson Nano. Nvidia claims the new hardware is much faster, and it ships in 4GB and 8GB memory configurations instead of the older module’s 2GB/4GB. Both tiers feature an Ampere GPU and include deep learning and computer vision accelerators. Nvidia says the 8GB version can deliver “up to 40 TOPS” within a 7W to 15W power envelope. The 4GB version should deliver about half that performance, but with power options “as low as 5W to 10W.”

One significant difference between the Orin product family (including the new Orin Nano) and the previous Xavier generation is the choice of CPU IP. Both Xavier and Orin use ARM-based processors, but Orin is built around the ARM Cortex-A78AE, while Xavier used Nvidia’s own core, codenamed Carmel. Carmel was a later iteration of Nvidia’s long-running (and occasionally semi-mythical) “Project Denver,” which was originally envisioned as a CPU capable of emulating x86 and challenging Intel on its own turf, using design concepts similar to Transmeta’s. Carmel improved significantly on the earlier Denver designs, but it seems the core has been quietly retired in favor of an ARM-designed CPU implementation.

There could be good reasons for this. The ARM Cortex-A78AE (the “AE” stands for Automotive Enhanced) is designed to meet functional-safety and reliability standards that ordinary application cores don’t target, and it may be that Nvidia found it simpler to build around a standard ARM core than to continue developing its own architecture.

Nvidia Announces 3 New Nova Orin Reference Platform Configurations

Orin modules specialize in coordinating data from a wide array of sensors, such as IR cameras, ultrasonic sensors, and inertial measurement units, and they can wrangle multiple concurrent AI application pipelines. Alongside the new Orin Nano, Nvidia announced an update to its Nova Orin reference platform for autonomous mobile robots (AMRs), or rather, a trio of updates: the company released details for three different AMR configurations, all of which rely on Jetson AGX Orin modules.
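
To make “coordinating data from a wide array of sensors” a little more concrete, here is a minimal, purely illustrative ROS 2 (rclpy) sketch of one node fanning in two sensor streams and pairing camera frames with the latest inertial reading. This is not Nvidia-specific code, and the topic names and rates are assumptions for illustration.

```python
# Minimal ROS 2 (rclpy) sketch: one node fanning in two sensor streams.
# Topic names and rates are illustrative assumptions, not Nvidia defaults.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, Imu


class SensorCoordinator(Node):
    def __init__(self):
        super().__init__('sensor_coordinator')
        self.latest_imu = None
        # Hypothetical topics; a real AMR would expose many more streams.
        self.create_subscription(Imu, '/imu/data', self.on_imu, 10)
        self.create_subscription(Image, '/camera/image_raw', self.on_image, 10)

    def on_imu(self, msg: Imu):
        # Keep only the most recent inertial reading.
        self.latest_imu = msg

    def on_image(self, msg: Image):
        if self.latest_imu is None:
            return  # no inertial data yet
        # Pair each camera frame with the newest IMU sample before handing
        # both off to whatever perception pipeline runs downstream.
        self.get_logger().info(
            f'{msg.width}x{msg.height} frame paired with IMU yaw rate '
            f'{self.latest_imu.angular_velocity.z:.3f} rad/s')


def main():
    rclpy.init()
    rclpy.spin(SensorCoordinator())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

On a real robot there would be far more streams and stricter time synchronization, but the pattern of a single node fanning in multiple sensor subscriptions is what coordinating sensor data boils down to at the software level.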

Inward-facing sensors and cameras, as well as AMRs using the Nova Orin platform, can plug into Nvidia’s Metropolis tools to offer both “inside-out” and “outside-in” mapping perspectives. (Wall-E looks outward at the world from inside his chassis, so his perspective is “inside-out.” The surveillance cameras in his Buy-N-Large warehouse look inward from the outside, so they’re “outside-in.”)

Nvidia’s Jetson AGX Orin modules run the company’s Isaac robotics stack, which Nvidia CEO Jensen Huang announced today will be moving to the cloud. Isaac, including Isaac Sim, offers a suite of “cloud-native” tools compatible with the ROS 2 ecosystem. Developers can use Isaac Sim to manage sensor data, a capability Nvidia hopes will find widespread use in autonomous vehicles, immersive gaming, and medical diagnostics. For example, SlashGear reports that Amazon is using Isaac Sim to map out the inside of its warehouses in order to test, train, and manage its own AMRs.
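
Because Isaac’s tooling is pitched as ROS 2-compatible, simulated sensor data typically reaches robot code the same way real sensor data does: as ordinary ROS 2 topics. Purely as an illustrative stand-in (this is not the Isaac Sim API; the topic name, resolution, and frame rate are assumptions), a minimal publisher of a simulated camera stream might look like this:

```python
# Illustrative stand-in for a simulated camera feed on a ROS 2 topic.
# NOT the Isaac Sim API; topic name and frame parameters are assumptions.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class FakeCameraPublisher(Node):
    def __init__(self):
        super().__init__('fake_camera')
        self.pub = self.create_publisher(Image, '/sim/camera/image_raw', 10)
        self.create_timer(1.0 / 30.0, self.publish_frame)  # ~30 frames per second

    def publish_frame(self):
        msg = Image()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'sim_camera'
        msg.height, msg.width = 480, 640
        msg.encoding = 'rgb8'
        msg.step = msg.width * 3                  # bytes per row
        msg.data = bytes(msg.step * msg.height)   # blank (all-black) frame
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(FakeCameraPublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

A node like the sensor coordinator sketched earlier could consume this synthetic stream unchanged, which is the point of running robot software against a simulator before deploying it on hardware.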

GTC 2022 runs from Sept. 19 through Sept. 22.
