Nvidia is promoting its own high-end performance in major AI and machine learning benchmarks, as if some kind of floodgate has popped open on companies talking up performance metrics for their own hardware solutions. (This isn’t literally true, but we’ve been seeing a lot of artificial intelligence, machine learning, and similar data cross our desks of late.) According to Nvidia, it has hit some major milestones, including:
Nvidia is also talking up the use of Volta as a potential replacement for ASICs that would otherwise provide superior functionality in a limited set of use-cases or scenarios. It’s not clear — and I genuinely mean that — how such claims should be interpreted. Nvidia notes: “For instance, each Tesla V100 Tensor Core GPU delivers 125 teraflops of performance for deep learning compared to 45 teraflops by a Google TPU chip. Four TPU chips in a ‘Cloud TPU’ deliver 180 teraflops of performance; by comparison, four V100 chips deliver 500 teraflops of performance.” It also refers to a project by fast.ai to optimize image classification on the CIFAR-10 dataset using Volta that turned in best-in-class overall performance, beating all other competitors.
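Nvidia’s per-board comparison is just its per-chip figures scaled by four. A quick sanity check of the quoted numbers, using only the values from Nvidia’s own statement:

```python
# Figures as quoted by Nvidia (deep-learning TFLOPS):
v100_tflops = 125   # one Tesla V100 Tensor Core GPU
tpu_tflops = 45     # one Google TPU chip

# A "Cloud TPU" bundles four TPU chips; Nvidia compares it to four V100s.
cloud_tpu_tflops = 4 * tpu_tflops    # 180 TFLOPS
four_v100_tflops = 4 * v100_tflops   # 500 TFLOPS
```

The arithmetic checks out, but note both sides of the comparison are theoretical peak numbers, which is exactly the problem discussed next.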
There are problems, however, with relying on FLOPS to measure performance. FLOPS is calculated by a simple mathematical equation:
In GPUs, this works out to GPU cores * clock * two instructions per clock (one multiply, one accumulate) = X rating in TFLOPS. This intrinsically assumes the GPU executes a multiply and an accumulate on every core simultaneously. That assumption lets us generate comparative metrics quickly with a constant formula, but it leaves a huge loophole: If GPU #1 typically achieves only 50 percent of its theoretical peak FLOPS, it can be outperformed by GPU #2, which might have a much lower theoretical maximum but runs closer to it in practice. The same caveat applies to any comparison between two different solutions.
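The formula and the loophole can be sketched in a few lines. The core counts, clocks, and utilization percentages below are purely illustrative, not measurements of any real GPU:

```python
def peak_tflops(cores, clock_ghz, flops_per_clock=2):
    # Theoretical peak: cores * clock * 2 ops per clock
    # (one multiply, one accumulate), expressed in TFLOPS.
    return cores * clock_ghz * flops_per_clock / 1000

# Hypothetical specs: GPU #1 has the higher rating on paper.
gpu1_peak = peak_tflops(cores=5120, clock_ghz=1.5)  # ~15.4 TFLOPS theoretical
gpu2_peak = peak_tflops(cores=3584, clock_ghz=1.6)  # ~11.5 TFLOPS theoretical

# But achieved throughput depends on how much of that peak is realized.
gpu1_achieved = gpu1_peak * 0.50  # 50 percent of peak -> ~7.7 TFLOPS
gpu2_achieved = gpu2_peak * 0.80  # 80 percent of peak -> ~9.2 TFLOPS
```

Here GPU #2 wins the real workload despite losing the spec-sheet comparison, which is why headline FLOPS figures can mislead.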
The recent reports on Google’s cloud TPU being more efficient than Volta, for example, were derived from the ResNet-50 tests. The results Nvidia is referring to use the CIFAR-10 data set. The Dawnbench team records no results for TPUs in this test, and fast.ai’s blog post on the topic may explain why this is:
Google’s TPU instances (now in beta) may also be a good approach, as the results of this competition show, but be aware that the only way to use TPUs is if you accept lock-in to all of:
- Google’s hardware (TPU)
- Google’s software (Tensorflow)
- Google’s cloud platform (GCP)

More problematically, there is no ability to code directly for the TPU, which severely limits algorithmic creativity (which as we have seen, is the most important part of performance). Given the limited neural network and algorithm support on TPU (e.g. no support for recurrent neural nets, which are vital for many applications, including Google’s own language translation systems), this limits both what problems you can solve, and how you can solve them.
As hardware and software continue to evolve, we’ll see how these restrictions and capabilities evolve along with them. It’s absolutely clear that Volta is a heavy-hitter in the AI/ML market as a whole, with excellent performance and the flexibility to handle many different kinds of tasks. How this will change as more custom hardware comes online and next-generation solutions debut is still unclear.