Microsoft has announced it will take advantage of a new security capability Intel has enabled in its integrated GPUs. Normally, antivirus scans run entirely on the CPU. In years past, this could bring a system to its knees: a single-core CPU would bog down badly under any significant workload during a scan, and hard drives couldn't service scan reads and application I/O at the same time. The advent of SSDs and multi-core CPUs largely solved this problem, but it's still possible to see some application lag during a scan, even on a modern rig.
Intel’s proposed solution? Running the AV scan on its own integrated GPUs, thereby saving power and increasing overall system efficiency.
In theory, this is an excellent idea. According to Intel, the total CPU usage when running a GPU-assisted AV scan is around 2 percent, compared with an average of 20 percent without GPU acceleration. Intel calls this new technology Accelerated Memory Scanning, and it’s part of the company’s overall package of security capabilities, dubbed Intel Threat Detection Technology. Intel is also working on a feature dubbed Advanced Platform Telemetry, which uses machine learning to categorize potential threats and take appropriate action. This capability will be integrated into Cisco’s Tetration Platform, which is used for securing data centers and protecting the cloud.
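Intel hasn't published the internals of Accelerated Memory Scanning, but the workload being offloaded is essentially a signature sweep over memory. As a rough, simplified sketch of that kind of work (the signatures and buffers below are invented for illustration):

```python
# Illustrative sketch only: Intel has not published the Accelerated Memory
# Scanning implementation. This shows the general shape of the work being
# offloaded -- sweeping memory buffers for known byte signatures.
# All signatures and buffer contents here are made up for the example.

KNOWN_SIGNATURES = {
    b"\xde\xad\xbe\xef": "example-dropper",   # hypothetical signature
    b"\x90\x90\x90\xcc": "example-nop-sled",  # hypothetical signature
}

def scan_buffer(buf: bytes) -> list:
    """Return (offset, threat name) pairs for every signature hit in buf."""
    hits = []
    for sig, name in KNOWN_SIGNATURES.items():
        start = 0
        while (idx := buf.find(sig, start)) != -1:
            hits.append((idx, name))
            start = idx + 1
    return hits

clean = bytes(64)
infected = bytes(16) + b"\xde\xad\xbe\xef" + bytes(16)
print(scan_buffer(clean))     # []
print(scan_buffer(infected))  # [(16, 'example-dropper')]
```

On a CPU this loop runs serially; the appeal of a GPU is that each buffer (or each signature) can be checked by a separate thread in parallel, which is why Intel reports the CPU-side cost dropping so sharply.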
Right now, these features are reserved for enterprise offerings — Microsoft is integrating them into Windows Defender Advanced Threat Protection, its enterprise security platform. But long term, these capabilities should come to consumer platforms, if only because evolving threats will require them to keep machines secure (assuming, of course, that machine learning becomes a component of system security in the first place). GPUs could be well suited to this kind of scanning because they excel at the parallel math behind pattern recognition. In theory, a neural network could be trained on malware code the same way you might train it to recognize images of cats or faces.
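To make the pattern-recognition analogy concrete, here is a deliberately toy sketch: instead of a trained neural network, a nearest-centroid classifier over byte histograms, which captures the same basic idea of learning a statistical fingerprint from labeled samples. The "training samples" are synthetic stand-ins, not real malware.

```python
# Hypothetical sketch: malware detection framed as pattern recognition.
# Real systems train large models on huge labeled corpora; this toy
# nearest-centroid classifier over byte histograms shows the core idea.
# All samples below are synthetic stand-ins invented for the example.
from collections import Counter

def byte_histogram(data: bytes) -> list:
    """Normalized frequency of each possible byte value (256 features)."""
    counts = Counter(data)
    total = len(data) or 1
    return [counts.get(b, 0) / total for b in range(256)]

def centroid(samples: list) -> list:
    """Average histogram of a labeled set of samples."""
    hists = [byte_histogram(s) for s in samples]
    return [sum(col) / len(hists) for col in zip(*hists)]

def classify(data: bytes, benign_c: list, malicious_c: list) -> str:
    """Label data by whichever class centroid its histogram sits closer to."""
    h = byte_histogram(data)
    d_b = sum((a - b) ** 2 for a, b in zip(h, benign_c))
    d_m = sum((a - b) ** 2 for a, b in zip(h, malicious_c))
    return "benign" if d_b <= d_m else "malicious"

# Synthetic "training" data: text-like bytes vs. packed, high-entropy bytes
# (a crude but real-world heuristic for packed executables).
benign = [b"hello world, ordinary program text" * 4,
          b"printf logging and config strings" * 4]
malicious = [bytes(range(256)) * 2, bytes(reversed(range(256))) * 2]

cb, cm = centroid(benign), centroid(malicious)
print(classify(b"just some readable ascii strings here", cb, cm))  # benign
```

A real model would use far richer features and far more data, but the training/classification split is the same — and both the histogram math and the distance computations are exactly the kind of embarrassingly parallel arithmetic GPUs are built for.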
The major question we have is how much of an impact running AV on the integrated GPU has on the system's graphics performance. It's been over a decade since most people had to think about this, but if you've been computing since the 1990s, you probably remember when upgrading your graphics solution could meaningfully improve desktop performance. One of the selling points for Nvidia's Ion platform back when Intel's Atom launched was that Nvidia's GPU delivered far snappier performance than Intel's. GPUs, however, tend to be miserable at multitasking. It would be interesting to know whether Intel has found a way to run this workload on its integrated graphics without any discernible performance impact for an end user who might be using the PC at the same time.
Long term, features like this could actually help Intel and AMD solve a marketing problem of their own. One of the problems with marketing the integrated graphics solutions found on Intel and AMD processors is that above a certain price point, many users install their own video cards. This makes it harder to sell the integrated solution as a useful component in its own right, especially if the end user doesn’t transcode video or need the additional video outputs for multi-monitor configurations.
AMD tried for years to get HSA (Heterogeneous System Architecture) off the ground, but the effort largely came to naught. The advent of AI and machine learning, however, could breathe new life into the concept, giving both firms a better range of capabilities to use when marketing their products. Long term, AMD's vision of using the integrated GPU to process certain workloads during gaming or productivity applications could still become reality — just not necessarily in the ways the company expected back in 2012–2013.
Similarly, capabilities like this could give Intel a marketing leg up when selling users on an Intel CPU even when they plan to game on a discrete card. It's also a way for the company to burnish its image after the beating it's taken over Meltdown and Spectre. For now, Microsoft is the only AV vendor supporting this capability, but we expect more companies to pile on if the approach proves valuable.