AMD Will Not Limit Cryptocurrency Mining on RDNA2 GPUs

AMD and Nvidia often look for ways to differentiate themselves from the competition, and one of those ways going forward, apparently, will be cryptocurrency mining. Unlike Nvidia, which opted to limit RTX 3060 mining performance only to release a driver that inadvertently defeated that limit, AMD told us it explicitly won’t be limiting any workload at all.

“The short answer is no,” said Nish Neelalojanan, a gaming product manager at AMD. “We will not be blocking any workload, not just mining.”

Nish went on to explain some specific factors weighing into AMD’s decision, including the fact that the company believes RDNA2 and its Infinity Cache are specifically tuned towards gaming rather than mining or compute workloads. There’s some evidence to support this, inasmuch as the high-end RDNA2 GPUs are not as good at mining as the RTX 3080 or RTX 3090.

But let’s be real about this: Right now, GPU prices are so broken, you can get an extra $50 for a houseplant so long as it supports HDMI. Anything that can mine cryptocurrency is being used to mine cryptocurrency, and if AMD’s GPUs can crank out profitable hash rates, no miner is going to care if AMD optimized for the workload. It’s also not clear how much leeway AMD has to block cryptocurrency workloads on Linux, given that AMD’s Linux GPU driver is entirely open source. Nvidia probably has more leeway here.

I’ve stated in several articles that I think Nvidia is doing the right thing by blocking cryptocurrency workloads. It’s not a position I like taking. The entire point of a PC, as opposed to a console, tablet, or smartphone, is the freedom the end-user has to create, modify, and run applications. It’s extremely difficult to create a programmable graphics processor only to turn around and try to prevent it from running very specific programs. It’s not clear that AMD could prevent mining the way Nvidia tried to do, even if it wanted to.

But if that’s true, it only speaks to how much the cryptocurrency industry is warping computing. When AMD decided to fully open-source its Linux driver, it wasn’t considering the impact that might have on GPU availability down the road. There wasn’t a reason to believe one topic had anything to do with the other.

Even if we assume the cryptocurrency market eventually cools off, this is cold comfort to anyone who wants a GPU. Long-term, we’re looking at a situation in which each new microarchitectural launch has a chance to spark a 9-18 month GPU shortage every 24-36 months. That’s not going to be good for the long-term survival of our hobby. Best-case is a massive buying cycle realignment, but that’s still going to burn everyone whose GPU dies during the 9-18 month period and who can’t find a card at MSRP.

These impacts aren’t theoretical. I’ve heard from a number of readers who are variously delaying purchases, saving additional funds to buy a complete system, or who wound up paying a lot more money than they intended. If the shortages continue long enough, people who can’t afford to game on PC will game on something else or find a different hobby altogether.

I don’t know if there’s a way for Intel, AMD, and Nvidia to build a GPU that continues the industry’s historical trend of improving both game performance and performance per watt while absolutely sucking as a cryptocurrency miner, but the future of the gaming market may belong to the firm that figures out the best answer to this puzzle. At this point, the most optimistic predictions suggest the GPU semiconductor shortage might begin to ease by the end of Q2, with Q3 and Q4 floated as alternatives. No one is suggesting a widespread outage that drags on into Q1 2022 — yet.

If AMD’s reason for not limiting cryptocurrency mining is philosophical, not technical, I have to say I disagree with the decision under the current circumstances. Six months after Ampere debuted, GPUs of every stripe remain very difficult to find.
