Custom GeForce GTX 1660 Ti Model Information Leaks From Multiple Retailers

We’ve been covering rumors of a new Turing-class GPU coming from Nvidia (minus Turing’s principal features, namely its RT and Tensor cores) for the past few months, but the leak of packaging and other details confirms that these cards are imminent. Whether they’ll represent a significant shift in value proposition relative to GPUs already on the market, however, is its own question.

WCCFTech has rounded up card shots from several vendors, as has VideoCardz. The information on the GTX 1660 Ti is firmer than the data for the GTX 1660; we have multiple leaks that show specs on the 1660 Ti, including base clock, boost clock, and memory capacities. Unfortunately, those leaks don’t always agree.

According to one set of leaked images, the GeForce GTX 1660 Ti is a GPU with a 192-bit GDDR5 design and an effective memory clock of 8008MHz. That corresponds to the 192GB/s of memory bandwidth attached to the original GeForce GTX 1060.

According to a second set of images, the GeForce GTX 1660 Ti is a GPU with a 192-bit GDDR6 design and an effective memory clock of 8008MHz. This would still correspond to exactly the same amount of bandwidth as is available to a GTX 1060, though we’d expect the RAM overclocking headroom to be pretty good unless that’s some really bad GDDR6 being recycled on these cards.

Finally, according to a third leak, the GeForce GTX 1660 Ti is a GPU with a 192-bit GDDR6 design but an effective RAM clock of 12,000MHz. This would correspond to 288GB/s of memory bandwidth, substantially more than the old 1060.
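
The arithmetic behind these bandwidth figures is straightforward: divide the bus width in bits by eight to get bytes per transfer, then multiply by the per-pin transfer rate. A minimal Python sketch of the math (the function name is ours, purely illustrative):

def memory_bandwidth_gbs(bus_width_bits: int, transfer_rate_gbps: float) -> float:
    # Peak bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin rate in Gbps
    return bus_width_bits / 8 * transfer_rate_gbps

print(memory_bandwidth_gbs(192, 8.008))  # ~192GB/s, matching the GTX 1060 and the 8Gbps rumor
print(memory_bandwidth_gbs(192, 12.0))   # 288GB/s, matching the 12Gbps rumor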

The table below summarizes these differences, with two entries for the GTX 1660 Ti (one for its rumored 12.0Gbps transfer rate, one for the 8.0Gbps rate), alongside the rumored GTX 1660 and the original GTX 1060 for comparison.

GPU             Memory Bus   Memory Type   Transfer Rate   Bandwidth
GTX 1660 Ti*    192-bit      GDDR6         12.0Gbps        288GB/s
GTX 1660 Ti*    192-bit      GDDR5/GDDR6   8.0Gbps         192GB/s
GTX 1660*       192-bit      GDDR5         8.0Gbps         192GB/s
GTX 1060        192-bit      GDDR5         8.0Gbps         192GB/s

* = Unconfirmed.

It’s possible that this uncertainty simply reflects rumor-mill churn, but the fact that it’s swirling around the GTX 1660 Ti in particular suggests an alternative explanation. Nvidia has launched enough GTX 1060 spinoff parts to qualify as an entirely separate line of GPUs. The GPU has been available in three separate memory configurations (3GB, 5GB, 6GB) and two different types of RAM (GDDR5, GDDR5X). There have been seven different GPU model numbers sold under the GeForce GTX 1060 banner, though most of these were point revisions of the same basic design (both GP104 and GP106 have been used to build 1060s in varying flavors).

It’s entirely possible that the plethora of rumors around the 1660 Ti’s configuration reflects Nvidia’s intent to offer the GPU in a variety of flavors. That would also explain why the price banding seems to be fairly large for a card expected to launch on February 22. Nvidia likely realizes that it can’t just split the RAM configuration into a 3GB/6GB slice for this market segment any longer; even the 6GB configuration on the RTX 2060 is small for the price point it commands, and a 3GB midrange card in 2019 is a distinct non-starter of a value proposition. Instead, the firm may be varying its memory frequency and standard, with cheaper, slower GDDR5 offered on some SKUs as a way to hit lower price points. We’ll have to investigate how much impact this has on performance.

Details on the GTX 1660 are slimmer than for the 1660 Ti. This GPU should debut in March, and the rumored configuration looks identical to that of the original GTX 1060. The only performance improvements would come from the GPU architecture (clock-for-clock gains for Turing over Pascal have typically been confined to specific games) or from improved clocks. The price banding of $200 – $250 suggests these cards will not offer a dramatic midrange improvement in performance per dollar relative to the GTX 1060, though we’ll wait to see clocks and overall performance before declaring that to be the case. The overall positioning suggests Nvidia is wary of offering a deal so good that gamers would prefer current-generation rasterization performance to shelling out additional cash for Nvidia’s RTX features.

Feature image by VideoCardz
