Seriously? Nvidia GTX 1660 Ti, that’s supposedly the name of the first genuinely mainstream graphics card to come out of the Turing architecture? Yes, once more the rumour mill is grinding away on some fresh speculation that the RTX 2060 will be the last card in the RTX lineup, with the rest of the next-gen GeForce range being back-filled with GTX cards, sans ray tracing hardware.
That at least makes more sense than a GTX 1180 popping up with RTX 2080 levels of gaming performance and the innate ability to put the future of real-time ray tracing in games in jeopardy. I mean, those rumours must surely be so much bull or else Nvidia will have the whole RTX range sleeping with the fishes. Sorry, I’ve been rewatching The Sopranos…
Anyway, this is the second time the GTX 1660 Ti rumours have tipped up; the first came along with the suggestion that it was actually a GTX 1160 with a typo. But with this second outing of the rumour, and assuming it’s not simply the same one recycled, the prospect of a 16-series mainstream GPU range is looking more likely.
Again, the rumour has surfaced on Videocardz, though this time with no reference to a source – the previous GTX 1660 Ti-specific rumour had appeared via Expreview. But, as the report says, the naming scheme still hasn’t been fully confirmed, though this is the second time the writer has heard of the 16-series, so it’s now come from two different sources.
I get the GTX bit – if there are no RT cores baked into this different spin of the Turing architecture, that makes sense. And I understand the xx60 nomenclature – this, not the RTX 2060, is going to be the mainstream GPU to genuinely replace the mighty GTX 1060. But why the 16-series prefix and why a Ti suffix? Can anyone explain that to me? Answers on a postcard to the usual address… or just post a comment, whatever.
The GTX 1660 Ti is expected to come rocking the same 12nm FinFET production process as the existing Turing GPUs, and will use the same Turing shader design. That means it will be able to take advantage of the concurrent execution of integer and floating point calculations to streamline the graphics pipeline, utilise the variable rate shading and content adaptive shading features, and take advantage of the redesigned memory architecture too. Check out our full Turing architecture deep-dive to get a bead on what those Turing-specific features can offer us gamers.
In terms of the actual GPU of the proposed GTX 1660 Ti, however, we’re supposedly looking at a TU116 chip with 1,536 CUDA cores, which would put it somewhere between the GTX 1060 and GTX 1070 in terms of silicon riches. The extra potential performance of the Turing shaders could come into play here, and deliver traditional rasterised rendering power ideally on par with a GTX 1070 – that’s what we’d want to see anyway.
|  | GTX 1660 Ti | RTX 2060 | GTX 1060 |
|---|---|---|---|
| GPU | TU116 | TU106 | GP106 |
| Lithography | 12nm FinFET | 12nm FinFET | 16nm FinFET |
| CUDA cores | 1,536 | 1,920 | 1,280 |
| Memory | 6GB GDDR6 | 6GB GDDR6 | 6GB GDDR5 |
| Memory bus | 192-bit | 192-bit | 192-bit |
The rumour also suggests the new mainstream card will come with GDDR6 memory, running across the same 192-bit memory bus as the ol’ GTX 1060. That might come as a bit of a surprise given the new GTX cards are being aimed at a more budget-conscious crowd and the latest memory tech is reportedly rather pricey. It might also indicate we shouldn’t necessarily be expecting a $200 price tag.
But we ought to know more soon. With the 10-series cards reportedly close to being cleared from the channel – despite Nvidia still claiming it could be another quarter before they’re fully cleansed from the e-shelves – we’re due a proper mainstream launch. Nvidia does like a GDC launch, so I’d keep an eye out around March.
I just kinda hope there’s a rethink on that name…