
Nvidia RTX 2080 vs GTX 1080: How much faster is Nvidia's new graphics card?

Frame of mind

In my Nvidia GeForce RTX 2080 review last week, we discovered that Nvidia's super duper new graphics card was about as fast as their GTX 1080Ti when paired with Intel's Core i5-8600K CPU, representing only the teensiest bit of improvement to your overall frames per second output if you were to bung one in your PC today. That may well change once we start seeing more games take advantage of Nvidia's clever speed-boosting AI-driven Turing tech, but until developers get their act together and start patching in support for all of the best RTX features, the only thing we've got to go on right now is raw performance data.

With this in mind, I thought I'd take a closer look at how the RTX 2080 compares to its direct predecessor, the Nvidia GeForce GTX 1080. The former might not represent much of a leap past the GTX 1080's souped-up Ti cousin, but regular GTX 1080 owners should see much better results than what they're getting now, particularly when it comes to gaming at 4K. Let's take a look.

To see how each card stacks up, I've chucked five of today's toughest games at them: Final Fantasy XV, Monster Hunter: World, Shadow of the Tomb Raider, Assassin's Creed Origins and Middle-earth: Shadow of War. I've tested both GPUs at 1920x1080, 2560x1440 and, of course, 3840x2160 (4K) to see what kind of speeds are possible across a range of graphics settings and, most importantly, what you need to do to get a smooth 60fps at each resolution.
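For the spreadsheet-inclined, here's a quick Python sketch of the resulting test matrix. It's purely illustrative, as the actual testing obviously happens inside each game rather than in a script:

```python
# Illustrative sketch of the test matrix: five games, three resolutions,
# each combination benchmarked across a range of graphics settings.
from itertools import product

games = [
    "Final Fantasy XV",
    "Monster Hunter: World",
    "Shadow of the Tomb Raider",
    "Assassin's Creed Origins",
    "Middle-earth: Shadow of War",
]
resolutions = ["1920x1080", "2560x1440", "3840x2160"]

for game, resolution in product(games, resolutions):
    print(f"Benchmark: {game} @ {resolution}")
```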

The cards in question are the Founders Edition of Nvidia's RTX 2080 and Zotac's GeForce GTX 1080 Amp Extreme edition, the latter of which is one of the fastest GTX 1080s around, so it should give you a pretty good idea of how the RTX 2080 compares to the cream of the crop over in 1080 Town. For Assassin's Creed Origins and Shadow of War I've used their internal benchmarks, and for the rest I've worked out the average frame rate from my in-game testing results.
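If you're curious how an "average frame rate" falls out of an in-game run, here's a minimal Python sketch using made-up frame times. The key is that the average comes from total frames over total time, so slow frames count for as long as they actually hang around on screen:

```python
# Hypothetical frame-time log (milliseconds per frame) from one test run.
frame_times_ms = [16.2, 17.1, 15.8, 22.4, 16.9, 18.0]

# Average fps = total frames / total seconds, which correctly weights
# slow frames by how long they were actually displayed.
total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds
print(f"Average frame rate: {average_fps:.1f}fps")  # ~56.4fps here
```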

And what's this? Actual graphs? Wonders will never cease. Here's what you're looking at with all five games running at high, if not max, settings at 4K (Final Fantasy XV, by the way, is without any of Nvidia's fancy graphics settings enabled):

In most cases, you're looking at an increase of around 10fps, which in these circumstances is enough to push something from being just about playable to really quite comfortable. We're still not looking at a steady 60fps at 4K with the RTX 2080, all told, but somewhere in the 40-50fps region is still a heck of an improvement over the 30-40fps you'll get with a regular GTX 1080.
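To put that in perspective, here's a quick bit of back-of-the-envelope maths with the midpoints of those ranges (illustrative numbers, not exact measurements), which shows why 10fps at 4K is a bigger deal than it sounds:

```python
# Midpoints of the 4K ranges quoted above (illustrative, not measured).
gtx_1080_fps = 35.0  # middle of the 30-40fps band
rtx_2080_fps = 45.0  # middle of the 40-50fps band

uplift = (rtx_2080_fps - gtx_1080_fps) / gtx_1080_fps
print(f"Relative uplift at 4K: {uplift:.0%}")  # roughly 29% more frames
```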

Moving on to 2560x1440, this is what you can expect to see at max settings across the board. Barring a couple of games where the gains are more modest, you're looking at a rough increase of around 15-20fps here, letting you hit a flawless 60fps more often than not at this resolution on the bestest best settings with the RTX 2080, as opposed to somewhere around the 40-50fps mark on the GTX 1080.

Finally, at 1920x1080, you're looking at a similar jump in performance, albeit somewhere in the zone of 10-15fps. Of course, the GTX 1080 is already a highly capable card at this resolution, so those with regular 60Hz monitors won't get any benefit whatsoever from opting for the more powerful RTX 2080. Those with high refresh rate monitors, however, will no doubt appreciate the extra frames, as it means they can play at higher frame rates without compromising on image quality. The only anomaly at this particular resolution is Assassin's Creed Origins, which showed practically no improvement whatsoever.
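In case the 60Hz point seems unfair on the RTX 2080, the logic really is that blunt. Assuming vsync on a fixed-refresh monitor with no adaptive sync, the panel simply caps what you can see:

```python
# With vsync on a fixed-refresh monitor (and no adaptive sync), the panel
# caps the frame rate you actually see, however fast the GPU renders.
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    return min(rendered_fps, refresh_hz)

print(displayed_fps(110.0, 60.0))   # 60.0  - extra frames go to waste
print(displayed_fps(110.0, 144.0))  # 110.0 - high refresh puts them to use
```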

Still, in every other circumstance you're looking at a fairly respectable speed bump between these two cards, regardless of your chosen resolution. As I discussed in my RTX 2080 review, however, the only problem is that these RTX scores are nigh-on identical to what you can already get with the GTX 1080Ti, which right now is just a smidge cheaper than its new RTX-series cousin.

That rather puts a bit of a downer on the RTX 2080 results, but it's important to remember we're also missing one of Turing's key ingredients right now: Nvidia's DLSS tech. As mentioned briefly above, this uses AI to take some of the load off the GPU in the edge-smoothing, anti-aliasing department, which can often take quite a toll on performance (*cough*Tomb Raider*cough*) when it's left entirely up to the graphics card to figure out how to do it on a moment-to-moment basis. Indeed, Final Fantasy XV and Shadow of the Tomb Raider have both been confirmed as DLSS games (as soon as Square Enix get round to patching in support, that is), so I'll be very interested to see how much of a difference it makes to my results once it's available to test in-game.

Is such theoretical greatness worth spending all that extra money on right this very second? Probably not if you're trying to get the best graphics card for the least amount of money, especially when we don't have a clear idea of how many other upcoming games will be getting DLSS support further down the line. If you can afford to wait before upgrading your graphics card, I'd definitely advise doing so, if only to see whether the price of the GTX 1080Ti drops even further to make it better value for money.
