
AMD Radeon 7 review: So close to the RTX 2080 and yet so far

Knocking at the RTX 2080's door (stop)

With the next generation of RTX cards almost fully established, AMD's Radeon 7 has a lot to prove. It's not a next-gen Big Navi GPU, nor does it have any special reflection- or performance-boosting tech hidden away inside its second-gen Vega architecture. There are improvements to be found over AMD's existing Vega 56 and Vega 64 cards, sure, but I think you'll probably agree that reduced latency and improved memory bandwidth aren't nearly as sexy as 'Hey, look at these crazy frame rate boosts', or, 'Hot damn, doesn't this light look amazing?'

It may be the world's first consumer graphics card to use a 7nm (nanometer) manufacturing process, but it is, in almost every sense, a very workmanlike one. It's got buckets of power at its disposal thanks to its ludicrous 16GB of HBM2 memory (twice that of the Vega 64), 3840 stream processors and 1TB/s of memory bandwidth, but on the surface I'd argue there isn't really a huge amount to get excited about. Until, that is, you realise you can get almost RTX 2080 levels of speed for around £40-100 less. Almost.

The Radeon 7 is, I should note, very much a graphics card aimed at the 4K end of things rather than 1920x1080 or 2560x1440. You'll have no trouble running any sort of game at these lower resolutions on maximum settings with this kind of card, but there are also plenty of other GPUs out there that will accomplish exactly the same thing for a heck of a lot less money - just take a peek at our roundup of the best graphics cards to see how much you could be saving.

Instead, it's 4K performance I'm mostly concerned about here, and specifically how the Radeon 7 matches up to Nvidia's RTX 2080. After all, this is the card that AMD themselves have named as the Radeon 7's main competition, both in their Radeon 7 unveiling at CES back in January and in subsequent press briefings prior to the card arriving for review.

Admittedly, gaming only seems to be half the story for the Radeon 7, as AMD have spent almost as much time talking about its content creation chops in professional programs such as DaVinci Resolve and Adobe Premiere as they have its actual honest-to-goodness games potential. Indeed, with a massive 16GB of memory under its belt, the Radeon 7 is almost certainly better equipped to deal with those kinds of programs than Nvidia's 8GB RTX 2080, and even the 11GB RTX 2080 Ti.

But again, it's gaming I'm interested in here, not whether it's better in Blender or Luxmark or some other professional rendering tool. There are other people far more knowledgeable than I to tell you about that sort of thing, so I'm going to be focusing on gaming and gaming alone in this review. And what I've found is that the Radeon 7 isn't so much an RTX 2080 competitor as a sort of halfway house between the RTX 2080 and the RTX 2070, which isn't particularly good news when the RTX 2070 can be had for £200 / $200 less at the moment.

Or at least it is when it's paired with my Intel Core i5-8600K CPU and 16GB of RAM, which, as some of you may recall, also seemed to cause some bottlenecking problems in my RTX 2080 review. It's possible that an Intel Coffee Lake Core i7 (or indeed AMD Ryzen 7) CPU may well squeeze a few more frames out of it (which I will endeavour to find out as soon as possible), but for the sake of keeping everything fair and square and in line with my previous set of results for the RTX 2080, here's what I managed with my Core i5.

Indeed, for a moment, the Radeon 7 looked as though it had truly outdone the RTX 2080, offering nigh-on identical speeds in a number of my benchmark results for (at least in the UK) a sizably lower chunk of change - £649 / $699 compared to £690-750 / $720-800 at time of writing.

In Assassin's Creed Odyssey, for example, the Radeon 7 and RTX 2080 were absolutely neck-and-neck across every resolution and graphics setting, the former settling on a very admirable average of 39fps at 4K on Ultra High quality, while the RTX 2080 was just a single frame behind on the same settings. Both achieved an identical 53fps average on regular High as well, while the RTX 2080 pulled just a few frames ahead with its average of 65fps on Medium at 4K compared to the Radeon 7's 62fps average.

A similar thing happened in Monster Hunter: World, too. While 4K Highest was out of reach for both graphics cards, 4K High saw each one hit peaks of 50fps and lows of around 45fps. The Radeon 7 did, I admit, briefly dip to a barely perceptible 38fps when some of the game's larger dinos were tussling around onscreen, but it quickly returned to its earlier range of 45-50fps for the rest of the fight. Again, the RTX 2080 had a slight edge when I dropped the settings down to Mid at 4K, pushing closer to 50-57fps out in the field over the Radeon 7's range of 42-54fps, but these are such small differences that I'd be very much inclined to go with the Radeon 7 here and save myself a bit of cash in the process.

The same can be said of Shadow of the Tomb Raider as well. Here, the RTX 2080 had the advantage on nearly every graphics setting going, but in most cases it was just by two or three frames. On Highest at 4K, for example, the RTX 2080 averaged 50fps with its SMAATx2 anti-aliasing enabled when I was walking around the busy Day of the Dead celebrations in Cozumel's town square, while the Radeon 7 came in with an average of 48fps. Again, the gap between them is so tiny that I'd be more than happy to go to bat for the Radeon 7 in this case, especially given its cheaper price.

However, those three games were very much the exception to the rule. Across the rest of my benchmark data, the RTX 2080 was more often than not a full graphics setting ahead of the Radeon 7 for an equivalent frame rate, essentially putting AMD's card in very much the same ballpark as the £200 / $200 cheaper RTX 2070.

In Total War: Warhammer II, for instance, the RTX 2080 managed an average of 42fps in the game's built-in battle benchmark on 4K Ultra settings, while the Radeon 7 could only manage that (42fps) on 4K High. On Ultra, it scraped in with 34fps. The Witcher III painted a similar picture. Whereas the RTX 2080 ranged between 50-60fps on 4K Ultra, the Radeon 7 veered between 43-51fps on the same setting, and only rose to 49-58fps when I kicked the quality down to High.

The Radeon 7 had a terrible time with Final Fantasy XV as well. Admittedly, AMD's Radeon RX 590 didn't cope with it very well, either, when I tested it at the end of last year, so it may just be that AMD's cards simply aren't very well-suited to Noctis' anime boyband jaunt. But even with all of the extra Nvidia effects turned off (which ground the Radeon 7's frame rate down to a 15fps slideshow when switched on, regardless of resolution, I might add), AMD's effort could still only manage between 42-47fps on 4K Average settings, a good 6-8fps behind the RTX 2080's range of 50-53fps on the same settings.

The RTX 2080 also pulled ahead in Forza Horizon 4. While the Radeon 7's average of 78fps on 4K Ultra settings is nothing to be sniffed at, the RTX 2080 nabbed the photo finish with its average of 82fps, maintaining a much tighter range of 78-92fps in the game's built-in benchmark race compared to the Radeon 7's wider swerves between 67-89fps.

The gap only got larger when I moved on to Doom, too. Of course, anything above 60fps won't mean anything at all unless you've got a high refresh rate 4K monitor at your disposal (of which there are precious few around at the moment unless you've got a couple of spare grand to spend on something like the 144Hz Nvidia G-Sync Ultimate-enabled Acer Predator X27 or Asus ROG Swift PG27UQ). But when the RTX 2080 can comfortably pump out 90-120fps on 4K Ultra settings in Doom while the Radeon 7's stuck at 70-95fps on the same settings, it still puts another firm tick in the RTX column for when more of those monitors do start coming through.

Perhaps it's a little unfair to talk about future-proofing here, when the Radeon 7's 16GB of HBM2 memory is arguably far more valuable on that front, especially when it comes to the ever-increasing demands of game texture packs and so forth. But unless you play a lot of modded games with balloon-sized extras to take into account, most games only tend to demand around 8GB of video memory right now, which for me rather puts the Radeon 7's 16GB of memory in very much the same category as Nvidia's ray tracing and DLSS features at the moment. That is, barely anything can actually use it properly right now, and it will probably be a long time before anything can fully take advantage of it - by which point we'll probably have even faster graphics cards that far outstrip what's possible on the ones you can buy today.

What's more, even if we did suddenly get inundated with support for all those confirmed DLSS games, that would only put the RTX 2080 in an even stronger position for 4K gaming, given the Radeon 7 can already only match or trail it without any of that stuff enabled.

I really wanted to like the Radeon 7, and I really wanted it to be a proper RTX 2080 rival. I mean, realistically, you're probably looking at playing most games on Medium to High settings with both cards anyway if you're after a smooth 60fps at 4K, which for some may well be enough to swing things in favour of the Radeon 7 so they can save a bit of cash. But when the cheapest RTX 2080 is only another £40 / $40 on top of the Radeon 7 at time of writing, I'd personally be tempted to spend that little bit extra and get a generally better gaming card in the process.

This may well change once third-party Radeon 7s start coming through with potentially higher clock speeds or superior cooling, as right now it's only AMD's own version of the card that's actually available to buy. But we don't yet know how much more (or less) those are going to cost, either, so we may end up in exactly the same position even if they do offer slightly better performance.

As I mentioned earlier, the Radeon 7's larger memory banks will almost certainly stand it in better stead for anyone who does a lot of 4K / 8K video editing and is much more of a creative type than a gaming type. But for those of you who just want to know the best card for 4K gaming right now, the RTX 2080 is still the one to beat in this particular price range.

Update: It turns out RTX 2080 prices have now swung even lower than the Radeon 7's, such as the MSI GeForce RTX 2080 Ventus for £640 and the EVGA GeForce RTX 2080 XC for £650, which only seals the deal for the RTX 2080 even more firmly than before.
