It’s time for a new mega benchmark comparing the Radeon RX 7900 XTX head to head with the GeForce RTX 4080 in a myriad of games. In total we have tested 54 titles, though some of those were tested using multiple configurations (DX11 vs DX12, ray tracing, and so forth) making up 62 game tests.
For testing we’ve got our hands on the massive Gigabyte Aorus RTX 4080 Master — we believe this is the largest graphics card on the market right now. Currently retailing for at least $1,350, that’s over 12% above MSRP or about a $150 premium, but given the size and weight it kind of feels like you’re at least getting that difference in the cooler.
For comparison we have the Gigabyte Aorus 7900 XTX Elite which isn’t a “Master” version as it’s unavailable on the Radeon side, but even so the Aorus Elite is still a beast and we’re sure many would argue this is also a more practical graphics card in terms of dimensions and weight. Pricing for this model is a little unclear as it’s yet to go on sale, but if we had to guess we’d say $1,100 will be the asking price for this one.
Our test system was powered by the Ryzen 7 5800X3D with 32GB of dual-rank, dual-channel DDR4-3200 CL14 memory on the MSI MPG X570S Carbon Max Wi-Fi motherboard. As usual we plan to go over the data for a dozen or so titles before jumping into the big breakdown graphs. We're focusing on 1440p and 4K performance, as you would expect, so let's get into it…
We’re starting with Fortnite which has recently been updated to use Unreal Engine 5.1, and is the first game to do so. Here we’re using DX11 with the Epic quality preset and we do realize anyone playing this game competitively will be using different quality settings, possibly a mix of Epic and Low quality options, but for the sake of testing GPU performance we’re using Epic here.
In our DX11 test the RTX 4080 was 15% faster at 1440p and 12% faster at 4K. The Radeon GPU delivered slightly better 1% lows though, which is interesting as that wasn’t the case for the previous build of the game.
Where things get messy is when we introduce DX12, which is required if you wish to take advantage of the new Unreal Engine 5.1 technologies. The problem is DX12 breaks Fortnite, it always has, and this is noticeable in fast paced build fights where this testing was conducted.
The RTX 4080 saw nearly a 50% reduction in frame rates at 1440p compared to DX11 using the same quality settings. We're also looking at a 40% hit for the 7900 XTX simply by switching from DX11 to DX12, and that bodes poorly for ray tracing, which requires DX12 and drags frame rates down even further.
The main reason for using DX12 is that you can enable the Lumen and Nanite technologies, which do look amazing. But neither technology is well suited to a game like Fortnite, as they make it harder to spot enemies and therefore survive the round.
The frame rates are also horribly low, especially at 4K. High level players will have a hard time defeating less skilled players when limited to just 45 fps, which is why competitive gamers target well over 140 fps.
What’s interesting here is that ray tracing performance in Fortnite, which is one of the best examples of RT effects in any game right now, sees the 7900 XTX and RTX 4080 neck and neck. In fact, the Radeon GPU actually does a better job here with RT Lumen and Nanite enabled, and that’s a surprising result.
Call of Duty: Modern Warfare II is another competitive shooter and for this testing we’re just using the built-in multiplayer benchmark, as it’s useful for measuring GPU performance, but not so much CPU performance. We’ll be looking at performance using both the ‘Basic’ and ‘Ultra’ quality presets.
At 1440p with the basic settings we see that the 7900 XTX is 42% faster than the 4080, and 35% faster at 4K — a brutal beat down by the Radeon GPU. These are the kind of results we’ve come to expect from Modern Warfare II and Warzone II where AMD GPUs are heavily favored.
The margins remained the same with ‘Ultra’ settings, where the 7900 XTX was 40% faster at 1440p and 34% faster at 4K. Both are impressive gains and both are very noticeable when gaming, especially at 4K, though competitive gamers will certainly appreciate and enjoy the massive uplift at 1440p.
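As a quick aside on how we express these margins: "X% faster" comes from the ratio of the two cards' average frame rates. A minimal sketch of that calculation (the fps figures below are hypothetical, purely for illustration):

```python
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, as a percentage of B's frame rate."""
    return (fps_a / fps_b - 1) * 100

# Hypothetical example: if one GPU averages 140 fps and the other 100 fps,
# the first card is 40% faster.
print(round(percent_faster(140, 100)))  # -> 40
```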
In The Witcher 3 we also tested different APIs. Starting with the DX11 data, we found that the game ran exceptionally well on either GPU, with the ultra quality settings allowing for over 250 fps at 1440p and over 100 fps at 4K. An amazing experience: no frame stutters and zero issues.
Moving to DX12, the game mostly ran well, though we did experience some crashes. Performance overall was down when compared to DX11.
But, DX12 does allow you to enable ray tracing and with the Ultra RT preset enabled… the game ran like crap. Sure it looked nice, but we went from over 200 smooth frames per second at 1440p, to under 100 fps.
The RTX 4080 played reasonably well with 66 fps on average and 1% lows of 48 fps, it wasn’t amazing, and certainly not a $1,200 experience, but it was playable. The 7900 XTX, on the other hand, was a stuttery mess, completely unplayable and a bad experience. Both sucked at 4K, so you will need to use upscaling here.
Metro Exodus has long been hailed as a great showcase for ray tracing, and while we think the RT effects are nice in this game, we weren't convinced that the performance hit on previous generation hardware made it feasible for actual play.
Here we’re running the standard version of the game without RT enabled, and the performance is amazing, both GPUs rendered over 270 fps at 1440p and over 150 fps at 4K, both were buttery smooth, the game looked nice and played well, so a great experience all round.
Now here’s the Enhanced Edition version with ray tracing enabled and we’ve got to say the 1440p results are quite good. Yes, the performance hit is massive, but these new high-end GPUs still managed over 60 fps at 1440p. For 4K you will want to use some kind of upscaling, especially with the 7900 XTX, but overall the results are quite good.
Because this is an RTX title, the GeForce 4080 has a massive advantage with RT enabled, pumping out 42% more frames at 1440p and 51% more at 4K, a significant difference to what was seen in Fortnite using RT Lumen and Nanite.
By default we test F1 22 with ray tracing enabled as this is the default setting for the Ultra High preset. The 7900 XTX performs well, but the RTX 4080 does perform better, delivering 12% better performance at 1440p and 14% more at 4K.
In our opinion though, ray tracing does very little to improve image quality in F1 22, and most gamers are better off turning it off. Doing so improves the performance of the RTX 4080 at 4K by 232%, and that of the 7900 XTX by 302%.
If you’re happy playing the game at 4K with ~60 fps, then maybe RT will be worth enabling, but we suspect for most of you the performance hit here simply can’t be justified and if true the 7900 XTX is the superior performer in this title.
Playing The Riftbreaker using the ultra quality visual settings provided very little challenge for these new high-end GPUs, both easily pushed past 300 fps at 1440p and over 170 fps at 4K, which is significantly more performance than anyone needs to play this game well.
With that much headroom, the game is worth playing with ray tracing enabled, as performance for both GPUs is still excellent, and shadow detail in particular is much more impressive. Even the 7900 XTX produced over 200 fps at 1440p and over 100 fps at 4K, so even though the RTX 4080 was 16% faster, both delivered a similar experience.
Next we have Warhammer 40,000: Darktide using the Autodesk Stingray Engine and this one appears well optimized for both AMD and Nvidia GPUs as the 7900 XTX and RTX 4080 delivered virtually identical performance. Using the high quality preset we’re looking at comfortably over 100 fps on average at 1440p and 60 fps at 4K.
For A Plague Tale: Requiem we used the highest quality preset, where the 7900 XTX outperformed the RTX 4080 to the tune of 21% at 1440p and 10% at 4K. A clear win for the Radeon GPU, though both graphics cards delivered excellent performance given the visual quality settings.
The Callisto Protocol is another new game where the 7900 XTX is able to best the RTX 4080, winning by 8% at 1440p and 3% at 4K.
That said, it's the substantially better 1% low performance that really gets the Radeon GPU over the line here, improving 1% lows by 20% at 1440p and 11% at 4K.
Traditionally, Radeon GPUs have performed well in World War Z Aftermath, for example the 6900 XT used to comfortably beat the 12GB 3080. In this generation, the Radeon 7900 XTX matches the RTX 4080 at 4K but is able to beat it by a small margin at 1440p.
We've taken a more detailed look at a handful of the 54 games we tested. Now let's compare the GeForce RTX 4080 and Radeon RX 7900 XTX across all the games we've spent the past few weeks benchmarking, starting with the 1440p data…
Here we have data for all 62 test configurations squeezed into a single graph, showing that on average the Radeon 7900 XTX is a match for the GeForce RTX 4080 across this massive sample of games. Of course, a number of ray tracing-enabled results are included; this is everything we tested for this benchmark review.
At 4K, the results are equally close, but now the Radeon 7900 XTX is 1% faster than the RTX 4080 on average. To put it more succinctly, performance was identical overall, with some outliers making a difference if you happen to play those games a lot. The 7900 XTX suffers its worst losses in Metro Exodus EE and The Witcher 3 using RT. Meanwhile, it enjoyed its greatest win in Modern Warfare 2, with other solid wins in Borderlands 3, Death Stranding, Far Cry 6 and Assassin's Creed Valhalla.
It’s worth noting that for 42 of the 62 game configurations tested here, or ~70% of the game tests, we’re looking at single-digit differences between these GPUs.
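A note on how an "average" across 62 configurations is usually calculated: summaries like this typically use a geometric mean of per-game performance ratios rather than an arithmetic mean of raw fps, so a single high-frame-rate title can't dominate the result. We haven't spelled out our exact method here, but the idea can be sketched as follows (the fps figures are hypothetical):

```python
import math

def geomean_ratio(radeon_fps, geforce_fps):
    """Geometric mean of per-game performance ratios (Radeon / GeForce).

    A result of 1.0 means the two cards tie on average; 1.05 would mean
    the Radeon card is 5% faster across the sample.
    """
    ratios = [r / g for r, g in zip(radeon_fps, geforce_fps)]
    return math.exp(sum(math.log(x) for x in ratios) / len(ratios))

# Hypothetical three-game sample: one Radeon win, one GeForce win, one tie.
print(round(geomean_ratio([120, 90, 110], [100, 100, 110]), 2))  # -> 1.03
```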
Cooling & Power
When it comes to thermals, the results will vary depending on the graphics card make and model used, but for this comparison we had the Gigabyte Aorus 7900 XTX and RTX 4080. We still wanted to give you a reference point, so thermals were measured with the cards installed inside an ATX case with the doors closed and an ambient room temperature of 21C.
After an hour of playing Hitman 3, the Aorus Master RTX 4080 saw a peak hot spot temperature of 77C with a peak average GPU temperature of 63C. The memory also peaked at 58C, and to achieve these temperatures the fans spun at 1930 RPM, generating 38 dBA of noise. Meanwhile, the cores typically clocked at 2745 MHz with the memory running at 22.4 Gbps, a good result overall.
The Gigabyte Aorus Elite 7900 XTX was a very cool customer. In similar conditions it peaked at just 72C for the hot spot with an average GPU temperature of just 58C. The VRM peaked at 75C, and it was the memory that ran the hottest of all, hitting 90C, which is significantly hotter than the memory used by the RTX 4080, though we are comparing GDDR6 vs GDDR6X.
The Aorus Elite 7900 XTX was also quieter at 37 dBA and that’s a surprising result considering that the total system power usage was much higher using the Radeon GPU at 555 watts compared to just 465 watts with the RTX 4080. Finally, the 7900 XTX typically clocked its cores at 2690 MHz, while the memory operated at 19.9 Gbps.
When we reviewed the GeForce RTX 4080 back in November we weren’t too impressed from a value perspective. Performance-wise the 4080 is great, but at $1,200 it didn’t excite us, and we concluded that we’d need to wait for the 7900 XTX before making any concrete calls.
Then came mid-December and we finally got to see what the Radeon 7900 XTX was all about. Overall, AMD has a competitive high-performance product relative to its GeForce competitor, but whether or not it’s worth buying at $1,000 will depend on how much stock you place in ray tracing performance.
The Radeon 7900 XTX does well in matching the RTX 4080 in terms of performance/value, that’s not up for discussion, but we still feel the 4080 represents poor value for gamers in the first place, so here we are with a Radeon alternative to a product we’re none too excited about.
We’ve been thinking a lot about the 7900 XTX, and why we’ve found it so underwhelming. If we think back to the Radeon 5700 XT, a 2019 GPU that we very much liked and deemed excellent value, the margins are somewhat similar to this generation. The 5700 XT was 20% cheaper than the RTX 2070 Super, or a $100 saving and it offered a similar level of performance. Meanwhile, the new 7900 XTX is 16% cheaper than the RTX 4080 and offers a similar level of performance while saving you $200.
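One simple way to quantify value gaps like these is cost per frame at a given resolution: price divided by average frame rate. A quick sketch using the MSRPs discussed above and a deliberately hypothetical fps figure:

```python
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame rendered; lower is better value."""
    return price_usd / avg_fps

# With roughly equal performance, the price gap drives the value difference.
# At a hypothetical 100 fps average for both cards:
# 7900 XTX at $1,000 -> $10.00 per frame; RTX 4080 at $1,200 -> $12.00 per frame.
print(cost_per_frame(1000, 100), cost_per_frame(1200, 100))
```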
The situation with the Radeon 7900 XTX is different though. First, the 2070 Super was a decent product, it killed off the Radeon VII for gaming, and it was worth recommending at $500 — essentially offering what was previously $800 performance with the RTX 2080. So for the 5700 XT to come in and offer a similar level of performance but at the lower $400 price point, well, that was quite impressive.
Moreover, the RTX brand of features wasn’t nearly as strong 3 years ago, DLSS support and image quality wasn’t what it is today, and very few games implemented RT effects well, so neither technology was a strong selling point for GeForce GPUs, making the cheaper 5700 XT all the more appealing.
The 7900 XTX is competing with a far worse value product in the RTX 4080, and while the Radeon GPU is cheaper, it's also generally inferior in ray tracing performance, it still lacks a true upscaling competitor to the widely supported DLSS, it draws quite a bit more power, and driver support isn't as mature, with a few strange niggles here and there.
Still, we'd say that worst-case, the 7900 XTX matches the value of the RTX 4080, and most of you say you'd purchase it over the GeForce GPU, at least based on data from a recent poll we carried out.
We also think that moving forward, ray tracing performance might be less of an issue, at least if what we're seeing in Fortnite is anything to go by. Sure, there will always be RTX titles sponsored by Nvidia where Radeon GPUs get slaughtered, that's just how the game seems to be played, but overall we think AMD's ray tracing performance is now on more solid ground than before.
At the end of the day, we don't think you should go out and purchase either of these GPUs at their $1,000 – $1,200 MSRPs, but it does seem like many gamers are interested in, and willing to buy, the less expensive 7900 XTX in this particular price segment, which seems reasonable, though we'd be a lot happier if we saw a price drop in the near future.
That’s going to do it for this massive benchmark comparison, if you enjoyed this feature and appreciate the work that went into it, feel free to share, like and subscribe.