How’s that Vega wine tasting now, AMD? Pretty good, I expect. Especially now we’ve got the latest Radeon graphics cards giving their GeForce rivals trouble in Shadow of War, hot on the heels of the realisation that AMD’s Vega is leaving the top-end Nvidia 10-series GPUs in its dust in Forza 7.
You’re going to need a great screen to go with your shiny GPU, so check out our list of the best gaming monitors around today.
We’ve done our own performance testing for Shadow of War, focusing on the sort of graphics cards found in the rigs of the vast majority of gamers, but unfortunately our Vega samples have been seconded to Poland by AMD and are still in transit back to the office.
Computer Base in Germany, however, have gone to town on the GPU testing. They’ve benchmarked Shadow of War with all the most recent graphics cards to see how they stack up in the open-world orc-fest.
The results highlight both how memory-intensive the game is, and just how well AMD’s latest Vega GPU architecture can deal with that sort of title. For the most part the AMD RX Vega 64 fell behind Nvidia’s GTX 1080 in our gaming benchmarks when we first reviewed it - even in the high-res textured world of the game’s prequel, Shadow of Mordor.
But with Shadow of War the Vega 64 is consistently out in front of the GeForce card. The difference is just 9% at 1080p and 1440p, but it stretches to a hefty 21% when the resolution is pushed up to 4K. Interestingly, it’s at 4K where the AMD RX Vega 56 also pulls away from the GTX 1080.
That’s a pretty impressive result for AMD, though it has to be said that its flagship GPU is still a lot more expensive than the Nvidia competition. The cheapest RX Vega 64 is $650 in the US, though it’s only £551 in the UK. That still puts it around $100, or £100, more expensive than the GTX 1080.
So you’d kinda hope it was delivering better numbers, right?
All the Shadow of War numbers were taken with AMD’s High Bandwidth Cache Controller turned off, too, so there are no caching shenanigans going on - it’s all straight GPU and memory interface performance.