3DMark’s Port Royal ray tracing benchmark is so demanding that even the top overclockers in the world can only manage an average of 51fps using nearly $2,000 worth of graphics card and processor. Granted, it’s not out yet, but UL – the company behind the 3DMark benchmark – has been at the Galax GOC 2018 contest in Ho Chi Minh City showing off the new ray tracing focused test, Port Royal.
The Port Royal benchmark is set for release on January 8, 2019, while the tech world’s focus is on Las Vegas and the CES trade show. Overclockers at the event in Vietnam, however, got an early preview and the chance to get down and dirty with it, competing to see who could post the highest score and the fastest overall frame rate.
And even with a heavily overclocked Nvidia RTX 2080 Ti, paired with Intel’s Core i9 9900K, the top score comes out well under 60fps. That shows just how demanding ray tracing is even on the latest graphics hardware, but it also shows that UL has designed the Port Royal test for this and the next generation of ray tracing capable GPUs.
It was Sweden’s Tobias “Rauf” Bergström who managed to grab the top spot at the Galax event, delivering an overall Port Royal score of 11,069, with an average 1440p frame rate across the test of 51fps.
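UL doesn’t publish the Port Royal scoring formula in this piece, but as a rough back-of-the-envelope sketch – assuming, purely for illustration, that the overall score scales linearly with average frame rate – the winning numbers above imply a scaling constant, and a ballpark score for a 60fps run:

```python
# Hypothetical sketch: assumes Port Royal's overall score scales linearly
# with average fps. The real formula is UL's and is not given in the article.

score = 11_069   # Rauf's winning overall score (from the article)
avg_fps = 51     # average 1440p frame rate across the test (from the article)

points_per_fps = score / avg_fps
print(f"Implied scaling: ~{points_per_fps:.0f} points per fps")

# Under that assumed linear model, averaging 60fps would need roughly:
target_fps = 60
print(f"Score implied by 60fps: ~{target_fps * points_per_fps:,.0f}")
```

Under that (hypothetical) linear model, cracking 60fps would mean a score north of 13,000 – a useful sense of how far even this overclocked hardware sits from the 60fps mark.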
The test uses ray traced reflections and shadows to show what we can expect from upcoming games taking advantage of the new DirectX Raytracing (DXR) features now baked into DX12. UL worked with both Intel and AMD, alongside Nvidia, to create the benchmark, and has also worked very closely with Microsoft to highlight the extra visual fidelity of its DXR API.
UL has reiterated that the Port Royal test in 3DMark will run on any graphics card that supports Microsoft’s DXR, saying again that “there are limited options for early adopters, but more cards are expected to get DirectX Raytracing support in 2019.” Whether UL is just referencing a potential new RTX 2060 utilising the ray tracing hardware already inside its TU106 GPU, or other, more red-tinted GPUs, we’re not entirely sure.