
DLSS finally delivers: Nvidia RTX technology tested across 5 games



As for how much of the desktop RTX 3080's performance you can expect from GeForce Now, you can see both our 1440p and 4K results below, tested on PC and Shield respectively. Owing to the streaming nature of GeForce Now, our standard FCAT benchmarking system doesn't work here, so in both desktop and cloud tests we simply note the results of the in-game benchmarks themselves. The results? Well, there can be quite a gap between the desktop RTX 3080 and its cloud equivalent in 1440p rendering - likely down to the lower memory bandwidth on the cloud GPU, along with the performance limits of the Ryzen 3000 CPU architecture (our desktop results used a Core i9 10900K locked to 5.0GHz on all cores). The desktop RTX 3080, paired with the Intel chip, can deliver up to 26 percent more performance, but the differential varies depending on the game. Indeed, in the case of Watch Dogs Legion running with ray tracing, the cloud system actually has a sizeable advantage.
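Since FCAT can't be used on a streamed feed, the comparison reduces to two in-game benchmark averages per title. Here's a minimal sketch of how a differential like that 26 percent figure falls out of those averages - the fps values are hypothetical placeholders, not our measurements:

```python
# Minimal sketch: deriving a desktop-vs-cloud differential from two in-game
# benchmark averages. The fps figures below are hypothetical placeholders.
def percent_advantage(desktop_fps: float, cloud_fps: float) -> float:
    """How much faster, in percent, the desktop run is versus the cloud run."""
    return (desktop_fps / cloud_fps - 1.0) * 100.0

desktop_avg = 126.0  # hypothetical desktop RTX 3080 + Core i9 10900K average
cloud_avg = 100.0    # hypothetical GeForce Now benchmark average
print(f"Desktop advantage: {percent_advantage(desktop_avg, cloud_avg):.0f}%")  # -> 26%
```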







Kicking off with image quality, it's worth stating that xCloud still seems to be a work in progress - and even if the Series X silicon in the cloud is rendering at 4K, the image gets downscaled to 1080p. However, selected titles do seem to support 120Hz. There's the sense here that xCloud's streaming technology is somewhat limited while the processing power at the server end is very good. In this regard it beats Stadia easily, but it doesn't offer the 4K streaming option that Google has had since day one with its Pro subscription. To put it into perspective, Outriders is a 60fps experience on GeForce Now and xCloud, but the 4K output option for Stadia only delivers 1440p at 30fps (though 1080p60 is an option).


We tested latency using a second-generation Nvidia LDAT sensor, which straps to the test monitor and measures latency from button press to a significant change in luminance on-screen (such as the muzzle flash from a gun, for example). So, the results seen here account for complete end-to-end latency, including the display lag on our monitor. This is a crucial component of the testing, as some of Nvidia's claims here have been extraordinary. The firm says that Destiny 2 on GeForce Now has input latency that's actually lower than the local Xbox Series X experience - whether you're gaming on its service at 60Hz or 120Hz. Part of this may well be down to Nvidia's Reflex technology - an input lag reduction technique which doesn't have a widely used equivalent on consoles (Microsoft's DLI has seen sparing support thus far).
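That's the principle at work: log the moment of the button press, then scan the sensor's luminance readings for the first spike. A minimal sketch of the idea - illustrative only, not Nvidia's LDAT firmware, and the trace values are made up:

```python
# Minimal sketch of the measurement principle, not Nvidia's LDAT firmware:
# end-to-end latency is the time from the button press to the first on-screen
# luminance sample that jumps past a threshold (e.g. a muzzle flash).
from typing import Optional, Sequence

def end_to_end_latency_ms(press_time_ms: float,
                          sample_times_ms: Sequence[float],
                          luminance: Sequence[float],
                          baseline: float,
                          threshold: float = 1.5) -> Optional[float]:
    """Return latency in milliseconds, or None if no luminance spike is found."""
    for t, lum in zip(sample_times_ms, luminance):
        if t >= press_time_ms and lum > baseline * threshold:
            return t - press_time_ms
    return None

# Hypothetical trace: button press at t=0ms, screen brightens sharply at ~34ms.
times = [0, 8, 16, 24, 34, 42]
lums = [20, 21, 20, 22, 70, 65]
print(end_to_end_latency_ms(0, times, lums, baseline=20))  # -> 34
```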


Stacked up against the major competition, the streaming side of the equation also sees Nvidia pull ahead. In terms of image quality, we've been impressed by Stadia's 4K output previously, but GeForce Now seems capable of delivering clear imagery with less artefacting. The input lag results are also best in class. In best-case scenarios such as Destiny 2 - at 120Hz, with a well-optimised title that also supports Nvidia Reflex - the difference between a local PC experience and a cloud-streamed alternative can drop to less than 30ms, which is remarkable. Even at 60Hz, cloud lag in the region of the mid-30s is astonishingly good. Outriders demonstrates that not every game can run so fast, but even so, results there were still better than the competition. Perhaps what's more surprising is that in the case of both Destiny 2 and Outriders, console latency was much, much higher than expected. A 60fps game running locally on Series X should outperform any streaming system also running at 60fps, but that doesn't seem to be happening in either of the titles tested here (though please note, input lag will be more competitive in other titles based on legacy testing).


One Network For All Games - The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games.


Which does make you wonder: what else can AI and machine learning achieve in gaming? And how long can AMD avoid such technology when the pace of improvement in neural network upscaling is so rapid year-over-year? The industry need only implement DLSS effectively across a majority of upcoming titles for Nvidia to tack on a performance improvement of some 60 percent without ever touching its silicon.


Still, there's certainly something to be said for DLSS 2.0 and the team responsible for it at Nvidia. After a limited initial run with DLSS, I thought it surely a feature destined to be marked as tried and tested, then left to wither on the vine. Instead, DLSS 2.0 is an effective and immediate gateway to a generous performance bump with little to no visual impact. And its growing availability across a number of graphics cards and games could signal a groundswell in AI applications across gaming yet to come.


I looked at each technology, where relevant, across 4K gaming results that were both natively rendered (that is, at actual 4K) and scaled up from various resolutions below that. I've assembled a trove of images below (they're hosted in sliders; click the arrows!) that show the same scene in various games (support-dependent), processed by each individual sharpener.
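For context on what "resolutions below that" typically means in practice, here's a rough sketch using the commonly cited per-axis scale factors for DLSS 2.x's quality modes applied to a 3840x2160 output - treat the Balanced factor as approximate, and note that other upscalers use their own presets:

```python
# Rough context only: commonly cited per-axis scale factors for DLSS 2.x quality
# modes, applied to a 4K (3840x2160) output. The Balanced factor is approximate,
# and other upscalers/sharpeners use their own presets.
OUTPUT_W, OUTPUT_H = 3840, 2160
SCALE_FACTORS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,            # approximate
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

for mode, s in SCALE_FACTORS.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    print(f"{mode:>17}: {w}x{h} internal render -> {OUTPUT_W}x{OUTPUT_H} output")
```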


Because we had to use two different cards (and multiple games) to test these features to their full extent, this won't be so much a "performance" comparison as strictly a look at the output image quality of each technology. We chose the game Control to show off what DLSS and the three sharpener technologies can do, while compatibility issues forced us into testing with The Riftbreaker and Godfall for AMD's FSR. (Control does not support FSR.)


Only part of what Freestyle does is sharpening; it also lets you apply filters to your game to change the overall look. Nvidia hasn't shared much about how the technology behind Freestyle works, stating simply that it's an "image post-processing tool" that sharpens the edges of objects in your games. (See our guide to running and using Freestyle.)
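Nvidia hasn't documented how Freestyle's sharpening actually works, so purely as an illustration of what a generic edge-sharpening post-process does, here's a minimal unsharp-mask sketch. It boosts each pixel by its difference from a blurred copy - the textbook approach, not necessarily Freestyle's:

```python
# Illustrative only: a generic unsharp-mask sharpener, the textbook form of an
# edge-sharpening post-process. Not Nvidia's (undocumented) Freestyle algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, radius: float = 1.5, amount: float = 0.8) -> np.ndarray:
    """Sharpen a single-channel image with values in [0, 1]."""
    blurred = gaussian_filter(image, sigma=radius)        # low-pass copy
    detail = image - blurred                              # high-frequency edges
    return np.clip(image + amount * detail, 0.0, 1.0)     # boost edges, keep range
```

Raising the amount strengthens edges but also amplifies noise and ringing, which is why in-game sharpeners tend to expose it as a user-adjustable slider.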


The thing about PC ports, of course, is that they have to work on a wide range of machines. As of press time, the Spider-Man version we tested doesn't necessarily surpass the mix of stability and impressive technical performance that developer Insomniac delivered on dated PlayStation 4 architecture.


Has there been another game with a development cycle longer than either of the Prey games? I've personally been waiting a very long time for this one, and at some point even gave up. But Prey has finally arrived. Powered by CryEngine, we should expect a reasonably demanding game that will use as many resources as you can throw at it. That's what we're here to find out today: how does it perform on PC?

