NVIDIA DLDSR: “super resolution” assisted by artificial intelligence

A super-resolution technique reserved for NVIDIA graphics cards with Tensor cores.

Graphics card availability may still be erratic, but that does not stop the market's two heavyweights from trading blow for blow. Less than a week after AMD announced the integration of FidelityFX Super Resolution into its graphics drivers, NVIDIA is making a move of its own.

Dynamic Super Resolution…

For months now, NVIDIA has been talking up its Deep Learning Super Sampling (DLSS). The idea is to render the image at a resolution lower than the one actually displayed on screen, in order to lighten the load on the graphics card.

Performance obviously improves, but to keep the image from losing too much detail, NVIDIA relies on a clever technique involving artificial intelligence. The AI fills in the missing information as efficiently as possible, and the result today is very close to native-resolution rendering.

Deep Learning Dynamic Super Resolution is somewhat the same idea, but in reverse. NVIDIA has long offered a Dynamic Super Resolution (DSR) option in its drivers. The principle is simple: the image is rendered at a resolution higher than the one actually displayed.

As you can imagine, this produces a noticeably sharper image and makes it possible to do without anti-aliasing altogether. The problem is that it is monstrously demanding.

… assisted by artificial intelligence

This is where DLDSR comes in, relying, of course, on artificial intelligence once again. This time NVIDIA brings the AI in to handle the downsampling as effectively as possible, with the obvious goal of making DSR far less demanding on the PC.

NVIDIA DLDSR © NVIDIA

Until now, the NVIDIA drivers let you set DSR to 4x, which means multiplying the number of pixels rendered by four: at 2560 x 1440, the card has to render 5120 x 2880. The performance hit is guaranteed. DLDSR, by contrast, relies on AI to render only a 2.25x factor while keeping equivalent image quality, according to NVIDIA at least. That claim will need to be verified.
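To make those factors concrete, here is a small illustrative sketch (not NVIDIA code, just the arithmetic behind the numbers above): DSR and DLDSR factors apply to the total pixel count, so each axis is scaled by the square root of the factor.

```python
import math

def render_resolution(display_w: int, display_h: int, factor: float) -> tuple[int, int]:
    # The DSR/DLDSR factor multiplies the total pixel count,
    # so each dimension is scaled by sqrt(factor).
    scale = math.sqrt(factor)
    return round(display_w * scale), round(display_h * scale)

# 4x DSR on a 1440p display: four times the pixels, twice each dimension.
print(render_resolution(2560, 1440, 4.0))    # (5120, 2880)

# 2.25x DLDSR on the same display: 1.5x each dimension.
print(render_resolution(2560, 1440, 2.25))   # (3840, 2160)
```

In other words, DLDSR at 2.25x asks the GPU to render a little over half as many pixels as DSR at 4x, which is where the claimed performance saving comes from.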

As usual, Deep Learning at NVIDIA means calling on the famous Tensor cores of the RTX cards: DLDSR will therefore be reserved for GeForce RTX GPUs, and we can assume the RTX 3000 series will handle it more efficiently. NVIDIA does stress, however, that the feature should be compatible "with most games", without studios having to do any integration work, unlike DLSS.

NVIDIA DLDSR is not yet available; it should arrive on January 14 with the new version of the GeForce drivers the company is set to release alongside God of War on PC, a game that will itself support DLSS, which is not bowing out for all that.

Source: NVIDIA
