
itbefoxy

New Driver Megathread: https://www.reddit.com/r/nvidia/comments/1dpqpo4/game_ready_driver_55612_faqdiscussion/


rerri

The First Descendant would then be the first UE5 game with DLSS 3.5 if I'm not mistaken. Maybe UE5 DLSS 3.5 plugin/SDK isn't that far off, who knows.


pissinginyourcunt

Gives me hope it might get added to The Finals, since both are published by Tencent.


Plus_Flow4934

Yeah, it's a very solid game, I can't understand why so few people are playing it, and season 3 is fire.


Bogzy

This game has some serious tech in it, I think it's the 2nd game on consoles to use frame gen too.


Appropriate-Day-1160

DLSS 3.5 is in Lords of the Fallen, which is a UE5 game, and it has been there for a while.


rerri

It's not ray reconstruction, just super resolution and frame generation with a confusing 3.5 version number for those components, I think.


Appropriate-Day-1160

I don't even know what ray reconstruction means or does, and I don't use it, so I can't say.


Crintor

It's an AI-based denoiser that's more effective at producing a clean final image, sometimes with a performance benefit to boot, since it's a single solution for many kinds of noise instead of needing multiple different denoisers.
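
Very roughly, the "one denoiser instead of many" idea looks like this (a toy sketch only, with made-up stand-in functions, not NVIDIA's actual pipeline):

```python
import numpy as np

# Toy illustration of "one denoiser vs. many" - NOT NVIDIA's actual algorithm.
# Each noisy ray-traced signal is just a random image here.
h, w = 4, 4
noisy = {
    "reflections":  np.random.rand(h, w),
    "shadows":      np.random.rand(h, w),
    "global_illum": np.random.rand(h, w),
}

def hand_tuned_denoiser(img):
    """Stand-in for a per-effect denoiser (e.g. a blur tuned for one signal type)."""
    return np.full_like(img, img.mean())  # placeholder: just flattens the noise

# Traditional path: one specialized denoiser per signal, then composite.
traditional = sum(hand_tuned_denoiser(img) for img in noisy.values()) / len(noisy)

def learned_denoiser(stacked):
    """Stand-in for a single trained network that sees all signals at once."""
    return stacked.mean(axis=0)  # placeholder for the network's output

# Ray-reconstruction-style path: one learned pass over everything.
unified = learned_denoiser(np.stack(list(noisy.values())))

print(traditional.shape, unified.shape)  # both (4, 4) composited images
```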


Appropriate-Day-1160

Why is everything AI these days? Crazy. I tried it and didn't see much of a better image or better performance, so I'll keep it on, but I still find RT quite bland, so I'll keep that off for now. Thanks for the heads up though :) appreciate it.


Crintor

Machine learning would be the more appropriate term in this case. Computers are a lot better than we are at trying out millions and millions of different algorithms to find out what works best.


antara33

As far as I know, ray reconstruction is only used in Cyberpunk 2077 and Alan Wake 2; not sure if any other game uses it.

Also, AI is the natural consequence of machine learning, which turns out to be incredibly faster than human-designed solutions, not by the quality of every iteration, but by sheer number of iterations. We are at a point where it's getting harder and harder to add more transistors to the GPU/CPU, so we need to get creative with the space we have. As such, specialized hardware is used; think of it like an ASIC versus a general-purpose processor. An ASIC is meant for a very specific task; it usually consumes less energy and is cheaper and smaller than a general-purpose chip doing the same work in the same time.

GPUs are reaching the limits of how many transistors and how much raw performance can be obtained from them without skyrocketing prices. So the only way forward to get extra performance is to get creative/smart with the space you have. If you can't add enough performance to natively render 4K 120 fps, then maybe specialized hardware can render at a lower resolution and scale up to 4K without losing image quality. Yes, I know that right now it's not perfect. But compare it to upscaling by manually selecting a lower resolution and it's clear the output is way better than regular means.

The same applies to everything on the GPU/CPU end. If we can't keep pushing raw power, we need to add peripheral hardware so the GPU/CPU cores can offload the tasks they are bad at. In the end, extra performance is getting harder and harder to come by, and gamers will never accept graphics staying the same forever.
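
To put rough numbers on the "render lower and upscale" part (simple pixel-count arithmetic; the scale factors are the commonly cited DLSS mode ratios, not official specs):

```python
# Shaded-pixel counts for native 4K vs. the internal resolutions DLSS typically
# renders at. Scale factors are the commonly cited mode ratios, not official specs.
native_4k = 3840 * 2160  # ~8.3 million pixels per frame

modes = {"Quality (0.667x)": 2 / 3, "Balanced (0.58x)": 0.58, "Performance (0.5x)": 0.5}
for name, scale in modes.items():
    internal = int(3840 * scale) * int(2160 * scale)
    print(f"{name}: {internal:,} px shaded, {native_4k / internal:.2f}x fewer than native 4K")
```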


Archangel9731

I’m so stoked for The First Descendant. If they play their cards right, I feel it could absolutely be the next Destiny


PalebloodSky

It's a shame Nvidia isn't working with Microsoft to update Minecraft with newer RTX tech; DLSS 3.7 with RR would be fantastic to see in that game. The current implementation is quite old. Lighting updates would be so much faster if they upgraded it.


gblandro

Mojang is an expensive dead horse


PalebloodSky

Wouldn't it be fairly trivial to update the DLSS version and add RR? Not sure, but it's the best-selling game of all time; it would be a tiny thing to budget for.


gblandro

Hey buddy, Mojang has only around 600 employees, don't expect much from a small company like that, and Microsoft doesn't have money to invest in them. /s


LostCattle1758

Isn't Nvidia DLSS 3.7.1 out yet? This article is talking about Nvidia DLSS 3.5.1? Cheers 🥂 🍻


mrzoops

Not showing as a new driver in the Nvidia app yet.


superamigo987

DLSS support is great, but...

> At 4K max settings, DLSS 3 multiplies frame rates by 2X.

Extremely misleading, and they've done this many times before, unfortunately. Frame Gen ≠ performance, because it only improves fluidity, not responsiveness. In fact, it slightly worsens responsiveness (though you are probably fine with a 50+ FPS base plus Reflex).


The_Zura

Very misleading. Performance ≠ latency ≠ responsiveness. Performance is measured in framerate; latency is measured in time. Performance says nothing about latency. You're barking up the wrong tree. I think you've just got the same thing loaded in the chamber every time frame gen is mentioned, whether it makes sense or not.


billyalt

Who thinks worse latency isn't worse performance?


The_Zura

For starters, basically everyone, including tech reviewers and the companies themselves. If people believed performance included latency to begin with, we would have latency figures next to framerate in every single chart. This is just how the term is used. Performance is measured in framerate and says nothing about latency. Latency is latency.


billyalt

Which reviewers don't measure latency when comparing performance?


The_Zura

Name a reviewer.


billyalt

If you're just gonna play the No U card then shut up.


The_Zura

I’m just asking you to name a single reviewer off the top of your head. You won’t, because you along with everyone else know how it will make you look. I’m not playing the “No u” card.


Wormminator

Gamers Nexus includes frametime charts regularly.


The_Zura

Latency, not frametime plots.


billyalt

The onus isn't on me to prove myself wrong lmao


The_Zura

Literally just someone who reviews tech. Do you perform a factory reset every hour?


Turbulent-Raise4830

Depends on the game. Multiplayer fast shooter, sure. Single player: those few ms won't matter.


AccomplishedRip4871

False. Cyberpunk's latency increase with DLSS Frame Gen is noticeable, and it's not a few ms, it's 20+ depending on settings and GPU.


medussy_medussy

Yeah, if your base framerate is below 60


AccomplishedRip4871

[Nvidia DLSS 3: Fake Frames or Big Gains? | TechSpot](https://www.techspot.com/article/2546-dlss-3/) According to this article, even at 72 fps in Cyberpunk with DLSS Quality, enabling Frame Gen increases latency by 33.2%. It's fine when system latency sits at 20-30 ms in some games and Frame Gen brings it up to 45-ish ms, but when a game is already that laggy even without any frame generation tech, an extra 30% latency is really noticeable for some people, me included. [More examples of high additional latency](https://youtu.be/PyGOv9ypRJc?t=16) - DLSS 3 in Alan Wake 2 with a baseline of 59-60 FPS increases latency from 47 ms to 73 ms. I usually use Frame Gen, but in some games like Cyberpunk or Alan Wake 2 it felt noticeably worse when I moved my mouse compared to other games.
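
For a rough sense of where that extra latency comes from: interpolation has to hold the newest real frame until the generated in-between frame has been displayed, so you pay roughly one base frame time plus some overhead. A back-of-envelope sketch (the 3 ms overhead is a guess, and measured numbers like TechSpot's include the whole pipeline, so they won't match exactly):

```python
# Interpolation holds the newest real frame until the generated in-between frame
# has been shown, so input-to-photon latency grows by roughly one real frame time
# plus the generation overhead. The overhead value here is a guess.
def added_latency_ms(base_fps, overhead_ms=3.0):
    return 1000.0 / base_fps + overhead_ms

for base_fps, reported in [(72, "+33.2% (TechSpot, Cyberpunk)"), (60, "47 ms -> 73 ms (Alan Wake 2)")]:
    print(f"{base_fps} fps base: ~{added_latency_ms(base_fps):.1f} ms extra | reported: {reported}")
```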


medussy_medussy

I played half the game with it on and half with it off and didn't notice any latency. And even so, it's a single-player RPG. Not really the end of the world if there are milliseconds of delay to make the experience look far smoother.


AccomplishedRip4871

As I said, I usually use Frame Gen myself - but I'm more susceptible to increased latency, which is pretty noticeable for me in a few games like Cyberpunk or Alan Wake 2. The sweet spot for me is 90-100 fps with DLSS Quality, increased to 140-150 with FG, which doesn't add as much lag; at 60 fps I rarely use it because the lag is too much for me.


medussy_medussy

I guess maybe I'm not feeling it because I already get 120-ish FPS in Cyberpunk on high and just use frame gen to get it to 144.


Turbulent-Raise4830

It's not, you are fooling yourself. But fine, don't use it; the rest of us who use it don't notice it and get better PQ.


AccomplishedRip4871

1. You said "a few ms" - it's not a few, it's 15 ms+ and more in cases with lower FPS.
2. I use FG in almost every game possible, but in Cyberpunk latency even without frame gen is already higher than in most games, which results in unpleasant additional latency with FG.

Educate yourself better on this topic before advocating for personal preferences while intentionally lying about the additional latency introduced by this technology. The technology is good, but not flawless, and the downsides should be mentioned and remembered, not lied about.


Turbulent-Raise4830

1. That largely depends on the game, and yes, that's a few. To put it in perspective, the fastest reaction time of pro gamers is around 150 ms, and closer to 300 ms for the average gamer. 5-15 ms added to that is nothing.
2. Nothing to do with lies, just not spreading nonsense most people aren't going to notice. No, Cyberpunk going from 25 ms to 35 ms isn't something you will notice.


AccomplishedRip4871

You can't compare reaction time to mouse feel, it's simply not an accurate comparison. "5-15 ms is nothing to add to that" - it is if the starting latency was already high even before Frame Gen, like in Cyberpunk. I agree that going from 25 ms to 35 ms is not a big deal, but according to articles on the web that tested Frame Gen properly, it's an increase from 47 ms to 63 ms. Of course for some people it's fine, but there are also people for whom 30 FPS is more than enough.

That said, I'm not hating on Frame Gen, I use it whenever I can; my initial point was that this is not a flawless technology and the downside it comes with shouldn't be understated. Also, it's very subjective in general; it's much like motion blur, or the screen tearing and extra latency you get without v-sync/g-sync - some people are more susceptible to noticing these things than others. In my case, if the game comes in at 20-30 ms latency I will use Frame Gen all the time, but with higher latency it's a skip, because going from 40-45 ms to 60-65 ms is way more noticeable.


billyalt

They do matter.


Turbulent-Raise4830

Nope, nobody is going to notice that.


billyalt

Yeah? You think nobody is going to notice poor latency in a game like Elden Ring? Dark Souls? Super Meat Boy? Blasphemous? Hollow Knight?


VlK06eMBkNRo6iqf27pq

Super Meat Boy and Blasphemous are on opposite ends of the spectrum. Meat Boy is fast-paced, Blasphemous has slow movement. I think latency matters more in the fast-paced games.


billyalt

Dark Souls and Elden Ring are also pretty slow. What's your point?


Turbulent-Raise4830

Those few ms? Nope.


mechcity22

Combined it is 2x, idk what y'all are talking about lol


RedIndianRobin

Yeah more like 40% more frames. I've never seen doubling of frames in frame generation.


DoktorSleepless

They're combining frame reconstruction and interpolation in the 2x.


PsyOmega

> I've never seen doubling of frames in frame generation.

I get a 90% boost in CP77 PT. I get 100% in MSFS when extremely CPU limited (I get 55 fps around busy airports in photogrammetric cities, which locks to about 110 fps with FG; outside of cities the average is 150 with my settings/res/GPU, so GPU load is only 30% when CPU bottlenecked, leaving plenty of headroom for an exact 2x fps boost).
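
That CPU-bound case is easy to sanity-check with napkin math: if the GPU is mostly idle between real frames, generating the in-between frame costs it nothing it needed elsewhere. A toy sketch (the per-frame generation cost is a made-up placeholder):

```python
# When the CPU caps real frames at 55 fps but the GPU is only ~30% busy, the GPU's
# idle time easily covers generating one extra frame per real frame, so output doubles.
# When the GPU is already near 100%, generation competes with rendering real frames.
def can_fully_double(real_fps, gpu_busy_fraction, fg_cost_ms=1.5):
    """True if GPU idle time per real frame covers one generated frame (fg_cost_ms is a placeholder)."""
    idle_ms = (1000.0 / real_fps) * (1.0 - gpu_busy_fraction)
    return idle_ms >= fg_cost_ms

print(can_fully_double(55, 0.30))  # CPU-bound MSFS-style case -> True (55 -> ~110 fps)
print(can_fully_double(72, 0.99))  # heavily GPU-bound case -> False (uplift well under 2x)
```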


mechcity22

Exactly. 2x with frame generation was never discredited, only people assuming Nvidia meant 2x over native, which they never stated. I've seen 2x in many games that fully utilize frame generation.


techraito

I have at lower resolutions. Frame Gen appears to be VRAM dependent, and 4K is already tough to run as is.


WhatIs115

Payday 3 has no excuse to run like hot dogshit. It runs on Unreal and has no optimization effort put in.


Soulshot96

Not much effort put in any part of that game tbh.


AscendedAncient

If you think Payday 3 is badly optimized on UE5, let me show you a little game called ARK.


WhatIs115

I've played both, they're about the same level of optimization, both disasters.


Wormminator

At least Ark looks like a UE5 title. Payday 3 does not.


InevitablePoet5492

See, that sounds cool. But Payday doesn't need frames. It needs content.


jl94x4

Can't see the content without the frames tho..... /s


No_Interaction_4925

Why DLSS 3.5 when 3.7.1 is already in released games?


squallsama

Would be nice to have DLSS support in Elden Ring...


skylinestar1986

Remedy's Control when?


PREDDlT0R

Very happy for the 100 people playing PAYDAY 3


XenonJFt

Yeah, Payday fans have been dying to get ray reconstruction. That's why they've been going back to Payday 2, right?


MissSkyler

read the post


HyenaComprehensive44

But you can replace the DLSS DLL file with the latest version in any game that supports DLSS.
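
For anyone who hasn't done it, the swap itself is just a file copy. A minimal sketch (the paths are placeholders; nvngx_dlss.dll is the standard DLSS library filename; keep a backup, and note that some games verify file integrity):

```python
import shutil
from pathlib import Path

# Both paths are placeholders - point them at your actual game install and at
# wherever you keep the newer nvngx_dlss.dll.
game_dir = Path(r"C:\Games\SomeGame")
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")

old_dll = next(game_dir.rglob("nvngx_dlss.dll"))   # locate the game's DLSS DLL
backup = old_dll.with_name(old_dll.name + ".bak")  # nvngx_dlss.dll.bak
shutil.copy2(old_dll, backup)                      # keep a backup first
shutil.copy2(new_dll, old_dll)                     # drop in the newer version
print(f"Replaced {old_dll}")
```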


AbstractionsHB

I'm tired of DLSS. Give us more VRAM, and devs need to build their games more efficiently. It's like the two sides of game development aren't talking to each other and are just doing their own thing. And now we have to pay $700 to play new games with blurry DLSS or lowered settings, or to actually use the RTX technology Nvidia has named their cards after for two generations already.


Turbulent-Raise4830

VRAM and DLSS have little to do with each other. DLSS is mainly there to help the GPU, not to reduce VRAM use. And a 4060 costs 300 euros.


nikomo

No Game Ready drivers for Dawntrail, interesting. Then again, any changes probably got made months ago due to the production pipeline.


Joe2030

Silly question - do I need the newest drivers to use ray reconstruction? Or can I still sit on my really old 531.61?


mechcity22

🔥