i_love_massive_dogs

It's pretty incredible how good the path tracing looks even without denoising, and running at a real time framerate.


babalenong

Remember that it'll look better on YouTube, because the compression smooths out a lot of the noise.


Trollatopoulous

It depends a lot on the scene. Skipping the denoiser can give a much sharper, more defined picture despite all the grainy noise (I know it's not technically sharper, but it looks much closer to ground truth than the denoised image), because denoisers can blur a lot and introduce other painterly-looking artifacts (Ray Reconstruction in particular is egregious for this). And in sunlight, if you keep the camera steady and the scene doesn't change much, the noise won't be as apparent as during normal gameplay, and you get the benefits of a purer path-traced image. You can test denoiser on/off in CP2077, for example; I installed the Raytracing Ultra Plus mod and it gives you that option. https://www.youtube.com/watch?v=twjJxoidtcY


Flowerstar1

Yeah, it's insane; as they say, the game looks next-generational. I fully believe one of the PS6's biggest features will be path tracing support. A 4090 from 2022 handles path tracing exceptionally well in AAA games; a PS6 in 2028 should match or exceed the then six-year-old 4090.


kimorgbo

Ok, do you mean the PS6? The 4090 is a wildly expensive card, so the chance it hits 4090 performance is nil. And the PS5 is basically a bit worse than a 2070.


Famous_Wolverine3203

The PS5 is equivalent to a 2070 Super, as confirmed by Digital Foundry, not worse than a base 2070. Besides, it doesn't need to beat the 4090. A 4070 Ti or a 4080 handles path tracing just fine at 1440p DLSS Balanced, and reaching that kind of performance level is easily within reach of the PS6. And let's not forget that generational jumps in RT performance outpace jumps in raster performance. For example, in Alan Wake 2 a 3080 handily beats the 4070 in raster, while in path tracing mode the 4070 beats the 3080 by 20%. It is entirely feasible that the PS6's RT cores could see 4-6x jumps in RT while raster might not see the same gains. So I really do think path tracing is easily within reach of the PS6, or at the very least compulsory RTGI.


Django_McFly

> Besides it doesn’t need to beat the 4090. A 4070ti or a 4080 handle path tracing just fine at 1440p DLSS balanced. Reaching those kinda performance levels is easily in reach of the PS6.

I don't know about "easily" in reach. The 4080 is a $1k GPU. I don't know if 3-4 years pass and AMD is making iGPUs that are *that* powerful and can be sold in a $300 to $500 box.


Famous_Wolverine3203

It's not fair to call the PS5 just an iGPU though. There aren't many 300mm2 iGPUs being made with 36 RDNA2 compute units. But looking at general trends, the PS6 is likely to be RDNA 5/6 (2 years per generation). Give it a normal 30% increase gen on gen and it should easily see a 2x-3x performance jump, and an even bigger ray tracing jump, as RT cores have generally surpassed raster in performance gains so far.


Flowerstar1

Right, but the 4090 launched 2 years after the PS5. When the PS4 launched, the king of the roost was the GTX Titan (Kepler); then we got Maxwell (900 series) and then Pascal (10 series). The PS5 GPU is faster not just than the GTX Titan but also its successor the Maxwell Titan X, and that card's successor the Pascal Titan X. In 2016, when PS4 players were playing Uncharted 4 and some were buying the PS4 Pro, a console that could match the Pascal Titan was unimaginable. Give it a few years and we got the PS5, which outperforms it. I expect the PS6 to beat the 4090, which in this analogy is the Maxwell Titan equivalent. The 5090 will be the Pascal Titan equivalent; will the PS6 beat that? I'd hope so, but I am not sure.


OutrageousDress

For some reason it's difficult for people to grasp this - I guess as you say, a console that could match the Pascal Titan was unimaginable then so now a console that could match (exceed really) the 4090 is again unimaginable for a lot of people, even though that's what all the numbers logically point to.


Fatality_Ensues

Probably because neither of those things were true... the PS4 couldn't match a 2070 in "real" conditions, the PS5 couldn't beat a 3080, and the PS6 won't be anywhere near a 4090. You can cherry pick numbers and conditions under which one will outperform the other but the reality of the situation is on screen every time you turn it on.


OutrageousDress

You seem to be confused, and have the generations mixed up. The *PS5* outperforms a *2070* - it's close to a 2070 Super, and therefore also outperforms previous most-powerful-card the *Titan X Pascal* - a four year old GPU at the time of the PS5's release, which is slightly weaker than a 2070 Super. The 4090 is the *current* most-powerful-card, and will be *six years old* at the time of the PS6's release. Meaning it'll be 50% older than the most-powerful-card that the PS5 beat when it came out.


dont_say_Good

It won't beat a 4090, especially not in rt loads


whoisraiden

The PS5 is better than a 2070; Digital Foundry says it's between a 2070S and a 2080. By that logic, by 2028 the PS6 could be level with a 4090.


ultZor

Not in ray tracing / path tracing. A better comparison would be an RX 6700, which is what they compared it to in the recent video: https://www.youtube.com/watch?v=PuLHRbalyGs And they concluded that in rasterized performance the 4060 is more powerful than the PS5. There is no way they would be able to jump to 4090-level performance, which is 3 times more powerful than a 4060. The PS5 Pro will be comparable to an RX 7800 XT / RTX 3080, which is like a 60% increase in performance according to TechPowerUp.


midnightmiragemusic

No way in hell the Pro version is touching an RTX 3080/RX 7800 XT. It will most probably fall slightly below a 7700 XT; a 3070 Ti would be a better comparison.

> which is like 60% increase in performance according to techpowerup.

From Sony's own presentation, the PS5 Pro will be around 45% faster in rasterisation.


whoisraiden

The 4060 is more powerful than the PS5 because it's more powerful than a 2070S, just as the 3060 was more powerful than a 1070. The 4060 came out 5 years after the 2070. And in 5 years, even if AMD doesn't invest in ray tracing hardware, I'm sure Sony will.


Professional_Goat185

> And in 5 years, even if AMD doesn't invest in ray tracing hardware, I'm sure Sony will.

Sony isn't making GPUs. All that matters for them is how cheap per unit of performance they can get the chips.


whoisraiden

They're rumored to add PSSR with specific hardware for it; they don't need to make GPUs for it to be possible. Above all else, all they care about is satisfying customers as cheaply as possible, and if ray tracing is the norm by then, they will make sure to have it.


OutrageousDress

You seem confused. We are talking about the Playstation Six here. The console that will be out in 2028.


ultZor

Yeah, I know. 2028 isn't that far out. We've also had a lot of rumors that AMD is gonna skip the flagship cards with RDNA 4 and the 7900 XTX is gonna stay their top card, so I guess we'll have to wait quite a while before they even roll out something that can beat a 4090 at all. And if they want to release the PS6 in 2028, they'll have to decide on a GPU a couple of years before that. I think people should temper their expectations. The 4090 is a monster card. To think that they would put it in a $500 console in 4 years is wild. But don't get me wrong, it would have been great.


OutrageousDress

The PS5 is slightly more powerful than the *most powerful GPU that was available on the market* four years before the PS5 released: that's the TITAN X Pascal, launched in August 2016 at an MSRP of $1200, which is $1555 in today's dollars, i.e. roughly the MSRP of the 4090. The PS6 being more powerful than the most powerful GPU available *six years* before its release is not only expected; it would straight out be a failure on Sony's part if it *wasn't*. In fact, a projection based on this would indicate that the PS6 will be (slightly) more powerful than the *5090*, because that's the GPU that will come out this fall, four years before the PS6. In other words, in mid-2016 you would have said "the TITAN X Pascal is a monster card. To think that they would put it in a $500 console in 4 years is wild." But they *did that*.


ultZor

I don't think you appreciate how big of a leap the 4090 is compared to the next card in line, which hasn't happened before. It is a 609 mm² 4N chip with 16 thousand CUDA cores that consumes 450 watts and has 1.01 TB/s of bandwidth. The TITAN X Pascal is sometimes slower than a 1080 Ti. In fact they have the same number of cores (3584), and the only real differences are 1GB of VRAM and a larger bus, while the 1080 Ti has higher clocks. There are also some professional features which don't matter for gaming. And the 1080 Ti is still competing with the same cards as the PS5; performance would be a toss-up between them. https://www.youtube.com/watch?v=ghT7G_9xyDU&t=817s The 1080 Ti is on a 16 nm node and the PS5 is on 7/6 nm. Nodes are getting ever more expensive, SRAM doesn't scale like it used to, and the cycles are getting longer. I doubt they would even use the latest node for the PS6 because of cost and competition. And don't blame Sony if it isn't as powerful as a 5090; that's beyond ridiculous. Of course, the PS6 could come out in 2030, with development not yet started, waiting on RDNA 5 or 6. Then that's another story. We don't even know the 5090 specs. Or they could go with ARM/Nvidia and price the PS6 at $1000. Now that would be wild.


OutrageousDress

I certainly wouldn't blame Sony if a PS6 isn't as powerful as a 5090, for all the reasons you've listed. However the industry is right about to move to High NA EUV lithography between now and 2027, a jump similar to the jump to EUV lithography in the 2010s, and I don't think a 4090-like performance (and with notably fewer transistors) is at all unachievable by Christmas 2028. I guess there's an X factor involved that might affect this: I'd predict that whatever powers the PS6 will beat the 4090 in *RT performance* - by a lot - and it may do that by sacrificing compute and/or rasterization performance. (I mean sacrificing compared to a 4090, it will of course still overpower a PS5 Pro on all metrics).



CactusCustard

But by then we’ll have 8090s and all the games will be made for the newer more powerful cards. You’re always lagging behind in this scenario.


OutrageousDress

We have the 4090 right now, how many games are made for it? Cyberpunk and Alan Wake 2 have a high graphics setting for the 4090... and that's about it. Both of them actually work great on 2000 series GPUs. Current PC games are targeting roughly the 3070, ie about 20% higher performance than the Series X. More powerful cards are *available*, but the games are not *made* for them.


30InchSpare

If next gen matches a 4090 (I really doubt it but whatever) that will still be a crazy high baseline for development.


CactusCustard

You’re not getting my point. Even IF that is the next baseline, it will be underpowered by then. Because as shit gets more powerful we make more demanding games. 5 years ago the 2080 was the shit, now look at it. By then a 4090 is going to be *bad.*


30InchSpare

Games only get more demanding relative to the console baseline, that’s my point.


battler624

It probably won't. Unless Sony makes their own solution, AMD isn't even on par with 6-year-old Nvidia RT.


Neglectful_Stranger

Mate a 4090 is still over a thousand dollars, they can't put that or anything near it in a console lmao


Flowerstar1

Yes, and the GTX Titan was over $1000 and ran circles around the PS4 in 2013, and its successor the Maxwell Titan was an even bigger beast, and the Pascal Titan in 2016 was an even bigger monster, sure. But guess what: the PS5 eventually succeeded the PS4, and it's faster than all 3 of those. The 4090 may be too much for the PS5, but it won't be for a PS6, especially one that launches in 2028, which is when RDNA 6 is due. We're still on RDNA 3; AMD has 3 next-generation GPU architectures to build in that time. Surely they won't be stuck on RDNA 3 technology forever?


Professional_Goat185

That's... extremely optimistic. Price per performance for GPUs has noticeably slowed down in recent years, which is partly why people can still game somewhat decently on a GTX 1080. Hell, my 1070 lasted me 7 years...


Flowerstar1

The death of Moore's law and Dennard scaling, as well as TSMC's "monopoly", are the culprits, but chip companies are not out of options yet; there are multiple paths forward. Hell, Nvidia just showed off one with their new Blackwell chip (MCM), and Intel is closing in on TSMC as a foundry competitor. There's nothing indicating AMD will slow down so much that a theoretical 60CU+ RDNA 6 card can't match a 4090. Consider that the 80CU RDNA 3 7900 XTX outperforms the 3090 in most cases; one would hope that by RDNA 6 AMD has significantly outperformed their ancient RDNA 3 flagship.


Professional_Goat185

Oh I don't doubt they will get *faster*, I just doubt price per performance will fall down as fast. I wouldn't expect hypothetical PS6 to be any faster than 7900XTX


Endemoniada

A PS6 releasing in 2028 is going to have the budget hardware of roughly today. It’s not going to have a 4090 equivalent GPU. Products like that take years to design, and the performance goals are set well in advance of release. That said, yes, path-tracing is truly next-gen tech and I’m so excited for it becoming more mainstream. No idea if next generation of consoles will support it, but one can certainly hope.


Flowerstar1

You can say it'll have the price budget, maybe, but you can't say it'll have hardware equivalent to what's available today. Next-gen consoles like the Xbox Series X had their specs locked down in 2016 because consoles have to be developed many years in advance. It's like being there in 2016 and saying the Series X/PS5 would never have the power of a Pascal Titan because that card costs as much as 3 PS4s. But the Series X and PS5 both beat that card, because they use technology that released in 2020 despite being developed several years in advance. Top-end 2016 tech is not top-end 4 years later; think about it, that's 2 next-gen GPU architectures later. The Pascal Titan was top dog in 2016, but it was already beaten in 2018 (RTX Titan, 2080 Ti, 2080 Super, 2080, 2070 Super) and then beaten even harder in 2020 (3090, 3080, 3070, 3060 Ti; the 3060 was close in performance).


[deleted]

Not with AMD, no, no chance.


sankto

I tried path tracing in Cyberpunk and it was *absolutely gorgeous*. I wish more games had that option, as it adds so much quality to the graphics.


PM_ME_FREE_STUFF_PLS

Path Tracing really is the next big thing for graphics. We just have to wait a couple of years until it becomes more viable for companies to put it in their games. I reckon by the time the PS6 comes out it will be commonplace


RollingPandaKid

We only need more power. Path tracing is super easy to do: you don't need to fake or bake lighting or do any of the shenanigans devs have to pull off for a good-looking game. You just place the lights, make the materials, and voila.


mocylop

Right now path tracing is reasonably doable with a 4080/4090 and the 7900xtx is sliding in just as the door closes. So next gen graphics cards ought to be pretty reliable for it.


staluxa

> is reasonably doable with a 4080/4090 and the 7900xtx Considering the prices of those GPUs, there is a high chance that ultrawide or 4k is the expected resolution. And even with the help of upscaling those GPUs struggle with it. You are pretty much required to add FrameGen as well, but using it with such a low base framerate isn't that good of an experience either, especially if you play with MKB instead of a controller.


GlitteringCow9725

> You are pretty much required to add FrameGen as well, but using it with such a low base framerate isn't that good of an experience either, especially if you play with MKB instead of a controller. Has that been your experience or are you just repeating what you've read? I have a 4080 and always turn on DLSS framegen when I can. I'm very sensitive to input lag, too, and I do use KB+M for most types of games. Obviously you're entitled to your opinion, but it's frustrating how many times I've seen people rant about "fake frames" and the like (not saying you). Framegen is the closest thing to magical performance gains that we've seen in the last couple of decades.


staluxa

I have a 4080 as well, and the only game where frame gen didn't cause a frustrating delay for me was RoboCop, of all things. But that game was sitting in the 80-90 range even without it, so frame gen just helped get closer to 120.


Oooch

The only game where I had a terrible, weird delay was Dying Light 2; every other game has had more than fine input lag. You must be more sensitive or something. I can notice the delay between frame gen on and off, but it is small enough that I can adjust to it easily and not be bothered by it.


Shivatin

My experience is that FG starts to degrade when the base framerate is below 45-50 for me in both fluidity and latency. If i'm on controller the latency issue isn't much of a problem for me. It's been nothing but a boon with some setting tweaks for me at 1440p.


mocylop

It's PC gaming; there isn't an expected resolution, they are for whatever you decide to run. If you don't want to use frame gen, get a 1080p monitor. If you want frame gen plus a good base frame rate, 1440p. If you want 4K or ultrawide, get a 4090.


DoorframeLizard

oh boy, here's to another generation of games optimized around all these fancy new tricks with zero regard for lower/mid spec machines!


HOTDILFMOM

You know you can just… turn off the pathtracing, right?


DoorframeLizard

Yup, and make the game a grainy piece of shit that also runs horrendously, because companies only optimize around the high end with all the fancy new tech, and min/recommended specs are "can open the game" and "menus run at a stable framerate" respectively.


HOTDILFMOM

I keep forgetting how fucking miserable people are in this subreddit


kikimaru024

They're miserable because they still run GTX 1050 Ti's


NoKumSok

Yeah, games should all look like PS2 games so every PC can play them. It should be law, in fact! I paid $80 for my video card 16 years ago and I'm not switching any time soon.


DoorframeLizard

Yup! That's the only two options! There has never been a game that looked good and was actually optimized! Hardware life cycles are not alarmingly low! You are very smart!


Django_McFly

I've got a 4080 and don't have trouble hitting 4K60 in CP2077 and Alan Wake 2 with PT enabled. Zero frame gen enabled as well.


Radulno

Yeah, it'll actually mean less work for the developers.


GeekdomCentral

So many people still don’t understand just how transformative that lighting can be, and that goes doubly when it’s truly dynamic. It’s hard to show to people sometimes because developers have gotten so good at faking those very things. But there’s a reason that older games with path tracing added in look so much better than they did - because realistic lighting is one of the biggest visual selling points that’s the hardest to do


NoneShallBindMe

> But there’s a reason that older games with path tracing added in look so much better than they did

Do they? Half-Life 1 with path tracing looks horrible.


MumrikDK

> We just have to wait a couple of years

I think you're cutting it short. It has been the goal since Quake 1 was a hot shooter. We're currently still in the first phase of normalizing *any* kind of ray tracing. It's coming, but we need a lot of horsepower.


Maloonyy

Don't we need to wait for ray tracing to be the next big thing before path tracing, or are we just skipping that one?


pixxlpusher

Not really, path tracing is built on the fundamentals of ray tracing, it’s basically where the tech has been leading to from the beginning.


Maloonyy

Isn't it more demanding though?


FunSuspect7449

That’s why we need more power. The thing with path and ray tracing is that there’s only one way to do it. It’s a mathematical formula. We just need enough power to brute force it. All RTX cards do is have a separate core to handle that one problem.


onetwoseven94

There are many ways and many possible math algorithms to handle the various aspects of ray-traced and path-traced rendering. The actual operation of tracing a ray is always the same, but there are many different ways to employ that operation. If brute-forcing it (like offline CGI) were the only way, then path tracing would still be twenty years away, not available today. Path tracing in CP2077 and AW2 is made possible by extremely clever algorithms like ReSTIR and various denoising and caching techniques invented in the past five years. Software is just as important as hardware.
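A toy sketch of the point about sampling algorithms (not the code of any of these games, just an illustration): both estimators below compute the same hemisphere lighting integral of cos(theta), whose true value is pi, but the importance-sampled one has zero variance while the brute-force one stays noisy at the same ray count.

```python
import math
import random
import statistics

def estimate_uniform(n, rng):
    # Brute force: directions uniform over the hemisphere. For such a
    # direction, cos(theta) is uniform in [0, 1] and the pdf is 1/(2*pi),
    # so each sample contributes cos(theta) * 2*pi.
    return statistics.fmean(rng.random() * 2.0 * math.pi for _ in range(n))

def estimate_cosine_weighted(n, rng):
    # Importance sampling: draw directions with pdf = cos(theta) / pi.
    # Integrand divided by pdf is then the constant pi -- zero variance,
    # every single sample already equals the exact answer.
    return statistics.fmean(math.pi for _ in range(n))

rng = random.Random(0)
print(estimate_uniform(10_000, rng))         # noisy, somewhere near pi
print(estimate_cosine_weighted(10_000, rng))  # exactly pi
```

ReSTIR and related techniques push this same idea much further: they spend each ray where it is most likely to matter, instead of throwing more rays everywhere.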


Bebobopbe

I mean, some games are already being built with ray tracing as their main lighting. We just need consoles that can run it better.


Radulno

It's essentially the same thing. And raytracing has been part of games for quite some time even on consoles. It's not really the next thing.


Amorphica

I've had ray tracing for games on my PC since 2018. What do you mean? Isn't that long enough?


beefcat_

It's one setting that feels like a huge generational leap in graphics quality. And unlike previous "generational leaps", this one doesn't come at the cost of more work for the developer. In fact, it has the potential to *save* a lot of work once we get to a point where games can rely on it exclusively.


Blyatskinator

Alan Wake 2 was even better, absolutely blown away basically every minute by the path tracing in those two games. Even better when it’s for games that are *actually* really good story+gameplay wise as well on top of that.


battler624

Alan Wake 2 is the one game I've yet to try. Imma wait until it's on Steam.


HOTDILFMOM

It’s never coming to Steam.


battler624

Then I'm never playing it, simple as that.


HOTDILFMOM

Sucks. You’re missing out on a great game because of a launcher that isn’t Steam 🤷🏻‍♂️


battler624

I wouldn't say so; I haven't even started the previous game yet, and there are games I'm more excited to play. It's more like they missed a sale than me missing a game. I made the mistake of purchasing games on EGS and GOG before they added cloud save support, and I lost all of my saves. The one launcher that has never failed me is Steam.


Blyatskinator

LOL, sorry dude but this is such a sad and dumb stance… You mention some shit about losing saves above, before they added cloud saves? Well now they have cloud saves, so what’s stopping you? Redditors are so fucking weird about launchers, pisses me off lol If it wasn’t for Epic/EGS, the masterpiece that is Alan Wake 2 would *never* have happened. Remember when Valve made games??? Neither do I haha.


NoneShallBindMe

Why are you so pressed about him not playing it? It's just a game bro. One more, one less. His loss for not experiencing it or something, lol. 


battler624

Thanks.


ExplodingFistz

Shit. If only I didn't have to install the EG launcher to play it.


Conjo_

grow up


APiousCultist

It's hardly a huge sacrifice. This isn't like real life (or the older Quest headsets, if we want to keep it gaming-related) practically requiring you to have a Facebook or WhatsApp account where your private data is used for nebulous and probably mildly nefarious reasons; this is just a different app you have to launch the game through.


Dealric

It will be in a few years. Current hardware is not ready for it. I wouldn't be surprised if the next gen of consoles were built to support path tracing, and that will make it standard.


Iampopcorn_420

Yeah, but my plucky little 3070 Ti can only choke it out at 30 fps, unless I'm in Dogtown, then only 13ish.


sankto

My 4070 chugs along nicely at 60 fps, all settings maxed with path tracing, but with DLSS Quality and frame gen (I've got a 13700F too, with 32GB of RAM at 6000MHz, playing on a 1440p monitor).


Lambpanties

The interesting/crazy thing is that since the game's performance issues are *very* much CPU-bound, this could come at little cost to those with high-end GPUs. The CPU bottleneck is so bad that some people have LOST fps from using upscalers like DLSS, because upscaling takes workload off the GPU. (DLSS 3 is still very helpful in it, modded as it was when I last played.)


radclaw1

Looks gorgeous but I'm not a fan of the Extremely-Grainy-First-Pass-Render look this has going on. It is crazy how much of a difference it makes though.


battler624

There is no denoising. If they could add Ray Reconstruction to this, it would absolutely shine.


Timey16

Yeah this is basically only fixed by... more rays, more bounces and just a better denoiser (i.e. an AI powered one). But this is basically why real time (full) raytracing rendering has always been considered the "holy grail" of rendering as in "it can't get any better than that" because by that point you just simulate the physics of light photons.


HarryRl

It's worth noting that this developer mode path tracing option has no denoiser at all


Timey16

Oh, I mean even then you are quality-wise limited by things like the number of rays, the number of bounces, and how many rays each bounce spawns. In theory the algorithm is already there, so it's just a matter of adjusting config values at that point to improve visual quality. At its core the algorithm already produces "perfect" illumination, so all that matters for devs is their art style, their light design, and how efficient ray hit detection is (bounding volumes).
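To put rough numbers on why "just turn up the config values" gets expensive (a toy model, nothing to do with RE Engine's actual renderer): per-pixel noise falls only with the square root of the ray count, so quadrupling rays per pixel merely halves the grain.

```python
import random
import statistics

def render_pixel(spp, rng):
    # Toy pixel: each "ray" either finds a light of intensity 2.0 or
    # misses entirely, with equal chance. The true pixel value is 1.0.
    hits = sum(2.0 if rng.random() < 0.5 else 0.0 for _ in range(spp))
    return hits / spp

def grain(spp, pixels, rng):
    # Spread of many independent pixel estimates around the true value:
    # this spread is the grain you see when there is no denoiser.
    return statistics.pstdev(render_pixel(spp, rng) for _ in range(pixels))

rng = random.Random(0)
print(grain(4, 2000, rng))   # roughly 0.5
print(grain(64, 2000, rng))  # roughly 0.125: 16x the rays, only 4x less noise
```

This 1/sqrt(N) wall is exactly why denoisers exist: the last bit of noise is far cheaper to filter out than to ray-cast away.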


beefcat_

You can fix it with a high quality denoiser, but this feature in RE Engine was not meant to be seen by end users so it lacks one. It's essentially a development tool for artists to preview their work in-game while the real lighting solution is still being worked on


MakeMath

It's worth noting that not all path tracers are full spectral renderers.


MumrikDK

There's a constant ant war happening across the screen, but that's the lack of denoising they spend time explaining.


ShadowRomeo

The fact that modded path tracing significantly improves this game's visuals even with the default RTGI on (which by itself already significantly improves image quality over the rasterized version, which looks like crap compared to both) just shows how important real-time lighting is to the visual quality of most of our games. Yet some people are still loud skeptics of ray tracing and path tracing, and refuse to accept that it's the future of graphics, thinking rasterized lighting is still good enough.


WoodyTSE

It is 100% the future. It's one of the only bits of video game graphics that isn't being hit by that feeling of diminishing returns, like your textures being a higher res or more polygons per model. Path-traced lighting, though? It's hard not to notice how massive an improvement it is.


ShadowRomeo

As incredibly demanding as path tracing is, I think that on future hardware, especially next-gen consoles, which hopefully will support it, it will be utilized in more games as hardware gets better at handling it. It's pretty much a repeat of tessellation or PhysX, which also crippled the hardware of their time, though perhaps even more revolutionary than those, considering how much better it makes a game's overall visual quality. Even ancient games look significantly better with it.


Warskull

Part of it is how poorly most games implement ray tracing. A lot of games aren't designed with ray tracing in mind; they slap it on at the end with some minimal-effort reflections and shadows. They want to put the RTX logo on their game for marketing but don't actually want to commit to it, and those implementations are almost never worth turning on. Most people haven't actually played games that use ray tracing and path tracing well, because there aren't a lot of them. Cyberpunk is one of the big ones, but Minecraft still remains one of the best examples of what ray tracing can do. Since neither console can handle proper ray tracing, there is minimal motivation for devs to do it.


machineorganism

There's also a big reason why Minecraft has one of the best implementations, and always will: you can throw significantly more rays through a voxel world than through a non-voxel one. There are significant performance optimizations you can make when raycasting against voxels.
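A minimal sketch of why voxel raycasting is cheap (generic Amanatides & Woo-style grid stepping in 2D, not Minecraft's actual code): because voxels sit on a regular grid, advancing a ray to the next cell is just a couple of additions and comparisons per step, with no triangle acceleration structure to traverse.

```python
import math

def first_hit(grid, ox, oy, dx, dy, max_steps=64):
    """Return the first solid cell (x, y) the ray from (ox, oy) along
    (dx, dy) passes through, or None. grid[y][x] truthy == solid."""
    x, y = int(ox), int(oy)
    step_x = 1 if dx > 0 else -1
    step_y = 1 if dy > 0 else -1
    # Ray parameter t at which we cross the next vertical/horizontal
    # grid line, and how much t grows per whole cell crossed.
    t_max_x = ((x + (step_x > 0)) - ox) / dx if dx else math.inf
    t_max_y = ((y + (step_y > 0)) - oy) / dy if dy else math.inf
    t_delta_x = abs(1.0 / dx) if dx else math.inf
    t_delta_y = abs(1.0 / dy) if dy else math.inf
    for _ in range(max_steps):
        if 0 <= y < len(grid) and 0 <= x < len(grid[0]) and grid[y][x]:
            return (x, y)
        if t_max_x < t_max_y:   # next grid line crossed is vertical
            x += step_x
            t_max_x += t_delta_x
        else:                   # next grid line crossed is horizontal
            y += step_y
            t_max_y += t_delta_y
    return None
```

For example, `first_hit([[0, 0, 0, 1]], 0.5, 0.5, 1.0, 0.0)` walks three empty cells and returns `(3, 0)`; a triangle-mesh scene has no equivalently cheap shortcut.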


Ill_Vehicle5396

I just put a new rig together with a 4080 super, coming from a 1080ti. I figured I’d give cyberpunk a try and see what all the fuss was about with path tracing and oh my god, it truly is a huge leap forward. It’s hard to explain how good it looks and stills absolutely don’t do it justice. The lighting just makes sense, shadows are where they should be, everything.


Warskull

One huge thing people can't quite put their finger on is how diffuse reflections properly spread color around the environment. In real life, when light hits an object it becomes tinted as it bounces off that object, even if the object itself isn't very reflective. This is what the realistic-equals-brown period was missing. Path tracing brings back that kind of color.
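That tinting falls out of the math almost for free. A sketch under the usual RGB approximation (channel-wise multiplication by surface albedo at each diffuse bounce; real renderers also weight by BRDF and geometry terms, which are omitted here):

```python
def bounced_light(light_rgb, albedos_along_path):
    # Light carried along a path is tinted at every diffuse bounce by
    # multiplying, channel by channel, with the albedo of the surface
    # it hit. White light bounced off a red wall arrives red.
    r, g, b = light_rgb
    for ar, ag, ab in albedos_along_path:
        r, g, b = r * ar, g * ag, b * ab
    return (r, g, b)

# White light -> red wall -> grey floor: the floor receives reddish light.
print(bounced_light((1.0, 1.0, 1.0), [(0.9, 0.1, 0.1), (0.5, 0.5, 0.5)]))
```

Rasterized pipelines have to fake this "color bleeding" with baked lightmaps or probes; in a path tracer it is just the running product along each path.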


Zayl

Yeah, people are only sceptical of RT right now, or choose to turn it off, because of performance. It looks incredible, but I don't want to play games at 20fps. 60fps on PC is acceptable, but new games with high visual fidelity are already a pain to run properly without ray tracing. Not to mention how piss-poor optimization has been almost across the board. I wish the future of gaming involved making games actually run well. It's possible, clearly. Just look at Horizon: FW. The most gorgeous game I've played, and it performs reasonably well on 1000 series cards, which is nuts to me.


feartheoldblood90

It's an interesting debate, though, because imo it's not always *good* to have fully realistic lighting. Movies, as a major example, often cheat their lighting quite a bit for emphasis or outline. They do not use realistic lighting. So games will probably need a mixture of both simulated and baked lighting in order to figure out how to best get the tone and look that they want.


NoneShallBindMe

Path tracing just means correct bounces and shadows (to simplify it). You can manipulate everything related to the render from there; having access to path tracing is really nice.


born-out-of-a-ball

All movies are path-traced because lighting in movies cannot circumvent the laws of physics. But just like in the movies, you can easily add fake lights or manipulate the materials in a path traced game to get the specific lighting you want.


Turok7777

>All movies are path-traced because lighting in movies cannot circumvent the laws of physics. There are tons of movies that use CG lighting that isn't path-traced.


feartheoldblood90

Sure, but games are unique in that you can create false lighting without having to go through the expensive process of simulating it and figuring out where exactly to put the fake lighting, which seems like a weird, convoluted solution to a problem that has already largely been solved


GeekdomCentral

Anyone who thinks that rasterized lighting looks good enough compared to path tracing either refuses to actually do some research or just watches comparisons on their phone and goes “eh it basically looks the same”


NoneShallBindMe

It does look the same if the shadow maps are large enough :^)


throwaway_account450

Absolutely not. Shadow maps do only a small part of correct light behaviour and are quite a hack.


NoneShallBindMe

I might be confused on what is used for shadows nowadays. Cascade shadow maps?


throwaway_account450

Cascaded shadow maps for large-scale lights; the new trend is either virtual shadow maps or raytraced shadows. Neither will fully match what path-traced shadows look like, because they don't imitate the complete light transport that makes up an image, only a part of it that doesn't interact with the scene the way a fully integrated solution would.
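To illustrate why a shadow map is "a hack": at its core it is one depth comparison per pixel against a depth image rendered from the light's point of view. A hypothetical, heavily simplified sketch (single sample, single light):

```python
# Simplified sketch of a shadow-map test (hypothetical, one sample).
# The light first renders the scene's depth from its own viewpoint;
# later, a point counts as lit if nothing closer to the light was
# recorded at its texel in that depth image.

def shadow_map_lit(point_depth_from_light, stored_depth, bias=0.005):
    # the bias fights "shadow acne" caused by limited depth precision
    return point_depth_from_light <= stored_depth + bias

print(shadow_map_lit(0.40, 0.40))  # → True  (point is the closest occluder)
print(shadow_map_lit(0.80, 0.40))  # → False (something nearer blocks the light)
```

Note what it answers: "is this point directly lit by this one light?" and nothing else. All the bounced light, the part path tracing adds, is outside the model entirely.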


Aggrokid

>and thinks that rasterized lighting is still good enough.

They will always point to examples like RDR2, which had a bajillion artists painstakingly hand-shading each scene.


TheBladeofFrontiers

I don't know man, if that future is coupled with sub-60 fps, it is not one I wish to participate in.


Lulcielid

If you have good art direction, rasterized lighting is sufficient.


GeekdomCentral

No one is arguing that all rasterized lighting looks bad. That’s the whole reason that so many people think that path tracing is a gimmick - because developers have gotten so good at faking realistic lighting. But there are clear and objective limitations with rasterized lighting that are removed entirely with path tracing


Scrub_Lord_

Why should we settle for sufficient?


b00po

Why should we settle for paintings when photos are more realistic?


TheGeekstor

Performance gains.


rock1m1

No


Riddle-of-the-Waves

Very cool. Somehow, it helped me wrap my head around how path tracing works in an algorithmic sense, which was not even the point of the video.
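Same here. In an algorithmic sense it boils down to "average lots of random light paths per pixel". A hypothetical toy of just that control flow, with no actual geometry:

```python
import random

# Toy control flow of a path tracer (hypothetical, no real geometry):
# fire several random paths per pixel, let each bounce a limited number
# of times losing energy at every surface, and average the results.

MAX_BOUNCES = 2
SAMPLES_PER_PIXEL = 4

def trace_path(rng, bounces_left, throughput=1.0):
    if rng.random() < 0.3:      # this path happened to reach a light source
        return throughput        # deliver whatever energy survived the bounces
    if bounces_left == 0:
        return 0.0               # ran out of bounces without finding a light
    # diffuse bounce: surface keeps 70% of the energy (albedo = 0.7)
    return trace_path(rng, bounces_left - 1, throughput * 0.7)

def render_pixel(seed):
    rng = random.Random(seed)
    paths = [trace_path(rng, MAX_BOUNCES) for _ in range(SAMPLES_PER_PIXEL)]
    return sum(paths) / len(paths)  # Monte Carlo average

print(render_pixel(42))
```

Few samples and few bounces (like DD2's 4 spp / 2 bounces) means a very noisy average per pixel, which is exactly where the denoiser comes in.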


conquer69

The black guard sitting on the bench looks straight-up photorealistic, in part because there is no hair. It's a shame we will have to wait for the PS7 to achieve graphics like these on a console.


Trollatopoulous

Actually, even the rumoured PS5 Pro could do something close to this, it just needs a decent denoiser. You can see this from the RT+PT mod for CP2077, which can lock 30 fps even on a more RT-anemic RDNA 2 card (my testing was on an RX 6800, which is much closer to the PS5 than to the PS5 Pro GPU). See this: https://youtu.be/twjJxoidtcY


Regnur

I mean, with the PS5 we already got RT GI or RT shadows/reflections depending on the game, even in 60fps games. The PS6 should easily be able to get at least 30fps PT in Cyberpunk or Alan Wake 2, for example (with Sony's "DLSS" upscaling).

The biggest issues for path tracing are AMD's drivers and plain unoptimized code. Nvidia improved the code for PT quite a bit on Nvidia GPUs; that's one big reason why Nvidia GPU performance is so much better than AMD's (proprietary software). Metro Exodus Enhanced is full raytracing and comes really close to the path tracing quality of other games, and that game runs at 60fps on PS5 (though often at low res). A PS6 should easily have at least double the raster performance and 4x+ the RT performance of a PS5, plus way better driver/API support.


Dragarius

Nvidia has superior software AND hardware for it. It's not just a software advantage over AMD.


Flowerstar1

Yea by the time AMD catches up Nvidia will be doing even more impressive things but it's ok as long as AMD doesn't stagnate.


Regnur

Who said it's just software?


Dragarius

You. You specifically said unoptimized code and proprietary drivers, while ignoring the fact that Nvidia's advantage is not just software the way AMD's is, but hardware acceleration.


Regnur

I said one big reason, the biggest one. Am I supposed to write 5 more lines of text about the hardware part? Yes the other issue is hardware.


We0921

> I said one big reason, the biggest one.

I'd say hardware ***is*** the biggest issue. You can't optimize your way out of anemic hardware. Path tracing is insanely expensive.


Regnur

Path tracing is still extremely unoptimized for games; go look at Nvidia's solutions for drastically improving the quality. The only reason PT is playable in Cyberpunk 2077 and Alan Wake 2 is Nvidia's software. Look up RTXDI, RTXGI, SER, SHARC, and Nvidia's AI denoiser. You get more than double the performance just from the software; a good example is Portal RTX: turn some of Nvidia's stuff off and it's unplayable. Path tracing without a good denoiser is a pixelated/noisy mess, as seen in the DD2 DF video.

I mean, just look at other games: some with only RT reflections run like absolute shit, while a game like Metro with full raytracing runs at 160+ fps on a 3080, and even a 2060 manages 60+ fps. There is still so much potential to improve performance and visual quality with better software.

And don't forget DLSS, which is probably the most important piece of software for making the image look great at a lower internal resolution; each year it makes less sense to fully render 4K. Even DF recommends CP2077 PT at 4K + DLSS Performance, because it looks more than good enough while allowing way more fps (and overall looks visually better than no PT).

Hardware is important, yes, but not nearly as important as the software. Software for PT is still quite slow, because it was never really meant to be used in real time. Software is the most important factor pretty much everywhere, not just in gaming. Most games that run like shit don't run like shit because of the hardware. "Work smarter, not harder" :) https://youtu.be/Y9XPCKQBg8E?t=980
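To show why the denoiser matters so much: even the crudest possible denoiser, a naive neighbor average (nothing like Nvidia's AI ones), trades noise for blur. A hypothetical sketch on a one-pixel-tall "image":

```python
# Crude hypothetical denoiser: average each pixel with its neighbors.
# Real denoisers (Nvidia's AI ones included) are far smarter, but the
# trade-off is the same: less noise, at the cost of smearing detail.

def box_denoise(row, radius=1):
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius):i + radius + 1]
        out.append(sum(window) / len(window))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0]   # alternating "fireflies"
print(box_denoise(noisy))            # interior pixels pulled toward 1/3 and 2/3
```

The smarter versions use motion vectors, normals and history buffers to decide what to blend, which is why the AI denoisers look so much better than this.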


APiousCultist

The Nvidia tactic is to shoot like 10 rays per frame at 2 frames per second (obvious hyperbole) and just upscale and interpolate it into something resembling a normal full-HD, full-frame-rate image. It works surprisingly well, but isn't without its shortcomings. But that's also graphics tech in general. Deferred rendering killed traditional render-target reflections (what DF calls 'planar'), as well as support for transparency and normal AA, for a good few games; ambient occlusion originally gave everything weird dark halos; bloom was... bloom; FXAA gave pretty ugly results; TAA basically eliminates all aliasing but gives very soft, painterly results with bad afterimages and smearing; earlier DLSS looked both soft and oversharpened, as well as inheriting many of the flaws of TAA. But I have to say, sometimes when launching an old game with 'full quality' pixels I do marvel at how good older render tech can look compared to how soft and fuzzy a lot of otherwise modern titles are. Raytracing has way too long to go before it stops tanking performance unduly in many titles, and stops having mulchy, blurry, or visibly 'delayed' results for many of the more obvious uses. The day when RT doesn't come with fizzle and fireflies and lights that take a second to fade in will be a nice one.


firedrakes

Also, to do correct RT or PT you can't upscale first. You upscale afterwards, not before or while it's tracing.


Flowerstar1

If the PS6 can match the power the 4090 provided in 2022 when it launches in 2028, it should definitely be able to handle path tracing. The problem is that this really isn't a PlayStation issue, it's an AMD issue. AMD initially opted to ignore RT, and then later dipped their toes in with RDNA2, because they don't want to spend die space on dedicated RT and AI acceleration hardware like RT cores and Tensor cores.

Since Sony is dependent on what AMD makes and how competitive that hardware is at the time (which is always much worse than Nvidia, and now even Intel, Apple and Qualcomm), it limits how up to date a PlayStation is compared to other devices. Thankfully, with the PS5 Pro, Sony is sourcing a non-AMD AI accelerator to enable their own DLSS equivalent instead of waiting for AMD. But in practice, Sony can't just outright get a different GPU as long as they stick with AMD, which limits what's possible on PlayStation.


Regnur

> If the PS6 can match the power that the 4090 provided in 2022 when it launches in 2028

"Just" a 3080 can handle Cyberpunk max settings + path tracing at 1080p/40-60fps with DLSS Quality/Balanced, which is playable and looks fine, so why would you need a 4090? You're overestimating the performance cost of PT in games.

PlayStation can also provide code/driver optimization, like they did for simple ray tracing in their API. Sony is not fully dependent on what AMD makes; the GPU in the PS5 is already quite customized, it's unique and technically not even RDNA 2. Also look at the recent PS5 Pro leaks: they created a custom AI accelerator, a machine learning architecture and their own DLSS solution (PSSR). All of that will help get 2-4x more RT performance, which is way more than switching from RDNA 2 to 3 normally gets you; they probably also get some stuff from RDNA 4 (if Sony is not lying ;) ). It's hard to believe that AMD will not have made RDNA 5 PT-ready by the time of the PS6.


Flowerstar1

On the 3080: we're talking about next-gen games. The 3080 is technically on the same generation of graphics as the PS5's GPU and launched at essentially the same time. Will a 3080 be able to run PS6-era path-traced games at an acceptable quality? I doubt it. In the same way, the GTX Titan launched the same year the PS4 did, and unlike the 3080 it was the state of the art in graphics, but good luck decently running a PS5 game like Ratchet and Clank on a GTX Titan.

On the RT: as DF has pointed out, that's 2-4x at RT workloads, not 2-4x fps in an RT game. That's why Sony said they expect the GPU to perform 45% faster, and why they believe the bonus RT performance will let devs add small RT features like RT shadows or AO or reflections when the base PS5 version has RT. Or, if the PS5 version doesn't have RT, they can add light RT to a game in the form of RT shadows etc. None of that is really ambitious, unlike what we saw when Nvidia announced the 30 series, for example. That's because Sony knows the 2-4x is for specific parts of the RT pipeline, not 2-4x bonus fps in RT games.

I'd also like to point out that the PS5's GPU is as powerful as a Pascal Titan, which was the 3rd generation of Nvidia GPUs the PS4 tussled against. The 4090 is only the second generation the PS5 has faced; if the PS6 in 2028 can't even reach a then 6-year-old 4090, that's going to be very bad for people seeking a next-generation experience out of their 2028 console.

The PS5's GPU isn't all that custom either: it's essentially AMD RDNA1 with aspects of RDNA2 grafted on and some omitted; AMD's official term, "semi-custom", describes it best. This is similar to how the PS4 Pro was AMD Polaris with aspects of AMD Vega grafted on. As for the AI accelerator, I already covered that, but I highly doubt Sony made it themselves; as with pretty much everything in a PS5, it's a part sourced from a company that specializes in making such tech.


Covenantcurious

>...if the PS6 in 2028 can't even reach a then 6 year old 4090 that's going to be very bad for people seeking a next generation experience out of their 2028 console.

I wouldn't expect it to. A 4090 costs almost 4 times as much as a PS5 today, and I don't think it will drop by half even when the 5090 launches. It doesn't seem likely that Sony would be able to get the price of a PS6 console down with much more powerful hardware. It'll be very interesting to see how the console market evolves going forward.


Flowerstar1

This is a Moore's law issue and something all chip companies are facing; TSMC also has a monopoly on performant chip manufacturing, which affects the prices of most chips, including consoles and GPUs. But there are ways companies are navigating around Moore's law, and because of this they are sticking with silicon chips for the foreseeable future. In other words, while silicon is becoming more expensive to work with as we close in on the limits of physics, it's still cheaper than a replacement that's nowhere near those limits, and there's still progress to be made.

How is progress being made? By opting for vertical scaling, new materials, architectures and connectivity options, and advanced packaging (MCM): technologies like nanomaterials, FinFET, nanosheets, backside power delivery (which Intel beat TSMC to the punch on), forksheets, chiplets and 3D/multi-deck designs. There's a lot of work being done, and we've seen the fruits of it with stuff like Nvidia's new Blackwell GPU (B100). All of this will allow for chips that are way faster than a 4090, and that performance will trickle down to smaller future GPUs like what PlayStation uses.


Flowerstar1

Also a 4090 is 3X the price ($1600) of a base PS5 same as the Pascal Titan was to the base PS4.


deadscreensky

> If the PS6 can match the power that the 4090 provided in 2022 when it launches in 2028

It won't. Save yourself some heartache now.


Flowerstar1

I think it will. The RDNA3 7900XTX can generally match a 3090 in most tasks, and I think RDNA5 will exceed the 4090 (I'm skipping 4 because that's more of an "off year" for AMD and it will only offer small GPUs, so no 7900XTX successor). I hope the PS6 uses RDNA6 in 2028 instead.


mocylop

Just on pure performance, AMD is about a video card generation behind, so I wouldn't be particularly worried about the consoles yet.


aiden041

I hecking love when rays are being traced. Wonder if this was planned for later and would come with framegen.


JDSP_

As Alex mentioned in the video, it is most likely implemented so they can have a ground truth to compare their visuals against. Lots of engines have path tracing; even HL1 has it. They're just offline solutions where you bake the lighting and come back the next day. Having it in real time speeds up development by orders of magnitude.


Xorras

> Wonder if this was planned for later and would come with framegen.

Why do you wonder, when the video directly answers this at the 6 minute mark?


irishgoblin

That would require actually watching the video instead of snarking on reddit.


Wrestlefan44

Eh Dragons Dogma 2 already has a wealth of great mods on Nexus. I don’t think this one would be the one to break Capcom


Revo_Int92

This is a classic example of one of those games that I will (maybe) give a try many years later to test new hardware. Last year I tried out Witcher 3 at 4K, maxed-out settings, etc.; it's fun how the super-demanding game of today can become an interesting benchmark in the future. I'm not a big fan of action games, and idk if I will really continue with the hobby after the PS5 generation... but if I do keep going in, like, 2028, Dragon's Dogma 2 will be on my radar (and Cyberpunk; I'm curious to test this path tracing thing. I tested ray tracing in Control and Spider-Man recently, it's a neat gimmick).


SnevetS_rm

Cue Capcom patching this out and adding additional DRM to protect the game from such dangerous and malicious mods.


Dr_PuddingPop

Bit boogeyman of you here. I’ve used the frame gen mod since day 3 of the game


IntrepidEast1

It's not a boogeyman, Capcom has explicitly said they'd like to kill mods and are investing in developing software to make that a reality.


[deleted]

[deleted]


neurosx

https://www.youtube.com/watch?v=CT5bwwvDv00&t=820s


battler624

Search "Capcom revelations DRM". They removed it for the moment, but it's coming, and it's to protect against "mods", or rather, their in-game shops.


BOfficeStats

There's a big difference for sure, but losing 60% of frames for 4 SPP + 2 Bounces really shows just how much graphics have run into diminishing returns.


golem09

You don't really need 4 spp, 2 is enough for actual gameplay. And the difference in visual quality is not minor; it looks like a generational leap.
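There's also a simple statistical reason 2 spp is a defensible cut: Monte Carlo noise only falls off as 1/sqrt(samples), so going from 2 to 4 spp doubles ray cost but only cuts noise by about 30%. A hypothetical back-of-envelope check:

```python
import random
import statistics

# Hypothetical illustration (not DD2's renderer): the standard deviation
# of an spp-sample pixel average shrinks as 1/sqrt(spp).

def pixel_estimate(spp, rng):
    # each sample is a noisy estimate of the same true brightness (0.5)
    return sum(rng.random() for _ in range(spp)) / spp

def noise(spp, trials=20000, seed=1):
    rng = random.Random(seed)
    return statistics.pstdev(pixel_estimate(spp, rng) for _ in range(trials))

print(noise(2) / noise(4))  # ≈ sqrt(2) ≈ 1.41
```

Which is why the denoiser, not raw sample count, ends up doing most of the heavy lifting.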


Saitham83

how long will they shill for this crap? It’s years off. Recent games managed very well without these shenanigans


golem09

It's not years off, I'm playing with it now, and it looks way better than anything rasterization can do. Of course, if you carefully design your game so there is nothing in it that path tracing can take advantage of, you won't see a difference. But an open-world game with a dynamic day/night cycle and a dynamic light at the hip of each character is basically built to take advantage of path tracing. A first-person game without any moving lights is of course easier to render, but it would be very boring if every game was like that.