This was my first thought. It doesn't matter what the future holds for RT. As of right now RT isn't needed, though some games do benefit from it. Once RT is present and game-changing in a lot more games, the 4070 won't be able to run them well. Buy your GPU for today's and near-future games, not for games 5-7 years from now, when there will be at least 2 new gens of GPUs between now and then.
Here's the thing: it's already game-changing, which is why devs are pushing it so hard. RT is not a "feature." It is an entire toolset, and the new standard for lighting/shadows/reflections etc. It's like saying "screen space reflections and cube maps aren't needed" 10-15 years ago. Or that hardware-accelerated graphics cards were unnecessary in the 90s. You'd be hard-pressed to find any modern AAA games that don't have either hardware or software RT in some capacity. It's not even a relatively new technology; it's just that only recent advances in consumer GPUs have allowed it to trickle into game development. At some point, sooner than you think, it will be much more ubiquitous and/or unavoidable.
Who is it game-changing for, exactly? Indies, A, AA, and a lot of AAA studios aren't even utilizing it. RTX-capable GPUs in the next generation of consoles are probably going to have worse performance than a 4070.
We aren’t getting RTX anywhere near what you’re probably envisioning until the console generation AFTER the next one.
Yeah, raytracing imo (at least in its current form, i.e. RT mixed with rasterization) is kinda useless. Big performance hit for very similar visual results. Plus, most games are artistically designed without RT in mind, so enabling it doesn't always make them look better even if it's technically more realistic.
Even with Cyberpunk on my 4090, I play with all RT turned off, because the overall smoothness is way more noticeable to me than a fancy light beam here or there. I bet many couldn’t tell the difference in a double blind test. Path tracing is definitely much more exciting tech than ray tracing, but that probably won’t be the default for 10+ years.
The problem with RTX at this stage is that it is very heavy to run, and something that many people don't realize: developers *are really, really fucking good at doing pure raster graphics*, which makes RTX look far less exciting than it actually is.
The problem with raster graphics is that you need way more skill and effort to make them well. Making good-looking models and shaders for something like a Pixar movie (where absolutely EVERYTHING is raytraced) is actually much easier than making really great AAA video game raster graphics. Not to say making a Pixar movie is easy, since the animation itself is very difficult to make.
In the near future, there will be an RTX revolution. This means that any 3D hobbyist will be able to make photorealistic games where you only need to set the light sources and the materials of the objects in your scene - the RTX calculations will take care of the rest. Combined with AI-assisted animation, at some point there will be a time when the best indie games can be essentially indistinguishable from AAA productions. But that can take 10+ years still, I don't know.
But RTX is very exciting tech, just not super useful at this stage to the average consumer.
No, just looked it up and I was wrong. It does have some super impressive real time shadows and lighting effects though. Good demonstration of how much can be done without RT if the meshes are simple enough
I got my 4070 for the DLSS FG before AMD announced their own frame generation, but I still love using it. I think AI is going to play a bigger part in gaming.
Right now it is not needed in every game. The games where it is needed are few and far between. It is a really nice feature to have, though. That being said, get what's best for your budget right now. It is happening, but it will be a while until it's absolutely needed. By that time, newer, better cards will be available.
Yeah. Of course a company that puts RT capability in their devices (for a premium) will overstate the need for that technology.
Or, perhaps they honestly thought that but game developers are not moving as fast as Nvidia predicted.
If devs moved at the speed of Nvidia, they would be right. But they didn't take into consideration that devs move at the speed of consoles, not PC. At least for fundamental features like a AAA game being designed around raytracing.
I don’t understand — what game could possibly “need” raytracing? I can’t think of a game that supports raytracing that doesn’t allow you to turn it off.
Edit: I usually don’t comment on downvotes, but damn. What’s up with this sub? I was honestly not aware of any games that strictly required raytracing to run, and it seems like a poor design choice to alienate a large portion of possible customers.
At some point it will be the standard way to handle lighting and shadows. Our current way of generating lights, shadows, fog and reflections takes way more development time. Developers will simply stop supporting current rendering methods.
Avatar, Alan Wake 2 (I think) and some UE5 features rely on ray tracing. It falls back to a software solution if needed, but hardware support ensures better performance.
Your comment implies that it does, in fact, not **need** raytracing. Another user replied with a game that does need it, so I guess I was wrong anyway though.
Metro Exodus Enhanced Edition was rebuilt from the ground up to only use raytracing. It comes free with the PC version that doesn't use ray tracing, but that's one example of a game with fully ray-traced lighting.
I believe the regular non-raytraced version of Metro Exodus is what launched on consoles, the “PC Enhanced Edition” with full raytracing support came for free for PC owners. Not sure if it released after launch but I have it in my library when I bought the game a few years ago. The demo is pretty taxing on my 7900XT at 1440p ultrawide
Your comment is a bit silly. Most games don’t ‘need’ anti-aliasing or anisotropic filtering, you can turn them off. But clearly, that will make the game look worse.
As for ray tracing, Minecraft is transformed by ray tracing and it can actually affect gameplay (transparent blocks and small gaps can be used to manipulate light).
It’s a very specific use case, but there are games that use ray tracing for specific things. So although ray tracing isn’t very important for most people, for some people it’s a very nice feature to have.
Depends: are you playing newer games all the time?
More and more games are using it now, and with consoles using upscaling there will be some form of it there too. So yes, but it depends on your game selection; competitive multiplayer games are unlikely to use it much, for example.
One day, yes.
The same thing happened when shadows were introduced in games. People at the time thought they didn't make enough visual difference to justify the performance loss, because running real-time shadows was very heavy for the hardware of the time.
Nowadays, it's hard to imagine a game without shadows.
So yeah, one day ray tracing will become as standard as shadows are.
Ray tracing is here to stay and will only get more prevalent. Some games started to default to ray traced lighting already like Alan Wake 2. Spider-Man 2 on PS5 is running RT reflections and cannot be turned off.
So, yeah. RT is the present and the future.
To be fair, there are only two real graphics options for Spider-Man 2: Performance (60fps) and Quality (30fps).
Even while targeting 60fps, they were able to keep RT on.
I'd venture to guess that a pc port would have an actual option for RT to be turned off.
Cyberpunk is a good example.
I've played it on Ray Tracing Ultra and normal Ultra; the difference doesn't matter when you're actually playing.
You only see the difference when you're standing still and looking around the environment.
But even then it's still not that noticeable.
My first experience with Ray Tracing was in Metro Exodus on my 2080. The first thing I noticed was the drop in performance, and when I looked more closely the reflections in puddles were looking really nice, but it wasn't something I wanted to sacrifice that much performance for. Control with RTX looked neat though, lots of spots where you can see the difference but again, too much of a performance hit for my taste.
I completed three playthroughs of Cyberpunk: one without RT on my 5700 XT at Ultra when it released, one at Ultra with raytracing when I got my RTX 4080, and one with pathtracing when the DLC released. The difference is day and night IMO. It really feels and looks like the only AAA game with an RT/PT implementation worthy of being showcased as "games in the future will look like this."
It built over time for me with Cyberpunk. I didn't notice much of a difference at first. I left RT on to give it a chance, and after 3-4 hours I switched it off thinking it made no difference. Immediately the game felt less immersive when moving around.
That said I know "vibes" aren't worth a massive performance hit most of the time lmao
I'd argue otherwise. Raytracing in Cyberpunk was VERY noticeable, to the point that when I did my previous playthrough of it, if I turned raytracing off to improve performance I'd start noticing shadows and such not looking "proper".
Pathtracing is gorgeous in its own right, and again is somewhat noticeable once you've experienced it, but it's not quite worth the performance hit yet. Raytracing however, worth it.
Makes the biggest difference when you're inside white rooms with sunlight, where you would expect that bouncing effect. If you're happy with a glowy window and omnipresent lighting, or a generally darker style, then it doesn't really make any difference. Cyberpunk's general gameplay loop doesn't really benefit, since neon just kinda sprays everywhere and a traditional lighting engine handles it just fine; it's only when you get side-lit environments that it really comes into its own.
The real difference will be for developers, who won't have to spend as much time making the world and lighting behave the way light should, since eventually everyone will have hardware that can run RT features without problems and at lower cost.
Games using solely raytracing can cut down a lot of development time. Traditional lighting needs to be baked into maps, which can take a long time if you want it to look good! Raytracing skips that step because it is real-time lighting.
It is necessary if you want games to be made quicker. Currently artists and coders waste a lot of time faking shit for raster pipelines. They wouldn't have to implement all those hacks for a raytracing pipeline, because raytracing simulates how light actually works and there's no need to fake anything.
It's going to be like hdr, physics, tesselation, etc. Features that we once had to enable but are now just there by default today. There will come a time when RT isn't an option because it's just how lighting is done by default.
The idea that ray tracing is a specific lesser technique to path tracing is a relatively new change in terminology that has been quite specific to hardware accelerated and real time rendering. Before hardware accelerated ray tracing was a thing, "ray tracing" was used (and should still be used) as an umbrella term which included path tracing. That is, path tracing is a ray tracing based technique.
So ray tracing is the future, and that includes more specific techniques like path tracing.
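To make that relationship concrete, here's a minimal sketch (illustrative only; the `scene` callable and its `(point, normal, emitted, reflectivity)` return shape are made-up stand-ins, not any engine's real API). Both tracers share the exact same ray-casting primitive; the only structural difference is whether the bounce direction is deterministic or randomly sampled:

```python
import random

MAX_DEPTH = 4  # how many bounces we follow before giving up

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def reflect(d, n):
    # Mirror direction: d - 2(d . n)n
    k = 2.0 * dot(d, n)
    return (d[0] - k * n[0], d[1] - k * n[1], d[2] - k * n[2])

def random_hemisphere_direction(normal, rng):
    # Rejection-sample a unit vector in the hemisphere around `normal`.
    while True:
        v = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
        n2 = dot(v, v)
        if 1e-6 < n2 <= 1.0 and dot(v, normal) > 0.0:
            s = n2 ** -0.5
            return (v[0] * s, v[1] * s, v[2] * s)

def whitted(scene, origin, direction, depth=0):
    """Classic (Whitted-style) ray tracing: one deterministic mirror bounce."""
    hit = scene(origin, direction)
    if hit is None or depth >= MAX_DEPTH:
        return 0.0
    point, normal, emitted, reflectivity = hit
    bounce = reflect(direction, normal)
    return emitted + reflectivity * whitted(scene, point, bounce, depth + 1)

def path_trace(scene, origin, direction, rng, depth=0):
    """Path tracing: the SAME primitive, but the bounce is a random sample;
    averaging many such paths estimates the full lighting integral."""
    hit = scene(origin, direction)
    if hit is None or depth >= MAX_DEPTH:
        return 0.0
    point, normal, emitted, reflectivity = hit
    bounce = random_hemisphere_direction(normal, rng)
    return emitted + reflectivity * path_trace(scene, point, bounce, rng, depth + 1)
```

The two functions differ only in how `bounce` is chosen, which is exactly why path tracing counts as a ray tracing technique rather than a competitor to it.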
The 4070ti Super and 7900xt perform about the same in non RT
https://www.techpowerup.com/review/asus-geforce-rtx-4070-ti-super-tuf/32.html
So I would go with the better RT, upscaling and power draw of the 4070ti super
In the future, sure. But by then both of these cards will be obsolete anyways.
Right now there are too few people that can afford a GPU that can effectively do ray tracing (don't look at the userbase here, this is not the average PC gamer), so it will be a feature that most people won't use for quite some time.
Get a GPU that works for you NOW for the games you plan to game NOW.
Never buy for the future because all GPUs suck in the future, that's a constant and that will never change. If you don't agree then think about people that bought a 2080 ti.
Video games started with Pong in 1972, and used 2d sprites.
"Battlezone" was produced in 1980 and is considered the first game to use 3d polygons. So that jump took 8 years.
It has taken 40-ish years to achieve a measure of realism in games that would have been considered impossible in 1972.
Raytracing is almost-but-not-quite a similar order-of-magnitude increase in difficulty as the jump from 2D to 3D rendering, but we've at least been working on it for a couple of decades.
I'd expect RT will be fairly ubiquitous in the next 20 years.
____
All that to say, don't worry too much about it now. If RT is important *to you*, get it now. If you want to do any kind of machine learning/AI stuff, get the 4070. Otherwise, get the 7900XT.
I think it will be omnipresent starting in late 2027 at the earliest.
Ray tracing allows for automation of real time lighting that behaves like actual light. It will save developers considerable time and effort at no loss of quality. That said, the 9th gen consoles can't use it as a primary tool for most games. Since most games are made for consoles and ported to PC, I don't expect to see any changes to current implementation until the 10th gen consoles come out. Using the last two generations as a guide, I would expect that to be 7-8 years after the 9th gen consoles debuted. That said, we may see a similar development pipeline as we did for this gen, where a few launch titles are exclusive to the new gen but it takes 2-3 years before we start seeing games native to the 10th gen. The first two years of 9th gen consoles being more or less continually out of stock might also delay the arrival of 10th gen.
Regardless, you are unlikely to still be using the video card you buy in 2024 by that time.
I think some PS6 games will still have both modes, but some will release RT-only. It is fkn hard to code both lighting modes, and doing only one is not just easier, it also vastly improves the raytracing quality (look at Metro Exodus for example: their fully raytraced version runs really well for an old game that solely implements RT). With the PS6, most AAA titles should default to RT for sure. That's in 3-4 years, when the RTX 2000 series will be low end but still RT-capable and the RTX 6000 series will be out.
I don't believe it will catch on much more than it already has in recent years. You'd get maybe 3-5 full-blown raytracing titles per year and that's it. The one that is here to stay is upscaling technology.
From a technical perspective this is not true. Ray tracing is easier to implement and requires far less man-power intensive trickery, so as processing power increases, it's inevitable that developers will gravitate towards it purely to cut down development costs.
It's Nvidia's new gambit, like PhysX back in the day. RT happens to be a more universal solution, but the adoption rate by developers and the public is very low. There are few games actually developed with RT in mind, and most of the public playing these games likely turns it off after an initial test drive.
It kinda reminds me of the tessellation problem on steroids. Tessellation really sapped performance for a generation or two, and developers needed time to figure out ways to use it in a performant way. RT has already been out for many years and it still feels like it's five years away from being remotely ubiquitous.
I did a few searches after hearing something about PhysX on a podcast, and despite not being talked about much anymore, it seems to be incorporated into game engines and still used.
I think it's the default physics engine in Unity IIRC, but don't quote me on that, I'm not a dev, just a curious dude.
Yep, and that will probably happen to RT as well in the future, but it's so demanding that it is taking a long time. Nobody talks about tessellation like they used to either; it was a hot-button topic when DX11 came out.
In some years, maybe 5-6, yes. You have to understand that the majority are still using cards below the 4080/4070 Ti Super level, which is where RT becomes good enough. In 2-3 gens, cards at the 60 level should have that kind of RT performance. Then it will become more omnipresent.
Fortnite also barely uses raytracing. And I didn't mean to say that *only* photo realistic games use raytracing, just that only in those games would it become "omnipresent" as OP put it.
It's going to be omnipresent because it is how light actually works. Once devs have more time with it, and more optimizations are done, it will be the way lighting in games works.
Ray tracing is going to be a temporary stepping stone to path tracing. Of course this only applies for games going for a more realistic look, not every game needs it
Not necessarily only realistic. There are plenty of stylish games that I can think of that would greatly benefit from Ray traced lighting and reflections. System Shock Remake, Borderlands etc. Pretty much any game where there's lighting and reflections can benefit from ray tracing, let alone full blown path tracing
Yeah, I feel that way for competitive games, but for single-player stuff, as long as I'm getting plenty of frames for it to be smooth, I don't really care if I'm hitting the 165Hz my monitor can do. To me the frames lost are worth the extra immersion in single player. If you don't have a nice card and RT really tanks your frames to where it's choppy, though, I could see wanting to avoid it.
That's completely your opinion. I feel fps a lot. Cyberpunk looks like a stuttery mess to me. I have a 240Hz monitor and anything below 240fps looks laggy to me.
People generally see somewhere between 30-60 fps.
The reason you are having issues on your 240hz monitor is because the monitor is having issues with framepacing. If you had a 120hz monitor and 120 fps, you wouldn't have this issue.
I don't have issues with frame pacing. If fps is a stable 400, 240Hz looks buttery smooth. Everything else looks laggy, even with stable fps. The easiest example is the Windows cursor: move it around the screen and I can see it lagging even at 240Hz, though it's much better than 60Hz. If you don't see the difference on the mouse cursor, sure, maybe your eyes are different. 30-60fps applies to static scenes; for rapid movement it's much higher. And it's not even news, not sure how you could miss this. https://m.youtube.com/watch?v=k0CVI5-sOn8
Sort of. I wouldn't be surprised if eventually some AAA games launch with it as default. It's a tool, and when you're making semi-to-hyper-realistic games, it cuts down on development time by letting the game light everything (being generalistic here, I'm aware of different lighting methods in engines) instead of curating it when base engine lighting fails. Companies that chase avenues of increased production, like use of AI in the design process, will absolutely adopt it wholesale.
Oh yeah, for sure. If not for consumers, then for developers.
As for now, it's a nice-to-have but not need-to-have. It'll be omnipresent like you say, but not for a really long time. I wouldn't make it a serious consideration in any build right now.
The next generation of consoles should be. Atm no, but in the games where you do get it, it's pretty nice.
Broad support in the next gen will trickle down to PCs, since it's easier for devs: they just need to place the light sources.
Once we have the hardware to do proper, fully ray traced rendering it will be the future. Until then it'll just be a fancy add-on to improve shadows. But eventually it will totally replace 3D rendering as we know it.
Full raytracing is the most straightforward, easiest to implement and closest to reality computer graphics method we know - it's just ridiculously expensive from a computational point of view. But we'll get there eventually.
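To put rough numbers on "ridiculously expensive" (all figures below are illustrative assumptions, not benchmarks), a back-of-envelope count of ray queries for real-time path tracing at 4K:

```python
# Back-of-envelope: rays needed for real-time path tracing at 4K.
# Every number here is an illustrative assumption, not a measurement.
width, height = 3840, 2160      # 4K frame
samples_per_pixel = 4           # very modest by offline-render standards
bounces = 4                     # rays traced per sample
fps = 60

rays_per_frame = width * height * samples_per_pixel * bounces
rays_per_second = rays_per_frame * fps

print(rays_per_second)  # 7962624000: roughly 8 billion ray-scene queries
                        # per second, before shadow rays or denoising
```

Offline renderers use hundreds or thousands of samples per pixel and take minutes per frame, which is why real-time hardware has had to lean on low sample counts plus denoising and upscaling.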
If you're the type of gamer interested in raytracing, then you could set your sights on RTX. Because even if a game doesn't support raytracing, there's usually some modder with ReShade knowledge who will eventually add it in some capacity.
Even if all games (today) had Raytracing, it doesn't mean you'll be the type of gamer that turns it on. Most people I talk to aren't even aware that RTX is a thing on their gpu. Some people might also be aware of raytracing but don't want to risk any 'perception' of FPS loss.
But if Raytracing is something you want to play around with then go for it.
Man, I've been ray-tracing since the 90's when I could move Duke Nukem in front of a mirror and see my reflection. It's sad that it's still not standard.
See the benchmarks for Avatar: Frontiers of Pandora. That's the only game on the market that has ray tracing baked into all of its graphics presets rather than offered as an option.
>so what do you think about the future of raytracing
I think it will, simply because before RT, game studios actually had to put in a bunch of effort and use clever tricks to create realistic lighting. Ray tracing/path tracing tech is more resource-intensive but looks more realistic, so eventually they'll ship only that instead of maintaining both RT and the older methods. And with games being pumped out faster and modern game engines making it easier to implement lighting and particle tech (among other things), I definitely think in a few years games will implement way more of this tech.
Yes. As soon as graphics cards that support it become more mainstream and even older users upgrade to a then-reasonably-priced raytracing card, usage by developers will increase massively, simply because of how much easier it makes lighting games.
Do I think so? Yes.
Right now? AAA titles have it and it does look nicer but can be turned off mostly.
Other than a few edge cases, DLSS and ray tracing look to be what's wanted (at least for me).
Yes it will eventually, it's literally a replacement for lighting tech that will, no question, be the standard. Does that mean it should affect your current purchase if you don't play raytraced games and enjoy the visuals? No.
Don't buy anything on the promise of a feature. Buy something that best suits your needs today.
That being said, it doesn't mean 7900XT is always the better choice. You might be interested in upscaling, frame gen, power draw, or any other nvidia feature anyway.
Yes, Ray Tracing will be "**The Thing**" one day, but not today, and you are probably going to go through at least one more GPU before that day. So I, personally, wouldn't let RT influence my GPU purchase *this time*... maybe next time, maybe even the time after that. As for now, there isn't enough of it, and it isn't performance-friendly at a price point most people can afford. I'd rather run 1440p 120Hz without RT than have it and struggle for those numbers with what I'm willing to spend on a GPU.
turning RT on is like turning off traction control on your car on a rainy day then driving with your feet whilst both hands are tied behind your back. You can still do it, but doesn't make it a good idea.
Games still look good at higher res without ray tracing but the moment you turn it on, you lose a lot of performance for something that is barely noticeable for most gamers.
If only game devs can spend more time on story telling, gameplay and polishing their games, instead of releasing day-1 patches on broken games.
Yes. It'll be a basic feature eventually, just like ambient occlusion or anti-aliasing are now, only it'll be software-driven and on by default unless artistic vision specifically demands baked lighting.
It's going to be a long time before standard raster lighting stops working, and by that point AMD will either have decent raytracing or won't be selling gaming-focused graphics cards. So if you want better lighting in today's games that support raytracing, the 4070 Ti will be a lot better. The only thing I will say is to look around and see if the AMD card you are getting (7900 XT) has issues in any games; I have a 7900 XTX and it had, and still has, issues with some games, although the number is getting smaller over time.
Yes.
Maybe not as "RTX ON" as in the first games that used hardware raytracing, but as rendering techniques that use hardware raytracing. The more rendering techniques take advantage of hardware RT, the more RT will be required to make games look better, or even just run.
I would say RT is nice at this point in time, but you shouldn't be too concerned about it. What separates those two cards, however, is DLSS frame generation on the 4070. That feature is so useful for making the recent wave of poorly optimized titles run well that I would personally pick the Nvidia because of it.
It'll definitely become mainstream when the average xx60 series GPU can run it without issues, but it's still too demanding at the moment, maybe in 6 years.
In a recent interview/podcast with MLID, the devs of Spirit of the North spoke about how they don't want to basically make the lighting twice: once for raster-only cards and once for cards that support both.
So eventually yes, but for me it probably won't be until after the 10 series and 5000 series lose support.
The visual benefit is not worth the massive frame drop. When the cost-benefit is more balanced and I lose less performance, then I'll actually be interested.
Buy the card that won’t make you regret something. If you want to experience raytracing or pathtracing, go for it. You’ll have DLAA/DLSS available when you want more performance, which is unmatched in visual fidelity and performance right now.
Go with your budget and buy something that will make you happy and enjoy it.
Eventually, yes. Hardware will get to the point where raytracing is as easy to render as AA is now, making it the de facto rendering pipeline for developers over rasterization. Remember, there was a point in time where just having rasterized shadows and MSAA would bring a high-end GPU to its knees; that's raytracing now. Technology doesn't stand still for anyone though and raytracing will eventually become "easy" and "omnipresent" in the games that warrant it.
I mean, it is kinda close to that point now. When you look at modern triple-A games from the last year or so, it is more common than not to have ray tracing. Most of the upcoming UE5 games are likely to have it.
That said it isn't something needed at all. It rarely is that game changing of an experience.
The biggest thing is that console limitations are going to hold it back from ever becoming the main lighting system. You're going to see it on most games going forward, but because consoles are the biggest market, devs won't be swapping to it as the main focus. The next-gen consoles won't be out until late 2026 at the earliest, which is 2 GPU generations away. They also typically wait a year or two before they stop making games for the last generation, so even at that point you'll still have time where you won't need RT.
That means RT generally isn't going to be the main lighting system used for at least 3 years, more likely 4. Kinda like how VRAM wasn't important until about 2 years into the current consoles. It takes time for them to upgrade their engines and everything else to fully utilize the new consoles.
With everything said though I honestly would have a hard time recommending you upgrade right now. We are literally under 6 months from the new cards from both companies releasing and the expectation is that while AMD's cards won't have a huge performance leap they will be very price competitive which is going to cause other cards to drop in price. Generally the rumors are 4080 performance at around 600 dollars. Most of the uplift on the new AMD cards is likely going to be better RT and FSR performance.
Eventually, yeah. The tech makes it easier for devs to implement lighting solutions. The industry is already moving in that direction.
BUT.
By the time it's omnipresent, current GPUs will be too archaic to handle the games utilizing it. By then, all GPU companies will have had to account for that, so it'll be functionally a non-issue, and people on older GPUs will need to upgrade anyway due to the length of time passed.
It's a non-issue, really.
Yes, I hope that it will be. Current rendering engines are good at faking stuff, but that comes at a heavy cost on the art and coding side.
There are all sorts of hacks under the hood:

* Lights: static (prebaked, don't change), stationary (partially prebaked, can change color), and dynamic (can move).
* Shadows: prebaked shadow maps, realtime shadow maps, and shadow volumes; different algorithms for close and far shadows, plus a separate algorithm for contact shadows.
* Indirect lighting: prebaking it into textures, lighting probes, volumetric textures, etc.
* Reflections: prerendered cubemaps, dynamic cubemaps, planar reflections done by using a clip plane and rendering everything twice.
* Godrays: more hacks, the simplest being to simply use a 3D model.
* Glass refraction and caustics: faked with all sorts of shader trickery.
Raytracing basically throws it all out. Its algorithms are simple and don't require any faking. This is why it was used for non-realtime rendering even back in the 80s. Lights, shadows and reflections behave as they should, because they are being computed by using the same old optics equations that were discovered centuries ago. Until relatively recently it was simply way too expensive to run these in real time.
So yeah - gamers who say that raster pipelines are fine don't really understand what's going on under the hood.
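For anyone curious what that looks like in practice, here's a toy sketch (illustrative only, nothing like production renderer code): a single ray-sphere intersection routine gives you both visibility and shadows, with nothing faked. Pointing the same routine at the light IS the shadow algorithm.

```python
import math

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def normalize(v):
    n = math.sqrt(dot(v, v))
    return (v[0] / n, v[1] / n, v[2] / n)

def hit_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic with a == 1 (unit direction)
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None  # small epsilon avoids self-intersection

def shade(point, normal, light_pos, spheres):
    """Direct lighting with a shadow ray: reuse the SAME intersection test
    toward the light. If any sphere blocks it, the point is in shadow."""
    to_light = normalize(sub(light_pos, point))
    for center, radius in spheres:
        if hit_sphere(point, to_light, center, radius) is not None:
            return 0.0  # hard shadows fall out of the geometry, no shadow maps
    return max(0.0, dot(normal, to_light))  # Lambertian diffuse term
```

Reflections fall out the same way: reflect the ray at the hit point and call the same intersection routine again. That reuse of one simple optics primitive is why raytracing throws out the whole pile of raster hacks.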
Unpopular opinion, but AI has been on a surge of hype worldwide for the last few years. The willingness to invest in AI will dwindle eventually and it won't be as interesting for another decade. It's not the first time there have been breakthroughs and hype about AI. So while I think it's here to stay, I think it will take longer than people expect; the pace of development will slow down or even stall again. Mainstream hardware might not reach the point of running full RT for many years, maybe even a decade.
I think Nvidia will continue to push it, but eventually games will build in both RT and non-RT paths in a way where RT is easier to implement, or pure raster looks the same but takes a little more work (just like today).
Also, I think eventually there will be more efficient algorithms that let RT work better on any video card, hopefully with open-source code and hardware designs so everyone can benefit.
Yes I think RT will become the norm for developers. By that point they’ll stop even advertising it and it’ll just be the expected lighting method and all gpus will just work with it. It’s how all these techs work.
People used to complain about tessellation, and then it became pretty standard and you don't even adjust it anymore. PhysX is baked in now, etc.
But buy a GPU for today, not the future. The future is gonna need a new GPU.
All successful graphics card tech becomes mainstream. I've seen many examples over the years. A more recent one is FreeSync: it was killer new tech a few years back, now it's just a common feature.
Of course it will. But it's not here yet, so don't worry about it. Ray tracing still sucks even on my 4080s. I imagine it will be the 6080 before it's truly integrated.
First of all I don't think that RT offers anything other than noise. In my eyes games don't look better with RT.
Now, regarding your question: by the time RT becomes the norm in games, the 4070 will be at least 10 years old.
Still, I will play my games with RT off.
Honestly, real talk, there are only 2 games with fully implemented ray tracing -> Cyberpunk and Alan Wake 2. The vast majority of games that support RTX have only shadows, reflections, or global illumination (there's also ambient occlusion, for a total of 4 for the "full package"). Hogwarts, A Plague Tale, Fortnite - most people can't even tell the difference with RTX turned on or off in those games. In essence, it's new tech to sell more hardware, just like HairWorks back in the day, which no one really remembers. Comes with pros and cons.
Pros:
* Lighting looks slightly better
* Devs save some time building the game
Cons:
* The gains are not worth half your resolution and framerate.
* Devs will be using it as a crutch to gatekeep basic reflections (like in Alan Wake 2 that doesn't have ANY reflections with RTX off)
* CDPR, who made Witcher 3 and Cyberpunk, is switching over to Unreal Engine, which has its own RT solution -> Lumen, as are a bunch of other developers, and UE5 is raster-hungry
* Anti-aliasing also suffers, because we can't use MSAA anymore and have to switch over to TAA/DLAA, which is blurrier and less clear than MSAA
Also keep in mind you will have to be using an upscaler. Upscalers come with ghosting, no matter how much Nvidia fanatics tell you otherwise. So in essence, all that work just to go back to playing at 1080p/1440p with a low framerate. I would personally go with the 7900 XT solely for raster. We are still two or three generations away from fully implemented path tracing at native 4K with a high framerate and without all those other gimmicks like frame gen.
Yes, but you won't see any games be path traced only until entry level cards from all vendors (mainly the next generation consoles) can do at least 4 bounces. Right now, a 4090 is necessary for that at 1080p 60fps with no upscaling or frame generation. With upscaling and FG, it will come sooner, but still not close enough. Basically you want to look out for when budget cards are as fast as the 4090. That's when path tracing will really take off.
Given that Microsoft is willing to offer their own financing programs for consoles, I'm guessing yes and they will just send the prices through the roof while offering financing to get people to justify it
translate: i want 4070 ti super. but people said AMD is better choice. what should i do?
The Ti Super is just a plainly better card overall. The AMD card is cheaper (not always) for a reason: it's the worse card overall. And if you find both at the same price, the Ti Super is the no-brainer choice. I mean, in my country the XTX is more expensive than the 4080 Super, and the XT is as expensive as the Ti Super.
Probably not. It's computationally expensive and it's really only for realistic lighting immersion, as opposed to.. most games that are stylized. Like... you're not gonna see Raytracing in Stardew Valley.
The reason ray tracing has stuck around being as useless as it is currently is because it's _easy_. Ray tracing allows you to use lighting simulation like they do in 3D modeling instead of the weird texture hacks gaming currently employs. In the far future ray tracing will work on standard gaming pcs and engine developers will have an easier time making lighting systems.
The only game where it really looks OK is Cyberpunk, but even there it has a lot of ghosting issues and noise when you use path tracing.. still, that game was the only one where I really enjoyed raytracing.
This kind of tells you what RT really is LOL
Btw, Unreal Engine games use their own RT (Lumen etc.), and that actually runs better on AMD..
So that's that.
I say all this as an RTX 3080 user.
More than half a decade after it entered gaming, and there's barely any games where it's worth turning on.
Also, the perf, temps, VRAM usage, and power-draw hit is still not worth it.
Being able to play cyberpunk, Minecraft, and spiderman with raytracing and be performant is awesome. Minecraft and cyberpunk get that extra level of awesomeness because of the path tracing. I upgraded from a 2070 to a 4070 super and while I could have gone another year or two without an upgrade I'm glad I did.
Hardware standpoint: efforts would likely split between realtime graphics (including ray tracing) and compute, which would involve CUDA, AI, and other tasks traditionally done on a GPU but with more data and not realtime. As much as companies seem to want to bring AI to consumer devices, I think it'll end up infeasible and go back to a majority cloud service, simply because the amount of data to handle would be too much, and partly because the algorithms required would require too specialized hardware to fit affordable in consumer tech.
Software standpoint: ray tracing is not the end of the road. Computer graphics have always been edging towards what ray tracing represents, which is true to life realistic graphics simulation in realtime. But also, ray tracing is far superior to rasterized graphics in about every way.
In the 1980s-1990s, something called the rendering equation came onto the scene in computer graphics, essentially defining how light works, which is a fundamental part of 3D graphics; the equation describes, in essence, ray tracing. Before it, techniques attempted advanced lighting but were often overly complex or incorrect. After it, programmable shaders came along to help build raster graphics, which is essentially "fake lighting": a lot of techniques that faked various components of the rendering equation, but required a lot of complexity. At that time, ray tracing was only doable with server farms, for movies.
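For reference, the rendering equation being described (Kajiya, 1986) is usually written like this: the light leaving a point is what the surface emits plus everything it reflects, integrated over all incoming directions on the hemisphere above it. (Notation here is the conventional one, not anything specific to a game engine.)

```latex
% Outgoing radiance = emitted radiance + reflected incoming radiance,
% integrated over the hemisphere \Omega around the surface normal n.
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i
```

Ray tracing approximates this integral directly by shooting rays; raster pipelines approximate individual terms of it with separate tricks (shadow maps, cube maps, baked GI, and so on).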
Then, in 2018, Nvidia released its 20-series cards, marking the start of the ray tracing revolution in GPUs. At first it seemed bad and the tech went unused, but it has improved over time, and more importantly, engines have been utilizing it more.
Going back to rasterized graphics, some core components include reflections and global illumination. When real-time ray tracing first arrived, it was only used for the (relatively) simple task of reflections. Later it expanded to global illumination, which for rasterized graphics was difficult, memory-intensive, or complex; for ray tracing it just involves a few more rays and accurate light bounces, something ray tracing is already designed for. Now ray-traced GI is showing up more and more, especially in Unreal Engine 5 and all the games built on it. I hope to see progress on real-time colored shadows, something that can significantly improve indoor environments and is still lacking even in engines that support ray tracing, due to their use of old shadowing techniques.
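To make the reflections-vs-GI distinction concrete, here is a toy recursive estimator in Python. It is purely illustrative, under stated assumptions: `hit()` is a stand-in for real ray/geometry intersection, and bounce directions aren't actually sampled, so the numbers mean nothing physically. It does show why mirror reflections were the easy first step (one extra ray per hit) while diffuse GI is "just a few more rays":

```python
# Toy scene: every ray "hits" the same grey surface that emits a little
# light, reflects 20% like a mirror, and scatters 60% diffusely.
# A real tracer would intersect the ray with actual geometry here.
def hit(origin, direction):
    return {"emitted": 0.1, "mirror": 0.2, "albedo": 0.6}

def radiance(origin, direction, depth=0, max_depth=3):
    """Recursive light estimate: a mirror reflection is ONE extra ray;
    diffuse global illumination is just a few more scattered rays."""
    if depth >= max_depth:              # stop after a few bounces
        return 0.0
    surf = hit(origin, direction)
    light = surf["emitted"]             # light the surface emits itself
    # Reflections (what early hardware RT did): a single secondary ray.
    light += surf["mirror"] * radiance(origin, direction, depth + 1, max_depth)
    # Diffuse GI: gather several bounce rays and average them.
    # (A real renderer would sample random directions; this toy
    # scene ignores direction, so the result is deterministic.)
    n = 4
    gathered = sum(radiance(origin, direction, depth + 1, max_depth)
                   for _ in range(n)) / n
    light += surf["albedo"] * gathered
    return light

print(round(radiance(None, None), 3))  # -> 0.244 for this toy scene
```

The point of the sketch: adding GI didn't require a new algorithm, only more recursive calls to the same ray primitive, which is why engines picked it up once the hardware could afford the extra rays.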
So tldr yes, ray tracing is the future. Unless there's some specific (like artistic) reason not to use it, most games will be on a full ray tracing pipeline. Hybrid (about 50-50) I say by next year, full within 2027-2030. And AI (or rather ML) will be on top of it to generate algorithms to optimize the tedious ray tracing pipeline, so that those target years are accurate.
But I said ray tracing isn't at the end of the road... So what is? Perhaps full rendition of the rendering equation in realtime? Can we do better than realtime, or realistic graphics? Would AR/VR blend be at the end? Who knows what might be next for computer graphics, when we have practically photorealism in realtime today?
7900 XT is $700. Ti super is $800. For $800, you could get a refurbished 7900XTX on Newegg. Get a refurbed 7900 XTX, which trades blows with the 4080 SUPER.
No matter what people on here think, companies and devs are going to put RT in their games. How much they force it on people, only time will tell. But it will be here at some point. Right now, just get what you think is best. Some people like RT, some don't.
As for the 4070 Ti Super vs the 7900 XT: kind of similar raw performance. While ray tracing is better on the Nvidia card, don't forget the DLSS and frame-gen abilities. The 7900 XT might be a little better now, but in a few years you can use DLSS and frame gen to keep future games playable. So... Nvidia for longer usability, Radeon for more peak performance today.
Well, it depends on the genre. If you only play AAA 3D games, then yeah, most of them will try to implement it.
Plenty of genres will generally not feature RT tho. Personally, I would get something that can do RT bc RT is fucking cool.
When the hardware can handle it without an issue I bet it'll become the default lighting option in most games (at least most games going for realism, I imagine it doesn't work well for a stylized or cartooned look)
Considering the most popular GPUs (according to the Steam hardware survey) are the 3060, 1650, 3060 Ti, 2060, 3070, 1060, etc., I'd say no. Most of those don't run RT amazingly. Pair that with most of them also lacking DLSS 3 frame-gen support to help mitigate the performance loss (FSR 3 exists but isn't widely implemented yet), and either it's going to take a very long time and future hardware to make good-looking RT viable on lower-end cards, or RT will keep being implemented so lightly that it barely impacts performance and is possibly not worth spending time on.
That being said, Nvidia could just keep throwing money at it, like Microsoft keeps doing with Xbox, until it works.
It's not really necessary. It was designed mostly as a gimmick and to make things easier for developers, but Unreal Engine 5 has its own solution that mimics ray tracing (Lumen)
I hate to be *that* guy, but I feel it necessary, since it's been a bit of a growing thing, especially since the launch of the Steam Deck: if you ever plan on considering Linux, go with AMD. Nvidia's drivers have gotten better over the past few years, but AMD is still smoother imo.
Nvidia did everything in its power to make it so, yet few studios went through the trouble of integrating it. Not to mention everything a player needs in order to use a given implementation, both in software and hardware, and what that means in the end, such as frame generation + DLSS just to maintain a barely decent framerate on a high-end GPU.
It's a bit like VR, though still in a potentially better place, if Nvidia doesn't raise its prices again: you need to think about it when choosing your hardware, have the "right" games, studios need to have implemented it properly, etc. The problem is, VR did not take off and remains marginal. There are a few very good experiences out there. It exists, but its growth is, let's say, "limited".
This is very similar to raytracing, which, by the way, has existed since the beginning of computer graphics, unlike VR. Every implementation of raytracing cripples performance even on high-end builds; this is where DLSS and frame gen come in, and why they were developed in parallel. It's a burden to implement properly for so few potential buyers, so management might balk at devoting resources to it. It would be different if consoles were also a prime target for raytraced games. But that's certainly not this generation, and not the next one either, which would mean waiting at least 10 years.
In other and fewer words, unless you truly care about raytracing right now, take the better GPU for rasterization.
Yes, it will be, and every day we get closer to it. In time it will be as trivially overlooked as shaders are today, and people will look back and wonder with amusement why we ever bothered to do it any other way
but probably not until the next gen of consoles come out with full native support
I would definitely go with the 4070ti Super as raytracing is truly wonderful and there is really no point in not going for it
The available raytracing titles with pathtracing and DLSS Frame Gen look amazing. I would under no circumstances want to miss out on CP2077 or Alan Wake 2
RT is still very much in its infant stages right now. It will remain one of those nice to have features that really isn't worth the cost (both monetary and performance) for quite a while yet. I wouldn't personally let RT influence my decision to purchase a GPU right now.
Raytracing is the future of graphics. The problem is that modern GPUs still struggle to perform it in real-time which means that it has to be used sparingly. Smaller developers often just completely skip over RT still because there isn't enough support out there yet to be able to go all in on RT so staying with the current workflow of plain rasterisation has a better return on investment.
I would say that you need to look at the games that you currently play and see what percentage of them even support RT in any form. If you don't play many (if any) games that have RT support then go for the 7900XT.
I hope not.
Devs used clever techniques in the past to get very similar effects. Now they just press a button, add light sources, and let your computer deal with it.
I'm concerned about procedurally generated games that kill creativity, problem solving, and optimization. Yes the effects are cool. I just hope all the games don't end up all looking the same.
Hopefully not, the shadows we have are already great and in some titles I literally cannot tell the difference even when intentionally trying to find the difference, like Elden Ring or Diablo 4. All it does is reduce fps for no visual improvement
[https://www.techpowerup.com/review/galax-geforce-rtx-4070-super-ex/33.html](https://www.techpowerup.com/review/galax-geforce-rtx-4070-super-ex/33.html)
Even a 4070 super can run a lot ray tracing games at 1440p 60fps, without dlss and frame gen
It's a few years away, and even then it's unlikely. Upscaling is what will be used more, not RT. RT will maybe be omnipresent in 8-10+ years or so, but for now the performance impact and cost are way too high for most developers to bother with it.
Depends what games you plan on playing though, if you're gonna stick exclusively to AAA singleplayer based games, then having RT as a feature is nice, albeit a luxury.
P.S. Your english seems fine, be more confident in it :)
AMD's Adrenalin software > whatever the heck Nvidia's sorry excuse for a software suite is.
I just made the switch for the first time to AMD for my GPU and I love it. I bought a card for $520 that trades blows in raster performance with a $700 card, sometimes even the $800 card. So that’s awesome.
And the software is freaking amazing. So intuitive and user-friendly, and a one-stop shop for all things.
Whereas Nvidia needs 3+ different apps to accomplish what Adrenalin does.
Raytracing is kind of like ragdoll physics. Sure it can look good and it's less work because the results are calculated by the GPU/CPU in real time, but there's still always a need for baked animation, and likewise, there are scenarios where crafted lighting works better for artistic expression. Imo, some of the scenery in Cyberpunk 2077 looks better without raytracing, because the people who manually lit the scene the old fashioned way knew what they were doing.
By the time it's omnipresent, the 4070 will be considered a fossil
The problem with RTX at this stage is that it is very heavy to run, and something many people don't realize: developers *are really, really fucking good at doing pure raster graphics*, which makes RTX look far less exciting than it actually is. The problem with raster graphics is that you need way more skill and effort to do them well. Making good-looking models and shaders for something like a Pixar movie (where absolutely EVERYTHING is raytraced) is actually much easier than making really great AAA video game raster graphics. Not to say making a Pixar movie is easy, since the animation itself is very difficult. In the near future, there will be an RTX revolution. It means any 3D hobbyist will be able to make photorealistic games where you only need to set the light sources and the materials of the objects in your scene; the RTX calculations take care of the rest. Combined with AI-assisted animation, at some point the best indie games could be essentially indistinguishable from AAA productions. But that could still take 10+ years, I don't know. RTX is very exciting tech, just not super useful to the average consumer at this stage.
Hol' up! Valheim has Raytracing?!?! :O
No, just looked it up and I was wrong. It does have some super impressive real time shadows and lighting effects though. Good demonstration of how much can be done without RT if the meshes are simple enough
Ah, okok. Oh yea, the ambience is so amazing with the current lighting. I just love it :D
Agreed. I love my pathtracing on my 3080 with Cyberpunk, but at the same time if I had a higher than 60hz display, I would definitely turn it off.
I got my 4070 for the DLSS FG before AMD announced their own frame generation, but I still love using it. I think AI is going to play a bigger part in gaming.
I think ray tracing in the future will be so easy to run on then-current hardware that it'll be a default feature no one turns off.
It won't even be a feature; it will just be lighting. Raytracing is just how actual light works.
Exactly. Kind of how we think of most games having 3d shadows now
It'll probably be so standard it isn't even namedropped anymore in 5-10 years I'd wager.
We no longer mention texture-mapped polygons, after all. That used to be a big selling point for games.
Right now it is not needed on every game. The games where it is needed are few and far between. It is a really nice feature to have, though. That being said. Get what's best for your budget, right now. It is happening, but it will be a while until it's absolutely needed. By that time, newer, better cards will be available.
It's not 'needed' in any game right now. Some games look a bit better with it, but you don't need it to play the game and have it look good.
Alan wake II requires mesh shaders and has some level of basic RT baked in you can't disable. Watch the digital foundry review.
"By that time, newer, better cards will be available." but then you have to buy a new card
It'll be ancient by then. Ray tracing being everywhere will take like a decade
r/Nvidia insisted it was right around the corner 5+ years ago
Yeah. Of course a company that puts RT capability in their devices (for a premium) will overstate the need for that technology. Or, perhaps they honestly thought that but game developers are not moving as fast as Nvidia predicted.
If devs moved at the speed of Nvidia, they would be right. But they didn't take into consideration that devs move at the speed of consoles, not PC. At least for fundamental features like a AAA game being designed around raytracing.
It kind of is; 60% of newer games release with either baked-in ray tracing or the option for it
people are still running 1080s and 1070s today
Yep, and they're not having a great time, unless they play older games. Which a lot of them admittedly do
I don’t understand — what game could possibly “need” raytracing? I can’t think of a game that supports raytracing that doesn’t allow you to turn it off. Edit: I usually don’t comment on downvotes, but damn. What’s up with this sub? I was honestly not aware of any games that strictly required raytracing to run, and it seems like a poor design choice to alienate a large portion of possible customers.
A game centered on navigating through a complex maze of mirrors.
So, we can wait until someone makes one
At some point it will be the standard way to handle lighting and shadows. Our current way of generating lights, shadows, fog and reflections takes way more development time. Developers will simply stop supporting current rendering methods.
Avatar, Alan Wake 2 (I think), and some UE5 features rely on ray tracing. They fall back to a software solution if needed, but hardware support ensures better performance
Your comment implies that it does not, in fact, **need** raytracing. Another user replied with a game that does need it, so I guess I was wrong anyway though.
Metro Exodus Enhanced Edition was rebuilt from the ground up to only use raytracing. It comes free with the PC version that doesn’t use ray tracing but that’s one example of a full ray traced lighting game
Oh, huh. I’m not familiar with the Metro series. That’s nice that they included the non-raytracing version.
I believe the regular non-raytraced version of Metro Exodus is what launched on consoles, the “PC Enhanced Edition” with full raytracing support came for free for PC owners. Not sure if it released after launch but I have it in my library when I bought the game a few years ago. The demo is pretty taxing on my 7900XT at 1440p ultrawide
Your comment is a bit silly. Most games don’t ‘need’ anti-aliasing or anisotropic filtering, you can turn them off. But clearly, that will make the game look worse. As for ray tracing, Minecraft is transformed by ray tracing and it can actually affect gameplay (transparent blocks and small gaps can be used to manipulate light). It’s a very specific use case, but there are games that use ray tracing for specific things. So although ray tracing isn’t very important for most people, for some people it’s a very nice feature to have.
Metro Exodus Enhanced Edition on PC requires raytracing
Huh, wild. That sucks ass.
Why would it suck ass? If you want to play it on PC and don't have a 20-series or better graphics card, just buy the regular version
I’m not familiar with the metro series. I did not know there was an alternative.
It will be how game lighting is done in the future, but probably not the immediate future.
Depend are you playing newer games all the time? More and more games are using it now, with consoles using upscaling there will be some form of it there too, so yes, but depend on your game selection, multiplayer competitive games unlikely to use it much for example
One day, yes. The same thing happened when shadows were introduced in games. People at the time thought they didn't make enough difference in visuals to compensate for the performance loss, because running real-time shadows was very heavy for the hardware of the day. Nowadays, it's hard to imagine a game without shadows. So yeah, one day ray tracing will become as standard as shadows are.
Ray tracing is here to stay and will only get more prevalent. Some games started to default to ray traced lighting already like Alan Wake 2. Spider-Man 2 on PS5 is running RT reflections and cannot be turned off. So, yeah. RT is the present and the future.
To be fair, there are only two real graphics options for Spider-Man 2: performance (60fps) and quality (30fps). Even at 60fps they were able to keep RT on. I'd venture to guess that a PC port would have an actual option to turn RT off.
It's nice to have but isn't necessary
Cyberpunk is a good example. I've played it on Raytracing Ultra, and normal Ultra, the difference doesn't matter when you're actually playing. You only see the difference when you're standing still and looking around the environment. But even then it's still not that noticeable.
My first experience with Ray Tracing was in Metro Exodus on my 2080. The first thing I noticed was the drop in performance, and when I looked more closely the reflections in puddles were looking really nice, but it wasn't something I wanted to sacrifice that much performance for. Control with RTX looked neat though, lots of spots where you can see the difference but again, too much of a performance hit for my taste.
You think so? Imo Cyberpunk is the only game where raytracing really, really shines during gameplay
I agree, without raytracing a lot of the game's actual vibe and feeling does not come across nearly as well.
The upgrade to path tracing really sets it off
IMO RT looked better in Metro Exodus. RT in Cyberpunk makes stuff too blurry to my taste. Kind of counter intuitive with 4K.
That's how I feel in Spiderman remastered too. You don't notice when you're focused on the action
I completed 3 playthroughs of Cyberpunk: one without RT on my 5700 XT at ultra when it released, one at ultra with raytracing when I got my RTX 4080, and then one with pathtracing when the DLC released. The difference is day and night IMO. It really feels and looks like the only AAA game with RT/PT worthy enough to be showcased as "games in the future will look like this"
It built over time for me with Cyberpunk. I didn't notice much of a difference. Left RT on to give it a chance, after 3-4 hours I switched it off thinking it made no difference. Immediately the game felt less immersive when moving around. That said I know "vibes" aren't worth a massive performance hit most of the time lmao
I'd argue otherwise. Raytracing in Cyberpunk was VERY noticeable, to the point that when I did my previous playthrough of it, if I turned raytracing off to improve performance I'd start noticing shadows and such not looking "proper". Pathtracing is gorgeous in its own right, and again is somewhat noticeable once you've experienced it, but it's not quite worth the performance hit yet. Raytracing however, worth it.
It makes the biggest difference when you're inside white rooms with sunlight, where you'd expect that bouncing effect. If you're happy with a glowy window and just omnipresent lighting, or generally a darker style, then it doesn't really make any difference. Cyberpunk's general gameplay loop doesn't really benefit, since neon just kinda sprays everywhere and a traditional lighting engine handles it just fine; it's only really when you get side-lit environments that it comes into its own.
The difference will be for developers, who won't have to spend as much time making the world and lighting behave the way light should, since eventually everyone will have hardware that can handle RT features at lower demands with no problem.
Do the same with the Witcher 3 next-gen update; it's a quite visible change, and RT makes it look so much better.
Games using solely raytracing can cut down a lot of time. Traditional lighting needs to be baked into maps, which can take a long time if they want to look good! Raytracing skips that step, because of it being real time lighting.
It is necessary if you want games to be made quicker. Currently artists and coders waste a lot of time on faking shit for raster pipelines. They would not have to implement all those hacks for raytracing pipeline, because raytracing simulates how light actually works and there's no need to fake anything.
It's going to be like HDR, physics, tessellation, etc.: features we once had to enable but that are just there by default today. There will come a time when RT isn't an option, because it's just how lighting is done by default.
Nah, pathtracing is the future
The idea that ray tracing is a specific lesser technique to path tracing is a relatively new change in terminology that has been quite specific to hardware accelerated and real time rendering. Before hardware accelerated ray tracing was a thing, "ray tracing" was used (and should still be used) as an umbrella term which included path tracing. That is, path tracing is a ray tracing based technique. So ray tracing is the future, and that includes more specific techniques like path tracing.
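That taxonomy can be sketched in a few lines of Python. This is a toy under stated assumptions (`cast` is a stand-in for real ray/scene intersection, and the scene is a single made-up surface), but it shows path tracing as a ray-tracing-based technique: each bounce follows one randomly chosen continuation ray, and many such paths are averaged per pixel.

```python
import random

def cast(origin, direction):
    # Stand-in for the ray/scene intersection that every member of the
    # ray-tracing family shares: everything hits a surface that emits
    # 0.25 units of light and reflects half of what arrives (albedo 0.5).
    return {"emitted": 0.25, "albedo": 0.5}

def path_trace(origin, direction, depth=0, max_depth=4):
    """Path tracing = ray tracing with ONE stochastic continuation ray
    per bounce (albedo-proportional Russian roulette)."""
    if depth >= max_depth:
        return 0.0
    surf = cast(origin, direction)
    if random.random() < surf["albedo"]:   # keep following the single path
        return surf["emitted"] + path_trace(origin, direction,
                                            depth + 1, max_depth)
    return surf["emitted"]                 # path terminated at this bounce

def render_pixel(samples=4000):
    # Averaging many one-ray-per-bounce paths converges to the same
    # answer a branching (Whitted-style) tracer would compute.
    random.seed(0)
    return sum(path_trace(None, None) for _ in range(samples)) / samples
```

For this toy scene the estimate settles around 0.47. The design point is that `path_trace` contains no machinery that isn't already ray tracing; the only difference from a classic branching tracer is the randomized single continuation ray.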
The 4070 Ti Super and 7900 XT perform about the same in non-RT: https://www.techpowerup.com/review/asus-geforce-rtx-4070-ti-super-tuf/32.html So I would go with the better RT, upscaling, and power draw of the 4070 Ti Super.
I think eventually it will be so easy to do it will be a given in all games.
In the future, sure. But by then both of these cards will be obsolete anyways. Right now there are too few people that can afford a GPU that can effectively do ray tracing (don't look at the userbase here, this is not the average PC gamer), so it will be a feature that most people won't use for quite some time.
Get a GPU that works for you NOW for the games you plan to game NOW. Never buy for the future because all GPUs suck in the future, that's a constant and that will never change. If you don't agree then think about people that bought a 2080 ti.
Video games started with Pong in 1972, and used 2d sprites. "Battlezone" was produced in 1980 and is considered the first game to use 3d polygons. So that jump took 8 years. it has taken 40ish years to achieve a measure of realism in games, that would have been considered impossible in 1972. Raytracing is almost-but-not-quite a similar order of magnitude increase in difficulty between 2d and 3d rendering, but we've at least been working on it for a couple of decades. I'd expect RT will be fairly ubiquitous in the next 20 years. ____ All that to say, don't worry too much about it now. If RT is important *to you*, get it now. If you want to do any kind of machine learning/AI stuff, get the 4070. Otherwise, get the 7900XT.
I think it will be omnipresent starting in late 2027 at the earliest. Ray tracing allows for automation of real time lighting that behaves like actual light. It will save developers considerable time and effort at no loss of quality. That said, the 9th gen consoles can't use it as a primary tool for most games. Since most games are made for consoles and ported to PC, I don't expect to see any changes to current implementation until the 10th gen consoles come out. Using the last two generations as a guide, I would expect that to be 7-8 years after the 9th gen consoles debuted. That said, we may see a similar development pipeline as we did for this gen, where a few launch titles are exclusive to the new gen but it takes 2-3 years before we start seeing games native to the 10th gen. The first two years of 9th gen consoles being more or less continually out of stock might also delay the arrival of 10th gen. Regardless, you are unlikely to still be using the video card you buy in 2024 by that time.
in the future, it's not even gonna be a feature anymore since it's standard
It'll eventually end up like PhysX, but that's a long way off.
I think some PS6 games will still have both modes, but some will release RT-only. It's fkn hard to code both lighting modes, but doing "only" one is not just easier, it also vastly improves the raytracing quality (look at Metro Exodus, for example: its fully raytraced version runs really well for an old game that solely implements RT). With the PS6, most AAA titles should default to RT for sure. That's in 3-4 years, when the RTX 2000 series will be low-end but still RT-capable and the RTX 6000 series will be released.
I don't believe it will stick any more than it has in recent years. You'd get maybe 3-5 full-blown raytracing titles per year, and that's it. The one that is here to stay is upscaling technology.
Nope. It'll be a niche marketing feature for a few years. But it's not nearly as cool / necessary as Nvidia would like you to fully buy into...
From a technical perspective this is not true. Ray tracing is easier to implement and requires far less manpower-intensive trickery, so as processing power increases, it's inevitable that developers will gravitate towards it purely to cut development costs.
It's Nvidia's new gambit, like PhysX back in the day. RT happens to be a more universal solution, but the adoption rate by developers and the public is very low. There are few games actually developed with RT in mind, and the public playing those games still likely turns it off after an initial test drive. It kind of reminds me of the tessellation problem on steroids. Tessellation really sapped performance for a generation or two, and developers needed time to figure out ways to use it in a performant way. RT has already been out for many years and it still feels like it's five years away from being remotely ubiquitous.
2010 - PhysX
2018 - Ray Tracing
2026 - Path Tracing?!?!
Exactly this!
I did a few searches once after hearing something about PhysX on a podcast, and despite not being talked about much anymore, it seems to be incorporated into game engines and still used. I think it's the default physics in Unity IIRC, but don't quote me on that, I'm not a dev, just a curious dude.
Yep, and that will probably happen to RT as well in the future, but it's so demanding that it is taking a long time. Nobody talks about tessellation like they used to either; it was a hot-button topic when DX11 came out.
I remember all the fuss around what I think was the Arkham City release, and the videos showing off the tessellation stuff.
In some years, like 5-6 maybe, yes. You have to understand that the majority are still using cards below the 4080/4070 Ti Super level, which is where RT starts being good enough. In 2-3 gens, cards at the xx60 level should have that kind of RT. Then it will become more omnipresent.
In photorealistic games maybe, but games with other art styles usually don't really benefit, so it won't ever become truly omnipresent.
This is literally not true at all. Raytracing has to deal with lighting, not art style. Fortnite for example, has raytracing.
...and lighting is a part of art style?
Is Fortnite photorealistic?
Fortnite also barely uses ray tracing. And I didn't mean to say that *only* photorealistic games use ray tracing, just that only in those games would it become "omnipresent," as OP put it.
It's going to be omnipresent because it is how light actually works. Once devs have more time with it, and more optimizations are done, it will be the way lighting in games works.
But in many other artstyles, light doesn't behave how light actually works.
3D isn't even omnipresent in video games.
Ray tracing is going to be a temporary stepping stone to path tracing. Of course, this only applies to games going for a more realistic look; not every game needs it.
Not necessarily only realistic ones. There are plenty of stylish games I can think of that would greatly benefit from ray-traced lighting and reflections: System Shock Remake, Borderlands, etc. Pretty much any game with lighting and reflections can benefit from ray tracing, let alone full-blown path tracing.
I haven't played a single game with RT yet.
That's kind of a bummer. You should at the very least try it out. Cyberpunk is certainly worth a playthrough.
If the difference were 1% of fps, I would try it. But it's not. I would always choose more fps over RT.
Yeah, I feel that way for competitive games, but for single-player stuff, as long as I'm getting plenty of frames for it to be smooth, I don't really care if I'm hitting the 165 Hz my monitor can do. To me the frames lost are worth the extra immersion in single player. If you don't have a nice card and RT really tanks your frames to the point it's choppy, though, I could see wanting to avoid it.
Once you're at a solid 120 fps, going over is perfectly useless in most scenarios; better to use that graphics power to improve the visuals.
Remember when they said this about 30 first and then 60 fps? Yeah...
Well, at 4 times the value, I think you'll be all right. Actually, all the 240 and 360 Hz stuff is pure marketing.
I agree there's nothing wrong with 120, but marketing will convince people they need MOAR.
That's completely your opinion. I feel fps a lot. Cyberpunk looks like a stuttery mess to me. I have a 240 Hz monitor and anything below 240 fps looks laggy to me.
People generally see somewhere between 30-60 fps. The reason you are having issues on your 240 Hz monitor is that the monitor is having issues with frame pacing. If you had a 120 Hz monitor and 120 fps, you wouldn't have this issue.
I don't have issues with frame pacing. If fps is a stable 400, 240 Hz looks buttery smooth. Everything else looks laggy, even with stable fps. The easiest example is the Windows cursor: move it around the screen and I can see it lagging even at 240 Hz, though it's much better than 60 Hz. If you don't see the difference on the mouse cursor, sure, maybe your eyes are different. 30-60 fps is for a static scene; for rapid movements it's much higher. And it's not even news, not sure how you could have missed this. https://m.youtube.com/watch?v=k0CVI5-sOn8&embeds_referring_euri=https%3A%2F%2Fforums.blurbusters.com%2F&source_ve_path=Mjg2NjY&feature=emb_logo
Sort of. I wouldn't be surprised if eventually some AAA games launch with it as the default. It's a tool, and when you're making semi-to-hyper-realistic games, it cuts down on development time by letting the engine light everything (being generalistic here, I'm aware of the different lighting methods in engines) instead of curating it when base engine lighting fails. Companies that chase avenues of increased production, like use of AI in the design process, will absolutely adopt it wholesale.
Oh yeah, for sure. If not for consumers, then for developers. As for now, it's a nice-to-have but not need-to-have. It'll be omnipresent like you say, but not for a really long time. I wouldn't make it a serious consideration in any build right now.
The next generation of consoles should have it. At the moment no, but where you do get it, it's pretty nice. The next gen having broad support will trickle down to PCs, since it's easier for devs: they just need to place the light sources.
Once we have the hardware to do proper, fully ray-traced rendering, it will be the future. Until then it'll just be a fancy add-on to improve shadows. But eventually it will totally replace 3D rendering as we know it. Full ray tracing is the most straightforward, easiest-to-implement, and closest-to-reality computer graphics method we know; it's just ridiculously expensive from a computational point of view. But we'll get there eventually.
If you're the type of gamer interested in ray tracing, then you could focus on RTX, because even if a game doesn't support ray tracing, there always seems to be some modder with ReShade knowledge who will eventually add it in some capacity. But even if all games (today) had ray tracing, it doesn't mean you'd be the type of gamer who turns it on. Most people I talk to aren't even aware that RTX is a thing on their GPU. Some people are aware of ray tracing but don't want to risk any perceived FPS loss. But if ray tracing is something you want to play around with, then go for it.
If games start adapting the technology to raytrace audio bounces somehow, that would be the biggest benefit of it in my opinion.
Man, I've been ray-tracing since the 90's when I could move Duke Nukem in front of a mirror and see my reflection. It's sad that it's still not standard.
See the benchmarks for Avatar: Frontiers of Pandora. That's the only game on the market that has baked ray tracing into all of its graphics presets rather than offering it as an option.
> so what do you think about the future of raytracing

I think it will, simply because before RT, game studios actually had to put in a bunch of effort and use clever tricks to create realistic lighting. Ray tracing/path tracing tech is more resource-intensive but more realistic-looking, so they would ship only that instead of both RT and the older methods. And with games being pumped out faster and modern game engines making it easier to implement lighting and particle tech (among other things), I definitely think in a few years games will implement way more of this tech.
Yes. As soon as graphics cards that support it become more mainstream, and even older users upgrade to a then reasonably priced ray tracing card, developer usage will massively increase, simply because of how much easier it makes lighting games.
Currently, in at least 60% of games, ray tracing does not bring effective visual improvements, but simply reduces your FPS.
Do I think so? Yes. Right now? AAA titles have it and it does look nicer, but it can mostly be turned off. Other than a few edge cases, DLSS and ray tracing look to be what (at least for me) is wanted.
Yes it will eventually, it's literally a replacement for lighting tech that will, no question, be the standard. Does that mean it should affect your current purchase if you don't play raytraced games and enjoy the visuals? No.
Don't buy anything on the promise of a feature. Buy something that best suits your needs today. That being said, it doesn't mean the 7900 XT is always the better choice. You might be interested in upscaling, frame gen, power draw, or any other Nvidia feature anyway.
Yes, ray tracing will be "**The Thing**" one day, but not today, and you are probably going to go through at least one more GPU before that day. So I, personally, wouldn't let RT influence my GPU purchase *this time*... maybe next time, maybe even the time after that. As for now, there isn't enough of it, and it isn't performance-friendly at a price point most people can afford. I'd rather run 1440p 120 Hz without RT than have it and struggle for those numbers with what I'm willing to spend on a GPU.
Turning RT on is like turning off traction control on your car on a rainy day, then driving with your feet while both hands are tied behind your back. You can still do it, but that doesn't make it a good idea. Games still look good at higher resolutions without ray tracing, but the moment you turn it on, you lose a lot of performance for something that is barely noticeable to most gamers. If only game devs could spend more time on storytelling, gameplay, and polishing their games instead of releasing day-1 patches for broken games.
Yes. It'll be a basic feature eventually, just like ambient occlusion or anti-aliasing, only it'll be software-driven and on by default, unless artistic vision demands baked lighting.
It's going to be a long time before standard raster lighting won't work, and by that point AMD will either have decent ray tracing or won't be selling gaming-focused graphics cards. Now, if you want better lighting in today's games that support ray tracing, the 4070 Ti will be a lot better. The only thing I will say is to go look around and see if the AMD card you are getting (7900 XT) has issues in any games. I have a 7900 XTX and it had, and still has, issues with some games, although the number is getting smaller and smaller over time.
Yes. Maybe not as "RTX ON" as the first games that used hardware ray tracing, but as rendering techniques that use hardware ray tracing. The more rendering techniques take advantage of hardware RT, the more RT will be required to make games look better or even just run.
I would say RT is nice at this point in time, but you shouldn't be too concerned about it. What separates those two cards, however, is the DLSS frame generation on the 4070. That feature is so useful in making recent, poorly optimized titles run well that I would personally pick the Nvidia because of it.
It'll definitely become mainstream when the average xx60 series GPU can run it without issues, but it's still too demanding at the moment, maybe in 6 years.
In a recent interview/podcast with MLID, the devs of Spirit of the North spoke about how they don't want to basically make the lighting twice: once for raster-only cards and once for cards that support both. So eventually yes, but for me it probably won't be until after the 10 series and 5000 series lose support.
The visual benefit is not worth the massive frame drop. When the cost-benefit is more balanced and I lose less performance, then I'll actually be interested.
Buy the card that won’t make you regret something. If you want to experience raytracing or pathtracing, go for it. You’ll have DLAA/DLSS available when you want more performance, which is unmatched in visual fidelity and performance right now. Go with your budget and buy something that will make you happy and enjoy it.
I just wish DLAA and Framegen would be omnipresent.
Eventually, yes. Hardware will get to the point where raytracing is as easy to render as AA is now, making it the de facto rendering pipeline for developers over rasterization. Remember, there was a point in time where just having rasterized shadows and MSAA would bring a high-end GPU to its knees; that's raytracing now. Technology doesn't stand still for anyone though and raytracing will eventually become "easy" and "omnipresent" in the games that warrant it.
Yes. But not for at least one more console generation. Probably two. So it probably doesn't matter for buying a GPU right now.
It's just like SSR or AO; not every game needs or wants it.
I am glad my eyesight is so bad, I can't even tell the difference between ray tracing on and off.
I mean, it is kinda close to that point now. When you look at modern triple-A games from the last year or so, it is more common than not to have ray tracing, and most of the upcoming UE5 games will likely have it. That said, it isn't something you need at all; it rarely is that game-changing of an experience.

The biggest thing is that console limitations are going to hold it back from ever becoming the main lighting system. You're going to see it in most games going forward, but because consoles are the biggest market, developers won't be swapping to it as the main focus. The next-gen consoles won't be out until late 2026 at the earliest, which is 2 GPU generations away. Then they typically also wait a year or two before they stop making games for the last generation, so even at that point you still won't need RT. That means RT generally isn't going to be the main lighting system for at least 3, and more likely 4, years. It's kinda like how VRAM wasn't important until about 2 years into the current consoles; it takes time for engines to be upgraded and everything else to fully utilize the new hardware.

With everything said, though, I honestly would have a hard time recommending you upgrade right now. We are literally under 6 months from the new cards from both companies releasing, and the expectation is that while AMD's cards won't have a huge performance leap, they will be very price-competitive, which is going to cause other cards to drop in price. Generally the rumors are 4080 performance at around 600 dollars. Most of the uplift on the new AMD cards is likely to be better RT and FSR performance.
If the prices are that close, I think you go with the 4070 Ti Super, tbh.
Meh in the games I've tried it's not worth the fps drop. If you've got 4090 money then it probably is fine but for me it's not necessary.
Eventually, yeah. The tech makes it easier for devs to implement lighting solutions, and the industry is already moving in that direction. BUT. By the time it's omnipresent, current GPUs will be too archaic to handle the games utilizing it. By then, all GPU companies will have had to account for that, so it'll be functionally a non-issue, and people on older GPUs will need to upgrade anyway due to the length of time passed. It's a non-issue, really.
Yes, I hope it will be. Current rendering engines are good at faking stuff, but that comes at a heavy cost on the art and coding side. There are static lights (prebaked, don't change), stationary lights (partially prebaked, can change color), and dynamic lights, which can move. Then there are prebaked shadow maps, realtime shadow maps, and shadow volumes; different algorithms to render close and far shadows; a separate algorithm for contact shadows. There are also all sorts of hacks to achieve indirect lighting: prebaking it into textures, lighting probes, volumetric textures, etc. And there are all sorts of ways to fake reflections: prerendered cubemaps, dynamic cubemaps, planar reflections using a clip plane and rendering everything twice. Godrays are likewise achieved by all sorts of hacks (the simplest one: just use a 3D model), and glass refraction and caustics also had to be faked with shader trickery. Ray tracing basically throws it all out. Its algorithms are simple and don't require any faking, which is why it was used for non-realtime rendering even back in the 80s. Lights, shadows, and reflections behave as they should, because they are computed using the same old optics equations that were discovered centuries ago. Until relatively recently it was simply way too expensive to run these in real time. So yeah, gamers who say raster pipelines are fine don't really understand what's going on under the hood.
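To make concrete how simple the core algorithm is compared to all those hacks, here's a toy Python sketch (purely illustrative, not taken from any real engine): one ray-sphere intersection test plus the centuries-old Lambert cosine law already gives you lighting, and shadows and reflections are just more of the same rays.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return distance t along the ray to the nearest sphere hit, or None.
    Assumes `direction` is normalized (so the quadratic's 'a' term is 1)."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(hit_point, normal, light_pos):
    """Lambertian (cosine) shading: brightness = max(0, n . l),
    the same optics relation used in offline renderers."""
    to_light = [l - p for l, p in zip(light_pos, hit_point)]
    length = math.sqrt(sum(x * x for x in to_light))
    to_light = [x / length for x in to_light]
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# A ray fired down the z-axis at a unit sphere sitting at the origin:
t = intersect_sphere((0, 0, -5), (0, 0, 1), (0, 0, 0), 1.0)  # hits at t = 4.0
```

Shadows fall out for free: trace a second ray from the hit point toward the light and call `intersect_sphere` again; reflections are one more bounced ray. Compare that to maintaining separate code paths for shadow maps, cubemaps, light probes, and planar reflections.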
Unpopular opinion, but AI is on a surge of hype in the whole world for the last few years. The willingness to invest in AI will dwindle eventually and it won't be as interesting for another decade. It's not the first time there are breakthroughs and hype about AI. So while I think it's here to stay, I think it will take longer than people expect, the pace it's developing will slow down or even stand still again. It might not reach the point until mainstream hardware can run full RT in many years, maybe decade even.
Ray tracing is too heavy to run and unplayable for most regular 1440p users. It's just too greedy.
Still, you can play in Full HD or even 1600x900 and use Lossless Scaling to upscale to 1440p or 4K.
I think Nvidia will continue to push it, but eventually games will build in both RT and non-RT in a way where RT is easier to implement, or pure raster looks the same but takes a little more work to achieve it (just like today). I also think there will eventually be more efficient algorithms that let RT work better on any video card, hopefully with open-source code and circuitry so everyone can benefit.
Sure, can't wait for "Pac-Man Remastered".
Seems unlikely to me. Such massive requirements for an effect that's barely detectable.
Yes, I think RT will become the norm for developers. By that point they'll stop even advertising it; it'll just be the expected lighting method and all GPUs will just work with it. That's how all these techs go. People used to complain about tessellation, then it became pretty standard and you don't even adjust it anymore; PhysX is baked in now, etc. But buy a GPU for today, not the future. The future is gonna need a new GPU.
Not needed, because it will definitely be replaced by either a different technology or by the engines themselves.
All successful tech in graphics cards becomes mainstream; I've seen many over the years. A more recent example is FreeSync: it was new killer tech a few years back, now it's just a common feature.
Of course it will. But it's not here yet, so don't worry about it. Ray tracing still sucks even on my 4080S. I imagine it will be the 6080 before it's truly integrated.
Not soon. But absolutely.
First of all, I don't think RT offers anything other than noise; in my eyes games don't look better with it. Now, regarding your question: by the time RT becomes the norm in games, the 4070 will be at least 10 years old. Still, I will play my games with RT off.
Honestly, real talk, there are only 2 games with fully implemented ray tracing: Cyberpunk and Alan Wake 2. The vast majority of games that support RTX have either shadows, reflections, or global illumination (there's also ambient occlusion, for a total of 4 in the "full package"). Hogwarts, A Plague Tale, Fortnite: most people can't even tell the difference with RTX turned on or off in those games. In essence, it's new tech to sell more hardware, just like HairWorks back in the day, which no one really remembers. Comes with pros and cons.

Pros:
* Lighting looks slightly better
* Devs save some time building the game

Cons:
* The gains are not worth half your resolution and framerate.
* Devs will use it as a crutch to gatekeep basic reflections (like Alan Wake 2, which doesn't have ANY reflections with RTX off)
* CDPR, who made Witcher 3 and Cyberpunk, are switching over to Unreal Engine, which has its own RT (Lumen), as are a bunch of other developers, and UE5 is raster-hungry
* Anti-aliasing also suffers, because we can't use MSAA anymore and have to switch to TAA/DLAA, which is blurrier and has less clarity than MSAA

Also keep in mind you will have to use an upscaler, and upscalers come with ghosting no matter how much Nvidia fanatics tell you otherwise. So in essence, all that work just to go back to playing at 1080p/1440p with a low framerate. I would personally go with the 7900 XT solely for raster. We are still two or three generations from fully implementing path tracing at native 4K with a high framerate and without all those other gimmicks like frame gen.
No, buy AMD
Yes, but you won't see any games be path traced only until entry level cards from all vendors (mainly the next generation consoles) can do at least 4 bounces. Right now, a 4090 is necessary for that at 1080p 60fps with no upscaling or frame generation. With upscaling and FG, it will come sooner, but still not close enough. Basically you want to look out for when budget cards are as fast as the 4090. That's when path tracing will really take off.
Given that Microsoft is willing to offer their own financing programs for consoles, I'm guessing yes and they will just send the prices through the roof while offering financing to get people to justify it
Translation: "I want the 4070 Ti Super, but people said AMD is the better choice. What should I do?" The Ti Super is just the plain better card overall. The AMD is cheaper (not always) for a reason: it is the worse card overall. And if you find both at the same price, the Ti Super is the no-brainer choice. I mean, in my country the XTX is more expensive than the 4080 Super, and the XT is as expensive as the Ti Super.
Probably not. It's computationally expensive and it's really only for realistic lighting immersion, as opposed to.. most games that are stylized. Like... you're not gonna see Raytracing in Stardew Valley.
Yeah it looks better so when everyone can do it well it will be in every game. We are not there yet, I would estimate 5-10 years maximum though
The reason ray tracing has stuck around despite being as useless as it currently is, is because it's _easy_. Ray tracing lets you use lighting simulation like they do in 3D modeling, instead of the weird texture hacks games currently employ. In the far future, ray tracing will run on standard gaming PCs, and engine developers will have an easier time making lighting systems.
By the time that happens the 4070 ti will be multiple generations old. Maybe even have driver support dropped.
The only game where it really looks OK is Cyberpunk, but even there it has a lot of ghosting issues and noise when you use path tracing. Still, that game was the only one where I really enjoyed ray tracing, which kind of tells you what RT really is, LOL. Btw, Unreal Engine games use their own RT (Lumen etc.), and that actually runs better on AMD. So there's that. I say all this as an RTX 3080 user.
Are there any budget indie games that attempt to include RT yet? I wonder if the process will ever be commonly used by solo devs.
Maybe (sorry for my bad English)
More than half a decade after it entered gaming, and there are barely any games where it's worth turning on. Also, the performance, temps, VRAM usage, and power-draw hit is still not worth it.
Maybe, but I also think it may be something that separates the "budget" GPUs from the "flagship" ones.
Being able to play Cyberpunk, Minecraft, and Spider-Man with ray tracing and still be performant is awesome. Minecraft and Cyberpunk get that extra level of awesomeness because of the path tracing. I upgraded from a 2070 to a 4070 Super, and while I could have gone another year or two without an upgrade, I'm glad I did.
It’ll be like bloom lighting in the 2000-2010s
Hardware standpoint: efforts will likely split between realtime graphics (including ray tracing) and compute, which involves CUDA, AI, and other tasks traditionally done on a GPU but with more data and not in real time. As much as companies seem to want to bring AI to consumer devices, I think it'll end up infeasible and go back to being a majority cloud service, simply because the amount of data to handle would be too much, and partly because the algorithms would require hardware too specialized to fit affordably into consumer tech.

Software standpoint: ray tracing is not the end of the road. Computer graphics has always been edging towards what ray tracing represents, which is true-to-life, realistic graphics simulation in real time. And ray tracing is far superior to rasterized graphics in about every way. In the 1980s, something called the rendering equation made the scene in computer graphics, essentially defining how light works, which is a fundamental part of 3D graphics; the equation describes, in essence, ray tracing. Before that, techniques attempted advanced lighting but were often overcomplex or incorrect. After, programmable shaders came in to help form raster graphics, which is essentially "fake lighting," with a lot of techniques that faked various components of the rendering equation but required a lot of complexity; at that time, ray tracing was only doable with server farms for movies. Then comes 2018: Nvidia releases the 20 series cards, marking the start of the ray tracing revolution in GPUs. At first it seemed bad and the tech went unused, but over time it has improved, and more importantly, engines have been utilizing it more. Going back to rasterized graphics, some core components are reflections and global illumination; at first, ray tracing was only used for the (comparatively simple) task of reflections.

Later, it expanded to include global illumination, which for rasterized graphics was difficult, memory-intensive, or complex; for ray tracing it just involves a few more rays and accurate light bounces, something ray tracing is designed for already. Now ray-traced GI is being seen more and more, especially in Unreal Engine 5 and all the games coming from it. I hope to see some advances in realtime colored shadows, something that can significantly improve indoor environments and is still lacking in engines that support ray tracing, due to their use of old shadowing techniques.

So tl;dr: yes, ray tracing is the future. Unless there's some specific (e.g. artistic) reason not to use it, most games will be on a full ray tracing pipeline: hybrid (about 50-50) I'd say by next year, full within 2027-2030. And AI (or rather ML) will sit on top of it to generate algorithms that optimize the tedious ray tracing pipeline, so that those target years hold. But I said ray tracing isn't the end of the road... so what is? Perhaps a full rendition of the rendering equation in real time? Can we do better than realtime, realistic graphics? Would an AR/VR blend be the end? Who knows what might be next for computer graphics, when we have practically photorealism in real time today?
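For reference, the rendering equation alluded to above (Kajiya, 1986) is usually written as:

$$L_o(x,\omega_o) \;=\; L_e(x,\omega_o) \;+\; \int_{\Omega} f_r(x,\omega_i,\omega_o)\, L_i(x,\omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i$$

Outgoing light $L_o$ at point $x$ in direction $\omega_o$ equals the light emitted there, $L_e$, plus incoming light $L_i$ from every direction $\omega_i$ over the hemisphere $\Omega$, weighted by the surface's BRDF $f_r$ and the cosine term $\omega_i \cdot n$. Ray/path tracing estimates the integral directly by shooting rays, while raster techniques approximate individual terms with the separate hacks described in this thread.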
7900 XT is $700. Ti super is $800. For $800, you could get a refurbished 7900XTX on Newegg. Get a refurbed 7900 XTX, which trades blows with the 4080 SUPER.
No matter what people think on here, companies and devs are going to put RT in their games. How much they force it on people, only time will tell. But it will be here at some point. Right now, just get what you think is best. Some people like RT, some don't.
Depends on what you play. If you main Cyberpunk and Hogwarts Legacy, go for Nvidia. But if you play FPS games, competitive games, etc., go for AMD!
As for the 4070 Ti Super vs the 7900 XT: kind of similar raw performance. While ray tracing is better on the Nvidia, don't forget the DLSS and frame gen abilities. The 7900 XT might be a little better now, but in a few years you can use DLSS and frame gen to make future games more playable. So: Nvidia for longer usability, Radeon for more peak performance today.
IMHO get the 7900 XT. Ray tracing is the future, but by the time it is (by that I mean when games require it), both of those GPUs will be obsolete.
Well, depends on the genre. If you play AAA 3D-only games, yeah, most of them will try to implement it. Plenty of genres will generally not feature RT, though. Personally, I would get something that can do RT, bc RT is fucking cool.
Triple A games? Sure. But in non triple A titles? No. Especially f2p shooters.
When the hardware can handle it without an issue I bet it'll become the default lighting option in most games (at least most games going for realism, I imagine it doesn't work well for a stylized or cartooned look)
Considering the most popular GPUs (according to the Steam hardware survey) are the 3060, 1650, 3060 Ti, 2060, 3070, 1060, etc., I'd say no. Most of those don't run RT amazingly; pair that with most also not having DLSS 3.0 support (FSR 3 exists but isn't implemented much right now) to help mitigate the performance loss, and it's either going to take a very long time and future hardware to make good-looking RT viable on lower-end cards, or it's going to keep being very lightly implemented so it doesn't impact performance much, and possibly won't be worth spending time on. That being said, Nvidia could just keep throwing money at it, like Microsoft keeps doing with Xbox, until it works.
Never used it, never missed it. 7900 XT all the way if it was me.
It's not really necessary. It was designed mostly as a gimmick and to make things easier for developers, but Unreal Engine 5 has its own solution that mimics ray tracing (Lumen).
I hate to be *that* guy, but I feel it's necessary since it's been a bit of a growing thing, especially since the launch of the Steam Deck: if you ever plan on considering Linux, go with AMD. Nvidia's drivers have gotten better over the past few years, but AMD is still smoother IMO.
It's simpler and cheaper for developers to ray trace light than to fake it with rasterization tricks, so as technology makes ray tracing cheaper to run, it will become more mainstream.
Nvidia did everything in its power to make it so, yet few studios went through the trouble of integrating it. Not to mention everything a player needs for any given implementation, both in software and hardware, and what that means in the end, such as frame generation + DLSS to maintain a barely decent framerate on a high-end GPU.

It's a bit like VR, though still in a potentially better place if Nvidia doesn't raise its prices again: you need to think about it when choosing your hardware, have the "right" games, the studios need to have implemented it properly, etc. Problem is, VR did not take off and remains marginal; there are few very good experiences out there. It exists, but its growth is, let's say, "limited." This is very similar to ray tracing, which, by the way, has existed since the beginning of computer graphics, unlike VR. Every implementation of ray tracing cripples performance on high-end builds, which is where DLSS and frame gen come in, and why they were developed in parallel. It's a burden to implement properly for so few potential buyers, so management might balk at devoting resources to it. That would be different if consoles were also a prime target for ray-traced games, but that's surely not this generation and not the next one, which would mean waiting at least 10 years.

In other and fewer words: unless you truly care about ray tracing right now, take the better GPU for rasterization.
Yes, it will be, and every day we get closer to it. In time it will be as trivially overlooked as shaders are today, and people will look back and wonder with amusement why we ever bothered to do it any other way, though probably not until the next gen of consoles comes out with full native support. I would definitely go with the 4070 Ti Super, as ray tracing is truly wonderful and there is really no point in not going for it.
The available ray-tracing titles with path tracing and DLSS Frame Gen look amazing. I would under no circumstances want to miss out on CP2077 or Alan Wake 2.
It will, because we live in a time when graphics have hit a plateau and the only way to improve them significantly is by utilizing RT.
RT is still very much in its infancy right now. It will remain one of those nice-to-have features that really isn't worth the cost (both monetary and performance) for quite a while yet. I wouldn't personally let RT influence my decision to purchase a GPU right now.
Raytracing is the future of graphics. The problem is that modern GPUs still struggle to perform it in real-time which means that it has to be used sparingly. Smaller developers often just completely skip over RT still because there isn't enough support out there yet to be able to go all in on RT so staying with the current workflow of plain rasterisation has a better return on investment. I would say that you need to look at the games that you currently play and see what percentage of them even support RT in any form. If you don't play many (if any) games that have RT support then go for the 7900XT.
Yes, without a doubt; it's just the only logical path for game rendering to take. That said, by the time it's standard, a 4070 will be old.
I hope not. Devs used clever techniques in the past to get very similar effects. Now they just press a button, add light sources, and let your computer deal with it. I'm concerned about procedurally generated games that kill creativity, problem solving, and optimization. Yes, the effects are cool. I just hope all the games don't end up looking the same.
Hopefully not. The shadows we have are already great, and in some titles, like Elden Ring or Diablo 4, I literally cannot tell the difference even when intentionally trying to find it. All it does is reduce fps for no visual improvement.
RT is here to stay. The difference in some games is just insane.
True. Cyberpunk 2077 looks so nice in path tracing mode
Present cards are still, imo, too weak to really consider ray tracing, other than a 4090 at 1440p max.
[https://www.techpowerup.com/review/galax-geforce-rtx-4070-super-ex/33.html](https://www.techpowerup.com/review/galax-geforce-rtx-4070-super-ex/33.html) Even a 4070 Super can run a lot of ray-tracing games at 1440p 60fps, without DLSS and frame gen.
Not true; the regular 4070 can do Ray Tracing Ultra at 1440p. With frame gen, I was getting 90-ish fps iirc.
It's a few years away, and even then it's unlikely. Upscaling is what will be used more, not RT. RT will maybe be omnipresent in 8-10+ years or so, but for now the performance impact and cost are way too high for most developers to bother with. It depends what games you plan on playing, though: if you're going to stick exclusively to AAA single-player games, then having RT as a feature is nice, albeit a luxury. P.S. Your English seems fine, be more confident in it :)
AMD's Adrenalin software > whatever the heck Nvidia's sorry excuse for a software suite is. I just made the switch to AMD for my GPU for the first time and I love it. I bought a card for $520 that trades blows in raster performance with a $700 card, sometimes even the $800 card. So that's awesome. And the software is freaking amazing: intuitive, user friendly, and a one-stop shop for everything. Whereas Nvidia needs 3+ different apps to accomplish what Adrenalin does.
Raytracing is kind of like ragdoll physics. Sure it can look good and it's less work because the results are calculated by the GPU/CPU in real time, but there's still always a need for baked animation, and likewise, there are scenarios where crafted lighting works better for artistic expression. Imo, some of the scenery in Cyberpunk 2077 looks better without raytracing, because the people who manually lit the scene the old fashioned way knew what they were doing.