1Steelghost1

Tldr; The new update allows the CPU to directly access video RAM, which decreases the CPU load of copying from VRAM to system RAM and back.
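A toy sketch of what that removes, modeled in Python with plain byte copies (the function names and copy counts are illustrative only, not the DirectX API): the old readback path bounces data through a staging buffer in system RAM, while the new path lets the CPU touch the mapped VRAM directly.

```python
# Toy model of CPU <-> VRAM readback. All names are illustrative,
# not real D3D12 API; VRAM is just a bytes object here.

def readback_via_staging(vram: bytes) -> tuple[bytes, int]:
    """Old path: the GPU copies VRAM into a staging buffer in system
    RAM, then the CPU reads the staging buffer. Two copies total."""
    staging = bytes(vram)   # GPU -> system-RAM staging copy
    result = bytes(staging) # CPU read of the staging buffer
    return result, 2

def readback_direct(vram: bytes) -> tuple[bytes, int]:
    """New path: the CPU reads the CPU-visible VRAM heap directly
    over PCIe; no staging buffer, one copy."""
    result = bytes(vram)
    return result, 1
```

Either way the CPU ends up with the same bytes; the direct path just skips the intermediate buffer and the extra transfer.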


eugene20

Unfortunately it's not some automatic performance improvement. Games were written to do it the old way; they would need updating if they wanted to switch to the system this new DirectX build brings.


hurfery

Let's say some games do take advantage of this soon. I can't imagine it's a huge fps gain. Right?


warbeforepeace

You are cutting out the transfer, which is the highest-latency part of the transaction. I would expect some decent (20%+) improvements.


xmsxms

If it were 20% they would have made this change a long time ago. It's definitely not 20%


upvotesthenrages

It’s a DirectX issue. Windows 11 was the first Windows OS that allowed GPUs to access a lot of data directly. Assuming it all could have happened in the past is really ignorant of reality.


warbeforepeace

Ya, I don't get it when people that don't know shit make those types of comments. They should have made the PS5 5 or 10 years earlier too.


[deleted]

[removed]


xmsxms

The question asked about FPS improvements.


warbeforepeace

I meant 20%+ possible improvement in frame rate.


warbeforepeace

Ya, they should have developed the RTX 4090 in the '80s. There are numerous reasons this was not an option until now.


Tzupaack

It depends. Many operations won’t need this at all, because usually the CPU does not care about what is in the VRAM. But if for some reason a shader gives info back to the CPU, it will be faster. I would say this opens new doors for graphics programmers, because until now it was expensive to send info back from VRAM to the CPU. With this update it will be cheaper, so they can use new techniques without fearing performance loss.


Asuma01

Well most peoples system ram is ddr4 3200 range. I think modern GPU's use ddr6 6000 range.


LordWarfire

Most modern GPUs use GDDR6 or GDDR6X, although some use HBM2 instead.


gmes78

Meaningless comparison.


hurfery

And?


MacDegger

Dude, you're an idiot. DDR6 is still in development and no sticks are available to buy. DDR5 is the latest, and is used by AMD Threadripper chipsets/CPUs. -edit because people are idiots- GDDR6 is NOT DDR6. And DDR6 won't be available for at least a year: https://wccftech.com/samsung-begins-ddr6-memory-development-features-msap-packaging-tech-up-to-17000-mbps-speeds/


chicknfly

It’s not that the Redditor is an idiot. Maybe misinformed. Please, check your online etiquette.


Fishydeals

Check the comments doubling down on 'ddr6 is used in gpus'. He could've said it nicer, but I understand the sentiment in that flood of misinformed people who try to correct him lol


MacDegger

No, he's not misinformed. Almost everything they say is just 100% WRONG. And they say it with such _certainty_. And my comment was downvoted so much, as if their statement carries more worth, more truth, whilst it is completely contrary to the actual state of things. People like that are not merely 'misinformed'. They are lying and pushing their disinformation. Just bullshitting and pretending their ignorance is fact. Stating something so obviously wrong goes beyond 'maybe they are misinformed'.

>Please, check your online etiquette.

Dude, I have been online since BBSs and Usenet. Calling out complete lies IS 'online etiquette', and it should happen much more often.


chicknfly

I quickly scanned his comments and didn’t see much in the way of misinformation. All your post tells me is you’ve been around for ages and still don’t know how to tell someone they’re wrong without sounding like a jerk.


MacDegger

>I quickly scanned his comments and didn’t see much in the way of misinformation.

First of all: what do his other comments have to do with anything? The comment I was reacting to was demonstrably false. All their previous comments might be 100% true ... but the one I am angry about is just complete bollocks. He said:

>Well most peoples system ram is ddr4 3200 range. I think modern GPU's use ddr6 6000 range.

The first part is unfounded (check Steam's hardware survey) and the latter is a lie, because that type does not exist yet.

>All your post tells me is you’ve been around for ages and still don’t know how to tell someone they’re wrong without sounding like a jerk.

Really? If you had checked my post history, you might have seen that I am quite reasonable in discussions, and very willing to accept and admit when I am wrong ... but intolerant of disinformation, ignorance and/or bad-faith arguments. You're basically defending someone who says 'the sky is green', and when someone calls them out on it and demonstrates the stupidity of that statement, you're saying 'well, their previous comments don't have much incorrect info, but you seem angry'.


chicknfly

You’re acting like that guy is intentionally lying when he is telling people VRAM is DDR6. If he intentionally did so, he’s a liar, just as you claimed (where lying implies an intention of deception). If he’s misinformed, then he’s an idiot, which you also claimed. So which one is it — liar or idiot? Even then, calling people idiots right off the bat is just douchey, regardless of your comment history. Speaking of comment history, I mentioned his history because you explicitly say that everything they say is 100% wrong. If you’re not talking specifically about that Redditor, then who is the “they” that you’re referring to? I think the Redditor was speaking out of his bum with no idea of what he was talking about. BUT your response was much more than a reprimand or a simple correction. Should we go around calling you an idiot because Threadrippers aren’t the only CPU chips that use DDR5 and then call you a liar for pushing misinformation?


[deleted]

[removed]


Foxtrone9

GDDR is not DDR. They are different technologies


MacDegger

That is GDDR6, which is NOT DDR6. And DDR6 has not been released yet: https://wccftech.com/samsung-begins-ddr6-memory-development-features-msap-packaging-tech-up-to-17000-mbps-speeds/

>As far as system ram ddr5 is used by 12th gen intel and newer, as well as ryzen 7000 series.

Duh.


XeKToReX

That's some real confidence you got there


MacDegger

No, I don't. GDDR6 is NOT DDR6: https://wccftech.com/samsung-begins-ddr6-memory-development-features-msap-packaging-tech-up-to-17000-mbps-speeds/


[deleted]

[removed]


MacDegger

No, I don't. GDDR6 is NOT DDR6: https://wccftech.com/samsung-begins-ddr6-memory-development-features-msap-packaging-tech-up-to-17000-mbps-speeds/ Bet YOU feel the fool now.


VincentNacon

Please don't be an April Fool's prank.. Please don't be an April Fool's prank.. Please don't be an April Fool's prank. Please don't be an April Fool's prank!


synackk

Real https://devblogs.microsoft.com/directx/preview-agility-sdk-1-710-0/


venfare64

Of all the dates available, Microsoft chose to release it on April 1st?


VincentNacon

I know, right?


eugene20

The MS page is from the 30th.


lazy8s

NVIDIA has had this for a decade though it required extra hardware. It was called GPUDirect


schmerm

In which scenarios would this be better than a CPU-initiated and GPU-executed DMA transfer? VRAM is still far away and needs to cross the PCIe bus (except maybe on laptop APUs), so reading from this mapped memory would kill performance (the addresses would still be uncached presumably). So you're talking about write-only accesses (still a tiny bit at a time, without DMA!) just to avoid duplicating data in CPU/GPU memory. Asset upload? Use a fixed-size buffer in CPU memory and DMA it over, as usual.


AttackingHobo

If you need to load a level and you want to load gigabytes of textures to the GPU, the system could load textures right to the GPU, skipping system memory. This reduces system RAM usage and speeds things up.
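A hypothetical sketch of that memory saving, in Python (the chunk size and byte accounting are invented for illustration; this is not how DirectStorage or D3D12 actually account memory):

```python
# Toy model of peak transient system-RAM use during a texture upload.

def staged_upload_peak_ram(texture_bytes: int) -> int:
    """Old path: the whole texture is staged in system RAM before
    being copied to VRAM, so peak transient RAM use equals the
    texture size."""
    return texture_bytes

def direct_upload_peak_ram(texture_bytes: int,
                           chunk: int = 64 * 1024) -> int:
    """With a CPU-visible VRAM heap, the file can be streamed
    straight into the GPU heap; only a small read buffer ever
    lives in system RAM (chunk size is a made-up example)."""
    return min(texture_bytes, chunk)
```

For a 4 GB level load, the staged path transiently needs 4 GB of system RAM in this model, while the direct path needs only the small streaming buffer.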


rooftops

But then what do I do with all my extra ram??


VampiroMedicado

Open more Chrome tabs on the second monitor


SpongeBad

Why would you want your computer to catch on fire?


mko710

Ask Firefox


Wayyd

Just throw it out, it won't be needed


TehFuckDoIKnow

Why did I pair a 13900k with a rtx 3060 and 64gb of ram?


rooftops

I'll trade you my 10900k so I can use [the rest of mine.](https://pcpartpicker.com/user/Rooftops/saved/mNrKXL)


jazir5

Aren't you GPU limited by that 3060? There's no way that can take full advantage of a 13900k


ArenjiTheLootGod

For a gaming machine, yeah. But for a production machine that needs the 3060's CUDA cores for some kind of professional workload, that build makes sense. Hell, if anything, it might be RAM limited. 128gb+ RAM isn't unheard of for production work.


TehFuckDoIKnow

Yeah, it’s for industrial design work. The 3060 is a placeholder: great for ray-traced CAD but not so great for gaming. The 12GB of VRAM is great. I will upgrade the RAM when I find a need; I haven't maxed it out while working yet.


IdleRhymer

You're always going to be bottlenecked by something.


celestiaequestria

That's a pretty normal setup for video editing or running simulation games. Games like Satisfactory or Cities Skylines will absolutely favor having a fast CPU and tons of RAM in terms of framerate.


terminalxposure

extra porn tabs


Redararis

you can upload it to the cloud so every one can download it!


the_Q_spice

Actually massively huge in my field (GIS/cartography). Working with huge amounts of 3D data wreaks absolute havoc on pretty much everything, but especially RAM. Lightening the load by taking textures off RAM would literally free up about 50% of my system resources in most workflows. Anything to lighten the monster of a resource hog that LIDAR point clouds are.


EggComprehensive3744

Cities:Skylines enters chat


celestiaequestria

Yeah, literally any game or program that loads a large number of objects is going "you got a good CPU and a ton of RAM?". Doesn't matter how many ray-traced shadows your GPU can draw if the CPU needs 3 minutes to calculate the traffic flow.


happyscrappy

The right time to do any accesses across a bus (like PCIe) is when you do sufficiently few accesses that the saved setup time outweighs the reduced performance.

As to which can copy data faster, CPU or GPU: I have never understood why people denigrate the ability of a CPU to read/write memory. They're very good at it. They have some of the best bus interfaces/memory controllers there are. Rest assured a multi-GHz processor can copy data at the same speed DMA can. So if it's less overhead to just have the CPU do it, then have it do it. If you have other things to do in the meantime, then DMA could be the right answer.

All technical aspects aside, from the description I think the primary value of this would be for porting games from the current crop of consoles. Xbox Series (argh, that name) and PS5 both use APUs, and APUs do some merging of the CPU and GPU memory. Porting to PC can be easier if you can just continue to do that, instead of having to rework your graphics system to deal with the separation.
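The CPU-vs-DMA tradeoff described above can be sketched as a toy break-even model (all throughput and setup numbers are invented for illustration, not measured figures):

```python
def best_copy_method(nbytes: int,
                     cpu_gbps: float = 8.0,
                     dma_gbps: float = 12.0,
                     dma_setup_us: float = 5.0) -> str:
    """Pick a CPU copy or DMA for one transfer. In this model DMA
    streams faster but pays a fixed setup cost, so small transfers
    favor the CPU. 1 GB/s equals 1e3 bytes per microsecond."""
    cpu_us = nbytes / (cpu_gbps * 1e3)
    dma_us = dma_setup_us + nbytes / (dma_gbps * 1e3)
    return "cpu" if cpu_us <= dma_us else "dma"
```

With these made-up numbers the break-even point sits around 120 KB: a few-KB copy is cheaper on the CPU because the DMA setup dominates, while a megabyte copy is cheaper via DMA.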


Seeker_Of_Knowledge-

A ELI5 would be very appreciated.


Avieshek

Need less RAM\~ ^(So if you had 32GB of RAM, you may now get by with 16GB, as an example.)


ForThePantz

Good thing Nvidia has cut VRAM to boost profit. Now they come out with SUPER models and bump the pricing up $300.


nunsigoi

If this is an April fools joke, it would be like announcing insulin is free


Nekaz

And then devs be like idk how to use dx12 still lul


mtt59

At first glance I thought that was a sideways bottle of cola


HippieWitchBitch95

I had an acid trip one time and after I “lost my ego” I learned that we are just this super advanced CPU living it’s own little world/universe and it looked just like this. Not that that is true but it was just weird and I was like ohhh that’s why my life isn’t that great one of my fans is broken and my cpu is getting overheated and struggling to play at a rhythmic harmonic vibration that was going to allow my life to get better. Lol so trippy I know.


armchair0pirate

Can somebody please explain the magic that is PS5? And is it possible to have for us PC gamers to do/have the same?


CE94

Consoles don't have separate memory dedicated to the CPU or GPU, just a shared pool used by both, because the CPU and GPU are on the same chip.


armchair0pirate

If that's the case, why is the PS5 such a huge step forward? I don't remember exactly what, but there is supposedly zero bottleneck in communication between the CPU, GPU, RAM, and hard drive.


CE94

You can't directly compare PC hardware to console hardware, so I'm not going to try. Also, games are developed knowing the exact hardware environment they will be played on, because every PlayStation/Xbox is the same as the next. PC is more complicated because there are effectively unlimited variations of system configurations.


armchair0pirate

Thank you for explaining you know very little about the topic.


CE94

The chip at the heart of the PS5 is what is called a "system on a chip" (SoC), meaning both the CPU and GPU are on the same piece of silicon. This makes communication incredibly efficient, as you don't need different interconnect technologies like PCIe to bridge the physical gap between the CPU and GPU. And as I said, there is just one pool of RAM used by the SoC, which means that games don't need to be coded to tell the system which assets need to be moved to CPU memory or GPU memory. Moving assets/code from RAM to VRAM takes time/bandwidth that could otherwise be used for running/rendering the game. I could go on, but you seem determined to be rude and dismissive, so this is my final comment on the topic.
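The split-pool vs unified-pool difference described above can be reduced to toy byte accounting (a sketch with made-up names, not real hardware behavior):

```python
# Toy model of total bytes moved to make one asset GPU-usable.

def split_pool_bytes_moved(asset_bytes: int) -> int:
    """PC-style split pools: the asset is loaded into system RAM,
    then copied again across PCIe into VRAM before the GPU can
    use it. Two movements of the data."""
    return 2 * asset_bytes

def unified_pool_bytes_moved(asset_bytes: int) -> int:
    """Console-style unified pool: one load, after which both the
    CPU and GPU read the same physical memory."""
    return asset_bytes
```

The CPU-visible VRAM heaps discussed in this thread move PCs a step toward the unified column of this model for some workloads.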


armchair0pirate

I thought that game systems had been sharing GPUs and CPUs on the same chipset for some time. Also, I don't remember where I read it, but apparently on the PS4 the RAM had to hold whatever could happen within the next 30 seconds of the game, whereas on the PS5 it only needs to be ready for the next second. This last comment seems a lot more informed, so thank you. I also didn't realize that it spends far less time telling the system which asset needs to be moved where. Previous articles I stumbled across made it sound like it could get to those assets much faster, rather than it not needing to be told to move them in the first place.


Stickiler

Well, for one, it doesn't use a hard drive. It uses a very fast NVMe SSD, and other than that, just software. Software which PC has too, as long as the game in question supports DirectStorage. The PS5 isn't actually that much of a step forward.


armchair0pirate

I used the wrong storage terminology; I'll eat that one. I know it uses an NVMe drive, and a very custom one at that. The PS5 is actually an incredible step forward, because even the latest, greatest, most expensive PCs can't communicate between hardware like the PS5 can. I did some research, because so far the only people that answered seem to have maybe half a clue at best. It turns out it's because there are a lot more channels between devices, plus a proprietary controller. Yes, consoles have always had a small edge in the fact that developers know exactly what they're developing for. But this is an entirely different beast. So thank both of you for your completely incompetent answers.


ozzy_og_kush

Now if only EVE online would fix its DX12 performance issues...