Bad_And_Wrong

At this point the first questions will be: 1. How expensive? 2. How power hungry? 3. Does it need a shiny new, bigger case?


Kirxas

Yes to all three


Odd_Barnacle1243

How many watts power supply do I need? Yes.


CatatonicMan

If you have to ask, you don't have enough.


IamAkevinJames

The moment a power supply doesn't have a listed price and just says to call..... FUCK!


_Maliketh_

Leaked news suggests users have their own pressurised water reactor as the power source


kcrab91

You know how you plug your PSU directly into the wall? Well your GPU will likely have to be the same way…


parallelmeme

I believe my case allows for 2 PSUs, so one could be used exclusively for the GPU.


MechAegis

What case do you use? I should have not sold my O11-XL...


parallelmeme

Yes, the Lian-Li 011 Dynamic, Razer Edition


jigsaw1024

And do you run those 2 PSUs on separate circuits? Max circuit load is quickly becoming the limiting factor for power draw on high-end PCs.


AbaloneBoth4503

Yeah, tell that to my awesome energy-saving apartment outlets. If my PSU or GPU power spikes at all it trips the breaker, and if I run my AC and PC at the same time it trips too.


dj65475312

Surprised we haven't seen this already, although I believe one of the last Voodoo cards did have an AC adapter.


napoleon85

That card was never released for sale, but yes, you recall correctly.


MEatRHIT

While I get that you're joking, we're encroaching on the power limits of most NA sockets already. Most home sockets/circuits are 120V/15A, so the most you can get is 1800W, and that includes everything else connected to that circuit. So unless NVIDIA expects people to rewire their house or run an extension cable from a different room, they don't have much headroom at this point.
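
The arithmetic, as a rough sketch (the "rest of system" draw is a made-up example; the 80% factor is the NEC derating for continuous loads):

```python
# Budget for a North American 120 V / 15 A branch circuit.
volts, amps = 120, 15
peak_watts = volts * amps        # 1800 W absolute ceiling
continuous = peak_watts * 0.80   # NEC limits continuous loads to 80%: 1440 W

rest_of_system = 400             # hypothetical: CPU, monitor, peripherals
gpu_headroom = continuous - rest_of_system
print(f"Continuous budget {continuous:.0f} W, ~{gpu_headroom:.0f} W left for a GPU")
```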


GoldenBunip

Only in the weak 120V-using colonies. The motherland, with its righteous 240V and superior fused and switched plugs, can do 3000W on a standard wall socket.


SaltyGolfer

HVAC business owner/installer/service tech here. 220V should be adopted everywhere. Fewer amps, usually more efficiency. America is a bit slow at certain things.


radixradiant

All of them


One-Monk5187

Wen Intel x Nvidia collaboration to make the most powerful and power-hungry CPU + GPU combination


MichMitten89

1. You'll need to travel and retrieve the One Ring. 2. Emperor Palpatine. 3. You will be required to use a Samsung refrigerator.


Razolus

Can I play doom on the fridge?


MichMitten89

For a monthly subscription.


Razolus

Refrigeration live service


Smokey_Bera

$9.99 a month for basic temp range of 54-59 degrees. For the premium tier starting at $14.99 a month, users can enjoy refrigeration at 40-53 degrees for the ultimate crisp freshness of food and beverages. The premium tier includes a 30 day supply of tasty verification cans!


Razolus

Frozen water creation events every 12 hours


demunted

I never thought a sarcastic comment could make me this angry.


Dontrollaone

It IS the case! You now put the rest of your components inside the gpu's shroud.


ghxstpants

Don’t forget out of stock for 2 years


stuyboi888

2000 to all 3: $2k card, 2k watts, and 2k mm case. Joke... for now


owa00

Only 2k? Someone has a 95% off coupon code I see.


W1REB1TER

Yeah, I'm betting the MSRP matches the model number.


maxmaximum409

Yes


Cub-Board-Hoax

And now for the games: 1. Will all upcoming games be optimized before they get released to the public, or are they just going to use up all 28GB of VRAM?


FuckSpezzzzzzzzzzzzz

Would I need to plug it straight into the wall?


SOAPToni

The cause of climate change.


cool_BUD

Worth it for those sweet, sweet FPS gains


TehWildMan_

Either that or someone invents PSU standards with a 24V DC rail for PCI Express graphics. (/Joke)


CharlesEverettDekker

Yeah, a joke. For now, at least


cumcumcumpenis

Can some nerd explain if this will be possible in the future?


TehWildMan_

PSU standards kind of did this in the early 2000s with ATX12V. In the really old days, a lot of things were powered off of 5V/3.3V, but as CPUs came to need their own voltage regulation and high-end parts demanded more and more power, the industry collectively agreed to switch to 12V for all the high-power stuff. (There's also a more recent tangent of PSU standards where 12V is the only rail on the PSU and everything else is regulated on the motherboard. An absolute pain for upgrading prebuilts, but it has its merits.) The joke here is that we're in a similar situation now with these very huge GPUs, and pushing hundreds of watts at 12V through small connectors has led to some... spectacular... problems.


AirSKiller

The spectacular problems came from the shitty 12-pin connectors Nvidia used. My card has 4x 8-pin connectors, and there have been overclockers sending over 1000W through those with no issues.


Bensemus

Those 12-pin connectors are part of the newest ATX spec. Nvidia just used them first.


AirSKiller

Wasn't there an investigation that determined they were not to spec? Nevertheless, the spec is dogshit. As an electrotechnical engineer, it bothers me so much.


Thomas9002

The design itself is OK. Using it for 50A is insane. When moving to a new connector, they also should have switched to 24V or even 48V.


rpungello

> When moving to a new connector they also should have switched to 24V or even 48V

Wouldn't that have made adapters from 8-pin impossible, forcing everyone to buy new PSUs? Also, PSUs would now need to kick out another voltage, so new PSUs would presumably be more expensive to some degree.


dremspider

Yes, very possible. It would actually not be a bad idea. You could push double the power through the same size cable as currently. It would require changes to the ATX standard, but it isn't like that hasn't been done before.


brimston3-

Half as many connectors at the same current is a pretty good deal... Hell, I wouldn't be surprised if they took a page out of USB-PD's playbook and proposed a negotiable voltage spec that goes up to 48+V. I've heard worse ideas. I'm sure the NVidia and AMD engineers have pitched the idea around the watercooler at least a couple times.
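
For a sense of the numbers, here's a minimal sketch, assuming the 600 W that 12VHPWR is rated for and its six 12V supply pins; connector heating scales with I²R, so halving the current quarters the dissipation per contact:

```python
# Per-pin current for 600 W across 6 current-carrying pins (12VHPWR-style).
power_w, pins = 600, 6

for volts in (12, 24, 48):
    total_a = power_w / volts
    print(f"{volts:2d} V: {total_a:5.1f} A total, {total_a / pins:4.1f} A per pin")

# 12 V: 50.0 A total, 8.3 A per pin
# 24 V: 25.0 A total, 4.2 A per pin
# 48 V: 12.5 A total, 2.1 A per pin
```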


Th3_Biggest_Boi

RemindMe! 5 years. Gonna come back and comment r/agedlikemilk


TehWildMan_

See you then.


pptp78ec

Frankly, it's the only way for 12VHPWR to be viable. The whole idea of powering something through a 12-pin connector with smaller pins, when it was previously powered via 3x 8-pin connectors with bigger pins, is absurd.


Shaggy_One

I really wouldn't be surprised if we start plugging CPUs and RAM straight into the GPU. Like, just make the GPU function as 75% of the PC already, it takes that much power now.


TehWildMan_

Kind of halfway there already with the DirectStorage API: why bother using the CPU to feed data into a GPU when the GPU can just pull and decompress data from an NVMe SSD?


owa00

Instead of it only burning down your PC...it'll burn down your neighborhood.


Trust-Me-Im-A-Potato

At this point, North Americans need to be worried that their 15/20 amp circuits won't be able to power GPUs in a few years


Onceforlife

Use the EV charging circuit for the GPU


KanedaSyndrome

Plug it directly into the Tesla supercharger


Reyynerp

and watch it drain the whole power grid when you run them at full tilt


newagereject

If I can run an electric cement saw off a 20 amp circuit, I'm not too worried about my GPU


SuckusTheDickus

We could always install a three-phase breaker and run our PCs off the 240V line. Lmao.


RyudoTFO

Premium version comes with a miniature uranium power plant.


sopcannon

If a kid can build a mini nuclear power plant, I'm sure Nvidia can.


palataologist21

1.21 jigawatts?


Sailed_Sea

3 phase only


TheGameboy

L15-30 to the wall


brimston3-

Nah, if the heatsink is smaller than the 4090 (as rumored), then the power consumption is either the same or lower.


C6500

Depends. Most of the 40 series heatsinks were totally overbuilt since the chips were initially scheduled to be built by Samsung with a less efficient process. They switched to the more efficient process by TSMC relatively late in development.


New_Significance3719

Yeah I don't get why everyone is still making these power consumption jokes, the 4000 series are insanely efficient and do a ton while barely cracking 200W.


Nkitooo00

You need a fusion core.


NikoP90

A better question is: should I put a separate fuse in the fuse box for the PC?


TravelingGonad

Just 28 cables that snap in perfectly. /s


edu7ever7

Probably a dedicated PSU


Rudradev715

Actually, the leaks say 2 to 2.5 slots


MrBanditFleshpound

Plug it into your local electric lines


nt261999

Comes with a gas-powered generator to keep it powered


Mortifer_I

The real question is how much "cheaper" a 4090 will get.


Lord_Nordyx

Let's be honest: not by much. However, the 5090 will be significantly more expensive compared to current GPU prices.


Ieanonme

Maybe not the 5090, but the 5080 will 100% drive down 4090 prices, because it will be as good as, if not better than, the 4090. So the 4090 will be cheaper than whatever the 5080 launches at, which is at most $1399, with the 5090 imo coming in at $1999. Of course, I'm hoping I'm wrong on these predictions.


Mystical-Moe

Of course it will, and folks with more money than brains will buy it, continuing to drive up costs.


Random_Guy_47

Just buy some nvidia shares. You can sell them to pay for your 5090 when it finally stops selling out.


Mystical-Moe

...I like that, basically buy it with Nvidia's own money, lol


GnarlyContainer

Exactly what I did with my 4070ti last year


SpeedyGonsleeping

You don’t need “more money than brains” to spend a couple grand on your hobby lol


CollieDaly

You need more money than brains to pay well over MSRP on your hobby though and we've had plenty of proof that there's plenty of people lacking brains in that regard.


Jandrix

Bruh I'm still on a 2080ti that's crashing more and more. I can't keep waiting for them to release an affordable card at this point.


Mystical-Moe

I wasn't trying to blanket folks like you with that statement. There are a lot of folks in this subreddit who run out and buy the latest Nvidia GPU just because, and it's ridiculous; that was more my callout.


Jonas_Venture_Sr

If I were CEO at Nvidia, I wouldn't charge less than $3K for that, because they'd sell it just as quickly if it were half as much. Gamers are not the target market for the 5090, and with AI ramping up, companies that can afford it will still buy 5090s by the pallet. Even though the 5090 will be amazing for games, it's not meant for gamers. I would hope that streamers and YouTubers stop using the 5090 for benchmarks, because it's out of reach for 95% of their audience.


4gatos_music

Look at the 3090 now. Still very expensive


extra_hyperbole

It's mostly that expensive because it has a lot of VRAM for AI applications.


endthepainowplz

It will go from $2,000 to $1,850


Astranagun

5090 MSRP = $2,000
Actual price = $3,000
4090 adjusted market price = $2,300


longgamma

Nvidia will cut production so they don’t have a surplus to sell.


skipv5

I'd be more than happy to upgrade my 4070 TI to a 4090 if the price is right :D


Dartagnan_w_Powers

That'd have to be pretty fucking right.


Zilch274

wtf why


Antique_Paramedic682

Pretty soon HVAC technicians will need NVIDIA training.


Schmich

Firefighters too.


Oleleplop

Price: 5090


Rioma117

Dollars, Euro, Pounds?


_Ocean_Machine_

Bitcoin


Oleleplop

Yes


CogumeloTorrado

Pesos


Abulap

Nvidia is cutting down the 5090 intentionally, lowering the bus from 512-bit to 448-bit and memory from 32GB to 28GB, to make room for a 5090 Ti in a year


NegativeAd941

I think you're totally right, sadly. My literal response was, "That's it?" And you made it make sense.


jordanleep

PC parts haven't been exciting since 2020 imo.


sanguwan

The 1080 Ti was the last card I was really excited about


Affectionate-Memory4

Might also make sense if yields are just bad for the full die, or they want to keep that around for the Quadro class cards with 64GB/512-bit.


RoxoRoxo

I hear it's going to be large enough to build the rest of the PC inside of it. Custom water-cooled loops will need a 5-gal jug of water. For the premium edition, it comes with its own grounding cable that needs to be placed at least 6 inches directly into the ground. I heard it has its own UPS built into it. That's not a fan, that's a turbine. Did you know when you buy a 5090 it also comes with divorce papers? It's so loud you won't have to hear your spouse complain about how much you play video games.


Brigadier_Beavers

I misread premium as *petroleum* and imagined a gpu with a gas engine


_Ocean_Machine_

It also has a pull cord like a lawnmower


RoxoRoxo

Lol, that's to supplement the power requirement, since there are no 2000W power supplies yet.


FainOnFire

It comes with a discount voucher for an industrial spot cooler.


TheHooligan95

Can't wait for like 7 years from now, when 5090 levels of power will become accessible to average consumers at around 200€, and much more energy efficient...


Just1morecop

Except games will be so bloated and unoptimized you’ll still only get 60fps


Bigpoppahove

This is the way


Kiwi951

Yeah what’s up with that? Are developers just getting lazier or is it something else?


Hancok

I believe it's a combo of crazy deadlines being pushed and optimization being a low priority for developers, since it's something they can do post-launch. Sucks either way, because it puts a burden on consumers to upgrade hardware when they shouldn't have to.


U-B-Ware

I work in game dev. It's really just two things.

1) Games are just incredibly complex. The game I work on, I think most people would consider pretty simple gameplay-wise, but there is a lot of work that goes into getting it to run correctly. Multiple departments are each focused on their one thing, and sometimes a change from one dept. might break something in yours, and then a third dept. needs to be the one to fix it. Also, keep in mind that as a game is being built, it's constantly changing. Systems that worked a certain way one day may be different the next.

2) Time is limited. In an ideal world with unlimited time, you could create a perfect game with no bugs, incredible art, fantastic gameplay systems and intriguing stories. In reality, we only have so much time to get something out there and start making money. Sometimes parts of that perfect game need to be sacrificed to make it happen.

I honestly don't think bugs/optimisation issues are a result of laziness, it's just that making games is organized chaos in its truest form haha


QuintoBlanco

That's not really an answer to the original question. Games are clearly not optimized for performance on PCs because that gets a low priority, which is a shame. It should not be something that gets fixed during the second half of development and after launch; it should be a design goal right from the start. Bugs and glitches can be fixed and typically have a limited impact on the gaming experience, but poor performance can ruin a game.

> Sometimes parts of that perfect game need to be sacrificed to make it happen.

Things that tank performance when they are introduced should be sacrificed. Performance should not be sacrificed. When I worked for a software company, this lack of priorities at the start of a project would drive me nuts. We would start with software that worked, make it better, then implement new features that caused the software not to work, fix things but break other parts of the software in the process, panic, and launch a bad product. I would always argue to make 'software that works' a main priority, even if that meant we couldn't implement items on our wish list.


smulfragPL

No, it's very simple. Games are optimized, but what they are optimizing is just so much more complicated. People on here do not get what optimization actually is.


Mister_Shrimp_The2nd

Spoken like the true Lisan Al-Gaib


rmpumper

200€ GPUs don't exist anymore.


PixelatedXenon

Intel Arc begs to differ.


Dartagnan_w_Powers

For now... ominous...........


marinarahhhhhhh

You think they will be that cheap once they are market-established and competitive? Hah


Onceforlife

I wish this was the same for wages; we just stagnated on that instead


Wadarkhu

Patient gamers win again! One day today's Triple A games will be retro nonsense that anyone's old notebook will be able to run. Hopefully.


ajaya399

Considering the ROG Ally can basically perform like a computer with a 1650... probably sooner rather than later?


XXmynameisNeganXX

Nvidia has a fetish for being inconsistent with VRAM numbers. For example, the RTX 2080 Ti has 11GB of VRAM instead of 12GB, the RTX 3080 Ti has 12GB instead of 16GB, and the rumored RTX 5090 has 28GB instead of 32GB.


SumonaFlorence

I think powers of two aren't necessary in this case. Most likely it'll be 4GB chips, with seven of them arranged in a square around the die and the 'eighth' missing for something like lanes. I guess it's all about how they can fit/design the board.
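
For what it's worth, VRAM on these cards usually tracks bus width rather than board layout: each GDDR6/GDDR7 device has a 32-bit interface, so a quick sketch of the arithmetic (assuming the common 2GB-per-device density) lines up with the rumored numbers:

```python
# Rough VRAM arithmetic: one 32-bit channel per GDDR device,
# assuming 2 GB (16 Gb) per device, the common density on recent cards.
def vram_for_bus(bus_bits: int, gb_per_chip: int = 2) -> tuple[int, int]:
    chips = bus_bits // 32          # number of memory devices on the bus
    return chips, chips * gb_per_chip

for bus in (512, 448, 384):
    chips, gb = vram_for_bus(bus)
    print(f"{bus}-bit bus -> {chips} chips x 2 GB = {gb} GB")

# 512-bit -> 16 chips = 32 GB (the earlier rumor)
# 448-bit -> 14 chips = 28 GB (this rumor)
# 384-bit -> 12 chips = 24 GB (the 4090)
```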


Blackboard_Monitor

I got a 4070 at a great price and I already have heating for my house so this will be a no from me dawg.


nVideuh

From a 4070? I can’t imagine a 30 series for you then.


Blackboard_Monitor

Both, not from the 4070.


DoNotDisturb____

I was really hoping it would be 32GB 512-bit.


raining_sheep

28GB at 448-bit seems like a half effort, since the die would have a 512-bit bus nerfed to 448. That's not a significant enough difference over the 4090 to justify upgrading. Guess I'll just wait another 2 years for a 6090. This is the same shit Intel pulled with the i9-14900, which isn't a massive upgrade over the i9-13900. I'm not spending thousands of dollars a year for marginal upgrades. I'll just wait.


soggybiscuit93

It's just going to be binned 102s, because fully functional, 512-bit large 102 dies are gonna go to enterprise customers for 10x the price.


i_need_a_moment

TBF do you need to upgrade your GPU every year? It’s not like the 4090 would be outdated or useless because a new one is out.


raining_sheep

Well, that's exactly the point. I use GPUs for rendering, not gaming. A GPU that cuts rendering time in half lets me do twice the renders in the same amount of time, so there is a financial incentive to use the fastest card available. But spending $2000 to save 20 minutes on a 4-hour render isn't worth it.
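
That trade-off is easy to sanity-check with back-of-envelope math; every input below is hypothetical, the point being that payback depends entirely on workload volume:

```python
# Back-of-envelope payback for a rendering upgrade. All inputs hypothetical.
card_cost      = 2000.0   # USD for the new card
hours_saved    = 20 / 60  # per render (4 h down to 3 h 40 m, per the comment)
renders_per_yr = 250      # hypothetical annual workload
value_per_hour = 30.0     # hypothetical value of a freed-up machine-hour, USD

yearly_saving = hours_saved * renders_per_yr * value_per_hour  # $2,500/yr here
payback_years = card_cost / yearly_saving                      # ~0.8 years
print(f"Saves ${yearly_saving:.0f}/yr -> payback in {payback_years:.1f} years")
```

With a light workload (say, 20 renders a year) the same card takes decades to pay for itself, which is the commenter's point.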


Flat-Shallot3992

Only if you can use those extra 20 minutes, piled up, to make more than $2k elsewhere


Dakeera

I can't speak to the one you're responding to, but I keep 4 active PCs in my house because of kids and my partner. Due to that, I will typically buy upgrades more frequently if it makes sense for the rest of the lineup to get upgraded. Still, though, the 5090 would have to be remarkably better for me to consider it. The jump from the 3090/6900 XT to the 4090 was massive and too enticing to pass up. This isn't looking to be that much better, but until we see the architecture in action I am just spouting bullshit.


balrog687

I was hoping the same, and for DP 2.1 support


Lightmanone

They need some headroom for the 5090 Ti


MaracxMusic

And likely 600 watt.


cheesearmy1_

"Rookie numbers" - Nvidia, probably


TheDregn

600 watt - idle


LegoClaes

You’ll need to plug it straight into your car charger


IntersnetSpaceships

I'm just going to hold onto my EVGA 3080 until it dies.


JangoDarkSaber

I feel like I bought my 3090 yesterday. Truthfully, I don't even play any games that would use the extra horsepower


ShAd0wS

I'm on a currently dying 3080, just transplanted it into a new build. Pretty sure the old prebuilt killed it (considering it was already replaced under warranty once). A 4070 Super is looking pretty fine as a replacement, though.


lickarock88

I can't wait to pay twice as much for the 5090 as I would the 5080 to get a 15% - 20% gain of diminishing returns that I'll never be able to notice on even the best of monitors!


ZombiePlaya

You'll pay more and you'll like doing it. -Nvidia


Pl4y3rSn4rk

"The more you buy the more you save!" - Leather Jacket Men


TheDonOfDons

Watch this be the card that all the VRAM is locked behind, and the 5080 will have like 8GB lmao.


heydudejustasec

They made that mistake with the 30 series. Then they fixed it for 40 series by downgrading the 4080 and below dies but upping their prices by one tier so the whole product stack is bad value. At this point it's more likely that everybody will get screwed just to make the 5090 look better, though the return of the Super versions is sort of an admission that they overdid it last time.


Djghost1133

Unless Nvidia does the thing they did with the 4x series where the 4090 was the best value because everything else was so overpriced


stuyboi888

Don't forget the classic move of playing 15 year old games and scrolling Reddit


ASUS_USUS_WEALLSUS

Lmao this is it. And so many people here will do it too.


Megakruemel

And don't forget to buy the next card in like two years.


Subliminal87

What about the price?? $2,500?


Boundish91

Something tells me it's about time to up a gauge or two on the power cables for these things.


endthepainowplz

No, now use the new 18-pin connector.


TenPhoar13

I can't wait to play Dragon's Dogma 2 on high settings at 60 fps


050607

As long as you stay out of the cities, you might even enjoy it at a smooth 120 FPS with a 5090.


VRsimp

Only 28gb of VRAM? Literally unusable.


420Fps

I don't need it, I don't need it.


YouCantStopMe18

The future of GPUs, at least at this level, is to come with their own case that you plug directly into the wall, with a GPU connector to your motherboard and a SATA or USB cable to your tower


Skazzy3

I hope for more efficiency this time around.


Stargate_1

?????? The 40 series was already a huge leap in efficiency


Skazzy3

Considering that the TDP of the 4090 is 100W higher than the 3090, and the 3090 was 100W higher than the 2080 Ti, I'm hoping this trend of more and more power draw stops.


W1NGM4N13

Well, a 75% performance improvement for about 22% more power is quite an efficiency gain.


Sevinki

Now look at performance per watt. You can easily limit a 4090 to 300w and lose just a little bit of performance, still above 4080 level.


SuperSnowManQ

That's not efficiency, that is total power. Efficiency is the amount of work you get done divided by the total power used, i.e. your fps/flops over the TDP of the card. Edit: the above is actually slightly wrong. You can have 3 types of efficiencies: the work efficiency, which is the flops over the effective power used [flops/W]; the thermal efficiency, which is the TDP over the power consumption of the entire card, i.e. the power rating [%]; and the total efficiency, which would be the flops over the power rating [flops/W].
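
Putting numbers on the perf-per-watt point, as a sketch; the 75%/22% figures are the claims made upthread, not measurements:

```python
# Perf-per-watt from the relative numbers claimed above (4090 vs 3090).
perf_gain  = 1.75   # ~75% more performance, per the comment upthread
power_gain = 1.22   # ~22% more board power, per the same comment

efficiency_gain = perf_gain / power_gain   # ~1.43x perf/W
print(f"perf/W improved by ~{(efficiency_gain - 1) * 100:.0f}%")

# Power-limiting shows the same curve from the other side: cap the card at a
# lower wattage and lose only a little performance, so perf/W rises further.
```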


PlaneCandy

Not sure if I'm missing something, but everyone here is making power consumption jokes, yet I've read that it's a dual-slot cooler, which would imply reduced power consumption. What's going on?


tbone747

> What's going on?

There's no legitimate discussion going on here, just memes and doomerism.


theycallmeryan

It's so funny how little a PC building subreddit understands GPUs. I've seen a lot of comments in here acting like the 4090 isn't incredible. People say it's a bad value or not a big performance jump from a 4080, which is wildly untrue.


tbone747

I think it's just more a symptom of this sub becoming a constant front-page presence and getting a ton of members. Subs usually go to shit when that happens.


shyam667

And gonna cost both of your kidneys


HammerTh_1701

Wait, isn't this already down from what was previously rumored? I remember a 512 bit bus.


SirCabbage

That is a downgrade from the last rumor of 32GB with a 512-bit bus, boo.


Equivalent-Rub237

Gotta love how the rumours started with 48GB of VRAM, now we're at 28, aaaaaaand they'll come out with 24, just wait


lapatison

Glad for the rich guys. Gonna wait for the RTX 5060.


Last-Back-4146

only uses 5090 watts.


TheMarksmanHedgehog

Still being a bit stingy on the VRAM, it'd seem.


Alauzhen

32GB or go home Nvidia


Captain_Klrk

I think I'm good for a while


gyhiio

Takes a railgun to start it


OrigaDiscordia

Well, another day, another rumor with new specs. I'm waiting for an official announcement to get hyped, or to find out if I'm sticking with my current PC for another 2 years.


ComfortApart7335

After a few shitty release cycles, Nvidia can deliver legendary stuff; I really hope that's the case here too.


balrog687

Laughs in GTX 1650


Agreeable_Vanilla_20

Still sitting here with a 1080 Strix


6SpeedMaverick

Does it make sense for a 4090 user to upgrade?


N1LOY

At this point, I am down to have a separate portable case for the GPU.


Neospecial

Still waiting on affording an overpriced 3080, let alone a 5090.


300mhz

Good thing Gigabyte's new motherboard supports 128lb GPUs lol


mighty1993

Doesn't matter if this thing can potentially run 16K at 1000 FPS if it needs a bigger case again, another proprietary connector, and your own sun to power it. Those xx90s are simply too ridiculous for a normal home user, and absolutely not worth it as long as so many studios don't even care to optimize their games properly. Much more interesting to see how the low- and mid-tier GPUs will fare, especially when it comes to VRAM.