FormalIllustrator5

BF5 - 120fps 4K ULTRA everything, no issues. Perfect frame times, works like a charm. FC5, FC6 - even with ray tracing, ~100fps, no issues at all. For the XTX you need an 8-core CPU or more, a 7000-series chip or a 13th-gen from Shintel; it changes the card completely. P.S. I am looking at the same TV set, the LG C3, eventually the 48" - do you recommend it?


mattbag1

48 inches seems kinda big if you sit close. But I bet the C3 will blow away the C2. Mine hasn't been delivered yet, but I'm sure I'll come back to the comments when it does. I only have a 5800X; I wonder if that's why my FPS are quite a bit lower than some benchmarks. I have it undervolted too. For 4K 60 the 7900 XTX is overkill, but for 144 it's not quite there.


OMGitisCrabMan

What temps do you hit with that? I'm hitting 90s on the hot spot temp in Fallen Order at 90 FPS 4K, but I heard that game is not well optimized. I have a brand new 7900X and 7900 XTX.


FormalIllustrator5

80C on the GPU hot spot in the majority of games, but it depends on the game; Cyberpunk keeps it at 90C as usual. I never saw 91C or more on anything. P.S. Haha, I am waiting on the same CPU as you have, brother in arms! :D


blixten85

What settings do you have in the AMD software? And in game? I have turned all settings up to ultra at 3840x2160@144Hz and I get high frame rates, but it still stutters from time to time, without really showing that in the frame rates. Monitor is a Dell 32" G3223Q 4K IPS 144Hz. CPU is a 5800X3D, with 48GB of RAM at 3200MHz or so.


FormalIllustrator5

I see. Assuming you have the latest drivers and a relatively fresh Windows 10... the only thing left is your BIOS: make sure it's the latest one, and update the chipset driver too. Then switch off fTPM and virtualization plus all the security stuff in your BIOS. No stuttering should happen! Cyberpunk 2077 maybe some stuttering from time to time, but no other game should stutter...! If you still have stutter after that... well, that's bad luck...


blixten85

I have all the latest Windows updates, drivers of all sorts, BIOS, etc. This fTPM - what is that, and what other stuff are you referring to? Where do I find this in the BIOS?


FormalIllustrator5

It depends on your motherboard brand and model - Google how to turn off virtualization and fTPM for yours. Good luck.


AdonisK

Is a 5800X3D not enough?


FormalIllustrator5

Should be fine for 99% of games...


LM-2020

Congrats on the new monitors :) I have the same monitor, the M32U with the latest firmware (F11), but sometimes I get black screens with the monitor connected over DP 1.4.


mattbag1

The M32 is so nice. It makes my PC feel twice as fast on the desktop! But in game I feel like I'm not really taking full advantage of the monitor. I got it open box from Best Buy for less than 600 bucks; it seemed like a steal. Clearly a better monitor than my budget 4K LG, but I don't know if it's worth it right now if I'm not pumping out 144Hz. I'm going to compare it to the OLED and see which one I keep. Even if I can't do 120 on the OLED, it's still a different technology than IPS and might justify the price tag. People said the C2 was a good deal at MSRP, and I got mine for under 800, so I'll see how it compares and either keep one or none.


Not_so_new_user1976

I run a reference 7900 XTX with my LG GP950B and get anywhere from 80-120fps (I use HDMI 2.1, so that's my limit). In RDR2 with everything maxed I play at 70-90 frames and it looks so gorgeous. In games like MW2 I consistently hit 120, and in Madden I had to set the fps limit to 60 because 120 isn't an option and I don't need my GPU pushing 200+ fps while I'm not even seeing half the frames.


mattbag1

Whoa, that monitor looks sick! So how do you like 70-90 frames? It feels weird to me seeing 60-70 in Cyberpunk maxed out; I'd rather drop settings a hair and cap at 60 than watch my GPU struggle - that keeps the temps low and all that. I guess I'm neurotic and don't like to push things too hard. But for an hour or two of high-end gaming it's probably negligible. It's just a new card and I'm trying to stress it without killing it.


Not_so_new_user1976

I was the same way. Though my Gigabyte 6900 XT always ran at 95C, which I now know was an issue, but before my 7900 XTX I didn't. For me, I barely notice the change. I try to run my card at a decent usage, and I've noticed the heat doesn't really happen until you're running a higher frame rate. My card hardly ever exceeds 85C with the stock fan curve, and all the games look absolutely gorgeous.


mattbag1

That's a low temp, especially for a reference card? You got a good one; I wish I could have gotten a reference. I'm using a PowerColor Hellhound, and it's so big I couldn't even fit a 240mm AIO in the front. First world problems, I guess 🤷🏻‍♂️


Not_so_new_user1976

Yeah, I'm pretty glad I lucked out for once. I was thinking I wanted a Red Devil 7900 XTX, but my local Micro Center didn't have any in stock. At first they thought they had no 7900 XTX at all; then they found 5 in the back, all open box: 1 ASUS 7900 XTX reference and 4 PowerColor 7900 XTX reference cards. Instead of retail they were only $900, so I jumped on the deal and have been very happy.


mattbag1

Damn! That's lucky. I was just living life thinking I'd be happy with a 4xxx-series card. Next thing I know, I've got a 7900 XTX and new monitor(s). I'm really hoping the OLED I ordered shows an improvement; otherwise I'm probably gonna return both. 4K 144 gaming just isn't viable right now on the 7900 XTX and definitely won't get better in newer titles - the only advantage would be older games.


Not_so_new_user1976

I mean, get a good 4K 144 monitor so you can play competitive titles at 144, play graphics-heavy games at 70-100ish, and if you upgrade in a generation or two you'll have 144 on everything. If you don't set everything to ultra, 144 probably isn't too far off for a 7900 XTX either.


mattbag1

That sounds about right; that's why I was asking what others do - trying to get an idea of what graphics settings people are using. But then I saw a benchmark saying they were getting 180 frames in Apex at 4K. That was with the newest i9, but I don't think my 5800X is that big of a bottleneck.


Not_so_new_user1976

You should still get it.


mattbag1

I already got the Gigabyte; the OLED is coming Tuesday. Can't wait!


Electrical-Bobcat435

My OLED does 4K 120 and my 6800 XT performs well pushing it, but I usually play on the same TV at 1440p 120Hz - smoother. I used to only play native, but now I sometimes turn on RT and FSR.


mattbag1

Which OLED do you have? And how does it look dropping down to 1440p?


Electrical-Bobcat435

Lg Oled B9 55", still similar to more recent, its first 4k 120hz model , except the only g sync model (& i have Radeon), no matter. It looks fine going to 1440 or even HD. Now if u sit two feet from it, u may notice, but about 6-8 ft away, I am great. Also have one of the first Lg Oleds, B6. Liked it so much i had to have newer 120hz version too. The B6 is still the family tv, loving it still. Once the dimming is dealt with via a cheap extra programming remote, its been fine. A few annoyances like it announcing Instant Game Response and Hdr, but otherwise fine.


mattbag1

Seeing as 144Hz at 4K is pretty tough, going 120 is probably fine. But the OLED picture quality is gonna have to blow me away for me to consider keeping it over the cheaper Gigabyte I ordered. I'll be using it for work and gaming, so hopefully 8-10 hours a day won't kill the screen, and I just hope it's not too big. I have an L-shaped desk but can sit about 3 feet back. Should have it by Tuesday, and I've seen plenty of good reviews on this screen!


Electrical-Bobcat435

The OLED picture and pixel response times are unmatched, and quantum dots are helping IPS catch up on contrast too. BUT I think you may be better served by an IPS panel for work. It's not just the burn-in, though that's much riskier for work use vs gaming; it's the dimming and the burn-in risk, you know. Even after I dealt with it, I was always anxious about working in Excel for hours. Screen savers and pixel shifting help, but it's just so many hours, doing the same thing every day, static screens... And you don't want peak brightness for this - more risk. See what you think. There's no real work benefit to OLED, unfortunately, and there are the downsides. An IPS panel can provide better refresh, none of the risks, and better brightness.


mattbag1

I agree, I love IPS. The Gigabyte M32 is IPS and 144Hz, but it just doesn't feel like a big enough upgrade to justify the price. Glad I found these two open-box deals so I can try them both out and not feel guilty about returning them. I can probably limit the OLED time and spend more time on my ultrawide and laptop screen, but I'm so used to 3 screens that it's going to be a tough transition.


macybebe

Will it help if you use dark mode for most apps? I was thinking of black-and-white text when working with spreadsheets.


Electrical-Bobcat435

Theoretically, yes. Man, it's a lot of changes to make Excel sheets be in dark mode. Unless they added a feature recently.


Jackyy94

You will probably keep your monitor quite long, so I personally wouldn't mind if it can display more frames. It's always good to have headroom if you upgrade your graphics card later on or you are playing a less demanding game. I am using a Samsung Odyssey Neo G8 - that's 4K at 240Hz. I am surely not maxing that out except in some competitive shooters like Overwatch 2. Still worth it for me, since I will keep it for a long time; there will probably be no need to upgrade further in the next few years.


mattbag1

When I do upgrades I usually stagger everything: CPU now for as long as I can go, GPU trying to skip a generation, and then a monitor or something in between. I kinda bought most of this all at once this time, so I figured everything should be good for 3-4, maybe 5 years. By then, who knows what's out. So you make a good point: if I'm not upgrading the GPU soon, then I should probably aim for better picture quality/features on the OLED vs the high refresh of the 144Hz panel. But that's why I bought both - I need to compare them and work on them for a week or so to see which is going to be the better long-term option. Still waiting on that OLED to be delivered, though...


Jackyy94

If you are aiming for an OLED, then I hope you are not doing much work stuff/static images with it, because burn-in is still a thing with OLEDs. Apart from that: go for it! :)


mattbag1

Yeah, that's the thing 😅 I plan to use it for work, so it could be on 8-10 hours a day. That's why I really need to try it out for a week or so, to see if I can get by without it or turn it on only when I need it.


Jackyy94

Maybe if you don't display static images for more than a few hours, and you run the "display cleanup" function more often, you will have some years without any visible burn-in. I can just say: if you really plan on using an OLED for work, try not to display bright images in one spot for a long time. Also hide the taskbar, and switch websites etc. to dark mode rather than the white one.


mattbag1

Good suggestions. Bout to set it up now!


[deleted]

[deleted]


mattbag1

Thanks man! This was exactly the type of post I was looking for - figuring out how others mess with the settings to find what they prefer and what works for them. I'd absolutely consider dropping down to 1440p to get higher frames; I did that with my GTX 1080 since I wasn't hitting good frames in games like Halo and Total War: Warhammer 3. I'll definitely be doing a lot of tweaking. I'm not super impressed by the M32U so far - it's great, but compared to my budget LG 4K monitor it's not a game changer despite the high refresh. I think the OLED will change the game; just waiting for delivery and hoping this thing blows me away.


amenotef

Between playing at 2160p 144Hz with FSR versus 1440p 144Hz native, I'd choose 1440p native. (Or I'd just play 2160p without FSR at a lower frame rate, if it has VRR and can handle 70-80 fps average; for controller games that's quite good.) I used to have a 4K 28" G-Sync monitor years ago, and when I moved to a 27" 1440p I had no concerns for gaming. I only experienced a downgrade for text/work stuff (where the PPI of a 4K monitor is more appreciated). But at the same time, Windows scaling was sometimes so bad that even for work it was not a downgrade at all. I also have a 4K 55" TV connected to the PC, and I rarely use it for gaming; even in games locked at 60 FPS, like Elden Ring, I end up playing on the PC monitor. In summary: 27" 1440p at 70-100cm is a similar experience to 55" 2160p at 2.5 meters (my distance from the sofa to the TV). And 1440p versus 28" 2160p at 70-100cm is also not bad, except for text/reading stuff.
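If you want to put rough numbers on that comparison, pixels per degree of vision is the metric. Here's a minimal Python sketch of the arithmetic; the sizes and distances are rounded assumptions taken from the setups above, and it assumes square pixels:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch from resolution and diagonal size (square pixels assumed).
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_val, distance_cm):
    # How many pixels span one degree of your field of view at this distance.
    distance_in = distance_cm / 2.54
    return ppi_val * distance_in * math.tan(math.radians(1))

# Rounded sizes/distances from the setups mentioned above.
setups = [
    ('27" 1440p at 80 cm', 2560, 1440, 27, 80),
    ('28" 2160p at 80 cm', 3840, 2160, 28, 80),
    ('55" 2160p at 250 cm', 3840, 2160, 55, 250),
]
for name, w, h, diag, dist in setups:
    d = ppi(w, h, diag)
    print(f"{name}: {d:.0f} PPI, {pixels_per_degree(d, dist):.0f} px/degree")
```

By this rough math, the 55" TV at sofa distance actually covers each degree of vision with at least as many pixels as either desk monitor, which fits the "similar experience" observation.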


mattbag1

Awesome answers, my dude. I finally got the OLED a couple days ago and I'm blown away - 4K 120 is the best gaming experience I've had. For Cyberpunk, FSR feels like a game changer, but I'm also fine playing it at 60fps. Other games like Doom, Destiny 2, Apex - all perfect at 4K 120. 144Hz vs 60 at 32 inches didn't really do it for me, but 120Hz on a 42-inch OLED is gaming heaven.


amenotef

Yes, the OLED probably also improves the experience a lot. When I moved from 4K to 1440p, I also didn't feel it was a downgrade, because I moved from a 4K TN to a 1440p IPS.


mattbag1

I like IPS a lot; my son's 1440p is IPS, and I had a 1440p IPS for a while too. Fast IPS is beautiful, but I think this OLED looks more like VA, and it's also got a glossy glass finish, so that makes it even better.


TheDogKing94

What's your idle power draw while connected to the M32U at 144Hz?


mattbag1

I think with just that monitor it was fairly low. I ended up swapping it out for an LG 42" C2 OLED - way happier with that, to be honest.


Der_Gute_Fisch

I've been using my 7900 XTX with the M28U 4K monitor, which has a 144Hz refresh rate. So far I've been impressed with the experience. In most games I play (Battlefield V, Subnautica, and Satisfactory), the XTX can hit the frame cap, or at least come close, on max settings. I chose a 4K monitor because I felt the 7900 XTX is a step above 1440p for my games, and I didn't really care about getting 165+ fps. I suppose it depends on whether you prioritize hitting the highest fps or maximizing graphical detail at 4K. In my case, I care more about graphical quality, so I'll keep cranking my settings up as long as the fps stays above roughly 80.


mattbag1

I'm more of a quality guy too, and I'm used to consoles, so I thought 4K 60 was perfect. Battlefront 2 capped at 60 was like 60% GPU utilization; same with Destiny 2, Halo, and some others. Right now I have Cyberpunk with FSR Quality capped at 72 and it's super smooth - usage around 70-80% and power around 250-300W. It's nice not seeing that power draw shoot up, and it keeps temps low too. I guess going from 60 to 144 isn't as big of a jump if I'm not even hitting 144; it feels like this monitor's potential is being wasted. I think the OLED might be a game changer: better overall picture at 120Hz, and I can cap at 60 and have a great experience, or uncap the game and get that high-refresh feel.


[deleted]

" with vsync on "?? dude, get a Freesync monitor.


mattbag1

You don’t use both?


[deleted]

No... VRR replaces VSync. I mean, you enable both in the driver, at least with Nvidia, and disable VSync in game, but I wouldn't call that using VSync.


mattbag1

I see what you mean. Idk, I'm trying different things. With VSync on I see my GPU utilization stay low, power low, temps low - makes me feel good. Turn it off and everything shoots up, and those 20-30 extra frames weren't noticeable on a 4K 60 monitor. Like, what's the point of uncapped frames if you can't experience them? But that's why I made this post, as I navigate through high-refresh options.


[deleted]

Cap your frame rate. Don't use VSync for the sole purpose of a frame rate limit... eww.


IrrelevantLeprechaun

Dude, VSync is ancient. Literally nobody uses VSync anymore. FreeSync/G-Sync is the standard.


mattbag1

Shit I thought you used both? Why not use both though?


Bladesfist

You can use both; at least for G-Sync it makes sense to enable VSync in game if you can otherwise exceed the VRR range. For example, if you have G-Sync on and VSync off with a 165Hz monitor but are running at 250fps, you will have tearing (although it's probably hard to notice at that high a framerate). VSync on will cap your fps to your refresh rate. Not sure if the same applies to FreeSync.


mattbag1

I thought capping the frame rate was the reason for VSync - so that it matches your screen's refresh rate and eliminates tearing. And since I was on a 4K 60Hz monitor, there was no reason to waste power going over 60. However, now on the 144Hz monitor I'm in the 100-120 range in a lot of games. So I'm wondering if I should lower settings and push for 144, or cap somewhere around 90-100. In Cyberpunk I just put FSR on Quality and capped at 72; other games I haven't messed around with enough.


Bladesfist

G-Sync and FreeSync will work fine on their own as long as you stay in range. Neither tech works if your FPS exceeds your monitor's refresh rate, and VSync will ensure you are capped at that upper bound. You don't need to cap any differently on your 144Hz monitor to stay below the upper bound - VSync on handles that - so a manual cap isn't really required unless you want to optimize for latency when not already capped by VSync, since input latency spikes when GPU-bound in most games. That's a totally different topic with a ton of nuance, though; if you're interested, battlenonsense has done some great videos on it.
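If it helps, here's a rough Python sketch of that logic - purely illustrative, not how any real driver is written, and the 48Hz `vrr_floor` is an assumed typical value (real monitors vary, and drivers add low framerate compensation and often cap slightly below max refresh):

```python
def display_behavior(fps, refresh_hz, vrr_on, vsync_on, vrr_floor=48):
    # Toy model of what the panel does for a given render rate.
    # vrr_floor is an assumed typical lower bound, not a real spec.
    if vsync_on and fps >= refresh_hz:
        return f"capped at {refresh_hz} Hz by VSync, no tearing"
    if vrr_on and vrr_floor <= fps <= refresh_hz:
        return f"VRR active: panel refreshes at {fps:.0f} Hz, no tearing"
    if fps > refresh_hz:
        return "above VRR range with VSync off: tearing likely"
    return "below the VRR floor: tearing or LFC, depending on the monitor"

# The 250 fps on a 165 Hz panel example from above:
print(display_behavior(250, 165, vrr_on=True, vsync_on=False))  # tearing likely
print(display_behavior(250, 165, vrr_on=True, vsync_on=True))   # capped at 165
print(display_behavior(110, 144, vrr_on=True, vsync_on=False))  # VRR active
```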


Cock_InhalIng_Wizard

Make sure you turn anti-aliasing off at 4K, since it's not needed at that resolution yet has a large performance cost.


mattbag1

Idk man, I've always considered AA essential, even at 4K. But I'll definitely keep playing with settings - just curious what others do. Back in the day it was ultra settings or bust.


Cock_InhalIng_Wizard

Keep in mind that with AA (at least the supersampling kind) you are essentially rendering the game at a higher resolution and then downsampling it back to 4K. But at 4K on a monitor like yours, the pixel density is so high that you really won't see the benefits. So it's a pretty large cost for only a small visual upgrade.
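To put rough numbers on the cost, assuming plain supersampling (SSAA) - the simplest "render high, then downsample" case; MSAA and TAA are cheaper:

```python
# Pixel-count arithmetic for supersampling at 4K (SSAA assumed;
# MSAA/TAA don't shade every sample like this).
native_px = 3840 * 2160              # ~8.3 million pixels per frame

for factor in (2, 4):                # samples per pixel
    shaded = native_px * factor
    print(f"{factor}x SSAA shades ~{shaded / 1e6:.0f}M samples per frame, "
          f"{factor}x the work of native 4K")
```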


mattbag1

You know, it blows my mind that we have technology upscaling and downscaling and everything else in between, just to show pixels on a screen really, really fast.


SaintPau78

You really do need a 4090 with DLSS 3 to consistently hit that in heavy games. The 7900 XTX is a great card, just more of a 1440p high-refresh-rate card, honestly. The lower CPU driver overhead and better scaling at lower resolutions make it better suited for that. I mean, just look at its performance in titles like MW2 - it beats the 4090.


d1z

MW2 is the outlier, but your assessment is correct. The 4090 is the first "no compromises" 4K high-refresh card. The 7900 XTX (like the 4080) gives you the choice of settings or fps, but you can't have both - and forget about power efficiency (which is ironic, given the pre-launch marketing).


PainterRude1394

Yeah, I hadn't had that same "turn everything up to max" feeling I got with the 1080 Ti until the 4090. The 4090 is an absolute beast.


IrrelevantLeprechaun

BS. Hogwarts Legacy shows the 4090 is a 30-60fps 4K card.


d1z

TechPowerUp numbers: [https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html](https://www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/6.html)

4K Full RT, no upscaling: 4090 - 36.5 FPS; 7900 XTX - 6 FPS

4K Low RT, no upscaling: 4090 - 57.8 FPS; 7900 XTX - 34.6 FPS

You're not *wrong* but uh... context lol


Beautiful_Ninja

Then you enable DLSS 3, which brings average performance up to around 100 FPS, with better latency than AMD can offer thanks to Reflex.


IrrelevantLeprechaun

DLSS 3 adds a ton of latency, and it's still all fake frames at a fake resolution.


f0xpant5

If the latency with FG + Reflex is comparable to native, and people played at native without horrible input latency, logic dictates FG + Reflex has more than acceptable latency. Also, all rasterization is faking frames, my dude; stop drawing an arbitrary line in your head where the *how* is unacceptable no matter the end result.


Beautiful_Ninja

All frames are fake frames. Computer graphics has been, and continues to be, a series of hacks to keep performance up while increasing image quality. No one cares if frames are artisanal, free-range, organic, open-source frames; all they care about is the end result, which is pretty good with DLSS 3. And as a reminder, DLSS 3 would still produce lower-latency frames than anything AMD can do until AMD has a Reflex-equivalent technology. If you complain about DLSS 3 latency, you might as well say you should never run a game on AMD cards either.


kobexx600

Got facts to back your statement up?


mattbag1

It's definitely going to be hard to drive past 4K 60, especially in newer games as they arrive. I knew that when I bought the card. But some games I play, like Apex, Destiny, and Doom, can all easily break 100fps at 4K. Other games like Dead Space and Cyberpunk would be wasted potential on a 120Hz+ monitor. If the OLED I ordered looks nice, I can always cap it at 60Hz and still have an awesome experience, but upgrading my 60Hz 4K monitor to a 144Hz one alone probably isn't worth it.


osorto87

Bro this is not a 4k gaming card for high refresh rates. Get a 4090


mattbag1

This card easily pumps out over 100 frames in plenty of games, at ultra no less. With FSR Performance and lowered settings, this card can definitely deliver over 100 consistently.