gibber
May 21, 2001
Not an MC.

overeager overeater posted:

With the 4070 being actually available in dual-fan versions* I'm thinking of upgrading, but at the moment I have a Ryzen 5 3600 - is a slow CPU bottlenecking GPU performance still a problem to look out for, and if so, how do you check whether that will be a problem?

*I thought mini-ITX was a good idea at the time

I got a 4070Ti paired with a 3600 at 3440x1440 and it is glorious. The 0.1% and 1% lows are barely noticeable for me, probably because adaptive sync does such a good job at hiding them.


DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.





shrike82 posted:

https://twitter.com/VideoCardz/status/1654461220831805440?s=20

going to go day 1 on the rog ally if they launch at $700-800

Beelink have announced a mini PC line that'll have 780M APUs; very eager to see the prices

Dr. Video Games 0031
Jul 17, 2004

DoombatINC posted:

Beelink have announced a mini PC line that'll have 780M APUs; very eager to see the prices

I wonder about these. Handhelds like the GPD Win Max 2 that use LPDDR5-7500 or more could have a tangible performance advantage over PCs that use regular sodimms (DDR5-5600 in this case), even if this thing can support a much higher power envelope. I mean, the 780M is likely already hitting heavy diminishing returns past 35W.
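
(To put rough numbers on that memory gap, here's a quick back-of-the-envelope comparison. It assumes the usual 128-bit/dual-channel bus on these APUs and ignores timings, so treat it as a sketch rather than a benchmark.)

```python
# Rough peak-bandwidth comparison for an APU sharing system memory.
# Assumes a 128-bit (dual-channel) bus, which is typical for these chips;
# real-world throughput is lower and depends on timings.
BUS_WIDTH_BITS = 128

def peak_bandwidth_gb_s(transfer_rate_mt_s: float) -> float:
    # GB/s = (MT/s * bytes per transfer) / 1000
    return transfer_rate_mt_s * (BUS_WIDTH_BITS / 8) / 1000

for label, rate in [("LPDDR5-7500 (GPD Win Max 2)", 7500),
                    ("DDR5-5600 SODIMM (mini PC)", 5600)]:
    print(f"{label}: ~{peak_bandwidth_gb_s(rate):.0f} GB/s peak")
# ~120 GB/s vs ~90 GB/s: roughly a third more bandwidth for the LPDDR5 config,
# which is the kind of gap that matters for an iGPU like the 780M.
```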

Dr. Video Games 0031 fucked around with this message at 18:08 on May 6, 2023

Yudo
May 15, 2003

orange juche posted:

FSR3 does run on RDNA2, since it's an open standard and AMD specifically stated it would run on older hardware. They'd be silly not to have it run on older hardware, as current consoles are RDNA2-based and the lion's share of their graphics hardware goes into consoles.

I have been reading a bit on rdna3, and was pleased to discover it has acceleration for matrix operations, though this doesn't seem to be implemented through specialized circuitry.

I bring this up because it seems like FSR3 would perhaps be much more competitive if it utilized what rdna3 has baked in. If their goal is to sell rx7000 gpus, being an rdna3 exclusive--particularly if it rivals dlss3--doesn't seem like such a bad idea.

Edit: rdna3 got the ability to handle dual issue fp32, so never mind: what runs on rdna3 should also work on older rdna architectures, just more slowly.

Yudo fucked around with this message at 18:13 on May 6, 2023

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Dr. Video Games 0031 posted:

I wonder about these. Handhelds like the GPD Win Max 2 that use LPDDR5-7500 or more could have a tangible performance advantage over PCs that use regular sodimms (DDR5-5600 in this case), even if this thing can support a much higher power envelope since the 780M is likely already hitting heavy diminishing returns past 35W.

Zen 4's IMC is the "culprit" here, not being able to handle higher-bandwidth DDR5 even if the iGPU can. Zen 5 fixes that??

hobbesmaster
Jan 28, 2008

Seamonster posted:

Zen 4's IMC is the "culprit" here, not being able to handle higher-bandwidth DDR5 even if the iGPU can. Zen 5 fixes that??

AMD’s monolithic APUs have always supported higher speed RAM.

Dr. Video Games 0031
Jul 17, 2004

Seamonster posted:

Zen 4's IMC is the "culprit" here, not being able to handle higher-bandwidth DDR5 even if the iGPU can. Zen 5 fixes that??

Both the GPD Win Max 2 and the Beelink PCs use Zen 4 CPUs (the 7840U and 7840HS respectively). I think the 7840's memory controller is better than the desktop Ryzen parts', something to do with the monolithic design, and the only reason the Beelink uses slower memory is probably that it relies on SODIMMs (which don't come in speeds that high), while high-speed LPDDR5X can be soldered onto the mainboard close to the APU.

Zen 5 will almost certainly have a better memory controller than Zen 4, though. I'll be disappointed if it can't utilize memory speeds of 7000 or more (which Raptor Lake already supports).

Dr. Video Games 0031 fucked around with this message at 19:19 on May 6, 2023

ChazTurbo
Oct 4, 2014
How're AMD drivers these days? I want to get my brother a 6600 or 6600XT since Nvidia's midrange pricing is garbo.

Kibner
Oct 21, 2008

Acguy Supremacy

ChazTurbo posted:

How're AMD drivers these days? I want to get my brother a 6600 or 6600XT since Nvidia's midrange pricing is garbo.

The drivers are fine. Not great, not terrible. Fine.

The Adrenalin software package that's optional with them, however, is fantastic.

sauer kraut
Oct 2, 2004
There's always that one AMD problem game (I think atm it's the latest Call of Duty) but my experience with the 6000 series has been stellar.

Also your monthly reminder, if you have weird flickering with Chrome and switching between tabs/apps:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm, create DWORD OverlayTestMode with value 00000005
Reboot
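
(If you'd rather script that than click through regedit, here's a minimal sketch using Python's stdlib winreg. Run it from an elevated prompt, and delete the OverlayTestMode value to revert.)

```python
# Creates the DWM OverlayTestMode value described above.
# Must be run as administrator on Windows; reboot afterwards to apply.
import winreg

KEY_PATH = r"SOFTWARE\Microsoft\Windows\Dwm"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "OverlayTestMode", 0, winreg.REG_DWORD, 0x00000005)

print(f"Set HKLM\\{KEY_PATH}\\OverlayTestMode = 5; reboot to apply.")
```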

Dr. Video Games 0031
Jul 17, 2004

If you only want to spend $200 - $250 on a graphics card, then the 6600 or the 6600 XT/6650XT are your only real choices, and they're not bad at all.

orange juche
Mar 14, 2012



power crystals posted:

The 5800X3D is notoriously a nightmare to cool due to the stacked die topology (the cache die effectively acts as a thermal insulator). The stock cooler won't break it or anything, but getting a fancier one may appreciably improve its performance by letting it boost for longer.

One of the best things you can do with the 5800X3D is undervolt it a bit on vcore. I dropped my voltage down in the BIOS by nearly 0.1 volts, I think 0.08 or something, and it shaved off 10C as long as I'm not under all-core 100% utilization. The CPU doesn't get nearly as hot under gaming loads as it did before: I was sitting close to 80C while playing VRChat, and went down to 68C with no other changes.

The amount of thermal load that the stacked cache puts the CPU under is pretty crazy. The CPU itself won't use more than about 70W under load, but it will drive right on up to 80C even under water cooling, because the cache die blocks heat from traveling from the cores to the IHS.

orange juche fucked around with this message at 00:34 on May 7, 2023

spaceblancmange
Apr 19, 2018

#essereFerrari

Two issues I've had with my 6600 were high memory clock and idle power usage when running multiple screens, and a black screen of death with Chrome video playback, which made me finally move back to Firefox.

Drivers have been fine for gaming, and it really is the only option in the price range, but it's pretty obvious it's not going to have anything like the longevity of the 1060 it replaced.

Yudo
May 15, 2003

My undervolted 7900xt uses 80w at idle (stock it is about 90w). That seems like a lot. Aside from that, no driver problems!

orange juche
Mar 14, 2012



Yudo posted:

My undervolted 7900xt uses 80w at idle (stock it is about 90w). That seems like a lot. Aside from that, no driver problems!

yeah the MCD setup for the 7xxx series GPUs is an idle power hog. Under load it uses less than the equivalent Nvidia card but the Nvidia card is less power hungry at idle. Not sure if the 7x series cards just don't clock down as well, or if there's inefficiencies that are somehow gobbling up ~50w of power.

sauer kraut
Oct 2, 2004
It's 2-3 times as much stuff to glue together as a Ryzen CPU, so that number kinda makes sense on a napkin.
Very funny that AMD keep it in the driver patchnotes as a known issue to placate the fans, as if it was fixable.

Dr. Video Games 0031
Jul 17, 2004

Yudo posted:

My undervolted 7900xt uses 80w at idle (stock it is about 90w). That seems like a lot. Aside from that, no driver problems!

That is unusually high: https://www.techpowerup.com/review/sapphire-radeon-rx-7900-xt-pulse/37.html

If you have a secondary monitor, try setting it to 60hz. That helps with my Nvidia cards.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
That's stupidly high; a 6800xt only draws 7-9w at idle. You're telling me an undervolted 7900xt still consumes 10 times as much power at idle?

You'll be paying for that over the year for sure, especially if you live in Europe.

Truga
May 4, 2014
Lipstick Apathy
it's probably fixable, but not a priority for amd because power draw doesn't sell cards lol

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
10x as high is way off, but "proper" idle power for RDNA3 is seemingly about 5W per MCD. Before the driver patches it was 71W/81W for the 4-MCD/6-MCD configurations, and after the patches it was 46W/54W. Given the relative changes between the two products, it sounds like the problem was primarily that the GCD was not idling down properly, more so than that it was blasting the MCDs improperly.

So presumably, in the absence of MCDs, it would idle at something like 26-28W for the GCD alone, which is not dissimilar to the measured 34W idle for the 6900 XT (which has "MCDs built in" of course, in the form of PHYs on a monolithic die, but it can idle them better because it's monolithic).

So yeah, MCD does ramp up power by ~30-40W, but not 100W. The problem is, of course, that you pay that power anytime you want to talk to memory or maintain state in cache/etc.; even if the GCD is essentially idle, it still needs memory to talk to... sort of like how adding a small iGPU (or a small cache) on the IO die is a good idea for power efficiency compared to having a full chiplet. The evolution of chiplets is all about how you break apart the components so you can control when each chiplet has to be powered on.

Incidentally, when you look back on that talk from the Radeon engineers about how the power penalty of MCD was pretty minimal (supposedly only 15W)... that really must have been idle power they were talking about.
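
(As a sanity check on those numbers, here's the arithmetic spelled out, taking the ~5W-per-MCD figure as an assumption rather than a measurement.)

```python
# Back-of-the-envelope on the RDNA3 idle-power figures quoted above,
# assuming roughly 5 W per MCD (an assumption, not a measurement).
MCD_IDLE_W = 5

configs = {
    "4-MCD config": {"mcds": 4, "pre_patch_w": 71, "post_patch_w": 46},
    "6-MCD config": {"mcds": 6, "pre_patch_w": 81, "post_patch_w": 54},
}

for name, c in configs.items():
    driver_savings = c["pre_patch_w"] - c["post_patch_w"]
    implied_gcd_idle = c["post_patch_w"] - c["mcds"] * MCD_IDLE_W
    print(f"{name}: patches saved ~{driver_savings} W, "
          f"implied GCD-side idle ~{implied_gcd_idle} W")
# -> ~25-27 W saved by the driver patches and ~24-26 W left on the GCD side,
#    which is where the "GCD wasn't idling down properly" reading comes from.
```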

Paul MaudDib fucked around with this message at 17:54 on May 7, 2023

Yudo
May 15, 2003

orange juche posted:

yeah the MCD setup for the 7xxx series GPUs is an idle power hog. Under load it uses less than the equivalent Nvidia card but the Nvidia card is less power hungry at idle. Not sure if the 7x series cards just don't clock down as well, or if there's inefficiencies that are somehow gobbling up ~50w of power.

I would like to blame the display engine, but AMD generally does a good job on that front. Even with the MCD setup, 80w strikes me as extreme.

Dr. Video Games 0031 posted:

That is unusually high: https://www.techpowerup.com/review/sapphire-radeon-rx-7900-xt-pulse/37.html

If you have a secondary monitor, try setting it to 60hz. That helps with my Nvidia cards.

My second monitor is 60hz. Unfortunately, from reading reviews (professional and consumer), this does not seem to be an uncommon problem for people with multi-monitor setups. There were several Amazon reviews with complaints of 100w(!) at idle. I am extremely reluctant to return this card seeing as I have gone through three--a 6950xt and two 7900xts--that had the most horrendous coil whine I have ever encountered. The one I have now is, mercifully, inaudible within a closed case.

Paul MaudDib posted:

So yeah, MCD does ramp up power by ~30-40W, but not 100W. The problem is, of course, that you pay that power anytime you want to talk to memory or maintain state in cache/etc.; even if the GCD is essentially idle, it still needs memory to talk to... sort of like how adding a small iGPU (or a small cache) on the IO die is a good idea for power efficiency compared to having a full chiplet. The evolution of chiplets is all about how you break apart the components so you can control when each chiplet has to be powered on.

I just can't accept that it takes 100w to decode a Youtube video. There has to be something wrong.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
It's likely to be your multi-monitor setup with differing refresh rates. It's probably the most common issue that pops up for different goons in this thread.

Yudo
May 15, 2003

Zedsdeadbaby posted:

It's likely to be your multi-monitor setup with differing refresh rates. It's probably the most common issue that pops up for different goons in this thread.

And we have a winner. I set both monitors to 60hz and now have an idle power usage of 18w-30w. I have never encountered this with a 1080, 970, or 7970. One monitor is 144hz, the other 60hz. The 60hz panel sucks and I was going to replace it anyway. Will two 144hz monitors be a problem, or is it only as you said: two monitors of different refresh rates?

Thanks for your help, everyone.

Dr. Video Games 0031
Jul 17, 2004

It's not differing refresh rates that does it, really. I have two 165 Hz monitors. When both are at 165 Hz, my 4090 pulls around 100 watts at idle. When the secondary monitor is at 60 Hz, my GPU idles at a more reasonable 30 watts. I can set both to 120 Hz, and it's more like an 80-watt idle now. Just having one of the monitors at 60hz is enough to let the memory controller run at a lower power state.

edit: Despite the above success, I highly doubt it's the mismatched refresh rates that does it exactly. GPU memory controllers just don't like high refresh displays and are really fickle in general. It seems unlikely to me that a second 144 Hz panel will solve the issue. But there can be strange quirks to the way monitor timings and pixel clocks work on different monitors, and some monitors randomly seem to be less disruptive to a gpu's idle states than others.
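
(To put a rough number on the pixel-clock angle: one common explanation is that VRAM can only retrain to a lower clock during blanking intervals, so the combined pixel clock and the timing mismatch across displays both matter. The sketch below assumes two hypothetical 2560x1440 panels and a flat ~20% blanking overhead; real timings vary per monitor, which is exactly the quirkiness described above.)

```python
# Very rough combined pixel-clock estimate for a dual-monitor setup.
# The 2560x1440 resolutions and the flat 20% blanking overhead are
# illustrative assumptions; actual timings differ per monitor/mode.
BLANKING_OVERHEAD = 1.20

def pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

setups = {
    "165 Hz + 165 Hz": [(2560, 1440, 165), (2560, 1440, 165)],
    "120 Hz + 120 Hz": [(2560, 1440, 120), (2560, 1440, 120)],
    "165 Hz + 60 Hz":  [(2560, 1440, 165), (2560, 1440, 60)],
}

for name, monitors in setups.items():
    total = sum(pixel_clock_mhz(*m) for m in monitors)
    print(f"{name}: ~{total:.0f} MHz combined pixel clock")
# The higher (and more mismatched) the combined demand, the less likely the
# memory controller is to drop into its low-power idle state.
```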

Dr. Video Games 0031 fucked around with this message at 22:27 on May 7, 2023

UHD
Nov 11, 2006


For what it's worth, I have a 165hz and a 144hz monitor and my 4070ti idles at around 10W.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
super good double-feature analyzing perfcounters in CP2077 path-tracing and the XeSS/FSR upscalers

Paul MaudDib fucked around with this message at 01:47 on May 8, 2023

Yudo
May 15, 2003

Dr. Video Games 0031 posted:

It's not differing refresh rates that does it, really. I have two 165 Hz monitors. When both are at 165 Hz, my 4090 pulls around 100 watts at idle. When the secondary monitor is at 60 Hz, my GPU idles at a more reasonable 30 watts. I can set both to 120 Hz, and it's more like an 80-watt idle now. Just having one of the monitors at 60hz is enough to let the memory controller run at a lower power state.

edit: Despite the above success, I highly doubt it's the mismatched refresh rates that does it exactly. GPU memory controllers just don't like high refresh displays and are really fickle in general. It seems unlikely to me that a second 144 Hz panel will solve the issue. But there can be strange quirks to the way monitor timings and pixel clocks work on different monitors, and some monitors randomly seem to be less disruptive to a gpu's idle states than others.

You are likely correct, but my dinosaur of a 60hz panel is not long for this world anyway. From my experimentation, refresh rate does not impact idle power with one monitor alone. With two monitors, it seems to scale with the combined refresh rate: 144hz + 60hz uses more idle power than 120hz + 60hz and so on. This seems like a problem on the part of AMD as my 1080 and 970 never did this.

I do appreciate having a stopgap solution (running 60hz+60hz day to day), and I can only hope that this can be fixed in software.

repiv
Aug 13, 2009


lol at 90% of the frame being one giant dispatchrays call

all the tracing and shading is bundled into one enormous ubershader then, which figures since SER can only re-order work within a dispatch. AMD would probably prefer it to be broken into multiple smaller shaders with manual sorting steps in between, but then it wouldn't be able to take advantage of nvidia and intel's on-chip sorting. the only way to make everyone happy would be to implement both approaches i think

dkj
Feb 18, 2009

Been tinkering with undervolt settings for my 7900xt and my best scores on time spy extreme are a few hundred points below average.

I’m wondering if it’s below average because the average 3Dmark user is overclocking/enthusiast grinding for performance/score or if I’m genuinely behind the majority of people who have a 5800x3D/7900xt setup?

Dr. Video Games 0031
Jul 17, 2004

In my experience, 3DMark averages tend to be slightly below what is normal for a typical, well-built PC because there are always a bunch of people with a dozen kinds of malware running in the background using these benchmarks too. That said, I wouldn't sweat a difference of just a few hundred points. That seems roughly within the margins of silicon lottery, and it shouldn't lead to a noticeable difference in gaming performance.

power crystals
Jun 6, 2007

Who wants a belly rub??

dkj posted:

Been tinkering with undervolt settings for my 7900xt and my best scores on time spy extreme are a few hundred points below average.

I’m wondering if it’s below average because the average 3Dmark user is overclocking/enthusiast grinding for performance/score or if I’m genuinely behind the majority of people who have a 5800x3D/7900xt setup?

Check the CPU-specific benchmark data if you can. I had that happen to me when I first built this and it turned out to be the 5800X3D thermal throttling as those things love to do. A -25 curve optimizer value got me above average with no change to the GPU.

dkj
Feb 18, 2009

Dr. Video Games 0031 posted:

In my experience, 3DMark averages tend to be slightly below what is normal for a typical, well-built PC because there are always a bunch of people with a dozen kinds of malware running in the background using these benchmarks too. That said, I wouldn't sweat a difference of just a few hundred points. That seems roughly within the margins of silicon lottery, and it shouldn't lead to a noticeable difference in gaming performance.

I didn’t think it was that popular. I guess it could be worse.

power crystals posted:

Check the CPU-specific benchmark data if you can. I had that happen to me when I first built this and it turned out to be the 5800X3D thermal throttling as those things love to do. A -25 curve optimizer value got me above average with no change to the GPU.

I currently have it at -20, and I've tried -25 and -30 and it scored lower. I don't think it's thermal throttling, but I'll make sure next time because the CPU scores don't seem great.

Wonton
Jul 5, 2012

sauer kraut posted:

There's always that one AMD problem game (I think atm it's the latest Call of Duty) but my experience with the 6000 series has been stellar.

Also your monthly reminder, if you have weird flickering with Chrome and switching between tabs/apps:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Dwm, create DWORD OverlayTestMode with value 00000005
Reboot

Ummm can someone please elaborate? I am having similar issues and found something similar elsewhere, but I'm not sure what's going on (not sure if it really helps). Thanks.

I have a 7900Xt with 5800x3d

Wonton fucked around with this message at 11:42 on May 8, 2023

wolrah
May 8, 2006
what?

Yudo posted:

Will two 144hz monitors be a problem, or is it only as you said: two monitors of different refresh rates?

This behavior is not fully understood and sometimes can come and go with different driver releases, so it's hard to say anything for sure.

That said, in general it doesn't seem to happen to people using identical monitors, no matter how many they have. It doesn't have to be exactly the same monitor, but they need to be same enough that the signal on the wire looks the same. Same resolution, bit depth, refresh rate, etc. Any kind of mismatched set is likely to trigger higher than expected power consumption.

sauer kraut
Oct 2, 2004

Wonton posted:

Ummm can someone please elaborate? I am having similar issues and found something similar elsewhere, but I'm not sure what's going on (not sure if it really helps). Thanks.

I have a 7900Xt with 5800x3d

It disables some Microsoft fullscreen/overlay system that causes a lot of problems.
Harmless and easy enough to revert if it doesn't help in your case.

power crystals
Jun 6, 2007

Who wants a belly rub??

dkj posted:

I currently have it at -20, and I've tried -25 and -30 and it scored lower. I don't think it's thermal throttling, but I'll make sure next time because the CPU scores don't seem great.

Have you also tried at 0? What does HWINFO say your temps are for the various runs? And what kind of clock speed is it hitting for those tests?

Are you sure your board's BIOS isn't so old that it can't actually let the CPU boost? The one this board came with was super old, which was an "oh, duh" moment when I realized it had been stuck at 3.4GHz the whole time.

Yudo
May 15, 2003

wolrah posted:

This behavior is not fully understood and sometimes can come and go with different driver releases, so it's hard to say anything for sure.

That said, in general it doesn't seem to happen to people using identical monitors, no matter how many they have. It doesn't have to be exactly the same monitor, but they need to be same enough that the signal on the wire looks the same. Same resolution, bit depth, refresh rate, etc. Any kind of mismatched set is likely to trigger higher than expected power consumption.

As alluded to by other posters in this thread, the memory clock is fucky. I recall this being a problem with some 3090tis. At 144hz+60hz the memory clock is locked at 2487mhz, at 120hz+60 it is locked at 909mhz, and at 60hz+60hz it varies between 25-110mhz. Given that, I am reluctant to place my hopes in a new monitor fixing this issue. Replacing a 9 year old 60hz panel isn't an unappealing idea, but my 144hz monitor is no longer available at a reasonable price. That is to say, an identical match is impossible, though getting the exact same settings on a newer LG 144hz+ panel shouldn't be a problem. As it stands, 120hz+60hz is an okay compromise with idle power usage at around 40-50w.

Wonton
Jul 5, 2012
I just want to vent.

I had an ASRock X570 board and 32GB of 3600 DDR4.

Initially had a 5600X and a 6600XT.

Ended up replacing them with a 5800X3D and a 7900XT, plus my 1TB M.2 drive.

And a PCIe 4.0 extension riser cable for my Dan A4-H2O (bought the wrong version of the case).

And I still get stupid driver errors. I really want to support team red because of garbage Nvidia pricing, but man this sucks. My 2x 34" 144Hz Dell monitors feel like wasted money.

Or I should have gotten a 4070ti or something, I dunno. I'm frustrated, sorry guys.

Just want to say my AMD driver experience is just anecdotal, and it's made me more superstitious with computers.

hobbesmaster
Jan 28, 2008

Have you tested with PCIE3 mode? Even good PCIE4 risers can be questionable and that’s something that could impact overall stability.


Olewithmilk
Jun 30, 2006

What?

Hi all, bit of a long shot but I'm looking at a PC on an auction website. I don't think the GPU is going to be a 4090 or anything, but can anyone work out what type it is from this admittedly lovely picture?
