Shumagorath
Jun 6, 2001

Dr. Video Games 0031 posted:

https://www.youtube.com/watch?v=FqpfYTi43TE

Every GPU will be different. Undervolting takes a couple of hours of fiddling to home in on a good configuration. Around 1900 MHz at 850-900 mV is a config I see a lot with GA102 GPUs. My 3080 Ti sits at 1900 MHz at 825 mV, but it seems better than most GPUs at this. Another goon wasn't stable at 1900 MHz at anything under 925 mV but found they were stable at 1850 MHz at 850(?) mV. The performance difference between these clock speeds is very small, like 1% or less. Hell, even going up from 1900 to 2050 MHz, I usually see only about a 2-3% improvement. So clock speeds aren't everything on these GPUs, and the difference can usually be made up by overclocking your memory. A 1900 MHz GPU clock with +600 on my memory nets me slightly better performance than my stock settings, which top out at around 2000 MHz.
Thanks :) Unless it's going to magically knock 100W+ off the card, I think I'd rather just enjoy my new card and deal with the heat. At 1200p it's unlikely to break a sweat in anything but Cyberpunk.

Shumagorath fucked around with this message at 03:38 on Apr 19, 2022
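
If anyone does go down the undervolting road, a quick way to check whether the curve actually holds under load is to log clocks and power draw while a game runs. A minimal monitoring sketch using the pynvml bindings (assumes the nvidia-ml-py package is installed; the curve itself still has to be set in Afterburner or similar, this only observes):

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # reported in milliwatts
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{clock_mhz} MHz  {power_w:.0f} W  {temp_c} C")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```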

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Shumagorath posted:

Thanks :) Unless it's going to magically knock 100W+ off the card, I think I'd rather just enjoy my new card and deal with the heat. At 1200p it's unlikely to break a sweat in anything but Cyberpunk.

Undervolting can also make it quieter. In my case it made quite a difference. But it was a bit of a hassle to set up, and I’ll occasionally find a game that’s allergic to the changes.

Canna Happy
Jul 11, 2004
The engine, code A855, has a cast iron closed deck block and split crankcase. It uses an 8.1:1 compression ratio with Mahle cast eutectic aluminum alloy pistons, forged connecting rods with cracked caps and threaded-in 9 mm rod bolts, and a cast high

Undervolting is the new overclocking.

hobbesmaster
Jan 28, 2008

Canna Happy posted:

Undervolting is the new overclocking.

And it’s purely silicon lottery instead of mostly silicon lottery

side_burned
Nov 3, 2004

My mother is a fish.
I just want to make sure I am understanding something correctly. When Ethereum goes from proof of work to proof of stake, I should anticipate a drop in GPU prices because mining will be less profitable, is that correct?

repiv
Aug 13, 2009

you should anticipate ethereum never actually switching to proof of stake because it's perpetually 6 months away

they literally just delayed the switchover again

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I read that at current profitability a 3080 Ti doesn't pay for itself until something like two years in, so I feel that at the very least we're seeing the effects of a major reduction in crypto demand on GPU prices, which, along with the dropping of tariffs and some element of market saturation, helps account for prices falling to close to MSRP. I would expect that if mining gets (even?) worse, and when the next generation arrives, we'll start seeing more sell-off behavior, which will put a lot of used cards into price competition and maybe drop things further again.

By such time as that I hope to have mined a shitload of loving pixels my friends, shiny pixels, shinier than fractions of fractions of coins by far. I may not have pockets full of them but my eyes eat them as they are made so it's ok.
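
As a rough illustration of how a payback figure like that falls out of the math, here's a back-of-envelope sketch. Every number below is a hypothetical placeholder, not a quote of real prices, hashrates, or power costs:

```python
# Rough mining-payback sanity check. All inputs are illustrative placeholders;
# plug in current card price, daily revenue, wall power, and electricity cost.
card_price_usd = 1200.0         # hypothetical 3080 Ti street price
daily_revenue_usd = 2.50        # hypothetical gross mining revenue per day
power_draw_w = 280.0            # hypothetical wall draw while mining
electricity_usd_per_kwh = 0.12  # hypothetical electricity cost

daily_power_cost = power_draw_w / 1000.0 * 24 * electricity_usd_per_kwh
daily_profit = daily_revenue_usd - daily_power_cost

if daily_profit <= 0:
    print("Never pays for itself at these numbers.")
else:
    days = card_price_usd / daily_profit
    print(f"Payback in roughly {days:.0f} days ({days / 365:.1f} years)")
```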

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Record your gameplay and sell it as NFTs so you can have both.

Kibner
Oct 21, 2008

Acguy Supremacy

repiv posted:

you should anticipate ethereum never actually switching to proof of stake because it's perpetually 6 months away

they literally just delayed the switchover again

Also, the people with all the mining GPUs will just switch to some other butt coin that uses PoW so that they can still make money on their cards.

sauer kraut
Oct 2, 2004

Shipon posted:

220W isn't that insane these days so I can imagine it'll be fine.

I had a few run-ins with Asus Dual 580/590s and they began to struggle at about 170W.
It's their bottom-of-the-line non-blower model, after all. If you buy that, an undervolt and a max power limit reduction are likely in order.

side_burned
Nov 3, 2004

My mother is a fish.

Kibner posted:

Also, the people with all the mining GPUs will just switch to some other butt coin that uses PoW so that they can still make money on their cards.

You know, that is a question I have always had with crypto: what exactly happens when there are multiple coins on the market? Sure, Bitcoin and now Ethereum will most likely stay valuable, but what about the dozens of other coins?

I know I am not the only one, but the more I think about crypto the more it looks like a massive waste of electricity.

Shumagorath
Jun 6, 2001

side_burned posted:

I know I am not the only one, but the more I think about crypto the more it looks like a massive waste of electricity.
:hmmyes: https://forums.somethingawful.com/showthread.php?threadid=3904417

side_burned
Nov 3, 2004

My mother is a fish.

https://www.youtube.com/watch?v=ARR-ZSyCaf4

Dr. Video Games 0031
Jul 17, 2004

Kibner posted:

Also, the people with all the mining GPUs will just switch to some other butt coin that uses PoW so that they can still make money on their cards.

Nah. None of them are remotely as profitable and it's way harder to make an altcoin profitable than you think. They'll just sell off their GPUs if PoS ever happens.

The price of used GPUs will crash when that happens, but it's hard to predict what will happen to the retail market.

pseudorandom name
May 6, 2007

And they’ll fight tooth and nail to prevent abandoning Proof of Work because that renders their massive GPU investment worthless.

Shumagorath
Jun 6, 2001
Yeah, that's the key: no one mining is actually keeping their buttz. They have to convert them back into real money ASAP.

side_burned
Nov 3, 2004

My mother is a fish.
What a loving waste.

Dr. Video Games 0031
Jul 17, 2004

sauer kraut posted:

I had a few run-ins with Asus Dual 580/590s and they began to struggle at about 170W.
It's their bottom-of-the-line non-blower model, after all. If you buy that, an undervolt and a max power limit reduction are likely in order.

Not all Duals use the same cooler. This goes for most GPU model names: the same name tends to get different coolers for GPUs with different TDPs. I just checked, and the Asus Dual 580 has a completely different cooler design than the Dual 3060 Ti and 3070. The Nvidia ones are a decent bit thicker and seem to have a vertical fin alignment with more heat pipes, so they should be better equipped for higher TDPs.

explosivo
May 23, 2004

Fueled by Satan

I've been noticing a loud wobbling, almost grinding sound from what I'm pretty sure is my GPU (3080 Ti) when the fan really gets cranking, but it stops as soon as the fan slows down. Is there a way to fix this or is this an RMA thing? It works fine otherwise, but I'm worried one of these times it's going to break.

Jimbot
Jul 22, 2008

I got my EVGA 3080 in the mail today! :toot:

Unfortunately, my PSU only has 4 PCIe slots and I didn't realize the card requires 3. Can anyone recommend an 850W or better that has more than 4 and will fit my needs? I have a mid-tower case. I really don't want to daisy-chain the cables; there are too many inconclusive and varied answers about whether that can be done without causing problems.

Shipon
Nov 7, 2005

Jimbot posted:

I got my EVGA 3080 in the mail today! :toot:

Unfortunately, my PSU only has 4 PCIe slots and I didn't realize the card requires 3. Can anyone recommend an 850W or better that has more than 4 and will fit my needs? I have a mid-tower case. I really don't want to daisy-chain the cables; there are too many inconclusive and varied answers about whether that can be done without causing problems.

If you have two dedicated cables, you can daisy-chain one of them safely. Each cable should carry a max of 150W, including the daisy chain, and the PCI-E slot supplies 75W, for a total of 375W off two cables. EVGA's 3080 shouldn't exceed that.

Edit: These numbers are from the official PCI-E spec, which is quite conservative when it comes to cable gauges, so even the crappiest PSU with the cheapest cables should be safe.

Shipon fucked around with this message at 22:51 on Apr 19, 2022
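
Shipon's cable math, spelled out as a quick check. The 320W card figure below is an assumed ballpark for a 3080, not a number from EVGA's spec sheet:

```python
# Back-of-envelope version of the cabling budget above, using the conservative
# PCI-E spec figures: 75 W from the slot plus 150 W per dedicated 8-pin cable
# (a daisy-chained second connector shares its cable's 150 W allowance).
SLOT_W = 75
PER_CABLE_W = 150

def available_power(dedicated_cables: int) -> int:
    """Board power the slot plus N dedicated PCIe cables can supply, per spec."""
    return SLOT_W + dedicated_cables * PER_CABLE_W

card_power_w = 320  # assumed ballpark for an EVGA 3080; check your card's actual spec
budget = available_power(2)
print(f"{budget} W available from the slot plus two cables")  # 375 W
print("within spec" if card_power_w <= budget else "over spec")
```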

Indiana_Krom
Jun 18, 2007
Net Slacker

Jimbot posted:

I got my EVGA 3080 in the mail today! :toot:

Unfortunately, my PSU only has 4 PCIe slots and I didn't realize the card requires 3. Can anyone recommend an 850W or better that has more than 4 and will fit my needs? I have a mid-tower case. I really don't want to daisy-chain the cables; there are too many inconclusive and varied answers about whether that can be done without causing problems.

I'm confused, your PSU has 4 slots, your new video card needs 3, so what's the problem?

Unless you have two plugged into the motherboard (which some boards do accept). In that case one of them is completely useless unless you have a pot of LN2 on top of your CPU at all times; unplug the "optional" one and send it to the GPU instead.

Shumagorath
Jun 6, 2001
Do some PSUs use the PCIe and CPU plugs interchangeably??

I always plug that second 4-pin in because WHAT IF :tinfoil:

Indiana_Krom
Jun 18, 2007
Net Slacker

Shumagorath posted:

Do some PSUs use the PCIe and CPU plugs interchangeably??

I always plug that second 4-pin in because WHAT IF :tinfoil:

On my Seasonic they are; the whole block is labeled "CPU/PCIe", though my CPU connectors are all 8-pin.

hobbesmaster
Jan 28, 2008

Indiana_Krom posted:

I'm confused, your PSU has 4 slots, your new video card needs 3, so what's the problem?

Unless you have two plugged into the motherboard (which some boards do accept). In that case one of them is completely useless unless you have a pot of LN2 on top of your CPU at all times; unplug the "optional" one and send it to the GPU instead.

The third GPU one is also optional unless you have a pot of LN2 on it.

Though, for the CPU thing, I'm pretty sure LTT is hitting that power demand in this video, which has an i9-12900KS with an aftermarket heat spreader and open-loop water. 329W.

https://www.youtube.com/watch?v=9j-YD11LcLw

Jimbot
Jul 22, 2008

Indiana_Krom posted:

I'm confused, your PSU has 4 slots, your new video card needs 3, so what's the problem?

Unless you have two plugged into the motherboard (which some boards do accept). In that case one of them is completely useless unless you have a pot of LN2 on top of your CPU at all times; unplug the "optional" one and send it to the GPU instead.

Yeah, two are in the board. I have to double-check the instruction manual to see what that second one is for and whether it's necessary. I just assumed it was needed for mobo stuff.

Edit: Yeah, I'm a dope. One of the 8-pins was optional. I thought they were both required. I'll disconnect one of those and use it for the GPU.

Jimbot fucked around with this message at 15:12 on Apr 20, 2022

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Hey, just FYI: recent Win11 patches (not sure which ones) have made HDMI 2.1, and its interactions with HDMI 2.1 receivers, rock-solid stable for me.

This probably applies to very few people, but it's kind of a big deal if it concerns you: make sure you patch your Win11 up. The entire ecosystem works great at this point: Auto HDR, HDMI 2.1 VRR, 4K 120fps, sound and video switching with HDMI 2.1 certified receivers, all of it.

So awesome.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Isn't it that time of the year when the IRS comes knocking and be all like 'hey you need to pay taxes on your crypto'

So maybe GPU prices will keep going down

Drakhoran
Oct 21, 2012

On the other hand, China is not finished with Covid so more supply problems could be coming.

Shumagorath
Jun 6, 2001
Canadian prices are indeed catching up: https://www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=213523

I ordered one of those from EVGA that's arriving today; kinda silly that I could subway over and pick one up for the same price without dealing with UPS, but there was no way to know.

repiv
Aug 13, 2009

https://www.youtube.com/watch?v=-x0orqFrHGA

DF got around to the UE5 city, looked at the limitations of software Lumen, performance scaling, etc

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

repiv posted:

https://www.youtube.com/watch?v=-x0orqFrHGA

DF got around to the UE5 city, looked at the limitations of software Lumen, performance scaling, etc

The PS5 demo did look really blurry to me, and here Alex thinks it's quite possibly internally sub 1080p (or the reconstruction is low quality).

Are we entering an era where CPUs will be bottlenecks even at fairly modest framerates?

repiv
Aug 13, 2009

Seems like a UE5 specific issue, or at least an issue with how this demo is built

For whatever reason it's struggling with parallelism and bottlenecking extremely hard on a few cores

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
i3-12100 stays winning :smugdon:

seriously though, while I understand the image quality issues DF went through, I'm still really impressed by Nanite and Lumen

repiv
Aug 13, 2009

OTOH it does seem inevitable that CPU bottlenecks will become more of a limitation this generation, due to the consoles having pretty fast CPUs this time

If a game saturates the consoles' 7-core ~3.5GHz Zen 2 to hit a 60fps target, it's going to be a struggle to get up to 120fps on PC, especially since PC CPUs are mostly scaling wider rather than faster, and every engine will hit a parallelism wall eventually

Even worse if they max out the console CPUs just to hit 30fps :barf:

repiv fucked around with this message at 16:05 on Apr 20, 2022
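
The frame-time budgets behind repiv's point, as plain arithmetic (nothing engine-specific assumed here):

```python
# Frame-time budgets behind the 30/60/120 fps comparison. If console CPU work
# fills the 60 fps budget and doesn't parallelize any further, a PC needs
# roughly double the single-thread throughput to reach 120 fps.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:.1f} ms per frame")

console_budget_ms = 1000.0 / 60   # main-thread time if the console is CPU-bound at 60 fps
target_budget_ms = 1000.0 / 120
print(f"needed single-thread speedup for 120 fps: {console_budget_ms / target_budget_ms:.1f}x")
```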

EoRaptor
Sep 13, 2003

by Fluffdaddy

Shumagorath posted:

Canadian prices are indeed catching up: https://www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=213523

I ordered one of those from EVGA that's arriving today; kinda silly that I could subway over and pick one up for the same price without dealing with UPS, but there was no way to know.

Same. I think can-com actually ends up being about 30 bucks cheaper. :/

Value was always going to drop, just a matter of when.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

repiv posted:

https://www.youtube.com/watch?v=-x0orqFrHGA

DF got around to the UE5 city, looked at the limitations of software Lumen, performance scaling, etc

It's single threaded, lmao

repiv
Aug 13, 2009

I wonder how much they're struggling with tech debt as a result of building UE5 on top of UE4, and committing to not breaking existing UE4 workflows/assets

It'll be great for quick adoption since UE4 projects can migrate to UE5, but it really limits how much they can rearchitect the engine

repiv fucked around with this message at 18:36 on Apr 20, 2022

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Zedsdeadbaby posted:

It's single threaded, lmao

Broke: can it run Crysis?

Bespoke: can everything run like Crysis?

hobbesmaster
Jan 28, 2008

Zedsdeadbaby posted:

It's single threaded, lmao

The nature of the game loop in an FPS means the main thread is likely always going to be limited by single-thread performance.
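
A stripped-down sketch of the dependency chain hobbesmaster is describing (illustrative only, not UE5's actual architecture):

```python
# Minimal sketch of a typical FPS main loop. Each stage depends on the previous
# one within a frame, so even if individual systems fan work out to worker
# threads, the frame can't complete faster than this serial chain on the main
# thread. The stage functions are stubs, purely for illustration.
import time

def poll_input(): ...
def update_simulation(dt): ...
def build_render_commands(): ...
def submit_to_gpu(commands): ...

def main_loop():
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        dt, previous = now - previous, now

        poll_input()                        # must happen before simulation
        update_simulation(dt)               # must happen before rendering
        commands = build_render_commands()  # depends on simulation results
        submit_to_gpu(commands)             # depends on render commands
```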
