repiv
Aug 13, 2009

yeah it's poor form for a game to run uncapped in menus (it's a waste of power if nothing else) but if it causes the hardware to break then it's ultimately the hardware's fault
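
For context, a menu cap is about the cheapest thing a game can do; the whole idea fits in a few lines. A rough sketch of the frame-pacing loop in plain Python (standing in for whatever the engine actually uses; the function names here are made up):

code:
import time

TARGET_FPS = 60                      # e.g. drop to 30 while sitting in menus
FRAME_BUDGET = 1.0 / TARGET_FPS      # seconds allowed per frame

def render_menu_frame():
    """Stand-in for whatever the menu actually draws each frame."""
    pass

def run_menu(frames=600):
    for _ in range(frames):
        start = time.perf_counter()
        render_menu_frame()
        elapsed = time.perf_counter() - start
        # sleep off the unused budget instead of letting the GPU spin
        # at thousands of uncapped frames per second
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

run_menu()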


Arrath
Apr 14, 2011


repiv posted:

there are reports of the diablo 4 beta being new world 2: card explosion boogaloo

https://www.tomshardware.com/news/rtx-3080-ti-gpus-are-mysteriously-dying-on-diablo-iv-beta

most reports are of gigabyte 3080tis dying

Lol doesn't Blizz ever learn? Launch SCII had a problem with the menu not being frame limited, as I recall.

I know it made my laptop spin the fans for takeoff and get supernova hot until I went into the nvidia control panel and forced a limit on it.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
At this point, if you have a high-end GPU and you don't have a frame limiter going all the time because you don't have a sub-$300 VRR monitor, I kinda have no sympathy. People's priorities are hosed.

LOL Blizzard tho.

repiv
Aug 13, 2009

people are calling out a particular cutscene (the one "in the chapel") as being the trigger, is that in-engine or pre-rendered?

kliras
Mar 27, 2021
the game gobbles up memory across dram and vram like crazy, so it's possible cards are getting a stress test they otherwise don't

only the intro is prerendered, so it's in-game

repiv
Aug 13, 2009

if it's in-engine then it's unlikely to be an uncapped-FPS issue

:iiam:

kliras
Mar 27, 2021
it's 41:30 into this video

https://www.youtube.com/watch?v=SoQAp41w6Sw&t=2490s

there generally appears to be some weird stuff going on with the "high(-res)" texture settings

Arivia
Mar 17, 2011

if i watch this youtube is it going to destroy my 1070

Arrath
Apr 14, 2011


K8.0 posted:

At this point, if you have a high-end GPU and you don't have a frame limiter going all the time because you don't have a sub-$300 VRR monitor, I kinda have no sympathy. People's priorities are hosed.

LOL Blizzard tho.

I'm one of those weirdos with a 3080 driving a stupidly big curved Dell Ultrawide at 60fps lol

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

K8.0 posted:

At this point, if you have a high-end GPU and you don't have a frame limiter going all the time because you don't have a sub-$300 VRR monitor, I kinda have no sympathy. People's priorities are hosed.

LOL Blizzard tho.

Weird take that you expect everyone to know or care about frame rate limiting. Maybe some people just buy the card or buy a prebuilt, and install a game and just expect it to work.

Take the same thought and move it over to CPUs. Are we saying that people encoding video should expect their CPU to just burn up? That running at 100% for extended periods is wrong? Nah..

Edit: nobody try :iiaca: "well you don't expect to run your car at WOT all day long"

HalloKitty fucked around with this message at 18:26 on Mar 23, 2023

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Crazy idea here but why not tell the GPU (in firmware) to cap framerate to the display refresh rate unless the user specifically tells it otherwise?

uiruki
Aug 6, 2003
blah blah blah
I'm not sure it is unlimited frame rates - for whatever reason the cutscenes seemed to be locked at about 45fps on my 3080. They are really heavy though, and the game absolutely devours memory, so it might be the point where it brings in all those extra assets, especially at the High texture setting. I've never played a game that has used as much RAM (nearly 30GB on my 32GB system) as D4 on high textures.
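
If anyone wants to sanity-check numbers like that on their own machine, something like this will log the game's RAM footprint while you play (psutil is a third-party package, and the executable name below is a guess; swap in whatever the D4 process is actually called):

code:
import time
import psutil  # pip install psutil

PROCESS_NAME = "Diablo IV.exe"  # assumption; check Task Manager for the real name

def find_process(name):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            return proc
    return None

game = find_process(PROCESS_NAME)
if game is None:
    print(f"{PROCESS_NAME} doesn't appear to be running")
else:
    while game.is_running():
        try:
            rss_gb = game.memory_info().rss / 1024**3          # the game's working set
        except psutil.NoSuchProcess:
            break
        total_gb = psutil.virtual_memory().used / 1024**3      # whole-system RAM in use
        print(f"game: {rss_gb:5.1f} GB   system: {total_gb:5.1f} GB")
        time.sleep(5)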

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

HalloKitty posted:

Weird take that you expect everyone to know or care about frame rate limiting. Maybe some people just buy the card or buy a prebuilt, and install a game and just expect it to work.

Take the same thought and move it over to CPUs. Are we saying that people encoding video should expect their CPU to just burn up? That running at 100% for extended periods is wrong? Nah..

Edit: nobody try :iiaca: "well you don't expect to run your car at WOT all day long"

It's definitely stupid that AMD and Nvidia haven't made a capped framerate + vsync the default when VRR is enabled, but that's just how it is for now. And sure, there are people who are ignorant, and yes, games shouldn't break hardware and hardware shouldn't break from games. It's just silly seeing people with $1k+ GPUs and $50 monitors.

Vintersorg
Mar 3, 2004

President of
the Brendan Fraser
Fan Club



Who cares what they pair it up with.

VelociBacon
Dec 8, 2009

Arrath posted:

Lol doesn't Blizz ever learn? Launch SCII had a problem with the menu not being frame limited, as I recall.

I think everyone from SC2 has left Blizzard at this point, but the real test is going to be when the game actually goes live: has Blizz decided to rent servers or whatever to deal with the load, or are they going to expect people who spent $70 USD on the game to wait in massive queues...

Jiro
Jan 13, 2004

I mean, I'm pretty sure everyone here knows the answer considering the past.

Inept
Jul 8, 2003

VelociBacon posted:

I think everyone from SC2 has left Blizzard at this point, but the real test is going to be when the game actually goes live: has Blizz decided to rent servers or whatever to deal with the load, or are they going to expect people who spent $70 USD on the game to wait in massive queues...

the answer from OW2's launch last fall is a "no"

Arrath
Apr 14, 2011


It costs a lot of money just to scale up for those 72 hours of launch fever.

My dumb rear end braved an actual blizzard to get my copy of Burning Crusade only to sit in server queues. I'm told the launch of Lich King Classic was much the same. I don't expect the d4 launch to be any different.

Cygni
Nov 12, 2005

raring to post

VelociBacon posted:

I think everyone from SC2 has left Blizzard at this point, but the real test is going to be when the game actually goes live: has Blizz decided to rent servers or whatever to deal with the load, or are they going to expect people who spent $70 USD on the game to wait in massive queues...

To be fair, at least in California, I only had to wait in a queue the first time I logged in on launch day (an hour lol, and it was occasionally chuggy in game as well), but I was able to play the rest of the weekend without any other server issues at all, and I played... a lot. I hear that wasn't everyone's experience, though.

For a big MMO-ish game, I sorta expect stuff to not work on day 1, but that's because I'm an old gamer who has been acclimatized to dogshit corporate launch-day planning.

Jenny Agutter
Mar 18, 2009

VelociBacon posted:

I think everyone from SC2 has left Blizzard at this point, but the real test is going to be when the game actually goes live: has Blizz decided to rent servers or whatever to deal with the load, or are they going to expect people who spent $70 USD on the game to wait in massive queues...

Day 1 and 2 will see massive wait times for anyone trying to log in. Day 3 they'll put out a statement about bringing more servers/shards up that should alleviate wait times (this is a lie, they won't actually do anything). Player population will naturally fall until the end of week 2, at which point most people will be able to get in without a wait. Cost to Activision: $0 and a couple tweets.

Inept
Jul 8, 2003

Arrath posted:

It costs a lot of money just to scale up for those 72 hours of launch fever.

Blizzard regularly has launches they need to prepare and provision flexible capacity for. It's not excusable that they still have issues at launch; it isn't 2006 anymore. AWS, GCP, and elastic capacity are standard for large games like Apex and Valorant, but Blizzard probably has over a decade of technical debt that no one has bothered to fix.
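
Back-of-the-envelope version of why the launch spike is the whole problem (every number below is invented for illustration, nothing to do with Blizzard's actual capacity): if logins-per-minute runs even modestly behind arrivals for a few hours, the queue balloons into the hundreds of thousands and then takes hours more to drain.

code:
# toy login-queue model; all figures are made up for illustration
ARRIVALS_PER_MIN = [20_000] * 180 + [5_000] * 180  # 3-hour rush, then demand tails off
LOGINS_PER_MIN = 12_000                            # what the backend actually admits

queue = 0
for minute, arrivals in enumerate(ARRIVALS_PER_MIN):
    queue = max(0, queue + arrivals - LOGINS_PER_MIN)
    if minute % 60 == 59:
        print(f"hour {minute // 60 + 1}: ~{queue:,} players in queue")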

GhostDog
Jul 30, 2003

Always see everything.
Took me three tries to realize that it's not the servers themselves disconnecting me all the time, but rather that selecting one specific loadout does. Reproducibly.

Blurb3947
Sep 30, 2022

Arrath posted:

It costs a lot of money just to scale up for those 72 hours of launch fever.

My dumb rear end braved an actual blizzard to get my copy of Burning Crusade only to sit in server queues. I'm told the launch of Lich King Classic was much the same. I don't expect the d4 launch to be any different.

Oh yeah that poor company, would hate for them to have to fork out some money just so people could experience the thing they paid for.

Arrath
Apr 14, 2011


Blurb3947 posted:

Oh yeah that poor company, would hate for them to have to fork out some money just so people could experience the thing they paid for.

I ain't defending poo poo, just based off of almost two decades of their MMO history I don't expect them to do a drat thing to actually prepare for or overprovision servers for the launch.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Blurb3947 posted:

Oh yeah that poor company, would hate for them to have to fork out some money just so people could experience the thing they paid for.

You're not wrong. Not to jump on Square Enix's dick too much, but the crush of people trying to log in to play Endwalker (I am sure you've heard the stories about hellqueue by now) did prompt them to retask their prod servers to increase login capacity. So while it doesn't make sense to build new servers just to absorb launch rushes, there is something to be said for designing infrastructure that lets additional resources be reallocated to improve the launch experience.

But no, that would cost money and time and therefore Bobby K would never allow it.

Cygni
Nov 12, 2005

raring to post

I mean it's the same as everything else. Why have airlines cut every perk? Because people only care about price and keep flying. People have repeatedly shown that spending money to ramp up launch-day servers isn't worth it, because people forget about it and keep buying the games. :capitalism:

Enderzero
Jun 19, 2001

The snowflake button makes it
cold cold cold
Set temperature makes it
hold hold hold
I really don't feel bad for anyone caught in a day-one fiasco. Even here you see people rightfully complaining about bad launches and getting screwed by whatever game just launched, be it overloaded servers, crazy bugs, or performance issues. And yet every time a game launches there are dozens of people, often a large subset of the first group, counting down the hours and minutes and asking "can I change my region to NZ to get it a few hours early??"

You waited two years for the game, you can wait 4 days and see how it goes. You know the drill, it often goes this way, and life is in no way impacted by booting a game a few days later. As you said, we kinda get what we deserve.

Jiro
Jan 13, 2004

Inept posted:

Blizzard regularly has launches that they need to prepare for and have flexible capacity for. It's not excusable that they still have issues at launch. It isn't 2006 anymore. AWS and GCP and flexible capacity are standard for large games like Apex and Valorant, but Blizzard probably has over a decade of technical debt that no one bothered to fix.

This is why the buyout and merger needs to happen you guys! :suicide:

Dr. Video Games 0031
Jul 17, 2004

Josh Lyman posted:

This seems to be the only benchmark comparing 5800X to (hypothetical) 7800X3D on a 4090:


It's an average of 12.6% more FPS, but your 5600X is also averaging about 148fps. Of course, if you want to lock at 144Hz, then the additional margin afforded by 7800X3D would help.

You won't benefit from 64GB RAM vs 32GB. One thing to note is that Ryzen 7000 doesn't seem to like 4 DIMMs so if you KNOW you're going to need 64GB for productivity, you could look into getting 2x 32GB but that's incredibly overkill for games. 2x 16GB is more than enough.

It's worth noting that if you like to turn on ray tracing whenever possible, then it might help a lot more than 12%. But again, it depends on the game. Spider-man, Callisto Protocol, Witcher 3, HogLeg, and Hitman 3 are a handful of recent games that end up heavily CPU limited (usually single-thread limited) when turning on ray tracing. As in, they frequently struggle to maintain 60fps on zen 3. So it really all depends on what games you like to play.
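
For what it's worth, the arithmetic behind that 144Hz point, using the numbers quoted above and treating the 12.6% as a flat uplift (which is a simplification):

code:
baseline_fps = 148   # rough average from the benchmark being discussed
uplift = 0.126       # reported average gain for the (hypothetical) 7800X3D
cap = 144            # target if you lock to a 144Hz display

boosted_fps = baseline_fps * (1 + uplift)
print(f"boosted average: {boosted_fps:.0f} fps")              # ~167 fps
print(f"margin over {cap}Hz: {boosted_fps - cap:.0f} fps "
      f"(vs {baseline_fps - cap} fps before)")                # ~23 fps vs 4 fps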

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

repiv posted:

yeah it's poor form for a game to run uncapped in menus (it's a waste of power if nothing else) but if it causes the hardware to break then it's ultimately the hardware's fault

New World's failures were ultimately tracked back (via x-ray analysis of failed cards) to bad soldering on a batch of EVGA cards. iirc there were a handful of other people reporting failures, including a few AMD cards, and it's likely that all of them were just down to occasional manufacturing defects. in 2020-2022 you had quality control slipping at every level (component manufacture, assembly, etc) due to COVID, mining, and "lol who cares, ship it and let's get paid" all aligning. Nobody was inclined to be too anal about a slightly higher rate of parts/assembly defects when it's the difference between keeping a line running or not.

Igor went off on yet another wild goose chase of course, and Buildzoid threw in his own random speculation too. But basically another batch of bad EVGA cards (and a handful of others) were the cause. Presumably running thousands of FPS at full power load is a good workout of solder joints etc - high framerate causes high-frequency vibrations in inductors (coil whine) and also caps (cap whine is a thing, go full screen on this and put your ear up to your display) and guess what's sitting on the output of those MOSFETs? Plus any resonance of their own - and of course since you're doing this at full power it's as high an amplitude and as strong a physical vibration as the discretes can generate. Of course it's a manufacturing defect but it's also not just random chance that a super high framerate at max power is causing problems either, and partners just kinda didn't really give a poo poo during COVID.

EVGA was initially blamed for POSCAP/MLCC too (another Igor special). And they had 1080s catching on fire back in 2016 too. They tend to be in the hotseat a lot for a variety of reasons - they were the #1 partner by volume in the US market (!) and especially in the DIY market everybody who had a choice tended to buy from them (and step up/etc made that easier vs fighting miner bots at retail). So there's simply a larger pool of cards to have issues, and disproportionately in DIYer hands. On top of that, they subcontracted out their assembly to a third party factory, who had absolutely no incentive to keep defects under control to any extent more than they needed to in order to get paid. It's not their factory that's covering the warranty, why go looking for trouble? EVGA made up for it with a super generous warranty... but they definitely had higher defect rates than other partners too.

Paul MaudDib fucked around with this message at 01:49 on Mar 24, 2023

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zero VGS posted:

Crazy idea here but why not tell the GPU (in firmware) to cap framerate to the display refresh rate unless the user specifically tells it otherwise?

somewhere, a counterstrike player startles awake in a cold sweat and doesn't know why

Shipon
Nov 7, 2005

Paul MaudDib posted:

New World's failures were ultimately tracked back (via x-ray analysis of failed cards) to bad soldering on a batch of EVGA cards. iirc there were a handful of other people reporting failures, including a few AMD cards, and it's likely that all of them were just down to occasional manufacturing defects. in 2020-2022 you had quality control slipping at every level (component manufacture, assembly, etc) due to COVID, mining, and "lol who cares, ship it and let's get paid" all aligning. Nobody was inclined to be too anal about a slightly higher rate of parts/assembly defects when it's the difference between keeping a line running or not.

Igor went off on yet another wild goose chase of course, and Buildzoid threw in his own random speculation too. But basically another batch of bad EVGA cards (and a handful of others) were the cause. Presumably running thousands of FPS at full power load is a good workout of solder joints etc - high framerate causes high-frequency vibrations in inductors (coil whine) and also caps (cap whine is a thing, go full screen on this and put your ear up to your display) and guess what's sitting on the output of those MOSFETs? Plus any resonance of their own - and of course since you're doing this at full power it's as high an amplitude and as strong a physical vibration as the discretes can generate. Of course it's a manufacturing defect but it's also not just random chance that a super high framerate at max power is causing problems either, and partners just kinda didn't really give a poo poo during COVID.

EVGA was initially blamed for POSCAP/MLCC too (another Igor special). And they had 1080s catching on fire back in 2016 too. They tend to be in the hotseat a lot for a variety of reasons - they were the #1 partner by volume in the US market (!) and especially in the DIY market everybody who had a choice tended to buy from them (and step up/etc made that easier vs fighting miner bots at retail). So there's simply a larger pool of cards to have issues, and disproportionately in DIYer hands. On top of that, they subcontracted out their assembly to a third party factory, who had absolutely no incentive to keep defects under control to any extent more than they needed to in order to get paid. It's not their factory that's covering the warranty, why go looking for trouble? EVGA made up for it with a super generous warranty... but they definitely had higher defect rates than other partners too.

Is it good or bad that I can't hear anything from that coil whine example even in fullscreen.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Shipon posted:

Is it good or bad that I can't hear anything from that coil whine example even in fullscreen.

good for your monitor, perhaps bad for your hearing? grandpa can't hear the mosquito tones :laugh:

my current X34GS is really good, I can hear it but I have to listen for it, it's easier to pick it up during certain parts of the frequency sweep. My old Dell P2015 whatevers from work you could hear from the next cube over.
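
If you'd rather not trust a YouTube video's audio chain, a sweep like that is easy to generate yourself; a rough sketch using numpy and the standard-library wave module (the 8-18kHz range is just a guess at typical coil-whine / mosquito-tone territory, and mind your volume):

code:
import wave
import numpy as np

SAMPLE_RATE = 44_100
DURATION = 20.0                    # seconds
F_START, F_END = 8_000, 18_000     # Hz; rough guess at coil-whine territory

t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)
freq = np.linspace(F_START, F_END, t.size)                 # linear sweep
phase = 2 * np.pi * np.cumsum(freq) / SAMPLE_RATE          # integrate frequency -> phase
samples = (0.3 * np.sin(phase) * 32767).astype(np.int16)   # keep amplitude modest

with wave.open("sweep.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)              # 16-bit PCM
    f.setframerate(SAMPLE_RATE)
    f.writeframes(samples.tobytes())

print("wrote sweep.wav - note the point where it disappears for you")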

Bondematt
Jan 26, 2007

Not too stupid

Shipon posted:

Is it good or bad that I can't hear anything from that coil whine example even in fullscreen.

Both

SwissArmyDruid
Feb 14, 2014

by sebmojo

Zero VGS posted:

Crazy idea here but why not tell the GPU (in firmware) to cap framerate to the display refresh rate unless the user specifically tells it otherwise?

pretty sure nvidia locks that to GFE. No snooping? NO FEATURES FOR YOU.

8-bit Miniboss
May 24, 2005

CORPO COPS CAME FOR MY :filez:

SwissArmyDruid posted:

pretty sure nvidia locks that to GFE. No snooping? NO FEATURES FOR YOU.

Framerate cap is in the Nvidia Control Panel, part of the basic driver install. Don’t need GFE for that.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Y’all ain’t tricking me into putting my ear on my monitor.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Paul MaudDib posted:

New World's failures were ultimately tracked back (via x-ray analysis of failed cards) to bad soldering on a batch of EVGA cards. iirc there were a handful of other people reporting failures, including a few AMD cards, and it's likely that all of them were just down to occasional manufacturing defects. in 2020-2022 you had quality control slipping at every level (component manufacture, assembly, etc) due to COVID, mining, and "lol who cares, ship it and let's get paid" all aligning. Nobody was inclined to be too anal about a slightly higher rate of parts/assembly defects when it's the difference between keeping a line running or not.

Igor went off on yet another wild goose chase of course, and Buildzoid threw in his own random speculation too. But basically another batch of bad EVGA cards (and a handful of others) were the cause. Presumably running thousands of FPS at full power load is a good workout of solder joints etc - high framerate causes high-frequency vibrations in inductors (coil whine) and also caps (cap whine is a thing, go full screen on this and put your ear up to your display) and guess what's sitting on the output of those MOSFETs? Plus any resonance of their own - and of course since you're doing this at full power it's as high an amplitude and as strong a physical vibration as the discretes can generate. Of course it's a manufacturing defect but it's also not just random chance that a super high framerate at max power is causing problems either, and partners just kinda didn't really give a poo poo during COVID.

EVGA was initially blamed for POSCAP/MLCC too (another Igor special). And they had 1080s catching on fire back in 2016 too. They tend to be in the hotseat a lot for a variety of reasons - they were the #1 partner by volume in the US market (!) and especially in the DIY market everybody who had a choice tended to buy from them (and step up/etc made that easier vs fighting miner bots at retail). So there's simply a larger pool of cards to have issues, and disproportionately in DIYer hands. On top of that, they subcontracted out their assembly to a third party factory, who had absolutely no incentive to keep defects under control to any extent more than they needed to in order to get paid. It's not their factory that's covering the warranty, why go looking for trouble? EVGA made up for it with a super generous warranty... but they definitely had higher defect rates than other partners too.

this post was like watching an episode of air crash investigations. good job

Former Human
Oct 15, 2001

Paul MaudDib posted:

high framerate causes high-frequency vibrations in inductors (coil whine) and also caps (cap whine is a thing, go full screen on this and put your ear up to your display) and guess what's sitting on the output of those MOSFETs?

Good thing I don't have epilepsy.


Kazinsal
Dec 13, 2011

Paul MaudDib posted:

Igor went off on yet another wild goose chase of course, and Buildzoid threw in his own random speculation too. But basically another batch of bad EVGA cards (and a handful of others) were the cause. Presumably running thousands of FPS at full power load is a good workout of solder joints etc - high framerate causes high-frequency vibrations in inductors (coil whine) and also caps (cap whine is a thing, go full screen on this and put your ear up to your display) and guess what's sitting on the output of those MOSFETs? Plus any resonance of their own - and of course since you're doing this at full power it's as high an amplitude and as strong a physical vibration as the discretes can generate. Of course it's a manufacturing defect but it's also not just random chance that a super high framerate at max power is causing problems either, and partners just kinda didn't really give a poo poo during COVID.

Oh wow. I can get a faaaaaaaint whine when the bars are thicker, nothing when they start to thin out. Not sure how much of it is the caps in my monitor (MSI Optix AG32CQ) being good and how much of it is having blown out the uppermost range of my hearing from too many nights at an indie punk bar in college without hearing protection, but y'know. Pretty neat stuff.
