change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

The 4080 Super FE box is literally as large as my entire PC

change my name fucked around with this message at 20:46 on Feb 8, 2024


MarcusSA
Sep 23, 2007

HalloKitty posted:

Not to pick on you specifically, but is there a reason we give X a bunch of traffic instead of just linking to the article?

Because it’s quick and easy

CatelynIsAZombie
Nov 16, 2006

I can't wait to bomb DO-DON-GOES!
So if I'm looking to pick up a 4080 Super when they come back in stock, what Discord or website do I need to set up alerts on?

Elem7
Apr 12, 2003
der
Dinosaur Gum

Kibner posted:

It's still an issue because of all those obvious issues with GPU waterblocks. It means that the waterblocks for the AIO are not able to take advantage of economies of scale to make prices lower than the current situation. The CPU waterblocks can fit a large variety of CPUs from both major brands with maybe a cheap bracket to adjust for different mounting solutions and such. You can't do that with the GPU waterblocks.

The GPU companies would have to come up with like a dozen different waterblock designs for each generation of cards. That ain't gonna happen.

Those same GPU companies are the ones who choose to use non-reference PCB designs, and they already come up with a host of air coolers every generation. Again, I'm not proposing aftermarket cooler manufacturers make waterblocks for every SKU from every vendor, but rather that some of those vendors do so for their own designs at the same time they're designing the air coolers. There's no real reason they'd need to do ground-up designs every generation for every SKU; a dozen every generation is an exaggeration, since no one's going to water cool anything below a xx70. Cards with AIOs have been done already, and they've carried premiums for it, but not excessively large ones over similar premium binned examples.

Cygni
Nov 12, 2005

raring to post

Nvidia put out one of their "hotfix" drivers that won't be auto-pushed via GeForce Experience, to fix some micro-stutter issues:

quote:

This hotfix addresses the following issues:

Some users may experience intermittent micro-stuttering in games when vertical sync is enabled [4445940]
Potential stutter may be observed when scrolling in web browsers on certain system configurations [4362307]
[Red Dead Redemption 2][Vulkan] Stutter observed on some Advanced Optimus notebooks [4425987]
[Immortals of Aveum] Addresses stability issues over extended gameplay [4415277]

I haven't seen the problem personally, but in case you have:

https://nvidia.custhelp.com/app/answers/detail/a_id/5519?=&linkId=100000240085402

Dr. Video Games 0031
Jul 17, 2004

HalloKitty posted:

Not to pick on you specifically, but is there a reason we give X a bunch of traffic instead of just linking to the article?

it's a convenient, low-effort way to get a clickable thumbnail and summary.

VectorSigma
Jan 20, 2004

Transform
and
Freak Out



Cygni posted:

Nvidia put out one of their "hotfix" drivers that won't be auto-pushed via GeForce Experience, to fix some micro-stutter issues:

I haven't seen the problem personally, but in case you have:

https://nvidia.custhelp.com/app/answers/detail/a_id/5519?=&linkId=100000240085402

oh nice, i'll have to try this. i was wondering how the same GPU that can push 100+ fps in CP2077 with everything turned on (other than path tracing) somehow has trouble with scrolling a web page smoothly

shrike82
Jun 11, 2005

is there a reason why some driver updates aren't pushed out by GFE

Cygni
Nov 12, 2005

raring to post

shrike82 posted:

is there a reason why some driver updates aren't pushed out by GFE

It’s talked about at the link, but the “hotfixes” get an abbreviated QA cycle to fix something that can’t wait until the next overall driver release. They are mostly quick releases to address a bug that may not impact everyone, before that fix is rolled into the mainline releases, but they aren’t tested on everything.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Dr. Video Games 0031 posted:

I don't know if you were supposed to be given the HDR10+ Gaming option. HDR10+ is separate from HDR10 (which is the universal standard for desktop monitors), and I believe the AW3423DWF only supports basic HDR10, with none of the special standards created by Samsung. Even my new monitor that supports Dolby Vision doesn't support HDR10+. I think if you don't have a Samsung display introduced in the last couple years, then you probably don't have support for the HDR10+ Gaming mode in CP2077. My Odyssey Neo G7 and Taima's S95B should support it, and other recent Samsung monitors like the Odyssey OLED line should also support it, but there's probably not much else out there that does.

Whoa! Ok, you are onto something here. I was messing around with the Game Panel on the S95B for other reasons (it's really useful since it says with certainty if you're using HDR while in a game and also if your VRR is working) and I saw this!



I've never seen anything like that before, but it looks like it really is using something called "HDR10+ Gaming"! I plugged that term into Google and got some hits back...

It looks like very few currently released games support it. 2077 might be the only one? Not sure. Here is an article from late August talking about a game called "The First Descendant" which was the first game confirmed to use the HDR10+ Gaming standard. And that game doesn't even have a release date yet, it just says "2024".

https://www.theverge.com/2023/8/21/23839768/samsung-hdr10-plus-gaming-standard-the-first-descendant-nexon

This could be the only game supporting HDR10+. Kind of neat, I have no idea what the effective difference is though between this and standard HDR.

Taima fucked around with this message at 00:46 on Feb 9, 2024

Dr. Video Games 0031
Jul 17, 2004

One of the things it is supposed to be capable of is source tone mapping. Instead of your display applying a generic tone mapping profile with no idea of what type of content it's receiving, the game should understand the capabilities of your display better, and your display should understand the content better with the metadata provided, to present a more accurate image. At least, that's the general idea as I understand it. In practice, I could not tell a difference between HDR10+ Gaming and HDR10 PQ in Cyberpunk 2077 when using it with my Neo G7. It didn't feel like the game was automatically calibrating the image to my display, and I didn't notice any difference with the tone mapping. Maybe it wasn't working correctly, I dunno.

Dr. Video Games 0031 fucked around with this message at 00:34 on Feb 9, 2024

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Wait, HDR10+ is bidirectional communication with the display?

Dr. Video Games 0031
Jul 17, 2004

Subjunctive posted:

Wait, HDR10+ is bidirectional communication with the display?

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1635416918

"During the handshaking, a display provides its panel properties to the game's HDR10+ processing block. On receiving the physical attributes from the display the game automatically performs its HDR10+ video output optimization for the given display without the user having to do it manually," explained Bonggil Bak. "This process is not only convenient but also very effectively preserves the original creative intent of the game creators."

I believe it's supposed to work similarly to HGIG? There is some basic metadata in a display's EDID that the source can already read, so maybe it's just that. When I plug an HDR monitor into a Windows PC, for instance, it knows what its max luminance is out of the box, no extra calibration needed.
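For what it's worth, the EDID metadata mentioned above includes an HDR Static Metadata block (from CTA-861.3) where max luminance is stored as a single coded byte. A minimal sketch of the decode, with a made-up example byte value for illustration:

```python
# CTA-861.3 HDR Static Metadata: "Desired Content Max Luminance" is
# stored as one coded byte CV (0-255), decoded as 50 * 2**(CV/32) cd/m2.
# The byte value 96 below is a hypothetical example, not from any real display.
def max_luminance_nits(cv: int) -> float:
    return 50 * 2 ** (cv / 32)

print(max_luminance_nits(0))          # floor of the scale: 50.0 nits
print(round(max_luminance_nits(96)))  # 50 * 2**3 = 400 nits
```

This is the kind of value a source device can read without any manual calibration, which matches the "it just knows" behavior described.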

kliras
Mar 27, 2021
sounds like source-based tone mapping (sbtm) spec in hdmi 2.1b

hdr is always a crap shoot, but the hdr tool in windows 11 is intriguing at least

kliras fucked around with this message at 00:58 on Feb 9, 2024

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



Whoopsiedoodle.

Cablemod recalls its 16-pin GPU power adapters due to fire hazard — over $74,500 in property damage claims so far

quote:

272 cases out of 25,300 represent just a 1% error rate, which may not sound significant. However, when dealing with high wattages and temperatures, it only takes one misfortune for things to go very bad. With the potential of a fire hazard, Cablemod ultimately decided to go with a voluntary recall, which is the right thing to do.
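The quoted figure checks out as roughly 1 in 100 (numbers are the ones from the article above):

```python
# Recall numbers as quoted in the Tom's Hardware piece.
shipped = 25_300   # adapters covered by the recall
failures = 272     # property damage / failure claims so far

rate = failures / shipped
print(f"{rate:.2%}")  # → 1.08%
```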

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
Hopefully their cables hold up better, because I just bought one. I really hate this dumb plug; would it have really been that difficult to just make all the pins the same size?

Cygni
Nov 12, 2005

raring to post

Supposedly it's limited to only the CM 90 and 180 adapters, not the connector design itself and not any other brands' adapters. The CM cables haven't been blamed for anything that I've seen (so far).

Other than being another melting cable/connector drama, I don’t think it’s really related to the prior FE issue or the connector design.

Bondematt
Jan 26, 2007

Not too stupid
Man, I procrastinated my way into not having a PC fire.

Go team just plug in 4 cables to the adapter.

Former Human
Oct 15, 2001

Are there any 40 series GPUs with indented power connectors like this?

https://assets.rockpapershotgun.com/images/2020/10/Nvidia-GeForce-RTX-3070-power-connectors.jpg

Sorry for the huge file. It's a Zotac 30 series card. I wonder how much this would help for the cards that are the size of a cinder block and nearly touch the side of the case.

pyrotek
May 21, 2004



Former Human posted:

Are there any 40 series GPUs with indented power connectors like this?

https://assets.rockpapershotgun.com/images/2020/10/Nvidia-GeForce-RTX-3070-power-connectors.jpg

Sorry for the huge file. It's a Zotac 30 series card. I wonder how much this would help for the cards that are the size of a cinder block and nearly touch the side of the case.

I know the non-Super PNY 4070 Ti and higher had that, but they seem to have cheaped out on the coolers for the Supers, so you should double-check those.

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
Well, couldn't really decide on a spot for the gpu support, and I have 3 supports soooooo, now the gpu is the most stable component in my entire pc.

Anime Schoolgirl
Nov 28, 2002

now your gpu is invincible

Kramjacks
Jul 5, 2007

Why don't they make the whole GPU out of supports?

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
it's gotta be sturdy enough for my Skyrim collector's edition dragon statue.

WonkyBob
Jan 1, 2013

Holy shit, you own a skirt?!

Bofast posted:

I think I'm just going to avoid 12VHPWR anything until they either fix the standard or find some other solution.

Check if your PSU manufacturer makes their own cable. I've been using Corsair's 12VHPWR cable with my 4090 for over a year and haven't had any melty issues.

RodShaft
Jul 31, 2003
Like an evil horny Santa Claus.


Is the 6700 XT still the best ~$300 card? The 7600 XT just came out and seems to perform worse (really?) but has AI cores and more RAM. Should I get that for future proofing, or does none of that stuff matter at all?

sauer kraut
Oct 2, 2004
There's no future proofing with AMD cards. For that you'd need to be clairvoyant and only buy Nvidia's top shelf bangers (8800GTX, 1080ti, 4090) at launch, for MSRP.
x600 AMD cards are cripples with cut PCIe lanes, a small 128-bit memory bus (even for the 16GB model) and small cache. No idea who would even consider them when they're already beginning to struggle at 1080p.
The 6700XT will do Cyberpunk and other current stuff with rasterization at 1080p, high+ settings and a solid 60fps.
The 7700XT is about 1/3 faster and, last time I checked, 1/3 more expensive. Maybe you can turn on some mild raytracing :toot:
The next step up is the 4070 Super, 1/3 faster than the 7700XT and with Nvidia's features.
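If those rough ratios compound, the relative performance works out something like this (the 1/3 figures are the ballpark ones quoted above, not benchmark data):

```python
# Rough relative-performance ladder from the "1/3 faster" figures above.
base = 1.0                   # RX 6700 XT as the baseline
rx7700xt = base * 4 / 3      # "about 1/3 faster"
rtx4070s = rx7700xt * 4 / 3  # "1/3 faster than the 7700XT"

print(round(rx7700xt, 2), round(rtx4070s, 2))  # → 1.33 1.78
```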

sauer kraut fucked around with this message at 02:51 on Feb 10, 2024

Psion
Dec 13, 2002

eVeN I KnOw wHaT CoRnEr gAs iS

CatelynIsAZombie posted:

So if I'm looking to pick up a 4080 Super when they come back in stock, what Discord or website do I need to set up alerts on?

I set up alerts directly with retailers and they never went off, despite my buying a card just this morning. I never did get an email, so I don't suggest relying on those. I just made checking retailers directly part of my morning routine, annoying as that was.

now, it was in stock for a single-digit number of hours, but it was still in stock!

DoombatINC
Apr 20, 2003

Here's the thing, I'm a feminist.





sauer kraut posted:

The 6700XT will do Cyberpunk and other current stuff with rastering at 1080p, high+ settings and solid 60fps.

You can even turn on a lot of the raytraced effects with a 6700 XT if you run FSR2 in quality and can tolerate some dips into the fifties, which is how me and my 6700 XT are currently enjoying Cyberpunk :goleft:

Cantide
Jun 13, 2001
Pillbug

Always a mystery to me why you'd want to insert another point of failure

Shipon
Nov 7, 2005
Laughing my rear end off that they spent all that time fearmongering on Reddit to get people to buy their adapters only for them to have a 10-20x higher failure rate than the "dangerous stock adapters"

orcane
Jun 13, 2012

Fun Shoe
That's not what happened.

Cantide posted:

Always a mystery to me why you'd want to insert another point of failure
The actual point of failure is the connector on the card, but if you attach a CM adapter and the garbage 12VHPWR gets too hot, some people will still blame the adapter instead of Asus for trying to push 600 W through a badly designed flip connector (apparently in Europe the adapter returns are 99% Asus cards and a handful of MSI Suprim 4090s). It's completely understandable if CM (or Asus?) doesn't want to have to keep replacing $2500 4090 Strix cards even if technically their adapter is fine for what it's supposed to do.

It will also happen with other adapters and native 12VHPWR PSU-cables, but there it's even less obvious what part is really at fault and the cost of replacing cards will be spread to a larger number of manufacturers.

orcane fucked around with this message at 13:32 on Feb 10, 2024

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

WonkyBob posted:

Check if your PSU manufacturer makes their own cable. I've been using Corsair's 12VHPWR cable with my 4090 for over a year and haven't had any melty issues.

We’re talking about an issue arising from a 1% failure rate, so individual anecdotes for comparison will be even less compelling than usual, I think. My 180 CM adapter didn’t melt either, but I don’t think the whole thing is made up and I’m going to destroy the thing.

Indiana_Krom
Jun 18, 2007
Net Slacker
Welp, sent my adapter in to CableMod for the recall. They wanted a picture of it "disabled", but their example video of bending the pins with a screwdriver looked insufficient to me, so I instead removed the metal cover and then snapped the PCB in half. I was also curious how robust it really was: the PCB is mostly okay, though the copper traces are a little small (although it is still the connector that fails, not the PCB). Still, considering the amount of thermal putty and the sizable aluminum block they put on it, perhaps it would have been better to just size up the actual power traces so it wouldn't heat up in the first place...

Overall I think the whole 12VHPWR standard was a bad idea; it is just too small and too finicky to deal with 50 amps. For comparison, if this were wiring in your residence designed to handle 50 amps, you would use 6 AWG wire, which is typically multi-strand copper about as thick as a #2 pencil and incredibly stiff and hard to work with. If they really want to get back to a single reasonably sized connector, they should just go back to the old 6 or 8 pin plugs but switch to 48 or 60 volts (keyed differently, of course) so the connector only needs to handle 10-13 amps to deliver 600W. The old 6 and 8 pin connectors' physical designs are good for roughly double the amperage of the PCIe specs they were limited to, and it clearly paid off, since you don't hear of many cases of them melting under load; I doubt 12VHPWR has that level of built-in safety margin. IIRC the 8 pin connector is rated for 27 amps (324W @ 12V), but the PCIe spec limited it to 12.5 amps (150W @ 12V). And 12.5 amps at 48V is the same 600W that the 12VHPWR design was supposed to handle.
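The arithmetic above can be sanity-checked in a few lines (the wattage and voltage figures are the ones quoted in the post, not official spec values):

```python
# Back-of-envelope check on the connector current figures quoted above.
def amps(watts, volts):
    return watts / volts

print(amps(600, 12))  # 12VHPWR at its full 600 W rating → 50.0 A
print(amps(150, 12))  # classic 8-pin at its PCIe 150 W limit → 12.5 A
print(27 * 12)        # 8-pin physical rating, 27 A at 12 V → 324 W
print(amps(600, 48))  # the same 600 W delivered at 48 V → 12.5 A
```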

Animal
Apr 8, 2003

Any reports of tragedy with ModDIY 90° cables and/or 4090 FE’s? I gotta use a 90° adapter in order to close my SFF case.

Yes, I’m one of those annoying SFF persons

SlowBloke
Aug 14, 2017

Indiana_Krom posted:

Overall I think the whole 12VHPWR standard was a bad idea, it is just too small and too finicky to deal with 50 amps. For comparison, if this was wiring in your residence designed to handle 50 amps, you would use 6 AWG wire for it, which is typically multi-strand copper wire about as thick as a #2 pencil that is incredibly stiff and hard to work with. I think if they really want to get back to a single reasonable sized connector, they should just go back to the old 6 or 8 pin plugs but switch to 48 or 60 volts (keyed differently of course) so the connector only needs to handle 10-13 amps to deliver 600W. The old 6 and 8 pin connectors physical designs are good for like double the amperage of the PCIe specs they were limited to and it clearly paid off since you don't hear many cases of them melting under load, I kinda doubt 12VHPWR has that level of built in safety margin. IIRC the 8 pin connector is rated for 27 amps (324W @ 12v), but the PCIe spec limited it to 12.5 amps (150w @ 12v). Well 12.5 amps at 48v is the same 600W that the 12VHPWR design was supposed to handle.

It's funny how my car's 22 kW Type 2 charging cable is thicker than my thumb, yet it passes fewer amps than that minuscule connector on a GPU.

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.
I just don't understand why we need 4 pins that are smaller, in a different spot, and also recessed; surely it would've cost less to fabricate a connector with 16 equal-sized pins.

njsykora
Jan 23, 2012

Robots confuse squirrels.


The thing is, there haven't really been any big reports of failure on 12-pin connectors since the initial 4090 release, except for these adapters.

Hughmoris
Apr 21, 2007
Let's go to the abyss!
Do I need to disclose a 4090 purchase to my home insurance agent?


BurritoJustice
Oct 9, 2012

orcane posted:

That's not what happened.

The actual point of failure is the connector on the card, but if you attach a CM adapter and the garbage 12VHPWR gets too hot, some people will still blame the adapter instead of Asus for trying to push 600 W through a badly designed flip connector (apparently in Europe the adapter returns are 99% Asus cards and a handful of MSI Suprim 4090s). It's completely understandable if CM (or Asus?) doesn't want to have to keep replacing $2500 4090 Strix cards even if technically their adapter is fine for what it's supposed to do.

It will also happen with other adapters and native 12VHPWR PSU-cables, but there it's even less obvious what part is really at fault and the cost of replacing cards will be spread to a larger number of manufacturers.

That's absolutely what happened lol, the failure rate of the adapter is astronomically higher than that of people connecting their power directly.

I don't know what your point is about 600W when that's the rating of the connector; lots of cards will draw that if you turn the power slider up, and none of them are failing at anywhere close to a 1% rate.

E: seriously, I follow the big subreddits where people post their failures. I genuinely can't remember the last time I saw a burnt-up post not using the adapter, at least since the Gamers Nexus video that made everyone neurotic about actually plugging it in, while there has been a burnt CM adapter posted every single week since launch.

BurritoJustice fucked around with this message at 16:47 on Feb 10, 2024
