|
The 4080 Super FE box is literally as large as my entire PC
change my name fucked around with this message at 20:46 on Feb 8, 2024 |
# ? Feb 8, 2024 20:05 |
|
|
HalloKitty posted:Not to pick on you specifically, but is there a reason we give X a bunch of traffic instead of just linking to the article? Because it’s quick and easy
|
# ? Feb 8, 2024 20:13 |
|
So if I'm looking to pick up a 4080 Super when they come back in stock, what Discord or website do I need to set up alerts on?
|
# ? Feb 8, 2024 20:38 |
|
Kibner posted:It's still an issue because of all those obvious issues with GPU waterblocks. It means the waterblocks for AIOs can't take advantage of economies of scale to bring prices below the current situation. CPU waterblocks can fit a large variety of CPUs from both major brands with maybe a cheap bracket to adjust for different mounting solutions; you can't do that with GPU waterblocks. Those same GPU companies are the ones who choose to use non-reference PCB designs and already come up with a host of air coolers every generation. Again, I'm not proposing aftermarket cooler manufacturers make waterblocks for every SKU from every vendor, but rather that some of those vendors do so for their own designs at the same time they're designing the air coolers. There's no real reason they'd need to do ground-up designs every generation for every SKU; a dozen every generation is an exaggeration, since no one's going to water-cool anything below a xx70. Cards with AIOs have been done already, and they've carried premiums for it, but not excessively large ones over similarly premium binned examples.
|
# ? Feb 8, 2024 21:02 |
|
Nvidia put out one of their "hotfix" drivers, which won't be auto-pushed via GeForce Experience, to fix some micro-stutter issues:quote:This hotfix addresses the following issues: I haven't seen the problem personally, but in case you have: https://nvidia.custhelp.com/app/answers/detail/a_id/5519?=&linkId=100000240085402
|
# ? Feb 8, 2024 21:05 |
|
HalloKitty posted:Not to pick on you specifically, but is there a reason we give X a bunch of traffic instead of just linking to the article? It's a convenient, low-effort way to get a clickable thumbnail and summary.
|
# ? Feb 8, 2024 22:30 |
|
Cygni posted:Nvidia put out one of their "hotfix" drivers that wont be autopushed via Geforce Experience to fix some micro-stutter issues: oh nice, i'll have to try this. i was wondering how the same GPU that can push 100+ fps in CP2077 with everything turned on (other than path tracing) somehow has trouble with scrolling a web page smoothly
|
# ? Feb 8, 2024 22:49 |
|
is there a reason why some driver updates aren't pushed out by GFE
|
# ? Feb 8, 2024 23:25 |
|
shrike82 posted:is there a reason why some driver updates aren't pushed out by GFE It’s talked about at the link, but the “hotfixes” get an abbreviated QA cycle to fix something that can’t wait until the next overall driver release. They are mostly quick releases to address a bug that may not impact everyone, before that fix is rolled into the mainline releases, but they aren’t tested on everything.
|
# ? Feb 8, 2024 23:42 |
|
Dr. Video Games 0031 posted:I don't know if you were supposed to be given the HDR10+ Gaming option. HDR10+ is separate from HDR10 (which is the universal standard for desktop monitors), and I believe the AW3423DWF only supports basic HDR10, with none of the special standards created by Samsung. Even my new monitor that supports Dolby Vision doesn't support HDR10+. I think if you don't have a Samsung display introduced in the last couple years, then you probably don't have support for the HDR10+ Gaming mode in CP2077. My Odyssey Neo G7 and Taima's S95B should support it, and other recent Samsung monitors like the Odyssey OLED line should also support it, but there's probably not much else out there that does. Whoa! Ok, you are onto something here. I was messing around with the Game Panel on the S95B for other reasons (it's really useful since it says with certainty if you're using HDR while in a game and also if your VRR is working) and I saw this! I've never seen anything like that before, but it looks like it really is using something called "HDR10+ Gaming"! I plugged that term into Google and got some hits back... It looks like very few currently released games support it. 2077 might be the only one? Not sure. Here is an article from late August talking about a game called "The First Descendant" which was the first game confirmed to use the HDR10+ Gaming standard. And that game doesn't even have a release date yet, it just says "2024". https://www.theverge.com/2023/8/21/23839768/samsung-hdr10-plus-gaming-standard-the-first-descendant-nexon This could be the only game supporting HDR10+. Kind of neat, I have no idea what the effective difference is though between this and standard HDR. Taima fucked around with this message at 00:46 on Feb 9, 2024 |
# ? Feb 8, 2024 23:53 |
|
One of the things it is supposed to be capable of is source tone mapping. Instead of your display applying a generic tone mapping profile with no idea of what type of content it's receiving, the game should understand the capabilities of your display better, and your display should understand the content better with the metadata provided, to present a more accurate image. At least, that's the general idea as I understand it. In practice, I could not tell a difference between HDR10+ Gaming and HDR10 PQ in Cyberpunk 2077 when using it with my Neo G7. It didn't feel like the game was automatically calibrating the image to my display, and I didn't notice any difference with the tone mapping. Maybe it wasn't working correctly, I dunno.
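Just to illustrate the concept (this is not the actual HDR10+ algorithm, which uses per-scene dynamic metadata; it's a toy soft-knee curve of my own): source-side tone mapping means the game compresses highlights into the display's range itself, instead of sending full-range PQ and letting the display guess.

```python
def source_tone_map(nits: float, display_peak: float) -> float:
    """Soft-knee roll-off: near-linear for dim content, asymptotically
    approaching the display's peak for bright highlights, so nothing clips."""
    return nits / (1.0 + nits / display_peak)

# A 4000-nit mastered highlight on a ~600-nit panel gets compressed:
print(round(source_tone_map(4000, 600)))  # 522
```

The point being that the source can pick the knee knowing both the content and the panel, where the display alone only knows one of the two.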
Dr. Video Games 0031 fucked around with this message at 00:34 on Feb 9, 2024 |
# ? Feb 9, 2024 00:30 |
|
Wait, HDR10+ is bidirectional communication with the display?
|
# ? Feb 9, 2024 00:42 |
|
Subjunctive posted:Wait, HDR10+ is bidirectional communication with the display? https://www.flatpanelshd.com/news.php?subaction=showfull&id=1635416918 "During the handshaking, a display provides its panel properties to the game's HDR10+ processing block. On receiving the physical attributes from the display the game automatically performs its HDR10+ video output optimization for the given display without the user having to do it manually," explained Bonggil Bak. "This process is not only convenient but also very effectively preserves the original creative intent of the game creators." I believe it's supposed to work similarly to HGIG? There is some basic metadata in a display's EDID that the source can already read, so maybe it's just that. When I plug an HDR monitor into a windows PC for instance, it knows what its max luminance is out of the box, no extra calibration needed.
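For what the EDID path is worth: the HDR static metadata block a display exposes encodes its desired-content max luminance as a single code byte, decoded (per CTA-861.3, if I'm remembering the formula right) as 50 × 2^(CV/32) nits. A quick sketch:

```python
def edid_max_luminance(code_value: int) -> float:
    """Decode the 'desired content max luminance' byte from a display's
    CTA-861 HDR static metadata block into nits. My reading of the spec:
    L = 50 * 2^(CV/32) cd/m^2."""
    return 50.0 * 2.0 ** (code_value / 32.0)

# A code value of 96 decodes to 50 * 2^3 = 400 nits:
print(edid_max_luminance(96))  # 400.0
```

So yeah, a one-byte static value is plenty for "windows knows my max luminance out of the box", but it's a far cry from the per-scene handshake HDR10+ Gaming is describing.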
|
# ? Feb 9, 2024 00:47 |
|
sounds like the source-based tone mapping (sbtm) spec in hdmi 2.1b. hdr is always a crap shoot, but the hdr tool in windows 11 is intriguing at least kliras fucked around with this message at 00:58 on Feb 9, 2024 |
# ? Feb 9, 2024 00:52 |
|
Whoopsiedoodle. Cablemod recalls its 16-pin GPU power adapters due to fire hazard — over $74,500 in property damage claims so far quote:272 cases out of 25,300 represent just a 1% error rate, which may not sound significant. However when dealing with high wattages and temperatures, it only takes one misfortune for things to go very bad. With the potential of a fire hazard, Cablemod ultimately decided to go with a voluntary recall, which is the right thing to do.
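The quoted math checks out, for what it's worth:

```python
# Recall numbers from the article above.
reported_failures = 272
units_shipped = 25_300

rate = reported_failures / units_shipped
print(f"{rate:.2%}")  # ~1.08%, i.e. the "just a 1% error rate" in the quote
```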
|
# ? Feb 9, 2024 01:02 |
|
Hopefully their cables hold up better, because I just bought one. I really hate this dumb plug; would it have really been that difficult to just make all the pins the same size?
|
# ? Feb 9, 2024 01:21 |
|
Supposedly it’s limited to the CM 90° and 180° adapters only, not the connector design itself and not any other brand's adapters. The CM cables haven't been blamed for anything that I've seen (so far). Other than being another melting cable/connector drama, I don't think it's really related to the prior FE issue or the connector design.
|
# ? Feb 9, 2024 01:49 |
|
Man, I procrastinated my way into not having a PC fire. Go team just plug in 4 cables to the adapter.
|
# ? Feb 9, 2024 01:58 |
|
Are there any 40 series GPUs with indented power connectors like this? https://assets.rockpapershotgun.com/images/2020/10/Nvidia-GeForce-RTX-3070-power-connectors.jpg Sorry for the huge file. It's a Zotac 30 series card. I wonder how much this would help for the cards that are the size of a cinder block and nearly touch the side of the case.
|
# ? Feb 9, 2024 09:11 |
|
Former Human posted:Are there any 40 series GPUs with indented power connectors like this? I know the non-Super PNY 4070 Ti and higher had that, but they seem to have cheaped out with their coolers on the Supers, so you should double-check those
|
# ? Feb 9, 2024 13:10 |
|
Well, couldn't really decide on a spot for the gpu support, and I have 3 supports soooooo, now the gpu is the most stable component in my entire pc.
|
# ? Feb 9, 2024 15:33 |
|
now your gpu is invincible
|
# ? Feb 9, 2024 15:40 |
|
Why don't they make the whole GPU out of supports?
|
# ? Feb 9, 2024 16:13 |
|
it's gotta be sturdy enough for my Skyrim collector's edition dragon statue.
|
# ? Feb 9, 2024 16:46 |
|
Bofast posted:I think I'm just going to avoid 12VHPWR anything until they either fix the standard or find some other solution. Check if your PSU manufacturer makes their own cable. I've been using Corsair's 12VHPWR cable with my 4090 for over a year and haven't had any melty issues.
|
# ? Feb 9, 2024 20:17 |
|
Is the 6700 XT still the best ~$300 card? The 7600 XT just came out and seems to perform worse (really?) but has AI cores and more RAM. Should I get that for future-proofing, or does none of that stuff matter at all?
|
# ? Feb 10, 2024 02:06 |
|
There's no future-proofing with AMD cards. For that you'd need to be clairvoyant and only buy Nvidia's top-shelf bangers (8800 GTX, 1080 Ti, 4090) at launch, for MSRP. x600 AMD cards are crippled by cut PCIe lanes, a small 128-bit memory bus (even for the 16GB model), and small cache. No idea who would even consider them when they're already beginning to struggle at 1080p. The 6700XT will do Cyberpunk and other current stuff with rastering at 1080p, high+ settings and solid 60fps. The 7700 XT is about 1/3 faster and, last time I checked, 1/3 more expensive. Maybe you can turn on some mild raytracing. The next step up is the 4070 Super, 1/3 faster than the 7700 XT and with Nvidia's features. sauer kraut fucked around with this message at 02:51 on Feb 10, 2024 |
# ? Feb 10, 2024 02:48 |
|
CatelynIsAZombie posted:So if I'm looking to pick up a 4080super when they come back in stock what discord or website do I need to set up alerts on ? i set up alerts directly with retailers and they never went off, despite buying a card just this morning. Never did get an email, so I don't suggest relying on those. I just made checking retailers directly part of my morning routine, annoying as that was. now, it was in stock for a single-digit number of hours, but it was still in stock!
|
# ? Feb 10, 2024 03:27 |
|
sauer kraut posted:The 6700XT will do Cyberpunk and other current stuff with rastering at 1080p, high+ settings and solid 60fps. You can even turn on a lot of the raytraced effects with a 6700 XT if you run FSR2 in quality and can tolerate some dips into the fifties, which is how me and my 6700 XT are currently enjoying Cyberpunk
|
# ? Feb 10, 2024 05:18 |
|
kliras posted:might be time for these folks to rebrand Always a mystery to me why you'd want to insert another point of failure
|
# ? Feb 10, 2024 06:09 |
|
Laughing my rear end off that they spent all that time fearmongering on Reddit to get people to buy their adapters only for them to have a 10-20x higher failure rate than the "dangerous stock adapters"
|
# ? Feb 10, 2024 11:31 |
|
That's not what happened. Cantide posted:Always a mystery to me why you'd want to insert another point of failure It will also happen with other adapters and native 12VHPWR PSU cables, but there it's even less obvious what part is really at fault, and the cost of replacing cards will be spread to a larger number of manufacturers. orcane fucked around with this message at 13:32 on Feb 10, 2024 |
# ? Feb 10, 2024 13:23 |
|
WonkyBob posted:Check if your PSU manufacturer makes their own cable. I've been using Corsair's 12VHPWR cable with my 4090 for over a year and haven't had any melty issues. We’re talking about an issue arising from a 1% failure rate, so individual anecdotes for comparison will be even less compelling than usual, I think. My 180° CM adapter didn’t melt either, but I don’t think the whole thing is made up, and I’m going to destroy the thing.
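To put numbers on why single anecdotes don't help here: at a 1% per-unit failure rate, 99% of owners will truthfully report no problem, yet across the whole production run failures are essentially guaranteed. A quick sketch (the unit count is just an order-of-magnitude figure from the recall coverage):

```python
p_fail = 0.01        # the ~1% per-unit failure rate from the recall
units_sold = 25_000  # rough order of magnitude of adapters in the wild

# Any single owner's adapter is very likely fine...
p_one_ok = 1 - p_fail
print(f"{p_one_ok:.0%}")  # 99%

# ...but the chance that *nobody's* adapter failed is vanishingly small.
p_all_ok = p_one_ok ** units_sold
print(p_all_ok)  # effectively zero
```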
|
# ? Feb 10, 2024 13:28 |
|
Welp, sent in to Cablemod for my adapter recall. They wanted a picture of it "disabled", but their example video of bending the pins with a screwdriver looked insufficient to me, so I instead removed the metal cover and then snapped the PCB in half. I was also curious how robust it really was: the PCB is mostly okay, though the copper traces are a little small (although it is still the connector that fails, not the PCB). But still, considering the amount of thermal putty and the sizable aluminum block they put on it, perhaps it would have been better to just size up the actual power traces so it wouldn't heat up in the first place... Overall I think the whole 12VHPWR standard was a bad idea, it is just too small and too finicky to deal with 50 amps. For comparison, if this was wiring in your residence designed to handle 50 amps, you would use 6 AWG wire for it, which is typically multi-strand copper wire about as thick as a #2 pencil that is incredibly stiff and hard to work with. I think if they really want to get back to a single reasonable sized connector, they should just go back to the old 6 or 8 pin plugs but switch to 48 or 60 volts (keyed differently of course) so the connector only needs to handle 10-13 amps to deliver 600W. The old 6 and 8 pin connectors physical designs are good for like double the amperage of the PCIe specs they were limited to and it clearly paid off since you don't hear many cases of them melting under load, I kinda doubt 12VHPWR has that level of built in safety margin. IIRC the 8 pin connector is rated for 27 amps (324W @ 12v), but the PCIe spec limited it to 12.5 amps (150w @ 12v). Well 12.5 amps at 48v is the same 600W that the 12VHPWR design was supposed to handle.
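The voltage argument is easy to sanity-check: power = volts × amps, so the same 600 W needs a quarter of the current at 48 V. A rough sketch (the six-pin split is my assumption about how 12VHPWR divides the load across its 12 V contacts):

```python
def amps(watts: float, volts: float) -> float:
    """Current required to deliver `watts` at rail voltage `volts` (I = P / V)."""
    return watts / volts

print(amps(600, 12))      # 50.0 A total on a 12 V rail
print(amps(600, 48))      # 12.5 A at 48 V -- the old 8-pin PCIe current limit
print(amps(600, 12) / 6)  # ~8.3 A through each of six 12 V power pins
```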
|
# ? Feb 10, 2024 14:55 |
|
Any reports of tragedy with ModDIY 90° cables and/or 4090 FE’s? I gotta use a 90° adapter in order to close my SFF case. Yes, I’m one of those annoying SFF persons
|
# ? Feb 10, 2024 15:17 |
|
Indiana_Krom posted:Overall I think the whole 12VHPWR standard was a bad idea, it is just too small and too finicky to deal with 50 amps. For comparison, if this was wiring in your residence designed to handle 50 amps, you would use 6 AWG wire for it, which is typically multi-strand copper wire about as thick as a #2 pencil that is incredibly stiff and hard to work with. I think if they really want to get back to a single reasonable sized connector, they should just go back to the old 6 or 8 pin plugs but switch to 48 or 60 volts (keyed differently of course) so the connector only needs to handle 10-13 amps to deliver 600W. The old 6 and 8 pin connectors physical designs are good for like double the amperage of the PCIe specs they were limited to and it clearly paid off since you don't hear many cases of them melting under load, I kinda doubt 12VHPWR has that level of built in safety margin. IIRC the 8 pin connector is rated for 27 amps (324W @ 12v), but the PCIe spec limited it to 12.5 amps (150w @ 12v). Well 12.5 amps at 48v is the same 600W that the 12VHPWR design was supposed to handle. It's funny how my car's 22 kW Type 2 cable is thicker than my thumb, yet it carries fewer amps than that minuscule connector on a GPU.
|
# ? Feb 10, 2024 15:51 |
|
I just don't understand why we need 4 pins that are smaller, in a different spot, and also recessed; like, surely it would've cost less to fabricate a connector with 16 equal-sized pins.
|
# ? Feb 10, 2024 16:14 |
|
The thing is there haven't really been any big reports of failure on 12-pin connectors since the initial 4090 release, except for these adaptors.
|
# ? Feb 10, 2024 16:17 |
|
Do I need to disclose a 4090 purchase to my home insurance agent?
|
# ? Feb 10, 2024 16:27 |
|
|
orcane posted:That's not what happened. That's absolutely what happened lol; the failure rate of the adapter is astronomically higher than for people connecting their power directly. I don't know what your point is about 600W when that's the rating of the connector; lots of cards will draw that if you turn the power slider up, and none of them are failing at anywhere close to a 1% rate. E: seriously, I follow the big subreddits where people post their failures. I genuinely can't remember the last time I saw a burnt-up post not using the adaptor, at least since the Gamers Nexus video that made everyone neurotic about actually plugging it in, while there has been a burnt CM adaptor posted every single week since launch. BurritoJustice fucked around with this message at 16:47 on Feb 10, 2024 |
# ? Feb 10, 2024 16:44 |