Indiana_Krom
Jun 18, 2007
Net Slacker

runaway dog posted:

I just don't understand why we need 4 pins that are smaller and in a different spot and also recessed, like surely it would've cost less to fabricate a connector with 16 equal sized pins.

Those 4 pins aren't carrying any current, so they don't need to be as big as the rest. They are just there to tell the card how much current is available (and they work in the simplest analog way imaginable).


BurritoJustice
Oct 9, 2012

Indiana_Krom posted:

Those 4 pins aren't carrying any current, so they don't need to be as big as the rest. They are just there to tell the card how much current is available (and they work in the simplest analog way imaginable).

Crucially, since the most recent revision the card won't even accept power unless the sense pins are active, and they're now recessed far enough that they don't mate until the main pins are fully mated.

repiv
Aug 13, 2009

IIRC the spec was always set up such that the sense pins not being connected would set the card to 150W mode, but they really should have recessed them from the beginning

BurritoJustice
Oct 9, 2012

repiv posted:

IIRC the spec was always set up such that the sense pins not being connected would set the card to 150W mode, but they really should have recessed them from the beginning

Yes, previously both pins open meant 150W, and now it means 0W. So technically there is an incompatibility for 150W clients, because 150W mode now needs the pins shorted. But 150W isn't going to burn anything even with a poor connection, so the recessed pins are the bigger deal.
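
For the curious, here's a minimal sketch of how a card could read those two sideband pins. The SENSE0/SENSE1 names and the wattage table follow commonly published ATX 3.x / 12V-2x6 summaries, and the function itself is purely illustrative, not anything from the actual spec or any real firmware:

```c
#include <stdbool.h>
#include <stdio.h>

/* Illustrative decoder for the 12VHPWR / 12V-2x6 sideband sense pins.
 * SENSE0/SENSE1 are either pulled to ground by the PSU cable or left open;
 * a half-seated plug looks like "open" on both. The wattage table follows
 * commonly published ATX 3.x summaries and is an assumption, not spec text. */
static int initial_power_limit_watts(bool sense0_grounded,
                                     bool sense1_grounded,
                                     bool revised_12v_2x6)
{
    if (sense0_grounded && sense1_grounded)
        return 600;
    if (!sense0_grounded && sense1_grounded)
        return 450;
    if (sense0_grounded && !sense1_grounded)
        return 300;
    /* Both open: the original 12VHPWR spec fell back to 150 W,
     * while the 12V-2x6 revision tells the card to draw nothing. */
    return revised_12v_2x6 ? 0 : 150;
}

int main(void)
{
    printf("half-seated plug, original spec: %d W\n",
           initial_power_limit_watts(false, false, false));
    printf("half-seated plug, 12V-2x6:       %d W\n",
           initial_power_limit_watts(false, false, true));
    return 0;
}
```

The practical upshot: with the revised encoding, a plug that isn't fully seated reads as "no power budget" instead of quietly falling back to 150W.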

runaway dog
Dec 11, 2005

I rarely go into the field, motherfucker.

njsykora posted:

The thing is there haven't really been any big reports of failure on 12-pin connectors since the initial 4090 release, except for these adaptors.

Yeah because everyone is paranoid as gently caress about it now and baby it like it's a fragile porcelain angel

BurritoJustice
Oct 9, 2012

runaway dog posted:

Yeah because everyone is paranoid as gently caress about it now and baby it like it's a fragile porcelain angel

I guess it's pretty notable that these adaptors were burning up at a hundred times the rate of even the initial pre-plugging-in-properly launch era rates, then.

kliras
Mar 27, 2021
it was just a perfect storm of design issues: the general electrical issues, but also how loose the plug is even when connected, with poor feedback for whether it's properly seated. to make things worse, the cards were so big that some people had to bend the cables at a suboptimal angle to fit them in their cases, and some of them were probably pressed up against the case panel too - which isn't great for tempered glass either

part of this is just overlap with the problem of cards becoming as big as they are now, where they barely fit in one of the most popular gaming cases on the market

BurritoJustice
Oct 9, 2012

If you believe cablemod's Reddit presence, the failure point for the right-angled adaptor was the PCB mount.

They could just be saying that so you don't avoid their regular, non-PCB cables, though.

Flowing Thot
Apr 1, 2023

:murder:
What if they put the connectors at the back of the card and not the side?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Would be too easy to use consumer cards in datacentres then.

BurritoJustice
Oct 9, 2012

I dunno why they stopped doing the angled connector that all the founders cards had last generation, it was way easier to use.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

BurritoJustice posted:

I dunno why they stopped doing the angled connector that all the founders cards had last generation, it was way easier to use.

That was more rotated 90 degrees than angled tbf, but agreed

repiv
Aug 13, 2009

Subjunctive posted:

Would be too easy to use consumer cards in datacentres then.

that made sense when cards were following the standard pcie form factor, but since everyone started doing this they wouldn't fit in a standard server regardless of where you put the power connector

Flowing Thot
Apr 1, 2023

:murder:

Subjunctive posted:

Would be too easy to use consumer cards in datacentres then.

Do it anyway.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Flowing Thot posted:

Do it anyway.

I would if I could!

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Flowing Thot posted:

Do it anyway.

If you are a target with deep pockets, Nvidia will come knocking because you are violating the license agreement on their GPU drivers.

It's software piracy to use a GeForce card in a rackmount server or DC of any type.

Arivia
Mar 17, 2011

Twerk from Home posted:

If you are a target with deep pockets, Nvidia will come knocking because you are violating the license agreement on their GPU drivers.

It's software piracy to use a GeForce card in a rackmount server or DC of any type.

they should go after Linus then

Turmoil
Jun 27, 2000

Forum Veteran


Young Urchin
With the connector talk.

ASUS announced their BTF line, which uses a PCIe high-power connector on the GPU with the matching slot on the motherboard. Interesting idea, but a lot more companies would have to start doing the same thing. Right now that line is just future e-waste.
https://www.asus.com/content/btf-hidden-connector-design/


I saw some stuff a few years ago talking about how that connector might be the way to go in the future as cards require more power, but that's the first MB/GPU combo to use it.

repiv
Aug 13, 2009

is that a PCI-SIG standard or is asus just making poo poo up

i recall PCI-SIG announced something like that but i'm not sure if it's the same connector

Cross-Section
Mar 18, 2009

Turmoil posted:

With the connector talk.

ASUS announced their BTF line, which uses a PCIe high-power connector on the GPU with the matching slot on the motherboard. Interesting idea, but a lot more companies would have to start doing the same thing. Right now that line is just future e-waste.
https://www.asus.com/content/btf-hidden-connector-design/


I saw some stuff a few years ago talking about how that connector might be the way to go in the future as cards require more power, but that's the first MB/GPU combo to use it.

Why would you do a motherboard of this type on LGA 1700 and not a socket that has at least one more generation (if not more) in it, like AM5 does?

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Flowing Thot posted:

Do it anyway.

I'm sorry this kind of attitude is not aligned with our mission to provide maximum value to our shareholders. I'll need you to clean out your desk and turn in your badge.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Lockback posted:

I'm sorry this kind of attitude is not aligned with our mission to provide maximum value to our shareholders. I'll need you to clean out your desk and turn in your badge.

No, we’re keeping the stuff in their desk, sorry.

BurritoJustice
Oct 9, 2012

Cross-Section posted:

Why would you do a motherboard of this type on LGA 1700 and not a socket that has at least one more generation (if not more) in it, like AM5 does?

Most people aren't going to upgrade their CPU before AM6 is out.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

BurritoJustice posted:

Most people aren't going to upgrade their CPU before AM6 is out.

AM5 would be better for me, and I think it would be more likely to get people demoing sick SFF builds and such. Most importantly, it would be better for me.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
Didn’t Apple do something similar to that ASUS extra power slot for the Mac Pro GPUs? Might have been the same connector IDK.

The CEM spec really needs an update for high power stuff. Or just move to mezzanine-type connectors on the motherboard, and then the GPU heatsink will be just like the CPU one.

repiv
Aug 13, 2009

yeah apple took it one step further by also piping video back to the motherboard through the extra pins, which let them mux the GPU's video outputs into thunderbolt ports

then they removed it from the ARM mac pro because it doesn't support GPUs at all, what a weird system

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
I eagerly await the day that an x16 PCIe CEM slot is looked at as a weird legacy curiosity, like a PCI slot is today.

SlowBloke
Aug 14, 2017

Cross-Section posted:

Why would you do a motherboard of this type on LGA 1700 and not a socket that has at least one more generation (if not more) in it, like AM5 does?

Because if it flops, they won't have to support it with extra cards. The sample card is a 4070 Ti, so they seem to be aiming for the middle-of-the-road market, not people keeping cards long term.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

SlowBloke posted:

Because if it flops, they won't have to support it with extra cards. The sample card is a 4070 Ti, so they seem to be aiming for the middle-of-the-road market, not people keeping cards long term.

They had a 4090 version at CES this year fwiw

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Windows-native upscaling is out in the canary build

https://x.com/PhantomOfEarth/status/1756334413040718140

"Automatic super resolution
Use Al to make supported games play more smoothly with enhanced details."

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

That’s gonna be a mess.

MarcusSA
Sep 23, 2007

Subjunctive posted:

That’s gonna be a mess.

This is exactly what I was thinking.

I can’t wait for the reviews lol

Anime Schoolgirl
Nov 28, 2002

i morbidly wonder if this can be stacked with DLSS and FSR like AFMF and DLSS3

Animal
Apr 8, 2003

Anime Schoolgirl posted:

i morbidly wonder if this can be stacked with DLSS and FSR like AFMF and DLSS3

Galaxy brain

dkj
Feb 18, 2009

Animal posted:

Galaxy brain

A shimmering blurry galaxy

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

dkj posted:

A shimmering blurry galaxy

One with so much latency that you can, just like looking up at distant stars, enjoy the light from frames long, long ago

Arzachel
May 12, 2012

HalloKitty posted:

One with so much latency that you can, just like looking up at distant stars, enjoy the light from frames long, long ago

"I can't tell the difference anyways"

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

HalloKitty posted:

One with so much latency that you can, just like looking up at distant stars, enjoy the light from frames long, long ago

This is a common fallacy; half of the starlight we see was actually inserted by AI after production to recreate an uninterrupted viewing experience

KillHour
Oct 28, 2007


I'm going to invent AI frame generation software that just estimates what the game would look like if you were winning and shows you that.


kliras
Mar 27, 2021
clearly the efforts should be going into an os-wide crt shader instead
