Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rastor posted:

Doesn't seem hard, just have a power connector on the motherboard and then some fat traces to the PCIe connector.

Remember that microwaves, air conditioners, etc. often exceed 1,000 watts.

Yeah, but they do it through 10-gauge cables, not a PCIe connector with pins half the size of a toothpick.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

wargames posted:

so no more 6 pins or 8 pins yay!

Yeah, all I read from that is "300W from the slot now? Okay, let's make a 450W card."

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

BIG HEADLINE posted:

Yeah, all I read from that is "300W from the slot now? Okay, let's make a 450W card."

I for one welcome our new 3.5 slot cooler overlords.

shrike82
Jun 11, 2005

I ran into a major issue upgrading to the Windows 10 Anniversary Update and the latest Nvidia drivers for my 1080: during the driver installation, the screen just glitched out to random artifacts. Rebooting didn't work, so I had to restart in safe mode and roll back to the pre-Anniversary build and the previous set of drivers.

DDU didn't help and Google doesn't seem to show that this is a common issue.

Kinda scratching my head on this

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Does it also mean that PCIe 4.0 won't be backwards compatible with PCIe 3.0 cards and vice versa?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

BIG HEADLINE posted:

Yeah, all I read from that is "300W from the slot now? Okay, let's make a 450W card."

Just in time for a node where I'm not actually sure they can make an affordable consumer card make meaningful use of 300W.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

xthetenth posted:

Just in time for a node where I'm not actually sure they can make an affordable consumer card make meaningful use of 300W.

O ye of little faith in AMD. You think they put all that effort into the Fury X's liquid cooler not to use it again?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Twerk from Home posted:

O ye of little faith in AMD. You think they put all that effort into the Fury X's liquid cooler not to use it again?

I loving hope they're going to use it again.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
I can't imagine what kind of monstrosity you'd need to come up with to make a 450W card off a 14nm GPU. You'd basically need two Titan XPs on one chip, as that card is only 250W on its own.

SlayVus
Jul 10, 2009
Grimey Drawer

Beautiful Ninja posted:

I can't imagine what kind of monstrosity you'd need to come up with to make a 450W card off a 14nm GPU. You'd basically need two Titan XPs on one chip, as that card is only 250W on its own.

Titan Black X2: $3,000, 7,168 CUDA cores. Go quad-SLI for your triple 4K setups.

Shumagorath
Jun 6, 2001
Is there a passively-cooled (and ideally slot powered) Nvidia GPU on the market that's worthwhile as a PhysX card or is that driver option purely theoretical? The new Deus Ex has cloth physics and maybe it could boost Hairworks?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Shumagorath posted:

Is there a passively-cooled (and ideally slot powered) Nvidia GPU on the market that's worthwhile as a PhysX card or is that driver option purely theoretical? The new Deus Ex has cloth physics and maybe it could boost Hairworks?

Whole lot of nope there. My understanding is that if you use a PhysX GPU that's much slower than your main GPU, it just ends up bottlenecking the whole thing anyway. Anything passively cooled would introduce a huge bottleneck in your system; the fastest option seems to be a GT 730, which is basically a card that exists for people who don't have functional integrated graphics.

ColHannibal
Sep 17, 2007

SwissArmyDruid posted:

PCIe 4.0 (v.7 of the spec) to have 16 GT/s and 300W from the slot. :toot:

http://www.tomshardware.com/news/pcie-4.0-power-speed-express,32525.html

edit: forum trimmed the url.

And custom PSU cable makers just let out a collective scream.

Avalanche
Feb 2, 2007
Can someone explain what's going on with overclocking 1070 and 1080 cards in terms of Nvidia somehow limiting voltage OC?

I OC'd my Gigabyte G1 1070 but the performance gains were pretty marginal. There doesn't really seem to be a point to OC the card at the moment. I looked into it a little bit, and apparently Nvidia limits how much you can overvolt the card which ends up limiting how much you can bump up the clock speed. Even with the card being maxed out before stability became an issue, I was getting around 65-73C temps which was still very much in the safe zone.

Any reason why this is happening? I'm guessing it has something to do with stopping people from just buying 1070 cards and OCing them to 1080 specs so no one feels a need to pay a grand to buy a loving 1080. Is there any way around Nvidia's hard limit on overvolting the card?

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Avalanche posted:

Can someone explain what's going on with overclocking 1070 and 1080 cards in terms of Nvidia somehow limiting voltage OC?

I OC'd my Gigabyte G1 1070 but the performance gains were pretty marginal. There doesn't really seem to be a point to OC the card at the moment. I looked into it a little bit, and apparently Nvidia limits how much you can overvolt the card which ends up limiting how much you can bump up the clock speed. Even with the card being maxed out before stability became an issue, I was getting around 65-73C temps which was still very much in the safe zone.

Any reason why this is happening? I'm guessing it has something to do with stopping people from just buying 1070 cards and OCing them to 1080 specs so no one feels a need to pay a grand to buy a loving 1080. Is there any way around Nvidia's hard limit on overvolting the card?

No one knows why Nvidia limited it like that, though I really doubt it's to keep people from OCing the 1070 up to 1080 level; I very much doubt that's anywhere close to possible, and even if it were, there'd be no reason you couldn't OC the 1080 that much further too, since the fundamental difference in CUDA cores stays the same. And no, there is no way around it right now since the BIOS/firmware is signed.

penus penus penus
Nov 9, 2014

by piss__donald
Nvidia has always limited the voltage; everybody does. It's just that this time they're shipping closer to that limit than before. Part of me thinks it's just because they can (far fewer wattage concerns on 16nm). The short of it is that all cards seem to consistently hit around the same limits; consider that this is a good thing with some immediately annoying consequences. We've built expectations around overclocking since it shifted from a no-no to a marketable feature, but we shouldn't forget that ideally we wouldn't have to overclock at all to get the full use of the chip. A boring future for sure, and one that AIBs won't be super happy about, but we shouldn't really be upset about the direction we're heading. At the end of the day it's better for a product to basically "OC" itself.

Now to the darker side of things: most voltage limits are set to prevent damage. Going beyond those limits is what overclocking truly is, and just like before, no company is going to support that officially. Once someone breaks the hard lock on voltage control we'll see what these cards can do, but that's obviously uncharted territory, and not one I'd jump into immediately, at least not until you see whether others blow up their cards.

And the really cynical take: the limit in this case may be set intentionally low to leave room for an 11-series refresh. But that's speculation at best. Once someone figures out how to circumvent the hard limit, we'll know.

Billa
Jul 12, 2005

The Emperor protects.

Smarf posted:

What driver are you using on the monitor? At this stage I'm guessing it's more of a Crimson driver issue than anything.


You can, I just want to make sure it's not a monitor issue whilst I'm still able to get a full refund. Also considering that the latest drivers were meant to fix it but don't seem to work for either of us.

I'm using the beta driver on the monitor (the one that adds 'freesync' after the name). And yes, it is a driver issue more than a thing with the monitor.

Smarf
Mar 21, 2004

Arsehole!!!!

Billa posted:

I'm using the beta driver on the monitor (the one that adds 'freesync' after the name). And yes, it is a driver issue more than a thing with the monitor.

Could you link me to that specific driver? The one I'm using from their website shows my monitor with a G instead of the PF in the model name, which seems really odd.

SwissArmyDruid
Feb 14, 2014

by sebmojo

BIG HEADLINE posted:

Yeah, all I read from that is "300W from the slot now? Okay, let's make a 450W card."

Real talk though, you could power a Titan XP off that, and power consumption is only ever going to go down. Hopefully GloFo doesn't get stuck on 14nm like they did with 28nm. (AHAHAHAHAHA yes they are, just watch them, they said they were going to skip 10nm entirely and go straight to 7nm. :saddowns:)

Moore's Law (or some similar analogue/derivative thereof) at work, and all that. I don't think Titans have ever used more than a single 8-pin and a 6-pin, so.

ColHannibal posted:

And custom PSU cable makers just let out a collective scream.

Well, at least they can't ever take the ATX and EPS cables away from guys like Cablemod, so I think they'll be okay.

SwissArmyDruid fucked around with this message at 07:58 on Aug 22, 2016

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
How can it be backwards compatible, though? I'm guessing all cards will still come with PCIe power connectors, and check in the early stages of boot if they're hooked up to PCIe 4, then go nuts and pull all the power through the slot (which sounds dodgy to me, those little fingers in the slot were never enough before, and now they're suddenly fine for ludicrous power? Seems odd).

Billa
Jul 12, 2005

The Emperor protects.

Smarf posted:

Could you link me to that specific driver? The one I'm using from their website shows my monitor with a G instead of the PF in the model name, which seems really odd.

https://www.youtube.com/watch?v=9GrYpQeTNks

(the driver link is in the video's description, under 'show more')

Smarf
Mar 21, 2004

Arsehole!!!!

Billa posted:

https://www.youtube.com/watch?v=9GrYpQeTNks

(the driver link is in the video's description, under 'show more')

Perfect, thank you. I'm going to assume our monitors are fine and it'll be fixed with a driver update in the future.

New Zealand can eat me
Aug 29, 2008

:matters:


BIG HEADLINE posted:

Yeah, all I read from that is "300W from the slot now? Okay, let's make a 450W card."

You can tell nobody read the article because they actually say the real number might end up being 500 or higher

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

HalloKitty posted:

How can it be backwards compatible, though? I'm guessing all cards will still come with PCIe power connectors, and check in the early stages of boot if they're hooked up to PCIe 4, then go nuts and pull all the power through the slot (which sounds dodgy to me, those little fingers in the slot were never enough before, and now they're suddenly fine for ludicrous power? Seems odd).

They can change the contacts, make them thicker, use lower resistance materials, etc.
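
Back-of-the-envelope on why those slot fingers worry people, assuming the extra power still comes in at 12V (rough numbers from memory, not from the actual spec):

code:
# P = V * I  ->  I = P / V
# Today's x16 slot budgets roughly 5.5A on the 12V rail (~66W), with the
# rest of the 75W on 3.3V - from memory, so treat the baseline as approximate.
SLOT_VOLTAGE = 12.0

for watts in (66, 300, 500):
    amps = watts / SLOT_VOLTAGE
    print(f"{watts}W at 12V ~= {amps:.1f}A through the edge-connector pins")

# 66W -> 5.5A, 300W -> 25.0A, 500W -> ~41.7A: several times today's current
# through the same style of connector unless the contacts actually change.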

SwissArmyDruid
Feb 14, 2014

by sebmojo

HalloKitty posted:

How can it be backwards compatible, though? I'm guessing all cards will still come with PCIe power connectors, and check in the early stages of boot if they're hooked up to PCIe 4, then go nuts and pull all the power through the slot (which sounds dodgy to me, those little fingers in the slot were never enough before, and now they're suddenly fine for ludicrous power? Seems odd).

One of two ways:

* A longer PCIe slot, and stick the extra power pins at the end. Mechanically, at present, there's nothing stopping you from plugging an x1 or x4 card into a full-length slot. Similarly, any currently-existing PCIe cards would not be long enough to reach those extra power pins.
* They do some negotiation between the card and the PCIe controller before turning on the full juice. Kind of like a smartphone using Quick Charge, but for PCIe. The actual x1-x16 connectors still are not (and have never been) a hotplug interface, after all.
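
Purely to illustrate that second option, a minimal sketch of what a "negotiate before you draw" handshake could look like. Names and numbers are made up for the example; the real PCIe 4.0 mechanism hasn't been published:

code:
# Hypothetical power negotiation between a card and the slot it sits in.
LEGACY_SLOT_LIMIT_W = 75     # what today's x16 slot guarantees
PCIE4_SLOT_LIMIT_W  = 300    # what the rumored 4.0 slot could offer

def negotiate_slot_power(card_request_w, host_capability_w=None):
    """Return how much power the card may pull from the slot.

    If the host never answers (an older board that doesn't speak the new
    protocol), the card falls back to the legacy 75W budget and takes the
    rest from its 6/8-pin connectors as usual.
    """
    if host_capability_w is None:
        return LEGACY_SLOT_LIMIT_W                 # old board: play it safe
    return min(card_request_w, host_capability_w)  # agree on the lower figure

print(negotiate_slot_power(250))                       # new card, old board -> 75
print(negotiate_slot_power(250, PCIE4_SLOT_LIMIT_W))   # new card, new board -> 250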

I LIKE TO SMOKE WEE posted:

You can tell nobody read the article because they actually say the real number might end up being 500 or higher

No, you know that nobody read the article because nobody mentioned the FREAKING STANDARDIZED EXTERNAL PCIE CABLE. Which I'm not exactly happy about because it just takes all the work that Intel and AMD and Razer put into external docks and throws it all back up into the air with uncertainty all over again.

SwissArmyDruid fucked around with this message at 09:08 on Aug 22, 2016

ColHannibal
Sep 17, 2007

SwissArmyDruid posted:

One of two ways:

* A longer PCIe slot, and stick the extra power pins at the end. Mechanically, at present, there's nothing stopping you from plugging an x1 or x4 card into a full-length slot. Similarly, any currently-existing PCIe cards would not be long enough to reach those extra power pins.
* They do some negotiation between the card and the PCIe controller before turning on the full juice. Kind of like a smartphone using Quick Charge, but for PCIe. The actual x1-x16 connectors still are not (and have never been) a hotplug interface, after all.

As someone who works at a company that designs PCBAs: when you get plastic connectors that big, the failure rate goes through the roof.

Daviclond
May 20, 2006

Bad post sighted! Firing.
Does this mean in the future we'll be buying motherboards based on their PCIe VRM quality and number of power phases in order to achieve good GPU overclocks?

EoRaptor
Sep 13, 2003

by Fluffdaddy
You guys are crazy. The 400+ watt limit for PCIe will never make it down to consumer motherboards. Adding that much extra power circuitry to every Walmart-special PC just on the off chance someone will buy a top-end video card for it is a complete non-starter. The target market is servers, which need to run multiple high-end compute cards* inside a rack case, where space and cooling constraints make additional per-card cabling very problematic. Consumer PCs will get 4.0 signalling someday, but it'll be a variant of the spec with lower power requirements for the slot.

* this also applies to desktop virtualization setups that use the same cards in server designs.

That external connector also seems destined for problems. It looks like they've stuck the optical transceiver inside the plug on the cable, which will make the cable very expensive and very fragile. Not being durable enough for daily cycling means it's dead in the consumer space, and it faces very stiff competition from other established interfaces in the server space.

HMS Boromir
Jul 16, 2011

by Lowtax
I really hope they don't effectively obsolete PCIe 3.0 for anything stronger than GTX *50 / RX *60 level cards. I'm looking to keep my current CPU+mobo combo long enough that I might actually run afoul of the new standard if they stop making cards with 6-pins, and while I'm a noted low-resolution idiot, I'd at least like the option to move up once I have the inclination and finances.

HalloKitty posted:

How can it be backwards compatible, though? I'm guessing all cards will still come with PCIe power connectors, and check in the early stages of boot if they're hooked up to PCIe 4, then go nuts and pull all the power through the slot

This seems like the best of both worlds, if it's feasible. Last time we got any info about PCIe 4.0 there was mention of backwards compatibility and it'd be weird if they just meant "as long as it doesn't draw more than 75W". This time, though, there seems to be no mention of it whatsoever.

HMS Boromir fucked around with this message at 09:45 on Aug 22, 2016

SwissArmyDruid
Feb 14, 2014

by sebmojo


Hey cool, up 50 cents on the day after the Zen demo. Things are looking u-

GlobalFoundries Will Allegedly Skip 10nm and Jump to Developing 7nm Process Technology In House

-p. RIP AMD. Got their poo poo sorted out, but was then strangled to death by the albatross of GloFo dragging them back to the bad old days of 20nm.

Anime Schoolgirl
Nov 28, 2002

isn't that precisely the reason why amd contracted samsung directly for fabrication recently

SwissArmyDruid
Feb 14, 2014

by sebmojo
Doesn't change the fact that AMD has their balls stapled to GloFo for a minimum number of wafers/year. Look at how well that arrangement is working out now: AMD can't get good yields on 480s, and GloFo keeps getting blamed by the talking heads and people like us.

It hurts AMD trying to compete, and it hurts GloFo when trying to get new sales.

SwissArmyDruid fucked around with this message at 11:57 on Aug 22, 2016

Setset
Apr 14, 2012
Grimey Drawer

SwissArmyDruid posted:

Doesn't change the fact that AMD has their balls stapled to GloFo for a minimum number of wafers/year. Look at how well that arrangement is working out now: AMD can't get good yields on 480s, and GloFo keeps getting blamed by the talking heads and people like us.

It hurts AMD trying to compete, and it hurts GloFo when trying to get new sales.

I just see it as GloFo not having the resources to do both 10nm and 7nm. At least by focusing only on 7nm they can get a bit of a head start. Long term, if they're going to be successful, they're going to have to live on the bleeding edge of tech at least for a little while, even if it means sacrificing a couple of years of 10nm to do it.

SlayVus
Jul 10, 2009
Grimey Drawer

Ninkobei posted:

I just see it as GloFo not having the resources to do both 10nm and 7nm. At least by focusing only on 7nm they can get a bit of a head start. Long term, if they're going to be successful, they're going to have to live on the bleeding edge of tech at least for a little while, even if it means sacrificing a couple of years of 10nm to do it.

Yeah, but skipping nodes hasn't worked out so well for AMD in the past. They skipped 20nm to wait for 14nm, and going from 28nm to 14nm took four years. When Nvidia waited for 16nm (also four years), they at least released well-performing 28nm cards in the meantime. Skipping 10nm to go straight to 7nm means they could be waiting another 2-3 years before they have a working GPU on the new process.

Also, Nvidia maximized their 28nm process by basically increasing die size with each generation. AMD took ~2 years to go from a max die size of 352mm^2 on the HD 7970 to 438mm^2 on the R9 290X.

SlayVus fucked around with this message at 14:24 on Aug 22, 2016

Phlegmish
Jul 2, 2011



EdEddnEddy posted:

That's close to the setup I used to play Crysis the first time: tri-SLI 8800 GTXs on an overclocked E6600. It was playable at ultra until you got to the final boss, where it sort of memory-leaked all over its face.

Even upgraded to a Q9550, that system couldn't quite run it as well as the one I built for myself, which also had a Q9550, plus a 4870X2. The X2 was able to play it all the way through with ease, always above 30 FPS, lol. It was interesting to see ATI graphics really slap the Nvidia stuff around in Crysis back in the day.

I almost want to give Crysis a whirl with my overclocked i7-6700k + GTX 1070 set-up to see what happens, but I don't want to financially reward them for releasing a horribly unoptimized game.

Truga
May 4, 2014
Lipstick Apathy
Buy it on humble bundle, set 100% to charity :v:

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.

Anime Schoolgirl posted:

isn't that precisely the reason why amd contracted samsung directly for fabrication recently

Huh? Are you sure you aren't thinking of Nvidia's pascal shrink?

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

EdEddnEddy posted:

That's close to the setup I used to play Crysis the first time: tri-SLI 8800 GTXs on an overclocked E6600. It was playable at ultra until you got to the final boss, where it sort of memory-leaked all over its face.

Even upgraded to a Q9550, that system couldn't quite run it as well as the one I built for myself, which also had a Q9550, plus a 4870X2. The X2 was able to play it all the way through with ease, always above 30 FPS, lol. It was interesting to see ATI graphics really slap the Nvidia stuff around in Crysis back in the day.

AMD still slaps nVidia around in Crysis; that's one of a handful of games where the 290/290X really spank the 970. I'd assume the relationship still holds between the RX 480 and the 1060, based on the 480's higher memory bandwidth, but review sites finally stopped testing Crysis: Warhead about 2 years ago.

NihilismNow
Aug 31, 2003

SlayVus posted:

Also, Nvidia maximized their 28nm process by basically increasing die size with each generation. AMD took ~2 years to go from a max die size of 352mm^2 on the HD 7970 to 438mm^2 on the R9 290X.

Fury X came out in 2015 and was 596mm^2, and there's a gap of 3.5 years between the release of the 7970 and the Fury X.
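
Quick tally of the die sizes quoted in the last few posts (numbers as posted in the thread, not double-checked against spec sheets):

code:
# Die sizes mentioned above, in mm^2, with growth relative to the 7970.
dies = {
    "HD 7970 (2011/12)": 352,
    "R9 290X (2013)":    438,
    "Fury X (2015)":     596,
}
base = dies["HD 7970 (2011/12)"]
for name, size in dies.items():
    print(f"{name}: {size} mm^2 ({size / base - 1:+.0%} vs the 7970)")

# HD 7970: +0%, R9 290X: +24%, Fury X: +69%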

EdEddnEddy
Apr 5, 2012



Twerk from Home posted:

AMD still slaps nVidia around in Crysis; that's one of a handful of games where the 290/290X really spank the 970. I'd assume the relationship still holds between the RX 480 and the 1060, based on the 480's higher memory bandwidth, but review sites finally stopped testing Crysis: Warhead about 2 years ago.

It's also a drat pretty game even today.

Also, the Mechwarrior Living Legends mod is still fantastic-looking, and more fun than MWO, so...


I need to go back and play those games again. It's been a while.
