SwissArmyDruid
Feb 14, 2014

by sebmojo
The tech sites are doing their HBM analysis now that AMD has released more info. Pick your poison:

http://www.anandtech.com/show/9266/amd-hbm-deep-dive

http://semiaccurate.com/2015/05/19/amd-finally-talks-hbm-memory/

http://www.pcper.com/reviews/General-Tech/High-Bandwidth-Memory-HBM-Architecture-AMD-Plans-Future-GPUs

Of note:

* Anandtech thinks that AMD will halve their memory power consumption: http://www.anandtech.com/show/9266/amd-hbm-deep-dive/4 This isn't the kind of power savings you'd find in Maxwell, though. Oh well, there's always Arctic Islands. Nvidia will benefit from being able to claim a "free" 15-30W power savings relative to Maxwell, and Team Green's marketing will probably spin this to make Team Red look bad.

* Semiaccurate thinks that while HBM is going to be a very high end product, the tangential benefits that are helping to defray costs now (not needing to route memory through the PCB = less complex, cheaper PCBs; being able to repurpose old 65nm lines into dedicated interposer lines) will really help drive costs down once volume ramps up. In other words, HBM should not be any more expensive than whatever people were going to do with GDDR5, if not cheaper.

* PCPer notes that the interposer gives AMD some serious flexibility. Probably a "duh" thing, but it lets AMD use different process technologies, like if they wanted to use a 14nm process on the GPU and a 19nm process on the HBM. (Their article uses 28nm and 19nm, but let's face it, we're getting 14nm FinFET parts from both nVidia and AMD inside of two years.)

SwissArmyDruid fucked around with this message at 21:32 on May 19, 2015

veedubfreak
Apr 2, 2005

by Smythe
Hey look, Nvidia decided to hook us up.
http://www.geforce.com/titan-x-geforce-experience-beta

GrizzlyCow
May 30, 2011
It seems the Project Cars developers may have lied about their correspondence with AMD per HardOCP.


SwissArmyDruid posted:

The tech sites are doing their HBM analysis now that AMD has released more info. Pick your poison:


I'm really hoping that this won't be a repeat of the Rx 200 or GTX 700 series where only two or three cards are new out of the whole lineup. I guess we won't know 'til this summer.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette

Oh poo poo.

SwissArmyDruid
Feb 14, 2014

by sebmojo

GrizzlyCow posted:

It seems the Project Cars developers may have lied about their correspondence with AMD per HardOCP.

I am unfamiliar with the personalities on [H]. Who is cageymaru, and why does what he says carry any weight?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

GrizzlyCow posted:

It seems the Project Cars developers may have lied about their correspondence with AMD per HardOCP.

quote:

The PhysX makes 600 calculations per second on the CPU.

:eyepop:

penus penus penus
Nov 9, 2014

by piss__donald

GrizzlyCow posted:




I'm really hoping that this won't be a repeat of the Rx 200 or GTX 700 series where only two or three cards are new out of the whole lineup. I guess we won't know 'til this summer.

haha, I'm pretty sure that's one thing we can be certain of

Rukus
Mar 13, 2007

Hmph.

GrizzlyCow posted:

It seems the Project Cars developers may have lied about their correspondence with AMD per HardOCP.

I'm really hoping that this won't be a repeat of the Rx 200 or GTX 700 series where only two or three cards are new out of the whole lineup. I guess we won't know 'til this summer.

I'm sure AMD will continue to dress up the rotting corpse that is the 7970 in new clothes with yet another rebrand. It makes it really difficult to keep track of what actually supports their latest features like TrueAudio. That, and the fact they've been keeping the prices on these rebrands the same for years now, stagnating the market a bit.

repiv
Aug 13, 2009

GrizzlyCow posted:

It seems the Project Cars developers may have lied about their correspondence with AMD per HardOCP.

The amount of rumor being passed around as fact regarding Project CARS is unreal. If you believe the mob, it runs badly on AMD because the Gameworks source code is under NDA and AMD can't see it (even though the game doesn't use any Gameworks libraries besides PhysX), and because the PhysX simulation is only GPU-accelerated on nVidia cards (even though it runs perfectly with CPU-forced PhysX on nVidia, and on the consoles' pathetic CPUs).

Who knows who's to blame for AMD's low performance, but making stuff up to paint nVidia as a mustache-twirling villain isn't helping :suicide:


Slightly Mad put out a statement refuting that; it's their own CPU-based simulation that runs at 600Hz.

repiv fucked around with this message at 22:05 on May 19, 2015

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


Is 4GB that limiting for modern 4k games?

Kazinsal
Dec 13, 2011

repiv posted:

Slightly Mad put out a statement refuting that; it's their own CPU-based simulation that runs at 600Hz.

They're changing their story pretty much every day.

repiv
Aug 13, 2009

Are they? They did say the physics ran at 600Hz, but as far as I can see they never said PhysX runs at 600Hz.

Someone unaware that the game has its own physics model for the vehicles just saw that and made a false assumption.

Ragingsheep
Nov 7, 2009
There's no GPU-accelerated physics in Project Cars regardless of whether you have an AMD or Nvidia GPU.

Sininu
Jan 8, 2014

I have been under the impression that only the APEX part of PhysX can run on the GPU.
I know Arma 3 and all Unreal 4 games use PhysX for general physics, which only runs on the CPU.

Sininu fucked around with this message at 22:35 on May 19, 2015

wolrah
May 8, 2006
what?

Tab8715 posted:

Is 4GB that limiting for modern 4k games?

GTA V on higher settings can blast past 4GB in 4K. Not sure about other games, I haven't really tried too hard to break it.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette
Shadow of Mordor requires 6GB of VRAM to use the Ultra Textures setting.

AMD is putting out a card rumored to be $850 that can't run a year old game at max settings.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Mutation posted:

Shadow of Mordor requires 6GB of VRAM to use the Ultra setting on textures.

AMD is putting out a card rumored to be $850 that can't run a year old game at Ultra settings.

Are the Mordor Ultra textures (DLC, I think?) actually an improvement in visual quality even at 4K, or just a stunt?

Star War Sex Parrot
Oct 2, 2003

Subjunctive posted:

Are the Mordor Ultra textures (DLC, I think?) actually an improvement in visual quality even at 4K, or just a stunt?
It's difficult to see a difference in still shots, and basically impossible when the game is in motion.

GrizzlyCow
May 30, 2011

SwissArmyDruid posted:

I am unfamiliar with the personalities on [H]. Who is cageymaru and what makes what he says carry any weight?

I'm not too familiar either. Dude just seems to be a backer. If it helps back up what he said, someone on the Steam forums accidentally force-enabled CPU-only PhysX, and their performance dropped to AMD levels. There's also this page, which shows what happens when the R9 290X is used with Windows 10's updated D3D11 (which reduces load on the processor).

Jenny Agutter
Mar 18, 2009

Mutation posted:

Shadow of Mordor requires 6GB of VRAM to use the Ultra Textures setting.

AMD is putting out a card rumored to be $850 that can't run a year old game at max settings.

It really doesn't; you can run ultra textures on a 970 with no problems.

repiv
Aug 13, 2009

GrizzlyCow posted:

I'm not too familiar either. Dude just seems to be a backer. If it helps back up what he said, someone on the Steam forums accidentally force-enabled CPU-only PhysX, and their performance dropped to AMD levels. There's also this page, which shows what happens when the R9 290X is used with Windows 10's updated D3D11 (which reduces load on the processor).

I've tested CPU-only PhysX myself; it makes absolutely zero difference to performance, even on ultra settings.

Can't say what caused that guy's performance to tank, but it's more likely something unrelated caused it than CPU PhysX miraculously running faster on my machine.

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD
I don't know if the other HBM articles mentioned this, but in The Tech Report's interview with Joe Macri he talks about memory capacity efficiency. So maybe 4GB will still be enough going forward if they're able to work around the problem without degrading quality.

http://techreport.com/review/28294/amd-high-bandwidth-memory-explained/2

The Tech Report posted:

When I asked Macri about this issue, he expressed confidence in AMD's ability to work around this capacity constraint. In fact, he said that current GPUs aren't terribly efficient with their memory capacity simply because GDDR5's architecture required ever-larger memory capacities in order to extract more bandwidth. As a result, AMD "never bothered to put a single engineer on using frame buffer memory better," because memory capacities kept growing. Essentially, that capacity was free, while engineers were not. Macri classified the utilization of memory capacity in current Radeon operation as "exceedingly poor" and said the "amount of data that gets touched sitting in there is embarrassing."

ryangs
Jul 11, 2001

Yo vivo en una furgoneta abajo cerca del río!

ryangs posted:

How's this for insane: I just bought a new GTX 970 and my PC won't boot if my monitor is connected.

I have an Intel DZ77GA-70K motherboard and an Apple Cinema HD Display, the 23" one with a DVI connector. I just bought the ASUS Strix GTX 970. If the monitor is connected, the PC won't go beyond the POST screen (where I see the Intel logo). It sits there, then reboots.

If I connect another monitor, POST completes successfully and it boots.

If I boot with no monitor connected, it POSTs and then I can connect my Cinema HD Display once Windows is booting.

WTF?

Unfortunately, I'm already on the newest BIOS for my motherboard. Some people also suggested disabling UEFI boot, which I did, without effect.

For the curious, running the monitor through a simple DVI-to-HDMI adapter has fixed this. Amazing. The POST screen now shows at 640x480 instead of 1920x1200, which may be a clue to why this fixes it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
I'd much rather have 8GB cards at launch, and if they don't have them I'll be tempted to hold off. But what happened to all the people backing the 970 with "3.5GB is more than enough", and what of the fact that the GTX 980 only has 4GB?

It doesn't sound like there's a technical reason they can't do 8 stacks - the only plausible objection I've heard is that the interposer might be too big. Kinda sounds like they might be holding 8GB units back as a counterattack after the 980 Ti launches with 6GB, along with the dual-GPU card (if rumors are to be believed (they're not)).

Or they could go the :ironicat: route and back their 4GB of HBM up with 4-8GB of GDDR5 as a slow segment (after which NVIDIA immediately launches back with the "6GB means 6GB!" campaign and AMD's marketing team commits seppuku). AMD usually puts so much bandwidth on their memory that it would still probably be fine. After all, a Titan X only has 5% more bandwidth than a 290X, so a "slow segment" would still probably be faster than a 980 Ti, especially if (since it's a cut-down GM200) they start disabling parts of the memory system like when they cut GM204 down into the GTX 970. You're probably back into "huge board and runs hot" territory at that point, of course.
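
(Rough numbers, if I have the specs right: the Titan X is 384-bit at 7 Gbps ≈ 336 GB/s, and the 290X is 512-bit at 5 Gbps = 320 GB/s, so right around 5% more.)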

Also, hopefully the DX12 "adding memory capacity of the cards together in SLI/Crossfire" thing comes to fruition, because that would be a killer feature. Or hopefully devs tighten their poo poo up and reduce memory usage, although I'm dubious that will solve the problems entirely. If that happens, people will just start running double-ultra textures at 8x SSAA in triple-monitor 4K. Whatever it takes to get your rig bogged down to 50 FPS.

Paul MaudDib fucked around with this message at 00:22 on May 20, 2015

veedubfreak
Apr 2, 2005

by Smythe
Ugh, Witcher 3 is 23GB. Gonna take a bit to download that one.

But it was free, woo.

GrizzlyCow
May 30, 2011

repiv posted:

I've tested CPU-only PhysX myself; it makes absolutely zero difference to performance, even on ultra settings.

Can't say what caused that guy's performance to tank, but it's more likely something unrelated caused it than CPU PhysX miraculously running faster on my machine.

Hmm. After looking around, there do seem to be more (user) benchmarks that follow your conclusion. Guess I goofed.

repiv
Aug 13, 2009

wolrah posted:

GTA V on higher settings can blast past 4GB in 4K. Not sure about other games, I haven't really tried too hard to break it.

Off the top of my head, Crysis 3 hits 3.7GB at 4K.

VRAM consumption numbers are kind of dubious though, especially in streaming-heavy open world games. You can't tell if the engine is bloating its consumption by aggressively pre-caching more assets than it needs when spare VRAM is available, or by delaying the eviction of probably-unneeded assets until more VRAM is actually required.
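
For what it's worth, the numbers people throw around usually come from tools that just report total allocation, which is exactly why they can't tell "needed right now" apart from "opportunistically cached". Here's a minimal sketch of that kind of measurement in Python (assuming an NVIDIA card with nvidia-smi on the PATH; overlays like Afterburner report the same sort of allocated figure):

code:

# Log the VRAM usage that nvidia-smi reports, once per second.
# Note: this is total memory *allocated* on the card, not the working set a
# game actually touches every frame, so a big number here doesn't prove a
# game "needs" that much VRAM.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # One line per GPU; just take the first.
    return int(out.strip().splitlines()[0])

while True:
    print(time.strftime("%H:%M:%S"), vram_used_mib(), "MiB allocated")
    time.sleep(1)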

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
So far a lot of games with massive VRAM requirements seem to perform just fine when the excess is loaded into system memory. I think people overestimate just how much bandwidth is needed; as long as the driver is putting the bandwidth-hungry assets in GDDR5/HBM, most games should run fine.

penus penus penus
Nov 9, 2014

by piss__donald
I've yet to run into any scenario where I couldn't just turn AA down either, but... I haven't played things like Witcher 3 or GTA V.

sauer kraut
Oct 2, 2004

repiv posted:

Off the top of my head, Crysis 3 hits 3.7GB at 4K.

VRAM consumption numbers are kind of dubious though, especially in streaming-heavy open world games. You can't tell if the engine is bloating its consumption by aggressively pre-caching more assets than it needs when spare VRAM is available, or by delaying the eviction of probably-unneeded assets until more VRAM is actually required.

That's correct. Witcher 3 only 'needs' a 2GB card for the highest-quality textures; setting it to ultra just increases the buffer.
Of course it helps that it uses a custom variant of FXAA and doesn't waste resources on MSAA. Well, it applies MSAA to HairWorks stuff, but at that point you're at the very high end anyway.

Gwaihir
Dec 8, 2009
Hair Elf

Mutation posted:

Shadow of Mordor requires 6GB of VRAM to use the Ultra Textures setting.

AMD is putting out a card rumored to be $850 that can't run a year old game at max settings.

Also, most importantly, it doesn't actually use 6 gigs unless you also bump AA up to high levels.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

This is supremely cool. Not only am I all 'yay team green', but now I am using GOG's Galaxy beta where I'd probably have ignored it before. A good marketing move, I do say so.

Miley Virus
Apr 9, 2010

Tab8715 posted:

Is 4GB that limiting for modern 4k games?



Not really. That graph from TPU shows the max memory use of those games at either 1440p with 4xAA or 4K with no AA. (CoD is probably an outlier, filling as much memory as it's given without actually using it.) Having at least 6GB seems like the sweet spot so that it's not limiting for at least a couple of years.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

KakerMix posted:

This is supremely cool. Not only am I all 'yay team green', but now I am using GOG's Galaxy beta where I'd probably have ignored it before. A good marketing move, I do say so.

If they're smart, they'll keep it up. I'd imagine it costs them no more than ~$10 per download code when buying in bulk (if not less), and if they leverage their capital advantage over AMD to provide first-tier not-yet-released games to people who buy Titan cards for their first year, that munches up a *lot* of the price difference between their SKUs and all but forces AMD to lower their prices or start a program to compete. I'd think the only thing that'd give them pause is setting up a system like Humble Bundle did for a while, where they automatically added titles to your linked Steam account so they could put a curb on people selling the codes.

Of course, the folks on Guru3D are already theorycrafting how they could hack a cheap nVidia card's BIOS to make it identify as a Titan X in name and DEV_ID only, so they could game the system.

BIG HEADLINE fucked around with this message at 12:45 on May 20, 2015

Evil Kaneval
Sep 28, 2001

Jumping books since 1938

Evil Kaneval posted:

I'm wondering if there are others here running 770s in SLI who have experienced a tremendous amount of stutter while using nVidia drivers beyond 347.88 (namely 350.12 and 352.86)? This has been a confounding problem for me, as 347.88 worked flawlessly. I updated to 350.12 for GTA 5 and experienced game-breaking stuttering in that and other games (Guild Wars 2 and Diablo 3). The stuttering stopped when I disabled SLI, but I wasn't happy with that solution.

I was able to roll back to 347.88 (uninstalled using DDU and reinstalled using the clean install option) and play through GTA 5 without issues. Fast forward to yesterday, when I installed 352.86, hoping that my issues had been resolved. They had not. Again, all of my games (including Witcher 3) are experiencing horrible stutter with SLI enabled and I'm faced with rolling back to 347.88 again (which obviously doesn't have a Witcher 3 SLI profile). I'm really disappointed with these last two nVidia driver releases, and I've seen a fair number of people using 770/780/780Ti having similar, but not identical, issues. I've heard that drivers after 347.88 are no longer supporting Kepler GPUs, which seems really odd (via the nVidia 7xx section of their official forums, seems mostly speculative though). I'm not terribly sensitive to performance issues as long as gameplay is relatively smooth, but this stutter absolutely destroys the experience I'm heavily invested in.

Also, I'm using an i7-4790K in an ASUS Maximus VII Hero motherboard - BIOS is fully updated on all system components. All components are seated properly and the SLI bridge is secure.

Sorry if this isn't the right place to post these questions and thank you for any insight you all can provide.

I posted this on page 424, and I seem to have isolated my issue and resolved my stuttering. It was a problem with my particular nVidia Surround setup, and I'm still not 100% sure what I was doing wrong. It seems that something was introduced in 350.12 and beyond that made my setup no longer functional. I wrote about it on the GeForce forums if anyone is curious: https://forums.geforce.com/default/...your-monitors-/

penus penus penus
Nov 9, 2014

by piss__donald
I'm glad you figured out what was triggering it. I'd never have guessed lol

The Slack Lagoon
Jun 17, 2008



How would heat generation and energy consumption compare between a 760 and a 970?

I have a 760 that seems to throw off waaay more heat than my previous 660 Ti. I'm thinking of upgrading to the 970 for the 2 free games and selling my 760 to offset the cost, and part of that consideration would be decreased energy consumption.

How much could I get for an EVGA 760 4GB, no overclocking?

Seamonster
Apr 30, 2007

IMMER SIEGREICH

cat doter posted:

So far a lot of games with massive vram requirements seem to perform just fine when the excess is loaded into system memory. I think people over estimate just how much bandwidth is needed, as long as the driver is putting the bandwidth hungry assets in GDDR5/HBM then most games should run fine.

Maybe with HBM in laptops we can at least get some standardization for mobile GPUs in terms of memory bandwidth. For too long have manufacturers castrated otherwise decent GPUs with lovely DDR3, not mentioned it at all in the specs, and then had people wonder why they're not getting all the FPS they should be getting or thought they would.

Cinara
Jul 15, 2007
760: 170W

vs.

970: 145W

So the 970 is about 15% lower; up to you if that is really worth it. The main power savings were for people coming from 770s (230W) and 780s (250W). The 970 is certainly a much more powerful card in terms of performance though, so you would see gains there.
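
(For the math, using the rated TDPs: (170 - 145) / 170 ≈ 0.147, so roughly 15% less power at full load.)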

The Slack Lagoon
Jun 17, 2008



Cinara posted:

760: 170W

vs.

970: 145W

So the 970 is about 15% lower; up to you if that is really worth it. The main power savings were for people coming from 770s (230W) and 780s (250W). The 970 is certainly a much more powerful card in terms of performance though, so you would see gains there.

I am amazed at the recent energy efficiency gains in computer parts.

:science: Hey, we're using significantly less power... and it performs way better than before!

Any thoughts on what I could sell a 760 4GB for?

The Slack Lagoon fucked around with this message at 20:50 on May 20, 2015
