BurritoJustice
Oct 9, 2012

Desuwa posted:

The ACX 2.0 on my SC has the curved heatpipes.

It's actually worse than the FTW version AnandTech reviewed. The FTW has a fourth heatpipe; the new ACX 2.0 with the straight pipes still only has three.

When even EVGA's marketing only claims a 6% improvement, I won't bother with the step-up program. I'm just going to return it to Amazon once a replacement MSI card arrives. I waffled for a while over whether to bother with the return, but seeing such an inefficient and lazily redesigned cooler tells me that EVGA doesn't care in the slightest.

So there are four EVGA coolers: the ACX, the ACX 2.0, the ACX 2.0 that came with the FTW, and the ACX 2.0 that comes with the SSC (which has fewer heatpipes?). EVGA's 6% compares the new one (the 4th) to the curved-heatpipe ACX (the first or second) instead of the 3rd, which by all accounts is better than the 1st and 2nd while still being lovely. By all accounts they are all lovely. Hilarious.


wolrah
May 8, 2006
what?

BurritoJustice posted:

The second one is quite clearly exactly the same cooler from the Anandtech review that is poo poo

Scroll down the page and see the cooler shot, it has those heatpipes.

The curved-around, poorly contacting heatpipes were on the original 970 ACX 1.0; these new cards have the large nickel contact plate that's clearly visible in that picture of the FTW.

Take a look for yourself: http://www.evga.com/Products/Product.aspx?pn=04G-P4-2974-KR

Even in the promo shot with the box you can see the heatpipes running across the card. Their marketing is just smoking crack and throwing names at the wall to see what sticks. Within the 970 line alone there are now at least three different coolers that have been called "ACX 2.0".

We'll see how it does when I get it. I mostly wanted the upgrade in the output configuration, and paying just shipping for that plus a probably-better cooler seems worth a shot.

e:f;b

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Kazinsal posted:

300W TDP? Jesus christ, that's like two MSI 970 4Gs. :stare:

OTOH the rumor is that it's 50% again larger than the 290X, and HBM will alleviate one of the biggest bottlenecks in current-gen cards. Combine that with imperfect SLI scaling, and the thing could actually RUN like a pair of 970s.
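
For scale, here's a quick sketch of what "a pair of 970s" actually means once SLI overhead is counted. The scaling percentages are assumptions for illustration, not benchmarks:

```python
# "Runs like a pair of 970s" is a lower bar than 2.0x a single 970, because
# SLI scaling is imperfect. The scaling figures here are assumptions.
for sli_scaling in (0.70, 0.80, 0.90):
    effective = 1 + sli_scaling
    print(f"SLI scaling of {sli_scaling:.0%} -> pair of 970s ~= {effective:.2f}x one 970")
# A single big-die card delivering ~1.7-1.9x a 970 would therefore "run like
# a pair" while skipping SLI's frame-pacing headaches.
```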

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

BurritoJustice posted:

So there are four EVGA coolers: the ACX, the ACX 2.0, the ACX 2.0 that came with the FTW, and the ACX 2.0 that comes with the SSC (which has fewer heatpipes?). EVGA's 6% compares the new one (the 4th) to the curved-heatpipe ACX (the first or second) instead of the 3rd, which by all accounts is better than the 1st and 2nd while still being lovely. By all accounts they are all lovely. Hilarious.

As far as I know all the non-FTW cards have three pipes, whereas the FTW versions have four. It's their attitude more than their technical incompetence that has me going through the motions to replace it.

Star War Sex Parrot
Oct 2, 2003

I don't know how many heatpipes my GTX 980 has because I'm too busy playing games with it.

univbee
Jun 3, 2004




Currently mulling over some upgrade possibilities. Is there any particular reason to get the larger 970 cards instead of the mini ones, other than overclocking? Or is it identical at stock speeds, just with fewer fans and a smaller size?

1gnoirents
Jun 28, 2014

hello :)
Pretty excited to see what HBM can do. Hopefully the GPU can keep up. 300 watts is disappointing but not unexpected, I suppose, save for that one leak earlier

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe

univbee posted:

Currently mulling over some upgrade possibilities. Is there any particular reason to get the larger 970 cards instead of the mini ones, other than overclocking? Or is it identical at stock speeds, just with fewer fans and a smaller size?

At stock it's going to run hotter (some people with the mini-ITX version have reported temperatures 10°C higher than cards with more fans in some workloads) and probably louder, since it's one fan versus two or three.

Khagan
Aug 8, 2012

Words cannot describe just how terrible Vietnamese are.

Factory Factory posted:

OTOH the rumor is that it's 50% again larger than the 290X, and HBM will alleviate one of the biggest bottlenecks in current-gen cards. Combine that with imperfect SLI scaling, and the thing could actually RUN like a pair of 970s.

But can I use the same nuke jokes that AMDfags used to describe Fermi with this card?

Also, what bottleneck does HBM solve?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Memory bandwidth. Since the 7970/GeForce 680, cards have gotten a lot of performance out of VRAM overclocking, and wide GDDR5 interfaces require a lot of shenanigans like huge die sizes and tons of memory controllers with complex access control. HBM will pack shitloads of bandwidth into fewer controllers (albeit still with a ton of I/O pins). It'll basically act like a giant and effective L4 cache, like Intel's Crystalwell or the ESRAM cache on the Xboxen.
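
To put numbers on that: peak bandwidth is just per-pin data rate times bus width. The clocks and widths below are the commonly cited specs for these parts, plugged in purely as illustration:

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8.
# The clocks and bus widths are the commonly cited specs for these parts.
def peak_bw_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(f"GTX 980, 256-bit GDDR5 @ 7 Gbps:       {peak_bw_gbs(7.0, 256):.0f} GB/s")      # 224
print(f"R9 290X, 512-bit GDDR5 @ 5 Gbps:       {peak_bw_gbs(5.0, 512):.0f} GB/s")      # 320
print(f"4 HBM1 stacks, 1024-bit @ 1 Gbps each: {peak_bw_gbs(1.0, 4 * 1024):.0f} GB/s") # 512
```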

sauer kraut
Oct 2, 2004
Today's GTX 960 leaks are dire.
It's a GM204 cut in half, but they stepped hard on the clockspeed to edge out a benchmark lead over the 760: 1200+ MHz out of the box means 120W and the same cooling as a 970.
Somehow they managed to make a card that's less attractive than the 2GB Tonga.
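
A rough sanity check on that clockspeed claim: naive FP32 throughput is cores × 2 ops per clock × clock. Core counts and clocks follow the leaks and launch specs of the day, and raw FLOPS ignores Maxwell's per-clock efficiency gains, so treat this as directional only:

```python
# Naive FP32 throughput: TFLOPS = CUDA cores * 2 ops/clock * clock (GHz) / 1000.
# Core counts and clocks follow the leaks/specs of the day; illustrative only.
def tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000

print(f"GTX 760 (1152 cores @ ~1.03 GHz boost): {tflops(1152, 1.033):.2f} TFLOPS")
print(f"Leaked 960 (1024 cores @ ~1.20 GHz):    {tflops(1024, 1.20):.2f} TFLOPS")
```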

Josh Lyman
May 24, 2009


sauer kraut posted:

Today's GTX 960 leaks are dire.
It's a GM204 cut in half, but they stepped hard on the clockspeed to edge out a benchmark lead over the 760: 1200+ MHz out of the box means 120W and the same cooling as a 970.
Somehow they managed to make a card that's less attractive than the 2GB Tonga.

Maybe they overshot with the 970 and are trying to overcompensate with the 960.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

sauer kraut posted:

Today's GTX 960 leaks are dire.
It's a GM204 cut in half, but they stepped hard on the clockspeed to edge out a benchmark lead over the 760: 1200+ MHz out of the box means 120W and the same cooling as a 970.
Somehow they managed to make a card that's less attractive than the 2GB Tonga.

AMD has to be left confused right now, because their slightly underwhelming mid-range part was just copied. Maybe they should feel flattered?

Any information about the retail pricing yet? Because NVIDIA hasn't historically been known for their good mid-range pricing. Either way, it seems the rumours were on the money.

HalloKitty fucked around with this message at 15:33 on Jan 14, 2015

Presto
Nov 22, 2002

Keep calm and Harry on.

BurritoJustice posted:

By all accounts they are all lovely. Hilarious.

They're not lovely. They all do a perfectly adequate job of cooling the card. Just because it's maybe not quite as good as other cards doesn't mean it's some kind of total failure.

Maybe it's just because my last card was a 5870, but my EVGA 970 seems pretty quiet.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

sauer kraut posted:

Today's GTX 960 leaks are dire.
It's a GM204 cut in half, but they stepped hard on the clockspeed to edge out a benchmark lead over the 760: 1200+ MHz out of the box means 120W and the same cooling as a 970.
Somehow they managed to make a card that's less attractive than the 2GB Tonga.

Ouch, if that card is any higher than $200 then that's a bit sad.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

HalloKitty posted:

AMD has to be left confused right now, because their slightly underwhelming mid-range part was just copied. Maybe they should feel flattered?

Any information about the retail pricing yet? Because NVIDIA hasn't historically been known for their good mid-range pricing. Either way, it seems the rumours were on the money.

$199 is what I've heard. For 2GB and a 128-bit bus in 2015, um....

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Recall:

1) It's supposed to be a new chip, and there are supposed to be Ti versions with harvested GM204 dies to fit between the 960 and 970.

2) Maxwell features end-to-end memory compression. Notice how the GeForce 980 matches or beats a 780 Ti (with a 384-bit bus) using a 256-bit bus and identical memory clocks - see the quick bandwidth arithmetic after this list.

3) A GeForce 760-ish card really can't take much advantage of more than 2 GB of RAM while maintaining ~60 FPS by itself. Like, here's some benchmarks testing Far Cry 3: at 4K, the extra VRAM makes a 5x difference - from 2 FPS to 10 FPS, i.e. still unplayable either way. It's clearly a 1920x1080 card, and 1920x1080 games only started consistently using more than 1 GB a year or two ago.

4) Today's "high" settings are yesteryear's "Ultra," and today's "Ultra" involves supersampling and/or ludicrously expensive, barely perceptible differences like texture quality you'd only pick up on at 4K res. It's not like today's Mass Effect needs more oomph, it's that people are exposing Metro 2033 levels of high-end options more regularly.
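
Point 2 in numbers - the raw bandwidth deficit Maxwell's compression has to claw back. The 7 Gbps GDDR5 on both cards is the published spec; the rest is arithmetic:

```python
# The raw bandwidth the 980 gives up versus the 780 Ti, which Maxwell's
# delta color compression has to claw back.
bw_780ti = 7.0 * 384 / 8  # 336 GB/s (7 Gbps GDDR5, 384-bit bus)
bw_980 = 7.0 * 256 / 8    # 224 GB/s (7 Gbps GDDR5, 256-bit bus)
print(f"780 Ti: {bw_780ti:.0f} GB/s, 980: {bw_980:.0f} GB/s "
      f"({bw_980 / bw_780ti:.0%} of the raw bandwidth, yet it keeps up)")
```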

The_Franz
Aug 8, 2003

Factory Factory posted:

3) A GeForce 760-ish card really can't take much advantage of more than 2 GB of RAM while maintaining ~60 FPS by itself. Like, here's some benchmarks testing Far Cry 3: at 4K, the extra VRAM makes a 5x difference - from 2 FPS to 10 FPS, i.e. still unplayable either way. It's clearly a 1920x1080 card, and 1920x1080 games only started consistently using more than 1 GB a year or two ago.

It does prevent frame time spikes when games just assume that they can have 3+ gigs of textures cached at one time and 2 gig cards end up swapping textures to and from system memory.
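
A back-of-the-envelope on why overflow shows up as spikes rather than a lower average FPS - the PCIe throughput and texture sizes here are ballpark assumptions:

```python
# Pulling evicted textures back over PCIe mid-frame is what blows the frame
# budget. ~12 GB/s is an assumed practical rate for PCIe 3.0 x16 (~15.75 theoretical).
PCIE3_X16_GBS = 12.0
FRAME_BUDGET_MS = 1000 / 60  # 16.7 ms per frame at 60 FPS

for swap_mb in (64, 256, 512):
    stall_ms = swap_mb / 1024 / PCIE3_X16_GBS * 1000
    print(f"swapping {swap_mb:3d} MB: ~{stall_ms:4.1f} ms stall "
          f"({stall_ms / FRAME_BUDGET_MS:.1f}x a 60 FPS frame)")
```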

Simple Simon
Dec 29, 2008

"Good sir, would you fancy a Salmon tartar?"
Wrong thread.

Simple Simon fucked around with this message at 19:59 on Jan 14, 2015

1gnoirents
Jun 28, 2014

hello :)

The_Franz posted:

It does prevent frame time spikes when games just assume that they can have 3+ gigs of textures cached at one time and 2 gig cards end up swapping textures to and from system memory.

Kind of. I have a lot of opinions on NVIDIA's memory situation last gen. Most of the time it was very well matched: at least among last year's games, almost no cards could reasonably push past the memory on the board without suffering badly in raw FPS first. That said, running out of memory causes the worst kind of FPS drops where it did show up (such as 770 2GB SLI), though even then only in certain situations. I could mitigate those drops-to-zero by frame limiting, despite never having "enough" memory. I'm not sure how well some of those cards would fare with today's most memory-hungry games.

I am very impressed with the memory improvements this time around, and it kind of clinched the idea that memory bus width and bandwidth are only really comparable in performance within the same generation of tech (which was evident in earlier generations as well). Still, the thought of 128-bit 2GB cards seems to be pushing it. We shall see

BurritoJustice
Oct 9, 2012

Presto posted:

They're not lovely. They all do a perfectly adequate job of cooling the card. Just because it's maybe not quite as good as other cards doesn't mean it's some kind of total failure.

Maybe it's just because my last card was a 5870, but my EVGA 970 seems pretty quiet.

I'd call louder-than-reference a pretty big failure for a two-fan open-air cooler.

By "maybe not quite as good" do you mean it's up to 10 decibels louder and 10 degrees hotter than other custom cards you can get for the same money? How is that not lovely?

It's a pretty awesome performer if you compare it to older, way-higher-TDP cards with basic blowers, but it's far below the standard for custom coolers this generation.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Factory Factory posted:

3) A GeForce 760-ish card really can't take much advantage of more than 2 GB of RAM while maintaining ~60 FPS by itself. Like, here's some benchmarks testing Far Cry 3: at 4K, the extra VRAM makes a 5x difference - from 2 FPS to 10 FPS, i.e. still unplayable either way. It's clearly a 1920x1080 card, and 1920x1080 games only started consistently using more than 1 GB a year or two ago.

...and there's a reason that ended about "a year ago": the introduction of the new consoles. It's no longer the case - 2GB, even 3+GB of VRAM, can already be exceeded by several next-gen console ports at 1080p.

2GB is just a shortsighted purchase even for a lower-end card these days. You can suffer performance drops by trying to push higher-res textures than your card can drive, sure - but there's a hell of a difference between dropping from 40 to 30 FPS and "completely unplayable," which is what happens when you run out of texture memory.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Happy_Misanthrope posted:

...and there's a reason that ended about "a year ago": the introduction of the new consoles. It's no longer the case - 2GB, even 3+GB of VRAM, can already be exceeded by several next-gen console ports at 1080p.

2GB is just a shortsighted purchase even for a lower-end card these days. You can suffer performance drops by trying to push higher-res textures than your card can drive, sure - but there's a hell of a difference between dropping from 40 to 30 FPS and "completely unplayable," which is what happens when you run out of texture memory.

Yeah, but see the point about halo-tier effects. Go look at Shadow of Mordor's 6 GB VRAM Ultra textures compared to High. Tell me you see a difference there. Tell me you'd see a difference when the game was in motion. Play it at 1080p rather than a higher resolution and then tell me you could spot the differences reliably. Because I can't.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

So why does gsync (and I think freesync/adaptive sync) require exclusive full-screen? Can the Windows compositor not scan out at variable rates?

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>
Most games in the last 6 months have used nearly all of, or even more than, 2GB of VRAM. I'd be extremely surprised if that doesn't become a major bottleneck by this time next year.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Factory Factory posted:

Yeah, but see the point about halo-tier effects. Go look at Shadow of Mordor's 6 GB VRAM Ultra textures compared to High. Tell me you see a difference there. Tell me you'd see a difference when the game was in motion. Play it at 1080p rather than a higher resolution and then tell me you could spot the differences reliably. Because I can't.

Even at 2560x1440 I couldn't notice a difference between SoM's high and ultra textures, so I went back to high because ultra textures made it leak memory faster.

I'm surprised I can't find more talk online about Shadow of Mordor leaking memory. When I had a small fixed-size page file it would run out of memory and crash after about 20-30 min; now that I let the page file get gigantic, it starts filling up the pagefile and stuttering and freezing after 30 min. I'd say there was something weird about my machine, but a friend of mine who has a GTX 970 instead of my R9 290 has the exact same thing happen. Whenever I've asked other people about it, I just get told that 8GB isn't enough RAM anymore; it seems like most of my friends have at least 16GB and several have 32GB.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Hace posted:

Most games in the last 6 months have used nearly all of, or even more than, 2GB of VRAM. I'd be extremely surprised if that doesn't become a major bottleneck by this time next year.

Hell, it was a bottleneck some time ago with modded Skyrim. 2GB seems very short-sighted now for anything more powerful than a 750 Ti.

sauer kraut
Oct 2, 2004
I guess 3-year-old 7970s for 230 bucks was just too good of a deal for the dirty poors; it couldn't be allowed to happen again :tinfoil:

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Subjunctive posted:

So why does gsync (and I think freesync/adaptive sync) require exclusive full-screen? Can the Windows compositor not scan out at variable rates?


I would not go there

Gwaihir
Dec 8, 2009
Hair Elf

Twerk from Home posted:

Even at 2560x1440 I couldn't notice a difference between SoM's high and ultra textures, so I went back to high because ultra textures made it leak memory faster.

I'm surprised I can't find more talk online about Shadow of Mordor leaking memory. When I had a small fixed-size page file it would run out of memory and crash after about 20-30 min; now that I let the page file get gigantic, it starts filling up the pagefile and stuttering and freezing after 30 min. I'd say there was something weird about my machine, but a friend of mine who has a GTX 970 instead of my R9 290 has the exact same thing happen. Whenever I've asked other people about it, I just get told that 8GB isn't enough RAM anymore; it seems like most of my friends have at least 16GB and several have 32GB.

I have a 980 and 8 gigs of system RAM and it doesn't do that for me. It plays fantastically at 2560x1600 at ~70 fps for hours.

:iiam:

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Gwaihir posted:

I have a 980 and 8 gigs of system RAM and it doesn't do that for me. It plays fantastically at 2560x1600 at ~70 fps for hours.

:iiam:

What we've worked out is that we both have 2500K, 8GB DDR3-1600, and Samsung 840 EVOs with RAPID enabled. I'm betting that RAPID is doing something dumb.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Malcolm XML posted:

I would not go there

?

Gwaihir
Dec 8, 2009
Hair Elf

Twerk from Home posted:

What we've worked out is that we both have 2500K, 8GB DDR3-1600, and Samsung 840 EVOs with RAPID enabled. I'm betting that RAPID is doing something dumb.

Hm, I also have an 840 EVO with RAPID, and a 3770K. Perhaps it's different motherboard chipsets, Z67 vs Z77.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
The size of the EVO matters for how much memory RAPID consumes, BTW. For what it's worth, I have seen something eat up my computer's nonpaged pool when I haven't restarted it for a few weeks (this is easiest to see in RAMMap).

I have a 750GB EVO and 16GB of RAM though (I love my Chrome tabs and games, so I overspent on RAM, sue meeeeee).

sauer kraut
Oct 2, 2004
Why would you install the lovely software that comes with a hard drive/SSD, and most of the stuff that comes with mainboards and GPUs? That's just begging for trouble.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

sauer kraut posted:

Why would you install the lovely software that comes with a hard drive/SSD, and most of the stuff that comes with mainboards and GPUs? That's just begging for trouble.

:what:

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

sauer kraut posted:

Why would you install the lovely software that comes with a hard drive/SSD, and most of the stuff that comes with mainboards and GPUs? That's just begging for trouble.

I think you're misunderstanding exactly what RAPID is.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

The way that Windows interfaces with graphics and DX is complicated, and the answer is "maybe, but it would break applications" - so I'm not surprised that dynamic refresh can only be done in single-application fullscreen mode.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

sauer kraut posted:

Why would you install the lovely software that comes with a hard drive/SSD, and most of the stuff that comes with mainboards and GPUs? That's just begging for trouble.

Do you know what RAPID does?


Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Malcolm XML posted:

The way that Windows interfaces with graphics and DX is complicated, and the answer is "maybe, but it would break applications" - so I'm not surprised that dynamic refresh can only be done in single-application fullscreen mode.

Yeah, I work with some ex-Windows-graphics people, but we weren't sure, and I couldn't find an explanation anywhere, nor any indication of whether it's something they could fix in a driver update (like they did when adding windowed-app support to Shadowplay, for example). I guess you're saying it could violate applications' expectations of how often DWM scans out, but DWM already composites, so apps update whenever they want - I don't really follow.

I'll keep looking on the web. Does AMD's freesync stuff have the same limitation?
