Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Gestalt Intellect posted:

Well, that's pretty weird, because the 740's specifications were better in every way except memory interface bus width (128-bit vs 256-bit), but I can return it.

Low-end Nvidias (x10-x40) tend to be outperformed by mid-range (x50-x60) cards even from two generations back.
Especially since the mid-range cards tend to get the features and hardware you'd want for running a game or driving a large display: things like GDDR5, wider buses, and just plain better architecture that's more efficient and more powerful per processor.


Anime Schoolgirl
Nov 28, 2002

DoctorOfLawls posted:

Is it fair to assume nVidia's Pascal will allow for 4k gaming at 60fps, or not by a long shot?
Depends on whether they decide to put more ROPs and TMUs on the new node or bring back the hardware scheduler; the latter would make compute gains substantial but gaming gains modest.

Owl Inspector
Sep 14, 2011

SwissArmyDruid posted:

Let me guess. You looked at the memory size and frequencies, and that was it? =P

It was a gift so I didn't make the call, but I would have made the same mistake anyway, because the specs on Newegg and such made the 740 look better (better core clock and double the VRAM, mainly, plus the general assumption that a card two generations newer would be better). In the past, going by those numbers always worked for me. Returning the 740 won't be a problem though, so I'm now looking at the 960 or 970.

At this point I'm mostly just looking at FPS comparisons and poo poo on Tom's Hardware because I don't know what any of these numbers mean anymore

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Gestalt Intellect posted:

It was a gift so I didn't make the call, but I would have made the same mistake anyway, because the specs on Newegg and such made the 740 look better (better core clock and double the VRAM, mainly, plus the general assumption that a card two generations newer would be better). In the past, going by those numbers always worked for me. Returning the 740 won't be a problem though, so I'm now looking at the 960 or 970.

At this point I'm mostly just looking at FPS comparisons and poo poo on Tom's Hardware because I don't know what any of these numbers mean anymore

http://anandtech.com/bench/GPU15/1248

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

The most impressive part of those benches, I think, is seeing a GTX 680 (35fps) land between an HD7850 (32fps) and an R7 370 (39fps). They're both the exact same GPU, and the R7 370 is even downclocked. That's either some impressive hardware optimizations or driver shenanigans. Also 380X outperforming GTX 780, :laffo:, get your driver poo poo together AMD.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FaustianQ posted:

The most impressive part of those benches, I think, is seeing a GTX 680 (35fps) land between an HD7850 (32fps) and an R7 370 (39fps). They're both the exact same GPU, and the R7 370 is even downclocked.

The 7850's stock speed is 860MHz, with 1200MHz VRAM.
The 370's stock speed, on the other hand, is 925MHz with a boost to 975MHz, and 1400MHz VRAM.

So the small difference is easily explained away by over 100MHz on the core clock and a couple hundred on the VRAM.
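
Back-of-the-envelope, a minimal sketch assuming fps scales roughly linearly with clocks (it doesn't quite, but it's close enough for a sanity check):

code:
# Rough scaling estimate from the stock clocks quoted above.
# Assumption: fps scales roughly linearly with core clock, which is only an
# approximation (real-world scaling is sub-linear).
hd7850_core, hd7850_mem = 860, 1200   # MHz
r7_370_core, r7_370_mem = 975, 1400   # MHz (370 boost clock)

core_gain = r7_370_core / hd7850_core - 1   # ~0.13
mem_gain = r7_370_mem / hd7850_mem - 1      # ~0.17

hd7850_fps = 32  # measured fps from the bench above
print(f"core clock advantage:   {core_gain:.0%}")
print(f"memory clock advantage: {mem_gain:.0%}")
print(f"naive 370 estimate from core clock alone: {hd7850_fps * (1 + core_gain):.0f} fps")
# ~36 fps from core clock alone; the measured 39 fps is in the same ballpark
# once the faster VRAM is factored in.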

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

HalloKitty posted:

The 7850's stock speed is 860MHz, with 1200MHz VRAM.
The 370's stock speed, on the other hand, is 925MHz with a boost to 975MHz, and 1400MHz VRAM.

So the small difference is easily explained away by over 100MHz on the core clock and a couple hundred on the VRAM.

Durr, thanks for pointing out the obvious part I missed. Still odd that they're sandwiching a GTX 680 between them; that's supposed to still be an impressive performer.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FaustianQ posted:

Durr, thanks for pointing out the obvious part I missed. Still odd that they're sandwiching a GTX 680 between them; that's supposed to still be an impressive performer.

That seems to be a particularly harsh benchmark on the 680. Compare the 7850 and the 680 in more benchmarks, and the picture changes.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

Also 380X outperforming GTX 780, :laffo:, get your driver poo poo together AMD.

That sounds about right, actually. The 780 Ti is roughly a 970-level performer, and AMD offers the 390 in that performance slot. The 380 is the AMD equivalent of the 960, and once the 960 Ti releases, that will be NVIDIA's response to the 380X. So the base 780 performing like a 960 Ti or a 380X sounds about right.

The thing to remember is that, for whatever reason, Tonga just isn't that fast a chip. Its power consumption and feature set improved (and tessellation in particular is vastly better) but overall performance didn't. The 285 was rather scandalous at release because it was getting beaten by a card that was three years old at that point. Even today, in the TechPowerUp performance summaries, the 280X beats the 380X at all resolutions.

In that light, the 380X is actually doing very well in that particular benchmark. It's beating the 280X by 10% and it's a full 48% faster than the base 380 model. Guess they unlocked more of whatever was bottlenecking the 380. SoM is a game that's known to be kind of a tough test; it's probably something like tessellation performance, which is probably why the 680 isn't doing that hot either. It's getting long in the tooth for sure.
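
For what it's worth, those percentages are just relative fps against the slower card. A minimal sketch; the fps values below are hypothetical placeholders, not the actual AnandTech numbers (those are on the linked bench page):

code:
# Percent-faster as used above: relative fps against the slower card.
# The fps values are hypothetical placeholders, NOT the real AnandTech
# results; substitute the numbers from the linked bench page.
def percent_faster(a_fps: float, b_fps: float) -> float:
    """How much faster card A is than card B, in percent."""
    return (a_fps / b_fps - 1.0) * 100.0

fps_380x = 30.0   # placeholder
fps_280x = 27.3   # placeholder
fps_380 = 20.3    # placeholder

print(f"380X vs 280X: {percent_faster(fps_380x, fps_280x):+.0f}%")  # roughly +10%
print(f"380X vs 380:  {percent_faster(fps_380x, fps_380):+.0f}%")   # roughly +48%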

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Paul MaudDib posted:

In that light, the 380X is actually doing very well in that particular benchmark. It's beating the 280X by 10% and it's a full 48% faster than the base 380 model. Guess they unlocked more of whatever was bottlenecking the 380. SoM is a game that's known to be kind of a tough test; it's probably something like tessellation performance, which is probably why the 680 isn't doing that hot either. It's getting long in the tooth for sure.

I seem to recall Shadow of Mordor being VRAM-heavy. The 380's base configuration included only 2GiB of VRAM, whereas all 380Xs have 4GiB. That could be affecting the result more than the difference in chip configuration.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

HalloKitty posted:

That seems to be a particularly harsh benchmark on the 680. Compare the 7850 and the 680 in more benchmarks, and the picture changes.

The 680 is a top-end card; its contemporary would be the 7970, not the 7850. It's not surprising that it wins against a card that cost half as much. There's a huge jump in performance when you move from Pitcairn to Tahiti.

There have been some of the last 6950s, 6970s, and 7850s getting clearanced at Newegg, and I've seen people advocate them for CrossFire builds. The 7850, 7950, and 7970 (and their R9 variants) are still OK for cheap-and-nasty single-card builds, but if you're considering CrossFire you might as well step up to a 290 or 390 instead. Yeah, GCN 1.0 has aged super gracefully, but it's pushing four years old at this point, and a single 4GB card is much better than CrossFired 2GB cards. Hawaii is going to stay viable for at least another year or two and will be a workable throwaway card for a bit longer after that.

The 680 has the same problem. Sure, it's a good score if you get a $40 unit from B-stock, but you're back to SLI'ing 2GB cards and that's just not future-proof.

Paul MaudDib fucked around with this message at 01:25 on Dec 28, 2015

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

HalloKitty posted:

I seem to recall Shadow of Mordor being VRAM-heavy. The 380's base configuration included only 2GiB of VRAM, whereas all 380Xs have 4GiB. That could be affecting the result more than the difference in chip configuration.

Yeah, good point. Ultra-quality textures are huge on SoM, that very well might be the bottleneck even at 1080p.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


What are the ideal upgrade scenario(s) for an R9 290 (non-X) at 2560x1600?

Gucci Loafers fucked around with this message at 03:03 on Dec 28, 2015

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Tab8715 posted:

What are the ideal upgrade scenarios for an R9 290 (non-X) at 2560x1600?

Probably going CrossFire, if the games you play support it. That's the most improvement for the least cash.

Past that, you basically sell your card and step up to a 980 Ti. There aren't a whole lot of other options that produce a performance increase over a current-gen high-end card (since the 390 is the same chip). Basically there's just the 290X and 390X, the 980, the Fury/X, and the 980 Ti/Titan X. The 290X, 390X, and 980 are perfectly fine cards, but the improvement is going to be like 15% or so, not a giant leap in performance.

I saw Fury Nanos at TigerDirect for $400 a while back. That would be a pretty good option too, at that price. The normal price is too high though, and the Fury isn't an especially compelling product overall as long as the 980 Ti remains priced so aggressively, unless you need AMD-specific capabilities like FreeSync or Eyefinity resolution matching.

Don't forget to overclock too, if you haven't already. That will get you up to par with the performance of a 390. Free is always good.

Paul MaudDib fucked around with this message at 03:01 on Dec 28, 2015

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Yeah, I didn't really mean to link that for the SoM bench so much as to give the guy the tool.


Tab8715 posted:

What are the ideal upgrade scenarios for an R9 290 (non-X) at 2560x1600?

In order: the ideal upgrade is a year away; a second 290 for CrossFire if your games support it; a 980 Ti; or a Fury (X) to go with a FreeSync screen or an AMD preference.

Anime Schoolgirl
Nov 28, 2002

Will Nvidia ever patch the 980 Ti to use VESA/Intel/AMD Adaptive Sync, though?

Sacred Cow
Aug 13, 2007
Does anyone have any experience with Micro Center's open-box GPUs? I'm about to get a GTX 970, and my local MC has 6 or 7 open-box Asus models for $280. Is it worth the risk, or should I just pay the extra $40 for new and try to sell off the pack-in game to make up the difference?

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


Paul MaudDib posted:

Don't forget to overclock too, if you haven't already. That will get you up to par with the performance of a 390. Free is always good.

I haven't, actually, but for whatever reason Crimson's OverDrive only gives me a way to increment my GPU/memory speed by percentages? My Google kung fu is failing me; how do I switch this to show MHz?

Wistful of Dollars
Aug 25, 2009

Just download Afterburner or Trixx.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Anime Schoolgirl posted:

Will Nvidia ever patch the 980 Ti to use VESA/Intel/AMD Adaptive Sync, though?

It would instantly make GSync obsolete, so they won't do that unless they are completely ready to declare defeat.

So no, probably not anytime soon.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Sacred Cow posted:

Does anyone have any experience with Micro Center's open-box GPUs? I'm about to get a GTX 970, and my local MC has 6 or 7 open-box Asus models for $280. Is it worth the risk, or should I just pay the extra $40 for new and try to sell off the pack-in game to make up the difference?

You can return them like a new product, but you're rolling the dice as to whether you're getting something someone "borrowed", something broken, a card run through the wringer, or a card with bad coil whine. It might also be missing stuff... MC doesn't do a lot of QA, and they certainly don't plug it into a PC to make sure it works. Look at the extra 40 bucks as insurance against having to go back and stand in line to return it. ASUS' 970 has made it to $280 retail online, too.

You also run the risk of the card being already registered and having to do a dance with the company's support to void the previous registration.

BIG HEADLINE fucked around with this message at 04:02 on Dec 28, 2015

BurritoJustice
Oct 9, 2012

Anime Schoolgirl posted:

Will Nvidia ever patch the 980 Ti to use VESA/Intel/AMD Adaptive Sync, though?

They can't; Maxwell is missing the hardware support (no DP 1.2a, only 1.2).

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Lockback posted:

It would instantly make GSync obsolete, so they won't do that unless they are completely ready to declare defeat.

So no, probably not anytime soon.

Well, right now FreeSync just isn't as good an implementation as G-Sync. NVIDIA would still have better sync range, ULMB, and overdrive. But yeah, it would cut into their vendor lock-in for sure.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


El Scotch posted:

Just download Afterburner or Trixx.

Does it matter which one?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Paul MaudDib posted:

Well, right now FreeSync just isn't as good an implementation as G-Sync. NVIDIA would still have better sync range, ULMB, and overdrive. But yeah, it would cut into their vendor lock-in for sure.

Borderless fullscreen support is a big one too, but overdrive has been working with FreeSync for a while; what hasn't is framerate-dependent overdrive.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BurritoJustice posted:

They can't; Maxwell is missing the hardware support (no DP 1.2a, only 1.2).

NVIDIA already has a similar capability built out in Mobile G-Sync. The only hardware requirement for FreeSync and Mobile G-Sync is Embedded DisplayPort, which has been around since like 2008.

I would be really, really surprised if this is something NVIDIA couldn't patch in, if they wanted to. There's nothing in the GPU that is particularly specific to doing Adaptive Sync - in hindsight it's almost surprising nobody did it beforehand since it's a pretty intuitive fix for tearing.

xthetenth posted:

Borderless fullscreen support is a big one too, but overdrive has been working with FreeSync for a while; what hasn't is framerate-dependent overdrive.

So it only works when you're at full FPS? That is not particularly useful - ghosting is really just as much of a problem as tearing.

Paul MaudDib fucked around with this message at 04:29 on Dec 28, 2015

BurritoJustice
Oct 9, 2012

Paul MaudDib posted:

NVIDIA already has a similar capability built out in Mobile G-Sync. The only hardware requirement for FreeSync and Mobile G-Sync is Embedded DisplayPort, which has been around since like 2008.

I would be really, really surprised if this is something NVIDIA couldn't patch in, if they wanted to. There's nothing in the GPU that is particularly specific to doing Adaptive Sync - in hindsight it's almost surprising nobody did it beforehand since it's a pretty intuitive fix for tearing.


So it only works when you're at full FPS? That is not particularly useful - ghosting is really just as much of a problem as tearing.

Variable VBlank has been included in eDP forever as a power-saving feature, and that's what Mobile G-Sync uses. It is, however, an important distinction that FreeSync (or rather, VESA Adaptive-Sync) does not use Variable VBlank in its current desktop form. When AMD showed off their initial "FreeSync" prototype to counter G-Sync, it was shown exclusively on laptops, since only mobile GPUs and panels using eDP could run it. When they turned it into a desktop product, however, Adaptive-Sync support had to be added to desktop GPUs via an extension to DP 1.2a.

For the same reason that Adaptive Sync doesn't work on the 7970, it can't work on the desktop Maxwell chips without a hardware update.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

BurritoJustice posted:

They can't; Maxwell is missing the hardware support (no DP 1.2a, only 1.2).

Surely they won't conveniently forget to include that for Pascal

SwissArmyDruid
Feb 14, 2014

by sebmojo
Realistically, I see them going to 1.3 and just pretending FreeSync doesn't exist, because they can flog the 5K60 and 8K60 capability it offers instead.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
the annoying part is if we're gonna be rendering games at stupid resolutions like 2160p and higher, variable refresh needs to be more accessible, it shouldn't be locked away in nvidia's bullshit walled garden

I mean the more I think about it, the holy 60hz refresh has been a completely arbitrary limitation this whole time, variable refresh is a smart solution to the problems a fixed refresh introduces, yet the rollout is being completely hosed up because nvidia wants to sell their bullshit module

it's all stupid bullshit

Panty Saluter
Jan 17, 2004

Making learning fun!

cat doter posted:

it's all stupid bullshit

Close the thread, we're done here.

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
I just want my video game frames delivered from farm to table, is that so much to ask for

BurritoJustice
Oct 9, 2012

cat doter posted:

the annoying part is if we're gonna be rendering games at stupid resolutions like 2160p and higher, variable refresh needs to be more accessible, it shouldn't be locked away in nvidia's bullshit walled garden

I mean the more I think about it, the holy 60hz refresh has been a completely arbitrary limitation this whole time, variable refresh is a smart solution to the problems a fixed refresh introduces, yet the rollout is being completely hosed up because nvidia wants to sell their bullshit module

it's all stupid bullshit

Freesync would never have happened (or at least not when it did) if Nvidia hadn't made GSync. For Nvidia it was a lot faster and evidently more effective to push out an in-house solution than to go the long way around with an industry standard. AMD is doing literally the same thing with Freesync over HDMI right now: it's not part of the HDMI standard (and won't be for the foreseeable future), so AMD is working on a proprietary scaler module to support it.

Hate them or not, proprietary techs push the industry forward.


I do think Nvidia could still push the "GSync is superior" angle if they supported AdaptiveSync as well on their new line of cards. GSync could be advertised as the higher-tech, more expensive solution for enthusiasts. Nvidia still gets to sell their module and the industry standard sees greater adoption. Win-win.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I don't understand why no one at nVidia has had the brainwave to mount the G-Sync module in a VESA-mount-sized enclosure so people who bought in early in the 120-144Hz craze could add the functionality to their monitors, or even better, to let people add G-Sync to the objectively cheaper FreeSync monitors. If anything, I'd think that would make more sense, because if the G-Sync module ever failed, the monitor itself would probably be useless. If it were mounted externally, you could ditch the loving thing, keep using the monitor, and just RMA the module.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BIG HEADLINE posted:

I don't understand why no one at nVidia has had the brainwave to mount the G-Sync module in a VESA-mount-sized enclosure so people who bought in early in the 120-144Hz craze could add the functionality to their monitors, or even better, to let people add G-Sync to the objectively cheaper FreeSync monitors. If anything, I'd think that would make more sense, because if the G-Sync module ever failed, the monitor itself would probably be useless. If it were mounted externally, you could ditch the loving thing, keep using the monitor, and just RMA the module.

The module ties into the display driver controller at a low level, and thus you can't really do that since it's an integral part of the display pipeline. That's why none of the G-sync monitors with the gen-1 module have any inputs besides the single G-sync DisplayPort input. The input is fed directly through the G-sync module, which basically functions as the display driver for the monitor.

You could probably offer it as a separate standalone product just to drive FreeSync monitors, but NVIDIA would never do it. It would be an enormous hassle, to say nothing of cutting into their own market.

Paul MaudDib fucked around with this message at 07:48 on Dec 28, 2015

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Paul MaudDib posted:

The module probably ties into the display driver controller at a low level, and thus you can't really do that since it's an integral part of the display pipeline. That's why none of the G-sync monitors with the gen-1 module have any inputs besides the single G-sync DisplayPort input.

Among other things, I'm pretty sure the G-Sync module has customized values for how hard to drive overdrive on the monitor, and any requirement for a tie-in to the specific panel makes an external module a non-starter unless it can identify the panel and pull those values down from an online table.

BurritoJustice
Oct 9, 2012

BIG HEADLINE posted:

I don't understand why no one at nVidia has had the brainwave to mount the G-Sync module in a VESA-mount-sized enclosure so people who bought in early in the 120-144Hz craze could add the functionality to their monitors, or even better, to let people add G-Sync to the objectively cheaper FreeSync monitors. If anything, I'd think that would make more sense, because if the G-Sync module ever failed, the monitor itself would probably be useless. If it were mounted externally, you could ditch the loving thing, keep using the monitor, and just RMA the module.

Nvidia did release G-Sync as a separate part very early on. For the reasons above it required a serious level of DIY to install, and it was specific to a single model of monitor. Not exactly practical.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Serious DIY? I thought it was just a screwdriver job.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
I don't get why NVIDIA is using a huge FPGA that costs like $600 each when the reprogrammability isn't being used at all and they could easily spin an ASIC for it.


Wistful of Dollars
Aug 25, 2009

Tab8715 posted:

Does it matter which one?

I've never tried Trixx, but I have no reason to think either is a poor choice.
