SwissArmyDruid
Feb 14, 2014

by sebmojo

Sinestro posted:

I don't get why NVIDIA is using a huge FPGA that costs like $600/each when the reprogrammability isn't being used at all and they could easily spin up an ASIC for it.


xthetenth posted:

Among other things I'm pretty sure the G-sync module has customized values for how hard to drive overdrive on the monitor, but anything requiring a tie-in to the specific monitor means it won't work unless it pulls those values down from an online table and can identify the panel itself.

After a bit of thought, I think I concur: they might still have to program things on a per-model basis. Now, is that cheaper than running off a different set of ASICs for each model of monitor? I don't know the pricing on that; does anyone know?


BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT
Trixx, I think, is pretty much Afterburner/Precision with a different skin. Last time I used it, it had the exact same features; only the skin looked different.

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

Tab8715 posted:

I haven't, actually, but for whatever reason Crimson OverDrive only gives me a way to increment my GPU/memory speed by percentages? My Google kung fu is failing me; how do I switch this to show MHz?

Home->Gaming tab->Global Settings (top-left)->Global OverDrive tab, use the slider thing at the bottom:

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

SwissArmyDruid posted:

After a bit of thought, I think I concur: they might still have to program things on a per-model basis. Now, is that cheaper than running off a different set of ASICs for each model of monitor? I don't know the pricing on that; does anyone know?

I mean, I feel like that's the kind of thing that could be read in from a flash chip and used to configure the same silicon – even if it requires some redundant systems, those would have to be pretty extensive to make the FPGA worth it.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

teh_Broseph posted:

Home->Gaming tab->Global Settings (top-left)->Global OverDrive tab, use the slider thing at the bottom:



On my system that shows up as + or - %, not as an actual MHz value, which is what he's talking about. Not that it's a huge problem, since it's easy enough to work out the values yourself.
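e: if anyone wants it, here's a rough sketch of working the values out yourself, assuming a hypothetical 1000 MHz core / 1250 MHz memory card just for illustration (check your card's actual stock clocks with GPU-Z or similar):

code:
# hypothetical stock clocks for illustration only
BASE_CORE_MHZ = 1000
BASE_MEM_MHZ = 1250

def offset_to_mhz(base_mhz, percent):
    """Effective clock for a given OverDrive percentage offset."""
    return base_mhz * (1 + percent / 100.0)

def mhz_to_offset(base_mhz, target_mhz):
    """Percentage offset needed to hit a target clock."""
    return (target_mhz / base_mhz - 1) * 100.0

print(offset_to_mhz(BASE_CORE_MHZ, 10))   # +10% -> 1100.0 MHz
print(mhz_to_offset(BASE_MEM_MHZ, 1375))  # 1375 MHz -> +10.0%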

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Sinestro posted:

I don't get why NVIDIA is using a huge FPGA that costs like $600/each when the reprogrammability isn't being used at all and they could easily spin up an ASIC for it.

That's more expensive than I'd have expected given display pricing; are they selling it at a loss to display manufacturers?

I figured they're using the programmability to specialize it for the different monitors, rather than having a different ASIC for each one.

E: might it also have reduced their time to market?

repiv
Aug 13, 2009

Subjunctive posted:

That's more expensive than I'd have expected given display pricing; are they selling it at a loss to display manufacturers?

~$500 is the single-unit price; they're probably 40-50% cheaper when bought by the hundred or thousand. Still a seriously :homebrew: component though.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

repiv posted:

~$500 is the single-unit price; they're probably 40-50% cheaper when bought by the hundred or thousand. Still a seriously :homebrew: component though.

$576 for the cheapest possible option on Digi-Key, and it's significantly more everywhere else ($836 on Mouser and similar). Still, I doubt they're paying more than $350 or so per FPGA.

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy

HalloKitty posted:

On my system that shows up as + or - %, not as an actual MHz value, which is what he's talking about. Not that it's a huge problem, since it's easy enough to work out the values yourself.



Oh weird, was wondering where the percentage was showing up. Looks like it's a newer (290) card thing? From a review: "The AMD Radeon R9 290/290X are both fully dynamic, so there is no longer an absolute clock to set. This means that overclocking is now done by percentages and not fixed values." I'm still on a 7970.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
What are the chances big Pascal will launch before the dual Fury and be faster and cheaper?

penus penus penus
Nov 9, 2014

by piss__donald

Don Lapre posted:

What are the chances big Pascal will launch before the dual Fury and be faster and cheaper?

I crunched the numbers, it's 49%

HMS Boromir
Jul 16, 2011

by Lowtax
There seems to be a lot of discussion about all this adaptive/G/free sync stuff going on and I'm a bit confused. Is it really just about screen tearing? V-Sync has always been an afterthought for me at best; can I safely ignore this stuff forever?

HMS Boromir fucked around with this message at 17:55 on Dec 28, 2015

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Don Lapre posted:

What are the chances big Pascal will launch before the dual Fury and be faster and cheaper?

Pretty big.

Anime Schoolgirl
Nov 28, 2002

HMS Boromir posted:

There seems to be a lot of discussion about all this adaptive/G/free sync stuff going on and I'm a bit confused. Is it really just about screen tearing? V-Sync has always been an afterthought for me at best; can I safely ignore this stuff forever?
it has more to do with not seeing "hiccups" when framerates don't divide evenly into 30 or 60 or whatever. when you actually see it in action, it feels as if the refresh rate is much higher than it actually is

basically it's an alternative to buying a 980ti, but most sync displays are priced such that you might as well just get a 980ti and a non-sync $250 144hz monitor

adaptive sync is an implementation that uses the monitor's own hardware to do it, but it's less flexible than a pure custom solution like gsync.

Anime Schoolgirl fucked around with this message at 18:14 on Dec 28, 2015

HMS Boromir
Jul 16, 2011

by Lowtax
Right, but do those hiccups happen if you're not doing framerate to refresh syncing of any description?

Anime Schoolgirl
Nov 28, 2002

they only don't happen if you cap the framerate so far beyond what the display shows (if it's 60hz) or run at such a constant 120/144fps that it doesn't matter. even without vsync you'll have half of the image hiccupping when you hit that magical 45fps number on a 60hz display, and this is actually jarring to a lot of people, namely people who didn't grow up playing quake on a pentium 1
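e: toy math if anyone cares. this assumes perfectly even 45fps frame times and an idealized 60hz scanout with vsync on, real numbers wobble around more:

code:
from fractions import Fraction

REFRESH = Fraction(1, 60)    # 60 Hz panel: buffer swap only on a refresh boundary (vsync on)
FRAMETIME = Fraction(1, 45)  # game renders a perfectly even 45 fps

shown = []
for n in range(8):
    ready = n * FRAMETIME
    ticks = -(-ready // REFRESH)   # ceiling division: next refresh at or after "ready"
    shown.append(ticks * REFRESH)

gaps_ms = [float((b - a) * 1000) for a, b in zip(shown, shown[1:])]
print([round(g, 1) for g in gaps_ms])
# -> [33.3, 16.7, 16.7, 33.3, 16.7, 16.7, 33.3]
# the game is pushing a steady 45fps but the screen shows a long-short-short
# cadence, which is the judder you notice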

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

HMS Boromir posted:

There seems to be a lot of discussion about all this adaptive/G/free sync stuff going on and I'm a bit confused. Is it really just about screen tearing? V-Sync has always been an afterthought for me at best; can I safely ignore this stuff forever?

The other part of it is what makes me more interested. Rather than frames getting made on one schedule and displayed on another, which causes varying latency from the graphics card and makes motion appear less smooth, the frames get made on a schedule and displayed after a roughly constant delay, so in motion they show up at about the right times relative to each other.
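To put rough numbers on the latency part (same idealized 45 fps on a 60 Hz panel as above, so this is a sketch, not a measurement):

code:
from fractions import Fraction

REFRESH = Fraction(1, 60)
FRAMETIME = Fraction(1, 45)
ready = [n * FRAMETIME for n in range(8)]

# fixed refresh + vsync: a finished frame waits for the next refresh tick
vsync_shown = [-(-t // REFRESH) * REFRESH for t in ready]
# variable refresh (g-sync/freesync, idealized): the panel refreshes when the
# frame is done, as long as the rate stays inside the panel's supported range
vrr_shown = ready

wait_ms = lambda shown: [round(float((s - r) * 1000), 1) for r, s in zip(ready, shown)]
print(wait_ms(vsync_shown))  # [0.0, 11.1, 5.6, 0.0, 11.1, 5.6, 0.0, 11.1] - varies frame to frame
print(wait_ms(vrr_shown))    # [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] - roughly constant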

HMS Boromir
Jul 16, 2011

by Lowtax
Alright, I think I mostly get the idea - it makes things look smoother when you get framerate drops than on a normal monitor, where you get stuttering because frames get displayed at uneven time intervals? Not sure what you mean by "half of the image" hiccupping though, Anime Schoolgirl. Are you describing something different from screen tearing?

Anime Schoolgirl
Nov 28, 2002

HMS Boromir posted:

Alright, I think I mostly get the idea - it makes things look smoother when you get framerate drops than on a normal monitor because frames get displayed at uneven time intervals? Not sure what you mean by "half of the image" hiccupping though, Anime Schoolgirl. Are you describing something different from screen tearing?
it's sort of the opposite of screen tearing, but you'll see both halves of the image alternate in an awkward 3:2-pulldown-type effect, like when you watch movies on a tv

Slider
Jun 6, 2004

POINTS
Gsync also reduces the input lag you get when running a game at your monitor's fixed refresh rate, typically 60hz. The Linus video is pretty informative:

https://www.youtube.com/watch?v=fghQh0Y4oA4
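Back-of-the-envelope version of the lag part (this ignores render queues and the scanout itself, so treat it as a rough lower bound on what vsync can add):

code:
# worst case, a finished frame sits a full refresh interval waiting for the
# next vsync tick before it even starts scanning out
for hz in (60, 120, 144):
    print(f"{hz} Hz: up to {1000 / hz:.1f} ms extra wait")
# 60 Hz: up to 16.7 ms, 120 Hz: up to 8.3 ms, 144 Hz: up to 6.9 ms
# gsync/freesync start the refresh when the frame is ready, so that wait mostly goes away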

Wistful of Dollars
Aug 25, 2009

Don Lapre posted:

What are the chances big Pascal will launch before the dual Fury and be faster and cheaper?

I give it ~85% chance atm.

HMS Boromir
Jul 16, 2011

by Lowtax
I looked up what 3:2 pulldown is and I've definitely never experienced anything like that. :psyduck: Maybe my eyes are just broken.

Truga
May 4, 2014
Lipstick Apathy

Anime Schoolgirl posted:

namely people who didn't grow up playing quake on a pentium 1

I played quake on a 386 and quake 2 on a pentium 1. Tearing makes me lose my mind. It wasn't noticeable on my 320x240@120 CRT!

Anime Schoolgirl
Nov 28, 2002

Don Lapre posted:

What are the chances big Pascal will launch before the dual Fury and be faster and cheaper?
if they keep their promise of Pascal being Better For Compute: 35%
if they don't and decide to put in more render throughput thingys: 100%

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Truga posted:

I played quake on a 386 and quake 2 on a pentium 1. Tearing makes me lose my mind. It wasn't noticeable on my 320x240@120 CRT!

Quake's min requirements were a p75 so i cant imagine you actually played quake like this

https://www.youtube.com/watch?v=E2zsV-kXMIk

This is even with a 387 addon

Truga
May 4, 2014
Lipstick Apathy
https://quakewiki.org/wiki/r_drawflat

drawflat baby

e: wow, there's even a screenshot attached.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
I get 70-160 fps in the games I play on my 60hz monitor. When I tried to use v-sync it would occasionally go below 60fps, which seemed odd to me.

I can cap fps to 65 with nvidia inspector and never get below 60fps, but then I dont have that exact 60 fps that matches up with my monitor.

60fps vsync, 65fps capped, and uncapped 70+ all look fine and I cant tell any difference between them.

In nvidia inspector, what should my settings be:
frame rate limiter - 60, 65, uncapped?
vertical sync - force on or off?
vertical sync tear control - adaptive, im pretty sure I want this one always.

Fauxtool fucked around with this message at 19:11 on Dec 28, 2015

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
I'm hoping for big leaps in GPU performance but do you really think we will get double GPU performance in a single card?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Fauxtool posted:

I get 70-160 fps in the games I play on my 60hz monitor. When I tried to use v-sync it would occasionally go below 60fps which seemed odd to me.

I can cap fps to 65fps with nvidia inspector and never get below 60fps but then I dont have the exact 60 fps that matches up with my monitor.

60fps vsync, 65fps capped, and uncapped 70+ all look fine and I cant tell any difference between them.

What would you do?

Get a second gpu

B-Mac posted:

I'm hoping for big leaps in GPU performance but do you really think we will get double GPU performance in a single card?


780ti to 980ti comes close before overclocking.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Don Lapre posted:

Get a second gpu

I just ordered quad titans to solve this non-problem I am having

penus penus penus
Nov 9, 2014

by piss__donald
Tearing drives me insane, I can't wait to get something to stop it. I will use vsync (or some variation) when I can, but I'm definitely in the "lag is even worse than tearing" crowd when it comes to games where that matters. And since any Gsync monitor I'd consider an upgrade over what I have now has too many downsides (quality or price), and AMD hasn't yet come out with something I'd actually want to own, I'm just out of luck for now.

Frame capping does almost eliminate even the worst examples of tearing for some games. It absolutely does nothing for others. I don't know what exactly distinguishes the two scenarios.

Panty Saluter
Jan 17, 2004

Making learning fun!


Just push the framerate high enough and it's not even noticeable :chord:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Don Lapre posted:

Quake's min requirements were a p75 so i cant imagine you actually played quake like this

https://www.youtube.com/watch?v=E2zsV-kXMIk

This is even with a 387 addon

That is legitimately how I played the Doom II port on my Macintosh LC III; I actually beat several levels at that framerate, figuring that's how fast it was supposed to run for everyone.

Wistful of Dollars
Aug 25, 2009

They should just skip DP 1.3 and bring in 1.4 so we can have 144+ hz 4k's. :chord:

monster on a stick
Apr 29, 2013
Thanks thread, I upgraded from a 6850 to an MSI GTX 970 Gaming 4G, booted up Dragon Age Inquisition, and what a difference.

CDW
Aug 26, 2004
Just got myself an MSI GTX 970 Gaming 4G, upgraded from a Sapphire 7850 that absolutely refused to overclock for some reason (it would just crash during games if I touched it at all). I've already applied a good OC and am amazed at how it's handling things.

Also got the new Rainbow Six and some free months of Xsplit Broadcaster to try out for Twitch streaming.

It was a completely unmolested open-box unit at Microcenter for $270 after rebate but before tax, and Amazon and nearly everyone else charges tax too.

penus penus penus
Nov 9, 2014

by piss__donald

CDW posted:

Just got myself an MSI GTX 970 Gaming 4G, upgraded from a Sapphire 7850 that absolutely refused to overclock for some reason (it would just crash during games if I touched it at all). I've already applied a good OC and am amazed at how it's handling things.

Also got the new Rainbow Six and some free months of Xsplit Broadcaster to try out for Twitch streaming.

It was a completely unmolested open-box unit at Microcenter for $270 after rebate but before tax, and Amazon and nearly everyone else charges tax too.

If all you want to do is stream your game (and voice, and a little webcam square), the Shadowplay built into GeForce Experience is very slick and streams very high-quality video now.

Panty Saluter
Jan 17, 2004

Making learning fun!
I had a 1GB Sapphire 7850 that wouldn't overclock either. My AMD CPUs never seemed to like it either, but who knows.

I now cannot overclock because of my ultra-widescreen hax :v:

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Must have warehouses full. These launched at $649, right?

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9777989


HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

That's incredibly cheap. Drat, at that price, worth it.
