AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

macnbc posted:

So I'm upgrading from an old GTX560Ti to a GTX970.

I noticed that the 970 uses PCIe 3.0 where my current motherboard is still PCIe 2.0. I know that it's backwards-compatible, but will I notice any performance hit at all?

No, PCIe 2.0 does not even hit full utilization right now, let alone 3.0.
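
As a quick sanity check on that claim, here's the back-of-the-envelope bandwidth math (a sketch only; it assumes x16 links and the standard per-generation encodings, and a GPU rarely saturates the link outside of synthetic transfers):

```python
# Approximate per-direction bandwidth of a x16 PCIe link.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding  -> 500 MB/s per lane.
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding -> ~985 MB/s per lane.

def pcie_bandwidth_gbs(gt_per_s, payload_bits, line_bits, lanes=16):
    """Usable bandwidth in GB/s for one direction of a PCIe link."""
    bytes_per_s_per_lane = gt_per_s * 1e9 * (payload_bits / line_bits) / 8
    return lanes * bytes_per_s_per_lane / 1e9

gen2 = pcie_bandwidth_gbs(5.0, 8, 10)     # 8.0 GB/s
gen3 = pcie_bandwidth_gbs(8.0, 128, 130)  # ~15.75 GB/s
print(f"PCIe 2.0 x16: {gen2:.2f} GB/s, PCIe 3.0 x16: {gen3:.2f} GB/s")
```

So 3.0 roughly doubles the link, but since games don't come close to filling an x16 2.0 slot, the extra headroom goes unused.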

Truga
May 4, 2014
Lipstick Apathy

Subjunctive posted:

Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"!

GW2 does this both ways. There's an option that goes subsampling/normal/supersampling. Each setting doubles the 3D rendering resolution, with normal being your native.

Supersampling 3d makes the game look better, for obvious reasons. Subsampling renders the 3d at half resolution, but any 2d parts (the UI) are still at your native resolution. It's a really cool thing, because you can get the performance of 1280x800, but the UI still stays at 2560x1600, so you keep all the screen real-estate/UI sharpness.
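
The performance claim checks out on pixel counts alone (a rough sketch; shading cost isn't perfectly linear in pixels, but it's close for fill-bound games):

```python
# 3D pixel workload at each GW2-style scaling step, relative to native.
# Subsampling halves each axis, supersampling doubles each axis.
native = (2560, 1600)

def pixels(w, h, scale):
    return int(w * scale) * int(h * scale)

sub = pixels(*native, 0.5)   # 1280x800  -> 1,024,000 px
norm = pixels(*native, 1.0)  # 2560x1600 -> 4,096,000 px
sup = pixels(*native, 2.0)   # 5120x3200 -> 16,384,000 px

print(sub / norm)  # subsampled 3D pass shades a quarter of the pixels
print(sup / norm)  # supersampled pass shades four times as many
```

The UI compositing pass still runs at the full native resolution either way, which is why the text stays sharp.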

Foxhound
Sep 5, 2007
Is anyone else having their 970 "sag" a bit on the right side of the card? It's pretty heavy and I think the PSU cables are pushing it down a little bit. Has anyone had something horrible happen because of this?

Truga
May 4, 2014
Lipstick Apathy
One of my radeons has been "sagged" a bit for nearly a year due to how I mounted the water cooling thing and it hasn't caused any issues. :shrug:

penus penus penus
Nov 9, 2014

by piss__donald
That's a very common thing, and reports of it being the root of a problem are almost nonexistent. I'm sure you could find one, but I wouldn't worry. You can GIS "bent GPU" or "sagging GPU" and see hundreds of pics of this, all working fine though. I imagine it's less common with the 970 than with the previous 2 or 3 gens, but it's still a big boy

betterinsodapop
Apr 4, 2004

64:3
My Strix 970 is a bit saggy on the right too. Seems to work ok, but it's kind of annoying.
You spend all this time getting your cable management just so, and then you have this droopy ballbag GPU loving up all of your :rice: action.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I've had saggy GPUs ever since GPUs got huge, I don't think it does any harm but I think that's why manufacturers started putting backplates on. Be careful about sagging if your normal computer usage involves throwing your case down the stairs.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Buy a backplate or support it somehow. I've had good luck using the PCIe power connectors and a zip tie to push it up a bit

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

My 970 doesn't sag, it leans back all :smug:

veedubfreak
Apr 2, 2005

by Smythe
Horizontal motherboards are the way of the future.

SwissArmyDruid
Feb 14, 2014

by sebmojo

veedubfreak posted:

Horizontal motherboards are the way of the future.

ITX has been trying to do that for years.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

veedubfreak posted:

Horizontal motherboards are the way of the future.

What about 6ft cases?

veedubfreak
Apr 2, 2005

by Smythe

Hace posted:

What about 6ft cases?

They call those coffins.

repiv
Aug 13, 2009

veedubfreak posted:

Horizontal motherboards are the way of the future.

Nah, 90 degree motherboards are the way of the future. Every manufacturer clone the Silverstone FT02/FT05 design TIA

EoRaptor
Sep 13, 2003

by Fluffdaddy

repiv posted:

Nah, 90 degree motherboards are the way of the future. Every manufacturer clone the Silverstone FT02/FT05 design TIA

NLX?

SwissArmyDruid
Feb 14, 2014

by sebmojo

repiv posted:

Nah, 90 degree motherboards are the way of the future. Every manufacturer clone the Silverstone FT02/FT05 design TIA

I unironically really wish they would. It makes a lot more sense from a thermodynamics perspective than the standard ATX layout.

mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



A little handheld camera clip of Square Enix showing off what they think they will be able to do with DX12...

https://www.youtube.com/watch?v=rk7x5LxCmSU

SwissArmyDruid
Feb 14, 2014

by sebmojo

mcbexx posted:

A little handheld camera clip of Square Enix showing off what they think they will be able to do with DX12...

https://www.youtube.com/watch?v=rk7x5LxCmSU

If they're trying to say that they updated Agni's Philosophy to run under DX12, they really did themselves a disservice without any side-by-side shots compared to the old version.

repiv
Aug 13, 2009

You could already do all that with DX11 albeit with more overhead, so showing it on DX12 without stating the hardware it's running on tells us absolutely nothing :crossarms:

Wiggly Wayne DDS
Sep 11, 2010



repiv posted:

You could already do all that with DX11 albeit with more overhead, so showing it on DX12 without stating the hardware it's running on tells us absolutely nothing :crossarms:

Rendering tech hasn't exactly been static since the original demo either, so showing side-by-side old version vs new version would be disingenuous as well.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Wiggly Wayne DDS posted:

Rendering tech hasn't exactly been static since the original demo either, so showing side-by-side old version vs new version would be disingenuous as well.

Then run the old build on the same rig.
Unless Squeenix did the thing where they delete old builds as they go.

But yeah, 8000x8000 textures. That's quite a bit of data.
I'm assuming they did it on the setup they had a screenshot of at the beginning. Quad SLI or something.
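
It really is a lot of data. A rough sketch of the footprint, assuming uncompressed 32-bit RGBA (real assets would be block-compressed, which cuts this by 4-8x):

```python
# Memory footprint of a single 8000x8000 texture, uncompressed RGBA8.
width = height = 8000
bytes_per_texel = 4  # RGBA, 8 bits per channel

base = width * height * bytes_per_texel  # 256,000,000 bytes
mip_chain = base * 4 / 3                 # a full mip pyramid adds ~1/3

print(f"base level: {base / 2**20:.0f} MiB")      # ~244 MiB
print(f"with mips:  {mip_chain / 2**20:.0f} MiB") # ~326 MiB
```

A quarter gigabyte per texture before compression, so it's not hard to see why they'd want a lot of VRAM.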

SwissArmyDruid
Feb 14, 2014

by sebmojo

Rigged Death Trap posted:

Then run the old build on the same rig.
Unless squeenix they did the thing where they delete old builds as they go.

But yeah 8000x8000 textures. That's quite a bit of data.
Im assuming they did it on the setup they had a screenshot of at the beginning. Quad SLI or something.

Back in 2012, they said the original Agni's demo was run on a single GTX 680.

source: http://www.neogaf.com/forum/showthread.php?t=502188

Wiggly Wayne DDS
Sep 11, 2010



Rigged Death Trap posted:

Then run the old build on the same rig.
Unless squeenix they did the thing where they delete old builds as they go.

But yeah 8000x8000 textures. That's quite a bit of data.
Im assuming they did it on the setup they had a screenshot of at the beginning. Quad SLI or something.

I'm not talking about hardware tech, nor rendering APIs.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Wiggly Wayne DDS posted:

I'm not talking about hardware tech, nor rendering apis.

Right now I'm a bit confused.

repiv
Aug 13, 2009

He means the general algorithms and techniques used in engines have improved in ways that didn't need new APIs, so it's not fair to use a snapshot from years ago to compare API versions.

For example SMAA was developed during the DX11 era but it works just fine on DX9 hardware, so a deferred DX9 or DX10 renderer written today would look much better than a vintage one which uses FXAA or worse.

repiv fucked around with this message at 22:25 on Apr 30, 2015

canyoneer
Sep 13, 2005


I only have canyoneyes for you
How useful is the GeForce Experience (TM) game optimization?

veedubfreak
Apr 2, 2005

by Smythe

canyoneer posted:

How useful is the GeForce Experience (TM) game optimization?

Slightly more useful than tits on a board.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

canyoneer posted:

How useful is the GeForce Experience (TM) game optimization?

It's pretty useful actually! The only thing is that it usually targets ~45FPS rather than a locked 60, so you might want to ratchet a few things down after you apply it.

CatHorse
Jan 5, 2008

Rigged Death Trap posted:

Im assuming they did it on the setup they had a screenshot of at the beginning. Quad SLI or something.

4xTitan X. http://blogs.nvidia.com/blog/2015/04/30/build-demo/

Darkpriest667
Feb 2, 2015

I'm sorry I impugned
your cocksmanship.

Haeleus posted:

Has anyone had issues with playing any video with VLC after installing the latest Nvidia driver (for GTA V)? I'm not 100% sure if it's related, but from around the same time I cannot play a single video past 5-10 minutes before the video locks up/goes black while the audio keeps going, and if I try to interact with the program it just shuts off entirely. I tried reinstalling to no avail, so I'm thinking it may be driver related.


What version of VLC is it? Was the reinstall the newest version? I know the last two Nvidia drivers have given me some driver failures when I was doing certain things. ELITE keeps crashing unless you downclock to like 1100MHz on a 980 SC. Apparently boost clocks on the Maxwell cards don't play nice with some things, I'm finding out. I think there is a way to force a constant clock and disable the boost and the energy saver, but I've been so busy the last 3 weeks I haven't had the chance to look.

SwissArmyDruid
Feb 14, 2014

by sebmojo

That's... massively disappointing. Again, it was reported back in 2012 that the original Agni's Philosophy demo was run on a SINGLE GTX 680, which, considering how good it looked *then*, was impressive, and a closer-to-real-world test scenario. (Even if the rig did have 32 GB of RAM.) It would have been more significant if they had gotten a graphically improved demo running on equally mundane hardware under DX12 (or at least one or two 970s, not quad Titan X). I mean, did you really need 8K by 8K downsample?

CatHorse
Jan 5, 2008
http://blogs.msdn.com/b/directx/archive/2015/05/01/directx-12-multiadapter-lighting-up-dormant-silicon-and-making-it-work-for-you.aspx

Rastor
Jun 2, 2001

Ars Technica writeup on the Squenix demo, including high-quality version:

http://arstechnica.com/gaming/2015/05/01/new-square-enix-real-time-directx-12-demo-crosses-the-uncanny-valley/

Diviance
Feb 11, 2004

Television rules the nation.

Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps...

Truga
May 4, 2014
Lipstick Apathy

Diviance posted:

Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps...

In before nvidia disabling itself if it detects another gpu, just like it currently disables physx. :v:

penus penus penus
Nov 9, 2014

by piss__donald

creepy. I like it though, once I wrap my head around "this is real time" and "not a video"

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Diviance posted:

Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps...

I see this helping AMD also, because their APUs have beefier GPUs, but asymmetrical crossfire works like poo poo, last I read of it.

THE DOG HOUSE posted:

creepy. I like it though, once I wrap my head around "this is real time" and "not a video"

Where's a high quality cap of the thing, though? The YouTube video quality is far from ideal, and all the "high-res" screenshots I've seen are just YouTube caps. Seems a bit perverse to go to all those lengths to make high quality models and shaders, only to have most people experience it through a low bitrate video.

HalloKitty fucked around with this message at 15:07 on May 1, 2015

penus penus penus
Nov 9, 2014

by piss__donald
I doubt we'll see it any higher quality than that, at least as a video. Once he starts panning around though, it's... clear to me that this is certainly a true next step.

Blorange
Jan 31, 2007

A wizard did it

Diviance posted:

Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps...

Check out the Frametime graph that they provide, it looks like using the integrated GPU gains you 10% extra frames at the cost of +20ms latency for each of said frames. That's a terrible tradeoff.
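
In rough numbers (a sketch of that tradeoff using the 10% and +20 ms figures read off their graph, and a 60 fps baseline for illustration):

```python
# A 10% frame-rate gain vs +20 ms of added latency per frame.
base_fps = 60.0
multi_fps = base_fps * 1.10             # 66 fps with the IGP helping

base_frametime = 1000.0 / base_fps      # ~16.7 ms
multi_frametime = 1000.0 / multi_fps    # ~15.2 ms
added_latency = 20.0                    # ms, from shipping frames to the IGP

# You shave ~1.5 ms off the frame time but pay 20 ms before the frame
# reaches the screen -- over a full extra frame of input lag at 60 fps.
net_latency_change = multi_frametime - base_frametime + added_latency
print(f"net latency change: {net_latency_change:+.1f} ms")
```

So each frame arrives slightly more often but noticeably later, which is why it's a poor trade for anything input-sensitive.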

repiv
Aug 13, 2009

AFR with mis-matched GPUs is a recipe for disaster. Hopefully multi-adapter is nuanced enough to do more fine-grained workload splitting, like always rendering the main passes on the discrete GPU then applying post-process passes on the IGP.
