macnbc posted:So I'm upgrading from an old GTX560Ti to a GTX970. No. Current cards don't come close to saturating PCIe 2.0 bandwidth, let alone 3.0.
|
|
# ? Apr 29, 2015 22:28 |
|
Subjunctive posted:Some engines (Unreal for one) allow the internal rendering resolution to differ from the display resolution, meaning that you could have it basically do 1080p worth of work and then scale it up to display at 1440p. Roughly the opposite of DSR, perhaps "dynamic sub resolution"! GW2 does this both ways. There's an option that goes subsampling/normal/supersampling, and each step doubles the 3D rendering resolution, with normal being your native res. Supersampling makes the game look better, for obvious reasons. Subsampling renders the 3D scene at half resolution, but any 2D parts (the UI) are still drawn at native resolution. It's a really cool feature: you get the performance of 1280x800, but the UI stays at 2560x1600, so you keep all the screen real estate and UI sharpness.
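A quick sketch of the arithmetic behind that, for anyone curious. This is my own toy model, not anything from GW2's code: it assumes shading cost scales linearly with pixels rendered and ignores geometry/CPU cost.

```python
# Rough fill-rate cost of GW2-style render scaling at a 2560x1600 display.
# Assumption (mine): cost scales linearly with pixels shaded.

def scene_pixels(width, height, scale):
    """Pixels rendered for the 3D scene at a given resolution scale."""
    return int(width * scale) * int(height * scale)

native_w, native_h = 2560, 1600
for name, scale in [("subsample", 0.5), ("native", 1.0), ("supersample", 2.0)]:
    px = scene_pixels(native_w, native_h, scale)
    rel = px / scene_pixels(native_w, native_h, 1.0)
    print(f"{name:11s} {px:>10,} px  ({rel:.2f}x native shading work)")

# The UI is composited at native resolution in every mode, which is why
# text and widgets stay sharp even when subsampling.
```

Subsampling shades a quarter of the pixels (1280x800 worth), which is where the big performance win comes from.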
|
# ? Apr 30, 2015 11:25 |
|
Is anyone else having their 970 "sag" a bit on the right side of the card? It's pretty heavy and I think the PSU cables are pushing it down a little bit. Has anyone had something horrible happen because of this?
|
# ? Apr 30, 2015 14:08 |
|
One of my radeons has been "sagged" a bit for nearly a year due to how I mounted the water cooling thing and it hasn't caused any issues.
|
# ? Apr 30, 2015 14:15 |
|
That's a very common thing, and reports of it actually causing a problem are almost nonexistent. I'm sure you could find one, but I wouldn't worry. You can GIS "bent GPU" or "sagging GPU" and see hundreds of pics of cards like this working fine. I imagine it's less common with the 970 than with the previous two or three gens, but it's still a big boy.
|
# ? Apr 30, 2015 14:46 |
|
My Strix 970 is a bit saggy on the right too. Seems to work ok, but it's kind of annoying. You spend all this time getting your cable management just so, and then you have this droopy ballbag GPU loving up all of your action.
|
# ? Apr 30, 2015 16:20 |
|
I've had saggy GPUs ever since GPUs got huge, I don't think it does any harm but I think that's why manufacturers started putting backplates on. Be careful about sagging if your normal computer usage involves throwing your case down the stairs.
|
# ? Apr 30, 2015 16:23 |
|
Buy a backplate or support it somehow. I've had good luck using the PCIe power connectors and a zip tie to pull it up a bit.
|
# ? Apr 30, 2015 16:27 |
|
My 970 doesn't sag, it leans back all
|
# ? Apr 30, 2015 16:56 |
|
Horizontal motherboards are the way of the future.
|
# ? Apr 30, 2015 18:05 |
|
veedubfreak posted:Horizontal motherboards are the way of the future. ITX has been trying to do that for years.
|
# ? Apr 30, 2015 18:07 |
|
veedubfreak posted:Horizontal motherboards are the way of the future. What about 6ft cases?
|
# ? Apr 30, 2015 18:07 |
|
Hace posted:What about 6ft cases? They call those coffins.
|
# ? Apr 30, 2015 18:17 |
|
veedubfreak posted:Horizontal motherboards are the way of the future. Nah, 90 degree motherboards are the way of the future. Every manufacturer clone the Silverstone FT02/FT05 design TIA
|
# ? Apr 30, 2015 18:24 |
|
repiv posted:Nah, 90 degree motherboards are the way of the future. Every manufacturer clone the Silverstone FT02/FT05 design TIA NLX?
|
# ? Apr 30, 2015 19:01 |
|
repiv posted:Nah, 90 degree motherboards are the way of the future. Every manufacturer clone the Silverstone FT02/FT05 design TIA I unironically really wish they would. It makes a lot more sense from a thermodynamics perspective than the standard ATX layout.
|
# ? Apr 30, 2015 19:52 |
|
A little handheld camera clip of Square Enix showing off what they think they'll be able to do with DX12... https://www.youtube.com/watch?v=rk7x5LxCmSU
|
# ? Apr 30, 2015 20:39 |
|
mcbexx posted:A little handheld camera clip of Square Enix showing off what they think they'll be able to do with DX12... If they're trying to say that they updated Agni's Philosophy to run under DX12, they really did themselves a disservice by not including any side-by-side shots with the old version.
|
# ? Apr 30, 2015 20:45 |
|
You could already do all that with DX11 albeit with more overhead, so showing it on DX12 without stating the hardware it's running on tells us absolutely nothing
|
# ? Apr 30, 2015 20:52 |
|
repiv posted:You could already do all that with DX11 albeit with more overhead, so showing it on DX12 without stating the hardware it's running on tells us absolutely nothing Rendering tech hasn't exactly been static since the original demo either, so showing side-by-side old version vs new version would be disingenuous as well.
|
# ? Apr 30, 2015 20:57 |
|
Wiggly Wayne DDS posted:Rendering tech hasn't exactly been static since the original demo either, so showing side-by-side old version vs new version would be disingenuous as well. Then run the old build on the same rig. Unless squeenix did the thing where they delete old builds as they go. But yeah, 8000x8000 textures. That's quite a bit of data. I'm assuming they did it on the setup they had a screenshot of at the beginning. Quad SLI or something.
|
# ? Apr 30, 2015 21:19 |
|
Rigged Death Trap posted:Then run the old build on the same rig. Back in 2012, they said the original Agni's demo was run on a single GTX 680. source: http://www.neogaf.com/forum/showthread.php?t=502188
|
# ? Apr 30, 2015 21:24 |
|
Rigged Death Trap posted:Then run the old build on the same rig. I'm not talking about hardware tech, nor rendering APIs.
|
# ? Apr 30, 2015 21:27 |
|
Wiggly Wayne DDS posted:I'm not talking about hardware tech, nor rendering apis. Right now I'm a bit confused.
|
# ? Apr 30, 2015 21:46 |
|
He means the general algorithms and techniques used in engines have improved in ways that didn't need new APIs, so it's not fair to use a snapshot from years ago to compare API versions. For example SMAA was developed during the DX11 era but it works just fine on DX9 hardware, so a deferred DX9 or DX10 renderer written today would look much better than a vintage one which uses FXAA or worse. repiv fucked around with this message at 22:25 on Apr 30, 2015 |
# ? Apr 30, 2015 21:47 |
|
How useful is the GeForce Experience (TM) game optimization?
|
# ? Apr 30, 2015 22:26 |
|
canyoneer posted:How useful is the GeForce Experience (TM) game optimization? Slightly more useful than tits on a board.
|
# ? Apr 30, 2015 23:14 |
|
canyoneer posted:How useful is the GeForce Experience (TM) game optimization? It's pretty useful actually! The only thing is that it usually targets ~45FPS rather than a locked 60, so you might want to ratchet a few things down after you apply it.
|
# ? Apr 30, 2015 23:17 |
|
Rigged Death Trap posted:Im assuming they did it on the setup they had a screenshot of at the beginning. Quad SLI or something. 4xTitan X. http://blogs.nvidia.com/blog/2015/04/30/build-demo/
|
# ? May 1, 2015 01:59 |
|
Haeleus posted:Has anyone had issues with playing any video in VLC after installing the latest Nvidia driver (the one for GTA V)? I'm not 100% sure if it's related, but since around the same time I can't play a single video for more than 5-10 minutes before the video locks up/goes black while the audio keeps going, and if I try to interact with the program it just shuts off entirely. I tried reinstalling to no avail, so I'm thinking it may be driver related. What version of VLC is it? Was the reinstall the newest version? I know the last two Nvidia drivers have given me some driver failures when doing certain things. Elite keeps crashing unless you downclock to like 1100MHz on a 980 SC. Apparently boost clocks on the Maxwell cards don't play nice with some things, I'm finding out. I think there's a way to force a constant clock and disable the boost and the energy saver, but I've been so busy the last 3 weeks I haven't had a chance to look.
|
# ? May 1, 2015 04:23 |
|
That's.... massively disappointing. Again, it was reported back in 2012 that the original Agni's Philosophy demo ran on a SINGLE GTX 680, which, considering how good it looked *then*, was impressive and a closer-to-real-world test scenario. (Even if the rig did have 32 GB of RAM.) It would have been more significant if they had gotten a graphically improved demo running on equally mundane hardware under DX12 (or at least one or two 970s, not quad Titan X). I mean, did you really need an 8K-by-8K downsample?
|
# ? May 1, 2015 05:57 |
|
http://blogs.msdn.com/b/directx/archive/2015/05/01/directx-12-multiadapter-lighting-up-dormant-silicon-and-making-it-work-for-you.aspx
|
# ? May 1, 2015 10:45 |
|
Ars Technica writeup on the Squenix demo, including high-quality version: http://arstechnica.com/gaming/2015/05/01/new-square-enix-real-time-directx-12-demo-crosses-the-uncanny-valley/
|
# ? May 1, 2015 12:30 |
|
MikusR posted:http://blogs.msdn.com/b/directx/archive/2015/05/01/directx-12-multiadapter-lighting-up-dormant-silicon-and-making-it-work-for-you.aspx Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps...
|
# ? May 1, 2015 12:57 |
|
Diviance posted:Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps... In before Nvidia cards disabling themselves if they detect another GPU, just like they currently disable PhysX.
|
# ? May 1, 2015 13:55 |
|
Rastor posted:Ars Technica writeup on the Squenix demo, including high-quality version: creepy. I like it though, once I wrap my head around "this is real time" and "not a video"
|
# ? May 1, 2015 15:03 |
|
Diviance posted:Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps... I see this helping AMD too, because their APUs have beefier GPUs, but asymmetrical CrossFire works like poo poo, last I read. THE DOG HOUSE posted:creepy. I like it though, once I wrap my head around "this is real time" and "not a video" Where's a high quality capture of the thing, though? The YouTube video quality is far from ideal, and all the "high-res" screenshots I've seen are just YouTube caps. Seems a bit perverse to go to all those lengths to make high quality models and shaders, only to have most people experience it through a low bitrate video. HalloKitty fucked around with this message at 15:07 on May 1, 2015 |
# ? May 1, 2015 15:03 |
|
I doubt we'll see it in any higher quality than that, at least as a video. Once he starts panning around, though, it's clear to me that this is a true next step.
|
# ? May 1, 2015 15:08 |
|
Diviance posted:Interesting, I wouldn't mind taking advantage of my Intel GPU for some extra fps... Check out the Frametime graph that they provide, it looks like using the integrated GPU gains you 10% extra frames at the cost of +20ms latency for each of said frames. That's a terrible tradeoff.
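Back-of-envelope on why that tradeoff is so bad, assuming a 60fps baseline (the 10% and 20ms figures are read off the linked graph, so treat this as a sketch, not a measurement):

```python
# Throughput gained vs latency paid when the iGPU adds ~10% fps at +20ms.
# Baseline of 60fps is my assumption for the example.

def tradeoff(base_fps, fps_gain, added_latency_ms):
    new_fps = base_fps * (1 + fps_gain)
    base_frametime = 1000.0 / base_fps       # ms per frame at baseline
    new_frametime = 1000.0 / new_fps         # ms per frame with the iGPU helping
    saved = base_frametime - new_frametime   # throughput win per frame
    return new_fps, saved, added_latency_ms

fps, saved, cost = tradeoff(base_fps=60, fps_gain=0.10, added_latency_ms=20)
print(f"{fps:.0f} fps instead of 60 saves {saved:.1f} ms/frame of throughput,")
print(f"but each displayed frame arrives ~{cost} ms later than it would have.")
```

You save about 1.5ms of frame time per frame while paying 20ms of extra input-to-display latency, so the screen technically updates more often but everything you see is further in the past.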
|
# ? May 1, 2015 16:54 |
|
AFR with mis-matched GPUs is a recipe for disaster. Hopefully multi-adapter is nuanced enough to do more fine-grained workload splitting, like always rendering the main passes on the discrete GPU then applying post-process passes on the IGP.
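To illustrate the judder problem with mismatched AFR: a deliberately over-simplified serial model (frame times are made up, and real AFR pipelines the GPUs in parallel) still shows the core issue, which is that present-to-present intervals oscillate instead of staying even.

```python
# Toy model of AFR frame pacing across mismatched GPUs.
# Assumed (not measured) frame times: fast discrete GPU vs slow integrated GPU.

fast_gpu_ms = 10.0   # discrete GPU renders even frames
slow_gpu_ms = 25.0   # integrated GPU renders odd frames

def afr_intervals(n_frames):
    """Present-to-present intervals when frames alternate between GPUs."""
    return [fast_gpu_ms if i % 2 == 0 else slow_gpu_ms for i in range(n_frames)]

pacing = afr_intervals(6)
print("frame intervals (ms):", pacing)   # alternates 10, 25, 10, 25, ...
print("mean fps:", round(1000 / (sum(pacing) / len(pacing)), 1))
```

The average frame rate looks respectable, but the delivery alternates between a quick frame and a slow one, which is exactly the stutter that per-pass splitting (main passes on the dGPU, post-processing on the IGP) avoids.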
|
# ? May 1, 2015 17:37 |