Ludicrous Gibs!
Jan 21, 2002

I'm not lost, but I don't know where I am.
Ramrod XTreme
Yeah, I know, it's a waste of money, but it's just. So. Pretty.

Animal
Apr 8, 2003

I am gonna get a 670 because my 560 Ti 448 is not enough for 1440p. Resale value for the 560 Ti seems bad, so I might as well use it for PhysX.

Kramjacks
Jul 5, 2007

Agreed posted:

Yep. That said, I've gone in and gussied it up like loving crazy in the .ini files and it looks gorgeous rendering on the 680... And I don't ever encounter slowdown, ever, with the overkill 580. It just crunches through it and takes a nap at the same time, I reckon.

So, :getin: indeed.

I have the junior version of your setup, a 670 and 570 for Physx. What sort of stuff did you do in the .ini? Is there a guide for that anywhere?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Kramjacks posted:

I have the junior version of your setup, a 670 and 570 for Physx. What sort of stuff did you do in the .ini? Is there a guide for that anywhere?

No, the closest we've got at the moment is this poster's continued exploration. I've also added commentary, but it's intermingled with unrelated discussion, including some launch day kvetching (I was one of the people affected by both the "installing .NET Framework 4 over and over" bug and the "can't get preorder bonuses" error, so there is a bit of :qq: over that, since I was really looking forward to playing it yesterday after work, d'oh). So here's a more collected version of my thoughts:

Notes regarding WillowEngine.ini, located in the My Documents\My Games\Borderlands 2\WillowGame\Config directory, based on loving around with stuff:

WillowEngine.ini

By default the game renders via D3D9. You can enable a D3D11 mode for improved effects quality, tessellation, etc.; this is similar to how Borderlands 1 had a DX10 rendering path that was fully fleshed out but disabled in the engine by default. Borderlands 2's D3D11 mode looks great. I initially thought it caused problems due to tessellation, but that's not the case. You can enable both MSAA and Temporal AA. The setting says "D3D9 MSAA" but from what I can tell it works in D3D11 as well. 4 samples is plenty, combined with FXAA and Temporal AA, given Borderlands' unique and not too demanding rendering style. You'll get a good combination of smoothing and sharpness.

Don't mess with shadows, uh, at all. THAT'S what was causing issues for me. There are lots of ways for that to break the game's visuals, as it turns out; treat the stock values as hard-coded for the time being. Maybe further experimentation will be fruitful, but right now stock shadows should be fine. Also, don't mess with reflections; that'll break texture LOD and look ugly as hell.

Raising PhysX to "3" rather than the maximum Launcher/in-game "High" value of 2 will give you what amounts to a hidden "Very High" setting. I'd write-protect the file after making all these edits, as the Launcher tries to apply its settings every time. I'd like a way to bypass the launcher entirely but no such luck as of yet.
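If you'd rather not keep flipping the read-only box in Explorer by hand, here's a rough little sketch in Python that does the same thing (nothing official, and the Documents path is an assumption; point it at wherever your config folder actually lives):
code:
import os, stat

# Assumed location of the Borderlands 2 config folder -- adjust to taste.
CONFIG_DIR = os.path.expanduser(r"~\Documents\My Games\Borderlands 2\WillowGame\Config")

for name in ("WillowEngine.ini", "WillowGame.ini"):
    path = os.path.join(CONFIG_DIR, name)
    os.chmod(path, stat.S_IREAD)  # on Windows this just sets the read-only attribute
    print(name, "writable?", os.access(path, os.W_OK))
To undo it before a patch or before letting the Launcher change settings again, chmod the files back with stat.S_IWRITE.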

WillowGame.ini

You can up the fading decal count and increase how long fading decals take to go away, for a bit more visual flair.

-----

All I've got so far, but it's looking really nice and playing smooth as hell.

Kramjacks
Jul 5, 2007

Awesome, thanks!

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Intel has announced WiDi 3.5 and is pushing it to OEMs. It looks like a good update:
  • Lower latency
  • USB and multitouch transmission
  • Stereoscopic 3D support
  • Compatibility with Miracast receivers

bacon!
Dec 10, 2003

The fierce urgency of now

Berk Berkly posted:

As of right now? No higher than a GTX 670 and no lower than the GTX 660, with the only card in between being the 660 Ti. The 680 is far too close to the 670's performance to justify the extra $100 it costs, and the 690 is just a doubled-up 680. It is a matter of how much you want to spend.

Just don't pay a premium for GTX 660/660 Tis with more than 2GB of VRAM.

Why not? I've got a 2560x1440 monitor and I was under the impression the cards with more VRAM would perform better specifically at those resolutions. Faulty assumption?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
More than 1GB helps at that resolution. More than 2GB is, generally speaking, the realm of ridiculous SLI/CF setups and three or more monitors.

GRINDCORE MEGGIDO
Feb 28, 1985


bacon! posted:

Why not? I've got a 2560x1440 monitor and I was under the impression the cards with more VRAM would perform better specifically at those resolutions. Faulty assumption?

For a single card @ that res, I'd consider a 7970 - have a look at reviews using modern drivers.

Gwaihir
Dec 8, 2009
Hair Elf

Agreed posted:

snip

-----

All I've got so far, but it's looking really nice and playing smooth as hell.

Man, BL2 really has been a pleasant surprise. I upgraded from a GTX 480 (don't laugh too hard, I was struggling with a GTX 260 driving a 30" screen, and got the 480 for $350) to a 680. I was considering doing the even more laughably silly version of your setup and keeping the 480 in there for PhysX, but ultimately, after trying it both ways, I didn't see enough of a slowdown on the 680 to justify leaving the 480 in the machine sucking power. I'm looking forward to trying all the tweaks / DX11 rendering path tonight though.

On a related note, the Asus DirectCU II cards are really damned nice. I've had basically nothing but EVGA cards till now, but I wanted one that was at least as quiet as my 480 was; I had a Thermalright Spitfire on it with a 140mm fan, because the stock 480 is probably in the same realm of volume as the infamous FX dustbuster. The Asus card delivers: no coil whine or anything else annoying, and it's no louder under load gaming than the Spitfire's slow fan was.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

Agreed posted:

snip

This is a good post and I wanna be able to find it later.

Spek
Jun 15, 2012

Bagel!
I did some testing with single & multi card PhysX on Borderlands 2 using an old extra card. Here are the results, in case anyone else is interested in seeing whether putting an old card in for PhysX is worth it.

Each test was 10 minutes and I made sure to stay in the action for as much of that as possible, with very little faffing about in the menus or shops. The settings were all the highest the launcher allows, except anisotropic filtering was at 2x and Depth of Field was turned off; no frame limiting or vsync. The resolution was 2560x1440.
code:
GTX680 graphics GTX470 PhysX:
Frames, Time (ms), Min, Max, Avg
 43506,    600000,  18, 105, 72.510

GTX680 doing both graphics and PhysX:
Frames, Time (ms), Min, Max, Avg
 37497,    600000,  28, 118, 62.495

GTX680 graphics and CPU PhysX(i7 2600k, clocked at 4.6ghz):
Frames, Time (ms), Min, Max, Avg
 38838,    600000,  34, 113, 64.730
The higher average but lower min & max with the dedicated card doing PhysX seems strange to me; I can only guess it's because having a second card drops the PCI-E slot bandwidth from x16 to x8. But the real surprise is that CPU PhysX got the best results overall for me. Either way, it doesn't look like it's worth the increased power use & heat to run a 470 as a dedicated PhysX card. I'm really curious whether something in the 500 or 600 series would get a big enough boost to make it worthwhile.
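(If anyone wants to sanity-check the averages, they're just total frames divided by elapsed seconds; a quick Python sketch with the numbers above:)
code:
# Avg fps = frames / (time in ms / 1000), from the Fraps summaries above.
runs = {
    "GTX680 + GTX470 PhysX": (43506, 600000),
    "GTX680 doing both":     (37497, 600000),
    "GTX680 + CPU PhysX":    (38838, 600000),
}
for label, (frames, time_ms) in runs.items():
    print(f"{label}: {frames / (time_ms / 1000):.3f} fps")
# -> 72.510, 62.495, 64.730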

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
I remember 5 or 6 years ago there used to be actual dedicated PhysX cards. Not that anyone actually bought one, but are those still useful at all, or are they entirely obsolete by now compared to previous-gen spare Geforces?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Space Racist posted:

I remember 5 or 6 years ago there used to be actual dedicated PhysX cards. Not that anyone actually bought one, but are those still useful at all, or are they entirely obsolete by now compared to previous-gen spare Geforces?
They only work with old versions of the PhysX library. Modern games require current versions and would only run in software or on a GPU.

craig588
Nov 19, 2005

by Nyc_Tattoo
They turned out to be slower in many cases than forcing the PhysX to run in software on a contemporary CPU too.

Scalding Coffee
Jun 26, 2006

You're already dead
Is the 660 Ti fan (I assume) supposed to make a weird-sounding, interrupted buzz when BOINC is running? I've never seen this kind of fan before and it is a little louder than my 560 Ti. I don't mind the noise too much and it has been running for more than 20 minutes.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
I think EVGA's cards as of late are odd because they seem content with equipping even their non-reference models with smaller, higher-RPM blowers, which are unfortunately a lot noisier and less effective than the larger non-stock multi-fan alternatives.

MiniSune
Sep 16, 2003

Smart like Dodo!

Spek posted:

I did some testing with single & multi card PhysX on Borderlands 2 using an old extra card.

Thank you for doing this. Down the track I'll do the same with a 460, but in a x16 slot; it'll be interesting to compare numbers.

Whale Cancer
Jun 25, 2004

I've never heard my evga 660ti make said noise. I've got two big fans running full time at the top of my case though.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Spek posted:

I did some testing with single & multi card PhysX on Borderlands 2 using an old extra card.

snip

Maximum GTX 580 usage is around 8% under EXTREME PhysX conditions, with forced higher PhysX.

What are you using to bench?

Spek
Jun 15, 2012

Bagel!

Agreed posted:

What are you using to bench?

Fraps. I was originally going to do longer benchmarks, but Fraps kept discarding the info after some arbitrary amount of time, so I got annoyed and did short ones just to make sure I got some data.

Aquila
Jan 24, 2003

Glen Goobersmooches posted:

I think EVGA's cards as of late are odd because they seem content with equipping even their non-reference models with smaller, higher RPM blowers which are unfortunately a lot more noisy and less effective than the larger non-stock multifan alternatives.

Note that some of EVGA's non-reference coolers absolutely suck. The one on my 560 Ti 448 Classified (this one) was very noisy and ran over 90C when gaming. I replaced it with a Gelid cooler and now it's running around 60C. Looking at EVGA's current line it doesn't seem like they use that specific cooler design anymore, but be careful.

Kramjacks
Jul 5, 2007

I tried Borderlands 2 with Nvidia 3D Vision. It works and there weren't any graphical glitches, but there also wasn't any real implementation. The HUD wasn't on its own plane closer to the viewer, so it looked flat against everything. The mouse cursor wasn't in 3D at all; if I turned the 3D intensity up at all in the Nvidia control panel, it would appear as two cursors about an inch apart, meaning I had to center whatever I wanted to click on between them.

I'm going to stick with playing in 2D unless Nvidia or Gearbox updates something to improve the 3D experience.

Scalding Coffee
Jun 26, 2006

You're already dead
The noisy card might have been getting used to being on all the time. It very occasionally made that buzzing sound, but it seems to be as quiet as my 560ti. Just getting used to that new card smell.

KillHour
Oct 28, 2007


Scalding Coffee posted:

The noisy card might have been getting used to being on all the time. It very occasionally made that buzzing sound, but it seems to be as quiet as my 560ti. Just getting used to that new card smell.

Buzzing usually comes from the chokes. I'm pretty sure it never just "goes away".

Wozbo
Jul 5, 2010
The Gigabyte GTX 670 is pretty nice. If you don't have restricted airflow it will only ever hit 65C max and stay at about 20% fan. If you do, though, prep for 90% fan at a stable 90C (FurMark tested). Also, closely spaced slots for SLI are a pretty big pita, and it looks like the cards might grind up against one another (lifting one slightly via some zip ties on the power cables, instead of letting gravity drop it a little, does the trick). I wish my mobo had farther-apart PCI-E slots :(.

Speaking of extravagant expenses, I got the 144Hz monitor that's out on Newegg, and I can just say goddayum. This + adaptive vsync is the freaking bee's knees.

(Side note: If you have nvidia glasses + 3d emitter, don't use a USB 3.0 port; it fucks up something fierce)

Wozbo fucked around with this message at 15:39 on Sep 21, 2012

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
Gigabyte's 670 is literally the thinnest model there is. The problem would have to be your motherboard's crowded PCI-E array.

Also, 144Hz? What monitor is this? If you've got a 120Hz you should basically never need vsync ever, unless you're playing ancient games without some kind of frame limiter on the high end.

Wozbo
Jul 5, 2010
The card takes up 2.00001 slots. The pins from the power connector stick out a good 3 to 4 mm farther than they should (this will cause a bzzzzzzzzz->SNAP as the fan from the other card runs against them). I really wish I could mount something to suspend the cards perfectly, because it's literally just the smallest bit that's causing it. I'm not going to dork with sanding it down; I just have a zip tie on the cables forcing a perfect suspension (no bending), so nothing (not even any vibration) will cause any touching. I just forced adaptive because that way I don't have to worry about legacy games (things like TF2 come to mind). I've had a few instances where some of the games I play a) don't ask what to use and b) will take 100% power to go at an fps higher than the refresh rate of my monitor. They are few and far between, but considering the near-zero cost of forcing adaptive, I see no reason not to.

http://www.newegg.com/Product/Product.aspx?Item=N82E16824236293

This is the monitor. Funny story: the 3D glasses actually do work at 144Hz as well; it just took the slightest bit of finagling.

E: I have a VGA bracket apparently still in my case box. I'm going to go get that and see if it helps.

E2: It's too big to install and keep the side fan, so no go.

Wozbo fucked around with this message at 16:26 on Sep 21, 2012

KillHour
Oct 28, 2007


Glen Goobersmooches posted:

Gigabyte's 670 is literally the thinnest model there is. The problem would have to be your motherboard's crowded PCI-E array.

Also, 144Hz? What monitor is this? If you've got a 120Hz you should basically never need vsync ever, unless you're playing ancient games without some kind of frame limiter on the high end.

A 670 can easily run 150+FPS on many games maxed. Not Metro, but still.

Cavauro
Jan 9, 2008

Glen Goobersmooches posted:

Also, 144Hz? What monitor is this? If you've got a 120Hz you should basically never need vsync ever, unless you're playing ancient games without some kind of frame limiter on the high end.

Probably the ASUS VG278HE. I'm not sure if any others are being marketed yet.

craig588
Nov 19, 2005

by Nyc_Tattoo
FurMark isn't the best stability tester for the GK104 series of cards because it will cause the GPU to throttle pretty low. I've been enjoying using Heaven maxed out and letting it run its demo loop for an hour or so.

Blower-style cards are generally better for cramped spaces. It has to do with pressure vs. volume: blower-style fans are better at moving a fixed amount of air regardless of how much restriction they're facing, while conventional fans can move more air at a given size if they're not trying to overcome a lot of resistance.


Glen Goobersmooches posted:

Gigabyte's 670 is literally the thinnest model there is.

I don't know where you got the idea that other manufacturers were only making 3-slot cards, but (almost?) all of the blower cards are also 2-slot designs. If anything, I think the Gigabyte cooler might exceed the width of 2 slots a bit, while the blower shrouds actually fit within that limit.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Wozbo posted:

The card takes up 2.00001 slots. The pins from the power connector stick out a good 3 to 4 mm farther than they should (this will cause a bzzzzzzzzz->SNAP as the fan from the other card runs against them). I really wish I could mount something to suspend the cards perfectly, because it's literally just the smallest bit that's causing it. I'm not going to dork with sanding it down; I just have a zip tie on the cables forcing a perfect suspension (no bending), so nothing (not even any vibration) will cause any touching. I just forced adaptive because that way I don't have to worry about legacy games (things like TF2 come to mind). I've had a few instances where some of the games I play a) don't ask what to use and b) will take 100% power to go at an fps higher than the refresh rate of my monitor. They are few and far between, but considering the near-zero cost of forcing adaptive, I see no reason not to.
That's annoying, but at least it's close enough to fix with zip ties and no worrying flexing. Other models would've presented a much bigger problem on that board for sure. Thanks for valuing the recommendation. I didn't know you were going right to SLI with a new 27", lucky!

I have a history of vsync's ill effects being much, much worse than slight tearing whenever I've tried it in the past. Is adaptive truly slick enough with the necessary shifting to be functionally unnoticeable?

craig588 posted:

I don't know where you got the idea that other manufacturers were only making 3 slot cards, but (Almost?) all of the blower cards are also 2 slot designs. If anything I think the Gigabyte cooler might exceed the width of 2 slots a bit while the blower shrouds actually fit within that limit.
Tom's Hardware roundup. Apologies for not clarifying that they called it the thinnest of the 7 non-ref cooler designs they had their hands on. Does anyone besides EVGA still run reference blowers on clocked up models?

TheRationalRedditor fucked around with this message at 16:55 on Sep 21, 2012

Wozbo
Jul 5, 2010
No tearing from adaptive and no vsync after-effects @ 144Hz; pretty much absolutely no reason not to turn it on.

Wozbo fucked around with this message at 17:01 on Sep 21, 2012

Bob NewSCART
Feb 1, 2012

Outstanding afternoon. "I've often said there's nothing better for the inside of a man than the outside of a horse."

Ok guys, this is probably a pretty big stretch, but since this may well be GPU related, I figured I would post it here because I'm at an absolute loss.

http://cloud.steampowered.com/ugc/920115767414939711/8F285D6E68B510A84BA6828FD920D89FD601A524/

This white border at the bottom of the screen is a complete piss-off and I have no idea how to get rid of it. I have a GTX 280, and this only shows up at my native resolution and one or two other resolutions.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
I have that visual artifact border too; it ends up taking on attributes from colored lighting effects now and then. It's easy to ignore when playing, but it would certainly be better if it weren't there. Perhaps it's something about 1680x1050.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
1050 isn't divisible by any power of two other than 2, which is probably related to the issue. I'd guess that for optimization reasons they used shaders that expect the resolution to be divisible by 16, and then failed to render a bit outside of the viewport to prevent graphical anomalies. You can run into similar issues with video, which MUST have a resolution that is divisible by 16 (because macroblocks are 16x16 pixels, ignoring obsolete formats). This is why 1080p video is actually 1088 pixels tall; you can sometimes see a colored band at the bottom of the picture if your video decoder/player doesn't know to crop that off.
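(To illustrate the rounding, here's a quick Python sketch; it's not pulled from any actual decoder, just the arithmetic:)
code:
# Round a frame height up to the next multiple of 16 (the macroblock size).
def padded_height(height, block=16):
    return -(-height // block) * block  # ceiling division

for h in (1050, 1080):
    print(h, "->", padded_height(h))
# 1050 -> 1056, 1080 -> 1088 (hence the hidden 8-pixel strip on "1080p" video)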

Bob NewSCART
Feb 1, 2012

Outstanding afternoon. "I've often said there's nothing better for the inside of a man than the outside of a horse."

Alereon posted:

1050 isn't divisible by any power of two other than 2, which is probably related to the issue. I'd guess that for optimization reasons they used shaders that expect the resolution to be divisible by 16, and then failed to render a bit outside of the viewport to prevent graphical anomalies. You can run into similar issues with video, which MUST have a resolution that is divisible by 16 (because macroblocks are 16x16 pixels, ignoring obsolete formats). This is why 1080p video is actually 1088 pixels tall; you can sometimes see a colored band at the bottom of the picture if your video decoder/player doesn't know to crop that off.

Hm, I figured it was something to do with the resolution. Is there anything that can be done about this? Playing at other resolutions doesn't look nearly as crisp :smith:

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Alereon posted:

1050 isn't divisible by any power of two other than 2, which is probably related to the issue. I'd guess that for optimization reasons they used shaders that expect the resolution to be divisible by 16, and then failed to render a bit outside of the viewport to prevent graphical anomalies. You can run into similar issues with video, which MUST have a resolution that is divisible by 16 (because macroblocks are 16x16 pixels, ignoring obsolete formats). This is why 1080p video is actually 1088 pixels tall; you can sometimes see a colored band at the bottom of the picture if your video decoder/player doesn't know to crop that off.
This explanation seems so perfect and well-researched that all I can do is hope they have a way to eventually patch it. It is indeed a colored band more often than not.

Maxwell Adams
Oct 21, 2000

T E E F S

Kramjacks posted:

I tried Borderlands 2 with Nvidia 3D Vision. It works and there weren't any graphical glitches, but there also wasn't any real implementation.

I haven't tried this myself, but the HelixMod guys made a patch for Borderlands 2 to improve the 3D Vision effects.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Are the blower-style fans actually quieter than their standard counterparts?
