Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Fellow overclockers, you're doing yourself a disservice if you haven't updated to the more recent overclocking software. I know it was released with Titan and then the 700-series in mind, but it does wonderful things for the 600 series as well.

EVGA's version, Precision X 4.2.0, is outstanding, well beyond its previous version in that it seems to have almost BIOS-level access to how your card's power system (and thus overall overclock performance) works. It seems able to override long-term turbo TDP limitations that previous versions could not, and it is much more aggressive (perhaps more capable?) about making use of greater-than-100% power targets.

With my GTX 680 SC+, I was previously using a late 3.0-era version of Precision X and routinely, in demanding games, would see it rise slightly above TDP now and again - 105%, 107% - but only for a brief moment or two, with the card sitting at a stolid 99% the rest of the time. This despite my instruction to go whole hog and use 132% if it thought it could handle it, why not.

Further, in long-haul graphically demanding titles, it went from my happy 1254 MHz configured boost overclock to a mopey mid-1100 MHz boost as it throttled back to the aforementioned TDP, costing real performance.

The new one doesn't do either of those things. Now the power percentage sits anywhere from 115% to 120% in Crysis 3 and other demanding titles, and will rise above 120% if necessary, though mainly stress-test applications seem to bring that out in it. Most importantly, it locks in the overclock I gave it. Apart from adjusting the three sliders to reach 1254 MHz on the GPU and 6800 MHz on the VRAM, I enabled the new "Overvoltage" option - which seems just to give me back the full 1.175V that I lost in the driver update from 311 to 314 :v:. It basically does the full overclocking stuff pretty much all on its own. I'm not using the "K-Boost" feature - that has bad idea written all over it, and I'm not sure what it's for honestly: lock the clocks at boost even in 2D? Why?...

Anyway, Precision X 4.2.0: It is just better. Update if you're a Precision user, for sure.

Maybe the guy already gave MSI the goods and this is just EVGA's turn, I dunno, but it kinda rocks getting my full OC performance all the time :3:

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Jmcrofts posted:

Anyone know if there's any merit to the rumor that the GTX 760 Ti will be released during Computex, i.e. next week?

Well, they were right about the 770 coming hot on the heels of the 780, and right about the specs thereof. I wouldn't be surprised to see a 760 Ti next week.

Agreed posted:

I'm not using the "K-Boost" feature - that has bad idea written all over it, and I'm not sure what it's for honestly: lock the clocks at boost even in 2D? Why?...

If I recall correctly, that's an artifact of Nvidia's control over its Turbo algorithms.

To review:

Boost 1.0 (first wave Kepler): Clock-only adjustments, based on coarse TDP estimates only.

Boost 2.0 (650 Ti Boost, Titan, 770 and 780): Clock and voltage adjustments, based on finer-grade (but still coarse) TDP estimates and ALSO temperature measurements. Boost 2.0 cards have less headroom in the power target (TDP), but tend to boost/OC better thanks to taking temperature and its ample headroom into account.

AMD has a similar 1.0/2.0 split on boost algorithms, with the added frustration of 7000 series cards having not only 1.0 and 2.0, but none.0 as well. The 2.0 version debuted with the 7790 and is standard in the currently-emerging wave of APUs, so it should be standard in the HD 8000 series, as well. AMD's 1.0 and 2.0 work analogously to Nvidia's 1.0 and 2.0.

But back to Nvidia: Nvidia fucking hates it when you mess around with Boost 1.0. Hates hates hates it. As in, no cards are allowed to let the end user adjust the Power Target or Boost algorithm beyond stock behavior. If you recall, MSI got into trouble trying to work around this by overvolting the VRM hardware, to alter the behavior of the Boost algorithm without altering the software. That little shenanigan was what got MSI dropped from the video card vendor list in the parts-picking thread.

So that's why K-Boost exists: direct control of voltage and clocks by sidestepping the Boost algorithm completely. It's a completely alternative power algorithm, so no Boost features (including downclocked p-states) function. But when you say "1100 MHz at 1.2V, please," by gum, it'll give you 1100 MHz at 1.2V. You can't alter the Boost algorithm, but you can skip it entirely.
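If it helps to picture the split, here's a toy sketch in Python - nothing like Nvidia's actual (closed) algorithm, and the step sizes and thresholds are made up - just to show which inputs each mode cares about:

code:
# Toy model only: illustrates the Boost 1.0 / Boost 2.0 / K-Boost split
# described above. Step sizes and limits are invented, not Nvidia's.

def boost_1_0(power_pct, power_target_pct, clock_mhz, step_mhz=13):
    """Boost 1.0: clock-only adjustment driven by a coarse TDP estimate."""
    if power_pct > power_target_pct:
        return clock_mhz - step_mhz   # throttle back toward the power target
    return clock_mhz + step_mhz       # otherwise keep climbing the boost table

def boost_2_0(power_pct, power_target_pct, temp_c, temp_target_c,
              clock_mhz, voltage_v, step_mhz=13, step_v=0.0125):
    """Boost 2.0: clock AND voltage, gated on both power and temperature."""
    if power_pct > power_target_pct or temp_c > temp_target_c:
        return clock_mhz - step_mhz, voltage_v - step_v
    return clock_mhz + step_mhz, voltage_v + step_v

def k_boost(requested_clock_mhz, requested_voltage_v):
    """K-Boost: sidestep the algorithm entirely and pin whatever was asked
    for, even at the desktop (so no downclocked P-states)."""
    return requested_clock_mhz, requested_voltage_v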

Factory Factory fucked around with this message at 06:36 on May 31, 2013

ethanol
Jul 13, 2007



Damn, one of the fans on my 3-month-old Sapphire 7950 started rattling and stopping today. I tap it and get it to spin back up, but I guess I'd better RMA it sooner rather than later. There goes my zero-RMA record. :(

craig588
Nov 19, 2005

by Nyc_Tattoo
There are two different power targets stored in the BIOS: the user-level one and a hidden one that is never exposed through software. My card sucked when I got it, dropping down to the 900 MHz range depending on the load. After messing with them it only goes down to the 1280 MHz range. The boost table can adjust voltage and speed on the 6x0-series GK104s. The major difference between Boost 1 and 2 is that temperature is taken into account much more accurately with Boost 2.

The restrictions Nvidia placed on software might have been relaxed with the GK110 release, so the newer version of Precision is able to apply the percentage adjustment to the hidden power target table as well. (The hidden power target has the whole range of minimum, target and maximum, just like the software controls.) You could dump your BIOS to confirm whether it was the hidden power target levels giving you problems.

The most important thing to remember is that all of the percentage values shown are derived from the actual wattage figures stored in the BIOS, and they aren't comparable unless two cards are running the exact same BIOS.
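In other words (quick Python sketch; the base wattages below are invented examples, not dumped from a real BIOS):

code:
# The power target sliders are percentages of a base wattage stored in the
# BIOS, so the same percentage means different watts on different BIOSes.

def target_watts(base_tdp_w, power_target_pct):
    """Convert a power-target percentage into an actual wattage limit."""
    return base_tdp_w * power_target_pct / 100.0

# Two hypothetical BIOSes with different base TDP tables:
print(target_watts(170, 132))  # 224.4 W on a 170 W base
print(target_watts(195, 115))  # 224.25 W on a 195 W base - nearly the same
                               # limit despite very different percentages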

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Looks like we just got the high-end GeForce 700M series. AnandTech link with some 780M benches.

Looks like the result of better yields and a more mature process. Same GK104/GK106 chips, but the SKUs of "either GK106 or GK104 depending on what died today" are gone. The new parts have vastly higher clock speeds than their 600M counterparts (except the 760M - the 660M was 384 shaders at 835 MHz, and the 760M is 768 shaders at 657 MHz). Memory bandwidth is usually similar, albeit achieved with faster VRAM on narrower buses. And all the parts have Boost 2.0.

But the real honker is the 780M. It's a fully-enabled GK104 (1536 shader, i.e. 680/770) at 823 MHz with Boost and 4GB of 5 GHz GDDR5. At 100W. Hell of a thing for a laptop GPU.
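Quick back-of-the-envelope numbers (Python; shaders times clock only, ignoring Boost and memory, and using the desktop GTX 680's 1006 MHz base clock for the comparison):

code:
# Rough relative shader throughput = shader count x core clock.
# Ignores Boost, memory bandwidth, and everything else that matters.

def relative_throughput(shaders, clock_mhz):
    return shaders * clock_mhz / 1000.0   # arbitrary units

print(relative_throughput(384, 835))     # 660M:        ~320.6
print(relative_throughput(768, 657))     # 760M:        ~504.6 (about 57% more)
print(relative_throughput(1536, 823))    # 780M:        ~1264.1
print(relative_throughput(1536, 1006))   # desktop 680: ~1545.2 at base clock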

cheesetriangles
Jan 5, 2011





Would a 100 W laptop GPU be really low or really high? Where does it fall on the melt-your-genitals scale?

LRADIKAL
Jun 10, 2001

Fun Shoe

Zotix posted:

Nah, that's a bit past my comfort level for a video card. I'll either order a 770 or a 780, and when it arrives just RMA the 580. Taking off the outer cover is one thing, removing the heatsink is another.

How much to buy your 580 when it comes back?

craig588
Nov 19, 2005

by Nyc_Tattoo

cheesetriangles posted:

Would a 100 W laptop GPU be really low or really high? Where does it fall on the melt-your-genitals scale?
Really high, but just last year almost the same capability took 195 watts in the desktop 680.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

But the real honker is the 780M. It's a fully-enabled GK104 (1536 shader, i.e. 680/770) at 823 MHz with Boost and 4GB of 5 GHz GDDR5. At 100W. Hell of a thing for a laptop GPU.

You're fucking shitting me. In the age of tablets/mobile rising, they finally make a genuine goddamned gaming laptop that would work as a desktop replacement?

Animal
Apr 8, 2003

I wonder how svelte they can get with a laptop packing that and Haswell. In the next few years I may finally see myself buying a gaming laptop.

Animal
Apr 8, 2003

Agreed posted:

Fellow overclockers, you're doing yourself a disservice if you haven't updated to the more recent overclocking software. I know it was released with Titan and then the 700-series in mind, but it does wonderful things for the 600 series as well.

EVGA's version, Precision X 4.2.0, is outstanding, well beyond its previous version in that it seems to have almost BIOS-level access to how your card's power system (and thus overall overclock performance) works. It seems able to override long-term turbo TDP limitations that previous versions could not, and it is much more aggressive (perhaps more capable?) about making use of greater-than-100% power targets.

With my GTX 680 SC+, I was previously using a late 3.0-era version of Precision X and routinely, in demanding games, would see it rise slightly above TDP now and again - 105%, 107% - but only for a brief moment or two, with the card sitting at a stolid 99% the rest of the time. This despite my instruction to go whole hog and use 132% if it thought it could handle it, why not.

Further, in long-haul graphically demanding titles, it went from my happy 1254 MHz configured boost overclock to a mopey mid-1100 MHz boost as it throttled back to the aforementioned TDP, costing real performance.

The new one doesn't do either of those things. Now the power percentage sits anywhere from 115% to 120% in Crysis 3 and other demanding titles, and will rise above 120% if necessary, though mainly stress-test applications seem to bring that out in it. Most importantly, it locks in the overclock I gave it. Apart from adjusting the three sliders to reach 1254 MHz on the GPU and 6800 MHz on the VRAM, I enabled the new "Overvoltage" option - which seems just to give me back the full 1.175V that I lost in the driver update from 311 to 314 :v:. It basically does the full overclocking stuff pretty much all on its own. I'm not using the "K-Boost" feature - that has bad idea written all over it, and I'm not sure what it's for honestly: lock the clocks at boost even in 2D? Why?...

Anyway, Precision X 4.2.0: It is just better. Update if you're a Precision user, for sure.

Maybe the guy already gave MSI the goods and this is just EVGA's turn, I dunno, but it kinda rocks getting my full OC performance all the time :3:

I just tested it on my overclocked 670. You are absolutely right, it is much better at holding boost clocks.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
What are your preferred apps for testing your GPU and video memory overclocks on Kepler GTX 600 cards? I was using looped Metro 2033 runs, but they're more variable than I'd hoped and I don't think they're long enough to really stress the card.

Animal
Apr 8, 2003

I was having the same dilemma. Metro is not very good. I use Heaven.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

What are your preferred apps for testing your GPU and video memory overclocks on Kepler GTX 600 cards? I was using looped Metro 2033 runs, but they're more variable than I'd hoped and I don't think they're long enough to really stress the card.

Mixture of 3DMark11 repeated runs (great at detecting GPU errors), Heaven repeated runs (great at detecting GPU and memory errors, it'll light up like the 4th of July), and playing demanding games while looking for non-engine artifacts (gotta be careful with UE3 games since pop-in is usually the engine and not the card).

There isn't a one-stop-shop for these suckers, and I've found that my card has different stability at DX9, DX10, and DX11 even. The logic is what it is, I guess. It's least stable at DX9, hence my lower overclock. If I only played DX11 games I could keep it at about 1300MHz all the time. Search me as to why.
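One thing that helps me when I'm looping Heaven or 3DMark11 is logging clocks/temps/power on the side so I can see exactly when the card throttles or drops a bin. Rough Python sketch below - it assumes nvidia-smi is on your PATH and that your driver exposes these query fields (the names can vary by version), so treat it as a starting point:

code:
# Logs graphics clock, temperature, and power draw to a CSV while a stress
# test runs in another window. Not a stress test itself, just a watcher.
import csv
import subprocess
import time

FIELDS = "clocks.current.graphics,temperature.gpu,power.draw"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader,nounits"])
    return out.decode().strip().split(", ")

def log(path="gpu_log.csv", interval_s=1.0, duration_s=1800):
    start = time.time()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s"] + FIELDS.split(","))
        while time.time() - start < duration_s:
            writer.writerow([round(time.time() - start, 1)] + sample())
            time.sleep(interval_s)

if __name__ == "__main__":
    log()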

Agreed fucked around with this message at 19:07 on May 31, 2013

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010


:stonk: This is impressive.

incoherent fucked around with this message at 19:34 on May 31, 2013

Nondescript Van
May 2, 2007

Gats N Party Hats :toot:
That is the exact opposite of a phantom.

Kramjacks
Jul 5, 2007

incoherent posted:



:stonk: This is impressive.

It looks like a wall-mounted AC unit.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

incoherent posted:



:stonk: This is impressive.

Thought it was passive until

Zotix
Aug 14, 2011



Jago posted:

How much to buy your 580 when it comes back?

I'm not sure what would be a fair price. I should be getting my new card come Tuesday, and I'll likely RMA it later next week. Not sure the turnaround on it, but once I get it back I'll probably end up selling it.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette

Don Lapre posted:

Thought it was passive until


Doubles as a chainsaw?

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Kramjacks posted:

It looks like a wall-mounted AC unit.

The busted kind that vents heat.

randyest
Sep 1, 2004

by R. Guyovich
I hope this doesn't count as an upgrade question, but I just ordered that insane SC GTX780 ACX beast and I'm wondering if it'd be worthwhile to keep my GTX570 in the box for PhysX only or just sell it? It runs super cool and quiet, but maybe it's overkill and a waste of electricity? Of course I'll try it and compare once the 780 gets here, but any info / experience would be appreciated.

Yudo
May 15, 2003

Hmm well I need a space heater for my bedroom and a new video card, so gtx 770 it is!

I'm in the 90% (according to Steam): I wouldn't spend more than $300 on a card. Then again, I don't have a fully nude-modded Skyrim pasted across 5 LCDs...

Animal
Apr 8, 2003

randyest posted:

I hope this doesn't count as an upgrade question, but I just ordered that insane SC GTX780 ACX beast and I'm wondering if it'd be worthwhile to keep my GTX570 in the box for PhysX only or just sell it? It runs super cool and quiet, but maybe it's overkill and a waste of electricity? Of course I'll try it and compare once the 780 gets here, but any info / experience would be appreciated.

You'll most likely end up selling it. I tried using my 560 Ti 448 for PhysX once I got my 670, and the novelty wore off within the day.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

randyest posted:

I hope this doesn't count as an upgrade question, but I just ordered that insane SC GTX780 ACX beast and I'm wondering if it'd be worthwhile to keep my GTX570 in the box for PhysX only or just sell it? It runs super cool and quiet, but maybe it's overkill and a waste of electricity? Of course I'll try it and compare once the 780 gets here, but any info / experience would be appreciated.

I saw some tests that showed if you are using an older card, your 780 in this case would end up waiting for the old card's PhysX shit and slow things down.

Sepist
Dec 26, 2005

FUCK BITCHES, ROUTE PACKETS

Gravy Boat 2k
My 660 Ti is going to a black screen and then coming back a second later when it gets over 55 degrees Celsius with less than 70% fan speed; is that normal without overclocking? I can circumvent this by setting the video card fan to 70% and all of the stock fans in my Phantom case to full speed. An Nvidia driver infected with a bitcoin miner was running on it for 2 days before I found it, which I'm afraid damaged the card :T

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Sepist posted:

My 660 Ti is going to a black screen and then coming back a second later when it gets over 55 degrees Celsius with less than 70% fan speed; is that normal without overclocking? I can circumvent this by setting the video card fan to 70% and all of the stock fans in my Phantom case to full speed. An Nvidia driver infected with a bitcoin miner was running on it for 2 days before I found it, which I'm afraid damaged the card :T

Sounds like you still have a bitcoin miner running.

randyest
Sep 1, 2004

by R. Guyovich

Animal posted:

You'll most likely end up selling it. I tried using my 560 Ti 448 for PhysX once I got my 670, and the novelty wore off within the day.

Don Lapre posted:

I saw some tests that showed if you are using an older card, your 780 in this case would end up waiting for the old card's PhysX shit and slow things down.
Thanks guys. Some googling led to a bunch of people dumping higher-end secondary PhysX cards for older 280s and such, apparently to save noise/power/heat, but if the 780 is going to be bottlenecked by a secondary card, that seems counterproductive.

I'm probably going to swap out my i5-2500K for an i7-3770 since they're so cheap, and I understand PhysX can run on the CPU or GPU (or both?). Is that something I need to micromanage, or will the Nvidia Advisor thing figure out the best settings for me?

beejay
Apr 7, 2002

Sepist posted:

My 660 Ti is going to a black screen and then coming back a second later when it gets over 55 degrees Celsius with less than 70% fan speed; is that normal without overclocking? I can circumvent this by setting the video card fan to 70% and all of the stock fans in my Phantom case to full speed. An Nvidia driver infected with a bitcoin miner was running on it for 2 days before I found it, which I'm afraid damaged the card :T

Where do you download your nvidia drivers from? :psyduck:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

randyest posted:

Thanks guys. Some googling led to a bunch of people dumping higher-end secondary PhysX cards for older 280s and such, apparently to save noise/power/heat, but if the 780 is going to be bottlenecked by a secondary card, that seems counterproductive.

I'm probably going to swap out my i5-2500K for an i7-3770 since they're so cheap, and I understand PhysX can run on the CPU or GPU (or both?). Is that something I need to micromanage, or will the Nvidia Advisor thing figure out the best settings for me?

I did some testing early on when I set up my GTX 580 (which is Good At CUDA, compared to non-Titan Kepler) as a headless CUDA/PhysX processor to complement my GTX 680, which does nothing but rendering. Findings were highly positive w/r/t high PhysX settings in games that really take advantage of PhysX features. Notable big differences letting the 580 handle it vs. the 680 doing both rendering and processing PhysX include Metro 2033, Borderlands 2, the Batman games, and Alice: Madness Returns. Framerate minimums were drastically higher. Are. I haven't dismantled it.

The biggest point, though, is that there just aren't that many PhysX games, since it's proprietary and there are a lot of very good CPU-based physics engines (including CPU PhysX, somewhat ironically). Hard to say it's reeeeally worth it, but I had the 580 lying around from the upgrade so I figured what the hell. It's a neat gimmick. Does own bones in Borderlands 2, especially, but that game really emphasizes PhysX, and more demanding stuff (simple, but LOTS OF IT) would slow just the 680 down a lot. It's nice not having to deal with that.

Old cards would bottleneck it. The cutoff point would probably be the aforementioned 550Ti as a minimum to keep up. Maybe a 460. Sticking a 280 in slowed it down iirc. It has been awhile since I did testing, though, and those are the only three PCI-e cards I have access to.

Agreed fucked around with this message at 02:02 on Jun 1, 2013

Animal
Apr 8, 2003

randyest posted:


I'm probably going to swap out my i5-2500K for an i7-3770 since they're so cheap, and I understand PhysX can run on the CPU or GPU (or both?). Is that something I need to micromanage, or will the Nvidia Advisor thing figure out the best settings for me?

What is your usage? In games you would be lighting your money on fire. I would keep the i5-2500K and put that money toward a very large SSD, or maybe a 1440p or a 120Hz monitor if you don't have one.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

randyest posted:

I understand PhysX can run on the CPU or GPU (or both?)

(edit: this is really fast and loose with the specific facts here, take it for what it is)

Meant to address this specifically - GPU PhysX is not the same thing as CPU PhysX. GPU PhysX is essentially an application of CUDA that's very resource-light, allowing the calculation-crippled consumer cards to do just fine with it even though they'd suck at more serious CUDA loads (GTX 580 somewhat excepted, in its time; Titan excepted until the next Titan comes out :v:).

CPU PhysX is "just" a physics engine that can be employed by the devs if they intend to use it. It's good - Metro 2033 with GPU PhysX turned off still has very good physics effects and it's powered by the CPU variant. But it's comparable with other physics engines, and not really the same thing as GPU-accelerated PhysX at all.

Consider it a branding thing. Getting the PhysX name on a product is better than nothing, even if they don't intend to implement a proprietary, often heavily GPU-bound hardware accelerated algorithm alongside it.

Agreed fucked around with this message at 02:29 on Jun 1, 2013

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
Just curious, but how much does pure PhysX usage tax your 580? As in, how are temps, power usage, etc?

randyest
Sep 1, 2004

by R. Guyovich

Agreed posted:

I did some testing early on when I set up my GTX 580 (which is Good At CUDA, compared to non-Titan Kepler) as a headless CUDA/PhysX processor to complement my GTX 680, which does nothing but rendering. Findings were highly positive w/r/t high PhysX settings in games that really take advantage of PhysX features. ...
This is good info thank you very much. I'll share my findings once I get the new card and try with/without the 570. I'm still not 100% sure how/if I need to "tell" the drivers what to use for graphics versus physics but I'm still researching. Did you do anything special or just let the Nvidia drivers do their thing?

Animal posted:

What is your usage? In games you would be lighting your money on fire. I would keep the i5-2500K and put that money toward a very large SSD, or maybe a 1440p or a 120Hz monitor if you don't have one.
Games mostly but some video editing as well. Maybe not enough to warrant the i7. But it's just $230 at Microcenter down the street, it'll drop into my current mobo, and I just got a pretty nice unexpected bonus, so it's pretty tempting even for the modest 10-20% boost. I already have two 120GB SSDs, a 2TB secondary, an 8TB NAS, and have already budgeted for a three-way 1080p monitor setup to replace my current one (which is actually fine for my old-man eyes; I just want to try triple-head gaming, and I think that'll be easier with three 1080p's instead of three 1440's -- I hope the SC 780 can handle it, but worst case I'll buy another and SLI them).

Agreed posted:

(edit: this is really fast and loose with the specific facts here, take it for what it is)

Meant to address this specifically - GPU PhysX is not the same thing as CPU PhysX. GPU PhysX is essentially an application of CUDA that's very resource-light, allowing the calculation-crippled consumer cards to do just fine with it even though they'd suck at more serious CUDA loads (GTX 580 somewhat excepted, in its time; Titan excepted until the next Titan comes out :v:).

CPU PhysX is "just" a physics engine that can be employed by the devs if they intend to use it. It's good - Metro 2033 with GPU PhysX turned off still has very good physics effects and it's powered by the CPU variant. But it's comparable with other physics engines, and not really the same thing as GPU-accelerated PhysX at all.

Consider it a branding thing. Getting the PhysX name on a product is better than nothing, even if they don't intend to implement a proprietary, often heavily GPU-bound hardware accelerated algorithm alongside it.
This is also extremely helpful; thank you. I want to be able to do 5760 x 1080 at max ultra tessellated everything with max physics but I don't yet understand how the drivers decide if the CPU, GPU, secondary GPU, or some combination of those handle the physics, and how that affects overall performance. I know I have to tone things down a bit on my 570 on Metro Last Light, Tomb Raider, etc. on a single 1920x1080p. If I can stick with an SC 780 and my old 570 and achieve that at 5760 I'd be thrilled. But if it makes more sense to sell the 570 and pick up a lower-power physX card, or even another SC 780 for SLI, I'd do that. I'd rather avoid the SLI hassle if possible.

Thanks everyone for the good info; it's great fodder for further googling.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

randyest posted:

Games mostly but some video editing as well. Maybe not enough to warrant the i7. But it's just $230 at Microcenter down the street, it'll drop into my current mobo, and I just got a pretty nice unexpected bonus, so it's pretty tempting even for the modest 10-20% boost. I already have two 120GB SSDs, a 2TB secondary, an 8TB NAS, and have already budgeted for a three-way 1080p monitor setup to replace my current one (which is actually fine for my old-man eyes; I just want to try triple-head gaming, and I think that'll be easier with three 1080p's instead of three 1440's -- I hope the SC 780 can handle it, but worst case I'll buy another and SLI them).

You probably don't even need to build a Haswell system, but you certainly don't need to build an Ivy Bridge system. That is wasting money. Give me the money instead I won't waste it I promise!!!

randyest posted:

I'm still not 100% sure how/if I need to "tell" the drivers what to use for graphics versus physics but I'm still researching. Did you do anything special or just let the Nvidia drivers do their thing?

You do it like this in the nVidia control panel. :)



Space Racist posted:

Just curious, but how much does pure PhysX usage tax your 580? As in, how are temps, power usage, etc?

It doesn't get too bothered by even what would be considered demanding PhysX usage. Temps don't usually go over 51-53°C; power usage is less controllable than with Kepler cards, obviously, so it's still using power, but GPU usage rarely gets over 10% (I've seen 18%-24% before, now that I think about it, but that was some serious, using-a-trainer, super PhysX going on in Borderlands 2 just to see what happened. What happened was rad).

You have to remember that GF110 is the chip that powers the 560Ti-448, the 570, and the 580; the same chip powers last generation's Tesla and Quadro workstation cards. With the GTX 580's full complement of shaders and the fact that PhysX is a CUDA load that doesn't require high precision or anything fancy at all, it's sincerely overengineered for just doing PhysX.

When we first really dug into this we figured the 560 non-Ti, a 460, or possibly the 550-Ti would probably be the sweet spot for doing the job well without being more than necessary for the task.

Agreed fucked around with this message at 03:51 on Jun 1, 2013

randyest
Sep 1, 2004

by R. Guyovich

Agreed posted:

You probably don't even need to build a Haswell system, but you certainly don't need to build an Ivy Bridge system. That is wasting money. Give me the money instead I won't waste it I promise!!!

You do it like this in the nVidia control panel. :)


OK OK I'll hold off on the 3770; thanks :)

My Nvidia control panel looks nothing like that -- I installed the "GeForce Experience" thing. Maybe that's it, or I won't see options until a second card is installed? No need to reply as this seems to have strayed too far for this thread and I'll figure it out but thanks all for your info and help.

An Unoriginal Name
Jul 11, 2011

My favorite touhou is my beloved Nitori.
:swoon:

randyest posted:

OK OK I'll hold off on the 3770; thanks :)

My Nvidia control panel looks nothing like that -- I installed the "GeForce Experience" thing. Maybe that's it, or I won't see options until a second card is installed? No need to reply as this seems to have strayed too far for this thread and I'll figure it out but thanks all for your info and help.

This will sound stupid, but... do you actually have THE drivers installed? Like the legitimate 320.xx drivers, and not just Geforce Experience? If you do, you should have an icon in your system tray and when you right click it you get the option to open either the control panel or Experience.

randyest
Sep 1, 2004

by R. Guyovich

An Unoriginal Name posted:

This will sound stupid, but... do you actually have THE drivers installed? Like the legitimate 320.xx drivers, and not just Geforce Experience? If you do, you should have an icon in your system tray and when you right click it you get the option to open either the control panel or Experience.
Heh, thank you. Yes, the drivers are in; I just left-clicked the Nvidia icon, which opens Experience. Right-clicking gave me the control panel option, and I see the PhysX option now. :shobon:

Thanks again everyone; I'll test out the games I play with all the combos of 570/680/CPU and see what works best.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Sepist posted:

My 660 Ti is going to a black screen and then coming back a second later when it gets over 55 degrees Celsius with less than 70% fan speed; is that normal without overclocking? I can circumvent this by setting the video card fan to 70% and all of the stock fans in my Phantom case to full speed. An Nvidia driver infected with a bitcoin miner was running on it for 2 days before I found it, which I'm afraid damaged the card :T

Blow away the drivers and see if you can find a way to get rid of the miner - not sure if it'd be considered malware or virus-related, but it couldn't hurt to run a scan with a tool like Malwarebytes. If nothing else, see if you can find out what the miner is called and do some searching to find out how to get rid of it. And honestly, I wouldn't download any drivers other than the ones direct from Nvidia, especially if you got yours from a sketchy third-party site.

Sepist
Dec 26, 2005

FUCK BITCHES, ROUTE PACKETS

Gravy Boat 2k

beejay posted:

Where do you download your nvidia drivers from? :psyduck:

I was having issues with the original drivers I installed, which apparently was normal with the 660 Ti, so I downloaded a modified copy, which turned out to have a miner in it.
