Stanley Pain
Jun 16, 2001

by Fluffdaddy

veedubfreak posted:

How is BF4? I went from low settings and 40ish fps in MWO to high settings at the same framerate at 7680x1440.

So far I have the card at 1047/5200. Waiting to see how far I can get before voltage becomes an issue. Running at 46C at full load.


BF4 plays brilliantly. Amazingly smooth on Ultra @ 2560x1440.

Tab8715 posted:

How bad is the noise with the 290s?

Surprisingly, not as loud as I was led to believe by reviews. I guess when you have a decently sound-dampened case, the video card is barely audible over the 3x240mm fans I have running (900ish RPM).


Abalieno
Apr 3, 2011
About the 290:

The issue here is that most reviews run tests of about 1-2 minutes. So I'm wondering what happens when you stress the card for about two HOURS instead. Does it fall down to its lowest clock?

The real issue with these cards is how they perform if the room temperature is already not ideal (say 24-26C). The risk is that you think you're buying a great card, but it has such huge variance that in the end it's really not much faster than a 770 (which runs cool and at max clocks).

Sure, peak performance in ideal situations, with cool rooms and cases, is amazing. But what happens one or two years down the line, when its condition isn't as pristine anymore and it has to work in non-perfect situations?

In the end: better to get a solid 770 now and buy something new in 1.5-2 years, or wait for a decent 290 that can maybe survive a bit longer?

If you consider that BF4 is already on the line with a 770, I don't think this card is going to survive the creep of PC hardware requirements for very long.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

To simplify things, here's the current high-end video card situation, as of today/time of writing.

1. Don't buy a 290X or a 780Ti unless you know you actually need one. Or two. Seriously, this is the bleeding edge, unless you're running really high resolutions or an actual 4K monitor, nope. Why? The 290X isn't super far off the price:performance curve (it set the curve for higher end cards at launch, after all), but until and unless we see ubiquitous really remarkable overclocking results from them, and especially until we get better cooling options, no reason to get that. The 780Ti is just goofy as poo poo from a price:performance perspective and exists solely for the purpose of nVidia to accurately claim to have the fastest card.

2. Got high end needs but don't like setting money on fire? Get some flavor of 780 according to your needs if you wish to stay nVidia (not 780Ti), or wait for a non-reference 290 to hit to solve the absurd cooling issues those cards are having. Yeah, AMD has a patch out to fix the variance between manufacturers, but they're still under-cooled. Speculatively, when AMD's partners get custom cooling on the 290X it might pull significantly away from the 290 again, so keep your options open as more info becomes available, but right now it's looking like the 290 is about 8%-13% away from the performance of a 290x for a solid $100 off, and overclocks like a bastard under water and should overclock well under air, too. Two 290s is the winner for ultra high resolution gameplay, at $800 (or, $100ish more than a 780Ti) and actually capable of 4K gameplay and high-res multi-monitor setups. Neat!

3. Sane and playing at 1080p? Don't need ALL THE SHINIES? Eh, 280x or a 760, pick your poison. Right now the 770 needs to go lower to justify its existence even with the games bundle, and doesn't have enough VRAM for the price point. The 280x being a 7970GHz means it's still a really powerful card that will probably kick rear end at this resolution for some time yet. Remember, it's really not that far off its higher end brethren, as the top-end product from the last gen with a new name and label.

4. Side note regarding additional technologies: Shadowplay is really cool and I look forward to them working on it further. Because seriously it's really cool. Like, awesome as heck. Maybe TrueAudio will be really cool, it has potential as an idea for sure. Maybe Mantle will be really cool, we'll know more about it in the upcoming conference. Don't bank on Mantle until we know more.

I can speak a bit more about PhysX since I actually use it. It's cool, but only in a "well that sure is a thing" way, don't go out of your way for PhysX but if you're upgrading from nVidia to nVidia and your power supply can feed it, you might enjoy the PhysX experience in games that support it by using the old card as a PhysX card. Good option for price:performance in PhysX is the 650Ti (or 560Ti - they're performance analogs in many regards, including having similar PhysX performance - but with all the Fermi card driver issues lately, I'm sticking with the 650Ti baseline recommendation for that particular quagmire-of-choice). Anything weaker than that will likely bottleneck a fast rendering card, and we don't want that!

Right now the best price:performance crown goes to AMD for the R9 290 so long as you get one that will run close to its advertised max clock rather than hanging out at 727MHz due to insufficient cooling, but we won't really see what it can do until better coolers are slapped onto it. R9 290X remains a contender, with too little info to go on at the moment to really make an informed decision. The 780 is a little on the wrong side of the price:performance curve thanks to the 290, but nVidia has some neat features that you may value and they've got a good bundle going on at the moment too.

Agreed fucked around with this message at 22:41 on Nov 9, 2013
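Agreed's price:performance framing can be put into rough numbers. A minimal sketch, assuming round prices of $400 for the 290 and $500 for the 290X (the "solid $100 off" from the post; actual street prices varied) and the quoted 8%-13% performance gap:

```python
# Rough price:performance comparison using the post's figures.
# Prices are illustrative round numbers, not retail quotes.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance units per dollar spent."""
    return relative_perf / price

r9_290x = perf_per_dollar(1.00, 500)        # baseline: 290X = 1.0
r9_290_worst = perf_per_dollar(0.87, 400)   # 290 trailing by 13%
r9_290_best = perf_per_dollar(0.92, 400)    # 290 trailing by 8%

print(f"290X:             {r9_290x:.5f} perf/$")
print(f"290 (worst case): {r9_290_worst:.5f} perf/$")
print(f"290 (best case):  {r9_290_best:.5f} perf/$")
```

Even in the worst case the 290 comes out ahead per dollar, which is why the post treats it as the smart buy and the 780Ti as existing mostly for the halo.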

GrizzlyCow
May 30, 2011
AnandTech use a Phantom 630 full-tower case, an overclocked i7-4960X, and loop their benchmark runs several times.

HardOCP, on the other hand, don't use a chassis, but they do actually play the games they benchmark (more like turning on god mode and running around); that is, their benchmark is actually playing the game while recording the FPS. Their results match up with AnandTech's.

Still, that is an interesting complaint you have. I'll shoot an email to a couple of reviewers and ask for more information on their methodology.

Abalieno
Apr 3, 2011

Agreed posted:

Right now the 770 needs to go lower to justify its existence even with the games bundle, and doesn't have enough VRAM for the price point.

I see the RAM issue being brought up frequently, but it's not a real issue.

By the time you need more than 2GB you also need a faster card. The 770 just does nothing with more memory onboard, and in most cases the 4GB versions are underclocked compared to the 2GB models.

Even in BF4, the most RAM-hungry game, memory usage is far from being an issue. It only becomes one above 1080p with 4x AA, and in those cases the 770 is just not enough anyway.

So 2GB is simply exactly what a card like the 770 needs. More memory only becomes useful with a faster card.
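The resolution-and-AA point can be sketched with the standard width x height x bytes-per-pixel accounting for render targets. The resolutions and sample counts here are illustrative examples, not measurements from BF4:

```python
def render_target_mib(width: int, height: int,
                      bytes_per_pixel: int = 4, msaa_samples: int = 1) -> float:
    """Approximate size of a single render target in MiB."""
    return width * height * bytes_per_pixel * msaa_samples / 2**20

# Color + depth/stencil buffers at 1080p vs 1440p, with and without 4x MSAA:
for w, h in [(1920, 1080), (2560, 1440)]:
    for samples in (1, 4):
        color = render_target_mib(w, h, 4, samples)   # e.g. RGBA8
        depth = render_target_mib(w, h, 4, samples)   # e.g. D24S8
        print(f"{w}x{h} @ {samples}x MSAA: ~{color + depth:.0f} MiB color+depth")
```

The buffers themselves stay far below 2GB even at 1440p with 4x MSAA; textures and geometry fill the rest, and by the time those push total usage past 2GB, the GPU itself tends to be the bottleneck, which is the argument above.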

Magic Underwear
May 14, 2003


Young Orc
Is there any inkling that there may be a new never settle bundle for Black Friday/the holidays? $400-$450 for a custom 290 and a few new games would be killer.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Abalieno posted:

I see the RAM issue being brought up frequently, but it's not a real issue.

By the time you need more than 2GB you also need a faster card. The 770 just does nothing with more memory onboard, and in most cases the 4GB versions are underclocked compared to the 2GB models.

Even in BF4, the most RAM-hungry game, memory usage is far from being an issue. It only becomes one above 1080p with 4x AA, and in those cases the 770 is just not enough anyway.

So 2GB is simply exactly what a card like the 770 needs. More memory only becomes useful with a faster card.

It's awesome to be talking to me back when the 680 launched. Word of warning, don't buy the 780, they're going to knock like $170 off the price and you're going to feel dumb as heck.

Edit: No, seriously

Agreed fucked around with this message at 23:46 on Nov 9, 2013

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I feel awful for saying this but CoD: Ghosts looks BEAUTIFUL on my GTX 670 with 4X TXAA enabled. This is the antialiasing I have been looking for. Now if only I had G-Sync so I could play without Vsync...

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

I feel awful for saying this but CoD: Ghosts looks BEAUTIFUL on my GTX 670 with 4X TXAA enabled. This is the antialiasing I have been looking for. Now if only I had G-Sync so I could play without Vsync...

I know, man, TXAA is the poo poo. I hate to keep using Tim Lottes' description of it, but it fits: the cinematic look and even "feel" it imparts is just fantastic. I also look forward to picking up a G-Sync capable monitor so I can run it, because I'd rather have that (with its still-25%-of-SSAA performance hit, akin to heavy-ish 4xMSAA, at 40-50fps) than a straight 60fps with a less visually appealing AA method. I got my first experience with it in situ with Crysis 3, and I don't really trust them to have optimized it super well, but drat is it gorgeous (although with regard to optimization, from reading about it there's really not that much involved in converting an MSAA pass to a TXAA pass; even given their history of tessellated oceans underneath geometry and rendering the poo poo out of turnpikes, I don't know how you could gently caress that up too badly).

Agreed fucked around with this message at 00:18 on Nov 10, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
I would like better games than cod ghosts to have TXAA as an option. :argh:

veedubfreak
Apr 2, 2005

by Smythe
FYI, my 290 is running at 1102MHz and 5400 mem with temps never going over 49C. The problem right now is voltage. Once I get a voltage-controlling overclock program I should be able to go much higher. These cards are the bomb on water.

Boar It
Jul 29, 2011

Mesmerizing eyebrows is my specialty

Agreed posted:

To simplify things, here's the current high-end video card situation, as of today/time of writing.


Wait, so the R9 290X is not the same card as R9 290? Since I keep seeing you guys saying both 290 and 290X.

Wistful of Dollars
Aug 25, 2009

Torabi posted:

Wait, so the R9 290X is not the same card as R9 290? Since I keep seeing you guys saying both 290 and 290X.

The 290 is a 290x with a v6 instead of a v8. It's just a really good v6.

Magic Underwear
May 14, 2003


Young Orc

Torabi posted:

Wait, so the R9 290X is not the same card as R9 290? Since I keep seeing you guys saying both 290 and 290X.

Yeah, this was really stupid on AMD's part. Everyone either risks the misunderstanding or has to call it "290 non-X"

beejay
Apr 7, 2002

Not much different than Ti/non-Ti. Why have video card naming conventions been screwy for so many years?

edit: Just started looking back and I forgot what a mess things were around the GeForce 4 era, when you had the GeForce 4 Ti 4200 and 4600, and then the GeForce 3 Ti 200 which was still available, and then the GeForce 4 MX line which was technically GeForce 2 hardware and performed worse than the GeForce 3 cards... insanity.

beejay fucked around with this message at 02:23 on Nov 10, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Torabi posted:

Wait, so the R9 290X is not the same card as R9 290? Since I keep seeing you guys saying both 290 and 290X.

R9 290 is to GTX 670 as R9 290X is to GTX 680.

It has most of the features and nearly identical performance, just barely edged out by its more expensive counterpart, for $100 less. It's the smart buy if you're going to drop a load of cash on a graphics card. However, I do think we should pay attention to what happens with the R9 290X as things progress and sufficient cooling enters the equation from partners; right now, while its "Uber Mode" switch helps it keep closer to the higher possible stock Powertune clock settings, it is still under-cooled and appears to have significant headroom beyond the cooler's limitations.

Powertune is sort of like GPU Boost 2.0 from nVidia when you tell Boost 2.0 to focus on temperature and open the power floodgates entirely, except it works backwards - it starts at a predetermined high clock rate and then throttles heavily if the fan profile can't keep the chip within a given temperature threshold.

This will work to its advantage in overclocking when it actually gets a half-rear end decent cooler put on it (see, for example, our resident super high res waterblock guy's overclocking results on water), but for the time being, with the extraordinarily lovely reference coolers, there are some cards - 290s, especially, since they don't have an Uber Mode to kick the fan into high gear - that just don't cool well enough without extraordinary measures, and throttle quickly. This despite a patch aimed specifically at the issue.

The coolers just don't work well. Powertune does its best, but its best kind of sucks when the coolers themselves are at fault. On top of that they're noisy; usually if a cooler is loud it's at least moving heat, but AMD engineered the chips one hell of a lot better than they engineered the coolers this go-around. :sigh:
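The "starts high, throttles down" behavior described above, versus Kepler's "starts at base, boosts up", can be sketched as a toy control loop. This is an illustrative model only, not AMD's or nVidia's actual firmware: the 13MHz step is borrowed from Kepler's boost-bin granularity, and the clock/temperature figures (727MHz floor, 95C limit, 863MHz base) are stand-ins from thread and review chatter.

```python
def powertune_step(clock_mhz, temp_c, temp_limit=95, max_clock=1000, min_clock=727):
    """Hawaii-style: assume the max clock, shed speed when the cooler loses."""
    if temp_c >= temp_limit:
        return max(clock_mhz - 13, min_clock)   # throttle down one bin
    return min(clock_mhz + 13, max_clock)       # recover toward the advertised max

def boost2_step(clock_mhz, temp_c, temp_target=80, base_clock=863, boost_limit=1000):
    """Kepler-style: start at base, opportunistically boost while under target."""
    if temp_c >= temp_target:
        return max(clock_mhz - 13, base_clock)  # back off, but never below base
    return min(clock_mhz + 13, boost_limit)     # boost one bin up

# An under-cooled Hawaii card pinned at its temperature limit:
clock = 1000
for _ in range(10):
    clock = powertune_step(clock, temp_c=95)
print(clock)  # 870 - and still sliding toward the 727MHz floor
```

Same end result either way (clocks follow cooling), but the starting point differs, which is exactly the upward-vs-downward-compression analogy made later in the thread.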

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Seems like nVidia's cooling solution is more well thought out?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Tab8715 posted:

Seems like nVidia's cooling solution is more well thought out?

Their high end cooling solution is to date the best vapor chamber dual slot cooler for a graphics/compute card, ever. Ever ever. Yes. Cast aluminum housing with cast aluminum fins under a Lexan faceplate with an extremely efficient vapor chamber drawing heat from everything, all engineered for maximum airflow internally and using an already quiet AND ALSO acoustically dampened fan. They achieve a blower cooler that builds on the success of the GTX 680's already very good cooler to deliver performance that is best in class, and best ever.

On the software side of things, I prefer Boost 2.0 for overclocking (because it's easy and exposes the stuff you can dick with to the user, though it's a bit draconian in what it doesn't let you mess with, too), and while AMD's Powertune software is great, they kinda hosed up the fan speed optimizations that they borrowed, conceptually, from nVidia. nVidia has a much better implementation of psychoacoustics in their default fan control profiles, with slow and easy ramp-up/ramp-down behavior to avoid the really obnoxious whiiRRRRRRRR that happens with AMD's blowers.

They also just don't run as hot. They could, but it'd be dumb and they'd throttle constantly. Just like... :haw:

veedubfreak
Apr 2, 2005

by Smythe
Like I already said, I'm watercooling my 290 and it is literally the same as the 290X. My card is running 1100MHz and 5400 memory; the only reason I can't go higher is that there isn't a voltage mod yet.

Magic Underwear
May 14, 2003


Young Orc

veedubfreak posted:

Like I already said. I'm watercooling my 290 and it is literally the same as the 290x. My card is running 1100mhz and 5400memory, the only reason I can't go higher is that there isn't currently a voltage mod yet.

They differ by more than just clock speed, you know. The 290 has 9% fewer SPs and "texture units".

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

veedubfreak posted:

Like I already said. I'm watercooling my 290 and it is literally the same as the 290x. My card is running 1100mhz and 5400memory, the only reason I can't go higher is that there isn't currently a voltage mod yet.

It's not literally the same, though; it's got 2560 shader units and 160 texture units compared to 2816 and 176, respectively. Not a huge difference, but if, as some who went with the 290X are finding, it turns out to be a really radical overclocker that can hit 1200MHz-1300MHz when all the necessary cooling conditions are met, that could become a more significant performance disparity. Probably still within 15% even with overclocking of both cards taken into account, and you're saving a lot of dough and likely don't need that additional 15% even at your triple-monitor, >4K resolution, but there are some differences worth considering going forward as we find out what AMD's validation process for the Hawaii GPU looks like. If they're binning aggressively to run the 290Xs full-out, it'll probably be less of an issue than otherwise. Regardless, you've obviously made the smarter choice for multi-monitor gaming: the price difference amounts to half the cost of adding another 290, all for that maybe-15%-ish performance difference, best case.

It's pretty much dead on analogous to the 670 vs 680. Oh no, a SMX :qq:

Agreed fucked around with this message at 03:10 on Nov 10, 2013
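For concreteness, the unit counts above work out as follows. Equal clocks are assumed, which is a simplification; identical ROPs and memory bandwidth on both cards are part of why real-game gaps come in under the theoretical number:

```python
# Unit counts from the post: 290 vs 290X (Hawaii).
r9_290 = {"shaders": 2560, "texture_units": 160}
r9_290x = {"shaders": 2816, "texture_units": 176}

for unit in ("shaders", "texture_units"):
    deficit = 1 - r9_290[unit] / r9_290x[unit]
    print(f"{unit}: {deficit:.1%} fewer on the 290")
# Both ratios are exactly 10/11, i.e. ~9.1% fewer units, so at equal
# clocks the theoretical shader/texture gap tops out around 9%.
```

That lines up with the "9% fewer SPs" figure quoted earlier and with the observed 8%-13% gap once clock differences are included.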

veedubfreak
Apr 2, 2005

by Smythe

Agreed posted:

It's not literally the same, though, it's got 2560 shader units and 160 texture units compared to 2816 and 176, respectively; not a huge difference, but if, as some who went with the 290x are finding, it turns out to be really radical overclocker that can hit 1200MHz-1300MHz if all necessary conditions are met for cooling, that could be a more significant performance disparity. Probably still within 15%, even if overclocking of both cards is taken into account, and you're saving a lot of dough and likely don't need that additional 15% even at your triple-monitor, >4K resolution, but there are some differences worth considering especially going forward as we find out what AMD's validation process for the Hawaii GPU looks like. If they're binning aggressively to run the 290Xs full-out, then it'll probably be less of an issue than otherwise, and regardless, you've obviously made the smarter choice for multi-monitor gaming considering the price difference jumps half the price of adding another 290 at that point for that maybe-15%-ish performance difference, best case.

It's pretty much dead on analogous to the 670 vs 680. Oh no, a SMX :qq:

Trust me, I have a 290X on my desk. The amount of difference is not worth the 150 dollars, which is why I bought a 2nd 290 and a 2nd waterblock. The ROPs and memory bandwidth seem to be the major factors, while the single line of texture units doesn't seem to really matter.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
So here's an interesting test by someone who did a power output test on an R9 290X:

http://www.overclock.net/t/1441118/290x-psu-power-output-tests#post_21159770

"My PSU is enough to handle my max load seen under Furmark benchmark of 600.66 AC Watts. I would feel better having a high quality 750W PSU and that is what I would get from day 1 if I had the choice. For now my 660W is good enough."

Granted, that's Furmark, which loads the poo poo out of everything, but it's something to keep in mind if you're going to buy one of these: get a really high-efficiency (80 Plus Gold) 650W PSU or bigger to keep things well fed.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

veedubfreak posted:

Trust me, I have a 290x on my desk. The amount of difference is not worth the 150 dollars. Which is why I bought a 2nd 290 and a 2nd waterblock. The rops and memory bandwidth seem to be the major factor. Where the single line of texture units don't seem to really matter.

Point to the part where we disagree, amigo ;)

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Michymech posted:

So here's an interesting test by someone who did a power output test on an R9 290X:

http://www.overclock.net/t/1441118/290x-psu-power-output-tests#post_21159770

"My PSU is enough to handle my max load seen under Furmark benchmark of 600.66 AC Watts. I would feel better having a high quality 750W PSU and that is what I would get from day 1 if I had the choice. For now my 660W is good enough."

Granted, that's Furmark, which loads the poo poo out of everything, but it's something to keep in mind if you're going to buy one of these: get a really high-efficiency (80 Plus Gold) 650W PSU or bigger to keep things well fed.

"Oh no, my PSU is not big enough!" said the guy who missed the point of the relationship between AC power, DC power, and PSU efficiency.

At that level of power, that fellow's PSU is about 90% efficient - which he gets, but the furthest he took it was the utility cost. If you do the math, he's drawing about 540W DC of 660W available, i.e. about 18% headroom. It's perfectly fine.
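Spelled out, the arithmetic looks like this. The 600.66W wall reading is from the linked overclock.net post; the flat 90% efficiency is an approximation, since real efficiency varies with load:

```python
ac_watts = 600.66     # AC draw at the wall under Furmark (linked post)
efficiency = 0.90     # approximate PSU efficiency at that load
psu_capacity = 660    # the PSU's rated DC output in watts

dc_load = ac_watts * efficiency
headroom = (psu_capacity - dc_load) / psu_capacity
print(f"DC load: {dc_load:.0f}W of {psu_capacity}W ({headroom:.0%} headroom)")
# The PSU's rating is DC output, so the scary 600W AC figure never
# actually approached the 660W limit.
```

The key point: a PSU's wattage rating is DC output, while a wall meter reads AC input, and the efficiency factor sits between the two.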

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Agreed posted:

Their high end cooling solution is to date the best vapor chamber dual slot cooler for a graphics/compute card, ever. Ever ever. Yes. Cast aluminum housing with cast aluminum fins under a Lexan faceplate with an extremely efficient vapor chamber drawing heat from everything, all engineered for maximum airflow internally and using an already quiet AND ALSO acoustically dampened fan. They achieve a blower cooler that builds on the success of the GTX 680's already very good cooler to deliver performance that is best in class, and best ever.
Though it's still inferior to most AIB cooler solutions. It also costs a lot to make apparently so I'm not sure how "useful" it really is given the AIB options. It's a drat sexy blower though. (Though didn't some of Anand's benches show the 780 with the reference blower could hit its 80 degree throttling temp in some games?)

quote:

They also just don't run as hot. They could, but it'd be dumb and they'd throttle constantly. Just like... :haw:
You mean like the furnace that was Fermi? Let's not pretend that nVidia hasn't had big thermal and noise issues as well. :)

BurritoJustice
Oct 9, 2012

SourKraut posted:

Though it's still inferior to most AIB cooler solutions. It also costs a lot to make apparently so I'm not sure how "useful" it really is given the AIB options. It's a drat sexy blower though. (Though didn't some of Anand's benches show the 780 with the reference blower could hit its 80 degree throttling temp in some games?)

You mean like the furnace that was Fermi? Let's not pretend that nVidia hasn't had big thermal and noise issues as well. :)

The 780 doesn't downclock/throttle. It just overclocks when it can (to a reasonably small degree at least). Also it pretty much always runs at 80 degrees because that is the default temperature target, for acoustic management.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
It's a pretty arbitrary distinction you're making, "overclocking when it can" vs. "downclocking when it must."

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

It's a pretty arbitrary distinction you're making, "overclocking when it can" vs. "downclocking when it must."

I get what he means, though; it's the difference between Boost 2.0 and AMD's HI version of Powertune. The end result is very practically similar, but it's similar to the difference between upward and downward compression (where, again, the end result can be very practically similar, though they get there via different routes).


SourKraut posted:

Though it's still inferior to most AIB cooler solutions. It also costs a lot to make apparently so I'm not sure how "useful" it really is given the AIB options. It's a drat sexy blower though. (Though didn't some of Anand's benches show the 780 with the reference blower could hit its 80 degree throttling temp in some games?)

<snip>

You mean like the furnace that was Fermi? Let's not pretend that nVidia hasn't had big thermal and noise issues as well. :)

As to the first, AMD more or less addressed that when they talked about how people are "used to" cards running cooler. Clever bit of doublespeak, but 80°C is still hot - given a relaxed fan profile (and the stock one is pretty damned relaxed, going for noise over absolute maxxximum clocks) I imagine it'd let it get up to around there and just sorta hang out, which iirc costs it 13MHz, I believe that's still the granularity across all Kepler cards for that particular boost bin. Setting a more aggressive one is probably louder but it's not hairdryer loud.

As to the second, haha man we went way back before Fermi's 400-series noise and heat issues (my 580 was awesome, it'd probably still do the job today if I weren't hit in the head with a graphics brick when I was a child). Remember the 5000 series Ultra fans? We talked about those a page or two ago. Holy poo poo loud. Startlingly so.

They both suck at various stuff at different times, who doesn't get that by now?

Agreed fucked around with this message at 16:23 on Nov 10, 2013

BurritoJustice
Oct 9, 2012

Factory Factory posted:

It's a pretty arbitrary distinction you're making, "overclocking when it can" vs. "downclocking when it must."

The main difference between Boost 2.0 and Powertune is that with the stock cooler, 780s won't just stay at the "Boost Clock": they will exceed it by a healthy margin when they can, while the 290 and 290X fight just to reach their "Boost Clock". If a 780 has no thermal headroom it will still likely run at its "Boost Clock", just not in excess of it, while a 290 with no headroom will run at its minimum speed and eventually increase fan speed to stay stable. An arbitrary difference, yes, and more a matter of how the two companies define "Base" and "Boost" clocks, but noticeable in practice.

Arzachel
May 12, 2012

BurritoJustice posted:

The main difference between Boost 2.0 and Powertune is that with the stock cooler, 780s won't just stay at the "Boost Clock": they will exceed it by a healthy margin when they can, while the 290 and 290X fight just to reach their "Boost Clock". If a 780 has no thermal headroom it will still likely run at its "Boost Clock", just not in excess of it, while a 290 with no headroom will run at its minimum speed and eventually increase fan speed to stay stable. An arbitrary difference, yes, and more a matter of how the two companies define "Base" and "Boost" clocks, but noticeable in practice.

Since reviewers usually don't limit Kepler GPUs to their boost clocks, there's zero difference. Both a 290 and a 780 will perform worse than you'd expect from review benchmarks in a poorly ventilated case.

BIFF!
Jan 4, 2009
So last night I noticed my GTX 770 was idling about 15 degrees higher than normal. I opened up GPU-Z and saw that the memory and core clocks were idling at maximum. I tried rebooting but they stayed the same. I forced a crash by raising the clocks like 500 MHz each and they went back to normal. I booted up today and they're back to being maxed out. I'm running 331.65. Is this just a driver issue? Anyone else experiencing this?

veedubfreak
Apr 2, 2005

by Smythe
Just gonna remind everyone that when you put the 290 on water, it overclocks hardcore until it basically runs out of voltage.

Hamburger Test
Jul 2, 2007

Sure hope this works!

Agreed posted:

Word of warning, don't buy the 780, they're going to knock like $170 off the price and you're going to feel dumb as heck.

Is this a serious prediction? Because I'm sitting here hoping for nVidia's response to the 290, since I would prefer to stick with them for now. If it doesn't come though, I can wait for the custom cooled 290's.

Magic Underwear
May 14, 2003


Young Orc

BIFF! posted:

So last night I noticed my GTX 770 was idling about 15 degrees higher than normal. I opened up GPU-Z and saw that the memory and core clocks were idling at maximum. I tried rebooting but they stayed the same. I forced a crash by raising the clocks like 500 MHz each and they went back to normal. I booted up today and they're back to being maxed out. I'm running 331.65. Is this just a driver issue? Anyone else experiencing this?

Change your monitors recently? NV cards clock up when hooked up to multiple monitors unless they run at the exact same resolution/refresh rate.

Hamburger Test posted:

Is this a serious prediction? Because I'm sitting here hoping for nVidia's response to the 290, since I would prefer to stick with them for now. If it doesn't come though, I can wait for the custom cooled 290's.

No way in hell. I think the huge drops on the 770 and 780 plus the games are as far as NV is going to go.

GrizzlyCow
May 30, 2011

veedubfreak posted:

Just gonna remind everyone that when you put the 290 on water, it overclocks hardcore until it basically runs out of voltage.

Afterburner doesn't work? Or does it not have voltage control yet for the Hawaii cards?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Fraps is only reporting up to 120 FPS even though I'm explicitly setting COD: Ghosts to 144Hz in its settings (and my monitor is set to 144Hz in Windows, 1080p, using dual-link DVI). Does Fraps not go higher than that, or is something else locking the FPS?

Edit: Figured it out, I needed to reboot after turning the EVGA Precision X limiter off. However, the frame rate limiter on that only goes to 120, is there an alternative that can go to 144?

Zero VGS fucked around with this message at 03:51 on Nov 11, 2013

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
The 780ti is a very reasonable price here in Australia - it's much the same price as the 780 was before the price drop a few weeks ago. Also, I read in reviews that Nvidia is extending its game bundle to the 780ti.

BIFF!
Jan 4, 2009

Magic Underwear posted:

Change your monitors recently? NV cards clock up when hooked up to multiple monitors unless they run at the exact same resolution/refresh rate.

Yeah, that would be it. I just added a U2312HM to my existing ASUS VN247H-P. Apparently that monitor has a refresh rate between 50-75Hz? Is there any way to get my GPU to downclock?


Bloody Hedgehog
Dec 12, 2003

💥💥🤯💥💥
Gotta nuke something

BIFF! posted:

Yeah that would be it. I just added a U2312HM to my existing ASUS VN247H-P. Apparently that monitor has a refresh rate between 50-75 hz? Is there any way to get my gpu to downclock?

I don't think it's even a question of the monitors being mismatched. I think the GPU clocks up with two monitors simply because it needs more power to drive two monitors (I could be wrong on that though, feel free to correct me if I am).
