Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Samsung has a huge fab business. They can make as many chips as they want and not have to play games with Global Foundries or whoever. They stick those chips into their own products, as well as manufacture chips for others, including Apple. Not to mention that Nvidia doesn't yet have a cellular modem they can integrate, which Samsung kinda needs.

Plus Tegra is a 4+1 core SoC. Samsung uses a 4+4 core (4xA15 + 4xA7) chip for the SGS4. They make the RAM, the NAND, hell, the entire device. Why let another company have a slice of the pie if they don't have the product you want?

Sometimes they'll use another company's SoC. SGS3 used a Qualcomm Snapdragon. But again, Nvidia doesn't have a cellular modem here. You gotta have that before you can have a phone.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
It's a good offensive and defensive strategy: they can expand into potential markets without doing lame channel sales agreements or making more fab contracts, and if they're having problems getting good profits off of existing IP, just licensing it can shore up margins to help out. It'll probably be a year or two before we see anything materialize out of this, but I'd like to be wrong about that and find out there were already things in the works that just made the licensing official policy for everyone.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
nVidia does have a cell modem - they acquired Icera. The Tegra 4i integrates the modem into the SoC, though handsets based around that SoC aren't actually out yet due to product lead times. That said, I don't know how nVidia's Tegra products could be considered competitive, with Qualcomm's especially. I do think the market for an integrated Kepler is being underrated; pretty much every company that currently licenses ARM's or Imagination Tech's graphics IP is a potential customer. There's a shitton of Chinese companies making ARM SoCs that you generally don't hear about in the West.

NJ Deac
Apr 6, 2006
Is it normal for a reference-board Radeon 7970 to overheat and crash when at stock settings? My VisionTek 7970 seems to have started doing this in the last year or so any time I run any 3D application, and the only workaround to prevent heat-related crashes is to manually jack up the fan speed, or run MSI Afterburner with, again, jacked-up fan speeds. This results in the fan sounding much like a leaf blower, which doesn't necessarily bother me, since I'm usually wearing headphones, but it does annoy my wife sitting nearby. It appears the fan needs to run somewhere between 50-60% capacity to prevent the card from freaking out.

I've tried blasting out the vents with a can of compressed air to no effect, and overheating at stock settings seems indicative of some kind of problem. I also don't understand why the default fan controller software doesn't increase the fan speed as the card approaches its maximum temperature, instead of letting it crash. Has anyone else seen this behavior, or should I just bite the bullet and either RMA the card or find an aftermarket cooler?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I'd try removing the heatsink, cleaning it thoroughly and reapplying thermal paste to the card, then securely remounting the heatsink. I had a similar problem with my Radeon HD 4850 a while back and that fixed all of my issues. I think it was mostly the thermal paste, but getting better angles to clean the dust out from inside the heatsink probably helped too.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

NJ Deac posted:

Is it normal for a reference-board Radeon 7970 to overheat and crash when at stock settings?

While Alereon's DIY approach might help your problem, the answer to this question is a solid "no, absolutely not."

The product sold to you was advertised rather specifically not to do that thing. If you're within your warranty period, I'd RMA it. It is not your fault that the card is misbehaving, and you risk voiding the warranty by reapplying thermal paste or doing other things.

As far as automatic fan controls go, sometimes if you manually set a fan control profile, the automatic control functionality kind of doesn't work anymore. A driver uninstall/reinstall might fix that, though. It was an old issue on nVidia cards, might be a current issue on AMD ones, no clue. But no, under normal circumstances, if it is within the card's power to cool itself, it should do so. That it doesn't crash when you set the fans manually, but does crash at stock settings when left to cool itself, makes it a candidate for RMA, quick-like, in my opinion.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Agreed posted:

As far as automatic fan controls go, sometimes if you manually set a fan control profile, the automatic control functionality kind of doesn't work anymore. A driver uninstall/reinstall might fix that, though. It was an old issue on nVidia cards, might be a current issue on AMD ones, no clue.
AMD/ATI cards revert to the fan controls specified in the GPU BIOS if software speed control is removed, so the fan speeds shouldn't have changed unless the BIOS was intentionally flashed or the cooler is malfunctioning. RMA is probably the best move at this point, although re-seating the cooler/checking case intake fans might be enough to fix it.

Animal
Apr 8, 2003

The new cell phone/tablet GPU by Qualcomm, the Adreno 330, is faster than a Geforce 7800 GTX. That kind of blows my mind. We could be playing Battlefield 2 on our tablets.

Anandtech comparison

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
That's just insane because the GPU in the PlayStation 3 is essentially a GeForce 7800 GTX.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

spasticColon posted:

That's just insane because the GPU in the PlayStation 3 is essentially a GeForce 7800 GTX.

The PS3 used its weird rear end CPU for most of the processing, though.

Yudo
May 15, 2003

Animal posted:

The new cell phone/tablet GPU by Qualcomm, the Adreno 330, is faster than a Geforce 7800 GTX. That kind of blows my mind. We could be playing Battlefield 2 on our tablets.

Anandtech comparison

Ugh...to think that Qualcomm bought AMD Imageon for just $65 million. AMD's management sucks so loving hard.

Regarding NV licensing its IP: am I being cynical in my assessment that they are hemorrhaging cash on Tegra and trying a new strategy to get into the mobile market? Mali clearly cannot compete with Adreno; perhaps, as mentioned before, Apple or Samsung would license it to pair with the otherwise competitive Cortex A15. This assumes Kepler scales down, though - NV can't seem to do it themselves. I dunno how attractive this is for licensees if NV stays in the SoC market.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Forgive me a moment of nerd :smith:, but here's what delivery day looks like when you're a week and a half in from spine surgery:

1. Receive card! Woooo I own an EVGA GTX 780 SC ACX, ballin as gently caress yeeeeah :supaburn:

2. Register card for warranty, etc., keeping in mind 30 days to buy an advanced RMA if I want one and 60 days to buy a cross-shipment warranty tier if I want one. :eng101:

3. Stare at box until family members arrive to help me get a spot cleared off so I can put the card in. For hours. Basically, hang out with a GTX 780 SC ACX. It's alright company, I guess. Not much for conversation but a good listener.

Can't be too long now. I hope!

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Yudo posted:

Regarding NV licensing its IP: am I being cynical in my assessment that they are hemorrhaging cash on Tegra and trying a new strategy to get into the mobile market? Mali clearly cannot compete with Adreno; perhaps, as mentioned before, Apple or Samsung would license it to pair with the otherwise competitive Cortex A15. This assumes Kepler scales down, though - NV can't seem to do it themselves. I dunno how attractive this is for licensees if NV stays in the SoC market.
I think you're oversimplifying. It's a little early to say that Mali can't compete with Adreno when we're comparing the performance of a new Adreno GPU with unknown size and power requirements to existing low power/cost Mali configurations. I should also note that nVidia has not been particularly competitive in the mobile GPU market to date (they still use Geforce 7-era non-unified vertex and pixel shaders), so it's not a foregone conclusion that Kepler will beat the competition.

All that said, Qualcomm is kicking a lot of rear end right now because Krait seems to be a better performance/power balance than Cortex A15 and they are able to get a lot of integration and cost wins since they don't have to license and integrate as many IP blocks from third parties.

It probably is time to post that low-power computing/SoC megathread I've been thinking about...

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Maybe. It'd be fun.

That said, while that's true for Tegra, we might have seen something like this IP licensing scheme coming when they announced the Kayla ARM/Kepler dev platform.

dog nougat
Apr 8, 2009

dog nougat posted:

Don't abuse the system man :v:.

But seriously, that's odd. I managed to break a capacitor (I think) off the back of my card while putting it back to factory spec and they still honored my RMA. Well, sort of - there's going to be some cost associated with the repair now, but considering I voided my warranty, it was pretty awesome of them. Granted, this happened after the RMA was approved, and I emailed them about it and took photos to document the damage. It's still really good of them... in theory at least; I guess when I find out what it will cost me I may sing a different tune. Still, it's probably better than having a $300 paperweight and shelling out for a new card.

So I just got an email from EVGA saying that my replacement card has shipped. Not what I was expecting to hear. That's pretty awesome of them though. Makes me wonder if they'll charge me for the damage I did to the card. In either case, they'll definitely get my business again.

I'm assuming I'll have to register this card for their warranty service once it arrives.

beejay
Apr 7, 2002

They will auto-register the new card for you before it even gets to you. Their general RMA service is pretty awesome. I just have something nutty going on with the specific model of card and my system and had a bad experience with a manager and I'm afraid I had no choice but to move on this time.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

beejay posted:

They will auto-register the new card for you before it even gets to you. Their general RMA service is pretty awesome. I just have something nutty going on with the specific model of card and my system and had a bad experience with a manager and I'm afraid I had no choice but to move on this time.

I hope I'm not overstepping, but without airing dirty laundry I can say that beejay has his reasons and they're legit - it's not the usual EVGA experience, but some people clearly get treated poorly at a point. Case by case, I guess, we're just more used to good news.

Speaking of which, well, my GTX 780 is in my computer now, awesome, and... it's kind of a lame overclocker. Broke my lucky streak - going all the way back to the GTX 280, I had cards that would hit the max bin (the GTX 280 had a roof on overclocking, but not every card could hit it - mine could, though, drat it). My GTX 580, which seems to be going into very deserving hands with poster Flute, got up to 915MHz stable across all of DX9/DX10/DX11; if it had only needed DX11, it'd have been 945MHz. Either way, that's a good overclock for GF110! And my GTX 680 ran at 1241MHz rock solid.

I'm getting low-to-mid 1100s MHz, topping out around 1160-1170MHz (whatever the turbo bin is there) with my GTX 780 SC ACX. The memory overclocks like crazy, and as you might expect with Kepler that brings surprising performance gains, though not the way it did with the GTX 680. I hate that I got a bad overclocker, because 1. it was expensive, and 2. my lucky streak :saddowns:

On the plus side, even at that clock it runs like a bat outta hell, and, hey, fast memory. Plus, it's extremely quiet. I don't know if it's self-selection or what, but it seems like most other 780s that get posted about are running higher clocks. I like to tell myself I have a higher standard of stability - it has to be 100% stable across DX9, DX10, and DX11, across all benchmarks, with no detectable, reproducible artifacts. But really I probably just got kind of a mediocre GPU this time, one which will overclock past factory specifications but won't truly take advantage of the exceptional cooling offered by EVGA's ACX system. Which is awesome. So quiet.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

You know what, I was hasty - and as usual when you get in a hurry like that, I was wrong. While this might not be a top-10% overclocker, it's got some really nice clocks going on now and is putting out some fantastic performance and image quality to boot. Not gonna mince words: overclocking this thing is weird. The 700 generation (well, Boost 2.0) is going to be another weird shift in how overclocking works. I spent a bunch of time tinkering with it, and I've now got back nearly to my initial performance level, where I had just said "hey, I have the SC ACX :smuggo:, I can probably just turn up the clocks and rock this thing!", set it to +100 core, +something memory, and promptly artifacted and crashed testing utilities for a while. After deflating the :smuggo: and going through a substantial process of trial and error and a lot of different tests and gameplay, I'm fairly comfortable calling it stable now.

No artifacts anymore. The manual boost setting for the core is odd, and seems best used to get you into the ballpark of a given turbo bin under ideal circumstances (and there are modded BIOSes out there now which bypass a lot of this stuff, but I don't recommend them in the least for anyone not going for absurdly, unnecessarily high clocks at the expense of the card).

Now, after learning more about how it works by tweaking it all afternoon and tonight, the manual setting for my core is a lot lower, while the manual setting for my memory is a lot higher (memory works much more straightforwardly: you raise the memory slider, you get two effective clocks for each point you raise it). I also changed the priorities within the Boost 2.0 algorithm from prioritizing TDP to prioritizing the Thermal Target, since the cooler can keep it away from 95ºC all day and it'll still pursue that temperature by upping the voltage and clocks. I did some stuff with the fan controller to make it super aggressive, to make very sure that it wouldn't have any issues staying cool, though I probably needn't have, really. The 780 is no exception to the general Kepler rule of "cool card, higher clocks" - it still throttles one boost bin at a time after a certain temperature is reached, and there's an added layer of trouble if you prioritize the Thermal Target but aren't keeping the card cool.

With the ACX cooling design, I barely notice my aggressive fan profile. Even at max RPM, this is not obnoxiously loud and won't detract from games. The sweet spot for noise and cooling in real-world gaming scenarios is around 60-65% fan speed, but that will also keep it at or under 70ºC under full FurMark load, which is quite a feat given how much power the card can draw and must dissipate. I find it almost hard to believe that their custom cooling solution carries no price premium over the standard EVGA 780 SC, but then again -

I feel it's really only fair to note that the stock blower is pretty damned good this go-around, too - you can see the reference cooler disassembled in this review; scroll down a bit. Notice the very cool vapor chamber design and the undeniably solid construction quality of the reference card, a standard which every manufactured card has to meet thanks to nVidia's Greenlight program. Lots of metal in the right places; it's a great cooler. I don't think it's a 450W-TDP cooler like the ACX could theoretically call itself (quick, somebody strap one to the 5GHz AMD CPUs! :v:). Still, I bet it moves heat extremely well, and benchmarks seem to indicate it's quieter than previous generations while doing it, too.

Incidentally, while on that page you might also have a first encounter (or not, if you're especially knowledgeable?) with the term ASIC score, which has to do with in-house validation and relates to package leakage. Since it's something GPU-Z can measure which relates (but is not determinative) to binning chips for different usage scenarios, overclockers have come to place an emphasis, for better or worse, on having a high (high-70%s and up) ASIC score, while others bemoan low scores (low 70%s down to the mid-60% range - cards don't seem to go lower than that; perhaps they fail to validate). For reference, my card has an ASIC score of something like 66.8% and overclocks fine. I think there's a gestalt of factors that go into a clock, and focusing on one number like that is dumb. There, I said it.

It takes a little getting used to, figuring out how the Boost 2.0 overclocking paradigm works best. It isn't as straightforward as the old "set it to X GHz, test it, repeat" pre-Kepler method, and it isn't fully familiar to Boost 1.0 users right off the bat either since it does some important things differently.

I imagine this stuff in general will be more germane to folks getting 770s since so few are willing or need to shell out for a 780, and the 770 is already positioned very well on the price:performance curve for new-in-box cards. It shares both the new Boost 2.0 tech as well as the pretty darn fancy reference cooler shown above, each of which may surprise you.

Finally, I am eager to see what AMD answers with (especially after their drivers were dissected to see if anything interesting was in the pipeline and sure enough, new stuff on the way). nVidia has a combination of cool technologies and a strong base from which they built their refresh of this hardware generation (so far!) while waiting out the next node shrink for Maxwell. I am curious to see what AMD will do now. I don't think they have a gigantic 7.1 billion transistor die to answer blow for blow with, so what shall their method be? Looking forward to finding out. The 7970GHz is still hanging in there, so it's not like this is going to be a lost generation for them - unless they keep dragging their heels with a response while nVidia makes premium cash on premium cards that have no answer, a la the final months of nVidia's Fermi when AMD already had the Radeon 7000s out and were charging a considerable amount for them. Well, so it goes.

Agreed fucked around with this message at 21:40 on Jun 20, 2013

Wistful of Dollars
Aug 25, 2009

Good news Canadian shoppers! If your heart is set on a 7990 NCIX has them for $400 off! :thumbsup:

(Which brings it down to the same $1000 as everyone else...:iiam:)

beejay
Apr 7, 2002

Agreed, you are dedicated. That is awesome. Are you using PrecisionX? By the way, what is the equivalent program for AMD cards?

edit: I think I would follow your standard of overclocking myself. I would want fully stable all the time under all circumstances, but as fast as it can go in that case. I haven't been into overclocking for a long time and once I get my system finally working at stock, I will probably start working on the overclocking, CPU and GPU. I'm getting a Sapphire 7870 GHZ edition and I have read that they can usually be overclocked pretty heartily. I had my CPU overclocked slightly for a bit but have been running stock ever since. Haven't touched my video cards.

beejay fucked around with this message at 13:45 on Jun 20, 2013

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Between Intel setting the Ultrabook standard and nVidia's Greenlight program, I'm really looking forward to quality becoming the norm and not the exception

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
MSI Afterburner is built on the same core as EVGA Precision - both are based on RivaTuner - and it works for both AMD and Nvidia cards (since MSI makes both).

Oblivion590
Nov 23, 2010

Agreed posted:

Incidentally, while on that page you might also have a first encounter (or not, if you're especially knowledgeable?) with the term ASIC, which has to do with package leakage and is something GPU-Z can measure which relates (but is not determinate) in binning chips for different usage scenarios.

This is a great post, but I wanted to jump in on this misleading little part. An ASIC is just an abbreviation for "Application-Specific Integrated Circuit". CPUs and GPUs are ASICs, but so are other non-reconfigurable chips. The term used in the linked pages is "ASIC quality". From my brief search, it looks like GPU-Z determines ASIC quality by reading a number that was programmed in during validation. We aren't privy to the exact details of the binning process, so ASIC quality doesn't seem like a very useful metric. The benchmarks an overclocker might run are almost certainly different from the ones used to generate that score.

dog nougat
Apr 8, 2009

dog nougat posted:

So I just got an email from EVGA saying that my replacement card has shipped. Not what I was expecting to hear. That's pretty awesome of them though. Makes me wonder if they'll charge me for the damage I did to the card. In either case, they'll definitely get my business again.

I'm assuming I'll have to register this card for their warranty service once it arrives.

Well, looks like they're sending me a 570 Classified. A nice, if minor, upgrade from a plain 570 HD - at least I get the reference cooler instead of the noisy-rear end 560 cooler slapped on a 570. Part of me wants to sell it and get a 700 series, but I also feel like I'd be better off just waiting until the 800 series comes out to see a really big jump in performance.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Oblivion590 posted:

We aren't privy to the exact details of the binning process, so ASIC quality doesn't seem like a very useful metric. The benchmarks an overclocker might run are almost certainly different from the ones used to generate that score.
The ASIC quality score is used to decide what GPUs go into pre-overclocked products, so it would be very surprising if it didn't correlate with overclockability. My baseline (non-overclocked) EVGA GTX 670 has a relatively low ASIC quality and doesn't overclock particularly well, while pre-overclocked cards with the same cooler have higher ASIC quality values and generally overclock better. That doesn't mean there's a perfect relationship, but there should be SOME relationship.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

dog nougat posted:

Well, looks like they're sending me a 570 Classified. A nice, if minor, upgrade from a plain 570 HD - at least I get the reference cooler instead of the noisy-rear end 560 cooler slapped on a 570. Part of me wants to sell it and get a 700 series, but I also feel like I'd be better off just waiting until the 800 series comes out to see a really big jump in performance.
Moving from the 570 to a 780 will get you a really insanely big jump in performance, if you want to spend that much money.

dog nougat
Apr 8, 2009

TheRationalRedditor posted:

Moving from the 570 to a 780 will get you a really insanely big jump in performance, if you want to spend that much money.

:stare: Those prices. Looks like I'm gonna stick with the 570 for now, until I can save the $$ to get a new card. Pretty sure I'll be able to hold out until the 800 series comes out.

Is there some reason the 780 seems to only come in 3GB models whereas the 770 has a 4GB version? I'm only really looking at EVGA because their customer service has won me over, so I may be completely wrong.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Oblivion590 posted:

This is a great post, but I wanted to jump in on this misleading little part. An ASIC is just an abbreviation for "Application-Specific Integrated Circuit". CPUs and GPUs are ASICs, but so are other non-reconfigurable chips. The term used in the linked pages is "ASIC quality". From my brief search, it looks like GPU-Z determines ASIC quality by reading a number that was programmed in during validation. We aren't privy to the exact details of the binning process, so ASIC quality doesn't seem like a very useful metric. The benchmarks an overclocker might run are almost certainly different from the ones used to generate that score.

To be fair, it was SUPER late when I typed that out; I'll fix the omission of the word, thanks for pointing that out. To add on to Alereon's thoughts, you can head to any overclocking joint around and they are all over the ASIC score - multiple instances of people trying to RMA based on ASIC alone.

Incidentally, mine is in the 66% range and overclocks fine; I think it's a lot of worry over nothing. But dudes with high-70s ASIC scores do generally get a respectable performance delta over us lowly folks. I say it "has something to do with" binning because Gigabyte and EVGA will most definitely toss a mid-60s chip into their factory OC model with a fancy cooler and call it a day - there's no guarantee you're getting a high ASIC score, just a chip that hits its stated clocks and has a non-reference cooler.

A weird correlation is that cards with low ASIC scores tend to be amenable to higher overclocks with water or sub-zero cooling, while high-ASIC-score cards seem better on air.

I think RMAing because a card doesn't overclock as well as you hoped it would is ridiculous. In the world before people knew what an ASIC score was, it was just a hidden factor among many known factors - now it's treated as more determinative than I personally think it actually is.

Yudo
May 15, 2003

dog nougat posted:

:stare: Those prices. Looks like I'm gonna stick with the 570 for now, until I can save the $$ to get a new card. Pretty sure I'll be able to hold out until the 800 series comes out.

Is there some reason the 780 seems to only come in 3GB models whereas the 770 has a 4GB version? I'm only really looking at EVGA because their customer service has won me over, so I may be completely wrong.

The 780 has a 384-bit memory bus (like the Titan; both are GK110 chips), so it maps most easily to 3 or 6GB. AMD's 7900 series has the same bus and memory configuration as well. The 770 has a 256-bit memory bus, so 2 or 4GB models are the norm.

Lest anyone suggest otherwise, the 780 with 3gb of RAM is extremely well appointed and should be so for several years. If I owned a 770 I would be thrilled, but the 780 is in a different league and priced in typical NV fashion: outrageously.
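
For the curious, the arithmetic is simple: GDDR5 chips typically sit on a 32-bit channel each, so the bus width fixes the chip count and the chip density does the rest. A quick sketch (purely illustrative - the function name and the one-chip-per-channel assumption are mine; 2Gb and 4Gb are the common densities right now):

code:
GDDR5_CHIP_BUS_BITS = 32

def natural_vram_sizes(bus_width_bits, chip_densities_gbit=(2, 4)):
    # one chip per 32-bit channel; density in Gbit -> capacity in GB
    chips = bus_width_bits // GDDR5_CHIP_BUS_BITS
    return [chips * d / 8 for d in chip_densities_gbit]

print(natural_vram_sizes(384))  # 780/Titan-class bus -> [3.0, 6.0]
print(natural_vram_sizes(256))  # 770-class bus       -> [2.0, 4.0]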

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yudo posted:

The 780 has a 384-bit memory bus (like the Titan; both are GK110 chips), so it maps most easily to 3 or 6GB. AMD's 7900 series has the same bus and memory configuration as well. The 770 has a 256-bit memory bus, so 2 or 4GB models are the norm.

Lest anyone suggest otherwise, the 780 with 3gb of RAM is extremely well appointed and should be so for several years. If I owned a 770 I would be thrilled, but the 780 is in a different league and priced in typical NV fashion: outrageously.

Further, it leads to GPU nerd priapism. AMD will surely answer it with something (though, lacking a gigantic loving chip, they won't answer the 780 directly - more competition to make the 770 look like an rear end in a top hat will be my guess). But the AMD solution might cause GPU nerd Peyronie's. You've been warned.

(I really am excited to see what AMD's move is here, but never miss an opportunity for a dick joke, come on! :haw:)

Agreed fucked around with this message at 22:09 on Jun 20, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Somewhere in the bowels of AMD HQ, somebody's having a hissy fit.

"We build a giant chip for our CPUs and all they can say is that it sucks! Then Nvidia builds a giant gently caress-off chip and suddenly they're the Gods of Computers!"

And then with the drinking :yotj:

--

E: Anybody know of a benchmark comparison between HD 5000, Iris 5100, and Iris Pro 5200? I wanna know what kind of difference the Crystalwell L4 cache really makes.

Factory Factory fucked around with this message at 00:42 on Jun 21, 2013

Talaii
Sep 5, 2003

You crack me up, lil buddy!

Factory Factory posted:

Somewhere in the bowels of AMD HQ, somebody's having a hissy fit.

"We build a giant chip for our CPUs and all they can say is that it sucks! Then Nvidia builds a giant gently caress-off chip and suddenly they're the Gods of Computers!"

And then with the drinking :yotj:

--

E: Anybody know of a benchmark comparison between HD 5000, Iris 5100, and Iris Pro 5200? I wanna know what kind of difference the Crystalwell L4 cache really makes.

I've been waiting to see benchmarks of the lower-end versions, but as far as I'm aware the only laptop that's actually shipping with HD 5000 is the Macbook Air, and I haven't seen a thorough review yet. Also remember the 5000/5100 are in much lower power envelopes - they'll have much lower clock speeds as well as lacking the big chunk of eDRAM.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

beejay posted:

Agreed, you are dedicated. That is awesome. Are you using PrecisionX? By the way, what is the equivalent program for AMD cards?

edit: I think I would follow your standard of overclocking myself. I would want fully stable all the time under all circumstances, but as fast as it can go in that case. I haven't been into overclocking for a long time and once I get my system finally working at stock, I will probably start working on the overclocking, CPU and GPU. I'm getting a Sapphire 7870 GHZ edition and I have read that they can usually be overclocked pretty heartily. I had my CPU overclocked slightly for a bit but have been running stock ever since. Haven't touched my video cards.

I'm not so much dedicated as committed. Or is it that I should be committed...? :D I like working with my hardware to get the most out of it; it's fun and sorta like mowing the lawn - you can see the results immediately once you've put in the effort. Fewer allergies to worry about in this case too, bonus.

I use Precision X 4.2.0, which might be the only usable overclocking tool for the 700 series of cards at the moment, unless the RivaTuner programmer (who licenses his software to both EVGA and MSI, making a killing - good for him!) has already gotten around to updating MSI Afterburner. They are generally feature-equivalent for nVidia cards, but as EVGA is strictly an nVidia partner, only Afterburner will allow you to OC AMD cards. It's a great program, though; you're not going to be limited by it in any way. I do prefer Precision (now Precision X) for the relative straightforwardness of its features, but the differences are minor at best on a per-manufacturer basis.

I feel like the "lowest common denominator" OC method that I adhere to for video card overclocking is the way to go, too. Well, obviously I would say that, but - what's the point of a 1241Hz DX11 benchmark overclocck that crashes in DX10 games, or games which emphasize texel throughput, or games which are still running on DX9 engines and work the logic differently? An overclock is just a number, make sure it's the right number for everything you plan to do. I like to play The Witcher 2 lately, finally getting around to it and it's a ton of fun. It's a demanding DX9 game (though the 780 eats its lunch! I think I might be able to turn on Ubersampling even... I'll see about it). Or I might want to play S.T.A.L.K.E.R. SHoC, another very graphically demanding DX9 game with some customized mods. Or I might want to play Metro: Last Light in DX11 mode with ADoF and Tessellation. Or some of the in-betweener DX10 games that tend to have features similar to DX9 but better optimization.

All these different things work the card differently and stress its 7.1 billion transistors of logic differently, so expecting any one test to settle the OC and call it done is wishful thinking at best, and just e-peen stroking at worst. Different games demand different behaviors from the card. If someone thinks their OC is stable because it will run Unigine Heaven overnight, they might be in for a nasty surprise when they load up Crysis 3 on Very High and it starts artifacting badly.

Contrary to what reading some forums out there might have you believe, the idea is to use the card to play games - not to use games to evaluate the card! :laffo:

go3 posted:

Between Intel setting the Ultrabook standard and nVidia's Greenlight program, I'm really looking forward to quality becoming the norm and not the exception

Yeah, w/r/t graphics cards, nVidia's Greenlight program is great and isn't getting enough press in my opinion! I love that one of the criteria is that a card must match or exceed the power delivery of the reference unit. No more bean-counter bullshit paring power delivery down to the absolute minimum needed to support the card's power draw, and sometimes failing. This should be a great thing for consumer confidence in any nVidia-based graphics card purchase, because even companies with a history of screwing up power delivery to save a few bucks on the PCB no longer have the option. And aftermarket cooling has to be especially good now, too.

I think given the likely expense of the reference cooler, what with its fancy components and quiet operation, the non-reference coolers might actually save on some construction costs. My EVGA ACX cooler is really nice, but even with whatever engineering thermoplastic they're using to make the fan blades light but rigid and very good at moving air; even with the dual fan controllers balancing the load to ensure less wasted electricity; even with the front-plate preventing any flex in the PCB so that they can use ample mounting pressure to really couple the cooler to the card... I bet they're not spending much more than it would cost to use a reference cooler, if they're spending more at all.

One last note on overclocking with Boost 2.0 - I really think setting a Thermal Target and giving it priority is the way to go. Set your fan(s) to keep the card cooler than the thermal target, and that overrides the last-gen style of TDP throttling that drops one turbo bin at a time as the card hits certain temperature breakpoints. Where the GTX 680 starts dropping 13MHz per 10ºC starting at 70ºC, a card set to a Thermal Target and kept from actually hitting it shows no such behavior.
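
To make the difference concrete, here's a toy model of the two behaviors (nothing official - the 13MHz bin size and the 680's 70ºC/10ºC rule are as described above, while the function names and the 80ºC example target are made up for illustration):

code:
BIN_MHZ = 13  # Kepler turbo bins move in 13MHz steps

def boost1_clock(max_mhz, temp_c, throttle_start_c=70):
    # GTX 680-style: shed one turbo bin per 10C once past the breakpoint
    if temp_c < throttle_start_c:
        return max_mhz
    bins_dropped = 1 + (temp_c - throttle_start_c) // 10
    return max_mhz - bins_dropped * BIN_MHZ

def boost2_clock(max_mhz, temp_c, thermal_target_c=80):
    # Thermal Target priority: hold top clocks as long as the cooler keeps
    # the card under the target; only back off once the target is reached
    if temp_c < thermal_target_c:
        return max_mhz
    return boost1_clock(max_mhz, temp_c, thermal_target_c)

for t in (65, 72, 85):
    print(t, boost1_clock(1110, t), boost2_clock(1110, t))
# 65C -> 1110/1110, 72C -> 1097/1110, 85C -> 1084/1097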

I've got a good balance of noise and cooling power (it'll kick in and keep things frosty as hell if the game manages to get the card up to the high 70s, but until then it stays at around 60% fan utilization, which is more than ample for all but the bleeding-edge games). I bet the reference cooler does, too. The benefit of the Thermal Target is that it locks in clocks and voltage much harder than the Power Target does. So long as your temperature isn't within a given distance of your set thermal target, voltage and clocks won't go down in modern games that can use their power. I haven't had the opportunity to test closer than 24ºC from the target because the fans just don't let it get that hot.

It still takes advantage of power savings unless told not to with a driver profile in games that don't require the horsepower, and I imagine the 770 will as well. From what I can tell, it seems to stay at top clocks for DX11 games right out the gate, but that could just be because DX11 games are the demanding ones right now.

On a related note, I want AMD to hurry up and put out their refresh to see if it can match the 770's value proposition, and see what it does to nVidia's 780 pricing. Yeah, it might drop my card's cost by $180 in a day, but it might not if there's no direct competition and even if it does, early adopters ought to know what they're in for, price-wise.

Agreed fucked around with this message at 07:51 on Jun 21, 2013

David Mountford
Feb 16, 2012
For GK110 overclocking I've had the best luck with using the latest Precision X for monitoring purposes while using the overclocking tools included with nVidia Inspector for setting clocks and voltages. This is on an EVGA Titan, for what it's worth. Being able to generate shortcuts to set the clocks/voltages is really handy.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

David Mountford posted:

For GK110 overclocking I've had the best luck with using the latest Precision X for monitoring purposes while using the overclocking tools included with nVidia Inspector for setting clocks and voltages. This is on an EVGA Titan, for what it's worth. Being able to generate shortcuts to set the clocks/voltages is really handy.

Oh, cool as heck, you've got a Titan? Any other folks using the "real" GK110, the biggun', champion of all... etc.?

I don't have any need to set different clocks and voltages, personally, once I've got my clocks locked in I'm solid. 1.2V overvolting, +whatever GPU and Memory, and then adjust the Thermal Target and secondarily the Power Target while giving Thermal Target priority - the Precision/Afterburner "Profiles" feature saves everything but the overvoltage, and that only changes if there's a driver reset anyway so it's pretty much set it and forget it.

But if you have a Titan you might be doing development work, which would necessitate being able to switch around that stuff more freely so you could go in between low-precision and full-precision mode when gaming vs working. Otherwise I like the workflow of an all-in-one solution and find nVidia Inspector to be mainly useful at going to the driver level for customizing per-game settings that the NV Control Panel doesn't let you dick with. OGSSAA looks awesome, SGSSAA looks nearly as good, a little compatibility bit searching and experimentation is required but deep-level tweaking is worth it.

How do you feel, as a Titan owner, now that they've released the 780 for considerably less (though it's still expensive, obviously) and yet thanks to the very things that make it slower than Titan at stock settings (lasering off parts of the chip after binning, basically), it has more headroom for overclocking at the same TDP and so tends to keep up with or beat the Titan in games? If you're using it for dual purposes, I reckon you don't rightly give a poo poo since the 780 has nothing going on with high precision computing, but I don't want to assume that you bought it as an entry level GPGPU development card without hearing it from you first.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
What is the thermal target setting you keep mentioning Agreed? I see "Temp Target" right under Power Target in PrecisionX, but it's totally greyed out and unusable with my 670. WHAT DOES IT ALL MEAN

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
It's for 700-series cards only, so far. GPU Boost 2.0 lets you shoot for an upper limit on temperature, and the card will clock the best it can all the way up to that much heat.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

TheRationalRedditor posted:

What is the thermal target setting you keep mentioning Agreed? I see "Temp Target" right under Power Target in PrecisionX, but it's totally greyed out and unusable with my 670. WHAT DOES IT ALL MEAN

600-series cards and under need not apply; it's a new thing, Boost 2.0. I am pretty sure Titan doesn't even have a proper implementation of it, but I would want to double-check that before committing. I know Titan didn't get the new software for the fan controller hardware that the 780 and 770 have, which just smooths out the transitions between fan speeds so that you don't get quick ramping up and down - it's psychoacoustic, but it really does give the impression of less noticeable noise, since a loading screen won't make the fan speed drop out suddenly only to rise again as soon as content is loaded.

You can do that manually by setting a fairly aggressive fan control profile with 10ºC-15ºC of hysteresis, by the way - it won't drop temps that quick if it's just been under full load or close to it. Hysteresis in this context just means continuing to use the previous setting until a certain defined threshold has been met - in this case, you're telling it that if it hits, say, 65ºC, you want the fan at 70%, and it won't drop below 70% until it's back down to 50ºC.
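
In pseudocode terms, a hysteretic fan rule looks something like this (just a sketch, not any real fan-control API; the temperatures match the example above, the fan percentages are arbitrary):

code:
class HystereticFan:
    # trip to 70% at 65C; don't release back down until the card hits 50C
    def __init__(self, trip_c=65, release_c=50, high_pct=70, low_pct=40):
        self.trip_c, self.release_c = trip_c, release_c
        self.high_pct, self.low_pct = high_pct, low_pct
        self.speed = low_pct

    def update(self, temp_c):
        if temp_c >= self.trip_c:
            self.speed = self.high_pct   # hot: ramp up
        elif temp_c <= self.release_c:
            self.speed = self.low_pct    # fully cooled off: ramp down
        # in between: hold the previous speed (that band is the hysteresis)
        return self.speed

fan = HystereticFan()
for t in (45, 66, 60, 55, 50):
    print(t, fan.update(t))   # note 60C and 55C still hold 70%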

For 670 overclocking, work with the power target (slide it fully to the right), because that'll let the card get where it needs to be. Set a high voltage, because it will end up there anyway, and not having to make the jump improves stability of the memory OC, in my 680 experience. Use the offsets to get to the turbo bin you're after (it goes in increments of 13MHz), and set up a fan profile that keeps it away from 70ºC, because that's where the 670/680 start throttling and also where stability of the OC starts to get iffy.

A cool Kepler is a happy Kepler, that's still the rule for the 700 series, but they definitely gave us some new stuff to wrap our heads around. It's good stuff once you get to know it, but confusing coming from the comparatively straightforward 600 series. Never thought I'd say that, heh.

Agreed fucked around with this message at 15:44 on Jun 21, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
Oh, I was purely asking what that's about; I've had it OCed all along. I attain a reasonable max boost clock of 1241MHz (didn't realize the increments were 13MHz though, that's cool), but I've always been able to push the VRAM up to 1900MHz, which is an awesome jump. The power target on this Gigabyte only ever lets it top out at 112%, though, even with the new Precision. No idea what that truly represents. GPU-Z shows that the card almost never breaks 92.8% TDP, but now and then during a logging session I get an anomalous outlier of 248%, and I have no idea what that means.
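
If you want to weed sensor glitches like that out of a logging session, something like this works (a sketch only - it assumes a GPU-Z-style comma-separated sensor log, and the "TDP %" column name is a guess, so check your log's header for the real label):

code:
import csv
import statistics

def tdp_outliers(path, column="TDP %", threshold=3.0):
    # pull the column, skipping blank cells
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)
                  if (row.get(column) or "").strip()]
    med = statistics.median(values)
    # median absolute deviation: a robust spread one spike can't skew
    mad = statistics.median([abs(v - med) for v in values]) or 1.0
    return [v for v in values if abs(v - med) > threshold * mad]

# e.g. tdp_outliers("GPU-Z Sensor Log.txt") should isolate a 248% spike
# from readings clustered around 92.8%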

That sounds like a cool feature for sure, along with the fan smoothing. I had a terrifying experience earlier this week when installing the new Precision: the fan curve accidentally got toggled off right before I launched BF3 for a short while, and upon exiting the game I realized my card had been cooking at 100ºC for at least 2 minutes. I nearly chunked a deuce; never have I manually toggled anything to 100% power in panic so quickly. Lucky for me it doesn't seem to have caused any observable damage, but it very, very easily could have, as I understand it. The fact that it happened at all is... less than ideal. The curious part is that there weren't any artifacts or signs of distress in the game - the emergency throttling didn't make an observable difference, which is why it was in the danger zone so long. Either way, I increased the aggressiveness of my entire fan curve by like +40%; just babying the hell out of the thing from now on.

TheRationalRedditor fucked around with this message at 15:56 on Jun 21, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
In a nutshell, 600-series cards use the 1.0 version of the Boost algorithm, which focuses on chip TDP and electrical power usage. The 2.0 version of the Boost algorithm on 700-series cards can focus on that, or it can be set to focus on temperature instead (throwing TDP most of the way out the window as long as things are running cool). Very useful for overclocking with a card that has a beefy custom cooler.
