HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Wiggly Wayne DDS posted:

No it's synthetic tests where people allocate all of the memory at once.

There are posts where people find it happening in games if you push hard enough, and a reluctance for the card to allocate more than 3.5 GB. But it's pointless to keep running in circles about it unless there's new information, or a complete debunking by NVIDIA or some other authoritative source.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Is it exceeding that much in use, or exceeding that much in transfer size? I may have misunderstood the NVIDIA statement.

RBX
Jan 2, 2011

Getting a 970 in a few weeks, I'm only playing at 1080p and I don't go crazy with downsampling and AA and poo poo. Is there a cheaper card that may do what I want? Is the 970 overkill for what I want? I have a 660 GTX right now and want to stay with Nvidia.

Swartz
Jul 28, 2005

by FactsAreUseless

RBX posted:

Getting a 970 in a few weeks, I'm only playing at 1080p and I don't go crazy with downsampling and AA and poo poo. Is there a cheaper card that may do what I want? Is the 970 overkill for what I want? I have a 660 GTX right now and want to stay with Nvidia.

Depends on the games you play.

I don't go crazy with AA or DSR (well... most of the time) and I feel a 970 is perfect (I'm at 1080p also). Hell, with the right Skyrim mods I can't even get a solid 60 fps all the time (but ENB tends to do that).

A cheaper option would be the 960.

Haerc
Jan 2, 2011
After being underwhelmed by the 960 specs, should I just go for a 280X? Will my PSU be able to handle it (550W)?

Or will the 960 Ti be worth the wait?

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe
The 960 is more comparable to the 285 than the 280 because of the amount of RAM. Just be warned, the 285 draws a little less than a hundred watts more power at full load than the 960.

http://www.eurogamer.net/articles/digitalfoundry-2015-nvidia-geforce-gtx-960-review

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Where are AMD putting all those watts? 100 watts is a lot of heat.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


Boiled Water posted:

Where are AMD putting all those watts? 100 watts is a lot of heat.

Into necessarily beefier coolers and hotter components and air.

And also markedly less performance per watt.

At least there are probably things AMD can still do to improve their GPUs; nVidia may have had their Sandy Bridge moment but ATI hasn't had a post-PhenomII stall yet.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Sir Unimaginative posted:

At least there are probably things AMD can still do to improve their GPUs; nVidia may have had their Sandy Bridge moment but ATI hasn't had a post-PhenomII stall yet.

How can you come to this conclusion? Legitimate question, I'm not seeing how nvidia can't get more out of their GPUs, and it honestly looks like the Radeons are heading for "super" Fermi territory. TBH, I am ignorant of the details, which is why I ask.

sauer kraut
Oct 2, 2004

Haerc posted:

After being underwhelmed by the 960 specs, should I just go for a 280X? Will my PSU be able to handle it (550W)?

Or will the 960 Ti be worth the wait?

550W is fine if it's a quality PSU; no one knows if there will even be a better 960.

Spend $10 more for a Tri-X or Vapor-X.

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

How can you come to this conclusion? Legitimate question, I'm not seeing how nvidia can't get more out of their GPUs, and it honestly looks like the Radeons are heading for "super" Fermi territory. TBH, I am ignorant of the details, which is why I ask.

At the very least, we know that AMD is about to leave nVidia in the dust with regards to memory bandwidth. The next high-end cards that AMD will release will feature Hynix's stacked memory modules. You may remember several years back when nVidia was touting their own stacked memory products, which were then delayed to Pascal. This is because their own version of non-planar memory, which they were working on with Micron and calling Hybrid Memory Cube (HMC), fell through and the project was dropped. They will now shoot to use the joint AMD-Hynix High Bandwidth Memory (HBM) instead. The first generation of HBM products is touted to, on paper, provide a memory bus 1,024 bits wide per stack (roughly 128 GB/s per stack). The second generation of the memory standard, which they are working on presently, seeks to double that bandwidth, as well as capacity per stack.

For comparison's sake, a GTX 980 has only a 256-bit memory bus. A Titan Black only has a 384-bit memory bus. (The Titan Z also only has a 768-bit memory bus, but that's a dual-GPU card, so still, 384-bit per-GPU.)

All the memory efficiency and color compression touted by Maxwell is about to be obsoleted by this advancement.

We should not expect 1:1 performance improvements due to the increase in bandwidth, by which I mean that the full-fat Fiji XT should not suddenly triple the performance of a Titan Black. But it gives AMD a serious leg up, as nVidia will not bring non-planar memory products to the market for at least one year.
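
For a rough sense of what those bus widths mean, here's a back-of-the-envelope bandwidth comparison (a sketch only: the GDDR5 data rates are the cards' usual ~7 GT/s effective, the HBM line is the on-paper first-gen per-stack figure, and the four-stack total assumes a Fiji-class card actually ships with four stacks):

```python
# Peak theoretical memory bandwidth: bus width (bits) / 8 * data rate (GT/s) = GB/s.
# GDDR5 figures match the cards' published specs; the HBM numbers are the
# on-paper per-stack figures discussed above (1,024-bit at ~1 Gbps per pin).

def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    return bus_width_bits / 8 * data_rate_gt_s

configs = [
    ("GTX 980, 256-bit GDDR5 @ 7 GT/s",     bandwidth_gb_s(256, 7.0)),   # ~224 GB/s
    ("Titan Black, 384-bit GDDR5 @ 7 GT/s", bandwidth_gb_s(384, 7.0)),   # ~336 GB/s
    ("One HBM1 stack, 1,024-bit @ 1 GT/s",  bandwidth_gb_s(1024, 1.0)),  # ~128 GB/s
    ("Four HBM1 stacks, 4,096-bit total",   bandwidth_gb_s(4096, 1.0)),  # ~512 GB/s
]

for name, gb_s in configs:
    print(f"{name}: {gb_s:.0f} GB/s")
```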

SwissArmyDruid fucked around with this message at 04:18 on Jan 25, 2015

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Boiled Water posted:

Where are AMD putting all those watts? 100 watts is a lot of heat.

Nvidia made all sorts of optimizations to improve performance per watt going from Kepler to Maxwell and claimed a 2x improvement, which is why Maxwell is so unusually power efficient compared to other recent graphics cards. Normally this wouldn't be totally necessary for desktop GPUs, but Nvidia and AMD have been stuck on the 28nm process node for longer than usual. This means that to keep increasing performance you either need to push the TDP higher than usual or have an especially power-efficient architecture. AMD chose the former path and Nvidia chose the latter. I'm assuming AMD figured, when they started working on the architecture, that they wouldn't have to release the 390X on 28nm, but they were later forced to due to delays at TSMC. The delays aren't as big of a deal for Nvidia, because their chip is so efficient they can build some massive 600+mm^2 beast approaching 10B transistors and still have it draw no more than maybe 225-250 watts, whereas AMD's beast chip is supposedly going to be pushing 300.
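
To make that trade-off concrete, here's a toy calculation in the same spirit (every number below is illustrative rather than a measured spec; it just treats performance as TDP times performance-per-watt):

```python
# Toy arithmetic for the trade-off above: on the same process node,
# performance ~ TDP x (performance per watt), so more speed has to come from
# a bigger power budget, a more efficient architecture, or both.
# All figures are illustrative, not real benchmark numbers.

def relative_performance(tdp_watts, perf_per_watt):
    return tdp_watts * perf_per_watt

baseline = relative_performance(250, 1.0)       # big Kepler-class chip

# The efficiency route: claimed ~2x perf/watt at a similar ~250 W budget.
efficient = relative_performance(250, 2.0)

# The brute-force route: keep efficiency roughly flat and push toward ~300 W.
brute_force = relative_performance(300, 1.0)

print(f"Efficiency route:  {efficient / baseline:.2f}x baseline at 250 W")
print(f"Brute-force route: {brute_force / baseline:.2f}x baseline at 300 W")
```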

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

SwissArmyDruid posted:

At the very least, we know that AMD is about to leave nVidia in the dust with regards to memory bandwidth. The next high-end cards that AMD will release will feature Hynix's stacked memory modules. You may remember several years back when nVidia was touting their own stacked memory products, which were then delayed to Pascal. This is because their own version of non-planar memory, which they were working on with Micron and calling Hybrid Memory Cube (HMC), fell through and the project was dropped. They will now shoot to use the joint AMD-Hynix High Bandwidth Memory (HBM) instead. The first generation of HBM products is touted to, on paper, provide a memory bus 1,024 bits wide per stack (roughly 128 GB/s per stack). The second generation of the memory standard, which they are working on presently, seeks to double that bandwidth, as well as capacity per stack.

For comparison's sake, a GTX 980 has only a 256-bit memory bus. A Titan Black only has a 384-bit memory bus. (The Titan Z also only has a 768-bit memory bus, but that's a dual-GPU card, so still, 384-bit per-GPU.)

All the memory efficiency and color compression touted by Maxwell is about to be obsoleted by this advancement.

We should not expect 1:1 performance improvements due to the increase in bandwidth, by which I mean that the full-fat Fiji XT should not suddenly triple the performance of a Titan Black. But it gives AMD a serious leg up, as nVidia will not bring non-planar memory products to the market for at least one year.

I thought only the R9 390s were to be HBM and everything else was using standard GDDR5, which explained the rumored TDP of the 380(X)? I mean, maybe they capture the enthusiast market for a year and a half, but Nvidia can just reclaim it with the 1000 series, correct? Or are they at a legitimate technological disadvantage, where Nvidia would be rolling out first-gen HBM on the 1000s just as AMD drops second-gen cards with better thermals/consumption?

LRADIKAL
Jun 10, 2001

Fun Shoe
Update on my Gigabyte GV-n970WF30C-4GD http://www.newegg.com/Product/Product.aspx?Item=N82E16814125685 (the overclocked one WITHOUT the backplate). (Click on my thread history for other little notes.)

I was having trouble boosting the memory speed at all. I bought some tiny heatsinks with adhesive; they were the wrong ones, way too tiny, so I stuck 2 on each of the 4 exposed memory modules on the back of the card (the ones on the front touch the main heatsink). As a result I'm currently running at +400MHz (3905 effective). They've only been at that speed for half an hour or so, but so far so good.
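
For reference, that offset works out to a bit over a 10% bump in peak memory bandwidth, assuming Afterburner is showing the usual doubled GDDR5 clock for a 970 (stock around 3,505 MHz displayed, ~7,010 MT/s effective):

```python
# What a +400 MHz memory offset means for a GTX 970, assuming Afterburner
# reports the doubled GDDR5 clock (stock ~3,505 MHz shown, quad-pumped on the pins).

STOCK_AFTERBURNER_MHZ = 3505
OFFSET_MHZ = 400
BUS_WIDTH_BITS = 256

overclocked = STOCK_AFTERBURNER_MHZ + OFFSET_MHZ   # 3,905 MHz shown in Afterburner
effective_mtps = overclocked * 2                   # ~7,810 MT/s effective

stock_bw = BUS_WIDTH_BITS / 8 * (STOCK_AFTERBURNER_MHZ * 2) / 1000   # ~224 GB/s
oc_bw    = BUS_WIDTH_BITS / 8 * effective_mtps / 1000                # ~250 GB/s

print(f"Effective data rate: {effective_mtps} MT/s")
print(f"Peak bandwidth: {stock_bw:.0f} -> {oc_bw:.0f} GB/s "
      f"(+{(oc_bw / stock_bw - 1) * 100:.0f}%)")
```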

Haerc
Jan 2, 2011

sauer kraut posted:

550W is fine if it's a quality PSU; no one knows if there will even be a better 960.

Spend $10 more for a Tri-X or Vapor-X.

I'm using this... Is it considered quality? I tried to find a good PSU at the time, 18 months ago or something like that.

Edit: and I wasn't really worried about the price difference between the 280x and the 960, I've just had much better luck with nvidia over the years.

Haerc fucked around with this message at 04:29 on Jan 25, 2015

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Haerc posted:

I'm using this... Is it considered quality? I tried to find a good PSU at the time, 18 months ago or something like that.

All of XFX's PSUs are made by Seasonic, so yes, I'd say you're perfectly fine in regards to quality.

Source: http://www.tomshardware.com/reviews/power-supply-psu-brands,3762-10.html

BIG HEADLINE fucked around with this message at 04:30 on Jan 25, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

FaustianQ posted:

I thought only the R9 390s were to be HBM and everything else was using standard GDDR5, which explained the rumored TDP of the 380(X)? I mean, maybe they capture the enthusiast market for a year and half, Nvidia can just reclaim it with the 1000 series, correct? Are they at such a legitimate technological disadvantage that Nvidia could rollout first gen HBM on the 1000s just as AMD drops second gen cards with better thermals/consumption?

An argument could be made for the 380X using HBM as well. If AMD's implementation of the memory controller involves placing the entire HBM package directly onto the GPU die, as opposed to breaking it out to a second package, yes, that might explain where the extra heat is coming from.

I don't think this is likely, though, at least, not until the second generation of HBM products. The first-generation HBM products offer 1 GB of memory in a stack 4 wafers tall. To get the kind of memory capacities that a GPU needs, you'd need one stack per GB of conventional GDDR5, and that would make the die size balloon out of control. No, I think that if they're going to do that, it's not going to be until the gen 2 HBM parts. Those are slated to come in formats stacked either 4 or 8 layers high, in 4 GB or 8 GB per stack. PER STACK!
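
A quick sketch of that capacity math, using the per-stack figures above (1 GB per first-gen stack, 4 or 8 GB per second-gen stack, 1,024 bits of bus per stack):

```python
# How many HBM stacks a card needs for a given amount of VRAM, and what that
# implies for total bus width. Per-stack figures are the gen-1/gen-2 numbers
# quoted in the post above.

import math

HBM_GENERATIONS = {
    # generation: (GB per stack, bus bits per stack)
    "HBM1":              (1, 1024),
    "HBM2 (4 GB stacks)": (4, 1024),
    "HBM2 (8 GB stacks)": (8, 1024),
}

def stacks_needed(target_gb, gb_per_stack):
    return math.ceil(target_gb / gb_per_stack)   # can't ship a fraction of a stack

for target_gb in (4, 8, 16):
    for gen, (gb_per_stack, bus_per_stack) in HBM_GENERATIONS.items():
        n = stacks_needed(target_gb, gb_per_stack)
        print(f"{target_gb} GB with {gen}: {n} stack(s), {n * bus_per_stack}-bit total bus")
```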

Just imagine what that would mean for AMD's APU parts, which have had trouble scaling their performance higher because of how bandwidth-limited they are. AMD has basically not been putting as many GCN cores onto their APUs as they COULD be putting, because the latency and bandwidth from talking to DDR3/4 just isn't usable.

But to answer your question about nVidia reclaiming a lead: Any advantage that AMD develops or maintains will, I think, depend entirely on the implementation of the memory controller going forward, whether the HBM gets mounted directly on the GPU die, or on separate packages similar to what we have now. Regardless of how they do it, I think we can expect much smaller cards, physically speaking, since there won't need to be as many chips on the board to parallelize DRAM to enable throughput. AMD has been sampling these HBM products since, I think, September of last year. That's a lot of lead time before Pascal comes out to play around with the optimal layout for GPUs. If they play their cards right, they should be able to grow this lead into a good gain of market share.

But I think we won't be able to know for sure until we see Bermuda XT. (390X)

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I'm a bit less expectant that HBM's going to ~revolutionize gaming~ so much as I think it's tailor-made to make it so FireGL cards finally have a trump card to hold over the Quadro for a while.

If AMD can be first to market with a workstation graphics card that has 16/32/64/128GB of sufficiently fast buffer on it, no one's going to loving care about the TDP.

BIG HEADLINE fucked around with this message at 07:42 on Jan 25, 2015

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

A little more analysis from AnandTech: http://anandtech.com/show/8931/nvidia-publishes-statement-on-geforce-gtx-970-memory-allocation

Wasn't imaginary, then. It'll be interesting to see if any more comes of this.

HalloKitty fucked around with this message at 07:51 on Jan 25, 2015

PC LOAD LETTER
May 23, 2005
WTF?!

BIG HEADLINE posted:

I'm a bit less expectant that HBM's going to ~revolutionize gaming~ so much as I think it's tailor-made to make it so FireGL cards finally have a trump card to hold over the Quadro for a while.

I thought that market was more compute-limited than bandwidth-limited for their workloads, and wants heaps of ECC VRAM instead of moar bandwidth?

At least for bandwidth between the GPU and VRAM. Increasing GPU to GPU or GPU to CPU bandwidth by heaps would probably be a big deal but HBM won't do that.

Kinda surprised they haven't already put a couple 40GbE or 100GbE fiber ports on those cards to let them talk to each other faster without waiting on the CPU or the PCIe bus.

PC LOAD LETTER fucked around with this message at 08:55 on Jan 25, 2015

Urby
May 31, 2011

MADE IN MEXICO
Umm, question from someone without too much GPU knowledge, but what happened to the Asus 780 Ti UCII cards? I bought one a while back for around $700 (AUD) and recently have been wanting to get a second one to SLI with. The only issue is, I can't find a single PC shop that stocks them, and looking on eBay, I only see one being sold for $1200.

So I guess my question is... what happened to the card market-wise and also, does anyone know where I can obtain one for a reasonable price?

iuvian
Dec 27, 2003
darwin'd



Party Plane Jones posted:

The 960 is more comparable to the 285 than the 280 because of the amount of RAM. Just be warned, the 285 draws a little less than a hundred watts more power at full load than the 960.

http://www.eurogamer.net/articles/digitalfoundry-2015-nvidia-geforce-gtx-960-review

Nice review.

I was wondering why all the previous 960 reviews forgot to mention that you can get a 280 OC for under $200 with a gig more ram.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

iuvian posted:

Nice review.

I was wondering why all the previous 960 reviews forgot to mention that you can get a 280 OC for under $200 with a gig more ram.

Yeah, nobody was buying 285s anyway, the 280 remains the better deal for that sweet VRAM.

sauer kraut
Oct 2, 2004

Urby posted:

Umm, question from someone without too much GPU knowledge, but what happened to the Asus 780 Ti UCII cards? I bought one a while back for around $700 (AUD) and recently have been wanting to get a second one to SLI with. The only issue is, I can't find a single PC shop that stocks them, and looking on eBay, I only see one being sold for $1200.

So I guess my question is... what happened to the card market-wise and also, does anyone know where I can obtain one for a reasonable price?

They have been out of production since September and replaced by the GTX 980.
Now the only people looking for them are in the same desperate situation as you are :smith:

Maybe you can sell it? Or keep looking on Craigslist and hope whatever you get isn't a shot to hell card that some scummer tries to unload.

isndl
May 2, 2012
I WON A CONTEST IN TG AND ALL I GOT WAS THIS CUSTOM TITLE

Urby posted:

Umm, question from someone without too much GPU knowledge, but what happened to the Asus 780 Ti UCII cards? I bought one a while back for around $700 (AUD) and recently have been wanting to get a second one to SLI with. The only issue is, I can't find a single PC shop that stocks them, and looking on eBay, I only see one being sold for $1200.

So I guess my question is... what happened to the card market-wise and also, does anyone know where I can obtain one for a reasonable price?

You might be able to flip it and buy a couple 980s. Or given Australia's generous return policies, return it and buy 980s? It doesn't sound like you can get a new 780ti though.

Arglebargle III
Feb 21, 2006

Hey, a GeForce 660M is still the best laptop GPU, right?

sauer kraut
Oct 2, 2004

Arglebargle III posted:

Hey, a GeForce 660M is still the best laptop GPU, right?

Wouldn't that be the 980M/970M now?
http://www.asus.com/us/site/g-series/G751/

I guess it depends heavily on your definition of 'best'

autistic cum slut
Jun 3, 2011
I have an MSI GTX 760 4GB card, a Core i5 2500 CPU, 16GB RAM, and Windows 7 64-bit. I installed MSI Afterburner and followed the guide in this thread - I didn't actually get that far into it, I just set it to temp priority at 80 degrees and made a custom fan profile. I went from 25-30fps on medium settings in Metal Gear Solid Ground Zeroes to 60fps on high settings.

Now, though, whatever game I play, after about 10-15 minutes or so the fps will suddenly drop to 10-15, and when I quit back to desktop, Afterburner shows the GPU usage at 97-98% and the temp stays at 80 degrees. I have checked for any processes and causes of GPU usage but there is nothing I can see. When I restart, it goes back to normal and Afterburner shows usual desktop usage of 0-10%. The temperature also normalises and drops to about 40 degrees or so after being idle for a few minutes. As soon as I play a game (I have tried Ground Zeroes and Payday 2) the same thing will happen again after about 15 minutes or so.

I put Afterburner back to default settings but that made no difference. I have uninstalled it (and Rivatuner Statistics Server which installed alongside) and again that made no difference. I have uninstalled and reinstalled my NVIDIA drivers but again I have the same problem. Strangely, even after doing so and resetting, I was getting 60fps in Ground Zeroes before the slowdown which makes me think that there is a setting somewhere which is still enabled.

Does this sound like a software problem or do you think it's the hardware on my 760?
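
One way to help narrow down software vs. hardware is to log what the GPU is actually doing after you quit the game. A rough sketch using nvidia-smi (which ships with the NVIDIA driver); the query fields below are standard ones, but run nvidia-smi --help-query-gpu to confirm they exist on your driver version:

```python
# Polls nvidia-smi every 5 seconds and prints utilization, temperature, and clocks.
# Leave it running on the desktop after quitting the game: utilization pinned near
# 100% with 3D clocks still active points at something keeping the GPU busy
# (software); clocks stuck low while a game is running points at thermal/driver
# throttling instead.

import subprocess
import time

FIELDS = "utilization.gpu,temperature.gpu,clocks.sm,clocks.mem"

def sample():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        universal_newlines=True,
    )
    return out.strip()

for _ in range(60):  # roughly five minutes of samples
    print(time.strftime("%H:%M:%S"), sample())
    time.sleep(5)
```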

Kazinsal
Dec 13, 2011
I may be completely underestimating the horsepower requirements of MGSV, but I have a really hard time believing a 760 had an average of 25-30fps on medium in it. If I remember correctly it's slightly more powerful than a 660Ti and less than a 7970.

You may have a hardware issue. Or MGSV could just be optimized like complete and utter crap, but a simple overclock should not allow you to double your framerate while increasing your settings.

autistic cum slut
Jun 3, 2011

Kazinsal posted:

I may be completely underestimating the horsepower requirements of MGSV, but I have a really hard time believing a 760 had an average of 25-30fps on medium in it. If I remember correctly it's slightly more powerful than a 660Ti and less than a 7970.

You may have a hardware issue. Or MGSV could just be optimized like complete and utter crap, but a simple overclock should not allow you to double your framerate while increasing your settings.

Thanks, yeah it did seem too good to be true to have such a huge increase. There might have been a pre-existing problem which I temporarily alleviated by whatever Afterburner did, but unfortunately now I can't play anything else whereas before, it was only Ground Zeroes that gave me poor fps.

The only other question I have before I think about getting a new GPU is: could the motherboard also be the cause and/or make the card run at an underclock? Instinctively it seems not but I wouldn't want to take the plunge on a new GPU and then realise that in fact it's the mobo.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Kazinsal posted:

You may have a hardware issue. Or MGSV could just be optimized like complete and utter crap,

He probably has a hardware issue; MGSV is one of the best-optimized AAA titles out right now. I ran it with a 560 Ti at 1080p and was staying at 60fps with high-ish settings.

SwissArmyDruid
Feb 14, 2014

by sebmojo

iuvian posted:

Nice review.

I was wondering why all the previous 960 reviews forgot to mention that you can get a 280 OC for under $200 with a gig more ram.

Probably because the more appealing route is to go up about $50 and get an R9 290, with GCN silicon.

EDIT: Which, I note, *will* support DX12, whereas the rebranded 280-that-is-secretly-a-rebranded-7970 may or may not; things are a bit hazy.

SwissArmyDruid fucked around with this message at 01:55 on Jan 26, 2015

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
Why are there seven different versions of the 970 from EVGA? I wanna get a pair of them for SLI. I was thinking the ACX 2.0 FTW version?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

fletcher posted:

Why are there seven different versions of the 970 from EVGA? I wanna get a pair of them for SLI. I was thinking the ACX 2.0 FTW version?

Do you have at least one slot of space between the cards in SLI? If so you should get the EVGA Super-Superclocked; that model has the best cooler of that brand: http://www.amazon.com/EVGA-GeForce-Superclocked-Graphics-04G-P4-2974-KR/dp/B00NVODXR4/

If there are no empty slots between cards, you will either want two blower cards, or to combine the one-fan mini-ITX Gigabyte 970 with the three-fan 970, so 1.5 of the fans of the latter card can blow past the former card unobstructed.

suddenlyissoon
Feb 17, 2002

Don't be sad that I am gone.

Zero VGS posted:

Do you have at least one slot of space between the cards in SLI? If so you should get the EVGA Super-Superclocked; that model has the best cooler of that brand: http://www.amazon.com/EVGA-GeForce-Superclocked-Graphics-04G-P4-2974-KR/dp/B00NVODXR4/

If there are no empty slots between cards, you will either want two blower cards, or to combine the one-fan mini-ITX Gigabyte 970 with the three-fan 970, so 1.5 of the fans of the latter card can blow past the former card unobstructed.

You linked to the wrong card. This is the right one to get

http://www.amazon.com/dp/B00R3NK2LE/ref=twister_B00OAXPYXG?_encoding=UTF8&psc=1

BurritoJustice
Oct 9, 2012

Zero VGS posted:

Do you have at least one slot of space between the cards in SLI? If so you should get the EVGA Super-Superclocked; that model has the best cooler of that brand: http://www.amazon.com/EVGA-GeForce-Superclocked-Graphics-04G-P4-2974-KR/dp/B00NVODXR4/

If there are no empty slots between cards, you will either want two blower cards, or to combine the one-fan mini-ITX Gigabyte 970 with the three-fan 970, so 1.5 of the fans of the latter card can blow past the former card unobstructed.

The FTW+ has the best of the ACX 2.0s; it's pretty much exactly the same as the SSC but with four heatpipes instead of three. Unless I'm misremembering the four loving ACX 2.0 coolers they have.

But to the person originally posting, the EVGA 900 series cards are pretty much non-competitive. Get MSI or ASUS or Gigabyte, in that order really.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Do we not care about the reference cooler? *I* thought having the cutout on the backplate to allow for airflow is pretty nice.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

SwissArmyDruid posted:

Do we not care about the reference cooler? *I* thought having the cutout on the backplate to allow for airflow is pretty nice.

Why on earth would anyone use a reference cooler unless they absolutely had to? (ie super cramped low airflow case where you've got no choice but to make sure all the exhaust air leaves the case; mATX SLi)

99% of people would be in a position where a non reference design would be substantially quieter and tens of degrees cooler.

I suppose you've got weirdo enthusiasts who remove the reference cooler and go with aftermarket liquid, but they're an edge case at best.

SwissArmyDruid
Feb 14, 2014

by sebmojo

The Lord Bude posted:

Why on earth would anyone use a reference cooler unless they absolutely had to? (ie super cramped low airflow case where you've got no choice but to make sure all the exhaust air leaves the case; mATX SLi)

99% of people would be in a position where a non reference design would be substantially quieter and tens of degrees cooler.

I suppose you've got weirdo enthusiasts who remove the reference cooler and go with aftermarket liquid, but they're an edge case at best.

If I do get a 900-series, it will probably be a reference design, yes. I intend to put it into the ML07B when it comes out.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

SwissArmyDruid posted:

If I do get a 900-series, it will probably be a reference design, yes. I intend to put it into the ML07B when it comes out.

The ML07B is not one of the low-airflow cases I was talking about. It is an incredibly well engineered case that manages pretty drat good cooling - it has 3x120mm fan slots, although you have to supply them yourself - which, combined with the small dimensions, makes for some serious airflow. In addition, the intake for a non-reference graphics card is hard up against two of the fan vents, creating almost perfect conditions for a non-reference cooled graphics card. You'd be shooting yourself in the foot buying a reference design card.

Also - the ML07B is virtually identical in layout to the RVZ01; other than cosmetic differences, they are the same case - except the RVZ01 comes with two of the three fans preinstalled, and it comes with dust filtering for all the intakes, something you can't get on the ML07B.
