Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Shadowhand00 posted:

When you're running SLI, is it common to have one GPU substantially hotter than the other? I'm running DA:Inquisition right now and one GPU is sitting at 59 while the other is sitting at 82ish. I'd sit the two GPUs farther apart but they don't make SLI cables that long (unless someone knows of a place I can get an SLI cable longer than 120mm)

You can get really long SLI cables on eBay/Amazon; just make sure the PCI-E slots are both a high enough speed to be compatible. What is your motherboard?

If you can't space the cards farther apart, you can at least loosen the screws that fasten the cards to the motherboard, slide the top card up a millimeter or two, and slide the bottom card down a bit so there's a little more breathing room, then retighten the screws. You could also aim a case side-fan between them. But two open-air coolers slotted directly next to each other with no slots in between will always run hot. You can switch to blower-style coolers, or get a mini-ITX-length card on the bottom so at least one fan on the top card can breathe, but dropping a buck or two on a long SLI cable should be what you try first, assuming the lower slot is compatible.

craig588
Nov 19, 2005

by Nyc_Tattoo

Capn Jobe posted:

Update: fans are working, so that's a plus. The screw holes almost line up; it's a matter of 1-2 mm probably. Everything else in the case seems to line up, and my old video card was fine. Strangest thing. I'll get out the drill a bit later, shaving off a bit of material should do the trick.

One more question: is it bad for the power cable to the cooler to be up against one of the exposed heat pipes? They don't seem to get too hot, but I'd rather not burn through a cable if I can avoid it.

It's not really important, but it's killing me that you keep calling it the power for the cooler. The cooler only draws 3 to 4 watts at full speed. The power cable is for the video card itself, which needs at least 145 watts, with the Asus card specced for 165 (ish) at stock speeds.

The heat pipes won't get hot enough to hurt the power cables so no worries there.

LRADIKAL
Jun 10, 2001

Fun Shoe
Do you have other cards that fit properly? Do the metal blanks fit right? You're driving me nuts, figure out how to seat your video card properly. Take a couple pictures, sheesh, you almost certainly shouldn't be sawing bits of metal to make it fit.

Shadowhand00
Jan 23, 2006

Golden Bear is ever watching; day by day he prowls, and when he hears the tread of lowly Stanfurd red,from his Lair he fiercely growls.
Toilet Rascal

Zero VGS posted:

You can get really long SLI cables on eBay/Amazon; just make sure the PCI-E slots are both a high enough speed to be compatible. What is your motherboard?

If you can't space the cards farther apart, you can at least loosen the screws that fasten the cards to the motherboard, slide the top card up a millimeter or two, and slide the bottom card down a bit so there's a little more breathing room, then retighten the screws. You could also aim a case side-fan between them. But two open-air coolers slotted directly next to each other with no slots in between will always run hot. You can switch to blower-style coolers, or get a mini-ITX-length card on the bottom so at least one fan on the top card can breathe, but dropping a buck or two on a long SLI cable should be what you try first, assuming the lower slot is compatible.

I adjusted them to give them slight space for now. I can't seem to find any really long cables on eBay or Amazon but MSI supposedly sells them directly (140mm).

Capn Jobe
Jan 18, 2003

That's right. Here it is. But it's like you always have compared the sword, the making of the sword, with the making of the character. Cuz the stronger, the stronger it will get, right, the stronger the steel will get, with all that, and the same as with the character.
Soiled Meat
So I tried a different tack, and seem to have had some success. Put my old card back in to see how it fit, and it seemed fine (wasn't a perfect fit, but close enough). Then I took out all of the motherboard screws, slotted in the card, and then tried to fit the backplate in with the motherboard loose. Applied a bit of pressure, and something shifted slightly, which allowed it to fit. Went to replace the mobo screws, they all went into place just fine. Checked to see if the card was still slotted right, and it was.

Couldn't tell you what was wrong, but somehow something wasn't seated properly. My guess is the motherboard was rotated some fraction of a degree. Thanks for the suggestions, everyone.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
That's odd, but at least it's fixed. My older steel case has a small bend on the PCI slots so I have to flex it a bit when exchanging cards until I get them screwed down. Kinda annoying, but it's easier than replacing the case.

Inovius
Apr 7, 2010
Looks like evga has 2 new 970 models coming out that have the updated FTW cooler + a cooling plate on the memory & mosfets. I'm probably leaning towards a FTW+ once they finally hit personally.

https://twitter.com/EVGA_JacobF/status/547831253295460352

ZenVulgarity
Oct 9, 2012

I made the hat by transforming my zen

gently caress me is my reference cooler 290 loud as poo poo sometimes

Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe

Inovius posted:

Looks like evga has 2 new 970 models coming out that have the updated FTW cooler + a cooling plate on the memory & mosfets. I'm probably leaning towards a FTW+ once they finally hit personally.

https://twitter.com/EVGA_JacobF/status/547831253295460352

EVGA having at least 6 different bins for the same model is confusing as all hell.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

ZenVulgarity posted:

gently caress me is my reference cooler 290 loud as poo poo sometimes

Where have you been the past two years.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

ZenVulgarity posted:

gently caress me is my reference cooler 290 loud as poo poo sometimes

I got a spare Corsair Hydro H90 and a new-in-box Kraken G10 lying around, $60 shipped if you want both. You've got no platty, so email me at my screen name at yahoo.com if you want 'em.

Shadowhand00
Jan 23, 2006

Golden Bear is ever watching; day by day he prowls, and when he hears the tread of lowly Stanfurd red,from his Lair he fiercely growls.
Toilet Rascal

Shadowhand00 posted:

I adjusted them to give them slight space for now. I can't seem to find any really long cables on eBay or Amazon but MSI supposedly sells them directly (140mm).

Well I'm an idiot. The SLI cable I had wasn't quite 120mm anyway. I think that's the length I need.

Regardless, the few times I've managed to play some games this weekend, I can safely say that 3440x1440 can be run by 2x970s. Once I get some better cooling/the longer SLI cable, I'll be able to run for a longer period of time. Until then, I'm pretty happy with this purchase.

Does anyone want to purchase a GTX780?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Oh dear, the 960 looks like it may well come in with a pathetic 128-bit memory bus and 2GiB VRAM after all.

This would be a colossally awful move for NVIDIA seeing as VRAM requirements in games are pretty rapidly shooting up.

If this is actually going to be the case, then in my opinion, there's nothing interesting about it to wait for, regardless of how low power it might be; people wanting something cheaper than a 970 are going to be best served with a 280X or 290 until further notice.

HalloKitty fucked around with this message at 12:36 on Dec 29, 2014

1gnoirents
Jun 28, 2014

hello :)

HalloKitty posted:

Oh dear, the 960 looks like it may well come in with a pathetic 128-bit memory bus and 2GiB VRAM after all.

This would be a colossally awful move for NVIDIA seeing as VRAM requirements in games are pretty rapidly shooting up.

If this is actually going to be the case, then in my opinion, there's nothing interesting about it to wait for, regardless of how low power it might be; people wanting something cheaper than a 970 are going to be best served with a 280X or 290 until further notice.

Man, they should have added another $20 and a gig more at least. That's not going to go over well almost regardless of how it performs (at first)

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

HalloKitty posted:

Oh dear, the 960 looks like it may well come in with a pathetic 128-bit memory bus and 2GiB VRAM after all.

This would be a colossally awful move for NVIDIA seeing as VRAM requirements in games are pretty rapidly shooting up.

If this is actually going to be the case, then in my opinion, there's nothing interesting about it to wait for, regardless of how low power it might be; people wanting something cheaper than a 970 are going to be best served with a 280X or 290 until further notice.

Is R9-290 pricing at $250 or under sustainable in the long term? I can't see the 960 outperforming the 290 if this bus size is true, and I don't think people put that much of a premium on cool & quiet. Given the whole history of GPU architectures over the years, it's really funny to me that we might see a part with a 128-bit bus positioned against one with a 512-bit bus on the same memory technology.
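The bandwidth gap behind that 128-bit vs 512-bit comparison is easy to make concrete. A back-of-the-envelope sketch in Python; the effective data rates below are my assumptions, typical for GDDR5 cards of this era, not confirmed specs for the unreleased 960:

```python
# Theoretical peak GDDR5 bandwidth: bus width (bits) / 8 * effective data rate (GT/s) = GB/s.
# The data rates are assumed values, not confirmed specs.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_width_bits / 8 * data_rate_gtps

narrow = peak_bandwidth_gbs(128, 7.0)  # rumored 960-style configuration
wide = peak_bandwidth_gbs(512, 5.0)    # R9 290-style configuration
print(narrow, wide)  # 112.0 320.0 -- nearly a 3x gap on the same memory technology
```

Even with a faster memory clock, the narrow bus can't close a gap that size, which is why the bus-width rumor matters more than the raw clock speed.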

1gnoirents
Jun 28, 2014

hello :)
I'll give them credit: the memory improvements this time totally changed the view of "256 bit" as a standard of measure, but no matter what I can't see 128-bit within the same generation being too great.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

HalloKitty posted:

Oh dear, the 960 looks like it may well come in with a pathetic 128-bit memory bus and 2GiB VRAM after all.

This would be a colossally awful move for NVIDIA seeing as VRAM requirements in games are pretty rapidly shooting up.

If this is actually going to be the case, then in my opinion, there's nothing interesting about it to wait for, regardless of how low power it might be; people wanting something cheaper than a 970 are going to be best served with a 280X or 290 until further notice.

Looks like I'll be getting a 970 or its 20nm successor then. My PSU (Corsair HX650) is five years old now, so should I replace/upgrade that first?

spasticColon fucked around with this message at 16:40 on Dec 29, 2014

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

spasticColon posted:

Looks like I'll be getting a 970 or its 20nm successor then. My PSU (Corsair HX650) is five years old now, so should I replace/upgrade that first?

Your PSU is probably getting close to clocking out at 5 years old anyway, so it's not a bad idea.

suddenlyissoon
Feb 17, 2002

Don't be sad that I am gone.
I just got a Gigabyte 970 (mini-ITX) and I'm having some issues. First, I can only get it to boot by using one display. Second, every few minutes the display goes black; sometimes it comes back, sometimes it doesn't. If it does, it says "Nvidia display driver error". I've basically done everything short of a full reinstall of Windows but I keep getting the error. I previously had a Gigabyte 670 and didn't have a problem, so I know the PSU isn't an issue. Bad card?

Rukus
Mar 13, 2007

Hmph.
Yeah, if you just bought it then return it to the retailer for an exchange or get the one made by Asus instead.

suddenlyissoon
Feb 17, 2002

Don't be sad that I am gone.

Rukus posted:

Yeah, if you just bought it then return it to the retailer for an exchange or get the one made by Asus instead.

Back to Amazon you go! If only Amazon had the Strix in stock :(

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

suddenlyissoon posted:

I just got a Gigabyte 970 (mini-ITX) and I'm having some issues. First, I can only get it to boot by using one display. Second, every few minutes the display goes black; sometimes it comes back, sometimes it doesn't. If it does, it says "Nvidia display driver error". I've basically done everything short of a full reinstall of Windows but I keep getting the error. I previously had a Gigabyte 670 and didn't have a problem, so I know the PSU isn't an issue. Bad card?

Make absolutely sure you don't have an overclocking utility set to "Start on Windows Boot".

If for example you had your 670 overclocked with MSI Afterburner, it could be running in the background when the PC launches and overclocking your 970 to something crazy that makes it crash.

Also, I know you said you did everything, but definitely completely uninstall the Nvidia drivers and GeForce Experience, reboot, then do a clean reinstall as well.

After that, sure, return the card.

1gnoirents
Jun 28, 2014

hello :)
That has certainly been a symptom for me, even with cards in the same generation, although not every time. I had to go into safe mode and use a driver removal tool.

I have to say the residual poo poo it picked up was insane. It could find information on every single card I'd ever installed. Worked perfectly afterwards; I think I've only had to do it for 2 cards out of 15.

veedubfreak
Apr 2, 2005

by Smythe

suddenlyissoon posted:

Back to Amazon you go! If only Amazon had the Strix in stock :(

I have 2 970 Strix in stock. I really need to put these drat things up for sale.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
So I've been messing around with DSR on my GTX 770 and it's awesome when I can afford the extra performance, but I do have a question on the DSR factors. For a 1080p monitor, are there certain resolutions to downsample from that make or don't make sense, or as long as it's a 16:9 image it should be ok? I've been checking to see what resolutions people run and outside of the obvious 4x DSR factor for 4k, the next best one to use is stated to be 2.5x which is 2880x1620, but I'm not sure why it would be the next best one.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Take this with a grain of salt, but from what I understand, you're coming across the diminishing returns for trading in additional performance for graphics fidelity (in reverse). It's a big jump down from 4x to 2.5x on the multiplier because downsampling inherently needs many many many many many many many many many more pixels to provide just the slightest improvement in your image quality. If you can still notice DSR working nicely for you on its lower multipliers, then absolutely go for it, because it's far fewer pixels to deal with every frame so you can make more frames per second.

then again, noticing that takes a really sharp eye, and it could just be the beautiful 13-tap Gaussian blur that comes with DSR you might be seeing as well

1gnoirents
Jun 28, 2014

hello :)
The little experience I had with BF4's supersampling (or its variant) was shockingly awesome. However, it also brought my ~80 fps down to 5 at 1440p lol (can't imagine why I couldn't run 5k...). Sometimes I'd stand right in front of a bush, turn it on, and stare in wonder, then get shot in the head.

I can only imagine the DSR performance hit is pretty extreme too. A lot of people were excited about it and I was a little apprehensive because I figured most people would be disappointed. I thought that most any game (say at 1080p) that would really benefit from DSR, or supersampling of any kind, would simply put too much stress on a single 970 to really show what it can do. And if it did have enough horsepower to deal with full-on DSR, you might be better off maxing out other settings in game instead. But those were just my guesses; I don't actually know, since I've never used it. I'd like to if I get SLI, probably.

Also there's the whole "now I've got 4k" thing too

Ham Sandwiches
Jul 7, 2000

Sidesaddle Cavalry posted:

Take this with a grain of salt, but from what I understand, you're coming across the diminishing returns for trading in additional performance for graphics fidelity (in reverse). It's a big jump down from 4x to 2.5x on the multiplier because downsampling inherently needs many many many many many many many many many more pixels to provide just the slightest improvement in your image quality. If you can still notice DSR working nicely for you on its lower multipliers, then absolutely go for it, because it's far fewer pixels to deal with every frame so you can make more frames per second.

then again, noticing that takes a really sharp eye, and it could just be the beautiful 13-tap Gaussian blur that comes with DSR you might be seeing as well

I don't know how much you've messed around with downsampling but even downsampling 1440 to 1080p provides some pretty noticeable gains, and this was before it was officially supported in drivers. I was using either manual resolutions or gedosato. Most of the time 2.5x will provide solid visual improvement on a game, and if it isn't a resolution bound game, shouldn't affect the performance that badly.

[edit]

quote:

the next best one to use is stated to be 2.5x which is 2880x1620, but I'm not sure why it would be the next best one.

Because it's a 50% increase per axis, which is a cleaner multiplier than the 2x factor, which works out to sqrt 2, or ~41%, per axis. 1920(x1.5) x 1080(x1.5) gets you 2880 x 1620. It was my go-to downsampling rez when I was doing it manually.

Ham Sandwiches fucked around with this message at 01:50 on Dec 31, 2014
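The scale-factor arithmetic can be checked in a few lines of plain Python. Nothing here is from the thread beyond the resolutions themselves; note that NVIDIA's control panel labels 2880x1620 on a 1080p panel by its pixel-count ratio, 2.25x:

```python
import math

# A 1.5x per-axis scale on 1920x1080 gives 2880x1620, i.e. 1.5^2 = 2.25x the pixels.
base_w, base_h = 1920, 1080
scale = 1.5

render_w, render_h = int(base_w * scale), int(base_h * scale)
pixel_ratio = (render_w * render_h) / (base_w * base_h)

print(render_w, render_h)  # 2880 1620
print(pixel_ratio)         # 2.25

# A 2x pixel-count factor, by contrast, is only sqrt(2) ~ 1.41x per axis,
# which resamples less cleanly onto the native pixel grid than a whole 1.5x step.
print(round(math.sqrt(2), 3))  # 1.414
```

The clean 3:2 ratio is why 2880x1620 was a popular manual downsampling target even before driver support existed.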

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Beautiful Ninja posted:

So I've been messing around with DSR on my GTX 770 and it's awesome when I can afford the extra performance, but I do have a question on the DSR factors. For a 1080p monitor, are there certain resolutions to downsample from that make or don't make sense, or as long as it's a 16:9 image it should be ok? I've been checking to see what resolutions people run and outside of the obvious 4x DSR factor for 4k, the next best one to use is stated to be 2.5x which is 2880x1620, but I'm not sure why it would be the next best one.

On my 680, I've mostly been combining 2560x and below (on a 1920 screen) with 2x or 4x MSAA and it's worked great. Especially since some titles don't have scaling interfaces and super high res makes everything tiny. Even 1.2x with 2x MSAA and FXAA gives really good results compared to 4x MSAA.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
I ordered one of those evga acx cooler 970s just now because it and the gigabyte were the only decently priced ones in stock with prime shipping and I'll never touch another gigabyte card if I can help it.

It was a bit of a rush purchase though, FedEx tried to destroy my desktop to the point where all the HDD caddies were somehow shaken loose and my four year old drives spent the trip bouncing against each other. Miraculously the drives all survived, even if it knocked a few months of life off them, but my gtx 670 did not.

After reading this thread and the part picking thread and hearing about how the cooler is either subpar or bad, I'm worried I made a terrible mistake. What should I be expecting with this card? Is it bad enough to consider returning it or is it just not as good as the other custom coolers?

td4guy
Jun 13, 2005

I always hated that guy.

Compared to its competition, it's slightly warmer and slightly louder. Compared to your 670, it's great in that regard.
Keep in mind though, that EVGA skimped on the power delivery portions of the card. So you probably won't be able to overclock as much.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Rakthar posted:

I don't know how much you've messed around with downsampling but even downsampling 1440 to 1080p provides some pretty noticeable gains, and this was before it was officially supported in drivers. I was using either manual resolutions or gedosato. Most of the time 2.5x will provide solid visual improvement on a game, and if it isn't a resolution bound game, shouldn't affect the performance that badly.

[edit]


Because it's a 50% increase per axis, which is a cleaner multiplier than the 2x factor, which works out to sqrt 2, or ~41%, per axis. 1920(x1.5) x 1080(x1.5) gets you 2880 x 1620. It was my go-to downsampling rez when I was doing it manually.

Thanks, that makes sense as to why 2.5x was considered the next best resolution to downsample from. I've been using 2880x1620 in the games I haven't been able to pull off 4k in; it makes tweaking PC settings even more interesting than it used to be, now that I'm chasing a higher resolution on top of getting max settings on everything possible.

Fallows
Jan 20, 2005

If he waits long enough he can use his accrued interest from his savings to bring his negative checking balance back into the black.
ACX 1.0 buyer here, been using it since Oct. What td4guy is saying about power delivery and overclocking seems pretty true for the EVGA ACX. Mine doesn't stay stable at anything above 1420ish MHz on the card's stock voltage, which, from talking to friends, is a bit on the low end but still good. The other things about heat and noise have not been an issue for me; the card never goes above the mid 60s or the fan above 40-50%, which is still really quiet.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
In that case I'll probably be fine, the 670 it'll be replacing had a reference cooler and my case is pretty well ventilated. I also don't generally put in the effort to overclock so I'm fine with leaving it on the factory overclock.

I was about due for at least a GPU upgrade, even if I probably wouldn't have bothered this generation without this happening. For reference the CPU is an ivy bridge i5-3550 still using the stock cooler.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
Hi thread, I have some questions about a low 3dMark score. I just got a new 970 and I decided to run 3dmark firestrike to test it out and I got a really low score, around 6700. The CPU looks like it was really dragging things down. I have a sandy bridge i7 2600k running at 4.4Ghz, is the score normal and I just need to upgrade my mobo and CPU or is something wonky going on?

Miley Virus
Apr 9, 2010

Desuwa posted:

I ordered one of those evga acx cooler 970s just now because it and the gigabyte were the only decently priced ones in stock with prime shipping and I'll never touch another gigabyte card if I can help it.

It was a bit of a rush purchase though, FedEx tried to destroy my desktop to the point where all the HDD caddies were somehow shaken loose and my four year old drives spent the trip bouncing against each other. Miraculously the drives all survived, even if it knocked a few months of life off them, but my gtx 670 did not.

After reading this thread and the part picking thread and hearing about how the cooler is either subpar or bad, I'm worried I made a terrible mistake. What should I be expecting with this card? Is it bad enough to consider returning it or is it just not as good as the other custom coolers?

EVGA are rolling out new versions of that card with (allegedly) better coolers and better power delivery. It has SSC on the box instead of SC. You most likely didn't buy it since it's only a limited release so far but you can use their Step-Up thing to exchange your card for it. You may have to pay a bit more though.

http://www.evga.com/support/stepup/

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

AVeryLargeRadish posted:

Hi thread, I have some questions about a low 3dMark score. I just got a new 970 and I decided to run 3dmark firestrike to test it out and I got a really low score, around 6700. The CPU looks like it was really dragging things down. I have a sandy bridge i7 2600k running at 4.4Ghz, is the score normal and I just need to upgrade my mobo and CPU or is something wonky going on?
No, that's not right. Maybe you have a rogue process running, like a bitcoin miner or other malware? A 2600k wouldn't hold you back in 3DMark - I should know, since I have one running at around that speed.

Might need to reinstall the drivers completely after removing them fully with Display Driver Uninstaller, if you haven't already.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

cisco privilege posted:

No, that's not right. Maybe you have a rogue process running, like a bitcoin miner or other malware? A 2600k wouldn't hold you back in 3DMark - I should know, since I have one running at around that speed.

Might need to reinstall the drivers completely after removing them fully with Display Driver Uninstaller, if you haven't already.

I seem to have fixed it, I had to turn off Intel Speed Step in my bios, for some reason the cpu was sticking at its idle clocks during the test instead of speeding up. Now getting a score of 9900-ish.

BTW, at 4.6Ghz I am getting a core temp of 44C-46C and a CPU temp of 57C, is that too high? Before it was like 25C-27C core and 34C for the CPU...

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

AVeryLargeRadish posted:

I seem to have fixed it, I had to turn off Intel Speed Step in my bios, for some reason the cpu was sticking at its idle clocks during the test instead of speeding up. Now getting a score of 9900-ish.

BTW, at 4.6Ghz I am getting a core temp of 44C-46C and a CPU temp of 57C, is that too high? Before it was like 25C-27C core and 34C for the CPU...

Not even close; thermal max should be somewhere around 90C before the CPU starts underclocking itself. Anything below that is within Intel's specifications for temperature and the chip is designed to run there, though anything in the 80C-and-above range under real-world circumstances might be a case where you want to look at a better cooling solution.

Beautiful Ninja fucked around with this message at 19:51 on Jan 1, 2015

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
You shouldn't have to disable SpeedStep. Probably need to make sure the max processor state in Control Panel > Power Options is set to 100%, with the minimum at the default of 5%, IIRC.
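The same power-plan values can be set from an elevated command prompt instead of clicking through Power Options; a sketch assuming the standard powercfg aliases (SCHEME_CURRENT, SUB_PROCESSOR, PROCTHROTTLEMAX/PROCTHROTTLEMIN) resolve on your install:

```shell
:: Set maximum processor state to 100% and minimum to 5% on the active plan (AC power),
:: then re-apply the scheme so the change takes effect. Run elevated.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 100
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 5
powercfg /setactive SCHEME_CURRENT
```

Use /setdcvalueindex for the on-battery values if it's a laptop.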
