Seamonster
Apr 30, 2007

IMMER SIEGREICH
They've got enough money for a 980ti but can't even afford a basic aftermarket CPU HSF like a 212 or something?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Seamonster posted:

They've got enough money for a 980ti but can't even afford a basic aftermarket CPU HSF like a 212 or something?

This isn't for me, and it's an i5-6500; the stock Intel cooler works perfectly fine. Sadly the Zotac has some rattling fans, but I also have a Windforce 980 Ti so I used that one instead.

Don Lapre fucked around with this message at 19:09 on Dec 17, 2015

penus penus penus
Nov 9, 2014

by piss__donald

Don Lapre posted:

Air, zotac 980ti Amp! stock cooler

So I never even looked at Precision X lol. Didn't realize Heaven showed incorrect clock speeds

Yeah ok this makes a lot more sense lol. Heaven has been that way as long as I've been using it.

SlayVus
Jul 10, 2009
Grimey Drawer

THE DOG HOUSE posted:

Wow, 1443 MHz factory settings is the highest I've heard of

My GALAX 980 Ti HOF does 1480 at factory settings and can do 1575 on air. It also has a $100 price premium.

SlayVus fucked around with this message at 19:27 on Dec 17, 2015

Wistful of Dollars
Aug 25, 2009

xthetenth posted:

I'm hoping Freesync gets borderless windowed and crossfire support soon because then it'll be a total replacement or at least close enough. I'll cheerfully accept it being slightly less capable for the price difference between an XR341CK and an X34 though.

Aye, me too. I really want to give AMD my :10bux: but being able to run SLI in borderless windowed with G-Sync is really, really convenient - particularly as a chronic alt-tabber.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

El Scotch posted:

Aye, me too. I really want to give AMD my :10bux: but being able to run SLI in borderless windowed with G-Sync is really, really convenient - particularly as a chronic alt-tabber.

Yeah, I just took a loss sidegrading because I was having an issue with alt-tabbing, so not having borderless is the one real risk I feel I'm taking in going with an XR341CK rather than the X34. Them adding low framerate compensation to Freesync with a driver update makes me feel better about it, to the point where I'll take the risk (that, and my boss being interested in buying my 34UM95, which would cover much more of an XR than an X34).

NihilismNow
Aug 31, 2003

Paul MaudDib posted:

Yeah at this point there are still definite choices you have to make. AMD has the better Eyefinity mode, NVIDIA has G-sync and DVI outputs. There's a pretty good lock-in right now to one ecosystem or the other.

AMD also allows you to pass through all their cards in VMware, while Nvidia only allows that on their Quadro and Tesla cards.
For all the dozens of people worldwide to whom this matters.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Thanks for making me feel bad about my plain-Jane EVGA ACX 980 Ti that tops out at 1403, or goes down to 1390 when it gets really stressed. I don't have any extra volts on it, though.

SwissArmyDruid
Feb 14, 2014

by sebmojo

NihilismNow posted:

AMD also allows you to pass through all their cards in VMware, while Nvidia only allows that on their Quadro and Tesla cards.
For all the dozens of people worldwide to whom this matters.

Not that hard to trick the Nvidia cards into thinking that they're not actually in a VM. I hear it only takes one or two things.

penus penus penus
Nov 9, 2014

by piss__donald
Lol, GeForce Experience -> Twitch; Twitch auto-records and auto-exports to YouTube; Imgur takes YouTube videos and converts them to GIFs



the times we live in

cat doter
Jul 27, 2006



gonna need more cheese...australia has a lot of crackers
Why the heck does the nvidia control panel take like a minute to load up every time? I've got windows on my SSD and my computer boots up in literally like 8 seconds, but just the nvidia panel takes forever. Sometimes I have to close it manually and open it again just to get it to stop hanging.

penus penus penus
Nov 9, 2014

by piss__donald

cat doter posted:

Why the heck does the nvidia control panel take like a minute to load up every time? I've got windows on my SSD and my computer boots up in literally like 8 seconds, but just the nvidia panel takes forever. Sometimes I have to close it manually and open it again just to get it to stop hanging.

I don't know, man. That, and if you try to manually open GFE, sometimes it can take an obscene amount of time.

SlayVus
Jul 10, 2009
Grimey Drawer

Dogen posted:

Thanks for making me feel bad about my plain-Jane EVGA ACX 980 Ti that tops out at 1403, or goes down to 1390 when it gets really stressed. I don't have any extra volts on it, though.

Don't feel bad; people spend like an extra $300-$400 or more to get ASIC qualities above 80%. The 72%-guarantee ones are $850. A used 80%+ is selling for $1,500 on Amazon.

Sininu
Jan 8, 2014

cat doter posted:

Why the heck does the nvidia control panel take like a minute to load up every time? I've got windows on my SSD and my computer boots up in literally like 8 seconds, but just the nvidia panel takes forever. Sometimes I have to close it manually and open it again just to get it to stop hanging.
I don't think the SSD even makes a difference. It's awfully slow no matter what.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

SinineSiil posted:

I don't think the SSD even makes a difference. It's awfully slow no matter what.

Yeah, it's got some weirdnesses. I had a file with a machine-generated file name that was crazy long lying in a folder, and every time I tried to open the per-game settings window it would crash the program, because it seems it scans the whole file system or a significant fraction of it. It took some combing through logs in Process Explorer to figure that one out, because exception handling is for scrubs.

xthetenth fucked around with this message at 00:08 on Dec 18, 2015
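
If you want to go hunting for that kind of pathological file yourself, here's a minimal sketch in Python; the starting directory and length threshold are assumptions, since nothing documents exactly what the control panel scans:

```python
import os

# Walk a directory tree and flag any file whose full path is suspiciously
# long. ROOT and LIMIT are guesses; tune them for your own machine.
ROOT = r"C:\Users"
LIMIT = 200  # characters; classic Win32 APIs start misbehaving near 260

for dirpath, _dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        full = os.path.join(dirpath, name)
        if len(full) > LIMIT:
            print(len(full), full)
```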

Sininu
Jan 8, 2014

xthetenth posted:

Yeah, it's got some weirdnesses. I had a file with a machine-generated file name that was crazy long lying in a folder, and every time I tried to open the per-game settings window it would crash the program, because it seems it scans the whole file system or a significant fraction of it. It took some combing through logs in Process Explorer to figure that one out, because exception handling is for scrubs.

I had to reinstall it once because it kept crashing while starting up.
Awful

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
I'll take a moment to toot AMD's horn and say Radeon Software is incredibly responsive even on old dumb HDDs.

SwissArmyDruid
Feb 14, 2014

by sebmojo
It had better. It's hard to gently caress up a Qt-based UI.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!
The Nvidia control panel opens in 4-5 seconds for me, Geforce Experience in 7 seconds. :shrug:

xiansi
Jan 26, 2012

im judjing all goons cause they have bad leader, so a noral member is associated whith thoose crasy one

Personaly i would quit the goons if i was in cause of thoose crasy ppl
Clapping Larry

cat doter posted:

Why the heck does the nvidia control panel take like a minute to load up every time? I've got windows on my SSD and my computer boots up in literally like 8 seconds, but just the nvidia panel takes forever. Sometimes I have to close it manually and open it again just to get it to stop hanging.

gently caress yeah, that thing is jank-tastic.

It follows a time-honoured tradition though - all 'driver UI' software that comes with hardware, even decent hardware, runs like utter poo poo. It's generally really ugly too. Mouse software is by far the worst, closely followed by soundcard/audio stuff.

The fact that Samsung Magician is probably the least awful example of this tells you everything. Oh, and NZXT CAM, which still runs like crap, but at least looks quite nice.

And AMD have just released their Crimson thing, and whilst I have no idea how that runs, at least it doesn't look as awful as Catalyst did, though that's a loving low bar.

AVeryLargeRadish posted:

The Nvidia control panel opens in 4-5 seconds for me, Geforce Experience in 7 seconds. :shrug:

My times are around the same. And playing terribly fast & loose with numbers here, you are basically saying that opening & displaying a basic UI, plus running a few thousand (?) low-level hardware lookups, takes between 20 and 25 *billion* CPU operations, assuming one core. That is some sloppy poo poo...

xiansi fucked around with this message at 01:36 on Dec 18, 2015
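
For the record, the back-of-envelope math behind that figure, with the assumptions spelled out (one core at roughly 5 GHz retiring about one operation per cycle, kept busy the whole time):

```python
# xiansi's estimate made explicit. All numbers are rough assumptions:
# one core at ~5 GHz retiring about one operation per cycle, kept busy
# for the whole time the panel takes to open.
ops_per_second = 5e9
for seconds in (4, 5):
    total = ops_per_second * seconds
    print(f"{seconds} s of load -> {total / 1e9:.0f} billion operations")
# 4 s of load -> 20 billion operations
# 5 s of load -> 25 billion operations
```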

dissss
Nov 10, 2007

I'm a terrible forums poster with terrible opinions.

Here's a cat fucking a squid.

FaustianQ posted:

I'll take a moment to toot AMD's horn and say Radeon Software is incredibly responsive even on old dumb HDDs.

Whatever the last version of Catalyst Control Center that worked with the 5450 in my HTPC was, it was the absolute worst for this - it sometimes took over a minute to load.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I really have to give AMD props for the rate of improvement in their drivers over the past year or so. Of course, they had the farthest to go so it's easier to see visible changes, but it was like they were doing gently caress-all for a long time.

beergod
Nov 1, 2004
NOBODY WANTS TO SEE PICTURES OF YOUR UGLY FUCKING KIDS YOU DIPSHIT
Here's what I am having a hard time understanding. So I'm running two MSI 970s. Shouldn't there be an "ideal" overclock for those cards since they're all the same? Like shouldn't I just be able to Google the "correct" overclock values and plug them in? Why do I need to experiment, etc., to find the ideal overclock? Is it because of their interaction with other parts of my PC (the CPU, the RAM, etc.) or because each card is a bit different, or what?

I'm failing at overclocking and I feel like an idiot.

Star War Sex Parrot
Oct 2, 2003

beergod posted:

Here's what I am having a hard time understanding. So I'm running two MSI 970s. Shouldn't there be an "ideal" overclock for those cards since they're all the same?
Nope, variations in manufacturing lead to variations in performance.

beergod posted:

Like shouldn't I just be able to Google the "correct" overclock values and plug them in?
You can look around to see how well other people are doing as a rough guideline, but no, that's not really how it works.

beergod posted:

Why do I need to experiment, etc., to find the ideal overclock?
Because at the end of the day your cards are special snowflakes, and it's very likely that even two "identical" cards will overclock differently.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

beergod posted:

Here's what I am having a hard time understanding. So I'm running two MSI 970s. Shouldn't there be an "ideal" overclock for those cards since they're all the same? Like shouldn't I just be able to Google the "correct" overclock values and plug them in? Why do I need to experiment, etc., to find the ideal overclock? Is it because of their interaction with other parts of my PC (the CPU, the RAM, etc.) or because each card is a bit different, or what?

I'm failing at overclocking and I feel like an idiot.

That would assume every chip is exactly the same. Atoms and the margin of error are way too big in comparison to transistors for that to be remotely true. Some chips are better than others and every one's unique.

Basically the limiting factor of how small we can make transistors is the accuracy to which we can carve designs into wafers, and to get that extra performance we take it to the bleeding edge where some of the chips aren't even going to work. Then we pack up all the ones that actually hit a given performance target and ship them up to little boys and girls around the world.

xthetenth fucked around with this message at 04:51 on Dec 18, 2015
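
A toy sketch of that binning process; the clock distribution and bin cutoffs below are invented for illustration, not real numbers for any actual chip:

```python
import random

# Toy model: process variation makes each die's maximum stable clock a
# random draw, and the fab sorts ("bins") dies by where they land.
# The distribution and cutoffs below are invented for illustration.
random.seed(0)
max_clocks = [random.gauss(1400, 60) for _ in range(10_000)]  # MHz

scrap  = sum(c < 1300 for c in max_clocks)          # can't hit spec at all
stock  = sum(1300 <= c < 1450 for c in max_clocks)  # ships at stock clocks
binned = sum(c >= 1450 for c in max_clocks)         # factory-OC parts

print(f"scrap: {scrap}, stock bin: {stock}, factory-OC bin: {binned}")
# Two cards from the same stock bin can still differ by >100 MHz of
# headroom, which is why "identical" cards overclock differently.
```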

beergod
Nov 1, 2004
NOBODY WANTS TO SEE PICTURES OF YOUR UGLY FUCKING KIDS YOU DIPSHIT
So what kind of FPS performance gain am I looking at with an "ideal" overclock? 5-10%?

What are we using to drive our Predator X34s? My Fallout 4 Ultra FPS is dropping to 30-40 at night and That Is Unacceptable.

BurritoJustice
Oct 9, 2012

beergod posted:

So what kind of FPS performance gain am I looking at with an "ideal" overclock? 5-10%?

What are we using to drive our Predator X34s? My Fallout 4 Ultra FPS is dropping to 30-40 at night and That Is Unacceptable.

On modern Maxwell cards, 20% is a fairly easy target (especially on the 980 Ti).

The Fury series from AMD has sub-5% performance gains from overclocking. The Hawaii series (290/390/X) can get ~10%.

Super rough numbers of course
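
As a sanity check on those percentages: in a GPU-bound game, frame rate scales roughly linearly with core clock, so the clock gain is about the FPS gain. The clocks and base frame rate below are assumed, not measured:

```python
# Rough, GPU-bound approximation: frame rate scales ~linearly with core
# clock, so the % clock gain is about the % FPS gain. The clocks and the
# base frame rate below are assumptions, not measured numbers.
stock_mhz, oc_mhz = 1200, 1440  # ~20% overclock on an assumed Maxwell card
base_fps = 35                   # e.g. a bad Fallout 4 night-time scene

gain = oc_mhz / stock_mhz
print(f"clock gain:   {gain - 1:.0%}")          # clock gain:   20%
print(f"expected FPS: {base_fps * gain:.0f}")   # expected FPS: 42
```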

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

beergod posted:

So what kind of FPS performance gain am I looking at with an "ideal" overclock? 5-10%?

What are we using to drive our Predator X34s? My Fallout 4 Ultra FPS is dropping to 30-40 at night and That Is Unacceptable.

As BurritoJustice said, a GeForce 9xx card should easily see 20%-25% improvement, and that's before you start talking silly measures like liquid cooling and whatnot.

We are using 980/980 Tis to drive X34s. Not sure what you've got right now, but FO4 is actually a comparatively light game on the GPU for a modern AAA title (though it appears to benefit more from a beefier CPU than many other current-gen games do). I've got a single 980 pushing a 1440p monitor, and with everything maxed it's comfortably above 75 FPS the vast majority of the time.

KakerMix
Apr 8, 2004

8.2 M.P.G.
:byetankie:

beergod posted:

So what kind of FPS performance gain am I looking at with an "ideal" overclock? 5-10%?

What are we using to drive our Predator X34s? My Fallout 4 Ultra FPS is dropping to 30-40 at night and That Is Unacceptable.

Note that Fallout 4 does not currently have SLI support, so if you have SLI right now you are only using one card. You can force SLI using some other profiles and instructions found around the Internet, but that comes with its own side effects.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

xiansi posted:

Mouse software is by far the worst

Spoken like someone who has never had a printer.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you
Here's a question I've been wondering about. I can currently get an EVGA 970 FTW+ for about $269; should I go for that instead of, say, a slightly less OC'd 970 from Gigabyte or MSI?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

real_scud posted:

Here's a question I've been wondering about. I can currently get an EVGA 970 FTW+ for about $269; should I go for that instead of, say, a slightly less OC'd 970 from Gigabyte or MSI?

That's a good price and I doubt the MSI or Gigabyte models would be any cheaper.

I don't think the FTW ever got the crappy cooler.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

SwissArmyDruid posted:

Not that hard to trick the Nvidia cards into thinking that they're not actually in a VM. I hear it only takes one or two things.
Not sure which trick(s) are being used to do it. I'd think the driver would be what's in control of the VM detection logic, and otherwise they could just patch it out easily. The only references I see that get around anything in any manner are to use Nvidia GRID vGPU under ESXi, and that's not what people are normally looking for, I'd imagine. Things like changing the MAC address to another vendor's range and whatnot seem like they wouldn't work, since not every VM will have a virtual network connection, I'd think. Otherwise, not letting VMware Tools run or changing around PCI addresses for certain devices could very well cause bugs in other things that are not very nice either.

And I'm one of those dozen people interested in this approach, because I'd like to run some CUDA stuff in one VM running Linux and, when it's not in use, have the GPU switched to an HTPC VM that runs Windows (DRM BS for TV recording and such is easier under Windows). Maybe I should just get separate GPUs or eat the cost of a Tesla, but buying multiple $1k+ cards sucks for primarily home use.

SwissArmyDruid
Feb 14, 2014

by sebmojo

necrobobsledder posted:

Not sure which trick(s) are being used to do it. I'd think the driver would be what's in control of the VM detection logic, and otherwise they could just patch it out easily. The only references I see that get around anything in any manner are to use Nvidia GRID vGPU under ESXi, and that's not what people are normally looking for, I'd imagine. Things like changing the MAC address to another vendor's range and whatnot seem like they wouldn't work, since not every VM will have a virtual network connection, I'd think. Otherwise, not letting VMware Tools run or changing around PCI addresses for certain devices could very well cause bugs in other things that are not very nice either.

And I'm one of those dozen people interested in this approach, because I'd like to run some CUDA stuff in one VM running Linux and, when it's not in use, have the GPU switched to an HTPC VM that runs Windows (DRM BS for TV recording and such is easier under Windows). Maybe I should just get separate GPUs or eat the cost of a Tesla, but buying multiple $1k+ cards sucks for primarily home use.

I've heard it's possible; I've not done it myself yet. 2016 is the year I build a new box (Skull Canyon if it's got an iGPU, Skylake if not, Arctic Islands either way) from scratch, move over fully to Linux, and use hardware passthrough to VM Windows into its own little box where it can run my one or two remaining Windows-only apps and can't hurt us. And games. :rolleyes:

My chosen solution involves QEMU: the driver looks for KVM extensions and then self-terminates if it detects them. QEMU has flags you can use to hide those extensions from the driver, although it seems they look for Hyper-V as well.

QEMU works around it, but apparently this costs real performance under Windows and may subject you to CLOCK_WATCHDOG_TIMEOUT bluescreens.
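
For reference, a minimal sketch of launching a guest with those signatures hidden; the PCI address, disk image, and hv_vendor_id string are placeholders, and flag availability depends on your QEMU version:

```python
import subprocess

# Launch a Windows guest with the hypervisor signatures hidden from the
# NVIDIA driver: kvm=off masks the KVM CPUID leaf, and hv_vendor_id
# replaces the Hyper-V vendor string the driver also checks (only on
# newer QEMU builds). The PCI address and disk image are placeholders.
cmd = [
    "qemu-system-x86_64",
    "-enable-kvm",
    "-m", "8G",
    "-cpu", "host,kvm=off,hv_vendor_id=whatever",
    "-device", "vfio-pci,host=01:00.0",           # the passed-through GPU
    "-drive", "file=windows.qcow2,format=qcow2",
]
subprocess.run(cmd, check=True)
```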

Quote from Nvidia:

"We fixed some hypervisor detection code that was breaking Hyper-V. It's possible that fix may be preventing GeForce cards from working in passthrough, but because it is not officially supported for GeForce cards, this will not be fixed."

https://forums.geforce.com/default/...232923/#4232923

SwissArmyDruid fucked around with this message at 18:20 on Dec 18, 2015

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Oh... KVM, basically. I've definitely read about getting Nvidia card passthrough to work via KVM, but not under ESXi. According to this it's just two flags: http://www.se7ensins.com/forums/threads/how-to-setup-a-gaming-virtual-machine-with-gpu-passthrough-qemu-kvm-libvirt-and-vfio.1371980/ - but it may not apply with Linux as the guest OS. It makes no sense why Hyper-V would matter under KVM as the hypervisor, but evidently the VM detection logic triggers with Windows as the guest OS for sure.

NihilismNow
Aug 31, 2003

SwissArmyDruid posted:

Not that hard to trick the Nvidia cards into thinking that they're not actually in a VM. I hear it only takes one or two things.

Do you have any links? I'd like to have options if I ever decide to upgrade the old AMD card I am using for this.

I know it works with the older cards, but I haven't found anyone who got it working with a consumer 9-series card.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Any bets on how the next release cycle will play out? Is it really just going to be a "new 1080 matches/slightly beats the old 980 Ti"? Or will they lead off with big fully unlocked chips?

It seems like a bit of a prisoner's dilemma to me. AMD really needs some high-margin parts and some good press as the market leader, and if they do it then NVIDIA has to have something to at least match them.

Big chips on a new process node with oodles of HBM sounds expensive though.

Paul MaudDib fucked around with this message at 21:08 on Dec 18, 2015

Anime Schoolgirl
Nov 28, 2002

Aren't the low-end cards (370/950) and top end parts (save for Fury) the high margin parts these days?

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Anime Schoolgirl posted:

Aren't the low-end cards (370/950) and top end parts (save for Fury) the high margin parts these days?

I'm pretty sure it's all "high margin"... they're all like $30 to make or something, most of the cost being the VRAM I think.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Paul MaudDib posted:

Any bets on how the next release cycle will play out? Is it really just going to be a "new 1080 matches/slightly beats the old 980 Ti"? Or will they lead off with big fully unlocked chips?

It seems like a bit of a prisoner's dilemma to me. AMD really needs some high-margin parts and some good press as the market leader, and if they do it then NVIDIA has to have something to at least match them.

Big chips on a new process node with oodles of HBM sounds expensive though.

I think AMD is intent on a summer release for Arctic Islands, likely with a Fiji and Tonga shrink (so 470 and 490 cards), while waiting a bit later for a Greenland, Baffin, and Ellesmere drop. Nvidia may be holding back just to see AMD's play and try to match it, to knock the wind out of Arctic's sails.
