Deathreaper
Mar 27, 2010
Haven't been keeping up with GPUs lately - any reason to flash the R9 290 to an R9 390? From what I recall, it was some driver BS giving the 390 a performance boost - was there anything else in the firmware?


Setzer Gabbiani
Oct 13, 2004

Deathreaper posted:

Haven't been keeping up with GPUs lately - any reason to flash the R9 290 to an R9 390? From what I recall, it was some driver BS giving the 390 a performance boost - was there anything else in the firmware?

Driver-wise, the big thing back when driver downsampling first came to AMD cards was that the higher downsampling resolutions were only unlocked if it detected a high-end card, but these days any card can downsample from any resolution, and as of Crimson I'm pretty sure the old per-card optimizations are gone. Beyond the clock changes, the 390 BIOS has tighter memory timings than a stock 200-series card's - that's probably the biggest change - and nobody's reported them being unstable once a card has been successfully flashed.

It's enough of a performance bump to be worth a shot. This is the guide I used last year; the benchmarks are near the bottom.

http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016
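
If it helps, the flash itself boils down to a handful of ATIFlash commands. This is just a rough sketch from memory - the adapter index and ROM filenames are placeholders, so follow the linked guide for the real steps:

  :: list adapters and note your 290's index (assumed to be 0 below)
  atiflash -i
  :: back up the stock BIOS before touching anything
  atiflash -s 0 stock290.rom
  :: program the modded ROM; -f forces it past the subsystem ID check
  atiflash -p 0 390mod.rom -f

And keep that stock290.rom somewhere safe in case you ever need to flash back.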

LiquidRain
May 21, 2007

Watch the madness!

Did I make the right decision in having a friend buy a used MSI TwinFrozr 280X 3GB for the same price as a 960 2GB ($160)? He's already bought it, but I'm looking for validation. It's the older Tahiti design, but he doesn't care about variable refresh and he just wanted a stop-gap for the next year.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

LiquidRain posted:

Did I make the right decision in having a friend buy a used MSI TwinFrozr 280X 3GB for the same price as a 960 2GB ($160)? He's already bought it, but I'm looking for validation. It's the older Tahiti design, but he doesn't care about variable refresh and he just wanted a stop-gap for the next year.

The 280X is the faster card, so it seems like the right choice to me. The only real disadvantage is that the 280X eats up a lot more power.
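
Back of the envelope, assuming roughly 250W for the 280X versus 120W for the 960 (board-power figures are approximate): that ~130W gap at, say, 20 hours of gaming a week works out to about 135 kWh a year, or something like $15-20 at typical US electricity rates. Noticeable, but hardly a dealbreaker for a stop-gap card.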

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Yeah, the 960 is probably the least attractive card in the Nvidia lineup; I would definitely choose AMD at that price point.

Vinlaen
Feb 19, 2008

I'm not sure if this is the right place to ask, but I have an Nvidia GTX 980 that I use 100% for video output. However, I might want to use Intel QuickSync in the future, so is it worth enabling the iGPU?

Is there any performance difference if I leave it enabled but don't connect any monitors to it?

What is the recommended approach when using a dedicated PCIe card like my GTX 980?

(Also, if it matters, my CPU is an i7-6700K)

EDIT: It looks like DirectX 12 has a multi-adapter feature that can use both the integrated and discrete GPUs for increased performance?

Vinlaen fucked around with this message at 03:23 on Apr 4, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Only when transcoding video to disk.
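
If you do end up doing that, the usual route is ffmpeg with the QSV encoders - a rough sketch, assuming your build includes them, with the bitrate and filenames as placeholders:

  :: transcode on the iGPU via QuickSync, leaving the 980 alone
  ffmpeg -i input.mkv -c:v h264_qsv -preset faster -b:v 6M -c:a copy output.mkv

For normal desktop and gaming output there's no real reason to involve the iGPU.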

Vinlaen
Feb 19, 2008

Sorry, I just made an edit but it was too late.

I've just read an article that mentions DirectX 12 supporting multi-adapter and showing performance increases using integrated plus discrete graphics?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Vinlaen posted:

Sorry, I just made an edit but it was too late.

I've just read an article that mentions DirectX 12 supporting multi-adapter and showing performance increases using integrated plus discrete graphics?

Old article. The question is whether games will support it, and whether NVIDIA will allow their drivers to support it. Games will probably be built on off-the-shelf engines, which makes them more likely to support it, but NVIDIA also hates competition. And honestly there's not much reason to use it beyond the native ~20% boost AMD gets from DX12.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

LiquidRain posted:

Did I make the right decision in having a friend buy a used MSI TwinFrozr 280X 3GB for the same price as a 960 2GB ($160)? He's already bought it, but I'm looking for validation. It's the older Tahiti design, but he doesn't care about variable refresh and he just wanted a stop-gap for the next year.

For a stopgap? Absolutely, it's a faster card and less likely to have memory size issues.

penus penus penus
Nov 9, 2014

by piss__donald

LiquidRain posted:

Did I make the right decision in having a friend buy a used MSI TwinFrozr 280X 3GB for the same price as a 960 2GB ($160)? He's already bought it, but I'm looking for validation. It's the older Tahiti design, but he doesn't care about variable refresh and he just wanted a stop-gap for the next year.

Mmmm, a 280X is a better choice here for the same money; however, that card in particular has a shady past.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127759

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Paul MaudDib posted:

Old article. The question is whether games will support it, and whether NVIDIA will allow their drivers to support it. Games will probably be built on off-the-shelf engines, which makes them more likely to support it, but NVIDIA also hates competition. And honestly there's not much reason to use it beyond the native ~20% boost AMD gets from DX12.

If they don't, then they can't slap DX12 READY FULLY COMPATIBLE on the box.
They know that, so that's why they're doing the whole NVLink stuff.

Truga
May 4, 2014
Lipstick Apathy

feedmegin posted:

And yes, signed firmware is coming

AFAIK it's already here - I had to use a cracked nvflash to get a custom ROM onto Maxwell.
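
For reference, the usual dance with the patched nvflash goes something like this - going from memory, so double-check the flags against your version, and the filenames are placeholders:

  :: back up the existing BIOS before touching anything
  nvflash --save stock.rom
  :: disable the EEPROM write protection
  nvflash --protectoff
  :: flash the custom ROM; -6 overrides the subsystem ID mismatch prompt
  nvflash -6 custom.rom

The stock tool refuses unsigned images on Maxwell, which is the whole reason the cracked build exists.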

Vinlaen posted:

I'm not sure if this is the right place to ask, but I have an Nvidia GTX 980 that I use 100% for video output. However, I might want to use Intel QuickSync in the future, so is it worth enabling the iGPU?

Is there any performance difference if I leave it enabled but don't connect any monitors to it?

What is the recommended approach when using a dedicated PCIe card like my GTX 980?

(Also, if it matters, my CPU is an i7-6700K)

I tried doing this and it wouldn't work, because ASUS in all their wisdom decided not to wire the onboard GPU up to anything at all, probably thinking "this is an xfire/SLI mobo, surely nobody will use the Haswell GPU for anything??"

Well, I want to, but I can't.

LiquidRain
May 21, 2007

Watch the madness!

THE DOG HOUSE posted:

Mmmm, a 280X is a better choice here for the same money; however, that card in particular has a shady past.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127759
Yikes, I wish I'd seen this first. Well, hopefully it lasts the year, or fails in the next 2 weeks. :p

Bleh Maestro
Aug 30, 2003
Nvidia announcement tomorrow?

penus penus penus
Nov 9, 2014

by piss__donald

Bleh Maestro posted:

Nvidia announcement tomorrow?

Everybody seems to be expecting it. High-value demo units were spotted being shipped into the conference.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Bleh Maestro posted:

Nvidia announcement tomorrow?

Yeah, Jen-Hsun Huang is giving a keynote at noon tomorrow. I think that's when it'll happen, if they're announcing at GTC.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Sweet, can't wait to see the new Quadro. (Insert joke about it being the new Maxwell M5500)

penus penus penus
Nov 9, 2014

by piss__donald

xthetenth posted:

Sweet, can't wait to see the new Quadro. (Insert joke about it being the new Maxwell M5500)

That thing uses 150 watts, lol. The power brick on that laptop would be crazy.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Truga posted:

I tried doing this and it wouldn't work, because ASUS in all their wisdom decided not to wire the onboard GPU up to anything at all, probably thinking "this is an xfire/SLI mobo, surely nobody will use the Haswell GPU for anything??"
That doesn't seem right, what board?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Hey Durinia: is it any cheaper to make a die-shrink mask instead of a whole new one? (ignoring the engineering/debugging costs of a new architecture)

In particular I was wondering if there would be any benefit to making the lower-end workstation chips simple die-shrinks of Kepler, while focusing their architectural work on a big HBM chip (with DP capability, disabled on the graphics variants) and a smaller graphics-oriented chip a la Maxwell. But it's probably too costly to make chips on 3 entirely different architectures...

Anyone care to read the tea leaves on what gets announced tomorrow? (if anything)

Paul MaudDib fucked around with this message at 01:12 on Apr 5, 2016

EoRaptor
Sep 13, 2003

by Fluffdaddy

Paul MaudDib posted:

Hey Durinia: is it any cheaper to make a die-shrink mask instead of a whole new one? (ignoring the engineering/debugging costs of a new architecture)

In particular I was wondering if there would be any benefit to making the lower-end workstation chips simple die-shrinks of Kepler, while focusing their architectural work on a big HBM chip (with DP capability, disabled on the graphics variants) and a smaller graphics-oriented chip a la Maxwell. But it's probably too costly to make chips on 3 entirely different architectures...

Anyone care to read the tea leaves on what gets announced tomorrow? (if anything)

The problem here is the false notion of a 'die shrink'. While some features of a chip get smaller with each 'shrink', some don't. Straight-up universal shrinks stopped being a thing in the early 2000s; everything recent is tweaking some things to make them smaller and balancing them against things that cannot get any smaller (e.g. Vdc lines). That isn't even getting into the hassle of routing data around the chip, and how sensitive that is to changes in timing.

There are tools that can help a lot with process changes, but there is no 'push butan' software that can shrink a mask anymore.

EoRaptor fucked around with this message at 01:23 on Apr 5, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

My main prediction is a gap between big Pascal, with a bunch of whatever memory exists now, built to compete with Knights Landing, and two or three chips starting from the bottom. If it's three, it'll be interesting to see how that compares to AMD holding off on their second-biggest chip in order to get better memory on it.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Any hope the new video cards will come out for the launch of the new DOOM game? My current 660 Ti won't cut the mustard for that game, but I don't want to get a 970 or a 390 if Pascal and Polaris are right around the corner.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Ask again tomorrow.

Truga
May 4, 2014
Lipstick Apathy

Alereon posted:

That doesn't seem right, what board?

Eh, of course I googled it again just now and found the fix. When I bought this PC nobody knew what the hell was going on, and people just assumed it wouldn't work because ASUS is dumb, but now the OBS forum told me to enable "multi-monitor something or other" in the BIOS, and yeah, it started working.

So yeah, it kinda didn't work because ASUS makes a dumb BIOS with settings in weird places four menus deep, I guess.

Thanks for making me google this again :v:
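
For anyone else who hits this: on my board the setting lives somewhere around here, though the exact menu names vary by board and BIOS version, so treat the path as approximate:

  Advanced > System Agent (SA) Configuration > Graphics Configuration > iGPU Multi-Monitor > Enabled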

wolrah
May 8, 2006
what?

spasticColon posted:

Any hope the new video cards will come out for the launch of the new DOOM game? My current 660 Ti won't cut the mustard for that game, but I don't want to get a 970 or a 390 if Pascal and Polaris are right around the corner.

Doom runs like hell on my 970 as well, so I think there might be problems with it. I get 25-50 FPS at the default settings with a 4790K and 32GB of RAM. It doesn't use SLI, but neither the CPU nor the GPU was maxed out while it was failing to hit 60.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
It's just trying to be helpful and imitate the experience of playing it on a console so that you can know true pain.

penus penus penus
Nov 9, 2014

by piss__donald

wolrah posted:

Doom runs like hell on my 970 as well, so I think there might be problems with it. I get 25-50 FPS at the default settings with a 4790K and 32GB of RAM. It doesn't use SLI, but neither the CPU nor the GPU was maxed out while it was failing to hit 60.

Funny, since id demos that game on Nvidia hardware. I wouldn't worry too much; I doubt it'll run like that at release.

I got to play against the developers for 6 minutes. I got creamed by Carmack's son... who was using a 360 controller.

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

wolrah posted:

Doom runs like hell on my 970 as well, so I think there might be problems with it. I get 25-50 FPS at the default settings with a 4790K and 32GB of RAM. It doesn't use SLI, but neither the CPU nor the GPU was maxed out while it was failing to hit 60.

To be fair, Overwatch also ran like crap on a 980 Ti until a few patches ago.

NewFatMike
Jun 11, 2015

THE DOG HOUSE posted:

Funny, since id demos that game on Nvidia hardware. I wouldn't worry too much; I doubt it'll run like that at release.

I got to play against the developers for 6 minutes. I got creamed by Carmack's son... who was using a 360 controller.

Yeah, but that's kind of like Paul Atreides beating you at fortune-telling.

SlayVus
Jul 10, 2009
Grimey Drawer

THE DOG HOUSE posted:

Funny, since id demos that game on Nvidia hardware. I wouldn't worry too much; I doubt it'll run like that at release.

I got to play against the developers for 6 minutes. I got creamed by Carmack's son... who was using a 360 controller.

The game is also locked to 16:9, which sucks. Ultrawide Master Race!

GTC 2016 begins in 4 hours! Come on, Pascal!

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

SlayVus posted:

The game is also locked to 16:9, which sucks. Ultrawide Master Race!

I wish more games supported it well.

Panty Saluter
Jan 17, 2004

Making learning fun!
Is there a benefit to locking the aspect ratio? Other than hiding things you don't want seen in cutscenes, I guess.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

SlayVus posted:

The game is also locked to 16:9, which sucks. Ultrawide Master Race!

GTC 2016 begins in 4 hours! Come on, Pascal!

You can change the launch parameters to whatever resolution and FOV you want.
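
Something like this in the launch options - fair warning, I'm going off id Tech cvar conventions here, so treat the exact names as guesses and check a current guide:

  +set r_fullscreen 1 +set r_customWidth 3440 +set r_customHeight 1440 +set g_fov 110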

Still plays like poo poo though, because of mouse control problems.

Nodelphi
Jan 30, 2004

We are all quite capable of believing in anything as long as it's improbable.

Ham Wrangler
I apologize if this is the wrong place to ask this:

Currently I have a GeForce 970 that's running two 27" 1440p 165Hz panels, and it's really underpowered for this. I keep getting one monitor randomly going dark, and gaming performance is passable at best. I want to upgrade my graphics card; would it be better to run these monitors off a 980 Ti, or should I try 970s in SLI?

I'm a little nervous that the second monitor going dark is a symptom of an underlying graphics card hardware issue, as the monitor is fine when the connections are switched and it seems to come back after running re-detection a few times. So I'm not sure it's worth trying to keep using that 970 in an SLI build.

penus penus penus
Nov 9, 2014

by piss__donald

Nodelphi posted:

I apologize if this is the wrong place to ask this:

Currently I have a GeForce 970 that's running two 27" 1440p 165Hz panels, and it's really underpowered for this. I keep getting one monitor randomly going dark, and gaming performance is passable at best. I want to upgrade my graphics card; would it be better to run these monitors off a 980 Ti, or should I try 970s in SLI?

I'm a little nervous that the second monitor going dark is a symptom of an underlying graphics card hardware issue, as the monitor is fine when the connections are switched and it seems to come back after running re-detection a few times. So I'm not sure it's worth trying to keep using that 970 in an SLI build.

Good timing: in literally 15 minutes they're announcing the next generation of GPUs.

But in general, 970 SLI isn't a great option when the 980 Ti exists. There's a good chance used 980 Tis will be selling relatively cheap in the coming month as well.

NewFatMike
Jun 11, 2015

Go with the single-card solution - a 980 Ti is about on par with 970s in SLI on price and performance, you don't have to deal with the janky SLI stuff, and an absurd number of new releases don't even support it.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
A 980 Ti is equivalent to SLI 970s in both price and performance, and a single card is always preferable to SLI where possible, so sell your 970 and upgrade. But hold on: we may be getting new GPUs announced in a keynote speech that starts in literally 10 minutes.


repiv
Aug 13, 2009

Livestream is here: http://www.ustream.tv/channel/fWbQyaEMfbh
