HMS Boromir
Jul 16, 2011

by Lowtax

Grim Up North posted:

These articles all seem to quote from an EE Times interview, but yeah, it's a bit surprising that 4.0 AICs wouldn't work (at 3.0 speeds) in a 3.0 system. Anyway, it seems quite some time off and we'll know more on June 23.

Thanks. Am I missing something though, or was I correct in noticing that the quotes (and indeed the full text) don't actually support the idea that 4.0 AICs won't work in 3.0 slots?

Grim Up North
Dec 12, 2011

HMS Boromir posted:

Thanks. Am I missing something though, or was I correct in noticing that the quotes (and indeed the full text) don't actually support the idea that 4.0 AICs won't work in 3.0 slots?

No, you're right, I have no idea how you would get from

quote:

Gen 4 will use a new connector but the spec will be backward compatible mechanically and electrically with today’s 8GT Gen 3.

“We’ve done a lot of analysis on the connector -- we tried everything possible,” said Yanes. “We have some top engineers in our electrical work group and they’ve come through -- it’s exciting to see the amount of activity and participation,” he said.

to what that third-party site said. I'd assume that we'll get the same kind of backwards compatibility (i.e. full) we've always had.
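
For reference, the per-lane numbers in play, straight from the published encoding schemes (so just arithmetic, no speculation):

code:

# Per-lane PCIe throughput by generation. Gen 1/2 use 8b/10b encoding,
# Gen 3/4 use 128b/130b; Gen 4 just doubles the signaling rate again.
gens = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}
for gen, (gt_s, eff) in sorted(gens.items()):
    gb_s = gt_s * eff / 8  # GT/s -> GB/s per lane, per direction
    print(f"Gen {gen}: {gb_s:.2f} GB/s per lane, x16 = {gb_s * 16:.1f} GB/s")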

Grim Up North fucked around with this message at 19:43 on Mar 26, 2016

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Are all these leaks leading to a sinking ship named Polaris and Vega!? (I can do a better job of writing clickbait, WCCFtech; fund me)

People are flipping their poo poo that Polaris appears to share the same driver code path that Fiji/Tonga does, as if that confirms a rebrand. Vega is 4096 shaders though, so all its performance improvements rely on improved µarch.

Rukus
Mar 13, 2007

Hmph.

FaustianQ posted:

People are flipping their poo poo that Polaris appears to share the same driver code path that Fiji/Tonga does, as if that confirms a rebrand. Vega is 4096 shaders though, so all its performance improvements rely on improved µarch.

They're probably all worried about this being yet another round of rebrands except for the halo card. I mean, AMD has been stretching it pretty thin. Four rebrands (five if you count the HD8000 series) of a chip design from 2011, with some minor improvements each generation.

I can see where they're coming from, especially when today's "mid-range" prices are the same as what "high-end" was back in '09-'12. They're expecting more performance for their dollar.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FaustianQ posted:

People are flipping their poo poo that Polaris appears to share the same driver code path that Fiji/Tonga does, as if that confirms a rebrand. Vega is 4096 shaders though, so all its performance improvements rely on improved µarch.

It's possible they've made some substantial uarch improvements. I mean, none of the GCN revisions particularly improved on speed. Generation 3 (Tonga) did a little bit, but basically it's pretty close to the same. 2048 Gen 2 cores are roughly the same performance as 1792 Gen 3 cores, so it's ~15% faster. They gotta improve at some point.
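
That per-core claim is easy to sanity-check:

code:

# If 2048 Gen 2 cores tie 1792 Gen 3 cores, Gen 3's per-core uplift is:
print(f"{2048 / 1792 - 1:.1%}")  # 14.3%, i.e. roughly the ~15% quoted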

Still though, like I keep saying - don't count your chickens before they hatch. AMD and NVIDIA have to milk this node for at least a couple years, they're not going to go all the way on the first date. I think we'll get a solid 30-50% bump out of this generation, but we probably won't get 100% until the big chips hit the market.

Paul MaudDib fucked around with this message at 03:02 on Mar 27, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rukus posted:

They're probably all worried about this being yet another round of rebrands except for the halo card. I mean, AMD has been stretching it pretty thin. Four rebrands (five if you count the HD8000 series) of a chip design from 2011, with some minor improvements each generation.

I can see where they're coming from, especially when today's "mid-range" prices are the same as what "high-end" was back in '09-'12. They're expecting more performance for their dollar.

I liked when they rebranded the 290 series into the 390 series. They literally expect you chumps to pay an extra $100-150 for the exact same chip. Oh right, and a slight factory OC and some more VRAM that's useless for everything except 8K slideshows. I actually suspect the reason they switched was that it saved them money by moving production to chips with higher production volume.

Even though their prices have, if anything, trended slightly upward due to pressure from the 390, the 290 series remains a killer value on the secondary market for anyone willing to open a control panel and move a slider to the right. For $250 you can get an aftermarket 290X that'll slot right between the performance of a 970 and a 980.

Paul MaudDib fucked around with this message at 02:59 on Mar 27, 2016

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Paul MaudDib posted:

I liked when they rebranded the 290 series into the 390 series. They literally expect you chumps to pay an extra $100-150 for the exact same chip. Oh right, and a slight factory OC and some more VRAM that's useless for everything except 8K slideshows. I actually suspect the reason they switched was that it saved them money by moving production to chips with higher production volume.

Even though their prices have, if anything, trended slightly upward due to pressure from the 390, the 290 series remains a killer value on the secondary market for anyone willing to open a control panel and move a slider to the right. For $250 you can get an aftermarket 290X that'll slot right between the performance of a 970 and a 980.

The best thing is that it worked.

The real improvement to the 390 was that it shipped to reviewers with aftermarket coolers, which meant it reviewed great. That was enough to make a huge improvement in its public perception. The 290 got dragged along, since, after all, it's basically a 390, right?

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Just saying, though, that the 2304 shaders @ 800MHz are currently meeting/beating a 290X's 2816 shaders @ 1100MHz, and it'll clock higher than current GCN (word of God, not some PR rep). That's not a 15% improvement; that's above a 40% improvement per core, although this could be related to the overhaul elsewhere. So again, I'm not sure why people are freaking out; this isn't a shrink of GCN3, it's not Tiny Tonga, christ. I'll definitely dig my teeth into full P10; it'll beat the snot out of my 290X for power draw and performance, and I'll be on 1440p for at least three years if not longer.
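
To show where that per-core figure comes from (leak numbers, not benchmarks, so season to taste):

code:

# Equal performance from fewer shaders at a lower clock implies more work
# done per shader-clock. Numbers are the rumored ones, not measurements.
hawaii  = 2816 * 1100  # 290X: shaders x MHz
polaris = 2304 * 800   # rumored Polaris 10 part
print(f"implied per shader-clock uplift: {hawaii / polaris - 1:.0%}")  # ~68%, comfortably "above 40%"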

So Vega (64, 56, 48, 44), P10 (40, 36, 32, 28), P11 (24, 20, 16, 12)? I just get the impression that new node + attempting to price for volume sales means AMD is using every last drat chip coming out of fabrication, even if they have to cut them down quite a bit to make them functional. Although a 12CU part might be questionable depending on how APUs shake out.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Also remember that Hawaii beats the everloving poo poo out of Fiji as far as performance per shader goes.

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe
Does AMD also have a video encoder solution that works with Steam? I'd be interested in that, assuming I switch back to using my iMac as my primary desktop again. It was handy having hardware H.264 encoding, since that makes it possible to stream from the Windows PC to the Mac. If not, I may just stick with using the Windows PC directly for gaming, and forget about streaming.

Salt n Reba McEntire
Nov 14, 2000

Kuparp.
This is how bafflingly bad AMD are.

Yes, they have hardware encoding. It's called VCE, it's entirely equal to NVidia's solution, and it has been supported by Steam for a long old while. Also supported by (a branch of, heh) OBS, and plenty of third party tools. I used it to run KinoConsole to stream Steam games to my tablet in bed when I had a bad case of everything-itis, and it was entirely satisfactory.

AMD haven't made any issue of it, or released their own software shadowplay equivalent with it, because ... uh.

They did fart out an apologetic collaboration with Plays.tv or whatever the hell, but that was about it. It doesn't seem appealing to install more gamer shovelware with pretty lights on it to make decent use of what should be a built-in feature. Then again, I also find Crimson and GeForce Experience equally obnoxious, so maybe I'm just getting old and cranky.

E: Although thinking about it, maybe they're doing us all a favour; the last thing anybody needs is more AMD software.

Salt n Reba McEntire fucked around with this message at 07:00 on Mar 27, 2016

Captain Yossarian
Feb 24, 2011

All new" Rings of Fire"
The AMD game recorder works really drat well, just FYI. I have a 970 now, but when I had my 290 it worked as well as Shadowplay in my experience

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!
Is the nvidia one good/worth using? I'm not a ~streamer~ but I'd like to record some things here and there.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
It's pretty decent but marred by being part of GeForce Experience(tm)

Panty Saluter
Jan 17, 2004

Making learning fun!
It's fine, I use it to capture all my sweet in-game pwnage moments

penus penus penus
Nov 9, 2014

by piss__donald
Yes, it's good, particularly for recording. It's sort of old news now, but there's no comparison to the previous methods. You can hardly tell it's happening and the result looks great; while it takes up a whole lot of space, it's nothing like, say, FRAPS, and it compresses well with Handbrake and the like.

I'd use the GFE beta so you can access the overlay, which is lag-free and convenient, unlike opening GFE itself.

... Or you can do what I do and record directly to Twitch for extreme laziness.

penus penus penus fucked around with this message at 19:49 on Mar 27, 2016

breaks
May 12, 2001

You can also just record to disk instead of streaming with OBS if you want to avoid the GeForce "Experience." You'll have to do a little more configuration work to set up a scene and get it to encode with the card, but it's not rocket science and there are a bunch of tutorials out there.
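
If you'd rather script the encode than click through OBS settings, something like this also works; a rough sketch, assuming an ffmpeg build with NVENC support on your PATH, with placeholder filenames:

code:

# Re-encode a fat lossless capture on the GPU instead of the CPU.
# h264_nvenc is NVIDIA's hardware H.264 encoder wrapper in ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "capture.mkv",    # placeholder input, e.g. an OBS recording
    "-c:v", "h264_nvenc",   # hand the H.264 encode to the card
    "-b:v", "8M",           # target bitrate, tune to taste
    "-c:a", "copy",         # leave the audio untouched
    "output.mp4",
], check=True)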

kode54
Nov 26, 2007

aka kuroshi
Fun Shoe

Moogle posted:

This is how bafflingly bad AMD are.

Yes, they have hardware encoding. It's called VCE, it's entirely equal to NVidia's solution, and it has been supported by Steam for a long old while. Also supported by (a branch of, heh) OBS, and plenty of third party tools. I used it to run KinoConsole to stream Steam games to my tablet in bed when I had a bad case of everything-itis, and it was entirely satisfactory.

Ah. My previous lower-spec card would have had VCE 1.0, and when I tried gaming with Steam In-Home Streaming, it tended to prefer software encoding over VCE. I've also had cases with my GTX 960 where some OpenGL software will randomly prefer to offload the video encoding to the QuickSync encoder instead of using NVEnc. The OpenGL software was pulling 60fps+ on the attached display, and stuttering like mad on the Mac's screen.

Suburban Dad
Jan 10, 2007


Well what's attached to a leash that it made itself?
The punchline is the way that you've been fuckin' yourself
I'm thinking about a GTX 970 but I don't know poo poo about the equivalent AMD card(s) so I am curious. I'm looking to get an FPS improvement over the 750 Ti that I've got now for 3 screen gaming (iracing, if it matters). Seems they're around $275 used or so, $330ish new. What's AMD have around this level for the same price and which would you recommend for what I'm doing? I stupidly bought the 750 less than a year ago because I wanted something cheap that would run 3 screens, but it seems it was almost the bare minimum of cards that could do it.

Edit: started looking a little further and I should specify I have a 500W PSU.

Suburban Dad fucked around with this message at 13:48 on Mar 28, 2016

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Larrymer posted:

I'm thinking about a GTX 970 but I don't know poo poo about the equivalent AMD card(s) so I am curious. I'm looking to get an FPS improvement over the 750 Ti that I've got now for 3 screen gaming (iracing, if it matters). Seems they're around $275 used or so, $330ish new. What's AMD have around this level for the same price and which would you recommend for what I'm doing? I stupidly bought the 750 less than a year ago because I wanted something cheap that would run 3 screens, but it seems it was almost the bare minimum of cards that could do it.

Edit: started looking a little further and I should specify I have a 500W PSU.

A used non-reference (the kind with fans; don't get the blower type) R9 290 will keep up with a GTX 970 (slightly slower in some games, slightly faster in others, and it gets a big boost with DX12), and it'll run you around $200 on eBay. Get MSI, Asus, or Gigabyte only, as they'll all have 3-year transferable warranties that are still active.

Edit: 500W should be okay; the R9 290 hits diminishing returns on the overclock pretty quickly, so keep that and the CPU overclock reasonable and you won't overload the PSU.
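
Rough headroom math, using assumed ballpark worst-case draws rather than measurements:

code:

# Ballpark load on a 500 W PSU with a non-reference 290. All figures are
# assumed worst-case draws, not measurements; substitute your own parts.
gpu, cpu, rest = 275, 100, 75  # watts: factory-OC 290, quad-core, everything else
load = gpu + cpu + rest
print(f"~{load} W of 500 W ({load / 500:.0%}), so skip the big overclocks")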

Edit 2: here's an R9 390 for $280 on Newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125805&cm_re=r9_390-_-14-125-805-_-Product

It's the same as a 290 except it's brand new and comes with a free download of Shitman, a pointless 8GB of video RAM, a fresh 3-year warranty, and a passive cooling mode for when you're using the desktop, which will still probably never activate because the card still runs a bit too hot.

Zero VGS fucked around with this message at 14:04 on Mar 28, 2016

penus penus penus
Nov 9, 2014

by piss__donald
I'd probably do a little research before buying that specific card based on the :| reviews. But, they are GPU reviews and may be total BS.

dbcooper
Mar 21, 2008
Yams Fan
Hey all,

Am upgrading the display and graphics card for a work machine (Dell Precision T1700).

Usage is programming, backend development and some data analysis. Not for gaming (although I don't keep track of what he does up there). Budget is $200 (spending about $450-500 on monitor) but can increase if necessary or on the edge of an obviously superior choice.

System hardware:
Dell Precision T1700 Mini Tower, 365W TPM Chassis. [Mini Tower supports] one PCI Express x16 Gen 3 graphic card up to 150W (total for graphics)

Will get a 27" 1440p monitor. Probably one of the following:

ASUS MG279Q - :pcgaming: 144Hz, contingent on the price staying at $491 on Newegg
ASUS MX27AQ
ASUS PB278Q - Reddit recommends this one

Would like a graphics card that can easily drive 2 displays, preferably up to 4K, but for the moment I'm only planning on two: a 1440p and a 1080p.

Would something like this EVGA GeForce GTX 950 SC GAMING be reasonable? I still need to check the case interior for clearance and power plug availability but I like the number of display ports and refresh rate.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

dbcooper posted:

Would like a graphics card that can easily drive 2 displays, preferably up to 4K, but for the moment I'm only planning on two: a 1440p and a 1080p.

Would something like this EVGA GeForce GTX 950 SC GAMING be reasonable? I still need to check the case interior for clearance and power plug availability but I like the number of display ports and refresh rate.

I'm the guy from the Monitor thread, but yeah, the 950 is a pretty reasonable pick as long as you don't need the special ISV-certified drivers for stuff like CAD work. Otherwise you get to buy the big-boy card for 5x as much.

It'll totally drive a couple 4K or 1440p displays though.
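
The link-bandwidth math backs that up; rough numbers, with an assumed ~20% blanking allowance rather than exact timings:

code:

# Uncompressed bandwidth needed per display vs. one DisplayPort 1.2 link
# (HBR2: 17.28 Gbit/s usable). The blanking factor is an approximation.
def gbit_per_s(w, h, hz, bpp=24, blanking=1.20):
    return w * h * hz * bpp * blanking / 1e9

for name, mode in [("1440p60", (2560, 1440, 60)), ("4K60", (3840, 2160, 60))]:
    print(f"{name}: needs ~{gbit_per_s(*mode):.1f} of 17.3 Gbit/s")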

penus penus penus
Nov 9, 2014

by piss__donald
I'd question paying for 144Hz for coding imo, but based on the requirements a whole lot of cards can handle that. A 950 for sure.

Josh Lyman
May 24, 2009


THE DOG HOUSE posted:

I'd question paying for 144Hz for coding imo, but based on the requirements a whole lot of cards can handle that. A 950 for sure.
Screen tearing in your IDE is serious business.

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy
Heads up in case anyone else has trouble: after updating my AMD drivers to 16.1.1 or 16.3.1 with a 7970, enabling 'Frame Rate Target Control' fucks my poo poo up with freezes that I have to hard reboot through. Light testing after disabling it and everything seems OK.

'Course I installed Rift and updated both audio and video drivers in one go, then didn't try to game till later, so it took a few hours and a million reboots (thank god for SSDs!) to rip back through everything and find exactly what the trigger was. :pcgaming:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Josh Lyman posted:

Screen tearing in your IDE is serious business.

Yeah, I ran an ultrawide off HDMI at 50 Hz for a while and have no complaints.

Although a monitor pulling double duty for work and gaming is a possibility.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
30 Hz, on the other hand, is loving murder on your eyes even when staring at a screen full of text because the instant you scroll your eyes will mutiny against your eye sockets.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

THE DOG HOUSE posted:

I'd probably do a little research before buying that specific card based on the :| reviews. But, they are GPU reviews and may be total BS.

I always take reviews with a grain of salt; I've seen good stuff get dinged hard because of idiots who don't know how to set up or use something new and shiny. It's usually the stupidest, most tech-illiterate people who bitch the loudest, at least in my experience :shrug:

NewFatMike
Jun 11, 2015

necrobobsledder posted:

30 Hz, on the other hand, is loving murder on your eyes even when staring at a screen full of text because the instant you scroll your eyes will mutiny against your eye sockets.

Oh my god. I had no idea what that was. I thought I was going crazy.

penus penus penus
Nov 9, 2014

by piss__donald

Ozz81 posted:

I always take reviews with a grain of salt; I've seen good stuff get dinged hard because of idiots who don't know how to set up or use something new and shiny. It's usually the stupidest, most tech-illiterate people who bitch the loudest, at least in my experience :shrug:

Yes, but they were a bit more pointed than usual. I agree they're usually BS, though. Sometimes they aren't (MSI 280X).

edit: that is to say, they're usually not BS so much as not representative

penus penus penus fucked around with this message at 21:09 on Mar 29, 2016

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT
:agreed: I've seen quite a few with bad reviews; most recently there was a Gigabyte 970 card sized down for SFF computers that was getting horrid reviews, even in threads here. Cards dying, fan problems, artifacting, coil whine, you name it. Otherwise, if reviews are mixed or mostly positive, I'll check other review sites for the exact brand/model and look at actual extensive, tested reviews versus reading about Joe Shmoe who probably can't tie his shoelaces properly, let alone use a PC.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
I'm always hesitant to post something like this, as I feel like I'm invoking a curse, but I haven't experienced any coil whine from my 290 except when running 3DMark tests at crazy FPS. I was using a PC with a 980 installed the other day and it seemed like it triggered constantly; not sure if the user just couldn't hear it or didn't care, but it drove me crazy. Also, my MSI 280X worked great, but it was just a little too underwhelming coming from a 6970.

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant.

edit: alternatively, how do I rewire this house

Malloc Voidstar fucked around with this message at 09:24 on Mar 31, 2016

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS

Malloc Voidstar posted:

Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant.

edit: alternatively, how do I rewire this house

Get a UPS so it can't draw weird amounts of power from the wall.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Malloc Voidstar posted:

Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant.

edit: alternatively, how do I rewire this house

Fauxtool posted:

Get a UPS so it can't draw weird amounts of power from the wall.

Not all UPS devices will do that. You want one with Automatic Voltage Regulation (AVR), which will boost an undervoltage (i.e. the fridge kicked on and the lights in the room dimmed) so you get a consistent 120 Vrms.

An example: http://www.bestbuy.com/site/cyberpower-1000va-battery-back-up-system-black/3938835.p?id=1219609308930&skuId=3938835
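
In toy form, line-interactive AVR behaves something like this (thresholds and boost ratio are invented for illustration; real units publish their own boost/trim windows):

code:

# Toy model of line-interactive AVR: nudge sags/swells back toward nominal
# without switching to battery. Window and boost ratio are made-up values.
def avr_output(v_in, nominal=120.0, window=0.08, boost=1.12):
    if v_in < nominal * (1 - window):   # sag: engage the boost tap
        return v_in * boost
    if v_in > nominal * (1 + window):   # swell: engage the trim tap
        return v_in / boost
    return v_in                         # within the window: pass through

for v in (95, 105, 112, 120, 132):
    print(f"{v} V in -> {avr_output(v):.1f} V out")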

I only buy APC, but those tend to be more expensive.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Malloc Voidstar posted:

Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant.

edit: alternatively, how do I rewire this house

Get your outlets tested. You can do it with a multimeter.

Captain Hair
Dec 31, 2007

Of course, that can backfire... some men like their bitches crazy.
I spent £30 on a pretty solid extension cable that had built-in protection against certain things, and it solved all the brownouts shutting off my PC back in my old house. I didn't think it'd do anything, but I was pleasantly surprised.

Or swap out your lighting for LED bulbs; the reduced power draw might solve it?

Or get a small UPS, they are pretty cool and super handy to have.

Gwaihir
Dec 8, 2009
Hair Elf

Rukus posted:

I can see where they're coming from, especially when today's "mid-range" prices are the same as what "high-end" was back in '09-'12. They're expecting more performance for their dollar.

That's not true at all: in 2010 the GTX 470 launched at $350, $20 more than the $330 GTX 970. The main difference is that back then the halo performance cards were the lovely dual-chip, SLI-in-a-single-slot versions like the 4870X2, GTX 295, or GTX 590, versus the better single-chip, huge-die monsters we have at present.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
I had never heard of coil whine until I came into this thread. Is it really that big an issue?
