|
Grim Up North posted:These articles all seem to quote from an EE Times interview, but yeah it's a bit surprising that 4.0 AICs wouldn't work (at 3.0 speeds) in a 3.0 system. Anyways it seems quite some time off and we'll know more on June 23. Thanks. Am I missing something though, or was I correct in noticing that the quotes (and indeed the full text) don't actually support the idea that 4.0 AICs won't work in 3.0 slots?
|
# ? Mar 26, 2016 18:33 |
|
|
HMS Boromir posted:Thanks. Am I missing something though, or was I correct in noticing that the quotes (and indeed the full text) don't actually support the idea that 4.0 AICs won't work in 3.0 slots? No, you're right, I have no idea how you would get from quote:Gen 4 will use a new connector but the spec will be backward compatible mechanically and electrically with today’s 8GT Gen 3. to what that third-party site said. I'd assume that we'll get the same kind of backwards compatibility (i.e. full) we've always had. Grim Up North fucked around with this message at 19:43 on Mar 26, 2016 |
# ? Mar 26, 2016 19:40 |
|
Are all these leaks leading to a sinking ship named Polaris and Vega!? (I can do a better job of writing clickbait WCCFtech, fund me) People are flipping their poo poo that Polaris appears to share the same driver code path that Fiji/Tonga does, as if that means it's just a shrink of the existing µarch. Vega is 4096 shaders though, so all its performance improvements rely on improved µarch.
|
# ? Mar 27, 2016 02:18 |
|
FaustianQ posted:People are flipping their poo poo that Polaris appears to share the same driver code path that Fiji/Tonga does, as this is. Vega is 4096 shaders though, so all it's performance improvements rely on improved µarch. They're probably all worried about this being yet another round of rebrands except for the halo card. I mean, AMD has been stretching it pretty thin. Four rebrands (five if you count the HD8000 series) of a chip design from 2011 with some minor improvements upon each generation. I can see where they're coming from, especially when today's "mid-range" prices are the same as what "high-end" was back in '09-'12. They're expecting more performance for their dollar.
|
# ? Mar 27, 2016 02:39 |
|
FaustianQ posted:People are flipping their poo poo that Polaris appears to share the same driver code path that Fiji/Tonga does, as this is. Vega is 4096 shaders though, so all it's performance improvements rely on improved µarch. It's possible they've made some substantial uarch improvements. I mean, none of the GCN revisions particularly improved on speed. Generation 3 (Tonga) did a little bit, but basically it's pretty close to the same. 2048 Gen 2 cores are roughly the same performance as 1792 Gen 3 cores, so it's ~15% faster. They gotta improve at some point. Still though, like I keep saying - don't count your chickens before they hatch. AMD and NVIDIA have to milk this node for at least a couple years, they're not going to go all the way on the first date. I think we'll get a solid 30-50% bump out of this generation, but we probably won't get 100% until the big chips hit the market. Paul MaudDib fucked around with this message at 03:02 on Mar 27, 2016 |
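A quick sanity check on that per-core figure, sketched in Python (the 2048/1792 core counts are the ones quoted above; equal clocks and workload are assumed, which is the premise of the comparison):

```python
# Per-core comparison of GCN Gen 2 vs Gen 3, using the counts quoted above.
# Premise: 2048 Gen 2 cores deliver roughly the same total performance
# as 1792 Gen 3 cores.
gen2_cores = 2048
gen3_cores = 1792

# If the totals are equal, each Gen 3 core is doing gen2/gen3 times the work.
per_core_gain = gen2_cores / gen3_cores - 1
print(f"per-core gain: {per_core_gain:.1%}")  # ~14.3%, i.e. the ~15% above
```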
# ? Mar 27, 2016 02:41 |
|
Rukus posted:They're probably all worried about this being yet another round of rebrands except for the halo card. I mean, AMD has been stretching it pretty thin. Four rebrands (five if you count the HD8000 series) of a chip design from 2011 with some minor improvements upon each generation. I liked when they rebranded the 290 series into the 390 series. They literally expect you chumps to pay an extra $100-150 for the exact same chip. Oh right and a slight factory OC and some more VRAM that's useless for everything except for 8K slideshows. I actually suspect the reason they switched was because it saved them money by moving production to chips with higher production volume. Despite the fact that their prices have trended slightly upwards if anything due to pressure from the 390, the 290 series remains a killer value in the secondary market for anyone willing to open a control panel and move a slider to the right. For $250 you can get an aftermarket 290X that'll slot right between the performance of a 970 and a 980. Paul MaudDib fucked around with this message at 02:59 on Mar 27, 2016 |
# ? Mar 27, 2016 02:53 |
|
Paul MaudDib posted:I liked when they rebranded the 290 series into the 390 series. They literally expect you chumps to pay an extra $100-150 for the exact same chip. Oh right and a slight factory OC and some more VRAM that's useless for everything except for 8K slideshows. I actually suspect the reason they switched was because it saved them money by moving production to chips with higher production volume. The best thing is that it worked. The real improvement to the 390 was that it shipped to reviewers with aftermarket coolers that meant it reviewed great. That was enough to make a huge improvement to its public perception. The 290 got dragged along, since after all it's basically a 390 right?
|
# ? Mar 27, 2016 03:28 |
|
Just saying though that the 2304 shaders @ 800MHz are currently meeting/beating a 290X's 2816 shaders @ 1100MHz, and it'll clock higher than current GCN (word of God and not some PR rep). That's not a 15% improvement, that's above 40% improvement per core, although this could be related to the overhaul elsewhere, so again I'm not sure why people are freaking out, this isn't a shrink of GCN3, it's not Tiny Tonga, christ. I'll definitely dig my teeth into full P10, it'll beat the snot out of my 290X for power draw and performance and I'll be on 1440p for at least three years if not longer. So Vega (64, 56, 48, 44) P10 (40, 36, 32, 28) P11 (24, 20, 16, 12)? I just get the impression that new node + attempting to price for volume sales means AMD is using every last drat chip coming out of fabrication even if they have to cut them down quite a bit to make them functional. Although a 12CU part might be questionable depending on how APUs shake out.
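For what it's worth, running the numbers from that leak (shader counts and clocks as quoted; "meeting/beating" treated as roughly equal performance, which is the leak's premise, not a measured fact):

```python
# Per-shader-per-clock comparison using the figures quoted above.
# Premise: 2304 shaders @ 800 MHz (Polaris sample) roughly matches
#          2816 shaders @ 1100 MHz (R9 290X).
polaris_throughput = 2304 * 800   # shader count x clock (MHz), arbitrary units
hawaii_throughput = 2816 * 1100

gain = hawaii_throughput / polaris_throughput - 1
print(f"per-shader-per-clock gain: {gain:.0%}")  # ~68%, comfortably above 40%
```

Counting shaders alone (2816/2304) it's only ~22%, so the 40%+ figure only holds if the clock deficit is included as well.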
|
# ? Mar 27, 2016 05:18 |
|
Also remember that Hawaii beats the everloving poo poo out of Fiji as far as performance per shader goes.
|
# ? Mar 27, 2016 05:26 |
|
Does AMD also have a video encoder solution that works with Steam? I'd be interested in that, assuming I switch back to using my iMac as my primary desktop again. It was handy having hardware H.264 encoding, since that makes it possible to stream from the Windows PC to the Mac. If not, I may just stick with using the Windows PC directly for gaming, and forget about streaming.
|
# ? Mar 27, 2016 06:35 |
|
This is how bafflingly bad AMD are. Yes, they have hardware encoding. It's called VCE, it's entirely equal to NVidia's solution, and it has been supported by Steam for a long old while. Also supported by (a branch of, heh) OBS, and plenty of third party tools. I used it to run KinoConsole to stream Steam games to my tablet in bed when I had a bad case of everything-itis, and it was entirely satisfactory. AMD haven't made any issue of it, or released their own software shadowplay equivalent with it, because ... uh. They did fart out an apologetic collaboration with Plays.tv or whatever the hell, but that was about it. Doesn't seem appealing to install more gamer shovelware with pretty lights on it to make decent use of what should be a built in feature. Then again I also find Crimson and GeForce Experience equally obnoxious so maybe I'm just getting old and cranky. E: Although thinking about it, maybe they're doing us all a favour; the last thing anybody needs is more AMD software. Salt n Reba McEntire fucked around with this message at 07:00 on Mar 27, 2016 |
# ? Mar 27, 2016 06:50 |
|
The AMD game recorder works really drat well, just FYI. I have a 970 now, but when I had my 290 it worked just as well as shadowplay in my experience
|
# ? Mar 27, 2016 17:15 |
|
Is the nvidia one good/worth using? I'm not a ~streamer~ but I'd like to record some things here and there.
|
# ? Mar 27, 2016 19:22 |
|
It's pretty decent but marred by being part of GeForce experience(tm)
|
# ? Mar 27, 2016 19:45 |
|
It's fine, I use it to capture all my sweet in-game pwnage moments
|
# ? Mar 27, 2016 19:46 |
|
Yes it's good, particularly for recording. It's sort of old news now but there is no comparison to the previous methods. You can hardly tell it's happening and the result looks great, and while it takes up a whole lot of space it's nothing like, say, FRAPS, and it compresses well with handbrake and the like. I'd use GFE beta so you can access the overlay which is lag free and convenient, unlike opening GFE itself still. ... Or you can do what I do and record directly to twitch for extreme laziness. penus penus penus fucked around with this message at 19:49 on Mar 27, 2016 |
# ? Mar 27, 2016 19:47 |
|
You can also just record to disk instead of streaming with OBS if you want to avoid the GeForce "Experience." You'll have to do a little more configuration work to set up a scene and get it to encode with the card, but it's not rocket science and there are a bunch of tutorials out there.
|
# ? Mar 27, 2016 23:34 |
|
Moogle posted:This is how bafflingly bad AMD are. Ah. My previous lower spec card would have VCE 1.0, and when I tried gaming with Steam In-Home Streaming, it tended to prefer software encoding over the VCE. I've also had cases with my GTX 960 where some OpenGL software will randomly prefer to offload the video encoding to the QuickSync encoder instead of using NVEnc. The OpenGL software was pulling 60fps+ on the attached display, and stuttering like mad on the Mac's screen.
|
# ? Mar 28, 2016 05:37 |
|
I'm thinking about a GTX 970 but I don't know poo poo about the equivalent AMD card(s) so I am curious. I'm looking to get an FPS improvement over the 750 Ti that I've got now for 3 screen gaming (iracing, if it matters). Seems they're around $275 used or so, $330ish new. What's AMD have around this level for the same price and which would you recommend for what I'm doing? I stupidly bought the 750 less than a year ago because I wanted something cheap that would run 3 screens, but it seems it was almost the bare minimum of cards that could do it. Edit: started looking a little further and I should specify I have a 500w PSU. Suburban Dad fucked around with this message at 13:48 on Mar 28, 2016 |
# ? Mar 28, 2016 13:42 |
|
Larrymer posted:I'm thinking about a GTX 970 but I don't know poo poo about the equivalent AMD card(s) so I am curious. I'm looking to get an FPS improvement over the 750 Ti that I've got now for 3 screen gaming (iracing, if it matters). Seems they're around $275 used or so, $330ish new. What's AMD have around this level for the same price and which would you recommend for what I'm doing? I stupidly bought the 750 less than a year ago because I wanted something cheap that would run 3 screens, but it seems it was almost the bare minimum of cards that could do it. A used non-reference (the kind with fans; don't get the blower type) R9 290 will keep up with a GTX 970 (slightly slower in some games slightly faster in others, and gets a big boost with DX12), and it'll run you around $200 on eBay. Get MSI, Asus, or Gigabyte only, as they'll all have 3-year transferable warranties that are still active. Edit: 500w should be okay, R9 290 hit diminishing returns on the overclock pretty quickly so keep that and CPU overclock reasonable and you won't overload the PSU. Edit 2: here's an R9 390 for $280 on Newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125805&cm_re=r9_390-_-14-125-805-_-Product It's the same as a 290 except it's brand new and comes with a free download of Shitman, a pointless 8GB of video RAM, a fresh 3 year warranty, and a passive cooling mode when you're using the desktop which will still probably never activate because the card still runs a bit too hot. Zero VGS fucked around with this message at 14:04 on Mar 28, 2016 |
# ? Mar 28, 2016 13:52 |
|
I'd probably do a little research before buying that specific card based on the :| reviews. But, they are GPU reviews and may be total BS.
|
# ? Mar 28, 2016 14:56 |
|
Hey all, Am upgrading the display and graphics card for a work machine (Dell Precision T1700). Usage is programming, backend development and some data analysis. Not for gaming (although I don't keep track of what he does up there). Budget is $200 (spending about $450-500 on monitor) but can increase if necessary or on the edge of an obviously superior choice. System hardware:
Will get a 27" 1440p monitor. Probably one of the following:
ASUS MG279Q - 144Hz, contingent on the price staying at $491 on Newegg
ASUS MX27AQ
ASUS PB278Q - Reddit recommends this one
Would like a graphics card that can easily drive 2 displays, preferably up to 4K but for the moment only planning on two, a 1440p and 1080p. Would something like this EVGA GeForce GTX 950 SC GAMING be reasonable? I still need to check the case interior for clearance and power plug availability but I like the number of display ports and refresh rate.
|
# ? Mar 29, 2016 00:27 |
|
dbcooper posted:Would like a graphics card that can easily drive 2 displays, preferably up to 4K but for the moment only planning on two, a 1440p and 1080p. I'm the guy from the Monitor thread, but yeah, the 950 is a pretty reasonable pick as long as you don't need the special ISV-certified drivers for stuff like CAD work. Otherwise you get to buy the big-boy card for 5x as much. It'll totally drive a couple 4K or 1440p displays though.
|
# ? Mar 29, 2016 03:39 |
|
I'd question paying for 144hz for coding imo, but based on the requirements a whole lot of cards can handle that. A 950 for sure.
|
# ? Mar 29, 2016 14:57 |
|
THE DOG HOUSE posted:I'd question paying for 144hz for coding imo, but based on the requirements a whole lot of cards can handle that. A 950 for sure.
|
# ? Mar 29, 2016 15:00 |
|
Heads up in case anyone else has trouble, after updating my AMD drivers to 16.1.1 or 16.3.1 with a 7970, enabling 'Frame Rate Target Control' fucks my poo poo - freezes that I have to hard reboot through. Light testing after disabling it and everything seems OK. 'Course I installed Rift and updated both audio and video drivers in one go then didn't try to game till later, taking a few hours and a million reboots (thank god for SSDs!) to rip back through everything to find exactly what the trigger was.
|
# ? Mar 29, 2016 17:00 |
|
Josh Lyman posted:Screen tearing in your IDE is serious business. Yeah, I ran an ultrawide off HDMI at 50 Hz for a while and have no complaints. Although double duty is a possibility.
|
# ? Mar 29, 2016 17:05 |
|
30 Hz, on the other hand, is loving murder on your eyes even when staring at a screen full of text because the instant you scroll your eyes will mutiny against your eye sockets.
|
# ? Mar 29, 2016 17:37 |
|
THE DOG HOUSE posted:I'd probably do a little research before buying that specific card based on the :| reviews. But, they are GPU reviews and may be total BS. I always take reviews with a grain of salt, I've seen good stuff get dinged hard because of idiots that don't know how to set up or use something new and shiny. It's usually the stupidest, most tech-illiterate people that bitch the loudest, at least in my experience
|
# ? Mar 29, 2016 18:48 |
|
necrobobsledder posted:30 Hz, on the other hand, is loving murder on your eyes even when staring at a screen full of text because the instant you scroll your eyes will mutiny against your eye sockets. Oh my god. I had no idea what that was. I thought I was going crazy.
|
# ? Mar 29, 2016 20:15 |
|
Ozz81 posted:I always take reviews with a grain of salt, I've seen good stuff get dinged hard because of idiots that don't know how to set up or use something new and shiny. It's usually the stupidest, most tech-illiterate people that bitch the loudest, at least in my experience Yes but they were a bit more pointed than usual. But I agree they are usually BS. Sometimes they aren't though (280x MSI) edit: that is to say they are probably not BS but rather not representative usually penus penus penus fucked around with this message at 21:09 on Mar 29, 2016 |
# ? Mar 29, 2016 20:57 |
|
I've seen quite a few with bad reviews, most recently there was a Gigabyte 970 card that was smaller for SFF computers and it was getting horrid reviews, even in threads here. Cards dying, fan problems, artifacting, coil whine, you name it. Otherwise, if they're mixed or mostly positive, I'll check other review sites online for the exact brand/model and look at actual extensive, tested reviews versus reading about Joe Shmoe who probably can't tie his shoelaces properly, let alone use a PC.
|
# ? Mar 30, 2016 17:09 |
|
I'm always hesitant to post something like this as I feel like I'm invoking a curse, but I haven't experienced any coil whine from my 290 except when running 3DMark tests at crazy FPS. Was using a PC with a 980 installed the other day and it seemed like it triggered constantly - not sure if the user just couldn't hear it or didn't care but it drove me crazy. Also my MSI 280X worked great but it was just a little too underwhelming coming from a 6970.
|
# ? Mar 31, 2016 00:15 |
|
Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant. edit: alternatively, how do I rewire this house Malloc Voidstar fucked around with this message at 09:24 on Mar 31, 2016 |
# ? Mar 31, 2016 09:18 |
|
Malloc Voidstar posted:Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant. get a ups so that it can't draw weird power amounts from the wall
|
# ? Mar 31, 2016 09:43 |
|
Malloc Voidstar posted:Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant. Fauxtool posted:get a ups so that it cant draw weird power amounts from the wall Not all UPS devices will do that. You want one with Automatic Voltage Regulation (AVR), which will boost an undervolt (i.e. the fridge kicked on and the lights in the room dimmed) so you get a consistent 120Vrms. An example: http://www.bestbuy.com/site/cyberpower-1000va-battery-back-up-system-black/3938835.p?id=1219609308930&skuId=3938835 I only buy APC, but those tend to be more expensive.
|
# ? Mar 31, 2016 12:59 |
|
Malloc Voidstar posted:Is there any way to make my 980 Ti use more power? As far as I can tell, at low utilization (idle) and high utilization (gaming) everything is fine, but in some range my desktop makes the lights in my room flicker (game alt-tabbed, or playing movie with madVR). It's very unpleasant. Get your outlets tested. You can do it with a multimeter.
|
# ? Mar 31, 2016 13:38 |
|
I spent £30 on a pretty solid extension cable that had built in protection against certain things and it solved all the brownouts shutting off my pc back in my old house. I didn't think it'd do anything but I was pleasantly surprised. Or swap out your lighting to LED bulbs and the reduced amount of power needed might solve it? Or get a small ups, they are pretty cool and super handy to have.
|
# ? Mar 31, 2016 14:17 |
|
Rukus posted:I can see where they're coming from, especially when today's "mid-range" prices are the same of what "high-end" was back in '09-'12. They're expecting more performance for their dollar. That's not true at all - in 2010 the GTX 470 launched at $350, $20 more than the $330 GTX 970. The main difference is back then the halo performance cards were the lovely dual chip SLI in a single slot versions like the 4870x2, GTX 295, or GTX 590 vs the better single chip huge die monsters we have at present.
|
# ? Mar 31, 2016 14:37 |
|
|
I have never heard of Coil Whine until I came into this thread. Is it really that big an issue?
|
# ? Mar 31, 2016 17:28 |