|
Cinara posted:The biggest reason for the huge FPS jump in DOOM is that AMD's OpenGL support was godawful and had huge CPU overhead. Vulkan cut the CPU overhead way down for everything, which ends up being a large boost for AMD cards but also a large boost for anything NVIDIA + lovely CPU. The link Beautiful Ninja posted above shows a great example, where a Titan X + i7-920 gets double the FPS under Vulkan. Dammit. Like a fool, I've invested money into a better CPU, thinking that a 1st-generation i7, despite being clocked at 4.2GHz, is just not enough CPU to drive my 980ti...
|
# ? Jul 12, 2016 00:40 |
|
|
# ? May 15, 2024 14:21 |
|
lDDQD posted:Dammit. Like a fool, I've invested money into a better CPU, thinking that a 1st-generation i7, despite being clocked at 4.2GHz, is just not enough CPU to drive my 980ti... On the other hand: You still have a 980ti, a very very good card.
|
# ? Jul 12, 2016 00:49 |
|
lDDQD posted:Dammit. Like a fool, I've invested money into a better CPU, thinking that a 1st-generation i7, despite being clocked at 4.2GHz, is just not enough CPU to drive my 980ti... It's still better lol
|
# ? Jul 12, 2016 01:21 |
|
sauer kraut posted:It's just a catchy name for OpenGL 5 to run away from decades of clutter, proprietary hacks and bad word of mouth. And Hitman 2016 (I believe).
|
# ? Jul 12, 2016 01:42 |
Hubis posted:And Hitman 2016 (I believe). Deus Ex: MD will come in August also.
|
|
# ? Jul 12, 2016 01:44 |
|
Is the only difference between this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814487248 and this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814487249 really clock speeds? I'm willing to pay the $20 since it's finally in stock.
|
# ? Jul 12, 2016 01:49 |
|
PunkBoy posted:Is really the only difference between this: Pretty much, and there's no real reason to pay more for an OC'd version of the same card when all GTX 10 series cards OC about the same, but if it's in stock, go for it.
|
# ? Jul 12, 2016 01:50 |
|
Beautiful Ninja posted:Pretty much, and there's no real reason to pay more for an OC'd version of the same card when all GTX 10 series cards OC about the same, but if its in stock, go for it. Awesome, thanks!
|
# ? Jul 12, 2016 01:58 |
|
Hieronymous Alloy posted:Deus Ex: MD will come August also. On that note, since it's a cross-platform game I wonder how much this will really matter. I missed DXHR on release, did it push the technical envelope at all? I'm perfectly happy with my 980ti, but my brother is thinking of upgrading his aging GTX460 and this would be a nice opportunity to hand down the 980ti for a GTX1080...
|
# ? Jul 12, 2016 03:10 |
Anti-Hero posted:On that note, since it's a cross-platform game I wonder how much this will really matter. I missed DXHR on release, did it push the technical envelope at all? All indications are that it's basically using a modded Hitman engine, so it'll most likely matter as much as it does for Hitman and no more.
|
|
# ? Jul 12, 2016 03:22 |
|
sauer kraut posted:It's just a catchy name for OpenGL 5 to run away from decades of clutter, proprietary hacks and bad word of mouth. Rise of the Tomb Raider just got an asynchronous DX12 update and that's a The Way It's Meant To Be Played game.
|
# ? Jul 12, 2016 04:01 |
|
Drakhoran posted:Rise of the Tomb Raider just got an asynchronous DX12 update and that's a The Way It's Meant To Be Played game. Deep within the furthest subterranean level of the NVidia Spire, a green light flickers on in the dark. "Drakhoran... noted" then flicks off, returning to its slumber state, which is powered by Pascal at a TDP unknown to their bitter enemy
|
# ? Jul 12, 2016 04:12 |
|
On the other hand, NVIDIA cards don't burn down your computer. Always a plus.
|
# ? Jul 12, 2016 04:30 |
|
Yeah, Nixxes is the one company I'd trust to pull it off without AMD's help at this point.
|
# ? Jul 12, 2016 04:32 |
|
Paul MaudDib posted:On the other hand, NVIDIA cards don't burn down your computer. Always a plus. Outside of the year 2010 anyway.
|
# ? Jul 12, 2016 04:33 |
|
Paul MaudDib posted:On the other hand, NVIDIA cards don't burn down your computer. Always a plus.
|
# ? Jul 12, 2016 04:38 |
|
I don't understand why grown-ups who look down on, say, console wars feel this need to embellish graphics card manufacturers. I enjoy laughing at the idea of RX480 fires, but I'm not going to pretend that AMD is Yugo; nor is NVidia an evil empire because they want you to pay more for what they believe and consumers agree is a premium product. At the end of the day, you're the guy who cares enough about framerates to spend 10% more and get 5% extra frames. I had a Galaxy 460 before my current 660, and I think it was a Fermi. I don't remember it being absurdly hot. Craptacular! fucked around with this message at 04:48 on Jul 12, 2016 |
# ? Jul 12, 2016 04:43 |
|
Sorry guys I'm just super high
|
# ? Jul 12, 2016 04:51 |
|
THE DOG HOUSE posted:Sorry guys I'm just super high So were the GTX 480 engineers when they decided it was OK to ship a ~300W GPU
|
# ? Jul 12, 2016 05:00 |
|
SwissArmyDruid posted:Vulkan *has* to be biased towards AMD, by virtue of being built upon Mantle's bones, right? Isn't that to be expected? Or is the delta just way too large? I'm not seeing bias, did you even see those results from the guy with the Nehalem + Titan X? It's been long documented that AMD's drivers have more CPU overhead. Remove some of that, and clearly AMD has more to gain.
|
# ? Jul 12, 2016 05:16 |
|
HMS Boromir posted:Yeah, barring unexpected developments, AdaptiveSync/FreeSync should slowly become a standard feature for monitors. The only reason it feels like it's a matter of "GPU-specific monitors" is because nVidia decided it wanted its own special snowflake version of adaptive sync, which carries a ridiculous price premium for no particularly good reason. You are vastly overestimating the impact of the whateversync monitor market, and vastly underestimating both the number of gamers who would rather just buy a better GPU to cap framerate at 60 than deal with it and the sheer penny-pinching cost-cutting of display manufacturers/OEMs. Almost a decade since the Q6600 and quad cores still haven't outnumbered duals in the Steam gaming demographic, so good luck weaning them off their crappy TN/VGA monitors, let alone the even slower-to-adopt segment which is everybody else.
|
# ? Jul 12, 2016 05:26 |
|
Craptacular! posted:I had a Galaxy 460 for before my current 660, and I think it was a Fermi. I don't remember it being absurdly hot. I think most refer to the 480: http://www.techspot.com/review/263-nvidia-geforce-gtx-480/page13.html techspot posted:Despite being a single GPU graphics card it used slightly more power than the dual-GPU Radeon HD 5970 at both idle and load, and was comparable to a pair of Radeon HD 5870 graphics cards in Crossfire mode.
|
# ? Jul 12, 2016 05:28 |
|
Craptacular! posted:
Where do you think this comic came from?
|
# ? Jul 12, 2016 05:41 |
|
Well, got my MSI Gaming X 1080, seems to have done pretty well in the lottery, but I really wish nvidia would fix their loving idle clock bug already. My 2139MHz overclock with 100% fans, which appeared to be stable in 3D, translated into 1430MHz in 2D mode, which was not stable and caused twitches or flashes in Firefox. Backing it down to 2114, or 1405 in 2D mode, fixed that. It's possible 2139 wasn't as stable as I thought, but I'll never get to know.

There's no way to force GPU Boost to one bin, is there? Or redefine the temperature ranges? Having a single degree Celsius of difference from tonight's higher ambient temperature drop me down a boost bin is annoying, especially when I know the card was stable in 3D at those slightly higher temperatures with a higher overclock.

At least MSI's cooler is nice and quiet, given the premium they charge; at 100% fans it's roughly comparable to the G1 Gaming 980Ti I had at 70% fans. At 70% I can barely tell it's there without headphones, but 70% fans is just not enough to hold the card at 2114 indefinitely tonight. I think if I ran the AC or opened up my case more it'd be fine. If I water cooled it'd be perfectly stable in the 2139 or 2126 boost bin, but I can't keep it there on air.
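For anyone confused by the bin talk: GPU Boost moves the core clock in fixed ~13 MHz steps, which is why 2139, 2126, and 2114 keep coming up as adjacent stops. A back-of-the-envelope sketch — the ~13 MHz bin size is the commonly reported figure, but the one-bin-per-degree mapping and the threshold below are my own illustrative assumptions, not NVIDIA's actual algorithm:

```python
BIN_MHZ = 13  # GPU Boost adjusts the core clock in ~13 MHz steps ("bins")

def boosted_clock_mhz(max_boost_mhz, temp_c, bin_drop_start_c=55):
    # Illustrative only: assume one bin is lost per degree C past a
    # hypothetical threshold; the real algorithm also weighs power and voltage.
    bins_dropped = max(0, temp_c - bin_drop_start_c)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boosted_clock_mhz(2139, 55))  # at the threshold: full 2139 boost
print(boosted_clock_mhz(2139, 56))  # one degree warmer, one bin down: 2126
```

So under this toy model a one-degree swing in ambient is enough to cost you a bin, which matches the complaint above.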
|
# ? Jul 12, 2016 05:44 |
|
Thinking of getting a fury x over a gtx 1070 for my 2560x1600 60hz monitor. The fury x is slightly cheaper, available, and seems to have improved substantially in benchmarks since its initial release. Plus DX12/Vulkan with async is benefiting AMD cards nicely. Only thing I'm worried about is the 4GB vram. It would be replacing 2 r9 290s which are just too hot/noisy in CF and I don't want to deal with CF anymore. What do you guys think?
|
# ? Jul 12, 2016 05:54 |
|
Has anyone encountered this problem? Shadowplay is recording at the wrong resolution. My game is set to use DSR at something like 3414x1440, which is a multiple of my native resolution, 2560x1080. It's running in fullscreen. Shadowplay is set to record at the in-game resolution. The video output is in 1680x1050, which is the resolution of one of my secondary monitors. How do I get it to not be retarded?
|
# ? Jul 12, 2016 06:11 |
|
Vulkan trip report on Doom with my GTX 1080 (at 4k). FPS seems to be locked at 60 just like on OpenGL, but it doesn't feel as smooth. Something about it just isn't quite right. Loading times went up a ton with it too. It does seem more stable though.
|
# ? Jul 12, 2016 06:17 |
|
PerrineClostermann posted:Where do you think this comic came from? There needs to be a reversed version of that for today. Regardless, I love that Intel made a tiny amount of progress in the first two panels.
|
# ? Jul 12, 2016 06:20 |
|
Twerk from Home posted:He's not alone, I found DOOM kinda crashy on a 2500K / R9-290. Are you trying to run the game with some settings set to Nightmare? I've heard that the game can just crash due to memory issues (either GPU or system) when some of the graphic settings are cranked fully.
|
# ? Jul 12, 2016 06:34 |
Deathreaper posted:Thinking of getting a fury x over a gtx 1070 for my 2560x1600 60hz monitor. The fury x is slightly cheaper, available and seems to have improved substantially in benchmarks since its initial release. Plus dx12/Vulcan with async is benefiting AMD cards nicely. Only thing I'm worried about is the 4GB vram. It would be replacing 2 r9 290s which are just too hot/noisy in CF and I don't want to deal with CF anymore. What do you guys think? Unfortunately the 4GB of VRAM will really hold things back at anything over 1080p.
|
|
# ? Jul 12, 2016 06:34 |
|
bull3964 posted:Vulkan trip report on Doom with my GTX 1080 (at 4k).
|
# ? Jul 12, 2016 06:40 |
|
How accurate is videocardz.com for news or rumours? I saw this article which supposedly has the specs for the 470 and 460. It also offhandedly mentions that a source says the 470 will launch at the end of July, and become available early August.
|
# ? Jul 12, 2016 06:53 |
|
PerrineClostermann posted:Where do you think this comic came from? All the times I've seen that comic I've never noticed the Intel turtle until now. And now I see it slowly moves! :3
|
# ? Jul 12, 2016 07:32 |
|
Pascal VS Polaris
|
# ? Jul 12, 2016 08:52 |
|
sauer kraut posted:It's just a catchy name for OpenGL 5 to run away from decades of clutter, proprietary hacks and bad word of mouth.
|
# ? Jul 12, 2016 08:53 |
|
I got some pretty significant performance improvements for Tomb Raider on my 980ti with the new Steam update + most recent Nvidia drivers. Typical performance drop areas like Geothermal Valley don't seem to hit my framerate as hard running DX12.
|
# ? Jul 12, 2016 08:55 |
|
Crossposting from SA Mart since it seems like a lot of cards are still on backorder: Looking to sell my barely used gtx 1070, product details on the newegg page here . Picked it up when rebuilding my PC and realized it's much more than I really need. Used for approximately 1 week. I can post pictures from home later in the morning. Looking to recoup cost, $410 shipped to the lower 48. I don't have PM, but can email questions to danames22 at gmail
|
# ? Jul 12, 2016 09:41 |
|
Thinking about that Zotac GTX 1080 AMP! Extreme some more, if its fans TURN OFF when under a benchmark program, you could set it to Manual 20% and forget about it? That's loving insane.
|
# ? Jul 12, 2016 09:45 |
|
I forgot how much I hate DisplayPort's loving hotplug detection. It's part of the spec where if you turn a monitor off it's treated as if it's been unplugged. I like to turn my secondary monitors off when watching something on my main screen, but with DisplayPort that causes the desktop to flicker and resize every time. Windows avoids doing this for the primary monitor, at least, or my monitor doesn't follow the worst parts of the spec.

The HDMI ports on my monitors are already full, so I need three DVI connections. On my GTX 1080 I've got the DVI port for one, I can use an HDMI to DVI cable for another monitor, and for the last monitor I'll try a DP->DVI cable, but I'm not too confident in that. It might end up being a dp -> hdmi -> EDID ghost (because they don't make these for DP, at least not yet) -> dvi frankenstein chain.

I hate computers. I wish I could blame this on Windows, but Mac and Linux have the exact same behaviour without any software fix, though you can work around it with nvidia's workstation cards. It's like they never thought people might use more than one monitor. At least 1200p60 is in the range of passive DP to HDMI/DVI converters.
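Why 1200p60 squeaks through a passive adapter: a passive DP++ dongle just re-drives the TMDS signal, so it inherits the 165 MHz single-link DVI/HDMI pixel-clock ceiling. A quick sanity check — the blanking figures below are the standard CVT reduced-blanking values for 1920x1200@60, but treat the sketch as illustrative since other modes use different blanking:

```python
SINGLE_LINK_TMDS_MHZ = 165  # pixel-clock limit for single-link DVI / passive DP++

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=35):
    # h_blank/v_blank defaults are the CVT reduced-blanking totals for
    # 1920x1200@60; don't reuse these numbers blindly for other modes.
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

clk = pixel_clock_mhz(1920, 1200, 60)
print(round(clk, 1), clk <= SINGLE_LINK_TMDS_MHZ)  # ~154.1 MHz, fits under 165
```

1440p60 or 1200p at higher refresh rates blows past the 165 MHz limit, which is why those need active adapters or dual-link.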
|
# ? Jul 12, 2016 12:15 |
|
|
|
If you have a dell monitor, you can sometimes correct the dumb "off=unplugged" poo poo somewhere in the OSD settings.
|
# ? Jul 12, 2016 12:28 |