|
K8.0 posted:The real problem is the lovely way AMD handles it driver-side. As long as a display can do 50% to max refresh rate, they should be able to fill anything below that in by displaying frames multiple times (so for 32fps on a 120hz display with 60hz minimum, tell it to do 64hz and send each frame twice). It's hard to believe they haven't done that yet despite it being an obvious, simple solution to a problem that's been killing one of their premier technologies. I think you need a bit better than 50% of max to do it well; that's why they've only had that solution on screens that can do 2.5 times their minimum speed. That gives you a lot more margin to, among other things, react to not getting the next frame in time and follow it up with a better-timed frame. Kilonum posted:21:9 is great for movies and racing/driving games but not really worth it for much else. 16:9 is great for strategy games, movies that have that aspect ratio and laptops/obsolescent screens that don't have room to do two+ windows really right, but not worth it for much else unless that's the sacrifice you're making for high pixel pitch or high refresh rate. xthetenth fucked around with this message at 04:48 on Jul 24, 2016 |
# ? Jul 24, 2016 04:46 |
|
|
|
xthetenth posted:16:9 is great for strategy games, movies that have that aspect ratio and laptops/obsolescent screens that don't have room to do two+ windows really right but not worth it for much else unless that's the sacrifice you're making for high pixel pitch or high refresh rate. Or if you're Blizzard and have an inexplicable hatred of an aspect ratio for some loving reason.
|
# ? Jul 24, 2016 04:57 |
|
Gonkish posted:Or if you're Blizzard and have an inexplicable hatred of an aspect ratio for some loving reason. I-i-it's unfaaaaiiirrrrrrrr
|
# ? Jul 24, 2016 05:00 |
|
Haquer posted:I-i-it's unfaaaaiiirrrrrrrr It is unfair that shitgibbons like that are allowed to make major decisions on games made by talented people.
|
# ? Jul 24, 2016 05:36 |
|
Sayara posted:Where the hell do you get those at $300? Dell is 630-730€ here and XB is 900€, so I wouldn't mind buying one overseas for that big of a difference. AcerRecertified website/ebay store and Dell Outlet (with a coupon code).
|
# ? Jul 24, 2016 06:02 |
|
Just picked up this Powercolor 380X for about $138 after tax, Paypal promo, and rebate. Seems like the best price/performance upgrade from my 7850 running on an FX-8350 to keep me going for another few years at 1080p. Any issues I should know about with 380X's? I've also never used Powercolor before. e: I should add I realize it's not top of the line, but benchmarks make it look like a pretty significant upgrade to my 7850 which has served me very well so far. I was looking at 970s and 1060s but this was almost half the price, and I'm trying to get a cheap boost before a full system rebuild in about 3 years. Gray Matter fucked around with this message at 06:48 on Jul 24, 2016 |
# ? Jul 24, 2016 06:13 |
|
Guys I own a lovely 960 and all this *sync talk has given me a massive hard-on, would it be wiser to invest in a gsync monitor and upgrade my gpu in the future instead of switching to amd for a rx480 and a freesync monitor?
|
# ? Jul 24, 2016 06:39 |
|
Otakufag posted:Guys I own a lovely 960 and all this *sync talk has given me a massive hard-on, would it be wiser to invest in a gsync monitor and upgrade my gpu in the future instead of switching to amd for a rx480 and a freesync monitor? Well do you want to spend ~$700 or about ~$500?
|
# ? Jul 24, 2016 06:46 |
|
Gray Matter posted:Just picked up this Powercolor 380X for about $138 after tax, Paypal promo, and rebate. Seems like the best price/performance upgrade from my 7850 running on an FX-8350 to keep me going for another few years at 1080p. Any issues I should know about with 380X's? I've also never used Powercolor before. The problem where some people mysteriously have their 285/380/380Xs crash/hardlock the entire system. Some people report it on other AMD GPUs too, but it works fine for other people on similar hardware configurations. For all the poo poo NVIDIA has been getting about drivers lately, AMD has their own fair share of driver problems too.
|
# ? Jul 24, 2016 07:30 |
|
Paul MaudDib posted:The problem where some people mysteriously have their 285/380/380Xs crash/hardlock the entire system. Some people report it on other AMD GPUs too, but it works fine for other people on similar hardware configurations. I battled something similar to this for a few weeks with a 7970 which drivers read as a 280x (iirc functionally identical, just the 280 is 1ghz stock?). Black screens, system crashes, artifacting, and that was IF I could get my PC to boot past the windows logo where it would typically black screen the most. Disabling the iGPU in device manager in safe mode is what fixed it for me. This all only happened after upgrading from w7 to 10. I definitely think there are still plenty of driver bugs, especially as you go back to older cards.
|
# ? Jul 24, 2016 07:59 |
|
Paul MaudDib posted:The problem where some people mysteriously have their 285/380/380Xs crash/hardlock the entire system. Some people report it on other AMD GPUs too, but it works fine for other people on similar hardware configurations. I have a 380X and it's not a problem.
|
# ? Jul 24, 2016 08:56 |
|
BurritoJustice posted:Techpowerup has the MSI 1060 on average 13.6% faster than the 480 at 1080p. Techspot has the 1060 22% faster in Overwatch, which at 10 million accounts is likely to be played by a lot of people who buy the cards. I really dig that meme when there are actually custom 1060s worth buying at the $250 MSRP, while the MSI 1060 you mentioned also drew 35W less power than an Asus custom 480. Also the meme that AMD, the holy savior of our ~$200 GPUs, abandoned the high end out of their own choice to save us from evil Nvidia, not because of their inability to compete in that segment, and that they also never gave us incredibly overpriced poo poo like a $1000 A64 FX when they were king of the hill. *snicker*
|
# ? Jul 24, 2016 12:32 |
|
At the time the FX competed directly against the EE chips, also $1000. You keep on hating AMD tho.
|
# ? Jul 24, 2016 13:01 |
|
jm20 posted:At the time the FX competed directly against the EE chips, also $1000. You keep on hating AMD tho. You answered your own question, but keep on trolling tho. The market sure has been hating AMD for a long while now.
|
# ? Jul 24, 2016 13:55 |
|
K8.0 posted:The real problem is the lovely way AMD handles it driver-side. As long as a display can do 50% to max refresh rate, they should be able to fill anything below that in by displaying frames multiple times (so for 32fps on a 120hz display with 60hz minimum, tell it to do 64hz and send each frame twice). It's hard to believe they haven't done that yet despite it being an obvious, simple solution to a problem that's been killing one of their premier technologies. But they did. They call it Low Framerate Compensation. The monitor needs to have a top-end refresh rate equal to or greater than 2.5× that of the lowest refresh rate in its range; there's no other hardware requirement. HalloKitty fucked around with this message at 14:59 on Jul 24, 2016 |
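The arithmetic behind the frame-multiplication idea can be sketched in a few lines. This is an illustrative toy (the function name and failure handling are made up here, not AMD's actual driver logic): below the monitor's minimum variable refresh rate, repeat each frame enough times to land the effective rate back inside the window.

```python
import math

def lfc_target(fps, vrr_min, vrr_max):
    """Pick an effective refresh rate inside the VRR window by repeating frames.

    Illustrative sketch of the Low Framerate Compensation idea only,
    not AMD's driver code. Returns (effective_hz, repeats_per_frame),
    or None when the window is too narrow to compensate.
    """
    if fps >= vrr_min:
        return fps, 1  # already inside the window, no compensation needed
    repeats = math.ceil(vrr_min / fps)   # smallest multiple reaching the window
    target = fps * repeats
    if target > vrr_max:
        return None                      # multiplied rate overshoots the top end
    return target, repeats

# K8.0's example: 32 fps on a 60-120 Hz range -> send each frame twice at 64 Hz
print(lfc_target(32, 60, 120))
```

Note that with only a 2× max-to-min ratio there is zero slack if the next frame arrives late, which is presumably why AMD gates LFC behind the roughly 2.5× requirement.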
# ? Jul 24, 2016 14:03 |
|
wargames posted:I have a 380x and its not a problem. People have RMA'd their GPUs and the manufacturers can't reproduce the bug either. Yet plenty of people seem to have the issue. It's probably some kind of deeply-buried driver bug that only reveals itself on certain hardware/OS/monitor combinations.
|
# ? Jul 24, 2016 14:20 |
|
HalloKitty posted:But they did. They call it Low Framerate Compensation. The monitor needs to have a top end refresh rate 2.5× that of the lowest refresh rate in its range, there's no other hardware or requirement. Yeah, it's why I got the AOC 144hz FreeSync monitor instead of the ASUS or ViewSonic, as it supports LFC and does normal FreeSync down to about 35 fps instead of 48. Colors are a bit meh but it's incredible that you basically get a smooth picture unless you're sub-20 fps (which is basically unplayable anyway). Good monitor for only $200.
|
# ? Jul 24, 2016 14:34 |
|
spasticColon posted:Okay, who on here is going to buy a $1200 Titan X? I'm considering it in a vain attempt to drive my 3x 4k monitors.
|
# ? Jul 24, 2016 16:14 |
|
SwissArmyDruid posted:But are the ranges worth a drat? Because I am still dying for a 21:9 1440p freesync monitor that isn't curved (too many reports of the curve affecting line straightness in CAD work) that isn't the Acer whatever it is, and the lack of any new developments on that front has got me worried that 21:9 is a dead thing and I'm probably going to wind up with a couple of said Korean 27"ers. I don't work in CAD but I do use Photoshop, which is also impacted by dimensions and line-based work. When I first got my curved monitor I wondered if I blew it, because it did mess me up in Photoshop. The thing is, after 1-3 weeks your brain adjusts to the curve. I don't even really see it anymore unless I think about it, and my accuracy in Photoshop seems to have come back to what it was before. KillHour posted:I'm considering it in a vain attempt to drive my 3x 4k monitors. Well then you're good because it won't
|
# ? Jul 24, 2016 17:36 |
|
Yeah.. I know. I'm probably going to get a 1080 and run them all at 1080p with GPU scaling to 4K. Hopefully, by the time a GPU comes out that can drive them properly, either nVidia will support freesync (hah!) or AMD will have something worth buying on the high end.
|
# ? Jul 24, 2016 18:41 |
|
A single GPU to drive 3x 4k at, what, 60hz or better? Man, that will be a crazy time to be alive.
|
# ? Jul 24, 2016 19:11 |
|
Anyone else notice how all the 14/16nm cards from both companies support full HEVC (H.265) encoding in realtime, yet there's no first- or third-party software on the planet that can actually enable it yet?
|
# ? Jul 24, 2016 19:12 |
|
Paul MaudDib posted:AcerRecertified website/ebay store and Dell Outlet (with a coupon code). Thanks! I'll keep those in mind when I decide to get a new one.
|
# ? Jul 24, 2016 19:15 |
|
Zero VGS posted:Anyone else notice how all the 14/16nm cards from both companies support full HEVC h.265 encoding in realtime, yet there's no first or third party software on the planet that can actually enable it yet?
|
# ? Jul 24, 2016 19:15 |
|
So I read that freesync doesn't work for anything other than fullscreen, e.g. borderless windowed. True? Also that sucks. FaintlyQuaint posted:A single GPU to drive 3x 4k at, what, 60hz or better? Man that willl be a crazy time to be alive. drat right, they'll be three quarters of the way to 8K!
|
# ? Jul 24, 2016 19:20 |
|
Zero VGS posted:Anyone else notice how all the 14/16nm cards from both companies support full HEVC h.265 encoding in realtime, yet there's no first or third party software on the planet that can actually enable it yet? Aren't ASICs grand?
|
# ? Jul 24, 2016 19:37 |
|
Zero VGS posted:Anyone else notice how all the 14/16nm cards from both companies support full HEVC h.265 encoding in realtime, yet there's no first or third party software on the planet that can actually enable it yet? How would this be consumed, though? As far as I know neither youtube nor twitch will accept HEVC upload streams, and that's where real-time HEVC encoding of captured game output would make the most sense. Could actually do 1080p60 at 3.5mbit. If Twitch took HEVC I know OBS would get the feature supported quickly.
|
# ? Jul 24, 2016 19:42 |
|
Twerk from Home posted:How would this be consumed, though? As far as I know neither youtube nor twitch will accept HEVC upload streams, and that's where real-time HEVC encoding of captured game output would make the most sense. Could actually do 1080p60 at 3.5mbit. Streaming from one house to another would be a great use case. One of those $99 Skylake pocket PCs could do Steam In-Home Streaming or Nvidia GameStream or something.
|
# ? Jul 24, 2016 19:47 |
|
Kilonum posted:21:9 is great for movies and racing/driving games but not really worth it for much else. It's great in any FPS that supports it, RTS or war games where you can see more of the map, and it is incredible for productivity (programming, etc). It is better at everything than 16:9 except Overwatch
|
# ? Jul 24, 2016 19:53 |
|
Zero VGS posted:Anyone else notice how all the 14/16nm cards from both companies support full HEVC h.265 encoding in realtime, yet there's no first or third party software on the planet that can actually enable it yet? I believe NVIDIA's Gamestream uses it
|
# ? Jul 24, 2016 19:54 |
|
Zero VGS posted:Streaming from one house to another would be a great use case. One of those $99 Skylake pocket PCs could do Steam In-Home Streaming or Nvidia GameStream or something. I would be curious to see how the latency is with HEVC versus H264 for game streaming. H264 is pretty low on a wired network.
|
# ? Jul 24, 2016 19:56 |
|
Shrimp or Shrimps posted:I battled something similar to this for a few weeks with a 7970 which drivers read as a 280x (iirc functionally identical, just the 280 is 1ghz stock?). Black screens, system crashes, artifacting, and that was IF I could get my PC to boot past the windows logo where it would typically black screen the most. I had a similar problem after upgrading to Windows 10 with an R9 280, but I was completely unable to boot into Windows outside of safe mode. The problem was Windows automatically rolling back to a previous driver version, so I had to uninstall and reinstall drivers through the Device Manager. Regrettable fucked around with this message at 20:09 on Jul 24, 2016 |
# ? Jul 24, 2016 20:06 |
|
3.5 mbps for 1080p 60fps sounds sexy to me. I'm hovering around 11 mbps now iirc, and even though that's less than half my upload it still has a noticeable effect on gameplay https://developer.nvidia.com/nvidia-video-codec-sdk Nvidia released this a few days ago, maybe people can start working with it now penus penus penus fucked around with this message at 20:11 on Jul 24, 2016 |
# ? Jul 24, 2016 20:07 |
|
Lowen SoDium posted:I would be curious to see how the latency is with HVEC verses H264 for game streaming. H264 is pretty low on a wired network. I'd expect HEVC to have higher latency but better compression, right? It's far more complex and can use less bandwidth to get its job done, but there's more work to do on the encode and decode end.
|
# ? Jul 24, 2016 20:08 |
|
VP9 encoding is better
|
# ? Jul 24, 2016 20:43 |
|
I currently run a GTX 660 and someone told me that the GTX 1060 is a "980 killer" or whatever. Is the hype real? Anything I should know before ordering one of these? If it matters, I game from my couch on a 1080p television They're all out of stock on newegg so it's not like I'm ordering one this week or anything QuarkJets fucked around with this message at 20:52 on Jul 24, 2016 |
# ? Jul 24, 2016 20:49 |
|
The 1060 performs at or slightly above 980 levels (depends on the game), at a lower price point, with a lower TDP. It is a 980 killer through and through and a straight upgrade in every way over your current card.
|
# ? Jul 24, 2016 21:01 |
|
plus you can actually find it at $250, even if you need a coupon code, which is more than you can say about the other two Nvidia cards
|
# ? Jul 24, 2016 21:19 |
|
Gonkish posted:The 1060 performs at or slightly above 980 levels (depends on the game), at a lower price point, with a lower TDP. It is a 980 killer through and through and a straight upgrade in every way over your current card. So those of us with 980s should replace them then?
|
# ? Jul 24, 2016 21:26 |
|
|
|
SourKraut posted:So those of us with 980s should replace them then? Not really... the only things you'd be gaining are greater power efficiency and some of the architectural stuff like the VR poo poo.
|
# ? Jul 24, 2016 21:27 |