|
SlayVus posted:Blizzard's stance on 21:9 resolutions is that it's an unfair advantage and they had at one point banned people in D3 for playing in 21:9. Even though they're going to add 21:9 to Overwatch, they have a long rear end history of blatantly saying they would never support it. Well it is, but I don't care lol
|
# ? Jul 15, 2016 04:53 |
|
SlayVus posted:Blizzard's stance on 21:9 resolutions is that it's an unfair advantage and they had at one point banned people in D3 for playing in 21:9. Even though they're going to add 21:9 to Overwatch, they have a long rear end history of blatantly saying they would never support it. And their "support" of 21:9 in Overwatch is hilariously spiteful (or just plain incompetent): they literally took the 16:9 image, cropped and zoomed it: Blue box is 16:9, the red box is 21:9. Note that the 21:9 image is not actually wider, it's literally just a cropped version of the 16:9. That's EXACTLY how it works in the game. Basically this is what they did: https://www.youtube.com/watch?v=lnb-gDujd5U Their excuse for this? "Oh we don't want to go above this arbitrary FOV number because we think it's 'unfair' (also we just don't want to)." Meanwhile, they fully support 144Hz monitors (though only after people rightfully bitched about the 60fps cap that was in one or two of the builds in beta), which is a considerably bigger (and more noticeable) advantage. Blizzard has a massive, unrelenting hate boner for 21:9 for some reason.
|
# ? Jul 15, 2016 04:56 |
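What Gonkish describes above is usually called Vert- scaling: the horizontal field of view stays fixed, and the vertical FOV shrinks as the screen gets wider, so a 21:9 player literally sees less than a 16:9 player. A quick sketch of the math behind that (my own illustration, not Blizzard's code; the 103° horizontal FOV is an assumed common FPS default):

```python
import math

def vertical_fov(horizontal_fov_deg, aspect_ratio):
    """Vertical FOV implied by a fixed horizontal FOV at a given aspect ratio."""
    h = math.radians(horizontal_fov_deg)
    return math.degrees(2 * math.atan(math.tan(h / 2) / aspect_ratio))

# Holding horizontal FOV fixed at 103 degrees:
print(round(vertical_fov(103, 16 / 9), 1))  # ~70.5 degrees at 16:9
print(round(vertical_fov(103, 21 / 9), 1))  # ~56.6 degrees at 21:9 -- cropped top and bottom
```

The wider ratio gets the same horizontal slice with noticeably less vertical view, which is exactly the "cropped 16:9" effect in the screenshots.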
|
THE DOG HOUSE posted:Well it is, but I dont care lol Well a 21:9 monitor only starts at $200. So, get some money scrub
|
# ? Jul 15, 2016 04:57 |
|
SlayVus posted:Well a 21:9 monitor only starts at $200. So, get some money scrub not in a good resolution
|
# ? Jul 15, 2016 04:59 |
|
THE DOG HOUSE posted:not in a good resolution If everyone switched to 21:9 Blizzard would be forced to actually support it, but not really. I like 21:9 in video games, but I like my new HTC Vive more. So much more to see, I barely notice the blackness.
|
# ? Jul 15, 2016 05:02 |
|
SlayVus posted:A GTX 1080 would work better for 3440x1440. Else you'll need to turn down your graphic settings as a 1080 barely hits 60 FPS in some games. Well yeah, a better card will work better. I don't really want to spend the extra though and I don't need Ultra everything. Should be a decent boost over what my 970 is doing for me.
|
# ? Jul 15, 2016 05:09 |
|
Let me tell you a secret: literally every Blizzard game sucks except for Diablo II
|
# ? Jul 15, 2016 05:11 |
|
SlayVus posted:If everyone switched to 21:9 Blizzard would be forced to actually support it, but not really. I like 21:9 in video games, but I like my new HTC Vive more. So much more to see, I barely notice the blackness. Nah, they'd just tell us that we're all wrong and bad and should kill ourselves and then go back to only supporting 640x480.
|
# ? Jul 15, 2016 05:19 |
|
Gonkish posted:Nah, they'd just tell us that we're all wrong and bad and should kill ourselves and then go back to only supporting 640x480. Bet they are still bitter we picked ipv4 over ipx.
|
# ? Jul 15, 2016 05:23 |
|
I'm fine with Blizzard being turds about 21:9 - it lets me have an excuse my friends will buy when I tell them I won't buy the game.
|
# ? Jul 15, 2016 05:26 |
|
i dont hate ultrawides per se but gently caress people who post full size screenshots from them. i have my 1600p usually displaying browser windows side by side and it makes their screenshots a tiny loving ribbon.
|
# ? Jul 15, 2016 05:59 |
|
Gonkish posted:Blizzard has a massive, unrelenting hate boner for 21:9 for some reason. Also, their method for ensuring that this limit doesn't get exceeded is equally ham-fisted: every resolution besides 16:9 gets letterboxed. Never mind that it's a trivial calculation to determine the camera distance to give equal in-game area for any aspect ratio. I could've programmed it in like 5 minutes. lDDQD fucked around with this message at 06:03 on Jul 15, 2016 |
# ? Jul 15, 2016 06:00 |
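For what it's worth, the calculation lDDQD calls trivial really is short. One possible version (my own sketch, assuming a simple perspective camera looking straight down at flat ground; the function name and the numbers in the example are made up for illustration):

```python
import math

def camera_distance(target_area, vertical_fov_deg, aspect_ratio):
    """Distance at which a perspective camera sees `target_area` of flat ground.

    Visible height at distance d is 2*d*tan(vfov/2), width is height * aspect,
    so area = 4 * d^2 * tan^2(vfov/2) * aspect. Solve for d.
    """
    t = math.tan(math.radians(vertical_fov_deg) / 2)
    return math.sqrt(target_area / (4 * t * t * aspect_ratio))

# Same on-screen ground area regardless of monitor shape:
d_169 = camera_distance(5000, 45, 16 / 9)
d_219 = camera_distance(5000, 45, 21 / 9)  # slightly closer: the wider frustum covers more
```

Pull the camera in a touch for wider ratios and everyone sees the same amount of map, just shaped differently.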
|
Founder's edition GPUs consistently clock higher, about 100MHz on average. Pretty amusing considering the amount of "founders just equals overpriced reference" shitfits.
|
# ? Jul 15, 2016 06:14 |
|
BurritoJustice posted:Founder's edition GPUs consistently clock higher. 100MHz on average. Pretty amusing considering the amount of "founders just equals overpriced reference" shitfits. My 1070FE does +150 on core and +450 on memory, I'm happy.
|
# ? Jul 15, 2016 06:24 |
|
There are two problems with that data. First, the largest limiting factor on overclocks and performance is the BIOS, and we can't flash unsigned custom BIOSes yet because of the extra protections Nvidia added to the new NVflash that supports Pascal. Second, I'm sure 3rd parties are using looser timings to allow for higher clocks, especially on their pre-overclocked boards, and that very easily explains the performance difference at the same clock speed. Since Kepler, vendors have been able to limit how much cards overclock by setting the amount of allowable voltage droop, and the drivers will induce a crash to prevent anything from going seriously wrong when the clock speed gets high enough to need enough current to cause that droop. It's what allows overclocking to be so supported now and covered under warranty, and it's very good for average users since it allows for safe messing around. The best thing we have so far for more realistic overclocking is that someone was nice enough to leak an "LN2" BIOS that Asus managed to get Nvidia to sign, and people have seen clocks into the 2,200 MHz range from cards that struggled like all others to break 2,100 MHz with their factory BIOSes. The results would be much more interesting if all of the cards were flashed with the same BIOS; either the cool unlocked LN2 one or an FE reference one are probably the two best starting points.
|
# ? Jul 15, 2016 07:04 |
|
Ak Gara posted:lol 6800k for a gaming machine. Doesn't the 6700k get higher game fps? Broadwell is the die-shrink of Haswell and has similar IPC, so the difference usually isn't huge assuming you get both of them to equivalent clocks, and some games don't care about Skylake's extra IPC at all. X99 is the overall better platform if you want to go whole-hog; for example, he could not get that graphics card running at x16 and also have his soundcard plugged in, because Skylake only has 16 PCIe lanes. Nor could he do SLI graphics with his soundcard plugged in, since you need 8 lanes for each GPU. Vulkan and DX12 are promising better multithreaded utilization soon, so HEDT is the smarter play long-term there. He may also be hoping to get into streaming or other video work, and Haswell-E destroys Skylake in those tasks. Also, he bought a Vive. VR has a lot of background tasks for positioning and so on, and the extra cores will be useful there too. I'm surprised he went with the 6800k instead of the 6850k or 6950X, given that he seemingly has an unlimited budget. But the low-end HEDT chips actually tend to overclock a little better, so that may be why. I'm also surprised he didn't go straight to SLI 1080s for the same reason, but SLI brings on its own set of problems. Overall it's a pretty solid high end build, although as someone who is not made of money I would have gone for the 1 TB Evo instead of the 2 TB Pro, and substituted some cheaper RAM and a Monoprice keyboard instead of the fancy stuff. Paul MaudDib fucked around with this message at 07:42 on Jul 15, 2016 |
# ? Jul 15, 2016 07:33 |
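Taking Paul MaudDib's lane math at face value (in practice a soundcard would usually hang off the chipset rather than the CPU's 16 lanes, but the budget idea is the same), the bookkeeping is just a sum against a fixed budget. A toy sketch, with the function and device list invented for illustration:

```python
SKYLAKE_CPU_LANES = 16  # PCIe 3.0 lanes off the CPU on mainstream Skylake

def fits(devices, budget=SKYLAKE_CPU_LANES):
    """Check whether a set of (name, lanes) devices fits within the CPU lane budget."""
    return sum(lanes for _, lanes in devices) <= budget

print(fits([("GPU", 16), ("soundcard", 1)]))  # over budget as the post frames it
print(fits([("GPU A", 8), ("GPU B", 8)]))     # SLI at x8/x8 exactly fills the budget
```

X99 CPUs offer 28 or 40 lanes, which is why the post treats them as the roomier long-term platform.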
|
Just because he has an unlimited budget doesn't mean he can't be practical.
|
# ? Jul 15, 2016 07:40 |
|
Ugh, my 1070 seems to be borked. I thought I had solved my problem by removing my Afterburner overclock and changing some power management settings, but it hard locked the system while playing Witcher 3 tonight and the system refused to boot outside of safe mode. After several failed boots, I did a Windows refresh and thought I was ok until I tried to reinstall the NVidia drivers. Back into the boot loop I went. I took apart the system, reseated the card, the PCI-E power plug, moved it to another x16 slot on the MB, and reseated the MB power plugs. No success after any of these measures. I even completely reinstalled Windows, no refresh. When I finally got frustrated and pulled it out of the system, it's magically stable running on the iGPU. Guess it's time for a Newegg RMA. Since I have to pay return shipping, anybody know the cheapest way? UPS, FedEx or USPS? I don't do a lot of shipping, mainly just receiving.
|
# ? Jul 15, 2016 08:23 |
|
Gonkish posted:And their "support" of 21:9 in Overwatch is hilariously spiteful (or just plain incompetent): they literally took the 16:9 image, cropped and zoomed it: Hah, I was wondering why my video jumped in views
|
# ? Jul 15, 2016 09:19 |
|
Peanut3141 posted:Ugh, my 1070 seems to be borked. I thought I had solved my problem by removing my Afterburner overclock and changing some power management settings, but it hard locked the system while playing Witcher 3 tonight and the system refused to boot outside of safe mode. After several failed boots, I did a Windows refresh and thought I was ok until I tried to reinstall the NVidia drivers. Back into the boot loop I went. Try booting into safe mode and then disabling your iGPU in device manager. If, like me, you're on an older platform (I'm still using a 3570k) and your mouse doesn't work in safe mode, then Windows Key+X to get to the menu and keyboard to navigate to device manager. I had this exact issue after upgrading to W10 on my 7970 and I still get it frequently when W10 decides to turn my iGPU back on (typically happens when it downloads an update). I went through like 2 weeks of restoring old images, uninstalling overclocks, etc. seeing if I could fix the issue until I stumbled across a post somewhere. It's just a guess but I think W10 was trying to output display over the iGPU when it's enabled. I'm not the only one who has had that issue either, on both the AMD and Nvidia side, but I haven't found any common thread between systems that exhibit the behavior. Shrimp or Shrimps fucked around with this message at 09:42 on Jul 15, 2016 |
# ? Jul 15, 2016 09:38 |
|
All this bitching about Blizzard/Overwatch aside, doesn't 21:9, or rather, anything non 16:9 get shafted in competitive games anyway? DotA2 gives less vision with 16:10 than with 16:9, and I think CS:GO is the same.
|
# ? Jul 15, 2016 10:16 |
|
what is "binning" he said without googling e: seriously idgi
|
# ? Jul 15, 2016 10:17 |
|
sout posted:what is "binning" he said without googling Not all chips are born equal, indeed a huge chunk of them come out with flaws that make them either unusable at full capacity (resulting in them getting sold as 970s instead of 980s or what have you) or unusable, period. Among chips that do pass quality testing, there's still usually a wide spread of quality, which is why you see some people overclocking the same CPU/GPU to high heaven that someone else can't get to work any higher than stock. "Binning" can refer to the process of figuring out how well a chip performs (IE if an Intel CPU is good enough to be an i5-6600K or if it has to be sold as an i5-6500, for example) or in this case, specifically the act of choosing the best chips and, apparently, selling them as Founder's Editions.
|
# ? Jul 15, 2016 10:28 |
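HMS Boromir's explanation boils down to: test each chip, then sort it into the highest bucket it qualifies for. A toy version of that sorting step (the bin names and clock cutoffs here are invented for illustration; real binning also weighs leakage, voltage, and which units of the die are defective):

```python
# Hypothetical bin cutoffs (MHz of max stable clock) -- purely illustrative.
BINS = [
    (2100, "top bin"),       # e.g. what might get skimmed off for a premium SKU
    (1900, "standard bin"),  # full-die part at stock-friendly clocks
    (1700, "salvage bin"),   # sold as a cut-down / cheaper SKU
]

def bin_chip(max_stable_clock_mhz):
    """Assign a tested chip to the highest bin whose cutoff it clears."""
    for cutoff, label in BINS:
        if max_stable_clock_mhz >= cutoff:
            return label
    return "rejected"

print(bin_chip(2150))  # clears the top cutoff
print(bin_chip(1650))  # fails every cutoff
```

Same test, different outcome per chip — which is why two "identical" cards can overclock so differently.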
|
teagone posted:Just because he has an unlimited budget doesn't mean he can't be practical. He could have done a lot worse... most celebrities would have got the top AMD cpu encrusted with $100k in Swarovskis...
|
# ? Jul 15, 2016 12:15 |
|
HMS Boromir posted:or in this case, specifically the act of choosing the best chips and, apparently, selling them as Founder's Editions. Hah! I was thinking something like that when Jayz' Zotac Amp! Extreme was slower than his FE. If I remember correctly the only 980 Ti's to actually get binned were EVGA's Classified, MSI's Lightning, and the Kingpin Edition? The thing about buying a non-binned card is it MIGHT be faster than a factory overclocked card. You just never know! Ak Gara fucked around with this message at 12:55 on Jul 15, 2016 |
# ? Jul 15, 2016 12:53 |
|
lDDQD posted:Probably carried over from the Starcraft II team. They were always obsessed with limiting the amount of in-game terrain that could be visible on screen at a time. Which, to be fair, would give you a competitive advantage... but I feel like the default setting is perhaps inappropriate for a game released in 2010, even. Previous games were forced to have a fairly narrow field of view due to technical limitations that no longer existed by the time SC2 was released. Not sure why they stubbornly stuck to it. In some people's minds, fighting with the UI is good game design, because you need to fight with the UI better than the other guy!!1 lolesport I quit playing zoom-locked RTSes over lovely zoom once I got my 30" screen, because they feel so claustrophobic it makes me physically uncomfortable. Thankfully, PA and soupcan exist.
|
# ? Jul 15, 2016 13:05 |
|
Paul MaudDib posted:I'm surprised he went with the 6800k instead of the 6850k or 6950X, given that he seemingly has an unlimited budget. But the low-end HEDP chips actually tend to overclock a little better, so that may be why. I'm also surprised he didn't go straight to SLI 1080s for the same reason, but SLI brings on its own set of problems. Overall it's a pretty solid high end build, although as someone who is not made of money I would have gone for the 1 TB Evo instead of the 2 TB Pro, and substituted some cheaper RAM and a Monoprice keyboard instead of the fancy stuff. Do you know if broadwell-e scales as well as skylake with faster ram? If I were playing with his budget I'd be buying some lightning fast memory because it really makes a big difference on a skylake platform as your GPU gets to the top end.
|
# ? Jul 15, 2016 13:13 |
|
Skuto posted:All this bitching about Blizzard/Overwatch aside, doesn't 21:9, or rather, anything non 16:9 get shafted in competitive games anyway? This seems like such a stupid thing for the game devs to get hung up on. Put a checkbox in the server setup "limit FOV". Now all the esports wankers can choose amongst themselves how to handle it and the rest of us who couldn't give half a gently caress about competitive gaming can just enjoy the game looking correct on our displays of choice.
|
# ? Jul 15, 2016 14:50 |
|
xthetenth posted:Do you know if broadwell-e scales as well as skylake with faster ram? If I were playing with his budget I'd be buying some lightning fast memory because it really makes a big difference on a skylake platform as your GPU gets to the top end. It seems unlikely since the X99 platform has quad channel memory with double the bandwidth of Z170 to start with.
|
# ? Jul 15, 2016 16:11 |
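The "double the bandwidth" point above is easy to sanity-check: peak theoretical DDR4 bandwidth is transfer rate × 8 bytes per channel × number of channels. A back-of-the-envelope sketch (DDR4-2400 on both platforms is my assumption for the example, not something from the thread):

```python
def ddr4_bandwidth_gbs(transfer_rate_mts, channels):
    """Peak theoretical bandwidth in GB/s: each channel has a 64-bit (8-byte) bus."""
    return transfer_rate_mts * 8 * channels / 1000

dual = ddr4_bandwidth_gbs(2400, 2)  # Z170-style dual channel
quad = ddr4_bandwidth_gbs(2400, 4)  # X99-style quad channel
print(dual, quad)  # 38.4 GB/s vs 76.8 GB/s
```

So at the same DIMM speed, quad channel starts with twice the headroom, which is why faster RAM tends to matter less on X99 than on Skylake.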
|
Gonkish posted:His whole system is nuts and I love Terry Crews. He also went ultrawide (he got an X34) for added badassery. It's cool that he was building it not only to game, but to bond with his son.
|
# ? Jul 15, 2016 16:17 |
|
Okay I got my Zotac 1070 Amp Extreme installed last night and everything is working great... But drat that card is heavy! I need to find a way to support the back end of the card in my tower case as it droops a bit. I have the case on its side for now. Any suggestions?
|
# ? Jul 15, 2016 16:28 |
|
Stick legos under it? Drooping is pretty normal
|
# ? Jul 15, 2016 16:34 |
|
Skuto posted:All this bitching about Blizzard/Overwatch aside, doesn't 21:9, or rather, anything non 16:9 get shafted in competitive games anyway? Afaik, dota 2, hots, cs:go, lol, tf2, etc. actually properly support 21:9. I've even seen cs:go triple monitor setups.
|
# ? Jul 15, 2016 16:38 |
|
Yeah, this is pretty much exclusive to blizzard and the RTS/moba community. Everything else will work just fine. People did all kinds of poo poo to their quake cfgs, but I don't think anyone complained about someone having fov set to 130 because I could play better/more comfortably at 100 personally.
|
# ? Jul 15, 2016 16:46 |
|
PerrineClostermann posted:Afaik, dota 2, hots, cs:go, lol, tf2, etc. actually properly support 21:9. I've even seen cs:go triple monitor setups. Dota 2 does support 21:9 but the HUD is messed up. According to Valve, a fix is on its way, but I wouldn't trust Valve time... It's a shame because Dota 2 in 21:9 is awesome
|
# ? Jul 15, 2016 16:49 |
|
Even blizzard's other games now support it properly.
|
# ? Jul 15, 2016 16:51 |
|
Shrimp or Shrimps posted:Try booting into safe mode and then disabling your iGPU in device manager. If, like me, you're on an older platform (I'm still using a 3570k) and your mouse doesn't work in safe mode, then Windows Key+X to get to the menu and keyboard to navigate to device manager. Thanks for the idea, but this does not appear to be the problem. I reinstalled the 1070, disabled the iGPU in the BIOS, booted into safe mode, ran DDU, uninstalled the HD4600 device in the device manager, then rebooted and installed the NVidia drivers. Screen went black, tried to boot a few times, then stayed on a black screen.
|
# ? Jul 15, 2016 16:52 |
|
Rabid Snake posted:Dota 2 does support 21:9 but the HUD is messed up. According to Valve, a fix is on its way but I wouldn't trust a Valve time... Our Lord and savior Gaben is an UltraWide user and does try to make sure Valve games support it properly. Half life Episode 3 is still on the way though, so you have a point.
|
# ? Jul 15, 2016 17:00 |
|
Terry Crews is ultrawide too
|
# ? Jul 15, 2016 17:07 |
|
Zero VGS posted:He could have done a lot worse... most celebrities would have got the top AMD cpu encrusted with $100k in Swarovskis... Making awful PC hardware choices for your gaming PC is pretty much the norm, he made way better part choices than a lot of people make, just look at this poo poo. http://pcpartpicker.com/builds/
|
# ? Jul 15, 2016 17:11 |