|
Is there some goldrush for a specific Acer monitor going on right now?
|
# ¿ Apr 3, 2016 22:57 |
|
GTX 10 Benjamins
|
# ¿ Apr 10, 2016 17:17 |
|
Trying here since I didn't get a response in the monitor thread, and hopefully some turbo goon has tried both: has anyone played the same game at 120Hz/120fps with ULMB (maybe at 1080p), as well as at an ultrawide resolution with 60-100fps and G-Sync? How did the experiences compare? I'm curious whether driving a 1080p monitor at 120Hz with motion blur reduction is a better experience than 21:9 with a variable, lower framerate held up a bit by G-Sync. I daydream about the old days playing arena shooters on CRTs, sometimes. I'm also intensely curious about gaming in 21:9. Also, there isn't a general PC Gaming thread in Games anymore? Sorry if this is off-topic
|
# ¿ Apr 11, 2016 02:13 |
|
xthetenth posted:For the *sync question it's worth remembering that the higher the framerate the less *sync does. It's really just moving frames to when they should be displayed, and the faster the screen is going the less difference there can be between where the frame is and where the frame should be. Do games still have a connection between mouse polling and framerates? Also, does this affect analog sticks, too? I seem to remember this being the case unless the game engine had a "raw input" override (which never seemed to be on by default). This information could be way out of date. But I remember looking into this back in the Quake days and this made a big difference in aiming at higher framerates
|
# ¿ Apr 11, 2016 18:41 |
|
It's surprising that GPUs don't support nearest-neighbor scaling yet, now that mismatched native resolutions are becoming more common. A 4K display that could show 1080p by just mapping each source pixel to a sharp 2x2 block, without introducing the blur of bilinear interpolation, would probably be pretty convenient for people who want to work at 4K but game at 1080p
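For anyone wondering what that would look like: integer scaling is just pixel duplication, nothing fancier. A toy sketch in Python/NumPy (the 2x2 "image" is made up, and np.repeat stands in for what a hardware scaler would do):

```python
import numpy as np

# Toy 2x2 "1080p" image; integer-scale it 2x along each axis so every
# source pixel becomes a sharp 2x2 block -- no bilinear blur.
img = np.array([[1, 2],
                [3, 4]])
scaled = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
print(scaled)
```

Each value in `img` shows up as a solid 2x2 block in `scaled`, which is exactly the 1080p-on-4K case (3840x2160 is precisely 2x 1920x1080 in each dimension).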
|
# ¿ Apr 12, 2016 13:57 |
|
If it's true that they're canceling production of the 980 Ti, does that make it likely they'll have a 980 Ti-class Pascal chip in June, or whenever they cancel it? Or do they generally just kill last gen's 980 Ti-level card while taking their time producing a new one? What do you industry pundits think? I'm in the awkward spot where I'd like to get something around the 980 Ti level of feeps per dollar, but my system is on cinderblocks until I get a new card, so I don't feel like holding out for a year, either. e: I guess I'd like to wait for DisplayPort 1.4 support, too, just in case there is some epic 144Hz 21:9 display and prices come down a bit fozzy fosbourne fucked around with this message at 15:08 on Apr 13, 2016 |
# ¿ Apr 13, 2016 15:05 |
|
I see. Is it reasonable to expect this year's cards to have DP 1.3 support at least? That spec includes the bandwidth to display the higher resolutions at 144hz, right?
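Quick napkin math on whether DP 1.3's bandwidth actually covers those resolutions. This ignores blanking intervals and any overhead beyond 8b/10b line coding, so treat it as a rough sketch, not a timing calculation:

```python
# DP 1.3 link: HBR3 at 8.1 Gbit/s per lane x 4 lanes = 32.4 Gbit/s raw,
# with 8b/10b coding leaving 80% usable -> 25.92 Gbit/s effective.
DP13_EFFECTIVE = 32.4 * 0.8

def gbit_per_s(width, height, hz, bpp=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bpp / 1e9

for name, (w, h, hz) in {
    "3440x1440 @ 144Hz": (3440, 1440, 144),
    "3840x2160 @ 120Hz": (3840, 2160, 120),
    "3840x2160 @ 144Hz": (3840, 2160, 144),
}.items():
    need = gbit_per_s(w, h, hz)
    verdict = "fits" if need < DP13_EFFECTIVE else "too much"
    print(f"{name}: {need:.1f} Gbit/s -> {verdict}")
```

So 144Hz ultrawide and 4K@120 squeak in at 24 bits per pixel, while 4K@144 is over the line even before blanking overhead.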
|
# ¿ Apr 13, 2016 16:22 |
|
Zero VGS posted:The nice thing is that 8k monitors are practically indistinguishable from 4k monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones; stuff at a couple-hundred-bucks price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles. 4K pegged at 120Hz+ so some strobing backlight feature can be enabled seems like it will be more easily distinguishable than high-end headphones are. I think there is a ton of headroom that won't be filled right away, now that higher resolutions and refresh rates are entering the market.
|
# ¿ Apr 14, 2016 13:44 |
|
Yeah, is it possible that AMD cards in the same tier curb stomp Nvidia cards in DX12 this generation?
|
# ¿ Apr 15, 2016 18:17 |
|
Rise of the Tomb Raider, and Deus Ex later this year. Basically, the GPU penis measurement genre going forward
|
# ¿ Apr 16, 2016 20:26 |
|
I don't think this was posted yet: rumor has it that the PlayStation 4.5 will use a Polaris chip http://www.giantbomb.com/articles/sources-the-upgraded-playstation-4-is-codenamed-ne/1100-5437/ Hopefully it supports FreeSync
|
# ¿ Apr 19, 2016 16:47 |
|
I have a thermos for coffee and a water bottle with a spout, both at work and home. Anything less would be uncivilized
|
# ¿ Apr 25, 2016 18:26 |
|
One of those beer can helmets, but with voice comms, 7.1 surround and Mountain Dew
|
# ¿ Apr 25, 2016 19:09 |
|
THE DOG HOUSE posted:For the fanboy driven market share talk ive spoken my mind on this plenty but, well, Polaris better be the poo poo. $300-$400 for 980ti performance as per the earlier posts is absolutely expected and shouldn't be surprising, I'd hope for slightly better than that (as this is already expected of Pascal as well). Wasn't the thread only a short while ago speculating that the 1080 might only be a 10-15% gain on the 980ti, for the same price? Seems like the speculation has been all over the place
|
# ¿ Apr 26, 2016 21:21 |
|
What type of interpolation do modern GPUs do when scaling to a lower resolution? Bilinear? Bicubic? E: also, does scaling on the GPU add any display lag to the pipeline? I can mostly only find contradictory bro science on CS:GO forums fozzy fosbourne fucked around with this message at 00:10 on Apr 27, 2016 |
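For reference, bilinear is just a weighted average of the four nearest source pixels, which is where the blur comes from. A minimal sketch (toy 2x2 image, plain Python, no claim about what any particular driver actually implements):

```python
# Sample a toy grayscale image at a fractional coordinate using
# bilinear interpolation: blend the four surrounding pixels by how
# close the sample point is to each.
def bilinear(img, x, y):
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    fx, fy = x - x0, y - y0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

img = [[0, 10],
       [20, 30]]
print(bilinear(img, 0.5, 0.5))  # dead center -> 15.0, a blend of all four
```

Nearest-neighbor would instead snap to one of the four values, which is why it stays sharp (and blocky) while bilinear smears edges.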
# ¿ Apr 27, 2016 00:07 |
|
http://videocardz.com/59459/nvidia-pascal-editors-day-next-week quote:We’ve confirmed this information with three independent sources (Jen-Hsun, Jen-Hsun doppelganger and Elon Musk). NVIDIA Editors’ Day 2016 will take place next week in USA. It’s a special event for reviewers and technology enthusiasts who are invited for a briefing about new graphics cards. During this day NVIDIA will inform the press about new Pascal-based graphics cards.
|
# ¿ Apr 29, 2016 18:57 |
|
There's also the production ramp-up on the GDDR5X memory. Found this on Beyond3D: quote:Micron's last statement was that they plan to start mass production in summer; a bit earlier they said August. Reply: quote:I've posted this quote from their March conference call already: https://forum.beyond3d.com/threads/nvidia-pascal-announcement.57763/page-23
|
# ¿ Apr 29, 2016 22:52 |
|
How long does it usually take for the third party cards to come out after the reference card is released? Also, which of the third parties have had the best cards lately?
|
# ¿ May 5, 2016 14:49 |
|
These people are saying it's more like a 30% increase over stock if you compare graphics scores to graphics scores. https://www.reddit.com/r/hardware/comments/4hznfw/first_nvidia_geforce_gtx_1080_3dmark_benchmarks/d2tq8aw No idea how to interpret that
|
# ¿ May 5, 2016 15:20 |
|
THE DOG HOUSE posted:Lol im not sure I believe that but ... one can hope Well, the videocardz graphics score for the 1080 is 10102, and in the benchmarks linked in that thread the stock 980 Ti's graphics score is 7459-7841. How a less significant "overall" score delta and a more significant "graphics score" delta translate to practical game performance, I have no idea
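Plugging those numbers in directly (the scores are the ones quoted above; the rest is just arithmetic, not a prediction of real game performance):

```python
# Leaked 1080 graphics score vs. the stock 980 Ti range from that thread.
gtx1080 = 10102
ti_low, ti_high = 7459, 7841

low_gain = (gtx1080 / ti_high - 1) * 100   # vs. the best 980 Ti result
high_gain = (gtx1080 / ti_low - 1) * 100   # vs. the worst 980 Ti result
print(f"{low_gain:.0f}% to {high_gain:.0f}% faster on graphics score")
```

That lands at roughly 29-35%, which is where the "more like a 30% increase" claim comes from.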
|
# ¿ May 5, 2016 15:36 |
|
http://arstechnica.com/gaming/2016/02/directx-12-amd-and-nvidia-gpus-finally-work-together-but-amd-still-has-the-lead/ Woah, how the hell did I only just hear about this multi-GPU split-frame rendering in DX12? I'm guessing there are some serious drawbacks that won't allow me to use a 1080, a 670, and my iGPU in one big split-frame rendering party?
|
# ¿ May 6, 2016 02:08 |
|
What determines a "base clock" anyway? It seems like it's mostly synthetic rather than an intrinsic value of the card, right? Maxwell could have been much "worse at overclocking" if they had sold it with a higher base clock? I'm just trying to figure out whether base clocks have a large marketing factor to them or if there is some fundamental engineering formula used to establish them
|
# ¿ May 6, 2016 13:52 |
|
This seems worth reposting: https://www.reddit.com/r/nvidia/comments/4i4399/dual_monitor_144hz_60hz_framerate_lock_is_a_bug/ quote:Issue is showcased here: https://www.youtube.com/watch?v=Rmaz4qAor_w If you watch a 60fps video on your second display your refresh on your gaming screen might get fuxxored
|
# ¿ May 6, 2016 16:45 |
|
FWIW some random guy on reddit says he heard from a little birdie that aftermarket cards with third party coolers will either launch with the reference cards or within a week
|
# ¿ May 6, 2016 21:16 |
|
I have a Silverstone SG08, a little babby ITX case that I use blowers in, since if the card just blew all the hot air into the case it would basically be blowing itself!
|
# ¿ May 6, 2016 21:31 |
|
Froist posted:Stupid question but if the general consensus is that blowers are crap, why are they always the reference design? Because non-reference coolers blow the hot air back into the case, and that only works if you know what you are doing and have airflow. They can't expect that, so they use the lowest-common-denominator solution
|
# ¿ May 6, 2016 22:03 |
|
So is the thing done? Is there some place where I can read about the features and poo poo?
|
# ¿ May 7, 2016 04:42 |
|
Thanks. Here is another thing: http://nvidianews.nvidia.com/news/a-quantum-leap-in-gaming:-nvidia-introduces-geforce-gtx-1080
|
# ¿ May 7, 2016 05:23 |
|
So what does "double the bandwidth" for SLI mean in practice? Has SLI been constrained by bandwidth? E: also, wccftech is saying the 1080 has DisplayPort 1.4? Is that even possible yet? fozzy fosbourne fucked around with this message at 06:07 on May 7, 2016 |
# ¿ May 7, 2016 05:59 |
|
I wonder when we'll see the first 4K 120hz g-sync monitor.
|
# ¿ May 7, 2016 06:07 |
|
Paul MaudDib posted:If you come out and say you're using DX12 benchmarks then it raises the obvious question of how DX11 performs. Same reason you don't come out and say your Pascal demo board is running on Maxwell... Well, there is this vague graph from the product page:
|
# ¿ May 7, 2016 13:30 |
|
Zotac: http://videocardz.com/59674/zotac-releases-its-geforce-gtx-1080 https://www.zotac.com/gb/product/graphics_card/geforce-gtx-1080
|
# ¿ May 7, 2016 17:32 |
|
Hidden Information Found In HTML Source Code of NVIDIA’s GTX 1080 Page lol
|
# ¿ May 7, 2016 18:05 |
|
slidebite posted:Hey sorry if this is a stupid question but I wasn't able to watch the press event and just getting caught up. The "relative performance" numbers seem to be based on (synthetic) VR benchmarking. The game performance graph on the product page (which someone grabbed some raw numbers and the game settings from, in that link I just posted) seems slightly more concrete. quote:The “Up to 3x performance” originally had a disclaimer that it was “based on test results in graphics intensive VR gaming applications”. This disclaimer was commented out quote:The graph that compares the GTX 980 to the 1080 in the ‘performance’ tab doesn’t display exact numbers, meaning most readers are forced to estimate the values. I opened up the inspect element to view the width in pixels of the chart bars, and used simple math to determine the percentage difference. The GTX 1080 performs 70% better in Witcher 3 and 80% better at Rise of the Tomb Raider than the GTX 980, according to the benchmark. These benchmarks were conducted at max settings @ 2560×1600, according to a comment in the source. Until we see real benchmarks conducted by another party, I'm going to assume the above is the closest thing to reality right now
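The inspect-element trick from that article is just a ratio of bar widths. A sketch of the math (the 200/340 pixel widths here are made up for illustration, not the real chart's values):

```python
# Turn two chart-bar widths (in pixels, read off via inspect-element)
# into a percentage performance gain.
def percent_gain(bar_980_px, bar_1080_px):
    return (bar_1080_px / bar_980_px - 1) * 100

# e.g. if the 980's bar were 200px wide and the 1080's 340px:
print(f"{percent_gain(200, 340):.0f}% faster")
```

A 340px bar against a 200px bar works out to 70%, the same kind of ratio that produced the 70%/80% figures quoted above.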
|
# ¿ May 7, 2016 18:15 |
|
A guy just broke NDA from the deep-dive presentation going on right now over at Beyond3D: quote:If I'm getting this right, the current presentation just said that Pascal can finally change the allocation of SM-blocks to either compute or graphics tasks at runtime, with the same granularity as preemption. Controlled by the driver.
|
# ¿ May 7, 2016 18:26 |
|
980 came out at $549 if what I'm reading is correct. How big was the performance margin for the 980 over the 780 in benchmarks comparable for the time to the 2560x1600 max Witcher and Tomb Raider benchmarks?
|
# ¿ May 7, 2016 18:43 |
|
PC LOAD LETTER posted:980gtx was around 30% faster than the 780gtx I believe. The 780Ti actually came pretty close to the 980's performance though, around 10% difference on avg. MaxxBot posted:It looks like the 980 is about 35% faster than the 780. Nice! Now I'm curious how the 980 Ti does on 2560x1600 "max" benchmarks for The Witcher 3 and Rise of the Tomb Raider, alongside GTX 980 benchmarks. We could use the percent difference from that graph to model where the GTX 1080 might land in terms of fps, as well as a percent difference between it and the 980 Ti
|
# ¿ May 7, 2016 19:55 |
|
The Awful app edits instead of making a new post and ate my post, but I found benchmarks for 2560x1600 max Witcher 3: 980 Ti 51fps, 980 40fps. Using the 70% difference reported on the official 1080 page gives a speculative 68fps for the 1080 in the same benchmark, and a 28.57% percent difference between the 1080 and the 980 Ti on that Witcher bench. fozzy fosbourne fucked around with this message at 21:06 on May 7, 2016 |
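For anyone checking the napkin math, here it is spelled out (the fps numbers are the ones quoted above; "percent difference" here means the delta over the midpoint of the two values, which is how the 28.57% figure falls out):

```python
# Scale the 980's Witcher 3 result by the 70% figure from the product
# page, then compare the projection against the 980 Ti.
fps_980, fps_980ti = 40, 51
fps_1080_est = fps_980 * 1.70  # speculative 1080 result

# Percent difference: delta over the midpoint of the two values.
pct_diff = abs(fps_1080_est - fps_980ti) / ((fps_1080_est + fps_980ti) / 2) * 100
print(f"{fps_1080_est:.0f} fps estimated, {pct_diff:.2f}% vs the 980 Ti")
```

Note that a plain ratio instead of the midpoint formula would give (68/51 - 1) = ~33%, so which "percent" you quote changes the headline number.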
# ¿ May 7, 2016 20:19 |
|
FaustianQ posted:What are the clocks on these? 28-29% sounds about right; some comparisons were floating about of 25-30% performance improvement, but with a 60% clock difference when using published stock boost clocks (1076 vs 1733). This means a 980 Ti can trade punches with GP104 up to 2.0GHz, but then gets left in the dust without special cooling past that, all while the power draw difference looks to be massive, and GP104's actual clock limit looks pretty close to 2.4GHz. GP104 will also have advantages in VR and compute-heavy tasks, which can hobble a 980 Ti. This is what I used for the 980/980 Ti benchmarks: http://www.techspot.com/review/1011-nvidia-geforce-gtx-980-ti/ quote:Nvidia GeForce GTX 980 Ti (6144MB) quote:The base clock speed of the GTX 980 Ti is 1000MHz, with Boost Clock speed of 1075MHz, it matches the Titan X. The Boost Clock speed is based on the average GTX 980 Ti card running a wide variety of games and applications. I pulled the 70% difference between the 980 and 1080 in Witcher 3 from: http://deliddedtech.com/2016/05/07/hidden-information-found-in-source-code-of-nvidias-gtx-1080-page/ quote:The graph that compares the GTX 980 to the 1080 in the ‘performance’ tab doesn’t display exact numbers, meaning most readers are forced to estimate the values. I opened up the inspect element to view the width in pixels of the chart bars, and used simple math to determine the percentage difference. The GTX 1080 performs 70% better in Witcher 3 and 80% better at Rise of the Tomb Raider than the GTX 980, according to the benchmark. These benchmarks were conducted at max settings @ 2560×1600, according to a comment in the source. fozzy fosbourne fucked around with this message at 21:36 on May 7, 2016 |
# ¿ May 7, 2016 21:34 |
|
Yeah, inflation takes the $500 from 2012 for the 680 to about $520 in 2016
|
# ¿ May 7, 2016 23:20 |