fozzy fosbourne
Apr 21, 2010

Is there some gold rush for a specific Acer monitor going on right now?

fozzy fosbourne
Apr 21, 2010

GTX 10 Benjamins

fozzy fosbourne
Apr 21, 2010

Trying here since I didn't get a response in the monitor thread and hopefully some turbo goon has tried both here:
Has anyone played the same game at 120hz/120fps with ULMB (maybe 1080p), as well as at an ultrawide resolution with 60-100FPS and g-sync? How did the experience compare?

I'm curious whether driving a 1080p monitor at 120hz with motion blur reduction is a better experience than 21:9 with a variable, lower fps propped up a bit by g-sync. I daydream about the old days of playing arena shooters on CRTs, sometimes. I'm also intensely curious about gaming in 21:9.

Also, there isn't a general PC Gaming thread in games anymore? Sorry if this is off-topic

fozzy fosbourne
Apr 21, 2010

xthetenth posted:

For the *sync question it's worth remembering that the higher the framerate the less *sync does. It's really just moving frames to when they should be displayed, and the faster the screen is going the less difference there can be between where the frame is and where the frame should be.

Then again I seem to be really insensitive to framerates above ~45 so my XR341CK is kind of wasted on me and I'm not the right guy to ask. It seems a bit better but not the huge deal it makes to some.

Do games still have a connection between mouse polling and framerates? Also, does this affect analog sticks, too? I seem to remember this being the case unless the game engine had a "raw input" override (which never seemed to be on by default). This information could be way out of date. But I remember looking into this back in the Quake days and this made a big difference in aiming at higher framerates

fozzy fosbourne
Apr 21, 2010

It's surprising that GPUs don't support nearest-neighbor scaling yet, now that mismatched native resolutions are becoming more common. A 4K display that could show 1080p by just turning each pixel into a 2x2 block of identical pixels, instead of introducing the blur of bilinear interpolation, would be pretty convenient for people who want to work at 4K but game at 1080p.
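
Just to illustrate what I mean, here's a toy numpy sketch of integer nearest-neighbor scaling; nothing vendor-specific, and the function name is made up:

code:

import numpy as np

def integer_upscale(frame, factor=2):
    # Nearest-neighbor integer scaling: each source pixel becomes a
    # factor x factor block of identical pixels, so a 1920x1080 frame maps
    # exactly onto a 3840x2160 panel with no interpolation blur.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Toy 1080p frame (height x width x RGB) scaled to 4K dimensions.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_upscale(frame_1080p, factor=2)
print(frame_4k.shape)  # (2160, 3840, 3)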

fozzy fosbourne
Apr 21, 2010

If it's true that they are canceling production of the 980ti, does that make it likely that they will have a 980ti-class Pascal chip in June, or whenever they cancel it? Or do they generally just kill last gen's 980ti-level card while taking their time to produce a new one? What do you industry pundits think?

I'm in the awkward spot where I'd like to get something around the 980ti level of feeps per dollar but my system is on cinderblocks until I get a new card so I don't feel like holding out for a year, either.

e: I guess I'd like to wait for DisplayPort 1.4 support, too, just in case there is some epic 144hz 21:9 display and the market comes down in price a bit

fozzy fosbourne fucked around with this message at 15:08 on Apr 13, 2016

fozzy fosbourne
Apr 21, 2010

I see. Is it reasonable to expect this year's cards to have DP 1.3 support at least? That spec includes the bandwidth to display the higher resolutions at 144hz, right?
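
Rough back-of-the-envelope math on that, assuming the published DP 1.3 figures (32.4 Gbps raw, ~25.92 Gbps effective after 8b/10b encoding) and hand-waving blanking as a flat ~20% overhead:

code:

# Does 3440x1440 @ 144 Hz fit in DisplayPort 1.3? Approximate numbers only;
# real CVT timings add blanking, hand-waved here as a flat 20% overhead.
DP13_EFFECTIVE_GBPS = 25.92  # 32.4 Gbps raw minus 8b/10b encoding overhead

def link_gbps_needed(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

needed = link_gbps_needed(3440, 1440, 144)
print(f"~{needed:.1f} Gbps needed vs {DP13_EFFECTIVE_GBPS} Gbps available")
# ~20.5 Gbps needed vs 25.92 Gbps available, so yes at 8-bit color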

fozzy fosbourne
Apr 21, 2010

Zero VGS posted:

The nice thing is that 8k monitors are practically indistinguishable from 4k monitors at a comfortable viewing distance, and foveated rendering is poised to lower VR rendering requirements by possibly two-thirds. Between that and DX12, we might actually reach a point in the next few years where graphics cards become like headphones; stuff at a couple-hundred-bucks price point can be good enough that most people can't distinguish anything better, unless they're imagining it, like audiophiles.

4K pegged at 120hz+ so a strobing backlight feature can be enabled seems like it will be a more easily distinguishable upgrade than high-end headphones. I think there is a ton of headroom that won't be filled right away, now that higher resolutions and refresh rates are entering the market.

fozzy fosbourne
Apr 21, 2010

Yeah, is it possible that AMD cards in the same tier curb-stomp Nvidia cards in DX12 this generation?

fozzy fosbourne
Apr 21, 2010

Rise of the Tomb Raider, Deus Ex later this year. Basically, the GPU penis measurement genre going forward

fozzy fosbourne
Apr 21, 2010

I don't think this was posted yet: rumor has it that the PlayStation 4.5 will use a Polaris chip http://www.giantbomb.com/articles/sources-the-upgraded-playstation-4-is-codenamed-ne/1100-5437/

Hopefully it supports freesync

fozzy fosbourne
Apr 21, 2010

I have a thermos for coffee and a water bottle with a spout, both at work and home. Anything less would be uncivilized

fozzy fosbourne
Apr 21, 2010

One of those beer can helmets, but with voice comms, 7.1 surround and Mountain Dew

fozzy fosbourne
Apr 21, 2010

THE DOG HOUSE posted:

For the fanboy-driven market share talk, I've spoken my mind on this plenty, but, well, Polaris better be the poo poo. $300-$400 for 980ti performance, as per the earlier posts, is absolutely expected and shouldn't be surprising; I'd hope for slightly better than that (as this is already expected of Pascal as well).

Wasn't the thread only a short while ago speculating that the 1080 might only be a 10-15% gain on the 980ti, for the same price? Seems like the speculation has been all over the place

fozzy fosbourne
Apr 21, 2010

What type of interpolation do modern gpus do when scaling to a lower resolution? Bilinear? Bicubic?

E: also does scaling from the gpu add any display lag to the pipeline? I can only find mostly contradictory bro science on cs:go forums

fozzy fosbourne fucked around with this message at 00:10 on Apr 27, 2016

fozzy fosbourne
Apr 21, 2010

http://videocardz.com/59459/nvidia-pascal-editors-day-next-week

quote:

We’ve confirmed this information with three independent sources (Jen-Hsun, Jen-Hsun doppelganger and Elon Musk). NVIDIA Editors’ Day 2016 will take place next week in USA. It’s a special event for reviewers and technology enthusiasts who are invited for a briefing about new graphics cards. During this day NVIDIA will inform the press about new Pascal-based graphics cards.

Our sources claim that two graphics cards will be unveiled. So this should probably mean GeForce GTX 1080 and GTX 1070 showcase.
We are avoiding the term ‘announcement’ as it remains unclear if the event will be live streamed for a public audience or its content will be disclosed later when the NDA lifts. Normally NVIDIA tends to call public events ‘press conferences’, so it is unlikely we will see official confirmation of this Pascal event.

The last major Editors’ Day took place in 2014, a week before the Maxwell GM204 announcement. So this should give you a perspective on when to expect GeForce Pascal cards (I’m thinking maybe the second or third week of May).

fozzy fosbourne
Apr 21, 2010

There's also the production ramp-up on the GDDR5X memory. Found this on Beyond3D:

quote:

Micron's last statement was that they plan to start mass production in summer; a bit earlier they said August.

Reply:

quote:

I've posted this quote from their March conference call already:
In the Graphics segment, we’re enthusiastic about the early success of our GDDR5X, a discrete solution for increasing data rates above 10 gigabits per second. We’ve several major design wins and expect to have the products available by the end of the current fiscal quarter.
Their current fiscal quarter ends on May 31st.

They must have accelerated things. The conf call came after those early predictions.
I can't blame you for thinking otherwise: everybody seems to be using GDDR5X availability as a crutch to claim that the GDDR5X version will be after the summer or q4, but that Micron statement throws that argument under the bus.

https://forum.beyond3d.com/threads/nvidia-pascal-announcement.57763/page-23

:f5:

fozzy fosbourne
Apr 21, 2010

How long does it usually take for the third party cards to come out after the reference card is released? Also, which of the third parties have had the best cards lately?

fozzy fosbourne
Apr 21, 2010

These people are saying it's more like a 30% increase over stock if you compare graphics scores to graphics scores. https://www.reddit.com/r/hardware/comments/4hznfw/first_nvidia_geforce_gtx_1080_3dmark_benchmarks/d2tq8aw

No idea how to interpret that

fozzy fosbourne
Apr 21, 2010

THE DOG HOUSE posted:

Lol im not sure I believe that but ... one can hope

Well, the videocardz graphics score for the 1080 is 10102, and in the benchmarks linked in that thread the stock 980ti scores 7459-7841 on graphics score. How a less significant "overall" score delta and a more significant "graphics score" delta translate to practical game performance, I have no idea.
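
For what it's worth, the straight arithmetic on the graphics scores already in the thread looks like this:

code:

# Graphics-score delta between the leaked 1080 result and the stock 980ti runs
# linked in that reddit thread.
gtx_1080 = 10102
gtx_980ti_scores = (7459, 7841)

for score in gtx_980ti_scores:
    gain = (gtx_1080 / score - 1) * 100
    print(f"1080 vs 980ti at {score}: +{gain:.0f}%")
# 1080 vs 980ti at 7459: +35%
# 1080 vs 980ti at 7841: +29%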

fozzy fosbourne
Apr 21, 2010

http://arstechnica.com/gaming/2016/02/directx-12-amd-and-nvidia-gpus-finally-work-together-but-amd-still-has-the-lead/

Woah how the hell did I only just hear about this multi gpu split frame rendering in DX12? I'm guessing there are some serious drawbacks that won't allow me to use a 1080, 670, and my iGPU in one big split frame rendering party?

fozzy fosbourne
Apr 21, 2010

What determines a "base clock" anyways? It seems like it's mostly synthetic rather than an intrinsic value of the card, right? Maxwell could have been much "worse at overclocking" if they sold it with a higher base clock? I'm just trying to figure out if base clocks have a large marketing factor to them or if there is some fundamental engineering formula that they use to establish them

fozzy fosbourne
Apr 21, 2010

This seems worth reposting:
https://www.reddit.com/r/nvidia/comments/4i4399/dual_monitor_144hz_60hz_framerate_lock_is_a_bug/

quote:

Issue is showcased here: https://www.youtube.com/watch?v=Rmaz4qAor_w
TLDV: If you run a game on a 144hz monitor and play a video or moving graphic on a 60hz monitor, the 144hz monitor locks at 60fps for the duration of the moving graphic.
Got this from Nvidia support
"After a lot of investigation internally, this was found not to be a bug, but a limitation when using a 144hz monitor and a 60hz monitor together and trying to watch video on the 60hz monitor.
If you are trying to watch a 60Hz streamed game or video on their 144Hz monitor, the PC tries to show the streamed image (144/60 = ) 2.4 times... before showing the next frame. The monitor is not going to show a fractional number of frames of the streamed content. The monitor is going to show the "frame 1" 2 times, then show the "frame 2" 3 times, then show "frame 3" 2 times, and keep repeating this pattern.
So it is expected to see some stutter and judder, if you lower the refresh to 120hz on the primary monitor this issue will go away.
Its not something we can fix. "
EDIT:
Some have reported that turning it down to 120hz solved it, this was not the case for me, my issue still persists.

If you watch a 60fps video on your second display, the refresh rate on your gaming screen might get fuxxored
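
If you want to see the support rep's 144/60 = 2.4 arithmetic play out, here's a toy version of the repeat cadence they describe:

code:

# Toy model of showing 60 fps content on a 144 Hz panel: each source frame
# ideally occupies 144/60 = 2.4 refreshes, but the panel can only show whole
# refreshes, so it alternates between 2 and 3 repeats -- hence the judder.
def repeat_pattern(panel_hz=144, content_fps=60, frames=6):
    per_frame = panel_hz / content_fps  # 2.4 refreshes per source frame
    shown, error = [], 0.0
    for _ in range(frames):
        error += per_frame
        whole = round(error)  # whole refreshes actually spent on this frame
        error -= whole
        shown.append(whole)
    return shown

print(repeat_pattern())  # [2, 3, 2, 3, 2, 2] -- uneven 2/3 cadence, not smooth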

fozzy fosbourne
Apr 21, 2010

FWIW some random guy on reddit says he heard from a little birdie that aftermarket cards with third party coolers will either launch with the reference cards or within a week :pervert:

fozzy fosbourne
Apr 21, 2010

I have a Silverstone SG08, a little babby ITX case that I use blowers for, since if the card just blew all the hot air into the case it would basically be blowing itself! :eyepop:

fozzy fosbourne
Apr 21, 2010

Froist posted:

Stupid question but if the general consensus is that blowers are crap, why are they always the reference design?

Because non-reference coolers blow the hot air back into the case, and that only works if you know what you are doing and have airflow. They can't expect that, so they use the lowest-common-denominator solution.

fozzy fosbourne
Apr 21, 2010

So is the thing done? Is there some place where I can read about the features and poo poo?

fozzy fosbourne
Apr 21, 2010


Thanks. Here is another thing: http://nvidianews.nvidia.com/news/a-quantum-leap-in-gaming:-nvidia-introduces-geforce-gtx-1080

fozzy fosbourne
Apr 21, 2010

So what does "double the bandwidth" for SLI mean in practice? Has SLI been constrained by bandwidth?

E: also wccftech is saying the 1080 has DisplayPort 1.4? Is that even possible yet?

fozzy fosbourne fucked around with this message at 06:07 on May 7, 2016

fozzy fosbourne
Apr 21, 2010

I wonder when we'll see the first 4K 120hz g-sync monitor.

fozzy fosbourne
Apr 21, 2010

Paul MaudDib posted:

If you come out and say you're using DX12 benchmarks then it raises the obvious question of how DX11 performs. Same reason you don't come out and say your Pascal demo board is running on Maxwell...

Well, there is this vague graph from the product page:

fozzy fosbourne
Apr 21, 2010

Zotac:
http://videocardz.com/59674/zotac-releases-its-geforce-gtx-1080
https://www.zotac.com/gb/product/graphics_card/geforce-gtx-1080

fozzy fosbourne
Apr 21, 2010

Hidden Information Found In HTML Source Code of NVIDIA’s GTX 1080 Page lol

fozzy fosbourne
Apr 21, 2010

slidebite posted:

Hey sorry if this is a stupid question but I wasn't able to watch the press event and just getting caught up.

Did they actually describe the metrics behind "relative performance"? Or give any real FPS (even benchmark) numbers? "Relative" sounds really nebulous... like it could be using $$ as a factor or something, which could be reasonable to use, but if, for example, performance is really not all that much higher but costs "relatively" less, that could really skew a comparison graph.

The "relative performance" numbers seem to be based on (synthetic) VR benchmarking. The game performance graph on the product page (which someone grabbed some raw numbers from and the game settings in that link I just posted) seems slightly more concrete.

quote:

The “Up to 3x performance” originally had a disclaimer that it was “based on test results in graphics intensive VR gaming applications”. This disclaimer was commented out

quote:

The graph that compares the GTX 980 to the 1080 in the ‘performance’ tab doesn’t display exact numbers, meaning most readers are forced to estimate the values. I opened up the inspect element to view the width in pixels of the chart bars, and used simple math to determine the percentage difference. The GTX 1080 performs 70% better in Witcher 3 and 80% better at Rise of the Tomb Raider than the GTX 980, according to the benchmark. These benchmarks were conducted at max settings @ 2560×1600, according to a comment in the source.

Until we see real benchmarks conducted by another party, I'm going to assume the above is the closest thing to reality right now
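
Here's roughly how that bar-width trick works; the pixel widths below are made up for illustration (the article doesn't list the raw values), only the arithmetic is the same:

code:

# Estimating the product-page graph's deltas from chart-bar widths read out of
# the page source with inspect element. Widths here are hypothetical examples.
bar_widths_px = {"GTX 980": 200, "GTX 1080": 340}

baseline = bar_widths_px["GTX 980"]
for card, width in bar_widths_px.items():
    print(f"{card}: {(width / baseline - 1) * 100:+.0f}% vs GTX 980")
# GTX 980: +0% vs GTX 980
# GTX 1080: +70% vs GTX 980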

fozzy fosbourne
Apr 21, 2010

A guy just broke NDA from the deep-dive presentation going on right now over at Beyond3D:

quote:

If I'm getting this right, the current presentation just said that Pascal can finally change the allocation of SM-blocks to either compute or graphic tasks at runtime, with the same granularity as preemption. Controlled by the driver.

And yes, this apparently does mean that Maxwell actually statically assigned a SM either to compute or graphics for the duration of the next batch, depending on what load the driver tried to predict. And if the prediction was wrong, the GPU was stuck with half the SM units being unable to do anything? That explains a lot...
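
A toy way to picture the difference that quote is describing; purely illustrative, not how the driver actually partitions anything:

code:

# Toy model of the Maxwell-vs-Pascal behavior described above: 20 SMs, a batch
# that turns out to be 25% compute / 75% graphics, and a driver that guessed 50/50.
TOTAL_SMS = 20

def static_split_utilization(predicted_compute_frac, actual_compute_frac):
    # Maxwell-style (per the quote): SMs pinned to compute or graphics for the
    # whole batch based on the driver's prediction; mispredicted SMs sit idle.
    compute_sms = TOTAL_SMS * predicted_compute_frac
    graphics_sms = TOTAL_SMS - compute_sms
    busy = (min(compute_sms, TOTAL_SMS * actual_compute_frac)
            + min(graphics_sms, TOTAL_SMS * (1 - actual_compute_frac)))
    return busy / TOTAL_SMS

def dynamic_split_utilization(actual_compute_frac):
    # Pascal-style (per the quote): allocation changes at runtime, so nothing idles.
    return 1.0

print(static_split_utilization(0.5, 0.25))   # 0.75 -> a quarter of the SMs do nothing
print(dynamic_split_utilization(0.25))       # 1.0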

fozzy fosbourne
Apr 21, 2010

The 980 came out at $549, if what I'm reading is correct. How big was the performance margin for the 980 over the 780 in benchmarks from that era comparable to the 2560x1600 max Witcher and Tomb Raider benchmarks?

fozzy fosbourne
Apr 21, 2010

PC LOAD LETTER posted:

980gtx was around 30% faster than the 780gtx I believe. The 780Ti actually came pretty close to the 980's performance though, around 10% difference on avg.


Nice!

Now I'm curious how the 980ti does on 2560x1600 "max" benchmarks for The Witcher 3 and Rise of the Tomb Raider, alongside 980 gtx benchmarks. We could use the percent difference from that graph to model where the 1080 gtx might land in terms of fps, as well as the percent difference between that and the 980ti.

fozzy fosbourne
Apr 21, 2010

The Awful app edited my last post instead of making a new one and ate it, but I found techpowerup benchmarks for 2560x1600 max Witcher 3:
980ti: 51 fps
980: 40 fps

I used the 70% difference reported on the official 1080 page to calculate a speculative 68 fps for the 1080 in the same benchmark, and a 28.57% percent difference between the 1080 and the 980ti on that Witcher bench.
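
Spelling out that arithmetic (the 28.57% is the symmetric percent-difference formula, not a straight percent increase over the 980ti):

code:

# Speculative Witcher 3 2560x1600 number built from techpowerup's 980/980ti
# results and the 70% figure pulled from the GTX 1080 product-page graph.
fps_980, fps_980ti = 40, 51
fps_1080_est = fps_980 * 1.70  # 68 fps

# Symmetric percent difference between the estimated 1080 and the 980ti.
pct_diff = abs(fps_1080_est - fps_980ti) / ((fps_1080_est + fps_980ti) / 2) * 100
print(fps_1080_est, round(pct_diff, 2))  # 68.0 28.57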

fozzy fosbourne fucked around with this message at 21:06 on May 7, 2016

fozzy fosbourne
Apr 21, 2010

FaustianQ posted:

What are the clocks on these? 28-29% sounds about right; some comparisons floating around put it at a 25-30% performance improvement, but with a 60% clock difference when using published stock boost clocks (1076 vs 1733). This means a 980ti can trade punches with GP104 up to 2.0ghz, but then gets left in the dust without special cooling past that. All the while, the power draw difference looks to be massive, and GP104's actual clock limit looks pretty close to 2.4ghz. GP104 will also have advantages in VR and compute-heavy tasks which can hobble a 980ti.

This is what I used for the 980/980ti benchmarks:

http://www.techspot.com/review/1011-nvidia-geforce-gtx-980-ti/

quote:

Nvidia GeForce GTX 980 Ti (6144MB)
Gainward GeForce GTX 980 (4096MB)

quote:

The base clock speed of the GTX 980 Ti is 1000MHz, with Boost Clock speed of 1075MHz, it matches the Titan X. The Boost Clock speed is based on the average GTX 980 Ti card running a wide variety of games and applications.

So: a stock 980ti and, I guess, a 3rd-party 980.

I pulled the 70% difference between the 980 and 1080 in Witcher 3 from:
http://deliddedtech.com/2016/05/07/hidden-information-found-in-source-code-of-nvidias-gtx-1080-page/

quote:

The graph that compares the GTX 980 to the 1080 in the ‘performance’ tab doesn’t display exact numbers, meaning most readers are forced to estimate the values. I opened up the inspect element to view the width in pixels of the chart bars, and used simple math to determine the percentage difference. The GTX 1080 performs 70% better in Witcher 3 and 80% better at Rise of the Tomb Raider than the GTX 980, according to the benchmark. These benchmarks were conducted at max settings @ 2560×1600, according to a comment in the source.

Not sure if that's the stock reference clock speed or if they overclocked it for those results, but I'd assume the former.

fozzy fosbourne fucked around with this message at 21:36 on May 7, 2016

fozzy fosbourne
Apr 21, 2010

Yeah, inflation takes the $500 from 2012 for the 680 to about $520 in 2016
