|
https://www.micron.com/about/blogs/2016/may/nvidia-launches-gtx1080-with-micron-gddr5x "I first talked about Micron’s GDDR5X in September 2015 and in February of this year, provided an update that the technology was on track for mass production by summer. Today, I am happy to announce that GDDR5X, the fastest discrete memory component in the world, has already entered mass production." Maybe supply won't be as bad as we feared
|
# ? May 11, 2016 13:24 |
|
I am concerned that AMD may not have access to GDDR5X or enough of it, though.
|
# ? May 11, 2016 13:35 |
|
If you want to get moist reading nerdy papers about how fraggin awesome the gddrx are, try this one The TL;DR version being.....
|
# ? May 11, 2016 13:38 |
|
SwissArmyDruid posted:I am concerned that AMD may not have access to GDDR5X or enough of it, though. They will probably skip it for now; the extra price, power draw, and supply issues make it unsuited for Polaris, and HBM2 is straight up better and is the advertised feature for Vega.
|
# ? May 11, 2016 14:00 |
|
froody guy posted:If you want to get moist reading nerdy papers about how fraggin awesome the gddrx are, try this one The current stuff is 10gbps though, right? Last I saw, the 12 and 14 were in "Sampling" while the 10 is still in production. I wonder how much the 1080 is bottlenecked by memory bandwidth and when we'll see the first 12/14 gbps chips available to consumers. Will the fast stuff be in 3rd party 1080s? A whole new sku? Next year? September?
|
# ? May 11, 2016 14:08 |
|
xthetenth posted:You're still getting a wider variation between when the frame starts rendering and when it's displayed with *sync, but it's going to be lessened. If you are consistently operating above Vsync there will be no variation because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent 15ms update rate. Hubis fucked around with this message at 14:27 on May 11, 2016 |
# ? May 11, 2016 14:23 |
|
fozzy fosbourne posted:The current stuff is 10gbps though, right? fozzy fosbourne posted:Last I saw, the 12 and 14 were in "Sampling" while the 10 is still in production. I wonder how much the 1080 is bottlenecked by memory bandwidth and when we'll see the first 12/14 gbps chips available to consumers. Will the fast stuff be in 3rd party 1080s? A whole new sku? Next year? September? For sure, if they start putting the speshul ones on all the 1080s sold from a certain point on, I'd be massively pissed.
|
# ? May 11, 2016 14:24 |
|
Scammers Edition's PCB unveiled
|
# ? May 11, 2016 14:31 |
|
Hubis posted:If you are consistently operating above Vsync there will be no variation because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent 15ms update rate. Right, sorry, it'd be less consistent between end of render and display but more consistent between start of frame and display, which I think would actually be better, barring some of the VR tech like transforms based on game state changes between start and end of rendering a frame. That all changes if you dip below max FPS though, but we're specifically discounting that case.
|
# ? May 11, 2016 14:34 |
|
For a second my brain didn't parse what you meant by Scammers Edition and thought some more extremely good wood screws poo poo had come to light.
|
# ? May 11, 2016 14:35 |
|
Come on Nvidia, if you're going to charge a premium you could at least populate all the VRM phases
|
# ? May 11, 2016 14:37 |
|
Hubis posted:If you are consistently operating above Vsync there will be no variation because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent 15ms update rate. It works, in theory and mostly in practice, but here is my main issue with vsync. This is a graph without vsync with an average fps of 99. You can clearly see the framerate is actually all over the place constantly; this is what games look like if you chart frame-by-frame times. It looks terrible, but it's actually a very smooth example of gameplay; it looks much, much worse when there is serious tearing. This is an average fps of 99 over 5 minutes which, on a side note, is what the typical benchmark website would report as the result. Now vsync prevents the back buffer frame from drawing on the screen until the front buffer frame finishes, which in theory should make that a straight line across 60 fps (and it's pretty close!). The problem is that if you look at the graph above, there is no way this mathematically works out time-wise. All those frames with wildly varying times are being forced into locked frame times, which isn't what the GPU was truly outputting at the time it rendered each frame. This results in a constant speeding up and slowing down of "time" in the game itself, which is what we perceive as lag. If it didn't happen all the time it would be unnoticeable, but as you can see above, there is no way for it not to be happening almost the entire time vsync is on, whether or not the GPU is averaging above 60 fps. This is where the input lag comes from, among other visual issues. The lag can never be more than the variation within one frame, but it's simply too noticeable. On top of that, it's not actually perfect anyway; there are still fairly consistent dropped frames, albeit far fewer.
But combine the varying lag and input lag with the fact that you're making your GPU work harder to do this at all, reducing your settings potential... I'm just not a fan. Some games are worse than others, and in some games it doesn't matter if it lags a little bit, but it was obnoxious enough for me to just accept tearing instead. And I really, really, really tried to like it for a long time and tried every vsync setting combination and mode there was. I'm well aware I'm splitting hairs here, but having the monitor and GPU work together to sync refresh rate with fps is a really elegant solution that we all deserve to have. penus penus penus fucked around with this message at 15:47 on May 11, 2016 |
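The mismatch described above can be sketched with a tiny toy model (my own illustrative numbers and function names, not anyone's real swap-chain code): varying GPU frame times get forced onto fixed 60 Hz refresh boundaries, so the displayed cadence stops matching the render cadence.

```python
import math

# Toy model of double-buffered vsync on a 60 Hz panel (illustrative only).
REFRESH_MS = 1000 / 60  # ~16.67 ms between refreshes

def vsync_present_times(frame_times_ms):
    """Refresh-aligned times at which each finished frame hits the screen."""
    presented, t, last_tick = [], 0.0, 0
    for ft in frame_times_ms:
        t += ft  # moment the GPU finishes rendering this frame
        # wait for the next refresh boundary, never sharing a tick
        tick = max(last_tick + 1, math.ceil(t / REFRESH_MS))
        presented.append(tick * REFRESH_MS)
        last_tick = tick
    return presented

# Frame times that average roughly 92 fps but jitter frame to frame
raw = [8.0, 14.0, 9.0, 12.0, 7.0, 15.0]
shown = vsync_present_times(raw)
intervals = [b - a for a, b in zip(shown, shown[1:])]
# Every displayed interval is exactly one refresh period, even though
# the GPU never produced frames at that cadence: game "time" judders.
```

The point of the sketch: the screen updates at a perfectly even cadence while the render times vary, which is exactly the stretching and compressing of in-game time the post describes.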
# ? May 11, 2016 15:45 |
|
THE DOG HOUSE posted:I'm well aware I'm splitting hairs here, but having the monitor and GPU work together to sync refresh rate with fps is a really elegant solution that we all deserve to have.
|
# ? May 11, 2016 16:05 |
|
DrDork posted:I agree and ubiquitous *sync will be a grand day, indeed. In the meantime, though, I'm still solidly in the camp that, for those of us with limited funds, the $200-$300 extra it'd cost you to get a GSync monitor is probably better spent on a better GPU--that's the difference between a 970 and a 980 Ti with change left over, after all. And yes, resale, price over time, multiple generations, blah blah blah. $300 can still get you at least one, if not two brackets up on the GPU ladder for at least two, possibly three generations if you're not dumb, and in 5 years if we're all still talking about a GSync tax something has gone terribly wrong. Yeah, the cost of gsync really has to drop; it's silly how much it costs.
|
# ? May 11, 2016 16:24 |
|
Hubis posted:If you are consistently operating above Vsync there will be no variation because your rendering queue will be full and so you will stall on present, locking your entire engine to a consistent 15ms update rate. Unless the presentation queue is using "mailbox" mode, in which case the renderer can run independently of vsync without tearing. Judder is the only problem then.
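For reference, "mailbox" here is the latest-frame-wins presentation mode (Vulkan calls it VK_PRESENT_MODE_MAILBOX_KHR). A purely illustrative single-slot sketch of the idea, not any real swap chain:

```python
# Single-slot "mailbox" present queue (toy model, not a real swap chain).
# The renderer submits without blocking; the display grabs the freshest
# frame at each refresh, so stale frames are dropped instead of queued.
class Mailbox:
    def __init__(self):
        self.pending = None  # at most one frame waiting

    def submit(self, frame):
        # Renderer side: overwrite whatever is waiting, never block.
        self.pending = frame

    def on_refresh(self):
        # Display side: take the freshest frame; a real panel would keep
        # showing the previous front buffer if nothing new is pending.
        frame, self.pending = self.pending, None
        return frame

mb = Mailbox()
mb.submit("frame 1")
mb.submit("frame 2")  # "frame 1" is dropped, not queued behind
```

Because the renderer free-runs, completed frames land at arbitrary points in the refresh cycle, which is where the judder mentioned above comes from.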
|
# ? May 11, 2016 16:28 |
|
Blackfyre posted:Yeah, the cost of gsync really has to drop; it's silly how much it costs.
|
# ? May 11, 2016 16:30 |
|
Would gsync help at all if I want to upgrade my center monitor to a 144hz one and still use my two side monitors that are 60hz? Right now I'm running a 120hz monitor in the center and two 60hz on the sides (all 1920x1080 27inch). I get screen tearing on the side monitors but the center one is fine.
|
# ? May 11, 2016 16:45 |
|
Cost wasn't even that much of a concern for me, I could stomach $200 over normal to get the tech into a monitor. The problem is what's available. If I could get a 4k Dell IPS monitor in the upper 20" range I would easily pay upwards of $750 for it. It could even be only 60hz. But, something like that doesn't exist. In theory, the Asus PG27AQ is the right monitor, but it's about $150 too expensive and has atrocious QC. Seriously, just slap a gsync module into the P2715Q and I would be willing to pay a $200 premium on it.
|
# ? May 11, 2016 16:51 |
|
Holyshoot posted:Would gsync help at all if I want to upgrade my center monitor to a 144hz one and still use my two side monitors that are 60hz? Right now I'm running a 120hz monitor in the center and two 60hz on the sides (all 1920x1080 27inch). I get screen tearing on the side monitors but the center one is fine.
|
# ? May 11, 2016 16:57 |
|
Holyshoot posted:Would gsync help at all if I want to upgrade my center monitor to a 144hz one and still use my two side monitors that are 60hz? Right now I'm running a 120hz monitor in the center and two 60hz on the sides (all 1920x1080 27inch). I get screen tearing on the side monitors but the center one is fine. Wouldn't help the side monitors
|
# ? May 11, 2016 16:58 |
|
bull3964 posted:In theory, the Asus PG27AQ is the right monitor, but it's about $150 too expensive and has atrocious QC. Seriously, I was looking into a monitor upgrade, and gently caress being expected to do my own quality control rather than them checking whether they built it properly before shipping it to me. Just not worth the hassle, I'll stick with what I've got.
|
# ? May 11, 2016 16:59 |
|
Holyshoot posted:Would gsync help at all if I want to upgrade my center monitor to a 144hz one and still use my two side monitors that are 60hz? Right now I'm running a 120hz monitor in the center and two 60hz on the sides (all 1920x1080 27inch). I get screen tearing on the side monitors but the center one is fine. In addition to the side monitors not having gsync support, it only works with a single display at a time right now. I don't think that's inherent, and might even be software-fixable, but it's the situation right now.
|
# ? May 11, 2016 17:19 |
|
Ok, currently I just seem to be a slight bit confused. First, Gsync/Freesync, from what I believe I understand, both sort of work by either enabling V-Sync above the monitor's refresh rate and disabling it when the framerate goes below, and/or adjusting the refresh rate to sync the frames when it is below as well? When you have a >60hz screen (120/144Hz range), unless you are going above 120FPS, what benefit does Freesync/G-Sync provide again?
|
# ? May 11, 2016 17:42 |
|
If I pick up a gsync monitor where I have to use a display port. Will I be ok going from display port out of the monitor to mini display port into my graphics card? My 690 has an HDMI, 3 DVI's and a mini display port. No actual display port.
|
# ? May 11, 2016 17:47 |
|
EdEddnEddy posted:Ok, currently I just seem to be a slight bit confused. *sync makes it so the monitor outputs the next picture as soon as it finishes rendering (unless the frame finished faster than the monitor's maximum refresh rate allows). Basically, it eliminates tearing and vsync stutter, because vsync always has to wait for the next refresh.
|
# ? May 11, 2016 17:50 |
|
EdEddnEddy posted:Ok, currently I just seem to be a slight bit confused. They match the refresh rate of the monitor with the output of the GPU, eliminating the need for resource-intensive things like vsync. It makes for smooth fps across a large range of fps, greatly reducing stutters and, as far as I know, completely eliminating all tearing. Best of all, it does this without introducing lag and other lame things. Since all those partial frames don't exist anymore, it's just a very smooth gameplay experience even if your GPU is totally maxed out struggling to render across a very wide range (which allows you to increase settings too). edit: I guess I should note I've never actually owned one so I might be omitting some quirks. I've just played on some in stores, events, etc penus penus penus fucked around with this message at 17:58 on May 11, 2016 |
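The matching behavior described here can be sketched roughly (my own toy model, not AMD's or Nvidia's actual algorithm): the panel refreshes the moment a frame is ready, limited only by the fastest interval it can physically redraw.

```python
# Toy adaptive-sync model (illustrative sketch only): show each frame
# as soon as it completes, but never redraw faster than the panel's
# minimum refresh interval (a 144 Hz panel is assumed here).
MIN_INTERVAL_MS = 1000 / 144  # ~6.94 ms, the panel's fastest redraw

def adaptive_present_times(frame_times_ms):
    """Times at which each frame is shown on an adaptive-sync panel."""
    shown, t, last = [], 0.0, float("-inf")
    for ft in frame_times_ms:
        t += ft  # frame completion time
        when = max(t, last + MIN_INTERVAL_MS)  # show as soon as allowed
        shown.append(when)
        last = when
    return shown

raw = [9.0, 21.0, 12.0, 30.0]  # wildly varying frame times
shown = adaptive_present_times(raw)
intervals = [b - a for a, b in zip(shown, shown[1:])]
# Displayed intervals follow the GPU's real frame times rather than
# snapping to fixed refresh boundaries: no tearing, no vsync judder.
```

(Real panels also have a maximum interval below which they must self-refresh; that case is left out to keep the sketch short.)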
# ? May 11, 2016 17:50 |
|
Holyshoot posted:If I pick up a gsync monitor where I have to use a display port. Will I be ok going from display port out of the monitor to mini display port into my graphics card? My 690 has an HDMI, 3 DVI's and a mini display port. No actual display port. AFAIK mini-displayport is just a different physical standard, but is completely compatible, kindof like miniUSB. Note that I'm often wrong, and might be an idiot.
|
# ? May 11, 2016 18:00 |
|
THE DOG HOUSE posted:They match the refresh rate of the monitor with the output of the GPU, eliminating the need for resource-intensive things like vsync. It makes for smooth fps across a large range of fps, greatly reducing stutters and, as far as I know, completely eliminating all tearing. Best of all, it does this without introducing lag and other lame things. Since all those partial frames don't exist anymore, it's just a very smooth gameplay experience even if your GPU is totally maxed out struggling to render across a very wide range (which allows you to increase settings too). It's pretty much that. It's way smoother, and actually has a greater benefit the lower the frame rate.
|
# ? May 11, 2016 18:04 |
|
Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up. Now for the COD/CS players that game at 200FPS+, I can understand the sync/input lag/stutter benefit, as it is back above the Hz range and tearing would most definitely happen with that high of an FPS/Hz spread.
|
# ? May 11, 2016 18:20 |
|
MrYenko posted:AFAIK mini-displayport is just a different physical standard, but is completely compatible, kindof like miniUSB. Note that I'm often wrong, and might be an idiot. Yeah, that's right. The adapters are simple, cheap and passive. EdEddnEddy posted:Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up. That's not quite correct, because you're not guaranteeing that the frame is delivered when the monitor refreshes. They'd need to sync up exactly. HalloKitty fucked around with this message at 19:37 on May 11, 2016 |
# ? May 11, 2016 18:34 |
|
EdEddnEddy posted:Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up. With gsync, fps = Hz, always. That's what it does: it dynamically changes the refresh rate of the monitor in order to always be paired (in sync) with the buffer of frames coming from the gpu at the given rate.
|
# ? May 11, 2016 18:36 |
|
EdEddnEddy posted:Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up. It definitely tears below the refresh rate. Often the worst tearing is when the GPU is struggling with something below the refresh rate. The problem is the monitor is refreshing at a more or less exact time and the frames themselves are definitely not, regardless of what the average fps is. It depends a lot on the game, of course. Also, how bad it is determines whether you see actual tears going up or down the screen in a pattern vs a faint impression of something being off. If you slowed it down enough, you would see tears though
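That point can be illustrated with a toy model (invented steady-state numbers, not a measurement): with sync off, the buffer flips the instant a frame finishes, even while the panel is mid-scanout, and a mid-scanout flip is a tear.

```python
# Toy no-sync tearing model: a steady 90 fps on a 144 Hz panel still
# tears, because almost every flip lands partway through a scanout.
REFRESH_HZ, FPS = 144, 90          # fps comfortably below refresh rate
refresh_ms = 1000 / REFRESH_HZ     # ~6.94 ms per scanout
frame_ms = 1000 / FPS              # ~11.1 ms between unsynced flips

def count_torn_refreshes(n_frames):
    """Count flips that land mid-scanout and so draw a tear line."""
    torn = 0
    for i in range(1, n_frames + 1):
        phase = (i * frame_ms) % refresh_ms  # position within scanout
        if phase > 1e-9:  # not aligned with a refresh boundary
            torn += 1
    return torn

# Over one second of frames, the large majority of flips still tear
# even though 90 fps is well below 144 Hz.
```

The average fps being under the refresh rate doesn't matter; what matters is that the flips and the scanouts almost never line up.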
|
# ? May 11, 2016 18:37 |
|
EdEddnEddy posted:Last I checked though, tearing only happens when the FPS > Hz of the screen so if you have a 120/144hz screen and are rendering at 90fps, there is no reason to have anySYNC turned on because your screen doesn't have any problem keeping up. It's not that simple: even if a GPU is maintaining an average FPS lower than your monitor's refresh rate, it's not necessarily sending frames to the monitor at the same timing as the screen is getting redrawn. THE DOG HOUSE posted:It definitely tears below the refresh rate. Often the worst tearing is when the GPU is struggling with something below the refresh rate. The problem is the monitor is refreshing at a more or less exact time and the frames themselves are definitely not, regardless of what the average fps is. It depends a lot on the game, of course. Also, how bad it is determines whether you see actual tears going up or down the screen in a pattern vs a faint impression of something being off. If you slowed it down enough, you would see tears though ^ This
|
# ? May 11, 2016 19:34 |
|
GSync is actually best for people who don't upgrade their GPU very often, which makes it unfortunate that it carries such a price premium, because it's a decent way to attempt to future-proof. I am running a GTX770 right now, so if my FPS drops from 60->45 it's really not that big of a deal and far less noticeable. You can get an Acer XB270HU for $450 from their refurb site, which isn't too bad for a 27" 1440 monitor. The Acers tend to have some backlight bleed though, which is unfortunate.
|
# ? May 11, 2016 19:45 |
|
I assume there will still be a titan / ti card this generation?
|
# ? May 11, 2016 19:56 |
|
Like I said earlier, waiting for prosumer GP100. I want to run my X34 at 3440x1440@100Hz constant in all games. The GTX 1080 is probably good for 2560x1440 and 2560x1080, but I doubt it can do 3440x1440 at max settings and a desirable FPS. However, there aren't any reviews out yet.
|
# ? May 11, 2016 19:56 |
|
The primary benefit of gsync and freesync is when your GPU is rendering below the maximum framerate. With a 60hz monitor, if you miss the 16.67ms cutoff you have to wait for the next sync, at 33.33ms. With either adaptive sync technology you can instead present a frame at 20ms. It's best when you're not dipping too much below the refresh rate. A 35fps average is still going to look awful and you'd likely be better off activating half-rate vsync to lock to 30fps, but 50-60 fps ends up much smoother.
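That timing arithmetic, spelled out as a quick check (60 Hz panel, one frame that takes 20 ms):

```python
import math

# A 60 Hz panel gives a 16.67 ms budget; this frame took 20 ms.
refresh_ms = 1000 / 60
frame_ms = 20.0

# Fixed-rate vsync: completion is rounded up to the next refresh
# boundary, so the frame is shown at ~33.33 ms instead of 20 ms.
vsync_shown = math.ceil(frame_ms / refresh_ms) * refresh_ms
# Adaptive sync: the panel refreshes when the frame is ready, at 20 ms.
adaptive_shown = frame_ms
```

Missing the cutoff by a few milliseconds costs a whole extra refresh under vsync, which is exactly why adaptive sync helps most just below the refresh rate.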
|
# ? May 11, 2016 19:58 |
|
I'm trying to crank up my gtx560 using Asus gpu tweak. I've turned the core speed to 910mhz and it's perfectly stable, but it won't let me go any higher. I've tried turning the voltage up to 1.035 but still nothing. Is there any way to go higher?
|
# ? May 11, 2016 20:01 |
|
Lockback posted:You can get an Acer XB270HU for $450 from their refurb site, which isn't too bad for a 27" 1440 monitor. The Acers tend to have some backlight bleed though, which is unfortunate. They pack their refurbs like poo poo though, I think it's deliberate to try and get Fedex/UPS to pay out insurance on crappy panels. Their retail units are packed much, much better, and that's the most obvious reason why they'd do that.
|
# ? May 11, 2016 20:02 |
|
Lockback posted:GSync is actually best for people who don't upgrade their GPU very often, which makes it unfortunate that it carries such a price premium, because it's a decent way to attempt to future-proof. I am running a GTX770 right now, so if my FPS drops from 60->45 it's really not that big of a deal and far less noticeable. As long as nothing drastic happens, people who are trying to keep their total cost of ownership low, especially ones who don't buy/flip cards frequently, should probably go AMD. Freesync is going to make the biggest difference to people trying to get a good experience out of a marginal card, while gsync is functionally irrelevant because of the price tag. If historical evidence is anything to go by, getting at least two good years out of a mid-range/old top-range card (7970 once the 290 was out, 290 once the 970 was out) and probably another two marginal years after that (the 7970 is still a pretty reasonable cheap 1080p card) is totally doable. I expect my 290 to still be putting in good service in a friend's computer in 2018 and still running in 2020 (okay, it's replacing a GTX 430, it's going to last till it causes the heat death of the universe, but still).
|
# ? May 11, 2016 20:05 |