|
Mr E posted:My 1080 Armor is whisper quiet for the most part and my XB271HU looks to have no errors. Do I need to do anything besides turn on GSync in the control panel and turn off Vsync in games to take advantage of it? Make sure that you've set the refresh rate to 144hz in the NVIDIA control panel.
|
# ? Jun 25, 2016 11:46 |
|
|
A Dutch tech blog left this RX480 benchmark online, but since I have no idea how to read benchmarks outside of apples to apples higher is better (most of the time), I dunno if it's good or not. http://ashesofthesingularity.com/metaverse#/personas/9c8acff5-95f0-49f5-a319-baccbcd978e6/match-details/530a27bd-2b27-43c7-8204-5887517b93ad Someone who knows how to read benchmarks, please say if I should be excited or not.
|
# ? Jun 25, 2016 12:17 |
|
Geemer posted:A Dutch tech blog left this RX480 benchmark online, but since I have no idea how to read benchmarks outside of apples to apples higher is better (most of the time), I dunno if it's good or not. According to a quick googles, 3500 for the RX480 and 5900 for a 1080 on the same settings (1080p crazy). No overclocks included, so it's not terribly useful. Seems to be around 60% of the 1080's performance.
|
# ? Jun 25, 2016 12:30 |
|
Kithyen posted:
Here is my score for comparison, which is an EVGA SC with +50 base clock and +400 memory. It gets up to 72-75C fairly quickly and then hangs there, but the fan doesn't even turn on until it is over 60C so that's by design. For comparison I had a SC 1080 running at roughly the same speed (except the full speed memory) and it got 22,000 GPU score. Is there anyone with some magical success story about using the new overclock-per-voltage-level option available with the new Pascal cards? I just use NVIDIA Inspector because I prefer its Windows UI over EVGA's garbage software that makes my Delete key stop working while it is running.
|
# ? Jun 25, 2016 14:06 |
|
Really, whether the RX 480 is good is all dependent on how well it overclocks. There are leaks that indicate it runs 1.1v at stock/boost clocks, but between the two leaks that suggest this, the variance in voltage for the exact same clock is huge (1.08 vs 1.135), so again it seems AMD had some issues validating Polaris and has improved yields by setting the voltage to whatever each chip needs for consistent clocks, likely making overclocking headroom and power draw quite variable depending on the silicon lottery. I guess it's possible golden samples will have no issue hitting the fabled 1.5GHz+, but for now it kind of looks like AMD's new roadmap will be to do a basic run of Polaris now, then a refresh once the process issues are worked out in the 500 series with Vega, with Navi following on 7nm in 2018 for the 600 series, if the 7nm Zen server chip is to be believed.
|
# ? Jun 25, 2016 14:09 |
|
What do you think would give me the best opportunity cost for better visuals at this point: Getting a 1080 GTX and sticking with my 1080p monitor, or staying with my 980ti and investing in a GSync monitor at 1440p?
|
# ? Jun 25, 2016 15:00 |
|
CapnBry posted:Here is my score for comparison, which is a EVGA SC with +50 base clock and +400 memory. It gets up to 72-75C fairly quickly and then hangs there, but the fan doesn't even turn on until it is over 60C so that's by design. For comparison I had a SC 1080 running at roughly the same speed (except the full speed memory) and it got 22,000 GPU score. The only reason I need a new card to begin with is to get more than one DisplayPort, to drive two 1440p 144hz monitors, otherwise I'd wait it out some more.
|
# ? Jun 25, 2016 15:03 |
exquisite tea posted:What do you think would give me the best opportunity cost for better visuals at this point: Getting a 1080 GTX and sticking with my 1080p monitor, or staying with my 980ti and investing in a GSync monitor at 1440p? The monitor, easily.
|
|
# ? Jun 25, 2016 15:07 |
|
Thanks for the advice. Without trying to get into one of those FPS flamewars the internet loves to have, my eyes tend to be really sensitive to frame drops, especially below 45ish. Does GSync really help to smooth out that curve so that even if my 980ti did drop frames to below 45 in visually dense areas, I wouldn't notice it as much? I guess what I'm asking is whether anybody who has a GSync monitor is like me and how much it helped them get a smoother gaming experience.
|
# ? Jun 25, 2016 15:14 |
|
G-Sync makes things very smooth because panel refresh rate adapts to the frame rate of the game. So you won't have tearing, nor stuttering when VSync forces frames on a rigid timing schedule. When frames are timed properly, things look way smoother at lower framerates.
|
# ? Jun 25, 2016 15:24 |
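A toy timeline makes the point above concrete. This is only a sketch — the 60 Hz panel, the steady 45 fps game, and the instant-scanout assumption are all simplifications, not a model of a real display pipeline:

```python
# With a fixed 60 Hz refresh plus VSync, a finished frame is held until
# the next 16.7 ms scanout boundary, so a 45 fps stream shows frames at
# uneven intervals. With adaptive refresh the panel scans out when the
# frame is ready, so display intervals match render intervals.
import math

REFRESH_HZ = 60.0
SCANOUT = 1000.0 / REFRESH_HZ  # ms per refresh at 60 Hz

def display_times(frame_ready_ms, adaptive):
    """Time each frame actually appears on screen, in ms."""
    shown = []
    for t in frame_ready_ms:
        if adaptive:
            shown.append(t)  # panel refreshes when the frame is ready
        else:
            # wait for the next fixed scanout boundary
            shown.append(math.ceil(t / SCANOUT) * SCANOUT)
    return shown

def intervals(ts):
    return [round(b - a, 1) for a, b in zip(ts, ts[1:])]

# A steady 45 fps game: a new frame is ready every 22.2 ms.
frames = [i * (1000.0 / 45.0) for i in range(1, 6)]
fixed = display_times(frames, adaptive=False)
vrr = display_times(frames, adaptive=True)

print("fixed refresh:", intervals(fixed))  # uneven: [16.7, 16.7, 33.3, 16.7]
print("adaptive:     ", intervals(vrr))    # steady: [22.2, 22.2, 22.2, 22.2]
```

Same frame rate in both cases; only the on-screen pacing differs, which is the "frames timed properly" part.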
|
Yes. It's hard to explain, but normally when the framerate drops, your mouse input is also affected and becomes imprecise. With G-Sync that's eliminated, the movement is always buttery smooth, and you don't even notice the framerate nearly as much.
|
# ? Jun 25, 2016 15:26 |
|
exquisite tea posted:What do you think would give me the best opportunity cost for better visuals at this point: Getting a 1080 GTX and sticking with my 1080p monitor, or staying with my 980ti and investing in a GSync monitor at 1440p? Definitely the monitor. You'd probably not notice much difference with the 1080 because your 980ti is already maxing out your monitor in most games at 1080p. (unless it's a 120/144hz monitor, maybe)
|
# ? Jun 25, 2016 15:34 |
|
Has there been any sign of Nvidia relenting in their refusal to support the obviously-more-likely-to-gain-widespread-acceptance FreeSync technology? It seems like FreeSync could be awesome for HTPCs, both for in-home game streaming and for watching content with frame rates that don't really match the monitor (all your 23.976-or-whatever-framerate movies would play correctly).
|
# ? Jun 25, 2016 15:38 |
|
pigdog posted:Yes. It's hard to explain, but normally when the framerate drops, your mouse input is also affected and becomes imprecise. This only happens in poo poo games with poo poo engines though.
|
# ? Jun 25, 2016 15:46 |
|
PBCrunch posted:Has there been any sign of Nvidia relenting in their refusal to support the obviously-more-likely-to-gain-widespread-acceptance FreeSync technology? No, and I doubt they would add FreeSync support until AMD starts significantly cutting into their market share and being locked into the Nvidia platform becomes less tenable. I find it very hard to believe that they would support only Gsync indefinitely, though, considering the entire rest of the industry is settled on FreeSync as the standard.
|
# ? Jun 25, 2016 15:51 |
|
They'll only jump once Intel supports Adaptive Sync on nearly all of their GPUs, and Intel is dragging their feet on that, partially waiting to see what AMD pulls off. Most unimpressive Mexican standoff ever
|
# ? Jun 25, 2016 15:54 |
|
They'll have to add support sooner or later, because Adaptive-Sync is part of the DisplayPort standard now (as of 1.2a)
|
# ? Jun 25, 2016 15:54 |
|
It's optional in DisplayPort
|
# ? Jun 25, 2016 15:58 |
|
Combat Pretzel posted:The 1080 of yours was also an EVGA SC? If the difference is just 15%, I may opt for an 1070 after all. I couldn't really decide, because I was considering to go with the 1080 for VR readiness, but 15% don't cut it. Are there areas the 1080 has bigger advantages than 15%? I suppose in VR specific stuff? One thing I will say about NVIDIA vs AMD, NVIDIA seems to hype a lot of features of their GPUs that are probably just software additions but they get programming nerds like me excited. AMD seems to say "Here is our hardware and it is really good and with this generation exceptionally affordable". Selling me something that works good and is affordable doesn't excite me as much as concepts like Single Pass Stereo, Lens Matched Shading, Simultaneous Multi-Projection, Ansel Screenshots, and FAST SYNC (low latency V-SYNC). Of course, these are the sort of things that may or may not ever be used by real games. There is also a history of these sorts of features that don't really pan out all that well in the real-world such as stereoscopic 3D and most uses of GPU-based PhysX. CapnBry fucked around with this message at 16:11 on Jun 25, 2016 |
# ? Jun 25, 2016 15:59 |
It looks like I should have both Gsync and Vsync on in the Nvidia control panel, and Vsync off in games? Also, I guess in games like Skyrim I still need to limit the FPS to 59 or so? Finally - I'm assuming the Gsync when windowed option also works for borderless?
|
|
# ? Jun 25, 2016 16:07 |
|
pigdog posted:Yes. It's hard to explain, but normally when the framerate drops, your mouse input is also affected and becomes imprecise. With G-Sync that's eliminated, the movement is always buttery smooth, and you don't even notice the framerate nearly as much. What *sync does is even out the latency between a frame being made and it being displayed. So frames are displayed at the right time relative to each other and the movement doesn't jitter between too fast and too slow. It's real noticeable.
|
# ? Jun 25, 2016 16:17 |
|
CapnBry posted:One thing I will say about NVIDIA vs AMD, NVIDIA seems to hype a lot of features of their GPUs that are probably just software additions but they get programming nerds like me excited. AMD seems to say "Here is our hardware and it is really good and with this generation exceptionally affordable". Selling me something that works good and is affordable doesn't excite me as much as concepts like Single Pass Stereo, Lens Matched Shading, Simultaneous Multi-Projection, Ansel Screenshots, and FAST SYNC (low latency V-SYNC). Of course, these are the sort of things that may or may not ever be used by real games. There is also a history of these sorts of features that don't really pan out all that well in the real-world such as stereoscopic 3D and most uses of GPU-based PhysX. NVidia's got what gamers crave. It's got electrolytes
|
# ? Jun 25, 2016 16:22 |
|
We nerds like to read about R&D Side Quests.
|
# ? Jun 25, 2016 16:29 |
|
xthetenth posted:What *sync does is even out the latency between a frame being made and it being displayed. So frames are displayed at the right time relative to each other and the movement doesn't jitter between too fast and too slow. So theoretically with a G-Sync monitor I could grind any old GPU down to N64-level framerates and it would still look pretty smooth because there wouldn't be any stuttering as it oscillated between different frame rates? Sorry if these come off as truly stupid questions. I don't have any technical knowledge, just money.
|
# ? Jun 25, 2016 16:35 |
|
exquisite tea posted:So theoretically with a G-Sync monitor I could grind any old GPU down to N64-level framerates and it would still look pretty smooth because there wouldn't be any stuttering as it oscillated between different frame rates? Sorry if these come off as truly stupid questions. I don't have any technical knowledge, just money. It'll eventually become a slideshow, there's no getting around that, but you won't have the same degree of things moving not quite smoothly in that slideshow. I've found it's a big help in 40-60 fps areas but I tend to not want to go lower.
|
# ? Jun 25, 2016 16:45 |
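On the slideshow floor: variable-refresh panels also have a minimum refresh rate, and below it the module re-scans the previous frame rather than letting the panel fall out of range. A sketch of the idea only — the 30 Hz floor is a typical figure rather than any particular monitor's spec, and real low-framerate handling is more sophisticated than this:

```python
# Below the panel's minimum refresh, redraw each frame enough times that
# the effective refresh lands back inside the panel's supported range.
PANEL_MIN_HZ = 30

def refresh_for(fps):
    """Return (times each frame is drawn, effective panel refresh in Hz)."""
    mult = 1
    while fps * mult < PANEL_MIN_HZ:
        mult += 1  # re-scan the previous frame one more time
    return mult, fps * mult

for fps in (90, 45, 20, 12):
    mult, hz = refresh_for(fps)
    print(f"{fps:>3} fps -> draw each frame {mult}x, panel runs at {hz} Hz")
```

The game still produces 12 fps — that part really is a slideshow — but the panel never has to hold a refresh longer than it physically can.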
|
Mr E posted:It looks like I should have both Gsync and Vsync on in the Nvidia control panel, and Vsync off in games? Also, I guess in games like Skyrim I still need to limit the FPS to 59 or so? Finally - I'm assuming the Gsync when windowed option also works for borderless?
|
# ? Jun 25, 2016 17:30 |
|
Gsync will work without vsync. As of a driver update sometime in 2015, turning on vsync with gsync enabled means vsync only kicks in when fps >= refresh. Prior to that, vsync would always turn on once fps >= refresh while gsync was enabled, even if you'd disabled it in the NVCP.
|
# ? Jun 25, 2016 17:53 |
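On the earlier Skyrim question about capping at 59: the point of a cap just below refresh is to keep the frame rate inside the G-Sync window so the VSync path at the top never engages. A minimal sleep-based limiter shows the shape of the technique — real in-game limiters use higher-resolution timing and busy-waiting, so treat this as a sketch:

```python
import time

def run_capped(render_frame, cap_fps=59, frames=3):
    """Render `frames` frames, sleeping off leftover budget to hold cap_fps."""
    target = 1.0 / cap_fps  # per-frame time budget in seconds
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)  # burn the rest of the frame budget

run_capped(lambda: None)  # a no-op "frame" still takes ~1/59 s per loop
```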
|
Gonkish posted:This latest driver (368.39) is loving godawful. I'm on dual 760s right now and I had to roll back to the previous driver just to get basic stability back. It was crashing randomly, even doing basic poo poo in Windows (like playing Youtube videos) and poo poo. Swapping back solved all of that instantly and things are back to normal. There's something seriously fucky going on with 368.39, and the 10xx series is stuck with it. This sounds a lot like my problem. Is there a particular stable version I can roll back to?
|
# ? Jun 25, 2016 17:54 |
|
CapnBry posted:Yup the 1080 was a EVGA SC as well. I was a little disappointed in the numbers as well. I'm not quite sure I fully understand how the clocks work on Pascal. I had it set to +100 base clock and 110% power target. That should have given me 1847+100=1947MHz but I was seeing 2025-2050MHz? Seeing people post their 2GHz 1070s with GPU scores of 19,000-20,000 I decided the cost wasn't worth it, sold the 1080 for $100 more than I paid for it, and got the 1070. Every Vive title I've played on it doesn't even seem to tax the GPU at all (although Witcher 3 with Ultra settings in Desktop Theater is a stuttery mess). I just ran 3D Mark and had a GPU score of 20,507 on the first try with: Core Voltage 10%, Power Limit 111%, Core Clock +75, Memory Clock +700. Got a GPU score of 20,746 with: Core Voltage 100%, Power Limit 111%, Core Clock +100, Memory Clock +700 (core clock 2,088 MHz, memory bus clock 2,352 MHz). Either I'm missing something or upping core voltage on a 1070 does pretty much nothing. GPU temp never went above 65-66C in either test, and fan speed capped out at 49%. Not sure what people get with 1080s, but my total score is around 14.7k, which is just under what 3D Mark lists for 4k gaming. I'm on a 1440p monitor though, so it's more than enough for me.
|
# ? Jun 25, 2016 18:01 |
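On the "1847+100=1947 but I saw 2025-2050" puzzle in the quote above: the offset shifts the whole voltage/frequency curve, and GPU Boost then opportunistically clocks up in roughly 13 MHz steps while thermal and power headroom last, which is why observed clocks overshoot the naive sum. A toy model only — the ~13 MHz step is a real Pascal figure, but the headroom bin count below is invented for illustration:

```python
RATED_BOOST = 1847  # MHz, the card's advertised boost clock
STEP = 13           # MHz, approximate GPU Boost bin size on Pascal

def observed_clock(offset_mhz, headroom_bins):
    """Rated boost + user offset + opportunistic bins, snapped to STEP."""
    raw = RATED_BOOST + offset_mhz + headroom_bins * STEP
    return (raw // STEP) * STEP

# +100 offset plus ~8 bins of leftover thermal/power headroom lands in
# the 2000s, closer to the observed 2025-2050 than 1847 + 100 = 1947 is.
print(observed_clock(100, 8))
```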
|
This core clock thing, is it the boost? Whenever I look up the 1070, the base clock is around 1600MHz.
|
# ? Jun 25, 2016 18:52 |
|
Still tweaking mine, but I hit 18502 on firestrike with a 24000 graphics score on a 1080. Core at +120 and memory at +400
|
# ? Jun 25, 2016 19:18 |
|
Techreport review of the 1080 is up, finally. Were they buying the card off newegg or something? Anyway, I like how perfectly it's priced: http://techreport.com/review/30281/nvidia-geforce-gtx-1080-graphics-card-reviewed/7 But overall I thought this review made it look a bit less impressive than it seemed to me before. It's definitely a noticeable improvement over the 980Ti but hardly looks like a game changer. Definitely looking forward to some solid 480 overclocking results.
|
# ? Jun 25, 2016 20:28 |
|
Reading that TechReport site, they mention GPU clocking issues with G-Sync at high rates. I just noticed that my GTX 780 doesn't drop entirely to idle at 144hz, but does so at 120hz. The gently caress! At 144hz, GPU clock is like 692MHz and memory 1500MHz. At 120hz, the GPU drops to 324MHz and memory to 162MHz. I'm glad I found out about this.
|
# ? Jun 25, 2016 21:17 |
|
VostokProgram posted:This sounds a lot like my problem. Is there a particular stable version I can roll back to? I'm currently on the previous version, 368.22. Seems to have solved the issues I was having.
|
# ? Jun 25, 2016 21:22 |
|
exquisite tea posted:What do you think would give me the best opportunity cost for better visuals at this point: Getting a 1080 GTX and sticking with my 1080p monitor, or staying with my 980ti and investing in a GSync monitor at 1440p? It's been said, but yeah, I'm 5thing the monitor here. You'd hardly be able to tell the difference with a 1080 at 1080p. I'm afraid to post it again since it was met with derision earlier, but there are 27" 1440p 144hz gsync monitors from a real brand for less than $500 now, finally. Not IPS yet, but it reviewed well for TN. Bodes well; I could never stomach the gsync 1440p monitor cost. Get it down to ... $375 to make up a number, and they have a buyer in me. I feel bad for all the driver woes people are experiencing. I haven't yet, but if a driver can gently caress up, I guarantee it will happen to me!
|
# ? Jun 25, 2016 21:28 |
|
Combat Pretzel posted:Reading that TechReport site, they mention GPU clocking issues with G-Sync at high rates. I just noticed that my GTX 780 doesn't drop entirely to idle at 144hz, but does so at 120hz. The gently caress! I think that's a pretty old bug: https://www.reddit.com/r/nvidia/comments/38vonq/psa_nvidia_users_with_a_high_refresh_rate_monitor/ Another is checking your rgb setting to make sure it wasn't reverted to limited each time you upgrade the driver. I've been doing that for 3 years, hehe. I wonder if there is a place that catalogs all of the known issues.
|
# ? Jun 25, 2016 21:37 |
|
I go to an anime con when the RX 480 hits, rip day one test runs of the thing Overall it seems to be a 290/390x at stock looking at all of the fell-off-the-back-of-a-truck leaks so far, still not enough overclocking data.
|
# ? Jun 25, 2016 21:38 |
|
Combat Pretzel posted:Unless the game has a specific G-Sync setting, you want VSync on, because you need to tell the driver that a frame is ready. You don't need vsync to tell the driver a frame is ready, or vsync-off would never render anything. Vsync means to block frame submission until the previous frame has been scanned out. It's the driver telling something, not being told.
|
# ? Jun 25, 2016 21:54 |
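The submission-side distinction above, as a toy present() model: with VSync on, submission blocks until the next scanout boundary, throttling the renderer to the refresh rate; with VSync off, present() returns immediately. A sketch, not how any real swap chain is implemented:

```python
import math

REFRESH = 1000.0 / 60.0  # ms between scanouts on a 60 Hz panel

def submit_times(render_ms, n_frames, vsync):
    """Time (ms) at which each frame submission returns under a toy present()."""
    t, out = 0.0, []
    for _ in range(n_frames):
        t += render_ms  # frame finishes rendering
        if vsync:
            # present() blocks until the previous frame has scanned out
            t = math.ceil(t / REFRESH) * REFRESH
        out.append(round(t, 1))
    return out

print(submit_times(5.0, 4, vsync=False))  # [5.0, 10.0, 15.0, 20.0]  (200 fps)
print(submit_times(5.0, 4, vsync=True))   # [16.7, 33.3, 50.0, 66.7] (60 fps)
```

Note the driver is doing the throttling by blocking the game's submission — the game never "tells" it anything, which is the correction being made above.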
|
AVeryLargeRadish posted:Correct, it's the Thermaltake Suppressor F1. Zero VGS posted:It seems like it's exactly the same as the Core V1 but with some additional slats cut into the front of it for more airflow, and square power buttons instead of round? The other downside is that the filters on the sides are only removable from the INSIDE, unlike the V21, which is a baffling decision.
|
# ? Jun 25, 2016 21:56 |
|
|
Anime Schoolgirl posted:I go to a motorcycle stunt competition to defend my champion title when the RX 480 hits, rip day one test runs of the thing
|
# ? Jun 25, 2016 21:58 |