Shrinkage
Oct 23, 2010

Mr E posted:

My 1080 Armor is whisper quiet for the most part and my XB271HU looks to have no errors. Do I need to do anything besides turn on GSync in the control panel and turn off Vsync in games to take advantage of it?

Make sure that you've set the refresh rate to 144Hz in the NVIDIA control panel.

Geemer
Nov 4, 2010



A Dutch tech blog left this RX480 benchmark online, but since I have no idea how to read benchmarks beyond apples-to-apples "higher is better" (most of the time), I dunno if it's good or not.
http://ashesofthesingularity.com/metaverse#/personas/9c8acff5-95f0-49f5-a319-baccbcd978e6/match-details/530a27bd-2b27-43c7-8204-5887517b93ad

Someone who knows how to read benchmarks, please say if I should be excited or not.

Setset
Apr 14, 2012
Grimey Drawer

Geemer posted:

A Dutch tech blog left this RX480 benchmark online, but since I have no idea how to read benchmarks beyond apples-to-apples "higher is better" (most of the time), I dunno if it's good or not.
http://ashesofthesingularity.com/metaverse#/personas/9c8acff5-95f0-49f5-a319-baccbcd978e6/match-details/530a27bd-2b27-43c7-8204-5887517b93ad

Someone who knows how to read benchmarks, please say if I should be excited or not.

According to a quick google, 3500 for the RX480 and 5900 for a 1080 on the same settings (1080p Crazy preset). No overclocks included, so it's not terribly useful. Seems to be around 60% of the 1080's performance.

CapnBry
Jul 15, 2002

I got this goin'
Grimey Drawer

Kithyen posted:



Not sure if this is a good place to ask. Installed my 1070 a couple days ago and ran a benchmark, which scored over 10k. Today I tried two more times, because I noticed some slowdown in areas of Mirror's Edge Catalyst that I didn't have the night before. Both runs scored quite a bit lower than I expected. As far as I can tell, neither my CPU nor GPU is throttling. The CPU hits a high of 60C at most and the GPU hasn't gone above 72C.
Do you have an overclock that has reset itself back to stock or something? Maybe check the CPU and GPU score details to see which one has dropped, because that's a rather significant difference.
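If you want to rule out throttling with actual numbers rather than eyeballing it, a quick polling loop like this works while the benchmark runs (assuming nvidia-smi is installed and on your PATH; it ships with the NVIDIA driver):

```python
import subprocess
import time

# Poll clocks, temperature and power draw once a second while a benchmark
# runs. Sagging clocks at steady temperature suggest a power limit;
# clocks stuck at stock suggest the overclock reset itself.
QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.sm,clocks.mem,temperature.gpu,power.draw",
    "--format=csv,noheader",
]

for _ in range(30):  # ~30 seconds of samples
    print(subprocess.check_output(QUERY, text=True).strip())
    time.sleep(1)
```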

Here is my score for comparison, which is an EVGA SC with +50 base clock and +400 memory. It gets up to 72-75C fairly quickly and then hangs there, but the fan doesn't even turn on until it's over 60C, so that's by design. For comparison, I had an SC 1080 running at roughly the same speed (except the full-speed memory) and it got a 22,000 GPU score.


Is there anyone with some magical success story about using the new overclock-per-voltage-level option available on the new Pascal cards? I just use NVIDIA Inspector because I prefer its Windows UI over EVGA's garbage software, which makes my Delete key stop working while it's running.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Really, whether the RX 480 is good all depends on how well it overclocks. There are leaks that indicate it runs 1.1V at stock/boost clocks, but between the two leaks that suggest this, the variance in voltage for the exact same clock is huge (1.08 vs 1.135). So again, it seems AMD had some issues validating Polaris and has improved yields by setting the voltage to whatever each chip needs for consistent clocks, likely making overclocking headroom and power draw quite variable depending on the silicon lottery. I guess it's possible golden samples will have no issue hitting the fabled 1.5GHz+, but for now it kind of looks like AMD's roadmap is to do a basic run of Polaris, then a refresh in the 500 series with Vega once the process issues are worked out, with Navi following on 7nm in 2018 for the 600 series, if the 7nm Zen server chip is to be believed.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


What do you think would give me the best value for better visuals at this point: getting a GTX 1080 and sticking with my 1080p monitor, or staying with my 980 Ti and investing in a 1440p G-Sync monitor?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

CapnBry posted:

Here is my score for comparison, which is an EVGA SC with +50 base clock and +400 memory. It gets up to 72-75C fairly quickly and then hangs there, but the fan doesn't even turn on until it's over 60C, so that's by design. For comparison, I had an SC 1080 running at roughly the same speed (except the full-speed memory) and it got a 22,000 GPU score.

That 1080 of yours was also an EVGA SC? If the difference is just 15%, I may opt for a 1070 after all. I couldn't really decide, because I was considering going with the 1080 for VR readiness, but 15% doesn't cut it. Are there areas where the 1080 has a bigger advantage than 15%? I suppose in VR-specific stuff?

The only reason I need a new card to begin with is to get more than one DisplayPort, to drive two 1440p 144Hz monitors; otherwise I'd wait it out some more.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

exquisite tea posted:

What do you think would give me the best value for better visuals at this point: getting a GTX 1080 and sticking with my 1080p monitor, or staying with my 980 Ti and investing in a 1440p G-Sync monitor?

The monitor, easily.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Thanks for the advice. Without trying to get into one of those FPS flamewars the internet loves to have, my eyes tend to be really sensitive to frame drops, especially below 45ish. Does G-Sync really help smooth out that curve, so that even if my 980 Ti did drop below 45fps in visually dense areas, I wouldn't notice it as much? I guess what I'm asking is whether anybody with a G-Sync monitor is as sensitive as me, and how much it helped them get a smoother gaming experience.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
G-Sync makes things very smooth because the panel's refresh rate adapts to the frame rate of the game. So you get no tearing, and none of the stuttering you see when VSync forces frames onto a rigid timing schedule. When frames are timed properly, things look way smoother even at lower framerates.
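If it helps to see why the rigid schedule stutters, here's a toy model (made-up frame times, not real driver logic) of when frames actually reach the screen on a fixed 60Hz panel versus a variable-refresh one:

```python
# Toy model: when do frames appear on screen? Fixed 60Hz vs variable refresh.
# Render times below are invented for illustration.
render_ms = [14, 19, 23, 17, 25, 15]
REFRESH_MS = 1000 / 60  # ~16.7ms per scanout on a fixed 60Hz panel

# Fixed refresh + VSync: a finished frame waits for the next refresh tick.
t, fixed = 0.0, []
for r in render_ms:
    t += r  # frame finishes rendering at time t
    fixed.append((t // REFRESH_MS + 1) * REFRESH_MS)  # next scanout boundary

# Variable refresh (the G-Sync model): the panel refreshes when the frame
# is ready, so display time equals completion time (within panel limits).
t, variable = 0.0, []
for r in render_ms:
    t += r
    variable.append(t)

print("fixed   :", [round(x, 1) for x in fixed])
print("variable:", [round(x, 1) for x in variable])
# On the fixed panel the gaps between frames quantize to 16.7 or 33.3ms,
# so even small render-time variation turns into visible judder.
```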

pigdog
Apr 23, 2004

by Smythe
Yes. It's hard to explain, but normally when the framerate drops, your mouse input is also affected and becomes imprecise. With G-Sync that's eliminated, the movement is always buttery smooth, and you don't even notice the framerate nearly as much.

Deuce
Jun 18, 2004
Mile High Club

exquisite tea posted:

What do you think would give me the best value for better visuals at this point: getting a GTX 1080 and sticking with my 1080p monitor, or staying with my 980 Ti and investing in a 1440p G-Sync monitor?

Definitely the monitor. You'd probably not notice much difference with the 1080, because your 980 Ti is already maxing out your monitor in most games at 1080p (unless it's a 120/144Hz monitor, maybe).

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
Has there been any sign of Nvidia relenting in their refusal to support the obviously-more-likely-to-gain-widespread-acceptance FreeSync technology?

It seems like FreeSync could be awesome for HTPCs, both for in-home game streaming and for watching content with frame rates that don't really match the monitor (all your 23.976-or-whatever-framerate movies would play correctly).
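The movie case is really just arithmetic; a quick sketch of why 23.976fps film judders on a fixed 60Hz panel but plays evenly with adaptive sync:

```python
from fractions import Fraction

film_fps = Fraction(24000, 1001)  # NTSC film rate, ~23.976fps
panel_hz = 60                     # fixed-refresh panel

# On a fixed 60Hz panel, each film frame must occupy a whole number of
# refreshes. 60 / 23.976 ~= 2.5, so frames alternate between 3 and 2
# refreshes (3:2 pulldown): half the frames sit on screen 50% longer.
print(float(panel_hz / film_fps))   # ~2.5025 refreshes per film frame

# With adaptive sync the panel just refreshes at the film rate, so every
# frame is shown for an identical ~41.7ms.
print(float(1000 / film_fps))       # ~41.71ms per frame, uniform
```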

Truga
May 4, 2014
Lipstick Apathy

pigdog posted:

Yes. It's hard to explain, but normally when the framerate drops, your mouse input is also affected and becomes imprecise.

This only happens in poo poo games with poo poo engines though.

wicka
Jun 28, 2007


PBCrunch posted:

Has there been any sign of Nvidia relenting in their refusal to support the obviously-more-likely-to-gain-widespread-acceptance FreeSync technology?

It seems like FreeSync could be awesome for HTPCs, both for in-home game streaming and for watching content with frame rates that don't really match the monitor (all your 23.976-or-whatever-framerate movies would play correctly).

No, and I doubt they would add FreeSync support until AMD starts significantly cutting into their market share and being locked into the Nvidia platform becomes less tenable. I find it very hard to believe that they would support only G-Sync indefinitely, though, considering the entire rest of the industry has settled on FreeSync as the standard.

Anime Schoolgirl
Nov 28, 2002

They'll only jump when Intel supports Adaptive Sync on nearly all their GPUs, and Intel is dragging its feet on that, partially waiting to see what AMD pulls off.

Most unimpressive Mexican Standoff ever

Truga
May 4, 2014
Lipstick Apathy
They'll have to add support sooner or later, because it's part of a new version of DisplayPort now, so

fozzy fosbourne
Apr 21, 2010

It's optional in DisplayPort

CapnBry
Jul 15, 2002

I got this goin'
Grimey Drawer

Combat Pretzel posted:

The 1080 of yours was also an EVGA SC? If the difference is just 15%, I may opt for an 1070 after all. I couldn't really decide, because I was considering to go with the 1080 for VR readiness, but 15% don't cut it. Are there areas the 1080 has bigger advantages than 15%? I suppose in VR specific stuff?
Yup, the 1080 was an EVGA SC as well. I was a little disappointed in the numbers too. I'm not quite sure I fully understand how the clocks work on Pascal: I had it set to +100 base clock and 110% power target, which should have given me 1847+100=1947MHz, but I was seeing 2025-2050MHz? Seeing people post their 2GHz 1070s with GPU scores of 19,000-20,000, I decided the cost wasn't worth it, sold the 1080 for $100 more than I paid for it, and got the 1070. No Vive title I've played on it seems to tax the GPU at all (although Witcher 3 with Ultra settings in Desktop Theater is a stuttery mess).
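For what it's worth, my understanding (an assumption on my part, so salt accordingly) is that the offset shifts Pascal's whole voltage/frequency curve, and the rated boost figure is a typical value rather than a cap: GPU Boost 3.0 keeps stepping the clock up in roughly 13MHz bins while it has power and thermal headroom, which would explain the 2025-2050MHz readings. Back-of-the-envelope:

```python
# Illustrative arithmetic only: the bin counts are hypothetical, and the
# rated boost is the card's advertised figure from the post above.
RATED_BOOST_MHZ = 1847   # advertised boost clock
OFFSET_MHZ = 100         # user offset; shifts the whole V/F curve
BIN_MHZ = 13             # approximate Pascal clock step size

naive = RATED_BOOST_MHZ + OFFSET_MHZ      # the 1947MHz expectation
# With headroom, GPU Boost opportunistically climbs further in bins:
for extra_bins in range(5, 9):
    print(naive + extra_bins * BIN_MHZ)   # 2012, 2025, 2038, 2051 MHz
```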

One thing I will say about NVIDIA vs AMD: NVIDIA seems to hype a lot of features of their GPUs that are probably just software additions, but they get programming nerds like me excited. AMD seems to say "Here is our hardware, it is really good, and with this generation it's exceptionally affordable." Selling me something that works well and is affordable doesn't excite me as much as concepts like Single Pass Stereo, Lens Matched Shading, Simultaneous Multi-Projection, Ansel Screenshots, and FAST SYNC (low-latency V-SYNC). Of course, these are the sort of things that may or may not ever be used by real games. There is also a history of these sorts of features not really panning out in the real world, such as stereoscopic 3D and most uses of GPU-based PhysX.

CapnBry fucked around with this message at 16:11 on Jun 25, 2016

Mr E
Sep 18, 2007

It looks like I should have both Gsync and Vsync on in the Nvidia control panel, and Vsync off in games? Also, I guess in games like Skyrim I still need to limit the FPS to 59 or so? Finally - I'm assuming the Gsync when windowed option also works for borderless?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

pigdog posted:

Yes. It's hard to explain, but normally when the framerate drops, your mouse input is also affected and becomes imprecise. With G-Sync that's eliminated, the movement is always buttery smooth, and you don't even notice the framerate nearly as much.

What *sync does is even out the latency between a frame being made and it being displayed. So frames are displayed at the right time relative to each other and the movement doesn't jitter between too fast and too slow.

It's real noticeable.

Setset
Apr 14, 2012
Grimey Drawer

CapnBry posted:

One thing I will say about NVIDIA vs AMD: NVIDIA seems to hype a lot of features of their GPUs that are probably just software additions, but they get programming nerds like me excited. AMD seems to say "Here is our hardware, it is really good, and with this generation it's exceptionally affordable." Selling me something that works well and is affordable doesn't excite me as much as concepts like Single Pass Stereo, Lens Matched Shading, Simultaneous Multi-Projection, Ansel Screenshots, and FAST SYNC (low-latency V-SYNC). Of course, these are the sort of things that may or may not ever be used by real games. There is also a history of these sorts of features not really panning out in the real world, such as stereoscopic 3D and most uses of GPU-based PhysX.

NVidia's got what gamers crave. It's got electrolytes

fozzy fosbourne
Apr 21, 2010

We nerds like to read about R&D Side Quests.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


xthetenth posted:

What *sync does is even out the latency between a frame being made and it being displayed. So frames are displayed at the right time relative to each other and the movement doesn't jitter between too fast and too slow.

It's real noticeable.

So theoretically with a G-Sync monitor I could grind any old GPU down to N64-level framerates and it would still look pretty smooth because there wouldn't be any stuttering as it oscillated between different frame rates? Sorry if these come off as truly stupid questions. I don't have any technical knowledge, just money.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

exquisite tea posted:

So theoretically with a G-Sync monitor I could grind any old GPU down to N64-level framerates and it would still look pretty smooth because there wouldn't be any stuttering as it oscillated between different frame rates? Sorry if these come off as truly stupid questions. I don't have any technical knowledge, just money.

It'll eventually become a slideshow, there's no getting around that, but you won't get the same uneven motion on top of that slideshow. I've found it's a big help in the 40-60 fps range, but I tend not to want to go lower.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Mr E posted:

It looks like I should have both Gsync and Vsync on in the Nvidia control panel, and Vsync off in games? Also, I guess in games like Skyrim I still need to limit the FPS to 59 or so? Finally - I'm assuming the Gsync when windowed option also works for borderless?
Unless the game has a specific G-Sync setting, you want VSync on, because you need to tell the driver that a frame is ready.

fozzy fosbourne
Apr 21, 2010

G-Sync will work without VSync. As of a driver update sometime in 2015, turning on VSync with G-Sync enabled will only engage it when fps >= refresh. Prior to that, VSync would always kick in when G-Sync was enabled and fps >= refresh, even if it was disabled in NVCP.
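Encoded as a little decision function (this just transcribes the behavior described above for post-2015 drivers; the real logic lives in NVIDIA's driver, so treat it as illustrative):

```python
def sync_mode(gsync_on: bool, vsync_on: bool, fps: float, refresh: float) -> str:
    """What governs frame delivery, per the behavior described above."""
    if gsync_on and fps < refresh:
        return "gsync"                    # panel tracks the frame rate
    if gsync_on and vsync_on:
        return "vsync"                    # capped at the refresh ceiling
    if gsync_on:
        return "uncapped (tearing possible above refresh)"
    return "vsync" if vsync_on else "uncapped (tearing)"

print(sync_mode(True, True, 90, 144))    # gsync
print(sync_mode(True, True, 200, 144))   # vsync
print(sync_mode(True, False, 200, 144))  # uncapped (tearing possible above refresh)
```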

Yaoi Gagarin
Feb 20, 2014

Gonkish posted:

This latest driver (368.39) is loving godawful. I'm on dual 760s right now and I had to roll back to the previous driver just to get basic stability back. It was crashing randomly, even doing basic poo poo in Windows (like playing Youtube videos) and poo poo. Swapping back solved all of that instantly and things are back to normal. There's something seriously fucky going on with 368.39, and the 10xx series is stuck with it.

This sounds a lot like my problem. Is there a particular stable version I can roll back to?

Evil Fluffy
Jul 13, 2009

Scholars are some of the most pompous and pedantic people I've ever had the joy of meeting.

CapnBry posted:

Yup, the 1080 was an EVGA SC as well. I was a little disappointed in the numbers too. I'm not quite sure I fully understand how the clocks work on Pascal: I had it set to +100 base clock and 110% power target, which should have given me 1847+100=1947MHz, but I was seeing 2025-2050MHz? Seeing people post their 2GHz 1070s with GPU scores of 19,000-20,000, I decided the cost wasn't worth it, sold the 1080 for $100 more than I paid for it, and got the 1070. No Vive title I've played on it seems to tax the GPU at all (although Witcher 3 with Ultra settings in Desktop Theater is a stuttery mess).

I just ran 3D Mark and had a GPU score of 20,507 on the first try with:
Core Voltage 10%
Power Limit 111%
Core Clock +75
Memory Clock +700

Got a GPU score of 20,746 with:
Core Voltage 100%
Power Limit 111%
Core Clock +100
Memory Clock +700

Core clock 2,088 MHz
Memory bus clock 2,352 MHz

Either I'm missing something or upping the core voltage on a 1070 does pretty much nothing. GPU temp never went above 65-66C in either test, apparently. Fan speed capped out at 49% as well. Not sure what people get with 1080s, but my total score is around 14.7k, which is just under what 3DMark lists for 4K gaming. I'm on a 1440p monitor though, so it's more than enough for me. :shrug:
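Putting a number on "pretty much nothing": the gap between those two runs is about one percent, which is within ordinary 3DMark run-to-run variance.

```python
low_voltage_run, max_voltage_run = 20507, 20746  # GPU scores from above
gain = (max_voltage_run - low_voltage_run) / low_voltage_run * 100
print(f"{gain:.2f}% higher GPU score")  # ~1.17%, i.e. noise territory
```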

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
This core clock thing, is it the boost? Whenever I look up the 1070, the base clock is around 1600MHz.

Enos Cabell
Nov 3, 2004


Still tweaking mine, but I hit 18502 on Fire Strike with a 24000 graphics score on a 1080. Core at +120 and memory at +400.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
The TechReport review of the 1080 is up, finally. Were they buying the card off Newegg or something? Anyway, I like how perfectly it's priced:


http://techreport.com/review/30281/nvidia-geforce-gtx-1080-graphics-card-reviewed/7

But overall I thought this review made it look a bit less impressive than it had seemed to me before. It's definitely a noticeable improvement over the 980 Ti, but hardly a game changer. Definitely looking forward to some solid 480 overclocking results.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Reading that TechReport piece, they mention GPU clocking issues with G-Sync at high refresh rates. I just noticed that my GTX 780 doesn't drop entirely to idle at 144Hz, but does so at 120Hz. The gently caress!

At 144Hz, the GPU clock sits around 692MHz and memory at 1500MHz. At 120Hz, the GPU drops to 324MHz and memory to 162MHz.

I'm glad I found out about this.

Gonkish
May 19, 2004

VostokProgram posted:

This sounds a lot like my problem. Is there a particular stable version I can roll back to?

I'm currently on the previous version, 368.22. Seems to have solved the issues I was having.

penus penus penus
Nov 9, 2014

by piss__donald

exquisite tea posted:

What do you think would give me the best value for better visuals at this point: getting a GTX 1080 and sticking with my 1080p monitor, or staying with my 980 Ti and investing in a 1440p G-Sync monitor?

It's been said, but yeah, I'm 5thing the monitor here. You would hardly be able to tell the difference with a 1080 at 1080p.

I'm afraid to post it again since it was met with derision earlier, but there are 27" 1440p 144Hz G-Sync monitors from a real brand for less than $500 now, finally. Not IPS yet, but it reviewed well for a TN. Bodes well; I could never stomach the usual G-Sync 1440p monitor cost. Get it down to ... $375, to make up a number, and they have a buyer in me.

I feel bad for all the driver woes people are experiencing. I haven't had any yet, but if a driver can gently caress up, I guarantee it will happen to me!

fozzy fosbourne
Apr 21, 2010

Combat Pretzel posted:

Reading that TechReport piece, they mention GPU clocking issues with G-Sync at high refresh rates. I just noticed that my GTX 780 doesn't drop entirely to idle at 144Hz, but does so at 120Hz. The gently caress!

At 144Hz, the GPU clock sits around 692MHz and memory at 1500MHz. At 120Hz, the GPU drops to 324MHz and memory to 162MHz.

I'm glad I found out about this.

I think that's a pretty old bug:
https://www.reddit.com/r/nvidia/comments/38vonq/psa_nvidia_users_with_a_high_refresh_rate_monitor/

Another gotcha is checking your RGB setting to make sure it wasn't reverted to Limited each time you upgrade the driver. I've been doing that for 3 years, hehe.

I wonder if there is a place that catalogs all of the known issues.

Anime Schoolgirl
Nov 28, 2002

I go to an anime con when the RX 480 hits, rip day one test runs of the thing

Overall it seems to be a 290/390X at stock, looking at all of the fell-off-the-back-of-a-truck leaks so far; still not enough overclocking data.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Combat Pretzel posted:

Unless the game has a specific G-Sync setting, you want VSync on, because you need to tell the driver that a frame is ready.

You don't need VSync to tell the driver a frame is ready, or VSync-off would never render anything. VSync means blocking frame submission until the previous frame has been scanned out. It's the driver telling the application something, not being told.
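A toy render loop makes the distinction concrete (illustrative Python, not a real graphics API; present() here is a stand-in for something like SwapBuffers):

```python
import time

REFRESH_S = 1 / 60      # fixed 60Hz panel for illustration
_last_scanout = 0.0

def present(vsync: bool) -> None:
    """Hand the finished frame to the 'driver'. With VSync the call blocks
    until the previous frame has scanned out; without it, it returns
    immediately (and the new frame may tear into the current scanout)."""
    global _last_scanout
    now = time.monotonic()
    if vsync and now < _last_scanout + REFRESH_S:
        time.sleep(_last_scanout + REFRESH_S - now)  # the blocking VSync adds
    _last_scanout = time.monotonic()

# Frames get submitted either way; VSync only changes whether we wait here.
for frame in range(3):
    # ... render work would happen here ...
    present(vsync=True)
```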

Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2

Zero VGS posted:

It seems like it's exactly the same as the Core V1 but with some additional slats cut into the front of it for more airflow, and square power buttons instead of round?

Edit: oh, duh, the front fan sucks the air in from the slats instead of the perforated front... Wonder if that helps noise and if it is still dust-filtered.
Yep, from what I can see in reviews and pictures, the sides have removable dust filter screens now and the front has the same filter material. The downside is that it's the really permissive magnetic filter screen previously seen on the side panels of the Core V21, which is not as good as the foam filter built into the front of the V1. On the plus side, since it pulls from all four sides and the filter material is so permissive, it might be able to provide the fan (which maxes out at 800 rpm?) with adequate airflow.

The other downside is that the filters on the sides are only removable from the INSIDE, unlike on the V21, which is a baffling decision.

penus penus penus
Nov 9, 2014

by piss__donald

Anime Schoolgirl posted:

I go to a motorcycle stunt competition to defend my champion title when the RX 480 hits, rip day one test runs of the thing

Overall it seems to be a 290/390X at stock, looking at all of the fell-off-the-back-of-a-truck leaks so far; still not enough overclocking data.
