eames
May 9, 2009

Sininu posted:

Yess, thank you for the reassurance.

He is right that the chips don't put out much heat, but the potential problem with the Morpheus is that it can keep most GPU chips cool with very low fan speeds, and therefore very little airflow. That airflow is what cools all the other components of the card (PCB, VRM, VRAM).

If you opt to run GDDR6 without RAM heatsinks, then I suggest you run a more generous fan curve than the GPU temperatures alone would require, or those components may reach dangerous temperatures, particularly at idle or partial load. I personally wouldn't run a GDDR6 card without any RAM heatsinks after seeing the 2080 Ti launch problems and thermal images, but you might be fine.
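
To put the "more generous curve" idea in concrete terms, here's a toy sketch: follow the usual GPU-temperature curve, but clamp it to a minimum duty cycle so the VRM/VRAM always see some airflow even when the core runs cool. All numbers are illustrative, not tuned for any particular card.

```python
def fan_duty(gpu_temp_c, floor=40):
    # (temp degC, duty %) points; linear interpolation between them
    curve = [(40, 30), (60, 45), (75, 70), (85, 100)]
    duty = curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if gpu_temp_c <= t1:
            duty = d0 + (d1 - d0) * (gpu_temp_c - t0) / (t1 - t0)
            break
    return max(floor, min(100, duty))

print(fan_duty(50))            # 40.0: the airflow floor kicks in
print(fan_duty(50, floor=0))   # 37.5: what a core-only curve would ask for
```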

Arzachel
May 12, 2012

Indiana_Krom posted:

The problem with GCN isn't the age; it's the near-total lack of updates it has received in that time.

Yup.

Sininu posted:

Yess, thank you for the reassurance.

Paul's right, but you can pick up smaller heatsinks and thermal tape off Amazon or wherever if you want to play it safe.

ijyt
Apr 10, 2012

I really want AMD to start doing better, or my 1080 will be the last card I can afford.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Malcolm XML posted:

Fiji/Vega was that clean-sheet redesign, right -- it sucked for non-compute.

I allow that the argument could be made that, because AMD's GPU business is a bunch of underfunded bumbling chucklefucks, Raja Koduri included, instead of doing the one-step transition that Kepler -> Maxwell was, it has taken them more steps in GCN 4 -> GCN 5 -> Vega -> Navi -> RDNA, and that getting Sony and Microsoft to fund development into RDNA -> RDNA 2 -> RDNA 3 was a masterstroke of business dealings.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

ijyt posted:

I really want AMD to start doing better, or my 1080 will be the last card I can afford.

Well, in a few years you can pick up a 4050 Ti with 4% more performance but 2 GB less RAM, probably for $300.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

DrDork posted:

The pending development of 4k@144 and higher monitors strongly disagrees with you.

Yeah, it would be nice to have some decent models. I really just want a non-HDR 4K@120 27" IPS display. I'm still waiting.

slidebite
Nov 6, 2005

Good egg
:colbert:

future ghost posted:

Anyone who still has a Voodoo card sticking around somewhere, check eBay. Streamers and collectors are paying dumb amounts of money for them right now, especially if they're rare late models or PCI.

I literally have a V5500 in AGP (seemed like the way to go at the time!) in its box and a Voodoo 2.

Please tell me you found a way for me to retire!

I actually thought of finding an old AGP-equipped motherboard/CPU and building a legacy system... but who am I kidding, I never will.

Sininu
Jan 8, 2014

eames posted:

He is right that the chips don't put out much heat, but the potential problem with the Morpheus is that it can keep most GPU chips cool with very low fan speeds, and therefore very little airflow. That airflow is what cools all the other components of the card (PCB, VRM, VRAM).

If you opt to run GDDR6 without RAM heatsinks, then I suggest you run a more generous fan curve than the GPU temperatures alone would require, or those components may reach dangerous temperatures, particularly at idle or partial load. I personally wouldn't run a GDDR6 card without any RAM heatsinks after seeing the 2080 Ti launch problems and thermal images, but you might be fine.

I'm running my Noctua NF-A12x25s at 25% at idle, going up to 45% under 100% load, which keeps the GPU itself at 63 degrees max after running for 40 minutes. I don't really want to make them louder. Ugh.


Also


Does the fullscreen setting here actually mean borderless windowed, given that pure exclusive fullscreen doesn't really exist in W10 anymore?

I'm fed up with random programs like Sticky Notes pulling the refresh rate below 30 and with manually disabling G-Sync for them, so I'm thinking of changing that setting.

Sininu fucked around with this message at 18:05 on Mar 7, 2020

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Borderless windowed is a stretchy term. With the way Windows 10's desktop compositor has handled things for a while now, the operating system will run "fullscreen" games internally as windowed and treat them specially, but still pretend they're fullscreen. As long as no other windows are overlaying their screen space, G-Sync should be active even with it set to fullscreen only. If you configure a game explicitly to run in borderless windowed mode, then you'd need to keep the second G-Sync option enabled, because the operating system is oblivious to the intent of the game (window).
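
For the curious, the "pretend fullscreen" treatment boils down to a heuristic along these lines: a window whose rect exactly covers its monitor, with nothing overlapping it, gets treated as fullscreen. Here's a minimal sketch of that rect check (Python + ctypes against the real Win32 calls; the overlap part is omitted, and this illustrates the idea, not how DWM is actually implemented):

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class MONITORINFO(ctypes.Structure):
    _fields_ = [("cbSize", wintypes.DWORD),
                ("rcMonitor", wintypes.RECT),
                ("rcWork", wintypes.RECT),
                ("dwFlags", wintypes.DWORD)]

MONITOR_DEFAULTTONEAREST = 2

def covers_its_monitor(hwnd):
    """True if the window's rect exactly covers the monitor it's on."""
    rect = wintypes.RECT()
    user32.GetWindowRect(hwnd, ctypes.byref(rect))
    hmon = user32.MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST)
    mi = MONITORINFO(cbSize=ctypes.sizeof(MONITORINFO))
    user32.GetMonitorInfoW(hmon, ctypes.byref(mi))
    m = mi.rcMonitor
    return (rect.left, rect.top, rect.right, rect.bottom) == \
           (m.left, m.top, m.right, m.bottom)

# e.g. check whatever window currently has focus:
print(covers_its_monitor(user32.GetForegroundWindow()))
```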

forest spirit
Apr 6, 2009

Frigate Hetman Sahaidachny
First to Fight Scuttle, First to Fall Sink


What is the use case for only having fullscreen G-Sync active, as opposed to fullscreen + borderless? I have the latter chosen because I figure G-Sync will help in things like emulators and apps running windowed in general.

Geemer
Nov 4, 2010



Penpal posted:

What is the use case for only having fullscreen G-Sync active, as opposed to fullscreen + borderless? I have the latter chosen because I figure G-Sync will help in things like emulators and apps running windowed in general.

I think it's to avoid issues like the aforementioned Sticky Notes pulling the refresh rate down to <30 fps, because it somehow hooks into G-Sync and doesn't run at the desktop framerate for no good reason. Other programs also do weird stuff sometimes.

Sininu
Jan 8, 2014

Combat Pretzel posted:

As long as no other windows are overlaying their screen space, G-Sync should be active even with it set to fullscreen only. If you configure a game explicitly to run in borderless windowed mode, then you'd need to keep the second G-Sync option enabled, because the operating system is oblivious to the intent of the game (window).

Do overlays like Steam, RTSS/Afterburner and the Windows volume slider count as other windows?

Many newer games lack a fullscreen option entirely, or do borderless regardless, so I guess I have to deal with the crappy experience in some apps.


Penpal posted:

What is the use case for only having fullscreen G-Sync active, as opposed to fullscreen + borderless? I have the latter chosen because I figure G-Sync will help in things like emulators and apps running windowed in general.

Sininu posted:


Does the fullscreen setting here actually mean borderless windowed, given that pure exclusive fullscreen doesn't really exist in W10 anymore?

I'm fed up with random programs like Sticky Notes pulling the refresh rate below 30 and with manually disabling G-Sync for them, so I'm thinking of changing that setting.

This could be the use case.
Some other stock UWP apps like Voice Recorder also have that behaviour, while Calculator is just fine. Just checked, and I can't even disable G-Sync for them. Ryzen Master and Spotify have the issue too.

Indiana_Krom
Jun 18, 2007
Net Slacker

Sininu posted:

Also


Does the fullscreen setting here actually mean borderless windowed, given that pure exclusive fullscreen doesn't really exist in W10 anymore?

I'm fed up with random programs like Sticky Notes pulling the refresh rate below 30 and with manually disabling G-Sync for them, so I'm thinking of changing that setting.

It won't help with the Sticky Notes refresh rate problem, but it is otherwise harmless. "Modern" apps in general tend to break around high-refresh displays, especially if you have a multi-display, mixed-refresh environment running. I saw a claim somewhere that Microsoft has fixed this behavior in the next Windows 10 feature update; might be worth keeping an eye out for it.

Sininu
Jan 8, 2014

Arzachel posted:

Yup.


Paul's right, but you can pick up smaller heatsinks and thermal tape off Amazon or wherever if you want to play it safe.

The short heatsinks the Morpheus came with are 4.95 mm tall, which was too much. Are heatsinks shorter than that even effective?

E: With a quick search I found some 4 mm ones that are out of production, so I guess they can be.

Sininu fucked around with this message at 19:47 on Mar 7, 2020

sauer kraut
Oct 2, 2004
I honestly wonder what the alcoholism/suicide rate is for the driver engineers working on those multi-display/mixed-refresh desktop composition + G-Sync problems.

Craptacular!
Jul 9, 2001

Fuck the DH
So remember how Windows 7 had that Aero theme that made the window borders look like translucent frosted glass? In the old days, people who were super ultra pedantic about framerate stability would turn that off, since it used 3D acceleration and was competing with the game for some tiny amount of resources. G-Sync for windowed apps didn't exist until 2017 or so, so that wasn't an issue, but lots of non-G-Sync proles wanted to play Borderless Windowed, so the glassy window borders were replaced with something more Windows XP-ish.

Microsoft made changes in Windows 8 that have continued to this day, so that the desktop experience is 3D accelerated and cannot be turned off, and part of this includes vsync for the desktop animations, so that if a window spun around like a box or something it would be smooth and clip-free. When a game requested exclusive fullscreen, this desktop compositor would turn off and free up its resources; and since your game took up your whole screen, you couldn't see the difference. This was a BIG problem for the crowd that loves Borderless Windowed, because it means that mode enables vsync regardless of the game's settings. Play CS in borderless so you can pull up your music player between matches, and enjoy additional mouse lag. This is why you'll find fucks on other technology boards who won't let go of Windows 7 and are threatening to keep it as Microsoft winds down even critical patches. Because it's "the best operating system for gamers ever made". Because after that, Microsoft made a few changes that improved the experience of non-gamers at gamers' expense.

"G-Sync with borderless" has Nvidia embed into Microsoft's desktop render, and when applicable turn off the vsync and control the framerate of it to match the game and the monitor. The downside to this is that the entire desktop experience beyond the game is also running at this framerate, so if you're running a 60 FPS video and a game at 30FPS you'll in theory only see half the video.

VelociBacon
Dec 8, 2009

This site recommends that you only use G-Sync in fullscreen, and it's kinda confusing as to why. Is this no longer current?

Craptacular!
Jul 9, 2001

Fuck the DH
This page explains why: a 3-5% performance hit.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Indiana_Krom posted:

It won't help with the Sticky Notes refresh rate problem, but it is otherwise harmless. "Modern" apps in general tend to break around high-refresh displays, especially if you have a multi-display, mixed-refresh environment running. I saw a claim somewhere that Microsoft has fixed this behavior in the next Windows 10 feature update; might be worth keeping an eye out for it.

https://www.reddit.com/r/Windows10/comments/f57tk4/dwm_multiple_monitors_with_different_refresh_rate/

It's only been like 5 loving years of this bullshit, and apparently it's still only half-fixed. gently caress MS's idiotic decision to make desktop composition work this way. There's like 0.001% of Windows users using multiple monitors as a single display, and instead of making an option for that use case, everyone using multi-monitor is getting hosed for their sake.

K8.0 fucked around with this message at 21:47 on Mar 7, 2020

lDDQD
Apr 16, 2006

Tab8715 posted:

Why is this a bad thing? May we go into further detail?

It's similar to being stuck with x86 for another couple decades.

Fabulousity
Dec 29, 2008

Number One I order you to take a number two.

lDDQD posted:

It's similar to being stuck with x86 for another couple decades.

*looks around* We've been using it since 1978 or so, granted with some major upgrades along the way, most notably AMD64. What's wrong with evolving an existing and working design?

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Fabulousity posted:

*looks around* We've been using it since 1978 or so, granted with some major upgrades along the way, most notably AMD64. What's wrong with evolving an existing and working design?

bitpacked variable-length instructions are for babies and communists

v1ld
Apr 16, 2012

Started fooling around with RIS (Radeon Image Sharpening) now that AMD seem to have unfucked their drivers. It seems pretty cool in that I can't see much of a difference between 3440x1440 and 3096x1296 (90% on each axis), even though it's a 19% drop in pixels pushed.

This is a quick PSA on creating custom resolutions if you want to play with this stuff; it turns out to be a lot easier than it looks at first.

- Global Display properties > Custom Resolutions > Create New
- Enter Resolution, Refresh Rate
- Change Timing Standard to "CVT - Reduced blanking"
- Hit Create

- Use the new resolution in games after enabling Radeon Image Sharpening for them

That Timing Standard step will create all of the scary-looking timing entries for you, which was what stopped me from trying this when I took a quick look months ago.

I've created resolutions at 80% and 90% of 3440x1440. At 80% some of the snow textures in Monster Hunter Iceborne look too sharp and artificial. 80% is 36% fewer pixels than full res, so that's not unexpected. It's a lot harder to see differences at 90%.
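
For anyone who wants to sanity-check the pixel math (plain arithmetic, nothing AMD-specific): scaling both axes by a factor f cuts pixels pushed by 1 - f².

```python
def scaled_mode(w, h, f):
    # Scale both axes, then report the fraction of pixels saved.
    sw, sh = round(w * f), round(h * f)
    return sw, sh, 1 - (sw * sh) / (w * h)

for f in (0.9, 0.8):
    sw, sh, saved = scaled_mode(3440, 1440, f)
    print(f"{f:.0%}: {sw}x{sh}, {saved:.0%} fewer pixels")
# 90%: 3096x1296, 19% fewer pixels
# 80%: 2752x1152, 36% fewer pixels
```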


I've been on AMD since I got a 7970 back in early '13, and these past two months have been the worst for driver stability. I honestly can't remember bad drivers being much of an issue for stability (perf, yeah) before then, most probably because I was on an old card running the stable code paths. But the past two months were just ridiculous - I could run either 19.12.1, which was totally stable, or 20.1.4 if I didn't have video active in Chrome while playing a game. Basically cargo-culting it up for those months. I'm sure this has justifiably burned them on sales.

v1ld fucked around with this message at 16:03 on Mar 8, 2020

NotNut
Feb 4, 2020
My 5700 XT crashes sometimes when I'm playing Witcher 3. Both monitors freeze for a second, then completely stop getting input; it only resolves when I restart my computer. Is there any kind of log anywhere that could give me an idea of what's going on?

sauer kraut
Oct 2, 2004

NotNut posted:

My 5700 XT crashes sometimes when I'm playing Witcher 3. Both monitors freeze for a second, then completely stop getting input; it only resolves when I restart my computer. Is there any kind of log anywhere that could give me an idea of what's going on?

You can try running Event Viewer and looking at the application or system error logs, but I've never found it very helpful. At best you might learn that ATIsomethingdriver.dll has crashed, and then :shrug:
Install the latest driver if you haven't already, then please try with only one monitor connected and nothing running in the background (chats, overlays, recording/streaming, etc.), and see what happens.
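
If you'd rather query the log than click around in Event Viewer, something like this pulls the usual "display driver stopped responding" (TDR) entries; Event ID 4101 under the "Display" source is the typical one, though your crash may well log something else entirely:

```python
import subprocess

# Last 10 TDR events (Event ID 4101, source "Display") from the System log,
# newest first, printed as plain text.
query = "*[System[Provider[@Name='Display'] and (EventID=4101)]]"
result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{query}", "/c:10", "/rd:true", "/f:text"],
    capture_output=True, text=True,
)
print(result.stdout or "No matching events found.")
```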

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NotNut posted:

My 5700 XT crashes sometimes when I'm playing Witcher 3. Both monitors freeze for a second, then completely stop getting input; it only resolves when I restart my computer. Is there any kind of log anywhere that could give me an idea of what's going on?

Update to the latest driver (20.2.1 iirc) if you haven’t.

Otherwise, sorry - Witcher 3 is specifically a game I've read a lot of complaints about with the AMD drivers. :shrug:

fuf
Sep 12, 2004

haha
I was waiting on the 3000 series, but if there might be delays / supply shortages because the world is ending, maybe I should just get a 2070S now? At least I'd get slightly better FPS while I self-isolate.

Or maybe I should save my money for toilet paper and stick with my 1070...

Or should I wait until the end of March to see how things play out and what gets announced at the Nvidia conference thing?

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

fuf posted:

I was waiting on the 3000 series, but if there might be delays / supply shortages because the world is ending, maybe I should just get a 2070S now? At least I'd get slightly better FPS while I self-isolate.

Or maybe I should save my money for toilet paper and stick with my 1070...

Or should I wait until the end of March to see how things play out and what gets announced at the Nvidia conference thing?

The 3000 series hasn't even been announced; there's no way to know what the impact of supply-chain disruptions will be when it actually comes out, and no way around it either. If you were going to wait, I'd say wait; paying full price for last year's tech isn't going to change anything.

Also, the real move is hoarding bidets. You'll make out like a bandit once the TP runs out.

fuf
Sep 12, 2004

haha
Yeah but what if they announce that the 3000 series isn't releasing until well into 2021, and also the 2000 series prices go way up because of production disruption? :(

Inept
Jul 8, 2003

fuf posted:

Yeah but what if they announce that the 3000 series isn't releasing until well into 2021, and also the 2000 series prices go way up because of production disruption? :(

fuf posted:

I was thinking about upgrading to a 2070S but I looked at some benchmarks for games I care about and in some cases it was only like 15-20 extra FPS. It didn't seem worth it for like $400.

Listen to your old self instead of spending $400 because you're worried about a hypothetical where you'd spend $500 and still not get a boost in the games you play.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

fuf posted:

Yeah but what if they announce that the 3000 series isn't releasing until well into 2021, and also the 2000 series prices go way up because of production disruption? :(

You have a 1070 for another year. :shrug: It's not like you ran out and bought a 2070S when they were released. The 2080 will be two years old in September; I don't see them pushing that to three. Furthermore, the 2070S came out 8 months ago; that's nearly a year's worth of effective depreciation built into the purchase price, since it'll get superseded that much sooner.

Also - and apologies if I'm misreading this - anxious reactions to hypothetical events basically never produce desirable outcomes.

fuf
Sep 12, 2004

haha
haha alright, thanks for talking me down. Honestly I probably just wanted an excuse to make an unnecessary purchase. I will reluctantly see sense.

fuf fucked around with this message at 15:33 on Mar 9, 2020

Cygni
Nov 12, 2005

raring to post

Nvidia has now cancelled their web broadcast for GTC due to corona as well. It sounds like the product launch isn't cancelled, though.

https://videocardz.com/newz/nvidia-to-share-gtc-news-on-march-24-cancels-plans-for-a-webcast

AT's take:

https://www.anandtech.com/show/15602/nvidia-axes-gtc-digital-keynote-in-favor-of-news-releases

Cygni fucked around with this message at 23:13 on Mar 9, 2020

FuturePastNow
May 19, 2014


Definitely going to use my 1070 for at least another year.

Ihmemies
Oct 6, 2012

So I'm getting my first VRR (FreeSync 2) monitor today. I have an Nvidia RTX 2060.

Reading the thread, it seems that I will have major problems with VRR if I have a 2560x1440 144Hz main display and a 1200x1920 60Hz secondary display for IRC/Discord/stuff. Should I just get rid of the 2nd monitor?

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

VRR will work perfectly fine. Your GPU may not idle properly, and if you're playing video on your second monitor the refresh on the primary may drop to 60. Aside from that it just works.

The main thing you need to do, aside from setting your refresh rate and turning G-Sync on in the Nvidia control panel, is cap your FPS. Use the limiter in the Nvidia control panel to cap to 140 fps, and if you play games like Overwatch with decent built-in limiters, you can create profiles for those games that have no limit and use the in-game limiter instead. VRR only works when you are getting LESS than your refresh rate in FPS; if you do not cap your FPS and it goes high, you get standard vsync-on (high latency) or vsync-off (tearing) behavior.
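
The arithmetic behind that 140 number, if you want it for other panels: stay a few fps under max refresh so the limiter's jitter never pushes you out of the VRR range. The 3-4 fps margin is the common rule of thumb, not anything from an Nvidia spec.

```python
def vrr_fps_cap(refresh_hz, margin=4):
    # Cap a little below max refresh so VRR never hands off to vsync.
    return refresh_hz - margin

for hz in (60, 120, 144, 165, 240):
    print(f"{hz} Hz panel -> cap at {vrr_fps_cap(hz)} fps")
# 144 Hz panel -> cap at 140 fps, matching the advice above
```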

repiv
Aug 13, 2009

Nvidia is teasing something for next week

https://twitter.com/NvidiaANZ/status/1237657045022781440

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Creepy. I'm guessing eye tracking.

repiv
Aug 13, 2009

Might be something to do with foveated rendering, yeah - Turing's variable rate shading is a natural way to implement it.
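
Roughly how that fits together: VRS lets you pick a coarser shading rate per screen tile, and foveated rendering just drives that choice by distance from the gaze point. A toy sketch of the mapping (the radii and rates here are made-up illustrations, not Turing specifics):

```python
import math

def shading_rate(tile_center, gaze, full_radius=0.15, half_radius=0.35):
    # Coordinates are normalized (0..1); VRS hardware exposes coarse
    # per-tile rates like 1x1 / 2x2 / 4x4, which is what makes this cheap.
    d = math.dist(tile_center, gaze)
    if d < full_radius:
        return "1x1"  # full detail where the eye is pointed
    if d < half_radius:
        return "2x2"  # one shader invocation per 4 pixels
    return "4x4"      # one per 16 pixels in the far periphery

print(shading_rate((0.8, 0.5), gaze=(0.5, 0.5)))  # '2x2'
```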

ufarn
May 30, 2009
Is VR still gonna get talked up a lot? Even Half-Life: Alyx doesn't seem to be doing great numbers.

Having it be webcam-based would be interesting, I guess.
