|
Sininu posted: Yess, thank you for the reassurance.

He is right that the chips don't put out much heat, but the potential problem with the Morpheus is that it can keep most GPU chips cool with very low fan speeds and airflow. That airflow is required to cool all the other components of the card (PCB, VRM, VRAM). If you opt to run GDDR6 without RAM heatsinks, then I suggest you run a more generous fan curve than the GPU temperatures alone call for, or those components may reach dangerous temperatures, particularly at idle or partial load. I personally wouldn't run a GDDR6 card without any RAM heatsinks after seeing the 2080 Ti launch problems and thermal images, but you might be fine.
|
# ? Mar 7, 2020 07:25 |
|
Indiana_Krom posted: The problem with GCN isn't the age, it is the near total lack of updates it has received in that time.

Yup.

Sininu posted: Yess, thank you for the reassurance.

Paul's right, but you can pick up smaller heatsinks and thermal tape off Amazon or wherever if you want to play it safe.
|
# ? Mar 7, 2020 08:49 |
|
I really want AMD to start doing better or my 1080 will be the last card I could afford.
|
# ? Mar 7, 2020 11:16 |
|
Malcolm XML posted: Fiji/Vega was that cleansheet redesign right -- it sucked for non-compute.

I allow that the argument could be made that because AMD's GPU business is a bunch of underfunded bumbling chucklefucks, Raja Koduri included, instead of doing the one-step transition that Kepler -> Maxwell was, it has taken them more steps in GCN 4 -> GCN 5 -> Vega -> Navi -> RDNA, and that getting Sony and Microsoft to fund development of RDNA -> RDNA 2 -> RDNA 3 was a masterstroke of business dealings.
|
# ? Mar 7, 2020 12:23 |
|
ijyt posted: I really want AMD to start doing better or my 1080 will be the last card I could afford.

Well, in a few years you can pick up a 4050 Ti for 4% more performance but 2GB less RAM, probably for $300.
|
# ? Mar 7, 2020 13:00 |
|
DrDork posted:The pending development of 4k@144 and higher monitors strongly disagrees with you.
|
# ? Mar 7, 2020 15:03 |
|
future ghost posted: Anyone who still has a Voodoo card sticking around somewhere, check eBay. Streamers and collectors are paying dumb amounts of money for them right now, especially if they're rare late models or PCI.

I literally have a V5500 in AGP (seemed like the way to go at the time!) in its box and a Voodoo 2. Please tell me you found a way for me to retire! I actually thought about finding an old AGP-equipped motherboard/CPU and building a legacy system... but who am I kidding, I never will.
|
# ? Mar 7, 2020 17:59 |
|
eames posted: He is right that the chips don't put out much heat, but the potential problem with the Morpheus is that it can keep most GPU chips cool with very low fan speeds and airflow. That airflow is required to cool all the other components of the card (PCB, VRM, VRAM).

I'm running my Noctua NF-A12x25s at 25% when idle and they go up to 45% under 100% load, keeping the GPU itself at 63 degrees max after running for 40 min. Don't really want to make them louder. Ugh.

Also, does the fullscreen setting here actually mean borderless windowed, since pure exclusive fullscreen doesn't really exist in W10 anymore? I'm fed up with random programs like Sticky Notes pulling the refresh rate below 30, and with manually disabling G-Sync for them, so I'm thinking of changing that setting.

Sininu fucked around with this message at 18:05 on Mar 7, 2020 |
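Those settings amount to a two-point curve with a raised floor. As a purely illustrative sketch of the earlier advice (keep a minimum duty for VRAM/VRM airflow even when the core is cool), here's a toy fan curve in Python; the breakpoints are invented to roughly match the numbers above, not taken from any vendor tool:

```python
def fan_duty(gpu_temp_c, floor=35):
    """Map GPU temp (C) to fan duty (%), with a raised floor so the
    VRM/VRAM still get airflow even when the GPU core runs cool.
    Breakpoints are invented for illustration."""
    curve = [(30, 25), (50, 35), (63, 45), (80, 80)]  # (temp C, duty %)
    if gpu_temp_c <= curve[0][0]:
        duty = curve[0][1]
    elif gpu_temp_c >= curve[-1][0]:
        duty = curve[-1][1]
    else:
        # linear interpolation between the two surrounding breakpoints
        for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
            if t0 <= gpu_temp_c <= t1:
                duty = d0 + (d1 - d0) * (gpu_temp_c - t0) / (t1 - t0)
                break
    return max(duty, floor)

print(fan_duty(30))  # 35: the floor lifts the stock 25% idle duty
```

The point is just that the floor, not the GPU-temperature curve, decides the idle behavior.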
# ? Mar 7, 2020 18:02 |
|
Borderless windowed is a stretchy term. With Windows 10's desktop compositor and how things have been handled in that regard for a while now, the operating system will run "fullscreen" games internally as windowed and treat them specially, but still pretend they're fullscreen. So long as no other windows are overlaying their screen space, G-Sync should be active even if set to fullscreen only. If you configure a game explicitly to run in borderless windowed mode, then you'd need to keep the second G-Sync option enabled, because the operating system is oblivious to the intent of the game (window).
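The two driver options boil down to a small decision table. A toy sketch of the logic described above (the function and mode names are mine, not Nvidia's):

```python
def gsync_active(setting, window_mode, occluded=False):
    """Rough sketch of when G-Sync engages, per the explanation above.

    setting: 'fullscreen' or 'fullscreen+windowed' (the two driver options)
    window_mode: 'fullscreen' (game asked for exclusive) or 'borderless'
    occluded: another window is overlaying the game's screen space
    """
    if occluded:
        return False
    if window_mode == 'fullscreen':
        # W10 runs this as windowed internally but still treats it as fullscreen
        return True
    # explicitly borderless: the OS can't tell it's "really" a game window,
    # so only the windowed G-Sync option covers it
    return setting == 'fullscreen+windowed'
```

So a "fullscreen" game is covered either way; an explicitly borderless one only with the second option.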
|
# ? Mar 7, 2020 18:25 |
|
what is the use case for only having full screen gsync active as opposed to full screen + borderless? I have the latter chosen because I figure gsync will help in things like emulators and generally apps running windowed.
|
# ? Mar 7, 2020 18:53 |
|
Penpal posted: what is the use case for only having full screen gsync active as opposed to full screen + borderless? I have the latter chosen because I figure gsync will help in things like emulators and generally apps running windowed.

I think it's to avoid issues like the mentioned Sticky Notes pulling the refresh rate down to <30 fps, because it somehow hooks into G-Sync and, for no good reason, doesn't run at the desktop refresh rate. Other programs also do weird stuff sometimes.
|
# ? Mar 7, 2020 19:21 |
|
Combat Pretzel posted: So long no other windows are overlaying their screen space, G-Sync should be active, if set to fullscreen only. If you configure a game explicitly to run in borderless windowed mode, then you'd need to keep the second G-Sync option enabled, because the operating system is oblivious to the intent of the game (window).

Many newer games lack a fullscreen option entirely, or do borderless regardless, so I guess I have to deal with the crappy experience in some apps.

Penpal posted: what is the use case for only having full screen gsync active as opposed to full screen + borderless? I have the latter chosen because I figure gsync will help in things like emulators and generally apps running windowed.

Sininu posted:

Some other stock UWP apps like Voice Recorder also have that behaviour, while Calculator is just fine. Just checked and I can't even disable G-Sync for them. Ryzen Master and Spotify have the issue.
|
# ? Mar 7, 2020 19:22 |
|
Sininu posted: Also

Won't help with the Sticky Notes refresh rate problem, but otherwise it's harmless. Running "Modern" apps in general tends to break around high refresh displays, especially if you have a multi-display + mixed refresh environment running. I saw a claim somewhere that Microsoft has fixed this behavior in the next Windows 10 feature update; might be worth keeping an eye out for it.
|
# ? Mar 7, 2020 19:26 |
|
Arzachel posted: Yup.

The short heatsinks the Morpheus came with are 4.95 mm tall, which was too much. Are heatsinks shorter than that even effective? E: I found some 4 mm ones (now out of production) with a quick search, so I guess they are.

Sininu fucked around with this message at 19:47 on Mar 7, 2020 |
# ? Mar 7, 2020 19:34 |
|
I honestly wonder what the alcoholism/suicide rate is for driver engineers working on those multi display/mixed desktop composition + gsync problems.
|
# ? Mar 7, 2020 20:17 |
|
So remember how Windows 7 had that Aero theme that made the window borders look like translucent frosted glass? In the old days, people who were super ultra pedantic about framerate stability would turn that off, since it used 3D acceleration and competed with the game for some tiny amount of resources. G-Sync for windowed apps didn't exist until 2015 or so, so that wasn't an issue yet, but lots of non-G-Sync proles wanted to play borderless windowed, so the glassy window borders got replaced with something more Windows XP-ish.

Microsoft made changes in Windows 8 that have continued to this day: the desktop experience is 3D accelerated and cannot be turned off, and part of this includes vsync for the desktop animations, so that if a window spun around like a box or something it would be smooth and clip-free. When a game requested exclusive fullscreen, this desktop compositor would turn off and free up its resources; and since your game took up your whole screen, you couldn't see the difference. This was a BIG problem for the crowd that loves borderless windowed, because that mode enables vsync regardless of the game's settings. Play CS in borderless so you can pull up your music player between matches, and enjoy additional mouse lag. This is why you'll find fucks on other technology boards who won't let go of Windows 7 and are threatening to keep it as Microsoft winds down even critical patches, because it's "the best operating system for gamers ever made": after that, Microsoft made a few changes that improved the experience of non-gamers at gamers' expense.

"G-Sync with borderless" has Nvidia embed into Microsoft's desktop renderer and, when applicable, turn off the vsync and control the compositor's framerate to match the game and the monitor. The downside is that the entire desktop experience beyond the game also runs at this framerate, so if you're running a 60 FPS video and a game at 30 FPS, you'll in theory only see half the video's frames.
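That last claim is easy to sanity-check: if the compositor only flips at the game's 30 fps, at most every other frame of a 60 fps video can ever be presented. A quick, purely illustrative simulation:

```python
def frames_shown(video_fps, compositor_fps, seconds=1):
    """Count distinct source frames of a video that reach the screen
    when the compositor only flips at compositor_fps (e.g. locked to
    a 30 fps game under windowed G-Sync, as described above)."""
    shown = set()
    for i in range(compositor_fps * seconds):
        # newest video frame that is ready at compositor flip i
        shown.add(i * video_fps // compositor_fps)
    return len(shown)

print(frames_shown(60, 30))  # 30: half of a 60 fps video is dropped
```

With the compositor at 60 fps all 60 video frames get through; at 30 fps only 30 of them do.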
|
# ? Mar 7, 2020 20:37 |
|
This site recommends that you use only G-sync in Fullscreen and it's kinda confusing as to why. Is this no longer current?
|
# ? Mar 7, 2020 21:23 |
|
This page explains why. 3-5% performance hit.
|
# ? Mar 7, 2020 21:32 |
|
Indiana_Krom posted: Won't help with the sticky notes refresh rate problem, but otherwise is harmless. Running "Modern" apps in general tends to break around high refresh displays, especially if you have a multi-display+mixed refresh environment running. I saw a claim somewhere that Microsoft has fixed this behavior in the next Windows 10 feature update, might be worth keeping an eye out for.

https://www.reddit.com/r/Windows10/comments/f57tk4/dwm_multiple_monitors_with_different_refresh_rate/

It's only been like 5 loving years of this bullshit, and apparently it's still only half-fixed. gently caress MS's idiotic decision to make desktop composition work this way. Like 0.001% of Windows users run multiple monitors as a single display, and instead of making an option for that use case, everyone using multi-monitor is getting hosed for their sake.

K8.0 fucked around with this message at 21:47 on Mar 7, 2020 |
# ? Mar 7, 2020 21:45 |
|
Tab8715 posted:Why is this a bad thing? May we go into further detail? It's similar to being stuck with x86 for another couple decades.
|
# ? Mar 8, 2020 01:57 |
|
lDDQD posted:It's similar to being stuck with x86 for another couple decades. *looks around* We've been using it since 1978 or so, granted with some major upgrades along the way, most notably AMD64. What's wrong with evolving an existing and working design?
|
# ? Mar 8, 2020 04:53 |
|
Fabulousity posted:*looks around* We've been using it since 1978 or so, granted with some major upgrades along the way, most notably AMD64. What's wrong with evolving an existing and working design? bitpacked variable-length instructions are for babies and communists
|
# ? Mar 8, 2020 05:36 |
|
Started fooling around with RIS (Radeon Image Sharpening) now that AMD seem to have unfucked their drivers. It seems pretty cool in that I can't see much of a difference between 3440x1440 and 3096x1296 (90% on each axis), even though it's a 19% drop in pixels pushed.

This is a quick PSA on creating custom resolutions if you want to play with this stuff, which turns out to be a lot easier than it looks at first:

- Global Display properties > Custom Resolutions > Create New
- Enter Resolution, Refresh Rate
- Change Timing Standard to "CVT - Reduced blanking"
- Hit Create
- Use the new resolution in games after enabling Radeon Image Sharpening for them

That Timing Standard step will create all of the scary-looking timing entries for you, which was what stopped me from trying this when I took a quick look months ago.

I've created resolutions at 80% and 90% of 3440x1440. At 80%, some of the snow textures in Monster Hunter Iceborne look too sharp and artificial; 80% is 36% fewer pixels than full res, so that's not unexpected. It's a lot harder to see differences at 90%.

I've been on AMD since I got a 7970 back in early '13, and these past two months have been the worst for driver stability. Can't remember bad drivers ever being much of an issue for stability (perf, yeah) before then, to be honest, most probably because I was on the old card running the stable code paths. But the past two months were just ridiculous: I could run either 19.12.1, which was totally stable, or 20.1.4 if I didn't have video active in Chrome while playing a game. Cargo culting it up for those months, basically. I'm sure they've been justifiably burned by this in their sales.

v1ld fucked around with this message at 16:03 on Mar 8, 2020 |
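For what it's worth, the pixel-count claims above check out. A small sketch for picking scaled resolutions; the mod-8 snapping is my own assumption (displays and scalers generally prefer even, often mod-8, dimensions), not something RIS requires:

```python
def scaled_res(width, height, factor, step=8):
    """Scale each axis by `factor` and snap down to a multiple of `step`.
    The mod-8 snap is an assumption for display compatibility."""
    snap = lambda v: int(v * factor) // step * step
    return snap(width), snap(height)

full = 3440 * 1440
for f in (0.9, 0.8):
    w, h = scaled_res(3440, 1440, f)
    drop = 1 - (w * h) / full
    print(f"{int(f * 100)}%: {w}x{h}, {drop:.0%} fewer pixels")
```

90% per axis gives 3096x1296 and a 19% pixel drop; 80% gives 2752x1152 and 36%, matching the numbers in the post.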
# ? Mar 8, 2020 15:57 |
|
my 5700 xt crashes sometimes when I'm playing Witcher 3. both monitors freeze for a second then completely stop getting input. it resolves itself when I restart my computer. is there any kind of log anywhere that could give me an idea of what's going on?
|
# ? Mar 9, 2020 00:13 |
|
NotNut posted: my 5700 xt crashes sometimes when I'm playing Witcher 3. both monitors freeze for a second then completely stop getting input. it resolves itself when I restart my computer. is there any kind of log anywhere that could give me an idea of what's going on?

You can try running Event Viewer and looking at the application or system error log, but I've never found it very helpful; at best you might learn that ATIsomethingdriver.dll has crashed. Install the latest driver if you haven't already, then try with only one monitor connected and nothing running in the background (chats, overlays, recording/streaming etc.) and see what happens?
|
# ? Mar 9, 2020 02:16 |
|
NotNut posted: my 5700 xt crashes sometimes when I'm playing Witcher 3. both monitors freeze for a second then completely stop getting input. it resolves itself when I restart my computer. is there any kind of log anywhere that could give me an idea of what's going on?

Update to the latest driver (20.2.1 iirc) if you haven't. Otherwise, sorry; Witcher 3 is one game I've specifically read a lot of complaints about with the AMD drivers.
|
# ? Mar 9, 2020 05:05 |
|
I was waiting on the 3000 series but if there might be delays / supply shortages because the world is ending maybe I should just get a 2070S now? At least I can get a slightly better FPS while I self-isolate. Or I should save my money for toilet paper and stick with my 1070... Or should I wait until the end of March to see how things play out and what gets announced at the Nvidia conference thing?
|
# ? Mar 9, 2020 12:46 |
|
fuf posted: I was waiting on the 3000 series but if there might be delays / supply shortages because the world is ending maybe I should just get a 2070S now? At least I can get a slightly better FPS while I self-isolate.

The 3000 series hasn't even been announced; there's no way to know what the impact of supply-line disruptions will be when it actually comes out, and no way around it either. If you were going to wait, I'd say wait: paying full price for last year's tech isn't going to change anything. Also, the real move is hoarding bidets. You'll make out like a bandit once the TP runs out.
|
# ? Mar 9, 2020 13:54 |
|
Yeah but what if they announce that the 3000 series isn't releasing until well into 2021, and also the 2000 series prices go way up because of production disruption?
|
# ? Mar 9, 2020 14:41 |
|
fuf posted: Yeah but what if they announce that the 3000 series isn't releasing until well into 2021, and also the 2000 series prices go way up because of production disruption?

fuf posted: I was thinking about upgrading to a 2070S but I looked at some benchmarks for games I care about and in some cases it was only like 15-20 extra FPS. It didn't seem worth it for like $400.

Listen to your old self instead of spending $400 because you're worried about a hypothetical where you'd spend $500 to still not get a boost in the games you play.
|
# ? Mar 9, 2020 15:10 |
|
fuf posted: Yeah but what if they announce that the 3000 series isn't releasing until well into 2021, and also the 2000 series prices go way up because of production disruption?

You have a 1070 for another year; it's not like you ran out and bought a 2070S when they were released. The 2080 will be two years old in September, and I don't see them pushing that to three. Furthermore, the 2070S came out 8 months ago, so there's nearly a year's worth of effective depreciation built into the purchase price, since it'll get superseded that much sooner. Also, and apologies if I'm misreading this, but anxious reactions to hypothetical events basically never produce desirable outcomes.
|
# ? Mar 9, 2020 15:11 |
|
haha alright, thanks for talking me down. Honestly I probably just wanted an excuse to make an unnecessary purchase. I will reluctantly see sense.
fuf fucked around with this message at 15:33 on Mar 9, 2020 |
# ? Mar 9, 2020 15:30 |
|
Nvidia has now cancelled their web broadcast for GTC due to corona as well. It sounds like the product launch isn't cancelled though.

https://videocardz.com/newz/nvidia-to-share-gtc-news-on-march-24-cancels-plans-for-a-webcast

AT's take: https://www.anandtech.com/show/15602/nvidia-axes-gtc-digital-keynote-in-favor-of-news-releases

Cygni fucked around with this message at 23:13 on Mar 9, 2020 |
# ? Mar 9, 2020 22:59 |
|
Definitely going to use my 1070 for at least another year.
|
# ? Mar 10, 2020 00:05 |
|
So I will get my first VRR (Freesync2) monitor today. I have Nvidia RTX 2060. Reading the thread it seems that I will have major problems with VRR if I have a 2560x1440 144Hz main display and a 1200x1920 60Hz secondary display for IRC/discord/stuff. Should I just get rid of the 2nd monitor?
|
# ? Mar 10, 2020 10:55 |
|
VRR will work perfectly fine. Your GPU may not idle properly, and if you're playing video on your second monitor the refresh on the primary may drop to 60; aside from that it just works.

The main thing you need to do, aside from setting your refresh rate and turning G-Sync on in the Nvidia control panel, is cap your FPS. Use the limiter in the Nvidia control panel to cap to 140 fps, and for games like Overwatch with decent built-in limiters you can create profiles that leave the driver limit off and use the in-game limiter instead. VRR only works when you are getting LESS than your refresh rate in FPS; if you don't cap your FPS and it goes high, you get standard vsync-on (high latency) or vsync-off (tearing) behavior.
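The cap-below-refresh rule boils down to a one-liner; the 4 fps margin here is just my reading of "cap to 140 on a 144 Hz panel", and guides differ on the exact number:

```python
def vrr_fps_cap(refresh_hz, margin=4):
    """Rule-of-thumb FPS cap that keeps frame delivery inside the VRR
    window. `margin` is how far below refresh to stay; a few fps is
    the usual advice, 4 matches the 144 Hz example above."""
    return refresh_hz - margin

print(vrr_fps_cap(144))  # 140, the cap suggested in the post
```

The same rule gives 236 for a 240 Hz panel, and so on.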
|
# ? Mar 10, 2020 11:06 |
|
Nvidia is teasing something for next week https://twitter.com/NvidiaANZ/status/1237657045022781440
|
# ? Mar 11, 2020 16:00 |
|
Creepy. I'm guessing eye tracking.
|
# ? Mar 11, 2020 16:27 |
|
Might be something to do with foveated rendering yeah, Turing's variable rate shading is a natural way to implement it
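If it is eye tracking plus foveated rendering, the idea maps naturally onto VRS's coarse shading tiers. A toy policy sketch; the tier names mirror the 1x1/2x2/4x4 shading rates VRS exposes, while the radii and thresholds are entirely invented:

```python
def shading_rate(tile_xy, gaze_xy, fine_radius, coarse_radius):
    """Toy foveated-rendering policy: full shading rate near the gaze
    point, coarser rates further out. Radii are made-up parameters."""
    dx, dy = tile_xy[0] - gaze_xy[0], tile_xy[1] - gaze_xy[1]
    d = (dx * dx + dy * dy) ** 0.5  # distance from gaze in pixels
    if d <= fine_radius:
        return "1x1"  # full rate where the eye is looking
    if d <= coarse_radius:
        return "2x2"  # one shade per 2x2 block in the near periphery
    return "4x4"      # coarsest rate in the far periphery
```

A real implementation would feed per-tile rates like these into the GPU's VRS surface rather than compute them per pixel.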
|
# ? Mar 11, 2020 16:35 |
|
Is VR still gonna get talked up a lot? Even Half-Life: Alyx doesn't seem to be doing great numbers. Having it be webcam-based would be interesting, I guess.
|
# ? Mar 11, 2020 16:44 |