Branch Nvidian
Nov 29, 2012



I'm excited that by the time I'm ready to upgrade my 970 will be worth about $30.


Branch Nvidian
Nov 29, 2012



I wonder if Tom still has a job or not. :ohdear:

Branch Nvidian
Nov 29, 2012



Boris Galerkin posted:

No poo poo, but the video is literally two voices talking to each other with a backdrop of green boxes, so forgive me if I don't recognize who is who, especially when people are making it sound like Tom is the super idiot one when the not-Tom guy is also really loving stupid in the video too.

Not-Tom is the CEO of Nvidia giving the keynote presentation; Tom is the guy backstage who jumped ahead several slides and then didn't understand what the CEO was telling him to do.

Branch Nvidian
Nov 29, 2012



I realize that since I have a 970 right now, the proper card for me to upgrade to will be the 1070, especially since I've only got a 1080p monitor, but goddamn I kind of want a 1080 FTW or Classified just so I can have all the power.

Branch Nvidian
Nov 29, 2012



Is having a dedicated PhysX card still a thing? Would it make any sense for me to keep my 970 to use as a PhysX card alongside a 1080 or 1070?

Branch Nvidian
Nov 29, 2012



So since the 1080 Ti is out, I'm guessing the regular 1080 is now relegated to the same position the 980 was when the 980 Ti came out? I've got a 970 and don't need to upgrade, but in the event I decide to before the 11-series releases I'd like to know if I should ignore the 1080.

Branch Nvidian
Nov 29, 2012



I'm not really interested in 4K right now, but I've been eyeing high refresh 1440 monitors; thanks for the info everyone.

Branch Nvidian
Nov 29, 2012



Paul MaudDib posted:

Seriously. If I had a choice between upgrading to GSync and using a 970/980/1060 in the meantime, or immediately upgrading to a 1070 or a 1080 I would choose the former every time.

You really have to be insane not to be using GSync/Freesync at this point, it's a one-time purchase that will last 5-10 years and will save you several GPU upgrades during that timeframe. Most people upgrade their CPUs more than their monitors and now that monitors have made a huge leap forward why not just bite the bullet and make this your once-a-decade monitor upgrade?

...

1440p 144/165 Hz or ultrawide 1440p 100 Hz are the place to be right now. Great quality, can be driven nicely with a single-card setup, not insanely overpriced but you don't feel like you wasted the GSync tax on a lovely monitor either.

So, right now I've got a single EVGA 970 FTW+. Would you say it would be better to get a GSync equipped high refresh 1080p monitor or go for 1440? I don't really feel like I need to upgrade my GPU (I mostly play FFXIV and titles from around 2011), and newer games still run at acceptable frame rates for me. I know I should have gotten a GSync monitor in the first place, but the one I bought when I built my computer in late 2015 was only meant to be a stopgap. My initial post was just on the off chance I decided to upgrade to the latest gen stuff because of expendable income.

My current monitor is an Acer G7 G257HL bmidx for reference.

Branch Nvidian
Nov 29, 2012



My current monitor is 25" and I refuse to go lower than that size-wise. I get horrible motion sickness from FPS games, so I'm playing RPGs and third-person action games like Dark Souls or the Batman Arkham games. I want to be a wanker about TN vs IPS since I know IPS is the better tech, but it also really wouldn't kill me to have a TN panel. Oddly, one of my biggest sticking points is screen bezel size. I don't like giant glossy plastic bezels; my current monitor is "bezel-less" in that it artificially makes it look like the screen goes all the way to the edge even though it doesn't.

Branch Nvidian
Nov 29, 2012



Visual novels on a 1080 Ti.

Branch Nvidian
Nov 29, 2012



Crossposting from the PC building mega thread in case someone who reads this thread and not that one can help.

My brother's current PC is 6 years old, built with parts that were 2-ish years old at the time. He's basically unable to play modern games like Dead Space due to lacking access to the newer DX12 Ultimate APIs and some other stuff, plus it's just a really limited system by today's standards. His current build is:

https://pcpartpicker.com/list/KCxzjZ
CPU: Intel Core i5-6500
Mobo: ASRock H270M-ITX/ac
RAM: G.Skill NT 16GB 2x8GB DDR4-2400 CL15
Storage: PNY 120GB SSD + Seagate Firecuda 2TB 7200RPM HDD
GPU: Sapphire AMD Radeon RX 470 4GB
PSU: Corsair RM550x
Case: Fractal Design Define Nano S

Saw an MSI AMD Radeon RX 6600 XT on Newegg that, after promo + MIR, came out to $230 and bought it thinking it would be a decent upgrade for him; however, I'm now thinking about the fact that he's still using a Skylake-S quad core with no hyperthreading and slow-rear end 2400 MHz RAM. He doesn't have much in the way of budget, which is why I bought him the card, but just how badly are his remaining components going to hamper the new GPU? Do I need to look at doing a full system upgrade as well at this point?

He desperately wants to play the new Dead Space remake, but I also don't want to spend more than the cost of an Xbox Series S if I can avoid it, since that could also play the game. FWIW I have a Ryzen 5 3600 non-X and 16GB of 3000 MHz DDR4 memory just lying around, so it'd literally just be a motherboard that would need to be replaced, but looking at prices for mITX options, they're still kinda pricey for him/what I'm willing to just randomly spend on him.

The other question concerning the GPU is that it's PCIe 4.0 x8 (full x16 slot, but only half the pins are wired up). As such, using his current board, or a B450/A520 replacement, would mean he'd be running at PCIe 3.0 x8. How much of a performance impact is that actually going to have? If it's 5% or less then whatever, but if we're talking double-digit performance percentage differences due to AMD being sneaky shits then the board will have to be B550.
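For my own sanity I ran the back-of-the-envelope numbers on the link itself; this is just a rough sketch assuming the usual per-lane figures of roughly 0.985 GB/s for PCIe 3.0 and 1.969 GB/s for PCIe 4.0, so treat it as an upper bound on the link, not a benchmark:

```python
# Rough PCIe link bandwidth comparison for an x8 card in a 4.0 vs 3.0 slot.
# Per-lane throughput figures are approximate spec numbers, not measurements.
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe link in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

native = link_bandwidth("4.0", 8)    # RX 6600 XT in a PCIe 4.0 board
fallback = link_bandwidth("3.0", 8)  # same card in the H270/B450/A520 board
print(f"4.0 x8: {native:.1f} GB/s, 3.0 x8: {fallback:.1f} GB/s "
      f"({fallback / native:.0%} of the native link)")
```

Halving the link doesn't halve frame rates, of course; from the PCIe-scaling tests I remember seeing, the hit was low single digits as long as the game stayed inside the card's 8GB of VRAM, which is exactly the caveat for a card like this.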

Branch Nvidian
Nov 29, 2012



I could be completely wrong and have dreamed it all up, but I thought I read or heard in a video that the RX 470/480 lacked some kind of API support or something that just hard prevents the game from loading. Again, I may be making that up completely, but I haven't found any videos of the game running on a 470 or 480.

Branch Nvidian
Nov 29, 2012



repiv posted:

the RX 470 supports DX12 FL12_0 which is sufficient (at least API wise) for any current game, with the obvious exception of metro exodus enhanced (but you can still play the OG version of exodus instead)

forspoken required FL12_1 at launch but that requirement was patched out

That might be what I'm conflating it with. I'll have him check if he can get it to run. That said, even if it does run, I'm not expecting it to be a particularly great experience given the age of the card. Certainly better than the 550 in the video that was posted, but still not great.

Branch Nvidian
Nov 29, 2012



Cygni posted:

https://www.newegg.com/yeston-rtx-4080-16g-d6-ya/p/1FT-007N-00081





if anyone has $1400 to blow on irony (you are buying it ironically... right?)

I love Yeston’s ridiculous cards. They’re too expensive for what you’re getting most of the time, but they’re like the only brand not doing edgy black and grey designs on everything. Sure they’re too far in the other direction for a lot of folks, but that they’re even doing it owns.

Branch Nvidian
Nov 29, 2012



kliras posted:

i genuinely think the card looks cool. the backplate is just completely random, but customizable backplates aren't much of a thing. i wouldn't mind a card like that if it had a blank backplate. not that it wouldn't be obscured by a cpu cooler anyway, but still

FWIW their 7900 XT model has a more generic backplate. It still has the LCD thing with an anime girl in it, but the rest of the backplate is just grey with some color to it.

Branch Nvidian
Nov 29, 2012



I'm a broke brained idiot loser baby and just bought a Sapphire Pulse 7900 XTX (a month after pulling the trigger on a $230 6600 XT for my brother) to replace my EVGA 3070 Ti FTW3 model due to VRAM issues I've been encountering, most specifically with Resident Evil 4 Remake and trying to run the game at ultrawide 1440 resolution with textures that don't look like they're from the original GameCube game. Probably a dumb purchase, but alas here I am. Couple of questions that I should have asked before spending $1000 on a piece of computer hardware:

My case is a Cooler Master NR200P, so max card size is 330mm length and triple-slot height. The Sapphire Nitro+ is too big, so I opted for the Pulse, but it looks like the ASRock Phantom Gaming OC, which also fits in my case, is a better performing card. With both at $1000, should I have gone with the ASRock? I always thought Sapphire was kind of like the EVGA of the AMD-exclusive brands, so I just defaulted to them, but I don't really know where I got that idea or if it's even remotely accurate.

Secondly, I have an LG 34GP950G-B monitor with a dedicated G-Sync Ultimate module in it. I know that hardware G-Sync doesn't work with AMD GPUs, and I've read that the monitor supports regular VRR (aka bog-standard FreeSync), but I can't find anything about what the variable range on it would be. I really don't want to also have to replace my monitor if the VRR window is some stupidly narrow range that ends up useless, since I've only had it for about 8 months at this point.

Please tell me how much of an idiot I am for replacing a good GPU that has questionable VRAM allocation with an arguably way overkill GPU for my gaming resolution without asking some pretty basic-rear end questions beforehand.

Branch Nvidian
Nov 29, 2012



wargames posted:

does nvidia even ship the module anymore, i thought it was all g(free)sync now.

Monitors labeled as “G-Sync Ultimate” still have modules in them. When they started supporting software VRR as “G-Sync Compatible” they rebranded the version using hardware modules. It has led to more than a little confusion at times.


On the topic of TLOU PC: maybe I'm approaching it from the wrong perspective or just don't quite understand things correctly, but it seems quite bizarre to me that a PS5 with an APU roughly equivalent to a Ryzen 3600 and a 6600 XT is able to run the game at 4K at 30fps without choking to death, but a PC with a top-shelf CPU and GPU, with vastly higher performance specs, is struggling to run the game.

Branch Nvidian
Nov 29, 2012



Dr. Video Games 0031 posted:

How much more powerful do you think high-end PCs are? The 3090 is generally "only" 2.2 times as performant as the 6600 XT at native 4K (less so at lower resolutions). And the PC port's ultra settings are a bit heavier than the console settings, so that should roughly account for the GPU performance we're seeing.

CPU performance is also not really "vastly" higher with the PC CPUs most users have. The goon who was reporting a CPU bottleneck here at 4K had a 4090 and a Zen 3 CPU, which will only be so much faster than the PS5's Zen 2 CPU. If the PS5 game was designed to apply a heavy load on the CPU with the 60fps mode enabled, then I can see a Zen 3 CPU struggling to do better than 80fps, especially when considering the heavier-than-console settings.

This is what I get for conflating floating-point throughput with actual graphical performance, and for using TechPowerUp's "relative performance" scale as if it translated to every use case.
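For my own reference, here's the napkin math on why the spec-sheet numbers led me astray; the TFLOPS values are the commonly cited FP32 figures, so treat them as approximate:

```python
# Spec-sheet FP32 compute vs. the measured 4K gap from the quoted post (~2.2x).
# TFLOPS values are approximate published figures, not measurements of my cards.
tflops = {"RTX 3090": 35.6, "RX 6600 XT": 10.6}

compute_ratio = tflops["RTX 3090"] / tflops["RX 6600 XT"]  # ~3.4x on paper
measured_ratio = 2.2                                       # per the quoted post

print(f"FP32 ratio: {compute_ratio:.1f}x vs measured 4K ratio: ~{measured_ratio}x")
# Memory bandwidth, occupancy, and settings eat the difference, which is why
# raw FLOPS (or one "relative performance" number) don't carry over to every
# game and resolution.
```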

Branch Nvidian
Nov 29, 2012



Uhh, another question that should have an obvious answer, but I want someone here to confirm or deny instead of some rando on Reddit: my 3070 Ti was a 2x8-pin card (290W) and I was running two separate cables from my PSU to it. The Sapphire Pulse 7900 XTX is 3x8-pin (370W); can I just use the pigtail on one of the two cables, or do I need to run a third from the PSU?

I know the PCIe slot supplies 75W of power and each cable from the PSU should supply 180W. 180W x 2 + 75W = 435W, which covers it, but maybe it's smarter to just run three individual cables?
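For what it's worth, the napkin math I actually care about is the per-cable load rather than the total; a rough sketch assuming the card's ~370W board power splits roughly evenly across its three 8-pin inputs and the slot provides up to 75W:

```python
# Rough per-cable load for a 3x8-pin, ~370W card fed by two cables (one
# pigtailed) vs. three individual cables. All figures are approximations.
SLOT_W = 75
BOARD_POWER_W = 370

per_connector = (BOARD_POWER_W - SLOT_W) / 3  # ~98W through each 8-pin input

# Two cables, one pigtail feeding two of the three inputs:
print(f"single-input cable: ~{per_connector:.0f}W, pigtail cable: ~{2 * per_connector:.0f}W")
# Three individual cables:
print(f"each of three cables: ~{per_connector:.0f}W")
```

Roughly 200W down the pigtail is probably within what a decent PSU cable can carry, but three runs keeps every cable comfortably under its rating, which is presumably why the manual pushes separate cables.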

Branch Nvidian
Nov 29, 2012



So the anti-sag bracket included with the Sapphire Pulse 7900 XTX neither mounts as shown in the installation guide nor even fits in my PC case. It also had a big rear end warning that if the anti-sag bracket isn't used and GPU sag results in the card becoming defective, the warranty is void. Lmao

Branch Nvidian
Nov 29, 2012



Qubee posted:

I would like to buy a new GPU; I currently have a GTX 1080. It is no longer capable of running certain games at a good enough quality. What would people suggest I get? Budget isn't an issue, but I hate wasting money. I don't want to spend $200 extra for 2% extra juice. What card can I upgrade to that has the best bang for the buck? I'm also disappointed in Nvidia's antics lately, so I might try supporting AMD? But I'm so used to Nvidia cards.

This isn’t SAMart, but since I just replaced my 3070 Ti, I guess it’s for sale if a goon wants to buy it second hand. Still has 2 years of warranty left, but it’s EVGA and who knows how they’re going to be able to handle GPU warranties now that they’re out of the game. Going rate on eBay seems to be $450, so :shrug:

Branch Nvidian
Nov 29, 2012



sauer kraut posted:

That's not a laughing matter; sagging damage is very easy to determine.
Cracked PCB right above the right side of the PCIE connector and the hook latch thing, and VRAM module/GPU core solder ball joints ripped clean off.
They could easily make good on that threat; you need to finagle some kind of beam to hold the massive cooler up.

Companies have been lenient on the issue in the past to not spook customers, but with 250-400W quad-whopper coolers becoming the norm and Nvidia milking them for every penny, that could change.

Maybe the card makers need to start better reinforcing increasingly heavy cards instead of just throwing some poo poo in the box and putting it on the customer to figure out. The “solution” they provided, a cheap L-shaped piece of metal with a tiny padded foot, literally doesn’t work in my PC case. I’ve got the card supported by some foam, but this really should be on the manufacturers since they can add structural rigidity during the design phase.

Branch Nvidian
Nov 29, 2012



K8.0 posted:

They can't hang limitless weight off a bracket at one end and a PCIE slot. GPUs either need to get smaller (not happening) or supports are going to become more and more standard.

In this particular instance the support bracket attaches at the back, where the card's regular case mounting bracket is, and has a little foot on a metal track that supports the far end of the GPU. Attaching it turns a 2.5-3 slot card into a 3.5-4 slot card. If the bracket is mandatory, then either the card should be advertised as being the slot height it will be with the bracket attached, or the bracket should be integrated into the card somehow.

The idea that customers should be responsible for the card’s structural unsoundness is kinda bullshit in my opinion. The idea of them being “lenient” about providing warranty coverage is laughable.

Branch Nvidian
Nov 29, 2012



Zero VGS posted:

The animation really feels delayed but I guess that's how the game was in the first place; it feels like a half second between pressing a direction and going that way. Warframe this ain't.

This is just how the game plays. The PS3, PS4, and PS5 versions all played like this too, as does TLOU Part 2. I got used to it after a while, but I think it lends itself to controller play better than KBM.

Branch Nvidian
Nov 29, 2012



Gyrotica posted:

Grandma Waifus 2, clearly.

They still look like they’re barely 13 though.

Branch Nvidian
Nov 29, 2012



Prescription Combs posted:

I definitely don't regret getting the 7900XTX.. Thing has been completely solid and stable.

I don’t regret getting a 7900XTX, but AMD’s driver software absolutely has a learning curve to it. FFXIV in borderless window mode doesn’t obey frame rate capping, so the GPU cranks out 300+ fps while the fans spin up to maximum speed and sound like a jet engine (found a solution for this using the Chill feature eventually).

RE4 Remake also has crashes on it that are different from the crashes my 3070 Ti was running into. I'll exit a cutscene, bust through a door, or enter/exit combat and the game will hitch for a few frames and then just freeze. Don't know if this is the card's fault or not. Haven't tested anything else out with it other than compiling shaders on TLOU, and lmao that's not worth loving with at this juncture.

Branch Nvidian
Nov 29, 2012



Combat Pretzel posted:

Seeing that NVidia is pushing/demoing that RTXDI and RTXGI path-tracing stuff via Cyberpunk 2077, I wholly expect the 50x0 series of cards to have path-tracing specific hardware accelerators. I just hope they're implemented in such a way that something like UE5's Lumen and similar implementations can profit from it.

--edit:
Didn't see the remix stuff up until now, but that adds to my theory.

Has anyone said Nvidia GeForce PTX 5090 yet?

Branch Nvidian
Nov 29, 2012



Kramjacks posted:

It's always nice when some clickbait thumbnail shows up on my feed to remind me how much hardware enthusiast journalism sucks.





AMD SAID WHAT?!

Branch Nvidian
Nov 29, 2012



Didn't the 1070, and by extension all 10-series cards, have to deal with the fact that Nvidia's Founders Edition cards had a premium over whatever MSRP they set? So a 1070 might have been $380 MSRP, but Nvidia's in-house model was $450, so all of the AIBs wanted to charge at least $450 at first.

Branch Nvidian
Nov 29, 2012



FuturePastNow posted:

The 1630 is worse than the 1050 too

Yeah, and I'm worse than my dad, what's your point?

Branch Nvidian
Nov 29, 2012



Okay, so I'm just confused as poo poo. I've been trying a bunch of stuff on my own and haven't been able to figure out the exact source of the problem without going down several rabbit holes about silicon stepping mismatches and other poo poo. I've mentioned my system before, but for the sake of keeping everything in one post I'll list it again:

AMD Ryzen 9 5900X with a big-rear end Noctua NH-D12L on it
Sapphire AMD Radeon RX 7900 XTX using three separate PSU cables
G.Skill Trident Z Neo 3600MHz CL16-19-19-39 RAM (specifically marked as being for AMD systems)
Samsung 950 Evo NVMe SSD
Cooler Master 850W SFX PSU

I'm assuming it's a GPU issue, since this wasn't happening until after I changed from a 3070 Ti to the 7900 XTX. In Final Fantasy XIV my clock speeds are jumping all over the place, and intermittently this results in extreme stuttering/huge frame time spikes. I'll go into an instanced dungeon and my GPU utilization jumps anywhere from 45%-90% with my clock speed going from 1400MHz up to around 2700MHz. CPU utilization isn't very high, but it's a 12-core/24-thread CPU so I wouldn't expect it to be. XMP is enabled in BIOS. I uninstalled the Adrenalin software using DDU in safe mode and installed only the driver while in safe mode; this seemed to fix things for about a day and then the issue came back. I've seen suggestions about running three power cables to the GPU (done), turning off ULPS (done), increasing the power limit (done), and setting a minimum and maximum frequency in Adrenalin (tried that before removing Adrenalin entirely in favor of the driver alone). I'm honestly at a loss as to what I'm not thinking of or trying here. I don't have Vsync enabled, I have Freesync on, and I'm running a 240Hz Freesync Premium rated monitor.

I know that RDNA3 is extremely different from Maxwell/Pascal/Turing/Ampere, so I can't directly compare what I'm used to on Nvidia cards to this one, but it seems as if there is something causing a weird bottleneck, especially considering I've seen my framerate in FFXIV be well into the 300s shortly after I started using this card. I feel like all I do is ask for help lately, but I could really use guidance from someone who has an idea of what might be going on here.
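In case it helps anyone chasing the same thing, this is roughly how I'd quantify the spikes from a PresentMon-style frame time capture instead of eyeballing an overlay; the CSV column name is an assumption based on PresentMon's default output, and the file name is just a placeholder, so adjust both for whatever capture tool you actually use:

```python
import csv
import statistics

# Flag frame time spikes in a frame-time CSV capture. Assumes a column named
# "MsBetweenPresents" (PresentMon's default export).
def find_spikes(path, spike_factor=3.0):
    with open(path, newline="") as f:
        frame_times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    median = statistics.median(frame_times)
    spikes = [ft for ft in frame_times if ft > median * spike_factor]
    return median, spikes

median, spikes = find_spikes("ffxiv_capture.csv")  # hypothetical capture file
print(f"median frame time: {median:.1f} ms, spikes over 3x median: {len(spikes)}")
if spikes:
    print(f"worst spike: {max(spikes):.1f} ms")
```

A single-digit-millisecond median with occasional 50ms+ outliers would match the "high fps but visible hitching" behavior I'm describing.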

Branch Nvidian
Nov 29, 2012



wargames posted:

look for radeon chill settings, vsync isn't bad with freesync, but there are other amd options like enhanced sync, do you have anti-lag?

I don't have any of those settings available to me right now since I removed the Adrenalin software and went with a driver only install.

Branch Nvidian
Nov 29, 2012



Indiana_Krom posted:

Change your windows power plan to "high performance" and see if it keeps doing it.

Already did that as well.

Branch Nvidian
Nov 29, 2012



orange juche posted:

When you swapped the 3070ti for the 7900xtx, did you use DDU to clean every trace of NVidia software from your PC? Nvidia leaves some poo poo in your registry and system files that is not purged by the NVidia uninstaller. Swapping from NVidia to AMD, or vice versa can cause weird/sporadic issues with clocks and performance if you don't run DDU.

https://www.guru3d.com/files-tags/download-ddu.html

We would all like to believe that uninstalling the drivers for something will purge it, but often enough it leaves little bits behind. Purge the drivers for both AMD GPUs and NVidia GPUs from the system, and then do a clean install of the Radeon drivers.

E: I see you used DDU to purge the Adrenalin poo poo from your system, but I'm unsure if you'd already done that for NVidia. If you didn't, try it. Failing that, OS refresh. Backup your poo poo and then reinstall Windows, as it's very likely something left over from going from NVidia to AMD GPUs that's causing the issue.

Did you have NVidia Broadcast or anything related to NVenc or other NVidia specific software installed on the system?

I used DDU in safe mode to remove all traces of Nvidia software. I also ended up actually doing a whole OS reinstall about two weeks ago anyway. Part of me wonders if I've just got some strange setting toggled somewhere, buried under several menus, that's causing this to happen.

Indiana_Krom posted:

Another possibility is that since the 5900X is a multi-CCD CPU the game could be hopping between CCDs. Try pinning the game to cores only across a single CCD and see if it smooths out.

I just attempted this with Process Lasso and started having the stuttering again. FWIW it also did this when I disabled the second CCD entirely in Ryzen Master.
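For anyone wanting to try the same experiment without Process Lasso, here's a rough sketch using psutil; the assumption that logical CPUs 0-11 map to CCD0 on a 5900X, and the ffxiv_dx11.exe process name, are mine, so verify them on your own system:

```python
import psutil

# Pin the FFXIV client to the first CCD of a 5900X. Assumes logical CPUs 0-11
# are CCD0's six cores plus their SMT siblings, and that the DX11 client runs
# as ffxiv_dx11.exe; both are assumptions to check before relying on this.
CCD0_CPUS = list(range(12))

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "ffxiv_dx11.exe":
        proc.cpu_affinity(CCD0_CPUS)  # restrict scheduling to CCD0 only
        print(f"pinned PID {proc.pid} to CPUs {CCD0_CPUS}")
```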

Branch Nvidian
Nov 29, 2012



orange juche posted:

On the OS reinstall, did you reinstall the Nvidia drivers/3070ti under the reinstall, or has it been living wholly on the 7900XTX?

Sorry, the OS reinstall was post-7900 XTX installation.

Branch Nvidian
Nov 29, 2012



Cygni posted:

Have you checked temps for CPU/GPU? And seen what the CPU/GPU reports the current limit as (power limit/thermal/current/VRM etc) while gaming?

CPU temp 74-75C. Thermal limit is 90C.
GPU temps 63C edge, 75C junction. Limits should be whatever the default limits set by AMD are.

orange juche posted:

Even more ironic since they don't have any NVidia hardware in their PC now. They drank the red koolaid and that poo poo was poison.

Yeah, the irony is not lost on me.

I'm starting to think I just need to reinstall FFXIV again and see if maybe something in one of the game files is busted and causing this, since I don't really seem to be having the issue in anything else.

Edit: yeah, I'm going to do another reinstall of FFXIV and see if it's just a config file that got hosed up causing it. Will report back once all the patches and everything have been reinstalled.

Branch Nvidian fucked around with this message at 03:54 on Apr 27, 2023

Branch Nvidian
Nov 29, 2012



As of tonight the following settings seemed to work:
5900X in "Game Mode," so only one CCD active
Radeon Chill on with Minimum at 238fps, Maximum at 239fps.
SAM disabled
GPU Tuning Minimum Frequency of 2800MHz, Maximum of 2900MHz, 1150mV.
Power Tuning at +15%
Enhanced Sync: Off
Wait for Vertical Refresh: Always On
Adaptive Sync Compatible: Enabled

I'm sure none of this will matter and the issue will reappear tomorrow, but figured I'd update with what mostly worked for this evening. Still frequently had GPU utilization dropping down into the 60% range. Starting to wonder if it's a power limitation, even though I should be well under the necessary power threshold and a giant 3x transient spike isn't shutting the system down or anything.

Branch Nvidian
Nov 29, 2012



Reporting back in on my FFXIV GPU issue. Turns out the problem was being caused by LG's OnScreen Control application, which is used to update monitor firmware and such... I cannot believe that was the issue, but it figures it would be some bizarre thing causing it. GPU utilization is still reported as jumping around from the low 50% range up to 100% at times, but I kinda think that might be due to the chiplet design of RDNA3 and stuff not knowing how to accurately report it or something? I'm back to getting super high frame rates with middling GPU utilization, but as long as the game isn't stuttering constantly I don't care anymore.

On the subject of Jedi Survivor, I downloaded it since I've got the EA Play premium plus whatever thing, and had a good laugh at how badly the game renders some stuff. These images are with FSR on and with it off. Ultra settings, 3440x1440. You might have to actually open the images in another tab at full resolution to see the issue, but lol.

On


Off

Branch Nvidian
Nov 29, 2012



Arrath posted:

Skyrim and Fallout had infamously bad performance and save corruption issues on the PS3 IIRC, but I think that was more on the architecture of the console and how they tried to kludge the engine into working on it.

My favorite FO3 thing was the Alaska DLC, where something about the light reflections off the snow would hard lock the PS3 and you'd have to pull the power cable out of the console because nothing would work on the system, including the power button. After the patch that came with that DLC, sometimes the system would also hard lock because an explosion was too bright.


Branch Nvidian
Nov 29, 2012



I just completely deleted Jedi Survivor from my system. For one reason or another the patch never downloaded, and the game continued to hitch and lag like poo poo. Some real bush league poo poo.
