orange juche
Mar 14, 2012



necrobobsledder posted:

The primary reason was to get him to stop being a scalper by threatening that he could potentially lose his clearance if there was something legitimately illegal going on. So many people in the area have clearances to do their jobs here that most folks are pretty drat straight-laced outside of leadership positions

It's not illegal to scalp, so there's nothing you can do about the dude besides ban him from the store, if you're store management. Threatening adverse action against a person's clearance (if they have one) without legitimate cause, however, is illegal.

orange juche
Mar 14, 2012



please do not cyberstalk dudes over gpus, it isn't worth it.

orange juche
Mar 14, 2012



MarcusSA posted:

This is surprisingly still in stock



https://www.costco.com/msi-gf75-thin-gaming-laptop---10th-gen-intel-core-i7-10750h---geforce-rtx-3060---144hz-1080p.product.100736982.html

Reading through that SD post, they're saying it has a seriously crippled GPU that a 2070 beats.

No idea how true that is but :shrug:

It’s not as simple as getting a laptop with a 30 series and calling it good.

It's a laptop. Laptop cooling capabilities mean that the CPU and GPU together have to fit inside about 75W. They allocate about 35W to the CPU and the rest to the GPU.

A lot of things will beat that laptop performance-wise in the desktop space.

There are one-off stupid builds by the likes of Sager and Clevo etc., but if you're looking at that MSI laptop you can't afford the ones I mentioned.

orange juche fucked around with this message at 07:42 on Mar 11, 2021

orange juche
Mar 14, 2012



MarcusSA posted:

I’m specifically talking about other laptops.

The 3060 in that one is super slow.

3060s aren't quick. I think the desktop version got a review of "well, it's a GPU, we guess"

The 3060Ti is plenty quick but unobtainable, much like loving everything in the GPU space. I could sell my loving 980Ti I bought 6? years ago for more than 50% of its sale price.

orange juche fucked around with this message at 07:59 on Mar 11, 2021

orange juche
Mar 14, 2012



my kinda ape posted:

Just sold my GTX 1070 for $490 minus shipping within two hours of putting it up on eBay :laffo:

I've got a 980Ti "Golden Dragon" edition which I could probably re-paste and then sell for about 400 bucks. Might look into that since I've still got the box, antistatic bag, and all the goodies for the card.

orange juche
Mar 14, 2012



Mordekai posted:

Am I a fool for going with Pop!_OS on my new desktop built around the 3070? I really don't care much about RTX, maybe for trying it out in some games.

If you're gaming, Linux isn't an excellent choice, but if you're a "Linux gamer", Pop!_OS is Ubuntu-based (so Debian-derived), with all of the positives and negatives that implies.

Proton and DXVK have made Linux gaming a lot more workable, but it's still very hit-and-miss. There's a very deep rabbit hole you can go down, with custom Proton builds for specific games and Lutris to manage your various installs outside of Steam.

orange juche fucked around with this message at 09:27 on Mar 18, 2021

orange juche
Mar 14, 2012



I mean it works, but you're going to be experiencing weird bugs. With Warzone, for example:

quote:

AverytheFurry 2 weeks ago

Crashes my entire pc when trying to install

Yeah it's still a bit poo poo

If you want to go on a quixotic quest to game on Linux successfully, you're going to need to work on average twice as hard to get poo poo working vs someone on Windows, and many games straight up won't work or will break in awful ways. I have recent experience with Linux gaming and it's not worth it if you have an option. Good luck with it though.

orange juche fucked around with this message at 09:40 on Mar 18, 2021

orange juche
Mar 14, 2012



Paul MaudDib posted:

With GPUs you absolutely do want to pre-spread it, there isn’t a heatspreader to “average” the hotspots so if you have spots that don’t get covered you can potentially damage the die.

So don’t do a “pattern”, put down some paste and spread it evenly. Dot or line is fine, complex patterns have more risk of creating an air bubble where the two lines meet.

For the record, for both that CPU and GPU there is no heat spreader, so he needs to apply an extremely thin layer across the entire die surface on both chips.

orange juche
Mar 14, 2012



njsykora posted:

This is why so many youtubers started hiding how much paste they use.

Paste application methods are like assholes, everyone has one and they all stink

orange juche
Mar 14, 2012



shrike82 posted:

dumb question but what are the headless plugs for GPUs for? i have headless machines without them doing GPU compute frequently, so i'm curious if there's some use case locked behind needing to pretend a monitor is connected

Certain headless apps require a dummy plug in one of the video outputs: using GPU transcode with an Nvidia card on a Plex server, for example, needs one in order to utilize NVENC. I believe it's also used for GPU passthrough for virtual machines, because if there's nothing plugged into the GPU, the 3D acceleration and video decode functions are shut down/can't be utilized.
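If you want to sanity-check NVENC on a headless box, here's a minimal sketch, assuming ffmpeg is installed and on PATH (the nvenc encoder names are ffmpeg's standard ones):

code:
# Minimal sketch: list the NVENC encoders ffmpeg can see on a headless box.
# Assumes ffmpeg is installed and on PATH.
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

nvenc = [line.strip() for line in out.splitlines() if "nvenc" in line]
print("\n".join(nvenc) if nvenc else "no NVENC encoders found")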

orange juche
Mar 14, 2012




Probably a safe bet. On the other hand, in 2023 when semiconductor prices collapse, you'll be able to trade 3 3080s for a Domino's pizza.

orange juche
Mar 14, 2012



Cygni posted:

go to a potion seller with a big iron cauldron and get an antidote, do I have to think of everything asus?

https://www.youtube.com/watch?v=R_FQU4KzN7A

you trying to get BIOS updates from asus

orange juche
Mar 14, 2012



Both Aveum and that UE5 hobbyist project look not super impressive; they're leaning too far into the hyper-real, which causes your eyes to pick out the inconsistencies, such as how the smoke from the chimney dissipates into nothing instead of being a steadily rising haze, the water in both projects looks like poorly defined poo poo, etc.

Aveum looks way the gently caress worse than the other; EA is going to spend a shitpile of money on a complete dogshit game, since the gameplay is as inspired as the graphics, and we're all here to laugh at it.

VVV Pretty much. That bodycam game, even though it was likely "gameplay", did a better job of fooling me into believing it was realistic, via sound design, actor/camera movement/distortion, and the faint artifacting you'd see on a lovely bodycam, than either of the other two. There were definitely some smeared textures and you can find lots of problems if you pick through it, but the artifacting covered for it in the moment.

orange juche fucked around with this message at 09:20 on Apr 22, 2023

orange juche
Mar 14, 2012



Dr. Video Games 0031 posted:

I'm left somewhat confused by these configurations. Why is there no 8CU configuration? Why do they think that gaming handhelds have a use for 8 CPU cores? 6 cores/12CU, 4 cores/8CU, and 4 cores/4CU would make much more sense and would surely be cheaper to produce too. There's a reason why, when valve commissioned a custom APU, they gave it only 4 cores.

I guess at least the 6 core 4CU version will be good as an emulation device, but it will probably be slower than the steam deck for native x86 games.

It will likely still be better, as the Steam Deck is quite limited on its TDP. I believe it's capped at 15W, where a lot of the handheld PCs are going up to 20+ on up through 28W, though it really doesn't do much past 20W on a 6800U.

The Steam Deck does have the benefit of a known hardware configuration that SteamOS can optimize for, which the other handheld PCs don't, and you can see it in the Steam Deck doing more with a 15W TDP than some other setups do all the way up to 28W.

orange juche
Mar 14, 2012



Branch Nvidian posted:

Okay, so I'm just confused as poo poo and I've been trying a bunch of stuff on my own and haven't been able to figure out the exact source of the problem without going down several rabbit holes about silicon stepping mismatches and other poo poo. I've mentioned my system before, but for the sake of keeping everything in one post I'll list it again

AMD Ryzen 9 5900X with a big-rear end Noctua NH-D12L on it
Sapphire AMD Radeon RX 7900 XTX using three separate PSU cables
G.Skill Trident Z Neo 3600MHz CL16-19-19-39 RAM (specifically marked as being for AMD systems)
Samsung 950 Evo NVMe SSD
Cooler Master 850W SFX PSU

I'm assuming it's a GPU issue since this wasn't happening until after I changed from a 3070 Ti to the 7900 XTX. In Final Fantasy XIV my clock speeds are jumping all over the place, and intermittently this results in extreme stuttering/huge frame time spikes. I'll go into an instanced dungeon and my GPU utilization is jumping anywhere from 45%-90%, with my clock speed going from 1400MHz up to around 2700MHz. CPU utilization isn't very high, but it's a 12-core/24-thread CPU so I wouldn't expect it to be high.

XMP is enabled in BIOS. I uninstalled the Adrenalin software using DDU in safe mode and installed only the driver while in safe mode. This seemed to fix things for about a day and then the issue came back. I've seen things mentioning needing to have three power cables running to the GPU, which I've done; turning off ULPS, which I've done; increasing the power limit, which I've done; and setting a minimum and maximum frequency in Adrenalin, which I tried before removing that entirely in favor of the driver alone.

I'm honestly at a loss as to what I'm not thinking of or trying here. I don't have Vsync enabled, I have Freesync on, and I'm running a 240Hz Freesync Premium rated monitor.

I know that RDNA3 is extremely different than Maxwell/Pascal/Turing/Ampere so I can't directly compare what I'm used to on Nvidia cards to this one, but it seems as if there is something causing a weird bottleneck, especially considering I've seen my framerate in FFXIV be well into the 300s shortly after I started using this card. I feel like all I do is ask for help lately, but I could really use guidance from someone who has an idea of what might be going on here.

When you swapped the 3070 Ti for the 7900 XTX, did you use DDU to clean every trace of Nvidia software from your PC? Nvidia leaves some poo poo in your registry and system files that is not purged by the Nvidia uninstaller. Swapping from Nvidia to AMD, or vice versa, can cause weird/sporadic issues with clocks and performance if you don't run DDU.

https://www.guru3d.com/files-tags/download-ddu.html

We would all like to believe that uninstalling the drivers for something will purge it, but often enough it leaves little bits behind. Purge the drivers for both AMD GPUs and NVidia GPUs from the system, and then do a clean install of the Radeon drivers.

E: I see you used DDU to purge the Adrenalin poo poo from your system, but I'm unsure if you'd already done that for Nvidia. If you didn't, try it. Failing that, OS refresh: back up your poo poo and then reinstall Windows, as it's very likely something left over from the switch from Nvidia to AMD GPUs causing the issue.

Did you have NVidia Broadcast or anything related to NVenc or other NVidia specific software installed on the system?

orange juche fucked around with this message at 03:17 on Apr 27, 2023

orange juche
Mar 14, 2012



Branch Nvidian posted:

I used DDU in safe mode to remove all traces of Nvidia software. I also ended up actually doing a whole OS reinstall about two weeks ago anyway. Part of me wonders if I've just got some strange setting toggled somewhere, buried under several menus, that's causing this to happen.

I just attempted this with Process Lasso, and started having the stuttering again. FWIW it also did this when I disabled the second CCD entirely in Ryzen Master.

On the OS reinstall, did you reinstall the Nvidia drivers/3070 Ti after the reinstall, or has it been living wholly on the 7900 XTX? Also, do you use multiple monitors?

E: There's also apparently a compositor feature that Microsoft added a couple of years back called Multi-Plane Overlay, which can cause weird interactions with GPUs; it allows Windows to continue displaying things that are not in focus in the background.

https://github.com/RedDot-3ND7355/MPO-GPU-FIX

Apparently it can gently caress people up on most AMD cards from the 6000 series up, and also Nvidia RTX 3000 series cards. I've never had the issue with my RTX 3080, but since the problem and the fix were literally discovered by Nvidia of all things, it is apparently a thing.

orange juche fucked around with this message at 03:34 on Apr 27, 2023

orange juche
Mar 14, 2012



gradenko_2000 posted:

Branch Nvidian is a pro-tier username, dang

Even more ironic since they don't have any Nvidia hardware in their PC now. They drank the red Kool-Aid and that poo poo was poison.

orange juche
Mar 14, 2012



ijyt posted:

£700 and dealing with ASUS-made software, hmmm. Don't think I'll be regretting my 512GB Deck purchase.

Yup, just wait for one of the Chinese companies to make one that doesn't have weird ASUS bloatware and is also £100 cheaper, with better RAM or storage.

orange juche
Mar 14, 2012



Dr. Video Games 0031 posted:

Just hook up one of the daisy chained power connectors to the adapter. That's perfectly fine to do.

Daisy-chaining the power adapters is bad, actually; you can overload the connectors and make the magic smoke come out, especially when you're dealing with a 4090. The 4090 draws 450W, which means you need 3x 150W connectors to guarantee you can drive it, and the 4th is for transients. Don't daisy chain your 4090.

Each 8-pin connector is rated to pass 150W, and while the wires can handle more, even up to 300-400W, do you want that power going through a connector that is rated for 150W? Get a slight power hiccup and suddenly your GPU is smoking from the power pins.
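The connector math, as a quick sketch (my own illustration of the numbers above):

code:
# Each PCIe 8-pin connector is rated for 150W; a 4090 draws ~450W.
GPU_DRAW_W = 450
CONNECTOR_RATING_W = 150

needed = -(-GPU_DRAW_W // CONNECTOR_RATING_W)  # ceiling division -> 3
print(f"{needed} connectors to cover steady-state draw, plus a 4th for transients")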

orange juche
Mar 14, 2012



RDNA3 is actually pretty good at RT; the problem is that FSR2, which is a core component of a ray tracing solution at 4K, is basically non-existent in games. DLSS has much wider support, and while FSR2 might eventually wind up supporting more games, it's a long way from catching up to DLSS in how widespread it is.

orange juche
Mar 14, 2012



AzureSkys posted:

Kind of a dumb question, but I need to use a few more monitors with my RTX 3090 (which has 4 connectors, and I need two more). I have an old GTX 660 Ti that fits OK to install on my motherboard and case.

However, the drivers for the RTX 30 series don't include the GTX 600 series so it's not being recognized by Windows 10 x64. I don't see any RTX 30 series drivers that do. Installing any that are compatible with the 660 then disables the 3090.

Is there any way to get them both to function or is it just not doable without the driver support working for both?

It's not a big deal, I'm just experimenting with some options and wanted to see if I can use this old hardware first.

Not really, unless you want to dabble in custom-edited drivers, and you don't want to do that; unsigned drivers are a pretty big safety risk.

Get a newer 2nd GPU that fits in your case; there are single-card solutions that are new enough to be supported, I think, if card height is your concern.

orange juche
Mar 14, 2012



Yudo posted:

I have the terrible feeling that fsr 3 isn't coming until the middle of rdna 3's life cycle. That isn't going to help in terms of developer adoption, though apparently fsr is easy to implement. Maybe if it runs on rdna 2 it will end up on consoles.

FSR3 does run on RDNA2; it's an open standard and AMD specifically stated it would run on older hardware. They'd be silly not to have it run on older hardware, as current consoles are RDNA2-based, and the lion's share of AMD's graphics hardware goes into consoles.

orange juche
Mar 14, 2012




They made the game for console and were told at the last minute "btw, you need to release for PC too", but since they made the game all hosed up, on account of consoles being able to handle that, the PC version is just wrecked.

Not that their dumb coding doesn't have an impact on consoles, just less of one.

orange juche
Mar 14, 2012



Subjunctive posted:

This must mean that nobody has even gestured at the build with a profiler on PC, or else the supernova-bright problems are just baked in so deep they don’t know how to attack them.

900ms waits is deep, bellowing lols though.

They got their professional certs at the bottom of a Cracker Jack box, the entire team.

orange juche
Mar 14, 2012



Cross-Section posted:

Yeah, I was planning on taking a break until more patches dropped but then I stumbled upon that DLSS3 mod

https://www.youtube.com/watch?v=BbRdpHex2No

You indeed have to drop 5 bucks on the creator's Patreon but it doubled my framerate and got rid of a lot of the stutters I was experiencing, so I'd say that's a good enough deal

It didn't get rid of the stutter, it just hid it. The game is still hitching and locking up like complete garbage; your GPU is just masking it from you with a fake frame.

orange juche
Mar 14, 2012



gradenko_2000 posted:

now I haven't programmed in over a decade, but from a process-oriented perspective, it seems to me like you would approach this problem from the end result backwards

if you want a game to run at 60 FPS, then you have 16.67ms to get everything done before a new frame needs to be rendered

that's your, let's call it a temporal budget, and so you'd look at what exactly "everything" breaks down into, and then parcel it out across the 16 milliseconds

you start with what absolutely needs to happen, and go on down the list - if anything is still taking up time that could be shoveled off into another thread, or doesn't need to be done at all, or can be made to run faster/shorter, you trim it

even after you've pared everything down to something that fits within your budget, you'd be doing it with a particular hardware profile in mind. something like a 4-thread Pentium might not have enough threads to handle all the ones you'd spawn off the main render thread, or something like a Ryzen 1400 might not have enough clock speed/instructions-per-clock to complete your main render thread as fast as your baseline, and that's where hardware requirements and recommendations come in. but if you were baselining against, say, a Ryzen 3600, then you'd know that anyone with an even faster CPU than that would be able to hit the 60 FPS mark (setting aside the GPU for now)

I'm just shooting the poo poo and everything I've mentioned is probably way easier said than done, but does this make sense?


Wiggly Wayne DDS posted:

you've just described the FOX Engine as developed at Konami. the development costs for that were around $50m from memory, then it got abandoned as everyone gawked from the sidelines at how it ran perfectly on any platform. it costs a lot to create variations of fast or accurate shaders, budget AI, and interleave it all across threads, but the end result was insane. that was a ground-up redesign, where other engines are bolting on dynamic resolution scaling and calling it a day

Basically the devs planned on AI upscaling via DLSS and AI false-frame generation saving the day, and were supremely lazy in prioritizing tasks in the render queue. It's gone from "must have been a bug" to "wow, this is shocking incompetence and laziness, how in the gently caress did this pass QA", and yet EA will still make a fuckton of money because Star Wars is popular.

I don't really know how you fix something that is broken at such a low level that your render pipeline has 900ms thread locks. Like, they developed the whole game wrong, as a joke, or something.
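For scale, the frame-budget idea from the quote above is easy to sketch out; the task names and per-frame costs here are made up for illustration:

code:
# 60 FPS gives you ~16.67 ms per frame; everything has to fit inside that.
BUDGET_MS = 1000 / 60

tasks_ms = {  # hypothetical per-frame costs
    "input": 0.5,
    "game logic": 3.0,
    "physics": 2.5,
    "animation": 2.0,
    "draw submission": 5.0,
}

used = sum(tasks_ms.values())
print(f"budget {BUDGET_MS:.2f} ms, used {used:.2f} ms, "
      f"headroom {BUDGET_MS - used:.2f} ms")
# For scale: a single 900 ms thread lock eats ~54 frames' worth of budget.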

orange juche
Mar 14, 2012



power crystals posted:

The 5800X3D is notoriously a nightmare to cool due to the stacked die topology (the cache die effectively acts as a thermal insulator). The stock cooler won't break it or anything, but getting a fancier one may appreciably improve its performance by letting it boost for longer.

One of the best things you can do with the 5800X3D is undervolt it a bit on Vcore. I dropped my voltage down in the BIOS by nearly 0.1V, I think 0.08 or something, and it shaved off 10C as long as I'm not under all-core 100% utilization. The CPU doesn't get nearly as hot under gaming loads as it did before: I was sitting close to 80C while playing VRChat, and went down to 68C with no other changes.

The amount of thermal load that the stacked cache puts the CPU under is pretty crazy. The CPU itself won't use more than about 70W under load, but it will drive right on up to 80C even under water cooling, because the cache die blocks heat from traveling from the cores to the IHS.

orange juche fucked around with this message at 00:34 on May 7, 2023

orange juche
Mar 14, 2012



Yudo posted:

My undervolted 7900 XT uses 80W of power at idle (stock it is about 90W). That seems like a lot. Aside from that, no driver problems!

yeah, the MCD setup for the 7000 series GPUs is an idle power hog. Under load it uses less than the equivalent Nvidia card, but the Nvidia card is less power-hungry at idle. Not sure if the 7000 series cards just don't clock down as well, or if there are inefficiencies that are somehow gobbling up ~50W of power.

orange juche
Mar 14, 2012



Zephro posted:

Well, it doesn't until it does. If you start to bump up against the limits of the VRAM then it affects performance pretty dramatically. Forecasting the future is a mug's game, but given that most new AAA games will be targeting the Series X and the PS5 and dropping support for older consoles, I'd be a little uncomfortable with 12GB if I was planning to hold onto the card for more than a couple of years. It might be fine, but even if it is, it'll be a bit of a tight squeeze.

edit: especially given I'd have paid $600 for the card, ie $200 more than an entire PlayStation 5

Yeah, I mostly play VR poo poo nowadays, and things like VRChat especially are VRAM hogs, because nobody believes in optimized texture work. I have a 3080 12GB and I can watch my frame rate literally get cut in half as soon as I fill my VRAM and textures wind up cached in system RAM. Even worse, if you somehow wind up overflowing your VRAM significantly, which seems to pretty much be impossible outside of poo poo like VRChat, you can eat up all your system RAM with textures that didn't fit in VRAM, and crash your PC.

orange juche
Mar 14, 2012



Shipon posted:

one time i heard someone bitching about 16gb not being enough because of modded skyrim

at some point that's on you buddy, you're shoving bloated texture packs into age old games and pretending that means "fidelity"

I've run across poo poo where people are using 8K textures for skinning models, individual 8K textures for each material, instead of realizing that 4K is quite probably fine, and that they could also atlas the textures to save space in VRAM. Some people build avatars which consume an entire GB of VRAM just for the textures on one model. It's what happens when you give people who know gently caress all about best practices for 3D modelling the ability to import whatever the gently caress they want into your game.
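For a sense of scale, some rough texture-memory math, assuming uncompressed RGBA8 plus ~33% for the mipmap chain (my own illustration; block compression would shrink these a lot):

code:
# Rough texture-memory math: uncompressed RGBA8 (4 bytes/pixel),
# plus ~33% overhead for the mipmap chain.
def tex_mib(side: int, bytes_per_px: int = 4, mips: bool = True) -> float:
    base = side * side * bytes_per_px / 2**20
    return base * 4 / 3 if mips else base

for side in (2048, 4096, 8192):
    print(f"{side} x {side}: ~{tex_mib(side):.0f} MiB")
# 8K comes out to ~341 MiB per material -- three of those and a single
# avatar is already at ~1 GB of VRAM just in textures.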

orange juche
Mar 14, 2012



Twibbit posted:

FSR without AA was in Splatoon 3 as well. I am not sure what is happening at Nintendo

Company bureaucracy basically. "This is the way it's always been done" is often a reason for those kinds of decisions.

orange juche
Mar 14, 2012



Kazinsal posted:

To me it's the fact that Nintendo has just been microwaving old hardware for the past two decades and going "yeah what about it" when challenged on it that miffs me because it forces them to make their games run in environments that are just utter crap in the name of cost and power efficiency.

The GameCube was mind-blowing when it came out. It ran a bunch of games that were generally really graphically great, and even though it was hamstrung by only having 480p output at best, it still was well built. Then the Wii came out and it was... an overclocked GameCube with motion controls as a gimmick to distract people from a lack of hardware improvement.

Then the Wii U showed up, with its enhanced motion controls in the form of a tablet with controller buttons on it... and it was hardware-wise another two Wii-grade processor cores and a single generation's worth of incremental improvement on an AMD graphics core that was already two generations out of date.

The Switch came out in 2017 with a CPU core from 2012 and a mobile GPU core from 2013 and now we're ten years technologically past that and Nintendo's content with the same hardware platform for the next two or three years. By the time the Switch gets replaced, used five year old cell phones will be able to play Tears of the Kingdom at a solid framerate and higher resolution than the Switch can right now.

This is from the same company that, in 1996, released a console that was effectively an SGI Onyx for $199.

e: I get that Nintendo's official target market is children. I'm not disputing that angle. Nintendo is a brand that's very well marketed towards youngins. But you can't put out two Zelda games in a row that are Dark Souls But In Hyrule and think that the people buying them are going to just be accepting towards the game running like poo poo because your hardware team doesn't give a drat about performance OR power efficiency.

The fact that the current new hotness, the Asus ROG Ally, can run TOTK at a higher framerate than the official hardware, with all of the overhead that emulation imposes, is pretty damning on the performance front for the Switch. Even a Switch 2 or Switch Pro or whatever is going to be hosed in terms of performance; current-gen processors are just too far ahead of what Nintendo wants to use for their poo poo.

orange juche
Mar 14, 2012



I will say one thing: Nintendo hits you coming and going. They're the only console maker whose console is not a loss leader; they make money off of every console sold, from the instant you buy it. Sony and Microsoft have to dig out of a hole on price, as their consoles probably cost somewhere between 33% and 50% more to make than they sell for, and they have to make it up in game and peripheral sales to customers before they profit.

orange juche
Mar 14, 2012



Starship Troopers without any of the spirit is just not going to be great imo

orange juche
Mar 14, 2012



Paul MaudDib posted:

https://twitter.com/young_particle/status/1631257198327132164?t=4EhHsb0Sg0hFXK89j-FMNA&s=19

full thing: https://www.reddit.com/r/PiratedGames/comments/11fnc66/empress_explains_how_denuvo_works_through_the_one/

https://twitter.com/DenuvoScum/status/1628635273042051072

iirc her telegram thing was some findom scam too, charging tons of money to be around EMPRESS?

she's like terry davis level SCHIZO and VITRIOLIC (ok i'll stop) but she legitimately is breaking DRM that's resisted everyone else. And iirc she does the findom thing instead and only does Denuvo cracking to prove how much better she actually is. She charges $500 and will only do one game at a time until it's finished, which takes her a couple of months. She's previously gotten mad at people for criticizing the fact that she wants money, which, lol, she's right, that's such a bizarrely nominal sum for the work. Mad as a hatter, through and through.

and the other guy (MKDEV) only cracks Football Manager games, and supposedly it's an older version of Denuvo, since football games are infinite refreshes on a single engine.

Empress is legit loving nuts, she's on Timecube level brain, but you have to be loving nuts to crack poo poo like Denuvo.

orange juche
Mar 14, 2012



12GB of VRAM is essentially dead at 1440p if developers keep on the path they are travelling, which they will do as the current-gen consoles continue to evolve. Diablo 4, for example, will happily consume 12GB of VRAM on my RTX 3080 without blinking, and then spill over into system RAM, causing my whole system to slow down. Seriously, the game barely uses 25% of my GPU's power when the graphics are maxed out, but it inhales VRAM like nobody's business, which absolutely shits on the performance with stuttering.

E: Diablo 4 specifically uses up to 16GB of VRAM in order to match the console experience. Taking textures down to High resolves the issue on 12GB cards, but the textures are quite noticeably blurrier in the environment.

https://www.youtube.com/watch?v=2Rl6sFoeOSU

orange juche fucked around with this message at 09:57 on Jun 27, 2023

orange juche
Mar 14, 2012



Dr. Video Games 0031 posted:

I think there was probably a mistake made in the settings menu if he was only getting 10 fps with RT Ultra enabled while using DLSS. The 4060 is capable of better than that at native 1080p or 1440p.

They're using Path Tracing/RT Overdrive, which a 4060 is woefully inadequate for. Hell, on my RTX 3080, Path Tracing takes me down to ~40 FPS at 3440x1440 with DLSS Balanced. If they were using RT Ultra they'd probably be in the 40s to high 50s with frame gen.

orange juche
Mar 14, 2012



what is the percentage performance impact of using a PCIe 4.0 x16 GPU in a 3.0 x16 slot? Assuming you have a pretty decent gaming CPU like a 5800X3D

orange juche
Mar 14, 2012



Dr. Video Games 0031 posted:

In TechPowerUp's testing, they found a 2% difference on average with the 4090 at 1440p and 4K: https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-pci-express-scaling/

Though some games may perform a bit worse while others are largely unaffected. Metro Exodus for instance takes a pretty big hit in their testing (13% worse at 4K), but it seems to be the biggest outlier. There may be some other outliers that went untested, but for the most part it doesn't seem to matter.

It's a bigger concern for cards that may already only have x8 or x4 interfaces. x4 cards like the 6500 XT can run into problems at PCIe 3.0.

Sweet, I figured it was only a small percentage drop. I'm running a 5800X3D on an X470 motherboard, and while I was prepared to buy an X570 board if needed to get a 4.0 x16 slot, I was hoping I wouldn't have to Ship-of-Theseus my computer any more than I already have. The next upgrade after the GPU is going to be rebuilding the entire system on Zen 5 or whatever comes out after the 7000 series Ryzens. I upgrade part of my PC every few years, CPU/mobo/RAM/case then GPU, but since I upgraded the motherboard last time they've gone from 3.0 x16 to 5.0 x16.

E: I may still clean-slate it, because I've pushed the RAM expandability to its limits and I've got a sub-par storage solution going on right now, so it might be time to just start over with a brand new build, carry over my current GPU, and upgrade it 6 months down the road when new GPUs are out/they're trying to kill off stock of current-gen GPUs before next gen releases.
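For a rough sense of why the hit is so small for x16 cards, the published per-lane throughput by generation (the comparison script is my own illustration):

code:
# Approximate usable PCIe throughput per lane, in GB/s.
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

for gen, rate in PER_LANE_GBPS.items():
    print(f"PCIe {gen} x16: ~{rate * 16:.1f} GB/s")
# A 4.0 card in a 3.0 x16 slot still gets ~15.8 GB/s, which games
# rarely saturate -- hence the ~2% average difference TechPowerUp measured.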

orange juche fucked around with this message at 06:02 on Oct 2, 2023

orange juche
Mar 14, 2012



https://www.youtube.com/watch?v=H1bj19C4ohA

Double the frames, double the blur, which means 4x the images in your eyes!

Definitely no one saw AMD's huge surprise because it's so loving blurry
