buglord
Jul 31, 2010

Cheating at a raffle? I sentence you to 1 year in jail! No! Two years! Three! Four! Five years! Ah! Ah! Ah! Ah!

Buglord

dkj posted:

I have a 7900 XT and I’m really not happy with it. Maybe my expectations were unrealistic but nothing really seems to run that well.

What rez are you targeting?

acksplode
May 17, 2004



pyrotek posted:

I bet that is mostly because games up to now are still targeting release on last-gen hardware. Eventually some games are going to target higher fidelity rendering and push the new machines as hard as possible and target 30FPS.

Well that was Plague Tale 2, and like I said it caught a lot of flak for it. I think the horse is out of the barn after the last several years of 60fps since the last mid-gen refresh. FWIW PS5 has some actual current-gen exclusives and those all have at least a 60fps mode.

redeyes
Sep 14, 2002

by Fluffdaddy
Nvidia or bust for actual gaming. Meme or not. Truth.

repiv
Aug 13, 2009

it's the way it's meant to be played™️

dkj
Feb 18, 2009

buglord posted:

What rez are you targeting?

3440x1440

pyrotek posted:

What specifically are the issues? I'm interested in your impressions.

Heavily anecdotal stuff, but Far Cry 6 would occasionally run around 100fps, though usually more like 60-70, and it would also drop into the 30s and 40s. And it wasn't dropping when a lot was happening or in certain areas; it would just drop to 30-40 and stay there until I restarted the game.
Also had a lot of stuttering and occasional freezing.

Been really irritated with Windows constantly replacing the drivers and crashing the system, though that hasn't happened since the previous driver release.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

dkj posted:

3440x1440

Heavily anecdotal stuff, but Far Cry 6 would occasionally run around 100fps, though usually more like 60-70, and it would also drop into the 30s and 40s. And it wasn't dropping when a lot was happening or in certain areas; it would just drop to 30-40 and stay there until I restarted the game.
Also had a lot of stuttering and occasional freezing.

Been really irritated with Windows constantly replacing the drivers and crashing the system, though that hasn't happened since the previous driver release.

Not saying that's necessarily the whole story here, but FC6 is quite CPU heavy. FSR2 even at its lowest-quality setting barely gave me any extra performance compared to native (1440p, 3060 Ti, 11600K).

I also got the permanent fps drop sometimes, plus occasional seemingly random stutters with FC6.

New Zealand can eat me
Aug 29, 2008

:matters:


Prescription Combs posted:

The D4 demo over the last weekend was using close to 18GB of vram on my 7900xtx :lol:

I got to 22GB doing max 1080p @ 200%. It looked really good though!

repiv posted:

lumen is expensive and pretty much built on the assumption that you'll pair it with TSR or some third party upscaler to keep the performance in check

I was really impressed with how well it ran/looked in fortnite, especially after loving around with the UE5 Cityscape demo, which really does not like running with <64GB of ram, even for the cut down version (this is a development constraint). I should download and try that again with the 79 to see how much better it isn't.

Josh Lyman
May 24, 2009


pyrotek posted:

Diablo 4 was using 20+GB with high textures. Medium textures look exactly the same from gameplay distance from what I saw, but you can tell the difference at login and presumably during in-game cinematics. It could be a beta thing, or maybe not.

I think that's just VRAM allocation rather than necessity. During D4 beta, I was seeing around 11.2GB VRAM usage on my 4070 Ti at 1440p and everything maxed. I don't know if I was locked at 165fps but I didn't notice any performance issues (other than server-side lag and rubberbanding). So you might say, hey, if D4 is the only AAA game I know I'm going to play and it runs well, then just stick with the 4070 Ti. Of course, while I play almost no AAA games, it's reasonable that something might catch my eye in the next 4-5 years before the RTX 60 series, and it's reasonable that such a game might want more than 12GB of VRAM. That said, my understanding is there's usually little visual difference between ultra and high textures, so maybe that hypothetical situation is just the tradeoff I have to make for Nvidia's feature set (which is also a hypothetical use case).

All of this makes me wish I had tested my 1070 in my new system to see how it would run D4. Maybe I could've held out until an RTX 5070. :lol:
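For anyone who wants to sanity-check the allocation-vs-need question on their own card, here's a rough sketch that polls actual VRAM use while a game runs, using the nvidia-ml-py ("pynvml") bindings. This is an illustrative, assumption-laden example rather than anything from the thread: it assumes the nvidia-ml-py package is installed, and per-process numbers can come back as None on some Windows drivers, so treat it as best-effort.

code:
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)              # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)       # device-wide used/total
        print(f"VRAM used: {mem.used / 2**30:5.1f} / {mem.total / 2**30:.1f} GiB")
        for p in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
            used = p.usedGpuMemory                      # may be None depending on driver
            label = f"{used / 2**30:.1f} GiB" if used else "n/a"
            print(f"  pid {p.pid}: {label}")
        time.sleep(2)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()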

kliras
Mar 27, 2021

Josh Lyman posted:

I think that's just VRAM allocation rather than necessity. During D4 beta, I was seeing around 11.2GB VRAM usage on my 4070 Ti at 1440p and everything maxed. I don't know if I was locked at 165fps but I didn't notice any performance issues (other than server-side lag and rubberbanding). So you might say, hey, if D4 is the only AAA game I know I'm going to play and it runs well, then just stick with the 4070 Ti. Of course, while I play almost no AAA games, it's reasonable that something might catch my eye in the next 4-5 years before the RTX 60 series, and it's reasonable that such a game might want more than 12GB of VRAM. That said, my understanding is there's usually little visual difference between ultra and high textures, so maybe that hypothetical situation is just the tradeoff I have to make for Nvidia's feature set (which is also a hypothetical use case).

All of this makes me wish I had tested my 1070 in my new system to see how it would run D4. Maybe I could've held out until an RTX 5070. :lol:

i thought the textures looked fairly bland on med, but the game in general has a lot of weird blur and bokeh going on that probably didn't help; i think the default sharpness is also way too low, so upping it might address a lot of the issues with this

ran it on max in 1440p on my 1070 with fsr 2 quality mode with a g-sync compatible monitor

it's not like it's literally unplayable with medium textures or anything; game should look fine for you



should run alright, perhaps except for when there's a lot of multiplayer action on screen. cutscenes like this one ran at about 40-something i think

the one thing that's dumb about in-engine cutscenes like these is that you can't change your graphics settings specifically for them

kliras fucked around with this message at 11:20 on Mar 30, 2023

Josh Lyman
May 24, 2009


kliras posted:

i thought the textures looked fairly bland on med, but the game in general has a lot of weird blur and bokeh going on that probably didn't help; i think the default sharpness is also way too low, so upping it might address a lot of the issues with this

ran it on max in 1440p on my 1070 with fsr 2 quality mode with a g-sync compatible monitor

Now you have me contemplating flat out returning the 4070 Ti since I have to return a couple things anyway. (No, this is dumb, who knows if the 50 series will be any better and it could very well be worse)

kliras posted:

should run alright, perhaps except for when there's a lot of multiplayer action on screen. cutscenes like this one ran at about 40-something i think

the one thing that's dumb about in-engine cutscenes like these is that you can't change your graphics settings specifically for them

In D3 on my 1070 I would drop well below 100 fps in heavy 4 person gameplay, maybe in the 80ish range? Don't remember exactly, but that alone is probably a sign to just keep the 4070 Ti.

I was also getting 41 fps in D4 cutscenes so I think that’s limited by the game. I actually don’t mind in-engine cutscenes since it gives the devs more flexibility for storytelling and incorporating your character into the cutscenes. I would actually be okay with them targeting 24 fps for ~*~cinematic effect~*~.

edit: Did some testing of D3 solo rifts with GeForce Experience overlay. Average FPS was generally at 165fps but 1% lows routinely dipped below 100fps so yeah, carrying over my 1070 into the new build is dumb, as expected.

Josh Lyman fucked around with this message at 01:09 on Mar 31, 2023

repiv
Aug 13, 2009

New Zealand can eat me posted:

I was really impressed with how well it ran/looked in fortnite, especially after loving around with the UE5 Cityscape demo, which really does not like running with <64GB of ram, even for the cut down version (this is a development constraint). I should download and try that again with the 79 to see how much better it isn't.

the implementation in fortnite is impressive and holds up very well outdoors, but the cracks start to show in interiors with light "bubbling" or smearing

it's probably nothing that can't be fixed by increasing the sample counts, but that's going to cost performance

epic admitted as much when they said they were aiming for 60fps on consoles in open scenes but only 30fps if high quality interior lighting is required

repiv
Aug 13, 2009

what on earth is going on here

quote:

The Last of Us Part 1 on PC has been surprising. Have you ever seen a game where DLSS affected character animation? At #TLOU we even have this, the effect that makes the game worse!

Video 1: DLSS enabled in Quality mode
Video 2: DLSS disabled

https://twitter.com/peachbIues/status/1641186959123947520

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
One of the several reasons I am grabbing a 7800X3D (besides my 5800X getting bottlenecked at 4k apparently) is that I am a casual+ World of Warcraft enjoyer and frankly there are amazing gains to be had in that game on the X3D architecture.

A peculiarity of the Wow engine is that the main hub cities are so packed with players, npcs, and objects/terrain that it hobbles almost any system no matter how powerful. As a result, people have mostly agreed that no one should expect good performance in hub cities.

This appears to have changed with the X3D; I can't speak for the "older" X3D chips but people are actually getting good performance in hub cities with the 7 series X3D chips. It's kind of amazing; I expected we would never see the day. I was recently watching a video of someone playing Wow on a 7950X3D/4090 system, and they were getting 70-90 consistent FPS in the Valdrakken central plaza which is just kind of nuts.

The 1% lows are still kind of low, but it's astonishing that the cache is apparently making such a big difference.

My question is this: the X3D chips vary heavily in the uplift you get across various games. Sometimes it's modest, sometimes it's actually loving incredible, and rarely, such as in Wow, it's doing things that people kind of wrote off as a possibility.

What exactly is, for example, the 7950X3D chip doing here that provides this kind of strange, uneven uplift? Is it just the 3D cache? Is the 3D cache going to keep getting more important as consoles continue to scale up the intensity of their titles, or- like Wow- are the best case gaming applications of these chips found in clearing up hilariously unoptimized older titles? Is it entirely the 3D cache, or are there other new features in the 7 series X3D chips that are also helping?

Historically I've barely cared about CPUs, like most gamers. Just buy a nice one every 4-5 years or whatever; the GPU was always the bottleneck, so the processor was more of a possible liability than anything to get excited about. This appears to be changing, and I'm just curious why, what this means, and how it potentially interacts with the CPU-heavy current crop of AAA games that are starting to get ported over.

e: btw I'm also curious, there are insinuations online that the 7800X3D is going to potentially be better for gaming, objectively, than the 7950X3D; is that true? If so why? So if I got a 7950X3D it would perform worse in gaming?

I'm thinking of maybe just grabbing a 7950X3D for no real reason but I don't want it to be worse for gaming than the 7800...

Taima fucked around with this message at 14:12 on Mar 30, 2023

VorpalFish
Mar 22, 2007
reasonably awesometm

Taima posted:

One of the several reasons I am grabbing a 7800X3D (besides my 5800X getting bottlenecked at 4k apparently) is that I am a casual+ World of Warcraft enjoyer and frankly there are amazing gains to be had in that game on the X3D architecture.

A peculiarity of the Wow engine is that the main hub cities are so packed with players, npcs, and objects/terrain that it hobbles almost any system no matter how powerful. As a result, people have mostly agreed that no one should expect good performance in hub cities.

This appears to have changed with the X3D; I can't speak for the "older" X3D chips but people are actually getting good performance in hub cities with the 7 series X3D chips. It's kind of amazing; I expected we would never see the day. I was recently watching a video of someone playing Wow on a 7950X3D/4090 system, and they were getting 70-90 consistent FPS in the Valdrakken central plaza which is just kind of nuts.

The 1% lows are still kind of low, but it's astonishing that the cache is apparently making such a big difference.

My question is this: the X3D chips vary heavily in the uplift you get across various games. Sometimes it's modest, sometimes it's actually loving incredible, and rarely, such as in Wow, it's doing things that people kind of wrote off as a possibility.

What exactly is, for example, the 7950X3D chip doing here that provides this kind of strange, uneven uplift? Is it just the 3D cache? Is the 3D cache going to keep getting more important as consoles continue to scale up the intensity of their titles, or- like Wow- are the best case gaming applications of these chips found in clearing up hilariously unoptimized older titles? Is it entirely the 3D cache, or are there other new features in the 7 series X3D chips that are also helping?

Historically I've barely cared about CPUs, like most gamers. Just buy a nice one every 4-5 years or whatever; the GPU was always the bottleneck, so the processor was more of a possible liability than anything to get excited about. This appears to be changing, and I'm just curious why, what this means, and how it potentially interacts with the CPU-heavy current crop of AAA games that are starting to get ported over.

e: btw I'm also curious, there are insinuations online that the 7800X3D is going to potentially be better for gaming, objectively, than the 7950X3D; is that true? If so why? So if I got a 7950X3D it would perform worse in gaming?

I'm thinking of maybe just grabbing a 7950X3D for no real reason but I don't want it to be worse for gaming than the 7800...

The 7950X3D has the same potential cache/scheduling issues that all multi-CCD Ryzen chips do, but even more so. Only one CCD has the V-cache, so you're relying on the scheduler to get it right, and when that goes wrong there can be regressions.

The 7800x3d will be one CCD and so won't have that issue.

If your use case is gaming, get the 7800x3d, not the 2 CCD models. They make sense if you have a professional/prosumer workload that scales with core count but otherwise save some money and get the same or sometimes better performance in gaming.
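To make the scheduling point concrete, here's a hedged sketch of the manual workaround when a game ends up on the wrong CCD: pin it to the V-cache CCD's cores with psutil. The assumption that the V-cache CCD maps to logical CPUs 0-15 on a 7950X3D, and the Wow.exe process name, are illustrative; confirm your own chip's layout before using anything like this.

code:
import psutil

VCACHE_CPUS = list(range(16))   # assumed: V-cache CCD = logical CPUs 0-15 (verify on your chip)

def pin_to_vcache(process_name):
    """Set CPU affinity of every process matching process_name to the V-cache CCD."""
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == process_name.lower():
            proc.cpu_affinity(VCACHE_CPUS)
            print(f"pinned {proc.info['name']} (pid {proc.pid}) to CPUs {VCACHE_CPUS}")

pin_to_vcache("Wow.exe")   # hypothetical target; substitute the game's actual executable name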

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It's just the ability to have a larger working set of data without starting to take the massive latency hit of going to RAM. It's not typically going to see such big benefits in console games because those are designed around console hardware, where limited cache size + even more massive latency hit for going to GDDR system ram means huge gamestates are something to avoid. It mostly shows huge gains in games that have working data sets too big for the cache on other processors but small enough to fit into the 3d cache, which mostly means games that scale up to simulating a bunch of stuff, be that factory games, 4X games, etc.

The 7950X3D has the potential to be worse because it has two CCDs, and only one of them has the 3D cache (and slightly lowered clocks that come with it). It seems to be already mostly fine but there is potential for things to be mis-scheduled onto the wrong cores. If it's JUST for gaming, wait the week and buy a 7800X3D so it's not even a potential issue.
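A quick way to see the working-set effect described above is to time random reads over arrays of growing size; throughput usually falls off a cliff once the array outgrows the last-level cache. A minimal, machine-dependent sketch (numpy overhead blurs the absolute numbers, but the shape of the curve is the point):

code:
import time
import numpy as np

def random_read_rate(n_bytes, accesses=2_000_000):
    """Random-read throughput (reads/sec) over a working set of n_bytes."""
    data = np.ones(n_bytes // 8, dtype=np.int64)           # the working set
    idx = np.random.randint(0, data.size, size=accesses)   # cache-unfriendly access pattern
    data[idx[:1000]].sum()                                  # warm-up
    t0 = time.perf_counter()
    data[idx].sum()                                         # gather forces the random reads
    return accesses / (time.perf_counter() - t0)

for mib in (4, 16, 32, 64, 128, 512):                       # spans typical L3 / V-cache sizes
    rate = random_read_rate(mib * 2**20)
    print(f"{mib:4d} MiB working set: {rate / 1e6:6.1f} M random reads/s")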

MarcusSA
Sep 23, 2007

Taima posted:


e: btw I'm also curious, there are insinuations online that the 7800X3D is going to potentially be better for gaming, objectively, than the 7950X3D; is that true? If so why? So if I got a 7950X3D it would perform worse in gaming?


Someone else can probably explain it better but it has something to do with the number of cores the 7950x3d has and that some games aren’t taking advantage of the right cores with all the extra cache.

Honestly we are close enough to the 7800x3d that you should wait for reviews.

shrike82
Jun 11, 2005

https://twitter.com/VideoCardz/status/1641383019536691201?s=20

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.


This sucks but if it delivers at least 3080-level performance or higher I guess it won't be completely terrible since you can't really buy them new for MSRP anymore

Dr. Video Games 0031
Jul 17, 2004

That's about what I was expecting, maybe a little better. There was no chance that $750 rumor from MLID was true.

Qubee
May 31, 2013




I would like to buy a new GPU; I currently have a GTX 1080. It is no longer capable of running certain games at a good enough quality. What would people suggest I get? Budget isn't an issue, but I hate wasting money; I don't want to spend $200 extra for 2% extra juice. What card can I upgrade to that has the best bang for your buck? I'm also disappointed in Nvidia's antics lately, so I might try supporting AMD? But I'm so used to Nvidia cards.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Qubee posted:

I would like to buy a new GPU; I currently have a GTX 1080. It is no longer capable of running certain games at a good enough quality. What would people suggest I get? Budget isn't an issue, but I hate wasting money; I don't want to spend $200 extra for 2% extra juice. What card can I upgrade to that has the best bang for your buck? I'm also disappointed in Nvidia's antics lately, so I might try supporting AMD? But I'm so used to Nvidia cards.

Get a used 3080 for $550 or a 4070 ti for $800 if you absolutely want to buy this gen

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Nvidia is continuing with their awful pricing structure. The RTX 3070 had 56% of the cores of the 3090 for 33% of the 3090's MSRP. It was an ok-ish deal, though the 3060 Ti (46% of the cores, 27% of the MSRP) was generally better. The RTX 4070 has 36% of the 4090's cores (about the same ratio as the RTX 3060, the prior product two tiers down) for 38% of the MSRP (close to the same price ratio as the 3070 Ti, which pretty much everyone acknowledged as a poor value despite having 59% of the cores of a 3090).

If we had a reasonably priced non-halo product to compare to, it would probably look even worse. Keep in mind that the 3080 was 83% of a 3090's core count for 47% of the MSRP. That's what the kind of deal consumers should be demanding looks like.
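For reference, the arithmetic behind those percentages. The core counts and MSRPs below are the commonly cited launch figures, with the leaked $599 assumed for the 4070, so double-check them before leaning on the output:

code:
CARDS = {
    # name: (CUDA cores, launch MSRP in USD)
    "RTX 3090":    (10496, 1499),
    "RTX 3080":    (8704,   699),
    "RTX 3070 Ti": (6144,   599),
    "RTX 3070":    (5888,   499),
    "RTX 3060 Ti": (4864,   399),
    "RTX 4090":    (16384, 1599),
    "RTX 4070":    (5888,   599),   # assumed leaked price
}

def share_of(card, halo):
    """Return (core ratio, price ratio) of a card relative to the halo product."""
    cores, price = CARDS[card]
    halo_cores, halo_price = CARDS[halo]
    return cores / halo_cores, price / halo_price

for card, halo in [("RTX 3070", "RTX 3090"), ("RTX 3060 Ti", "RTX 3090"),
                   ("RTX 3080", "RTX 3090"), ("RTX 3070 Ti", "RTX 3090"),
                   ("RTX 4070", "RTX 4090")]:
    c, p = share_of(card, halo)
    print(f"{card:11s} vs {halo}: {c:4.0%} of the cores for {p:4.0%} of the MSRP")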

Qubee posted:

I would like to buy a new GPU; I currently have a GTX 1080. It is no longer capable of running certain games at a good enough quality. What would people suggest I get? Budget isn't an issue, but I hate wasting money; I don't want to spend $200 extra for 2% extra juice. What card can I upgrade to that has the best bang for your buck? I'm also disappointed in Nvidia's antics lately, so I might try supporting AMD? But I'm so used to Nvidia cards.

The 4090 is the only new GPU on the market right now that is well priced for its performance. You're about six months late to buy used GPUs at good prices.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Let's face it though, the 4070 is an x060-class card with a new name, and a decent price bump over last gen. Even at $600 it's not the best deal.

We need to wait for benchmarks, but it's not exactly inspiring

Qubee posted:

I would like to buy a new GPU; I currently have a GTX 1080. It is no longer capable of running certain games at a good enough quality. What would people suggest I get? Budget isn't an issue, but I hate wasting money; I don't want to spend $200 extra for 2% extra juice. What card can I upgrade to that has the best bang for your buck? I'm also disappointed in Nvidia's antics lately, so I might try supporting AMD? But I'm so used to Nvidia cards.

I have a hard time recommending anything other than the Radeon 6800, simply because it rams the sweet spot in my eyes for price, performance, and importantly, low power usage; but RT is barely usable, if that matters. They can be had for $450-500, and (ignoring RT workloads) they perform between a 3070 Ti and 3080, but draw power like a card between the 3070 and 3070 Ti, and have way more VRAM than all of those cards.

If the 4070 at $600 performs similar but just with better RT perf, I have no idea what the point is. We'll see.

HalloKitty fucked around with this message at 15:50 on Mar 30, 2023

Dr. Video Games 0031
Jul 17, 2004

as usual, i disagree with solely comparing the middle of the product stack to the top of the product stack. comparisons to the previous generation are just as important. but of course, those comparisons aren't great either with the 40 series, so gently caress it i guess. i'll just agree with the poster who said that if the 4070 manages to match the 3080 but for $600, then that's at least buyable.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I agree that comparing to halo products is far from ideal, but I'm not Nvidia who has chosen to make the halo product the only remotely reasonable value this generation. If there was any other short, easily understood comparison, I'd use that instead.

It's clear from all the information we can dig up though that the comparisons I'm making are in fact reasonable and Nvidia's pricing is bad in a way that is unprecedented in the GPU market.

Branch Nvidian
Nov 29, 2012



Qubee posted:

I would like to buy a new GPU; I currently have a GTX 1080. It is no longer capable of running certain games at a good enough quality. What would people suggest I get? Budget isn't an issue, but I hate wasting money; I don't want to spend $200 extra for 2% extra juice. What card can I upgrade to that has the best bang for your buck? I'm also disappointed in Nvidia's antics lately, so I might try supporting AMD? But I'm so used to Nvidia cards.

This isn’t SAMart, but since I just replaced my 3070 Ti, I guess it’s for sale if a goon wants to buy it second hand. Still has 2 years of warranty left, but it’s EVGA and who knows how they’re going to be able to handle GPU warranties now that they’re out of the game. Going rate on eBay seems to be $450, so :shrug:

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

dkj posted:

3440x1440

Heavily anecdotal stuff, but Far Cry 6 would occasionally run around 100fps, though usually more like 60-70, and it would also drop into the 30s and 40s. And it wasn't dropping when a lot was happening or in certain areas; it would just drop to 30-40 and stay there until I restarted the game.
Also had a lot of stuttering and occasional freezing.

Been really irritated with Windows constantly replacing the drivers and crashing the system, though that hasn't happened since the previous driver release.

With your first example, I wonder if that's the VRR setting in Windows that was mentioned earlier causing weird behavior. If it's already off, all I can think of is drivers; it may be worth doing a rip/reinstall with DDU to clean up.

For the Windows driver updates, you can turn that off (if you haven't already) in both Win10/11 - it's under System > About > Advanced system settings > Hardware tab > Device Installation Settings. Toggle it to No & save, and then Windows won't download & auto-install drivers for things like video, chipset, etc. through Windows Update. It used to require some registry hacking back with Win7, but I guess enough people complained that MS included it in the OS now.

BOOTY-ADE fucked around with this message at 17:35 on Mar 30, 2023
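If you'd rather script that than click through Settings, here's a hedged sketch of the registry route. SearchOrderConfig is the value I believe backs the Device Installation Settings toggle, and ExcludeWUDriversInQualityUpdate is the Windows Update policy for skipping driver packages; verify both against current Microsoft documentation before relying on them, and run it from an elevated Python on Windows.

code:
import winreg

def set_dword(root, path, name, value):
    """Create/open a registry key and write a DWORD value."""
    with winreg.CreateKeyEx(root, path, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

# 0 = "No, let me choose" for automatic driver search (believed to back the GUI toggle)
set_dword(winreg.HKEY_LOCAL_MACHINE,
          r"SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching",
          "SearchOrderConfig", 0)

# 1 = exclude driver packages from Windows Update quality updates (group policy value)
set_dword(winreg.HKEY_LOCAL_MACHINE,
          r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate",
          "ExcludeWUDriversInQualityUpdate", 1)

print("Driver auto-install disabled; a reboot is the safest way to apply it.")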

mobby_6kl
Aug 9, 2009

by Fluffdaddy

change my name posted:

This sucks but if it delivers at least 3080-level performance or higher I guess it won't be completely terrible since you can't really buy them new for MSRP anymore

Looking forward to $800 with tax and AIB markup :thumbsup:


e: interesting that they did end up going $200 down from the Ti, as I thought they might do here

mobby_6kl posted:

The Ada cards are fine, the question is how they're going to make the pricing make sense in the current stack. If the performance works out roughly like this



On one hand, I don't see them charging as much for the 4070 as the 3080 cost... on the other hand, dropping $200 from the Ti is a big gap and would disincentivize people from the higher margin card.

mobby_6kl fucked around with this message at 17:47 on Mar 30, 2023

pyrotek
May 21, 2004




Called it. It was pretty obvious assuming the leaked specs were right (they were). It will barely perform better than the 3070 Ti it is effectively replacing. Nvidia is basically trying to give this generation only slightly better price/performance than the previous generation, except at the top of the price range. You'll get performance between a 3070 Ti and 3080, but probably closer to the 3070 Ti, for the same price; at least you'll get an extra 4GB of VRAM on this tier compared to last generation.

HalloKitty posted:

Let's face it though, the 4070 is a x060 class card with a new name, and a decent price bump over last gen. Even at $600 it's not the best deal.

We need to wait for benchmarks, but it's not exactly inspiring

You barely even need to wait. It has the exact same memory setup as the 4070 Ti with 77% of the CUDA cores and slightly slower boost clocks. It should be like 75-80% of a 4070 Ti at UHD and a bit better than that at lower resolutions. You can't compare specs between architectures, but you can within the same architecture.
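Roughly the napkin math being done there, scaling by cores times boost clock within the same architecture. The boost clock figures here are the commonly listed reference numbers and should be treated as approximate:

code:
# RTX 4070 Ti vs RTX 4070: announced core counts, approximate reference boost clocks
ti_cores, ti_boost_mhz = 7680, 2610     # 4070 Ti (approx. reference boost)
p70_cores, p70_boost_mhz = 5888, 2475   # 4070 (approx. reference boost)

core_ratio = p70_cores / ti_cores
naive_ratio = (p70_cores * p70_boost_mhz) / (ti_cores * ti_boost_mhz)

print(f"core count ratio:       {core_ratio:.0%}")   # ~77%, as quoted
print(f"cores x clock estimate: {naive_ratio:.0%}")  # ~73%, before bandwidth/cache effects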

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

K8.0 posted:

I agree that comparing to halo products is far from ideal, but I'm not Nvidia who has chosen to make the halo product the only remotely reasonable value this generation. If there was any other short, easily understood comparison, I'd use that instead.

It's clear from all the information we can dig up though that the comparisons I'm making are in fact reasonable and Nvidia's pricing is bad in a way that is unprecedented in the GPU market.

Part of the problem is the 3090 was kind of lovely as a halo and the 4090 is really quite good for a halo, so the 3000 downstack is going to look better since the bar is lower.

However, the 4000 series is kind of hosed from a value perspective no matter what, so it's hard to say that the results are misleading.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Lockback posted:

Part of the problem is the 3090 was kind of lovely as a halo and the 4090 is really quite good for a halo, so the 3000 downstack is going to look better since the bar is lower.

However, the 4000 series is kind of hosed from a value perspective no matter what, so it's hard to say that the results are misleading.

I included the actual core counts of the products so you can compare them that way. The point is that the 4090 is still a halo product, and even if it has less of a halo product margin than previously, Nvidia is still making good money on it so the cards below it must be bad values sold at markups previously reserved for those halo-tier products. Also, if you go look through Nvidia's historical catalog, you can see that my comparison is in no way misleading.

The 3090 was actually fairly well positioned for a top-of-the-stack product in an Nvidia lineup. It's more that the one-step-down product was atypically good: usually x80s have been poor buys and the x70 was the value king, but the 3080 was a very good value if you could get it near launch for MSRP, and a historically great one if you compare it to other x80 and similar products.

dkj
Feb 18, 2009

BOOTY-ADE posted:

With your first example, wonder if that's the VRR setting in Windows that was mentioned earlier causing weird behavior. If it's already off all I can think is drivers, may be worth doing a rip/reinstall with DDU to clean up.

For the Windows driver update you can turn that off (if you haven't already) in both Win10/11 - it's under System > About > Advanced Settings > Hardware tab > Device Install Settings. Toggle it to No & save, then Windows won't download & auto-install drivers for things like video, chipset, etc. in Windows Update. Used to require some registry hacking back with Win7 but I guess enough people complained that MS included it in the OS now.

I'll look into the VRR issue. I haven't had the driver issue since the release before last, but I believe I did turn it off and still had it happen after. I don't remember exactly; I tried a bunch of stuff, including registry edits, after searching the issue.

sauer kraut
Oct 2, 2004

Branch Nvidian posted:

So the anti-sag bracket included with the Sapphire Pulse 7900 XTX neither mounts as displayed in the installation guide, nor does it even fit in my PC case. It also had a big rear end warning that if the anti-sag bracket isn’t used and GPU sag results in the card becoming defective that the warranty is void. Lmao

That's not a laughing matter, sagging damage is very easy to determine.
Cracked PCB right above the right side of the PCIE connector and the hook latch thing; and vram module/gpu core solder ball joints ripped clean off.
They could easily make good on that threat; you need to finagle some kind of beam to hold the massive cooler up.

Companies have been lenient on the issue in the past to not spook customers, but with 250-400W quad-whopper coolers becoming the norm and Nvidia milking them for every penny, that could change.

sauer kraut fucked around with this message at 21:22 on Mar 30, 2023

Quaint Quail Quilt
Jun 19, 2006


Ask me about that time I told people mixing bleach and vinegar is okay
My 3080 founders edition I got 111 days after release has been a good purchase, but going to the new Alienware HDR OLED ultra wide is straining it a bit in some games and it seems like the smallish amount of VRAM may become an issue soon.

New games are super VRAM heavy and it helps a ton in VR and VRchat as well.

I wonder if the Diablo 4 beta would have stuttered if it had more than 10GB of VRAM (on high texture settings). Dropping it to medium made it run smooth, until I forgot I was Discord streaming it once.

It's a different time than the past when you could laugh at AMD throwing in VRAM just to make at least one of their numbers bigger for pretty much no reason.

Branch Nvidian
Nov 29, 2012



sauer kraut posted:

That's not a laughing matter, sagging damage is very easy to determine.
Cracked PCB right above the right side of the PCIE connector and the hook latch thing; and vram module/gpu core solder ball joints ripped clean off.
They could easily make good on that threat; you need to finagle some kind of beam to hold the massive cooler up.

Companies have been lenient on the issue in the past to not spook customers, but with 250-400W quad-whopper coolers becoming the norm and Nvidia milking them for every penny, that could change.

Maybe the card makers need to start better reinforcing increasingly heavy cards instead of just throwing some poo poo in the box and putting it on the customer to figure out. The “solution” they provided, a cheap L-shaped piece of metal with a tiny padded foot, literally doesn’t work in my PC case. I’ve got the card supported by some foam, but this really should be on the manufacturers since they can add structural rigidity during the design phase.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
They can't hang limitless weight off a bracket at one end and a PCIE slot. GPUs either need to get smaller (not happening) or supports are going to become more and more standard.

Cygni
Nov 12, 2005

raring to post

replace atx entirely, it stinks (it won't happen, i know)

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
Ditch towers, bring back horizontal, under-monitor form factors.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Lockback posted:

Ditch towers, bring back horizontal, under-monitor form factors.

unironically this. I'm basically headed that way with a 4U rackmount unit anyways.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Everyone should just build sandwich-style ITX so your GPU stands on its side
