Anime Schoolgirl
Nov 28, 2002

Suspect A posted:

The rx 6600 recently dropped below $200 and has double the vram I currently have and is twice as powerful to boot. Am I really screwed if I get it or should I wait for the 6700/XT's price to drop for the vram? 1080p monitor @ 75hz
If you have no intention of upgrading your monitor, a 6600 is going to be perfectly fine. You're going to have to turn textures down 1-2 tiers, but your resolution doesn't need that level of texture detail to begin with.

I'm not sure the 6700XT is going to meaningfully drop in price from the $310-340 point until they're absolutely ready to put out the 7600XT/7700 no letters/7700XT, and there's not even a whiff of those SKUs being deployed even in workstation, so you're going to be waiting an unknown amount of time.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Anime Schoolgirl posted:

does GDDR6 have a provision for RAM with bank counts that are not a power of 2?

GDDR6X does. The chips just don't exist, there was never an actual implementation, just a JEDEC spec. The same was also true of GDDR5X actually.

GDDR6W from Samsung also exists (24gbit = 3 die stacks iirc) and I strongly suspect that's going to be the next move. It may have been delayed, GDDR6W would be the candidate for "we thought it would ship last year" scenarios imo especially given it's samsung - just like Samsung's Aquabolt HBM2 was a shitshow for AMD too. And it's really them or micron. It wouldn't surprise me to see NVIDIA buy up the whole production for a while, they're in a tighter bind than AMD and really need 1.5GB (clamshell) or 3GB modules badly to give them more capacity options.
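The capacity-options point is just channel arithmetic: each GDDR module sits on a 32-bit channel, and clamshell puts two modules on a channel to double capacity. A quick sketch (the example densities below are mine for illustration, not from the post):

```python
def vram_gb(bus_width_bits, module_gb, clamshell=False):
    # One 32-bit channel per module; clamshell mode puts two modules
    # on each channel at half width each, doubling total capacity.
    modules = bus_width_bits // 32
    return modules * module_gb * (2 if clamshell else 1)

print(vram_gb(384, 2))                  # 24   (a 4090-class 384-bit card)
print(vram_gb(384, 1.5))                # 18.0 (hypothetical 12Gbit modules)
print(vram_gb(384, 2, clamshell=True))  # 48   (3090-style clamshell)
```

This is why 1.5GB and 3GB modules matter so much: they're the only way to hit intermediate capacities without changing the bus width or going clamshell.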

I still don't know why they haven't done xxxtra-bigass modules with a larger package and the same bumpout and just jammed a bigger memory die in there. GDDR6X is already custom for them, if they said "we want a bigger die for 24gbit modules" micron would go "that'll cost a whole lot more" and NVIDIA would say "okay, when can you have them". You can probably do it without more than minimal timing changes I'd think, like go from 24/21 GT/s to 21/18 or something? I suppose in a way that lends credence to the idea of delays - maybe the timeline was just too short when samsung kicked the can, or when Micron missed a node shrink, or something.

The point earlier (before the marked point) about them and AMD having the potential to move the spot price around if they suddenly went clamshell-everything and causing a much larger BOM increase than current spot prices imply was interesting. NVIDIA pushing their stuff off to a custom memory IC (GDDR5X previously, too) is really an interesting supply-chain play. They retain backwards compatibility with non-X modules (they do pam4, pam3, and regular gddr6 signaling) and get a guaranteed, no-competition supply, and toss some new features into it, like PAM4/etc (GDDR7 is now adopting PAM3 iirc). At NVIDIA's scale in the GDDR commodity market you almost have to hedge somehow, and I bet they, Sony, and MS all have some options trading going on for GDDR spot prices too. But if you're gonna have to hedge 1/3 of the market volume why not just put an order into the factory and get it customized the way you want. It's an extremely monopsonistic market, there's probably like 5 companies that are 95% of the sales volume.

But I think one aspect people didn't appreciate about the 3070 Ti / 3060 Ti GDDR6X / etc is that it gave them a way to move dies with GDDR6X if that's what was available (during the depths of mining/covid). It was a supply chain flexibility thing. It didn't make them good products though, I just think GDDR6X was relatively necessary to keep feeding the beast. Even on 8nm, 12 PHYs (3090) is already pretty big and the routing around the package is pulled in very, very close to the package (see the hole locations). They actually probably can't go to a full 512b bus, I disagreed with this in the GDDR5/Pascal days but I agree with it now, 24gbps PAM4 probably does not work with longer trace length. And now you have GDDR6X being used across the whole stack to try and squeeze a little more actual bandwidth into low-end products without having to pay the area cost for 6+ PHYs. You have PCIe buses being cut down to x8 to trim more PHY area. Etc.

NVIDIA is fighting physical fanout size problems on their monolithic designs. They are paying for expensive on-die caches, using superfast custom memory, etc all to let them use fewer PHYs, because a bunch of giant PHYs don't scale well. AMD has a little more room to play because the MCDs actually blow up the size of the package quite a bit, which ironically makes the routing quite a bit easier for more channels too, and in the N33/7600 segment they are simply staying at 6nm monolithic due to PHY cost (and still doing PCIe x8 and 4 PHY). I also still think stacked cache is inevitable (including on NVIDIA/monolithic). Also, a hot take is that AMD probably has more flexibility on MCDs than people realize, because the Infinity Link (not fabric) PHYs are quite a bit smaller than an actual PHY. The opportunity cost of disabling one is much less. OK we lost 5% of that die area, so? You don't have to pay the 6nm cost for MCDs. You know how AMD had a bunch of the MCD cache capacity disabled even on 7900XTX? They can do 4x fully-enabled instead of 6x 2/3rds enabled or whatever, and still have the same infinity cache amount, just with a different amount of actual bandwidth behind it. Or make a MCD with two PHYs for double-channel/non-clamshell but same upstream bandwidth feeding a single cache. Etc. There's tons of interesting options there for disambiguating some of the memory stuff from the actual GCD, and footprint spreading is actually not a sin at all in desktop.

Paul MaudDib fucked around with this message at 02:30 on Jun 22, 2023

spaceblancmange
Apr 19, 2018

#essereFerrari

repiv posted:

not even consoles are immune to graphics inflation apparently

https://twitter.com/tomwarren/status/1671572067076931594

3 years into a console generation and with GPU prices still stupid you can buy a 5600 + 3060 Ti prebuilt for the same price in australia. the xbox one's price was halved after less than 3 years.

Incredulous Dylan
Oct 22, 2004

Fun Shoe
Somehow I am still on this hybrid-cooled 1080ti. Normally I’d get the top of each gen or every other gen but it just kept holding up. I was going to get a 4090 but keep reading stories about new games running terribly on the best current GPUs. Why won’t they make it easy for me to fool myself into giving them my money?!?!

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Incredulous Dylan posted:

Somehow I am still on this hybrid-cooled 1080ti. Normally I’d get the top of each gen or every other gen but it just kept holding up. I was going to get a 4090 but keep reading stories about new games running terribly on the best current GPUs. Why won’t they make it easy for me to fool myself into giving them my money?!?!

What have you been reading, the 4090 will crush any game out there

Cross-Section
Mar 18, 2009

A 7900 XTX runs better than the 4090 in Jedi Survivor actually

(though if you're running that paid DLSS3 mod the 4090 *does* more or less perform how it should)

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

change my name posted:

What have you been reading, the 4090 will crush any game out there

Big budget hit Gollum falls flat on its face even with the mighty 4090, I saw that 1% lows are around 30fps with RT on.

The game is awful and nobody should play it, though.

wargames
Mar 16, 2008

official yospos cat censor

Cross-Section posted:

A 7900 XTX runs better than the 4090 in Jedi Survivor actually

(though if you're running that paid DLSS3 mod the 4090 *does* more or less perform how it should)

you can score a 7900 XTX for 850 https://slickdeals.net/f/16732145-a...6aff71007204478


What are 4090s going for today?

Cross-Section
Mar 18, 2009

Twerk from Home posted:

Big budget hit Gollum falls flat on its face even with the mighty 4090, I saw that 1% lows are around 30fps with RT on.

The game is awful and nobody should play it, though.

Tbh it's less about the 4090 underperforming than these games absolutely sucking poo poo on anything but X3D CPUs combined with the latest AMD GPUs

Yudo
May 15, 2003

Twerk from Home posted:

Big budget hit Gollum falls flat on its face even with the mighty 4090, I saw that 1% lows are around 30fps with RT on.

The game is awful and nobody should play it, though.

Gollum is the new Crysis, except for all the wrong reasons.

Dr. Video Games 0031
Jul 17, 2004

wargames posted:

you can score a 7900 XTX for 850 https://slickdeals.net/f/16732145-a...6aff71007204478


What are 4090s going for today?

That's using a 10% off coupon code ($100 max) that can be applied to any GPU purchase though, and you have to use the predatory "Affirm" payment plan poo poo. With that same code, the cheapest 4090 is $1500. Without it, it's around $1580.

Dr. Video Games 0031 fucked around with this message at 03:16 on Jun 22, 2023

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
lovely AAA rehashes and live service burned me out super bad and now it's terrible buggy releases rushed out way early, by the time it's patched up the hype has moved on etc.

honestly yuzu TOTK/BOTW and RTX remasters are probably the most interesting content released in recent years. And Metro EE, and Witcher 3 RTX (buggy). The AAA shovelware poo poo is rotten and I just am not that interested anymore. By the time the bugfixes have trickled in, the hype for most stuff has faded and ehhhh.

the pandemic Actually-MSRP 3090 is a champ because you could actually get it occasionally, it's got enough VRAM to churn through this poo poo and it's a generally high performer, and has the bf16 support you want for AI stuff. AI people pretty much seem to be converging on "lots of 3090s" as the happy medium for DIY training these days.

3090 for $700 is also still kind of a nobrainer imo. would you buy a "4070 Ti 24GB" used at $700, even if it pulls 320W? (ok, it's a bit slower than a 4070 Ti). Like it's actually been a super good value card. $2000 pandemic 3090 was a good deal. So, I think, was the 2080 Ti at $950 (evga GOATed). 1080 Ti too. Just buy a high end card when you need to upgrade, and ride it until you run into feature problems, these days.

The pascal circlejerking is a bit absurd though, DLSS meant the 2070S (in particular) and 2080 Ti (goat) aged really really well and I think it should have been obvious from the area expenditure that NVIDIA was betting on it. And Turing being volta-descended means that it has gained quite a bit on Pascal in the DX12 era. It's got way better feature and resource binding and directstorage and all that poo poo. Just like moving on from DX11 pretty well screws maxwell... Pascal is not going to age well in the next phase of DX12 implementations, I think.

Paul MaudDib fucked around with this message at 03:22 on Jun 22, 2023

Dr. Video Games 0031
Jul 17, 2004

Cross-Section posted:

A 7900 XTX runs better than the 4090 in Jedi Survivor actually

(though if you're running that paid DLSS3 mod the 4090 *does* more or less perform how it should)

The 7900 XTX is faster at 1080p due to Nvidia's driver overhead issue. That's a resolution nobody with these cards will use, though. At 1440p and 4K, the 4090 is faster by the expected margin.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dr. Video Games 0031 posted:

The 7900 XTX is faster at 1080p due to Nvidia's driver overhead issue. That's a resolution nobody with these cards will use, though. At 1440p and 4K, the 4090 is faster by the expected margin.



NVIDIA has to fix the driver overhead at some point because the GPU side is rendering at 640p or whatever in 1080p performance mode. The CPU side of the loop has to run equivalently faster to keep DLSS fed.

That's partially what framegen is meant to help address but in general it's a fundamental problem with the approach of lowering render res and upscaling. The baseline expectation is that the CPU can run X% faster just fine.
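To put numbers on the render-res point, here's a sketch of the commonly cited per-axis DLSS scale factors (mode names and ratios are my assumption from public coverage, not something from this post) showing why the CPU loop has to speed up to match:

```python
# Commonly cited per-axis DLSS scale factors (assumed, approximate)
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    """Internal render resolution the GPU actually rasterizes at."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "performance"))  # (960, 540)
print(internal_res(3840, 2160, "quality"))      # (2560, 1440)
```

So a 1080p output in performance mode renders at roughly 540p internally, and the frame rate target scales accordingly, which is exactly the "CPU has to run equivalently faster" problem.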

And RT computation also increases CPU utilization a lot.

A 5800X3D is gonna be a champ, a 7800X3D will too. As a gamer I would never trade worst-case performance for a higher average, and it runs beautifully smooth on non-XMP JEDEC memory, is super low power, etc. X3D single-CCD is highly highly desirable, wait for Zen5 if you can but X3D are great chips now too.

13-series is like, fine, but 5800X3D is a super great contender given how utterly mature AM4 is at this point. And 5600X3D potentially is a bit short of general core horsepower. 5800X3D makes a lot of sense, and DDR4 is super duper cheap, and good SSDs are cheap, etc.

Paul MaudDib fucked around with this message at 04:03 on Jun 22, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION




$2000 3090s were not a “good deal”, regardless of Pandemic market challenges or not, at least with regard to gaming usage.

Regardless of someone’s disposable income, if a price needs to be quantified against a higher scalping price to call it a good deal, but both are well in excess of the MSRP, then neither is a “good deal”.

Nalin
Sep 29, 2007

Hair Elf
It's interesting how Yuzu's May progress report claims that the RTX 4060 Ti performs worse on emulation than the RTX 3060 Ti. They claim that using the 2x scaler (to render games at 1440p) basically saturates the 128-bit bus and tanks performance. And the GDDR6 RAM can't keep up with the bandwidth.

I wish somebody would actually benchmark that. I've only seen people re-posting the same news articles and not actually trying to verify it.
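The bus-width half of the claim is easy to sanity-check on paper even without a benchmark: peak bandwidth is just bus width times effective data rate. The clock speeds below are the commonly listed specs for each card, so treat them as approximate:

```python
def gddr_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: bits per transfer x transfers/s / 8."""
    return bus_width_bits * data_rate_gtps / 8

# RTX 3060 Ti: 256-bit bus, 14 GT/s GDDR6 (listed spec)
print(gddr_bandwidth_gbs(256, 14))  # 448.0
# RTX 4060 Ti: 128-bit bus, 18 GT/s GDDR6 (listed spec)
print(gddr_bandwidth_gbs(128, 18))  # 288.0
```

The 4060 Ti's much larger L2 is supposed to paper over that ~36% raw-bandwidth drop, but a 2x resolution scaler is close to a worst case for cache hit rates, which would fit Yuzu's claim.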

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I don't think anyone wants to jump on the grenade of showing off an emulated game for a benchmark lest Nintendo send Seal Team Six

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
There’s very little professional tech coverage of emulator performance in general, sadly.

Kazinsal
Dec 13, 2011

gradenko_2000 posted:

I don't think anyone wants to jump on the grenade of showing off an emulated game for a benchmark lest Nintendo send Seal Team Six

They got away with legally forcing a guy into indentured servitude for piracy, you best believe that my 1440p 120 FPS playthrough of TOTK was 100% produced by a completely stock Nintendo Switch

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Question about how DSR works. If these are my settings under NVCP



Then is the max resolution here actually the "4K" downscaled DSR picture, despite being listed as 1620p (1.5x 1080p, the resolution of the tv)?


e:also I just realized I never noticed the DSR smoothness setting

Rinkles fucked around with this message at 11:44 on Jun 22, 2023

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

change my name posted:

What have you been reading, the 4090 will crush any game out there

It won't crush any game that has optimization issues. That's a decent chunk of PC games in the last two years. It cannot be overstated just how bad things are on the software/game side of things. The hardware is more than fine. The games.... not so much

Plenty of discussion about it in this thread so I ain't gonna bang that particular drum too much

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Nalin posted:

It's interesting how Yuzu's May progress report claims that the RTX 4060 Ti performs worse on emulation than the RTX 3060 Ti. They claim that using the 2x scaler (to render games at 1440p) basically saturates the 128-bit bus and tanks performance. And the GDDR6 RAM can't keep up with the bandwidth.

I wish somebody would actually benchmark that. I've only seen people re-posting the same news articles and not actually trying to verify it.

Good thing that the Switch 2 is Ampere and not newer!

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Define a 4090 failing to crush a game. 60 fps at 4K or? Where's the line just curious

njsykora
Jan 23, 2012

Robots confuse squirrels.


I remember people being unhappy at Jedi Survivor specifically not hitting 4k60 at Ultra, and to be honest I get it. You should expect the card you spent 4-figures on to run literally anything at least that high.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Interesting, I've never felt that way. It's all relative, I might be getting 50 fps but that means someone else is getting like 12 fps, and it's not a race or some high level journey to make the most of your investment.

I mean that's the draw to me at the end of the day, not having cosmic power but instead just never having to worry about it. 50-60 FPS VRR is very playable, not bad for a worst case scenario :shrug:

The 90 might have been $1499 but just going out to dinner these days is like $80+ for 2 people so again, relative imo

ijyt
Apr 10, 2012

loving lol at the above post

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Taima posted:

Interesting, I've never felt that way. It's all relative, I might be getting 50 fps but that means someone else is getting like 12 fps, and it's not a race or some high level journey to make the most of your investment.

I mean that's the draw to me at the end of the day, not having cosmic power but instead just never having to worry about it. 50-60 FPS VRR is very playable, not bad for a worst case scenario :shrug:

The 90 might have been $1499 but just going out to dinner these days is like $80+ for 2 people so again, relative imo

Weird Pumpkin
Oct 7, 2007


lmao


As someone who spent way too much on a 3090 I think I'm just gonna sit on this until the next generation at the absolute minimum. My TV only does 60hz anyway, so as long as I get close to 4k60 at reasonable settings (and at the moment I don't really play any new games anyway) then I'm pretty much happy

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Taima posted:

The 90 might have been $1499 but just going out to dinner these days is like $80+ for 2 people so again, relative imo

I've seen this logic all over the place, but my own experience has been now that food at the grocery store or even eating out is extremely expensive, I actually seem to have less money to spend on toys and bullshit!

MarcusSA
Sep 23, 2007

Twerk from Home posted:

I've seen this logic all over the place, but my own experience has been now that food at the grocery store or even eating out is extremely expensive, I actually seem to have less money to spend on toys and bullshit!

Yeah this.

Shipon
Nov 7, 2005

Twerk from Home posted:

I've seen this logic all over the place, but my own experience has been now that food at the grocery store or even eating out is extremely expensive, I actually seem to have less money to spend on toys and bullshit!

yeah that's why i don't do it anymore. eating at a restaurant since covid is an awful experience now anyway, the QR code poo poo sucks so much

Yudo
May 15, 2003

Leaving the house ruins my immersion almost as much as dipping below 60fps.

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

PNY 3060 tis are back in stock at Best Buy for $275

kliras
Mar 27, 2021
life is just one elaborate joke at this point

https://twitter.com/VideoCardz/status/1672605679431868419

Dr. Video Games 0031
Jul 17, 2004

7900 XTX is down to $829.99 on Amazon after coupon is applied: https://www.amazon.com/PowerColor-Hellhound-Radeon-7900-Graphics/dp/B0BMWSRM7W/

No "affirm" bullshit required for this one. Their gigantic Red Devil model is $879.99 if you'd rather have that. $830 is a genuinely good price for the 7900 XTX, offering a pretty good price-to-performance improvement over the previous gen's $699 cards (which were barely even available for that much). I wonder if these prices will drop even more in the near future? The Hellhound was $849.99 when I looked last night, so it already dropped $20 since then.

edit: also comes with the RE4 remake still, which is an actually good bonus.

Dr. Video Games 0031 fucked around with this message at 18:11 on Jun 24, 2023

FuturePastNow
May 19, 2014


Microcenter also has a 7900XT for $699, in-store only.

Kibner
Oct 21, 2008

Acguy Supremacy
I've been really happy with my 7900xtx, fwiw.

Yudo
May 15, 2003

Unfortunately, Powercolor cards have a reputation for having bad coil whine. In my experience, that reputation is earned. It's Amazon, so you can always return it if the whine is too much (I can't stand it, but then I'm precious about high pitched noise). I think for $820 it's worth a roll of the dice. The xfx 7900xt I have has near inaudible coil whine; sapphire too seems to have a pretty good reputation on that front.

It's too bad about the hellhound, because otherwise it has extremely quiet fans.

kliras
Mar 27, 2021
this sounds like fun, but what does this mean for lumen?

https://twitter.com/momentsincg/status/1672680437280374790


Josh Lyman
May 24, 2009


FuturePastNow posted:

Microcenter also has a 7900XT for $699, in-store only.
The Hellhound 7900XT for $700 is really compelling. It’s 26% faster than the 4070 at 1440p so a 4070 would need to be <$557 to match, which I don’t think any current deals do. And price/performance is a fair bit better than the 7900XTX for $830.
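The arithmetic behind that figure (the 26% uplift at 1440p is the poster's number; I'm just showing the parity calculation):

```python
def parity_price(base_price, faster_pct):
    """Price a slower card must hit to match the price/performance of a
    card that is faster_pct percent faster and costs base_price."""
    return base_price / (1 + faster_pct / 100)

print(round(parity_price(700, 26), 2))  # 555.56
```

So a 4070 would need to land at roughly $556 or below to match a $700 7900 XT on price/performance, consistent with the "<$557" in the post.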
