Kibner
Oct 21, 2008

Acguy Supremacy

wolrah posted:

That seems odd. I played through the modern series up to FC5 on a 4790K with a GeForce 970 so you'd think a CPU two generations newer would be just fine. IIRC New Dawn is basically FC5 Part 2 and uses the same underlying tech so it should be similar. By the time I got around to playing it I had upgraded to a Ryzen 3900X and GeForce 3070 though so I can't be sure.

As for adapters, all HDMI to VGA adapters are active, but HDMI prior to the absolute latest revision is still more or less DVI, which itself is more or less a digitized version of VGA that still functions as if there were a CRT beam so those active adapters are very simple.

The Far Cry games are extremely single-thread bound, iirc. And there wasn't much improvement in single-thread speeds for a long time.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

wolrah posted:

That seems odd. I played through the modern series up to FC5 on a 4790K with a GeForce 970 so you'd think a CPU two generations newer would be just fine. IIRC New Dawn is basically FC5 Part 2 and uses the same underlying tech so it should be similar. By the time I got around to playing it I had upgraded to a Ryzen 3900X and GeForce 3070 though so I can't be sure.

I mean look at the clock rates




wolrah posted:

As for adapters, all HDMI to VGA adapters are active, but HDMI prior to the absolute latest revision is still more or less DVI, which itself is more or less a digitized version of VGA that still functions as if there were a CRT beam so those active adapters are very simple.

thanks, I'll probably pick one up in the near future

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

wolrah posted:

%USERPROFILE%\Documents\My Games\Far Cry New Dawn\gamerprofile.xml

Look for VideoModeProfileFullscreen.

Way too many games have these stupid hardcoded lists of resolutions they're OK with instead of just asking the system for what resolutions are available, but most don't actually enforce that list outside of their selector.

btw, it's possible i'm doing something wrong, but changing these to a lower resolution isn't working for me.

code:
<VideoModeProfileFullscreen adapterIndex="0" displayIndex="0" windowMode="0" width="1280" height="960" hdr="0" refreshRateNumerator="75025" refreshRateDenominator="1000" />
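
If you'd rather script the change than hand-edit the XML, here's a minimal Python sketch. It assumes the element and attribute names quoted above, uses 1280x960 purely as an example target, and rewriting the file can change its formatting, so back it up first.

code:
# Minimal sketch: set the fullscreen resolution in Far Cry New Dawn's
# gamerprofile.xml. Element/attribute names are taken from the post above;
# the 1280x960 target is just an example. Back the file up before running.
import os
import xml.etree.ElementTree as ET

profile = os.path.expandvars(
    r"%USERPROFILE%\Documents\My Games\Far Cry New Dawn\gamerprofile.xml")

tree = ET.parse(profile)
for node in tree.getroot().iter("VideoModeProfileFullscreen"):
    node.set("width", "1280")
    node.set("height", "960")

tree.write(profile, encoding="utf-8", xml_declaration=True)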

wolrah
May 8, 2006
what?

Kibner posted:

The Far Cry games are extremely single-thread bound, iirc. And there wasn't much improvement in single-thread speeds for a long time.
Yeah I guess that is true, during that time that AMD was loving around with the construction cores I do remember feeling pretty smug with my Haswell still holding its own years down the line.

edit:

Rinkles posted:

btw, it's possible i'm doing something wrong, but changing these to a lower resolution isn't working for me.

code:
<VideoModeProfileFullscreen adapterIndex="0" displayIndex="0" windowMode="0" width="1280" height="960" hdr="0" refreshRateNumerator="75025" refreshRateDenominator="1000" />
Not really sure on that, I set the same on mine and it seemed to work as expected, but I can also just set 1280x960 from the menus if I select 4:3 aspect ratio at the bottom. There's even a "Custom" aspect ratio option for which the default resolution is 1600x1024 so it seems like it's happy to do whatever you want.

wolrah fucked around with this message at 16:58 on Jul 10, 2023

kliras
Mar 27, 2021
quick video on power draw on the 7900 xtx vs 4080

https://www.youtube.com/watch?v=HznATcpWldo

Dr. Video Games 0031
Jul 17, 2004

I wish he would've used AIB vs AIB or reference vs reference to drive the point home better. A lot of replies are trying to use this as a gotcha, but there are also plenty saying their XTX behaves the same way, so I think it's probably a universal problem with RDNA3. The main issue is that it just doesn't scale down in power consumption very well in less demanding workloads while the 4080 does (as do all other 40-series cards). The 200W difference in Overwatch 2 is just silly.

kliras
Mar 27, 2021
i also noticed a 13% discount over here in my humble european country on the "fe" (limited edition) fwiw

https://twitter.com/VideoCardz/status/1678519788795043847

njsykora
Jan 23, 2012

Robots confuse squirrels.


Yep Arc prices are falling and they're gonna get lower. The A750 LE is only £225 in the UK now. Approaching the point where I might consider getting one to gently caress about with.

Former Human
Oct 15, 2001

Paul MaudDib posted:

NVIDIA being willing to price 4070 at an actually-reasonable level

you should actually buy a 4070 (and a lot of people probably should)

You have an interesting idea of "reasonable"

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Former Human posted:

You have an interesting idea of "reasonable"

It’s reasonable enough that it puts Navi 32 in deep trouble as far as any sort of cost competitiveness. Just like the last 2 product generations - nvidia has generally settled on some decent prices, and AMD has to settle for some token undercuts and shrug while they bleed market share. Which is exactly what’s going to happen again with Navi 32.

If there was some giant market gap being left, they’d take it. Not only is there not a gap like that, but AMD actually has been struggling even to keep up with nvidia pricing. AMD has been paying for tsmc wafer costs for a while now, and they have had an architectural disadvantage since Maxwell that leaves them using more transistors for a given performance level. Which again has only deepened with RDNA3 and the MCM design - they’re using 150mm2 of N6 and 200mm2 of N5P to compete with a chip that’s effectively 223mm2 of N5P (AD104 with 75% enabled).
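
For reference, the "effectively 223mm2 of N5P" figure is just the full AD104 die scaled by the enabled fraction; taking roughly 295mm2 for full AD104 (an assumption based on public die-size listings) and the chiplet areas from this post, the back-of-the-envelope looks like:

code:
# Back-of-the-envelope check of the die-area comparison above. Assumption:
# full AD104 is ~295 mm^2 (public die-size listings); the 75% enabled figure
# and the Navi 32 chiplet areas come straight from the post.
ad104_full_mm2 = 295.0
effective_ad104 = 0.75 * ad104_full_mm2   # ~221 mm^2, in line with the "effectively 223 mm^2" above

navi32_n5p = 200.0                        # graphics die on N5-class silicon
navi32_n6 = 150.0                         # memory/cache dies on N6
navi32_total = navi32_n5p + navi32_n6     # 350 mm^2 across two nodes

print(f"Effective AD104 area: {effective_ad104:.0f} mm^2")
print(f"Navi 32 total silicon: {navi32_total:.0f} mm^2 "
      f"({navi32_total / effective_ad104:.2f}x the area)")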

Again, nvidia is literally so reasonable that yet again AMD is in a position of using an expensive, underperforming product to keep up with a dirt-cheap nvidia midrange product. Just like 6800XT, just like 5700XT, just like Vega. They have absolutely firmly earned their “15% and falling” marketshare.

This time I’m frankly not even sure AMD can afford to just do the token “fifty buck undercut”, because DLSS alone is still worth more than fifty bucks on a $600 graphics card. Let alone efficiency, drivers, VR, CUDA, productivity apps…

If AMD is knocking $100+ off the price of the 4070 then they’re making GBS threads up their margins massively. They’d be using almost twice the silicon and advanced packaging steps just to compete with a plain old monolithic midrange product. It’s not collusion, they just suck at designing hardware and literally can’t afford to produce the product they’ve built at the price its performance can justify. Again, just like Vega, etc.

Nvidia pricing is too reasonable for AMD to even compete. There’s no other way to put it. Nvidia pricing is where it’s supposed to be and AMD can’t keep up, let alone “punish them for overpricing”. Because it’s not, and AMD is subject to the same price pressures themselves. This is just what it costs now. But people love to fantasize about collusion and stuff instead of the boring reality of TSMC price increases and BOM cost rising.

And remember AMD have to get a lower-priced cutdown out of that 350mm2 package too! It’ll have to compete with the likes of the 4060 ti, which nvidia absolutely can drop further and still make huge margins. 4060 ti and 4080 are both hideously overpriced in a way the 4070 and 4060 really aren’t.

Paul MaudDib fucked around with this message at 06:34 on Jul 11, 2023

Dr. Video Games 0031
Jul 17, 2004

kliras posted:

i also noticed a 13% discount over here in my humble european country on the "fe" (limited edition) fwiw

https://twitter.com/VideoCardz/status/1678519788795043847

This is another predatory payment plan discount by the way. Every order on Newegg for $300 or more is eligible for the same 12% discount, so it wasn't a special A770 deal or anything.

Dr. Video Games 0031 fucked around with this message at 06:55 on Jul 11, 2023

ijyt
Apr 10, 2012

Paul MaudDib posted:

It’s reasonable enough that it puts Navi 32 in deep trouble as far as any sort of cost competitiveness. Just like the last 2 product generations - nvidia has generally settled on some decent prices, and AMD has to settle for some token undercuts and shrug while they bleed market share. Which is exactly what’s going to happen again with Navi 32.

If there was some giant market gap being left, they’d take it. Not only is there not a gap like that, but AMD actually has been struggling even to keep up with nvidia pricing. AMD has been paying for tsmc wafer costs for a while now, and they have had an architectural disadvantage since Maxwell that leaves them using more transistors for a given performance level. Which again has only deepened with RDNA3 and the MCM design - they’re using 150mm2 of N6 and 200mm2 of N5P to compete with a chip that’s effectively 223mm2 of N5P (AD104 with 75% enabled).

Again, nvidia is literally so reasonable that yet again AMD is in a position of using an expensive, underperforming product to keep up with a dirt-cheap nvidia midrange product. Just like 6800XT, just like 5700XT, just like Vega. They have absolutely firmly earned their “15% and falling” marketshare.

This time I’m frankly not even sure AMD can afford to just do the token “fifty buck undercut”, because DLSS alone is still worth more than fifty bucks on a $600 graphics card. Let alone efficiency, drivers, VR, CUDA, productivity apps…

If AMD is knocking $100+ off the price of the 4070 then they’re making GBS threads up their margins massively. They’d be using almost twice the silicon and advanced packaging steps just to compete with a plain old monolithic midrange product. It’s not collusion, they just suck at designing hardware and literally can’t afford to produce the product they’ve built at the price its performance can justify. Again, just like Vega, etc.

Nvidia pricing is too reasonable for AMD to even compete. There’s no other way to put it. Nvidia pricing is where it’s supposed to be and AMD can’t keep up, let alone “punish them for overpricing”. Because it’s not, and AMD is subject to the same price pressures themselves. This is just what it costs now. But people love to fantasize about collusion and stuff instead of the boring reality of TSMC price increases and BOM cost rising.

And remember AMD have to get a lower-priced cutdown out of that 350mm2 package too! It’ll have to compete with the likes of the 4060 ti, which nvidia absolutely can drop further and still make huge margins. 4060 ti and 4080 are both hideously overpriced in a way the 4070 and 4060 really aren’t.

you write too many words

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Former Human posted:

You have an interesting idea of "reasonable"

Do not engage the possible userbenchmark admin

Geemer
Nov 4, 2010



I wish I'd lived in this fantasy world where, just because the competition can't keep up ripping off consumers as hard, the price is suddenly fair and reasonable. I bet they have flying cars and wages that keep up with inflation too.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
new goal: buying a GPU specifically to make people mad about it

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

gradenko_2000 posted:

new goal: buying a GPU specifically to make people mad about it

There are 4080s for $1,050 for Prime Day if that's your goal

kliras
Mar 27, 2021

gradenko_2000 posted:

new goal: buying a GPU specifically to make people mad about it
average linus video

change my name
Aug 27, 2007

Legends die but anime is forever.

RIP The Lost Otakus.

Circling in on a 6600/6600 XT for a friend's build, is it worth spending the extra $10 on a triple-fan card since it's a pretty low power GPU?

njsykora
Jan 23, 2012

Robots confuse squirrels.


I wouldn't, no; the 6700 XT is the point where I'd start to consider triple fan. Though if it's only $10 more, it might be worthwhile if lower potential noise matters to you.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Geemer posted:

I wish I'd lived in this fantasy world where, just because the competition can't keep up ripping off consumers as hard, the price is suddenly fair and reasonable. I bet they have flying cars and wages that keep up with inflation too.

I, too, yearn for the fantasy world where getting a corvette for $10k is fair and reasonable simply because that's the amount the average consumer is willing to pay for a car. Look buddy this guy on Craigslist is willing to sell me a 1992 Camry for $500, that's good enough for what I need to do, where do you get off thinking a corvette should be selling for $75k!?

hobbies cost money and i'm not sure why that's such a shocker to people. GTX 1070 launched at a (real) price of $450 like 7 years ago, 4070 is not really that much more expensive considering the spiraling cost of node shrinks and everything else that has transpired since.

if you don't find the value in gaming that's fine, a lot of people are low-key just upset that they don't have the time to spend 4 hours every night playing games anymore and can't justify staying on the hardware treadmill anymore (even though it's moving far slower than ever before). People want something that's so self-evidently good value that they can upgrade without feeling bad about it given how little they play, and that's just not how the dGPU market works in the post-moore's law era.

Paul MaudDib fucked around with this message at 14:50 on Jul 11, 2023

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Paul MaudDib posted:

I, too, yearn for the fantasy world where getting a corvette for $10k is fair and reasonable simply because that's the amount the average consumer is willing to pay for a car. Look buddy this guy on Craigslist is willing to sell me a 1992 Camry for $500, that's good enough for what I need to do, where do you get off thinking a corvette should be selling for $75k!?

hobbies cost money and i'm not sure why that's such a shocker to people. GTX 1070 launched at a (real) price of $450 like 7 years ago, 4070 is not really that much more expensive considering the spiraling cost of node shrinks and everything else that has transpired since.

if you don't find the value in gaming that's fine, a lot of people are low-key just upset that they don't have the time to spend 4 hours every night playing games anymore and can't justify staying on the hardware treadmill anymore (even though it's moving far slower than ever before). People want something that's so self-evidently good value that they can upgrade without feeling bad about it given how little they play, and that's just not how the dGPU market works in the post-moore's law era.

Looking back, even Turing offered a better value improvement, unless there's a specific segment I'm overlooking. By my understanding, this is the worst value improvement from a new generation in the history of the PC dGPU market. People are mad because it's never been like this before. It's new!

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

Paul MaudDib posted:

I, too, yearn for the fantasy world where getting a corvette for $10k is fair and reasonable simply because that's the amount the average consumer is willing to pay for a car. Look buddy this guy on Craigslist is willing to sell me a 1992 Camry for $500, that's good enough for what I need to do, where do you get off thinking a corvette should be selling for $75k!?

nice things cost money and i'm not sure why that's such a shocker to people. GTX 1070 launched at a (real) price of $450 like 7 years ago, 4070 is not really that much more expensive considering the spiraling cost of node shrinks and everything else that has transpired since.

Nice things do cost money, but as others have said, wages aren’t pacing inflation, and Nvidia is at least perceived as (if not guilty of) making hay while the sun shines in the current environment. I literally just swapped an RX 6600 for an RTX 3060, so I’m not immune to Nvidia’s value proposition to a point, but there’s a certain “your points are invalid, why can’t you all just resign yourself to Team Green supremacy and eat cake like me” vibe coming off your posts lately. Frustration with the state of things is being expressed here. You are not obligated to spring to Nvidia’s rescue.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Twerk from Home posted:

Looking back, even Turing offered a better value improvement, unless there's a specific segment I'm overlooking. By my understanding, this is the worst value improvement from a new generation in the history of the PC dGPU market. People are mad because it's never been like this before. It's new!

painting Turing's value as being anything other than disappointingly low is a massive retcon. Even Pascal got an "ehhh prices are rising and the gains aren't exactly what they used to be, at the founders' edition prices this is only an incremental improvement over the 970" with accusations of "selling a midrange die as 1070/1080 class" etc.

like yes, progress has been slowing since around 2008, and it's only been getting more and more intense over time. And more recently, starting with 16nm, costs have been spiraling significantly despite those mediocre gains.

Turing and Ampere were an attempt to get around that cost trend by using older nodes. And in large part... it worked! Ampere was actually an extremely good gain for the modern era, for example. Turing also kept pushing in the upper range (2080 Ti was a great card for $950). The problem is people's expectations are miscalibrated from the 2000s still, when moore's law was still working, both on cost and performance steps you get from it.

The other problem is... when you bump back from these artificially-cheap generations onto the actual cost-trend curve, where you go from 1070 at $449 to 4070 at $599, people feel like they're being ripped off because Ampere actually did offer some out-of-band high-value products (I reject the idea that outside 2070S and 2080 Ti people were gaga for Turing) and it's pretty inevitable that going back to TSMC was going to produce big performance gains but cause perf/$ to moonwalk in a lot of segments - especially the lower-end segments. It's the "archer's hangover" problem - at some point that strategy just leads to a hangover so strong that it will literally kill you.

Paul MaudDib fucked around with this message at 15:19 on Jul 11, 2023

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Hasturtium posted:

there’s a certain “your points are invalid, why can’t you all just resign yourself to Team Green supremacy and eat cake like me” vibe coming off your posts lately. Frustration with the state of things is being expressed here. You are not obligated to spring to Nvidia’s rescue.

I think this puts it perfectly in terms of the frustration, but...

Zedsdeadbaby posted:

Do not engage the possible userbenchmark admin

this is also possible! :haw:

Regrettable
Jan 5, 2010



I just assume Paul owns a bunch of Nvidia stock.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Personally I think it's just a true dislike/hate of AMD for whatever reason (tribalism, failed job opportunity, whatever), based on the past several years of anti-AMD screeds.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Hasturtium posted:

Nice things do cost money, but as others have said, wages aren’t pacing inflation, and Nvidia is at least perceived as (if not guilty of) making hay while the sun shines in the current environment. I literally just swapped an RX 6600 for an RTX 3060, so I’m not immune to Nvidia’s value proposition to a point, but there’s a certain “your points are invalid, why can’t you all just resign yourself to Team Green supremacy and eat cake like me” vibe coming off your posts lately. Frustration with the state of things is being expressed here. You are not obligated to spring to Nvidia’s rescue.

I'm just overall getting tired of generation after generation of "the new thing sucks, buy the thing we told you sucks last year" reviews, and overall general negativity. There's a lot of people who are whining just to whine. $450 to $600 in 7 years with a pandemic in the middle isn't that bad, and nobody can do anything about slowing node shrinks. NVIDIA put up a diligent fight with DLSS trying to get more out of their transistors and people constantly dump on them for it.

If you're fine with what you've got, great! if you aren't, go ahead and upgrade, there are reasonable products available at varying price points during each generation. if you can't afford it - buy a console, they're even cheaper. Hobbies cost money and at the end of the day they’re entertainment, crying because you can’t afford track day is silly and immature especially when your car works fine.

this isn't just an NVIDIA problem either, AMD is doing just as badly (actually much worse) at coping with the rising costs of node shrinks. RX 7600 isn't a great product by 2008 standards either, and N32 products are going to be an exciting mixture of "amazingly low margins" and "amazingly terrible perf/$". And they're not even trying to break the perf-per-transistor mold with things like DLSS, they're just resigned to their fate at this point. They’ll make the products they can make and if people don’t like them oh well, they’re not going to run negative margins on Vega or N32 just to make people happy. Put it fifty bucks under nvidia and call it a day… just like 6800XT.

MCM is fine for high end but deploying 500-1000mm2 of silicon isn’t cheap even if it yields 100%. That’s the thing that’s changed in the last 7 years since I was talking about quad MCM with mspaint diagrams… wafers cost 5x as much now with no signs of slowing.

Paul MaudDib fucked around with this message at 15:59 on Jul 11, 2023

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SourKraut posted:

I think this puts it perfectly in terms of the frustration, but...

this is also possible! :haw:

I feel bad about gamers not getting new toys in general but like, some people would really rather shoot the messenger. Grow up.

It’s also just generally shocking how this kind of discourse is completely acceptable around AMD but people roll their eyes at any other brand. Remember when the AMD people were defending the chipset segmentation ploy with explanations of how technically reasonable it was, or running loops of NOPs to explain how defective early zen2 silicon wasn’t actually false advertising? Yeah this is way more justified and way less cope-y than that.

Even AMD says the days of big improvements at the same price point are over. They just would prefer to talk about advanced packaging instead, while brushing past the part where they’re using 2x or more the silicon at 2.5x or more the cost.

Paul MaudDib fucked around with this message at 15:43 on Jul 11, 2023

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
People are saying it sucks because it sucks. The idea that GPUs have to cost this much is something you and only you have in your head. Everything else sans most current-gen motherboards costs about the same as it always did. Also, are you pretending the 4090 doesn't exist, or are you pretending it's being sold at low margin or a loss? I'm curious here...

Both AMD and Nvidia are in strong positions right now that don't incentivize them to serve the gaming GPU market well. We can acknowledge that is real and also not like it.

The phantom goons you're creating in your mind are hilarious. Not everyone here has money, but this forum is getting older, and clearly a significant number of us don't NEED to care about a few hundred bucks on a GPU. We're also on average less youthfully stupid and willing to overpay for things, and more willing to call out corporate bullshit.

And cut the ancient history bullshit. Everyone acknowledged the 3060 Ti and 3080 as incredible deals at MSRP, the crypto bubble inflating the real-world value of those GPUs doesn't erase the intent to sell them at those prices.

Cygni
Nov 12, 2005

raring to post

Paul MaudDib posted:

I'm just overall getting tired of generation after generation of "the new thing sucks, buy the thing we told you sucks last year" reviews, and overall general negativity. There's a lot of people who are whining just to whine. $450 to $600 in 7 years with a pandemic in the middle isn't that bad, and nobody can do anything about slowing node shrinks. NVIDIA put up a diligent fight with DLSS trying to get more out of their transistors and people constantly dump on them for it.

If you're fine with what you've got, great! if you aren't, go ahead and upgrade, there are reasonable products available at varying price points during each generation. if you can't afford it - that's kind of a "grow up" moment. hobbies cost money and PC gaming is still not an expensive hobby. if it's too much, buy a console, they're even cheaper.

paul is right about this imo. and in the short to medium run it is going to get worse, not better, for people who want to own their own GPU. the physics are what they are, and the free ride is over.

i said it before but i do think a significant portion of the people that post about GPUs are interested in it only from the reality TV show angle, and that's fine! do what you want, the drama is only going to get more fierce as the market continues to slow.

FuturePastNow
May 19, 2014


I just think the idea that DLSS makes GPUs "worth more" by any measurable amount is funny

kliras
Mar 27, 2021
talking about this in more direct number terms is also a pain in the rear end because you have to find some websites to scrape for performance and specs, so we're just throwing the same vaguely related charts at each other

i thought about pulling some stuff from tpu, but their tables are set up in an annoying way for scraping i can't be bothered to work around right now
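
For anyone who does want to pull numbers, a minimal sketch of the usual approach with pandas.read_html is below; the URL is a placeholder rather than a real TPU endpoint, and their actual pages may need extra cleanup (or a non-default User-Agent) beyond this.

code:
# Minimal sketch for pulling an HTML spec/performance table into a DataFrame.
# The URL is a placeholder, not a real TechPowerUp endpoint; real pages may
# block default user agents or need extra cleanup. read_html needs lxml,
# html5lib, or beautifulsoup4 installed.
import io

import pandas as pd
import requests

url = "https://example.com/gpu-specs"  # placeholder
html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=30).text

tables = pd.read_html(io.StringIO(html))  # one DataFrame per <table> in the page
specs = tables[0]
print(specs.head())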

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

FuturePastNow posted:

I just think the idea that DLSS makes GPUs "worth more" by any measurable amount is funny

the only way this would work would be if you made the argument that the only games that count are the ones that have DLSS (and that all the games that don't have DLSS already run "fast enough" without it)

otherwise, the fact that it isn't universal means it's a cherry-picked assertion. You'd have a better case for NIS or FSR 1.0/RSR doing this, but not DLSS.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

FuturePastNow posted:

I just think the idea that DLSS makes GPUs "worth more" by any measurable amount is funny

When you can’t keep going on the transistor treadmill due to cost, you have to find plan B to get more out of the transistors you’ve got. And hardware-accelerated tensor weighting does that significantly more effectively than procedural algorithms like FSR2.

Did you find the idea of hardware T&L or programmable shaders increasing product value to be funny? Why?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Paul MaudDib posted:

When you can’t keep going on the transistor treadmill due to cost, you have to find plan B to get more out of the transistors you’ve got. And tensor cores do that significantly more effectively than procedural algorithms.

Did you find the idea of hardware T&L or programmable shaders increasing product value to be funny? Why?

Just wondering, are tensor cores “cheap” in terms of die space?

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



K8.0 posted:

People are saying it sucks because it sucks. The idea that GPUs have to cost this much is something you and only you have in your head. Everything else sans most current-gen motherboards costs about the same as it always did. Also, are you pretending the 4090 doesn't exist, or are you pretending it's being sold at low margin or a loss? I'm curious here...

Both AMD and Nvidia are in strong positions right now that don't incentivize them to serve the gaming GPU market well. We can acknowledge that is real and also not like it.

The phantom goons you're creating in your mind are hilarious. Not everyone here has money, but this forum is getting older, and clearly a significant number of us don't NEED to care about a few hundred bucks on a GPU. We're also on average less youthfully stupid and willing to overpay for things, and more willing to call out corporate bullshit.

And cut the ancient history bullshit. Everyone acknowledged the 3060 Ti and 3080 as incredible deals at MSRP, the crypto bubble inflating the real-world value of those GPUs doesn't erase the intent to sell them at those prices.

100% agree/accurate

gradenko_2000 posted:

the only way this would work would be if you made the argument that the only games that count are the ones that have DLSS (and that all the games that don't have DLSS already run "fast enough" without it)

otherwise, the fact that it isn't universal means it's a cherry-picked assertion. You'd have a better case for NIS or FSR 1.0/RSR doing this, but not DLSS.

Yeah, the DLSS argument would make sense, as you basically implied, if it were universal to all GPUs, which is why the Starfield drama, while hilarious (and stupid on AMD's part), is a bit ironic: yeah, FSR 1.0 really sucks and FSR 2.0 is better but not DLSS 2.0+ quality, but it at least runs on Nvidia's product stack.

So pushing an argument that DLSS is a crucial decision-making element that should be accounted for at all times is basically what you would expect from an Nvidia talking head.

Wait... is Paul actually Jacob Freeman???

repiv
Aug 13, 2009

Rinkles posted:

Just wondering, are tensor cores “cheap” in terms of die space?

someone compared the GTX 16-series and RTX 20-series die layouts, which are both Turing-based but with only the latter having tensor cores and RT cores, and the overhead of the RTX features was something like 10% extra die space

that's including the RT cores, so the tensor cores alone are less than that

UHD
Nov 11, 2006


K8.0 posted:

And cut the ancient history bullshit. Everyone acknowledged the 3060 Ti and 3080 as incredible deals at MSRP, the crypto bubble inflating the real-world value of those GPUs doesn't erase the intent to sell them at those prices.

the intent ultimately didn't matter. the MSRP might have been a great price, but it's irrelevant if you can't actually buy it at that price. and nvidia/amd may not have had a lot of control over such things, but they sure took advantage of them all the same.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Rinkles posted:

Just wondering, are tensor cores “cheap” in terms of die space?

About 6% of total die area on Turing, based on comparative measurements of SM size between 16-series and 20-series from Fritzchens Fritz’ die shots. RTX is about 10% total and then it’s about 60/40 tensor vs RT.

https://old.reddit.com/r/hardware/comments/baajes/rtx_adds_195mm2_per_tpc_tensors_125_rt_07/

So, 30% more performance at iso-quality, for 6% more transistors. And it has knock-on effects because you’re rendering a sub-resolution image, raytracing is way more potent at 540p than 1080p, and memory bandwidth goes a lot farther, etc.

(and that trend predates Ada’s tiny memory buses, it’s just a general trend that allows even high-end cards to keep stretching without a return to 512b or using HBM. G6X is part of the same trend, as is memory delta compression, etc. Gotta get more out of your memory bus despite tighter and tighter integrity requirements in modern GDDR iterations preventing you from just fanning out infinitely.)
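
Taking the figures quoted in this post at face value (about 10% of die area for the RTX features, split roughly 60/40 tensor vs RT, and ~30% more performance at iso-quality), the area-vs-performance trade works out like this; it's just arithmetic on those numbers, not an independent measurement:

code:
# Arithmetic on the figures quoted above (not independent measurements):
# ~10% of Turing die area for RTX features, split roughly 60/40 tensor vs RT,
# and ~30% more performance at iso-quality from DLSS-style upscaling.
rtx_overhead = 0.10
tensor_share, rt_share = 0.60, 0.40

tensor_area = rtx_overhead * tensor_share   # ~6% of total die area
rt_area = rtx_overhead * rt_share           # ~4% of total die area

dlss_gain = 0.30
print(f"Tensor cores: ~{tensor_area:.0%} of die area for ~{dlss_gain:.0%} more perf at iso-quality")
print(f"Roughly {dlss_gain / tensor_area:.1f}x return per unit of die area spent")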

Paul MaudDib fucked around with this message at 16:25 on Jul 11, 2023

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Paul MaudDib posted:

I'm just overall getting tired of generation after generation of "the new thing sucks, buy the thing we told you sucks last year" reviews, and overall general negativity.

In this specific space, there are no bad products, just bad prices. I will defend that for any GPU with decent support. If we're going by MSRPs, then the 6650 XT had a $399 MSRP and the 7600 has a $299 MSRP, so there's a pretty decent value improvement, and combined with the AV1 encoder and a little bump in compute, it feels like a normal GPU generation? When the old thing used to cost a lot and is now cheap, then yeah, you should buy the old thing instead of the new thing.

If we're going by street prices, then:


The 3070 was a pretty bad GPU for $730. Compared to a $730 3070, the $399 4060 Ti seems great.
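
As a rough check on the MSRP comparison above (treating the RX 7600 as performing about the same as the 6650 XT, which is close to what most reviews show), the perf-per-dollar change is just:

code:
# Rough perf/$ comparison from the MSRPs above. Assumption: RX 7600 raster
# performance is treated as roughly equal to the 6650 XT's.
old_msrp, new_msrp = 399, 299
rel_perf = 1.0  # assumed ~equal performance

value_gain = (rel_perf / new_msrp) / (1.0 / old_msrp) - 1
print(f"Perf per dollar improvement: ~{value_gain:.0%}")  # ~33% at equal performance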
