CaptainSarcastic
Jul 6, 2013



unpronounceable posted:

Why does stuff like this matter? Like, I get that it seems like Nvidia will overprice their cards, but why does it matter what % of a chip is disabled or anything like that, when ultimately you can judge what performance you're getting for a given cost?

I guess it is playing with segmentation in a weird way, but Nvidia has been playing games with segmentation for years. The 1060 3GB vs 6GB versions, the Turing Super refresh, and the existence of the whole 16xx line of cards all strike me as weird internal segmentation things.


repiv
Aug 13, 2009

time for the obligatory 3dmark leak



that's some weird scaling

wargames
Mar 16, 2008

official yospos cat censor

repiv posted:

time for the obligatory 3dmark leak



that's some weird scaling

seems very in line with the 4080; the 4090 can still be ignored.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
in more 'min specs slowly creeping upward with next-gen titles', star wars jedi survivor has 1070/580 as its min specs, glad i just upgraded my gpu at last (got a 3060 ti for below msrp)

since it sounds like retailers have pretty much exhausted 3080 stock (there's still some here but the pricing isn't great and retailers are saying it's the last they're going to get) i wonder if that means the 4070/ti will be priced somewhat reasonably

what is going on with those 7900 xtx/xt benchmarks though, underwhelming raster performance compared to what was expected but also bizarre scaling?

someone has looked at what was up with portal rtx and it is indeed just a bizarrely terrible implementation possibly designed to intentionally run bad on amd cards, not even related to rt performance:

https://twitter.com/JirayD/status/1601036292380250112

repiv
Aug 13, 2009

it would be really dumb for nvidia to deliberately sabotage the game on AMD when they're all but guaranteed to win by miles even in a fair matchup, the whole premise favors their architectures

register count exploding on one vendor but not the other could just be down to shader compiler quirks, as this guy from guerilla games points out

https://twitter.com/_plop_/status/1601163128363831296

normally that would be a situation where the developer would talk to AMD and figure out what's going wrong, but lol at the politics of that when the developer is nvidia and the game is likely to make AMD look bad no matter what

repiv fucked around with this message at 01:53 on Dec 10, 2022

hobbesmaster
Jan 28, 2008

repiv posted:

it would be really dumb for nvidia to deliberately sabotage the game on AMD when they're all but guaranteed to win by miles even in a fair matchup, the whole premise favors their architectures

Intel intentionally sabotaged compute performance using code compiled with the Intel compiler or using the Intel performance primitives when AMD was on bulldozer.

AMD would probably be confused if someone stopped kicking them when they’re down.

Dr. Video Games 0031
Jul 17, 2004

change my name posted:

https://hothardware.com/news/nvidia-geforce-rtx-4070-ti-and-non-ti-4070-specs-leak

Full info on the 4070 ti has "leaked" apart from the pricing. What really happened is that they randomly sent them out to the tech press without any embargo info or marketing materials...

Huh? Who has a 4070 Ti without an embargo/NDA? I'm not seeing anything about this.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
think they just misread the bit about colorful accidentally putting up the 4070 ti specs a few days ago

wargames
Mar 16, 2008

official yospos cat censor

hobbesmaster posted:

Intel intentionally sabotaged compute performance using code compiled with the Intel compiler or using the Intel performance primitives when AMD was on bulldozer.

AMD would probably be confused if someone stopped kicking them when they’re down.

but they aren't really down, the 6000 series is quite good.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

unpronounceable posted:

Why does stuff like this matter? Like, I get that it seems like Nvidia will overprice their cards, but why does it matter what % of a chip is disabled or anything like that, when ultimately you can judge what performance you're getting for a given cost?

It matters because if mining demand never happened, the 3060 was not a great deal at $400. This new "4070" is at the same shitty place in the stack and will probably sell for at least $550, despite the fact that we are now in a post-mining, post-semiconductor-crunch glut. With stuff being cheaper now, nvidia has no BOM excuse to price it that high. It's just fucking consumers. Hopefully people refuse to buy all these GPUs and force prices down hard.

lih
May 15, 2013

Just a friendly reminder of what it looks like.

We'll do punctuation later.
the 3060 was $329 msrp and not a great deal, but the 3060 Ti at $399 was pretty great value when it launched. the prices of the 3060 and 3050 were definitely inflated because they knew that anything would sell in that market (same as the lower-end rx 6000 cards)

the 4090 is also just well beyond what they usually do for a halo product and actually goes some way to justify its price, unlike the 3090, so using that as the comparison point throws things off a fair bit. costs would have been up anyway this gen from inflation + they're on a fairly cutting edge node for once which is much more expensive, so it would have been reasonable to expect another price increase as a result of all that. the issue is just they hiked the initial 4080 prices to a stupid degree to try to sell off 30 series stock first. now that the 30 series stock is mostly sold, it's possible the 4070/Ti prices end up more reasonable, & some sort of 4080 price cut is supposed to be on the way too but it's probably not going to be enough.

i wouldn't expect prices to be the sort of great value the 3060 Ti/3080 were at msrp but there's probably room for them to be possibly 'ok' instead of terrible like the 4080 is at the moment

lih fucked around with this message at 06:51 on Dec 10, 2022

Inept
Jul 8, 2003

lih posted:

the 4090 is also just well beyond what they usually do for a halo product and actually goes some way to justify its price, unlike the 3090

It's on a more expensive node than the 3090 but the die itself is a bit smaller and the VRAM is the same amount

They just made everything else in the stack shittier this time instead of the xx80 getting you 90% of the way there

hobbesmaster
Jan 28, 2008

wargames posted:

but they aren't really down, the 6000 series is quite good.

They are in RT

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

K8.0 posted:

It matters because if mining demand never happened, the 3060 was not a great deal at $400. This new "4070" is at the same shitty place in the stack and will probably sell for at least $550, despite the fact that we are now in a post-mining, post-semiconductor-crunch glut. With stuff being cheaper now, nvidia has no BOM excuse to price it that high. It's just fucking consumers. Hopefully people refuse to buy all these GPUs and force prices down hard.

Yeah - the 3060 is about twice the performance of the 1060 but is only now getting down to the same $300 price target almost six and a half years after the 1060 and well after the 3060's own launch, while also using substantially more power. As someone focused on actual value for money and not just sheer performance, it feels like everything in the last few years has been disappointing and the 4000-series is just insulting when I'm used to thinking of $1000 as a good price for a whole computer.
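Eletriarnation's point can be roughed out numerically. A back-of-envelope sketch: the 2x performance and the $300 price come from the post above, while the board-power figures (120 W for the 1060, 170 W for the 3060) are the commonly cited TDPs and are my assumption here:

```python
# Rough value comparison, 1060 -> 3060, per the post above.
# perf is relative performance (1060 = 1.0); tdp values are assumed board power.
def perf_per_dollar(relative_perf, price):
    return relative_perf / price

gtx_1060 = {"perf": 1.0, "price": 300, "tdp": 120}
rtx_3060 = {"perf": 2.0, "price": 300, "tdp": 170}

ppd_1060 = perf_per_dollar(gtx_1060["perf"], gtx_1060["price"])
ppd_3060 = perf_per_dollar(rtx_3060["perf"], rtx_3060["price"])

# perf/W gain: how much performance each watt buys, generation over generation
perf_per_watt_gain = (rtx_3060["perf"] / rtx_3060["tdp"]) / (gtx_1060["perf"] / gtx_1060["tdp"])

print(f"perf/$ gain over ~6.5 years: {ppd_3060 / ppd_1060:.1f}x")
print(f"perf/W gain: {perf_per_watt_gain:.2f}x")
```

Only about a 2x perf/$ improvement in six and a half years (and roughly 1.4x perf/W), which is the "disappointing" part: earlier generations managed that kind of jump in two years, not six.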

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Eletriarnation posted:

Yeah - the 3060 is about twice the performance of the 1060 but is only now getting down to the same $300 price target almost six and a half years after the 1060 and well after the 3060's own launch, while also using substantially more power. As someone focused on actual value for money and not just sheer performance, it feels like everything in the last few years has been disappointing and the 4000-series is just insulting when I'm used to thinking of $1000 as a good price for a whole computer.

I remember the 1060 6GB hitting $250 by September 2016

I got my 2060S for $330 in Dec 2019 and that turned out to be one helluva deal just before the crypto shitstorm

what's even more insulting IMO is AAA games progressively shitting the bed over the same years

Dr. Video Games 0031
Jul 17, 2004

The AIB 1060 cards started at $250 at launch, and it was just the FE that had a $300 MSRP. That was back when Nvidia charged a premium for their FEs.

ijyt
Apr 10, 2012

It is insane to me that previously a top end PC was the price of a single 4090.

Shipon
Nov 7, 2005

ijyt posted:

It is insane to me that previously a top end PC was the price of a single 4090.
Wait until you find out how much "top end PCs" cost in the late 90s. You're talking like $3k...in late 90s dollars.

Dr. Video Games 0031
Jul 17, 2004

The early to mid twenty-teens were really the golden age for PC building. It was pretty expensive before that, and now it's gotten even dumber than before, it feels like.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

repiv posted:

DF gets a #1 victory royale, yeah fortnite we 'bout to get down (get down)

https://www.youtube.com/watch?v=O6GC8TZbJmI

series s is unsurprisingly being stretched quite thin, but at least it's playable

XSX: 864p-1836p dynamic, ~1274p average, TSR'ed to 4K
PS5: 864p-1836p dynamic, ~1188p average, TSR'ed to 4K
XSS: 540p-1080p dynamic, ~788p average, TSR'ed to 1080p

I think it's extremely impressive stuff, and Epic being aware of UE5's abysmal PC performance gives me some hope
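Those TSR ratios are steep. A quick sketch of how much reconstruction is actually happening, using the average internal heights quoted above and assuming 16:9 frames (so pixel count scales with the square of the height):

```python
# How many output pixels TSR reconstructs per internally rendered pixel.
# Assumes 16:9 frames at both resolutions, so area scales with height squared.
def upscale_factor(avg_height, target_height):
    """Ratio of output pixel count to average rendered pixel count."""
    return (target_height / avg_height) ** 2

# (average internal height, TSR output height), from the DF figures above
platforms = {
    "XSX": (1274, 2160),
    "PS5": (1188, 2160),
    "XSS": (788, 1080),
}

for name, (avg, target) in platforms.items():
    print(f"{name}: TSR reconstructs ~{upscale_factor(avg, target):.1f}x the rendered pixel count")
```

So the consoles are reconstructing roughly 3x the pixels they render on average, and even the Series S is close to 2x to reach 1080p.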

Dr. Video Games 0031 posted:

The early to mid twenty-teens were really the golden age for PC building. It was pretty expensive before that, and now it's gotten even dumber than before, it feels like.

i5 2500K/970 was truly the golden age.

Mr.PayDay
Jan 2, 2004
life is short - play hard

lih posted:


someone has looked at what was up with portal rtx and it is indeed just a bizarrely terrible implementation possibly designed to intentionally run bad on amd cards,

No, this sounds like paranoia.
Compare it to the 1st RT Gen of Nvidia. The Turing raw performance on the 2070 or 2080 is quite poor as well.

My 4090 hovers around 30-40 fps, so that’s not spectacular raw performance per se.
We have a showcase where just DLSS3 does the magic and pushes it to 120-130 fps.
It’s an incredible technology. Just fucking expensive outside of an enthusiast hobby budget.

Mr.PayDay fucked around with this message at 13:31 on Dec 10, 2022

Mr.PayDay
Jan 2, 2004
life is short - play hard

wargames posted:

but they aren't really down, the 6000 series is quite good.

Exactly. Comparing Nvidia’s 3rd (!) generation of raytracing GPUs and features to AMD's 1st-gen raytracing GPUs *should* show significant discrepancies.
Turing gets obliterated by RTX4000 as well.

Dr. Video Games 0031
Jul 17, 2004

Mr.PayDay posted:

No, this sounds like paranoia.
Compare it to the 1st RT Gen of Nvidia. The Turing raw performance on the 2070 or 2080 is quite poor as well.

My 4090 hovers around 30-40 fps, so that’s not spectacular raw performance per se.
We have a showcase where just DLSS3 does the magic and pushes it to 120-130 fps.
It’s an Incredible technology. Just loving expensive outside of enthusiast spending hobby budget.

The 6900 XT performs significantly worse than the 2070 even. There's something seriously wrong with the way the game plays on RDNA2. This isn't about 1st-gen RT cores vs 2nd or 3rd. It's just bugged.

repiv
Aug 13, 2009

That profile shows the AMD card is barely being utilized, I don't expect it would run well in the optimal case but it's nowhere near optimal right now

The question is just whether it's cartoon villain malice (if AMD detected, then emit shitty shaders), Nvidia playing to the strengths of their own hardware and being apathetic towards how it runs on the competition, or them accidentally hitting an edge case in AMD's driver

hobbesmaster
Jan 28, 2008

repiv posted:

The question is just whether it's cartoon villain malice (if AMD detected, then emit shitty shaders), Nvidia playing to the strengths of their own hardware and being apathetic towards how it runs on the competition, or them accidentally hitting an edge case in AMD's driver

Now it really reminds me of the Intel compiler; take a look at this article and what Intel was doing circa 2007-2010. https://www.agner.org/forum/viewtopic.php?t=6

quote:

In other words, they claim that they are optimizing for specific processor models rather than for specific instruction sets. If true, this gives Intel an argument for not supporting AMD processors properly. But it also means that all software developers who use an Intel compiler have to recompile their code and distribute new versions to their customers every time a new Intel processor appears on the market. Three years later, I tried to run a program compiled with an old version of Intel's compiler on the newest Intel processors. You guessed it: It still runs the optimal code path. But the reason is more difficult to guess: Intel have manipulated the CPUID family numbers on new processors in such a way that they appear as known models to older Intel software.

Perhaps the initial design of Intel's CPU dispatcher was indeed intended to optimize for known processor models only, without regard for future models. If any of my students had made such a solution that was not future-oriented, I would consider it a serious flaw. Perhaps the Intel engineers discovered the missing support for future processors too late so that they had to design the next generation of their processors in such a way that they appeared as known models to existing Intel software.


Most of Intel's function libraries now contain two different CPU dispatchers, a fair one and an unfair one. It is not clear when the fair dispatcher is used and when the unfair dispatcher is used. The decision may depend on legal technicalities that are elusive to the programmer and to the end user. Software products with unfair CPU dispatching still abound despite the settlement with AMD that prohibits artificial performance impairment.

In 2010, Intel published an article on how the CPU dispatching works in the Intel Performance Primitives (IPP) function library. The article indicates a fair handling of non-Intel processors in the IPP library. This is in accordance with my test results. What the article does not mention is the unfair CPU dispatching in several other Intel function libraries.

If you ever noticed programs like matlab, mathematica, labview and similar were unusually slow on AMD in that era this was why.

While this is a bit of a digression, I point out this line “In other words, they claim that they are optimizing for specific processor models rather than for specific instruction sets”. This is how GPU driver optimizations work and there really isn’t another option at this point. Just look at Intel’s performance in their new GPUs that are starting from “zero” and how it compares to AMD/nvidia hardware that should be similar. Portal RTX and nvidia’s new remix stuff is wholesale replacing a game’s fixed-function pipeline and dropping in its own new code. It should not be surprising that this path has some sort of switch statement for Ada, Ampere, Turing and a default fall-through. That “default” is probably effectively a debugging path and a performance disaster.
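The dispatch pattern described above can be sketched as a toy example. The architecture names are real, but the path names and the selection logic are hypothetical stand-ins for whatever tuned code a driver or remix layer would actually pick:

```python
# Toy sketch of per-architecture dispatch with a default fall-through.
# Path names here are invented for illustration.
def select_render_path(gpu_arch: str) -> str:
    # Tuned paths for the architectures the vendor actually cares about
    optimized_paths = {
        "ada": "ada_path (SER, opacity micromaps)",
        "ampere": "ampere_path",
        "turing": "turing_path",
    }
    # Everything else (RDNA2, Arc, ...) falls through to a generic path
    # that was likely never profiled or tuned
    return optimized_paths.get(gpu_arch.lower(), "default_path (untuned fallback)")

print(select_render_path("Ada"))    # gets a tuned path
print(select_render_path("RDNA2"))  # falls through to the default
```

Nothing in this structure requires malice: the default branch merely has to exist for correctness, and nobody is obligated to make it fast.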

v1ld
Apr 16, 2012

DLSS/FSR2/XeSS support for Skyrim: https://www.nexusmods.com/skyrimspecialedition/mods/80343

Good to see more of these retrofits happening. Extends the modding potential for many of these games that don't have modern optimization.

kliras
Mar 27, 2021
would love to see a pluggable solution with special k. but even special k has a habit of not working with games after a while like elden ring. i guess the good thing about skyrim is that it's not updated anymore which limits the risk of stuff breaking

hobbesmaster
Jan 28, 2008

Now I have to go see if I still have Skyrim installed and see what it looks like using DLSS as DLAA.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I guess it's super hard to simply 'plug in' DLSS/FSR2.0. It needs motion data, so games without TAA are a no-go.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

kliras posted:

would love to see a pluggable solution with special k. but even special k has a habit of not working with games after a while like elden ring. i guess the good thing about skyrim is that it's not updated anymore which limits the risk of stuff breaking

The AE update actually broke support for a shitton of stuff. As I understand it, a lot of mods won’t ever work. The solution is to downgrade to an earlier version.

Faded Mars
Jul 1, 2004

It is I, his chronicler, who alone can tell thee of his saga.
Skyrim is a boring game. Is there any way to make this work with Fallout 4, which is a much better, cooler game using pretty much the same engine and all that?

repiv
Aug 13, 2009

hobbesmaster posted:

It should not be surprising that this path has some sort of switch statement for Ada, ampere, Turing and a default fall through. That “default” is probably effectively a debugging path and a performance disaster.

There's definitely a distinct path for Ada to leverage the binning/opacity/micromesh stuff that's exclusive to that, but at the API level there isn't really a distinction between Ampere/Turing/RDNA2 raytracing.

There's absolutely a distinction between best practices on Ampere/Turing and RDNA2 though, since the underlying RT implementation is so different, so it's possible that exactly the same code is running on Ampere and RDNA2 but it only becomes pathological when running on RDNA2. In that case a real game would be expected to either find a compromise that works on both or have a separate path tuned for RDNA2, but this is an Nvidia tech demo so they're obviously not going to do that.

If it performed like Quake 2 RTX (which is open source, so nothing up NV's sleeve) it would still be a runaway win for NV cards, so they really have no reason to rig the engine on purpose



Also of note is that Portal RTX currently just doesn't run at all on Intel cards, which points to a lack of giving a shit about competitors' hardware rather than actively sabotaging it

repiv fucked around with this message at 17:38 on Dec 10, 2022

kliras
Mar 27, 2021

hobbesmaster posted:

Now I have to go see if I still have Skyrim installed and see what it looks like with using DLSS as DLAA looks like.
enb compatibility is still wip so you might as well wait until that's patched in

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Faded Mars posted:

Skyrim is a boring game. Is there any way to make this work with Fallout 4, which is a much better, cooler game using pretty much the same engine and all that?

FO4 is getting a free “next gen” update next year, I wouldn’t be surprised if they integrated DLSS as part of that.

Enos Cabell
Nov 3, 2004


Zedsdeadbaby posted:

i5 2500K/970 was truly the golden age.

Second golden age I'd say, first was celeron 300a oc'd to 450 and voodoo2 era.

hobbesmaster
Jan 28, 2008

repiv posted:

There's definitely a distinct path for Ada to leverage the binning/opacity/micromesh stuff that's exclusive to that, but at the API level there isn't really a distinction between Ampere/Turing/RDNA2 raytracing.

There's absolutely a distinction between best practices on Ampere/Turing and RDNA2 though, since the underlying RT implementation is so different, so it's possible that exactly the same code is running on Ampere and RDNA2 but it only becomes pathological when running on RDNA2. In that case a real game would be expected to either find a compromise that works on both or have a separate path tuned for RDNA2, but this is an Nvidia tech demo so they're obviously not going to do that.

If it performed like Quake 2 RTX (which is open source, so nothing up NV's sleeve) it would still be a runaway win for NV cards, so they really have no reason to rig the engine on purpose



Also of note is that Portal RTX currently just doesn't run at all on Intel cards, which points to a lack of giving a shit about competitors' hardware rather than actively sabotaging it

Quake 2 RTX is actually compiled in… this remix stuff that Portal RTX is supposed to be an example of is something that intercepts calls, and one explanation could be that nvidia is using more of their “secrets” to do that in an efficient way.

Arivia
Mar 17, 2011

Rinkles posted:

The AE update actually broke support for a shitton of stuff. As I understand it, a lot of mods won’t ever work. The solution is to downgrade to an earlier version.

There's a thing called the "best of both worlds" patcher that allows pre-AE (but Special Edition, as opposed to the very original release) mods to work on AE. The problem is the Skyrim Script Extender, which does code injection at specific memory addresses; whenever Bethesda updates the application, that breaks (and a lot of mods use the Script Extender). Unfortunately Bethesda has still introduced some minor updates since AE for various reasons, from "who cares" to "that's completely useless bullshit."
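The breakage described above can be illustrated with a toy example: a hook placed at a hardcoded offset lands on the wrong bytes as soon as an update shifts the code, while scanning for a byte signature still finds the target. All byte values and offsets here are invented for the example:

```python
# v1 of a "game binary", and v2 where an update shifted the code by one byte
game_v1 = bytes.fromhex("90 90 55 8b ec 83 ec 20 90")
game_v2 = bytes.fromhex("90 55 8b ec 83 ec 20 90 90")

HARDCODED_OFFSET = 2                        # where the target prologue lived in v1
sig = bytes.fromhex("55 8b ec 83 ec 20")    # the byte pattern we want to hook

def find_by_signature(image: bytes, pattern: bytes) -> int:
    """Locate the hook target by its byte pattern instead of a fixed address."""
    return image.find(pattern)

print(game_v1[HARDCODED_OFFSET:HARDCODED_OFFSET + 2].hex())  # 558b: hook lands correctly in v1
print(game_v2[HARDCODED_OFFSET:HARDCODED_OFFSET + 2].hex())  # 8bec: wrong bytes in v2, hook would crash
print(find_by_signature(game_v2, sig))                       # 1: signature scan still finds the prologue
```

This is why every Skyrim patch forces an SKSE update: the fixed addresses it depends on move, and mods built against the old addresses stop working until someone re-derives them.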

thats not candy
Mar 10, 2010

Hell Gem
i feel dumb for not picking up a 3080ti for 650ish at bestbuy a few months ago. i can only match that with a used one from ebay still

whats going on, i want to replace my 1080ti :cry:

mobby_6kl
Aug 9, 2009

by Fluffdaddy

thats not candy posted:

i feel dumb for not picking up a 3080ti for 650ish at bestbuy a few months ago. i can only match that with a used one from ebay still

whats going on, i want to replace my 1080ti :cry:

Is everything still fucked? I put it off until after I came back from vacation since I wasn't going to use the PC during that time. Joke's on me I guess?? The 1070 was still chugging along before I left but I was hoping to get something sensible to replace it finally.


cheesetriangles
Jan 5, 2011





Over the summer, the higher end 3000 series plummeted in price. I personally saw a sub 800 dollar 3090. Then they quickly went back up and there seems to be very little stock left.
