|
unpronounceable posted:Why does stuff like this matter? Like, I get that it seems like Nvidia will overprice their cards, but why does it matter what % of a chip is disabled or anything like that, when ultimately you can judge what performance you're getting for a given cost? I guess it is playing with segmentation in a weird way, but Nvidia has been playing games with segmentation for years. The 1060 3GB vs 6GB versions, the Turing Super refresh, and the existence of the whole 16xx line of cards all strike me as weird internal segmentation things.
|
# ? Dec 10, 2022 00:19 |
|
|
time for the obligatory 3dmark leak that's some weird scaling
|
# ? Dec 10, 2022 00:19 |
|
repiv posted:time for the obligatory 3dmark leak seems very in line vs the 4080. the 4090 can still be ignored.
|
# ? Dec 10, 2022 00:29 |
|
in more 'min specs slowly creeping upward with next-gen titles' news, star wars jedi survivor has a 1070/580 as its min spec. glad i just upgraded my gpu at last (got a 3060 ti for below msrp), since it sounds like retailers have pretty much exhausted 3080 stock (there's still some here but the pricing isn't great and retailers are saying it's the last they're going to get). i wonder if that means the 4070/ti will be priced somewhat reasonably.

what is going on with those 7900 xtx/xt benchmarks though? underwhelming raster performance compared to what was expected, but also bizarre scaling.

someone has looked at what was up with portal rtx and it is indeed just a bizarrely terrible implementation, possibly designed to intentionally run badly on amd cards, not even related to rt performance: https://twitter.com/JirayD/status/1601036292380250112
|
# ? Dec 10, 2022 00:48 |
|
it would be really dumb for nvidia to deliberately sabotage the game on AMD when they're all but guaranteed to win by miles even in a fair matchup, the whole premise favors their architectures.

register count exploding on one vendor but not the other could just be down to shader compiler quirks, as this guy from guerrilla games points out: https://twitter.com/_plop_/status/1601163128363831296

normally that would be a situation where the developer would talk to AMD and figure out what's going wrong, but lol at the politics of that when the developer is nvidia and the game is likely to make AMD look bad no matter what

repiv fucked around with this message at 01:53 on Dec 10, 2022 |
# ? Dec 10, 2022 01:26 |
|
repiv posted:it would be really dumb for nvidia to deliberately sabotage the game on AMD when they're all but guaranteed to win by miles even in a fair matchup, the whole premise favors their architectures Intel intentionally sabotaged compute performance using code compiled with the Intel compiler or using the Intel performance primitives when AMD was on bulldozer. AMD would probably be confused if someone stopped kicking them when they’re down.
|
# ? Dec 10, 2022 01:38 |
|
change my name posted:https://hothardware.com/news/nvidia-geforce-rtx-4070-ti-and-non-ti-4070-specs-leak Huh? Who has a 4070 Ti without an embargo/NDA? I'm not seeing anything about this.
|
# ? Dec 10, 2022 01:42 |
|
think they just misread the bit about colorful accidentally putting up the 4070 ti specs a few days ago
|
# ? Dec 10, 2022 03:43 |
|
hobbesmaster posted:Intel intentionally sabotaged compute performance using code compiled with the Intel compiler or using the Intel performance primitives when AMD was on bulldozer. but they aren't really down, the 6000 series is quite good.
|
# ? Dec 10, 2022 05:03 |
|
unpronounceable posted:Why does stuff like this matter? Like, I get that it seems like Nvidia will overprice their cards, but why does it matter what % of a chip is disabled or anything like that, when ultimately you can judge what performance you're getting for a given cost? It matters because if mining demand never happened, the 3060 was not a great deal at $400. This new "4070" is at the same lovely place in the stack and will probably sell for at least $550, despite the fact that we are now in a post-mining post-semiconductor crunch glut. With stuff being cheaper now, nvidia has no BOM excuse to price it that high. It's just loving consumers. Hopefully people refuse to buy all these GPUs and force prices down hard.
|
# ? Dec 10, 2022 06:20 |
|
the 3060 was $329 msrp and not a great deal, but the 3060 Ti at $399 was pretty great value when it launched. the prices of the 3060 and 3050 were definitely inflated because they knew that anything would sell in that market (same as the lower-end rx 6000 cards)

the 4090 is also just well beyond what they usually do for a halo product and actually goes some way to justify its price, unlike the 3090, so using that as the comparison point throws things off a fair bit.

costs would have been up anyway this gen from inflation + they're on a fairly cutting-edge node for once, which is much more expensive, so it would have been reasonable to expect another price increase as a result of all that. the issue is just that they hiked the initial 4080 prices to a stupid degree to try to sell off 30 series stock first.

now that the 30 series stock is mostly sold, it's possible the 4070/Ti prices end up more reasonable, & some sort of 4080 price cut is supposed to be on the way too, but it's probably not going to be enough. i wouldn't expect prices to be the sort of great value the 3060 Ti/3080 were at msrp, but there's probably room for them to be 'ok' instead of terrible like the 4080 is at the moment

lih fucked around with this message at 06:51 on Dec 10, 2022 |
# ? Dec 10, 2022 06:49 |
|
lih posted:the 4090 is also just well beyond what they usually do for a halo product and actually goes some way to justify its price, unlike the 3090 It's on a more expensive node than the 3090 but the die itself is a bit smaller and the VRAM is the same amount They just made everything else in the stack shittier this time instead of the xx80 getting you 90% of the way there
|
# ? Dec 10, 2022 06:57 |
|
wargames posted:but they aren't really down, the 6000 series is quite good. They are in RT
|
# ? Dec 10, 2022 07:06 |
|
K8.0 posted:It matters because if mining demand never happened, the 3060 was not a great deal at $400. This new "4070" is at the same lovely place in the stack and will probably sell for at least $550, despite the fact that we are now in a post-mining post-semiconductor crunch glut. With stuff being cheaper now, nvidia has no BOM excuse to price it that high. It's just loving consumers. Hopefully people refuse to buy all these GPUs and force prices down hard. Yeah - the 3060 is about twice the performance of the 1060 but is only now getting down to the same $300 price target almost six and a half years after the 1060 and well after the 3060's own launch, while also using substantially more power. As someone focused on actual value for money and not just sheer performance, it feels like everything in the last few years has been disappointing and the 4000-series is just insulting when I'm used to thinking of $1000 as a good price for a whole computer.
|
# ? Dec 10, 2022 07:58 |
|
Eletriarnation posted:Yeah - the 3060 is about twice the performance of the 1060 but is only now getting down to the same $300 price target almost six and a half years after the 1060 and well after the 3060's own launch, while also using substantially more power. As someone focused on actual value for money and not just sheer performance, it feels like everything in the last few years has been disappointing and the 4000-series is just insulting when I'm used to thinking of $1000 as a good price for a whole computer. I remember the 1060 6GB hitting $250 by september 2016. I got my 2060S for $330 in dec 2019, and that turned out to be one helluva deal just before the crypto shitstorm. what's even more insulting IMO is AAA games progressively making GBS threads the bed over the same years
|
# ? Dec 10, 2022 10:36 |
|
The AIB 1060 cards started at $250 at launch, and it was just the FE that had a $300 MSRP. That was back when Nvidia charged a premium for their FEs.
|
# ? Dec 10, 2022 10:55 |
|
It is insane to me that previously a top end PC was the price of a single 4090.
|
# ? Dec 10, 2022 11:07 |
|
ijyt posted:It is insane to me that previously a top end PC was the price of a single 4090.
|
# ? Dec 10, 2022 11:10 |
|
The early to mid twenty-teens were really the golden age for PC building. It was pretty expensive before that, and now it's gotten even dumber than before, it feels like.
|
# ? Dec 10, 2022 11:17 |
|
repiv posted:DF gets a #1 victory royale, yeah fortnite we 'bout to get down (get down) I think it's extremely impressive stuff, and Epic being aware of UE5's abysmal PC performance gives me some hope Dr. Video Games 0031 posted:The early to mid twenty-teens were really the golden age for PC building. It was pretty expensive before that, and now it's gotten even dumber than before, it feels like. i5 2500K/970 was truly the golden age.
|
# ? Dec 10, 2022 11:40 |
|
lih posted:
No, this sounds like paranoia. Compare it to Nvidia's 1st RT gen. Turing raw performance on the 2070 or 2080 is quite poor as well. My 4090 hovers around 30-40 fps, so that's not spectacular raw performance per se. We have a showcase where DLSS3 alone does the magic and pushes it to 120-130 fps. It's an incredible technology. Just loving expensive outside of an enthusiast hobby budget. Mr.PayDay fucked around with this message at 13:31 on Dec 10, 2022 |
# ? Dec 10, 2022 13:28 |
|
wargames posted:but they aren't really down, the 6000 series is quite good. Exactly. Comparing Nvidia’s 3rd (!) generation of Raytracing GPUs and Features to AMD’s 1st gen Raytracing GPUs *should* show significant discrepancies. Turing gets obliterated by RTX 4000 as well.
|
# ? Dec 10, 2022 13:33 |
|
Mr.PayDay posted:No, this sounds like paranoia. The 6900 XT performs significantly worse than the 2070 even. There's something seriously wrong with the way the game plays on RDNA2. This isn't about 1st-gen RT cores vs 2nd or 3rd. It's just bugged.
|
# ? Dec 10, 2022 13:36 |
|
That profile shows the AMD card is barely being utilized. I don't expect it would run well in the optimal case, but it's nowhere near optimal right now. The question is just whether it's cartoon villain malice (if AMD detected then emit lovely shaders), Nvidia playing to the strengths of their own hardware and being apathetic towards how it runs on the competition, or them accidentally hitting an edge case in AMD's driver
|
# ? Dec 10, 2022 13:39 |
|
repiv posted:The question is just whether it's cartoon villain malice (if AMD detected then emit lovely shaders), Nvidia playing to the strengths of their own hardware and being apathetic towards how it runs on the competition, or them accidentally hitting an edge case in AMDs driver Now it really reminds me of the intel compiler, take a look at this article and what Intel was doing circa 2007-2010. https://www.agner.org/forum/viewtopic.php?t=6 quote:In other words, they claim that they are optimizing for specific processor models rather than for specific instruction sets. If true, this gives Intel an argument for not supporting AMD processors properly. But it also means that all software developers who use an Intel compiler have to recompile their code and distribute new versions to their customers every time a new Intel processor appears on the market. Three years later, I tried to run a program compiled with an old version of Intel's compiler on the newest Intel processors? You guessed it: It still runs the optimal code path. But the reason is more difficult to guess: Intel have manipulated the CPUID family numbers on new processors in such a way that they appear as known models to older Intel software. If you ever noticed programs like matlab, mathematica, labview and similar were unusually slow on AMD in that era this was why. While this is a bit of a digression, I point out this line “In other words, they claim that they are optimizing for specific processor models rather than for specific instruction sets”. This is how GPU driver optimizations work and there really isn’t another option at this point. Just look at intel’s performance in their new GPUs that are starting from “zero” and how it compares to AMD/nvidia hardware that should be similar. Portal RTX and nvidia’s new remix stuff is wholesale replacing a game’s fixed function pipeline and dropping in its own new code. 
It should not be surprising that this path has some sort of switch statement for Ada, Ampere, Turing and a default fall-through. That “default” is probably effectively a debugging path and a performance disaster.
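To make that dispatch pattern concrete, here's a toy sketch. The vendor strings are real CPUID values, but the family numbers and path names are made up, and this is obviously not Intel's actual dispatcher:

```python
# Toy sketch of model-keyed dispatch as described in Agner's article:
# the fast paths are keyed on (vendor string, family number), and anything
# unrecognized -- a competitor's CPU, or a future model the table doesn't
# know about -- falls through to a generic slow path. Family numbers and
# path names here are illustrative, not real.

KNOWN_FAST_MODELS = {
    ("GenuineIntel", 0x06): "avx_path",
    ("GenuineIntel", 0x0F): "sse2_path",
}

def pick_code_path(vendor: str, family: int) -> str:
    """Return the code path a model-keyed dispatcher would select."""
    return KNOWN_FAST_MODELS.get((vendor, family), "generic_path")

print(pick_code_path("GenuineIntel", 0x06))  # known model -> fast path
print(pick_code_path("AuthenticAMD", 0x06))  # same ISA support, generic path anyway
```

The point is the default case: a CPU that supports the same instruction sets but isn't on the known-model list still lands on the generic path, which is exactly the "optimizing for processor models rather than instruction sets" behavior the article describes.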
|
# ? Dec 10, 2022 16:38 |
|
DLSS/FSR2/XeSS support for Skyrim: https://www.nexusmods.com/skyrimspecialedition/mods/80343 Good to see more of these retrofits happening. Extends the modding potential for many of these games that don't have modern optimization.
|
# ? Dec 10, 2022 16:39 |
|
would love to see a pluggable solution with special k, but even special k has a habit of not working with games after a while, like elden ring. i guess the good thing about skyrim is that it's not updated anymore, which limits the risk of stuff breaking
|
# ? Dec 10, 2022 16:43 |
|
Now I have to go see if I still have Skyrim installed and see what it looks like using DLSS as DLAA.
|
# ? Dec 10, 2022 16:44 |
|
I guess it's super hard to simply 'plug in' DLSS/FSR2.0. It needs motion data, so games without TAA are a no-go.
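As a made-up 1D illustration of why that motion data matters (this is not any vendor's actual algorithm): a temporal upscaler has to fetch each pixel's history sample from where the surface *was* last frame, which is exactly what a TAA-style motion vector encodes.

```python
# Toy 1D illustration of temporal history reprojection: history samples
# must be fetched via per-pixel motion vectors, not reused at the same index.

def reproject(history, motion):
    """Fetch each pixel's history sample via its motion vector (in pixels)."""
    out = []
    for i, mv in enumerate(motion):
        src = i - mv  # where this pixel's surface was last frame
        out.append(history[src] if 0 <= src < len(history) else 0.0)
    return out

prev_frame = [0.0, 1.0, 0.0, 0.0]   # a bright feature at index 1
motion =     [0,   0,   1,   0]     # pixel 2's surface moved right by one

# With motion vectors, the history sample follows the feature to index 2:
print(reproject(prev_frame, motion))        # [0.0, 1.0, 1.0, 0.0]
# Naive reuse (all-zero motion) leaves the sample behind -> ghosting/smearing:
print(reproject(prev_frame, [0, 0, 0, 0]))  # [0.0, 1.0, 0.0, 0.0]
```

Without an engine-provided motion vector buffer (which TAA games already have), there's nothing to feed that reprojection step, which is why these mods target TAA games.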
|
# ? Dec 10, 2022 16:45 |
|
kliras posted:would love to see a pluggable solution with special k. but even special k has a habit of not working with games after a while like elden ring. i guess the good thing about skyrim is that it's not updated anymore which limits the risk of stuff breaking The AE update actually broke support for a poo poo ton of stuff. As I understand it, a lot of mods won’t ever work. The solution is to downgrade to an earlier version.
|
# ? Dec 10, 2022 16:46 |
|
Skyrim is a boring game. Is there any way to make this work with Fallout 4, which is a much better, cooler game using pretty much the same engine and all that?
|
# ? Dec 10, 2022 16:50 |
|
hobbesmaster posted:It should not be surprising that this path has some sort of switch statement for Ada, Ampere, Turing and a default fall-through. That “default” is probably effectively a debugging path and a performance disaster.

There's definitely a distinct path for Ada to leverage the binning/opacity/micromesh stuff that's exclusive to that, but at the API level there isn't really a distinction between Ampere/Turing/RDNA2 raytracing.

There's absolutely a distinction between best practices on Ampere/Turing and RDNA2 though, since the underlying RT implementation is so different, so it's possible that exactly the same code is running on Ampere and RDNA2 but it only becomes pathological on RDNA2. In that case a real game would be expected to either find a compromise that works on both or have a separate path tuned for RDNA2, but this is an Nvidia tech demo so they're obviously not going to do that.

If it performed like Quake 2 RTX (which is open source, so nothing up NV's sleeve) it would still be a runaway win for NV cards, so they really have no reason to rig the engine on purpose.

Also of note is that Portal RTX currently just doesn't run at all on Intel cards, which points to a lack of giving a poo poo about competitors' hardware rather than actively sabotaging it.

repiv fucked around with this message at 17:38 on Dec 10, 2022 |
# ? Dec 10, 2022 16:53 |
|
hobbesmaster posted:Now I have to go see if I still have Skyrim installed and see what it looks like using DLSS as DLAA.
|
# ? Dec 10, 2022 16:55 |
|
Faded Mars posted:Skyrim is a boring game. Is there any way to make this work with Fallout 4, which is a much better, cooler game using pretty much the same engine and all that? FO4 is getting a free “next gen” update next year, I wouldn’t be surprised if they integrated DLSS as part of that.
|
# ? Dec 10, 2022 17:01 |
|
Zedsdeadbaby posted:i5 2500K/970 was truly the golden age. Second golden age I'd say, first was celeron 300a oc'd to 450 and voodoo2 era.
|
# ? Dec 10, 2022 18:04 |
|
repiv posted:There's definitely a distinct path for Ada to leverage the binning/opacity/micromesh stuff that's exclusive to that, but at the API level there isn't really a distinction between Ampere/Turing/RDNA2 raytracing. Quake 2 RTX is actually compiled in… this remix stuff that Portal RTX is supposed to be an example of is something that intercepts calls, and one explanation could be that nvidia is using more of their “secrets” to do that in an efficient way.
|
# ? Dec 10, 2022 18:10 |
|
Rinkles posted:The AE update actually broke support for a poo poo ton of stuff. As I understand it, a lot of mods won’t ever work. The solution is to downgrade to an earlier version. There's a thing called the "best of both worlds" patcher that allows pre-AE (but Special Edition, as opposed to the very original release) mods to work on AE. The problem is the Skyrim Script Extender, which is code injection at specific memory addresses; whenever Bethesda updates the application, that breaks (and a lot of mods use the Script Extender). Unfortunately Bethesda has still introduced some minor updates since AE for various reasons, from "who cares" to "that's completely useless bullshit."
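A made-up sketch of why that address-based injection is so fragile (the build numbers are real Skyrim versions, but the offsets and function name are invented for illustration):

```python
# Hypothetical illustration of SKSE-style hooking: hooks target hardcoded
# offsets from the module base, and every game patch is a rebuild that
# shuffles where each function lands. All offsets below are made up.

OFFSETS_BY_BUILD = {
    "1.5.97":  {"Actor::Update": 0x5D3F10},
    "1.6.640": {"Actor::Update": 0x5E81A0},  # same function, new offset after AE
}

def resolve_hook(build: str, name: str, base: int = 0x140000000):
    """Resolve a hook's absolute address, or None on an unknown build."""
    table = OFFSETS_BY_BUILD.get(build)
    if table is None or name not in table:
        return None  # unknown build: refuse to load rather than patch garbage
    return base + table[name]

print(hex(resolve_hook("1.5.97", "Actor::Update")))   # 0x1405d3f10
print(resolve_hook("1.6.1130", "Actor::Update"))      # None: untracked update
```

Every patch means someone has to re-derive the whole offset table for the new executable, which is why script-extender-dependent mod setups pin an exact game version.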
|
# ? Dec 10, 2022 18:15 |
|
i feel dumb for not picking up a 3080ti for 650ish at bestbuy a few months ago. i can still only match that with a used one from ebay. what's going on, i want to replace my 1080ti
|
# ? Dec 10, 2022 18:55 |
|
thats not candy posted:i feel dumb for not picking up a 3080ti for 650ish at bestbuy a few months ago. i can only match that with a used one from ebay still Is everything still a gently caress? I put it off until after I came back from vacation since I wasn't going to use the PC during that time. Joke's on me I guess?? The 1070 was still chugging along before I left but I was hoping to get something sensible to replace it finally.
|
# ? Dec 10, 2022 21:29 |
|
Over the summer, the higher-end 3000 series plummeted in price. I personally saw a sub-$800 3090. Then they quickly went back up and there seems to be very little stock left.
|
|
# ? Dec 10, 2022 21:49 |