Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

My inclination is to say I would not buy that poo poo at gunpoint, gently caress DLSS 3.0 then


Agreed

ijyt posted:

I don't see how a reply like this helps to do anything but rile up the person you're replying to more.

Anyway, see you 50 series.

For real, I don't know why dude cares so much that I am not in favor of this news. Why single me out anyway, Cygni? You're spoiled for choice of people surprised by the overall announcement.

Rumors had the 4080 MSRP at $699-850 as recently as six days ago; I guess that was purely wishful thinking. To me the segmentation of the two 4080s reads more like the 10GB 3080 vs the 24GB 3090 from the previous generation, with the top end 24GB card being more analogous to the 3090 Ti.

Agreed

I'm trying to chill tf out here and think. Ok:

3080 10GB had 82% as many CUDA cores as the 3090.

4080 12GB has 78% as many CUDA cores as the 4080 16GB.

I think what's going on here is they're rejiggering the product tiers somewhat, although in doing so I feel they're being kinda miserly with VRAM.

Ampere had no analogous "holy poo poo that's a fuckload more cores" product like the 4090, where the card below it has only 59% as many CUDA cores. The only step above the 3080 10GB at Ampere's launch was the 3090, and the 3080 was only about 18% short of it, not 41% short the way the 4080 16GB is of the 4090. That gap is a lot closer to the difference between the two 4080s. And while $900 is fifty bucks higher than the high end of the rumors, it's less egregious than my initial reaction: 3090 launch MSRP was $1500, so the analogous product this time (that is, the 4080 16GB) actually has a lower MSRP than before. Though you'd think there would be a VRAM upgrade generation to generation. And the MSRP for the big daddy, while high, at least comes with a likely very large increase in processing power from that massive CUDA count.
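If anyone wants to check my napkin math, here's a quick sketch (the core counts are the publicly listed CUDA totals, so treat them as my assumption and double-check the spec sheets):

# Quick sanity check on the core-count ratios quoted above.
cores = {
    "3080 10GB": 8704,
    "3090": 10496,
    "4080 12GB": 7680,
    "4080 16GB": 9728,
    "4090": 16384,
}

def pct(small, large):
    # What fraction of the bigger card's cores the smaller card has, as a percentage.
    return 100 * cores[small] / cores[large]

print(f"3080 10GB vs 3090:      {pct('3080 10GB', '3090'):.0f}%")      # ~82-83%
print(f"4080 12GB vs 4080 16GB: {pct('4080 12GB', '4080 16GB'):.0f}%") # ~78-79%
print(f"4080 16GB vs 4090:      {pct('4080 16GB', '4090'):.0f}%")      # ~59%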

Cygni posted:

I replied to you once with two sentences, relax. Someone expresses similar sentiment most launches, and I just wanted to know if there was something specific that was making you upset about it.

My brother, I am high strung, and I was hoping to pay $800 for a 16GB card; that's the whole story

Agreed fucked around with this message at 17:22 on Sep 20, 2022

Agreed

If they had improved the VRAM at each performance tier, it would be more obvious that in terms of relative processing power at launch, 4080 12GB ≈ 3080 10GB, 4080 16GB ≈ 3090, and the 4090 has no Ampere precedent. They kinda nerfed the 3090 tier's VRAM but cut its price by $300 in exchange with the 4080 16GB, while not increasing the base 4080's VRAM at all versus the final 12GB 3080. In terms of GPU power it isn't so crazy, just priced a bit higher than expected for the product closest to Ampere's launch 3080.

Agreed fucked around with this message at 17:45 on Sep 20, 2022

Agreed

Cao Ni Ma posted:

3090s were terrible cards and strictly in the halo category that very few people buy into.

Now with the x80s going up to $1200 theyve blurred the line of terrible product you shouldnt buy to "well its just a few hundred more and its actually available"

The 3090 had a smaller increase in CUDA cores over the 3080 than the 4080 16GB has over the 4080 12GB; in a sense they're reducing both the VRAM and the price of the 3090-alike this gen. The 4090 has no analogous Ampere part, IMO, so it'll be interesting to see what kind of performance it gets - the next card down has 41% fewer CUDA cores, so maybe the extra $$ will not seem as wasteful this go around. I dunno. $1700 after tax is way outta my price range for a GPU upgrade either way.

Looking forward to benchmarks, chilling out for now 'til more info is available.

Agreed

lmfao

"Yeah my PC is basically a daughterboard to my graphics card."

PBCrunch posted:

So what happens when you take a super mega graphics card with all the deep learning poo poo on a 1080P monitor and render out at "4K" using DLSS3 or Xess or whatever and then shrink it back down to 1080P.

Does it end up looking better than native 1080P?

nVidia has had supersampling that renders above your monitor's resolution and scales back down since like Kepler; it looks real good, but not as good as a genuinely higher native resolution, for real.
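To answer the "does it end up better than native" question mechanically, here's a toy sketch of what rendering high and averaging back down does to a hard edge (pure illustration; the real driver-side downscaling uses fancier filters than a 2x2 box average):

# Toy supersampling: render at 2x the target in each axis, then average 2x2 blocks back down.
def downsample_2x(image):
    # Average each 2x2 block of a 2H x 2W grid of brightness values down to H x W.
    h, w = len(image) // 2, len(image[0]) // 2
    return [
        [(image[2*y][2*x] + image[2*y][2*x+1] +
          image[2*y+1][2*x] + image[2*y+1][2*x+1]) / 4
         for x in range(w)]
        for y in range(h)
    ]

# A hard black/white edge that falls between the low-res pixels...
hi_res = [[0, 0, 0, 1]] * 4
# ...comes back as a soft 0.5 instead of snapping to one side, which is the anti-aliasing win.
print(downsample_2x(hi_res))  # [[0.0, 0.5], [0.0, 0.5]]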

Agreed

So, "performance" versus "frame generation." What's that all about. Spin sensor going nuts.

Agreed

If the 4080 16GB offers very substantially better performance and more RT games come out that make me think "man, I really want that to run super good but it runs super bad on my 3080 12GB, arrrgh" - like Cyberpunk does if I try to push the rendering resolution any higher than DLSS Ultra Performance with all visuals cranked - I guess I could at least consider it. And if I were to consider it, I might be like "well, it's almost within reach, maybe for a few hundred more bucks the 4090's much bigger CUDA count could potentially be, in some hosed up nebulous sense, 'worth it' this time" and go full stupid. Let me put it this way, stranger things have happened, but it's one hell of an investment in just fuckin' around.

Realistically, though, I'm hoping to just sit this gen out and get good use from my 12900K / 3080 12GB setup until 14th or 15th gen Intel (or the superior AMD competitor, if they're solidly on top by then), build all new with whatever is a good performer at the time, and then pass this PC on to my oldest as a hand-me-down, assuming it's held up pretty well.

Agreed

Yeah, I'm at 4K. I tend to turn the rendering resolution down even in games that don't have DLSS or RT, though, especially when it isn't a big visual difference; I don't mind a few jaggies here and there to drop 150W off my usage. Like BattleTech - it's not exactly trying to be a HOLY poo poo PIXELS kinda game, but at 4K with everything turned up, even at 60fps it will use 350W if it's allowed to on some maps - yet it looks drat near the same at 1440p rendering resolution, at 200W instead.

But some titles really do just kinda beg to have the graphics cranked up, and they can be nice, immersive experiences, so I'll let the card rip now and then on those. The Crysis remasters, The Witcher 3 with mods, Cyberpunk - and some others can get the card working hard too.

As far as newer features go, as I've noted previously, I love ray tracing; it's great and immersive, even if super costly on performance. I like to have it as high as the game will go if I can, and in the titles so far I can, as long as they have DLSS options. I'd rather have a few jaggies from a lower internal resolution than lose the visual pop it gives.

Check and see if you're really getting 100FPS+ with your setup - 1440p with all RT cranked and all settings at max is definitely nowhere near 100FPS for me in that, and while the 12900K isn't THE WORLD'S FASTEST GAMING CPU it's, you know, not dogshit either. Or do you mean 1440p with DLSS enabled also at some level? Curious, if you don't mind checking :)

Agreed fucked around with this message at 03:25 on Sep 21, 2022

Agreed

change my name posted:

I mean I was willing to pay over MSRP for my 3070 because the government cut me a check and I wasn't going outside for a year, they have to acknowledge poo poo has changed

Yeah, with the absence of stimulus, a looming recession, definite increases in cost of living, and the inability to use these things as money printers since crypto ate poo poo and Ethereum went proof of stake, sky-high prices seem ill-timed to say the least. Another factor behind elevated prices last year was the 25% tariff, which isn't in effect now - if the right political factors come together and put something like that back in place, they're gonna sell fuckin' miserably low numbers of these things.

Agreed

I still can't figure out how they're arriving at the 3x performance figure. If that's just factoring in the guessed frames they're interpolating - going from actual performance to "frame generation" like that one graphic I was quizzical about earlier - I'd really like to know the REAL performance difference. How much faster is the 4080 12GB than the 3080 12GB at purely rendering frames with game engines, as opposed to having... whatever this nVidia Super DALL-E thing is... painting them shits as well?
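Purely hypothetical napkin math on how you get to a headline "3x" (the split between real rendering uplift and generated frames is my guess; nVidia hasn't broken it out):

# Hypothetical breakdown of a "3x" marketing figure - every number here is invented for illustration.
old_fps = 60                                   # last-gen card, purely rendered frames
assumed_real_uplift = 1.5                      # guess at the genuine rasterization/RT improvement
rendered_fps = old_fps * assumed_real_uplift   # 90 fps actually rendered by the engine
displayed_fps = rendered_fps * 2               # frame generation inserts one generated frame per real one
print(displayed_fps / old_fps)                 # 3.0 -- the headline multiplier, of which only 1.5x is real rendering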

Agreed fucked around with this message at 00:03 on Sep 22, 2022

Agreed

:psyduck:

Hold up though, how do interpolated frames and a higher frame count actually improve the game's feel if the base framerate would still be below vsync? Is it skipping frames and filling in the blanks it isn't rendering, while still advancing the engine to the next frame via interpolation, to somehow get the engine to perform as though it were running at a higher framerate?

Agreed

repiv posted:

no, the engine ticks at the "internal" framerate and the doubled framerate is purely visual

DLSS3 won't improve input responsiveness, it's purely to make the graphics smoother

I do not see how this is a feature. Or how they're calling that "performance." Does that mean in situations where the engine is putting out, I dunno, 26 fps, it's just going to look smoother while still actually being 26fps as far as input and engine frames are concerned? Like, if the base performance is choking, how is this not going to be... just... smoother choking? I am trying to figure out how this improves gaming in practice.

If a game dips to sub-30fps right now, that clearly sucks and makes the whole thing slower and weird and jerky. If this just makes it so frames keep coming smoothly, but the engine is going all slow and weird, that's going to be bizarre. Like, does it go to super smooth looking slow-mo or something? Still lagging and slowing down for all intents and purposes but not jerky with obvious frame drops?

Once again I must :psyduck: This can't be as bad as I'm imagining or they'd never have focused on it for this generation, right? I must be missing something.
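Putting rough numbers on the "smoother choking" worry (the exact pipeline delay is my assumption; the only solid part is that input still ticks at the real framerate):

# Input response is tied to the internally rendered framerate, not the displayed one.
internal_fps = 26
real_frame_ms = 1000 / internal_fps      # ~38.5 ms between real engine frames
displayed_fps = internal_fps * 2         # ~52 fps shown on screen once generated frames are inserted
# Interpolation needs the *next* real frame before it can build the in-between one,
# so the pipeline holds frames back - assume roughly half a real frame of added delay.
assumed_added_delay_ms = real_frame_ms / 2
print(f"looks like ~{displayed_fps} fps, still feels like ~{real_frame_ms + assumed_added_delay_ms:.0f} ms per input")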

Agreed fucked around with this message at 00:19 on Sep 22, 2022

Agreed

If it's anything like the Motion Plus TVs that interpolate frames, I'm not even slightly interested; I think those make poo poo look weird. I don't see why it's good to have cards do anything more than actually get faster and better at rendering real frames per second. Honestly, maybe I'm getting old, but this seems like such a dud of a feature, and then they use it in the figures to act like it's a big real framerate boost. Whaaaaat?

Forgive me, friends, I'm still an old head gaming at 60fps and thinking that looks pretty good; I'm sure there's someone out there really looking forward to fake 140fps when they can only run at 70fps, or whatever

Agreed

I'm already using two of those little GPU support pillar things from Amazon to keep my 3080 12GB from putting what seems like too much stress on the PCIe slot, and those absolutely monstrous 4090s are hilariously bigger still!

Agreed

The 3080 12GB has great performance in modern games. Even if you're a 4K gamer, you can either use DLSS or lower the internal res to 1440p and it still looks good - you might notice less detail on distant geometry and the occasional jaggy, but in motion it's still very playable. nVidia also has a nice feature to create artificial resolutions between 1440p and 4K, and some games can look great and play great at those (every so often one gets a weird UI with it, like some parts sitting slightly off screen, as if it isn't scaling quite right). I wouldn't hate better performance in Cyberpunk with everything cranked, but that's a tall order; it's a really demanding engine that makes powerful hardware cry.

If the 4080 12GB can meet or beat its performance and bring the new features with lower power draw, it's not like it's a terrible card IMO. Just weirdly segmented and more expensive than predicted.

Agreed fucked around with this message at 19:35 on Sep 22, 2022

Agreed

I went from a 2070 non-Super to a 3080 12GB and it's a massive upgrade if you're trying to keep resolutions and settings high. However, if you're willing to employ DLSS in its performance modes and turn a setting down now and then, the 2070 can feed a 4K screen in a way that isn't ugly (I used it in this same PC with the same 4K display and played a lot of the same games, had plenty of fun, and never thought GOD drat THAT'S UGLY while I was waiting to trade in the 3070 Ti I got initially for a 3080 12GB - more DLSS, a few things turned down, no big deal).

For 1080p/60-90fps it probably ranges from overkill to can-still-handle-it, outside of maybe Cyberpunk or really cutting-edge engines, and even then, if they scale well, it'll hold up. My son games on the PC the 2070 is in now and plays modern titles on a 1080p screen - it kicks rear end still. It'd probably handle 1440p in most titles pretty well too, if not maxed, so long as you aren't intending super high refresh rate gaming.

Dr. Videogames' advice to not upgrade 'til you have trouble running something seems wise to me.

Agreed fucked around with this message at 06:12 on Sep 23, 2022

Agreed

GPU mining is dead, nobody is profitable, and that leaves miners trying to compete with other miners equally desperate to unload their inventory at whatever price they can get before the market totally bottoms out, so yeah, I'd imagine you can score some great cards cheap. Beware of 3090s with uncooled rear VRAM, though - I saw posts during the crypto boom with people figuring out how to bypass fried memory so they could keep mining on the chips that still had active cooling - but this could be the time to get a used high end card on the cheap if you were waiting.

I'd normally be like "and remember, with EVGA the warranty follows the card, not the original buyer, so those might be a better choice" but now with EVGA getting out of GPUs, :shrug: Maybe still relevant for the moment while they still have stock to take care of current warranty-holders?

Agreed

Metro Exodus Enhanced is so good. I hope S.T.A.L.K.E.R. 2 is like that, but, you know, stalker-y.

Agreed

cheesetriangles posted:

I wouldn't count on Stalker 2 ever existing.

A man can hope!

Agreed

Already updated, d'oh

Agreed

can you get Macho Man Randy Savage as the dragons in Skyrim yet on consoles

Agreed

I'm not impressed with the frame generation we've seen so far. I think it's a bullshit technology that exists to hand them marketing "performance improvements," because the actual generation-to-generation improvement here is less impressive without just going balls to the wall with power consumption and the biggest chip you can put on a card.

DLSS upscaling is awesome and a real improvement over simply lowering the rendering resolution and letting your monitor/TV or GPU scale it using the old ways, and to me is just better than competing upscaling methods, especially in motion and with certain details. I think most people can appreciate it and how it can make a gameplay experience much better without sacrificing settings or making the whole frame a jagged aliased mess.

But making fake frames and calling that real performance is dubious as hell IMO; you can say it leaves past generations behind, but I totally disagree.

Agreed fucked around with this message at 20:27 on Sep 30, 2022

Agreed

Animal posted:

I think it has its place, especially in games like Flight Simulator where the motion vector is not rapidly changing

I guess I should reserve judgment for use cases like that, but to me, all the marketing material claiming huge increases in actual performance - when many of the extra frames are artificial ones that weren't rendered by the engine, add latency, may gently caress up UI elements, etc. - is funny math. I think they ought to show the apples-to-apples performance improvements and claim those, rather than invent this new "and now, with fake frames!!" tech and present it to the press as a massive uplift.

Agreed

Jesus loving christ that card is huge

Agreed

3080 12GB is holding it down for me just fine over here, not too worried about getting a 4090 at this time.

Enjoy it, though, big spenders! If the IRS ever gets me my fuckin' tax refund for 2021 things may change, but until such time as that, welp. Hey, if any of you do get such outrageous goods, designed for as much power as can be put through them, let me know how they undervolt, since I'm sure that's next on the priority list for y'all

Agreed

Enos Cabell posted:

4090 is a beast of a card, but I can't imagine a reason to upgrade from a 3080ti. Was worried I'd have some serious FOMO this morning, but I think waiting until the 5 series will be fine.

Yeah, I'm looking for a bigger perf/$ figure before I look to upgrade from my 3080 12GB. Maybe the likely refresh will be more enticing in that regard, just have to wait and see.

Edit: Nothing I've seen about this frame generation tech makes it look good to me. I don't anticipate using that ever. Maybe that's saying too much, I'm sure they'll improve it as time goes on - if it's ever attractive I guess I'll reevaluate but for the time being, blech.

Agreed fucked around with this message at 17:26 on Oct 12, 2022

Agreed

Edit: ^^^^ well, you win


https://www.youtube.com/watch?v=8k5oAaXINX8

Agreed

I like Cyberpunk - where else are you going to pretend to be a shadowrunner in an FPS? (disregarding the lame 2007 Shadowrun FPS)

Agreed

$1 in 2016 is about $1.20 in 2022, so that was looking like $1000 for that performance tier of card expressed in 2022 bux. In 2008 the GTX 280 launch price was $649, which is almost $900 today. Back in the GeForce4 era I remember the Ti was like $399, which would be almost $700 today, and that was a much simpler design that did not require nearly as much investment to get into production in the first place.

I think it's very fair to say that trends in computing demanding ever higher performance, even as it gets harder to squeeze out of designs, are indeed putting more and more upward pressure on price. It does kinda suck to be out of the golden age when improvements were comparatively much cheaper to achieve.
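Rough inflation math behind those figures, if anyone wants to poke at it (the CPI multipliers are rounded assumptions, not exact BLS numbers):

# Approximate multipliers to convert launch-year dollars to 2022 dollars (rounded assumptions).
to_2022 = {2016: 1.20, 2008: 1.37, 2002: 1.67}

for year, launch_price, name in [(2008, 649, "GTX 280"), (2002, 399, "GeForce4 Ti")]:
    print(f"${launch_price} {name} ({year}) is roughly ${launch_price * to_2022[year]:.0f} in 2022 dollars")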

Agreed fucked around with this message at 17:46 on Oct 12, 2022

Agreed

High end PCs have been expensive to make or buy the whole time, though. I mean, we're talking about the 4090, which is the supreme flagship of the lineup, and I feel it is fair to compare its cost to what the flagship configuration was in years past. Like, two GeForce 8800 GTXes at $599 each in late 2006 - that'd be more than $1800 after tax today. And people definitely ran those in SLI; I remember my cousin joined the Navy, and with the money he was making without day-to-day expenses he went all in on a build back then. It would run any game you liked at max settings for the era, but it was mad expensive. Hell, I remember putting like $3500 into a PC in 2008 to build a powerful quad core setup with what was a lot of RAM for the time and a GTX 280 (after trying the dual-GPU Radeon top end card of the era, which I hated). And that price tag was AFTER competition brought the GTX 280 itself down to a lower price.*

The best toys cost a lot and always have. There are usually nice performers at better prices, but right now all we can see from this generation are the 4090, the 4080, and the 4070 obviously hiding in that 12GB model (I MEAN, look at the performance gulf, wtf nVidia), so the stack is skewed toward the top.



*Salient factor, right: AMD is competing with regard to general rendering performance, but nVidia has the killer app of this era in RTX (and arguably DLSS, since given the performance cost of ray tracing it's kind of the thing that lets RTX work in practice), and they are currently leveraging that advantage. If AMD can put up fiercer competition there, I would expect to see more price pressure on nVidia.

Agreed fucked around with this message at 18:10 on Oct 12, 2022

Agreed

KYOON GRIFFEY JR posted:

When did high end PC hardware cater to anyone else?

I think some general unease over the progression of late stage capitalism is creeping into the GPU discussion, honestly. Yeah, there's a lot of poo poo that is wildly unsustainable and it's just a matter of time until poo poo hits the fan, but high end PC hardware in particular has never been value-oriented at the time of its launch. Look at the launch price of a Pentium II, if all my examples of high end video card setups of yesteryear don't move you. Buying power takes a hit when inflation outpaces wages, for sure, and in an environment where the cost of living has gone up in various ways without commensurate increases in earnings for a lot of folks, I can understand feeling like there's an impossible barrier to getting the newest high end stuff. But the newest high end stuff has always been really expensive in its own era; this is not new. Don't get a 4090 - wait and see what a 4060 looks like for a deal more in line with past expectations. Maxing settings at the highest common resolutions of the day using the highest end parts available has been an expensive endeavor the whole time.

Agreed

No they weren't, the prices already went way down from that point.

Listen, my man, I am not trying to be a dick but can you point to the time you think it was affordable to buy the newest, highest end hardware for a PC and there was no cost barrier to lower income earners?

Agreed

https://www.youtube.com/watch?v=g1Sq1Nr58hM

Agreed

I walked past the GPU aisle on Amazon Prime Day and went to the Headphone Hut to get some HD 660 S - I mean, poo poo, I haven't had a new pair of Senns in more than a decade, I'll see what they're up to. A big reason I am not getting a 4090, or really worried about it even though I do game at 4K, is that I can't justify apportioning that much discretionary budget away when there are still a lot of guitar pedals and music gear I haven't tried :mad:

Hopefully there's either a dynamite refresh later on, or the 5 series is awesome (or AMD steps up, I'm always rooting for them to but it just seems like GPUs aren't the main priority over there).

Agreed fucked around with this message at 00:22 on Oct 13, 2022

Agreed

Animal posted:

Man, even you are afflicted with a bit of 4090 FOMO

I dunno, man, I haven't paid that much for any GPU setup ever. Back when SLI was the thing, I typically built with the fastest single GPU I could get, because I wanted high performance but dual GPU had already proved to be bullshit for me when I tried it: incompatibility issues and just not the smooth n' straightforward experience I wanted. I am glad that these days you don't have to go dual GPU to get radical-tier performance, but I'm happy with how the 3080 12GB is performing for me at half the price for now, and mainly looking forward to when a similar expenditure can meaningfully improve on it. I'm still just pushing 60 FPS, and very willing to use upscaling.

The industry desperately needs a new title or two to push the graphical envelope; I'm seeing Cyberpunk being used everywhere to promote Ada cards, deja vu.

Agreed

lmao they're "unlaunching" it because it got fully panned for obvious reasons

may you have at least this much shame every time you try something dumb like that, hey at least that means the only actual 4080 is a real upgrade now

Agreed

Subjunctive posted:

My 4090 is unusable in Windows with all the blue screening. I guess I need to wait for updated drivers or something. Sucks.

Are you sure it's a driver issue and not a bum card? I'm on the new drivers with my 3080 12GB and I haven't had the first bit of instability.

Agreed

I'm fully cheering for Intel to Actually Do Something in GPUs here; it'd be great for everyone if we had a strong third competitor in the field, and better encoding is a start, I guess.

Re: "3080 or not" chat above, I had the 3070 Ti at first, used my old 2070 non-super while it was in flight returning, and upgraded in the end to a 3080 12GB and holy poo poo it was super worth it. The 3080 is the sweet spot for my performance needs, the others were compromise territory. Getting one for $300 is sick as gently caress, mine cost $800 (ok, long story but I actually only ended up paying $699 for it in the end, still $300 smokes that).


Agreed

buglord posted:

Did anyone else feel like raytracing was a bit of a letdown for their personal use? I really thought my 3070 was gonna be playing a bunch of RTX enabled games but that never really panned out for me. DLSS is fantastic though.

Raytracing is dope to me actually, and adds a lot to my sense of immersion & also the part of my brain that has always enjoyed turning on shiny poo poo.
