DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Nfcknblvbl posted:

If the bar here is set to see all new cards being the same price as the ones they replaced, then that's pretty low. Why should it be such a big deal they didn't raise the prices?

Because thanks to inflation it's actually lower priced in terms of purchasing power, they have frequently raised prices gen-to-gen in the past, and with questionable competition from AMD, they had an option to bump prices again and people would by and large have just paid them their asking price.

Getting upwards of double the performance for the same price is a pretty great deal, frankly.


Arzachel
May 12, 2012

Nfcknblvbl posted:

If the bar here is set to see all new cards being the same price as the ones they replaced, then that's pretty low. Why should it be such a big deal they didn't raise the prices?

Because they've done it every time they could get away with it and this was a prime opportunity with the hype around Cyberpunk and the new console cycle.

Nfcknblvbl
Jul 15, 2002

Since the double performance claims require DLSS 2.0 and RTX, I bet the jump is more like 25%-30% in just rasterization. After 2 years of inflation maybe the effective cost is reduced by 7%. Perhaps the sweet deal Nvidia got from Samsung 8nm was passed along to the consumer.
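The inflation adjustment being gestured at here is easy to sketch. The rates below are illustrative assumptions, not CPI data; note that a ~7% effective reduction over two years implies roughly 3.5% annual inflation:

```python
# Back-of-envelope real-price math for a same-MSRP generational launch.
# Inflation rates are illustrative assumptions, not CPI figures; the
# post's ~7% effective reduction over two years implies ~3.5%/year.

def real_price(msrp: float, annual_inflation: float, years: float) -> float:
    """Deflate a nominal price into dollars of `years` ago."""
    return msrp / (1 + annual_inflation) ** years

msrp = 699.0  # 2080 (2018) and 3080 (2020) nominal launch price
for rate in (0.02, 0.035):
    eff = real_price(msrp, rate, 2)
    print(f"{rate:.1%}/yr inflation -> ${eff:.2f}, "
          f"{(1 - eff / msrp):.1%} cheaper in real terms")
```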

Truga
May 4, 2014
Lipstick Apathy

Nfcknblvbl posted:

If the bar here is set to see all new cards being the same price as the ones they replaced, then that's pretty low. Why should it be such a big deal they didn't raise the prices?

3070 being 500 instead of 350 is still lovely, but with no big competition and the ~2x experience~ i honestly expected more lol

repiv
Aug 13, 2009

Nfcknblvbl posted:

Since the double performance claims require DLSS 2.0 and RTX, I bet the jump is more like 25%-30% in just rasterization.

I know it's cherry picked but we saw Doom Eternal running very nearly twice as fast, that's pure raster gains

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Some of the games DF tested don't have RT or DLSS and they showed the same 70-90% performance increase. Granted, cherry picked tests and all but we're also limited to older and current games that tend to not capitalize on architectural improvements as much. I think in general we're looking at something like 80% generation over generation performance increase at the same price point, which is really nice. The $700 2080 was basically flat with the $700 1080Ti at launch. Now we're looking at something more like the 1080Ti launch where a $700 GPU got you a similarly large gain from a $650 980Ti. That's why people are happy right now.
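The gen-over-gen comparison in this post reduces to a simple perf-per-dollar ratio. The relative performance numbers below are the post's rough figures, not measured benchmarks:

```python
# Rough perf-per-dollar comparison across launches, using the relative
# performance figures from the post (illustrative, not measured data).

def perf_per_dollar_gain(new_perf, new_price, old_perf, old_price):
    """Ratio of the new card's perf/$ to the old card's perf/$."""
    return (new_perf / new_price) / (old_perf / old_price)

# 2080 vs 1080 Ti: roughly flat performance at the same $699.
print(perf_per_dollar_gain(1.0, 699, 1.0, 699))  # ~1.0, no gain

# 3080 vs 2080: ~1.8x performance at the same $699 (the post's estimate).
print(perf_per_dollar_gain(1.8, 699, 1.0, 699))  # ~1.8x perf/$
```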

Nfcknblvbl
Jul 15, 2002

My bad, I assumed incorrectly that all those tests used either DLSS or RTX.

space marine todd
Nov 7, 2014



DrDork posted:

The value is likely two-fold:

(1) They can advertise the bumped number to investors and dev teams to help get around the chicken-and-egg issue of setting up a new service. No one wants to do the work of moving a game over to something like Now if no one is gonna play it, and no one's gonna get Now if there are no games worth playing. This at least will give them a larger playerbase on paper, if probably not in reality.

(2) Some of those people might stick around and actually pick up a paid subscription later. Assuming running the service doesn't cost them too much money in the interim, that may be a net win for them in the long run.

But, yeah, I think it would have been a more worthwhile pack-in a year or two ago.

Yeah, great point. The other benefit is with really big games, where users are able to play right away instead of waiting for hours to download the dang thing.

Arzachel
May 12, 2012

K8.0 posted:

Some of the games DF tested don't have RT or DLSS and they showed the same 70-90% performance increase. Granted, cherry picked tests and all but we're also limited to older and current games that tend to not capitalize on architectural improvements as much. I think in general we're looking at something like 80% generation over generation performance increase at the same price point, which is really nice. The $700 2080 was basically flat with the $700 1080Ti at launch. Now we're looking at something more like the 1080Ti launch where a $700 GPU got you a similarly large gain from a $650 980Ti. That's why people are happy right now.

Control had DLSS disabled, I wouldn't be surprised if that actually shrinks the delta.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Nfcknblvbl posted:

Since the double performance claims require DLSS 2.0 and RTX, I bet the jump is more like 25%-30% in just rasterization. After 2 years of inflation maybe the effective cost is reduced by 7%. Perhaps the sweet deal Nvidia got from Samsung 8nm was passed along to the consumer.

Except DF already has a 3080 and they get 70-100% faster raster performance than the 2080

Efb

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

K8.0 posted:

Some of the games DF tested don't have RT or DLSS and they showed the same 70-90% performance increase. Granted, cherry picked tests and all but we're also limited to older and current games that tend to not capitalize on architectural improvements as much. I think in general we're looking at something like 80% generation over generation performance increase at the same price point, which is really nice. The $700 2080 was basically flat with the $700 1080Ti at launch. Now we're looking at something more like the 1080Ti launch where a $700 GPU got you a similarly large gain from a $650 980Ti. That's why people are happy right now.

repiv posted:

I know it's cherry picked but we saw Doom Eternal running very nearly twice as fast, that's pure raster gains

Yeah, usual disclaimers about "wait for independent reviews" apply as always, but unless they're just outright LYING about the hardware specs, this release is going to be a big generational jump in raw power even before you talk about DLSS 2 and RTX poo poo.

Dr. Fishopolis
Aug 31, 2004

ROBOT
Imma throw out some wild rear end speculation here but I think big navi is gonna be a raster monster, easily outstripping Nvidia in raw frames. I also think the driver suite is going to be an unmitigated catastrophe, they're going to promise near feature parity with Nvidia and deliver on maybe half that out of the gate. Your choices will be to go with AMD for raw power and hope to god they get their software poo poo together, or Nvidia for worse price/performance but it actually functions as advertised.

anyway those are my feelings like comment subscribe

Kraftwerk
Aug 13, 2011
i do not have 10,000 bircoins, please stop asking

I think Ampere will be what Pascal was when it first came out. I mean I don't "need" to upgrade my 1070 and I imagine barring some major AAA releases the 1080Ti will still hold strong for at least 1-2 more years. But if you pick up Ampere now you'll be covered for the entire upcoming console generation in my opinion.


ATI/AMD definitely come out with some amazing looking cards (on paper) and it's a drat shame about their drivers. You'd think that Lisa could bring in some top talent with all that Ryzen money so they can compete better.

I have this fantasy of AMD coming up with TSMC derived brute force solutions to problems that makes the hardware a computing monster and achieves performance comparable to Nvidia cards but through entirely different methods than what Nvidia does.

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Dr. Fishopolis posted:

Imma throw out some wild rear end speculation here but I think big navi is gonna be a raster monster, easily outstripping Nvidia in raw frames. I also think the driver suite is going to be an unmitigated catastrophe, they're going to promise near feature parity with Nvidia and deliver on maybe half that out of the gate. Your choices will be to go with AMD for raw power and hope to god they get their software poo poo together, or Nvidia for worse price/performance but it actually functions as advertised.

anyway those are my feelings like comment subscribe

My Hot Take is they're going to peak at selling you something that's basically PS5/XBX parity + X% clocks because they can cool it better in a 3 slot multi-fan 305mm monster card than in a console.

e: With shitass drivers and no DLSS 2.0 equivalent.

KYOON GRIFFEY JR
Apr 12, 2010



Runner-up, TRP Sack Race 2021/22

mcbexx posted:

Is this snark? Graciously admitting defeat while trying to save face?
I can't even tell anymore.

https://twitter.com/AzorFrank/status/1301173699974967296

my bet is a reasonably cold-blooded recognition that 30x0 appearing to be extremely good and relatively cheap will cause a lot of people to drop cash on PC gaming, which AMD is very strong in right now from a processor perspective. AMD will probably get a rev bump in processors out of the launch. shame they don't have Zen 3 ready to roll though.

exquisite tea
Apr 21, 2007

Carly shook her glass, willing the ice to melt. "You still haven't told me what the mission is."

She leaned forward. "We are going to assassinate the bad men of Hollywood."


Some people have said "well these Nvidia benchmarks include DLSS" but everyone should hope DLSS 2.0 becomes more widespread, as it's 30-40% more performance for basically nothing. Fortnite getting it is a good sign that it may become an integrated feature for Unreal Engine, which about a bazillion games use.

MarcusSA
Sep 23, 2007

exquisite tea posted:

Some people have said "well these Nvidia benchmarks include DLSS" but everyone should hope DLSS 2.0 becomes more widespread, as it's 30-40% more performance for basically nothing. Fortnite getting it is a good sign that it may become an integrated feature for Unreal Engine, which about a bazillion games use.

Yeah DLSS is pretty cool and good from what I've seen of it.

Nfcknblvbl
Jul 15, 2002

KYOON GRIFFEY JR posted:

shame they don't have Zen 3 ready to roll though.

It's a good thing Zen 3 works on x570 and B550 boards. I think I might buy a 'cheap' Ryzen 5 2600 in the interim, or perhaps something used off of eBay.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Malcolm XML posted:

only amd has pcie 4.0 on the desktop and they already own the diy market.

AMD makes $ if you buy a zen cpu, x570 mobo and nvidia GPU. Similarly, nVidia is agnostic to intel v amd except insofar as intel has tried to kneecap non-CPUs with lovely IO/segmentation/integrated gpus

Do they own the DIY market? Even as of late I've still been getting intel CPUs because I thought they had the highest gaming FPS vs Ryzen?

VorpalFish posted:

Good, so there's a chance it won't pull 300w and can fit in smaller cases...

I mean the TGP is 220w which means it's not supposed to pull any more than that if you're running it stock.

The TDP on the 1080 Ti was 250w and Zotac managed to cram it into an mITX size, so I'd think they'll do the same for the 3070: https://www.zotac.com/us/product/graphics_card/zotac-geforce-gtx-1080-ti-mini

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Kraftwerk posted:

ATI/AMD definitely come out with some amazing looking cards (on paper) and it's a drat shame about their drivers. You'd think that Lisa could bring in some top talent with all that Ryzen money so they can compete better.

I'm sure she's trying, but that Ryzen money has only really existed for a year or two now, and that's not a lot of time to actually put out hiring asks, find people, onboard them, and then actually start fixing poo poo. Whatever's gonna launch with these cards will be based on the team from pre-Ryzen $$$. The next iteration, on the other hand, will likely have had enough time to have that cash do some good.

Voxx
Jul 28, 2009

I'll give 'em a hold
and a break to breathe
And if they can't play nice
I won't play with 'em at all

KYOON GRIFFEY JR posted:

AMD will probably get a rev bump in processors out of the launch. shame they don't have Zen 3 ready to roll though.

they are in a similar boat where zen3 needs to have a solid kick if they want to get people off their current stuff and upgrading

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Voxx posted:

they are in a similar boat where zen3 needs to have a solid kick if they want to get people off their current stuff and upgrading

Personally, I'm still rocking a 5820k, so it's a race between Zen 3 and Rocket Lake to see which one gets my money this fall.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

DrDork posted:

I'm sure she's trying, but that Ryzen money has only really existed for a year or two now, and that's not a lot of time to actually put out hiring asks, find people, onboard them, and then actually start fixing poo poo. Whatever's gonna launch with these cards will be based on the team from pre-Ryzen $$$. The next iteration, on the other hand, will likely have had enough time to have that cash do some good.

Yeah, they'd have to compete with Nvidia's first stab at multi-chip GPUs

No pressure

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin



A taste of 3090 performance at the stupid resolution of 7680x4320. Settings are 'high with ray tracing enabled' where applicable.
As you're obviously utterly GPU limited in this scenario, we should probably be able to infer something about 3840x2160 performance too, if that's your thing.

AirRaid
Dec 21, 2004

Nose Manual + Super Sonic Spin Attack

Riflen posted:




A taste of 3090 performance at the stupid resolution of 7680x4320. Settings are 'high with ray tracing enabled' where applicable.
As you're obviously utterly GPU limited in this scenario, we should probably be able to infer something about 3840x2160 performance too, if that's your thing.

Loving the Control bar, which looks like 6 or 7 FPS without DLSS, 57FPS with. If anyone needed a clue as to how powerful DLSS2 is.
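The ~8x jump in that Control bar tracks with how much smaller the internal render is. A sketch of the pixel-count math, with per-axis scale factors matching the commonly cited DLSS 2.x modes (treat the exact factors as assumptions; they can vary per title):

```python
# DLSS renders at a lower internal resolution and reconstructs the output.
# Per-axis scale factors below match the commonly cited DLSS 2.x modes
# (assumptions for illustration; exact factors can vary per title).

MODES = {
    "quality": 1 / 1.5,            # ~67% per axis
    "performance": 1 / 2.0,        # 50% per axis
    "ultra_performance": 1 / 3.0,  # ~33% per axis, aimed at 8K output
}

def internal_resolution(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

w, h = internal_resolution(7680, 4320, "ultra_performance")
print(f"8K ultra-performance internal render: {w}x{h}")   # 2560x1440
print(f"Pixel-count reduction: {(7680 * 4320) / (w * h):.0f}x")  # 9x
```

A 9x reduction in shaded pixels lines up with a ~6-7 FPS native result turning into ~57 FPS with DLSS, before reconstruction overhead.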

sean10mm
Jun 29, 2005

It's a Mad, Mad, Mad, MAD-2R World

Zero VGS posted:

Do they own the DIY market? Even as of late I've still been getting intel CPUs because I thought they had the highest gaming FPS vs Ryzen?

There's a reason CPU FPS is always tested at 1080p: lower resolutions are the only place you get separation between CPUs. Above that, everything is pretty flat because you're GPU bound anyway. Even then you're often comparing 110 FPS to 100 FPS or something.

So Intel wins in a scenario where, if you're spending real money on a gaming system, you're probably not playing. And if you're budget constrained you're still probably better off spending the price difference between AMD and Intel builds on other parts of your system.

Plus a lot of people have to do other poo poo with their PCs too, at which point AMD is just flatly a better buy usually.
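The "GPU bound" argument above can be sketched with a simple bottleneck model: frame rate is limited by whichever of the CPU or GPU takes longer per frame. The frame times below are made-up numbers chosen to illustrate the effect:

```python
# Simple bottleneck model: frame rate is set by whichever of the CPU or
# GPU takes longer per frame. Frame times are made-up illustrative numbers.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 8.0, 10.0   # per-frame CPU cost for two CPUs
gpu_1080p, gpu_4k = 6.0, 20.0    # per-frame GPU cost by resolution

# At 1080p the CPU is the bottleneck, so the faster CPU shows a gap:
print(fps(cpu_fast, gpu_1080p), fps(cpu_slow, gpu_1080p))  # 125.0 100.0

# At 4K the GPU dominates and both CPUs land on the same frame rate:
print(fps(cpu_fast, gpu_4k), fps(cpu_slow, gpu_4k))        # 50.0 50.0
```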

LimburgLimbo
Feb 10, 2008

Zero VGS posted:

Do they own the DIY market? Even as of late I've still been getting intel CPUs because I thought they had the highest gaming FPS vs Ryzen?

They've taken a considerable lead in value for the money over Intel, and the next gen is liable to expand that lead. Still lots of old Intels out there, but Intel is getting completely hammered; even conservative estimates show AMD consistently gaining market share for the last two years, and some individual retailers report things like *90%* of sales being AMD CPUs. It's not even close right now.

At the very high end Intel has a small FPS benefit over similar tiers of AMD, but you pay out the rear end and most games are not meaningfully CPU bound now anyway. Buying an AMD and putting as much as you can afford into GPU is the name of the game, unless you literally have no budget limits and are throwing everything at every last marginal gain specifically for gaming (or probably some other niche poo poo).

chaleski
Apr 25, 2014

Now that I have an idea of what GPU I want I need AMD and Intel to hurry up with their CPUs, I'm getting antsy to get my new build going :dogcited:

Thoom
Jan 12, 2004

LUIGI SMASH!

sean10mm posted:

There's a reason CPU FPS is always tested at 1080p, because lower resolutions are the only place you have separation between CPUs. Over that everything is pretty flat because you're GPU bound anyway. Even then you're often comparing 110 FPS to 100 FPS or something.

I wonder how long this will be true given the massive uplift in CPU power available to the next gen consoles. Even a lovely desktop CPU can run red rings of death around a Jaguar, but it's harder to get that kind of a delta over the roughly 3700X equivalent in the PS5.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
this focus on 8k is literally everything that's wrong with their initial 30xx presentation wrapped into one chart. Like I love you Nvidia, stop playing

repiv
Aug 13, 2009

If anyone was wondering, Ampere's HDMI 2.1 controller is the real deal: full 48Gbps plus DSC support

https://twitter.com/ghost_motley/status/1301219502898769920
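For a sense of what those link rates buy you, here's a rough uncompressed-bandwidth estimate. The ~1.08x blanking overhead is an approximation (real CTA-861 timings vary per mode), so treat the result as ballpark:

```python
# Rough HDMI signal-rate estimate for an uncompressed video mode.
# The ~1.08x blanking overhead is an approximation; real CTA-861
# timings vary per mode, so these figures are ballpark only.

def signal_gbps(w, h, hz, bits_per_channel, blanking_overhead=1.08):
    bits_per_pixel = bits_per_channel * 3  # 4:4:4 RGB/YCbCr
    return w * h * hz * bits_per_pixel * blanking_overhead / 1e9

rate = signal_gbps(3840, 2160, 120, 10)
print(f"4K120 10-bit 4:4:4 needs ~{rate:.1f} Gbps")    # ~32 Gbps
print(f"Fits in HDMI 2.1's 48 Gbps FRL: {rate <= 48}")
```

This is why the later back-and-forth about 40Gbps vs 48Gbps receivers is largely academic for a 4K120 panel: the fastest mode it accepts needs roughly 32Gbps either way.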

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I get their rationale, though. When you're selling a sports car, most people don't want to hear about handling characteristics in the reveal, they just want to know how fast it goes and what it looks like. The fine print comes later.

That being said, as a prospective 3090 buyer, I need to know handling characteristics before I sign on the dotted line.

I also wonder how many 3080 and 3090 buyers are planning on buying two and using the sale of one to partially pay for the other. >.>

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
It's kinda funny because the new LG OLED this year doesn't support the full 48gbps. I mean it doesn't really matter a ton in practice but it's still funny that you actually need last year's C9 if you want to match the pipe.

BIG HEADLINE posted:

I get their rationale, though. When you're selling a sports car, most people don't want to hear about handling characteristics in the reveal, they just want to know how fast it goes and what it looks like. The fine print comes later.

I suppose, but Nvidia is partnered with LG OLED for their TV integration, and the cheapest 8k OLED from LG is $20,000. By the time real people, even the average 3090 buyer, can use the tech that they're describing we'll be at least two generations past the 3090.

If someone says a car can do 0-60 in 2 seconds, I can buy that car and go drive it that fast right now. Nvidia is pretty much going "this will go 0-60 in 2 seconds on quantum roads that will be available in 2032". OK, but by then there will be a faster car for the same price, and it's not even segmented generationally. It's just weird.

repiv posted:

who cares that the hdmi receiver maxes out at 40gbps when the fastest mode the panel supports only requires 32gbps

I've actually seen LG get a lot of flak for decreasing the pipe in enthusiast circles. I did go out of my way to say it's academic though.

Taima fucked around with this message at 21:16 on Sep 2, 2020

repiv
Aug 13, 2009

Taima posted:

It's kinda funny because the new LG OLED this year doesn't support the full 48gbps. I mean it doesn't really matter a ton in practice but it's still funny that you actually need last year's C9 if you want to match the pipe.

who cares that the hdmi receiver maxes out at 40gbps when the fastest mode the panel supports only requires 32gbps

KYOON GRIFFEY JR
Apr 12, 2010



Runner-up, TRP Sack Race 2021/22

Thoom posted:

I wonder how long this will be true given the massive uplift in CPU power available to the next gen consoles. Even a lovely desktop CPU can run red rings of death around a Jaguar, but it's harder to get that kind of a delta over the roughly 3700X equivalent in the PS5.

yeah but if you're buying a CPU today you can figure that a 3700X will be just fine for the foreseeable lifecycle of the CPU.

KingKapalone
Dec 20, 2005
1/16 Native American + 1/2 Hungarian = Totally Badass
Any more info on the need for PCIe 4.0 yet? Watched a video from HWUnboxed a few pages back and it sounds like it might only be needed for 4K? I got a B450 + 3600 last year and plan to get a 3080 so don't want to upgrade.

I'm hoping to see more info about PSU requirements too since I also got a new Supersonic G3 550W PSU last year. Needing to upgrade that would be annoying since PSUs have gone up in price and are selling out.

repiv
Aug 13, 2009

Taima posted:

I've actually seen LG get a lot of flak for decreasing the pipe in enthusiast circles. I did go out of my way to say it's academic though.

imagining some nerd absolutely seething that they can't feed a 12bit signal (of which no source material currently exists) into their 10bit television (which would dither it down to 10bit anyway)

MarcusSA
Sep 23, 2007

KingKapalone posted:

Any more info on the need for PCIe 4.0 yet? Watched a video from HWUnboxed a few pages back and it sounds like it might only be needed for 4K? I got a B450 + 3600 last year and plan to get a 3080 so don't want to upgrade.

I'm hoping to see more info about PSU requirements too since I also got a new Supersonic G3 550W PSU last year. Needing to upgrade that would be annoying since PSUs have gone up in price and are selling out.

I mean it's more for future proofing at this point. If you get the 3080, it's going to last you long enough that by the time you do upgrade, 4.0 might actually mean something, but that's a good few years (probably 3 at least) away.

Perfect Potato
Mar 4, 2009

KingKapalone posted:

Any more info on the need for PCIe 4.0 yet? Watched a video from HWUnboxed a few pages back and it sounds like it might only be needed for 4K? I got a B450 + 3600 last year and plan to get a 3080 so don't want to upgrade.

I'm hoping to see more info about PSU requirements too since I also got a new Supersonic G3 550W PSU last year. Needing to upgrade that would be annoying since PSUs have gone up in price and are selling out.

Going by the system reqs and the DF video, Nvidia aren't even pushing Ryzen CPUs with it, so you're probably not going to see much in the way of fps differences between 3.0 and 4.0 in current games, at least outside margin of error. 550W isn't enough for a 3080 though; total draw will probably be something like 540-550W overclocked on a mid-range Ryzen, so you want at least 650W.
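The PSU question comes down to a quick power budget. The component figures below are rough assumptions (the 3080's 320W TGP is Nvidia's spec; the rest are ballpark), and the 20% headroom is a common rule of thumb rather than a hard requirement:

```python
# Quick power-budget sanity check for pairing a GPU with an existing PSU.
# Component wattages are rough assumptions; check your own parts' specs.

def psu_ok(psu_watts, parts, headroom=0.20):
    """Return (fits, total_draw): fits is True when total draw leaves
    at least `headroom` margin below the PSU's rated wattage."""
    total = sum(parts.values())
    return total <= psu_watts * (1 - headroom), total

parts = {
    "RTX 3080 (TGP)": 320,
    "mid-range Ryzen CPU": 95,
    "motherboard/RAM/SSD/fans": 75,
}

ok, total = psu_ok(550, parts)
print(f"Estimated draw: {total} W; 550 W PSU adequate: {ok}")
ok, total = psu_ok(650, parts)
print(f"Estimated draw: {total} W; 650 W PSU adequate: {ok}")
```

Under these assumptions a 550W unit lands inside the margin while 650W clears it comfortably, which matches the recommendation above.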


Josh Lyman
May 24, 2009


The Nvidia 3090 release is a joke, and so many people are now gonna bite bc it doesn’t say Titan on the box :thunk:
