apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!
How do AMD get away with a 17% performance increase by reprogramming the BIOS on a brand new SKU?

Surely, if they'd done their R&D/design/testing properly then the card would be tuned to deliver its very best in terms of output/noise.

If you can just add an extra 17% performance to a factory card, then it wasn't put out with an optimal setup in the first place. Or will this have a detrimental effect on the life of the card?

orcane
Jun 13, 2012

Fun Shoe

apropos man posted:

How do AMD get away with a 17% performance increase by reprogramming the BIOS on a brand new SKU?

Surely, if they'd done their R&D/design/testing properly then the card would be tuned to deliver its very best in terms of output/noise.

If you can just add an extra 17% performance to a factory card, then it wasn't put out with an optimal setup in the first place. Or will this have a detrimental effect on the life of the card?
You don't always want to tune new products to run as well as possible if that means getting too close to another product you still want to sell (in this case the RX 5600 XT was supposed to beat the 1660 Super/Ti but wasn't allowed to get too close to the RX 5700). It does mean the new settings weren't as thoroughly tested as the old ones, that much is true.

It's a bit of a clusterfuck, because apparently AMD surprised even its board partners. PowerColor, for example, used their RX 5700 cooler for the Red Dragon but with one fewer heatpipe, which they will apparently add back in future production runs (the card does fine without it, though). While I'm sure most board partners will offer some replacement/upgrade deal, buyers of the initial shipments risk getting inferior cards with coolers optimized for a slower card, and a slower BIOS. Also, the cards haven't been properly validated at the new clocks yet, so at least in the first weeks there could be stability issues for cards with the enhanced BIOS (so people can blame "awful AMD drivers" again :v:). Not all manufacturers will raise their VRAM clocks, and those that do currently have cards that are occasionally unstable, according to the guy who used to run Tom's Hardware Germany (he got the cards stable just by reducing the VRAM clocks by 50 MHz, though, so it doesn't take much to fix).

It shouldn't have an impact on the cards' longevity in general. The GPU clocks are no big deal (the cards adjust those based on TDP, temperatures, etc. anyway), and reportedly all cards used VRAM rated for the new speed to begin with (it was just limited to lower clocks in software). And if manufacturers want to keep their cut-down cooling solutions, they're free not to raise the clocks.
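
To put some numbers on where a gain like that comes from: the reported BIOS change bumps the 5600 XT's GDDR6 from 12 to 14 Gbps on its 192-bit bus, which is a ~17% bandwidth increase all by itself. A back-of-envelope sketch (the per-pin rates and bus width are the commonly reported specs, assumed here for illustration):

```python
# Back-of-envelope sketch of the memory side of the 5600 XT BIOS update.
# The 12 -> 14 Gbps per-pin rates and the 192-bit bus are the commonly
# reported specs, assumed here for illustration.
BUS_WIDTH_BITS = 192

def bandwidth_gb_s(gbps_per_pin: float, bus_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak bandwidth in GB/s: per-pin Gbit/s times bus width, over 8 bits/byte."""
    return gbps_per_pin * bus_bits / 8

old, new = bandwidth_gb_s(12.0), bandwidth_gb_s(14.0)
print(f"{old:.0f} GB/s -> {new:.0f} GB/s (+{(new / old - 1) * 100:.1f}%)")
# 288 GB/s -> 336 GB/s (+16.7%), right in line with the ~17% figure
# on a card that's largely bandwidth-limited.
```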

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!
Ah. I forgot about "market segmentation". Thanks. Interesting stuff.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
It sounds like there's still way more on the table too; flashing a 5700 BIOS to the 5600 XT could be interesting.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Yeah that's called the 6600 XT, coming December 2020 in case you need a space heater.

Arzachel
May 12, 2012

Lungboy posted:

It sounds like there's still way more on the table too; flashing a 5700 BIOS to the 5600 XT could be interesting.

You're likely bottlenecked by the memory even at the updated BIOS clocks.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

K8.0 posted:

Yeah that's called the 6600 XT, coming December 2020 in case you need a space heater.

What do you think they'll call the 2024 edition?

Craptacular!
Jul 9, 2001

Fuck the DH

OhFunny posted:

Gamers Nexus doesn't seem particularly impressed. Especially with the drivers.

GamersNexus has a grudge against segmentation right now, and Lisa’s RTG is going back to the days of changing a jumper or limiting a bus so that you spend more on the model that hasn’t been throttled by the manufacturer.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

eames posted:

8 and 10 GB seems very realistic in light of memory prices. 16 and 20 not so much.

TechPowerUp reviewed the new $299 RTX 2060 and found that it uses the same larger TU104 die as the 2070 Super. Seems like the early reports of Nvidia producing price-cut 2070s were accurate, in a way...

8gb and 10gb will literally never happen unless nvidia does another 970 style memory bank switching type thing.
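
The constraint behind that is simple arithmetic: with GDDR6, each chip hangs off a 32-bit slice of the bus, so symmetric capacities are locked to the bus width. A quick sketch, with the common chip densities of the era assumed:

```python
# Why GDDR6 capacities are tied to bus width: each chip occupies a
# 32-bit slice of the bus, so a symmetric layout gives
# capacity = (bus_width / 32) * chip_size. The 1 GB and 2 GB chip
# densities are assumed as the common options of the era.
def symmetric_vram_gb(bus_bits: int, chip_gb: int) -> int:
    return (bus_bits // 32) * chip_gb

for bus in (128, 192, 256, 320):
    sizes = [symmetric_vram_gb(bus, chip) for chip in (1, 2)]
    print(f"{bus}-bit bus: {sizes} GB")
# 192-bit gives 6 or 12 GB and 256-bit gives 8 or 16 GB; 10 GB needs a
# 320-bit bus or an asymmetric, 970-style arrangement.
```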

ufarn
May 30, 2009
How much VRAM does it actually take to use 4K textures at 1080p/1440p in games? 8GB seemed like a requirement a while ago, but I don't know if technological improvements mean you can make do with less.

Truga
May 4, 2014
Lipstick Apathy
i'd say you're mostly safe with an 8gb card. i wouldn't go for less tho.

after all, on a modern gpu, upping the texture res does almost nothing to your frame rate while having probably the biggest impact on image quality, as long as you have the vram to store said textures.

you can get a used 580 with 8gb for ~100bux these days and it'll run poo poo with 4k textures fine on a 1080p screen
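
For a sense of scale, here's some rough arithmetic (the format sizes are textbook assumptions, not numbers from any particular game):

```python
# Rough texture-memory arithmetic. Illustrative assumptions: BC7-style
# block compression at 1 byte per texel, uncompressed RGBA8 at 4 bytes
# per texel, and a full mip chain adding about a third on top.
MIP_OVERHEAD = 4 / 3

def texture_mb(side_px: int, bytes_per_texel: float) -> float:
    return side_px * side_px * bytes_per_texel * MIP_OVERHEAD / 2**20

print(f"4K BC7:   {texture_mb(4096, 1):5.1f} MB")   # ~21 MB
print(f"4K RGBA8: {texture_mb(4096, 4):5.1f} MB")   # ~85 MB
print(f"300 unique 4K BC7 textures: {300 * texture_mb(4096, 1) / 1024:.1f} GB")
# ~6.3 GB for 300 resident compressed 4K textures, which is why an 8 GB
# card is mostly comfortable at 1080p.
```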

SCheeseman
Apr 23, 2003

I'd imagine the next generation of consoles would probably cause texture memory usage in games to increase quite a bit.

Truga
May 4, 2014
Lipstick Apathy
doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so

orcane
Jun 13, 2012

Fun Shoe
A lot of PC releases already have higher texture resolutions available, I doubt consoles will surpass those in a way that makes more VRAM mandatory than we see in current PC gaming benchmarks.

ufarn posted:

How much VRAM does it actually take to use 4K textures at 1080p/1440p in games? 8GB seemed like a requirement a while ago, but I don't know if technological improvements mean you can make do with less.
Depends on the game. If you add stuff like RTX into the mix you occasionally need more VRAM nowadays, not less (even though Nvidia cards reportedly do memory management better than AMD's, so the performance impact can be lower).

Judging by the recent mainstream card releases, a (slowly increasing) number of games produce bad minimum framerates with less than 8 GB of VRAM at high texture settings, even in 1080p (e.g. Wolfenstein with RTX on, or Ghost Recon Breakpoint). In most titles 6 GB is enough for 1080p at max textures, but 4 GB often isn't anymore (e.g. in the current CoD).

People who are willing to sometimes lower texture details on a new card should be fine with 6 GB in Full HD, but personally I wouldn't take less than 8 GB, especially if 1440p is the target.

SCheeseman
Apr 23, 2003

Truga posted:

doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so

I don't think it's been announced yet, though admittedly it's unlikely to be as big a jump as it was last gen (512MB > 8GB). It's shared memory, so developers could choose to use 12GB+ for textures if they really wanted to.

Arzachel
May 12, 2012

Truga posted:

after all, on a modern gpu, upping the texture res does almost nothing to your frame rate while having probably the biggest impact on image quality, as long as you have the vram to store said textures.

I would've agreed back in the ps3/x360 era but 4k textures feel like a marginal improvement to me nowadays.

SwissArmyDruid
Feb 14, 2014

by sebmojo
RTX 2080 in the RTX 2060 KO, Up to +47% Workstation Performance

Sensationalist, but it's GamersNexus, so they deserve the benefit of the doubt.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

SwissArmyDruid posted:

RTX 2080 in the RTX 2060 KO, Up to +47% Workstation Performance

Sensationalist, but it's GamersNexus, so they deserve the benefit of the doubt.

having watched the video, the short version is that the 2060 KO is not using the same die as other 2060s, and is instead using a die that's some kind of cut-down variant of the 2080/2070 Super dies

this does not make the 2060 KO any better than other 2060s in terms of gaming performance

it does make the 2060 KO perform comparably to a 2080/2070 Super on some specific productivity workloads, with Blender rendering being the most applicable one

so it's an interesting find, but not at all relevant for gaming

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Going by TPU numbers, even the MSI 5600 XT Gaming X is a mere 17% faster than a stock 1660S at 1440p. Small wonder AMD needed their emergency edition factory +17% OC BIOS or the 5600XT would be a complete shitshow.

Palladium fucked around with this message at 15:56 on Jan 23, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Truga posted:

doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so

Games absolutely will have increased system requirements once next-gen consoles roll around; this has basically been a general rule since the days of the PS2. Developers design around the lowest common denominator instead of scaling down (it's just a fact of life now that the PC market isn't big enough to carry a AAA game on its own, unless you're a Kickstarter conman designing spaceship games).

One aspect of this absolutely will be increased texture resolution; 8GB to 16GB is still double, even if it isn't as big a jump as PS3 > PS4 was.

You can certainly expect to throw any 4 core CPUs in the trash, bye bye 6600k

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Zedsdeadbaby posted:

You can certainly expect to throw any 4 core CPUs in the trash, bye bye 6600k

I don't think this is going to be quite true. Games don't use multi-threading very effectively anyway; they're still very single-core dependent, and I'm not sure why that would dramatically change. The PS4 Pro/Xbox One X had 8 cores and it didn't really change all that much.

It'll be a while before games outright require more than 4 cores. You'll see more games that can use them, but requiring 4+ cores is a pretty tall order.
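
Amdahl's law puts numbers on that intuition: if a chunk of each frame is inherently serial, piling on cores stops helping fast. A quick sketch, with the parallel fractions picked purely for illustration:

```python
# Amdahl's law: the speedup on n cores when only a fraction p of each
# frame's work can run in parallel. The p values are illustrative
# guesses, not measurements from any engine.
def amdahl_speedup(p: float, n: int) -> float:
    return 1 / ((1 - p) + p / n)

for p in (0.5, 0.7, 0.9):
    row = ", ".join(f"{n} cores: {amdahl_speedup(p, n):.2f}x" for n in (4, 8, 16))
    print(f"p={p}: {row}")
# With half the frame serial (p=0.5), even 16 cores only get you 1.88x;
# the serial part (simulation, render submission) dominates.
```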

Cavauro
Jan 9, 2008

A six-thread CPU like the 9600K already seems to get chumped or equalled by a lot of CPUs with more threads in new games like RDR2: last-gen AMD stuff, the 7700K, etc. I feel like it's really unlikely that this won't become fully standard once actual new console hardware is out.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"

Lockback posted:

I don't think this is going to be quite true. Games don't use multi-threading very effectively anyway; they're still very single-core dependent, and I'm not sure why that would dramatically change. The PS4 Pro/Xbox One X had 8 cores and it didn't really change all that much.

It'll be a while before games outright require more than 4 cores. You'll see more games that can use them, but requiring 4+ cores is a pretty tall order.

Really depends on the game; Battlefield V, regardless of how much of a buggy and incompetent mess it is overall, benefits greatly from having more than 4 cores when it comes to both average FPS and frametimes.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Lockback posted:

I don't think this is going to be quite true. Games don't use multi-threading very effectively anyway; they're still very single-core dependent, and I'm not sure why that would dramatically change. The PS4 Pro/Xbox One X had 8 cores and it didn't really change all that much.

It'll be a while before games outright require more than 4 cores. You'll see more games that can use them, but requiring 4+ cores is a pretty tall order.

The new CPUs are dramatically faster in terms of single-thread performance. Jaguar to Zen (2ish?) is an enormous leap, and the clocks are going up. The ante for CPU performance is being upped a great deal, and a CPU with fewer cores will no longer be able to match/exceed console performance by just being faster and handling more threads on each core.

Once games start primarily targeting the new systems, I would not expect consistently good performance from anything less than a 6/12 or maybe 8/8 system, but 8/16 will show significant advantages.
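
A loose back-of-envelope on that leap (the clock figures match the thread; the IPC ratio is a rough assumption, not a benchmark result):

```python
# Loose estimate of the console CPU per-core jump. Clocks as discussed
# in the thread (Jaguar ~1.6 GHz -> Zen 2 ~3.2 GHz); the ~2x IPC factor
# is an assumption for illustration only.
JAGUAR_GHZ, ZEN2_GHZ = 1.6, 3.2
IPC_RATIO = 2.0  # assumed Zen 2 vs. Jaguar instructions per clock

per_core_gain = (ZEN2_GHZ / JAGUAR_GHZ) * IPC_RATIO
print(f"~{per_core_gain:.0f}x per core")  # ~4x, before counting SMT
# Both generations have 8 cores, so whole-chip throughput rises by
# roughly the same factor, plus whatever SMT adds on top.
```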

Cygni
Nov 12, 2005

raring to post

It is extremely likely that the console cores will be clocked fairly low to favor the GPU in the monolithic die, and game developers still need to design and use tools that actually take advantage of that parallelism... which, considering there are still games coming out in 2020 that really only stress one core, is still a consideration. The vast majority of PC gamers also play on 60Hz screens.

So yeah, I think the "you'll be throwing your 4 core CPUs in the trash" thing is overstated. I mean, if you really think the thread count is gonna explode to match the next-gen consoles, you should be prepared to sweep those 4/8s and 6/12s into the trashcan too.

VelociBacon
Dec 8, 2009

Console people know that their audience generally doesn't have these kinds of conversations and cares more about whether their favorite sports guy is in the game, so I think it'll be 6/12 or something on the new CPUs.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Truga posted:

doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so
I recall a bunch of comments on here like this when the current consoles came out 6 years ago, when people thought 2GB cards would be fine and I was arguing that 4GB would very likely be necessary soon. You will absolutely need 8GB+ cards to get the same texture quality as the new consoles, I'd bet on it.

VelociBacon posted:

Console people know that their audience generally doesn't have these kinds of conversations and cares more about whether their favorite sports guy is in the game so I think it'll be 6/12 or something on the new CPUs.

What, the console CPUs, or are you talking about expected equivalent PC CPUs? The new consoles have been as confirmed as they can possibly be to be 8 cores/16 threads for ages now, with a likely boost clock of ~3.2GHz.

Happy_Misanthrope fucked around with this message at 19:11 on Jan 23, 2020

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

K8.0 posted:

The new CPUs are dramatically faster in terms of single-thread performance. Jaguar to Zen (2ish?) is an enormous leap, and the clocks are going up. The ante for CPU performance is being upped a great deal, and a CPU with fewer cores will no longer be able to match/exceed console performance by just being faster and handling more threads on each core.

Once games start primarily targeting the new systems, I would not expect consistently good performance from anything less than a 6/12 or maybe 8/8 system, but 8/16 will show significant advantages.

Sure, but "show advantages" is a very different bar than "throw 4 cores in the garbage". Of course newer CPUs are going to be better, but I don't think we're going to hit "the game won't even run on a 6600K" for a while.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I mean 8/16 will show significant advantages over 6/12 and certainly 8/8.

Any sort of 4/4 is going in the garbage with next gen games. The new consoles are going to have around the same per-core performance as a 6600k, but with twice the cores and four times the threads. There's no getting around that. Games might run, but they will run like absolute poo poo.

The CPUs in the PS4 and Bone, and even in the Bone X, are incredibly slow. Especially in terms of CPU performance, they were the weakest consoles ever released relative to PCs at the time. Anyone who thinks that old CPUs can compete with the performance of these new console CPUs needs to look at the specs of the current consoles and realize how weak they actually are.

K8.0 fucked around with this message at 19:23 on Jan 23, 2020

Craptacular!
Jul 9, 2001

Fuck the DH

K8.0 posted:

I mean 8/16 will show significant advantages over 6/12

Nah. You’re underestimating the difficulty in making game code use all that.

I agree 4/4 is done, but there won't be any real reason to buy more than 6/12 unless you want to stream in parallel (which the consoles are probably gonna do again out of the box, and since AMD's GPU encoding is awful you can expect them to throw reserved CPU power at it).

Consoles don’t let games just eat up all the resources.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Lol at all of you fussing over omg coars without considering the gigantic effective memory bandwidth the consoles will have. It's going to be hundreds of GB/s, compared to like 50 for dual-channel DDR4 now.
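
The rough arithmetic checks out (assuming typical configs on both sides):

```python
# Peak-bandwidth arithmetic behind the comparison. The configs are
# assumed typical examples: dual-channel DDR4-3200 on the desktop vs.
# a 256-bit GDDR6 setup in the ballpark of the console rumors.
def ddr4_dual_channel_gb_s(mt_per_s: int) -> float:
    # 2 channels * 64 bits = 16 bytes per transfer; MT/s -> GB/s
    return 2 * 8 * mt_per_s / 1000

def gddr6_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(f"DDR4-3200, dual channel:  {ddr4_dual_channel_gb_s(3200):.1f} GB/s")  # ~51
print(f"GDDR6, 256-bit @ 14 Gbps: {gddr6_gb_s(256, 14.0):.0f} GB/s")         # 448
```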

Truga
May 4, 2014
Lipstick Apathy

Happy_Misanthrope posted:

I recall a bunch of comments on here like this when the current consoles came out 6 years ago, when people thought 2GB cards would be fine and I was arguing that 4GB would very likely be necessary soon. You will absolutely need 8GB+ cards to get the same texture quality as the new consoles, I'd bet on it.

well that wasn't me, 6 years ago i was already firmly in the 4gb camp, and now i'm firmly in the 8gb camp and tell people not to buy a 4gb card :v:

at the same time, i highly doubt console ports are gonna push 8gb cards hard

Truga fucked around with this message at 22:37 on Jan 23, 2020

Arzachel
May 12, 2012

Seamonster posted:

Lol at all of you fussing over omg coars without considering the gigantic effective memory bandwidth the consoles will have. It's going to be hundreds of GB/s, compared to like 50 for dual-channel DDR4 now.

Games just aren't CPU-bandwidth intensive.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Seamonster posted:

Lol at all of you fussing over omg coars without considering the gigantic effective memory bandwidth the consoles will have. It's going to be hundreds of GB/s, compared to like 50 for dual-channel DDR4 now.
...which is going to be 90% taken up by the GPU, and they'll have smaller caches than desktop chips to boot. It's not like GDDR is some magical secret sauce for unlocking a CPU's performance; if it were, you'd see workstations using it. Yes, bandwidth is important, to a point, but CPUs love big caches and low latency, two areas where most desktop PCs will likely still beat the consoles.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The PS4, Pro, and Bone X already have GDDR5. The original Bone has DDR3, but its 32MB of ESRAM proved to be enough for it to keep up with the PS4 fairly well.

It's really hard to overstate the impact of going from what is essentially a low-power Jaguar core at 1.6GHz to something like Zen 2 at 3.2GHz. This is part of why 2021 is going to be such a killer year for AMD. Many, many people are going to be forced to upgrade, and Intel still won't be able to compete.

The GPUs are also going from approximately a 7870 to, in the case of the high end Xbox, 23% more compute power than a 5700 XT. That is insane.
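
As a sanity check on that 23% figure, here's the TFLOPS arithmetic (the Series X number was still a rumor at this point and is assumed here for illustration):

```python
# FP32 throughput sketch: TFLOPS = 2 ops (one FMA) * shaders * clock.
# The 5700 XT figures are its official specs; the Series X "12 TF"
# number was still a rumor at the time and is assumed for illustration.
def tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000

rx5700xt = tflops(2560, 1.905)   # ~9.75 TFLOPS
series_x_rumor = 12.0            # rumored figure, in TFLOPS
print(f"+{(series_x_rumor / rx5700xt - 1) * 100:.0f}% compute vs 5700 XT")  # ~23%
```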

orcane
Jun 13, 2012

Fun Shoe

Seamonster posted:

Lol at all of you fussing over omg coars without considering the gigantic effective memory bandwidth the consoles will have. It's going to be hundreds of GB/s, compared to like 50 for dual-channel DDR4 now.
You mean PCs have... the wrong RAM?

Cygni
Nov 12, 2005

raring to post

It's also worth bringing up that AMD's launch of SmartShift, very obviously directed at the consoles, will come into play here. The 5700 XT alone pulls more wattage at load than either the full PS4 Pro or Xbone X, and although the Series X render definitely looks like a large beefy boy, I don't imagine its total power draw will be anything like the double or so it would need to run everything at full tilt. Developers will essentially get to play with the full brunt of the CPU or the GPU, but not both at the same time. And I imagine most developers will lean on as much as they can get from the GPU. Gonna be interesting to see how they leverage that hardware and what they do with it.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
There are some bold claims in this thread about how unreleased consoles are going to FINALLY wreck the PC market THIS TIME HONEST

Spiderdrake
May 12, 2001



That's kinda the opposite of what is being said.

I don't think "an increase in console hardware capabilities puts pressure on the PC userbase to replace aging hardware as demands rise" is much of a hot take, but this is mostly an argument over granularity.

Craptacular!
Jul 9, 2001

Fuck the DH
Yeah, this is more a conversation between frugal types and big spenders about whether new consoles will render mid-range PCs unenjoyable. The lack of information about what's inside the consoles at present, and the different needs of the two markets, only make it harder to speculate.

Assuming the new consoles are roughly equal to a 2700X and a 5700 seems a safe bet, but we don't know for sure what the performance is going to be.
