|
How do AMD get away with a 17% performance increase by reprogramming the BIOS on a brand new SKU? Surely, if they'd done their R&D/design/testing properly then the card would be tuned to deliver its very best in terms of output/noise. If you just add an extra 17% performance to a factory card then you're not putting it out with an optimal setup in the first place, or will this have a detrimental effect on the life of the card?
|
# ? Jan 22, 2020 09:52 |
|
apropos man posted:How do AMD get away with a 17% performance increase by reprogramming the BIOS on a brand new SKU?

It's a bit of a clusterfuck, because apparently AMD surprised even its board partners. Powercolor e.g. used their RX 5700 cooler for the Red Dragon but with one heatpipe less, which they will apparently add back in future cards (the card does fine without it, though). While I'm sure most board partners will offer some replacement/upgrade deal, buyers of the initial shipments risk getting inferior cards with coolers optimized for a slower card, and a slower BIOS.

Also, the cards aren't properly validated with the new clocks yet, so at least in the first weeks there could also be stability issues for cards with the enhanced BIOS (so people can blame "awful AMD drivers" again). Not all manufacturers will raise their VRAM clocks, and those that do currently have cards that are occasionally unstable, according to the guy who used to run Tom's Hardware Germany (however, he got the cards stable just by reducing the VRAM clocks by 50 MHz, so it doesn't take much to fix).

It shouldn't have an impact on the cards' longevity in general. The GPU clocks are no big deal (and the cards adjust those based on TDP, temperatures etc. anyway), and reportedly all cards used VRAM rated for the new speed to begin with (they were just limited to lower clocks in software). And if manufacturers want to keep their cut-down cooling solutions, they're free to not raise the clocks.
|
# ? Jan 22, 2020 10:49 |
|
Ah. I forgot about "market segmentation". Thanks. Interesting stuff.
|
# ? Jan 22, 2020 10:53 |
|
It sounds like there's still way more on the table too, flashing a 5700 bios to the 5600XT could be interesting.
|
# ? Jan 22, 2020 11:27 |
|
Yeah that's called the 6600 XT, coming december 2020 in case you need a space heater.
|
# ? Jan 22, 2020 12:03 |
|
Lungboy posted:It sounds like there's still way more on the table too, flashing a 5700 bios to the 5600XT could be interesting. You're likely bottlenecked by the memory even at the updated bios clocks.
|
# ? Jan 22, 2020 12:52 |
|
K8.0 posted:Yeah that's called the 6600 XT, coming december 2020 in case you need a space heater. What do you think they'll call the 2024 edition?
|
# ? Jan 22, 2020 13:07 |
|
OhFunny posted:Gamers Nexus doesn't seem particularly impressed. Especially with the drivers. GamersNexus has a grudge against segmentation right now, and Lisa’s RTG is going back to the days of changing a jumper or limiting a bus so that you spend more on the model that hasn’t been throttled by the manufacturer.
|
# ? Jan 22, 2020 22:15 |
|
eames posted:8 and 10 GB seems very realistic in light of memory prices. 16 and 20 not so much. 8gb and 10gb will literally never happen unless nvidia does another 970 style memory bank switching type thing.
|
# ? Jan 23, 2020 06:40 |
|
What kind or amount of VRAM does it actually take to use 4K textures on 1080p/1440p in games? 8GB seemed like a requirement a while ago, but I don't know if technological improvement means you can make do with less.
|
# ? Jan 23, 2020 10:31 |
|
i'd say you're mostly safe with an 8gb card. i wouldn't go for less tho. after all, on a modern gpu, upping the texture res does almost nothing to your frame rate while having probably the biggest impact on image quality, as long as you have the vram to store said textures. you can get a used 580 with 8gb for ~100bux these days and it'll run poo poo with 4k textures fine on a 1080p screen
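To put rough numbers on the VRAM question, here's a back-of-envelope sketch in Python. Purely illustrative: real engines use streaming and per-game budgets, so actual usage varies a lot, but it shows why texture resolution eats VRAM so fast.

```python
def texture_bytes(width, height, bytes_per_texel, mipmapped=True):
    """Approximate VRAM footprint of a single 2D texture.

    A full mip chain adds roughly 1/3 on top of the base level
    (1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmapped else base

MIB = 1024 * 1024
# Uncompressed RGBA8: 4 bytes per texel -> ~85.3 MiB for one 4K texture.
print(texture_bytes(4096, 4096, 4) / MIB)
# BC7 block compression: 16 bytes per 4x4 block = 1 byte/texel -> ~21.3 MiB.
print(texture_bytes(4096, 4096, 1) / MIB)
```

With block compression a few hundred unique 4K textures fit in 8 GB alongside framebuffers and geometry, which is roughly why 8 GB is the comfortable floor people keep landing on.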
|
# ? Jan 23, 2020 10:41 |
|
I'd imagine the next generation of consoles would probably cause texture memory usage in games to increase quite a bit.
|
# ? Jan 23, 2020 10:55 |
|
doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so
|
# ? Jan 23, 2020 11:14 |
|
A lot of PC releases already have higher texture resolutions available; I doubt consoles will surpass those in a way that makes more VRAM mandatory than we see in current PC gaming benchmarks.

ufarn posted:What kind or amount of VRAM does it actually take to use 4K textures on 1080p/1440p in games? 8GB seemed like a requirement a while ago, but I don't know if technological improvement means you can make do with less.

Judging by the recent mainstream card releases, a (slowly increasing) number of games produce bad minimum framerates with less than 8 GB of VRAM at high texture settings, even in 1080p (e.g. Wolfenstein with RTX on, or Ghost Recon Breakpoint). In most titles 6 GB is enough for 1080p/max. textures, but 4 GB often isn't anymore (e.g. in the current CoD). People who are willing to sometimes lower texture details on a new card should be fine with 6 GB in Full HD, but personally I wouldn't take less than 8 GB, especially if 1440p is the target.
|
# ? Jan 23, 2020 11:17 |
|
Truga posted:doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so I don't think it's been announced yet, though admittedly it's unlikely to be as big a jump as it was last gen (512MB>8GB). It's shared memory, so developers could choose to use 12GB+ for textures if they really wanted to.
|
# ? Jan 23, 2020 11:18 |
|
Truga posted:after all, on a modern gpu, upping the texture res does almost nothing to your frame rate while having probably the biggest impact on image quality, as long as you have the vram to store said textures. I would've agreed back in the ps3/x360 era but 4k textures feel like a marginal improvement to me nowadays.
|
# ? Jan 23, 2020 11:20 |
|
RTX 2080 in the RTX 2060 KO, Up to +47% Workstation Performance Sensationalist, but it's GamersNexus, so they deserve the benefit of the doubt.
|
# ? Jan 23, 2020 11:27 |
|
SwissArmyDruid posted:RTX 2080 in the RTX 2060 KO, Up to +47% Workstation Performance

Having watched the video, the short version is that the 2060 KO is not using the same die as other 2060s, and is instead using a die that's some kind of cut-down variant of the 2080/2070 Super dies. This does not make the 2060 KO any better than other 2060s in terms of gaming performance. It does make the 2060 KO perform comparably to a 2080/2070 Super on some specific productivity workloads, with Blender rendering being the most applicable one. So it's an interesting find, but not at all relevant for gaming.
|
# ? Jan 23, 2020 12:02 |
|
Going by TPU numbers, even the MSI 5600 XT Gaming X is a mere 17% faster than a stock 1660S at 1440p. Small wonder AMD needed their emergency edition factory +17% OC BIOS or the 5600XT would be a complete shitshow.
Palladium fucked around with this message at 15:56 on Jan 23, 2020 |
# ? Jan 23, 2020 15:44 |
|
Truga posted:doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so Games absolutely will have increased system requirements once next-gen consoles roll around; this has basically been a general rule since the days of the PS2. Developers design around the lowest common denominator instead of scaling down (it's just a fact of life now that the PC market isn't big enough for an AAA game unless you're a kickstarter conman designing spaceship games). One aspect of this absolutely will be increased texture resolution; 8GB to 16GB is still double, even if it isn't as big a jump as PS3>PS4 was. You can certainly expect to throw any 4-core CPUs in the trash, bye bye 6600k.
|
# ? Jan 23, 2020 17:56 |
|
Zedsdeadbaby posted:You can certainly expect to throw any 4 core CPUs in the trash, bye bye 6600k I don't think this is going to be quite true. Games are not using multi-threading very effectively anyway; they are still very single-core dependent, and I'm not sure why that would dramatically change. The PS4 Pro/Xbox One X had 8 cores and it didn't really change all that much. It'll be a while before games outright require more than 4 cores. You'll see more games that support more, but requiring 4+ cores is a pretty tall order.
|
# ? Jan 23, 2020 18:06 |
|
A six thread CPU like the 9600k already seems to get chumped or equalled by a lot of CPUs with more threads on new games like RDR2. Last gen AMD stuff, 7700k, etc. I feel like it would be really unlikely that this wouldn't become fully standard when actual new console hardware is out.
|
# ? Jan 23, 2020 18:13 |
|
Lockback posted:I don't think this is going to be quite true. Games are not using multi-threading very effectively anyway, they are still very single-core dependent and I am not sure why that would dramatically change. PS4 Pro/Xbox One X had 8 cores and it didn't really change all that much. Really depends on the game; Battlefield 5, regardless of how much a buggy and incompetent mess it is overall, benefits greatly from having more than 4 cores when it comes to both average FPS and frametimes.
|
# ? Jan 23, 2020 18:21 |
|
Lockback posted:I don't think this is going to be quite true. Games are not using multi-threading very effectively anyway, they are still very single-core dependent and I am not sure why that would dramatically change. PS4 Pro/Xbox One X had 8 cores and it didn't really change all that much. The new CPUs are dramatically faster in terms of single-thread performance. Jaguar to Zen (2ish?) is an enormous leap, and the clocks are going up. The ante for CPU performance is being upped a great deal, and a CPU with fewer cores will no longer be able to match/exceed console performance by just being faster and handling more threads on each core. Once games start primarily targeting the new systems, I would not expect consistently good performance from anything less than a 6/12 or maybe 8/8 system, but 8/16 will show significant advantages.
|
# ? Jan 23, 2020 18:46 |
|
It is extremely likely that the console cores will be fairly low-clocked to favor the GPU in the monolithic die, and game developers still need to design around, and use tools that actually take advantage of, that parallelism, which is no small thing considering there are still games coming out in 2020 that will really only stress one core. The vast majority of PC gamers also play on 60Hz screens. So yeah, I think the "you'll be throwing your 4-core CPUs in the trash" thing is overstated. I mean, if you really think the thread count is gonna explode to match the next-gen consoles, you should be prepared to sweep those 4/8s and 6/12s into the trashcan too.
|
# ? Jan 23, 2020 18:48 |
|
Console people know that their audience generally doesn't have these kinds of conversations and cares more about whether their favorite sports guy is in the game so I think it'll be 6/12 or something on the new CPUs.
|
# ? Jan 23, 2020 18:51 |
|
Truga posted:doubtful. consoles gonna have what, 16 gigs? a bunch of that is used by the OS and the game tho, so VelociBacon posted:Console people know that their audience generally doesn't have these kinds of conversations and cares more about whether their favorite sports guy is in the game so I think it'll be 6/12 or something on the new CPUs. What, the console CPUs, or are you talking about expected equivalent PC CPUs? The new consoles have been as confirmed as they can possibly be to be 8 cores/16 threads for ages now, with a likely boost clock of ~3.2GHz. Happy_Misanthrope fucked around with this message at 19:11 on Jan 23, 2020 |
# ? Jan 23, 2020 19:06 |
|
K8.0 posted:The new CPUs are dramatically faster in terms of single-thread performance. Jaguar to Zen (2ish?) is an enormous leap, and the clocks are going up. The ante for CPU performance is being upped a great deal, and a CPU with fewer cores will no longer be able to match/exceed console performance by just being faster and handling more threads on each core. Sure, but "Show advantage" is a very different bar than "Throw 4 cores in the garbage". Of course newer CPUs are going to be better, but I don't think we're going to hit the "Game won't even run on a 6600k" for a while.
|
# ? Jan 23, 2020 19:09 |
|
I mean 8/16 will show significant advantages over 6/12 and certainly 8/8. Any sort of 4/4 is going in the garbage with next gen games. The new consoles are going to have around the same per-core performance as a 6600k, but with twice the cores and four times the threads. There's no getting around that. Games might run, but they will run like absolute poo poo. The CPUs in the PS4 and Bone, and even in the Bone X, are incredibly slow. Especially in terms of CPU performance, they were the weakest consoles ever released relative to PCs at the time. Anyone who thinks that old CPUs can compete with the performance of these new console CPUs needs to look at the specs of the current consoles and realize how weak they actually are. K8.0 fucked around with this message at 19:23 on Jan 23, 2020 |
# ? Jan 23, 2020 19:21 |
|
K8.0 posted:I mean 8/16 will show significant advantages over 6/12 Nah. You're underestimating the difficulty in making game code use all that. I agree 4/4 is done, but there won't be any real reason to buy more than 6/12 unless you want to game and stream in parallel (which the consoles are probably gonna do again out of the box, and since AMD GPU encoding is awful you can expect them to throw CPU-reserved power at it). Consoles don't let games just eat up all the resources.
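The "hard to make game code use all that" point can be put in rough numbers with Amdahl's law. A minimal sketch, assuming (purely for illustration) that 70% of a frame's CPU work parallelizes cleanly:

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Amdahl's law: upper bound on speedup with n_threads workers,
    given the fraction of work that can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_threads)

# Suppose 70% of a frame's CPU work parallelizes cleanly (generous
# for a lot of game engines):
for n in (4, 6, 8, 12, 16):
    print(n, round(amdahl_speedup(0.7, n), 2))
```

Under that assumption, going from 8 to 16 threads buys only about 13% more throughput while the serial 30% dominates, which is the usual argument for why per-core speed matters more than raw thread count in game workloads.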
|
# ? Jan 23, 2020 20:33 |
|
Lol at all of you fussing over omg coars without considering the gigantic effective memory bandwidth the consoles will have. It's going to be hundreds of GB/s compared to like 50 in dual channel DDR4 now.
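The bandwidth gap is easy to sanity-check: peak bandwidth is just bus width times transfer rate. A quick sketch (the GDDR6 figures are an assumption about a typical 256-bit config, not confirmed console specs):

```python
def bus_bandwidth_gbs(bus_width_bits, transfer_rate_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers/s."""
    return bus_width_bits / 8 * transfer_rate_gtps

# Dual-channel DDR4-3200: 2 x 64-bit channels at 3.2 GT/s.
print(bus_bandwidth_gbs(128, 3.2))   # 51.2 GB/s
# A 256-bit GDDR6 bus at 14 GT/s, typical of recent GPUs:
print(bus_bandwidth_gbs(256, 14.0))  # 448.0 GB/s
```

So roughly an order of magnitude, as the post says, though on consoles that pool is shared between CPU and GPU rather than being all available to the CPU.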
|
# ? Jan 23, 2020 22:21 |
|
Happy_Misanthrope posted:Recall a bunch of comments on here like this when the new consoles came out 6 years ago and people thought 2gb cards would be fine and I was arguing that 4GB will very likely be necessary soon. You will absolutely need 8GB+ cards to get the same texture quality as the new consoles, I'd bet on it. well that wasn't me, 6 years ago i was already firmly in the 4gb camp, and now i'm firmly in the 8gb camp and tell people not to buy a 4gb card. at the same time, i highly doubt console ports are gonna push 8gb cards hard Truga fucked around with this message at 22:37 on Jan 23, 2020 |
# ? Jan 23, 2020 22:32 |
|
Seamonster posted:Lol at all of you fussing over omg coars without considering the gigantic effective memory bandwidth the consoles will have. Its going to be hundreds of GB/s compared to like 50 in dual channel DDR4 now. Games just aren't cpu bandwidth intensive.
|
# ? Jan 23, 2020 22:38 |
|
The PS4, Pro, and Bone X already have GDDR5. The original Bone has DDR3, but the 32MB of ESRAM proved to be enough for it to keep up with the PS4 fairly well. It's really hard to overstate the impact of going from the low-power Jaguar cores at 1.6GHz to something like Zen 2 at 3.2GHz. This is part of why 2021 is going to be such a killer year for AMD. Many, many people are going to be forced to upgrade, and Intel still won't be able to compete. The GPUs are also going from approximately a 7870 to, in the case of the high-end Xbox, 23% more compute power than a 5700 XT. That is insane.
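For what it's worth, that compute comparison is easy to reproduce. A sketch assuming the rumored Series X configuration of 52 CUs at ~1.8GHz (not confirmed as of this thread) against the 5700 XT's boost clock:

```python
def tflops(compute_units, clock_ghz, flops_per_cu_per_clock=128):
    """Peak FP32 throughput for an RDNA-style GPU:
    64 lanes x 2 ops per FMA = 128 FLOPs per CU per clock."""
    return compute_units * clock_ghz * flops_per_cu_per_clock / 1000

rx_5700_xt = tflops(40, 1.905)  # ~9.75 TFLOPS at boost clock
series_x   = tflops(52, 1.825)  # ~12.15 TFLOPS with the rumored specs
print(f"{series_x / rx_5700_xt - 1:.0%}")
```

This lands around 25%; the exact percentage depends on which clocks you plug in (game clock vs. boost clock), which is presumably where the quoted 23% figure comes from.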
|
# ? Jan 23, 2020 23:21 |
|
It's also worth bringing up that AMD's launch of SmartShift, very obviously directed at consoles, will come into play here. The 5700 XT alone pulls more wattage than either the full PS4 Pro or XboneX does at load, and although the Series X render definitely looks like a large beefy boy, I don't imagine its total power draw will be anything like the double or so it would need to run everything at full tilt. Developers will essentially get to play with the full brunt of the CPU or the GPU, but not both at the same time. And I imagine most developers will lean on as much as they can get from the GPU. Gonna be interesting to see how they leverage that hardware and what they do with it.
|
# ? Jan 24, 2020 00:03 |
|
There are some bold claims in this thread about how unreleased consoles are going to FINALLY wreck the PC market THIS TIME HONEST
|
# ? Jan 24, 2020 00:28 |
|
That's kinda the opposite of what is being said I don't think "an increase in console hardware capabilities puts pressure on the PC userbase to replace aging hardware as demands rise" is much of a hot take, but this is mostly an argument of granularity
|
# ? Jan 24, 2020 00:39 |
|
Yeah, this is more a conversation between frugals and big spenders about whether new consoles will render mid-range PCs unenjoyable. The lack of information about what's inside the consoles at present, and the different needs of the two markets, only makes it harder to speculate. Assuming the new consoles are roughly equal to a 2700X and a 5700 seems a safe bet, but we don't know for sure what the performance is going to be.
|
# ? Jan 24, 2020 00:48 |