Gwaihir
Dec 8, 2009
Hair Elf
I just don't see "socket that will support the next 3+ generations of CPUs" as any kind of requirement or selling point, especially given how small incremental CPU upgrades themselves are. I don't think we're going to see that change any time in the coming years, either.

By the time performance has increased enough that you actually want to upgrade your CPU, there are going to be enough platform changes to want a new motherboard anyhow.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
People are still using Sandy Bridge despite it not supporting anything recent, so your argument holds no water.

ed: I say this as I type from a Sandy Bridge 2600 workstation.

Gwaihir
Dec 8, 2009
Hair Elf
:psyduck:

Sandy Bridge is literally the basis of what I'm saying. Its socket/chipset only supports IVB as an upgrade, but it took so long for a worthwhile CPU upgrade to come out that the platform benefits of a new board are at least as significant as the CPU upgrade itself.

That's my whole point.

PC LOAD LETTER
May 23, 2005
WTF?!
It's not really clear that any platform changes over the next 2-3 years are really going to warrant a new mobo for most consumers/gamers, though.

U.2 connector support? Neat, but M.2 drives are already more than fast enough for consumer/gamer stuff, and U.2 drives carry the enterprise-class price premium. PCIe 4.0? I suppose it might be needed for 8K resolution games/VR, but I'm not even sure about that. XPoint/Optane support? Love the idea of the tech, but it sounds like it's going to be too expensive and too small in capacity for a while, plus M.2 drives are still more than fast enough for consumer/gamer stuff, and it's Intel-specific for the DIMMs, where it'd be real cool to have. WiFi in the chipset? Nice for laptops I guess, but that isn't really a practical platform to be talking about for mobo upgrades.
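
For reference on the PCIe 4.0 point, per-lane bandwidth is easy to work out from the published transfer rates and encodings; a quick back-of-envelope sketch in Python (spec numbers only, nothing assumed beyond them):

code:

# Rough usable bandwidth per PCIe lane, per generation.
# Gen 1/2 use 8b/10b encoding; gen 3/4 use 128b/130b.
GENS = {
    1: (2.5, 8 / 10),     # transfer rate in GT/s, encoding efficiency
    2: (5.0, 8 / 10),
    3: (8.0, 128 / 130),
    4: (16.0, 128 / 130),
}

def lane_gb_per_s(gen: int) -> float:
    gt_per_s, efficiency = GENS[gen]
    return gt_per_s * efficiency / 8  # 8 bits per byte

for gen in GENS:
    print(f"PCIe {gen}.0 x16: {16 * lane_gb_per_s(gen):.1f} GB/s")
# PCIe 3.0 x16 ~ 15.8 GB/s; PCIe 4.0 x16 ~ 31.5 GB/s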

All cool tech, but none of that would make me WANT or NEED to buy a new mobo just to have it, and I don't see why anyone in the consumer/gamer market would NEED to buy it either. Now, there are always those who will WANT it just because it's the latest and greatest, but I'm not talking about them, and they're a small part of the market anyways.

For servers or HPC, yeah, absolutely you're going to want and need to upgrade for at least one or two of those things, because they matter a lot for those roles.

EdEddnEddy
Apr 5, 2012



Pryor on Fire posted:

Overclocking each core sounds like a fun problem, especially if each one has slightly different characteristics and it becomes a whole new aspect of the hobby tuning scene.

March 2nd is the release date, not sure if that's been posted yet.

I could be very interested in this. OC'ing is sort of an addictive hobby once you start meddling with all the other fine-tuning details, and getting the most speed with the least voltage increase is always a work of art.

I will say that Intel did something sort of like this even back in the Sandy Bridge days.

Remember that P67 Sabertooth from the Intel thread, the one my buddy no longer uses (due to nothing on the board itself, blame bad USB dongles :argh: )? I built that system originally to OC and send his way (his request). What was amazing, and what I have not been able to successfully recreate since on pretty much any Intel chip/board, is the per-core turbo. On his 2600K I was able to turbo all 4 cores to 4.6, 3 cores to 4.8, 2 to 4.9, and 1 to 5GHz. Oftentimes the chip would actually run around the 4.8-4.9 mark with all 4 cores and hang out at 5GHz with 2, which was pretty incredible. This was also my early training with offset voltage, and unlike the 1.45-1.5 vcore that some had to use for 5GHz, I believe it only topped out at around ~1.4 vcore under heavy loads (benchmarking mostly). That was an amazing board and an amazing overclock.
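
For the curious, that turbo setup is easy to picture as a lookup from active-core count to max clock; a toy sketch in Python using the bins from the post above (the function itself is purely illustrative):

code:

# Hypothetical per-core turbo table, using the 2600K bins above:
# the fewer cores active, the higher the allowed clock (in GHz).
TURBO_BINS = {1: 5.0, 2: 4.9, 3: 4.8, 4: 4.6}

def max_clock_ghz(active_cores: int) -> float:
    """Max clock for a given number of active cores."""
    return TURBO_BINS[min(max(active_cores, 1), 4)]

print(max_clock_ghz(2))  # 4.9 -- a lightly threaded game
print(max_clock_ghz(4))  # 4.6 -- an all-core render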

Heat wasn't too bad in CA, but he moved to Texas, so I dropped him down to around 4.2GHz to keep things sane and stable, since if he has issues I can't really remote in to OC it.

So Intel did something of the sort (or at least motherboard makers did). AMD might have a new method, though, which I will be taking a hard, close look at for future builds.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Crotch Fruit posted:

The big thing I like are the 2+ and 3+ versions as well, since to my understanding most AM2 or AM2+ parts are compatible with each other, and AM2+ motherboards can even take AM3 CPUs sometimes. I don't think Intel ever tried to achieve similar levels of compatibility. That point might very well be moot since there are still a lot of different AMD sockets, but it is at least easier for me to look at an eBay listing and know right away which socket is newer.

It turns out that it's pretty easy to stay on the same sockets when you don't upgrade your architecture for 6 years.

And oh hey, look, AMD came out with a significant architecture upgrade and - guess what!? Time for new sockets! :toot:

Paul MaudDib fucked around with this message at 07:44 on Feb 17, 2017

Ihmemies
Oct 6, 2012

One thing I don't understand is why CPU die sizes keep getting smaller. My i7 920 is like 260mm² while a new 4-core Skylake is barely half of that. And a good chunk of Skylake's die is wasted on the GPU... Why won't Intel ditch the GPU already and make 260mm² 12-core CPUs for $300? That I'd call technological progress. Currently my phone has more cores than my desktop, which is lol.

At least AMD gets it and that is a very good reason to support them yet again.

Anime Schoolgirl
Nov 28, 2002

Ihmemies posted:

One thing I don't understand is why CPU die sizes keep getting smaller. My i7 920 is like 260mm² while a new 4-core Skylake is barely half of that. And a good chunk of Skylake's die is wasted on the GPU... Why won't Intel ditch the GPU already and make 260mm² 12-core CPUs for $300?
they already have something like that, but it's something like 1.8GHz per core

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Turns out when transistors got smaller, so did CPU dies.

Ihmemies
Oct 6, 2012

Boiled Water posted:

Turns out when transistors got smaller, so did CPU dies.

But that doesn't apply to GPUs, so why are they different? GPU makers always try to fit as much stuff in a die as the technology allows.

E: big-die GPUs have like 9B transistors while low-core-count CPUs are still in the 1-2B range after all these years... (And that's with the stupid-ass GPU.)

Ihmemies fucked around with this message at 08:32 on Feb 17, 2017

Watermelon Daiquiri
Jul 10, 2010
I TRIED TO BAIT THE TXPOL THREAD WITH THE WORLD'S WORST POSSIBLE TAKE AND ALL I GOT WAS THIS STUPID AVATAR.
Think about it. Why the fuck would Intel or whoever NOT increase the number of chips they can get per π·(150mm)² wafer?
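
Back-of-envelope version of that wafer math in Python (a crude upper bound that ignores edge loss and scribe lines; the ~122 mm² figure for a 4-core Skylake die is an approximation):

code:

import math

WAFER_RADIUS_MM = 150  # standard 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude upper bound: wafer area / die area, ignoring edge loss."""
    return int(math.pi * WAFER_RADIUS_MM ** 2 / die_area_mm2)

print(dies_per_wafer(260))  # ~271 candidates (Nehalem-sized die)
print(dies_per_wafer(122))  # ~579 candidates (4-core Skylake-ish die)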

GPUs have different economics in that people expect bigger generational gains from them than from CPUs. Plus, the GPU market is much, much smaller than the CPU market. Nvidia/AMD NEED the extra flops they can get from a higher number of ROPs or streaming whatevers so people will actually buy them. Tell me, is there an enormous number of people holding onto 3-4 generation old GPUs the same way they do for CPUs? Do you see many people saying "Wow, I never thought I'd get so much out of this 560 Ti. I still don't see a need to upgrade!" the same way they do for Sandy Bridge? :P

Watermelon Daiquiri fucked around with this message at 10:11 on Feb 17, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
GPUs are massively parallel; for them, increasing core counts is the primary way to get more out of them, while CPU workloads tend to see ever-diminishing returns from parallelism. Adding more cache or instructions will quickly send the TDP and power draw through the roof, so there is actually a real upper limit on how big you want to design a CPU (an infinitely big CPU is not better; an infinitely large GPU might be). When most CPU workloads still end up using legacy instructions and rely heavily on clock speeds, it's actually better design to go small to reduce chip complexity, as a larger, more complex chip is more prone to faults and inherently has a harder time maintaining higher clocks.

Ihmemies
Oct 6, 2012

Why do consoles have more cores if more cores = bad? 8-core CPUs have been the standard in the console gaming world for years; in PC gaming, Intel and shitty graphics APIs have been holding progress back. Hopefully Ryzen pushes game devs to make their PC games work better with multiple cores in the long run.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Ihmemies posted:

Why do consoles have more cores if more cores = bad? 8-core CPUs have been the standard in the console gaming world for years; in PC gaming, Intel and shitty graphics APIs have been holding progress back. Hopefully Ryzen pushes game devs to make their PC games work better with multiple cores in the long run.

source your quotes

SwissArmyDruid
Feb 14, 2014

by sebmojo
More cores are not inherently a bad thing. However, there is a price to be paid for more cores.

* More cores means less die space dedicated per core. Given 200 mm^2 of space, you could fit either four cores or eight cores into that area. An eight-core part would have less silicon area per core, therefore being less capable than four cores that take up all the extra space.
* More cores means less TDP per core. If your part is rated for 65W and you want eight cores, you will have to downclock those cores in order to stay inside your budget.
* Core size does not typically change from 4 to 6 to 8. Your core size is your core size, and your die grows or shrinks to accommodate it (not counting binning and disabling). More areal surface given to higher-core-count parts means an increase in the potential for game-changing flaws from the manufacturing process.
* Read another way: smaller dies from fewer cores means more "perfect" dies per wafer (there's a quick yield sketch after this list).

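A minimal sketch of that yield point, using the classic exponential yield model with an assumed (made-up but plausible) defect density:

code:

import math

DEFECT_DENSITY = 0.001  # assumed killer defects per mm^2 (made up)

def poisson_yield(die_area_mm2: float) -> float:
    """Fraction of dies with zero killer defects: Y = exp(-D * A)."""
    return math.exp(-DEFECT_DENSITY * die_area_mm2)

# Same transistor budget, two ways to cut it up:
print(f"{poisson_yield(200):.1%}")  # ~81.9% of big 200 mm^2 dies survive
print(f"{poisson_yield(100):.1%}")  # ~90.5% of small 100 mm^2 dies survive
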
Also, Cell was, technically, one core with eight sub-cores. I vaguely remember someone explaining it to me as an eight-way DSP. It didn't even have all eight at its disposal, though: one was disabled to bring up yields so they would have enough to even launch the PS3 (see bullet points 3 and 4 above), and one was reserved for OS functions. In the end, it was really a one-core, six-sub-core part, as far as consoles go. And you can't bin console chips the way you can bin desktop or server chips.

And let's not forget that the 360 was 3-core.

eames
May 9, 2009

also relevant

Ihmemies
Oct 6, 2012

I mean, even the Xbone and PS4 are four-year-old 8-core products now. 200mm² is small; my Nehalem is like 260 and it's from 2008. GPU makers produce dies with 600mm² and 8-9B transistors, so why can't CPUs follow suit, or at least make CPUs with half that die size?

Just gimme more cores, I'll buy a larger tower cooler if needed. Or make a new ATX-like standard which gives room for even larger coolers :/ I want performance, not excuses! Or at least make the CPUs with more cache. Ditch the stupid-ass iGPU and slap on some extra cache with the spare room & transistors.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Ihmemies posted:

I mean, even the Xbone and PS4 are four-year-old 8-core products now. 200mm² is small; my Nehalem is like 260 and it's from 2008. GPU makers produce dies with 600mm² and 8-9B transistors, so why can't CPUs follow suit, or at least make CPUs with half that die size?

Just gimme more cores, I'll buy a larger tower cooler if needed. Or make a new ATX-like standard which gives room for even larger coolers :/ I want performance, not excuses!

Go buy Knights Landing then.

Ihmemies
Oct 6, 2012

FaustianQ posted:

Go buy Knights Landing then.

I meant: why are at least 8 cores STILL not standard, even though they are standard in phones, consoles, etc.? It should be the bare minimum. Fuck Intel for stagnating progress during the last decade.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

It's not Intel, it's everyone. Good laptops are still dual-core (soon with THREADRIPPING) and really don't need more, for basically anyone.

Moar cores in phones, I think, is just because it's something you can put on the back of the box and go "see, better". Incidentally, this was AMD's marketing strategy for Bulldozer.

evilweasel
Aug 24, 2002

Ihmemies posted:

I meant: why are at least 8 cores STILL not standard, even though they are standard in phones, consoles, etc.? It should be the bare minimum. Fuck Intel for stagnating progress during the last decade.

Because Intel isn't going to cut their profits in half giving everyone eight cores when nothing uses them.

Ihmemies posted:

I mean, even the Xbone and PS4 are four-year-old 8-core products now.

They are eight shitty cores.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

evilweasel posted:


They are eight shitty cores.

This leaves the question: why pay for eight shitty cores when four would've done the job?

Ihmemies
Oct 6, 2012

evilweasel posted:

Because Intel isn't going to cut their profits in half giving everyone eight cores when nothing uses them.


Chicken and egg, which came first... if everyone had 8 cores, I'd bet devs would find a use for them sooner or later. It's like in cars: the more cylinders you have, the more power you get! :iiaca:

Mr Shiny Pants
Nov 12, 2012

eames posted:

also relevant



This is only relevant for parallel execution of a single workload, not a load of single-thread/process workloads. I just want to Handbrake a video while playing BF1 and running 40 tabs of Chrome and a couple of VMs, without closing anything. My hex-core goes a long way, but the more the merrier.
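
A minimal Linux-only sketch of helping that along by hand: pin the background encode onto a couple of cores so the game keeps the rest (the core split and the HandBrakeCLI file arguments here are just placeholders):

code:

import os
import subprocess

# Hypothetical split for a hex-core box: cores 0-3 stay free for the
# game, cores 4-5 take the background encode.
ENCODE_CORES = {4, 5}

encode = subprocess.Popen(["HandBrakeCLI", "-i", "in.mkv", "-o", "out.mp4"])
os.sched_setaffinity(encode.pid, ENCODE_CORES)  # Linux-only syscall wrapper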

PC LOAD LETTER
May 23, 2005
WTF?!
edit:/\/\/\/\/\/\ I think video encoding is one of those embarrassingly parallel problems that scales really well even with silly numbers of cores, so it's not the best example to go by. IIRC it's generally accepted that most things scale very poorly past 16 cores, but you tend to see diminishing returns kick in fairly hard past 8 cores.
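
That diminishing-returns intuition is basically Amdahl's law; a quick sketch, assuming a made-up 90%-parallel workload:

code:

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: the serial part caps speedup regardless of core count."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

for n in (2, 4, 8, 16, 64):
    print(n, round(amdahl_speedup(0.90, n), 2))
# 2 -> 1.82, 4 -> 3.08, 8 -> 4.71, 16 -> 6.4, 64 -> 8.77: flattens fast past 8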

Boiled Water posted:

This leaves the question: why pay for eight shitty cores when four would've done the job?
My understanding was MS and Sony reaaaallly wanted a combined CPU/GPU SoC they could rely on to scale down in cost over the years, and AMD seemed to be the only game in town offering something good enough for the right price.

In other news more Ryzen benches leaked: http://wccftech.com/amd-ryzen-5-1600x-cpu-benchmark-leak/?utm_source=dlvr.it&utm_medium=twitter

They're shitty synth benches (CPU-Z), mind you, but until launch we'll have to take what we can get. The article is mostly a rehash of already-known information; the only real new part is the CPU-Z scores. A Ryzen 5 1600X (6C12T Zen, 3.3GHz base, 3.7GHz boost), boosting itself to around 3.5GHz, consistently got a score of 1888 in single-threaded performance and 12544 in multithreaded performance. They mentioned that a Core i5-7600K (4C4T Kaby Lake, 3.8GHz base, 4.2GHz boost) got 2130 in single-thread and 8206 in multithread performance for comparison. (edit) So about a 12% difference in single-thread performance between those two CPUs for that bench. ('nother edit) Going by my shitty math, if you clock-normalize those scores it's more like a 4% difference in single-thread performance, which is much narrower than I expected, but there you go.
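
If you want to redo that clock-normalization yourself, it's a few lines of Python. With the boost clocks quoted above it actually comes out around 6%, per clock, in Ryzen's favor, so the exact figure depends heavily on which clocks you assume:

code:

# CPU-Z single-thread scores and the clocks quoted in the article.
ryzen_score, ryzen_ghz = 1888, 3.5   # R5 1600X at its observed boost
intel_score, intel_ghz = 2130, 4.2   # i5-7600K at its rated boost

ryzen_per_ghz = ryzen_score / ryzen_ghz  # ~539 points/GHz
intel_per_ghz = intel_score / intel_ghz  # ~507 points/GHz
print(f"{ryzen_per_ghz / intel_per_ghz - 1:+.1%}")  # ~ +6.4% per clock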

PC LOAD LETTER fucked around with this message at 15:14 on Feb 17, 2017

evilweasel
Aug 24, 2002

Boiled Water posted:

This leaves the question why pay for eight lovely cores when four would've done the job?

Because they went with AMD processors, which were cheaper, and figured that they could force people to code for eight cores to make up for the lovely per-core power because at least with consoles you've got a fixed hardware spec.

Sashimi
Dec 26, 2008


College Slice
It looks like the Intel vs. Ryzen battle could boil down to single-threaded performance, which favours Intel, versus multi-threaded performance, which favours Ryzen.

Of course a lot could change in two weeks when we get a chance to see overclocking capabilities, non-synthetic benchmarks, and so on.

brainwrinkle
Oct 18, 2009

What's going on in here?
Buglord
If Ryzen easily overclocks to ~4.5 GHz, it will be extremely compelling, because the single-threaded performance would be really close to Intel's. If it's a shitty overclocker, it will still be interesting, but not a must-buy.

FuturePastNow
May 19, 2014


I thought the consoles went with AMD because AMD could smash a better GPU onto a CPU than Intel.

If you trace the lineage of the XB1/PS4 CPU cores back, they're more Bobcat than Bulldozer. In fact, I think they're literally just Jaguar cores, like Kabini. It's an outdated but perfectly fine little CPU, so I don't see a problem. The GPU part matters more for the consoles anyway.

Ihmemies
Oct 6, 2012

brainwrinkle posted:

If Ryzen easily overclocks to ~4.5 GHz, it will be extremely compelling, because the single-threaded performance would be really close to Intel's. If it's a shitty overclocker, it will still be interesting, but not a must-buy.

At least power should not be an issue: 4GHz at under 100W? Should be very easy to cool even at 4.5... of course, you never know what crap GloFo cooks up this time!

PC LOAD LETTER
May 23, 2005
WTF?!
Did Intel even try to compete for the PS4 and XB1?

I thought the only other competitive offer (vs. completely internal custom designs) was some sort of Tegra-ish thing from Nvidia, which fell through since AMD underbid them?

Ihmemies posted:

At least power should not be an issue: 4GHz at under 100W? Should be very easy to cool even at 4.5...
I'd expect them to turn into furnaces with more clocks/volts, just like Intel's 8C16T chips do.

PC LOAD LETTER fucked around with this message at 15:23 on Feb 17, 2017

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

brainwrinkle posted:

If Ryzen easily overclocks to ~4.5 GHz, it will be extremely compelling because the single threaded performance would be really near Intel's. If it's a lovely overclocker, it will still be interesting, but not a must buy.

Buildzoid seems to expect a 4.4-4.6GHz upper limit, though he never really specified core counts for those speeds. But Ryzen apparently has per-core overclocking, so it'd be interesting to see if it's possible to create custom overclocking profiles on a per-program basis.

Ryzen will beat shitty locked Intel processors hands down, and it seems set up to offer double to triple the threads when facing unlocked processors (for instance, R7 1700 vs. i7-7700K, R5 1600X vs. i5-7600K).

Anime Schoolgirl
Nov 28, 2002

Boiled Water posted:

This leaves the question: why pay for eight shitty cores when four would've done the job?
8 assy laptop cores are much, much easier to fab than 4 larger ones, and Jaguar actually had much better IPC than the concurrent Bulldozer architecture. Also, nobody else was putting gaming-grade GPUs on the same die. Basically this:

PC LOAD LETTER posted:

My understanding was MS and Sony reaaaallly wanted a combined CPU/GPU SoC they could rely on to scale down in cost over the years, and AMD seemed to be the only game in town offering something good enough for the right price.

And if anyone's asking why they didn't use Intel, I have a feeling Intel wouldn't have wanted to contract at the margins Sony and Microsoft wanted. They might have for Nintendo, though, and that's another company knee-deep in the shit pool of bad decisions of late :ohdear:

Anime Schoolgirl fucked around with this message at 15:34 on Feb 17, 2017

Haquer
Nov 15, 2009

That windswept look...

SwissArmyDruid posted:

Also, Cell was, technically, one core with eight sub-cores. I vaguely remember someone explaining it to me as an eight-way DSP. It didn't even have all eight at its disposal, though: one was disabled to bring up yields so they would have enough to even launch the PS3 (see bullet points 3 and 4 above), and one was reserved for OS functions. In the end, it was really a one-core, six-sub-core part, as far as consoles go. And you can't bin console chips the way you can bin desktop or server chips.

Yeah, its main core was a 3.2GHz PowerPC-based core, and then it had 7 active SPEs (~~Synergistic Processing Elements~~), of which one was reserved for the system, as you said.

As I recall, they initially wanted to release the PS3 without a GPU due to the power of the SPEs on the chip (the fucker had bandwidth for days).

Fake edit:
http://www.ign.com/articles/2013/10/08/playstation-3-was-delayed-originally-planned-for-2005

quote:

“For a while, [PS3 had] no GPU, it was going to run everything with SPUs. The ICE team proved to Japan that it was just impossible. It would be ridiculous. Performance-wise, it would be a disaster. That’s why they finally added the GPU, closer to the end.”


E: SPE = SPU, same shit

feedmegin
Jul 30, 2008

Boiled Water posted:

It's not Intel, it's everyone. Good laptops are still dual-core (soon with THREADRIPPING) and really don't need more, for basically anyone.

Moar cores in phones, I think, is just because it's something you can put on the back of the box and go "see, better". Incidentally, this was AMD's marketing strategy for Bulldozer.

Also, more cores in phones is often the big.LITTLE thing, no? You're only using four at a time; it's just that sometimes it's the four big beefy ones and sometimes the four little power-sipping ones.

Anime Schoolgirl
Nov 28, 2002

8 cores on phones are used mostly for 1080p or 1440p video encoding and little else, and even then the higher-end phones are coalescing around SoCs with two to four fast cores.

Android itself will refuse to run on anything but the first core the vast majority of the time

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

feedmegin posted:

Also, more cores in phones is often the big.LITTLE thing, no? You're only using four at a time; it's just that sometimes it's the four big beefy ones and sometimes the four little power-sipping ones.

Yeah, this. Technically, your octa-core phone is really a quad-core: four high-performance cores which rarely get used (lol, thermal limitations of modern phone design) and four lower-performance, energy-efficient cores that do basically everything. As far as I am aware, they're never used in conjunction.
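
A minimal sketch of how you can actually see that split on a Linux/Android shell, reading each core's advertised max clock from the standard cpufreq sysfs paths:

code:

import glob

# Each core advertises its rated max clock via cpufreq sysfs.
paths = glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/cpuinfo_max_freq")
for path in sorted(paths):
    cpu = path.split("/")[5]           # e.g. "cpu0"
    with open(path) as f:
        khz = int(f.read())
    print(cpu, khz // 1000, "MHz")     # big cores report a higher max clock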

PC LOAD LETTER
May 23, 2005
WTF?!

Anime Schoolgirl posted:

And if anyone's asking why they didn't use Intel, I have a feeling Intel wouldn't have wanted to contract at the margins Sony and Microsoft wanted. They might have for Nintendo, though, and that's another company knee-deep in the shit pool of bad decisions of late :ohdear:
That is certainly possible, but IIRC there were rumors that Intel put in a surprisingly low bid for the original Xbox CPU. I don't think the legit CPU cost was ever leaked, but I do remember some public comments that AMD thought they had that contract sewn up and were shocked by how low Intel was willing to go.

Anime Schoolgirl
Nov 28, 2002

PC LOAD LETTER posted:

That is certainly possible, but IIRC there were rumors that Intel put in a surprisingly low bid for the original Xbox CPU. I don't think the legit CPU cost was ever leaked, but I do remember some public comments that AMD thought they had that contract sewn up and were shocked by how low Intel was willing to go.
2000-2001 was also a time when Intel started using every trick possible (both legal and illegal) to make up for getting owned by Jim Keller's second brainchild, and it managed to hit a two-decade jackpot in the form of anticompetitive laptop contracts.

Josh Lyman
May 24, 2009


PC LOAD LETTER posted:

That is certainly possible, but IIRC there were rumors that Intel put in a surprisingly low bid for the original Xbox CPU. I don't think the legit CPU cost was ever leaked, but I do remember some public comments that AMD thought they had that contract sewn up and were shocked by how low Intel was willing to go.
Intel was getting killed in those P3 days though.
