fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Boiled Water posted:

Ehh they really don't. Some of the "cores" are regular ol' CPU cores, but the majority, I think half, are graphics processors that only make pictures pretty.

I'm pretty sure you're thinking of the PS3's CPU. The Cell had one main PowerPC core plus eight :airquote:Synergistic Processing Elements:airquote: built on a different architecture: one SPE was disabled to improve chip yields, one was reserved for the system OS and security/encryption, and the remaining six were used variably for standard general-purpose code, video decode, or graphics assistance. The SPE cores also lacked branch prediction.

(The Xbox 360's CPU was much simpler: 3 identical PowerPC cores that were effectively hyperthreaded, for 6 threads of execution.)
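For the curious, the lack of branch prediction is why SPE code was often written branch-free: on an in-order core with no predictor, every conditional risks a long pipeline stall. A minimal, generic C++ sketch of the idea (illustrative only, not actual SPE intrinsics):

```cpp
#include <cstddef>
#include <cstdint>

// Branchy version: on a core with no branch predictor, the conditional
// inside the loop can stall the pipeline on every iteration.
int64_t sum_positive_branchy(const int32_t* data, size_t n) {
    int64_t sum = 0;
    for (size_t i = 0; i < n; ++i) {
        if (data[i] > 0)
            sum += data[i];
    }
    return sum;
}

// Branchless version: a mask-and-AND select replaces the branch, so the
// pipeline never has to guess which way control flow goes.
int64_t sum_positive_branchless(const int32_t* data, size_t n) {
    int64_t sum = 0;
    for (size_t i = 0; i < n; ++i) {
        int32_t mask = -(data[i] > 0);  // 0 if non-positive, all-ones if positive
        sum += data[i] & mask;
    }
    return sum;
}
```

The mask trick turns control flow into data flow, which is exactly the style those in-order cores rewarded.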


Josh Lyman
May 24, 2009


Combat Pretzel posted:

Given that consoles have like seven cores available for use in games, developers have to figure out ways to make use of them. Should translate to better parallel code on the PC, too, eventually.
Aren't there only a handful of games where increased CPU parallelism would even help?

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Josh Lyman posted:

Aren't there only a handful of games where increased CPU parallelism would even help?

This is only because it took until this generation for every console to have multiple identical cores, and console design often places constraints on any PC game that's also going to be ported to consoles. The Wii was single core, the PS3 was the mess of different cores I described above, and the 360 was the only one of the three with multiple identical cores.

To take advantage of the PS4/Xbox One's full capability, you need to be using 6 or 7 of the cores (one or two of the cores in those 8-core setups are usually reserved for other things while a game is playing). Even the Wii U has 3 identical PowerPC cores, though unlike the Xbox 360's similar-looking 3-core design they don't run two threads each.

So with every current console having more than 2 cores for games, you can expect future games on the PC to also really want a lot of cores. It is, after all, easier to just plop your core-hungry game engine onto the PC rather than spend a bunch of time refactoring things so that it runs just as well with 2 cores or 1 core. GTA IV was the first major example of this years back; remember all the people who were angry that it really needed a quad-core CPU to run well because of how Rockstar chose to port it from the 360?
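The pattern those engines rely on looks something like this minimal C++ sketch (the general idea only; real engines use persistent job systems rather than spawning threads per call):

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

// Fan a frame's worth of work out across every available core. On a PS4/XB1
// title this would be 6 or 7 workers; on a dual-core PC the same workload
// has nowhere to spread.
void parallel_for(size_t count, const std::function<void(size_t)>& body) {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([=, &body] {
            // Each worker strides through the items so the load stays balanced.
            for (size_t i = w; i < count; i += workers)
                body(i);
        });
    }
    for (auto& t : pool)
        t.join();  // wait for this frame's work before moving on
}
```

An engine tuned to keep 6-7 such workers busy simply stalls on a dual core, which is the "just plop it onto the PC" effect in miniature.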

fishmech fucked around with this message at 18:33 on May 29, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Also because pre-DX12/Vulkan, the graphics APIs were basically built around state machines that only supported single-threaded access. For example, draw calls could only be made from a single thread, and they make up a big chunk of the time in the update loop.
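Roughly the difference, as a hedged C++ sketch (the types and function names here are illustrative stand-ins, not real DX12/Vulkan signatures):

```cpp
#include <thread>
#include <vector>

// Hypothetical stand-ins for real API objects and calls:
struct CommandList {};
CommandList record_commands(int /*first*/, int /*last*/) { return {}; } // record draws for a slice of the scene
void submit(const std::vector<CommandList>& /*lists*/) {}               // one serialized queue submission

// Pre-DX12/Vulkan, the API context was a single state machine, so every
// draw call had to funnel through the one thread that owned it. In the
// DX12/Vulkan model, N threads record command lists in parallel and only
// the final submission is serialized:
void render_frame(int object_count, int thread_count) {
    std::vector<CommandList> lists(thread_count);
    std::vector<std::thread> pool;
    int per_thread = object_count / thread_count;
    for (int t = 0; t < thread_count; ++t) {
        pool.emplace_back([&, t] {
            lists[t] = record_commands(t * per_thread, (t + 1) * per_thread);
        });
    }
    for (auto& th : pool)
        th.join();
    submit(lists);  // the only step that still has to happen on one thread
}
```

Under the old model, everything inside `record_commands` would instead have to run on the single render thread, which is why draw-call-heavy games were so bound to one CPU core.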

Eregos
Aug 17, 2006

A Reversal of Fortune, Perhaps?

DrDork posted:

This is pretty much only true if your idea of videogames ends with Civ V and MMOs, and your idea of a CPU starts with "i3". Virtually everything else is GPU limited unless you are content to game at 1080p@60Hz on a $400+ video card for some unknowable reason. And at that point you're already so far past 60FPS that being "CPU limited" is meaningless. Even most MMOs aren't meaningfully CPU bottlenecked once you start talking about high-Hz 1440p or higher resolutions.

While you're right that a 6700k is not the best option for gaming + lots of background tasks, unless your background tasks are things like re-encoding your entire media library, options like the 5820k can hang right with a 6700k in gaming performance and still have plenty of oomph left over for your 300 tabs of javascript.

Frankly a 6700k by itself would probably be fine as long as you have enough RAM to feed all your tabs, depending on the game and what you mean by "background tasks"; the i5-2400 was not exactly a killer chip, by comparison. Despite its numerical similarity to the 2500k/2600k, it has aged far less gracefully due to the lack of that giggle-inducing overclockability.

e; I mean, if you've got a system that works for you, then great! Just saying that what you describe isn't necessarily the simplest and most logical way of accomplishing what you've described as your objective.
You are right in some respects. I did seriously consider the 5820k, but not compromising on CPU single-thread performance was a decision born of years of being CPU bottlenecked. Some games, like the GTA, StarCraft, Supreme Commander, and Battlefield series, are absolute CPU-devouring beasts. The cost comparison between CPUs and video cards was also a major factor for me. A top-of-the-line video card these days is easily $2000+, or most of the cost of a brand new build. At that point, from a money-allocation perspective, it becomes a relatively cheap improvement to spend $100 more on a CPU to further reduce the chance of being CPU bottlenecked. Especially if you're gonna spend $4000+ on an SLI 4K build, which I haven't done yet but may.

(I still hold it against Blizzard that they designed StarCraft 2 to only utilize 2 cores :rant:)

Eregos fucked around with this message at 10:09 on May 30, 2016

pigdog
Apr 23, 2004

by Smythe
$2000 video cards exist because there are people and fields for which that kind of money is no sweat, but by no means are the cards worth that much. They exist to make the nearly-as-fast $650 cards seem like a good deal in comparison -- marketing 102. You're not supposed to actually buy a $2000 card unless you're a multimillionaire; that's just silly.

JawnV6
Jul 4, 2004

So hot ...

fishmech posted:

The SPE cores lacked branch prediction.
Classic fishmech.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Eregos posted:

Some games, like the GTA, StarCraft, Supreme Commander, and Battlefield series, are absolute CPU-devouring beasts

True, but at the same time almost any modern CPU will be more than "fast enough" to turn in stupid-fast FPS on those games. Having the absolute fastest single core so you can get 200FPS instead of 180FPS doesn't seem like something worth caring about, and games are only getting more and more friendly to 4+ cores (hey DX12/Vulkan, how you doin'?). Once you take overclocking into account, you can push a 5820k to within a few percent of your average overclocked 6700k, too, so it's not like there's even that much of a difference in raw single-thread performance.

Also, as of this week the absolute top-of-the-line video card is $600 ($1200 on eBay, wheeeeee!), and will be for the next few quarters until they release another $1000 Titan variant, so you'd have to be going pretty hog-wild to hit $4k these days.

In the end, if you're happy with a 6700k that's all that matters, though.

repiv
Aug 13, 2009

Intel are going ham with the Skylake-E socket :stare:

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Dats a lot of pins to bend

http://www.newegg.com/Product/Product.aspx?Item=19-117-649

Newegg has broadwell E in stock.

Don Lapre fucked around with this message at 14:42 on May 31, 2016

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Don Lapre posted:

Dats a lot of pins to bend

http://www.newegg.com/Product/Product.aspx?Item=19-117-649

Newegg has broadwell E in stock.

Woo. Anybody who wants >4 cores right now should still be putting Haswell-E on one of the nice new X99 boards, right? BW-E really looks more expensive for the same performance.

BurritoJustice
Oct 9, 2012

Twerk from Home posted:

Woo. Anybody who wants >4 cores right now should still be putting Haswell-E on one of the nice new X99 boards, right? BW-E really looks more expensive for the same performance.

Broadwell-E is actually worse than Haswell-E anyway, because the IPC increase isn't enough to offset the significantly worse overclocking. The E chips are overclocking about as well as the socket 1150 Broadwells, which is to say with the worst overclocking headroom of any architecture in the last 5 years.

If someone offered me a 5960X or a 6900K I'd take the Haswell all day long.
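Back-of-the-envelope, with illustrative numbers rather than measured ones: single-thread speed scales roughly with IPC x clock. Credit Broadwell-E with ~5% more IPC than Haswell-E. A 5960X overclocked to 4.4GHz delivers 1.00 x 4.4 = 4.40 relative units; a 6900K stuck at 4.0GHz delivers 1.05 x 4.0 = 4.20, about 5% slower, and even at 4.2GHz it only just pulls even (1.05 x 4.2 = 4.41). The IPC bump isn't big enough to buy back the lost clocks.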

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Why the gently caress does the Skylake-E socket need to be that huge? Eight memory channels or some bullshit like that?

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Combat Pretzel posted:

Why the gently caress does the Skylake-E socket need to be that huge? Eight memory channels or some bullshit like that?

6 channel memory

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
What's the official party line on E5 Xeons in X99 boards? BW-E pricing is pretty high and people who actually need a bunch of threads might be looking at the cheap 8 and 10 core Xeons.

Gwaihir
Dec 8, 2009
Hair Elf

Twerk from Home posted:

What's the official party line on E5 Xeons in X99 boards? BW-E pricing is pretty high and people who actually need a bunch of threads might be looking at the cheap 8 and 10 core Xeons.

The current Xeons need C series chipsets, unfortunately. The motherboard OEMs do make "gaming" focused C series Xeon boards though.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Gwaihir posted:

The current Xeons need C series chipsets, unfortunately. The motherboard OEMs do make "gaming" focused C series Xeon boards though.

Nope, that's just for the Skylake E3 Xeons.

Twerk from Home posted:

What's the official party line on E5 Xeons in X99 boards? BW-E pricing is pretty high and people who actually need a bunch of threads might be looking at the cheap 8 and 10 core Xeons.

The E5 Xeons should work in any X99 board after a BIOS update to enable Broadwell support.

NihilismNow
Aug 31, 2003
Presumably someone looking for a 10-core desktop is not concerned with gaming so much.
And when you consider the cost of a 10-core Xeon, is $150 over a cheap X99 board for a C612 board really that big a deal? It's even price parity if you look at the really nice workstation X99 boards.

EoRaptor
Sep 13, 2003

by Fluffdaddy
In a good sign, Kaby Lake CPUs are already available for hardware development: http://arstechnica.com/gadgets/2016/05/intels-post-tick-tock-kaby-lake-cpus-definitely-coming-later-this-year/
Intel is giving every indication that it is on track for a 2H16 release, though not all features are completely finalized/announced.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

repiv posted:

Intel are going ham with the Skylake-E socket :stare:



If the *entry-level* processor for that socket is less than $600 at launch I'll be shocked. *starts saving up*

Hopefully they'll use all that chip space for a shitload of eDRAM or even HBM1.

Anime Schoolgirl
Nov 28, 2002

repiv posted:

Intel are going ham with the Skylake-E socket :stare:


That's actually Knights Landing. There has been neither hide nor hair of the actual Skylake-E socket.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

BIG HEADLINE posted:

If the *entry-level* processor for that socket is less than $600 at launch I'll be shocked. *starts saving up*

Hopefully they'll use all that chip space for a shitload of eDRAM or even HBM1.
Eh, if they're sidestepping that idea for the regular Skylake, I doubt the -E will come with it. And if it does, it'll be used as justification to raise the price.

Anime Schoolgirl posted:

That's actually Knights Landing. There has been neither hide nor hair of the actual Skylake-E socket.
There's been scuttlebutt about both sharing the same socket since at least November. Doesn't sound impossible. Maybe they want a single northbridge for both CPUs.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

BIG HEADLINE posted:

If the *entry-level* processor for that socket is less than $600 at launch I'll be shocked. *starts saving up*

Hopefully they'll use all that chip space for a shitload of eDRAM or even HBM1.

At this point if they did release something like that I'd expect them to charge like $5000.

EdEddnEddy
Apr 5, 2012



Can someone explain to me what this Turbo Boost Max 3.0 tech might be, compared to the normal turbo max frequency shown in the slide here?

And man, those prices are getting stupid. I really do hope that Zen can do something, anything, about Intel's dominance, as they are just getting stupidly greedy now. Even at a 50% discount the top two chips would still be freaking expensive (the only ones I would get if I did a new build to replace my still-spunky 3930K...).

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

EdEddnEddy posted:

Can someone explain to me what this Turbo Boost Max 3.0 tech might be, compared to the normal turbo max frequency shown in the slide here?

And man, those prices are getting stupid. I really do hope that Zen can do something, anything, about Intel's dominance, as they are just getting stupidly greedy now. Even at a 50% discount the top two chips would still be freaking expensive (the only ones I would get if I did a new build to replace my still-spunky 3930K...).

http://www.anandtech.com/show/10337/the-intel-broadwell-e-review-core-i7-6950x-6900k-6850k-and-6800k-tested-up-to-10-cores/2

Durinia
Sep 26, 2014

The Mad Computer Scientist

Combat Pretzel posted:

There's been scuttlebutt about both sharing the same socket at least back in November already. Doesn't sound impossible. Maybe they want a single Northbridge for both CPUs.

They both have 6 external memory channels, which is the major driver of that pin count. Reusing the socket would make economic sense. (I'm talking about the same physical socket -- I seriously doubt they'd be pin compatible.)

EdEddnEddy
Apr 5, 2012




Thank you.


Yep, until Windows 10 takes advantage of that natively and BIOSes aren't all screwed up by it, it looks like a completely useless bit of tech.

Nothing a solid, proper offset overclock can't beat, at least. Though if you can get a solid 4GHz out of a single core, that isn't terrible. What was the high end of the OC limits for the 8-core HW-E? 4.4GHz-ish?


Question: would there be a good X79-compatible Xeon chip that's unlocked for OC'ing? I'm reading around that some of them actually are, and the thought of putting a 10-core at 4GHz or so into my aging X79 system is appealing for doing Blu-ray encodes on the placebo preset, as my current chip takes around 6-7 hours.

EdEddnEddy fucked around with this message at 18:16 on Jun 1, 2016

lDDQD
Apr 16, 2006

EdEddnEddy posted:

Can someone explain to me what this Turbo Boost Max 3.0 tech might be

Essentially, you get per-core frequency control rather than one setting for all cores (voltage is still shared across cores). You can then identify your slow and fast cores and clock each one accordingly. Finally, there's a software tool that lets you steer some processes onto the faster cores (and others onto the slower ones).
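What that software tool boils down to is ordinary thread affinity. A minimal Win32 sketch of the same idea (assuming, purely for illustration, that core 2 is the favored core; actually discovering which core bins fastest is the job of Intel's driver):

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    // Assume core 2 is the "fast" core (illustrative; the Turbo Boost Max
    // 3.0 driver is what really identifies the highest-binning core).
    const DWORD_PTR fast_core_mask = DWORD_PTR(1) << 2;

    // Pin the current thread to that core, like the Intel utility does for
    // whatever process you target.
    DWORD_PTR old_mask = SetThreadAffinityMask(GetCurrentThread(), fast_core_mask);
    if (old_mask == 0) {
        std::printf("SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Pinned to core 2 (previous affinity mask: 0x%llx)\n",
                (unsigned long long)old_mask);
    // ... run the hot single-threaded workload here ...
    return 0;
}
```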

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

EdEddnEddy posted:

Yep until Windows 10 takes advantage of that natively, and Bios's aren't all screwed up with it, it looks like a completely useless bit of tech.
Windows supports up to 256 CPUs. It ain't the problem. The apps you use are.

EdEddnEddy
Apr 5, 2012



Combat Pretzel posted:

Windows supports up to 256 CPUs. It ain't the problem. The apps you use are.

True, but the per-core voodoo that's in 3.0 is getting put into Win10, probably this July, instead of using that stupid app they were talking about in that writeup.

Sort of like how Windows 8+ runs better than Win 7 on multi-core systems with HT, since Win 8 can tell the difference between physical cores and HT ones and split loads across cores rather than just threads. Which is also why 8+ gave the AMD 8-cores a small boost in light-load performance: it can also work out which cores share the same module (or whatever they called it), since those were really just sort of split quad cores.
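The scheduler can make that distinction because the CPU topology is exposed to software. A small Win32 sketch that reads it with the standard API (this just counts cores vs. threads; the scheduler's actual placement logic is internal to Windows):

```cpp
#include <windows.h>
#include <cstdio>
#include <vector>

int main() {
    // First call with a null buffer to learn the required size.
    DWORD len = 0;
    GetLogicalProcessorInformation(nullptr, &len);
    std::vector<SYSTEM_LOGICAL_PROCESSOR_INFORMATION> info(
        len / sizeof(SYSTEM_LOGICAL_PROCESSOR_INFORMATION));
    if (!GetLogicalProcessorInformation(info.data(), &len))
        return 1;

    int physical = 0, logical = 0;
    for (const auto& entry : info) {
        if (entry.Relationship == RelationProcessorCore) {
            ++physical;
            // Each set bit in the mask is one logical processor on this core,
            // so a hyperthreaded core contributes two bits.
            for (DWORD_PTR m = entry.ProcessorMask; m != 0; m &= m - 1)
                ++logical;
        }
    }
    std::printf("%d physical cores, %d logical processors\n", physical, logical);
    return 0;
}
```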

Gwaihir
Dec 8, 2009
Hair Elf

EdEddnEddy posted:

True, but the per-core voodoo that's in 3.0 is getting put into Win10, probably this July, instead of using that stupid app they were talking about in that writeup.

Sort of like how Windows 8+ runs better than Win 7 on multi-core systems with HT, since Win 8 can tell the difference between physical cores and HT ones and split loads across cores rather than just threads. Which is also why 8+ gave the AMD 8-cores a small boost in light-load performance: it can also work out which cores share the same module (or whatever they called it), since those were really just sort of split quad cores.

It's a driver. Having a driver installed to use a hardware feature isn't some obscure requirement. The only change is bundling the driver with whatever the latest build is, vs. needing to install it from Intel or a motherboard vendor's website.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
So this is what the back of Skylake-E (and Knights Landing) will look like:

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
now that i look at it again, you could totally melt the gold off of that thing and pawn just the gold, if you were completely ignorant of the perceived value of a microprocessor

e: Are we still sure that consumer HEDT and Xeon Phi will be on the same platform?

Sidesaddle Cavalry fucked around with this message at 10:53 on Jun 2, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Sidesaddle Cavalry posted:

now that i look at it again, you could totally melt the gold off of that thing and pawn just the gold, if you were completely ignorant of the perceived value of a microprocessor

e: Are we still sure that consumer HEDT and Xeon Phi will be on the same platform?

Possibly the same socket, probably not the same platform.

Gwaihir
Dec 8, 2009
Hair Elf

BIG HEADLINE posted:

So this is what the back of Skylake-E (and Knights Landing) will look like:



That is ridiculous (ly cool) looking!

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

BIG HEADLINE posted:

So this is what the back of Skylake-E (and Knights Landing) will look like:



Slotted CPUs are back!

EdEddnEddy
Apr 5, 2012



Sooooo... What's that tab on the end for?

Toast Museum
Dec 3, 2005

30% Iron Chef

EdEddnEddy posted:

Sooooo... Whats that tab on the end for?

Skimming the article the photo is from, it's probably an Omni-Path connector for the version packaged as a co-processor.

Anandtech posted:

The connector end of the co-processor has an additional chip under the heatspreader, which is most likely an Intel Omni-Path connector or a fabric adaptor for adjustable designs. This will go into a PCIe card, cooler applied and then sold.

WhyteRyce
Dec 30, 2001

Can Trump Make Intel Great Again?
http://fortune.com/2016/06/02/intel-ceo-brian-krzanich-trump/?xid=yahoo_fortune

Can he get Intel to build some smartphones and get ARM to pay for it?

WhyteRyce fucked around with this message at 16:23 on Jun 2, 2016


30 TO 50 FERAL HOG
Mar 2, 2005



So what's the deal with engineering samples?

I need a transcoding PC, and while searching for processors I found this

It appears as though a particular motherboard would be needed, I guess?
