|
Boiled Water posted:Ehh they really don't. Some of the "cores" are regular ol' CPU cores, but the majority, I think half, are graphics processors that only make pictures pretty. I'm pretty sure you're thinking of the PS3's CPU. That had 8 cores: one main PowerPC core, six Synergistic Processing Elements, which were a different architecture and were used variably for standard general-purpose code, video decode, or graphics assistance, and one more core that was reserved for the system OS and security/encryption. The SPE cores lacked branch prediction. (The Xbox 360's CPU was much simpler: 3 identical PowerPC cores that were effectively hyperthreaded, for 6 threads of execution)
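Without branch prediction, every taken branch on an SPE cost the full pipeline, so compilers and programmers leaned on branch-free "select" idioms instead of if/else. A minimal Python sketch of that trick (the real hardware version was the SPU `selb` instruction blending 128-bit registers; the function name and integer masking here are just for illustration):

```python
def select(cond, a, b):
    # Build an all-ones or all-zeros mask from the condition, then blend:
    # no conditional jump is ever taken, so there is nothing to mispredict.
    mask = -int(bool(cond))      # -1 (all ones) if cond else 0
    return (a & mask) | (b & ~mask)
```

Both arms are always evaluated, which is wasteful on a branch-predicting CPU but a win on hardware that pays full price for every branch.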
|
# ? May 29, 2016 16:40 |
|
Combat Pretzel posted:Given that consoles have like seven cores available for use in games, developers have to figure out ways to make use of them. Should translate to better parallel code on the PC, too, eventually.
|
# ? May 29, 2016 17:01 |
|
Josh Lyman posted:Aren't there only a handful of games where increased CPU parallelism would even help? That's only because it took until this generation for every console to have multiple identical cores, and console design often constrains any PC games that are also ported to consoles. The Wii was single core, the PS3 was the mess of different cores I mentioned previously, and the 360 was the only one with multiple identical cores, as also mentioned previously. To take advantage of the PS4/Xbox One's full capability, you need to be using 6 or 7 of the cores (usually one or two cores in the 8-core setups are reserved for other things while a game is playing). And even the Wii U has a 3-identical-core/possibly-6-thread CPU, essentially copying the last-gen Xbox 360's PowerPC design. So with every current console having more than 2 cores for games, you can expect future PC games to also really want a lot of cores. It is, after all, easier to just plop your core-hungry game engine onto the PC than to spend a bunch of time refactoring it to run just as well on 2 cores or 1 core. GTA IV was the first major example of this years back: remember all the people who were angry that it really needed a quad-core CPU to run well because of how Rockstar chose to port it from the 360? fishmech fucked around with this message at 18:33 on May 29, 2016 |
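The "use 6 or 7 cores" pattern is basically a fork/join over each frame's work. A toy Python sketch (the names and the 6-core figure are illustrative; a real engine would use native threads, and in Python you'd want ProcessPoolExecutor for CPU-bound work because of the GIL):

```python
from concurrent.futures import ThreadPoolExecutor

GAME_CORES = 6  # consoles typically leave 1-2 of the 8 cores to the OS

def simulate_chunk(entities):
    # stand-in for per-entity physics/AI work on a slice of the world
    return sum(e * e for e in entities)

def tick(world, pool):
    # carve one frame's work into one chunk per available core, then join
    chunks = [world[i::GAME_CORES] for i in range(GAME_CORES)]
    return sum(pool.map(simulate_chunk, chunks))

with ThreadPoolExecutor(max_workers=GAME_CORES) as pool:
    total = tick(list(range(100)), pool)
```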
# ? May 29, 2016 17:54 |
|
Also because pre-DX12/Vulkan, the graphics APIs were basically big state machines that only supported single-threaded access. For example, draw calls could only be issued from a single thread, and they make up a big chunk of the time in the frame update loop.
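That constraint maps to a familiar pattern: many producer threads, one consumer thread that owns the API context. A Python sketch (the queue and command names are made up; real engines buffer draw commands the same way):

```python
import queue
import threading

draw_queue = queue.Queue()

def game_thread(object_id):
    # Logic threads run in parallel but never touch the graphics API;
    # they only record what needs drawing.
    draw_queue.put(("draw", object_id))

def render_thread(n_commands, submitted):
    # The one thread that owns the context replays every command --
    # this is the single-threaded bottleneck the post describes.
    for _ in range(n_commands):
        submitted.append(draw_queue.get())

submitted = []
threads = [threading.Thread(target=game_thread, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
render_thread(4, submitted)
```

DX12/Vulkan relax this by letting each thread record its own command buffer and only serializing the final submit.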
|
# ? May 29, 2016 18:07 |
|
DrDork posted:This is pretty much only true if your idea of videogames ends with Civ V and MMOs, and your idea of a CPU starts with "i3". Virtually everything else is GPU limited unless you are content to game at 1080p@60Hz on a $400+ video card for some unknowable reason. And at that point you're already so far past 60FPS that being "CPU limited" is meaningless. Even most MMOs aren't meaningfully CPU bottlenecked once you start talking about high-Hz 1440p or higher resolutions. Some games like GTA, Starcraft, Supreme Commander series and Battlefield are absolute CPU-devouring beasts, though. (I still hold it against Blizzard that they designed StarCraft 2 to only utilize 2 cores) Eregos fucked around with this message at 10:09 on May 30, 2016 |
# ? May 30, 2016 10:05 |
|
$2000 videocards exist because there are people and fields for which that kind of money is no sweat, but by no means are these cards worth that much. They exist to make nearly-as-fast $650 cards seem like a good deal in comparison, Marketing 102. You're not supposed to actually buy a $2000 card unless you're a multimillionaire; that's just silly.
|
# ? May 30, 2016 11:17 |
|
fishmech posted:The SPE cores lacked branch prediction.
|
# ? May 30, 2016 14:00 |
|
Eregos posted:Some games like GTA, Starcraft, Supreme Commander series and Battlefield are absolute CPU-devouring beasts True, but at the same time almost any modern CPU will be more than "fast enough" to turn in stupid-fast FPS on those games. Having the absolute fastest single core so you can get 200FPS instead of 180FPS doesn't seem like something worth caring about, and games are only getting more and more friendly to 4+ cores (hey DX12/Vulkan, how you doin'?). Once you take overclocking into account, you can push a 5820k to within a few percent of your average overclocked 6700k, too, so it's not like there's even that much of a difference in raw single-thread performance. Also, as of this week the absolute top-of-the-line video card is $600 ($1200 on eBay, wheeeeee!), and will be for the next few quarters until they release another $1000 Titan variant, so you'd have to be going pretty hog-wild to hit $4k these days. In the end, if you're happy with a 6700k that's all that matters, though.
|
# ? May 30, 2016 16:03 |
|
Intel are going ham with the Skylake-E socket
|
# ? May 31, 2016 14:28 |
|
Dats a lot of pins to bend http://www.newegg.com/Product/Product.aspx?Item=19-117-649 Newegg has broadwell E in stock. Don Lapre fucked around with this message at 14:42 on May 31, 2016 |
# ? May 31, 2016 14:39 |
|
Don Lapre posted:Dats a lot of pins to bend Woo. Anybody who wants >4 cores right now should still be putting Haswell-E on one of the nice new X99 boards, right? BW-E really looks more expensive for the same performance.
|
# ? May 31, 2016 14:55 |
|
Twerk from Home posted:Woo. Anybody who wants >4 cores right now should still be putting Haswell-E on one of the nice new X99 boards, right? BW-E really looks more expensive for the same performance. Broadwell-E is actually basically worse than Haswell-E anyway, because the IPC increase isn't enough to offset the significantly worse overclocking. The E chips are overclocking about as well as the 1150 socket Broadwells, which is to say the worst overclocking headroom of any architecture in the last 5 years. If someone offered me a 5960X or a 6900K I'd take the Haswell all day long.
|
# ? May 31, 2016 15:00 |
|
Why the gently caress does the Skylake-E socket need to be that huge? Eight memory channels or some bullshit like that?
|
# ? May 31, 2016 15:14 |
|
Combat Pretzel posted:Why the gently caress does the Skylake-E socket need to be that huge? Eight memory channels or some bullshit like that? 6 channel memory
|
# ? May 31, 2016 15:24 |
|
What's the official party line on E5 Xeons in X99 boards? BW-E pricing is pretty high and people who actually need a bunch of threads might be looking at the cheap 8 and 10 core Xeons.
|
# ? May 31, 2016 15:25 |
|
Twerk from Home posted:What's the official party line on E5 Xeons in X99 boards? BW-E pricing is pretty high and people who actually need a bunch of threads might be looking at the cheap 8 and 10 core Xeons. The current Xeons need C series chipsets, unfortunately. The motherboard OEMs do make "gaming" focused C series Xeon boards though.
|
# ? May 31, 2016 17:52 |
|
Gwaihir posted:The current Xeons need C series chipsets, unfortunately. The motherboard OEMs do make "gaming" focused C series Xeon boards though. Nope, that's just for the Skylake E3 Xeons. Twerk from Home posted:What's the official party line on E5 Xeons in X99 boards? BW-E pricing is pretty high and people who actually need a bunch of threads might be looking at the cheap 8 and 10 core Xeons. The E5 Xeons should work in any X99 board after a BIOS update to enable Broadwell support.
|
# ? May 31, 2016 19:33 |
|
Presumably someone looking for a 10-core desktop isn't that concerned with gaming. When you're already paying for a 10-core Xeon, is $150 over a cheap X99 board really that big a deal for a C612 board? It's even price parity if you look at the really nice workstation X99 boards.
|
# ? May 31, 2016 23:19 |
|
In a good sign, Kaby Lake CPUs are already available for hardware development: http://arstechnica.com/gadgets/2016/05/intels-post-tick-tock-kaby-lake-cpus-definitely-coming-later-this-year/ Intel is giving every indication that it is on track for a 2H16 release, though not all features are completely finalized/announced.
|
# ? Jun 1, 2016 02:21 |
|
repiv posted:Intel are going ham with the Skylake-E socket If the *entry-level* processor for that socket is less than $600 at launch I'll be shocked. *starts saving up* Hopefully they'll use all that chip space for a shitload of eDRAM or even HBM1.
|
# ? Jun 1, 2016 03:52 |
|
repiv posted:Intel are going ham with the Skylake-E socket That's actually Knight's Landing.
|
# ? Jun 1, 2016 03:59 |
|
BIG HEADLINE posted:If the *entry-level* processor for that socket is less than $600 at launch I'll be shocked. *starts saving up* Anime Schoolgirl posted:That's actually Knight's Landing. There has been neither hide nor hair of the actual Skylake-E socket.
|
# ? Jun 1, 2016 08:59 |
|
BIG HEADLINE posted:If the *entry-level* processor for that socket is less than $600 at launch I'll be shocked. *starts saving up* At this point if they did release something like that I'd expect them to charge like $5000.
|
# ? Jun 1, 2016 15:49 |
|
Can someone explain to me what this Turbo Boost Extreme 3.0 tech might be compared to the normal turbo max frequency shown in the slide here? And man, those prices are getting stupid. I really do hope that Zen can do something, anything, to dent Intel's dominance, as they are just getting stupid greedy now. Even at a 50% discount it is still freaking expensive for the top two chips. (the only ones I would get if I did a new build from my still-spunky 3930K...)
|
# ? Jun 1, 2016 17:06 |
|
EdEddnEddy posted:Can someone explain to me what this Turbo Boost Extreme 3.0 tech might be compared to the normal turbo max frequency shown in the slide here? http://www.anandtech.com/show/10337/the-intel-broadwell-e-review-core-i7-6950x-6900k-6850k-and-6800k-tested-up-to-10-cores/2
|
# ? Jun 1, 2016 17:21 |
|
Combat Pretzel posted:There's been scuttlebutt about both sharing the same socket at least back in November already. Doesn't sound impossible. Maybe they want a single Northbridge for both CPUs. They both have 6 external memory channels, which is the major driver of that pin count. Reusing the socket would make economic sense. (I'm talking about same physical socket - I seriously doubt they'd be socket compatible in their pin arrangement)
|
# ? Jun 1, 2016 18:00 |
|
Malloc Voidstar posted:http://www.anandtech.com/show/10337/the-intel-broadwell-e-review-core-i7-6950x-6900k-6850k-and-6800k-tested-up-to-10-cores/2 Thank you. Yep, until Windows 10 takes advantage of that natively, and BIOSes aren't all screwed up with it, it looks like a completely useless bit of tech. Nothing a solid proper offset overclock can't beat, at least. Though if you can get a solid 4GHz out of a single core, that isn't terrible. What were the high OC limits of the 8-core in HW-E? 4.4GHz-ish? Question: would there be a good X79-compatible Xeon chip that is unlocked for OC'ing? I am reading around that some of them actually are, and the thought of putting a 10-core at 4GHz or so in my aging X79 system is appealing for doing BR encoding in Placebo, as my current chip takes around 6-7 hours. EdEddnEddy fucked around with this message at 18:16 on Jun 1, 2016 |
# ? Jun 1, 2016 18:07 |
|
EdEddnEddy posted:Can someone explain to me what this Turbo Boost Extreme 3.0 tech might be Essentially, you get frequency control per core rather than for all cores together (voltage is still shared across cores). You can then identify your slow and fast cores and clock each one accordingly. Finally, there's a software tool that lets you target some processes at the faster cores (and others at the slower ones).
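The "target some processes at the faster cores" half is just CPU affinity under the hood. A Linux-only Python sketch of that half (figuring out which core is actually the fastest is the part Intel's driver does for you; here the core id is supplied by hand):

```python
import os

def pin_to_core(core_id):
    # Restrict the calling process to a single logical core -- the same
    # mechanism the Turbo Boost Max 3.0 utility uses once it has ranked
    # the cores by their factory-fused max frequency.
    os.sched_setaffinity(0, {core_id})
    return os.sched_getaffinity(0)
```

After `pin_to_core(0)` the scheduler will only run the process on logical core 0, so it inherits whatever boost bin that core can sustain.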
|
# ? Jun 1, 2016 19:13 |
|
EdEddnEddy posted:Yep until Windows 10 takes advantage of that natively, and Bios's aren't all screwed up with it, it looks like a completely useless bit of tech. Windows supports up to 256 CPUs. It ain't the problem. The apps you use are.
|
# ? Jun 1, 2016 21:34 |
|
Combat Pretzel posted:Windows supports up to 256 CPUs. It ain't the problem. The apps you use are. True, but the per-core voodoo that is in 3.0 is getting put into Win10, probably this July, instead of using that stupid app they were talking about in that writeup. Sort of how Windows 8+ runs better on multi-core systems with HT than Win 7, since Win 8 can tell physical cores from HT ones and splits loads across cores rather than just threads. Which is also why 8+ gave the AMD 8-cores a small boost in light-load performance: it can also figure out which cores share the same module (or whatever they called it), since those were just sort of split quad cores.
|
# ? Jun 1, 2016 21:50 |
|
EdEddnEddy posted:True, but the per core voodoo that is in 3.0 is getting put into Win10 probably this July instead of using that stupid app they were talking about in that writeup. It's a driver. Having a driver installed to use a hardware feature isn't some obscure requirement. The only change is bundling the driver with whatever the latest build vs needing to install it from Intel or a motherboard vendor website.
|
# ? Jun 1, 2016 22:09 |
|
So this is what the back of Skylake-E (and Knight's Landing) will look like:
|
# ? Jun 2, 2016 10:38 |
|
now that i look at it again, you could totally melt the gold off of that thing and pawn just the gold, if you were completely ignorant of the perceived value of a microprocessor e: Are we still sure that consumer HEDT and Xeon Phi will be on the same platform? Sidesaddle Cavalry fucked around with this message at 10:53 on Jun 2, 2016 |
# ? Jun 2, 2016 10:50 |
|
Sidesaddle Cavalry posted:now that i look at it again, you could totally melt the gold off of that thing and pawn just the gold, if you were completely ignorant of the perceived value of a microprocessor Possibly the same socket, probably not the same platform.
|
# ? Jun 2, 2016 14:23 |
|
BIG HEADLINE posted:So this is what the back of Skylake-E (and Knight's Landing) will look like: That is ridiculous (ly cool) looking!
|
# ? Jun 2, 2016 14:43 |
|
BIG HEADLINE posted:So this is what the back of Skylake-E (and Knight's Landing) will look like: Slotted cpus are back!
|
# ? Jun 2, 2016 14:55 |
|
Sooooo... What's that tab on the end for?
|
# ? Jun 2, 2016 15:46 |
|
EdEddnEddy posted:Sooooo... Whats that tab on the end for? Skimming the article the photo is from, it's probably an Omni-Path connector for the version packaged as a co-processor. Anandtech posted:The connector end of the co-processor has an additional chip under the heatspreader, which is most likely an Intel Omni-Path connector or a fabric adaptor for adjustable designs. This will go into a PCIe card, cooler applied and then sold.
|
# ? Jun 2, 2016 15:52 |
|
Can Trump Make Intel Great Again? http://fortune.com/2016/06/02/intel-ceo-brian-krzanich-trump/?xid=yahoo_fortune Can he get Intel to build some smartphones and get ARM to pay for it? WhyteRyce fucked around with this message at 16:23 on Jun 2, 2016 |
# ? Jun 2, 2016 16:19 |
|
So what's the deal with engineering samples? I need a transcoding PC and, while searching for processors, found this. It appears as though a particular motherboard would be needed, I guess?
|
# ? Jun 2, 2016 16:46 |