canyoneer
Sep 13, 2005


I only have canyoneyes for you

The Slack Lagoon posted:

He's using Avid Media Composer.

I'll recommend that but it's not likely he wants to go for that. He is really odd about wanting TOP TIER future proofing. Tried to explain that's not a thing.

He originally wanted the 6900 but I shut that down.

Give your friend 4 options. Have the first be hella-spensive Saudi prince level.
Have the second one be the right one.
Have the third and fourth be a little too cheap or underpowered, or a used workstation.

He will then buy the second one and feel good about it.


BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
He honestly has a rosier upgrade path with Skylake than with Broadwell-E - it's entirely possible LGA1151 might have one more core iteration after Kaby, while LGA2011-3 is effectively EOL. The only thing that might be attractive to him is that eventually all those used 2011-3 Xeons will get pulled and sold at a massive discount, so there's a slight possibility he'll be able to pick up (down the line) a 10+ core Xeon for a song. Doesn't help him right *now*, though.

The Slack Lagoon
Jun 17, 2008



So from what I'm understanding, the 6800K might be worth it over a non-overclocked 6700K, and a 7700K probably wouldn't do much better than the 6700K. Seem about right?

I put this together for him, plus or minus RAM/HDDs. I feel like anything is going to blow him away compared to the six-year-old laptop he's using now.

http://pcpartpicker.com/list/X86Z2R

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

The Slack Lagoon posted:

So from what I'm understanding, the 6800K might be worth it over a non-overclocked 6700K, and a 7700K probably wouldn't do much better than the 6700K. Seem about right?

I put this together for him, plus or minus RAM/HDDs. I feel like anything is going to blow him away compared to the six-year-old laptop he's using now.

http://pcpartpicker.com/list/X86Z2R

Keep in mind he can still pick up a *new* 5820K for ~$100 less than a 6800K, for *maybe* a ~5% loss in performance across the board and a bit more heat, if he doesn't mind buying a CPU that launched in Q3 2014. There isn't a tremendous difference between the two - no significant architectural improvements, just a die shrink.

EDIT: Oh, and the 5820K can only support 64GB of memory while the 6800K can support 128GB. Figured I'd mention that.

Also, that ASRock board you have listed - the first review mentions that unless you get a board that has a pre-updated BIOS (or RMA the brand new board before even starting the build to get one that does), it won't boot with a Broadwell-E chip.

BIG HEADLINE fucked around with this message at 21:47 on Nov 3, 2016

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

The Slack Lagoon posted:

He's using Avid Media Composer.

I'll recommend that but it's not likely he wants to go for that. He is really odd about wanting TOP TIER future proofing. Tried to explain that's not a thing.

He originally wanted the 6900 but I shut that down.

Echoing what canyoneer said about a used workstation. I used to be a systems engineer who designed post suites and labs, and was Avid certified at one point. While MC can run on a lot more hardware now, having a qualified system running the proper drivers will make a world of difference if your friend actually intends to make money doing real work.

http://avid.force.com/pkb/articles/en_US/user_guide/en269631?popup=true

Mr Chips
Jun 27, 2007
Whose arse do I have to blow smoke up to get rid of this baby?

The Slack Lagoon posted:

I'll recommend that but it's not likely he wants to go for that. He is really odd about wanting TOP TIER future proofing. Tried to explain that's not a thing.
TOP TIER future proofing? I recommend one of those Supermicro quad-socket Xeon E7 towers. A $35k 96-core box seems pretty future-proof.

gourdcaptain
Nov 16, 2012

BIG HEADLINE posted:

Also, that ASRock board you have listed - the first review mentions that unless you get a board that has a pre-updated BIOS (or RMA the brand new board before even starting the build to get one that does), it won't boot with a Broadwell-E chip.

Yeah, when I recently built a system with a 6800K I had to look out for that. And because I was crazy/stupid enough to build it in a Micro ATX case, I didn't have many motherboard options. Luckily the ASUS board I settled on has a feature called USB BIOS Flashback, where you put an update on a USB stick and flash the BIOS by pressing a button on the motherboard with just the power supply hooked up - no RAM or CPU installed. Downside was a motherboard manual so badly written I had to Google half the weird acronyms for overclocking features pretty clearly aimed at gaming (which a six-plus-core processor is a bad fit for), and the Secure Boot/UEFI settings in the config menus were documented precisely backwards.

Plus side, it's performing pretty well for rendering video (personal use) and large compile jobs (professional), and I got the CPU in a combo deal.
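Since Flashback runs with no CPU or RAM installed, nothing is going to catch a corrupt download for you, so it's worth checking the image before it goes on the stick. A minimal sketch (the file names are hypothetical, and ASUS boards typically also want the image renamed to a board-specific .CAP name):

```shell
# Verify a firmware image's checksum before copying it to the Flashback USB stick.
# firmware.CAP is a stand-in; vendors normally publish the hash alongside the download.
img=firmware.CAP
printf 'stand-in for the real image\n' > "$img"   # placeholder for the real download
sha256sum "$img" > "$img.sha256"                  # normally fetched from the vendor
sha256sum -c "$img.sha256" && echo "image OK"     # only copy to the stick if this passes
```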

ufarn
May 30, 2009

The Slack Lagoon posted:

A friend wants to build a desktop for video editing. He was looking at the 6800K - will there be a 6-core Kaby Lake?

Also, would Kaby vs. Skylake make a lot of difference for video editing?

How much of a difference would a 4-core i7 vs. a 6-core make for video editing?
Kaby Lake has built-in hardware support for 4K and various codecs. I assume this means a big performance boost, but at the very least it's way more power-efficient at processing video.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
He'll be waiting two months for Kaby Lake to hit the shelves, though. The official launch doesn't happen until January 4th.

Anime Schoolgirl
Nov 28, 2002

There will be no six-core Kaby Lake. You're gonna have to wait until Coffee Lake for that, est. Q4 2017, but chances are Intel will bullshit up a new socket just for it.

HMS Boromir
Jul 16, 2011

by Lowtax
I thought Coffee Lake was rumored for Q2 2018?

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

HMS Boromir posted:

I thought Coffee Lake was rumored for Q2 2018?

Wikipedia has it listed as "1H 2018," which could mean anywhere from January to June 2018. I can't imagine Intel's in much of a hurry.

craig588
Nov 19, 2005

by Nyc_Tattoo

BIG HEADLINE posted:

EDIT: Oh, and the 5820K can only support 64GB of memory while the 6800K can support 128GB. Figured I'd mention that.

Not true. The 5820K can handle 128GB. Unless there's some bug where it'll report 128GB but only actually use 64GB of it? I'm only using 16GB because the stuff I do doesn't demand memory at all, so I don't have firsthand experience, but all of the documentation says a max of 128GB. I probably could have gotten away with 8GB, but I'm not building a PC with 8GB of memory in 2016.

Anime Schoolgirl
Nov 28, 2002

All Haswell-E processors claim to support only 64GB, but actually support as many unbuffered sticks as you can throw at them.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

craig588 posted:

Not true. The 5820K can handle 128GB. Unless there's some bug where it'll report 128GB but only actually use 64GB of it? I'm only using 16GB because the stuff I do doesn't demand memory at all, so I don't have firsthand experience, but all of the documentation says a max of 128GB. I probably could have gotten away with 8GB, but I'm not building a PC with 8GB of memory in 2016.

I believe that the only downside of it officially supporting only 64GB is that if you complain about anything not working, Intel will tell you to pound sand if you're in an unsupported memory configuration.

EdEddnEddy
Apr 5, 2012



Twerk from Home posted:

I believe that the only downside of it officially supporting only 64GB is that if you complain about anything not working, Intel will tell you to pound sand if you're in an unsupported memory configuration.

Probably true - similar to SB-E and X79 not officially supporting PCIe 3.0, so you have to run a little script to enable it on every driver install. Gotta love the sorta-within-spec-but-not-official stuff they did with these server-core rejects for the enthusiast crowd.





gourdcaptain posted:

Yeah, when I recently built a system with a 6800k I had to look out for that. And because I was crazy/stupid enough to do it as a Micro ATX case I didn't have that many motherboard options. Luckily the ASUS one I settled on had a weird feature called BIOS Flashback where you can stick an update on a USB stick and flash the BIOS by pressing a button on the motherboard with just the power supply hooked up and not needing RAM or the CPU installed. Downside was a Mobo manual so badly written I had to google half the weird acronyms they had for overclocking stuff pretty clearly aimed at gaming (which a six+ core processor is a bad fit for) and the Secure Boot/UEFI settings in the config menus were written precisely backwards.

Plus side, it's performing pretty well for rendering video (personal use) and large compile jobs (professional), and I got the CPU in a combo deal.

I don't know how you figure a 6+ core overclocked chip isn't good for gaming. :colbert: It opens up the ability to not only game but encode/stream/whatever the hell you wanna do at the same time, with more cores to shuffle around. Until games use more than 4 cores, not having a bottleneck is a wonderful thing. Though running a VM and a bunch of tabs in Opera the other day, I somehow used up all 32GB of my memory.

I will admit, I have played games like WoT/War Thunder while encoding a Blu-ray-grade video (so almost everything is at 100% utilization) and everything rolls along just fine. It would probably work OK on a modern 4-core Skylake too, but still - for such an old 6-core, I was happy.

redeyes
Sep 14, 2002

by Fluffdaddy

Twerk from Home posted:

I believe that the only downside of it officially supporting only 64GB is that if you complain about anything not working, Intel will tell you to pound sand if you're in an unsupported memory configuration.

I totally went to the Intel page, looked at the supported memory config, and was like WTF?! Erred on the side of caution. This is for a semi-important 3D rendering box, so I suppose it was for the best. Still, 128GB works, eh?

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

redeyes posted:

I totally went to the Intel page, looked at the supported memory config, and was like WTF?! Erred on the side of caution. This is for a semi-important 3D rendering box, so I suppose it was for the best. Still, 128GB works, eh?

The main reason Haswell-E lists a 64GB max is that waaaayyy back in Aug 2014, when these chips launched, 8GB was the largest DDR4 DIMM that existed, so it was impossible to have more than 64GB of RAM with only 8 DIMM slots.

Intel certainly isn't going to go back and re-validate all of their old products every time RAM density increases.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

Anime Schoolgirl posted:

There will be no six-core Kaby Lake. You're gonna have to wait until Coffee Lake for that, est. Q4 2017, but chances are Intel will bullshit up a new socket just for it.

So Coffee Lake is yet another 14nm refinement which will be sold alongside 10nm Cannonlake. If Coffee Lake gets 6 cores, will Cannonlake?

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

ConanTheLibrarian posted:

So Coffee Lake is yet another 14nm refinement which will be sold alongside 10nm Cannonlake. If Coffee Lake gets 6 cores, will Cannonlake?

From what I've read, it looks like Coffee Lake will actually use the same new architecture as Cannonlake, but built on the existing 14nm process instead of the new 10nm one. It looks like Intel will only use the 10nm node where the power savings from 14nm to 10nm have the most impact: mobile, tablets, and mainstream laptops.

If that's true, then Intel is hedging its bets on the 10nm process and leaving desktops/high-end laptops on 14nm. That's probably why they bumped the core count up: you aren't getting a new node, so have 2 more cores.

I hope they also use this opportunity to clarify their product offerings a little better:

Pentium - 2 cores no hyperthreading
i3 - 2 cores w/hyperthreading
i5 - 4 cores w/ hyperthreading
i7 - 6 cores w/ hyperthreading

To answer your question: Yes, Cannonlake will get 6 cores since Coffee Lake is Cannonlake.

gourdcaptain
Nov 16, 2012

EdEddnEddy posted:

I don't know how you figure a 6+ core overclocked chip isn't good for gaming. :colbert: It opens up the ability to not only game but encode/stream/whatever the hell you wanna do at the same time, with more cores to shuffle around. Until games use more than 4 cores, not having a bottleneck is a wonderful thing. Though running a VM and a bunch of tabs in Opera the other day, I somehow used up all 32GB of my memory.

I will admit, I have played games like WoT/War Thunder while encoding a Blu-ray-grade video (so almost everything is at 100% utilization) and everything rolls along just fine. It would probably work OK on a modern 4-core Skylake too, but still - for such an old 6-core, I was happy.
Fair enough, although that's not the use case I was thinking of. I'd have to pin my encoding tools to fewer than six cores, because the experience was less than optimal when I tried it earlier with any game made in the last five years (although I'm admittedly not overclocking this machine at all). Gaming's a bit moot anyway on the higher end, because all I've got in this machine is an RX 460 (since 90%+ of my GPU-requiring gaming is Dolphin and PCSX2). Also, I'm running Linux, which rather limits the gaming surface. And I only got that GPU because the CPU has no integrated graphics and there was a combo deal. (I've wanted a system with an Iris Pro for years because it's the perfect level of GPU for my needs... and now they've discontinued it before I found a CPU with one that I could otherwise get along with.)
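For what it's worth, pinning an encode to a subset of cores is a one-liner on Linux with taskset. A sketch, with the ffmpeg invocation as a stand-in for whatever encoder is actually in use:

```shell
# Build the CPU affinity mask for cores 0-3 (binary 1111 = 0xf) and show it;
# the commented lines are the illustrative part that would pin a real encode.
mask=$(printf '0x%x' $(( (1 << 4) - 1 )))
echo "$mask"   # prints 0xf
# taskset takes either a mask or a core list, e.g.:
#   taskset -c 0-3 ffmpeg -i input.mkv -c:v libx264 output.mkv
# nice keeps it further out of the game's way:
#   nice -n 19 taskset "$mask" ffmpeg -i input.mkv -c:v libx264 output.mkv
```

That leaves the remaining cores free for a game or desktop work while the encode grinds away in the background.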

EdEddnEddy
Apr 5, 2012



Well, in the flip case then: the X79/X99 chips support most of the important VM features, so you could virtualize a Windows 10 OS and run your games on that just fine while the Linux side does something else.

gourdcaptain
Nov 16, 2012

EdEddnEddy posted:

Well, in the flip case then: the X79/X99 chips support most of the important VM features, so you could virtualize a Windows 10 OS and run your games on that just fine while the Linux side does something else.

True. But I'd need a second graphics card to pass through to the VM, a separate monitor for that VM (and I certainly don't have room on my desk for a third monitor beyond the two I want for Linux work), and the MicroATX case certainly would be even less fun to work that into (or I could go full ATX, I guess). I don't play enough recent games to make it worth it, given most of my gaming is on emulators natively available on Linux (and most don't even use the GPU beyond scaling the image). Dolphin's the only thing pushing the GPU enough to struggle on some things right now, and that's going to work out once the Vulkan drivers for my card settle down.
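As an aside, the passthrough route also assumes VT-d is working end to end. A rough Linux preflight looks something like this (results are machine-dependent, and the IOMMU group count is only nonzero once something like intel_iommu=on is on the kernel command line):

```shell
# Rough VFIO/GPU-passthrough preflight (Linux; output depends on the machine).
# 1) CPU virtualization extensions (vmx = Intel VT-x, svm = AMD-V):
grep -qm1 -E 'vmx|svm' /proc/cpuinfo && echo "virt extensions: present"
# 2) Count IOMMU groups; 0 means the IOMMU is disabled or unsupported:
ls -d /sys/kernel/iommu_groups/* 2>/dev/null | wc -l
```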

is that good
Apr 14, 2012

Krailor posted:

I hope they also use this opportunity to clarify their product offerings a little better:

Pentium - 2 cores no hyperthreading
i3 - 2 cores w/hyperthreading
i5 - 4 cores w/ hyperthreading
i7 - 6 cores w/ hyperthreading
Good luck with that; AnandTech is claiming that desktop Kaby Lake has a 2-core Pentium with hyperthreading whose only distinction from the i3s is 3MB of cache instead of 4MB.
E: no they're not, and I'm not sure why I thought they were. My bad

is that good fucked around with this message at 01:16 on Nov 5, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BIG HEADLINE posted:

There won't be a consumer hexacore until Coffee Lake, ~2 years (maybe 18 months) from now. That being said, the cheaper option than going Broadwell-E on an X99 board is to get a 6700K, a quad core with Hyperthreading. It's still a quad core, but HT presents another four logical cores, which most higher-quality video editing programs can take advantage of.

Kaby Lake will be ~10% faster than Skylake, but for video the main improvement it'll grant is in *decoding*, not *encoding*.

The 6800K will be the faster option, but there *are* reviews out there that will show him if the price difference will be worth it.

Someone asked a similar question on Reddit: https://www.reddit.com/r/buildapc/comments/4xos0i/video_editing_cpu_6700k_6800k_or_5820/

I think there's a Skylake-X coming in Q2 2017 that has the HEDT chips?

If not: I was on the fence about whether to upgrade now or wait, and either way right now I'm feelin' real good about my 5820K purchase :smug:

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
I did some reading up on Kaby Lake, and it seems the 4K video benefits go both ways - decoding *and* encoding.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BIG HEADLINE posted:

I did some reading up on Kaby Lake, and it seems the 4K video benefits go both ways - decoding *and* encoding.

Sure. HEVC Main10 looks poised to be the official standard for 4K Blu-ray, and Kaby Lake is the first generation whose Quick Sync supports it for both decoding and encoding. Huge improvements across the board. That's why I'm holding off on upgrading that TV PC until we get the Kaby Lake equivalent of an i3-6100 or G4400.

Also, IIRC, starting with Skylake you don't even need to be using the iGPU for Quick Sync to work; you should be able to use Quick Sync with the display attached to a discrete GPU.
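For reference, the encode side of that looks roughly like the following through ffmpeg's Quick Sync wrapper. This assumes an ffmpeg built with libmfx and hardware with 10-bit HEVC support; the file names and bitrate are placeholders:

```shell
# Sketch of a hardware HEVC Main10 encode via Quick Sync (hevc_qsv).
# Assumes ffmpeg was built with --enable-libmfx and the iGPU supports 10-bit HEVC.
in=input.mp4
out=uhd_main10.mkv
set -- ffmpeg -i "$in" -c:v hevc_qsv -profile:v main10 -pix_fmt p010le -b:v 8M "$out"
echo "$@"   # print the assembled command rather than running it here
```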

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
VP9 is better :colbert:

EdEddnEddy
Apr 5, 2012



gourdcaptain posted:

True. But I'd need a second graphics card to pass through to the VM, a separate monitor for that VM (and I certainly don't have room on my desk for a third monitor beyond the two I want for Linux work), and the MicroATX case certainly would be even less fun to work that into (or I could go full ATX, I guess). I don't play enough recent games to make it worth it, given most of my gaming is on emulators natively available on Linux (and most don't even use the GPU beyond scaling the image). Dolphin's the only thing pushing the GPU enough to struggle on some things right now, and that's going to work out once the Vulkan drivers for my card settle down.

Ah but have you tried Dolphin VR yet? :vrfrog:

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

I don't see you releasing the VP9 patent into the public domain. Also you're not my real dad and you can't tell me what to like. :colbert:

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Paul MaudDib posted:

I don't see you releasing the VP9 patent into the public domain. Also you're not my real dad and you can't tell me what to like. :colbert:

...how do we still only have the same old libvpx? Come on.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


Wait what?

I'm under the impression Coffee Lake comes out next year (2017) and then it's Ice Lake (2018).

gourdcaptain
Nov 16, 2012

EdEddnEddy posted:

Ah but have you tried Dolphin VR yet? :vrfrog:

I lack the gear, the desire, and the ability to experience most kinds of 3D effects effectively, or without crippling eye strain or nausea. Except for the 3DS, weirdly enough.

The Slack Lagoon
Jun 17, 2008



Suggested he wait for the Kaby Lake 4-core i7, and he said naw, I really need more cores. v0v

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
If six cores show up outside the HEDT context, I sure hope eight cores move down the price ladder. A big-rear end premium for quad-channel memory alone, while still doing away with integrated graphics, isn't cool.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
im almost glad CPU improvements have begun to stagnate; seeing how little of an upgrade Kaby Lake is makes me feel a lot better about building a Skylake PC 3 months back

Potato Salad
Oct 23, 2014

nobody cares


It's also helping me justify going for higher-end SKUs, as they will remain high-end for, well, at least half a decade at this point.

HMS Boromir
Jul 16, 2011

by Lowtax
Same here, CPU stagnation and an unexpected sale got me to spring for a 6600K. Given how long it takes me to get around to buying most AAA games and how few of them tend to be really CPU-heavy to begin with, I plan to laugh all the way to 2025. Or a motherboard failure.

HMS Boromir fucked around with this message at 16:33 on Nov 5, 2016

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Combat Pretzel posted:

If six cores show up outside the HEDT context, I sure hope eight cores move down the price ladder. A big-rear end premium for quad-channel memory alone, while still doing away with integrated graphics, isn't cool.

This is why we really need Zen to be at least somewhat competitive: to keep Intel from completely raping our wallets.


Anime Schoolgirl
Nov 28, 2002

They're still going to do that; Intel has enjoyed a "premium" reputation no matter where you look, even back in the Athlon XP days.
