Actuarial Fables
Jul 29, 2014

Taco Defender
Thanks for the advice, fishmech and Eletriarnation, I really appreciate it. I'll shift my search from Atom parts to a cheap Xeon pre-built and add those deal sites to my bookmarks.


ehnus
Apr 16, 2003

Now you're thinking with portals!

Combat Pretzel posted:

Haha, what the poo poo?!

"Free optimization" probably just means money.

Sort of. The situations I've heard of involve the game source going to Intel (or other third parties) for them to work their magic at optimizing. Or Intel sending people to the game team to sit with their developers and work.

It's free developer time focused only on making the game look better or go faster. Which is kind of like free money.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Actuarial Fables posted:

Thanks for the advice, fishmech and Eletriarnation, I really appreciate it. I'll shift my search from Atom parts to a cheap Xeon pre-built and add those deal sites to my bookmarks.

Sure thing. The Lenovo system I mentioned, the TS140, is on eBay (sold by Newegg, actually) for $200 right now with an i3-4150. Not quite as good IMO as paying $50 more for the quad-core, but if you're confident a fast dual-core would be good enough, this is a great deal. Note if you care about ECC that all dual-core desktop chips from Haswell onward support it since they discontinued the dual-core Xeon alternative after Ivy Bridge; the only feature that they're lacking that a consumer might care about is VT-d. Consumer quads still have ECC locked out to get you to buy E3 Xeons instead.

New Zealand can eat me
Aug 29, 2008

:matters:


Shimrra Jamaane posted:

It's disappointing that the 7700k still doesn't seem like a worthwhile upgrade over my OC 2500k. I want to build a whole new PC already but I might as well continue to wait as SSDs continue to fall in price.

Are there any motherboards with more than one USB 3.1C port?

I'm hitting a lot of instances now, especially with the recent AMD update (which fixed microstutter/frame time issues for Fury), where GPU frame time is typically <10ms and CPU (2600k@4.8) frame time is closer to 12-15ms. The only upgrade I'm aware of is a 3770k; I'm not actively looking, but if I find a good deal on a used one I'd consider it. Most benchmarks I've found seem to indicate this would be good for 1-5% overall, but they were all done at a time when 1440p wasn't really common. I also suspect the benefit would be greater in modern DX12/Vulkan titles.

There's nothing better than a 3770k that would work with the P8Z68-V PRO, correct?

craig588
Nov 19, 2005

by Nyc_Tattoo
Good luck getting a 3770K that'd hit even 4.6GHz. At 4.6GHz you'd be lucky to tie the performance of a 4.8GHz Sandy Bridge, forget about beating it. I really don't think you should be very CPU-limited with a 4.8GHz Sandy Bridge. I went from a 4.6GHz Sandy Bridge to a 4.7GHz Haswell, but I keep it at only 4.3GHz for the temperature benefits (doesn't break 70C during Prime), and it feels like it performs the same as the 4.6GHz Sandy Bridge, which is to say it's no issue at all for game performance. I got the Haswell because I needed another computer and might as well buy something new and conga the old PC down. I think any upgrade would put you in a very similar place performance-wise, and high CPU times are probably a coding problem with the game rather than something that could be easily solved with more processor power.

The best reason for enthusiasts with highly clocked Sandy Bridges to get new processors now is when we want an additional PC; there's no sense in buying old stuff when you need more PCs, but it's still really hard to justify processor upgrades.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
The only thing putting a 3770K into a Z68 does is "upgrade" your PCIe to 3.0 spec, assuming you have a "GEN3" board (if not, I don't think you *can* put a 3-series CPU in them). So a bit more speed out of an M.2 SSD that you can't boot from unless you hack your BIOS microcode.

My intent to upgrade to Skylake-X isn't born of a belief that it'll be a huge upgrade so much as the fact that eventually these 2500/2600Ks have to start failing, plus I've gotten away with only spending $450 (for a new case and a 970) over the initial buy-in of ~$1800. I've never had a system go *that far* value-wise before.

BIG HEADLINE fucked around with this message at 06:25 on Dec 10, 2016

Shimrra Jamaane
Aug 10, 2007

Obscure to all except those well-versed in Yuuzhan Vong lore.
Guess I'll wait until 2018 to get disappointed by Coffee Lake and begin the cycle anew.

New Zealand can eat me
Aug 29, 2008

:matters:


Seeing a lot of people having no issues hitting 4.8GHz @ 1.33-1.38v with water cooling. I'm sure that DDR3-1600 and the additional 4.3GB/s of memory bandwidth (and support for the F16C instruction set, which both the Xbox One and PS4 processors implement) would have a pretty significant impact on max CPU frame times @ 1440p.
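
For reference, that 4.3GB/s figure checks out if you assume dual-channel DDR3 going from 1333 to 1600 MT/s over 64-bit (8-byte) channels; a quick sketch:

```python
def ddr3_bandwidth_gbps(mt_per_s, channels=2):
    """Peak theoretical bandwidth in GB/s: transfers/s x 8 bytes per 64-bit channel x channels."""
    return mt_per_s * 8 * channels / 1000

old = ddr3_bandwidth_gbps(1333)  # ~21.3 GB/s (DDR3-1333, dual channel)
new = ddr3_bandwidth_gbps(1600)  # 25.6 GB/s (DDR3-1600, dual channel)
print(round(new - old, 1))       # ~4.3 GB/s extra, matching the figure above
```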


Fake edit: found some benchmarks where people are seeing <12ms max cpu frame times at the same settings in Do4m, and <11ms in DiRT Rally (the two games I care about). That should translate to ~83 and ~91 FPS, as opposed to bouncing off of 60. 90 is especially relevant in DiRT because of VR. Anything less is... not good :barf:
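
Those FPS figures are just the reciprocal of the frame times; a trivial sketch of the conversion:

```python
def fps_from_frametime_ms(ms):
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / ms

print(round(fps_from_frametime_ms(12)))  # 83 FPS, from the <12ms max CPU frame time
print(round(fps_from_frametime_ms(11)))  # 91 FPS, just over the 90 FPS VR target
```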

I'm not saying this is worth $250 to me, but I've made a little bit more than that from renting my CV1 out to friends so it's something I'm considering, because I don't really use it much otherwise. I can't deny the extreme value I've gotten from my 2600k, but there are definitely at least a handful of niche cases where the upgrade is justified. That said, I'll probably end up waiting, maybe Coffee Lake will be worth it (lol) and I'll be able to get one for less.

SlayVus
Jul 10, 2009
Grimey Drawer
Wait till Cannon Lake, the 10nm version of Coffee Lake.

NihilismNow
Aug 31, 2003

Eletriarnation posted:

Note if you care about ECC that all dual-core desktop chips from Haswell onward support it since they discontinued the dual-core Xeon alternative after Ivy Bridge; the only feature that they're lacking that a consumer might care about is VT-d. Consumer quads still have ECC locked out to get you to buy E3 Xeons instead.

Do you still have to go with the C series chipsets, because I assume consumer boards lack ECC support?

Anime Schoolgirl
Nov 28, 2002

NihilismNow posted:

Do you still have to go with the C series chipsets, because I assume consumer boards lack ECC support?
yes, hope you enjoy that 100 dollar premium

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

NihilismNow posted:

Do you still have to go with the C series chipsets, because I assume consumer boards lack ECC support?

Yes. If you desperately want ECC you should either buy a socket AM1 board (I think the ASUS ones support it, at least), get a cheap prebuilt server, or get ready to pay through the nose for a self-built Intel system. It's kind of nonsensical because, from what I understand, all of that takes place in the memory controller and the board basically just needs an option to tell the processor to turn it on, but I guess they see it as a way to make more money, and that's what matters.

Intel also doesn't allow you to use E3 Xeons in consumer-chipset boards anymore (you could with Socket 1155 and 1150), not that you have much reason to beyond ECC, since Skylake consumer chips even support VT-d.

Eletriarnation fucked around with this message at 18:12 on Dec 10, 2016

silence_kit
Jul 14, 2011

by the sex ghost

Eletriarnation posted:

Yes. If you desperately want ECC you should either buy a socket AM1 board (I think the ASUS ones support it, at least), get a cheap prebuilt server, or get ready to pay through the nose for a self-built Intel system. It's kind of nonsensical because, from what I understand, all of that takes place in the memory controller and the board basically just needs an option to tell the processor to turn it on, but I guess they see it as a way to make more money, and that's what matters.

LOL, everyone here is constantly bitching about how the new computer chips they desperately want to buy aren't as fast as they'd like, but then they simultaneously bitch about the price of high-end products not targeted towards consumers.

Computer chip companies would go bankrupt and would not be able to afford the development of new and better products and technologies if they only charged marginal production cost + 30% profit for their products, as the posters in this thread demand. Computer chip development and production set-up are insanely expensive, and the high Xeon prices help bankroll that development.

silence_kit fucked around with this message at 18:52 on Dec 10, 2016

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
As an engineer who works for a hardware vendor with substantial gross margins on a lot of its products, I get that R&D can be very expensive and has to be effectively amortized out through sales to have a sustainable business model.

However, if Intel opened up ECC support for chips that already support it (Celerons, Pentiums, and i3s - not even i5s or i7s, though I am sure the capability is there too) in consumer motherboards I'm not really sure that it would hurt their sales of Xeon systems, which are all 4+ cores, in any substantial way. Do you think there are a lot of businesses that would be fine with a dual-core server but are buying something a lot bigger because they want ECC?

Eletriarnation fucked around with this message at 18:59 on Dec 10, 2016

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Unfortunately, since we're just consumers, we can't really see buying things like Xeons and upcoming-generation processors as a way of funding the R&D for products that we would consider fast enough for the money. It's not so much schizophrenia as an unwillingness to place that much dollar value on products that aren't targeted at home consumers/small-time firms anymore to begin with.

e: grammar

Sidesaddle Cavalry fucked around with this message at 19:03 on Dec 10, 2016

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

Eletriarnation posted:

As an engineer who works for a hardware vendor with substantial gross margins on a lot of its products, I get that R&D can be very expensive and has to be effectively amortized out through sales to have a sustainable business model.

However, if Intel opened up ECC support for chips that already support it (Celerons, Pentiums, and i3s - not even i5s or i7s, though I am sure the capability is there too) in consumer motherboards I'm not really sure that it would hurt their sales of Xeon systems, which are all 4+ cores, in any substantial way. Do you think there are a lot of businesses that would be fine with a dual-core server but are buying something a lot bigger because they want ECC?

Yes. See expensive per-core DB/application licensing schemes.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

silence_kit posted:

LOL, everyone here is constantly bitching about how the new computer chips they desperately want to buy aren't as fast as they'd like, but then they simultaneously bitch about the price of high-end products not targeted towards consumers.

Computer chip companies would go bankrupt and would not be able to afford the development of new and better products and technologies if they only charged marginal production cost + 30% profit for their products, as the posters in this thread demand. Computer chip development and production set-up are insanely expensive, and the high Xeon prices help bankroll that development.

Lol

silence_kit
Jul 14, 2011

by the sex ghost

It's totally true though. If Intel had set the price for all their computer chips to be the marginal production cost x 1.3 (maybe this number is something like $1 - $20), they wouldn't have been able to bankroll development of new chips and new chip technology, and then the posters in this thread would be throwing even bigger tantrums than they already do about how they think computer chips should go faster than they currently do.

silence_kit fucked around with this message at 00:32 on Dec 11, 2016

Anime Schoolgirl
Nov 28, 2002

silence_kit posted:

It's totally true though. If Intel had set the price for all their computer chips to be the marginal production cost x 1.3 (maybe this number is something like $1 - $20), they wouldn't have been able to bankroll development of new chips and new chip technology, and then the posters in this thread would be throwing even bigger tantrums than they already do about how they think computer chips should go faster than they currently do.
ah yes, consumer desktop chips, known for their immense sales volume of $(rear end pennies), not like the multi-billion dollar datacenter xeon market that's completely and utterly insignificant, surely it is worth making the former pay 200 dollars for the privilege of not having ram errors

silence_kit
Jul 14, 2011

by the sex ghost

Anime Schoolgirl posted:

ah yes, consumer desktop chips, known for their immense sales volume of $(rear end pennies), not like the multi-billion dollar datacenter xeon market that's completely and utterly insignificant, surely it is worth making the former pay 200 dollars for the privilege of not having ram errors

Their sales volume is still pretty high--I think that most of Intel's revenue may still be in their non-server products. It's obviously not as profitable as server chips, and since sales of PCs are going down that will probably change in the future.

Whining about how computer chip companies disable functionality in their chips is like complaining about how software companies charge money for their products when 'bits are free man'.

Anime Schoolgirl
Nov 28, 2002

~98% of those consumer products are laptops or embedded boards; lol if you think those leaky runt shits they sell as desktop chips sell by the tens of millions.

ARM and AMD have a "why the gently caress not" policy for ECC support for the most part; Intel is just doing this because they have an effective monopoly on mindshare and performance.

Anime Schoolgirl fucked around with this message at 01:18 on Dec 11, 2016

Kazinsal
Dec 13, 2011
HardOCP reviews the i7-7700K.

TL;DR same IPC, same perf/clock, same stock clocks. 10-20W less under heavy load.

Zen better not be a dumpster fire, save us AMD

Anime Schoolgirl
Nov 28, 2002

Kazinsal posted:

HardOCP reviews the i7-7700K.

TL;DR same IPC, same perf/clock, same stock clocks. 10-20W less under heavy load.

Zen better not be a dumpster fire, save us AMD
i'm the kaby lake OC load wattage that is 35-45w lower than what all other sites are reporting

keep that pentium 4 salt going kyle bennett

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Proud Christian Mom posted:

Yes. See expensive per-core DB/application licensing schemes.

I have to confess, I don't see the connection. Why does the existence of per-core licensing for enterprise software suggest that dual-core processors would be adequate for a substantial number of people who currently buy larger systems for enterprise purposes?

silence_kit posted:

Whining about how computer chip companies disable functionality in their chips is like complaining about how software companies charge money for their products when 'bits are free man'.

It's really not, because I'm still happy to pay Intel for their product but I don't see why I need a "server" chipset to enable ECC when I know drat well that the memory controller doing the work is in the processor. I also don't think they have to do a lot of ongoing development around ECC or that it's a hot new feature that they still need to pay off the bills for developing.

Eletriarnation fucked around with this message at 02:33 on Dec 11, 2016

Gwaihir
Dec 8, 2009
Hair Elf

Anime Schoolgirl posted:

ah yes, consumer desktop chips, known for their immense sales volume of $(rear end pennies), not like the multi-billion dollar datacenter xeon market that's completely and utterly insignificant, surely it is worth making the former pay 200 dollars for the privilege of not having ram errors

The desktop group at Intel brings in double the datacenter group's revenue.
Intel doesn't break down their net income by group, though, so specific profit centers are :iiam:

Kazinsal posted:

HardOCP reviews the i7-7700K.

TL;DR same IPC, same perf/clock, same stock clocks. 10-20W less under heavy load.

Zen better not be a dumpster fire, save us AMD

I mean, save us from what? Optimistically, they put out a chip that matches Broadwell or Skylake. And... it might get like a $20 price drop out of Intel. Hoooraaay??

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Given that Kaby Lake is lackluster, and Coffee Lake will likely be too, if Zen happens to hit close to Skylake for way cheaper and with more features (like ECC... hue hue hue!, or 2S workstations on the cheap, or 8 cores for less than what HEDT offers), then it would light a fire under Intel's rear end, which is a good thing.

Gwaihir
Dec 8, 2009
Hair Elf
I think you're just vastly overestimating how much fire could possibly be lit. If Intel suddenly had to upend its architecture or something radical, it's not like we'd see the effect for years.
Like, yea, they could re-badge some E3-series Xeons as Core i5/7s and toss in more of the things like ECC or VT-d or the other random chipset features, but it's not like that gets us lower frame times in games or something, and the market of people who care about the other stuff, well, lol.

(That and I seriously doubt that AMD is going to sell a chip that performs in line with its Intel equivalent for like $100 less.)

IanTheM
May 22, 2007
He came from across the Atlantic. . .

Gwaihir posted:

I think you're just vastly overestimating how much fire could possibly be lit. If Intel suddenly had to upend its architecture or something radical, it's not like we'd see the effect for years.
Like, yea, they could re-badge some E3-series Xeons as Core i5/7s and toss in more of the things like ECC or VT-d or the other random chipset features, but it's not like that gets us lower frame times in games or something, and the market of people who care about the other stuff, well, lol.

(That and I seriously doubt that AMD is going to sell a chip that performs in line with its Intel equivalent for like $100 less.)

Intel's had a gigantic monopoly for years now; they power almost every PC, and their only competition is smartphones and alternative server architectures. Their margins on x86 CPUs are probably really fat, to keep supplying themselves with enough money to maintain their R&D lead. AMD at least causes Intel to reassess certain things, and shapes up the x86 market enough that maybe 6-8 cores become super mainstream. Anyway, AMD's also gotten back into gear at the right moment, since x86 has hit such a roadblock that Intel can't steam much further ahead anyway.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Gwaihir posted:

I think you're just vastly overestimating how much fire could possibly be lit. If Intel suddenly had to upend its architecture or something radical, it's not like we'd see the effect for years.
Like, yea, they could re-badge some E3-series Xeons as Core i5/7s and toss in more of the things like ECC or VT-d or the other random chipset features, but it's not like that gets us lower frame times in games or something, and the market of people who care about the other stuff, well, lol.

(That and I seriously doubt that AMD is going to sell a chip that performs in line with its Intel equivalent for like $100 less.)

But they could be doing something innovative like giving us a giant gently caress-off cache, which measurably helps in games; look at the i7-5775C. Something with an even faster cache and higher clocks would no doubt help smooth out frame times and would likely be a compelling upgrade for many.

HalloKitty fucked around with this message at 21:22 on Dec 11, 2016

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I suppose some eSRAM kind of thing would be an advantage, but it'd require some operating system support to lock down access to a single application exclusively.

Gwaihir
Dec 8, 2009
Hair Elf

Combat Pretzel posted:

I suppose some eSRAM kind of thing would be an advantage, but it'd require some operating system support to lock down access to a single application exclusively.

It's definitely a huge advantage, like HalloKitty mentioned the 5775C has been the best gaming CPU upgrade in forever.

(Certainly far more effective in raising minimum frame rates than going to 6 or 8 core chips)

Gwaihir fucked around with this message at 19:39 on Dec 11, 2016

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Gwaihir posted:

It's definitely a huge advantage, like HalloKitty mentioned the 5775C has been the best gaming CPU upgrade in forever.

(Certainly far more effective in raising minimum frame rates than going to 6 or 8 core chips)

It does raise the question of whether the reason Intel hasn't mainstreamed this (rather than burying it in non-LGA Skylake-Rs) is that it would instantly make the rest of their consumer SKUs look like poo poo. Somehow I think we won't see it put out for general 'consumption' until Cannonlake, or *hopefully* Skylake-X. Maybe that'll be Kaby-X's 'ace in the hole': extremely high clocks mated with a ton of eDRAM. Since it'll be a Socket 2066 chip, there'll be plenty of room for 'extras' on the package.

BIG HEADLINE fucked around with this message at 21:06 on Dec 11, 2016

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
So anyway, isn't SRAM way faster than DRAM? Wouldn't it make more sense to make the existing L3 cache considerably bigger instead of adding L4 DRAM? You probably wouldn't get an additional 128MB out of it, since SRAM cells are larger than DRAM cells (I think), but shouldn't the speed difference make up for that?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gwaihir posted:

It's definitely a huge advantage, like HalloKitty mentioned the 5775C has been the best gaming CPU upgrade in forever.

(Certainly far more effective in raising minimum frame rates than going to 6 or 8 core chips)

Was the 5775C really that much better than the 4790K? I'm not seeing it in reviews, and from what I remember it was considered a pretty disappointing architecture overall unless you were gaming on iGPU or really sensitive to power consumption. Yes, IPC improved marginally over Haswell and it consumed less power, but it also overclocked significantly more poorly than Haswell did so overall you were losing total performance.

Paul MaudDib fucked around with this message at 00:42 on Dec 12, 2016

GRINDCORE MEGGIDO
Feb 28, 1985


HalloKitty posted:

But they could be doing something innovative like giving us a giant gently caress-off cache, which measurably helps in games; look at the i7-5775C. Something with an even faster cache and higher clocks would no doubt help smooth out frame times and would likely be a compelling upgrade for many.

That's what I'm talking about. Also more pcie lanes.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BIG HEADLINE posted:

It does beg the question of whether the reason Intel hasn't mainstreamed this (rather than burying it in non-LGAed Skylake-Rs) is because it'll instantly make the rest of their consumer SKUs look like poo poo. Somehow I think we won't see it put out for general 'consumption' until Cannonlake, or *hopefully* Skylake-X. Maybe that'll be Kaby-X's 'ace in the hole,' extremely high clocks mated with a ton of eDRAM, since it'll be a Socket 2066 chip, there'll be plenty of room for 'extras' on the package.

I went back and dug up reviews on the 5775C. I think you guys are remembering it with rose-colored glasses here.

quote:

In our first look at Broadwell on the desktop, our recommendation that it would only appeal to those who need the best integrated graphics solution regardless of cost still stands. Part 2 has revealed that clock-for-clock, Broadwell gives 3.3% better performance from our tests although DRAM focused workloads (WinRAR) can benefit up to 25%, although those are few and far between. If we compare it back several generations, that small IPC gain is wiped out by processors like the i7-4790K that overpower the CPU performance in pure frequency or even the i7-4770K which still has a frequency advantage. From an overall CPU performance standpoint out of the box, the i7-5775C sits toe-to-toe with the i7-4770K with an average 1% loss. However, moving the comparison up to the i7-4790K and due to that frequency difference, the Broadwell CPU sits an average 12% behind it, except in those specific tests that can use the eDRAM.

There’s nothing much to be gained with overclocking either. Our i7-5775C CPU made 4.2 GHz, in line with Intel’s expectations for these processors. If we compare that to an overclocked 4.6 GHz i7-4790K, the 4790K is still the winner. Overclocking on these Broadwell CPUs still requires care, due to the arrangement of the CPU under the heatspreader with the added DRAM. We suggest the line method of thermal paste application rather than the large-pea method as a result.
http://www.anandtech.com/show/9482/intel-broadwell-pt2-overclocking-ipc/11

Based on the mediocre performance of the 5775C, I think it's more likely because it was super expensive for Intel to build, and was not an overall win for performance unless you were using the iGPU. Laptop parts use the iGPU, so they get the eDRAM; desktop is almost certainly using a discrete GPU, so it gets no eDRAM and higher clock rates at a lower price.

Take a look at Skylake: they dropped the eDRAM from the desktop parts and, boom, clock speeds are up a full 800MHz to 5GHz or so, plus you keep the IPC gains.
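
A rough sanity check of that tradeoff, treating single-thread performance as IPC times clock (the 3.3% IPC figure is from the AnandTech quote above; the 4.2GHz and 4.6GHz overclocks are typical figures, not guarantees):

```python
def relative_perf(ipc_ratio, clock_a_ghz, clock_b_ghz):
    """Rough single-thread performance of chip A relative to chip B,
    assuming performance scales as IPC x clock."""
    return ipc_ratio * (clock_a_ghz / clock_b_ghz)

# Overclocked 5775C (4.2GHz, +3.3% IPC) vs overclocked 4790K (4.6GHz)
r = relative_perf(1.033, 4.2, 4.6)
print(f"{(r - 1) * 100:+.1f}%")  # -5.7%: Broadwell still loses once both are overclocked
```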

Paul MaudDib fucked around with this message at 01:01 on Dec 12, 2016

Gwaihir
Dec 8, 2009
Hair Elf
The big advantage with it was all about minimum framerates, it wasn't really a much faster chip outside of that.

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Gwaihir posted:

The big advantage with it was all about minimum framerates, it wasn't really a much faster chip outside of that.

Most tech outlets didn't report minimum frames in their testing, and going purely by average frame rate, nothing about Broadwell stood out. Scarcity at launch drove prices artificially high, which added to the chip going largely ignored. I got one at a good price, but I hadn't previously been looking for one. I can't really say too much about its performance against the 4790K because I haven't benchmarked anything, but overall it runs much cooler, which is somewhat important in my mITX case.

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Combat Pretzel posted:

So anyway, isn't SRAM way faster than DRAM? Wouldn't it make more sense to make the existing L3 cache considerably bigger instead of adding L4 DRAM? You probably wouldn't get an additional 128MB out of it, since SRAM cells are larger than DRAM cells (I think), but shouldn't the speed difference make up for that?

SRAM is way more expensive and on die. eDRAM is on package but not on die.


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gwaihir posted:

The big advantage with it was all about minimum framerates, it wasn't really a much faster chip outside of that.

I did some digging and I found TechReport's review of the 6700K where Scott Wasson points this out:

http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/6

The big thing to remember here is that this is at stock clocks, and that's Broadwell's turf. The problem with Broadwell is that it only manages a 13% overclock over stock, whereas the 6700K usually pushes a full 25% over stock. That's going to make a huge dent in minimum frame times. And Broadwell is doing pretty well if it hits 4.2GHz; some reviewers had problems getting Broadwell above 4.0GHz. Skylake is pretty much always going to hit 4.7GHz, so the minimum overclock is going to be a lot better too.
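
Putting numbers on those headroom percentages (a sketch assuming 3.7GHz and 4.0GHz as the stock reference clocks for the 5775C and 6700K respectively):

```python
def oc_headroom_pct(stock_ghz, oc_ghz):
    """Percentage gain of an overclock over the stock clock."""
    return (oc_ghz / stock_ghz - 1) * 100

# Broadwell 5775C: ~4.2GHz typical ceiling vs 3.7GHz stock
print(round(oc_headroom_pct(3.7, 4.2), 1))  # ~13.5%
# Skylake 6700K: ~5.0GHz vs 4.0GHz stock
print(round(oc_headroom_pct(4.0, 5.0), 1))  # 25.0%
```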

But, that is interesting for sure. It's definitely doing very well given its clockrate. If you aren't interested in overclocking it's probably a reasonable option. I still don't know if it's justified for future chips versus just having a higher clockrate though.

The biggest barrier for me, though: Microcenter doesn't carry the 5775C. :v: That makes it automatically a $100 price differential versus the 6700K.

Paul MaudDib fucked around with this message at 01:57 on Dec 12, 2016
