Atomizer
Jun 24, 2007



VulgarandStupid posted:

They are both mITX computers.

I think what he was saying is that stuffing your GPU into an external enclosure really just results in paying for redundant items. So you have your Razer Core that runs for $500. What is it really? It's an enclosure, a power supply and a USB hub. You can just build a mITX system, and it will be almost as small or just as small, and you'll only pay for the case and power supply once. You'll also have access to higher-end CPUs and better RAM. Which, if you think about it, is probably the smart decision, since the only three things that control performance (or more specifically gaming performance) are CPU, GPU and RAM. There is a little bit of a price premium for mITX builds, but realistically it ends up about even. Any mITX board worth its salt will have wifi included, so you're not paying extra for that. mITX cases also need fewer fans, because there is a smaller volume to cool. When you have gigantic cases, there are a bunch of dead areas, and if you don't stuff the case with fans, those dead areas will build up heat. So you spend a little more on the motherboard, but the case is just as cheap. You spend less on fans, and you probably spend less on a power supply, because once you realize you're never going to go dual GPU, you realize that 450W is more than enough and always will be.

The white and black cases are only ~$70 each; the silver case is a crowdfunded project so that's about $180.

I like that tall "Sword Art Online" :wtc: case; do you have any more info on it? I'm assuming it's just decorated that way, and I don't have a model number or anything else to look it up.

As others mentioned, the main use case for the Core is for a laptop. It totally makes sense to have your nice Ultrabook with you through the day, then bring it home to connect to your Core so you can game. The Core includes Ethernet as well as USB ports; if you could get enough power over the Type C cable that would be the only thing you'd need to connect to your laptop, which would be sweet.

That being said, it makes a little less sense to use a video card dock like the Core on any non-portable system; however, if future desktops trend towards supporting external graphics then it's still totally viable. The idea is that your standardized graphics dock can be swapped between your desktop AND your laptop as desired. It's universal now to have a desktop with an internal graphics card, but if the Core works, then... it just works. Realistically the most commonly upgraded component is probably the video card anyway, so as I described last time that's the only thing I'd ever need to upgrade before retiring a PC several years down the line.

You're right about mITX builds being the most well-rounded choice now, I won't argue with that. That's just not what I need or want at this point.

nostrata posted:

Mine just came in. Now to gently caress off the rest of the day setting it up.

Lol, it's like a 5-minute setup process (hardware, at least.)

It took me a while to figure out how to set up the RAID array (Ctrl-I). I'm debating whether the performance is worth the loss of SMART diagnostics and RAPID mode (Samsung Magician only sees the array and not the individual drives.)

The performance sure is nice though. ~1 TB of space and almost 1 GB/s read/write for ~$300 is pretty sweet.
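For a rough sanity check on that RAID 0 figure, striped throughput scales close to linearly with drive count. A quick sketch (the per-drive speed below is an illustrative SATA SSD figure, not a measurement from this build):

```python
# Rough RAID 0 throughput estimate: striping scales bandwidth
# nearly linearly with drive count, minus a little overhead.
def raid0_throughput(per_drive_mb_s, drives, efficiency=0.95):
    """Estimated aggregate MB/s for a RAID 0 array."""
    return per_drive_mb_s * drives * efficiency

# Two ~550 MB/s SATA SSDs (illustrative figure, e.g. an 850 EVO class drive):
print(raid0_throughput(550, 2))  # -> 1045.0, i.e. about 1 GB/s
```

So two SATA drives in RAID 0 land right around the 1 GB/s mark, at the cost of per-drive SMART visibility.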

nostrata posted:

If you are just rdp-ing in do you even need a monitor connected? Just use it for the initial setup and then put it in place with just power and data.

No, but I found out (using Chrome Remote Desktop) that the mouse cursor disappears on the remote desktop (Win10) if you don't have a physical mouse attached to the host. You don't need a keyboard or monitor at all though.

nostrata posted:

As for my Skull Canyon, it certainly looks a lot different sitting on my desk. Still trying to download some stuff and get it configured the way I like it. But the few games I have on it so far have run decently during their test runs. But I don't really play any newer games. I'll need to find more intense parts of the game to play to really test it out, but so far I'm pleased with it.
Only issue I have so far is that sometimes after the display shuts off I can't bring it back up. My monitor refuses to see the source on the HDMI input. I know the computer is running because I was able to LogMeIn to it. I suspect it's just a driver issue and I'll work it out later.

Did you manually update drivers after installing the OS? I did, and then I used the Intel Driver Update Utility; the utility detected that some updates weren't installed, except that they were. I'll wait a little while before revisiting that so they can work out the bugs and properly detect the installed drivers.

Oh yeah, here are some benchmark results for those interested: https://www.passmark.com/baselines/V8/display.php?id=61898251144


Assepoester
Jul 18, 2004
Probation
Can't post for 10 years!
Melman v2

Atomizer posted:

I like that tall "Sword Art Online" :wtc: case; do you have any more info on it? I'm assuming it's just decorated that way, and I don't have a model number or anything else to look it up.
Correct, the case is just an RVZ02.

http://www.silentpcreview.com/SilverStone_Raven_RVZ02

It doesn't come in white, it was custom painted that way.

https://www.reddit.com/r/buildapc/comments/3u19ne/build_complete_white_raven_asuna_silverstone/

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Atomizer posted:

I like that tall "Sword Art Online" :wtc: case; do you have any more info on it? I'm assuming it's just decorated that way, and I don't have a model number or anything else to look it up.

As others mentioned, the main use case for the Core is for a laptop. It totally makes sense to have your nice Ultrabook with you through the day, then bring it home to connect to your Core so you can game. The Core includes Ethernet as well as USB ports; if you could get enough power over the Type C cable that would be the only thing you'd need to connect to your laptop, which would be sweet.

That being said, it makes a little less sense to use a video card dock like the Core on any non-portable system; however, if future desktops trend towards supporting external graphics then it's still totally viable. The idea is that your standardized graphics dock can be swapped between your desktop AND your laptop as desired. It's universal now to have a desktop with an internal graphics card, but if the Core works, then... it just works. Realistically the most commonly upgraded component is probably the video card anyway, so as I described last time that's the only thing I'd ever need to upgrade before retiring a PC several years down the line.

You're right about mITX builds being the most well-rounded choice now, I won't argue with that. That's just not what I need or want at this point.

The Silverstone RVZ02 is the case I'm currently using. It's a nice case and supports a full-size GPU, but it doesn't support 3.5" drives. I ripped out the slimline DVD mounting point and shoved in a 3.5" drive, but it's loud, which is very annoying. Both the CPU and GPU go right up against filtered intakes and have separate thermal chambers, so it cools pretty well. It's also relatively cheap and is only 12 liters in size.

The Razer Core is a great step forward for PC gaming, as it will hopefully make gaming more accessible for laptop owners. It is, however, very expensive, and apparently drivers still need to be written at the device level, which seems backwards to me. It's also the first of its kind, if you don't count the Alienware Amplifier, which is actually quite a bit cheaper but hideously large and uses a proprietary connector. I hope competitors start putting out better and cheaper products though. I think it is way bigger than it needs to be; video cards don't need to be so fully encased. Just a mesh screen or something over them would be fine. Most users won't need a 400W power supply either: the new top-of-the-line GTX 1080 is only 180W, so 250W, leaving room for overclocking, would be more than enough. I think that might also bring things down to a level where you could have an external (power brick) power supply, and make the case even smaller. Most video cards are probably something like 1.5 liters; 8 liters to house a graphics card and a USB hub is too much. So a smaller power supply, less enclosure material and some healthy competition should bring both the size and costs down. That would hopefully get us away from nearly the size of a computer to something closer to a docking station.
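The size argument is easy to sanity-check with typical card dimensions (the measurements below are illustrative for a ~10.5" dual-slot card, not taken from any specific product):

```python
# Volume of a typical dual-slot card vs. an 8 L enclosure.
# Assumed dimensions: ~267 x 111 x 38 mm (roughly a 10.5" card).
def litres(l_mm, w_mm, h_mm):
    return l_mm * w_mm * h_mm / 1_000_000  # mm^3 -> litres

card = litres(267, 111, 38)
print(round(card, 2))       # ~1.13 L for the card itself
print(round(8 / card, 1))   # an 8 L enclosure is ~7x the card's volume
```

Which is the point: most of an 8 L dock is air, power supply, and shell, not graphics card.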

nostrata
Apr 27, 2007

Atomizer posted:



Lol, it's like a 5-minute setup process (hardware, at least.)

Did you manually update drivers after installing the OS? I did, and then I used the Intel Driver Update Utility; the utility detected that some updates weren't installed, except that they were. I'll wait a little while before revisiting that so they can work out the bugs and properly detect the installed drivers.

Well sure, putting everything together was simple, but I wanted to have it operational by the time I got home. Had a good time playing around with it after the kids went to bed. Played some games at 1080p with med/high quality and didn't notice any framerate issues, but I wasn't really tracking it. Maybe tomorrow I'll actually monitor it, but I was pleased with the performance. I didn't experience the monitor issue again tonight, but I'll try that update utility in the morning and keep an eye on it.

Games I played were Borderlands 2, Diablo 3, and Marvel Heroes, for those interested. I know those aren't particularly taxing and are fairly old games at this point, but that's what I spend most of my gaming time on.

Potato Salad
Oct 23, 2014

nobody cares



I like small cases. A lot.

That case is giving me a severe case of :gizz:

My wife and I are buying a new PC and I'm juggling the value of $30 in the price difference between i5 6500 and 6600. Graaaaaah.

It's not that money is an issue -- it's a matter of principle (bang vs buck).

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

Potato Salad posted:

I like small cases. A lot.

That case is giving me a severe case of :gizz:

My wife and I are buying a new PC and I'm juggling the value of $30 in the price difference between i5 6500 and 6600. Graaaaaah.

It's not that money is an issue -- it's a matter of principle (bang vs buck).

Flip a coin because wasting time on deciding whether to spend an insignificant amount of money by itself lowers your bang:buck ratio.

HMS Boromir
Jul 16, 2011

by Lowtax
Obviously you take the third option and buy a 6600K. :getin:

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Potato Salad posted:

I like small cases. A lot.

That case is giving me a severe case of :gizz:

My wife and I are buying a new PC and I'm juggling the value of $30 in the price difference between i5 6500 and 6600. Graaaaaah.

It's not that money is an issue -- it's a matter of principle (bang vs buck).

If you're chasing that performance per dollar, enjoy your 6500 and you'll probably never notice a difference.

Potato Salad
Oct 23, 2014



blowfish posted:

Flip a coin because wasting time on deciding whether to spend an insignificant amount of money by itself lowers your bang:buck ratio.

Coin flip says purge the unclean 6500.

VulgarandStupid
Aug 5, 2003




Potato Salad posted:

I like small cases. A lot.

That case is giving me a severe case of :gizz:

My wife and I are buying a new PC and I'm juggling the value of $30 in the price difference between i5 6500 and 6600. Graaaaaah.

It's not that money is an issue -- it's a matter of principle (bang vs buck).

Would you consider a SilverStone SG13B instead? Volume-wise it is smaller at 10.5L, and it's also $30 cheaper. It has better storage mounting options, cooling options and more flexibility with PSUs. The dimensions are just a bit different.

NihilismNow
Aug 31, 2003

Potato Salad posted:

I like small cases. A lot.

That case is giving me a severe case of :gizz:

My wife and I are buying a new PC and I'm juggling the value of $30 in the price difference between i5 6500 and 6600. Graaaaaah.

It's not that money is an issue -- it's a matter of principle (bang vs buck).

Buy a 6600K. The thing is going to last 5+ years anyway (probably). The price per month is negligible and you will have that extra power on tap all those years. Then at the end you can overclock it to death to squeeze out those last few extra months.

I'm never going to buy another budget CPU now that they last as long as they do.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

NihilismNow posted:

Buy a 6600K. The thing is going to last 5+ years anyway (probably). The price per month is negligible and you will have that extra power on tap all those years. Then at the end you can overclock it to death to squeeze out those last few extra months.

I'm never going to buy another budget CPU now that they last as long as they do.

I doubt he will be able to OC it much at all with only ultra-low-profile CPU coolers available for the case he is getting. Also, an SFX PSU will severely limit OCing unless he splurges on one of the 600W ones.

VulgarandStupid
Aug 5, 2003




AVeryLargeRadish posted:

I doubt he will be able to OC it much at all with only ultra-low-profile CPU coolers available for the case he is getting. Also, an SFX PSU will severely limit OCing unless he splurges on one of the 600W ones.

Just how much wattage does OCing use?

I mean, the 6600k is rated at 91W, and the GTX1080 is rated at 180W. I would think even a 450W could handle some overclocking just fine. The mobo/drives won't use anything close to 50W, leaving another 100W overhead for overclocking.
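The back-of-the-envelope math in that post works out like this (TDP figures as quoted in the thread; the 50W misc number is the post's own generous estimate):

```python
# Power budget sketch using the figures from the post.
cpu_tdp = 91   # i5-6600K rated TDP (W)
gpu_tdp = 180  # GTX 1080 rated TDP (W)
misc = 50      # motherboard, drives, fans (generous estimate)
psu = 450      # proposed PSU rating (W)

headroom = psu - (cpu_tdp + gpu_tdp + misc)
print(headroom)  # -> 129 W left over for overclocking
```

So on rated TDPs alone, a 450W unit leaves well over 100W of slack before overclocking even starts eating into margin.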

EdEddnEddy
Apr 5, 2012



Also sometimes you can push up a few hundred mhz without needing to bump the voltage which shouldn't take too much power/create too much heat for a mild boost.

If anything you could always move out of that case down the line into some other system, add a HSF/PSU, and last another 2+ years with a real 1GHz OC or so. It all depends on usage though, and even then, Sandy Bridge (and SB-E) chips still aren't irrelevant even at stock, because the performance gains per generation are only ~10%, and unless you are gaming/encoding you really won't notice much. However, OC'ing gives the chip the potential of going another tier/generation or two faster, which can extend its useful life. A locked CPU is pretty much as good as it's going to get the day you buy it.

AVeryLargeRadish
Aug 19, 2011


VulgarandStupid posted:

Just how much wattage does OCing use?

I mean, the 6600k is rated at 91W, and the GTX1080 is rated at 180W. I would think even a 450W could handle some overclocking just fine. The mobo/drives won't use anything close to 50W, leaving another 100W overhead for overclocking.

I have seen rises of around 80W for a high OC on an i5; I like leaving more headroom than 20W after the OC.

VulgarandStupid
Aug 5, 2003




AVeryLargeRadish posted:

I have seen rises of around 80W for a high OC on an i5; I like leaving more headroom than 20W after the OC.

There's another 29W in there, as 91W + 180W doesn't actually equal 300W. Plus the 50W for misc stuff is also a bit overstated. However, I'm guessing our friend here will run into thermal issues with a low-profile cooler before he runs into a power shortage.

I'm running a 5775C (65W), stock speed 980Ti (250W), a 1TB SSD, and a 3TB HDD, and I'm pulling ~365W from the wall while running Firestrike. My 500W Gold Modular SFX-L PSU was only ~$85, which isn't unreasonable. I dunno, I think it's not only doable, but not unreasonable either.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
I feel like we also have to remember that the normal 6600 TDP is only 65W - the 6600K's 91W is already giving some extra headroom over what the processor should be using at stock settings. Allowing for an 80W rise over that means that you're expecting to OC a 65W processor to the point that it uses 170W, which seems frankly impossible unless you're using LN2 or some other novel form of cooling.

Personally I would go for the 6600K, because it's not a substantial premium and I think you'll end up buying an aftermarket cooler anyway with that case. The linked article reviewing the case notes that with Prime95 and Furmark (which is a rather unrealistic load) they only saw their i5-4690K get up to 55C at worst. That's a fine temperature with lots of headroom and I wouldn't expect the comparable Skylake chip at stock settings to be any worse, so I totally expect that you could at least do a multiplier-only overclock and evaluate how that affects your power draw and temperatures to see if tweaking voltage is an option. If you think that it's a bottleneck after you get the 1080 and you don't trust the power supply to let you put the voltage higher, then a better SFX supply is comparatively (to the 1080) cheap.
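The numbers in the paragraph above can be checked directly (TDPs are the rated figures quoted in the thread; the 80W rise is the earlier poster's observation, not a measurement of this build):

```python
# The claim: an 80 W rise over the 6600K's 91 W rating would push a chip
# that stock-behaves like a 65 W part to roughly 170 W total.
stock_6600 = 65    # i5-6600 rated TDP (W)
rated_6600k = 91   # i5-6600K rated TDP (W)
oc_rise = 80       # observed rise for a high i5 OC (per the thread)

oc_draw = rated_6600k + oc_rise
print(oc_draw)                         # -> 171 W total under that assumption
print(round(oc_draw / stock_6600, 1))  # -> ~2.6x the stock-equivalent TDP
```

Which is the point: budgeting a full 80W rise on top of the K-SKU's already-padded rating assumes a wildly unrealistic 2.6x power multiple over the stock part.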

Eletriarnation fucked around with this message at 14:19 on May 18, 2016

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Atomizer posted:

As others mentioned, the main use case for the Core is for a laptop. It totally makes sense to have your nice Ultrabook with you through the day, then bring it home to connect to your Core so you can game.

It's a nice idea but it's not going to be practical until the trend toward heavily-throttled 10W TDP mobile parts stops. A lot of games will easily max an overclocked desktop quad-core, let alone some 2GHz mobile SKU. The thermal solutions aren't built for sustained load either; after they're up to temperature the CPU will start throttling and your framerates will crash.

Modern CPUs haven't gotten more power efficient (for a given amount of processing) so much as they've gotten better at idling it down so that computation isn't wasted. If anything the performance trend has actually been downwards.

An RVZ01 or DAN A4-SFX is still the clear winner for serious gaming in a portable form factor.

Paul MaudDib fucked around with this message at 06:57 on May 18, 2016

Atomizer
Jun 24, 2007



VulgarandStupid posted:

The Silverstone RVZ02 is the case I'm currently using. It's a nice case and supports a full-size GPU, but it doesn't support 3.5" drives. I ripped out the slimline DVD mounting point and shoved in a 3.5" drive, but it's loud, which is very annoying. Both the CPU and GPU go right up against filtered intakes and have separate thermal chambers, so it cools pretty well. It's also relatively cheap and is only 12 liters in size.

The Razer Core is a great step forward for PC gaming, as it will hopefully make gaming more accessible for laptop owners. It is, however, very expensive, and apparently drivers still need to be written at the device level, which seems backwards to me. It's also the first of its kind, if you don't count the Alienware Amplifier, which is actually quite a bit cheaper but hideously large and uses a proprietary connector. I hope competitors start putting out better and cheaper products though. I think it is way bigger than it needs to be; video cards don't need to be so fully encased. Just a mesh screen or something over them would be fine. Most users won't need a 400W power supply either: the new top-of-the-line GTX 1080 is only 180W, so 250W, leaving room for overclocking, would be more than enough. I think that might also bring things down to a level where you could have an external (power brick) power supply, and make the case even smaller. Most video cards are probably something like 1.5 liters; 8 liters to house a graphics card and a USB hub is too much. So a smaller power supply, less enclosure material and some healthy competition should bring both the size and costs down. That would hopefully get us away from nearly the size of a computer to something closer to a docking station.

Thanks for the ID on that case, guys; it turns out it actually looks even better in the original black as compared to animu-white. :barf:

I probably should've mentioned that aside from the fact that the Core isn't even available yet, I'm in no hurry to get one and will wait until the price drops (on either that or the future competitors.) I have other gaming systems and plenty of other devices to do what I need, so when the time is right I'll get some external graphics adapter. That's part of what's justifying the expenditure: I'm going to wait until it's a reasonable price anyway.

nostrata posted:

Well sure, putting everything together was simple, but I wanted to have it operational by the time I got home. Had a good time playing around with it after the kids went to bed. Played some games at 1080p with med/high quality and didn't notice any framerate issues, but I wasn't really tracking it. Maybe tomorrow I'll actually monitor it, but I was pleased with the performance. I didn't experience the monitor issue again tonight, but I'll try that update utility in the morning and keep an eye on it.

Games I played were Borderlands 2, Diablo 3, and Marvel Heroes, for those interested. I know those aren't particularly taxing and are fairly old games at this point, but that's what I spend most of my gaming time on.

Ah, that's another thing; even as-is without an external GPU, it's plenty powerful for the games I'm interested in. I checked that Intel-provided list and everything that I care about is older, but plays at 1080p on medium-to-high settings. My main game is World of Tanks, which is about 5 years old at this point. I don't give a poo poo if I don't have a PC that can run 4k VR games.

Paul MaudDib posted:

It's a nice idea but it's not going to be practical until the trend toward heavily-throttled 10W TDP mobile parts stops. A lot of games will easily max an overclocked desktop quad-core, let alone some 2GHz mobile SKU. The thermal solutions aren't built for sustained load either; after they're up to temperature the CPU will start throttling and your framerates will crash.

Modern CPUs haven't gotten more power efficient (for a given amount of processing) so much as they've gotten better at idling it down so that computation isn't wasted. If anything the performance trend has actually been downwards.

An RVZ01 or DAN A4-SFX is still the clear winner for serious gaming in a portable form factor.

Yeah, I mean I guess that's true for the Ultrabooks, but there are gaming laptops (like what Razer is building) that have higher-TDP chips and are better positioned to take advantage of easy GPU upgrades. My old Y500, for example, has a 3630QM, which is still a solid CPU and happens to have a 45W TDP. It's not the most portable device, but Lenovo took gaming laptops in the right direction by selling a good i7 with 16 GB of RAM and 650M SLI, plus a 15.6" 1080p display, all for a great price. If I needed to do anything to boost performance a few years later (i.e., today), an external GPU upgrade would be exactly what I'm looking for. Again, though, the machines I play on work exactly as well as they ever did because I play the same games I did when I bought/built the devices originally.

I like the A4-SFX and RVZ02, though; I'll keep those in mind for a potential future desktop build. That's kind of what attracted me to the NUC in the first place; I long since grew tired of building regular ATX boxes, so smaller, more creative machines are what interest me nowadays.

VulgarandStupid
Aug 5, 2003




I uhh, have a lot of spare time, at least for the time being. Would anyone be interested in a SFF/Mini-ITX thread?

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

VulgarandStupid posted:

I uhh, have a lot of spare time, at least for the time being. Would anyone be interested in a SFF/Mini-ITX thread?

oh my yes

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Paul MaudDib posted:

Modern CPUs haven't gotten more power efficient (for a given amount of processing) so much as they've gotten better at idling it down so that computation isn't wasted. If anything the performance trend has actually been downwards.

Nonsense. Intel has focused a great deal on computational power efficiency the last 10 years, and they have delivered real gains. They've done a lot about idle, yes, but efficiency at 100% load is also much better. They have been managing to squeeze out 5 to 10% performance gain in mainstream desktop chips each generation since Core 2, sometimes significantly more (eg Sandy Bridge). They're basically operating with a self/market-imposed TDP cap of about 95W in that market segment, so the only way to get more performance in the long run is for efficiency to go up.

Preemptive strike: please don't go all :byodood: BUT OC'D 2500K :byodood: on me. Intel does their engineering work based on factory clock/TDP/etc. targets, so if you want to evaluate their work, that's what you need to look at.

Speaking of which... one thing which seldom comes up in these grumpy overclocker discussions is that reduced OC headroom is an inevitable consequence of really good power-efficient design. The only reason you get to OC at all is that chips have timing slack, i.e. their critical paths actually run faster than they need to. Any timing slack in excess of the margin needed to statistically guarantee correct operation at factory clocks can be converted into reduced power consumption at the same clocks, or increased performance at the same power, or some mixture of the two. The flip side is that once Intel's engineers have gotten done optimizing their chips this way, there's less timing slack to operate much faster than the fastest factory bin (eg 6700K in this generation), and OC'ers start getting all bitter-old-man about how they have to crank voltage and generate insane heat to get anywhere these days.

Raskolnikov
Nov 25, 2003

VulgarandStupid posted:

I uhh, have a lot of spare time, at least for the time being. Would anyone be interested in a SFF/Mini-ITX thread?

Absolutely! Small computers are so cute?

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
My 5820k has gone from stable at 4.3GHz@1.3V to DPC_WATCHDOG BSODs about once per night (near-zero load) at 4.1GHz. Did I melt it or something?

AVeryLargeRadish
Aug 19, 2011


Malloc Voidstar posted:

My 5820k has gone from stable at 4.3GHz@1.3V to DPC_WATCHDOG BSODs about once per night (near-zero load) at 4.1GHz. Did I melt it or something?

If you are not putting load on the CPU when it crashes I don't see the OC or voltage being the problem here.

Malloc Voidstar
May 7, 2007


AVeryLargeRadish posted:

If you are not putting load on the CPU when it crashes I don't see the OC or voltage being the problem here.
Motherboard complained "overclocking failed" during startup after a recent BSOD. Updated BIOS and it lasted a whole two days


Broadwell-E please get here faster so I can hopefully get a new processor cheaper

SuperDucky
May 13, 2007

by exmarx
We're having some, uh, interesting stability issues with Broadwell-EP on boards that worked just fine on Haswell. AMI and Intel are stumped.

Paul MaudDib
May 3, 2006


BobHoward posted:

Nonsense. Intel has focused a great deal on computational power efficiency the last 10 years, and they have delivered real gains. They've done a lot about idle, yes, but efficiency at 100% load is also much better. They have been managing to squeeze out 5 to 10% performance gain in mainstream desktop chips each generation since Core 2, sometimes significantly more (eg Sandy Bridge). They're basically operating with a self/market-imposed TDP cap of about 95W in that market segment, so the only way to get more performance in the long run is for efficiency to go up.

Preemptive strike: please don't go all :byodood: BUT OC'D 2500K :byodood: on me. Intel does their engineering work based on factory clock/TDP/etc. targets, so if you want to evaluate their work, that's what you need to look at.

Speaking of which... one thing which seldom comes up in these grumpy overclocker discussions is that reduced OC headroom is an inevitable consequence of really good power-efficient design. The only reason you get to OC at all is that chips have timing slack, i.e. their critical paths actually run faster than they need to. Any timing slack in excess of the margin needed to statistically guarantee correct operation at factory clocks can be converted into reduced power consumption at the same clocks, or increased performance at the same power, or some mixture of the two. The flip side is that once Intel's engineers have gotten done optimizing their chips this way, there's less timing slack to operate much faster than the fastest factory bin (eg 6700K in this generation), and OC'ers start getting all bitter-old-man about how they have to crank voltage and generate insane heat to get anywhere these days.

I'm thinking exclusively in the post-Sandy Bridge era. Nobody cares about the improvements made in the wake of the Pentium 4, widely hailed as The Shittiest Architecture Of All Time. You don't get brownie points for recovering ground you never should have lost in the first place (see also: AMD construction cores). Tell me what you've done for me today.

And what I mean is that laptops today are equipped with slower processors than they used to be - the ultrabook concept has spread across the entire line now (with the exception of workstation and desktops replacement models), and what you get is a 10W processor with a 2GHz sustained speed that will boost to 3.1GHz for a minute or two. It's rare to find anything except Y and U-type SKUs in consumer gear now unless you specifically go hunting for it. Those SKUs just aren't suitable for gaming and certainly don't justify a $500 Razer Core with a high-end GPU. Especially not with the weak-rear end thermal solutions that are typical today, and especially not now that we're finally getting iGPUs with eDRAM and HBM2 that are finally worth a poo poo (Iris Pro is roughly on par with a GTX 750).

Seriously though, I think you didn't even read before reflexively mashing post, because Intel doesn't even make a 95W mobile processor. Not even close: Haswell maxed out at 55W, and Skylake currently only has a 45W part. And virtually nobody is going to be plugging a Razer Core into anything that could possibly support a full-power desktop SKU.

Paul MaudDib fucked around with this message at 17:29 on May 18, 2016

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Bullshit.

http://www.cpubenchmark.net/laptop.html

Even your bottom of the barrel Intel mobile chips (your i3's or m5's, 15W TDP) can spank the high-end stuff from five years prior. They are insanely more efficient today and still improving and you don't get to make poo poo up whole cloth.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BangersInMyKnickers posted:

Bullshit.

http://www.cpubenchmark.net/laptop.html

Even your bottom of the barrel Intel mobile chips (your i3's or m5's, 15W TDP) can spank the high-end stuff from a few years prior. They are insanely more efficient today and still improving and you don't get to make poo poo up whole cloth.

Top Skylake i3 was missing, next highest (6100H): 3820
Top Skylake Core m5 (6Y57): 3018

Top Haswell i7 (4940MX): 9645
Mainstream Haswell i7 (4710MQ): 8004

Top Sandybridge i7 (2960XM): 7823
Mainstream Sandybridge i7 (2720QM): 6132

So no, even indulging your ninja edit (Haswell is "a few years ago" at this point), a Skylake m5/i3 does not even keep up with a mainstream Sandybridge i7 let alone "spank it". Not even close.
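For reference, here are those quoted Passmark scores expressed as ratios (just arithmetic on the numbers above):

```python
# Passmark scores quoted above; express each low-power Skylake part
# as a percentage of the older i7s it was claimed to "spank".
scores = {
    "i3-6100H (Skylake)":  3820,
    "m5-6Y57 (Skylake)":   3018,
    "i7-4710MQ (Haswell)": 8004,
    "i7-2720QM (Sandy)":   6132,
}

for new in ("i3-6100H (Skylake)", "m5-6Y57 (Skylake)"):
    for old in ("i7-4710MQ (Haswell)", "i7-2720QM (Sandy)"):
        pct = 100 * scores[new] / scores[old]
        print(f"{new} = {pct:.0f}% of {old}")
```

So on this benchmark the low-power Skylake parts land at roughly two-fifths to two-thirds of the older mainstream quad i7s.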

Equivalent processors are certainly getting better (incrementally), the problem is that laptops that would have had an i5 or i7 a couple years ago now go with low-power processors to get the weight/thickness down and the battery life up. Those processors and their thermal solutions are totally inadequate for the kind of games where you'd want to drop $800+ on a Razer Core and a decent GPU.

With some of the weaksauce coolers that get deployed on those laptops, I question whether those performance levels could even be sustained for longer than the minute or two it takes to run a benchmark. Boost mode looks good on benchmarks and gives a nice MS Office experience, but if you don't design to get rid of the heat it doesn't translate into real-life performance.

Paul MaudDib fucked around with this message at 18:18 on May 18, 2016

Anime Schoolgirl
Nov 28, 2002

SuperDucky posted:

We're having some, uh, interesting stability issues with Broadwell-EP on boards that worked just fine on Haswell. AMI and Intel are stumped.
:frogon:

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Paul MaudDib posted:

Top Skylake i3 was missing, next highest (6100H): 3820
Top Skylake Core m5 (6Y57): 3018

Top Haswell i7 (4940MX): 9645
Mainstream Haswell i7 (4710MQ): 8004

Top Sandybridge i7 (2960XM): 7823
Mainstream Sandybridge i7 (2720QM): 6132

So no, even indulging your ninja edit (Haswell is "a few years ago" at this point), a Skylake m5/i3 does not even keep up with a mainstream Sandybridge i7 let alone "spank it". Not even close.

Equivalent processors are certainly getting better (incrementally), the problem is that laptops that would have had an i5 or i7 a couple years ago now go with low-power processors to get the weight/thickness down and the battery life up. Those processors and their thermal solutions are totally inadequate for the kind of games where you'd want to drop $800+ on a Razer Core and a decent GPU.

With some of the weaksauce coolers that get deployed on those laptops, I question whether those performance levels could even be sustained for longer than the minute or two it takes to run a benchmark. Boost mode looks good on benchmarks and gives a nice MS Office experience, but if you don't design to get rid of the heat it doesn't translate into real-life performance.

And how many cores do the processors you are quoting have to achieve those benchmark scores you are citing? I was intentionally comparing 2-core processors, but you're throwing 4-core ones in, which is a complete bullshit comparison unless you account for it.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD

Paul MaudDib posted:

Modern CPUs haven't gotten more power efficient (for a given amount of processing) so much as they've gotten better at idling it down so that computation isn't wasted. If anything the performance trend has actually been downwards.

except youre still literally wrong about this

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

BangersInMyKnickers posted:

And how many cores do the processors you are quoting have to achieve those benchmark scores you are citing? I was intentionally comparing 2-core processors, but you're throwing 4-core ones in, which is a complete bullshit comparison unless you account for it.

Hey, you didn't say "the best equivalent processor", you said "the high-end stuff from a couple years ago". Nobody disagrees that processor-on-processor things have improved a lot. What I said is that laptops are shipping with weaker processors, not that they aren't available. Laptops going with dual-cores when they used to go with quad-cores is part of that problem.

But still:

Haswell dual-core (4610M): 5110
Sandybridge dual-core (2640M): 3917

So they not only remain unspanked but also still outperform the Skylake i3s and m5s. Even the Sandy Bridge.

Gwaihir
Dec 8, 2009
Hair Elf

Paul MaudDib posted:

Top Skylake i3 was missing, next highest (6100H): 3820
Top Skylake Core m5 (6Y57): 3018

Top Haswell i7 (4940MX): 9645
Mainstream Haswell i7 (4710MQ): 8004

Top Sandybridge i7 (2960XM): 7823
Mainstream Sandybridge i7 (2720QM): 6132

So no, even indulging your ninja edit (Haswell is "a few years ago" at this point), a Skylake m5/i3 does not even keep up with a mainstream Sandybridge i7 let alone "spank it". Not even close.

Equivalent processors are certainly getting better (incrementally), the problem is that laptops that would have had an i5 or i7 a couple years ago now go with low-power processors to get the weight/thickness down and the battery life up. Those processors and their thermal solutions are totally inadequate for the kind of games where you'd want to drop $800+ on a Razer Core and a decent GPU.

With some of the weaksauce coolers that get deployed on those laptops, I question whether those performance levels could even be sustained for longer than the minute or two it takes to run a benchmark. Boost mode looks good on benchmarks and gives a nice MS Office experience, but if you don't design to get rid of the heat it doesn't translate into real-life performance.

You could rant about laptop OEM choices, but they're just building machines to suit the market. Which is OMG CUTE/THIN and Battery life!!!

I agree that performance on U series chips is considerably less than stellar (I traded in my XPS13 for a heavier machine with a real quad; in the end I don't give a gently caress if my backpack has 5 more pounds in it). You can get UI stuttering and other "ugh" artifacts just by firing up a browser with 20 something tabs open on a 4k monitor.

So my easy solution is just not buying a U series laptop, not making asinine arguments that somehow CPUs aren't getting more power efficient.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Gwaihir posted:

You could rant about laptop OEM choices, but they're just building machines to suit the market. Which is OMG CUTE/THIN and Battery life!!!

I agree that performance on U series chips is considerably less than stellar (I traded in my XPS13 for a heavier machine with a real quad, in the end I don't give a gently caress if my backpack has 5 more pounds in it), you can get UI stuttering and other "ugh" artifacts just by firing up a browser with 20 something tabs open on a 4k monitor.

So my easy solution is just not buying a U series laptop, not make asinine arguments that somehow CPUs aren't getting more power efficient.

Yeah, it's absolutely a problem with OEMs. Like I have said repeatedly, the problem is that laptops are going with weaker processors than they used to in the name of power efficiency and weight/thickness. I've explicitly said that processor-on-processor there have been improvements, but some people have problems with reading comprehension.

It still doesn't mean that the overwhelming majority of laptops that ship with a U or Y series processor are going to be worth dropping $800 on a Razer Core and a GTX 1070 or something for. The sustained CPU performance of the laptops that are going out is just not good enough for that to make any goddamn sense at all. The guy who wants to plug his ultrabook into a Razer Core and game on it is severely mistaken about the performance that setup will yield.

Paul MaudDib fucked around with this message at 19:08 on May 18, 2016

Aquila
Jan 24, 2003

SuperDucky posted:

We're having some, uh, interesting stability issues with Broadwell-EP on boards that worked just fine on Haswell. AMI and Intel are stumped.

Uh Oh, and Dell just "upgraded" our most recent order to v4 xeons for "free".

japtor
Oct 28, 2005

Paul MaudDib posted:

Yeah, it's absolutely a problem with OEMs. Like I have said repeatedly, the problem is that laptops are going with weaker processors than they used to in the name of power efficiency and weight/thickness. I've explicitly said that processor-on-processor there's been improvements but some people have problems with reading comprehension.
I think their reading comprehension is fine; they're probably just taking issue with one particular part of how you opened the topic:

Paul MaudDib posted:

It's a nice idea but it's not going to be practical until the trend toward heavily-throttled 10W TDP mobile parts stops. A lot of games will easily max an overclocked desktop quadcore let alone some 2ghz mobile SKU. The thermal solutions aren't built for sustained load either, after they're up to temperature the CPU will start throttling and your framerates will crash.
That part is fine (...although I vaguely recall reading that it's the other way around with games, i.e. there's a handful of games that push desktop quads, not "a lot", but whatever, we get the point about lower power CPUs being used nowadays)

quote:

Modern CPUs haven't gotten more power efficient (for a given amount of processing) so much as they've gotten better at idling it down so that computation isn't wasted. If anything the performance trend has actually been downwards.
That part is wrong and/or worded really poorly, though; there's a reason the root of all the replies seems to be that particular statement while you've had to clarify (multiple times!) that you agree with them.

But anyway, Razer Core: yeah, there's probably gonna be inflated/unrealistic expectations with it, but at the same time I wouldn't really expect it to hit the same user base as most light notebook users to begin with. Price for one thing obviously, but also just the idea of having a desktop setup at all. If someone's serious enough to get a GPU dock, I'm thinking they might get a decent enough laptop in the first place, whether because they want specific specs and/or they just throw a bunch of money at the highest-spec stuff.

SuperDucky
May 13, 2007

by exmarx

Aquila posted:

Uh Oh, and Dell just "upgraded" our most recent order to v4 xeons for "free".

Essentially a BIOS issue with two faces. If we use the Broadwell BIOS and put a Haswell in, CPU1 doesn't show up if it's a "-L." Sorta similarly, non-engineering sample 2680v4's (our primary go-to) don't show up in the BIOS at all, whereas all other v4s work fine. Mind you, the engineering sample 2680v4's work fine in the same serialized board. :frogsiren:

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Paul MaudDib posted:

Haswell dual-core (4610M): 5110.
Sandybridge dual-core (2640M): 3917

So they still not only remain unspanked but also still outperform Skylake i3s and m5s. Even the Sandy Bridge.

Passmark is a dumb benchmark but what the hell, hopefully with a few more numbers for roughly the best 2-core laptop CPU from each generation you will see the light.

6567U 28W 5479
5557U 28W 4935
4610M 37W 5110
3540M 35W 4626
2640M 35W 3918

From Haswell to Broadwell they lost a tiny bit of CPU performance, but they also dropped 9 watts of power and upgraded the iGPU to Iris, which was a giant leap in graphics performance. So what you're seeing there is efficiency and overall performance gains in every generation, some of them pretty big. And if you compare Skylake to Sandy, hey guess what - you can't quite find support for some of the hyperbole there, but the Sandy gets smoked. You'd have to go to a desktop power bin dual core Sandy to beat the mobile Skylake.

E: it's the i3-2130 and it scores 4062 with a 65W TDP, so whoops, best Sandy desktop dual doesn't quite compete
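Normalizing the table above to points per watt (same caveat: Passmark is a dumb benchmark, and the generation labels are added for readability) makes the efficiency trend explicit:

```python
# Passmark score and TDP for the best 2-core laptop CPU of each
# generation, as listed above; compute score per watt of TDP.
parts = [
    ("2640M (Sandy)",     35, 3918),
    ("3540M (Ivy)",       35, 4626),
    ("4610M (Haswell)",   37, 5110),
    ("5557U (Broadwell)", 28, 4935),
    ("6567U (Skylake)",   28, 5479),
]

for name, tdp, score in parts:
    print(f"{name:<19} {score / tdp:6.1f} points/W")
```

Score per watt rises every single generation, even across the Haswell-to-Broadwell step where the raw score dips.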

BobHoward fucked around with this message at 22:04 on May 18, 2016
