|
So I've been planning on replacing my dinosaur media PC with a skylake NUC, but I found this in an amazon review. http://nucblog.net/2016/04/skylake-i3-and-i5-nuc-whea-errors/ Anyone heard anything else about that?
|
# ? Apr 8, 2016 07:18 |
|
canyoneer posted:So I've been planning on replacing my dinosaur media PC with a skylake NUC, but I found this in an amazon review. It's worth noting most of the NUCs have had some mild to moderate BIOS/firmware/driver pains that are usually fully ironed out a few months after release.
|
# ? Apr 8, 2016 13:20 |
|
canyoneer posted:So I've been planning on replacing my dinosaur media PC with a skylake NUC, but I found this in an amazon review. Not heard about it, but I have a Skylake NUC running Ubuntu 14 desktop and it's been mostly great. No sign of MCEs in syslog. The system did hardlock the first night with nothing of note in the logs, but has been solid since then (I updated the BIOS from the original release to current the day after it hardlocked). Overall I'm really happy with this NUC. Aquila fucked around with this message at 17:23 on Apr 8, 2016 |
# ? Apr 8, 2016 17:00 |
|
When people talk about integrated graphics are they really just referring to the CPU handling GPU duties?
|
# ? Apr 9, 2016 06:09 |
|
KingEup posted:When people talk about integrated graphics are they really just referring to the CPU handling GPU duties? No, most modern [consumer] CPUs actually have a small specialized GPU built right onto the die. It sucks compared to a discrete GPU but it is OK for basic desktop work and has a specialized video core for decoding DivX/H.264 and such. Paul MaudDib fucked around with this message at 07:25 on Apr 9, 2016 |
# ? Apr 9, 2016 06:16 |
|
Integrated graphics is literally a fairly basic GPU on the CPU die. The lowest-end ones these days are enough to do 1080p video encode/decode smoothly and play lighter games at passable settings. And then there's the Intel Iris Pro 580, which can do 4K 60fps encode/decode and sits smack between a GTX 750 and 750 Ti for single-precision GFLOPS. Too bad it only has 128 MB of eDRAM and is only available on a handful of high-end mobile chips. Kazinsal fucked around with this message at 06:20 on Apr 9, 2016 |
# ? Apr 9, 2016 06:18 |
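Kazinsal's "smack between a GTX 750 and 750 Ti" claim checks out on paper. A back-of-envelope sketch; the EU/core counts, FLOPS-per-cycle figures, and clocks below are my own assumed numbers for these parts, not anything stated in the thread:

```python
# Theoretical peak single-precision throughput, back-of-envelope.
# Assumed specs: Iris Pro 580 = 72 Gen9 EUs at ~1.0 GHz, each EU with
# two SIMD-4 FMA units (16 FLOPS/cycle). GTX 750 / 750 Ti = 512 / 640
# Maxwell cores at ~1.02 GHz, each doing one FMA (2 FLOPS/cycle).

def gflops(units, flops_per_cycle, ghz):
    """Theoretical peak GFLOPS: units * FLOPS-per-cycle * clock in GHz."""
    return units * flops_per_cycle * ghz

iris_pro_580 = gflops(72, 16, 1.0)    # 1152.0 GFLOPS
gtx_750      = gflops(512, 2, 1.02)   # ~1044 GFLOPS
gtx_750_ti   = gflops(640, 2, 1.02)   # ~1306 GFLOPS

print(iris_pro_580, gtx_750, gtx_750_ti)
```

On paper the Iris Pro 580 lands between the two Maxwell cards; in practice memory bandwidth matters at least as much, which is what the 128 MB of eDRAM is there to help with.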
|
Paul MaudDib posted:No, most modern [consumer] GPUs actually have a small specialized GPU built right onto the die. Ok so how is RAM allocated to the on-die GPU? I read that dual-channel RAM makes a huge difference when it comes to gaming, but is there any limit to how much can be used? I'm curious because I'll probably buy the new NUC with Iris Pro (for playing DOTA2 only at this stage) and I'm assuming it's going to be equivalent to a Radeon 5850 (but that only has 1GB RAM). Hmmm... according to Apple quote:Apple computers using Intel Iris Pro Graphics 6200 as the primary GPU dynamically allocate up to 1.5 GB of system memory. I assume it would be the same for non-Apple systems. KingEup fucked around with this message at 09:04 on Apr 9, 2016 |
# ? Apr 9, 2016 06:21 |
|
Paul MaudDib posted:No, most modern [consumer]
|
# ? Apr 9, 2016 06:57 |
|
Corrected, thanks.
|
# ? Apr 9, 2016 07:25 |
|
Heck, the tiny 11-inch tablet convertible I got with the second-worst (I think) Skylake integrated GPU (HD 515 on an Intel Core m5-6Y54) can smoothly play 1080p 30fps HEVC with hardware decoding while in a Discord call through the inefficient web interface, and still not rise above its minimum 500 MHz CPU clock speed (running a minimal-load Arch Linux setup). The same video without hardware decoding requires it to edge into Turbo mode at 1.1-1.4 GHz (which is certainly not viable on an entirely passively cooled system for any extended period, and my lap at least appreciates getting roasted less). Good hardware video decoding rocks. By comparison, my old AMD Bobcat-powered netbook couldn't even do a completely stable Discord call without maxing out a 1.2 GHz CPU core. gourdcaptain fucked around with this message at 09:46 on Apr 9, 2016 |
# ? Apr 9, 2016 09:43 |
|
Kazinsal posted:Integrated graphics is literally a fairly basic GPU on the CPU die. The lowest-end ones these days are enough to do 1080p video encode/decode smoothly and play lighter games at passable settings. A handful of high-end mobile chips... and in this thing.
|
# ? Apr 9, 2016 10:41 |
|
Paul MaudDib posted:No, most modern [consumer] CPUs actually have a small specialized GPU built right onto the die. It sucks compared to a discrete GPU but it is OK for basic desktop work and has a specialized video core for decoding DivX/H.264 and such. For computers with an integrated GPU and also a dedicated one, is software like VLC, or Firefox/Chrome for Netflix etc., smart enough to just push the work onto the integrated GPU instead of the dedicated GPU?
|
# ? Apr 9, 2016 11:40 |
|
Boris Galerkin posted:For computers with an integrated GPU and also a dedicated one, is software like VLC, or Firefox/Chrome for Netflix etc., smart enough to just push the work onto the integrated GPU instead of the dedicated GPU? Why wouldn't they?
|
# ? Apr 9, 2016 11:58 |
|
Boris Galerkin posted:For computers with an integrated GPU and also a dedicated one, is software like VLC, or Firefox/Chrome for Netflix etc., smart enough to just push the work onto the integrated GPU instead of the dedicated GPU? The iGPU only does work if you plug your display into the iGPU instead of the dGPU - it can't do anything if you don't plug it in.
|
# ? Apr 9, 2016 11:59 |
|
Paul MaudDib posted:The iGPU only does work if you plug your display into the iGPU instead of the dGPU - it can't do anything if you don't plug it in. Actually it can. The overhead would be pushing the resulting frames from its framebuffer (not feeding a monitor) over PCIe to the dGPU. As a practical matter I don't know if that's actually done though; why bother instead of just using the dGPU? Edit: and they use a (usually reserved) chunk of normal system RAM. feedmegin fucked around with this message at 13:42 on Apr 9, 2016 |
# ? Apr 9, 2016 13:39 |
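feedmegin's copy overhead can be put in rough numbers. A back-of-envelope sketch; the 4 bytes/pixel and the ~15.75 GB/s usable PCIe 3.0 x16 figure are assumptions of mine, not measurements:

```python
# Bandwidth cost of shipping rendered frames from the iGPU's framebuffer
# to the dGPU (or vice versa) over PCIe, back-of-envelope.
width, height = 1920, 1080
bytes_per_pixel = 4   # assuming an RGBA8 framebuffer
fps = 60

frame_bytes = width * height * bytes_per_pixel   # ~8.3 MB per frame
stream_gb_s = frame_bytes * fps / 1e9            # ~0.5 GB/s for 1080p60

pcie3_x16_gb_s = 15.75   # assumed usable bandwidth of a PCIe 3.0 x16 link
print(f"{stream_gb_s:.2f} GB/s, {100 * stream_gb_s / pcie3_x16_gb_s:.1f}% of the link")
```

Half a GB/s for a 1080p60 stream is a few percent of the link, so raw bandwidth at least isn't the obstacle; if it's rarely done, the reasons are more likely scheduling and latency.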
|
Boris Galerkin posted:For computers with an integrated GPU and also a dedicated one, is software like VLC, or Firefox/Chrome for Netflix etc., smart enough to just push the work onto the integrated GPU instead of the dedicated GPU? Are you talking about video decoding? MP4 playback has been trivial since the Nvidia 8000 series; don't worry about which chip does it. Those hybrid setups you describe cause all sorts of delightful problems with software, and I'm sure developers just love every second of it. Bonus points if you can't turn off the iGPU at all in the locked custom laptop BIOS. sauer kraut fucked around with this message at 14:14 on Apr 9, 2016 |
# ? Apr 9, 2016 14:12 |
|
Boris Galerkin posted:For computers with an integrated GPU and also a dedicated one, is software like VLC, or Firefox/Chrome for Netflix etc., smart enough to just push the work onto the integrated GPU instead of the dedicated GPU? Apple does this for OS X but it's not a Windows feature for a lot of reasons. Well, to be specific, Apple defaults everything to the integrated GPU, but if certain tasks require the dedicated one it'll switch over fairly seamlessly until they're finished. Not quite parallel work like you're imagining.
|
# ? Apr 9, 2016 17:33 |
|
I've always hoped that there'd be some middleware enabling the IGP to be used for physics acceleration, since in a lot of gamer computers the thing sits unused due to dedicated graphics. Yet it never happened.
|
# ? Apr 9, 2016 18:01 |
|
Rastor posted:A handful of high-end mobile chips... and in this thing. I've ordered one, super excited to get my hands on it. The waiting is killing me because my main computer died a few weeks ago.
|
# ? Apr 9, 2016 18:32 |
|
It should be noted that in older* systems, integrated graphics were built into the motherboard chipset (northbridge). *Intel Core 2 and Diamondville Atom and older / AMD pre-Fusion (Llano/A-series)
|
# ? Apr 9, 2016 19:33 |
|
That new NUC is sweeeet.
|
# ? Apr 9, 2016 19:53 |
|
Combat Pretzel posted:I've always hoped that there'd be some middleware enabling the IGP to be used for physics acceleration, since in a lot of gamer computers the thing sits unused due to dedicated graphics. Yet it never happened. It probably would have made everything worse rather than better, much like using an older Nvidia card for PhysX made your main GPU worse.
|
# ? Apr 9, 2016 21:15 |
|
I think tapping into unused integrated graphics resources is one of the new features of DX12.
|
# ? Apr 9, 2016 21:17 |
|
nostrata posted:I've ordered one, super excited to get my hands on it. The waiting is killing me because my main computer died a few weeks ago. Son of a bitch, I didn't know preordering had begun and now they're sold out.
|
# ? Apr 9, 2016 21:36 |
|
Combat Pretzel posted:I've always hoped that there'd be some middleware enabling the IGP to be used for physics acceleration, since in a lot of gamer computers the thing sits unused due to dedicated graphics. Yet it never happened. I think the issue with that is that it can cause the CPU to throttle itself because of heat.
|
# ? Apr 10, 2016 01:35 |
|
Prescription Combs posted:That new NUC is sweeeet. This is major overkill for what I'm going to be using it for (basically simple HTPC stuff) but I couldn't resist, can't wait to get it. Something about having such a powerful machine in a tiny little box really makes it appealing to me. MaxxBot fucked around with this message at 22:07 on Apr 10, 2016 |
# ? Apr 10, 2016 22:00 |
|
Boiled Water posted:It probably would have made everything worse rather than better, much like using an older Nvidia card for PhysX made your main GPU worse. AMD actually implemented async Crossfire in certain laptop chipsets with dGPUs a few years ago, with the expected results: it provided a performance boost in one or two games with hosed-up framepacing while making everything else worse or completely broken. I'm not surprised they quietly swept that "feature" under the rug. DX12 seems to be the same story.
|
# ? Apr 10, 2016 22:25 |
|
Mr SoupTeeth posted:AMD actually implemented async Crossfire in certain laptop chipsets with dGPUs a few years ago, with the expected results: it provided a performance boost in one or two games with hosed-up framepacing while making everything else worse or completely broken. I'm not surprised they quietly swept that "feature" under the rug. DX12 seems to be the same story. 2015 seems to be the year where game developers looked at SLI and CF and went "yeah, not having that." Or, you know, whoever it is that makes running more than one graphics card A Thing.
|
# ? Apr 10, 2016 22:46 |
|
Boiled Water posted:2015 seems to be the year where game developers looked at SLI and CF and went "yeah, not having that." Or, you know, whoever it is that makes running more than one graphics card A Thing. Of course that happens right when AMD/Nvidia finally made it anything other than a waste of money/power. I'm still sore about my 1st-gen SLI experience: expensive mainboard, two expensive cards, ridiculously loud power supply to drive said cards (there was literally one 800W Sparkle on the market with enough amps on the 12V rail), and it never worked right in a single instance. The tech/gaming press really showed their true colors when covering it, showing off impressive scaling in benchmarks but failing to mention experience-ruining frame-pacing issues for years. Gotta keep those sweet review samples and ad money coming. Mr SoupTeeth fucked around with this message at 23:34 on Apr 10, 2016 |
# ? Apr 10, 2016 23:14 |
|
How about this RAM and these SSDs for a Skull Canyon NUC? http://www.newegg.com/Product/Product.aspx?Item=N82E16820232169 http://www.newegg.com/Product/Product.aspx?Item=N82E16820147399 The 850 Evo seems pretty solid and well-liked. It'd be nice to have a pair of 1 TB SSDs in there, but there are few options (that Intel 540s TLC and the SanDisk X400, I guess, the latter probably being the better choice).
|
# ? Apr 11, 2016 08:44 |
|
Atomizer posted:How about this RAM and SSDs for a Skull Canyon NUC? Intel > Samsung 850 EVO > SanDisk > everything else. The goon hivemind considers Samsung to be the optimum price/performance point, with little benefit for the average user from moving to Intel, but more consistent performance than SanDisk. 32GB of RAM seems a bit excessive, unless you have a specific reason for that amount in mind.
|
# ? Apr 11, 2016 09:32 |
|
Intel's most recent 2.5" SSD worth a drat was released in 2014, and even it was a rebadge of a slightly older enterprise-level drive. The new one they're putting out uses the same controller as the much-maligned Crucial BX200. Pretty much the only reason you'd still want a 730 is that it's the only affordable consumer drive available with battery backup built into the drive, but if you want more than 480GB you're screwed, and the 240GB is gimped to 270 MB/sec writes (which, let's face it, is more than fast enough for normal computer work). And I'm saying this with a 240GB 730 as my boot drive and a 750GB 840 EVO as my Steam drive (won't trust anything else on it). The second 1TB EVOs hit $199 and/or Pros hit ~$230-250, I'm buying one. BIG HEADLINE fucked around with this message at 10:57 on Apr 11, 2016 |
# ? Apr 11, 2016 10:54 |
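To put that 270 MB/sec write cap in perspective, a quick sketch; the install size and the ~500 MB/s comparison figure are made-up example numbers, not specs from the thread:

```python
# Sequential write time at the 730's capped 270 MB/s versus a
# hypothetical full-speed SATA SSD at ~500 MB/s.
def write_time_s(size_gb, mb_per_s):
    """Seconds to write size_gb gigabytes at mb_per_s megabytes/second."""
    return size_gb * 1000 / mb_per_s

game_gb = 50  # hypothetical big game install
slow = write_time_s(game_gb, 270)
fast = write_time_s(game_gb, 500)
print(round(slow), round(fast))  # 185 100
```

Three minutes versus under two for a 50 GB dump: noticeable in a benchmark, rarely in normal use, which is the poster's point.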
|
BIG HEADLINE posted:Intel's most recent 2.5" SSD worth a drat was released in 2014 No it wasn't. I mean, if you're going for the ultimate NUC, might as well get the SSD of SSDs. I have no idea if an M.2 adapter module/cable exists so that you can cram it into the Skull Canyon, though. Sidesaddle Cavalry fucked around with this message at 14:35 on Apr 11, 2016 |
# ? Apr 11, 2016 14:25 |
|
Sidesaddle Cavalry posted:No it wasn't. I mean, if you're going for the ultimate NUC, might as well get the SSD of SSDs Does that thing have some kind of ribbon cable that connects to a PCIe slot? I've only seen the M.2 and the conventional expansion-card SSDs; that one is new to me.
|
# ? Apr 11, 2016 14:54 |
|
Atomizer posted:How about this RAM and SSDs for a Skull Canyon NUC? Those parts are fine, but you might as well go all-in and get the NVMe version of the drive instead. Who cares about money.
|
# ? Apr 11, 2016 14:58 |
|
BangersInMyKnickers posted:Does that thing have some kind of ribbon cable that connects to a PCIe slot? I've only seen the M.2 and the conventional expansion-card SSDs; that one is new to me. This form factor (SFF-8639, or U.2 as Intel is calling it) is mostly for enterprise backplanes etc. The cables are mechanically identical to the somewhat defunct SATA Express, so we might see some motherboards with ports, but I think it'll be rare; slot and M.2 are gonna be the vast majority of consumer-grade for a while. Also those NVMe drives get stonkin' hot, they need real good airflow.
|
# ? Apr 11, 2016 16:02 |
|
priznat posted:This form factor (SFF-8639, or U.2 as Intel is calling it) is mostly for enterprise backplanes etc. The cables are mechanically identical to the somewhat defunct SATA Express, so we might see some motherboards with ports, but I think it'll be rare; slot and M.2 are gonna be the vast majority of consumer-grade for a while. Nice. One of the issues I've bumped into with VM hosts is how to handle SSD caching drives: either you put them on the SAS/SATA controller and eat the throughput limitations, or you stick them in the PCIe expansion slots and cross your fingers that you don't need to add any additional expansion cards for NICs or whatever. Glad to see they're working to make the disk backplanes usable again.
|
# ? Apr 11, 2016 16:40 |
|
priznat posted:Also those nvme drives get stonkin' hot, they need a real good airflow. Why do they get hotter than 2.5" or slot?
|
# ? Apr 11, 2016 16:45 |
|
|
Subjunctive posted:Why do they get hotter than 2.5" or slot? I believe it's the controller chip that ends up producing most of the heat when doing constant read/write activity. The actual storage chips barely heat up at all.
|
# ? Apr 11, 2016 16:46 |