canyoneer
Sep 13, 2005


I only have canyoneyes for you
So I've been planning on replacing my dinosaur media PC with a skylake NUC, but I found this in an amazon review.

http://nucblog.net/2016/04/skylake-i3-and-i5-nuc-whea-errors/

Anyone heard anything else about that?


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

canyoneer posted:

So I've been planning on replacing my dinosaur media PC with a skylake NUC, but I found this in an amazon review.

http://nucblog.net/2016/04/skylake-i3-and-i5-nuc-whea-errors/

Anyone heard anything else about that?

It's worth noting that most of the NUCs have had some mild-to-moderate BIOS/firmware/driver pains that are usually fully ironed out a few months after release.

Aquila
Jan 24, 2003

canyoneer posted:

So I've been planning on replacing my dinosaur media PC with a skylake NUC, but I found this in an amazon review.

http://nucblog.net/2016/04/skylake-i3-and-i5-nuc-whea-errors/

Anyone heard anything else about that?

Not heard about it, but I have a Skylake NUC running Ubuntu 14 desktop and it's been mostly great. No sign of MCEs in syslog. The system did hardlock the first night with nothing of note in the logs, but it has been solid since then (I updated the BIOS from the original release to current the day after it hardlocked). Overall I'm really happy with this NUC.

Aquila fucked around with this message at 17:23 on Apr 8, 2016

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.
When people talk about integrated graphics are they really just referring to the CPU handling GPU duties?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

KingEup posted:

When people talk about integrated graphics are they really just referring to the CPU handling GPU duties?

No, most modern [consumer] CPUs actually have a small specialized GPU built right onto the die. It sucks compared to a discrete GPU, but it's OK for basic desktop work and has a specialized video core for decoding DivX/H.264 and such.

Paul MaudDib fucked around with this message at 07:25 on Apr 9, 2016

Kazinsal
Dec 13, 2011


Integrated graphics is literally a fairly basic GPU on the CPU die. The lowest-end ones these days are enough to do 1080p video encode/decode smoothly, and play lighter games at passable settings.

And then there's the Intel Iris Pro 580, which can do 4K 60fps encode/decode and is smack between a GTX 750 and 750 Ti for single-precision GFLOPS. Too bad it only has 128 MB of eDRAM and is only available on a handful of high-end mobile chips.

Kazinsal fucked around with this message at 06:20 on Apr 9, 2016
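
The GFLOPS claim above checks out on a back-of-envelope basis. A minimal sketch, assuming Gen9's 16 FLOPs per EU per clock and roughly these boost clocks (exact clocks vary by SKU, so treat the numbers as approximate):

```python
# Peak single-precision GFLOPS = execution lanes x FLOPs/lane/clock x clock (GHz).
def gflops(lanes, flops_per_lane_per_clock, clock_ghz):
    return lanes * flops_per_lane_per_clock * clock_ghz

# Iris Pro 580: 72 EUs, each with 2 SIMD-4 FPUs doing FMA -> 16 FLOP/clock/EU.
iris_580 = gflops(72, 16, 1.0)      # ~1152 GFLOPS
# GTX 750: 512 CUDA cores, FMA -> 2 FLOP/clock/core, ~1.02 GHz boost assumed.
gtx_750 = gflops(512, 2, 1.02)      # ~1044 GFLOPS
# GTX 750 Ti: 640 CUDA cores, same assumed clock.
gtx_750_ti = gflops(640, 2, 1.02)   # ~1306 GFLOPS

assert gtx_750 < iris_580 < gtx_750_ti  # "smack between" holds up
```

So on paper the 580 really does land between the two Maxwell cards, with the caveat that it is bandwidth-starved next to them outside of what the eDRAM can hold.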

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

Paul MaudDib posted:

No, most modern [consumer] GPUs actually have a small specialized GPU built right onto the die.

Ok so how is RAM allocated to the on-die GPU? I read that dual-channel RAM makes a huge difference when it comes to gaming, but is there any limit to how much can be used?

I'm curious because I'll probably buy the new NUC with Iris Pro (for playing DOTA 2 only at this stage) and I'm assuming it's going to be roughly equivalent to a Radeon 5850 (but that only has 1GB of RAM).

Hmmm... according to Apple

quote:

Apple computers using Intel Iris Pro Graphics 6200 as the primary GPU dynamically allocate up to 1.5 GB of system memory.


I assume it would be the same for non-Apple systems.

KingEup fucked around with this message at 09:04 on Apr 9, 2016
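
The dual-channel question has a concrete answer: the iGPU has no VRAM of its own, so its bandwidth is whatever system RAM provides, shared with the CPU. A rough sketch of the peak numbers, assuming DDR4-2133 with a 64-bit (8-byte) bus per channel; the speeds are illustrative, not specific to any NUC:

```python
# Peak memory bandwidth in GB/s: channels x transfers/s x bus width in bytes.
def peak_gbps(channels, mt_per_s, bus_bytes=8):
    return channels * mt_per_s * bus_bytes / 1000

single = peak_gbps(1, 2133)   # ~17.1 GB/s, shared between CPU and iGPU
dual   = peak_gbps(2, 2133)   # ~34.1 GB/s, double the headroom
print(single, dual)
```

Note the 1.5 GB figure in the Apple quote is a capacity carve-out; for iGPU gaming the usual bottleneck is this shared bandwidth, not capacity, which is why going dual-channel helps so much.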

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Paul MaudDib posted:

No, most modern [consumer] GPUs CPUs actually have a small specialized GPU built right onto the die. It sucks compared to a discrete GPU but it is OK for basic desktop work and has a specialized video core for rendering DIVX/H264 and such.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Corrected, thanks.

gourdcaptain
Nov 16, 2012

Heck, the tiny 11-inch tablet convertible I got with (I think) the second-worst Skylake integrated GPU (HD 515 on an Intel Core m5-6Y54) can smoothly play 1080p 30fps HEVC with hardware decoding while in a Discord call through the inefficient web interface, and still not rise above its minimum 500 MHz CPU clock speed (running a minimal-load Arch Linux setup). The same video without hardware decoding pushes it into Turbo mode at 1.1-1.4 GHz, which is not viable on an entirely passively cooled system for any extended period, and my lap at least appreciates getting roasted less. Good hardware video decoding rocks.

By comparison, my old AMD Bobcat-powered netbook couldn't even do a completely stable Discord call without maxing out a 1.2 GHz CPU core.

gourdcaptain fucked around with this message at 09:46 on Apr 9, 2016

Rastor
Jun 2, 2001

Kazinsal posted:

Integrated graphics is literally a fairly basic GPU on the CPU die. The lowest end ones these days are enough to do video 1080p video encode/decode smoothly, and play lighter games at passable settings.

And then there's the Intel Iris Pro 580, which can do 4K 60fps encode/decode and is smack between a GTX 750 and 750 Ti for single-precision GFLOPS. Too bad it only has 128 MB of eDRAM and is only available on a handful of high-end mobile chips.

A handful of high-end mobile chips... and in this thing.

Boris Galerkin
Dec 17, 2011

I don't understand why I can't harass people online. Seriously, somebody please explain why I shouldn't be allowed to stalk others on social media!

Paul MaudDib posted:

No, most modern [consumer] CPUs actually have a small specialized GPU built right onto the die. It sucks compared to a discrete GPU, but it's OK for basic desktop work and has a specialized video core for decoding DivX/H.264 and such.

For computers with both an integrated GPU and a dedicated one, is software (say VLC, or Firefox/Chrome for Netflix etc.) smart enough to just push that work onto the integrated GPU instead of the dedicated GPU?

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

Boris Galerkin posted:

For computers with both an integrated GPU and a dedicated one, is software (say VLC, or Firefox/Chrome for Netflix etc.) smart enough to just push that work onto the integrated GPU instead of the dedicated GPU?

Why wouldn't they?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Boris Galerkin posted:

For computers with both an integrated GPU and a dedicated one, is software (say VLC, or Firefox/Chrome for Netflix etc.) smart enough to just push that work onto the integrated GPU instead of the dedicated GPU?

The iGPU only does work if you plug your display into the iGPU instead of the dGPU - it can't do anything if you don't plug it in.

feedmegin
Jul 30, 2008

Paul MaudDib posted:

The iGPU only does work if you plug your display into the iGPU instead of the dGPU - it can't do anything if you don't plug it in.

Actually, it can. The overhead would be pushing the resulting frames from its framebuffer (not feeding a monitor) over PCIe to the dGPU. As a practical matter I don't know if that's actually done, though; why bother over just using the dGPU?

Edit: and integrated GPUs use a (usually reserved) chunk of normal system RAM.
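
That frame-copy overhead is actually small, as a back-of-envelope sketch shows (1080p60, 32-bit pixels, and PCIe 3.0 x16 assumed; none of these figures come from the thread):

```python
# Bandwidth needed to ship finished frames from the iGPU's framebuffer
# across PCIe to the dGPU, versus what the link can carry.
frame_bytes = 1920 * 1080 * 4        # one RGBA8 1080p frame, ~8.3 MB
copy_rate = frame_bytes * 60 / 1e9   # GB/s at 60 fps, ~0.5 GB/s
pcie3_x16 = 16 * 0.985               # ~15.8 GB/s usable per direction

print(f"{copy_rate:.2f} GB/s needed, "
      f"{100 * copy_rate / pcie3_x16:.1f}% of the link")
```

So the copy itself costs only a few percent of the link; the practical obstacles are in the driver stack, not the bus.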

feedmegin fucked around with this message at 13:42 on Apr 9, 2016

sauer kraut
Oct 2, 2004

Boris Galerkin posted:

For computers with both an integrated GPU and a dedicated one, is software (say VLC, or Firefox/Chrome for Netflix etc.) smart enough to just push that work onto the integrated GPU instead of the dedicated GPU?

Are you talking about video decoding? MP4 playback has been trivial since the Nvidia 8000 series; don't worry about which chip does it.
Those hybrid setups you describe cause all sorts of delightful problems with software, and I'm sure developers just love every second of it. Bonus points if you can't turn off the iGPU at all in a locked custom laptop BIOS.

sauer kraut fucked around with this message at 14:14 on Apr 9, 2016

computer parts
Nov 18, 2010

PLEASE CLAP

Boris Galerkin posted:

For computers with both an integrated GPU and a dedicated one, is software (say VLC, or Firefox/Chrome for Netflix etc.) smart enough to just push that work onto the integrated GPU instead of the dedicated GPU?

Apple does this for OS X, but it's not a Windows feature, for a lot of reasons.

Well, to be specific, Apple defaults everything to the integrated GPU, but if certain tasks require the dedicated one, it'll switch over fairly seamlessly until they're finished. Not quite the parallel work you're imagining.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I've always hoped that there'd be some middleware enabling use of the IGP for physics acceleration, since in a lot of gamer computers the thing sits unused because of the dedicated graphics. Yet it never happened.

nostrata
Apr 27, 2007

Rastor posted:

A handful of high-end mobile chips... and in this thing.

I've ordered one, super excited to get my hands on it. The waiting is killing me because my main computer died a few weeks ago.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
It should be noted that in older* systems, integrated graphics were built into the motherboard chipset (northbridge).

*Intel Core 2 and Diamondville Atom and older / AMD pre-Fusion (Llano/A-series)

Prescription Combs
Apr 20, 2005
That new NUC is sweeeet.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Combat Pretzel posted:

I've always hoped that there'd be some middleware enabling use of the IGP for physics acceleration, since in a lot of gamer computers the thing sits unused because of the dedicated graphics. Yet it never happened.

It probably made everything worse rather than better, much like using an older Nvidia card for PhysX made your main GPU worse.

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
I think tapping into unused integrated graphics resources is one of the new features of DX12.

Atomizer
Jun 24, 2007



nostrata posted:

I've ordered one, super excited to get my hands on it. The waiting is killing me because my main computer died a few weeks ago.

Son of a bitch, I didn't know preordering had begun and now they're sold out. :saddowns:

computer parts
Nov 18, 2010

PLEASE CLAP

Combat Pretzel posted:

I've always hoped that there'd be some middleware enabling use of the IGP for physics acceleration, since in a lot of gamer computers the thing sits unused because of the dedicated graphics. Yet it never happened.

I think the issue with that is that it can cause the CPU to throttle itself because of heat.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

Prescription Combs posted:

That new NUC is sweeeet.

This is major overkill for what I'm going to be using it for (basically simple HTPC stuff) but I couldn't resist, can't wait to get it. Something about having such a powerful machine in a tiny little box really makes it appealing to me.

MaxxBot fucked around with this message at 22:07 on Apr 10, 2016

Mr SoupTeeth
Jan 16, 2015

Boiled Water posted:

It probably made everything worse rather than better, much like using an older Nvidia card for PhysX made your main GPU worse.

AMD actually implemented async Crossfire in certain laptop chipsets with dGPUs a few years ago, with the expected results: it provided a performance boost in one or two games, with hosed-up frame pacing, while making everything else worse or completely broken. I'm not surprised they quietly swept that "feature" under the rug. DX12 seems to be the same story.

champagne posting
Apr 5, 2006

YOU ARE A BRAIN
IN A BUNKER

Mr SoupTeeth posted:

AMD actually implemented async Crossfire in certain laptop chipsets with dGPUs a few years ago, with the expected results: it provided a performance boost in one or two games, with hosed-up frame pacing, while making everything else worse or completely broken. I'm not surprised they quietly swept that "feature" under the rug. DX12 seems to be the same story.

2015 seems to be the year when game developers looked at SLI and CF and went "yeah, not having that." Or, you know, whoever it is that makes running more than one graphics card A Thing.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Boiled Water posted:

2015 seems to be the year when game developers looked at SLI and CF and went "yeah, not having that." Or, you know, whoever it is that makes running more than one graphics card A Thing.

To be fair, with only a few exceptions there really wasn't anything that came out in 2015 to honestly justify SLI/CF, when you could max or nearly max everything at 1080/1440 with a single card. Now that we're talking 120+ Hz and/or 4K, maybe they'll start paying attention to it again.

Mr SoupTeeth
Jan 16, 2015

Boiled Water posted:

2015 seems to be the year when game developers looked at SLI and CF and went "yeah, not having that." Or, you know, whoever it is that makes running more than one graphics card A Thing.

Of course that happens right when AMD/nVidia finally made it anything other than a waste of money/power.

I'm still sore about my first-gen SLI experience: expensive mainboard, two expensive cards, a ridiculously loud power supply to drive said cards (there was literally one 800W Sparkle on the market with enough amps on the 12V rail), and it never worked right in a single instance. The tech/gaming press really showed their true colors when covering it, showing off impressive scaling in benchmarks but failing to mention experience-ruining frame-pacing issues for years. Gotta keep those sweet review samples and ad money coming.

Mr SoupTeeth fucked around with this message at 23:34 on Apr 10, 2016

Atomizer
Jun 24, 2007



How about this RAM and SSDs for a Skull Canyon NUC?

http://www.newegg.com/Product/Product.aspx?Item=N82E16820232169

http://www.newegg.com/Product/Product.aspx?Item=N82E16820147399

The 850 Evo seems pretty solid and well-liked. It'd be nice to have a pair of 1 TB SSDs in there, but there are few options (that Intel 540s TLC and the SanDisk X400, I guess; the latter is probably the better choice).

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

Atomizer posted:

How about this RAM and SSDs for a Skull Canyon NUC?

http://www.newegg.com/Product/Product.aspx?Item=N82E16820232169

http://www.newegg.com/Product/Product.aspx?Item=N82E16820147399

The 850 Evo seems pretty solid and well-liked. It'd be nice to have a pair of 1 TB SSDs in there, but there are few options (that Intel 540s TLC and the SanDisk X400, I guess; the latter is probably the better choice).

Intel > Samsung 850 EVO > SanDisk > everything else. The goon hivemind considers Samsung the optimal price/performance point: little benefit for the average user in moving up to Intel, but more consistent performance than SanDisk.

32GB of RAM seems a bit excessive unless you have a specific reason for that amount in mind.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Intel's most recent 2.5" SSD worth a drat was released in 2014, and even that was a rebadge of a slightly older enterprise-level drive. The new one they're putting out uses the same controller as the much-maligned Crucial BX200. Pretty much the only reason you'd still want a 730 is that it's the only affordable consumer drive with battery backup built into the drive, but if you want more than 480GB you're screwed, and the 240GB is gimped to 270MB/sec writes (which, let's face it, is more than fast enough for normal computer work).

And I'm saying this with a 240GB 730 as my boot drive and a 750GB 840 EVO as my Steam drive (won't trust anything else on it). The second 1TB EVOs hit $199 and/or Pros hit ~$230-250, I'm buying one.

BIG HEADLINE fucked around with this message at 10:57 on Apr 11, 2016

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

BIG HEADLINE posted:

Intel's most recent 2.5" SSD worth a drat was released in 2014

No it wasn't. I mean, if you're going for the ultimate NUC, might as well get the SSD of SSDs :unsmigghh:

I have no idea if an M.2 adapter module/cable exists so that you can cram it into the Skull Canyon, though.

Sidesaddle Cavalry fucked around with this message at 14:35 on Apr 11, 2016

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Sidesaddle Cavalry posted:

No it wasn't. I mean, if you're going for the ultimate NUC, might as well get the SSD of SSDs :unsmigghh:

I have no idea if an M.2 adapter module/cable exists so that you can cram it into the Skull Canyon, though.

Does that thing have some kind of ribbon cable that connects to a PCIe slot? I've only seen the M.2 and conventional expansion-card SSDs; that one is new to me.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Atomizer posted:

How about this RAM and SSDs for a Skull Canyon NUC?

http://www.newegg.com/Product/Product.aspx?Item=N82E16820232169

http://www.newegg.com/Product/Product.aspx?Item=N82E16820147399

The 850 Evo seems pretty solid and well-liked. It'd be nice to have a pair of 1 TB SSDs in there, but there are few options (that Intel 540s TLC and the SanDisk X400, I guess; the latter is probably the better choice).

Those parts are fine, but you might as well go all-in and get the NVMe version of the drive instead. Who cares about money.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

BangersInMyKnickers posted:

Does that thing have some kind of ribbon cable that connects to a PCIe slot? I've only seen the M.2 and the conventional expansion card ssd's, that one is new to me.

This form factor (SFF-8639, or U.2 as Intel is calling it) is mostly for enterprise backplanes etc. The cable is mechanically identical to the somewhat-defunct SATA Express, so we might see some motherboards with ports, but I think it'll be rare; slot and M.2 are gonna be the vast majority of consumer-grade for a while.

Also, those NVMe drives get stonkin' hot; they need really good airflow.

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

priznat posted:

This form factor (SFF-8639, or U.2 as Intel is calling it) is mostly for enterprise backplanes etc. The cable is mechanically identical to the somewhat-defunct SATA Express, so we might see some motherboards with ports, but I think it'll be rare; slot and M.2 are gonna be the vast majority of consumer-grade for a while.

Also, those NVMe drives get stonkin' hot; they need really good airflow.

Nice. One of the issues I've bumped into with VM hosts is how to handle SSD caching drives: either you put them on the SAS/SATA controller and eat the throughput limitations, or you stick them in the PCIe expansion slots and cross your fingers that you don't need to add any additional expansion cards for NICs or whatever. Glad to see they're working to make the disk backplanes usable again.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

priznat posted:

Also, those NVMe drives get stonkin' hot; they need really good airflow.

Why do they get hotter than 2.5" or slot?


Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

Subjunctive posted:

Why do they get hotter than 2.5" or slot?

I believe it's the controller chip that ends up producing most of the heat when doing constant read/write activity; the actual storage chips barely heat up at all.
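
If you want to see the controller temperature yourself on Linux, nvme-cli reports it in its SMART log. A minimal sketch (the smart-log line below is a canned sample so the parsing is reproducible; on real hardware you'd run `sudo nvme smart-log /dev/nvme0`):

```shell
# Parse the controller temperature out of nvme-cli's smart-log output.
# 'sample' is a canned line in the format `nvme smart-log` prints.
sample='temperature                         : 68 C'
temp=$(printf '%s\n' "$sample" | awk -F':' '{print $2+0}')
echo "controller temp: ${temp} C"
```

Sustained writes will push that number up fast on a bare M.2 stick, which is why the airflow advice above matters.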
