|
Anyone ever had a CPU go bad? My 5820K started causing weird full-system hardlocks and crashes. Took me a while to narrow it down, but after swapping it out a couple of times it's pretty clear the processor was the culprit. Apparently something to watch out for on 5820K processors, at least. I've found reviews on Newegg mentioning similar troubles. Worked for about a year before starting to die. Had adequate cooling and no crazy overclocking.
|
# ? Sep 26, 2016 18:00 |
|
Yes, I have seen it a few times, where we confirmed that replacing the CPU resolved the issue and going back to the bad CPU brought it back. It is vastly less common than motherboard failure or other failures, which is why nobody ever expects it, but it happens.
|
# ? Sep 26, 2016 18:07 |
|
I haven't personally seen this, but I've read that in these cases the CPU pads might be corroded. You might be able to clean the pads with some electrical contact cleaner. It makes some sense given that the VRMs are located inside the CPU; with the Skylake generation they went back to mobo VRMs.
|
# ? Sep 26, 2016 18:43 |
|
A 5820K should still be under warranty, shouldn't it? Intel is pretty good about replacing it as long as you can detail the troubleshooting steps pretty clearly and weren't just abusing the hell out of it at 1.5V+ or something. I had a similar issue with an i7 920 back in the day, and even at stock speeds it was crashing much the way you describe. Got on a chat with Intel and they swapped it out with a brand new i7 920 (the best revision of it at the time, as well). That chip has been kicking like a champ ever since.
|
# ? Sep 26, 2016 19:06 |
|
EdEddnEddy posted:A 5820K should still be under warranty shouldn't it? Yes, it's under warranty and I've already received a replacement from Intel. Just really surprising to have a CPU go bad is all.
|
# ? Sep 26, 2016 19:08 |
|
Yeah, it really isn't very common these days, but it can happen. I thought I had run into it on the i3-2100 system I won at an Nvidia LAN; however, it turned out to be the Patriot DDR3 memory in the system, as the issue followed me over to a replacement motherboard/CPU build. (The errors kept blaming CPU 1 and such, but Memtest always came back fine.) Patriot recently came through on that lifetime warranty too: I sent them the bad 4GB (2x2GB) kit, and they sent me back a same-speed 8GB (2x4GB) kit.
|
# ? Sep 26, 2016 19:19 |
|
apropos man posted:I didn't realise that Linux being good at multi-threading was a thing. I guess it depends on what packages you're using to interface with the kernel? In that case I look forward to having a go with full disk encryption. Quite excited about this £100 upgrade. PBCrunch posted:I don't know if it's accurate to say Linux is good at taking advantage of threads; the types of tasks people accomplish using Linux tend to lend themselves to multiple threads. What I was getting at is more that the Unix philosophy of "do one thing well" tends to lend itself very nicely to threading. Instead of monolithic applications you tend to get applications that consist of multiple processes strung together, both at an application level (e.g. via pipes) and an OS level. You still can't defeat a true I/O bottleneck or Amdahl's law, but monolithic applications can often make things sequential that don't have to be, and by separating them out you can get more cores into effective use. It's certainly not true of everything, of course, but one classic example is Adam Drake's article Command-line tools can be 235x faster than your Hadoop cluster.
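The Amdahl's law ceiling mentioned above is easy to put numbers on. A minimal sketch (the 95% parallel fraction is an illustrative assumption, not a measurement of any real workload):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup when only part of a job can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even a 95%-parallel job tops out well below the core count:
for cores in (2, 6, 12):
    print(cores, round(amdahl_speedup(0.95, cores), 2))
```

At 95% parallel, 12 cores still get you less than an 8x speedup, which is why peeling the sequential parts into separate processes only helps up to a point.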
|
# ? Sep 26, 2016 22:46 |
|
redeyes posted:I haven't personally seen this but I read that in these cases the CPU pads might be corroded. You might be able to clean the pads with some electrical contact cleaner. It makes some sense given that the VRMs are located inside the CPU. In the Skylake generation they went back to mobo VRMs. I have to know: what on earth does the placement of VRMs have to do with corrosion? Also, if the lands are corroded you have been doing some serious hazmat poo poo to your computer, because that poo poo is gold-plated.
|
# ? Sep 27, 2016 06:16 |
|
Paul MaudDib posted:What I was getting at is more that the Unix philosophy of "do one thing well" tends to lend itself very nicely to threading. Instead of monolithic applications you tend to get applications that consist of multiple processes strung together, both at an application level (eg via pipes) and an OS level. Oh hell no. I would love to know of any actual examples of professionally written compute-intensive Unix applications which are broken down into multiple tiny do-one-thing-well processes strung together with pipes, because I would find that amazing. It is a painful model for writing anything serious, and it is not efficient either (piping poo poo around isn't free). The main reason "do one thing well" is popular is that it's a great tool for rapidly throwing together programs when you don't care much about maintainability or performance. The downside is that it's a great tool for rapidly etc. (because the rapidity tempts people into going to that well too often, and soon you have a lovecraftian nightmare of shell script that has to be maintained). quote:You still can't defeat a true I/O bottleneck or Amdahl's law, but monolithic applications can often make things sequential that don't have to be and by separating them out you can get more cores into effective use. That is not the example you think it is. Hadoop's entire reason to exist is massive parallelism... at the cluster level. Using it for the described job size is like firing up one of those monstrous 40-foot-tall mining ore haulers to take a 1-mile trip to the post office instead of riding your bike. Bike dude gets back and has a frosty drink in hand before the ore hauler guy has even finished the engine start procedure. Drake is reminding people that overhead matters. (Also wouldn't be surprised if the Hadoop newbie Drake linked to made horrible optimization mistakes. 
Drake massively improved the performance of his own hack with a few simple tweaks, and that's the kind of thing you don't know to do when you're new to a technology.)
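For reference, the kind of job Drake was timing (tallying chess results out of PGN files) fits in a single streaming pass. A hypothetical single-process sketch of that shape (Python standing in for his shell pipeline, with made-up sample input):

```python
import collections
import io

def count_results(lines):
    """One-pass, constant-memory tally of PGN result tags."""
    tally = collections.Counter()
    for line in lines:
        # PGN result lines look like: [Result "1-0"]
        if line.startswith('[Result "'):
            tally[line.split('"')[1]] += 1
    return tally

# Tiny made-up sample standing in for gigabytes of game files:
sample = io.StringIO('[Result "1-0"]\n[Event "demo"]\n[Result "0-1"]\n')
print(count_results(sample))
```

The point isn't Python vs awk; it's that a streaming pass has essentially zero startup cost, which is exactly where the cluster loses at this job size.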
|
# ? Sep 27, 2016 07:43 |
|
Well this is interesting: https://www.techpowerup.com/226306/msi-announces-bios-updates-for-kaby-lake-7th-gen-core-processors Nice to know the Kaby Lakes don't need a 200-series board.
|
# ? Oct 1, 2016 08:42 |
Yeah, it's kind of an open secret that multithreading is (still) difficult, and even people and systems that are really, really good at it often can't scale out well, if at all, and are often much more error-prone. I think this is mostly independent of the unix do-one-thing-and-spit-text-both-ways philosophy; I mean, unix is this confusing clusterfuck of 7561523 tools and processes that gets parallelized by the OS to some degree, but nobody really cares about OS performance anymore. I'd imagine if machine learning takes off in a big way then we'll see some insane benefits to increased thread/core capacity that we can't even understand. Maybe that's actually the answer: instead of some massive complicated shitshow like threading in Java, we can all just twiddle our thumbs until the machines figure it out for us.
|
|
# ? Oct 1, 2016 14:35 |
|
BIG HEADLINE posted:Well this is interesting: https://www.techpowerup.com/226306/msi-announces-bios-updates-for-kaby-lake-7th-gen-core-processors
|
# ? Oct 1, 2016 14:56 |
|
Kaby Lake promises less CPU strain playing 4K videos. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have it? Or is it a useful feature both for built-in graphics and for users with dedicated graphics cards? e: \/ Thank you, thank you. lllllllllllllllllll fucked around with this message at 22:01 on Oct 2, 2016 |
# ? Oct 2, 2016 21:12 |
|
lllllllllllllllllll posted:Kaby Lake promises less CPU strain playing 4K videos. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have it? Or is it a useful feature both for built-in graphics and for users with dedicated graphics cards? Possibly whatever features they've pipelined into the chips to deal with high-bitrate video will have great implications for mobile computing. Even if a laptop only has a 1080p screen, if 4K becomes standard for downloads and streaming then people are gonna buy a CPU that can handle it, even if it's being downscaled most of the time. Not forgetting people using a laptop plugged into a secondary display as a presentation device. I guess it's a case of improving the section of the CPU that is utilised for video data streams.
|
# ? Oct 2, 2016 21:29 |
|
lllllllllllllllllll posted:Kaby Lake promises less CPU strain playing 4K videos. Is this something that would be delegated to a modern dedicated graphics card (like Nvidia's 10XX series) if you have it? Or is it a useful feature both for built-in graphics and for users with dedicated graphics cards? Although they advertise it as being for 4K video, what's going on is that they're adding support for a newer standard codec, which also benefits decoding and encoding in that format at 1080p or even 320x240. Most video cards out right now don't support it, but it'll eventually be standard there. The previous generation of CPUs supported the new HEVC codec, but only really well at 1080p resolution and 8 bits per channel of color depth. Kaby Lake has better support for HEVC that remains useful at 4K resolution and 10 bits per color channel AND adds hardware decoding for VP9 format video as well.
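Some rough arithmetic on why the 4K/10-bit step is the hard part (raw active-pixel rates only; this deliberately ignores chroma subsampling and blanking):

```python
def raw_video_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_channel * channels / 1e9

rate_1080p8 = raw_video_gbps(1920, 1080, 60, 8)    # ~3 Gbit/s of decoded pixels
rate_4k10   = raw_video_gbps(3840, 2160, 60, 10)   # ~15 Gbit/s of decoded pixels
print(round(rate_1080p8, 2), round(rate_4k10, 2), round(rate_4k10 / rate_1080p8, 1))
```

4x the pixels times 10/8 the bit depth means the decoder has to reconstruct 5x the raw data per second, which is why fixed-function blocks that stop at 1080p/8-bit run out of steam.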
|
# ? Oct 2, 2016 21:41 |
|
Is there anyone who is serious about the quality of their output (which you presumably are if you're making 4K) that is satisfied with GPU or Intel hardware offload? Also, I can't think of anyone in the world who would be pushed over the edge on a buying decision by a feature like this. It's just not something you hang a generation of products on.
|
# ? Oct 3, 2016 00:15 |
|
I figure most of the hardware-accelerated encode/decode is primarily oriented around streaming video use cases such as video teleconferencing and live-streaming video games.
|
# ? Oct 3, 2016 00:49 |
|
Fuzzy Mammal posted:It's just not something you hang a generation of products on. In a cynical world where Intel knows that a lot of battery test suites hinge heavily on things like "played a looped YouTube video until the laptop dies," something like this may give noticeable battery life boosts, and saying you're adding 25%+ to the "useful life" of a charge is absolutely something you can hang a mobile CPU generation on. On paper, anyhow.
|
# ? Oct 3, 2016 03:43 |
|
DrDork posted:In a cynical world where Intel knows that a lot of battery test suites hinge heavily on things like "played a looped YouTube video until the laptop dies," something like this may give noticeable battery life boosts, and saying you're adding 25%+ to the "useful life" of a charge is absolutely something you can hang a mobile CPU generation on. On paper, anyhow. Plus it's gonna reuse a bunch of the functional blocks for AVC/HEVC anyway, so the marginal silicon added is pretty low.
|
# ? Oct 3, 2016 03:58 |
|
Fuzzy Mammal posted:Is there anyone who is serious about the quality of their output, which you presumably are if you're making 4k, that is satisfied with gpu or intel hardware offload? Also I can't think of anyone in the world who would be pushed over the edge on a buying decision over a feature like this. It's just not something you hang a generation of products on. Yeah, this is all about battery life. Dropping CPU power usage watching HEVC video from ~12W to 1W is huge.
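Back-of-envelope on what that power drop means for runtime (the 50 Wh battery and ~5 W rest-of-system draw are assumptions for illustration, not measurements):

```python
def playback_hours(battery_wh, rest_of_system_w, decode_w):
    """Estimated video playback runtime on battery, in hours."""
    return battery_wh / (rest_of_system_w + decode_w)

software_decode = playback_hours(50, 5, 12)  # CPU doing HEVC in software, ~12 W
fixed_function  = playback_hours(50, 5, 1)   # dedicated decode block, ~1 W
print(round(software_decode, 1), round(fixed_function, 1))
```

Under those assumptions the fixed-function path gives nearly 3x the runtime, so "looped video until the laptop dies" benchmarks will absolutely show it.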
|
# ? Oct 3, 2016 05:08 |
|
My customers in the embedded segment are super stoked about the triple 4k outputs provided by the proc on Sky/Kaby without having to add a dedicated GPU FWIW.
|
# ? Oct 3, 2016 14:15 |
|
DrDork posted:In a cynical world where Intel knows that a lot of battery test suites hinge heavily on things like "played a looped YouTube video until the laptop dies," something like this may give noticeable battery life boosts, and saying you're adding 25%+ to the "useful life" of a charge is absolutely something you can hang a mobile CPU generation on. On paper, anyhow. If most people really are just watching Youtube videos then that's a useful thing to design for.
|
# ? Oct 3, 2016 15:12 |
|
SuperDucky posted:My customers in the embedded segment are super stoked about the triple 4k outputs provided by the proc on Sky/Kaby without having to add a dedicated GPU FWIW. But will any mobo manufacturer put three hdmi/dp outputs on their poo poo?
|
# ? Oct 3, 2016 15:41 |
|
Potato Salad posted:But will any mobo manufacturer put three hdmi/dp outputs on their poo poo? Did they stop doing that? I have a really cheap lovely P67 motherboard from 2011, and it's got DisplayPort, HDMI, and DVI. When I look at mobos now I don't see many that have DisplayPort; what happened?
|
# ? Oct 3, 2016 15:43 |
|
Mine has DisplayPort, DVI, and VGA; it wasn't very expensive, but it's a Z170. A quick glance at boards on the other LGA 1151 chipsets seems to show that you need to go for something higher-end if you want even one DisplayPort on H170/B150.
|
# ? Oct 3, 2016 15:51 |
|
Potato Salad posted:But will any mobo manufacturer put three hdmi/dp outputs on their poo poo? Imagine, when you're buying embedded solutions, being able to order things like more identical ports.
|
# ? Oct 3, 2016 16:04 |
|
If you're going to run three displays, you probably want to go balls out and have them all on DP. For that you currently still need a dedicated card. Having a combo of DP, HDMI and DVI sucks balls.
|
# ? Oct 3, 2016 16:04 |
|
Combat Pretzel posted:If you're going to run three displays, you probably want to go balls out and have them all on DP. For that you currently still need a dedicated card. Having a combo of DP, HDMI and DVI sucks balls.
|
# ? Oct 3, 2016 16:24 |
|
real_scud posted:DP will always and forever sucks balls until they update the spec to not have a monitor completely removed from showing up as accessible when you turn it off. This is an issue with monitors implementing the spec incorrectly.
|
# ? Oct 3, 2016 16:28 |
|
Displayport owns and I can't believe how slow it has been to catch on, especially because the licensing fees are cheaper than HDMI's, as I understand it.
|
# ? Oct 3, 2016 16:34 |
|
Twerk from Home posted:Displayport owns and I can't believe how slow it has been to catch on, especially because the licensing fees are cheaper than HDMI as I understand. There are no fees, iirc
|
# ? Oct 3, 2016 16:35 |
|
The cable-length limitations on DP limit its deployment scenarios compared to HDMI. Furthermore, companies have already invested plenty into HDMI, and its compatibility with TVs, projectors, etc. is undisputed. HDMI maxes out at 30 meters for 1080p @ 60Hz, while DisplayPort maxes out at 5 meters for official certification, with anything beyond that getting spotty in what bandwidth is actually supported.
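Distance aside, the payload math is worth keeping in mind when comparing the two. The effective rates below are the usual after-8b/10b-line-coding figures, and the pixel rate is active pixels only (blanking adds roughly another 5-10%):

```python
def pixel_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Active-pixel data rate in Gbit/s for 8-bit RGB."""
    return width * height * fps * bits_per_pixel / 1e9

# Approximate effective payload rates after 8b/10b line coding:
links = {
    'HDMI 1.4 (10.2 Gbps TMDS)': 8.16,
    'HDMI 2.0 (18 Gbps TMDS)': 14.4,
    'DP 1.2 HBR2 (21.6 Gbps)': 17.28,
}
need = pixel_rate_gbps(3840, 2160, 60)  # 4K60 8-bit RGB
for name, capacity in links.items():
    print(f'{name}: {"fits" if capacity >= need else "too slow"} for 4K60')
```

Which is why 4K60 had to wait for HDMI 2.0 on the TV side, while DP 1.2 had the headroom years earlier.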
|
# ? Oct 3, 2016 17:20 |
|
Potato Salad posted:But will any mobo manufacturer put three hdmi/dp outputs on their poo poo? We do 1 DP external and 2 DVI-D on the board itself. My Asrock z77 extreme4 had HDMI, VGA and DVI.
|
# ? Oct 3, 2016 17:53 |
|
PerrineClostermann posted:There are no fees, iirc VESA themselves don't charge any fees, but MPEG LA has patents covering aspects of DisplayPort and charges $0.20/unit to avoid any trouble. https://en.wikipedia.org/wiki/DisplayPort#Cost
|
# ? Oct 3, 2016 19:11 |
|
real_scud posted:DP will always and forever sucks balls until they update the spec to not have a monitor completely removed from showing up as accessible when you turn it off. The spec says monitors should stay connected in power-off mode. Most monitors just don't implement the spec right. Combat Pretzel posted:If you're going to run three displays, you probably want to go balls out and have them all on DP. For that you currently still need a dedicated card. Having a combo of DP, HDMI and DVI sucks balls. Yeah, especially since DisplayPort is the standard for "professional" monitors that get used with integrated graphics for your average work PC, and since DisplayPort conversion is one-way. You can convert DisplayPort to HDMI or DVI, which means that using DP lets you run any modern monitor at a minimum of 1440p@60 or 4K@30, but you cannot go the other way. This means that if your standard monitors are DisplayPort-only, anything with just HDMI or DVI outputs needs active converters. SuperDucky posted:We do 1 DP external and 2 DVI-D on the board itself. My Asrock z77 extreme4 had HDMI, VGA and DVI. DVI-D pretty much needs to die out like VGA did at this point, since again, you can adapt either HDMI or DP to DVI. I get hypothetically wanting to have HDMI on there since it's the standard connector for consumer equipment, but an HDMI<->DVI adapter is like two bucks. Paul MaudDib fucked around with this message at 23:26 on Oct 3, 2016 |
# ? Oct 3, 2016 23:23 |
|
Paul MaudDib posted:
I don't disagree, but we had to do it that way due to compatibility restrictions. We have a chassis with a built-in monitor on the faceplate that has to have DVI; also, I'd be scared of surface-mounting a DP connector on the board: it's too tall, not enough support. e: You wouldn't believe the hell we've gotten for not having VGA on this Skylake-S board. The very reason we never went to HDMI was that it didn't have a collar/lock on the connector; DP did, so we bit the bullet. PLUS, iirc, there's no way to pull an analog/VGA signal out of the HD/P530 graphics on Skylake-S without an offboard BMC or graphics controller, which there was absolutely no room for on the already slam-full PICMG 1.3 board. SuperDucky fucked around with this message at 01:24 on Oct 4, 2016 |
# ? Oct 4, 2016 01:20 |
|
SuperDucky posted:I don't disagree, but we had to do it that way due to compatibility restrictions. We have a chassis with a built-in monitor on the faceplate that has to have DVI, also, I'd be scared of surface mounting a DP connector on the board, its too tall, not enough support. VGA and DVI should be actively eradicated, unless you're Dell or Lenovo and making PCs that work in lovely corporations with 15 year old projectors or something.
|
# ? Oct 5, 2016 09:01 |
|
I have a DP->VGA adaptor that cost 5 bucks and seems to work fine; I don't see much point in keeping those ports around on the actual hardware.
|
# ? Oct 5, 2016 10:24 |
|
Isn't DVI still the only way to use those overclockable Korean monitors?
|
# ? Oct 5, 2016 11:21 |
|
HMS Boromir posted:Isn't DVI still the only way to use those overclockable Korean monitors? It's a shame if you have one of those monitors (I do!), but I guess next time I get a new video card I'll probably get something with G-Sync/FreeSync anyway. Here's hoping that DP->DLDVI converters go down in price; one would work really nicely for a secondary.
|
# ? Oct 5, 2016 11:28 |