|
MrYenko posted:For me, moving from Nehalem to Skylake isn't even about performance, it's just to get out of my ancient X58 chipset motherboard, and even that isn't because of speed concerns, but because the thing is flat out old. Dead USB ports, it hasn't had a functioning onboard network interface in years, and I really feel like it's the weak point of my machine, currently. Don't forget the fact that it's some horrible mix of PCIe 1.1 and 2.0a, and you have both slots available after the video card.
|
# ? Feb 10, 2015 21:14 |
|
|
Yeah, PCI-e upgrades are one thing that's happened over several years that's possibly worthwhile to upgrade for, and trying to avoid a lot of the PCI-e lane splitting BS is another reason I opted for a server platform in 2011. Then there's SATA2 / SATA3 ports to consider, since newer SSDs are bottlenecked on SATA2 (and guess who bought a top-of-the-line SATA SSD without thinking of this? Me, of course). M.2 slots are going to eclipse all of these as well, so at least I'll be able to use those super fast SSDs coming out later this year... when they're on sale, of course. Another thing about server motherboards is that most don't have onboard sound (rightfully so). I felt really silly when I wound up buying a sound card in 2014 because I simply didn't have one and realized I needed one all of a sudden. Putting my Xeon into service as a dedicated server is just feeling good at this point for me.
|
# ? Feb 10, 2015 23:27 |
|
necrobobsledder posted:Yeah, PCI-e upgrades are one thing that's happened over several years that's possibly worthwhile to upgrade for, and trying to avoid a lot of the PCI-e lane splitting BS is another reason I opted for a server platform in 2011. Then there's SATA2 / SATA3 ports to consider, since newer SSDs are bottlenecked on SATA2 (and guess who bought a top-of-the-line SATA SSD without thinking of this? Me, of course). M.2 slots are going to eclipse all of these as well, so at least I'll be able to use those super fast SSDs coming out later this year... when they're on sale, of course. I have yet to find onboard sound that does not add distortion. I currently use an awesome Asus card that cost me like $20 and has a SPDIF out.
|
# ? Feb 10, 2015 23:35 |
|
mayodreams posted:I have yet to find onboard sound that does not add distortion. I currently use an awesome Asus card that cost me like $20 and has a SPDIF out. The little dedicated card that fits onto my Maximus VII Gene sounds pretty good playing through my Marantz.
|
# ? Feb 11, 2015 22:00 |
|
mayodreams posted:I have yet to find onboard sound that does not add distortion. I currently use an awesome Asus card that cost me like $20 and has a SPDIF out. I'm not sure what kind of noise I'll get when using toslink for audio out, but I get the impression that the processing will all happen onboard and still have all the board interference noise (and especially jitter) when converting to the signal for toslink optical. I'm not exactly doing audio recording but it's really annoying hearing random noises when I start up a high CPU task. I used to get hard drive whining noises on an older Cirrus Logic chipset card from Turtle Beach a decade ago (all that trouble to avoid having to use the Linux software mixer) that I didn't seem to get on Windows, so there may be a software component involved that I'm not aware of (aside from the ASIO stuff that is).
|
# ? Feb 11, 2015 22:51 |
|
atomicthumbs posted:The little dedicated card that fits onto my Maximus VII Gene sounds pretty good playing through my Marantz. I've been curious about that. I'm still on a Z68 board, but I'm sure things have improved some. I am sending SPDIF to some Edirol studio monitors which are just amazing. I've had issues in the past with USB DACs getting bogged down with a high CPU or audio load - like playing a simple game and having music on too.
|
# ? Feb 11, 2015 23:32 |
|
How do I get Quick Sync Video to work on my 4790K? I've made sure the onboard GPU is enabled and I installed the Intel drivers, but nothing that uses QSV seems to work.
|
# ? Feb 11, 2015 23:53 |
|
You dinguses talking motherboard audio should go to the audiophile ridicule thread. S/PDIF is a digital connector. There's no loving "jitter," all the processing is through the Windows software sound mixer and comes out bit-perfect identical whether it's onboard S/PDIF, split off HDMI, USB to S/PDIF audio interface, or a $500 internal sound card's S/PDIF interface. Digital. The output is 1 or 0. It can't jitter. The digital interfaces on the card completely skip the DAC and any component which can produce any noise of any variety. The only thing that the sound card and driver can do is up-mixing from stereo to surround (or 5.1 to 7.1 etc.) for content that isn't natively multi-channel. atomicthumbs posted:How do I get Quick Sync Video to work on my 4790K? I've made sure the onboard GPU is enabled and I installed the Intel drivers, but nothing that uses QSV seems to work. In the BIOS/UEFI setup, is iGPU Multi-Monitor set to "Enabled?" If it already is and it's still not working, try hooking up a monitor cable (like hook a DVI to another input on your screen or whatnot). No need to actually put an image on screen via the Windows multi-monitor settings, but it used to be a requirement for enabling QuickSync and might be worth trying as a troubleshooting step. Alternatively, if you have an Nvidia video card, just use NVEnc - it's pretty good, too.
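One way to sanity-check QSV from software (a sketch, assuming an ffmpeg build compiled with Quick Sync support; the commands below are collected into strings and echoed rather than executed, so they can be read here and run on the actual machine):

```shell
# Hypothetical QSV sanity checks; h264_qsv is ffmpeg's name for its Quick Sync
# H.264 encoder. Echoed for inspection; run the commands directly on a machine
# with the iGPU and Intel drivers set up.
LIST_CMD="ffmpeg -hide_banner -encoders | grep qsv"    # should list h264_qsv
TEST_CMD="ffmpeg -f lavfi -i testsrc=duration=5:size=1280x720:rate=30 -c:v h264_qsv -b:v 3M qsv_test.mp4"
echo "$LIST_CMD"
echo "$TEST_CMD"
```

If the first command lists nothing, the build has no QSV support at all; if the test encode fails with an init error, it's usually the iGPU/driver side rather than the application.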
|
# ? Feb 12, 2015 02:45 |
|
necrobobsledder posted:I'm not sure what kind of noise I'll get when using toslink for audio out, but I get the impression that the processing will all happen onboard and still have all the board interference noise (and especially jitter) when converting to the signal for toslink optical. I'm not exactly doing audio recording but it's really annoying hearing random noises when I start up a high CPU task. what F^2 said, jitter is a real phenomenon but it isn't very significant for digital audio in 99% of cases. But really I'm posting this because those random noises you hear when starting up a high CPU task? May not be what you think. CPUs get their power from switch-mode regulators. Switchers have magnetic coil inductors, which can mechanically flex a teeny little bit as the electric current flowing through them changes. And that current is changing; modern CPUs tend to rapidly switch between lots of current and none when they're not fully loaded. These switching events may sometimes happen at audible frequencies... In other words sometimes it is possible to hear noises emanating from components on your motherboard rather than your speakers, if the room's quiet and there's nothing else masking the sound.
|
# ? Feb 12, 2015 07:09 |
|
My fairly cheap dac gets crackling sounds through toslink input, but it has a usb input option which works fine. Granted I'm on a super cheapo Asrock board from a while ago. The processing happens on your dac either way. Toslink is going to give you a line-level signal. (Some dac manufacturers specifically call out that some toslink transmitters don't work too well at 24/192, which is probably what you're shooting for, but you should probably be fine.)
|
# ? Feb 12, 2015 09:28 |
|
BobHoward posted:what F^2 said, jitter is a real phenomenon but it isn't very significant for digital audio in 99% of cases. https://www.youtube.com/watch?v=HP73edpQwgc Certain Nvidia 970s were doing that pretty badly. edit: oh yeah, lol'ing at the jitter worry.
|
# ? Feb 12, 2015 09:41 |
|
Dammit, looks like my Z77 board is getting flaky, and not detecting graphics cards 100% of the time. Normally I'd just replace the old board and chip with something newer and sell the chip, but I'm kinda attached to this 2500K and not seeing much benefit to going Haswell. This might be the first time I ever replace an old motherboard with something of similar spec.
|
# ? Feb 13, 2015 00:41 |
|
wipeout posted:Dammit, looks like my Z77 board is getting flaky, and not detecting graphics cards 100% of the time What model do you have? I am still on a p67 board running strong for 4 years now!
|
# ? Feb 13, 2015 02:30 |
|
Darkpriest667 posted:What model do you have? I am still on a p67 board running strong for 4 years now! This board is an ASRock Z77E-ITX - it just intermittently decides there is nothing in the PCI-e slot. Cleared CMOS, tried another graphics card (one Nvidia, one AMD), reseated them, updated BIOSes and tried betas. Same deal. (Bugger is, I can find a lot of cheapish ATX S1155 boards, but most of the mITX ones are overpriced now. Was hoping to keep this alive until there was something really compelling to upgrade to)
|
# ? Feb 13, 2015 03:22 |
|
wipeout posted:(Bugger is, I can find a lot of cheapish ATX S1155 boards, but most of the mITX ones are overpriced now. Was hoping to keep this alive until there was something really compelling to upgrade to) Amazon has brand new ASUS P8Z77-I DELUXE/WDs for $143 after a $50 rebate. Seems like a solid deal all things considered.
|
# ? Feb 13, 2015 04:27 |
|
SamDabbers posted:Amazon has brand new ASUS P8Z77-I DELUXE/WDs for $143 after a $50 rebate. Seems like a solid deal all things considered. That's pretty awesome and I appreciate it. I'll have to see what I can find in the uk, though. GRINDCORE MEGGIDO fucked around with this message at 20:02 on Feb 13, 2015 |
# ? Feb 13, 2015 06:35 |
|
BobHoward posted:In other words sometimes it is possible to hear noises emanating from components on your motherboard rather than your speakers, if the room's quiet and there's nothing else masking the sound. It's also possible to analyze this noise and use it to extract RSA keys. In other news, Skylake Core-M is coming later this year according to Krzanich. Haswell-EX specs:
|
# ? Feb 13, 2015 12:52 |
|
wipeout posted:Dammit, looks like my Z77 board is getting flaky, and not detecting graphics cards 100% of the time Get a can of compressed air or something and blow out the connector slot. I had a similar issue with my board and it turned out to be dust in the PCIE slot.
|
# ? Feb 13, 2015 12:56 |
|
Welmu posted:It's also possible to analyze this noise and use it to extract RSA keys. Can anyone explain to me why the "top" (at least model number wise) part is a 4 core 140w part. Seems a bit insane, the frequency difference is relatively minor to the 18 core part.
|
# ? Feb 13, 2015 13:02 |
|
BurritoJustice posted:Can anyone explain to me why the "top" (at least model number wise) part is a 4 core 140w part. Seems a bit insane, the frequency difference is relatively minor to the 18 core part. A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W TDP 4 core proc will get you better results than an 18 core with 25W more TDP that will throttle cores to maintain that TDP max.
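The clocks-vs-cores tradeoff can be sketched with a toy model (all core counts and clock speeds below are made-up illustrations, not actual SKU specs):

```shell
# Toy throughput model: lightly threaded work favors the high-clock quad,
# embarrassingly parallel work favors the many-core part.
# Throughput here is just (cores actually used) * (clock in GHz).
OUT=$(awk 'BEGIN {
  qc = 4;  qg = 3.2;   # hypothetical 140 W 4-core: cores, GHz
  mc = 18; mg = 2.6;   # hypothetical 165 W 18-core: cores, GHz
  printf "1 thread  : quad %.1f vs many %.1f\n", qg, mg
  printf "18 threads: quad %.1f vs many %.1f\n", qc * qg, mc * mg
}')
echo "$OUT"
```

In this idealized model the quad is roughly 23% faster on a single thread while the 18-core has nearly 4x the aggregate throughput, ignoring turbo, throttling, and memory effects entirely.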
|
# ? Feb 13, 2015 14:44 |
|
Welp RIP Broadwell desktop according to the rumor mill. Your little 14nm lanes were burning too bright for this world.
|
# ? Feb 13, 2015 18:29 |
|
SwissCM posted:Get a can of compressed air or something and blow out the connector slot. I had a similar issue with my board and it turned out to be dust in the PCIE slot. Air duster ordered, you rock!
|
# ? Feb 13, 2015 20:01 |
|
wipeout posted:This board is an ASRock Z77E-ITX - it just intermittently decides there is nothing in the PCI-e slot. Cleared CMOS, tried another graphics card (one Nvidia, one AMD), reseated them, updated BIOSes and tried betas. Same deal Bummer. I have a Fatal1ty Pro board, also by ASRock. I swear by the Fatal1ty Professional series; I've owned 3 of them with different chipsets and they have always been real good to me. I like ASRock as an alternative to ASUS and Gigabyte.
|
# ? Feb 14, 2015 00:00 |
|
sauer kraut posted:Welp RIP Broadwell desktop according to the rumor mill.
|
# ? Feb 14, 2015 00:00 |
|
mayodreams posted:A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W TDP 4 core proc will get you better results than an 18 core with 25W more TDP that will throttle cores to maintain that TDP max. But Intel has 4.4GHz 4-cores in 88W TDPs, and the only difference I can see really is the extra cache and QPI links, which can't make up the difference. Looking at the clock speeds, I should've compared to the 10-core: 6 more cores and 12.5% slower clocks for only a 25W delta?
|
# ? Feb 14, 2015 00:23 |
|
mayodreams posted:A 22% increase in clock speed is not a minor difference. Some workloads are not heavily threaded, so clock speed becomes king, and a 140W TDP 4 core proc will get you better results than an 18 core with 25W more TDP that will throttle cores to maintain that TDP max. Also you can put 8 of them in a system
|
# ? Feb 14, 2015 00:24 |
|
BurritoJustice posted:But Intel has 4.4GHz 4 cores in 88w TDPs, and the only difference I can see really is the extra cache and QPI links which can't make up the difference. Noting the clock speeds I should've compared to the 10 core. 6 more cores and 12.5% slower clocks for only a 25w delta? Those might be partially defective chips that are salvageable by pumping massive amounts of power through them. Not ideal but better than simply throwing them away if their yields are struggling.
|
# ? Feb 14, 2015 04:21 |
|
isndl posted:Those might be partially defective chips that are salvageable by pumping massive amounts of power through them. Not ideal but better than simply throwing them away if their yields are struggling. Depending on how they fuse the chips, it could also be a frequency/voltage/TDP tradeoff, where marginal chips with a really leaky set of cores get fused off and they just allocate the TDP on the other half to bumping the frequency/voltage up to compensate. 10 Cores with a 160W TDP due to a flaky subset of cores turns into a 4670K with ECC ram.
|
# ? Feb 14, 2015 04:34 |
|
BurritoJustice posted:Can anyone explain to me why the "top" (at least model number wise) part is a 4 core 140w part. Seems a bit insane, the frequency difference is relatively minor to the 18 core part. That part exists for software that is licensed per core and needs the larger memory capacity of the EX platform. BurritoJustice posted:But Intel has 4.4GHz 4-cores in 88W TDPs, and the only difference I can see really is the extra cache and QPI links, which can't make up the difference. Looking at the clock speeds, I should've compared to the 10-core: 6 more cores and 12.5% slower clocks for only a 25W delta? Whole lot more transistors on the E7, Base vs Turbo, more stringent validation, market positioning. This should clear things up: Here's a picture of a Haswell i7: Here's a picture of the active* components of a 4-core, 40 MB E7: Note that the cores are only a fraction of the overall die. I am somewhat surprised that the 8883v3 is slower than the 8883v2, though. * this is an Ivy Bridge EX, so the DDR4 controllers will (probably) be larger on the Haswell-EX. Same 22 nm process, though. Some portion of the blanked-out areas would still be active.
|
# ? Feb 14, 2015 05:28 |
|
Darkpriest667 posted:Bummer. I have a Fatal1ty Pro board, also by ASRock. I swear by the Fatal1ty Professional series; I've owned 3 of them with different chipsets and they have always been real good to me. I like ASRock as an alternative to ASUS and Gigabyte. Ya, this little ITX board has been awesome - fingers crossed it is just some crud in the slot. I'd buy ASRock again, no question. Hoping Broadwell socketed chips are actually going to appear - aren't they the last procs Z97 will support?
|
# ? Feb 14, 2015 06:30 |
|
Methylethylaldehyde posted:Depending on how they fuse the chips, it could also be a frequency/voltage/TDP tradeoff, where marginal chips with a really leaky set of cores get fused off and they just allocate the TDP on the other half to bumping the frequency/voltage up to compensate. 10 Cores with a 160W TDP due to a flaky subset of cores turns into a 4670K with ECC ram. Although fusing is a thing, there are very different dies involved here (see PCjr's pic) and they do not ever turn one of these big monster parts into a 4670k. Intel's current pattern is to do (at least) three groups of designs (ignoring everything for mobile): Xeon E7 Xeon E5 / LGA20xx enthusiast desktop Xeon E3 / LGA11xx desktop There is no crossover based on fusing between the three groups because they are very different designs. And the E7 is really different. They're targeted at very large and very high reliability servers, where cost is not a problem and exotic features are common. For example, E7 doesn't have standard DDR memory interfaces on-chip, instead it speaks a different protocol to arrays of external buffer/controller chips, allowing it to support truly fuckoff quantities of memory. (Probably multiple terabytes for the new E7 v3 family. Looks like E7 v2 already supported up to 1.5TB.) Intel doesn't have a great need to play fusing games in order to recover poorly performing or partially defective E7 chips. The prices are so high ($5K or more per chip is not out of the question) that they can easily afford low yield. E7 is also held to much more stringent reliability requirements. They're very conservative with binning. That, plus the extra power used by RAS and scalability features, is why you end up with much lower top-bin frequencies than the non-behemoth product lines.
|
# ? Feb 14, 2015 08:01 |
|
sauer kraut posted:Welp RIP Broadwell desktop according to the rumor mill.
|
# ? Feb 14, 2015 13:43 |
|
BobHoward posted:E7 is also held to much more stringent reliability requirements. They're very conservative with binning. That, plus the extra power used by RAS and scalability features, is why you end up with much lower top-bin frequencies than the non-behemoth product lines. Yeah, I probably picked a lovely part to compare it to; it was the first "stupid fast, 4 cores" thing I could think of. It's still probably TDP budgeting: disable the leaky cores and rebin it as a different SKU. That improves yields and allows for a sort of natural market segmentation effect without needing to actually produce more than one set of silicon masks. They can certainly afford to be super picky about which parts end up in the $5K CPU bin, but even partial recovery on a high-cost item is better than no recovery.
|
# ? Feb 14, 2015 14:12 |
|
Looks like there's enough demand for those 2TB+ RAM graph processing nodes that LinkedIn and some others wanted from OEMs that Intel decided to chip in with that kind of memory hardware. Neat.
|
# ? Feb 14, 2015 18:11 |
|
Can anyone explain why Twitch streamers keep buying Xeon CPUs for encoding x264 for streaming when consumer-grade CPUs murder them in performance? http://www.anandtech.com/bench/CPU/53
|
# ? Feb 16, 2015 23:05 |
|
If you look at the x264 2-pass results, the Xeons murder the desktop CPUs.
|
# ? Feb 16, 2015 23:17 |
|
Live streaming x264 presets are strictly 1-pass, I thought.
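For reference, a one-pass live encode might look like this (a sketch; the input file, bitrates, and RTMP URL/stream key are placeholders, and the command is echoed rather than executed so it can be inspected):

```shell
# Hypothetical one-pass live-encode command line; input.mkv and STREAM_KEY are
# placeholders, and the flags assume a stock ffmpeg build with libx264.
# A live encode gets exactly one pass, so -preset/-tune trade compression
# efficiency for CPU headroom and latency instead of quality-per-bit.
STREAM_CMD="ffmpeg -re -i input.mkv \
  -c:v libx264 -preset veryfast -tune zerolatency \
  -b:v 3500k -maxrate 3500k -bufsize 7000k \
  -c:a aac -b:a 160k \
  -f flv rtmp://live.twitch.tv/app/STREAM_KEY"
echo "$STREAM_CMD"   # echoed for inspection; remove the echo to actually run it
```

Two-pass rate control needs the whole file up front to spend its bit budget optimally, which is exactly what a live stream can never have, so the 2-pass benchmark numbers don't map onto streaming workloads.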
|
# ? Feb 16, 2015 23:19 |
|
I thought the point of the Xeons was that you can get more than four cores, so you can encode and stream multiple things in parallel, or stream and play on the same box without much of an impact.
|
# ? Feb 16, 2015 23:26 |
|
Honestly I thought they were mostly buying Xeons because it lets them set higher asking points for donations.
|
# ? Feb 16, 2015 23:28 |
|
|
BobHoward posted:(Probably multiple terabytes for the new E7 v3 family. Looks like E7 v2 already supported up to 1.5TB.) 1.5 TB per socket necrobobsledder posted:Looks like there's enough demand for those 2TB+ RAM graph processing nodes that LinkedIn and some others wanted from OEMs that Intel decided to chip in with that kind of memory hardware. Neat. You could get a 4 socket, 2TB node from IBM in 2011 on the E7v1s. You see these in a lot of large database boxes; Intel has made a lot of money off of Oracle for a while now. They'll even do stuff like provide Oracle-exclusive SKUs; $300K (list) will get you a 120 core, 6 TB system from Lenovo. Fajita Fiesta posted:Can anyone explain why Twitch streamers keep buying Xeon CPUs for encoding x264 for streaming when consumer-grade CPUs murder them in performance? http://www.anandtech.com/bench/CPU/53 I don't know what Xeons they're buying, but a WAG would be that encoding and gaming at the same time probably benefits from the extra cores over the consumer i7s in a way that isn't reflected in the synthetic encode-only benchmark.
|
# ? Feb 16, 2015 23:30 |