snickothemule posted:Hey fellas, with all the self isolation going on I think I'm developing a fresh case of brain worms, I'm looking at a 1660 v4 ES chip (6900k equivalent) that is reported to be overclockable. Being an engineering sample I understand is a huge risk, but having 2 extra cores over my 6800k could do wonders for some of my workloads in photogrammetry and give me 40 PCIe lanes instead of the current 28 (which isn't a problem now, but I may end up adding a u.2 drive down the track). AFAIK v4s are not overclockable, although the ES specifically may be.

What's the s-spec (it will be a 4-letter Q code for an engineering sample)? The 1660v3 retail chips are overclockable and are down to around $160 on eBay. I would strongly encourage you to not gently caress with ES silicon unless the price is really, really right. Haswell-E performs really close to Broadwell when overclocked and I don't think you'll notice any difference unless you really push it balls to the wall. You can probably flip your 6800K for at least $100 or so, and a $60 upgrade isn't really a bad deal for two more cores.

Zen2 is priced really well and is now basically outperforming X99 in all respects on a per-core basis, but you will be spending more like $270 for that 8-core or $400 for a 12c, and Zen2 cannot substitute for the PCIe lanes unless you move to TR3000, where the entry level is $1400. Waiting for the 4000 series is an option too, but prices will bump back up to MSRP, and we'll see whether that's better than the deals you can get on 3000 today as prices continue to decline. Zen2 is a major performance increment; Zen3 will have to prove its case for its higher prices, just like Coffee Lake does.

Paul MaudDib fucked around with this message at 03:16 on Mar 31, 2020
# ? Mar 31, 2020 03:13 |
|
|
|
It's a QK3S chip, sitting at around $280 USD; supposedly it can do 4.2GHz at 1.25v, which has raised my eyebrow a bit. I've also seen another unit that only does the standard clock speed that was also an engineering sample IIRC. I'm predominantly thinking of selling the 6800k if this chip is ok; mostly I'm interested in gaining the two cores without having to upgrade CPU, MB, and RAM. I'm still rather happy with how the 6800k performs for its age; for my workloads and for gaming it still hangs in there ok, unless there's something that really only depends on clock speed, which is why I'm partial to trying to get a chip closer to 6900k performance with those additional cores. Those v3's look pretty good, but if I can get this model to OC like a few folks have reported over on reddit and overclockers, it might be worth my time, even though Broadwell is such a strange product.

Also, thanks Paul. I appreciate your insight.

snickothemule fucked around with this message at 04:33 on Mar 31, 2020
# ? Mar 31, 2020 04:27 |
|
That's like, a basic overclock on Haswell-E. You can do very close to that if not the same thing on one of those 1660v3s, especially given that it's ES silicon - Broadwell was the first 14nm chip and early silicon was not good. Auto voltage on my 5820K gets me 4.13 at 1.27v for sure and I've never hosed around exhaustively to see the exact limit. With sufficient voltage, Haswell-E will do about 4.6 or 4.7 and Broadwell-E will do as high as 4.7 or 4.8.

I would strongly encourage you to get a used 1660v3 off eBay instead of spending almost twice as much on an ES. That's crazy and a waste of money; it's one thing if it's basically a $60 upgrade for one component, but at that point you should just pony up for a 3700X or a 3900X. ES are known to have weird quirks and may or may not run on your board (the board needs microcode to support each specific processor); they are worth it if they're cheap, but not at almost $300.

If you want your best shot at good silicon, try to send a message to one of the 1660v3 sellers on eBay and ask them to handpick a chip with a production code that starts with J. These are all late-production silicon that stands a very good chance of overclocking effortlessly. Don't reveal the wu-tang secret though lol, or they'll start upcharging for them. (Not that there's a huge community of X99 owners, let alone ones who are still using them, I guess...)

I'd love to own a 6950X too, but it's just not worth the money, even used. Broadwell has always been too expensive and a poor seller compared to Haswell-E, and this has translated into inflated prices on the used market as well. It has very little to offer except for the 10C version: it's barely higher IPC, it doesn't clock notably better, and the power consumption difference is minor. It's just not worth paying twice as much for buggy engineering samples.

Paul MaudDib fucked around with this message at 16:28 on Mar 31, 2020
# ? Mar 31, 2020 04:56 |
|
D. Ebdrup posted:So like I started with saying: we're screwed, and might as well go back 20 years unless we're doing HPC. :P

I mean, AMD keeps eating their lunch, might as well go dump some money into SiFive.
|
# ? Mar 31, 2020 05:11 |
|
Comet Lake-H, Intel’s response to Renoir: https://www.hd-tecnologia.com/estas-son-las-especificaciones-de-los-nuevos-intel-comet-lake-h-para-portatiles/
|
# ? Mar 31, 2020 07:37 |
|
I'm surprised there hasn't been talk of Intel's new Xeon R-line processors, their weird subterfuge to avoid dropping CPU prices. Feels like they are finally starting to feel pressure from AMD. We were planning to purchase a new server and the Xeon Gold 6248 was the prime candidate, when we noticed the 6248R for a slightly lower price. But instead of a cheaper 6248, the new processor is more like a rebranded Platinum 8268, twice the price of the 6248.

Intel Xeon Gold 6248R Benchmarks and Review Big Refresh Gains
https://ark.intel.com/content/www/us/en/ark/compare.html?productIds=192481,199351,192446
|
# ? Apr 1, 2020 01:00 |
Probably because most of us don't spend $3,000+ USD on a CPU?
|
# ? Apr 1, 2020 12:14 |
|
I've noticed it, in a quote from HPE. We were comparing Epyc and Intel, and bam, the new part was right there in the quote, with a much lower price than I've seen before from Intel. There's not much to talk about though; AMD still has the better product in general. Prejudice is rampant, however.
|
# ? Apr 1, 2020 13:15 |
|
Cygni posted:Cyrix has been dead for 20 years my dude. NatSemi bought them, then flipped them to VIA, who used the IP to make their low power C3 chips. They were still repackaging it for the embedded market, even making a quad core version called Eden x4, but it's not in production anymore. They are using ARM now.

Rastor posted:Centaur still Hauls, in China

This is interesting: Gamers Nexus smuggled one of the Cyrix/Centaur/VIA/Zhaoxin x86 systems out of China https://www.youtube.com/watch?v=-DanhnASClQ

Edit: but this is one of the old VIA Nano designs, not the new microarchitecture

Rastor fucked around with this message at 14:03 on Apr 2, 2020
# ? Apr 2, 2020 13:45 |
|
Rastor posted:This is interesting: Gamers Nexus smuggled one of the Cyrix/Centaur/VIA/Zhaoxin x86 systems out of China

I'm legit surprised they even got the thing to RUN some of those tests. That is such an ancient arch, and it was low performance when it was brand new... a decade ago.

In other news, the first Tiger Lake U (with Xe) 3DMark runs with everything actually running at some sort of real clockspeed are showing up: https://twitter.com/TUM_APISAK/status/1245899475991662592

5% ahead of the Ryzen 4800U in the graphics score, and the CPU is running at a locked 3GHz and not turboing, so there is probably more performance to come. On one hand, that's good for an IGP... the best for any laptop ever. On the other hand, it's not really the revolution some were hoping for. People got excited about IGP and it will probably still be disappointing, I know, shocking.
|
# ? Apr 3, 2020 04:37 |
|
Cygni posted:People got excited about IGP and it will probably still be disappointing, i know, shocking.

Says you. I've been waiting for this moment for years. In one fell swoop, Intel and AMD have killed off Nvidia MX GPUs, and I couldn't be more pleased.
|
# ? Apr 3, 2020 07:43 |
Die-shot from AnandTech featuring 8 cores of Coffee Lake silicon.
|
# ? Apr 3, 2020 08:56 |
|
SwissArmyDruid posted:Says you. I've been waiting for this moment for years. In one fell swoop, Intel and AMD have killed off Nvidia MX GPUs, and I couldn't be more pleased.

Agreed. The MX series always seemed to be highly compromised devices--not really performant enough to do much, but sucked a lot more power than an iGPU, so they'd hurt battery life considerably. Always felt like you never got enough of anything to make it a legit good option, instead of "well this is what we've got." If an iGPU can replace the MX entirely, that's all the better for the thin-and-light laptops I enjoy.
|
# ? Apr 3, 2020 16:06 |
|
so if Rocket Lake is expected to have Xe graphics, and that very early leak from Tiger Lake, which does have Xe graphics, suggests performance that's comparable to a Ryzen APU, doesn't that create a situation where any non-F Rocket Lake CPU is going to be able to drive graphics at a level similar to, say, an Athlon 3000G or a Ryzen 2200G? That seems kind of interesting, I'm kinda thinking from the perspective of even office computers with a Rocket Lake Pentium having a decent iGPU. Meanwhile Comet Lake on the desktop is still going to be using UHD, right?
|
# ? Apr 5, 2020 04:29 |
|
gradenko_2000 posted:so if Rocket Lake is expected to have Xe graphics, and that very early leak from Tiger Lake, which does have Xe graphics, suggests performance that's comparable to a Ryzen APU, doesn't that create a situation where any non-F Rocket Lake CPU is going to be able to drive graphics at a level similar to, say, an Athlon 3000G or a Ryzen 2200G?

Yep. It's also interesting because the dedicated Intel GPU card should have 2x or 4x the performance of a single Xe iGPU due to the rumored chiplet design, though there will be differences in CUs and certainly frequency.
|
# ? Apr 5, 2020 07:05 |
|
Comet Lake won't be available (and reviews are embargoed, bullshit) until May 27th. They are keeping the April 30th "launch date", but it's a paper launch. It was initially supposed to be available in Feb/March, and with only 1 part that is really new. Prices better be great, otherwise it's getting into "might as well wait for Zen3" territory.
|
# ? Apr 6, 2020 18:35 |
|
I've been tossing around the idea of a pc refresh whenever Nvidia's next x80ti comes out, but with the pandemic I'm bored and eyeing Comet Lake. Good idea or wait it out, since the x80ti is probably a year out still? I'm not sure what the next proc after Comet Lake will bring. Currently on an I7-6700k and 1080ti.
Sipher fucked around with this message at 20:53 on Apr 6, 2020 |
# ? Apr 6, 2020 20:33 |
|
Sipher posted:I've been tossing around the idea of a pc refresh whenever Nvidia's next x80ti comes out, but with the pandemic I'm bored and eyeing Comet Lake. Good idea or wait it out, since the x80ti is probably a year out still? I'm not sure what the next proc after Comet Lake will bring. Currently on an I7-6700k and 1080ti.

I'm in the same boat. The rumor is that the next x80ti is coming this year. The next Intel CPU is Rocket Lake, which is Tiger Lake backported to 14nm. So finally new features, but likely the same temp/power issues keeping a lid on performance. I was gonna get the 10 core Comet Lake, but now I'm thinkin I'll get the 3080 Ti instead and see what Zen 3 has to offer.

On the 6700k, GN has a good comparison video. https://www.youtube.com/watch?v=LCV9yyD8X6M
|
# ? Apr 6, 2020 21:06 |
|
Sipher posted:I've been tossing around the idea of a pc refresh whenever Nvidia's next x80ti comes out, but with the pandemic I'm bored and eyeing Comet Lake. Good idea or wait it out, since the x80ti is probably a year out still? I'm not sure what the next proc after Comet Lake will bring. Currently on an I7-6700k and 1080ti.

I think I'm probably stuck holding out in the 12-18 month timeframe for Intel HEDT to be competitive against Zen 3 — Zen 2 Threadripper at work has spoiled me, but have always run Intel at home. But, if they can't get their poo poo together... :rajatear:
|
# ? Apr 6, 2020 23:01 |
|
# ? Apr 6, 2020 23:22 |
|
It still slightly puzzles me when people say their 6xxx series or newer unlocked chips need an upgrade. Outside of some big multithreaded workloads, how big really is the performance difference between those and a super current chip in real world usage? I know benchmarks can show the difference, but at the same time, actual personal experience with the hardware can sometimes feel a bit closer than the numbers may suggest.

I built a few Ryzen 3700/3800X systems recently and while they were faster, they didn't feel that exponentially faster than my ancient rig. Hell, even running a few benchmarks only threw them a stone's throw above, since we were all using similar GPUs too (though it was a 2080 Super vs my non-Super). I still can max out ultrawide gaming, run a VM, and do some remote work at the same time on my now ancient X79 1660v1 with a 2080. I can imagine stuff being faster and better, but I wonder truly by how much.

Don't get me wrong, I would love to upgrade someday, but since I recently became house poor, I'm pretty much stuck with this beast for at least a few more years.
|
# ? Apr 7, 2020 19:48 |
|
4 or 6 cores can be a performance bottleneck in certain multitasking use cases like live streaming and certain games, and if you do any sort of video encoding or software rendering, each extra core is that much more speed. The territory you speak of, the less obvious stuff, is still a pretty big difference: power consumption, memory speed/latency, USB 3/C, and NVMe also add up to big gains.
|
# ? Apr 7, 2020 20:20 |
|
LRADIKAL posted:4 or 6 core can be a performance bottleneck in certain multitasking use cases like live streaming and certain games. If you do any sort of video encoding or software rendering each is that much more speed.

Yea, I know about the more cores for multithreaded-heavy tasks, which has prompted me to debate grabbing a 1680v2 for this old rig, but for the most part I can still play and livestream 1080p60 rather well, which is nice, though it is using the 2080's NVENC encoder vs software. As for all the other things, this old thing has NVMe, an add-on USB 3.0 card for VR, as well as the onboard 3.0 ASMedia stuff that's hit or miss for sure.

I think the biggest benefit might be power consumption for me; however, it will probably remain about the same with whatever new build, since I'd stick with HEDT and any power savings I get will be canceled out by all the cores among other things. Idle might be better though.
|
# ? Apr 7, 2020 20:31 |
|
EdEddnEddy posted:It still slightly puzzles me when people say their 6xxx series or newer unlocked chips need an upgrade. Outside of some big multithreaded workloads, how big really is the performance difference between those and a super current chip in real world usage?

The key for a 3700X build is fast-as-hell RAM, which might also benefit an Intel build. I've used some 6 core Intel builds with a single stick of RAM and an iGPU, because work computers. They work fine but are noticeably slower in response. Then again, I spent more on parts for my gaming AMD build than it cost for the Dell that I'm using at work, so I have way better RAM and a faster NVMe drive; it's not a fair comparison. I think I'd be as happy with my work 6 core i5 desktop with more RAM and my RX 5700, though. I just like to run stuff in the background, like watching a Netflix stream on another monitor, which certainly smooths things out with the extra cores.

I think most people are just being told to get good RAM with a Ryzen. 3600CL18 RAM takes longer from power on to BIOS splash screen than BIOS to Windows desktop. If I set my RAM to 2666CL16, which is more typical in an Intel build, it takes significantly longer to get into Windows (it's still fast as all hell). IF can't be that big a game changer, but without getting a feel myself I can't really comment, other than looking at benchmarks; and we all know average and max FPS aren't as important as worst case response time, which is when things get frustrating.
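For what it's worth, the two kits being compared can be put in numbers: first-word (CAS) latency in nanoseconds is roughly CL × 2000 / transfer rate in MT/s, since DDR does two transfers per clock. A quick sketch (rule-of-thumb figures, not measurements from the post):

```python
def first_word_latency_ns(cl: int, mts: int) -> float:
    """Approximate CAS latency in nanoseconds for DDR memory.

    cl:  CAS latency in clock cycles
    mts: transfer rate in MT/s (DDR makes 2 transfers per clock,
         hence the factor 2000 = 2 * 1000 ns)
    """
    return cl * 2000 / mts

# The two kits mentioned above:
print(first_word_latency_ns(18, 3600))  # 3600CL18 -> 10.0 ns
print(first_word_latency_ns(16, 2666))  # 2666CL16 -> ~12.0 ns
```

So the 3600CL18 kit is lower absolute latency as well as higher bandwidth, despite the bigger CL number.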
|
# ? Apr 7, 2020 20:57 |
|
LRADIKAL posted:4 or 6 core can be a performance bottleneck in certain multitasking use cases like live streaming and certain games. If you do any sort of video encoding or software rendering each is that much more speed.

This I would agree with. Where the 6600K sits in benchmarks isn't great in comparison, but it is often quite playable. Sadly I'm on a Cities Skylines kick at the moment and CPU usage is getting right up there for large cities. Ditto for Gal Civ 3 and the endgame, so more cores and hertz and IPC would be welcome. Unless their engines are lovely, then drat.

quote:The territory you speak of, the less obvious stuff is still a pretty big difference, power consumption, memory speed) latency, usb3/c, nvme also add up onto big gains.

My Z170X Gaming 7 board with a 6600K has DDR4 RAM running at 3200 fine, plus USB 3.1 & C and NVMe support. It's old, but not that old.
|
# ? Apr 7, 2020 23:33 |
|
The 6700k has been nowhere near taxed in any of my games; I play at 4k60 and it's always the GPU doing most of the work. The 6700k is easily managing workloads at 60 frames so far. It's got a few years yet.
|
# ? Apr 7, 2020 23:38 |
|
Zedsdeadbaby posted:6700k has nowhere been near taxed on any of my games, I play at 4k60 and it's always the gpu doing most of the work. The 6700k is easily managing workloads at 60 frames so far. It's got a few years yet.

The CPU load at 1080p and 4k should be nearly identical. The CPU load will increase if you go from 60fps to 144fps.
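The reasoning above falls out of the per-frame time budget: the CPU has to run game logic and submit a frame's worth of draw calls within 1000/fps milliseconds, regardless of resolution. A minimal sketch of that arithmetic:

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds the CPU has to simulate and submit each frame."""
    return 1000.0 / fps

for fps in (60, 144):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 60 fps leaves ~16.7 ms of CPU time per frame; 144 fps leaves only
# ~6.9 ms, which is why the same CPU can be fine at 4k60 yet become
# the bottleneck chasing high refresh rates at 1080p.
```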
|
# ? Apr 8, 2020 00:06 |
|
I know, that's why I referred to the 60 fps in particular. For a guy like me that's just swell enough, but peeps who game at 120+ fps will obviously need a beefier cpu sooner than I will
|
# ? Apr 8, 2020 00:24 |
|
I had a 7700k feeding my GTX 1080 video card at 1080p high Hz, and in several newer games it consistently couldn't keep up, making the GPU idle down. The next couple generations of video cards are definitely going to leave 4 core CPUs behind with anything past mid-range.
|
# ? Apr 8, 2020 00:44 |
|
Zedsdeadbaby posted:I know, that's why I referred to the 60 fps in particular. For a guy like me that's just swell enough, but peeps who game at 120+ fps will obviously need a beefier cpu sooner than I will

Also, people who are sensitive to micro stutter and transient fps drops still like stuff a little more cutting edge.
|
# ? Apr 8, 2020 01:18 |
|
5820k for life!
|
# ? Apr 8, 2020 01:36 |
|
DrDork posted:5820k for life!

If I was less lazy I'd post the "friendship with 2600K ending, now 3960X is my new friend" meme here. So many threads, so much RAM.
|
# ? Apr 8, 2020 03:57 |
|
kind of want to move my machine from an O11 Dynamic to an Ncase M1. should I buy an ITX mobo in anticipation of Z390 being phased out soon?

I suppose what I'm askin is: how imminent is the new Intel chipset?

Sphyre fucked around with this message at 08:09 on Apr 8, 2020
# ? Apr 8, 2020 07:31 |
|
Zedsdeadbaby posted:I know, that's why I referred to the 60 fps in particular. For a guy like me that's just swell enough, but peeps who game at 120+ fps will obviously need a beefier cpu sooner than I will

I'm running a 4790k with a 1070. If I run CoD: Warzone in 4k my GPU usage is 99% and my CPU is 65%-ish, and I get about 40fps. If I run at 1080p my GPU is 95% and my CPU is about 50%, and I cruise at over 100fps. This CPU just doesn't want to be replaced.
|
# ? Apr 8, 2020 08:48 |
|
DrDork posted:5820k for life!

I bought a used 5820k and an Asus X99 WS board and it's been the best build I've had. Wouldn't mind one of those Haswell-E 8 core Xeons, but the cheapest one is in the US and it adds a huge shipping cost onto it. I've seen second hand 5960X's go for around £200, but then that's in line price-wise with some of the modern Ryzen stuff. Although maybe if they drop a bit more in price.
|
# ? Apr 8, 2020 12:39 |
|
EdEddnEddy posted:It still slightly puzzles me when people say their 6xxx series or newer unlocked chips need an upgrade. Outside of some big multithreaded workloads, how big really is the performance difference between those and a super current chip in real world usage?

I think part of it may have to do with the vulnerability mitigations. While your average consumer may not feel the performance drop from them, and even some high end users for that matter, I think the peace of mind that you're using a chip that has hardware defenses against the recent big name attacks is driving some of it. Or it gives them an excuse to upgrade earlier than planned anyway, even if they didn't notice.
|
# ? Apr 8, 2020 14:41 |
|
Sphyre posted:kind of want to move my machine from an O11 dynamic to an ncase m1. should I buy an itx mobo in anticipation of z390 being phased out soon

Rumors have the next chipset coming out in the coming months, but nothing official that I've seen. The ASRock Phantom Gaming ITX is generally considered the best mITX Z390 board; it at least has the best VRM and 3 fan headers. I'm using it with a 9900K in the Ncase and it works great. I believe the next best is the ASUS ROG.
|
# ? Apr 8, 2020 15:23 |
|
iospace posted:I think part of it may do with the vulnerability mitigations.

The real pros here are using chips old enough to not be vulnerable to some of those issues in the first place!
|
# ? Apr 8, 2020 15:44 |
|
Nutsak posted:I'm running a 4790k with a 1070. If I run CoD: Warzone in 4k my GPU usages is 99% and my CPU is 65%-ish, I get about 40fps. If I run at 1080p my GPU is 95% and my CPU is about 50% and cruise at over 100fps. This CPU just doesn't want to be replaced.

If the game is maxing out individual cores then you may benefit from an upgrade. In that scenario, the CPU will never hit 100% total. Also, playing at an atrocious frame rate is easier on the CPU. Most people aim for 60fps.
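That "one core pegged while the total reads 50%" situation is easy to check numerically. A hypothetical sketch (the function and variable names are mine, not from any monitoring tool; on Linux the busy/total tick counters could come from two reads of /proc/stat, on Windows from Task Manager's per-core view):

```python
def per_core_utilisation(prev, curr):
    """Percent utilisation per core between two (busy, total) tick snapshots."""
    return [
        100.0 * (b1 - b0) / (t1 - t0)
        for (b0, t0), (b1, t1) in zip(prev, curr)
    ]

def looks_core_bound(utils, threshold=95.0):
    """True if any single core is saturated, even if the average is modest."""
    return max(utils) >= threshold

# Illustrative snapshot deltas for a 4c/8t CPU: one thread pegged,
# overall average only ~50% -- exactly the situation described above.
prev = [(0, 0)] * 8
curr = [(100, 100), (55, 100), (50, 100), (45, 100),
        (40, 100), (35, 100), (30, 100), (25, 100)]
utils = per_core_utilisation(prev, curr)
print(sum(utils) / len(utils))   # average ~47.5%
print(looks_core_bound(utils))   # True: core 0 is saturated
```

The point of the sketch: a game with one heavy main thread can be completely CPU-bound while the overall utilisation figure looks comfortable.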
|
# ? Apr 8, 2020 17:52 |
|
|
It's certainly something I'd test. I'd honestly never thought to check each core. The issue I have when I think of upgrading my CPU is whether to hold out for the Covid-19 stuff to end (potential price drops) and get a 10-series because of the new socket, or just bite the bullet, get a 9-series, and hope I don't need to upgrade in the next couple of years.
|
# ? Apr 8, 2020 21:25 |