Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MaxxBot posted:

Back when I got my 6600k everyone thought that 4/4 would be perfectly fine for gaming looking forwards a few years, that turned out to be wrong. I do think the 9400F is a good part but I'd just be a bit wary of skimping on threads after that experience, especially since the next gen consoles are gonna be 8/16. I don't expect the 9400F to ever be passed up in raw gaming performance by the Ryzen 6/12 but I worry about weird frametime spikes in badly optimized games like you're starting to see with 4/4 CPUs.

4C4T started getting long in the tooth after Haswell. If you were trying to keep the budget down you didn't really have any other options, but it was not a long-term-oriented pick, it was something you bought to try and tide you over for another couple 3 years. If you were going for 3-5+ years (which is where we're at - the 6600K came out in 2015 after all) you should have gone for the 6700K or the 5820K. Yeah it sucks but sometimes future proofing actually does work, it's not gonna last you forever but paying the i7 tax does get you another year or two at the bitter end. The same thing happened with Sandy Bridge, the 2500K was feeling the pain by the 6000 series and the 2600K was hanging in there a little better.

Like with the "RAM speeds don't matter!" thing, not gonna defend other people's bad advice, but I do think you may be expecting too much from a 3+ year old midrange processor. Buying high-end does get you more lifespan, that's the tradeoff you make. By the time 6C6T starts having problems (in stuff other than poorly-coded ubisoft titles running intrusive DRM) will you really care, especially given that AMD has started pushing things forward again?

But yeah, not gonna lie, if we put a single timestamp on the whole "Intel got lazy" thing, it would be the 6000 series. Up until that point Intel had pushed 6C down to $350 on the HEDT platform with the 5820K, and taken 8C16T from server-only onto the flagship slot in the HEDT lineup. If they had followed that up by pushing 6C onto the consumer platform then Zen would have been DOA. The 8000 series lineup (6C12T = i7, 6C6T = i5, 4C4T = i3) should really have been the 6000 series lineup. On the HEDT platform, the 6900K should have dropped to midrange pricing, with the 6950X coming in at the $1000 price point of the 5960X. Instead HEDT prices actually increased and the consumer lineup stagnated.

(sounds absurd but again, that's what happened to the HEDT lineup in the past... 6C went from $1000 on X58, to $600 on X79, to $350 on X99... 8C should have followed a similar progression and dropped to $500-600 in the 6000 series but instead Intel raised prices slightly and then slotted the 6950X over the top, and let the consumer lineup stagnate.)

Paul MaudDib fucked around with this message at 18:18 on Apr 22, 2019


Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Of course as to the general question of "should I buy an i5 right now", the answer is probably no. I don't think Zen2 is going to straight-up beat Intel in gaming performance (especially given some rumblings about bad clocks) but I think it'll close the gap enough that you don't care, especially if AMD can maintain prices that are similar to Zen+ launch prices. Coffee-Lake-ish performance 8C16T for ~$330 would be a winner, Coffee Lake-ish performance 6C12T for ~$220-250 would be a winner.

Or, the 2600X will probably get even cheaper once the 3000 series launches. 2600X is already $180 and under if you extreme coupon, I wouldn't be surprised to see that hit $125 after the 3000 series launches. Even if you want Ryzen, the smart money is on holding off another month to see what Zen2 looks like at Computex (probably). Prices are going nowhere except down.

Paul MaudDib fucked around with this message at 18:54 on Apr 22, 2019

Stickman
Feb 1, 2004

I can see the appeal, but the more I think about it, the less buying a high-end computer to ride "longevity" makes sense to me. Right now the performance differences between a 2600X/9400F and a 9700K/9900K mostly show up in the 100+ fps range, where I personally can't really notice a subjective difference, and you save a lot of money by going mid-range and then upgrading sooner rather than later, if you're willing to part out your old components. If you really want or need to push those high fps, why are you trying to eke the last bits of life out of your old system? Its parts are depreciating too, so you're probably better off parting it out for a decent upgrade now instead of later, when bits start dying.

CPUs, motherboards, and to some extent GPUs all hold a decent amount of their value until they die or are far outclassed, so just a little bit of effort can save money and smooth out experience over time!

E: I get that there's a hobbyist component, too, but even then - this way you get to build computers more often!

Cygni
Nov 12, 2005

raring to post

PC LOAD LETTER posted:

???

You don't need DDR 3400+ with either the 1xxx or 2xxx Ryzens unless you're trying to get peak synthetic bench scores and its kinda silly to talk about high speed RAM with either a budget Intel or AMD build anyways.

I was talking about the chart I linked with a 9400F and 2666 memory being roughly the same as a 2600X with 3400.

That said, if you haven’t checked prices lately, a 2x8 kit of 2666 is $70, 3200 is $87, and 3400 is $92. We aren’t breaking the bank anymore.

The last few years, a bonus of Intel platforms was you could use cheaper ram as Ryzen liked memory frequency and Intel didn’t scale as drastically. With prices like they are now, I think that argument is less relevant.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Stickman posted:

I can see the appeal, but the more I think about it, the less buying a high-end computer to ride "longevity" makes sense to me. Right now the performance differences between a 2600X/9400F and a 9700K/9900K mostly show up in the 100+ fps range, where I personally can't really notice a subjective difference, and you save a lot of money by going mid-range and then upgrading sooner rather than later, if you're willing to part out your old components. If you really want or need to push those high fps, why are you trying to eke the last bits of life out of your old system? Its parts are depreciating too, so you're probably better off parting it out for a decent upgrade now instead of later, when bits start dying.

CPUs, motherboards, and to some extent GPUs all hold a decent amount of their value until they die or are far outclassed, so just a little bit of effort can save money and smooth out experience over time!

E: I get that there's a hobbyist component, too, but even then - this way you get to build computers more often!

There is also the confounding factor of resale value. You can get every penny you put into a 7700K back out (except maybe the mobo), and in the meantime you used an i7 basically for free for two years. Rightly or wrongly (it's wrongly), a lot of people want to drop a new processor into their existing system rather than build a whole new one, and so they chase the best upgrade for that particular board. There's only so many of those processors to go around, and a lot of people who want them. And in contrast, there's a surplus of low-end chips like the 6400 because people want to upgrade and get rid of them. A 6400 or 7400 has lost almost half its value since launch.

In contrast, the aggressive price cuts and lack of platform segmentation on AMD causes resale values to drop like a stone. First-gen Ryzen buyers lost about 2/3 of their value in 2 years. A Sandy Bridge buyer lost that much value over... 7 years. The problem is, the 2700 that is now selling for $220 is directly competing against your 1800X that you bought for $530 on launch day. A 9900K/8700K doesn't directly compete against a 7700K, and the 8350K doesn't compete against the 7600K, because they don't fit the same boards.

This leads to different optimal strategies for the different platforms... AMD you are better off buying a generation behind (already depreciated) or midrange (less price to depreciate). Intel you are better off just sucking it up and dropping the cash for the flagship on launch day, even if you intend to resell it in 2 years.

Paul MaudDib fucked around with this message at 19:12 on Apr 22, 2019

Stickman
Feb 1, 2004

I agree to some extent, but I think absolute price depreciation matters more than relative depreciation. Whenever I look at used Intel CPU prices, the i5s and i7/i9s tend to show the same absolute depreciation over the first four or five years. For instance, the 4790K now sells for $190-220 ($120-150 less than MSRP) and the 4570 for $50-60 ($130-140 less than MSRP). So in terms of depreciation cost you come out about the same either way, i5 or i7, but the i7 requires a greater up-front cost. If that extra performance is enough to warrant the up-front cost, it makes the high-end chips a better value proposition than, say, high-end GPUs, but the i5s aren't more expensive in the long run.

E: Basically I think either way, longevity is only a budget-friendly (and performance-friendly) consideration if it's "a couple of years" rather than "this'll help me in 4+ years".

Stickman fucked around with this message at 19:29 on Apr 22, 2019

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
Factoring in resale for a CPU purchase? That is a new one.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
The other way of thinking about it is that stepping up from an i5 to an i7 using your example only costs the extra ~$150 up-front cost once and then you get to ride the higher performance from then on, over multiple cycles.

That is, your 4790k goes from $350 -> $200, while the 4570 goes from $200 -> $50, and assuming the same spread continues in future generations, the X790k also goes for $350 and eventually sells for $200, while the X570 starts at $200 and eventually sells for $50, so the cost to go from the 4790k -> X790k is $150, the same as the cost to go from the 4570 to the X4570.

So for an upfront additional $150, you now get better performance over multiple generations, which would seem worthwhile, and the performance-value-per-dollar only increases the longer you keep that cycle up. Of course, that assumes Intel's pricing schemes and resale values hold relatively stable cycle to cycle, which is not something you can always assume.
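Spelling the arithmetic out (using the example prices above, which are round illustrative numbers, not exact street prices):

```python
# Per-cycle cost of ownership if you resell before each upgrade.
# MSRP and resale figures are the rough examples from this discussion.
def cost_per_cycle(msrp, resale):
    return msrp - resale

i7_cycle = cost_per_cycle(350, 200)   # 4790K-style flagship
i5_cycle = cost_per_cycle(200, 50)    # 4570-style midrange

print(i7_cycle, i5_cycle)  # 150 150 -- identical cost per generation...

# ...so the flagship tier only ever costs the one-time extra outlay:
extra_upfront = 350 - 200
print(extra_upfront)  # 150, paid once, for higher performance every cycle
```

The whole sketch obviously lives or dies on that last assumption holding across generations.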

Risky Bisquick posted:

Factoring in resale for a CPU purchase? That is a new one.

If you're not the type to hodl forever, it's not really as crazy as it sounds. Plenty of people do the same thing when considering GPU purchases, based on the reality that in a year or two something faster will be out, but there's always a healthy market for used hardware.

Stickman
Feb 1, 2004

DrDork posted:

The other way of thinking about it is that stepping up from an i5 to an i7 using your example only costs the extra ~$150 up-front cost once and then you get to ride the higher performance from then on, over multiple cycles.

That is, your 4790k goes from $350 -> $200, while the 4570 goes from $200 -> $50, and assuming the same spread continues in future generations, the X790k also goes for $350 and eventually sells for $200, while the X570 starts at $200 and eventually sells for $50, so the cost to go from the 4790k -> X790k is $150, the same as the cost to go from the 4570 to the X4570.

So for an upfront additional $150, you now get better performance over multiple generations, which would seem worthwhile, and the performance-value-per-dollar only increases the longer you keep that cycle up. Of course, that assumes Intel's pricing schemes and resale values hold relatively stable cycle to cycle, which is not something you can always assume.


If you're not the type to hodl forever, it's not really as crazy as it sounds. Plenty of people do the same thing when considering GPU purchases, based on the reality that in a year or two something faster will be out, but there's always a healthy market for used hardware.

That's a good point!

Risky Bisquick posted:

Factoring in resale for a CPU purchase? That is a new one.

I generally buy used high-end GPUs near the end of their cycle and then swap them out when they start feeling anemic. In the end I'm enjoying continuous top-of-the-mid-tier performance for less yearly cost than flogging a top-tier card far past its performance period. Plus, if you do the same thing with new cards you can stay within warranty and ensure you'll never get screwed by an out-of-warranty failure.

E: Of course, if you're a nice person and give your old equipment to friends/family, then the whole framework changes!

Stickman fucked around with this message at 19:44 on Apr 22, 2019

hobbesmaster
Jan 28, 2008

Stickman posted:

I can see the appeal, but the more I think about it, the less buying a high-end computer to ride "longevity" makes sense to me. Right now the performance differences between a 2600X/9400F and a 9700K/9900K mostly show up in the 100+ fps range, where I personally can't really notice a subjective difference, and you save a lot of money by going mid-range and then upgrading sooner rather than later, if you're willing to part out your old components.

A better CPU improves 1% low frametimes, which can be pretty noticeable when you're at 100+ fps on a 144Hz or 240Hz screen. Though I guess if you're running one of those screens at full frame rate it's probably because you're playing a competitive FPS, in which case all your video settings are on the lowest.

Cygni
Nov 12, 2005

raring to post

My personal bar for being willing to deal with selling poo poo to the general public is like... $300. Which means I don’t flip CPUs.

If it’s worth less than that, I’m probably going to give it away or convince myself to store it for no rational reason (what if I need this GTX 960!!!!)

craig588
Nov 19, 2005

by Nyc_Tattoo
I give everything away. Want to build me a new PC? You can have my old one. It's never worth it to me to bother selling anything. I also tend to keep things until they physically fail. I got a Haswell because the capacitors on my 775 6800 popped and the computer stopped booting. It kept working for a few months with bulging capacitors but when it finally stopped booting I built another machine with cost as little object so I have a 5820K now.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
It's just bizarre to factor resale in. You buy what you need/want at the time for the budget you currently have. Later on you can resell your CPU when you wholesale change the platform, which, let's be honest, this being the Intel thread, means you get two years per motherboard if you're at the beginning of a cycle.

Stickman posted:

That's a good point!


I generally buy used high-end GPUs near the end of their cycle and then swap them out when they start feeling anemic. In the end I'm enjoying continuous top-of-the-mid-tier performance for less yearly cost than flogging a top-tier card far past its performance period. Plus, if you do the same thing with new cards you can stay within warranty and ensure you'll never get screwed by an out-of-warranty failure.

E: Of course, if you're a nice person and give your old equipment to friends/family, then the whole framework changes!

GPU != CPU

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Risky Bisquick posted:

It's just bizarre to factor resale in. You buy what you need/want at the time for the budget you currently have. Later on you can resell your CPU when you wholesale change the platform, which, let's be honest, this being the Intel thread, means you get two years per motherboard if you're at the beginning of a cycle.

Whether or not you want to factor in resale is entirely up to how you want to look at things. For people who don't bother/want to sell stuff when they're done, obviously that factor doesn't matter at all. Same for people who want to build an entirely new machine every so often and aren't much worried about $100 here or there.

Resale does come into play when you're very cost constrained, and that $100 you recoup by selling the CPU is a make-or-break issue for your entire upgrade.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

DrDork posted:

Both the PS4 and XBox One have 8 core CPUs (well, 2x 4-core for the PS4) since their release in 2013 and we're still really not seeing huge issues with 4- and 6-core desktop CPUs keeping up.

The xb1 is also a 2x4-core FYI. AMD offered both companies the same quad core Jaguar CPU tile as a SoC building block, they both elected to build a SoC with two copies of it.

The reason desktop CPUs are having no problem keeping up is that Jaguar cores are crap. Clock speed is modest, about 1.5 GHz iirc (a bit faster in the xb1x), and clock for clock they are light years behind *lake or Zen. Jaguar was the successor to Bobcat, a core designed for cheap netbooks. It’s optimized for small physical size and low power, not performance.

How bad is it? I’m having trouble locating any benchmarks of Jaguar cores, but I would not be surprised at all if, in practice, a Haswell i3 (also from 2013) running at 3 GHz or better already had equal or better x86 compute throughput than the 2013 consoles. Just 2 cores, but you probably gain a factor of 2 in x86 instructions per clock, and definitely gain a factor of 2 in clock frequency.

What’s more, think about this choice. I offer you one of two computers. They both score the same on a benchmark that spawns 1000 fully independent threads, each doing an identical amount of work. One of the computers got the score by using 1000 hardware threads, the other just one HW thread. They cost the same. Which one would you rather have?

Unless you’re a datacenter trying to allocate whole cores to customers, it’s always the lower thread count option. Amdahl’s Law never stopped being a thing.

So anyways, today’s console games shouldn’t be a problem for anything but the lowest tier desktop cpus, no matter how imposing the core count delta might seem at first glance.
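A toy model of that two-machine choice, in case the intuition isn't obvious (the speeds and the parallel fraction are made up purely for illustration):

```python
# Two machines with identical scores on a 1000-thread embarrassingly
# parallel benchmark: A = one thread running 1000x faster, B = 1000
# threads at 1x. On real work with a serial fraction, Amdahl's Law
# decides the winner. Work is normalized to 1.0.
def runtime_single_fast(work=1.0, speed=1000.0):
    return work / speed                  # everything runs at 1000x

def runtime_many_slow(p, work=1.0, n=1000):
    serial = (1 - p) * work              # runs on one slow thread
    parallel = p * work / n              # spread across all n threads
    return serial + parallel

p = 0.99  # even a generously parallel workload (invented number)
print(runtime_single_fast())   # 0.001
print(runtime_many_slow(p))    # ~0.011 -- ten times slower
```

With p = 1.0 the two machines tie exactly; any serial fraction at all, and the single fast thread pulls away.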

fishmech
Jul 16, 2006

by VideoGames
Salad Prong
The Xbox One is clocked at 1.75 GHz, the Xbox One X at 2.3 GHz. The PS4 is clocked at 1.6 GHz, the PS4 Pro at 2.13 GHz.

Then there's a mess of differences in RAM amount, speeds, types, and also GPU cores/shaders and counts beyond all that. The salient point is that the XBO plain is behind the PS4 more than you'd expect it to be, but simultaneously the Xbox One X is ahead of the PS4 Pro quite a ways more than you'd initially expect.

Otakufag
Aug 23, 2004
For the love of Jesus Christ our lord and savior, please let this be the console generation where all games will finally run at 60fps.

FunOne
Aug 20, 2000
I am a slimey vat of concentrated stupidity

Fun Shoe
That'll be the next generation following, the one that only does 4k60 HDR or something.


Then all the games will have native resolution scaling to keep the FPS up. :argh:

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Otakufag posted:

For the love of Jesus Christ our lord and savior, please let this be the console generation where all games will finally run at 60fps.

Rehumanize yourself and face to bloodshed a Switch.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

Otakufag posted:

For the love of Jesus Christ our lord and savior, please let this be the console generation where all games will finally run at 60fps.

1080p60 :razz:

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
I'm thinking both Sony and Microsoft will put out next-gen consoles that can do 4K60 gaming but said next-gen consoles will probably cost $500-$600.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

spasticColon posted:

I'm thinking both Sony and Microsoft will put out next-gen consoles that can do 4K60 gaming but said next-gen consoles will probably cost $500-$600.

The PS5 has already floated a $500 starting price in the recent ~rumor~ pieces.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

spasticColon posted:

I'm thinking both Sony and Microsoft will put out next-gen consoles that can do 4K60 gaming but said next-gen consoles will probably cost $500-$600.

Yes, in the same way that the XBox One is "4k capable": it supports upscaling to a 4k output! :hurr:

Now if you mean internally rendering at 4k60... very unlikely. Even at bumped-down details, you'd need at least a 2070 equivalent to make that reasonable. That's a $500+ video card right now. Even if you assume that it drops in price over the next year prior to console launches, and Sony/MS get bulk discounts, you're still talking $300+ just for the GPU, and since contracts for the hardware are likely already signed, I'd bet they'd be closer to $400. Who knows what the CPU/RAM configuration will cost, but for the PS4 it was about $175 (or just shy of 50% of total retail price). Even if we keep that largely static at $200, we're already up to $500-$600 and we still have an SSD and everything else to add in. And we haven't even started to talk about the difficulty of actually selling a $600 console; consumers have traditionally not really gone in on consoles that expensive. Sony found that out the hard way with the PS3's launch pricing. I'd be real surprised if they launched at anything above $500.
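Totting those guesses up (the SSD/misc figure is an invented placeholder just to show the shape of the math, everything else is the rough estimate above):

```python
# Back-of-the-envelope BOM for a hypothetical native-4k60 console.
# All figures are this thread's guesses, not sourced contract prices.
bom = {
    "gpu": 400,           # "closer to $400" for a 2070-equivalent
    "cpu_and_ram": 200,   # PS4's was ~$175, held roughly static
    "ssd_and_rest": 75,   # hypothetical: storage, board, PSU, cooling
}
total = sum(bom.values())
print(total)  # 675 -- comfortably past the ~$500 consumers will tolerate
```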

4k60 rendering might be doable for the PS5 Pro / XBox Two X or whatever they're gonna call the inevitable mid-life refresh, but it just isn't gonna happen at launch except for graphically simple games.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Not just cost: a 2070 by itself has a higher power budget than entire consoles right now. Throw in at least another 45W for a CPU, even at modest clocks and assuming 7nm. And then you need a cooler that can dissipate all of that in a 4L form factor (or whatever) without sounding like a leafblower.

Is it technically possible to design a console that could do native 4K60? Yeah, but you start running into the same compromises as an extremely SFF mini-PC, because that's all a console is. You can engineer some of the problem away with a custom cooling solution, but noise and heat are still a challenge; there is only so much space for a finstack and only so much air you can push at a reasonable volume in an SFF chassis.

Paul MaudDib fucked around with this message at 04:49 on Apr 23, 2019

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

These things are also both running on AMD APUs. Console optimizations count for a lot, but AMD isn't really in the 4k race currently, and I've got my doubts they're about to come out with a chip that'll do 4k60, with a sane amount of power consumption, at console-level pricing.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Unless AMD is jumping way past NVIDIA, and you can see they're not, it isn't happening. Again, they will be jumping to a Zen or Zen2 core at probably like 3 GHz or so, maybe 3.2, plus a Navi GPU. Even if you are extracting 4K60 performance out of a 2070-sized chip, that is a lot of power and heat and cost for a console, and so far AMD can't even do that. Plus, even on 7nm and even at modest clocks, it's going to be another 45W for the CPU. Press X to doubt.

(of course AMD really does have to get their act together with Navi, they hardly gained with VII even with a node shrink to 7nm. On air, the clocks are practically no different. Higher on water, and they really need to be getting those clocks on air with Navi to even stand a chance.)

Paul MaudDib fucked around with this message at 06:58 on Apr 23, 2019

PC LOAD LETTER
May 23, 2005
WTF?!

Cygni posted:

I was talking about the chart I linked with a 9400F and 2666 memory being roughly the same as a 2600X with 3400.
Sure, but you were also talking about building a budget system, and while it won't break the bank anymore, your dollars will still go farther spent elsewhere in a budget build.

Especially since Ryzens like low-latency RAM quite a bit (high-speed but high-latency RAM doesn't get you as much of a benefit with them, up to a point anyways), and low-latency DDR4 3200/3400 is much more expensive than the cheap CL16-18-18-etc kits. Those cheap kits won't offer much benefit over a lower-latency DDR4 3000 kit, and it's usually much easier to tune and tweak the timings on one of those instead. The CL14 DDR4 3200 16GB kits run around $140 right now.
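The reason the cheap high-frequency kits don't gain you much falls out of a one-line formula, absolute CAS latency in nanoseconds (the kit timings here are the examples being discussed, not a survey of what's on sale):

```python
# First-word latency in nanoseconds: CAS cycles divided by memory clock.
# DDR transfers twice per clock, so clock (MHz) = transfer rate (MT/s) / 2,
# giving latency_ns = cl * 2000 / mts.
def cas_ns(cl, mts):
    return cl / (mts / 2) * 1000

print(round(cas_ns(14, 3200), 2))  # 8.75  -- the ~$140 CL14-3200 kit
print(round(cas_ns(18, 3400), 2))  # 10.59 -- a cheap CL18-3400 kit
print(round(cas_ns(16, 3000), 2))  # 10.67 -- a budget 3000 kit: nearly the same
```

So the expensive CL18-3400 kit lands within a tenth of a nanosecond of a plain 3000 kit; only the pricey low-CL bins actually move the number.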

edit: Console chat: Power-wise you'd probably have to assume the limit for consoles to be around 150-200W given the sort of cooling they tend to put into them as well as the price, sound, and size restrictions they have to work in. I think the XB1X already draws around 180W in some games and the PS4 tends to be around 150W or so. The ~100W "limit" was more for older consoles that used passive cooling, or smaller HSF units with lower speed fans to be quiet, by default and/or stuck to $300 or less launch prices which aren't really relevant anymore. \/\/\/\/\/\/\/\/

PC LOAD LETTER fucked around with this message at 08:38 on Apr 23, 2019

Arzachel
May 12, 2012
The mid-cycle refresh consoles drew well over 100w and with the rumoured $500 price, I'm pretty sure it'll stay that way.

Cygni
Nov 12, 2005

raring to post

Rest of the 9-series parts released. Nothing especially crazy.

https://www.anandtech.com/show/14256/intel-9th-gen-core-processors-all-the-desktop-and-mobile-45w-cpus-announced

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
What's up with manufacturers and their current hatred of front-facing I/O ports on their ATX cases? One of those new ATX cases with top-facing ports would fit just nicely underneath the compartment of my table, but the I/O ports would be completely blocked by the underside of the table. Conversely, if I put the case on my table the I/O ports will be damned awkward from a usability standpoint.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Leaked Intel roadmaps: I knew 10nm was still in rough shape and wouldn't make it to desktop for at least a year, but if these are true things are even worse than I thought :stare:. Ice Lake appears to be mobile-only and even then just a limited release; the shitshow continues.

https://twitter.com/witeken/status/1121052220072583174
https://twitter.com/witeken/status/1121054888102961153
https://twitter.com/witeken/status/1121056506148003845?s=19

MaxxBot fucked around with this message at 18:11 on Apr 24, 2019

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I just read the QEMU 4.0 change log, and they say Intel considers MPX failed. That's a new one.

Khorne
May 1, 2002
It's a reasonable move if they can compete with AMD on price once their increased 14nm fab capacity comes online.

10nm from Intel isn't going to clock as well as 14nm++..+, and it's unlikely to bring a huge IPC increase. Combine that with Intel 10nm / TSMC 7nm likely being a short-lived stepping-stone node, due to EUV being here for real this time, finally, and you have the decision they made.

Intel is completely dominant in the laptop space and needs to maintain their lead there. The characteristics of the 10nm process are great for that. No one wants to buy a consumer desktop CPU on 10nm that performs worse than the refined 14nm process.

Khorne fucked around with this message at 18:22 on Apr 24, 2019

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Combat Pretzel posted:

I just read the QEMU 4.0 change log, and they say Intel considers MPX failed. That's a new one.

https://intel-mpx.github.io/overview/

It breaks lots of programs (i.e. is not just a recompile) and has bad performance overhead, and there are software-only bounds checking systems which do better. Perhaps they could have fixed it to be more useful, but I guess they gave up.

eames
May 9, 2009

MaxxBot posted:

Leaked Intel roadmaps: I knew 10nm was still in rough shape and wouldn't make it to desktop for at least a year, but if these are true things are even worse than I thought

I do wonder if they plan to break socket compatibility or not, I’d totally replace my launch 8700K with a 10C/20T model if it runs on Z370.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
Was there a leaked roadmap for server stuff too?

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

priznat posted:

Was there a leaked roadmap for server stuff too?

just glue two or more of the desktop roadmaps together

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

eames posted:

I do wonder if they plan to break socket compatibility or not, I’d totally replace my launch 8700K with a 10C/20T model if it runs on Z370.

Hope springs ever eternal. It'd be nice to be able to get some extra life out of my Z390.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

priznat posted:

Was there a leaked roadmap for server stuff too?

They got out the glue this time for the Platinum 9200 chips, the only limit now is the size of the package.


Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

MaxxBot posted:

Leaked Intel roadmaps: I knew 10nm was still in rough shape and wouldn't make it to desktop for at least a year, but if these are true things are even worse than I thought :stare:. Ice Lake appears to be mobile-only and even then just a limited release; the shitshow continues.

Only shipping 10nm mobile for 2+ years is just unbelievably bad. I don’t think it’s true because it would effectively be handing AMD enormous marketshare.
