|
Palladium posted:man have some sympathy for the small 10M+ sub channel who can't afford to get a real 9700K No idea how, he's got to be one of the most obnoxious people on youtube. The PewDiePie of hardware reviews. Insha'allah, but I'm contemplating delidding and running the 9900k bare die. I've been doing this with my 4700k for like 4 years and it's been great. You're already cracking a $500 CPU in a fancy vice, what's another hour of dremelling some plastic on the motherboard socket?? Get your calipers and start stacking tiny washers, my dudes. forbidden dialectics fucked around with this message at 07:45 on Oct 22, 2018 |
# ? Oct 22, 2018 07:42 |
|
Winks posted:They've known that their 10 nm fabs were not going to be up by now for quite some time. Probably even earlier than that, like late 2017, right after it became apparent they weren't going to be able to make any miracles happen with their 10nm process and rumors of those half busted (no working iGPU) low clocked 10nm chips being sold in China on the down low started to pop up.
|
# ? Oct 22, 2018 07:52 |
|
Hyperthreading's benefits have always been somewhat dubious for games; some benefit from it, some are negatively impacted. The move to 8 physical cores likely makes it even less useful. My personal experience is that systems with HT enabled feel far more responsive under load, so given the option I'd always leave HT on. But anyway, this is LTT we are talking about. I believe the far more interesting question is why Intel (or their board partners) chose not to enforce the turbo power limit for the 9900K by default. Limiting it to 95W instead of unlimited (200W+) generally decreases performance in productivity applications by only 10-15% while reducing power consumption by 50% and eliminating all thermal issues. Average power consumption in games is around 75W, so with a 95W limit you only lose 1-2 FPS on average and 99th percentile frametimes go down by 3 FPS. Note that you can still overclock the CPU within those power limits. This 9x00K series is another perfect example of chips running way past their efficiency curves to squeeze out those last few percent of pointless and barely noticeable benchmark performance in order to look good in the youtube chart slides, just like Vega, Pascal and even Ryzen to a lesser degree. I can only assume the reason Intel doesn't lock things down like Nvidia (very limited power and voltage limits that board partners can't modify) is that they're scared of Zen 2. some sources: application and gaming performance with the 95W turbo limit: https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/3/#abschnitt_benchmarks_mit_95wattpowertarget more slides with applications and games and the 95W turbo limit: https://www.golem.de/news/core-i9-9900k-im-test-acht-verloetete-5-ghz-kerne-sind-extrem-1810-136974-5.html power efficiency (last image): https://www.techpowerup.com/reviews/Intel/Core_i9_9900K/16.html eames fucked around with this message at 11:43 on Oct 22, 2018 |
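Quick back-of-the-envelope on those numbers, just to make the efficiency point concrete. The performance and power figures below are the post's claims (≈13% slower at 95W vs ≈200W unconstrained), not measurements of mine:

```python
# Perf-per-watt at stock unlimited turbo vs. an enforced 95 W limit,
# using the rough figures claimed above. Illustrative only.

def perf_per_watt(relative_perf, watts):
    """Relative performance divided by package power."""
    return relative_perf / watts

unlimited = perf_per_watt(1.00, 200.0)  # stock turbo, ~200 W+
limited = perf_per_watt(0.87, 95.0)     # 95 W PL1, ~13% perf loss

print(f"efficiency gain at 95 W: {limited / unlimited:.2f}x")  # ~1.83x
```

Roughly 1.8x the perf/watt for giving up a few percent of throughput, which is the whole point.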
# ? Oct 22, 2018 07:58 |
|
Yeah it does seem that if you just leave it stock it will be fine. OC'ing this thing is just rolling coal but for a computer.
|
# ? Oct 22, 2018 08:28 |
|
Xerophyte posted:I don't mean that it's a yield improvement because they can disable any shorted or otherwise nonfunctional execution schedulers, I certainly expect that would mean disabling the physical core. I mean that they're setting fixed thermal, clock and power targets and you get a higher yield for those with hyperthreading disabled. If that is the case it's a marketing decision to prioritize thermals, power and base clock over HT on the i7, but the product line specifics are always a marketing decision. Sorry for the misunderstanding, I took your post as being about defects. I agree there is binning going on to make the 9900K. In modern times they're often binning on power efficiency instead of or in addition to Fmax; the die with the lowest leakage and/or best Hz/volt can hit higher clocks without exceeding TDP. For that reason I agree that the i7 HT decision could've been about prioritizing clocks over HT; it's plausible that yield distributions required disabling HT to yield enough high-frequency parts that would stay inside the 95W TDP. Vanagoon posted:I'm curious as to what happens to completely bad dies, do they shred them up before they dispose of them to make it harder for anyone to study them and steal info from/about them, or is there some process to reclaim valuable elements from them (like the copper wiring, I imagine after a while they would be throwing away a lot of copper that might be worth saving). On the last point, there's not enough metal to be worth recycling. Most defect testing happens on the wafer or just after dicing, prior to packaging, and there are several orders of magnitude more copper in the heatspreader slug that's part of a finished, packaged CPU. By mass, I'd expect the die is >99.9% silicon and silicon dioxide. One consequence of there being so little mass of any material that's not highly refined sand is that they have no shame about using super rare metals.
For example, Intel used hafnium as a gate material at their 45nm node (and 32nm too I think). I wouldn't be surprised if most of Intel's scrap gets shredded, but not to stop people from peeking at their circuits. There are commercial outfits which provide this kind of analysis as a service, and they just buy retail samples AFAIK. Avoids legal issues, and it's cheap compared to the cost of doing the work. Intel might care about trying to prevent it from happening prior to product launch, but once chips are being sold to the public there's absolutely no way to stop it. The more likely reason to grind up scrap: Because it's super cheap and leaves zero possibility of any of it becoming counterfeit CPUs through black markets.
|
# ? Oct 22, 2018 09:01 |
|
eames posted:Hyperthreading's benefits has always been somewhat dubious for games, some benefit from it, some are negatively impacted. The move to 8 physical cores likely makes it even less useful. My personal experience is that systems with HT enabled feel far more responsive under load, so given the option I'd always leave HT on. But anyway, this is LTT we are talking about. Llamadeus posted:Extremely let down by youtube channel Linus Tech Tips, exemplars of scientific rigour. Combat Pretzel fucked around with this message at 11:55 on Oct 22, 2018 |
# ? Oct 22, 2018 11:53 |
|
Linus tested on a 4 phase mobo that limited the CPU TDP and throttled it. Drop 9900K into a proper motherboard and it'll draw 150 watts without touching any overclocking settings. Hilariously enough both numbers are considered "in spec" because base freq is only 3.6 GHz. https://www.youtube.com/watch?v=NGHiRrQ2AAo
|
# ? Oct 22, 2018 13:36 |
|
forbidden dialectics posted:Insha'allah, but I'm contemplating delidding and running the 9900k bare die. Removing the solder looks like a loving nightmare. Steve from gamersnexus was doing it to the bare die with a blade snapped off a box cutter, I kept expecting him to ruin the chip.
|
# ? Oct 22, 2018 14:02 |
|
https://semiaccurate.com/2018/10/22/intel-kills-off-the-10nm-process/ big if true
|
# ? Oct 22, 2018 14:14 |
|
repiv posted:big if true At least compared to everyone else on <14nm
|
# ? Oct 22, 2018 14:59 |
|
Can anyone explain how a substantially smaller company like AMD can beat Intel to a better process node despite the massive financial war chest Intel has compared to AMD? I know AMD is a fabless chip company but even still I’d been of the belief that cash wins the war when it comes to making chips. I know back in the Athlon days that AMD did real well with a superior architecture (I built almost nothing but AMD boxes for like 6 years until the Core series in 2008) but that doesn’t have anything to do with manufacturing. Is TSMC just better at fabs than Intel via magic secret sauce engineers or something?
|
# ? Oct 22, 2018 15:02 |
|
For one, Intel ignored the findings in the reports they commissioned about the state of their fab capability. Having Jim Keller help you out probably doesn't hurt either.
|
# ? Oct 22, 2018 15:05 |
|
It's intel vs all the other fabs. Intel can't really choose to make their chips anywhere else so they end up putting everything in one basket. Intel loving up where at least one of their competitors does not has always been a possibility and it finally happened. AMD just has the freedom to choose the winner and fab their chips there.
|
# ? Oct 22, 2018 15:05 |
|
necrobobsledder posted:Can anyone explain how a substantially smaller company like AMD can beat Intel to a better process node despite the massive financial war chest Intel has compared to AMD? necrobobsledder posted:I’d been of the belief that cash wins the war when it comes to making chips
|
# ? Oct 22, 2018 15:14 |
|
Also tsmc is ahead because they get that apple money, which makes up 75% of their business. If intel had taken apple's offer to make iphone chips they would probably be ahead now.
|
# ? Oct 22, 2018 15:15 |
|
Perplx posted:Also tsmc is ahead because they get that apple money, which makes 75% of their business. If intel had taken apple's offer to make iphone chips they would probably be ahead now. Oh, I did not know they made chips for Apple. Any danger of Apple buying them out?
|
# ? Oct 22, 2018 15:26 |
|
Sininu posted:Oh, I did not know they made chips for Apple. Any danger of Apple buying them out?
|
# ? Oct 22, 2018 15:28 |
|
The GloFo contract running out was also a millstone off the neck of AMD. Sininu posted:Oh, I did not know they made chips for Apple. Any danger of Apple buying them out?
|
# ? Oct 22, 2018 15:38 |
|
I ordered a 9900k, hope I didn't make a huge mistake.
|
# ? Oct 22, 2018 16:06 |
|
ufarn posted:The GloFo contract running out was also a millstone off the neck of AMD. Hasn't run out yet, GloFo just gave up on making high performance nodes and they agreed to "renegotiate" the agreement.
|
# ? Oct 22, 2018 16:24 |
|
The news isn't entirely unexpected. I do wonder if Intel has the guts to release another incompatible chipset (or even socket) for 14nm++++++ before 7nm. Chances are that they'll crank CFL up to 12 cores on the ring bus but with enforced power limits to keep it from melting the copper IHS. eames fucked around with this message at 16:33 on Oct 22, 2018 |
# ? Oct 22, 2018 16:27 |
|
Semiaccurate's last big Intel scoop was that Apple dropped Intel as a modem supplier for the new iPhone. Before that it was Apple dropping Intel chips in their laptops.
Winks fucked around with this message at 16:42 on Oct 22, 2018 |
# ? Oct 22, 2018 16:39 |
|
demerjian'd again https://twitter.com/intelnews/status/1054397715071651841
|
# ? Oct 22, 2018 16:45 |
|
TheCoach posted:Linus tested on a 4 phase mobo that limited the CPU TDP and throttled it. Drop 9900K into a proper motherboard and it'll draw 150 watts without touching any overclocking settings. He tested on an Asus Maximus XI Hero, right? That's a doubled 4-phase, which is practically as good as an 8-phase. Also, each of those power stages can handle 100A, which is about twice as much as the stages on other boards. Not saying he didn't manually set a power limit, or that there aren't BIOS shenanigans, but a Maximus Hero should be capable of pushing much, much more than 95W. Does not pass the smell test.
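For what it's worth, the stage math backs this up. A toy calculation (the ~1.25 V Vcore and perfectly even current sharing are assumptions, not measurements):

```python
# Per-stage current on an 8-stage (doubled 4-phase) VRM feeding a
# 9900K at full turbo. Assumes ~1.25 V Vcore and even current sharing.

def per_stage_current(cpu_watts, vcore, stages):
    """Amps each power stage delivers, assuming even sharing."""
    return (cpu_watts / vcore) / stages

amps = per_stage_current(200.0, 1.25, 8)  # ~200 W package power
print(f"{amps:.0f} A per stage")  # 20 A per stage
```

20 A through a 100 A-rated stage is nowhere near a limit, hence the smell test failing.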
|
# ? Oct 22, 2018 16:51 |
|
repiv posted:demerjian'd again quote:For several years now SemiAccurate has been saying that the 10nm process as proposed by Intel would never be financially viable Pad everything with weasel words and you're never wrong!
|
# ? Oct 22, 2018 16:53 |
|
repiv posted:demerjian'd again This denial reeks of desperation
|
# ? Oct 22, 2018 16:58 |
|
Winks posted:Semiaccurate's last big Intel scoop was that Apple dropped Intel as a modem supplier for the new iPhone. Before that it was Apple dropping Intel chips in their laptops. On the eve of the first Optane drives launching, he also went on some big rant about how it was HORRIBLY BROKEN and Intel was LYING in their benchmarks and endurance was the EXACT SAME as NAND, because he had no understanding of what metadata was vs. traditional over-provisioning. Then the drives launched, no reviewer could be bothered to substantiate any of his claims even if they cared to, and he shut up and moved on to a different topic. I think he was even ranting about how some on-site demo/news release day was full of SUSPICIOUS and UNDERHANDED tactics, all because he wasn't invited, therefore Intel must be hiding something. Then he received an invite and shut up about it. WhyteRyce fucked around with this message at 17:08 on Oct 22, 2018 |
# ? Oct 22, 2018 17:05 |
|
I mean, I assume Intel did shitcan the “initially proposed” 10nm like three years ago.
|
# ? Oct 22, 2018 17:12 |
|
Arzachel posted:Pad everything with weasel words and you're never wrong! if Demerjian is saying that Intel loosened up some of their design rules to get 10nm out the door... didn't that one come out like a couple months ago? It looks like semiaccurate covered it weeks ago, at a minimum. Sounds like he's going farther here and saying that Intel is skipping 10nm entirely which... is an idea. Doesn't sound super plausible but. WhyteRyce posted:On the even of the first Optane drives launched, he also went on some big rant about how it was HORRIBLY BROKEN and Intel was LYING in their benchmarks and endurance was the EXACT SAME as NAND because he had no understanding of what metadata was vs. traditional over-provisioning. Then the drives launched, no reviewers could bother to substantiate any of his claims even if they cared to and he shut up and moved on to a different topic Charlie has predicted 7 of the last 3 disasters for both Intel and NVIDIA. He's an old-school ATI dude (I think?) who just absolutely loathes them and to steal a phrase from someone here, you can hear his hateboner ripping through his pants with every breathless post about the imminent disasters that are going to befall them. Paul MaudDib fucked around with this message at 17:30 on Oct 22, 2018 |
# ? Oct 22, 2018 17:16 |
|
I have an Asus Z390-H Strix; is it going to get normal performance from a 9900K or should I be exchanging it for a ~Maximus~? I don’t understand if the power rail crap people are talking about is for OCing or just stock.
|
# ? Oct 22, 2018 17:57 |
|
Can anyone explain why Intel would just throw the entire process out? I have been part of enough computer screwups to believe that they might have to roll back six months worth of hacks and try something new to get yields up, or whatever, but to get to a point where they would need a clean sheet of paper seems really odd.
|
# ? Oct 22, 2018 18:48 |
|
Intel will undoubtedly use the process to make something that will work with whatever its flaws are. It might not be CPUs but they will do whatever they can to salvage it.
|
# ? Oct 22, 2018 18:58 |
|
mewse posted:Removing the solder looks like a loving nightmare. Steve from gamersnexus was doing it to the bare die with a blade snapped off a box cutter, I kept expecting him to ruin the chip. Yeah I'm gonna whip out the sandpaper like der8auer, looks like the smallest chance of slicing my fingers off. what is wrong with me
|
# ? Oct 22, 2018 19:19 |
|
They've had a bottom end 10 nm CPU on the market since May, the i3-8121U.
|
# ? Oct 22, 2018 19:27 |
|
One of my local webstores is selling a new-in-box combo of an Asus X299-Deluxe and an i7-7740X for 745€; how does it compare to a Z390 + i9-9900K combo (which is 900€ new)?
|
# ? Oct 22, 2018 19:32 |
|
Very badly; that's the phased-out 4-core X299 model that gets beaten by regular 4-core CPUs. There was no good reason for it to exist.
|
# ? Oct 22, 2018 19:36 |
|
Winks posted:They've had a bottom end 10 nm CPU on the market since May, the i3-8121U. yeah but that's worse in every respect than the relevant 14nm chip and seems to have been shoved out for no other purpose than to claim that it's in production
|
# ? Oct 22, 2018 19:53 |
|
Hed posted:I have an Asus Z390-H Strix is it going to get normal performance from a 9900K or should I be exchanging it to a ~Maximus~. I don’t understand if the power rail crap people are talking about is for OCing or just stock. The Strix models can handle roughly 200W TDP while the ~Maximus~ can handle up to 250W. That just means you can push higher voltages, i.e. higher clock speeds, with more confidence on a Maximus.
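To put a rough number on what the extra board capacity buys you, here's a toy estimate using the common rule of thumb that dynamic power scales with f·V² and that Vcore rises roughly linearly with frequency in the OC range (so P ~ f³). This is a sketch under those assumptions, not a guarantee your silicon clocks that high:

```python
# Approximate clock headroom from 250 W vs. 200 W of board capacity,
# assuming dynamic power ~ f * V^2 and Vcore ~ linear in f, so P ~ f^3.
# Real chips hit Fmax walls long before this math does.

def freq_headroom(power_ratio):
    """Approximate frequency scaling factor for a power budget ratio."""
    return power_ratio ** (1 / 3)

gain = freq_headroom(250 / 200)
print(f"~{(gain - 1) * 100:.0f}% more clock headroom")  # ~8%
```

So the Maximus buys maybe a few hundred MHz of theoretical headroom over the Strix; at stock it makes no difference.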
|
# ? Oct 22, 2018 20:41 |
|
evilweasel posted:yeah but that's worse in every respect than the relevant 14nm chip and seems to have been shoved out for no other purpose than to claim that it's in production Also to recoup a tiny tiny bit of cash back from the node, and to help debug the many and varied issues with the process.
|
# ? Oct 22, 2018 20:43 |
|
It can produce CPUs. Can it produce good CPUs? We'll see.
|
# ? Oct 22, 2018 20:46 |