hobbesmaster
Jan 28, 2008

suck my woke dick posted:

"don't you think it's a bit risky betting the entire business on intel getting 10nm out roughly in time"

"nah it's not like they're amd lol"

What's curious is that they're not concerned about Intel 5G chipsets in general. https://newsroom.intel.com/news/intel-introduces-portfolio-new-commercial-5g-new-radio-modem-family/

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
5G is gonna be a tough sell anyway if carriers are still gonna be nickel diming with stingy data caps.

wargames
Mar 16, 2008

official yospos cat censor

Palladium posted:

5G is gonna be a tough sell anyway if carriers are still gonna be nickel diming with stingy data caps.

you don't like the 29 different definitions of Unlimited* data**

* unlimited to 2gb

**only on content and platforms we own, we will charge you if you dare to visit any non-approved sites

evilweasel
Aug 24, 2002

hobbesmaster posted:

What's curious is that they're not concerned about Intel 5G chipsets in general. https://newsroom.intel.com/news/intel-introduces-portfolio-new-commercial-5g-new-radio-modem-family/

quote:

Intel also announced it has successfully completed a full end-to-end 5G call based on its early 5G silicon

that is, uh, not a statement that fills me to bursting with confidence "we managed to successfully complete one telephone call without the modem breaking"

hobbesmaster
Jan 28, 2008

evilweasel posted:

that is, uh, not a statement that fills me to bursting with confidence "we managed to successfully complete one telephone call without the modem breaking"

It's not an uncommon thing to see in cell phone industry publications. It means that they have working phones and base stations for the new technology, so it's ready for pilot deployments.

The real question is what Intel's yield is...

wargames
Mar 16, 2008

official yospos cat censor

hobbesmaster posted:

It's not an uncommon thing to see in cell phone industry publications. It means that they have working phones and base stations for the new technology, so it's ready for pilot deployments.

The real question is what Intel's yield is...

these things require silicon, right? So they're probably using their 14nm+++++++++++ node, so it might not be terrible.

hobbesmaster
Jan 28, 2008

wargames posted:

these things require silicon, right? So they're probably using their 14nm+++++++++++ node, so it might not be terrible.

Yeah current LTE-A modems use 14nm: https://www.intel.com/content/www/us/en/wireless-products/mobile-communications/xmm-7560-brief.html

Cellular modems usually use older nodes so I'm not sure why delays in 10nm screw them over. Unless it was designed for 14nm equipment that is not going to be available because the 10nm stuff isn't online?

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

hobbesmaster posted:

Yeah current LTE-A modems use 14nm: https://www.intel.com/content/www/us/en/wireless-products/mobile-communications/xmm-7560-brief.html

Cellular modems usually use older nodes so I'm not sure why delays in 10nm screw them over. Unless it was designed for 14nm equipment that is not going to be available because the 10nm stuff isn't online?

5g stuff needs 10nm or better because of the hilarious power requirements needed to drive the system. A 5g SOC is like twice as power hungry as current SOCs, and almost all of it is in the actual modem. Same with the base station stuff, a LOT of the gear is so close to cutting edge they desperately need the extra 40% power savings in order to avoid having the telco racks sound like the mid 90s 1U pizza box servers.

Knifegrab
Jul 30, 2014

Gadzooks! I'm terrified of this little child who is going to stab me with a knife. I must wrest the knife away from his control and therefore gain the upperhand.

I am going to be upgrading my motherboard and CPU; my GFX and other things are still pretty solid, so this is only a partial upgrade. So I have two questions:

1) Are we on the precipice of any big technology releases that I should just wait for in terms of intel processors?

2) If not, what is the best high performance CPU that is not crazy insane but good for intense gaming? I am willing to break the bank within reason.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

evilweasel posted:

that is, uh, not a statement that fills me to bursting with confidence "we managed to successfully complete one telephone call without the modem breaking"

I realize why you're cynical but you're not thinking about this the right way. If you're on the engineering team for a chip, after you get first silicon, you go into a multi-month frenzy of "bringup" - trying out each function, verifying it works, debugging problems. You're also co-debugging and developing drivers at the same time. The goal is to identify, as fast as possible, a comprehensive list of chip design bugs with no software work-around, and how serious they are (can they be fixed by changing just metal layers, or do you need a full layer spin).

Time-to-market is critical. Bugfixes that need silicon changes inherently take a long time, and then you have to re-verify everything after the revised silicon comes back from fab, and hope that you don't need an additional spin. Everyone puts in long hours during bringup. I can speak from experience (not at Intel, for the record): the first time you get a complex function like a 5G phone call up and running on an early silicon revision, everybody's pretty loving happy, and it's a milestone they're proud to make a press release about. It's the culmination of a few years of work, the glory of "Hey, we made a thing that works!" (that feel is what any good engineer is addicted to).

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

BobHoward posted:

(can they be fixed by changing just metal layers, or do you need a full layer spin).

Modern multipatterning 14++++ is like what, a 4-6 month lead from 'welllll FUCKKK' through to new silicon in hand?

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Knifegrab posted:

I am going to be upgrading my motherboard and CPU; my GFX and other things are still pretty solid, so this is only a partial upgrade. So I have two questions:

1) Are we on the precipice of any big technology releases that I should just wait for in terms of intel processors?

2) If not, what is the best high performance CPU that is not crazy insane but good for intense gaming? I am willing to break the bank within reason.

We're hitting some pretty hard limits with CPU gains, now is probably as good of time as any to buy in. AMD and Intel are pretty competitive right now, with the AMD side being able to get 2 more cores and hyper threading for less money at the trade off of somewhat slower single-threaded performance. The Ryzens also have a weird quirk where the inter-core communications are tied to your ram speed so you'll want to spend a bit more there to make sure you can do something spec'd for DDR4-3000+. Personally, I am going to stretch my 2500k a little further until the Ryzen 2s are out but I'm picky.

Ryzen 5 2600X and i5 8500 around the $200 range, i5 8600k or Ryzen 7 1800X close to the $250 range.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Knifegrab posted:

I am going to be upgrading my motherboard and CPU; my GFX and other things are still pretty solid, so this is only a partial upgrade. So I have two questions:

1) Are we on the precipice of any big technology releases that I should just wait for in terms of intel processors?

2) If not, what is the best high performance CPU that is not crazy insane but good for intense gaming? I am willing to break the bank within reason.

Word is that Intel has 8-core mainstream chips (I.e. using the more affordable ‘Z’ platform) coming very soon, though nothing has been officially announced. They’re supposedly coming around fall so I’d at least wait til then unless you need a system now.

The current top dog for gaming is Intel’s 8700K, which has 6 cores and 12 threads (versus the rumored Intel chips mentioned above which would have 8 cores and 16 threads). Currently runs around $350, plus $150ish for a motherboard. Also be ready to get gouged on DDR4 RAM.

AMD’s 2600X and 2700X are also very good, though for purely gaming they’re slightly behind the 8700K.

hobbesmaster
Jan 28, 2008

Methylethylaldehyde posted:

5g stuff needs 10nm or better because of the hilarious power requirements needed to drive the system. A 5g SOC is like twice as power hungry as current SOCs, and almost all of it is in the actual modem. Same with the base station stuff, a LOT of the gear is so close to cutting edge they desperately need the extra 40% power savings in order to avoid having the telco racks sound like the mid 90s 1U pizza box servers.

It's infrastructure stuff, so a mid-90s 1U pizza box server should be ok, right :v:

I didn't realize the power density was that much higher for 5G, but I work with the slower side of the cell industry.

wargames
Mar 16, 2008

official yospos cat censor

Space Racist posted:

Word is that Intel has 8-core mainstream chips (I.e. using the more affordable ‘Z’ platform) coming very soon, though nothing has been officially announced. They’re supposedly coming around fall so I’d at least wait til then unless you need a system now.

The current top dog for gaming is Intel’s 8700K, which has 6 cores and 12 threads (versus the rumored Intel chips mentioned above which would have 8 cores and 16 threads). Currently runs around $350, plus $150ish for a motherboard. Also be ready to get gouged on DDR4 RAM.

AMD’s 2600X and 2700X are also very good, though for purely gaming they’re slightly behind the 8700K.

even then AMD isn't that far behind in games from what i've seen, but if you are doing 144hz gaming you go intel.

eames
May 9, 2009

hobbesmaster posted:

It's not an uncommon thing to see in cell phone industry publications. It means that they have working phones and base stations for the new technology, so it's ready for pilot deployments.

The real question is what Intel's yield is...

:thunk:

calcalistech.com posted:

Intel will not provide 5G modems for Apple’s 2020 mobile devices, according to internal company communications reviewed by Calcalist, and people familiar with the matter. Apple has notified Intel it would not use a mobile modem developed by the chipmaker in its next-generation mobile device, Intel executives said in the communications. Further development of the modem component internally called “Sunny Peak” has been halted and the Intel team that's working on the product will be redirected to other efforts, the executives said.
...
Sunny Peak is a 5G, Wi-Fi and Bluetooth component designed by Intel for Apple’s future phone.
source: https://www.calcalistech.com/ctech/articles/0,7340,L-3741719,00.html

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

lol remember when LTE first came out and everyone gave apple huge amounts of poo poo for not adopting it right away and all the phones that did had such power-hungry poo poo first gen radios that you had to turn it off anyway or you'd get like 4hr standby on battery. First gen wireless chipsets are a thing to be avoided.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

Methylethylaldehyde posted:

Modern multipatterning 14++++ is like what, a 4-6 month lead from 'welllll FUCKKK' through to new silicon in hand?

I have no idea, but can tell you that the total delay was on the order of months on a foundry 45nm process. The common trick to help out with OH gently caress moments is to make your first wafer order larger than it needs to be and instruct the foundry to hold some wafers aside without doing the upper (or all) metal layers. When you come back to them with a metal-only spin, you’ll have some wafers ready which only need metal process steps to complete.

This is one of the reasons why you hope that if you have fixes to make, they can be done entirely in metal. Gets you back on the path towards shipping parts for revenue much sooner. Metal masks are also less expensive, especially the higher you go in the stack. For this reason, another standard mitigation is to pepper your chip with spare gates - flip flops and combinatorial logic gates in the base layers, with vias to bring their inputs and outputs to low metal layers, but no actual circuit connections. This lets you do much more sophisticated fixes with metal changes: if there’s some spare gates close to where a designer hosed up, you can patch in new logic without needing a base layer change to create the gates.
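
To make the spare-gate trick concrete, here's a toy Python sketch of the idea (purely illustrative; every name in it is made up and it has nothing to do with any real EDA flow). The base layers are a fixed bag of gates, the metal layers are editable wiring, and a fix stays metal-only as long as every gate it needs, spares included, is already sitting on the die.

code:
# Toy model of metal-only vs. base-layer ECOs (illustrative only).
# Base layers: the set of gates physically on the die, fixed once wafers are started.
# Metal layers: the wiring between gate pins, which can still be changed with new metal masks.
from dataclasses import dataclass, field

@dataclass
class Die:
    gates: set                               # gates present in the base layers, unused spares included
    wires: set = field(default_factory=set)  # (driver_pin, load_pin) connections made in metal

    def eco(self, new_gates, new_wires):
        """Classify a proposed fix: metal-only if every gate it needs already exists on the die."""
        missing = set(new_gates) - self.gates
        if missing:
            return f"full base-layer spin needed (missing gates: {sorted(missing)})"
        self.wires |= set(new_wires)         # re-route in metal only
        return "metal-only spin: pull the held wafers, finish just the metal steps"

# A0 die: the real logic plus a sprinkling of unused spares (all names hypothetical).
die = Die(gates={"u_alu", "u_fifo", "spare_ff_12", "spare_nand_7"})

# A bug that can be patched with a nearby spare flop -> cheap metal-only ECO.
print(die.eco(new_gates={"spare_ff_12"},
              new_wires={("u_alu.out", "spare_ff_12.d"), ("spare_ff_12.q", "u_fifo.in")}))

# A bug that needs logic nobody placed -> expensive full-layer respin.
print(die.eco(new_gates={"new_multiplier"}, new_wires=set()))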

Hed
Mar 31, 2004

Fun Shoe

BangersInMyKnickers posted:

lol remember when LTE first came out and everyone gave apple huge amounts of poo poo for not adopting it right away and all the phones that did had such power-hungry poo poo first gen radios that you had to turn it off anyway or you'd get like 4hr standby on battery. First gen wireless chipsets are a thing to be avoided.

My wife’s first and only Android was that Verizon HTC 4G that had stats like this. 🚮

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

BobHoward posted:

I have no idea, but can tell you that the total delay was on the order of months on a foundry 45nm process. The common trick to help out with OH gently caress moments is to make your first wafer order larger than it needs to be and instruct the foundry to hold some wafers aside without doing the upper (or all) metal layers. When you come back to them with a metal-only spin, you’ll have some wafers ready which only need metal process steps to complete.

This is one of the reasons why you hope that if you have fixes to make, they can be done entirely in metal. Gets you back on the path towards shipping parts for revenue much sooner. Metal masks are also less expensive, especially the higher you go in the stack. For this reason, another standard mitigation is to pepper your chip with spare gates - flip flops and combinatorial logic gates in the base layers, with vias to bring their inputs and outputs to low metal layers, but no actual circuit connections. This lets you do much more sophisticated fixes with metal changes: if there’s some spare gates close to where a designer hosed up, you can patch in new logic without needing a base layer change to create the gates.

That's super clever for an early run. As long as you have die space, adding little FPGA-style modular bits could save your rear end if something bad happened.

EIDE Van Hagar
Dec 8, 2000

Beep Boop

Methylethylaldehyde posted:

That's super clever for an early run. As long as you have die space, adding little FPGA-style modular bits could save your rear end if something bad happened.

It's not an FPGA, in that it is not programmable in the field. It's just extra logic gates that aren't hooked up to metal routing at all unless you make new metal layer masks. It just means you don't have to do place and route for the base layers (FEOL) again; you still have to make a new copy of the chip with the new metal layers.

The ECO process at Intel works exactly like that, except there's no need to order extra wafers from the foundry in advance because you are the foundry. Just place a few lots on hold after FEOL.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull
^^^ that, and I’d also like to clarify that spares are forever. They don’t go away when mass production starts. The reason is that everyone would love to ship version A0 silicon. “A0” is a quasi-standard two-part stepping or version code, where A-Z is a version for the base layers and 0-9 refers to the metal stack (and when you do a base layer spin, the metal layer version resets to 0). If any A series version of a chip is good enough, it ships, and by definition the spares are still there because they’re all present in the A version base layer masks.

But even if you have to do a B0 or C0 or beyond, you’re still going to leave the spares in, for two reasons. One is that you might find a C0 bug that requires use of spares to make a C1. The other is just that past A0 you’re trying to do only ECOs (engineering change orders), which are very narrowly targeted fixes done at the lowest possible level of abstraction. Most of the chip’s physical design won’t be touched, to save the expense, time, and risk of redoing it, and that includes unused spares. Any situation so bad it calls for a full redo of place and route probably means management is going to cancel the whole project as a failure that will never make its market window. So, the spares stay in. They’re not doing anything, they’re just... there.
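
If it helps, that two-part code behaves like a trivial state machine. Here's a toy sketch of my reading of the convention described above (not any vendor's actual versioning scheme): a base-layer spin advances the letter and resets the digit, a metal-only spin just bumps the digit.

code:
# Toy stepping tracker: letter = base-layer mask version, digit = metal-only revision on top of it.
class Stepping:
    def __init__(self):
        self.base = "A"    # base-layer (FEOL) mask set version
        self.metal = 0     # metal-stack revision

    def metal_spin(self):
        """ECO done entirely in metal masks: only the digit moves."""
        self.metal += 1
        return str(self)

    def base_spin(self):
        """Fix that touches the base layers: next letter, metal revision resets to 0."""
        self.base = chr(ord(self.base) + 1)
        self.metal = 0
        return str(self)

    def __str__(self):
        return f"{self.base}{self.metal}"

s = Stepping()
print(s)                # A0 -- what everyone hopes to ship
print(s.metal_spin())   # A1 -- metal-only fix
print(s.base_spin())    # B0 -- base respin, metal revision resets
print(s.metal_spin())   # B1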

JawnV6
Jul 4, 2004

So hot ...

BobHoward posted:

This lets you do much more sophisticated fixes with metal changes: if there’s some spare gates close to where a designer hosed up, you can patch in new logic without needing a base layer change to create the gates.
There's another usage, but I wouldn't be surprised if it's fallen out of favor recently. Not so much a lie as a FIB of omission...

Methylethylaldehyde posted:

That's super clever for an early run. As long as you have die space, adding little FPGA-style modular bits could save your rear end if something bad happened.
They don't get "taken out"; chips are validated as a whole and you couldn't go back and take them out. "Spare gates," "bonus gates" - they're standard practice in the industry, and some combination of those terms should dredge up papers on the topic. Plenty of reasonable research on "where" to put them, but as I understand it, it's still more art than science.

GRINDCORE MEGGIDO
Feb 28, 1985


It's fascinating, and something that seems super obvious to me. But only in hindsight.

GRINDCORE MEGGIDO fucked around with this message at 03:38 on Jul 6, 2018

EIDE Van Hagar
Dec 8, 2000

Beep Boop

Also, most design rules these days just don't allow for blank space at all.

Having a large area of just open silicon during etch steps can affect the etch rate of the areas around it.

Also, a lot of photo masks depend on repeating features because they use diffraction from nearby features to accurately produce a feature. You might need to throw in some “dummy” transistors anyway just to make the photo pattern for your real transistors work out, so you might as well just make some spare real transistors if you can.
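
As a rough illustration of the density-rule point (a toy check with made-up numbers, nothing like a real foundry DRC deck): split the layout into windows, require each window to carry some minimum feature density, and flag the empty ones as places where dummy fill or spare devices have to go.

code:
# Toy pattern-density check: each layout window must carry a minimum fraction of real
# features, or it gets dummy fill / spare devices to keep etch and litho behaviour uniform.
def windows_needing_fill(density_grid, min_density=0.25):
    """Return (row, col) of every window whose feature density is below the minimum."""
    return [(r, c)
            for r, row in enumerate(density_grid)
            for c, d in enumerate(row)
            if d < min_density]

# Fraction of each window covered by real features (entirely made-up numbers).
layout = [
    [0.40, 0.35, 0.05],   # top-right window is nearly empty
    [0.30, 0.00, 0.50],   # middle of the bottom row has nothing at all
]

for r, c in windows_needing_fill(layout):
    print(f"window ({r},{c}) is under the density floor -> drop in dummy/spare shapes")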

Koramei
Nov 11, 2011

I have three regrets
The first is to be born in Joseon.

Space Racist posted:

Word is that Intel has 8-core mainstream chips (I.e. using the more affordable ‘Z’ platform) coming very soon, though nothing has been officially announced. They’re supposedly coming around fall so I’d at least wait til then unless you need a system now.

These are gonna be more expensive than the 6 core ones though aren't they, when they launch? Are they likely to bring down the 6 core's price?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Koramei posted:

These are gonna be more expensive than the 6 core ones though aren't they, when they launch? Are they likely to bring down the 6 core's price?

Nobody knows how Intel will play it. They could ditch the 6C6Ts and make 6C12T the i5s while the 8C16T become i7s, they could make the i5s be 8C8T, or they could just introduce the 8C16T over the top of the stack as i9s.

The 6Cs will probably come down modestly but the 8Cs are probably going to be more expensive. Intel tends not to alter its price points (instead, they introduce new hardware at the same price point) which kinda suggests they'll go with the i9 option to me.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

C.H.O.M.E. posted:

It's not an FPGA, in that it is not programmable in the field. It's just extra logic gates that aren't hooked up to metal routing at all unless you make new metal layer masks.

Yeah, it's not programmable like an FPGA, but all the base FPGA building blocks, the adders and pipeline bits are there in silicon, just not attached to anything until the engineer figures out oh poo poo, and needs to delay one line of the serial bus by .9 ns to get it to work, so they route in a single stage delay to get it back in line.

PC LOAD LETTER
May 23, 2005
WTF?!

PCjr sidecar posted:

Intel is a fab company that also designs chips and has some small (billion dollar) side businesses, so it would be shocking in a way the AMD or IBM sell-off wasn't.

I think the only way they decide to go fabless is if their 7nm process is screwed up as bad or nearly as bad as their 10nm process is supposed to be.

I don't think anyone really doubts they can tough out being stuck on 14nm++++++ for a couple of years or so but if they're going into 2022 and they're still stuck on it with no fix in sight I think even they realize they'll be totally boned since by then TSMC will probably at least be on 5nm+ (or whatever they call their update of it) even with some delays.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~

Paul MaudDib posted:

Nobody knows how Intel will play it. They could ditch the 6C6Ts and make 6C12T the i5s while the 8C16T become i7s, they could make the i5s be 8C8T, or they could just introduce the 8C16T over the top of the stack as i9s.

The 6Cs will probably come down modestly but the 8Cs are probably going to be more expensive. Intel tends not to alter its price points (instead, they introduce new hardware at the same price point) which kinda suggests they'll go with the i9 option to me.

Yeah, I'm guessing the existing price structure will be maintained and the 8C/16T chips will be slotted in a tier above (and, like you said, probably branded as an i9 chip).

The 7820X has been going for ~$465 since March or so, I'm guessing the mainstream 8C chip will land around there too.

redeyes
Sep 14, 2002

by Fluffdaddy
I got an i7 8700k and overclocked a bit. Put it at 5ghz with a 120mv offset and no other offsets, AVX at 5ghz. I figured I would need watercooling or something beefy but actually it runs kinda cool. Idle at 35c to 40c. AVX maxed workload at around 86c. I can't tell, but is this a silicon lottery type chip? Seems pretty cool running.

Otakufag
Aug 23, 2004
Just give me a 8 core 8 thread cpu for gaming till 2022, gently caress that gay hyperthreading crap.

eames
May 9, 2009

redeyes posted:

I got an i7 8700k and overclocked a bit. Put it at 5ghz with a 120mv offset and no other offsets, AVX at 5ghz. I figured I would need watercooling or something beefy but actually it runs kinda cool. Idle at 35c to 40c. AVX maxed workload at around 86c. I can't tell, but is this a silicon lottery type chip? Seems pretty cool running.

120mv offset means nothing because the SVID is individual to your particular chip's quality. There are also other factors like the LLC setting. It could be running at 1.3V Vcore (which would be decent) or 1.45V (which would be bad and damage your chip at those temperatures).

HWInfo will display your actual Vcore (not to be confused with SVID) and it is best checked under load.

redeyes
Sep 14, 2002

by Fluffdaddy

eames posted:

120mv offset means nothing because the SVID is individual to your particular chip's quality. There are also other factors like the LLC setting. It could be running at 1.3V Vcore (which would be decent) or 1.45V (which would be bad and damage your chip at those temperatures).

HWInfo will display your actual Vcore (not to be confused with SVID) and it is best checked under load.

Ah ok, HWInfo reports a min Vcore of 0.384V and a maximum of 1.408V under all-core load. Still runs cool as can be. I only have a Hyper 212 cooler with a 4000rpm PWM fan.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

redeyes posted:

I got an i7 8700k and overclocked a bit. Put it at 5ghz with a 120mv offset and no other offsets, AVX at 5ghz. I figured I would need watercooling or something beefy but actually it runs kinda cool. Idle at 35c to 40c. AVX maxed workload at around 86c. I can't tell, but is this a silicon lottery type chip? Seems pretty cool running.

Under AVX load that is really cool. Can you confirm the CPU speed is 5ghz under AVX load? If so then yeah big lottery win.

redeyes
Sep 14, 2002

by Fluffdaddy

Lockback posted:

Under AVX load that is really cool. Can you confirm the CPU speed is 5ghz under AVX load? If so then yeah big lottery win.

Yeah, I may be wrong. I was running the Intel burn in test inside their Intel Extreme tuning utility. I'm having trouble finding if this is actually an AVX workload.

quote:

Unfortunately XTU does not stress AVX instructions too hard. If you use programs that utilize AVX instruction sets, you might be better off using prime95 version 24.9. When using prime95, make sure your voltage mode is set to manual/override.

Huh, I read it does use AVX but then I read it doesn't use them that much? I'll try prime95 instead.

I'll say this, this chip runs a lot cooler than my older 7700k

redeyes fucked around with this message at 18:39 on Jul 6, 2018

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

redeyes posted:

Yeah, I may be wrong. I was running the Intel burn in test inside their Intel Extreme tuning utility. I'm having trouble finding if this is actually an AVX workload.

Huh, I read it does use AVX but then I read it doesn't use them that much? I'll try prime95 instead.

I'll say this, this chip runs a lot cooler than my older 7700k

Check the box it came in, look for the serial number. You're looking for "17" or "18" - the number before or after is usually the week of the year it was made. Maybe you got an older one from an earlier batch before they started Laverne and Shirley-ing them out with lovely QA. It should be listed on the chip as well, but if you're getting those temps, don't *touch* anything.

eames
May 9, 2009

redeyes posted:

Yeah, I may be wrong. I was running the Intel burn in test inside their Intel Extreme tuning utility. I'm having trouble finding if this is actually an AVX workload.

Huh, I read it does use AVX but then I read it doesn't use them that much? I'll try prime95 instead.

I'll say this, this chip runs a lot cooler than my older 7700k

Try Asus realbench instead, prime95 has FMA3 enabled by default and will make your CPU run far hotter than it ever would under "regular" heavy load.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

BIG HEADLINE posted:

Check the box it came in, look for the serial number. You're looking for "17" or "18" - the number before or after is usually the week of the year it was made. Maybe you got an older one from an earlier batch before they started Laverne and Shirley-ing them out with lovely QA. It should be listed on the chip as well, but if you're getting those temps, don't *touch* anything.

85C under load with no AVX is good but not super special. I think he is just not testing AVX, though getting to 5GHz easily certainly is nice. Maybe quality is improving?

eames
May 9, 2009

Lockback posted:

Maybe quality is improving?

historic silicon lottery data suggests that this is indeed the case

https://en.wikichip.org/wiki/intel/core_i7/i7-8700k#Overclockability
