Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Cygni posted:

https://videocardz.com/newz/specs-pricing-and-performance-of-intel-12th-gen-core-65w-desktop-cpus-leaked



That 12400F for $167 is a killer deal. Assuming there are enough B660 DDR4 boards to go around on the 5th, that's da new go-to.

If I'm not mistaken, the 11400F ended up just $10 less than the non-F most of the time. That said, $200 still seems incredibly competitive.

Dr. Video Games 0031
Jul 17, 2004

Also, that's not MSRP. I'd expect at least $180 for the 12400F.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

Dr. Video Games 0031 posted:

Also, that's not MSRP. I'd expect at least $180 for the 12400F.

Yes, but the 11400 (not the F) landed at about the price they put in their Rocket Lake chart.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Has anyone done some serious tests on efficiency at different power levels? From the initial launch it seemed clear it wasn't an issue and they were just juicing the CPUs for maximum benchmark numbers, but it seems like nobody went back and tested performance vs. power.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

mobby_6kl posted:

Has anyone done some serious tests on efficiency at different power levels? From the initial launch it seemed clear it wasn't an issue and they were just juicing the CPUs for maximum benchmark numbers, but it seems like nobody went back and tested performance vs. power.

Put down your pitchforks: Intel’s 12th-gen CPUs aren’t power hogs [PCWorld]

    In the end, you can certainly say that Ryzen can be a more power-efficient CPU under the heaviest of loads than rival Intel 12-gen chips—that’s of no doubt. But as you can see, there’s a lot more to the story. Given Alder Lake’s generally better idle power states, ability to finish tasks faster and the simple fact that the vast majority of games and “heavy” content creation applications such as Photoshop, Premiere, and Lightroom Classic don’t use all of the threads and cores available, it’s dead wrong to declare Alder Lake a huge, power hungry CPU. It clearly isn’t most of the time. And even when it is hammering the power, it’s often much faster than Ryzen.
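
The PCWorld point about finishing tasks faster is really about total energy rather than instantaneous power draw: energy is power multiplied by time, so a hungrier chip that races back to idle can come out ahead. A minimal sketch of that arithmetic, with made-up numbers purely for illustration (not the article's measurements):

code:
def task_energy_wh(avg_power_w: float, duration_s: float) -> float:
    """Energy used for one task, in watt-hours (power x time)."""
    return avg_power_w * duration_s / 3600

# Hypothetical chip A: higher draw, finishes the same task sooner.
chip_a = task_energy_wh(avg_power_w=150, duration_s=60)   # 2.50 Wh
# Hypothetical chip B: lower draw, but takes 50% longer.
chip_b = task_energy_wh(avg_power_w=110, duration_s=90)   # 2.75 Wh

print(f"Chip A: {chip_a:.2f} Wh, Chip B: {chip_b:.2f} Wh")
# The "power hog" uses less total energy here, and returns to idle sooner.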

Begall
Jul 28, 2008

mobby_6kl posted:

Has anyone done some serious tests on efficiency at different power levels? From the initial launch it seemed clear it wasn't an issue and they were just juicing the CPUs for maximum benchmark numbers, but it seems like nobody went back and tested performance vs. power.

I don't have the article to hand, but someone did a lot of testing with various power limits on a 12900K, and if I remember correctly you basically got 98% of the gaming performance at 125W that you did at 250W.
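
Taking those figures at face value (and noting the caveat in the next post that games don't actually pull the full limit), the perf-per-watt arithmetic is striking. A quick sketch using the numbers as recalled above:

code:
perf_250w, perf_125w = 1.00, 0.98   # relative gaming performance
limit_250w, limit_125w = 250, 125   # configured power limits, watts

gain = (perf_125w / limit_125w) / (perf_250w / limit_250w)
print(f"perf/W at 125W vs 250W: {gain:.2f}x")  # ~1.96x, if the limits were actually hit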

Shipon
Nov 7, 2005

Begall posted:

I don't have the article to hand, but someone did a lot of testing with various power limits on a 12900K, and if I remember correctly you basically got 98% of the gaming performance at 125W that you did at 250W.
I haven't seen power usage on the 12900K go above 140-150W even in Flight Sim, so that explains the lack of difference for the most part.

EDIT: Huh, guess it was the 12900KS: 5.5 GHz single-core, 5.2 GHz all-core. Seems like quite a leap for the K-to-KS step to go up 300 MHz in the all-core boost.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Looks like the embargo dropped early.

https://www.youtube.com/watch?v=_FJKCVFOa_c
https://www.youtube.com/watch?v=ydgN4W97Esk

e:
new stock coolers

Rinkles fucked around with this message at 19:25 on Jan 4, 2022

Dr. Video Games 0031
Jul 17, 2004

mobby_6kl posted:

Has anyone done some serious tests on efficiency at different power levels? From the initial launch it seemed clear it wasn't an issue and they were just juicing the CPUs for maximum benchmark numbers, but it seems like nobody went back and tested performance vs. power.

https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-tested-at-various-power-limits/



With games, the results are obvious: the 12900K only pulls 125-140 watts when gaming anyway, though the efficiency at 100W is nice. The CPU test suite (and they ran a LOT of benchmarks/tests) is more interesting to me because these are generally tests that will use the CPU's full power budget if possible. And even then, it keeps 85% of the performance at 50% of the power draw.

The 12900K should've been a 190W chip at stock settings, with a simple BIOS switch to push it to 241W for anyone who wanted that. So I agree, they juiced the hell out of the 12900K just to ensure that it would top the 5900X in every benchmark, even though it makes little sense to actually use the CPU that way in normal usage. That extra couple percent was really important to Intel, I guess; more important than fighting their reputation for producing inefficient, power-hungry chips. It seems this has not panned out well for them, though.

Dr. Video Games 0031 fucked around with this message at 21:12 on Jan 4, 2022
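
For anyone who wants to play with the scaling themselves, here's a small sketch of the perf-per-watt arithmetic. The only figure taken from the discussion above is "85% of the performance at 50% of the power"; the other points are invented for illustration and are not TechPowerUp's measurements:

code:
# power limit (W) -> relative application performance vs. the stock 241W limit
scaling = {
    241: 1.00,   # stock
    190: 0.97,   # illustrative
    125: 0.85,   # "85% of the performance at 50% of the power draw"
    100: 0.78,   # illustrative
}

stock_eff = scaling[241] / 241
for watts, perf in sorted(scaling.items(), reverse=True):
    gain = (perf / watts) / stock_eff
    print(f"{watts:>3}W: {perf:.0%} of stock performance, {gain:.2f}x perf/W")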

mobby_6kl
Aug 9, 2009

by Fluffdaddy

Dr. Video Games 0031 posted:

https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-tested-at-various-power-limits/



With games, the results are obvious: the 12900K only pulls 125-140 watts when gaming anyway, though the efficiency at 100W is nice. The CPU test suite (and they ran a LOT of benchmarks/tests) is more interesting to me because these are generally tests that will use the CPU's full power budget if possible. And even then, it keeps 85% of the performance at 50% of the power draw.

The 12900K should've been a 190W chip at stock settings, with a simple BIOS switch to push it to 241W for anyone who wanted that. So I agree, they juiced the hell out of the 12900K just to ensure that it would top the 5900X in every benchmark, even though it makes little sense to actually use the CPU that way in normal usage. That extra couple percent was really important to Intel, I guess; more important than fighting their reputation for producing inefficient, power-hungry chips. It seems this has not panned out well for them, though.
Holy poo poo that's an absolutely heroic effort on their part to test all this stuff.

Thanks for digging this up. It's pretty much the impression I got from the launch reviews, but nothing beyond some sketchy Chinese benchmarks ever really tested this properly.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Newegg prices

12400 $210

12400F $180

Inept
Jul 8, 2003


AMD drop the price of the 5600x you shits

Hughmoris
Apr 21, 2007
Let's go to the abyss!
Relatively speaking, how does the i7-6700u hold up?

A friend has an old HP Notebook laptop collecting dust. A quick look shows it sporting the i7-6700u with 6 GB of RAM.

If the processor is halfway decent I might try my hand at upgrading the RAM and adding an SSD. Find some usage for it.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Hughmoris posted:

Relatively speaking, how does the i7-6700u hold up?

A friend has an old HP Notebook laptop collecting dust. A quick look shows it sporting the i7-6700u with 6 GB of RAM.

If the processor is halfway decent I might try my hand at upgrading the RAM and adding an SSD. Find some usage for it.

moving a computer to an SSD (and in your case also adding some RAM) is pretty much a guaranteed way to keep a computer going as a spreadsheet driver and forums posting battlestation no matter how old it is

since it's a laptop, it obvs can't hold a GPU, so that'll hold it back in gaming scenarios, but otherwise that still sounds like a perfectly usable computer

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Hughmoris posted:

Relatively speaking, how does the i7-6700u hold up?

A friend has an old HP Notebook laptop collecting dust. A quick look shows it sporting the i7-6700u with 6 GB of RAM.

If the processor is halfway decent I might try my hand at upgrading the RAM and adding an SSD. Find some usage for it.
I have an i7-5600U-based laptop with 16GB RAM and an SSD, and it works fine. I wouldn't use it for gaming, but for remote work or media use I can't complain. Even got a new battery from Dell, so I can use it for 5-6 hours at client sites.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?
Wow, the little i5 is really energy efficient.

This is with equivalent performance to the 5600X (it's a little behind at 720p).

https://www.igorslab.de/en/intel-co...rboards-part-1/

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
"Intel is the value brand that manages to also be more energy efficient than its AMD counterpart"

Send that back to, say, 2019 or so

Lol

Arzachel
May 12, 2012

Hughmoris posted:

A friend has an old HP Notebook laptop collecting dust. A quick look shows it sporting the i7-6700u with 6 GB of RAM.

If the processor is halfway decent I might try my hand at upgrading the RAM and adding an SSD. Find some usage for it.

Check whether part of the RAM is soldered to the motherboard before shelling out for new sticks.

Internet Savant
Feb 14, 2008
20% Off Coupon for 15 dollars per month - sign me up!

Hughmoris posted:

Relatively speaking, how does the i7-6700u hold up?

A friend has an old HP Notebook laptop collecting dust. A quick look shows it sporting the i7-6700u with 6 GB of RAM.

If the processor is halfway decent I might try my hand at upgrading the RAM and adding an SSD. Find some usage for it.

I am still using a Dell XPS with an i7-6500U, 8GB of RAM, and an SSD as my primary computer. It's a solid 5 years old and holding up well enough that while I want to refresh the laptop, I don't really need to.

Seconding the comment on RAM. Crack it open and see what you have before doing anything else. 6 GB is an oddball amount of RAM that suggests some is soldered to the motherboard with more added as a SODIMM stick.
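
If the laptop is awkward to open, one way to check the layout first is from a Linux live USB with dmidecode (needs root); soldered memory usually reports a form factor like "Row Of Chips" rather than "SODIMM". A rough sketch, assuming dmidecode is installed:

code:
import subprocess

def memory_devices():
    """Parse `dmidecode --type 17` into one dict per populated memory slot."""
    out = subprocess.run(["dmidecode", "--type", "17"],
                         capture_output=True, text=True, check=True).stdout
    devices, current = [], {}
    for line in out.splitlines():
        if line.startswith("Memory Device"):   # starts a new slot record
            if current:
                devices.append(current)
            current = {}
        elif ":" in line:
            key, _, value = line.strip().partition(":")
            current[key.strip()] = value.strip()
    if current:
        devices.append(current)
    return [d for d in devices if d.get("Size") not in (None, "No Module Installed")]

for dev in memory_devices():
    print(dev.get("Locator"), dev.get("Size"), dev.get("Form Factor"))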

SSJ_naruto_2003
Oct 12, 2012



Shrimp or Shrimps posted:

What settings did you land on for your 12600k?

Hey, so here's a (not optimized) OC for my 12600K. I think I can drop the voltage a little more and/or get the ring bus up to 41 or 42. I'll be tinkering with it more in the coming days.

Stock: [screenshot]

Overclock: [screenshot]

Both times the temp was hitting a fan breakpoint; I think I have it set up to go to 60% power at 65°C and either 90 or 100% at 80°C. This is on a Noctua U12S.

It does take about 40 watts more for that improvement, so I'm thinking I'll drop the E-cores back down and lower the voltage some more. I don't do enough heavy rendering to need a full-system OC of this level, but I would like some more single-threaded performance.
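
For reference, here's roughly what that fan curve looks like as code, with linear interpolation between breakpoints. The 65°C/60% and 80°C/90-100% points are from the post above; the low-end point is an assumption:

code:
FAN_CURVE = [(40, 30), (65, 60), (80, 100)]   # (temp °C, fan duty %)

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty between breakpoints, clamped at the ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]   # past the last breakpoint: full configured duty

for t in (50, 65, 72, 80, 90):
    print(f"{t}°C -> {fan_duty(t):.0f}% fan")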

Hughmoris
Apr 21, 2007
Let's go to the abyss!
Thanks for the advice. I bought a $50 500GB SanDisk SSD and installed it with no issues. Completely forgot to look at the RAM while I had it open though. :doh:

This thing hums along with the new SSD. I had a family member try it out, and they couldn't notice any performance difference between it and their new laptop for general tasks like surfing, YouTube, email, etc.

Going to toss in a new $14 battery and call it good for now. All in all, a fun and successful project.

Shipon
Nov 7, 2005
Why would my bus clock (12900K) be sitting at 99.8 MHz, and why does attempting to change it to anything else (like, say, the 100 MHz the BIOS is set to) cause it to crash?

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Am I getting it right that the B660 boards can't do any overclocking at all, even with an unlocked CPU? Or is it pretty much irrelevant if you can just bump up the power limit?

VorpalFish
Mar 22, 2007
reasonably awesometm

mobby_6kl posted:

Am I getting it right that the B660 boards can't do any overclocking at all, even with an unlocked CPU? Or is it pretty much irrelevant if you can just bump up the power limit?

They can't do multiplier-based overclocking. They will do memory OC (aka XMP), and I believe many allow power-limit tweaking and may have variants of MCE (applying the peak single-core frequency to all cores).

I'm of the opinion that actual OC basically isn't worth the time and effort anymore, and if B660/H670 boards are enough cheaper and meet your connectivity needs, they should probably be the de facto standard choice.

Others apparently disagree.
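
On the power-limit point: besides the BIOS, Linux exposes the same PL1/PL2 knobs through the intel_rapl powercap driver, which makes it easy to check what the board is actually enforcing. A read-only sketch (the path assumes a single-package system; writing the limits needs root, and firmware may clamp whatever you set):

code:
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")   # CPU package power domain

def read_watts(constraint: int) -> float:
    """Read one power-limit constraint, converting microwatts to watts."""
    uw = int((RAPL / f"constraint_{constraint}_power_limit_uw").read_text())
    return uw / 1_000_000

for c in (0, 1):   # constraint 0 is typically PL1 (long term), 1 is PL2 (short term)
    name = (RAPL / f"constraint_{c}_name").read_text().strip()
    print(f"{name}: {read_watts(c):.0f} W")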

mrk
Jan 14, 2004

what the f/2.8 is going on here!

VorpalFish posted:

They can't do multiplier-based overclocking. They will do memory OC (aka XMP), and I believe many allow power-limit tweaking and may have variants of MCE (applying the peak single-core frequency to all cores).

I'm of the opinion that actual OC basically isn't worth the time and effort anymore, and if B660/H670 boards are enough cheaper and meet your connectivity needs, they should probably be the de facto standard choice.

Others apparently disagree.

Yup, aside from padding out benchmark numbers, real-world use including games is so good at stock that it's pretty pointless nowadays. We've reached a point where GPU bottlenecks are the bigger problem, with screens at such high resolutions and refresh rates.
I have not OC'd my 12700KF, and I won't. I also have standard 64GB of 3600 CL18 DDR4, yet my performance numbers in Cinebench and the like match up with others running faster RAM, and my game FPS isn't much behind at all (if not similar in many games).

I have even undervolted my GPU, because it shows no loss in FPS but draws around 100 watts less in gaming, with less heat and less fan noise.

wargames
Mar 16, 2008

official yospos cat censor
Intel is willing to open their piggy bank for talent

https://appleinsider.com/articles/22/01/06/apple-loses-lead-apple-silicon-designer-jeff-wilcox-to-intel

WhyteRyce
Dec 30, 2001

Intel has always been willing to spend to acquire or retain top-of-the-ladder guys. The problem is that the extravagant spending on talent usually cuts off there.

BlankSystemDaemon
Mar 13, 2009



It goes without saying that Jeff Wilcox wasn't the only person to design the M1 for Apple, but what a lot of articles aren't printing and what wasn't sensationalized to the same degree is that Gerard Williams III, John Bruno, and Manu Gulati - all on the M1 team like Jeff Wilcox - left Apple years ago and are now at Qualcomm.
They also had a brief flirt with a start-up called Nuvia which they founded and which was supposed to be working on an ARM server CPU, but so far as I was able to determine at the time, the Nuvia offices were in the same building as the Qualcomm CPU engineers, and there was cross-talk even before the acquisition.

movax
Aug 30, 2008

BlankSystemDaemon posted:

It goes without saying that Jeff Wilcox wasn't the only person to design the M1 for Apple, but what a lot of articles aren't printing and what wasn't sensationalized to the same degree is that Gerard Williams III, John Bruno, and Manu Gulati - all on the M1 team like Jeff Wilcox - left Apple years ago and are now at Qualcomm.
They also had a brief flirt with a start-up called Nuvia which they founded and which was supposed to be working on an ARM server CPU, but so far as I was able to determine at the time, the Nuvia offices were in the same building as the Qualcomm CPU engineers, and there was cross-talk even before the acquisition.

The key problem for those other guys IMO is that they don’t have a fully vertically integrated stack where they control the entire device from silicon to OS to industrial design. Qualcomm can get the best CPU guys in the world whose work will be wasted on Android turd sandwiches. Microsoft squandered Surface, IMO — didn’t do them much good owning the OS and telling the ODM what to do.

WhyteRyce
Dec 30, 2001

Didn’t Qualcomm just blame Windows ARM failures on greedy OEMs

BlankSystemDaemon
Mar 13, 2009



movax posted:

The key problem for those other guys IMO is that they don’t have a fully vertically integrated stack where they control the entire device from silicon to OS to industrial design. Qualcomm can get the best CPU guys in the world whose work will be wasted on Android turd sandwiches. Microsoft squandered Surface, IMO — didn’t do them much good owning the OS and telling the ODM what to do.
Oh, that's absolutely a good point - however, it's not like Microsoft are using ICC to build with, so even if they did have everything else exactly as it should be (and this is still Microsoft we're talking about here, so that's a big ask), they still wouldn't have had the exact advantage Apple has with the M1.
Then again, given how badly ICC performs on AMD systems, that's probably for the best.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The compiler isn't the only thing Apple has going for them; everything the Asahi Linux people have shown says it still flies on public compilers. It's just more cope from the x86 gang.

Helter Skelter
Feb 10, 2004

BEARD OF HAVOC

Shipon posted:

Why would my bus clock (12900K) be sitting at 99.8 MHz, and why does attempting to change it to anything else (like, say, the 100 MHz the BIOS is set to) cause it to crash?

Raising that is basically a full-system overclock and not generally recommended unless you know what you're doing. 99.8MHz is close enough (there's always a little variance) to 100MHz that I wouldn't gently caress with it.
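
To put the 99.8 MHz in perspective: the core clock is BCLK times the multiplier, so the shortfall is about 0.2% across the board. A quick sketch (the multiplier is illustrative, not Shipon's actual setting):

code:
bclk_mhz = 99.8
multiplier = 49                            # e.g. an all-core boost ratio

effective = bclk_mhz * multiplier / 1000   # 4.890 GHz
nominal = 100.0 * multiplier / 1000        # 4.900 GHz
print(f"{effective:.3f} GHz vs {nominal:.3f} GHz "
      f"({1 - effective / nominal:.2%} lower)")
# Unlike the multiplier, BCLK also feeds other parts of the platform,
# which is why nudging it can destabilize the whole system.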

BlankSystemDaemon
Mar 13, 2009



Paul MaudDib posted:

The compiler isn't the only thing Apple has going for them; everything the Asahi Linux people have shown says it still flies on public compilers. It's just more cope from the x86 gang.
Intimate knowledge of hardware when writing compiler code is absolutely going to make a difference, but my comment wasn't meant how I think you interpreted it.

What compiler is AsahiLinux using (not what's supported, because that's obviously both GCC and LLVM)? Because all of the optimizations for M1 that Apple has done are, as far as I know, included in LLVM, since that's what Xcode uses and in turn what every macOS application developer is using.

SomeoneTM would need to do two compiles of AsahiLinux, one with LLVM and one with GCC, and then they'd presumably have to spend some time adjusting all the various optimization flags so that they're completely the same across the compilers, because anything else would be an apples-to-oranges comparison (and that's where almost every compiler benchmark article falls short; it's a very common mistake).
I've yet to find anyone publishing numbers on this, so I think it's premature to conclude whether there's a difference or not.
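
A sketch of what that apples-to-apples setup could look like: build one benchmark with both compilers using identical flags, then time the binaries. bench.c and the flag list are placeholders; the point is only that the flags must match across compilers:

code:
import subprocess, time

FLAGS = ["-O2"]                        # keep identical for both compilers
COMPILERS = {"gcc": "gcc", "clang": "clang"}

for name, cc in COMPILERS.items():
    binary = f"./bench_{name}"
    subprocess.run([cc, *FLAGS, "bench.c", "-o", binary], check=True)
    start = time.perf_counter()
    subprocess.run([binary], check=True)
    print(f"{name}: {time.perf_counter() - start:.3f} s")

# Run each binary several times and compare medians before concluding anything.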

I think it's neat that there's finally a high-performing ARM core; I just wish it were available outside a very small subset of hardware. This is true for AWS Graviton too: even though it's based on the ARM Neoverse N1, it's not readily available if you don't do butt compute.

Ihmemies
Oct 6, 2012

So the 12700K is good for games and applications, yes? Thought about:
- 12700K
- Asus ROG Strix Z690-A DDR4
- Noctua N15 black

I'd use my old 4x8GB DDR4-3600 CL15 B-die.

Price would be around 1000€, not cheap. But I don't see the value in paying 200€ more for 4 efficiency cores; 50€/pop is not good. I'd pair the CPU with a 3080 and replace my old 8700K setup.

DDR5 doesn't seem to offer anything right now either, except hundreds of dollars of extra cost for no noticeable gain.

Shipon
Nov 7, 2005
you get the cool-rear end golden wafer packaging though

Cygni
Nov 12, 2005

raring to post

12700K is Very Good and with a 3080 will be extremely performant. I would probably consider looking at a B660 board rather than that Strix Z690-A, just because it's super expensive. There really isn't much point to overclocking modern CPUs, and the B660 boards will perform exactly the same, just without overclocking and with a few fewer supported USB ports that manufacturers don't put on the IO plate anyway.

In the US, if you wanted to stick with Asus, the B660-A Strix (which is not identical, but similar) is $150 cheaper, which is nuts, and there are B660 boards all the way down to sub-$100. That might make the proposition more attractive.

Dr. Video Games 0031
Jul 17, 2004

Cygni posted:

12700K is Very Good and with a 3080 will be extremely performant. I would probably consider looking at a B660 board rather than that Strix Z690-A, just because it's super expensive. There really isn't much point to overclocking modern CPUs, and the B660 boards will perform exactly the same, just without overclocking and with a few fewer supported USB ports that manufacturers don't put on the IO plate anyway.

In the US, if you wanted to stick with Asus, the B660-A Strix (which is not identical, but similar) is $150 cheaper, which is nuts, and there are B660 boards all the way down to sub-$100. That might make the proposition more attractive.

I'd sooner get the MSI Z690-A Pro for the price of that B660 board. Robust IO, a solid set of features and headers, four M.2 slots, two PCIe x4 slots. The audio chip is slightly worse, but that's the only downside I'm seeing, and it's probably not noticeable for most people. Oh, and no included heat spreader for three of the M.2 slots, but meh.

Cygni
Nov 12, 2005

raring to post

Dr. Video Games 0031 posted:

I'd sooner get the MSI Z690-A Pro for the price of that B660 board. Robust IO, a solid set of features and headers, four M.2 slots, two PCIe x4 slots. The audio chip is slightly worse, but that's the only downside I'm seeing, and it's probably not noticeable for most people. Oh, and no included heat spreader for three of the M.2 slots, but meh.

Yeah, Asus stuff is super overpriced this generation. Just wanted to give an Asus option if they wanted to stick with 'em for some reason. In general, the Z690-A Pro is probably the default board out there right now (if you're cool with the downgraded audio).

Shipon
Nov 7, 2005

Dr. Video Games 0031 posted:

I'd sooner get the MSI Z690-A Pro for the price of that B660 board. Robust IO, a solid set of features and headers, four M.2 slots, two PCIe x4 slots. The audio chip is slightly worse, but that's the only downside I'm seeing, and it's probably not noticeable for most people. Oh, and no included heat spreader for three of the M.2 slots, but meh.
Do you even need heat spreaders for NVMe drives? But yeah, four M.2 ports is wonderful, and if anything my only gripes with the board (I have the DDR4 WiFi model) are that there are no add-on temp sensor headers and no POST code LED (yeah, all of that stuff has moved way up to the high-end, way-too-expensive boards, I get that). It's a real shame that all of the high-end boards are DDR5-only given the sorry state of DDR5 modules currently available. Give it another year or two and DDR5 will be fine, but right now? Nah.
