Begall
Jul 28, 2008
We’ve reached a point where “65W” actually means 400W+? :psyduck:


cerious
Aug 18, 2010

:dukedog:
Should just drop the 'W' like the 'nm'

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SwissArmyDruid posted:



What, actually, the gently caress, Intel. I thought these were supposed to be half-filled with "efficiency cores".

Average wattage hasn’t gone up though, so those higher peaks are being balanced by lower lows.

What you’re seeing is a progression of the same trends that have impacted AMD and NVIDIA GPUs: power consumption is getting “burstier” on these newer nodes. When everything switches on at once they can suck >2x “average” current for a few milliseconds, then it goes lower for a while to compensate.

There is of course a parallel conversation here about the extent to which our power supplies were specced to handle that kind of surge. It was a problem for Vega, a problem for RDNA1, a problem for Ampere (the POSCAP mess), etc. A good PSU should be able to handle it but a lot of people don’t have top tier PSUs, and have pretty marginal capacity that doesn’t leave tons and tons of headroom for surges, because it is a widely recommended place where you could save twenty bucks by cutting corners.
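The "bursty" draw described above is easy to see in a toy model: brief spikes at well over 2x the long-term average, balanced by lower draw in between, so the mean stays flat while the PSU still has to survive the excursion. All numbers below are invented for illustration, not measurements from any real GPU or CPU:

```python
# Toy model of "bursty" power draw: a short spike far above the long-term
# average, compensated by lower draw afterwards. Numbers are illustrative.

def average_power(samples_w):
    """Long-term average of a list of (duration_ms, watts) samples."""
    total_ms = sum(d for d, _ in samples_w)
    total_mj = sum(d * w for d, w in samples_w)  # energy in millijoules
    return total_mj / total_ms

# 5 ms spike at 600 W, then 95 ms at ~283 W: the average is ~300 W,
# but the PSU must still ride out the 600 W peak without tripping OCP.
bursty = [(5, 600), (95, 283.2)]
print(average_power(bursty))       # ~299 W long-term average
print(max(w for _, w in bursty))   # 600 W transient the PSU has to handle
```

This is why a marginally-specced PSU can fall over even when the "average" draw looks comfortably inside its rating.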

Paul MaudDib fucked around with this message at 05:58 on Aug 8, 2021

Drakhoran
Oct 21, 2012

Begall posted:

We’ve reached a point where “65W” actually means 400W+? :psyduck:

They might be using the same kind of "Watts" as the producers of hi-fi speakers.

BobHoward
Feb 13, 2012

Those peak numbers characterize worst-case transitory demand spikes running some kind of power virus load crafted by the engineering team to switch as many flipflops as fast as possible. System integrators have to size the PSU to handle such spikes to guarantee stable operation under all possible scenarios.

It's been a trend for a long time that each new generation of chips (from anyone, not just Intel) increases the ratio between peak and long-term average power draw. For example, a common method of using up the transistors which new nodes give you is to plonk down special purpose accelerator blocks that are usually dark. When your engineering team does the exercise of assuming everything's active at everything's Fmax at the same time, you do get some big numbers.

Note that the "continuous current" aka long-term average numbers show no changes between 10th/11th/12th gen for each TDP bin, other than 12th gen 35W TDP losing 2A (which is good). So in that respect, Intel is holding the line, which is good. The real issue is that the line was ridiculous in the first place: a supposedly 125W TDP CPU is allowed to draw 26A continuous from the 12V supply (312W)? Intel TDP used to actually mean something, many years ago.
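The arithmetic in that last line is just P = V x I: a "continuous current" spec on the 12 V rail translates directly into sustained watts.

```python
# Rail current -> watts (P = V * I), using the figures from the post above.

def rail_power_w(volts, amps):
    return volts * amps

# The "125 W TDP" part's 26 A continuous limit on the 12 V supply:
print(rail_power_w(12, 26))  # 312 W sustained, for a nominally 125 W CPU
```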

Inept
Jul 8, 2003

BobHoward posted:

The real issue is that the line was ridiculous in the first place: a supposedly 125W TDP CPU is allowed to draw 26A continuous from the 12V supply (312W)? Intel TDP used to actually mean something, many years ago.

the 65W part continuously consuming 270W is even more funny. I don't think AMD has anything that bad

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Inept posted:

the 65W part continuously consuming 270W is even more funny. I don't think AMD has anything that bad

They *did*. Bulldozer.

The track is on the other bogey now.

wargames
Mar 16, 2008

official yospos cat censor

SwissArmyDruid posted:



What, actually, the gently caress, Intel. I thought these were supposed to be half-filled with "efficiency cores".

So they are pulling an NVIDIA and MOAR POWAH!!!!

Gwaihir
Dec 8, 2009
Hair Elf

Inept posted:

the 65W part continuously consuming 270W is even more funny. I don't think AMD has anything that bad

A full power 64c Epyc clocks in at 280W, but that's about it I think. Desktop stuff like the 5800x peaks at 140W.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
540 watts lmao I have a literal spaceheater that’s 600 watts.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Looking forward to the future when AIOs give way to self-contained phase change apparatuses that require a licensed refrigerant tech to service.

I mean, Intel already brought back TECs.

FuturePastNow
May 19, 2014


BIG HEADLINE posted:

They *did*. Bulldozer.

The track is on the other bogey now.

At least AMD labeled the 220W Bulldozer parts as 220W parts. Even included an AIO with the retail box version.

Indiana_Krom
Jun 18, 2007
Net Slacker

B-1.1.7 Bomber posted:

540 watts lmao I have a literal spaceheater that’s 600 watts.

When someone asks how powerful your CPU is, you can just say "It peaks at about 3/4 horsepower".
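The horsepower quip checks out: at roughly 745.7 W per mechanical horsepower, a 540 W peak really is about 3/4 hp.

```python
WATTS_PER_HP = 745.7  # one mechanical horsepower, in watts

def watts_to_hp(watts):
    return watts / WATTS_PER_HP

print(round(watts_to_hp(540), 2))  # ~0.72, i.e. about 3/4 horsepower
```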

BlankSystemDaemon
Mar 13, 2009




Lots of nerds are gonna have to have their house electrics rewired, so really Intel is doing The Society a favour by ensuring there'll be plenty of jobs in the construction business.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Looking forward to how mobo makers are gonna market their one-click "easy" 1 extra watt per 1MHz OC

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

B-1.1.7 Bomber posted:

540 watts lmao I have a literal spaceheater that’s 600 watts.

I wonder at what point it’s worth building dedicated exhaust ventilation into your house for your SLI 3090 build?

I know my computer in a well ventilated area of my house makes it hot as balls with a 2080Ti and a 9700k. My cooling costs!!!

Hughmoris
Apr 21, 2007
Let's go to the abyss!
If I want to try one of those new Intel CPUs, do I need a 240V outlet installed?

shrike82
Jun 11, 2005

do a server build hidden in a closet then stream games to a thin client

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Inept posted:

the 65W part continuously consuming 270W is even more funny. I don't think AMD has anything that bad

I noticed that the Dell micro-PCs (micro desktops like this) have gone from using 35W desktop CPUs on prior generations to 35W laptop CPUs on 11th gen. This was pretty confusing to me initially because 35W TDP is 35W TDP, right? I guess it's not anymore, though.

silence_kit
Jul 14, 2011

by the sex ghost
Is it possible to hit the high continuous power dissipations in non-fake workloads though?

AutismVaccine
Feb 26, 2017


SPECIAL NEEDS
SQUAD

Gwaihir posted:

A full power 64c Epyc clocks in 280w, but that's about it I think. Desktop stuff like the 5800x peaks at 140w.

with 5x scaling and PBO i got my 5800x CPU package power to 183W (Cinebench for like 12 hours to test the new system, oversized custom loop).

Modern CPUs can consume hilarious amounts of power, good to see that that won't change too fast.

Inept posted:

the 65W part continuously consuming 270W is even more funny. I don't think AMD has anything that bad

+1, thats kinda insane

AutismVaccine fucked around with this message at 16:10 on Aug 8, 2021

VorpalFish
Mar 22, 2007
reasonably awesometm

AutismVaccine posted:

with 5x scaling and PBO i got my 5800x CPU package power to 183W (cinebench for like 12hours to test the new system, oversized custom loop).

Modern CPUs can consume hilarious amounts of power, good to see that that wont change too fast.

+1, thats kinda insane

PBO is specifically running the CPU out of spec and allowing it to violate PPT.

On the intel side of things, the 65w cpus absolutely do not draw 270w sustained when using intel's recommended motherboard settings:

https://www.techpowerup.com/review/intel-core-i5-11400f/20.html

Will it do more than 65w short duration? Absolutely. Can you uncap, overclock, and cause crazy sustained draw? Sure, but that's on you.
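The spec behavior being described is roughly: the chip may draw up to a short-duration limit (PL2) for a boost window (tau), then falls back to the long-duration limit (PL1, the "TDP" number). A simplified sketch for intuition only — real silicon enforces this with an exponentially-weighted moving average energy budget, and the example wattages and window are illustrative, not from any Intel datasheet:

```python
# Simplified sketch of Intel-style PL1/PL2 boost limiting under a
# sustained all-core load. Real hardware uses an EWMA energy budget
# rather than a fixed timer; values here are illustrative only.

def allowed_power(t_seconds, pl1=65, pl2=200, tau=28):
    """Power cap in watts at time t after a sustained load begins."""
    return pl2 if t_seconds < tau else pl1

# Early in a long render the chip can burst to PL2; once the boost
# window expires, a spec-following board clamps it to PL1.
print(allowed_power(5))    # 200 (inside the boost window)
print(allowed_power(60))   # 65  (long-duration limit enforced)
```

Boards that "uncap" the chip are effectively setting tau to infinity (or PL1 equal to PL2), which is what produces the sustained triple-digit draws being complained about.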

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
You can't really compare AMD and Intel CPUs head to head in terms of spec/OC. AMD's auto-clocking is very advanced and is basically getting everything out of the CPUs already; PBO is throwing a ton of power at essentially no gain. Intel's time-limited boost spec, whatever the gently caress they're calling it, is only really time-limited for thermal design purposes for OEM and maybe SI applications. There's really no downside to letting it run in suck-down-all-the-power mode forever, Intel doesn't actually consider it out of spec, and it does get you meaningful performance gains for the power cost. That's why there are always a bunch of motherboards that default to running that way all the time.

So AMD has their TDP, their actual power draw, and their transient power draw, but Intel has TDP, actual power draw with "spec" time limits, then a lower drop-down power, or maintaining that boost power forever, and also transients. The only sane way to compare them would be setting up a system for a specific goal, benching that specific goal while also capturing power usage, and then analyzing efficiency. In a vacuum none of the numbers mean anything.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
yeah Intel does this thing where they want to be able to sell CPUs on the OEM and light-user side without scaring off customers/motherboard manufacturers with high wattage numbers and while only needing their piddly-rear end coolers, so the base clocks are set/advertised as low with a simplistic timed boost that actually lets them hit a "65w TDP"... but they also know that they need to goose the clocks real high if they want to be competitive in the gaming/power-user space... so you can just lift the boost limit and let it run at full clip all day without even violating what Intel would consider in-warranty behavior

I don't know if I would necessarily call this "cheating", but there's definitely an element of misleading-ness to all of it because they're trying to have their cake and eat it too

I remember a couple of months back Hardware Unboxed took issue with this sort of thing, because they found a couple of motherboards where the power delivery was so sparse and underbuilt that even though it could run a CPU within the "Intel guidance" limits with the low clocks and low TDP, it couldn't deliver no-limit-boosting, which is what happens if you tell a motherboard manufacturer "it only needs to be able to handle 65 watts" and they wanna cut so many corners that they actually do exactly that

VorpalFish
Mar 22, 2007
reasonably awesometm

Which is exactly what those benchmarks are. The chip will stay within long duration power limits on the box as long as the uefi settings aren't modified to allow it to do more (or as you mentioned have their default set that way which imo is kinda scummy practice by the board maker - yes, leave the option there for people that want it, but behavior out of the box should match the packaging.)

Cygni
Nov 12, 2005

raring to post

Yeah, both Intel and AMDs "TDP" calculations are far from perfect and their boosting behaviors (and board vendor implementations) are disparate enough that for desktop specifically, you cant use them like you used to. The reality is that the DIY market has sent a clear signal to the hardware providers that they don't really give a poo poo about power draw on average, and AMD/Intel/Nvidia have all responded to that signal with ballooning power budgets. Especially in a world with process improvement grinding down to an unbelievably expensive and slow slog, sending the power budget through the roof for the DIY segment is one of the easiest ways to keep giving people the gaming performance increases they want.

silence_kit
Jul 14, 2011

by the sex ghost
Are the continuous power dissipation numbers going up for real applications, or only for fake applications which are designed to turn on all of the sub-circuits of the computer chip?

If it is only the case that power is going up for case 2) and not case 1), then I don't really see the controversy here. It makes a lot of sense to me for Intel and AMD to design for real world usage instead of fake computer usage.

Begall
Jul 28, 2008
That Hardware Unboxed video found Intel motherboards that were so bad they weren't able to run >65W CPUs within standard power limits (that the box claimed were supported) without throttling them to the point of losing half their performance relative to a properly built motherboard.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

VorpalFish posted:

Which is exactly what those benchmarks are. The chip will stay within long duration power limits on the box as long as the uefi settings aren't modified to allow it to do more (or as you mentioned have their default set that way which imo is kinda scummy practice by the board maker - yes, leave the option there for people that want it, but behavior out of the box should match the packaging.)

Why is it scummy that a high performance gamer board defaults to high performance gamer settings? The average person buying these boards does not understand what the hell is going on. Probably half the people building their own systems don't even enable XMP, which is a huge performance hit and in an ideal world it would be practical to default that to on as well.

Steve at GN makes a huge deal out of this and benches Intel systems at the "spec", which is honestly stupid. No one going to his youtube channel is trying to build an office PC with no cooling from an 11900k.

VorpalFish
Mar 22, 2007
reasonably awesometm

K8.0 posted:

Why is it scummy that a high performance gamer board defaults to high performance gamer settings? The average person buying these boards does not understand what the hell is going on. Probably half the people building their own systems don't even enable XMP, which is a huge performance hit and in an ideal world it would be practical to default that to on as well.

Steve at GN makes a huge deal out of this and benches Intel systems at the "spec" which is honestly stupid. No one going to his youtube channel is trying to build an office PC with no cooling from a 11900k.

I think Steve has this one at least a little right - I prefer when review sites test at both stock and oc/raised limits and report power and performance for both cases.

And I think it's scummy because they're doing it to make their motherboard look better than it actually is. If you test board A and board B where board B enforces power limit and A does not, A looks like it performs better with the same chip even if B is just as capable of running the chip just as fast by raising the power limit.

I think running a chip at a higher power limit should be a conscious choice made by someone who's chosen appropriate cooling to go with it, not something you don't realize is happening.

If you aren't the kind of person who's gonna even enable xmp, maybe you're running the stock cooler too, and the performance at stock long power limit is probably sufficient in that scenario.

Edit: I would be way more on board with xmp on being a default - ostensibly the memory kit is being guaranteed by the manufacturer to run at the advertised clocks and timings, and it has negligible impact to system power draw.

VorpalFish fucked around with this message at 19:57 on Aug 8, 2021

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I get what you're saying, but if you're dumb and lazy enough that you buy a high-end board and a high end CPU, and then you put a poo poo-rear end cooler on it... I honestly don't care, go gently caress yourself. You can deal with the poorly cycling thermal throttling, you deserve it.

Keep in mind that all the higher-end K SKUs don't come with an HSF. You have to do the research to buy an expensive CPU and expensive motherboard and then buy a garbage cooler to wind up in that situation. And yes, at that point you should also be going into BIOS and setting things, but the fact of the matter is that that is a different, more difficult task for most people than buying a list of good parts and putting them together. Also I think we're past the days of benches being the judge of motherboards, but maybe I'm wrong.

Klyith
Aug 3, 2007

GBS Pledge Week

silence_kit posted:

Are the continuous power dissipation numbers going up for real applications, or only for fake applications which are designed to turn on all of the sub-circuits of the computer chip?

Real applications that use the CPU. For a lot of regular enthusiasts, games are the only thing that does it.

Most apps don't really put that much CPU pressure on the system while active.

K8.0 posted:

Also I think we're past the days of benches being the judge of motherboards, but maybe I'm wrong.

intel has made them relevant again

VorpalFish
Mar 22, 2007
reasonably awesometm

Klyith posted:

Real applications that use the CPU. For a lot of regular enthusiasts, games are the only thing that does it.

Most apps don't really put that much CPU pressure on the system while active.

intel has made them relevant again

That's exactly what we're talking about; some of those motherboards are allowing the chip to maintain boost way above the rated power limit by default, which makes them look a lot better.

In my opinion, that's scummy on the part of the board maker, not intel - and to be clear those are non k series skus that do come with lovely stock heatsinks that a masochist could conceivably use.

Yes, some of the boards couldn't handle 140w because of inadequate/underbuilt VRMs, but if you want to run at uncapped power, get a board with a decent VRM.

Imo they should all enforce the power limit of the SKU by default, with the option to uncap - should be opt-in, not opt-out.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Klyith posted:

Real applications that use the CPU. For a lot of regular enthusiasts, games are the only thing that does it.

Most apps don't really put that much CPU pressure on the system while active.

intel has made them relevant again

everything that intel says about benchmarks is true, especially if it contradicts what they said about benchmarks before.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
For H&B platforms I agree with you, it's a lower end platform where lower end components are being used and efficiency is far more likely to be a real concern. On Z platforms, goose it to the moon by default.

Klyith
Aug 3, 2007

GBS Pledge Week
Anyways the last 4-5 pages of this thread are retrospectively super funny now.

Intel: We want everyone to use ATX12VO so PCs will be more efficient #green
Also Intel: Here's our new 500W CPU!

Shipon
Nov 7, 2005
to be fair someone stuck at home on their gaming computer pulling down ~800 W from the wall is still using about as much electricity as it takes a tesla to drive a little over 3 miles so if someone's hobby or entertainment has them driving more than 10 miles or so per outing, tsk tsk
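The EV comparison holds up under a commonly-cited efficiency figure of roughly 250 Wh per mile (an assumed ballpark, not a measured number for any specific car):

```python
# One hour of gaming at 800 W from the wall = 800 Wh of energy.
# At an assumed EV efficiency of ~250 Wh/mile, that's the energy
# it takes to drive a bit over 3 miles.

GAMING_DRAW_W = 800       # wall draw of the hypothetical gaming rig
EV_WH_PER_MILE = 250      # rough ballpark for a Tesla-class EV

miles_per_gaming_hour = GAMING_DRAW_W / EV_WH_PER_MILE
print(round(miles_per_gaming_hour, 1))  # 3.2 miles' worth of charge per hour
```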

shrike82
Jun 11, 2005

love to spend summer sitting next to a 1kw gaming desktop

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

shrike82 posted:

love to spend summer sitting next to a 1kw gaming desktop

just play on a laptop via Moonlight

bing bong so simple


Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Meanwhile, the current consoles are legitimately comparable to a midrange current-gen PC and ship with a 315W PSU; peak power usage observed right now is around 201W.

https://www.eurogamer.net/articles/digitalfoundry-2020-xbox-series-x-power-consumption-and-heat-analysis
