MachinTrucChose
Jun 25, 2009
I'm extremely underwhelmed. What's supposed to excite me here exactly?

New architecture with 10% power savings and 10% speed improvement
Too negligible to matter for the home user. Only big companies will care, and hopefully they realize 99% of their employees can get by on Atoms.

Dedicated H264 encoding chip
Niche improvement that seems pointless. Who does video encoding other than scene groups and video professionals? Those people already have dedicated hardware for this. Little Jenny can capture her vlog from her webcam just fine with her current hardware.

You can't overclock like you used to
Overclocking is a stupid waste of money and shouldn't be done, but limiting the option is still a negative for the CPU riceboy types.

Integrated GPU that doesn't suck
The only interesting thing so far. But even that will probably do more harm than good. The vast, vast majority of buyers won't play high-end 3D games, so adding 40W to the power consumption (or however much it draws at idle load) of every next-generation Intel-based computer just wastes the consumer's money and rapes the environment further.

When I saw the thread title I was really hoping for news of a design that cut power consumption in half or something. Turns out it's mostly more of the same. Changing the face of computing? It's just changing the socket type.

MachinTrucChose
Jun 25, 2009

Jabor posted:

:rolleyes:

Anyway, what makes you think this will have a significantly higher idle power consumption than the alternatives for that market segment?

I take back what I said. I thought that sort of GPU would draw a high amount of power, but the two cards the OP mentions only draw 8W at idle. Here's hoping Intel's GPU is even more efficient. Out of curiosity, anyone know how much the current crappy onboards draw?

As for the 10%, I just feel it's too minor an improvement for a new generation. I honestly think the home user is set for the next 2 to 3 years (if not more) in terms of computing power, so I wouldn't mind if the focus became almost exclusively about power consumption.
