|
I'm extremely underwhelmed. What's supposed to excite me here, exactly?

New architecture with 10% power savings and 10% speed improvement: too negligible to matter for the home user. Only big companies will care, and hopefully they realize 99% of their employees can get by on Atoms.

Dedicated H.264 encoding chip: a niche improvement that seems pointless. Who does video encoding other than scene groups and video professionals? Those people already have dedicated hardware for this. Little Jenny can capture her vlog from her webcam just fine with her current hardware.

You can't overclock like you used to: overclocking is a stupid waste of money and shouldn't be done, but limiting the option is a negative for the CPU riceboy types.

Integrated GPU that doesn't suck: the only interesting thing so far. But even that will probably do more harm than good. The vast, vast majority of buyers won't play high-end 3D games, so adding 40W to the power consumption (or however much it draws at idle) of every next-generation Intel-based computer just wastes the consumer's money and trashes the environment further.

When I saw the thread title I was really hoping for news of a design that cut power consumption in half or something. Turns out it's mostly more of the same. Changing the face of computing? It's just changing the socket type.
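To put the 40W idle figure in perspective, here's a quick back-of-envelope sketch. The 40W number comes from the post above; the electricity price and always-on usage pattern are my assumptions, not anything stated here:

```python
# Rough annual cost of an extra 40 W of idle draw.
# 40 W is the figure from the post; the price per kWh and the
# always-on assumption are illustrative, not from the post.
extra_watts = 40
hours_per_year = 24 * 365                         # machine left on 24/7
kwh_per_year = extra_watts * hours_per_year / 1000
price_per_kwh = 0.12                              # assumed ~US-average $/kWh
annual_cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, ~${annual_cost:.2f}/year")
```

So an always-on machine drawing an extra 40W would burn roughly 350 kWh a year, on the order of $40 at typical residential rates, which is the kind of waste the post is complaining about.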
|
# ¿ Sep 20, 2010 06:46 |
|
|
|
Jabor posted:
I take back what I said. I thought that sort of GPU would draw a lot of power, but the two cards the OP mentions only draw 8W at idle. Here's hoping Intel's GPU is even more efficient. Out of curiosity, does anyone know how much the current crappy onboards draw?

As for the 10%, I just feel it's too minor an improvement for a new generation. I honestly think the home user is set for the next 2 to 3 years (if not more) in terms of computing power, so I wouldn't mind if the focus shifted almost exclusively to power consumption.
|
# ¿ Sep 20, 2010 07:05 |