|
movax posted:Clearly it's time for VapoChill to make a comeback Hah, well, anyone who has one stashed away somewhere can feel smug now. Prometeia, VapoChill... ah, those were the days. Total overkill. I wonder if you can adapt the fittings to the new sockets? Actually, scratch that: if you have one of those things, you'll MAKE it fit, even if it means having a bracket fabricated. It'd be worth it. vv Hey, I went back and read it. Yup, these closed-loop cooling solutions on the market are mostly pointless, or for people who don't like the look of giant towers. An NH-D14 is going to do a better job the vast majority of the time, and if the pump fails at some point, you're screwed. Fan fails? I'm sure you'll be able to find 12v 120mm fans well into the future. HalloKitty fucked around with this message at 22:18 on Jun 4, 2013 |
# ? Jun 4, 2013 22:08 |
|
|
^^^ Edit: Thanks buddy, I even forgot that point: the ability of heat pipe coolers to KEEP YOUR poo poo ALIVE even if the active element (fans, duh) fails, very very much unlike liquid cooling. ^^^ God drat it, I hate when an effortpost is last on the page. Nobody is going to read that. So here's some bad news for y'all vapochill enthusiasts to make us all feel bad! http://www.brightsideofnews.com/news/2009/4/10/asetek-ends-vapochill-resurrection-efforts2c-cites-intel-socket-issues.aspx Agreed fucked around with this message at 22:41 on Jun 4, 2013 |
# ? Jun 4, 2013 22:12 |
|
Agreed posted:God drat it, I hate when an effortpost is last on the page. Nobody is going to read that. I don't quite get that. They could just slather goo all over the socket, unless they mean under the IHS, which would be a bitch for SNB because of the solder. Aren't people still rolling their own phase change rigs these days? Or is that scene dead?
|
# ? Jun 4, 2013 22:20 |
|
Shaocaholica posted:Aren't people still rolling their own phase change rigs these days? Or is that scene dead? Alive and well last I checked, but goofy to the point of strapping a Peltier on it... In my jerk opinion.
|
# ? Jun 4, 2013 22:44 |
|
As for the voltage draw and TDP melting motherboards, I think the high end boards are ready for it. Gigabyte announced as much at Computex this past weekend: http://wccftech.com/gigabyte-confirms-upcoming-5ghz-amd-fx-processors-computex-2013/ The slide says they beefed up the VRMs on the motherboard to handle the new 5GHz monstrosity. I don't think I'll be getting one, but I'm actually a bit excited to see the benchmarks. I'll bet it holds up pretty well vs. the Haswell i7, at the cost of raising your electric bill a little and keeping you warm in the winter.
|
# ? Jun 5, 2013 01:36 |
|
Anyone getting a 5GHz CPU is already wasting far more energy running their video card(s). Intel is resting on their power-sipping laurels and has put performance on the back burner. Good time for AMD to sweep in, though Intel is probably on the right track for the long haul.
|
# ? Jun 5, 2013 04:10 |
|
On the topic of cooling, just put it into mineral oil! http://arstechnica.com/security/2013/06/password-crackers-go-green-by-immersing-their-gpus-in-mineral-oil/
|
# ? Jun 5, 2013 09:24 |
|
Riso posted:On the topic of cooling, just put it into mineral oil! I swear I remember a Prescott P4 era overclocking team that did this same thing; the CPU was naked and it basically acted like a massively effective passive radiator, moving cool mineral oil from the bottom of the tank to the top as it heated up, and then coming back down. Like one big heat pipe, except completely not. Here's a realthink: GPUs have been using vapor chamber coolers for some time, and that's a tech we have not seen adopted for CPU usage. There's probably a good reason; vapor chambers can be built to dimension, of course, but they work so well on graphics cards because they make good contact with all the important bits. With the CPU, there's really only one important bit, and the contest is more "how can we get the heat into as many 8mm heat pipes as possible very quickly?" (answer: direct contact loses to shallow blocks with quick heat transfer, mainly because 8mm pipes are ideal in dimension and you can fit more of them, so they're rapid at heat exchange) in the high end, and "just stick the pump on it" for liquid prepackaged loops. I'm as excited as Factory Factory for the neato whirly coolers that will hit the market some day soon...ish, but in the meantime, unless someone figures out a way to adapt vapor chamber technology for CPU usage in a prepackaged way, I doubt we're going to see any revolutions in cooling any time soon. Which is generally fine, since most CPUs are going down, down, down in TDP, and this is an intentional anomaly from a hot-running family of processors whose main claim to fame is high clocks at a massive wattage cost. So it goes. I think I am looking for a solution to a problem that doesn't really exist, since existing tech is more than adequate for whatever you intend to do... It'd just be neat to see a new, badass cooling setup hit the market and shake things up. 
Supercooled processors behave differently than their room temperature or even their moderately enthusiast counterparts (you know, like -40ºC moderate via dry ice). If we had a cooling solution that would let us safely exceed voltage requirements, since heat would be out of the picture and V=IR, it'd just be, well, neat. Resistance and the death of transistors occur so quickly at normal temps at too-high voltages with the tiny process lithography that's in use today, but that all changes when you start getting into the -100ºC range. But what possible consumer application could there be for such a thing? Doomed to be the purview of record-setters, I suppose.
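To put a rough number on the resistance point: here's a minimal sketch using the textbook linear temperature-coefficient model for copper, R(T) = R0(1 + α(T − T0)). The α value is the standard handbook figure for bulk copper; treating on-die wiring this way is a crude illustration, not a real chip model.

```python
# Rough illustration: linear temperature-coefficient model for copper,
# R(T) = R0 * (1 + alpha * (T - T0)). Handbook alpha for bulk copper;
# real on-die interconnect is messier, so treat this as a sketch only.

ALPHA_CU = 0.00393  # per degree C, bulk copper
T_REF = 20.0        # reference temperature in degrees C

def relative_resistance(temp_c, alpha=ALPHA_CU, t_ref=T_REF):
    """Resistance relative to its value at the reference temperature."""
    return 1.0 + alpha * (temp_c - t_ref)

for t in (20, -40, -100, -150):
    print(f"{t:>5} degC: {relative_resistance(t):.2f}x R0")
```

By that crude model, the wiring at -100ºC sits at roughly half its room-temperature resistance, which is part of why everything behaves so differently down there.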
|
# ? Jun 5, 2013 09:47 |
|
Some already have built cascade cooling (multiple refrigeration units) for their CPUs to get into negative triple digits. Just a quick search turns up some impressive stuff. For example, here's a 3-stage cascade cooling build: https://www.youtube.com/watch?v=GB8bUkknEXM YouTube video posted:"With no charge it can go as low as -150º Celsius, operating temperature is around -135 idle cpu (50w), -125 loaded cpu (200w)" Of course, you can keep running like this for as long as you want, unlike some kind of pointless dry ice or liquid nitrogen run. HalloKitty fucked around with this message at 12:21 on Jun 5, 2013 |
# ? Jun 5, 2013 12:19 |
|
Some of the single-stage phase-change coolers have built-in heaters for behind the CPU and for heating the hose to prevent condensation, so you could theoretically run them forever as long as the heaters didn't fail. Even then the risk of condensation at operating temps is relatively low, other than some risk that droplets could form and drip on the video card. Other than using obscene amounts of power, a phase-change setup can be made relatively quiet - about on par with a small refrigerator anyway. It's odd to think that 'extreme' cooling solutions might become slightly more common given the way desktop CPUs seem to be heading back to the early 2000s. Mineral oil cooling isn't that efficient (you still have to use a radiator to dissipate heat, like a water-cooling loop) while also having bad thermal properties and being messy as hell. Good luck keeping a warranty or re-selling anything that you put into a mineral oil tank. Fans spinning in oil do look really neat though. Vegetable oil cooling is a hilariously awful idea unless you don't have a sense of smell at all. It's been attempted before; things like putting Peltiers inside consumer heatsinks is one route they could try for consumer cooling. The Ultra ChillTEC wasn't the greatest design, so it was outperformed by standard air coolers of the time in practice, though Peltiers are still sometimes used in below-ambient watercooling loops.
|
# ? Jun 5, 2013 18:27 |
|
LCD Deathpanel posted:It's odd to think that 'extreme' cooling solutions might become slightly more common given the way desktop CPUs seem to be heading back to the early 2000s. Yeah, it's interesting that 1. history is repeating itself with Intel acting arrogantly because AMD is down on its luck, and 2. processor designs/processes (Ivy/Haswell 22nm) optimized for low power at normal clock speeds consume much more power at higher clock speeds relative to older processors (Sandy 32nm) not quite as optimized for low power.
|
# ? Jun 6, 2013 08:05 |
|
It isn't just Intel optimizing for mobile, though; FIVR is perhaps shaping up as a bad thing for desktop users. 22nm CPUs are small and transistor dense, Haswell even more so than Ivy. Less area + more transistors means more heat and less OC potential. I also wonder how the characteristics of FinFETs stack up against planar transistors and if this is contributing to the changes we are seeing. I'm no expert and have not the slightest. The market as a whole seems happy with desktop CPU performance and does not want to liquid cool commodity computers. The server market determines desktop CPU design: while I and most enthusiasts/prosumers consider energy usage of ancillary importance, the big money cares a great deal. Yudo fucked around with this message at 09:23 on Jun 6, 2013 |
# ? Jun 6, 2013 09:16 |
|
LCD Deathpanel posted:Vegetable oil cooling is a hilariously awful idea unless you don't have a sense of smell at all.
Wow, I've seen mineral oil, but people use vegetable oil for cooling? Is it just "hey, this was cheap and I thought it would work" jury-rigged stuff? This just leaves animal oil. You'd think someone on Reddit would have made a bacon grease-cooled rig by now or something.
|
# ? Jun 6, 2013 14:48 |
|
Tom's has their A10-6800K review up. Single core still stinks.
|
# ? Jun 6, 2013 14:55 |
|
Kinda want to see a 4670R comparison just because. Also they need to put that the not-otherwise-indicated system RAM for the 6800K (and other APUs) is 1866 on the graphic itself. Realistically people will be getting 1333 or 1600, which will crap on the numbers a bit more.
|
# ? Jun 6, 2013 19:34 |
|
Sir Unimaginative posted:Also they need to put that the not-otherwise-indicated system RAM for the 6800K (and other APUs) is 1866 on the graphic itself. Realistically people will be getting 1333 or 1600 which will crap on the numbers a bit more. No one should be pairing a new CPU with slow memory, there's no price difference between DDR3-1333 and 1600, and only 10% between 1600 and 1866. The jump from 1600 to 1866 isn't huge, but there's no excuse for using DDR3-1333.
|
# ? Jun 6, 2013 20:21 |
|
Alereon posted:The jump from 1600 to 1866 isn't huge, but there's no excuse for using DDR3-1333.
|
# ? Jun 6, 2013 20:28 |
|
Alereon posted:No one should be pairing a new CPU with slow memory, there's no price difference between DDR3-1333 and 1600, and only 10% between 1600 and 1866. The jump from 1600 to 1866 isn't huge, but there's no excuse for using DDR3-1333. Did I miss something and Haswell can actually use faster memory for anything more than long compression/decompression stuff? I do agree that in the absence of a price difference, faster memory makes sense just because it's faster at the same price, but the efficiency of Intel's memory controller and access is such that it is a very narrow class of workloads that benefit from greater than DDR3-1333, unless something drastically changed with the move from SB/IVB to Haswell. Granted AMD is a different story and scales much more linearly. Poor bastards.
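For reference, the theoretical peaks behind the speed grades being argued over: DDR3 moves 8 bytes per transfer per channel, so dual-channel peak bandwidth is just the transfer rate times 16 bytes. (Real-world throughput is lower; this is only the arithmetic, not a benchmark.)

```python
def peak_bandwidth_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    """Theoretical peak in GB/s: transfers/sec * bytes/transfer * channels."""
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for grade in (1333, 1600, 1866, 2133):
    print(f"DDR3-{grade}: {peak_bandwidth_gbs(grade):.1f} GB/s dual-channel peak")
```

So 1333 to 1866 is about a 40% jump in peak bandwidth, which matters a lot more to an APU feeding its GPU from system RAM than to an Intel CPU workload.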
|
# ? Jun 6, 2013 20:30 |
|
Alereon posted:The jump from 1600 to 1866 isn't huge, but there's no excuse for using DDR3-1333. And if I had said reasonably instead of realistically I'd deserve that. Reasonably it's hard to see how someone with enough money to afford a PC with games-quality rendering would be buying an APU at all. dont be mean to me fucked around with this message at 20:33 on Jun 6, 2013 |
# ? Jun 6, 2013 20:30 |
|
Agreed posted:Did I miss something and Haswell can actually use faster memory for anything more than long compression/decompression stuff? I do agree that in the absence of a price difference, faster memory makes sense just because it's faster at the same price, but the efficiency of Intel's memory controller and access is such that it is a very narrow class of workloads that benefit from greater than 1333mhz DDR3, unless something drastically changed with the move from SB/IVB to Haswell.
|
# ? Jun 6, 2013 20:48 |
|
I put 1600 into my system when they were very much not the same price; as I recall, my 16GB kit cost about $200 back in 2011, and that's for a Sandy Bridge system. Just trying to grab that little bit extra for multi-channel real time audio to hopefully prevent any dropouts under any circumstances, but going higher than that was cost-prohibitive at the time and not a significant enough improvement to merit it. In the balance though I also nabbed two pre-tsunami 2TB HDDs, so the overall build cost evened out nicely in the long term, I guess. At least that cost-prohibitive thing has come down on RAM, and people building AMD systems (for some reason (why?? (this makes me sad))) can get their necessary faster RAM for faster performance without breaking the bank. I believe I priced something like $340 for iffy RAM and nearly $400 for good stuff for fast RAM back then.
|
# ? Jun 6, 2013 20:54 |
|
Okay, it's official now: AMD is dropping Windows exclusivity in the consumer market and going after Android and ChromeOS design wins. It's like the starting gun for the next generation of computing, may the best company win.
|
# ? Jun 9, 2013 23:10 |
|
Frankly I wasn't aware AMD had Windows exclusivity. When did that happen?
|
# ? Jun 9, 2013 23:16 |
|
According to the article, a decade ago. The policy was limited to consumer CPU offerings, though.
|
# ? Jun 9, 2013 23:17 |
|
Mad_Lion posted:As for the voltage draw and TDP melting motherboards, I think the high end boards are ready for it. Gigabyte announced as much at Computex this past weekend: What's crazy to me is the TDP difference between the 9000 and 8350 models - almost 100W for an 800MHz gain. That just seems outrageous to me, but maybe it's not that unusual, who knows. That's just a LOT of juice for that chip, and the heat output has to be insane; makes me wonder if they'll still be pre-packaged with closed-loop watercooling units like the Bulldozer chips.
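The ~100W jump is actually in the neighborhood of what naive dynamic power scaling predicts. A back-of-the-envelope sketch using P ∝ f·V²: the 125W at 4.0GHz figures are the FX-8350's rated specs, but the ~1.35V and ~1.5V voltages here are illustrative guesses, not published numbers.

```python
def scaled_power(p_base_w, f_base_ghz, v_base, f_new_ghz, v_new):
    """Dynamic power estimate: P scales with f * V^2, capacitance held fixed."""
    return p_base_w * (f_new_ghz / f_base_ghz) * (v_new / v_base) ** 2

# FX-8350 rated 125 W at 4.0 GHz; assume ~1.35 V stock and ~1.5 V at 5 GHz.
print(f"{scaled_power(125, 4.0, 1.35, 5.0, 1.5):.0f} W")
```

With those guessed voltages the estimate lands near 190W, so a 220W TDP for a validated 5GHz part isn't as outrageous as it first looks.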
|
# ? Jun 10, 2013 01:19 |
|
Honestly with every chip but Sandy Bridge through Ivy Bridge (and MAYBE Haswell), that's about/at least the power consumption increase you'd see clocking them up that far anyway. That they're offering those clocks validated and everything is pretty impressive, if a bit loony.
|
# ? Jun 10, 2013 01:22 |
|
Factory Factory posted:Honestly with every chip but Sandy Bridge through Ivy Bridge (and MAYBE Haswell), that's about/at least the power consumption increase you'd see clocking them up that far anyway. That they're offering those clocks validated and everything is pretty impressive, if a bit loony. Nah Haswell is hot hot hot (when OCed). Worse than Ivy, even.
|
# ? Jun 10, 2013 15:46 |
|
sincx posted:Nah Haswell is hot hot hot (when OCed). Worse than Ivy, even.
|
# ? Jun 10, 2013 16:29 |
|
Bob Morales posted:Tom's has their A10-6800k review up Wait, does the A10 actually perform better without a discrete graphics card or am I misreading this?
|
# ? Jun 11, 2013 01:57 |
|
Detroit Q. Spider posted:Wait, does the A10 actually perform better without a discrete graphics card or am I misreading this? When you use DDR3-2133 and compare it to a discrete graphics card about as slow as the onboard video, yes.
|
# ? Jun 11, 2013 02:03 |
|
Alereon posted:When you use DDR3-2133 and compare it to a discrete graphics card about as slow as the onboard video, yes. Jesus. I realize that the 6670 is hardly cutting edge but I never thought I'd live long enough to see integrated graphics seriously overtake almost any discrete graphics card that isn't ten years old. Maybe I can replace my Phenom II B60/4850 with an A10 at this point
|
# ? Jun 11, 2013 02:08 |
|
To be fair that is a Radeon HD 6670 DDR3, which limits it to roughly the same amount of memory bandwidth as the system memory, thus eliminating all of its advantages. A "real" Radeon HD 6670 with GDDR5 would have twice the memory bandwidth and a significant performance advantage.
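The bandwidth gap is easy to sketch: peak memory bandwidth is bus width in bytes times effective per-pin data rate. The 128-bit bus is the HD 6670's real spec; the ~1.8Gbps and ~4.0Gbps effective rates are ballpark figures for typical DDR3 and GDDR5 cards of that era, not exact retail specs.

```python
def gpu_bandwidth_gbs(bus_width_bits, effective_gbps_per_pin):
    """Peak bandwidth: (bus width in bytes) * effective per-pin data rate."""
    return bus_width_bits / 8 * effective_gbps_per_pin

print(f"HD 6670 DDR3  (~1.8 Gbps): {gpu_bandwidth_gbs(128, 1.8):.1f} GB/s")
print(f"HD 6670 GDDR5 (~4.0 Gbps): {gpu_bandwidth_gbs(128, 4.0):.1f} GB/s")
```

The DDR3 card's ~29GB/s sits right on top of dual-channel DDR3-1866 system memory's theoretical peak, which is exactly why the APU can catch it, while the GDDR5 version's ~64GB/s keeps it comfortably ahead.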
|
# ? Jun 11, 2013 03:26 |
|
Alereon posted:To be fair that is a Radeon HD 6670 DDR3, which limits it to roughly the same amount of memory bandwidth as the system memory, thus eliminating all of its advantages. A "real" Radeon HD 6670 with GDDR5 would have twice the memory bandwidth and a significant performance advantage. Ah. Something to look for when graphics card shopping then.
|
# ? Jun 11, 2013 03:35 |
|
Only extremely budget cards that nobody really buys for playing games would have DDR3; it's really unusual and usually deceptive marketing. Under-educated consumers will think that a card with tons of slow memory is better than one with a reasonable amount of fast memory.
|
# ? Jun 11, 2013 16:52 |
|
Well, it's official, the 5GHz FX-9590 exists.
|
# ? Jun 11, 2013 19:17 |
|
drat, wonder how much it's gonna cost.
|
# ? Jun 11, 2013 20:26 |
|
Alereon posted:To be fair that is a Radeon HD 6670 DDR3, which limits it to roughly the same amount of memory bandwidth as the system memory, thus eliminating all of its advantages. A "real" Radeon HD 6670 with GDDR5 would have twice the memory bandwidth and a significant performance advantage. Will the A10 really start to pick up steam when DDR4 hits the streets?
|
# ? Jun 11, 2013 20:30 |
|
Bob Morales posted:Will the A10 really start to pick up steam when DDR4 hits the streets? You wouldn't be able to use existing products with DDR4, it's not like DDR2/3 where there was some degree of compatibility on the memory controller level, it's a completely different technology. That said, DDR4 should scale to higher memory bandwidth than DDR3 with resulting performance increases, but it won't be launching at much higher speeds. You can buy JEDEC standard DDR3-2133 now, and DDR4-2133 is the debut speed. Just like how DDR3 scaled from 1066Mhz to 2133Mhz (DDR3 800 was never commercially viable), DDR4 is planned to scale from 2133Mhz to 4266Mhz.
|
# ? Jun 11, 2013 20:39 |
|
Endymion FRS MK1 posted:Well, its official, the 5ghz FX9590 exists. "Unlocked for overclockers". I wonder how much headroom they have. And since they state that they'll be available this Summer, it could be within a month.
|
# ? Jun 11, 2013 20:51 |
|
|
Alereon posted:You wouldn't be able to use existing products with DDR4, it's not like DDR2/3 where there was some degree of compatibility on the memory controller level, it's a completely different technology. That said, DDR4 should scale to higher memory bandwidth than DDR3 with resulting performance increases, but it won't be launching at much higher speeds. You can buy JEDEC standard DDR3-2133 now, and DDR4-2133 is the debut speed. Just like how DDR3 scaled from 1066Mhz to 2133Mhz (DDR3 800 was never commercially viable), DDR4 is planned to scale from 2133Mhz to 4266Mhz. Well yea, I mean when AMD releases a board and such that will work with it.
|
# ? Jun 11, 2013 20:53 |