|
Factory Factory posted:The Mod is now officially done. Holy poo poo! I knew you were human! And that just looks lovely! I'm not sure if I'd go for anything quite that intensive, but drat if those results don't have me thinking about it.
|
# ? Dec 27, 2013 05:12 |
|
Factory Factory posted:The Mod is now officially done. I find your lack of radiator space disappointing :P [timg]http://i.imgur.com/8Fsn0BQ.jpg[/timg]
|
# ? Dec 27, 2013 08:22 |
|
Michymech posted:I find your lack of radiator space disappointing :P I find your case over-large and inconvenient. What are you doing with that, housing a vagrant?
|
# ? Dec 27, 2013 08:28 |
|
You've made a grave error. Once you go aftermarket you'll never be able to use stock cards again. Made that mistake years ago and now every card upgrade is an adventure in thermal tape and copper. Amazing how a stock blower seems bearable until it suddenly isn't anymore.
|
# ? Dec 27, 2013 08:33 |
|
Factory Factory posted:I find your case over-large and inconvenient. What are you doing with that, housing a vagrant? Well, not quite. ATM it has a 480mm rad in the top and a 360mm rad in the bottom that will be cooling the pair of MSI Lightning R9 290Xs I plan to get.
|
# ? Dec 27, 2013 08:51 |
|
Michymech posted:Well not quite, It has ATM tho a 480mm rad in the top and a 360mm rad in the bottom that will be cooling the pair of MSI lighting r9 290xs I plan to get PFFFFFFFTTTTHHHHHH, you can do that way smaller and still quiet, like me and El Scotch are doing (well, he's done, I am waiting on parts). Viva Arc Mini R2. Mind you, my SG10 fits in its entirety inside the R2. So it DOES fit a vagrant.
|
# ? Dec 27, 2013 13:05 |
|
I want the mod so bad!!! Seriously there is no way to get that bracket??
|
# ? Dec 27, 2013 13:10 |
|
Animal posted:I want the mod so bad!!! Seriously there is no way to get that bracket?? NZXT sells them for $30, VRM fan included, in white, black, and red.
|
# ? Dec 27, 2013 18:04 |
|
deimos posted:PFFFFFFFTTTTHHHHHH, you can do that way smaller and still quiet, like me and El Scotch are doing (well, he's done, I am waiting on parts). Viva Arc Mini R2. Ha, you wouldn't be able to fit a 480 in an Arc Mini. My case is the Phanteks Enthoo Primo. Sure, it's rather large, but I love the room you have to work with.
|
# ? Dec 27, 2013 18:55 |
|
Michymech posted:Ha you wouldnt be able to fit a 480 in a arc mini, my case is the Phanteks Enthoo Primo Sure its rather large but I love the room you have to work with Your apples are inferior to my oranges.
|
# ? Dec 27, 2013 19:16 |
|
El Scotch posted:Your apples are inferior to my oranges. Sorry, I can't hear you over the 13 fans I have in my case.
|
# ? Dec 27, 2013 19:20 |
|
Michymech posted:Ha you wouldnt be able to fit a 480 in a arc mini, my case is the Phanteks Enthoo Primo Sure its rather large but I love the room you have to work with Anything is possible if you try (a Phobya 400mm radiator in an NZXT H630). Two 200mm fans are literally inaudible.
|
# ? Dec 27, 2013 19:45 |
|
Michymech posted:Sorry I cant hear you over the 13 fans I have in my case If I wanted to hear fans I'd stick with air cooling.
|
# ? Dec 27, 2013 20:03 |
|
I can't seem to find an answer to whether or not "The Mod" or its variants (including this ready-made NZXT version) have clearance for a dual SLI setup. I imagine custom builds allow more variance than that.
|
# ? Dec 27, 2013 20:05 |
|
KakerMix posted:I can't seem to find an answer to whether or not "The Mod" or it's variants (including this ready-made NZXT version) have clearance for a dual SLI setup. I imagine custom builds allow more variance than that. 3-slot clearance, hell yes. 2-slot clearance... It does if 1) you watch your bracket clearance and 2) you use a slim VRM fan like a Noctua NF-B9. Here's a guy with 5-slot total spacing, two GPUs, and a sound card(?) in between (image from Overclock.net's The Red Mod thread, for AMD cards). He does the variation where he leaves the stock fan and heatsink baseplate. There are more examples in the Nvidia thread. True 2-slot clearance absolutely requires the thin fan and chopping off the non-functional part of the NZXT G10. It would also really benefit from VRAM/VRM heatsinks, because the cards are in such close proximity. Using a thick fan gets you something that fits, but the fan can't get any air.
|
# ? Dec 27, 2013 20:16 |
|
Factory Factory posted:3-slot clearance, hell yes. 2-slot clearance... It does if 1) you watch your bracket clearance and 2) you use a slim VRM fan like a Noctua NF-B9. Thank you for the information. I also found this: which looks like it would work for my needs out of the box. Now if only I had gotten a case that supports 140mm fans...
|
# ? Dec 27, 2013 20:22 |
|
I'm still trying to find a 360mm fan to replace the 9 fans on my front rads. Anyone know of anything good?
|
# ? Dec 27, 2013 21:19 |
How come graphics cards don't have a replaceable GPU and RAM slots like a motherboard does?
|
|
# ? Dec 28, 2013 01:23 |
|
They'd take up too much space to fit on an expansion card.
|
# ? Dec 28, 2013 02:12 |
|
There isn't really a need for it anyways. You'd have to upgrade the board any time you wanted a more powerful GPU (or at least 99% of the time) and you'd want to get new memory to match the new GPU.
|
# ? Dec 28, 2013 02:51 |
|
veedubfreak posted:I'm still trying to find a 360mm fan to replace the 9 fans on my front rads. Anyone know of anything good? I'm sure you could make it work.
|
# ? Dec 28, 2013 03:12 |
|
Looks like Amazon has the R9-290 back in stock at not so ridiculous prices, $513.
|
# ? Dec 28, 2013 04:10 |
|
deimos posted:Looks like Amazon has the R9-290 back in stock at not so ridiculous prices, $513. Still a 25%+ markup. Garbage.
|
# ? Dec 28, 2013 04:25 |
|
I missed the Ambilight clone discussion; don't all commercial solutions use FPGAs? Nothing else will be OS/driver/PC-independent I assume. I take it the real TVs that can do it take obvious advantage of the fact they have access to the raw display data internally. Wonder if it's worth Kickstarting...
|
# ? Dec 28, 2013 04:35 |
So I just had the most aggravating problem and felt like posting about it. I wasn't going to go to the effort to post in HOTS even though I was seriously about to try a clean Windows 8 install on a spare SSD, and then I accidentally fixed it in an even dumber way and it kinda belongs here I guess.

I'm using W7 with a 27" on dual DVI and a run of the mill 23" 1080p on HDMI. Last night I was puttering around, think I played PS2 for a bit, then played some BF4 (which is loving beautiful at >60fps and 1440p btw!), which hard locked my PC after about 30 seconds. Weird, whatever. Restart, immediate BSOD due to atikmdag.sys, PAGE_FAULT_IN_NONPAGED_AREA... again and again, nothing fixes it.

Safe mode is fine. I use multiple apps and check the registry myself to clean out all ancient traces of any AMD/ATI software, I try installing old drivers, I try with and without CCC installed (since it's a piece of crap and going without it fixed this issue for some people). Had a useful restore point from just a couple days earlier so I tried that, but still no dice. I give up after about 5 hours of fuckery since I had an appointment in the morning.

I come back to look at it again just now, BSODs of course. I'm running out of ideas so I disable the (Sandy Bridge) iGPU in my BIOS and what the gently caress, everything starts up no problem, albeit with some rough edges (have to reset the 27" monitor as primary, move desktop icons back, fix underscanning on the monitor connected via HDMI, etc).

Then I realized that a day or two ago (but likely with at least one successful restart in between), for whatever reason I moved the HDMI monitor from the onboard connector to a video card HDMI connector... not sure why, maybe an experiment to see if I could drag windows across monitors more smoothly while playing video or something. So the "solution" makes sense in a roundabout Windows 95-style fuckery way, but anyone have any idea why?

I imagine I could re-enable SB video and move the HDMI monitor back to the onboard connector, but it's not really important; Virtu is kind of a bloated piece of crap and I'm using a pair of 290s anyway so power management is kind of a joke. I just don't get why it was crashing - if my video cards were "used" to the other monitor belonging to onboard video, then if anything it should have just not worked (and/or plugging it into the video card should have eliminated issues, not caused them). Why the gently caress were they trying to write to memory reserved for onboard video, or whatever was going on there, after the monitor was explicitly given to the GPU? And HDMI is pretty friendly, I mean even if I just moved that monitor from one connector to another while the PC were on, I imagine it would've gotten picked back up and kept working fine. This is just really weird, I haven't had any pants on head retarded Windows issues like this in probably a couple years.

Factory Factory posted:3-slot clearance, hell yes. 2-slot clearance... It does if 1) you watch your bracket clearance and 2) you use a slim VRM fan like a Noctua NF-B9

movax posted:I missed the Ambilight clone discussion; don't all commercial solutions use FPGAs? Nothing else will be OS/driver/PC-independent I assume. I take it the real TVs that can do it take obvious advantage of the fact they have access to the raw display data internally.

Straker fucked around with this message at 05:44 on Dec 28, 2013 |
|
# ? Dec 28, 2013 05:34 |
|
movax posted:I missed the Ambilight clone discussion; don't all commercial solutions use FPGAs? Nothing else will be OS/driver/PC-independent I assume. I take it the real TVs that can do it take obvious advantage of the fact they have access to the raw display data internally. Probably not, since Philips has US patents. Straker posted:This silly "the mod" stuff is just shoehorning a mass market CPU waterblock onto a GPU, right? Surely there are CPU blocks with the barbs on the side, that would help a lot with clearance especially if you aren't using thin hoses. The block/pump assembly isn't the problem; it's the clearance of the decorative part of the mounting bracket and the extra fan you install to cool VRMs and whatnot that the CPU block doesn't cover. Factory Factory fucked around with this message at 05:51 on Dec 28, 2013 |
# ? Dec 28, 2013 05:45 |
|
Currently have a Sapphire Radeon 5850 that works fine for the most part, but recently I've been dealing with a memory leak in Battlefield 4 that some of the community has tracked down to ATI drivers. ATI support has been pretty lovely about the whole thing, and I'm fed up with them to the point where I'm ready to get a new Nvidia card. Been eyeing the 770 4GB/780 based on the PC parts picking thread, but after reading through this thread AMD/ATI seems to have the lead in price/performance with the R9-290. Should I get over my issues with AMD and look again at their offerings? I'd really like to stay away from them, as I also have an issue to this day when using two monitors; it produces a flicker on the primary monitor when overdrive is enabled, which I can't believe has not been fixed yet, but am willing to go back to them if I can get a better card for the money. I don't plan on overclocking it myself and highly doubt I'd be using more than a single monitor at 1920x1200, if that makes a difference. I'm also willing to hold off a month or so if it looks like prices may go down due to competition or new cards in the works.
|
# ? Dec 28, 2013 06:01 |
You mean overdrive as in the CCC overclocking options, or pushing pixels to change color faster? Because if you mean the former, there shouldn't be any flickering or anything unless you're OCing the RAM too far... and if you mean the latter, if your monitor needs it, it probably already does it in hardware.

I've been using dual monitors for years and the last Nvidia card I owned was an 8800GTS which died in a hurry. I just buy whatever offers the best price:performance at the highish part of the curve, and usually at a huge discount since I've been able to avoid needing a new GPU right when I build a new PC. If it's not an emergency you're probably best off waiting and seeing if 290s drop back to $400 in the next month or so (the altcoin mining frenzy is just about over but a 290 isn't worth it at like >$450), or maybe even waiting for new Nvidia cards; there will allegedly be some consumer offerings within a few months.

edit: I assume you've already tried different drivers?
|
|
# ? Dec 28, 2013 06:20 |
|
Factory Factory posted:Probably not, since Phillips has US patents. Oh, well that sucks. Maybe do it in a way where it just happens to be a device that sits in line and outputs color data...what you do with it is up to you. Could be a fun project!
|
# ? Dec 28, 2013 06:30 |
|
deimos posted:Looks like Amazon has the R9-290 back in stock at not so ridiculous prices, $513. Magic Underwear posted:Still a 25%+ markup. Garbage. It's just straight bonkers. I could probably sell my unlocked, custom-cooled 290(X) for a nice chunk of change, but then I'd be back at square one for the third time in as many months.
|
# ? Dec 28, 2013 06:38 |
I actually considered flipping my 290s as soon as I got them, since I haven't been making as much use of them as expected (too much other stuff to catch up on), but the prices aren't really worth it any more unless you can flip them locally for cash. Which isn't really anything to be upset about, since it'll be nice being able to recommend custom-cooled 290s once they're down to like $420. I feel bad for anyone who "has" to have a high end build right now; four figures for two 780s or better is kinda ridiculous (whereas $800 for a pair of 290s is totally reasonable).
|
|
# ? Dec 28, 2013 06:44 |
|
This probably goes here instead of the laptop or OC threads: is there a way to undervolt or at least disable turbo on my GTX 660M? It throttles down to 720MHz in an hour or two under load and I would much rather fiddle with it by hand. I guess dusting the intake/fan is in order, but gently caress turbo anyway.
Arzachel fucked around with this message at 15:49 on Dec 28, 2013 |
# ? Dec 28, 2013 14:11 |
|
movax posted:Oh, well that sucks. Maybe do it in a way where it just happens to be a device that sits in line and outputs color data...what you do with it is up to you. Could be a fun project! That was what I was pushing: grab an FPGA board with HDMI tx/rx and enough ram to hold a frame, then output the color info on GPIO pins inline. HDCP is doable too, but you need to crack it manually.
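The color-extraction step such a board would implement can be prototyped in plain software first. Here's a minimal sketch of the idea, assuming a frame is just a nested list of RGB tuples; the zone count, border depth, and frame size are made-up illustration values, and an FPGA would do the same accumulation in hardware as pixels stream through:

```python
# Toy Ambilight-style color extraction: average the border zones of a
# frame into one RGB value per LED. The frame is a height x width nested
# list of (r, g, b) tuples; values here are arbitrary for illustration.

def edge_zone_colors(frame, leds_per_edge=10, border=40):
    width = len(frame[0])
    zone_w = width // leds_per_edge

    def avg(pixels):
        # Integer-average each channel over all pixels in the zone.
        n = len(pixels)
        return tuple(sum(p[c] for p in pixels) // n for c in range(3))

    colors = []
    for i in range(leds_per_edge):
        # Collect a border-pixels-tall strip of the top edge for LED i.
        zone = [frame[y][x]
                for y in range(border)
                for x in range(i * zone_w, (i + 1) * zone_w)]
        colors.append(avg(zone))
    return colors  # bottom/left/right edges work the same way

# A solid red test frame: every top-edge LED comes out red.
frame = [[(255, 0, 0)] * 160 for _ in range(120)]
print(edge_zone_colors(frame, leds_per_edge=4))
```

In a real inline device the averaged colors would then be clocked out over GPIO to the LED strip, which is the part the FPGA makes display-source-independent.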
|
# ? Dec 28, 2013 14:31 |
|
Malcolm XML posted:That was what I was pushing: grab an FPGA board with HDMI tx/rx and enough ram to hold a frame, then output the color info on GPIO pins inline. If I knew a drat thing about how to accomplish it, there are cheap color timing devices that seem to already have HDMI throughput, HDCP, and whatnot implemented. But hacking them is probably not something I could pull off in a reasonable time.
|
# ? Dec 28, 2013 15:33 |
|
Factory Factory posted:The block/pump assembly isn't the problem; it's the clearance of the decorative part of the mounting bracket and the extra fan you install to cool VRMs and whatnot that the CPU block doesn't cover. I honestly don't get why a company would make something like that with such an odd design. If it was simply to throw their brand name on, they could put it anywhere on the bracket without needing that goofy part that sticks out and takes up room. Factory Factory had the right idea in cutting that little part off; it probably would make it easier to use a 2nd card if he didn't have a smaller case that was already kinda cramped. I'm thinking with the VRM cooling part and the choice of fans, the best route for anyone with dual cards would be the slim fans and a case with at least one side panel fan that can blow in right over the cards. I'm still using my Antec 300 Illusion case and that single side 120mm intake is a blessing, especially a while back when I was running 2 GTX 465 cards in SLI and there wasn't much room between them for airflow.
|
# ? Dec 28, 2013 21:11 |
|
Arzachel posted:This probably goes here instead of the laptop or OC threads, is there a way to undervolt or at least disable turbo on my GTX660m? It throttles down to 720mhz in an hour or two under load and I would much rather fiddle with it by hand. I guess dusting the intake/fan is in order, but gently caress turbo anyway. You should be able to flash the BIOS to do that, but it's a bad idea. You can raise the power target in EVGA Precision to eliminate power throttling, but if it's temperature related you really want it throttling instead of burning itself up. At least on desktop boards there are over 40 different power states with different voltages and clock speeds depending on the temperature and GPU load. Removing those states will give you worse overall performance and dramatically worse battery life. The Boost implementation on the Kepler GPUs is possibly one of the best power/temperature/performance management systems ever implemented, and it's almost always a better idea to figure out what's causing it to throw a red flag and fix the problem instead of disabling Boost.
|
# ? Dec 28, 2013 21:14 |
|
craig588 posted:You should be able to flash the bios to do that, but it's a bad idea. You can raise the power target in EVGA Precision to eliminate power throttling, but if it's temperature related you really want it throttling instead of burning itself up. At least on desktop boards there are over 40 different power states with different voltages and clock speeds depending on the temperature and GPU load. Removing those states will give you worse overall performance and dramatically worse battery life. The Boost implementation on the Kepler GPUs is possibly one of the best power/temperature/performance management systems ever implemented and it's almost always a better idea to figure out what's causing it to throw a red flag and fix the problem instead of disabling Boost. It's temperature; the GPU throttles to stay below 93°C, which is fine, but the thing would be stable at near-idle voltages with a slight underclock. The laptop (Lenovo's Y580) is under a year old so I'm not sure it's dust. I'll probably have to either RMA it or refit the heatsink myself.
|
# ? Dec 29, 2013 02:37 |
|
Try giving it a lower power target. Once you get to a point where it doesn't hit temperature limits you can try raising the offset clock. The GPU can adjust voltages in 12 mV steps anywhere from 0.98 V to the maximum defined in the BIOS (usually 1.175 V on desktop boards, but it might be lower on laptops). By giving it a lower power target it will lower clock speeds and voltages under high-TDP scenarios further than it already does. Once you get to a point where it's not thermally throttling, you can try raising the offset clock with the new lower power target to gain back speed you might have lost under lower load conditions. Power target is everything on Keplers. You can make the same board draw 50 watts or 250 watts just through power target adjustment (with a corresponding hit to speed as well).
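For anyone curious what power-target-limited state selection looks like, here's a toy model in Python. It is not the actual Boost algorithm: only the 12 mV step and the 0.98 V to 1.175 V range come from the post above, while the clock ladder and the power estimate are invented purely for illustration.

```python
# Toy model of power-target-limited clock/voltage selection, loosely
# after the Kepler behavior described above. The 12 mV step and the
# 0.98 V floor / 1.175 V ceiling come from the post; the clock ladder
# and the crude power formula are made up for illustration.

def build_states(v_min=0.980, v_max=1.175, step=0.012,
                 clk_min=705, clk_step=13):
    """One (voltage, MHz) entry per 12 mV step up the ladder."""
    states, v, clk = [], v_min, clk_min
    while v <= v_max + 1e-9:
        states.append((round(v, 3), clk))
        v += step
        clk += clk_step
    return states

def pick_state(states, power_target_w, base_power_w=25.0):
    """Highest state whose rough P ~ f * V^2 estimate fits the target."""
    v0, clk0 = states[0]
    best = states[0]
    for v, clk in states:
        est = base_power_w * (clk / clk0) * (v / v0) ** 2
        if est <= power_target_w:
            best = (v, clk)
    return best

states = build_states()
# Lowering the target walks the chosen state down the ladder.
print(pick_state(states, power_target_w=40.0))
print(pick_state(states, power_target_w=30.0))
```

The point the model illustrates is the one from the post: dropping the target forces lower voltage/clock pairs under load, which is why it's the first knob to turn when chasing thermal throttling.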
|
# ? Dec 29, 2013 06:30 |
|
Just as a heads up, Anandtech's updated their GPU Bench for 2014. 780 Tis, R9 290s, and everything in between!
|
# ? Dec 29, 2013 06:34 |
Ghostpilot posted:Just as a heads up, Anandtech's updated their GPU Bench for 2014. 780 TI's, r9 290's and everything in between! This benchmark is garbage! They only test the 290X and the 780 Ti in situations where the 290X has a clear advantage, like raising the resolution or enabling more features! Coming from that, you'd get the opinion that for about twice the noise, you can get pretty much the same performance, or slightly better, for nearly $300 less! Oh wait, that's accurate poo poo gently caress
|
# ? Dec 29, 2013 07:46 |