|
Nephilm posted:Videocards have thermal thresholds of something stupid like 80-100 C, you really needn't worry about causing damage to the card, but if they get too hot they'll start automatically throttling down (modern CPUs do this as well). Gotcha. I was just curious because I built a new computer to replace my old Alienware laptop, and while the CPU consistently operates around 20ish degrees cooler, the 770 surprised me a bit by occasionally getting up to 81 degrees, where my laptop's card throttles itself at 79.
|
# ? Jun 11, 2014 21:18 |
|
|
|
Spiritus Nox posted:More of a curiosity question, but: I know that you generally want your CPU to stay under ~72 degrees when possible, but I've heard people say not to worry about a graphics card getting up to 85. I was just curious, at about what temperature will you have to start worrying about a GPU degrading more quickly the way you worry about a CPU when it starts getting over 70? GPU designs are typically fine up to 95 C, but typical reference designs and fan settings cool to 80-85 C. I'm not sure what goes into temperature recommendations, exactly. AMD CPUs have a limit of ~62 C, Intel CPUs 68-72 C for desktop chips, GPUs are fine up to 95 C, and the "immediate throttling to avoid immediate damage" temperature is 105 C regardless of manufacturer. Cooler is always better. Balancing temperature with longevity is mostly statistical - you can't stop degradation entirely, but you can make it take long enough that it's unlikely to bother you.
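To put the rough limits from that post in one place, here's a minimal Python sketch - the numbers are just the figures quoted above, not official vendor specs, and the component names and 10-degree "warm" margin are my own illustrative choices:

```python
# Rough temperature guidance from the post above (illustrative figures,
# not official vendor specs -- check your specific chip's documentation).
THRESHOLDS_C = {
    "amd_cpu":   62,   # ~62 C limit cited for AMD CPUs
    "intel_cpu": 70,   # 68-72 C cited for Intel desktop chips
    "gpu":       95,   # GPU silicon cited as fine up to 95 C
}
EMERGENCY_THROTTLE_C = 105  # "immediate throttling" point cited above

def classify(component: str, temp_c: float) -> str:
    """Bucket a temperature reading against the rough limits above."""
    limit = THRESHOLDS_C[component]
    if temp_c >= EMERGENCY_THROTTLE_C:
        return "emergency throttle"
    if temp_c >= limit:
        return "over recommended limit"
    if temp_c >= limit - 10:
        return "warm but OK"
    return "fine"

print(classify("gpu", 81))        # the 770 from the post -> "fine"
print(classify("intel_cpu", 75))  # -> "over recommended limit"
```

Which matches the thread's advice: 81 C on a GPU is nothing to worry about, while the same number on a desktop CPU would be.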
|
# ? Jun 11, 2014 21:26 |
|
Aleksei Vasiliev posted:So I bought one of the cheap R9 290s with blower coolers off eBay without realizing how loud it would be. Factory Factory posted:GPU designs are typically fine up to 95 C, but typical reference designs and fan settings cool to 80-85 C. edit: modern hardware is really pretty difficult to hurt from simple temperatures or soft-mod voltages. To burn poo poo out you gotta start really getting outside the box and overriding all the built-in self-protections that chips have these days. DrDork fucked around with this message at 21:46 on Jun 11, 2014 |
# ? Jun 11, 2014 21:42 |
|
Factory Factory posted:Intel CPUs 68-72 C for desktop chips, This is for the top of the heat spreader. This is not the temp you get from the sensors in your motherboard or RealTemp. The Intel Tcase temp (the ~72 C) is not measurable by consumers.
|
# ? Jun 11, 2014 21:59 |
|
DrDork posted:Most of the current R290(X) cards throttle at 85C, so that's been a pretty solid "try to keep it cooler than this" number simply because otherwise you tank your performance from letting the card ping back and forth between like 800 MHz and whatever your boost clock is. R9s throttle at 95°C, not 85.
|
# ? Jun 11, 2014 22:40 |
|
This all becomes a moot point when you use water
|
# ? Jun 11, 2014 23:15 |
|
veedubfreak posted:This all becomes a moot point when you use water sub 65°C temp buddy.
|
# ? Jun 11, 2014 23:24 |
veedubfreak posted:This all becomes a moot point when you use water Did you top off your leaky reservoir yet?
|
|
# ? Jun 11, 2014 23:25 |
|
fletcher posted:Did you top off your leaky reservoir yet? It's not the res! I think it's one of the lower radiator connections, but it's such a slow leak I can't find the drat thing. But yes.
|
# ? Jun 11, 2014 23:34 |
|
Is it worth it to overclock RAM on a Titan Black? Or is it pointless like it is for the 780?
|
# ? Jun 11, 2014 23:35 |
deimos posted:sub 65°C temp buddy. Is this where I brag about my stock cooler? Animal posted:Is it worth it to overclock RAM on a Titan Black? Or is it pointless like it is for the 780? I got results from overclocking the RAM on my 780 Ti up to a point, then there was no benefit. Past ~7400 MHz I am core clock limited (edit: I'm implying that I would figure a 780 would get results too) Ignoarints fucked around with this message at 23:45 on Jun 11, 2014 |
|
# ? Jun 11, 2014 23:39 |
|
veedubfreak posted:It's not the res! I think it's one of the lower radiator connections, but it's such a slow leak I can't find the drat thing. But yes. I wonder how much it would help if you dyed your fluid a neon color to track it down.
|
# ? Jun 12, 2014 00:29 |
|
deimos posted:R9s throttle at 95°C, not 85.
|
# ? Jun 12, 2014 00:41 |
|
Nvidia GPUs throttle starting at 60 C, but it's only 13 MHz per 10 C.
|
# ? Jun 12, 2014 00:43 |
|
Even the mobile ones?
|
# ? Jun 12, 2014 13:48 |
Factory Factory posted:Nvidia GPUs throttle starting at 60 C, but it's only 13 MHz per 10 C. Really? I haven't noticed that so far.
|
|
# ? Jun 12, 2014 14:30 |
|
I think Nvidia GPUs throttle at 80 C.
|
# ? Jun 12, 2014 16:08 |
|
Sidesaddle Cavalry posted:I wonder how much it would help if you dyed your fluid a neon color to track it down. Good call, I still have some of the neon biocide I bought like 5 years ago left.
|
# ? Jun 12, 2014 16:40 |
|
Yeah, I've never seen my 660 go any higher than 65 C, and GPU-Z insists performance is power-limited, not thermal.
|
# ? Jun 12, 2014 16:40 |
|
It's a turbo bins thing. Temperature goes into the decision of how many bins over base clock to boost, at least in Boost 1.0 (GTX 600 series). Every 10 C is minus one bin. I... I can't actually push my 680 to 60 C to show you a graph. Even FurMark is topping out at 58 C. Damned CLC cooling. But when it was on the stock cooler, you could see the behavior plain as day when overclocking.
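The behavior described in the last couple of posts - 13 MHz bins above base clock, one bin dropped per 10 C starting at 60 C - can be sketched like this. This is just an illustration of the posts above, not Nvidia's actual Boost 1.0 algorithm; the base clock and bin count in the example are hypothetical:

```python
# Hedged sketch of the Boost 1.0 behavior described above: boost steps in
# 13 MHz "bins" over base clock, and one bin is dropped for every 10 C at
# or above 60 C. Illustrative only; not Nvidia's real algorithm.
BIN_MHZ = 13

def boost_clock(base_mhz: int, max_bins: int, temp_c: float) -> int:
    """Effective clock after the temperature penalty described above."""
    if temp_c < 60:
        bins_lost = 0
    else:
        bins_lost = int((temp_c - 60) // 10) + 1  # one bin per 10 C past 60
    bins = max(0, max_bins - bins_lost)
    return base_mhz + bins * BIN_MHZ

# Hypothetical card: 1006 MHz base clock, up to 8 boost bins.
print(boost_clock(1006, 8, 55))  # cool: full boost, 1110 MHz
print(boost_clock(1006, 8, 75))  # two bins dropped, 1084 MHz
```

That gentle slope is why it barely registers in games - a bin or two off the top is a ~1-2% clock change, nothing like the hard throttling at thermal limits.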
|
# ? Jun 12, 2014 16:47 |
|
DrDork posted:Honestly, your cheapest option is simply to resell it and buy one that has a 3rd party cooler. You're going to spend about $100 on any sort of after-market cooling solution. I was going to call FUD on this because I spent about $45 each on my Accelero IIIs to cool my R9 290s. Did these double in price recently or did I luck out on an insanely good deal?
|
# ? Jun 12, 2014 17:12 |
|
Stanley Pain posted:I was going to call FUD on this because I spent about $45 each on my Accelero IIIs to cool my R9 290s. Did these double in price recently or did I luck out on an insanely good deal? Assuming you mean the Accelero Extreme III then yes--they normally go for $75 or so. Take a look at NewEgg Accelero prices to get an idea.
|
# ? Jun 12, 2014 19:51 |
|
Suddenly I don't feel so bad about paying 100 bucks for full cover blocks.
|
# ? Jun 12, 2014 19:52 |
|
DrDork posted:Assuming you mean the Accelero Extreme III then yes--they normally go for $75 or so. Take a look at NewEgg Accelero prices to get an idea. God, are they worth it though. I should move mine from my old card to the new, but I've been lazy. The Accelero Xtreme III dropped my 6970 from 95 C at full load to about 65 C, and it was quieter at load than stock was at idle. Those things are amazing.
|
# ? Jun 12, 2014 20:24 |
Factory Factory posted:It's a turbo bins thing. Temperature goes into making the decision of how many bins over baseclock to boost. At least in Boost 1.0 (GTX 600 series). Every 10 C is minus one bin. It might be a 1.0 thing, because on my 770s the clock speed never dropped even up to 90 degrees. For whatever reason the top, hot card would actually run one 13 MHz increment higher at almost all times too. I can touch 70 degrees with my 780 Ti though at least, and it doesn't drop either.
|
|
# ? Jun 12, 2014 22:49 |
|
DrDork posted:Assuming you mean the Accelero Extreme III then yes--they normally go for $75 or so. Take a look at NewEgg Accelero prices to get an idea. Yeah, that's the one. $45 each was a drat good deal then. My 290s never get above 60-65 C.
|
# ? Jun 13, 2014 00:01 |
I downloaded the latest Nvidia driver two days ago to do a true clean install (I've "stacked" a ton of these; DDU showed an incredible number of video cards reported), but I guess it was corrupted because it didn't unpack successfully. Cue two days of BSODs and headaches, and eventually complete driver failure every time I tried to run a game. Re-downloading and clean-installing a fresh, happy copy didn't help at all; in fact it failed most of the time halfway through, although past the actual driver itself. What a god drat headache. The bad driver didn't even install, it was just unpacking the file and failed. I simply could not believe that was the cause, but the numerous BSODs pointed towards driver failures, and eventually I'd just get "bf4.exe image failed" or "titanfall.exe image failed" and a message that the nvidia driver was not compatible with Windows 7 (?? I have Windows 8). Also the phrase "Test Mode" appeared on my desktop. I just reinstalled Windows and all is well. I can't believe I forgot to back up the minidumps, so now I can't even point out the specific things that failed. Has anyone ever heard of this? I got the driver from what appeared to be their website, but I'm just not sure anymore. I noticed afterwards that the icon for the file was blank, instead of the green logo.
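A corrupted download like that can usually be caught before installing by hashing the file and comparing against a published checksum. A minimal sketch, assuming the vendor publishes a SHA-256 for the installer (the filename and the commented-out expected value are hypothetical):

```python
# Hedged sketch: hash a downloaded installer so a truncated or corrupted
# download can be caught before you ever run it.
import hashlib

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# expected = "..."  # checksum from the vendor's download page, if published
# if sha256_of("nvidia-driver.exe") != expected:
#     print("corrupted download - fetch it again before installing")
```

Even without a published checksum, hashing two downloads of the same file and comparing the digests would have flagged the bad copy immediately.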
|
|
# ? Jun 13, 2014 02:40 |
|
Stanley Pain posted:Yeah that's the one. $45 each was a drat good deal then edit: Nevermind - looks like the Tahiti/Hawaii die sits recessed below the frame so the old cooler won't work without this shim which probably means performance would be worse than the stock cooler. Good thing I read up on that before trying to install it. future ghost fucked around with this message at 07:28 on Jun 13, 2014 |
# ? Jun 13, 2014 04:55 |
|
drat it, I got all excited because I remembered a friend of mine was mining dogecoins and had some cards he wanted to sell... but they're only R9 280X. I kinda wanted a 290 for FreeSync. Still, he says I can have them for $170 each. I know for a fact he ran them undervolted for 5 months total and that was the only use. He has the original boxes and the Amazon receipts, so if I have them put back to stock there's no way Gigabyte wouldn't honor the warranty, right?
|
# ? Jun 13, 2014 15:56 |
|
Some obvious takeaways from E3 after-show chatting with some industry buddies of mine.
- XBO sucks more or less due to its memory bandwidth limit. If a dev is working on both, the XBO will collect dust for a lot of staff, as they've only got enough time and brain power to deal with one build, the PS4. I'm sure there are people who will be specifically tasked to make the XBO build 'good enough', but the majority of the studio just couldn't give a poo poo about the XBO build.
- 8GB is a lot of memory. Devs are having 'trouble' filling it on the first round of games, but I'm assured it will be fully utilized within the next 18 months/next round of games. Basically, some of the initial release titles might not be fully taxing the systems, but everyone knows that already.
- Related: art assets seem to be growing in density slowly. Even though 512MB => 8GB is a huge jump, the devs are taking it slow for various reasons, like disruption to the workflow, old ported engines, and inadvertently overshooting the '8GB' cap.
- Lots of engines are just ported from last gen with some add-ons and also some cleanup. No real time to write a truly new next-gen engine for most studios. I'm sure some are, but most aren't, especially the big titles with less dev time, hard dates, and shareholders.
- Some MS/affiliate buddies are angry that MS forced the Kinect bundle for so long yet had no real push for Kinect games/functionality during the years/months leading up to the XBO release.
|
# ? Jun 13, 2014 19:02 |
|
Shaocaholica posted:Some obvious take aways from E3 after show chatting with some industry buddies of mine. Weren't devs required to use Kinect at first, like put something in the game? Like the head leaning in BF4.
|
# ? Jun 13, 2014 19:06 |
|
Zero VGS posted:Still, he says I can have them for $170 each. I know for a fact he ran them undervolted for 5 months total and that was the only use. He has the original boxes and the Amazon receipts, so if I have them put back to stock there's no way Gigabyte wouldn't honor the warranty right?
|
# ? Jun 13, 2014 19:39 |
|
Don Lapre posted:Weren't devs required to use kinect at first, like put something in the game. Like the head leaning in BF4. From what little I was told, this was either nonexistent or a really, really soft request. I mean, what is MS supposed to do? Slap their hands? Not allow the title to ship?
|
# ? Jun 13, 2014 19:52 |
|
It was probably more a "hey we'll give you some cash if you use the Kinect." There certainly was a time at the end of the XBox 360 cycle where games were tossing in craptastic token Kinect "functionality," and it sure as hell wasn't because it was good for gameplay.
|
# ? Jun 13, 2014 20:11 |
|
I pitched to my friend at Infinity Ward to have some Kinect-based metric for game rage and rage quitting, and to monetize that in-game. Like pay $25/match to see how enraged your opponent is after a death. Like a voice-to-text transcription of what they yelled (for privacy reasons). Of course there would need to be a waiver to release that on the other end, but the incentive might be half off your own fee or something like that. shaocaholica [gun icon] drdork ("What the gently caress", throws controller)
|
# ? Jun 13, 2014 20:29 |
|
http://www.twitch.tv/twitch/b/537807660?t=2h58m45s "You guys can't say this, but [Planetside 2] is going to look better on the PS4." "It's true. We're able to custom build our shaders and our whole pipeline... ultra settings on PC, ultra settings is different for a guy who bought a $600 videocard last year from someone who bought a $1000 this year..."
|
# ? Jun 13, 2014 20:39 |
|
My takeaway from your experiences, Shaocaholica, is that pretty much everything we thought sucked about the XBox One will continue to suck, and Microsoft was nuts for pulling a 360 version two instead of building a better end-to-end memory architecture - I mean, programming for the PS4 may involve a sort of moderately wonky OpenGL thing, but it's at least familiar, and the process now must be basically translating the build there to the XB1 rather than doing anything new or cool. Which is a shame, honestly, if you think about it; Microsoft brought absolutely nothing cool to the table this gen. It's just a shittier PS4 as far as games are concerned, and it doesn't have any dramatically new or cool multimedia features, so it's just bare-minimum competitive there. Their IP isn't without merit or anything, people are still going to play games that only MS brings out, but Sony has some killer-looking IP coming too.

Also, who nailed it with regard to "we don't know how resource utilization will look on the consoles coming up because we're still way too early for that to be determined"? Seriously, thank you for asking - it confirms and complicates some things. It's a shame that Microsoft's approach sucks, but at least we sorta suspected that going in - you don't have real memory bandwidth if most of your memory bandwidth comes from DDR3; just having a fast lane for a very small portion isn't nearly as good as having way higher bandwidth end to end! Just plain isn't!

Also, what the poo poo, half the ROPs/related stuff? Yeah, we're in the ERA OF SHADERS echo echo echo, but they're still going to be operating on primitives, and it can put out slightly over half of what the PS4 can there. That seems so backwards rear end I can't even begin to think what the hell they were intending to do there. Maybe an XBox One Two will be coming out to fix the hardware deficit at some point. Ugh.
|
# ? Jun 13, 2014 21:07 |
|
Nephilm posted:http://www.twitch.tv/twitch/b/537807660?t=2h58m45s "...both of them can run ultra settings, but that new video card now is still going to be able to pull more." I don't think there's anything wrong with saying a newer more expensive video card can run a game on ultra settings better than an older card. Hasn't PS2 had problems with running like poo poo since the beginning even on high spec PCs? I mean, if they made a janky PC game, maybe they're right that they can make it smoother on a console . Like when PCs get shoddy console ports, but in the wrong order.
|
# ? Jun 13, 2014 21:27 |
|
PS2 is a rare mixture of being very demanding of both GPU and CPU
|
# ? Jun 13, 2014 21:29 |
|
|
|
teh_Broseph posted:"...both of them can run ultra settings, but that new video card now is still going to be able to pull more." I don't think there's anything wrong with saying a newer more expensive video card can run a game on ultra settings better than an older card. Do you mean PS2 emulation? It's a totally different way of doing things compared to the x86/x64 OSes that you see emulators created for, and it involves trying to sync up two proprietary processors in a kind of black-box manner... The general rule for emulation has always been that it takes roughly ten times the processing power to emulate something as it does to actually do it, and measuring processing power is difficult when you step outside of the world of raw FLOPs. I don't see what that really has to do with anything?
|
# ? Jun 13, 2014 21:32 |