|
Sounds a little like my 2500K in an ASUS P8Z68-V LX. I looked at doing voltage overclocking but most of the sources I could find seemed to be saying that on Sandy Bridge going over 1.4V is a bad idea for the lifespan of the CPU. At "stock" voltage settings and 4.4GHz I am seeing a vCore of ~1.39V at load, so I think I'm at the end of the line already unless I want to risk having to hit up eBay for a new proc.
|
# ? Jul 12, 2016 15:20 |
|
|
|
~1.4v seemed like it was unreasonable for my 1st-gen i7 (45nm). But then, I ended up running my i7-875K at 1.44-ish 24/7 in order to achieve 4.2GHz anyway. And it was fine, for years and years... in fact, it can still do that voltage/clock combination just fine today. A buddy of mine tried a similar thing: he had a Xeon version of the i7-920 (don't remember the exact Xeon model#, but it was literally a server-branded 920), and he ran it at 4.4GHz @ ~1.42v. Initially it was fine, but it started to degrade fast. After a couple months, it was no longer stable at 4.4. So he dropped it down to 4.3, but after a while it got wonky at that clock also. Eventually, it couldn't even do 4.2, so he gave up and set it to a more conservative 4.0GHz @ 1.2?v. Meanwhile, mine kept on trucking at 4.2GHz @ 1.44v. The difference? My buddy had a heatpipe tower cooler, while I had a custom loop, which was keeping my 875K as cool as a cucumber: it didn't really go above 55°C on a typical load. His was hitting the low 80s. Now that I've replaced my Lynnfield with a Broadwell-C, I actually still find myself running my i7-5775C at about 1.4v. It probably doesn't even need that voltage to do 4.2GHz (which is about the most it will ever do; you could give it 1.5v if you wanted to, and it wouldn't really make a difference; these things are terrible overclockers). Anyway, it's been fine so far, although I've only had it for about 4 months.
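For what it's worth, the physics behind those degradation anecdotes: CMOS dynamic power scales roughly with f·V², so a "small" vcore bump is a disproportionately large heat increase. A quick sketch of the rule of thumb (it ignores leakage and static power, and the voltage/clock figures below are just the anecdote's numbers, not measured data):

```python
def dynamic_power_ratio(v_old, f_old, v_new, f_new):
    """CMOS dynamic power scales roughly with f * V^2 (rule of thumb;
    ignores leakage), so small voltage bumps cost a lot of extra heat."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# e.g. going from 4.0GHz @ 1.20v to 4.4GHz @ 1.42v, as in the Xeon story:
ratio = dynamic_power_ratio(1.20, 4.0, 1.42, 4.4)
print(f"~{ratio:.2f}x the dynamic power")  # roughly 1.54x
```

A 10% clock bump plus an 18% voltage bump is over half again the heat, which is why the tower cooler lost and the custom loop didn't.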
|
# ? Jul 12, 2016 17:25 |
|
Ak Gara posted:My 5 and a half year old Corsair H100 (not even H100i) is starting to make grinding noises. I haven't gotten the grind yet, but the naming and specs of all their new stuff is confusing as hell. Why they can't just have an H## numbering where a bigger number means bigger/better, instead of all this GT/GTX/###i Bullshit, is beyond me. Let me know what you get, as I was looking to fit a H115/H120 in my R500 since it can supposedly fit the bigger rad over the H100, pending my RAM/MB clearance. NihilismNow posted:Maybe it is my motherboard but mine did 1.41 volt out of the box (6700k + Asrock Z170m Extreme 4). I have to run a -0.08 voltage offset to keep temps sane. It does 4.5 ghz stable at that voltage (1.33). At "stock" voltage it just cooks itself. Most of the time it is only running 0.7-0.8 volt. Hmm, I know ASRock has always been a bit aggressive with voltages, but man, I don't remember having to use 1.4V+ solid on anything since a 2600K at 5GHz. Even my Q9550 at 3.84GHz (a 1GHz OC) only needed 1.35v at max load. :/ I know everyone says the newest stuff is bad at OC'ing, but drat if running the same OC voltages as 4+ generation old hardware seems pretty crazy/odd to me. Guess I need to get my hands on some new chips and experiment some. I have a G3258 that's at 4.4GHz with stock voltages (1.2ish I think, I'll have to check), but I may get an i7-4790K for cheap to replace it and let that HTPC be a backup VR system in another case with my 980Ti in the future.
|
# ? Jul 12, 2016 19:23 |
|
NihilismNow posted:Maybe it is my motherboard but mine did 1.41 volt out of the box (6700k + Asrock Z170m Extreme 4). I have to run a -0.08 voltage offset to keep temps sane. It does 4.5 ghz stable at that voltage (1.33). At "stock" voltage it just cooks itself. Most of the time it is only running 0.7-0.8 volt. Mine was similar on default "auto" voltages, Asus z170i. It set a bunch of other voltages way high as well, like system agent / VCCIO, etc. I fired it up at 4.4GHz, auto everything, for the first boot after an install, and it was giving the chip 1.45v vcore. Due to a buggy BIOS (I think), offset voltage wasn't working - adaptive mode did - just when I was regretting the whole thing and wondering if my Sandy Bridge would do 5GHz with this cooler. That was a lovely bug. Wonder if this 14nm process is going to improve much? GRINDCORE MEGGIDO fucked around with this message at 20:57 on Jul 12, 2016 |
# ? Jul 12, 2016 19:48 |
|
EdEddnEddy posted:I haven't gotten the grind yet, but the naming and specs of all their new stuff is confusing as hell. Why they can't just have a H## that is smaller-greater for bigger/better instead of all this GT/GTX/###i Bullshit is beyond me. Let me know what you get as I was looking to fit a H115/H120 in my R500 since it can fit the bigger rad over the H100 supposedly, pending my Ram/MB clearance. I ended up going for a H110i GT from Amazon. The CW-9060019-WW version. It should fit in my Corsair 550D okay. I don't know about the R500 (500R?) though.
|
# ? Jul 13, 2016 09:03 |
|
Ak Gara posted:I ended up going for a H110i GT from Amazon. The CW-9060019-WW version. It should fit in my Corsair 550D okay. I don't know about the R500 (500R?) though. If you are so inclined, take some pics and let me know how you like it sound/performance wise. My H100 still works, but that 3930K can get toasty in the summer if I have it at anything less than Max Cooling performance mode, which spins up the fans if the CPU does anything but idle. (Only hits 70C max under 100% loads though.) Any lower mode, though, and Ripbot can bring it close to 80C at times. Not dangerous, but still hot.
|
# ? Jul 13, 2016 23:18 |
|
EdEddnEddy posted:If you are so inclined, take some pics and let me know how you like it sound/performance wise. My H100 still works, but that 3930K can get toasty in the summer if I have it at anything less than Max Cooling performance mode which spins up the fans if the CPU does anything but idle. (Only hits 70C max under 100% loads though) Any lower mode though and Ripbot can bring it close to 80C at times. Not dangerous, but still hot. Will do. How long have you had your H100? I actually had to replace the thermal paste on mine after 4 years as my temps were slowly climbing. I don't know if it was due to the pre-applied paste the H100 came with, but this time I'm gonna be using some good ol' MX-4. I'm going to rerun Cinebench 11.5 and see if I can beat my old 2500k's 7.88 score!
|
# ? Jul 14, 2016 04:37 |
|
Ak Gara posted:Will do. How long have you had your H100? I actually had to replace the thermal paste on mine after 4 years as my temps were slowly climbing. I don't know if it was due to the pre-applied paste the H100 came with but this time I'm gonna be using some good ol' MX-4. Hmm. I got it in 2011 after I won my 500R and put my C2Q 9550 setup into it, but I removed and replaced it with my 3930K setup in 2012, and I think I've repasted it once since then, in like 2013/14, too. Using IC Diamond, I believe, which shouldn't need repasting as soon as AS5 did, but I may need to try that and see what happens. Though 6 cores at 4.6GHz is still a good bit o' heat for anything to dissipate, even with 4 fans on that rad in push/pull. Only 1.325v max though, which is nice for like -10C over the 1.4 a lot of people seemed to run at that speed when the X79s first arrived.
|
# ? Jul 14, 2016 20:16 |
|
I'm wondering if y'all can help me visualize my situation at the moment. I've got a 3770K clocked at 4.7 turbo (4 cores/8 threads), and I'm looking to take a leap forward by the end of the year. I do video editing, so anything new at this point would be an upgrade. Is that correct? I'm considering an i7-69xx model just to take that big step. Should I consider Xeons instead?
|
# ? Jul 15, 2016 00:29 |
|
A couple of 8-core Xeons off eBay would probably give you better performance/$, and of course you could go with new ones if you hate money. Though anything from the 69xx line would be a significant improvement as well, of course. Kaby Lake and Skylake-E might be coming out soon, but I don't think there's been an official ETA yet.
|
# ? Jul 15, 2016 01:28 |
|
As someone who just moved from a i5 3570k @ 4.6 to x99 and 6900k @ 4.4, I would say Broadwell-E or HEDT in general is your best bet if you want to buy new. Haswell-E is still good, will save you some cash and can overclock more readily than Broadwell-E. Nothing in Kaby Lake is going to do anything much for you. It's rumored to just have a few desktop SKUs and its only real improvement is support for Intel's new bonkers storage products (xpoint/optane). If things follow previous years, which isn't necessarily a given nowadays, Skylake-E would be released around June 2018, which is quite a wait.
|
# ? Jul 15, 2016 09:58 |
|
YEY My H110i GT is here! ....where are the loving screws? And the pre-applied thermal paste? Why are the fins dusty, and why are there so many bent fins? gently caress you, Amazon. But hey, free returns via them sending someone to my door to pick it up.
|
# ? Jul 15, 2016 13:35 |
|
Ak Gara posted:YEY My H110i GT is here! Yea, Newegg > Amazon for PC parts. Amazon is great for TVs, beds, other stuff, but PC parts seem to be a dice roll. Newegg has the H110i GT with a rebate right now too. If you want Premier for the perks, let me know as well.
|
# ? Jul 15, 2016 22:08 |
|
Newegg doesn't sell waterblocks
|
# ? Jul 16, 2016 00:20 |
|
PerrineClostermann posted:Newegg doesn't sell waterblocks But they do? http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=waterblock&N=-1&isNodeId=1 Or do you mean full-coverage custom GPU waterblocks? Cuz yeah, they don't sell those.
|
# ? Jul 16, 2016 00:38 |
|
Riflen posted:As someone who just moved from a i5 3570k @ 4.6 to x99 and 6900k @ 4.4, I would say Broadwell-E or HEDT in general is your best bet if you want to buy new. Haswell-E is still good, will save you some cash and can overclock more readily than Broadwell-E. mobby_6kl posted:A couple of 8-core Xeons off ebay would probably give you better performance/$, and of course you could go with new ones if you hate money. Though anything from the 69xx line would be a significant improvement as well, of course. I like to think whatever I buy will be outdated tomorrow, so might as well buy what I need now and not worry about what the road maps tell me. Thanks to the two of you. I'm going to chew on this and start looking at price points.
|
# ? Jul 16, 2016 01:38 |
|
As far as gaming goes, will it make a difference if I choose a system with an i7-6700T over an i5-6400T (all other specs being equal, incl. identical discrete GPU: NVIDIA GTX 960)? Edit: it's $182 extra for the i7. KingEup fucked around with this message at 12:57 on Jul 18, 2016 |
# ? Jul 18, 2016 09:40 |
KingEup posted:As far as gaming goes, will it make a difference if I choose a system with an i7-6700T over an i5-6400T (all other specs being equal, incl. identical discrete GPU: NVIDIA GTX 960)? Yes, it will make a pretty noticeable difference - not so much because of the extra threads on the i7, but because of the large clock speed difference: 2.8GHz on the i5 vs 3.6GHz on the i7 (max turbo).
|
|
# ? Jul 18, 2016 13:56 |
|
As long as you have the GTX 960, it will be the limiting factor with either of those CPUs in most games. The GTX 960 is kind of a bad choice; consult the GPU thread. Spending half of that $180 difference upgrading to an RX 480, waiting for the GTX 1060, or finding a good deal on a GTX 970 or 980 would be much better for most games. Things like Cities: Skylines excluded. PBCrunch fucked around with this message at 14:12 on Jul 18, 2016 |
# ? Jul 18, 2016 14:08 |
|
Maybe one of you can help me ID what this is. I assume it has something to do with testing the Pentium M but I have no clue what the SPT stands for.
|
# ? Jul 18, 2016 16:35 |
|
RaffyTaffy posted:Maybe one of you can help me ID what this is. Looks like something along these lines: https://hubslide.com/stntkiller/ate-testers-overview-s56d32fa6c48f80826925e9db.html Production line testing/binning, maybe.
|
# ? Jul 19, 2016 02:57 |
|
PBCrunch posted:The GTX 960 is kind of a bad choice, consult the GPU thread. Spending half of that 180$ difference upgrading to an RX480, waiting for GTX 1060, or finding a good deal on a GTX 970 or 980 world be much better for most games. My monitor only runs at 1360x768 so it should be fine for years to come.
|
# ? Jul 19, 2016 04:33 |
|
KingEup posted:My monitor only runs at 1360x768 so it should be fine for years to come. Or get a monitor that doesn't violate human rights.
|
# ? Jul 19, 2016 13:28 |
|
I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.
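There's simple arithmetic behind that smooth-framerate-over-resolution tradeoff: when you're GPU-bound, frame cost scales roughly with pixel count. A rough sketch (this assumes shading/fill-rate-bound rendering; real games vary a lot):

```python
# How many times more pixels common resolutions push vs 1360x768.
# Rough guide only: assumes the GPU is pixel-bound, which isn't
# true of every game or every scene.
base = 1360 * 768  # the monitor being discussed

def pixel_ratio(w, h):
    """Pixel count of w x h relative to 1360x768."""
    return w * h / base

for name, (w, h) in [("1920x1080", (1920, 1080)),
                     ("2560x1440", (2560, 1440))]:
    print(f"{name} is about {pixel_ratio(w, h):.2f}x the pixels of 1360x768")
```

So 1080p asks the GPU for roughly twice the pixels, and 1440p about 3.5x, which is why the same card lasts so much longer at the lower resolution.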
|
# ? Jul 19, 2016 13:55 |
|
HMS Boromir posted:I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth. Because it would be horrible for actual daily desktop use. If you have a larger, higher resolution monitor, you could always run the game at a lower resolution than native (which horrifies goons), and still be in a better position than sticking with a poo poo monitor forever, because you actually have space to work when you're back at the desktop.
|
# ? Jul 19, 2016 13:58 |
|
I mean, I've been using this monitor for four years now and it hasn't been "horrible for actual daily desktop use". When I'm doing work for university or whatever there's occasional stuff where a bigger / second monitor would be more comfortable but that's like 0.1% of the time I'm using the thing. If you're doing real, paid work that benefits from extra screen real estate then 1360x768 is a war crime, sure, but I'm not, so it's fine.
|
# ? Jul 19, 2016 14:04 |
|
You might think that, but it's only because you haven't tried using a modern resolution and don't know any better.
|
# ? Jul 19, 2016 15:02 |
|
i'm really surprised that 1366x768 is still a thing, 1600x900 costs just as much to OEMs and windows by default multiplies the UI by 1.5x, are they really that worried about blind grandmas
|
# ? Jul 19, 2016 15:04 |
|
HMS Boromir posted:I mean, I've been using this monitor for four years now and it hasn't been "horrible for actual daily desktop use". When I'm doing work for university or whatever there's occasional stuff where a bigger / second monitor would be more comfortable but that's like 0.1% of the time I'm using the thing. If you're doing real, paid work that benefits from extra screen real estate then 1360x768 is a war crime, sure, but I'm not, so it's fine. Stockholm Syndrome post of the year here.
|
# ? Jul 19, 2016 15:12 |
|
HMS Boromir posted:I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth. Because they're horrible, and there's no situation other than "I absolutely must play games with the highest settings at the native resolution" where having low resolution benefits you. Using those 1280x720/1360x768/1366x768 monitors is essentially still using the same 1024x768 resolution we were using 20 years ago, it's simply not appropriate and things aren't really designed to work with it. Sure sometimes it's the only option on very small laptop displays, but that's a tradeoff you have to put up with the same way you used to have to put up with screens that ghosted so bad playing most games on there would be impossible too. I mean you might as well say this is a perfectly usable way to browse the web:
|
# ? Jul 19, 2016 15:17 |
|
HMS Boromir posted:I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth. I have a friend like you who was still using a 17" 1024x768 monitor last year, you should really pick up a cheap IPS 1080p LCD and give it a shot. You'll be pretty happy.
|
# ? Jul 19, 2016 15:19 |
|
My last employer shipped me a 1280x1024 19" monitor. I was convinced the cost of the monitor was less than the shipping cost and they'd have been better served just giving me $60 to spend on a monitor. It turns out they were shipping them to reduce their massive stockpile of old monitors they hadn't thrown out.
|
# ? Jul 19, 2016 15:26 |
|
HMS Boromir posted:I mean, I've been using this monitor for four years now and it hasn't been "horrible for actual daily desktop use". 2560x1440x2 baby!
|
# ? Jul 19, 2016 15:51 |
|
...I should probably stop posting about my dumb monitor. Every time I do a swarm of people has to jump in and tell me I'm an idiot for not spending a bunch of money on something I don't want. I've used 1080p monitors and the "upgrade" is not worth money to me.
|
# ? Jul 19, 2016 16:29 |
|
HMS Boromir posted:...I should probably stop posting about my dumb monitor. Every time I do a swarm of people has to jump in and tell me I'm an idiot for not spending a bunch of money on something I don't want. I've used 1080p monitors and the "upgrade" is not worth money to me. You can literally get a better monitor at a Goodwill for $5. If that's a bunch of money to you, I don't know what to tell you. 768-high resolutions were obsoleted before LCD monitors were mass market.
|
# ? Jul 19, 2016 16:39 |
|
HMS Boromir posted:The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth. Same. Apparently NVIDIA might be working on adding integer scaling to their drivers, so that should extend GPU life even further. Quite happy to go 4K or more as long as it has an adaptive refresh rate and an integer scaling mode. Why integer scaling is important: https://forums.geforce.com/default/topic/844905/geforce-drivers/integer-scaling-mode/ KingEup fucked around with this message at 16:54 on Jul 19, 2016 |
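For anyone wondering, integer scaling is just nearest-neighbour upscaling by a whole-number factor: every source pixel becomes an exact NxN block of screen pixels, so nothing gets blurred the way bilinear scaling does. A minimal numpy sketch of the idea (the function name is mine, not anything from NVIDIA's drivers):

```python
import numpy as np

def integer_upscale(img, factor):
    """Nearest-neighbour upscale by an integer factor: each source
    pixel maps to an exact factor x factor block, so edges stay sharp."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# 2x2 checkerboard test pattern -> 4x4, each pixel now a 2x2 block.
frame = np.array([[0, 255], [255, 0]], dtype=np.uint8)
doubled = integer_upscale(frame, 2)
# Same idea at monitor scale: 1920x1080 -> 3840x2160 with factor 2.
```

That's why a 1080p signal on a 4K panel is the dream case here: factor 2 divides evenly, so the "low res" image stays pixel-perfect instead of smeared.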
# ? Jul 19, 2016 16:39 |
|
KingEup posted:Same. Well at least someone understands.
|
# ? Jul 19, 2016 16:42 |
|
I don't know if I can hold off til Skylake-X/Kaby Lake-X to upgrade from my 2500k. Not because anything needs more cpu HP, just because I want something new and exciting.
|
# ? Jul 19, 2016 17:02 |
|
Our first LCD monitor was a big, heavy thing with 1280x1024 native, from 2001.
|
# ? Jul 19, 2016 17:03 |
|
|
|
I understand that building PCs and keeping up with it is a neverending treadmill of money for upgrades, but monitors are probably the longest lasting part of a PC. I'm using a 1080p 24" Dell monitor that I bought ~7 years ago and I'm horrified that someone would use something worse. It was probably only $150 or $200 at the time. This is coming from someone who replaced his 27" CRT TV 4 years ago with a 42 inch LCD, and got his first smartphone 8 months ago. priznat posted:I don't know if I can hold off til Skylake-X/Kaby Lake-X to upgrade from my 2500k. Not because anything needs more cpu HP, just because I want something new and exciting. Same feeling here, except it's a 3700k that I'd be upgrading from. I barely even play video games anymore either, so there is zero reason for me to upgrade. Even if I did, upgrading my GPU would be the smarter move
|
# ? Jul 19, 2016 17:05 |