Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler
Sounds a little like my 2500K in an ASUS P8Z68-V LX. I looked at doing voltage overclocking but most of the sources I could find seemed to be saying that on Sandy Bridge going over 1.4V is a bad idea for the lifespan of the CPU. At "stock" voltage settings and 4.4GHz I am seeing a vCore of ~1.39V at load, so I think I'm at the end of the line already unless I want to risk having to hit up eBay for a new proc.
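(A minimal sketch, if you'd rather log how hot the chip actually gets during a stress run than eyeball it: this assumes Linux with the coretemp driver and the psutil package, and it only reads temperature; vCore itself is easier to grab from the board's monitoring software or HWiNFO.)

code:
# Minimal sketch: log the hottest core temp once a second during a stress run.
# Assumes Linux + coretemp + psutil; reads temperature only, not vCore.
import time
import psutil

def max_core_temp():
    temps = psutil.sensors_temperatures().get("coretemp", [])
    return max((t.current for t in temps), default=None)

if __name__ == "__main__":
    peak = 0.0
    for _ in range(60):                      # one minute of one-second samples
        t = max_core_temp()
        if t is not None:
            peak = max(peak, t)
            print(f"current: {t:.1f} C   peak: {peak:.1f} C")
        time.sleep(1)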

lDDQD
Apr 16, 2006
~1.4V seemed unreasonable for my 1st-gen i7 (45nm). But then, I ended up running my i7-875K at 1.44-ish 24/7 in order to achieve 4.2GHz anyway. And it was fine, for years and years... in fact, it can still do that voltage/clock combination just fine today. A buddy of mine tried a similar thing: he had a Xeon version of the i7-920 (don't remember the exact Xeon model#, but it was literally a server-branded 920), and he ran it at 4.4GHz @ ~1.42v. Initially, it was fine, but it started to degrade fast. After a couple of months, it was no longer stable at 4.4. So he dropped it down to 4.3, but after a while it got wonky at that clock also. Eventually, it couldn't even do 4.2, so he gave up and set it to a more conservative 4.0GHz @ 1.2?v. Meanwhile, mine kept on trucking at 4.2GHz @ 1.44v.

The difference? My buddy had a heatpipe tower cooler, while I had a custom loop, which was keeping my 875K as cool as a cucumber: it didn't really go above 55°C on a typical load. His was hitting low 80s.

Now that I replaced my Lynnfield with a Broadwell-C, I actually still find myself running my i7-5775C at about 1.4v. It actually probably doesn't even need that voltage to do 4.2GHz (which is about the most it will ever do; you could give it 1.5v if you wanted to, and it wouldn't really make a difference; these things are terrible overclockers). Anyway, it's been fine so far, although I've only had it for about 4 months.

EdEddnEddy
Apr 5, 2012



Ak Gara posted:

My 5 and a half year old Corsair H100 (not even H100i) is starting to make grinding noises. :ohdear:

I was hoping to wait until Kaby Lake, but I'd rather replace it now, then swap it to whatever PC I have next.

I'm thinking of getting a H110i 280mm but their naming system is a little weird.

First there was the H110i, then the H110i GT (by CoolIT), then there was an H110i GTX (by Asetek), which was a smaller rad with lower-speed fans and tubes that mounted vertically rather than out the side. The H110i GT was later renamed the H110i, and the H110i GTX was renamed the H115i. Is that correct?

I'd also look into replacing the stock fans with some Corsair SP140's, but their website lists their 120's as having both more airflow AND more static pressure AAAND running quieter than the SP140's.

I haven't gotten the grind yet, but the naming and specs of all their new stuff are confusing as hell. Why they can't just have an H## where a bigger number means a bigger/better cooler, instead of all this GT/GTX/###i bullshit, is beyond me. Let me know what you get, as I was looking to fit an H115/H120 in my R500, since it can supposedly fit the bigger rad in place of the H100, pending my RAM/MB clearance.

NihilismNow posted:

Maybe it is my motherboard but mine did 1.41 volt out of the box (6700k + Asrock Z170m Extreme 4). I have to run a -0.08 voltage offset to keep temps sane. It does 4.5 ghz stable at that voltage (1.33). At "stock" voltage it just cooks itself. Most of the time it is only running 0.7-0.8 volt.

Hmm, I know ASRock has always been a bit aggressive with voltages, but man, I don't remember having to use 1.4V+ solid on anything since a 2600K at 5GHz. Even my Q9550 at 3.84GHz (a 1GHz OC) only needed 1.35V at max load. :/ I know everyone says the newest stuff is bad at OC'ing, but drat if running the same OC voltages as 4+ generation-old hardware seems pretty crazy/odd to me. Guess I need to get my hands on some new chips and experiment some. I have a G3258 that's at 4.4GHz with stock voltages (1.2ish I think, I'll have to check), but I may get an i7 4790K for cheap to replace it and let that HTPC be a backup VR system in another case with my 980Ti in the future.

GRINDCORE MEGGIDO
Feb 28, 1985


NihilismNow posted:

Maybe it is my motherboard but mine did 1.41 volt out of the box (6700k + Asrock Z170m Extreme 4). I have to run a -0.08 voltage offset to keep temps sane. It does 4.5 ghz stable at that voltage (1.33). At "stock" voltage it just cooks itself. Most of the time it is only running 0.7-0.8 volt.

Mine was similar too on default "auto" voltages (Asus Z170i). It set a bunch of other voltages way high as well, like system agent / VCCIO, etc.

I fired it up at 4.4GHz, auto everything, for the first boot after an install, and it was giving the chip 1.45V vCore.

Due to a buggy BIOS (I think), offset voltage wasn't working - adaptive mode did, though I only figured that out when I was already regretting the whole thing and wondering if my Sandy Bridge would do 5GHz with this cooler. That was a lovely bug.

Wonder if this 14nm process is going to improve much?

GRINDCORE MEGGIDO fucked around with this message at 20:57 on Jul 12, 2016
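(For anyone not clear on the offset vs. adaptive distinction being described here, a rough sketch with made-up numbers: a negative offset shifts the chip's whole requested-voltage curve down, idle included, while adaptive mode only overrides the voltage at turbo clocks. The VID figures below are purely hypothetical, not from any real board.)

code:
# Rough illustration of offset vs. adaptive vCore modes; all numbers are made up.
VID_CURVE = {800: 0.75, 3500: 1.20, 4400: 1.41}   # MHz -> stock requested volts (hypothetical)

def vcore_offset(freq_mhz, offset_v):
    # Offset mode: every point on the stock curve moves by the same amount.
    return VID_CURVE[freq_mhz] + offset_v

def vcore_adaptive(freq_mhz, turbo_target_v, turbo_floor_mhz=3500):
    # Adaptive mode: below turbo the stock curve applies; at turbo the target wins.
    return turbo_target_v if freq_mhz > turbo_floor_mhz else VID_CURVE[freq_mhz]

for f in (800, 4400):
    print(f, round(vcore_offset(f, -0.08), 2), vcore_adaptive(f, 1.33))
# 800 MHz: the offset also drops idle voltage (0.67 V); adaptive leaves it at 0.75 V.
# 4400 MHz: both land around 1.33 V instead of the 1.41 V "auto" value.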

Ak Gara
Jul 29, 2005

That's just the way he rolls.

EdEddnEddy posted:

I haven't gotten the grind yet, but the naming and specs of all their new stuff are confusing as hell. Why they can't just have an H## where a bigger number means a bigger/better cooler, instead of all this GT/GTX/###i bullshit, is beyond me. Let me know what you get, as I was looking to fit an H115/H120 in my R500, since it can supposedly fit the bigger rad in place of the H100, pending my RAM/MB clearance.


Hmm, I know ASRock has always been a bit aggressive with voltages, but man, I don't remember having to use 1.4V+ solid on anything since a 2600K at 5GHz. Even my Q9550 at 3.84GHz (a 1GHz OC) only needed 1.35V at max load. :/ I know everyone says the newest stuff is bad at OC'ing, but drat if running the same OC voltages as 4+ generation-old hardware seems pretty crazy/odd to me. Guess I need to get my hands on some new chips and experiment some. I have a G3258 that's at 4.4GHz with stock voltages (1.2ish I think, I'll have to check), but I may get an i7 4790K for cheap to replace it and let that HTPC be a backup VR system in another case with my 980Ti in the future.

I ended up going for a H110i GT from Amazon. The CW-9060019-WW version. It should fit in my Corsair 550D okay. I don't know about the R500 (500R?) though.

EdEddnEddy
Apr 5, 2012



Ak Gara posted:

I ended up going for a H110i GT from Amazon. The CW-9060019-WW version. It should fit in my Corsair 550D okay. I don't know about the R500 (500R?) though.

If you are so inclined, take some pics and let me know how you like it sound/performance-wise. My H100 still works, but that 3930K can get toasty in the summer if I have it at anything less than Max Cooling performance mode, which spins up the fans if the CPU does anything but idle. (Only hits 70C max under 100% loads though.) Any lower mode though and Ripbot can bring it close to 80C at times. Not dangerous, but still hot.
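(If it helps to picture what a "Max Cooling" style profile is doing, here's a toy fan-curve sketch: linear interpolation between a few temperature/duty set points. The set points are invented for illustration; the real Corsair Link profiles are whatever you configure in its GUI.)

code:
# Toy fan curve: linear interpolation between (temp C, fan duty %) set points.
# Purely illustrative; not Corsair Link's actual logic.
CURVE = [(30, 20), (50, 40), (70, 80), (80, 100)]   # hypothetical set points

def fan_duty(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40))   # 30.0 -> halfway between the 20% and 40% points
print(fan_duty(75))   # 90.0 -> ramping hard as it approaches 80 C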

Ak Gara
Jul 29, 2005

That's just the way he rolls.

EdEddnEddy posted:

If you are so inclined, take some pics and let me know how you like it sound/performance-wise. My H100 still works, but that 3930K can get toasty in the summer if I have it at anything less than Max Cooling performance mode, which spins up the fans if the CPU does anything but idle. (Only hits 70C max under 100% loads though.) Any lower mode though and Ripbot can bring it close to 80C at times. Not dangerous, but still hot.

Will do. How long have you had your H100? I actually had to replace the thermal paste on mine after 4 years as my temps were slowly climbing. I don't know if it was due to the pre-applied paste the H100 came with, but this time I'm gonna be using some good ol' MX-4.

I'm going to rerun Cinebench 11.5 and see if I can beat my old 2500k's 7.88 score!

EdEddnEddy
Apr 5, 2012



Ak Gara posted:

Will do. How long have you had your H100? I actually had to replace the thermal paste on mine after 4 years as my temps were slowly climbing. I don't know if it was due to the pre-applied paste the H100 came with, but this time I'm gonna be using some good ol' MX-4.

I'm going to rerun Cinebench 11.5 and see if I can beat my old 2500k's 7.88 score!

Hmm. I got it in 2011 after I won my 500R and put my C2Q 9550 setup into it, but I removed and replaced it with my 3930K setup in 2012, and I think I've repasted it once since then, in like 2013/14. Using IC Diamond, I believe, which shouldn't need repasting as soon as AS5 did, but I may need to try that and see what happens.


Though 6 cores at 4.6GHz is still a good bit o' heat for anything to dissipate, even with 4 fans on that rad in push/pull. Only 1.325V max though, which is nice for like -10C over the 1.4V a lot of people seemed to run at that speed when the X79s first arrived.

P.N.T.M.
Jan 14, 2006

tiny dinosaurs
Fun Shoe
I'm wondering if y'all can help me think through my situation at the moment. I've got a 3770K clocked at 4.7 turbo (4/8 cores), and I'm looking to take a leap forward by the end of the year. I do video editing, so anything new at this point would be an upgrade. Is that correct? I'm considering an i7-69xx model just to take that big step. Should I consider Xeons instead?

mobby_6kl
Aug 9, 2009

by Fluffdaddy
A couple of 8-core Xeons off ebay would probably give you better performance/$, and of course you could go with new ones if you hate money. Though anything from the 69xx line would be a significant improvement as well, of course. Kaby Lake and Skylake-E might be coming out soon but I don't think there's been an official ETA yet.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin
As someone who just moved from an i5 3570K @ 4.6 to X99 and a 6900K @ 4.4, I would say Broadwell-E, or HEDT in general, is your best bet if you want to buy new. Haswell-E is still good, will save you some cash, and can overclock more readily than Broadwell-E.

Nothing in Kaby Lake is going to do anything much for you. It's rumored to just have a few desktop SKUs and its only real improvement is support for Intel's new bonkers storage products (xpoint/optane).
If things follow previous years, which isn't necessarily a given nowadays, Skylake-E would be released around June 2018, which is quite a wait.

Ak Gara
Jul 29, 2005

That's just the way he rolls.
YEY My H110i GT is here!

....where are the loving screws? And the pre-applied thermal paste? Why are the fins dusty, and why are there so many bent fins?

gently caress you, Amazon. But hey, free returns via them sending someone to my door to pick it up.

EdEddnEddy
Apr 5, 2012



Ak Gara posted:

YEY My H110i GT is here!

....where are the loving screws? And the pre-applied thermal paste? Why are the fins dusty, and why are there so many bent fins?

gently caress you, Amazon. But hey, free returns via them sending someone to my door to pick it up.

Yea, Newegg > Amazon for PC parts.

Amazon is great for TVs, beds, and other stuff, but PC parts seem to be a dice roll.

Newegg has the H110i GT with a rebate right now too.

If you want Premier for the perks, let me know as well.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Newegg doesn't sell waterblocks :colbert:

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

PerrineClostermann posted:

Newegg doesn't sell waterblocks :colbert:

But they do?

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=waterblock&N=-1&isNodeId=1

Or do you mean full-coverage custom GPU waterblocks? Cuz yeah, they don't sell those.

P.N.T.M.
Jan 14, 2006

tiny dinosaurs
Fun Shoe

Riflen posted:

As someone who just moved from an i5 3570K @ 4.6 to X99 and a 6900K @ 4.4, I would say Broadwell-E, or HEDT in general, is your best bet if you want to buy new. Haswell-E is still good, will save you some cash, and can overclock more readily than Broadwell-E.

mobby_6kl posted:

A couple of 8-core Xeons off ebay would probably give you better performance/$, and of course you could go with new ones if you hate money. Though anything from the 69xx line would be a significant improvement as well, of course.


I like to think whatever I buy will be outdated tomorrow, so might as well buy what I need now and not worry about what the road maps tell me.

Thanks to the two of you. I'm going to chew on this and start looking at price points.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.
As far as gaming goes, will it make a difference if I choose a system with an i7 6700T over an i5 6400T (all other specs being equal, incl. an identical discrete GPU: NVIDIA GTX 960)?

Edit: it's $182 extra for the i7.

KingEup fucked around with this message at 12:57 on Jul 18, 2016

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

KingEup posted:

As far as gaming goes, will it make a difference if I choose a system with an i7 6700T over an i5 6400T (all other specs being equal, incl. an identical discrete GPU: NVIDIA GTX 960)?

Edit: it's $182 extra for the i7.

Yes, it will make a pretty noticeable difference, not so much because of the extra threads on the i7 but because of the large clock speed difference: 2.8GHz on the i5 vs 3.6GHz on the i7.
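(Back-of-the-envelope on that gap, using the clocks quoted above; this is just arithmetic, not a benchmark result.)

code:
# Back-of-the-envelope comparison of the quoted clocks (arithmetic only, not a benchmark).
i5_6400t_ghz = 2.8   # figure quoted above
i7_6700t_ghz = 3.6   # figure quoted above

clock_advantage = (i7_6700t_ghz - i5_6400t_ghz) / i5_6400t_ghz
print(f"i7 clock advantage: {clock_advantage:.1%}")   # ~28.6%
# Hyper-Threading also gives 8 threads vs. 4, but at this tier the clock gap is
# usually the more visible difference in games.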

PBCrunch
Jun 17, 2002

Lawrence Phillips Always #1 to Me
As long as you have the 960 GTX, it will be the limiting factor with either of those CPUs in most games.

The GTX 960 is kind of a bad choice; consult the GPU thread. Spending half of that $180 difference on upgrading to an RX 480, waiting for the GTX 1060, or finding a good deal on a GTX 970 or 980 would be much better for most games.

Things like Cities Skylines excluded.

PBCrunch fucked around with this message at 14:12 on Jul 18, 2016

RaffyTaffy
Oct 15, 2008
Maybe one of you can help me ID what this is.

I assume it has something to do with testing the Pentium M but I have no clue what the SPT stands for.

Prescription Combs
Apr 20, 2005
   6

RaffyTaffy posted:

Maybe one of you can help me ID what this is.

I assume it has something to do with testing the Pentium M but I have no clue what the SPT stands for.

Looks like something along these lines: https://hubslide.com/stntkiller/ate-testers-overview-s56d32fa6c48f80826925e9db.html

Production line testing/binning, maybe.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

PBCrunch posted:

The GTX 960 is kind of a bad choice; consult the GPU thread. Spending half of that $180 difference on upgrading to an RX 480, waiting for the GTX 1060, or finding a good deal on a GTX 970 or 980 would be much better for most games.

My monitor only runs at 1360x768 so it should be fine for years to come.

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

KingEup posted:

My monitor only runs at 1360x768 so it should be fine for years to come.

Or get a monitor that doesn't violate human rights.

HMS Boromir
Jul 16, 2011

by Lowtax
I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

HMS Boromir posted:

I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.

Because it would be horrible for actual daily desktop use. If you have a larger, higher-resolution monitor, you could always run the game at a lower resolution than native (which horrifies goons) and still be in a better position than sticking with a poo poo monitor forever, because you actually have space to work when you're back at the desktop.
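(For a rough sense of the workspace gap being argued about, the raw pixel arithmetic looks like this; comfort is obviously subjective, but the areas aren't.)

code:
# Raw workspace arithmetic: pixels per common resolution, relative to 1366x768.
resolutions = {
    "1366x768":  (1366, 768),
    "1600x900":  (1600, 900),
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
}

base = 1366 * 768
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:>9,} px  ({px / base:.2f}x a 1366x768 panel)")
# 1920x1080 is roughly 2x the area of 1366x768; 2560x1440 is about 3.5x.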

HMS Boromir
Jul 16, 2011

by Lowtax
I mean, I've been using this monitor for four years now and it hasn't been "horrible for actual daily desktop use". When I'm doing work for university or whatever there's occasional stuff where a bigger / second monitor would be more comfortable but that's like 0.1% of the time I'm using the thing. If you're doing real, paid work that benefits from extra screen real estate then 1360x768 is a war crime, sure, but I'm not, so it's fine.

Fuzzy Mammal
Aug 15, 2001

Lipstick Apathy
You might think that, but it's only because you haven't tried using a modern resolution and don't know any better.

Anime Schoolgirl
Nov 28, 2002

I'm really surprised that 1366x768 is still a thing. 1600x900 costs OEMs just as much, and Windows by default multiplies the UI by 1.5x. Are they really that worried about blind grandmas?

Barry
Aug 1, 2003

Hardened Criminal

HMS Boromir posted:

I mean, I've been using this monitor for four years now and it hasn't been "horrible for actual daily desktop use". When I'm doing work for university or whatever there's occasional stuff where a bigger / second monitor would be more comfortable but that's like 0.1% of the time I'm using the thing. If you're doing real, paid work that benefits from extra screen real estate then 1360x768 is a war crime, sure, but I'm not, so it's fine.

Stockholm Syndrome post of the year here.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.

Because they're horrible, and there's no situation other than "I absolutely must play games with the highest settings at the native resolution" where having a low resolution benefits you. Using those 1280x720/1360x768/1366x768 monitors is essentially still using the same 1024x768 resolution we were using 20 years ago; it's simply not appropriate, and things aren't really designed to work with it. Sure, sometimes it's the only option on very small laptop displays, but that's a tradeoff you have to put up with, the same way you used to have to put up with screens that ghosted so badly that playing most games on them was impossible.

I mean you might as well say this is a perfectly usable way to browse the web:

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

HMS Boromir posted:

I'm not sure why people react so violently to low resolution monitors. For me a smooth framerate and decent settings are orders of magnitude more important than high resolution, and I'm not in the business of buying $300 video cards so I can get all three. The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.

I have a friend like you who was still using a 17" 1024x768 monitor last year; you should really pick up a cheap IPS 1080p LCD and give it a shot. You'll be pretty happy.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
My last employer shipped me a 1280x1024 19" monitor. I was convinced the cost of the monitor was less than the shipping cost and they'd have been better served just giving me $60 to spend on a monitor. It turns out they were shipping them to reduce their massive stockpile of old monitors they hadn't thrown out.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

HMS Boromir posted:

I mean, I've been using this monitor for four years now and it hasn't been "horrible for actual daily desktop use".
Where in the hell did you find a 1366x768 desktop monitor four years ago?!

2560x1440x2 baby!

HMS Boromir
Jul 16, 2011

by Lowtax
...I should probably stop posting about my dumb monitor. Every time I do a swarm of people has to jump in and tell me I'm an idiot for not spending a bunch of money on something I don't want. I've used 1080p monitors and the "upgrade" is not worth money to me.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

HMS Boromir posted:

...I should probably stop posting about my dumb monitor. Every time I do a swarm of people has to jump in and tell me I'm an idiot for not spending a bunch of money on something I don't want. I've used 1080p monitors and the "upgrade" is not worth money to me.

You can literally get a better monitor at a Goodwill for $5. If that's a bunch of money to you, I don't know what to tell you. 768-high resolutions were obsoleted before LCD monitors were mass market.

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.

HMS Boromir posted:

The only reason I'm planning to ever upgrade from 1360x768 at all is because adaptive sync sounds like a cool way to further extend the life of a GPU by softening the effect of framerate drops once it starts getting long in the tooth.

Same.

Apparently NVIDIA might be working on adding integer scaling to their drivers, so that should extend GPU life even further. Quite happy to go 4K or more as long as it has an adaptive refresh rate and an integer scaling mode.

Why integer scaling is important: https://forums.geforce.com/default/topic/844905/geforce-drivers/integer-scaling-mode/

KingEup fucked around with this message at 16:54 on Jul 19, 2016
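(For anyone wondering what integer scaling actually means: each source pixel gets blown up to an exact NxN block instead of being filtered, so low-res content stays sharp on a high-res panel. The snippet below only illustrates the idea on a toy 2x2 "image"; it is not anything from NVIDIA's driver.)

code:
# Rough illustration of integer (nearest-neighbour) scaling on a toy image of ints.
def largest_integer_scale(src_w, src_h, dst_w, dst_h):
    # Biggest whole-number factor that still fits the target panel.
    return max(1, min(dst_w // src_w, dst_h // src_h))

def integer_scale(image, factor):
    # image is a list of rows; repeat every pixel and every row `factor` times.
    return [[px for px in row for _ in range(factor)]
            for row in image
            for _ in range(factor)]

src = [[1, 2],
       [3, 4]]
f = largest_integer_scale(2, 2, 7, 5)      # -> 2 (a 4x4 result fits in 7x5; 6x6 would not)
print(f, integer_scale(src, f))
# e.g. 1360x768 content on a 3840x2160 panel: largest_integer_scale(1360, 768, 3840, 2160) == 2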

HMS Boromir
Jul 16, 2011

by Lowtax

Well at least someone understands.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
I don't know if I can hold off til Skylake-X/Kaby Lake-X to upgrade from my 2500k. Not because anything needs more cpu HP, just because I want something new and exciting.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Our first LCD monitor was a big, heavy thing with 1280x1024 native, from 2001.

canyoneer
Sep 13, 2005


I only have canyoneyes for you
I understand that building PCs and keeping up with it is a neverending treadmill of money for upgrades, but monitors are probably the longest lasting part of a PC. I'm using a 1080p 24" Dell monitor that I bought ~7 years ago and I'm horrified that someone would use something worse. It was probably only $150 or $200 at the time.

This is coming from someone who replaced his 27" CRT TV 4 years ago with a 42 inch LCD, and got his first smartphone 8 months ago. :psyduck:

priznat posted:

I don't know if I can hold off til Skylake-X/Kaby Lake-X to upgrade from my 2500k. Not because anything needs more cpu HP, just because I want something new and exciting.

:same:
Except it's a 3770K that I'd be upgrading from. I barely even play video games anymore either, so there is zero reason for me to upgrade. Even if I did, upgrading my GPU would be the smarter move.
