HornyBoy123
Mar 4, 2005

Alereon posted:

Nope, it's just a decent mid-range videocard. Your CPU is too slow to justify a Radeon HD 6900-series card, and the 6850 is reasonably fast. There's actually a parts picking megathread here you should post in if you want more feedback. If you actually want something more than general advice you should probably say what you want the videocard FOR, like what games you play, what settings you want to play them on, and how big your monitor(s) are. As well as what you're using now and why you're not happy with it.
People actually fill the sockets with thermal paste on purpose to reduce the risk of condensation from sub-ambient cooling, so as long as it's not something metallic like Arctic Silver you'll be totally fine.

I wasn't being sarcastic, just joking around :)

I want to play Skyrim on max when it comes out--I have 4GB of DDR2-1066 RAM and will be upgrading to 8GB--might want to play Arkham City on max too.

The E6850 isn't that bad of a CPU; don't tell me to upgrade to a quadcore i5 or i7 dammit!

My monitor is a 20.1" 1680x1050--I'll be sinking all my money into a 2560x1600 display soon though, which is why I want to find a nice pair for my processor.

HornyBoy123 fucked around with this message at 04:29 on Nov 1, 2011

MikeC
Jul 19, 2004
BITCH ASS NARC
Just a followup to my HD 5850 video card fan issue I posted a few pages back.

OK, so I got the aftermarket cooler, and I took apart an old Radeon 2900 that had water damage on it, just to see if I could even attempt something like this. I pulled it apart fine, did the cleaning bit, and managed to reassemble it sans thermal paste (since it's busted anyway), so I think I have a legit shot at this.

Just looking for some tips before I dive into this tomorrow. I know with thermal paste, "less is more", but I just don't know how much to put. If it comes to making a call, should I lean towards putting more or less?

Also, while disassembling the 2900, I noticed there was a mega fuckton of thermal paste all over the place where the GPU met the heatsink plate. So much that gunk was pushed out past the edges of the square chip and literally covered the entire "moat" area beside the raised portion of the chip. Is it required to clean all of this off? Even with lots of 70% iso alcohol and lots of Q-tips, a lot of stubborn bits remained stuck to the raised metal parts in the "moat" surrounding the GPU.

I didn't want to press too hard with the Q-tips for fear of leaving lint. How important is it to get all of that old gunk off before I attach the aftermarket heatsink?

Thanks, and please wish me luck :D

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

People actually fill the sockets with thermal paste on purpose to reduce the risk of condensation from sub-ambient cooling, so as long as it's not something metallic like Arctic Silver you'll be totally fine.

That's a pretty good idea I guess. Wouldn't leave any space for moisture when going for [H]ard level overclocks with a 500W Peltier or something. Never thought about it. Cool :)

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

MikeC posted:

Videocard heatsink stuff
You don't have to get it completely clean, just make sure the GPU core (shiny part) and heatsink base are spotless. With thermal paste, you want to apply a paper-thin, even layer to the GPU core only. Imagine one coat of white paint. Try not to smear it around when you install the heatsink, though a bit of movement is unavoidable and not a problem.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Alereon posted:

People actually fill the sockets with thermal paste on purpose to reduce the risk of condensation from sub-ambient cooling, so as long as it's not something metallic like Arctic Silver you'll be totally fine.

Even Arctic Silver's not a huge deal - it's silver particles suspended in non-conductive goop, so it's not conductive. It can take localized charges and gently caress with capacitance, so it's not a good idea to go spreading it everywhere, but it won't short anything out and in all likelihood won't cause any issues. It certainly won't set the CPU on fire or anything.

HornyBoy123
Mar 4, 2005

HornyBoy123 posted:

I wasn't being sarcastic, just joking around :)

I want to play Skyrim on max when it comes out--I have 4GB of DDR2-1066 RAM and will be upgrading to 8GB--might want to play Arkham City on max too.

The E6850 isn't that bad of a CPU; don't tell me to upgrade to a quadcore i5 or i7 dammit!

My monitor is a 20.1" 1680x1050--I'll be sinking all my money into a 2560x1600 display soon though, which is why I want to find a nice pair for my processor.

Is there any reason to not get a 5870 on eBay for ~$140? :downs: What does the 6850 have that the 5870 doesn't? (Here I thought I knew a lot)

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Space Gopher posted:

Even Arctic Silver's not a huge deal - it's silver particles suspended in non-conductive goop, so it's not conductive. It can take localized charges and gently caress with capacitance, so it's not a good idea to go spreading it everywhere, but it won't short anything out and in all likelihood won't cause any issues. It certainly won't set the CPU on fire or anything.
Arctic Silver (along with all pastes with significant metal content) actually is conductive. They've been saying it's "only conductive under extreme pressure" for years, and while it may be LESS conductive than other products, it absolutely will short something out if you give it the chance, so you have to be very careful with it. That's why they sell non-conductive products like Alumina and Ceramique.

HornyBoy123 posted:

Is there any reason to not get a 5870 on eBay for ~$140? :downs: What does the 6850 have that the 5870 doesn't? (Here I thought I knew a lot)
Don't buy old-generation videocards. The 6800-series has better picture quality, much better DirectX 11 tessellation performance, other new features, and much lower power usage (and thus lower heat and noise). Here's a review that compares it to the previous generation.

Alereon fucked around with this message at 04:50 on Nov 1, 2011

MikeC
Jul 19, 2004
BITCH ASS NARC

Alereon posted:

You don't have to get it completely clean, just make sure the GPU core (shiny part) and heatsink base are spotless. With thermal paste, you want to apply a paper-thin, even layer to the GPU core only. Imagine one coat of white paint. Try not to smear it around when you install the heatsink, though a bit of movement is unavoidable and not a problem.

So you recommend avoiding the "dot" or "line" method that I keep seeing on YouTube? It's just so hard to judge how much they used.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

MikeC posted:

So you recommend avoiding the "dot" or "line" method that I keep seeing on YouTube? It's just so hard to judge how much they used.
Yes, it's basically impossible to eyeball, and your goal is a flat, even layer anyway. Some people think spreading it yourself traps air bubbles, but that's really never been the case. (If you care enough, you can pull the heatsink back off afterwards and see what kind of thermal paste imprint you got; there should be VERY little actually left between the contact surfaces, just enough to fill the microscopic imperfections.)

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Alereon posted:

Yes, it's basically impossible to eyeball, and your goal is a flat, even layer anyway. Some people think spreading it yourself traps air bubbles, but that's really never been the case. (If you care enough, you can pull the heatsink back off afterwards and see what kind of thermal paste imprint you got; there should be VERY little actually left between the contact surfaces, just enough to fill the microscopic imperfections.)

Having just done this myself, I have to agree: since the GPU heat spreader is even larger than a CPU's, it's really best to spread as thin a layer as possible across the entire contact area on the GPU. I had to redo the paste on my AXP 2 because they did a poo poo job of pre-applying the paste on the cooler; when I went back to do it I applied some old Cooler Master gunk to the GPU instead of applying it to the cooler, and it worked much better.

HornyBoy123
Mar 4, 2005

Alereon posted:

Arctic Silver (along with all pastes with significant metal content) actually is conductive. They've been saying it's "only conductive under extreme pressure" for years, and while it may be LESS conductive than other products, it absolutely will short something out if you give it the chance, so you have to be very careful with it. That's why they sell non-conductive products like Alumina and Ceramique.
Don't buy old-generation videocards. The 6800-series has better picture quality, much better DirectX 11 tessellation performance, other new features, and much lower power usage (and thus lower heat and noise). Here's a review that compares it to the previous generation.

Thanks man, read every bit of that :)

Also, wondering if anybody knows of a store similar to American Musical Supply where I could buy computer parts/electronics that allows payments without needing credit? Just curious--not going to make any dumb consumerist-type decision I promise :)

RomaVictor
Jan 14, 2008
Above all things, truth beareth away the victory.
I'm one of those guys who has a good gaming computer but doesn't actually know jack poo poo about it.

I have a GTX 550 Ti, and I'm wondering if I should get a second one of those and run it in SLI or get a single 570 Ti.

Advice?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
1) No.
2) There is no such thing as a GTX 570 Ti.
3) Buy a 560Ti. Maybe that's what you meant to say? Do that and try to sell the old card.

RomaVictor
Jan 14, 2008
Above all things, truth beareth away the victory.

Dogen posted:

1) No.
2) There is no such thing as a GTX 570 Ti.
3) Buy a 560Ti. Maybe that's what you meant to say? Do that and try to sell the old card.

I meant GTX 570. That card does exist. And it's better than 560.

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."
If I were being offered a 5870 for $140 I'd snap it up in a second, assuming I had an okay power supply. The 5850/5870 may have poorer tessellation performance, but they are excellent cards. I bought a 5850 when they briefly popped back up in retail channels months and months ago for something like $180, and with overclocking it matches the 560 Ti/6950. Generally, buying old video cards is throwing your money away (especially in the case of Crossfire), because new generations fill the old price points with better performance once you account for price drops since launch (usually about a hundred bucks) - but the 6xxx-series equivalents of the 58xx series are currently selling for only about fifty bucks less than the 58xx cards' original MSRP.

Srebrenica Surprise fucked around with this message at 01:38 on Nov 2, 2011

Aphrodite
Jun 27, 2006

Edit: Blah, wrong thread sorry.

Aphrodite fucked around with this message at 02:57 on Nov 2, 2011

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

RomaVictor posted:

I meant GTX 570. That card does exist. And it's better than 560.

Please read the extensive post in the OP of the parts picking megathread that explains why you probably don't need a 570 and should buy a 560ti.

illcendiary
Dec 4, 2005

Damn, this is good coffee.
I have a netbook with a broken screen. I can use an external monitor with it, but I want to be able to use the external monitor when I boot up the computer, so that I can install a fresh copy of Windows 7. Does anyone know how I might go about accomplishing this?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

illcendiary posted:

I have a netbook with a broken screen. I can use an external monitor with it, but I want to be able to use the external monitor when I boot up the computer, so that I can install a fresh copy of Windows 7. Does anyone know how I might go about accomplishing this?

What model netbook? There should be an option in the BIOS, but that's not much help if you can't see the screen to change it. Though you might be able to stumble through it by dead reckoning if you knew exactly where the option was.

You might also check to see if a key combination like Fn-F7 triggers a screen switch (just an example - look at the keyboard/the manual for the system).

illcendiary
Dec 4, 2005

Damn, this is good coffee.

Factory Factory posted:

What model netbook? There should be an option in the BIOS, but that's not much help if you can't see the screen to change it. Though you might be able to stumble through it by dead reckoning if you knew exactly where the option was.

You might also check to see if a key combination like Fn-F7 triggers a screen switch (just an example - look at the keyboard/the manual for the system).

It's a two-year-old HP Mini 110 that I bought at Costco. I'm trying to get rid of it, and I feel like it'd be smarter to do that with a fresh install.

Fn+F2 is the key combination that works on this little netbook, but it only works once I'm logged into Windows. I'll keep poking around Google to see if I can find a BIOS layout so maybe I can enter the keystrokes blind.

HornyBoy123
Mar 4, 2005

illcendiary posted:

It's a two-year-old HP Mini 110 that I bought at Costco. I'm trying to get rid of it, and I feel like it'd be smarter to do that with a fresh install.

Fn+F2 is the key combination that works on this little netbook, but it only works once I'm logged into Windows. I'll keep poking around Google to see if I can find a BIOS layout so maybe I can enter the keystrokes blind.

I work in IT with these all the time--I'd be happy to replace the screen for you for the cost of the parts plus $10 or so :)

We charge $65 in the shop.

illcendiary
Dec 4, 2005

Damn, this is good coffee.

HornyBoy123 posted:

I work in IT with these all the time--I'd be happy to replace the screen for you for the cost of the parts plus $10 or so :)

We charge $65 in the shop.

Thanks for the offer! How much does a 10.1", 1024x600 LCD usually cost?

I recently bought a MacBook Air, so I'm not exactly hurting to get this one fixed. Just wanted to be able to get rid of it for $80-100 or so. Mostly just want it off my hands.

HornyBoy123
Mar 4, 2005

illcendiary posted:

Thanks for the offer! How much does a 10.1", 1024x600 LCD usually cost?

I recently bought a MacBook Air, so I'm not exactly hurting to get this one fixed. Just wanted to be able to get rid of it for $80-100 or so. Mostly just want it off my hands.

~$48 with a 3-year warranty

Stump Truck
Nov 26, 2007
Why? Yes
I asked this in the system building thread, but maybe this is a better place for it. I bought a motherboard that came with "2 x SATA 6GB/sec cables", but it turns out the HDD I bought is SATA 3GB/sec. Do I need to buy new cables for the hard drive, or will any SATA cable work with any speed of SATA hardware?

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Stump Truck posted:

I asked this in the system building thread, but maybe this is a better place for it. I bought a motherboard that came with "2 x SATA 6GB/sec cables", but it turns out the HDD I bought is SATA 3GB/sec. Do I need to buy new cables for the hard drive, or will any SATA cable work with any speed of SATA hardware?
They are backwards-compatible.

scanlonman
Feb 7, 2008

by R. Guyovich
What's the best way to connect TWO external monitors to a laptop with one VGA port?

What's the best adapter, or method?

Nintendo Kid
Aug 4, 2011

by Smythe

scanlonman posted:

What's the best way to connect TWO external monitors to a laptop with one VGA port?

What's the best adapter, or method?

None. I mean you can use a splitter but that degrades quality and would only give you two monitors showing the same thing.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

scanlonman posted:

What's the best way to connect TWO external monitors to a laptop with one VGA port?

What's the best adapter, or method?

A USB video adapter, which is slow
A PC Card video adapter, which is also slow
A Matrox DualHead2Go, which can be weird because it makes two monitors act as one big wide one. So two 19" LCD displays would be seen as one big 2560x1024 screen.

I have one of those for sale if you're interested ;) They're expensive as all hell now.

scanlonman
Feb 7, 2008

by R. Guyovich

Bob Morales posted:

A USB video adapter, which is slow
A PC Card video adapter, which is also slow
A Matrox DualHead2Go, which can be weird because it makes two monitors act as one big wide one. So two 19" LCD displays would be seen as one big 2560x1024 screen.

I have one of those for sale if you're interested ;) They're expensive as all hell now.

So, no "true" way to have two monitors running 1080p?

Send me an email about your Matrox DualHead2Go: scanlonman at gmail dot com

Nintendo Kid
Aug 4, 2011

by Smythe

scanlonman posted:

So, no "true" way to have two monitors running 1080p?


Yeah not a chance. Your laptop's video card wouldn't be able to handle that either, in all likelihood, even with the proper ports.

Athropos
May 4, 2004

"Skeletons are Number One! Flesh just slows you down."
I recently added a second HD5870 to my system as a fairly cheap video card upgrade but I think I'm having some problems. In crossfire, my two HD5870s seem to drop in usage when things get hectic in games, most noticeably in Battlefield 3 and in Shogun 2, which causes framerate drops.

If I look at something quiet I'll get the usual 100% usage per GPU and my framerate will skyrocket, but when poo poo gets real I'll tend to drop to about 60% usage on each card and then my FPS will take a 30ish fps dive or so. I don't really understand why my cards think it's a good idea to take a break when they are most needed. I'm using Catalyst 11.10 and CAP4. Monitoring shows temps and CPU usage are all fine; temperatures on the GPUs never exceed 65C, and CPU usage stays at about 60-80% in most demanding games, except Bad Company 2, which ramps it up to 100% usage.
My system:

i5 750 OCed to 4GHz
2 x HD5870
8GB of DDR3
MSI P55-GD65 motherboard - http://www.msi.com/product/mb/P55-GD65.html
650W Corsair PSU (52A on the 12V rail)

Running about 3 case fans and an aftermarket CPU fan, 1 HDD and 1 DVD-R.

I'm starting to think the culprit might be a lack of power from my PSU or something wrong with the motherboard's management of two cards in crossfire. But I'm no expert. Any help? Battlefield 3 stutters like a motherfucker since my GPUs drop loads randomly, but other games tend to be much better.
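(Aside: one quick way to see whether a single maxed-out thread, rather than total CPU load, is holding the GPUs back is to log per-core usage while the game runs. A rough Python sketch below, assuming the third-party psutil package is installed; the sample count and interval are arbitrary.)

code:
# Rough per-core CPU logger; run it in a second window while the game is up.
# If one core sits near 100% while the average hovers at 60-80%, the game is
# likely limited by a single thread rather than overall CPU power.
import psutil

SAMPLES = 60  # number of one-second samples to take (arbitrary)

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    overall = sum(per_core) / len(per_core)
    print("overall {:5.1f}%  cores: {}".format(
        overall, "  ".join("{:5.1f}".format(c) for c in per_core)))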

Farecoal
Oct 15, 2011

There he go
So, reporting back with my Radeon HD 6670 connected to my 1080p TV with an HDMI cable. It works, but there's a black box all around the display and everything is really muddy; all the colors bleed together and stuff. Is there some obvious thing I'm missing?

Nintendo Kid
Aug 4, 2011

by Smythe

Farecoal posted:

So, reporting back with my Radeon HD 6670 connected to my 1080p TV with an HDMI cable. It works, but there's a black box all around the display and everything is really muddy; all the colors bleed together and stuff. Is there some obvious thing I'm missing?

Your video card driver probably has a default setting that compensates for overscan, which scrunches everything into the middle. Go into the settings; there should be a way to set that to "0%" or something like that to turn it off.

Farecoal
Oct 15, 2011

There he go

Install Gentoo posted:

Your video card driver probably has a default setting that compensates for overscan, which scrunches everything into the middle. Go into the settings; there should be a way to set that to "0%" or something like that to turn it off.

I found an underscan/overscan option bar in the Catalyst Control Center, but it looks crappy when set to 0%. The screen's slightly cut off and still looks muddy & blurry.

Farecoal fucked around with this message at 05:30 on Nov 4, 2011

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Farecoal posted:

I found an underscan/overscan option bar in the Catalyst Control Center, but it looks crappy when set to 0%. The screen's slightly cut off and still looks muddy.
Yeah, that's why overscan compensation exists: TVs usually don't display the outer edge of the image. You might want to verify your TV's actual physical resolution and set your display resolution to that if possible (they often use panels with a slightly different resolution than the signal they're meant for, for example a 1366x768 panel for a 1280x720 signal), but that won't fix the fact that the edges are cut off unless your TV has an option to disable overscan entirely (good TVs often have that, possibly in a service menu). These are some of the reasons why I mentioned that using a TV as a monitor is usually a bad idea.
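(Aside: a rough sketch of the scaling arithmetic behind the "muddy" look, using the 1366x768-panel / 1280x720-signal example from the post above. The numbers are just those examples, nothing measured.)

code:
# Scale factor a 1366x768 TV panel has to apply to a 1280x720 signal.
panel_w, panel_h = 1366, 768
signal_w, signal_h = 1280, 720

print("horizontal scale: {:.3f}".format(panel_w / signal_w))  # ~1.067
print("vertical scale:   {:.3f}".format(panel_h / signal_h))  # ~1.067
# A non-integer factor like this smears each source pixel across parts of
# neighbouring panel pixels, which is part of why text and fine detail look
# soft unless the signal matches the panel's native resolution.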

Farecoal
Oct 15, 2011

There he go

Alereon posted:

Yeah, that's why overscan compensation exists: TVs usually don't display the outer edge of the image. You might want to verify your TV's actual physical resolution and set your display resolution to that if possible (they often use panels with a slightly different resolution than the signal they're meant for, for example a 1366x768 panel for a 1280x720 signal), but that won't fix the fact that the edges are cut off unless your TV has an option to disable overscan entirely (good TVs often have that, possibly in a service menu). These are some of the reasons why I mentioned that using a TV as a monitor is usually a bad idea.

Actually, I fixed the cut-off issue by setting it slightly above 0%.

Farecoal fucked around with this message at 22:13 on Nov 4, 2011

Athropos
May 4, 2004

"Skeletons are Number One! Flesh just slows you down."

Athropos posted:

I recently added a second HD5870 to my system as a fairly cheap video card upgrade but I think I'm having some problems. In crossfire, my two HD5870s seem to drop in usage when things get hectic in games, most noticeably in Battlefield 3 and in Shogun 2, which causes framerate drops.

If I look at something quiet I'll get the usual 100% usage per GPU and my framerate will skyrocket, but when poo poo gets real I'll tend to drop to about 60% usage on each card and then my FPS will take a 30ish fps dive or so. I don't really understand why my cards think it's a good idea to take a break when they are most needed. I'm using Catalyst 11.10 and CAP4. Monitoring shows temps and CPU usage are all fine; temperatures on the GPUs never exceed 65C, and CPU usage stays at about 60-80% in most demanding games, except Bad Company 2, which ramps it up to 100% usage.
My system:

i5 750 OCed to 4GHz
2 x HD5870
8GB of DDR3
MSI P55-GD65 motherboard - http://www.msi.com/product/mb/P55-GD65.html
650W Corsair PSU (52A on the 12V rail)

Running about 3 case fans and an aftermarket CPU fan, 1 HDD and 1 DVD-R.

I'm starting to think the culprit might be a lack of power from my PSU or something wrong with the motherboard's management of two cards in crossfire. But I'm no expert. Any help? Battlefield 3 stutters like a motherfucker since my GPUs drop loads randomly, but other games tend to be much better.

For anyone wondering, I've narrowed it down to two things. In Battlefield 3, the "RenderDevice.ForceRenderAheadLimit 1" command in the console has completely stopped my horrible stuttering and impending crashes. The game now behaves as it should. In single player, performance is stellar; in multiplayer on 64-player servers I'm getting CPU-bottlenecked by all the poo poo that is going on, and that leaves about enough processing power for 60-70% GPU load on both cards, which is enough for high settings without AA at 60+ fps in extremely busy areas.

So yeah, it's my CPU.
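(Aside: if you'd rather not retype that in the console every session, the commonly reported approach is to drop the command into a user.cfg text file in the game's install folder so it runs at startup. A minimal sketch of that file's contents, nothing more; the file name and location are the usual community-reported ones, not something confirmed here.)

code:
RenderDevice.ForceRenderAheadLimit 1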

Athropos fucked around with this message at 07:02 on Nov 4, 2011

scanlonman
Feb 7, 2008

by R. Guyovich

scanlonman posted:

What's the best way to connect TWO external monitors to a laptop with one VGA port?

What's the best adapter, or method?


It looks like this:
Kensington Universal Multi-Display Adapter

http://www.amazon.com/Kensington-Universal-Multi-Display-Adapter-Black/dp/B002F9NSMQ/ref=cm_cmu_pg__header

Works perfectly for what I need, and the pictures look exactly like what I want. Or am I crazy?

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

scanlonman posted:

It looks like this:
Kensington Universal Multi-Display Adapter

http://www.amazon.com/Kensington-Universal-Multi-Display-Adapter-Black/dp/B002F9NSMQ/ref=cm_cmu_pg__header

Works perfectly for what I need, and the pictures look exactly like what I want. Or am I crazy?

Works, but it will be slow. The advantage is you get an independent screen, compared to devices like the Matrox that split one output across two monitors.

http://www.tomshardware.com/reviews/add-a-monitor-usb,1054.html

There used to be a good demo on YouTube of a USB VGA adapter where you could see how choppy it was. Works fine for spreadsheets or another screen to read documentation or whatever.

scanlonman
Feb 7, 2008

by R. Guyovich

Bob Morales posted:

Works, but it will be slow. The advantage is you get an independent screen, compared to devices like the Matrox that split one output across two monitors.

http://www.tomshardware.com/reviews/add-a-monitor-usb,1054.html

There used to be a good demo on YouTube of a USB VGA adapter where you could see how choppy it was. Works fine for spreadsheets or another screen to read documentation or whatever.

Yeah, the person that wants it won't be doing any gaming whatsoever. Just an internet/office setup.
