  • Locked thread
Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Factory Factory posted:

Probably about the time the integrated sound, network, and etc. on the motherboard got good enough that we didn't need a ton of expansion slots any more. And switching to broadband instead of dial-up modems. Remember when you used to pay a premium for 6 PCI slots instead of 5 because you really needed that last one?

Yeah, I remember being one of those people. And before that, fussing over available IRQs. Now? I have a non-video card plugged into my main system for the first time in years, and only because I was troubleshooting whether network problems were due to the onboard dual LAN dying.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Star War Sex Parrot posted:

It all sort of happened at once: multi-slot coolers, auxiliary power for GPUs, the rise of PCI Express (making GPU placement in the case a bit more flexible) and the better integration of more devices into the north bridge. Hell, we basically stopped calling it a north bridge around the same time because the south bridge disappeared. The relevancy of front-side bus died too.

I guess I shouldn't be so surprised that people who haven't built a PC in 8 years have no clue what's going on anymore.

Other way around on the northbridge/southbridge: AMD's processors absorbed the memory controller into the die first, then Nehalem followed suit with both the memory controller and the PCIe controller. AMD FX systems currently have a split northbridge (and BIOSes still refer to both the northbridge and the CPU-NB), and Intel systems call the "southbridge" the PCH because the northbridge is entirely on-die. Same functions, just not a separate chip. AMD APUs have die-integrated full northbridges too, so there the southbridge is now the FCH, the Fusion Controller Hub.

Longinus00
Dec 29, 2005
Ur-Quan

Civil posted:

While that card is passively cooled, it still requires additional power and has the case-heating issues that go along with that. I was hoping AMD would solve the problem at the chipset level rather than with an OEM solution that takes three slots because the heatsink is so massive.

This is the card I'm currently using. Remember when just about every video card looked like this? I'd like to see mid-range performance in a package this size. Pipe dream?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131339

The reason mid-range cards require additional power is that it takes all that power to reach mid-range performance. You may as well complain about how new mid-range CPUs require extra 12V headers on the motherboard. There's nothing you can currently change about the "chipset", whatever you mean by that, to fix it. Now, if you mean you want a card as fast as the mid-range of X years ago, then you're in luck.
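For context on where that additional power has to come from: the PCIe slot itself can only deliver so much, and anything hungrier needs auxiliary connectors. A quick sketch of the budget (the connector wattages are the PCIe CEM spec figures; the function name is just illustrative):

```python
# PCIe board power budget: 75 W from the slot, plus 75 W per 6-pin
# and 150 W per 8-pin auxiliary connector (PCIe CEM spec figures).
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def max_board_power(six_pins=0, eight_pins=0):
    """Maximum spec power for a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(max_board_power())            # slot only: 75 W
print(max_board_power(six_pins=1))  # one 6-pin: 150 W
print(max_board_power(six_pins=2))  # two 6-pin: 225 W
```

So any card pulling more than 75 W at load has to sprout connectors, no matter what the chipset does.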

Chuu
Sep 11, 2004

Grimey Drawer

HalloKitty posted:

That's probably true. Once you had all that space and nothing but a graphics card to fill it, the idea became less ridiculous.

I just built a low power rig for a computer dedicated to Quickbooks, and it's weird not even having to have a graphics card anymore. Does anyone even need a full sized ATX motherboard these days?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
You can get a full overclocking experience and dual video cards on an mATX board, and mini-ITX boards are just starting to get full overclocking, using laptop parts like mini-PCIe and mSATA to round out their feature sets. Managing heat can be crazy in such small boxes, but pretty much no, full ATX is not needed for most people.

hobbesmaster
Jan 28, 2008

Chuu posted:

I just built a low power rig for a computer dedicated to Quickbooks, and it's weird not even having to have a graphics card anymore. Does anyone even need a full sized ATX motherboard these days?

Workstations. Ridiculous hardware means you can tackle ridiculous problems.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Chuu posted:

I just built a low power rig for a computer dedicated to Quickbooks, and it's weird not even having to have a graphics card anymore. Does anyone even need a full sized ATX motherboard these days?

Nah, if I was going to build a machine for someone, and I was pretty sure their needs wouldn't require lots of slots, I'd definitely steer in the direction of a nice micro ATX board and case.
Something like the Fractal Design Arc Mini takes a full size ATX PSU, microATX board, and still has 6x 3.5" and 2x 5.25" bays. Even if they had a monster card, you only have to take out the middle cage, culling 3x 3.5". Very few people would need more space.

HalloKitty fucked around with this message at 09:27 on May 1, 2012

movax
Aug 30, 2008

HalloKitty posted:

Nah, if I was going to build a machine for someone, and I was pretty sure their needs wouldn't require lots of slots, I'd definitely steer in the direction of a nice micro ATX board and case.
Something like the Fractal Design Arc Mini takes a full size ATX PSU, microATX board, and still has 6x 3.5" and 2x 5.25" bays. Even if they had a monster card, you only have to take out the middle cage, culling 3x 3.5". Very few people would need more space.

:stare: And 7 fans, holy poo poo.

Gunjin
Apr 27, 2004

Om nom nom
You can never have enough space :colbert:

I'm an edge case and I know it.

zer0spunk
Nov 6, 2000

devil never even lived
I'm waiting on a 680 to be delivered; more than likely it'll be about a week. Is it even worth it to pop in an 8800 GTS 512 until then? It's a Z77/3770K setup, so it's got the HD 4000 IGP...

ultrabay2000
Jan 1, 2010


Only if you want to play games until the 680 arrives.

zer0spunk
Nov 6, 2000

devil never even lived
I've been trying to figure out how they fare against each other, but no one's exactly doing benchmarks comparing the two. It's like a two-minute install if that; the case is already open and such, but if it's pointless I might as well not bother.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

zer0spunk posted:

I'm waiting on a 680 to be delivered; more than likely it'll be about a week. Is it even worth it to pop in an 8800 GTS 512 until then? It's a Z77/3770K setup, so it's got the HD 4000 IGP...

I got by with my 2500K graphics for a week waiting on my card. It worked fine for desktop use, but worthless for any kind of games made in the last 5 years.

zer0spunk
Nov 6, 2000

devil never even lived

mayodreams posted:

I got by with my 2500K graphics for a week waiting on my card. It worked fine for desktop use, but worthless for any kind of games made in the last 5 years.

Yeah, IB is supposed to be 30-40% more powerful than the HD 3000 IGP in SB, taking it from total poo poo to somewhat usable. I'm just trying to see where it falls in line in terms of what dGPU it's comparable to. The little discussion I've seen has folks saying it's both more and less powerful than the 8800. If it's going to be about the same, I can just use the IGP until I can pop in the new dGPU.

ultrabay2000
Jan 1, 2010


I would think the 8800GT would be at least somewhat more powerful than the HD 4000. I know my 8800GTS could play Skyrim at 1080p pretty easily; I'm not so sure an HD 4000 could do that.

e: on second thought...

ultrabay2000 fucked around with this message at 21:05 on May 1, 2012

Mayne
Mar 22, 2008

To crooked eyes truth may wear a wry face.
The 8800GT is faster than the ATI HD 5570, which beats the HD 4000 by a wide margin in pretty much all tests.

JBark
Jun 27, 2000
Good passwords are a good idea.

ultrabay2000 posted:

I would think the 8800GT would be at least somewhat more powerful than the HD4000. I know my 8800GTS could play Skyrim at 1080p pretty easily, I'm not so sure if a HD4000 could do that.

e: on second thought...

I was bored at work today, so I put a 9800GT (virtually identical to an 8800GT) in our 3720QM test stand and ran the Lost Planet 2 DX9 benchmark with default settings at 720p.

3720QM - 21.1fps avg
9800GT - 51.5fps avg

Now, that's the 3720QM and not the 3770K, but they both have the HD 4000, and the only difference I could find between the two is that the 3720QM actually supports a higher max GPU frequency, 1250 MHz vs. 1150 MHz.

So yeah, still pretty terrible, but at least starting to get in the usable range for low res.
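For a rough sense of scale, the ratio between those two averages works out like this (a single benchmark run, so treat the multiplier as ballpark only):

```python
# Ratio of the 9800GT's average to the HD 4000's (3720QM iGPU),
# using the Lost Planet 2 DX9 720p averages quoted above.
hd4000_fps = 21.1
gt9800_fps = 51.5

speedup = gt9800_fps / hd4000_fps
print(f"9800GT is ~{speedup:.1f}x the HD 4000 in this run")
```

About a 2.4x gap, which matches the "usable for low res, but still terrible" read.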

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

JBark posted:

Same here, 8800GTX for $519 was my last video card purchase over 4.5 years ago (wow, long time!). I've stopped playing games because it's so drat slow at 1920x1200 it's not even worth it. Waiting with barely contained anticipation for details on the 670, so I can decide between that or a 7870 in my much needed new PC.

Edit:
Though the specs I'm hearing on the 660 Ti don't look too shabby at all for $250-ish.

My last video card purchase was a couple years ago: a GTX 465, which came to $190 with free shipping off Newegg after a $30 MIR. Best sub-$200 card I ever got; so far it's kept up with most of my games on the rig I built out of spare parts a friend let me have after I built him a new system (his old one had a bad motherboard, and he gave me the working parts in exchange for helping him set up the new one).

Can't complain, and now I have some leftover parts from a PC I worked on for my brother, which also had a bad motherboard and hard drive; he bought a new PC and gave me the old one to salvage. The only parts worth salvaging are the 4GB of DDR3-1333, the Phenom II X4-840 CPU and cooler, a wireless g/n card, and a combo DVD/CD optical drive; the rest is cheap junk (an old HP desktop with a flimsy 300W PSU and an off-brand motherboard with bulging/leaking capacitors).

Also, for anyone looking to get something decent that was last-gen: I've checked around, and places like Amazon still sell HD 5000-series AMD cards and 400-series Nvidia cards, and most aren't too bad price-wise. I was thinking about getting a 5850 or 5870 for when I eventually set up the Phenom system; I might keep my current rig as a backup or sell it later.

E: fixed CPU name to avoid confusion :shobon:

BOOTY-ADE fucked around with this message at 18:25 on May 2, 2012

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Original Phenom? Wasn't that thing dire? Surely you'd be better off just replacing it.

Edit: ah, apparently there is no X4-840, so it must be a Phenom II. That's not bad. Disregard this.
Double edit: wait, what? The Phenom II X4-840 only came out in November 2011, so the machine must still be in warranty... I don't know.

HalloKitty fucked around with this message at 14:57 on May 2, 2012

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT
^ No idea about warranty status - my brother had already contacted HP twice; the motherboard was replaced once and the hard drive once, and both went bad again in less than 6 months, so I guess he thought "gently caress it, might as well get a new one". I don't blame him: he's got a bunch of Dell towers that are still running strong, but he had nothing but issues with 3 different HP towers.

orange juche
Mar 14, 2012



Don't know if this is a good spot to post this, but I have an Intel Core i7 Sandy Bridge computer with a single AMD Radeon 6870 GPU. I was wondering if it would be more worthwhile/future-proof to just double down on the GPU and go CrossFire with another 6870, as they are about $170 on Newegg, or whether I should abandon the Barts architecture for the Southern Islands chips at ~$350.
Mind that Barts is a 40nm chip and overclocks like poo poo, whereas the GCN architecture in Southern Islands is 28nm, so the Southern Islands chips run cooler, use less power, and potentially have more room for overclocking. Just looking for an informed opinion from someone who has made the upgrade or is familiar with the performance difference.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
What is your screen resolution?

I upgraded from a single 6850 to two in CF for a 1920x1200 monitor before the 7000 series dropped. I don't think I'll ever do another dual-GPU setup as long as a single one can hit the kind of performance I want, but I like the performance and have had good luck with driver updates, so I'm happy with the pair. Buying new, the equivalent performance for me would be roughly a 7950, probably overclocked a bit, as 6850s overclock better than 6870s and mine are up pretty high.

GCN's major differences are a video encode engine (currently non-functional, as there's no driver support, and you have QuickSync on the IGP anyway), better GPGPU performance if you do Folding@home or some other compute task with your card, better long-idle power draw (which a desktop user won't get a lick of use out of unless you're using i-mode Virtu), and simultaneous stereoscopic 3D and Eyefinity.

But that phrase you used, "future-proof?" Can't be done. Don't try to do it.

orange juche
Mar 14, 2012



Well, as far as overclocking goes, the 6870 is almost impossible to overclock. Either I got a bad chip or something, because a mere 5-10 MHz bump on the core causes artifacts in 3DMark Vantage, and the same goes for Unigine Heaven (I have since read reviews on several tech sites, and they all come to the same conclusion: the 6870 has zero overclocking headroom).
Maybe future-proof was the wrong word to use, as I am aware I will have to upgrade to keep up, but I keep hearing bad poo poo about going with dual-card setups. I currently run a single 1920x1200 display and get respectable FPS (between 45 and 60) in Skyrim/BF3/other new poo poo, but I was hoping to go to Eyefinity at 5760x1200. Does anyone know how much GPU horsepower would be required for that? Is it even possible on a single card other than the insanely expensive (ha) 7970?

E: It seems the limiting factor in Eyefinity is not the processing power of the GPU but the amount of RAM onboard. Looks like the 6870 is plenty powerful enough if it's paired with 2GB of RAM, so I just have to look for a replacement pair of those or a 7870 2GB edition. :cripes: RAM is expensive on video cards, though.

orange juche fucked around with this message at 01:30 on May 3, 2012

Longinus00
Dec 29, 2005
Ur-Quan

sh1gman posted:

Well, as far as overclocking goes, the 6870 is almost impossible to overclock. Either I got a bad chip or something, because a mere 5-10 MHz bump on the core causes artifacts in 3DMark Vantage, and the same goes for Unigine Heaven (I have since read reviews on several tech sites, and they all come to the same conclusion: the 6870 has zero overclocking headroom).
Maybe future-proof was the wrong word to use, as I am aware I will have to upgrade to keep up, but I keep hearing bad poo poo about going with dual-card setups. I currently run a single 1920x1200 display and get respectable FPS (between 45 and 60) in Skyrim/BF3/other new poo poo, but I was hoping to go to Eyefinity at 5760x1200. Does anyone know how much GPU horsepower would be required for that? Is it even possible on a single card other than the insanely expensive (ha) 7970?

E: It seems the limiting factor in Eyefinity is not the processing power of the GPU but the amount of RAM onboard. Looks like the 6870 is plenty powerful enough if it's paired with 2GB of RAM, so I just have to look for a replacement pair of those or a 7870 2GB edition. :cripes: RAM is expensive on video cards, though.

How badly do you want that extra performance? Upgrading after just one generation is usually not a worthwhile investment, especially since the 7800 series is so much more expensive than the 6800 series. If you really want to move up and 'future-proof', then you might as well go all out and splurge on one of the 7900 series. I happen to have a 6870, and I have no problem OCing it to the factory-OC levels that manufacturers ship the more expensive cards with. I also have no problem getting 60fps at 1080p in Skyrim, but I don't play with the highest texture level that was patched in.

orange juche
Mar 14, 2012



Yeah, looking at what I have, I do fine at 1200p, but if I want to add two monitors to go 3x1 triple-monitor, I will either have to CrossFire in another card or go to the next generation up with a 2GB model. Can you CrossFire two 1GB cards and get decent FPS in 3x1 Eyefinity? If I can do that, then I can just stay with the 6870s.

orange juche fucked around with this message at 20:47 on May 3, 2012

Longinus00
Dec 29, 2005
Ur-Quan

sh1gman posted:

Yeah, looking at what I have, I do fine at 1200p, but if I want to add two monitors to go 3x1 triple-monitor, I will either have to CrossFire in another card or go to the next generation up with a 2GB model. Can you CrossFire two 1GB cards and get decent FPS in 3x1 Eyefinity? If I can do that, then I can just stay with the 6870s.

If you're serious about playing at multi-monitor resolutions, you pretty much have to go top of the line unless you're willing to sacrifice image quality settings from what you might normally be used to at 1920x1200. The AnandTech review of the 690 has 5760x1080 benchmark numbers if you want to get an idea of what some cards are capable of.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

sh1gman posted:

Can you CrossFire two 1GB cards and get decent FPS in 3x1 Eyefinity? If I can do that, then I can just stay with the 6870s.
No, the video RAM isn't combined when you CrossFire, so you'd still need a card with a minimum of 2GB of video RAM, preferably 3GB so your setup isn't immediately obsoleted when new games come out. This is the last generation of games where 1GB will be enough for 1080p, so it stands to reason it's also the last where 2GB will be enough for a large Eyefinity display. A 6870 wouldn't be fast enough for Eyefinity gaming anyway, so you should probably look at a Radeon HD 7950 3GB; they're actually remarkably good values right now.
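Some rough framebuffer arithmetic shows why resolution drives the VRAM requirement (this counts only one 32-bit color buffer per resolution; textures and render targets dominate real usage, so the absolute numbers are illustrative, not a sizing guide):

```python
# MiB for a single 32-bit (RGBA8) color buffer at a given resolution.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 ** 2)

single = framebuffer_mib(1920, 1200)     # one monitor
eyefinity = framebuffer_mib(5760, 1200)  # 3x1 Eyefinity

print(f"1920x1200: {single:.1f} MiB per buffer")
print(f"5760x1200: {eyefinity:.1f} MiB per buffer")
print(f"ratio: {eyefinity / single:.0f}x the pixels")
```

Triple the pixels means roughly triple the memory pressure on every resolution-sized resource, which is why 1GB cards fall over first at Eyefinity resolutions.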

texting my ex
Nov 15, 2008

I am no one
I cannot squat
It's in my blood
GTX 690 looks like a beast, but I'm happy that for once Anandtech had the 5970 on most comparison charts. It seems to still hold up fairly well, so I guess I'm not changing to the 680.

Dotcom656
Apr 7, 2007
I WILL TAKE BETTER PICTURES OF MY DRAWINGS BEFORE POSTING THEM
Recently I've been looking at buying a 2560x1440 display to run in a two-display setup with a 1920x1080 monitor as my secondary. Would a single 1GB 6870 be enough to drive that and run games like BF3, STALKER, etc. on the 2560x1440 monitor?

I might upgrade to a 680 if the 6870 isn't powerful enough.

the
Jul 18, 2004

by Cowcaster
edit: nm

the fucked around with this message at 20:33 on May 6, 2012

Mayne
Mar 22, 2008

To crooked eyes truth may wear a wry face.
The 6870 definitely isn't powerful enough to run newer games at 2560x1440.
Here's a BF3 chart; it gets 14fps average at that resolution.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
TweakTown has a performance preview of the GeForce GTX 670 2GB. Since the GTX 680 is such overkill, performance doesn't go down as much as you'd think, so depending on pricing this card looks like a great value.

GRINDCORE MEGGIDO
Feb 28, 1985


Lots of people on OCN wishing they hadn't bought 680s right now - http://www.overclock.net/t/1253432/gigabyte-gtx-670-oc-version-hands-on

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That would be the smallest performance delta for the largest price difference of any of their cards going back many generations. Seems surprising that they would basically make their own high end redundant.

KillHour
Oct 28, 2007


I don't get why they're so mad about it, though. Obviously, they knew they weren't going to get a good price/performance card for this generation. They paid the flagship tax and the "gotta have it now" surcharge, and they should have seen it coming.

Granted, they didn't know the 670 would be so close, but that doesn't diminish the performance of the cards they have.

If the $$$ to FPS ratio was acceptable to them when they bought it, it should still be now.

sethsez
Jul 14, 2006

He's soooo dreamy...

KillHour posted:

I don't get why they're so mad about it, though. Obviously, they knew they weren't going to get a good price/performance card for this generation. They paid the flagship tax and the "gotta have it now" surcharge, and they should have seen it coming.

Granted, they didn't know the 670 would be so close, but that doesn't diminish the performance of the cards they have.

If the $$$ to FPS ratio was acceptable to them when they bought it, it should still be now.

Normally I'd agree, but it really has to sting to know that they paid $100 more for essentially 1 frame per second in most games. I can't think of a time when the flagship has been this close to the next card down.

KillHour
Oct 28, 2007


Do we know that the MSRP is going to be $400 yet? With that showing, I wouldn't be surprised if it hit at $450.

sethsez
Jul 14, 2006

He's soooo dreamy...

It's just speculation right now, yeah. Still, either way it's an odd choice for Nvidia. Either they've got practically identical cards at noticeably different price points, which makes the more expensive one a complete sucker's buy, or they have practically identical cards at practically identical price points, which seems pointless.

Cryolite
Oct 2, 2006
sodium aluminum fluoride
Argh, where are their ~$200 cards? :argh:

answer: 6 months away :(

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

One thing to be cautious about: as new as the GK104 is, we're still in the land of driver-level enhancements that could be significant. Though you'd think "it's just a GTX 680 with one disabled SM" would mean the overclocking result is totally legit, I find it extremely peculiar that while the 670 was overclocked to stock GTX 680 speeds, a stock GTX 680 still lost out on the 3DMark 11 graphics subscore.

To me that says potential driver-level issues. It's also significant that the GTX 670 was overclocked to stock 680 speeds, while the GTX 680 obviously can exceed those by a substantial amount.

Still, as the real launch date approaches, I look forward to seeing what kind of press we get regarding the real performance of the cards, and whether nVidia offers any clarification that might help contextualize the scores. Given that nobody with an official unit is out of NDA yet, we're one part speculation, one part jaws-dropped. Just have to remember the speculation angle and keep our heads on straight 'til we find out more about the performance differences from reputable sources.

If it is indeed going to offer GTX 680 performance at a Radeon HD 7950 price, I'll be purchasing one, and it would take an absurdly good 660/660 Ti to do anything but match it on the price:performance curve.
