Ignoarints
Nov 26, 2010

veedubfreak posted:

I'll trade you my unused BF4 code for it :) shitty game for shitty game.

BF4 is my brainless jam. I've never had a single "netcode" problem, though if I did, I'd hate it. Also I own babies all day long, which makes it worth it. It's also my baseline game for testing GPUs, really.

edit: oh hello buddy :)

This thing is shockingly lightweight. My 770s were far heavier, and longer.

Ignoarints fucked around with this message at 22:14 on May 30, 2014


L0VE
May 3, 2010
I've recently been looking at video cards since the fans on my 7950 are dying, and I noticed that there's an overlap in price between some R9 290s and R9 290Xs. It seemed silly at first to choose the regular 290 over the X version (all models have aftermarket coolers); however, the biggest differences between the two cards seem to lie in the X version's extra shaders and higher clock frequency.

I know that some cards can unlock the extra shaders, but discounting that possibility more or less leaves us with a difference in core and memory clock speed. Since the great aftermarket coolers let both cards disregard temperature when overclocking, it seems to come down to the silicon lottery and binning to determine which card tops the performance graphs.

How much of a gamble is it to buy a premium-priced lower-tier card (Sapphire's Vapor-X or Toxic 290) and hope it will out-clock a regularly-priced higher-tier card (e.g. a 290X)? I imagine binning, this far into the production cycle, almost guarantees that any chip with fully functional shaders is made into a 290X.

Could the same be said for the GPU clock, though? Will a 290X inherently reach higher clocks? More aggressive binning for the regular 290 would definitely justify the higher price point; too bad I know fuck all about whether companies actually do this.

fake edit: Figured this topic was a better fit for this thread than the parts-picking one, since I'm not really looking for advice; I'm more interested in binning in general and how companies determine what goes into which card. I'll most likely hold my purchase until proper 20nm cards are released.

Dobermaniac
Jun 10, 2004
How much money would I be wasting by putting a new video card in my current computer? My processor is an Intel i5-750 (circa 2009), with 4GB RAM and a Radeon 5770 1GB. All I play is mostly WoW and a few older Steam games. I want to turn every setting up to the max. Is it worth purchasing a new video card, or should I wait and just upgrade my whole computer in about a year?

Processor: http://ark.intel.com/products/42915/Intel-Core-i5-750-Processor-8M-Cache-2_66-GHz
Video Card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814150447

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Purchase a good video card now and take it with you to your new computer next year.

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva
Depending on your power supply and CPU cooler you can probably overclock the CPU a bit also if you want to make up the difference somewhat.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

cisco privilege posted:

Depending on your power supply and CPU cooler you can probably overclock the CPU a bit also if you want to make up the difference somewhat.

Overclocking doesn't actually increase the maximum wattage a GPU/CPU can pull, does it? I thought only overvolting does that, but I'm probably wrong.

Arzachel
May 12, 2012

Zero VGS posted:

Overclocking doesn't actually increase the maximum wattage a GPU/CPU can pull, does it? I thought only overvolting does that, but I'm probably wrong.

Faster switching transistors are going to leak more power.

Arzachel fucked around with this message at 00:25 on May 31, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E

Zero VGS posted:

Overclocking doesn't actually increase the maximum wattage a GPU/CPU can pull, does it? I thought only overvolting does that, but I'm probably wrong.

It does. They both do and they both multiply on top of each other.
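As a rough sketch of why the two multiply: to first order, CMOS dynamic power scales as P ≈ C·V²·f, so a clock bump raises power roughly linearly while a voltage bump raises it quadratically. This is a simplified model that ignores the leakage Arzachel mentions, and `dynamic_power` and the wattages are purely illustrative:

```python
def dynamic_power(base_watts, clock_scale, volt_scale):
    """First-order CMOS dynamic power: P ~ C * V^2 * f.
    Clock and voltage multipliers compound on top of each other."""
    return base_watts * clock_scale * volt_scale ** 2

# Illustrative 250 W card:
dynamic_power(250, 1.10, 1.00)  # 10% overclock alone -> ~275 W (+10%)
dynamic_power(250, 1.10, 1.10)  # 10% OC plus 10% overvolt -> ~333 W (+33%)
```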

Ignoarints
Nov 26, 2010
780 Ti is pretty sweet. Exactly what I thought it'd be like: about 20% less fps than 770 SLI (in BF4, so far at least), but more playable when it dips into the 50s and 60s. Cool beans. Runs pretty cool too, at 70 degrees.

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

Ignoarints posted:

780 Ti is pretty sweet. Exactly what I thought it'd be like: about 20% less fps than 770 SLI (in BF4, so far at least), but more playable when it dips into the 50s and 60s. Cool beans. Runs pretty cool too, at 70 degrees.

What do you mean by that? Is there more microstutter with SLI or something in that range?

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb

Dobermaniac posted:

How much money would I be wasting by putting a new video card in my current computer? My processor is an Intel i5-750 (circa 2009), with 4GB RAM and a Radeon 5770 1GB. All I play is mostly WoW and a few older Steam games. I want to turn every setting up to the max. Is it worth purchasing a new video card, or should I wait and just upgrade my whole computer in about a year?

Processor: http://ark.intel.com/products/42915/Intel-Core-i5-750-Processor-8M-Cache-2_66-GHz
Video Card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814150447

I had an i5-750 and a GTX 460, and a few months ago I bought a GTX 770. I didn't want to order a new motherboard, CPU, and RAM at the time, so I just overclocked my i5 to like 4.2 GHz. I've been happy with the performance, but I'm still tempted to upgrade to an i5-4690K once it's released. I don't feel like I *have* to upgrade though; this setup should serve me well for a while longer.

So I'd say yes, it's definitely worth purchasing a new video card now!

Ignoarints
Nov 26, 2010

Hace posted:

What do you mean by that? Is there more microstutter with SLI or something in that range?

Oh yeah, I've sort of mentioned it before. It's probably the biggest practical downside I've noticed with the cards I've SLI'd so far, which are just 660 Tis and 770s. If I just use one 770, the performance is predictable in the sense that it suffers when I'm pushing the card to its limit. So if I have the settings at a point where it can only do 50 fps, it's still playable, just not the greatest.

However, if I ever manage to drag SLI down to 50 fps, it's far less playable. I guess I would just call it microstuttering. When they're pushed to the same limit, it's just less... pleasant to play.

It's difficult to really compare apples to apples, but a good example is Crysis 3: if I maxed out absolutely everything I could, it would drop me into the 50s in SLI and I simply wouldn't want to play it there. With a single card it's much worse, maybe in the 20s and low 30s, but it feels pretty much just as playable.

Fortunately, since you get so much more performance from SLI, it's not a big deal. But it's something I noticed. Since I was aware of the fps differences in benchmarks between 770 SLI and the 780 Ti, I guessed that even though I'd lose overall performance, a single card would probably handle the dips better and pick up some of the slack, making the loss not as bad. Since we're talking 90 fps down to 70 fps or so, it's still pretty high and in the enjoyable zone.

All that being said, 770 SLI is still a little better overall so far. But I'll see how it goes when I overclock, too.

BurritoJustice
Oct 9, 2012

780 Tis can quite commonly overclock to 690/770 SLI levels. The stock clocks are very conservative.

Ignoarints
Nov 26, 2010

BurritoJustice posted:

780 Tis can quite commonly overclock to 690/770 SLI levels. The stock clocks are very conservative.

Yeah, I'm going to really spend my time with it. Going to get a water cooling bracket and, if it's worth it, do a BIOS reflash. Who knows, maybe I won't even want to SLI later (ha).

Edit: I would believe overclocking to perhaps reference 770 SLI levels. My 770s were pretty banging though, at 1228 MHz and ~7600 memory. I'm not ruling it out, though; that would be awesome as hell.

Edit 2: After working out a max overclock of 1215 MHz core / 7440 MHz memory, this came out to a 5.5% improvement in 1440p Heaven scores (1028 -> 1088). The 770 SLI setup pulled in a score of 1166, although with a 20 fps higher max fps and a similar minimum. All in all I'm impressed: slightly above expectations in performance, and basically right in line with my reasoning above. Since I happened to get this card on sale at the same price as the two 770s, I'd probably have chosen this route just for simplicity if I'd had the choice at the beginning.

There was more memory overclock on the table, but I was starting to get zero returns from it, which is a first for me.

Ignoarints fucked around with this message at 21:45 on May 31, 2014

Shaocaholica
Oct 29, 2002

Guh, this shit is still being written:

Internet posted:

Furthermore, the frame buffer was doubled to 2GB though this will not boost its performance and can be addressed as a gimmick - marketing to lure consumers into paying more for the same product.

This was part of a mini review of an entry-level GT 740 with GDDR5. VRAM is a lot more important than just a framebuffer. And how can 2GB of GDDR5 be construed as a marketing gimmick in fucking 2014. :ughh:

Shaocaholica
Oct 29, 2002

In other news, I just got back a Dell Optiplex SFF loaner that's been out on loan for the last 4 years, and I totally forgot about the GPU in it. Turns out it's pretty ass:

Ignoarints
Nov 26, 2010

Shaocaholica posted:

Guh, this shit is still being written:


This was part of a mini review of an entry-level GT 740 with GDDR5. VRAM is a lot more important than just a framebuffer. And how can 2GB of GDDR5 be construed as a marketing gimmick in fucking 2014. :ughh:

Probably the 128-bit memory bus is what it's referring to.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Shaocaholica posted:

In other news, I just got back a Dell Optiplex SFF loaner that's been out on loan for the last 4 years, and I totally forgot about the GPU in it. Turns out it's pretty ass:


I like how it only has the equivalent of dual-channel DDR2-800 memory, about the same as integrated video even when it was new. The GPU itself is actually reasonably decent; paired with GDDR3 like on the HD 4670, it actually ran pretty well.
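For reference, the peak-bandwidth arithmetic behind that comparison is simple: bus width in bytes times transfer rate. This is a sketch; `bandwidth_gbs` is an illustrative helper, and the GT 740's ~5000 MT/s GDDR5 data rate is an assumption for comparison:

```python
def bandwidth_gbs(bus_width_bits, transfers_mts):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * transfers_mts / 1000

bandwidth_gbs(128, 800)   # dual-channel DDR2-800 equivalent: 12.8 GB/s
bandwidth_gbs(128, 5000)  # 128-bit GDDR5 at ~5000 MT/s: 80.0 GB/s
```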

Shaocaholica
Oct 29, 2002

So with all the issues Watch Dogs has on PC, how do the console versions compare?

Shaocaholica
Oct 29, 2002

What's the replacement for the 7750? I think the low-profile 7750 is the fastest single-slot low-profile card there is, but it's discontinued.

BurritoJustice
Oct 9, 2012

Shaocaholica posted:

What's the replacement for the 7750? I think the low-profile 7750 is the fastest single-slot low-profile card there is, but it's discontinued.

There is a single-slot-bracket, low-profile 750 Ti made by Galaxy which absolutely destroys the 7750. The cooler is quite chunky and intrudes into the second slot, though.

Shaocaholica
Oct 29, 2002


BurritoJustice posted:

There is a single-slot-bracket, low-profile 750 Ti made by Galaxy which absolutely destroys the 7750. The cooler is quite chunky and intrudes into the second slot, though.

Haha, awesome. I have a single-slot cooler I could salvage, thermals be damned. See what happens, I guess. I'll try to source that 750 Ti.

Fuzz1111
Mar 17, 2001

Sorry. I couldn't find anyone to make you a cool cipher-themed avatar, and the look on this guy's face cracks me the fuck up.

Dobermaniac posted:

How much money would I be wasting by putting a new video card in my current computer? My processor is an Intel i5-750 (circa 2009), with 4GB RAM and a Radeon 5770 1GB.
My missus has an i7 950 (same Nehalem architecture as your 750), and when I moved a GTX 670 from my PC (with a 3930K) to hers, I couldn't tell any difference in how well it ran.

An overclock will make a fair bit of difference in some games, but don't go too far with it if power delivery is a concern. I'd stop at the point where it needs a significant voltage increase (from stock) to go faster.

Shaocaholica
Oct 29, 2002

Thinking about getting that low-profile 750 Ti and fitting a single-slot heatsink to it, a heatsink I'll have to salvage. So which low-profile cards had the highest-performing single-slot heatsinks? 7750s? 6670s?

Spek
Jun 15, 2012

Bagel!
Is it possible to configure the way graphics cards upscale games running below native res? Specifically with Nvidia, but I'd be curious about ATI too. What I mean is, it currently seems to upscale with bilinear filtering or something similar, whereas I'd like linear/unfiltered/whatever you want to call it. I'd also like to be able to do closest-integer upscaling, i.e. if I tell it to run an 800x600 game on a 2560x1440 monitor, I'd like a 1600x1200 image centred on the monitor with simple, crisp pixel doubling, rather than the blurry 1920x1440 I currently get when I tell it to preserve aspect ratio, or worse if I don't.

I'd also love for it to upscale before sending the image to the OS, so Windows doesn't disorganize all the open windows on the primary monitor and every monitor to the right of it, as currently happens. That's the most annoying damned thing, and I can't believe it still happens more than a decade after I first ran into it.
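The closest-integer rule described above is a small calculation: take the largest whole-number factor that fits both dimensions. A sketch (the function name is illustrative, not any driver API):

```python
def integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest whole-number factor that still fits the monitor; the scaled
    image is then centred with borders rather than stretched and filtered."""
    scale = min(dst_w // src_w, dst_h // src_h)
    return scale, (src_w * scale, src_h * scale)

integer_scale(800, 600, 2560, 1440)  # -> (2, (1600, 1200)), as in the example
```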

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Shaocaholica posted:

Thinking about getting that low-profile 750 Ti and fitting a single-slot heatsink to it, a heatsink I'll have to salvage. So which low-profile cards had the highest-performing single-slot heatsinks? 7750s? 6670s?

Probably depends on the model, but the 6670 actually had the slightly higher TDP - 66W vs. the 7750's 55W. I'd look at 6670 coolers first.

future ghost
Dec 5, 2005

Most single-slot coolers are going to be custom-fitted to a given card's reference PCB. You might have better luck with one of Arctic Cooling's passive models assuming it'll fit - IIRC the S1 line is single-slot with the right mounting points.

Shaocaholica
Oct 29, 2002


cisco privilege posted:

Most single-slot coolers are going to be custom-fitted to a given card's reference PCB. You might have better luck with one of Arctic Cooling's passive models assuming it'll fit - IIRC the S1 line is single-slot with the right mounting points.

Gotta be low profile too :)

I've figured out all the details. The AMD mounting points are 45mm apart and the new Nvidia ones are 44mm. Tom's did a write-up on swapping heatsinks by reaming out the Nvidia holes by ~0.5mm, which should accommodate the AMD heatsink. I'll also need a mini fan cable extension; I was going to roll my own or just extend the cable, but luckily there's one on eBay for like $3 from China. I have a few low-profile cards lying around, so I'll start with those heatsinks before buying a 6670 just for its heatsink, but also luckily there's a dead low-profile 6670 on eBay right now as well.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
For when my Oculus Rift DK2 comes, I'll probably need a new graphics card. What's the newest semi-affordable thing du jour right now? The recommendations in the OP are a bit dated. Regarding (driver) stability, which of the two brands is the better choice? I heard Nvidia's drivers have been getting terrible lately. Is it worth going with the GTX versions of Nvidia cards?

Nephilm
Jun 11, 2009

by Lowtax

Combat Pretzel posted:

For when my Oculus Rift DK2 comes, I'll probably need a new graphics card. What's the newest semi-affordable thing du jour right now? The recommendations in the OP are a bit dated. Regarding (driver) stability, which of the two brands is the better choice? I heard Nvidia's drivers have been getting terrible lately. Is it worth going with the GTX versions of Nvidia cards?

Those are questions for the parts picking megathread.

Combat Pretzel
Jun 23, 2004

Okay. Besides that, are there any new cards coming this summer I should be aware of? --edit: I suppose the question relates to upcoming cards that support DX12 100%.

Combat Pretzel fucked around with this message at 21:29 on Jun 1, 2014

Don Lapre
Mar 28, 2001


Combat Pretzel posted:

For when my Oculus Rift DK2 comes, I'll probably need a new graphics card. What's the newest semi-affordable thing du jour right now? The recommendations in the OP are a bit dated. Regarding (driver) stability, which of the two brands is the better choice? I heard Nvidia's drivers have been getting terrible lately. Is it worth going with the GTX versions of Nvidia cards?

I think the recommendation for dk2 is a 780 or better.

Ignoarints
Nov 26, 2010

Combat Pretzel posted:

For when my Oculus Rift DK2 comes, I'll probably need a new graphics card. What's the newest semi-affordable thing du jour right now? The recommendations in the OP are a bit dated. Regarding (driver) stability, which of the two brands is the better choice? I heard Nvidia's drivers have been getting terrible lately. Is it worth going with the GTX versions of Nvidia cards?

Nvidia's drivers are terrible recently? :confused: I've found them unusually frequent and beneficial this year so far, personally. But I don't know if I just dodged some bullets along the way. I've had plenty of AMD cards but always preferred Nvidia's drivers in general.

Sorry, I don't know the answer to your question. But to anyone else: is the Oculus Rift favoring one brand over the other at the moment?

Combat Pretzel
Jun 23, 2004

I've come across bits of that, but it's the usual Internet conjecture. It's just that I was bitten hard by the 560 Ti debacle 2-3 years ago (or whenever they came out), so I'm a little wary when I read about supposedly slightly unstable drivers.

Shaocaholica
Oct 29, 2002


Combat Pretzel posted:

I've come across bits of that, but it's the usual Internet conjecture. It's just that I was bitten hard by the 560 Ti debacle 2-3 years ago (or whenever they came out), so I'm a little wary when I read about supposedly slightly unstable drivers.

I missed that. What happened?

Combat Pretzel
Jun 23, 2004

Plenty of 560 Ti cards seemed unstable back then. I had to return mine twice to get a stable one. Not sure what the exact reason was, but according to the Internet, plenty of people had problems. These chips were usually factory overclocked; that must have been related, because when I used nTune to clock them back down, they became more stable.

craig588
Nov 19, 2005

by Nyc_Tattoo
Yeah, at some point (I think it was around 314.xx) the driver found a code path through the 560 that a lot of the cards really didn't agree with. I'm not sure it's even 100% fixed yet; I think there are still a few workarounds people on 560s have to deal with. Even if it was only 5% of cards, it's really shitty to go from working to useless without any warning.

Support for current-generation stuff has remained good, but their attention to Fermi seems to have dropped off.

craig588 fucked around with this message at 01:09 on Jun 2, 2014

Alereon
Feb 6, 2004

Starting with the GTX 550 Ti, Nvidia has also had a few cards that used 192-bit memory buses with an even, power-of-two amount of VRAM, which resulted in 25% of the RAM being effectively wasted. On a 1024MB card, for example, you'd have 768MB accessible over the full 192-bit bus, but 256MB that was only reachable via one 64-bit channel, making it essentially useless. If a game actually tries to access data in this memory, performance tanks nearly as badly as if it weren't there. As I recall, they eventually released drivers that kept data out of this region to fix weird stuttering and other issues.
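A toy model of that split: the region interleaved across all channels gets full bandwidth, and whatever exceeds the smallest channel hangs off fewer channels. The 512+256+256 MB per-channel layout below is an assumption about how 1024 MB ends up on three 64-bit channels; real cards interleave per memory chip:

```python
def split_fast_slow(channel_mb):
    """channel_mb: VRAM attached to each memory channel, in MB.
    The part that spreads evenly across all channels runs at full bus
    width; the remainder is reachable through fewer channels and is
    far slower (simplified model)."""
    fast = min(channel_mb) * len(channel_mb)
    slow = sum(channel_mb) - fast
    return fast, slow

split_fast_slow([512, 256, 256])  # -> (768, 256): 768 MB fast, 256 MB slow
```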

Combat Pretzel
Jun 23, 2004

Good to know about the 192-bit bus, because it's still used in recent, more affordable cards. I suppose I'll be fine with a 7xx-series card, since Maxwell cards won't be coming in 2014 from what I just found out.

Come to think of it, I have a tendency towards Nvidia because their Linux and *nix drivers are more mature, in case I feel like defecting again.


Fame Douglas
Nov 20, 2013

by Fluffdaddy
Nvidia also has adaptive VSync, which is pretty aces.
