Party Plane Jones
Jul 1, 2007

by Reene
Fun Shoe

SALT CURES HAM posted:

So I just set up a new computer with an AMD/ATI R7 250X, and overall this thing kicks rear end, but I'm running into a weird little problem. Whenever I run a game in 720p (or even 1080p, with Crysis and Metal Gear Rising: Revengeance) on my TV over HDMI, I get a decent amount of underscan - not enough to be a huge problem but definitely noticeable.

I tried loving with the scaling settings in Catalyst Control Center but none of them seem to work. Changing my desktop resolution to 720p, playing a game in 720p, and changing it back solves the problem but it's kind of a giant pain and doesn't fix Crysis or MGR. Anyone else have this problem or find out a fix?

Changing the scaling settings in CCC should have fixed it; underscan on HDMI cables for AMD cards has been around for at least a couple years. It's definitely hugely annoying.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
You gotta change it for every refresh rate and resolution, it's annoying as poo poo. It's like ATI thinks everyone still owns DLP TVs or something.

life is killing me
Oct 28, 2007

I had to do it, too.

Had hoped that the new Omega drivers would help, but they probably don't, and I haven't had the nuts to update just yet anyhow.

Radio Talmudist
Sep 29, 2008
So how's downsampling on AMD cards? I have an R9 290, not sure if it can handle it well. I'm also curious if I can use it on a 1080p display, or if you can only downsample on a higher resolution display

1gnoirents
Jun 28, 2014

hello :)

Radio Talmudist posted:

So how's downsampling on AMD cards? I have an R9 290, not sure if it can handle it well. I'm also curious if I can use it on a 1080p display, or if you can only downsample on a higher resolution display

Traditionally, higher-end AMD cards handled higher resolutions better anyway (compared to the Nvidia cards of the time); a 290 was at one point considered overkill for 1080p, so I'd go hog wild and try it out. Yes, you can use it on a 1080p display; that's really what it's for. In fact, it's unrealistic to use it for higher-resolution displays.
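Mechanically, downsampling just means rendering internally at a higher resolution and filtering the result down to the panel's native resolution. A minimal sketch of the 2x case, assuming a simple box filter (real drivers use fancier filters, but the principle is the same):

```python
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel (box filter)."""
    h, w = frame.shape[:2]
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even for a clean 2x downsample"
    return frame.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

# Render internally at 4x the pixel count of a 1080p target...
hi_res = np.random.rand(2160, 3840, 3)   # stand-in for a 3840x2160 RGB frame
lo_res = downsample_2x(hi_res)           # ...and this is what reaches the 1080p display
print(lo_res.shape)                      # (1080, 1920, 3)
```

Each output pixel averages four rendered samples, which is why downsampling on a 1080p panel behaves like supersampled anti-aliasing: smoother edges, at the cost of rendering four times the pixels.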

The_Franz
Aug 8, 2003

Party Plane Jones posted:

Changing the scaling settings in CCC should have fixed it; underscan on HDMI cables for AMD cards has been around for at least a couple years. It's definitely hugely annoying.

There's a registry fix to do this permanently for all resolutions.

http://blog.ktz.me/?p=323

Doctor Albatross
Jul 7, 2008

I prescribe a dirt bath and a diet of freshwater karp.
So I'm currently running a 660ti; for about a year I was running two 27" 1080p monitors and recently I picked up a single 27" 1440p one (BenQ BL2710) because it was massively discounted. As you can imagine, my 660ti isn't exactly up to snuff to run brand new games at 1440p, so I'm stuck at 1080p or playing older games for now. I made the purchase knowing I would be upgrading my GPU soon - and I intend to pick something up in the next week and a half. It just depends on what.

I'll be visiting the US for a few weeks from New Zealand, and over there I can get two GTX 970s for the price of a single one over here. I'm strongly considering getting two and running them in SLI (I checked, my motherboard supports SLI), but am I going overboard? Would I be better off just getting one? I do intend to rebuild my entire system next year, and figured the 970(s) would be nice to keep for the new rig.

1gnoirents
Jun 28, 2014


Doctor Albatross posted:

So I'm currently running a 660ti; for about a year I was running two 27" 1080p monitors and recently I picked up a single 27" 1440p one (BenQ BL2710) because it was massively discounted. As you can imagine, my 660ti isn't exactly up to snuff to run brand new games at 1440p, so I'm stuck at 1080p or playing older games for now. I made the purchase knowing I would be upgrading my GPU soon - and I intend to pick something up in the next week and a half. It just depends on what.

I'll be visiting the US for a few weeks from New Zealand, and over there I can get two GTX 970s for the price of a single one over here. I'm strongly considering getting two and running them in SLI (I checked, my motherboard supports SLI), but am I going overboard? Would I be better off just getting one? I do intend to rebuild my entire system next year, and figured the 970(s) would be nice to keep for the new rig.

Probably the wrong thread, but consider a 980? It keeps you at a single card but runs 1440p very nicely (or perfectly). It's hard to say SLI would be "overboard", since there are settings, particularly in new games, that will bring almost anything to its knees, but it would be an excess of GPU power in almost every scenario, with the added (fairly minor) downsides of running two cards. At the moment even a single (OC'd) 970 is very good for 1440p. A 980 is likely to be perfect imo (with an OC especially).

Edit: on second thought I'd probably still go with 970 SLI if I had to choose and the difference in $ wasn't a big deal

1gnoirents fucked around with this message at 23:14 on Dec 11, 2014

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy
For the AMD Crysis stuff, try making a custom resolution of 1920x1076. Every time I've had an AMD card and played any of the Crysis games on a TV I had to do that for everything to display properly. The same res has helped for another game or two over the years but no idea what they were off the top of my head.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
When is the 960 or 960ti going to come out? I've decided to wait for my tax refund in late winter or my birthday in April to upgrade from my 660ti since I'm only gaming at 1080p. I'm hoping for a 960 or 960ti that gets 85-90 percent of the 970's performance but for 200-250 dollars.

Edit: My CPU would still be an i5-2500K, so that wouldn't bottleneck a new card, would it?

spasticColon fucked around with this message at 04:29 on Dec 12, 2014

Don Lapre
Mar 28, 2001


spasticColon posted:

When is the 960 or 960ti going to come out? I've decided to wait for my tax refund in late winter or my birthday in April to upgrade from my 660ti since I'm only gaming at 1080p. I'm hoping for a 960 or 960ti that gets 85-90 percent of the 970's performance but for 200-250 dollars.

Edit: My CPU would still be an i5-2500K, so that wouldn't bottleneck a new card, would it?

It technically doesn't exist yet.

Rastor
Jun 2, 2001

I'm going to guess that a 960 will be a thing by the US tax deadline, but it's only a guess.

Captain Yossarian
Feb 24, 2011

All new" Rings of Fire"

Radio Talmudist posted:

So how's downsampling on AMD cards? I have an R9 290, not sure if it can handle it well. I'm also curious if I can use it on a 1080p display, or if you can only downsample on a higher resolution display

Works great. Monkeyed around with it a lot in The Vanishing of Ethan Carter. I'm currently back to running my 120 Hz monitor, so when I bump up the res using downsampling (say, to 2560x1440) the refresh rate automatically adjusts down to 60 Hz, which is expected. Works in Dark Souls II as well. Also tried it on the desktop and it works fine there too. Not having an Nvidia card, I imagine it's most useful in the same situations: games where you have horsepower to spare.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

spasticColon posted:

When is the 960 or 960ti going to come out? I've decided to wait for my tax refund in late winter or my birthday in April to upgrade from my 660ti since I'm only gaming at 1080p. I'm hoping for a 960 or 960ti that gets 85-90 percent of the 970's performance but for 200-250 dollars.

Edit: My CPU would still be an i5-2500K, so that wouldn't bottleneck a new card, would it?

The rumor mill is saying that the 960 is going to be a 2GB, 128-bit-bus card, which is not promising at all and would make it basically half the card the 970 is, because the GDDR5 is already pretty wrung out on the 970. Even at 8 GHz effective memory clocks, that's just not a lot of memory bandwidth.

This could also be completely baseless, but nVidia has been leaking like a sieve lately, so there could be a grain of truth in it. The 970 is already a pretty cut-down 980; with how big the die is, I'd be surprised if the 960 were the same die.
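The bandwidth worry is easy to sanity-check: peak GDDR5 bandwidth is just the effective transfer rate times the bus width in bytes. (The 7 GHz effective / 256-bit figures for the 970 are its published specs; the 8 GHz / 128-bit figures are the rumor being discussed, not confirmed numbers.)

```python
def mem_bandwidth_gb_s(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: transfers per second times bytes per transfer."""
    return effective_clock_ghz * (bus_width_bits / 8)

# Rumored 960: 128-bit bus, even at a generous 8 GHz effective GDDR5 clock
print(mem_bandwidth_gb_s(8.0, 128))   # 128.0 GB/s
# GTX 970 for comparison: 7 GHz effective on a 256-bit bus
print(mem_bandwidth_gb_s(7.0, 256))   # 224.0 GB/s
```

So even with the faster memory clock, the rumored card would have barely over half the 970's bandwidth, which is the "half the card" point being made above.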

The_Franz
Aug 8, 2003

Oh look, AMD finally fixed their driver so it doesn't underscan on the HDMI port by default.

http://support.amd.com/en-us/kb-articles/Pages/GPU-5001.aspx

1gnoirents
Jun 28, 2014

Physx I'd actually want to see in real games

https://www.youtube.com/watch?v=1o0Nuq71gI4

I always thought fluids (and smoke, gas, etc) were always a sticking point in even the prettiest games where the developers always had to resort to pure visual tricks. This would be rad

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
Anyone know what the term is for when a texture "flickers" like it is partially transparent? I loaded up Battlefield 4 last night with settings on ultra to test out my gtx 970 and was noticing that. Haven't noticed it on other games like Far Cry 4.

Is that a config issue or just the game itself? It's mostly noticeable when moving the view around while looking at certain textures.

The GPU is not overclocked and is running the latest nvidia driver, etc.

Don Lapre
Mar 28, 2001


priznat posted:

Anyone know what the term is for when a texture "flickers" like it is partially transparent? I loaded up Battlefield 4 last night with settings on ultra to test out my gtx 970 and was noticing that. Haven't noticed it on other games like Far Cry 4.

Is that a config issue or just the game itself? It's mostly noticeable when moving the view around while looking at certain textures.

The GPU is not overclocked and is running the latest nvidia driver, etc.

Battlefield 4 does that on my 780ti sometimes

priznat
Jul 7, 2009


Don Lapre posted:

Battlefield 4 does that on my 780ti sometimes

I don't recall it doing it on my 580 but tbh I hadn't played it in ages.

So just :dice: then possibly eh ;)

1gnoirents
Jun 28, 2014

I heard that a lot about BF4 when it first came out. If I recall, turning down texture filtering one notch mostly clears it up. However, I've only seen it rarely; considering I have or had the same cards as others who reported this as a crippling issue, I assume it must not be the GPU specifically (i.e., not broken or something), or it's in combination with something else. I did see it, but I'd be surprised if I saw it even once in a few hours.

I noticed it a lot more with Borderlands 2.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to pair of really nice gaudy shoes?


1gnoirents posted:

Physx I'd actually want to see in real games

https://www.youtube.com/watch?v=1o0Nuq71gI4

I always thought fluids (and smoke, gas, etc) were always a sticking point in even the prettiest games where the developers always had to resort to pure visual tricks. This would be rad

But every user would have to have an Nvidia GPU, which doesn't exactly work out.

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf

Tab8715 posted:

But every user would have to have an Nvidia GPU, which doesn't exactly work out.

Nope. All games that support Physx have an option to turn it off for those that don't want to or can't use it.

1gnoirents
Jun 28, 2014


Tab8715 posted:

But every user would have to have an Nvidia GPU, which doesn't exactly work out.

Sometimes there is an option for AMD in games that use it today. I think it offloads it to the CPU, but the performance is fairly poor or really poor, depending. In some games it's just off if you have AMD, though. I think it has to do with the game engine.

Swartz
Jul 28, 2005

by FactsAreUseless

1gnoirents posted:

Sometimes there is an option for AMD in games that use it today. I think it offloads it to the CPU, but the performance is fairly poor or really poor, depending. In some games it's just off if you have AMD, though. I think it has to do with the game engine.

This is the correct answer. It utilizes the CPU if there's no Nvidia card, or replaces the effects with others if they're turned off.

Unreal Engine 4 makes heavy use of PhysX, and I'm sure they realize that not everyone uses Nvidia cards; it'll just offload to the CPU or replace the effects with others.

PhysX FleX looks really promising and I hope it gets implemented in UE4 along with Grassworks.
In terms of hair, I hate to say it, Nvidia, but AMD does it better with TressFX, and both vendors' cards are apparently supported, so they should go with that for hair rendering.
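The behavior described in the last two posts — GPU path if a supported card is present, CPU fallback otherwise, or drop the effects entirely when they're disabled — boils down to a dispatch like this. All names here are hypothetical illustration, not the actual PhysX API:

```python
from dataclasses import dataclass

@dataclass
class SystemInfo:
    has_nvidia_gpu: bool       # detected a CUDA-capable card
    effects_enabled: bool      # user toggled the extra-physics option in settings

def pick_physics_backend(info: SystemInfo) -> str:
    """Choose where (and whether) the optional physics effects run."""
    if not info.effects_enabled:
        return "disabled"      # effects omitted, or swapped for cheaper stand-ins
    if info.has_nvidia_gpu:
        return "gpu"           # hardware-accelerated path
    return "cpu"               # software fallback: correct, but much slower

print(pick_physics_backend(SystemInfo(has_nvidia_gpu=False, effects_enabled=True)))  # cpu
```

The design consequence wolrah raises below follows directly from this: because any of the three branches must produce a playable game, the effects can only ever be cosmetic.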

craig588
Nov 19, 2005

by Nyc_Tattoo

priznat posted:

Anyone know what the term is for when a texture "flickers" like it is partially transparent? I loaded up Battlefield 4 last night with settings on ultra to test out my gtx 970 and was noticing that. Haven't noticed it on other games like Far Cry 4.

Is that a config issue or just the game itself? It's mostly noticeable when moving the view around while looking at certain textures.

The GPU is not overclocked and is running the latest nvidia driver, etc.

I think it's Z-fighting, and there isn't really anything you can do about it. It's caused by rounding errors when two surfaces are incredibly close to each other.
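The rounding-error point is easy to demonstrate: two surfaces at nearly identical depths can quantize to the same depth-buffer cell, so neither reliably wins the depth test. A toy model, assuming a plain linear 16-bit depth buffer for simplicity (real GPUs use 24- or 32-bit non-linear depth, which changes where the problem shows up, not whether it does):

```python
def to_depth_cell(z: float, bits: int = 16) -> int:
    """Quantize a normalized depth value in [0, 1] to an integer depth-buffer cell."""
    return round(z * ((1 << bits) - 1))

surface_a = 0.5000000   # two polygons almost coplanar in normalized depth
surface_b = 0.5000001
# Both land in the same cell, so which surface "wins" can flip as the camera
# moves and the rounding shifts - hence the frame-to-frame flicker.
print(to_depth_cell(surface_a) == to_depth_cell(surface_b))   # True
```

This is also why the flicker is most visible while moving the view: the quantized depths of the two surfaces keep trading places as the numbers change slightly each frame.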

wolrah
May 8, 2006
what?

r0ck0 posted:

Nope. All games that support Physx have an option to turn it off for those that don't want to or can't use it.

Unfortunately, when you think about it, this is more of a limitation than you'd want as an end user. Everything you can do with PhysX has to have no practical impact on the game; basically, it's only allowed to be visual. Otherwise you'd have one game for nVidia users and another for everyone else. Obviously no major game developer would consider going that far, so the real gameplay capabilities of PhysX will exist almost entirely in nVidia tech demos and the occasional Tegra-exclusive title.

As a gamer I wish nVidia would open PhysX, but apparently they see more value in it remaining proprietary.

r0ck0
Sep 12, 2004
r0ck0s p0zt m0d3rn lyf
Yep, that is true. Never gonna change the game.

spasticColon
Sep 22, 2004


Twerk from Home posted:

The rumor mill is saying that the 960 is going to be a 2GB, 128-bit-bus card, which is not promising at all and would make it basically half the card the 970 is, because the GDDR5 is already pretty wrung out on the 970. Even at 8 GHz effective memory clocks, that's just not a lot of memory bandwidth.

This could also be completely baseless, but nVidia has been leaking like a sieve lately, so there could be a grain of truth in it. The 970 is already a pretty cut-down 980; with how big the die is, I'd be surprised if the 960 were the same die.

If that's true then I might just wait until the 20nm GPUs drop. Those are still coming out next year, correct?

I don't want to get a 970 then three months later a 20nm GTX975 is released for $299 because the video card industry is loving retarded.

This has already happened to me when I got my 660ti and then three months later the faster 760 was released and it was cheaper. :negative:

spasticColon fucked around with this message at 19:34 on Dec 13, 2014

1gnoirents
Jun 28, 2014

Is it ever not going to be exactly that way?

DarthBlingBling
Apr 19, 2004

These were also dark times for gamers as we were shunned by others for being geeky or nerdy and computer games were seen as Childs play things, during these dark ages the whispers began circulating about a 3D space combat game called Elite

- CMDR Bald Man In A Box
I recently sold my ASUS 670GTX DC2 4GB for only £20 less than what I paid for it brand new. Happy days! This was sold on eBay however and around 6 weeks later the buyer contacted me to say that he's only just got around to installing it and it doesn't seem to work. He says that the DVI port gives poor quality video and that the HDMI port flat out doesn't work.

As I sold it through my eBay company I couldn't refuse the return request as we have a 60 day policy. I personally thought he just didn't know how to use it properly and I'd be sending it right back. However it seems that although the DVI output is fine, the HDMI is actually goosed.

The computer will recognise the monitor, even auto set it for an extended desktop and get the model correct in the display label along with the correct resolution. But there is no output on screen whatsoever. I've now tried different monitors and cables, even computers and all give the exact same issues. Drivers, BIOS and all that jazz all updated.

The guy seems sincere and isn't chasing a refund or anything, and there are no obvious signs of damage to the port. Is there anything else I can try? Does this fault ring any bells with anyone?

Also the HDMI port was working fine before I sent it so I know it wasn't a DOA fault.

DarthBlingBling fucked around with this message at 20:17 on Dec 13, 2014

Tacier
Jul 22, 2003

Did they fix TressFX or something? I remember when Tomb Raider was released it tanked frame rates by like 30% and didn't even look that great.

1gnoirents
Jun 28, 2014


DarthBlingBling posted:

I recently sold my ASUS 670GTX DC2 4GB for only £20 less than what I paid for it brand new. Happy days! This was sold on eBay however and around 6 weeks later the buyer contacted me to say that he's only just got around to installing it and it doesn't seem to work. He says that the DVI port gives poor quality video and that the HDMI port flat out doesn't work.

As I sold it through my eBay company I couldn't refuse the return request as we have a 60 day policy. I personally thought he just didn't know how to use it properly and I'd be sending it right back. However it seems that although the DVI output is fine, the HDMI is actually goosed.

The computer will recognise the monitor, even auto set it for an extended desktop and get the model correct in the display label along with the correct resolution. But there is no output on screen whatsoever. I've now tried different monitors and cables, even computers and all give the exact same issues. Drivers, BIOS and all that jazz all updated.

The guy seems sincere and isn't chasing a refund or anything, and there are no obvious signs of damage to the port. Is there anything else I can try? Does this fault ring any bells with anyone?

Also the HDMI port was working fine before I sent it so I know it wasn't a DOA fault.

Sounds like it's hosed to me. There isn't a lot to do besides what you've done. I've seen a lot of physical HDMI damage like this, but you said you can't see any. Warranty :/

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Tacier posted:

Did they fix TressFX or something? I remember when Tomb Raider was released it tanked frame rates by like 30% and didn't even look that great.

There was a TressFX 2.0 used in the XB1/PS4 versions of Tomb Raider that was supposed to perform a lot better than the TFX 1.0 we got in the PC version; hopefully that tech is ported over into future TressFX-enabled games.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
Yeah, warranty that eBay card if you still can. Even if you get a returned refurbished card back, you always have more resale value with those anyway.

Spacewolf
May 19, 2014
OK, two interlinked questions I have:

1. I'm currently in Florida with parents until Jan 10th, but at home I have a brand new Dell 660 desktop (AKA an Inspiron 3847). It has an Intel HD Graphics 4-something graphics card. Power supply is 220v.

Without opening the machine up, because it's in New Jersey and I'm in Florida, how do I know if I can swap out the graphics card for a discrete model like an nVidia or ATI?

2. How the heck do I figure out which of the cards Staples is selling at a discount (Why Staples, because dad has a ton of staples rewards points he needs to use before the end of the year and the card should cost between $80-100 or so) is any good? Mind you, I'm not going for a high-end card (For one thing, I am in no case going to upgrade the power supply, just too expensive), but something that can do Paradox gaming, some Hitman/Splinter Cell, and the occasional space sim. The reason I want to swap out the Intel card for an nVidia or ATI card is because basically I've got a bad feeling about how an Intel card will play with software as time goes on. Also, hey, if I can get a better video card for <$100 (given discounts at staples.com), it seems like a no-brainer.

Provided, mind you, I can actually *install* said card - when we opened up the case when I first got it, weirdly... there were 2 slots with 4GB RAM chips in them; there were supposed to be 4 slots if I'm reading the Dell website re the service tag correctly. There was an empty expansion slot that to my dad *looked* like it could support a video card, but because my eyesight and dexterity are poor enough that I'd be a danger trying to mess with the internals, I wasn't looking myself. So what I'm wondering is: is that Intel card removable/replaceable independent of the motherboard and similar? Because I'd rather not juggle two video cards.

Edit: So, called Dell. Apparently there's only 2 RAM slots in my machine, nobody has any idea why it says 4 on the website. Re the graphics card, apparently I can plug it in via the PCI slot and it'll be detected and ignore the Intel card. Whee.

So, the prime question for this thread is: How do I figure out what'd be a good video card given the Inspiron 3847's power supply and my $100 price limit after discounts?

Spacewolf fucked around with this message at 16:15 on Dec 14, 2014
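For the prime question, the manual process is just filtering candidate cards by their recommended PSU wattage and the budget. A sketch of that filter with made-up entries — the card names, wattages, prices, and the 300 W PSU figure below are all illustrative placeholders, to be checked against the real 3847 PSU label and real listings:

```python
# All entries and numbers here are illustrative placeholders, not real quotes.
CANDIDATES = [
    {"name": "card A (low profile)", "recommended_psu_w": 300, "price_usd": 90},
    {"name": "card B",               "recommended_psu_w": 450, "price_usd": 95},
    {"name": "card C",               "recommended_psu_w": 300, "price_usd": 140},
]

def viable_cards(cards, psu_watts, budget_usd):
    """Keep cards whose recommended PSU fits the system and whose price fits the budget."""
    return [c["name"] for c in cards
            if c["recommended_psu_w"] <= psu_watts and c["price_usd"] <= budget_usd]

print(viable_cards(CANDIDATES, psu_watts=300, budget_usd=100))   # ['card A (low profile)']
```

The "recommended PSU" number on a card's spec sheet already bakes in headroom for the rest of the system, which is why it, rather than the card's own draw, is the figure to compare against.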

td4guy
Jun 13, 2005

I always hated that guy.

There is literally nothing worth spending money on at the Staples website for video cards.

Here's a performance comparison between the HD6450 and your onboard Intel.

If you were shopping at Amazon or NewEgg instead of Staples, you'd have a lot more options at probably, like, half the price.

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer
I'm getting a fair amount of what looks like textures flickering on and off on my MSI 970; it's been fine for a while with a slight overclock. I just set MSI Afterburner back to default but it still seems to happen?

At first I thought it was something to do with mods in ETS2, but it seems to be doing it in Assetto Corsa too: you see things flickering in the background, and the shadows all seem to be flickering also. Have I broken something, or is this some sort of driver setting?

Tried downgrading the drivers but that hasn't made any difference? Is this card hosed? The highest temps I've seen are around 70C, and that was running the Firestrike benchmark thingy.

track day bro! fucked around with this message at 18:41 on Dec 14, 2014

Don Lapre
Mar 28, 2001


td4guy posted:

There is literally nothing worth spending money on at the Staples website for video cards.

Here's a performance comparison between the HD6450 and your onboard Intel.

If you were shopping at Amazon or NewEgg instead of Staples, you'd have a lot more options at probably, like, half the price.

Staples price matches.

Zero VGS
Aug 16, 2002

Don Lapre posted:

Staples price matches.

More specifically and importantly, Staples.com price matches Amazon.


The_Franz
Aug 8, 2003

Don Lapre posted:

Staples price matches.

All of the $100-ish cards on the Staples website are horribly outdated and/or lowest-of-the-low-end junk that's actually a downgrade from a newer Intel GPU. Rewards points or not, it's a complete waste of money when you can get a GTX 750 or an R7 260 for about $100 on Newegg or Amazon.
