Mad_Lion
Jul 14, 2005

I don't see HIS in that list, but anecdotally, I got a very quality 7850 2GB from them. It's not their special ICEQ edition, either, just the regular kind with a sorta wimpy fan, but it will overclock to 1050/1350 on stock voltage with the power limit set to +20%, and 1100 if I use MSI afterburner (still on stock voltages).
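For context on how big that overclock is, here's a quick percentage check (assuming the HD 7850 reference clocks of 860 MHz core / 1200 MHz memory — verify against your own card's stock values):

```python
# Assumed HD 7850 reference clocks (MHz) -- check your own card's stock values.
ref_core, ref_mem = 860, 1200

for core, mem in [(1050, 1350), (1100, 1350)]:
    core_gain = core / ref_core - 1
    mem_gain = mem / ref_mem - 1
    print(f"{core}/{mem} MHz: core +{core_gain:.0%}, memory +{mem_gain:.0%}")
```

That puts 1050 MHz at roughly a 22% core overclock on stock voltage, which is solid for a non-ICEQ card with a wimpy fan.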


Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

Agreed posted:

Why don't we ever talk about PNY? They're apparently built like hammers, but it's all MSI and Asus and Gigabyte and XFX...

I don't know honestly, they're the lowest priced GTX stuff over anyone else right now, and they have a lifetime warranty if you register with them which no one else does. On B&H their 760 and 770 are $240 and $315 with no tax or shipping. I've got a ton of Dell.com credit and I'm gonna have them price match the 770s and see how that goes.

Hamburger Test
Jul 2, 2007

Sure hope this works!
My French is so rusty it may as well not exist, so going by the Google translation: it sounds like they attribute PNY's low return rate to their sales being mostly in the lower end, which generally has a lower failure rate?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Since we apparently like XFX again, maybe we should give PNY a shot. But originally it was anecdotal goon-evidence that waved us off PNY, the same as XFX.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I think there was more to disliking XFX than just anecdotal evidence, like some reviews showing their cards were non-reference designs cheaping out on power delivery etc.

simplefish
Mar 28, 2011

So long, and thanks for all the fish gallbladdΣrs!


Ghostpilot posted:

Though I really feel for anybody looking to pick up a video card right now.

Why do you say that? I'm putting together a new PC and can easily wait a while for a new graphics card if something big's around the corner.

If this is a stupid question, I apologise. I'm not current with what's going on for graphics cards - the last one I bought was the GeForce4 MX440. I've just had to go and look up what nVidia Greenlight was.

simplefish fucked around with this message at 16:51 on Jan 15, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

I think there was more to disliking XFX than just anecdotal evidence, like some reviews showing their cards were non-reference designs cheaping out on power delivery etc.

We also had that problem with Gigabyte. They seem to have cleaned up their act. I'm more suspicious of PNY just because, look, the poo poo ain't free to make, there's a reason everyone prices a given card around the same price, I just don't trust the discount... If they start doing volume like MSI or Asus or whatever then I guess we'll know for sure (for now) but in the meantime, I'm still kinda hesitant to recommend them.

Feel like XFX has a ways to go yet to clear their name but they had some really cool stuff on the AMD side with the dual bios 290 launches that, iirc, all flashed to 290x specs? I dunno, I remember veedubfreaking out about it (:ironicat:) or something like that...

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."

Dogen posted:

I think there was more to disliking XFX than just anecdotal evidence, like some reviews showing their cards were non-reference designs cheaping out on power delivery etc.
I don't even think you had to look further than the PCB. I still wouldn't buy an XFX AMD card without making 100% sure it's at least as good as the reference model, and I remember some issues with their rebate house and warranty service too.

beejay
Apr 7, 2002

simplefish posted:

Why do you say that? I'm putting together a new PC and can easily wait a while for a new graphics card if something big's around the corner.

If this is a stupid question, I apologise. I'm not current with what's going on for graphics cards - the last one I bought was the GeForce4 MX440. I've just had to go and look up what nVidia Greenlight was.

I think he was just saying that some AMD video cards are expensive now due to bitcoin/litecoin mining, and additionally that binning has eliminated the "free" unlock from 290 to 290X. Nothing that a "regular" user should have to worry about.

simplefish
Mar 28, 2011

So long, and thanks for all the fish gallbladdΣrs!


Thanks. I'm trying to lurk and learn more about the current state of the technology and market again.

You all do scary things with silicon.

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

simplefish posted:

Why do you say that? I'm putting together a new PC and can easily wait a while for a new graphics card if something big's around the corner.

If this is a stupid question, I apologise. I'm not current with what's going on for graphics cards - the last one I bought was the GeForce4 MX440. I've just had to go and look up what nVidia Greenlight was.

Beejay covered it already, but yeah, it was aimed more at anybody considering the AMD side of things. AMD typically has a better price-to-performance ratio than Nvidia (who typically has better features), but litecoin mining has thrown all of that out of whack. So now Nvidia is the cheaper way to go despite their prices remaining unchanged.

The dustup also raised prices on some aftermarket coolers, with Arctic's Accelero Xtreme III going up 10-15 bucks while the Gelid Icy Vision nearly doubled. Thankfully, the Gelid's spike was brief.

Agreed posted:

We also had that problem with Gigabyte. They seem to have cleaned up their act. I'm more suspicious of PNY just because, look, the poo poo ain't free to make, there's a reason everyone prices a given card around the same price, I just don't trust the discount... If they start doing volume like MSI or Asus or whatever then I guess we'll know for sure (for now) but in the meantime, I'm still kinda hesitant to recommend them.

Feel like XFX has a ways to go yet to clear their name but they had some really cool stuff on the AMD side with the dual bios 290 launches that, iirc, all flashed to 290x specs? I dunno, I remember veedubfreaking out about it (:ironicat:) or something like that...

My first 290 was an XFX that sadly didn't unlock (Powercolor had the 100% unlock rates for a while), but XFX was notable for having a decent rate of unlocks while shipping with BF4. I kept getting black screens with it and decided to err on the side of caution by sending it back (though the black screens wound up being a driver issue that was fixed a month later).

The major issue with XFX was that they simply weren't forthcoming with their warranty information which led to a lot of mixed messages. Their cards had the warranty stickers on the screws, but the prevailing rumor was that you could install an aftermarket cooler without voiding the warranty so long as you lived in the US and notified them of having replaced the cooler. :confused:

I tried searching their website for confirmation - as well as contacting them directly - without any luck or response. I didn't like the idea of Schrodinger's warranty if I replaced the wholly inadequate stock cooler (on a card that was already giving me issues) so I decided to jump ship.

It's another anecdote, and maybe you'll never need to use the warranty, but that was enough for me to cross them off my list for future cards.

veedubfreak
Apr 2, 2005

by Smythe
I asked the XFX rep on hardforums about the sticker, and his response was that basically the warranty was good in the US as long as you put the original heatsink back on for the RMA. A lot of the hate for XFX cards came out of the bad batch of 7970 DD cards, I believe. I stick with XFX because I basically only ever buy reference models due to water block compatibility. The early XFX cards were pretty much all unlockable, but as time goes on fewer and fewer are being unlocked.

Here's the response direct from the horse's mouth.
http://hardforum.com/showthread.php?t=1781796&page=3

veedubfreak fucked around with this message at 18:42 on Jan 15, 2014

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I hate XFX because I bought their GeForce 7950 GT with video capture, and when it died, they gave me an 8600 GTS (a major downgrade) without video capture. When I complained, they "upgraded" me to a 7900 GT, still without video capture. The 7900 GT still didn't work right: it would scramble the display output unless the memory was clocked at anything besides its default rate.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

It is so nice just having standardized minimum production quality (Kepler forward)... shopping for AMD cards seems like a pain in the rear end even without the whole *coin bullshit fluctuating prices wildly. At the very least I know that regardless of brand, any nVidia card is only as likely to fail as a standard engineering sample, and all aftermarket cards have to meet or beat it on all the points we care about.

I don't know if it's a lack of vendor clout, or supply issues, or what, but I wish AMD would hurry up and do the same thing to standardize their production. It is ridiculous having a given SKU from a specific vendor that exceeds the nominal failure rate for electronics by 3x, even if the overall line is within the expected failure rate for consumer electronics.

The lotto ought to just be for overclocking capability, not for "so is this manufacturer going to try to obfuscate how badly they're loving me on parts quality and power delivery, or can I trust them to at least make a basically decent product?" :mad:

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.
Speaking of XFX: in my search for a Radeon R9 280X that's actually in stock at Amazon, I've noticed it's the only one that seems to have been in stock lately, so I was thinking about buying one. However, I've read reviews that seem to indicate their cards have serious issues with VRM cooling, to the point they can overheat at stock. Is this something I should be seriously worried about on the XFX model? I have a large case (an ancient Antec P160 that I'm replacing next build), so it won't just be dumping hot air back in on itself.

If it's an issue I'll just continue to be patient until I get my hands on something like an Asus or MSI card.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Beautiful Ninja posted:

Speaking of XFX: in my search for a Radeon R9 280X that's actually in stock at Amazon, I've noticed it's the only one that seems to have been in stock lately, so I was thinking about buying one. However, I've read reviews that seem to indicate their cards have serious issues with VRM cooling, to the point they can overheat at stock. Is this something I should be seriously worried about on the XFX model? I have a large case (an ancient Antec P160 that I'm replacing next build), so it won't just be dumping hot air back in on itself.

If it's an issue I'll just continue to be patient until I get my hands on something like an Asus or MSI card.

While it is entirely possible that a given brand has some kinda issue, I think it's worth taking into consideration the definite fact that you're looking at self-reporting and a strong bias toward "god drat it I have an issue I'm going to shout to the winds about this!!!" versus "welp my card works great I'm going to go about my business and play games and stuff, maybe if I think about it later I'll do a positive review? *never writes review*"

If you read on a tech tear-down site that there's a cheap component that can cause problems, take that very seriously. With product reviews online, be very careful and consider the source.
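The self-selection bias described above is easy to put numbers on. A toy calculation (every rate here is made up for illustration, not real return data):

```python
# Toy model of review self-selection: unhappy buyers review far more often.
def observed_negative_share(failure_rate, p_review_if_failed, p_review_if_fine):
    """Fraction of posted reviews that are negative, given how likely
    each group is to bother writing one."""
    negative_reviews = failure_rate * p_review_if_failed
    positive_reviews = (1 - failure_rate) * p_review_if_fine
    return negative_reviews / (negative_reviews + positive_reviews)

# 3% of cards actually fail, but failures are 20x as likely to get reviewed.
share = observed_negative_share(0.03, 0.40, 0.02)
print(f"{share:.0%} of reviews are negative")  # ~38%, despite a 3% real failure rate
```

With equal review propensity in both groups the observed share matches the true failure rate; skew the propensities and a 3% problem reads like a 38% one.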

Ghostpilot
Jun 22, 2007

"As a rule, I never touch anything more sophisticated and delicate than myself."

Agreed posted:

It is so nice just having standardized minimum production quality (Kepler forward)... shopping for AMD cards seems like a pain in the rear end even without the whole *coin bullshit fluctuating prices wildly. At the very least I know that regardless of brand, any nVidia card is only as likely to fail as a standard engineering sample, and all aftermarket cards have to meet or beat it on all the points we care about.

I don't know if it's a lack of vendor clout, or supply issues, or what, but I wish AMD would hurry up and do the same thing to standardize their production. It is ridiculous having a given SKU from a specific vendor that exceeds the nominal failure rate for electronics by 3x, even if the overall line is within the expected failure rate for consumer electronics.

The lotto ought to just be for overclocking capability, not for "so is this manufacturer going to try to obfuscate how badly they're loving me on parts quality and power delivery, or can I trust them to at least make a basically decent product?" :mad:

AMD could've saved themselves so much grief over the black screen issues people were having if such a thing were in place. Virtually everybody having those issues had Elpida memory on their cards, while those with Hynix memory were spared.

Wistful of Dollars
Aug 25, 2009

Random point of amusement: apparently EVGA's cracked-out 780 Ti has no TDP limit.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Kamikaze card!

Gonkish
May 19, 2004

So in other words we just have to wait for crazy people to set their houses on fire?

Byolante
Mar 23, 2008

by Cyrano4747

Gonkish posted:

So in other words we just have to wait for crazy people to set their houses on fire?

If somebody is going to lose a finger, it's going to be from LN2, not fire.

The MUMPSorceress
Jan 6, 2012


^SHTPSTS

Gary’s Answer
Is the GTX 670 known to be unstable with BioShock 2 and BioShock Infinite? Both of those games have bluescreened my computer a couple of times when running DX11, and I've never had any other game do that to my current PC before. I updated to the latest drivers but I still get an occasional bluescreen. Maybe one for every 5-6 hours of gameplay.

My computer continues to run Skyrim at 1920x1200 with a bunch of hi-res textures and stuff installed without issue, so I'm doubting the hardware is failing in the traditional sense.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

Kamikaze card!

Or maybe it's just to save the step of having to manually voltmod cards. This is supposed to be the insane overclocker death-run version.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
TechReport has a little on the Oculus Rift "Crystal Cove" prototype. Still a 1080p screen, but this time with an AMOLED panel and a lot of enhancements to avoid motion sickness.

The reason it's in this thread, though, is a little tidbit that they're working on something G-sync like, since that's a natural ally for something like a Rift. Unfortunately, they aren't talking about it any more than saying just that much.

I am also super hype that the tie-in launch game for the Rift will be EVE Valkyrie, but that's neither here nor there. SPACE SIMS! :byodood:

But there was some intimation about AMD being all up ons this thing. AMD had a Rift at their CES booth accompanied by a prototype positional audio headset with TrueAudio acceleration. Audio on the Rift is another notable "Yes, it's a thing, but we're not talking about it" subject.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
I wonder what they have Carmack working on.

Stanley Pain
Jun 16, 2001

by Fluffdaddy
Starting to see the first real-world "numbers" for Mantle.

http://www.rockpapershotgun.com/2014/01/16/see-a-10000-ship-battle-in-stardocks-insane-new-engine/

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.
I'm highly skeptical of a tech demo that touts motion blur as a "high end feature", especially a motion blur implementation that appears to cost around 100ms. :psyduck:
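That ~100 ms eyeball figure is easy to sanity-check with frame-time arithmetic, since an effect's per-frame cost is just the difference in frame times with it on and off. A quick sketch (the example framerates are illustrative, not measured from the demo):

```python
def frame_time_ms(fps):
    """Milliseconds spent per frame at a given framerate."""
    return 1000.0 / fps

def effect_cost_ms(fps_with, fps_without):
    """Per-frame cost of an effect, from framerates with/without it enabled."""
    return frame_time_ms(fps_with) - frame_time_ms(fps_without)

# If enabling blur drops a demo from ~24 fps to ~8 fps, the effect is
# eating roughly 83 ms per frame -- the ballpark of the ~100 ms claim.
print(round(effect_cost_ms(8, 24)))  # -> 83
```

For reference, a whole frame at 60 fps is only 16.7 ms, which is why a ~100 ms single effect reads as absurd.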

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!
Soooo Mantle is not to increase GPU sales but CPU sales for AMD?

Gwaihir
Dec 8, 2009
Hair Elf

Factory Factory posted:

TechReport has a little on the Oculus Rift "Crystal Cove" prototype. Still a 1080p screen, but this time with an AMOLED panel and a lot of enhancements to avoid motion sickness.

The reason it's in this thread, though, is a little tidbit that they're working on something G-sync like, since that's a natural ally for something like a Rift. Unfortunately, they aren't talking about it any more than saying just that much.

I am also super hype that the tie-in launch game for the Rift will be EVE Valkyrie, but that's neither here nor there. SPACE SIMS! :byodood:

But there was some intimation about AMD being all up ons this thing. AMD had a Rift at their CES booth accompanied by a prototype positional audio headset with TrueAudio acceleration. Audio on the Rift is another notable "Yes, it's a thing, but we're not talking about it" subject.

In the vein of rifts, space sims, and :flashfap: collaboration between the two, you really owe it to yourself to check out this one: http://forums.somethingawful.com/showthread.php?threadid=3530373&userid=0&perpage=40&pagenumber=25

Stream of the dev working on it: http://www.twitch.tv/marauderinteractive/b/495077760

There's also a great Rift demo up on youtube (I would link it, but can't search youtubes on my work machine).

e: I think this is it: https://www.youtube.com/watch?v=oSKeseN4uJk

beejay
Apr 7, 2002

The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle.
I don't even know what the above comment referencing CPUs is talking about.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

beejay posted:

The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle.
I don't even know what the above comment referencing CPUs is talking about.

Watch the video again. The blur is enabled throughout the Mantle part, and then when DX is enabled they have to take it off.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

beejay posted:

The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle.
I don't even know what the above comment referencing CPUs is talking about.

Did you not have audio turned on for the tech demo? Although now that I re-watched it I am not sure if it's Mantle or the engine itself that makes it take advantage of all cores. I do recall that Mantle has a lot of task scheduling tools.

beejay
Apr 7, 2002

Stanley Pain posted:

Watch the video again. The blur is enabled throughout the Mantle part, and then when DX is enabled they have to take it off.

Yeah but I don't see where motion blur is causing a hit in Mantle which is what that post seemed to be getting at. To me it looks like they are saying Mantle can run everything turned on whereas with DX you have to turn off things to maintain fps. I mean motion blur itself is kind of dumb but whatever, that's up to the game designers.


deimos posted:

Did you not have audio turned on for the tech demo? Although now that I re-watched it I am not sure if it's Mantle or the engine itself that makes it take advantage of all cores. I do recall that Mantle has a lot of task scheduling tools.

I see what you mean, I took it to be the engine but I'm not so sure. Also they were running that demo on an i7 :v: I think they were mostly just saying that with the GPU taking so much of the load, that the CPU is more free to do other things.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

beejay posted:

Yeah but I don't see where motion blur is causing a hit in Mantle which is what that post seemed to be getting at. To me it looks like they are saying Mantle can run everything turned on whereas with DX you have to turn off things to maintain fps. I mean motion blur itself is kind of dumb but whatever, that's up to the game designers.


I see what you mean, I took it to be the engine but I'm not so sure. Also they were running that demo on an i7 :v: I think they were mostly just saying that with the GPU taking so much of the load, that the CPU is more free to do other things.



First bit of video = Mantle + motion blur with VERY smooth framerates.

Rest of video = DX + blur for a brief moment, then no blur after. The DX code path runs like total poo poo with the blur enabled.

Near the end they switch back to Mantle, which doubles the DX framerate (8 → 16ish :v:)

They allude to allowing more "simulation" stuff going on in the background (the turrets and pew pew laser projectiles, etc).

Stanley Pain fucked around with this message at 21:18 on Jan 17, 2014
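On the multi-core angle from the last few posts: older APIs funnel all draw submission through one driver thread, while Mantle-style APIs let each core record its own command buffer. A toy sketch of the difference (no real graphics API here — Mantle's interface isn't public, so this is purely an illustration of parallel command-list recording):

```python
from concurrent.futures import ThreadPoolExecutor

def record_commands(objects):
    """Stand-in for recording a command buffer: one 'draw' entry per object."""
    return [("draw", obj) for obj in objects]

def submit_single_threaded(scene):
    # DX11-style: one thread records and submits everything.
    return record_commands(scene)

def submit_multi_threaded(scene, workers=4):
    # Mantle-style: split the scene, record per-core command buffers
    # in parallel, then submit them in order.
    chunks = [scene[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        buffers = list(pool.map(record_commands, chunks))
    return [cmd for buf in buffers for cmd in buf]

scene = [f"ship_{i}" for i in range(10000)]
assert len(submit_multi_threaded(scene)) == len(submit_single_threaded(scene))
```

The point of the design is that recording (the CPU-heavy part) parallelizes, which frees cores for simulation work like the turrets and projectiles — consistent with the demo running on an i7 and still showing gains.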

veedubfreak
Apr 2, 2005

by Smythe
This showed up today :)

Wistful of Dollars
Aug 25, 2009

Everything looks to be in one piece. :chord:

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
I remember when 3dfx was pimping motion blur. It was a Quake 3 feature.

forbidden dialectics
Jul 26, 2005





So my new power supply is installed, I got the fixed EVGA Classified BIOS (oh, how strange for it to come out on the same day as the Kingpin edition!!!), the latest Classified Voltage Tool (which lets you increase the VDD PWM frequency, basically increases the level of LLC) and I am ready to clock. Let's see where this goes...

forbidden dialectics fucked around with this message at 08:16 on Jan 18, 2014

DiggityDoink
Dec 9, 2007
I like how this has pretty much combined with the OC thread since Haswell has its low headroom.

Seriously, fewer threads to check.

Speaking of, I'd still like to see your full build when you're done veedub, I don't think you put up pics of it yet.

DiggityDoink fucked around with this message at 08:32 on Jan 18, 2014


Parker Lewis
Jan 4, 2006

Can't Lose


I'm playing Dishonored on a single 2GB 760 at 1440p.

I get a solid 60fps with only FXAA. When I force 2X MSAA I see framerate drops to the low 40s, and when I force 4X MSAA I see drops to the low 30s. Memory usage with 2X MSAA is around 1-1.2GB and 4X MSAA is around 1.7GB.

4X MSAA looks like it's close to maxing out the memory on my card but 2X MSAA seems like it's constrained more by the GPU and would benefit from adding a second 2GB 760 in SLI, right?

I do have concerns about putting another $250 into 2GB video cards but it seems like I'd get a better bang for my buck right now by going with SLI 760s than a single R9 290.

Parker Lewis fucked around with this message at 19:43 on Jan 19, 2014
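A back-of-the-envelope check on those MSAA numbers (this counts only the multisampled color and depth/stencil targets, so it's a floor, not a prediction — textures and intermediate buffers are what actually fill a 2GB card):

```python
def msaa_target_mib(width, height, samples, color_bytes=4, depth_bytes=4):
    """Rough size of the multisampled color + depth/stencil render targets,
    assuming 32-bit color and 32-bit depth/stencil per sample."""
    pixels = width * height
    return pixels * samples * (color_bytes + depth_bytes) / 2**20

for s in (1, 2, 4):
    print(f"{s}x MSAA at 2560x1440: ~{msaa_target_mib(2560, 1440, s):.0f} MiB")
```

The raw targets scale linearly with sample count but stay modest on their own; the much larger jumps reported in-game (1.2 GB → 1.7 GB) come from every intermediate buffer in the renderer growing with sample count, plus the game's own texture load.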
