|
I don't see HIS in that list, but anecdotally, I got a very good 7850 2GB from them. It's not their special IceQ edition, either, just the regular kind with a sorta wimpy fan, but it will overclock to 1050/1350 on stock voltage with the power limit set to +20%, and 1100 if I use MSI Afterburner (still on stock voltages).
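For context on how big that overclock is: assuming reference 7850 clocks of 860 MHz core / 1200 MHz memory (my assumption, not stated in the post), a quick sketch of the percentage gains:

```python
# Rough overclock math; the 860/1200 MHz reference clocks are an
# assumption on my part, not something from the post above.
def oc_percent(new_mhz, stock_mhz):
    """Percent increase of new_mhz over stock_mhz."""
    return (new_mhz / stock_mhz - 1) * 100

print(round(oc_percent(1050, 860), 1))   # core:   → 22.1
print(round(oc_percent(1350, 1200), 1))  # memory: → 12.5
```

So that "wimpy fan" card is holding a ~22% core overclock on stock voltage, which is why it reads as a quality sample.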
|
# ? Jan 15, 2014 13:33 |
|
Agreed posted:Why don't we ever talk about PNY? They're apparently built like hammers, but it's all MSI and Asus and Gigabyte and XFX... I don't know honestly, they're the lowest priced GTX stuff over anyone else right now, and they have a lifetime warranty if you register with them which no one else does. On B&H their 760 and 770 are $240 and $315 with no tax or shipping. I've got a ton of Dell.com credit and I'm gonna have them price match the 770s and see how that goes.
|
# ? Jan 15, 2014 14:37 |
|
My French is so rusty it may as well not exist, so going with the Google translate it sounds like they attribute PNY's low return rate to their sales mostly being in the lower end which generally have a lower failure rate?
|
# ? Jan 15, 2014 15:43 |
|
Since we apparently like XFX again, maybe we should give PNY a shot. But originally it was anecdotal goon-evidence that waved us off PNY, the same as XFX.
|
# ? Jan 15, 2014 16:21 |
|
I think there was more to disliking XFX than just anecdotal evidence, like some reviews showing their cards were non-reference designs cheaping out on power delivery etc.
|
# ? Jan 15, 2014 16:30 |
|
Ghostpilot posted:Though I really feel for anybody looking to pick up a video card right now. Why do you say that? I'm putting together a new PC and can easily wait a while for a new graphics card if something big's around the corner. If this is a stupid question, I apologise. I'm not current with what's going on for graphics cards - the last one I bought was the GeForce4 MX440. I've just had to go and look up what nVidia Greenlight was. simplefish fucked around with this message at 16:51 on Jan 15, 2014 |
# ? Jan 15, 2014 16:34 |
|
Dogen posted:I think there was more to disliking XFX than just anecdotal evidence, like some reviews showing their cards were non-reference designs cheaping out on power delivery etc. We also had that problem with Gigabyte. They seem to have cleaned up their act. I'm more suspicious of PNY just because, look, the poo poo ain't free to make, there's a reason everyone prices a given card around the same price, I just don't trust the discount... If they start doing volume like MSI or Asus or whatever then I guess we'll know for sure (for now) but in the meantime, I'm still kinda hesitant to recommend them. Feel like XFX has a ways to go yet to clear their name, but they had some really cool stuff on the AMD side with the dual BIOS 290 launches that, iirc, all flashed to 290X specs? I dunno, I remember veedubfreak freaking out about it or something like that...
|
# ? Jan 15, 2014 16:38 |
|
simplefish posted:Why do you say that? I'm putting together a new PC and can easily wait a while for a new graphics card if something big's around the corner. I think he was just saying that some AMD video cards are expensive now due to bitcoin/litecoin mining, and additionally that binning has eliminated the "free" unlock from 290 to 290X. Nothing that a "regular" user should have to worry about.
|
# ? Jan 15, 2014 16:48 |
|
Thanks. I'm trying to lurk and learn more about the current state of the technology and market again. You all do scary things with silicon.
|
# ? Jan 15, 2014 16:50 |
|
simplefish posted:Why do you say that? I'm putting together a new PC and can easily wait a while for a new graphics card if something big's around the corner. Beejay covered it already, but yeah, it was more about anybody considering the AMD side of things. AMD typically has a better price-to-performance ratio than Nvidia (who typically has better features), but litecoin mining has thrown all of that out of whack. So now Nvidia is the cheaper way to go despite their prices remaining unchanged. The dustup also raised prices of some aftermarket coolers, with Arctic's Accelero Xtreme III going up 10-15 bucks while the Gelid Icy Vision nearly doubled. Thankfully, the Gelid's spike was brief. Agreed posted:We also had that problem with Gigabyte. They seem to have cleaned up their act. I'm more suspicious of PNY just because, look, the poo poo ain't free to make, there's a reason everyone prices a given card around the same price, I just don't trust the discount... If they start doing volume like MSI or Asus or whatever then I guess we'll know for sure (for now) but in the meantime, I'm still kinda hesitant to recommend them. My first 290 was an XFX that sadly didn't unlock (Powercolor had the 100% unlock rates for awhile), but XFX was notable for having a decent rate of unlocks while shipping with BF4. I kept getting black screens with it and decided to err on the side of caution by sending it back (though the black screens wound up being a driver issue that was fixed a month later). The major issue with XFX was that they simply weren't forthcoming with their warranty information, which led to a lot of mixed messages. Their cards had the warranty stickers on the screws, but the prevailing rumor was that you could install an aftermarket cooler without voiding the warranty so long as you lived in the US and notified them of having replaced the cooler. I tried searching their website for confirmation - as well as contacting them directly - without any luck or response.
I didn't like the idea of Schrodinger's warranty if I replaced the wholly inadequate stock cooler (on a card that was already giving me issues) so I decided to jump ship. It's another anecdote, and maybe you'll never need to use the warranty, but that was enough for me to cross them off my list for future cards.
|
# ? Jan 15, 2014 18:03 |
|
I asked the XFX rep on hardforums about the sticker and his response was that basically the warranty was good in the US as long as you put the original heatsink back on for RMA. A lot of the hate for the XFX cards came out of the bad batch of 7970 DD cards, I believe. I stick with XFX because I basically only ever buy reference models due to water block compatibility. The early XFX cards were pretty much all unlockable, but as time goes on fewer and fewer are being unlocked. Here's the response direct from the horse's mouth. http://hardforum.com/showthread.php?t=1781796&page=3 veedubfreak fucked around with this message at 18:42 on Jan 15, 2014 |
# ? Jan 15, 2014 18:37 |
|
I hate XFX because I bought their GeForce 7950 GT with video capture, and when it died, they gave me an 8600 GTS (a major downgrade) without video capture. When I complained, they "upgraded" me to a 7900 GT, still without video capture. The 7900 GT still didn't work right - it needed any memory clock rate besides the default one in order not to scramble the display output.
|
# ? Jan 15, 2014 18:49 |
|
It is so nice just having standardized minimum production quality (Kepler forward)... shopping for AMD cards seems like a pain in the rear end even without the whole *coin bullshit fluctuating prices wildly. At the very least I know that regardless of brand, any nVidia card is only as likely to fail as a standard engineering sample, and all aftermarket cards have to meet or beat it on all the points we care about. I don't know if it's a lack of vendor clout, or supply issues, or what, but I wish AMD would hurry up and do the same thing to standardize their production. It is ridiculous having a given SKU from a specific vendor that exceeds the nominal failure rate for electronics by 3x, even if the overall line is within the expected failure rate for consumer electronics. The lotto ought to just be for overclocking capability, not for "so is this manufacturer going to try to obfuscate how badly they're loving me on parts quality and power delivery, or can I trust them to at least make a basically decent product?"
|
# ? Jan 15, 2014 18:51 |
|
Speaking of XFX, in my search for a Radeon R9 280X that's actually in stock at Amazon, I've noticed theirs is the only one that seems to have been in stock lately, so I was thinking about buying one. However, I've read reviews that seem to indicate that their cards have serious issues with VRM cooling, to the point they can overheat at stock. Is this something that I should be seriously worried about from the XFX model? I have a large case (ancient Antec P160 that I'm replacing next build) so it won't just be dumping hot air back into itself. If it's an issue I'll just continue to be patient until I get my hands on something like an Asus or MSI card.
|
# ? Jan 15, 2014 19:33 |
|
Beautiful Ninja posted:Speaking of XFX, in my search for a Radeon R9 280X that's actually in stock at Amazon, I've noticed its the only one that seems to have been in stock lately so I was thinking about buying one, however, I've read reviews that seem to indicate that their cards have serious issues with VRM cooling to the point they can overheat at stock. Is this something that I should be seriously worried about from the XFX model? I have a large case (ancient Antec P160 that I'm replacing next build) so it won't just be dumping hot air back in itself. While it is entirely possible that a given brand has some kinda issue, I think it's worth taking into consideration the definite fact that you're looking at self-reporting and a strong bias toward "god drat it I have an issue I'm going to shout to the winds about this!!!" versus "welp my card works great I'm going to go about my business and play games and stuff, maybe if I think about it later I'll do a positive review? *never writes review*" If you read on a tech tear-down site that there's a cheap component that can cause problems, take that very seriously. Product reviews online, be very careful and consider the source.
|
# ? Jan 15, 2014 19:49 |
|
Agreed posted:It is so nice just having standardized minimum production quality (Kepler forward)... shopping for AMD cards seems like a pain in the rear end even without the whole *coin bullshit fluctuating prices wildly. At the very least I know that regardless of brand, any nVidia card is only as likely to fail as a standard engineering sample, and all aftermarket cards have to meet or beat it on all the points we care about. AMD could've saved themselves so much grief over the black screen issues people were having if such a thing were in place. Virtually everybody having those issues had Elpida memory on their cards, while those with Hynix memory were spared.
|
# ? Jan 15, 2014 22:22 |
|
Random point of amusement, apparently EVGA's cracked-out 780 Ti has no TDP limit.
|
# ? Jan 17, 2014 06:35 |
|
El Scotch posted:Random point of amusement, apparently EVGA's cracked-out 780 Ti has no TDP limit. Kamikaze card!
|
# ? Jan 17, 2014 10:05 |
|
So in other words we just have to wait for crazy people to set their houses on fire?
|
# ? Jan 17, 2014 12:38 |
|
Gonkish posted:So in other words we just have to wait for crazy people to set their houses on fire? If somebody is going to lose a finger, it's going to be from LN2, not fire.
|
# ? Jan 17, 2014 13:03 |
|
Is the GTX 670 known to be unstable with BioShock 2 and BioShock Infinite? Both of those games have bluescreened my computer a couple of times when running DX11, and I've never had any other game do that to my current PC before. I updated to the latest drivers but I still get an occasional bluescreen. Maybe one for every 5-6 hours of gameplay. My computer continues to run Skyrim at 1920x1200 with a bunch of hi-res textures and stuff installed without issue, so I'm doubting the hardware is failing in the traditional sense.
|
# ? Jan 17, 2014 17:51 |
|
HalloKitty posted:Kamikaze card! Or maybe just to save a step having to manually voltmod cards. This is supposed to be the insane overclocker death-run version.
|
# ? Jan 17, 2014 18:02 |
|
TechReport has a little on the Oculus Rift "Crystal Cove" prototype. Still a 1080p screen, but this time with an AMOLED panel and a lot of enhancements to avoid motion sickness. The reason it's in this thread, though, is a little tidbit that they're working on something G-sync like, since that's a natural ally for something like a Rift. Unfortunately, they aren't talking about it any more than saying just that much. I am also super hype that the tie-in launch game for the Rift will be EVE Valkyrie, but that's neither here nor there. SPACE SIMS! But there was some intimation about AMD being all up ons this thing. AMD had a Rift at their CES booth accompanied by a prototype positional audio headset with TrueAudio acceleration. Audio on the Rift is another notable "Yes, it's a thing, but we're not talking about it" subject.
|
# ? Jan 17, 2014 18:54 |
|
I wonder what they have Carmack working on.
|
# ? Jan 17, 2014 19:07 |
|
Starting to see the first real-world "numbers" about Mantle. http://www.rockpapershotgun.com/2014/01/16/see-a-10000-ship-battle-in-stardocks-insane-new-engine/
|
# ? Jan 17, 2014 19:11 |
|
I'm highly skeptical of a tech demo that touts motion blur as a "high end feature", especially a motion blur implementation that appears to cost around 100ms.
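To spell out where a figure like that comes from: an effect's cost is the frame time with it on minus the frame time with it off. A minimal sketch (the 5/10 fps numbers below are illustrative, not measurements from the demo):

```python
# Frame-time cost of toggling an effect. The fps numbers here are
# illustrative only, not measurements from the Stardock demo.
def frame_time_ms(fps):
    return 1000.0 / fps

# If enabling blur drops a demo from 10 fps to 5 fps, the effect
# costs 200 - 100 = 100 ms per frame.
cost_ms = frame_time_ms(5) - frame_time_ms(10)
print(cost_ms)  # → 100.0
```

For comparison, a whole frame at 60 fps is ~16.7 ms, so a single post effect eating 100 ms is an enormous outlier.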
|
# ? Jan 17, 2014 19:19 |
|
Soooo Mantle is not to increase GPU sales but CPU sales for AMD?
|
# ? Jan 17, 2014 19:22 |
|
Factory Factory posted:TechReport has a little on the Oculus Rift "Crystal Cove" prototype. Still a 1080p screen, but this time with an AMOLED panel and a lot of enhancements to avoid motion sickness. In the vein of rifts, space sims, and collaboration between the two, you really owe it to yourself to check out this one: http://forums.somethingawful.com/showthread.php?threadid=3530373&userid=0&perpage=40&pagenumber=25 Stream of the dev working on it: http://www.twitch.tv/marauderinteractive/b/495077760 There's also a great Rift demo up on youtube (I would link it, but can't search youtubes on my work machine). e: I think this is it: https://www.youtube.com/watch?v=oSKeseN4uJk
|
# ? Jan 17, 2014 20:04 |
|
The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle. I don't even know what the above comment referencing CPUs is talking about.
|
# ? Jan 17, 2014 20:05 |
|
beejay posted:The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle. Watch the video again. The blur is enabled throughout the Mantle part, and then when DX is enabled they have to take it off.
|
# ? Jan 17, 2014 20:14 |
|
beejay posted:The motion blur enabling/disabling and FPS gains were during the DirectX part, not Mantle. Did you not have audio turned on for the tech demo? Although now that I re-watched it I am not sure if it's Mantle or the engine itself that makes it take advantage of all cores. I do recall that Mantle has a lot of task scheduling tools.
|
# ? Jan 17, 2014 20:21 |
|
Stanley Pain posted:Watch the video again. The Blur is enabled throughout the Mantle part and then when DX is enabled they have to take it off. Yeah, but I don't see where motion blur is causing a hit in Mantle, which is what that post seemed to be getting at. To me it looks like they are saying Mantle can run everything turned on, whereas with DX you have to turn things off to maintain fps. I mean motion blur itself is kind of dumb but whatever, that's up to the game designers. deimos posted:Did you not have audio turned on for the tech demo? Although now that I re-watched it I am not sure if it's Mantle or the engine itself that makes it take advantage of all cores. I do recall that Mantle has a lot of task scheduling tools. I see what you mean; I took it to be the engine, but I'm not so sure. Also, they were running that demo on an i7. I think they were mostly just saying that with the GPU taking so much of the load, the CPU is more free to do other things.
|
# ? Jan 17, 2014 20:27 |
|
beejay posted:Yeah but I don't see where motion blur is causing a hit in Mantle which is what that post seemed to be getting at. To me it looks like they are saying Mantle can run everything turned on whereas with DX you have to turn off things to maintain fps. I mean motion blur itself is kind of dumb but whatever, that's up to the game designers. First bit of video = Mantle + motion blur with VERY smooth framerates. Rest of video = DX + blur for a brief moment, then no blur after. The DX code path runs like total poo poo with the blur enabled. Near the end they switch back to Mantle, which doubles the DX framerate (8 → 16ish). They allude to allowing more "simulation" stuff going on in the background (the turrets and pew pew laser projectiles, etc). Stanley Pain fucked around with this message at 21:18 on Jan 17, 2014 |
# ? Jan 17, 2014 21:15 |
|
This showed up today
|
# ? Jan 18, 2014 01:50 |
|
Everything looks to be in one piece.
|
# ? Jan 18, 2014 02:46 |
|
I remember when 3dfx was pimping motion blur. It was a Quake 3 feature.
|
# ? Jan 18, 2014 06:12 |
|
So my new power supply is installed, I got the fixed EVGA Classified BIOS (oh, how strange for it to come out on the same day as the Kingpin edition!!!), the latest Classified Voltage Tool (which lets you increase the VDD PWM frequency, basically increases the level of LLC) and I am ready to clock. Let's see where this goes...
forbidden dialectics fucked around with this message at 08:16 on Jan 18, 2014 |
# ? Jan 18, 2014 08:12 |
|
I like how this has pretty much combined with the OC thread since Haswell has such low headroom. Seriously, fewer threads to check. Speaking of, I'd still like to see your full build when you're done, veedub, I don't think you put up pics of it yet. DiggityDoink fucked around with this message at 08:32 on Jan 18, 2014 |
# ? Jan 18, 2014 08:30 |
|
I'm playing Dishonored on a single 2GB 760 at 1440p. I get a solid 60fps with only FXAA. When I force 2X MSAA I see framerate drops to the low 40s, and when I force 4X MSAA I see drops to the low 30s. Memory usage with 2X MSAA is around 1-1.2GB and 4X MSAA is around 1.7GB. 4X MSAA looks like it's close to maxing out the memory on my card but 2X MSAA seems like it's constrained more by the GPU and would benefit from adding a second 2GB 760 in SLI, right? I do have concerns about putting another $250 into 2GB video cards but it seems like I'd get a better bang for my buck right now by going with SLI 760s than a single R9 290. Parker Lewis fucked around with this message at 19:43 on Jan 19, 2014 |
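As a rough sanity check on those numbers, a back-of-envelope estimate of the multisampled render targets alone at 1440p (the 8 bytes per pixel per sample figure - RGBA8 color plus D24S8 depth-stencil - is my assumption; real driver allocations add resolve targets, padding, and compression metadata):

```python
# Back-of-envelope MSAA render-target size at 2560x1440. The 8
# bytes/pixel/sample (RGBA8 color + D24S8 depth-stencil) figure is
# an assumption; actual driver allocations will be larger.
def msaa_targets_mb(width, height, samples, bytes_per_px_sample=8):
    return width * height * samples * bytes_per_px_sample / 2**20

for s in (1, 2, 4):
    print(f"{s}x MSAA: {msaa_targets_mb(2560, 1440, s):.1f} MB")
# → roughly 28.1 MB at 1x, 56.2 MB at 2x, 112.5 MB at 4x
```

Even at 4X the multisampled targets are only ~112 MB, which suggests most of the jump from ~1.2GB to ~1.7GB comes from everything else the engine keeps resident rather than the framebuffer itself.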
# ? Jan 19, 2014 19:39 |