|
AVeryLargeRadish posted:No, not really. It's probably a better idea to just sell it and use the money for something else. I was looking forward to "PhysX 2.0" after a little demo I saw, but ... much like the first go-around, it's pretty much evaporated afaik, sadly. I've only seen one game ever where PhysX was actually cool and really well integrated into the game (Killing Floor 2); in every other example I've seen, it's been totally optional in every sense of the word.
|
# ? Feb 9, 2017 01:19 |
|
|
1gnoirents posted:I was looking forward to "PhysX 2.0" after a little demo I saw, but ... much like the first go-around, it's pretty much evaporated afaik, sadly. I've only seen one game ever where PhysX was actually cool and really well integrated into the game (Killing Floor 2); in every other example I've seen, it's been totally optional in every sense of the word. That's just the reality of game development, unfortunately. If a technique is too heavy to run on consoles it just won't get used 99% of the time, and the other 1% will have it tacked on at the last minute. I'm still salty that Epic ripped their SVOGI implementation out of UE4 and reverted to static baked lighting to appease the console perf target gods.
|
# ? Feb 9, 2017 02:09 |
|
PhysX died the moment it became Nvidia proprietary.
|
# ? Feb 9, 2017 04:30 |
|
PhysX was dead from day one because multiplayer matters in most games these days. Graphics settings can vary between players with no real impact if things are done right (see old-school Counter-Strike hitscan resolution for how to do it wrong), but physics have to be synchronized if they're meaningful. That means you get three choices:

1. PhysX hardware is mandatory.
2. The PhysX software implementation significantly increases the CPU requirement for PCs without the hardware.
3. PhysX is used for meaningless visual fluff that doesn't impact gameplay.

Obviously #3 was the most common choice in commercial games. In Borderlands 2, for example, you got more debris and more detailed liquids. In a single-player title PhysX could actually matter, but it still mostly seems to be ignored there.
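Worth spelling out what choice #3 actually looks like in code. A minimal sketch, assuming a hypothetical engine (none of these names come from a real one): gameplay physics steps on a fixed tick that every client simulates identically from the same inputs, while the cosmetic simulation free-runs locally and is allowed to diverge between players.

code:
// Choice #3 in practice: only deterministic gameplay physics is kept in
// sync between players; cosmetic effects never are. All names hypothetical.
#include <vector>

struct RigidBody      { float pos[3]; float vel[3]; }; // gameplay state, must match on every client
struct DebrisParticle { float pos[3]; float vel[3]; }; // eye candy, never synchronized

constexpr float kFixedDt = 1.0f / 60.0f; // fixed tick so every client integrates identically

// Run on every client with the same inputs at the same tick -> same result.
void StepGameplayPhysics(std::vector<RigidBody>& bodies) {
    for (auto& b : bodies)
        for (int i = 0; i < 3; ++i)
            b.pos[i] += b.vel[i] * kFixedDt;
}

// Free-runs at whatever rate the local machine manages; can be offloaded
// to PhysX hardware, scaled back, or skipped without touching gameplay.
void StepCosmeticPhysics(std::vector<DebrisParticle>& debris, float frameDt) {
    for (auto& p : debris)
        for (int i = 0; i < 3; ++i)
            p.pos[i] += p.vel[i] * frameDt;
}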
|
# ? Feb 9, 2017 04:55 |
|
wolrah posted:PhysX was dead from day one because multiplayer matters in most games these days. Graphics settings can vary between players with no real impact if things are done right (see old-school Counter-Strike hitscan resolution for how to do it wrong), but physics have to be synchronized if they're meaningful. I can't seem to find it, but I recall there being a multiplayer arena FPS that was in development using PhysX physics as a primary point of gameplay. This was before they got bought out, and I don't think the game ever got released as a result.
|
# ? Feb 9, 2017 07:34 |
|
wolrah posted:In a single-player title PhysX could actually matter, but it still mostly seems to be ignored there. Because you need to have 1 or 2: 3 is effectively 1 for Nvidia users, or a massive FPS sink via 2 for AMD users. It needs to be a hardware solution for both AMD and Nvidia or it will never, ever catch on, because then the consoles would also have to have it.
|
# ? Feb 9, 2017 08:38 |
|
KingEup posted:What model did they give you? 1gnoirents posted:Yeah, rare in-the-field coil-whine-busting combos are always appreciated, even if there is no guarantee. I went from an HX750 to an HX750i, which is the fully modular one with a diagnostic USB port and Platinum efficiency etc. My problems with the RX470 disappeared and it's now in Normal Graphics Card Noise range.
|
# ? Feb 9, 2017 10:17 |
|
craig588 posted:the 7870 uses between 175-185 depending on the version you have, even if you had a +20 watts version of the 480 you're still only at 170 watts. The power draw issues 480s had at launch are all fixed now, no reason to worry about that either. Thanks, that actually is a good point I haven't been able to get info on. Unlike 10 years ago, I don't actually follow the PC hardware market that much anymore, and thanks to a gazillion "review" pages with just as many measuring methods (calculated card consumption/system total consumption...blah) and brand comparisons, I was unable to find a reliable source on the power draw of just the cards. Since my monitors don't have DisplayPorts, it really comes down to needing 2 HDMI ports, and the MSI Gaming X and the Nitro+ OC are both within 10€ of each other right now. I guess I'll get the Nitro+ then; I always liked the build quality of Sapphire cards, they never let me down, and a bit of noise doesn't faze me, I use headphones anyway. Might as well pay 10€ more for 2 more frames and easier fan replacement.
|
# ? Feb 9, 2017 12:26 |
|
DisplayPort to HDMI adapters are cheap...
|
# ? Feb 9, 2017 13:23 |
|
The only game I know of that currently uses PhysX (and why I still have a 970 in my system w/ a 1080) is Killing Floor 2. Gibs + fluids look incredible.
|
# ? Feb 9, 2017 13:35 |
|
Really I don't think GPU PhysX would have much of an impact even if it were cross-platform. The things it covers are either cheap enough to run on the CPU now (cloth physics), replaceable with much cheaper GPU approximations that look almost as convincing (particle physics), or so hideously expensive that no cross-platform game would use them as anything but an afterthought (fluid dynamics).

PerrineClostermann posted:I can't seem to find it, but I recall there being a multiplayer arena FPS that was in development using PhysX physics as a primary point of gameplay. This was before they got bought out, and I don't think the game ever got released as a result.

You're probably thinking of CellFactor: Revolution. It did get released, but nobody cared because it was terrible. repiv fucked around with this message at 15:47 on Feb 9, 2017 |
# ? Feb 9, 2017 14:08 |
|
Stanley Pain posted:The only game I know of that currently uses PhysX (and why I still have a 970 in my system w/ a 1080) is Killing Floor 2. Gibs + fluids look incredible. PhysX is actually a reasonably popular physics engine; it's just that no one bothers to do the whole "additional effects on Nvidia cards" thing.
|
# ? Feb 9, 2017 14:11 |
|
repiv posted:You're probably thinking of CellFactor: Revolution. It did get released, but nobody cared because it was terrible.
Ah, they took that tech demo and made an actual game out of it. Did it really suck? That's unfortunate.
|
# ? Feb 9, 2017 18:42 |
|
I would think adding some really good PhysX support to, say, the next Elder Scrolls or Fallout game would be a big selling point for it. Getting some good, real fabric, water, fire, and debris simulation going to make the world feel more alive, and to make dragon battles even more epic, would look absolutely amazing on PC. Hell, I am still waiting for proper fabric physics on pretty much any in-game character since Grace, the Nvidia demo for the GeForce4 Ti.
|
# ? Feb 9, 2017 19:15 |
|
betamax hipster posted:PhysX is actually a reasonably popular physics engine; it's just that no one bothers to do the whole "additional effects on Nvidia cards" thing. It seems like only a couple games support PhysX these days.
|
# ? Feb 9, 2017 19:18 |
|
Josh Lyman posted:It seems like only a couple games support PhysX these days. Nah, he's right. PhysX is everywhere; it's the default physics engine in UE3/UE4/Unity (that's a shitload of games), and a number of in-house engines use it as well. Even AMD-backed games like Hitman use PhysX. The distinction is that most games only use the CPU-based PhysX solver, not the GPU-accelerated parts.
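For reference, the CPU-only path repiv means is just the ordinary PhysX rigid body pipeline. A rough sketch of the setup in the PhysX 3.x SDK, written from memory rather than against the docs, so treat the exact signatures as approximate; note there's no GPU anywhere in it:

code:
// Minimal CPU-only PhysX 3.x scene: the path UE3/UE4/Unity games ship
// with on consoles and AMD hardware. From-memory sketch; signatures
// approximate.
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // 2 CPU worker threads
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc); // no GPU involved

    scene->simulate(1.0f / 60.0f); // one fixed 60 Hz step
    scene->fetchResults(true);     // block until the solver finishes

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}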
|
# ? Feb 9, 2017 19:25 |
|
EdEddnEddy posted:I would think adding some really good PhysX support to, say, the next Elder Scrolls or Fallout game would be a big selling point for it. Not using Gamebryo would be a selling point for it.

vv Alright, I should have been more fair. HalloKitty fucked around with this message at 20:01 on Feb 9, 2017 |
# ? Feb 9, 2017 19:31 |
|
HalloKitty posted:Not using Gamebryo would be a selling point for it Some pretty well coded games use Gamebryo. The problem is the code Bethesda piled on top of it.
|
# ? Feb 9, 2017 19:52 |
|
xthetenth posted:Some pretty well coded games use Gamebryo. The problem is the code Bethesda piled on top of it. Yeah. Gamebryo really isn't the problem. Catherine is in Gamebryo, Civ IV is in Gamebryo, loving Sid Meier's Pirates! is in Gamebryo. (So is Rocksmith, of all things.)
|
# ? Feb 9, 2017 20:11 |
|
repiv posted:Nah, he's right. PhysX is everywhere; it's the default physics engine in UE3/UE4/Unity (that's a shitload of games), and a number of in-house engines use it as well. Even AMD-backed games like Hitman use PhysX. *She. It's actually pretty decent middleware. Which, it turns out, is a far more valuable trait than its ability to be GPU-accelerated in an era when most games are GPU-bound and incapable of saturating CPU threads!
|
# ? Feb 9, 2017 20:57 |
|
Where's the spreadsheet
|
# ? Feb 9, 2017 21:16 |
|
Does Nvidia still do the thing where the CPU solver for PhysX uses x87 floating-point operations, or some dumb thing like that, as an inducement to getting a PhysX card/getting rid of your AMD card? I mean, I assume they do, it's Nvidia. There's no amount of lovely anticompetitive things they won't do these days.
|
# ? Feb 9, 2017 21:24 |
|
I hear if you try to enable PhysX on an AMD card it'll try to re-enable the power circuitry on any detected 480's and hope your motherboard explodes. TEACH YOU TO BUY FROM RED TEAM.
|
# ? Feb 9, 2017 21:31 |
|
SwissArmyDruid posted:Does Nvidia still do the thing where the CPU solver for PhysX uses x87 floating-point operations, or some dumb thing like that, as an inducement to getting a PhysX card/getting rid of your AMD card? Actually no, the lovely x87 code was from the old PhysX2 codebase they inherited from Ageia. NV rewrote a large chunk of the CPU solver in PhysX3 and moved everything over to SSE with proper multi-threading. Believe it or not, the CPU path in PhysX is really good these days, despite Nvidia's obvious conflict of interest.
|
# ? Feb 9, 2017 21:32 |
|
DrDork posted:I hear if you try to enable PhysX on an AMD card it'll try to re-enable the power circuitry on any detected 480's and hope your motherboard explodes. TEACH YOU TO BUY FROM RED TEAM.

You joke, but sometimes I feel like we're just about one step short of active out-and-out sabotage across lines. There is no shortage of cheating in benchmarks from both sides.

repiv posted:Actually no, the lovely x87 code was from the old PhysX2 codebase they inherited from Ageia.

Ah, you see, I take issue with that, and this time around I have David Kanter as backup! http://www.realworldtech.com/physx87/4/

quote:So why is PhysX using x87?

I won't speculate as to WHY they migrated their little black boxes from single-threaded x87 to multi-threaded SSE, but it certainly wasn't an "oh, we inherited this old code!" situation. SwissArmyDruid fucked around with this message at 23:29 on Feb 9, 2017
# ? Feb 9, 2017 21:35 |
|
IIRC their excuse was that the PC/x87 and PPC/AltiVec codebases were heavily diverged, so even if they ported the AltiVec path to SSE the resulting merge would have been a mess. So they sat on it until the big PhysX3 rewrite. Who knows if they were bullshitting, but either way it hasn't been an issue since 2011.
|
# ? Feb 9, 2017 21:52 |
|
Wouldn't all the bad press that they got around the time when PhysX was sort of big and "The Way It's Meant to Be Played" (GameWorks?) was everywhere be what may have pushed them to make it suck a little less on non-Nvidia hardware?
|
# ? Feb 9, 2017 23:15 |
|
repiv posted:I'm still salty that Epic ripped their SVOGI implementation out of UE4 and reverted to static baked lighting to appease the console perf target gods.

https://www.youtube.com/watch?v=PEfqtOYjolE After looking at this video, me too!
|
# ? Feb 9, 2017 23:43 |
|
They really should've called this the "we're sorry" edition: http://www.guru3d.com/news-story/evga-details-icx-technology.html
|
# ? Feb 11, 2017 06:00 |
|
PerrineClostermann posted:PhysX died the moment it became Nvidia proprietary. Even before that, everybody knew that hardware PhysX was DOA tech because of market realities; it's just that none of the professional reviewers wanted to be that evil tech-hatin' guy bursting Ageia's bubble.
|
# ? Feb 11, 2017 06:35 |
|
BIG HEADLINE posted:They really should've called this the "we're sorry" edition: http://www.guru3d.com/news-story/evga-details-icx-technology.html I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate?
|
# ? Feb 11, 2017 06:35 |
|
spasticColon posted:I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate? They added a fuse between power delivery and the VRMs to prevent issues from occurring.
|
# ? Feb 11, 2017 06:40 |
|
spasticColon posted:I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate? Yeah, it was a batch of faulty caps. They couldn't take the heat like they were specced to and the extra cooling kits helped keep faulty ones working.
|
# ? Feb 11, 2017 06:56 |
|
AVeryLargeRadish posted:No, not really. It's probably a better idea to just sell it and use the money for something else. Those GTX 950s are going for ~$80-100 used on eBay, last time I checked. It boggles the mind how much used, semi-outdated tech goes for when brand-new RX 470s are hitting $135 after MIR.
|
# ? Feb 11, 2017 07:58 |
|
Palladium posted:Those GTX 950s are going for ~$80-100 used on eBay, last time I checked. It boggles the mind how much used, semi-outdated tech goes for when brand-new RX 470s are hitting $135 after MIR. You can get a 1050 for $110 on Newegg, which is slightly faster than a 950, and a 1050 Ti for $140. The boost clocks of the 1050 Ti give it almost double the listed gigapixels and gigatexels per second. GT/s is more important currently, unless you're running a higher resolution where GP/s becomes the limiting factor. The RX 470 for $140 is a good deal.

pre:
RX 470   38.6 GP/s  154.4 GT/s
1050 Ti  60.8 GP/s   91.2 GT/s

SlayVus fucked around with this message at 08:16 on Feb 11, 2017 |
# ? Feb 11, 2017 08:03 |
|
SlayVus posted:You can get a 1050 for $110 on Newegg, which is slightly faster than a 950, and a 1050 Ti for $140. The boost clocks of the 1050 Ti give it almost double the listed gigapixels and gigatexels per second. I doubt anyone pays much attention to raw paper specs (the 1050 Ti is only 41.3 GPixel/s, BTW) rather than end performance, which makes the RX 470 by far the better card at equivalent prices unless TDP is an overriding factor. https://www.hardwareunboxed.com/the-best-sub-200-graphics-card-gtx-1050-ti-vs-rx-470-rx-460-4gb/
|
# ? Feb 11, 2017 08:39 |
|
spasticColon posted:I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate? Yeah, the point I was trying to make is that this particular revision seems to have every "anti-card exploding" bell and whistle tacked on.
|
# ? Feb 11, 2017 09:40 |
|
Palladium posted:I doubt anyone pays much attention to raw paper specs (the 1050 Ti is only 41.3 GPixel/s, BTW) rather than end performance, which makes the RX 470 by far the better card at equivalent prices unless TDP is an overriding factor. That doesn't take into account the boost clock speed, which the card will almost always be running above. https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_10_series Numbers in parentheses are the performance when the card is boosting. I have a Titan XP overclocked to a 1987 MHz core clock. That gives me 190.7 GP/s and 445 GT/s versus the listed 147/343 (default boost). If Vega can do in the neighborhood of 250 GP/s and 600 GT/s, I would definitely upgrade for VR. SlayVus fucked around with this message at 10:02 on Feb 11, 2017
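Those fillrate figures are just unit counts multiplied by the clock, for what it's worth. A quick sanity check, assuming the commonly quoted 96 ROPs and 224 TMUs for the Titan XP's full GP102 (my assumption, double-check against a spec sheet):

code:
// Back-of-envelope fillrate math: pixel fill = ROPs * clock,
// texture fill = TMUs * clock. Unit counts are the usually quoted
// figures for a full GP102 (Titan X Pascal); an assumption here.
#include <cstdio>

int main() {
    const int   rops     = 96;     // render output units
    const int   tmus     = 224;    // texture mapping units
    const float clockGHz = 1.987f; // the 1987 MHz overclock above

    std::printf("Pixel fill:   %.1f GP/s\n", rops * clockGHz); // ~190.8, matching the 190.7 quoted
    std::printf("Texture fill: %.1f GT/s\n", tmus * clockGHz); // ~445.1, matching the 445 quoted
    return 0;
}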
# ? Feb 11, 2017 09:49 |
craig588 posted:Yeah, it was a batch of faulty caps. They couldn't take the heat like they were specced to and the extra cooling kits helped keep faulty ones working. lol, no. Extra cooling will maybe extend the life of a faulty cap by a week or two, but it won't do jack poo poo for long-term reliability.
|
|
# ? Feb 11, 2017 13:03 |
|
|
SwissArmyDruid posted:You joke, but sometimes I feel like we're just about one step short of active out-and-out sabotage across lines. There is no shortage of cheating in benchmarks from both sides. "Nvidia can recompile it at will"? Um, no, it is not as simple as flipping a compiler switch if you want to take proper advantage of SSE, especially if the old code was hand-tuned assembler. Even if not, you'd want to use SIMD intrinsics for efficiency, not just recompile.
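To make that last point concrete, here's a toy illustration. The intrinsics are real SSE from <xmmintrin.h>, but the surrounding scenario is mine, not PhysX code: the compiler won't turn the scalar version into the vector version just because you flip a flag; someone has to restructure the loop around four-wide operations and own the tail handling.

code:
// Why "just recompile for SSE" undersells the work: vectorizing means
// restructuring the data flow into 4-wide operations, not flipping a switch.
#include <xmmintrin.h>

// Roughly what x87-era scalar code does: one float at a time.
void integrate_scalar(float* pos, const float* vel, float dt, int n) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// The hand-vectorized equivalent: 4 floats per instruction, a different
// loop structure, and a scalar tail you now have to handle yourself.
void integrate_sse(float* pos, const float* vel, float dt, int n) {
    const __m128 vdt = _mm_set1_ps(dt); // broadcast dt into all 4 lanes
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
    for (; i < n; ++i) // leftover elements
        pos[i] += vel[i] * dt;
}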
|
# ? Feb 11, 2017 15:29 |