1gnoirents
Jun 28, 2014

hello :)

AVeryLargeRadish posted:

No, not really. It's probably a better idea to just sell it and use the money for something else.

I was looking forward to "PhysX 2.0" after a little demo I saw, but ... much like the first go-around it's pretty much evaporated, AFAIK, sadly. I've only seen one game ever where PhysX was actually cool and really well integrated into the game (Killing Floor 2); in every other example I've seen, it's been totally optional in every sense of the word.

repiv
Aug 13, 2009

1gnoirents posted:

I was looking forward to "PhysX 2.0" after a little demo I saw, but ... much like the first go-around it's pretty much evaporated, AFAIK, sadly. I've only seen one game ever where PhysX was actually cool and really well integrated into the game (Killing Floor 2); in every other example I've seen, it's been totally optional in every sense of the word.

That's just the reality of game development unfortunately. If a technique is too heavy to run on consoles it just won't get used 99% of the time, and the other 1% will have it tacked on at the last minute.

I'm still salty that Epic ripped their SVOGI implementation out of UE4 and reverted to static baked lighting to appease the console perf target gods :(

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
PhysX died the moment it became Nvidia proprietary

wolrah
May 8, 2006
what?
PhysX was dead from day one because multiplayer matters in most games these days. Graphics settings can vary between players with no real impact if things are done right (see old-school Counter-Strike hitscan resolution for how to do it wrong), but physics have to be synchronized if they're meaningful.

That means you get three choices:

1. PhysX hardware is mandatory.
2. PhysX software implementation significantly increases CPU requirement for PCs without hardware.
3. PhysX used for meaningless visual fluff that doesn't impact gameplay.

Obviously #3 was the most common choice in commercial games. In Borderlands 2 for example you got more debris and more detailed liquids.

In a single player title PhysX could actually matter, but it still mostly seems to be ignored there.
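
A minimal sketch of the desync problem described above, assuming a lockstep-style game where every client steps the same simulation. The toy Crate struct and the bias constant are made up; the bias just stands in for any platform difference in the solver (x87 vs. SSE rounding, CPU vs. GPU paths, thread timing).
pre:
// Toy C++ example: why gameplay-affecting physics must produce identical
// results on every client. Nothing here is real PhysX code.
#include <cstdio>
#include <cmath>

struct Crate { double x = 0.0, v = 10.0; };

// One fixed 60 Hz step; `bias` models a tiny per-platform numerical difference.
void step(Crate& c, double bias) {
    const double dt = 1.0 / 60.0;
    c.v += -9.81 * dt + bias;   // gravity plus platform-dependent error
    c.x += c.v * dt;
}

int main() {
    Crate clientA, clientB;
    for (int frame = 0; frame < 600; ++frame) {   // ten seconds of lockstep
        step(clientA, 0.0);
        step(clientB, 1e-7);                      // "same" physics, off by 1e-7 per step
    }
    // The positions have measurably diverged and the drift only compounds,
    // so a shot that connects on client A can whiff on client B, hence the
    // three options above.
    std::printf("A: %.4f  B: %.4f  drift: %.6f\n",
                clientA.x, clientB.x, std::fabs(clientA.x - clientB.x));
}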

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

wolrah posted:

PhysX was dead from day one because multiplayer matters in most games these days. Graphics settings can vary between players with no real impact if things are done right (see old-school Counter-Strike hitscan resolution for how to do it wrong), but physics have to be synchronized if they're meaningful.

That means you get three choices:

1. PhysX hardware is mandatory.
2. PhysX software implementation significantly increases CPU requirement for PCs without hardware.
3. PhysX used for meaningless visual fluff that doesn't impact gameplay.

Obviously #3 was the most common choice in commercial games. In Borderlands 2 for example you got more debris and more detailed liquids.

In a single player title PhysX could actually matter, but it still mostly seems to be ignored there.

I can't seem to find it, but I recall there being a multiplayer arena FPS that was in development using PhysX as a primary point of gameplay. This was before they got bought out, and I don't think the game ever released as a result.

wargames
Mar 16, 2008

official yospos cat censor

wolrah posted:

In a single player title PhysX could actually matter, but it still mostly seems to be ignored there.

Because you need to have 1 or 2: option 3 often turns into 1 for Nvidia users, or a massive FPS sink via 2 for AMD users. It needs to be a hardware solution for both AMD and Nvidia or it will never, ever catch on, because then the consoles would also have to have it.

Craptacular!
Jul 9, 2001

Fuck the DH

KingEup posted:

What model did they give you?

1gnoirents posted:

Yeah, rare in-the-field-style coil-whine-busting combos are always appreciated, even if there is no guarantee


I went from an HX750 to an HX750i, which is the fully-modular one with a diagnostic USB port and platinum efficiency etc. My problems with the RX470 disappeared and it's now in Normal Graphics Card Noise range.

Hopper
Dec 28, 2004

BOOING! BOOING!
Grimey Drawer

craig588 posted:

The 7870 uses between 175 and 185 watts depending on the version you have; even if you had a +20 watt version of the 480 you're still only at 170 watts. The power draw issues 480s had at launch are all fixed now, so no reason to worry about that either.

Thanks, that's actually a good point I haven't been able to get info on. Unlike 10 years ago, I don't follow the PC hardware market that much anymore, and thanks to a gazillion "review" pages with just as many measuring methods (calculated card consumption, total system consumption... blah) and brand comparisons, I was unable to find a reliable source on the power draw of the cards alone.

Since my monitors don't have DisplayPort, it really comes down to needing two HDMI ports, and the MSI Gaming X and the Nitro+ OC are both within 10€ of each other right now. I guess I'll get the Nitro+ then; I've always liked the build quality of Sapphire cards, they've never let me down, and a bit of noise doesn't faze me since I use headphones anyway. Might as well pay 10€ more for two more frames and easier fan replacement.

Dren
Jan 5, 2001

Pillbug
DisplayPort to HDMI adapters are cheap...

Stanley Pain
Jun 16, 2001

by Fluffdaddy
The only game I know that currently uses PhysX (and why I still have a 970 in my system w/ a 1080) is Killing Floor 2. Gibs+Fluids look incredible :gibs:

repiv
Aug 13, 2009

Really, I don't think GPU PhysX would have much of an impact even if it were cross-platform. The things it covers are either cheap enough to run on the CPU now (cloth physics), replaceable with much cheaper GPU approximations that look almost as convincing (particle physics), or so hideously expensive that no cross-platform game would use them as anything but an afterthought (fluid dynamics).

PerrineClostermann posted:

I can't seem to find it, but I recall there being a multiplayer arena FPS that was in development using PhysX as a primary point of gameplay. This was before they got bought out, and I don't think the game ever released as a result.

You're probably thinking of CellFactor: Revolution. It did get released but nobody cared because it was terrible.

repiv fucked around with this message at 15:47 on Feb 9, 2017

betamax hipster
Aug 13, 2016

Stanley Pain posted:

The only game I know that currently uses PhysX (and why I still have a 970 in my system w/ a 1080) is Killing Floor 2. Gibs+Fluids look incredible :gibs:

PhysX is actually a reasonably popular physics engine; it's just that no one bothers to do the whole "additional effects on Nvidia cards" thing.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

repiv posted:



You're probably thinking of CellFactor: Revolution. It did get released but nobody cared because it was terrible.

Ah, they took that tech demo and made an actual game out of it. Did it really suck? That's unfortunate.

EdEddnEddy
Apr 5, 2012



I would think adding some really good PhysX support to, say, the next Elder Scrolls or Fallout game would be a big selling point for it.

Getting some good, real fabric, water, fire, and debris effects going to make the world feel more alive, and making dragon battles even more epic, would make things look absolutely amazing on PC.


Hell, I'm still waiting for proper fabric physics on pretty much any in-game character ever since "Grace", the Nvidia demo for the GeForce 4 Ti.

Josh Lyman
May 24, 2009


betamax hipster posted:

PhysX is actually a reasonably popular physics engine; it's just that no one bothers to do the whole "additional effects on Nvidia cards" thing.

It seems like only a couple games support PhysX these days.

repiv
Aug 13, 2009

Josh Lyman posted:

It seems like only a couple games support PhysX these days.

Nah, he's right. PhysX is everywhere; it's the default physics engine in UE3/UE4/Unity (that's a shitload of games), and a number of in-house engines use it as well. Even AMD-backed games like Hitman use PhysX.

The distinction is that most games only use the CPU-based PhysX solver, not the GPU-accelerated parts.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

EdEddnEddy posted:

I would think adding some really good PhysX support to say, the next Elder Scrolls or Fallout game would be a big selling point for it.

Not using Gamebryo would be a selling point for it

vv Alright, I should have been more fair

HalloKitty fucked around with this message at 20:01 on Feb 9, 2017

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

HalloKitty posted:

Not using Gamebryo would be a selling point for it

Some pretty well coded games use Gamebryo. The problem is the code Bethesda piled on top of it.

MagusDraco
Nov 11, 2011

even speedwagon was trolled

xthetenth posted:

Some pretty well coded games use Gamebryo. The problem is the code Bethesda piled on top of it.

Yeah. Gamebryo really isn't the problem. Catherine is in Gamebryo, Civ IV is in Gamebryo, loving Sid Meier's Pirates! is in Gamebryo. (So is Rocksmith, of all things.)

betamax hipster
Aug 13, 2016

repiv posted:

Nah, he's right. PhysX is everywhere; it's the default physics engine in UE3/UE4/Unity (that's a shitload of games), and a number of in-house engines use it as well. Even AMD-backed games like Hitman use PhysX.

The distinction is that most games only use the CPU-based PhysX solver, not the GPU-accelerated parts.

*She

It's actually pretty decent middleware, which turns out to be a far more valuable trait than its ability to be GPU-accelerated in an era when most games are GPU-bound and incapable of saturating their CPU threads!

Haquer
Nov 15, 2009

That windswept look...

Where's the spreadsheet

SwissArmyDruid
Feb 14, 2014

by sebmojo
Does Nvidia still do the thing where the CPU solver for PhysX uses x87 floating-point operations, or some dumb thing like that, as an inducement to get a PhysX card/get rid of your AMD card?

I mean, I assume they do, it's Nvidia. There's no amount of lovely anti-competitive things they won't do these days.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
I hear if you try to enable PhysX on an AMD card it'll try to re-enable the power circuitry on any detected 480s and hope your motherboard explodes. TEACH YOU TO BUY FROM RED TEAM.

repiv
Aug 13, 2009

SwissArmyDruid posted:

Does Nvidia still do the thing where the CPU solver for PhysX uses x87 floating-point operations, or some dumb thing like that, as an inducement to get a PhysX card/get rid of your AMD card?

I mean, I assume they do, it's Nvidia. There's no amount of lovely anti-competitive things they won't do these days.

Actually no, the lovely x87 code was from the old PhysX2 codebase they inherited from Ageia. NV rewrote a large chunk of the CPU solver in PhysX3 and moved everything over to SSE with proper multi-threading.

Believe it or not, the CPU path in PhysX is really good these days, despite Nvidia's obvious conflict of interest.

SwissArmyDruid
Feb 14, 2014

by sebmojo

DrDork posted:

I hear if you try to enable PhysX on an AMD card it'll try to re-enable the power circuitry on any detected 480s and hope your motherboard explodes. TEACH YOU TO BUY FROM RED TEAM.

You joke, but sometimes I feel like we're just one step short of active, out-and-out sabotage across vendor lines. There is no shortage of cheating in benchmarks from both sides.

repiv posted:

Actually no, the lovely x87 code was from the old PhysX2 codebase they inherited from Ageia.

Ah, you see, I take issue with that, and this time around, I have David Kanter as backup! http://www.realworldtech.com/physx87/4/

quote:

So why is PhysX using x87?

PhysX is certainly not using x87 because of the advantages of extended precision. The original PPU hardware only had 32-bit single precision floating point, not even 64-bit double precision, let alone the extended 80-bit precision of x87. In fact, PhysX probably only uses single precision on the GPU, since it is accelerated on the G80, which has no double precision. The evidence all suggests that PhysX only needs single precision.

PhysX is certainly not using x87 because it contains legacy x87 code. Nvidia has the source code for PhysX and can recompile at will.

PhysX is certainly not using x87 because of a legacy installed base of older CPUs. Any gaming system purchased since 2005 will have SSE2 support, and the PPU was not released till 2006. Ageia was bought by Nvidia in 2008, and almost every CPU sold since then (except for some odd embedded ones) has SSE2 support. PhysX is not targeting any of the embedded x86 market either; it’s designed for games.

The truth is that there is no technical reason for PhysX to be using x87 code. PhysX uses x87 because Ageia and now Nvidia want it that way. Nvidia already has PhysX running on consoles using the AltiVec extensions for PPC, which are very similar to SSE. It would probably take about a day or two to get PhysX to emit modern packed SSE2 code, and several weeks for compatibility testing. In fact for backwards compatibility, PhysX could select at install time whether to use an SSE2 version or an x87 version – just in case the elusive gamer with a Pentium Overdrive decides to try it.

But both Ageia and Nvidia use PhysX to highlight the advantages of their hardware over the CPU for physics calculations. In Nvidia's case, they are also using PhysX to differentiate with AMD's GPUs. The sole purpose of PhysX is a competitive differentiator to make Nvidia's hardware look good and sell more GPUs. Part of that is making sure that Nvidia GPUs look a lot better than the CPU, since that is what they claim in their marketing. Using x87 definitely makes the GPU look better, since the CPU will perform worse than if the code were properly generated to use packed SSE instructions.

I won't speculate as to WHY they migrated their little black boxes from single-threaded x87 to multi-threaded SSE, but it certainly wasn't an "oh, we inherited this old code!" situation.
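
For reference, the gap Kanter describes is roughly the difference below: x87 (or any scalar) code handles one float per operation, while a single packed SSE instruction handles four. This is only an illustrative sketch using the standard <xmmintrin.h> intrinsics, not anything from the actual PhysX source.
pre:
// Scalar vs. packed single-precision adds: the shape of the difference
// Kanter is talking about. Illustrative only, not PhysX code.
#include <xmmintrin.h>  // SSE intrinsics (_mm_loadu_ps, _mm_add_ps, ...)

// Scalar path: one element per operation. A compiler targeting x87 emits
// something like this, one add at a time through the x87 register stack.
void add_scalar(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; ++i)
        out[i] = a[i] + b[i];
}

// Packed SSE path: one ADDPS processes four floats per iteration, so a
// solver dominated by this kind of math runs several times faster per core.
void add_packed(const float* a, const float* b, float* out, int n) {
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
    }
    for (; i < n; ++i)          // leftover elements
        out[i] = a[i] + b[i];
}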

SwissArmyDruid fucked around with this message at 23:29 on Feb 9, 2017

repiv
Aug 13, 2009

IIRC their excuse was that the PC/x87 and PPC/AltiVec codebases were heavily diverged, so even if they ported the AltiVec path to SSE the resulting merge would have been a mess. So they sat on it until the big PhysX3 rewrite.

Who knows if they were bullshitting, but either way it hasn't been an issue since 2011 :shrug:

EdEddnEddy
Apr 5, 2012



Wouldn't all the bad press they got back when PhysX was sort of big and "The Way It's Meant to Be Played" (GameWorks?) was everywhere be what pushed them to make it suck a little less on non-Nvidia hardware?

Raskolnikov
Nov 25, 2003

repiv posted:

I'm still salty that Epic ripped their SVOGI implementation out of UE4 and reverted to static baked lighting to appease the console perf target gods :(

https://www.youtube.com/watch?v=PEfqtOYjolE

After looking at this video, me too! :cry:

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
They really should've called this the "we're sorry" edition: http://www.guru3d.com/news-story/evga-details-icx-technology.html

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

PerrineClostermann posted:

Physx died the moment it became Nvidia proprietary

Even before that, everybody knew hardware PhysX was DOA tech from day one because of market realities; it's just that none of the professional reviewers wanted to be that evil tech-hatin' guy bursting Ageia's bubble.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

BIG HEADLINE posted:

They really should've called this the "we're sorry" edition: http://www.guru3d.com/news-story/evga-details-icx-technology.html

I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate?

SlayVus
Jul 10, 2009
Grimey Drawer

spasticColon posted:

I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate?

They added a fuse between the power delivery and the VRMs to prevent issues from occurring.

craig588
Nov 19, 2005

by Nyc_Tattoo

spasticColon posted:

I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate?

Yeah, it was a batch of faulty caps. They couldn't take the heat like they were specced to and the extra cooling kits helped keep faulty ones working.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

AVeryLargeRadish posted:

No, not really. It's probably a better idea to just sell it and use the money for something else.

Those GTX 950s are going for ~$80-100 used on Ebay last time I checked. It boggles the mind how much used, semi-outdated tech goes for when brand new RX470s are hitting $135 MIR.

SlayVus
Jul 10, 2009
Grimey Drawer

Palladium posted:

Those GTX 950s are going for ~$80-100 used on Ebay last time I checked. It boggles the mind how much used, semi-outdated tech goes for when brand new RX470s are hitting $135 MIR.

You can get a 1050 for $110 on Newegg, which is slightly faster than a 950, and a 1050 Ti for $140. At boost speeds the 1050 Ti gives almost double the gigapixels and gigatexels per second.

GT/s is more important currently, unless you're running a higher resolution where GP/s becomes a limiting factor. The RX 470 for $140 is a good deal.
pre:
RX 470       38.6 GP/s    154.4 GT/s
1050 Ti      60.8 GP/s     91.2 GT/s
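
For anyone wondering where those fillrate numbers come from: pixel fill is ROPs times clock, texture fill is TMUs times clock. A back-of-the-envelope sketch, assuming the commonly published unit counts (32 ROPs / 128 TMUs for the RX 470, 32 ROPs / 48 TMUs for the 1050 Ti), the RX 470's 1.206 GHz stock boost, and the roughly 1.9 GHz overclocked boost that reproduces the 1050 Ti figures quoted above:
pre:
// Back-of-the-envelope fillrate math:
//   GP/s = ROPs * clock (GHz),  GT/s = TMUs * clock (GHz)
// Unit counts are the commonly published ones; the 1.9 GHz clock is an
// assumed overclocked boost, not the card's stock spec.
#include <cstdio>

struct Gpu { const char* name; int rops; int tmus; double clock_ghz; };

int main() {
    const Gpu cards[] = {
        {"RX 470",  32, 128, 1.206},   // stock boost
        {"1050 Ti", 32,  48, 1.900},   // assumed OC boost
    };
    for (const Gpu& g : cards)
        std::printf("%-8s %5.1f GP/s  %6.1f GT/s\n",
                    g.name, g.rops * g.clock_ghz, g.tmus * g.clock_ghz);
}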

SlayVus fucked around with this message at 08:16 on Feb 11, 2017

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

SlayVus posted:

You can get a 1050 for $110 on Newegg, which is slightly faster than a 950, and a 1050 Ti for $140. At boost speeds the 1050 Ti gives almost double the gigapixels and gigatexels per second.

GT/s is more important currently, unless you're running a higher resolution where GP/s becomes a limiting factor. The RX 470 for $140 is a good deal.
pre:
RX 470       38.6 GP/s    154.4 GT/s
1050 Ti      60.8 GP/s     91.2 GT/s

I doubt anyone pays more attention to raw paper specs (the 1050 Ti is only 41.3 GPixel/s, BTW) than to end performance, which makes the RX 470 by far the better card at equivalent prices unless TDP is an overriding factor.

https://www.hardwareunboxed.com/the-best-sub-200-graphics-card-gtx-1050-ti-vs-rx-470-rx-460-4gb/

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

spasticColon posted:

I thought it was later determined it was not the VRMs that were causing EVGA cards to self-immolate?

Yeah, the point I was trying to make is that this particular revision seems to have every "anti-card exploding" bell and whistle tacked on.

SlayVus
Jul 10, 2009
Grimey Drawer

Palladium posted:

I doubt anyone pays more attention to raw paper specs (the 1050 Ti is only 41.3 GPixel/s, BTW) than to end performance, which makes the RX 470 by far the better card at equivalent prices unless TDP is an overriding factor.

https://www.hardwareunboxed.com/the-best-sub-200-graphics-card-gtx-1050-ti-vs-rx-470-rx-460-4gb/

That doesn't take into account the boost clock speed, which the card will almost always run above.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_10_series

Numbers in parentheses are the performance when the card is boosting. I have a Titan XP overclocked to a 1987 MHz core clock. That gives me 190.7 GP/s and 445 GT/s vs. the listed 147/343 (default boost).

If Vega can do in the neighborhood of 250 GP/s and 600 GT/s, I would definitely upgrade for VR.

SlayVus fucked around with this message at 10:02 on Feb 11, 2017

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

craig588 posted:

Yeah, it was a batch of faulty caps. They couldn't take the heat like they were specced to and the extra cooling kits helped keep faulty ones working.

lol, no. Extra cooling will maybe extend the life of a faulty cap by a week or two but it won't do jack poo poo for long term reliability.

feedmegin
Jul 30, 2008

SwissArmyDruid posted:

You joke, but sometimes I feel like we're just one step short of active, out-and-out sabotage across vendor lines. There is no shortage of cheating in benchmarks from both sides.


Ah, you see, I take issue with that, and this time around, I have David Kanter as backup! http://www.realworldtech.com/physx87/4/


I won't speculate as to WHY they migrated their little black boxes from single-threaded x87 to multi-threaded SSE, but it certainly wasn't an "oh, we inherited this old code!" situation.

'Nvidia can recompile it at will'? Um, no, it is not as simple as flipping a compiler switch if you want to take proper advantage of SSE, especially if the old code was hand-tuned assembler. Even if it wasn't, you'd want to use SIMD intrinsics for efficiency, not just recompile.
