SwissArmyDruid
Feb 14, 2014

by sebmojo

Boris Galerkin posted:

I mean if you wanted to you could. I'd understand most of it.

I should have been in bed two hours ago, but here goes. I'm sure the others will rip me a new one for getting things wrong. A float is basically a number expressed in scientific notation (in base 2, since it's a computer).

A single precision float* is represented in memory by 32 bits (FP32, or Floating Point 32). With this, a number may have about 6-9 significant decimal digits. However, some things you do with a computer require more significant digits than that. CAD is a good example. Those applications use double-precision floats** instead, which doubles the number of bits used to represent a number to, you guessed it, 64. (FP64.) With twice the bits, FP64 can store 15-17 significant digits.
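If you want to see the gap for yourself, here's a quick sketch using NumPy's float32/float64 types (NumPy is just a convenient way to get FP32 math on a CPU; the post is about GPUs, but the formats are the same):

```python
import numpy as np

# The same decimal value at single and double precision.
print(f"{np.float32(0.1):.20f}")  # drifts from 0.1 around the 8th digit
print(f"{np.float64(0.1):.20f}")  # drifts around the 17th digit

# Naive accumulation: the error compounds much faster at FP32.
acc32, acc64 = np.float32(0.0), np.float64(0.0)
for _ in range(1_000_000):
    acc32 += np.float32(0.1)
    acc64 += np.float64(0.1)
print(acc32)  # visibly off from the true sum of 100000
print(acc64)  # within a tiny fraction of 100000
```

Same value, same loop; the only difference is how many bits each sum gets to keep.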

Now, you asked,

Boris Galerkin posted:

Hmm, so you're saying the (major) difference between a consumer/gaming and workstation GPU is that AMD/Nvidia has intentionally destroyed some of DP FPUs? Is this just like the whole binning thing where some chips/wafers are tested good enough to be used for the GTX line vs the say GT line and so on except even more artificial because the FPUs are intentionally destroyed or am I simplifying it way too much?

The answer is: sort of. They may not even have destroyed anything; the units might have just come out of the fab wrong, so it can be a binning thing. Furthermore, consumer-grade applications (read: games) don't really care much about FP64 performance. That level of precision is way overkill for video games.

One more thing. Workstation silicon, given the emphasis on precision and exactness, also comes equipped with ECC RAM. It doesn't make much sense to do such precise and detailed math, only for a random bit flip to invalidate the entire result.

Workstation graphics cards are math machines. They are designed to get it right, above all else. As a result, a workstation card can sometimes wind up clocked lower than its consumer-grade brethren. Get it wrong, and that piston timing might be just out of whack, or that bridge not quite contiguous. As an engineer, I can't take that chance, so a workstation video card goes into my CAD rig.

Consumer video cards are allowed a little slop. They don't have to get it exactly right, they just need to make it _look_ right. (FISR***, anti-aliasing, ambient occlusion, etc.) And they don't really need FP64 for that; FP64 math is quite slow, and high refresh rates, texture quality, and resolution are far more desirable. That's why consumer cards are clocked higher and don't care about disabled double-precision units.
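The FISR trick in that footnote is the canonical example of "looks right" math. Here's a Python port of the classic Quake III routine (the original is C with pointer casts; the bit-reinterpretation is done here with struct):

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Quake III's 'fast inverse square root', ported from C.

    Reinterpret the float's bits as an integer, apply the magic
    constant, then refine with one Newton-Raphson step. Worst-case
    error is around 0.17% -- fine for lighting normals, useless for CAD.
    """
    i = struct.unpack('<I', struct.pack('<f', x))[0]   # float bits -> int
    i = 0x5F3759DF - (i >> 1)                          # the famous magic number
    y = struct.unpack('<f', struct.pack('<I', i))[0]   # int bits -> float
    return y * (1.5 - 0.5 * x * y * y)                 # one Newton step

print(fast_inv_sqrt(4.0))  # close to 0.5, but not exactly 0.5
```

Wrong in the 4th digit, and nobody playing a game will ever notice.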

Even if you were to play games on a workstation graphics card, and this is entirely possible, you wouldn't gain any real benefit from it. The game sends down requests for FP32 math, not FP64, and the double-precision bits don't spontaneously and graciously run it as FP64 instead so that your headshot is that much more accurate, or whatever.

*https://en.wikipedia.org/wiki/Single-precision_floating-point_format
**https://en.wikipedia.org/wiki/Double-precision_floating-point_format
***https://en.wikipedia.org/wiki/Fast_inverse_square_root


Captain Hair
Dec 31, 2007

Of course, that can backfire... some men like their bitches crazy.
So, whatever happened to the rumoured 960ti? I heard a few reports about them and then nothing more was ever said about them...

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Captain Hair posted:

So, whatever happened to the rumoured 960ti? I heard a few reports about them and then nothing more was ever said about them...

Rumors happen, and it seems tech sites feel that being right once is worth being wrong repeatedly. It's why, when a site lists every possibility, I joke about them being wccftech, because they do it constantly.

Sininu
Jan 8, 2014

Afterburner reports that my 970M core clock is 540MHz all the time in-game, which is surely incorrect since games run as expected. Is there anything I could do to fix it?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Rikaz posted:

Hello, I'm currently buying a new computer (Precision Tower 3620) and I have no idea which graphics card to choose (these are the ones I have to choose from). Sorry it's in Swedish, I'll translate if needed. I don't know if I should get one or two (dubbla), which is too much or too little.
Right now I have an MSI Nvidia GTX 750 Ti in this computer. Monitor is a Dell UltraSharp 25.
What I do: I'm a 'graphic designer'. I draw, and I'd like to do more 3D art and such, but my current computer can't really handle that. (The main game I intend to play is Dark Souls 3, and probably nothing more demanding.)
I'd be very grateful for any help choosing.

You should visit the PC Part Picking thread. They will help you find something appropriate, but the short version is: 1) you are looking at cards meant for professional workstation use, which isn't what you need, and 2) you are buying a prebuilt computer, which is never good in terms of value for money or actually getting half-decent parts. If you want anything other than a very basic office machine, you should be buying parts from a PC store and assembling them yourself, or paying the small fee most such places charge to assemble something to your specifications. The part picking thread can help you with where to shop and what to buy to suit your needs and budget; I suggest you post there before you buy anything.

sauer kraut
Oct 2, 2004

Captain Hair posted:

So, whatever happened to the rumoured 960ti? I heard a few reports about them and then nothing more was ever said about them...

It was never gonna happen. The 960 is a full chip, and there's no reason to cut the 970 down even more while it's selling like crack at $330 anyway.
All the while AMD thought it wise to wait 14 months to ship a full Tonga 380X, so no pressure on the $250 front at all.

sauer kraut fucked around with this message at 14:20 on Apr 19, 2016

penus penus penus
Nov 9, 2014

by piss__donald

SinineSiil posted:

Afterburner reports that my 970M core clock is 540MHz all the time ingame. Which is surely incorrect since games run as expected. Is there anything I could to fix it?

I have found that Afterburner has correctly displayed the clock speed for me, and it was even able to warn me of a driver issue I'd probably have been unaware of by displaying a low clock. 540MHz is very close to the limp mode for Nvidia, which could indicate a driver problem. Or you could have your iGPU selected in Afterburner, or the game has very low demand on the GPU. The first thing I would do is test another game with a known heavy GPU load; if you don't have one, run Heaven 4.0 and see what the clock reports at.

Boris Galerkin posted:

What exactly do you mean by "double precision performance" because I thought all modern CPUs and GPUs were capable of "double precision performance."


Did you just leave the CPU cooler attached and the GPU slotted in? Man, I'd be too scared about a bit of turbulence or a bump from other people's carry-ons sending the cooler right into my chip and cracking it, or the large GPU bending out of shape or something.

Yes, but I drove. I wouldn't fly with the GPU in, for sure. I did fill the gaps around the GPU with bubble wrap though, because I was still paranoid about that 90-degree PCIe.

penus penus penus fucked around with this message at 14:22 on Apr 19, 2016

Panty Saluter
Jan 17, 2004

Making learning fun!

Anime Schoolgirl posted:

some tvs actively avoid displaying the native resolution for reasons only known by cthulhu, especially on HDMI, the worst display connection standard in the world

How is a bad implementation of a standard the standard's fault?


As far as DSR and blur go, I found that setting the desktop to a DSR resolution and adjusting the blur until text looks "right" is a good way to balance things.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

sauer kraut posted:

All the while AMD thought it wise waiting 14 months to ship a full Tonga 380X, so no pressure on the 250$ front at all.

I think there's a pretty good chance it was a good idea for AMD. That rumor about a major Apple design win for Polaris is very believable, and Tonga is part of the reason why.

repiv
Aug 13, 2009

That leaked shroud showed up again

https://twitter.com/OC3D/status/722391813860950016

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

I have found that Afterburner has correctly displayed the clock speed for me, and it was even able to warn me of a driver issue I'd probably have been unaware of by displaying a low clock. 540MHz is very close to the limp mode for Nvidia, which could indicate a driver problem. Or you could have your iGPU selected in Afterburner, or the game has very low demand on the GPU. The first thing I would do is test another game with a known heavy GPU load; if you don't have one, run Heaven 4.0 and see what the clock reports at.

Newest drivers, and the only game I have checked the core frequency in is Black Ops 3. Also Nvidia Inspector seems to give correct clock readings.

penus penus penus
Nov 9, 2014

by piss__donald

SinineSiil posted:

Newest drivers, and the only game I have checked the core frequency in is Black Ops 3. Also Nvidia Inspector seems to give correct clock readings.

Is your GPU selected from the dropdown near the top of the program? Also, does the clock graph just show an even 540MHz across the board?

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

Is your GPU selected from the dropdown near the top of the program? Also, does the clock graph just show an even 540MHz across the board?

Yes, I'm looking at the GPU1 reading, which is the 970M, in Afterburner.
I tested it in Dirt Rally now too - exact same result.
Also restarted my computer, just in case.

Durinia
Sep 26, 2014

The Mad Computer Scientist
http://www.nextplatform.com/2016/04/19/drilling-nvidias-pascal-gpu/

This is focused on HPC/Analytics stuff, but includes some good info on Pascal's architecture. Of particular note:

NVIDIA Guy posted:

"The way we are able to service these many markets is that we find that right balance between a core design and then we get synergy and when we do X, then X actually helps us in other markets and it amplifies our strengths as a company. So for instance, with the Pascal architecture, we are doing pre-emption, and the motivation for that came from HPC customers, but it also benefits automotive, which has real-time workloads and GeForce as well."

There's also a link to a Pascal whitepaper that NVIDIA posted. I've reposted it here if you don't want to sign up for their sales list. :P

penus penus penus
Nov 9, 2014

by piss__donald

SinineSiil posted:

Yes, I'm looking at the GPU1 reading, which is the 970M, in Afterburner.
I tested it in Dirt Rally now too - exact same result.
Also restarted my computer, just in case.

As long as another program is reporting correctly, it's safe to say that install is borked. You can try reinstalling, or using EVGA Precision, which does the same things.

Blackfyre
Jul 8, 2012

I want wings.

Durinia posted:

http://www.nextplatform.com/2016/04/19/drilling-nvidias-pascal-gpu/

This is focused on HPC/Analytics stuff, but includes some good info on Pascal's architecture. Of particular note:


There's also a link to a Pascal whitepaper that NVIDIA posted. I've reposted it here if you don't want to sign up for their sales list. :P

Whilst interesting, I will admit I am stupid enough to ask a smarter person to clarify: does this mean Pascal will have better async compute functionality, so it will be pretty much competitive with AMD?

Sininu
Jan 8, 2014

THE DOG HOUSE posted:

As long as another program is reporting correctly, it's safe to say that install is borked. You can try reinstalling, or using EVGA Precision, which does the same things.

I did a little Googling, and it seems like Afterburner has trouble reading Alienware laptop GPU core clocks for some reason? Quite lame tbh.
https://forums.geforce.com/default/topic/811170/is-msi-afterburner-reporting-the-coreclock-incorrectly-/?offset=3
http://forum.notebookreview.com/threads/official-alienware-17-r2-r3-owners-lounge.770314/page-201
http://forum.notebookreview.com/threads/official-alienware-15-r1-r2-benchmark-thread.770319/page-25

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

More than meets the eye

Anime Schoolgirl
Nov 28, 2002

Panty Saluter posted:

How is a bad implementation of a standard the standard's fault?

it's more like a strict implementation of a standard: it wasn't until hdmi 1.4 that anyone went "hey maybe we should allow more flexible resolutions than 1280x720 or 1920x1080"

and i'm guessing that tv was made before hdmi 1.4

then there's funnier examples during that era where 1280x720 displays just refuse to display 1280x720...on VGA :allears:

Froist
Jun 6, 2004

Anime Schoolgirl posted:

it's more like a strict implementation of a standard: it wasn't until hdmi 1.4 that anyone went "hey maybe we should allow more flexible resolutions than 1280x720 or 1920x1080"

and i'm guessing that tv was made before hdmi 1.4

then there's funnier examples during that era where 1280x720 displays just refuse to display 1280x720...on VGA :allears:

My Samsung TV's panel has a native resolution of 1366x768 but will only accept a signal of that resolution on one of the three HDMI ports. The other two will only scale up a 720p signal and blur the pixels. Explain that one.

(Answer: I bought at a terrible time in the HDTV development lifecycle)

Durinia
Sep 26, 2014

The Mad Computer Scientist

Blackfyre posted:

Whilst interesting I will admit I am stupid enough to ask for a smarter person to clarify, does this mean Pascal will have better async compute functionality so will pretty much be competitive with AMD?

It's different in this case - pre-emption has to do with the ability for a card to be interrupted accurately and quickly - for using very detailed debuggers and OS pre-emption for services. That said, it's one of those cases where the hardware to do this might also help with Async compute.

I think what everyone's hoping is that they made a small extension to HyperQ in Pascal to include the graphics pipe in the ability to run concurrent kernels. If they did that, I'd expect a big statement about it when they reveal the product line.

fozzy fosbourne
Apr 21, 2010

I don't think this was posted yet, rumor has it that the PlayStation 4.5 will use a Polaris chip http://www.giantbomb.com/articles/sources-the-upgraded-playstation-4-is-codenamed-ne/1100-5437/

Hopefully it supports freesync

Panty Saluter
Jan 17, 2004

Making learning fun!

Anime Schoolgirl posted:

it's more like a strict implementation of a standard: it wasn't until hdmi 1.4 that anyone went "hey maybe we should allow more flexible resolutions than 1280x720 or 1920x1080"

and i'm guessing that tv was made before hdmi 1.4

then there's funnier examples during that era where 1280x720 displays just refuse to display 1280x720...on VGA :allears:

Considering the wackadoo resolutions some cheaper TVs used (e.g., the plasma that was really 1024x1024 but stretched and sold as 720p), I can't really fault the HDMI consortium for not foreseeing or accommodating more than just the "standard" resolutions. More flexibility is always good, though.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Eh, those TVs still accepted regular signals. The scaling was all done internally.

penus penus penus
Nov 9, 2014

by piss__donald

Froist posted:

My Samsung TV's panel has a native resolution of 1366x768 but will only accept a signal of that resolution on one of the three HDMI ports. The other two will only scale up a 720p signal and blur the pixels. Explain that one.

(Answer: I bought at a terrible time in the HDTV development lifecycle)

Yeah, very common on TVs from like 2007 up until fairly recently, really. Some cheaper TVs still behave this way. It's common to have the good port labeled as HDMI/DVI. I have no idea why, though, except to blandly guess cost purposes.

It is nice to finally have TVs that can function correctly as monitors these days, though; that was a whole other can of worms.

Now as to why the lovely TV I'm using can somehow tell I'm using DSR, I have no idea.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Durinia posted:

It's different in this case - pre-emption has to do with the ability for a card to be interrupted accurately and quickly - for using very detailed debuggers and OS pre-emption for services. That said, it's one of those cases where the hardware to do this might also help with Async compute.

Pre-emption is also valuable for VR, at least how Oculus does it. GCN has always been better at that particular piece of the pipeline.

Durinia
Sep 26, 2014

The Mad Computer Scientist

Subjunctive posted:

Pre-emption is also valuable for VR, at least how Oculus does it. GCN has always been better at that particular piece of the pipeline.

Yes, this is a positive development for VR. It would address the comments about "current NVIDIA preemption being possibly catastrophic for VR" that came out in a David Kanter podcast a while back.

Basically, it lets you pause your rendering kernel to refresh the screen, maintaining the necessary rates for VR even if you have a spike in rendering a frame.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨

Durinia posted:

Yes, this is a positive development for VR. Would address the comments about "current NVIDIA preemption being possibly catastrophic for VR" that came out in a David Kanter podcast a while back.

Basically, it lets you pause your rendering kernel to refresh the screen, maintaining the necessary rates for VR even if you have a spike in rendering a frame.

It's not about refresh as much as being able to schedule time warp precisely. (I was never able to find anyone at Oculus who knew the source of that putative quote, FWIW.)

Durinia
Sep 26, 2014

The Mad Computer Scientist

Subjunctive posted:

It's not about refresh as much as being able to schedule time warp precisely. (I was never able to find anyone at Oculus who knew the source of that putative quote, FWIW.)

Yes, good point.

Short version: more real-time control is good.

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette

Captain Hair posted:

So, whatever happened to the rumoured 960ti? I heard a few reports about them and then nothing more was ever said about them...

That's pretty much the 970, really. :p

Panty Saluter
Jan 17, 2004

Making learning fun!
2x DSR on Dark Souls 3 is real pretty but runs at a paltry 30 fps on my 970 (3620 x 1527 on my display). I'm daydreaming of a 980ti just to run stupid resolutions.

I am not an adult :v:

Automata 10 Pack
Jun 21, 2007

Ten games published by Automata, on one cassette

Panty Saluter posted:

2x DSR on Dark Souls 3 is real pretty but runs at a paltry 30 fps on my 970 (3620 x 1527 on my display). I'm daydreaming of a 980ti just to run stupid resolutions.

I am not an adult :v:

I have a 980ti and I can only get up to 2804 x 1577 before it dips below 60fps.

By the way, is that resolution, downscaled back to 1577, still going to improve the IQ or am I loving it up?

Rite Of Massage
Aug 16, 2005

Don Lapre posted:

More than meets the eye

the hottest tech clickbait website are speculating that, if these rumors are to believed, pascal is gonna be dropped in june, if these leaks are true and i am going to continue to assume these leaks are true for my post to get views

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

New low RCS shroud looking fine!

Panty Saluter
Jan 17, 2004

Making learning fun!

Mutant Standard posted:

I have a 980ti and I can only get up to 2804 x 1577 before it dips below 60fps.

By the way is that resolution downscaled to 1577 still going to improve the IQ or am I loving it up?

DSR renders internally at the chosen resolution, then resizes to the actual monitor resolution.

There are a ton of resolutions available, and the 1.25 factor might work, but I can't see it having much benefit. Guess I could do something real dumb like try it and switch back if it sucks.
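As I understand it, the DSR factor scales the total pixel count, so each axis grows by the square root of the factor. A tiny sketch (the factor list is just illustrative):

```python
import math

def dsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    """Internal render resolution for a DSR factor (factor scales total pixels)."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# Example: a 1920x1080 display.
for f in (1.78, 2.00, 4.00):
    print(f"{f:.2f}x ->", dsr_resolution(1920, 1080, f))
# 4.00x lands on exactly 3840x2160, i.e. rendering at 4K and
# downscaling to 1080p; 2.00x gives 2715x1527.
```

That's why 2x DSR resolutions look so odd on paper: the axes scale by sqrt(2), not 2.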

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Smithereens posted:

the hottest tech clickbait website are speculating that, if these rumors are to believed, pascal is gonna be dropped in june, if these leaks are true and i am going to continue to assume these leaks are true for my post to get views

Since the Titan X is basically off the market now, it's probably likely.

Blackfyre
Jul 8, 2012

I want wings.
I'm sort of hoping the GTX 1080 isn't a thing, because if it's not that great there'll be "MORE LIKE GTX 1080....P!!" all over the web.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Blackfyre posted:

I'm sort of hoping the GTX 1080 isn't a thing, because if it's not that great there'll be "MORE LIKE GTX 1080....P!!" all over the web.

I'm just hoping that's really not the shroud, 1080 or not, because ugh, it's so much uglier than the minimalist look of Kepler and Maxwell. It's like I'm staring at a low-poly version of the insanely ugly 200-series coolers.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
http://wccftech.com/amd-radeon-m400-mobility-lineup-leak/

Well, I hope you like rebrands!


xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

How often do new mobile cards get made anyway?

Disappointed to see an R9 part be a rebrand though, if that's actually accurate.
