repiv
Aug 13, 2009

EA and Remedy showed off their experiments with DXR:

https://www.youtube.com/watch?v=LXo0WdlELJk

https://www.youtube.com/watch?v=70W2aFr5-Xk

neat

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~

Very cool. Thanks for making my $570 GTX 1070 Ti obsolete, Nvidia.

Sniep
Mar 28, 2004

All I needed was that fatty blunt...



King of Breakfast
Same but 1080 Ti?

I hope the new cards aren't particularly good at bitcoins so that I can get one.

Craptacular!
Jul 9, 2001

Fuck the DH
Here are all the known games that will support that cool new tech:



I still maintain the best thing to do with any generation is to wait for the x80ti launch, no matter which card you intend to buy at that point. Sure, it feels like you're only relevant for half a generation, but you get over that by telling yourself that everything released before the x80ti is simply a paper launch and you should just wait.

1gnoirents
Jun 28, 2014

hello :)
You could make the same argument for the Titan ever since its release, but I get the feeling the lineup we've gotten used to is going to change.

ufarn
May 30, 2009

Paul MaudDib posted:

I seem to remember reading something about how you could do a very sparse sample of raytracing and then train a deep-learning network that would transform a raster image into a realistic-looking raytraced image, but I haven't been able to find it since I read it.

That would probably be the only actual use for tensor hardware on consumer cards that I've heard of.

edit: was probably one of these two

https://casual-effects.com/research/Mara2017Denoise/Mara2017Denoise.pdf

https://arxiv.org/pdf/1603.06078.pdf
Previous discussion.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now?

Kazinsal
Dec 13, 2011

Zedsdeadbaby posted:

I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now?

It's making it possible to raytrace the stuff that benefits most from raytracing and then use machine learning/fuzzing to bullshit the rest.

eames
May 9, 2009

Does this mean DXR rendering will bring linear performance scaling with multiple GPUs?

repiv
Aug 13, 2009

Remedy posted their GDC slides breaking down that demo above and comparing it to their normal rasterizer: https://www.remedygames.com/experiments-with-directx-raytracing-in-remedys-northlight-engine/

warning: 858MB PowerPoint lol

eames posted:

Does this mean DXR rendering will bring linear performance scaling with multiple GPUs?

If someone were to build a purely raytraced engine, then yes, it would be trivial to get linear scaling over an arbitrary number of GPUs. I doubt pure raytracing will be fast enough to ship in games any time soon, though; the first round of DXR implementations will probably be hybrids (e.g. a rasterized base pass plus a raytraced reflection pass) and will have the usual problems with multiple GPUs.
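
To make the "trivial scaling" point concrete, here's a toy sketch (mine, not from any real engine) of why pure raytracing parallelizes so cleanly: every pixel is independent, so the frame can be carved into per-device chunks with no communication until reassembly. CPU worker processes stand in for GPUs, and trace_pixel is a stub.

    # Toy model of linear multi-GPU scaling in a pure raytracer:
    # pixels are independent, so workers share nothing until reassembly.
    from multiprocessing import Pool

    WIDTH, HEIGHT, NUM_GPUS = 640, 480, 4

    def trace_pixel(x, y):
        # Stand-in for a real trace: camera ray -> intersection -> shade.
        return (x ^ y) & 0xFF

    def render_rows(rows):
        # Each "GPU" renders only its own rows.
        return [(y, [trace_pixel(x, y) for x in range(WIDTH)]) for y in rows]

    if __name__ == "__main__":
        chunks = [range(i, HEIGHT, NUM_GPUS) for i in range(NUM_GPUS)]
        with Pool(NUM_GPUS) as pool:
            results = pool.map(render_rows, chunks)
        framebuffer = [None] * HEIGHT
        for chunk in results:
            for y, row in chunk:
                framebuffer[y] = row  # reassembly is the only "sync" step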

repiv fucked around with this message at 20:42 on Mar 19, 2018

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord
For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now?

Please explain as if you were talking to a three year old.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Dadbod Apocalypse posted:

For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now?

Please explain as if you were talking to a three year old.

This is a good YouTube explanation:

https://www.youtube.com/watch?v=DtfEVO9Oc3U

Basically, raytracing works on a pixel-by-pixel basis to exactly identify what is "seen" by the monitor. It looks really good, but it's really computationally expensive. This DX12 implementation is likely a combination of techniques to do enough of it cheaply enough to be feasible in games.
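
If you're curious, the "pixel-by-pixel" part boils down to firing one ray per pixel through a virtual camera. A minimal sketch, assuming a pinhole camera at the origin looking down -Z (the function name and setup are mine):

    import math

    def camera_ray(x, y, width, height, fov_deg=90.0):
        # Map the pixel center onto a virtual image plane one unit in
        # front of a pinhole camera, then normalize the direction.
        aspect = width / height
        half = math.tan(math.radians(fov_deg) / 2)
        px = (2 * (x + 0.5) / width - 1) * aspect * half
        py = (1 - 2 * (y + 0.5) / height) * half
        dx, dy, dz = px, py, -1.0
        n = math.sqrt(dx * dx + dy * dy + dz * dz)
        return (dx / n, dy / n, dz / n)

    ray = camera_ray(959, 539, 1920, 1080)  # center pixel looks roughly straight ahead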

eames
May 9, 2009

repiv posted:

If someone were to build a purely raytraced engine, then yes, it would be trivial to get linear scaling over an arbitrary number of GPUs. I doubt pure raytracing will be fast enough to ship in games any time soon, though; the first round of DXR implementations will probably be hybrids (e.g. a rasterized base pass plus a raytraced reflection pass) and will have the usual problems with multiple GPUs.

I see.

Anandtech's slides mention "Ray Tracing for Gameworks", which includes ray-traced area shadows, glossy reflections, and ambient occlusion. They also mention that these effects done with RTX on Volta would be "integer multiples faster" than with DXR on older hardware (I assume that means software via compute units).
Sounds nice, but a bit Hairworks 2.0 to me.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

eames posted:

I see.

Anandtech's slides mention "Ray Tracing for Gameworks", which includes ray-traced area shadows, glossy reflections, and ambient occlusion. They also mention that these effects done with RTX on Volta would be "integer multiples faster" than with DXR on older hardware (I assume that means software via compute units).
Sounds nice, but a bit Hairworks 2.0 to me.

Hairworks, except it makes everything you look at much nicer and allows for real reflections in water and on other surfaces, instead of just blurry rendering tricks. We are probably years and years away from it being fully utilized in game engines though.

magimix
Dec 31, 2003

MY FAT WAIFU!!! :love:
She's fetish efficient :3:

Nap Ghost

Eletriarnation posted:

Eh, I'm probably weird but I usually play my games on a 4k/60Hz screen and I just have a 1060. For newer games I often step down to 1080p, but emulators and anything indie/older than a year or two work great.
I'll admit being curious to see what variable refresh rate looks like, but not enough to pay the G-Sync premium yet.

As a year-end treat for myself, I picked up an ASUS ROG PG279, and it is loving lovely. For me, the value comes down to a couple of things. Previously I could un-cap the frame-rate but then have tearing, and tearing is nails-down-a-chalkboard noticeable and intolerable to me. Or I could lock to 60Hz (on my LG 27EA63V-P), but then have juddering or pacing issues unless I ratcheted settings down far enough. With hardware sync, I get no tearing and no need to worry about keeping the framerate absolutely consistent. I get to crank everything the hell up, squeeze everything I can from my card (the 980ti seems to do alright at 1440p), and it's good times.

To be sure, it was a meaty premium over a 27-inch, responsive monitor *without* G-Sync, but I found it very worthwhile, and pretty much the moment I fired up the first game on it, any fear of buyer's remorse dissipated entirely. Hell, I'd get a second one (I have a dual-screen setup), but my other monitor is fine for media and consoles (PS4, Switch, etc., through an HDMI switcher).

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Kazinsal posted:

Yeah, turns out that even when Logitech is paying for your Gamer House, $35k/year ain't poo poo for living in Los Angeles or wherever.

I'm sure that minimum slave wage and practically zero gain in employable experience/skills won't stop the next generation of BWM kids from embarking on their childhood dream of making money through playing viddy gaems.

Siets
Sep 19, 2006

by FactsAreUseless
Honestly, that first DXR example by EA looked stunning, but I think it's way too early in the tech's life to say the next gen will make current-gen hardware obsolete. Considering current DX12 adoption rates alone, I would not expect developers to be utilizing DXR for years yet (unless you're that one indie dev who wants to make the next Myst and has all the time in the world to focus on A. puzzles and B. experimental graphics). Microsoft also specifically stated that the API leaves pretty much all interpretation of the feature up to the hardware engineers, so even if next-gen cards "support" DXR hardware acceleration, you can bet there will continue to be multiple iterations for years to come.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Dadbod Apocalypse posted:

For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now?

Please explain as if you were talking to a three year old.

Raytracing actually models millions of rays of light leaving each source and reflecting from objects. This looks realistic, but is computationally infeasible for rendering complex scenes in realtime. In the field of cinema CGI, taking 30-60 minutes per frame is not uncommon.

Realtime graphics like gaming use a bag of mathematical tricks to simulate this. It looks worse, but you can do it in real-time.
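
Rough arithmetic on the gap (my numbers, purely for scale):

    pixels = 1920 * 1080          # 1080p
    fps = 60
    rays_per_pixel = 4            # one primary ray plus a few bounce/shadow rays
    print(pixels * fps * rays_per_pixel / 1e9, "billion rays/sec")  # ~0.5

Half a billion rays per second for even those modest settings, and film-quality output uses hundreds or thousands of samples per pixel rather than four, which is how you end up at minutes per frame.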

Llamadeus
Dec 20, 2005

Zedsdeadbaby posted:

I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now?
A path-traced Quake 2 (without any denoising): https://www.youtube.com/watch?v=x19sIltR0qU (YouTube's compression doesn't do it much good, but there's a link to the source video)

Dadbod Apocalypse posted:

For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now?

Please explain as if you were talking to a three year old.
Modern real-time rendering uses a huge number of hacks and approximations to get something vaguely approaching realism. Path tracing is a ground-up approach to rendering a scene, essentially building it up photon by photon, so it gets a lot of realistic lighting effects right without any extra effort.



The softening of the shadow here as it extends further out would be expensive to implement in current engines, for instance. In practice I think the most immediate benefit of path tracing is more accurate reflections, since current games use either a cubemap or some screen space effect, both of which can look pretty janky.
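
That penumbra also shows why raytracing gets soft shadows almost for free: fire a handful of shadow rays at different points on the area light and average the hits. A toy sketch (my naming; occluded() stands in for a real scene intersection query, and the box jitter is a crude substitute for proper area-light sampling):

    import random

    def soft_shadow(point, light_center, light_size, occluded, samples=16):
        # Fraction of jittered shadow rays that reach the light:
        # 1.0 = fully lit, 0.0 = full shadow, in between = penumbra.
        visible = 0
        for _ in range(samples):
            target = tuple(c + random.uniform(-light_size, light_size)
                           for c in light_center)
            if not occluded(point, target):
                visible += 1
        return visible / samples

Surface points that see only part of the light get fractional visibility, and that fraction changes gradually with distance from the occluder, which is exactly the softening described above.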

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Paul MaudDib posted:

Raytracing actually models millions of rays of light leaving each source and reflecting from objects. This looks realistic, but is computationally infeasible for rendering complex scenes in realtime. In the field of cinema CGI, taking 30-60 minutes per frame is not uncommon.

Realtime graphics like gaming use a bag of mathematical tricks to simulate this. It looks worse, but you can do it in real-time.

Well, except it works in reverse, doesn't it? It doesn't trace every ray; it traces from each pixel and asks, "What object will I hit, and how does the light interact with it?"

Basically, you go backwards and start with where the light ends up, and only bother figuring out what happens to those light rays, not the ones that never reach the "eye".
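
In code terms, "what object will I hit?" is just an intersection test per ray. The classic ray/sphere version, as a minimal sketch (my naming; vectors are plain (x, y, z) tuples):

    import math

    def hit_sphere(origin, direction, center, radius):
        # Solve |origin + t*direction - center|^2 = radius^2 for t;
        # return the nearest positive hit distance, or None for a miss.
        oc = tuple(o - c for o, c in zip(origin, center))
        a = sum(d * d for d in direction)
        b = 2.0 * sum(o * d for o, d in zip(oc, direction))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * a * c
        if disc < 0:
            return None        # this pixel's ray misses the sphere
        t = (-b - math.sqrt(disc)) / (2 * a)
        return t if t > 0 else None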

Llamadeus
Dec 20, 2005
You can do both at once: https://en.wikipedia.org/wiki/Path_tracing#Bidirectional_path_tracing

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
https://www.youtube.com/watch?v=x19sIltR0qU

This is someone who built ray tracing into Quake 2. They're running on a Titan Xp.


Edit: beaten like I'm back in Quake 2's multiplayer.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
eBay prices are coming down. $235 for a 1060 3 GB, $460 for a 1070. Still above MSRP but that's 25% lower than they were a month ago.

If the dip in Bitcoin and Eth continues, and especially if it turns out NVIDIA has been stockpiling next-gen Turing/Ampere cards, prices are going to plunge pretty hard here.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Paul MaudDib posted:

I seem to remember reading something about how you could do a very sparse sample of raytracing and then train a deep-learning network that would transform a raster image into a realistic-looking raytraced image, but I haven't been able to find it since I read it.

That would probably be the only actual use for tensor hardware on consumer cards that I've heard of.

edit: was probably one of these two

https://casual-effects.com/research/Mara2017Denoise/Mara2017Denoise.pdf

https://arxiv.org/pdf/1603.06078.pdf

SwissArmyDruid posted:

Rest in peace, raytracing. Dead before you could ever become a thing.

https://www.pcper.com/news/General-Tech/OTOY-Discussed-AI-Denoising-Unite-Austin

TL;DR: Instead of raytracing an entire scene, raytrace a fraction of samples evenly distributed across the scene, and then have a denoising AI interpolate the rest.
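
In toy form, the sparse half of that pipeline is easy to picture (a sketch of the idea only; naive neighbor averaging stands in here for the trained denoiser, which is the actual hard part):

    import random

    def render_sparse(width, height, trace, fraction=0.25):
        # Trace only a random fraction of pixels; leave the rest as None.
        img = [[None] * width for _ in range(height)]
        for y in range(height):
            for x in range(width):
                if random.random() < fraction:
                    img[y][x] = trace(x, y)
        return img

    def fill_holes(img):
        # Stand-in for the AI pass: fill untraced pixels from traced neighbors.
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]
        for y in range(h):
            for x in range(w):
                if img[y][x] is None:
                    near = [img[j][i]
                            for j in range(max(0, y - 1), min(h, y + 2))
                            for i in range(max(0, x - 1), min(w, x + 2))
                            if img[j][i] is not None]
                    out[y][x] = sum(near) / len(near) if near else 0.0
        return out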

I'm amazed that it took less than a year-and-a-half to go from whitepapers and talks to implementation.

edit: ack, not the last page. Still: so we're back to heavy compute on GPUs again, are we? PLEASE don't resurrect the decaying corpse of GCN again, AMD. PLEASE.

SwissArmyDruid fucked around with this message at 23:56 on Mar 19, 2018

SlayVus
Jul 10, 2009
Grimey Drawer
So with all this AI graphics technology, will that make tensor cores in next gen GPUs useful?

repiv
Aug 13, 2009

SlayVus posted:

So with all this AI graphics technology, will that make tensor cores in next gen GPUs useful?

That's the theory, yes

"Anandtech posted:

Meanwhile NVIDIA also mentioned that they have the ability to leverage Volta's tensor cores in an indirect manner, accelerating ray tracing by doing AI denoising, where a neural network could be trained to reconstruct an image using fewer rays, a technology the company has shown off in the past at GTC.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
EVGA just had some B-stock GTX Titan 6GBs flash through for $400; guess they really are cleaning out the warehouse.

Better hurry up: if prices keep declining, then in another few weeks we'll be back to nobody really caring about Kepler anymore.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I feel like raytracing could be extremely useful for VR content, and it could be done in real-time on games like Superhot, which have Quake 2-level graphics by default.

I've viewed static raytraced images on Gear VR and it really sells the realism when the lighting is so intricate.

repiv
Aug 13, 2009

Zero VGS posted:

I feel like raytracing could be extremely useful for VR content, and it could be done in real-time on games like Superhot, which have Quake 2-level graphics by default.

I've viewed static raytraced images on Gear VR and it really sells the realism when the lighting is so intricate.

It's a natural fit for VR because instead of rendering a normal rectangular image then applying VR distortion correction afterwards, which causes an uneven distribution of detail, a raytracer can just render the distorted image directly. Much cleaner.

The hard part is hitting 90fps, but I suppose you could dynamically adjust the sample count to hit that mark, increasing noise instead of dropping frames.
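
That controller could be as dumb as a feedback loop on the last frame's time (an illustrative sketch, not from any shipping engine):

    BUDGET_MS = 1000.0 / 90.0    # ~11.1 ms per frame at 90fps

    def adjust_samples(samples, last_frame_ms, lo=1, hi=64):
        # Trade noise for frame rate: over budget -> halve samples fast,
        # comfortable headroom -> creep back up one sample at a time.
        if last_frame_ms > BUDGET_MS * 0.95:
            return max(lo, samples // 2)
        if last_frame_ms < BUDGET_MS * 0.75:
            return min(hi, samples + 1)
        return samples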

repiv fucked around with this message at 01:29 on Mar 20, 2018

Star War Sex Parrot
Oct 2, 2003

Perhaps a long shot, but is there anyone working at NVIDIA who’d entertain questions from someone with an offer? Glassdoor and such are helpful, but don’t always tell the whole story so I’m interested in soliciting as many opinions as I can. I have PMs if so.

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
If you play enough games using the tensor cores for ray tracing, eventually your 1180ti will be smart enough to mine while you're not home to buy itself a body.

Arivia
Mar 17, 2011

Lockback posted:

If you play enough games using the tensor cores for ray tracing, eventually your 1180ti will be smart enough to mine while you're not home to buy itself a body.

a body, or a body with double guac?

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~

Lockback posted:

If you play enough games using the tensor cores for ray tracing, eventually your 1180ti will be smart enough to mine while you're not home to buy itself a body.

Or it invents a sim-world where Nvidia GPUs with Tensor cores exist :tinfoil:

Cygni
Nov 12, 2005

raring to post

Lockback posted:

If you play enough games using the tensor cores for ray tracing, eventually your 1180ti will be smart enough to mine while you're not home to buy itself a body.

My GPU left me and is dating a barista HELP

Twinty Zuleps
May 10, 2008

by R. Guyovich
Lipstick Apathy
Since everyone else shared a hardware path-tracing denoising link, I'll share Pixar's old one: http://graphics.pixar.com/library/MLDenoisingB/paper.pdf

coke
Jul 12, 2009
So with all the GPU stock being cleared out, does that mean a new product is incoming?

Or is it just that prices are coming down, etc., and they still have no competition or pressure to release anything new for gamers till the end of the year?

Craptacular!
Jul 9, 2001

Fuck the DH
The past 8 pages or so have been "nobody loving knows".

ufarn
May 30, 2009
Nvidia is (still) set to announce their lineup around March 26. We literally don't know anything else.

ufarn fucked around with this message at 11:04 on Mar 20, 2018

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
nVidia making such a big deal recently about raytracing makes me think back to the console "bit wars" of the 90s.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
AMD fires back with the return of Blast Processing
