|
EA and Remedy showed off their experiments with DXR: https://www.youtube.com/watch?v=LXo0WdlELJk https://www.youtube.com/watch?v=70W2aFr5-Xk neat
|
# ? Mar 19, 2018 19:19 |
|
repiv posted:EA and Remedy showed off their experiments with DXR: Very cool. Thanks for making my $570 GTX 1070 Ti obsolete, NVidia.
|
# ? Mar 19, 2018 19:42 |
|
Same but 1080 Ti? I hope the new cards aren't particularly good at bit coins so that I can get one
|
# ? Mar 19, 2018 19:48 |
|
Here's all the known games that will support that cool new tech: I still maintain the best thing to do with any generation is wait for the x80ti launch, no matter which card you intend to buy at that point. Sure it feels like you're only relevant for half a generation, but you get over that by telling yourself that every generation before the x80ti is simply a paper launch and you should just wait.
|
# ? Mar 19, 2018 19:54 |
|
You could make the same arguments for the Titan since its release but I get the feeling the lineup we've gotten used to is going to change
|
# ? Mar 19, 2018 19:59 |
|
Paul MaudDib posted:I seem to remember reading something about how you could do a very sparse sample of raytracing and then train a deep-learning network that would transform a raster image into a realistic-looking raytraced image, but I haven't been able to find it since I read it.
|
# ? Mar 19, 2018 19:59 |
|
I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now?
|
# ? Mar 19, 2018 20:12 |
|
Zedsdeadbaby posted:I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now? It's making it possible to raytrace the stuff that benefits most from raytracing and then use machine learning/fuzzing to bullshit the rest.
|
# ? Mar 19, 2018 20:20 |
|
Does this mean DXR rendering will bring linear performance scaling with multiple GPUs?
|
# ? Mar 19, 2018 20:27 |
|
Remedy posted their GDC slides breaking down that demo above and comparing it to their normal rasterizer: https://www.remedygames.com/experiments-with-directx-raytracing-in-remedys-northlight-engine/ warning: 858mb powerpoint lol eames posted:Does this mean DXR rendering will bring linear performance scaling with multiple GPUs? If someone were to build a purely raytraced engine, then yes it would be trivial to get linear scaling over an arbitrary number of GPUs. I doubt pure raytracing will be fast enough to ship in games any time soon though, the first round of DXR implementations will probably be hybrids (e.g. rasterized base pass + raytraced reflection pass) and have the usual problems with multiple GPUs. repiv fucked around with this message at 20:42 on Mar 19, 2018 |
# ? Mar 19, 2018 20:34 |
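To sketch why a pure raytracer would scale linearly: every pixel is an independent computation, so scanlines can be dealt out to any number of devices and merged with no communication between them. A toy Python illustration, where `trace_pixel` is a hypothetical stand-in for a real per-pixel trace:

```python
# Toy sketch: ray tracing is embarrassingly parallel per pixel, so a frame
# can be split into scanline bands with no cross-talk between devices.

def trace_pixel(x, y):
    # placeholder shading: any pure function of (x, y) works here
    return (x * 31 + y * 17) % 256

def render_rows(width, rows):
    # "one device" renders just the scanlines it was assigned
    return {y: [trace_pixel(x, y) for x in range(width)] for y in rows}

def render_split(width, height, n_devices):
    # deal scanlines round-robin to each device, then merge the bands
    frame = {}
    for d in range(n_devices):
        frame.update(render_rows(width, range(d, height, n_devices)))
    return [frame[y] for y in range(height)]
```

With a real trace kernel each band could run on its own GPU, and the merge is the only shared work — which is why the scaling is (in principle) linear.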
|
For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now? Please explain as if you were talking to a three year old.
|
# ? Mar 19, 2018 20:36 |
|
Dadbod Apocalypse posted:For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now? This is a good youtube explanation: https://www.youtube.com/watch?v=DtfEVO9Oc3U Basically raytracing works on a pixel-by-pixel basis to exactly identify what is "seen" by the monitor. It looks really good but it's really computationally expensive. This DX12 implementation is likely a combination of things to do enough of it cheaply enough to be feasible in games.
|
# ? Mar 19, 2018 20:40 |
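The pixel-by-pixel idea above boils down to firing one primary ray through each pixel of a virtual image plane. A minimal sketch of that ray-generation step in Python, assuming a pinhole camera and a made-up 90° field of view:

```python
import math

def primary_ray(x, y, width, height, fov_deg=90.0):
    # map pixel (x, y) to a unit direction through a virtual image
    # plane sitting at z = -1 in front of a pinhole camera
    aspect = width / height
    scale = math.tan(math.radians(fov_deg) / 2)
    px = (2 * (x + 0.5) / width - 1) * aspect * scale
    py = (1 - 2 * (y + 0.5) / height) * scale
    length = math.sqrt(px * px + py * py + 1)
    return (px / length, py / length, -1 / length)
```

Everything expensive happens after this step — following each of those rays into the scene — but this is the "what does this pixel see" part.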
|
repiv posted:If someone were to build a purely raytraced engine, then yes it would be trivial to get linear scaling over an arbitrary number of GPUs. I doubt pure raytracing will be fast enough to ship in games any time soon though, the first round of DXR implementations will probably be hybrids (e.g. rasterized base pass + raytraced reflection pass) and have the usual problems with multiple GPUs. I see. Anandtech's Slides mention "Ray Tracing for Gameworks" which includes Ray Traced Area Shadows, Glossy Reflections and Ambient Occlusion. They also mention that these effects done with RTX on Volta would be "integer multiples faster" than with DXR on older hardware (I assume that means software via compute units). Sounds nice but a bit Hairworks 2.0 to me.
|
# ? Mar 19, 2018 20:43 |
|
eames posted:I see. Hairworks except it makes everything you look at much nicer and allows for real reflections in water and surfaces that aren't just blurry rendering tricks. We are probably years and years away from it being fully utilized in game engines though.
|
# ? Mar 19, 2018 20:50 |
|
Eletriarnation posted:Eh, I'm probably weird but I usually play my games on a 4k/60Hz screen and I just have a 1060. For newer games I often step down to 1080p, but emulators and anything indie/older than a year or two work great. As a year-end treat for myself, I picked up an ASUS ROG PG279, and it is loving lovely. For me, it stems from a couple of aspects. Previously I could un-cap the frame-rate but then have tearing, and tearing is nails-down-a-chalkboard noticeable and intolerable to me. Or I could lock to 60Hz (on my LG 27EA63V-P), but then have some juddering or pacing issues unless I ratcheted settings down sufficiently. With the hardware sync, I get no tearing, and no need to worry about ensuring absolute consistency of framerate. I get to crank everything the hell up, squeeze everything I can from my card (the 980ti seems to do alright at 1440p), and it's good times. To be sure, it was a meaty premium over a 27-inch, responsive monitor *without* G-Sync, but I found it very worthwhile, and pretty much the moment I fired the first game up on it, any fear of buyer's remorse entirely dissipated. Hell, I'd get a second one (I have a dual-screen set-up), but my other monitor is fine for media and consoles (PS4, Switch, etc, through a HDMI switcher).
|
# ? Mar 19, 2018 20:52 |
|
Kazinsal posted:Yeah turns out just because Logitech is paying for your Gamer House $35k/year ain't poo poo for living in Los Angeles or wherever. I'm sure minimum slave wage with practically zero employable experience/skillset gains won't stop the next generation of BWM kids embarking on living their childhood dream of making money through playing viddy gaems.
|
# ? Mar 19, 2018 20:53 |
|
Honestly that first DXR example made by EA looked stunning, but I think it is way too early in the tech to say that it is obsoleting current gen hardware with the next gen. Considering the current DX12 adoption rates alone, I would not expect developers to be utilizing DXR for years yet (unless you're that one indie dev who wants to make the next Myst and has all the time in the world to focus on A. puzzles and B. experimental graphics.) Microsoft also specifically stated that the API leaves pretty much all of the interpretation of the feature up to the hardware engineers, so even if next gen cards "support" DXR hardware acceleration you can bet that there will continue to be multiple iterations for years to come.
|
# ? Mar 19, 2018 21:02 |
|
Dadbod Apocalypse posted:For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now? Raytracing actually models millions of rays of light leaving each source and reflecting from objects. This looks realistic, but is computationally infeasible for rendering complex scenes in realtime. In the field of cinema CGI, taking 30-60 minutes per frame is not uncommon. Realtime graphics like gaming use a bag of mathematical tricks to simulate this. It looks worse, but you can do it in real-time.
|
# ? Mar 19, 2018 21:17 |
|
Zedsdeadbaby posted:I thought ray tracing was supposed to be virtually impossible to do in realtime. Mind you, I read that around 10-15 years ago. Is it just the actual existence of raw computing power that's making it possible now? Dadbod Apocalypse posted:For the olds with ancient aging bodies among us, can someone explain what ray-tracing brings to the gaming table that's different from what's going on now? The softening of the shadow here as it extends further out would be expensive to implement in current engines, for instance. In practice I think the most immediate benefit of path tracing is more accurate reflections, since current games use either a cubemap or some screen space effect, both of which can look pretty janky.
|
# ? Mar 19, 2018 21:21 |
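The soft shadows described above fall out of raytracing naturally: cast several shadow rays toward random points on the area light and use the fraction that get through as the penumbra value. A rough Python sketch, where `occluded` is a hypothetical visibility test supplied by the scene:

```python
import random

def soft_shadow(point, light_samples, occluded, n=64, seed=0):
    # cast n shadow rays toward random points on an area light; the
    # fraction that get through gives the penumbra (0 = umbra, 1 = lit)
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if not occluded(point, rng.choice(light_samples)))
    return hits / n
```

Points far from the occluder see most of the light (near 1), points behind it see none (0), and the gradient in between is the softening — no shadow-map tricks required, just more rays.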
|
Paul MaudDib posted:Raytracing actually models millions of rays of light leaving each source and reflecting from objects. This looks realistic, but is computationally infeasible for rendering complex scenes in realtime. In the field of cinema CGI, taking 30-60 minutes per frame is not uncommon. Well, except it works in reverse, doesn't it? It doesn't trace every ray, it traces from each pixel and says "What object will I hit and how does the light interact with it?" Basically, you go backwards and start with where the light ends up and only bother figuring out what happens to those light rays, not the ones that don't end up in the "eye".
|
# ? Mar 19, 2018 21:24 |
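That backwards step — fire a ray from the eye and ask what it hits first — reduces to ray/primitive intersection tests. A minimal ray-sphere version in Python (assumes the ray direction is unit length):

```python
import math

def hit_sphere(origin, direction, center, radius):
    # solve |o + t*d - c|^2 = r^2 for the nearest t > 0, or return None;
    # direction is assumed unit length, so the quadratic's a term is 1
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None
```

A real tracer runs tests like this against an acceleration structure over millions of triangles, which is the part DXR hands off to the driver/hardware.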
|
You can do both at once: https://en.wikipedia.org/wiki/Path_tracing#Bidirectional_path_tracing
|
# ? Mar 19, 2018 21:27 |
|
https://www.youtube.com/watch?v=x19sIltR0qU This is someone who built ray tracing into Quake 2. They're running on a Titan Xp. Edit: beaten like I'm back in Quake 2's multiplayer.
|
# ? Mar 19, 2018 21:30 |
|
eBay prices are coming down. $235 for a 1060 3 GB, $460 for a 1070. Still above MSRP but that's 25% lower than they were a month ago. If the dip in Bitcoin and Eth continues, and especially if it turns out NVIDIA has been stockpiling next-gen Turing/Ampere cards, prices are going to plunge pretty hard here.
|
# ? Mar 19, 2018 23:51 |
|
Paul MaudDib posted:I seem to remember reading something about how you could do a very sparse sample of raytracing and then train a deep-learning network that would transform a raster image into a realistic-looking raytraced image, but I haven't been able to find it since I read it. SwissArmyDruid posted:Rest in peace, raytracing. Dead before you could ever become a thing. I'm amazed that it took less than a year-and-a-half to go from whitepapers and talks to implementation. edit: ack, not the last page. Still, so. We're back to heavy compute on GPUs again, are we? PLEASE don't resurrect the decaying corpse of GCN again, AMD. PLEASE. SwissArmyDruid fucked around with this message at 23:56 on Mar 19, 2018 |
# ? Mar 19, 2018 23:53 |
|
So with all this AI graphics technology, will that make tensor cores in next gen GPUs useful?
|
# ? Mar 20, 2018 00:26 |
|
SlayVus posted:So with all this AI graphics technology, will that make tensor cores in next gen GPUs useful? That's the theory, yes. Anandtech posted:Meanwhile NVIDIA also mentioned that they have the ability to leverage Volta's tensor cores in an indirect manner, accelerating ray tracing by doing AI denoising, where a neural network could be trained to reconstruct an image using fewer rays, a technology the company has shown off in the past at GTC.
|
# ? Mar 20, 2018 00:31 |
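To get a feel for what denoising buys (the real thing is a trained neural network, not a filter): the renderer produces noisy, few-sample pixel estimates, and the denoiser reconstructs a smooth image from them. A toy 1D box-filter stand-in in Python:

```python
def box_denoise(noisy, radius=2):
    # toy stand-in for the learned denoiser: average each sample with
    # its neighbours, trading residual noise for a little blur
    n = len(noisy)
    out = []
    for i in range(n):
        window = noisy[max(0, i - radius):min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out
```

The appeal of the learned version is that it can suppress noise without the blur, because it has seen what converged renders are supposed to look like.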
|
EVGA just had some B-stock GTX Titan 6GBs flash through for $400, guess they really are cleaning out the warehouse. Better hurry up, if prices keep declining then in another few weeks we'll go back to nobody really caring about Kepler anymore.
|
# ? Mar 20, 2018 00:44 |
|
I feel like raytracing could be extremely useful for VR content, and it could be done in real-time on games like Superhot which have Quake 2 level graphics by default. I've viewed static raytraced images on Gear VR and it really sells the realism when the lighting is so intricate.
|
# ? Mar 20, 2018 01:00 |
|
Zero VGS posted:I feel like raytracing could be extremely useful for VR content, and it could be done in real-time on games like Superhot which have Quake 2 level graphics by default. It's a natural fit for VR because instead of rendering a normal rectangular image then applying VR distortion correction afterwards, which causes an uneven distribution of detail, a raytracer can just render the distorted image directly. Much cleaner. The hard part is hitting 90fps but I suppose you could dynamically adjust the sample count to hit that mark, increasing noise instead of dropping frames. repiv fucked around with this message at 01:29 on Mar 20, 2018 |
# ? Mar 20, 2018 01:13 |
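The "render the distorted image directly" idea amounts to bending each primary ray by the lens distortion before tracing it, instead of warping a finished rectangular frame afterwards. A Python sketch, with `k1` as a made-up barrel-distortion coefficient:

```python
import math

def distorted_ray_dir(x, y, width, height, k1=0.22):
    # generate the primary ray through the lens-distorted pixel position,
    # so no separate warp pass is needed (k1 is a made-up coefficient)
    nx = 2 * (x + 0.5) / width - 1
    ny = 2 * (y + 0.5) / height - 1
    r2 = nx * nx + ny * ny
    nx, ny = nx * (1 + k1 * r2), ny * (1 + k1 * r2)  # barrel distortion
    length = math.sqrt(nx * nx + ny * ny + 1)
    return (nx / length, ny / length, -1 / length)
```

A rasterizer can't do this because it only knows how to draw onto a flat rectangle; a raytracer doesn't care where the rays point.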
|
Perhaps a long shot, but is there anyone working at NVIDIA who’d entertain questions from someone with an offer? Glassdoor and such are helpful, but don’t always tell the whole story so I’m interested in soliciting as many opinions as I can. I have PMs if so.
|
# ? Mar 20, 2018 01:49 |
|
If you play enough games using the tensor cores for ray tracing, eventually your 1180ti will be smart enough to mine when you're not home to buy itself a body.
|
# ? Mar 20, 2018 03:30 |
|
Lockback posted:If you play enough games using the tensor cores for ray tracing eventually your 1180ti will be smart enough to mine when you're not home to buy itself a body. a body, or a body with double guac?
|
# ? Mar 20, 2018 03:40 |
|
Lockback posted:If you play enough games using the tensor cores for ray tracing eventually your 1180ti will be smart enough to mine when you're not home to buy itself a body. Or it invents a sim-world where Nvidia GPUs with Tensor cores exist
|
# ? Mar 20, 2018 03:44 |
|
Lockback posted:If you play enough games using the tensor cores for ray tracing eventually your 1180ti will be smart enough to mine when you're not home to buy itself a body. My GPU left me and is dating a barista HELP
|
# ? Mar 20, 2018 03:46 |
|
Since everyone else shared a hardware path-tracing denoising link, I'll share Pixar's old one: http://graphics.pixar.com/library/MLDenoisingB/paper.pdf
|
# ? Mar 20, 2018 04:05 |
|
So with all the GPU stock being cleared out, does that mean a new product is incoming? Or is the price just coming down while they still have no competition or pressure to release anything new for gamers till the end of the year?
|
# ? Mar 20, 2018 06:16 |
|
The past 8 pages or so have been "nobody loving knows".
|
# ? Mar 20, 2018 06:20 |
|
Nvidia is (still) set to announce their lineup around March 26. We literally don't know anything else.
ufarn fucked around with this message at 11:04 on Mar 20, 2018 |
# ? Mar 20, 2018 06:30 |
|
nVidia making such a big deal recently about raytracing makes me think back to the console "bit wars" of the 90s.
|
# ? Mar 20, 2018 07:48 |
|
AMD fires back with the return of Blast Processing
|
# ? Mar 20, 2018 08:27 |