art3mis Posted August 18, 2017
Thought this research by Nvidia was quite interesting: https://blogs.nvidia.com/blog/2017/05/10/ai-for-ray-tracing/
Guest tar Posted August 19, 2017
I call bogus on this: it's a GPU denoiser. There are no denoisers that take minutes per frame:

Quote:
"Existing algorithms for high-quality denoising consume seconds to minutes per frame, which makes them impractical for interactive applications."
acey195 Posted August 20, 2017
5 hours ago, marty said:
"I call bogus on this: it's a GPU denoiser. There are no denoisers that take minutes per frame."

Well, it depends on what resolution you have and what amount of data loss (blurring) you allow. I think predicting the location of noise and having data sets for image completion (to replace the noise with) is certainly not a bad idea... There was also a SIGGRAPH demo of real-time fluids based on a similar principle.
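To make that predict-then-complete idea concrete, here is a minimal sketch. This is not Nvidia's actual method: per-pixel sample variance stands in for the learned noise prediction, a local median stands in for learned image completion, and the function name and threshold are made up for illustration.

```python
import numpy as np

def denoise_by_completion(frame, samples, var_threshold=0.05, radius=1):
    """Toy predict-then-complete denoiser.

    frame:   (H, W) beauty image (mean of the per-pixel samples)
    samples: (N, H, W) individual per-pixel samples from the renderer

    A learned model would predict the noisy region and the replacement
    content; here sample variance stands in for the prediction and a
    local median stands in for image completion.
    """
    # "Predict" where the noise is: pixels whose samples disagree a lot.
    noisy = samples.var(axis=0) > var_threshold

    # "Complete" those pixels from their (hopefully cleaner) neighbourhood.
    out = frame.copy()
    h, w = frame.shape
    for y, x in zip(*np.nonzero(noisy)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        out[y, x] = np.median(frame[y0:y1, x0:x1])
    return out

# Tiny smoke test with synthetic data: one deliberately noisy pixel.
rng = np.random.default_rng(0)
samples = rng.normal(0.5, 0.01, size=(25, 64, 64))
samples[:, 10, 10] = rng.normal(0.5, 1.0, size=25)
frame = samples.mean(axis=0)
print(denoise_by_completion(frame, samples)[10, 10])
```

A trained network would replace both heuristics with learned predictions, which is essentially what the Nvidia post describes.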
Guest tar Posted August 20, 2017 (edited)
The Redshift devs looked at it and say that it helps for the last 5%, maybe. It's just not very good at the moment.

Adding that the Neat denoiser works at approximately 4 fps for 2K images, IIRC (1080 Ti, 12 cores @ 3.33 GHz), so a render at pixel samples of 5x5 would be 25 times slower to denoise, which is 6.25 secs per frame.

Edited August 20, 2017 by tar
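Spelling out that arithmetic as a quick sketch (the 4 fps figure and the 5x5 pixel samples are from the post above; the assumption that denoise cost scales linearly with sample count is tar's reasoning, not a measured result):

```python
# Back-of-the-envelope denoise cost, using the figures quoted above.
denoiser_fps = 4.0                       # Neat denoiser throughput on 2K images (from the post)
seconds_per_image = 1.0 / denoiser_fps   # 0.25 s per 2K image

pixel_samples = 5 * 5                    # 5x5 pixel samples = 25 samples per pixel
# Assumed (per tar's reasoning): denoise cost scales with sample count.
denoise_seconds = seconds_per_image * pixel_samples

print(f"{denoise_seconds:.2f} s per frame")  # -> 6.25 s per frame
```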