meeotch Posted May 7, 2022 (edited May 7, 2022)

Here's a scene with some pieces that are the result of a voronoi shatter. I've got a principled shader on them, with noise displacement turned on. If you render 10 frames of the "close" ROP, you'll see geometry flickering at the edges of the pieces. Weirder still, if you do a single-frame render (in Render View, or to mplay), then save the frame and render it again, you'll see that the geo artifacts are *not deterministic*. The same frame renders differently each time.

Now dive into the RENDER_test node, and instead render it with one of the pieces blasted. Suddenly, the frame renders pretty much the same each time. It doesn't seem to matter which piece you delete. Now go back to the original file node with all the pieces, and render it with coving disabled on the Geometry tab of the RENDER_test object. Again, the frame renders the same way each time. So it appears that coving is acting nondeterministically, depending on how much geo is in the object - WTF?

Sure - I expect a little geo weirdness when displacing geo with sharp edges. But I at least expect each frame to render the same every time - and I'm pretty sure that's where the flickering is coming from. If you look at it closely over many frames, it appears that it's happening in certain spots, always on the edges, and that it's flipping back and forth between two different coving solutions at each spot - though each spot flips independently of the others.

I've tried everything to eliminate the flickering: all the various dicing parameters, ray predicing, shading quality, flatness, re-dicing, uniform measuring, sample coving, sample lock. I've also tried pre-dividing the pieces to get more polys in the geo. No love. Any bright ideas?

(The actual shot that this comes from involves a large number of these pieces that move slowly and then come to a stop. Even when they are completely static, and the camera is static, they flicker from frame to frame.)

Attached: testgeo_v2.bgeo.sc, MRS_001_0160_fx_test_v060.hip
Keshaw singh Posted May 23, 2022

Make another attribute for the high-curvature areas and blend it with the displacement - that works well. Something like the sketch below.
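A rough point-wrangle sketch of the idea - the "curvature" attribute is assumed to exist already (e.g. from a Measure SOP), and "dispscale" is just a made-up name for the shader to multiply into its displacement amount:

    // Fade the displacement where curvature is high, i.e. near the
    // sharp fragment edges. Assumes a float "curvature" point
    // attribute; "dispscale" is a hypothetical attribute the shader
    // would pick up and multiply into the displacement amplitude.
    float maxcurve = chf("max_curvature");  // curvature where fade reaches zero
    f@dispscale = 1.0 - smooth(0.0, maxcurve, f@curvature);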
meeotch Posted May 24, 2022 (Author)

Thanks for the reply - unfortunately, I don't think that solution will give visually similar results. If you look at the geo (the output of a voronoi shatter), you can see that blending off the displacement toward the edges will result in really simple, straight-line silhouettes.

But I did hear back from SideFX with a good explanation of why the geo breaks in the first place, and why the renders are non-deterministic. Short version: it's a combination of the displacement being along the normal, the normals being discontinuous at the edges, and multithreading. The fix is either to define a continuous "displacement normal" to use, or just use 3D noise for the displacement instead of 1D noise.

----------------

From SideFX:

The issue is that all the fragments have shared points, but they have different vertex normals for each of the faces that share the points (due to cusping). Because of threading, the order that faces get diced is indeterminate (one face might get diced first on one run, but later on a second run). Mantra assumes that the displacement shader will displace the same point in the same direction on every run. But since the shared points have different vertex normals, the direction to move the point (along the normal) is determined by which face dices the polygon first.

For example, if you had a cube and you wanted to displace one of the corners along the normal, what direction would you expect the point to be moved? There's no really good answer to that. You might say that it should move along the average of all the normals - and that's what mantra would do if you had point normals.

There are a few solutions to this problem:

1) Use 3D noise displacement (create a noise vector based on the shading point P) and don't displace along the normal.

2) Create a per-point attribute that represents the shared normal and displace along that normal instead.

Both of these would involve diving into the displacement VOP net. The second would also involve setting up another attribute on the geometry. You could also change the geometry to have point normals instead of vertex normals, though that would change the look quite drastically. Using bump mapping instead of displacement would probably work, but there still might be subtle issues because of displacing along the vertex normals.
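For anyone who hits this later, here's a minimal VEX sketch of fix 1 - not the exact VOP net from my scene, and the amp/freq parameters are placeholders. The point is that the offset is a pure function of position, so every fragment sharing a point moves it the same way no matter which face gets diced first:

    // Fix 1 sketch: displace with 3D vector noise instead of along N.
    displacement
    noise3d_displace(float amp = 0.05; float freq = 4.0)
    {
        // Evaluate the noise in object space so it sticks to the geo
        // instead of swimming with the camera.
        vector pos = ptransform("space:current", "space:object", P);
        // Vector-valued noise: the offset depends only on position,
        // so shared points displace identically on every run.
        vector offset = vector(noise(pos * freq)) - {0.5, 0.5, 0.5};
        P += vtransform("space:object", "space:current", offset) * amp;
        N = computenormal(P);
    }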
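And a sketch of fix 2, in two parts. First, a point wrangle in SOPs that averages the cusped vertex normals into a single per-point attribute (I'm calling it "dispN" - any name works as long as the shader binds the same one):

    // Average the vertex normals sharing each point into one
    // continuous per-point displacement normal.
    int verts[] = pointvertices(0, @ptnum);
    vector sum = {0, 0, 0};
    foreach (int vtx; verts)
        sum += vector(vertex(0, "N", vtx));
    v@dispN = normalize(sum);

Then the displacement shader displaces along that attribute instead of N - mantra binds a geometry attribute to a shader parameter of the same name, so "dispN" comes through automatically:

    // Fix 2 sketch: 1D noise along a continuous per-point normal.
    displacement
    shared_normal_displace(vector dispN = {0, 0, 1};
                           float amp = 0.05; float freq = 4.0)
    {
        vector pos = ptransform("space:current", "space:object", P);
        float  ht  = float(noise(pos * freq)) - 0.5;
        // dispN is a point attribute, so it is identical for every
        // fragment sharing the point; bring it from object space into
        // the current (shading) space before displacing.
        vector dir = normalize(vtransform("space:object", "space:current", dispN));
        P += dir * ht * amp;
        N = computenormal(P);
    }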