edward Posted March 2, 2014

> As I mentioned, I'm used to column-major format.

Just to be pedantic, "column-major" refers to a storage layout format for matrices. What we're really talking about here is a "row vector" vs. "column vector" convention.
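To make the distinction concrete, here is a minimal VEX-style sketch (the matrix is just a stand-in, not anything from the thread's files). Houdini/VEX uses the row-vector convention, so the point multiplies on the left; a column-vector convention uses the transposed matrix for the same transform, while column-major vs. row-major only describes how the floats sit in memory and doesn't change the math at all.

```
// Row-vector convention (what VEX uses): the point multiplies on the left.
vector p = set(1.0, 2.0, 3.0);
matrix M = 1;                    // identity, standing in for some transform

vector p_row = p * M;            // p' = p * M

// A column-vector convention writes the same transform as p' = M * p.
// Expressed with VEX's row-vector multiply, that is the transposed matrix:
vector p_col = p * transpose(M); // p' = M^T * p, written row-vector style
```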
magneto Posted March 2, 2014

After spending a day working on your second file, I have a few more questions:

1. Can you use 1D noise instead of 3D?
2. 1D noise should get rid of the need to create orient vectors, which simplifies the process. I got it working in SOPs but not in the shader. Not sure if it's reasonable to do this.
3. Other than this, I don't know if there is a way to proceduralize the masking process. If we could specify primitives, it would be easy, but since points are shared, the selection also includes some outer points (outside the group).

Thanks again, it's a pretty good example.
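For reference, the 1D-noise idea in question 2 might look roughly like this as a point wrangle (a guess at the kind of setup being described, not the actual file; the "inside" group and channel names are assumptions): a single scalar noise value just scales the point normal, so no separate orient vectors are needed.

```
// Only displace points in the (hypothetical) "inside" point group.
if (inpointgroup(0, "inside", @ptnum))
{
    // One float noise value scales the existing normal; with 3D noise you
    // would instead need an orientation to turn the vector into a
    // meaningful offset direction.
    float n = noise(v@P * chf("frequency"));
    v@P += v@N * (n - 0.5) * chf("amplitude");
}
```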
Netvudu Posted March 2, 2014 (edited)

I fiddled with the masking process myself for a while; in fact I had already masked with the first file, before Rayman's second example (it was actually as simple as adding a new custom attribute and multiplying by it at shader time). But I quickly realized that this is something you want to do non-procedurally, because I foresee many, many different situations where a procedural method isn't going to cut it. In a best-case scenario it should be semi-procedural, allowing the user some manual input... IMHO, of course.
magneto Posted March 2, 2014

Thanks Netvudu. How did you get the points that use 1 for the displacement? My only issue was that when using all the inside points, they also "bleed" into the outside points, so you get displacement on the outside as well. I think allowing people to specify these points is reasonable, but it would also be great to have a good default. The inside points actually cover this nicely, except I don't know how to stop the outside points from getting affected. If you could use only the inside primitives it wouldn't be a problem, but I don't know how you could use that in the shader. Is there a way to read primitive attributes in a shader? That would be great.
rayman Posted March 4, 2014

Hello. This is the next example of displacement with UVs. Unfortunately I couldn't find a better way of doing it, so I'm showing you my old setup with rest position. If I remember correctly, displacement is done before shading, so when I use P for displacing the geometry and then to move the UVs, it produces different results. Do you know if there is a way to store some variables between the displace and shading stages? In this case what I want is to generate noise vectors, store them as an attribute and then read that attribute to move the UVs. I hope this makes sense.

P.S. I'm sorry, but this time I really don't have the time to answer you guys; hopefully next time I will. Cheers

SHOP_edgedisp_v08B.hipnc
rayman Posted March 7, 2014

> Why is it that when I plug in something different than a box shape (like a sphere or cylinder) it kind of doesn't work properly? Displacement occurs on outside primitives and it all looks kind of not right. Is there something that can be done about it? That would be really great, because your technique works beautifully.

Yes, you have to do several things manually. If you look at the example files, they are separated into two columns. One side is the auto-setup, the other is the manual one, where you have to change a few things to make it work the way you want. The most important thing is to orient the matrices to match your geometry's sides. The way I do it in the examples is to use a plane with UVs aligned to the pieces, then transfer vectors from the plane to the pieces to describe the orientations. This whole setup won't work well with spheres, as the whole idea is to create one- or two-directional displacement. It should work with three-directional displacement as well, but you can do that with a much simpler method.

> After spending a day working on your second file, I have a few more questions: 1. Can you use 1D noise instead of 3D? 2. 1D noise should get rid of the need to create orient vectors, which simplifies the process. I got it working in SOPs but not in the shader. 3. Other than this, I don't know if there is a way to proceduralize the masking process.

1 and 2: I've never tried this. Can you upload some hips? It sounds interesting, and it's probably a better solution if it makes things simpler.

> I quickly realized that this is something you want to do non-procedurally... On a best-case scenario it should be semi-procedural, allowing the user some manual input.

Yes, I also think a semi-procedural solution is the best approach.

--------------------------------

About the UVs - I hope somebody will help me with this so we can find a better solution. Currently it works only with rectangular UVs, and we have to know the ratio between world units and UVs. If the UVs are 10x10 units in world space, then you have to multiply the UV offset by 0.1 in the shader so the UVs match the displacement offset. Not the best solution at all. What's more, you (currently) have to know the UV orientation relative to the rotated offset matrices, so the X, Y and Z offset vectors have to match the UVW directions in world space D: ! This, I think, can be done procedurally very easily, as there is a way to get UV vectors in world space inside the shader, so at least this one is not a problem (in theory). In the next example I'll try to implement it. I'm also searching for another solution using point clouds. Not sure if it is possible, but I'll tell you about that one later. Just have to do more tests...
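For anyone following the UV part, here is a minimal wrangle-style sketch of the ratio Rayman describes (the 10x10 size, the attribute names, and the assumptions that uv lives on points and that the offset is already aligned with the UVW axes are all placeholders, not his actual file):

```
// Hypothetical numbers: the 0-1 UV square covers 10 x 10 world units.
float uv_world_size = 10.0;

// The same world-space vector used to displace P, assumed here to already
// be aligned with the UVW axes (Rayman's caveat above).
vector world_offset = set(0.25, 0.0, 0.1);

// Scale the world-space offset into UV space before shifting the UVs,
// i.e. multiply by 1/10 = 0.1 for a 10-unit UV square.
vector uv_offset = world_offset / uv_world_size;
v@uv += uv_offset;
```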
magneto Posted March 7, 2014

Thanks rayman, I will upload my experiments based on your second file on the weekend. Also, I was looking at your last file - can you please tell me what you changed? Is it only the red nodes in the shader? If it has better techniques I would love to use that as the base instead, but I've already modified the second one heavily, which is why I was wondering exactly what changed.
rayman Posted March 7, 2014

I don't think I changed it at all. I just added some UVs to the pieces and removed unused nodes (connectevity_id). The red nodes inside the shader are the ones that I want to remove and make procedural. The current setup is reaaaaally ugly. But it works (:
magneto Posted March 15, 2014 (edited)

Sorry for the late reply. I've attached my changes, but there are some problems. I did the displacement in SOPs because it's easier to see there; once it works correctly in SOPs, I can think about doing it in the shader.

1. I used 1D noise, but right now the inside pieces are separate meshes. Is this intentional? Because when I displace them, it creates cracks in the geometry.
2. I also had to save the original normals from the TimeShift, because I couldn't restore the original normals. Should I multiply the transformed piece's normal by transpose(invert(matrix))? Because it doesn't work.

So basically this is what I wanted to do: only displace the inside points, using the first frame as reference. Thanks

SHOP_edgedisp_v07_simple2.hipnc
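On question 2, the general rule (not specific to this file, and the attribute names below are made up) is that normals transform by the inverse transpose of the matrix that transforms positions; for a rigid transform with no shear or non-uniform scale that inverse transpose equals the matrix itself, which is why "just multiply by the matrix" (Rayman's advice in the next reply) also works for fracture pieces. A point-wrangle sketch:

```
// Hypothetical bindings: 3@xform is each piece's rotation from the rest
// frame to the current frame, v@rest_N is the normal saved at the rest frame.
matrix3 xform  = 3@xform;
vector  rest_N = v@rest_N;

// General rule for normals (row-vector convention, as VEX uses):
vector N_general = normalize(rest_N * transpose(invert(xform)));

// For a pure rotation the inverse transpose equals the matrix itself,
// so for rigid fracture pieces this is equivalent and cheaper:
vector N_rigid = normalize(rest_N * xform);

v@N = N_rigid;
```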
cradders Posted March 19, 2014

Hi guys, I've been following this post with great interest. I've been playing around with the files myself, although I must state I'm very much a beginner and I may not be grasping some of the finer details.

I have some questions regarding the displacement mask that is created. From what I see, the whole model is displaced, then a 'displacement mask' is used to pull back or mask (or something) the outside edges to their original un-displaced shape, right? This is done by selecting points that are on the inside. In this example, it's achieved by squashing a bounding box to just smaller than the original shape and using that as a container to select points. Three questions about this really:

1) What exactly is the criteria for this point selection? It's not exactly 'internal points only', because many of the points seem to actually be on outside faces anyway.

2) Wouldn't it make more sense to select areas for displacement by faces/primitives rather than points? Why can't you use the 'inside' group that is created by the original Voronoi fracture? I've tried this and it almost gives me the results I want - only internal faces are displaced, but they split apart from the untouched external faces at the boundary. Is there a way to fix this?

3) BIG QUESTION: How can you get this workflow to work for an initial shape that isn't a box? The 'rescaling of the bounding box' method seems limited to shapes that were boxes to begin with. Obviously a bounding box for any other shape won't reliably group internal points.

Any help, advice or example files would be greatly appreciated!
pezetko Posted March 30, 2014

I didn't read the rest of the discussion, just want to let you know that the bug that required the matrixToFloat and floatToMatrix conversion in the shader was fixed in 13.0.357 (I checked it in 13.0.362 and the matrix now gives the expected values directly). So there is no need for that hack anymore.
magneto Posted March 31, 2014

Thanks pezetko, do you know what it was about? It was a strange bug, but I am glad someone at SESI fixed it.
rayman Posted March 31, 2014

> 1. I used 1D noise but right now the inside pieces are separate meshes. Is this intentional? Because when I displace these, it creates cracks in the geometry. 2. I also had to save the original normals from TimeShift, because I couldn't restore the original normals. Should I multiply the transformed piece's normal with transpose(invert(matrix))?

Hello!

1) Yes, you can use 1D noise with point normals for direction, but there is one problem that I don't think can be solved (easily). If the inside pieces are separated from the rest of the geo, the inside point normals will be fine (most of them), BUT the edges on the sides won't be merged, so the outside geometry won't match the inside. If the inside and outside pieces are one mesh, they will stay together, BUT then the vertices on the edges won't be displaced in the right direction (their normals get averaged across all the faces sharing the point). So you still have to prepare the geometry normals before displacement.

What you can try is to use one mesh, loop through all points, assign them a (0,0,0) normal, get their adjacent faces (prims), and if a face is in the inside group, add its normal to the point normal. Then normalize all the normals. This will give you relatively good normals for displacement. BUT the normals on the edges still won't be good enough (if the angle between the inside and outside polygons is not 90 degrees, those points will displace inward or outward). So you have to fix this as well: for each point shared between inside and outside faces, get the averaged inside and outside normals separately, and project the inside vector onto the imaginary plane described by the outside normal. This will work fine for points on the sides of the pieces, BUT again not for all of them (points on the corners). So there is a lot more math involved, and in the end it will be more complicated than the "simple" matrix solution.

And the real problem is that even if you make it work in SOPs, this method won't work in SHOPs. When you run displacement or micropolygon rendering, the whole mesh is tessellated, so when you move only the inside points they will look fine, but the edge of the outside piece won't match them even if they are one mesh. So, again - I'm not sure this method is possible at all.

2) I think that vectors (at least normals) are transformed automatically, but I may be wrong. If you get them from the first frame, then you have to multiply them by just the matrix (not the inverted one).
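For reference, the first part of the recipe in 1) might look roughly like this as a point wrangle (a sketch only; the group name "inside" is an assumption, and the boundary and corner fix-ups Rayman describes are not handled here):

```
// Average the normals of only the adjacent faces that are in the inside group.
vector n = {0, 0, 0};

int prims[] = pointprims(0, @ptnum);
foreach (int pr; prims)
{
    if (inprimgroup(0, "inside", pr))
        n += prim_normal(0, pr, 0.5, 0.5);
}

if (length(n) > 0)
    v@N = normalize(n);
else
    v@N = {0, 0, 0};   // no inside face touches this point: leave it undisplaced

// The boundary fix-up Rayman mentions would then project this onto the plane
// of the averaged outside normal, roughly: n_in -= n_out * dot(n_in, n_out);
```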
rayman Posted March 31, 2014

> 1) What exactly is the criteria for this point selection? 2) Wouldn't it make more sense to select areas for displacement by faces/primitives rather than points? Why can't you use the 'inside' group that is created by the original Voronoi fracture? 3) BIG QUESTION: How can you get this workflow to work for an initial shape that isn't a box?

1) Really no criteria in this case - it's something I created in a few minutes just to illustrate how to use masks to isolate displacement regions. I have to say I did the whole thing on my old laptop (Windows 8 / Vista display drivers) and was limited to an old H13 build, where the Bound node works differently from newer ones and the whole thing crashes every 5 minutes (literally). I wanted to create a better example but really didn't have the chance, so I'm leaving this to you.

2) I don't use groups because there may be cases where the geometry comes from another piece of software and the prims are not separated, so I prefer the manual solution. You already said the reason why I currently don't use them - the mesh gets separated - but I think they may be helpful. I'll probably give them a chance (:

3) Yes, this won't work on different shapes. But you can at least paint them (: I have some ideas, and if they work I'll post them.
cradders Posted April 2, 2014 (edited)

Thanks for the info Rayman, very helpful! So, one more question. I'd like to use this displacement alongside a standard Mantra shader that contains all my other material info (color etc.). I've experimented with copying the nodes in, but without much success. How would you implement this custom displacement into a standard Mantra shader? Cheers!
fxrod Posted April 3, 2014

All these techniques seem useful for rectangular shapes. I'm wondering, though, if anyone has found a reliable solution for curved surfaces. Jeff W's solution works well because displacing points along the tangents of a flat surface ensures that the points stay (slide) on the surface. Try using his technique on a teapot and you easily see the problem where points are pushed off the surface. I also don't know if a displacement trick will work here: you'll still get interpenetrating micropolygons if the amplitude of the noise is high enough between adjacent chunks. Seems like SESI should create a full-on solution for this time-honored problem.
rayman Posted April 3, 2014

Yes, this technique won't work on curved meshes. You can use a different method to handle them: if you have boundary geometry for the shattered pieces, it is possible to create a 3D displacement and intersect it with the bound. I created this test with a simple shattered sphere and it seems to work fine. It's done with a single Wrangle SOP, so it should be easy to convert to SHOPs. I'll create some more tests and share the results soon.
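A rough guess at what such a wrangle could look like (this is not Rayman's actual file; the parameter names are made up, and the original unshattered boundary surface is assumed to be wired into the second input): displace along N with 3D noise, then ray the offset against the boundary and clamp to the hit so pieces never poke through the outer surface.

```
// Noise-driven offset along the point normal.
float amp  = chf("amplitude");
float freq = chf("frequency");

float n = noise(v@P * freq);
vector offset = v@N * amp * (n - 0.5);

// Intersect the displacement segment with the boundary geometry (input 2).
// If the segment crosses the boundary, clamp the point to the hit position.
vector hit_pos;
vector hit_uvw;
int hit_prim = intersect(1, v@P, offset, hit_pos, hit_uvw);

if (hit_prim >= 0)
    v@P = hit_pos;
else
    v@P += offset;
```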
rayman Posted April 3, 2014

> I'd like to use this displacement alongside a standard Mantra shader that contains all my other material info (color etc.). I've experimented with copying the nodes in, but without much success. How would you implement this custom displacement into a standard Mantra shader?

This sounds strange. It should work if you replace the displacement part with the custom one. I'll check this later.
gosch Posted April 3, 2014

I've also played around with this task a little (thanks to rayman for the great examples). What I've found useful is to capture an orthogonal basis from the subdivided geometry into a point cloud, and then perform a multi-step displacement, updating the direction on every iteration. This softens the stair-step look of the geometry. Not perfect, but a much better result.

Some of my experiments: https://vimeo.com/90639775
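A very loose sketch of how that iterative idea could look in a point wrangle (a guess at the approach, not Igor's actual setup; the attribute name "basis_n", the channel names, and wiring the basis point cloud into the second input are all assumptions): each step looks up a locally captured direction from the point cloud, takes a small displacement step, and re-queries as the point moves, which is what smooths the stair-stepping.

```
int   steps  = max(chi("steps"), 1);
float amp    = chf("amplitude") / steps;   // split total amplitude over the steps
float radius = chf("search_radius");
float freq   = chf("frequency");

vector pos = v@P;
for (int i = 0; i < steps; i++)
{
    // Look up the captured basis nearest to the current (already moved) position.
    int pc = pcopen(1, "P", pos, radius, 8);
    if (pcnumfound(pc) == 0)
        break;

    // "basis_n" stands in for one captured basis vector (e.g. the surface
    // normal); the tangents would be filtered the same way.
    vector dir = pcfilter(pc, "basis_n");
    pcclose(pc);

    float n = noise(pos * freq);
    pos += normalize(dir) * amp * (n - 0.5);
}
v@P = pos;
```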
rayman Posted April 3, 2014

Wow, Igor, this looks amazing! A really clever solution!

This is my attempt at doing the intersection inside SHOPs.