Popular Content

Showing most liked content on 06/07/2022 in all areas

  1. 2 points
    Hi, the issue is that you are transferring uv to your packed geo. That "promotes" uv to a single value for the entire packed fragment. Then, when you unpack, that single value is transferred back onto the entire fragment, messing up your UVs. You do not need to use "Transfer Attributes" when you pack your geometry: it transfers attributes onto the "shell", but the contents keep their original data regardless. If you isolate a single piece for debugging and look at the vertices in the spreadsheet, you can see that the first UV value is transferred onto your packed geo; then, when unpacking, that value is copied onto every vertex of that packed fragment. Just remove the * from the "Transfer Attributes" parameter, and it'll work.
  2. 1 point
    I came up with another solution using pcopen in a solver. It's a bit unpredictable, but I found the result pretty and interesting. A hip file is attached if anyone is interested.
  3. 1 point
    I ended up solving the issue by first using the Orient Along Curve SOP (an amazingly handy SOP that can do so much more than just this!) to build the orient attributes before the hair curve goes into the Vellum network, then point deforming the low-res hair curve back onto the high-res one.
  4. 1 point
    Impact records help when you need to pick up collision data from the Bullet collision geometry. If not, you can do separate collision detection using a POP Collision Detect node instead.
  5. 1 point
    I haven't tested it with hair yet, but under Image Output/Filters you can enable Denoising. I'm not sure what effect denoising has on thin hairs. Also try another HDRI image if you're using one; some of them tend to produce noise. It's not unheard of to blur the HDRI map to obtain smoother results.
  6. 1 point
    Look at this: https://www.sidefx.com/tutorials/pattern-selection/
  7. 1 point
  8. 1 point
    Ok! First - the most important part of the method. Check this diagram and the attached file - they are the core algorithm I came up with.
    1. Let's say we have a simple 2D point cloud. What we want is to add some points between them.
    2. We can just scatter some random points (yellow). The tricky part is to isolate only the ones that lie between the original points and remove the rest.
    3. Now we focus on just one of the scattered points and check whether it is valid to stay. We open a point cloud lookup with a certain radius (green border) and isolate only a tiny part of the original points.
    4. What we want now is to find the center of the isolated point cloud (blue dot) and create a vector from our point to that center (purple vector).
    5. Next, we go through all points of the isolated cloud, create a vector from the yellow point to each of them (dark red), take the dot product between the [normalized] center vector (purple) and each of them, and keep only the smallest dot product. Why the smallest - that's the trick. If our point is outside the point cloud, all the neighbor vectors tend to point roughly the same way as the center vector, so even the minimum dot product stays above zero. On the border the minimum is close to 0, and inside the cloud it drops below zero. So we are isolating the dot product corresponding to the brightest red vector.
    6. In this case the minimum dot product is above 0, so we should delete our point. Then we move on to another point and do the same check.
    That's basically all you need. I know - probably not the most accurate solution, but still a good approximation. Check the attachment for a simpler example. In the original example this is done using the pointCloudDot function.
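    The steps above can be sketched outside Houdini as well. This is a minimal Python approximation of the minimum-dot-product check (in the hip file it is done in VEX with point cloud functions); the names `keep_point`, `ring`, and the radius values are mine, not from the original file:

    ```python
    # Sketch of the min-dot-product inside/outside test, assuming a 2D cloud.
    import math

    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1])

    def norm(v):
        l = math.hypot(v[0], v[1])
        return (v[0] / l, v[1] / l) if l > 0 else (0.0, 0.0)

    def keep_point(p, cloud, radius):
        """True if scattered point p lies inside the cloud (keep it),
        False if the minimum dot product is >= 0 (p is outside; delete)."""
        # Step 3: isolate the original points within the search radius.
        near = [q for q in cloud if math.dist(p, q) <= radius]
        if not near:
            return False  # nothing nearby, so not "between" any points
        # Step 4: vector from p to the centroid of the isolated points.
        cx = sum(q[0] for q in near) / len(near)
        cy = sum(q[1] for q in near) / len(near)
        to_center = norm(sub((cx, cy), p))
        if to_center == (0.0, 0.0):
            return True  # degenerate case: p sits exactly at the centroid
        # Step 5: dot each neighbor direction against the center vector,
        # keep the minimum. Outside the cloud every neighbor lies roughly
        # toward the centroid, so even the minimum stays above zero.
        min_dot = min(
            to_center[0] * d[0] + to_center[1] * d[1]
            for d in (norm(sub(q, p)) for q in near)
        )
        # Step 6: negative minimum means p is surrounded, so we keep it.
        return min_dot < 0.0

    # A ring of original points: a point inside the ring is kept,
    # a point well outside it is deleted.
    ring = [(math.cos(a * math.pi / 6), math.sin(a * math.pi / 6)) for a in range(12)]
    print(keep_point((0.2, 0.1), ring, 2.0))   # True  (inside the ring)
    print(keep_point((3.0, 0.0), ring, 2.5))   # False (outside the ring)
    ```

    The same idea extends to 3D unchanged; only the vector math gains a component.
    
    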
    First, to speed things up, I delete most of the original points and try to isolate only the boundary ones (as I assume they are closer to the gaps), and I try not to use the ones that are very close together (as we don't need more points in dense areas). Then I scatter some random points around them using a simple spherical distribution.
    Then I try to flatten the scattered points and keep them closer to the original sheets. This step is not essential, but it may produce more valid points than just relying on the original distribution. I use 2 different methods. The first one (projectToPcPlane) searches for the 3 closest points and creates a plane from them; the scattered points are then projected onto these closest planes, which in some cases may produce very thin sheets (when colliding with the ground, for example). There is a parameter that controls the projection. The second one just snaps toward the closest points of the original point cloud. Unfortunately this may produce more overlapping points, so I add a Fuse SOP after this step when I use it. The balance between these 2 projections can produce very different distributions, but I like the first one more, so in my tests the second one was almost always at 0.
    Then there is THE MAIN CHECK! I do the same thing I did with the original points, again, in 2 passes with a smaller and a bigger radius - to ensure that there won't be any points left outside, or some of them scattered alone deep inside some hole. I also check some other criteria that I found may give better control. There may be some checks left in that I'm not using - I think I forgot a point-count check, but instead of removing it I just added +1 to make sure it doesn't do anything; I just tried to see what works and what doesn't. Oh, and there are also some unused VEX functions - I made them for fun but eventually didn't use them. So there it is. If you need to know anything else, just ask.
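    The plane-projection step can also be sketched in a few lines. This is a hedged Python approximation of the "project to the plane of the 3 closest points" idea (what the post calls projectToPcPlane); the function name `project_to_nearest_plane` and the sample data are mine, and the hip file does this in VEX instead:

    ```python
    # Project a scattered point onto the plane through its 3 nearest
    # original points (a sketch of the projectToPcPlane idea).
    import math

    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def project_to_nearest_plane(p, cloud):
        """Return p projected onto the plane through its 3 nearest cloud points."""
        a, b, c = sorted(cloud, key=lambda q: math.dist(p, q))[:3]
        n = cross(sub(b, a), sub(c, a))   # plane normal (unnormalized)
        nn = dot(n, n)
        if nn == 0.0:
            return p                       # degenerate: the 3 points are collinear
        t = dot(sub(p, a), n) / nn         # offset of p along the normal
        return tuple(p[i] - t * n[i] for i in range(3))

    # Original points lying on the z=0 plane; a scattered point floating
    # above them is flattened back down onto that plane.
    flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (2, 2, 0)]
    print(project_to_nearest_plane((0.25, 0.25, 0.7), flat))  # (0.25, 0.25, 0.0)
    ```

    Blending between this projected position and the raw closest-point snap (the post's second method) is what the projection parameter controls.
    
    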
Cheers
EDIT: just edited some mistakes...
EDIT2: file attached: pointCloudDotCheck.hiplc