
doc last won the day on May 7

doc had the most liked content!

Community Reputation

14 Good

About doc

  • Rank
    Illusionist

Personal Information

  • Name
    el Doktor
  • Location
    Vancouver
  1. If you are using Mantra you'll want to look at phantom objects in your Mantra ROP. If you are in Solaris, I think LPEs (light path expressions) are the way to go.
  2. Rotating Packed Primitives in VOP?

    Try this out: https://www.sidefx.com/docs/houdini/vex/functions/packedtransform.html
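    A minimal sketch of the same idea in a primitive wrangle (the parameter name "angle" is my own; the packedtransform VOP in the link above does the equivalent):

    ```vex
    // Primitive wrangle, run over the packed primitives.
    // The "transform" intrinsic is a 3x3 matrix holding each pack's rotation/scale.
    matrix3 xform = primintrinsic(0, "transform", @primnum);
    rotate(xform, radians(ch("angle")), {0, 1, 0});  // spin about world Y
    setprimintrinsic(0, "transform", @primnum, xform);
    ```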
  3. Pyro Shredding Question

    If memory serves, the old shredding method would let you pick a target temperature to act as a boundary. The temperature gradient would then be calculated, which in essence makes vectors that point in or out, away from that boundary. The vectors on the inside acted as the "stretch" and the vectors pointing out acted as the "squash". I found the terms squash and stretch misleading: as you increase the size of the "stretch", the velocity vectors pointing in increase the pressure on the inside. The increased pressure causes the velocities in the buoyancy direction to increase, hence the term stretch. The squash works the opposite way. I found the easiest way to understand what this DOP did was to copy the VEX code inside it into a wrangle SOP; SOPs are a much more intuitive place to figure out what's going on.
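    To make the squash/stretch idea concrete, here is a rough volume-wrangle sketch of the gradient logic described above (field and parameter names are my own guesses, not the actual shred code):

    ```vex
    // Volume wrangle run over "vel"; input 2 holds the "temperature" field.
    float  target = ch("target_temperature");
    float  temp   = volumesample(1, "temperature", @P);
    vector grad   = normalize(volumegradient(1, "temperature", @P));
    // Hotter than the target = "inside": push against the gradient (stretch).
    // Cooler than the target = "outside": push along the gradient (squash).
    float amt = (temp > target) ? -ch("stretch") : ch("squash");
    v@vel += amt * grad;
    ```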
  4. Pyro Shredding Question

    What version of Houdini are you using? In newer versions of Houdini, shredding is just disturbance (take a look inside the solver). I'm not sure if the Gas Shred DOP is still there, but the docs you are quoting sound like they were taken from that operator.
  5. If both parameters are integer parameters (i.e. not float parameters set to integer values) then you might try using the chi expression instead of ch. I believe ch returns a float, so it might be casting your integer to a float.
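     For example, referencing an integer parameter such as a Switch SOP's input (the node path is just for illustration):

     ```
     chi("../switch1/input")         integer result
     int(ch("../switch1/input"))     float result cast back to an integer
     ```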
  6. There are a number of ways to handle this. 1. Create a grid above your surfaced FLIP sim, scatter points on the grid, copy your flat lily pads to the points, then Ray the lily pads onto the surfaced FLIP sim. 2. Create your lily pads as above; in your FLIP surfacing node there's an option to output a VDB, and instead of raying onto a polygon surface you can use the SDF value and the gradient of the VDB to move the points of the lily pads to the surface of the water. If you were really ambitious you could do this in a solver and actually inherit the velocities from the FLIP sim.
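     A hedged sketch of the second option as a point wrangle (it assumes the fluid surface VDB, with its SDF in a field called "surface", is wired into the second input):

     ```vex
     // Point wrangle on the lily-pad points; input 2 is the fluid surface VDB.
     float  d = volumesample(1, "surface", @P);              // signed distance to the water
     vector n = normalize(volumegradient(1, "surface", @P)); // direction toward the surface
     @P -= d * n;  // slide each point along the gradient onto the zero isosurface
     ```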
  7. Ryew is correct. The idea behind the data bindings is that you can associate an incoming field with a differently named bind inside the VOP net. This is especially handy when you have VOP nets inside an HDA.
  8. Vellum dress up

    You might want to crack open the Vellum Drape node and take a look. It takes points that are to be welded and creates a stitch constraint in their place. If I remember correctly, the rest length is then scaled down over time; at a specified frame the stitch constraints are removed and the welds are activated. Seems like you could use this technique for what you are trying to do.
  9. I noticed that increasing the resolution of the sphere seems to make the problem go away, but I'm not sure why that would make a difference.
  10. Very close to getting it working, but in my test I'm getting a weird wedge of high density. Maybe somebody smarter can tell me what I missed? volume_fall_off_test_v1.hip
  11. Camera Projection

    ejr32123 is correct. You want a constant shader on the top of the sidewalk, but you want proper lighting on the fractured interior faces. A larger issue you'll have is that when the pieces break and move, the lighting on the top surface should change: the shadows should change, and the diffuse and specular lighting should change. But how do you account for this if the lighting is baked into the projected image? No easy answer as far as I know. People often make a copy of the original image and try to remove specular and shadows in the hope of creating a diffuse texture map that can be used with a principled shader. Photoshop or GIMP seem the most obvious tools for this, but I believe there are some photogrammetry solutions that have automated mechanisms for doing it. Once this is done you'll need to reproduce the lighting in CG so that it matches the photo. You'll probably need to render a shadow pass for areas where your CG debris is supposed to cast shadows on the constant surface. Then it'll probably take some love in compositing to make it all work. Hope that helps.
  12. Are you rendering through Solaris or /obj?
  13. Camera Projection

    It looks like your geometry is picking up lighting, which it doesn't need because the lighting already exists in the image. Try rendering the geo with a constant shader; the issue should go away.
  14. Hmmm... I just gave it a try, and it seems to be working with my example of 50 lights. Although there does seem to be a limit to what can be displayed in OpenGL. lops_crowd_lights_doc_v2.hip
  15. I'm not sure how much flexibility there is in terms of creating the point ordering procedurally in, say, VEX. However, the Planar Patch from Curves SOP is your friend here: it'll make point groups with the proper ordering so that your welds can work. This is great because once you have these in place you can procedurally adjust the resolution of your geometry. If you look inside the attached file you'll see an example. I'm using /obj/shirt/resample2 (the green node) as a means of controlling the resolution of the Planar Patch from Curves SOP as well as my resample. Works like a charm. just_shirt_v22.hip