loudsubs

Members
  • Posts: 146
  • Joined
  • Last visited
  • Days Won: 2

loudsubs last won the day on June 6 2017

loudsubs had the most liked content!

Personal Information
  • Name: .
  • Location: USA


Reputation: 46

  1. Does this toolset create UVs for the trunk, branches, and leaves?
  2. Yeah, thanks. I have tried using a wire sim, but haven't really been able to get what I want. It always becomes a battle between the inertia from the keyframed positions and rotations, and adding too much stiffness to compensate. The animation I want to do could be compared to a really intense roller coaster: lots of dips, spirals, etc. So I haven't found a way to tame a wire sim when that kind of force is being put on it. That's why I'm thinking of doing a procedural SOP animation first, because of the control I get for dialing in the exact keyframes on position and twist along the curve, and then running a sim on top of that base-level animation to give it some organic jiggle and life. Offsetting animation with CHOPs has been the closest thing I've found.
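The CHOPs-offset idea above can be sketched in a few lines of pure Python: every child point along the chain replays the lead point's animation a few frames late, which is what a per-channel shift/lag in CHOPs effectively does. The lead path function and the delay value here are made up purely for illustration.

```python
import math

def lead_position(frame):
    # Hypothetical hand-animated lead point: a swooping, spiraling path.
    t = frame * 0.1
    return (math.sin(t), math.cos(t * 0.5), t * 0.2)

def chain_positions(frame, num_points, delay=3.0):
    # Each child point samples the lead's animation `i * delay` frames
    # in the past, producing the trailing, ribbon-like follow motion.
    return [lead_position(frame - i * delay) for i in range(num_points)]

pts = chain_positions(48.0, 6)
# pts[0] is the lead's current position; pts[5] is where the lead was
# 15 frames earlier.
```

Twist along the curve can be layered the same way, by delaying a rotation channel per point instead of (or in addition to) position.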
  3. Hi all, I'm starting to look into KineFX and am interested in finding a way to make a simple rig for a line that would basically allow me to hand-animate the lead point's position and rotation, with the child points of the line automatically following that position and rotation, but offset in time. It's to control a flying, snake-like character... so in concept, in the attached image of this ribbon wand, the performer is controlling the lead point of the ribbon and getting some beautiful motion as the tail of the ribbon follows that lead motion. My goal is to hand-animate just one lead point and have Houdini handle the cool spiraling and offset motion for the "tail" of the line. Would anyone be able to share a simple setup showing how to do this with KineFX? Thanks!
  4. Yeah, definitely use .rs instead of bgeo if you want to do the instancing at rendertime (use the Redshift proxy ROP). I attached an old example of doing it this way; it should still work. (FYI, the Redshift tab may not have all the latest updates, since I created this a while ago.) Alternatively, I find it easier to do instancing with Redshift using the packed-prim workflow with the Copy to Points SOP:
    • Create i@variant on both sides of the copy, so no stamping is needed.
    • Turn on "piece attribute" in the copy node.
    • Turn on "pack and instance" in the copy node.
    • Up at /obj level, on the geometry object you're going to render (which contains the packed prims): Redshift tab > Instancing > tick on "instance SOP level packed prims".
    rs_proxy_instance.hiplc
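The i@variant step above is normally a one-line point wrangle; as a hedged, pure-Python stand-in (point and variant counts are made up), picking a variant index per template point looks like this:

```python
import random

def assign_variants(num_points, num_variants, seed=0):
    # Stand-in for a point wrangle like `i@variant = ...` on the template
    # points: each point picks one of the packed source pieces by index.
    # A fixed seed keeps the assignment stable between cooks.
    rng = random.Random(seed)
    return [rng.randrange(num_variants) for _ in range(num_points)]

variants = assign_variants(num_points=10, num_variants=3)
```

With the matching i@variant values on the packed pieces feeding the copy's first input, the "piece attribute" toggle then matches each point to its piece with no stamping.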
  5. Do you have this hip file I can look at? I'm not getting as cool an effect when I try to mix FLIP.
  6. The threshold range will take that input range and remap it to a 0-1 multiplier on your disturbance value. So anything at .05 or above will not get affected, and the closer you get to 0, the stronger the effect. It basically masks the edges of your smoke to get the fine detail.
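That remap can be written out explicitly. A minimal sketch, assuming a (0, 0.05) threshold range as in the post (the function and parameter names are made up, not Houdini's):

```python
def disturbance_mask(density, range_min=0.0, range_max=0.05):
    # Remap density across [range_min, range_max] into a 0-1 multiplier:
    # full disturbance in near-empty edge voxels, fading to none once the
    # field reaches range_max or above.
    if density >= range_max:
        return 0.0
    if density <= range_min:
        return 1.0
    return 1.0 - (density - range_min) / (range_max - range_min)

# applied per voxel: disturbance = disturbance_mask(density) * amplitude
```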
  7. Very cool! Thank you for taking a look at my file and giving the detailed explanation; lots to dissect here. Thank you!
  8. I'm using a pathsolver setup (from Entagma) to get some random animation on points, moving either 1 step in X or Z each iteration, and I'm running into problems trying to create rotations from that animation data. The math to get the correct amount of rotation (happening in CHOPs) seems to be working; however, the orientation of the axes that the rotations are supposed to happen on is getting screwed up. If I only allow rotations along one axis, either X or Z, I can get it to work, but when I enable both, I start getting really weird behavior, with rotations happening along different axes. So if there's a quaternion expert out there who wouldn't mind taking a look, I'd really appreciate it. The part of the scene where I'm having issues is the pink rotations node near the bottom, which gets its data from the green chopnet just above. The stuff near the top, in the grey subnet, you can pretty much ignore; it's just generating some animation on the points in a technical way. Thanks! pathsolver04.hiplc
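A common cause of the "works on one axis, breaks on two" symptom is quaternion multiplication order: composing the same two step rotations in the opposite order gives a different result, so accumulating per-step quaternions on the wrong side spins about the wrong axis. A small pure-Python demonstration (not from the hip file, just illustrating the non-commutativity):

```python
import math

def quat_from_axis_angle(axis, angle):
    # Unit quaternion (w, x, y, z) for `angle` radians about unit `axis`.
    s = math.sin(angle * 0.5)
    return (math.cos(angle * 0.5), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_mul(a, b):
    # Hamilton product a*b: rotate by b first, then by a.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

qx = quat_from_axis_angle((1.0, 0.0, 0.0), math.pi / 2)  # 90 deg about X
qy = quat_from_axis_angle((0.0, 1.0, 0.0), math.pi / 2)  # 90 deg about Y

x_then_y = quat_mul(qy, qx)
y_then_x = quat_mul(qx, qy)
# The two products differ, so when accumulating step rotations you must
# consistently pre- or post-multiply depending on whether each step is
# expressed in world space or in the point's local (already-rotated) frame.
```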
  9. Did you try the split point SOP? This might be what you need, if you have a procedural method of grouping the points that need to be split. (I just grouped by hand in this example file). pointsplit.hiplc
  10. I'm planning out a new rig and will be utilizing the GPU quite a bit for sims and rendering. I'm pretty novice with hardware building and am looking for some advice on a motherboard and case that could fit 3 GPUs. 660 Ti 3GB - old card, but seems like it would be fine for the viewport. 1070 Ti 8GB - for sims and rendering. 1080 Ti or Titan RTX - TBD on which card I will get, but it will be something big for the 3rd card. This is the rest of the build I've come up with so far. I'd like to figure out if these 3 cards would fit with this ASRock TRX40 Creator ATX sTRX4 motherboard and the Enthoo Primo case, or the Enthoo 719 case: https://pcpartpicker.com/list/hMfj9G Cheers!
  11. Been trying to crack this one since they changed the sourcing. A few variations here. Done in 17.5. vel_test2.hip
  12. Guessing here, since I haven't tried this for writing videos: partition by attribute, and make sure the attribute you're wedging is in the filename. Then you should get 3 different files.
  13. I have a basic tentacle-type setup (that's what I'm going for, anyway). I'm creating dynamic constraints from a cluster of vellum points that are pinned to some SOP animation, and attaching them to some vellum softbodies with no pinned animation, using the 'attach to geo' option on the Vellum Constraint DOP. The constraint creation is working, but I want the soft bodies to get pulled in toward the points that are creating the dynamic constraints (like tentacles pulling something in). I've tried manipulating the rest length, but that doesn't seem to have any effect. Anyone know how to get this to work? Thanks! vellum01.hip
  14.
    • GPU Mantra.
    • GPU sparse smoke solver, like the one this guy made.
    • Some love to the CHOPs animation tools for non-animators (ie: spring, jiggle): ways to make it easier to procedurally (group?) affect only certain parts of the incoming data, and more control and options to fine-tune when using spring, etc.
  15. Yeah, that's pretty much it; make a simple scene and test it. Think of it like doing vector math in a VOP SOP: the gradient field is a vector, and you can add it to the vel field, which is another vector. You can also multiply a vector field by a scalar field (ie: temperature). That is how buoyancy works in a smoke sim: there is a buoyancy direction (a vector), and it gets multiplied by a scalar field, temperature, which controls the magnitude. The control field parm of some of the shaping tools is indeed a mask, almost like a group field in SOPs. It wants a scalar field like density or temperature, or something custom like a fit of the length of a curl field (which you could get with a couple of Gas Analysis nodes and a Gas VOP). You don't really need to mess with the bindings tab often. You can do some of these operations with VDBs in SOPs (VDB Analysis), and then look up the volume values on points sitting in the same position using a Point VOP and a Volume Sample. This can be a good way to inspect the numbers if you're trying to figure out what exactly the calculation is doing.
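The vector-times-scalar buoyancy pattern described above reduces to a couple of lines per voxel. A minimal Python sketch (function names and the sample values are invented for illustration, not solver API):

```python
def buoyancy_force(direction, scale, temperature):
    # Vector * scalar: the buoyancy direction (a vector) scaled by the
    # temperature field sample (a scalar) and an overall strength.
    return tuple(c * temperature * scale for c in direction)

def add_to_vel(vel, force):
    # Adding one vector field sample into vel, like adding a gradient
    # field to the vel field in a gas VOP.
    return tuple(v + f for v, f in zip(vel, force))

vel = (0.0, 0.0, 0.0)
vel = add_to_vel(vel, buoyancy_force((0.0, 1.0, 0.0), scale=2.0,
                                     temperature=0.5))
# vel is now (0.0, 1.0, 0.0): hotter voxels get a bigger upward push,
# cold voxels (temperature 0) get none.
```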