Welcome to od|forum




Community Reputation

369 Excellent

About mestela

  • Rank
    Houdini Master

Personal Information

  • Name matt
  1. No probs! Realised I missed the bit about random speeds; in that case, change the line in the solver wrangle that adds a constant amount: @active += 0.2; to something that adds a random amount per point: @active += rand(@ptnum)*0.3; I then had to boost the number of substeps on the solver to 20 (or extend the frame range), as it grows much more slowly.
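To put those two lines in context, here's a minimal sketch of what such a solver wrangle might look like; the clamp and the 0.3 multiplier are illustrative assumptions, not taken from the original file:

```vex
// point wrangle inside a solver SOP:
// each substep, grow @active by a random per-point increment
float speed = rand(@ptnum) * 0.3; // constant for a given point, random across points
@active += speed;
@active = min(@active, 1.0);      // assumed clamp so growth tops out at 1
```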
  2. This approach is a little heavy, but it works. I tried cunning things with carves and the adjustPrimLength setup that's been discussed here, but the problem is that the polylines have all sorts of orientations, so growing them that way often makes them grow against the overall flow. The base idea here is the same as the infection/growth stuff that all the kids are doing nowadays. Growing_lines_between_points_me.hipnc
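The infection/growth idea mentioned above can be sketched as a solver point wrangle; the attribute name f@infect and the radius channel are assumptions for illustration, not names from the hip file:

```vex
// point wrangle inside a solver SOP: simple infection spread.
// an uninfected point catches the infection from any neighbour
// within 'radius' that was infected on the previous frame
float radius = chf("radius"); // assumed search radius, e.g. 0.1
if (f@infect < 1) {
    int pts[] = nearpoints(0, @P, radius);
    foreach (int pt; pts) {
        if (point(0, "infect", pt) >= 1) {
            f@infect = 1;
            break;
        }
    }
}
```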
  3. AFAIK people generate vectors in external apps (usually After Effects or Nuke with plugins like Twixtor, or they go fancy with openCV libs or hacked ffmpeg builds), and then use that image sequence to make a velocity field. I assume that's how the amazing video below was done. (edit) Ah, he says in another video he uses DeepFlow: https://thoth.inrialpes.fr/src/deepflow/ (/edit) Had a quick look around last night and this morning and found this javascript image processing library. They have an implementation of Lucas-Kanade optical flow; I have a suspicion it'd be a good starting point to port to vex: https://github.com/inspirit/jsfeat https://github.com/inspirit/jsfeat/blob/master/src/jsfeat_optical_flow_lk.js -matt
  4. http://www.tokeru.com/cgwiki/index.php?title=HoudiniUserInterfaceTips#colour_correction_toolbar_in_viewport
  5. This might help: http://www.tokeru.com/cgwiki/index.php?title=HoudiniDops#Grain_attached_to_things.2C_then_explode
  6. This might help: http://www.tokeru.com/cgwiki/index.php?title=HoudiniVex#Access_group_names_procedurally_in_vex
  7. blast the 'inside' group -> divide 'remove shared edges' -> ends 'unroll' voronoi_edges.hipnc
  8. useless? Ha, ouch. The idea is the same: project your UVs before you carve, and they should be maintained. uv_carve_fix2.hipnc
  9. You weren't clear about what you want fixed; was this the sort of thing you wanted? Use a uvtexture sop in 'rows and cols' mode, which is the closest way to map parametric uvs into 'real' uvs. Put the uvtexture sop before or after the carve depending on whether you want the uvs to follow the animation or not. uv_carve_fix.hipnc
  10. attribInterpolate is probably the easiest way these days: http://www.tokeru.com/cgwiki/index.php?title=Houdini#Slide_points_along_edges
  11. Sweet! Here's a quick example of using noise, ramps, nonsense to get a growing look. All realtime, keyframeable, no simming required. fakey_noise_growth.hipnc
  12. No probs. Don't forget that with this sort of thing you may not even need a solver to define the growth pattern; some noise run through bbox-based ramps and whatnot will probably be fine, and will save you having to use a pesky solver.
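A minimal sketch of that no-solver approach in a plain point wrangle; the channel names, frame range, and growth axis are assumptions for illustration:

```vex
// point wrangle, no solver: growth driven by bbox position plus noise
float grow = relbbox(0, @P).y;             // 0 at the bottom of the bbox, 1 at the top
float n = noise(@P * chf("freq"));         // assumed noise frequency channel
grow += (n - 0.5) * chf("noise_amt");      // assumed noise amount channel
float thresh = fit(@Frame, 1, 100, 0, 1);  // assumed frame range to wipe through
f@active = grow < thresh ? 1 : 0;          // grows from the bottom up over time
```

Because everything is a function of position and @Frame, this is keyframeable and scrubs in realtime, which is the point of the post above.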
  13. As you say, you were super close. All I did was split out the stuff and be lazy about how to control the growth timing. The solver growth is in its own sop solver, running on the static points rather than the deforming ones, just so it was easier to debug. It's basically the same technique you were using; to control the timing, all I did was enable the bounding box controls on the group where you define your seed points, and keyframe it to wipe through the person, so it starts above his head (meaning no seed points are defined), then wipes down to his feet. The dopnet is just the grain sim; the only extra bit I had to do was make sure it copied Cd from outside the sim, otherwise it never inherited the growing 'infection'. GrainDancer4_using_infect_wrangle_fiddle.hipnc
  14. Doh, immediately after I posted I realised how to make it fully dynamic rather than a switch, if you need that level of detail. Here I've ramped the targetstiffness from 0 at the feet to 1 near his waist, then turned it off completely after frame 41. grain_dancer_v02.hipnc
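That stiffness ramp could look something like this in a point wrangle on the grain source; the height channels are assumptions, and only the frame 41 cutoff comes from the post:

```vex
// point wrangle on the grain source: ramp targetstiffness up the body
float feet  = chf("feet_y");   // assumed world-space height of the feet
float waist = chf("waist_y");  // assumed world-space height of the waist
f@targetstiffness = fit(@P.y, feet, waist, 0, 1);
if (@Frame > 41) f@targetstiffness = 0; // turn it off completely after frame 41
```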
  15. go lazy; point deform the grain source, switch it on the sim frame? grain_dancer.hipnc