Showing results for tags 'instancing'.

Found 12 results

  1. Hello, I recently went through Steven Knipping's tutorial on Rigid Bodies (Volume 2), and now I have a nice scene with a building blowing up. The simulation runs off low-res geometry, and Knipping then uses instancing to replace these parts with high-res geometry that has extra detail in the break areas. The low-res geometry has about 500K points as packed disk primitives, so it runs pretty fast in the editor and gets ported to Redshift in a reasonable amount of time (~10 seconds). However, when I switch to the high-res geometry, as you might guess, those 10 seconds turn into around 4 minutes, with a crash rate of about 30% due to memory. Unpacked, I think it was about 40M points, which I can understand is slow to write/read across the Houdini-Redshift bridge, but is there no way to make Redshift use the instancing and the packed disk primitives? My theory is that RS unpacks all of it, and that's why it takes forever: when I unpack it beforehand it works somewhat faster, at least for the low-res; the high-res just crashes. I probably don't understand how Redshift works and have the wrong expectations, so it would be nice if someone could give me an explanation. Attached is an archive containing the scene with one frame (frame 18) of the simulation included, as well as the folder with the saved instances of the low- and high-res geometry. Thanks a lot for your help, Martin (Here's a pic of the low-res geometry; the high-res basically breaks every piece down into ~80 times more polygons.) Martin_RBDdestr_RSproblem_archive.zip
  2. Hi! How do I make instances of deformable geometry, with a deformer (wire, lattice, …), on the surface of an animated object? I want to make, for example, "hair" from custom geometry, or algae. The copies should sit on the animated surface while keeping memory usage minimal.
  3. Hi, I made an RBD sim that uses only 200 unique pieces, but with hundreds of copies of each piece, and I cached the simmed points. Now I want to place the unique pieces back onto those points. What's the most efficient way to do this for both the viewport and the renderer? I've seen many different techniques and can't tell which one is best. Should I just use the Instance SOP? Should I save the first frame of the sim and use that instead? I can move the pieces with the Transform Pieces SOP, but I don't know the optimal way to bring them in. Should the pieces be brought in as packed geometry, packed fragments, packed disk geometry, or something else? I want Houdini to know that everything comes from these 200 unique pieces so it can display and render them more efficiently. I also noticed that if I save my pieces to separate files as packed fragments, Houdini writes the entire packed geometry into each file, which takes a lot of space. How do you do this? (One way to build per-point instance paths is sketched after this list.) Thanks
  4. Hi, I have some particles and I'm trying to instance lights onto them, but I want to control each light's intensity based on a random point/id number, so every particle gets a different intensity. I couldn't come up with a way to control the light intensity for each particle separately. I thought of using copy stamping after my POP network, but that would require something like an Object Merge, which unfortunately doesn't support lights. Is there any way to import a light into my nodes with something like an Object Merge? Any idea how I can do that, or how I can otherwise control each individual light's intensity based on a random id/point number? (One possible approach is sketched after this list.) Thanks
  5. Hi, could anyone tell me why these box instances do not appear until frame 94? Thanks for your help. instance attempt.hipnc
  6. Hi guys, if anyone has done this and has a solution or any advice, that would be great and hugely appreciated. It is fairly easy to do all of this if the volumes you want to instance live inside your hip scene. However, once they are only referenced from disk, I am struggling to run the CVEX over each instanced volume (I am using 20 different variations, hence the "instancefile" attribute). I am also trying to volume-sample each instance to pull the surface, rest, etc. primitives into the CVEX context to drive my displacement (a CVEX sketch is included after this list). Many thanks, Tom
  7. Hi, does anybody have experience with exporting ASS files from Houdini and rendering them in Maya while preserving instancing? Right now HtoA seems to bake in the instances, so the ASS files are huge and instancing is apparently not preserved. Any ideas? Thanks, Juraj
  8. Hey guys, I've got an issue in DOPs when plugging a Source Volume node into the pre-solve input of a Smoke Solver while using instancing/partitioning. From the scene attached, here are the different scenarios:
     - `fluidsource1/enable_partitioning` disabled, source volume plugged into the post-solve input: works.
     - `fluidsource1/enable_partitioning` disabled, source volume plugged into the pre-solve input: works.
     - `fluidsource1/enable_partitioning` enabled, source volume plugged into the post-solve input: works.
     - `fluidsource1/enable_partitioning` enabled, source volume plugged into the pre-solve input: fails; the container is entirely filled with a density of 1 on a frame after the expected one.
     I would like to plug the source volume into the pre-solve step so the velocities advect the density on the creation frame. Am I correct that this is a valid approach, and if so, what is going wrong here? Also, if instead of sourcing the volume on a single frame I emit continuously by turning off the `sparse_emission` switch, everything works fine again. Finally, I thought of scaling the density to zero outside the single emitting frame defined for each instance, but I don't know how to do that procedurally with volumes named "density_0", "density_1", ..., "density_n", and I would have to do the same for the velocity sources and every other field, which is suboptimal (one possible approach is sketched after this list). I'd appreciate any help! smoke_presolve_instancing.hipnc
  9. I recently built an animated instance geometry system. The instancefile attribute selects between different files, the selection data includes subframe values, and the subframe motion blur feature is enabled on the SOP, yet I cannot get motion blur with this setup. I was under the impression that if Mantra received subframe information for the same geometry (same number of points and vertices, with the same numbering and attributes), it would be able to compute motion blur. At the moment the loaded geometry has no velocity attribute, and velocity blur is not the way I want to achieve this. Any help would be appreciated; I can post a .hip file once I've simplified the setup down to a few nodes.
  10. MultiPoint Particle

    Hello guys, I would like to know if it is possible to do multi-instancing with FLIP fluids or POP particles in Houdini, where one point drives multiple points, just like activating MultiPoint in Maya (a sketch is included after this list). Cheers, Sylvain. PS: an example is enclosed.
  11. Hi, I've been playing around with instancing and I haven't managed to change the animation start frame on a per-instance basis. For example, if I instance an animated cube (keyframes or a bgeo sequence) onto points, all the instances play the animation the same way (see the file below). How can I change the animation start frame per instance to randomize things a bit, whether the animation comes from keyframes or a bgeo cache? (One approach is sketched after this list.) Thank you in advance. Ryuji InstanceAnim.hipnc
  12. Hi, I'm trying to write a displacement shader that takes a curve and deforms geometry (in my case a grid) to follow that curve. I can do it if the curve is simple, starts at the origin, and runs in one direction, but it gets more difficult once the curve starts away from the origin and changes direction. What's the easiest and simplest way to do this? (A sketch is included after this list.)
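
For item 3, the Instance SOP route mentioned there (fast point instancing at render time) needs a per-point path to each unique piece on disk. Below is a minimal Point Wrangle sketch, assuming the cached sim points carry a `name` attribute matching the piece names and that the 200 unique pieces were written out one file per piece; the `$HIP/geo/pieces/` layout is only an example.

```
// Point Wrangle on the cached sim points.
// Assumes s@name holds the piece name (e.g. "piece_12") and each unique piece
// was saved to its own file; adjust the path pattern to your layout.
// Some renderers do not expand $HIP inside attribute values, so bake a full
// path here if needed.
s@instancefile = sprintf("$HIP/geo/pieces/%s.bgeo.sc", s@name);
```

The other route the post mentions, loading each unique piece once as a packed disk primitive and moving the copies with Transform Pieces, also keeps the viewport and the renderer aware that only 200 unique sources exist, so either way the heavy geometry is not duplicated.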
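
For item 4, the per-particle value itself is easy to store as a point attribute; whether an instanced light actually picks it up as a parameter override depends on the instancing method and the renderer, so treat the attribute name below as an assumption to verify. A minimal Point Wrangle sketch placed after the POP network:

```
// Give every particle its own random light intensity.
// "light_intensity" matches the name of a Mantra light's intensity parameter;
// with full point instancing this is often used as a per-instance override,
// but verify it against your Houdini/renderer version.
float seed = @ptnum;
if (haspointattrib(0, "id"))
    seed = point(0, "id", @ptnum);          // prefer the stable particle id
f@light_intensity = fit01(rand(seed), 0.2, 2.0);   // example intensity range
```

Even if the override route does not work in your setup, the attribute gives you a single place to read the value from, for example from a small script that duplicates the light per particle and copies the intensity across.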
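
For item 6, VEX's volumesample()/volumesamplev() accept a file path as the geometry argument, so the extra fields can be sampled straight from the per-instance file on disk. A minimal CVEX sketch, assuming the file path is passed in (for example bound from the instancefile attribute) and that fields named "surface" and "rest" exist in those files:

```
// CVEX function sampling extra fields from an instanced volume file on disk.
// P is the sample position supplied by the caller; instancepath is the
// per-instance file (an assumption: bind it from your instancefile attribute).
cvex sample_instance_fields(
    vector P = 0;
    string instancepath = "";
    export float  surface = 0;
    export vector rest    = 0)
{
    surface = volumesample(instancepath, "surface", P);
    rest    = volumesamplev(instancepath, "rest", P);
}
```

The position you pass in has to be in the space the file was written in, so if the instances are transformed, apply the instance's inverse transform to P before sampling.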
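
For item 8, the last question (procedurally disabling the numbered source fields outside each instance's emission frame) can be handled before the volumes reach DOPs by deleting the fields that should not emit on the current frame. A minimal Primitive Wrangle sketch; the mapping from instance index to emission frame is an assumption, so replace it with whatever rule your setup uses.

```
// Primitive Wrangle (Run Over: Primitives) on the partitioned source volumes.
// Keeps "density_<i>", "vel_<i>", ... only on instance i's emission frame.
string parts[] = split(s@name, "_");      // "density_12" -> { "density", "12" }
if (len(parts) < 2)
    return;                               // not a partitioned field, keep it
int idx       = atoi(parts[-1]);          // instance index
int emitframe = chi("emit_start") + idx;  // hypothetical per-instance frame rule
if (int(@Frame) != emitframe)
    removeprim(0, @primnum, 1);           // drop this field off its frame
```

This does not explain the pre-solve behaviour itself, but it avoids having to scale each named density and velocity field to zero by hand.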
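
For item 10, one way to get a MultiPoint-style effect is to spawn a small cluster of child points around every particle and instance onto the children instead. A minimal Point Wrangle sketch; the count and radius channels are examples.

```
// Point Wrangle (Run Over: Points) after the FLIP/POP network.
int   count  = chi("count");      // children per particle
float radius = chf("radius");     // cluster radius
for (int i = 0; i < count; i++)
{
    // deterministic per-particle, per-child jitter
    vector r      = rand(@ptnum * 7919 + i);
    vector jitter = (r - 0.5) * 2.0 * radius;
    int pt = addpoint(0, @P + jitter);
    // carry over whatever attributes the copies need
    setpointattrib(0, "v",  pt, v@v);
    setpointattrib(0, "id", pt, @ptnum);
}
// optionally remove the original so only the cluster remains:
// removepoint(0, @ptnum);
```

Driving count or radius from another attribute (age, speed, and so on) gives per-particle variation.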
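
For item 11, one long-standing approach is copy stamping: give every template point its own frame offset and let a Time Shift in front of the copy's first input read that offset per copy. A minimal sketch; the 24-frame range and the node names below are examples.

```
// Point Wrangle on the template points: a random start-frame offset per point.
f@frameoffset = floor(rand(@ptnum) * 24);   // 0-23 frames, example range
```

On the Copy SOP, enable stamping with a variable such as `offset` whose value is `point(opinputpath(".", 1), $PT, "frameoffset", 0)`, and set the Time Shift feeding the copy's first input to `$F - stamp("../copy1", "offset", 0)`. The same idea works with a Copy to Points inside a For-Each block, time-shifting by the current point's attribute instead of a stamp, and for a per-point bgeo sequence the offset can be baked into the frame number of the instancefile path instead.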
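
For item 12, a common trick is to treat one axis of the grid as the parametric coordinate along the curve, look up the curve position and tangent with primuv(), and rebuild each point in that local frame; this works no matter where the curve starts or how it bends. The sketch below does it in a Point Wrangle at SOP level rather than in a displacement shader, but the same lookup applies in a CVEX displacement context; the axis mapping and the simple up vector are assumptions.

```
// Point Wrangle: grid in input 0, curve in input 1.
// Assumes the grid runs along X; relbbox() maps that axis to 0-1, which is
// then used as the parametric coordinate u along the curve.
vector rel = relbbox(0, @P);                       // grid position in its bbox
float  u   = rel.x;                                // 0 at one end, 1 at the other
vector onCurve = primuv(1, "P", 0, set(u, 0, 0));  // position on the curve at u
// tangent from a small step along the curve (stepping backwards near the end)
float  u2      = (u < 0.99) ? u + 0.01 : u - 0.01;
vector other   = primuv(1, "P", 0, set(u2, 0, 0));
vector tangent = normalize((u < 0.99) ? other - onCurve : onCurve - other);
vector up      = {0, 1, 0};                        // naive up; twists on vertical curves
vector side    = normalize(cross(up, tangent));
vector nrml    = cross(tangent, side);
// rebuild the point in the curve's frame: z becomes width, y stays height
@P = onCurve + side * @P.z + nrml * @P.y;
```

For a true render-time displacement the same code can live in a CVEX shader that samples the curve from a file or an op: path and displaces P along that frame.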