Showing results for tags 'instancing'.

Found 19 results

  1. I'm having an issue where some of my instanced objects are not casting shadows when rendering with Arnold. In the example project I've uploaded, box_instance_1 is not casting shadows on the ground plane, whereas box_instance_2 is casting shadows perfectly fine. Even stranger, when I hide box_instance_1 from the render, box_instance_2 suddenly does not cast shadows. As far as I know I've set up both instances the same way. Clearly something weird is going on; has anyone else encountered this issue? I'm on version 16.5.439. rock_instances.hip
  2. With the help of both the Redshift community and resources here, I finally figured out the proper workflow for dealing with Redshift proxies in Houdini. Quick summary: Out of the box, Mantra does a fantastic job automagically dealing with instanced packed primitives, carrying all the wonderful Houdini efficiencies right into the render. If you use the same workflow with Redshift, though, RS unpacks all of the primitives, consumes all your VRAM, blows out of core, devours your CPU RAM, and causes a star in a nearby galaxy to supernova, annihilating several inhabited planets in the process. Okay, maybe not that last one, but you can't prove me wrong so it stays. The trick is to use RS proxies, driven by the Houdini instance points, instead of raw Houdini packed-primitive instances. A lot of this was based on Michael Buckley's post. I wanted to share an annotated file with some additional tweaks to make it easier for others to get up to speed quickly with RS proxies. Trust me; it's absolutely worth it. The speed compared to Mantra is just crazy. A few notes:
     • Keep the workflow procedural by flagging Compute Number of Points in the Points Generate SOP instead of hard-coding a number.
     • Use paths that reference the Houdini $HIP and/or $JOB variables. For some reason the RS proxy calls fail if absolute paths are used.
     • Do not use the SOP Instance node in Houdini; instead use the instancefile attribute in a wrangle. This was confusing, as it doesn't match the typical Houdini workflow for instancing.
     • A lot of posts on RS proxies mention that you always need to set the proxy geo at the world origin before caching it. That was not the case here, but I left the bypassed transform nodes in the network in case your mileage varies.
     • The newest version of Redshift for Houdini has an Instance SOP Level Packed Primitives flag on the OBJ node under the Instancing tab, designed to automatically do the same thing Mantra does. It works for some scenarios but not all; it didn't work for this simple wall-fracturing example. You might want to take that option for a spin before trying this workflow.
     If anyone just needs the Attribute Wrangle VEX code to copy, here it is:
         v@pivot = primintrinsic(1, "pivot", @ptnum);
         3@transform = primintrinsic(1, "transform", @ptnum);
         s@name = point(1, "name_orig", @ptnum);
         v@pos = point(1, "P", @ptnum);
         v@v = point(1, "v", @ptnum);
     Hope someone finds this useful. -- mC Proxy_Example_Final.hiplc
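     Since the notes above recommend driving the proxies with the instancefile attribute rather than the SOP Instance node, here is a minimal point-wrangle sketch of that idea; the proxy path pattern and the variant count are placeholders, not values taken from the example file:
         // Point wrangle, run over points: point each instance at an .rs proxy on disk.
         // The path pattern and the number of proxy variants are hypothetical.
         int variant = int(rand(@ptnum) * 20);
         s@instancefile = sprintf("$HIP/proxies/piece_%02d.rs", variant);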
  3. Hello all, I've run into a problem that's been sending me in six directions at once, and I can't seem to get any of them straight in my head. I have one cache that I want to copy to points that show up over time; I'm basically doing a running explosion like this: https://gfycat.com/shinyhardanaconda When I copy the packed and cached sim onto the points, the cache doesn't play back with a per-point offset on the copies, so how would I go about offsetting each new sim on its point? I've created a point attribute that marks the frame each point is created on, but I don't know how to connect that to a time offset or shift that would make it work per point. Am I even doing this the right way? Anything helps, thanks.
  4. Hi, I am looking for ways to replicate in Houdini point instancing done in some other application. I will skip the data-importing part here because it is clear in my situation how to do that with Python. Let's say I already have a Python dictionary with elements like 'name':'transform', where 'name' is the name of the object to be instanced and 'transform' is a list of 16 floats representing the world transform matrix of that instance. I have figured out so far how to do it at the object level. Here is my Python code for that:
         # My existing dictionary containing pairs like 'name':'transform',
         # where each dictionary value is a list of 16 floats
         my_dict = {'foo': '....', 'bar': '......', ......}
         for key in my_dict.keys():
             node = hou.node('/obj').createNode('null')
             node.setName(key)
             m4 = hou.Matrix4(1)
             m4.setTo(my_dict[key])
             node.setParmTransform(m4)
     This gives me a bunch of named nulls with the correct transforms, and I can parent my objects under those nulls. But I need the same at the SOP level: a bunch of points with 'name', vector 'scale' and preferably quaternion 'orient' attributes which I can pipe into the right input of the Copy SOP for instancing. Any help on that would be much appreciated.
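     For the SOP-level half, a point wrangle can split a stored matrix into the attributes point instancing expects. This is only a sketch: it assumes the importer has already created one point per instance, with the 16-float matrix written into a point attribute 4@xform and the object name into s@name (both attribute names are made up for illustration):
         // Point wrangle, run over points: decompose a world matrix into
         // the P / orient / scale attributes used for point instancing.
         matrix m = 4@xform;
         @P       = cracktransform(0, 0, 0, {0,0,0}, m);   // translation
         v@scale  = cracktransform(0, 0, 2, {0,0,0}, m);   // scale
         vector r = cracktransform(0, 0, 1, {0,0,0}, m);   // rotation as Euler angles in degrees
         vector rr = set(radians(r.x), radians(r.y), radians(r.z));
         p@orient = eulertoquaternion(rr, 0);              // quaternion, XYZ rotate order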
  5. I'm having some trouble getting my clustered pyro sim to have collisions. It's a custom setup where I make a couple of containers with instancing at the start frame, and I've made a small change in the Gas Resize Fluid Dynamic DOP to resize each container separately based on $OBJID. The problem is that when I add collision objects, my density just doesn't show up or source anymore. When I use the shelf tool to make a static or deforming collision object, I don't get any smoke, or I get it in weird places where the containers overlap. And when I use a Source Volume node set to Collision, all I get is what appears to be the Source field, but there is no density coming from it. When I disable the Source Volume node, density sources and moves normally again. I've tested it in small scenes with basic geometry where I use shelf tools to set up the clustering and collision, and there it seems to work. The Source Volume node gives me the same issue in those tests too, except when there's just one container (so no clusters); then it works just fine. Since the shelf tool works, my guess is that I have to change some setting somewhere so that it works with instancing. I've been comparing the working test scene with my scene, but I haven't found anything yet. I'm afraid I won't see what I need to change even if it's under my nose. I've added the .hip file, but beware: it makes quite a bit of geo, so expect some loading time when you dive into the SURFACE node. Thanks for any help! Eckhart TerrainDemat.v013.hip
  6. Hello! I'm currently working on a shot where a rocket takes off. I've gotten the setup working quite well for the start and for the look of the spreading fluid; I now need the rocket to take off further. This introduced the problem that my container becomes gigantic, since the bottom of the container spreads horizontally while the trail is just narrow. That's why I dug into instancing over the last few days and got quite good results. The issue I'm failing at is transferring values between my clusters. I take the smoke object from the current iteration, add 1 and subtract 1 to get the adjacent smoke objects, and merge them. I took the setup from the incredibly generous Florian Bard (http://flooz-vfx.com/ - it's the rabbit_trail.hip file where he does that), but when trying to apply it to my particular case I couldn't get it to work. I tried various ways of importing the other smoke objects, but apparently it's not fetching them. This is an image of my latest version (v67) showing the cut. Attached are also some dailies demonstrating the issue. It seemed to be fine before I introduced dissipation in my sim, but v67 (pyro_daily_v67_[1001-1105].mp4) now has this very apparent cut where the smoke objects meet. (The look I'm going for is v38 (pyro_daily_v38_[0980-1160].mp4) - that was before the sim needed to be bigger and I moved to instancing.) That's the basic gist of what I do with the clusters: import the current one, merge with the next, merge with the previous, then smooth out based on voxel proximity to the border. Here is the HIP as well if taking a peek could help. GoFullApolloDreizehn_In_v243o.hip [Also, for some reason my smoke object clusters start at 2. In my scene I have only two, and they are named smokeobject1_2 and _3, which I can't quite understand.] Thank you in advance, Martin
  7. Hi guys, I am currently building an environment for which I want to instance a lot of Megascans rocks. I am using the Instance object so that I can instance thousands of the rocks without a huge hit. The problem is that as soon as I turn on displacement on the rocks, everything comes to a halt. It looks like Mantra is placing all the objects and then dicing and displacing the geometry. Is there a way of dicing and displacing the geo first? Should I just bake the displacement into the geo?
  8. Hello, I recently went through Steven Knipping's tutorial on Rigid Bodies (Volume 2), and now I have a nice scene with a building blowing up. The simulation runs off low-res geometry, and Knipping then uses instancing to replace these parts with high-res geometry that has detailing in the break areas. The low-res geometry has about 500K points as packed disk primitives, so it runs pretty fast in the editor. It also gets ported to Redshift in a reasonable amount of time (~10 seconds). However, when I switch to the high-res geometry, as you might guess, the 10 seconds turn into around 4 minutes, with a crash rate of about 30% due to memory. When I unpacked it I think it was 40M points, which I can understand is slow to write/read in the Houdini-Redshift bridge, but is there no way to make Redshift use the instancing and the packed disk primitives? My theory is that RS unpacks all of that and that's why it takes forever, because when I unpack it beforehand it works somewhat faster - at least for the low-res; the high-res just crashes. I probably don't understand how Redshift works and have the wrong expectations. It would be nice if someone could give me an explanation. Attached you'll find an archive with the scene, with one frame (frame 18) of the simulation included, as well as the folder for the saved instances of low- and high-res geometry. Thanks a lot for your help, Martin (Here's a pic of the low-res geometry; the high-res is basically every piece broken down into ~80 times more polygons.) Martin_RBDdestr_RSproblem_archive.zip
  9. Hi! How do I make instances of deformable geometry, with a deformer (wire, lattice, …), on the surface of an animated object? I want to make, for example, "hair" out of custom geometry, or algae. The copies should sit on the animated surface while using minimal memory.
  10. Hi, so I made an RBD sim where I had only 200 unique pieces, then I made an RBD sim using hundreds of copies of each of these pieces and cached the simmed points. After this, I want to place my unique pieces onto these points. What's the best and most optimal way to do this for the viewport and the renderer? I have seen many different techniques and I can't tell which one is best. Should I just use the Instance SOP? Should I save the first frame of the sim and use that instead? I can transform the pieces using the Transform Pieces SOP, but I don't know how to bring them in optimally. Should the pieces be brought in as packed geometry, packed fragments, packed disk geo, or something else? I want Houdini to know that everything is coming from these 200 unique pieces so it can handle them more efficiently in the viewport and the renderer. I noticed that if I save my pieces to separate files as packed fragments, Houdini saves the entire packed geometry into each file, which takes a lot of space. How do you do this? Thanks
  11. Hi, I have some particles onto which I was trying to instance/copy lights. I want to control their intensity based on random point/id numbers, so each particle gets a different intensity, but I couldn't come up with a way to control the light intensity for each individual particle separately. I had the idea of using copy stamping after my popnetwork node, but that needs something like an Object Merge, which unfortunately doesn't support lights. I would like to know if there is any way to import a light into my nodes with something like an Object Merge. Any idea how I can do that, or how I can control each individual light's intensity based on a random id/point number? Thanks
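     One hedged way to approach the per-particle intensity, assuming the lights are brought in with full point instancing (where a point attribute named after a parameter on the instanced object overrides that parameter), is to write a randomized attribute after the popnet; the attribute name and range here are assumptions, not verified against a particular light type:
         // Point wrangle on the particles the lights are instanced onto.
         // "light_intensity" is assumed to match the instanced light's parameter name.
         f@light_intensity = fit01(rand(@id), 0.2, 2.0);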
  12. Hi, could anyone tell me why these box instances do not appear until frame 94? Thanks for your help. instance attempt.hipnc
  13. Hi guys, if anyone has done this and has a solution or any advice, that would be great and hugely appreciated. It is fairly easy to do all of this if the volumes you want to instance are within your hip scene. However, once you are only referencing them from disk, I am struggling to run the CVEX over each instanced volume (I am using 20 different variations, hence the "instancefile" attribute). I am also trying to volume-sample each instance to pull the surface, rest, etc. primitives into the CVEX context to drive my displacement. Many thanks, Tom
  14. Hi, does anybody have experience exporting ASS files from Houdini to render them in Maya while preserving instancing? Right now HtoA seems to bake in the instances and the ASS files are huge; instancing is apparently not preserved. Any ideas? Thanks, Juraj
  15. Hey guys, I've got an issue in DOPs with plugging a Source Volume node into the pre-solve input of a smoke solver when using instancing/partitioning. From the scene attached, here are the different scenarios:
     • `fluidsource1/enable_partitioning` disabled, source volume plugged into the post-solve input: WORKS.
     • `fluidsource1/enable_partitioning` disabled, source volume plugged into the pre-solve input: WORKS.
     • `fluidsource1/enable_partitioning` enabled, source volume plugged into the post-solve input: WORKS.
     • `fluidsource1/enable_partitioning` enabled, source volume plugged into the pre-solve input: FAILS; the container is entirely filled with a density of 1 at a frame after the expected one.
     I would like to plug the source volume into the pre-solve step so the velocities advect the density on the creation frame; am I correct to believe that this is a valid approach? If so, what's going wrong here? Also, if instead of sourcing the volume on a single frame a continuous emission is performed by turning off the `sparse_emission` switch, then it works fine again. Finally, I thought of scaling the volume density by 0 outside the single emitting frame defined for each instance, but I don't even know how to do that procedurally with volumes named "density_0", "density_1", ..., "density_n", and I would have to do the same for the velocity sources and every other field to source, which is suboptimal. I'd appreciate any help! smoke_presolve_instancing.hipnc
  16. I recently built an animating instanced-geometry system. The instancefile attribute is used to select between different files, there is subframe instancefile selection information as well, and the feature for subframe motion blur is enabled on the SOP. I am unable to achieve motion blur with this setup. I was under the impression that if Mantra received subframe information of the same geometry (same number of vertices and points, with the same numbering and attributes), it would be able to compute the motion blur. At the moment I don't have a velocity attribute associated with the loaded geometry, and adding one would be an undesirable way to achieve motion blur. Any help would be appreciated. I can post a .HIP file once I simplify the method down to a few nodes.
  17. MultiPoint Particle

    Hello guys, I would like to know if it is possible to create multi-point instancing with FLIP fluids or POP particles in Houdini, where one point drives multiple points. It's the same thing as when you activate MultiPoint in Maya. Cheers, Sylvain. PS: an example is enclosed.
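    If the goal is simply one source particle spawning several render points, a point wrangle after the popnet can do it; the copy count and scatter radius below are arbitrary placeholders:
        // Point wrangle, run over points: each particle spawns several jittered points,
        // roughly mimicking Maya's MultiPoint behaviour. Count and radius are made up.
        int   copies = 5;
        float spread = 0.1;
        for (int i = 0; i < copies; i++)
        {
            float rx = rand(@ptnum * 131 + i);
            float ry = rand(@ptnum * 577 + i);
            float rz = rand(@ptnum * 919 + i);
            vector offset = (set(rx, ry, rz) - 0.5) * 2.0 * spread;
            int npt = addpoint(0, @P + offset);
            setpointattrib(0, "v", npt, v@v);   // inherit velocity if the particles carry it
        }
        removepoint(0, @ptnum);   // optionally drop the original particle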
  18. Hi, I've been playing around with instancing and I haven't managed to change the animation start frame on a per-instance basis. For example, if I instance a cube with an animation (keyframes or a bgeo sequence) onto points, all the instances play the animation the same way (see file below). How can I change the animation start frame per instance to randomize things a bit (be it keyframes or a bgeo cache)? Thank you in advance. Ryuji InstanceAnim.hipnc
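     For the bgeo-sequence case, one way to get a per-instance start frame is to build the instancefile path per point in a wrangle; the sequence path, its length, and the offset range below are placeholders:
         // Point wrangle, run over points: each instance reads the cached sequence
         // with its own random start frame. Path and numbers are hypothetical.
         int seqlen = 100;                                     // assumed sequence length
         int start  = int(rand(@ptnum) * 50);                  // random per-point start frame
         int frame  = clamp(int(@Frame) - start, 1, seqlen);   // hold ends outside the range
         s@instancefile = sprintf("$HIP/geo/cube_anim.%04d.bgeo.sc", frame);
     For keyframed animation with no cache on disk, a per-point offset attribute like the start value above could instead drive a Time Shift inside a for-each loop over the instance points.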
  19. Hi, I'm trying to write a displacement shader that takes a curve and deforms geometry (in my case a grid) to follow the curve. I can do it if the curve is simple, has its origin at 0,0,0, and goes in one direction. It gets more difficult once the origin is off 0,0,0 and the curve goes in different directions. What's the easiest and simplest way to do this?