Posts posted by Atom

  1. Another way to control memory usage is to limit domain expansion. A sparse domain can easily get out of control as it enlarges itself to contain the pyro, consuming more memory with every new voxel it allocates (the voxel count scales with the bounding size divided by the division size). For example, missile trails that shoot off in all directions are a bad idea for a single simulation. Perhaps you need to break yours down into smaller, manageable simulations that fit within your physical memory? Or turn up Dissipation to remove old values from the simulation more quickly?
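As a sanity check, you can estimate the dense-equivalent footprint of a domain from its bounding size and division size. The sketch below is a rough upper bound, assuming 32-bit float voxels and a handful of active scalar-field-equivalents (density, temperature, vel components, etc.); a sparse sim will typically use less, but it grows at the same rate, so halving the voxel size still multiplies memory by eight.

```python
import math

def pyro_memory_gb(size, voxel_size, num_fields=5, bytes_per_voxel=4):
    """Rough dense-equivalent memory estimate for a pyro domain.

    size: (x, y, z) bounding size in world units.
    voxel_size: edge length of one voxel (bounding size / divisions).
    num_fields: scalar-field-equivalents held by the sim (an assumption).
    """
    nx, ny, nz = (math.ceil(s / voxel_size) for s in size)
    voxels = nx * ny * nz
    return voxels * num_fields * bytes_per_voxel / (1024 ** 3)

# Halving the voxel size multiplies the estimate by 8:
print(round(pyro_memory_gb((10, 10, 10), 0.05), 3))   # 0.149 (GB)
print(round(pyro_memory_gb((10, 10, 10), 0.025), 3))  # 1.192 (GB)
```

This is why a trail that keeps expanding the bounds is so punishing: the cost is cubic in resolution, not linear.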

  2. On 6/2/2021 at 5:04 PM, catchyid said:

    any ideas?

    Maybe try turning on the Alfred Style Progress. I think that may kick out a frame-by-frame log to the console. Not sure how to intercept the console on the fly, however...


    -> Try placing Python code inside the Pre-Render or Post-Render script field of the File Cache or ROP (there are also Pre-Frame and Post-Frame script fields if you need it per frame). You could call hou.frame(), or use the hou.playbar module, to grab the current frame. Remember to switch the field type from Hscript to Python.


    What if you updated your farm to use a TOPs-based solution?

    Set the File Cache to Save Current Frame, then loop over the frame range in TOPs, setting a new per-frame geometry filename on every iteration. That would let you drop in a Python Script TOP that runs some code every frame.
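A minimal sketch of what such a script-field callback could do. Houdini's hou module only exists inside a session, so the frame getter is injected here to keep the sketch runnable on its own; in an actual Pre-Frame or Post-Frame script you would pass hou.frame. The function name and message format are my own, purely illustrative.

```python
def report_frame(get_frame, log=print):
    # get_frame: callable returning the current frame number.
    # Inside Houdini this would be hou.frame; it is injected here
    # (an assumption of this sketch) so the code runs anywhere.
    msg = "finished frame %d" % int(get_frame())
    log(msg)
    return msg

# Outside Houdini, fake the frame source:
report_frame(lambda: 24.0)  # prints "finished frame 24"
```

In the script field itself the body would collapse to a one-liner along the lines of print("finished frame", int(hou.frame())), which is enough to give the farm a per-frame heartbeat.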



  3. OK, here is an attempt at tomogachy's suggestion: Sweep before the DOP network, then Point Deform after. Scale Along Length doesn't really work right, because it needs to know the pscale of the point where the connection occurs and use that value as the start of the taper. Also, some points just don't mesh well and seem stuck to their neighbors.



  4. I tried adding leaves, but the branch connectors dance around. I'm not sure this setup will work with more branches; when I enable the secondary branches, the bonecapturebiharmonic goes off into la-la land. You may want to come up with an overall mass concept for the tree and divide the branches into various weights, because with more secondary branches they quickly weigh down the parent branch.



  5. Here's another version where the scatter changes its seed per frame. A popnet is added to sample from that changing source; the popnet will birth, age, and kill points. Base the pscale on the age and you can fade points in and out over the particle life.
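In a POP wrangle that fade would be driven by @age and @life; here is the same ramp as a plain Python sketch (the 20% fade fraction is an arbitrary assumption of mine, not a value from the scene):

```python
def fade_pscale(age, life, base_scale=1.0, fade=0.2):
    # Normalized age t runs 0..1 over the particle's life.
    # Scale ramps up over the first `fade` fraction of life,
    # holds at full size, then ramps back down over the last
    # `fade` fraction, so points never pop in or out.
    t = max(0.0, min(1.0, age / life))
    fade_in = min(1.0, t / fade)
    fade_out = min(1.0, (1.0 - t) / fade)
    return base_scale * min(fade_in, fade_out)

print(fade_pscale(0.0, 2.0))  # 0.0  (just born)
print(fade_pscale(1.0, 2.0))  # 1.0  (mid-life, full size)
print(fade_pscale(0.1, 1.0))  # 0.5  (halfway through fade-in)
```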



  6. This version references a CHOP wave in sample-and-hold mode. The Scatter fetches this CHOP value for its seed.

    Here's a quick attempt at synchronizing the scatter change with the color fade. A Lag CHOP is added to the network to generate a smoothed source that is supplied to the Color node.



    You can adjust the period of the wave1 node to change the speed of both outputs, and use the phase to align the red waveform with the purple one.
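For reference, both CHOPs are doing something trivial to the channel. A sketch of the two filters, assuming one sample per frame and modelling the lag as a simple one-pole smoother (the Lag CHOP's actual filter has more controls; this is just the idea):

```python
def sample_hold(samples, hold):
    # Freeze the channel: repeat the sample from the start of each
    # block of `hold` frames. This stepped signal drives the seed jump.
    return [samples[(i // hold) * hold] for i in range(len(samples))]

def lag(samples, coeff=0.5):
    # One-pole smoothing: each output chases the input, which is
    # roughly the smoothed source feeding the color fade.
    out, y = [], samples[0]
    for x in samples:
        y += coeff * (x - y)
        out.append(y)
    return out

steps = sample_hold([1, 2, 3, 4, 5, 6], 3)
print(steps)       # [1, 1, 1, 4, 4, 4]
print(lag(steps))  # ramps smoothly toward each held value
```

Because the lag is fed the held signal, the color ramp always starts exactly when the seed jumps, which is the synchronization the setup is after.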







  7. Zero does work; as you can see, there is no error in my screenshot. Make sure your Attribute Wrangle is running in Detail mode.

    @Ultraman That's a much simpler solution, but what if I want to change every 48 frames? round() will produce a change at frame 24 as well.


  8. You can brute-force this by using code to drive the scatter seed.

    It's crude, but it works.

    if(@Frame>=1 && @Frame<48){ f@seed = 0.1;}
    if(@Frame>=48 && @Frame<96){ f@seed = 0.2;}
    if(@Frame>=96 && @Frame<144){ f@seed = 0.3;}
    if(@Frame>=144 && @Frame<192){ f@seed = 0.4;}
    if(@Frame>=192 && @Frame<240){ f@seed = 0.5;}

    So every 48 frames a new seed is generated as a detail attribute.

    Drive the global seed of the Scatter by fetching that value with the Hscript detail() expression.
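The ladder of ifs collapses into a single floor expression; in VEX that would be roughly f@seed = 0.1 * (floor(@Frame / 48) + 1);. The same logic as a testable Python sketch:

```python
def seed_for_frame(frame, block=48, step=0.1):
    # A new seed every `block` frames: frames 1-47 -> 0.1,
    # 48-95 -> 0.2, 96-143 -> 0.3, ... matching the chained
    # ifs above, but with no upper limit on the frame range.
    return step * (frame // block + 1)

print(seed_for_frame(1))    # 0.1
print(seed_for_frame(48))   # 0.2
print(seed_for_frame(200))  # 0.5
```

Unlike the hard-coded version, this keeps producing fresh seeds past frame 240.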


  9. Try cross-animating your temperature field. The position of the spheres is irrelevant, but you can animate one close to the other to sell the ignition effect. One thing to keep in mind is that the noise you are adding to your temperature field already exceeds the low ignition value of 0.1. Bump that value up on the solver, or don't add noise to the temperature field.


  10. I watched the video tutorial that demonstrates how to convert the CMU mocap database into a .bgeo.sc file. The format is strange, with a result that looks like one skeleton per frame.

    This small setup translates the converted mocap data into something similar to the output of the FBX Animation Import node, so it can connect to the RigMatchPose.

    The FBX characters were generated by the latest version of MakeHuman, 1.2. Each character is exported in the T-pose using the CMU bone skeleton. This allows you to switch freely between characters and mocaps.



  11. It may be that the folder structure doesn't exist the same way in Maya as it does in Houdini. Perhaps an embedded HDA in Maya doesn't have write privileges to the disk? Verify that your target cache folders exist and that your HDA has access to them.

    If you are exporting a simulation, make sure to activate Initialize Simulation OPs on each of the ROPs. That should force a cook on export.

  12. I guess I would start with: how many unique agents do you have? You only need a proxy for each unique agent. Then leverage the point-instancing system of whatever renderer you are using to deploy each agent.