toadstorm last won the day on August 12

toadstorm had the most liked content!

Community Reputation

288 Excellent

About toadstorm

Personal Information

  • Location
    San Jose, CA

  1. Assigning material to prim group

    Generally you'd just use the group mask parameter on the primitive wrangle itself, rather than a conditional statement. If you have to use a conditional, though, it'd look like this:

        if (inprimgroup(0, "sphere", @primnum)) {
            s@shop_materialpath = "whatever";
        }

    This wouldn't let you use wildcards, though. It's better to just use the group mask parameter on the wrangle.
  2. transferring Cd to deforming geo

    Here's an option that doesn't necessarily require a solver (unless the surface you're transferring from eventually animates away from the arm). You can freeze the arm in place and animate your shape over it to turn points white, then afterwards use Point Deform to make the points follow the original animation.

    The reason your colors were getting screwed up is that you were running Grain Source on a moving object, which means your points were being regenerated every frame, not necessarily in exactly the same order or number. This way you only compute that source once and then apply the original animation onto those points, which has the added advantage of being much quicker to evaluate.

    color_transfer_deforming_geo_toadstorm.hiplc
  3. QWidget window resizing

    I don't know anything about that tutorial, but the parent widget, whatever it is, needs to be enclosed in some kind of QLayout (QHBoxLayout, QVBoxLayout, etc.) if you want it to automatically stretch to fit the parent container.

    You'll have better luck with Qt stuff in general if you give up on QDesigner and just start coding your interfaces by hand. It takes a bit of practice, but QDesigner is a bit of a crutch.
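    As a minimal hand-coded sketch of that layout idea (assuming PySide2, which ships with recent Houdini builds; the widget and class names here are made up for illustration), placing the child widget in a QVBoxLayout is what makes it resize with its parent:

    ```python
    # Minimal PySide2 sketch (assumption: PySide2 is available, as in recent
    # Houdini builds). A child widget added to a QVBoxLayout stretches to
    # fit its parent window automatically; no manual resizing needed.
    from PySide2 import QtWidgets

    class MyPanel(QtWidgets.QWidget):
        def __init__(self, parent=None):
            super(MyPanel, self).__init__(parent)
            layout = QtWidgets.QVBoxLayout(self)  # layout manages this widget's children
            self.editor = QtWidgets.QTextEdit(self)
            layout.addWidget(self.editor)         # editor now stretches with the panel

    if __name__ == "__main__":
        app = QtWidgets.QApplication([])
        panel = MyPanel()
        panel.resize(400, 300)  # the editor follows this resize via the layout
        panel.show()
        app.exec_()
    ```

    Inside Houdini you'd parent the panel to the main window instead of creating a QApplication, but the layout rule is the same.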
  4. Get current context path with python

        import toolutils
        network = toolutils.networkEditor()
        path = network.pwd().path()
  5. @orient attribute to @N and @up?

    Another quick way to do this is to just rotate world N (+Z) and world up (+Y) by your orient quaternion:

        v@up = qrotate(p@orient, {0,1,0});
        v@N = qrotate(p@orient, {0,0,1});
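    To see what those two lines compute, here's a hypothetical pure-Python stand-in for qrotate() (Houdini stores quaternions in (x, y, z, w) order). It's a sketch for intuition, not Houdini code:

    ```python
    # Pure-Python stand-in for VEX qrotate(): rotate a vector by a unit
    # quaternion stored as (x, y, z, w), matching Houdini's component order.
    import math

    def qrotate(q, v):
        # v' = v + w*t + (q.xyz x t), where t = 2 * (q.xyz x v)
        x, y, z, w = q
        vx, vy, vz = v
        tx = 2.0 * (y * vz - z * vy)
        ty = 2.0 * (z * vx - x * vz)
        tz = 2.0 * (x * vy - y * vx)
        return (
            vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx),
        )

    # a 90-degree rotation about +Y: quaternion (0, sin(45deg), 0, cos(45deg))
    s = math.sin(math.pi / 4)
    c = math.cos(math.pi / 4)
    N = qrotate((0.0, s, 0.0, c), (0.0, 0.0, 1.0))   # world +Z rotates to +X
    up = qrotate((0.0, s, 0.0, c), (0.0, 1.0, 0.0))  # world +Y is unchanged
    ```

    In other words: N and up are just the world axes pushed through the same rotation that orient encodes.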
  6. trying to understand primuv function

    This is because, I assume, you're doing this in a point wrangle. The problem is that a point wrangle is inherently a for-each loop over every point, so every one of your points is looping over all 20 primitives and stopping on the 20th one. Instead, assuming you have 20 points and 20 prims:

        vector uv = set(0.5, 0.5, 0);
        @P = primuv(1, "P", @elemnum, uv);

    @elemnum, @ptnum, @primnum, etc. are automatic attributes inside the wrangle that tell you which point, prim, or whatever you're currently iterating on.
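    In Python terms, the implicit loop a point wrangle runs looks roughly like this (a sketch for intuition only; run_point_wrangle and sample_prim are made-up stand-ins, not Houdini APIs):

    ```python
    # Sketch of the implicit loop a point wrangle runs: the wrangle body
    # executes once per point, with @elemnum bound to that point's number.
    # run_point_wrangle and sample_prim are hypothetical, not Houdini APIs.
    def run_point_wrangle(npoints, body):
        results = []
        for elemnum in range(npoints):  # this loop is what the wrangle does for you
            results.append(body(elemnum))
        return results

    def sample_prim(primnum, uv=(0.5, 0.5, 0.0)):
        # stand-in for primuv(1, "P", primnum, uv)
        return ("P of prim", primnum, uv)

    # each point samples the primitive matching its own number,
    # rather than every point ending up on the last primitive:
    samples = run_point_wrangle(3, lambda elemnum: sample_prim(elemnum))
    ```

    Writing your own for loop inside the wrangle body just nests a second loop inside this implicit one, which is why every point ended up on the 20th primitive.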
  7. trying to understand primuv function

    Well, you're not really defining what "prim" and "primuv" are. You're initializing them as variables, but you have to actually set them to something for them to be useful. primuv() usually works alongside xyzdist()... xyzdist() can fetch the "prim" and "primuv" values from the nearest point on the surface, and then you can pass those values along to primuv().

    It's a little confusing because xyzdist() has some quirks in its arguments. If you look at the docs for xyzdist(), there's a little "&" following the prim and primuv arguments. This means that even though xyzdist() officially returns a float (the distance to the surface from your lookup position), it can also write to those "&" arguments. So the full code would look a little more like this:

        int prim;
        vector primuv;
        float dist = xyzdist(1, @P, prim, primuv); // this actually WRITES to prim and primuv!
        vector sample = primuv(1, "P", prim, primuv); // fetch P at the prim and primuv coords supplied
        v@P = sample;
  8. Offset time on Copy to Points

    Hmm... what you'd probably want to do here is start by creating an incrementing point attribute on your template points that represents what frame in the sequence you want each copy to load from. For this example, let's call it i@frame. Once you have that, you could create a string primitive attribute on your packed primitives (after the copy operation) that would point to the full path of the texture on disk:

        s@texture_path = sprintf("/path/to/textures/root/texture_%g.png", i@frame);

    Now that you have this as a primitive attribute, you could leverage material stylesheets to override the emissive texture (or whatever texture) on the material applied to your sparks. Stylesheets are a bit cumbersome to use in Houdini, but they're very powerful and they'll work with any render engine.

    Once you have a material assigned to your object, go to Inspectors > Data Tree > Material Style Sheets, then create a new stylesheet on the geometry container that your sparks live in. Then right-click the stylesheet in the Data Tree and "Add Style". You can then add a Target of type "Primitive", and then an "Override Script" of type "Material Parameter". Then change the "Override Name" column to the name of the parameter you want to override in your material... if you hover over the name of the material parameter in your shader network, you should see the parameter name. In your case it's probably going to be the emission color texture, which is "emitcolor_texture" in a Principled Shader. The "Override Type" should be "Attribute Binding", because you're using the attribute you created earlier to override the material value. Then set the Override Value to the name of the attribute you created: "texture_path". Your stylesheet should look something like the attached screenshot.

    Once that's done, if you render (and your material is set to be emissive, emissive textures are enabled, and your emission color is 1.0), you should see a unique texture applied to each spark.
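    If it helps, here's a hypothetical pure-Python sketch of those first two attribute steps (the sequence length and root path are placeholders, and frame_for_point / texture_path are made-up names, not Houdini functions):

    ```python
    # Hypothetical sketch of the two attributes described above: an
    # incrementing frame index per template point, then the texture path
    # string built from it (mirroring the sprintf() call in the wrangle).
    def frame_for_point(ptnum, start_frame=1, length=24):
        # loop the sequence so every copy stays within the available frames
        return start_frame + (ptnum % length)

    def texture_path(frame, root="/path/to/textures/root"):
        # same formatting as sprintf("/path/.../texture_%g.png", i@frame)
        return "%s/texture_%g.png" % (root, frame)

    # one path per template point, just like the primitive attribute per copy
    paths = [texture_path(frame_for_point(pt)) for pt in range(3)]
    ```

    The stylesheet's attribute binding then does the last step at render time: each packed prim's texture_path string replaces the texture parameter on its own copy of the material.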
  9. Sure... assuming you're emitting particles from a source in some given direction, you could project the particles onto the surface using minpos() in a POP Wrangle to find the closest position on a mesh to a given point in space. Another way to do this is to start with an object that has well-defined UV texture coordinates. You can simply set @P = v@uv in a Point Wrangle (might need to promote the UVs to Point first), solve the simulation, then use Point Deform to warp your finished simulation from the UV-coordinate space back to the original world space.
  10. The most straightforward way to do this would be to create a parameter on some node and then reference that parameter value in place of $F. For example, rand(ch("/obj/null1/parm")).
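    The effect of swapping $F for a constant parameter can be sketched in plain Python (rand_vex_like is a made-up stand-in for VEX rand(), not its actual algorithm):

    ```python
    # Stand-in for VEX rand(): a deterministic value in [0, 1) per seed.
    # This is NOT Houdini's actual hash, just an illustration of seeding.
    import random

    def rand_vex_like(seed):
        return random.Random(seed).random()

    # seeding with the frame number gives a different value every frame:
    per_frame = [rand_vex_like(f) for f in (1, 2, 3)]

    # seeding with a fixed parameter value freezes the result across frames:
    frozen = [rand_vex_like(7.0) for _ in range(3)]
    ```

    So rand(ch("/obj/null1/parm")) stays constant until you change the parameter, whereas rand($F) re-rolls on every frame.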
  11. POP Grains is the way to resolve intersections for particles with different pscales. The alternative is Vellum grains, which are certainly viable but might be a little slower. POP Interact can only apply forces, which are not going to get you clean collisions. pop_advect_grains_toadstorm.hiplc
  12. POP Grains will resolve intersections. If you need the collisions to be "softer", you can adjust the collision stiffness. To get FLIP-like motion, consider running a smoke simulation first, caching out the velocity field, and then advecting your POP grains through that field via POP Advect by Volumes.
  13. Instance rotations in asset

    I'm not sure I understand the problem. If your cogs are packed geometry, and you're rotating them in Houdini, those rotations should export via VAT, just like rigid body simulations would.
  14. Instance rotations in asset

    You could consider using the Labs Vertex Animation Textures ROP. This will bake your animation into a texture that describes the transform matrix of each cog as RGBA data. You'll have to build a shader inside UE4 to have this work properly, but the ROP will handle most of that for you.