Everything posted by toadstorm

  1. How to use RBD constraints in DOP?

    https://www.tokeru.com/cgwiki/index.php?title=ConstraintNetworks
  2. Using attributes as parameter

    I'm not exactly sure what you're getting at with the Thickness Ramp. That's for spine profiles, and you're not using one of those. If you were extruding faces along a curve and wanted to control the extrusion thickness over that length, that's the control you'd use. If you *are* trying to use this to drive extrusion inset along a spline, you can just draw that ramp and then enable use of a Thickness Attribute, and pass along your attribute like before. In the file you sent me, all you need to do is set the Local Control > Inset Scale attribute to "inset". inset_by_attribute_2.hip
  3. Using attributes as parameter

    Here's a quick example... not sure if this is exactly what you're looking for. I'm just building a ramped attribute and using that as the inset mask attribute. inset_by_attribute.hip
  4. CopyStamp texture path not working

    There were a few little things wrong in your approach. First, don't copy stamp. Copy stamping is dead. Second, you need to create a string attribute for the Material SOP to override... Redshift is dumb and their texture sampler doesn't have an actual parameter for the texture path that you can override, so you have to instead manually create a spare parameter on the Material Builder, and then channel reference the RS Texture Sampler to follow that attribute. Then you can tell the Material SOP to override that value, after the copies are made. The override is meant to apply to the finished objects; you don't need to do anything before the copy operation other than create your range of random texture values. Here's your file back with the changes made. Wall_paper_02_toadstorm.hip
  5. Automate File Naming Convention?

    Throw this in your Python Source Editor and run it:

        import hou
        import os

        fbx_rops = [f for f in hou.root().allSubChildren() if f.type().name() == "rop_fbx"]
        for rop in fbx_rops:
            outparm = rop.parm("sopoutput")
            if "_fixed" not in outparm.eval():
                basepath = os.path.splitext(outparm.unexpandedString())[0]
                newpath = basepath + "_fixed.fbx"
                outparm.set(newpath)
  6. Vellum sim on part of KineFX skeleton

    It's hard to say exactly what's going wrong without seeing a file, but my guess is that you're not properly applying the orientations of your Vellum sim back to the transforms of your skeleton.

    When you run the Vellum simulation, assuming you're using a constraint type like "String" that supports orientation, the p@orient value of your input points needs to match the KineFX joints. You can do this pretty quickly by converting your 3@transform attribute (this is in world space and common to all KineFX joints):

        p@orient = quaternion(3@transform);

    Then feed this into your Vellum String configuration and simulate. I'm using Pins to keep some of the original joints attached to their animation. The p@orient attribute will update during the simulation.

    To turn these results into something useful post-sim, we have to take this computed orientation and convert it back into the 3@transform attribute. You start with a default matrix, make sure its scale matches that of the original, then rotate it to match the orientation of your simulated points:

        // convert the world space orient quaternion (from Vellum) back to a transform matrix
        vector scale = cracktransform(0, 0, 2, 0, 3@transform);
        matrix3 xform = ident();
        scale(xform, scale);
        matrix3 rot = qconvert(p@orient);
        3@transform = xform * rot;

    Now that that's done, you can connect your skeleton into the Bone Deform SOP and it ought to work. I'm attaching an example HIP file... you'll have to load your own FBX, define a clip name and pick new pin numbers on the Vellum Constraint. vellumcat.hip
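    Under the hood, the quaternion-to-matrix conversion (and back) is just a bit of arithmetic. A pure-Python sketch of that math, not Houdini code, and only valid away from the 180-degree edge case:

```python
# Pure-Python sketch of what quaternion(3@transform) and qconvert(p@orient) compute:
# a 3x3 rotation matrix converted to a quaternion and back. Illustrative math only;
# this simple branch breaks down for 180-degree rotations (w approaches zero).
import math

def matrix_to_quat(m):
    # m is a row-major 3x3 rotation matrix (list of 3 lists).
    # Returns (x, y, z, w), matching VEX's vector4 quaternion layout.
    trace = m[0][0] + m[1][1] + m[2][2]
    w = math.sqrt(max(0.0, 1.0 + trace)) / 2.0
    x = (m[2][1] - m[1][2]) / (4.0 * w)
    y = (m[0][2] - m[2][0]) / (4.0 * w)
    z = (m[1][0] - m[0][1]) / (4.0 * w)
    return (x, y, z, w)

def quat_to_matrix(q):
    x, y, z, w = q
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ]

# 90-degree rotation about Y should round-trip cleanly:
Ry = [[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]]
q = matrix_to_quat(Ry)
back = quat_to_matrix(q)
```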
  7. Make an HDA out of the relevant assets, then load that HDA into your other scenes.
  8. Redshift rendering Zdepth

    There's nothing wrong with your depth pass... it's just that the depth is being displayed in scene units, so the pixel value of the furthest objects is about 11 (the background is 4000). If you turn the exposure slider all the way down, you should be able to see values. Or just take a look at the RGBA values in the monitor as you mouse over the pixels.
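    To make a raw depth pass readable without touching exposure, you can remap it to 0-1 yourself. A minimal sketch of the remap; the near/far and pixel values here are placeholders, not taken from any real render:

```python
# A raw Z-depth pass stores distances in scene units, so pixel values like 11 or
# 4000 both display as pure white at default exposure. Remapping the depth range
# into 0..1 makes it viewable. Near/far and the sample pixels are made up.
def normalize_depth(depth, near, far):
    # Clamp to the near/far range, then remap linearly to 0..1.
    d = min(max(depth, near), far)
    return (d - near) / (far - near)

pixels = [11.0, 250.0, 4000.0]  # e.g. nearest object, midground, background
viewable = [normalize_depth(d, near=0.0, far=4000.0) for d in pixels]
```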
  9. The easiest way to do this is probably via Copy to Points. Just object merge your Object B into a network, define the point you want as a point group, then Copy to Points to that point group. Your line will automatically align its local +Z axis to the point normal of your template mesh.
  10. how to create voronoi morph

    This would be a MOPs way to do it... less VEX but a few more nodes involved to get the scale fake working as intended. If you could guarantee an equal number of chunks on both sides of the equation, it'd be easier to blend transforms from A to B via MOPs Apply Attributes, but it's hard to guarantee chunk counts with voronoi. voronoi_morph.hip
  11. You can use a Transform VOP to transform from space:current (camera space) to space:world or space:object, then use the result for your P.y lookup.
  12. MOPs Plus

    Sorry about the repost... I realized this is probably the more appropriate forum for a commercial tool. I've just released MOPs Plus into early access. This is an add-on to the existing MOPs toolset that adds some new convenient SOP toys for typography, collision geometry, randomizing or switching textures, and a few other tricks. It also adds MOPsDOPs, an extension of the MOPs toolset into simulations that allows you to use familiar tools to manipulate RBDs, Vellum, and POPs. Since it's in early access, if you use the coupon code MOPSPLUSEA at checkout, you'll get a 25% discount. You can find more details about the plugin here: https://motionoperators.com/info/mopsplus/ Feel free to hit me up with any questions at all! Thanks to everyone who's helped me test and refine so far.
  13. MOPs Plus

    Hi everyone! I've just released MOPs Plus into early access. This is an add-on to the existing MOPs toolset that adds some new convenient SOP toys for typography, collision geometry, randomizing or switching textures, and a few other tricks. It also adds MOPsDOPs, an extension of the MOPs toolset into simulations. Since it's in early access, if you use the coupon code MOPSPLUSEA at checkout, you'll get a 25% discount. You can find more details about the plugin here: https://motionoperators.com/info/mopsplus/ Feel free to hit me up with any questions at all! Thanks to everyone who's helped me test and refine so far.
  14. I'm trying to export an ABC from Houdini with instancing enabled. I have a bunch of regular packed primitives with a Cd attribute on the points, and they're being written out to disk just fine. Instancing on the Alembic ROP makes these files very manageable. However, when loading these into Maya via the VRayMesh proxy loader, the Cd attribute isn't detected as a color set. Additionally, in Houdini, if I try to read this Alembic back in, the Cd attribute is gone, even after unpacking. I'm using 16.5.439. Anyone have any ideas?
  15. Offset time on Copy to Points

    Try setting your frame intrinsic to @Frame-@startFrame instead, and use "set" in place of "add".
  16. Offset time on Copy to Points

    Not sure just by glancing at your network, but it looks like your startFrame attribute is an int. Try changing it to a float? Also, the viewport will never show time offset sequences correctly. Make sure you render or unpack to get a proper preview.
  17. Apply rotate and scale to RBD?

    The `transform` 3x3 matrix intrinsic controls both rotation and scale. If you're assigning rotations based on quaternions, you're going to run into scale issues because quaternions can't contain scale information. The `w` attribute is for angular velocity, not rotation, so it won't help you here. What you can do is use cracktransform() in VEX to extract the scale from a matrix as a vector, then use the `scale()` VEX function on your rotated matrix to scale it back to the original value, either before or after your rotation (depending on whether you want your scaling to happen in world or local space). You could also consider using MOPs Apply Attributes to handle this for you.
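    A plain-Python sketch of that cracktransform()/scale() idea, assuming no shear so each axis scale is just the length of a matrix row. This is an illustration of the math, not Houdini API code:

```python
# Pull the per-axis scale out of a 3x3 transform before replacing it with a
# quaternion-derived rotation (which is always unit-scale), then restore that
# scale afterwards. Assumes no shear. Illustrative only, not Houdini API code.
import math

def row_lengths(m):
    return [math.sqrt(sum(c * c for c in row)) for row in m]

def rescale_rows(m, scales):
    out = []
    for row, target in zip(m, scales):
        length = math.sqrt(sum(c * c for c in row))
        out.append([c * target / length for c in row])
    return out

xform = [[2.0, 0.0, 0.0], [0.0, 3.0, 0.0], [0.0, 0.0, 4.0]]  # scaled, unrotated transform
orig_scale = row_lengths(xform)   # the "cracktransform" step: (2, 3, 4)

# a pure rotation (90 degrees about Y), like what converting a quaternion gives you;
# note its scale is (1, 1, 1) -- the original scale is gone:
rot = [[0.0, 0.0, -1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]]

fixed = rescale_rows(rot, orig_scale)  # the "scale()" step: original scale restored
```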
  18. Assigning material to prim group

    Generally you'd just use the group mask parameter on the primitive wrangle itself, rather than a conditional statement. If you have to use a conditional, though, it'd look like this:

        if (inprimgroup(0, "sphere", @primnum)) {
            s@shop_materialpath = "whatever";
        }

    This wouldn't let you use wildcards, though. It's better to just use the group mask parameter on the wrangle.
  19. transfering Cd to deforming geo

    Here's an option that doesn't necessarily require a solver (unless the surface you're transferring from eventually animates away from the arm). You can freeze the arm in place and animate your shape over it to turn points white, then afterwards use Point Deform to make the points follow the original animation. The reason your colors were getting screwed up is that you were running Grain Source on a moving object, which means every frame your points were being regenerated and not necessarily in exactly the same order or number. This way you only compute that source once and then apply the original animation onto those points, so it has the advantage of being much quicker to evaluate, too. color_transfer_deforming_geo_toadstorm.hiplc
  20. QWidget window resizing

    I don't know anything about that tutorial, but the parent widget, whatever it is, needs to be enclosed in some kind of QLayout (QHBoxLayout, QVBoxLayout, etc) if you want it to automatically stretch to fit the parent container. You'll have better luck with Qt stuff in general if you give up on QDesigner and just start coding your interfaces by hand. It takes a bit of practice but QDesigner is really a bit of a crutch.
  21. Get current context path with python

        import toolutils
        network = toolutils.networkEditor()
        path = network.pwd().path()
  22. @orient attribute to @N and @up?

    Another quick way to do this is to just rotate world N (+Z) and world up (+Y) by your orient quaternion:

        v@up = qrotate(p@orient, {0,1,0});
        v@N = qrotate(p@orient, {0,0,1});
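    A pure-Python sketch of what qrotate() does mathematically (rotate a vector by a quaternion, v' = q * v * q^-1). Illustration only, not Houdini code:

```python
# Rotate a vector by a quaternion, the operation qrotate() performs in VEX.
# Quaternion layout is VEX-style (x, y, z, w). Illustrative math only.
import math

def qrotate(q, v):
    x, y, z, w = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    # v' = v + w * t + cross(q.xyz, t)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

half = math.radians(45.0)  # quaternion for a 90-degree spin around Y
q = (0.0, math.sin(half), 0.0, math.cos(half))
n = qrotate(q, (0.0, 0.0, 1.0))   # world +Z swings around to +X
up = qrotate(q, (0.0, 1.0, 0.0))  # world +Y is unchanged by a Y rotation
```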
  23. trying to understand primuv function

    This is because, I assume, you're doing this in a point wrangle. The problem is that point wrangles are inherently a for-each loop, running once per point. This means that every one of your points is looping over all 20 primitives and stopping on the 20th primitive. Instead, assuming you have 20 points and 20 prims:

        vector uv = set(0.5, 0.5, 0);
        @P = primuv(1, "P", @elemnum, uv);

    @elemnum, @ptnum, @primnum, etc. are automatic attributes inside the wrangle that tell you which point, prim, or whatever you're currently iterating on.
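    The implicit per-point loop can be modeled in plain Python. This is just an analogy for how a point wrangle runs, with made-up sample data:

```python
# Plain-Python analogy for a point wrangle: the wrangle body executes once per
# point, with @elemnum set to that point's index. The data here is made up,
# purely to illustrate the control flow.
points = list(range(20))                     # stand-in for 20 points
prims = [(i, i * 10.0) for i in range(20)]   # stand-in for 20 prims: (id, value)

results = []
for elemnum in points:       # the loop you never see inside a wrangle
    # wrangle "body": each point samples the one prim with the matching index,
    # instead of looping over all prims itself.
    prim_id, value = prims[elemnum]
    results.append(value)
```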
  24. trying to understand primuv function

    Well, you're not really defining what "prim" and "primuv" are. You're initializing them as variables, but you have to actually set them to something to be useful. primuv() usually works alongside xyzdist()... xyzdist() can fetch the "prim" and "primuv" values from the nearest point on the surface, and then you can pass those values along to primuv(). It's a little confusing because xyzdist() has some quirks in its arguments. If you look at the docs for xyzdist(), there's a little "&" following the prim and primuv arguments. This means that even though xyzdist() officially returns a float (the distance to the surface from your lookup position), it can also write to those & arguments. So the full code would look something a little more like this:

        int prim;
        vector primuv;
        float dist = xyzdist(1, @P, prim, primuv); // this actually WRITES to prim and primuv!
        vector sample = primuv(1, "P", prim, primuv); // fetching P from the prim and primuv coords supplied
        v@P = sample;
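    The "&" output-argument pattern doesn't exist in Python, but the equivalent idea is a function that hands back several values at once. Here's a made-up stand-in for xyzdist() just to show that flow; the function and its toy "surface" data are invented for illustration:

```python
# Plain-Python analogy for VEX's "&" output arguments: xyzdist() returns a float
# but also writes into prim and primuv. In Python the same flow is a function
# returning everything at once. xyzdist_like() and its data are made up.
import math

def xyzdist_like(pos, prims):
    # prims: list of (prim_id, centroid, uv) tuples -- a toy "surface".
    best = min(prims, key=lambda p: math.dist(pos, p[1]))
    prim_id, centroid, uv = best
    dist = math.dist(pos, centroid)
    return dist, prim_id, uv  # the float return value plus the two "& outputs"

prims = [(0, (0.0, 0.0, 0.0), (0.5, 0.5)),
         (1, (5.0, 0.0, 0.0), (0.25, 0.75))]
dist, prim, primuv = xyzdist_like((4.0, 0.0, 0.0), prims)
```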
  25. Offset time on Copy to Points

    Hmm... what you'd probably want to do here is start by creating an incrementing point attribute on your template points that represents what frame in the sequence you want each copy to load from. For this example, let's call it i@frame.

    Once you have that, you could create a string primitive attribute on your packed primitives (after the copy operation) that would point to the full path of the texture on disk:

        s@texture_path = sprintf("/path/to/textures/root/texture_%g.png", i@frame);

    Now that you have this as a primitive attribute, you could leverage material stylesheets to override the emissive texture (or whatever texture) on the material applied to your sparks. Stylesheets are a bit cumbersome to use in Houdini, but they're very powerful and they'll work with any render engine.

    Once you have a material assigned to your object, go to Inspectors > Data Tree > Material Style Sheets, then create a new stylesheet on the geometry container that your sparks live in. Then right-click the stylesheet in the Data Tree and "Add Style". You can then add a Target of type "Primitive", and then an "Override Script" of type "Material Parameter". Then change the "Override Name" column to the name of the parameter you want to override in your material... if you hover over the name of the material parameter in your shader network, you should see the parameter name. In your case it's probably going to be the emission color texture, which is "emitcolor_texture" in a Principled Shader.

    The "Override Type" should be "Attribute Binding", because you're using the attribute you created earlier to override the material value. Then set the Override Value to the name of the attribute you created: "texture_path". Your stylesheet should look something like the attached screenshot.

    Once that's done, if you render (and your material is set to be emissive, emissive textures are enabled, and your emission color is 1.0), you should see a unique texture applied to each spark.
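    That sprintf() step is just string formatting: one texture path per copy, driven by the incrementing frame attribute. In Python terms (the root path is the same placeholder as above, not a real location):

```python
# One texture path per copy, driven by an incrementing frame attribute -- the
# same thing the sprintf() VEX line does. The root path is a placeholder.
frames = list(range(1, 5))  # stand-in for i@frame on four template points
paths = ["/path/to/textures/root/texture_%d.png" % f for f in frames]
```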