
All Activity


  1. Today
  2. Hello everyone, Is it possible to add a callback that runs every time a node is created? At my studio we have a library of HDAs (hundreds of nodes), and I'd like to check whether a node comes from the general library or the project library and simply set a color depending on its definition path. I tried scripts/OnCreated.py with a package, appending the path to the Houdini path, but nothing happens. I also thought of appending a few lines of code using nodeDefinition.sections()["OnCreated"].setContents("print('Hello')"), but doing that for hundreds of HDAs, every time one is created, is more work than I can take on for now. Is there a solution? I've looked at UI events but haven't found anything useful. Cheers,
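    For what it's worth, a minimal sketch (untested) of one possible approach: hou.Node.addEventCallback with hou.nodeEventType.ChildCreated fires whenever a node is created directly inside the network you register it on, so you could register it per network from a package startup script (e.g. 123.py). The 'project_library' string below is a placeholder path fragment, not a real path:

    import hou

    def _color_by_definition(node=None, child_node=None, **kwargs):
        # Only HDAs have a definition; built-in node types return None.
        definition = child_node.type().definition()
        if definition is None:
            return
        # Placeholder check: swap in however your studio distinguishes
        # the general library from the project library.
        if 'project_library' in definition.libraryFilePath():
            child_node.setColor(hou.Color((0.9, 0.5, 0.1)))

    # Register on /obj (direct children only; repeat per network as needed).
    hou.node('/obj').addEventCallback(
        (hou.nodeEventType.ChildCreated,), _color_by_definition)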
  3. Ok, thanks, that works for curvature but not for ambient occlusion. Same thing... I guess that's why this tool is marked as beta... Another bug? Every studio uses Substance in its pipeline, and after all these years this implementation is still in its infancy... Hey SideFX, I'd guess most of us could build tools like the slice tool you brought in H19; I'd rather you fixed this kind of thing, and COPs...
  4. Houdini Particle Effects streaks

    Thanks for the plug, @Librarian. I should probably make a post on here telling people about them, since I've been doing this for a while now. @Pj20, to me your reference looks like pyro flowing around a sphere with particles being advected through it. My tutorial is a little different, but maybe you can borrow some ideas from it: https://vimeo.com/531163248
  5. Form Design - TIMJ

  6. Weathering texture tool

    I'm surprised the result looks accurate while there are lots of overlapping UVs. Do you guys know if there is a way with Auto UV Layout to pack and avoid vertices crossing UDIM boundaries? I'm trying to establish a better link between Substance and Houdini, to get the best of both worlds. Basically, if I create an asset with a look made in Substance from a given procedural shader (like an edge-and-scratch procedural wear material), I'd like to recreate the mesh dependency maps live, directly in Houdini, without any exporting, baking in Substance, importing back, and so on. When I link the AO and curvature to an external pre-baked EXR map it works (after I fix a naming error in the Labs tools), but not when I try to connect them "live". As shown above, the image plane doesn't seem to be recognized and taken into account in the SBSAR processing... I'm totally blocked. The image planes are named correctly, single channel as Substance expects. Ideally, at the end, I'd process every asset with PDG. I haven't seen a post from you about PDG, @konstantin magnus, are you not using it? I'm attaching the hip and test Substance files I quickly created. Maybe it's a limitation of the Substance archive? Maybe it can only take baked maps, but if so I guess we should be able to trick it, shouldn't we? Has anyone done PDG map baking for all the elements of a scene and wouldn't mind sharing it, by chance? Many thanks guys!! testhoudini2.sbsar testhoudini3.sbsar Substance_automation_vince.hip
  8. Getting IK legs to parent to hips

    Definitely agree that KineFX is a bit of uncharted territory; there aren't many tutorials or how-tos out there, so I'm glad we can get a bit wiser together this way. Just curious: wouldn't you rather use the IK solver for four bones than have two IK chains for the hind legs? I guess for insects and the like the two-IK-chain setup is correct, but for mammals? What's the advantage over the IK Solver VOP?
  9. Getting IK legs to parent to hips

    Nice setup! I do think that's the way to go, actually: create the IK and blend it in twice, OR create the reverse foot first and blend it in after the IK leg. I guess you don't need a second IK, just do a skeleton blend. Here's my setup; it's pretty similar to what you've got. reverseFoot_v001_pernilla.hip
  10. Haha, I do too, but I force myself to do it in the right order. I'm by no means an image scientist.
  11. I think you need to create a custom viewer state.
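    In case it helps, here's a bare-bones skeleton of what that might look like (a sketch only; 'my_state' and MyState are placeholder names). Saved as a module in a viewer_states/ folder, Houdini picks up createViewerStateTemplate() automatically:

    import hou

    class MyState(object):
        def __init__(self, state_name, scene_viewer):
            self.state_name = state_name
            self.scene_viewer = scene_viewer

        def onMouseEvent(self, kwargs):
            # Handle viewport mouse input here; return True to consume it.
            return False

    def createViewerStateTemplate():
        template = hou.ViewerStateTemplate(
            'my_state', 'My State', hou.sopNodeTypeCategory())
        template.bindFactory(MyState)
        return template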
  12. Pop Grain Collision Shape

    Is there a way to change the default sphere collision shape when using POP Grains? Would something like a capsule, a flattened sphere, or a custom SOP shape be possible? I don't need the collision shape to vary per particle; I'm just looking for something other than the default sphere as the collision shape type.
  13. How to rotate constraints

    There must be a good reason why this works, but hell if I know what it is. Nice find!
  14. Yesterday
  15. How to rotate constraints

    Yep, glue works great. Interesting. Thanks, Henry.
  16. How to rotate constraints

    Hey Henry! Just saw this. I was trying stuff earlier and found that if I use two constraints, one "position" type and another "rotation", it works a lot better than using just one set to "all". Very bizarre. I'm going to check out your file now.
  17. I've solved my issue. The problem was that I was waaay too close to the screen for my reference. The reconstruction was accurate, but the render was too distorted to be viewable, so I moved back some 30 feet and used a long lens (100mm), and everything worked like a charm. Thanks everyone!
  18. Yes, for me at least, measurements are taken with a laser and the reconstructions are made to the correct scale. But sometimes that's not enough. Sometimes the measurements are just difficult to take without scaffolding, and you have to trust the measurements you are given. For example, in the case of projection, the technical guys usually give you the position where the projector will be installed. But in practice the projectors are installed only a few days before the show, and when it comes to installing things that can weigh more than 50 kg at 10 or 20 meters high, well, they are not necessarily placed exactly where you were told, etc. That's why I use the little method I described, which works perfectly... and I trust only my eye.
  19. How to rotate constraints

    I'm honestly not entirely sure why the hard constraint isn't enforcing rotation well enough, but the cheap workaround would be to use a Glue Constraint Relationship instead. Set s@constraint_type to "glue" with a f@strength of -1, then use a Glue Constraint Relationship DOP in place of your Hard Constraint. This will force both objects to solve as a single physical object. You also don't need multiple DOP objects for this system to work. As long as you set the i@active and i@animated attributes correctly on your static and active pieces, you can solve everything as a single DOP object and simplify things a little. 002_toadstorm.hip
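    For reference, a minimal sketch of that attribute setup, written here as a Python SOP over the constraint network geometry (an Attribute Wrangle with s@constraint_type = 'glue'; f@strength = -1; is the more usual one-liner):

    # Runs in a Python SOP wired into the constraint network.
    node = hou.pwd()
    geo = node.geometry()

    # Create the attributes if the upstream network hasn't already.
    if geo.findPrimAttrib('constraint_type') is None:
        geo.addAttrib(hou.attribType.Prim, 'constraint_type', '')
    if geo.findPrimAttrib('strength') is None:
        geo.addAttrib(hou.attribType.Prim, 'strength', 0.0)

    for prim in geo.prims():
        prim.setAttribValue('constraint_type', 'glue')
        prim.setAttribValue('strength', -1.0)  # negative strength = unbreakable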
  20. flip, vellum and rbd in one effect

    Not exactly what you want, but it will help you figure out how to get there, if not also blow your mind... it needs to be seen more often anyway!
  21. biological modelling methods

    For UV-unwrapping a mesh based on underlying curves [1], you can also use this code inside a Point or Vertex Wrangle:

    // INPUT 0: MESH
    // INPUT 1: CURVES with 'tangentu' attribute

    // Closest position on the curves.
    int prim_crv;
    vector uvw_crv;
    float dist = xyzdist(1, v@P, prim_crv, uvw_crv);

    // Rotation aligning the curve tangent with the Y axis.
    vector tangent = primuv(1, 'tangentu', prim_crv, uvw_crv);
    matrix3 rot = dihedral(tangent, {0,1,0});

    // Angle around the curve, remapped to the 0..1 range.
    vector pos = primuv(1, 'P', prim_crv, uvw_crv);
    vector dir = normalize(pos - v@P) * rot;
    float angle = fit11(atan(dir.z, dir.x) / M_PI, 0.0, 1.0);

    // U from arc length around the curve, V from position along it.
    float perim = primintrinsic(1, 'measuredperimeter', prim_crv);
    float u = angle * 2.0 * M_PI * dist;
    float v = uvw_crv[0] * perim;
    v@uv = set(u, v, 0.0);

    [1] https://forums.odforce.net/topic/43213-biological-modelling-methods/?do=findComment&comment=204915
  22. How to rotate constraints

    So, one follow-up question: the constraint drifts over time. I've made the constraint longer to make it more obvious (the desired behavior is that the link points and anchors stay collinear). Increasing the "Error Reduction Parameter" inside the Hard Constraint Relationship node makes it better, but it's not perfect. Is there some other setting that would fix this? 002.hip
  23. How to rotate constraints

    @julian johnson this is exactly what I was looking for. Thank you!!
  24. How to rotate constraints

    001a just turns off your Geometry wrangle and switches the object to animated static, which allows the DOP net to update the geometry orientation and, in turn, the constraint. 001c adjusts only your Geometry wrangle to also update the orient attribute so that the constraint can 'see' the rotation. Not sure of the ins and outs of why these 'work', but they seem to do what you want. Maybe? J 001a.hipnc 001c.hipnc
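    For anyone following along, a rough sketch of the 001c idea, shown here as a Python SOP rather than the wrangle in the file (assumptions: a plain spin about Y driven by the frame number; 'speed_deg' is a placeholder):

    import math

    node = hou.pwd()
    geo = node.geometry()

    speed_deg = 2.0  # placeholder: degrees of rotation per frame
    half = math.radians(hou.frame() * speed_deg) * 0.5
    # Quaternion (x, y, z, w) for a rotation about the Y axis.
    orient = (0.0, math.sin(half), 0.0, math.cos(half))

    if geo.findPointAttrib('orient') is None:
        geo.addAttrib(hou.attribType.Point, 'orient', (0.0, 0.0, 0.0, 1.0))
    for point in geo.points():
        point.setAttribValue('orient', orient)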
  25. How to rotate constraints

    @Librarian Thanks for the file! That was indeed helpful. I'm still curious to see if there is a way for the constraints to infer their orientation from the rotation of the geometry they're anchored to, like in the file I uploaded.
  26. Hard Surface Design

    I have no idea, but it looks like the lines are not created by intersections or anything, but taken from existing geometry. Maybe by shortest paths from a random point selection or something. BTW, as you can see in the first example from Konstantin, Voronoi on a 1D-scaled mesh with Delta Mush does the job quite well.