Neon Junkyard

Neon Junkyard last won the day on August 28 2019

Neon Junkyard had the most liked content!


Personal Information

  • Name
    Drew Parks

Recent Profile Visitors

4,220 profile views

  1. You can use hou.hipFile.addEventCallback() to register a callback for when the hip file is saved. You can add this code in the HDA's Python Module, or (depending on what you want to do) put it in the 'On Created' event handler to register it when the node is created or the hip file is loaded. You will probably also want to check for any existing callbacks before setting a new one.
     https://www.sidefx.com/docs/houdini/hom/locations.html#scene_events
     https://www.sidefx.com/docs/houdini/hom/hou/hipFile.html
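A minimal sketch of that registration (runs only inside a Houdini session; the callback name and the print action are illustrative, and note that re-evaluating the module creates a new function object, so a name-based duplicate check may suit an HDA better):

```python
import hou  # available only inside Houdini

def _on_scene_event(event_type):
    # react only to saves; other hou.hipFileEventType values also arrive here
    if event_type == hou.hipFileEventType.AfterSave:
        print("hip file saved:", hou.hipFile.path())

# check for an existing registration before adding a new one
if _on_scene_event not in hou.hipFile.eventCallbacks():
    hou.hipFile.addEventCallback(_on_scene_event)
```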
  2. You can do this with Qt for Python, e.g. in a Python panel, but it's not a trivial thing. Fuzzy search is typically done with a proxy model using Qt's model/view architecture in larger applications, but you might be able to get away with a QListWidget and a QCompleter, which is much simpler to use.
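A rough sketch of the simpler QListWidget + QCompleter route, assuming PySide2 as shipped with Houdini (the widget class and names are illustrative; MatchContains gives a cheap "contains" filter rather than true fuzzy matching):

```python
from PySide2 import QtWidgets, QtCore

class FilterList(QtWidgets.QWidget):
    def __init__(self, items, parent=None):
        super(FilterList, self).__init__(parent)
        self.list = QtWidgets.QListWidget()
        self.list.addItems(items)
        self.edit = QtWidgets.QLineEdit()
        completer = QtWidgets.QCompleter(items, self)
        # substring matching instead of the default prefix matching
        completer.setFilterMode(QtCore.Qt.MatchContains)
        completer.setCaseSensitivity(QtCore.Qt.CaseInsensitive)
        self.edit.setCompleter(completer)
        self.edit.textChanged.connect(self._filter)
        layout = QtWidgets.QVBoxLayout(self)
        layout.addWidget(self.edit)
        layout.addWidget(self.list)

    def _filter(self, text):
        # hide list rows that don't contain the typed text
        for row in range(self.list.count()):
            item = self.list.item(row)
            item.setHidden(text.lower() not in item.text().lower())
```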
  3. In VEX, use setprimintrinsic() to change the 'unexpandedfilename' intrinsic of a packed disk primitive to a new file path on disk.
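For example, in a Primitive Wrangle running over packed disk primitives (the replacement path here is just an assumption):

```vex
// swap the file a packed disk primitive points at
string newpath = "$HIP/geo/replacement.bgeo.sc";
setprimintrinsic(0, "unexpandedfilename", @primnum, newpath);
```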
  4. I pretty much exclusively do this with Python as well, but if you don't want to dip into Python you can also manipulate the timeline to split things up into different files. Example case: you have a model with a bunch of pieces you want to export as individual .fbx files. Put down a Connectivity SOP set to Primitives, and in a Primitive Wrangle:
     setdetailattrib(0, 'frame_max', i@class, 'max');
     if (i@class != @Frame - 1) removeprim(0, @primnum, 1);
     Output to a File Cache, FBX or whatever, set to Save Frame Range, and put this expression in the end frame parameter:
     detail(0, 'frame_max', 0)
     Miller time.
  5. List comprehension? Not really... foreach is itself shorthand for for (int i = 0; i < len(a1); i++). You could create a custom VEX function library for stuff like this; that's as close as you will get to Python classes.
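As a sketch, such a library is just a header of functions placed on the VEX include path (the file location and function are made up for illustration):

```vex
// e.g. saved as $HOUDINI_USER_PREF_DIR/vex/include/mylib.h,
// then pulled into a wrangle with: #include <mylib.h>
function float array_sum(float values[])
{
    float total = 0;
    foreach (float v; values)
        total += v;
    return total;
}
```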
  6. This is how I deal with this for freelancing: if I am physically at a studio using their equipment, I tend not to use any personal HDAs, for exactly this reason, instead sort of unboxing them with limited functionality for the purposes of the job and leaving them as subnets. Outside HDAs can also be extremely problematic for certain pipelines (farm and cloud rendering), and most studios discourage or outright ban them anyway.
     If you really want to use a personal HDA at a studio, you can black-box it (Assets > Create Black Boxed Asset). This will make it usable and distributable but impossible to inspect.
     If I am working remotely (most of my work), with random clients or studios, I include it in my contract that I do not hand over any Houdini source files, assets, setups, etc., only renders, caches, Nuke scripts and so on, as setups and tools that you have been developing for years are worth far more than any individual job. Much like any non-redistributable software EULA.
     Finally, if you really want to be a pain in the ass, you can make your tools so complicated, or bury tons of Python code so deep in them, that no one will be able to figure them out or use them anyway. Job security.
  7. I have an FBX subnet with a bunch of different models that I'm object-merging into a separate geo network, using wildcards to grab all the file nodes in the FBX subnet and enabling Create Path Attribute on the Object Merge. I need shop_materialpath to correctly point to the corresponding FBX material that was created. While most objects are read in with the shop_materialpath attribute set correctly, the Houdini FBX importer will sometimes interpret the material assignment at the object level, depending on how the model was prepped, so the resulting shop_materialpath from the object merge is missing. The textures and materials are there at the object level (in the FBX subnet) but the shop path is wrong, and thus textures break in the object merge. So, in a Python SOP, I need to run a loop over each prim, eval the object-level shop_materialpath parameter via the prim's path attribute (i.e. its source geo network in the FBX subnet), and assign that string to shop_materialpath for the missing prims. Thanks
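One possible shape for that Python SOP loop (an untested sketch: it assumes the Object Merge's path attribute names the source node, and walks up one level if that path points at a SOP rather than its OBJ container):

```python
node = hou.pwd()
geo = node.geometry()

# make sure the prim attribute exists before writing to it
if geo.findPrimAttrib("shop_materialpath") is None:
    geo.addAttrib(hou.attribType.Prim, "shop_materialpath", "")

for prim in geo.prims():
    if prim.attribValue("shop_materialpath"):
        continue  # this prim already has a valid assignment
    src = hou.node(prim.attribValue("path"))  # from Create Path Attribute
    if src is not None and not isinstance(src, hou.ObjNode):
        src = src.parent()  # path pointed at a SOP; use its OBJ container
    if src is None:
        continue
    mat_parm = src.parm("shop_materialpath")
    if mat_parm is not None:
        prim.setAttribValue("shop_materialpath", mat_parm.eval())
```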
  8. Personally, I never use /shop; it's annoying to navigate to, and on complex projects I like to group my materials in multiple mat networks at the object level. Arnold, Redshift, and pipelines in general don't care where you put your materials in terms of breaking the render; you can put them in literally any context and it will work as long as the shop_materialpath attribute is maintained. Generally speaking, though, it's probably best to only use SOP-level material networks in HDAs, standalone assets, etc. Otherwise it makes it extremely difficult to navigate your scene should you hand it off to someone, or come back to it months or years later. Ultimately it's personal preference though.
  9. In a Python SOP you can call pressButton() on your File Cache's Save to Disk parameter (you might have to use evalParm() as well); that will in essence physically press the Save to Disk button on every cook. Combine that with the detail-attribute file path variables and that should be all you need.
     Also, you can set a global variable holding your detail expression (under Edit > Aliases and Variables > Variables), say $CACHE = `detail(blahblah)`, then use $CACHE in your file path instead of that expression. That lets you read everything back in as packed prims in a single node using the File Merge SOP with that variable (similar to how you can load back in multiple wedge passes).
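The button press itself is only a few lines in the Python SOP (the node path is a placeholder; on a stock File Cache SOP the save button's internal parameter name is typically execute, but check your node's parameter names):

```python
cache = hou.node("/obj/geo1/filecache1")   # hypothetical path
cache.parm("file").eval()                  # evaluate the output path if needed
cache.parm("execute").pressButton()        # same as clicking Save to Disk
```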
  10. Ah, thanks Tomas, those are great solutions! Yeah, I'm realizing the VEX route is more trouble than it's worth. Currently, for most cases that don't require interpolated values, I'm using the Ray SOP (which is now compilable) set to Minimum Distance, with Transform Points unchecked and Import Attributes from Hits enabled. That doesn't account for your second solution though, so thank you for that. I know that as soon as I spend any sort of time making a custom attribute-transfer SOP that handles all the edge cases, SideFX will release a new version of it; happens every time.
  11. I updated to H17.5 and all my custom HDAs broke. SideFX updated the Bound SOP and now I'm getting this error message on all my HDAs:
     /obj/path/to/my/HDA/bound1: "Too many elements found for parameter "blahblah/bound1 Bounding Type"
     This is across about 20 different HDAs in this scene, with probably 100 total instances throughout the scene. Is there any way to fix this sort of issue without manually updating every single HDA by hand? This is just one case, though; I'm looking for general solutions and good practices for handling and updating nested HDAs so things don't break like this.
     Example case: you have a tree HDA, and inside it various leaf modules that are each their own HDA. Each leaf's parameters are relative-reference linked to the top-level tree HDA. Each leaf is a different HDA (leaf_A, leaf_B, etc.), but they all share a common HDA that controls, for instance, the leaf shader. The problem comes when you try to update that leaf shader HDA: if you version up your asset and make changes, the other leaf assets will not update to the latest version, so you have to go and update them all by hand. The bigger problem is if you do something like rename a parameter that is expression-linked to the top-level HDA, e.g. from leafcolor to leafColor; everything will break and you will get this error message.
     I'm not 100% sure what these options really mean, but from what I can tell the old leafcolor parameter becomes a spare parameter, unique to that HDA instance; you can go and delete all spare parameters and that will get rid of them, and I think that's what "destroy all spare parameters" is supposed to do for each instance. HOWEVER, since all the other instances of this HDA are nested inside LOCKED HDAs, that button will have no effect and everything will remain broken.
     Other than undoing these changes, the only way I've been able to fix this is to manually delete all the spare parameters in EACH HDA INSTANCE, not just the master HDA. Sigh. That is just one example of a parameter change, but it can happen in a million different ways. Does anyone have insight on how to handle or avoid situations like this, best practices, etc.? Any help is much appreciated.
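One possible starting point for a batch fix: a Python shell loop that visits every instance of the broken type, unlocks it, strips spare parameters, and re-matches the definition. This is a hedged sketch, not a tested recipe; the asset type name is a placeholder, deeply nested instances may also need their enclosing HDAs unlocked, and you should try it on a copy of the hip file first.

```python
import hou

leaf_type = hou.nodeType(hou.sopNodeTypeCategory(), "my::leaf_shader::1.0")
for inst in leaf_type.instances():
    inst.allowEditingOfContents()   # unlock this instance for editing
    inst.removeSpareParms()         # drop the orphaned leafcolor-style parms
    inst.matchCurrentDefinition()   # lock back to the installed definition
```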
  12. A couple of things... First, if you are using packed primitives, make sure you add Redshift spare parameters to your objects and enable "Instance SOP Level Packed Primitives" under the Redshift OBJ > Settings > Instancing tab of your geometry network.
     Second, are you using the SOP-level instancer or the OBJ-level instancer? The SOP-level Instance doesn't actually instance anything; it's just a Copy SOP wrapped in an HDA. You want to be using the OBJ-level Instance.
     Third, if you have nested OBJ networks inside a subnet, you need to enable "Render the OBJ nodes inside OBJ/SOP subnets" on the Redshift OBJ spare parameters tab, which should be enabled by default. You should enable "Force update OBJ nodes inside of subnets" in the IPR tab as well.
     Finally, Redshift only allows point attributes on geometry, for the most part (prim attributes can be used for strands/hair, you can promote color to a vertex attribute to eliminate point-color bleeding from shared edges, etc.), but it's a safe bet to promote all your custom attributes to point level BEFORE you pack anything. Also, if you have mismatched attributes when using packed primitives, you are going to have a bad day: if you have color on the points of one thing and color on the source geo of another thing and merge them together, that attribute will break in your shaders.
  13. Awesome, I will check that out. Starting to use intrinsics more and more.
  14. Is there a way to transfer ALL point and prim attributes in VEX, like you can with the Attribute Transfer SOP (or Ray), which is old, slow, and non-compilable? I'm looking for a modern VEX solution. Normally I would just use intersect() or pcopen() or whatever for single attributes, but I want to be able to transfer ALL incoming attributes (or have a typed list that can use wildcards) without having to manually specify them, so it can account for any incoming attributes upstream. Thanks
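VEX itself has no wildcard attribute transfer, but as a generic fallback a Python SOP can loop over whatever point attributes exist on a second input. A brute-force nearest-point sketch (O(n²), so only sensible for small point counts; swap in a real spatial lookup for production):

```python
node = hou.pwd()
geo = node.geometry()
src = node.inputs()[1].geometry()  # geometry to transfer from

# create any missing point attributes on the target
for attrib in src.pointAttribs():
    if attrib.name() != "P" and geo.findPointAttrib(attrib.name()) is None:
        geo.addAttrib(hou.attribType.Point, attrib.name(), attrib.defaultValue())

for pt in geo.points():
    # brute-force nearest source point
    near = min(src.points(),
               key=lambda s: (s.position() - pt.position()).length())
    for attrib in src.pointAttribs():
        if attrib.name() != "P":  # don't move the target points
            pt.setAttribValue(attrib.name(), near.attribValue(attrib.name()))
```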
  15. I saw this HDA demo from Kim Goossens, circa 2012, which takes any smooth input curve and outputs a straight-edged or angled curve with angle-shaping controls. Does anyone have insight into how you might go about recreating this? I'm especially interested in the parts towards the end where he switches between angled curves and straight edges, with control over the angle of the curve. The closest I can get is doing a linear resample and quantising/snapping to a grid, which gives an okay result, but it is limited to the nearest snap point or rounded integer; there isn't any actual control over the overall edge angle. For instance, I would like to be able to set a global curve angle, so curves can only be 90 degrees along an axis, or 60 degrees, or increments of 15 degrees, etc.
     https://www.youtube.com/watch?v=fPy4U0eGQ0Q&t=1s
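One way to get that "global curve angle" control, rather than snapping points to a grid: resample the curve, then rebuild it segment by segment, snapping each segment's direction to the nearest multiple of a chosen angle while keeping its length. A sketch of the math in plain Python, 2D for brevity (in Houdini the same idea would live in a point wrangle; this is not the method from the video, just one guess at it):

```python
import math

def quantize_polyline(points, step_deg):
    """Rebuild a 2D polyline so each segment's direction snaps to the
    nearest multiple of step_deg, preserving each segment's length."""
    step = math.radians(step_deg)
    out = [points[0]]
    for a, b in zip(points, points[1:]):
        dx, dy = b[0] - a[0], b[1] - a[1]
        length = math.hypot(dx, dy)
        angle = math.atan2(dy, dx)
        snapped = round(angle / step) * step   # nearest allowed direction
        px, py = out[-1]
        out.append((px + length * math.cos(snapped),
                    py + length * math.sin(snapped)))
    return out
```

Because segment lengths are preserved while directions change, endpoints drift slightly from the source curve; a final blend back toward the original positions would give the "shaping" feel from the demo.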