
OTL Config/Management Help


I'm experimenting with the various OTL configuration options

(e.g. "Give Preference to Definitions from Index Files", etc.)

I'm curious what other facilities are doing for OTL management: allowing artists to keep up to date with the latest OTLs while maintaining the ability to lock off at a particular version. What are your thoughts on embedding, etc.?

Here's what I want to do (and maybe this is available; I'm looking at it now):

- Maintain a global OPlibraries file with all the latest approved OTLs (we're doing this already).

- When you save a file, it stores paths to the actual .otl files in use but does not embed them automatically.

- If a user modifies an OTL in a shot hip file, then of course it's simply embedded (or maybe a local shot path is used to store this one-off OTL version?).

- When the file is re-opened, if the internal OTL path does not match the show-approved path, a warning/option box is presented allowing you to lock off where you're at or switch to the show version. It's a workflow convention of ours that you never overwrite an OTL that's been released on a show, so updating an OTL means making a new version directory and storing it there. This way, old files can still point to the older OTLs they were built on without needing to embed.

- We've also started archiving hip files for renders that get kicked off on the queue. In those archived hip files I'd like to embed every operator so the file can stand on its own. This is not a working file, just a snapshot of everything at the point the file was saved.
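The re-open check described above could be sketched roughly like this. This is a plain-Python sketch; the function name, the dictionary shapes, and the paths are my own invention. In Houdini the paths would come from the hip file's internal OTL table and the show's OPlibraries file.

```python
# Sketch of the re-open check: compare the OTL paths recorded in a hip
# file against the current show-approved paths and report any drift.
# All names and data shapes here are hypothetical.

def check_otl_paths(hip_otl_paths, show_otl_paths):
    """Return {operator type: (hip_path, show_path)} for every mismatch.

    hip_otl_paths  -- {operator type: path recorded in the hip file}
    show_otl_paths -- {operator type: current show-approved path}
    """
    mismatches = {}
    for optype, hip_path in hip_otl_paths.items():
        show_path = show_otl_paths.get(optype)
        if show_path is not None and show_path != hip_path:
            mismatches[optype] = (hip_path, show_path)
    return mismatches
```

On load, a non-empty result would drive the "lock off or upgrade" dialog for each mismatched operator type.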

I'm looking now to see if there's a combination of options that will let me do this. If others have a good workflow, I'm all ears.

Thanks for any input.



Here's a summary of what works for us in an effects-oriented pipeline. I'm sure the needs of a feature-animation pipeline would be more intense.

A master OPlibraries file which contains stuff akin to:
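(Purely hypothetical illustration, since OPlibraries is just a plain list of .otl paths, one per line, with env vars and preprocessor directives allowed - the paths below are invented:)

```
$SHOW/otls/approved/sop_tools.otl
$SHOW/otls/approved/pop_tools.otl
$JOB/otls/shot_overrides.otl
```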


...and so on. Ours has quite a few more entries, but that's the root of it. We wish the OPlibraries file could be done away with and OTLs loaded by compound paths. Just thank goodness that OPlibraries evaluates env vars and has preprocessor capabilities. We use a very, very short $HOUDINI_OTL_PATH for this.

Complication: A small complication of course arises when creating new OTLs from custom stuff, because the file browser uses $HOUDINI_OTL_PATH - which now contains only a path to the master OPlibraries file and nothing else. You can no longer pick a sensible place to save your new OTL from the pulldown.


We choose not to embed any OTLs. They all remain external to keep hipfile sizes down.

Complication: When you unlock an OTL to make custom changes, I wish we could set it to automatically embed any unlocked OTLs. Currently, if the OTL is missing or has its parameters changed, the node fails to load properly even though customizations have been performed. If the OTL is missing, the node disappears and is replaced with a Subnet; if the parameters changed, many of the changes you made that relied on the parameters at the time of unlocking become invalid, and your custom changes are corrupted.


At the time we submit the job to the queue, we consciously do an OTL copy (otcp) of any referenced OTLs (which are NOT part of the SESI distribution) into the Embedded section of the hipfile, save the new hipfile in a new location, and render the frames from that. This allows a render to complete correctly even if the definition of an OTL changes partway through a long render.

We do similar things with all reasonable published resources (like chan files), resolving paths in a careful way. This whole technique also helps for archiving encapsulated scenes at the end of a project. The whole effort exists because we often overwrite certain OTLs without versioning up when we publish them - we publish "in place" a lot of the time. We do maintain every revision of an OTL in a version control system and also in an asset management system, but Houdini itself is unaware of most of that process.
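The isolation idea behind the submit-time snapshot can be sketched in plain Python. The post's actual method embeds via otcp into the hipfile's Embedded section, which requires Houdini; this disk-side variant only illustrates the principle of copying non-factory OTLs next to the render hipfile. The directory layout and the "HFS root" filter are assumptions of mine.

```python
# Sketch of the submit-time snapshot: copy each referenced OTL that is
# NOT part of the stock Houdini install next to the render hipfile, so
# a long render is isolated from later edits to the show libraries.
# The hfs_root default and directory layout are illustrative only.
import os
import shutil

def snapshot_otls(referenced_otls, snapshot_dir, hfs_root="/opt/hfs"):
    """Copy non-factory OTLs into snapshot_dir; return the new paths."""
    os.makedirs(snapshot_dir, exist_ok=True)
    copied = []
    for otl in referenced_otls:
        if otl.startswith(hfs_root):
            continue  # part of the SESI distribution, leave it alone
        dest = os.path.join(snapshot_dir, os.path.basename(otl))
        shutil.copy2(otl, dest)  # copy2 preserves the timestamp
        copied.append(dest)
    return copied
```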

Complication: You have to be careful and respectful when the user wants to change the definition of an OTL, or something in the scene, on a job that's already on the farm/queue. Easy enough by checking date stamps and such. The OTL embedding process can take anywhere from 10 seconds to 2 minutes, and we found we needed to write a better opfind command to handle it properly (for example, opfind does not let you search against evaluated strings - e.g. expanded environment variables). Version 8's opfind is better than 7.0's but still a bit lacking.
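The date-stamp check itself is trivial; a minimal sketch, with an invented function name and assuming you recorded the capture time when the job was submitted:

```python
# Sketch of the date-stamp guard: before touching a farm job's OTLs,
# check whether the on-disk definition changed after we captured it.
# Name and arguments are hypothetical.
import os

def otl_changed_since(otl_path, captured_at):
    """True if the on-disk OTL was modified after `captured_at` (epoch seconds)."""
    return os.path.getmtime(otl_path) > captured_at
```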


That's the crux of the OTL setup here. There are a lot of short'n'sweet peripheral asset management tools and so on, but this is the configuration and how we deal with it. I wish SESI would help us solve the "Complication" portions of this post (they've been RFE'd for a hundred years), but it does still work with a few inelegant workarounds.

Hope this helps at all,


Here's a summary of what works for us...


Thanks very much for the detailed reply. I've been messing around with the various ot commands to try to come up with some sort of improved system.

Right now we do embed all the time (at least on the show I'm on). If you have embedded nodes, Houdini gives you the option to update them if they don't match the current definition. It's actually worked fine, but it can confuse many artists, and more often than not the embedded version is identical to the show version. The really hard part is that it makes it difficult to debug where an embedded version actually came from. I'd like to do away with embedding for working files and only keep embedded definitions when they're needed (unique).

We have a very similar setup going with archived hip files. Each time we kick off a render, the hip file is saved to an archive area with everything embedded. Queue scripts point to this archived file, not the user's working file. That's working out great so far.

Currently I'm testing other options for general use, though. I've got a script in testing now that does the following:

a) I still turn embedding on for all saves.

b) When a hip file is loaded, I run a script where each embedded node's timestamp is tested against the other versions of that node in the lib path. If an embedded node is found to be identical to one on disk in our show area, the embedded one is deleted. So at startup, all non-unique embedded nodes are cleared away.

c) Check all in-use OTLs and do an otload on the current paths. This actually stores the path to the OTL in the hip file as a separate entry from what's found in the OPlibraries file, even if the path is identical. Basically it stores an internal oplib table that's saved with the file.

d) The last step is to check the current OTL path for each in-use OTL against the show OPlibraries and warn the user on a mismatch.
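Step b)'s cleanup pass can be sketched in plain Python. The real script would use hou/hscript to inspect embedded definitions; here timestamps are just plain numbers, and the function name and data shapes are invented.

```python
# Sketch of step b): drop embedded definitions whose timestamps are
# identical to the show-library copy, keeping only the unique one-offs.
# Data shapes are hypothetical stand-ins for real hip-file queries.

def cleanup_embedded(embedded, on_disk):
    """Return the operator types that should stay embedded.

    embedded -- {operator type: timestamp of the embedded definition}
    on_disk  -- {operator type: timestamp of the show-library definition}
    Anything matching the on-disk copy exactly would be deleted from
    the hip file; anything different (or missing on disk) is kept.
    """
    return [op for op, stamp in embedded.items() if on_disk.get(op) != stamp]
```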

I'm still playing around with this. If I had the ability to run on-hip-save scripts, I could make this much easier with much less guesswork. Seems like there are a few options:

1) Trigger an embed and run an otversion or otcomment to store the actual path it came from with the node, then query that on load and handle it however.

2) Have a hip variable or a dummy OTL holding a string, and store all the OTL file paths currently in use there. This becomes an internal OTL table. Then on re-load, run a procedure to check against current versions and present options to lock off or upgrade.
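Option 2's internal table could be as simple as a delimited string stashed in a hip variable. A minimal sketch, where the ";"-delimited format and function names are an invented convention:

```python
# Minimal sketch of option 2: pack an {operator type: otl path} table
# into a single string (storable in a hip variable or dummy OTL) and
# unpack it again on reload. The format is an invented convention.

def table_to_string(otl_table):
    """Serialize {optype: path} into one delimited string."""
    return ";".join("%s=%s" % item for item in sorted(otl_table.items()))

def string_to_table(s):
    """Inverse of table_to_string."""
    return dict(item.split("=", 1) for item in s.split(";") if item)
```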

I'd like to get this right. The archived hip files actually take care of most of my worries about losing data or opening old hip files and getting unexpected results.

Now that we're doing feature animation, we've been asked on more than one occasion to re-do an entire movie in 3D. On Polar Express it was a whole other team that trailed the regular team, re-creating the shots from a second camera. Often they were working with hip files that were very old and pulled from tape backups; the embedding was crucial to being able to recreate the data.

I'm going to read your post more carefully when I get back to work in the new year. Today was a slower day at work, and I had some time to experiment with some of these ideas and at least get acquainted with the commands available in H8 and the shortcomings of the workflow.



