LaidlawFX

Members
  • Content count: 937
  • Donations: 0.00 CAD
  • Joined
  • Last visited
  • Days Won: 19

LaidlawFX last won the day on June 4

LaidlawFX had the most liked content!

Community Reputation: 160 Excellent

1 Follower

About LaidlawFX

  • Rank
    Houdini Master
  • Birthday 02/23/1985

Contact Methods

  • Website URL
    http://LaidlawFX.com

Personal Information

  • Name
    Ben
  • Location
    Bordeaux, France
  • Interests
    Getting away from the computer
  1. Transfer HDA animation

    These are certainly some good reservations about storing the materials as HDAs. No matter what, you will have a massive collection of files to store these materials in. They could just as well be Substance materials, or JSON/Python files on disk; the same logic applies.

    As for the sheer number of HDAs, you can store them appropriately in your HOUDINI_PATH or HOUDINI_OTLSCAN_PATH, so you only load the HDAs you need. These would be the separate load directories you append on startup. You can also install HDAs via Python from files on disk that aren't already in those directories, which may be the best method (a quick sketch below). Houdini does not really care how many thousands of HDAs you load, assuming you didn't build really bad HDAs that cook the world on load. Generally speaking you don't actually need that many HDAs, even for a feature film. A couple hundred might be enough for all FX and studio functions, and only a few dozen really matter in the grand scheme. As far as materials go, the whole concept is to keep as common a pool as possible; you don't want a dozen ways to do car paint.

    You can also manage hiding the HDAs via the OPcustomize file. So you could load all the HDAs and ophide them all by default; look through that file and you'll see a whole pile of tricks. When creating these HDAs, since you do not want to do it by hand, you can sort them by their tab-submenu paths and update OPcustomize, or any other corresponding HDA options, via Python. That way they stay better organized than they would by hand.

    Honestly, for materials required on objects you do not want the lighter to ever have to look up the correct materials for the objects, so seeing them all under "All" shouldn't be a real concern. Only the OCD among us, like me and you, notice or do anything about it. I've worked for many studios where people just tab and go. Either way, there will be a level of automation you will need to develop for your project to load them. Cycling back to the Shotgun conversation, that can help manage it.
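    A minimal Python sketch of that batch install-and-hide idea, assuming a hypothetical material HDA directory and driving the hide through hscript rather than editing OPcustomize by hand (the ophide syntax here is the usual table/name pair, but treat it as an assumption to verify against your build's OPcustomize docs):

    ```python
    import os
    import hou

    # Hypothetical directory of material HDAs; swap in your own HOUDINI_PATH location.
    MATERIAL_HDA_DIR = hou.expandString("$HIP/hda/materials")

    for file_name in sorted(os.listdir(MATERIAL_HDA_DIR)):
        if not file_name.endswith((".hda", ".otl")):
            continue
        hda_path = os.path.join(MATERIAL_HDA_DIR, file_name)
        hou.hda.installFile(hda_path)

        # Hide each freshly installed type from the tab menu by default,
        # the same thing an "ophide" line in OPcustomize would do.
        for definition in hou.hda.definitionsInFile(hda_path):
            table = definition.nodeTypeCategory().name()   # e.g. "Vop" or "Sop"
            hou.hscript("ophide {} {}".format(table, definition.nodeTypeName()))
    ```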
  2. Transfer HDA animation

    So yet again not an extremely simple answer, and it will spiral out as you read through it.

    To keep it simple with the materials: you can just use groups/attributes to define the different parts of your character and then, with the Material SOP, assign your materials per region (sketched below). On top of that, what I think you are after is material overrides. You can define overrides to the shader on the mesh, so you could define the diffuse map per primitive. This can get a bit expensive in file size on disk, especially for an animated sequence, but you could theoretically have one uber shader for the whole production with everything defined on the mesh. You can do this at the object level too, depending on how you split up your objects. Finding the happy medium is key. For a one-person project with a standardized shading model this might work out really well.

    An alternative that is similar, and will not overuse material overrides while still keeping all presets defined on your uber shader, is to keep packets of shaders within a material subnet HDA. So you could save all the preset shaders for your character, for instance all the clothing types, in one material HDA subnet that contains all possibilities, i.e. Ralph_Clothing.hda containing clothes/dirty clothes/clean clothes/pants.

    A more advanced setup would be, on import of your object with a file wrapper on your HDA, to parse the geometry or an external saved file for the required materials. You can do it similar to the above, or even create the uber shaders dynamically in Python and apply the presets directly in the scene. By applying the presets in the scene, say if you have a lookdev department authoring the materials, you may have issues maintaining their updates; with a bit more pipeline framework, though, you can dynamically refresh those once they are updated. Another alternative, in the simple case, is that the lookdev department authors their materials as HDAs too. Then you get version control built in, but this requires you to build a much more complex naming-convention system for storing the HDAs, etc.

    There are also material stylesheets you can use. Jeff Wagner just did a few videos on those; they may be more up your alley, but a bit too much to explain in a forum post.
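    Here is a minimal Python SOP sketch of the groups/attributes route; the group names and /mat paths are made-up placeholders, and in practice the Material SOP does the same job of writing the shop_materialpath primitive attribute:

    ```python
    # Python SOP: assign a material path per primitive group.
    node = hou.pwd()
    geo = node.geometry()

    # Hypothetical mapping of character regions to shader nodes.
    material_by_group = {
        "clothes":       "/mat/ralph_clothing/clean_clothes",
        "dirty_clothes": "/mat/ralph_clothing/dirty_clothes",
        "pants":         "/mat/ralph_clothing/pants",
    }

    # Material binding on geometry is just a string primitive attribute.
    if geo.findPrimAttrib("shop_materialpath") is None:
        geo.addAttrib(hou.attribType.Prim, "shop_materialpath", "")

    for group in geo.primGroups():
        mat_path = material_by_group.get(group.name())
        if mat_path is None:
            continue
        for prim in group.prims():
            prim.setAttribValue("shop_materialpath", mat_path)
    ```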
  3. Transfer HDA animation

    It's good to know your pipeline is just in Houdini. (IMHO, you can ignore this part, but don't build yourself too deep into a single-software-centric pipe. One package generally won't be the solution for everything, and in production every package has its strengths; Shotgun, Nuke and Houdini all go hand in hand, for instance.)

    So if you are doing strictly shot-based work from anim to rendering, like a film or commercial pipe, and you are not working on the part of the pipe that helps your animators read from a library of animation cycles, you can simplify it pretty quickly. If you're also not building something to help an FX team, then it's pretty simple: for lighting/rendering just go with geometry caches. That way the lighting artist/rendering person only needs to read the caches from each respective department.

    On your character rigs you will just need a render button that exports your geometry sequence as .bgeo.sc to a project/sequence/shot#/department/asset/version structure, or something similar; you can get really fancy with the file structures. For instance, for a common asset just replace the shot-specific tokens in the corresponding hierarchy, and mirror the authoring scenes and machine-generated content. If you want, you can even include an .xml/.json of metadata for import, or put it in the detail attributes as custom tags of information: static asset, viable frame range, render options, etc. (see the path/metadata sketch below). This way your lighting artist can just load all assets from the same hierarchy. By having all the geometry caches in the same known structure, it becomes easier, even without a program like Shotgun managing the asset dependencies, to auto-load all assets on launch of the .hip file based on file locations.

    Sometimes it's good to export the animation channels out on the rig/HDA as well, since you can use the same animation caching method that your animators use and save yourself from developing an extra system. But you are going to need a geometry cache system; that is the most common production tool across all studios. Also, for FX artists it is sometimes really necessary to have the entire rig for simulations and FX work. Character TDs doing grooming, cloth, or rag dolls, for instance, would need it depending on the workflow. If you save your T-pose in an easy-to-access mode on your HDAs this becomes easier, but since DOPs can inherit object transforms it can be a toss-up depending on your artists.

    For materials, yes, you can keep them in the rig. The thing to remember is that they are only a string attribute on the primitive, so you can bend and twist this string to your heart's content. It also really depends on the complexity of your production. Are you building a pipeline for a studio to do hundreds of projects? Are you doing a simple commercial with no change in materials for the whole sequence, or a fashion commercial where every shot is the same pirouetting ballerina rocking different clothing? Are you shading an entire crowd or just one character? You do not want hundreds of copies of the same material loaded in a scene in that case; you would want a load detector for a common material area, so if a material has already been loaded it does not get reloaded. If all thousand characters share the same material it will certainly render simpler and keep your scene file manageable. Also, do you need per-shot flexibility on adjusting the materials, especially for the lighting artist?

    Mantra and comp have gained so many amazing options compared to two decades ago that you may not need any of those if you have a competent compositor and one person to set up the shot. Hope it helps; there are a lot of questions to reflect on around scope, scale, desired functionality, flexibility, time and resources, so there isn't really one best answer without those variables.
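    As a rough sketch of that kind of cache layout, here is how the export path and a JSON sidecar could be built in Python; every token and metadata key below (project root, department names, the fields in the sidecar) is a made-up example, not a fixed convention:

    ```python
    import json
    import os
    import hou

    # Hypothetical tokens; in a real pipeline these come from your
    # project/shot context (Shotgun, environment variables, etc.).
    tokens = {
        "project": "$JOB", "sequence": "seq010", "shot": "shot0040",
        "department": "anim", "asset": "ralph", "version": "v003",
    }

    base = hou.expandString(os.path.join(
        tokens["project"], tokens["sequence"], tokens["shot"],
        tokens["department"], tokens["asset"], tokens["version"]))
    if not os.path.isdir(base):
        os.makedirs(base)

    # Geometry sequence path to feed a File Cache / ROP Geometry node.
    cache_path = os.path.join(base, "{asset}.$F4.bgeo.sc".format(**tokens))

    # Sidecar metadata the lighter's loader can read before touching geometry.
    metadata = {
        "asset": tokens["asset"],
        "static": False,
        "frame_range": [1001, 1100],
        "render_options": {"motion_blur": True},
    }
    with open(os.path.join(base, "metadata.json"), "w") as f:
        json.dump(metadata, f, indent=4)
    ```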
  4. Transfer HDA animation

    So, an aside that may be closer to what you want: there are Python Panels that ship with Houdini — Autorigs, Pose Library, and Character Picker. These have been updated by, I believe, Michael Goldfarb. Here is an example: https://www.sidefx.com/tutorials/autorigging-masterclass/ So it may be worth not listening to me, looking into those, and getting in touch with SideFX before I take you down a more complex path, lol. Sorry, these are new tools I do not use in production.

    At the heart of it you would want a library you can save your different animations to: Walk, Run, Sit, etc. You would need to be able to read from and write to these files from the parameter pane. You may want to split the rig's export into different components, like upper and lower body, or other component pairings. You can do this with, say, FBX files, CHOPs with .chan files, etc. Storing the animation as FBX may be good if your animators like to use MotionBuilder, Maya and other programs. The most studio-robust and flexible route would be encoding it all with Python (a bare-bones example below). You can even go into the realm of making your own Python Panel in Houdini to help manage the files on disk and help import them, including nifty images/gifs of the different exports. I believe the Python Panels mentioned above can be used as examples, if not used directly, to get you started.
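    A bare-bones sketch of the "encode it all with Python" route: dumping a rig's keyframes to JSON and reading them back. The node path, parameter list, and clip file are placeholders, and the shipped Pose Library panel does far more than this:

    ```python
    import json
    import hou

    def save_clip(node_path, parm_names, clip_file):
        """Write frame/value pairs for a few parameters to a JSON clip."""
        node = hou.node(node_path)
        clip = {}
        for name in parm_names:
            parm = node.parm(name)
            clip[name] = [(k.frame(), k.value()) for k in parm.keyframes()]
        with open(clip_file, "w") as f:
            json.dump(clip, f, indent=4)

    def load_clip(node_path, clip_file, frame_offset=0):
        """Re-apply a saved clip, optionally shifted in time."""
        node = hou.node(node_path)
        with open(clip_file) as f:
            clip = json.load(f)
        for name, keys in clip.items():
            parm = node.parm(name)
            parm.deleteAllKeyframes()
            for frame, value in keys:
                key = hou.Keyframe()
                key.setFrame(frame + frame_offset)
                key.setValue(value)
                parm.setKeyframe(key)

    # Hypothetical usage:
    # save_clip("/obj/ralph_rig", ["tx", "ty", "rz"],
    #           hou.expandString("$HIP/clips/walk.json"))
    # load_clip("/obj/ralph_rig", hou.expandString("$HIP/clips/walk.json"),
    #           frame_offset=1001)
    ```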
  5. Transfer HDA animation

    There are no standard tools per se. But you can copy and paste the nodes/HDAs from one scene to another; two open sessions use the same copy buffer. Or merge the hips from the File menu. If you want to invest the time you can script it, or build one of several nodal methods to copy the info/HDA (a quick sketch below).
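    If you do want to script it, one way (a sketch, with a scratch file path of your choosing) is to write the selected nodes to an items file in one session and load it in the other:

    ```python
    import hou

    # In the source session: save the selected nodes to a transfer file.
    nodes = hou.selectedNodes()
    if nodes:
        parent = nodes[0].parent()
        parent.saveItemsToFile(nodes, "/tmp/transfer_nodes.cpio")

    # In the destination session: load them into a network of the same type, e.g.
    # hou.node("/obj").loadItemsFromFile("/tmp/transfer_nodes.cpio")
    ```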
  6. How much RAM is too much ?

    We had a runaway sim on RIPD in 2012 that would eat up to the max 128 GB of RAM on the server farm in Taiwan at the time. Generally this should be an extreme outlier, like it was. The Shape of Water team in Toronto had a few sims that did this too; every film generally has one or two shots that, depending on the artist, will use up this much RAM for a sim. Another case is heavy-handed photogrammetry where you toss hundreds of photos at it, as opposed to a proper plate set of a dozen or so high-resolution photos. More practical math for evaluation: on farm nodes we usually budget 2-4 GB of RAM per CPU core for rendering. So if you get one of the new Threadrippers and your machine is part of the farm, it can be used to run dozens of small jobs or one massive sim — a comp render node or an FX render node. Most studios on average give workstations 64 GB of RAM; it's generally cheaper for a studio to give an artist a second box than to overdo the RAM.
  7. Can not Type [ or {

    Are you using a non-US keyboard? SideFX does not really support different keyboard layouts. Otherwise I would check your keyboard settings. We have also had keyboards die in the past.
  8. vertex animation textures BBOX size

    For now it is probably best if you just use an external calculator and change the value manually. Unfortunately there is a bit too much to explain to tell you where to put it; I would have to cover a lot of Houdini fundamentals for it to make any sense. Apologies I can't be more helpful. Perhaps someone else can give you a hand?
  9. vertex animation textures BBOX size

    Are you talking about the values in the HDA? Depending on what point you are talking about, it's as simple as putting the /100 or *0.01 in the correct location for the value you are looking at. For instance, in my version I changed the settings to always output the right value for Unity/proprietary engines, as those were our primary platform(s), as opposed to Unreal. -Ben
  10. Houdini 17 Wishlist

    That is certainly an annoying one.
  11. Read files through a loop

    Another alternative is that you can do this in a Python SOP too: with a loop, just merge each iteration's geometry into the main geometry, for example:
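    A minimal Python SOP sketch of that merge loop; the glob pattern is a placeholder for whatever file sequence you are reading:

    ```python
    # Python SOP: load a set of files from disk and merge them into this node's geometry.
    import glob
    import hou

    node = hou.pwd()
    geo = node.geometry()

    # Hypothetical pattern; point it at your own sequence of .bgeo.sc files.
    pattern = hou.expandString("$HIP/geo/piece_*.bgeo.sc")

    for path in sorted(glob.glob(pattern)):
        piece = hou.Geometry()
        piece.loadFromFile(path)   # read one file...
        geo.merge(piece)           # ...and append it to the output geometry
    ```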
  12. Vertex Animation Shader Culls Geometry

    What engine are you on?
  13. Houdini Points to Unity Line Renderer

    Hello Oliver, welcome to the forum. We were just having the line rendering issue the other day, lol.

    For animated FBX, use the Vertex Texture Exporter, or you will have to animate your geo at the object level — each element in a separate object, with each animation on that object — and then export the whole object-level tree. When exporting a frame range, $F does not really apply in the case of FBX, as it spits out one monolithic file rather than multiple files. I highly recommend the Vertex Texture Exporter, as it will be more efficient for almost all FX cases compared to traditional FBX rigs. The mesh is still FBX, but the animation is really a fancy displacement shader, and the process is a lot cheaper at runtime unless you are doing complex stuff like characters blending motions.

    The Unity FBX importer does not import lines or points/vertices; it will cull them out (FBX does store them, though). You could make a custom Unity importer — there are a few Unity forum posts about that on the interwebs — but you will need to be a bit more on the tech side to do it. The alternative hack is to make polygons out of your points/vertices and lines so they import (a quick sketch below). Dirty, I know, but it depends on how complex your pipe is: if you are working on a side project I would recommend the hack; if you are working at a company I would recommend the custom importer.

    Make sure your normals are on the vertex level and that your points are fused for FBX. Also visualize the face normals; these can be reversed from your vertex normals, and you can use a Reverse node to flip them. This generally happens when you build a mesh a few different ways. I could open your file, but I'm lazy. Reversed face winding can cause different issues in different rendering programs, depending on which way the normals get interpolated — in your case, invisible geometry.
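    If you go the "make polygons out of the points" hack, here is a rough Python SOP sketch that grows a tiny triangle at every point so the FBX importer keeps them; the triangle size is an arbitrary placeholder:

    ```python
    # Python SOP: replace loose points with tiny triangles so the FBX importer keeps them.
    node = hou.pwd()
    geo = node.geometry()

    SIZE = 0.001  # arbitrary; small enough to still read as a "point" in engine
    offsets = (hou.Vector3(0, 0, 0), hou.Vector3(SIZE, 0, 0), hou.Vector3(0, SIZE, 0))

    for point in list(geo.points()):
        pos = point.position()
        poly = geo.createPolygon()
        for offset in offsets:
            corner = geo.createPoint()
            corner.setPosition(pos + offset)
            poly.addVertex(corner)
    ```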
  14. Empty locked node

    The file on disk should not be affected, you're right — unless it crashed while saving. I take it you have not recovered the files from $TEMP? That is what I imagine happened, as it is the most likely scenario. Another possibility: did you change major versions? It could be something with how they changed geometry storage.
  15. Empty locked node

    This will happen when a scene file crashes and you open it again. The crash file only saves the info that is in system memory, not things like geometry, and once the locked node's geo is gone it cannot come back. A locked node is more like a short-term temporary buffer: great for performance testing, or for sharing files on odforce. Generally speaking you should only lock nodes temporarily; for long-term use you should use a File Cache node or something similar. This is the same as in other software, albeit maybe not as obvious.
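    For the long-term route, a tiny sketch of caching geometry to disk with Python instead of relying on a locked node; the node and file paths are placeholders, and in practice a File Cache SOP wraps the same idea with versioning and a read switch:

    ```python
    import hou

    # Hypothetical SOP output and cache path.
    out_geo = hou.node("/obj/geo1/OUT").geometry()
    cache_file = hou.expandString("$HIP/geo/cache/mysetup_v001.bgeo.sc")
    out_geo.saveToFile(cache_file)

    # Read it back with a File SOP (or hou.Geometry.loadFromFile) in any session.
    ```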