bento

Members
  • Content count: 13
  • Donations: 0.00 CAD
  • Joined
  • Last visited

Community Reputation: 0 Neutral

About bento
  • Rank: Peon

Personal Information
  • Name: ben

Recent Profile Visitors: 1,002 profile views
  1. Hi,

     According to the 15.5 release notes, there is now a built-in approach for rendering overscan:

         You can now render extra "overscan" image data outside the display window in image formats that support it (for example .pic and .exr). This extra image data may be useful to other software such as Nuke.

     All I can find in the docs is:

         Image Overscan (Houdini, vm_overscan): Enlarges the crop region by this many pixels (horizontal and vertical amounts). If the crop region is the full image, additional pixels outside the image will be rendered. For images that support arbitrary data windows (OpenEXR, Houdini) pixels outside the image resolution will be saved. For other image formats, the pixels will be computed but the results discarded.

     I've tried adding the property to both my camera and my output mantra node (via 'Edit Rendering Parameters...'), but neither seems to have had an effect. Has anyone had any luck with overscan? Any suggestions?

     EDIT: It's actually working fine; the extra data just doesn't show up in RV. Doh!
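     For reference, here is a minimal Python sketch of adding vm_overscan as a rendering property on a Mantra ROP, which is what 'Edit Rendering Parameters...' does through the UI. The node path '/out/mantra1' and the 16-pixel pad are assumptions:

         import hou

         rop = hou.node('/out/mantra1')  # hypothetical ROP path

         # vm_overscan is a two-component int property (horizontal, vertical).
         # Add it as a spare parameter if it hasn't been added already.
         if rop.parmTuple('vm_overscan') is None:
             grp = rop.parmTemplateGroup()
             grp.append(hou.IntParmTemplate('vm_overscan', 'Image Overscan', 2,
                                            default_value=(16, 16)))
             rop.setParmTemplateGroup(grp)

         rop.parmTuple('vm_overscan').set((16, 16))  # 16 px pad on each axis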
  2. Thanks Miles - thought it would be something along those lines. RFE it is then. cheers
  3. Sure - here's the hip. https://drive.google.com/file/d/0B16CnRwtGChua2ZSTktSdjBuMGM/view?usp=sharing Just swap 'Enable Diffuse' and 'Enable Subsurface Scattering' in the mantrasurface1 shop to see the difference.
  4. Hi - when using the mantraSurface shader, or building a surface from physicalSSS or surfacemodel, I find that the SSS shading does not receive indirect illumination.

     Have a look at the attached image. Bounce light from the card is visible on the underside of the sphere when shaded with simple diffuse, but neither the single scatter nor the multi scatter picks up those contributions.

     Is there anything that can be done to include indirect paths in SSS calculations, or is that an RFE?

     cheers
  5. Thanks for the suggestion. You're right: I was being a numpty and trying to use a locally installed hda on our farm. All good now that the hda is in a network location. cheers
  6. Hi,

     I'm attempting to use images embedded in an HDA for a light rig. All is fine when rendering within my Houdini session. However, when I export IFDs and execute them on the farm (command-line mantra), I get messages along these lines and the textures are missing from the renders:

         [16:41:26] mantra: Unable to load texture 'opdef:/Object/turntable?ttBG.pic'

     Within the HDA, references to the embedded textures are relative, i.e. opdef:../?ttBG.pic

     Is that possibly causing a problem? Do I need to make these absolute? Or is there a flag/switch somewhere to make the IFD export include the embedded images? Or is this just a limitation of IFD export? I can imagine embedding every texture into the IFD for every frame being quite wasteful.

     Anyone have experience with this?

     thanks
     Ben
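     In case the absolute-path route is worth trying, here is a rough Python sketch of rewriting a relative opdef: reference into the absolute form that command-line mantra reported. The node path and parameter name are assumptions:

         import hou

         node = hou.node('/obj/turntable1')   # hypothetical asset instance
         parm = node.parm('env_map')          # hypothetical texture parm
         value = parm.unexpandedString()

         if value.startswith('opdef:..'):
             # 'opdef:../?ttBG.pic' -> 'opdef:/Object/turntable?ttBG.pic'
             section = value.split('?', 1)[1]
             op_name = node.type().nameWithCategory()  # e.g. 'Object/turntable'
             parm.set('opdef:/%s?%s' % (op_name, section))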
  7. Here's the screen grab that didn't attach properly.
  8. Hi,

     I'm attempting to create a digital asset via Python scripting and would like to set up some of the parameters and interface as part of that process. I've had success creating parameters and arranging the interface for normal nodes (spare parameters), but taking the same sort of approach with an asset only appears to be half working. Here is my debugging example code:

         hou.hipFile.clear()  # just whilst we're testing; do a fresh start

         objNode = hou.node('/obj').createNode('geo')
         objNode.setName('myAsset')

         if objNode.canCreateDigitalAsset():
             objNode = objNode.createDigitalAsset(
                 name='test::myAssetA',
                 description='myAsset',
                 hda_file_name='/tmp/foo_v001.hda',
             )

         parm_group = objNode.type().definition().parmTemplateGroup()
         first = parm_group.findFolder('Transform')

         folderTemplate = hou.FolderParmTemplate('myFolder', 'My Folder')
         buttonTemplate = hou.ButtonParmTemplate('myButton', 'My Button')
         folderTemplate.addParmTemplate(buttonTemplate)

         parm_group.insertBefore(first, folderTemplate)
         objNode.type().definition().setParmTemplateGroup(parm_group)

     What I'm hoping to do here is create an asset, add a folder as the first item in the interface, and add a button to that folder. Like I said, this sort of seems to work - if I then go to the Edit Operator Type Properties dialog, I see the parameters listed as expected. Weirdly though, all the pre-existing folders are now renamed to match the new folder I added (i.e. myFolder_1, myFolder_2, etc). Then when I look at the interface in the parameter editor, 'myFolder' appears as a check box instead of a folder and the custom UI elements are drawn last rather than first (see attached screen grab).

     So, the question is - am I going about this the wrong way (likely), or do you think this might be a bug (happy to log if so)?

     cheers
     Ben
  9. Thanks - I'll take a look.
  10. Thanks Michael,

      Yes. I'm looking at setting up an asset management layer for Houdini (for more traditional assets such as characters, props, etc, rather than 'tool'-type assets). My initial instinct is to keep the approach in the same ballpark as how we would approach things in Maya (bear with me...).

      What I'd like to do is have working (hip) files for asset generation that would contain the asset subnet itself and any required development aids (think a light rig and turntable for producing a lookdeved asset). The hip files containing the development aids and the 'proto-asset' would be saved and versioned, available for anyone to pick up at a later date.

      Then, to pass the resulting asset out into production, the creator would 'publish' their work. At this point I'd like to write out a versioned HDA to the central location. That all works fine if I only publish once, but it seems a little messy if I then make changes and publish again. createDigitalAsset() won't work on the same asset twice, so from what I can tell I'd get into doing some HDADefinition stuff the second time round. Is that right?
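      For what it's worth, here is a minimal sketch of what that 'second time round' might look like via hou.HDADefinition: save the definition out to a new versioned file, pulling in the current contents of the working instance. The node path and target path are assumptions:

          import hou

          node = hou.node('/obj/myAsset')   # hypothetical asset instance
          definition = node.type().definition()

          # Write the definition to a new versioned .hda, updating its
          # contents from the node before saving.
          definition.save('/tmp/foo_v002.hda', template_node=node)

          # Make the newly published file available in this session.
          hou.hda.installFile('/tmp/foo_v002.hda')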
  11. Hi - I've got a feeling I'm coming at this from the wrong angle, but does anyone know if it is possible to programmatically export a subnet as a digital asset, without actually turning it into an asset? I'm looking for an equivalent to hou.Node.createDigitalAsset(), but which doesn't turn the source into an asset in the current session. Something like exportAsDigitalAsset()...

      If this seems backward or bizarre, please do say and I'll elaborate on the motivations. See if there's a better way of coming at it.

      thanks
      Ben
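      One possible workaround, sketched below under assumptions (subnet at /obj/mySubnet, hypothetical type name and file path): create the asset from a temporary copy of the subnet, then delete the copy, so the original is never converted in the session:

          import hou

          src = hou.node('/obj/mySubnet')                # hypothetical subnet
          tmp = hou.copyNodesTo([src], src.parent())[0]  # work on a duplicate

          asset_node = tmp.createDigitalAsset(
              name='test::myExportedAsset',              # hypothetical name
              hda_file_name='/tmp/myExportedAsset_v001.hda',
          )
          asset_node.destroy()  # drop the instance; the .hda stays on disk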
  12. Thanks - shadow() did the job. I'd been skirting around that in my first stab at things, but couldn't quite get my head round its usage. For the record, this is doing the job for me (as an inline piped into the Ce of a computelighting):

          vector shad = {0,0,0};

          illuminance(P, N, 1, "categories", "A")
          {
              shad += shadow({1,0,0});
          }
          illuminance(P, N, 1, "categories", "B")
          {
              shad += shadow({0,1,0});
          }
          illuminance(P, N, 1, "categories", "C")
          {
              shad += shadow({0,0,1});
          }

          $out = 1 - shad;

      All I want it for is generating some shadow mattes so I can 'grade' a backplate projected onto a grid for a turntable setup. I just want to avoid having to do it in comp after the fact. Much better if it can be live.

      Thanks again
      Ben
  13. Hi,

      Does anyone have thoughts on how I might be able to access the direct_shadow component on a light-by-light basis in VEX? I'm using PBR and need to do some shadow compositing in render rather than in post. I can use the shadowmatte VOP to give me a matte, but it doesn't appear to allow me to restrict which lights contribute to the matte.

      My current thinking is to use an illuminance loop with categories, but I'm unsure if/how I can query the lights for shadowing info. Is that data exported from lights, or in PBR is it all lumped into one output by the integrator? (I see the pbrlighting VOP exports direct_shadow as one vector...)

      any ideas?
      Ben