Starrider

Members
  • Content count: 16
  • Joined
  • Last visited

Community Reputation

0 Neutral

About Starrider

  • Rank: Peon
  • Birthday: 02/08/1980

Personal Information

  • Name: Daniel
  • Location: Los Angeles

  1. Thanks for the replies! I actually got both things solved with your help. I guess subnets just don't work, but using the materialbuilder node instead and putting everything inside of it fixes both issues. The parameter node for export can live in there (and it's working), and my main network stays nice and clean, as shaders are contained. I haven't checked the issue with layer-mixed shaders, though. Moneitor, I'll keep this in mind when I use them. Thanks to both of you for your help!
  2. Thanks for your reply! I tried setting up a material subnet with a shader inside (tried the Principled shader and also classicshadercore). I then assigned the subnet's shader to my geometry and used opcook -F to force cook them (I also tried cooking the whole subnet), but that didn't get it working. Strange. Any idea what I could be missing? Are you saying that to use subnets I have to use the classicshadercore, or is that just something you prefer? How are you force cooking them, opcook -F? (A Python force-cook sketch follows after this list.) I might go back to the older workflow for now, then. This is a bit sad.
  3. i'm just starting to use the new material workflow in H16 and have a couple of questions:
     * when i'm working in a more complex scene and i have a bunch of shaders with nodes connected to their inputs, my matnet gets polluted pretty quickly. i tried to split things out into subnets, but then the shaders didn't work anymore (i also tried connecting the outputs, which didn't work either). is there another way to 'compile' shaders into some kind of container that i missed? what's the recommended workflow for that?
     * i'm trying to export masks (AOVs) for comp (for example debris: red, main geo: green, character: blue). is there a way to add those without cracking the shader nodes open? i hacked it for now by connecting my (always export) mask parameter node to an unused slot on the principled shaders (for example emission, and turning down the emission intensity). this is something that's needed in most productions, and it feels odd that there is no extra slot for additional AOV export parameters. is there another way to do that? (a sketch of adding extra image planes on the Mantra ROP follows this list.) thanks in advance!
  4. Hey! Is this available somewhere now? Thanks! Daniel
  5. We tried to nail it down to: some stuff that's hard to do in Maya is easy to do in Houdini, and some stuff that's easy to do in Maya is hard to do in Houdini.
  6. Hey! I'd be really interested to try it but the download link is not working anymore. Is it still available? Thanks Ohohoh. I just realized it's working now. Sorry for wasting posts.
  7. Check out 3dbuzz, there are some free tornado tutorials (Houdini 6 but should be helpful anyway).
  8. drop a trail sop after your particle sop and set it to "compute velocity". should work fine then. maybe you have to put a cache sop in front of the trail. daniel
  9. It didn't take that long. Like one day for the first implementation and then bug fixing and tweaking. It's a basic implementation of http://www.iro.umontreal.ca/labs/infograph...5-PVFS/pvfs.pdf without all the softbody, stickiness, and geo collision components - just the particle interaction (a rough sketch of the pairwise-force idea follows this list). Thanks for the nice comments!
  10. Thanks for your help guys! To be a bit more specific: I have several animated meshes (cycles), around 20 characters with 15 animations each. I've exported bgeos from Maya (with the plugin from houdinistuff.com), but the normals are wrong and I need the point velocities for motion blur. For now I've written a Python script which automatically parses the folders and sets up a scene with nodes for file I/O, facet, etc. Everything is merged together, so I just have to play it till the last animation ends and all the files are on my hard drive. The point velocities shouldn't be too hard to add. So I've almost solved it, but I'd be really interested if there is an easier way to do this, maybe in batch mode using hython (a hython sketch of that approach follows this list). Cheers!
  11. hi! i have to calculate the normals and the velocity (dependent on another file) on several bgeo sequences and write them back over the original files. what's the easiest way to do that? i know that i could write a python script, but i'm hoping there's an easier way...? any ideas?
  12. Here's a bgeo loader plugin for Maya: http://houdinistuff.com/ (great tool!)
  13. Hi! I'm looking for a mocap library with modelled and textured characters like http://www.rocketbox.de/ for a crowd animation. Does anyone know of another one? Would be great! Cheers Daniel
  14. I thought that transferAttributes would do some interpolation, as shown below, with weighted results based on the distances to the sampled points. But if it did that, the resulting points would not sit on the kind of grid the result shows; they'd sit randomly scattered (as the scatter does) on the cylinder. Am I overlooking anything? (A tiny sketch of that weighting follows this list.)
  15. Hm, I think I don't need P because it's stored in Cd, and Cd is automatically transferred if "copy local variables" is set, isn't it? Anyway, thanks for the reply!
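
A note on the force-cook question in post 2: opcook -F has a Python equivalent, hou.Node.cook(force=True). A minimal sketch, assuming a material subnet at /mat/my_material (the path is a placeholder):

    # Minimal sketch: force-cook a material subnet from the Python shell.
    # The path /mat/my_material is a placeholder for your own network.
    import hou

    mat = hou.node("/mat/my_material")
    if mat is not None:
        # Equivalent of `opcook -F` on this node only.
        mat.cook(force=True)
        # Also force-cook everything inside the subnet, recursively.
        for child in mat.allSubChildren():
            child.cook(force=True)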
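On the AOV/mask question in post 3: one common route is to write the mask from an exported parameter (or Bind Export) VOP in the shader and declare a matching extra image plane on the Mantra ROP, rather than hijacking an emission slot. Below is a hedged Python sketch; the ROP path /out/mantra1 and the export name "mask" are placeholders, and the vm_* parameter names are the Mantra ROP's extra-image-plane multiparm as found in H16-era builds (verify against your version).

    # Sketch: add one extra image plane ("mask") to an existing Mantra ROP.
    # /out/mantra1 and the export name "mask" are placeholders.
    import hou

    rop = hou.node("/out/mantra1")

    # Grow the "Extra Image Planes" multiparm by one entry.
    idx = rop.parm("vm_numaux").evalAsInt() + 1
    rop.parm("vm_numaux").set(idx)

    # Point the new plane at the exported VEX variable.
    rop.parm("vm_variable_plane%d" % idx).set("mask")   # exported variable name
    rop.parm("vm_vextype_plane%d" % idx).set("vector")  # RGB mask is a vector
    rop.parm("vm_channel_plane%d" % idx).set("mask")    # channel name in the EXR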
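On the particle-fluid asset in post 9: the original .otl is not reproduced here, but the particle-interaction part of the referenced paper boils down to a pairwise attraction/repulsion force between nearby particles. A rough NumPy sketch of that idea, with arbitrary constants:

    # Rough sketch of pairwise particle interaction (attraction/repulsion),
    # in the spirit of the PVFS paper linked above. Constants are arbitrary.
    import numpy as np

    def interaction_forces(pos, radius=1.0, rest=0.5, k=10.0):
        """Per-particle forces from all neighbours within `radius`.
        Particles closer than `rest` repel, farther ones attract."""
        forces = np.zeros_like(pos)
        for i in range(len(pos)):
            d = pos - pos[i]                      # vectors to every other particle
            dist = np.linalg.norm(d, axis=1)
            mask = (dist > 1e-6) & (dist < radius)
            # Spring-like force toward the rest distance.
            f = k * (dist[mask] - rest)[:, None] * (d[mask] / dist[mask][:, None])
            forces[i] = f.sum(axis=0)
        return forces

    # Tiny usage example: 100 random particles, one force evaluation.
    pts = np.random.rand(100, 3)
    print(interaction_forces(pts)[:3])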
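For the bgeo batch described in posts 10 and 11: the actual script is not posted in the thread, so here is a hedged hython sketch of the workflow it describes: a File SOP reads each sequence, a Facet SOP recomputes normals, a Trail SOP computes point velocities, and a Geometry ROP writes the frames back out. The paths, frame range, and the Facet/Trail parameter names are assumptions; verify them (and write to a new folder first) before overwriting any originals.

    # Hedged hython sketch: recompute normals and point velocity for a bgeo
    # sequence and write the result to a new sequence. Paths, frame range and
    # some parameter names (postnml, result) are assumptions for illustration.
    import hou

    SRC = "$HIP/geo/char_walk/char_walk.$F4.bgeo"   # hypothetical input sequence
    DST = "$HIP/geo/fixed/char_walk.$F4.bgeo"       # write to a new folder first
    FRAME_RANGE = (1, 100, 1)                       # start, end, step (assumed)

    geo = hou.node("/obj").createNode("geo", "bgeo_fix")

    file_sop = geo.createNode("file")
    file_sop.parm("file").set(SRC)

    facet = geo.createNode("facet")
    facet.setFirstInput(file_sop)
    facet.parm("postnml").set(True)        # "Post-Compute Normals" (name assumed)

    trail = geo.createNode("trail")
    trail.setFirstInput(facet)
    trail.parm("result").set(1)            # "Compute Velocity" menu entry (index assumed)

    rop = geo.createNode("rop_geometry")
    rop.setFirstInput(trail)
    rop.parm("sopoutput").set(DST)
    rop.render(frame_range=FRAME_RANGE)    # batch-write the whole sequence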
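On the interpolation expectation in post 14: the distance-weighted blend described there can be illustrated in a few lines of NumPy. This only sketches the poster's expected behaviour, not a claim about what the attribute-transfer node actually does internally:

    # Sketch of the distance-weighted interpolation described in post 14:
    # each target point blends the Cd of nearby source points, weighted by
    # inverse distance. Illustrative only; not the AttribTransfer SOP itself.
    import numpy as np

    def weighted_transfer(targets, sources, cd, radius=1.0):
        """Inverse-distance-weighted blend of `cd` from sources onto targets."""
        out = np.zeros((len(targets), 3))
        for i, p in enumerate(targets):
            dist = np.linalg.norm(sources - p, axis=1)
            mask = dist < radius
            if not mask.any():
                continue
            w = 1.0 / (dist[mask] + 1e-6)          # closer points weigh more
            out[i] = (cd[mask] * w[:, None]).sum(axis=0) / w.sum()
        return out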