
Mister Fidget

Everything posted by Mister Fidget

  1. Raytracing for Renderman

    If there is a render parameter you want or need that isn't available by default on a ROP (or on most nodes, really), simply click the cog in the upper right of the parameter window and choose "Edit Rendering Parameters." On the far left you should see a list of folders for the different renderers. You can manually search for the parameters you want by opening the folders for your renderer (I assume PRMan in your case), or you can search with the filter at the bottom left (much easier, in my opinion). Then just add the parameter to your ROP, and the SOHO backend will treat it appropriately, so long as you pulled the parameter from the folder of the same renderer the ROP corresponds to. Some parameters overlap, but a parameter for, say, 3Delight might not work with PRMan or vice versa, even though they are both "RenderMan" renderers. I hope that helps some.
  2. Studying Renderman

    In reply to your first question (how to use Mantra and RenderMan alongside one another), when writing different shaders for Mantra and RenderMan you can use the Select SHOP node to switch shaders auto-magically at rendertime based on which renderer you are using. So you could pipe in a RenderMan shader (either hand written or using VOPS) into one input and a Mantra shader into another and depending on which renderer you use, Houdini will assign the correct one to the geometry without you having to do anything other than hit "Render." This is useful for making ROP node networks with Mantra, RenderMan and other ROP types mixed together and having Houdini automatically take care of the shader assignments at render time. It is also nice for easily comparing shaders between different renderers. I wish I had known of this node when I was trying to recreate the Mantra glow shader using RSL; it would have made my life much easier. I haven't had a reason to use the node since discovering it, so if you end up using it let me know how it goes.
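    Conceptually, the Select SHOP behaves like a lookup table keyed on the renderer that is invoking the render. The sketch below is not Houdini API — the function name, renderer keys, and shader names are all illustrative — it only shows the switching idea described above in plain Python:

```python
# Conceptual sketch of a Select SHOP's render-time switching.
# Nothing here is real Houdini API; inside Houdini the switch happens
# automatically based on which ROP/renderer performs the render.

def select_shader(shader_inputs, renderer):
    """Return the shader wired to the input matching the active renderer.

    shader_inputs -- dict mapping a renderer name to a shader (any object)
    renderer      -- name of the renderer invoking the render
    """
    try:
        return shader_inputs[renderer]
    except KeyError:
        raise ValueError("no shader input wired for renderer %r" % renderer)

# One piece of geometry, two shader inputs; the switch resolves per render.
inputs = {
    "mantra": "vop_glow_shader",     # VOP-built shader for Mantra
    "renderman": "rsl_glow_shader",  # hand-written RSL equivalent
}

print(select_shader(inputs, "mantra"))     # -> vop_glow_shader
print(select_shader(inputs, "renderman"))  # -> rsl_glow_shader
```

    The point is that the geometry carries one shader assignment (the switch node), and the active renderer — not the artist — decides which input is actually used.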
  3. Fur and getting guides to work

    Good news everyone! I figured out how to drive hair using imported guide curves without using the flowfield SOP! The flowfield SOP is a good solution for individual artists working on a project in isolation, but if someone has to share their work on a collaborative project, then using an experimental node like flowfield can create a sticky situation (like, for instance, the situation I explained in the last post). I am including an image of the hair as driven by only the imported guide hairs. I still need to work out some problems that are causing the procedurally generated hairs to not shape quite right (as is probably obvious from the image), but at least the main hurdle of getting the foreign guide hairs recognized properly has been overcome. You are right, jim c: there seems to be very little out there on the topic of grooming long hair in Houdini, and just as little about successfully importing guide curves from outside applications. If either one were easier or better documented, I think a lot more hair would be done in Houdini. So I will post the file once I have cleaned it up some, so that others can dissect the process I went through. I'll also explain the concepts behind what I did for the learning and benefit of others. Thanks again jim; I could not have figured it out without your instructions and example file.
  4. Fur and getting guides to work

    @ dobin: Thanks for the suggestion. However, I haven't been able to get the IGES format to export anything other than the groomed NURBS guide curves (which I have already exported and imported previously with the FBX format). Besides, it isn't so much the hair animation I am trying to get out as it is the groom. The animation and simulation of the hair I can do in Houdini pretty easily. But I can't seem to get the Fur SOP or the Fur HDA to recognize the previously groomed curves as guides.

    @ jim c: Thanks jim. Using the flowfield SOP and your example file, I have been able to get the Houdini-generated hair guides to flow in the same direction as the original groom without much fuss. Original Maya groom: Houdini groom: Obviously, I haven't put any time in yet on shading, length maps, etc., but at least being able to get the general shape and flow is encouraging.

    Sadly, I don't think the flowfield solution will work for us. I know that I can install the flowfield SOP on my user account at the school, but I don't have control over what the other students working on the shots will or can do. Also (and probably most importantly), I don't have control over the installed version of Houdini on any of our render farm's server blades here at the university. We have a decent number of shots, so being able to render on the farm will be helpful in finishing the remainder of the film in a reasonable amount of time. Since the flowfield solution is probably not the best option (for the above reasons), I am trying to figure out how the guides scattered by the Fur HDA are recognized as legitimate hair guides, so I can force the HDA to treat the scattered Maya hair guides as legitimate too.

    So my modified question is: does anyone know what is special about the hair guides scattered using a Fur SOP (which is how the scattered guides are generated in the Fur HDA in H11) that allows the Fur procedural shader to recognize them as legitimate guides for driving hair generation?
  5. Fur and getting guides to work

    I have a problem that is extremely similar to jim c's original problem, so I felt there wasn't much point in starting a new thread.

    Background info: I am working on finishing up a student film along with some of my fellow classmates. So far we have been lighting/rendering in Maya using RenderMan for Maya. This has been a huge headache. The director and I agreed that we should light the remaining sequences in Houdini (version 11) using Mantra. We are really liking the results so far (as well as the comparative ease of use), but we have run into a roadblock. The main character has hair that was groomed in Maya. We were planning on rendering it separately in Maya and then compositing the hair and the main beauty pass together in post (as we have done in previous films). However, the scenes we need to light in Houdini have reflections in them and thus require that something be visible. We thought of using polygon hair that is phantom-rendered to catch secondary rays, but that solution causes its own set of problems.

    The problem: At this point we are trying to get the hair that was groomed in Maya moved over into Houdini in order to render it along with everything else. Converting it to polygons and exporting it is heavy and impractical at render time. I am able to get the guide curves out of Maya and into Houdini, but once they are there I can't seem to get Houdini to use them as guide curves to drive the fur/hair. The Fur SOP and the Fur HDA both give errors about mismatched geometry. I could try to regroom the hair from scratch in Houdini to match the Maya hair, but that seems like more trouble than it is worth, especially when the guide hairs have already been groomed once in Maya. Besides, reusing the Maya groom will help keep consistency, since we have already batch rendered some of the other shots with hair in Maya.

    phrenzy84 mentioned being able to use guide hairs from another program, but never quite explained which attributes need to be created/transferred. Could you go over the general aspects of the process that need to be in place to get it to work? I understand you used Blender, but as we are not using that application, I need the principles of using outside guide curve geometry more than the Blender-specific steps.