Welcome to od|forum

Community Reputation: 7 Neutral

About mikelyndon

Personal Information

  • Name: Mike
  • Location: Vancouver, Canada
  1. Hi Callan, Are you using the Apprentice version of Houdini? Unfortunately the vertex animation export tools don't work with Apprentice, because it can't export fbx and its images are watermarked. Mike
  2. Hi James, It is possible. Each fractured piece has to be in its own geometry container so that when you export it as an fbx it creates the necessary hierarchy. You can then use "import fbx chunks" on the destruction mesh to create the depth 1 pieces. Mike
  3. Hey Jack, Can you post one of these assets so I can have a look? Mike
  4. Hi Milan, The git repo has 2.0.14 as the latest. Do you have any plans to submit 2.0.15 to GitHub? Mike
  5. You need to add this to your houdini.env: HOUDINI_ACCESS_METHOD = 2 Thursday, May 9, 2002 Houdini 5.0.117: Adding new environment variable, HOUDINI_ACCESS_METHOD. This value can be 0, 1, or 2. It specifies what method Houdini will use to check file permissions. 0 uses the default method, which does a real check of file permissions using the proper NT security model. But this method can be quite slow on systems which use a login server elsewhere in their network. Method 1 uses the NT security model again, and is much faster than method 0, but doesn't work on Win2k SP2 (Microsoft broke an important function in that release). Method 2 just checks the file attributes, which is fast and works, but ignores all the NT security stuff, so files and directories which are read only because of NT security permissions will report that they can be written to.
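For reference, this is how the line from that journal entry would look in your houdini.env file (a plain KEY = VALUE text file in your Houdini user preferences directory):

```ini
# Use fast file-attribute permission checks instead of the NT security model.
# Caveat from the journal entry: files that are read-only via NT security
# permissions will report as writable with this method.
HOUDINI_ACCESS_METHOD = 2
```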
  6. Hey Milan, When the fractured geometry is connected to the bt_extracells without a cellpt attribute, Houdini crashes. A warning or error would be better, I think. Mike
  7. Hey Memo, Considering how much your work has helped me in openFrameworks I thought it would only be fair to see if I could help you out on this. I found this little gem which seems to solve the problem. The issue is how the polyExtrude treats the incoming geometry. If the group field is left empty it treats the geo as primitives and ignores the Normal orientation. If you create an edge group first and feed that into the polyExtrude it treats the geo as edges and should then work as you'd want it to. Hope that helps. Mike trail_polyextrude_with_groups.hipnc
  8. I started with the basic smoke shader and built it up from there. You could even use Matt's modified shader as a starting point. The light setup was pretty much the same as the setup in the paper (sections 7.4 and 7.5): one light for the sun, one for sky illumination and one for the ground bounce illumination. You could add more if needed, but that's enough to get you started. Because each light represents such a large light source you can get away with using directional lights. All the lights used dmap shadows. Even though I don't think directional lights are meant to work for photon generation, they do. So an initial render generates the photons using pbr, then a second render generates the image using the micropoly renderer. You're not actually rendering the image with pbr; you're just generating the photon map with the pbr engine and then using that map in the shader, in combination with the other stuff, to do a micropoly render. That way you're not waiting 18h for a frame.
I wish there were tutorials on this kind of stuff, but none that I know of. A lot of the time I just had to keep testing different possible solutions until something worked. I'd say these forums and the sidefx site/forums are your best bet for finding more info. One tip I can give: if you want to separate lights in the shader you can use the light mask parameter on the illuminance loop inside the shader.
We created our cloud setup before 12.5 was released and its cloud tools were available. I've played around with the cloud fx tools but haven't used them in production. The cloud light is trying to solve the same problem though - how to approximate multiple scattering in a cloud. The difference is the cloud light attempts to calculate this within sops while we chose to compute it at render time. I had tried to compute the scattering in sops using the volume bake sop when we were doing rnd, but the photon map generation option was already further along and I never had time to go back and explore some more.
One big takeaway from the paper - and Matt's shader and texture lookup address this - is that clouds are heavily forward scattering. I mention this because if you're going to approximate how light travels through a cloud with the tools available, then I'd recommend you use a phase function of 0.9 or above (the built-in cloud shader uses 0.2). At least that's roughly the response for a single order of scattering; things start changing with increased orders of scattering, and you can see that in the paper. I can't count how many times I read through that paper until I really understood what was physically correct and then how to approximate that in the lighting and shader setup. Hopefully one day I'll be able to talk in more detail about the setup I used. For now, if you have any more questions I'll try and answer them. Mike
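As a footnote to the forward-scattering point above: the standard way to express that anisotropy is the Henyey-Greenstein phase function, which is what phase values like 0.2 and 0.9 plug into. This is a minimal Python sketch of my own for illustration (hg_phase is a hypothetical helper name, not anything from the production shader):

```python
import math

def hg_phase(cos_theta, g):
    """Henyey-Greenstein phase function.

    g is the anisotropy: 0 is isotropic, values near 1 are strongly
    forward scattering (clouds are roughly 0.9 or above).
    """
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)

# With g = 0.9 the forward lobe dominates the backward direction by
# orders of magnitude; with g = 0.2 the distribution is nearly flat.
forward = hg_phase(1.0, 0.9)
backward = hg_phase(-1.0, 0.9)
```

Evaluating that for g = 0.9 versus g = 0.2 makes it obvious why the built-in shader's default of 0.2 looks too diffuse for a cloud.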
  9. Hey Oliver, We used that paper as the basis for a cloud shader being used in production at the moment. I also integrated Matt's awesome work into the shader. The trick is to break the shader into direct and indirect lighting as well as single and multiple scattering. At least that's how I handled it. Matt's texture lookup is great for the direct lighting of the first scatter component, so you get all those cool features from the lookup. Then to mimic increased orders of scattering we used a gi light to generate photons for the cloud and used a point cloud lookup or the vex photon function in the shader. So we were using pbr for the photon generation and then micropoly for the image render. If you have any questions let me know.
  10. Hi Dennis, The way I understand it is this: when creating a shader for pbr you generally add together your diffuse, specular and refraction bsdfs and pass the result through the conserve energy node so that pbr can ensure the shader doesn't reflect more light than it receives. The physicalSSS doesn't output a bsdf, so instead we have to export the result to Ce, which is then added to the result from the other contributions. This means the final result can possibly reflect more light than was received. To compensate for this, the fakealbedo uses a pbrDiffuse multiplied by the sss intensity and colour, added to the diff, spec, etc. and then run through the conserve energy, so the conserve energy can take the sss into account. The conserveenergy2 node creates a scale factor that is multiplied against the result of diff, spec and refraction, and is also multiplied against the sss colour and intensity. This ensures that when diff, spec, refraction and sss are added together for the final result, the reflected light doesn't exceed the amount of light received. Without these calculations, if you have a light with an intensity of 1 and both diff and sss have an intensity of 1, you could possibly receive twice as much light reflected from the surface as arrived. I hope that all makes sense.
I don't have any render time comparisons, but the main reason for the stripped down shader was usability. When the MSM was first released in 11 it went through a lot of iterations while still in beta and even in production builds. I believe there was a lot of talk from the community concerning "old school" shading techniques and parameters versus the newer ideas for physically correct renderers. The MSM is a combination of those two schools of thought, so it can sometimes be confusing if not properly understood. I wanted a simpler, cleaner interface for myself and other artists to use specifically for pbr. Hope that helps. Mike
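To make the scale-factor idea above concrete, here's a toy Python sketch. This is not SideFX's actual conserveenergy VOP, just an illustration of the principle; the scalar arguments stand in for the real per-channel bsdf contributions:

```python
def conserve_energy(diff, spec, refr, sss):
    """Scale the shading components so their sum never exceeds 1.

    Mimics the idea described above: a single scale factor is computed
    from diff + spec + refr + sss (the sss term playing the role of the
    fakealbedo contribution) and applied to every component, including
    the sss colour/intensity that is exported separately via Ce.
    """
    total = diff + spec + refr + sss
    scale = 1.0 if total <= 1.0 else 1.0 / total
    return [c * scale for c in (diff, spec, refr, sss)]

# The problem case from the post: diff = 1 and sss = 1 would reflect
# twice the incoming light; the scale factor brings the sum back to 1.
balanced = conserve_energy(1.0, 0.0, 0.0, 1.0)
```

When the components already sum to 1 or less, the scale factor is 1 and nothing changes; only over-bright combinations get pulled down.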
  11. Are you going to use the shader with micropoly or pbr? We created our own material from scratch to replicate the mantra surface material, but without all the confusing bits and bobs and specifically for pbr. One thing to be aware of is that you need to remove the sss contribution from the bsdf if you want correct energy conservation. At least that's how I read what's going on in the mantra surface material. If you look for the fakealbedo multiply node inside /shop/mantrasurface1/surfaceModel you'll see what I'm talking about. It's a little convoluted and difficult to explain so I'll post an example when I get home. Mike EDIT: I've attached a stripped down material that would be used with pbr. Hope it helps stripped_down_material.otl
  12. Here's a simple setup to look at. The sphere on the left renders with sss fine but the instanced object doesn't display single scatter sss in the render. And if I turn on multi scatter it doesn't even render. point_instance_procedural_v01.hipnc
  13. Hey guys, I was hoping someone could enlighten me as to why I can't get sss to work with the point instance procedural. I'm using pbr. If I turn all components of the surface shader off except for sss and try single scatter the instance object renders black. If I attempt to render with multi scatter the render seems to get stuck on generating the point cloud. Thanks Mike
  14. Hey dude. Have you tried using the voronoi fracture tools? That might be a more elegant solution to your problem. Mike
  15. Thanks Morné. My bleary eyes weren't seeing straight at 1 in the morning. Mike