Popular Content

Showing most liked content on 06/10/2016 in all areas

  1. 3 points
    http://www.go-ghost.com sub-pixel sample compositing system
  2. 2 points
    I was just about to post that, Neilan! Here's the cool embedded Vimeo version. Mostly Houdini/Mantra.
  3. 1 point
    Hi. After some research I developed the concept of the surface shader to make shading artists' work more efficient. A while ago I implemented it in VEX, and now I want to share it with you. GitHub

    Features:

    PhySurface VOP
    - Energy-conserving surface model
    - PBR and RayTrace render engine support
    - GTR BSDF with anisotropy (also available as a separate VOP node)
    - Conductor Fresnel
    - Volume absorption
    - Raytraced subsurface scattering
    - Artist-friendly multiple scattering (also available as a separate VOP node)
    - Ray-marched single scattering
    - Translucency
    - Dispersion
    - Thin sheet dielectric
    - Transparent shadows
    - Extra image planes support
    - Per-component image planes
    - Per-light image planes
    - Variance anti-aliasing support
    - Layered material
    - Nesting material

    PhyVolume VOP
    - Color scattering and absorption
    - Per-light image planes

    PhyShader v1.2.0 - download:

    This is a usability release.
    - BSDF has changed to GTR
    - New artist-friendly SSS
    - Added layer support
    - Added metallic desaturation
    - Improved dispersion

    Materials:
    - Added PhySurface Layered material
    - Added PhySurface Nested material
    - Improved PhySurface material viewport support

    UI:
    - New Inside IOR presets menu
    - Changed dispersion presets menu

    Numerous bugfixes
  4. 1 point
    I know it's tough out there, but I got the CHOPS, the SOPS, the POPS, and the DOPS (and Python) to help deliver on time and under budget. I'm on the market and well rested, ready to chew through the latest vfx challenges with a committed crew. 8 years of Houdini and 17 years in the vfx industry. Check out my reel over at: The breakdown's at: https://docs.google....-bh63-CoVcY/pub Resume: https://docs.google....L64OrmVZsr8/pub My LinkedIn, with all sorts of accolades from old workmates: http://www.linkedin.com/in/clearmenser And of course there's all my helpful comments here on odForce: http://forums.odforce.net/profile/1210-kleer001/ edit: added newest reel
  5. 1 point
    The journal (renamed the Changelog) is now available natively on the new website: http://www.sidefx.com/changelog/ This should work more effectively and won't require a redirect to the old site. Thanks for your patience as we migrate to the new site. Downloads and the Store are now being worked on! Robert
  6. 1 point
    "Imagine never having to set up mattes in Maya ever again!" Yes!!!
  7. 1 point
    This might be a stupid question, but I'm wondering what renderers (if any) use previous-frame data for precalculation of the current frame. And since I suspect the first question is going to be "Why would you want that?": it's in regard to estimating sequence render times, something you can't really do based on the first N frames unless you have a quite unchanging scene.
  8. 1 point
    There's a few ways to tackle this. One way is to put down a pop kill, use the drop-down to choose kill by condition, and modify the example to kill if the colour is black:

    dead = (@Cd.r == 0) ? 1 : 0;

    (btw, that's a shortcut way of writing this, which you might be more familiar with:)

    if (@Cd.r == 0) {
        dead = 1;
    } else {
        dead = 0;
    }

    Another way that a workmate showed me, which I'm using more and more, is to keep as much control in sops before the popnet. The default behaviour of the pop source is to scatter over the input geo; but if you change the mode to 'all points', then it's up to you to feed it points that move around like the pop source scatter does. A scatter sop can do this. The advantage now is that you can use all your usual sop tricks and get non-sim-based, very accurate control of your emission. In the attached example I made @density based on colour and have a scatter sop driven by density, but was surprised to find that even with density at 0, it still scatters points. No matter. A blast that kills points where @density=0 fixes that, and can feed that directly to the pop sim. Setting the seed on the scatter to @Frame so it's constantly jittering, and we're done. This might cause issues if you're emitting really fast particles and need nicely distributed emissions in sub-frame regions, but it's never bothered me so far.

    -matt

    attrib_as_emit_value_fix.hipnc
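    The density-driven emission above can be sketched outside Houdini. This is a minimal Python illustration (not VEX), assuming a hypothetical list of (position, density) points standing in for the SOP geometry; the rejection of density-zero points mimics the blast, and reseeding per frame mimics setting the scatter seed to @Frame:

```python
import random

def scatter_by_density(points, frame):
    """Emit scatter positions whose local count is proportional to a
    per-point density attribute, reseeded every frame so the
    distribution jitters like the seed = @Frame trick."""
    rng = random.Random(frame)  # fresh seed per frame
    emitted = []
    for pos, density in points:
        if density <= 0.0:
            continue  # the 'blast' step: never emit where density is 0
        # emit up to 3 copies, each with probability equal to density
        for _ in range(3):
            if rng.random() < density:
                jitter = tuple(c + rng.uniform(-0.1, 0.1) for c in pos)
                emitted.append(jitter)
    return emitted
```

    The same frame number always yields the same points, while consecutive frames jitter, which is the behaviour the scatter-seed trick gives in SOPs.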
  9. 1 point
    You mean like displaying an ETA for the current frame using data from previous frames? I don't think that's a job for the renderer. Frames can even be rendered in two different locations, and simultaneously, so this info isn't present yet. It's a job for the render manager, and some of them can do a decent job of estimating time of arrival for sequences, although, as you mentioned, neither CPU time nor RAM usage can be trivially estimated from a single frame. Some systems send random frames from a sequence to the farm, estimate statistics, and assign jobs at night to pools of machines based on those facts. For example, they can save some licenses this way by assigning multiple jobs that don't saturate RAM to a single machine. As for using actual render samples from previous frames, that's a completely different topic, so I hope you didn't mean that...
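    The probe-frames approach can be sketched in a few lines. A minimal Python illustration, assuming you've already timed a handful of randomly chosen frames (the function name and signature are made up for the example):

```python
import statistics

def estimate_sequence_time(sampled_seconds, total_frames):
    """Estimate a sequence's total render time from a few probe frames,
    the way a render manager might before scheduling the whole job.

    sampled_seconds: per-frame render times measured on random probe frames.
    Returns (estimated_total, per_frame_mean, per_frame_stdev); the stdev
    hints at how much the scene changes across the sequence."""
    mean = statistics.mean(sampled_seconds)
    stdev = statistics.stdev(sampled_seconds) if len(sampled_seconds) > 1 else 0.0
    return mean * total_frames, mean, stdev
```

    A high stdev relative to the mean is a sign the probe frames disagree and the estimate should not be trusted, exactly the "unchanging scene" caveat from the question.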
  10. 1 point
    This post has a wrangle that does a similar thing: http://forums.odforce.net/topic/25884-houdini-dop-inherit-velocity/#comment-150391
  11. 1 point
    Try this: throw down an attribute wrangle inside of your sop solver. Make sure the "current frame" is in the first input and the "previous frame" is in your second input.

    f@attribute = max(f@attribute, point(1, "attribute", @ptnum));

    This should work if your values are just between 0 and 1 for your point attribute (note the float binding; an int attribute would truncate values between 0 and 1). Let me know if you need an explanation of the code.
  12. 1 point
    Well, I solved my own problem. For anyone curious, the key was using the stored pcx, pcz and tangentu attributes. A little attribute point wrangle did the trick:

    float angleT = -$PI/2 + atan(v@tangentu.x, v@tangentu.z);
    vector pivot = set(@pcx, @P.y, @pcz);
    float dist = distance(@P, pivot);
    float r = atan(@P.x - pivot.x, @P.z - pivot.z);
    @P.x = pivot.x + dist * sin(r + angleT);
    @P.z = pivot.z + dist * cos(r + angleT);
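    The wrangle rotates each point around a vertical axis through the stored pivot, by an angle derived from the stored tangent. The same math in a minimal Python sketch (names and tuple layout are made up for the example; tangent carries only the x and z components the wrangle reads):

```python
import math

def rotate_about_pivot(p, pivot, tangent):
    """Rotate point p around the vertical axis through `pivot`, in the
    XZ plane, by the angle derived from the tangent direction.
    p and pivot are (x, y, z) tuples; tangent is (tx, tz)."""
    angle_t = -math.pi / 2 + math.atan2(tangent[0], tangent[1])
    dx, dz = p[0] - pivot[0], p[2] - pivot[2]
    dist = math.hypot(dx, dz)            # radius in the XZ plane
    r = math.atan2(dx, dz)               # current polar angle
    return (pivot[0] + dist * math.sin(r + angle_t),
            p[1],                        # height is untouched
            pivot[2] + dist * math.cos(r + angle_t))
```

    When the tangent points along +X, angle_t is zero and the point is unchanged; a tangent along +Z rotates it a quarter turn, matching the -$PI/2 offset in the wrangle.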
  13. 1 point
    Works well for me. keep_value_solver.hipnc
  14. 1 point
    Haha, yes indeed. That was my first job straight out of school back in '98. Oh man, I could tell stories about that gig for hours.
  15. 1 point
  16. 1 point
    Derivatives are often used in surface shading as a way to antialias. In order for texture mipmapping to work, you need to have an idea of how far apart in UV space a sample and its neighbour are. If they are less than a texel apart in UV space, you can use the highest resolution mip level. But if they're several texels apart, you want to blend all those intermediate texels together to get an approximation of the colors blurring together, and thus use a lower-resolution mip level which has pre-baked this blur in the form of a downscaled texture. In order to determine the mip level to use, the shader takes the difference of the UVs with its neighbours at each sample - the "derivative". They can also be used to determine a reference frame on a surface, which you need to do tangent-space normal mapping. At the discrete level computer graphics operates at, a derivative is simply the difference of two values over an interval - (V1 - V0) / (P1 - P0).
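    The mip-level choice described above can be written out directly: measure the UV-space step to the neighbouring sample, scale it to texels, and take log2 of the footprint. A minimal Python sketch (function and parameter names are made up for the example; a real shader uses hardware derivatives rather than an explicit neighbour):

```python
import math

def mip_level(uv, uv_neighbor, texture_size):
    """Pick a mip level from the UV-space derivative: the distance between
    a sample's UV and its neighbour's UV, scaled to texels. Level 0 is the
    full-resolution texture; each higher level is a 2x downscale, so
    level = log2(footprint in texels)."""
    du = (uv_neighbor[0] - uv[0]) * texture_size
    dv = (uv_neighbor[1] - uv[1]) * texture_size
    footprint = math.hypot(du, dv)  # texels covered between the samples
    if footprint <= 1.0:
        return 0  # samples less than a texel apart: highest resolution
    max_level = int(math.log2(texture_size))
    return min(int(math.log2(footprint)), max_level)
```

    Samples one texel apart stay at level 0; samples four texels apart land on level 2, whose texels have the blur of four full-resolution texels pre-baked in.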
  17. 1 point
    Hey thanks for the link. The paper helped a lot.
  18. 1 point
    Damn, it almost looks like you know what you are doing, Alejandro!!
  19. 1 point
    This is a good read. https://renderman.pixar.com/view/wave-effects-on-surfs-up
  20. 1 point
    I just work with the Mantra Surface internal guts as a start from the gallery, as it has all the grunt work pretty well sorted and does a good job. The Mantra Surface material has an option that allows you to wire in your own custom fresnel strength across the surface. It's in the last folder tab; choose custom. Then dive inside the shader and find the Shading Model VOP: it has an input for fresnel right near the top where you can wire in your own reflection-strength logic across the surface, and it expects a value from 0-1.

    The Alpha Mix VOP internally does a dot product between the normalized I (eye) and N (surface normal) to give you a nice 0-1 roll, with control for the strength of surfaces whose normal is parallel to the eye (alpha para) and surfaces whose normal is perpendicular to the eye (alpha perp). It also has a nice roll-off parameter that controls how quickly para rolls in to perp. Good for starting up a fake fresnel set-up.

    Or you can roll your own method from scratch and not use the Gallery Mantra Surface's custom fresnel input (you can see that I am steering you to modify the Mantra Surface). First start with a physically correct Fresnel reflection using ior (which is 1/eta): Then this set-up with the Alpha Mix VOP as above: And yet another incarnation using a Ramp VOP being fed the Alpha Mix set to do a straight dot product of normalized I and N, which btw can be fed as-is in place of the Alpha Mix into the Mantra Surface: There are so many different incarnations of this that it's ultimately up to you how you want to manage the way the light is built up on the surface, including reflected light multiplied by a surface roll-off however you wish to construct it. If you want to perturb the reflected light, it is best to disturb the surface normals with some noise feeding in to the Alpha Mix, and make sure that it is normalized first.

    This is an old-school hack for me going back to 1993 when I first encountered it (with a Point SOP no less), where you add slight amounts of noise to the surface normals to alter the reflections. Nice for realistic disturbed mirror reflections on glass buildings. fresnel_reflection_examples.hip
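    The Alpha Mix facing-ratio idea described above is easy to sketch outside VOPs. A minimal Python illustration, assuming made-up function names (this mimics the described behaviour, not the VOP's actual internals): the dot product of normalized I and N gives a 0-1 roll, a rolloff exponent shapes how fast para blends into perp, and the result drives a mix of the two strengths:

```python
import math

def facing_ratio(I, N, rolloff=1.0):
    """Fake-fresnel strength from the dot product of the normalized eye
    vector I and surface normal N: 0 where the normal faces the eye,
    1 at grazing angles, with a rolloff exponent controlling how
    quickly para rolls in to perp."""
    def norm(v):
        l = math.sqrt(sum(c * c for c in v))
        return tuple(c / l for c in v)
    I, N = norm(I), norm(N)
    facing = abs(sum(a * b for a, b in zip(I, N)))  # 1 = head-on
    return (1.0 - facing) ** rolloff

def mix(alpha_para, alpha_perp, t):
    """Blend reflection strength between facing (para) and grazing
    (perp) surfaces using the facing-ratio value t."""
    return alpha_para + (alpha_perp - alpha_para) * t
```

    Wired into the Mantra Surface's custom fresnel input, the mix output would be the 0-1 reflection strength; perturbing N with noise before the dot product gives the disturbed-mirror look mentioned above.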
  21. 1 point
    It would actually be nice if I attached the file to the previous post. multi_fresnel_example.hip