Popular Content

Showing most liked content on 10/23/2015 in all areas

  1. 3 points
    HACK ALERT!! In reply to comments on the BSDF bonanza thread: you can use a very, very simple hack to trick the path tracer into using an irradiance cache, of sorts. In the past I've done a lot of point cloud baking for this purpose, in combination with a lot of other Indirect Light VOP hackery, in order to do things like the GI Light's distance threshold. That worked, but the (now deprecated) Indirect Light VOP hacks made rendering much slower whenever more than one bounce was needed and the irradiance cache was not in use, and sometimes the pcwrite step would randomly skip objects for reasons I never got to the bottom of. This time I'm using ptex baking (H15), but I suppose it could be anything.

    Since the GGX thread post I've had what currently seems to be a much better/simpler idea than my old approach, without any real modification to the path tracer. The hack is: plug the ptex bake into Ce and zero F on indirect rays (not the PBR Lighting input). Despite zeroing F for indirect rays, Ce is still, magically, seen by, erm, indirect rays. Of course there are lots of practical implications and associated further hackery:
    - You need to wedge-bake the entire scene (though it could maybe be selective, e.g. caching just the walls), auto-filling each shader's cache file path.
    - Just discovered that baking doesn't work on packed geo!!
    - You don't want to kill indirect reflection beyond the first bounce. That leads to needing pre-multiplied, separate F components until they arrive in Compute Lighting, which in turn means making your own shader struct and all the layer-blending tools that go with it. OR (I really, really hope someone knows how to do this and shares it here): make an is_diffuse/reflection/refraction/etc. ray VOP.

    I have a hunch that the best way to do irradiance caching in general might be to voxelize the scene: not only because it would provide a cache for shading as we are doing here, but also because we (meaning SideFX or other brainy people) could then start looking at things like cone tracing (as NVIDIA is doing for real-time GI). The best thing, though (really dreaming here), is that it could remove geometry complexity from the ray-tracing problem. Basically the voxels would become a geometry LOD: if the cone angle is big and the distance is more than, say, 2 or 3 voxels, all that expensive ray-intersection work would run against the level set instead. Forests, displacements, bazillions of polys, reduced down to voxels. I think this might work because I've been doing it in essence for years, though limited to hair: hide the hair from all ray scopes and use a volume representation to attenuate/shadow the direct/indirect light reaching the hair (nice, soft, SSS-looking results). But! I hear some say it will be blurry because the volume lacks definition, so there is also a simple illuminance loop to trace `near` shadows and add some texture/definition back in. It's fast, even compared to today's much faster hair rendering in Mantra, and arguably better looking (the SSS/attenuation effect, especially if the volume shader casts colour-attenuated shadows), but there is the hassle of generating the volumes, even when automated as much as possible without SESI intervention.

    Timings: 1m11s cached vs 2m26s regular, using the Principled Shader. By the way, a test on this simple scene suggests the GI Light works (here!), but it is way, way brighter than brute-force PBR, and yes, I also had grief with the GI Light sometimes not writing the actual cache file. irrcache_v003.hip
  2. 3 points
    Hi all! I just wanted to share the latest VFX job Gimpville did on The Wave (Norwegian disaster movie). Houdini was used for all the simulations and rendering, and we couldn't have done this in any other software. Ole Geir Eidsheim and I were responsible for all the Houdini magic on this show. It's all FLIP/POPs/SOPs/Bullet/Pyro and Mantra awesomeness! Hope you all like it! Enjoy!
  3. 2 points
    Haven't posted my work here in a long time, but here's a hip file in which I used Python to implement the divide-and-conquer closest pair of points algorithm. Obviously not nearly as fast as a pcopen lookup, but I wanted to learn about the divide-and-conquer paradigm and had a lot of fun figuring it out! The code runs in O(n*log(n)) time, as explained in the following video: Enjoy! Comments are always welcome. And no, I don't script in C++, no need to tell me that it's faster. divide_conquer_find_closest.hipnc
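    For anyone curious what the divide-and-conquer approach looks like outside the hip file, here is a minimal standalone Python sketch of the classic closest-pair algorithm (this is not the code from the attached file; the function names are my own):

```python
import math

def closest_pair(points):
    """Divide-and-conquer closest pair of 2D points, O(n log n).
    Returns (distance, (p, q))."""
    n = len(points)
    ix = sorted(range(n), key=lambda i: points[i])     # indices sorted by x
    iy = sorted(range(n), key=lambda i: points[i][1])  # indices sorted by y

    def dist(i, j):
        return math.hypot(points[i][0] - points[j][0],
                          points[i][1] - points[j][1])

    def rec(ix, iy):
        m = len(ix)
        if m <= 3:  # base case: brute force the few remaining points
            best = (float("inf"), None)
            for a in range(m):
                for b in range(a + 1, m):
                    d = dist(ix[a], ix[b])
                    if d < best[0]:
                        best = (d, (points[ix[a]], points[ix[b]]))
            return best
        mid = m // 2
        midx = points[ix[mid]][0]
        left = set(ix[:mid])
        ly = [i for i in iy if i in left]      # keep y-order on each side, O(m)
        ry = [i for i in iy if i not in left]
        best = min(rec(ix[:mid], ly), rec(ix[mid:], ry), key=lambda t: t[0])
        # candidates within best distance of the dividing line, in y-order
        strip = [i for i in iy if abs(points[i][0] - midx) < best[0]]
        for a in range(len(strip)):
            # only a constant number (~7) of strip neighbours can be closer
            for b in range(a + 1, min(a + 8, len(strip))):
                d = dist(strip[a], strip[b])
                if d < best[0]:
                    best = (d, (points[strip[a]], points[strip[b]]))
        return best

    return rec(ix, iy)
```

    The key trick (and the reason it stays O(n log n)) is partitioning the y-sorted index list with a set-membership filter instead of re-sorting on every recursion.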
  4. 1 point
    Hey, First off, thank you for taking the time to read this and help me out. I'm a bit unhappy with the flexibility of the Voronoi Fracture Configure Object node (from Make Breakable), so I found this: From what I understand, he is piping the first-fractured pieces through a For-Each node and adding a centroid point, and that's it. My problem is figuring out how that centroid point can be divided into more points (e.g. over time), thereby also re-fracturing the current piece into tinier bits. If I didn't explain something properly, please do let me know. Thank you!
  5. 1 point
    You are right.
    int a = 3, b = 2;
    i@i = a / b;            // integer division: 1
    f@f = (float) a / b;    // float division: 1.5
    No need to cast both sides of the expression to float, by the way: the right side will be converted implicitly. There will be an Implicit Bingo in a single wrangle.
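    For comparison, the same int-versus-float distinction sketched in Python (a rough analogue only: Python 3's / is always true division, so // plays the role of VEX's integer /):

```python
a, b = 3, 2

# Integer (floor) division -- what VEX's a / b does when both operands are int:
print(a // b)        # 1

# Casting one operand is enough; the other side is promoted implicitly,
# just as in the VEX snippet above:
print(float(a) / b)  # 1.5
```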
  6. 1 point
    Another way, which can sometimes be easier to read, is dot syntax: @Cd = 0; @Cd.r = @displaceValue; @Cd is one of the few special values that automatically converts itself to a vector, so @Cd = 0; gets converted to @Cd = {0,0,0}; You can also write the suffix as r/g/b, or x/y/z, or use an array index; all are useful depending on the circumstance. I.e. @Cd.x = 1; @Cd.r = 1; and @Cd[0] = 1; are all the same, and will all set the point's red component to 1.
  7. 1 point
    You can't have variables inside curly braces like that; brace literals only work with constants. You would have to use the set() function instead: @Cd = set(@displaceValue, 0, 0);
  8. 1 point
    You need to put the light source behind monitor to see the rest of the help page.
  9. 1 point
    WTF is this BS? So Nuke and Fusion open the Houdini (particles/points only) Alembic files without any issues, but Maya just doesn't import particles/points, it seems. I also tried cleaning the attributes, and when that didn't work, even tried copying an Add SOP point onto each particle so it's really stripped of all particle-related attributes, but no difference. No clue what is going on. Importing Alembic meshes works great, though.
  10. 1 point
    I have assembled a simple example for you to start with. For soft body effects you should push it a bit further. Edit: and another example, of cutting thin soft body sheets. It is even simpler. cuttingSys_1_.hip cuttingSys_2_.hip
  11. 1 point
    Sooner or later you will need a decent orthogonal frame of reference, either as a matrix3 or as orient (a quaternion). N, up etc. are good for basic copy/instancing. How you get this frame is another problem, depending on your case: the PolyFrame SOP, computed manually, etc. Using orient seems to be the most straightforward, but it's just another version of the N and up combo (cleaner, because you can do math on it, either with quaternions or with matrices). Basics attached. dancing_teapots.hip
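    As a rough illustration of what a matrix3/orient frame encodes, here is a minimal Python sketch that builds an orthonormal frame from N and an up vector. The helper is my own and the axis convention is an assumption for illustration, not Houdini's exact instancing convention:

```python
import math

def frame_from_n_up(n, up):
    """Build an orthonormal frame (x, y, z row vectors) from a normal N
    and an approximate up vector: z = N, x = up x z, y = z x x.
    A sketch of the general idea only."""
    def norm(v):
        l = math.sqrt(sum(c * c for c in v))
        return tuple(c / l for c in v)

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    z = norm(n)
    x = norm(cross(up, z))  # perpendicular to both up and N
    y = cross(z, x)         # unit length already: z and x are unit and orthogonal
    return (x, y, z)
```

    A frame like this is exactly the "cleaner" representation the post mentions: as three orthonormal rows it is a matrix3 you can multiply and invert, and the same rotation can equally be stored as an orient quaternion.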