About kodiak

  • Birthday 03/04/1978

  1. OpenGL renderer

    Hey Mondi, just out of curiosity, have you been part of the demoscene before? I'm only suspicious because of the subject/look of the images, the fact that you're coding the engine yourself, and also that you're Scandinavian (where it was so popular... still is). I was contemplating a similar idea about a year ago: create a classic demoscene look with Houdini, but try to push it to another level of complexity. Looks like you're on the right track with this; I'll be curious how it turns out. Cheers, Andras
  2. African Buffalo

    It's an Australian golden silk orb-weaver (Nephila edulis).
  3. Autodesk Buys Softimage XSI

    Muhaha... at least something interesting happens while we live through these couple of boring days, until it all settles down and no one cares anymore. Back to your shots, ladiez!
  4. Maya 2009

    And what about being paid to do your job at a lower-than-acceptable (in my opinion) efficiency? I'm a tad tired of preaching; I'm not sure I really care to spread the word anymore.
  5. seams in NURBS surfaces

    Hey Sergio, I seem to remember conversations about IGES and Houdini single- vs. double-precision (and trimming) issues back in the 3.0 era, so there should be some information about this in the mailing list archive. Tessellation shouldn't be a problem: I remember rendering quite heavy CAD models in PRMan exported from I-DEAS and Pro/ENGINEER (that's how I got into it in the first place), while other tessellating raytracers (Maya software, and some early versions, maybe 1.9, of mental ray hooked into Pro/E) failed on them because of those same issues. I don't think this has changed since, so Houdini might have a precision issue with some of these rather than a tessellation problem. There is also a feature in PRMan called binary dicing that sometimes helps in these cases. Oh man, dealing with NURBS feels like memories of a previous life now. Take care, Andras
  6. Maya 2009

    Stop making fun of us who are FORCED to use Maya...kthx
  7. The SSS Diaries

    Sorry for the off-topic (slightly less so than the gamma intermezzo), but don't you guys sit about 10 meters away from each other? Go on, interesting conversation.
  8. I don't think it's possible with the out-of-the-box toolkit in a pure technical sense (and that's not even considering budget and time constraints)... but that's just me. Someone could prove me wrong, if they had a year's worth of time on their hands.
  9. Mdd Reader

    I don't think it's just a Lightwave thing; it can appear in other contexts as well. Having designed and tested the in-house MDD tools Nicholas is talking about in the other thread on the SESI forum, I've also seen the need for an option to flip the Z axis. Mind you, in that environment there was no data flow from Lightwave to Houdini; rather, we had an in-house geometry exporter for Maya that would export the static meshes later used to receive the MDD point cache. Earlier we had a Maya->Lightwave pipeline as well, and the two seemed to work consistently together. It might be that Point Oven does some "smart" manipulation to make Maya and Lightwave play together better, but it's likely to be the tool used by many people intending to use the factory MDD tools in Houdini as well, so I'd say it's a good thing to have that option in general; it costs almost nothing performance-wise. PO was the tool we used at the time to generate the MDDs from Maya before we rolled our own, and I vaguely remember considering an option in our exporter to emulate that behaviour too. Sorry for being a bit hazy, but it's been over a year since I worked on this. Cheers, Andras
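For illustration, flipping the Z axis of an MDD cache is cheap because the file layout is simple. A minimal Python sketch, assuming the standard MDD layout (big-endian: frame count, point count, one time per frame, then raw XYZ floats); `flip_mdd_z` is a hypothetical helper for this post, not the Houdini or Point Oven tool:

```python
import struct

def flip_mdd_z(data: bytes) -> bytes:
    """Return a copy of an MDD point cache with every Z coordinate negated.

    Assumed MDD layout (all big-endian):
      int32 frame count, int32 point count,
      float32 time per frame,
      then frames * points * 3 float32 positions (X, Y, Z).
    """
    frames, points = struct.unpack_from(">ii", data, 0)
    offset = 8 + 4 * frames                      # skip counts + frame times
    count = frames * points * 3
    values = list(struct.unpack_from(f">{count}f", data, offset))
    for i in range(2, count, 3):                 # every third value is Z
        values[i] = -values[i]
    return data[:offset] + struct.pack(f">{count}f", *values)
```

Since the header and frame times pass through untouched, the operation really does cost almost nothing beyond one read/write of the point data.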
  10. Houdini Work

    I don't know any GOOD artist/TD who is out of a job and wants one... I guess there isn't a huge difference in this regard (I hope, at least).
  11. Linear To Light renders?

    Depends on whether you want to go crazy or not. For practical purposes, I'd say a gamma of 2.2 (or rather, a 2.2 gamma correction) should do for texturing. For viewing purposes, make sure you have a screening setup that maps this 2.2 gamma to the final screening conditions (projector/screen/print stock/lighting conditions), that you're comping in the right colour space, and that your comp display LUT is set up accordingly (as is your comp output driver). Also, something I've seen happen, for example in Nuke: when you're applying a LUT in the loader, there is an option to pre-divide + post-multiply around it. It should be on whenever an alpha is involved (i.e. a beauty pass, not necessarily the AOVs). For a more involved setup, you could try to follow this interesting article. Additional comments/corrections welcome, as it's an important but overlooked subject (especially at smaller places).
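The pre-divide + post-multiply step mentioned above can be sketched in Python. This is a hand-rolled illustration, not Nuke's implementation, and the pure 2.2 power curve stands in for whatever LUT you would actually apply:

```python
def gamma_premult(r: float, g: float, b: float, a: float,
                  gamma: float = 2.2) -> tuple:
    """Apply a gamma curve to one premultiplied RGBA pixel correctly.

    Colour operations belong on straight (unpremultiplied) colour, so we
    divide out alpha, apply the curve, then multiply alpha back in.
    """
    if a <= 0.0:
        return (0.0, 0.0, 0.0, a)         # nothing to un-premultiply
    straight = (r / a, g / a, b / a)      # pre-divide
    corrected = tuple(c ** (1.0 / gamma) for c in straight)
    return tuple(c * a for c in corrected) + (a,)  # post-multiply
```

Skipping the divide/multiply pair would gamma-correct the alpha-darkened edge pixels as if they were genuinely dark, producing the familiar bright or dark fringes on composited edges.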
  12. Linear To Light renders?

    You could do it when you're converting your texture to RAT. Alternatively you could add a color correction VOP (or equivalent shader code) after the texture and change the gamma there.
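The "convert your texture to linear" step amounts to decoding the texture's gamma curve before the renderer sees it. A minimal Python sketch, using a pure power-law 2.2 curve as an approximation of the real piecewise sRGB transfer function:

```python
def texture_to_linear(pixels: list, gamma: float = 2.2) -> list:
    """Decode gamma-encoded 8-bit texture values to linear floats.

    Pure power-law approximation; real sRGB uses a piecewise curve
    with a short linear toe near black.
    """
    return [(v / 255.0) ** gamma for v in pixels]
```

This is conceptually what baking a linearised RAT, or a colour-correction VOP set to a 2.2 gamma after the texture lookup, would do for you.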
  13. Linear To Light renders?

    Nope, you should render with a gamma of 1.0. As you said, you will need to convert your textures to linear space beforehand as well, and use a display gamma of 2.2 in MPlay. Also, the colour picker will need to be set to a gamma of 2.2 to show you what you're actually picking, by setting this environment variable (not sure if there is a newer/better way in H9): HOUDINI_COLOR_PICKER_GAMMA specifies the gamma exponent for the device-specific colour correction of the colour picker gadgets and colour parameters.
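Assuming a POSIX shell, the picker variable quoted above would be set like this (2.2 to match the display gamma described; put it in houdini.env or your shell startup as preferred):

```shell
# make Houdini's colour picker display gamma-2.2 corrected values
export HOUDINI_COLOR_PICKER_GAMMA=2.2
```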
  14. Where Are You Now?

    I'm from Hungary, was in the UK for a long while, am now in Toronto... and soon moving to the other side of the world. Where do I click?