gui

Members
  • Posts

    222
  • Joined

  • Last visited

  • Days Won

    1

gui last won the day on September 4, 2018

gui had the most liked content!

About gui

  • Birthday 12/31/1983

Personal Information

  • Name
    Guilherme
  • Location
    São Paulo, Brazil

Reputation

  28

  1. Look for "packages" in the help. It's cleaner and will solve this bug.
  2. Yeah, but that shouldn't be a problem. For other reasons, I changed the way I edit the HOUDINI_PATH env variable: I left houdini.env at its defaults and used the new approach, with "packages", instead. It's cleaner and didn't generate any permission errors.
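     A minimal sketch of such a package file, if I remember the syntax right: a JSON file saved somewhere like $HOUDINI_USER_PREF_DIR/packages/mytools.json (the file name and tools path below are placeholders):

         {
             "path": "$HOME/houdini_tools"
         }

     The "path" key adds the directory to HOUDINI_PATH while Houdini keeps its default search paths, so no trailing "&" is needed and houdini.env stays untouched.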
  3. I figured it out. Just adding a space before the "&" did the trick.
  4. Aren't you setting up the default search folders with "&"? If I don't append the "&" at the end, I can't get Houdini to work, but otherwise I can't save the presets...
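     For reference, the houdini.env pattern in question looks something like this (the custom directory is a placeholder); the trailing "&" expands to Houdini's default search path, which is why dropping it breaks Houdini:

         HOUDINI_PATH = "/path/to/my/tools;&"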
  5. Hi, sorry for the late reply, but I didn't have much time to review that. I put some notes in the file, but in a nutshell, I just used a parent blend constraint to match the movement. The idea is to use Python to automate the constraint creation. ap_mocap_retarget_share_v003.hiplc
  6. Send me a file and I will show how to do it.
  7. The overall idea is to "bind" the T-poses and follow each bone's rotation from there. There are more intelligent approaches out there, but they are far harder to implement. A simple parent constraint from the T-pose should work. In my setup, I did that using Python objects to store the transformations and a Python asset to automate the placing of constraints, but it still needs human input to link each source bone to its target. One note: I don't like FBX very much, because it tries to mimic a joint approach and puts the deformations in the nulls instead of bones. It sometimes gives strange gimbal problems. My usual workflow is to convert the FBX to BVH and then use mcbiovision to create a clean Houdini rig internally.
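     A minimal Python sketch of that bind-and-follow idea, assuming the T-pose world transforms were captured beforehand as hou.Matrix4 (e.g. with worldTransform() on the bind frame); the function and variable names are placeholders, and it uses Houdini's row-vector convention (p' = p * M):

         def retarget_world_xform(src_bind, src_cur, tgt_bind):
             """Return the target bone's new world transform (hou.Matrix4)."""
             # World-space delta of the source bone since its T-pose...
             delta = src_bind.inverted() * src_cur
             # ...replayed on top of the target bone's T-pose.
             return tgt_bind * delta

     Driving a target null each frame would then be something like tgt_node.setWorldTransform(retarget_world_xform(src_bind, src_node.worldTransform(), tgt_bind)).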
  8. Hi @xquisite, unfortunately I don't have plans to release it in the near future, since it needs a major rewrite of the code and some tests using CHOPs instead of Python to do the retargeting. Most of my code was for handling errors that came from the CMU mocaps and for automating stuff. The core retargeting logic I built is just linear algebra, so nothing "intelligent" for retargeting here. If you need help building the retarget, just drop the scene and I will help you.
  9. For the folks out there looking for a way to do motion retargeting inside Houdini, here's a teaser for a tool that has been working for 2 years now! In a couple of weeks I will share more info about how it works; I don't have plans to release it, but I will help anyone who wants to do the same. The tool doesn't have a fancy algorithm, but it works very well. https://vimeo.com/299246872
  10. I didn't understand your equation. If you are using positions D, E and F, you should multiply them by (DEF)^-1 * (ABC) to get positions A, B and C. If you are using positions A, B and C to go to D, E and F, your math should work; just check that your matrix is correct (use a Make Xform VOP node and compare them).
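     As a sketch of that in Houdini's row-vector convention (p' = p * M), where DEF and ABC are the hou.Matrix4 frames built from each triple of positions:

         def def_to_abc(p, DEF, ABC):
             # Undo the DEF frame, then apply the ABC frame.
             return p * DEF.inverted() * ABC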
  11. Last week a guy asked in the Brazilian Houdini group on Facebook how to simulate colored smoke. I believe there are lots of hip files with this kind of effect, but while thinking about it, it came to my mind that I could use CMYK instead of RGB, since CMYK is more suitable for mixing colored things other than light. I couldn't spend more time testing it or improving the file, but it seems to work, so here's the hip file. colored_smoke_V002.hip
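     For reference, the standard RGB/CMYK round trip the idea relies on, as a Python sketch (the hip file presumably does the equivalent in VOPs): mix the smoke colors in CMYK, then convert back to RGB for rendering.

         def rgb_to_cmyk(r, g, b):
             k = 1.0 - max(r, g, b)
             if k >= 1.0:  # pure black: avoid division by zero
                 return 0.0, 0.0, 0.0, 1.0
             return ((1.0 - r - k) / (1.0 - k),
                     (1.0 - g - k) / (1.0 - k),
                     (1.0 - b - k) / (1.0 - k),
                     k)

         def cmyk_to_rgb(c, m, y, k):
             return ((1.0 - c) * (1.0 - k),
                     (1.0 - m) * (1.0 - k),
                     (1.0 - y) * (1.0 - k))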
  12. As far as I understand, shader properties are evaluated as a whole for the shader, so plugging in attributes that are calculated as the rays are being fired won't work there. To make that work, e.g. if you need to disable displacement for objects that are far from the camera, you will need to use material stylesheets. Look for the first part of the stylesheets webinar with Jeff Wagner; he teaches how to do that!
  13. Perhaps SSS in Arnold is faster than in Mantra, since this is Houdini's first implementation of path-traced SSS. That said, last month I rendered a fluid sim with refractions, volume and SSS, and the frame times were faster than Arnold's, so perhaps it comes down to how you track the noise and which tricks (SSS, volume, refractions, and so on) are being used in the scene.
  14. You can use mcbiovision to convert BVH files into a Houdini script that generates the node structure, plus a bclip file with the animations. A French guy built an asset that does batch conversion with Python, but I'm on an iPad right now and don't have it here. You will have to build the skin inside Houdini and will be tied to the CMU rig; I had to build a custom retargeting system to overcome that. I hope to upload a demo video this month. You will have to clean and correct most of the data, since it has lots of problems, like foot sliding and floating around. One option would be to do this inside Autodesk MotionBuilder and export as an FBX file.
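     A minimal batch-conversion sketch, driving the mcbiovision textport command from Python with hou.hscript(); the argument order is from memory, so check "help mcbiovision" in the textport first, and the paths are placeholders:

         import glob, os
         import hou

         for bvh in glob.glob("/path/to/mocap/*.bvh"):
             base = os.path.splitext(bvh)[0]
             # Writes a .cmd script that rebuilds the rig plus a .bclip with the motion.
             out, err = hou.hscript("mcbiovision %s %s.cmd %s.bclip" % (bvh, base, base))
             if err:
                 print("failed on %s: %s" % (bvh, err))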
  15. Do both machines have the same Houdini version? (HClient and Houdini)