
gui

Members
  • Content count

    218
  • Donations

    0.00 CAD 
  • Joined

  • Last visited

  • Days Won

    1

gui last won the day on September 4 2018

gui had the most liked content!

Community Reputation

25 Excellent

About gui

  • Rank
    Initiate
  • Birthday 12/31/1983

Personal Information

  • Name
    Guilherme
  • Location
    São Paulo, Brazil
  1. Motion Retargeting Tool

    Hi, sorry for the late reply, but I didn't have much time to review that. I put some notes in the file, but in a nutshell, I just used a parent blend constraint to match the movement. The idea is to use Python to automate the constraint creation. ap_mocap_retarget_share_v003.hiplc
  2. Motion Retargeting Tool

    Send me a file and I will show you how to do it.
  3. Motion Retargeting Tool

    The overall idea is to "bind" the T-poses and follow each bone's rotation from there. There are more intelligent approaches out there, but they are far harder to implement. A simple parent constraint from the T-pose should work. In my setup, I did that using Python objects to store the transformations and a Python asset to automate the placement of the constraints, but it still needs human input to link each source bone to its target. One note: I don't like FBX very much, because it tries to mimic a joint approach and puts the deformations in the nulls instead of in bones, which sometimes gives strange gimbal problems. My usual workflow is to convert the FBX to BVH and then use mcbiovision to create a clean Houdini rig internally.
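    The "bind the T-poses and follow each bone from there" idea can be sketched in plain numpy (a minimal illustration with made-up names and a column-vector convention, not the tool's actual code): at bind time you record each bone's T-pose transform, and per frame you apply the source bone's world-space delta from its own T-pose to the target bone.

    ```python
    # Minimal retarget sketch: drive a target bone with the source bone's
    # world-space delta from its T-pose. Column-vector convention (p' = M @ p);
    # in Houdini VOPs/VEX the row-vector order would be reversed.
    import numpy as np

    def rot_z(deg):
        """4x4 rotation about Z, used here to build test transforms."""
        r = np.radians(deg)
        c, s = np.cos(r), np.sin(r)
        m = np.eye(4)
        m[:2, :2] = [[c, -s], [s, c]]
        return m

    def retarget(src_tpose, src_frame, tgt_tpose):
        """Apply the source bone's delta from its T-pose to the target bone."""
        delta = src_frame @ np.linalg.inv(src_tpose)
        return delta @ tgt_tpose

    # At the T-pose the delta is identity, so the target stays on its own T-pose:
    tgt_tpose = np.eye(4)
    tgt_tpose[0, 3] = 2.0  # target bone sits elsewhere in world space
    assert np.allclose(retarget(np.eye(4), np.eye(4), tgt_tpose), tgt_tpose)

    # A 90-degree source rotation gives the target the same world-space rotation:
    out = retarget(rot_z(0), rot_z(90), tgt_tpose)
    assert np.allclose(out[:3, :3], rot_z(90)[:3, :3])
    ```

    In the actual setup this math would be wrapped in the constraint network; the linear algebra itself is the whole trick.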
  4. Motion Retargeting Tool

    Hi @xquisite, unfortunately I don't have plans to release it in the near future, since it needs a major rewrite of the code and some tests using CHOPs instead of Python to do the retargeting. Most of my code was for handling errors that came from the CMU mocaps and for automating things. The core retargeting logic I built is just linear algebra, so nothing "intelligent" for retargeting here. If you need help building the retarget, just drop the scene here and I will help you.
  5. For the folks out there looking for a way to do motion retargeting inside Houdini, here's a teaser for a tool that has been working for 2 years now! In a couple of weeks I will share more info about how it works. I don't have plans to release it, but I will help anyone who wants to do the same - the tool doesn't have a fancy algorithm, but it works very well. https://vimeo.com/299246872
  6. Matrix Transformation in VOP

    I didn't understand your equation. If you are using positions D, E and F, you should multiply them by (DEF)^-1 * ABC to get positions A, B and C. If you are using positions A, B and C to go to D, E and F, your math should work; just check that your matrix is correct (use a Make Transform VOP node and compare them).
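    The "(DEF)^-1 * ABC" step can be illustrated in plain numpy (a hedged sketch with invented helper names, using a column-vector convention; the VOP row-vector order is reversed): build a 4x4 frame from each point triple, then map points out of one frame and into the other.

    ```python
    # Map a point from the frame defined by points D,E,F into the frame
    # defined by A,B,C: p' = M_abc @ inv(M_def) @ p (column vectors).
    import numpy as np

    def frame_from_points(a, b, c):
        """4x4 frame: origin at a, X axis toward b, Z = X cross (c - a)."""
        a, b, c = map(np.asarray, (a, b, c))
        x = (b - a) / np.linalg.norm(b - a)
        z = np.cross(x, c - a)
        z = z / np.linalg.norm(z)
        y = np.cross(z, x)
        m = np.eye(4)
        m[:3, 0], m[:3, 1], m[:3, 2], m[:3, 3] = x, y, z, a
        return m

    def remap(p, src, dst):
        """Express p in frame `src`, then place it in frame `dst`."""
        m = dst @ np.linalg.inv(src)
        return (m @ np.append(p, 1.0))[:3]

    src = frame_from_points([0, 0, 0], [1, 0, 0], [0, 0, 1])
    dst = frame_from_points([5, 0, 0], [6, 0, 0], [5, 0, 1])  # src shifted +5 in X
    assert np.allclose(remap([1, 2, 3], src, dst), [6, 2, 3])
    ```

    Comparing such a matrix against a Make Transform VOP output is a quick way to catch axis-order mistakes.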
  7. Colored Smoke with CMYK

    Last week someone asked in the Brazilian Houdini group on Facebook how to simulate colored smoke. I believe there are lots of hip files with this kind of effect, but while thinking about it, it occurred to me to try CMYK instead of RGB, since CMYK is more suitable for mixing colored things other than light. I couldn't spend more time testing it or improving the file, but it seems to work, so here's the hip file. colored_smoke_V002.hip
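    The CMYK idea can be sketched in a few lines of pure Python (a minimal illustration using the naive device-level RGB/CMYK conversion, not the actual hip file's setup): averaging two colors in CMYK space and converting back gives a different, more pigment-like result than averaging the same colors in RGB, whenever the black (K) channel is involved.

    ```python
    # Naive RGB <-> CMYK conversion and a 50/50 mix done in CMYK space.
    # Because K is extracted with max(), the round trip is nonlinear, so
    # mixing in CMYK differs from mixing in RGB when K is nonzero.

    def rgb_to_cmyk(r, g, b):
        k = 1.0 - max(r, g, b)
        if k >= 1.0:
            return 0.0, 0.0, 0.0, 1.0  # pure black
        return ((1 - r - k) / (1 - k),
                (1 - g - k) / (1 - k),
                (1 - b - k) / (1 - k), k)

    def cmyk_to_rgb(c, m, y, k):
        return ((1 - c) * (1 - k), (1 - m) * (1 - k), (1 - y) * (1 - k))

    def mix_cmyk(rgb_a, rgb_b):
        """Average two RGB colors in CMYK space, return the result as RGB."""
        a, b = rgb_to_cmyk(*rgb_a), rgb_to_cmyk(*rgb_b)
        return cmyk_to_rgb(*[(x + y) / 2 for x, y in zip(a, b)])

    # Dark red mixed with white: RGB averaging gives (0.75, 0.5, 0.5),
    # while the CMYK mix keeps more saturation: (0.75, 0.375, 0.375).
    mixed = mix_cmyk((0.5, 0.0, 0.0), (1.0, 1.0, 1.0))
    assert all(abs(a - b) < 1e-9 for a, b in zip(mixed, (0.75, 0.375, 0.375)))
    ```

    In the sim, the same idea would mean advecting four fields (C, M, Y, K) instead of three and converting to RGB only at render time.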
  8. Controlling shader properties

    As far as I understand, shader properties are evaluated once for the whole shader, so plugging in attributes that are calculated as the rays are being fired won't work there. To make that work - for example, to disable displacement for objects that are far from the camera - you will need to use material stylesheets. Look for the first part of the stylesheet webinar with Jeff Wagner; he teaches how to do that!
  9. Perhaps SSS in Arnold is faster than Mantra's, since it's Houdini's first implementation of path-traced SSS. That said, last month I rendered a fluid sim with refractions, volume and SSS, and frame times were faster than Arnold's, so perhaps it comes down to how you track the noise and which tricks (SSS, volume, refractions, and so on) are being used in the scene.
  10. Houdini 15 and mocap data

    You can use mcbiovision to convert BVH files into a Houdini script that generates the node structure, plus a bclip file with the animations. A French guy built an asset that does batch conversion with Python, but I'm on an iPad right now and don't have it here. You will have to build the skin inside Houdini and will be tied to the CMU rig. I had to build a custom retargeting system to overcome that; I hope to upload a demo video this month. You will have to clean and correct most of the data, since it has lots of problems, like foot sliding and floating around. One option would be doing this inside Autodesk MotionBuilder and exporting as an FBX file.
  11. Do both machines have the same Houdini version? (HClient and Houdini)
  12. Cracks are visible in render before it happen

    Hi. The problem doesn't seem to happen here.
  13. There isn't a "v" attribute in the source geometry. In that case, you should toggle on "Add Velocity" in the "create_density_volume" node in SOPs. SmokeTest2.hipnc
  14. FLIP Fluid Masterclass

    Hi Robert, can you share some of the hip files that were used in the class? Thanks!
  15. Try toggling it off. The geo probably doesn't have velocity, so it won't get motion blur.