
Applying corrective shapes with KineFX


CrsBns


I have been stuck for a couple of days trying to work out how to apply corrective shapes to a Houdini KineFX rig.

I have a rig set up with multiple rig pose nodes, including IK. At the end of posing I would like to read a value from certain joints, such as the thigh's x/y/z rotation. I could then say: if the thigh is rotated in X by a certain amount, apply this corrective blend shape. However, I run into gimbal lock issues when I get the joint's rotation from the "localtransform" attribute supplied by the rig pose node.

I have read that pose space deformations can be used, but I cannot for the life of me figure out how to use them, and there don't seem to be any tutorials on it either.

How would a professional go about applying corrective blend shapes to a kinefx rig?

Thanks.


Maybe a quick .hip file would help. There are lots of ways to do this.

-- You could fit the angle rotation from the rig pose and use that to drive the blend shape.
-- As you say, you can extract the rotational transforms of the localtransform and fit those to drive it (I haven't run into gimbal lock problems yet).
-- For knees and elbows, fitting the dot product of the upper limb (the vector between the shoulder joint and elbow joint) and the lower limb (the vector between the elbow and wrist) can be a useful way to drive the blend shape.
-- There's a joint angle VOP that can be useful.
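
The dot-product driver from the list above can be sketched outside Houdini. This is a minimal Python illustration with made-up joint positions; in a wrangle you would instead read the shoulder, elbow, and wrist point positions from the skeleton:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def bend_driver(shoulder, elbow, wrist):
    """0.0 when the limb is straight, 1.0 when fully folded back."""
    upper = normalize(tuple(e - s for s, e in zip(shoulder, elbow)))
    lower = normalize(tuple(w - e for e, w in zip(elbow, wrist)))
    # dot is 1 when straight, 0 at 90 degrees, -1 when folded
    return (1.0 - dot(upper, lower)) / 2.0
```

A straight arm (joints in a line) gives 0.0 and a 90-degree elbow bend gives 0.5, which you could feed straight into a blend shape weight.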

I have not had a chance yet to explore how to use the PSD SOPs with kinefx but if you post an example we might be able to help figure it out.



Thanks for the response. I have made a .hip file and will upload it now if you want to take a look.

For your first point, this doesn't work when there are multiple rig pose nodes, including the IK one.

On your second point: in the .hip file I am uploading, if you move between frames 1 and 2, the left hip rotates from -90 to -120 and the gimbal flips out. I have also noticed that when I get the local transform rotation before doing any posing, my rotation doesn't start at 0; it starts at something weird like 0, -90, -90. So that probably doesn't help.

I didn't know about the dot product thing; it sounds like it could be useful.

Someone on another forum suggested the joint angle VOP a year ago when I asked about this, but it still didn't work for me.

I would love to post an example of me attempting PSD in KineFX, but I don't even know where to begin. There's not much in the way of tutorials out there.

In the meantime I have come up with my own solution for getting pose-correction morphs to work, but it is very awkward and cumbersome and I don't like it. I figure I am missing a much simpler and better solution.

What I do is pose the rest skeleton with the joint rotated to where I want it when the corrective is fully activated. I then turn on the local transform parameter and use Python to calculate the rotation from the local transform matrix, saving that rotation as a detail attribute on the animated rig pose.

Then, once posing is finished, I run another Python node which looks for joints associated with corrective morphs and calculates the local rotation of the current joint.
That gives me two rotations for the joint: what it currently is, and what it should be when the corrective is fully activated.
I then make a vector in the z direction, rotate it by each rotation, and compare the distance between the two resulting vectors.
Finally, I manually adjust the radius for when I want the corrective to start, and apply the blend shape based on that.

Problems with this method: it doesn't account for twist, the joints need to be lined up in the z direction, it takes a lot of work just to set up, and it's probably slow.
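
As a sanity check of the approach described above (a rough sketch, not the actual Python from the .hip): rotate a reference vector by both the target rotation and the current rotation, then fade the corrective in as the two results converge. The +Z reference axis and the linear radius falloff are assumptions taken from the description:

```python
import math

def rotate_euler(v, rx, ry, rz):
    """Rotate vector v by Euler angles in degrees, applied X then Y then Z."""
    x, y, z = v
    rx, ry, rz = (math.radians(a) for a in (rx, ry, rz))
    y, z = y * math.cos(rx) - z * math.sin(rx), y * math.sin(rx) + z * math.cos(rx)
    x, z = x * math.cos(ry) + z * math.sin(ry), -x * math.sin(ry) + z * math.cos(ry)
    x, y = x * math.cos(rz) - y * math.sin(rz), x * math.sin(rz) + y * math.cos(rz)
    return (x, y, z)

def corrective_weight(target_rot, current_rot, radius):
    """1.0 when the joint reaches the target rotation, fading to 0 at 'radius'."""
    ref = (0.0, 0.0, 1.0)  # assumes the joint aims down +Z, as noted above
    a = rotate_euler(ref, *target_rot)
    b = rotate_euler(ref, *current_rot)
    return max(0.0, 1.0 - math.dist(a, b) / radius)
```

This reproduces the stated limitations: twist around the reference axis is invisible, and the joint must be aligned with +Z at rest.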

GimbalLockPoseDriver.zip


I did this quickly, so I'm not 100% sure it will work in all cases, but it should point you in the right direction. I mocked up two options:

1-- uses the joint angle VOP. It's a handy VOP. Basically you pipe in the joint you're interested in and then tell it which child to measure against. You can hard-wire it with another get_point_transform VOP; here I just used "0", which is the joint's first child. You can also set it to -1, which will calculate the average of all the children (which might be useful in some cases). It then returns the angle between them in radians. I just fit the output based on what I thought the min and max range should be, and it outputs a 0.0-1.0 value that I can use to drive the blend shape.
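
The "fit" step here is just a clamped linear remap (the same thing Houdini's fit does). A plain-Python equivalent, with made-up min/max angles you would tune per joint:

```python
def fit(value, old_min, old_max, new_min=0.0, new_max=1.0):
    """Remap value from [old_min, old_max] to [new_min, new_max], clamped."""
    t = (value - old_min) / (old_max - old_min)
    t = max(0.0, min(1.0, t))
    return new_min + t * (new_max - new_min)

# e.g. a joint angle of 1.2 rad remapped over an assumed 0.5-2.0 rad range
weight = fit(1.2, 0.5, 2.0)
```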

2-- uses the local transform. I multiply the local transform of the joint in rest pose with the local transform of the joint after the rig pose. That is the same as moving the rest state joint to the new (animated) position and rotation. If you extract the rotation from the localtransform, you've got how much it rotated in degrees. Fit that and you have another variable to drive the blend shape.
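
One way to reduce that rest-vs-posed comparison to a single gimbal-free number is the total rotation angle between the two matrices, taken from the trace of the relative rotation. A sketch with plain 3x3 rotation matrices rather than Houdini's matrix types (the degree range you then fit against is up to you):

```python
import math

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    return [list(row) for row in zip(*m)]

def rotation_between(rest, posed):
    """Angle in degrees of the relative rotation taking 'rest' to 'posed'."""
    # for a pure rotation matrix, the inverse is just the transpose
    rel = mat_mul(posed, transpose(rest))
    trace = rel[0][0] + rel[1][1] + rel[2][2]
    cos_angle = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    return math.degrees(math.acos(cos_angle))
```

Because this measures a single angle rather than decomposing to Euler values, it cannot flip the way an extracted X/Y/Z triplet can, but it also cannot tell you the direction of the rotation on its own.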

It's a bit late, so I didn't do the dot product, but it's the same idea as 1. You'd create a vector between shoulder and elbow (upper arm) and another between elbow and wrist (lower arm), then take the dot product. The dot product gives you a range (1 when the vectors are pointing in the same direction, 0 when perpendicular, and -1 when pointing in the opposite direction). You fit that to get your 0 - 1 range.

Try those. I think they should work, and you shouldn't get any gimbal lock problems. They don't account for twist, but they could be updated to do it.

Also, just an aside about the weird localtransform values: I don't like the orient_joints SOP; I prefer to do it manually. In the skeleton SOP, click each joint and hit "P" to open up the individual controls. There you can make sure the local transform values are what you would expect.

GimbalLockPoseDriver_v02.hipnc

Edited by madebygeoff


Thanks for taking the time to do this. I don't know much about using rig VOPs, so this is very helpful for learning how to use them.

For the joint angle VOP, I can see this being useful for things that rotate in only one direction, like elbows and knees, and the dot product sounds useful there too. It doesn't help with joints that need x, y, and z drivers, though.

The local transform is closer to what I want, but it still runs into gimbal lock very easily. I have re-uploaded the file you sent me with some extra bits added on the right-hand side that involve rotating the shoulder. A shoulder may need a corrective for when it is rotated up, down, forwards, and backwards; however, if you scrub the timeline (with Shoulder_XYZ_Output1 selected) you can see, for example, that between frames 12 and 13 the rotation output by the VOP jumps from (-107, 24, 87) to (79, -209, -82).

I also tried cleaning up the shoulder, with the result in Shoulder_XYZ_Output2. On this one the rotations all start out at 0, 0, 0, but the gimbal still flips out as you scrub the timeline.

I don't know much about manipulating matrices and vectors, but is there a way to convert the local rotation into just one axis? Like performing some maths on (79, -209, -82) to output something like y = 80, as a random example. The crude workaround I found was to apply the rotations of the rest joint and the pose joint to some arbitrary vector like (0, 1, 0) and then detect how close the results are to each other, but a non-gimbal-locked rotation method would be so much better.

Aren't quaternions supposed to avoid gimbal lock? Maybe there's some way we can get those out of the VOP?

GimbalLockPoseDriver_v02.hipnc


Some of the problems in your file were coming from the IK solver flipping due to the placement of the pole vector (twist controller). I animated the twist controller to smooth out the behavior of the elbow. But you're right that the extract transform doesn't flow well and tends toward axis flipping.

You could definitely convert the local transform to a quaternion, but a quaternion only gives you an axis of rotation and an angle to rotate around that axis, so you'd still have to convert that to Euler X,Y,Z rotational components. I'll have to think a little more about how that might work and whether it wouldn't have the same problem as the extract transform VOP.
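
For reference, extracting the axis-angle form (essentially the information a quaternion carries) from a rotation matrix looks like this. It's a sketch of the standard formula, nothing KineFX-specific, and the exact-180-degree case is deliberately left unhandled:

```python
import math

def matrix_to_axis_angle(m):
    """Unit axis and angle in degrees from a 3x3 rotation matrix."""
    trace = m[0][0] + m[1][1] + m[2][2]
    cos_angle = max(-1.0, min(1.0, (trace - 1.0) / 2.0))
    angle = math.acos(cos_angle)
    if angle < 1e-9:
        return (1.0, 0.0, 0.0), 0.0  # no rotation; axis is arbitrary
    s = 2.0 * math.sin(angle)        # note: breaks down at exactly 180 degrees
    axis = ((m[2][1] - m[1][2]) / s,
            (m[0][2] - m[2][0]) / s,
            (m[1][0] - m[0][1]) / s)
    return axis, math.degrees(angle)
```

As noted above, this still leaves you with an axis and an angle, not three independent Euler channels to drive three correctives.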

What about something like this?

It takes the dot product of the upper arm and an up vector to give you the Y component (I did this one in a wrangle, but you could recreate it in a VOP). In a quick test it produced smooth changes from 1 to -1. You'd still want to fit the value to a 0-1 range through min and max values. But you should be able to replicate this for any axis.
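
A plain-Python version of what that wrangle does, under the assumption that world up is +Y (the shoulder and elbow positions here stand in for whatever you read from the skeleton):

```python
import math

def up_component(shoulder, elbow, up=(0.0, 1.0, 0.0)):
    """Dot of the normalised upper-arm vector with 'up':
    1 pointing straight up, 0 horizontal, -1 pointing straight down."""
    v = tuple(e - s for s, e in zip(shoulder, elbow))
    length = math.sqrt(sum(c * c for c in v))
    v = tuple(c / length for c in v)
    return sum(a * b for a, b in zip(v, up))
```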

I'll think about whether there's another way to do it with the local transform matrix.

GimbalLockPoseDriver_v03.hipnc



You've given me more to play with, thanks! Though that method doesn't work if the rest of the character is rotated; it would need to use the local transform somehow.

It is similar to the method I have been playing around with for the last couple of hours, but I cannot get it to work. I have a few separate rigs that store my pose in the position I want, then I basically do some stuff with the local transform and the dot product to test how close the points are. It works pretty well, but I have run into problems with not being able to fetch rigs without plugging them into the rig VOP, which only allows 4 inputs. If you have lots of blend shapes, that means lots of rigs to read, so it will probably be too slow anyway.

I really like the idea of just getting x, y, and z vectors and fitting from there, but I'm struggling to think of a way to do it that accounts for all the rotations of the rig upstream. I'll keep working on it.


So I have been working on my alternative solution a bit more. It is still awkward to set up, but not as bad as before thanks to the rig VOP stuff you showed me. I still think there has to be a much simpler and faster way to do this: when I run the performance monitor on this method, the rig OBJ took 0.8 ms, and a full half of that was solely the rig VOP.

Anyway, I'll upload it below. Can you see any obvious problems with it, Geoff? I am hoping that asking the rig VOP to read lots of pre-made rig poses won't prove too slow, but I won't know without a full rig. Might test it with one now.

GimbalLockSolutionA.hipnc


Sorry, got a little bogged down with work, but a few thoughts:

I don't think there's anything wrong with the approach, but I agree: I don't like the idea of piping in lots and lots of poses, and it might get slow. Plus, I'd be looking for a setup where you can reuse the same VOP or wrangle on any joint.

What about using the dot product approach I suggested before, but instead of using an arbitrary up vector, using the parent joint as a reference? That should account for any upstream transforms. You could use either transform or localtransform, since the two joints are always relative to one another.

I separated the components by row. The 4x4 transform matrix is set up so that (this is a little simplified; I highly recommend the 3blue1brown series on linear algebra: https://www.3blue1brown.com/topics/linear-algebra):

Row 1: corresponds to the rotational transformation of the x-axis
Row 2: corresponds to the rotational transformation of the y-axis
Row 3: corresponds to the rotational transformation of the z-axis
Row 4: corresponds to the translation transformation
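
In code terms, pulling those rows out of a row-major 4x4 looks something like this (the matrix values are invented for illustration; in Houdini you would read them from the joint's transform attribute):

```python
def row_vector(m, i):
    """Row i of a 4x4 transform as a 3-vector, dropping the padding column."""
    return tuple(m[i][:3])

# a hypothetical transform: a 90-degree rotation about Y plus a translation
xform = [
    [0.0, 0.0, -1.0, 0.0],  # row 1: where the x-axis ends up
    [0.0, 1.0,  0.0, 0.0],  # row 2: where the y-axis ends up
    [1.0, 0.0,  0.0, 0.0],  # row 3: where the z-axis ends up
    [2.0, 5.0,  3.0, 1.0],  # row 4: translation
]

x_axis = row_vector(xform, 0)
translate = row_vector(xform, 3)
```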

Then, again, you'd just fit this to drive the blend shapes.

The nice thing about this is that it doesn't matter what other transformations happen upstream. And with a little tweaking you could make this VOP an HDA and use it on any joint.

 

GimbalLockPoseDriver_v05.hipnc


I appreciate you taking the time to try and help me solve this, so thanks again. 

I have been playing around with the solution you uploaded, but I cannot get it to work consistently. I think twisting the arm completely messes up the dot product calculations. In the simple IK example I mapped the dot product rotations to what looked to be forwards and up, and all was well. I then tried doing similar motions with IK and got messy results. It mostly works OK: I get the up, forward, down, and back correctives as I should, but then I will randomly get a full corrective of 1 for the up position when the arm is nowhere near up. The same goes for forwards: the arm will be backwards, yet it will briefly calculate 1 for forwards.

I think it is the twisting of the elbow. 

Thanks for showing me a little of how transformation matrices work. I may have to come back to those tutorials in the future.

It's confusing me how difficult this problem is turning out to be. I initially thought there would be some KineFX SOP that would handle it, or that it would just require some simple maths, but every method I try seems to be problematic.

If you have the time, I have uploaded a file demonstrating the problem with that method. If you scrub the timeline and look at the detail attributes, you will see the correctives are mostly correct but will randomly become massively incorrect for a few frames. If you are busy with work I understand; you have already helped me a lot.

GimbalLockSolutionA2.hipnc


Sorry, I've been slammed with work this week and haven't been able to really test this stuff. But I'll try to spend some time this weekend and get a working solution. It's a good exercise.

I know KineFX is pretty confusing for full-on character rigging right now. SideFX has promised those tools are coming (we've already seen a few, like the reverse foot and shoulder correction, pop up). But since this is a new feature, they seem to have focused mainly on motion capture setups, which makes sense, since game development is probably the bigger market for them at the moment. Not many studios are going to switch their character rigging and animation from Maya to Houdini anytime soon.

That said, I hate rigging in other apps, so I'm really excited and really like the approach that KineFX opens up. BUT, there's no way to make it work at the moment without a pretty good understanding of what's happening under the hood (and in my opinion first among that is understanding how to work with matrices).

If you want to try it, my first thought is that maybe the way to go is to create a couple of new temporary control joints so you can separate out the various rotations. Say you have one that is driven only by the x-rotation of the shoulder, one driven by the y-rotation, and one driven by the twist. Then use each of those to compare (dot) against the parent.

By the way, have you cracked open the old PSD SOPs to see how they did it with the old object-based approach? Might be useful.

Anyway, I'll take another crack at it this weekend. I have a project coming up that will require the same tools, so it's time well spent.


Yeah, I suppose it makes sense that KineFX is pretty new, so they will be adding features like correctives some time in the future. Maybe I do need to learn matrices in the meantime, though I already have a stack of other stuff to learn regarding character rigging, and Houdini in general.

I look forward to seeing what you come up with at the weekend. In the meantime I'll have another look at the old PSD SOPs, though last time I played around with them I came to the conclusion they only worked with OBJ-based rigs. I tried importing a DAZ character with the regular FBX import but still couldn't get PSD set up on it. I am very much an amateur at this sort of thing, so I scoured the internet for tutorials but came up short.


OK. I'm pretty sure this is working (or at least it is on these two rigs). It works in my setup, and I dropped it into your setup above and it seems to work there too, although your joints are aligned differently, so some of the calculations (up and down) are backwards. It would take a little clean-up to make it work on any joint regardless of orientation.

It's the same approach as before, using the dot product on each axis, but I simplified it a bit by referencing the same joint in the rest position. For good measure I also threw in a ramp so you can shape the deformation response. The other limitation right now is that, to keep the VOP simple, it only works for 90 degrees in one direction and 90 degrees in the other. But if you needed to cover a full 360-degree rotation, you could use a separate dot product for each half of the pair (up/down, fwd/back, etc.) and change the rest axis accordingly.

See if this works.

GimbalLockPoseDriver_v06.hipnc


Thanks for taking a look, Geoff. I plan to have a more thorough look at your solution tomorrow, but for now I have loaded it up and hit a potential issue: I move the timeline to frame 24 and the PSD values are all zero apart from up; I then place a rig pose just before the VOP, rotate and transform the root around a bit, and all the values start to flip out.


Here is the .hip. I just added a rig pose before the VOP and rotated the root of the skeleton around to simulate the character moving about a scene. All the values start changing even though the shoulder is not moving at all. Perhaps your solution requires the pose to be rotated back to the rest position, although I don't understand matrices well enough at this point to know what your VOP is doing.

GimbalLockPoseDriver_v07.hipnc


It works. You just have to place your root or spine controls above where the arm rest position splits off. Normally I lay out my rig vertically, starting with root/COG, then spine, then neck and head. Then I split the arms off, and finally legs and tail (if you've got one).

The VOP works by first comparing the animated shoulder joint with the joint at rest (before any arm animation). That's why you need to have the spine controls above the rest split: they need to be applied to both the rest shoulder and the animated shoulder so you are comparing apples to apples. Then it uses the world transform of each joint to compare how far the shoulder has been rotated. One way to think about the 4x4 transform matrix is that each row corresponds to the vector of each axis after a transformation has been applied:

Row 1: the x-axis vector (1,0,0) after the rotation has been applied.
Row 2: the y-axis vector (0,1,0) after the rotation has been applied.
Row 3: the z-axis vector (0,0,1) after the rotation has been applied.
Row 4: the position in space of the local origin (0,0,0) after any translation has been applied.

That's a little simplified, and those 3blue1brown videos do a very good job of explaining it better. From there it is just a matter of cleaning things up and using the dot product to compare the vector after animation to the rest vector. The VOP splits out one row of the 4x4 matrix at a time, lops off the 4th item in the row (transform matrices are really 3x4, with an extra padding column to square them up), and calculates the dot product. Here, just for simplicity, I compared the axis I'm interested in with another axis that is perpendicular to it, so the driver value starts at 0 and moves toward 1 or -1 depending on which direction you move it, but you could set it up other ways. This VOP also doesn't account for translation of the joint, but all you'd need to do is pull out the 4th row and calculate the distance it has moved (by subtracting the 4th row of the rest joint's transform matrix).
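
Putting that description together as code (a hypothetical pure-Python mock-up, with the rest and animated matrices standing in for the two joints' transform attributes):

```python
def axis(m, i):
    """Row i of a 4x4 transform as a 3-vector (padding column dropped)."""
    return tuple(m[i][:3])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shoulder_driver(rest_xform, anim_xform, moving_row, perp_row):
    """0 at rest, heading toward +/-1 as the chosen animated axis rotates
    toward (or away from) a perpendicular axis of the rest joint."""
    return dot(axis(anim_xform, moving_row), axis(rest_xform, perp_row))

# rest pose: identity; animated pose: 90-degree rotation about X
rest = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
anim = [[1, 0, 0, 0], [0, 0, 1, 0], [0, -1, 0, 0], [0, 0, 0, 1]]

at_rest = shoulder_driver(rest, rest, 1, 2)  # y against rest z: 0
rotated = shoulder_driver(rest, anim, 1, 2)  # y has swung onto rest z: 1
```

The remaining fit/ramp step would remap this -1..1 value into the 0..1 blend shape weight, as described earlier in the thread.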

Hope that makes sense. There might be other ways to approach it, but that's my best solution at the moment. Good luck.

GimbalLockPoseDriver_v07.hipnc


The way I use rig pose is probably very wrong, but I don't generally separate the nodes off for specific body parts. With this method, I think you would need each blend shape joint to be on a separate rig pose node.

Anyway, I don't think that matters, because I found that if I just swap the xforms in the VOP to local transforms, I can place a rig pose node after the arm rig and move and rotate the root around as much as I want without upsetting the drivers. Brilliant solution so far, Geoff, though I will test it a bit more tomorrow to make sure there are no surprises. Can you see a problem with using the local transform instead of the regular transform?

Thanks for explaining more about the matrix operations. I was confused about why you were dotting seemingly random axes together, but it makes total sense now that you've said you are grabbing a perpendicular axis. In that case, I guess it doesn't matter which rest axis you use, as long as it isn't the same as the pose axis.

I have been stuck on this problem for about 2 weeks, but I think you've cracked it, so thanks again. I appreciate you taking the time to help me out!

