Everything posted by Alain2131

  1. Hey folks ! I have some animated geometry (in the example scene it's from DOPs, but any animated geo could be used), and I'm trying to separate the pieces into a subnet of geo nodes, then animate each geo node from its piece's movement. In other words - extract SOP animation to OBJ level. I have the geo nodes with the position keyed and working. It's the rotation that I'm struggling with. I tried two methods :

     1 - I used the Extract Transform node, and it's almost working, but there's a slight rotation offset. Why is that so ?

     2 - I tried to create the matrix myself and extract the rotation from it using VEX. The matrix building and rotation computation :

        // Inputs :
        // 0 - Animated center
        // 1 - Animated geo
        // 2 - Static center
        // 3 - Static geo
        // m = moving | s = static
        vector p0m = point(0, "P", 0); // Center (animated)
        vector p1m = point(1, "P", 0); // Z axis (animated)
        vector p2m = point(1, "P", 1); // Y axis (animated)
        vector p0s = point(2, "P", 0); // Center (static)
        vector p1s = point(3, "P", 0); // Z axis (static)
        vector p2s = point(3, "P", 1); // Y axis (static)

        vector zAxism = normalize(p0m - p1m);
        vector tempYAxism = normalize(p2m - p1m);
        vector yAxism = normalize(cross(tempYAxism, zAxism));
        matrix mm = maketransform(zAxism, yAxism, p0m);

        vector zAxiss = normalize(p0s - p1s);
        vector tempYAxiss = normalize(p2s - p1s);
        vector yAxiss = normalize(cross(tempYAxiss, zAxiss));
        matrix ms = maketransform(zAxiss, yAxiss, p0s);

        //mm -= ms;
        vector theRotm = cracktransform(0, 0, 1, p0m, mm);
        vector theRots = cracktransform(0, 0, 1, p0s, ms);
        theRotm -= theRots;

        setpointattrib(0, "rot", 0, theRotm);

     But the rotation is not right - it looks like the axes are not the right ones (like X belonged to Y or smth). I played around with the order of the vectors, to no avail. I'm interested to know why the Extract Transform doesn't work correctly, but what I'm really looking for is to know what's wrong with the second method. Thanks in advance ! Here's the scene : extractRot_tests.hip
  2. [PYTHON] Getting current frame over network using rpyc

    Hey Alex, thanks for the answer ! I'll send an RFE as you proposed. But do you think this is because I'm using rpyc in Maya instead of hrpyc ? I don't really know what I'm doing, I'm just poking around. Meanwhile, fcur does return the correct frame ! Noice ! Thanks for that =)
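    In case it helps someone else, the call that worked looked roughly like this (a minimal sketch - I'm assuming fcur is evaluated through hou.hscriptExpression on the remote module) :

        # Minimal sketch, assuming the remote hou module evaluates the
        # HScript fcur() expression instead of the cached hou.frame()
        import rpyc
        connection = rpyc.classic.connect("localhost", 18811)
        hou = connection.modules.hou
        print hou.hscriptExpression("fcur()")  # the current frame as seen by the Houdini session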
  3. Hello ! I'm trying to do some tests using rpyc to control Houdini over the network (currently only on the same computer). I managed to get it to work (creating some nodes, getting the frame range, setting the frame). But, strangely, I can't get the correct current frame number. It will return 1.0, but if I use hou.setFrame(XX) it will then return that value (whatever XX was) even after manually changing the frame in Houdini. I tried using hou.frame() and hou.expandString("$F") - they both had the same problem. My steps to get it to work were :

     1 - Copy the rpyc folder from C:\Program Files\Side Effects Software\Houdini 17.0.XXX\python27\lib\site-packages to C:\Program Files\Autodesk\Maya2017\Python\Lib\site-packages

     2 - In Houdini, open a Python Shell and write/execute

        import hrpyc
        hrpyc.start_server()

     3 - In Maya, open a Python Shell and write/execute

        import rpyc
        connection = rpyc.classic.connect("localhost", 18811)
        hou = connection.modules.hou
        print hou.frame()             # Returns 1.0
        hou.setFrame(52)              # Sets the frame to 52
        print hou.frame()             # Returns 52.0
        print hou.expandString("$F")  # Returns 52

     But if you manually change the frame in Houdini and only execute the print hou.frame() line, the problem will be apparent. The problem is present even in another Houdini instance :

        import hrpyc
        connection, hou = hrpyc.import_remote_module()
        print hou.frame()

     Is there something I did wrong, missed or anything ? Thanks !
  4. See this page.

        print node.parm("theParm").description()
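     For instance, a minimal sketch (the node path and parameter name are placeholders) :

        # Minimal sketch : print a parameter's UI label.
        # hou.Parm.description() returns the label shown in the parameter pane.
        node = hou.node("/obj/geo1")            # hypothetical node
        print node.parm("scale").description()  # e.g. "Uniform Scale"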
  5. Python from string parameter

    Okay, I see now. My bad, here's the "right-er" info. In fact, it's "exec" that you want :

        exec(kwargs['node'].parm('python').eval())

     See below for the test I did
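     As a small sketch of how that might look outside of a button callback (the node path and parameter name are placeholders) :

        # Minimal sketch : run the Python statements stored in a string parameter.
        # exec() runs statements, whereas eval() only evaluates a single expression.
        node = hou.node("/obj/my_hda")     # hypothetical node path
        code = node.parm("python").eval()  # the code held by the string parameter
        exec(code)

     That's the difference that tripped me up - eval() only handles a single expression, so any multi-line code in the parameter needs exec().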
  6. Houdini not exporting geometry to FBX?

    Here is a draft of the idea. It seems to work correctly on static meshes (haven't tested much, so there might be some problems). It is painfully slow for animated sequences though. 20 seconds for a standard FBX export turns into 20 minutes or more. Sooo yeah, some work is needed in that area. Also, if some points or prims have the attribute and some don't, those will be left out. Again, this is a draft. If some interest is shown, I might work on it a bit more. I encourage anyone that feels like it to enhance the tool. The tool currently works with attributes, not groups. fbx_export_v1.1.hip FBX_export_v1_1.hda Please let me know if the tool is useful. EDIT : I updated the tool a bit. See the tool's help for some (I hope) useful info. I found out what's taking so long for animated exports - it's the unpacking. Using the Fetch Unpacked Geometry from the dopimport was actually slower than an unpack under it, and an unpack was slower than using none before the export. But hey, it's not made for animated geometry. I've had no problem using it for static geometry yet. If the point or prim string split attribute is empty on some points or prims, the tool now exports them correctly.
  7. Houdini not exporting geometry to FBX?

    This seems to be a recurrent problem for pretty much everybody. The FBX export doesn't have this capability in and of itself. If you want, you could look into the rbd_to_fbx from the gamedev toolset, they actually did it. Their method is to create a subnet with a geo node for each packed fragment, and export that subnet. I believe this could be replicated inside of SOPs.
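     A very rough Python sketch of that subnet-per-fragment idea (node paths are placeholders, and it only covers a single, static frame) :

        # Rough sketch : one geo node per packed fragment inside a subnet,
        # ready to be exported with the Filmbox FBX ROP. Paths are placeholders.
        src = hou.node("/obj/geo1/OUT_packed")  # SOP that outputs the packed prims
        subnet = hou.node("/obj").createNode("subnet", "fbx_export")

        for i, prim in enumerate(src.geometry().prims()):
            geo_node = subnet.createNode("geo", "fragment_%d" % i)
            for child in geo_node.children():   # remove the default file node, if any
                child.destroy()
            merge = geo_node.createNode("object_merge")
            merge.parm("objpath1").set(src.path())
            merge.parm("group1").set(str(i))    # isolate one packed prim
            unpack = geo_node.createNode("unpack")
            unpack.setFirstInput(merge)
            unpack.setDisplayFlag(True)
            unpack.setRenderFlag(True)

        subnet.layoutChildren()

     From there, the subnet can be set as the export node on a Filmbox FBX ROP.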
  8. Python from string parameter

    See this. The eval() function is what you are looking for :

        stringCode = node.parm("someCodeParm").eval()
        eval(stringCode)

     As for calling this code when pressing a button, is this inside an HDA ?
  9. group points based on formula

    The npoints function needs the input from which it will read the point count. The (<geometry>geometry) in the help page means that it wants, well, geometry, and specifying 0 means that it will look for the info in the first input of the wrangle. Here's an example of that :
  10. Orient object along it's velocity

    Well, as for the "cube orientation vector", it doesn't exist as a rotation would exist in another package, say Maya. Not in that context. It exists only at the /obj level (or in any object subnet). As for finding the closest non-zero value for the velocity with Python, that's exactly it ! Get the current value, and if it's 0, get the value at the next frame until it isn't 0 (see the sketch below). .. In theory, but I can't get the geometry to recook when the frame is changed. It always gives me the initial frame's attribute value. Or so I think, it keeps throwing my timeslider to astronomical values. So that didn't work. Instead, I've done something horrible. It works, but it's horrible. cinnamonMetal_orientToVel_v2.hip Since that "orientation vector" only exists at the /obj level, that's exactly where I did it. Does that hack work out for you ? PleaseForgiveMeIKnowIDidSomethingHorrible
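    The sketch of the "first non-zero velocity" idea (the node path is a placeholder) :

        # Sketch only : walk forward in time until v is non-zero.
        # In theory this works - in practice the SOP did not recook for me
        # when the frame changed, so the value never updated.
        sop = hou.node("/obj/moving_cube/OUT_vel")  # hypothetical SOP with a v attribute
        frame = hou.intFrame()
        vel = hou.Vector3(sop.geometry().points()[0].attribValue("v"))
        while vel.length() == 0 and frame < 1000:   # crude guard against running forever
            frame += 1
            hou.setFrame(frame)
            vel = hou.Vector3(sop.geometry().points()[0].attribValue("v"))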
  11. Display UV island in different colours

    I was about to say "Or you could use the GameDev's UV Visualize" - but it uses the exact same technique as you described. Neat trick ! You can also display the flattened UV in the 3D view with the UV Visualize sop, which could help you with that stretching problem of yours.
  12. Orient object along it's velocity

    The popping happens at frames 13 and 49 (or any time the cube transitions from/to not moving) - but the timewarp should fix that, thanks to the Pre/Post-Hold. But as mentioned, that's a hacky way to fix it. As for getting the matrix transformation of the cube, this is pretty much what I do using maketransform. But because it's built from the velocity vector, when the velocity is 0 the matrix gets squashed down. I decided to not do anything when the velocity is 0 - thus the snapping. A way to fix that would be to find the closest velocity vector that is not 0 and use it. But I don't know how to get an attribute at another frame than the current one. There would probably be a way to do it in Python. You mentioned "[...] have the cube vectors orient along it's path." Not sure what that means. What vectors ? Do you mean the cube orientation vector ?
  13. Orient object along it's velocity

    Here's my take on it : cinnamonMetal_orientToVel.hip Unfortunately, I haven't been able to keep the first animation frame from popping, nor to keep the final position/orientation. Instead I used a time warp with the anim duration to keep it from popping. Is this (almost) what you wanted ?
  14. Visualise DOP forces

    Hey Atom, thanks for the answer ! This is great ! I've been able to produce the velocity field, but had not much success bringing it into DOPs. Obviously the volume source wouldn't help with wires/rigids (still tried it), but the SOP Vector Field seems about right. Unfortunately, I'm not sure how to actually apply the force. I did manage to visualize a green opaque cube where the force should be (see the following image) with the Vector Field Visualization node, so I know it reads something, but no luck actually having the wire react. The visualizer works perfectly, even with an animated noise field. How do I bring the custom field into DOPs ? Thanks ! EDIT : Forgot to mention that the SOP Vector Field has this warning : "Warning wireobject1 - /obj/geo1/dopnet1/sopvectorfield1: This data is attached to an unexpected parent data."
  15. Hello ! Is there a way to visualize the forces in a DOP simulation, much like this, but for wires ? I have a Wind Force (not the POP one), and there is no option to visualize it out of the box. There might be a way to grab the force field information at any location (the locations would be a grid of points built specifically for that) and pass it down onto the points to visualize the force on the normals or smth, but I don't know how. Something like grabbing the result of all the forces at a location, or only a specific force - either one would be wonderful. Thanks !
  16. Maybe I should change the question a bit - Is there a way to populate Maya's Joint Orient when exporting from Houdini with FBX ?
  17. Hey guys ! I have a simple test setup - a bone in Houdini with a Pre-Transform rotation of (45, -45, 0), and then an animation on the X rotate. When I export that to Maya via FBX, I would expect the Pre-Transform to transfer to the Joint Orient of the bone. But it doesn't, so the animation in Maya is a mess in all three axes. Is there anything I missed ? Thanks !
  18. point deforming problem

    This technique is from the Applied Houdini Rigids 3 tutorial video from Steven Knipping. Highly recommended, any and all of his videos ! It costs money, but it is sooo worth it !
  19. Extract rotation from packed geometry to a bone (Python)

    6 months have passed. And so have I, on to other matters. But then, while I was watching a SideFX masterclass, a certain handy dandy little node called Extract Transform was briefly shown. And I thought - hey, that might be it ! What's been missing to complete the tool from this thread ! And it is ! It works ! Yay ! It works on packed geo only. Let me know if this tool is of use to you. Cheers ! RigidBody_Baker.hda rigidBody_baker_tests.hip
  20. Hello ! I'm trying to get an animation transfer from packed geometry to a bone to work. It works when transferring the animation to a geometry (there is no visible problem in the viewport), but in the Animation Editor, the keys are not very clean. See below. Here is the nice and tidy one, the source : Here is the result from the script, the baked one : As I said, in the viewport it works, but these keys are not very clean. Why do they look like that ? How do I make them cleaner ? (See script and scene below) BUT ! What I really want is to bake the animation onto a bone, not onto a box. The previous problem still applies to this situation. The position works, but the rotation just fails miserably (with the same code !). The very same rotation is applied to the bone as the one that was applied to the test box, but the result in the viewport is not the same at all. The red is the source and the blue is the skinned box (skinned to the visible bone). Why is that ? What am I missing ? Here's the code :

        node = hou.pwd()
        obj = hou.node("/obj")

        def extractEulerRotates(self, rotate_order="xyz"):
            # Thanks to the Houdini help page for that. But there is a problem here though
            # The extracted rotation from this function is incorrect.
            return hou.Matrix4(self.extractRotationMatrix3()).explode(rotate_order=rotate_order)["rotate"]

        def bakePackedAnim():
            # Saving out some time-related variables
            intialFrame = hou.intFrame()
            startFrame = int(hou.hscriptExpression("$RFSTART"))  # Don't know how to do it in Python
            endFrame = int(hou.hscriptExpression("$RFEND"))      # Don't know how to do it in Python
            hou.setFrame(startFrame)

            # '''
            # Initial setup : Creates a bone and a box, and then skins the box to the bone with a Capture Proximity.
            theBone = obj.createNode("bone", "tranformed_bone")  # Create only one bone. Would put it in a loop to create multiple.
            # theBone.moveToGoodPosition()  # Easier to work without it. Will uncomment in the end

            # TO REMOVE... But weirdly with -not a bone- it's working. Hmmm.
            testGeo = obj.createNode("geo", "test_geo")
            fileNode = testGeo.allSubChildren()[0]
            testTransform = testGeo.createNode("xform")
            testTransform.setFirstInput(fileNode)
            testTransform.moveToGoodPosition()
            testTransform.setDisplayFlag(True)
            testTransform.setRenderFlag(True)
            # Remove up to here

            skinnedGeo = obj.createNode("geo", "skinned_geo")
            # skinnedGeo.moveToGoodPosition()  # Easier to work without it. Will uncomment in the end
            skinnedGeo.deleteItems(skinnedGeo.allSubChildren())  # Removes the file node
            boxNode = skinnedGeo.createNode("box")
            captureProximNode = skinnedGeo.createNode("captureproximity")
            captureProximNode.setFirstInput(boxNode)
            captureProximNode.moveToGoodPosition()
            captureProximNode.parm("rootpath").set(str(theBone.path()))
            deformNode = skinnedGeo.createNode("deform")
            deformNode.setFirstInput(captureProximNode)
            deformNode.moveToGoodPosition()
            #deformNode.setDisplayFlag(True)
            #deformNode.setRenderFlag(True)

            # Applying some color to the skinned box
            attribWrangle = skinnedGeo.createNode("attribwrangle", "color")
            attribWrangle.setFirstInput(deformNode)
            attribWrangle.parm("snippet").set("@Cd = {0,0,1};")
            attribWrangle.moveToGoodPosition()
            attribWrangle.setDisplayFlag(True)
            attribWrangle.setRenderFlag(True)
            # '''

            # Transfers the animation from the specified geometry to the bone
            workingNode = hou.node("/obj/animated_box/OUT_script").geometry()  # Gets the geometry of my test scenario
            for i in xrange(startFrame, endFrame+1):  # xrange excludes the end value, hence the +1
                hou.setFrame(i)
                theFullTransform = workingNode.prims()[0].fullTransform()
                thePosition = workingNode.points()[0].attribValue("P")  # This code works only for one object. Would do a loop here through all the packed geo.
                theRotation = extractEulerRotates(theFullTransform)  # Got a problem with how this function extracts the rotation

                # Position
                key = hou.Keyframe(thePosition[0])
                theBone.parm("tx").setKeyframe(key)
                testTransform.parm("tx").setKeyframe(key)
                key = hou.Keyframe(thePosition[1])
                theBone.parm("ty").setKeyframe(key)
                testTransform.parm("ty").setKeyframe(key)
                key = hou.Keyframe(thePosition[2])
                theBone.parm("tz").setKeyframe(key)
                testTransform.parm("tz").setKeyframe(key)

                # Rotation
                key = hou.Keyframe(theRotation[0])
                theBone.parm("rx").setKeyframe(key)
                testTransform.parm("rx").setKeyframe(key)
                key = hou.Keyframe(theRotation[1])
                theBone.parm("ry").setKeyframe(key)
                testTransform.parm("ry").setKeyframe(key)
                key = hou.Keyframe(theRotation[2])
                theBone.parm("rz").setKeyframe(key)
                testTransform.parm("rz").setKeyframe(key)

            hou.setFrame(intialFrame)

        bakePackedAnim()  # Would need to create a UI or a button for convenience, calling this out.

     And the scene : packed_anim_baker.hip Thanks ! EDIT : P.S. If I copy and paste relative references from the source's rotation to the bone's rotation, the result is the same : obviously wrong rotation.
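     For reference, the rotation extraction boils down to this pattern (a minimal sketch of hou.Matrix4.explode() on the packed prim's full transform - it assumes an "xyz" rotate order and the default transform order, which may well be part of the problem) :

        # Minimal sketch : pull translation and rotation out of a packed prim's
        # full transform with hou.Matrix4.explode(). Assumes "xyz" rotate order.
        prim = hou.node("/obj/animated_box/OUT_script").geometry().prims()[0]
        parts = prim.fullTransform().explode(rotate_order="xyz")
        print parts["translate"]  # hou.Vector3
        print parts["rotate"]     # Euler angles in degrees, as a hou.Vector3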
  21. Wire solver - rotational constraint

    Up !
  22. Wire solver - rotational constraint

    Hey guys ! This problem has been bugging me for some time now, so here I am ! So I've got a wire that needs to have one end keep its position and the other end follow a geometry's animation. I specify the face and/or the point to follow. I have the "keep position" part figured out, and half of the other part figured out. I currently have the "following" end correctly following the geometry, but only in position. I need the wire to follow the orientation of the face of the moving geometry. I managed to do it following one of Houdini's example scenes, but it's not applicable in my case. Or I could not figure out how to apply it to my case, so this test is provided in the scene below. See this for a quick glance at the problem. Thanks ! wire_rot_constraint.hip
  23. Hey there ! The situation is that I have some Python code (in an HDA, but for the test a simple Python SOP will do the trick) in which I try to modify a certain geometry's attributes. The thing is - I can't have that geometry connected to the Python SOP (since the final result would be to integrate it all into an HDA). So ! I know we get a "GeometryPermissionError: Geometry is read-only." error when trying to do so with the Python node not connected to the geometry. But ! It is possible to manually modify a node's geometry attributes by locking the node (setHardLocked in Python), and manually inputting the values we want in the geometry spreadsheet. The question is - can we do the same in Python ? Can we lock the geometry, and then "auto-manually" change the attributes in Python on that specific locked node ? Thanks !
  24. Wire solver - rotational constraint

    Up !
  25. Wire solver - rotational constraint

    Hey Victor, Thanks for the setup ! Unfortunately I tried something similar (instead of three points I had tried with only one), but discarded it because of the strange behavior it induces in the wire. So it doesn't fit my needs, but thanks !