Everything posted by Alain2131

  1. Houdini not exporting geometry to FBX?

    @zarralax Sorry for the delay, I didn't see your message. I'm unsure what you mean by "retain each object's pivot" - that could apply to multiple things:
    1 - Inherit the container's pivot
    2 - If it is a packed prim, take that pivot (packedfulltransform)
    3 - Take the center as the pivot
    These could be added as parameters to the tool, if that's what you mean. @art3mis I'm glad it's useful!
  2. Hey guys! Let's lay out the basics - what's the big idea? To start with, I've got a skinned mesh (a tree or something) that I want a wire sim applied to. I get the bones' position and parenting information, create a polyline representation of the bones, and then send that to a wire sim. I then need to gather the result and apply it back to the bones.

     The problem is that the information coming out of the wire sim (pos and orient) is in world space. I need it in the local space of the bones to write it back, and I need it "zero-based", so that a local translation is not applied twice - same for rotation. I don't know how to make that conversion.

     Now, for what I've actually managed to achieve. For the test, I've got no geometry, only very simple bones (a two-bone chain). I bring them into SOPs as a polyline, and the sim is applied to that. I now need to write that info back to the bones - that's where I hit a wall. Can someone help me with that? Either by showing how to convert the world-space wire sim result to the local space of the bones, or by some trick with nodes (such as sticky/rivets, which is what I'm trying in the hip file). See the file for some of the tests that I did.

     TL;DR: I need to transfer a wire sim (that gets its initial positions from bones) back onto the respective bones. See the hip file for the current state of my tests. Thanks! wire_to_bones.hip
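To make the conversion concrete, here's a minimal pure-Python sketch of the world-to-local idea (illustrative only - the function and variable names are made up; in Houdini you'd use the bone's actual world matrix): a world-space position is brought into a parent's local space by removing the parent's translation and applying the inverse (transpose) of its rotation.

```python
import math

def mat_vec(m, v):
    # 3x3 matrix (list of rows) times a 3-vector
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def rot_y(deg):
    # Rotation matrix around the Y axis
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def world_to_local(world_pos, parent_rot, parent_trans):
    # world = R * local + t  =>  local = R^T * (world - t)
    delta = [world_pos[i] - parent_trans[i] for i in range(3)]
    return mat_vec(transpose(parent_rot), delta)

# A sim result at world (1,0,0) under a parent rotated 90 degrees in Y
parent_R, parent_t = rot_y(90), [0.0, 0.0, 0.0]
local = world_to_local([1.0, 0.0, 0.0], parent_R, parent_t)
```

The same transpose trick only works while the rotation part stays orthonormal; with scaling involved you'd need a full matrix inverse.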
  3. I've figured it out. I'm not sure what the timeshift node does, but it's not what I need. What I need is the shift node, with $C put into Scroll Offset, and the Units changed to Frames in the Common tab. The shift node needs to be placed somewhere that has more than one channel (duh), and in my case under the math node was the correct location. The behavior will be that the first channel gets 0 frames of offset, the second gets 1 frame, the third gets 2 frames, and so on. If I had left the Units in Seconds, it would be 0, 1, 2 seconds of offset. Makes sense, but I didn't know about that when asking the question. Sooo, yeah, that's it! Thanks for reading!
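For anyone curious what that per-channel behavior amounts to, here's a small Python sketch (illustrative only, not a Houdini API) of a shift where channel c is offset by c frames, holding the first sample for frames that fall before the start of the clip:

```python
def shift_channels(channels):
    # channels: one list of per-frame samples per channel
    shifted = []
    for c, samples in enumerate(channels):
        # Shift channel c forward by c frames; clamp so early
        # frames hold the channel's first sample.
        shifted.append([samples[max(f - c, 0)] for f in range(len(samples))])
    return shifted

channels = [[0, 1, 2, 3], [0, 1, 2, 3], [0, 1, 2, 3]]
result = shift_channels(channels)
# Channel 0 is untouched, channel 1 lags by one frame, channel 2 by two.
```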
  4. Hey guys! I'm starting out in CHOPs, so be gentle. I'm trying some things to get going, and already hit a roadblock. I've got a geo node whose rx rotation I'm driving from CHOPs, adding a sine wave to it. Works well. Where I'm stuck is that I'm trying to add that same animation to a second geo node, but with some sort of time offset between the two animations. So I'm thinking "Let's add a timeshift node under the constant, and use some kind of global variable (I believe $C? Or @C? unsure) to make a random offset number for each channel!"

     Sooo, that didn't work. $C and @C don't return anything interesting (they either give 0 or error out). Maybe I'm understanding this wrong? If so, I'm unsure how to use those global variables. As a side note, I've seen chan[X-Y] being used sometimes in videos, but didn't find it in the help. Never mind, found it - it's in the Common parameter section of a lot of CHOP nodes. Gotta learn to use it now.

     So! Is the idea wrong? How should I approach this? Thanks in advance! chop_timeoffset_test.hip

     EDIT: P.S. I've just noticed that the help says "When processing multiple channels, [...]". So I believe placing the timeshift under the math node would be more appropriate. But even doing so, it still errors out. Also, there's this - but I haven't been able to make it work. I tried the different combinations of Reference and Unit Values, to no avail.
  5. [SOLVED] [VEX] point() function returning "old" value

    Sorry for the double post, but I think this deserves it. I think I figured it out. I had three ideas to fix the problem:
    1 - Don't write the positions and matrices to the geometry. Instead, generate a duplicate of all the attributes in memory using arrays and such, and modify those instead. Only at the end would I write the result to the points. I ruled this out pretty fast because I was lazy (what a hassle it would have been). How do you keep multiple matrices and positions associated with the points they belong to, without dictionary-like features? Can I even make an array of matrices? Maybe, I don't know - again, a hassle. Not much thought went into this one.
    2 - Convert the code to run over Points, and set the attributes using the @ syntax, so that the values would be returned correctly when looping. Ruled it out. A hassle.
    3 - The problem was to loop over the points, set some attributes, read some, set the new matrix and pos, then loop again with the new geometry. A job perfectly suited for the For-Each loop in count mode, with Fetch Feedback and Feedback Each Iteration. I did some (small) tweaks, and it works! Yay! See set_point_transformation_matrices_working.hip
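The feedback idea in option 3 can be sketched in plain Python (an assumed analogue, not Houdini code): each pass of the loop receives the geometry produced by the previous pass, so attribute writes become visible on the next iteration.

```python
def apply_iteration(points, i):
    # Stand-in for one wrangle pass: offset every point by 1 in X.
    return [[p[0] + 1.0, p[1], p[2]] for p in points]

def feedback_loop(points, iterations):
    # "Fetch Feedback" style: the output of one iteration
    # becomes the input of the next.
    for i in range(iterations):
        points = apply_iteration(points, i)
    return points

pts = feedback_loop([[0.0, 0.0, 0.0]], 3)
# After three feedback passes the point has accumulated all three offsets.
```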
  6. Hello guys! I noticed that in VEX, if you modify an attribute and then read it back using the point() function (or prim(), or anything not using the @ syntax), it will return the input value, not the modified one. See this code, running over Detail:

     vector thePos = point(0, "P", 0); // Input point's pos is {0,0,0}
     thePos += {1,2,3};
     setpointattrib(0, "P", 0, thePos);
     thePos = point(0, "P", 0);
     printf("%d ", thePos); // Prints {0,0,0} instead of {1,2,3}

     I mean, even this code doesn't work! No input, running over Detail:

     addpoint(0, {1,2,3});
     vector thePos = point(0, "P", 0);
     printf("%d ", thePos); // Prints {0,0,0} instead of {1,2,3}

     But interestingly enough, when running over Points (or Prims or whatever), this happens:

     // Input is one point, at {0,0,0}
     @P += {1,2,3};
     vector thePos = point(0, "P", 0);
     printf("Real pos : %d\n", @P);         // Prints {1,2,3}
     printf("Problem pos : %d\n\n", thePos); // Still prints {0,0,0}

     Not sure what that means, but it can't be applied in my case, as I need to run over Detail. Just wanted to point it out. I need to be able to modify some point attributes (pos and others) and then read them back while running over Detail. How can I do that? Is it possible? Thanks in advance! point_old_value.hip
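One way to picture the behavior (a toy Python model - my reading of the semantics, not SideFX's actual implementation): functions like point() read from the input geometry snapshot, while setpointattrib() queues its write onto the output geometry, so a read right after a write still sees the old value.

```python
# Model the wrangle's two copies of the geometry as two dicts.
input_geo = {"P": [0.0, 0.0, 0.0]}
output_geo = dict(input_geo)

def point_read(attr):
    # Reads always come from the unchanged input snapshot.
    return input_geo[attr]

def set_point_attrib(attr, value):
    # Writes only land on the output geometry.
    output_geo[attr] = value

set_point_attrib("P", [1.0, 2.0, 3.0])
stale = point_read("P")      # still the input value, like point() in the post
fresh = output_geo["P"]      # the modified value, only visible downstream
```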
  7. [SOLVED] [VEX] point() function returning "old" value

    Hey guys, thanks for the fast replies! I see now, it makes sense. I was a bit confused by the @ syntax being able to read the "correct" value. Unfortunately, I can't use two different wrangles to fix my problem. I'll explain a bit more. I have some points representing bone positions. Those points start with the original bones' position and rotation (with a rot attribute). What I need is, with one node, to write some position/rotation that the "bones" need to perform into temp attributes (tempPos & tempRot) on all the required points. No problem there. Then, I need to apply the transformation to the points. That's where the problem is. In the test case, I only apply a transformation to the first two bones of the chain. I worked out how to apply the transformation to the points in local space, so that works. But each transformation gets overridden by the next one, so that only the last one is taken into account, because of the reason mentioned above. If I bypass the master loop (which loops over all points) and simply place two wrangles one after the other, the first with i=0 and the second with i=1, it works as intended. But that's not a viable solution in my case. See set_point_transformation_matrices.hip Thanks!

     P.S. There are some major flaws in my math. I just found out that any points other than the first two don't work properly. It has nothing to do with the matter at hand; it's just a forewarning.

     UPDATE: I figured out the math/logic - I had the world position inside the matrices instead of the local position. Fixed that. So yeah, the last remaining problem is the one explained in this post. Hip file updated.
  8. Hey folks! I have some animated geometry (in the example scene it's from DOPs, but any animated geo could be used), and I'm trying to separate the pieces into a subnet of geo nodes, then animate each geo node from its piece's movement. In other words - extract SOP animation to the object level. I have the geo nodes with the position keyed and working. It's the rotation that I'm struggling with. I tried two methods:

     1 - I used the Extract Transform node, and it's almost working, but there's a slight rotation offset. Why is that?

     2 - I tried to create the matrix myself and extract the rotation from it using VEX. The matrix building and rotation computation:

     // Inputs :
     // 0 - Animated center
     // 1 - Animated geo
     // 2 - Static center
     // 3 - Static geo
     // m = moving | s = static
     vector p0m = point(0, "P", 0); // Center (animated)
     vector p1m = point(1, "P", 0); // Z axis (animated)
     vector p2m = point(1, "P", 1); // Y axis (animated)
     vector p0s = point(2, "P", 0); // Center (static)
     vector p1s = point(3, "P", 0); // Z axis (static)
     vector p2s = point(3, "P", 1); // Y axis (static)

     vector zAxism = normalize(p0m - p1m);
     vector tempYAxism = normalize(p2m - p1m);
     vector yAxism = normalize(cross(tempYAxism, zAxism));
     matrix mm = maketransform(zAxism, yAxism, p0m);

     vector zAxiss = normalize(p0s - p1s);
     vector tempYAxiss = normalize(p2s - p1s);
     vector yAxiss = normalize(cross(tempYAxiss, zAxiss));
     matrix ms = maketransform(zAxiss, yAxiss, p0s);

     //mm -= ms;
     vector theRotm = cracktransform(0, 0, 1, p0m, mm);
     vector theRots = cracktransform(0, 0, 1, p0s, ms);
     theRotm -= theRots;
     setpointattrib(0, "rot", 0, theRotm);

     But the rotation is not right; it looked like the axes were mixed up (like X belonged to Y or something). I played around with the vector order, to no avail. I'm interested to know why the Extract Transform doesn't work correctly, but what I'm really after is what's wrong with the second method. Thanks in advance!

     Here's the scene: extractRot_tests.hip
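For reference, here's a pure-Python sketch of the kind of frame construction those maketransform() calls perform (not Houdini's actual implementation - the helper names are made up). One detail that can produce exactly the "X belonged to Y" symptom: swapping the operand order of a cross product flips the sign of the resulting axis, and re-deriving the third axis is needed to keep the frame orthonormal.

```python
import math

def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def dot(a, b):
    return sum(a[i] * b[i] for i in range(3))

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def normalize(v):
    length = math.sqrt(dot(v, v))
    return [x / length for x in v]

def frame_from_points(center, z_point, up_point):
    # Build a rotation frame from a center, a point along Z,
    # and a point giving an approximate up direction.
    z = normalize(sub(z_point, center))
    temp_up = normalize(sub(up_point, center))
    x = normalize(cross(temp_up, z))
    y = cross(z, x)          # re-derive Y so the frame is orthonormal
    return [x, y, z]         # rows of a rotation matrix

# With Z along +Z and up along +Y, the frame should be the identity.
frame = frame_from_points([0, 0, 0], [0, 0, 1], [0, 1, 0])
```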
  9. [PYTHON] Getting current frame over network using rpyc

    Hey Alex, thanks for the answer! I'll send an RFE as you proposed. But do you think this is because I'm using rpyc in Maya instead of hrpyc? I don't really know what I'm doing, I'm just poking around. Meanwhile, fcur does return the correct frame! Noice! Thanks for that =)
  10. Hello! I'm doing some tests using rpyc to control Houdini over the network (currently only on the same computer). I managed to get it working (creating some nodes, getting the frame range, setting the frame). But, strangely, I can't get the correct current frame number. It returns 1.0, but if I use hou.setFrame(XX) it will then return that value (whatever XX was), even after manually changing the frame in Houdini. I tried using hou.frame() and hou.expandString("$F"); they both have the same problem.

      My steps to get it working were: copy the rpyc folder from C:\Program Files\Side Effects Software\Houdini 17.0.XXX\python27\lib\site-packages to C:\Program Files\Autodesk\Maya2017\Python\Lib\site-packages

      In Houdini, open a Python Shell and execute:

      import hrpyc
      hrpyc.start_server()

      In Maya, open a Python Shell and execute:

      import rpyc
      connection = rpyc.classic.connect("localhost", 18811)
      hou = connection.modules.hou
      print hou.frame()            # Returns 1.0
      hou.setFrame(52)             # Sets the frame to 52
      print hou.frame()            # Returns 52.0
      print hou.expandString("$F") # Returns 52

      But if you manually change the frame in Houdini and then only execute the print hou.frame() line, the problem becomes apparent. The problem is present even from another Houdini instance:

      import hrpyc
      connection, hou = hrpyc.import_remote_module()
      print hou.frame()

      Is there something I did wrong or missed? Thanks!
  11. See this page:

      print node.parm("theParm").description()
  12. Python from string parameter

    Okay, I see now. My bad - here's the "right-er" info here. In fact, it's exec that you want:

    exec(kwargs['node'].parm('python').eval())

    See below for the test I did.
  13. Houdini not exporting geometry to FBX?

    Here is a draft of the idea. It seems to work correctly on static meshes (I haven't tested much, so there might be some problems). It is painfully slow for animated sequences though - 20 seconds for a standard FBX export turns into 20 minutes or more. Sooo yeah, some work is needed in that area. Also, if some points or prims have the attribute and some don't, those will be left out. Again, this is a draft. If some interest is shown, I might work on it a bit more. I encourage anyone who feels like it to enhance the tool. The tool currently works with attributes, not groups. fbx_export_v1.1.hip FBX_export_v1_1.hda Please let me know if the tool is useful.

     EDIT: I updated the tool a bit. See the tool's help for some (I hope) useful info. I found out what takes so long for animated exports - it's the unpacking. Using Fetch Unpacked Geometry from the dopimport was actually slower than an unpack under it, and an unpack was slower than using none before the export. But hey, it's not made for animated geometry. I've had no problem using it for static geometry yet. If the point or prim string split attribute is empty on some points or prims, the tool now exports them correctly.
  14. Houdini not exporting geometry to FBX?

    This seems to be a recurring problem for pretty much everybody. The FBX export doesn't have this capability in and of itself. If you want, you could look into the rbd_to_fbx from the GameDev toolset; they actually did it. Their method is to create a subnet with a geo node for each packed fragment, and export that subnet. I believe this could be replicated inside of SOPs.
  15. Python from string parameter

    See this. The eval() function is what you are looking for:

    stringCode = node.parm("someCodeParm").eval()
    eval(stringCode)

    As for calling this code when pressing a button - is this inside an HDA?
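Worth noting a distinction that comes up here (a small standalone sketch; the variable names below are made up): Python's eval() only handles expressions, while statements such as assignments need exec() - which is what the follow-up answer in this thread switches to.

```python
# eval() works on expressions, which produce a value.
code_expr = "1 + 2"
result = eval(code_expr)

# A statement (assignment, loop, def, ...) raises a SyntaxError in eval(),
# so it has to go through exec() instead. Using an explicit namespace
# dict keeps the executed code's variables contained.
code_stmt = "x = 1 + 2"
namespace = {}
exec(code_stmt, namespace)
x_value = namespace["x"]
```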
  16. group points based on formula

    The npoints function needs the input from which it'll look up the point count. The (<geometry>geometry) in the help page means that it wants, well, geometry, and specifying 0 means that it will look for the info in the first input of the wrangle. Here's an example of that :
  17. Orient object along it's velocity

    Well, as for the "cube orientation vector" - it doesn't exist the way a rotation would in another package, say Maya. Not in that context. It exists only at the /obj level (or any object subnet). As for finding the closest non-zero value for the velocity with Python, that's exactly it! Get the current value; if it's 0, get the value at the next frame, and repeat until the value is not 0... In theory. But I can't get the geometry to recook when the frame is changed - it always gives me the initial frame's attribute value. Or so I think; it keeps throwing my timeslider to astronomical values. So that didn't work. Instead, I've done something horrible. It works, but it's horrible. cinnamonMetal_orientToVel_v2.hip Since that "orientation vector" only exists at the /obj level, that's exactly where I did it. Does that hack work out for you? PleaseForgiveMeIKnowIDidSomethingHorrible
  18. Display UV island in different colours

    I was about to say "Or you could use the GameDev UV Visualize" - but it uses the exact same technique as you described. Neat trick! You can also display the flattened UVs in the 3D view with the UV Visualize SOP, which could help you with that stretching problem of yours.
  19. Orient object along it's velocity

    The popping happens at frames 13 and 49 (or any time the cube transitions from/to not moving) - but the timewarp should fix that, thanks to the Pre/Post-Hold. As mentioned though, that's a hacky way to fix it. As for getting the transformation matrix of the cube, that's pretty much what I do using maketransform. But because the matrix is constructed from the velocity vector, when the velocity is 0 the matrix gets squashed down. I decided that when the velocity was 0, I would do nothing - thus the snapping. A way to fix that would be to find the closest velocity vector that is not 0 and use it. But I don't know how to get an attribute at a frame other than the current one. There would probably be a way to do it in Python. You mentioned "[...] have the cube vectors orient along it's path." I'm not sure what that means. What vectors? Do you mean the cube orientation vector?
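The "closest non-zero velocity" idea can be sketched in plain Python (the data and function names here are made up; in Houdini you'd still need a way to sample the attribute at other frames, which is the open problem in this thread):

```python
def is_zero(v, eps=1e-9):
    return all(abs(c) < eps for c in v)

def nearest_nonzero_velocity(velocities, frame):
    # velocities: one velocity vector per frame.
    if not is_zero(velocities[frame]):
        return velocities[frame]
    # Walk outward from the current frame, checking later
    # then earlier frames at each distance.
    for offset in range(1, len(velocities)):
        for f in (frame + offset, frame - offset):
            if 0 <= f < len(velocities) and not is_zero(velocities[f]):
                return velocities[f]
    return None  # every frame is zero

vels = [[0, 0, 0], [0, 0, 0], [1, 0, 0], [2, 0, 0], [0, 0, 0]]
v = nearest_nonzero_velocity(vels, 0)
# The nearest non-zero velocity to frame 0 is the one at frame 2.
```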
  20. Orient object along it's velocity

    Here's my take at it: cinnamonMetal_orientToVel.hip Unfortunately, I haven't been able to both keep the first animation frame from popping and keep the final position/orientation. Instead, I used a timewarp with the animation duration to keep it from popping. Is this (almost) what you wanted?
  21. Visualise DOP forces

    Hey Atom, thanks for the answer! This is great! I've been able to produce the velocity field, but haven't had much success bringing it into DOPs. Obviously the Volume Source wouldn't help with wires/rigids (I still tried it), but the SOP Vector Field seems about right. Unfortunately, I'm not sure how to actually apply the force. I did manage to visualize a green opaque cube where the force should be with the Vector Field Visualization node, so I know it reads something, but no luck actually having the wire react. The visualizer works perfectly, even with an animated noise field. How do I bring the custom field into DOPs? Thanks!

     EDIT: Forgot to mention that the SOP Vector Field gives this warning: "Warning wireobject1 - /obj/geo1/dopnet1/sopvectorfield1: This data is attached to an unexpected parent data."
  22. Hello! Is there a way to visualize the forces in a DOP simulation, much like this, but for wires? I have a Wind Force (not the POP one), and there is no option to visualize it out of the box. There might be a way to grab the force field information at any location (the locations being a grid of points built specifically for that) and pass it onto the points to visualize the force on the normals or something, but I don't know how. Something like grabbing the result of all the forces at a location, or only a specific force - either one would be wonderful. Thanks!
  23. Maybe I should change the question a bit - is there a way to populate Maya's Joint Orient when exporting from Houdini via FBX?
  24. Hey guys! I have a simple test setup - a bone in Houdini with a Pre-Transform rotation of (45, -45, 0), and an animation on the X rotation. When I export that to Maya via FBX, I would expect the Pre-Transform to transfer to the Joint Orient of the bone. But it doesn't, so the animation in Maya is a mess in all three axes. Is there anything I missed? Thanks!
  25. point deforming problem

    This technique is from the Applied Houdini Rigids 3 tutorial video by Steven Knipping. Highly recommended - any and all of his videos! They cost money, but they are sooo worth it!