All Activity


  1. Past hour
  2. Hi, if you made a curve and its normal is a "natural" normal (not tweaked manually in another wrangle), then output the tangentu attribute when creating the curve; it should be naturally perpendicular to N. Otherwise, create your own perpendicular vector in a Wrangle using a cross product between your N and an arbitrarily defined up vector. In VEX it should be something like vector up = {0,1,0}; v@perpN = cross(@N, up); Then use the same code as you have with N: @P += ch("offsetperp") * v@perpN;
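The cross-product trick described above can be sketched outside Houdini in plain Python. This is a minimal sketch, assuming a unit-ish up vector; cross, normalize, and offset_perp are hypothetical helper names standing in for VEX's cross(), normalize(), and the @P += ch("offsetperp") * v@perpN line.

```python
import math

def cross(a, b):
    # Standard 3D cross product: the result is perpendicular to both inputs.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    # Scale a vector to unit length (assumes v is non-zero).
    l = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / l, v[1] / l, v[2] / l)

def offset_perp(P, N, amount, up=(0.0, 1.0, 0.0)):
    # Move P along a vector perpendicular to N, built via cross(N, up).
    # Caveat (same as in VEX): this degenerates when N is parallel to up.
    perpN = normalize(cross(N, up))
    return tuple(p + amount * n for p, n in zip(P, perpN))
```

For example, with N along +Z and the default up, the perpendicular direction comes out along -X, so offsetting by 2 moves the point 2 units in -X.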
  3. Today
  4. Hello, please help: I need to move my geometry perpendicular to the existing normals. Thanks. CrossNormal_01.hip
  5. Packed Primitive transformation equation

    I think you'll find better info here: http://www.tokeru.com/cgwiki/index.php?title=HoudiniDops#Rigid_Body_Dynamics
  6. Packed Primitive transformation equation

    If I remember correctly, if you want to use the intrinsics of your prims coming out of the sim (to instance higher-res objects on the points, or maybe use CHOPs to smooth / reverse / tweak the sim), the equation should be: Ppoint = Pprim + (Ppoint - PIVOTprim) * TRANSFORMprim, where PIVOT and TRANSFORM are respectively a vector and a matrix3. The Ppoint on the right-hand side should be static (the position of the points at the start of the sim). Hope this helps.
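That equation can be sketched in plain Python. This is a minimal sketch, assuming Houdini's row-vector convention (position times matrix) and a plain nested-list matrix3; transform_point is an illustrative name, not a Houdini call.

```python
def transform_point(P_rest, pivot, xform3, P_prim):
    # Ppoint = Pprim + (Ppoint_rest - PIVOTprim) * TRANSFORMprim
    # P_rest: the point's static position at the start of the sim.
    # xform3: 3x3 matrix as nested lists, multiplied row-vector style.
    d = [P_rest[i] - pivot[i] for i in range(3)]
    rotated = [sum(d[r] * xform3[r][c] for r in range(3)) for c in range(3)]
    return tuple(rotated[c] + P_prim[c] for c in range(3))
```

With an identity transform the point simply rides along with the prim position; with a rotation matrix it orbits the pivot first, then translates.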
  7. House fire sourcing Pyro

    I appreciate the replies! I'm not sure I understand what's going on in that file. Why do the particles point away from the windows, but the fire goes towards the windows? The particle sim is very small compared to the pyro sim; why does it still work? And what is controlling the direction of the fire? I'm pretty much starting with 17.5, so could that be the issue? Things look a lot different.

    When it comes to smoke, should I emit from the fire, or just create another sim that only emits smoke? And if I want to control the smoke from the fire, is there a way I could do that without changing the fire sim? Maybe I'm wrong, but it looks like when smoke is created from the fire, the fire settings control everything; there is no separate set of parameters that only controls the motion/shape of the smoke and leaves the fire alone.

    Also, I'm mostly looking for how to start the fire. I'd like to know how this would be done "professionally": would the source come from the geometry of the building? From the window glass? From geo on the floor, like a campfire? Is that the correct way to start this, or would a deformed sphere like I already have work for my needs?
  8. Vellum leather belt

    Simulating a grid and then point-deforming the hi-res belt is the way to go. Vellum uses pscale for thickness, so you can still define how thick your belt is even if it's just a grid. Here is an example showing such a workflow. I kept your scene scale; just be aware that your scene seems to be in decimeters, so I also increased gravity to account for that. belt_ts.hip
  9. VOP Keyword Reserved

    I found a link which could help, in case people are looking for it. Thanks, John Kunz: http://mrkunz.com/blog/08_22_2018_VEX_Wrangle_Cheat_Sheet.html
  10. Vellum leather belt

    Hey clear, I've uploaded the .HIP file below. So far I've used Vellum Tetrahedral Softbody and standard Houdini cloth to simulate the strap, but because the model has height on it and isn't a flat grid, the walls are pretty much collapsing on each other and I can't find the correct strut settings. Because this didn't work, I have also tried to simulate a grid, then extrude it and point-deform my actual belt strap, but I haven't had any luck with this either. I feel like I'm approaching the idea completely wrong.

    Btw, if you're wondering what you're looking at when you open the .HIP file: I didn't want to get too into modeling etc. before knowing I could pull it off with a proxy belt model first, so the little rectangle that animates is supposed to be the belt buckle. I'm not really upset with this result, as I find Houdini very hard, but it's still nowhere near what I need. Essentially I would like multiple belts to be animated by dragging the buckle around, with the strap naturally following, sort of like the tail of an animal. I don't have knowledge outside of Vellum, but I thought Vellum would be the way to go if I later decide I want to really push/manipulate the shape of the belt by twisting and all that. Thanks for your interest! belt.hip
  11. Hi, in a Bullet simulation, pieces are packed and have attributes: (1) P, (2) the orient quaternion, (3) pivot. Somehow, these three attributes compute the final transformation matrix for each piece. My goal is to animate these simulated pieces in reverse (not just play back in reverse, but instead create more "interesting" animation). My question is: how are pivot and P related? Initially, P equals pivot; then, as the simulation goes on, the orientation changes (which is logical) and pivot drifts away from P (i.e. they are not equal anymore). I checked bullet[T|R|S] and they are all zero or identity. Thanks,
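For what it's worth, the way those three attributes are usually combined can be sketched in plain Python. This is a minimal sketch, assuming the common packed-primitive convention that the rest position is rotated about the pivot by orient and then translated by P; quat_rotate and piece_to_world are illustrative names, and the quaternion component order (x, y, z, w) follows Houdini's orient attribute.

```python
import math

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q = (x, y, z, w),
    # using the expansion v' = v + w*t + cross(q.xyz, t), t = 2*cross(q.xyz, v).
    qx, qy, qz, qw = q
    tx = 2.0 * (qy * v[2] - qz * v[1])
    ty = 2.0 * (qz * v[0] - qx * v[2])
    tz = 2.0 * (qx * v[1] - qy * v[0])
    return (v[0] + qw * tx + qy * tz - qz * ty,
            v[1] + qw * ty + qz * tx - qx * tz,
            v[2] + qw * tz + qx * ty - qy * tx)

def piece_to_world(rest_p, pivot, P, orient):
    # World position = rotate (rest - pivot) by orient, then add P.
    d = [rest_p[i] - pivot[i] for i in range(3)]
    r = quat_rotate(orient, d)
    return tuple(r[i] + P[i] for i in range(3))
```

Under this convention, P is where the pivot ends up in world space; P and pivot start equal only because the piece starts untransformed, and they separate as soon as the piece translates.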
  12. Yesterday
  13. Vellum leather belt

    What do you have so far and how is it not working for you?
  14. Hi, I am following this tutorial, but I don't get a shadow matte with my toon shader, and I am wondering why. If you want to take a look: shadowmattehip.hipnc (with the Links_Schaduw mantra node). I would also like to know how I could get an equivalent of the diffuse color image plane while toon shading. Is toon shading more limited in this regard, or is it possible to have highly customisable composites? Thanks.
  15. Initial State with Vellum Grains

    In the dopnet there is an Output node. You can save the frame you want to be the initial state to disk. Then, on the dopnet node, under Simulation, use that frame for the initial state. Hope this helps.
  16. Using the rayed and then volume-extruded grid, in addition to bumping the substeps way up (5 min / 10 max), using volume collisions and the RBD solver is doing the trick on my quick test mesh. Haven't tried it on the final big guy yet; I'm guessing more substeps will be in order.
  17. Mmmh. Just for clarification: while this isn't finished, it's close to the current needs. I create a "shadow" rig which resembles the final bone structure. I'm going to create this rig one by one afterwards, but if I need to change its proportions, I want to alter the shadow rig and have some clever mumbo-jumbo apply this to the existing bone rig. Each "bone" calculates its rest length, position and orientation; it also carries its name, which is identical to the real bone's name. This applies to the nulls, too. Does some script already exist which could be altered, or is this new territory? Cheers!
  18. Hi there, I made some snow with very small Vellum grains. What I'm trying to achieve is to set an initial state once my snow is already relaxed (I am throwing a box made of grains on top of a mountain), and then to continue colliding with some objects. How can I make the simulation read the first frame and then start writing the next?
  19. Vellum leather belt

    I'm trying to find a way to animate a metal belt buckle and have the leather strap follow its position whilst being affected by gravity/environment and still not intersecting with the buckle itself. Essentially I'm looking to recreate the "rope knot" effect that everyone has seen recently, but with a more irregular shape such as a belt (rectangular instead of round). Is Vellum the best system to create something like this? Any help would really be appreciated, thanks everyone.
  20. Hi there, I am trying to project a texture onto a UV-unwrapped object through a projection camera and then bake the projected texture onto the original object's UVs. However, when I do it, the uvtexture node gives me the new UVs used for projection instead of the original object's UVs. I would appreciate any tips on how I can bake it out on the original object's UVs. I attached my setup in a file below. Many thanks for any tips! uvprojectionbake.hip
  21. A slightly more brutal approach could be using an Attribute Blur SOP on P and trying slightly higher blur values, or using a DeltaMush deformer with some non-spiky object as the reference object. If that doesn't help: is it possible the distance to the nearest points is different for the spiky areas? You can check that easily using point clouds. (More on point clouds: http://www.tokeru.com/cgwiki/index.php?title=JoyOfVex20 )
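The Attribute Blur idea is essentially Laplacian smoothing: pull each point toward the average of its neighbors. A toy Python version, assuming you already have a neighbor index list per point (blur_positions is a made-up name, not a Houdini call):

```python
def blur_positions(points, neighbors, iterations=1, strength=0.5):
    # points: list of (x, y, z); neighbors: list of index lists, one per point.
    # strength in [0, 1]: 0 = no change, 1 = snap to the neighbor average.
    pts = [list(p) for p in points]
    for _ in range(iterations):
        new = []
        for i, p in enumerate(pts):
            nb = neighbors[i]
            if not nb:
                new.append(list(p))
                continue
            # Average of neighbor positions, computed from the previous pass.
            avg = [sum(pts[j][k] for j in nb) / len(nb) for k in range(3)]
            new.append([p[k] + strength * (avg[k] - p[k]) for k in range(3)])
        pts = new
    return [tuple(p) for p in pts]
```

A single strong iteration already flattens an isolated spike back onto the line of its neighbors, which is why a few blur iterations with a higher strength can eat spiky artifacts.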
  22. I'm sorry to say it, but this setup is a bad idea. In the initial fracture you have size differences, but you still scatter 250 points on each piece, resulting in insanely small pieces. If you really want to do that, put the fracturing after the for-each network. But anyway, the resulting shapes won't be nice and it will take very long to calculate. If you just want a convex decomposition, 4 points should be enough.
  23. Thank you, that worked perfectly!
  24. Houdini 18 Wishlist

    Flipbook / Mplay: it should be able to create directories on its own while saving. Attribute Wrangle: quick preview of the necessary arguments while typing a function.
  25. From what I see in the images, your candidates also have a low density of points. Maybe you can run a Point Wrangle with a nearpoints() function to get the near points within a given radius around your point. Then you store on each point an attribute density = len(nearpoints()) / ((4/3) * PI * pow(radius, 3)) and delete points that have a density below a threshold. Play with the threshold until you have identified most of the problematic spiky low-res areas, and finish manually... it should save you some time. Of course, this is based on the assumption that your spikes are low point-density areas. You could mix that with curvature as well (select areas that have high curvature AND a low density of points?). Last but not least, you could use a similar attribute based on len(neighbours()) (e.g. the number of points that share an edge with your point): if your spiky areas are less connected, as you mentioned, you'll see that as well. And you can even delete based on these three attributes...
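The density heuristic above is easy to prototype in plain Python. This is a minimal sketch with len(nearpoints()) replaced by a plain count; point_density and flag_sparse are illustrative names, not Houdini functions.

```python
import math

def point_density(num_near, radius):
    # Points per unit volume inside the search sphere:
    # density = count / ((4/3) * pi * radius^3)
    return num_near / ((4.0 / 3.0) * math.pi * radius ** 3)

def flag_sparse(counts, radius, threshold):
    # True for points whose local density falls below the threshold,
    # i.e. the candidates for deletion (or manual inspection).
    return [point_density(n, radius) < threshold for n in counts]
```

In practice you would sweep the threshold until the flagged set lines up with the spiky areas, exactly as the post suggests, and optionally AND the flag with a curvature or neighbour-count test.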
  26. EmberGen pyro real-time engine

    If they can show their pyro interacting with fluids or deforming geometry while spawning realistic sparks, then maybe people might start to feel a bit threatened. But the more features that are added, the slower things get. Directors often want things to look a certain way, so control is a pretty big thing to lose. But if they can offer all that, kudos to them. I just didn't see it in the video.