Welcome to od|forum




About toadstorm


Contact Methods

  • Website URL http://www.toadstorm.com

Personal Information

  • Name Henry
  • Location Los Angeles, CA
  1. You can possibly use SOuP for Maya to extract your vector attribute and bind it to a color set that mental ray can read; the arrayToPointColor node can extract this information. See this link: http://www.toadstorm.com/blog/?p=240. Also, if your attribute types are set correctly, Houdini should export Alembics with readable color sets. Check this thread: https://www.sidefx.com/forum/topic/38939/?page=1#post-178403
  2. Thanks for doing this, Alessandro! Great to see you as always.
  3. If you want to move points along a surface based on texture UVs (not per-primitive parametric UVs), you need to do a little magic with xyzdist() and primuv(), or XYZ Distance VOP and Primitive Attribute VOP if you prefer. Convert your UV attribute to Point type, then use a Point Wrangle to move your point position to v@uv. Don't forget to store your original point position in the Wrangle, or via a Rest SOP. Then give each of your particles a goalUV vector attribute. Then use xyzdist() to find the nearest primitive # and parametric UV to your distorted surface. Store that primitive number and UV coordinate on your particle as point attributes (posprim and posuv are what particles use internally for sliding). Then move your mesh back to world space using a Rest SOP or a Wrangle, and use the primuv() function on your particles to find the "P" attribute based on posprim and posuv. This should return the nearest point on the surface given the goal UV you defined earlier. I'm attaching a .HIP to make it a little easier to read. goal_to_uv.hip
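The xyzdist()/primuv() steps above can be sketched outside Houdini as well. Below is a minimal pure-Python illustration, assuming a triangle mesh with one UV per point; `nearest_prim_uv` stands in for xyzdist() on the UV-posed mesh and `prim_attrib_P` for primuv() on the rest-posed mesh (all names are illustrative, not Houdini API):

```python
# Pure-Python sketch of the xyzdist()/primuv() trick described above.
# Assumes a triangle mesh with one UV per point; helper names are
# illustrative, not Houdini API.

def barycentric_uv(p, a, b, c):
    """Barycentric weights of 2D point p in triangle (a, b, c)."""
    det = (b[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(b[1]-a[1])
    u = ((p[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(p[1]-a[1])) / det
    v = ((b[0]-a[0])*(p[1]-a[1]) - (p[0]-a[0])*(b[1]-a[1])) / det
    return (1.0 - u - v, u, v)

def nearest_prim_uv(goal_uv, tris, uvs):
    """xyzdist() analog: find the primitive containing goal_uv on the
    UV-posed mesh, returning (prim, weights) to store on the particle."""
    for prim, (i, j, k) in enumerate(tris):
        w = barycentric_uv(goal_uv, uvs[i], uvs[j], uvs[k])
        if all(x >= -1e-9 for x in w):
            return prim, w
    raise ValueError("goal UV is outside every triangle")

def prim_attrib_P(prim, w, tris, P):
    """primuv() analog: interpolate rest-space P at the stored prim + weights."""
    i, j, k = tris[prim]
    return tuple(w[0]*P[i][c] + w[1]*P[j][c] + w[2]*P[k][c] for c in range(3))

# a quad split into two triangles; UVs cover the unit square
uvs  = [(0, 0), (1, 0), (1, 1), (0, 1)]
P    = [(0, 0, 0), (2, 0, 0), (2, 2, 1), (0, 2, 0)]
tris = [(0, 1, 2), (0, 2, 3)]
prim, w = nearest_prim_uv((0.5, 0.5), tris, uvs)
print(prim_attrib_P(prim, w, tris, P))  # (1.0, 1.0, 0.5)
```

In the real setup the returned primitive number and weights play the role of the posprim/posuv attributes stored on each particle.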
  4. Hi, I'm trying to create a sort of sand-in-hourglass effect, where piles of sand accumulate over time from a steady stream. I'm using the POP Grains DOP to try to accomplish this, but the particles don't seem to want to stack. I'm relatively new with this particular solver so I'm not entirely sure what the approach is for accumulating piles like this. Can anyone point me in the right direction? Thanks!
  5. I actually just ran into this problem recently, and a colleague of mine came up with a pretty quick solution. Inside your DOPnet, use a File DOP to write out your .sim files. Then you can just read them back into another DOPnet and they will collide as normal with other objects in your scene (one-way of course).
  6. Perfect! That's exactly what I needed to know. Thanks!
  7. Hello again, I'm trying to do a controlled destruction with the Bullet solver. I have a glue network set up so that I can have pieces break off and eventually split into smaller pieces. The problem is that the destruction in this case isn't forced by an actual impact... I want the pieces to just drift off. I was hoping to use a Modify Data DOP to somehow fool the pieces into thinking they've suffered an impact, but I can't seem to make it work. I was editing the "glueimpulse" value in the Position data for each piece... I know this works for the RBD solver, but nothing is happening in Bullet. The glue weight is zero, except for the cluster glue weight (which is -1). Is there some other way to convince the chunks to break apart without an impact, but have the clusters stay together initially until I break those links manually in the glue network? I just don't want to have to surgically break off each large chunk in the glue network and then start breaking those down into smaller chunks completely by hand.
  8. This is so weird. What about the Drag Force would cause such a strong reaction?
  9. I just can't seem to convince the bullet solver not to explode. I have some geometry that I've pre-shattered in SOPs, no clusters or anything, and I'm trying everything I can to run the sim through bullet, to no avail. I want to use all the nice glue network constraints that RBD can't do, but I have no idea how to prevent bullet from exploding. I've searched through several threads saying that I need to make sure all my pieces are convex hulls, but I don't know how to iteratively generate those. When I look at the collision geometry generated by the bullet solver, nothing seems to be overlapping, but inevitably a few tiny pieces shoot out of the simulation and then everything explodes within 2 or 3 frames of the simulation starting. I've tried increasing substeps and collision padding, and nothing changes. What can I do to prevent this from happening? I'd even suffer through RBD slowness if I could have fine control over gluing objects together, but I can't find any information on how to do that. I'm attaching a .HIP if anyone wants to take a look and tell me what I'm doing wrong. Thanks in advance. Using Houdini 12.5.469. shatter.zip
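On the convex-hull point in the post above: Bullet-style solvers behave best when every collision piece is convex, and "generating hulls iteratively" just means computing one hull per shattered piece. A 2D pure-Python sketch of the core operation (Andrew's monotone chain; illustrative only, not Houdini code, and solvers do the 3D equivalent):

```python
# 2D convex hull via Andrew's monotone chain (pure Python, illustrative).
# Bullet-style solvers want each collision piece convex; this is the
# 2D version of the "one hull per piece" operation.

def convex_hull(points):
    """Return the hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means clockwise or collinear
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # endpoints are shared, drop one copy

print(convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
# [(0, 0), (2, 0), (2, 2), (0, 2)]  -- interior point dropped
```

Running the same idea per piece, any concave collision shape is replaced by its hull, which is what keeps the solver's contact resolution stable.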
  10. Should have posted my build... I've encountered this problem in both Houdini 12.0.683 and 12.1.125, on Windows 7.
  11. I have a quick question about the render scheduler. When I'm test rendering, I want to use the scheduler to keep track of render times while I'm optimizing Mantra. However, even after the test is completed, the "Elapsed" counter keeps ticking until I manually kill the process from the scheduler. I'm rendering using the Render View... Preview and Auto-Update are both disabled. Is there some way to have the process kill itself when the render is completed? Thanks!
  12. I haven't tried that, only because I don't think that Maya can do a whole lot with a point velocity attribute on a mesh... Maya can use it for particles, but I don't know of a way to use a point velocity attribute for rendering in mental ray or VRay. I suppose that you could use a VOP SOP to convert point velocity in Houdini to camera space, and then write that point attribute out in an Alembic cache. You could then use that attribute in Maya to write your camera-space velocity to your material color. I know VRay can handle negative shader colors if you enable them... not sure about mental ray.
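The camera-space conversion mentioned above is just a rotation of the world-space velocity by the inverse of the camera's orientation (vectors ignore translation). A minimal pure-Python sketch; the row-major matrix layout and the 0-1 color remap are my assumptions, and the function names are hypothetical:

```python
# Rotate a world-space velocity vector into camera space and remap it
# to a renderable color. Only the camera's 3x3 rotation matters for a
# vector, and for a pure rotation the inverse is the transpose.
# Row-major, camera-to-world matrix layout is assumed.

def world_to_camera_vector(v, cam_rot):
    """v_cam = R^T * v, where cam_rot is the camera-to-world rotation."""
    return tuple(sum(cam_rot[r][c] * v[r] for r in range(3)) for c in range(3))

def velocity_to_color(v_cam, scale=1.0):
    """Remap a signed velocity to a 0-1 color (one assumed convention)."""
    return tuple(0.5 + 0.5 * scale * x for x in v_cam)

# camera rotated 90 degrees about Y (camera-to-world rotation)
ry90 = [[0, 0, 1], [0, 1, 0], [-1, 0, 0]]
v_cam = world_to_camera_vector((0, 0, -1), ry90)
print(v_cam)                    # (1, 0, 0)
print(velocity_to_color(v_cam))  # (1.0, 0.5, 0.5)
```

The remap keeps negative components renderable in engines that clamp shader colors to 0-1; if the renderer accepts signed colors (as VRay can), the raw camera-space vector can be written out directly.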
  13. I think I solved my own problem... I figured I'd post the solution here in case anyone has the same question. I used SOuP for Maya to read the data directly from the AlembicNode (instead of trying to read attributes from the mesh itself) because the mesh wasn't properly reading the color array. SOuP's arrayToPointColor node transferred the information into a Maya-compatible color set very quickly, and then can output into another mesh node.
  14. Hello, I've been struggling with trying to apply custom Alembic attributes from Houdini to Maya objects... specifically, exporting point or primitive colors to Maya. I can read the attributes in Maya using getAttr, but there doesn't seem to be any direct way to connect the color attribute exported from Houdini to Maya's colorPerVertex. I could write a Python script to make these connections manually by iterating through every point on the mesh, but my mesh has a changing number of points throughout the animation, and the script would be far too slow to allow smooth playback if it had to run every frame. Is there a better way to handle this? I don't have the C++ ability to actually get into the Alembic plugin and modify it.
  15. Pazuzu, that looks amazing. Is there any chance you could post a .HIP?