nicholasds
Members · Name: Nick

Posts (10)

  1. I'm trying to create a Python viewer state that returns the current fields in a DOP network. I want to query the fields and then store their values so that I can print them in the viewport. I can find the Python documentation for getting the fields themselves, but I can't seem to find out how to get the values stored in the voxels. I know it can be done in SOPs, but I really want to do it directly in DOPs. Is this possible? FYI, I'm just using a simple Sparse Pyro shelf tool for my testing. I want to query the density and temperature fields and calculate the maximum value stored in those fields (a rough sketch of one workaround is included after this post list). Thanks!
  2. Hi all. I'm trying to bake hair textures onto cards I've generated. The built-in Houdini baker requires a camera, which I don't think is ideal. Saber mentions a technique in this video (at around the 30 min mark) where he uses Mantra to sample at every normal to capture the data, but he doesn't show how. Can anyone offer any ideas on how to implement this? (A SOP-level prototype of the idea is sketched after this post list.) Thanks so much!
  3. Thanks guys. I ended up using packed prims and a point cloud lookup to group them together. Worked well.
  4. Hi all. Can anyone suggest a good way to loop over geometry and group connected pieces by a primitive-count threshold? For example:
     - I have a mesh that has 140 000 000 prims.
     - A Connectivity SOP can split out the connected pieces, but many are smaller than my threshold of, say, 10 000 000.
     - I'd like to combine the smaller connected pieces into a group until it hits the 10 000 000 threshold, then move on to a new group, and so on.
     Right now it creates far too many individual meshes based purely on connectivity, so I'm trying to combine the smaller pieces into more manageable chunks (one possible approach is sketched after this post list). Thanks in advance!
  5. Wow, this is fantastic. Thank you so much for taking the time to explain this to me!
  6. Hi. I have an HDA that takes one input (a mesh) and runs a Python script along with a few Houdini bakes. Is there a way to run the HDA on the farm by just submitting the HDA and the input mesh, without having to launch Houdini and submit a job from there? (A rough hython sketch is included after this post list.)
  7. Hi all. Can anyone tell me the best method for bringing ASCII data into Houdini? I have PTX point clouds that I would like to mesh in Houdini, but I can't load them. Here are the first 20 lines from one of the PTX files (a rough parser sketch is included after this post list):
     3680
     2463
     7115.197796 14574.203673 51.118767
     0.737771 -0.675051 0
     0.675051 0.737771 0
     0 0 1
     0.737771 -0.675051 0 0
     0.675051 0.737771 0 0
     0 0 1 0
     7115.197796 14574.203673 51.118767 1
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     0 0 0 0.500000 0 0 0
     Thanks!
  8. Hi all. I'm looking to replicate a Houdini sim that I have inside Unreal Engine. I would like to export my vector fields as flow maps that can drive Unreal's native particles so that they still remain interactive (collisions, emission, etc.). I'm having trouble accessing some of the native POP fields outside of DOPs. Can anyone explain how to get a POP field such as POP Axis Force out of DOPs and into SOPs so that I can then export it? (A rough flow-map sampling sketch is included after this post list.) Thanks! Nick
  9. Thank you for the info, Jordi. At least I know I am not missing something obvious. I will look at the info you shared. Thank you!
  10. Hi all. I've read through previous posts on this topic, as well as other websites, but I have yet to find an adequate solution. I am trying to build a workflow for meshing large point clouds (lidar PLY files) that we can offload to our render farm. The issues I'm currently having:
      1. The point clouds have no normals, so Point Cloud Iso does not work. I'm not sure what math other point cloud software uses to generate normals on point clouds.
      2. I've tested both VDB from Particles and the Particle Fluid Surface SOP. The particle fluid surfacer seems to be faster at meshing the points, but the big problem with both is that the result has thickness, which ultimately doubles the polygon count and is completely unnecessary.
      3. The point clouds are massive (averaging 100 million points per file), and I haven't found a good way of automatically splitting them into better chunks. Point Cluster is really slow on that many points (a rough chunking alternative is sketched after this post list).
      Any advice would be much appreciated. I'd love to keep this inside Houdini rather than using traditional point cloud software, mainly because I can run it on the farm and because the rest of our workflow is Houdini based. Thanks!
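
A rough, untested sketch for the field-maximum question in post 1. As far as I know the hou module does not expose voxel data directly on DOP field subdata, so this assumes the density and temperature fields have first been pulled to SOP level (for example with a DOP Import Fields SOP); the SOP path below is a placeholder.

    import hou

    def field_maxima(sop_path="/obj/field_import/OUT", names=("density", "temperature")):
        """Return {field name: max voxel value} for the named volume prims."""
        geo = hou.node(sop_path).geometry()
        maxima = {}
        for prim in geo.prims():
            # Imported fields normally arrive as volume prims carrying a "name" attribute.
            if isinstance(prim, hou.Volume) and prim.attribValue("name") in names:
                # Brute force but simple: pull every voxel value and take the max.
                maxima[prim.attribValue("name")] = max(prim.allVoxels())
        return maxima

    print(field_maxima())

Inside the viewer state this could be called from the onDraw() handler and the result displayed with a hou.TextDrawable; allVoxels() is heavy on large grids, so caching the result per cook would help.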
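For post 2, a SOP-level prototype of the "sample along every normal" idea rather than the Mantra bake itself: assuming a Python SOP with the cards (carrying point normals N) on input 0 and the groom (carrying a Cd attribute) on input 1, it ray-casts from each card point back along its normal and copies the interpolated hair colour onto the card point.

    import hou

    node = hou.pwd()
    cards = node.geometry()                  # cards, writable (input 0)
    hair = node.inputs()[1].geometry()       # groom, read only (input 1)

    if not cards.findPointAttrib("Cd"):
        cards.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0))

    for pt in cards.points():
        origin = pt.position()
        direction = hou.Vector3(pt.attribValue("N")) * -1.0   # shoot back through the card
        hit_pos, hit_nml, hit_uvw = hou.Vector3(), hou.Vector3(), hou.Vector3()
        hit = hair.intersect(origin, direction, hit_pos, hit_nml, hit_uvw)
        if hit >= 0:
            # Interpolate the hair colour at the hit and stash it on the card point.
            prim = hair.prims()[hit]
            pt.setAttribValue("Cd", prim.attribValueAtInterior("Cd", hit_uvw[0], hit_uvw[1]))

The stored point colours would still have to be written into the card's UV texture afterwards (not covered here), and other channels could be transferred the same way.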
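One possible approach to post 4, assuming an upstream Connectivity SOP has written the prim attribute "class": a Python SOP that greedily packs connected pieces into chunk groups until a primitive budget is reached.

    import hou
    from collections import defaultdict

    node = hou.pwd()
    geo = node.geometry()
    budget = 10000000                      # prims per chunk (10 million)

    # Collect the prims of each connected piece.
    class_attrib = geo.findPrimAttrib("class")
    pieces = defaultdict(list)
    for prim in geo.prims():
        pieces[prim.attribValue(class_attrib)].append(prim)

    # Greedily pack pieces, biggest first, into chunk_* prim groups.
    # A piece larger than the budget simply ends up in a chunk of its own.
    chunk_index, count = 0, 0
    group = geo.createPrimGroup("chunk_0")
    for piece in sorted(pieces.values(), key=len, reverse=True):
        if count and count + len(piece) > budget:
            chunk_index += 1
            count = 0
            group = geo.createPrimGroup("chunk_%d" % chunk_index)
        group.add(piece)
        count += len(piece)

Downstream, a For-Each or Split per chunk_* group gives the more manageable pieces; at 140 million prims a VEX or compiled variant of the same binning would be faster, but the logic is identical.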
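For post 6, a rough sketch of a standalone hython script a farm job could run, so the HDA cooks without launching the Houdini UI. The file paths, the asset's operator type name ("my_bake_asset") and the "execute" button parm are placeholders; the real names depend on the HDA.

    import hou

    def run_hda(hda_file, mesh_file, out_file):
        hou.hda.installFile(hda_file)

        container = hou.node("/obj").createNode("geo", "farm_bake")
        loader = container.createNode("file")
        loader.parm("file").set(mesh_file)

        asset = container.createNode("my_bake_asset")    # placeholder type name
        asset.setInput(0, loader)

        # If the asset exposes a button that runs its Python script / bakes, press it;
        # otherwise a forced cook may be enough.
        button = asset.parm("execute")
        if button is not None:
            button.pressButton()
        else:
            asset.cook(force=True)

        # Save whatever geometry the asset outputs.
        asset.geometry().saveToFile(out_file)

    if __name__ == "__main__":
        run_hda("/path/to/asset.hda", "/path/to/input_mesh.bgeo.sc", "/path/to/result.bgeo.sc")

The farm job then only needs hython, the .hda and the mesh file, e.g. running "hython run_hda.py" through whatever scheduler is in use.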
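For post 7, a rough parser for the PTX layout shown above (grid size on two lines, scanner position, three axes, a 4x4 transform, then "x y z intensity r g b" per point), written for a Python SOP; the file path is a placeholder and a single scan per file is assumed.

    import hou

    node = hou.pwd()
    geo = node.geometry()
    ptx_path = "/path/to/scan.ptx"       # placeholder

    i_attrib = geo.addAttrib(hou.attribType.Point, "intensity", 0.0)
    cd_attrib = geo.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0))

    with open(ptx_path) as f:
        cols = int(f.readline())         # grid size; handy for sanity checks
        rows = int(f.readline())
        for _ in range(8):               # scanner position, 3 axes, 4 transform rows
            f.readline()

        for line in f:
            parts = line.split()
            if len(parts) < 4:
                continue
            x, y, z, intensity = [float(v) for v in parts[:4]]
            if x == 0.0 and y == 0.0 and z == 0.0:
                continue                 # dropped returns are typically written as 0 0 0
            pt = geo.createPoint()
            pt.setPosition((x, y, z))
            pt.setAttribValue(i_attrib, intensity)
            if len(parts) >= 7:
                # Assuming 0-255 colour values; some exporters use 0-1.
                pt.setAttribValue(cd_attrib, tuple(float(v) / 255.0 for v in parts[4:7]))

Creating points one by one in Python is slow at these sizes; for production the same parse would be better run as a standalone hython conversion to .bgeo.sc on the farm, or built as numpy arrays and pushed with setPointFloatAttribValues.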
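For post 8: this does not answer how to pull a POP force's own data out of DOPs, but once a velocity-style field has been brought into SOPs as vel.x / vel.y / vel.z volumes (for example with a DOP Import Fields SOP), a Python SOP can sample it onto a grid and pack it as flow-map colours. Input 0 is the grid to colour, input 1 holds the volumes; the normalisation range and the XZ channel packing are assumptions to match whatever the Unreal material expects.

    import hou

    node = hou.pwd()
    grid = node.geometry()                    # flat grid to colour (input 0)
    vol_geo = node.inputs()[1].geometry()     # vel.x / vel.y / vel.z volumes (input 1)

    vols = {prim.attribValue("name"): prim
            for prim in vol_geo.prims() if isinstance(prim, hou.Volume)}

    if not grid.findPointAttrib("Cd"):
        grid.addAttrib(hou.attribType.Point, "Cd", (0.0, 0.0, 0.0))

    max_speed = 5.0                           # normalisation range, scene dependent

    for pt in grid.points():
        p = pt.position()
        vx = vols["vel.x"].sample(p)
        vz = vols["vel.z"].sample(p)
        # Remap -max_speed..max_speed into 0..1, the usual flow-map convention.
        pt.setAttribValue("Cd", (0.5 + 0.5 * vx / max_speed,
                                 0.5 + 0.5 * vz / max_speed,
                                 0.0))

From there the coloured grid can be baked or rendered to a texture and read back as a flow map in the Unreal material.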
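For issue 3 of post 10, a rough alternative to Point Cluster: assign each point a tile id by quantising its position on a fixed grid, which is linear in the point count and easy to spread across farm tasks. Written for a Python SOP; the tile size is an arbitrary placeholder.

    import math
    import hou

    node = hou.pwd()
    geo = node.geometry()
    tile_size = 50.0                     # world-space chunk size, scene dependent

    tile_attrib = geo.addAttrib(hou.attribType.Point, "tile", 0)

    bbox = geo.boundingBox()
    minv = bbox.minvec()
    tiles_x = int(math.ceil(bbox.sizevec()[0] / tile_size)) + 1
    tiles_y = int(math.ceil(bbox.sizevec()[1] / tile_size)) + 1

    for pt in geo.points():
        p = pt.position() - minv
        ix = int(p[0] // tile_size)
        iy = int(p[1] // tile_size)
        iz = int(p[2] // tile_size)
        # Flatten the 3D cell index into a single integer id.
        pt.setAttribValue(tile_attrib, ix + iy * tiles_x + iz * tiles_x * tiles_y)

A Split or For-Each on the tile attribute then gives chunks of predictable spatial size for meshing. At 100 million points the same quantisation is better done with numpy on geo.pointFloatAttribValues("P") plus setPointIntAttribValues, or in VEX, but the idea is identical.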