About MJNorris

  1. Hi, I've just got a question about compile blocks. I'm working on a relatively simple process that I need to run on extremely heavy data sets (groom curves), and I've noticed something odd. I need to run a check on each curve, and for various reasons it has to happen on every curve in isolation, so I'm putting my loop inside a compile block.

    If I loop over 100,000 curves and run this process, I get extremely fast results for the first 10,000 or so curves. However, after about the halfway point my CPU usage drops right off, and instead of calculating hundreds of curves per second it's down to 10-20 curves per second. It calculates the first half of the data set in about 4 minutes but takes 30 minutes to do the second half!

    At first I thought this was due to the data set getting progressively heavier, but if that were true I would expect the CPU usage to remain high even as the progress slowed down. I also tried randomising the sort order to see if that changed the result, and it was the same.

    So I guess my question is this: what factors, apart from the size of the data set, could cause a compiled block process to slow down over time? Any insights are very welcome! Thanks in advance! M
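    To separate "the data is getting heavier" from "something else is stalling the solver", one hypothetical diagnostic is to record a timestamp after each batch of curves and compare per-batch throughput. This is a pure-Python sketch of that bookkeeping only; the batch sizes and numbers are illustrative, not from the scene above.

```python
def batch_throughput(timestamps, batch_size):
    """Given wall-clock timestamps recorded after each batch of
    batch_size curves, return the curves-per-second rate of every batch."""
    rates = []
    for earlier, later in zip(timestamps, timestamps[1:]):
        rates.append(batch_size / (later - earlier))
    return rates

# Illustrative: three batches of 1000 curves, the last one 10x slower.
stamps = [0.0, 1.0, 2.0, 12.0]
print(batch_throughput(stamps, 1000))  # [1000.0, 1000.0, 100.0]
```

    If the per-batch rate falls off smoothly, the per-curve cost is growing; if it collapses at a cliff (as described above), something else, such as memory pressure or thread contention, is a better suspect.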
  2. Hello there! I'm trying to understand how to use viewer states with HDAs, and I've managed to successfully get my paintAttribute SOP to switch states from inside my HDA. Amazing! I did this with the Default State parameter on the type properties, by entering sidefx_attribpaint in that field. Seems to work great!

    However, I'm struggling to get the same approach to work with the Sculpt or Edit SOPs, and when I check for the viewer state in the Viewer State Browser window it doesn't pick anything up. So what does this mean? Do those SOPs actually use viewer states? If so, what are they called? If not, what do they use?

    Very confused here, so any help is very much appreciated. Thanks in advance, Matt
  3. How to search through CPIO data?

    Hi there! I'm looking at getting the most out of the hou.Node.saveItemsToFile() and hou.Node.loadItemsFromFile() functions. We're already using them to export/load node snippets, but I was hoping to use the hou.loadCPIODataFromString() function to scan through the file before loading it in. The main idea is to check for node name clashes prior to loading, and then offer a set of options to deal with them. However, I'm struggling to work out the "proper" way of converting that data into a Python-friendly format. So far what I'm doing is this:

    with open(cpio_file, "r") as f:
        CPIO_string = f.read()
    CPIO_data = hou.loadCPIODataFromString(CPIO_string)

    And that gives me what appears to be a massive tuple, full of other massive tuples, filled with raw strings. I can search through them line by line, but it feels wrong, very very wrong. Is there some clever way of converting that data block into a Python-friendly format? Any help is much appreciated! Matt
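    Whatever the exact shape of the nested tuples, they can be walked generically with a small recursive helper. This is a pure-Python sketch; the sample blob and the "name " prefix test are made-up stand-ins, not the real CPIO structure, which you would need to inspect from an actual loadCPIODataFromString() result.

```python
def find_strings(data, predicate):
    """Recursively walk nested tuples/lists and yield every string
    for which predicate(s) is True, in depth-first order."""
    if isinstance(data, str):
        if predicate(data):
            yield data
    elif isinstance(data, (tuple, list)):
        for item in data:
            yield from find_strings(item, predicate)

# Hypothetical blob standing in for the nested-tuple CPIO data.
blob = (("name box1", "raw payload"), (("name sphere1",), "other"))
hits = list(find_strings(blob, lambda s: s.startswith("name ")))
print(hits)  # ['name box1', 'name sphere1']
```

    A predicate that matches node-name lines would then give you the clash list to compare against the target network before loading.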
  4. Trouble with NaNs

    Thanks a lot for sharing, Jiri! Very useful.
  5. Trouble with NaNs

    Hi there, I'm currently working on some general-purpose scene debugging tools, and one of the things I want to test for is NaNs in my geometry. Detecting them is fairly straightforward using isnan(); however, I'm struggling to come up with a surefire method for generating NaNs inside my test scene! Does anybody know of a guaranteed way of creating a NaN in your geometry? Thanks in advance. M
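    At the floating-point level there are a few reliable ways to manufacture a NaN, shown here in plain Python. The same IEEE-754 behaviour is what a wrangle would rely on (e.g. writing the result of an invalid operation into a point attribute), though the exact VEX expression is not verified here.

```python
import math

# Reliable ways to produce a NaN with IEEE-754 floats:
a = float("nan")         # explicit literal
b = math.inf - math.inf  # inf minus inf is NaN by IEEE 754
c = math.nan             # stdlib constant

for x in (a, b, c):
    assert math.isnan(x)

# NaN is the only float value not equal to itself, which is
# another way to test for it without isnan():
print(a != a)  # True
```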
  6. Vellum Constraint Attributes, what do they all do?

    Thanks a lot, Johner, that was really helpful!
  7. Hi everybody, I've got a few (hopefully) quick questions about Vellum constraint attributes! I'm basically using the Vellum grain solver with a custom constraint network made in SOPs. I've been able to get what I think is correct behaviour by adding my own attributes for "type", "restlength", "stiffness", "compressstiffness", etc. However, I noticed that there are some attributes that the Vellum Solver seems to add, such as "pts[]", "stress", "L" and "typehash".

    Not a problem so far, except that when I add a SOP solver to my DOPs to dynamically add constraints, these attributes are missing. I've been able to get my constraints to work by correctly setting the "pts[]" attribute, but do I also need to be working out the "stress" and "L" values? Are "stress" and "L" influencing the solver, or are they more like data-out attributes? Finally, do I need to worry about the "typehash" attribute?

    I can't easily upload a scene while I'm at work, but if you think it will help explain my situation better I can do it when I get home! Thanks in advance! M
  8. Custom VEX function input type

    Fantastic, this is exactly what I was after! Many Thanks
  9. Custom VEX function input type

    Hi there, I'm trying to wrap my head around custom VEX functions, mainly from a theoretical point of view at this stage, so I'm not addressing a specific use, just after general knowledge. When I create a VEX function, I believe I have to specify exactly what type of data it expects for each argument and exactly what type of data it will return, e.g.

    function int my_function(float a){
        return int(a);
    }

    in which case I have to feed it a float argument and it will always return an int value. My question is this: is there any way of creating a VEX function which can accept multiple arguments but not fail if one or more of those arguments are absent when the function is called? Or, better yet, a way of taking a single argument without specifying what type of data that argument should be? I'm guessing this is somehow possible, as most of the existing Houdini VEX functions work with multiple flavours of input. Thanks in advance. M
  10. Array Function VEX

    Okay, I actually got it working. For those who have a similar issue: I needed to use the 'function' keyword, as the VEX compiler seems to get confused otherwise.

    // Doing an explicit function declaration seems to stop VEX getting muddled up
    function vector[] rgb_array()
    {
        return { {1, 0, 0}, {0, 1, 0}, {0, 0, 1} };
    }
  11. Array Function VEX

    Hi there, I'm playing around with creating some custom VEX functions and came across something a bit odd that I just can't seem to get working. If I want to create a custom function that returns an array, say a vector array, I need to declare the return type at the beginning. According to the docs, the following should work:

    // A function which returns an array of vectors
    vector[] rgb_array() { ... };

    However, if I run this in a wrangle I get a syntax error on line 1. Is this a bug? Or am I declaring my function type incorrectly? Thanks in advance. M
  12. Hi all, I'm trying to create some physically accurate Bullet sims using real-world values for mass/density and friction, so I've been looking at some engineering tables of friction coefficients: http://roymech.co.uk/Useful_Tables/Tribology/co_of_frict.htm

    What I'm confused about is this: real-world friction is calculated from the coefficient of friction between two objects, e.g. wood on wood has a friction coefficient of 0.25-0.5, wood on concrete 0.62, and wood on metal 0.2-0.6. So friction is not a material-specific attribute. If I have three Houdini objects, one wood, one concrete and one metal, how do I create realistic friction settings for each combination when I can only set a single friction value per object? Is it even possible?

    Also, how does Houdini calculate the friction coefficient between two objects? Does it multiply the friction values? Or average them? Any help is much appreciated. Thanks in advance!
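    For what it's worth, stock Bullet combines the two bodies' friction values by multiplying them (worth verifying against the specific solver build). Under that assumption, the closest a per-object value can get to a measured same-material coefficient mu is sqrt(mu), and cross-material pairs then fall out of the products, generally not matching the table exactly. A sketch, with the concrete figure made up for illustration:

```python
import math

# Assumption: the solver multiplies the two objects' friction
# values to get the pair coefficient (Bullet's default behaviour).
mu_wood_wood = 0.35          # mid-range of the 0.25-0.5 table entry
f_wood = math.sqrt(mu_wood_wood)
print(round(f_wood * f_wood, 2))  # 0.35

mu_concrete_concrete = 0.8   # hypothetical figure for illustration
f_concrete = math.sqrt(mu_concrete_concrete)

# The wood-concrete pair is then implied, not chosen, and may
# disagree with the measured 0.62 from the table:
predicted_wood_concrete = f_wood * f_concrete
```

    With a single scalar per object there is no way to hit an arbitrary pairwise table exactly; matching specific pairs would need per-pair control rather than per-object values.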
  13. EXR meta data issues

    Thanks for the help, that makes a lot more sense, although I'm finding it a bit temperamental: sometimes working, sometimes not. I'll keep investigating. Thanks again!
  14. EXR meta data issues

    Hi there, I'm trying to set custom metadata using the Mantra render attribute "vm_image_exr_attributes". According to the docs I should be able to pass the attribute a Python dictionary object; however, whenever I try this:

    test_dict = {"count" : 0, "path" : "/jobs/"}
    my_node.parm("vm_image_exr_attributes").set(test_dict)

    I get the following error:

    TypeError: in method 'Parm_set', argument 2 of type 'std::map<std::string,std::string,std::less<std::string >,std::allocator<std::pair<std::string const,std::string > > > const &'

    What I'm trying to achieve is a dictionary of file paths that I can then check in Nuke. Can anyone shed any light on this? Thanks
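    The std::map<std::string, std::string> in the error suggests the binding only accepts string keys and string values, so one hedged workaround is to stringify everything before calling set(). The conversion is pure Python; the my_node parm call from the post is Houdini-side and left as a comment.

```python
def to_string_map(d):
    """Convert an arbitrary dict into the str -> str mapping that a
    std::map<std::string, std::string> binding can accept."""
    return {str(k): str(v) for k, v in d.items()}

test_dict = {"count": 0, "path": "/jobs/"}
exr_attrs = to_string_map(test_dict)
print(exr_attrs)  # {'count': '0', 'path': '/jobs/'}

# In Houdini this would then be:
# my_node.parm("vm_image_exr_attributes").set(exr_attrs)
```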
  15. Trouble with debris shelf tool

    Hi all, I'm relatively new to Houdini and I'm working on a destruction sim of a car hitting a crystal/ice statue. I'm really close to getting a result that I'm happy with, but I've hit a brick wall trying to get a fine spray of particles as a secondary effect. Here's what I have done so far:

    1. Import the statue geo and use a combination of Cookie SOPs and Voronoi Fracture to pre-divide the mesh into groups.
    2. Create an RBD sim using the RBDglueObject node, bring in the low-res car proxy and use it as a collision object.
    3. Smash the car into the statue.
    4. Cache out the RBD as a .bgeo sequence.
    5. Read in the cache; everything works fine!
    6. Press the Debris shelf tool. This is where things start getting confusing.

    I'm getting several odd results. Firstly, I have to run the sim from frame 1, despite having set the debris_sim node's start frame value to the correct frame (512). I'm also really confused about the relationship between the constantBirthrate attribute of the POP_replicate node and the constantBirthRate attribute of the POP_source node.

    My end goal is to use the particles to instance some very simple geo with a Copy SOP. However, when I do this I find that where I thought there was just one particle there are actually a load of particles, all occupying the same space, resulting in loads of copies of geo overlapping each other!

    I haven't uploaded a scene file because my scene is a little heavy, but if anybody needs one I'll upload a simplified version with the cache as a .zip. Many thanks in advance; any advice or help at all is much appreciated. M.
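    The stacked-particle suspicion is easy to test in principle: collapse points that share a position (within a tolerance) and see how many survive, which is roughly what a Fuse SOP does. A pure-Python sketch of that logic, with a made-up point list:

```python
def dedupe_positions(points, tolerance=1e-4):
    """Keep only the first point in every cluster of points whose
    positions agree to within tolerance (grid-snapping comparison)."""
    seen = set()
    kept = []
    for p in points:
        key = tuple(round(c / tolerance) for c in p)
        if key not in seen:
            seen.add(key)
            kept.append(p)
    return kept

# Two points stacked on the origin, one point elsewhere:
pts = [(0.0, 0.0, 0.0), (0.00001, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(dedupe_positions(pts))  # [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
```

    If the deduplicated count is much smaller than the raw count, the overlapping copies are confirmed, and fusing before the Copy SOP would remove the duplicates.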