Everything posted by Atom

  1. creating sound from animated curves

    Not an image, like a .WAV file. I tried the menu shown, it worked. Thanks, I didn't realize Houdini could do that.
  2. creating sound from animated curves

That's interesting, do you know of any way to render the audio output?
  3. You can always keyframe gravity, kick out a geo sequence and use ReTime to play it back as you see fit. ap_keyframed_gravity_back.hiplc
  4. MTL To Redshift Material

    I put together a simple script to read the .mtl file, typically associated with a .obj, and create a Redshift material for each entry it finds in the .mtl with a map_Kd token. A Redshift material is created and a texture map is linked to the diffuse color with the filename for the map populated with what is found in the .mtl entry. This is useful when importing architectural .obj files that typically have a large number of materials associated with them. Expected .mtl format of supported tokens. #newmtl _4_2_ <- Material name. #Ns 96.078431 <- Specular intensity #Kd 0.640000 0.640000 0.640000 <- Diffuse color. #Ks 0.500000 0.500000 0.500000 <- Specular color. #Ni 1.000000 <- Index of refraction. #d 1.000000 <- Opacity. #map_Kd 21_budova/_4_2_.jpg <- Map name. #map_Kn 21_budova/_4_2_n.jpg <- Map name. The result of one-click texture mapping. Here is the Colosseum auto texture mapped. The path to the .mtl file and texture path is hard coded. Place the code in a shelf tool button and adjust the path to point to your .mtl file. mtl_to_redshift_061618.zip
  5. I came across this HIP file posted on Discord and thought I'd re-post it here. It is a way to deform one object based upon the proximity of another without a solver or a simulation. If you are the author of this HIP file feel free to chime in and take credit! CheapAssCollisionDeform.hipnc
  6. What if you move the transform after the sim?
  7. Generating vellum collision texture

    An attribute transfer node can do that.
  8. I think I would use both systems: create a tube that fills the entire trench with fluid particles, then have a few emitters inside the trench to push things along. ap_River.hipnc
  9. Just cache the final surface out one layer at a time and use the previous layer's surface as the collision object for the new layer. That is how a lot of those stacked candy or dessert animations are approached.
  10. Lay splines along geometry

    The Ray node can sometimes be leveraged for this kind of work. It works well with points, but tends to flatten more complex geometry. ap_Knittingtest02.hipnc
  11. Flip initial velocity without shape change

    One way is to switch your source at the SOP level. One source has velocity up and the other has no velocity. ap_switch_flip_vel.hiplc
  12. Yep, that works too. I had to add a wrangle to flip the normals back around, plus a Reverse node.
  13. Eventually, you'll catch on to the workflow. All you have to do is promote the vertex normals back to points and then attribute transfer them onto the original geometry, thus preserving it.
  14. What about something like this? Use a polyextrude to generate vertex normals, then use a wrangle to throw away the newly extruded points? ap_Normals.hiplc
  15. Here is how I have handled that in the past. Use a tube or ring around the exterior and transfer the normals from the outer object to the inner one. After looking at the image, I guess that is not right either..? ap_Normals.hiplc
  16. Learning redshift

    It looks correct to me. Have you tried selecting one of the Global Illumination engines in Redshift? By default I think they are turned off. Try choosing Brute Force for both drop-downs.
  17. Learning redshift

    I placed my code in a shelf tool button and then just set up a crude variable-based switch to point to the models I was processing. Something like this...

    n = 2
    if n == 0:
        texture_path = '/media/banedesh/Storage/Documents/Models/Blendswap/cc0_Destruction_Assets/Building_Budova~/01'
        file_name = '/media/banedesh/Storage/Documents/Models/Blendswap/cc0_Destruction_Assets/Building_Budova~/01/21_budova.mtl'
    if n == 1:
        texture_path = '/media/banedesh/Storage/Documents/Models/Blendswap/cc0_Destruction_Assets/Colosseums/Colosseum4'
        file_name = '/media/banedesh/Storage/Documents/Models/Blendswap/cc0_Destruction_Assets/Colosseums/Colosseum4/Colosseum.mtl'
    if n == 2:
        texture_path = '/media/banedesh/Storage/Documents/Models/Blendswap/cc0_Destruction_Assets/Fancy_Architecture/002_Sanpietro'
        file_name = '/media/banedesh/Storage/Documents/Models/Blendswap/cc0_Destruction_Assets/Fancy_Architecture/002_Sanpietro/sanpietro.mtl'

    The dome light should work; is it a gamma issue or a color shift? By default, the MPlay render window that pops up has the wrong gamma: it defaults to 2.2 when it should be 1.0. I have to manually change that all the time. But if you poke around in the Redshift render ROP you can specify what gamma is sent to the MPlay window.
  18. Learning redshift

    I wrote a couple of helper scripts which build Redshift materials from .MTL files and from imported FBX material networks. They might help; you can check them out here.
  19. You could embed a Python node that looks at the date and sets an attribute to 0 or 1, depending on whether that date has passed. Then use that attribute to switch the output from the intended result to a null. A better approach, IMHO, is to publish your code with a Creative Commons license, where it is clear that the code/process is yours and you are granting them a license. By publishing the code publicly, you can also take that code with you to a new studio when you move on. If it is a big concern, you had better have the conversation up front, rather than stick the studio with some gotcha expiring code that may put you in violation of agreements you sign.
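    The date check described above could be sketched like this (a hypothetical helper; inside a Houdini Python SOP you would write the returned value to a detail attribute via hou, which is omitted here to keep the sketch self-contained):

```python
import datetime

def expired_flag(expiry, today=None):
    """Return 1 once the expiry date has passed, else 0.

    The result would drive a Switch node's input index, so the network
    outputs a null instead of the intended result after expiry.
    """
    today = today or datetime.date.today()
    return 1 if today > expiry else 0

# Before the (arbitrary, example) expiry date the flag stays 0:
print(expired_flag(datetime.date(2020, 1, 1),
                   today=datetime.date(2019, 6, 1)))  # 0
# After it has passed, the flag flips to 1:
print(expired_flag(datetime.date(2020, 1, 1),
                   today=datetime.date(2020, 6, 1)))  # 1
```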
  20. Redshift crash (log attached)

    Are you monitoring the heat on your GPUs while rendering? They might be getting hot and dropping into thermal protection mode. Three 980 Tis draw about 750 W, and the motherboard can use up to 50 W. An 80 Plus power supply is roughly 80% efficient, so you need to draw roughly 1200 W at the wall to deliver 1000 W. It might be worth monitoring your power supply as well. https://www.cpuid.com/softwares/hwmonitor.html
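    The wattage arithmetic above can be checked quickly (the per-card figure is an assumption splitting the post's ~750 W evenly across three 980 Tis; check your own hardware's specs):

```python
gpu_draw = 3 * 250            # watts consumed by three GPUs (~750 W total)
system_draw = gpu_draw + 50   # add motherboard overhead
efficiency = 0.80             # 80 Plus baseline efficiency at the wall
wall_draw = system_draw / efficiency
print(system_draw, round(wall_draw))  # 800 1000
```

    So this build needs roughly 800 W delivered, pulling about 1000 W from the wall; leaving extra headroom on the supply is why monitoring it matters.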
  21. Cascading Waterfall Advice/Help

    Base your simulation scale off the size of the characters that ship with Houdini. Once the simulation is complete, you can scale it up to the project scale size. I think I would use multiple emitter points, one at each little shelf along the waterfall to get things started. ap_waterfall_021819.hiplc
  22. Just use two wrangles.

    Wrangle #1:
    vector thePos = point(0, "P", 0); // Input point's pos is {0,0,0}
    thePos += {1,2,3};
    setpointattrib(0, "P", 0, thePos);

    Wrangle #2:
    vector thePos = point(0, "P", 0); // Declare the type here as well.
    printf("%g ", thePos); // Prints {0,0,0} instead of {1,2,3}
  23. ocean + ripple solver setup

    You might want to take a look at some of the options on Tokeru. http://www.tokeru.com/cgwiki/index.php?title=HoudiniExperiments#Bow_vectors
  24. Hi All, I wanted to update my profile signature, but I just can't seem to find it under my account settings. Is this feature still available on the website? If so, what menu is it under? Thanks
  25. Carve Contour Lines In Car Body

    Nice setup Konstantin, I have taken the result from your file and symmetrically re-fractured it, adding displacement to the newly fractured pieces. Here it is with another scatter seed and a different UV projection type.