

Popular Content

Showing most liked content since 03/22/2019 in all areas

  1. 6 points
    nature.hipnc just to say hello and share some stuff. /cnc_verkstad/ Tesan Srdjan
  2. 6 points
    I also moved all the tumblr example files I've been sharing onto my new website that you can find here. https://www.richlord.com/tools
  3. 6 points
    More Unlimited Fun nature2 fun.hipnc
  4. 5 points
    Here's a music video I made using a bunch of the techniques from this thread.
  5. 5 points
    Hi, maybe this approach might work:

        int count = npoints(1);
        for (int i = 0; i < count; i++)
        {
            vector camP = point(1, "P", i);
            vector dir = normalize(camP - @P);
            float bias = 0.01;
            vector hit_ps[];
            int hit_prims[];
            vector hit_uvs[];
            int hits = intersect_all(0, @P, dir * 100, hit_ps, hit_prims, hit_uvs);
            if (hits && distance(@P, hit_ps[-1]) > bias)
            // or just check the number of intersections:
            // if (hits > 1)
            {
                @Cd *= 0.6;
            }
        }

    cull.hiplc
  6. 4 points
    "Organics" Houdini 17.5 & Redshift Cheers, Tom
  7. 4 points
    Another version of the same system. Here's the hip. It's a bit bigger because I stashed the growth geo. 19_03_28_green_growth.hiplc
  8. 4 points
    This is not really on topic given the other stuff on this thread, but I made some growth a while back, and finally got around to animating it. Attached the hip. It's not very robust, but I got what I needed out of it, so I'm calling it good. 19_03_26_animated_growth.hiplc
  9. 3 points
    again, not for groups, but if you create a primitive string attribute on your geo you can use stylesheets to override the cryptomatte property:
    1. create a primitive string attribute that holds your "id" value or unique name (or just convert your groups to a name attrib) (in this case I created a unique s@name attrib per box)
    2. create a property on your object (in this case I called it: masks)
    3. create a stylesheets override for Primitives with an Override Script that uses the name attribute binding to override your cryptomatte property
    example file: ts_cryptomatte_from_attribute.hip
  10. 3 points
    Some new finished work! I've been working on this one for a long time. Not totally finished yet; there may be some cleanup left to do... Any feedback is welcome.
  11. 3 points
    I've made this to save time loading shaders from Substance Source, but should work for any substance material. Just select the parent folder of the texture you want. Figured some others might find it useful too. It has proper viewport handling of Base Colour, Roughness, Metallic, Normal, Emission and Displacement. Not 100% perfect, but pretty damn close. Hdalc for now. Tried to upload to orbolt but kept getting an error (Invalid node type name). Don't think it likes redshift nodes. If anyone has any experience with this let me know! MT_RSSubstance_1.0.hdalc
  12. 3 points
    I usually do it like this: vdb_maskvel_dv.hip
  13. 2 points
    I see a couple of things happening. You have based your grain generation off of the proxy mesh for the ground plane. It is better to dedicate a unique object to this. Also, I noticed that the points coming into the point deform have a count mismatch. This is because when OpenCL is enabled, it generates a different number of particles compared to the CPU rest version. Turn off OpenCL to reestablish the match. Basically you don't need the point deform version after making these modifications, but it could be used as a secondary material element. ap_LetterSetup_001_collProbs.hiplc
  14. 2 points
    Organics: "Seed" Organics: "Bud" Organics: "Leaf" These were all rendered in Redshift at very high resolutions (10800 square/14400x9600) for printing: https://www.artstation.com/thomashelzle/prints Love it! :-) Cheers, Tom
  15. 2 points
  16. 2 points
    HTH, a few more notes:
    - now I found this great answer: https://www.sidefx.com/forum/topic/55015/#post-246795
    - the collision field is just a scalar field
    - btw have you seen this great explanation? At 29:00 Jeff talks about divergence: https://vimeo.com/42988999
    The "make divergence free" part of the smoke solver is three steps (2.A or 2.B or 2.C, depending on the method you choose):
    1. Gas Enforce Boundary http://www.sidefx.com/docs/houdini/nodes/dop/gasenforceboundary.html
    2.A Gas Project Non Divergent https://www.sidefx.com/docs/houdini/nodes/dop/gasprojectnondivergent.html
    2.B or 2.C Gas Project Non Divergent Multigrid (or its OpenCL version) https://www.sidefx.com/docs/houdini/nodes/dop/gasprojectnondivergentmultigrid.html
    3. Gas Feedback http://www.sidefx.com/docs/houdini/nodes/dop/gasfeedback.html
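    The "measure divergence, solve for pressure, subtract the gradient" idea behind step 2 can be illustrated with a minimal 2D sketch in plain Python: a toy periodic MAC grid with a plain Jacobi solver. This is only an illustration of the principle, nothing like Houdini's actual multigrid/OpenCL implementation.

```python
import math

N = 16       # cells per side, periodic domain, unit spacing
ITERS = 300  # Jacobi iterations

# Staggered (MAC) velocities: u[j][i] lives on the x-face of cell (j, i),
# v[j][i] on its y-face. Start with an arbitrary divergent field.
u = [[math.sin(2 * math.pi * i / N) for i in range(N)] for _ in range(N)]
v = [[math.cos(2 * math.pi * j / N) for _ in range(N)] for j in range(N)]

def divergence(u, v):
    # Net outflow per cell, with periodic wrap-around.
    return [[u[j][(i + 1) % N] - u[j][i] + v[(j + 1) % N][i] - v[j][i]
             for i in range(N)] for j in range(N)]

div = divergence(u, v)
before = max(abs(d) for row in div for d in row)

# Solve the pressure Poisson equation Lap(p) = div with Jacobi iterations.
p = [[0.0] * N for _ in range(N)]
for _ in range(ITERS):
    p = [[(p[j][(i + 1) % N] + p[j][(i - 1) % N] +
           p[(j + 1) % N][i] + p[(j - 1) % N][i] - div[j][i]) / 4.0
          for i in range(N)] for j in range(N)]

# Subtract the pressure gradient -> (nearly) divergence-free velocity.
for j in range(N):
    for i in range(N):
        u[j][i] -= p[j][i] - p[j][(i - 1) % N]
        v[j][i] -= p[j][i] - p[(j - 1) % N][i]

after = max(abs(d) for row in divergence(u, v) for d in row)
print(before, after)  # divergence drops by orders of magnitude
```

    This is the same three-step structure as the solver: measure divergence, project, feed back the corrected velocity.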
  17. 2 points
    Btw you can also add a Visualizer to the "collisionvel" field (it is not visualized by default):
    - unlock /obj/smoke_no_vel/dopnet1/pyro/smokeconfigureobject2
    - find the "collision_velocity" node
    - add a "Vector Field Visualization" node
    Connect and set it, something like this:
  18. 2 points
    Hi guys, I saw an animated Magnetic Resonance scan on Youtube, and I tried to reconstruct that real head using the 2D sequence from that video... here is the result: (The video was 720p; with better resolution and quality I could perhaps generate more detailed geometry)
  19. 2 points
    R&D on a Mystique-type effect: flipping tiles driven by procedural animation. In this case, I used curvature to drive the rotation of the tiles in a Copy Stamp SOP.
  20. 2 points
    @Atom I've been looking at Python in Houdini only recently, so I look forward to getting more people's insight. Put it in any of the locations corresponding to $HOME, $HSITE, $HIP, $JOB inside a structure like {location}/{houdiniversion}/python2.7libs/ For example I have $HSITE at F:/HOUDINI/HSITE/ so I put huilib.py in F:/HOUDINI/HSITE/houdini17.0/python2.7libs/ After this, import huilib in the shelf tool Script field will be able to find it.

    As for running the examples, here I'm not so sure about the correct method, but what I've been doing is putting my scripts in a subfolder ../python2.7libs/PRB_HOM/ which contains an empty __init__.py, and then calling those scripts from the shelf with import. For example, I've put in said folder the example shipped with huilib, all_gadgets.py, while changing the bottom part of the script from:

        if __name__ == '__main__':
            ui = TestDialog(name = 'test', title = 'Test UI')
            ui.show()

    into:

        def main():
            ui = TestDialog(name = 'test', title = 'Test UI')
            ui.show()

        if __name__ == '__main__':
            main()

    And then in the shelf tool Script field I can do:

        from PRB_HOM import all_gadgets as tool
        reload(tool)
        tool.main()

    And this way it gets picked up. This is how I'm running my shelf tools in general, so I can store .py files in a repo, use an external editor, and not have the scripts stored in the .shelf file. But I'm curious to know if this is "the proper" way to do it @Stalkerx777 or anyone! It would still be nice if we could work in the Parameter Editor and store that into a .ui file. Cheers
    PS: I failed to refresh the page and notice you guys already sorted it
  21. 2 points
  22. 2 points
  23. 2 points
    And something completely different. I am currently testing a LG 75" TV as a monitor (75UK6500PLA). On the photo it looks much smaller than it feels in reality: ~1.7m wide and 1m high, about 1.5m away from me. My table is 2m wide and 1m deep, the top is 1.1m from the floor. It is unbelievable how, for instance, the ART animation a couple of posts above looks in 4k at this size. A totally different impression from my old 30" Dell monitor. Instant gallery-feel :-) What's also cool is the fact that suddenly the table has much more space on top, and because of the distance to the monitor, it feels much less crowded. Image quality feels very good so far; the pixels of course are relatively large at this scale, but I totally love working in Houdini this way. Many other applications can be kept smaller and distributed over the screen - I didn't scale the windows themselves. Screen brightness is quite even, with a slight darkening towards the edges, but nothing that would worry me. Of course, because of the size and being relatively close, one has to move the head and so also looks at the screen surface at different angles, but again, none of that feels problematic so far. At first I was quite worried since the mouse/wacom was lagging a lot, but then found that the "game mode" takes care of that. I'm also disabling all image "enhancement" stuff, dynamic brightness etc. of course. It's connected via HDMI to my GeForce GTX 1080 TI. I haven't tried it as a TV so can't comment on that (I stopped watching TV almost 20 years ago ;-) ). Cheers, Tom
  24. 2 points
    Thanks jmarko :-) And I certainly agree about Houdini - it helps us grow as artists OpenGL rendered from the viewport... Cheers, Tom
  25. 2 points
    just to select a random end point per curve, without any SOP or VEX for loops, you can do (point wrangle):

        int index = -(rand(@primnum) < 0.5);
        int pts[] = primpoints(0, @primnum);
        i@group_randomend = pts[index] == @ptnum;
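    The trick relies on negative array indexing: -(rand(@primnum) < 0.5) evaluates to 0 or -1, which picks either the first or the last entry of the point array. The same idea sketched in plain Python (illustrative only, not Houdini code):

```python
def pick_end(pts, r):
    # r < 0.5 yields True/False (1/0); negating gives -1 or 0,
    # i.e. the last or the first element of the list.
    index = -(r < 0.5)
    return pts[index]

pts = [10, 11, 12, 13]
print(pick_end(pts, 0.2))  # 13 (last point)
print(pick_end(pts, 0.8))  # 10 (first point)
```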
  26. 2 points
    Like image or ? untitled.wav
  27. 2 points
    Heyya! Over the past couple of days I've been building and extending this, at its core very simple, noise generator tool. It's called, incredibly intuitively: Noiser. I've gotten quite sick of always doing the same simple Noise-VOP over and over again, so I built this nifty tool that saves me a small but accumulating amount of time (and energy) every time I need some noise. I'm really fond of it, and it rarely takes more than a couple of minutes into any project before I drop it down. Here's a quick video demonstration: https://vimeo.com/271007816 And the DL: noiser.hda and a simple demo screen of the defaults: And that's the relatively simple setup. I hope someone else will find it as useful as I do! Cheers, Martin PS: I just found a volume-noise tool in my OTLs folder, so I thought I'd share this as well. Practically the same thing, working for both SDFs and Volumes (VDB & Primitive). vol_noise.hda
  28. 2 points
    Hey! With the desire to contribute to this community, I will use this thread to collectively post some of my setups, in an attempt to help others learn Houdini. My experience was that going through established setups was a great way to learn - and that breaking working setups and learning what breaks them teaches things that tutorials never really touch. So whenever I found a page sharing advanced setups like that, my heart lit up, and this thread is an attempt to recreate this feeling for others. Being a current student at Filmakademie BW, this thread is also an homage to the work of my (just graduated) great colleague Juraj Tomori - and his "Juraj's Playground". (Here is the link) Enough babbling; let me start off by showing my reel of the works that accumulated over the past two years in which I worked with Houdini. The first post will also share the first setup shown: the arena shot of the plasma cannon destroying a concrete pillar. The entire sequence was my bachelor thesis on "Procedural 3D for Film Using Houdini", where I tried explaining a procedural approach in written form and detailing my setups. If anyone is interested, I will also share it here. The arena environment was also procedurally modelled in Houdini, which I will share a setup for in the next couple of days. I hope you will like it, Cheers, Martin
  29. 2 points
    you can also put in a third argument in the foreach: the index you are running over.

        // run over primitives; use this on a polyline and you will get the order of the points
        int primpts[] = primpoints(0, @primnum);
        foreach (int i; int pt; primpts)
        {
            setpointattrib(0, "index", pt, i, "set");
        }
  30. 2 points
    I've attached a cleaner and updated (for H17.0+) version of this hip file. In the original file, the rubber toy was in there for initial tests. It serves no purpose. Load your alembic geo in as houdini geo or delayed load primitives. Put down an unpack sop, and a convert sop to manipulate it in Houdini. If you're in need of more help, upload your file and your alembic caches so I or anyone else can set you in the right direction. run_smoke_v003.hipnc
  31. 2 points
    Interesting challenge. Not too difficult IMO for an illusionist... which I'm not. Nevertheless, here is a way to start. It's all about transferring attributes and UVs. For the moment the difficulty (for me anyway) is to add a square mode to the scales. Here is a file. I start from an Entagma tutorial: Built-In Nodes vs. VEX: Voronoi-Morph vimeo.com/312285886 There is a control with some parameters to tweak, and a curve that controls the voronoi "influence". But the curve could, or rather should, be used to transfer other attributes. ChameleonSkin_test1.hipnc
  32. 2 points
    - convert your velocity volume to VDB
    - make sure it's a single vec3 VDB primitive (VDB Vector Merge SOP)
    - convert to a speed float VDB (VDB Analysis SOP: Length)
    - compute Minimum or Maximum or other and store in a Prim attrib or Detail or ... (Volume Reduce SOP)
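    The Length + Volume Reduce combination boils down to this reduction, sketched here in plain Python over a list of sample velocity vectors (illustrative only, not the VDB API; the sample values are made up):

```python
import math

# Hypothetical voxel velocities; in Houdini these would live in the vec3 VDB.
velocities = [(1.0, 0.0, 0.0), (0.0, 3.0, 4.0), (2.0, 2.0, 1.0)]

# VDB Analysis "Length": per-voxel speed.
speeds = [math.sqrt(x*x + y*y + z*z) for x, y, z in velocities]

# Volume Reduce "Maximum" / "Minimum": collapse to a single value.
max_speed = max(speeds)
min_speed = min(speeds)
print(max_speed)  # 5.0 for the sample data above
```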
  33. 2 points
    actually, you can't read groups directly as I thought, but you can easily convert them to a string attribute and read that: groups.hiplc
  34. 2 points
    A lot of people asked me to share this fake fire method. If you are interested, you can check this simple hip. After render I used ACES for a better look. fake_fire_rnd.hip
  35. 2 points
    During the last 3 weeks, I did some RnD and published my results on vimeo. Some people asked me to share my files here, so here we are. I hope it will help!
  36. 2 points
    Hello, since last week I can play with Houdini again and keep going with my tests... and below, some of my latest hip files from this video: torus+wrinckles+.hiplc stick man rbd+ .hiplc bubbles- rbd+cloth-2.hiplc
  37. 2 points
    Oh! I've found the answer: "/home/work/mytexture.`padzero(4,$F%150)`.tif"
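    The backtick expression zero-pads the frame number, cycled every 150 frames, to four digits. The same formatting in plain Python (the path is just the one from the expression above; this mirrors the HScript padzero call, it is not Houdini API):

```python
def texture_path(frame):
    # padzero(4, $F % 150): zero-pad the cycled frame number to 4 digits.
    return "/home/work/mytexture.%04d.tif" % (frame % 150)

print(texture_path(7))    # /home/work/mytexture.0007.tif
print(texture_path(153))  # /home/work/mytexture.0003.tif
```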
  38. 1 point
    Is there a reason you're not rendering using python directly? You can check inside $HFS/bin/hrender.py or render.py for some examples. If your usage is simple, those scripts would be enough for you to use directly
  39. 1 point
  40. 1 point
    primitive wrangle on your original geo:

        int pt = addpoint(0, @P);
        setpointattrib(0, "N", pt, v@N);
        removeprim(0, @primnum, 1);
  41. 1 point
    And another guide, this time on FLIPs and RBDs interaction
  42. 1 point
    Hi, do you have any audio drivers active? Just try deactivating them. Good luck!
  43. 1 point
    Thank you kleer001 and Midasssilver for your responses. It was a max problem; no such issues in maya.
  44. 1 point
    There is a specific syntax to create attribute arrays:

        i[]@name to create an array of integers
        f[]@name to create an array of floats

    I have more examples here, I hope that all are valid but you will see the pattern: http://lex.ikoon.cz/vex-arrays-and-matrices/

    So if you want to store an attribute and see it in the Geometry Spreadsheet, just add this line:

        int points[] = pcfind(0, "P", pos, ch("radius"), 300);
        i[]@points = points; // add this line

    Btw for me those [] were quite confusing. And still are. When working with a variable array, you write name[] only when you are defining it. Then use just the name. When working with an attribute array, you write i[]@name even when you are reading from it.
  45. 1 point
    Hi. How about computing a local space per primitive instead, and then getting the noise position from the point position in that local space? Some sort of edge based UV unwrap.

        // Primitive wrangle.
        int pts[] = primpoints(0, @primnum);

        // Compute averaged primitive normal from point normals computed from their neighbours.
        vector normals[];
        foreach (int pt; pts)
        {
            vector normalized_edges[];
            vector pt_pos = point(0, "P", pt);
            foreach (int nb; neighbours(0, pt))
            {
                vector nb_pos = point(0, "P", nb);
                append(normalized_edges, normalize(pt_pos - nb_pos));
            }
            append(normals, normalize(avg(normalized_edges)));
        }
        vector normal = normalize(avg(normals));

        // Compute edge tangent.
        vector pt0 = point(0, "P", pts[0]);
        vector pt1 = point(0, "P", pts[1]);
        vector edge = normalize(pt0 - pt1);

        // Compute bitangent and orthonormalize the matrix.
        vector perp = normalize(cross(normal, edge));
        normal = normalize(cross(edge, perp));
        3@tangent_space = set(perp, normal, edge);

    Final deformation code:

        // Point wrangle.
        int prim;
        vector uv;
        xyzdist(1, @P, prim, uv);
        matrix3 tangent_space = prim(1, "tangent_space", prim);
        vector pos = @P * invert(tangent_space);
        float deform = noise(pos * {10,1,100}) * 0.05;
        v@P += v@N * deform;

    Some image sampling could work too: tangent_space_noise.hipnc
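    The orthonormalization step (perp, normal, edge) can be sketched in plain Python with hand-rolled 3-tuple helpers. This mirrors the cross-product logic of the wrangle above; the helper functions are made up for illustration, not any Houdini API:

```python
import math

def cross(a, b):
    # Standard 3D cross product.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def tangent_space(normal, edge):
    # perp is orthogonal to both inputs; re-crossing makes normal
    # orthogonal to edge, so the three rows form an orthonormal basis.
    perp = normalize(cross(normal, edge))
    normal = normalize(cross(edge, perp))
    return (perp, normal, edge)

basis = tangent_space(normalize((0.1, 1.0, 0.0)), normalize((1.0, 0.0, 0.0)))
# All rows are unit length and mutually orthogonal (up to float error).
```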
  46. 1 point
    My very first post on ODForce. Thanks Polvy... I extended your selector for string-based selection (using multiple concatenated string menus). (Note: it won't work with a normal ordered menu, as the token seems to evaluate to a number regardless.) A nice aspect is it keeps my flow in readable form... (nb. case and typo sensitive) You could easily do this with some VEX in a wrangle, but for some this might be helpful. selector_StringTypes_02.hiplc
  47. 1 point
    OK, I solved the thing... I'm about to render it... as glass. How are you guys doing this? Do you have some workaround for this? I mean, "Material Fracture" is fractured all the way from frame 1, so it shows in the render as well. Is there some way to mix the "uncracked" and "cracked" mesh based on some attribute? What would be the best way to handle this? The picture shows frame 1; the cracking starts on frame 5 or so...
  48. 1 point
    My open source solution to render volume light effects in Houdini Mantra, based on the first part of this paper.
    - Equiangular sampling (better sampling around light sources)
    - Per-light export
    - Physically correct result
    - Support for colored absorption (attenuation)
    Source code and example here: https://github.com/somesanctus/volumelight
  49. 1 point
    I promised that I would publish some source files, and here they are. Inside you can find some networks from the demo video, from pre-work to render. All assets are unlocked (I used them for git); don't pay attention to that. Happy X-mas. Tree_generator_demoscene_unlocked.hipnc
  50. 1 point
    if you have fairly simple geometry, you could use the old tristrip trick or just remove the edges between pairs of triangles, like freaq said. quads.hip