
Leaderboard


Popular Content

Showing most liked content since 03/08/2019 in all areas

  1. 8 points
    nature.hipnc just to say hello and share some stuff. /cnc_verkstad/ Tesan Srdjan
  2. 7 points
    Article on SideFX.com: https://www.sidefx.com/community/houdini-175-launch-event/
  3. 6 points
    I also moved all the tumblr example files I've been sharing onto my new website that you can find here. https://www.richlord.com/tools
  4. 6 points
    More Unlimited Fun nature2 fun.hipnc
  5. 5 points
    Here's a music video I made using a bunch of the techniques from this thread.
  6. 5 points
    Hi, maybe this approach might work:

    int count = npoints(1);
    for (int i = 0; i < count; i++)
    {
        vector camP = point(1, "P", i);
        vector dir = normalize(camP - @P);
        float bias = 0.01;
        vector hit_ps[];
        int hit_prims[];
        vector hit_uvws[];
        int hits = intersect_all(0, @P, dir * 100, hit_ps, hit_prims, hit_uvws);
        if (hits && distance(@P, hit_ps[-1]) > bias)
        // or just check the number of intersections:
        // if (hits > 1)
        {
            @Cd *= 0.6;
        }
    }

    cull.hiplc
  7. 5 points
    Hi. How about computing a local space per primitive instead, and then getting the noise position from the point position in that local space? Some sort of edge-based UV unwrap.

    // Primitive wrangle.
    int pts[] = primpoints(0, @primnum);

    // Compute an averaged primitive normal from point normals
    // computed from their neighbours.
    vector normals[];
    foreach (int pt; pts)
    {
        vector normalized_edges[];
        vector pt_pos = point(0, "P", pt);
        foreach (int nb; neighbours(0, pt))
        {
            vector nb_pos = point(0, "P", nb);
            append(normalized_edges, normalize(pt_pos - nb_pos));
        }
        append(normals, normalize(avg(normalized_edges)));
    }
    vector normal = normalize(avg(normals));

    // Compute edge tangent.
    vector pt0 = point(0, "P", pts[0]);
    vector pt1 = point(0, "P", pts[1]);
    vector edge = normalize(pt0 - pt1);

    // Compute bitangent and orthonormalize the matrix.
    vector perp = normalize(cross(normal, edge));
    normal = normalize(cross(edge, perp));
    3@tangent_space = set(perp, normal, edge);

    Final deformation code:

    // Point wrangle.
    int prim;
    vector uv;
    xyzdist(1, @P, prim, uv);
    matrix3 tangent_space = prim(1, "tangent_space", prim);
    vector pos = @P * invert(tangent_space);
    float deform = noise(pos * {10,1,100}) * 0.05;
    v@P += v@N * deform;

    Some image sampling could work too: tangent_space_noise.hipnc
  8. 5 points
    Finally, after two years of full stagnation, I found the energy to fully revamp it using HUGO (which is freaking cool) and update it with the very latest projects. Unfortunately 2 projects I have been working on I am not allowed to share (yet)… one of them was 8 months of my life so I am pretty gutted, but hopefully soon it will be ok to say we did it. https://jordibares.com Most of it now is Houdini of course… :-)
  9. 4 points
    "Organics" Houdini 17.5 & Redshift Cheers, Tom
  10. 4 points
    Another version of the same system. Here's the hip. It's a bit bigger because I stashed the growth geo. 19_03_28_green_growth.hiplc
  11. 4 points
    This is not really on topic given the other stuff on this thread, but I made some growth a while back, and finally got around to animating it. Attached the hip. It's not very robust, but I got what I needed out of it, so I'm calling it good. 19_03_26_animated_growth.hiplc
  12. 4 points
    A lot of people asked me to share this fake fire method. If you're interested, you can check this simple hip. After the render I used ACES for a better look. fake_fire_rnd.hip
  13. 3 points
    Because the three values define an imaginary line going from 0,0,0 to the position in space described by those 3 values. That point can be far away or it can be close to 0,0,0. This way the 3 values are able to define both a direction and a length. -b
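    In code terms, the length and direction encoded by three values can be recovered like this; a minimal Python sketch (the vector value is just an example):

```python
import math

# A 3-component vector encodes both a direction and a length:
# the line from the origin (0,0,0) to the point (3,4,0).
v = (3.0, 4.0, 0.0)

length = math.sqrt(sum(c * c for c in v))   # magnitude of the vector
direction = tuple(c / length for c in v)    # unit vector along it
```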
  14. 3 points
    Again, not for groups, but if you create a primitive string attribute on your geo you can use stylesheets to override the cryptomatte property:
    1. Create a primitive string attribute that holds your "id" value or unique name (or just convert your groups to a name attrib). In this case I created a unique s@name attrib per box.
    2. Create a property on your object (in this case I called it: masks).
    3. Create a stylesheets override for Primitives with an Override Script that uses name attribute binding to override your cryptomatte property.
    Example file: ts_cryptomatte_from_attribute.hip
  15. 3 points
    Some new finished work! Been working a long time on this one. Not totally finished yet, there may be some cleaning left to do... Any feedback is welcome.
  16. 3 points
    I've made this to save time loading shaders from Substance Source, but should work for any substance material. Just select the parent folder of the texture you want. Figured some others might find it useful too. It has proper viewport handling of Base Colour, Roughness, Metallic, Normal, Emission and Displacement. Not 100% perfect, but pretty damn close. Hdalc for now. Tried to upload to orbolt but kept getting an error (Invalid node type name). Don't think it likes redshift nodes. If anyone has any experience with this let me know! MT_RSSubstance_1.0.hdalc
  17. 3 points
    Creating geometry beforehand is not a bad idea, because it lets you tweak an otherwise hard-to-control process. Do you have a depth map for your image (from a render or stereo camera, or is the challenge to create an illusion of one)? In case you want to go right from pixels to voxels and you have some depth channel, the example uses Houdini's render with a P-plane, which holds depth in the (RG)B channel. Anyhow, it's just a gray image after all... volume_from_depth.hipnc
  18. 3 points
    If you're using H17, there's a much easier approach that's also a lot faster than the other solutions - just turn on the Use Piece Attribute toggle on Attrib Promote example_fix_03.hip
  19. 3 points
    I've used a Wacom pen since the Windows 3.1/95 days, when the tablet (called a "digitizer" back then) was huge (UD 1212 I think it was called, 12" x 12"), thick and clumsy as hell, the serial port a pita to set up, and the driver able to grind everything else to a halt when you moved it around quickly enough on those early PCs... ;-) These days I have an Intuos 4 Medium. But yeah, no wrist or other problems at all, the pen really makes a difference and is much more natural and agile. With middle click on the second side button of the pen, I almost never use the mouse, only in some weird software that doesn't deal well with the pen and shoots parameters through the roof when dragging a slider etc. Not so many of those around anymore though.

    For me using the pen never was an issue and it came naturally. But I know that for some people it does not click. I think it partly has to do with posture: I hold the pen relatively upright with the index finger on the side button. This way, when clicking with the tip, the pen does not slide. I also have the tablet in front of me and the keyboard behind it, which works very well for me. One also has to get comfortable with zooming with the available shortcuts instead of the mouse wheel, since reaching for that scroll-wheel thingy on the tablet is cumbersome.

    And yeah, some things don't work so well, depending on the software. I always use the Tab key to open the node menu instead of the right mouse click, since Houdini is overly sensitive to the slightest movement while clicking and one always moves a tiny bit when using the side button. But that doesn't really bother me, I find a mouse much more clumsy in many other ways. Especially the direct positioning of the pointer to an absolute position is so much more muscle-memory friendly than mouse shuffling. Those alternative monster-mouse things didn't really convince me, even if they are supposed to be better for the hand - it's still moving around a brick... ;-)

    I highly recommend finding solutions for the things that bother you with the pen: different settings, different posture, different pens, tablet keys on the left or right, even a different size tablet. It's really worth it IMO. Cheers, Tom

    Edit: And make sure you disable the frigging "Windows Ink" crap in the Wacom driver, otherwise everything reacts like goo... ;-)
  20. 3 points
    Definitely go for a Wacom. I've exclusively used a Wacom for Houdini for many years now and it's just wonderful; it's a lot faster getting around and laying down nodes than with a mouse, on top of being ergonomic. If I only have a mouse I can barely get anything done and it's just tedious, I don't know how you people do it. Get a Wacom.
  21. 3 points
    I thought it fitting to post this here too ;). For better or worse, I'm launching a vfx and animation studio at the end of the week. Some of you may recognize some of the name (if you squint and look at it just right). http://theodstudios.com
  22. 3 points
    Trying out Alexey Vanzhula's modeling tools, called Direct Modeling Tools. Awesome workflow and it speeds up the process a lot. I had no idea what to model so it's a bit long. Video of the process here: https://www.youtube.com/watch?v=PaQ3ZgyDJEo
  23. 3 points
    I exposed some parameters, in case someone is looking for a screw nut / cog wheel generator ; ) hex_nut.hiplc
  24. 3 points
    Working with converted STEP data actually works quite well in Houdini in comparison to other packages. Please check the following: Normal node -> Add normals to Vertices, Cusp Angle 30, Weighting Method: By Face Area. If you still have problems you can give me your HIP file, I will have a look and try to help you.
  25. 3 points
    I usually do it like this: vdb_maskvel_dv.hip
  26. 2 points
    I see a couple of things happening. You have based your grain generation off of the proxy mesh for the ground plane. It is better to dedicate a unique object to this. Also I noticed that the points coming into the point deform have a count mismatch. This is due to the fact that when OpenCL is enabled, it generates a different amount of particles compared to the CPU rest version. Turn off OpenCL to reestablish the match. Basically you don't need the point deform version after making these modifications. But it could be used as a secondary material element. ap_LetterSetup_001_collProbs.hiplc
  27. 2 points
    Organics: "Seed" Organics: "Bud" Organics: "Leaf" These were all rendered in Redshift at very high resolutions (10800 square/14400x9600) for printing: https://www.artstation.com/thomashelzle/prints Love it! :-) Cheers, Tom
  28. 2 points
    Hth, a few more notes:
    - Now I found this great answer: https://www.sidefx.com/forum/topic/55015/#post-246795
    - The collision field is just a scalar field.
    - Btw, have you seen this great explanation? At 29:00 Jeff talks about divergence: https://vimeo.com/42988999
    The "make divergence free" part of the smoke solver is three steps (2.A or 2.B or 2.C, depending on the method you choose):
    1. Gas Enforce Boundary http://www.sidefx.com/docs/houdini/nodes/dop/gasenforceboundary.html
    2.A Gas Project Non Divergent https://www.sidefx.com/docs/houdini/nodes/dop/gasprojectnondivergent.html
    2.B or 2.C Gas Project Non Divergent Multigrid (or its OpenCL version enabled) https://www.sidefx.com/docs/houdini/nodes/dop/gasprojectnondivergentmultigrid.html
    3. Gas Feedback http://www.sidefx.com/docs/houdini/nodes/dop/gasfeedback.html
  29. 2 points
    Hi guys, I saw on YouTube an animated magnetic resonance scan, and I tried to reconstruct that real head using the 2D sequence from that video... here is the result: (The video was 720p; perhaps with better resolution and quality I could generate more detailed geometry.)
  30. 2 points
    @Atom I've been looking at Python in Houdini only recently, so I look forward to getting more people's insight. Put it in any of the locations corresponding to $HOME, $HSITE, $HIP, $JOB inside a structure like {location}/{houdiniversion}/python2.7libs/. For example I have $HSITE at F:/HOUDINI/HSITE/ so I put huilib.py in F:/HOUDINI/HSITE/houdini17.0/python2.7libs/. After this, import huilib in the shelf tool Script field will be able to find it.

    As for running the examples, here I'm not so sure about the correct method, but what I've been doing is putting my scripts in a subfolder ../python2.7libs/PRB_HOM/ which contains an empty __init__.py, and then calling those scripts from the shelf with import. For example, I've put in said folder the example shipped with huilib, all_gadgets.py, while changing the bottom part of the script from:

    if __name__ == '__main__':
        ui = TestDialog(name = 'test', title = 'Test UI')
        ui.show()

    into:

    def main():
        ui = TestDialog(name = 'test', title = 'Test UI')
        ui.show()

    if __name__ == '__main__':
        main()

    And then in the shelf tool Script field I can do:

    from PRB_HOM import all_gadgets as tool
    reload(tool)
    tool.main()

    And this way it gets picked up. This is how I'm running my shelf tools in general, so I can store .py files in a repo and use an external editor, and not have the scripts stored in the .shelf file. But I'm curious to know if this is "the proper" way to do it, @Stalkerx777 or anyone! It would still be nice if we could work in the Parameter Editor and store that into a .ui file. Cheers

    PS: I failed to refresh the page and notice you guys already sorted it
  31. 2 points
  32. 2 points
  33. 2 points
    And something completely different. I'm currently testing a LG 75" TV as a monitor (75UK6500PLA). On the photo it looks much smaller than it feels in reality: ~1.7m wide and 1m high, about 1.5m away from me. My table is 2m wide and 1m deep, the top is 1.1m from the floor. It is unbelievable how, for instance, the ART animation a couple of posts above looks in 4k at this size. A totally different impression from my old 30" Dell monitor. Instant gallery-feel :-)

    What's also cool is the fact that suddenly the table has much more space on top, and because of the distance to the monitor, it feels much less crowded. Image quality feels very good so far; the pixels of course are relatively large at this scale, but I totally love working in Houdini this way. Many other applications can be kept smaller and distributed over the screen - I didn't scale windows itself. Screen brightness is quite even, with a slight darkening towards the edges, but nothing that would worry me. Of course, because of the size and being relatively close, one has to move the head and so looks at the screen surface at different angles, but again, none of that feels problematic so far.

    At first I was quite worried since the mouse/wacom was lagging a lot, but then found that the "game mode" takes care of that. I'm also disabling all image "enhancement stuff", dynamic brightness etc. of course. It's connected via HDMI to my GeForce GTX 1080 TI. I haven't tried it as a TV so can't comment on that (I stopped watching TV almost 20 years ago ;-) ). Cheers, Tom
  34. 2 points
    Thanks jmarko :-) And I certainly agree about Houdini - it helps us grow as artists OpenGL rendered from the viewport... Cheers, Tom
  35. 2 points
    Just to select a random end point per curve, without any SOP or VEX for loops, you can do (point wrangle):

    int index = -(rand(@primnum) < 0.5);
    int pts[] = primpoints(0, @primnum);
    i@group_randomend = pts[index] == @ptnum;
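    The core trick is that negating a boolean comparison gives index 0 or -1, and a negative index picks from the end of the array in both VEX and Python. A minimal Python sketch with hypothetical point numbers:

```python
import random

# (rand < 0.5) is 0 or 1; negating it gives index 0 or -1,
# which selects either the first or the last point of the curve.
pts = [12, 5, 8, 3]                  # hypothetical point numbers of one polyline
index = -(random.random() < 0.5)     # 0 or -1
end_point = pts[index]               # first or last point, chosen at random
```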
  36. 2 points
    Hey! With the desire to contribute to this community, I will use this thread to collectively post some of my setups here, in an attempt to help others learn Houdini. My experience was that going through established setups was a great way to learn - and that breaking working setups and learning what breaks them teaches things that tutorials never really touch. So whenever I found a page sharing advanced setups like that, my heart lit up, and this thread will be an attempt to recreate this feeling in others. Being a current student at Filmakademie BW, this thread is also an homage to the work of my (just graduated) great colleague Juraj Tomori - and his "Juraj's Playground". (Here is the link) Enough babbling about, let me start off by showing my reel of the works that accumulated over the past two years in which I worked with Houdini. The first post will also be sharing the first setup shown: the arena shot of the plasma cannon destroying a concrete pillar. The entire sequence was my bachelor thesis about "Procedural 3D for Film Using Houdini", where I tried explaining a procedural approach in written form and detailing my setups. If anyone is interested, I will also be sharing it here. The arena environment was also procedurally modelled in Houdini, which I will also be sharing a setup on in the next couple of days. I hope you will like it, Cheers, Martin
  37. 2 points
    You can also put an extra argument in the foreach: the index you are running over.

    // Run over primitives; use this on a polyline and you will get the order of the points.
    int primpts[] = primpoints(0, @primnum);
    foreach (int index; int i; primpts)
    {
        setpointattrib(0, "index", i, index, "set");
    }
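    For readers more familiar with Python, the foreach-with-index pattern corresponds to `enumerate`; a minimal sketch with hypothetical point numbers:

```python
# VEX's foreach(index; value; array) is Python's enumerate(array):
pts = [7, 3, 9]                 # hypothetical point numbers of a polyline
order = {}
for i, pt in enumerate(pts):
    order[pt] = i               # store each point's position along the curve
```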
  38. 2 points
    I've attached a cleaner and updated (for H17.0+) version of this hip file. In the original file, the rubber toy was in there for initial tests. It serves no purpose. Load your alembic geo in as houdini geo or delayed load primitives. Put down an unpack sop, and a convert sop to manipulate it in Houdini. If you're in need of more help, upload your file and your alembic caches so I or anyone else can set you in the right direction. run_smoke_v003.hipnc
  39. 2 points
    - Convert your velocity volume to VDB.
    - Make sure it's a single vec3 VDB primitive (VDB Vector Merge SOP).
    - Convert to a speed float VDB (VDB Analysis SOP: Length).
    - Compute Minimum or Maximum or another statistic and store it in a Prim attrib or Detail or ... (Volume Reduce SOP).
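    As a rough illustration of what the Length analysis and reduce steps compute (not the VDB API itself), here is a minimal Python sketch over made-up velocity samples:

```python
import math

# Hypothetical per-voxel velocity samples; the length of each vector is the
# speed (VDB Analysis: Length), and a reduce such as max gives the summary
# value (Volume Reduce: Maximum).
velocities = [(0.0, 1.0, 0.0), (3.0, 4.0, 0.0)]
speeds = [math.sqrt(x*x + y*y + z*z) for x, y, z in velocities]
max_speed = max(speeds)
```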
  40. 2 points
    I have dealt with some of the same issues as you @caskal and my solution was as everyone else suggested. I have a Wacom tablet and an evoluent vertical mouse. I use my left hand with the Wacom pen and right hand with the mouse. This way I can easily switch between them and not overuse my wrist.
  41. 2 points
    Post a hip file so we can diagnose the issue
  42. 2 points
    I had similar problems, but I've found the ultimate solution (my doc told me): try a tennis ball and massage your forearm muscles. Sounds super stupid but it's awesome! There are also special massage balls for this purpose. It's worth a try!
  43. 2 points
    Hello! My name is Joyce Kambey. I currently reside in Singapore and am looking for a job. I will be graduating from 3dsense Media School (VFX Specialisation) in 1 week. Before I came to 3dsense I had 3 years of production experience as a character rigger and 3D animator. I have used Houdini for rigging & animation work for almost 3 years, and for FX work for 8 months. I have attached my resume to this topic. Here's my demoreel:
    Links:
    LinkedIn: https://www.linkedin.com/in/joycekambey/
    Vimeo: https://vimeo.com/joycekambey
    ArtStation: https://www.artstation.com/joycekambey
    E-mail: joyce.kambey@gmail.com
    JoyceKambeyResume2019.pdf
  44. 2 points
    Ah okay, then I would use expressions such as below to calculate that point without having to do a merge. Add these to the pivot translate of the 'rotate' xform node, and vice versa for 'rotate1' (just edit the path of the second centroid function):

    (centroid(opinputpath(".", 0), D_X) + centroid("../rotate1/", D_X)) / 2
    (centroid(opinputpath(".", 0), D_Y) + centroid("../rotate1/", D_Y)) / 2
    (centroid(opinputpath(".", 0), D_Z) + centroid("../rotate1/", D_Z)) / 2
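    The three expressions each average one axis of two centroids, i.e. they compute a midpoint; a minimal Python sketch of that arithmetic (function name and values hypothetical):

```python
# The pivot is the per-axis midpoint of the two objects' centroids.
def midpoint(a, b):
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

pivot = midpoint((1.0, 2.0, 0.0), (3.0, 4.0, 2.0))
```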
  45. 2 points
    just an idea you could explore: cancer.hipnc
  46. 2 points
    Here is a better preview of one of the textures.
  47. 2 points
    Hello all. I wanted to share my production library of tools I have posted on GitHub. Previously a lot of these were posted on Orbolt.com after my 2016 GDC talks. The tools were originally designed to be used in a HOUDINI_PATH style library as opposed to individual hdas. In this manner they can include a larger array of script-based tools that don't fall into the paradigm of Orbolt. This toolset is generally complementary to the GameDev, MOPs, etc. toolsets; there are only a few overlaps nowadays. There are two primary libraries of tools.

    https://github.com/LaidlawFX/LaidlawFX - This is the production library built up over the last decade of work, cleansed of studio-specific work. It currently contains a set of HDAs, a common python library, shelf tools, and example files on how to use the hdas. Some fun tools in here are an Ivy Generator inspired by Thomas Luft's Ivy generator http://graphics.uni-konstanz.de/~luft/ivy_generator/, File Cache nodes that have been tested on over a dozen productions, RBD to FBX extractors, and an array of ROP tools. Also a module of python code that includes a few sub-modules such as verbosity logging, multiple hda queuing, fbx exporting, file explorer opening, and the large set of options for file caching. Additionally it contains shelf scripts so that you no longer need to use the Material Library; you can just tab and use the shaders like normal nodes.

    https://github.com/LaidlawFX/HoudiniEngineDev - This is over one hundred hdas to test Houdini Engine implementations in different DCCs and game engines. Each of them tests a simple principle, from UI parameters to different geometry types and more.

    Hope they help you at some point. Feel free to branch or leave comments here and I can update the source. Thanks, -Ben
  48. 2 points
  49. 2 points
    During the last 3 weeks, I did some RnD and published my results on Vimeo. Some people asked me to share my files here, so here we are. I hope it will help!
  50. 2 points
    O! I've found the answer: "/home/work/mytexture.`padzero(4,$F%150)`.tif"
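    For those who read Python more easily than HScript expressions, a hypothetical equivalent of that string (same path from the post; the frame number cycles every 150 frames and padzero pads it to 4 digits):

```python
# Python sketch of "/home/work/mytexture.`padzero(4,$F%150)`.tif"
def texture_path(frame, cycle=150, pad=4):
    # frame % cycle makes the sequence loop; %0*d zero-pads like padzero().
    return "/home/work/mytexture.%0*d.tif" % (pad, frame % cycle)
```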