

Popular Content

Showing most liked content since 07/06/2020 in Posts

  1. 7 points
    Yes I do! I just created one. Here you go!
  2. 6 points
    My latest reel, 2020. Collection of shows 2016-2019.
  3. 5 points
    Just sharing my 2020 reel. I'm available. https://vimeo.com/429294957 Thanks! Daniel Moreno http://www.danmoreno.com https://vimeo.com/danmoreno https://www.linkedin.com/in/danmoreno https://www.imdb.com/name/nm1625127/
  4. 3 points
    Hello! So I created a few tools for a recent project for creating trees, and I thought I'd share them with the community. This is the first toolset I've ever created, so if you like it, consider donating a few bucks on Gumroad. I currently have it as a "pay what you want" product. You are more than welcome to try it out and send suggestions for potential future updates. Hope you like it! https://gum.co/nEGYe
  5. 2 points
    You are welcome. Submitting RFEs doesn't hurt. The C++ implementation is still much faster than laspy, but until the LIDAR Import SOP gets some improvements, it's possible to make a quick, specific modification for las import without going the C++ route. I attached an example that adds classification as an attribute (and closes the las file to avoid memory issues). I added an updated file, pz_load_las_with_python_classification_attribute_memory_fix.hipnc: it looks like the context management protocol is not implemented in laspy (a with File(): block will not close the file on exit), so I switched to try/except/finally instead. It will not error on the node, so watch the Python console for exception logging.
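    Since laspy's File reportedly doesn't implement the context-manager protocol, the safe-close pattern described above looks roughly like this. A minimal sketch with a hypothetical stand-in class, since laspy and hou aren't available outside Houdini:

```python
import logging

class FakeLasFile:
    """Stand-in for laspy.file.File (hypothetical), which lacks __enter__/__exit__."""
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

in_file = FakeLasFile()
try:
    pass  # read points, build attributes, etc.
except Exception:
    # log instead of re-raising so the Python SOP itself doesn't error;
    # the traceback shows up in the Python console instead
    logging.exception("Processing lidar file failed")
finally:
    in_file.close()  # runs whether reading succeeded or failed
```

    The finally block guarantees the file handle is released even when an exception is swallowed by the except clause.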
  6. 2 points
    The classification works fine. But with inFile = inFile.points[I] you are overwriting the inFile object with multi-dimensional arrays of all attributes, so you can no longer access them via .x/.y/.z or other named properties. I uploaded a modified scene where you can set a classification and it returns only the subset of points that match it. inFile.points[I] returns the subset of points where I is True; inFile.Classification == 2 returns an array of True/False values marking which points are classified with id 2. Another approach would be adding Classification as an attribute to all points and then using VEX expressions, groups, or other partitioning mechanisms to separate the points. pz_load_las_with_python_classification.hipnc
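    The masking idea above can be sketched with plain numpy, independent of laspy (the classification ids below are made up for illustration; 2 is ground in the LAS spec):

```python
import numpy as np

# hypothetical classification ids for six points
classification = np.array([1, 2, 2, 5, 2, 6])

mask = classification == 2          # array of True/False, one entry per point
indices = np.flatnonzero(mask)      # indices of the matching points

coords = np.arange(18.0).reshape(6, 3)  # dummy xyz per point
ground = coords[mask]                   # subset, analogous to inFile.points[mask]
```

    The same boolean array can index every per-point array consistently, which is why keeping inFile intact (instead of overwriting it) matters.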
  7. 2 points
    Hi, pretty neat library! Thank you for the tip. There is no need for csv - you can do a lot with laspy and numpy themselves. Attached is an example scene that loads data from a las file. It seems that the Lidar Import SOP ignores scale and offset. To make it work (18.0.499, Python 2.7 branch) I cloned the https://github.com/laspy/laspy repository, then copied the contents of the laspy folder to $HOME/houdini18.0/python2.7libs/laspy so that I have $HOME/houdini18.0/python2.7libs/laspy/__init__.py (and the rest of the library), and it's possible to load it into Houdini with import laspy in the Python shell. (Numpy is already included with Houdini.) I used the example file from the repository: https://github.com/laspy/laspy/blob/master/laspytest/data/simple.las

        import logging
        from laspy.file import File
        import numpy as np

        node = hou.pwd()
        geo = node.geometry()
        file_path = geo.attribValue("file_path")

        inFile = File(file_path, mode='r')
        try:
            # --- load point position
            coords = np.vstack((inFile.x, inFile.y, inFile.z)).transpose()
            scale = np.array(inFile.header.scale)
            offset = np.array(inFile.header.offset)
            # there is no offset in the simple.las example from the laspy library
            # offset = np.array([1000, 20000, 100000])  # just for testing that offset works
            # geo.setPointFloatAttribValues("P", np.concatenate(coords))  # same as Lidar Import SOP - seems that it ignores scale (and offset?)
            geo.setPointFloatAttribValues("P", np.concatenate(coords * scale + offset))

            # --- load color
            color = np.vstack((inFile.red, inFile.green, inFile.blue)).transpose()
            geo.addAttrib(hou.attribType.Point, "Cd", (1.0, 1.0, 1.0), False, False)  # add color attribute
            geo.setPointFloatAttribValues("Cd", np.concatenate(color / 255.0))  # transform from 1-255 to 0.0-1.0 range
        except Exception:
            logging.exception("Processing lidar file failed")
        finally:
            inFile.close()

    pz_load_las_with_python.hipnc
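    The coords * scale + offset step relies on numpy broadcasting to apply the per-axis header scale and offset in one expression. A standalone sketch with made-up values:

```python
import numpy as np

# raw integer coordinates as stored in a .las file (made-up values)
coords = np.array([[100, 200, 300],
                   [400, 500, 600]], dtype=np.float64)
scale = np.array([0.01, 0.01, 0.01])       # like inFile.header.scale
offset = np.array([1000.0, 2000.0, 50.0])  # like inFile.header.offset

world = coords * scale + offset  # (2,3) * (3,) + (3,) broadcasts per axis
flat = world.ravel()             # flat float sequence, the form setPointFloatAttribValues wants
```

    Broadcasting lets each axis keep its own scale/offset without any explicit loop over points.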
  8. 2 points
    To fill quadratic polygons with copies, just take the square root of the intrinsic primitive area as the point scale:

        int pt_add = addpoint(0, v@P);
        float area = primintrinsic(0, 'measuredarea', i@primnum);
        float scale = sqrt(area);
        setpointattrib(0, 'pscale', pt_add, scale, 'set');
        removeprim(0, i@primnum, 1);

    KM_recursive_subd_001.hipnc
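    Why the square root works: for a square face, area = edge², so sqrt(area) recovers the edge length, which is exactly the pscale a unit-sized copy needs to cover the face. A quick numeric check (edge length made up):

```python
import math

edge = 2.5
area = edge * edge        # the measured area of a square face
pscale = math.sqrt(area)  # point scale for a unit-sized copy

# a unit copy scaled by pscale spans exactly one edge length
```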
  9. 2 points
    To import a specific script in Python you need to append the folder containing it to the Python path (sys.path holds all the paths Python searches when loading modules). Note that you append the folder, not the script file itself. You can do it using the sys module:

        import sys
        sys.path.append("/path/to/the/folder/where/my/script/is")
        import myscript

    And if you want to import just a specific function from your file rather than the whole module:

        import sys
        sys.path.append("/path/to/the/folder/where/my/script/is")
        from myscript import myfunction

    Cheers,
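    An alternative that avoids mutating sys.path is the standard-library importlib, which loads a module straight from a file path. The myscript.py below is a throwaway file written at runtime just so the sketch is self-contained:

```python
import importlib.util
import os
import tempfile

# write a throwaway myscript.py so the example can run anywhere
folder = tempfile.mkdtemp()
script_path = os.path.join(folder, "myscript.py")
with open(script_path, "w") as fh:
    fh.write("def myfunction():\n    return 'hello'\n")

# load the module directly from its file path
spec = importlib.util.spec_from_file_location("myscript", script_path)
myscript = importlib.util.module_from_spec(spec)
spec.loader.exec_module(myscript)

result = myscript.myfunction()
```

    This keeps the import local to one node instead of changing the search path for the whole session.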
  10. 2 points
    I built a non-linear editor/clip mixer for Houdini. On pre-sale right now (PC only, while I continue to work on the Mac version). It's great for bringing in a bunch of different FBX mocap files and mixing and blending them together with a graphical interface. https://gum.co/houdiniClipMixer
  11. 1 point
    Hi, the goal is to create a landscape based on LiDAR and use it in UE4 to finally get a fulldome projection for a planetarium. LiDAR files (scanned by the GEO departments of a country, for example) in many cases come with lots of useful information like the positions of vegetation, buildings, water and of course the ground. https://en.wikipedia.org/wiki/Lidar https://geodetics.com/lidar-point-clouds/ These classifications can then be used to feed heightfields with masks and finally scatter vegetation at the "correct" positions. The screenshot shows an imported *.las file from Slovenia with 5 different classes. Right now the website seems to be down. Anyway, here is a link that shows how to create DEMs from these LiDAR files - hopefully the website will be back online soon: http://paleoseismicity.org/tutorial-how-to-make-a-dem-from-the-slovenian-lidar-data/ http://gis.arso.gov.si/evode/profile.aspx?id=atlas_voda_Lidar@Arso&culture=en-US (currently offline) The next screenshot shows the progress from LiDAR inside Houdini with the ground and vegetation classes, converted to a heightfield with scattered instances based on the LiDAR vegetation class points, and finally imported into UE4. And here is the world map of that area (the projection and date are different, so in case you wonder why it looks a little bit different, see https://en.wikipedia.org/wiki/Map_projection for more information about map projections): Here is a first try at importing a LiDAR-based HDA into UE4. There is VERY little erosion, to keep the main shape of the LiDAR terrain, but you can see how detailed the shape of the landscape is just out of the box. The trees are placed at real positions (depending on the date the LiDAR was scanned, of course). You can even see streets/paths and the foundations of the buildings - just to show how detailed LiDAR can be. It's one heightfield at the maximum resolution of 8129 x 8129 (see the UE4 manual for landscapes: https://www.sidefx.com/docs/unreal/_landscapes.html#LandscapeSize).
Depending on the points of the LiDAR file you could tile the heightfield to get even more detail. And finally a video running through the forest (it's also from the Slovenia LiDAR files, but a part with far more forest, since we made an animation with the GPS data of bears. The position is here: Google maps position of the UE Forest). For fun I definitely want to build in some animals, probably bears, just to feel the fear of a dark forest (and no, these animals are awesome and no danger to us, so don't be afraid - the chance of meeting one of these beautiful animals is extremely rare and we are not part of their dinner plan). Sorry for jumping around so much in the video, but since there are many micro elevations, the character sometimes gets stuck (and too much Quake 3 in the past). To avoid that, you could for example use an HF resample and smooth out these little obstacles. https://youtu.be/i_3SaAJ8lsM I'll try to keep this post updated with my progress. Cheers and thanks for all the amazing help here. sant0s
  12. 1 point
    Yes, it should. Usually, they come in pairs. Hip file attached. ________________________________________________________________ Vincent Thomas (VFX and Art since 1998) Senior Env artist & Lighting & MattePainter & Creative Concepts http://fr.linkedin.com/in/vincentthomas Human_Lsys.hipnc
  13. 1 point
    Another quick test - so the HF is at the maximum for UE4, meaning 8129 x 8129. Of course there is only one texture now and it looks shitty - but this was only a test of the resolution. The LiDAR file out of the box, combined with super little erosion to keep the main shape, already gives some crazy detail on the ground. The trees are also at the "correct" positions, filtered by the classification. Tomorrow I will try it with more layers of the HF and the different classifications from the LiDAR. Have a good evening everyone.
  14. 1 point
    Julien means the same thing I already mentioned. Loading and building attributes in Python is much slower than doing the same thing in C++ most of the time. The Lidar Import SOP is a fast C++ node compared to pylas. Loading your *.las example takes 20ms with the Lidar Import SOP on my machine, but the same file takes almost 6 seconds with a Python SOP using pylas (3 seconds just for transforming the numpy ndarray to serialized form and another 2.5 seconds for the setPointFloatAttribValues method). If you have enough memory and you end up reading most of the file anyway, it's faster to use the Lidar Import SOP to load all the points quickly and then use a Python SOP to add only the additional data that the Lidar Import SOP cannot read (like classification), and blast what you don't need. Like this:
  15. 1 point
    Nothing fancy, just to show how easy it is to go from LiDAR to UE4 with that cool tool now.
  16. 1 point
    here...all I can say the best thing about this so far is....it's bloody finicky !!! - in secret sauce, try disabling it...won't work....even fracture it with 1 fracture point, ie. no actual break, won't work...you have to do an actual dummy break with 2 points...must be something under the hood. after that, controls are at best.......finicky. vu_ballPuller.hiplc
  17. 1 point
    If I understand the question correctly: how do you import into the PythonModule a custom script stored in the HDA? As in your screenshot, to import all the functions stored in "import_that_script" inside the PythonModule. There is an answer in the docs: https://www.sidefx.com/docs/houdini/hom/hou/HDAModule.html So in your case, in the PythonModule section:

        import toolutils
        myscript = toolutils.createModuleFromSection("myscript", kwargs["type"], "import_that_script")

    Then you can call a function from it as you would from an imported module:

        myscript.myfunction()
  18. 1 point
    Awesome thread! Loooove it!!! Beautifully executed!!
  19. 1 point
        string test = "";
        int searchPts[] = { 0, 1, 2, 3 };
        foreach (int i; int num; searchPts) {
            test += " " + itoa(num);
        }
        s@check = test;

    Try this.
  20. 1 point
    This looks like an unmitigated UX disaster.
  21. 1 point
    Maybe this helps: https://www.sidefx.com/forum/topic/56694/?page=1#post-254029 https://tosinakinwoye.com/2017/01/23/houdini-vex-snippets/
  22. 1 point
    Another tool that I built some time ago was a particle-to-debris tool. Just something to save myself from doing the same setup for the one-thousandth time. It automatically creates debris pieces from a simple box shatter and instances them onto incoming particles. It detects whether pscale/id is present and does all the stuff you would normally do for each debris sim. It also normalizes the volume of the debris pieces (with some fancy math) so the distribution of scale can be handled cleanly through the pscale on the particles.
  23. 1 point
    Hehe might have to do a rebrand to “Simply Advanced Tree Tools” or something like that
  24. 1 point
    Excellent! Thanks for the video. I downloaded it some time ago, but now I'm going to try it out immediately. True that "Simple Tree" may not be the best way to describe this tool anymore!
  25. 1 point
    For everyone who doesn't know it already: detailed explanations of VDBs, narrow-band level sets etc. from the founders at DreamWorks. Lots of explained examples in one file. A must-have for everyone: Website: https://www.openvdb.org/download/ File: https://artifacts.aswf.io/io/aswf/openvdb/houdini_examples.hip/1.0.0/houdini_examples.hip-1.0.0.zip Don't miss having a look, again and again... Cheers
  26. 1 point
    patreon.com/posts/38913618 Subdivision surfaces are piecewise parametric surfaces defined over meshes of arbitrary topology. It's an algorithm that maps a surface to another, more refined surface, where the surface is described as a set of points and a set of polygons with vertices at those points. The resulting surface will always consist of a mesh of quadrilaterals. The most iconic example is to start with a cube and converge to a spherical surface, but not a sphere: the limit Catmull-Clark surface of a cube can never approach an actual sphere, as it's bicubic interpolation and a sphere would be quadric. The Catmull-Clark subdivision rules are based on OpenSubdiv, with some improvements. It supports closed surfaces, open surfaces, boundaries by open edges or via sub-geometry, open polygons, open polygonal curves, mixed topology and non-manifold geometry. It can handle edge cases where OpenSubdiv fails or produces undesirable results, i.e. creating gaps between the sub-geometry and the rest of the geometry. One of the biggest improvements over OpenSubdiv is that it preserves all boundaries of sub-geometry, so it doesn't introduce new holes into the input geometry, whereas OpenSubdiv will just break up the geometry - blasting the sub-geometry, subdividing it and merging both geometries as is. Houdini Catmull-Clark also produces undesirable results in some cases, i.e. mixed topology, where it will either misplace some points or just crash Houdini due to the use of sub-geometry (bug pending). Another major improvement is for open polygonal curves, where it produces a smoother curve, because the default Subdivide SOP fixes the points of the previous iteration in subsequent iterations, which produces different results if you subdivide an open polygonal curve 2 times in a single node vs 1 time in 2 nodes, one after the other. This is not the case for polygonal surfaces.
The VEX Subdivide SOP applies the same operation at each iteration regardless of topology. All numerical point attributes are interpolated using Catmull-Clark interpolation. Vertex attributes are interpolated using bilinear interpolation, like OpenSubdiv. Houdini Catmull-Clark implicitly fuses vertex attributes to be interpolated just like point attributes. Primitive attributes are copied. All groups are preserved, except edge groups for performance reasons. The combined VEX code is ~500 lines.
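    For intuition on the curve case mentioned above: subdivision of a polygonal curve follows the uniform cubic B-spline rules - each round inserts edge midpoints and moves old vertices to (prev + 6*v + next)/8. A minimal Python sketch for a closed polygon (this is an illustration of the standard rules, not the actual VEX implementation from the post):

```python
def subdivide_closed(points):
    """One round of cubic B-spline subdivision on a closed polygonal curve."""
    n = len(points)
    out = []
    for i in range(n):
        prev, cur, nxt = points[i - 1], points[i], points[(i + 1) % n]
        # vertex rule: (prev + 6*cur + next) / 8
        out.append(tuple((p + 6 * c + q) / 8.0 for p, c, q in zip(prev, cur, nxt)))
        # edge rule: midpoint of the edge cur-next
        out.append(tuple((c + q) / 2.0 for c, q in zip(cur, nxt)))
    return out

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
refined = subdivide_closed(square)  # 8 points, pulled toward the smooth limit curve
```

    Repeating the function converges toward the limit curve, analogous to how repeated Catmull-Clark rounds converge toward the limit surface.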
  27. 1 point
    this one. https://www.youtube.com/watch?v=17z5oZCpPeA&t=8s
  28. 1 point
    Hello friend! I have a hip file available [for free of course] here: hopefully this can help you get started
  29. 1 point
    This is the official release of the Houdini Music Toolset (HMT)! Here's a tour and demonstration. Download and installation instructions, as well as documentation, can be found on Github. I'm also releasing two tutorials: 00 Installation and Sound Check 01 How to Make a Simple Note For the last 5 years I've been doing progressively more advanced music composition in Houdini. The merger of music and visuals has been a life-long passion for me. In addition to teaching dynamics and FX in Houdini, I've also given selected talks and demonstrations on my personal music developments to groups like the Vancouver Houdini User Group, the Los Angeles Houdini User Group, and the Procedural Conference in Breda. I always experience an overwhelming amount of enthusiasm and a supportive community. Here's my way of both saying thank you and furthering anyone who would also like to combine musical and visual art. The Houdini Music Toolset turns Houdini into a powerful music-making suite (a MIDI sequencer). Be sure to keep a lookout for free weekly tutorials covering the toolset and workflows. Enjoy!
  30. 1 point
    Hello Everyone, I put together a short video tutorial on how to use the particle system to break glue bonds that are holding together fractured geometry. You can view the video here: I have other videos posted in this link as well. http://forums.odforce.net/topic/17105-short-and-sweet-op-centric-lessons/page-5#entry127846
  31. 1 point
    Hi, in this tutorial you will learn procedural animation with Houdini's powerful procedural tools. Excellent for beginner Houdini users, users migrating from other 3D content creation packages (e.g., Blender, 3ds Max), and anyone interested in Houdini and using it for procedural animation. Download for free: https://artstn.co/m/oJoN WHAT'S INSIDE? 5 Video Chapters (117 min. of tutorial) Houdini Project File INTRODUCTION Chapter 1 Dividing Geometry Into Blocks - 22 min. Chapter 2 Gradual Emergence of Pieces - 15 min. Chapter 3 Procedural Animation of The Crane - 32 min. Chapter 4 Modeling The Crane - 24 min. Chapter 5 Metal Scaffolding - 24 min. TOOLS Houdini 17.5 / 18 MINIMUM REQUIREMENTS Houdini Beginner Download for free: https://artstn.co/m/oJoN
  32. 1 point
    Also found a (rather low-quality) recording of the talk I am referring to (roughly from minute 02:10 on). In this talk he states that for the connection, each pore looks for the nearest point in the direction of the two nearest curve points, and then one beyond. So my approach right now (testing this takes time, as I still struggle with VEX syntax) is: get the nearest points using the "nearestpoint" function, and build a vector from the current point to each of the near points. Then build a direction vector from the two nearest curve points and do a dot product to check which point matches the direction vector best, and connect those points with a curve once that is done. All in a point wrangle. After that I resample the curve, project it onto the surface using minpos, and subtract it as a VDB. Does that make sense, or is there a better way?
  33. 1 point
  34. 1 point
    Here is a quick hack with Librarian's plane dropped into Farmfield's wind tunnel. ap_ff_POP_wind_tunnel_effect_013020.hiplc
  35. 1 point
    I made this quick-and-dirty script to transfer material colors from a .mtl file to poly attributes. Maybe it will help someone.
  36. 1 point
    cccc @caskal Buddy Dude Golden tech... Explore the old file from 2012... Don't know who posted it... just use those VOP shaders... like Simon did... experimental fun. COPs, same principle. disp.hipnc
  37. 1 point
    Finally finished this example that's been sitting on my backburner. It's a really simple demonstration of how you can use the Pose Tool and Pose Scopes to get invisible rigs and motion paths in Houdini. Has anyone used them in production, by chance? What's been your experience? Caveat: that hip file is definitely NOT an example of good rigging practice, by any means! Industrial Robot Arm by UX3D on Sketchfab (CC BY-NC 4.0) invisible_rigs_motion_paths.hipnc
  38. 1 point
    I just found out about the radial basis function (RBF) in scipy.interpolate, which does an even better job at interpolating between irregular points. As it does not use Delaunay, it also lost the triangle-ish appearance it had before. RBF looks way smoother than what you'd get from the Attribute Transfer SOP, too. Btw: does someone know how to make the same code work with vector arrays in NumPy? I tried it in the commented lines, but currently I have to use three separate arrays for each color channel and therefore call each function three times as well :(

        import numpy as np
        from scipy.interpolate import griddata
        import scipy.interpolate as interp

        node = hou.pwd()
        geo1 = node.geometry()
        inputs = node.inputs()
        geo2 = inputs[1].geometry()

        method_nr = node.evalParm('method')
        method_names = 'multiquadric,inverse_multiquadric,gaussian,linear,cubic,quintic,thin_plate'.split(',')
        method_str = method_names[method_nr]

        grid_x = np.array(geo1.pointFloatAttribValues('px'))
        grid_z = np.array(geo1.pointFloatAttribValues('pz'))

        color_r = np.array(geo2.pointFloatAttribValues('cr'))
        color_g = np.array(geo2.pointFloatAttribValues('cg'))
        color_b = np.array(geo2.pointFloatAttribValues('cb'))
        #color = np.array(geo2.pointFloatAttribValues('Cd'))
        #np.reshape(color, (30, 3))

        pos_x = np.array(geo2.pointFloatAttribValues('px'))
        pos_z = np.array(geo2.pointFloatAttribValues('pz'))

        rbf_red = interp.Rbf(pos_x, pos_z, color_r, function=method_str)
        rbf_green = interp.Rbf(pos_x, pos_z, color_g, function=method_str)
        rbf_blue = interp.Rbf(pos_x, pos_z, color_b, function=method_str)
        #rbf_color = interp.Rbf(pos_x, pos_z, color, function=method_str)

        smooth_rbf_red = rbf_red(grid_x, grid_z)
        smooth_rbf_green = rbf_green(grid_x, grid_z)
        smooth_rbf_blue = rbf_blue(grid_x, grid_z)
        #smooth_rbf_color = rbf_color(grid_x, grid_z)

        geo1.setPointFloatAttribValuesFromString("clr_r", smooth_rbf_red.astype(np.float32))
        geo1.setPointFloatAttribValuesFromString("clr_g", smooth_rbf_green.astype(np.float32))
        geo1.setPointFloatAttribValuesFromString("clr_b", smooth_rbf_blue.astype(np.float32))
        #geo1.setPointFloatAttribValuesFromString("Cd", smooth_rbf_color.astype(np.float32))

    scipy_grid_to_color_points.hiplc
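    On the vector-array question above: one option is to solve the RBF weights for all three channels at once, since the kernel matrix depends only on the point positions. A hand-rolled numpy sketch with a Gaussian kernel (this is not scipy's Rbf, and the sample data is made up):

```python
import numpy as np

def rbf_interp(pos, values, query, eps=1.0):
    """Interpolate (N, d) values sampled at (N, k) positions, at (M, k) query points."""
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (eps * eps))
    # one linear solve handles every channel: weights has shape (N, d)
    weights = np.linalg.solve(kernel(pos, pos), values)
    return kernel(query, pos) @ weights

pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
colors = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])  # one Cd per point
interp_at_samples = rbf_interp(pos, colors, pos)  # reproduces the inputs at the samples
```

    Recent SciPy versions also expose a mode='N-D' argument on scipy.interpolate.Rbf for multi-dimensional values, which may do the same with the built-in kernels - worth checking against your SciPy version.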
  39. 1 point
    I put together this setup after watching the Ari Danish masterclass on height fields in Houdini 17. ap_hf_terrain_select_features_102618.hiplc This is the default Terrain: Hills, generated by the shelf tool, with a bunch of pre-made point selection sets already set up. Bedrock, sediment, water, debris, direction, occlusion, slope, flat areas and peak edges are provided. There is some custom masking taking place - for instance, I did not want rocks from debris inside my water, so I masked them out. The instancers, at the /obj level, reference these point sets to deploy various plant models. I have included a basic set of CC0 models consisting of 1 pine tree, 2 rocks and 11 grass meshes. When first opening the scene you may have to click the Reset Simulation button on the heightfielderode node to generate all the point sets. It should take a short amount of time to erode out 24 frames on the height field. You should be able to supply any of the height field shelf tools to the selection set input. Revisit the heightfieldmaskbyfeature node parameters to dial in the best point sets for your particular height field. Set up to render in Redshift.
  40. 1 point
    no loops necessary, and you should try to avoid them if you can. Also, the first point of the primitive doesn't guarantee that it's at the start or at the end, or even that the primitive has any start or end at all if it's closed. So you are better off using the points belonging to the first or last vertex, and either checking whether the prim is closed or using the neighbourcount() test. To detect all of them, you can combine the first test with a test of whether the point belongs to the first or the last vertex of the primitive:

        int pts[] = primpoints(0, @primnum);
        int isend = neighbourcount(0, @ptnum) == 1;
        int isfirst = @ptnum == pts[0];
        int islast = @ptnum == pts[-1];
        i@group_ends = isend;
        i@group_start = isend && isfirst;
        i@group_end = isend && islast;

    OR with the mentioned open/closed poly test:

        int pts[] = primpoints(0, @primnum);
        int isopen = !primintrinsic(0, "closed", @primnum);
        int isfirst = @ptnum == pts[0];
        int islast = @ptnum == pts[-1];
        i@group_ends = isopen && (isfirst || islast);
        i@group_start = isopen && isfirst;
        i@group_end = isopen && islast;
  41. 1 point
    Hi, I played around with your scene a bit. This is the result. I also ended up changing your extrusion expression to use exp() instead of pow(). The gist of it was to do 2 iteration phases: one is ahead, the other behind, and then just blend between those two. The challenging part was creating the attributes needed to specify the iteration level and the blending amount (check the blend_and_iter wrangle node to see how I processed it). Although this starts to chug when there are already too many polygons to process. I wanted to place it inside a compiled block, but PolyExtrudes are not yet compilable. H16.5.268 NC - Subdiv_Test_v2.rar
  42. 1 point
    Any way to do this at the bone level? I need to mix, or rather append, different FBX animations, but I would like to keep the bones.
  43. 1 point
    I do keep adding links as I find things. Because I am just editing the first page, it does not count as a reply. Any additions I make don't trigger a bump to the top of the forum. So check back often!
  44. 1 point
    don't mean to hijack... but I did these a while ago in Max MCG... I'm absorbing Houdini like a sponge, as you might have noticed through various posts, and would one day hope to do these in Houdini... Is there something already, though? Basically, given any spline(s), curl it (well, curl a copy)... it was more challenging to do the curl vertically instead of flat on the surface.
  45. 1 point
    Here is a lazy procedural approach. Slice by planes into rows, resample the cross-sections by approximate brick length, and turn pscales on. Copy bricks to the points and finally make use of the new boolean. Shatter mode will give you mortar; I used another method to allow later remeshing. Try to randomize the cross-sections a bit so they won't resample into almost identical curves (or the brickwork will look lame). bricker.hipnc
  46. 1 point
    Hi, I made a setup for creating wet sand, based on this simulation example. I'm sharing it to help anyone. Improvements are most welcome! Wet_grain_setup.hip
  47. 1 point
    gonna tattoo this on my neck, for every hovering AD behind me
  48. 1 point
    I'm pretty sure FLIP fluids don't work that way. You can activate a FLIP object - that is, the object that contains the entire particle stream, but the individual particles aren't handled as individual objects... it would make it impossibly heavy, and cause rather strange behaviour too I think. If you want particles to remain static, before being "activated" in the sim, you need to group just the points you want to activate at a particular frame from a SOPs point cloud, and use them as an emitter. If you want those particles to be present in the sim from the start - as in, collidable, but static, then it's a rather more difficult notion. About the best way I can think of is to give those particles a massive viscosity value, and maybe use a gravity mask to stop them falling under gravity. Basically, complicated. It may be better to try and re-think your approach. FLIP fluids behave like fluid, often even when you're trying everything possible to make them do otherwise... you can waste a LOT of time trying to invent ways to stop FLIP fluids being fluid :-)
  49. 1 point
    I think the main part of the look could be done with just a couple of opaque surfaces. You could also get animation by offsetting noises and slightly changing the geometry. Here's a quick example - take a look at the shader and the COP inside the attached scene. comet_od_v14.hipnc
  50. 1 point
    The implementation of the anisotropic method is quite simple; I made one: https://github.com/mishurov/anisotropy_matrix But since Houdini uses OpenVDB for surfacing the FLIP, the OpenVDB developers would need to add that algorithm to their library. And there are some performance issues, such as using the fastest method for computing the Singular Value Decomposition and so on.