
Leaderboard


Popular Content

Showing most liked content since 09/18/2017 in all areas

  1. 54 points
    A few tips and tricks for manipulating gas simulations. 1. Independent resolution grids, e.g. overriding the vel grid size independently of the density grid. 2. Creating additional utility fields, e.g. gradient, speed, vorticity, etc., which can be used to manipulate forces. 3. Forces via VEX, with some example snippets (see the sketch below). smokesolver_v1.hipnc P.S. Some of these techniques are not OpenCL friendly though.
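    For reference, a minimal sketch of the "forces via VEX" idea, not taken from the attached file: a Gas Field Wrangle run over the vel field with density bound; "lift" and "swirlscale" are hypothetical channels.
        vector swirl = curlnoise(@P * 2.0 + @Time);                     // cheap swirly force field
        v@vel += set(0, chf("lift"), 0) * f@density * f@TimeInc;        // buoyancy-like lift where there is smoke
        v@vel += swirl * f@density * chf("swirlscale") * f@TimeInc;     // swirl only where there is smoke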
  2. 32 points
    Filament-like structures; a combination of the Smoke Solver, VDB Advect Points and Volume Rasterize Particles. smokesolver_v3.hipnc
  3. 31 points
    Another one, focused on instancing smoke objects: manipulating points with the basic instancing attributes i@cluster, v@scale and f@sourceframe, and how to activate a smoke object and hold a volume source (a point wrangle sketch follows below). This method is ideal for triggering independent gas simulations from impact data. Additional examples, e.g. a grid clustering method for trail and non-trail setups, which I'm merging in from a separate thread. smokesolver_v2.hipnc
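    As a hypothetical point wrangle sketch (not from the attached file), tagging impact points with the instancing attributes mentioned above:
        i@cluster     = @ptnum;            // one smoke instance per impact point
        v@scale       = {1.0, 1.0, 1.0};   // per-instance container scale
        f@sourceframe = @Frame;            // frame at which this instance starts sourcing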
  4. 22 points
    Pixelkram / Moritz S. (of Entagma) and I are proud to announce MOPs: an open-source toolkit for creating motion graphics in Houdini! MOPs is both a suite of ready-to-use tools for solving typical motion graphics problems, and a framework for building your own custom operators easily. More information is available from our website: http://www.motionoperators.com Enjoy!
  5. 21 points
    There are so many nice example files on this website that I am often searching for. I wanted to use this page as a link page to other posts that I find useful, hopefully you will too. Displaced UV Mapped Tubes Particles Break Fracture Glue Bonds Render Colorized Smoke With OpenGL Rop Moon DEM Data Creates Model Python Script Make A Belly Bounce Helicopter Dust Effect Conform Design To Surface Benjamin Button Intro Sequence UV Style Mapping UV Box and Multiple Projection Styles Ping Pong Frame Expression Instance vs. Copy (Instance Is Faster) Particle Bug Swarm Over Vertical and Horizontal Geometry Rolling Cube Rounded Plexus Style Effect Pyro Smoke UpRes Smoke Trails From Debris Align Object Along Path Fading Trail From Moving Point Swiss Cheese VDB To Polygons Get Rid Of Mushroom Shape In Pyro Sim A Tornado Ball Of Yarn Particles Erode Surface Unroll Paper Burrow Under Brick Road Non Overlapping Copies Build Wall Brick-By-Brick FLIP Fluid Thin Sheets Smoke Colored Like Image Volumetric Spotlight Moving Geometry Using VEX Matt's Galaxy Diego's Vortex Cloud Loopable Flag In Wind Eetu's Lab <--Must See! Wolverine's Claws (Fracture By Impact) Houdini To Clarisse OBJ Exporter Skrinkwrap One Mesh Over Another Differential Growth Over Surface [PYTHON]Post Process OBJ Re-Write Upon Export Rolling Clouds Ramen Noodles Basic Fracture Extrude Match Primitive Number To Point Number Grains Activate In Chunks Fracture Wooden Planks Merge Two Geometry Via Modulus Fill Font With Fluid DNA Over Model Surface VDB Morph From One Shape To Another Bend Font Along Curve Ripple Obstacle Across 3D Surface Arnold Style Light Blocker Sphere Dripping Water (cool) Exploded View Via Name Attribute VEX Get Obj Matrix Parts eetu's inflate cloth Ice Grows Over Fire Flying Bird As Particles DEM Image To Modeled Terrain Pyro Temperature Ignition Extrude Like Blender's Bevel Profile Particles Flock To And Around Obstacles BVH Carnegie Mellon Mocap Tweaker (python script) Rolling FLIP Cube Crowd Agents Follow Paths Keep Particles On Deforming Surface Particle Beam Effect Bendy Mograph Text Font Flay Technique Curly Abstract Geometry Melt Based Upon Temperature Large Ship FLIP Wake (geo driven velocity pumps) Create Holes In Geo At Point Locations Cloth Blown Apart By Wind Cloth Based Paper Confetti Denim Stitching For Fonts Model A Raspberry Crumple Piece Of Paper Instanced Forest Floor Scene FLIP pushes FEM Object Animated Crack Colorize Maya nParticles inside an Alembic Path Grows Inside Shape Steam Train Smoke From Chimney Using Buoyancy Field On RBDs In FLIP Fluid Fracture Along A Path COP Based Comet Trail eetu's Raidal FLIP Pump Drip Down Sides A Simple Tornado Point Cloud Dual Colored Smoke Grenades Particles Generate Pyro Fuel Stick RBDs To Transforming Object Convert Noise To Lines Cloth Weighs Down Wire (with snap back) Create Up Vector For Twisting Curve (i.e. 
loop-d-loop) VDB Gowth Effect Space Colonization Zombie L-System Vine Growth Over Trunk FLIP Fluid Erosion Of GEO Surface Vein Growth And Space Colonization Force Only Affects Particle Inside Masked Area Water Ball External Velocity Field Changes POP particle direction Bullet-Help Small Pieces Come To A Stop Lightning Around Object Effect Lightning Lies Upon Surface Of Object Fracture Reveals Object Inside Nike Triangle Shoe Effect Smoke Upres Example Julien's 2011 Volcano Rolling Pyroclastic FLIP Fluid Shape Morph (with overshoot) Object Moves Through Snow Or Mud Scene As Python Code Ramp Scale Over Time Tiggered By Effector Lattice Deforms Volume Continuous Geometric Trail Gas Enforce Boundary Mantra 2D And 3D Velocity Pass Monte Carlo Scatter Fill A Shape Crowd Seek Goal Then Stop A Bunch Of Worms Potential Field Lines Around Postive and Negative Charges Earthquake Wall Fracture Instance Animated Geometry (multiple techniques) Flip Fluid Attracted To Geometry Shape Wrap Geo Like Wrap3 Polywire or Curve Taper Number Of Points From Second Input (VEX) Bullet Custom Deformable Metal Constraint Torn Paper Edge Deflate Cube Rotate, Orient and Alignment Examples 3D Lines From 2D Image (designy) Make Curves In VEX Avalanche Smoke Effect Instant Meshes (Auto-Retopo) Duplicate Objects With VEX Polywire Lightning VEX Rotate Instances Along Curved Geometry Dual Wind RBD Leaf Blowing Automatic UV Cubic Projection (works on most shapes) RBD Scatter Over Deforming Person Mesh FLIP Through Outer Barrier To Inner Collider (collision weights) [REDSHIFT] Ground Cover Instancing Setup [REDSHIFT] Volumetric Image Based Spotlight [REDSHIFT] VEX/VOP Noise Attribute Planet [REDSHIFT] Blood Cell Blood Vessel Blood Stream [REDSHIFT] Light Volume By Material Emission Only [REDSHIFT] Python Script Images As Planes (works for Mantra Too!) 
[REDSHIFT] MTL To Redshift Material [REDSHIFT] Access CHOPs In Volume Material [REDSHIFT] Mesh Light Inherits Color [REDSHIFT] Color Smoke [REDSHIFT] FBX Import Helper Dragon Smashes Complex Fractured House (wood, bricks, plaster) Controlling Animated Instances Road Through Height Field Based Terrain Tire Tread Creator For Wheels Make A Cloth Card/Sheet Follow A NULL Eye Veins Material Matt Explains Orientation Along A Curve Mesh Based Maelstrom Vortex Spiral Emit Multiple FEM Objects Over Time Pushing FEM With Pyro Spiral Motion For Wrangle Emit Dynamic Strands Pop Grains Slope, Peak and Flat Groups For Terrains Install Carnegie Mellon University BVH Mocap Into MocapBiped1 Ramp Based Taper Line Fast Velocity Smoke Emitter Flip Fill Cup Ice Cubes Float [PYTHON]Export Houdini Particles To Blender .bphys Cache Format Collision Deform Without Solver or Simulation Mograph Lines Around Geometry Waffle Cornetto Ice Cream Cone Ice Cream Cone Top Unroll Road Or Carpet Burning Fuse Ignites Fuel or Painted Fuel Ignition Painted Fuel Combustion Small Dent Impact Deformation Particle Impact Erosion or Denting Of A Surface Helicopter Landing Smoke And Particles Radial Fracture Pieces Explode Outwards Along Normal Tangent Based Rocket Launch Rolling Smoke Field Tear/Rip FLIP (H12 still works in H16) Rain Flows Over Surface Smoke Solver Tips & Tricks Folding Smoke Sim VEX Generated Curve For Curling Hair Copy and Align One Shape Or Object To The Primitives Of Another Object (cool setup) A Better Pop Follow Curve Setup FEM Sea Cucumber Moves Through Barrier Fracture Cloth Smoke Confinement Setup Merge multiple .OBJ directly Into A Python Node Blood In Water Smoke Dissipates When Near Collision Object Whirlpool Mesh Surface Simple Bacteria Single Point Falling Dust Stream Flames Flow Outside Windows Gas Blend Density Example Localized Pyro Drag (smoke comes to a stop) Granular Sheet Ripping Post Process An Export (Post Write ROP Event) Corridor Ice Spread or Growth Set Velocity On Pieces When Glue Bonds Break Water Drops Along Surface Condensation Bottle Grains Snow or Wet Sand Starter Scene A Nice Little Dissolver Turn An Image Into Smoke Fading Ripples Grid Example Stranger Things Wall Effect Face Through Rubber Wall [PYTHON]Create Nurbs Hull Shelf Tool [PYTHON] Ramp Parameter Select Outside Points Of Mesh, Honor Interior Holes Sparks Along Fuse With Smoke Umbrella Rig Melt FLIP UVs Tire Burn Out Smoke Sim Flip or Pyro Voxel Estimate Expression Motorcycle or Dirt Bike Kicks Up Sand Particles Push Points Out Of A Volume [PYTHON]Cellular Automata Cave Generator Punch Dent Impact Ripple Wrinkle VEX Rotate Packed Primitive Via Intrinsic Kohuei Nakama's Effect FLIP Fluid Inside Moving Container Particles Avoid Metaball Forces FLIP Divergence Setup FLIP Transfer Color Through Simulation To Surface Morph Between Two Static Shapes As Pyro Emits Constraint Based Car Suspension Pyro Smoke Gas Disturbs Velocity Wire Solver Random Size Self Colliding Cables Fast Cheap Simple Collision Deform CHOP Based Wobble For Animated Character Slow Motion FLIP Whaitewater Avoid Stepping In Fast Pyro Emission FLIP Fluid Fills Object Epic Share Of Softbody/Grain Setups (Must see!) 
Balloon, Pizza, Sail, Upres Shirt, Paint Brush Create Pop Grain Geometry On-The-Fly In A DOPs Solver Use Google To Discover Attached HIP Files Useful Websites: Tokeru Houdini Houdini Vex Houdini Python FX Thinking iHoudini Qiita Ryoji Video Tutorials: Peter Quint Rohan Dalvi Ben Watts Design Yancy Lindquist Contained Liquids Moving Fem Thing Dent By Rigid Bodies Animating Font Profiles Guillaume Fradin's Mocap Crowd Series(no longer available) Swirly Trails Over Surface http://forums.odforce.net/topic/24861-atoms-video-tutorials/ http://forums.odforce.net/topic/17105-short-and-sweet-op-centric-lessons/page-5#entry127846 Entagma SideFX Go Procedural
  6. 21 points
  7. 19 points
    Since there's been a lot of talk around the web about graphics APIs this past week with Apple's decision to deprecate OpenGL in MacOS Mojave, I thought I'd take this opportunity to discuss the various graphics APIs and address some misconceptions. I'm doing this as someone who's used all versions of OpenGL from 1.0 to 4.4, and not with my SideFX hat on. So I won't be discussing future plans for Houdini, but instead will be focusing on the APIs themselves.

OpenGL

OpenGL has a very long history dating back to the 90s. There have been many versions of it, but the most notable ones are 1.0, 2.1, 3.2, and 4.x. Because of this, it gets a reputation for being old and inefficient, which is somewhat true but not the entire story. Certainly GL1.0 - 2.1 is old and inefficient, and doesn't map well to modern GPUs. But then in the development of 3.0, a major shift occurred that nearly broke the GL ARB (architecture review board) apart. There was a major move to deprecate much of the "legacy" GL features and replace them with modern GL features - and out of that kerfuffle the OpenGL core and compatibility profiles emerged. The compatibility profile added these new features alongside the old ones, while the core profile completely removed them. The API in the core profile is what people are referring to when they talk about "Modern GL". Houdini adopted modern GL in v12.0 in the 3D Viewport, and more strict core-profile only support in v14.0 (the remaining UI and other viewers). Modern GL implies a lot of different things, but the key ones are: geometry data and shader data must be backed by VRAM buffers, shaders are required, and all fixed-function lighting, transformation, and shading is gone. This is good in a lot of ways. Geometry isn't being streamed to the GPU in tiny bits anymore and is instead kept on the GPU, the GL "big black box" state machine is greatly reduced, and there's a lot more flexibility in the display of geometry from shaders. You can light, transform, and shade the model however you'd like. For example, all the various shading modes in Houdini, primitive picking, visualizers, and markers are all drawn using the same underlying geometry - only the shader changes. OpenGL on Windows was actually deprecated decades ago. Microsoft's implementation still ships with Windows, but it's an ancient OpenGL 1.1 version that no one should use. Instead, Nvidia, AMD and Intel all install their own OpenGL implementations with their drivers (and this extends to CL as well).

Bottlenecks

As GPUs began getting faster, what game developers in particular started running into was a CPU bottleneck, particularly as the number of draw calls increased. OpenGL draw calls are fast (more so than DirectX), but eventually you get to a point where the driver code prepping the draw started to become significant. More detailed worlds meant not only bigger models and textures, but more of them. So the GPU started to become idle waiting on draws from the CPU, and that draw load began taking away from useful CPU work, like AI. The first big attempt to address this was in the form of direct state access and bindless textures. All resources in OpenGL are given an ID - an integer which you can use to identify a resource for modifying it and binding it to the pipeline. To use a texture, you bind this ID to a slot, and the shader refers to this slot through a sampler. As more textures were used and switched within a frame, mapping the ID to its data structure became a more significant load on the driver.
Bindless does away with the ID and replaces it with a raw pointer. The second was to move more work to the GPU entirely, and GLSL Compute shaders (GL4.4) were added, along with Indirect draw calls. This allows the GPU to do culling (frustum, distance based, LOD, etc) with an OpenCL-like compute shader and populate some buffers with draw data. The indirect draw calls reference this data, and no data is exchanged between GPU and CPU. Finally, developers started batching as much up as possible to reduce the number of draw calls to make up for these limitations. Driver developers kept adding more optimizations to their API implementations, sometimes on a per-application basis. But it became more obvious that for realtime display of heavy scenes, and with VR emerging where a much higher frame rate and resolution is required, current APIs (GL and DX11) were reaching their limit.

Mantle, Vulkan, and DX12

AMD recognized some of these bottlenecks and the bottleneck that the driver itself was posing to GPU rendering, and produced a new graphics API called Mantle. It did away with the notion of a "fat driver" that optimized things for the developer. Instead, it was thin and light - and passed off all the optimization work to the game developer. The theory behind this is that the developer knows exactly what they're trying to do, whereas the driver can only guess. Mantle was eventually passed to Khronos, who develops the OpenGL and CL standards, and from that starting point Vulkan emerged. (DirectX 12 is very similar in theory, so for brevity’s sake I'll lump them together here - but note that there are differences). Vulkan requires that the developer be a lot more up-front and hands on with everything. From allocating large chunks of VRAM and divvying it up among buffers and textures, saying exactly how a resource will be used at creation time, and describing the rendering pipeline in detail, Vulkan places a lot of responsibility on the developer. Error checking and validation can be entirely removed in shipping products. Even draw calls are completely reworked - no more global state and swapping textures and shaders willy-nilly. Shaders must be wrapped in an object which also contains all its resources for a given draw per framebuffer configuration (blending, AA, framebuffer depths, etc), and command buffers built ahead of time in order to dispatch state changes and draws. Setup becomes a lot more complicated, but also is more efficient to thread (though the dev is also completely responsible for synchronization of everything from object creation and deletion to worker and render threads). Vulkan also requires all shaders be precompiled to a binary format, which is better for detecting shader errors before the app gets out the door, but also makes generating them on the fly more challenging. In short, it's a handful and can be rather overwhelming. Finally, it's worth noting that Vulkan is not intended as a replacement for OpenGL; Khronos has stated that from its release. Vulkan is designed to handle applications where OpenGL falls short. A very large portion of graphics applications out there don't actually need this level of optimization. My intent here isn't to discourage people from using Vulkan, just to say that it's not always needed, and it is not a magic bullet that solves all your performance problems.

Apple and OpenGL

When OSX was released, Apple adopted OpenGL as its graphics API.
OpenGL was behind most of its core foundation libraries, and as such they maintained more control over OpenGL than Windows or Linux. Because of this, graphics developers did not install their own OpenGL implementations as they did for Windows or Linux. Apple created the OpenGL frontend, and driver developers created the back end. This was around the time of the release of Windows Vista and its huge number of driver-related graphics crashes, so in retrospect the decision makes a lot of sense, though that situation has been largely fixed in the years since. Initially Apple had support for OpenGL 2.1. This had some of the features of Modern GL, such as shaders and buffers, but it also lacked other features like uniform buffers and geometry shaders. While Windows and Linux users enjoyed OpenGL 3.x and eventually 4.0, Mac developers were stuck with a not-quite-there-yet version of OpenGL. Around 2012 they addressed this situation and released their OpenGL 3.2 implementation ...but with a bit of a twist. Nvidia and AMD's OpenGL implementations on Windows and Linux supported the Compatibility profile. When Apple released their GL3.2 implementation it was Core profile only, and that put some developers in a tricky situation - completely purge all deprecated features and adopt GL3.2, or remain with GL2.1. The problem being that some deprecated features were actually still useful in the CAD/DCC universe, such as polygons, wide lines, and stippled lines/faces. So instead of the gradual upgrading devs could do on the other platforms, it became an all-or-nothing affair, and this likely slowed adoption of the GL3.2 profile (pure conjecture on my part). This may have also contributed to the general stability issues with GL3.2 (again, pure conjecture). Performance was another issue. Perhaps because of the division of responsibility between the driver developer of the GPU maker and the OpenGL devs at Apple, or perhaps because the driver developers added specific optimizations for their products, OpenGL performance on MacOS was never quite as good as on other platforms. Whatever the reason, it became a bit of a sore point over the years, with a few game developers abandoning the platform altogether. These problems likely prompted them to look at an alternate solution - Metal. Eventually Apple added more GL features up to the core GL4.1 level, and that is where it has sat until their announcement of GL deprecation this week. This is unfortunate for a variety of reasons - versions of OpenGL above 4.1 have quite a few features which address performance for modern GPUs and portability, and it's currently the only cross platform API since Apple has not adopted Vulkan (though a third party MoltenVK library exists that layers Vulkan on Metal, it is currently a subset of Vulkan).

Enter Metal

Metal emerged around the time of Mantle, and before Khronos had begun work on Vulkan. It falls somewhere in between OpenGL and Vulkan - more suitable for current GPUs, but without the extremely low-level API. It has compute capability and most of the features that GL does, with some of the philosophy of Vulkan. Its major issues for developers are similar to those of DirectX - it's platform specific, and it has its own shading language. If you're working entirely within the Apple ecosystem, you're probably good to go - convert your GL-ES or GL app, and then continue on. If you're cross platform, you've got a bit of a dilemma.
You can continue on, business as usual, with OpenGL, fully expecting that it will remain as-is and might be removed at some point in the future, possibly waiting until a GL-on-top-of-Metal API comes along or Apple allows driver developers to install their own OpenGL like Microsoft does. You can implement a Metal interface specific to MacOS, port all your shaders to Metal SL and maintain them both indefinitely (Houdini has about 1200). Or, you can drop the platform entirely. None of those seem like very satisfactory solutions. I can't say the deprecation comes as much of a surprise, with Metal development ongoing and GL development stalling on the Mac. It seems like GL was deprecated years ago and this is just the formal announcement. One thing missing from the announcement was a timeframe for when OpenGL support would end (or if it will end). It does seem like Apple is herding everyone toward Metal, though how long that might take is anyone's guess. And there you have it, the state of graphics APIs in 2018 - from a near convergence of DX11 and GL4 a few short years ago, to a small explosion of APIs. Never a dull moment in the graphics world.
  8. 15 points
    During the last 3 weeks, I did some R&D and published my results on Vimeo. Some people asked me to share my files here, so here we are. I hope it will help!
  9. 14 points
    Congratulations, SideFX -- and a priceless still frame! Best of luck for the future and keep innovating!
  10. 14 points
    Basic smoke solver built within a SOP Solver, utilising OpenVDB nodes (a minimal sketch of the buoyancy step follows below). Happy exploring & expanding =) P.S. DOPs' smoke solver still solves quicker in many cases though. vdbsmokesolver_v1.hipnc vdbsmokesolver_v2.hipnc
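    As a minimal sketch of one piece of such a setup (an assumption, not taken from the attached files): a volume wrangle run over the vel field inside the SOP Solver, with the density volume wired into its second input, adding simple buoyancy; "buoyancy" is a hypothetical channel, and the VDB Advect nodes would still handle the actual advection.
        float rho = volumesample(1, "density", @P);                 // smoke density at this voxel
        v@vel += set(0, 1, 0) * rho * chf("buoyancy") * @TimeInc;   // push smoke upward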
  11. 14 points
  12. 13 points
    Hi, I played around with your scene a bit. This is the result. I also ended up changing your extrusion expression to use exp() instead of pow(). The gist of it was to do 2 iteration phases: one is ahead, and the other is behind, then just blend between those two. The challenging part was creating the attributes needed to specify the iteration level and the blending amount (check the blend_and_iter wrangle node for how I processed it). It does seem to chug, though, when there are already too many polygons to process. I wanted to place it inside a compiled block, but PolyExtrudes are not yet compilable. H16.5.268 NC - Subdiv_Test_v2.rar
  13. 12 points
    The latest production build of Houdini brings exciting new improvements to the licensing and render restrictions on Houdini Indie while also adding a powerful new UV Flatten tool and important refinements to FBX import and export. The Houdini Indie program has been a successful initiative designed to provide indie animators and gamedevs, who have a gross income less than $100K USD, with access to the features of Houdini FX with very few restrictions. In response to feedback from the Houdini Indie community, SideFX is increasing the render resolution limit for image sequences from 1080p to 4K x 4K in order to facilitate the generation of high definition imagery, which is especially important for artists creating VR content. SideFX will also be issuing a supplementary license to all Indie users to allow them to install Houdini on a second computer/laptop, or to work in dual boot mode. The use of these two licenses is restricted to a single artist, who can only use Houdini Indie on one of these computers at a time as per the EULA. With these added features and the recent addition of third party rendering, SideFX has set a new annual price for Houdini Indie of $269 USD for a single year or $399 USD for a two-year term. LEARN MORE...
  14. 11 points
    Hello, I started with this around the H16 release. Basically I wanted to explore to which level I'd be able to use procedural modeling when it comes to characters. So the "non-procedural" part here belongs to another app, namely Maya, where I created the base body model, rig and posing - while the Houdini part is hair of all sorts (hair, eyelashes, eyebrows...), and also a lot of the suit. A detailed map of what exactly belongs to which app is here. Let's say that the 'harness system' is what I consider the most successful part. Later I started with Mantra renders, which turned into a kind of addiction - here are a few of the roughly hundred renders of this thing I did in Mantra.
  15. 11 points
    Hello, here you can download my files from softbody week 6. You can use my files for your jobs or just for fun/learning/making tutorials/whatever. It's free, BUT keep in mind I am not a professional, so pay attention: I'm quite sure my way of working is not the best way to work in production/studio. Cheers! Yohann upres-shirt.hiplc balloon-v2.hiplc baloons-grain solver.hiplc upres-attribtransfer.hiplc upres-curvature.hiplc
  16. 10 points
    Hi all, I have been doing an R&D project lately on how to generate knitted garments in Houdini. One of my inspirations was a project done by Psyop using Fabric Engine, and the other one was done by my friend Burak Demirci. Here are the links to them: http://fabricengine.com/case-studies/psyop-part-2/ https://www.artstation.com/artist/burakdemirci Some people asked me to share my hip file and I was going to do it sooner, but things were a little busy for me. Here it is; I also put in some sticky notes to explain the process better, hope it helps. This hip file is identical to the one I created this video with, except for the rendering nodes: https://vimeo.com/163676773 . I think there are still some things that can be improved and maybe done in a better way. I would love to see people developing this system further. Cheers! Alican Görgeç knitRnD.zip
  17. 10 points
    A small summary of the Houdini-Carbon workflow for the German movie project "Die kleine Hexe". Hope you like it.
  18. 9 points
    I put together a simple script that reads the .mtl file typically associated with a .obj and creates a Redshift material for each entry it finds in the .mtl with a map_Kd token. For each entry a Redshift material is created and a texture map is linked to the diffuse color, with the filename for the map populated from the .mtl entry. This is useful when importing architectural .obj files, which typically have a large number of materials associated with them. Expected .mtl format of supported tokens:
        #newmtl _4_2_                           <- Material name.
        #Ns 96.078431                           <- Specular intensity.
        #Kd 0.640000 0.640000 0.640000          <- Diffuse color.
        #Ks 0.500000 0.500000 0.500000          <- Specular color.
        #Ni 1.000000                            <- Index of refraction.
        #d 1.000000                             <- Opacity.
        #map_Kd 21_budova/_4_2_.jpg             <- Map name.
        #map_Kn 21_budova/_4_2_n.jpg            <- Map name.
The result of one-click texture mapping: here is the Colosseum auto texture mapped. The path to the .mtl file and the texture path are hard coded. Place the code in a shelf tool button and adjust the path to point to your .mtl file. mtl_to_redshift_061618.zip
  19. 9 points
    Early stage WIP of procedural shells, planning to do a tutorial when I'm done.
  20. 9 points
    I know this is a bit late, but I missed this thread and felt like I should comment. Firstly, this topic amuses me since I think I could count on two hands (maybe some toes) the amount of times we've had to lock threads in the 20 years this site has been running. So the assertion that we're trigger happy thread lockers is a bit of a stretch. Now it may be that some of the very small number of locked threads belonged to some people in this thread, and if that's true then I would suggest taking a look at your online etiquette and perhaps adjusting a few things. You should note that we have very few rules governing each forum, where most other forums on the internet have a laundry list of do's and don'ts for posting. The reason for that is that we have an amazing community of professionals who actually like to help each other produce great work. Healthy debates are certainly welcome, but personal attacks are not. tl;dr: If the topic descends to the equivalent of "no, you!", "heck you!" etc, then it is looked at as a lockable entity. Don't want that to happen? Don't behave that way. M
  21. 9 points
    Hello, most of the .hip files from my video #7 are here. Enjoy! Note: thank you particuleskull, your hip (from November in this thread) helped me a lot with my grain R&D! elephant grain.hiplc grain for each voronoi pieces-v3.hiplc grain-tree.hiplc paint brush-v2+.hiplc sticky projection.hiplc
  22. 9 points
    I had a chance to test the new FLIP narrow band workflow and wanted to share the results with the community.
  23. 8 points
    Hi! Here is something I've been working on for a while, on and off. I was searching for something that I could use for plants that are colliding near the camera, so I ended up using the Bullet solver with packed geometry. I'm not sure if it's the right way to go for this kind of thing, but at least I learned a lot about the constraints. Thinking about it afterwards, I probably should have used a few more substeps, but I hope you can enjoy it anyway! Suggestions for improvements are welcome.
  24. 8 points
    In case it's useful to anyone, here's an asset I made a little while ago and finally got round to documenting. It takes one or more curves and generates 3 levels of braided curves - the first level coiled around the input, the second coiled around the first and the third around the second (a rough VEX sketch of a single coil level follows below). Additionally it can make a final level of 'hairs'. It supports animated inputs, and will transfer velocity onto the output, but in many cases it's probably better to use a timeshift to generate the rope on 1 frame and use a point deform or similar. This also avoids texture jumping problems if using the supplied 'pattern' colour method. There's a fairly comprehensive help card with it and below is a demo video. If anyone finds bugs or problems, please let me know and I'll try and fix them when I have time... I'd be very interested to see anything anyone makes with it... TB__RopeMaker_1_0.hda
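    For illustration only, a rough point wrangle sketch of a single coil level (this is not the HDA itself): assuming one input curve that has been finely resampled and given a tangentu point attribute (e.g. from a PolyFrame SOP), each point is offset around the original curve; "turns" and "radius" are hypothetical channels.
        float  u       = float(@ptnum) / float(@numpt - 1);     // 0..1 along the curve
        vector tangent = normalize(v@tangentu);                 // tangent from the PolyFrame SOP
        vector up      = {0, 1, 0};
        if (abs(dot(tangent, up)) > 0.99) up = {1, 0, 0};       // avoid a degenerate frame
        vector side    = normalize(cross(tangent, up));
        vector nrm     = normalize(cross(side, tangent));
        float  ang     = u * chf("turns") * 2 * PI;
        @P += (cos(ang) * side + sin(ang) * nrm) * chf("radius");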
  25. 8 points
    Here is a breakdown of how I made this bear out of duplo blocks. The video isn't super comprehensive; I mainly cover setting up the constraints in SOPs (see the sketch below), though I briefly touch on some other parts too. But the hip file is available to pick apart! Thanks! duplo_bear.zip
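    As a minimal sketch of that general SOP-built constraint idea (an assumption about the workflow, not necessarily the video's exact setup): after a Connect Adjacent Pieces SOP builds the constraint network geometry, a primitive wrangle tags the constraint primitives for a Glue Constraint Relationship; "glue_strength" is a hypothetical channel.
        s@constraint_name = "Glue";                 // must match the relationship's data name in DOPs
        s@constraint_type = "all";                  // constrain both position and rotation
        f@strength        = chf("glue_strength");   // per-constraint breaking threshold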
  26. 8 points
    What I do while waiting for renders at work on a Saturday... retro_cyan.hip
  27. 8 points
    Christian Arnesen did an awesome piece of animation with MASH, and I wanted to replicate it in Houdini: Here is the breakdown of 3 techniques for doing so, using Alembic caches, Copy Stamping, and VEX (file is attached below or at the link): Hope it's helpful! arnesen_cubes_in_houdini.zip
  28. 8 points
    I'd looked at this problem a few weeks ago and got 90% of the way there looking at some other odforce posts; this inspired me to finish it. The idea is to first calculate the angle between each pair of curve segments, then apply it in a for-each loop so that each segment is rotated by the total of all the previous segment angles, at the correct pivot point (a single-wrangle sketch of the same idea follows below). curve_unroll.hip
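    For reference, one way to sketch the same idea in a single detail wrangle instead of a for-each loop (assumptions: input 0 is one polyline, and "amount" is a hypothetical 0-1 unroll channel); the attached file does it with the loop described above.
        float amount = chf("amount");
        int   pts[]  = primpoints(0, 0);
        int   n      = len(pts);
        // work on a local copy of the positions, since setpointattrib() writes
        // are not visible to later point() reads within the same wrangle
        vector pos[];
        foreach (int pt; pts)
            append(pos, point(0, "P", pt));
        for (int i = 1; i < n - 1; i++)
        {
            vector din  = normalize(pos[i] - pos[i-1]);      // incoming segment direction
            vector dout = normalize(pos[i+1] - pos[i]);      // outgoing segment direction
            vector axis = cross(dout, din);
            if (length(axis) < 1e-6)
                continue;                                    // already straight at this point
            float ang = acos(clamp(dot(din, dout), -1.0, 1.0));
            matrix3 rot = ident();
            rotate(rot, ang * amount, normalize(axis));
            // rotate every downstream point about this pivot
            for (int j = i + 1; j < n; j++)
                pos[j] = pos[i] + (pos[j] - pos[i]) * rot;
        }
        for (int i = 0; i < n; i++)
            setpointattrib(0, "P", pts[i], pos[i]);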
  29. 8 points
    Uh oh, I'm back. While putzing around, trying to come up with good ideas for my next blog post (covering discrete operators seems a bit too mathy, so I'm open to suggestions. Maybe building a cloth solver in VEX?), I spent some time revisiting an old idea I've been attempting to solve for the better part of a year, rolling up a curve. That's probably not the best way to describe the effect, so I'll let my below GIF do the talking. This solution wouldn't have been anywhere close to possible without Stephen Bester (https://vimeo.com/user13305957) for a solid VEX foundation of chained rotations, or Julian Johnson's transport frame file (http://julianjohnsonsblog.blogspot.com/2017/). Above you'll see the final effect in action. Still not sure how to describe it other than rolling up a curve... I've attached the final OTL to this post along with the above example HIP! It works over multiple curves as well, so don't be afraid to try crazy stuff with it, it's pretty gosh darn fun! If you try it on a vertical line, you might need to change the up direction, or turn off "set up direction." Maybe it's time to make a Github too... <3 <3 <3 JR_CurveRoll.zip
  30. 7 points
    Hey guys, I wanted to share with you a small script I've made to give better support for external editors when editing your expressions and code. It works with a file watcher system, which means it doesn't freeze Houdini and you don't need to close your editor to see your updates. It's available by a simple right-click on a parameter => Expression => External Expression Editor (you can set a hotkey). It works with VEX and Python expressions but also regular HScript expressions (but with no completion). It works very well with Visual Studio Code as it has nice VEX and Python plugins, but it can also work with Sublime Text, Notepad++, etc. (I haven't tested all the IDEs out there, so if you find a bug with one of them, feel free to send an email to contact@cgtoolbox.com). A small video about it: https://vimeo.com/242470411 You can download it for free here: http://cgtoolbox.com/houdini-expression-editor/ PySide, PySide2 and PyQt are supported, and it is compatible with Houdini 15 to 16.5.
  31. 7 points
    Houdini implementation
  32. 7 points
    Hi guys, I would like to share with all of you my latest personal project, which I did in my spare time. Hope you like it.
  33. 7 points
    Hey! This is my first post on here. I've been working on this apple decomposition shot. This is where it's at now, but I'm going to be taking another stab at getting it to a higher level of realism. Any feedback on how to approach it better/critiques on what's standing out are appreciated! Thanks!
  34. 7 points
    After a year, I have another project for which I can share the source files (link to the previous one). It is nothing revolutionary or special, and it was partially done in After Effects, but I hope that it might be helpful and inspirational for somebody. Video from the 270° presentation: https://vimeo.com/246338905 Source .hiplc files: http://bit.ly/2BO4Rj3 Everything I can do in Houdini is based on the help and knowledge shared by the community. I can do it thanks to you guys! Thank you very much. Thanks also go to the great guys from http://www.xlab.cz, I did it for them! Preview:
  35. 7 points
    I made a little Python toolset to speed up my VEX writing. It expands "code shortcuts" into "code snippets" and also updates the parm interface (default values, channel ranges from comments, ramps generated from a custom library). You can build your own libraries of snippets with specific channel ranges and defaults. I am really a Python newbie, so it is "alpha": the code does not handle exceptions and is not exemplary, but I hope it will help somebody. It already helps me, so I will debug it on the fly. The initial idea comes from here, Matt Estela: http://www.tokeru.com/cgwiki/index.php?title=HoudiniUserInterfaceTips#customise_right_click_menus qqReplace.py rampCollect.py updateInterfaceFromComments.py
  36. 7 points
    It is now possible to get surface distance from the soft selection with the new function in H16.5! geodesic_distance_softsel.hiplc
  37. 7 points
    primuv is already basically a 2D representation of the primitive's footprint. A non-aligned bounding box of each primitive can also be found in the primitive intrinsic called "bounds". If you want to copy and squeeze models onto primitives, I would create the copies first, create a "copynum" attribute and distribute the models with something like this:
        vector bbox = relbbox(0, @P);
        int prim = point(0, "copynum", @ptnum);
        float height = sqrt( primintrinsic(1, "measuredarea", prim) );
        @P = primuv(1, "P", prim, set(bbox.x, bbox.z, 0) );
        @P += prim_normal(1, prim, {0.5, 0.5, 0}) * bbox.y * height;
The bounding box from line 1 is stretched across the primitives from line 2 with primuv in line 4. The approximate height of the copies is derived from the primitive's area in line 3, and the copy is raised along the primitive's normal in line 5. copy_on_prims.hipnc
  38. 7 points
  39. 7 points
    Subjective question; at least this is how I started: 1. Shelf tools - learn the basics and concepts. 2. Dive into the DOPs OTLs/HDAs and look around to see how they are implemented. My examples are just a product of what those Side Effects guys implemented. 3. Read Houdini's documentation. 4. Be inspired by all the tests posted on Vimeo/YouTube that don't look like off-the-shelf imagery.
  40. 6 points
    Now I'm guessing you can post this. All the presentations from SIGGRAPH are available! https://vimeo.com/goprocedural - and they're talking about H17.
  41. 6 points
  42. 6 points
    I'm working on something related to art directing the swirly motion of gases. It's an implementation of a custom buoyancy model that lets you very easily art direct the general swirly motion of gases without using masks, vorticles, temperature sourcing to get more swirly motion in specific zones, etc. It also gets rid of the "mushroom effect" for free with a basic turbulence setup. Here are some example previews, some with normal motion, others with extreme parameter values to stress the pipeline. For the details it's just simple turbulence + a bit of disturbance in the vel field, nothing complex; because of this the sims are very fast (for constant sources: average voxel count 1.8 billion, voxel size 0.015, sim time 1h:40min (160 frames); for burst sources: voxel size 0.015, sim time 0h:28min). I'm working on a Vimeo video to explain this new buoyancy model in more detail. I hope you like it! Cheers, Alejandro constantSource_v004.mp4 constantSource_v002.mp4 burstSource_v004.mp4 constantSource_v001.mp4 burstSource_v002.mp4 burstSource_v003.mp4 burstSource_v001.mp4 constantSource_v003.mp4
  43. 6 points
    Here is another slight variation. Instead of generating a fixed-length line segment that moves through time, generate the full path over the entire time range. Then add a primitive attribute @path_pos to each line primitive and drive the 'offset along path' value of the path deformer with this attribute. Then you can have some geometry leading others as they each flow along their own path.
        float frame_end = 200.0;
        // deform_path node expects input in the range of 0-1.
        f@path_pos = fit(@Frame, 0, frame_end, 0, 1);
        // Now offset each path based upon its index.
        float delta = 0.05; // per-line delay time can be set here.
        f@path_pos -= (delta * @prim);
ap_ps_Cardume_Odforce_v3.04.hiplc
  44. 6 points
    "The Tree" Another R&D image from the above VR project: The idea for the VR-experience was triggered by a TV-show on how trees communicate with each other in a forest through their roots, through the air and with the help of fungi in the soil, how they actually "feed" their young and sometimes their elderly brethren, how they warn each other of bugs and other adversaries (for instance acacia trees warn each other of giraffes and then produce stuff giraffes don't like in their leaves...) and how they are actually able to do things like produce substances that attract animals that feed on the bugs that irritate them. They even seem to "scream" when they are thirsty... (I strongly recommend this (german) book: https://www.amazon.de/Das-geheime-Leben-Bäume-kommunizieren/dp/3453280679/ref=sr_1_1?ie=UTF8&qid=1529064057&sr=8-1&keywords=wie+bäume+kommunizieren ) It's really unbelievable how little we know about these beings. So we were looking to create a forest in an abstract style (pseudo-real game-engine stuff somehow doesn't really cut it IMO) that was reminiscent of something like a three dimensional painting through which you could walk. In the centre of the room, there was a real tree trunk that you were able to touch. This trunk was also scanned in and formed the basis of the central tree in the VR forest. Originally the idea was, that you would touch the tree (hands were tracked with a Leap Motion controller) and this would "load up" the touched area and the tree would start to become transparent and alive and you would be able to look inside and see the veins that transport all that information and distribute the minerals, sugar and water the plant needs. From there the energy and information would flow out to the other trees in the forest, "activate" them too and show how the "Wood Wide Web" connected everything. Also, your hands touching the tree would get loaded up as well and you would be able to send that energy through the air (like the pheromones the trees use) and "activate" the trees it touched. For this, I created trees and roots etc. in a style like the above picture where all the "strokes" were lines. This worked really great as an NPR style since the strokes were there in space and not just painted on top of some 3D geometry. Since Unity does not really import lines, Sascha from Invisible Room created a Json exporter for Houdini and a Json Importer for unity to get the lines and their attributes across. In Unity, he then created the polyline geometry on the fly by extrusion, using the Houdini generated attributes for colour, thickness etc. To keep the point count down, I developed an optimiser in Houdini that would reduce the geometry as much as possible, remove very short lines etc. In Unity, one important thing was, to find a way to antialias the lines which initially flickered like crazy - Sascha did a great job there and the image became really calm and stable. I also created plants, hands, rocks etc. in a fitting style. The team at Invisible Room took over from there and did the Unity part. The final result was shown with a Vive Pro with attached Leap Motion Controller fed by a backpack-computer. I was rather adverse to VR before this project, but I now think that it actually is possible to create very calm, beautiful and intimate experiences with it that have the power to really touch people on a personal level. Interesting times :-) Cheers, Tom
  45. 6 points
    I would use the wind tunnel option on the Pyro Object DOP. Enable the wind tunnel option and set the vector to the direction and magnitude of the wind tunnel. Make sure that it is a smoke sim. Disable anything to do with temperature. Set buoyancy to 0. Disable any temperature field in the source volume and the Fluid Source VOP. Get rid of gravity. Make the container size just big enough to run the sim - very narrow. On the solver, disable all shaping, including diffusion, except for confinement. Use lots of confinement to add nice swirling detail. See the attached hip file for one example setup. Comparing my sim to the reference footage, a lot of finessing - anything from a localized sink in the shoe, to velocity sculpting, to invisible colliders around the shoe - is used to get the streamers to do what they are doing, which is to be expected: the streamers would be art directed far beyond a simple physical simulation. wind_tunnel_shoe.hip
  46. 6 points
    Feel free to check this tutorial / collection of VEX snippets I prepared. I tried to cover many different areas of VEX scripting, mostly stuff I found useful.
  47. 6 points
    Here's one more way:
        int primnum;
        vector primuv;
        xyzdist(0, @P, primnum, primuv);
        primuv.x += @Time * chf('speed');
        primuv.x %= 1.0;
        @P = primuv(0, 'P', primnum, primuv);
moveit_v05.hip
  48. 6 points
    Inspired by a couple of papers and some Houdini tutorials, I started playing with fractals and raymarching. Here are some of the results so far. EDIT: see the following post for the project file.
  49. 6 points
    Thanks a bunch for the detailed setup! I'm genuinely curious - where does one start if they want to learn these things?
  50. 6 points
    I was playing today with an L-system tree and grains funnily enough, then read your post. Had a go at changing my setup to a simple grow anim (more just a scale up and bend really), kinda works. Give it a go with your setup: tree_grain_grow.hip