
All Activity


  1. Today
  2. CamRig Pro: a dynamic camera system for Houdini, designed for fast setups, flexible rig modes, and multi-renderer DOF management. Get it here: https://shirmanor.gumroad.com/l/camrigpro
     Promo_2.mp4
     Features:
     • One-click creation of a complete camera hierarchy
     • 13 lens presets
     • 3 render engines: Redshift, Mantra & Karma
     • Supports both OBJ- and SOP-based look-at targets
     • Preserves camera orientation and position when switching targets
     • Three rig modes:
       Rig: dolly, truck, pedestal, pan, tilt, roll
       Manual: direct XYZ transforms with the CamRig Pro null
       Orient to Path: align and animate the camera along curves
     • All rig modes support look-at targets and focus nulls
     • Rig mode has an added Parent to Target mode for camerawork based around a moving target
     • DOF controls for Redshift, Karma, and Mantra (focus distance; aperture for Karma/Mantra only)
     • Generates renderer-specific ROPs when selected (Redshift, Karma, or Mantra)
     • Import to Solaris
     • Auto-assigns the camera to the scene viewport
     • Supports multiple rigs with auto-numbering and non-destructive setup
     • Utility buttons: Zero All and Clear Target/Focus to reset transforms or clear objects, plus a Global Reset button
     • Fully written in native Python, with no HDAs or external dependencies, making it easy to share
     • Shelf/Tab menu integration for one-click access inside Houdini
     • Includes 2 optional custom node shapes: a CamRig Pro node and a minimal connector
     Requirements:
     • Houdini 20.5+
     • Python 3.11
     Included:
     • CamRig Pro Python tool
     • Instructions for use inside Houdini
     • Compatible with Redshift, Karma, and Mantra
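     For a rough sense of what a script-based rig tool like this sets up, here is a minimal hou sketch of a camera plus look-at target hierarchy; it is not the actual CamRig Pro code, and the node names are illustrative only.

```python
# Minimal sketch only, not CamRig Pro itself: a bare camera + target hierarchy
# built with Houdini's Python API. Node names are illustrative.
import hou

obj = hou.node("/obj")

rig_root = obj.createNode("null", "camrig_root")    # rig transform
target = obj.createNode("null", "camrig_target")    # look-at / focus target
cam = obj.createNode("cam", "camrig_cam")

cam.setFirstInput(rig_root)            # parent the camera under the rig null
cam.parm("tz").set(10)                 # pull the camera back from the origin

# Standard object-level "Look At" parameter; keeps the camera aimed at the target.
cam.parm("lookatpath").set(target.path())

obj.layoutChildren()
```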
  3. Yesterday
  4. Hi, on the Shift CHOP use the expression floor($C/24)*0.25, where 24 is the box's 8 points multiplied by 3 channels, and 0.25 is the speed multiplier.
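     As a quick check of the arithmetic, here is a small Python sketch of what that expression evaluates to per channel index (the three-box loop is just an example):

```python
# floor($C/24)*0.25 per channel index $C, where 24 = 8 points * 3 channels per box
# and 0.25 is the per-box offset step. The three boxes below are just an example.
import math

channels_per_box = 8 * 3   # 24
step = 0.25

for c in (0, 24, 48):      # first channel of each box
    print(f"channel {c}: shift = {math.floor(c / channels_per_box) * step}")
# channel 0: shift = 0.0, channel 24: shift = 0.25, channel 48: shift = 0.5
```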
  5. Thanks for sharing the file, that makes things a lot clearer. A couple of things that have worked for me: First, try to keep the hierarchy consistent between your sim output and the final asset. If the GeoClip node is changing paths, it’s often easier to explicitly set the primitive path so everything lines up properly. For caching, you don’t always need both .bgeo and USD caches. In many cases you can feed the .bgeo straight into Solaris with a Geometry Clip Sequence, then export USD at the end. And if things start getting really heavy (fur, feathers, etc.), proxies or payloads help keep the lighting stage responsive. Hope that helps smooth out the workflow a bit!
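     On the payload point, here is a minimal USD Python sketch; the file names and prim paths are placeholders, not taken from the shared scene.

```python
# Minimal sketch: reference a heavy cache (e.g. fur) as a payload so the lighting
# stage stays responsive until the payload is loaded. Paths are placeholders.
from pxr import Usd

stage = Usd.Stage.CreateNew("asset_payload.usda")
fur = stage.DefinePrim("/asset/fur", "Xform")
fur.GetPayloads().AddPayload("fur_cache.usd")   # heavy geometry lives here
stage.GetRootLayer().Save()
```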
  6. Last week
  7. Hi everyone, I’m trying to offset some animated objects (just a few boxes) using CHOPs. I tried the Shift node, but as you can see in the image below, it doesn’t behave the way I expected. Any ideas what I might be missing? CHOP_03.hip
  8. Hi everyone, I'm trying to achieve a few effects in Houdini and would love some guidance or tutorial recommendations:
     • Animated maze lighting: lights turning on/off in a maze-like pattern.
     • Light trails with changing colours: trails that follow moving objects and shift hue over time.
     • Neon-style moving lights: glowing neon lights that animate along a path.
     What's the best approach to set these up? Should I use SOPs, CHOPs, or shaders for the color animation? Any tips, example files, or tutorials would be greatly appreciated! Thanks in advance!
  9. Hello, I'm working with a large scene from another program. Because it's a space scene, object sizes sometimes differ by a factor of 5000. If I try to scale the scene down, some objects become too small. If I work at the original scene scale, I get big problems in the viewport and in the render: the picture is distorted, particles disappear, and more. If I simply move a box along any coordinate by 1e7, part of the box disappears and the viewport starts shaking. I have attached screenshots. Perhaps these are limitations of the program itself, and you simply cannot move objects that far.
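     For reference, a quick Python sketch of the underlying limitation: at a magnitude of 1e7, single-precision floats (which viewports and many geometry attributes use) can only resolve steps of roughly one unit, which is why geometry jitters or falls apart out there.

```python
# Smallest representable float32 increment at different magnitudes.
# At ~1e7 units the resolution is about 1 unit, so sub-unit detail and
# viewport transforms start to visibly jitter or collapse.
import numpy as np

for magnitude in (1e3, 1e5, 1e7):
    step = np.spacing(np.float32(magnitude))
    print(f"at {magnitude:.0e} units, float32 resolution ~ {step} units")
```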
  10. Better to render it through MPlay, or send a batch render if you want the full quality.
  11. SOLVED: it was the specular anisotropy setting in my material. I wonder if that's the problem a lot of people have; I've encountered many posts asking about boolean artifacts, and I imagine they usually come from hard-surface modelling where people are assigning metallic materials.
  12. Hello! I've been working on a project and I have a submarine with a lot of booleans etc., and some of the topology isn't perfect. It renders perfectly in Mantra, but in Karma the exact same geo looks like garbage; you can see a lot of issues in the mesh. What is the difference in the way Mantra and Karma handle the geometry? I want to render the project in Karma, but not if I can't get to the bottom of this issue. I've attached screenshots. ^^ Mantra render: perfect smoothing. Karma render: hard edges on the front of the sub, at the boolean torpedo tubes, and on the top front of the tower.
  13. Hi there. I guess the point is that the expressions only work once the solver is activated; in your case that's frame 5. Have a look: at frame 5 there is no error. You could enter the node parameters manually, without expressions, to get rid of the errors, but I don't know if it's worth it. Regards.
  14. Here's what I could manage using a Rig Attribute Wrangle. I tried to get the smallest angle between the vectors but failed miserably.
  15. Hi; I have a bunch of points that each have a vector attribute, as in the screenshot below. I then turned these points into joints and want to align each generated joint's Y axis to its respective vector attribute. I used a Rig Attribute VOP that runs on points and a For-Each Transform VOP to loop over each joint. The problem is that although I could bind the per-point vector attribute into the loop, I just couldn't get the For-Each Transform to work on each point's respective vector attribute; it only uses the first point's vector attribute. This is so frustrating. How on earth can such a simple task be done in a Rig Attribute VOP? Thanks in advance; AJ ALIGN_JOINTS_TO_VEC.hiplc
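     For reference, a minimal numpy sketch of the underlying math (not a Rig Attribute VOP setup): building the rotation that maps a joint's local Y axis onto a per-point target vector via the axis-angle (Rodrigues) form.

```python
# Build a rotation that takes the local Y axis onto a target vector,
# using the axis-angle form from the cross and dot products.
import numpy as np

def align_y_to(target):
    """Return a 3x3 rotation matrix mapping (0,1,0) onto the unit target vector."""
    y = np.array([0.0, 1.0, 0.0])
    t = target / np.linalg.norm(target)
    axis = np.cross(y, t)
    s = np.linalg.norm(axis)                  # sin(angle)
    c = np.dot(y, t)                          # cos(angle)
    if s < 1e-8:                              # parallel or anti-parallel
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    k = axis / s                              # unit rotation axis
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + s * K + (1 - c) * (K @ K)   # Rodrigues' formula

# Example: a joint whose Y axis should point along (1, 1, 0)
R = align_y_to(np.array([1.0, 1.0, 0.0]))
print(R @ np.array([0.0, 1.0, 0.0]))   # ~ (0.707, 0.707, 0)
```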
  16. First of all, you can't expect the OpenGL viewport look to match the final render look, especially once lighting and materials are involved. You should tweak the material and lighting settings according to the render result; you can tell the difference from the reflections. The rendered one is more physically correct. That is the beauty of path-traced rendering.
  17. Earlier
  18. Hi there, that's correct: in Houdini 20.5, ACES is already built in. You can find the settings under Edit → OCIO settings. There's also an explanation there about how different files are interpreted. For example, if the filename contains ACEScg, it will be interpreted as ACES, but I'm not 100% sure about this; I thought it was reading that from the metadata. That's probably why, since you exported an EXR, it's being interpreted as Linear sRGB. Whenever I want to make sure a plate is in ACES, I export it from Resolve and choose ACEScg as the Output Transform. That way I know it's ACES in all applications. I also include it in the filename and make sure it's saved in the metadata. This works for me, but if someone has a better or more technically accurate explanation, I'd love to hear it. Cheers, Kamil
  19. In this case it's easier to make the god rays in Nuke.
  20. Alembic supports UVs; if the animation has the same topology as the static geo, just export the Alembic with UVs.
  21. Try USD; FBX is super old and weird.
  22. The simplest way I've been able to do this is by using UV Flatten to split the edge groups into separate islands (I write it to uv2 or something to avoid overwriting the existing UVs). Then subdivide. THEN use Group From Attribute Boundary, which looks at UVs by default; change it to your custom uv2 channel.
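     A rough hou sketch of that chain, for orientation only: the SOP type names and parameter names used below ("uvflatten", "subdivide", "groupfromattribboundary", "uvattrib", "attrib") are assumptions, so check them against your Houdini build.

```python
# Sketch of the UV Flatten -> Subdivide -> Group From Attribute Boundary chain.
# All node type names and parameter names here are assumptions; verify them
# in your Houdini build (e.g. via the Tab menu / node info) before use.
import hou

geo = hou.node("/obj/geo1")                 # hypothetical geo container
src = geo.node("input_mesh")                # hypothetical input SOP

flatten = geo.createNode("uvflatten", "uv2_islands")
flatten.setFirstInput(src)
flatten.parm("uvattrib").set("uv2")         # write to a second UV set (assumed parm name)

subd = geo.createNode("subdivide", "subdivide1")
subd.setFirstInput(flatten)

grp = geo.createNode("groupfromattribboundary", "edge_islands")
grp.setFirstInput(subd)
grp.parm("attrib").set("uv2")               # boundary attribute: custom uv2 (assumed parm name)

geo.layoutChildren()
```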
  23. I have another question... The clothing I modelled in a tailoring package exports the model a certain way: if I use an FBX Character Import node, I don't get to see the full path, and if I use a File node, the path is exposed on the points but there's no geometry. One screenshot is of the FBX Character Import, and the other is from Blender, where I can see the actual names I used in the app where I created the clothing item. Is there any way to expose the full path with the FBX Character Import node?
  24. Hello! I have clothing I created in an external app. I exported the model as a static FBX and as an Alembic animation (I have a reason for this). When I use Attribute Copy to transfer the UVs from the static object to the animated object (after locking the first frame), it copies the UVs, but creates a UV border for each primitive where Attribute Transfer does not. I have to use a second Attribute Copy or Transfer to then put it on the animated mesh. I only plan on rendering the animation in Houdini, and maybe one other application, but the FBX is used for a different purpose. I want to make sure the UVs are the same across the board. Both have the exact same topology, point count, prim count, and UVs. What's the most effective way to transfer UVs between a static FBX and an animated Alembic? Do I just use Attribute Transfer in this case, to keep the UV borders?
  25. As I mentioned, the denoiser makes the god rays go wiggly when they're smaller/closer to the source, even when rendering at 2x resolution.
  26. How can I go about making the fog box less grainy? I've cranked the samples very high (in both XPU and CPU) and made sure the lights weren't in the volumes (I read somewhere that makes a difference), but the improvement has been negligible. I'm using a light filter gobo. I tried the denoiser and it helps far away from the source, but the god rays go all warbly nearer the source. What magic box am I not ticking, or magic number not tweaking?
  27. Just measure the length/perimeter with a Measure SOP and remove by that; I guess it's the easiest way.