About Anti-Distinctlyminty

Personal Information

  • Location
    United Kingdom
  1. Faceted Normals After Displacement

    Hi Adrian, thanks for the input. I dug around inside the other nodes as you suggested. The get_space subnetwork seems to just fetch a rotation matrix for converting whatever is input into object space (or whichever space is chosen), so it doesn't suggest I was doing anything wrong. In an effort to understand the displacement better, I'm trying to apply layers of displacement: for example, rotate first, then displace along the normals. The Shading Normal VOP doesn't have much in the way of documentation (e.g. what space are its inputs expected to be in?), but it does state that it picks up the global variables of the same name if nothing is connected. This is the closest I have got, but as you can see I'm still getting artefacts. Furthermore, if the object is rendered as a subdivision surface, the displacement disappears entirely. I feel there's something essential to all this that I'm missing. Displacement_Normals.hipnc
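To make the "rotate first, then displace along normals" layering concrete, here is a minimal Python sketch (not VEX; all names are illustrative) of the order-of-operations point: the normal has to be transformed by the same rotation as the position before it is used for the second displacement layer, otherwise the push happens along a stale direction.

```python
import math

def rotate_y(v, angle):
    """Rotate a 3-vector about the Y axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    x, y, z = v
    return (c * x + s * z, y, -s * x + c * z)

def displace(P, N, angle, amount):
    """Layered displacement: rotate first, then push along the *rotated* normal.

    Using the original N here, instead of rotating it too, is the kind of
    mismatch that leaves a displaced surface shaded with stale normals.
    """
    P_rot = rotate_y(P, angle)
    N_rot = rotate_y(N, angle)  # the normal must follow the rotation
    return tuple(p + amount * n for p, n in zip(P_rot, N_rot))

# A point on the +X axis with an outward normal, rotated a quarter turn,
# then pushed 0.5 units along its (rotated) normal:
P_new = displace((1.0, 0.0, 0.0), (1.0, 0.0, 0.0), math.pi / 2, 0.5)
```

In a real displacement shader the equivalent step is recomputing (or re-deriving) N after the rotation, rather than reusing the global N that described the undeformed surface.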
  2. Faceted Normals After Displacement

    In my ongoing effort to actually understand displacement in mantra, though, I'm still confused as to why this setup doesn't work. Any ideas there? Displacement_Normals.hipnc
  3. Faceted Normals After Displacement

    Ok, I thought it might be something to do with the shadow calculation rather than the normal calculation on the mesh, and it seems that is correct! I changed my Raytracing Bias to 0.17 and the shadows are now smoothly interpolated. Edit: you beat me to it, thank you Brian!
  4. Faceted Normals After Displacement

    That's much better! There are still some artefacts on the faces that are going into shadow, but it's certainly an improvement. Edit: it seems to come from the torus rows and columns being increased :/ This still raises the question of what was actually wrong before, though. As far as I can tell, wasn't what I was doing before essentially the same thing? The results of both displacement setups are the same.
  5. Faceted Normals After Displacement

    You're right that the exact same code does work at SOP level, but it still gives facet artefacts at render time. I'm really curious to see what the solution is... surely mantra can do this? Otherwise every displaced object has to be subdivided.
  6. Faceted Normals After Displacement

    Hi all, I've set up a displacement shader to rotate an object. As you can see, when the displacement is applied, the normals are not smoothly interpolated. Is there any way to fix this without subdividing the entire object? Displacement_Normals.hipnc
  7. Anyone know how to use 'Houdini Engine Procedural: Point Generate'?

    Ok... some minor success. I have used displacement in a shader. This lets me load packed geometry and displace it as I please at render time, so my IFDs will be very small. However, I'm unsure whether the displacement moves just the point position of each packed primitive (the microvilli in the example file I have attached) or displaces all of the geometry. I also can't think of a way to get the instances to align themselves correctly to the surface after displacement (see attached diagram). Is this possible? Instancing_Displacement.7z
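The "align instances to the displaced surface" part boils down to building an orientation frame from the surface normal at each instance point. Here is a brute-force Python sketch of that idea (illustrative only; these are not Houdini API calls) — given the displaced normal, construct an orthonormal tangent/bitangent/normal frame that an instance could be oriented by:

```python
import math

def frame_from_normal(N):
    """Build an orthonormal (tangent, bitangent, normal) frame from a unit
    normal N, so an instance's axes can follow the displaced surface.
    """
    nx, ny, nz = N
    # Pick a helper axis that is not (nearly) parallel to N.
    up = (0.0, 1.0, 0.0) if abs(ny) < 0.999 else (1.0, 0.0, 0.0)
    # tangent = normalize(cross(up, N))
    tx = up[1] * nz - up[2] * ny
    ty = up[2] * nx - up[0] * nz
    tz = up[0] * ny - up[1] * nx
    l = math.sqrt(tx * tx + ty * ty + tz * tz)
    T = (tx / l, ty / l, tz / l)
    # bitangent = cross(N, tangent)
    B = (ny * T[2] - nz * T[1],
         nz * T[0] - nx * T[2],
         nx * T[1] - ny * T[0])
    return T, B, N
```

The catch in the question above still stands: this only helps if the displaced normal is actually available where the instancing happens, which is exactly what a shader-side displacement hides from the instancer.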
  8. Anyone know how to use 'Houdini Engine Procedural: Point Generate'?

    Hi Symek, I've attached an example that shows the kind of situation I'm faced with all the time. We have a deforming base upon which I must instance lots of stuff, and sometimes these shots can be very long and quite large too (a current shot has 1.7 million points in the point cloud to instance upon). In our previous application (LightWave) we'd have a simple base mesh that was displaced, with the instances scattered over it. The setup was simple and the file sizes were small. But because all of this has to be cached out in Houdini, I'm faced with gigantic file sizes that take an age to bake out. We're not a huge studio, so our licenses are limited. I was hoping to avoid the HE procedural SHOP due to the licensing issue (it would cut our render power on these scenes to about 20% due to licensing limitations). It would be good if I could have a similar setup to LightWave: just displace the geometry points at render time. The ability to run CVEX on geometry would be enough, I think (assuming that a point instance procedural sees the points after the CVEX is run). Instancing.7z
  9. Anyone know how to use 'Houdini Engine Procedural: Point Generate'?

    Thanks symek, that worked in the end. Unfortunately, my ultimate goal did not work out: I was trying to use these generated points as a source for instancing via a Point Instance Procedural SHOP, but it just seems to take the original geometry, not the points generated by the HDA. This is a real bummer, as it would have solved a lot of issues for me. If you know of a way to take a deforming mesh and cover it in instances at render time, do let me know.
  10. Anyone know how to use 'Houdini Engine Procedural: Point Generate'?

    Thanks symek, that gets me one step closer. Is there any way to feed the called HDA a mesh? E.g. if I want to scatter points over a SOP in my scene. I tried using an object merge in the asset, exposing it as an asset parameter, and adding a string attribute containing the path to the input points, but none of it worked.
  11. Anyone know how to use 'Houdini Engine Procedural: Point Generate'?

    Hi all, I'm trying to get the SHOP procedural called 'Houdini Engine Procedural: Point Generate' to work, but so far I've had no luck. There's no example, and the documentation doesn't actually say how to use it. The obvious docs are here: http://www.sidefx.com/docs/houdini/nodes/vop/enginepointgen but that's all I can find. I've attached the closest I have got. If you go to the SHOP > vm_geo_enginepointgen1, you can see I've told the procedural to call the Scatter SOP, which it seems to try to do, but it complains that there are not enough sources specified. Which is fair enough, but I see no way of telling the called Scatter to use any geometry as input. So, has anyone got this working? Perhaps by calling their own asset? Point_Procedural.hipnc
  12. Move points at render time?

    We can have very long shots (my current one is 3200 frames) with heavy geometry, so you can see my issue, I'm sure.
  13. Move points at render time?

    Unfortunately no, neither of those will work. What I'm after is some sort of underwater wavy motion. I can do this just fine, but it means I have to write geometry out for every frame, which is very wasteful. Imagine you are rendering something like underwater kelp gently swaying in the current: a simple moving displacement would work fine, but I cannot see how that can be done at render time.
  14. Move points at render time?

    Hi all, I wasn't sure whether to put this into shading or effects, so it just ended up here. The title says it all: is there a way to move points at render time? Just apply a noise or something like that, essentially a render-time VOP network. The reason is that I have a heavy piece of geometry that just needs to sit in the background and wobble slightly, that's all. It seems very wrong to write out geometry for every frame.
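The kind of motion being asked for here is cheap to express as a function of position and time, which is why it feels wasteful to cache it per frame. A minimal Python sketch of the idea (illustrative only; in Mantra this logic would live in a CVEX or displacement shader, not Python):

```python
import math

def wobble(points, time, amplitude=0.1, frequency=2.0, speed=1.0):
    """Apply a simple travelling sine wave along Y -- a stand-in for the
    'underwater sway' displacement described above. Because the offset is
    a pure function of (position, time), nothing needs caching per frame.
    """
    out = []
    for x, y, z in points:
        offset = amplitude * math.sin(frequency * x + speed * time)
        out.append((x, y + offset, z))
    return out
```

The same evaluation could in principle run at render time on each frame, with only the static rest geometry on disk.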
  15. Blurring Normals in Shaders

    Hi all, is there a way to blur surface normals in a shader? I'm currently trying to find a way to make a whole bunch of primitive spheres look like they are blending together, and I can't find much in the way of resources on this. My thinking was to do something like a pcfilter in the shader and modify the surface normals, but I can't see how to use the pc functions in the shader, or how to modify the surface normals of the Principled Shader (I'd rather not rebuild that entire thing from scratch).
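The pcfilter idea amounts to averaging each normal with its neighbours' normals inside a radius, then renormalizing. Here is a brute-force Python analogue of that filter (O(n²), purely illustrative; a shader would use pcopen/pcfilter against a point cloud instead):

```python
import math

def blur_normals(positions, normals, radius):
    """Average each normal with those of all points within `radius`, then
    renormalize -- a naive stand-in for a point-cloud normal filter.
    """
    r2 = radius * radius
    out = []
    for px, py, pz in positions:
        sx = sy = sz = 0.0
        for (qx, qy, qz), (nx, ny, nz) in zip(positions, normals):
            dx, dy, dz = px - qx, py - qy, pz - qz
            if dx * dx + dy * dy + dz * dz <= r2:
                sx += nx
                sy += ny
                sz += nz
        l = math.sqrt(sx * sx + sy * sy + sz * sz) or 1.0
        out.append((sx / l, sy / l, sz / l))
    return out
```

Blurring normals this way only softens the shading, though; where spheres actually interpenetrate, the silhouette still shows a hard crease, which is why metaball-style blending is usually a geometry-level operation.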