madebygeoff last won the day on September 14

madebygeoff had the most liked content!

Community Reputation

36 Excellent

1 Follower

About madebygeoff

  • Rank
    Initiate

Personal Information

  • Name
    Geoff Bailey
  • Location
    Brooklyn, NY
  1. Randomizing instanced USD variants

    Thanks, Ryew. He does have a method in that last video using a for-each loop running over all the selected primitives, but it seemed a bit complicated. I was able to chat with Mark Tucker at SideFX, who suggested a simpler method. For anyone else looking, the trick is to make sure the instancer is set to "instanceable reference" (or inherit or specialize, depending on your needs) instead of the point instancer, and your reference (with variants) needs to be the top-level prim that you are feeding in for instancing. Then you can write a bit of VEX, for instance:

        string variants_mtl[] = usd_variants(0, "/instancer1/Instance0", "mtl");
        int random = @elemnum % 3;
        usd_setvariantselection(0, @primpath, "mtl", variants_mtl[random]);

    Or you can append a Set Variant LOP and use @prim%3 in the "variant name index" box. It's a little confusing why @prim doesn't work in the wrangle, but that's how I got it to work. You take a slight performance hit by not using the point instancer, but it's a useful trade-off in some situations.
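    A slight variation on the wrangle above, in case the straight modulo gives you visible repetition: the same idea, but picking the variant with a seeded rand() instead of @elemnum % 3. This is just a minimal sketch; the "/instancer1/Instance0" path and the "mtl" variant set are the names from the snippet above, so substitute your own.

        // LOP prim wrangle sketch: random variant pick instead of a repeating 0,1,2 cycle.
        // The prim path and variant-set name are placeholders.
        string variants_mtl[] = usd_variants(0, "/instancer1/Instance0", "mtl");
        int pick = int(rand(@elemnum + 1234) * len(variants_mtl));
        pick = min(pick, len(variants_mtl) - 1);   // guard the upper edge
        usd_setvariantselection(0, @primpath, "mtl", variants_mtl[pick]);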
  2. Generate a line art with curves

    Have you checked out the new Tangent Field SOP in 19.5? It'll help keep a consistent direction for your curves, and then maybe you can use curvature to select which lines to keep. https://www.sidefx.com/docs/houdini/nodes/sop/tangentfield.html https://entagma.com/new-in-houdini-19-5-pt-1-tangent-fields-and-a-bit-of-vex/
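    If you do go the curvature route, here is a minimal sketch of the selection step, assuming an upstream Measure SOP has already written a float primitive attribute named curvature (the attribute name and the threshold channel are assumptions):

        // Primitive wrangle sketch: drop curve prims whose measured curvature is below a threshold.
        // Assumes f@curvature exists from a Measure SOP; "min_curvature" is a made-up channel.
        if (f@curvature < chf("min_curvature"))
            removeprim(0, @primnum, 1);   // 1 = also remove the prim's points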
  3. New to Solaris and USD, but I've gone through the basic overviews and tutorials. I'm trying to understand better how to manipulate different levels of data in LOPs. Let's say I've got an asset referenced in with 4 geometry variants and 4 material variants (so 16 combinations in total). When I go to instance, instead of creating 16 different prims, one for each unique combination, and instancing them as a collection, is there a more efficient way to randomize the instances? Is there a way I can instance a single prim and randomize the variants after (or with) the instancer? Or do I have to establish all the possible variants ahead of the instancer?
  4. Sharp Edge Waves

    That was set up as a proof of concept. You can, of course, adjust the resolution and the pattern, as well as use all the various heightfield tools to smooth or sharpen to taste. The one limitation is that you can't get your waves to crest over the top of themselves; a heightfield can't have two heights at the same X, Z position. But depending on the scale of your scene, it might be more lightweight than actual geo.
  5. Sharp Edge Waves

    You could also go the heightfield route and use a HeightField Pattern node plus a bunch of different noises; a rough sketch of the idea follows below. creased_abstract_waves2.hiplc
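    If you want to poke at the idea without the file, something along these lines in a heightfield (volume) wrangle gets you started. This is not the exact setup in the attached .hiplc; the channels and the folded noise are just one way to get crease-like ridges.

        // Heightfield wrangle sketch: layer a folded noise onto the height layer.
        // "freq" and "amp" are made-up channels; tune to taste.
        float n = noise(v@P * chf("freq"));
        @height += abs(n - 0.5) * 2.0 * chf("amp");   // folding the noise around its midpoint sharpens the creases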
  6. naming subnetwork output node

    I could be mistaken, but I think it has to be converted to an HDA before you can change the output names.
  7. Lighting: To clip or not to clip

    To be technically accurate, the other place you have to be careful is when you are rendering out to disk, especially if you are rendering to a file format with limited bit depth. If you are rendering to jpgs (for some reason), you only have 8 bits per channel (256 values), so everything has to fit in the 0-1 range and anything over that gets clipped. If you are rendering to something like 16-bit PNG, you have 16 bits, or 65,536 values per channel, which gives you much finer steps but still clips anything over the top of the range. That's why most of the time you will see people rendering to a 32-bit, floating-point format, usually .exr files. Those files can store the over-bright information and make sure it gets carried down the pipeline. So in practice, you don't really have to worry about it until the end (again, see Chris's site if you want the more complicated answer, but for most projects you don't have to worry about it).
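    As a back-of-the-envelope sketch of why that is (the exact rounding a given file format uses may differ):

        // Sketch: what happens to an over-bright value when it is quantized to an integer format.
        float v = 1.7;                                   // scene-referred, over-bright pixel value
        int  as8  = min(int(rint(v * 255.0)),   255);    // 8-bit:  clipped to 255 (flat white)
        int  as16 = min(int(rint(v * 65535.0)), 65535);  // 16-bit: still clipped, just with finer steps below 1.0
        // A 32-bit float EXR stores 1.7 as-is, so the over-bright survives into comp.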
  8. Lighting: To clip or not to clip

    Yes. Over-brights (values over 1.0) are very common in lighting, especially in scenes with a single very bright light source (daylight exteriors, for example). Generally you want them (they mimic realistic exposure values) and you want to protect them throughout rendering and compositing so that things you do later in compositing behave naturally. They only become an issue at the end of the finishing pipeline, when you have to decide how to map your image to a particular colorspace and format, which will have a limited range. At that point you'll hear people talking about rolling off the highlights (using an exposure curve to smoothly map the over-brights toward the upper limit of the format, which we perceive as white) and clipping (the point at which highlight information is cut off and simply mapped to the upper limit of the format). If you want to do a deep dive into lighting and CG cinematography, Chris Brejon has a great online resource: https://chrisbrejon.com/cg-cinematography/
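    To make the clip-versus-roll-off distinction concrete, a tiny sketch (the exponential curve here is just one simple example of a roll-off, not what any particular pipeline uses):

        // Sketch: hard clip vs. a smooth highlight roll-off on an over-bright value.
        float c = 2.5;                        // over-bright channel value
        float clipped    = min(c, 1.0);       // hard clip: everything above 1.0 becomes the same flat white
        float rolled_off = 1.0 - exp(-c);     // smooth roll-off: approaches 1.0 but keeps separation in the highlights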
  9. As Michael says, interior lighting from only a window source is tricky and expensive. Most renderers have some kind of portal light (the names vary by renderer); it works like a dome light, except that you set a size like an area light. The details differ between renderers, but generally you load your HDRI into your dome light, then position one or more portals in front of your windows, and the portals are optimized to push light into the interior. Here's a description of the Octane version: https://docs.otoy.com/StandaloneH_STA/StandaloneManual.htm#StandaloneSTA/Portals.htm
  10. There are a couple of good short videos on instance attributes in the documentation for the Copy to Points SOP: https://www.sidefx.com/docs/houdini/nodes/sop/copytopoints
  11. Basically I'm trying to calculate a facing ratio at SOP level as an attribute to feed into a shading network for a character (for reasons that don't matter here, I need to do this in SOPs and not in the shader itself). I wrote a basic bit of VEX to calculate the facing ratio, but if the object is transformed at the object level or parented, it doesn't work, because it's grabbing the point position prior to parenting and object-level transformations, which are likely to occur in animation. I thought I could use ow_space() but that doesn't seem to work (possibly because parent transforms are applied after SOP-level calculations?). Is there any way to grab point positions that will take object-level transforms into account?
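    One common approach to the question above, as a minimal sketch rather than a definitive answer: pull the containing object's world transform into the wrangle with optransform() and compute the facing ratio in world space. The camera path and the assumption that the wrangle lives inside the character's geo object are placeholders.

        // Point wrangle sketch: facing ratio in world space, so object-level transforms and parenting are respected.
        matrix obj_xform = optransform("..");                            // world transform of the object containing this wrangle
        vector world_P   = @P * obj_xform;                               // point position after object transforms
        vector world_N   = normalize(@N * matrix3(obj_xform));           // fine for rigid transforms; non-uniform scale needs more care
        matrix cam_xform = optransform("/obj/cam1");                     // assumed camera path
        vector cam_P     = cracktransform(0, 0, 0, {0,0,0}, cam_xform);  // camera's world-space position
        f@facing = clamp(dot(world_N, normalize(cam_P - world_P)), 0.0, 1.0);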
  12. Houdini Vellum Mesh Flickering / Jittery

    So, for the jittering: you can think of Vellum as putting a whole bunch of springs connecting your points. You can see them (the constraints) by connecting a null to the second output of the cloth or solver SOP. Each of those has: 1) a stiffness (stretch stiffness) for how much it wants to hold its original length (rest length), 2) a stiffness (bend stiffness) for how much it wants to keep its original angle in relation to the points around it, and 3) a stiffness (compression stiffness) for how easily it can compress its length. There are other attributes, but those are the most important. With each step, the solver moves the points and then goes back and tries to satisfy the requirements of all the constraints. It does this several times (iterations), getting closer and closer.

    Jittering happens when you give the solver settings it can't satisfy given the constraint requirements. For instance, let's say you crumple a piece of fabric but have a very high bend stiffness. You'll get areas that have to bend to avoid penetration, but the constraint setting won't allow it, so the solver freaks out and you get jittering between multiple bad solves. The main thing is that you had VERY high stiffness settings all around: 1x10^10 for stretch stiffness, 1x10^6 for compression, and a low bend stiffness. So as the cloth stretches over the rock, it can't stretch. I just turned the stiffness settings down (and turned compression stiffness off, since you don't have anything compressing) and it worked fine.

    Lastly, your settings were VERY inefficient. I think in trying to solve the jittering you may have tried tweaking a lot of other stuff, but it resulted in a very slow sim for how simple it is. Your mesh was crazy dense; I turned the remesh down to a target size of 0.1 or 0.05. You can raise it later if you really need more detail, but Vellum is now pretty stable across high- and low-res meshes, so start low, dial things in, then turn the resolution up. Your edge length scale in the cloth SOP was also REALLY small; generally 0.25-0.5 is a good place to start (you had 0.01). And your solver settings were very high. Generally, the more substeps, the fewer constraint iterations you need, because each movement is smaller. So with 5 substeps (a decent place to start) you can usually get away with 25-50 constraint iterations for cloth.

    And finally, you had velocity damping turned all the way up. Sometimes you'll hear that turning it up will help smooth sims, but generally that's not true and it results in behavior that isn't very natural, so use it sparingly. You can always turn on the visualizer in the solver and see what's causing the jittering. Best of luck. fabric_pulling_over_rock2.hiplc
  13. Houdini 19.5 skeleton blend issue

    The new skeleton blend added a couple of features, but it still works the same way. In its default state, the only noticeable difference is that "bias" has been replaced by "weights", which might lead to confusion. Bias blended between input 1 (slider at 0) and input 2 (slider at 1). Now the skeleton blend works more like the blend shapes node, where you can add multiple blends and give each a weight between 0 and 1. If nothing is happening, check that at least one of your weights is > 0, and double-check that your "Attribute to match" exists on both skeletons (which it should, from what you've described). Lastly, you can always downgrade to an earlier version of any SOP. Go to Windows > Asset Manager > Configuration and set "asset definition toolbar" to "show always". Now at the top of your SOP parameters you should see the version number of the SOP; change it to 2.0 and it will revert to what is used in the tutorial. If none of this works, post a .hip file.
  14. Well, it all depends on what you want to call "correct". Houdini, like a lot of computer programs, uses floating-point numbers to represent its number system. It's a bit complicated, but think of it like scientific notation in binary. The short answer is that in floating point, 0.7 has an endlessly repeating binary expansion, kind of like how if you tried to write 1/3 in decimal notation you'd end up with endless .3333333333333s. Same thing. So instead, Houdini picks the closest value it can represent in floating point; in this case, that's 0.69999999999999996. If you click on the parameter name, it will expand to show the actual floating-point value that Houdini is using internally for its calculations. If you click the parameter again, it collapses back down to a simplified value that Houdini chooses to display. So what you are seeing is "correct": it is displaying the actual value that Houdini uses to calculate the scattering phase. But it's also annoying to look at so many decimals, so depending on what you want, you can use something like round(), floor(), or ceil() in your expression to trim the value to something easier to read.
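    To spell out the analogy with actual digits (this is just the standard base conversion, nothing Houdini-specific):

        1/3 = 0.333333...            (decimal, the "3" repeats forever)
        0.7 = 0.1011001100110011...  (binary, the "0011" repeats forever)

    A double only has a finite number of bits for the fraction, so the stored value is the nearest representable number, which prints back as 0.69999999999999996.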
  15. transfer Cd with CopyToPoints

    The geo spreadsheet is your friend here. And I admit it's a little confusing; plus, in your question there's some mixing of metaphors going on. When you create a primitive sphere and apply a color to it, the color is applied to the point representing the primitive, and the viewport does a trick to color the sphere from the point color. When you pack and instance the copy to points, the sphere (primitive or poly) is stored in memory and color IS transferred to the new points that represent each packed sphere. If you look in the geo spreadsheet, you can see that Cd IS transferred to the points regardless of whether the sphere is primitive or poly. However, because the primitive sphere is itself packed, the viewport can't do its little display trick, so they all look grey despite having a Cd attribute. As Atom said, if you want to see the colors on the primitive spheres, you have to unpack them (either by unchecking the pack and instance option or by appending an unpack node and transferring Cd in "Transfer attributes"). Now the primitives are assigned a color and the viewport can display them.

    Finally, pscale is NOT technically transferred. If you look at the geo spreadsheet after the Copy to Points, you'll see that there is NO pscale attribute on the points. The default is for pscale not to be transferred; instead, it (along with the orient attribute) is applied as a transform to all the copies. You could change this by unchecking "Transform Using Target Point Orientations" and deleting ^pscale (not pscale) from the list of "Attributes from target". Now pscale IS transferred as an attribute (although if you want to use it to scale your copies you now have to do that manually, but that is another post topic). Digging through the geo spreadsheet is often a good way to figure out how Houdini is moving attributes around.