
Posts posted by ejr32123


  1. This doesn't directly answer your question, but Karma can do this. When using Karma/Solaris, it saves the snapshot to disk and you can see it again when reopening Houdini. A lot of good features are being added, but you have to use Solaris to get them.


  2. 14 hours ago, doc said:

    ejr32123 is correct. You want a constant shader on the top of the sidewalk, but you want proper lighting on the fractured interior faces. 

    A larger issue you'll have is that when the pieces break and move, the lighting on the top surface should change. The shadows should change, and the diffuse and specular lighting should change. But how do you account for this if the lighting is baked into the projected image? No easy answer as far as I know. People often make a copy of the original image and try to remove specular and shadows in the hope of creating a diffuse texture map that can be used with a Principled Shader. Photoshop or GIMP seem the most obvious tools for this, but I believe some photogrammetry solutions have automated mechanisms for it.

    Once this is done you'll need to reproduce the lighting in cg so that it matches the photo. You'll probably need to render a shadow pass for areas where your cg debris is supposed to cast shadows on the constant surface. Then it'll probably take some love in compositing to make it all work.

    Hope that helps

    Not sure how to do it in Mantra, but in Redshift I can make a material that receives GI and shadows (and I can also disable one or both of those if I want), but not direct lighting. That way my texture matches the scene but can still pick up shadows if one is cast over it. If you can't do that in Mantra, you could render the top faces again with a shadow-catcher material, then comp the shadow back onto the top faces as you said.
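    The shadow-catcher comp step above boils down to a per-pixel multiply: the shadow pass is white where unshadowed and darker where the CG debris casts shadows, so multiplying it into the plate darkens only those areas. A minimal sketch, with made-up pixel values (the function name and `density` blend parameter are hypothetical, not from any renderer):

```python
# Sketch of comping a rendered shadow pass back over the projected plate.
# A shadow pass is white (1.0) where unshadowed and darker where the CG
# debris casts shadows; multiplying it into the plate darkens only those
# areas. Values are illustrative, not from any real render.

def comp_shadow(plate_rgb, shadow_pass, density=1.0):
    """Multiply a shadow pass into one plate pixel.

    density blends between no shadow (0) and the full pass (1).
    """
    s = 1.0 - density * (1.0 - shadow_pass)
    return [c * s for c in plate_rgb]

pixel = [0.8, 0.7, 0.6]          # plate pixel (projected sidewalk texture)
print(comp_shadow(pixel, 1.0))   # unshadowed: plate unchanged
print(comp_shadow(pixel, 0.4))   # in shadow: darkened
```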


  3. 7 hours ago, Scaletta said:

    Thank you, it was nice with the constant shader, but this time the depth of the shards disappeared. I think I'm making a mistake somewhere.

     

    Ekran Alıntısı.PNG

    Are you using a separate material for the inner faces? That might help. I think you just don't want lighting on the top faces where your texture is being projected, so that they match the footage; the rest of the geometry needs lighting, otherwise it looks flat. The fracture node outputs inner-face groups to help with that.


  4. It would be nice to see a more streamlined way of controlling random time offsets with point attributes. Point attributes don't really work with Time Shift unless you use a for-each loop, and that approach is inefficient and really slow. CGWiki has a page about time offsets in for loops. Of course CHOPs work, but point attribute + Time Shift seems like it could be way easier.
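    The idea above, each point sampling the animation at its own shifted time, is just evaluating frame F minus a per-point offset. A minimal sketch of the arithmetic in plain Python (function names and values are hypothetical; in Houdini, a Time Shift inside a for-each loop effectively does this once per piece):

```python
# Sketch of per-point time offsets: each point evaluates the animation at
# its own shifted frame, like $F minus an "offset" point attribute fed to
# a Time Shift. Names and values are illustrative.

def animated_value(frame):
    # stand-in for any animated channel (hypothetical)
    return frame * 0.1

def sample_with_offsets(frame, offsets):
    # one shifted evaluation per point
    return [animated_value(frame - off) for off in offsets]

offsets = [0.0, 5.0, 12.0]       # per-point "offset" attribute values
print(sample_with_offsets(24, offsets))
```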


  5. 25 minutes ago, Atom said:

    I haven't tested volume lighting, but material emission does work.

    Yeah, that works for me. I wanted to see if I could boost it with a volume light. I should have clarified that.


  6. 3 hours ago, Atom said:

    Or use a multiply node and two RSTexture nodes inside the material blender. I'll often tint my AO passes with a solid color using this technique.

    That is what I wound up doing. I stupidly used an RS Multiply node and it came out gray. After a few minutes I realized it needed an RS Multiply Vector, lol. I forget that RS sometimes splits nodes up like that.


  7. Well, I actually found the answer to my question by editing the material in Mixer. I think the problem is that Quixel materials are meant for games, so they include an "occlusion" texture. If I disable it, it looks like my render. Of course, there is no way to plug an occlusion texture into an RS Material or Principled Shader. So what exactly am I supposed to do to get this looking good in Houdini?

    image.png.71b44e648b7bc6febad3eca1ebaaaa0d.png

    image.thumb.png.dfc6740c3c2ac00671fc308e61cb88ae.png


  8. Hello. I recently started using more of Quixel's Megascans library with Redshift in Houdini. Most of the 3D assets render fine, but I have the most trouble getting textures to look decent, especially grass. I can get dirt looking good, but it seems like I have to constantly change settings like displacement amount depending on the shot angle/lighting. I swear it seems impossible that the render they show was actually made from the same textures. Look at their render of the grass, then look at mine: there are gaps of shapeless color between patches of grass. It looks nothing like the fully dense grass render they have. I tried both Mantra and Redshift; nothing comes close to making the grass look anything like their render. It just looks flat, with gaps between the patches of grass.
    Any tips?

    image.thumb.png.21fd4ebe9c11a1e07c2f5df799a254c1.png

    image.thumb.png.ac4681ebde97d34ba5cf73a771e66876.png


  9. Hopefully other people don't know this, so I'm not the only dumb person here, lol. I just found out that if you use UPPERCASE letters in a node name, it shows first in the choose-operator dialog. That is why SESI scene files always have the term "OUT" in caps.

    Anyway, that's it from me. Good luck...

    image.png.bf712edc5e93361504037021cbf8a875.png
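    Presumably this is plain lexicographic ordering at work: in ASCII, every uppercase letter sorts before every lowercase letter, so an all-caps name lands at the top of any naive sort. A quick check in Python (node names here are just examples):

```python
# Uppercase ASCII letters (65-90) sort before lowercase ones (97-122), so
# an all-caps name like "OUT" lands ahead of lowercase node names in any
# plain lexicographic sort. Example names are arbitrary.
names = ["box", "OUT", "attribwrangle", "RENDER"]
print(sorted(names))  # → ['OUT', 'RENDER', 'attribwrangle', 'box']
```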


  10. 14 hours ago, Atom said:

    As far as I know, there is only one valid case where you actually need to use the Hero RBD object: when you want the object to float on the FLIP surface. If you can work with packed objects, you might want to go with that.

    I agree with your observation, though. I can't find any way to alter the velocity beyond the initial velocity.

    The reason I used Hero over packed was that I thought I could give different points different velocities to guide it. Since packed geo only has one point, I used the Hero RBD. But apparently that doesn't work.


  11. Kinda hard to troubleshoot without a file, but can you try using a Volume Visualization node and set density to flame/temperature (depending on which you are using for fire)? You can then visualize the flame/temperature field in the viewport. Is it still disappearing/reappearing?

    Edit: did you cache your pyro simulation? It is possible you dragged the playhead to an uncached area, so it's rendering incorrectly.


  12. It's because you are not sourcing any temperature or burn. If you go to Volume Rasterize Attributes and only use temperature and burn, it will be empty. You need to change the coverage attribute to nothing, or change the way you are sourcing (e.g. give each particle a density value higher than 0). You have density as your coverage attribute, and the points that carry burn/temperature have a density of 0, so you are effectively multiplying burn/temperature by 0. Only density is being sourced.
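    The coverage problem above is just multiplication. Roughly speaking, each particle's contribution is its attribute value scaled by the coverage attribute, so zero density zeroes out temperature and burn no matter what they hold. A toy sketch with made-up numbers (the function is illustrative, not the actual rasterizer):

```python
# Illustration of why density-as-coverage zeroes everything: the sourced
# contribution is roughly attribute * coverage per particle. Numbers are
# made up; the point is the multiply-by-zero.

def rasterize(attrib_value, coverage):
    # hypothetical stand-in for the per-particle coverage scaling
    return attrib_value * coverage

temperature, density = 1.5, 0.0        # particles carry no density
print(rasterize(temperature, density)) # temperature never gets sourced
print(rasterize(temperature, 1.0))     # nonzero coverage: value survives
```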

    image.png.3fbd5a5590885e6fc9f6f49cd3a0d049.png
