Help us (SideFX) to enable mixing multiple materials in Karma XPU!



Hello! For those that don't know me, I'm Bryan Ray, a technical consultant with SideFX Software. 

We know that the capability to mix multiple materials is something users want in Karma XPU, but there are some significant technical hurdles to overcome. We would therefore like to solicit some scenes and assets that are representative of the kinds of workflows you want to see. If you can contribute something that will help us to evaluate which of the solutions we're pursuing will work for you, please reply here, and we can work out the details.

--

For those interested in the nitty-gritty, there are essentially three approaches we're looking at. The way it currently works in Karma CPU, and the way the single Mix works in XPU, is that the BSDFs themselves are blended. The downside to doing this on the GPU is that it creates a resource constraint that gets exponentially worse the more materials are layered in, which can slow the render down, potentially by a lot. But it's reasonably physically accurate.
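To make that concrete, here's a rough Python-style sketch of what BSDF blending amounts to. The names (Bsdf, evaluate, blend_bsdfs) are purely illustrative and not Karma's actual code; the point is that every layer is fully evaluated before the results are combined:

```python
from dataclasses import dataclass

@dataclass
class Bsdf:
    base_color: tuple   # (r, g, b)
    roughness: float

    def evaluate(self, wi, wo):
        """Return reflected radiance for the given directions (stub)."""
        # A real renderer would evaluate microfacet lobes here; this stub
        # just returns the base color so the example runs.
        return self.base_color

def blend_bsdfs(layers, weights, wi, wo):
    """Evaluate every layer's BSDF, then blend the *results* by weight."""
    total = (0.0, 0.0, 0.0)
    for bsdf, w in zip(layers, weights):
        contrib = bsdf.evaluate(wi, wo)          # one full evaluation per layer
        total = tuple(t + w * c for t, c in zip(total, contrib))
    return total

# With N layers you pay for N evaluations (and keep N sets of lobe state)
# per shading sample, which is the GPU resource problem described above.
```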

Alternatively, we can mix the parameters instead, which is what people are commonly advised to do manually to get around the problem. Done by hand, it's a lot of extra work wiring Mix nodes for every single input. We could essentially automate all of that away by having the Mix node do the parameter mixing itself, so it would look like BSDF mixing without actually being that. The downside of this approach is that it's not really physically accurate: instead of properties layering over one another, you get an interpolated result. This is how game engines typically do it. Maybe that's good enough, or exactly what you're expecting. We don't know; that's why we're asking.
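As another purely illustrative Python-style sketch (not Karma's API), parameter mixing interpolates the shader inputs first and then evaluates a single BSDF from the result:

```python
def lerp(a, b, t):
    """Linear interpolation between two scalars."""
    return a + (b - a) * t

def mix_parameters(mat_a, mat_b, mask):
    """Blend the shader *inputs* by the mask (0..1), producing one parameter set."""
    return {
        "base_color": tuple(lerp(ca, cb, mask)
                            for ca, cb in zip(mat_a["base_color"], mat_b["base_color"])),
        "roughness": lerp(mat_a["roughness"], mat_b["roughness"], mask),
        "metallic":  lerp(mat_a["metallic"],  mat_b["metallic"],  mask),
    }

chrome = {"base_color": (0.9, 0.9, 0.9), "roughness": 0.05, "metallic": 1.0}
rubber = {"base_color": (0.1, 0.1, 0.1), "roughness": 0.8,  "metallic": 0.0}

# Only one BSDF is ever built from the result, so GPU cost stays flat,
# but a 50/50 mix becomes a single "half-metallic, half-rough" material
# rather than two physically layered responses.
blended = mix_parameters(chrome, rubber, 0.5)
```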

The third way is a stochastic blend, where each pixel sample chooses one of the layers with a probability based on its weight. This is relatively physically accurate and doesn't have the resource constraint of BSDF blending, but it's noisy, which might mean longer render times or more aggressive denoising to reach the desired level of quality.
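One final illustrative sketch, with the same caveats as above: stochastic blending picks exactly one layer per pixel sample, with probability proportional to its weight, so each sample only pays for a single BSDF but carries extra variance:

```python
import random

def choose_layer(weights, rng=random):
    """Pick one layer index with probability proportional to its weight."""
    total = sum(weights)
    u = rng.uniform(0.0, total)
    running = 0.0
    for i, w in enumerate(weights):
        running += w
        if u <= running:
            return i
    return len(weights) - 1          # guard against floating-point round-off

def stochastic_shade(layers, weights, wi, wo, rng=random):
    """Evaluate only the randomly chosen layer for this pixel sample."""
    i = choose_layer(weights, rng)
    return layers[i].evaluate(wi, wo)   # cost of one BSDF, regardless of layer count
```

Averaged over many samples this converges to the weighted mix, but any single sample sees only one material, which is where the extra noise comes from.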


Bryan,
Perfect timing - I was just investigating the current situation with this and found your post. It's great to hear that this is being worked on.

Reading your descriptions, option 2 - 'parameter blending' - sounds like the preferred option to me, but could we be greedy? Would it be possible to implement more than one method - say, parameter blending and stochastic blending - and provide a switchable choice?

...always wanting more : )

 

Edited by Mike_A

:D I don't want to promise too much, but yes, that's a possibility. It's uncertain whether there will be UI to expose the alternate modes or whether it will be controlled by an environment variable. But we need to gather the necessary data before making those decisions. If you could give me a general description of what kind of work you do, Mike, and how you'd want to apply blended materials, that would help.


So in this example, it appears you're not concerned with interpolation or actual blending, since you're just using a threshold to choose emissive or chrome? It's hard to tell through the compression--are there places where the materials are truly mixed?

Mixing two materials in Karma XPU is already possible, by the way, using the BSDF blending method. The performance problems aren't typically noticeable at that level of complexity, so it's allowed as of Houdini 20.0.



Bryan,
Thanks for the response.

First thing to say is that I am probably not a typical Houdini user. I'm a generalist focused on product visualisation / environments - not VFX - so my requirements may not reflect those of the majority.

There are two ways in which I typically work:

1. Mixing materials with a clean threshold (black / white mask)
This is the vast majority of my work. The current issue is XPU's limit of two materials - that's constraining. Sometimes I'd like to layer 3, 4, 5 or more materials in a shader network and mix between them with masks.

Typical examples:
Multiple surface finishes on the same product / mesh
    - metal surfaces where there is a mix of 'bare metal', 'undercoat' and 'top coat' materials.
    - natural or man-made environment surfaces that change across location.
Decals and labels on products

This simple mixing doesn't require any material interpolation, just the ability to mix multiple materials with a black / white mask. I would assume simple 'parameter mixing' would be the best choice here.


2. Mixing materials with some degree of 'interpolation'
Much less common for me, but there are times when I want some basic interpolation between materials, rather than just a clean 'one or the other' mix. Below is an example done in Redshift. It's entirely texture based, with animated texture maps for the 'white water' / 'foam', and I believe it used this 'interpolated' type of material mixing.

Ocean and water textures

No need to be physically accurate for this sort of work - but render speed is important again.

Typical examples:
Simulated white water / foam over base water
Dirt layers over base materials
Medical visualisation when dealing with flesh / fluids 
Blending of bump / normal / displacement materials


I hope this helps!


 

Edited by Mike_A
