multi-context VEX shader


Harry


Not that I know of. The usual way would be to compile them into a surface shader and a displacement shader respectively, and build a material from those in Houdini. That way everything gets updated automatically when you change your code.

 

Is there a specific reason why you absolutely need to have this together?

 

You could also try compiling your shaders with the compiler pragma #pragma optable vop and wiring a multi-context shader together yourself in Houdini.
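As a minimal sketch of what such a pragma-tagged source file might look like (the shader body here is just a placeholder):

```vex
// surface_part.vfl -- tagged so the compiled operator lands in the
// VOP operator table and can be wired into a material network by hand
#pragma optable vop

surface
my_surface(float albedo = 0.8)
{
    Cf = albedo * diffuse(normalize(N));
}
```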


And, about that, if you create a mantrasurface (which contains both surface and displacement networks) from the material palette and you save the vex source to disk as a vfl file, you will see the code produced by "surfaceOutput" (the return statements for the surface shader method), but there is no mention of "dispOutput" (nor any "displace" shader method anywhere). So it looks like saving the VEX source from a material only saves out the surface shader and not the displacement shader. Is that correct?


Thanks! Yes, I didn't see that. But then, if that is the code that gets generated and executed at render time, it doesn't seem optimal, because a lot of what only feeds the displacement outputs is present in the surface code (the results of those computations are never used in the surface shader)…

 

Anyways, it illustrates what I'm talking about, I think.

 

In the end, the best practice seems to be to prototype with the VEX builder, save out a surface VFL and a displacement VFL, refactor both to use common VEX libraries or include/header files, and optimize the VEX code in a text editor.
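As a sketch of that layout (the file names and the noise helper are made up for illustration), the shared logic lives in a header that both VFLs include:

```vex
// my_pattern.h -- shared helper, #include'd by both shaders
float my_pattern(vector p)
{
    return noise(p * 4.0);
}
```

```vex
// brick_surface.vfl
#include "my_pattern.h"

surface
brick_surface()
{
    Cf = my_pattern(P) * diffuse(normalize(N));
}
```

```vex
// brick_displace.vfl
#include "my_pattern.h"

displacement
brick_displace(float amount = 0.1)
{
    P += normalize(N) * my_pattern(P) * amount;
    N = computenormal(P);   // recompute the normal after moving P
}
```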

 

I'm a complete n00b thinking out loud here, not hoping that it will ever make sense to anybody. ;-)

Edited by up4

Don't forget that you are looking at the raw conversion of your node network to vex code. The compiler will optimize the code a LOT when compiling to bytecode and get rid of most (if not all) of the stuff that does not contribute to the final output of the shader. So no need to spend a lot of time manually refactoring and optimizing the code.

Edited by dennis.albus

And another thing (for multi-context shaders, and I'm still thinking out loud here) is that sharing data between geometry, surface and displacement is not so trivial (for a n00b like me). I haven't figured out how to peek into vertex/point/primitive data from a surface shader, or whether it is even feasible (starting from the currently shaded surface and not from a hard-coded network path). Is it possible? What would be the best way? Thanks again!


You mean inspecting geometry and shading information from another position on the geometry than the one you are currently shading?

 

AFAIK this is not possible. I would love to have direct access to geometry and shading information, especially in different spaces (in UV space, for example, to do proper blurring of procedural textures).

 

You can do a workaround by sending rays to the desired locations using the trace or gather functions, but it is not nearly as convenient as accessing them directly.
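A rough sketch of that workaround in a surface shader (the exact keyword arguments accepted by trace vary between Houdini versions, so treat these as indicative and check the docs):

```vex
// probe another location on the geometry by firing a ray at it
surface
probe_demo(vector probe_offset = {0, 1, 0})
{
    vector hit_cf = 0;
    vector orig   = P + probe_offset;       // start the ray off-surface
    vector dir    = normalize(P - orig);    // aim back toward the shaded point

    // the "Cf", hit_cf pair asks trace to import the shaded color at the hit
    if (trace(orig, dir, Time, "samplefilter", "closest", "Cf", hit_cf))
        Cf = hit_cf;
    else
        Cf = diffuse(normalize(N));
}
```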


Actually, I'm trying (and failing) to get anything at all from the currently sampled geometry other than the globals. If I create a custom primitive attribute on the geometry in a SOP network, would using renderstate to retrieve "object:name" and feeding both that and the result of a primid call to the prim function be the best way to find the value of that attribute? If so, does prim_attribute perform smooth interpolation of the same attribute in UV space? If not, what does it do compared to a simple prim call? I'm back to basics here, but it seems a little under-documented. And thanks!
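For what it's worth, the approach described above would look roughly like this in VEX (the attribute name "myattr" is made up, and the availability of primid in your shading context should be checked against the docs; primuv is the variant that interpolates an attribute at a parametric uv position, whereas prim returns the raw per-primitive value):

```vex
// surface shader sketch: read a custom primitive attribute from the
// object currently being shaded, without hard-coding a network path
surface
read_prim_attr()
{
    string geo = "";
    float  val = 0;

    if (renderstate("object:name", geo))
        val = prim(geo, "myattr", primid());   // raw primitive value

    Cf = val * diffuse(normalize(N));
}
```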


  • 2 weeks later...

I was wondering about this exact same thing as well. If you make a VOP material, it has surface, displacement, fog, etc. I can see the code for each of these separately, but how do these separate pieces of code come together at render time?

 

Is there no way to do this using hand-written VEX code? If not, do I have to write code for each component (surface, displacement, etc.) separately using File > VEX Type? If so, how will I combine all of these separate shaders in the end?

 

I know some people here write their shaders in VEX, so how do they handle the step above?


they don't "become one".  they are individual shaders that are generated from (potentially) shared vop networks.  the "material" vop is really not much more than a (slightly) fancier subnet.  if you have a material that contains both displacement and surface shading, you can apply it as a "material" or you can apply it as only a surface shader or only a displacement shader.  so really, it's just that the material node collects shaders and lets you refer to them by a single common name rather than having to come up with a way to differentiate between surface shaders and displacement shaders (like: material "bricks" vs surface shader "bricks_surface" and displacement shader "bricks_displacement").

 

personally, it feels a bit like the material workflow sort of sputtered in development.  perhaps they implemented it just enough to give people what they wanted (the ability to share common vop logic between related shaders) without having to totally change the paradigm (surface shaders do one thing, displacement shaders do something else, etc).  there are some other quirks to working with the material builder nodes that i don't like, so i tend to just write separate shaders.  if i need to combine them, i use subnets.  it's way easier to combine two disparate shaders than it is to disentangle a material node into its constituent parts.

 

in terms of writing one piece of vex code that can compile to different types... you can do that by setting flags on the vcc command line.  vcc will take a context flag and you can check that with #ifdef's to rework your code to compile appropriately for whichever shader you're writing.
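a sketch of that pattern, using an explicit define rather than relying on any particular built-in context macro (check vcc's help for the preprocessor flags in your version):

```vex
// multi.vfl -- one source file, compiled twice:
//   vcc multi.vfl                  -> surface shader
//   vcc -DAS_DISPLACE multi.vfl    -> displacement shader
#ifdef AS_DISPLACE

displacement
multi_disp(float amount = 0.1)
{
    P += normalize(N) * noise(P * 4.0) * amount;
    N = computenormal(P);
}

#else

surface
multi_surf()
{
    Cf = noise(P * 4.0) * diffuse(normalize(N));
}

#endif
```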


Thanks Miles. How does one, for example, take a VOP material like Mantra Surface, click "View Code", then gather the code for each component (surface, displacement, etc.) and put it inside a single VEX-based material? I just want to do this as an experiment. If I just compile it using Houdini, then I don't see the code of the new compiled material, so I don't want that. Something like this:

My VEX material()
{
    // default mantra surface
    // surface code:
    ...
    // displacement code:
    ...
    // end
}

Is this possible?

 

Thanks :)


Hi Ryan,

 

From what Miles says, I don't think it is possible. But it is exactly what I was referring to myself! Also, the pragma solution is less elegant, IMHO, than just writing two separate shaders (surface, displacement), using libraries of shared (#include) code, and then combining them with the material VOP net.


  • 2 weeks later...

i'm not really sure what is to be gained by mixing surface and displacement code, since they're not likely to be calculating the same things.  however, if you combined a displacement shader with a SOP shader, then you're talking about mostly the same code with some extra bits here and there.

 

i have done this in the past -- though i was basically doing inline code in vops to avoid some of the wrangling.


When I want to use the same code in a surface shader and a displacement shader, I'll calculate the values in the displacement context, then export those values as parameters that can then be used in the surface context with the dimport function:

https://www.sidefx.com/docs/houdini13.0/vex/functions/dimport

http://www.sidefx.com/docs/houdini13.0/nodes/vop/dimport
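A sketch of that pattern (the shader and export names here are illustrative):

```vex
// the displacement shader computes the value once and exports it
displacement
disp_share(float amount = 0.1; export float my_mask = 0)
{
    my_mask = noise(P * 4.0);
    P += normalize(N) * my_mask * amount;
    N = computenormal(P);
}
```

```vex
// the surface shader pulls the exported value back in with dimport()
surface
surf_share()
{
    float my_mask = 0;
    if (!dimport("my_mask", my_mask))
        my_mask = noise(P * 4.0);   // fallback if nothing was exported

    Cf = my_mask * diffuse(normalize(N));
}
```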

 

That's actually something I don't like about the material context sometimes... global variables like N are not always identical in surface and displacement context, but the interface can confuse that fact a bit.

