
Faceting in Displacement Map From Copernicus



Greetings,

First off: COPERNICUS IS AMAZING!! I have wanted a system like this for so long. I know it is in beta, so there are plenty of kinks, but it already has massive potential..... so many crashes, lol.

I have successfully implemented planar projection for decals and Substance-style warp projection. Now I am trying to generate displacement from a high-poly and a low-poly mesh.

I am currently attempting to replicate the functionality of the Maps Baker node. My reasoning is that Maps Baker works in the old COPs system, and Copernicus can do everything COPs could do, plus work with geometry.

I am very close: I can get the overall shape, but the displaced mesh shows visible faceting from the low-poly mesh.

Here is the VEX I am using in a Copernicus wrangle:

int prim;
vector uv;
vector hitPos;
vector dispDir;
float dispSwitch;
float dispMag;

// v@origP holds the rasterized low-poly world positions
// dispMag is the distance to the nearest point on the high-poly mesh
dispMag = xyzdist(1, v@origP, prim, uv);

// get the corresponding position on the high-poly mesh
hitPos = primuv(1, "P", prim, uv);

// get the normalized direction toward the high-poly mesh
dispDir = normalize(hitPos - v@origP);

// check the direction against the rasterized low-poly normal
dispSwitch = dot(v@N, dispDir);

// if the direction opposes the normal, negate the displacement value
if (dispSwitch < 0) {
    dispMag *= -1;
}

@disp = dispMag;
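As a sanity check, the sign-flip logic on its own can be sketched outside of VEX. Here is a plain-Python equivalent (a hypothetical `signed_disp` helper doing standalone vector math, not any Copernicus API):

```python
import math

def signed_disp(orig_p, hit_p, n):
    """Signed displacement: distance from the low-poly point to the
    high-poly hit, negated when the hit lies against the normal."""
    d = [h - o for h, o in zip(hit_p, orig_p)]
    mag = math.sqrt(sum(c * c for c in d))
    if mag == 0.0:
        return 0.0
    # the sign of dot(N, direction) decides inward vs. outward;
    # normalizing the direction is unnecessary for the sign test
    if sum(nc * dc for nc, dc in zip(n, d)) < 0:
        return -mag
    return mag

print(signed_disp((0, 0, 0), (0, 0.5, 0), (0, 1, 0)))   # 0.5  (hit along +N)
print(signed_disp((0, 0, 0), (0, -0.5, 0), (0, 1, 0)))  # -0.5 (hit against N)
```

Note that this per-pixel sign test depends entirely on what v@N holds at each rasterized pixel, so the quality of the rasterized normal matters as much as the distance itself.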

I am basing it on the code from the Maps Baker:

float incrementX = (1.0/$XRES);
float incrementY = (1.0/$YRES);

float offsetX = incrementX * 0.5;
float offsetY = incrementY * 0.5;

vector udimoffset = detail($LOW, "__udimoffset", 0);

vector WorldN = $vec;
vector MyN = uvsample($LOW, "N", "uv", set((($IX+1)*incrementX - offsetX) + floor(udimoffset.x), (($IY+1)*incrementY - offsetY)+ floor(udimoffset.y), 0));
vector lowpos = uvsample($LOW, "P", "uv", set((($IX+1)*incrementX - offsetX) + floor(udimoffset.x), (($IY+1)*incrementY - offsetY)+ floor(udimoffset.y), 0));

$DISP = distance($pos, lowpos) * $Alpha;
$DISP = ($Alpha == 0.0) ? 0 : $DISP;

if (dot(normalize(lowpos-$pos), MyN) >= 0) {
    $DISP *= -1;
}

if ($RemapHeight == 1)
    $DISP = fit($DISP, $HeightRange.x, $HeightRange.y, 0, 1);
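For what it's worth, the UV expression in that snippet reduces to sampling at pixel centers. The arithmetic alone can be sketched in Python (a hypothetical `pixel_center_uv` helper; `udim_offset` here stands in for the already-floored `__udimoffset` values):

```python
def pixel_center_uv(ix, iy, xres, yres, udim_offset=(0.0, 0.0)):
    """UV at the center of pixel (ix, iy), matching the Maps Baker
    expression (IX+1)/XRES - 0.5/XRES, shifted by the UDIM tile offset."""
    u = (ix + 1) / xres - 0.5 / xres + udim_offset[0]
    v = (iy + 1) / yres - 0.5 / yres + udim_offset[1]
    return (u, v)

print(pixel_center_uv(0, 0, 4, 4))  # (0.125, 0.125) -- first pixel center
print(pixel_center_uv(3, 3, 4, 4))  # (0.875, 0.875) -- last pixel center
```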

[image: render of the displaced mesh showing the faceting]

Notes:
I have enabled subdivision on the SOP Import node. It helps, but the faces can still be seen on the resulting rendered mesh.
I have tested dicing quality from 0.1 to 1 and the faceting remains.
If I send a generic noise from Copernicus to LOPs there is no faceting. I can see the UV seams, but I think that is expected, as the pattern becomes discontinuous at those locations.

Here is a screen grab of the setup:

[image: screenshot of the Copernicus node setup]

 

Any suggestions or help is much appreciated.

Cheers!


  • 2 weeks later...

I would really love to know more about this too. I've had similar problems, including "tearing" along UV seams in displaced meshes. For that, I've been trying things like using a uniform value along the outlines of the UV seams; it helps, but doesn't fix the problem.

 

If I had to speculate: are these faceted lines following the actual edges of the low-poly geometry? The answer to the "why" might lie there. I'm also wondering whether we need to do something to the normals of our displaced surfaces: displacing a surface obviously changes it, which means the previous normal vectors are no longer right. Or do the MaterialX standard surface and other shaders compensate internally? I don't know, but what I've seen in my own work and in your pictures leads me to believe they don't, and that we may need to compute corrected normals to match the displacement.
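If recomputing normals after displacement does turn out to be needed, the usual recipe is area-weighted smooth vertex normals. A minimal Python sketch of that idea (a hypothetical `vertex_normals` helper, assuming a triangulated mesh given as point and triangle lists):

```python
import math

def vertex_normals(points, tris):
    """Area-weighted smooth vertex normals: accumulate each triangle's
    cross-product normal onto its three vertices, then normalize the sums."""
    acc = [[0.0, 0.0, 0.0] for _ in points]
    for a, b, c in tris:
        pa, pb, pc = points[a], points[b], points[c]
        e1 = [pb[i] - pa[i] for i in range(3)]
        e2 = [pc[i] - pa[i] for i in range(3)]
        # cross(e1, e2): its length is twice the triangle area,
        # so larger faces automatically contribute more
        n = [e1[1] * e2[2] - e1[2] * e2[1],
             e1[2] * e2[0] - e1[0] * e2[2],
             e1[0] * e2[1] - e1[1] * e2[0]]
        for v in (a, b, c):
            for i in range(3):
                acc[v][i] += n[i]
    result = []
    for n in acc:
        length = math.sqrt(sum(c * c for c in n)) or 1.0
        result.append(tuple(c / length for c in n))
    return result

# one triangle lying in the XY plane -> every vertex normal is +Z
print(vertex_normals([(0, 0, 0), (1, 0, 0), (0, 1, 0)], [(0, 1, 2)]))
```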

 

Curiously enough, I started seeing that weird faceting earlier on a piece of geometry (a piece of firewood) while playing around with the preview shader in Copernicus. The baffling thing is that it was coming and going when I changed something that shouldn't have been related to it at all; then I hit an OpenCL exception and Houdini crashed, so I couldn't investigate further. Copernicus is great, but it's still a bit unstable, like you say. I'm curious whether you've tried a little "pre-displacement" of the points, to make the low-poly geometry conform as closely as possible to the detailed version before you apply the displacement, and whether you've tried adjusting the normals. In your pic it looks like you're using the height input alone, with no normal map; there may be a deeper technical reason there. Again, to speculate: it might be that we're displacing pixels across the surface, which changes the "true" normals of that surface, while the points keep their original normals, and those take precedence, resulting in this faceted look. I'm not sure about Houdini, but in game engines and DirectX 12, when the GPU executes pixel/fragment shaders it uses barycentric interpolation across the filled pixels of a surface.
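To illustrate that barycentric point: a per-pixel shading normal is a barycentric blend of the three vertex normals, renormalized, which is what makes shading look smooth across a geometrically flat face. If that blend isn't happening (or the vertex normals are stale after displacement), each face shades flat. A small Python sketch (hypothetical helper, not engine code):

```python
import math

def shading_normal(n0, n1, n2, w):
    """Blend three vertex normals with barycentric weights (w0, w1, w2)
    and renormalize -- roughly what the rasterizer does per covered pixel."""
    n = [w[0] * n0[i] + w[1] * n1[i] + w[2] * n2[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

# halfway along an edge whose end normals are tilted +/-45 degrees,
# the blended shading normal points straight up
s = math.sqrt(0.5)
n = shading_normal((0, s, -s), (0, s, s), (0, 0, 1), (0.5, 0.5, 0.0))
print(n)  # approximately (0.0, 1.0, 0.0)
```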
