Sustaxata Posted March 28, 2014

Hello everyone,

I am trying to apply some tangent-space normal maps to some models. I have exported the normals from both ZBrush and xNormal, and neither works. As far as I understand it, there are two types of normal maps, tangent space and object space; since the geometry will be deformed, I am using normal maps in tangent space.

I wanted to try the default mantra surface material first, to see how it works, but it doesn't work in the first place. It takes in a normal map and then runs it through the convertNormal subnetwork (posted below), yet some of the channels still seem flipped. I tried flipping other channels with a multiply node, but that simply does not work.

I have tried looking this up, and there are some pretty ancient threads. One of them also provides an example file, but it seems Houdini has changed a bit since then: http://www.sidefx.com/exchange/info.php?fileid=561&versionid=561

The VEX call vop_computeTangents($tanU, $tanV, normalize(N), $uv, 2); no longer expects the same inputs, and I haven't really worked with VEX code yet, so I am a bit stranded here.

Isn't there a fast way to set tangent-space normal maps up? I can't believe such a simple thing is so hard to set up. Do I need to flip certain channels when I export the normal map? Am I overlooking something, or am I just really stupid?

Any kind of help or information is welcome.

Thanks a lot,
Sustaxa
edward Posted March 29, 2014

Haven't tried this before but maybe it helps: http://www.sidefx.com/exchange/info.php?fileid=561&versionid=561
eetu Posted March 29, 2014

Ohh, I was just setting this up at work yesterday. The Exchange asset had some issues; you can use the new Compute Tangents VOP in place of the inline node, plus some vector type hints were set to 3D Vector instead of 3D Normal, messing things up. I'll see if I can share it on Monday, although I'm not 100% sure how standard the normal maps I was working with were.
eetu Posted March 29, 2014

And, by the way, the SideFX documentation on this is plain wrong: "Fit range from -1:1 to 0:1" should obviously be the other way round..
eetu Posted March 29, 2014

Oo-kay, this here looks like it's working; I simplified the vopnet a bit as well. The head texture is from Nvidia. I hope the axes etc. are the proper way, I don't have ZBrush here. If not, just do some more or less flipping.

using_normal_maps_ee.hip
Sustaxata Posted March 29, 2014

@edward: Yes, that's the link, which uses the function that changed in the newer Houdini versions.

@eetu: Yes, that's pretty much exactly what I'm looking for, you're a hero! The file you posted didn't work for some reason, so I had to create a new makeinstancexform and use the exact same values as you did. The only thing I'm surprised by is that Houdini does not provide a built-in solution for this, considering that normal maps are a pretty common thing to use.

Problem solved. Thanks a lot!
edward Posted March 31, 2014

eetu said:
And, by the way, the SideFX documentation on this is plain wrong: "Fit range from -1:1 to 0:1" should obviously be the other way round..

I wonder if what ZBrush outputs can depend on the various settings? If the ZBrush output is 0-1 values, then the fit from -1:1 to 0:1 makes sense to me, because we had subtracted 0.5 earlier?
eetu Posted March 31, 2014

edward said:
I wonder if what ZBrush outputs can depend on the various settings? If the ZBrush output is 0-1 values, then the fit from -1:1 to 0:1 makes sense to me, because we had subtracted 0.5 earlier?

Well, we're talking about the normal vector here, why would you clamp it to positive values? The "subtract 0.5 and normalize" does the same thing, so actually that whole step is probably extraneous..
Sustaxata Posted March 31, 2014

Sorry if that's really noobish of me, but wouldn't the subtraction of 0.5 on every channel distort the normals?
eetu Posted March 31, 2014

Sustaxata said:
Sorry if that's really noobish of me, but wouldn't the subtraction of 0.5 on every channel distort the normals?

I might be wrong as I do not have ZBrush here, but a step like that is needed if the normal map is in a non-float format, like that face map above. Normals live in the [-1..1] range, but integer file formats do not tend to store negative values. In that case the normals tend to be fitted from [-1..1] to [0..1] for saving into the image map, and then expanded from [0..1] back to [-1..1] on import into the shader. That can be done with a fit node, or by subtracting 0.5 and multiplying by two, or by normalizing instead of the multiply if you want to be sure the end result is normalized.

If the normals are brought in as .exr, then a step like that would of course distort the normals.
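In shader terms, that import step is roughly the following (a minimal sketch; nmap, s and t are assumed to be the map path and the shading uvs):

    // Channels arrive as 0..1 from an integer-format map; expand back to -1..1.
    vector raw = texture(nmap, s, t);
    vector nts = raw * 2.0 - 1.0;        // i.e. fit from [0..1] to [-1..1]
    nts = normalize(nts);                // or skip the *2 and use normalize(raw - 0.5)
    // A float .exr map already holds -1..1 values, so this expansion would be skipped.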
freaq Posted March 31, 2014

Basically, if it is an 8-bit map (256 values per channel) it is like this:

0-127: the value is negative
128: no change (effectively the 0 point)
129-255: the value is positive

If you are mapping from 0-1:

0-0.5: negative
0.5-1: positive

To map this to floating-point values for normals: (val*2)-1
If it is an 8-bit style index, to map it to floating-point values for normals: (val/128)-1
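Both mappings written out, just to show the arithmetic (the function names are only illustrative):

    // Map channel already converted to a 0..1 float (what texture() returns).
    float decode01(float val)
    {
        return val * 2.0 - 1.0;      // 0 -> -1, 0.5 -> 0, 1 -> 1
    }

    // Raw 8-bit index 0..255 (the "8-bit style index" above).
    float decode8bit(float idx)
    {
        return idx / 128.0 - 1.0;    // 0 -> -1, 128 -> 0, 255 -> ~0.992
    }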
eetu Posted April 1, 2014

I tried my scene (above) at work, and it did not work right; it seems like the normal map axes there are defined differently. So prepare mentally for some flipping, maybe.
freaq Posted April 3, 2014

Normal maps sometimes come "Maya" encoded or "Max" encoded; basically it means the Y and Z axes may be interchanged or flipped. Not sure if this is the case, but it might be worth trying.
acorreia Posted August 28, 2014

Sorry for digging this post up from the grave. I have used the method eetu created to render my maps created in ZBrush, but I have one small problem. I want to render out my images using PBR, but with eetu's file the result comes up black unless I use the micropolygon renderer. Any solution? I have been trying but no luck whatsoever.

Cheers,
A
dszs Posted August 29, 2014

Could it be that it does not have a BSDF output in the shader? PBR won't work without one. Correct me if I am wrong.
acorreia Posted August 29, 2014

Yes dszs, I believe I managed to solve it, just forgot to post it here. I created another lighting model and connected the yellow output (the BSDF, as you said) and voilà, it works! Thanks.
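For anyone who hits the same thing: micropolygon rendering shades the colour output, while PBR evaluates the bsdf the shader exports, so something has to be wired into that output. A minimal VEX-level sketch of the idea (a plain diffuse lobe, just for illustration):

    surface pbr_bsdf_sketch()
    {
        // PBR shades the exported bsdf F; if nothing is connected to it,
        // the render comes up black even though micropolygon looks fine.
        vector nn = normalize(N);   // in the normal-map setup this would be
                                    // the already perturbed shading normal
        F = diffuse(nn);
    }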
el_diablo Posted October 4, 2014

I wonder if this is still problematic in the recent builds, or am I doing something wrong? Without eetu's math I get some problems on 8-bit normal maps that look like clamping of values, even if I choose the -1,1 option on the normal map settings in the mantra surface. With the fix it works out ok.
el_diablo Posted October 12, 2014

Anyone know what's the expected output of "Make Instance Transform" with N, v and up connected as in eetu's normal map example? I can't seem to find any extended node documentation as to how the 4x4 matrix output is calculated from the numerous inputs. I must be missing something simple regarding the operation of that node. I understand how transform matrices are usually composed, but would love some info on that specific VOP node.
eetu Posted October 12, 2014

Have you checked the illuminating example image at the bottom of http://www.sidefx.com/docs/houdini13.0/copy/instanceattrs ? (From wolfwood, iirc.)
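Whatever the node builds internally, the net effect needed for the normal map case is just expressing the decoded map value in the frame spanned by the tangents and N. A rough VEX sketch of that equivalent math (the variable names are assumptions, not taken from the hip file):

    // nts: decoded tangent-space normal in -1..1
    // tu, tv: surface tangents along u and v (e.g. from the Compute Tangents VOP)
    // nn: the geometric normal
    vector to_world(vector nts; vector tu; vector tv; vector nn)
    {
        // Each component of the map scales one axis of the tangent frame.
        return normalize(nts.x * tu + nts.y * tv + nts.z * nn);
    }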