xxyxxy

Members
  • Content count

    25
  • Donations

    0.00 CAD 
  • Joined

  • Last visited

  • Days Won

    1

xxyxxy last won the day on October 13 2018

xxyxxy had the most liked content!

Community Reputation

7 Neutral

About xxyxxy

  • Rank
    Peon

Personal Information

  • Name
    xxyxxy
  1. Procedural textures workflow

    Check out "Texturing and Modeling: A Procedural Approach" by Ebert! Also, any book full of tutorials on natural materials should be helpful; I like "Blender Cycles: Materials and Textures Cookbook" by Valenza. And don't forget non-CG books about implying pattern (same problem: you are conveying the sense of a pattern without building the thing out of the pattern), like Guptill's "Rendering in Pen and Ink" and Scott Robertson's "How to Render". Both have chapters on specific materials.
  2. Material Assignment best practices

    This video is for Katana, but I'm pretty sure everything they do here can be accomplished with Material Stylesheets... a nice pipeline for lots of buildings with overrides.
  3. This is a rookie question, but here goes: I have been making a lot of math-ey shapes with VEX Volume Procedurals (and before that, Volume Wrangles), but I am realizing that a lot of my experiments aren't using lighting at all... so I want to try doing the raymarching of the volume in the shader itself and simply render it onto a plane. I have two questions about this.

    First, when I render, how should I fit my camera to the plane? Ideally I would be writing into the near clipping plane directly, but I can't figure out how to set that up procedurally rather than just monkeying around in the viewport.

    Second, in terms of aliasing, the only thing that should matter here is primary rays... correct? I have more experience with GPU shadertoy life, where the AA needs to be done in the shader itself, since there is only one "sample" that plops the fragment on the screen. Thanks!

    P.S. If anybody has an example setup or video of this, I would be grateful. I have seen some 2D fractal art which I assume was written in this fashion, but no HIP files.
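On the camera-fitting question, the framing has a closed-form answer from the usual pinhole relationship between film aperture, focal length, and field of view. A minimal sketch of the math (the parameter names mirror Houdini's camera `focal`/`aperture` parameters, but actually applying the value via the `hou` module is left out):

```python
import math

def focal_to_frame_plane(plane_width, distance, aperture=41.4214):
    """Return the focal length (same units as aperture, mm) that makes a
    camera at `distance` exactly frame a plane of width `plane_width`.

    Derivation: horizontal FOV = 2*atan(aperture / (2*focal)); to frame the
    plane we need FOV = 2*atan(plane_width / (2*distance)); equating the
    two arguments gives focal = aperture * distance / plane_width.
    """
    return aperture * distance / plane_width

# Sanity check: the default 41.4214 mm aperture at 50 mm focal is a 45-degree
# horizontal FOV, which exactly frames a 2-unit plane at 1/tan(22.5 deg) units.
focal = focal_to_frame_plane(2.0, 1.0 / math.tan(math.radians(22.5)))
```

Point the camera straight at the plane's center and use this focal length, and the plane fills the frame edge to edge.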
  4. Oh, this is great! Thanks for the pointer to the source, and for the BSDF clarification. I always thought "map" was the right term, since people sometimes use those MERL textures to do (angle ==> [0..1]) lookups instead of an analytic function... but now I realize that no matter how you get your (angle ==> [0..1]) value, it is still called a BSDF.
  5. Thanks so much for the answer. I have a couple of clarifying questions, if you don't mind.

    I took a look at the generated VEX code for both the pbrdiffuse VOP and the layermix VOP to try to understand, but all I saw was a bunch of exports and a call to layer_composite_a_over_b whose implementation I couldn't see. Where does this weight (alpha) come from? Is it based on Fresnel, in the sense of just the incoming light angle, or is it based on the specular layer's "reflectivity" setting?

    Also, for the PBR Non-Metallic VOP I do see the controls for energy conservation... does energy conservation only happen when that box is ticked?

    Lastly, I apologize if this is trivial, but I understand BSDFs to be maps from a solid angle to a [0..1] value. Does that mean adding two BSDFs as described here is simply adding their map values?

    "You can add bsdf values together and scale them by vectors or floats. Multiplying a BSDF by a color vector makes the surface reflect that color." http://www.sidefx.com/docs/houdini/render/understanding.html
  6. I'm learning about material layering in Mantra, and up until now I have always used the layermix VOP. This video shows another way of combining materials: chaining one node's "layer" output into another node's "base" input (at 33:22). Are these methods equivalent?

    The documentation has got me a bit confused. It makes it sound like the layermix VOP just averages the inputs and the BSDFs, whereas the chaining method also performs some sort of energy conservation:

    "The nodes take care of the physical aspects of combining the looks (fresnel components, energy conservation) automatically."

    Does that mean the layermix VOP does NOT perform these conservations? What exactly is happening in each of these two cases? The fact that the video advises a specific order in which to chain materials has gotten me especially confused.

    I am somewhat familiar with layering techniques in general (averaging output pixel color in the realtime world, performing inter-layer scattering in the offline world), but I have no idea what Mantra does in either of these two cases. Does anybody have insight into the layermix VOP vs chaining layers into base inputs?
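This is not Mantra's actual implementation (the layer_composite_a_over_b source isn't visible, as noted above), but the generic distinction the last two posts circle around can be sketched with scalars: naively *adding* a specular lobe and a diffuse lobe can reflect more energy than arrived at grazing angles, while an "over"-style chain attenuates the base by whatever the top layer already reflected. All function names here are illustrative:

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance for the specular layer."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def naive_add(diffuse_albedo, spec_reflectance):
    """Just summing lobes: total can exceed 1 at grazing angles (non-physical)."""
    return diffuse_albedo + spec_reflectance

def conserving_over(diffuse_albedo, spec_reflectance):
    """Energy-conserving chaining: the diffuse base only receives the
    energy the specular layer on top did not already reflect."""
    return spec_reflectance + diffuse_albedo * (1.0 - spec_reflectance)

# At a grazing angle (cos ~ 0.1) a dielectric's Fresnel term gets large,
# so a bright diffuse base pushes the naive sum past 1.
f = schlick_fresnel(0.1, f0=0.04)
```

With a 0.9 diffuse albedo, `naive_add` exceeds 1 while `conserving_over` stays below it, which is presumably the kind of bookkeeping the docs mean by "energy conservation" in the chained setup.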
  7. Ah, that's a good idea! I find translating tutorials to be pretty great; trying to make this in /mat currently:
  8. Hey gang, I am confused about best practices for exporting entire scenes from Houdini.

    I noticed that if I use File > Export > Alembic to export my entire scene, all of my top-level nodes that are just an Alembic import SOP come back in fine... although their groups are renamed to something like "alembic_1_1". Whereas all of my top-level nodes that consist of an Alembic import SOP plus a convert SOP do not come back in fine: they are listed in the scene graph but are not drawn in the viewer. However, they do retain their original group names, like "DirtyMetal".

    How can I get the best of both worlds? I want to both see my geometry in the viewer and keep the original group names. The only clue I have is that in the former case, middle-mouse-clicking tells me I have "Alembic Packed Geometry". How do I convert from polygons to "Alembic Packed Geometry"? I don't see that option in the convert SOP.

    Alternatively, is this a dumb workflow? Is there a more feature-rich way to do this than File > Export > Alembic? Part of me wants to give up and install the USD plugin for the Houdini-to-Katana transfer, but I'd really like to figure out what I am doing wrong.
  9. Did you ever find anything good? I am also looking to sharpen up on hair and skin shading in Mantra, but most published tutorials are hard-surface or pyro shading only.
  10. Megascan animated vegetation

    About halfway through this talk, the artist shows a way to paint wind weights on his vegetation... I think it just samples a time-varying noise field at runtime:
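A hedged sketch of what that runtime sampling might look like: each vertex carries a painted weight in [0, 1] (zero at the trunk, one at leaf tips), and its displacement is that weight times a time-varying field lookup. The "noise" here is a cheap sum of out-of-phase sines standing in for whatever noise the talk actually uses:

```python
import math

def wind_offset(pos, t, weight, amplitude=0.3, frequency=1.5, speed=2.0):
    """Displace a vertex by a painted wind weight times a time-varying field.

    pos    -- (x, y, z) rest position of the vertex
    t      -- time in seconds
    weight -- painted wind weight in [0, 1] (0 = trunk, 1 = leaf tips)
    """
    x, y, z = pos
    # Two out-of-phase sines as a stand-in for a real noise field.
    gust = (math.sin(frequency * x + speed * t)
            + 0.5 * math.sin(2.3 * frequency * z + 1.7 * speed * t))
    dx = weight * amplitude * gust
    return (x + dx, y, z)
```

Because the weight scales the whole displacement, painted-black trunk vertices stay put while the tips sway, which matches the behavior shown in the talk.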
  11. What is the ODFORCE of lighting?

    Thanks, I'll check that out! I guess lighting isn't as technical as Houdini, so maybe there's no need for dedicated communities?
  12. Hey gang, I'm starting to light more and more and was wondering if anybody had forum recs... I haven't found anything CG-centric that I like yet, although some of the advice posts on cinematography.com are fun to read.
  13. Workflow for creating terrain

    I am a terrain noob, but besides triplanar projection (or PxrRoundCube in RenderMan), you can also look into writing some custom code to break up texture repetition, as in this IQ article, or detail textures, as in this David Rosen article:

    http://www.iquilezles.org/www/articles/texturerepetition/texturerepetition.htm
    https://www.gamasutra.com/blogs/DavidRosen/20091225/86222/Detail_Textures.php

    Note these techniques are both aimed at real-time, so they might not give you the fidelity you are looking for.
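The core idea in the repetition-breakup approach is to give each tile of the texture its own random lookup offset, blending between neighboring tiles' offsets near the seams. A 1D Python sketch of the 2D technique from the article (the hash is illustrative, not IQ's exact one):

```python
import math

def hash1(n):
    """Cheap deterministic per-tile pseudo-random value in [0, 1)."""
    return (math.sin(n * 127.1) * 43758.5453) % 1.0

def untiled_sample(texture, u, tile=1.0):
    """Sample a periodic `texture(u)` with a per-tile random offset so the
    repetition is far less visible; blend two offset lookups near seams."""
    k = math.floor(u / tile)      # index of the tile we are in
    f = (u / tile) - k            # position within the tile, [0, 1)
    off_a = hash1(k)              # this tile's random offset
    off_b = hash1(k + 1)          # next tile's random offset
    # Ramp from off_a to off_b over the last 20% of the tile to hide the seam.
    w = min(1.0, max(0.0, (f - 0.8) / 0.2))
    return (1.0 - w) * texture(u + off_a) + w * texture(u + off_b)
```

Per-pixel this costs two texture fetches instead of one, which is why it shows up mostly in real-time contexts; for terrain in Mantra the same trick could be done in a VEX shader.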
  14. Specify light color in Kelvin?

    Awesome, thank you!!!
  15. Is there a way to feed in, say, 5000K as my light color in the Color Editor? I see there is a temperature slider in the TMI section, but I don't understand how its [-1, 1] range maps to the normal black-body gradient; specifically, at its lowest it seems to be a bright yellow instead of a deep red. Also (off-topic): is there a quick way to get a light to "look at" an object? I see there is a constraint I can add, but I just want to do it quickly and interactively in the viewport.
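This isn't how Houdini's TMI slider works internally, but a common way to turn Kelvin into an RGB light color is Tanner Helland's published curve fit to the black-body locus, which you could evaluate externally and paste into the color fields:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate sRGB color (0-255 per channel) for a black-body color
    temperature in Kelvin, using Tanner Helland's curve fit.
    Reasonable between roughly 1000K and 40000K."""
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: max(0.0, min(255.0, v))
    return tuple(round(clamp(c)) for c in (r, g, b))
```

Divide each channel by 255 for the 0..1 values a Houdini color parameter expects; 2000K comes out deep orange and 10000K comes out blue-white, matching the usual black-body gradient.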