Showing results for tags 'shading'.


Found 76 results

  1. I saw this video where you can turn geometry into displacement. https://www.youtube.com/watch?v=rwzo8LD4Tac I want to know how I can turn modeled geometry into a displacement shader. I'd like to turn the modeled geometry on the right into parameters for a displacement shader for the plane on the left shown in the picture.
  2. Hi everyone, I have been working on some renders of particles that have a material with 0 opacity assigned to them. The problem comes when I try to also render out a Deep Camera Map and bring it into Nuke. Using the usual deep-image workflow, I can only see the particles once I use the DeepToPoints node in Nuke - before that I don't get an alpha, nor can I see them. Is there any way I can export a deep map so I can make holdouts with this? Thank you!
  3. Shading Techniques III

    Check out the course at: https://www.cgforge.com/course?courseid=shading-techniques-iii Thanks for watching!
  4. Hi everyone, I did a pyro sim along with a collision object. When I raise the density from 1 to 4 in the pyro shader, the voxels become visible (I've attached an image to the post showing the issue). Is there any way to prevent that, or to soften the voxels on the outside in a post-process or something? Cheers, Christian
  5. Hi people, Disclaimer: I'm pretty new to Houdini and I'm probably using it in a far more basic way than most people. I really like the basic Color SOP node, using it to apply a color to geometry. However, if I want to apply a texture map to geometry, I've been using the UVQuickShade SOP. My question is: if I create two cubes, apply the color 1,0,0 to one with a Color SOP, and apply a UVQuickShade using an image of the constant color 1,0,0 to the other, why do the two cubes look different in the viewport and in renders? Render Hip
  6. Shading Techniques I

    Hello Odforce! I just released a new course - Shading Techniques I - and this one is all about improving your shading & texturing skills by exploring a variety of techniques. Part I mainly revolves around tri-planar workflows, and you'll also get an awesome wizard tower scene to practice with. Check it out at cgforge.com
  7. Shading Theory with Karma

    I'm happy to announce that I've teamed up with SideFX to release a new free course! - Shading Theory with Karma. These videos are designed to teach you the fundamental ideas behind shading and texturing while utilizing the Principled Shader, Karma, and my dear friend - Shaderbot. Visit CG Forge to download Shaderbot and get access to the videos for free. https://www.cgforge.com/course?courseid=shadingtheory Have a nice day,
  8. Hello, my name is kalata and I am an experienced Houdini Artist. I also have solid experience in 3Ds Max, Nuke, After Effects and Photoshop. You can check my demo reel here: https://www.behance.net/gallery/96292145/Procedural-Apartment-Blocks https://www.behance.net/gallery/93554753/Procedural-Neighbourhood-Generator
  9. Hi, I'm trying to offset a texture with a for loop in a shader. In the example file you can see that I iterate 5 times and offset the texture 0.1 units with each iteration. Is it possible to combine all of the iteration steps and show the result in the rendered image? Like in the attached picture, where I painted in the desired result in red. Kind regards, Jon forLoop_VexBuilder.hip
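    The loop described above - shifting the lookup 0.1 units per iteration and combining the results - amounts to averaging several offset texture samples. A minimal Python sketch of the idea (sample_texture is a hypothetical stand-in for the shader's texture() lookup; all names are illustrative, not from the original file):

```python
def sample_texture(u, v):
    # Hypothetical stand-in for a shader texture() call:
    # a trivial procedural pattern so the sketch is self-contained.
    return (u % 1.0, v % 1.0, 0.0)

def accumulated_lookup(u, v, iterations=5, offset=0.1):
    # Average the texture over several U offsets, mirroring a shader
    # for-loop that shifts the lookup by `offset` each iteration.
    r = g = b = 0.0
    for i in range(iterations):
        cr, cg, cb = sample_texture(u + i * offset, v)
        r, g, b = r + cr, g + cg, b + cb
    n = float(iterations)
    return (r / n, g / n, b / n)

print(accumulated_lookup(0.0, 0.5))  # red channel averages the five offsets
```

    In a VOP network the equivalent would be summing the samples inside the for loop and dividing by the iteration count before writing the output color, rather than overwriting it each pass.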
  10. Hi guys, is it possible to go ROP > COP > ROP? I have a displacement from a COP {noise} baked to a texture, and I would like to shade the color based on P.y, but after displacement. Is there a way? I'm basically using the same noise for Cd as for the displacement, just shifted into some colors with a ramp. The only way I found is "ramp from attribute" {from the displacement Cd} and then taking it back into COPs and writing out textures. But the color is point color, so I have to subdivide the mesh far too high. [using Arnold] B.
  11. Hello, Here is a short video showing how volumes can be used to shade and texture semi-translucent organic things: Volumes For Organic Assets. This approach makes it possible to achieve a close-up, realistic organic look for semi-translucent assets where SSS or colored refraction is not enough. An object with UVs is converted to a signed distance volume with a UVW grid. The volume is then used to set density and perform the UV lookup at render time. This way density can be adjusted by depth, i.e. not so dense near the surface and very dense at the core; or not very dense, then a dense patch like an island of fat, then not dense again. The UVW grid is used to texture the volume. Different textures can be used at different depths, which is a very flexible yet powerful way to place any texture at a given depth: texture X at depth Y, e.g. big veins 5 mm under the surface. This approach is best for close-up organic hero renders where SSS or refraction don't look good enough. Attaching an example file: volumetric_textures_example.hip
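    The density-by-depth part of this setup can be sketched outside Houdini as a simple ramp over the signed distance. A minimal Python sketch (the parameter names and values are illustrative, not from the original file):

```python
def density_from_depth(sdf, surface_density=0.2, core_density=1.0, core_depth=0.05):
    # The SDF is negative inside the surface; map depth below the surface
    # to a density ramp: thin near the skin, dense toward the core.
    if sdf >= 0.0:
        return 0.0                       # outside the object: empty space
    depth = min(-sdf / core_depth, 1.0)  # 0 at the surface, 1 at/past core_depth
    return surface_density + depth * (core_density - surface_density)

print(density_from_depth(0.01))    # outside the surface: 0.0
print(density_from_depth(-0.025))  # halfway to the core: ~0.6
print(density_from_depth(-0.2))    # deep inside: clamps to core density
```

    In the actual setup the same lookup happens per sample in the volume shader, with the UVW grid supplying texture coordinates at each depth.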
  12. Material Builder to Layer

    Hey guys, I'm trying to get a custom material to export a layer properly so I can combine it with other shaders. I've attached the .hip file below. This is the material I built. As a simple example, it works as intended; however, there's no layer yet. So I dove into the principle_core shader and was able to (unsuccessfully) create a pbrdiffuse node, plug that into the layerpack, and then export the layer for mixing. But unfortunately, when I assign the layermix to the object, I lose the ramp - it seems to take just the first input of the ramp. It's probably very simple, but this is driving me crazy. Any ideas? material_help.hipnc cw
  13. World Space Position?

    Hi everyone, I've got a problem that has me stumped. In my setup I'm flying a camera through a volume whose density is modulated by a couple of noises in the mat context. But whatever I do, I'm unable to make the noises stick to their position; it seems the P attribute from the Globals node is always in camera space. I must be missing something obvious, since a lengthy internet search didn't give me any meaningful results. I've attached my file; to observe the problem, just render a frame and then jump to another frame - the camera will have moved, but the noise pattern is in exactly the same place. Thanks in advance Paul world_space_noise.hip
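    When P arrives in camera space, a common remedy is to transform it into world space before the noise lookup (in VOPs, a Transform node going from the current space to world space). Conceptually that is just a matrix transform of the shading position; the 4x4 matrix below is an illustrative stand-in for the renderer's actual camera-to-world transform, not taken from the attached file:

```python
def mat_point_mul(m, p):
    # Apply a 4x4 row-major transform to a position (implicit w = 1).
    x, y, z = p
    return tuple(m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(3))

# Toy camera sitting at (0, 0, 5) looking down -Z:
# camera-to-world simply adds the camera offset back.
cam_to_world = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 5.0],
    [0.0, 0.0, 0.0, 1.0],
]

p_camera = (1.0, 2.0, -5.0)                 # shading position in camera space
p_world = mat_point_mul(cam_to_world, p_camera)
print(p_world)                              # (1.0, 2.0, 0.0)
```

    Feeding the world-space position into the noise keeps the pattern pinned to the volume no matter where the camera flies.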
  14. Hi all, I am working on a project that has the pages of a book flipping with a camera on top; the audience will be able to see the content on the pages. I have already created the page-flipping animation. The problem is assigning 10 different/unique page textures, content-wise, to the 10 pages of the animated book. What is the best way to do that? I am using Copy and Transform to define the number of pages. Any pointers would be helpful. Thanks in advance.
  15. Check this out, guys: the bumps on top of the tentacle are done with displacement - in fact, the same displacement map across the whole tentacle. How can it be that the amount of displacement seems to be inversely proportional to the thickness of the geometry? I would expect the displacement to be uniform, regardless of the geometry it sits on. Can anybody make sense of this for me, please?
  16. Hi, is there a way to access a texture from Agent materials? opdef:../..?clothing_`padzero(3,ch("../../texture"))`.jpg I just need to edit it slightly... Thanks BK
  17. Hey! Could someone help me understand this? I have noise going into the vector displacement of a Principled Shader. Why do I have to change the amplitude of the noise in the z-axis to affect the y (up/down) position? If I do the same at SOP level, I can just adjust the amplitude in the y value. I tried messing around with the Transform VOP, but I didn't get anything working. Thanks displace.hiplc
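    One plausible explanation (an assumption - the attached file isn't examined here): vector displacement is often interpreted in tangent space, where x runs along the tangent, y along the bitangent, and z along the surface normal. On an upward-facing plane the normal is world-space Y, so the noise's z amplitude ends up controlling the up/down motion. A small Python sketch of that basis change (all vectors illustrative):

```python
def tangent_to_world(v, tangent, bitangent, normal):
    # Re-express a tangent-space displacement vector in world space:
    # x along the tangent, y along the bitangent, z along the normal.
    return tuple(v[0]*tangent[i] + v[1]*bitangent[i] + v[2]*normal[i]
                 for i in range(3))

# Flat ground plane: the surface normal points along world +Y.
d = tangent_to_world((0.0, 0.0, 1.0),       # displace 1 unit along "z"
                     (1.0, 0.0, 0.0),       # tangent   -> world X
                     (0.0, 0.0, 1.0),       # bitangent -> world Z
                     (0.0, 1.0, 0.0))       # normal    -> world Y
print(d)  # (0.0, 1.0, 0.0): tangent-space z moves the plane up in world y
```

    At SOP level, by contrast, the displacement is applied directly in object/world axes, which would explain why y behaves as expected there.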
  18. I'm looking for a way to apply a flat, fast shader to particles and also be able to control size, color and opacity over life, as you can with the Trapcode Particular After Effects plugin or X-Particles. Thx!
  19. Hello! So recently I saw two new videos produced by FORTICHE, the same company that made the League of Legends video for Imagine Dragons' "Warriors" a few years back. While that video was impressive, these two newer ones are outstanding. I am sure it takes a lot of money, time, and work to produce anything in this style, but I just adore it and would like to create something similar one day. Does anyone have any information, theories, or even tutorials on how they achieved the look? (I know they mix a lot of 2D elements with the 3D, but I'm curious about the 3D part.) I also have some questions about hair. I have a basic understanding of how hair works with guide curves and such. What I don't understand is how they make the hair so unique and held up, yet still rendered as individual strands. Also, for the K-pop video, how do they make the hair look so flatly shaded over certain sections while still being individual strands? Here are some good frames that showcase mostly 2D-esque 3D. They also have a really nice flat-shading effect on the hay which I am really curious about. The main shine on this looks hand-drawn, but the rest of the jewel looks half painted and half actually shaded. This jacket looks really 2D as well; I'm curious how they got the simulated lighting on it to look so drawn. Here is where we start talking about the hair. The hair in this frame is very artistically styled. It may be a basic question, but is that just achieved through guide hairs? How are the guide hairs made to not fall due to gravity but still simulate like actual hair? It doesn't just fall down, but acts as if it is constantly being pulled back into a certain hairstyle - are volumes used? These two shots exemplify what I mean by the hair looking like it is rendered in layers: the hair is made up of individual strands, but it's as though when they are grouped together they are rendered as one entity. Right under the thing southwest of her ear you can see the shadows affecting the individual hairs, but everything looks so held together - there are very few stray hairs. Here is the hair in motion. Sorry this post is long, but if someone could explain how companies get hair beyond just an undercut or a simple part, that would be great. y2mate.com_-_kda_popstars_ft_madison_beer_gi_dle_jaira_burns_official_music_video_league_of_legends_UOxkGD8qRB4_1080p.mp4
  20. Hello people! Please tell me how to build material networks correctly. For example, I have an object with different components and different materials, and I need to apply one mask (such as dirt) across all the materials. I can create the mask at the top level and promote parameters for all the materials to connect it. That seems to work, and I can export the general mask as an image component. But all the other export components that were created at the individual-material level no longer work. Please tell me how to build such a connection properly. Thanks! matnet_v01.hipnc
  21. Houdini For The New Artist

    Download the Course Files & Watch for Free at CG Circuit https://www.cgcircuit.com/tutorial/houdini-for-the-new-artist Hello ODFORCE! I'm very excited to share my first Houdini course, titled Houdini For the New Artist. It is perfect for anyone interested in learning Houdini for the first time. To keep things interesting, we learn the basics while building "Alfred the Rhino" from scratch. If you're looking for an intro tutorial that gives you a bit of everything, is fun to work with, and gets straight to the point - this is for you. Be sure to check out the full course, download the course files, and practice along. Thank you for watching! Course Outline:
      • Intro - 42s
      • The Interface - 12m 26s
      • Setting up Our Project - 12m 53s
      • Utilizing Attributes - 10m 47s
      • Caching - 8m 23s
      • Applying Materials - 9m 55s
      • Adding the Backdrop - 6m 41s
      • Basic Shading Parameters - 5m 36s
      • Lighting - 9m 30s
      • Rendering - 12m 29s
  22. Introduction to FX using Houdini - workshop

    Hey, what's up guys! Only 1 week left to sign up for my fast-track Houdini workshop "Introduction to FX using Houdini". Limited student capacity! Over the course of 9 weeks and 33+ hours of tutorial content, students learn how to do procedural modeling and particle, RBD, pyro, FLIP and whitewater simulations. On top of that, we're going to talk about lighting, shading and rendering, and we're going to dive deep into VEX scripting. I would love to answer any questions that you might have!!! Cheers, Manu Find the complete course syllabus here: https://www.cgmasteracademy.com/courses/16-intro-to-fx-using-houdini Also, for more detailed information on the workshop, check out an interview that was done on 80.lv recently: https://80.lv/articles/making-first-steps-in-vfx-with-procedural-tools/ Here's a quick teaser:
  23. Hi guys, I'm attempting to get the thickness of an object and use it to blend between two PBR layers - in this case, a sharp refractive layer into an SSS-y one. I could sample an SDF volume from disk and read the thickness from there, but I'm wondering whether there is a way to do this at render time - for example, catching the positions of a ray entering and exiting a surface with lots of samples. Any ideas? An example scene would be much appreciated! Thanks.
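    The entry/exit idea above can be sketched with a simple analytic case: for a sphere, the thickness along a ray is the gap between its two intersection distances. The geometry here is an illustrative stand-in; in a renderer the hits would come from ray tracing the actual surface:

```python
import math

def sphere_thickness(origin, direction, center=(0.0, 0.0, 0.0), radius=1.0):
    # Distance a normalized ray travels inside a sphere: exit hit minus
    # entry hit, i.e. the thickness seen along that ray.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c            # quadratic discriminant (a = 1)
    if disc <= 0.0:
        return 0.0                    # the ray misses (or grazes) the sphere
    t_enter = (-b - math.sqrt(disc)) / 2.0
    t_exit = (-b + math.sqrt(disc)) / 2.0
    return t_exit - t_enter

# A ray straight through the centre of a unit sphere sees thickness 2.
print(sphere_thickness((0.0, 0.0, -5.0), (0.0, 0.0, 1.0)))  # 2.0
```

    Mapping such a thickness through a ramp could then drive the blend weight between the refractive and SSS layers.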
  24. Hey, I have been trying to create groups from an FBX subnet that can be accessed after the crowd simulation is done, so I can apply different base colors through a Material Stylesheet. I am also curious whether I can export different groups of objects to other 3D packages if I have to render there. What I want to do is create primitive groups that can be accessed after unpacking the Agent SOP, so I can use that data for further processing. I somehow managed to make groups based on the shaders applied to particular objects, but some of the shaders were not included in the groups, and I would like a proper workaround for this. Or is there a particular method to make the groups while exporting the FBX from Maya? Thank you all.
  25. cops to material

    I am trying to edit a texture using a cop2net or img network. I can do the edit of the image all right, but when I try to refer to that image in my Principled Shader, it won't work. Shouldn't I be able to input a relative path in the bump map slot of the Principled Shader, like ../../cop2net1/Bump_map/OUT/, or in the case of an img network, /img/Bump_map/OUT/? I have not used COPs before, so maybe I am doing something wrong. Do I need to force it to export into another context? Any help is appreciated. Cheers