Search the Community

Showing results for tags 'Rendering'.
Found 203 results

  1. I am trying to create watermelon juice splashing out of a watermelon. I'm kinda struggling with the shader: I'm playing around with IOR and reflections but failing to get the translucent look that juice has. Any help is appreciated. Thank you!
  2. Rendering thin lines in Mantra

    Hi all, recently I have been trying to understand a bit more about rendering, and I was trying to achieve something similar to the attached picture with Mantra. My approach from past experience would be running a POP sim and drawing lines out of the particles to create straight trails, as the picture shows. I would then use Volume Rasterize Attributes on an attribute like curveu, which is ramped to control the intensity along the curve. That essentially creates a volume to render in Mantra, and I would use Micropolygon rendering to render the volume slightly faster than PBR. I would use a Pyro shader and wire the rasterized attribute into the temperature and density to give it some intensity and color. What I can't seem to achieve are the thin lines shown in the photo; I end up with very thick individual lines in the volume regardless of how low I push the pscale or how high the point count is. I assumed I needed more subdivisions to prevent the volume from stepping, which solved the issue to some degree but took a toll by making the scene a lot heavier, and I still can't get thin lines like in the preview. Can someone explain how they would use POPs and lines to create a result like the picture shows, or just confirm that it is indeed necessary to push the point count quite high and reduce the pscale in Volume Rasterize as low as my hardware allows? Thanks, C
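
    A minimal sketch of the ramping step described above, as a point wrangle on the trail points just before the Volume Rasterize Attributes SOP (the ramp parameter name and the pscale value are assumptions, not something from the original post):

        // Point wrangle over the trail points, before Volume Rasterize Attributes.
        // "intensity_ramp" is a hypothetical ramp parameter promoted on this wrangle.
        float fade = chramp("intensity_ramp", f@curveu);  // fade along each trail
        f@density = fade;            // rasterized into the volume's density field
        f@temperature = fade;        // drives intensity/color in the Pyro shader
        f@pscale = 0.002;            // small stamp radius to keep the strokes thin (a guess)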
  3. Chanel Series

    New Project https://www.behance.net/gallery/104198553/THE-CHANELSERIES
  4. Nekiya

    Hello, I'd like to share a short film that I've been working on for the last few months. It is set up and rendered entirely in Houdini; whilst some 3d models are drawn in other bits of software, the majority is procedurally drawn and the outliers also tend to undergo transformations. Strictly no 2d effects, everything is simulated spatially within the scenes (this is part of an ongoing practice ethos of trying to understand all geometry in 3d and render a fully composited shot). I am very excited to continue to develop and unpack a lot of the systems that have been built for this project. It would be great to get some feedback, thoughts and criticisms from this community. I hope you enjoy. LINK: https://vimeo.com/446918887 Cheers, Egmontas Description: Nekiya is a short film that constructs a spirit realm as a reflection on the naive hypothesis for the existence of ghosts. Who has seen the spirits? Neither I, nor you. But when the invisible weighs upon your chest, the spirits are reaching through. Let the spirits emerge and let Spookiness be true. As Spookiness has an overwhelming desire to meet Soba, she anxiously struggles to determine her place in this delicate world full of occurrences at the edge of her perception and control. Soba begins to doubt the verity of not only her constructed memories, her tears, but the seemingly indeterminate nature of her body itself. Spirits emerge as three-dimensional entities with embedded spatial, material and behavioural data that characterises their potential and inherent constructional capacity within what we can now determine as our multiple realities. These spirits are a consequence of procedural logics that inform both simulation and translation in a digitally pliable universe. Nekiya is the hypothesis of a haunted space of spectrality; it is an episode in the search for a practice of digital poetics.
  5. Does anyone have settings for a Principled Shader (or a DA or something you can share) to make amber glass such as in this photo?
  6. Hi, I am trying to follow Ari's tutorial https://vimeo.com/323291527 for generating a side-by-side montage of some wedged Pyro sims that I'm trying to test out. My TOP network seems to work fine except for the final node, ffmpegencodevideo. I am not really sure how to set the correct path on Windows, as Ari's version is for Mac, and I have tried removing the C:, removing the .exe, and using both forward and backslashes. Do I even need to set the environment file to point at FFmpeg? I'm quite confused about how to set this up correctly.
  7. Redshift Z depth

    Hi, I'm trying to render Z depth for smoke in Houdini via Redshift. It looks like it's working fine with objects, but somehow not with Pyro smoke. It seems like Redshift doesn't "see" the volume; any idea why? Thank you
  8. Hi all, I'm not a lighting or shading person. I'm barely a Houdini person, if truth be told, but I struggle on... I am having a terrible time trying to figure out how to assign random textures to packed geometry in Solaris/LOPs/USD/Karma (referred to as SULK from now on, OK?). I have a crowd scene with some flag-waving agents. I want the flags to have a material on them where I can choose a random texture from a set of three textures. With me so far? Excellent. How? I've tried the 'editMaterial' node, I've tried the 'materialVariation' node, and I've even looked at 'variations', but that's a total mess in there - I'm not going anywhere near that! The flag geometry is packed and has an attribute called 'flagTex', which holds a random int value from 1 to 3. When I import the geometry into SULK, that attribute becomes primvar@flagTex. So far so mediocre. Now the problem: how do I use that attribute to choose a texture file on the shader? I can do this in Mantra, so I would expect it can be done in SULK, but how? I expected SULK to relieve some of the pain of StyleSheets, but in truth it is a lot worse. Please, someone, help me out here before I throw in the towel and ditch SULK. Many thanks. Dan
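
    For reference, the Mantra-side version of this (which the post above says already works) boils down to something like the VEX surface shader sketched below; the texture filenames and folder are hypothetical, and the integer parameter is assumed to be overridden per packed prim by the flagTex geometry attribute:

        // VEX surface shader sketch (Mantra-style). The parameter flagTex gets
        // overridden per packed prim by the "flagTex" geometry attribute;
        // the file naming scheme below is made up.
        surface flag_surface(int flagTex = 1)
        {
            string texpath = sprintf("textures/flag_%d.rat", flagTex);
            vector basecolor = texture(texpath, s, t);
            Cf = basecolor;
        }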
  9. I am trying to assign either .tiff or .rstexbin files to a character that I am using for RBD. When I assign the textures to the character, they look flat even though I assigned the displacement map. The character seems to have at least 4 UV sets per texture. Is there an easy way to create all of the groups needed, since the character has at least 12 different objects that were merged together before being packed? I unpacked the geo and could see the UV sets all layered on top of one another. I have 25 different texture maps and I am having to select the polygons individually and then group them so I can use a Material SOP to assign a material to each group. If there is an easier way to go about this, please let me know. I have attached a URL for the textures and the hip file. https://drive.google.com/uc?id=1mjLxLb2AWJcL2qAZTXe0fIVlP860Ku78&export=download DinoCharacterHelp.v01.hipnc
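
    The kind of shortcut being hoped for above might look like the primitive wrangle below, which builds one group per piece from a per-piece name attribute after unpacking, so each Material SOP can simply target the matching group (assuming the unpacked geo still carries such an attribute; the attribute name "name" is a guess):

        // Primitive wrangle, run over primitives, placed after the Unpack SOP.
        // Builds one primitive group per piece from the assumed "name" attribute.
        string piece = re_replace("[^A-Za-z0-9_]", "_", s@name);  // sanitize for group names
        if (piece != "")
            setprimgroup(0, piece, @primnum, 1);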
  10. It's my first week using the Apprentice version (build 18.0.391) and I am having trouble rendering my files out in Mantra. It seems that this problem is not unique, but the available solutions are not working for me (spaces in file paths, firewall issues, etc.). I am still working my way through this, but not being able to render my builds has really been bumming me out for a while now. Any leads will be appreciated. Thanks.
  11. Looking for some advice on rendering a smoke/dust sim with a heavy building destruction sim as a holdout using Mantra. Without holdouts the smoke takes about an hour or so per frame, as expected, but with the holdouts it is taking many hours (up to 6 hours) for some of the heavier frames. I am using packed Alembic caches for each of the layers, then merging them together and setting them as force matte in Mantra.
  12. Pyro Sim not rendering properly

    So I got some help from a guy on here for my uni project. The file he gave me looks amazing, but when I come to render it, the fire doesn't show at all. Does anyone know how to resolve this? PyroSim.zip
  13. Hello! I just got myself a Threadripper 3960X for freelance simulation purposes etc. My workstation is now up and I'm doing some performance tests. I'm getting some very unexpected results when rendering in Mantra, and I'm very curious if someone can help me understand what's going on under the hood. A simple scene I created (with some geo, lights and reflective materials) takes about 2 minutes to render in Mantra in bucket mode. All good and as expected! The same scene takes about three times as long in progressive mode. If I monitor the core activity, the cores are almost idling while rendering, then suddenly, about halfway through, they all accelerate up to max speeds and finish the render in seconds... Anyone wanna take me to school? Thanks, Joachim
  14. Hey guys, do you know if it is possible to use intrinsic information in a shader? Something like intrinsic:indexorder in a render state, just like with packed attributes. Also, still in a render state, packed:primnum or packed:ptnum doesn't seem to be working. If I make an attribute in a wrangle on packed prims, let's say i@primid = i@primnum, and I call that in the shader with a render state, it works as expected. Any idea? Thanks, Martino
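
    To make the working case above concrete, here is a rough sketch of both halves (the attribute name primid comes from the post itself; the exact renderstate() token on the shader side is an assumption, not verified):

        // SOP side: primitive wrangle over the packed prims, copying what's
        // needed into a regular attribute.
        i@primid = i@primnum;

        // Shader side (VEX): read it back with renderstate(); the token string
        // ("packed:primid" here) is assumed.
        int id = -1;
        int found = renderstate("packed:primid", id);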
  15. Hi all, this is a technical question to help me understand what's going on under the hood with how geometry lights are rendered. I am playing with writing my own raytracer within Houdini, where I also want to use geometry lights. The way I go about it is that each shaded point does a shadow test to a randomly generated point on the surface of the light geometry. As this happens completely at random, it is somewhat likely that the generated point will lie on the far side of the geometry, so the light geometry itself will cast a shadow (self shadow on). This results in the first attached picture. This makes sense to me - if you consider a shaded point on the wall behind the turquoise cube, it has only a 1/6th chance of generating a point on that cube that it can actually see, whereas a point to the top and right of it sees 3 sides of the same light geometry and therefore has a higher chance of generating a point it is illuminated by. This is not what I would imagine seeing in real life (though maybe my conception is wrong). However, when I rebuild and render the scene in Houdini/Mantra (light type set to geometry, with a box as the geo and self shadow on), the result is very different (see attachment 2). Maybe someone can shed some light on how Mantra does its magic. Cheers!
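
    In case it helps to see the sampling step written down, here is a minimal sketch of what the described raytracer does per shaded point (names, inputs and seeds are arbitrary; the light geometry is assumed to be wired into input 1 of a point wrangle running over the receiving surface):

        // One light sample per shaded point: pick a random point on the light
        // geometry, then shadow-test against that same geometry (self shadow on).
        int nprims = nprimitives(1);
        int prim = min(int(rand(@ptnum + 0.137) * nprims), nprims - 1);
        vector uvw = set(rand(@ptnum + 0.571), rand(@ptnum + 0.733), 0.0);
        vector lp = primuv(1, "P", prim, uvw);        // random point on the light surface
        vector to_light = lp - @P;
        vector hitpos, hituvw;
        int hitprim = intersect(1, @P, to_light, hitpos, hituvw);
        // Visible if nothing was hit, or if the hit is (effectively) the sampled point itself.
        int visible = (hitprim < 0) || (distance(hitpos, lp) < 1e-3);
        f@illum = visible ? max(dot(normalize(@N), normalize(to_light)), 0.0) : 0.0;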
  16. Forest Fire

    Forest fire, done entirely in Houdini and rendered in Arnold.
  17. Quick question. This is the diffuse AOV from a render using nothing but a distant light. To give you an idea, this scene is rather large. The geometry in this scene is an ocean volume and a plane. I have removed the displacement map from the material to isolate this as much as possible. I cannot for the life of me find what sampling quality I'm missing here. Does anyone recognize this kind of circle pattern coming from the light?
  18. Reflection with alpha

    Hey everyone, I've got a question that, even after lengthy research, still has me stumped. I've got a pyro fireball and a ground geometry that reflects the fireball. The simple question is: how do I set up the ground plane and its material so it reflects the fireball, including alpha, and just gives me a black alpha where there is nothing to reflect? I've tried forcing it to phantom, but that just makes it invisible, and other, somewhat hacky workarounds like this one are rather old and in the SHOP context. Nothing I have tried seems to work. When I'm working in 3ds Max and V-Ray, I use this technique all the time (V-Ray object properties -> reflection/refraction matte); I just don't get how this can be this much of a faff in Houdini, so I'm probably missing something pretty obvious. I'm grateful for any and all help, thanks in advance!
  19. Hi all! There's 1 week left to register for my Mastering Destruction 8-week class on CGMA! In supplemental videos and live sessions we'll be focusing on some of the awesome new features in Houdini 18, and exploring anything else you might be interested in, such as vehicle destruction. For those that haven't seen me post before, I'm an FX Lead at DNEG, and previously worked at ILM / Blue Sky / Tippett. Destruction is my specialty, with some highlights including Pacific Rim 2, Transformers, Jurassic World, and Marvel projects. I've also done presentations for SideFX at FMX/SIGGRAPH/etc, which are available on their Vimeo page. If you have any questions, feel free to ask or drop me a message! https://www.cgmasteracademy.com/courses/25-mastering-destruction-in-houdini https://vimeo.com/keithkamholz/cgma-destruction
  20. Hey, I am new to Houdini and I just created something like an explosion, and I want to render it with a moving camera, so the scene is moving. Can I do it? Will it take longer to render with an animated camera? I am afraid that it won't work, so maybe someone who has done it before can tell me?
  21. I've started testing Houdini 18 and Arnold 6. The first test was simple spline rendering: 250,000 splines instanced 25 times, loading a 140 MB Alembic file, rendered on a 6-core Xeon CPU and an Nvidia Quadro RTX 5000 (Windows 10 Pro). The startup for Arnold GPU is slow; it renders faster, or so it seems, but clearing up the final image takes forever, or it just drops/crashes - hard to tell on the GPU. The CPU is quite fast, but much slower than the GPU would be, if the GPU ever finished. (Adaptive sampling was on.) As soon as Arnold finishes rendering the scene, it stops and does not refresh any more on parameter changes. So far I am not impressed with the Arnold GPU rendering. Here is the same scene with Arnold CPU and only direct lighting (on my MacBook), plus some tests with Arnold GPU - it performed much better with just direct lighting.
  22. I am working on a sea of clouds. It takes me too much time to render it, so I want to render the clouds separately and composite them using a Deep Camera Map, but there is no interaction between the clouds because they are rendered separately. I want to know if there is a solution to add the shadows between clouds. Secondly, what's the difference between a Deep Shadow Map (DSM) and a Deep Camera Map (DCM)?
  23. How can I render my particles with the isotropic volume lighting model and get self-shadows? It's like there is no occlusion: I can see my particle field from one side to the other, and I can't distinguish which particles are in front and which are in the back... Cheers, Diogo
  24. Are there any good resources on rendering particles with Karma? And does it support micropolygon rendering? Thanks
  25. Recent work from the Chinese animated feature 哪吒 (NeZha). I was responsible for creating and rendering the water effects, using Houdini FLIP to build a guided system for the water and rendering in Mantra; final compositing was done in Nuke.