Welcome to od|forum


Search the Community: Showing results for tags 'mantra'.




Found 147 results

  1. How do you create the effect where the trail created by motion blur fades off instead of being solid throughout? In RenderMan it's achieved with shutter effects, but I could not find anything similar in Mantra. Please help.
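The shutter effect being asked about can be reasoned about numerically. The sketch below is a hypothetical weighting scheme, not a Mantra feature: it assigns decreasing weights to subframe time samples across the shutter so the oldest part of the trail contributes least, normalized so total exposure is unchanged.

```python
def fading_shutter_weights(n, falloff=2.0):
    """Weights for n subframe samples across the shutter. Earlier samples
    (the tail of the trail) get smaller weights; higher falloff fades
    the tail faster. Weights sum to 1 so overall brightness is preserved."""
    raw = [((i + 1) / n) ** falloff for i in range(n)]
    total = sum(raw)
    return [w / total for w in raw]
```

With 4 samples and the default falloff, the last sample (the sharp "head" of the trail) carries several times the weight of the first, which is exactly the fading-trail look.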
  2. Hi everyone, I've noticed an issue shining a light through a refractive material. The refractions don't seem to be energy conserving. Using a point light with intensity 1, exposure 0, no attenuation, shining through a grid with a basic mantra surface material with refractions enabled, the brightness values of the resulting pixels in the render are extremely high. Some are as high as 4,000. Why is this? In this thread, Symek mentions that the BSDF model is not inherently energy conserving, so it would have to be applied post-process. Seems odd that this wouldn't already be done in SideFX's shaders... If that's the case, though, how can I make refractions in the mantra surface shader energy conserving? Thanks, Chris EDIT: Forgot to upload sample scene, here it is light_refraction_broken.hip
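For reference, "energy conserving" at a dielectric interface means reflectance plus transmittance sum to one, so a unit-intensity light can never gain energy on the way through. A quick numeric check, using Schlick's approximation as a stand-in (an illustration only, not Mantra's actual BSDF):

```python
def schlick_reflectance(cos_theta, ior=1.5):
    """Schlick's approximation to Fresnel reflectance for a dielectric."""
    r0 = ((1.0 - ior) / (1.0 + ior)) ** 2
    return r0 + (1.0 - r0) * (1.0 - cos_theta) ** 5

R = schlick_reflectance(1.0)  # reflectance at normal incidence
T = 1.0 - R                   # an energy-conserving shader transmits the rest
```

If a render of a unit-intensity light through glass produces pixel values in the thousands, something in the shader is scaling rather than splitting the incoming energy.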
  3. Hello, everyone! I'm very excited to share my Procedural Lake Houses tutorial series, where I show how to generate the houses all the way from base silhouette to final shading. Example of the Generated Content: Link to cmiVFX page: https://cmivfx.com/products/494-procedural-lake-house-building-creation-in-houdini-volume-1 Thank you for watching and have a good day!
  4. Still a Houdini novice, but curious what all the fuss is about with 3rd-party renderers, especially hearing a lot about Redshift. Can any experienced Houdniks summarize the key advantages of Redshift over Mantra? RS was apparently used to create this impressive still http://jm-b.cgsociety.org/art/sculpture-maya-woman-houdini-nude-redshift-flower-figure-pose-dry-ballisticpublishing-cfe-expose12-figures-adult-3d-1345912
  5. I'm having trouble with Mantra rendering hair in Houdini 16. I keep getting an error message: [17:06:10] mantra: Unable to resolve geometry op:/obj/geo1_deform/OUT_GROOM. Please ensure the object is included for rendering. You can turn off vm_renderable if needed. Invalid detail specified Invalid detail specified Invalid detail specified I don't know how to turn off vm_renderable or what is causing Mantra to fail here. Is this a bug? Any help much appreciated B
  6. For anyone rendering scenes with Mantra that have lots of lights: some amazing optimizations showed up in the last few days. The scene has 528 disc lights with varying colors:
     16.0.504.20: 11m 51s (no active radius), 8m 29s (active radius 3)
     16.0.535: 3m 28s (no active radius), 3m 13s (active radius 3)
     16.0.537: 1m 18s (no active radius), 1m 9s (active radius 3)
     I haven't had a chance to try it in 15.5, since the scene was built using several H16 nodes, but if anyone else gets the chance to compare against 15.5, that'd be cool!
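For context, those timings work out to roughly a 9x speedup between builds; converting the m:ss values to seconds makes it easy to check (using the "no active radius" column from the post):

```python
# Render times in seconds, "no active radius" column from the post.
times = {
    "16.0.504.20": 11 * 60 + 51,
    "16.0.535":     3 * 60 + 28,
    "16.0.537":     1 * 60 + 18,
}
baseline = times["16.0.504.20"]
speedups = {build: baseline / t for build, t in times.items()}
```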
  7. Hello community, Some months ago I asked how to achieve this kind of effect. The user Blackpixel on the SideFX forums did an awesome job with a Mantra shader; at that time I was just learning the basics, so I got stuck trying to understand it. Here is the post if someone wants to check it out ( https://www.sidefx.com/forum/topic/44970/ ). I'm uploading the file that blackpixel made, all in Mantra with Shader Builder. Now that I've learned the basics of Shader Builder, I'm trying to really understand what he did. Here are my thoughts: he takes a Worley noise (4 points) and splits it into 5 fit ranges; one is set to the refraction, he multiplies 2 of them to get the displacement, then he does several multiplies with colormix. Is this to get the colours on the borders? I tried to do it myself from scratch (attaching the hip too) with a similar but not identical result. I started adding nodes one by one to see what each is doing, but I can't figure it out yet. My main question is, from a designer's point of view: let's say I want to create a shader that is black/milky (btw he isn't using SSS, but it looks like it), how should I think to say: this will be the plastic hole, this will be the border, this will be the main surface? I guess he's doing all of that by dividing the Worley noise into fits, right? Mantra is kinda scary, yet so powerful. Sorry for the long read. Cheers! worley_holes_blackpixel.hip worley_holes_caskal.hipnc
  8. Hello guys, I'm working on a Bifrost liquid mesh that I imported into Houdini to render with Mantra, since Maya couldn't manage my huge Alembic file ( +300GB ). There is no foam yet, since it's still in the process of meshing, but I wanted to know what you think about the liquid mesh shader. I used the Ocean material from Houdini with some adjustments. Thank you guys!
  9. Hey guys, this is the first visualisation of a project for school. It's a floating island which has 4 different areas: a grassland, a sand desert, a swamp area, and on top of the mountain an ice land. The base geometry was made with WorldMachine 2, and the base textures were also made with this tool. After that, the base textures were edited in PS. All of this data was then imported into Houdini, and things like clouds and plants were added. The shading, lighting and rendering were also done in Houdini with Mantra. This is my first project in Houdini. Greetings, Chrizzo Floating Island_Konzept_V1.0_Ani_01.mp4
  10. Hiya! I'm still new to Houdini and I was wondering if anyone had made use of 3D decals (projected textures) in Houdini? I've found them really useful in the past and I was wondering if anyone had played around with them. They look like this kinda thing: I'm not talking about the Decal material node, I mean things similar in effect to what is talked about in this blog post. Thanks!
  11. First things first: – This is not a "why is Mantra so slow" topic. In fact, Mantra is my first renderer, and from what I've read it seems pretty fast and good. – I come from a digital product design background and I'm really new to 3D and Houdini. – I'm using a MacBook Pro for learning, which I know is definitely not the best machine to work with 3D stuff. (Specs: Mid 2012, 2.3GHz i7, 8GB RAM, GeForce GT 650M 1GB) So after this (not so happy) introduction, let's talk about my problem: I was doing a tutorial from Niels Prayer about how to make geometry react to an audio file, and after that I tried to render a sequence of 860x540 images. I waited 14 long hours for about 12 secs of video. Is this normal for my computer specs? I forgot to cache the geo, but I don't know if that could be a big factor in getting a faster render. Could it be? I've read some topics about Mantra settings on Odforce and the SideFX forum and found some interesting tips on how to get better and faster renders: bring down the pixel sample values, increase the max ray samples, and tweak the noise level to an acceptable value. I've attached a hip file with my Mantra settings; my machine can render this frame in about 3:13 min. This is currently the best I can do in terms of speed and quality. Can you guys check out my file and see what I can do to get a faster render? Please feel free to hit me with tips on how to get the best out of Mantra and Houdini in general. Thanks a lot. Cheers, render-test.zip _render_test_1.exr
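A sanity check on those numbers: assuming the video plays at 24 fps (an assumption, the post doesn't say), 12 seconds is 288 frames, so 14 hours averages out to just under 3 minutes per frame, in the same ballpark as the quoted 3:13. The total render time is therefore consistent with the per-frame time rather than a sign of something extra going wrong:

```python
fps = 24                   # assumed playback rate; the post doesn't state it
frames = 12 * fps          # 12 seconds of video -> 288 frames
total_minutes = 14 * 60    # 14 hours of rendering
minutes_per_frame = total_minutes / frames  # average minutes per frame
```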
  12. I am starting to experiment with some non-photorealistic rendering. First I am starting off with non-animated versions. Simple point-replicated rendered discs: Houdini viewport: constant shader in Mantra: working out some edge strokes and rendered with a toon shader: scene files: point_repli_v001.hiplc point_repli_v002.hiplc
  13. Working on a project that is just a simple break-away of a fractured model. The problem is that Mantra renders all the cracks in the shape before the pieces are pulled apart. Rendering the same scene in other renderers like Arnold does not have this problem: the shape is not cracked before it is pulled apart, which is the desired effect. Anyone know why this is not the case in Mantra? Houdini 15.5, latest Arnold.
  14. Hello, During rendering, Mantra always converts all non-.rat textures into .rat and then discards them after the render is finished. So I am considering batch-converting all of them to .rat using the iconvert tool, keeping the original names and replacing only the file extension. Batch conversion of the textures is not a problem, but I am curious about re-linking them in shader networks. Arnold has an elegant solution for this: it checks if there is a .tx version of a texture and, if so, uses it, saving some time. It also allows a one-time conversion on the first use of a texture, after which it keeps the .tx version around inside the texture folder. So I am humbly hoping that a hidden feature for this behavior also exists somewhere in Mantra. Any experience with that? Otherwise I can just scan all shading nodes in Houdini with Python and replace the texture file extensions with .rat. Thanks, Juraj
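The batch-conversion half of this is easy to script. A minimal sketch (the helper name is made up; it only builds the iconvert command lines rather than running them, so it stays runnable anywhere):

```python
from pathlib import Path

def rat_conversion_cmds(texture_paths):
    """Build one `iconvert src dst.rat` command per texture, keeping the
    original base name and swapping only the extension."""
    cmds = []
    for tex in map(Path, texture_paths):
        cmds.append(["iconvert", str(tex), str(tex.with_suffix(".rat"))])
    return cmds
```

Executing them is then a `subprocess.run` loop over the returned commands; the remaining work is the shader re-linking question the post asks about.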
  15. This is a small project I have been working on lately. I mostly had to limit myself with the render settings, as it took almost 1 hour per frame. Any feedback would be appreciated. Thank you
  16. Hi! Is there a way to bake a volume to a point cloud during a render? I did it for polygons with the pcwrite function and micropolygon Mantra with uniform measuring (1 point per primitive). But if I do this with any volume shader, pcwrite does not work after the pbrlighting node (as part of the Compute Lighting node; I ran pbrlighting manually to get Cf): rendering freezes. I want to get a point cloud with a Cf color for each voxel.
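Outside the renderer, the end goal (one point per voxel, carrying a Cf colour) can be sketched in plain Python; the grid resolution and colour function here are made-up stand-ins for the shaded volume:

```python
def volume_to_points(res, color_at):
    """Emit one (position, Cf) pair per voxel of a res x res x res grid,
    sampling at voxel centres on a unit cube."""
    points = []
    for i in range(res):
        for j in range(res):
            for k in range(res):
                p = ((i + 0.5) / res, (j + 0.5) / res, (k + 0.5) / res)
                points.append((p, color_at(p)))
    return points
```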
  17. Hi all, I'm trying to render to several outputs at the same time and perform multiple takes on those outputs. The attached example scene shows more clearly what I mean: I have two sub-takes (each shows a different primitive) and a Wedge node that is set to render both of these takes. In /out you will see a Wren node plugged into a Mantra node, the idea being that if I set the Wedge node to render using the Mantra node as an output driver, the Wren node connected to it should also render. But it renders using the wrong take. If you open the attached file and click on 'Render Wedges' on the /out/wedge1 node, it should render the Wren node and then the Mantra node to MPlay. You can see that the Wren node always renders the same output. I see in the documentation there are instructions on how to wire up multiple nodes to make them all render in sequence, but this doesn't seem to work at all where takes are concerned. Am I doing something wrong, or is this a bug? Wedge.hipnc
  18. Hi, I'm testing RenderMan 21.2. The PxrPrimvar node can read variables from Houdini, but I can't get it to work in Maya. Here is my workflow: 1. In Houdini I create some attributes (rest, mynormal, myfloat) and use an Attribute Rename node to set the RiName. 2. Export an .abc file to Maya. 3. Use the PxrPrimvar node in Maya to read those variables; neither rest, Rest, myfloat nor Myfloat works for me. Can anyone tell me how to do this, please? Thank you very much
  19. Hi! I recently fully committed myself to Houdini, coming from C4D. Mostly because my first love in 3D was Softimage, and when I met Houdini it was like....but I digress.... I have a few newbie questions that I would greatly appreciate a more experienced user answering with some honest advice: 1. Very confused about GI: Is the Indirect Global photon map the GI solution for Houdini, or is the PBR Mantra renderer with multiple diffuse bounces also capable of GI simulation? I ask because the last time I used GI with photon maps was in LightWave with Monte Carlo calculations (also a bit in C4D before the Physical renderer was implemented), and I remember the horrors of the flickering and blotches etc., so it really scares me to have to go back to photon maps. I am particularly confused because if PBR rendering is not simulating GI, what do diffuse bounces mean, and why do HDR environment maps actually illuminate the scene? A clarification of this would be greatly appreciated. 2. Glass and transparent shadows: In most render engines I have used, refractions produced transparent shadows. In Mantra it seems this is called (appropriately, I must admit) a faux caustic. This solution is quite good enough for me, but 80% of the time I just want the semi-transparent shadow, not the highlights and other details caustics cause. Are faux caustics the only way to go in these cases? 3. And an important question that I am honestly asking as a newbie with no ulterior motive: is one of the answers "use Arnold (or insert any other renderer)"? I actually really do like Mantra; the shading tools, noises and integration of custom parameters encourage me. I also mostly do motion graphics and some VFX, but nothing that pushes me to look for an ultra-realistic renderer (read: Arnold). But a good enough GI solution (kind of like the C4D Physical renderer) would be very useful. Sorry for the lengthy post....oh crap, I just made it longer!
  20. I have experience using deep rendered images as a compositor at large facilities, and while the file sizes are large, working in Nuke, though slower, is still functional, and the benefits of the deep render outweigh the file size and comp slowdown. I am doing some testing with Houdini for a whitewater sim, and the DCM output is gigantic: depending on the particles on screen, the file size maxed out at 3.2 GB for an EXR frame. Nuke, needless to say, can't handle this; even Houdini's output of a DCM with a relatively small number of particles will bring Nuke to a dead crawl. It seems like there are quite a few DCM settings buried away. Does anyone have experience optimizing the DCM settings to get a result that is good but saves on file size and computation for Nuke? Cheers, Ryan
  21. I have a question that has been bugging me for some time, and I couldn't find much information about it. Which is the best and most efficient way to render many polygons: using delayed load procedurals or using packed disk primitives? Or am I confused, and are they both doing the same thing with no difference between the 2 workflows? As far as I know, they both create instanced geometry. The documentation doesn't help much either; half of the things I read talk about optimizing a render using delayed load procedurals, and the other half about using packed primitives. I'm wondering if packed primitives are the new workflow, and delayed load procedurals were the old way of doing it and are now obsolete? Here are the 2 workflows I'm talking about: Packed Disk Primitives: Here I pack all my geometry and write it out to disk. I then load it back and change the load setting to "Packed Disk Primitives". Then I generate my IFDs, and they now reference the geometry from disk instead of having to write it out (and the IFDs are a few KB or MB big). I then render using those IFDs. Here is what the documentation says about it: "Packed Primitives express a procedure to generate geometry at render time." "Because Packed Disk Primitives by their nature are geometry streamed from a file, similar to Alembic primitives, we don’t have to use a special procedural to get smaller IFDs." Delayed Load Procedurals: Here I write out my geometry (not packed) as bgeo, then make a Delayed Load Procedural shader and select the bgeo files I just wrote to disk. I then go to the Rendering -> Geometry tab of my object and load my procedural shader. I then create my IFDs and render them out. The documentation about delayed load procedurals also talks about optimizing geometry this way. So I know there are these 2 ways, but are they both equally good, or is one better than the other? Which workflow do you use?
Also, when using packed disk primitives, if the geometry you want to render is unique and can't be instanced (or there's just no point in doing it), do you still pack it (so it's only 1 packed prim) and save it out? Or do you use delayed load procedurals? Do you use any other workflow? Any advice on this would be greatly appreciated! Thanks
  22. Hi, We started using deep rendering and noticed Nuke scripts literally explode if we forget to check "Exclude from DCM" for each extra pass. My problem is: how do we exclude base components from the DCM, like Direct Lighting and Indirect Lighting? There is no Exclude from DCM checkbox for those passes. I tried in H14 and H15. Do I need to manually export those passes in the Extra Image Planes section? Thanks Maxime
  23. This has been really getting on my nerves the past few days. I'm just messing around with particles and their attributes, connecting them to a Principled Shader for practice. I can get Emission to turn on (emitillum) and I can control the intensity (emitint), but I cannot for some reason figure out how to control the Emission color. I know that Emission Color's parameters are: emitcolorr emitcolorg emitcolorb In my example I did a lowish-res FLIP sim and connected the velocity to Emission Intensity (@emitint = @v;), for example. However, when I colored my particles using a ramp and the velocity channel, I figured I would try: @emitcolorr = @Cd.r ...and so on for green and blue. But this did not work. I feel like I'm dancing around the answer but not able to find it after searching. I wonder if anyone else has run into this? Best, Stark!
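What the wrangle in that post is trying to express, mapping particle speed through a ramp onto an emission colour, can be sketched in plain Python. The ramp endpoints and max speed below are made-up example values, not anything from the Principled Shader:

```python
def speed_to_emit_color(speed, max_speed=10.0,
                        cold=(0.0, 0.0, 1.0), hot=(1.0, 0.5, 0.0)):
    """Linearly blend from `cold` to `hot` as speed goes from 0 to
    max_speed, clamping outside that range (like a two-key colour ramp)."""
    t = max(0.0, min(1.0, speed / max_speed))
    return tuple(c + t * (h - c) for c, h in zip(cold, hot))
```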
  25. Hi guys, I have a FLIP tank simulation from an ocean surface, and it has a displacement map created from the ocean below. When I render the FLIP object with Mantra with "Allow Motion Blur" off, everything is OK, but when I turn "Allow Motion Blur" on, it cannot create the displaced geometry at render time and takes forever, although the CPUs are at 100% load. And I should say "Allow Image Motion Blur" is off; I just want a motion vector pass from the render. I tried different rendering engines and raytraced motion blur, but no change. I'm trying to create a small hip from it now. I would be really thankful if you could help me with this problem. Update: I added the hip file, sorry for the size. ForumFiles.rar