Showing results for tags 'ifd'.

  1. I am using an IFD workflow, and I want to render an identical scene from 3+ different camera positions. I've come across several different ways of rendering these 3+ images sequentially, but that's not exactly what I want. I would like to make 1 IFD file that renders 3+ images from different camera positions. Is there a way to do that? I am working with a large dataset. For easy math, let's say that it takes 10 minutes to read/load, and then 10 minutes to render. Rather than doing three sets of load+render + load+render + load+render (60 minutes), I'd like to be able to do load+render+render+render (40 minutes). I have found that I can use the "Stereo Cam Template" (even though I am not rendering in stereo) to work with two of these cameras to do load+render+render in one IFD file. But how can I do this with more cameras? (See the Python filter sketch after this list.)
  2. Hey! I'm in a situation where I have cached a huge amount of points to disk and I want to render them. In the scene I read the bgeo.sc caches, merge them all together, trail the points to calculate velocity, apply a pscale, assign a shader and render. If I run it on the farm with IFDs, it takes a lot of time to generate the IFDs, and I'm trying to reduce that. I noticed that the IFD itself is tiny; the big part is a storage folder under the IFD folder holding a huge bgeo.sc file per frame, which I suppose is the geometry I'm about to render. I wonder if all of this isn't redundant, since all those operations could be done at rendertime. I tried setting the File SOP to "packed disk primitive", but it seems I cannot apply pscale after that; all the points render at pscale 1... (See the sketch after this list.)
  3. Hello node friends! I have a problem on my hands here. Basically I'm doing a stadium crowd sim. So far so good. I'm doing the shading with a material style sheet and rendering with Mantra. I also have a cloth sim with the same properties in that style sheet. The problem is the following: I can render the flags on the farm with IFD generation, since I've included the SHOPs on the Mantra ROP, so that part is solved. When it comes to rendering the crowd, it just renders black. I've tried caching out the results, using procedural geometry and the procedural agent SHOP, but it still renders black. I can render it just fine on my machine; it's only the IFD rendering on the farm machines that fails. Note that all the agent caches and maps are on a shared network drive that the farm can access, otherwise the cloth wouldn't render either. Any ideas?
  4. Hi, how do you render an IFD file with the mantra node? I know how to create it by enabling Disk File in the driver. Once the IFD is created, what do I do? I am trying to render to disk. I read the manual and didn't really see how to render an IFD with the mantra node. (See the sketch after this list.) Thanks, Evan
  5. Dear fellow magicians, I am trying to set up a render machine for Mantra distributed rendering (using the mantra -H command) and I am struggling to make environment variables work. Let's say I have two machines, one master and one slave. The master has the env var $ROOT=/home/trandzik/proj/dreams and the slave has $ROOT=/mnt/TRANDZIK01/dreams. Distributed rendering works with no problem when geometry is included in the IFD file and no file paths are used in the scene. Once I want to include a path to some file (say a texture located at $ROOT/img.jpg), the slave won't find that file, because the master saves the IFD with a hardened path (using its own $ROOT). The slave therefore tries to find the image at /home/trandzik/proj/dreams/img.jpg instead of /mnt/TRANDZIK01/dreams/img.jpg and of course fails to render it. Does anyone know how to generate an IFD with non-hardened environment variables? (A workaround sketch appears after this list.) Distributed rendering is a great option for fast lookdev, and I believe it would be great to set up its pipeline correctly with environment variables. Thank you very much, Peter
  6. I need to reduce the render time for a volume, so I tried generating IFD files. How can I render this IFD? The only thing I can do is click Render to MPlay; I want to render to disk.
  7. Hi, ever since we started running H16 we've been seeing random frames get stuck on the farm for no reason. They take much longer than the surrounding frames, but they render normally when they are requeued. That goes for Mantra renders, IFD generation, and geometry caching. Has anyone experienced the same thing? We are running Houdini 16.0.600 on a Windows 10 farm with Deadline 9. Thanks, Ahmed
  8. I put together a simple intro to writing/rendering IFDs, using Packed Disk Prims to make better IFDs, and using HQueue to efficiently generate IFDs for rendering with Mantra. https://vimeo.com/223443000
  9. Hi everyone, I've got a little problem. I'm trying to render my work with Mantra, but Houdini reloads the geometry for each frame, which takes a while. When I render in the render view everything works well: Houdini loads the geometry once and renders each frame very quickly. But when I render to disk, it reloads the geometry for every frame, and I don't know why. What I render is just a simple sphere instanced onto a lot of points, around 10 million. Is there a way to cache the geometry for the render? Thanks.
  10. Hi, I was wondering if anyone knew why, as soon as I hit "Allow Editing of Contents", my shader stops working in my IFDs and everything just renders greyscale? Does anyone know why this happens, or a solution to it? Do I maybe have to cache my shader after I do this? If so, how would I go about doing that? Many thanks for your help. Kind regards, Frank Engen
  11. Hi, I'm attempting to use images embedded in an HDA for a light rig. All is fine when rendering within my Houdini session. However, when I export IFDs and execute them on the farm (command-line mantra) I get messages along these lines, and the textures are missing in the renders: [16:41:26] mantra: Unable to load texture 'opdef:/Object/turntable?ttBG.pic' Within the HDA, references to the embedded textures are relative, i.e. opdef:../?ttBG.pic. Is that possibly causing a problem? Do I need to make these absolute? Or is there a flag/switch somewhere to make the IFD export include the embedded images? Or is this just a limitation of IFD export? I can imagine embedding every texture into the IFD for every frame being quite wasteful. Anyone have experience with this? Thanks, Ben
  12. Hi there, I am working on a few pyro shots which get quite heavy geo-wise (around 6+ GB per frame). We are using Royal Render (comparable to Deadline) to deploy to our farm, and RR needs IFD files. My question is whether there is a possibility to just link to the already cached bgeo files in the IFD? All machines have network access and could just grab the bgeo files. As it stands right now, I spend hours caching the bgeo files, then generate the needed IFD files via the ROP, which again takes hours and produces quite big files as well (comparable in size to the geo files). So it seems the IFD files don't just link to, but actually contain, the geo, which is a bit odd to me. Am I missing an option to check? Do I somehow need to declare the geo? So far I've only found the option to compress the IFD file. (See the File SOP sketch after this list.) I am working in H14. Help is appreciated. Alex
  13. Hi, does anybody know how to change the tile index without recreating an IFD? Are there any command-line arguments or Python filtering options for mantra? Thanks, nap
  14. Hi! Looking at the transformation matrices in IFD files really confuses me. For one, they seem to do exactly the opposite of what the camera's transformation parameters in Houdini indicate, which I think is because Mantra moves the scene and there's no actual camera, like in OpenGL. What's really confusing me is that if I have a camera with SRT transform order, ZXY rotate order, rot(45, 30, 0) and trans(10, 5, 0), I get a transformation matrix in the IFD whose rows are: [0.866025396499, 0.353553407243, 0.353553391789, 0], [0, 0.707106765732, -0.707106796641, 0], [-0.500000012618, 0.612372443928, 0.612372417161, 0], [-8.66025396499, -7.07106790109, 6.53176023263e-008, 1]. Looking at the fourth row, I would expect to see 10 and 5 somewhere, but they don't show up. When I rebuild the same transforms in Nuke I get identical values for every component except the last row. The results from the IFD seem illogical to me. I'm really stuck on this; maybe you guys can help. (See the NumPy check after this list.) Thanks, Nhat
  15. Hi, is it possible to change the camera ray_transform inside an IFD using Python? I can modify properties using a tile script and the filterCamera() function, but I cannot access the transform of my camera. Thanks
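
A sketch related to question 1 above (several cameras from one set of IFDs): mantra can run a Python filter over an existing IFD via the -P flag, so per-run overrides can be applied without regenerating the IFD from Houdini. This is a minimal skeleton, not a full solution: the property names to override (output picture, camera transform, ...) are listed in the "Python filtering of IFDs" help and are deliberately not hard-coded here, and each camera still means one mantra invocation, so the geometry is still loaded per run.

    # camera_pass.py -- skeleton IFD filter, run as: mantra -P camera_pass.py -f scene.ifd
    import sys
    import mantra  # only importable inside a running mantra process

    def filterCamera():
        # Fires while mantra is declaring the camera; from here,
        # mantra.setproperty(name, value) can override camera/image settings
        # for this invocation only. Property names are intentionally left out
        # here -- look them up in the "Python filtering of IFDs" docs.
        sys.stderr.write("filterCamera: camera settings about to be locked\n")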
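
For question 2 (pscale ignored with packed disk primitives): attributes edited after a File SOP loads packed disk primitives don't reach the points inside the pack, so one workaround is to bake pscale into the points before the bgeo.sc cache is written. A minimal sketch of that idea as a Python SOP placed just before the cache ROP; the node layout and the constant value are purely illustrative:

    # Python SOP snippet (hou is available implicitly inside Houdini):
    # make sure every point carries a pscale before the cache is written,
    # so nothing needs editing after the cache is loaded back as
    # packed disk primitives.
    node = hou.pwd()
    geo = node.geometry()

    if geo.findPointAttrib("pscale") is None:
        geo.addAttrib(hou.attribType.Point, "pscale", 0.01)

    for point in geo.points():
        point.setAttribValue("pscale", 0.01)  # constant size, placeholder value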
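
For questions 4 and 6 (rendering generated IFDs to disk): once the IFD exists, the mantra executable renders it directly, and the frame is written to whatever Output Picture path was baked into the IFD (so set that to a disk path rather than ip/MPlay before writing the IFD). A minimal batch sketch, assuming mantra is on the PATH and with placeholder file paths:

    # Render every IFD in a sequence from outside Houdini.
    import glob
    import subprocess

    for ifd in sorted(glob.glob("/path/to/ifds/scene.*.ifd")):
        subprocess.run(["mantra", "-f", ifd], check=True)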
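
For question 5 (hardened $ROOT paths in the IFD): there may be a cleaner option, but since an uncompressed IFD is plain text, one blunt workaround is to rewrite the master's project root to the slave's mount point before the slave reads the file. A sketch, with the two roots taken from the post and placeholder file names; it assumes the IFD was written uncompressed (not .ifd.gz):

    # Rewrite the master's hardened project root to the slave's mount point.
    master_root = "/home/trandzik/proj/dreams"
    slave_root = "/mnt/TRANDZIK01/dreams"

    with open("scene.0001.ifd") as src, open("scene.0001.slave.ifd", "w") as dst:
        for line in src:
            dst.write(line.replace(master_root, slave_root))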
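
For question 12 (linking the cached bgeo files instead of embedding them): the usual route is to load the caches back through a File SOP set to "Packed Disk Primitive", which makes the IFD reference the .bgeo files on disk rather than copy the geometry into the IFD. The node path, the parameter name and the availability of this mode on H14 are assumptions to confirm on your build; the snippet only inspects the menu so the right token can be picked:

    # Inspect the File SOP's load-mode menu; '/obj/pyro_import/file_cache'
    # and the 'loadtype' parameter name are assumptions to verify first.
    import hou

    file_sop = hou.node("/obj/pyro_import/file_cache")
    load_parm = file_sop.parm("loadtype")

    for token, label in zip(load_parm.menuItems(), load_parm.menuLabels()):
        print(token, label)  # look for the packed-disk-primitive entry

    # Then set it with load_parm.set(...) using the token (or index) found above.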
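
For question 14 (the camera matrix in the IFD): the quoted values line up exactly with the inverse of the camera's object transform, i.e. the IFD stores a world-to-camera matrix, which is why the last row holds -T times the transposed rotation rather than (10, 5, 0). A quick NumPy check of that reading, using row vectors, SRT transform order and ZXY rotate order:

    # Rebuild the camera's object (camera-to-world) matrix and invert it; the
    # inverse reproduces the 4x4 from the IFD, including the (-8.66, -7.07, ~0) row.
    import numpy as np

    def rot_x(deg):
        c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
        return np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])

    def rot_y(deg):
        c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
        return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

    # rot(45, 30, 0) in ZXY order; rz = 0, so only the X and Y rotations contribute
    R = rot_x(45.0) @ rot_y(30.0)

    cam_to_world = np.eye(4)
    cam_to_world[:3, :3] = R
    cam_to_world[3, :3] = [10.0, 5.0, 0.0]   # trans(10, 5, 0)

    print(np.linalg.inv(cam_to_world))       # matches the matrix found in the IFD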