yongbin Posted September 13, 2017

Hello. We are stuck trying to find a good approach for working across programs that use different unit systems. We use Maya, Houdini, Katana and Nuke, and they have different defaults: Maya works in centimeters, Houdini and Katana work in meters, and Nuke does not specify a unit. A scene generated in Maya is far too big for Houdini and Katana and causes a lot of issues: dynamics look unrealistic in Houdini and lights are too bright in Katana.

Currently we scale down the whole Alembic scene (which is made in Maya) as soon as we import it into Houdini, with some camera tweaks. A similar solution could probably be applied to Katana (we have not explored that yet). We can work that way, but our compositors are not happy with it, because they have to composite images rendered from both Maya and Houdini. Channels containing depth or vectors differ in scale, and deep images do not match, so they have to rescale them. They want unified images even when the renders come from different programs.

So we tried changing Maya's unit system to meters. That failed because the viewport does not work properly with small objects (say 0.0001 m), and the Alembic export has a bug and does not work well with meters.

What are the alternatives? Does anybody use centimeters in Houdini (and Katana as well)? Please let us know if you have a better solution.
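For completeness, the Maya-side switch we attempted is just the standard currentUnit command; a minimal sketch of that attempt is below (currentUnit itself is standard Maya, but treat the helper and its name as illustrative, not our production code):

import maya.cmds as cmds

def set_scene_to_meters():
    # Switch the working/UI linear unit to meters.
    # Note: Maya still stores data in centimeters internally.
    cmds.currentUnit(linear='m')
    print('Linear unit is now:', cmds.currentUnit(query=True, linear=True))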
LaidlawFX Posted September 14, 2017

You can change the unit system in Houdini under Edit > Preferences > Hip File Options. By the way, Maya and Houdini have the same scale and unit size by default, and Katana should too, based on how it was created at Sony. It sounds like your scene files are not at standard human scale. All physics calculations use human scale as the baseline, whether for lights or simulations. If you are dealing with 0.0001 m you are working at sub-millimeter scale, and that is not what most CG math is designed for, which could explain your biggest issues.

"Normalizing" the assets is pretty common if your pipeline does not produce standardized assets. In your import/export HDA you can include a scale (or a fancier normalizing calculation) so that your calculations stay consistent and produce consistent results. This also helps because the parameter values you end up entering for physics simulations sit in a more human, contextual range. A previous studio of mine used decimeters as the standard, which worked for the game engine but made physics appear too fast or too slow relative to the game.
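A rough sketch of the normalizing import wrapper described above, using Houdini's Python API. The Alembic and Transform SOPs and their fileName/scale parameters are standard Houdini; the wrapper itself, its names and the 0.01 factor are assumptions for illustration rather than a tested HDA:

import hou

def import_alembic_normalized(abc_path, scale=0.01, name='abc_import'):
    """Load an Alembic file and apply a uniform normalizing scale.

    In a real HDA you would promote 'scale' so it can be set per asset.
    """
    geo = hou.node('/obj').createNode('geo', name)

    abc = geo.createNode('alembic', 'read_abc')
    abc.parm('fileName').set(abc_path)

    normalize = geo.createNode('xform', 'normalize_scale')
    normalize.setInput(0, abc)
    normalize.parm('scale').set(scale)   # cm -> m when scale is 0.01

    normalize.setDisplayFlag(True)
    normalize.setRenderFlag(True)
    return geo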
Sierra62 Posted September 14, 2017

All of the companies I have worked at set it up so that Maya is in cm scale; when we work in Houdini we scale the assets down by 0.1 to be at meter scale, and then we scale them back up to Maya scale for exporting or rendering.
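The scale-back-up half can live in a matching export wrapper. A sketch along the same lines as the import example above (the node types are standard Houdini SOPs, but the helper, the exact parameter names and the factor of 100 for meters back to centimeters are illustrative assumptions, using the cm/m factor discussed just below):

import hou

def export_alembic_at_maya_scale(sop_node, abc_path, scale=100.0):
    """Scale geometry back up to Maya/cm scale and write it out as Alembic."""
    parent = sop_node.parent()

    to_maya_scale = parent.createNode('xform', 'to_maya_scale')
    to_maya_scale.setInput(0, sop_node)
    to_maya_scale.parm('scale').set(scale)   # m -> cm when scale is 100

    rop = parent.createNode('rop_alembic', 'export_abc')
    rop.setInput(0, to_maya_scale)
    rop.parm('filename').set(abc_path)
    rop.parm('execute').pressButton()        # write the file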
Sepu Posted September 14, 2017

Yep, we do the same; so far it's working great.
Noobini Posted September 14, 2017

1 hour ago, Sierra62 said:
"All of the companies I have worked at set it up so that Maya is in cm scale; when we work in Houdini we scale the assets down by 0.1 to be at meter scale, and then we scale them back up to Maya scale for exporting or rendering."

What am I missing here? 1 m = 100 cm, so why use a factor of 0.1? Shouldn't it be 0.01?
Yon Posted September 15, 2017

There is no way around the conversion, so your best bet is to automate it. This is the price you pay for the luxury of rendering from two programs in your pipeline. A separate read node for Maya/Houdini renders, with the conversion built in, would be your fix.
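In Nuke terms, a "read node with the conversion built in" could be as small as a Python wrapper that drops a Multiply on the scale-dependent channels behind each Read. Read and Multiply are standard Nuke nodes; the wrapper, its channel choice and the 0.01 factor are assumptions for illustration:

import nuke

def read_with_scale_fix(path, scale=1.0):
    """Create a Read node and rescale depth data by 'scale'.

    Use scale=1.0 for renders already in meters (Houdini) and
    scale=0.01 for renders coming from a centimeter scene (Maya).
    """
    read = nuke.nodes.Read(file=path)
    if scale == 1.0:
        return read

    # Vector/motion layers would need the same treatment.
    fix = nuke.nodes.Multiply(channels='depth', value=scale)
    fix.setInput(0, read)
    return fix

This only covers flat channels; deep renders would need an equivalent fix on the deep stream.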
davpe Posted September 15, 2017

Concerning compositing: if you have two renders with differently scaled depth channels, for instance, it should work if the compositor multiplies the channel data of one render to match the other. They only need to know the scale difference, which is probably always the same, so I don't see much of a problem there. The companies I've worked for were also scaling assets as they were loaded into the scene.

Scaling a camera to make your scene appear to be at a different scale (as you suggested above) may be a very bad idea in some cases. I'm not sure exactly what the issues came from, but I've seen renders introduce artifacts when the camera was scaled, and significant differences in render times (20 minutes vs. a few hours).
Sierra62 Posted September 15, 2017

13 hours ago, Noobini said:
"What am I missing here? 1 m = 100 cm, so why use a factor of 0.1? Shouldn't it be 0.01?"

My apologies, it should be 0.001. My current studio uses decimeter scale in Maya, so that threw me off.
LaidlawFX Posted September 15, 2017

2 hours ago, davpe said:
"Scaling a camera to make your scene appear to be at a different scale (as you suggested above) may be a very bad idea in some cases."

+1, don't. Don't scale the camera; scale the environment. If the environment gets extremely out of whack when you do that, then the camera is probably at the wrong scale. Sometimes when you track a camera the solve may be correct, but at the wrong scale.
Noobini Posted September 16, 2017

Well, if it's decimeters then it's back to 0.1, since 1 m = 10 decimeters.

'Decimation'... one in ten, i.e. one in ten of you will be executed by the other nine...
Guest tar Posted September 16, 2017

The main issue is that if you don't scale your scene down from Maya, you are computing with less floating-point precision in Houdini, since a lot of operations are still 32-bit. Floating-point precision is logarithmic in nature and is densest between -1 and 1.
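A quick way to see the 32-bit point: the gap between adjacent float32 values grows with magnitude, so geometry pushed hundreds of thousands of units from the origin loses small offsets that a meter-scale scene keeps. A small illustration in plain Python/NumPy (nothing Houdini-specific):

import numpy as np

# Spacing between adjacent float32 values at different magnitudes.
for position in (1.0, 100.0, 10000.0, 1000000.0):
    print(position, '->', np.spacing(np.float32(position)))

# Adding a tiny offset to a large float32 coordinate is lost entirely:
big = np.float32(1_000_000.0)          # e.g. 10 km expressed in centimeters
print(big + np.float32(0.01) == big)   # True: the 0.01 offset vanishes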
yongbin Posted September 18, 2017

On 9/15/2017 at 6:03 AM, Sierra62 said:
"then we scale them back up to Maya scale for exporting or rendering."

Is it easy to scale the scene back up? What method do you use?
yongbin Posted September 18, 2017

On 9/14/2017 at 11:18 PM, LaidlawFX said:
"If you are dealing with 0.0001 m you are working at sub-millimeter scale, and that is not what most CG math is designed for"

Sorry, I exaggerated a bit. But I think we frequently deal with millimeters (the edge of a product, for example). And if we treat Maya's unit as a meter, Maya's viewport does not work well at millimeter scale.
tamagochy Posted September 18, 2017

I rescale scenes and we have a fix for the cameras. The compositors haven't had any problems. I think rescaling is the usual practice; all studios use it and everything works fine. :)