
Houdini 13 Wishlist


LaidlawFX


Guest mantragora

Every viewport bug I've sent in has been fixed. It's very simple: to fix the viewport, just send in bug reports!

No, it's not simple.

In build 12.5.456 I can see the IsoSurface of the collision object plus the particles in DOPs, but very often I have geometry visible even after I turn off all object nodes, or geometry visible from a node that is not active while the render/template flag is set somewhere else. Oh, and in OpenGL 3.2 mode the collision IsoSurface is squashed.

Fast-forward to 12.5.475, and now in OpenGL 3.2 I can see the collision IsoSurface correctly, but the particles and geometry have gone to hell. If I switch to the H11 OpenGL mode I can see the geometry but not the collision IsoSurface. If I switch to OpenGL 1.2 I can see them both.

WTF: in 456 I can see particles/geometry plus the IsoSurface in DOPs, and in 475 I can't? The viewport hasn't worked correctly since the day H12 saw daylight. It is far from simple.



This all works well on the OS X version (GL 2.1); maybe good bug reporting helps make it work well... ;)


If you have QuickTime installed on your system and write access, you can use MPlay: File > Export > QuickTime Movie...

Yes, that's right, but if you want to capture directly to disk, QuickTime would be a very handy option; right now we do this as a post process.
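For reference, here is a minimal sketch of that kind of post process, assuming the flipbook frames were already written as a numbered image sequence and that ffmpeg is available on the machine; the file names and codec settings are placeholders, not anything Houdini ships:

    # Hypothetical post process: wrap an already-rendered image sequence
    # into a movie with ffmpeg, since the flipbook can't write a QuickTime directly.
    import subprocess

    seq = "render/flip.%04d.jpg"        # placeholder frame pattern
    out = "render/flip_preview.mov"     # placeholder output movie

    subprocess.check_call([
        "ffmpeg",
        "-framerate", "24",             # playback rate of the source frames
        "-i", seq,                      # numbered input frames
        "-c:v", "libx264",              # assumed codec; swap for your house format
        "-pix_fmt", "yuv420p",          # keeps the movie widely playable
        out,
    ])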

I don't have this issue; possibly you are switching from one image plane to another and the adjustment does not change. Usually it does, otherwise it would be a bug.

My settings default to gamma 2.2, and if I switch to the alpha plane it is also gamma corrected, which makes no sense for alpha. For other channels it might be hard to know whether a channel should be corrected or not, but I want to see individual light exports with gamma, and mattes without.

Definitely in the non-flipbook context.

Sorry, I don't understand you here; exporting from the flipbook with a loaded background exports without the background.


For the flipbook, you need to have the background in the Houdini scene, on a plate, such as a grid with UVs and a constant material applied, with emission and diffuse lighting checked off. If you want to do a slap comp you can do it in COPs. I do agree that the MPlay background display option should get a checkbox in the Save Image As and Export dialogs to enable the background image to be saved.
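A rough sketch of setting up such a plate through Python, assuming a constant material carrying the background image already exists at /shop/constant1 (that path, the node names, and the camera-facing placement are placeholders left to you):

    # Hypothetical helper: build a background plate object for flipbooking.
    # Assumes a constant material already lives at /shop/constant1.
    import hou

    geo = hou.node("/obj").createNode("geo", "background_plate")
    grid = geo.createNode("grid")               # the plate itself
    uvs = grid.createOutputNode("uvproject")    # give it UVs for the texture
    uvs.setDisplayFlag(True)
    uvs.setRenderFlag(True)

    # Point the object at the constant shader with the background image.
    geo.parm("shop_materialpath").set("/shop/constant1")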

As for capturing directly to disk with QuickTime, are you talking about rendering to QuickTime, as opposed to rendering an image sequence and then converting? Because that's decades-old crazy talk; you don't want to lose an image for the sake of the sequence. Otherwise I'm confused about what option you would want. Maybe ask your pipeline guys for a post-render dailies submission process, an extra ROP you can chain after your Mantra ROP. With the same process you would use to submit a render to your farm, you could have another option that detects when the whole frame sequence is finished and submits the post-process QuickTime to dailies. It would be cool if SideFX had a node like that already, but it would have to have an editable Python/HScript string, like the Point Wrangle, where maybe HQueue is the default and you can easily change it for Deadline, Rush, Qube, etc. Maybe even make that node the same for regular queue submitters. You could do it as a post-render script on the ROP, too, if you wrap it into a Mantra ROP wrapper.
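A minimal sketch of what that post-render hook could look like in Python, assuming it is pointed at a Mantra ROP at a placeholder path; submit_to_dailies() stands in for whatever HQueue/Deadline/Rush/Qube call (or ffmpeg conversion) your pipeline actually uses:

    # Hypothetical post-render check: only hand the sequence to dailies
    # once every frame actually exists on disk.
    import os
    import hou

    def submit_if_complete(rop_path="/out/mantra1"):        # placeholder ROP path
        rop = hou.node(rop_path)
        start = int(rop.parm("f1").eval())                  # frame range start
        end = int(rop.parm("f2").eval())                    # frame range end

        # vm_picture is Mantra's output image parameter; evaluating it per
        # frame expands the $F-style tokens into the real file names.
        frames = [rop.parm("vm_picture").evalAtFrame(f)
                  for f in range(start, end + 1)]
        missing = [f for f in frames if not os.path.exists(f)]

        if missing:
            print("%d frame(s) missing, holding the dailies submission." % len(missing))
            return

        submit_to_dailies(frames)   # made-up hook: swap in your farm submitter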

As for the gamma shift on alpha, you may need to remove/cut your MPlay preferences and reset them (otherwise, if you can repeat it with stock Houdini settings, that's a bug). Maybe an option on the image planes or in SOHO could set a flag in the metadata of the .exr/.rat/.pic to record whether that image plane should be gamma shifted. If you use the preset Pz z-depth pass, it will apply the normalization setting to that layer only, but that is probably MPlay looking for that exact plane name. A custom-named preset would have to follow a pattern for MPlay to recognize it.


It's just an indication of how bad their acceleration structures were. Embree doesn't automatically speed up every renderer. It doesn't support deformation blur (nor multi-segment blur, as I recall). It was written as an example of modern design, a kind of state of the art of ray tracing on the CPU. For mature renderers such as Mantra or Arnold it doesn't bring that much, I suppose. I know there is a V-Ray plugin, but does it work that well? I've heard it's very picky, both performance- and stability-wise. It's also inconvenient to rely on hardware-vendor-specific software. Even though AMD eventually supports AVX/AVX2, it always takes time to catch up with Intel's malicious innovations.

(Note: that is pre-Embree 2.0.)



SESI's insight: http://www.sidefx.com/index.php?option=com_forum&Itemid=172&page=viewtopic&p=134222#134222



How about this?

We've had a hard time figuring out what is happening inside a SOP Solver.

It is greatly annoying that when we inspect some node and then move to the next frame, everything gets messed up, because we changed the display flag for debugging.

So I would like to be able to set a 'final node' whose result is exported to the next frame. With that option, we could move freely around the nodes and see what happens in each one without breaking the network when the time changes.

And I think this could be extended to other DOP solvers.

Edited by yongbin


Does the render flag, as opposed to the display flag, not work in that context?
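Until something like a 'final node' option exists, a small helper along these lines might take the sting out of it; this is only a sketch, assuming the path points at the SOP network inside your own SOP Solver, and the node paths in the usage comment are placeholders:

    # Hypothetical debugging helper: remember which SOP currently holds the
    # display flag, let you move it around for inspection, then put it back
    # so the solver's "final" node is restored before the next frame cooks.
    import contextlib
    import hou

    @contextlib.contextmanager
    def keep_display_flag(sop_network_path):
        net = hou.node(sop_network_path)           # SOP network inside the solver
        displayed = [n for n in net.children() if n.isDisplayFlagSet()]
        try:
            yield net
        finally:
            for n in displayed:
                n.setDisplayFlag(True)             # restore the original flag

    # Usage (paths are placeholders for your own solver network):
    # with keep_display_flag("/obj/dopnet1/sopsolver1"):
    #     hou.node("/obj/dopnet1/sopsolver1/attribwrangle1").setDisplayFlag(True)
    #     # ...inspect the geometry; the original flag is restored on exit.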

