haggi last won the day on November 17, 2016

haggi had the most liked content!

Community Reputation: 28 (Excellent)

  1. From the Alembic 1.7.1 source code: I'm not sure if Autodesk itself uses the same original sources, but it seems so. So I fear you will have no luck reading an animated sequence there. We solved the problem (with Maya 2015) by building our own .abc reader plugin that can be connected to a particle instancer, which works quite well.
  2. Export Points to maya 2017

    Or you could ask the creator of http://redpawfx.github.io/partio if he is able to provide a version for Maya 2017. He has a particle emitter in his toolset.
  3. Control Where Points Scatter

    Placing only one point in a volume does not guarantee that only one chunk is created. I'd use a boolean instead: first you bool out the desired chunk, then you fracture the rest.
  4. Per point distance VEX

    If you only want the distance to the point with the same point number in input 1, you can keep the point wrangle type and the advantages of parallel execution: f@dist = length(@P - point(1, "P", @ptnum)); Of course this only works if input 1 has the same number of points as the first input.
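The same per-point distance idea can be sketched outside Houdini in plain Python (the function and point lists below are illustrative, not Houdini's API; the VEX one-liner above is the real recipe):

```python
import math

# Per-point distance between two point lists of equal length,
# mirroring f@dist = length(@P - point(1, "P", @ptnum)) in VEX.
def per_point_distance(points_a, points_b):
    # In VEX a mismatched point count silently reads bad data;
    # here we fail loudly instead.
    assert len(points_a) == len(points_b), "inputs must have the same point count"
    return [
        math.dist(a, b)  # Euclidean distance, like VEX length(a - b)
        for a, b in zip(points_a, points_b)
    ]

rest = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deformed = [(0.0, 3.0, 4.0), (1.0, 0.0, 0.0)]
print(per_point_distance(rest, deformed))  # → [5.0, 0.0]
```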
  5. Render to pointcloud?

    Ahhh... great. Thanks a lot.
  6. Hi, I'm searching for a way to create a point cloud from rendered geometry. This was more or less possible with prman, where I was able to save point positions together with rendered color information during rendering (or in a pre-render pass). The same procedure worked, more or less, in mental ray. Can I do it in Mantra as well? The result should be a camera-independent point cloud with RGB and position data.
  7. Happy 10th Birthday Od[force]

    Happy birthday. It is incredible how helpful the community here is. Thank you.
  8. You create a surface VDB, which by definition creates the necessary data on the surface only. But to simplify the process you could use a fluidSource SOP, which seems to fill the interior.
  9. Hi, while working on my Maya Alembic reader, I discovered a problem with Alembic files written by Houdini. Apparently Houdini exports an additional point geometry together with my mesh. If I load the Alembic in Maya, I get a mesh shape and a particle shape. In Houdini I can see two packed objects in the Alembic SOP. This only happens in certain situations: e.g. if I create a box and deform it, only one packed object is created in the Alembic export. Only if I export my FLIP mesh is an additional packed point geometry exported, and I get a warning in the exporter: "Renaming node to /geo1/file3_1 to resolve collision". Does anyone know what's going on here? It does not seem to be a new phenomenon; it happens the same way with H16.
  10. But it is helpful to keep in mind that 10 nodes in a VOP can sometimes be replaced by one line of code. I just followed a Jeff Wagner FLIP fluid webinar where he combines some VOP operators in a DOP node, and even though it is not very complicated, the network grows fast and most of the time you are moving the viewport around to check where all the connections go. With two lines of code I was able to do the same thing, which results in a much better overview and fewer nodes.
  11. Particles from Houdini to Maya

    Okay, did you really try it? In the source code you can see: data["bhclassic"]=readBGEO; which means bhclassic should be handled as bgeo. But maybe the code on GitHub is newer than the plugins.
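The quoted line is an extension-to-reader lookup. The pattern can be sketched in Python like this (function and table names here are illustrative, not partio's actual API):

```python
# Sketch of an extension -> reader dispatch table, mirroring
# data["bhclassic"] = readBGEO in the partio source (names illustrative).
def read_bgeo(path):
    return f"parsed {path} as BGEO"

READERS = {
    "bgeo": read_bgeo,
    "bhclassic": read_bgeo,  # .bhclassic is handled by the same BGEO reader
}

def read_particles(path):
    ext = path.rsplit(".", 1)[-1].lower()
    if ext not in READERS:
        raise ValueError(f"no reader registered for .{ext}")
    return READERS[ext](path)

print(read_particles("sim.bhclassic"))  # → parsed sim.bhclassic as BGEO
```

The point of the table: renaming the file is unnecessary because both extensions map to the same reader function.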
  12. Particles from Houdini to Maya

    There is no need to rename it as bgeo; partio should be able to read bhclassic directly.
  13. Continue simulation from a certain frame

    You are mixing two completely different things here. Usually you do a sim and save only its result, without all the additional data the simulation itself needs (e.g. collision geometry, additional fields). So your final simulation result contains (in your case) some fields like temperature, velocity, density etc. These can be saved in the dopIO as bgeo, bgeo.sc or other formats. If you want to continue a simulation, you will have to save the .sim files out of your DOP network (as far as I know), which happens during simulation. You cannot convert the result of a simulation into a .sim file; you will have to do the simulation again.
  14. Continue simulation from a certain frame

    To continue the simulation at a certain frame you need a .sim file for that frame. Sim files can be exported from the DOP network; I prefer to use a file node in the DOP network to write them to disk. Then you can use the last .sim file as the initial state in the DOP network - of course you will have to set the sim start to your new start frame.
  15. render EXR colorspace

    If you mean the RenderView tab in Houdini, you can see what it does by simply showing the ColorCorrection bar. There you can see which gamma is used to display the image, applying the same gamma in Nuke should result in the same image. And if you did not change the output gamma value in your mantra render node, the rendered image is linear (gamma of 1.0). BTW.: You should not change the gamma before compositing, but at the end and only set the view in Nuke to the appropriate settings. The result is a better quality in most cases.