bonjarry

Members
  • Posts: 15
  • Joined
  • Last visited

Personal Information
  • Name: J

2,550 profile views

bonjarry's Achievements: Newbie (1/14)

Reputation: 0
  1. It's similar to the second illustration, where the two ends are being moved, so I need to reset the center dynamically.
  2. I have two packed RBDs connected by a constraint network. At the center of the constraint (halfway between the two objects), I have a third anchor point. I would like this point to stay at the center even as the distance between them changes (spring constraint). From the docs (http://www.sidefx.com/docs/houdini15.5/nodes/dop/constraintnetwork), I have set anchor_id=-1 to keep it relative to one of the objects, but this isn't exactly what I'm trying to do. I also tried using a SOP Solver in DOPs to reset the position of the "centroid" anchor each timestep, but this seemed to make the simulation unstable: the anchor was positioned correctly, but the constraints themselves were behaving oddly. I'm wondering whether manually setting the position of an anchor messes up the forces/attributes associated with that constraint? (This is a simplified case of a larger setup; I can put together an example hip file if that will make more sense.)
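For what it's worth, the centroid update itself is just a midpoint computation. Below is a minimal pure-Python sketch of what a SOP Solver would recompute each timestep; the anchor positions are made-up values, not taken from an actual sim:

```python
# Sketch: recompute a "centroid" anchor halfway between two moving anchors.
# Pure-Python stand-in for what a SOP Solver would do each timestep;
# the point positions below are hypothetical, not from a real simulation.

def midpoint(p_a, p_b):
    """Return the point halfway between two 3D positions."""
    return tuple((a + b) / 2.0 for a, b in zip(p_a, p_b))

# Two packed-RBD anchor positions at some timestep (made-up values):
anchor_a = (0.0, 2.0, 0.0)
anchor_b = (4.0, 0.0, 0.0)

# The centroid anchor is reset to this every timestep, so it tracks the
# center even as the spring constraint stretches:
centroid = midpoint(anchor_a, anchor_b)
print(centroid)  # (2.0, 1.0, 0.0)
```

Whether overwriting the anchor position this way also perturbs the constraint's stored state is exactly the open question in the post; this only shows the geometric part.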
  3. A quick update: the Houdini-Education license, though not explicitly used by the render machine, must be visible to the render machine (i.e. the render machine must fall within the allowable IP subnet mask). The Render-NonCommercial licenses were accessible to the render machine, but the Education license was not. Because of this, mantra assumed it was a regular Apprentice non-commercial license, hence the watermark and lower resolution.
  4. I am using the mantra command to render the IFD directly.
  5. The licenses we have are Non-Commercial Renderer, the educational version. While rendering, I watched "hkey" to see which licenses were used and both the local and remote renders used the same type of license. On the remote host, I use "hserver -S ..." to set the license server and it is able to locate it fine.
  6. I am trying to set up a rendering pipeline on some remote machines. We have Houdini Educational with Non-Commercial Rendering licenses. On the local network, we are able to render without restrictions and watermarks, and we were able to get a license to allow us to render on the other network. My question is: do the watermark and render-resolution restrictions come ONLY from the license being used, or is there something in the actual installation files? I did some test renders on a local machine and a remote machine using the SAME license. The local machine gave an HD image without a watermark; the remote machine gave a lo-res image with the "Houdini 3D Animation Tools" watermark. Any thoughts? Thanks
  7. I have a particle system with copied geometry inside an OTL, and I am using it in Houdini Engine for Maya. Is there a way to bake the simulation to a deforming mesh to push it down the pipeline? I tried using Keys->Bake Simulation and Duplicate Special, but neither of those captures the global particle movement (the particles are frozen in space). Any suggestions, on either the Houdini or Maya side, would be helpful. My main concern is that files that reference the asset will still depend on the Houdini OTL and sim; I want to avoid this. Thank you!
  8. We are beginning to introduce deep images into our pipeline. Unfortunately, we are currently on H12.1, so we are forced to use RAT instead of EXR. We have a small program to convert RAT to EXR just by translating the slices to layers in the EXR. My question has to do with how the deep images are "flattened" in compositing. (This may be better suited for a Nuke discussion, but I thought someone here might have some insight.) With volumes and particles, how does it combine the many samples down a single pixel? What do I need to set up in my shader? Is it based on alpha, opacity, etc.? I'm noticing some differences between a normally rendered EXR and the RAT->EXR deep images: I lose the wispy effect of millions of particles with varying alpha values. Does it have to do with low sampling rates for the depth? Any advice would be great! Thanks
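As a sanity check on the flattening question, here is a minimal sketch (plain Python, made-up sample values) of the usual technique: sort the deep samples in a pixel by depth, then composite them front-to-back with the alpha-weighted "over" operator. This is the general approach, not necessarily exactly what a given RAT->EXR converter or Nuke does internally:

```python
# Sketch of flattening deep samples down a single pixel: sort by depth,
# then composite front-to-back with the "over" operator, weighting each
# sample's contribution by its alpha. Sample values are made up.

def flatten_deep_pixel(samples):
    """samples: list of (depth, color, alpha); returns (color, alpha)."""
    color_out, alpha_out = 0.0, 0.0
    for depth, color, alpha in sorted(samples):  # near to far
        # "over": the transparency remaining in front (1 - alpha_out)
        # scales each deeper sample's contribution.
        color_out += (1.0 - alpha_out) * alpha * color
        alpha_out += (1.0 - alpha_out) * alpha
    return color_out, alpha_out

# Three semi-transparent particle samples at increasing depth:
pixel = [(1.0, 1.0, 0.5), (2.0, 0.5, 0.5), (3.0, 0.2, 0.5)]
print(flatten_deep_pixel(pixel))
```

The wispy-particle difference in the post is consistent with this: if the converter merges or undersamples the per-pixel depth samples before this step, many low-alpha contributions collapse into one, and the accumulated transparency is lost.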
  9. I would like to do a "simple" simulation of some hanging banners, but I would like them to already be torn (old and dirty). Not too much character interaction, just gravity and some wind forces. This will be my first venture into cloth in Houdini. I've seen ways of tearing cloth through simulation, but what would be the best way to make an already-torn piece of cloth (with ragged edges and holes)? Is this something that I could achieve solely through surfacing, or is it better to do some of it in modeling? Also, any general tips on cloth? Thanks!
  10. Hey crunch, thanks for the reply. Unfortunately, using hou.getenv() seems to show the same problem. From the Houdini docs, it says … Is there something that would cause this to not happen?
  11. Our pipeline relies on setting environment variables to move around and pass information to the software (Houdini, Maya, etc.), such as which show, shot, or asset is being worked on. With Maya and other software, the environment variables are passed through with no problem. Something seems to have happened to my particular setup so that, while working in Houdini, the environment variables are lost: they are correct in the terminal, but when I print them out in the Houdini Python shell, they are not set. It works correctly on other user accounts. Can anyone think of a setting or something somewhere that might cause this? Thanks!
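A quick way to narrow this down is to compare what the process actually inherited against what the pipeline should have set. Here is a small diagnostic sketch in plain Python (runnable from the Houdini Python shell); the names SHOW, SHOT, and ASSET are hypothetical placeholders for whatever variables your pipeline uses:

```python
# Diagnostic sketch: report which expected pipeline variables the current
# process actually inherited. SHOW/SHOT/ASSET are hypothetical names.

import os

def check_pipeline_env(expected_vars):
    """Return a dict mapping each expected variable to its value or None."""
    return {var: os.environ.get(var) for var in expected_vars}

# Run this in the terminal and again in the Houdini Python shell; any
# variable that flips to <NOT SET> was dropped somewhere in between
# (e.g. by a launcher script or a per-user houdini.env/456 setup file).
report = check_pipeline_env(["SHOW", "SHOT", "ASSET"])
for var, value in report.items():
    print(var, "=", value if value is not None else "<NOT SET>")
```

Since it works on other accounts, diffing this report between a working account and the broken one should localize which startup file is resetting the environment.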
  12. Hey guys, I have scattered points on arbitrary geometry and I am trying to connect them with a random walk: start at one point, pick a random next point from its neighbours, and so on. Currently, I am using a point cloud in VOPs to find the nearest-neighbour counts in a given radius. I am basing some of my thinking on this implementation of Dijkstra's algorithm. His basic process is: 1. create a hi-res grid (quads) and count each point's neighbours (using neighbourCount in VOPs); 2. use a forEach to iterate over the first N neighbours and save each neighbour number as an attribute (neighbour1, neighbour2, neighbour3, etc.); 3. use these neighbour ID attributes in a Python SOP later for easy lookup of nearby points. In it, he is using the fact that all points are connected to find a path on a surface. Unfortunately, I don't have the luxury of working with already-connected points (since I'm using the Scatter SOP), and I don't have much experience with the point cloud nodes. Is there a way to retrieve the nearby point IDs from unconnected points? I know the radius and number of points. Any suggestions to point me in the right direction would be great. Thanks!!
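For illustration, here is a brute-force pure-Python sketch of the walk over unconnected points: a radius lookup (a stand-in for what a point-cloud lookup such as pcopen/nearpoints does in Houdini) plus a random step to an unvisited neighbour. The 2D positions are made up:

```python
# Sketch: random walk over scattered, unconnected points. The radius
# query below is a brute-force stand-in for a point-cloud lookup
# (pcopen/nearpoints in VEX). Positions are made-up 2D points.

import math
import random

def neighbours_in_radius(points, idx, radius):
    """Return ids of points within `radius` of points[idx] (excluding itself)."""
    px, py = points[idx]
    return [i for i, (x, y) in enumerate(points)
            if i != idx and math.hypot(x - px, y - py) <= radius]

def random_walk(points, start, radius, steps, seed=0):
    """Walk from `start`, hopping to a random unvisited neighbour each step."""
    rng = random.Random(seed)
    path, visited = [start], {start}
    for _ in range(steps):
        options = [i for i in neighbours_in_radius(points, path[-1], radius)
                   if i not in visited]
        if not options:
            break  # dead end: no unvisited neighbour within the radius
        nxt = rng.choice(options)
        path.append(nxt)
        visited.add(nxt)
    return path

pts = [(0, 0), (1, 0), (2, 0), (1, 1), (3, 0)]
print(random_walk(pts, start=0, radius=1.5, steps=10))
```

The key point is that no connectivity is needed; the neighbourhood is defined purely by distance, which is exactly what the point-cloud nodes give you on scattered points.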
  13. I have a Houdini file (hipnc) with read/write permissions for all users (the ideal situation for our pipeline). However, after opening and saving this file, the permissions are changed so that only I have read/write permissions. Has anyone else experienced this and/or have any thoughts on how to make sure that the file is always accessible to all users that need it? Thank you!
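One workaround sketch, assuming the application rewrites the file on save: record the permission bits before saving and restore them afterwards. The do_save callback below is a hypothetical stand-in for the real save call (e.g. hou.hipFile.save()), demonstrated here with a dummy function:

```python
# Sketch: preserve a file's permission bits across a save that rewrites
# the file restrictively. `do_save` is a hypothetical stand-in for the
# actual save call (e.g. hou.hipFile.save()).

import os
import stat
import tempfile

def save_preserving_permissions(path, do_save):
    """Call do_save(path), then restore the file's original permission bits."""
    mode = stat.S_IMODE(os.stat(path).st_mode)  # e.g. 0o666: rw for everyone
    do_save(path)
    os.chmod(path, mode)  # put the original permissions back

# Usage sketch with a dummy save that tightens permissions, as described:
fd, path = tempfile.mkstemp(suffix=".hipnc")
os.close(fd)
os.chmod(path, 0o666)

def dummy_save(p):
    with open(p, "w") as f:
        f.write("hip data")
    os.chmod(p, 0o600)  # simulate the app leaving only the owner rw

save_preserving_permissions(path, dummy_save)
print(oct(stat.S_IMODE(os.stat(path).st_mode)))  # back to 0o666
os.remove(path)
```

The underlying cause is more likely the user's umask or the app saving via a temp-file-and-rename, so checking the umask on the affected account is worth doing before scripting around it.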
  14. Thanks guys! I had actually seen that before and tried to decipher the hip file. I'm having trouble understanding the VOP SOP and how that works with the rendering. It seems like they have a pretty broad range of capabilities, so I am having trouble finding the information that pertains to what I want to do. Do you have any tips for how to get started? Thanks again!
  15. I am just beginning to get into rendering in Houdini. I am comfortable doing simulations and working with POP networks (and have programmed various particle systems in the past in C++). The effect I am working on is an exploding electrical panel (similar to the video below). I can play with the motion of the particles all day long, but when it comes to actually rendering them, I don't know where to start. I need some glow and some short trails (but the particles will fade out). These will then be composited into a scene rendered from Maya. If you could just give me a few topics to look into (tutorials or example files would also be helpful), I would much appreciate it! Thanks! (starts exploding around 3:40) http://www.youtube.com/watch?v=Q-29URy7UOE&feature=youtu.be&t=3m39s