
Candide

Members
  • Content count
    9
  • Community Reputation
    0 Neutral

About Candide

  • Rank
    Peon

Personal Information

  • Name
    Matthew
  • Location
    London
  1. Dandelion Wire Simulation

    Hey folks, I am trying to create a realistic dandelion simulation and beginning to bang my head against the wall/desk. I'd like to end up with this: I have attached my (rather tortured) scene attempt. My method is to bring all the SOP geometry in as one big wire sim and adjust gluetoanimation and kangular on the fluff/stem. There are three movements I'm attempting to capture:
    1. the twitching of the fluff;
    2. the bending of the stem while the prongs are still attached;
    3. the movement of the prongs/fluff once they are broken off.
    I am achieving the first movement by driving the wires with randomly activated wind (otherwise the wires settle to stillness), but I don't want this wind to affect the prongs. Basically I'm all tangled up, and if anyone has any ideas it would help me out a lot. Thanks all. dandelion_Test.hip
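The "random activation" idea above can be sketched outside Houdini as a per-wire noise gate: each wire gets a deterministic per-frame seed, and wind is applied only when that wire's gate fires, so the fluff keeps twitching instead of settling. This is a plain-Python illustration of the logic, not Houdini's actual Wind DOP; the names `wind_impulse`, `gate_probability`, and `wind_strength` are invented for the sketch.

```python
import random

def wind_impulse(wire_id, frame, wind_strength=1.0, gate_probability=0.3):
    """Return the wind force applied to one wire on one frame.

    Each (wire, frame) pair gets a deterministic RNG seed, so the
    activation pattern looks random but is repeatable across runs,
    which matters for a simulation that must re-cook identically.
    """
    rng = random.Random(wire_id * 100003 + frame)
    if rng.random() < gate_probability:   # gate fires: wind is on this frame
        return wind_strength * rng.uniform(0.5, 1.0)
    return 0.0                            # gate closed: wire is left to settle

def forces_for(wire_id, frame, is_prong):
    # Prongs are simply excluded from the gated wind, e.g. via a group.
    return 0.0 if is_prong else wind_impulse(wire_id, frame)
```

Excluding the prongs by group (the `is_prong` flag here) is the same idea as pointing the Wind DOP at a group that contains only the fluff wires.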
  2. Multiple State Changes In Dops

    sadhu, that is great. Thanks a lot, great stuff. I really appreciate it.
  3. Multiple State Changes In Dops

    A scene file would really help; this is a little over my head. I've been told to use a Switch Solver with an animated switchValue, which makes sense for cloth-to-RBD sims, but I don't understand how to fully represent an object as a FLIP fluid in DOPs (or how to change DOP parms the same way I can change SOP parms, so that things like collisions can drive the switchValue, sigh). Any further help would be massively appreciated.
  4. Multiple State Changes In Dops

    Hi there, I am attempting to create a scene that procedurally changes an object's physical state based on emergent events (impacts, for example). I'm not too experienced with the DOPs context, but I assume the SOP Solver is integral. Here is a dummy scene (hardly worth posting) of a sphere about to hit a ground plane. How could I set up a network that turns the sphere into a FLIP fluid when it hits the ground, and continues to evaluate it as a FLIP fluid? Ideally I would hope to extend this to various other dynamic solvers (smoke, for example). solid_to_fluid_test.hipnc
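The switch-on-impact logic being asked about can be reduced to a tiny state machine: the object starts under one solver, and the first impact permanently flips it to another, mirroring an animated switchValue that is set once and never reset. This is a toy plain-Python sketch of that control flow, not Houdini's Switch Solver API; the class and method names are invented.

```python
class SimObject:
    """Toy stand-in for a DOP object whose solver can be switched.

    state 0 = rigid body, state 1 = FLIP fluid.  Once an impact flips
    the state it stays flipped, like a switchValue that latches.
    """
    def __init__(self):
        self.solver_state = 0   # start as a rigid body

    def step(self, hit_ground):
        if self.solver_state == 0 and hit_ground:
            self.solver_state = 1            # impact: become a fluid
        return "RBD" if self.solver_state == 0 else "FLIP"

obj = SimObject()
states = [obj.step(hit_ground=(frame == 3)) for frame in range(6)]
print(states)   # ['RBD', 'RBD', 'RBD', 'FLIP', 'FLIP', 'FLIP']
```

The key point is that the switch condition is evaluated per timestep from sim data (the collision), not keyframed by hand, which is what makes the state change "emergent".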
  5. Hi there, I have attached a simple scene to illustrate my problem. I am attempting to write a file with a filename copied from a parameter that uses Python to generate the filename. In the attached scene the parameter produces '$HIP/pythonTest.bgeo'. When I paste the relative reference using `chs("../parm")`, I get an error on the File node saying "unable to read file '$HIP/pythonTest.bgeo'". If I manually type this in, or copy it from a parameter that I typed it into, it works fine, so I know it has to do with resolving the '$HIP...' filename into a full path, but I don't know what command I can use to do this. Thanks a lot. pythonFileTest.hip
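The error is consistent with the string '$HIP/pythonTest.bgeo' being used literally, without variable expansion. Inside Houdini the usual fix is to expand it (for example with `hou.expandString()` in Python, or Hscript's `expand()`). The same idea can be shown with only the standard library, since `$HIP` is an ordinary environment variable in a Houdini session; the path below is made up for the demonstration.

```python
import os

# In a real Houdini session HIP is set for you; here we fake it.
os.environ["HIP"] = "/jobs/dandelion/houdini"

raw = "$HIP/pythonTest.bgeo"           # what the parameter returns, verbatim
resolved = os.path.expandvars(raw)     # expand $HIP into a full path

print(resolved)   # /jobs/dandelion/houdini/pythonTest.bgeo
```

The lesson carries over directly: a channel reference hands back the raw string, so something has to perform the `$HIP` expansion before the file is opened.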
  6. Hi folks, I have a question regarding texture baking. I have attached a simplified version of the scene I'm working on, which simply exports a baked texture (constant purple) attached to a sphere (the texture looks like purpleTexture). When I re-apply this texture I am getting little black lines along the UV seams (fig2). Another poster asked a similar question and was advised to turn off all Gaussian filters, which I'm pretty sure I have done. I'm pretty sure the problem is somewhere in the mantra bakeTex node, but I don't know mantra well enough to do much more than arbitrarily set values and hope the lines go away. Any help would be appreciated. blackLines.hip
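Black lines along UV seams usually mean empty texels just outside the UV islands are being pulled in by texture filtering at render time. One standard remedy, independent of any particular baker, is to dilate (pad) island colours outward into the surrounding empty texels. A minimal plain-Python sketch of one dilation pass over a tiny texture grid, assuming `None` marks an unbaked texel; the function name is invented:

```python
def dilate(texture):
    """One padding pass: fill each empty texel (None) with the colour of
    any filled 4-neighbour, so filtering at island borders samples the
    island colour instead of black."""
    h, w = len(texture), len(texture[0])
    out = [row[:] for row in texture]
    for y in range(h):
        for x in range(w):
            if texture[y][x] is not None:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and texture[ny][nx] is not None:
                    out[y][x] = texture[ny][nx]
                    break
    return out

# A tiny 3x3 texture: one purple texel surrounded by unbaked (None) texels.
P = (0.5, 0.0, 0.5)
tex = [[None, None, None],
       [None, P,    None],
       [None, None, None]]
padded = dilate(tex)
# After one pass, the four texels orthogonally adjacent to the island are purple.
```

In practice bakers expose this as an "edge padding" or "dilation" option measured in pixels; running the pass several times widens the border.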
  7. world coordinates from illuminance loop

    Thanks for your reply. The illuminance loop is in the surface-shader VEX SHOP, which is defined in the OTL. The problem I have is basically understanding how to extract coordinates from the light shader (called by the illuminance loop) that are in the same coordinate system as the surface position defined in the surface shader. I want to use the world position of the light being called, from within the surface shader's illuminance loop that is calling it. It seems simple enough, but everything I try fails. Perhaps you can't post material from work, but if you have any examples that would be great. Another solution to my problem would be to attach a unique identifier to an instanced light and retrieve THAT from within the illuminance loop.
  8. Hi there, this is kind of a complicated question, but hopefully someone can help. I am trying to project onto a surface using some vector maths, but I am completely confused as to how to get world coordinates out of the illuminance loop. The attached scene has two different shaders that I feel illustrate the point; change the material on texturedGeo. The instance point geo creates a single point that acts as the 'projector', gives it a normal that is its projecting direction, and writes it out to a point cloud ("scatterCloud.pc", which odforce won't let me upload, so you'll have to make an empty one yourself if you're kind enough to help).

    When shader = PCpreview: the shader iterates through the points in scatterCloud and colours the geometry if the direction of projection and the vector connecting the projector to the shaded point are within a certain search angle (in this case 45 degrees). This produces a render as shown in PCpreview.png from cam1. This is all fine so far.

    When shader = mjw_surface1: things start to get weird. This should be a very simple shader. It runs an illuminance loop and colours the surface blue if it determines the shaded point should be 'lit' by the loop, in other words if the vector N from the shaded point and L from the light shader are within a certain angle (again 45 degrees). The light shader (mjw_light1) is where all the confusion really lies: I do not understand how to determine the light position (P in a light shader) or the surface position (Ps in a light shader) in world space in order to return the correct L vector. The closest I can get is to use the P and Ps values as they are (which I believe are in camera space) and do P - Ps, which creates a render as shown in mjw_surface.png from cam1. The reason I need to use the illuminance loop is that I need to detect intersection and only shade the surface points that are the first to be intersected.

    If anyone can explain to me how I could get the world-space coordinates from the light shader (wo_space(P) obviously doesn't work), OR how I could detect intersection in a surface shader, I would be HUGELY appreciative, as I have been working on this for ages. illuminanceLoopQuestion.hip mjw_otl.otl
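The geometric test at the heart of both shaders described above (is the shaded point inside the projector's 45-degree cone?) only works if both vectors live in the same space, which is exactly what goes wrong when a light shader's camera-space P/Ps is mixed with world-space data. A plain-Python sketch of the test itself, with everything deliberately kept in one space; the function names are invented and this is not VEX:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    m = math.sqrt(dot(v, v))
    return tuple(x / m for x in v)

def in_cone(projector_pos, projector_dir, point, half_angle_deg=45.0):
    """True if `point` lies inside the projector's cone.

    Both positions MUST be expressed in the same space (e.g. world);
    comparing a camera-space L against a world-space N silently gives
    wrong angles, which matches the symptoms described above.
    """
    to_point = normalize(tuple(p - q for p, q in zip(point, projector_pos)))
    cos_angle = dot(normalize(projector_dir), to_point)
    return cos_angle >= math.cos(math.radians(half_angle_deg))

# Projector at the origin looking down +Z: a point straight ahead is inside
# the cone, a point 90 degrees off to the side is not.
print(in_cone((0, 0, 0), (0, 0, 1), (0, 0, 5)))   # True
print(in_cone((0, 0, 0), (0, 0, 1), (5, 0, 0)))   # False
```

Comparing the cosine against a threshold avoids calling `acos` per shaded point, which is also how angle tests are typically written in shaders.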
  9. Hello friendly Houdini community, I would like to know how to procedurally write point clouds from a VOP SOP. I'm scattering points over a surface and I would like to create a point cloud based on these scattered points. The point cloud will record the ID and position of each point, so that a shader can sample the nearest X points and do various fun stuff after that. Currently the pic/hip file attached creates a scatterCloud.pc file of 0 KB (with no information). (The other approach in the attached hip file reads a point cloud from an input to the VOP SOP; I hope this isn't the only method of writing point clouds on the fly, because I guess that would mean creating a bunch of geometry attributes to be read by the shader, which seems less elegant, and I'm confused about how to do it, as you can see.) Anyway, I'd appreciate any advice on procedurally writing point clouds, or help with my muddled hip file. Thanks a lot, Mat. pcwritevopsop.hip
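Conceptually, a point cloud file like scatterCloud.pc just stores per-point channels (here an ID and a position) that a shader later queries for the nearest X points. A small plain-Python stand-in for that round trip, writing the records and doing the nearest-X lookup by brute force; the in-memory "file format" and function names are invented for illustration, not the real .pc format:

```python
def write_cloud(points):
    """Build an in-memory 'point cloud': a list of (id, position) records,
    the two channels the shader will want to query."""
    return [(i, tuple(p)) for i, p in enumerate(points)]

def nearest(cloud, query, k):
    """Return the ids of the k points closest to `query` (brute force;
    real point-cloud lookups use a spatial structure instead)."""
    def dist2(pos):
        return sum((a - b) ** 2 for a, b in zip(pos, query))
    ranked = sorted(cloud, key=lambda rec: dist2(rec[1]))
    return [pid for pid, _ in ranked[:k]]

cloud = write_cloud([(0, 0, 0), (1, 0, 0), (5, 5, 5), (0.5, 0, 0)])
print(nearest(cloud, (0, 0, 0), 2))   # [0, 3] -- the two closest ids
```

A 0 KB output file is the symptom of the write step never receiving any records, so it is worth confirming the write runs once per scattered point rather than once total.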