
JH12

Members
  • Content count: 87
  • Donations: 0.00 CAD
  • Joined
  • Last visited

Community Reputation: 5 Neutral

About JH12
  • Rank: Peon

Personal Information
  • Name: JH
  1. Probably a fairly simple thing to do, but I'm struggling to get there. How would I take a simple piece of geo, say a cube, and turn it into a wireframe look that mostly maintains the overall shape of the geo but has a wavy/noisy effect on the lines themselves? For rendering I would like to do it in Redshift, but I'm open to Mantra if it's simpler. I'm thinking this is a geometry thing rather than a shader effect..? Simple sketch attached. Thanks for the advice
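    For reference, a rough Python sketch of the kind of SOP network described above; the node names are real, but the choice of Convert Line + Resample + Mountain + PolyWire is just one hedged way to get wavy wires, with parameters left at defaults:

        # Sketch: cube -> edges as curves -> resample -> noise -> tubes
        import hou

        geo = hou.node("/obj").createNode("geo", "noisy_wireframe")
        box = geo.createNode("box")
        edges = geo.createNode("convertline")   # each polygon edge becomes a line primitive
        edges.setFirstInput(box)
        resample = geo.createNode("resample")   # adds points along each edge so noise can bend it
        resample.setFirstInput(edges)
        noise = geo.createNode("mountain")      # displaces the resampled points; tune Height/Element Size
        noise.setFirstInput(resample)
        wire = geo.createNode("polywire")       # gives the wavy curves renderable thickness
        wire.setFirstInput(noise)
        wire.setDisplayFlag(True)
        wire.setRenderFlag(True)
        geo.layoutChildren()

    The PolyWire output renders as ordinary geometry in Redshift or Mantra, which matches the hunch that this is a geometry problem rather than a shader one.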
  2. I find myself often in a situation, particularly in SOPS, where I want to copy a node from one area of my graph and paste it into another. My gripe is that if I copy the original node, and then paste it to its new position, the copied node maintains its wiring to its original parent (which is almost never what I want). I want to be able to copy it, select its new parent, and then have it wire itself in underneath its new parent when I hit paste. Is that possible? It seems to me like very logical functionality, so I'm wondering if I'm missing a shortcut.
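    Not a built-in shortcut as far as I know, but here is a hedged shelf-tool sketch that does roughly this (select the node to copy first, then the intended new parent; the behaviour is an assumption about what is wanted, not existing functionality):

        # Copy the first selected node and wire the copy under the second selected node
        import hou

        sel = hou.selectedNodes()
        if len(sel) == 2:
            source, new_parent = sel
            copies = hou.copyNodesTo([source], new_parent.parent())  # paste into the new parent's network
            copy = copies[0]
            copy.setFirstInput(new_parent)      # rewire under the chosen parent instead of the original one
            copy.moveToGoodPosition()
            copy.setSelected(True, clear_all_selected=True)
        else:
            hou.ui.displayMessage("Select the node to copy, then the new parent.")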
  3. How would you cap this pipe?

    Fantastic, thanks so much, very simple indeed.
  4. How would you cap this pipe?

    Simple modelling question: How would you most efficiently cap the ring of this pipe that I have highlighted in yellow? Thanks for the advice.
  5. Hi, I have some geo with point animation (such as animating the surface with a Point VOP, or alternatively using a Mountain SOP). I want to be able to export this to an FBX (or other format) for playback in Unity. When I export the FBX, the animation imports fine back into Houdini, but Unity is not showing any animation, and from my research so far it seems that I cannot export point animation to Unity, only object-level stuff, which seems quite limiting. (See Oliver's post here: https://www.sidefx.com/forum/topic/76189/) Does anybody know what the best way is to do this? Thanks
  6. I've got a list of coordinates in a .txt file and I'd like to create a loop which iterates over these values and pipes them into a parameter, so I can watch that parameter animate in real time (as though I was middle-mouse dragging over a parameter, for example). So far, reading the data in a for loop and applying it to the parameter on each iteration doesn't work: the parameter updates once when the script runs, but won't continuously update in the viewport as new values are iterated over and assigned. Is there a way to achieve this? Using Python 3, preferably. Thanks for the tips.
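    One approach that sidesteps the blocking loop (the viewport can't redraw while a Python for loop holds the main thread) is to make the parameter time-dependent instead: put a Python expression on the parameter that reads the line matching the current frame, then simply play the timeline. A rough sketch, assuming one float per line in a hypothetical coords.txt next to the hip file:

        # Python expression on the parameter (set the parameter's language to Python)
        import hou

        path = hou.expandString("$HIP/coords.txt")   # coords.txt is an assumed filename
        with open(path) as f:
            values = f.read().splitlines()
        idx = max(min(int(hou.frame()) - 1, len(values) - 1), 0)
        return float(values[idx])

    Re-reading the file on every cook is wasteful but fine for a test; caching the parsed values (for example in hou.session) would avoid that.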
  7. Thank you Rence, the centroid function looks like what I need. Cheers
  8. I have an animated model, Model A. I want to set its pivot point and then return the position of that point, using that position to drive some transform parameters of another object, Model B. So, for example, as Model A rises vertically I want Model B to scale wider. How would I most easily achieve this? Thanks for the advice
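    Following on from the centroid suggestion, a minimal Python-expression sketch for one of Model B's scale parameters; the object paths, the OUT null and the 0.5 multiplier are all assumptions:

        # Python expression on Model B's scale: widen as Model A's geometry rises in Y
        import hou

        geo = hou.node("/obj/ModelA/OUT").geometry()   # assumed null at the end of Model A's SOP chain
        height = geo.boundingBox().center()[1]         # bounding-box centre as a cheap stand-in for the centroid
        return 1.0 + max(height, 0.0) * 0.5

    If Model B's parameter is left in hscript mode, the centroid() expression function mentioned above does the same job without Python.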
  9. Hi, probably a fairly simple answer to this one... I've scattered my source points onto a grid and fractured an identical grid using a Voronoi Fracture. I have a Point VOP which animates those source points. What I'd like to do is have each of the fractured Voronoi pieces follow the animation of its associated fracture source point. Hip is attached. How would I go about this? Thanks for the help. Voronoi_Points.hiplc EDIT: I achieved what I needed using the Assemble SOP and applying my Point VOP directly to the packed geo. Though I am still curious to know how, without an Assemble SOP, you would assign a group of @name:piece* primitives to follow the fracture points?
  10. Yes! That's just what I wanted; I just had no idea how. To stop the points changing every frame, I changed your "else" so it sets the unused points to Alpha = 0 rather than deleting them, keeping the point numbers on the grid constant. So now I can do things like blend shapes or FX which reference the point IDs. Thanks mate, much appreciated!
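    For anyone reading along, a hedged sketch of that tweak, written here as a Python SOP rather than whatever wrangle the original used; the "used" marker attribute is a hypothetical stand-in for the original condition:

        # Python SOP: fade unused points via Alpha instead of deleting them, so point numbers stay stable
        node = hou.pwd()
        geo = node.geometry()

        if geo.findPointAttrib("Alpha") is None:
            geo.addAttrib(hou.attribType.Point, "Alpha", 1.0)

        for pt in geo.points():
            if pt.attribValue("used") == 0:   # hypothetical attribute set upstream where the delete used to happen
                pt.setAttribValue("Alpha", 0.0)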
  11. Not entirely sure if you've already done this, but have you tried caching the simmed points out and then applying your instances to that, rather than re-running your sim with the new objects?
  12. Sure, here is a sample; it's a very decimated example to make it easier to work with. The .hip is nothing, it's just a File SOP for the supplied BGEO. Thanks for taking a look at this, it's greatly appreciated. Azure_Test.zip
  13. I'm revisiting a problem which I couldn't solve a while ago. As very much a part-time Houdini user I'm struggling to work this out, and I'm hoping someone may be able to give me a push in the right direction. I've simplified my question in the hope that it is more approachable. Basically, I'm dealing with a pretty dirty mesh of an actor that is coming out of a capture from a depth sensor (Azure Kinect). While sequential frames are fairly similar in shape, the topology is very different frame to frame: specifically, the point IDs change almost randomly across every frame, as does the number of points. What I want to do is see if there is a way to retopo or reproject this erratic mesh sequence onto a new surface while maintaining consistent point IDs. I've attached a simplified diagram to try to explain my problem and what I'm hoping to achieve. Thanks very much for any assistance with this, much appreciated.
  14. I have a node called wireframe, which takes RGBA values for a wire colour (parameters: wireColourR, wireColourG, etc.). If I wanted to wire in the Cd values from points on the geometry, is there a way of adding this Wire Colour parameter as an input on the wireframe node? Thanks for the advice
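    One way to at least test the hookup is a Python expression on wireColourR/G/B that samples Cd from the node's first input; a rough sketch, assuming the wireframe node is a SOP and its input actually carries Cd (this only reads a single point, so true per-wire colour would still need the parameter exposed differently):

        # Python expression on wireColourR (use index 1 for G, 2 for B)
        import hou

        geo = hou.pwd().inputs()[0].geometry()          # geometry feeding the wireframe node
        return geo.points()[0].attribValue("Cd")[0]     # Cd of point 0 as a stand-in for a per-wire hookup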
  15. Yeah, MPlay displays a similar alpha, but with all channels on it creates an image like in my post above, whereas AE/Premiere do not; the image has far less detail. Maybe this could also be a Redshift thing, I don't know yet. By changing my AE settings to the following I can get the image to look similar, but whatever I export still seems to have the same alpha issue once in Premiere. Perhaps this is starting to be more of an issue for an Adobe forum, but I thought there would be some people here with a simple answer, given this must be a very common workflow.