Everything posted by AntoineSfx

  1. Random spheres at render time

    Thanks, I found the offending points. Also, I just realized that PolyWire outputs a dirty mesh (unconnected points, not watertight) with the default parameters for this geometry, but it can be fixed by increasing the segments and divisions, plus a Fuse after that if problems remain.
  2. Not clear when this happens, but when a brickered polygon is used as the input of a PolyWire, I get weird results: spheres are rendered at some points of the polygon. randomSpheres.hipnc
  3. setpointattrib after addpoint in detail wrangle

    So I got this working; here is the fix: setpointattrib(0, "order", 0, 0., "set"); The type of the order attribute is determined by the type of the fourth argument the first time it is called for "order" (i.e. when the attribute is created). Here I added a decimal point after the 0 to force the parser to see it as a float. After that the attribute is created as a float and everything makes sense. Alternatively, passing a float variable as the fourth argument works too. Edit:

        float f = 0;
        setpointattrib(0, "order", 0, f, "set");
        setpointattrib(0, "order", 1, 1.0, "set");  <<<--- 1.0 works as expected

        float f = 0;
        setpointattrib(0, "order", 0, f, "set");
        setpointattrib(0, "order", 1, 1, "set");    <<<--- with 1, point #1 has order set to 0; when the argument is an int, any integer really
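    A minimal sketch of the corrected detail wrangle for the question below, assuming the same two-point input; the only change from the original attempt is passing floats to the first setpointattrib calls so "order" is created with the right type:

        // Detail wrangle: "order" is created as a float attribute because the
        // very first setpointattrib call passes a float value (0.0).
        setpointattrib(0, "order", 0, 0.0, "set");
        setpointattrib(0, "order", 1, 10.0, "set");

        // Add 9 points between point 0 and point 1, tagging each new point
        // with the (float) interpolation parameter at which it was created.
        vector p0 = point(0, "P", 0);
        vector p1 = point(0, "P", 1);
        for (int i = 1; i < 10; i++)
        {
            float f = i / 10.0;
            int a = addpoint(0, lerp(p0, p1, f));
            setpointattrib(0, "order", a, f, "set");
        }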
  4. Given two points as input, why doesn't this work in a point wrangle / detail wrangle:

        setpointattrib(0, "order", 0, 0, "set");
        setpointattrib(0, "order", 1, 10, "set");
        for (int i = 1; i < 10; i++)
        {
            float f = (float)i / 10;
            int a = addpoint(0, lerp(point(0, "P", 0), point(0, "P", 1), f));
            setpointattrib(0, "order", a, f, "set");
        }

     I'm trying to add points and keep track of when each one was added in the loop. The setpointattrib in the loop has no effect. I assume that I can't setpointattrib until after the whole node is cooked?
  5. Smooth direction to surface?

    It's interesting; what is being computed is unclear. Do you have access to that software, and can you tell us the name of that operator? Maybe there is some online documentation that explains what is actually computed. Or could you post a few other test cases for this operator? Tricky ones, for example (non-convex curves, extreme cases...). I'm not sure whether the target curve (polygon) is resampled in order to add more points as it gets closer to the circle, or maybe it's a side effect of casting more rays onto a segment that is closer to a section of the circle. Also, the normals don't seem to be involved. The rays also don't look like they're aiming at an associated point (computed from the straight skeleton or something), as you can see almost parallel rays in the top left quadrant. My guess is that they are minimizing the sum of several functions, which could include the length of the ray, the difference of the ray angle on the target between consecutive points (or some kernel function), and smooth resampling of the target. I tried minimizing the distance to the target, but it doesn't feel the same; a minimal sketch of that attempt is below.
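    Something like this is what I mean by "minimize distance to target": a point wrangle on the source curve, with the target curve wired into the second input (the dir attribute name is just for illustration):

        // For each point of the source curve, find the closest location on the
        // target curve and store the direction towards it.
        int prim;
        vector uv;
        xyzdist(1, v@P, prim, uv);                 // closest primitive + parametric uv on the target
        vector target = primuv(1, "P", prim, uv);  // position of that closest location
        v@dir = normalize(target - v@P);           // direction from this point to the target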
  6. Polyexpand failing in a for loop

    I have this simple setup: a few two-point segments, which I want to PolyExpand individually. I'm not trying to PolyExpand all of the segments as a whole. It fails when I iterate over the segments in a For-Each Primitive loop, with "1 flip event ignored because of infinite loop detection during skeleton computation". Also, the cooking time is insanely high, even though in the end the geometry looks correct. Can you tell me what's happening here? It looks like there is some feedback loop which leads to an exponential increase in cook time with the primitive count, but I can't find an explanation. This is the default Block Begin setup with For-Each Primitive. odforce.polyexpand.hipnc
  7. Yes, no need to mess with nested function calls. I like it better because it seems to refer to the point being processed, whereas using the variable iteration kinda defeats the purpose of having an iterator. Thanks for the trick, hoping it's not going to come back to haunt me later.
  8. Using a Block Begin SOP / for loop, I'm trying to generate one Font SOP whose text is read from a string attribute set on each point. I can't get the syntax right for the Text field. It should be something like a reference to the node containing the string attribute, an attribute name, and an index, per the Houdini documentation:

        point(surface_node, point_number, attribute, index)

     So I thought this should work:

        point("../pointwrangle1/", detail("../foreach_begin1_metadata1/", "iteration", 0), "letter", 0)

     It doesn't reject it, but instead of returning the text in the string attribute "letter", it returns the integer value of iteration. What's happening here? odforce.text.hipnc
  9. Thanks. It's not clear from the documentation that strings and numbers are treated differently.
  10. I'm trying to duplicate some geometry using a Block Begin, then referring to the iteration metadata. (There is one dopnet per instance, before the for loop. The result of the for loop is not the input of one dopnet.) After the geometry passes through a dopnet, the detail value is set to the correct value on the first frame, but after that, the value is set again using the last known value of iteration (i.e. numiterations - 1). There is something I don't understand about the way objects are created in this setup.
  11. This is expressions vs VEX.. Not a Houdini historian here, so I'd rather point you to this thread on sidefx.com: https://www.sidefx.com/forum/topic/34896/ But the thing is, you're not going to drive the offset.x of the Mountain SOP *per primitive* like that. I'm not sure what you expect it to do. The closest thing I can think of is to append a PolyExtrude after your primitive wrangle and use that attribute to scale the extrude distance, via Local Control / Distance Scale. Or, if you want to go that way, I asked something related a few weeks ago. If you reassemble (merge in the for loop) those primitives, your result won't be continuous any more, because from one adjacent prim to the next you not only vary x and y in the mountain itself, you also vary the "offset" parameter of the mountain, which really makes it a different function. Maybe you could describe what you want to do in the first place. #xyproblem
  12. Normal Direction

    Also, be aware that the tangent at a point of a polygon is really undefined, because of differential calculus: the incoming and outgoing segments give two different limit tangents. So it's up to you to define what you want at the point: the average of the two limit tangents, a weighted average, the bisector, and so on (a sketch of the first option is below). A curve is really a sequence of points which are a weighted average of the points of the hull. You can transfer the tangent of the curve obtained by building an interpolating curve on your polygon, which is basically a way of averaging the tangent over a number of points, depending on how the curve itself is built (2 choices in Resample, but you can make your own if you want) and on which kernel parameters you use to transfer the tangent back (in Attribute Transfer).
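    A minimal sketch of the first option (average of the two limit tangents), as a point wrangle on a single open polyline whose point numbers follow the curve order (an assumption; true for a freshly resampled curve). Writing into v@tangentu is just to match the Resample naming convention:

        // Incoming and outgoing segment directions at this point;
        // endpoints only have one of the two.
        vector t_in = {0, 0, 0};
        vector t_out = {0, 0, 0};
        if (@ptnum > 0)
        {
            vector prev = point(0, "P", @ptnum - 1);
            t_in = normalize(v@P - prev);
        }
        if (@ptnum < npoints(0) - 1)
        {
            vector next = point(0, "P", @ptnum + 1);
            t_out = normalize(next - v@P);
        }

        // Average of the two limit tangents (endpoints keep their single one).
        v@tangentu = normalize(t_in + t_out);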
  13. HD print screen?

    I assume you meant OpenGL, as in a quick and dirty way to render? I'm not sure what the OP meant with "what you can see in the screen".
  14. HD print screen?

    I don't know what you're trying to achieve, but if you want to extract vector graphics from the projected view of an object, you can certainly do it. If the program that will open the file accepts any of the formats you have under Save / Geometry at the geometry level, then you can simply do that. Otherwise, you can cull the backfaces with a Delete SOP (Normal / Backface from, pointing it to an orthographic camera), project the object onto a grid with a Ray SOP, run a Convert Line SOP, then write to a file in a wrangle using a format that is friendly to your application, depending on your skills. You can also go from STL to PS or anything really; STL has an ASCII variant, so it shouldn't be hard to convert to whatever format you need.
  15. If you want to do this with L-systems specifically, you have to write a set of rules that would allow you to input an initial growth direction (derived from the normal on the target surface)... An L-system can read its own variables b, c, d, so you can rotate the initial segment to give the initial growth direction. But then you have to figure out a set of rules that will implicitly grow the trunk towards global Y+, so that it eventually becomes vertical, but not too soon, so that it doesn't hit your sphere. You also have tropism with T(g), which will allow you to grow the secondary branches towards global Y-. You can also hard-code the premise in a for loop (the premise parameter can reference a string you build in a for loop). It seems doable, but I've seen this recently: it would be a lot easier to compute the location of a few points from one point in space (the initial growth point), one normal, and a bounding sphere which you can't enter.
  16. Calculate length of adjacent pieces sop ?

    maybe something like:
    1/ in Connect Adjacent Pieces, enable the rest length attribute (restlength)
    2/ add an Attribute Promote from primitive to detail, set Sum as the promotion method and restlength as the original name
    You now have a detail attribute with the sum of the lengths of the polygons. A VEX alternative for step 2 is sketched below.
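    A minimal detail-wrangle equivalent of that Attribute Promote step, assuming the rest lengths end up in a primitive float attribute named restlength (total_restlength is just an illustrative name):

        // Sum the per-primitive rest lengths into one detail attribute.
        float total = 0;
        for (int i = 0; i < nprimitives(0); i++)
        {
            float rl = prim(0, "restlength", i);
            total += rl;
        }
        f@total_restlength = total;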
  17. Duplicating Digital Asset with parameters variation...

    Thanks. I think I've read that stamping is going to be deprecated, but this could be a solution for now. However, when I add a dopnet in the geo network that references something being stamped, it no longer works; I get the default stamp value, I believe, once it enters the dopnet. Is duplicating dopnets a good idea anyway? I could certainly replicate the geometry before the dopnet and use it as a whole, but it made more sense for what I had in mind to have independent dopnets, one in each instance. In this example, I expected each instance to be moved along the y axis as well, but that part is missing. The static part (the font object) works as expected. Why is that? minimaldopcopies.hipnc
  18. Enable [x] Tangent Attribute (tangentu) in node "resample2". You can store the cross product in @N if you want, but by having @N / @tangentu / @tangentv you get an orthogonal frame (containing the normal) for each point, which is a strong foundation for anything downstream IMO (see the sketch below). Also see PolyFrame, which generates N, tangentu and tangentv; no need to do the cross products yourself.
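    A minimal sketch of that last step in a point wrangle, assuming @N and @tangentu already exist on the points (e.g. from a Normal SOP and the Resample above):

        // Third axis of the per-point frame, orthogonal to both N and tangentu.
        v@tangentv = normalize(cross(v@N, v@tangentu));
        // @N, @tangentu and @tangentv now give a per-point frame (fully
        // orthogonal as long as N is perpendicular to the curve tangent).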
  19. Can you provide the hip file? Also, be aware that the vector you're looking for, after the wrangle I provided, is @tangentv, not @N.
  20. Enable tangentu in a Resample, then in a Point Wrangle: v@tangentv = cross(v@N, v@tangentu); [edit: v@tangentv =, not p@tangentv =]
  21. Duplicating Digital Asset with parameters variation...

    Can you explain what you / he meant back then? I'm trying to make variations of a digital asset, but the parameters are not at the correct level. For example, a text parameter for a digital asset lives at the scene level, but once it's brought in with an Object Merge and copied, I can't find a way to access that parameter to make the variations.
  22. So I have one digital asset, which has one parameter (a string) and contains a DOP network. The idea would be to replicate this asset on a grid and vary the string parameter. However, I don't understand how to access that parameter, as it exists at the scene level. When I import the DA at geo level with an Object Merge, I don't have the interface for that asset, so I can't set that text value. Can I somehow create a parameter on the Object Merge at geo level that would be linked to the parameter of the digital asset at scene level?
  23. How much RAM is too much ?

    Thanks for the answers. I will take this into consideration for my next upgrade.
  24. How much RAM is too much ?

    Under Linux, is there an upper limit to the amount of RAM Houdini can handle right now? I have 64 GB right now, but I'm considering an upgrade. I assume I will hit another bottleneck before RAM really starts to be a problem. Can you give me a real life example, for an indie user, where I would benefit from having more than 64 GB of RAM? By real life example, I mean: if I can scrub to a random frame of a large sim with no delay, but each frame now requires 4 hours to render, then I now have another problem to solve.
  25. local minimum in 3d

    absolutely, I actually encountered that in magnetostatics a few years back; now I'll have to see if I can implement that on a discrete mesh.. It should be interesting