
About marcosimonvfx

  1. subdivide without smoothing

    Ah, that one does exactly what I wanted - even in the description it says that it doesn't move the edge points. Thanks!
  2. subdivide without smoothing

    Hm, not exactly. I really just wanted an additional subdivision between two points without changing the position of those two points - but they don't necessarily have to be the endpoints. So first there are two points; in the next step one more gets added in between them. Then two more get added: one between the first and the middle, one between the middle and the last. The next step adds four points, and so on. Pretty much what the subdivision node does, except the subdivision node smooths, so all points (apart from the two very ends) change position.

    In the end I managed to do it by subdividing (so points "lose" their correct position) and then looking up the point positions from before the subdivision: every point now has the old point number * 2, so it's easy to reference back to the position that point had before the subdivision. Kinda hacky - I would have expected a "no interpolation" option on the subdivide node... but oh well. Thanks for your input, guys!
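    The remap trick above can be sketched in plain Python (outside Houdini; `subdivide_smoothed` here is only a stand-in for the node's behaviour, not its actual algorithm):

```python
# After one subdivision step, the point that used to be number i is now
# point number 2*i, so its pre-subdivision position can be copied back.

def subdivide_smoothed(points):
    """Stand-in for the subdivide node: insert midpoints, then 'smooth'
    the interior original points so they drift, as the node does."""
    out = []
    for i, p in enumerate(points):
        out.append(p)
        if i < len(points) - 1:
            out.append((p + points[i + 1]) / 2.0)  # new midpoint
    # fake smoothing: nudge interior original (even-numbered) points
    for j in range(2, len(out) - 1, 2):
        out[j] = (out[j - 1] + out[j] + out[j + 1]) / 3.0
    return out

def restore_originals(old_points, new_points):
    """Undo the smoothing: old point i is new point 2*i."""
    fixed = list(new_points)
    for i, p in enumerate(old_points):
        fixed[2 * i] = p
    return fixed

old = [0.0, 0.0158487, 1.0]
new = restore_originals(old, subdivide_smoothed(old))
```

    After `restore_originals`, the original points sit exactly where they started and every inserted point is the exact midpoint of its neighbours.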
  3. subdivide without smoothing

    Thanks, but the resample node only leaves the endpoints alone. If I have a curve with three points - one at x=0, one at x=0.0158487, one at x=1 - then resampling the curve with 4 segments (one division between every two points) will space them out with equal length, effectively moving my second point. Instead I'd want x0 to stay at 0, x1 to be at 0.0158487/2, x2 to stay at 0.0158487, etc.
  4. subdivide without smoothing

    Hi, as the title says, I'm trying to subdivide a curve without changing the shape of the curve at all. I see no option on the subdivide node to not interpolate between points - even point attributes always seem to be interpolated, with no way of turning it off - help!
  5. takes node activation / visibility

    Turn on the Display Toggle and set it to 0 on the Render Tab.
  6. Hey Follyx, your scene looks fine on my end. I tested all three of your OUT nodes plus a region render and can't see any aliasing. Are you sure you're not zoomed in on your image? When you zoom the render view, Houdini doesn't filter the pixels smoothly, so anything other than 100% will look janky (aliased).
  7. Set Arnold displacement height in LOP

    AFAIK displacement is a material attribute in USD. Because the SOP Import node doesn't create materials, this attribute gets lost in translation.
  8. Grouping with multiple BBOXs

    You can change the group node's "Bounding Type" to "Bounding Object" - then it will use the geometry from the second input to group your points. If you create a box and plug it into the second input, you have basically recreated the "Bounding Box" type, just with more options. So you can, for example, create many boxes, merge them, and plug that merge node into the group node's second input - like many bounding boxes at once.
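    The merged-boxes idea boils down to a point-in-any-box test. A minimal sketch in plain Python (not Houdini's actual group node, and assuming axis-aligned boxes given as (min, max) corner pairs):

```python
# Group every point that falls inside at least one of several
# axis-aligned bounding boxes, like merging boxes into the group
# node's second input.

def inside(p, box):
    lo, hi = box
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def group_points(points, boxes):
    """Return the indices of all points inside any of the boxes."""
    return [i for i, p in enumerate(points) if any(inside(p, b) for b in boxes)]

boxes = [((-1, -1, -1), (0, 0, 0)), ((2, 2, 2), (3, 3, 3))]
points = [(-0.5, -0.5, -0.5), (1, 1, 1), (2.5, 2.5, 2.5)]
grouped = group_points(points, boxes)
# → [0, 2]: points 0 and 2 each fall inside a box, point 1 misses both
```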
  9. Hi, I'm experimenting with (rendering) colorspaces and found something I can't explain. Hopefully someone here can.

    The setup: my OCIO config uses ACEScg as the working space. I create a lightbox with a green wall and shine a red light on it, rendering in Mantra with the preview LUT turned off (raw).

    Scenario #1: The wall is made green via a color node (0/1/0) and a principled shader that uses point colors; everything else is set to 0. --> The wall is completely black.

    Scenario #2: The wall now uses a principled shader with a green texture (saved as "raw" with (0/1/0) color values). --> Without anything else the wall is still completely black (no matter whether "Source Color Space" is set to Automatic or Linear). --> If I use an OCIO transform from sRGB to ACEScg, the wall lights up a little! This makes sense: when I do the same transformation in Nuke I can see that sRGB (0/1/0) becomes (0.29/5.11/+0.00), so a pure red light now picks up some of the color.

    Scenario #3: The wall again uses a color node (0/1/0) with a principled shader, and the light now gets a red texture ("raw" (1/0/0)). --> The wall is black again (so I assume no transformation of the texture was done). --> However: changing the light's RGB color from (1/1/1) to something else, in addition to the texture, changes the image.

    The last point confuses me. So far everything pointed towards Houdini just reading the RGB values of textures as "data" without changing them unless an OCIO node is used. But then, with a purely red texture (1/0/0), there should always be a 0 in green and blue, no matter if I set the light color to (1/1000/1000). The render also changes depending on whether I use a principled shader or a classic shader, and it becomes very noisy and artifact-y when a light color of, for example, (0.001/1000/1000) is used - so it might be a bug. Wondering if anybody else has any thoughts on that.
  10. editmaterial - how to?

    This might be a super basic question, but for the life of me I cannot get an editmaterial node to work with Karma nodes (for Redshift it works).

    Redshift: In a material library I create a rsMaterialBuilder and inside it I build my node network, flowing into a redshift_usd_material node at the end. When I then create an editmaterial node and load this material, I get the complete tree of nodes to modify.

    With Karma this seems different. First, I don't need to create a material builder - I can create, for example, a principled shader right there (though I have also tried with a material builder). I build my network with, for example, a texture (I have tried with and without a collect node at the end) and afterwards want to modify it in an editmaterial node. But when I do, the network inside is always broken and none of the additional nodes (the texture) are there (see picture). Has this ever happened to you? What am I doing wrong?
  11. Random Textures in Solaris/USD/Lops/Karma

    Just stumbled across this topic - this might be way too late, but maybe it can help someone else. I use a principled shader in this solution: go to the Textures tab and mouse over the parameter to get its name and type (e.g. basecolor_texture, string). Outside, you need to create a primvar with that name and type. Karma will then take this value, set on the primitive, rather than the default one specified in the shader. You can do that with an attribwrangle or a materialvariation node: in the materialvariation, be sure to put the primitive you want to change in "Primitives", set the name and type, and fill in the new value you want to override the shader with. Done!
  12. Hi, after reading this article on Scratchapixel, where they talk about an efficient algorithm for finding intersections of a ray and a box, I wanted to see how it compares to the built-in intersect function. To my knowledge, the built-in intersect function loops over all the prims given, in no particular order - I infer this from the description of the primfind function, which states that it should be used with intersect because "primfind uses an underlying tree structure to speed up search" (so I take it intersect does not). My test scene creates boxes on n scattered points in a volume. In one attribute wrangle I use the built-in intersect function; in another I use the one from Scratchapixel. I would expect the Scratchapixel one to be faster, also because intersect has to loop over n * 6 prims (6 prims per box) whereas the Scratchapixel one only has to iterate over n points. However, intersect is orders of magnitude (!) faster than the custom one. Does anybody have ideas why?
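    For reference, the slab test from that article can be sketched like this in plain Python (the VEX version in the post would look much the same; function and variable names are mine):

```python
# Slab-method ray/AABB intersection: clip the ray's t-range against the
# three pairs of axis-aligned planes; the ray hits the box iff the
# remaining range [tmin, tmax] is non-empty.

def ray_box_hit(orig, direction, lo, hi):
    """True if orig + t*direction (t >= 0) hits the box [lo, hi]."""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        if direction[a] == 0.0:
            if not (lo[a] <= orig[a] <= hi[a]):
                return False           # parallel ray outside this slab
            continue
        inv = 1.0 / direction[a]
        t0 = (lo[a] - orig[a]) * inv
        t1 = (hi[a] - orig[a]) * inv
        if t0 > t1:
            t0, t1 = t1, t0            # order the slab intersections
        tmin = max(tmin, t0)
        tmax = min(tmax, t1)
        if tmin > tmax:
            return False               # slab ranges don't overlap: miss
    return True

hit = ray_box_hit((-2.0, 0.5, 0.5), (1.0, 0.0, 0.0),
                  (0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
# → True: the ray travels along +x straight through the unit box
```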
  13. bind vector array

    Thanks guys, that did it!
  14. bind vector array

    Hi, I'm having a slight problem where I'm sure there's a quick solution I'm just overlooking. On a geometry I create a vector array { {1,0,0}, {0,1,0}, {0,0,1} } together with a simple color {1,1,1}. In a materialbuilder shader, if I bind the simple color and hook it up to the surface output, I see the white color. However, when I create a bind for the vector array, get the nth element from it, and hook that into the surface output, I cannot get any of the array vectors to display. File and screenshot attached. Thanks for your help! M vecArray_odforce.hiplc
  15. technical: How are geometry lights rendered

    I never thanked you for these - thanks! I didn't quite find what I was looking for, but I eventually found my answer in the Ray Tracing Gems book: http://www.realtimerendering.com/raytracinggems/ [page 216ff]. TL;DR: instead of shooting rays to sampled points on the light source and checking whether they reach that exact point, only the direction is used: a ray is shot in that direction, and if it hits the light source anywhere (no matter where), the sample counts.
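    The idea can be sketched in plain Python with a sphere light for simplicity (this is my own illustrative reduction of the technique, not code from the book):

```python
# Accept a light sample if a ray shot toward the sampled point hits the
# light geometry anywhere, rather than requiring it to reach the exact
# sampled point.

import math

def ray_hits_sphere(orig, direction, center, radius):
    """Standard quadratic ray/sphere test; direction must be normalized."""
    oc = [orig[i] - center[i] for i in range(3)]
    b = 2.0 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c          # a == 1 for a normalized direction
    if disc < 0.0:
        return False
    root = math.sqrt(disc)
    # hit if either intersection lies in front of the origin (t >= 0)
    return (-b - root) / 2.0 >= 0.0 or (-b + root) / 2.0 >= 0.0

def light_sample_visible(shade_pt, sample_pt, light_center, light_radius):
    """Shoot a ray from the shading point toward the sampled point and
    accept any hit on the light sphere, not just the exact point."""
    d = [sample_pt[i] - shade_pt[i] for i in range(3)]
    n = math.sqrt(sum(x * x for x in d))
    d = [x / n for x in d]
    return ray_hits_sphere(shade_pt, d, light_center, light_radius)
```

    A near-grazing direction toward the light still counts as long as the ray clips the sphere somewhere, which is exactly the robustness the book's approach is after.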