
papsphilip


Everything posted by papsphilip

  1. Relative Flatness

    Curvature would be perfect if there were some kind of threshold. Remapping curvature to 0-1 and running it through a ramp doesn't change anything.
  2. Relative Flatness

    Hi, I am trying to make a mask based on the relative flatness (or relative curvature?) of my mesh; I am not sure how to put it. I have a rock with bumps everywhere, obviously, but you can clearly see areas that are flat in general despite the small bumps; you can distinguish the relatively flat surfaces by eye. I tried generating the curvature map and then blurring it, but the results are not that good: they are soft after the blur and not precise. The other thing I tried is clustering by normal (6 clusters) for a box-like projection, if you will. The problem with clustering is that you have to decide the number of clusters up front; I would prefer a threshold of normal similarity. That's what I will try next.
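One way to sketch the normal-similarity threshold described above (a Point Wrangle sketch, not the poster's setup; the radius and thresh channel names are my own): average the normals over a radius larger than the bump scale, then compare each point's normal against that average.

```vex
// Point Wrangle; assumes a normalized v@N on the mesh.
// "radius" should be larger than the bump scale, smaller than the flat facets.
float radius = chf("radius");
vector avg = 0;
int pts[] = pcfind(0, "P", v@P, radius, 64);  // nearby points within radius
foreach (int pt; pts)
    avg += point(0, "N", pt);
avg = normalize(avg);
// mask is 1 where this point's normal agrees with the local average (flat),
// 0 where it deviates (curved); a hard threshold instead of a soft ramp
f@flat = dot(normalize(v@N), avg) > chf("thresh") ? 1 : 0;
```

A thresh around 0.9 (cosine of the allowed deviation) is a reasonable starting point; lowering it grows the flat regions.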
  3. Relative Flatness

    Here is curvature compared to a custom wrangle solution. Almost there: the wrangle does almost what I want. Any other ideas?
  4. Extract transform

    A bit of cleanup for anyone interested. extract_transform.hip
  5. Extract transform

    I have a library of packed primitives with a unique name attribute that I import, copy-and-transform, and mirror. I need to extract the final transforms to export a JSON file. Is there a way to do that? Usually I have an orient attribute that helps me build a 3x3 transform, but in this case I don't have any attributes other than the position. extract_transform.hip
  6. Extract transform

    Thank you! I didn't think to use a 3x3. A 4x4 contains the position as well, but I guess Copy to Points uses the position of the incoming template points directly.
  7. Extract transform

    The Pack Inject SOP seems to apply the transformation correctly: the first input is my packed-primitive library with the name attribute, and the second input is my already placed and named packed primitives, with the injection method set to collate packed sources using the name attribute. So at least I know the intrinsic transform is still correct after the Copy and Transform or Mirror operations and can be applied to other geometry. I haven't figured out how to apply that transform myself yet; the Copy to Points output is not correct for some reason. This is what I used to extract the transform:

        4@transform = primintrinsic(0, "packedfulltransform", @primnum);

    extract_transform.hip
  8. Extract transform

    Yeah, I am using the "packedfulltransform" intrinsic attribute right now; it seems to still be there after the Mirror SOP or the Copy and Transform. I want to copy my named primitives onto the points with that transform just to verify it is working, but I haven't managed to do that yet. Is there a way to use Copy to Points with a transform instead of pscale and orient?
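On driving Copy to Points with a full transform instead of pscale/orient: if I remember the copy/instance attributes correctly, Copy to Points honours a matrix3 point attribute named transform, so one sketch (input indices are my assumption) is to split the packed 4x4 into that attribute plus P on the template points.

```vex
// Point Wrangle over the template points; input 1: the placed packed prims.
// Copy to Points reads a "transform" matrix point attribute (overriding
// pscale/orient), so split the packed 4x4 into rotation/scale plus position.
matrix full = primintrinsic(1, "packedfulltransform", @ptnum);
3@transform = matrix3(full);                   // upper-left 3x3: rotation + scale
v@P = cracktransform(0, 0, 0, {0,0,0}, full);  // component 0: the translation
```

This assumes the template points and the packed prims correspond one-to-one by index; otherwise match them through the name attribute first.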
  9. VEX get ordered menu Labels

    Try this:

    menu = node.parm('parm')          # get the parm
    index = menu.evalAsInt()          # get the current index
    label = menu.menuLabels()[index]  # get the label at that index
  10. How can I get the ordered menu labels in VEX? I know how to do this in Python, but what about VEX?
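As far as I know there is no VEX function that returns menu labels directly; VEX can only evaluate the parm's value. A hedged sketch of what is readable (the parm path is a placeholder):

```vex
// Detail Wrangle: an ordered menu evaluates to its index as an int;
// evaluated as a string it gives the menu token, not the label
int    idx   = chi("../null1/menu_parm");
string token = chs("../null1/menu_parm");
// for the label itself, one workaround is mirroring it into a string parm
// with a Python expression (menuLabels()[evalAsInt()]) and reading that
// string parm here with chs()
```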
  11. Random value hold 1 or -1

    thank you very much for the detailed response! will study these asap
  12. Trying to find a little trick that returns a random selection of two specific values (1 or -1) and holds it for a number of frames, or for i in a loop, so it would be 1,1,1,-1,-1,-1,1,1,1,-1,-1,-1 or, even better, random run lengths: 1,1,1,-1,-1,-1,1,1,1,1,-1,-1,-1,-1,1,1,-1,-1. Something like this, but with better control of the distribution:

    // Detail Wrangle: picks -1 or 1 per iteration (not held yet)
    for (int i = 0; i < 10; i++)
    {
        float test = fit(floor(fit01(rand(i), -1, 1)), -1, 0, -1, 1); // -1 or 1
        vector pos = set(i, 0, 0);
        int np = addpoint(0, pos);
        setpointattrib(0, "test", np, test, "set");
    }
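The held runs described above can be sketched in plain Python outside Houdini (the function name and parameters are my own; the same idea ports to VEX by seeding rand() with floor(i / hold) so consecutive iterations share a seed):

```python
import random

def sign_runs(n, min_hold, max_hold, seed=0):
    """n values of +1/-1, each sign held for a random run length."""
    rng = random.Random(seed)
    out = []
    sign = rng.choice((-1, 1))
    while len(out) < n:
        out.extend([sign] * rng.randint(min_hold, max_hold))  # hold the sign
        sign = -sign  # flip for the next run
    return out[:n]

print(sign_runs(12, 3, 3))  # fixed hold of 3: e.g. 1,1,1,-1,-1,-1,...
print(sign_runs(18, 2, 4))  # random run lengths between 2 and 4
```

min_hold/max_hold give the "better control of the distribution" part: set them equal for a fixed hold, widen them for irregular runs.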
  13. Vdb blend by gradient

    This was my solution.
  14. Is there an easy way to blend two VDBs using, for example, a gradient along the Y axis? I have a displaced fog VDB and I want the displacement to happen only in a certain area. Can I control that with a mask somehow inside the Volume VOP, or do I need to blend with the fog VDB without the displacement? Any workflow tips?
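One common sketch for this kind of blend (not necessarily the poster's attached solution; the ymin/ymax channel names are my own): sample both VDBs in a Volume Wrangle and lerp by a ramped Y position.

```vex
// Volume Wrangle on the undisplaced fog VDB;
// input 1: the displaced fog VDB, same name "density"
float a = f@density;                       // undisplaced density
float b = volumesample(1, "density", @P);  // displaced density at this voxel
// 0-1 gradient along Y, eased so the transition has no hard edge
float mask = smooth(0, 1, fit(@P.y, chf("ymin"), chf("ymax"), 0, 1));
f@density = lerp(a, b, mask);
```

The same mask expression works inside a Volume VOP (fit of P.y into a ramp feeding a Mix), so either route gets the gradient-controlled region.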
  15. VEX get ordered menu Labels

    awesome! thank you @acey195
  16. No, this is not working for me either (Houdini 19.038). Also, the HDAs convert the licence to Limited Commercial. I am not sure this is what I was talking about, though: you can project textures onto geometry using the various projection types, but I didn't understand whether you can actually grab your viewport and turn it into an image in real time.
  17. Is there a way to do that interactively, without rendering or exporting a viewport flipbook and without custom raytracers? Just straight up grab the viewport and use it as an image on a grid: feed the viewport, as seen through the camera, onto a flat plane.
  18. An old thread to revive, but there should be an easier way to just grab the viewport as an image in real time and use it as a texture on a grid. The raytracing experiments are really interesting, but they are a bit involved, even if they allow a lot of tweaking when you want something specific. The viewport is already rendered in real time, and it would be nice to have an option to grab that live OpenGL output somehow. I suppose you could create a TOP network and trigger an OpenGL ROP every time there is a change in your hip file, or when you save, or when the camera moves, but that would be slow and certainly not realtime. I am working on a project where I am making a short video for a screen inside a room; I have set up the room in 3D, and I would like to check the layout of my scene on that screen rather than in the 3D space where I am building it. Could there be a hack to steal the viewport stream?
  19. Generate Goal and Twist controls

    After further testing I see that, apart from the IK chain problem, the Stabilize Joints SOP isn't doing anything; lock and unlock are not working. This might be something really obvious, but this is my first time with the KineFX tools.
  20. I have an animated quadruped rig with some foot-sliding issues. I am trying to use the Stabilize Joints SOP, but I don't have goal or twist controls. How can I generate those? geo.FBX rig.hip walk.FBX
  21. I am trying to get all the nodes inside a specific network box, but print(netbox) returns None:

    root = hou.node("/obj/")
    netbox = root.findNetworkBox("Mynetworkbox")
    nodes = netbox.nodes()
    print(nodes)
  22. Bake 2d particle position to texture

    Well, that was my first choice, but the texture for 500,000 or 1M particles over 480 frames is pretty huge and heavy, and I wasn't sure this was the right workflow, to be honest. I am also still not sure about the resulting precision of these textures from COPs. Even though I export 32-bit float EXRs in my custom solution, when I go over the image in Houdini with the colour sampler I can see the numbers are rounded: a @P.x value of 0.625753 becomes 0.6258. Since VAT also uses COPs, if I am not mistaken, I was afraid of losing precision.
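For what it's worth, a 32-bit float represents 0.625753 to about seven significant digits, so the 0.6258 seen in the colour sampler is more likely display rounding than what the EXR stores. A quick check in plain Python, outside Houdini:

```python
import struct

x = 0.625753
# round-trip through a 32-bit float, as a 32-bit EXR channel would store it
x32 = struct.unpack('f', struct.pack('f', x))[0]
print(abs(x32 - x))  # tiny (around 1e-8), far below the displayed 0.6258
```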
  23. Sounds simple, but I've never done this before: how can I bake particle positions into a texture sequence? The point count is the same throughout the animation, and the positions are always on the XZ plane, in 0-1 space.
  24. Bake 2d particle position to texture

    I managed to export a position map for all my particles (one texture per frame) using a custom but simple solution: I write each particle position as Cd on a grid of points. The grid point count must equal the particle count (it is OK if the grid has more points). Then, in COPs, using a VOP COP2 Filter, I write the Cd from each grid point as one pixel value, so the bottom-left pixel (0, 0) takes Cd from point 0, continuing right to the end of the row and then to the row above, again left to right. I will upload my solution as soon as I find the time. I am also thinking now that maybe a Python node would be the best solution of all; I will definitely try that as well: exporting point positions to an image straight from a Python node.
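The point-to-pixel mapping described above can be sketched like this (plain Python; the function name and res parameter are my own):

```python
def particle_pixel(i, res):
    """Map particle index i to (column, row) on a res-wide image,
    filling from the bottom-left pixel, row by row."""
    return (i % res, i // res)

# on a 4-wide image, particles 0-3 fill the bottom row, 4 starts the next row
print([particle_pixel(i, 4) for i in range(6)])
# -> [(0, 0), (1, 0), (2, 0), (3, 0), (0, 1), (1, 1)]
```

The inverse (pixel back to particle index, e.g. when reading the map in a shader) is simply i = row * res + column.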