meldrew

Members
  • Content count

    65
Everything posted by meldrew

  1. gradient r&d

    Hi all, I'm approaching something in H/Redshift which I've not really attempted before - so I was interested to see if there were any thoughts surrounding it. Basically I'm trying to recreate the soft/feathered gradient seen in the attached ref image. The inner hard edge I'm planning to achieve with some transformed geo floating above the grid below, however the super subtle feathering is challenging me a little. At first I thought of using a simple attribVOP on a grid > point Cd, however I'm not sure how to achieve the outer feather using that technique... Similarly, with an attrib transfer from geo > grid it seems not very feasible to get the extreme feather unless the mesh is crazy high res? I'm continuing to experiment & look through forum threads, but if anyone has suggestions/tut links/examples etc. they would be very much appreciated! (Worth noting system = H17.5.258 & RS 2.6.43) Thanks in advance!
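    For reference, the kind of point wrangle I've been imagining is roughly this sketch (the 'feather' channel and the idea of wiring the hard-edge geo into the second input are just my assumptions, not a working file):

        // Point wrangle on the grid (input 0), with the 'hard edge' geo on input 1.
        // 'feather' is a placeholder channel for the falloff width.
        float feather = chf('feather');
        int    pt  = nearpoint(1, v@P);        // nearest point on the source geo
        vector pos = point(1, 'P', pt);
        float  d   = distance(v@P, pos);
        // smooth() gives an eased 0-1 falloff; invert so the colour fades outwards
        f@mask = 1.0 - smooth(0.0, feather, d);
        v@Cd   = f@mask;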
  2. gradient r&d

    Hey @Aizatulin & @srletak Sorry for the delay in getting back... Thanks for these solutions - especially the latest side-by-side, Aizatulin. I've been working on a setup on/off all week & I think I'm pretty much there now with the combination of what to do in vex/vops. It got a little more complex as I've decided to try and implement displacement which is based on the above techniques, however with some separate controls, so I'm just tweaking & learning as I go. I'll see how far I can get over the weekend & with a little luck post some updates sometime soon. Thanks again to everyone who has replied!!
  3. gradient r&d

    @Aizatulin Thanks a bunch for this - As you mention, it is a great comparison of the 2 techniques in terms of drawing the shapes & the attrib transfer aspect of it. However I am still struggling to implement a point sourcing function in the original VOP setup that @konstantin magnus shared last week. If I understand correctly, the following VEX code is the definition of the points as source, finding the nearest point on the grid & getting the normalised direction?:

        int pt_near = nearpoint(1, v@P);
        vector pos_near = point(1, 'P', pt_near);
        float dist_near = distance(v@P, pos_near);
        vector dir = normalize(v@P - pos_near);

    What I would really like to do is apply this point sourcing through VOPs, so that I can have multiple wheels being generated & then run each wheel through its own colour ramping before a colormix node at the end. Apologies - I'm aware that I'm probably verging on being stupid here! As always, any help much appreciated. vop_point_sourcing.hip
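    In case it helps to see what I'm aiming for in plain VEX first, here's a rough sketch of the whole thing in one wrangle (the 'radius' channel and the 'wheel_colour' ramp are placeholder names I've made up):

        // Point wrangle on the grid (input 0); wheel-source points on input 1.
        int    pt_near   = nearpoint(1, v@P);
        vector pos_near  = point(1, 'P', pt_near);
        float  dist_near = distance(v@P, pos_near);
        vector dir       = normalize(v@P - pos_near);

        // angle around the nearest source point, remapped to 0-1 for a ramp lookup
        float angle = atan2(dir.z, dir.x);                 // -PI .. PI
        float u     = fit(angle, -M_PI, M_PI, 0.0, 1.0);

        // placeholder radius channel; the colour ramp drives the wheel
        float  radius = chf('radius');
        float  mask   = 1.0 - smooth(0.0, radius, dist_near);
        vector col    = chramp('wheel_colour', u);
        v@Cd = col * mask;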
  4. gradient r&d

    Hi! Sorry if the notes in the .hip file were unclear... The visual effect I want to achieve is accomplished mainly by the first response from @konstantin magnus, i.e. a wheel of colour that can be gradated over its length. However, after seeing @Aizatulin's solutions I would also like to implement some more control over the effect, namely:
    - Define wheel sources with points.
    - Define & control individual wheel radius (with just a float param or pscale attrib etc.)
    - Ramp over the soft min/max VOP node for each wheel.
    As I mention in my post above, I believe I understand the VEX mathematical principles to a far better degree than I did previously, however I am struggling to integrate these 3 elements of Aizatulin's wrangle approach with Konstantin's VOP approach. Hopefully my post here makes sense. Thanks!
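    To make those three points a bit more concrete, this is roughly the wrangle logic I'm picturing (pscale on the source points and the 'radius_ramp' name are just assumptions on my part):

        // Point wrangle on the grid; wheel-source points (with pscale) on input 1.
        int   pt_near = nearpoint(1, v@P);
        float radius  = point(1, 'pscale', pt_near);          // per-wheel radius
        float dist    = distance(v@P, point(1, 'P', pt_near));
        // ramp over the normalised distance instead of a plain soft min/max
        float mask    = chramp('radius_ramp', clamp(dist / max(radius, 1e-6), 0.0, 1.0));
        v@Cd *= mask;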
  5. gradient r&d

    Thank you both again so much for the clear explanations! I think (hope) I have a much clearer understanding of the VEX workings at this point - in terms of definitions & some of the mathematics - as these examples are broken down in such helpful detail. However, my SOP-based brain is still struggling a little to know exactly how to integrate the two VOP/VEX solutions. I've put together a version of the node tree I would like to achieve in this example file (a painfully incorrect implementation - however it's more to illustrate the order of operations I'd like to achieve). If someone would be able to take a look at this & outline a potential solution or direction, that would, as always, be a massive help. I feel like I am getting ever closer to the eureka moment, but I've found it so helpful in the learning process to reverse engineer previous examples that I thought I'd try my luck and ask one more time. Again, many thanks in advance if anyone does have the time to help me out. vex_vop_hybrid.hipnc
  6. gradient r&d

    Hi all, ok, so I've managed to get almost all of these suggestions working to some degree - however I'm still lacking a little understanding of the original suggestion provided by @Konstantin.
    1. If I am using the arctangent - is it possible to create multiple 'colorwheels' on the same grid geometry? For example, I'm attempting to create a system where the source of each wheel would be a point projected onto the geo (using the Add SOP, or from a particle sim etc.) - so I was attempting to make a kind of hybrid of 'colorwheel.hipnc' & 'colorcurve.hipnc', however so far I've been totally unsuccessful. The transform matrix allows me to rotate the colour ramp, but as I understand it the arctangent uses the dimensions of the grid to generate the curve? (Although if I alter the grid scales, the curve remains the same, which is what's throwing me off, I think.)
    2. I've also been trying to implement the ramp parameter in the radius of the gradient from Aizatulin's file. But again to no avail - as I can't quite figure out where to wire that into the VOP setup. Apologies for being a little slow with this & as always any tips welcomed with open arms! Thanks!
  7. gradient r&d

    Konstantin - Thanks so much (again!) for taking the time to make these files up & share them! Before I open them & dissect fully, I'm going to spend some time trying to recreate your answer from the 'colorcurve' reply, to try and learn as much as I can along the way! I know there's no shame in using SOPs if it still gets the correct results, however I am always trying to advance in Houdini (whenever I get the time to really sit down with it) - so I also really appreciate everyone's responses here with varying methodology. The Odforce community is super helpful & supportive as always!
  8. gradient r&d

    Hi Aizatulin, Wow, ok there's a whole lot for me to unpack there - the ramp for the radius is particularly helpful. Thanks a million! (My VEX is ashamedly almost non-existent... I have been looking for an exercise that forces me to practice it for quite some time - this seems like a good time to do that!) My loose plan is to build an asset where I am intersecting poly geo with a grid & then using those intersection edges to drive the gradients. Thanks again, the massive point in the right direction is greatly appreciated!
  9. gradient r&d

    Hi Konstantin, Thanks so much for taking the time to make that file up for me (with the node descriptions in the VOP!) - it is very much appreciated! In all honesty this technique is a little beyond me as a casual user - however I've rebuilt your setup & am getting the same results, so in that case it was a success! I do have a few questions though, if you'd be so kind as to maybe give me some hints?
    1. The 'pinch' of the gradient at the end of the arc - is this a result of specifically using the arctangent / cartesian function? If so, would there be a way around it to generate a fully soft gradient around the entire circumference?
    2. What (if any) would be a suitable way to approach basing the gradient on any given piece of geometry? (i.e. draw a curve projected onto a grid, perhaps?) I assume the length-based calculations could still work - however I'm not sure where one would 'plug in' the geo to the VOP setup.
    2b. The arctangent function is turning the 2D coordinates into a radial value. So my thinking is that perhaps for a curve this node would be replaced with another function/pattern node?
    Again, thanks so much for your time! I have already learned a lot today just experimenting with various parameters & exploring the help docs with a little more direction! T.
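    On question 2, the rough idea I've been sketching (not tested - the 'feather' channel and 'curve_colour' ramp are placeholders) would be to swap the arctangent for a closest-point lookup on the curve:

        // Point wrangle on the grid (input 0); projected curve on input 1.
        int    prim;
        vector uvw;
        float  d = xyzdist(1, v@P, prim, uvw);          // distance to the curve
        // uvw.x is the parametric position along the curve primitive,
        // so it could stand in for the radial value the arctangent gives
        float  u = uvw.x;
        f@mask = 1.0 - smooth(0.0, chf('feather'), d);  // placeholder falloff width
        vector col = chramp('curve_colour', u);         // placeholder colour ramp
        v@Cd = col * f@mask;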
  10. Hi all, I recently found this toolset on Github allowing realtime input & recording of input from a Leap Motion controller. https://github.com/arqtiq/HouLEAP Unfortunately I'm having a little trouble getting it to run properly. In the readme it explains to 'simply copy the content of the **/houdini16.x** folder to your houdini home/hsite folder.' So my question is, where would be the correct place to put the Python scripts that the tool provides? (I'm not entirely sure what the 'hsite' folder is referring to?) Houdini sees the OTLs, however I get the attached error in what I assume is the Python scripting. Or perhaps I need to define LEAP in the .env file? Any tips much appreciated. EDIT: The error is reported from the example .hip contained in the Github repository linked in my post. Also, this was run in H 17.5.258.
  11. Multiple cameras, one cook?

    Hi Luke, thanks for the quick response - so if I understand you correctly, Mantra isn't thinking about rendering anything outside the frustum in any case? If so that's great (and what I assumed/hoped anyway!) I assume you mean saving geo out as .bgeo, caching sims etc.? Then pointing the IFD directly to these caches? I'll need to look more into this & packed geo, as I'm rendering on a cloud service... But that is another thread I suppose, ha. I always endeavour to cache things out as efficiently as I can, so I'll revisit this and see if there are any improvements I can make. It is a cube map of sorts, yes - however it's not for VR, so any spherical mapping/fish-eye lens type solutions will lead to distortion that is not wanted in this case. I'm rendering content for 3 walls & the floor of a box, all angles 90deg, with the POV at eye height - roughly centred within said box. Would there be a specific camera you would recommend? Thanks in advance for any additional pointers, I appreciate your time. I know these are fairly rudimentary questions.
  12. Hi all, Ok, so I have a scene/set of scenes where I need to render the same sequence from numerous cameras. Is there a way to render, say, 4 cameras at the same position but facing different directions, while only calculating the lighting/reflections/shading etc. once? As far as I understand, .ifd files just include the instructions to render, not an actual cache of the scene itself. Time is a little tight, so any optimisation would be a benefit. (Cams are also various resolutions, not uniform, in case that has an impact.) Thanks!
  13. Calculating / matching orientation

    Hey all, I left this thread behind because the workflow got pretty intense! As an update - @moneitor, @petz & @galagast's solutions all worked well - thank you so much! - but the issue I kept running into, which was eventually revealed, was that the tracking data was bad. So eventually I got it working with all 3 approaches once we re-tracked and got it done correctly. Thanks again everyone for your kind help - I learned a heck of a lot solving this in the end!
  14. Hello all, I have been racking my brains/this forum on something seemingly quite simple, but not getting anywhere... Basically, I would like to match the scale/orientation of one set of points based on another, then be able to extract those 'transform' parameters. For example, if I have a pointcloud A, with 10 points, which has been re-scaled/oriented to create pointcloud B (in a separate process - I don't have access to those transform params), how would I then re-orient it to match? I have a photogrammetry pointcloud which I am trying to match a tracked camera to. The issue is that the camera track is coming in at the origin, so I need to transform it to match the original pointcloud. I have isolated 10 identically positioned (not identical ptnum) points from the pointcloud & tracking data to use as 'calibrators', but can't figure out the best/most efficient next step in doing the re-orient/scale. Any help, tips, or pointers in the right direction for threads would be very much appreciated as always.
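    To show the sort of thing I mean, this is roughly the detail wrangle I've been trying to write with the 'calibrator' points (very much a sketch based on my own understanding - it assumes the first three points of each cloud correspond and that the scale is uniform; the attribute names are made up):

        // Detail wrangle: calibrator points from cloud A on input 0, cloud B on input 1.
        vector a0 = point(0, 'P', 0);
        vector a1 = point(0, 'P', 1);
        vector a2 = point(0, 'P', 2);
        vector b0 = point(1, 'P', 0);
        vector b1 = point(1, 'P', 1);
        vector b2 = point(1, 'P', 2);

        // build a rotation frame for each cloud from its first three points
        matrix3 frameA = maketransform(normalize(a1 - a0), normalize(cross(a1 - a0, a2 - a0)));
        matrix3 frameB = maketransform(normalize(b1 - b0), normalize(cross(b1 - b0, b2 - b0)));

        float   scale = length(a1 - a0) / length(b1 - b0);  // uniform scale estimate
        matrix3 rot   = invert(frameB) * frameA;            // rotation taking B onto A

        // stash everything so a point wrangle can apply:
        //   v@P = (v@P - pivot_B) * rot * scale + pivot_A;
        3@rot_to_A   = rot;
        f@scale_to_A = scale;
        v@pivot_B    = b0;
        v@pivot_A    = a0;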
  15. Calculating / matching orientation

    Thanks @moneitor - That is exactly what I was trying to achieve, done in a way I would have not even thought about. I'll spend some time this evening/over the weekend going through it & trying to better understand the math that's going on in there. Plus some VOP stuff that's pretty new to me. Thanks for the annotations as well, they're always very helpful!
  16. Calculating / matching orientation

    Essentially I am trying to access the results of whatever calculation(s) the 'extract transform' SOP does at object level... Does anyone have any idea how to do that?
  17. Calculating / matching orientation

    Hey @jkunz07, That's great, and super lightweight. Thanks! However I'm still trying to work out how to extract XYZ translate/rotate parameters from this process. Any ideas on that? i.e. what XYZ translate/rotate is required to get from geo A > geo B. Being able to morph one to the other is v useful, but my pointclouds are *very* heavy, so being able to just apply a transform to the .fbx would be a lighter fix in this case, if I can get that data from a sample of 3 or 4 points. Thanks again!
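    In case it clarifies what I'm after, this is the sort of thing I've been trying - assuming I can get the full transform into a 4x4 matrix attribute first ('xform_to_A' is just a made-up name for that):

        // Detail wrangle: pull translate / rotate / scale out of a 4x4 matrix.
        matrix m = detail(0, 'xform_to_A');
        // cracktransform(trs order, rot order, component, pivot, matrix)
        v@t = cracktransform(0, 0, 0, {0, 0, 0}, m);   // translate
        v@r = cracktransform(0, 0, 1, {0, 0, 0}, m);   // rotate, in degrees
        v@s = cracktransform(0, 0, 2, {0, 0, 0}, m);   // scale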
  18. Calculating / matching orientation

    I've worked up a quick annotated example file, with a proxy version of my problem (and a few variants of my initial explorations to solve it included/bypassed). I am calculating the centre (average) of each stream as a detail attrib, and my idea was to use that as a pivot point to base the orientation on. Unfortunately I have no real math background, so I'm struggling to wrap my head around the concept of reverse engineering the rotation/translation values. It would also be very helpful to get some tips on correctly using the centre points I've generated in the pivot parameters of the transform node. (I am currently exploring some other threads on this specific topic.) Thanks! EDIT: One approach I hadn't considered was pclookup/filter etc. in VOPs - again, something I've never used before - so I'm beginning to explore that as well. pc_match_001.hip
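    For clarity, the centre calculation I mention is just a straightforward average, roughly along these lines (a sketch; 'centroid' is whatever you want to call the detail attrib):

        // Detail wrangle: average point position as a pivot candidate.
        vector sum = {0, 0, 0};
        int    n   = npoints(0);
        for (int i = 0; i < n; i++)
            sum += point(0, 'P', i);
        v@centroid = sum / n;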
  19. Calculating / matching orientation

    Yes, and I also want to extract the XYZ rot/scale transforms needed to place them on top, so I can then apply it to the .fbx upon import. Apologies if I wasn't clear, or if this is a v simple issue that I just am not sure how to approach - one of the pitfalls of being self taught I suppose. :/ Thanks again!
  20. Calculating / matching orientation

    Hi Jesper, Thanks for the response - yep, I have access to both in H. The camera track (which is the thing I'd like to re-orient) has been done in C4D and has been given to me as an .fbx; the pointcloud is directly from photogrammetry. So far my approach has been finding 3 points in the track that I can pinpoint in the PC, then creating a bounding box of each and attempting to align them. I would share a .hip but unfortunately can't at the moment due to NDAs etc. Thanks again!
  21. Trail SOP help

    Hi all, I'm having what I'm expecting is a very simple issue with the Trail SOP. As my points are dying/ptnum is resetting, I'm getting glitches at the ends of my trails. Obviously if it was a POP sim, I could just trail > calc vel, then append an Add SOP to add a primitive based on id, however I can't figure out how to do it using the @ptnum variable in the attribwrangle. Is @ptnum actually just the wrangle binding, and not actually a point attrib? I've been scratching my head on this for a while. .hipnc attached, any help very much appreciated. broken_trails.hipnc
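    For what it's worth, the wrangle-based version I've been fumbling towards looks roughly like this (a sketch only - it assumes the trailed points carry an integer id attribute and are already in time order per id):

        // Detail wrangle after the Trail SOP: rebuild one polyline per particle id,
        // so resetting point numbers can't join unrelated points.
        int npts = npoints(0);

        // collect the unique ids
        int ids[];
        for (int i = 0; i < npts; i++)
        {
            int id = point(0, 'id', i);
            if (find(ids, id) < 0)
                append(ids, id);
        }

        // one polyline per id, vertices added in point order
        foreach (int id; ids)
        {
            int prim = addprim(0, 'polyline');
            for (int i = 0; i < npts; i++)
                if (point(0, 'id', i) == id)
                    addvertex(0, prim, i);
        }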
  22. Hi all, I've started putting an asset together which allows me to create multiple 'threads' from a single line, then affect them as if they are fraying/weaving. I'm quite happy with it, however my approach doesn't lend itself well to anything other than straight lines, and I'd like to apply it to more/multiple complex curves - so that it follows their contours exactly. At the moment it does 'work', however it distorts the original curve quite a lot, which I'd like to avoid. Could anyone suggest a way for me to adapt my current VOP setup to calculate the trigonometry per curve? Or an alternative to using the 'wireU' attrib? Or, alternatively, is there a different approach I should be taking altogether? Any pointers in the right direction would be much appreciated. .hip attached - thanks in advance! (N.B. This setup is loosely based on a thread I originally found here on odforce some time ago - but I cannot for the life of me find it now, so a hat tip goes to the OP if reading.) thread_tool_asset_003.hip
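    As a rough illustration of the per-curve trig I mean (not working code from the asset - 'curveu' would come from a Resample SOP, the channels are placeholders, and it assumes a single curve whose tangent never lines up with the up vector):

        // Point wrangle on a resampled curve: push each point into a helix
        // around the curve using a simple tangent frame.
        float u    = f@curveu;                 // 0-1 along the curve, from Resample
        float freq = chf('frequency');
        float rad  = chf('radius');

        // central-difference tangent so the ends stay valid
        int    nextpt  = min(@ptnum + 1, @numpt - 1);
        int    prevpt  = max(@ptnum - 1, 0);
        vector tangent = normalize(point(0, 'P', nextpt) - point(0, 'P', prevpt));

        // frame from an arbitrary up vector (breaks if the tangent is parallel to it)
        vector up   = {0, 1, 0};
        vector side = normalize(cross(tangent, up));
        vector norm = normalize(cross(side, tangent));

        float angle = u * freq * 2 * M_PI;
        v@P += (cos(angle) * side + sin(angle) * norm) * rad;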
  23. Thread PointVop Guidance

    Thanks a lot Jiri, I will take a look at these threads & see if I can get where I need to - the helix along a curve seems ideal. (I need to start learning VEX properly so I can *finally* start to move away from VOPs, they have such heavy overheads in comparison.)
  24. fill mocap volume

    Hi all, I'm working with some mocap meshes that I'm trailing particles over using the 'minpos' technique in POP VOPs. I'm getting a little stuck though, as when I make trails of the particles, these obviously trail behind the mocap as they keep their birth XYZ coords. What I'd like to achieve is that the trails adhere to the surface of the body/geo in the same way that the particles do, so I end up with something like 0:24 > 0:26 of the attached video. Maybe I need to do something within a SOP solver? Perhaps a DOP POP sim is the completely incorrect way to go? Any hints or tips greatly appreciated - I've attached my .hip here should anyone have the time to take a look! Thanks! (Hints on filling the volume as in the video will also be met with rapturous thanks... I'm not even sure where to start with that one.) surface_particles_trail_test.hip
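    My current guess at the 'adhere to the surface' part is to constrain the accumulated trail points every frame in a SOP solver, something as simple as this (if minpos is even appropriate here):

        // Point wrangle inside a SOP Solver, run over the trail points each frame:
        // snap every trail point back onto the animated mocap mesh (input 1)
        // so the trails hug the surface instead of hanging at their birth position.
        v@P = minpos(1, v@P);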
  25. Hi all, I've been trying to figure out how to create a linear ripple effect (i.e. rather than have a single point as the source, I'd like to source it from a line/selected edge/group of points, so that it travels across a surface like a wave). Now, of course, at a basic level the 'waveform' SOP node performs this exact operation - *however* it can only generate a single wave at a time, and it lacks the signature 'decay' of the ripple over time. I had the idea of using the wave operation in CHOPs with a sine wave with decay on it, however any changes to the amplitude are applied across the whole waveform, rather than 'at birth'. So it doesn't create a dying wave, it just scales down the whole waveform (see image). Any ideas on how to achieve this? (Just as an FYI, eventually I'd like to control the decay length with an animatable parameter & the creation/scale of the ripple from CHOPs with a trigger.) It seems like it should be super easy - maybe I'm just having a bit of a mental block! :/ Any mind jogging greatly appreciated!
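    The closest I've got to expressing the idea outside CHOPs is a wrangle along these lines (a sketch only - the channel names are placeholders, the source line/points would be on the second input, and it assumes the surface has normals):

        // Point wrangle on the surface; ripple-source points / edge on input 1.
        float speed = chf('speed');
        float freq  = chf('frequency');
        float decay = chf('decay');
        float amp   = chf('amplitude');

        // distance from the nearest source point drives the travelling wave
        int   pt = nearpoint(1, v@P);
        float d  = distance(v@P, point(1, 'P', pt));

        // wavefront travels outwards; amplitude dies off with distance travelled
        float phase = d - speed * @Time;
        float wave  = sin(phase * freq) * exp(-decay * d) * amp;
        // only displace behind the front so the ripple has a clean leading edge
        if (phase < 0)
            v@P += v@N * wave;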