Search the Community: Showing results for tags 'morph'.
Found 17 results

Hey all, I'm a compositing student, but for a current group project we have to make a bumper, and we settled on doing it in Houdini because it just looks best. I don't have much experience in Houdini beyond watching some courses/tutorials and playing around a little, so the learning curve is quite steep and I've run into several snags. First, I followed this tutorial to get the first morph/transition, and things panned out pretty well. However, I now have a third model (and maybe even a fourth) and I don't know how to add these into the mix. In my current file there is just a simple Blend Shapes setup for the third model, but it's quite uninteresting. That's it for starters, at least. Beyond that, I'm also trying to figure out how to change the colors between each model. The project file is probably pretty messy since it's my first project in Houdini. I really love the program and want to continue working in it! Hopefully someone out there can take some time to look it over and give some input. Attached is a .rar with the .hip and the .fbx camera, as well as the three .obj models used. bumper.rar

Hi there, I am trying to build a propagation-driven blend/morph effect to change the look of a geometry, but I am already stuck transferring the attributes between geo A and geo B. I managed to build a setup where I change the geometry by using an Attribute VOP and subtracting B minus A. Here come the BUTs: I couldn't animate it by keyframing the bias of a Mix node, and I couldn't use a propagation effect with scattered points. In the second setup I swapped in a Scatter node instead of the plain geometry. It works, but it seems the point order isn't the same, and while changing the bias of the Mix node all the points change position?! In the Scatter node I unchecked "Randomize Point Order", but without any effect. I also tried a setup going from a low-res mesh to a high-res mesh, but there I couldn't get a solution with the scattered points because of the Primitive Attribute node in the second VOP. I also tried to recreate it with a Point Cloud node, but failed as always. I am pretty new to Houdini and I know that I am missing a lot of foundational knowledge; it would be great if someone could give me a hint about what I am doing wrong. Thanks and all the best, Dennis 170422_position_transfer_001.hiplc
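For what it's worth, the mix-bias part of the problem only works when point i of geo A and point i of geo B are the same logical point. A minimal sketch of that idea in plain numpy (the arrays are hypothetical stand-ins for the two point clouds):

```python
import numpy as np

def blend_positions(pos_a, pos_b, bias):
    """Linear blend between two point clouds with identical point order.

    pos_a, pos_b: (N, 3) arrays of point positions; row i of each array
    must refer to the *same* point, otherwise points swap places as the
    bias animates (the artifact described above).
    bias: 0.0 -> pure A, 1.0 -> pure B.
    """
    return (1.0 - bias) * pos_a + bias * pos_b

# Example: two small point clouds in matching order.
a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = np.array([[0.0, 1.0, 0.0], [1.0, 1.0, 0.0]])
halfway = blend_positions(a, b, 0.5)
```

In Houdini this is just the lerp of P with the second input's P in an Attribute VOP; the usual fix for scattered points is to scatter once and deform that single cloud with both meshes, or sort both clouds so the ids line up.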

 vop
 attributes

(and 2 more)
Tagged with:


I want to have a puddle of water/lava that rises up to form a character I have animated standing up. I have tried doing it in reverse, by melting him and playing it backwards, but it looks wrong; I want to learn how to do it properly. I am looking for any good tutorials that might help. Any help would be amazing! Thanks

Hey, while waiting for a sim at work I'm playing with Houdini and trying to reproduce an effect Aixsponza did for Nike Presto. Any good ideas how they did the effect at second 13? I attached a very basic setup; maybe someone has some input. Morph.hip

Hello everyone, I am trying to achieve a similar effect to this one by Ivan. On the SideFX forums, Ivan mentions that he creates two fields: one to attract the particles and one to keep the particles on the surface. I managed to create the force field to attract the particles, but I can't find a way to create a field that keeps the particles inside the geometry and forms the shape. Any tips or help is highly appreciated.
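One common guess at the "keep them on the surface" field is a signed distance field of the target geometry: outside the surface the distance is positive, so a force along the negative gradient pushes strays back in. A toy numpy sketch, using an analytic sphere SDF as a stand-in for a VDB built from the real geometry:

```python
import numpy as np

def sphere_sdf(p, radius=1.0):
    # Signed distance to a unit sphere at the origin: negative inside, positive outside.
    return np.linalg.norm(p) - radius

def containment_force(p, strength=10.0, eps=1e-4):
    """Force pushing a particle back toward the surface when it strays outside.

    The gradient is estimated with central finite differences; inside the
    sphere the force is zero, so the attraction field can move particles
    around freely there.
    """
    d = sphere_sdf(p)
    if d <= 0.0:
        return np.zeros(3)
    grad = np.array([
        sphere_sdf(p + [eps, 0, 0]) - sphere_sdf(p - [eps, 0, 0]),
        sphere_sdf(p + [0, eps, 0]) - sphere_sdf(p - [0, eps, 0]),
        sphere_sdf(p + [0, 0, eps]) - sphere_sdf(p - [0, 0, eps]),
    ]) / (2.0 * eps)
    return -strength * d * grad
```

In a real setup the analytic SDF would be replaced by sampling a VDB/volume of the character in a POP VOP or wrangle; scaling the force by the distance itself (as above) makes it gentle near the surface and strong far away.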

Hello everybody, I'm trying to figure out an effect done by Simon Holmedal; here are pics of the process. It seems to be a subdivided triangular mesh that animates into an infinite fractal. Could this be done with subdivisions, extrusions, blends, for-each loops and a solver? If so, how do you keep Houdini from crashing as the detail gets smaller: by putting an end value on the For-Each node? I watched the Entagma video on fractals and for-each divisions, and now I'm trying to figure out how to combine all this stuff. Reference pictures: Here is the video (second 0:28). Simon also sort of explains the method in his IAMAG 2016 class at 32:20. He said that he uses a polygon, subdivides it, animates it with a blend shape, and then builds everything from there. The screen recording is available for USD 100, but he only shows two quick Houdini screens, so it's kind of expensive for that amount of info in my opinion (although the overall talk is great). At 34:20 there is a similar effect again with a sphere. Any info would be great; in the meantime I will keep thinking and playing with Houdini. Thanks!

Hey guys, I just watched the awesome work of Aixsponza and I'm wondering how to achieve this kind of effect; the video is below, along with some still frames. Are they using VDBs for the morph, and an attribute transfer to drive the transformation? As for the triangle effect, a PolyReduce with a PolyExtrude, or some Point VOP to extrude them? I really love this kind of morph transformation, and any insight on how to approach it would be helpful. Thanks!

Hi everyone, I am trying to create a flower-blooming animation in Houdini. There are five varieties of flowers with different shapes. What would be the best approach to create such an animation? I am looking to animate something like this:
 5 replies

 blooming
 deformation

(and 3 more)
Tagged with:

Hey guys, I am attempting an attraction-and-formation effect once again, but I'm stuck partway. Let me set this up: suppose there are points scattered on a plane, all with the same facing direction. They need to be attracted to a target object, lock to its surface, and deform with the target's movement. They should orient according to the normals of the target object and slide over its surface as soon as they get near it. I have scattered the same number of points on the source surface and on the target objects so they match perfectly, and I am able to drive the particles toward the target and lock them onto their respective ids. But I am not able to make them slide over the surface according to the target object's orientation, and I'm not able to blend the orientation: as soon as I add POP Spin or Torque, my orientation goes for a toss because it doesn't match the target object's orientation. So I need to blend the orientation both while the particles are seeking their target ids and once they get near them. Then I need to drive these as rigid bodies so they don't interpenetrate when they stick to their targets, which is another issue. There are a couple of problems here that I need to sort out one by one. If someone could help me with this it would be great; I will try to post a hip file in the meantime.
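On the orientation-blending part specifically, a common alternative to fighting POP Spin/Torque is to slerp the particle's orient quaternion toward the target's frame, driven by a distance-based bias. A sketch of that slerp in plain numpy (VEX has slerp() built in; this is the same math spelled out):

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions (x, y, z, w).

    t = 0 returns q0, t = 1 returns q1; feeding a distance-based falloff in
    for t eases a particle's orient toward the target frame as it approaches,
    instead of snapping or fighting a torque force.
    """
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:          # flip one quaternion to take the short way around
        q1, dot = -q1, -dot
    if dot > 0.9995:       # nearly parallel: plain lerp and renormalize
        q = (1.0 - t) * q0 + t * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    s0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * q0 + s1 * q1
```

In a POP wrangle the equivalent one-liner would blend p@orient toward a quaternion built from the target point's normal/up frame, with t mapped from the distance to the goal.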

Hello all Houdini users. This is a great place to learn Houdini; I follow this forum every day. By the way, I found this link, and a friend of mine uses that software to do this kind of FX. Because I really love Houdini, I started adapting the effect in Houdini and researching this kind of fluid-morphing FX. Then I found this particle setup from JK on this forum, and I think it's a good starting point, but I ran out of time to continue the R&D. So, could anyone show me how to convert these morphing points to a FLIP fluid or particle fluid, and then inherit the point velocity so the fluid morphs into the "K" shape? It would save me with my boss. Thanks, I'd really appreciate any help with this one. Here's the hip file: JK_Points_Morph_setupTo_flipFluid.hipnc

Hi guys, I have been learning to do a morph with FLIP. I am able to do it with a normal Scatter in SOPs, or just with particles and a VOP SOP added. But when it comes to FLIP particles, I've been trying to retain the water-like behaviour after the morph. I added the same number of points with custom ids on both the emitter and the goal object, but I just can't get the particles to come to rest at my goal object with a slowing-down look. Does anyone have a suggestion or idea? Much appreciated. Moph_testt.hipnc
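One way to get the slowing-down settle is to treat each particle as a damped spring pulling toward its id-matched goal point: the damping bleeds off velocity as it closes in. A small sketch of that idea (the stiffness/damping numbers are arbitrary):

```python
import numpy as np

def seek_step(p, v, goal, dt=1.0 / 24.0, stiffness=8.0, damping=6.0):
    """One semi-implicit Euler step of a damped spring pulling a particle
    toward its goal. With enough damping relative to the stiffness, the
    particle eases into the goal without overshooting."""
    accel = stiffness * (goal - p) - damping * v
    v = v + accel * dt
    p = p + v * dt
    return p, v

# A particle starting at rest away from its goal settles onto it over time.
p, v = np.zeros(3), np.zeros(3)
goal = np.array([1.0, 0.0, 0.0])
for _ in range(500):
    p, v = seek_step(p, v, goal)
```

In a FLIP context the rough equivalent is a custom force (POP VOP/wrangle) toward each particle's goal plus drag, with the spring strength ramped up near the end of the morph so the fluid solver still contributes the watery motion early on.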

Hey everyone, I'm trying to recreate an effect Peter Claes did with his "Jam" simulation where the jam morphs into the character, but I can't seem to get anything close to it at the moment. I've got a basic scene going using flip fluids. www.youtube.com/watch?v=p6J7AC1GsRc  Peter Claes "Jam" Thank you for your time!

Welcome, I made myself a digital asset for locally morphing geometry based on the distance from a locator. I used a custom attribute whose value depends on the centroid of the locator, imported this attribute into a VOP SOP, and used a little math to get a falloff distribution outwards from the locator point and blend between the source and target geometry. Now, how do I do the same for locally blending shaders/textures (I am not very experienced with VOP materials)? I cannot import the attribute there; from what I see I can only get attribute values by referencing a file on disk, not a node directly. How about making a digital asset for locally morphing textures, then? Maybe I am looking in the wrong place? Thanks in advance,
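For reference, the geometry half described above boils down to a clamped, smoothed distance falloff per point. A sketch of that falloff and blend in plain numpy (smoothstep as the falloff shape is an assumption; the DA may use a different curve):

```python
import numpy as np

def morph_weight(p, locator, radius):
    """Falloff weight: 1 at the locator, easing to 0 at 'radius' away.

    Uses a smoothstep curve; in a VOP SOP this is the same as
    distance -> fit -> smooth, fed into a mix of source and target P.
    """
    d = np.linalg.norm(p - locator)
    t = np.clip(1.0 - d / radius, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)   # smoothstep

def blend_point(p_src, p_tgt, locator, radius):
    # Blend a single point between the source and target positions.
    w = morph_weight(p_src, locator, radius)
    return (1.0 - w) * p_src + w * p_tgt
```

For the shader side, one route worth trying is to write this same weight into a point attribute and bind it inside the material, then use it as the bias of a mix between the two shading networks, rather than referencing geometry on disk.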

Hi Everyone, I just finished up another tutorial for CMIVFX. This one is about particle morphs. It covers a pretty wide range of ways to do the effect, from simple SOPS morphs, to complicated Steer/Arrive behavior in POPS. There's also a section about turning a RBD point object in DOPS into a particle system and applying the Steer/Arrive solution in the DOP network. This is a mid to advanced level tutorial, and it assumes you already have Houdini experience and experience in POPS, SOPS, and DOPS. I think there is a lot of really good information in here, and hopefully it is presented in a clear, concise way, and at a decent pace. You can check it out here: http://cmivfx.com/store/564Houdini+Particle+Morphing+Effects I hope you enjoy it and find it useful! Adam

Hi all, I've been away for so long; this is my first post in some years, I believe. Anyway, I was at a Houdini event the other day where they were talking through all the great new features in Houdini 13 and beyond, and I noticed on one of the slides that numpy has been included in Houdini since 12.5 or even earlier. It's been so long since I delved into Houdini that this massively useful fact had totally passed me by. The first thing that leapt into my head when I saw this was my old HDK SOP RBF morpher, a totally awesome deformer that has gotten me out of so many tricky deformation scenarios over the years. I finally realised I could rewrite it as a Python SOP and not have the issue of people needing to compile it to use it. A few days later, and after a couple of hours' work, here is the result. No error checking yet, just the bare bones of the functionality. Of course it's nowhere near the speed of the HDK one, but hey, at least compilers aren't an issue with this one. Here is the source code; I've attached an example hip file and OTL, non-commercial versions.

[CODE]
# This code is called when instances of this SOP cook.
import math
import numpy as np

node = hou.pwd()
geo = node.geometry()
inputs = node.inputs()
kernel = node.parm('kernel').eval()
power = node.parm('power').eval()
scale = node.parm('scale').eval()

# Radial basis kernels, selectable by the 'kernel' parameter.
def linear(r): return r
def smooth(r): return r*r*math.log1p(r*r)
def cube(r): return r*r*r
def thinPlate(r): return r*r*math.log1p(r)
def sqrt(r): return math.sqrt(r)
def pow(r): return math.pow(r, power)

kernels = {0: linear, 1: smooth, 2: cube, 3: thinPlate, 4: sqrt, 5: pow}

def rbfU(r):
    return kernels[kernel](r)

if len(inputs) > 2:
    featureGeo = inputs[1].geometry()
    targetGeo = inputs[2].geometry()
    numFeaturePoints = len(featureGeo.iterPoints())

    # Set up matrix Q with the feature point positions (homogeneous coords).
    matrixQ = np.zeros((numFeaturePoints, 4))
    i = 0
    for p in featureGeo.points():
        matrixQ[i, 0] = p.position()[0]
        matrixQ[i, 1] = p.position()[1]
        matrixQ[i, 2] = p.position()[2]
        matrixQ[i, 3] = 1
        i += 1
    matrixQtranspose = matrixQ.transpose()

    # Pairwise kernel matrix K over all feature points.
    matrixK = np.zeros((numFeaturePoints, numFeaturePoints))
    for row in range(numFeaturePoints):
        for col in range(numFeaturePoints):
            p1 = featureGeo.iterPoints()[row].position()
            p2 = featureGeo.iterPoints()[col].position()
            p = p1 - p2
            matrixK[row, col] = rbfU(p.length() / scale)

    # Assemble the final set of linear equations in one big matrix:
    #   [ K   Q ]
    #   [ Qt  0 ]
    matrixA = np.zeros((numFeaturePoints + 4, numFeaturePoints + 4))
    for row in range(numFeaturePoints):
        for col in range(numFeaturePoints):
            matrixA[row, col] = matrixK[row, col]
    for row in range(numFeaturePoints):
        for col in range(numFeaturePoints, numFeaturePoints + 4):
            matrixA[row, col] = matrixQ[row, col - numFeaturePoints]
    for row in range(numFeaturePoints, numFeaturePoints + 4):
        for col in range(numFeaturePoints):
            matrixA[row, col] = matrixQtranspose[row - numFeaturePoints, col]

    # Right-hand sides: the target feature positions.
    targetX = np.zeros(numFeaturePoints + 4)
    targetY = np.zeros(numFeaturePoints + 4)
    targetZ = np.zeros(numFeaturePoints + 4)
    i = 0
    for p in targetGeo.points():
        targetX[i] = p.position()[0]
        targetY[i] = p.position()[1]
        targetZ[i] = p.position()[2]
        i += 1

    # Solve for the weights that map the features onto the targets.
    weightsX = np.linalg.solve(matrixA, targetX)
    weightsY = np.linalg.solve(matrixA, targetY)
    weightsZ = np.linalg.solve(matrixA, targetZ)

    # Apply the weights to the input points to get the final positions
    # relative to the target feature points.
    NfPts = numFeaturePoints
    for opt in geo.points():
        # Affine part of the transform.
        outX = weightsX[NfPts]*opt.position()[0] + weightsX[NfPts+1]*opt.position()[1] + weightsX[NfPts+2]*opt.position()[2] + weightsX[NfPts+3]
        outY = weightsY[NfPts]*opt.position()[0] + weightsY[NfPts+1]*opt.position()[1] + weightsY[NfPts+2]*opt.position()[2] + weightsY[NfPts+3]
        outZ = weightsZ[NfPts]*opt.position()[0] + weightsZ[NfPts+1]*opt.position()[1] + weightsZ[NfPts+2]*opt.position()[2] + weightsZ[NfPts+3]
        # Radial part: weighted kernel contribution from every feature point.
        p1 = opt.position()
        i = 0
        for p2 in featureGeo.points():
            p = p1 - p2.position()
            rF = rbfU(p.length() / scale)
            outX += weightsX[i]*rF
            outY += weightsY[i]*rF
            outZ += weightsZ[i]*rF
            i += 1
        opt.setPosition((outX, outY, outZ))
[/CODE]
ExampleRBF.hipnc RBFmorpher.otl

Greetings to the greatest 3D community! I'd like to introduce my new digital asset, "RampBlend". The idea behind "RampBlend" is to help character riggers and animators get rid of complex muscle systems, which are often too slow, troublesome and tedious to set up. It was specifically created with the concept of "corrective blend shapes" in mind, where changes to the Blend factor trigger sequential morph targets. I'd like to thank this community and Mangi, who kindly shared the base scene for my tests. Any comments or suggestions to help improve this asset are more than welcome. You can download "RampBlend" at this URL: http://www.orbolt.co.../ik_::RampBlend Introductory video (Vimeo HD720): http://vimeo.com/74942919 Introductory video (YouTube HD1080):
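The "sequential morph targets from one Blend factor" idea can be sketched as each target owning a slice of the 0-1 range (a linear ramp per slice is assumed here; the actual asset's ramps are presumably remappable):

```python
def sequential_weights(blend, num_targets):
    """Map one 0-1 blend factor to per-target weights that fire in sequence.

    Each target owns an equal slice of the range and ramps 0 -> 1 across
    its slice, so target k is fully on once blend reaches (k + 1) / N.
    """
    weights = []
    for k in range(num_targets):
        start = k / num_targets
        t = (blend - start) * num_targets   # remap this slice to 0-1
        weights.append(min(max(t, 0.0), 1.0))
    return weights
```

Each weight would then drive one blend-shape input, so a single animated parameter walks through the corrective targets in order.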

1

 blendshape
 driven
 (and 7 more)