
A different houdini


rotate


I'm glad to share some experiments in Houdini. They are still small ideas and prototypes. By developing digital assets, we have been trying to use Houdini in a wide range of areas.

Can digital animators control a real robot? Yes! We designed a small robot with five joints and one LED light. First we built a robot model at real-world scale and animated it in Houdini. Then we set up a digital asset that transfers the animation data to the robot's MCU. Once the virtual robot animation is finished in Houdini, you can just press play and the real robot follows. The LED on the robot is controlled by a spot light in Houdini.
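To give a rough idea of the approach, here is a minimal sketch of sampling the joint animation frame by frame in Python. The node paths follow the parameters used later in this thread; the frame range and everything else is only an illustration, not the actual asset.

# Sketch: bake each joint's X rotation over the playback range.
# "/obj/bb/box_1" ... "/obj/bb/box_5" match the joint parameters shown later in this thread.
import hou

joint_parms = [hou.parm("/obj/bb/box_%d/rx" % i) for i in range(1, 6)]
start, end = hou.playbar.playbackRange()

baked = []  # one list of five angles per frame, ready to stream to the MCU
for frame in range(int(start), int(end) + 1):
    angles = [round(p.evalAsFloatAtFrame(frame), 1) for p in joint_parms]
    baked.append(angles)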

Can virtual lighting interact with the real environment? Yes, Houdini can make it happen. In this example, when we turn on the real lamp, the Houdini scene is illuminated as well. We use an Arduino to collect the light-sensor data and send it back to Houdini; each light sensor controls the intensity of one Houdini light. It could also be a spherical lighting array. In the future, maybe we can illuminate virtual scenes in even more interesting ways.
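One way to wire this up, as a sketch only (the serial port, the request byte, the scaling and the light path are my assumptions, not the author's exact setup), is to poll the Arduino with pyserial and push the reading onto a light's intensity parameter:

# Sketch: map an Arduino light-sensor reading (0-1023) to a Houdini light's intensity.
import hou
import serial

ser = serial.Serial(4)                    # port index as in the asset code; adjust for your machine
ser.timeout = 1

def update_light():
    ser.flushInput()
    ser.write("r")                        # hypothetical "send me a reading" command
    line = ser.readline().strip()
    if line:
        value = float(line) / 1023.0      # normalize the ADC reading to 0..1
        hou.parm("/obj/hlight1/light_intensity").set(value)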

In another experiment we connect sensors directly to Houdini operators: a resistor drives the parameters of a VOP SOP, which deforms the object.
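Another integration pattern, again just a sketch with a hypothetical helper and parameter name, is to put a Python expression on the VOP SOP parameter itself so it re-reads the sensor every cook:

# Sketch: a session helper that a VOP SOP parameter expression can call.
# It reuses the serial port opened elsewhere (hou.session.ser, as in the asset code below).
import hou

def read_sensor():
    ser = hou.session.ser
    ser.flushInput()
    ser.write("r")                        # hypothetical request byte
    line = ser.readline()
    return float(line) if line else 0.0

hou.session.read_sensor = read_sensor

# Python expression placed on the VOP SOP parameter (for example a noise amplitude):
#     return hou.session.read_sensor() / 1023.0 * 2.0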

This is a 64-pixel LED device for displaying different results, like drawing faces or flashing a warning when something errors out. It can be fun to have around in daily work. We also produced a digital asset to control the LED display from Houdini. In the future we could model patterns in Houdini and generate them on the device.
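The exact protocol of that asset isn't posted, but packing an 8x8 pattern into bytes for a serial link might look roughly like this; how the bytes are framed for the MCU is an assumption on my part:

# Sketch: pack an 8x8 on/off pattern into 8 bytes and send it over the serial port.
import serial

pattern = [
    [0,0,1,1,1,1,0,0],
    [0,1,0,0,0,0,1,0],
    [1,0,1,0,0,1,0,1],
    [1,0,0,0,0,0,0,1],
    [1,0,1,0,0,1,0,1],
    [1,0,0,1,1,0,0,1],
    [0,1,0,0,0,0,1,0],
    [0,0,1,1,1,1,0,0],
]

ser = serial.Serial(4)                    # same port style as the asset code
for row in pattern:
    byte = 0
    for bit in row:                       # fold 8 pixels into one byte
        byte = (byte << 1) | bit
    ser.write(chr(byte))                  # one byte per row; framing is up to the MCU firmware
ser.close()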

The Kinect is a depth sensor. We wrote a Processing program that captures the 3D point cloud and sends it into Houdini. In this example we use just boxes and copy stamping to achieve the effect.
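The Processing sketch posted later in this thread writes one depth<frame>.obj file per frame, so on the Houdini side a File SOP with a frame-dependent filename plus a Copy SOP is one way to bring the cloud in and stamp boxes onto it. The node names and paths below are illustrative, not the author's scene:

# Sketch: read the per-frame point clouds written by the Processing sketch
# (depth0.obj, depth1.obj, ...) and copy a box onto every point.
import hou

geo = hou.node("/obj").createNode("geo", "kinect_cloud", run_init_scripts=False)

points = geo.createNode("file", "read_depth")
points.parm("file").set("$HIP/depth$F.obj")     # one OBJ per frame; path is an assumption

box = geo.createNode("box", "pixel_box")
box.parm("scale").set(10)                        # the Kinect point cloud is in millimetres

copy = geo.createNode("copy", "copy_boxes")      # the classic Copy (stamp) SOP
copy.setInput(0, box)
copy.setInput(1, points)
copy.setDisplayFlag(True)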

We have realized that Houdini is an amazing, powerful tool. It is not only a tool for film production; it also works well for data visualization, robotics, new media art and interactive devices. Most importantly, it is a mathematical tool for reproducing the nature created by God.

This is the video: http://v.youku.com/v...AxMTMyNjI4.html

Thanks! :)


That's pretty cool. Any hints on how you linked Houdini and the robot?

Thank you very much. I am sorry, my English is not good.

I use Python to build the digital asset; the joint angle data is transmitted over the RS232 port to the microcontroller (an STC12C5410, which is high-speed and inexpensive).

Digital asset Python code:

###############################
# "Before First Create" event script: open the serial port once
import serial

print "Starting up serial comms"
hou.session.ser = serial.Serial(4)          # serial port index; adjust for your machine
hou.session.ser.setBaudrate(38400)
hou.session.ser.timeout = 1
hou.session.updateFrame = 0                 # last frame a reading was taken on
hou.session.x = 0                           # last value read from the sensor

##############################
# PythonModule: read one value per frame from the serial port
def updateValues():
    if hou.session.updateFrame != hou.intFrame():
        hou.session.ser.flushInput()
        hou.session.ser.write("w")          # ask the MCU for a reading
        hou.session.x = hou.session.ser.readline()
        print hou.session.x
        hou.session.updateFrame = hou.intFrame()

def getx():
    updateValues()
    return hou.session.x

#############################
# "After Last Delete" event script: close the serial port
print "shutting down serial comms"
hou.session.ser.close()
#############################
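Presumably getx() in the PythonModule is what the asset's parameters call. A Python parameter expression on the asset along these lines should work; this exact expression is my guess, not quoted from the asset (getx() returns the raw string from readline(), hence the float conversion):

# Python expression on a float parameter of the asset:
return float(hou.phm().getx())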

Expression code:

########################################
# Send each joint angle (and the spot light intensity) as a "#<channel>A<value>!" packet
b1 = hou.node("/obj/bb/box_1").parm("rx").eval()
hou.session.ser.write("#00A" + str(round(b1, 1)) + "!")

b2 = hou.node("/obj/bb/box_2").parm("rx").eval()
hou.session.ser.write("#01A" + str(round(b2, 1)) + "!")

b3 = hou.node("/obj/bb/box_3").parm("rx").eval()
hou.session.ser.write("#02A" + str(round(b3, 1)) + "!")

b4 = hou.node("/obj/bb/box_4").parm("rx").eval()
hou.session.ser.write("#03A" + str(round(b4, 1)) + "!")

b5 = hou.node("/obj/bb/box_5").parm("rx").eval()
hou.session.ser.write("#04A" + str(round(b5, 1)) + "!")

b6 = hou.parm("/obj/robot_7ch_v0_fbx/root/spotlight1/light_intensity").eval()
hou.session.ser.write("#05A" + str(round(b6 * 10 + 26, 1)) + "!")

The expression sends commands to the RS232 port. For example, sending "#01A60!" means the second joint rotates to 60 degrees.
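For testing without the robot attached, a small helper that builds and parses these packets can be handy. This is just a sketch based on the format described above:

# Sketch: build and parse the "#<2-digit channel>A<angle>!" packets described above.
def make_packet(channel, angle):
    return "#%02dA%s!" % (channel, round(angle, 1))

def parse_packet(packet):
    body = packet.strip("#!")              # e.g. "01A60"
    channel, angle = body.split("A")
    return int(channel), float(angle)

print make_packet(1, 60)                   # -> #01A60.0!
print parse_packet("#01A60!")              # -> (1, 60.0)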


This is really awesome work, guys. I would also be interested in what your Arduino/Python/Houdini setup is like. It is great to see Houdini used in new ways like this! Very Rhino3D/Grasshopper-like.

Thank you very much. I am sorry, my English is not good.

The robot was modeled in Maya and imported into Houdini via FBX.

I also like Rhino3D/Grasshopper and Firefly. However, VOPs are more powerful, and I hope one day the Arduino can be programmed from VOPs, the way Grasshopper works with Firefly :)


Great work! Wow. Any chance you'll be sharing the Kinect 3D point cloud script?

This is the Processing code; it captures the point cloud from the Kinect and writes it out as OBJ files.

import SimpleOpenNI.*;

SimpleOpenNI context;
float zoomF = 0.3f;
float rotX = radians(180);  // by default rotate the whole scene 180 deg around the x-axis,
                            // because the data from OpenNI comes in upside down
float rotY = radians(0);
PrintWriter output;
int frame = 0;

void setup()
{
  frameRate(300);
  size(1024, 768, P3D);  // strange, get a drawing error in the cameraFrustum if I use P3D; in OPENGL there is no problem

  //context = new SimpleOpenNI(this, SimpleOpenNI.RUN_MODE_SINGLE_THREADED);
  context = new SimpleOpenNI(this);

  // disable mirror
  context.setMirror(false);

  // enable depthMap generation
  if(context.enableDepth() == false)
  {
    println("Can't open the depthMap, maybe the camera is not connected!");
    exit();
    return;
  }

  stroke(255, 255, 255);
  smooth();
  perspective(radians(45), float(width)/float(height), 10, 150000);

  // Create a new file in the sketch directory
  //output = createWriter("depth" + frame + ".txt");
}

void draw()
{
  // update the cam
  context.update();

  background(0, 0, 0);
  translate(width/2, height/2, 0);
  rotateX(rotX);
  rotateY(rotY);
  scale(zoomF);

  int[] depthMap = context.depthMap();
  int steps = 5;  // to speed up the drawing, only use every fifth point
  int index;
  PVector realWorldPoint;

  translate(0, 0, -1000);  // set the rotation center of the scene 1000 in front of the camera
  stroke(255);

  // write one OBJ file per frame: depth0.obj, depth1.obj, ...
  output = createWriter("depth" + frame + ".obj");

  PVector[] realWorldMap = context.depthMapRealWorld();
  for(int y = 0; y < context.depthHeight(); y += steps)
  {
    for(int x = 0; x < context.depthWidth(); x += steps)
    {
      index = x + y * context.depthWidth();
      if(depthMap[index] > 0)
      {
        // draw the projected point
        //realWorldPoint = context.depthMapRealWorld()[index];
        realWorldPoint = realWorldMap[index];
        point(realWorldPoint.x, realWorldPoint.y, realWorldPoint.z);

        // write the coordinate to the file as an OBJ vertex
        output.print("v ");
        output.println(realWorldPoint.x + " " + realWorldPoint.y + " " + realWorldPoint.z);
      }
      //println("x: " + x + " y: " + y);
    }
  }

  // draw the kinect cam
  context.drawCamFrustum();

  output.flush();  // writes the remaining data to the file
  output.close();  // finishes the file
  frame++;
}

void keyPressed()
{
  output.flush();  // writes the remaining data to the file
  output.close();  // finishes the file
  exit();          // stops the program
}


Hey, thanks for sharing this! What Arduino model did you use? Is it an Arduino Uno? Is the STC12C5410 microcontroller faster than the default one?

Yes, I think they're probably about the same speed. The STC12C5410 has 10 KB of on-chip program memory, 512 bytes of on-chip SRAM, and uses the 8051 instruction set. It does not run Arduino sketches, but you can write more efficient code for it in C.

Here is the datasheet for the STC12C5410:

http://www.mcu-memory.com/datasheet/stc/STC-AD-PDF/STC12C5410AD-english.pdf
