GPGPU in HDK?



Hi,

I would like to implement a GPU-accelerated Houdini SOP that computes ambient occlusion using GPGPU techniques (drawing a full-screen quad for a one-to-one pixel-to-texel mapping, Cg shaders, etc.). Is it possible to start GLUT from a Houdini plugin, or to use the viewport itself for GPGPU calculations?

Thx,

David
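For reference, the classic GPGPU draw pass being described looks roughly like the sketch below. This is only a minimal fixed-function illustration, not working AO code: the texture dimensions W and H and the already-bound fragment program are assumptions.

#include <GL/gl.h>

// One GPGPU "compute" pass: the viewport and projection are mapped 1:1 onto
// a W x H texture so every fragment corresponds to exactly one texel, then a
// single full-screen quad is drawn and the bound fragment program does the work.
void runGpgpuPass(int W, int H)
{
    glViewport(0, 0, W, H);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, W, 0.0, H, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f,     0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f((float)W, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f((float)W, (float)H);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(0.0f,     (float)H);
    glEnd();

    // Results are then read back with glReadPixels() or kept on the GPU
    // via a framebuffer object for the next pass.
}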


You can use GLUT (I used freeglut in Linux) to do your work in a new "window" (either on-screen or off-screen). In Linux, at least, you have to watch out for which context is being used. I found you had to do something like:

GLXContext oldContext = glXGetCurrentContext();   // save the context Houdini's viewport is using

glutMainLoop();                                   // do the GPGPU work

glXMakeCurrent( ..., oldContext );                // restore it for the viewport

Otherwise the Houdini viewport would get all messed up.
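
For what it's worth, a slightly fuller sketch of that save/restore pattern is below. The glXGetCurrentDisplay()/glXGetCurrentDrawable() calls supply the arguments glXMakeCurrent() needs; glutMainLoopEvent() is a freeglut extension (plain GLUT's glutMainLoop() never returns, so the restore would never run). How the off-screen work itself is driven is left as a placeholder.

#include <GL/glx.h>
#include <GL/freeglut.h>

void runOffscreenWork()
{
    // Remember whatever Houdini's viewport currently has bound.
    Display     *oldDisplay  = glXGetCurrentDisplay();
    GLXDrawable  oldDrawable = glXGetCurrentDrawable();
    GLXContext   oldContext  = glXGetCurrentContext();

    // ... create the GLUT window / GPGPU resources and run the pass here ...
    glutMainLoopEvent();    // freeglut: process pending events and return

    // Hand the old context back so the viewport isn't left on ours.
    if (oldDisplay && oldContext)
        glXMakeCurrent(oldDisplay, oldDrawable, oldContext);
}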

Cheers!

