ameyer

Houdini Viewport - VR HMD Connection - Oculus Rift / HTC Vive


Hello,
I am working on a pre-rendered 360 VR video in Houdini.
We are currently building our pipeline for this and want to implement a feature that lets us connect the Houdini viewport to an HMD like the Oculus Rift or HTC Vive.

Realtime playback speeds are not necessarily that important.

Does anybody have an idea whether this is possible and, if so, how to approach it?

Also, for proper realtime playback, we want to be able to create 360 latlong flipbooks.
The OpenGL ROP currently does not seem to support this.
Any ideas on that?

Thanks a lot
Adrian


FX TD @ Animation Institute - Filmakademie Baden-Württemberg


Hey,
Haha, that's funny.
It's actually from my former TD on the project. So yeah, I know it (-;

Well, it does help a little bit, but it's still just grabbing the render view, so you always have to render first and only then can get a feeling for the scene in VR.
What I was actually hoping for is a real viewport connection. But that would have been quite a thing to implement.
It already exists for Maya, but there's nothing for Houdini /-:
 


Hey @ameyer,

aha, ok :). Streaming the viewport doesn't seem easy; I think you'd have to dig into PyQt (not sure about that, though). However, rendering flipbooks is pretty straightforward via Python:

import hou

# Frame range and output pattern ($F4 expands to the padded frame number)
custom_range = (0, 10)
filename = "~/tmp/flipbook_$F4.png"

# Grab the Scene Viewer pane from the current desktop
cur_desktop = hou.ui.curDesktop()
scene = cur_desktop.paneTabOfType(hou.paneTabType.SceneViewer)

# Copy the current flipbook settings and override them
flip_options = scene.flipbookSettings().stash()
flip_options.frameRange(custom_range)
flip_options.output(filename)

# Write the flipbook from the active viewport
scene.flipbook(scene.curViewport(), flip_options)

API: https://www.sidefx.com/docs/houdini/hom/hou/FlipbookSettings.html

Cheers,
Christian

Edited by p2or


Yeah, but you cannot do a VR (latlong) flipbook.

You can only do it with the OpenGL ROP using cubemaps and then assemble them in post, etc.

But it's completely unusable: it takes ages, the stereo is wrong, etc. It's about 10x faster to just do a VR latlong render (with Redshift that's quite fast). That's actually still our current workflow (unfortunately).
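To give an idea of what that post-assembly involves: each equirectangular (latlong) pixel corresponds to a view direction, and the dominant axis of that direction picks the cubemap face to sample. A minimal sketch in plain Python (the face names and orientation conventions here are my own choice, not necessarily what the OpenGL ROP uses):

```python
import math

def latlong_to_cube_face(u, v):
    """Map an equirectangular pixel (u, v in [0, 1]) to a cube face.

    u = 0.5, v = 0.5 looks straight down +Z; v = 0 is straight up.
    Returns (face_name, direction_vector).
    """
    theta = (u - 0.5) * 2.0 * math.pi   # longitude, -pi .. pi
    phi = (0.5 - v) * math.pi           # latitude, -pi/2 .. pi/2
    d = (math.cos(phi) * math.sin(theta),
         math.sin(phi),
         math.cos(phi) * math.cos(theta))
    # The dominant axis of the direction vector selects the cube face.
    axis = max(range(3), key=lambda i: abs(d[i]))
    names = (("left", "right"), ("down", "up"), ("back", "front"))
    return names[axis][d[axis] > 0], d
```

From there, a full assembler would compute the in-face texture coordinates from the two remaining components and resample each cubemap image; stereo is where it really gets painful, as noted above.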


What about Unreal Engine + HDA + Houdini Engine?

- you get the interactivity

- the FPS

- a true realtime framework

- no pain finding a custom solution

 

In case you really want it in H, you could do some research on:

- check out how OpenVR encodes positional tracking info

https://github.com/ValveSoftware/openvr

- how to translate that info into a usable format like OSC

https://github.com/BarakChamo/OpenVR-OSC

- how to load the data contained in this format into CHOPs, for example an OSC loader in CHOPs

https://docs.derivative.ca/OpenVR_CHOP (TouchDesigner equivalent)

- connect your CHOP channels to your 360 camera

- is it possible to apply the Vive lens shader directly in OpenGL inside Houdini? You'd need to double-check.

- the hardest part, as said previously, will be opening a GL window that streams the Houdini viewport rendered through that camera

Getting 90 FPS out of that will be a real challenge/miracle that needs a serious programming background.
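For the OSC step, a decoder can be sketched with the Python standard library alone. This is a minimal example, not OpenVR-OSC's actual protocol: the `/hmd/rot` address below is a made-up name, and only float32 (`f`) arguments are handled.

```python
import struct

def parse_osc_message(data):
    """Decode a simple OSC message (address + float32 args) from raw bytes.

    OSC packs the address and type-tag string as null-terminated,
    4-byte-padded ASCII strings, followed by big-endian 32-bit values.
    """
    def read_string(buf, offset):
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode("ascii")
        # Strings are padded out to the next 4-byte boundary.
        return s, (end + 4) & ~3

    address, offset = read_string(data, 0)
    typetags, offset = read_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
    return address, args

# Example: decode a hand-built message carrying three rotation floats
msg = (b"/hmd/rot\x00\x00\x00\x00" +
       b",fff\x00\x00\x00\x00" +
       struct.pack(">fff", 0.0, 90.0, 0.0))
print(parse_osc_message(msg))  # ('/hmd/rot', [0.0, 90.0, 0.0])
```

In practice you would read the UDP datagrams in a Python SOP/CHOP or an external process and feed the decoded floats to the camera; the CHOP route above avoids writing any of this by hand.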

 

Well, unless you are a Russian coder, and it appears from your avatar that you are not :), I would spend this R&D time on Unreal Engine and Houdini Engine practice.

 

Cheers 

E


Edited by sebkaine


Well sure, Unreal would be awesome. I've already tested this.

But how do you get scenes with particles, volumes and whatnot into Unreal intuitively?

You always have to make complex Alembic exports and the like. (With Engine it's also not that easy either; you'd have to have all objects in one HDA, etc.)

For me it was way faster to just render latlong previews. I'm pretty sure you'd find the same thing.

 

And yeah, hooking it up with CHOPs and such would probably not be such a big problem, but then you have to get the stereo working and so on.

So in the end you need a custom viewport output. That's surely possible, but quite hardcore.

 

Let's hope that someday these guys will support Houdini:

https://vr-plugin.com/

Edited by ameyer


I understand, Adrian, but I'll just give you my POV.

- Working in VR by applying the traditional VFX way to the 360 world is, imo, the wrong way to do VR.

- Importing tons of VDB assets / loads of gigabytes of POP caches etc. is certainly not an option in UE.

- But precalc sucks for 360; it's an awkward way, and 360 movies tend to be boring except when they are rendered 8K stereo with a good haptics setup.

I think you must also consider another option: shift your paradigm and accept that yes, realtime will give you a lot of constraints, BUT you will have an iteration power over the whole creative process that no precalc engine in the world can match ...

even RS with 4x 1080Ti.

If you are making a 360 experience and will dedicate 1 or 2 years of your life to it ... working in precalc with a VFX-paradigm way of doing things in mind is just the wrong way.

But if you start to modify some of the art direction + story to take RT restrictions into account, you will be able to consider your project as software rather than a movie, and you'll be able to achieve something far more impressive than what you expected.

 

You will, for example, be able to pack an experience and then distribute it on various platforms:

- like an interactive room-scale immersive movie with a Rift or Vive

- a wire-free interactive experience with an Oculus Quest

- an 8K stereo movie for a wider audience

 

10 min of 8K at 60 fps is 36,000 frames to render. If you render one 8K frame with RS in only 5 minutes, that's 180,000 minutes for the whole movie, i.e. 3,000 hours, or about 125 days, for one iteration of the movie.
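The arithmetic above, as a quick sanity check:

```python
# Render-time estimate for 10 minutes of 8K 60 fps precalc output
frames = 10 * 60 * 60        # 10 minutes at 60 fps
minutes = frames * 5         # 5 minutes of RS render time per frame
hours = minutes / 60
days = hours / 24
print(frames, hours, days)   # 36000 3000.0 125.0
```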

With Unreal you will be able to render your movie in one day on a single machine.

 

So before you jump into a clunky Maya + Houdini + precalc RS workflow, be sure to double-test the UE and RT workflow. If after that you're sure UE is not for you, that is OK, but at least you will have really pushed the UE options.

 

Good luck!

Cheers 

E

 

EDIT: and by the way, I agree with you that streaming the Houdini viewport to an Oculus, Vive or Valve Index would be great. This has already been asked for as an RFE; fingers crossed for H18 ...

Edited by sebkaine


Hi @ameyer,

one last (probably stupid but simple) idea: how about using VNC to get the viewport stream as a temporary solution? Have you already tried that?

Cheers,
Christian
