Using Soho: Indigo Renderer Support


Day One

I downloaded the 32-bit Windows version of Indigo and installed it.

I'm a little familiar with how the soho code for the pipelines distributed with Houdini is laid out - it's basically:

* IFD.py - the starting point

* IFDframe.py - all the entities which are written out per frame

* IFDgeo.py - called by IFDframe generally - defines the geometry output

* IFDapi.py - ad hoc functions imported into every module

* wranglers/HoudiniLightIFD.py - functions called to define how lights get exported in IFD

My tactic is always to hack up a working pipeline to a point, and then rebuild it in the name of optimization.

I copied $HIH/soho/IFD.py to a new directory and changed its name to indigo.py. I axed a lot of the second half of the code that was specific to Mantra - just leaving comments at some general points - and surgically removed any other Mantra-specific references.

In Houdini, I used the Type Manager on the Mantra ROP to duplicate the HDA and changed the name and label to Indigo. I created an Indigo node and, in the Type Properties, changed the soho_program to point at my indigo.py instead of IFD.py. I also changed the program from "mantra" to "indigo", and so on.

Hitting Render now, I am using my indigo.py script; I import hou and put in some hou.ui.displayMessage() calls to see what I'm doing.

The crux of what SOHO is meant to do is catch anything printed to stdout and redirect it into a file. This file then gets submitted to the program of your choice. SOHO also assists you with a couple of other things: it determines which objects are to be output in your scene and builds a nice list, so you're not responsible for redoing this for every exporter you write - and with crazy subnets and so on, that's not an easy task. It also has a mechanism for quickly querying large batches of parameters in the hierarchical Rendering Properties way.
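
For anyone following along, the trimmed-down top of my indigo.py ends up looking roughly like the sketch below. It follows the pattern in IFD.py (SohoParm/soho.evaluate for the control parameters, then initialize/addObjects/lockObjects/objectList), but I've simplified the control parameters and object patterns here, so treat it as a skeleton rather than the real thing:

import soho
from soho import SohoParm

# Control parameters evaluated from the ROP, IFD.py-style.
controlParms = {
    'now'    : SohoParm('state:time', 'real',   [0],           False, key='now'),
    'camera' : SohoParm('camera',     'string', ['/obj/cam1'], False, key='camera'),
}
parms  = soho.evaluate(controlParms)
now    = parms['now'].Value[0]
camera = parms['camera'].Value[0]

# Let SOHO work out which objects, lights and fogs are in the render.
if not soho.initialize(now, camera):
    soho.error("Unable to initialize rendering module with the given camera")
soho.addObjects(now, '*', '*', '*', True)   # objects, lights, fogs (patterns simplified)
soho.lockObjects(now)

# From here on, anything printed goes into the stream SOHO hands
# to the "indigo" program named on the ROP.
for cam in soho.objectList('objlist:camera'):
    pass    # write the camera entity
for obj in soho.objectList('objlist:instance'):
    pass    # write mesh/model entities
for light in soho.objectList('objlist:light'):
    pass    # write light entities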

Now I examine the Indigo doc PDF, load up all their test scenes (.igs) and examine the format. It's XML and seems simple enough: define a "camera", define lights of various kinds, define "material"s, define "mesh" entities, and define "model"s which instantiate the meshes. All good - nothing too confusing.

So I begin by wanting to set up the simplest scene: a triangle mesh of a box.

From the original IFD.py I see how to loop over objects, get the camera and so on, and set up quick procedures for outputting a simple camera, mesh and model definition. Unlike the IFD pipeline, I just put all my code into the one indigo.py file until I get to a point where I might want to split it out.

I set what I can (inspecting my output .igs file) and, when I'm happy I have what looks like a valid XML file, I launch Indigo against it. It complains about all these missing entities one by one, so I substituted some hardcoded prints in there temporarily just to squeeze through Indigo's parser and start a render. It turns out that although there are defaults for many of the entities/tags, Indigo needs them defined anyway (which seems to contradict having a default in the first place, doesn't it?). This bit takes some time, hammering the output into parsable shape.

Finally, success! Something renders... a bit of the sky! Perfect. And that's all I could expect given the crappy cut-and-pasted entities from various examples.

What I have learned about Indigo so far:

  • Triangle meshes only - no quads or general polygons. Boo!
  • No motion-blur blocks or anything of the sort in any example or the documentation.
  • Needs an exhaustively set-up camera.
  • Can instance geometry.
  • XML format - might get large on big geo.
  • Can reference external .3ds, .obj and .ply geometry.
  • Can read/write EXR files - goody!


Day Two

So now I want to get my geometry defined properly. I can read the SOP geometry something like this:

soppath = obj.getDefaultedString("object:soppath", now, [""])[0]
geo = SohoGeometry(soppath, now)

npts   = geo.globalValue('geo:pointcount')[0]
nprims = geo.globalValue('geo:primcount')[0]

# Write every point position out as a vertex.
atrP = geo.attribute('geo:point', 'P')
for pts in xrange(npts):
    P = geo.value(atrP, pts)
    print "vertex pos=\"%f %f %f\" " % (P[0], P[1], P[2])
...
# Then write each primitive as a "tri" referencing those point numbers
# (this assumes the SOP is already triangulated, since Indigo only takes tris).
atrVertexCount = geo.attribute('geo:prim',   'geo:vertexcount')
atrPointref    = geo.attribute('geo:vertex', 'geo:pointref')
for prim in xrange(nprims):
    nvtx = geo.value(atrVertexCount, prim)[0]
    print "tri" + "".join([" %d " % geo.vertex(atrPointref, prim, vtx)[0] for vtx in xrange(nvtx)])

This loops through all the points and writes them out, then follows with a loop through all the primitives, printing the points each one references.

Looking at the transforms in the model entity, I can only specify a position and a 3x3 rotation matrix. I haven't figured out the conversion from hou.Matrix4 to a Matrix3 yet, so I'll just translate stuff and expect rotations to be weird.

I put the camera at the origin and move some objects around and render away...

LATER:

Ok, so I convert the Matrix4 transform to a Matrix3 - which is a rotation + scale matrix - and Indigo seems to respect scales in the 3x3 (unexpectedly, but luckily), because the <scale> tag is only a uniform scale and I was worried about how to represent non-uniform scales for instances.
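
For the record, the "conversion" is nothing more exotic than peeling the 4x4 apart - a little helper that takes the object's world transform as a hou.Matrix4 and returns the position plus the upper-left 3x3 block. The fact that the 3x3 carries the scale along with the rotation is exactly why the non-uniform scales survive:

def pos_and_rot3(xform):
    """Split a hou.Matrix4 world transform into the position and the
    3x3 rotation(+scale) block that the Indigo model entity wants."""
    m = xform.asTuple()              # 16 floats, row-major
    pos  = (m[12], m[13], m[14])     # translation lives in the fourth row
    rot3 = (m[0], m[1], m[2],        # upper-left 3x3: rotation * scale
            m[4], m[5], m[6],
            m[8], m[9], m[10])
    return pos, rot3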

I have now decided that I should start to set up some of the basic parameters available to Houdini and SOHO, just to get it off the ground and running. All the Rendering Properties are stored in $HIH/soho/parameters. I have started one called Indigo.ds and begun populating it with Indigo's parameters (see attached image). There is some awkwardness in a casual environment like my Windows setup at home with respect to the .ds files being included from the right location: without a HOUDINI_PATH set up with a fallback search path from $HOME/houdini9.1 to $HFS, I have to copy all the .ds files manually over to the houdini9.1 location or they won't be found. Perhaps I can figure out a better way of doing this later.
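
(For the record, the usual fix is a HOUDINI_PATH with a fallback - something like the following in houdini.env, where the '&' expands to the default search path - but I haven't bothered setting that up at home yet.)

# houdini.env: search $HOME/houdini9.1 first, then fall back to the defaults
HOUDINI_PATH = "$HOME/houdini9.1;&"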

[attached images]


That's cool and explains a lot about SOHO to me.

But where did you get the information about all those *.getDefaulted* things?

Reverse engineering?

Afraid so :) A lot of SOHO learning is reverse engineering - at least for me. SESI have made attempts to train people in SOHO but it's quite intricate in many ways.


Jason,

Could you tell us a bit about which library you're using to deal with XML files? I must say that this is my biggest problem for now - dealing with XML tags. Minidom or ElementTree simply don't fit. Sometimes I have a feeling that XML was the idea of a real evil...

Since I'm just writing XML, I'm cutting down on complexity and RAM use by simply printing the tags myself, which seems to me to be an acceptable trade-off.

I just make a couple of helper functions like:

def i_entity_start(entity):
    # Open the tag and have SOHO indent everything that follows.
    print "<%s>" % entity
    soho.indent(1, None)

def i_entity_end(entity):
    # Unwind the indent and close the tag.
    soho.indent(-1, None)
    print "</%s>" % entity


  • 2 weeks later...

I managed to spend an hour on the exporter last night. I took some time to go through the Indigo PDF manual (which is really just a syntax doc), populate more parameters, and neaten up some ugly stuff I had left around from hacking the IFD pipeline. I added a tiny block of code to query all the ROP settings for the "render_settings" and "tonemapping" blocks of the XML spec.
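
The querying is just more of the getDefaulted* pattern, this time on the output driver itself. The i_* property names below are ones I've invented for my Indigo.ds, and the inner tag names are from my reading of the spec, so double-check both against the manual:

rop = soho.getOutputDriver()

# Hypothetical Indigo ROP properties from my Indigo.ds - substitute your own.
width  = rop.getDefaultedInt('i_width',       now, [640])[0]
height = rop.getDefaultedInt('i_height',      now, [480])[0]
halt   = rop.getDefaultedFloat('i_halt_time', now, [-1.0])[0]

print "<render_settings>"
print "  <width>%d</width>"   % width
print "  <height>%d</height>" % height
print "  <halt>%g</halt>"     % halt
print "</render_settings>"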

I will spend a post explaining how the SOHOparameters thing works - and the black art of creating an Indigo ROP which sources and creates its own default Properties. Currently my Indigo ROP is a copy of the Mantra ROP, so all my default Properties are Mantra's; I need to get it to create Indigo ones instead.

After this I'll tackle the problem of defining Lights, and then finally, Materials.


  • 4 weeks later...

Day Four: Lights and Wrangling

The first thing I did was go through the Indigo manual to examine what types of lights Indigo seems to support. The information was kind of scattered about, but I found that it can represent the following types of lights:

  • Sky Light (sun direction)
  • Environment Map (lat-long EXR or spherical hdrshop file)
  • Rectangle area light (with no rotation - only x/y, so it's not possible to orient the grid)
  • Meshlight (geometry can emit light)

I was a little surprised to find there were no spotlights. As is expected in a physically-based renderer, there are no scoping or shadow controls - only emission controls. This does make defining a light a little simpler and easier for me.

My first task was to build an Indigo Light - a new light from a Light Template dedicated to all the Indigo features. So I downloaded the .tar file from this page: SohoWranglers, and started to add all the parameters to my custom Indigo Light. I have decided to make this single light type handle the Skylight, Environment Map and Rectangle area light types; the Meshlight seems like it might be handled better by adding some meshlight rendering parameters to existing Objects.

I now add SOHO code to loop through all (scoped) lights in the scene and output them into the IGS stream (the Indigo file format). One by one I query all the light parameters and, depending on their values, write out blocks defining the type of light and its parameters. It doesn't take too long to write this bit - it's straightforward parameter evals for the most part.
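
In sketch form it's the same getDefaulted* drill per light. The parameter names below are the ones on my custom Indigo Light (so purely illustrative), and the output tags are from my reading of the example scenes:

for light in soho.objectList('objlist:light'):
    # Hypothetical parameter names from my custom Indigo Light.
    ltype = light.getDefaultedString('i_light_type', now, ['skylight'])[0]
    if ltype == 'skylight':
        sundir = light.getDefaultedFloat('i_sun_direction', now, [0.0, 0.0, 1.0])
        print "<skylight>"
        print "  <sundir>%f %f %f</sundir>" % (sundir[0], sundir[1], sundir[2])
        print "</skylight>"
    elif ltype == 'env_map':
        envmap = light.getDefaultedString('i_env_map', now, [''])[0]
        gain   = light.getDefaultedFloat('i_env_gain', now, [1.0])[0]
        # ...write the env_map block here, and so on for the other types...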

These parameters that I have manually added to my Indigo Light should no doubt be added to my Indigo.ds SOHOparameters file too.

So now, on to "wrangling". What is wrangling? It's essentially a "layer" of code that will translate parameters from a Light to ones that make sense for the exporter. I figure there are really two reasons to write wranglers:

  • (fixed light, varying back-ends) To translate the standard Houdini Light so it can act as a native light in your soho pipeline - e.g. translate HoudiniLight parms to IndigoLight parameters so the Indigo soho export works.
  • (varying lights, fixed back-ends) To let you make a custom studio light which will be translated to any or all of the pipelines - e.g. a BobFXHouseLight to IFD, RIB or IGS.

Starting again from the wrangling.tar.gz file mentioned above, I start to consider how HoudiniLight parameters might define a valid IndigoLight. It seems that it would support a few basic parms of the Indigo Light:

  1. If the HoudiniLight's Area is set to Grid, let's define a rectangle light. The HoudiniLight only has a single size setting, so the IndigoLight rectangle would have to be square only. The lightcolor parm would be translated to RGB spectrum colors (rectanglelight:spectrum:rgb).
  2. If the area is set to Environment and Use Area Map is on, we can define an Environment Map light type in Indigo. The dimmer can be mapped to env_map:lat_long:gain.

I add the parameters and the logic to translate them to Indigo-like parameters, so I still have only one section of code that knows how to write out Indigo lights; I don't have to duplicate or write any more code for outputting HoudiniLights, because now I have a nice translator.
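
Here's the idea in miniature. This is not the actual wrangler class interface from the HoudiniLightIFD.py tarball - just the translation step it performs - and the HoudiniLight parm names and menu tokens are approximate:

def wrangleHoudiniLight(light, now):
    """Translate a stock HoudiniLight's parms into the Indigo-ish ones
    my exporter already understands (a sketch of the idea only)."""
    areatype = light.getDefaultedString('areatype', now, [''])[0]
    clr      = light.getDefaultedFloat('lightcolor', now, [1.0, 1.0, 1.0])
    dimmer   = light.getDefaultedFloat('dimmer', now, [1.0])[0]

    if areatype == 'Grid':
        # Only one size parm on the HoudiniLight, so the rectangle is square.
        size = light.getDefaultedFloat('areasize', now, [1.0])[0]
        return {'rectanglelight:spectrum:rgb': [c * dimmer for c in clr],
                'rectanglelight:size': [size, size]}     # the :size key is made up
    elif areatype == 'Environment':
        envmap = light.getDefaultedString('areamap', now, [''])[0]
        return {'env_map:lat_long:path': envmap,         # the :path key is made up
                'env_map:lat_long:gain': dimmer}
    return {}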

Later tonight I will post some illustrative code fragments to explain this a little more deeply.


I was a little surprised to find there were no spotlights. As is expected in a physically-based renderer, there are no scoping or shadow controls - only emission controls.

Hi Jason,

By spotlights, do you mean directional point lights?


Yup. Or, I guess, directional area lights. I haven't yet played with meshlights either - I'm not sure how efficient or effective they'd be.

As I mentioned on another thread, I don't think there are "point sized" lights in renderers like Maxwell or Indigo. "Lights" are actually materials, and artists apply them to whatever mesh they have to simulate a light bulb. To make it directional, you'd have to model the lamp's shade.

I don't know how they deal with the sun, but I guess it's not a point either, because there is some softness to its shadow too.


Ok, so I made some progress this weekend.

I have basically implemented all the materials (and "mediums") as SHOPs and translated them into the IGS file. I have also exposed the uv_sets in a way Houdini artists are used to working with. I'll write more in the morning on how I went about implementing the materials and mediums.

For some reason my transparent materials aren't working properly - I suspect ill-defined normals, perhaps - although everything else shades fine.

I have two other outstanding issues:

  1. My camera parameters aren't very well defined. I find the mapping of parameters to the Indigo camera a little confusing, and I'll probably have to enlist the help of someone more enlightened.
  2. My skylight is using an Indigo-ish Z-up vector in the light parameters. I think I should rather take the direction of the light to be the sun direction and translate between Houdini's Y-up and Indigo's Z-up conventions (a sketch of the axis swap is below).
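
Something like this for the axis swap - using one common Y-up to Z-up convention (a +90 degree rotation about X); the signs may still need flipping once I see which way Indigo expects the sun vector to point:

def yup_to_zup(v):
    # Houdini is Y-up, Indigo is Z-up; rotate +90 degrees about X.
    # Sign conventions still to be confirmed against Indigo's sundir.
    return (v[0], -v[2], v[1])

# e.g.  sundir = yup_to_zup(light_direction)   # light_direction: the light's aim vector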

I'll be ready to send out this code as-is to any willing alpha-test group who can make comments and changes. :)

[attached images]


Day Five - Materials

Looking through the documentation, I see that there are two types of shader entities - one called "materials" and another called "mediums". Materials can use Mediums to define the interior space of the object, which really seems to define how light is transmitted/scattered through the object's interior. There is a fixed set of materials Indigo supports - there is no programmable shading and there are no displacements.

So the task here is to define all the available mediums and materials and parse them into the IGS. It seems like a good fit to create a SHOP for each supported material and medium. There is no way in SHOPs (that I know of) to create a new SHOP context for Mediums, so I'll have to piggyback onto an existing SHOP type. Perhaps one day Indigo will support displacements, so I'll just re-use the Surface shader context and put a hidden field in the SHOP parameters to distinguish a Medium from a Material. This is the reason you'll see a hidden i_category field on each SHOP, set to either "material" or "medium". I hope that in the future SESI will open up SHOPs a bit more and allow arbitrary SHOP types to be easily defined - which would also open up Houdini Materials to accept more than just the currently hard-coded set of SHOP types.

Instead of relying on an explicit SHOP type name, like "i_diffuse" for a diffuse material, I also decided to store the material's real type in a hidden field on the SHOP - which allows future versions of the SHOP to be created without having to modify the SOHO code to recognise them. For this there is an "i_type" hidden string field on each SHOP, set to "diffuse", "phong", "glossy_transparent" and so on.

It seems as if Indigo wants the Mediums defined first, so instead of parsing through to find out which Materials are in use and which aren't, I just output all SHOPs which are Mediums first, then scan through the hip file again and output all SHOPs which are Materials.
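
Since I already import hou in the pipeline, the two-pass scan is simple enough - something like the sketch below, where i_category and i_type are the hidden fields described above, and /shop is just where the SHOPs happen to live in my test scenes:

import hou

def output_shops(category):
    """Emit every SHOP whose hidden i_category field matches
    'medium' or 'material'. A flat scan of /shop for now."""
    for shop in hou.node('/shop').children():
        catparm = shop.parm('i_category')
        if catparm is None or catparm.eval() != category:
            continue
        itype = shop.parm('i_type').eval()   # 'diffuse', 'phong', ...
        # ...write this SHOP's medium/material block here, keyed off
        #    itype rather than the node's operator type name...

output_shops('medium')      # Indigo wants the mediums defined first
output_shops('material')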

More to come...

