Atom

Moana Island Scene [Dataset]


A lot of the data is in .JSON files, which I have never really worked with before. Here is some code you can place inside a shelf tool to generate all the lights in the scene from the JSON file.

 

Change the hard-coded filename path to wherever you placed the downloaded files on your system.

import json

filename = r"S:\HoudiniProjects\Moana_Island\lights.json"
# Read JSON data into the records variable; the with block closes the file.
with open(filename, 'r') as f:
    records = json.load(f)

for node_name in records:
    light = records[node_name]
    #w = light["width"]
    #h = light["height"]
    light_type = light["type"]
    exposure = light["exposure"]
    col = light["color"]
    loc = light["location"]
    rot = light["rotation"]
    node = hou.node("/obj").createNode("hlight", node_name)
    if node is not None:
        node.parm("tx").set(loc[0])
        node.parm("ty").set(loc[1])
        node.parm("tz").set(loc[2])
        node.parm("rx").set(rot[0])
        node.parm("ry").set(rot[1])
        node.parm("rz").set(rot[2])
        node.parm("light_colorr").set(col[0])
        node.parm("light_colorg").set(col[1])
        node.parm("light_colorb").set(col[2])
        node.parm("light_exposure").set(exposure)
        if light_type == "quad":
            node.parm("light_type").set(2)
        elif light_type == "dome":
            node.parm("light_type").set(8)
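For reference, the script above assumes lights.json is a dictionary keyed by light name, with each entry carrying the type/exposure/color/location/rotation fields it reads. A minimal sketch of that shape, with made-up values (the real file is much larger):

```python
import json

# Made-up sample mirroring the fields the shelf script reads;
# the actual lights.json in the data set is much larger.
sample = json.loads("""
{
  "sunLight": {
    "type": "quad",
    "exposure": 2.0,
    "color": [1.0, 0.95, 0.9],
    "location": [100.0, 50.0, 0.0],
    "rotation": [0.0, 45.0, 0.0]
  }
}
""")

for name, light in sample.items():
    print(name, light["type"], light["exposure"])  # → sunLight quad 2.0
```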

 

Edited by Atom


This bit of code will generate the camera for the shot.

import json

filename = r"S:\HoudiniProjects\Moana_Island\shotCam.json"
# Read JSON data into the datastore variable; the with block closes the file.
with open(filename, 'r') as f:
    datastore = json.load(f)

camera_name = datastore["name"]
fov  = datastore["fov"]
focal_length = datastore["focalLength"]
loc = datastore["eye"]
rot = datastore["look"]
node_camera = hou.node("/obj").createNode("cam",camera_name)
if node_camera is not None:
    node_camera.parm("aperture").set(fov)
    node_camera.parm("focal").set(focal_length)
    node_camera.parm("tx").set(loc[0])
    node_camera.parm("ty").set(loc[1])
    node_camera.parm("tz").set(loc[2])
    node_camera.parm("rx").set(rot[0])
    node_camera.parm("ry").set(rot[1])
    node_camera.parm("rz").set(rot[2])
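One caveat: in the Disney data set the camera's "look" entry appears to be a look-at point rather than a set of Euler angles, so writing it straight into rx/ry/rz may not frame the shot correctly. A hedged sketch of deriving pan/tilt from eye and look in pure Python (assumes a Y-up, -Z-forward camera with no roll):

```python
import math

def look_rotation(eye, look):
    # Direction from the camera position toward the look-at point.
    dx = look[0] - eye[0]
    dy = look[1] - eye[1]
    dz = look[2] - eye[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # A Houdini camera looks down -Z, so yaw (ry) and pitch (rx), in degrees:
    ry = math.degrees(math.atan2(-dx, -dz))
    rx = math.degrees(math.asin(dy / dist))
    return rx, ry

# A camera at the origin looking straight down -Z needs no rotation.
print(look_rotation((0, 0, 0), (0, 0, -10)))  # → (0.0, 0.0)
```

Inside Houdini the same result could be set on the rx/ry parms instead of assigning "look" directly.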

 


Here is another piece of code that you can place inside a Python SOP (i.e. a Python node inside a SOP network, not an /obj-level Python node).

It will read the .JSON file and create a set of points with @pscale and @instancefile attributes applied.


import json

node = hou.pwd()
geo = node.geometry()

# Create the point attributes the instancer expects.
geo.addAttrib(hou.attribType.Point, "orient", hou.Vector3(1, 0, 0))
geo.addAttrib(hou.attribType.Point, "pscale", 1.0)
geo.addAttrib(hou.attribType.Point, "instancefile", "")

# Use raw strings so the backslashes in Windows paths are not treated as escapes.
data_path = r"S:\HoudiniProjects\Moana_Island"
filename = r"S:\HoudiniProjects\Moana_Island\isBeach_xgStones.json"

# Read JSON data into the records variable; the with block closes the file.
with open(filename, 'r') as f:
    records = json.load(f)

for k, v in records.items():
    for item, mtx in v.items():
        # At this level, each item is considered a point.
        point = geo.createPoint()
        if point is not None:
            # mtx is a flat list of 16 floats; hou.Matrix4 accepts it directly.
            m = hou.Matrix4(mtx)
            loc = m.extractTranslates('srt')
            point.setPosition(loc)
            rot = m.extractRotates('srt')
            point.setAttribValue("orient", rot)
            scl = m.extractScales('srt')
            point.setAttribValue("pscale", (scl[0] + scl[1] + scl[2]) / 3.0)
            instance_name = "%s/%s.obj" % (data_path, item)
            point.setAttribValue("instancefile", instance_name)
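For anyone curious what the extract* calls pull out: hou.Matrix4 here is row-major with the translation in elements 12-14, and the averaged pscale is just the mean length of the three basis rows. A pure-Python sketch of the same idea (no hou module needed, and assuming no shear):

```python
import math

def decompose(mtx):
    """mtx: flat list of 16 floats, row-major, translation in the last row."""
    translate = (mtx[12], mtx[13], mtx[14])
    # Per-axis scale = length of each 3x3 basis row; average them for a uniform pscale.
    scales = [math.sqrt(sum(mtx[4 * r + c] ** 2 for c in range(3))) for r in range(3)]
    pscale = sum(scales) / 3.0
    return translate, pscale

m = [2, 0, 0, 0,
     0, 2, 0, 0,
     0, 0, 2, 0,
     5, 6, 7, 1]
print(decompose(m))  # → ((5, 6, 7), 2.0)
```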

 


Here are the Python-based instance point generators for the isBeach portion of the data set. The set shown below is made up of 1,688,000 points.

Adjust the data_path in each object to point to the location of your downloaded data set.

Shown here are data sets for xgFibers, xgHibiscus, xgPalmDebris, xgPebbles, xgSeaweed, xgShells, xgShellsSmall and xgStones.


ap_moana_stones_070918.hiplc




I think sharing these production files is one of the most remarkable events in the CG industry in the past few years...

@Atom Hi, Atom! Thanks for the thread, it inspires me to dig into this data! Can you explain the PythonSOP usage? In your scene these nodes are not connected to anything...

Downloading island-basepackage.tgz, 6 days left...

 


They don't connect to anything because they are generators, much like a Box node has no input yet generates all the points that make a box. The Python node does "connect" to the .JSON file, which contains the data the script reads to generate the points. You need the data, and you need to point the hard-coded file path in the Python node at the Moana data, before anything shows up in the scene.

Yeah, it took me 3 days to get all the data downloaded. Just an FYI too: you don't really need the animated data. All that is in that data set is an .obj sequence of the ocean, which barely moves. The basepackage contains a single still of the ocean, which is enough to get going.



So the PythonSOP generates points with the necessary attributes, and the file1 node is meant to be scattered over these points?
 


No, file1 is just a leftover junk node, sorry about that; I should have removed them.

What gets scattered is the .obj file named in the s@instancefile attribute that the Python node generates from the point data set. This is why you need to set the hard-coded data_path to point to wherever you ended up storing your downloaded data set. You'll have to do this inside each Python node. This solution is very basic at this time.

# Change this variable to point to your downloaded island path.
data_path = r"S:\HoudiniProjects\Moana_Island\island-basepackage-v1\island"

 


How does Houdini evaluate the PythonSOP? If it cooks every frame, isn't it too expensive for scattering geometry?
I am quite fine with Python, but Houdini is still a mystery for me... I probably need to download the data and play with it to understand the setup.

Thanks, Atom!


The Python node is inside a Geometry node, which means it only gets cooked when needed. It will run once to generate the points. If you were to provide an alternate .JSON filename based upon the frame number, then the Python node would need to run every time the frame changes.

The data set is heavy, so when I open the scene it takes a minute or more for all those Python nodes to read the .JSON files and generate the points. My estimate is around 1,688,000 points just for the beach debris.
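As a sketch of the frame-dependent case mentioned above: inside a Python SOP you would get the frame from hou.frame(); here it is a plain variable, and the per-frame file naming is hypothetical.

```python
# Hypothetical per-frame naming; inside Houdini you would use
# frame = int(hou.frame()) instead of a hard-coded value.
frame = 1001
filename = r"S:\HoudiniProjects\Moana_Island\ocean.%04d.json" % frame
print(filename)  # → S:\HoudiniProjects\Moana_Island\ocean.1001.json
```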


Makes sense, thanks, Atom.

Also, I was wondering whether it's possible to produce the same results with VEX, and whether it would be faster. I'll definitely need to find a way to transfer the JSON data into a wrangle...
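One possible workaround (not from the thread): pre-bake the JSON into a flat table, such as a CSV, that a Table Import SOP or a file-reading wrangle could consume without a JSON parser. The sample data and column layout below are made up for illustration:

```python
import csv
import io
import json

# Made-up sample in the same shape as the instance JSON:
# group -> instance name -> flat 4x4 matrix (translation in elements 12-14).
sample = json.loads('{"archiverock": {"rock_0001": '
                    '[1,0,0,0, 0,1,0,0, 0,0,1,0, 2.0,3.0,4.0, 1]}}')

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "tx", "ty", "tz"])
for group, items in sample.items():
    for name, m in items.items():
        # Keep only the translation; scale/orient could be added as columns too.
        writer.writerow([name, m[12], m[13], m[14]])

print(buf.getvalue())
```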


I just got the isCoral data set imported and rendering. Here is my Redshift render with default materials. I still need to process the materials.json file for a better match. This image shows approximately 2 million points instancing .rs proxy objects.



Here is a short script you can place in a button on the shelf. It will read the Moana materials.json file and create a companion Redshift material in your SHOP network. A few values are forwarded into the node-creation process, such as diffuse color, IOR, and alpha. It could be extended to relay more information for a better material match.

import os,re,json

filename = r"S:\HoudiniProjects\Moana_Island\materials.json"
texture_path = r"S:\HoudiniProjects\Moana_Island\island-basepackage-v1\island\textures"

def returnValidHoudiniNodeName(passedItem):
    # Thanks to Graham on OdForce for this function!
    # Replace any illegal characters for node names here.
    return re.sub(r"[^0-9a-zA-Z\.]+", "_", passedItem)

def createRedshiftImageMapMaterial(passedSHOP, passedImageFilePath, passedName, passedDiffuse=[0,0,0], passedSpecular=[0,0,0], passedWeight=0.1, passedIOR=1.0, passedOpacity=1.0):
    print("->%s [%s]" % (passedName, passedImageFilePath))
    rs_vop = hou.node(passedSHOP).createNode("redshift_vopnet",passedName)
    if rs_vop is not None:
        rs_output = hou.node("%s/%s/redshift_material1" % (passedSHOP, passedName))  # Detect the default closure node created by the redshift_vopnet.
        if rs_output is not None:
            # Create.
            rs_mat = rs_vop.createNode("redshift::Material","rs_Mat")
            if rs_mat is not None:
                # Set passed values.
                rs_mat.parm("diffuse_colorr").set(passedDiffuse[0])
                rs_mat.parm("diffuse_colorg").set(passedDiffuse[1])
                rs_mat.parm("diffuse_colorb").set(passedDiffuse[2])
                rs_mat.parm("refl_colorr").set(passedSpecular[0])
                rs_mat.parm("refl_colorg").set(passedSpecular[1])
                rs_mat.parm("refl_colorb").set(passedSpecular[2])
                rs_mat.parm("refl_weight").set(passedWeight)
                rs_mat.parm("refl_roughness").set(0.23)         # Hard coded to soft blur reflection.
                rs_mat.parm("refl_ior").set(passedIOR)
                rs_mat.parm("opacity_colorr").set(passedOpacity)
                rs_mat.parm("opacity_colorg").set(passedOpacity)
                rs_mat.parm("opacity_colorb").set(passedOpacity)
                
                rs_tex = rs_vop.createNode("redshift::TextureSampler",returnValidHoudiniNodeName("rs_Tex_%s" % passedName))
                if rs_tex is not None:
                    # Wire
                    try:
                        rs_output.setInput(0,rs_mat)
                        can_continue = True
                    except Exception:
                        can_continue = False
                    if can_continue:
                        if passedImageFilePath.find("NOT_DETECTED")==-1:
                            # Only plug in texture if the texture map was specified.
                            rs_mat.setInput(0,rs_tex)                       # input #0 is diffuse color.
                        extension = os.path.splitext(passedImageFilePath)[1]
                        files_with_alphas = [".png",".PNG",".tga",".TGA",".tif",".TIF",".tiff",".TIFF",".exr",".EXR"]
                        if extension in files_with_alphas:
                            # Place a sprite after the rsMaterial to implement opacity support.
                            rs_sprite = rs_vop.createNode("redshift::Sprite",returnValidHoudiniNodeName("rs_Sprite_%s" % passedName))
                            if rs_sprite is not None:
                                rs_sprite.parm("tex0").set(passedImageFilePath)    # set the filename to the texture.
                                rs_sprite.parm("mode").set("1")
                                rs_sprite.setInput(0,rs_mat)
                                rs_output.setInput(0,rs_sprite)
                                #rs_mat.setInput(46,rs_tex)                  # input #46 is opacity color (i.e. alpha).

                        rs_tex.parm("tex0").set(passedImageFilePath)    # set the filename to the texture.
                        
                        # Remove luminosity from the texture using a color corrector.
                        rs_bump = None
                        rs_cc = rs_vop.createNode("redshift::RSColorCorrection",returnValidHoudiniNodeName("rs_CC_%s" % passedName))
                        if rs_cc is not None:
                            rs_cc.setInput(0,rs_tex)
                            rs_cc.parm("saturation").set(0)
                            # Add a slight bump using the greyscale value of the diffuse texture.
                            rs_bump = rs_vop.createNode("redshift::BumpMap",returnValidHoudiniNodeName("rs_Bump_%s" % passedName))
                            if rs_bump is not None:
                                rs_bump.setInput(0,rs_cc)
                                rs_bump.parm("scale").set(0.25)          # Hard coded, feel free to adjust.
                                rs_output.setInput(2,rs_bump)
                                                
                        # Layout; skip any nodes that failed to create.
                        for n in (rs_vop, rs_tex, rs_cc, rs_bump, rs_mat, rs_output):
                            if n is not None:
                                n.moveToGoodPosition()
                else:
                    print("problem creating redshift::TextureSampler node.")
            else:
                print("problem creating redshift::Material node.")
        else:
            print("problem detecting redshift_material1 automatic closure.")
    else:
        print("problem creating redshift vop net?")

# Read JSON data into the records variable; the with block closes the file.
with open(filename, 'r') as f:
    records = json.load(f)

for item in records:
    # Some sub-items only exist some of the time;
    # referencing a missing key raises a KeyError.
    #diffTrans = records[item]["diffTrans"]
    baseColor = records[item]["baseColor"]
    alpha = records[item]["alpha"]
    colorMap = records[item]["colorMap"]
    displacementMap = records[item]["displacementMap"] 
    ior = records[item]["ior"]
    roughness = records[item]["roughness"]
    reflection_weight = 0.1
    shader_name = "rs_%s" % item
    diffuse_color = (baseColor[0],baseColor[1],baseColor[2])
    specular_color = (0.9,0.9,0.9)

    createRedshiftImageMapMaterial("/shop", "%s/%s" % (texture_path,item),shader_name,diffuse_color,specular_color,reflection_weight,ior,alpha)
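Regarding the "some sub-items only exist some of the time" comment in the loop above: dict.get with a default avoids the KeyError for optional keys, at the cost of choosing a fallback value (the defaults below are my own guesses, not taken from the data set):

```python
# Example material record missing the optional colorMap key.
record = {"baseColor": [0.5, 0.4, 0.3], "ior": 1.5}

color_map = record.get("colorMap", "")    # "" when the key is absent
ior = record.get("ior", 1.0)              # key present, so the default is ignored
print(repr(color_map), ior)  # → '' 1.5
```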

 


I think it is possible to use parmTuple for all channels at once:

rs_mat.parmTuple("diffuse_color").set(passedDiffuse)


Here is the Fibers, Grass and Palm Debris geometry from the isCoastline data set.

 



Here is a first look at the isMountainA data set with foliage and palms points loaded.


 

And here is isMountainB with everything except LowGrowth. The low-growth point set is 3.2 GB of point data all by itself.




Why can I see the scattered objects only at the scene root level, and not in the Geometry context where the PythonSOP is located (only points are displayed there)?


That is the normal operation for instances. You only see points at the SOP level; the instancing to the viewport is handled at the /obj level.



Here is the first draft of the isDunesA data set.


