Everything posted by LaidlawFX

  1. Renderman Shading Language guru ?

    Country hopping ain't easy, and is very time consuming, lol. Compute Lighting does a lot of the old black magic internally so you don't have to think about it. It's just an HDA, so you can see how everything is split up inside. The key part is that you don't need to plug Ce into anything. A Bind parameter with the export option enabled is all you need. That is what passes the value to Mantra, or better said, what Mantra looks for from the compiled shader. So if you dive into Compute Lighting you'll see many of those "dead ends" for parameter VOPs. Their names correspond to the VEX variables of the Mantra Extra Image Planes. For the standard ones they are just toggle boxes nowadays. But to understand it, you can make a custom variable called foo in your shader, then add an extra image plane with the VEX variable foo. Not what you are after, I know; I haven't directly answered your question, just trying to point out the flow from the basics. Once you get over those humps the rest is quite easy. Cd is only the term as defined in SOPs. Once you get into material land, Cd is only used as a bind import and then multiplied against some other variable; they stop using the term Cd anywhere else in that context. Cd used to mean Color Diffuse, but that logic is long gone in the age of albedo and PBR. Legacy, YEAH! If you really want, you could call your SOPs attribute albedo to make it easier, and instead of importing Cd you import albedo, for instance. You won't see the color in the viewport like you do with Cd, but it's more of an understanding of the oddity of Mantra versus Renderman than anything practical. Hopefully, my non-answer answer makes more sense now, lol.
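    The foo idea above boils down to just a couple of lines of VEX. A minimal sketch (shader and parameter names are made up; this is roughly what a parameter/bind VOP flagged as an export compiles down to, not a production shader):

```vex
// The "export" keyword on the parameter is all Mantra needs
// to pick the value up as an extra image plane.
surface foo_export(export vector foo = 0;
                   vector basecolor = {1, 0, 0})
{
    foo = basecolor;   // lands in the extra image plane named "foo"
    Cf  = basecolor;   // the beauty still gets a color
}
```

    Then on the Mantra ROP, add an Extra Image Plane with the VEX Variable set to foo (vector type) and it shows up as its own plane in the render.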
  2. I can't take credit for it, but it needed to be shared. This made me cry with laughter.
  3. Houdini 18 Wishlist

    Welcome to the forum. And surprises await you very soon.
  4. Renderman Shading Language guru ?

    I'm in transit for the next few days, so no Houdini in front of me. Edit... unless realtors are being a pain in the arse... So, the way I transitioned between the two languages: I started with basic shaders. The new physical models are far more complex nowadays than when I switched. To find a simple example, go to the material library, drop down a basic diffuse, then go check it out in VOPs. One of the biggest first bumps is to render a constant shader in Mantra. There are four render engine combinations, unlike Renderman last I worked with it. The default engine on Mantra is not PBR but raytrace. The main difference as far as shading goes is that they are a combination of two geometry processes and, for practical purposes, two shading methods. Raytrace and micropolygon are the geometry processes, and non-PBR and PBR are the shading methods. So it's important to place a Mantra ROP and change the render engine so you can shade correctly when developing like you are. For PBR, to make a constant shader, wire a parameter or constant into a bind export called Ce (i.e. emission). In the basic diffuse example this is contained in the Compute Lighting VOP, which is really just a custom export VOP for Mantra's myriad render layers. For PBR, to make a diffuse with color, take a Lambert VOP, which exports f (the bsdf, the yellow wire), multiply it with the color from a constant or parameter VOP, and wire that into the Output Parameters and Variables. For a non-PBR constant shader, wire a constant or parameter into the color (Cf) on the Output Parameters and Variables VOP. For a non-PBR diffuse shader, wire the color from the Lambert VOP, multiply it with the color from a constant or parameter, and then wire it into the color (Cf) on the Output Parameters and Variables VOP. Sorry for the basics; it's actually one of the biggest gotchas when switching between the two. The harder stuff is actually easier.
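    The wiring cases above condense into a few lines of VEX. A sketch with assumed names (Ce written as a bind export as described above; F and Cf are the standard Mantra surface context globals), not a definitive implementation:

```vex
// Constant shader, both engines in one shader.
surface constant_sketch(export vector Ce = 0;   // PBR reads this export
                        vector clr = {1, 1, 1})
{
    Ce = clr;   // PBR engine: emission bind export
    Cf = clr;   // raytrace/micropolygon: direct output color
}

// Diffuse with color, PBR engine: tint the bsdf (the yellow "f" wire).
surface diffuse_sketch(vector clr = {0.5, 0.5, 0.5})
{
    F = clr * diffuse(normalize(frontface(N, I)));
}
```

    Rendering either with the Mantra ROP's engine set to the matching mode should show which half of the shader is actually being evaluated.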
  5. Renderman Shading Language guru ?

    Your best bet is to deconstruct a Mantra shader to see what it is doing. VEX and RSL are similar, but Mantra and Renderman both have their quirks that make them different. With PBR, your color output with no lighting is called emission. Look at some of the simpler shaders, like diffuse, to see an example. Otherwise, just multiply color by the bsdf.
  6. Houdini 17 Wishlist

    Houdini 16 launch has been announced, February 21st! Time for the dreamers and the wishlist* to move on to the next major version. Perhaps a 16.5... as gleaned from the Amarok event??? So IMO the gauntlet has been thrown down. Houdini needs to be a fully fleshed-out 3-D package that any person can create content in from beginning to end. No more weak spots where you have to dive into another package because you have no choice. Halo, I mean COPs, I'm looking at you. If I want to use another package that's fine, but I should no longer need another tool from my tool belt for the beginning-to-end of my 3-D authoring pipeline needs. Houdini Engine UI functionality needs to be more fully integrated into its host packages: Blueprint nodes in Unreal, network-editable nodes in Unity and Maya, fully fleshed-out UI options for parameter interfaces (rollovers, help, disable-whens, hide-whens). I should be able to create one tool for all my host programs, especially if they are SideFX-supported ones. Lead Houdini Engine by example, so when I want to implement it into my own pipeline or tools I know it can be done. A unified node context. I know this drives people bananas, but it should be a choice to work in different node contexts, not a mandatory obligation that you need to be in /obj/ (Scene), /obj/sop/ (SOPs), DOPs, Materials, etc. to perform those tasks. Houdini was created from the combination of several different programs, as defined by these contexts, nearly 20 years ago now. It's time to UNITE them all! We can still keep the old contexts, just as POPs still lives under the hood, or you can just unleash the / context to us all, but it would be nice to work in a unified context, i.e. Nuke. And as always, it's the user's responsibility to keep their network clean! Thank the SideFX gods for the wire dots and the circular contexts. More fully fleshed-out presets. 
The shelf is good, but if I'm working on a commercial or doing some R&D for a bigger project I need a fully fleshed-out setup. The setups exist out there, but I shouldn't need to rebuild the same setup at every studio I work at. Additionally, the shelf tools need some love. Just make them the same as HDAs, with all their functionality. Add an Extract Contents feature. Don't keep them the separate beast that they are. HDAs are powerful; shelves are deprived and in constant sadness next to their tool brethren. An example library for each node, with code examples that are easy to view and find. It's rare to find the examples spread through the Houdini Docs. If I could have the Help pane, or an example pane, that I could search through, that would be amazing. This could be tied in with more fully fleshed-out presets. You don't necessarily need a lone example per file; combined ones often make greater sense. The Orbolt pane, for instance. The upgrade to the Help docs has been awesome, including the more graphical documentation, i.e. the packed SOP, but those example files are trailing. More common studio tools that are predefined. Every studio ends up creating special importers and exporters that all, in the end, do the same thing. Just create a few common studio nodes that can be easily manipulated, either via Python modules as they presented in their rigging tools, or by non-compiled File SOPs and ROPs. The Alembic ROP is a very convenient example of showing the code so you can manipulate it. I shouldn't need multiple different contexts and nodes to import and export geometry and data. An uber File SOP to load them all. An uber File Cache to export them all. One ring! My precious! I would still love to take all the older nodes, like the Point SOP, and have them converted to VOPs/wrangles. Maintain the same parameter UI, but have a little button or switch that flips from a wrangle to a code version. There is a certain sense that there is still a layer of black box with each of these nodes. 
This is where the Fabric crowd and programmers say they don't understand what is happening, flip a table, and say they need to build it from scratch. I can understand proprietary algorithms being compiled black-box nodes, but the Point SOP... come on now, this isn't a dark secret to the world. This would allow us to retire so many old nodes. Speaking of which, the node count in Houdini is only getting more ridiculous each version. There is no way one person can know them all. I LOVE all the new features, but there comes a point when there are too many nodes. The biggest hindrance to new people is not knowing that a node exists that they can use. Node acumen should not be a barrier to using Houdini. The Houdini learning curve is dropping faster and faster. However, I've used Houdini for a decade on a wide variety of projects, and I can easily say I have not used every node. That's cool, but it's also ridiculous. There does not need to be a multiply, add, add constant, etc.; a single math node would suffice. opalias that stuff! There needs to be a survey of all the nodes: alias them to a wrangle/VOP and retire! retire! retire! those nodes. Plus make some useful examples along the way. Ok, I think I ranted enough. My blood got pumping for Houdini 16 and I'm stoked about the new toys. I cannot wait for this new Lego set, and to work on some more amazing projects. And yes, I will make my nodes look like Legos... *As a note, any true bugs or RFEs please send to SideFX Support. This is only an un-official wishlist, so we can compare notes, rant, and rave.
  7. Renderman Shading Language guru ?

    RMB on the VOPs and look at the VEX code. It should line up extremely similarly.
  8. .hip file versioning (git)

    Yeah I would not recommend doing this either. What @berglte said.
  9. VFX to Realtime Transition

    If you specifically want to go into FX for VR, the field is quite small at the moment. A few notable companies are pushing projects where they might have one or two full-time traditional FX positions specific to AR/VR/MR, for instance Facebook. However, it's a very small market outside of indie. Going into the AR/VR/MR market from games would certainly help, especially if you are coming from Unity as opposed to Unreal; HoloLens publicly works with Unity, for instance. It may help you to get a good understanding of the different types of tech and their capabilities before you make the dive. The hardware capabilities drive a lot of what you can do, and unless you wear a headset that is connected to a PC, versus one only capable of mobile computing, your ability to do a lot of graphics is limited. It's a fun challenge to be sure, but if you are not a fan of trying to do your magic while making it as efficient as possible, it may not be worthwhile. I've actually only heard of a very few times where someone goes through the traditional path of going to school, then intern, to junior, up the ranks, so IMO it's always a sideways happenstance. If you have the desire, then learn up and go. The market is still pretty young for VR/AR/MR, where people need to rediscover old-school tricks and leverage modern tech to get a great experience. Just browsing through all the games available on Steam is a great opening course on what is possible. You could also find that AAA games are more up your alley, too. Games and VR/AR/MR are close enough to be lumped together, but the different types of games are a very vast medium compared to, say, film and commercials. Games, films, and commercials all live under the same umbrella of 3-D content creation, but games have a wider technology base that handles the processing of FX. FX in games are more dependent on the hardware, since user input creates multiple possible outcomes, than in film and commercial, where the device only needs to play a 2D image. 
The content of the image may be the same pretty explosion, but depending on the company and the pipeline, in games you will be doing different tricks. In film you may whittle your renders from 30 hours to 30 minutes; in games you whittle your renders from 10 milliseconds to 1 microsecond. Your compositing goes from Nuke to HLSL shaders. Same basic math and functions, but a different set of overhead, algorithms, and hardware to process it. IMO, if you are not interested in the aforementioned ideas, of building a test level and the peripherals, getting the general feel of what it is like working in the game engine, then it's not really a good choice for you at this time. I'm not saying you need to be a level designer, programmer, or any of the other specialties of games not really found in film and commercials, but these are some basic components of FX in games and VR/AR/MR that you need to do on a regular basis. For instance, in film you make an FX one-off, and you may make a tool if you do that FX often for a sequence or film. In games and VR you'll set up a test level where you have all your FX playing on loop, or set to run via commands for guns, so you can check on their quality constantly. I don't think this process is that much different from an image morgue for lighting artists' reference, but if you don't find the concept at least tolerable, or even enjoyable, then that's a good flag to know for yourself. On the real fun side, in this test environment you are the one who has made all these repeatable methods of destruction and chaos. I've seen FX artists just love to shoot and blow up stuff with their own concoctions. Personally I like most media, but I still prefer to read a book, and my fun with work is solving the problems associated with it. Maybe it's the team, studio, or project you like more than the medium. So I guess the soul-searching question is: what part of FX do you like and why?
  10. VFX to Realtime Transition

    Hello Mountaingoat, I can respond a bit more later, but figured I would jump in now. I have made the transition. The biggest issue you will deal with is not your skill transfer, but the people who will doubt you because you worked in another medium. So don't get discouraged by that crap. The budgets are different for processing, but it's more a factor of scale than all of a sudden trying to learn Latin. The technology in this field changes so fast that it doesn't really matter what you did a few years ago. You always need to adapt and move forward. As for the easiest way to transfer, it kind of depends on your background, and whether you have any specific interests. These motivating factors will help you push through. The process of building the content in, say, Houdini is the same in many cases. Instead of kicking it to, say, Mantra, you kick it to that game engine's render process, and they are remarkably similar if you dig deep enough. The difference is that with the realtime FX engine, some of the components from Houdini will be replaced by the game engine. So a bunch of the particle work will be done in the game engine's particle editor, but you will still author the texture arrays in Houdini. Coming from whatever you did before, finding the vernacular for what the new package uses will help you move forward quicker, and then you can add to it. I would suggest Unreal if you are going for AAA games, and Unity for VR and mobile. That difference is drastically changing every six months, but you will find the similarities between the two ecosystems when you go into a production environment. The first thing I would say to do is create a nice test level for yourself, so you have a stock player that can run around. Then slowly add FX elements to it that enhance the environment, depending on where you want to learn first. Maybe do some of the low-hanging fruit with all the Houdini GameDev integrations at first. You can even make a procedural level area too. Add in a bit of each type of FX. 
Get used to how the shader system works with the Houdini GameDev tools, and then you can start to do more specific engine-related stuff, like modifying the weapons fire. Just like film/commercial VFX, there are a lot of specialties in game VFX. Each studio will be looking for something different, so don't be discouraged if a studio doesn't like the particular flavor of FX you're focused on, especially because of your background. It's not like you are going to make the transition overnight. I've met many leads that can't even do half of the work of their team, so no probs.
  11. Since Houdini is a 3-D program, the fundamental assumption is that all geometry data is associated with a point. A program that is agnostic to this type of data would be Excel.
  12. ThinkPad P71 for Houdini?

    That hardware should do you OK. The important parts are the NVIDIA card and the RAM. Have fun.
  13. Thank you for this.
  14. Houdini 17 Wishlist

    Lol, yeah people hate learning curves, and Houdini is a knuckle ball, lol.
  15. They removed opdefaultcolor and opdefaultshape in favor of themes in 16.5; those options only existed for 16.0. http://www.sidefx.com/docs/houdini/network/organize.html#custom
  16. Here are the docs on this. In short, the conditions can only look at resolved values, i.e. a locally referenced parameter on the left and a static value on the right. http://www.sidefx.com/docs/houdini/ref/windows/optype.html#conditions So you could do { parma == "" parmb == "" }, which means both need to be true in order to trigger, an (and) operation. Or { parma == "" } { parmb == "" }, which would be an (or) operation. You can also use ==, !=, <, >, >=, <=, =~ (matches pattern), and !~ (doesn't match pattern) as additional operations. What is actually quite common is to make a parameter that is invisible (or hidden under an advanced set of options) that you can use to hide and disable parameters and folder menus.
  17. Getting the current active network editor pane

    In 16.5 this is implemented.
  18. hq sim and otls

    So if anyone comes back to this thread, the end issue is that the HQueue Sim ROP and the Geometry ROP, when encapsulated in an HDA, need to have all their parameters promoted up, and the two nodes need to be marked editable. I do not have a list of the bare minimum parameters required to promote up, like Alfred Output. Suffice to say Alfred Output is required on the farm for all geometry/DOP outputs, and it can only be turned on in the HDA, so you might as well leave those options on by default. Funny enough, once the command is sent you do not always need to have access to these HDAs on the farm, as the command line sent to the farm may not reference them, like with most Mantra renders. This is why it can work in some situations and not in others. Hope this helps future me.
  19. Houdini 17 Wishlist

    This would be excellent for the Add Spare Input structure for compile blocks.
  20. Houdini 17 Wishlist

    Interesting, I will try at home. My workstation does not allow it. Are you on Windows 10? It could be that my graphics card is just destroying my battery.
  21. Houdini 17 Wishlist

    I wish that having Houdini open on my laptop would not be such a power drain on my battery. I know it has nothing to do with normal Houdini operation. I can watch a few movies and my battery is good; however, trying to look something up in Houdini and having it on in the background is a death knell to my battery.
  22. managing .sim checkpoint files

    Yeah, they were in effect at R&H and I've maintained them going forward, for the most part because of the failed-farm-nodes deal. I am not too heavy into simulations anymore. I'm doing more world building now.
  23. hq sim and otls

    So now I have a pretty JSON log file of hconfig from the farm and locally, and everything lines up the way it is supposed to so far. Edit: I'm used to working with systems that are natively part of the farm, with zero differences between a workstation and a farm node, not a system where we send everything, including the environment, to the farm... so there are certainly some strange things. Should have just built a Python exporter from scratch; that'll teach me for trying to do this the easy way...
  24. Houdini 17 Wishlist

    Hell yeah!