
Kardonn

Members
  • Content count

    103
  • Joined

  • Last visited

  • Days Won

    10

Kardonn last won the day on December 16 2016

Kardonn had the most liked content!

Community Reputation

105 Excellent

About Kardonn

  • Rank
    Initiate

Personal Information

  • Name
    Lorne
  • Location
    Toronto

Recent Profile Visitors

2,159 profile views
  1. Yeah, they actually issued a fix in 15.5.714 based on this example file; everything below that version won't open it for whatever reason, even though, like you said, it's totally fine in Maya.
  2. Here, actually, I should really have just posted the FBX file in question to begin with so you guys could try pulling it into a Houdini session; attached it now. @edward I've got those .bclips baked out, but I'm pretty green at this whole crowd workflow, so I'm not totally sure what I should be using to add those translations. I was mentioning that Agent Edit seems to work, except for some reason it doesn't 'stick'...viewing the Edit, my Agent walks forward in space along +Z properly, but once it hits the sim it's back to default. @anthonymcgrath I'm desperately trying to avoid having to do anything to these guys in Maya, because the end goal of getting this all working would be to start exporting clips from all the game's characters and races to play around with, and that would be so much Maya work to fix up these hundred+ clips! This .FBX will probably only load properly in H15.5.714, and only via the Agent node. walk.fbx
  3. Decided I'd learn how to work with the Houdini crowd system by making my own agents from character animations exported out of World of Warcraft, but the main problem I've got is that since all of these clips started off without any locomotion, I need to figure out how you actually get that back in. The Agent Edit SOP appears to work in the viewport at least...I can translate the character over the course of their walk cycle and reorient them (they start facing X instead of Z), yet when I feed that into the Agent ROP it doesn't seem to stick. There's not a ton of info out there, sadly, for working with custom Agents in H15.5, and even worse, these particular FBX files explode if you try to Import -> FBX them, which makes this all potentially a lot more difficult. They have to be loaded via the Agent SOP.
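The reorientation plus locomotion step boils down to a rotate-then-translate per frame; here's a minimal pure-Python sketch of that transform, outside Houdini entirely (the walk speed, fps, and axis convention are illustrative assumptions, not anything read from the FBX):

```python
import math

def add_locomotion(positions, frame, fps=24.0, speed=1.5):
    """Rotate an in-place walk cycle so it faces +Z instead of +X,
    then push it forward along +Z by speed * elapsed time.
    positions: list of (x, y, z) tuples for one frame of the clip."""
    angle = math.radians(-90.0)          # spins the +X forward axis onto +Z
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    offset = speed * frame / fps         # forward travel accumulated so far
    out = []
    for x, y, z in positions:
        xr = x * cos_a + z * sin_a       # rotate about the Y (up) axis
        zr = -x * sin_a + z * cos_a
        out.append((xr, y, zr + offset))
    return out
```

In Houdini terms this is roughly what an Agent Edit transform plus a per-frame root translation would need to bake into the clip.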
  4. Houdini 16 Wishlist

    @marty It sure seems like it; it feels like the exact same Wire and Cloth/FEM solver now for years...which makes me all the more sad every time I duck back over to Maya.
  5. Houdini 16 Wishlist

    A few main things on my wishlist.

    1) The ability to interact with instanced geometry so that it can be used more easily in DOPs, SOPs, etc. This especially stings when working with 3rd party renderers that don't support rendering Houdini's packed geometry...otherwise I could just work with Copy SOPs of packed primitives and then unpack them at will whenever their real geometry is needed in calculations.

    2) I want to be able to grab shader data from the shop_materialpath attribute. Again, this helps hugely with interacting with your geometry at the SOP/DOP level. A great example: right now, if you want to scatter points onto an object that will be displaced at render time, you have to painstakingly match the displacement at the SOP level with a whole AttribWrangle setup to pull in the displacement map texture, match the displacement values from your shader, and then do v@P += v@N * disp. Except it's far worse than that, because you'll of course have to do this for every single different shader on every single different object, and then god help you if you're modulating those displacement maps with VOP procedural nodes...you now have to recreate those too. I like to imagine a world where at the SOP level I can simply ask for @shop_displacement or @shop_albedo and instantly have access to my displacement maps, diffuse maps, etc., in other contexts. Scatter points onto 100 different objects and tell Houdini to displace them via this hidden @shop_displacement attrib.

    3) Better multithreading across the board, and PLEASE start using more than one processor group (beyond PROC_GROUP0) on Windows. Workstations with 65+ threads are very common now, and I still find Windows much more comfortable for small-studio work...it REALLY stings losing half my machine to this, especially when Arnold, VRay, etc., seem to have sorted it out easily. I basically have to either turn off Hyperthreading so that Windows doesn't create a 2nd PROC_GROUP, or run Linux if I really need to do a lot of very heavy threaded work.

    4) WAY better cloth, wires, and other very fundamental simulation types that other packages like Maya seem to have absolutely nailed for years now. nCloth and nHair make the Cloth and Wire Solvers look like a joke, and even more absurd is that there's no reason a Wire Solver without self collisions shouldn't be using every single thread in your machine...matter of fact, that's how I've had to start working: I split my guide hairs into 10+ equal parts and jam 10 Geometry ROPs down my machine at the same time. And if you want to see a real cloth solver go to work, play with Marvelous Designer some time. It will make you very sad to go back to Houdini.

    5) Improve the hair/fur. Play around with Yeti, really study it and take notes. Whatever they're doing gives far more natural-looking hair grooms, and their procedural system generates and feeds the data to Maya MUCH faster than the SideFX fur tools do. A heavy fur groom with 5M hairs might take Houdini 4+ minutes to finally ship over to Arnold, while Yeti munches the same in around 45 seconds to feed to MtoA.

    That's really the major stuff for now, I think. I tend to work on a lot of very heavy environment layouts and assets for various studios, and honestly the most tiring part of my work is matching and duplicating all the SHOP-level stuff at the SOP level, and constantly having to come up with ways of accessing my @instance and, far worse, @instancefile data so that I can layer further scatters and integration on top of what I've done. Try this some day: scatter some Arnold archive trees via @instancefile and then try to add snow to them. You have to create a completely parallel workflow to do it, and it's not an enjoyable task whatsoever because it's just grunt work that has to get done.
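The v@P += v@N * disp matching described in point 2 is, at its core, just pushing each scattered point along its normal by the shader's displacement amount; a small pure-Python sketch of that math (the names are illustrative, not a Houdini API):

```python
def displace_points(points, normals, disp_values):
    """Push each point along its normal by a per-point displacement
    amount, mirroring the VEX expression v@P += v@N * disp."""
    return [
        (px + nx * d, py + ny * d, pz + nz * d)
        for (px, py, pz), (nx, ny, nz), d in zip(points, normals, disp_values)
    ]
```

The painful part isn't this one line; it's rebuilding the texture lookups and VOP modulation that produce disp_values for every shader, which is exactly what a hypothetical @shop_displacement attribute would eliminate.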
  6. OpenCL FLIP faster on Linux?

    Nope, never any viewport issues before, BUT for about two weeks I thought RedShift had terrible out-of-core performance on my dual GTX 1080 machine, and that turned out to be completely driver related.
  7. OpenCL FLIP faster on Linux?

    I can say for sure that both my Titan X and GTX 1080 workstations drive a Houdini viewport at 4K and 5K like no one's business.
  8. Ignore me...another problem solved by the time-tested approach called "read the f*cking manual". Solution for anyone searching this thread for answers one day in the future: opparm -C [node] [parm] [value]. The -C flag triggers any callbacks in addition to setting the parameter value.
  9. I've got a shader setup here that pulls textures based on .json lists, and I need to be able to swap that .json file during command-line renders in order to generate preview images for material libraries. The problem is that the pre-render script [opparm /shop/shader json "S:/Lib/foo.json"] works, but in a Houdini session that parm runs a callback script when you hit enter after plugging in a .json path. When you change the .json path via opparm, though...the callback doesn't get triggered and nothing in the shader actually updates. There's an "update" button parm on the shader that runs the same callback script; I just don't know how to 'execute' that button from the pre-render script section of the ROP. Anyone know how you do that?
  10. OpenCL FLIP faster on Linux?

    From the RedShift devs; very likely it applies to everything, including this OpenCL performance gain you're seeing on Linux.
  11. Working on some layouts here and was hoping there's some not-too-messy way of getting at the instanced geo being generated by a geometry node that's doing fast point instancing of some other objects. In this specific case I want to sprinkle snow on a bunch of things I've scattered, but I've run out of ideas on nice ways to get access to that geometry other than reproducing my instance setup somewhere else as a Copy SOP setup. Just wanted to double-check that's pretty much my only course of action before I get started on it. Edit: I potentially have a funny workaround idea of using the GI Light to generate photons that I'd then use as my points.
  12. Megascans is publicly available now for anyone who hasn't seen the announcement yet.
  13. Foam from the bath

    Yup, you could go that route too. If you have a polygon object behaving nicely on your character, you could take your rest frame, scatter points on it to instance spheres onto, and then use the PointDeform SOP (or several other methods) to stick those points to the geo representing your cloud of foam. Then you could duplicate that foam geo a couple of times with different Peak SOP settings applied, maybe a mild Mountain SOP, and that way you'd have a few layers of bubbles working for you and get some depth. Then make a fog VDB from your polygon foam to fake the depth past where your bubbles run out.
  14. Foam from the bath

    Honestly, I think you could just do a high-viscosity FLIP sim, but instead of meshing it afterwards you simply instance spheres onto it with varied sizes and shading attributes. Maybe add something in the FLIP solver too that randomly deletes 0.1% of the particles per frame to give the effect of popping bubbles. Use that in combination with a fog VDB made from the FLIP fields to help get the white cloudy look internally, and I bet you're close already just with that.
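The random 0.1%-per-frame deletion idea can be sketched in plain Python like this (the rate and the seeding scheme are assumptions; in Houdini you'd do the equivalent per point in something like a POP Wrangle or SOP Solver):

```python
import random

def pop_bubbles(particle_ids, frame, rate=0.001, seed=0):
    """Keep all particles except a random ~rate fraction each frame,
    seeded per (id, frame) so a re-sim makes the same choices."""
    survivors = []
    for pid in particle_ids:
        rng = random.Random(pid * 92821 + frame * 7919 + seed)
        if rng.random() >= rate:         # this bubble survives the frame
            survivors.append(pid)
    return survivors
```

Seeding from the particle id and frame (rather than one global stream) keeps the popping stable across wedges and resumed sims.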
  15. There's nothing stopping you from making images like that in any rendering package out there right now. In fact, you'd be better off with many of the others, because from looking at these images I'm guessing that Maxwell is using either a Blinn or Beckmann specular model instead of the much better-looking GGX distribution model. Those pics are 49.5% good texturing, models, and lookdev...and 49.5% good lighting and compositing. The other 1% is down to the renderer you've chosen.
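For reference, the GGX (Trowbridge-Reitz) distribution mentioned above has a very compact normal-distribution term; a quick Python sketch of D(h), assuming the common parameterization where alpha is the artist roughness squared:

```python
import math

def ggx_ndf(n_dot_h, alpha):
    """GGX / Trowbridge-Reitz normal distribution term D(h).
    n_dot_h: cosine between the surface normal and the half vector.
    alpha:   roughness parameter (often artist roughness squared)."""
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)
```

The long tail of this distribution at grazing half-vector angles is what gives GGX highlights their softer falloff compared to Blinn or Beckmann.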