j00ey

Members
  • Content count

    187
  • Joined

  • Last visited

  • Days Won

    1

j00ey last won the day on August 23 2015

j00ey had the most liked content!

Community Reputation

13 Good

About j00ey

  • Rank
    Initiate

Personal Information

  • Name
    Tim
  • Location
    London

Recent Profile Visitors

3,068 profile views
  1. 16.5 drums

    Oh great, I kept meaning to put persistent quickmarks on the H17 wishlist... someone must have done it for me
  2. 16.5 drums

    It appears to be out...
  3. Some DOPs RnD

    Thanks! I love the stuff you've been posting on here too - fantastic work!
  4. Some DOPs RnD

    I thought I'd make a topic here to share some RnD I've been doing in my spare time, in case it's helpful to anyone - I've learned so much from others on here. I have a couple of other scenes underway that I'll put up once they're ready. They're not for any particular shot, more just the result of one thing leading to another...

    The first one is essentially a simple cloth sim driven by an RBD sim, where some balls are attracted to a wandering point. I used a technique similar to generating a wet map to transfer some colour between the balls while they're touching, which sort-of worked but ended up producing a lot more blue than anything else. Also, for each point on the surface I calculate the difference between the average distance to neighbouring points during the animation and at the beginning - a sort of compression or stretch attribute - and I'm using that to displace the surface, first in SOPs and then also in the shader.

    http://vimeo.com/241509160

    *Edit - I can't figure out how the embed code works, in case anyone knows..?
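    A minimal Python SOP sketch of that stretch/compression idea, just as an illustration rather than the exact setup in the scene - it assumes a point attribute called "rest" holds each point's position at the start of the animation (e.g. added upstream with a Rest SOP):

```python
# Python SOP: per-point stretch/compression from the change in average
# distance to neighbouring points, relative to the rest positions.
# Assumes a "rest" point attribute with each point's position at frame 1.
node = hou.pwd()
geo = node.geometry()

if geo.findPointAttrib("stretch") is None:
    geo.addAttrib(hou.attribType.Point, "stretch", 0.0)

for pt in geo.points():
    # neighbours = points that share a primitive with this point
    neighbours = {}
    for prim in pt.prims():
        for vtx in prim.vertices():
            other = vtx.point()
            if other.number() != pt.number():
                neighbours[other.number()] = other
    if not neighbours:
        continue

    pos = pt.position()
    rest = hou.Vector3(pt.attribValue("rest"))

    cur_avg = sum(pos.distanceTo(n.position()) for n in neighbours.values())
    rest_avg = sum(rest.distanceTo(hou.Vector3(n.attribValue("rest")))
                   for n in neighbours.values())
    cur_avg /= len(neighbours)
    rest_avg /= len(neighbours)

    # positive = stretched relative to rest, negative = compressed
    pt.setAttribValue("stretch", cur_avg - rest_avg)
```

    In practice a wrangle with nearpoints() would be the quicker route; the Python version just spells the idea out step by step.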
  5. I figured this out - I guess I must have figured it out last time and just forgotten... I made a scene file in case it's useful to anyone. The pc file needs to go in a 'cache' subdirectory of wherever you download the scene to, i.e. $HIP/cache/butterfly_pts.pc butterfly_pts.pc displacement_pcLookup.hip
  6. Has something changed with the new H16 material (MAT) way of doing displacement that would cause problems with point cloud lookups? I have a shader [a modified classic shader] in which I'm looking up a point cloud to get some wetness and colour attributes, and it works fine without displacement. As soon as I switch that part on, though, all the wetness etc vanishes. I'm sure when I've done this before it just worked, but I'm clearly missing something. I'm transforming global P from current to object space to do the pc lookup; I don't recall having to do any compensation there for the fact that the geo is displaced. Does anyone have any pointers..?
  7. Ah yes thanks! I was looking at the Geometry but of course until you set a value manually, there's nothing there. Much appreciated.
  8. I see - I just tried setting it with a quaternion and it's doing something! Thanks very much for the pointer. Where do you see that in the spreadsheet, though, to tell that it has to be set with a quaternion instead of a vector?
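    For anyone following along, a rough sketch of what "setting it with a quaternion" might look like in a Python SOP inside the SOP solver. The attribute names follow the posts above, the spin rate is a placeholder, and I'm not certain of the exact semantics Bullet expects for the motor attributes - treat it as an illustration only:

```python
# Python SOP inside the SOP solver: write a quaternion-valued motor_target
# per constraint point, spinning around the goal twist axis over time.
# Assumes goal_twist_axis was already set in SOPs, as in the posts above.
import math

node = hou.pwd()
geo = node.geometry()

if geo.findPointAttrib("motor_target") is None:
    geo.addAttrib(hou.attribType.Point, "motor_target", (0.0, 0.0, 0.0, 1.0))
if geo.findPointAttrib("motor_enabled") is None:
    geo.addAttrib(hou.attribType.Point, "motor_enabled", 0)

angle = math.radians(hou.frame() * 2.0)   # hypothetical 2 degrees per frame

for pt in geo.points():
    axis = hou.Vector3(pt.attribValue("goal_twist_axis")).normalized()
    s = math.sin(angle * 0.5)
    # Houdini stores quaternions as (x, y, z, w)
    quat = (axis[0] * s, axis[1] * s, axis[2] * s, math.cos(angle * 0.5))
    pt.setAttribValue("motor_target", quat)
    pt.setAttribValue("motor_enabled", 1)
```

    If the motor attributes actually live on the constraint primitives rather than the anchor points, the same idea applies with prim attributes.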
  9. Ah thanks - that's interesting, I'll try that. But on initial inspection, my goal_twist_angle etc attribs aren't converted to quaternion. Where are you seeing that happening?
  10. Well I'm actually not trying to achieve anything in particular - I'm learning some RBD stuff, specifically constraints, and was looking into the cone twist. I think it may be a new thing, but it does now have a motor capability and it does work globally [to be honest I'm still trying to understand what it's really doing], but per-constraint doesn't. I just copied the name from the related param field, in the same way you can find names like goal_twist_axis etc, and indeed i@motor_enabled works, but I couldn't work out the motor_target.
  11. Hi, I'm having a problem with a cone twist constraint network. I'm setting all the axes per constraint in SOPs (goal_twist_axis, constrained_twist_axis etc) and I wanted to set the motor_target values inside a SOP solver in my DOP network, but it's just not doing anything. I can set motor_enabled and that works fine. I've tried v@motor_target, v@motor_targetr and f@motor_targetry - nothing gets picked up. I can't find anything in the docs or online... Any ideas?
  12. Export animation data to JSON

    @f1480187's method is indeed much cleaner. Here's an updated file in case it's useful. basic_json01.hip
  13. Export animation data to JSON

    Thanks very much for the tip, I will look at that.
  14. Export animation data to JSON

    I made you a basic file to start you off - I'm not very advanced in python myself so don't take this as the proper way to do something, it's just a way... If you open a python shell and click on the 'Print JSON' button, you should get some JSON printed out. Look in the python module of the asset to see how it's set up. basic_json.hip
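    For anyone who can't open the file, a hypothetical sketch of that kind of setup - the function name, button wiring and what gets exported below are illustrative, not necessarily what's in basic_json.hip. The idea is a 'Print JSON' button whose Python callback script calls into the asset's PythonModule, e.g. hou.phm().print_json(kwargs['node']):

```python
# Hypothetical PythonModule for the asset.
# Button callback script (Python):  hou.phm().print_json(kwargs['node'])
import json

def print_json(node):
    """Print the node's translate parms per frame as JSON over the playback range."""
    start, end = hou.playbar.playbackRange()
    data = {"node": node.path(), "frames": {}}
    for frame in range(int(start), int(end) + 1):
        data["frames"][frame] = [node.parm(name).evalAtFrame(frame)
                                 for name in ("tx", "ty", "tz")]
    print(json.dumps(data, indent=2))
```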
  15. Export animation data to JSON

    It depends on what the software you're reading it back in with wants. I was writing it out for a custom renderer, so the developer gave me a template to use as a guide and I matched that, filling in the data from geometry attributes, transforms etc.

    I guess if you just want translation, rotation etc you can use worldTransformAtTime() to get the transformation matrix at a given frame, then use extractTranslates() and extractRotates() to get the values for each frame [or time step] you want to sample.

    Check how the reader expects your JSON to be formatted and match that - e.g. [if my memory serves] I had to write the point positions as one long array of comma-separated values that the reader parses in groups of 3 [x, y and z], so that meant looping through the points, getting the positions, stripping off the brackets, appending to the array, then putting square brackets around the array. The site I posted last time was very useful for checking I hadn't missed a comma or bracket somewhere.

    If you have experience with python I think it should be fairly straightforward, if not [like me] use the docs...
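    A minimal sketch of that approach, to run in Houdini's Python shell (where hou is already available) - the node path, frame range and JSON layout are all placeholders, so match whatever your reader actually expects:

```python
import json

obj = hou.node("/obj/geo1")            # hypothetical object node to export
sop = obj.displayNode()                # SOP whose points we want to sample
start, end = 1, 24                     # hypothetical frame range

data = {"name": obj.path(), "frames": []}
for frame in range(start, end + 1):
    time = hou.frameToTime(frame)

    # object transform at this frame
    xform = obj.worldTransformAtTime(time)      # hou.Matrix4
    t = xform.extractTranslates()
    r = xform.extractRotates()

    # point positions as one flat list the reader can parse in groups of 3
    hou.setFrame(frame)                # so the SOP evaluates at this frame
    positions = list(sop.geometry().pointFloatAttribValues("P"))

    data["frames"].append({
        "frame": frame,
        "translate": [t[0], t[1], t[2]],
        "rotate": [r[0], r[1], r[2]],
        "P": positions,
    })

print(json.dumps(data, indent=2))
```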