Skybar

Members
  • Content count: 962
  • Joined
  • Last visited
  • Days Won: 23

Everything posted by Skybar

  1. I've been meaning to look into realtime effects from a Houdini perspective, but I don't quite know where to get started. Destruction, for example: how do I get that into the game engine? Just caching out an Alembic is probably a no-go with its unique geometry per frame, so I'm thinking maybe bones or something? And explosions/fires etc., is rendering sprites still the go-to method there? This tutorial is pretty much all I can find, and it seems pretty subpar. And the stuff from SESI and Gametutor is mostly about assets and not really any FX. Are there any good resources for this, or any pointers on how to get started?
  2. I got a couple of spheres I want to render with refraction. They are pretty low-res, and I wanted to subdivide them at render time. However, the poles get all screwed up (see pictures), and this is clearly visible in the refraction. I've noticed this lots of times before, but it never really was an issue. Is there an easy fix for this? Probably a noob question, but I have no clue really. I could go back and use the "polygon" sphere (made up of only tris) instead, but then I'd have to resim and all that. Cheers!
  3. If you test like this, it seems it's the -1 it fails on: int test1 = @N.x == 0; int test2 = @N.y == -1; int test3 = @N.z == 0; if(test2) i@check = 1; Might be a rounding issue somewhere, because if you then change it to int test2 = @N.y >= -1; it works fine. Also, if you normalize N in your example it works (see the tolerance sketch after this list).
  4. It should show up in Render Scheduler.
  5. It's a multiplier, isn't it? So you effectively got 300*300 = 90000?
  6. Well, no. Sim files can be saved for dopnets, containing the whole simulation state. It's not a substitute for bgeo, but rather a tool if you want to continue the simulation at a given point. MPlay is an image viewer, so it won't play geometry files; use gplay for that.
  7. I'm not doing that much stuff at home anymore, but I think I've got 4-6 TB or something, which was enough for what I did. Start with 2 TB, and if you find yourself needing more, just go and buy another disk.
  8. For things like the Transform SOP, how would it know which value it should use? You could potentially transform some geometry with thousands of points, with each point having a unique attribute value. Simply using @myattribute should return exactly what, then? So yes, it won't work with point/primitive attributes; you'd have to use Point SOPs or Primitive SOPs in that case, since those work on those individual levels. The Transform SOP works on the detail of the geometry, so what happens if we promote our attribute to detail then? Spoiler: it works just fine using @myattribute in the Transform SOP when the attribute is a detail attribute (see the sketch after this list).
  9. Well, it is kinda free off the shelf, isn't it? To be fair, I haven't really used OpenCL before, but I laid down a box and used the Billowy Smoke shelf tool on it, and set the resolution of the sim to 0.01. Running normally, the pyrosolver takes 2m15s for 25 frames. Simply enabling the OpenCL checkbox, that time is now 10s for the exact same thing; the sourcing itself actually takes longer now at 12s. The resulting smoke from both sims looks pretty much identical. Have I broken something? It's a small and short test, but still, I didn't expect this dramatic a speedup at all.
  10. Uhm, why? It makes perfect sense and is widely accepted. It's called the object context, /obj/ for short, and 'object level' is just easy to say. If you called it "scene level", another novice might look for /scene/, which doesn't exist. If someone is referring to SOPs as object level, they are the ones being inconsistent and doing it wrong. But they're probably not in the majority, so feel free to slap them on the wrist when they do.
  11. Sprites?
  12. Using Bullet? It doesn't use volume collisions and uses convex collision by default. See the guide geometry under the Bullet tab.
  13. 16.0.593 here, that might be it. Still weird though. FOREACH_Loop_vs_Subnet_01_dv.hiplc
  14. Are you sure? Those are the only two things I changed. It looked more like the green one but not identical. You might need to press "reset cached pass" or whatever it's called on the end block.
  15. Set 'foreach_begin1' to Fetch Feedback. Also, you have a stamp expression in 'randPoint1' which won't work. It doesn't make it look exactly the same, but at least it's working.
  16. Haven't seen it, but you could just make the volume really, really big, so that the edges won't matter.
  17. Why can't you just scale the character down? It will save you a lot of headache. Then afterwards, if you need to, you can simply scale the simulation back up. This is usually how it's done in production.
  18. It won't have any effect on performance. But if you are simulating stuff, the files you cache to disk usually take up a ton of space. That's why he recommended a bigger one; 2TB tends to be eaten up quite quickly depending on what you do.
  19. What exactly are you trying to do? When the wrangle is in primitive mode it already loops over all prims, so right now you are looping over all primitives for every primitive, twice. Also, I think creating attributes is the last thing it does, so you cannot access them until after that wrangle. I think it would work better if you made a variable instead of an attribute, but don't quote me on this. If you just want to delete random primitives, do it like this: if(rand(@primnum) > 0.5) removeprim(0, @primnum, 1); (see the full wrangle sketch after this list).
  20. I think when you just render a volume, it knows how many voxels there are and how big they are. When you are using the procedural it doesn't; it just has the bounding box, so it will "dice" that box using volume quality. Probably not the correct term, but that's how I understand it. And it's relative to the box size, so you need higher volume quality the bigger it is, to keep the "voxels" the same size. I guess.
  21. I've come across this before, but I'm not sure why it happens. It could be how it samples the volume: actual voxels vs. the procedural. If you increase the volume quality on the Mantra ROP it gets a lot closer, but it also increases the render time. Not sure if it's possible to get a 100% exact match while also keeping the render time the same. Actually, looking in the docs, it says this, which I'd forgotten about, so it makes sense that increasing the volume quality helps: "When using this procedural, you should always specify an explicit bounding box. If you don't specify a box, parts of your function may appear clipped in the render. The default rendering quality for VEX volume procedurals is set in relation to the box size, so larger boxes will use a larger ray march step size. You can adjust quality from the default using vm_volumequality."
  22. The video says Houdini, so yes, it is clearly possible. He even says in the comments how he did it, so yeah, there you go.
  23. Yeah, ok, that I can understand. Maybe $FF is used so rarely it doesn't really warrant its own solution. But fact is, it's still annoying.
  24. Maybe try grains with constraints.
  25. Sure, but that's not what I'm asking. If you have a sequence like file.1.bgeo, file.1.5.bgeo, file.2.bgeo, file.2.5.bgeo, it won't interpret that as $FF, but it will make multiple sequences like file.1.$F.bgeo and file.2.$F.bgeo. But I want the whole thing to be file.$FF.bgeo. I'm doing this very rarely, but when I do it's always annoying.
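
A minimal sketch of the tolerance comparison mentioned in post 3, assuming a point wrangle and the normal attribute v@N from that post; the 1e-4 threshold is an arbitrary choice, and i@check is the flag name used in the original snippet:

    // Point wrangle: flag points whose normal points straight down.
    // Comparing floats with == can fail on tiny rounding differences,
    // so compare against a small tolerance instead.
    float tol = 1e-4;               // arbitrary threshold, tune as needed
    vector n = normalize(v@N);      // normalizing also sidesteps non-unit normals
    if (abs(n.x) < tol && abs(n.y + 1.0) < tol && abs(n.z) < tol)
        i@check = 1;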
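
A minimal sketch of the detail-attribute setup from post 8, assuming an Attribute Wrangle with Run Over set to Detail placed just before the Transform SOP; "myattribute" is the example name from the post and the value 2.0 is arbitrary:

    // Detail wrangle: a detail attribute has exactly one value for the
    // whole geometry, so a Transform SOP parameter can read it
    // unambiguously by typing @myattribute into the parameter field.
    f@myattribute = 2.0;   // in practice this value would usually come from
                           // an Attribute Promote set to Detail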
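
The delete-random-prims snippet from post 19, written out as a complete primitive wrangle; the 0.5 threshold is the example value from the post:

    // Primitive wrangle (Run Over: Primitives).
    // The wrangle already visits every primitive once, so no manual loop
    // is needed; @primnum is the number of the primitive being processed.
    if (rand(@primnum) > 0.5)
        removeprim(0, @primnum, 1);   // third argument 1 also deletes the
                                      // primitive's points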