
FX in games, where to get started?



We prefer pre-simming big spectacle event type sequences for games for a few reasons:

1. It's far less performance intensive than real-time physics simulations (e.g. Havok or APEX).

2. You have total control over the motion and more importantly the end-state. You don't want a chunk of debris blocking player progression or a giant block of concrete jittering on top of a non-reactive car. 

3. Pre-simmed looks way better, with much more detailed motion, which is no surprise when you compare a sophisticated solver (Houdini, for example) on a fast PC simming for 30 seconds per frame against a highly optimized solver running on console (similar to a low-end PC) simming in 4 milliseconds per frame (a typical FX portion of the overall 33 millisecond per-frame budget).

For rigid body sims, we import them into the game as simple .FBX skeletal animation assets with a simple binding (one joint per chunk plus a root joint). For soft bodies, we export Alembic out of Houdini to Maya, then use some custom tools to export the Alembic as a series of blend shapes which we blend through in sequence to create the illusion of smooth deformation. The previous poster is absolutely right that raw Alembic (one shape per frame of animation, 30 shapes per second) is untenable (tooooo heavy), so we allow artists to specify the desired "frame rate", such as 6 shapes per second, which provides good fidelity at a reasonable performance cost. We also try to be mindful of polycount on these assets because they add up once you have 40 shapes (6 or so seconds of animation) to make a single large sequence.
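To make the reduced-frame-rate idea concrete, here's a minimal sketch of how you might pick which sim frames become blend shapes and how a runtime could cross-fade between them. The function names and numbers are hypothetical, for illustration only, not actual pipeline tools:

```python
# Minimal sketch: choose which simulated frames to export as blend shapes,
# and how a runtime could cross-fade between consecutive shapes.
# All names here are hypothetical, for illustration only.

def pick_shape_frames(sim_start, sim_end, sim_fps, shape_fps):
    """Return the sim frame numbers to keep as blend shapes.
    e.g. a 30 fps sim exported at 6 shapes per second keeps every 5th frame."""
    step = max(1, round(sim_fps / shape_fps))
    return list(range(sim_start, sim_end + 1, step))

def shape_blend(time_sec, shape_fps, shape_count):
    """At a given playback time, return (shape_a, shape_b, weight) so the
    runtime can blend the two nearest shapes for smooth deformation."""
    t = time_sec * shape_fps
    a = min(int(t), shape_count - 1)
    b = min(a + 1, shape_count - 1)
    return a, b, t - a

# Example: a 6-second sim at 30 fps, exported at 6 shapes per second.
frames = pick_shape_frames(1, 180, sim_fps=30, shape_fps=6)
print(len(frames), shape_blend(2.35, shape_fps=6, shape_count=len(frames)))
```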

Hope that's helpful.

Best,

Ewan



I can't speak for DICE, and I haven't researched that Battlefield building collapse at all, but it's almost certainly entirely pre-simulated. You wouldn't risk having something as drastic as that in a multiplayer map utilizing any realtime simulation, except for small debris. Not when you need to maintain a consistent framerate and sync over a network.

Like others have said, presimulated animations are used extensively in games, even in singleplayer. Realtime simulation still has a long way to go (DMM is impressive, but limited in realtime). Any time you want to art direct a collapse to behave in a very specific way, it just doesn't make sense to leave it to the whims of a realtime solution. The same is sometimes true of cloth and fluid simulation. Good cheap realtime cloth exists, and it's used extensively for characters, but sometimes for specific fidelity or looks, presimulated is the way to go. Generally, if there's any risk of a realtime solution producing an unfavourable result, you bake it; it's more optimized that way, anyway.

Believe it or not, Alembic *is* used in some engines. I forget other examples right now, but Crytek's Ryse used realtime Alembic loading extensively.

For our part at Naughty Dog, every animation (cloth/rigid body/fluid) that isn't done in realtime gets baked down into a joint animation, and is handled virtually identically to any other animated object/character. For destruction, that means large scale structure or building collapses (we still rely heavily on realtime physics for anything the character can interact with). On the other hand, cloth is almost entirely realtime in Uncharted 4 (using a vertex shader technique that integrates with our wind system), except for a handful of instances where we specifically needed more interesting looks, fidelity, or character interaction - this certainly wasn't true for The Last of Us, where almost all the environment cloth animations were prebaked. Our fluids are also almost entirely realtime now - but we typically don't have anything with the fidelity of a FLIP or SPH solve in-game, except for some closeups in cutscenes.
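For anyone curious what wind-driven vertex displacement looks like in principle: this is not Naughty Dog's actual shader, just a generic sketch of the idea in Python/numpy, with made-up parameters.

```python
# Generic sketch of wind-driven vertex displacement, the kind of thing a cloth
# vertex shader might do. Not any studio's actual technique; numbers are made up.
import numpy as np

def wind_displace(positions, weights, time, wind_dir=(1.0, 0.0, 0.0),
                  strength=0.1, frequency=2.0):
    """Offset vertices along the wind direction with a phase based on position,
    scaled by a per-vertex weight (e.g. painted so pinned verts don't move)."""
    wind = np.asarray(wind_dir, dtype=float)
    wind /= np.linalg.norm(wind)
    phase = positions @ wind                    # phase varies across the mesh
    sway = np.sin(frequency * time + phase)     # cheap periodic motion
    return positions + (strength * weights * sway)[:, None] * wind

verts = np.random.rand(8, 3)
w = np.linspace(0.0, 1.0, 8)                    # 0 = pinned, 1 = free to sway
print(wind_displace(verts, w, time=1.5))
```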



Late to the thread, but to add on for other people coming across it.

A good way to think of it is by the frame budget, as has been mentioned: with 30 FPS games or cinematics you have 33.3 milliseconds to process an entire frame. At 60 FPS you have 16.6 milliseconds, and as you get to VR, 90 FPS gives you 11.1 ms and 120 FPS gives you 8.3 ms. This has to be shared with all the other teams, like environment, lighting and FX, so you have a sliding percentage of those milliseconds to work with, unless you are running an asynchronous processing thread, in which case the work isn't tied to your game's FPS.
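As a quick sanity check on those numbers, the budget is just 1000 ms divided by the target frame rate, with FX only getting a slice of it; the 20% share below is an example figure, not a rule.

```python
# The per-frame budget is simply 1000 ms divided by the target frame rate,
# and FX only gets a slice of it (the 20% share is just an example figure).
def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 90, 120):
    total = frame_budget_ms(fps)
    print(f"{fps} fps: {total:.1f} ms per frame, ~{total * 0.20:.1f} ms if FX gets 20%")
```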

So pre-simmed replaces that calculation time with transform calls instead, which are cheaper, i.e. no realtime physics = no processing cost, but you still need to move the pieces. This does cost more in memory to store all those transforms, but that's not too bad in the age of digital download. The best methods blend real-time and pre-simmed: a set of pre-canned simulations, plus cheap runtime particles and some Havok pieces. That way you can load-balance for your visual quality and other processes.
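Here's a rough sketch of what "just moving the pieces" can look like: one baked position track per chunk, sampled by interpolation at runtime instead of simulated. The names are hypothetical, and a real engine would store rotations (as quaternions) alongside positions.

```python
# Sketch of the "pre-simmed transforms" idea: store one transform track per
# chunk, then interpolate at runtime instead of running physics.
# Hypothetical names; a real engine would also bake rotations as quaternions.
import numpy as np

class BakedChunkTrack:
    def __init__(self, key_times, key_positions):
        self.key_times = np.asarray(key_times, dtype=float)          # (K,)
        self.key_positions = np.asarray(key_positions, dtype=float)  # (K, 3)

    def sample(self, t):
        """Linearly interpolate the baked positions at time t (seconds)."""
        t = np.clip(t, self.key_times[0], self.key_times[-1])
        i = np.searchsorted(self.key_times, t, side="right") - 1
        i = min(i, len(self.key_times) - 2)
        t0, t1 = self.key_times[i], self.key_times[i + 1]
        alpha = (t - t0) / (t1 - t0)
        return (1 - alpha) * self.key_positions[i] + alpha * self.key_positions[i + 1]

# A chunk falling for one second, keyed at 10 samples per second.
times = np.linspace(0.0, 1.0, 11)
positions = np.column_stack([np.zeros(11), -4.9 * times ** 2, np.zeros(11)])
track = BakedChunkTrack(times, positions)
print(track.sample(0.55))
```

Memory then scales with chunks × keys × transform size, which is the storage cost mentioned above.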

As for prepping for games with Houdini, anything you can do in Houdini you can put into a game, with the caveat that you need to put an FBX rig on it, or spit it out via a texture or texture array. Each engine has different limits on what it can handle, so you'll need to balance how much you put in against the engine's time budget and the export limits.
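A minimal sketch of the texture route, assuming you just want per-frame vertex positions packed into a float array that a shader could read back; the layout and names here are illustrative, not any particular engine's format.

```python
# Rough sketch of the "spit out via a texture" route: bake per-frame vertex
# positions into a float array (one slice per frame, one entry per vertex)
# that a shader could read back to replay the sim. Illustrative layout only.
import numpy as np

def bake_positions_to_texture(frames):
    """frames: list of (V, 3) arrays of vertex positions, one per sim frame.
    Returns an (F, V, 3) float32 array plus the min/max you would need to
    un-normalize if you later quantize to 8- or 16-bit texture formats."""
    tex = np.stack([np.asarray(f, dtype=np.float32) for f in frames])  # (F, V, 3)
    return tex, tex.min(), tex.max()

# Fake 3-frame, 4-vertex "sim" just to show the layout.
sim = [np.random.rand(4, 3) for _ in range(3)]
texture, lo, hi = bake_positions_to_texture(sim)
print(texture.shape, lo, hi)
```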

The SideFX video is a pretty good demo to start off with, plus a few videos on Cascade for Unreal. Houdini-to-game-engine work should get easier over the next year or two. But if you can render a realistic texture array of pyro and a cinematic-style RBD sim, then insert them into some Cascade-style FX, a state machine, and a sequencer, you should be pretty good.

Here are some examples from our team on Halo 5.

 



Laaate to this thread, but in case someone is still wondering about that old skyscraper: I worked on it. It was pre-simmed and baked out to bone animation. We didn't have Alembic support during BF4. There are also a couple of versions of it for different platforms. All the smoke and small debris were made using Frostbite.



Just saw your interview on cgsociety, and wanted to say it was an awesome read :)


On 2017-04-18 at 9:49 PM, marty said:

@Demno thanks! Out of interest would you use Houdini for this work now instead of Frostbite?

No. Frostbite is the game engine, or the renderer if you will. There's no good way of rendering volumes in realtime, so using Houdini for it isn't really an alternative yet.
