
FX in games, where to get started?


Skybar


I've been meaning to look into realtime effects from a Houdini perspective, but I don't quite know where to get started. Destruction, for example: how do I get that into the game engine? Just caching out an Alembic is probably a no-go with its unique geometry per frame, so I'm thinking maybe bones or something? And for explosions/fire etc., is rendering out sprites still the go-to method?
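The "bones or something" route is, as far as I understand it, the common one: reduce the RBD sim to one rigid transform per fractured piece and export those transform tracks, while the piece geometry itself stays static. Below is a minimal Houdini-side sketch of that bake, assuming a SOP that outputs one point per piece carrying `name`, `P` and `orient` attributes (e.g. a DOP Import creating points to represent the objects); the node path, frame range, output file and JSON layout are illustrative, not any particular pipeline's format.

```python
# Minimal sketch: bake per-piece rigid transforms (one "bone" per fractured
# piece) to JSON, one sample per frame. Assumes a SOP whose output is one point
# per RBD piece carrying 'name', 'P' and 'orient' (quaternion) attributes.
# The node path, frame range and output file are placeholders.
import json
import hou

SOP_PATH = "/obj/destruction/OUT_piece_points"   # hypothetical node
START, END = 1, 120                              # bake range in frames
node = hou.node(SOP_PATH)

samples = {}   # {frame: {piece_name: {"t": [x, y, z], "r": [x, y, z, w]}}}
for frame in range(START, END + 1):
    hou.setFrame(frame)
    geo = node.geometry()
    per_piece = {}
    for pt in geo.points():
        pos = pt.position()
        per_piece[pt.attribValue("name")] = {
            "t": [pos[0], pos[1], pos[2]],
            "r": list(pt.attribValue("orient")),
        }
    samples[frame] = per_piece

with open("/tmp/building_collapse_bake.json", "w") as f:
    json.dump(samples, f)
```

On the engine side each named piece is then just a static mesh driven by its baked transform track, so the per-frame geometry never changes.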
 
This tutorial is pretty much all I can find, and it seems pretty subpar. The stuff from SESI and Gametutor is mostly about assets, not really any FX.
 
Are there any good resources for this, or any pointers how to get started?


Physics in games is not a pre-baked thing - it is calculated on the fly using high-speed libraries. There are even FEM libraries, IIRC!

 

On the Game Dev forums the main discussion is how to deal with calculating physics when the fps is inconsistent.
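For what it's worth, the usual answer to inconsistent fps is a fixed physics timestep with an accumulator: render as fast as the machine allows, but always advance the simulation in constant increments so the results don't depend on frame rate. A minimal sketch of the loop (the physics and render functions are stubs and the numbers are illustrative):

```python
# Minimal sketch of a fixed-timestep game loop: render as fast as the machine
# allows, but always step the physics in constant DT increments so the results
# do not depend on frame rate. The physics/render functions are stubs.
import time

DT = 1.0 / 60.0          # fixed physics step (60 Hz)
MAX_FRAME_TIME = 0.25    # clamp huge hitches so the sim doesn't spiral

def physics_step(state, dt):
    state["t"] += dt     # stub: integrate bodies, resolve collisions, etc.

def render(state):
    pass                 # stub: draw the current state

state = {"t": 0.0}
accumulator = 0.0
previous = time.perf_counter()

running = True
while running:
    now = time.perf_counter()
    frame_time = min(now - previous, MAX_FRAME_TIME)
    previous = now

    accumulator += frame_time
    while accumulator >= DT:          # catch up in fixed-size steps
        physics_step(state, DT)
        accumulator -= DT

    render(state)
    running = state["t"] < 5.0        # run the demo for ~5 simulated seconds
```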


Do you have a specific example of a sim you want to import? As far as I've read, it's only billboard renders and in-game physics (ragdolls and the like) that are used, not deforming vertices or duplicated points.


I don't have anything specific in mind, just in general. Like this building collapsing in Battlefield (in the video, starting at 00:50):

That is not dynamic, is it? The building and the explosions/smoke are pre-simmed and brought in-game somehow.

I'm not that game-savvy, but I reckon animations are done with bones. When you jump, the game fetches the jump animation for your character. When I press the button to blow up the building, the game fetches the exploding-building animation (which was originally a simulation). Get what I mean?


I would imagine that's worked directly into the game engine itself. It certainly has an internal geometry format and some mechanism to deal with prebuilt animations, so you'd sim your stuff and then export it to some proprietary format designed for the game engine to handle. FX are more likely to be rendered with in-game FX engines that handle things like particles and textured sprites for smoke. So that sample is probably a prebuilt animation of the in-game asset, with some scripted FX for the smoke blasts and perhaps the debris.
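To make the "textured sprites for smoke" part concrete: a typical in-game FX system spawns camera-facing quads, ages them, and fades them out, either on the CPU or in a shader. A rough CPU-side sketch of the bookkeeping, with made-up spawn rates and lifetimes (the actual rendering would draw one textured billboard per particle):

```python
# Rough sketch of a sprite-based smoke emitter: each particle is a textured
# billboard with position, velocity, age and size. Spawn rate, drag and
# lifetimes are illustrative numbers, not taken from any particular engine.
import random

class SmokeParticle:
    def __init__(self, pos):
        self.pos = list(pos)
        self.vel = [random.uniform(-0.3, 0.3), random.uniform(1.0, 2.0),
                    random.uniform(-0.3, 0.3)]          # mostly upward drift
        self.age = 0.0
        self.life = random.uniform(1.5, 3.0)            # seconds
        self.size = 0.5

    def update(self, dt):
        self.age += dt
        for i in range(3):
            self.pos[i] += self.vel[i] * dt
            self.vel[i] *= 0.98                         # crude drag
        self.size += 0.8 * dt                           # smoke puff expands
        return self.age < self.life                     # False -> kill particle

    @property
    def alpha(self):
        return max(0.0, 1.0 - self.age / self.life)     # fade out over lifetime

class SmokeEmitter:
    def __init__(self, pos, rate=30.0):
        self.pos = pos
        self.rate = rate        # particles per second
        self.carry = 0.0
        self.particles = []

    def update(self, dt):
        self.carry += self.rate * dt
        while self.carry >= 1.0:
            self.particles.append(SmokeParticle(self.pos))
            self.carry -= 1.0
        self.particles = [p for p in self.particles if p.update(dt)]
```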


Some digging shows that Battlefield uses DICE's 'Frostbite' engine.

 

http://www.frostbite.com/about/frostbite-3/

 

The entire world is alive in Frostbite games, immersing players into deep and dynamic worlds with always changing wind, water, and weather. 

For many years, our developers have pursued realistic visuals. We have defined industry-leading visual standards by simulating real-world lighting conditions and depth of field characteristics. The next step for us has been the pursuit of realism beyond static visuals. Dynamic behaviors over time are key to immersion and believable worlds. In our games, players will notice subtle movements in the world around them driven by changes in weather. Prepare for the next level of realism, driven by Frostbite technology!

 

 

 

 

Edit:

Destruction masking techniques are of interest: 'Destruction Masking in Frostbite 2 using Volume Distance Fields'

 

http://www.slideshare.net/DICEStudio/siggraph10-arrdestruction-maskinginfrostbite2

 

Edit 2 - DMM videos: watch from 11 minutes into the Quantum Break video for some crazy physics action

 

http://au.ign.com/videos/2014/10/23/16-minutes-of-quantum-break-gameplay-on-xbox-one


Yeah, alright, it was probably a mistake bringing up a big blockbuster; it just came to mind. But if I want to play back a pre-simmed piece of geometry in, say, Unreal, how do I package that from Houdini? Do I really have to go for realtime dynamics? It seems pretty overkill if it doesn't have to be dynamic.

 

Surely people must use Houdini for games? Or is it all done with in-house tools that no one knows anything about?


For big effects that need to look good on every machine, they'll definitely be scripted prebuilt animations. You don't want your big event to look like crap because a user's rig can't handle the sim. And of course, you can't leave anything that has in-game repercussions up to the user's machine either: if a bridge has to collapse and block your way, you can't rely on the sim to do that, since sims are pretty unpredictable when scaled to different resolutions/time-slices.
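In other words, the "scripted prebuilt animation" route is just deterministic playback of baked data: every trigger samples the same per-piece transforms. A minimal sketch of the playback side, assuming baked samples shaped like the JSON from the Houdini bake sketch earlier in the thread (the set_transform call is a hypothetical engine API, not anyone's actual format):

```python
# Minimal sketch of deterministic playback of a baked destruction clip: given
# per-frame, per-piece transforms baked offline, sample the clip by elapsed
# time since the trigger and hand each piece its transform. The data layout
# matches the earlier bake sketch and is an assumption, not a real engine API.
class BakedDestructionClip:
    def __init__(self, samples, fps=30.0):
        # samples: {frame_number: {piece_name: {"t": [x,y,z], "r": [x,y,z,w]}}}
        self.fps = fps
        self.frames = sorted(samples.keys())
        self.samples = samples

    def evaluate(self, time_since_trigger):
        # Clamp to the clip range so the rubble stays put after the clip ends.
        index = min(int(time_since_trigger * self.fps), len(self.frames) - 1)
        return self.samples[self.frames[max(index, 0)]]

def apply_clip(clip, pieces, now, trigger_time):
    # Usage: record the time when the player triggers the explosion, then each
    # frame apply the sampled transforms to the static piece meshes.
    for name, xform in clip.evaluate(now - trigger_time).items():
        pieces[name].set_transform(xform["t"], xform["r"])   # hypothetical API
```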

 

Houdini Engine plugs into Unity, so I would imagine there's got to be a way to export an animated asset relatively easily, no?


Never mind - my link was the first link posted :)

 

There was an article on the SideFX site not too long ago about how Naughty Dog did the VFX in Uncharted 3:  http://www.sidefx.com/index.php?Itemid=68&id=2208&option=com_content&task=view

 

 

To me, the premise that any destruction simulation is actually fractured in-game, in any game, seems pretty off. Why would a game designer want worse-looking destruction that costs more, except where absolutely necessary?


There will be a demarcation point between what physics can be done in the engine in real time and what is pre-baked. This almost defines why Vulkan (NextGL) is being developed, as the CPU is freer to compute real-time physics.

 

If you can calculate destruction in realtime, then you could blow up a building from any point, not just the point in the pre-baked sim.
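One common middle ground, not necessarily what Frostbite or DMM actually do, is to pre-fracture the asset offline and only decide at runtime which pieces get released, based on where the hit landed. That gives you "blow it up from any point" without simulating the fracture itself. A rough sketch of the idea, with made-up names, radius and falloff:

```python
# Rough sketch of a hybrid approach: the building is pre-fractured offline into
# pieces, and at runtime an explosion at an arbitrary point simply activates
# nearby pieces and gives them an impulse. The piece object (.center, .active,
# .velocity) and the radius/strength values are illustrative assumptions.
import math

def explode(pieces, impact_point, radius=5.0, strength=20.0):
    """pieces: objects with a .center (x, y, z), an .active flag and a .velocity."""
    for piece in pieces:
        d = math.dist(piece.center, impact_point)
        if d > radius:
            continue                      # piece stays part of the static building
        falloff = 1.0 - d / radius        # pieces near the blast fly harder
        direction = [(c - i) / max(d, 1e-6)
                     for c, i in zip(piece.center, impact_point)]
        piece.active = True               # hand the piece to the rigid-body solver
        piece.velocity = [comp * strength * falloff for comp in direction]
```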

 

Edit:

 

'Real-Time Deformation and Fracture in a Game Environment'

 

 

Abstract

This paper describes a simulation system that has been developed to model the deformation and fracture of solid objects in a real-time gaming context. Based around a corotational tetrahedral finite element method, this system has been constructed from components published in the graphics and computational physics literatures. The goal of this paper is to describe how these components can be combined to produce an engine that is robust to unpredictable user interactions, fast enough to model reasonable scenarios at real-time speeds, suitable for use in the design of a game level, and with appropriate controls allowing content creators to match artistic direction. Details concerning parallel implementation, solver design, rendering method, and other aspects of the simulation are elucidated with the intent of providing a guide to others wishing to implement similar systems. Examples from in-game scenes captured on the Xbox 360, PS3, and PC platforms are included. 
This paper received the award for best paper at SCA 2009.

http://graphics.berkeley.edu/papers/Parker-RTD-2009-08/Parker-RTD-2009-08.pdf


 

There will be a demarcation point between what physics can be done in the engine in real time and what is pre-baked. This almost defines why Vulkan (NextGL) is being developed, as the CPU is freer to compute real-time physics.

 

 

I think this is backwards in a way: Vulkan and DX12 are complete redesigns of those APIs specifically to use more CPU threads, and thus more CPU power.

 

However, if we are talking about GPU shaders (like in the paper), then definitely yes, it will likely be faster for some time. But once PCI-E 4 comes out, the balance will likely tip back to the point where pre-simulated destruction is almost always faster for a while; then GPUs will get more cores and tip the balance back, and so on and so forth.

 

The best solution will likely be a combination of the two. When it comes to games, unlike film, I think fakery will always be the go-to solution no matter how efficient the real thing gets.


I'll look into it - it may well be that the main idea is to thread better so you can get the GPU to do more, but that's not how I've interpreted the presentations so far.


Watch from 11 minutes into the Quantum Break video for some crazy physics action

 

http://au.ign.com/videos/2014/10/23/16-minutes-of-quantum-break-gameplay-on-xbox-one

Yep - I was at Remedy last year, and the physics is all pre-baked from DMM.

On top of the performance issues, pre-baking gives you total control of the outcome. In set pieces like the one in the video, it is very nice to know exactly what will go down; with real-time physics (indeed, with all physics) funky things can happen.



Yep - I was at Remedy last year, and the physics is all pre-baked from DMM.

On top of the performance issues, pre-baking gives you total control of the outcome. In set pieces like the one in the video, it is very nice to know exactly what will go down; with real-time physics (indeed, with all physics) funky things can happen.

 

But that is also what makes real-time physics interesting. In reference to the Battlefield tower collapse scene: I think it's (almost) the same every time, which doesn't allow for cool (and possibly funky) surprises, such as unexpected chain reactions, and those can lead to more player engagement in some cases.

Granted, it is quite difficult to get right in real-time, and for one-off scenes, such as in single-player games, it might not be worth the time to make it realtime.

