Monday 5 March 2012

Exporting 60fps video from cocos2d

Here's the video.  It looks a little different when the screen is in your hand being tilted around.


I couldn't put off creating a video of my upcoming game, Balls Abound, any longer.  I still have a couple of tasks to wrap up before submitting the app, but I now have video export working.  This is how I achieved it.

First you have to make a decision: the quick and easy route of using a video camera to simply record yourself playing the game, or the more difficult route of professional-quality, high-definition 60fps gameplay footage.  I decided on the latter.

The first step in getting video off your device is recording some playback.  For this to work you need your game to be deterministic, and then you need a way to capture state on every frame so that it can be played back later.  I am using cocos2d with box2d and building my game with Xcode for iOS.  By modifying CCTouchDispatcher and CCDirectorIOS (and how cocos2d interacts with the accelerometer, which I discuss in another post) I managed to defer posting of accelerometer and touch inputs so that it happens within the director's drawScene method.  From there I simply record the current inputs to disk on every frame, and fix the delta time to 1/60 (for 60fps; this is important later on when you assemble the video and try to sync the audio, since both will have been produced at a consistent 60fps).
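
Roughly, the recording hook inside drawScene ends up looking something like the sketch below.  The InputFrame struct and the inputQueue_ and recordingFile_ names are placeholders I'm using for illustration, not actual cocos2d API:

// Sketch only: InputFrame, inputQueue_ and recordingFile_ are illustrative names.
typedef struct {
    float accelX, accelY, accelZ;   // latest accelerometer reading
    CGPoint touchLocation;          // latest touch position ({-1,-1} if no touch)
} InputFrame;

// In my modified CCDirectorIOS
- (void) drawScene
{
    // Force a fixed timestep instead of the display link's measured dt,
    // so every recorded frame represents exactly 1/60s of game time.
    dt = 1.0f / 60.0f;

    // Dispatch the touch/accelerometer events queued since the last frame
    // (instead of letting them be posted asynchronously as they arrive).
    InputFrame frame = [inputQueue_ flushPendingInputs];

    // Append this frame's inputs to the recording file.
    [recordingFile_ writeData:[NSData dataWithBytes:&frame length:sizeof(frame)]];

    // ... then the normal cocos2d frame: visit the scene graph and present it ...
}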

The next step is to play the game back using the inputs you have now recorded.  Again, by modifying the CCDirectorIOS drawScene method, I was able to read the previously recorded inputs on each frame, and feed the stored accelerometer and touch inputs into the current frame's state for rendering.  So now I am able to play back a previous game run to the screen in real time, but I need to save these frames to disk if I'm going to make a video.
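
The playback side is the mirror image of the recording hook.  Again a sketch only: readNextFrame:, applyAcceleration: and applyTouchAt: are my own placeholder names for the playback plumbing, not real cocos2d calls:

- (void) drawScene
{
    dt = 1.0f / 60.0f;                        // same fixed timestep as when recording

    InputFrame frame;
    if ([playbackFile_ readNextFrame:&frame]) {
        // Replace the live accelerometer/touch events with the recorded ones.
        [gameLayer_ applyAcceleration:frame];
        [gameLayer_ applyTouchAt:frame.touchLocation];
    } else {
        [self stopAnimation];                 // end of the recorded run
    }

    // ... rest of the normal frame: step box2d, visit the scene graph, present ...
}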

I found a solution in the cocos2d forums for grabbing the OpenGL framebuffer into a UIImage, and with another change to CCDirectorIOS I am now dumping each frame to a PNG on the device.  I'm not sure this is the quickest solution in terms of performance, but it is functional, and PNG is necessary to ensure the quality of the final video (I tried JPG and was dissatisfied).  Unfortunately, playing back a 2 minute run of the game generates 7200 images and can take an hour or more using my iPod Touch 4G.  And then you have to download the app from your device and extract the images somewhere on your hard drive, which can easily take 10 minutes or more, with well over 1GB of images.  But finally it's done and you have a nice image sequence.
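
The forum approach boils down to a glReadPixels grab of the frame buffer.  My version looks roughly like the following; the method name and the frameIndex_ counter are my own, and the details may differ from the original forum snippet:

// Sketch: grab the current OpenGL frame buffer and write it out as a numbered PNG.
- (void) saveCurrentFrameToPNG
{
    CGSize size = [[CCDirector sharedDirector] winSizeInPixels];
    NSInteger width = size.width, height = size.height;
    NSInteger dataLength = width * height * 4;

    // Read the raw RGBA pixels out of the frame buffer that was just rendered.
    GLubyte *buffer = (GLubyte *) malloc(dataLength);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // glReadPixels returns rows bottom-to-top, so flip them before building the image.
    GLubyte *flipped = (GLubyte *) malloc(dataLength);
    for (NSInteger y = 0; y < height; y++)
        memcpy(flipped + y * width * 4, buffer + (height - 1 - y) * width * 4, width * 4);
    free(buffer);

    // Wrap the pixels in a CGImage and then a UIImage.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, flipped, dataLength, NULL);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef = CGImageCreate(width, height, 8, 32, width * 4, colorSpace,
                                        kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast,
                                        provider, NULL, NO, kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:imageRef];

    // Dump the frame as a numbered PNG in the app's Documents directory.
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    NSString *path = [docs stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"frame_%06d.png", frameIndex_++]];
    [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];

    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    free(flipped);
}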

But you still don't have any sound, and we need sound.  This got trickier than I thought.  The simplest approach seems to be to use Audacity (or whatever you prefer) to record the output from the iPod's headphone jack.  Remember how we fixed the delta time to 1/60 when recording our game inputs?  This is because in order to get our recorded audio to sync with our generated video, they have to share a frame rate.  If we allowed the small changes in delta time that occur naturally when checking a display link's timestamp, it would be impossible to sync audio and video in the final product.  So we need to do whatever we can to ensure the audio we record is played back at a consistent 60fps.  I modify CCDirectorIOS once again, and now I can play the game back through the physics engine only (i.e. no OpenGL calls) and get the sound with as close as possible to 1/60s (about 0.0167s) between each frame.  I've seen fancier approaches that are a lot more work (keeping track of the data being fed through OpenAL and storing it as a PCM file), but this seems to work well enough.
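
The sound-only replay path, roughly sketched (the SOUND_ONLY_REPLAY macro and stepWithInputs:dt: are hypothetical names of my own): the display link keeps firing at 60Hz, but the frame only advances the physics and triggers sounds, and skipping every OpenGL call keeps each frame comfortably inside its 1/60s budget:

- (void) drawScene
{
#ifdef SOUND_ONLY_REPLAY
    InputFrame frame;
    if ([playbackFile_ readNextFrame:&frame]) {
        // Step box2d and fire the collision sounds; no rendering at all.
        [gameLayer_ stepWithInputs:frame dt:1.0f/60.0f];
    }
    return;   // skip the normal visit/draw path entirely
#endif
    // ... normal drawScene path used by the other build configurations ...
}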

Now I have a mess of different builds that I want to flip between easily: "Record gameplay", "Replay to screen", "Replay to file" and "Replay sound only".  In Xcode's project configurations, alongside Debug and Release, I added new configurations for the four types of build I wanted.  I #ifdef'd the heck out of CCDirectorIOS when adding all the record/playback code, so I simply add the necessary defines to each build configuration, and finally create four schemes that each use the appropriate configuration.  Now I can simply choose the build scheme I want and launch the game from Xcode.  Just remember you're still looking at around an hour of turnaround time to dump a 2 minute run of the game.
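
Inside CCDirectorIOS the branching ends up looking something like this; the macro names below (and the helper methods they call) are my own illustrations, set via each configuration's Preprocessor Macros build setting:

- (void) drawScene
{
#if defined(RECORD_GAMEPLAY)
    [self recordInputsForCurrentFrame];
#elif defined(REPLAY_TO_SCREEN) || defined(REPLAY_TO_FILE) || defined(REPLAY_SOUND_ONLY)
    [self injectRecordedInputsForCurrentFrame];
#endif

    // ... run the frame as usual ...

#if defined(REPLAY_TO_FILE)
    [self saveCurrentFrameToPNG];   // dump the frame we just rendered
#endif
}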

OK, we've recorded gameplay, dumped the resulting frames out to disk, recorded the audio that goes alongside, and now have a pile of input files from which to create a video.  I've heard ImageMagick will do this, but I decided to go the easier route (for me anyway) and splash out $30 for QuickTime Pro.  With QuickTime it's a simple matter of "Open Image Sequence" and pointing it at all your PNGs.  After another short wait you will have a video with no audio sitting in front of you.  You should also have Audacity open with some audio that matches your video (which you recorded by replaying the sound only).  In order to align the audio and video, you want some landmark that is both audible and visible.  In the movies they use a clapperboard for this, but we didn't insert one into the game, and maybe we don't need to.  What we do have is a ball hitting the ground as the first noise that is heard.  So I used QuickTime to find the frame where the ball first hits the ground.  Now you can trim everything that happens before that ball-hit noise from the audio you have.  Export the audio from Audacity to your favorite format, and then you can add it to your movie at the desired frame using QuickTime.  Finally, we have a video of the game with perfectly synced audio and video.  Now you just need to use QuickTime to export it to a proper video format for publication.
