Most game design documents include a resource budget section, where you are expected to write things like "my AI will use 5% of the CPU and 1.5 megabytes of RAM".
This is bulls**t, and intellectually dishonest. It is simply not possible to predict resource usage with that level of precision before the code has been written. Of all the teams that have written such budgets, I bet only a tiny fraction even had the tools to measure their results against the plan, let alone any intention of actually enforcing it!
The only truly accurate way to know what resources a design will need is to implement it and then measure the results. But going over budget can be horrifically expensive:
- If you use too much memory, your game will crash
- If you use too much time, poor framerate will make it unplayable
- If you are lucky, these problems can be fixed by optimizing a few pieces of code
- If you are less lucky you may also have to change artwork, which is usually more expensive than code optimization because the work must be repeated for every model or level, rather than just one performance hotspot
- If you are really unlucky you may need to change fundamental design assumptions such as the size of levels or number of entities on screen, which can have knock-on effects throughout the entire game
- There goes your profit margin
- If this takes too long, the publisher may get nervous and decide to kill the project entirely
Yikes! We'd better make a conservative original estimate, then, to avoid any danger of ending up in such a mess...
But this is a competitive marketplace. Especially in the AAA space where many companies are making variants of the same genres, if one team guesses conservatively while their competitor pushes a little harder, they will end up with reviews saying "well, it's ok, but this other similar game has better graphics and more cars on the track at a time". Even for indie games that depend on original gameplay concepts, implementation quality is still important. Nobody is going to want to play a crappy implementation, no matter how good the core idea is.
It's like The Price Is Right. You must guess as high as possible to win, but are disqualified if you go over the limit.
There are several ways to mitigate the risk:
- Prototype important features to gather data and refine your estimates
- Prefer late-binding decisions, creating knobs that can be tweaked without causing huge knock-on effects
- Include a couple of ripcord features that can be cut in case of emergency
- Be a good guesser (experience helps with this)
- Be lucky!
I started writing this post because I wanted to talk about a MotoGP example of a late-binding knob, but this is long enough already, so I will defer that for later.
The main ripcord feature in MotoGP was multisampling. Thankfully our guesses turned out pretty well, so we didn't have to cut this, but had everything fallen to pieces at the end of development, the flick of a switch could have saved 2.3 meg of RAM, a ton of GPU time, and (unusually, because we were memory bound on a UMA machine) also sped up our CPU code.
I once worked with a guy who built even more explicit ripcords into his projects. The first thing he did when starting a new game was allocate a megabyte array that was never used, and insert a millisecond delay into the Update method. A year or two later, when everyone around him was tearing out their hair and wailing in despair: tada!
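That trick might look something like this sketch (the class and member names are my own invention, not his actual code): memory and frame time are deliberately wasted from day one, so both can be reclaimed instantly in an emergency.

```cpp
// Hypothetical sketch of the "explicit ripcord" trick: burn resources
// up front so they can be handed back when the project is in trouble.
#include <chrono>
#include <thread>
#include <vector>

class Game
{
public:
    Game()
    {
        // Ripcord #1: a megabyte that is allocated but never used.
        // Deleting this line later buys back 1 MB of headroom instantly.
        mRipcordMemory.resize(1024 * 1024);
    }

    void Update()
    {
        // Ripcord #2: an artificial millisecond of per-frame cost.
        // Removing it buys back a millisecond of frame time.
        std::this_thread::sleep_for(std::chrono::milliseconds(1));

        // ... real per-frame game logic would go here ...
    }

private:
    std::vector<unsigned char> mRipcordMemory;
};
```

Because the whole team has been shipping builds that carry this overhead all along, cutting the ripcords at the end is guaranteed to be a pure win, with no artwork or design changes required.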