Friday 22 July 2016

Game Update : Optimization

Optimization.

This will most likely not be the last time I talk about optimization, primarily because many avoid the subject... I mean, it IS a difficult subject to talk about; there are so many factors that it's almost an abstract concept.  I enjoy talking about it, and it's seriously a high priority when you think about how one should build a video game.

I enjoy talking about it because I somewhat see it as an art form in itself.  I geek out whenever I see low poly challenges where artists try to do incredible art with very little.

When I first started the video game project, I already had some sort of game plan for how things should be organized in order to ease the load on the CPU and GPU.  Artificial intelligence is going to be an important aspect of the game, so it's imperative that I free up as many CPU cycles as possible.

The key to this is a simple workflow, and this particular case has been pretty straightforward:

  • Build something as fast as possible just to see it working in the game.
  • Test and debug it to make sure it doesn't break the game.
  • Optimize/re-write it.

If you want to build your own video game, that's the mindset I believe you should have.  The first point is particularly important; you shouldn't waste your time trying to get it working perfectly the first time... because you'll end up spending a lot of effort for very little "game" to show for it.

Since my last "game update" blog post, I've been splitting my time between putting new things in and optimizing old things.  It keeps me from losing sight of the ultimate goal and spending too much time on a single thing.
For example, last year (in June) I talked about saving the world data to disk (because everything was stored in RAM at that point), and I implemented that over the winter.  Since the save file would grow as the player explored the world (as the content gets created), I didn't think too much of it when I put that functionality in...

... until I did the math.  With a current average of 2600 areas to explore (surface level, not actual dungeons), at roughly 1.5 to 4 megs a zone, the save file was estimated to balloon to around 10 gigabytes worth of information.  That's excessive!  So I spent a bit more time earlier this year streamlining the data and got that shrunk down to 120kb or so.  Per zone.
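The savings are easy to sanity-check with a quick back-of-the-envelope script (the zone count and per-zone sizes come from the estimates above; everything else is purely illustrative):

```python
# Rough save-file estimate using the figures quoted above.
ZONES = 2600  # average number of explorable surface areas

def total_mb(per_zone_kb):
    """Total save size in megabytes if every zone gets written to disk."""
    return ZONES * per_zone_kb / 1024

before_low  = total_mb(1.5 * 1024)  # 1.5 MB per zone -> ~3,900 MB
before_high = total_mb(4.0 * 1024)  # 4 MB per zone   -> ~10,400 MB
after       = total_mb(120)         # 120 KB per zone -> ~305 MB
print(f"unoptimized: up to {before_high:,.0f} MB, optimized: {after:,.0f} MB")
```

That's roughly a 13x to 34x reduction just from streamlining what gets written per zone.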

While I was figuring that out, I also spent some time adding features that would shape the gameplay: creating a name generator for my environments, and adding the ability for characters to generate sound and smell... so that running around makes you loud (letting others detect you) and going into sewers makes you smelly the longer you stay in there (for detection and other sorts of interactions).
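As a rough sketch of how sound/smell stats like these might work (all names, rates, and thresholds here are hypothetical illustrations, not the game's actual code):

```python
# Hypothetical sketch of the sound/smell mechanic described above.
from dataclasses import dataclass

@dataclass
class Presence:
    noise: float = 0.0   # how loud the character currently is
    smell: float = 0.0   # accumulated odor, clamped to 0..100

    def tick(self, dt, running=False, in_sewer=False):
        # Running spikes noise; standing still lets it decay back down.
        self.noise = 8.0 if running else max(0.0, self.noise - 4.0 * dt)
        # Sewers build up smell over time; it fades slowly once outside.
        if in_sewer:
            self.smell = min(100.0, self.smell + 5.0 * dt)
        else:
            self.smell = max(0.0, self.smell - 0.5 * dt)

def audible(p, distance, hearing_factor=10.0):
    # An NPC "hears" you if your noise, scaled by its hearing, out-ranges you.
    return p.noise * hearing_factor >= distance
```

The nice property of stats like these is that detection becomes a cheap numeric comparison per NPC instead of anything resembling a physics query.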

Adding new things, fixing old stuff.

What's been on my plate recently is continuing work on the foundation for the NPC AI, particularly the dialogue system.  This will be an on-going process that will stay on my "add" list for quite some time.  After all, it's a prime element of the game.  As far as "fixing" things goes, I had my eye on graphics.

Optimizing graphics is such an awkward thing to do.  Building assets in an optimized way is simple enough, but how the game engine handles those assets in a live environment is what can be a challenge to optimize.  The typical industry workflow: you play through a level, identify a drop in frame-rate, investigate what the player is looking at, and then make some changes.  Sometimes it's as easy as removing a couple of trees here and there because there was too much vegetation in one particular spot (which is taxing on the video card), and sometimes it's as difficult as re-aligning streets and buildings in a city so that buildings obscure other objects that would otherwise drop your frame-rate.

But when you're dealing with randomness, it gets a bit tricky.  I mean, objects that spawn dynamically at the computer's whim are already difficult to deal with (compared to assets that are locked down before compile), but now you also have to deal with the standard challenges... which typically boil down to density and line of sight.

I was more or less prepared for this.  Procedural generation helps a ton here by giving the computer guidelines for how it should build things.  So I can say things like "don't put too many trees in any given area" or "break off line of sight by placing a building here".  That, coupled with how I optimize the 3D assets, leaves me to deal only with the game engine's quirks.
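A density guideline like "don't put too many trees in any given area" can be sketched as a per-cell cap during placement (the grid size and cap below are made-up values, just to show the idea):

```python
# Illustrative density cap for procedural placement: trees are scattered
# randomly, but no grid cell is allowed to exceed a fixed count.
import random

def place_trees(width, height, max_per_cell=3, cell=16, attempts=500, seed=42):
    """Scatter trees, rejecting any that would overcrowd a grid cell."""
    rng = random.Random(seed)
    counts = {}  # (cell_x, cell_y) -> trees placed in that cell so far
    trees = []
    for _ in range(attempts):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        key = (int(x // cell), int(y // cell))
        if counts.get(key, 0) < max_per_cell:
            counts[key] = counts.get(key, 0) + 1
            trees.append((x, y))
    return trees, counts
```

Because the cap is enforced at generation time, the renderer never even sees a worst-case clump of vegetation.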
Thankfully, Unity is actually pretty damn good; I didn't have to fiddle too much (although I did have to build my own system for hiding objects and reducing detail at a distance).  If there's a problem and you can't tell where it's coming from, the profiler is a good place to check for clues.
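A hide/reduce-detail system like the one mentioned boils down to a distance-to-tier lookup (the cutoff distances here are assumptions for illustration, not values from the game):

```python
# Minimal sketch of distance-based hide/reduce-detail selection.
def detail_level(distance, hide_beyond=200.0, low_beyond=80.0, mid_beyond=30.0):
    """Pick a representation for an object based on its distance to the camera."""
    if distance > hide_beyond:
        return "hidden"   # don't render at all
    if distance > low_beyond:
        return "low"      # cheapest mesh or billboard
    if distance > mid_beyond:
        return "medium"
    return "high"         # full-detail mesh up close
```

Running a check like this per object each frame (or every few frames) is cheap, and it keeps far-away clutter from ever hitting the video card.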


It's just a little bit daunting to look at.  I've been using the profiler extensively for the past month or so, tweaking things here and there.  Overall, though, everything's going according to plan and it seems like Unity won't have any major performance issues with what I've got planned.

AI is coming, and the CPU needs to have a light load.