Optimise your code later

I was young and naive then. Having learned this fascinating subject known as C programming, I read anything I could about programming. As a game enthusiast, I pored over articles and tricks on getting the computer to perform neat and cool stuff.

Things like

i << 3

to multiply by 8, which supposedly saved a few processor cycles over an ordinary multiplication.

Or the triple XOR swap trick

x = x ^ y;
y = x ^ y;
x = x ^ y;

so I could save the use of a temporary variable.

Or counting down so I could compare against zero in a for loop's condition part.

int i, sum = 0;
for (i = 10; i != 0; --i)
    sum += i;

This was supposedly more efficient than counting upwards and having to compare against a nonzero value.

int i, sum = 0;
for (i = 1; i <= 10; ++i)
    sum += i;

I can't remember when, but at some point I decided that I'd been restricting myself. Obsessed with faster clock cycles, smaller memory footprints and elaborate coding structures, I had taken away the freedom to create a working program the way it needed to be. I had become an optimisation slave.

Chris Anderson mentioned in an article that in the mainframe era,

early developers devoted as much code as possible to running their core algorithms efficiently

It was difficult letting that optimisation obsession go, but I was

liberated from worrying about scarce computational resources like memory and CPU cycles, could become more and more ambitious, focusing on higher-order functions such as user interfaces and new markets such as entertainment

But what about demo sceners? Aren't they coding against unnecessary limits? In the past, programmers optimised out of necessity; demo sceners optimise as a form of programming challenge. Nowadays, though, there are contests (such as those at Breakpoint) where full-blown demos weigh in at megabytes. With better computing architecture, richer textures, fuller sounds and more detailed 3D models are used.

Speaking of 3D models, I remember writing a custom geometric shape generator. There were functions for generating spheres, cubes and tetrahedrons, complete with texture mapping capabilities. I was obsessing over the number of polygons produced, trying to figure out the least number of vertices so that frame rate wasn't affected.

I gave the computer too little credit. Just because I couldn't conceive of generating hundreds of vertices and polygons, didn't mean the computer couldn't. All that time wasted optimising when I could have been working on my game...

"But don't you need to find out what the limits are?" you ask. "If you plan and optimise from the start, then you won't hit those nasty problems."

Let me introduce you to the math concept of local and global maxima. A maximum is a point of maximum value; a local maximum is higher than all the points near it, while the global maximum is the highest point overall.
[Figure: local and global maxima]

If you optimise from the start without regard to the whole, then you reach only a local maximum: high efficiency relative to the little amount of code you have written.

But if you wait until closer to the end, when you have already solved the given problem, you have a much better idea of the code structure. That is where your optimisation efforts create the most benefit: you get the global maximum effect.