## Optimising only for profit is a lousy strategy

I felt sorry for the woman when I heard she was going to be let go by the company. I felt even more sorry when I heard that she would be replaced by 3 graduates in China. Seriously, graduates were that cheap in China?

This reminds me of a Chinese phrase. “In the past, having a degree was a big deal. Now, the whole street’s filled with graduates.” Not all degrees are created equal.

Let’s start off with a few baseline understandings…

### Infinity and beyond

Problems are often formulated in terms of infinity because it makes them easier to solve. Infinity is usually a simplification of a large but finite quantity. Think of infinity as “*so big I don’t have to worry about how big it is*.”

(Emphasis mine)

Companies like infinity. Particularly when it comes to describing (theoretical) growth and market share. But markets are not infinite. More on this later.

### Basic economics

We will assume this simple generalised equation for the purpose of this article:

Profit (or earnings) = Revenue – Cost

You can think of revenue as the price of an item. Cost covers everything from the cost of manufacturing an item to storing the item, to paying employees, to paying rent (for use of space), to paying utilities.

Just keep in mind that profit is calculated from 2 components.

### The gold standard and Bretton Woods

Back in the old days (way way back), the value of money was static. If you held on to a dollar, 10 years later, that dollar was still worth a dollar. That’s because the value of money was tied to gold.

Historically, a bank was legally obliged to give anyone gold in exchange for the banknotes handed over. That means if you handed the bank 100 dollars, the bank had to give you 100 dollars’ worth of gold. This meant that banks had to keep large reserves of gold, just in case.

The banknote (that money bill dollar note thingy) was invented just to make things easier, so you didn’t have to carry around nuggets of gold. Gold’s heavy.

In July 1944, the Bretton Woods Agreement was signed. Basically, many countries agreed to tie their currencies to the US dollar. The US dollar was tied to the gold standard, so this wasn’t a problem. (This was part of the reason why the US rose to be a dominant force in the world in those days: practically everyone was using the US dollar as a reserve currency.)

In August 1971, the United States ended the convertibility of the US dollar to gold. It might have had something to do with funding the Vietnam War. The US was running out of funds. To continue funding the war, the US needed to print money out of thin air. And you can’t do that if your currency is pegged to gold. (Note: I’m not bashing Americans. I read this in an economics book. No, I can’t remember which book… you should know me by now…)

This also “freed” the other countries from tying their currencies to the US dollar. Which (probably) gave rise to the idea of foreign currency exchange rates.

The dissolution of the Bretton Woods Agreement also meant the creation of fiat money. Meaning that dollar you have there is worth what the government says it’s worth. Printing money out of thin air also fuels inflation. Meaning that dollar you have there is probably worth less than a dollar from a year ago.

### Saturation limits

31 October 2011 was designated as the day when the world population became 7 billion. It’s a big number, but it’s not infinity.

Which is where the companies made their mistake.

In the post-World-War-2 era, everyone wanted a better life. We’ve sacrificed enough. We’ve suffered enough. We want a better life! (baby boomers, hello!)

Babies were made. Population grew. Household appliances made their way into homes. Automobiles, mass-produced ever since Henry Ford’s assembly line, filled driveways. Product categories multiplied. Industries boomed.

Revenue was up. Sales quintupled. Profits were up.

Just when local markets seemed exhausted, globalisation came and opened up the world. International trade continued the seemingly upward trend.

People started expecting growth as a natural consequence. Companies started paying more attention to Wall Street and upholding shareholder value.

“We just need to capture 1 more percent of market share!”

I read that in America, there were more licensed vehicles than licensed drivers. Meaning there were more vehicles than people who could drive them. I understand there’s a surplus of some 31 million such vehicles. Supposedly, every man, woman and child in Canada could have a vehicle from this surplus.

There are probably more cell phones than cell phone users. There’s more food produced than needed to feed every person in the world (yet there are millions starving).

What happens if Microsoft succeeds in placing a computer (with Windows, naturally) on every desktop and in every home? What happens if Apple succeeds in placing an iPhone/iPad in everyone’s hands? What if everyone has already bought Angry Birds on their iPhone/iPad? What happens if McDonald’s succeeds in getting everyone to eat at their restaurants? What if everyone used an Oral B toothbrush? What if everyone used Body Shop products? What if every male used Old Spice?

What if every business person is already flying with your airline? What if every Harry Potter fan already has all 7 books? (that’s probably a rhetorical question…) What if every C# programmer already owns a copy of your C# programming book? What if every tea lover in your area is already frequenting your tea house?

What if every possible customer already has your product? What if every possible customer already maxed out his/her rate of consumption of your product?

The natural limit is population. The next limit is rate of consumption. Every company hits these 2 limits. The limits just weren’t as prominent a couple of decades ago.

### Revenue started stalling

When you hit those natural limits, company growth stalls. To give the illusion of growth, we go back to that equation again.

Profit = Revenue – Cost

The outside world (mainly Wall Street and the stock market) views growth in relation to profit. The assumption is that if a company is making a profit, it’s still healthy. As in it’s still bringing in revenue.

But if revenue isn’t growing, you’re not making any more sales. Maybe it’s because your customers switched brands. Maybe your customers switched to a cheaper version of your product (which cannibalises your own sales, but hey, at least you didn’t lose that customer).

But in today’s hyperconnected world, the reason is probably that your customer “market share” is already saturated. You might think 7 billion people is still a lot of people, but a large part of them live in poverty. They simply cannot buy your product. And many of those who can buy your product don’t want it.

Some new startup shows up and gets millions of users within a month. It continues at a steady pace and then… stops. The natural equilibrium is reached.

The company CEO has to do something to show that the company is still growing (because the people watching the Dow Jones are breathing down her neck). So if revenue doesn’t increase as much, what can you do to increase profits? Reduce costs.

### Cost reduction policies

I’d say as a broad generalisation, there’s only so much you can do to reduce costs. Rent space? Consolidate people and equipment in fewer locations. Equipment maintenance costs? Have less equipment, or more efficient equipment, or just get rid of the whole thing.

But one of the most costly line items (if not the most costly) is people. (Be honest. When I said “cost reduction”, didn’t you immediately think “layoffs”?)

Let’s see. The world population is growing (albeit more slowly now). Generally speaking, more people are working (I know the current economy sucks, with few jobs being created. Stay with me). Fewer people are dying. People are living longer. There are fewer opportunities to move up the corporate ladder (because the high-level managers are still there).

Yet people still expect pay raises every year. I’m not pro-Malthusianism, but the supply of money is kinda limited… Wait, good thing the Bretton Woods Agreement was dissolved.

Since people have feelings (and machines and raw materials don’t), companies hesitate to fire people (for fear of a major backlash). So something has to give.

Outsourcing (the bad kind). Mergers and acquisitions (probably where the term “wholly owned subsidiary” came from). Subtle changes in accounting books (which is illegal, don’t do it).

Anything to create the illusion of growth and profits. (And with the fiat currency system, money itself is kind of an illusion. But that’s another topic…)

It’s driven people who assemble iPhones to suicide. It’s made people over-consume (creating the obesity problem, and the dieting industry to go with it). It’s made people buy houses they couldn’t really afford. It’s made lenders let those people who couldn’t afford houses buy houses.

It’s made people look for shallower qualities in marriage partners (diamonds, big car, big house, big breasts [I hesitated on including this one], big paycheck), which caused increasing divorce rates, which increased the number of divorce lawyers needed, which increased the number of real estate agents needed (to split the property).

It’s caused the dot com bust. It’s caused tech startups to look for the fastest exit strategy, because the venture capitalists backing the startup forced the founders to do so (so the VCs could get their return on investment).

Optimising only for profit is a lousy strategy.

## Know what you are optimising for

Seth Godin gave a math puzzle. I know! I’m shocked too! I’d have to plagiarise a bit, since the puzzle fills more than 50% of his article. I’ll take the minimum that still makes sense. Here goes:

Let’s say your goal is to reduce gasoline consumption.

And let’s say there are only two kinds of cars in the world. Half of them are Suburbans that get 10 miles to the gallon and half are Priuses that get 50.

If we assume that all the cars drive the same number of miles, which would be a better investment:

• Get new tires for all the Suburbans and increase their mileage a bit to 13 miles per gallon.
• Replace all the Priuses and rewire them to get 100 miles per gallon (doubling their average!)

My first answer was the second scenario. It’s wrong. The first scenario is the better investment. Two of Seth’s readers provided their own explanations (see Charlie’s and Nariman’s explanations).

Charlie gave a concrete example with calculations. Nariman distilled the question into math symbols. *smile* Both explained the answer excellently. I’m going to borrow on Nariman’s math workings and continue from there. You might want to read both explanations first.

So, following up on Nariman’s math calculations, we have

Let m be number of miles driven by a car…
Let s be the gas consumption (in gallons) for Suburbans (= m/10)
Let p be the gas consumption (in gallons) for Priuses (= m/50)
Let T be the total consumption (in gallons) (= s + p = m/10 + m/50 = 6m/50 = 0.12m)

Now, Charlie used a “magic number” to start: 1300 miles. We’ll use that. Without loss of generality, we’ll examine only 1 Suburban and 1 Prius (since each makes up half the fleet).

In the 1st scenario, the total gasoline consumption is
1300/13 + 1300/50
= 100 + 26
= 126 gallons

In the 2nd scenario, the total gasoline consumption is
1300/10 + 1300/100
= 130 + 13
= 143 gallons

So with simple numbers, it’s easy to see that the 1st scenario is better. But can we make the 2nd scenario better? We doubled the mileage of a Prius and it’s still not good enough. How much do we need to improve the mileage before it becomes comparable?

Let h be the mileage such that the 2nd scenario is comparable.
So for the 2nd scenario, it becomes
1300/10 + 1300/h
= 130 + 1300/h

Here’s where it gets interesting. Let h go to infinity. The expression
130 + 1300/h
goes to 130 (because 1300/h goes to zero),
which is still more than 126 (from the 1st scenario).

This means, even if the Prius can travel all the way to Alpha Centauri and back a gazillion times, and then run a bajillion laps on the circumference of the universe, all on just a drop of oil, the 1st scenario is still better!

### The misdirection

I’m guessing your first answer was also that the 2nd scenario is better. It’s wrong because we were optimising for the wrong thing. The very first statement is

Let’s say your goal is to reduce gasoline consumption.

We were supposed to minimise gasoline consumption. But when the question came up, the term “mileage” appeared and took centre stage. It wrangled our minds into forgetting what we were trying to do, and coerced us into maximising mileage instead.

We were solving the wrong problem.

### Parting thoughts

Improving something that’s fairly good (a 50 miles per gallon Prius) is harder than improving something that’s fairly terrible (a 10 miles per gallon Suburban). Individually speaking, you should go ahead and improve the Prius (it was a 100% improvement!). But taken together, you should be improving the weakest link. In this case, the Suburban.

We aren’t just improving one line of transportation. We are improving the entire system of transportation on the planet.

Random thought: the problem of minimising gasoline consumption is not the dual problem of maximising mileage. Go figure.

## Multiplications, additions and bit shifts

In a previous article, I asked what this does

```csharp
int foo(int para)
{
    int temp = para << 3;
    return temp + temp + temp;
}
```

The function basically returns the parameter multiplied by 24 (thanks Mark!).

So why was it written that way? Speed.

From what I understand, in the olden days, multiplication was slower than addition and bit shift operations. I don't know how slow, but it was slow. It was so slow that programmers (particularly game programmers) started replacing multiplication operations with a combination of additions and bit shifts.

Our example above with 1 bit shift and 2 additions was faster than just 1 multiplication operation. Hmm... that example may not be the best to use... Let me give another one.

The idea is to split the multiplication into additions first. So
i * 80
becomes
i * (64 + 16)
which becomes
(i * 64) + (i * 16)
which becomes
(i << 6) + (i << 4), where we swap multiplications by powers of 2 for bit shifts. See, you do need some math in game programming.

Bit shifts are faster than additions, which are faster than multiplications. The speed difference obtained from rewriting the code was enough for some game programmers to adopt the technique.

With modern computer chips, multiplications are practically as fast as additions and bit shifts. So I did some comparisons. Before I show you the code, there are 2 warnings:

• The code segments aren't supposed to be tested on their own. The original speed improvements made sense when combined with their surrounding code.
• Measuring something changes it, so you're not measuring the true value (the observer effect, loosely borrowed from physics).

I will present 4 different ways of multiplying by 24 using 4 functions. The first is the one posed in the previous post:

```csharp
int foo(int para)
{
    int temp = para << 3;
    return temp + temp + temp;
}
```

The second function uses multiplication.

```csharp
int goo(int para)
{
    int temp = para;
    return temp * 24;
}
```

I didn't really have to declare the variable `temp`. I did it so all 4 functions declare a variable. Even variable declaration takes time, minute as it is, so I had to take that into account and standardise it across the 4 functions. This is the part where "measuring something changes it"...

The third function uses 2 bit shifts and 1 addition.

```csharp
int hoo(int para)
{
    int temp = para;
    return (temp << 4) + (temp << 3);
}
```

Convince yourself that the return value is the parameter multiplied by 24 (16 + 8 = 24).

The fourth function is a modification of the first function.

```csharp
int loo(int para)
{
    int temp = para << 3;
    return (temp << 1) + temp;
}
```

If bit shifts are faster than additions, `temp << 1` is faster than `temp + temp`, right? The use of temporary variables to hold intermediate values for another calculation is another technique used in game programming. This sounds a little vague, so I'm publishing another article to explain this further.

And here's the timing code

```csharp
const int cnMax = 500000000;
const int cnLoops = 20;
const int cnValue = 5;
DateTime dtStart = DateTime.Now;
DateTime dtEnd = DateTime.Now;
TimeSpan ts;
int i, j, temp = 0;

temp = 0;
dtStart = DateTime.Now;
for (i = 0; i < cnLoops; ++i)
{
    for (j = 0; j < cnMax; ++j)
    {
        temp = foo(cnValue);
    }
}
dtEnd = DateTime.Now;
ts = dtEnd - dtStart;
Console.WriteLine(ts.TotalSeconds);
Console.WriteLine(temp);

temp = 0;
dtStart = DateTime.Now;
for (i = 0; i < cnLoops; ++i)
{
    for (j = 0; j < cnMax; ++j)
    {
        temp = goo(cnValue);
    }
}
dtEnd = DateTime.Now;
ts = dtEnd - dtStart;
Console.WriteLine(ts.TotalSeconds);
Console.WriteLine(temp);

temp = 0;
dtStart = DateTime.Now;
for (i = 0; i < cnLoops; ++i)
{
    for (j = 0; j < cnMax; ++j)
    {
        temp = hoo(cnValue);
    }
}
dtEnd = DateTime.Now;
ts = dtEnd - dtStart;
Console.WriteLine(ts.TotalSeconds);
Console.WriteLine(temp);

temp = 0;
dtStart = DateTime.Now;
for (i = 0; i < cnLoops; ++i)
{
    for (j = 0; j < cnMax; ++j)
    {
        temp = loo(cnValue);
    }
}
dtEnd = DateTime.Now;
ts = dtEnd - dtStart;
Console.WriteLine(ts.TotalSeconds);
Console.WriteLine(temp);
```

I've initialised `temp` to 0 just before the test loops, and printed its value after the loops to check that the calculations give correct results. The loops are coded with a standard structure, and the only difference is the function used. All 4 functions are coded in as similar a structure as possible, to isolate only the method of calculation.

I've used nested `for` loops to increase the number of iterations. One loop can only go as high as about 2^31 iterations (the limit of an `int`), unless I use `long`. Nested loops raise this limit by running a lower iteration count multiple times.

And the conclusion? Multiplications are now practically as fast. I didn't get consistent enough results to rank the methods in order of speed though. Generally speaking, the 2-shift-1-add method (3rd function) was faster than the multiply-by-24 method (2nd function).

That's it. As an exercise, you might want to come up with better code to time them. Follow some simple rules:

• The results must be repeatable
• Tests must be standardised

A suggestion? Perhaps you can add more code to the loop structure (or the functions). So long as all 4 loops (or function code) are similar, you don't need a minimalistic approach. This allows you to increase the percentage of identical parts between the tests, and thus highlight the difference in only the calculation method (let me know if you're confused by this).

Have fun!

## It gets compiled anyway

I feel I misrepresented my intentions in a previous article. In that article, I gave a code example:

```c
int i;
char c[3];
i = 0;
c[i++] = 'a';
c[i++] = 'b';
c[i++] = 'c';
```

Christopher commented that the code gets optimised by most modern compilers (can you see any inefficiencies in the code?). I agree with that. There was a time when I studied and adhered to basic code optimisations. Practices such as unrolling small `for` loops and simplifying boolean check conditions. I still do the basics (out of instinctual habit), but it’s not that big a deal anymore.

Computers have gotten to the point where small inefficiencies don’t really matter anymore. High computing speeds overshadow minor stalls. Compilers are also smart enough to optimise away inefficient code in the first place.

So why did I cite that example? Because you haven’t had the benefit of Moore’s Law.

Imagine taking over someone else’s code and, after wading through reams of it, finding that that essay of a function could be reduced to a single line of code without loss of understanding or purpose. Imagine how much effort it took you to understand all that code before you grasped its purpose. If you’re lucky, there would be comments and documentation. If you’re luckier, those comments and documentation would even be relevant and up to date.

In human-first programming, you’re not just creating software for the end user, you’re also writing code for another programmer to read. The compiler doesn’t care how obfuscated the code is. It can read it just fine. You, on the other hand, might have a little trouble with the code.

Just because the code gets compiled anyway doesn’t mean you can be sloppy.

I was young and naive then. Having learned this fascinating subject known as C programming, I read anything I could about programming. As a game enthusiast, I pored over articles and tricks on getting the computer to perform neat and cool stuff.

Things like

```c
i << 3
```

to multiply by 8, saving a few processor cycles.

Or the triple XOR swap trick

```c
x = x ^ y;
y = x ^ y;
x = x ^ y;
```

so I could save the use of a temporary variable.

Or counting down so I could compare against zero in a `for` loop's condition part.

```c
int i, sum = 0;
for (i = 10; i != 0; --i)
{
    sum += i;
}
```

Which was supposedly more efficient than counting upwards, and having to compare against a nonzero value.

```c
int i, sum = 0;
for (i = 1; i <= 10; ++i)
{
    sum += i;
}
```

I can't remember when, but at some point I decided that I had been restricting myself. Obsessed with faster clock cycles, smaller memory footprints and elaborate coding structures, I had taken away my freedom to create a working program the way it needed to be created. I became an optimisation slave.

Chris Anderson mentioned in an article that in the mainframe era,

> early developers devoted as much code as possible to running their core algorithms efficiently

It was difficult letting that optimisation obsession go, but I was

> liberated from worrying about scarce computational resources like memory and CPU cycles, could become more and more ambitious, focusing on higher-order functions such as user interfaces and new markets such as entertainment

But what about demo sceners? Aren't they coding against unnecessary limits? In the past, programmers optimised out of necessity. Demo sceners optimise as a form of programming challenge. Although nowadays, there are contests (such as at Breakpoint) where full-blown demos weigh in at megabytes. With better computing architecture, richer textures, fuller sounds and more detailed 3D models are used.

Speaking of 3D models, I remember writing a custom geometric shape generator. There were functions for generating spheres, cubes and tetrahedrons, complete with texture mapping capabilities. I was obsessing over the number of polygons produced, trying to figure out the least number of vertices so that frame rate wasn't affected.

I gave the computer too little credit. Just because I couldn't conceive of generating hundreds of vertices and polygons, didn't mean the computer couldn't. All that time wasted optimising when I could have been working on my game...

"But don't you need to find out what the limits are?" you ask. "If you plan and optimise from the start, then you won't hit those nasty problems."

Let me introduce you to the math concept of local and global maxima. A maximum is a point of greatest value; a local maximum is greatest only within its neighbourhood, while the global maximum is greatest overall.

If you optimise from the start without regard to the whole, you reach a local maximum: high efficiency relative to the little amount of code you've written.

But if you wait till closer to the end, when you have already solved the given problem, you have a better idea of the code structure. This is where your optimisation efforts create the most benefit. You approach the global maximum.