Get your leap years right

Thirty days hath September
April, June and November
All the rest hath thirty one
Except February alone

Snowboarder up high by Jason Lugo @ iStockphoto

You just divide the year by 4, right? If the year is divisible by 4, then it’s a leap year, isn’t it? Yes it is, if you still use 2 digit years, and only if 00 represents 2000 and not 1900.

[The Gregorian calendar is the basis used for the following discussion.]

Back in the pre-Y2K years, many programs were written with 2 digit representations of years. Actually, I have no idea when the first computer was built. I’m not talking about the abacus, or those humongous contraptions full of vacuum tubes (big light bulbs, basically) that required entire rooms to house them. *smile* I’m talking about the computer as we know it now, the kind that was common by the ’90s.

Anyway, the years those programs dealt with fell between 1900 and 1999. So any 2 digit year divisible by 4, such as 40 for 1940, was a leap year. Then 2000 came and went. There were still many programs written with the 2 digit year representation. This presented a problem.

Does 04 represent 1904, or 2004? To minimise changes from a 2 digit to a 4 digit representation, a workaround was used. Any 2 digit year below a certain number was designated to be in the 21st century; otherwise it was in the 20th century.

For example, 15 would represent 2015, whereas 63 referred to 1963. The Oracle database uses 50 as the magic number. So 2 digit years range from 1950 to 2049.

I’ve maintained legacy code which uses 80, so it could handle 1980 to 2079. Probably with the assumption that anything before 1980 was really really old and thus was no longer useful data.

The magic number was introduced to allow the divide-by-4 leap year calculation to continue working.
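
For illustration, here’s a minimal sketch of that workaround in JavaScript (the function name and the default pivot of 50, borrowed from the Oracle example above, are my own):

// Expand a 2 digit year using a pivot, the "magic number".
// Any 2 digit year below the pivot lands in the 2000s, the rest in the 1900s.
function expandTwoDigitYear(iTwoDigitYear, iPivot) {
  if (iPivot === undefined) iPivot = 50;
  return (iTwoDigitYear < iPivot) ? 2000 + iTwoDigitYear : 1900 + iTwoDigitYear;
}

// 15 becomes 2015, whereas 63 becomes 1963.
// expandTwoDigitYear(15) gives 2015
// expandTwoDigitYear(63) gives 1963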

I implore you. Please use the proper calculation formula for leap years. And please use the 4 digit year representation.

if ((year modulo 4 is 0) and (year modulo 100 is not 0)) or (year modulo 400 is 0)
then leap
else no_leap

I’ve had occasion to write a JavaScript function involving counting days in months. Which involved February. Which inevitably involved calculating leap years. So we have,

(year modulo 4 is 0) and (year modulo 100 is not 0)

which translates to

(iCurrentYear%4==0) && (iCurrentYear%100!=0)

And

year modulo 400 is 0

becomes

iCurrentYear%400==0

It’s not that hard, is it? So the final if condition becomes

if ( ((iCurrentYear%4==0) && (iCurrentYear%100!=0)) || (iCurrentYear%400==0) )

In English: Any year that’s divisible by 4 but is not a century year (I made that term up) is a leap year. So 1896 and 1904 are leap years, but 1900 is not.

What about 2000? That’s a leap year, and it’s a century year. Aahhh… Century years are not leap years, unless they are divisible by 400. So 1600, 2000, 2400 are leap years.
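
Putting it all together as a complete JavaScript function (a minimal sketch; the function names are mine), plus the days-in-month counting that got me into this in the first place:

// The full rule: divisible by 4, except century years,
// unless the century year is divisible by 400.
function isLeapYear(iCurrentYear) {
  return ((iCurrentYear%4==0) && (iCurrentYear%100!=0)) || (iCurrentYear%400==0);
}

// February is the only month that needs the leap year check.
// iMonth runs from 1 (January) to 12 (December).
function daysInMonth(iMonth, iYear) {
  var aiDays = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31];
  if (iMonth == 2) return isLeapYear(iYear) ? 29 : 28;
  return aiDays[iMonth - 1];
}

// isLeapYear(1900) gives false, a century year not divisible by 400
// isLeapYear(2000) gives true
// daysInMonth(2, 2008) gives 29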

“But that formula won’t kick in till 2100!” you exclaim. “I can still use the divide-by-4 method!”

You and I will probably not be around in 2100. The application will probably not even live past 2010, let alone 2100.

The point is, the change is minor, yet allows your application to live a lot longer. So why don’t you change? Your unwillingness to change (or improve)? Your laziness? What is holding you back?

P.S. The Wikipedia article on leap years indicates that in 8000 years, the above formula might fail. 8000 years is plenty of time.

An alien might time-travel to 2008 and use an application I wrote. Then the alien travels back to his time, the year 2099, and finds that my application fails once 2100 crosses over. I’d hate to have an angry extraterrestrial breathing down my neck…

And happy birthday to leaplings!

Choosing colours

Colours of the rainbow
Graze from rage to royal
Beauty on eyes they bestow
Passions inside us they boil

What colour? by Christopher O Driscoll @ iStockphoto

We don’t have a graphic designer in the office. So it’s up to the individual developer to come up with graphics, styles and design. Well, you can imagine the look of the applications created. My colleagues and I try to salvage the situation by making sure our own applications look decent.

Without much training or good image editing software available, there’s only so much we can do. Swathes of colour, cool looking fonts and some simple images. Luckily I have Paint.NET to handle some pixel operations.

Anyway, I have to choose colours fairly often. The colour of the text, that of the background, and that of the border lining. CSS does an amazing simplification for web applications. It can’t tell me what colours to use though.

I wrote about my agony over web safe colours before. This is like a continuation of sorts. So for colour selection, there’s the colour wheel and rectangular blocks of colour arranged in a grid according to red, green and blue values.

The thing with swathes of colour is that they are continuous. Trying to decide which of two close points on the colour wheel gives a better “feeling” makes my head hurt. So I have this handy colour chart to help with at least coming up with a starting colour. Please have a look before reading on.

Alright, so the colours defined by the KnownColor enumeration in the .NET framework don’t form any kind of standard. You’re missing the point. I use them as a starting point, sometimes even as the colour I want. Because I arranged the chart by sorting the colour names alphabetically, colours with similar hues usually end up away from each other.

This gives my eyes distinct patches of colour to look at. I can find that exact hue I want because of the added variable of distance. Against the other surrounding colours, the ideal hue sometimes jumps out at me.
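
As a rough idea of what my chart generator does, here’s a sketch (a handful of CSS named colours stand in for the KnownColor enumeration, and the variable names are mine):

// Generate a colour chart as HTML. Sorting the names alphabetically
// tends to place similar hues away from each other.
var asColours = ["Tomato", "AliceBlue", "Gold", "DarkSeaGreen",
                 "MediumOrchid", "SlateGray", "Crimson", "YellowGreen"];
asColours.sort();

var sHtml = "";
for (var i = 0; i < asColours.length; i++) {
  sHtml += '<div style="background:' + asColours[i] + ';width:150px;padding:4px">'
         + asColours[i] + '</div>';
}
document.body.innerHTML = sHtml;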

The problem isn’t with designing. It’s about indecision. Whether you’re designing the look of a web page or designing the code structure is secondary. Your inability to decide stalls the project.

I’m better with writing code. I’m learning to improve my web design skills. So I wrote a program to generate a colour chart, which helps me make faster design decisions. What did you do?

Solve the given problem first

Solve the given problem first
Ignore other thought tatters
Like water filling need of thirst
It’s the only thing that matters

Pondering the problem by Paul Kline @ iStockphoto

You’ve been given the software requirements. The problem is defined. The tasks are laid out. Yet half-way through coding, you start deviating. You think up cool new stuff to “enhance” the application. You’re distracted by the wonderful “features” you can add to make the application better. You’re not getting the project moving, and as such, you’ve become a bottleneck.

This isn’t about solving the actual problem, where the underlying problem is obscured by a superficial but seemingly real problem. It’s about coding against whatever is already discussed, decided and defined. It’s about doing what’s needed before doing what will be nice to have.

That rotozoom effect will be so cool!

Yes it will. Right after you finish coding the basic camera movements first. And that texture loading function. And that resource file management class.

I remember trying my hand at hobbyist game programming. I’ve played lots of role playing games. I know about sprites and 2D maps. I know about polygons and isometric maps. I understand orthogonal views and culling planes. I learned about colour mappings and Phong and Gouraud shading.

Coding the basic game code structure is incredibly boring and tedious. I’ve written “Hello World” equivalents for OpenGL and DirectX rendering, to test the basic code template. I’ve written functions to generate simple geometry objects such as spheres, cubes and pyramids. I’ve written custom import facilities to take in a 256 by 256 pixel bitmap as the game font, and render the individual letters correctly (try thinking about fixed and variable width fonts).

I was a student then, and didn’t want to shell out money to buy third party game frameworks. Besides, it’s fun to learn from the start. I just didn’t realise it involved so much work.

So I understand that sometimes, it’s easy to get sidetracked and go do something more interesting. Now older and wiser, I’ve gained the resolve and discipline to finish the basic stuff first. The fancy stuff can come after that.

I wanna do sorting as well

There was an incident where I did some quick impromptu tutoring on C for John (as I’ll call him). John was a freshman, and had just started C programming as one of his courses. I was asked to help him with one of his assignments.

The assignment was a standard question where a list of students with their test scores was given (or was to be input). Each student was then to be given a grade, calculated by the program. Say 70–100 would be an A, 60–69 would be a B and so on.

The actual requirements in the assignment were simple. I helped John understand some of the compiler errors, and gave hints on how to go about coding some of the tasks. But he already had plans…

The lecturer set it up such that there would be extra credit for, well, extra stuff (probably because the assignment was easy to begin with). What kind of extra stuff? “Well, you go figure it out!” the lecturer said (paraphrased). He did, however, drop hints like user manuals, extra input information about students, and sorting.

John was obviously still in the process of understanding “missing semicolon” errors, and why scanf needs an ampersand before the variable name (when not scanning character strings/arrays). I was at his house for a visit; it was almost by coincidence that I was there. He obviously wanted me to help out as much as possible, preferably with the whole assignment and the extra credit part.

So he asked me how to do sorting, for the extra credit. He had difficulty understanding how to slot student scores into grade levels, which involved if-elses and “smaller than, greater than” comparisons, as sketched below. I’m not sure how much he could take in about sorting.
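
The grading he was stuck on amounts to something like this (a sketch in JavaScript rather than the C of his assignment, with made-up grade boundaries):

// Slot a score into a grade level with if-else comparisons.
function gradeForScore(iScore) {
  if (iScore >= 70) return "A";
  else if (iScore >= 60) return "B";
  else if (iScore >= 50) return "C";
  else return "F";
}

// gradeForScore(85) gives "A"
// gradeForScore(64) gives "B"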

I told him to solve the basic assignment first. He wasn’t even done with all the required tasks. I advised him that all the extra credit means little if the actual assignment wasn’t even completed. I doubt his lecturer would award him full marks for that kind of work.

Beautify later

I’ve often had to resist the urge to add in extra display help, or create a better looking graphic. It could wait. I’ve already planned out the AJAX required for that snazzy display text. I’ve already decided which design elements needed polishing. But I needed to get the application working correctly first, right then.

It would be more user-friendly for the user. It would look more impressive to the customer. But it had to work first. What’s the point of it looking pretty if it doesn’t work?

Bake the cake first. You can add icing later.

The need for better, younger programmers

Lately I’m seeing a theme gradually forming. There are commercials featuring young people voicing their support for maintaining an environmentally friendly lifestyle. I also see commercials targeting young people, judging by the SMS inspired characters such as the caret ^, slashes /, and at signs @ used to form pictures. Then there’s Microsoft’s free software for students. And finally, there’s the winning bid by Singapore to host the Youth Olympics in 2010.

The focus on young people is growing.

My math teacher (or was it chemistry?) once remarked that it was a shame she had to teach a concept the way she did, given the stage of learning we were at. Because when we progressed to the next education level, we would find out it was wrong, and we’d have to learn a more correct version.

It’s like fractions in math. Say
2/3 = 4*a
I was told to “bring the 3 over to the other side”, so I get
2 = 3*4*a

Later on, I learned that it’s actually multiplying both sides of the equation by 3, which is what the easier-to-teach “bringing the 3 over” really means.

Education at a young age is important. Ideas arise, opinions form, and suddenly they’re set in stone (or at least very difficult to change). Once you’re at a certain age, learning new concepts might involve unlearning old ones. For some people, learning a new concept is not the problem. It’s the letting go of old concepts that’s painful, because it means they might have been wrong before.

The world needs computers and software to make a lot of things work. New problems need to be solved, and they appear faster and more complex than before. Programming experience doesn’t count for much anymore. The ability to think does.

I share Joel’s faint distaste for Java being taught in schools. Alright, fine, I hate Java. I don’t know what it is that irks me about it. When I first learned it, at version 1.1.8, it was cool. But not fun, like C. And Java kept jumping up in version numbers, while there’s only one version of C (at least it didn’t keep adding new “cool” commands).

I remember many an afternoon where I helped my fellow students figure out what went wrong in a segmentation fault. It’s the worst output error one could ever get, because there’s nothing to give you any clue whatsoever about why the program failed. It was a test of logical skills, intelligent omission and creative printf additions.

But if young programmers don’t get these kinds of training, what will happen to our future programs? They are the best at coming up with innovative and creative solutions, because they don’t have past baggage. Yet if they get a lousy programming education, they won’t be able to come up with those solutions in the first place.

Dream In Code does a great job at code guidance. A student (or any person with a programming question) can ask for help, but code must be provided. Gentle guidance is gladly given. Sample solutions are staunchly stemmed. The questioner is encouraged to think.

The betterment of our future doesn’t just lie with us, the so called established and experienced programmers. Our younger generations of programmers matter too, probably more so.

Essays, washrooms and variables

I seem to be writing a lot more these days. As in longer in length. Such as the article on functional specs, the unhealthy focus on flaws, or the fear of authentication.

When I was younger, I hated writing stuff. I’m fairly fluent in English. I just hated the composition part. My mind was better at grasping at concepts, and using them in spurts. Making me write essays with a minimum length or number of words was torture. Geography and history were the bane of my studious years…

Mathematics suited me. Couple the conciseness of mathematical expression with elegant programming code, and here I am, a programmer. I’m actually surprised that I can write so much. I will vary the length of my articles more. Sometimes, short articles work just as well to bring my point across.

Anyway, I found someone who shares my opinion about talking in the toilet. See point number 2.

Then there’s the advice on keeping variable naming simple for purposes of competitions or tests. Because

the lifespan of the code is 30 minutes, not years or decades

Contrast that with variable naming for clarity, for easy maintenance, and for the sanity of your fellow programmers.

What is a functional spec?

Folder stack by Christine Balderas @ iStockphoto

It has many names. It’s brought joy and relief to some people. It’s brought pain and agony to others. Some say it created order from chaos. Some say it restricted their freedom to express, to create and to change.

It is the functional spec. Also known as the design specification or technical specification, it is simply a document or cluster of documents detailing the workings of a software project.

I use the term “software project” instead of “software application” because the functional spec describes what interactions are available between people and computers (broadly speaking). It just so happens that an application (usually custom written) sits between them most of the time. Rather than further defining it in abstract ideas, let me show you 3 examples.

The spoon-feed

It was the first time I had anyone reporting to me. Well, not exactly reporting to me. I was supposed to write up a spec, define tasks in it, and hand it over to someone stationed in China. I would then liaise with that Chinese developer, doing code reviews, giving guidance and finally grading the fella.

Offshoring is all the rage, and despite the polite feedback about how difficult the situation was, I had to work with this arrangement. Let me list a few reasons why it was a little hard to get the project going smoothly.

  • They’re fresh graduates
  • They know zero about our (specific) business
  • They’re Chinese (let me explain a little further down before hammering me, ok?)
  • Long distance communication
  • Working times

From my observations and my own experience, I can tell you that it’s sometimes hard to move from the academic realm into the professional realm. There might be tons more code, with massive behemoths of libraries and documentation. Practicality sometimes wins over theory, creating seemingly incorrect code, compounded by comments few and far between.

They also know nothing about the existing software and business we support. Either we explain things in detail so they can understand the business requirements, or we cut out enough of the business stuff so only coding stuff is left. The latter is usually the case.

I’m Chinese. I’m fairly proficient with conversational Chinese. Yet I falter repeatedly trying to communicate with them, because I understand code in English. They understand code better in Chinese! Suddenly, explaining concepts and business logic became a lot harder, because I don’t know enough Chinese to translate smoothly, and they aren’t fluent enough in English to get it.

And they are far away. I’m in Singapore and they’re in China. Big deal that we’re in the same time zone; if you stand in the right spot, the North Pole and Antarctica are in the same time zone too. Some things are just better explained in person, with facial expressions, with gestures, with voice intonations. Video conferencing is an activity left for higher management. I’m left to settle for email, maybe a phone call, and live screen presentations in virtual meetings.

With the person sitting next to me, I can just point at the button on the screen and say “just move this to here”, pointing to a new position. Long distance, I have to do screen shots and I have to describe in exact detail where I want that button to go.

I don’t even know what time they start work, when they go for lunch, or what time they leave work. I don’t know whether they are in meetings with their (real) direct supervisors, or undergoing training. In part, maybe because of their very recent departure from the academic environment, they haven’t developed a willingness to complete the work, even if it means staying back a little. Of course, if the only transport out of their workplace leaves at a certain time, they’d have to go…

Ok, I nearly lost my train of thought about where I’m going with this. For all the reasons stated above, the functional specs I sent to them were very detailed and practically devoid of business logic. They don’t need to know, they don’t want to know, and quite frankly, it doesn’t matter whether they know or not.

My colleagues have done specs and assigned tasks to the offshore developers too. My impression was that my colleagues had been exceedingly detailed in their descriptions. The Word and Excel documents were so filled with pseudo code that all that’s really missing is a feature to compile an executable straight out of Microsoft Office itself.

Most of the ASP.NET tasks simply degraded into copy and paste operations on the part of the developers. A textbox was supposed to hold a database field of char(10). The least they could do was to actually validate that at most 10 characters were given, right? You mean I have to explicitly tell them that? If I had to spoon-feed everything to them, what would they learn?
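
The check they skipped really is this small (a JavaScript sketch; the function name is mine):

// Validate that a textbox value fits a char(10) database column.
function fitsCharColumn(sInput, iMaxLength) {
  return sInput.length <= iMaxLength;
}

// fitsCharColumn("ABCDEFGHIJK", 10) gives false, 11 characters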

The drowner

It happens in big teams. Someone’s got to do documentation, and they have the people to do it. The business people churn out all sorts of documents. The testers churn out all sorts of test scenarios and test data. And the programmers were then expected to churn out a matching amount of documents, preferably replete with screen shots, enough technical details to wow (but not woe) them, and lots and lots of calculations to show how their business logic turns imaginary scenarios into profits.

I knew of this person who documented exactly what a particular button would do, right down to which tables were used. I knew of another person who had to document the enterprise framework he created so other developers knew how to use his custom controls. It was downright stifling. My intuitive sense told me there was a better way to move the project along. The rules and documentation were bogging the project down.

In another project, the business users were just a bit too enthusiastic. Documents seemingly came into existence with each breath taken. They weren’t satisfied with a development environment, a test environment and a production environment. They needed five.

Test scenarios and data weren’t just necessary; they’d positively die without them. There were Excel spreadsheets with neat little tiny almost-imperceptible text describing the exact conditions under which a test was to be conducted, with frozen panes, columns to be checked (with a tick character, if you can find it), and a “remarks” column (always handy).

If you thought truth tables were bad, you are in for a rude awakening. Business logic frequently runs on multiple conditions. Say there’s a “number greater than 30 and item code is ABC”. What if the number is 25 and the item code is XYZ? Oh goodness gracious, I didn’t think of that! The number is implied to be positive? What happens if it’s negative? Better include that case.
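
To see how the combinations pile up, here’s a quick sketch enumerating test cases for that example condition (the condition is from the paragraph above; the code and values are my own illustration):

// Enumerate input combinations for
// "number greater than 30 and item code is ABC".
var aiNumbers = [31, 25, -5]; // above 30, below 30, negative
var asItemCodes = ["ABC", "XYZ"];

for (var i = 0; i < aiNumbers.length; i++) {
  for (var j = 0; j < asItemCodes.length; j++) {
    var bResult = (aiNumbers[i] > 30) && (asItemCodes[j] == "ABC");
    console.log(aiNumbers[i] + ", " + asItemCodes[j] + " => " + bResult);
  }
}
// Two conditions already give a grid of cases; every extra condition
// multiplies the number of rows in the testers’ spreadsheets.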

And they want a technically verbose version too. It doesn’t matter that they can’t quite understand it. They just want it, in case you’re gone and they have to pass it to the next programmer.

The gist

The “gist” works best in a small, tightly knit team with highly specialised skills and knowledge. It provides enough details for business people to reconcile it with their understanding. It provides enough clues for testers to work out test scenarios. And it provides just enough information for a programmer to start.

There’s enough information to start thinking about design, yet nothing confines you to any strict layout. Enough to run through coding logic based on business logic, discovering implications nobody had spotted before. Enough freedom to still change underlying code while presenting a consistent front-facing interface.

The gist also comes about because the team is small and tightly knit. There are simply not enough people and time to create a comprehensive document. We’re programmers, the intelligent people, right? So we fill in more blanks than other people. Hence, when done properly, we don’t need as many words in a document. Code can express our meanings much more effectively.

That said, every team member must be fairly knowledgeable in a broad range of topics. Since there are fewer people, each member’s knowledge bleeds into the others’. When documentation is scarce, the ability to fill in blanks becomes extremely important. Each member just happens to do a certain portion of the project, and do it well.

So what is it really?

I honestly can’t tell you that. Just write and use functional specifications in the best manner for your particular situation. I favour the flexibility and freedom to make code changes when required. An iron-clad documentation of rules will impede rapid progress.

Besides, the only way to have a completed application is to finish writing the application. The application might look and behave totally different from what was originally envisioned. Newton’s first law at work: mountains of paperwork have inertia, and are very hard to undo.

Why you need linguistic skills – part 3

Language in dictionary by Christian Grass @ iStockphoto

This is the last part of the series. You might want to read up on part 1 and part 2, because they were about programming related ideas. In this article, we’ll look at the non-programming areas where you need good linguistic skills.

Below, above and around you

When I was young, my idea of a programmer was the typical hacker type of look, either ingenious, obscenely obese, pizza-eating-grubby-fingers-typing, or skinny, pallid and anti-social (or any combination of them). Programmers were depicted as reclusive because of their obsession with computers, sometimes to the exclusion of communicating with other people.

In this current world, programmers need to talk to many types of people. Programmers talk to new staff, testers, managers, customers, sales people, marketing people and their peers. They talk to people of all levels, below them, above them and around them.

If you only know how to talk in code, the only people reasonably expected to understand you are your peers.

Chicken and Duck speak

There’s a Chinese saying “Ji Tong Ya Jiang”, which translates to “chicken talking with duck”. Both are of the fowl family, yet neither can understand each other. It’s used in situations where person A is talking to person B using words that person B understands, yet person B cannot understand the whole sentence when all the words are put together. Comprende?

For example, I have a friend who’s a programmer and works on the same floor as me. One day, I passed by his desk, and saw an SQL reference book in Chinese on the table. I picked it up and leafed through it. Individually, I could read the Chinese words. It’s when they’re strung together that I don’t know what they mean. It was terrible. Only when I looked at the SQL code did some of the sentences begin to make sense.

Ever had someone ask you why something failed due to “select access to such-n-such-table not granted”? Do you tell her the problem’s fixed and access granted? Or do you go into detail about how the database table was recreated and select access was not granted, causing the error? She might know what a database table is, but frankly she doesn’t really care.

Ever had someone ask you why an application is so slow? Do you tell her it’s a database problem and let it go at that? Or do you launch into an explanation on why a particular database table had a missing index, even going so far as to name the database table, and that you recreated the index, and that should speed up the retrieval SQL query, which in turn speeds up the application?

Use the appropriate words for the appropriate types of people you talk to. A manager can understand a slow server response answer. A technically knowledgeable tester can understand a slow database server response answer. A programmer can understand a slow database server response caused by an inefficient algorithm in an SQL stored procedure answer.

The only way you’re going to use the appropriate words is if you already know the appropriate words. Of all the people involved in a software project, you as a programmer are the most intelligent of the lot (I’m biased of course *smile*). Since you’re the most intelligent, it’s up to you to have a wider range of vocabulary so you can talk to anyone, and even translate for others.

Which brings us to writing documents understandable by everyone…

Put it in writing please

It doesn’t matter what you call them. They’re specification documents, including general business information, business logic, technical information and anything concerning the software project.

When you write a technical specification document for programmers, you translate business logic into near-pseudo code. You detail the database tables involved, perhaps a modification of existing table columns, perhaps a new database table. You might include a mockup screen shot, with some details on what that button should do, and what this datagrid should display. You note down validation checks based on what is reasonably expected from the business logic.

When you write user guides for users, you write in a different manner. Lots of pictures involved. Text in snappy sentences, preferably free of technical jargon. Validation checks explained as “between 1 and 10 inclusive” instead of “1 <= i && i <= 10”.

Do you write for the right people?

In closing

The importance of your linguistic skills must be emphasised. Today’s world creates situations where you need to talk to non-programmers, and where non-programmers need to talk to you. A phone call from a user. A face-to-face conversation with a tester. An email to a customer.

I find it difficult to understand how someone who can tell the difference between int and short cannot tell the difference between “lose” and “loose”. You’re an intelligent person. You should be above simple spelling and grammatical errors.

Non-programmers don’t communicate with you through your code. They communicate with you through your language. Be a better programmer. Have better linguistic skills.

Looking for flaws

Or checking for correctness? When you test your code, do you look for all the wrong things that can happen, or do you make sure it’s working properly first?

The weak link by James Steidl @ iStockphoto

Of course, you should be doing both. But you’ve got to do one or the other first. You can’t do both at the same time, because they’re completely different. One is where you don’t know what will happen, like rolling your head on the keyboard to come up with random input. The other is where you know exactly what has to happen, like a textbox accepting numeric input, and only between 10 and 23.

Correctness is defined here as “working according to what’s reasonably expected”. If you have specs, there you have it. Negatives can count as correct too. For example, “must not show information of other users” is a negative requirement. If the application indeed does not show other users’ information, then it’s working correctly according to the requirement.

The fastest way to complete a project is to code according to specs, test for correctness, then test for flaws. Because the specifications are known, there are well defined, finite areas of correctness, which provide a basis for deadline estimation.

Flaws on the other hand, are practically infinite. You can always find something to change for the better. When you test for flaws first, you will find tons of minor irritating aspects of the application that you can change. None of which will significantly bring you any closer to what’s required.

You’ll only stop because the deadline has mysteriously gotten closer, so you decide it’s better to stop nitpicking and actually start testing for the requirements.

Let’s take the example given before, a textbox accepting numeric input, and only between 10 and 23. Reading through the specs, you find out that it’s for hours of operation, as in anything between 10 am and 11 pm. So the value obtained from the textbox is used as an integer.

So you test by punching in numbers and making sure only the numbers 10 through 23 are accepted. Numbers outside the range as input will give an error. You also make sure letters and symbols give errors too. This satisfies the correctness requirement.

Then you test for flaws. Things like 12.0 or +19 might be accepted, but preferably shouldn’t be.
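
Here’s a sketch of the kind of strict check that catches those flaw cases (my own illustration; a plain parseInt would happily accept “12.0” and “+19”):

// Strict validation for the hours-of-operation textbox: an integer
// between 10 and 23, with no stray signs or decimal points.
function isValidHour(sInput) {
  if (!/^\d{1,2}$/.test(sInput)) return false; // digits only, no + or .
  var iHour = parseInt(sInput, 10);
  return (iHour >= 10) && (iHour <= 23);
}

// isValidHour("12") gives true
// isValidHour("12.0") gives false
// isValidHour("+19") gives false
// isValidHour("9") gives false, outside 10 to 23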

Alright, as a contrived example, the textbox input requirement seems rather lame. But I hope you got the point. Testing for correctness means you get a working application quickly. Then when you test for flaws, you’re just adding on to the correctness.

The thing is, some people must overcome whatever misgivings they have about your application before they even check how cool or how correct it is. These people are usually your users, or your managers, or whoever uses the application but doesn’t really give a rodent’s behind about your code.

I’ve met some of these people. And I understand their position and view point. They don’t quite like this colour, or they want that button somewhere else. I get it. There’s a reason why I understand this behaviour, and that I even expect it of them.

They already assume your application is working correctly. Hence they go nitpicking. They have a right to nitpick. In their minds, the application is working according to specifications, and they’re tuning it to better suit them.

Now, the authors of defect reports… I don’t understand them. Dedicated testers are supposed to give feedback on the application, covering both correctness and flaws. And correctness should be the higher priority. They should be the safety net, catching the uncommon cases that escaped the programmer but contribute to the correctness of the application.

Yet the testers I know seemingly focus exclusively on the flaws with zero regard for correctness. Once, my colleague had to come up with a suitable image for a trashcan, for use as a delete button in a datagrid. The original image was done by me, bluish in theme. My colleague even apologised to me for not using my image. It was fine. The image wasn’t artist-grade, but it served its purpose.

He had to do it, or the testers would refuse to test the intended web form at all. They didn’t want to test whether a new record could be correctly inserted into the database. They didn’t want to test whether they could update existing records correctly. They didn’t want to do anything on the web form, refusing to understand how it worked, until they got their trashcan image.

One image after another was sent. They didn’t like the look. They didn’t like the colour. They didn’t think it looked like a trashcan. Finally I suggested to my colleague: just ask them for an image instead.

They couldn’t. They didn’t like the images submitted to them, yet they couldn’t give us an image they approved of. After wasting many days on this trivial matter, my colleague came up with an image, digitally manipulated to fit what hopefully suited the testers’ perceived requirements.

It was ugly. It was dull red, like someone had a nosebleed and sneezed, and one of the blood drops happened to hit the floor in a vague squarish pattern. Then a digital photograph was taken of it, and the result had a slight yellowish-green cast to it. And because of the looming deadline, the testers finally went with it.

I am working hard with the users to bring the application to live status as soon as possible. The sooner it goes live, the sooner users get to benefit. The testers, on the other hand, seem bent on delaying the project. Their fear of causing calculation errors, their fear that users might find their approved interface design unfriendly, and maybe tens of other irrational fears have stopped them from growing. And their fears have essentially stopped the company from growing.

It’s about mindset. Do you expect greatness and code and test for correctness? Or do you fear and nitpick on minutiae?

Are you dealing fatal blows?

Computer issues by Clint Spencer @ iStockphoto

When you code, do you write half-heartedly? Do you solve a problem to the best of your abilities? Are you disciplined in your work ethic? When your programming abilities are called to the fore, can you deal fatal blows?

The cat and mouse game

I’ve been following this manga series about a kung fu artist, Chen Min. During the early days of this young man’s training and journeys, there was once where he was under the tutelage of a master living in the mountains. For one of his training lessons, he was given a simple task.

Close to where they lived, a gorgeous waterfall gushed clear water from high up. To the side of this waterfall, a steep cliff stood resolute. A stone panel with words engraved on it sat in the middle of this vertical, rock-strewn wall face. The only positional clue was a small branch jutting out near the panel.

The task? Find out what those words were, or don’t go back to the master.

The cliff was too steep to climb, with barely any handholds. The only way was to jump off the cliff into the water below, and read the panel during the free fall. Chen Min jumped down over and over, looked and squinted, but he couldn’t make out the words at all. Defeated, he sat down to rest.

On the grass near him sat a small mouse, sniffing at the air and nervously looking around. Then he noticed a cat crouching quietly nearby, still and motionless. The cat moved a paw forward. The mouse, catching the almost imperceptible movement, scampered off.

The cat gave chase, springing forth with alarming agility. The mouse zigzagged its way, feverishly trying to throw the cat off. Rodent and feline blurred into lines of white, slashing through the grass in lightning streaks.

The mouse, nearing its limits, zoomed into a pocket of thick vegetation. The cat jumped right in, ignoring everything in its path. Branches swiped its body. Leaves flapped in its face. But the cat held steady, its entire focus solely on its white four-legged meal. Its endurance pushed too far, the mouse slowed, and the cat snapped it into its jaws.

From the chase he witnessed, Chen Min finally found the element he lacked. An ability to focus so intently as to block out every other distraction. He went back to the cliff. Before he jumped, he concentrated on that small branch, willing every cell in his body to focus on it, and the stone panel beside it. Then he jumped once more, eyes fixated on the branch. Closing in on the branch, he shifted his eyes to the stone panel. The words were, “bright mirror still water”, literally translated from Chinese.

When water is still enough, the water surface acts like a mirror. Sounds deep, huh? To be still enough, one must be focused enough. Yet too much focus, and one’s mind gets disturbed, disrupting the stillness. I’m still trying to understand this…

Anyway, from the cat and mouse chase, it seems that one false move triggers an action. If one does not react fast enough, there might not be enough time. Which brings us to the next point.

There is never a good time

So, in another part of Chen Min’s journeys, he’s at a monastery learning from another master. One day, the master decided to spar with Chen Min. After a few jabs here and there, the master basically gave the “stop trying to hit me and hit me!” remark.

Chen Min gave his best, punching as fast as he could. He got in a few hits, and he was happy. I mean, he hit the master, right? The master just stood there, apparently unaffected by the attacks. Chen Min sprinted forth, followed the master’s movements to the right, gave a quick punch with his right hand, and hit the master squarely in the chest.

“When are you going to bring down this old geezer?” the master bellowed.

In an unexpected retaliation, the master sprang forward and jabbed lightning-quick into Chen Min’s stomach. And Chen Min howled in pain, rolling around clutching his stomach.

“In a real battle, you only get one chance!”

So skimming forward a little, Chen Min was following his master through the busy market streets. There was a crowd gathering somewhere and they went to take a look. An industrious vendor had placed a weasel and a snake in the same cage, promising a fierce battle and much entertainment.

The snake clearly had an advantage. One bite with its poisonous fangs and the weasel was done for. The snake slithered slowly around the cage, taunting its opponent.

It jabbed! The weasel dodged barely out of the way.

It lunged! The weasel swerved to the side just in time.

Presently, the weasel found itself backed up into a corner. The snake, sensing its victory, advanced cockily.

For a split second, the snake got distracted and tilted its head ever so slightly to the side. The weasel grabbed the opportunity and pounced towards the serpent. The snake recovered in the nick of time and bared its fangs in retaliation.

Just when the two combatants were almost head to head, the weasel recoiled its head, swivelled sideways and clamped its teeth onto the snake’s head from the side. Sensing its impending death, the snake wrapped itself around the weasel, seeking to loosen the weasel’s deadly hold, but it was too late. The weasel bit down hard, because any quarter given would mean its certain demise.

In the end, the weasel won, because it took its one opportunity at striking. There’s actually more from the manga artist reinforcing this, with (another) cat and mouse incident, and a swinging pendulum with a broken bottle attached. Maybe I’ll talk about it another time…

The lesson to take away is this: the right time to do anything is the best time to do it. That may not quite make sense, so let’s look at it this way. For the weasel, there was no chance to take the offensive. The only time it could was when the snake got distracted. The snake would probably not get distracted again. So the weasel really had just one chance. Just one. And that was the best time to attack.

Focus and timing

When you’re coding, you need to focus unwaveringly on your solution. Not only that, you need to hold that focus for an extended period of time, because you’re juggling many things in your head.

Obviously, a long project is going to sap your strength significantly. Your deadline estimation is also going to depend a lot on how much discipline you can muster to focus, to keep yourself in prime coding state. In a large team, you’ll be working with other people, management, rules and regulations. And in a small team, or if you’re a lone wolf programmer, the only opponent is yourself.

The timing is never going to be right to fix that bug, to refactor that piece of code, to do what you most need to do (yet can’t muster the energy to do so). You only get one chance. Code it right the very first time, because you might not get another chance to correct it.

If you’re fortunate enough to be graced with a second opportunity, you’ll have to weigh the benefits of using your time on code correction, or on something else with better value. Perhaps decisions can’t be made with complete information at hand. Make the best decision you can without going into design analysis paralysis, but code with confidence that you’ve done everything you can to make it not suck.

Programming is about solving problems. The problem has already made its first move. Are you dealing fatal blows back?

Why you need linguistic skills – part 2

Language in dictionary by Christian Grass @ iStockphoto

In a previous article, we explored the reasons for bad variable naming. In this article, we’ll look at why your basic spelling and grammar skills are important, and using your context-based linguistic skills in reading code.

Simple language

Frankly speaking, you just need basic spelling and grammar skills. I mean, it’s not like you’re asked to code stuff about the number of defenestrations or keeping track of myocardial infarctions. Unless, of course, you’re working on a medieval game where throwing people out of windows is fun or a medical application predicting the likelihood of heart attacks.

For the most part, you’re working with concepts such as user IDs, names, sums of something, billed amount, tax amount and so on. These concepts are likely to be commonplace. If not, they would be explained in a business document. The hard work of naming variables is already done. You just need to name those variables. How hard can sName or iSum be?

All you really need to take care of is spelling and simple grammar. Why spelling? It’s easier to read and it’s easier to find. Imagine you need to do some refactoring, and you have to replace all references to “Result” with “Case”. You think your find-and-replace is done. Then you find an instance of “Rslt” or worse, “Resutl”. The worst part is you don’t know if you’ve completely replaced every instance correctly, including those weird cases.

As for grammar, let’s look at the simple case of plurality. GetResult() suggests the return of a single record. GetResults() suggests the return of possibly more than one record, maybe a list or an array. An unimaginative programmer might name the second function GetResult2(), which tells us nothing whatsoever, other than it’s possibly another form of GetResult().

This reminds me of something baavgai from Dream In Code said about naming database tables and columns. If I remember correctly, he suggested using the singular form, for example, “customer” for the database table storing customer information. This way, the “customer_id” column makes sense, instead of “customers_id” in the “customers” table. Even if the computer doesn’t care either way, please use appropriate singular or plural forms, for the sake of other human beings.

If spelling errors and inappropriate forms in code are travesties, spelling errors and inappropriate forms in database tables and columns are abominations! The database is the backbone of everything you code. If a table name is changed, can you imagine how many applications are affected by the name change?

Back-end programs, console programs, web applications, stored procedures. Every single one has to be checked and corrected. It just takes too much effort, so the table name remains incorrectly spelled.

Let’s go back to spelling for a bit, in particular, short forms or abbreviations. Don’t use them. Spell the word in full. Not everyone can decipher decBillAmt as billed amount (take note of tenses too). If the shortened form is widely known, or reasonably well known within the context, then by all means use the short form.

There was this time where a senior level manager was reading through the documentation to find out more about the nuts and bolts of our application. He stopped at AcctNo. He asked, “What’s account no?” He managed to decipher “acct” as “account”, but couldn’t make head or tail of what “no” means. The negative of “yes” didn’t fit the context. We had to explain it stands for “account number” in full. Shortening words for the sake of short forms is not an excuse for illegibility.

There was also this instance where my colleague asked me what mstIptRec was. I thought for a bit, racking my brain for the business logic involved, scouring the surrounding code for clues and sifting through my vocabulary to fill in the missing letters. I came up with “master input record”. Make your fellow programmers’ lives easier; spell the word in full.

Filling in gaps

Nothing has meaning except the meaning you give it.

This brings us to the next point: understanding your fellow programmers’ code. You may be a skilled programmer, but the sad truth is, you’re going to read other people’s code, and let’s face it, some of it is terrible.

You are able to understand what “i can has cheezburger” means because you can rearrange words and fill in letters. Your problem with code is that, well, the compiler has already “approved” the other person’s code. You know, by successfully compiling it. The compiler has already understood perfectly what the code has to do. You’re the one who’s in trouble.

Since the syntax (grammar) is correct, and the spelling is correct (the compiler would have choked on “ant i;”), you have to look at the variable names, function names, class names and so on to decipher meanings. This is where your linguistic skills come into play. You need to fill in the context. This is crucial to your quick understanding of another person’s code, maybe even your own.

Compilers are incredibly unforgiving when it comes to syntax, yet unbelievably forgiving when it comes to context.

Remember, you’re the only one who knows if

int i = 2 + 3;

should be

int i = 2 * 3;