
Brain giblets

Published April 28, 2008
I was coding away earlier, innocent as can be, when I was suddenly struck by an errant thought. After scolding it for interrupting my blissfully thought-free coding session, I took a closer look at it.

Turns out it was a fairly deep and important truth. In fact, I nearly came over here to write a journal entry about it - it was that good.

Now, sadly, I'm afraid the tale takes a tragic turn. I didn't feel that I could actually write an entry about it. Beginners would probably appreciate it, intermediate-level coders would recognize it, and wise old sages would find fifty ways in which I'm utterly wrong and then commence beating me about the head with rubber balloons.

So I didn't write an entry, and took a brief nap instead.


Later it occurred to me that maybe a short entry is actually OK. I mean, I know that I normally write small books every time I wander over into journal-land, but maybe it's time for some variety.

I plan to start crapping out these little tidbits whenever they pop into my brain, for two reasons. First, it will help reinforce and clarify the concept in my own mind. Second, someone out on the vast intertron is bound to find them useful.

So without further ado and verbosity on my part, here it is.


Observation: To actually write good code, you must keep the entire project in your mind at once. Without this, you will invariably miss some tiny detail or nuance, and this introduces inconsistency. Inconsistency is a serious source of bugs.

The Cold Harsh Reality: Doing this is very hard. It takes years of practice to increase the amount of code and logic that your mind can keep track of. However, no amount of practice makes your capacity limitless; everyone hits an upper bound somewhere along the line.

Advice: There are three main strategies for coping with this need to understand large amounts of information:
  1. Get a bigger brain (more memory)

  2. Divide the problem into smaller chunks

  3. Work at higher levels of abstraction

Method 1 is severely limited, as we've already seen. Methods 2 and 3, as it turns out, are deeply related to each other.

"Divide and conquer" strategies work by creating isolated "black boxes" - the stuff we commonly refer to as "modules." Each module can work by itself given some specific input and output parameters; building the final program can be thought of as stitching together lots of these modules. In today's OO-dominated culture, this usually means a class or set of classes.
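As a concrete sketch of such a black box (this RunningAverage class is hypothetical, not from the original post), notice that callers interact only with the public interface; the bookkeeping inside is invisible to them:

```cpp
#include <cstddef>

// Hypothetical "black box" module: the only way in is Add(), the only
// way out is Mean(); the internal state is completely hidden.
class RunningAverage
{
public:
    RunningAverage() : m_total(0.0f), m_count(0) { }

    void Add(float value)
    {
        m_total += value;
        ++m_count;
    }

    float Mean() const
    {
        // Guard against dividing by zero before any values arrive.
        return m_count ? m_total / static_cast<float>(m_count) : 0.0f;
    }

private:
    float m_total;        // hidden running sum
    std::size_t m_count;  // hidden sample count
};
```

Because nothing outside the class can touch m_total or m_count, the implementation could be swapped (say, for a numerically stabler incremental mean) without any caller noticing, which is exactly what makes a module a black box.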

Once you have a set of black box classes, you can stitch those together to create a higher level of abstraction. For example, consider these two functionally similar programs:

// Program 1
#include <iostream>

int main()
{
   float average = (4.0f + 6.0f + 10.0f + 3.0f + 15.0f + 12.0f + 1.0f) / 7.0f;
   std::cout << average << std::endl;
}


// Program 2
#include <iostream>
#include <vector>

template <typename ValueType, typename Iterator>
ValueType average(Iterator begin, Iterator end)
{
   ValueType total = 0;
   size_t count = 0;
   for(Iterator iter = begin; iter != end; ++iter)
   {
      total += *iter;
      ++count;
   }
   return total / static_cast<ValueType>(count);
}

int main()
{
   for(unsigned i = 0; i < 50; ++i)
   {
      // Assume we have a function GetNumberSet that fits in here
      std::vector<float> numbers = GetNumberSet(i);
      std::cout << average<float>(numbers.begin(), numbers.end()) << std::endl;
   }
}


At first glance, the first program seems obviously superior: it's shorter, we can tell what it's doing immediately, and so on.

Now what happens if we need to crunch, say, 50,000 averages?

The second program is the clear winner in this case. We don't want the program to be hard-coded with 50,000 different sets of data - it would be impossible to maintain. Instead, we'd probably want program #2 with GetNumberSet coded up in such a way that it handled feeding the numbers into the average function for us.
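One hypothetical shape for that assumed GetNumberSet function might look like the following; the fabricated values are purely illustrative, and real code would more likely pull each data set from a file or database:

```cpp
#include <vector>

// Hypothetical implementation of the GetNumberSet function the post
// assumes. Data set i simply contains the five values i+1 .. i+5, so
// the caller's loop gets a fresh, deterministic set for each index.
std::vector<float> GetNumberSet(unsigned i)
{
    std::vector<float> numbers;
    for (unsigned j = 1; j <= 5; ++j)
        numbers.push_back(static_cast<float>(i + j));
    return numbers;
}
```

The point is that the data source itself becomes another black box: the averaging loop neither knows nor cares where the numbers come from.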

The second program is longer and more complex than the first, but it is also more powerful. This is because the second program works at a higher level of abstraction than the first. We've created a black box - average - and now we can use it freely.

Program #2 is superior when dealing with lots of numbers because the abstraction layer lets us think about fewer elements. If we used approach #1 to handle the data, we'd have 50,000 computations to keep track of - not to mention displaying the results to the user. With the second approach, we only really need to track three things: the source of the numbers themselves, the average function, and the output.
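To see the payoff of the black box directly, here is the templated average from Program 2 on its own; because it depends only on the iterator interface, the identical code serves any container of numbers, not just the vector shown above:

```cpp
#include <cstddef>
#include <list>
#include <vector>

// The average "black box" (as in Program 2): templated on the iterator
// type, so it works unchanged with vectors, lists, arrays, and so on.
template <typename ValueType, typename Iterator>
ValueType average(Iterator begin, Iterator end)
{
    ValueType total = 0;
    std::size_t count = 0;
    for (Iterator iter = begin; iter != end; ++iter)
    {
        total += *iter;
        ++count;
    }
    return total / static_cast<ValueType>(count);
}
```

Calling average<double>(mylist.begin(), mylist.end()) on a std::list<int> works just as well as the vector<float> case, which is the kind of reuse a hard-coded expression like Program 1 can never offer.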


Summary: Use whatever strategies you can to minimize the amount of detail you have to think about at one time. This will help make for more cleanly separated code, and as a bonus it will reduce defects due to oversight.

Bonus Question: Can you think of at least three ways in which program 2 is superior to program 1, besides the one already discussed?

Comments

Telastyn
I used to have a near visual memory of things. Spelling tests became very easy once I found I could recall a visual memory of a word. Nowhere near as good as my father (who would remember the line number that a particular piece of code was at so he could debug things over the phone)... but still; damned good.

These days, I promptly forget anything more than 3 days old. Oh, I still get flashes; mostly RPG maps or driving directions or what a particular sign said... but code? nah.

Oddly enough, this has helped me code a lot better.

1. I can't rely on my memory as the documentation of code. Better naming, more comments, less 'conciseness' evolved.

2. I follow patterns a lot more. I won't remember details, but given a problem I have a pretty good idea where/how I'd solve it.

So instead of keeping the whole project in my head, I follow implementation patterns so I can extrapolate how I would've solved it. It eliminates the need to remember the details, and cuts down on the 'exceptional' sort of data that really taxes your memory.
April 28, 2008 10:30 PM
Daerax
Agreed completely. I would append to three and say also, work with abstractions close to/meaningful to you.
April 29, 2008 07:51 AM