I’ve been nibbling through Douglas R. Hofstadter’s Pulitzer Prize-winning book, Gödel, Escher, Bach. In Chapter XVII he mentions the mathematician Srinivasa Ramanujan (1887-1920) who had a talent for extremely fast mathematical analysis. This class of mind (including so-called idiot savants) he called lightning calculators. In this passage he discusses the unlikelihood that such minds have access to resources or processes outside of general recursive functions:

One could probably make a nice plot showing how the time taken by a lightning calculator varies with the size of the numbers involved, and the operations involved, and from it deduce some features of the algorithms employed.

This got me thinking about my own mathematical algorithms. By that term I mean the series of atomic steps taken to produce a result such as the sum `14 + 38 = 52` or the quotient `39 ÷ 3 = 13`.

I vaguely remember being taught to add two-digit numbers in grade school. The process begins with the numbers stacked so that the ones and tens places are vertically aligned. A horizontal line is drawn to separate the stack from the result below. Starting with the right-most column, we sum each column and place the result below the stack. If the sum is two digits, place the “ones” numeral below the current column and the “tens” numeral above the next column to the left. (“Carry the one.”) Here is the final product, showing the work done:

```
  1   ← carry the one
  14
+ 38
----
  52
```
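For the curious, the carrying procedure above can be sketched in code. This is a minimal Python rendering of the grade-school method (the function name and digit-list representation are my own choices, not anything from the post):

```python
def add_right_to_left(a, b):
    """Grade-school addition: sum each column starting from the
    least significant, carrying into the next column to the left
    whenever a column sum exceeds one digit."""
    digits_a = [int(d) for d in str(a)][::-1]  # ones place first
    digits_b = [int(d) for d in str(b)][::-1]
    result = []
    carry = 0
    for i in range(max(len(digits_a), len(digits_b))):
        da = digits_a[i] if i < len(digits_a) else 0
        db = digits_b[i] if i < len(digits_b) else 0
        column = da + db + carry
        carry, ones = divmod(column, 10)  # "carry the one"
        result.append(ones)
    if carry:
        result.append(carry)
    return int("".join(map(str, result[::-1])))

print(add_right_to_left(14, 38))  # → 52
```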

That is the most basic algorithm in my mathematical toolbox. You probably have a similar algorithm that you use to add a column of numbers. Maybe you visualize the stack and proceed consciously through the steps, or perhaps you have trained your mind to produce mathematical results at a subconscious level. Hofstadter’s point was that everyone’s result must be produced, at some level, by a series of simple steps he called a general recursive function (Church-Turing Thesis).

Reading about this, I realized that my own addition algorithm proceeds not from right to left but from left to right. Whereas the standard method begins with the least significant digits, my method begins with the most significant digits. (Let’s leave the Freudian isomorphisms out of this discussion, interesting though they may be.) Here is the way I add numbers:

Stack the numbers as before. Sum the left-most column and write the result below. (Begin loop.) Sum the next column and if it exceeds one digit, increment the previous result and append the new result. (End loop.)
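The steps above can be sketched in Python as well. This is my reading of the description, under one assumption: “increment the previous result” has to propagate backward when the digit already written is a 9 (e.g. 99 + 1), which the description doesn’t spell out.

```python
def add_left_to_right(a, b):
    """Left-to-right addition: start at the most significant column;
    when a column sum overflows, go back and increment the digits
    already written, then append the ones digit."""
    width = max(len(str(a)), len(str(b)))  # pad so columns align
    da = [int(d) for d in str(a).zfill(width)]
    db = [int(d) for d in str(b).zfill(width)]
    result = []  # digits written so far, most significant first
    for x, y in zip(da, db):
        column = x + y
        if column > 9:  # overflow: increment what's already written
            i = len(result) - 1
            while i >= 0 and result[i] == 9:  # carry ripples back
                result[i] = 0
                i -= 1
            if i >= 0:
                result[i] += 1
            else:
                result.insert(0, 1)  # carry past the leftmost digit
            column -= 10
        result.append(column)
    return int("".join(map(str, result)))

print(add_left_to_right(14, 38))  # → 52
```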

I wonder whether my algorithm can produce results with fewer operations on average. My guess is that if the likelihood of column sums exceeding one digit is below a certain threshold, my method will be faster (completing in fewer operations). Perhaps one method is easier for minds having a specific learning preference, e.g., visual, auditory, or tactile.

Here’s homework for the curious: write a program that compares these algorithms in terms of number of operations to sum every possible set of two, three, and four numbers having two, three, and four digits. If my method is faster for some class of sums, such as those having an instantly recognizable feature like a low occurrence of digits greater than five, would the extra steps of recognizing such a class and selecting the most appropriate algorithm improve the overall speed of doing sums?
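Here is a scaled-down sketch of that homework: it compares the two methods over every pair of two-digit numbers (not the full two/three/four-number, two/three/four-digit sweep). The operation accounting is my own assumption: one operation per column sum, plus one per carry written or per digit incremented.

```python
from itertools import product

def ops_right_to_left(a, b):
    """Count operations for the standard method: one per column sum,
    one extra whenever a carry must be written."""
    ops, carry = 0, 0
    sa, sb = str(a)[::-1], str(b)[::-1]
    for i in range(max(len(sa), len(sb))):
        da = int(sa[i]) if i < len(sa) else 0
        db = int(sb[i]) if i < len(sb) else 0
        ops += 1  # column sum
        total = da + db + carry
        carry = total // 10
        if carry:
            ops += 1  # carrying the one
    return ops

def ops_left_to_right(a, b):
    """Count operations for the left-to-right method: one per column
    sum, one per digit touched when a later column overflows back
    into the result already written."""
    width = max(len(str(a)), len(str(b)))
    sa, sb = str(a).zfill(width), str(b).zfill(width)
    ops = 0
    result = []
    for x, y in zip(map(int, sa), map(int, sb)):
        ops += 1  # column sum
        col = x + y
        if col > 9:
            i = len(result) - 1
            while i >= 0 and result[i] == 9:  # carry ripples back
                result[i] = 0
                ops += 1
                i -= 1
            if i >= 0:
                result[i] += 1
            else:
                result.insert(0, 1)
            ops += 1  # the increment itself
            col -= 10
        result.append(col)
    return ops

pairs = list(product(range(10, 100), repeat=2))  # all two-digit pairs
avg_rtl = sum(ops_right_to_left(a, b) for a, b in pairs) / len(pairs)
avg_ltr = sum(ops_left_to_right(a, b) for a, b in pairs) / len(pairs)
print(f"right-to-left: {avg_rtl:.3f} ops, left-to-right: {avg_ltr:.3f} ops")
```

With a different (and equally defensible) definition of “operation,” the averages will differ, which is part of what makes the homework interesting.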

My hope is that somebody can produce objective proof that my summing algorithm is not always slower than the right-to-left method taught to me in school. If not, I might be afflicted with *mathematics disorder*—an actual diagnosis in the DSM-IV. Pfizer?

Dunno, but … like the folk who can find the day of the week for any date? (Coincidentally, the first algo we were asked to write in our APL course. I’m old … don’t ask.) Seems to me those are conceptually different from folk who can deal with transcendental functions. I mean … those have a grasp of math and can prolly explicate their reasoning (read: the dreaded “show your work” metric), while the former get the answers to problems that are more arithmetic than math and likely can’t explicate it at all.

Just a thot … I grokked lasers in grade 3 but flunked math in grade 10 cuz of quadratic equations. *shrug*

Even when I was little, while I was learning in school the aforementioned stacking method, I found myself adding or subtracting the columns on the left first. The only times it has seemed slower are when the numbers are in the millions or above and when three or more entries are involved. I don’t think it’s that unusual. I think it’s actually more natural. Indeed, I asked a couple of my friends, and they do it the same way.

I agree with Ben. I think the autistic savants (the preferred term for idiot savants, per the referenced DSM-IV and forthcoming DSM-V) are able to compute on a much more complex level. Straightforward addition and subtraction are not on this level. Simply notice that when you switch from addition to multi-digit multiplication you can no longer operate from left to right.

There are other interesting short-cuts I find myself performing with multiplication and division, such as separating the columns for one of the two numbers and then bringing the products back together. I’m sure others with average mathematical skills like mine perform similar tasks, but to the savants, such an operation is child’s play.

Rereading your post reminded me of Ilya Prigogine’s book on chaos theory … surprisingly readable. And that reminded me of 1985 with Martin Gardner’s “Recreations” column in SciAm … had me racing to my C=64 to cobble together imaginary math routines in assembler, making BIOS calls. Which brings me to my point: does anybody even /grok/ assembler anymore? Or *heaven forbid* ML?

Left-to-right addition is a well-known, tried-and-tested method; in fact, it is what our children are taught in school alongside right-to-left “traditional” addition.