[Poster: Rise of the Planet of the Apes]

A few weeks ago, I watched Rise of the Planet of the Apes in a movie theatre. The movie showed a genetically engineered chimpanzee trying to solve a puzzle involving 4 discs, initially stacked in decreasing size on one of three pegs. The chimpanzee was supposed to transfer the entire stack to one of the other pegs, moving only one disc at a time and never moving a larger one onto a smaller one.

The problem was called Lucas' problem in the movie. I couldn't help remarking to my friend sitting next to me that the problem is better known as the Tower of Hanoi and that the minimum number of moves required to solve it is 2ⁿ − 1 if there are n discs involved. The math lecture was not very welcome to her in the middle of the movie.
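The 2ⁿ − 1 figure comes from the usual recurrence: to move n discs you must first move the top n − 1 discs to the spare peg, then move the largest disc, then move the n − 1 discs again, so the minimum move count satisfies T(n) = 2T(n − 1) + 1 with T(0) = 0. Here is a tiny Python check of that recurrence against the closed form (my own sketch, nothing from the movie):

    def min_moves(n):
        """Minimum number of moves for n discs, via the Tower of Hanoi recurrence."""
        return 0 if n == 0 else 2 * min_moves(n - 1) + 1

    for n in range(1, 11):
        assert min_moves(n) == 2**n - 1   # agrees with the closed form 2^n - 1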

During the intermission, she was curious to know why it was called Lucas' problem instead of the Tower of Hanoi in the movie. I think it's probably because the puzzle was invented by the French mathematician Édouard Lucas in 1883. There is another version of this problem, known as the Tower of Brahma, that involves 64 discs made of pure gold and three diamond needles. According to a legend, a group of Brahmin priests are working on the problem, and the world will end when the last move of the puzzle is completed. Now, even if they make one move every second, it'll take 2⁶⁴ − 1 = 18,446,744,073,709,551,615 seconds to complete the puzzle. That's about 584.54 billion years.
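Just to double-check that figure, here's a quick bit of Python arithmetic (my own, using a 365.25-day year):

    seconds = 2**64 - 1                          # one move per second, 2^64 - 1 moves in total
    seconds_per_year = 365.25 * 24 * 60 * 60     # about 31.56 million seconds
    print(seconds)                               # 18446744073709551615
    print(seconds / seconds_per_year / 1e9)      # roughly 584.5 billion years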

[Animation: solution of the 4-disc puzzle, by André Karwath]

Since the puzzle in the movie involved only 4 discs, it can be solved in 2⁴ − 1 = 15 moves, as illustrated in an animated solution by André Karwath that I found in the Wikipedia article for the puzzle.
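If you'd rather generate those 15 moves yourself than watch the animation, a small recursive sketch like this will print them (my own code, with the pegs simply labelled A, B and C):

    def solve(n, source, target, spare):
        """Print the moves that transfer n discs from source to target."""
        if n == 0:
            return
        solve(n - 1, source, spare, target)        # park the smaller discs on the spare peg
        print("move disc", n, "from", source, "to", target)
        solve(n - 1, spare, target, source)        # bring the smaller discs back on top

    solve(4, "A", "C", "B")                        # prints exactly 2**4 - 1 = 15 moves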

I'll not discuss the solution of this puzzle in detail in this blog post. There are plenty of articles on the web, including the Wikipedia article, that describe why it takes a minimum of 2ⁿ − 1 moves to complete the puzzle. In this post, I'll talk about an interesting result I discovered while playing with this puzzle one April afternoon this year.

An interesting thing to note is that if we use Tₙ to denote the minimum number of moves required to solve the Tower of Hanoi problem involving n discs, then Tₙ, when expressed in binary, is the largest possible n-bit integer. For example, T₄ = 15₁₀ = 1111₂. After all, the largest n-bit integer is 2ⁿ − 1. Now, let me describe the result I came across while playing with it by asking a question.
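You can see this quickly in Python (just a sanity check of my own):

    for n in range(1, 6):
        t = 2**n - 1                     # minimum number of moves for n discs
        print(n, bin(t))                 # e.g. for n = 4 this prints 0b1111
        assert t.bit_length() == n       # t is an n-bit integer, and the largest one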

This problem deals with arbitrary-precision integers. Some definitions:

  1. A positive integer N is called an n-bit integer if and only if the minimum number of bits required to express the integer is n, or in other words, ⌊log₂(N)⌋ + 1 = n.
  2. Tₙ represents the largest possible n-bit integer.
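These two definitions translate directly into Python (my own restatement, not part of the problem): the n in definition 1 is exactly what int.bit_length() returns, and Tₙ is just 2**n - 1.

    import math

    N = 1000
    n = N.bit_length()                         # minimum number of bits needed to express N
    assert n == math.floor(math.log2(N)) + 1   # matches definition 1 (n = 10 here)
    T_n = 2**n - 1                             # largest n-bit integer, i.e. definition 2
    assert T_n.bit_length() == n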

Some assumptions:

  1. Let the time complexity of adding or subtracting two integers be O(1).
  2. Let the time complexity of counting the number of 1-bits in an n-bit integer be O(n).

A positive integer n is given. What is the most efficient way, in terms of time complexity, to compute the number of 1-bits in T₁ + T₂ + … + Tₙ?

The naive approach would involve adding all the n integers and counting the number of 1-bits in the sum. The time required to add the n integers is O(n). The sum would require (n + 1) bits to be represented. Hence, it would take O(n) time to count the number of 1-bits in the sum. Since the sum is (n + 1) bits long, the naive approach requires O(n) memory as well. If n is as large as 2⁶⁴, over 2 exbibytes of memory would be required to store the sum.
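A sketch of the naive approach in Python (my own code; int.bit_count() needs Python 3.10 or later, and older versions can use bin(total).count("1") instead):

    def ones_in_sum_naive(n):
        """Add T1 + T2 + ... + Tn and count the 1-bits in the (n + 1)-bit sum."""
        total = 0
        for k in range(1, n + 1):
            total += 2**k - 1          # Tk, the largest k-bit integer
        return total.bit_count()       # popcount of an (n + 1)-bit integer

    print(ones_in_sum_naive(4))        # 1 + 3 + 7 + 15 = 26 = 11010 in binary, so 3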

There is a more efficient solution.

T₁ + … + Tₙ
 = (2¹ − 1) + (2² − 1) + … + (2ⁿ − 1)
 = 2ⁿ⁺¹ − 2 − n
 = [2ⁿ⁺¹ − 1] − (n + 1)

Now, 2ⁿ⁺¹ − 1 is a string of (n + 1) ones in binary, so it has (n + 1) 1-bits. Subtracting (n + 1) from it produces no borrows; it simply turns a 1 into a 0 for each 1-bit in (n + 1). So,

[number of 1-bits in T₁ + … + Tₙ]  =  [(n + 1)]  −  [number of 1-bits in (n + 1)].

So, the computation involves counting the number of 1-bits in n + 1, which takes O(log n) time, and subtracting that count from n + 1, which takes O(1) time. The problem is solved in O(log n) time, and it occupies only O(log n) memory, since the only integers we ever store are about the size of n. What required over 2 exbibytes of memory with the naive approach requires only a little over 8 bytes of memory with this approach.
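Here is the whole thing in a few lines of Python (my own sketch; again, bit_count() needs Python 3.10+), with a brute-force cross-check for small n:

    def ones_in_sum_fast(n):
        """Number of 1-bits in T1 + ... + Tn, computed as (n + 1) - popcount(n + 1)."""
        m = n + 1
        return m - m.bit_count()                 # one popcount and one subtraction

    # Cross-check against direct summation for small n.
    for n in range(1, 200):
        direct = sum(2**k - 1 for k in range(1, n + 1))
        assert ones_in_sum_fast(n) == bin(direct).count("1")

    print(ones_in_sum_fast(2**64))               # instant, even though the naive sum needs ~2 EiB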

2 comments

Anonymous said:

Wonderful! Reminds me of something Euler would think of, very neat and tidy.
