Fractal Memory Usage and a Big Number

In a previous post I said I’d talk about 4**(4**(4**4)).
First, about the number. I first saw it mentioned in a math lesson back in high school, when the teacher wanted to demonstrate our estimation skills.
He wrote it on the board, like so:

4**(4**(4**4))

and then asked students to estimate how big that number is. Turns out this number is much bigger than most students thought. (For comparison, the number of atoms in the observable universe is about 10**80.)
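To get a feel for just how big it is, you can estimate the number of decimal digits without ever computing the number itself. This is a small sketch of that estimate (the variable names are mine):

```python
import math

# Number of decimal digits of 4**(4**(4**4)), estimated via logarithms:
# log10(4**(4**256)) = 4**256 * log10(4)
exponent = 4 ** (4 ** 4)            # 4**256, an exact Python int
digits = exponent * math.log10(4)   # roughly 8e153 decimal digits
print(f"{digits:.2e}")
```

So the number doesn't have on the order of 10**80 digits — it has on the order of 10**153 of them, which is why no amount of RAM will hold it.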

Given that this number is so big, it should come as no surprise that computing it will consume all available memory.
What is interesting is the way the memory is consumed:

[memory-usage graph: consumption climbing in a series of jumps]

(It died with a MemoryError just before the next “jump”.)

When I first saw this I was fascinated. For those interested, the curve comes from Python’s long pow function, implemented in longobject.c. According to the comment there, the algorithm is a left-to-right 5-ary exponentiation, taken from the “Handbook of Applied Cryptography”.
In short, the simpler “left-to-right binary exponentiation” algorithm (used for smaller exponents) repeatedly squares the accumulator and, according to the bits of the exponent, multiplies it by the base. 5-ary exponentiation does essentially the same thing, but each iteration ‘eats’ five bits of the exponent instead of just one.
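The binary variant can be sketched in a few lines. This is a simplified illustration of the idea, not the actual CPython code (which works on long digits and switches to 5-bit windows for large exponents):

```python
def pow_binary(base, exp):
    """Left-to-right binary exponentiation: for every bit of the
    exponent, square the accumulator; when the bit is set, also
    multiply in the base."""
    result = 1
    for bit in bin(exp)[2:]:        # exponent bits, most significant first
        result = result * result    # square
        if bit == "1":
            result = result * base  # multiply
    return result

assert pow_binary(3, 13) == 3 ** 13
```

The repeated squaring roughly doubles the size of the accumulator at each step, which is what produces the jumps in the memory graph.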

It might be very interesting to study algorithms and code according to their memory usage. Once at work I used a similar memory-usage graph to optimize an algorithm that was eating too much memory. I don’t remember the details, as it happened a few years ago, but I do remember recognizing the “type of the memory eating”, which helped me narrow the problem down to a specific area of the code, where indeed the problem was located. (I consider this a case of programming-fu :)

Notes:
1) The reason I’m talking about Python is that it has arbitrary-precision longs built in.
2) This graph was generated using Windows’ task manager. On my Ubuntu machine it looks quite different, almost a straight line. Any guesses why?


One Response to Fractal Memory Usage and a Big Number

  1. Interesting post**post.

    About Ubuntu memory consumption: on Windows, if a process asks for 2 GB of RAM, it is either allocated immediately (“sure, there you are”, Windows says) or refused (“sorry, I don’t have enough memory right now”). Linux instead always lies (“haha, ok”, it replies) and doesn’t actually allocate the amount of memory a process asks for.

    The rationale behind this is that 99% of programs are badly written (which is sadly true), so they never actually USE the amount of memory they ask for. So, as a program starts to use the memory it asked for, it gets allocated by the OS — hence the straight line you discovered.

    This is good most of the time, because unused memory becomes better-used memory (disk cache, for example). The bad side is what happens if a program actually USES the 2 GB it asked for while memory is insufficient: then Linux really will be an evil guy and will kill some “random” process (the OOM killer).

    You can read a funny analogy in this LWN article:
    http://lwn.net/Articles/104179/
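The overcommit behavior the commenter describes can be observed with a short sketch (Linux-specific, and dependent on the system’s overcommit settings; `mmap` is in the Python standard library):

```python
import mmap

# Ask for 256 MiB of anonymous memory.  Under Linux overcommit this
# mostly reserves address space ("haha, ok"); resident memory stays
# near zero until the pages are actually written.
SIZE = 1 << 28
m = mmap.mmap(-1, SIZE)

# Touching each page forces the kernel to really allocate it, one
# page at a time -- which is why the Ubuntu graph looks like a
# near-straight line instead of jumps.
for offset in range(0, SIZE, 4096):
    m[offset] = 1

last_byte = m[SIZE - 4096]   # read back the last page we touched
m.close()
```

Watching RSS in `top` while this runs shows it growing smoothly as the loop progresses, rather than at allocation time.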