Categories
Fractals Graphics Math Programming Python

Visualizing Data Using the Hilbert Curve

Some time ago, a coworker asked for my help visualizing some data. He had a very long series (many millions) of data points, and he thought that plotting a pixel for each one would visualize it well.
I installed Python & PIL on his machine, and not too long after, he had the image plotted. The script looked something like:

from PIL import Image

data_points = get_data_points()
n = int((len(data_points) ** 0.5) + 0.5)  # side of a roughly square image

image = Image.new('1', (n, n))
for idx, pt in enumerate(data_points):
    # one pixel per data point, laid out row by row
    image.putpixel((idx // n, idx % n), pt)
image.save('bla.png', 'PNG')

Easy enough to do. Well, easy enough if you have enough memory to handle very large data sets. Luckily, we had just enough memory for this script and data series, and we were happy. The image was generated, and everything worked fine.

Still, we wanted to improve on that. One problem with this visualization is that two horizontally adjacent pixels don’t have anything to do with each other. Remembering xkcd’s “Map of the Internet”, I decided to use the Hilbert curve. I started with Wikipedia’s version of the code for the Python turtle and changed it to generate a string of instructions for where to put pixels. Along the way I improved the time complexity by changing it to make only two recursive calls instead of four. (It can probably be taken down to one, by the way; I leave that as a challenge to the reader. :)

Unfortunately, at this point we didn’t have enough memory to hold all of those instructions, so I changed it into a generator. Now it was too slow. I cached the lower levels of the recursion, and now it worked in reasonable time (about 3-5 minutes), with reasonable memory requirements (no OutOfMemory exceptions). Of course, I’m skipping over a few exceptions and some debugging along the way. Still, it was relatively straightforward.

Writing the generator wasn’t enough – there were still pixels to draw! It took a few more minutes to write a simple “turtle” that walks the generated Hilbert curve.
Now, we were ready:

import math

hilbert = Hilbert(int(math.log(len(data_points), 4) + 0.5))
for pt in data_points:
    x, y = next(hilbert)
    image.putpixel((x, y), pt)

A few minutes later, the image was generated.
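The Hilbert generator itself isn’t reproduced in the post. For reference, here is a minimal sketch of what such a generator could look like, using the standard distance-to-coordinates conversion for a curve of a given order (this is not the cached-recursion version described above, and the name hilbert_points is mine):

def hilbert_points(order):
    """Yield the (x, y) points along a Hilbert curve filling a 2**order x 2**order grid."""
    n = 2 ** order
    for d in range(n * n):
        x = y = 0
        t = d
        s = 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:
                # rotate the quadrant so the sub-curves connect
                if rx == 1:
                    x = s - 1 - x
                    y = s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        yield x, y

Each call to next() then hands the drawing loop the next point along the curve, just like the Hilbert object above.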

Categories
computer science Fractals Python

Fractals in 10 minutes No. 6: Turtle Snowflake

I didn’t write this one, but I found its simplicity and availability so compelling, I couldn’t just not write about it.
In a reddit post from a while ago, some commenter named jfedor left the following comment:

A little known fact is that you can do the following on any standard Python installation:

from turtle import *
 
def f(length, depth):
    if depth == 0:
        forward(length)
    else:
        f(length/3, depth-1)
        right(60)
        f(length/3, depth-1)
        left(120)
        f(length/3, depth-1)
        right(60)
        f(length/3, depth-1)
 
f(500, 4)

If you copy-paste it, it’s a fractal in less than a minute. If you type it yourself, it’s still less than ten. And it’s something you can show a kid. I really liked this one.

Categories
Fractals Programming Python

Fractals in 10 minutes No. 5: Sierpinski Chaos Game

A few years back, I took a course that also taught some Matlab. One of the exercises was to draw the Sierpinski triangle using a method of moving a point around randomly. I was quite surprised at the time, because I thought of the Sierpinski triangle as more of an analytical thing: you drew it using inverted triangles.

I wanted to check up on it, and it turns out the method has a name: the Chaos Game.
To generate a Sierpinski triangle using this method, one starts with some point inside the triangle. Then, at each step, the next point is halfway between the current point and one of the corners, selected at random. I think this method of generating the Sierpinski triangle is even easier than the analytical one.

A Sierpinski triangle generated using a chaos game

I used pylab (matplotlib) to create this image.
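The original script isn’t shown here, but a minimal sketch of the chaos game with pylab might look like this (the corner positions, point count, and names below are my own choices, not the original code):

import random
import numpy
from matplotlib import pyplot

# the three corners of the triangle (any non-degenerate triangle works)
corners = numpy.array([(0.0, 0.0), (1.0, 0.0), (0.5, 3 ** 0.5 / 2)])

point = numpy.array([0.3, 0.2])    # arbitrary starting point inside the triangle
xs, ys = [], []
for _ in range(50000):
    corner = corners[random.randrange(3)]
    point = (point + corner) / 2   # move halfway towards a randomly chosen corner
    xs.append(point[0])
    ys.append(point[1])

pyplot.plot(xs, ys, ',')   # ',' plots single-pixel markers
pyplot.show()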
As I usually do, I also wanted to draw it in ASCII art. However, I must confess, I am not satisfied with the result:

              %6
             6*%8
            %8  66
           %6%*6%%6
          6%'    '66
         6866'  '6%86
        %6* 86 '%8 *66
       %**%%'***%6***%%
      8'6            6*8
     6%%*8          %*'*6
    6%6'666        86# 86%
   %%*''**%6      %**''*'%6
  *%8'   '6%6    668'   '#%%
 %% '6' '#*'6%  *% %8' '6''6*
%68%#6#*#68*86*%6#%8%8'6%6*6%*

The code isn’t really good, as I didn’t put much thought into it and just hacked it up. Still, I’m putting it up, and I might improve it someday.

Categories
Fractals Programming Python

Fractals in 10 minutes No. 4: Mandelbrot and Julia in Ascii Art

I felt like it’s about time I tackled the Mandelbrot and Julia sets in ASCII art. Heck, the Mandelbrot fractal is on the logo of this blog! However, since it is the most well-known fractal, it has been tackled in ASCII art before, with satisfactory results.

Still, I took the ten minutes required to draw them in Python, using numpy:

def get_mandelbrot(x, num_iter=10):
    # x is a numpy array of complex points; each point doubles as its own c
    c = x
    for i in range(num_iter):
        x = x**2 + c
    return abs(x) > 2  # True wherever the point escaped

def get_julia(x, c, num_iter=10):
    # same iteration, but with a single fixed complex parameter c
    for i in range(num_iter):
        x = x**2 + c
    return abs(x) > 2

“Hey!” you might say, “there’s no loop over the individual points here!” Indeed, with numpy such loops can often be avoided when working with arrays. When the expression x**2 + c is applied to an array x, it is applied element-wise, giving an implicit loop. The actual magic happens in the following function:

import numpy

def get_board(bottom_left, top_right, num_x, num_y):
    # a num_y-by-num_x grid of complex numbers covering the given rectangle
    x0, y0 = bottom_left
    x1, y1 = top_right
    x_values = numpy.arange(x0, x1, (x1 - x0) / float(num_x))
    y_values = numpy.arange(y0, y1, (y1 - y0) / float(num_y))
    return numpy.array([[x + 1j*y for x in x_values] for y in y_values])

The result of get_board is used as the “x input” later on. It should be noted, though, that while this is a cute trick here, it can become unwieldy for more complicated computations, for example making each element reflect the iteration number on which it “escaped”.
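The rendering step itself lives in the linked code rather than in the post; a rough sketch of turning the boolean “escaped” array into ASCII could look like this (render and the board bounds below are my own choices):

def render(escaped):
    # '#' for points that escaped, ' ' for points still bounded after num_iter steps
    return '\n'.join(''.join('#' if esc else ' ' for esc in row) for row in escaped)

board = get_board((-2.0, -1.5), (1.0, 1.5), 60, 15)
with numpy.errstate(over='ignore', invalid='ignore'):   # escaped points overflow harmlessly
    print(render(get_mandelbrot(board)))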
So, here are the results:

############################   ##############
###########################   ###############
##########################     ##############
#####################                ########
####################                #########
############ #######                 ########
############                         ########
#########                           #########
#########                           #########
############                         ########
############ #######                 ########
####################                #########
#####################                ########
##########################     ##############
###########################   ###############

############################################################
############################################################
############################################################
############################ ###############################
######################### ##   #############################
############## #######           ###########################
######## ##      ###              ##########  ##############
#########  ###        ###         ######       #############
##############       ######         ###        ###  ########
###############  ##########              ###      ## #######
############################           ####### #############
##############################   ## ########################
################################ ###########################
############################################################
############################################################

Here’s the code. (With the usual BSD license slapped on top of it. :)

Categories
Algorithms computer science Fractals Math Programming Python

Fractal Memory Usage and a Big Number

In a previous post I said I’d talk about 4**(4**(4**4)).
First, about the number. I first saw it mentioned in a math lesson back in high-school, when the teacher wanted to demonstrate estimation abilities.
He wrote it on the board, like so:

4^(4^(4^4))

and then asked students to estimate how big that number is. Turns out this number is much bigger than most students thought. (For comparison, the number of atoms in the observable universe is 10^80.)

Given that this number is so big, it should come as no surprise that computing it will consume all available memory.
What is interesting is the way the memory is consumed:

(It died with a MemoryError just before the next “jump”.)

When I first saw this I was fascinated. For those interested, the pattern is generated by the long pow function in Python, implemented in longobject.c. According to the comment there, the algorithm is a left-to-right 5-ary exponentiation, taken from the “Handbook of Applied Cryptography”.
In short, the simpler algorithm, left-to-right binary exponentiation (used for smaller exponents), repeatedly squares the accumulator and, according to the binary digits of the exponent, multiplies it by the base. 5-ary exponentiation does basically the same thing, but in each iteration it ‘eats’ five bits of the exponent instead of just one.
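For illustration, here is a sketch of plain left-to-right binary exponentiation (not CPython’s implementation; the function name is mine):

def pow_left_to_right(base, exponent):
    # scan the exponent's bits from the most significant one downwards:
    # always square the accumulator, and multiply by the base when the bit is set
    result = 1
    for bit in bin(exponent)[2:]:
        result = result * result
        if bit == '1':
            result = result * base
    return result

assert pow_left_to_right(4, 4 ** 4) == 4 ** (4 ** 4)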

It might be very interesting to study algorithms and code according to their memory usage. Once at work I used a similar memory usage graph to optimize the memory usage of some algorithm that was eating too much memory. I don’t remember the details as it happened a few years ago, but I do remember that I recognized the “type of the memory eating”, which helped me to narrow the problem down to a specific area of code, where indeed the problem was located. (I consider this a case of programming-fu :)

Notes:
1) The reason I’m talking about Python is because it has arbitrarily sized longs built-in.
2) This graph was generated using Windows’ task manager. On my Ubuntu machine it looks quite different, almost a straight line. Any guesses why?

Categories
Fractals Graphics Programming Python

A second look at the dragon fractal

At first, when I drew the (twin) dragon fractal, I had a small bug: I used the base 1+i instead of 1-i. This also generated a very similar-looking beast. Thinking about that for a while made me curious. Just as with the Mandelbrot set, maybe other interesting fractals could be generated using exactly the same method, with a different complex ‘seed’.

So I patched up my code and made it output a series of images for complex numbers along a path. I thought the reasonable choice would be a spiral, so using t*(cos(t) + i*sin(t)) as a base I wrote a spiral that goes around the origin several times and finally lands on 1-i.
It might seem obvious to you to try and make this interactive instead – why not move the mouse around and watch the different fractals that emerge? Well, I wanted a video.
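The path-generating code isn’t shown in the post; one hypothetical way to sketch such a spiral of seeds (the name seed_spiral and its parameters are mine) is:

import cmath
import math

def seed_spiral(num_frames, turns=3):
    # complex 'seeds' along a t*(cos(t) + i*sin(t))-style spiral that winds
    # around the origin a few times and lands exactly on 1-1j at the last frame
    end = 1 - 1j
    end_r, end_theta = abs(end), cmath.phase(end)
    for k in range(num_frames):
        f = (k + 1) / num_frames                           # fraction of the path, 0 < f <= 1
        r = f * end_r                                      # radius grows linearly
        theta = end_theta - 2 * math.pi * turns * (1 - f)  # unwinds onto the final angle
        yield r * cmath.exp(1j * theta)

Each yielded value would then serve as the base for one frame of the video.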

I also wanted the fractal to convey a little more information. So each point in the set was given a color according to its generation. I decided after some trial and error that white was better for the older generation, and red for younger generations.
After some PIL and ffmpeg work, it was ready.

While working on the video, I witnessed some very interesting fractals with different seeds.
I was very curious as to why certain shapes emerged. For example, a square pattern was very common:
Squares fractal
This turned out to be not that surprising: its generator number is of the form 0+ia, so it made sense. I still haven’t figured out why hexagon shapes were so common:
Hexagons1 Hexagons2

Maybe it had something to do with pi/6; I’m not too sure about that. If it did, I would expect to see many more regular polygons.
Here is another curious one:
holes

Another interesting phenomenon was what I started to call ‘the dragon’s heart’. You see, in the final dragon, the starting generations were spread about pretty evenly. However, with other bases, even ones generating something which is pretty similar to the dragon, the starting generations are clustered together – sometimes at the side, sometimes at the middle.

I’ve got a feeling (not proven or anything) that to be space filling, a fractal’s starting generations should be spread about. What do you think?

Categories
Fractals Math Programming Python

Fractals in 10 minutes no. 3 – The Dragon

When I first looked through the pages of the book “Hacker’s Delight”, I found myself at the chapter about number bases. There I learned a very curious fact: with the digits 0 and 1 and the base -2, you can represent any integer. Right afterwards I learned something even more interesting: with the digits 0 and 1 and the base 1-i, you can represent any number of the form a+bi, where a and b are integers. Having no immediate use for this curious fact, I let the subject go.
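To illustrate the base -2 fact (this snippet is mine, not from the book or the post), here is one way to convert an integer to its 0/1 representation in base -2:

def to_base_minus_two(n):
    # repeatedly divide by -2, forcing each remainder to be 0 or 1
    if n == 0:
        return '0'
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:
            n += 1
            r += 2
        digits.append(str(r))
    return ''.join(reversed(digits))

For example, to_base_minus_two(7) gives '11011', and indeed 16 - 8 - 2 + 1 = 7.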
Some time later, I was reading through Knuth’s “Art of Computer Programming”, and found that with the base (1-i)^-1 and the digits 0,1 you can generate the dragon fractal!

The dragon fractal

Generating the fractal is quite simple actually:

def create_dragon_set(n):
    """calculate the dragon set, according to Knuth"""
    s = set([0.0+0.0j])
    for i in range(n):
        new_power = (1.0-1.0j)**(-i)
        s |= set(x+new_power for x in s)
    return s

(By the way, can you do it better?)

The annoying part is converting the complex numbers to drawable integer points. After doing so, I used PIL to draw the jpeg.
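That conversion lives in the linked code; a rough sketch of it (the function draw_complex_set and its parameters are my own, not the original) might be:

from PIL import Image

def draw_complex_set(points, size=512, filename='dragon.png'):
    # scale the complex points to integer pixel coordinates and plot them
    xs = [p.real for p in points]
    ys = [p.imag for p in points]
    min_x, min_y = min(xs), min(ys)
    scale = (size - 1) / max(max(xs) - min_x, max(ys) - min_y)

    image = Image.new('1', (size, size))
    for p in points:
        image.putpixel((int((p.real - min_x) * scale), int((p.imag - min_y) * scale)), 1)
    image.save(filename)

draw_complex_set(create_dragon_set(14))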
Here’s a link to the code.

Categories
Algorithms Fractals Programming Python

Optimizing the recursive spring

Remember the recursive spring? Well, when I worked on it, it was quite slow. I mean very slow. Especially at the higher levels.

The original reason was the recursion. Consider this – to ‘springify’ a track, you had to sample it twice, so that you could get the heading. Here is an illustration:

(lower track is ‘inner’ track)

Now it is clear – the number of recursive calls made across all levels grows exponentially. There is some hope, though: the recursive calls use points calculated previously. It seems that with good caching we should be able to cancel out the exponential growth and make it linear.

Actually, good caching did much more than that – I cached the last track as well. This meant that the second time around, drawing the track didn’t cost anything beyond some dictionary lookups. This yielded a pretty elegant result.
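The actual caching lives in the code linked below; the basic idea can be sketched as a simple memoizing wrapper around a track function (the name cached_track is mine, not from the original):

def cached_track(track):
    # remember every sample, so higher recursion levels reuse lower-level points
    cache = {}
    def wrapper(t):
        if t not in cache:
            cache[t] = track(t)
        return cache[t]
    return wrapper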

To those curious, here is the code. Before I started working on it I hacked together a small 3d explorer with PyOpenGL, so that I’d be able to play with the 3d models. This module is useful by itself as well, and someday soon I’ll dedicate a post to it. (Only after making it better. Right now I don’t consider it high-quality code.)

To run it, use tracks.py, and play with it a little. If you look at the code, there are many parameters to play with, yielding interesting results.

Categories
Fractals Graphics Programming Python

Fractals in 10 minutes No. 2 – Recursive Spring

Imagine a straight wire. Bend this wire until its ends meet. You get a ring. Next stage.
Take another straight wire and bend it as before, but this time, don’t let the ends meet; instead, continue bending the wire more and more, until you get a spring. Now, think of this spring as the first wire, and bend it until the spring’s ends meet. We get a spring-ring. (See pictures below.)

At this point it should be obvious – instead of making the ends of the spring meet, we can continue bending the spring into a meta-spring, and again make this meta-spring’s ends meet. Rinse and repeat, ad infinitum.

At first I had a hard time picturing it, and I thought it would be tough to write about it without drawing it. So I sat down with pen and paper and tried to draw it. That was even harder than picturing it. So I thought I could write a script to draw it, and that turned out to be pretty easy.

Here are the first four stages. The second and third are shown from the side. Click for bigger versions.

To implement this, I wrote a function that draws any track. A track is defined to be a function that takes a float argument between 0 and 1 and returns the appropriate 3d point in between, where 0 is one end of the track and 1 is the other. (Actually, I also decided that tracks should return a “right hand” vector as well.) Then I wrote a ring:

import math
import numpy

def ring(t):
    pi = math.pi
    cos = math.cos
    sin = math.sin
    # a circle of radius 2 in the xy plane; the second element is the "right hand" vector
    x = numpy.array((2 * cos(2*pi*t), 2 * sin(2*pi*t), 0))
    return (x, -x / 3)

Lastly, I wrote a wrapper function that takes any track and springifies it; it was called springify_track. At this point I could do this:
(The constants are the radius of the spring loops, the sampling delta, and the number of loops.)

t1 = ring
t2 = springify_track(t1, 0.25, 0.1, 50)
t3 = springify_track(t2, 0.06, 0.0011, 5000)
t4 = springify_track(t3, 0.011, 0.000012, 500000)
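springify_track itself isn’t shown in the post. Given the track contract described above (a function of t returning a point and a “right hand” vector), a rough sketch of such a wrapper, not the original implementation, could look like this:

import math
import numpy

def springify_track(track, radius, delta, num_loops):
    def spring(t):
        # sample the track twice to estimate the heading at t
        t2 = t + delta if t + delta <= 1.0 else t - delta
        p0, right = track(t)
        p1, _ = track(t2)
        heading = (p1 - p0) if t2 > t else (p0 - p1)
        heading = heading / numpy.linalg.norm(heading)
        right = right / numpy.linalg.norm(right)
        up = numpy.cross(heading, right)           # completes a local frame
        # wind num_loops coils of the given radius around the base track
        angle = 2 * math.pi * num_loops * t
        offset = radius * (math.cos(angle) * right + math.sin(angle) * up)
        return (p0 + offset, offset / numpy.linalg.norm(offset))
    return spring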
Categories
Algorithms Fractals Math

Riddle of the Week – LStrings

So I wrote about lstrings. I intend to write about them again in a short while – I already finished the basic script a few days ago, but I’m waiting until I’m satisfied with it.

In the meantime, here is a curious riddle (that I came up with):

Version 1 (not final)

Assume you are given some iteration of an lstring – S. How do you discover the original lstring used to create S?

Discussion

Well, those who are quick to answer will first try to discover the length of the original string – it can be computed directly from S. At this point you should come to the conclusion that without being given the exact number of iterations, the riddle is not that interesting… Iterating on an lstring is an associative operation. If we denote the substitution of a string A into a string B as multiplication, then when both are iterations of an original string T, we get:

A*B = T^n * T^k = T^(n+k) = T^k * T^n

This leads to the obvious answer to the original riddle – “Why, the lstring S is the original string, with zero iterations!” This answer is correct – and obviously useless :)

So, let us rephrase the riddle:

Version 2 (final)

Assume you are given some iteration of an lstring – S. How do you discover the minimum possible lstring that could be used to create S?

Discussion – But not a solution

This is the proper riddle – have at it! I will be glad to read your thoughts…

Note: the acceptable solution should be either an algorithm or a constructive mathematical proof. I don’t like reading only statements of existence :)