Saturday, 22 March 2014

Complexity and Big-O Notation

This week we went through calculating the complexity of a given function and finding its Big-O notation, which is the standard approach to estimating the efficiency of an algorithm. Since computers are quite fast, efficiency only really matters once the number of items gets large, so Big-O notation describes what happens when an algorithm is given a great number of items. If you know calculus, you can think of this as taking the limit of a function as our input size, called n from now on, approaches infinity. We are essentially analyzing and comparing the growth rates of algorithms. A great example is the two implementations of the Fibonacci sequence I showed in a previous post: for the plain recursive solution, a single call to fibonacci(5) ends up invoking the function 15 times in total.
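To see where that count comes from, here is a minimal sketch of a naive recursive Fibonacci with a call counter bolted on (this is my own reconstruction, not the exact code from the earlier post, and it assumes the 0-indexed convention fib(0) = 0, fib(1) = 1):

```python
def fib(n, counter):
    """Naive recursive Fibonacci; counter[0] tracks total calls."""
    counter[0] += 1
    if n < 2:
        return n
    return fib(n - 1, counter) + fib(n - 2, counter)

calls = [0]
print(fib(5, calls))   # 5
print(calls[0])        # 15 total calls for fib(5)
```

Because each call above the base cases spawns two more, the call count itself grows like the Fibonacci numbers.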

As n gets pretty big, we can see that our recursive Fibonacci algorithm grows exponentially. Its complexity is O(((1+sqrt(5))/2)^n) to be precise, and that base is actually the golden ratio! The other implementation I showed in my previous post used memoization to store previously calculated values, so the number of calls drops to around 8, nearly cutting it in half!
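A memoized version might look like the sketch below (again my own reconstruction, with the same counter trick; the exact count you get, 8 or 9, depends on the indexing convention and on whether you count cache hits as calls, which is why it comes out as 9 here rather than 8):

```python
def fib_memo(n, memo, counter):
    """Memoized recursive Fibonacci; counter[0] tracks total calls."""
    counter[0] += 1
    if n in memo:
        return memo[n]              # cache hit: no further recursion
    if n < 2:
        memo[n] = n
    else:
        memo[n] = fib_memo(n - 1, memo, counter) + fib_memo(n - 2, memo, counter)
    return memo[n]

calls = [0]
print(fib_memo(5, {}, calls))  # 5
print(calls[0])                # 9 calls: 6 computed + 3 cache hits
```

Each value from 0 to n is now computed exactly once; every repeated subproblem is answered straight from the cache.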

As you can see from the illustration above, memoization gets rid of the redundant part of the problem: we only have to calculate and store the leftmost branch of our call tree. Hope you guys like the illustration I drew! Since computing the sequence now only requires that single branch, our complexity becomes just O(n). Much better, eh?
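That "walk down the leftmost branch once" idea can also be written bottom-up without any recursion at all; this iterative sketch (not from the original post, just an equivalent way to see the O(n) behavior) does one loop iteration per step of the branch:

```python
def fib_linear(n):
    """Bottom-up Fibonacci: one pass up the sequence, O(n) time, O(1) space."""
    a, b = 0, 1            # fib(0), fib(1)
    for _ in range(n):
        a, b = b, a + b    # slide the pair one step up the sequence
    return a

print([fib_linear(i) for i in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```

Same O(n) running time as the memoized version, but it also avoids the recursion depth and the cache entirely.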
