
# Multiplication Algorithm and Recurrences

Multiplication of two numbers is a well-defined computational problem. It is such a basic operation that we rarely give a thought to the algorithms behind it. In this post we shall look at various implementations of the multiplication algorithm. We will only be looking at the basic algorithms; for the complete set, refer here. This will also serve as an introduction to analyzing recurrence relations and to finding an upper bound on the running time of a recursive algorithm.

Let us multiply two numbers X and Y using the basic algorithm that we all studied in our younger days.

567 X 236

---------------

3402 + 17010 + 113400

---------------

133812

---------------

Here each digit in Y is multiplied against each digit in X, and the partial products are shifted accordingly and summed. Let us try to frame pseudocode for the above algorithm.
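A minimal Python sketch of this grade-school method (the helper name length() follows the text below; everything else is my own naming, and non-negative integers are assumed):

```python
def length(num):
    # number of digits in num, e.g. length(567) -> 3
    return len(str(num))

def multiply(x, y):
    # grade-school multiplication: every digit of y is multiplied
    # against every digit of x, shifted by the combined place value,
    # and the partial products are summed
    x_digits, y_digits = str(x), str(y)
    result = 0
    for i in range(length(y)):       # digits of y, least significant first
        for j in range(length(x)):   # digits of x, least significant first
            dy = int(y_digits[length(y) - 1 - i])
            dx = int(x_digits[length(x) - 1 - j])
            result += dx * dy * 10 ** (i + j)
    return result
```

The two nested loops over the digits are exactly what gives the quadratic running time discussed next.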

In the above code block, we assume that there is a function, length(), that returns the number of digits in a number. For example, for the number 567, it returns 3. The running time of the above algorithm is easy to see. Since each digit of one number is multiplied against each digit of the other, we need two nested loops to do the work. Hence the upper bound on the running time of the above algorithm is O(n^2) (n-squared), where n is the number of digits.

Let us try to do better than the n-squared algorithm we have seen above, using a recursive algorithm to multiply two numbers. The basic idea of the algorithm is to break the input down into smaller sub-problems, solve them, and then combine the solutions.

The working of the algorithm is as follows:
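A Python sketch of this divide-and-conquer scheme (the splitting helper and all names are my own; non-negative integers are assumed):

```python
def split_number(num, m):
    # split num around 10^m into high and low parts,
    # e.g. split_number(1234, 2) -> (12, 34)
    return num // 10 ** m, num % 10 ** m

def recursive_multiply(x, y):
    # base case: single-digit numbers are multiplied directly
    if x < 10 or y < 10:
        return x * y
    n = max(len(str(x)), len(str(y)))
    m = n // 2
    a, b = split_number(x, m)   # x = a * 10^m + b
    c, d = split_number(y, m)   # y = c * 10^m + d
    # four recursive multiplications
    ac = recursive_multiply(a, c)
    ad = recursive_multiply(a, d)
    bc = recursive_multiply(b, c)
    bd = recursive_multiply(b, d)
    # combine: x*y = ac*10^(2m) + (ad + bc)*10^m + bd
    return ac * 10 ** (2 * m) + (ad + bc) * 10 ** m + bd
```

The combine step follows directly from expanding (a * 10^m + b)(c * 10^m + d).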

In the algorithm above, we divide both numbers into two halves of n/2 digits each and then make four recursive calls to multiply them (each of which may make further recursive calls), combining the results to get the final product. The base case is that when we get single-digit numbers, we return their product directly.

Now let us try to come up with an upper bound on the running time of the recursive algorithm above. The recurrence for the algorithm is of the form,

T(n) = 4T(n/2) + f(n)

Here, there are four sub-problems, and the size of each sub-problem is half of the original problem. f(n) represents the work done outside the recursive calls, namely the additions and shifts needed to combine the sub-results, and is hence O(n). (These are loose bounds, and we assume that multiplying two single-digit numbers is an elementary operation.)

The solution of the recurrence can easily be found using the master theorem. But let us analyze it using the recursion tree method to understand it intuitively.

The recursion tree for the above recurrence can be shown as below,

In the recursion tree above, each call except the leaves makes four further recursive calls. To come up with an upper bound on the running time, we need to calculate the cost at each level and add the costs up along the height of the recursion tree. To aid us in our calculation, let us assume that n is a power of 2.

The number of sub-problems at level j is equal to 4^j. The size of each sub-problem at level j is equal to n/(2^j), so the cost accrued at level j of the recursion tree is equal to (4^j) * c * (n/(2^j)), which simplifies to (2^j) * cn, the cost of one level of the tree. The height of the tree is log n to the base 2 (up to a constant, which does not affect the analysis); this is so because the sub-problem size is halved at each level. So let us sum the cost at each level over the entire tree. Hence,

Summing the cost of each level over the entire recursion tree gives us an upper bound on the running time of the algorithm:

T(n) <= cn + 2cn + (2^2)cn + ... + (2^(log n))cn (the base of the logarithmic function above is 2)

This is a geometric series, which sums to

T(n) <= cn * (2^(log n + 1) - 1) <= 2cn * 2^(log n) = 2cn * n^(log 2)

Since log 2 (to the base 2) is 1, we get

T(n) = O(n * n) = O(n^2)

So we have used a more complex algorithm to multiply two numbers without any improvement in the running time. The running time of the recursive algorithm is the same as that of the "naive" algorithm we saw earlier. Let us see if we can improve upon it. Note that the recursive algorithm makes four recursive multiplications at each step. Can we reduce the number of recursive calls, thereby reducing the running time of the algorithm? There is a clever algorithm called the Karatsuba algorithm which does exactly this. It makes use of a trick due to Gauss to get away with three recursive calls instead of four. The working of the algorithm is as follows:
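The idea can be sketched as follows (writing each n-digit number as x = 10^(n/2) a + b and y = 10^(n/2) c + d, as in the divide-and-conquer scheme above):

```latex
x \cdot y = 10^{n}\,ac + 10^{n/2}\,(ad + bc) + bd,
\qquad
ad + bc = (a + b)(c + d) - ac - bd
```

The second identity is the Gauss trick: the middle term can be recovered from just three products, ac, bd, and (a + b)(c + d), instead of the four products the naive recursion computes.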

The recurrence of the above algorithm is given as,

T(n) = 3T(n/2) + f(n)

Here, there are three sub-problems, and the size of each sub-problem is half of the original problem. As before, f(n) represents the work done outside the recursive calls (the additions, subtractions, and shifts needed to combine the sub-results), and is hence O(n).

The recursion tree for the above recurrence can be shown as below,

The number of sub-problems at level j is equal to 3^j. The size of each sub-problem at level j is equal to n/(2^j), so the cost accrued at level j of the recursion tree is equal to (3^j) * c * (n/(2^j)), which simplifies to ((3/2)^j) * cn, the cost of one level of the tree. As before, the height of the tree is log n to the base 2, since the sub-problem size is halved at each level. So let us sum the cost at each level over the entire tree. Hence,

Again, summing the cost of each level over the entire recursion tree:

T(n) <= cn + (3/2)cn + ((3/2)^2)cn + ... + ((3/2)^(log n))cn (the base of the logarithmic function is 2)

This is a geometric series, and it is dominated by its last term:

T(n) = O(cn * (3/2)^(log n))

Making use of the logarithmic identity (3/2)^(log n) = n^(log (3/2)) = n^(log 3 - log 2) = n^(log 3 - 1), this simplifies to

T(n) = O(cn * n^(log 3 - 1)) = O(n^(log 3))

We have actually reduced the running time from n-squared to n^(log 3), which is approximately n^1.585, a slower-growing function than n-squared, as shown in the figure below.

The code to do the multiplication is given in the gist below.
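The original gist is not reproduced here; the following is a minimal Python sketch along the lines described, keeping the getOrderOfNumber helper name mentioned in the text (everything else is my own naming, and non-negative integers are assumed):

```python
def getOrderOfNumber(num):
    # returns the number of digits in num, e.g. getOrderOfNumber(567) -> 3
    return len(str(num))

def karatsuba(x, y):
    # base case: single-digit numbers are multiplied directly
    if x < 10 or y < 10:
        return x * y
    n = max(getOrderOfNumber(x), getOrderOfNumber(y))
    m = n // 2
    a, b = divmod(x, 10 ** m)   # x = a * 10^m + b
    c, d = divmod(y, 10 ** m)   # y = c * 10^m + d
    # only three recursive multiplications
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # Gauss trick: (a + b)(c + d) - ac - bd == ad + bc
    middle = karatsuba(a + b, c + d) - ac - bd
    return ac * 10 ** (2 * m) + middle * 10 ** m + bd
```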

The function getOrderOfNumber() returns the number of digits in a number. The code makes the three recursive calls we described earlier and returns the result.

Note: We can confirm the upper bound we calculated using the recursion tree with the (easier) master theorem. The details of the process are given below.

The first step in using the master theorem is to calculate the number of leaves in the recursion tree. The number of leaves is given by,
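Completing this step (for a recurrence of the form T(n) = a T(n/b) + f(n), the recursion tree has a^(log_b n) leaves; here a = 3 and b = 2):

```latex
a^{\log_b n} = 3^{\log_2 n} = n^{\log_2 3} \approx n^{1.585}
```

This agrees with the n^(log 3) bound we obtained from the recursion tree.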