
Example 1

Find a recurrence relation and initial conditions for $1, 5, 17, 53, 161, 485, \ldots$

- Solution

Finding the recurrence relation would be easier if we had some context for the problem (like the Tower of Hanoi, for example). Alas, we have only the sequence. Remember, the recurrence relation tells you how to get from previous terms to future terms. What is going on here? We could look at the differences between terms: $4, 12, 36, 108, \ldots$ Notice that these are growing by a factor of 3. Is the original sequence as well? $1\cdot 3 = 3$, $5\cdot 3 = 15$, $17\cdot 3 = 51$, and so on. It appears that we always end up with 2 less than the next term. Aha!

So $a_n = 3a_{n-1} + 2$ is our recurrence relation, and the initial condition is $a_0 = 1$.
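As a quick sanity check, a few lines of Python (an illustrative sketch, not part of the original solution) confirm that the recurrence reproduces the given sequence:

```python
# Check that the recurrence a_n = 3*a_{n-1} + 2 with a_0 = 1
# reproduces the sequence 1, 5, 17, 53, 161, 485.
def gen(n_terms):
    a = 1  # initial condition a_0 = 1
    seq = [a]
    for _ in range(n_terms - 1):
        a = 3 * a + 2  # the recurrence a_n = 3*a_{n-1} + 2
        seq.append(a)
    return seq

print(gen(6))  # [1, 5, 17, 53, 161, 485]
```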
Recurrence Relations
A recurrence relation is an equation in which each term of the sequence is defined as a function of the preceding terms.
There are different ways of solving these relations; we are going to examine the following:
Solving recurrence relations by repeated derivation/substitution
The premise for this type of solution is to continually substitute the recurrence relation into its own right-hand side (i.e. substitute a value into the original equation and then derive a previous version of the equation). The successive derivations should lead to a generalized form of the equation that can be solved for a bound on the running time of the algorithm. For example, take the recurrence

$T(n) = T(n/2) + c, \qquad T(1) = 1$

The only substitution available at this point is to substitute $n/2$ for $n$ in the original equation:

$T(n/2) = T(n/4) + c$

Add $c$ to both sides to derive the original equation:

$T(n/2) + c = T(n/4) + 2c, \quad\text{i.e.}\quad T(n) = T(n/4) + 2c$

Now that another equation has been derived, there is a new substitution that can be made in the original: $n/4$ for $n$, which gives $T(n/4) = T(n/8) + c$ and hence $T(n) = T(n/8) + 3c$. In general, the basic equation is:

$T(n) = T(n/2^k) + kc$

Making the assumption that $n = 2^k$ (so $k = \log_2 n$) allows for:

$T(n) = T(1) + c\log_2 n = \Theta(\log n)$
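The derivation can be checked numerically. The sketch below assumes the example recurrence $T(n) = T(n/2) + c$ with $T(1) = 1$ and verifies the generalized form $T(2^k) = 1 + ck$:

```python
# Verify the repeated-substitution result for the (assumed) example
# T(n) = T(n/2) + c, T(1) = 1: for n = 2^k, T(n) = 1 + c*log2(n).
C = 3  # an arbitrary constant cost per substitution step

def T(n):
    """Evaluate the recurrence directly (n assumed a power of 2)."""
    if n == 1:
        return 1  # base case T(1) = 1
    return T(n // 2) + C

for k in range(1, 8):
    n = 2 ** k
    assert T(n) == 1 + C * k  # k = log2(n)
```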
Solving recurrence relations by telescoping
The premise for this type of solution is to substitute successive values for $n$, add up the resulting equations, and eliminate like terms. For example, for $T(n) = T(n-1) + n$ with $T(0) = 0$:

$T(1) = T(0) + 1$
$T(2) = T(1) + 2$
$\;\;\vdots$
$T(n) = T(n-1) + n$

Adding all of these equations, every intermediate $T(i)$ appears once on each side and cancels, leaving $T(n) = 1 + 2 + \cdots + n = n(n+1)/2$.
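As a check of the telescoped sum, the sketch below assumes the classic telescoping example $T(n) = T(n-1) + n$ with $T(0) = 0$ and compares it against the closed form $n(n+1)/2$:

```python
# Telescoping check for the (assumed) example T(n) = T(n-1) + n, T(0) = 0:
# adding the equations for 1..n cancels every intermediate T(i),
# leaving the triangular-number closed form n*(n+1)/2.
def T(n):
    return 0 if n == 0 else T(n - 1) + n

for n in range(50):
    assert T(n) == n * (n + 1) // 2
```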
The Master method
The Master method is applicable for solving recurrences of the form

$T(n) = a_1T(b_1n) + a_2T(b_2n) + f(n) \qquad (2)$

The solution of the recurrence given in (2) is

$T(n) = \Theta\left(n^p\left(1 + \int_1^n \frac{f(u)}{u^{p+1}}\,du\right)\right)$

where $p$ satisfies $a_1b_1^p + a_2b_2^p = 1$.

Example 1: Consider the recurrence $T(n) = 2T(n/4) + 3T(n/6) + \Theta(n\log n)$. For this recurrence, $a_1 = 2$, $b_1 = 1/4$, $a_2 = 3$, $b_2 = 1/6$, $f(n) = n\log n$. The value of $p$ can be calculated from $2\times(1/4)^p + 3\times(1/6)^p = 1$, which is satisfied by $p = 1$.

Forward and backward substitution

In the forward substitution method, we put $n = 0, 1, 2, \ldots$ into the recurrence relation until we see a pattern. In backward substitution, we do the opposite, i.e. we put $n = n, n-1, n-2, \ldots$ or $n = n, n/2, n/4, \ldots$ until we see the pattern. After we see the pattern, we make a guess for the running time and then verify the guess. Let us use this method in an example. Consider the recurrence relation $T(n) = 2T(n/2) + n$ with $T(1) = 1$. Given $T(n)$, we can calculate the value of $T(n/2)$ from the above recurrence relation as $T(n/2) = 2T(n/4) + n/2$.
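Forward substitution is easy to mechanize. The sketch below (assuming the example recurrence $T(n) = 2T(n/2) + n$ with $T(1) = 1$) tabulates the first few values for $n$ a power of 2 so the pattern can be spotted:

```python
# Forward substitution: tabulate the first few values of the (assumed)
# recurrence T(n) = 2T(n/2) + n, T(1) = 1, for n a power of 2.
def T(n):
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for k in range(5):
    n = 2 ** k
    print(n, T(n))  # the printed values follow n*log2(n) + n
```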
We rarely use the forward and backward substitution methods in practical cases; there are more sophisticated and faster methods. But they can be used as a last resort, when other methods are powerless to solve some kinds of recurrences.

Many algorithms are recursive in nature. When we analyze them, we get a recurrence relation for time complexity: we get the running time on an input of size $n$ as a function of $n$ and the running time on inputs of smaller sizes. For example, in Merge Sort, to sort a given array we divide it into two halves and recursively repeat the process for the two halves; finally, we merge the results. The time complexity of Merge Sort can thus be written as $T(n) = 2T(n/2) + cn$. There are many other recursive algorithms, like Binary Search, the Tower of Hanoi, etc.

There are mainly three ways of solving such recurrences.

1) Substitution Method: We make a guess for the solution, and then we use mathematical induction to prove the guess correct or incorrect.

2) Recurrence Tree Method: In this method, we draw a recurrence tree and calculate the time taken by every level of the tree. Finally, we sum the work done at all levels. To draw the recurrence tree, we start from the given recurrence and keep drawing until we find a pattern among the levels. The pattern is typically an arithmetic or geometric series.

3) Master Method: For recurrences of the form $T(n) = aT(n/b) + f(n)$, there are the following three cases:

1. If $f(n) = O(n^c)$ where $c < \log_b a$, then $T(n) = \Theta(n^{\log_b a})$.

2. If $f(n) = \Theta(n^c)$ where $c = \log_b a$, then $T(n) = \Theta(n^c\log n)$.

3. If $f(n) = \Omega(n^c)$ where $c > \log_b a$, then $T(n) = \Theta(f(n))$.

How does this work? In the recurrence tree method, we calculate the total work done. If the work done at the leaves is polynomially more, then the leaves are the dominant part, and our result becomes the work done at the leaves (Case 1). If the work done at the leaves and at the root is asymptotically the same, then our result becomes the height multiplied by the work done at any level (Case 2). If the work done at the root is asymptotically more, then our result becomes the work done at the root (Case 3).
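The substitution (guess-and-verify) method can be sketched numerically. Assuming the Merge Sort recurrence above with $T(1) = 1$, we guess $T(n) \le 2n\log_2 n$ for $n \ge 2$ and check the guess over a range of inputs; a real proof would replace the loop with induction:

```python
# Substitution method, sketched numerically for the Merge Sort
# recurrence T(n) = T(floor(n/2)) + T(ceil(n/2)) + n with T(1) = 1.
# Guess: T(n) <= 2*n*log2(n) for n >= 2, then check the guess.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return 1
    return T(n // 2) + T(n - n // 2) + n  # split into halves, merge in n

for n in range(2, 1025):
    assert T(n) <= 2 * n * math.log2(n)  # the guess holds on this range
```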
Examples of some standard algorithms whose time complexity can be evaluated using the Master Method:

Binary Search: $T(n) = T(n/2) + \Theta(1)$. It also falls in case 2, as $c$ is 0 and $\log_b a$ is also 0. So the solution is $\Theta(\log n)$.

Note: Case 2 can be extended to $f(n) = \Theta(n^c\log^k n)$.

The Recursion Tree Method is a way of solving recurrence relations. In this method, a recurrence relation is converted into a recursive tree. Each node represents the cost incurred at various levels of recursion. To find the total cost, the costs of all levels are summed up. Let us see how to solve these recurrence relations with the help of some examples.

Question 1: $T(n) = 2T(n/2) + c$

[Figure: recursion tree with the cost at each level]

Count the total number of levels by choosing the longest path from the root node to a leaf node. The size of the problem at the last level is $n/2^k$; at the last level the size of the problem becomes 1, so $n/2^k = 1$ and $k = \log_2(n)$. The total number of levels in the recursive tree is $k + 1 = \log_2(n) + 1$.

$T(n) = c + 2c + 4c + \cdots + 2^{k-1}c + \text{(last level cost)} = c(2^0 + 2^1 + \cdots + 2^{k-1}) + \Theta(n)$

The sum $2^0 + 2^1 + \cdots + 2^{k-1}$ is a geometric progression (G.P.) whose total is $2^k - 1 = n - 1 = O(n)$, so
$T(n) = c\cdot O(n) + \Theta(n)$. Thus, $T(n) = \Theta(n)$.

Question 2: $T(n) = T(n/10) + T(9n/10) + n$

[Figure: recursion tree with the cost at each level]

Count the total number of levels by choosing the longest path from the root node to a leaf node: $(9/10)^0n \to (9/10)^1n \to (9/10)^2n \to \cdots \to (9/10)^kn$. The size of the problem at the last level is $(9/10)^kn$; at the last level the size of the problem becomes 1, so $(9/10)^kn = 1$, i.e. $(9/10)^k = 1/n$ and $k = \log_{10/9}(n)$. Since the work done at every level is at most $n$, the total work is $T(n) = O(nk) = O(n\log n)$.

When the recurrence has the form $T(n) = aT(n-1) + f(n)$, the analysis is a little trickier, because as the arguments to the $f$'s drop, they are multiplied by more and more $a$'s. After some backward substitution it is not hard to recognize the pattern

$T(n) = a^nT(0) + \sum_{i=1}^{n} a^{n-i}f(i)$

Example: $T(n) = 2T(n-1) + n$. Then from the formula,

$T(n) = 2^nT(0) + \sum_{i=1}^{n} 2^{n-i}\,i$

This turns out to be a rather painful sum to solve exactly, but we can reasonably guess that it's somewhere between $2^n$ and $2^nn$, and try guess-but-verify to whittle the range down further.

Theorem: For a recurrence relation in the form $a_n = c_1a_{n-1} + c_2a_{n-2} + f(n)$, if $p_n$ is one particular solution, then every solution has the form $p_n + q_n$, where $q_n$ is a solution of the associated homogeneous relation $a_n = c_1a_{n-1} + c_2a_{n-2}$.

Proof: Suppose we have any solution $a_n$ to the original recurrence: $a_n = c_1a_{n-1} + c_2a_{n-2} + f(n)$. The particular solution also satisfies $p_n = c_1p_{n-1} + c_2p_{n-2} + f(n)$. Subtracting these, we get $a_n - p_n = c_1(a_{n-1} - p_{n-1}) + c_2(a_{n-2} - p_{n-2})$. So, any solution $a_n$ differs from $p_n$ by a solution of the homogeneous relation, as claimed.

The Master Theorem provides instant asymptotic solutions for many recurrences of the form $T(n) = aT(n/b) + f(n)$, and these solutions apply for all values of $n$ (not just powers of $b$). It is based on applying the analysis of the preceding section to various broad families of functions $f$, and then extending the results using a monotonicity argument to values of $n$ that are not powers of $b$. Here we sketch out the proof; see LevitinBook Appendix B for a more detailed argument. If $f(n) = 0$, then the recurrence is simply $T(n) = aT(n/b)$. This has solution $T(n) = n^{\log_b a}T(1) = \Theta(n^{\log_b a})$. (Example: $T(n) = 4T(n/2)$ has solution $\Theta(n^{\lg 4}) = \Theta(n^2)$.) We classify the different cases of the Master Theorem based on how $f(n)$ compares to this default solution. We assume that $T(1) = \Theta(1)$ throughout. Suppose that $f(x) = x^c$.
Then $a^if(n/b^i) = a^in^c/b^{ic} = n^c(a/b^c)^i$. The sum is then a geometric series with ratio $a/b^c$, and its behavior depends critically on whether $a/b^c$ is less than 1, equal to 1, or greater than 1.

If $a/b^c$ is less than 1, then $\sum_{i=0}^{\infty} n^c(a/b^c)^i = n^c/(1 - a/b^c) = O(n^c)$. This case arises when $\log(a/b^c) = \log a - c\log b$ is less than zero, which occurs precisely when $c > \log a/\log b = \log_b a$. So if $f(n) = n^c$, the $f(n)$ term in the sum dominates both the rest of the sum and the $n^{\log_b a}$ term, and we get $T(n) = \Theta(f(n))$. If $f(n)$ is $\Omega(n^c)$, but satisfies the additional technical requirement that $af(n/b) \le (1-\delta)f(n)$ for all $n$ and some fixed $\delta > 0$, then the geometric series argument still works with factor $(1-\delta)$, and we still get $T(n) = \Theta(f(n))$. This covers the case where $f(n) = \Omega(n^{\log_b a + \epsilon})$.

If $a/b^c$ is equal to 1, then every term in the sum is the same, and the total is $f(n)\log_b n$. In this case $c = \log_b a$, so $f(n) = n^{\log_b a}$ and $f(n)\log_b n$ dominates (barely) the $T(1)$ term. An extended version of this analysis shows that the solution is $T(n) = \Theta(f(n)\log n)$ when $f(n) = \Theta(n^{\log_b a})$.

Finally, if $a/b^c$ is greater than 1, we have a geometric series whose sum is proportional to its last term, which can be shown to be asymptotically smaller than the $T(1)$ term. This case gives $T(n) = \Theta(n^{\log_b a})$ for any $f(n) = O(n^{\log_b a - \epsilon})$.

These three cases do not cover all possibilities (consider $T(n) = 2T(n/2) + n\log n$), but they will handle most recurrences of this form you are likely to encounter.

As a worked example, consider the recurrence $A(n) = 4A(\lceil n/7\rceil) + n^2$ with $A(1) = 1$; backward substitution gets pretty close to the right answer. When $k = \lceil \ln(n)/\ln(7)\rceil$, $n \le 7^k$, so $A(\lceil n/7^k\rceil) = 1$. Putting this into the expanded equation,

$A(n) = 4^k + 4^{k-1}\lceil n/7^{k-1}\rceil^2 + \cdots + 4\lceil n/7\rceil^2 + n^2$

To get an upper bound, since $\lfloor x\rfloor \le x \le \lceil x\rceil < x + 1$,

$4^k < 4^{1 + \ln(n)/\ln(7)} = 4n^{\log_7 4}$

In this case, since $\log_7 4 < 2$, the series multiplying $n^2$ converges. Try to solve this with the 4 and 7 swapped, so that $A(n) = 7A(n/4) + n^2$.
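A quick numerical check of this analysis (assuming the recurrence $A(n) = 4A(\lceil n/7\rceil) + n^2$ with $A(1) = 1$): since the $n^2$ term dominates, the ratio $A(n)/n^2$ should stay close to the geometric series total $\sum_i (4/49)^i \approx 1.089$:

```python
# Numerical check: for A(n) = 4*A(ceil(n/7)) + n^2 with A(1) = 1,
# the n^2 term dominates (log_7(4) < 2), so A(n)/n^2 stays bounded,
# near sum((4/49)^i) = 49/45 ~ 1.089 for large n.
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def A(n):
    if n <= 1:
        return 1
    return 4 * A(math.ceil(n / 7)) + n * n

for n in (100, 1000, 10**4, 10**5):
    ratio = A(n) / n**2
    assert 1.0 <= ratio < 1.2  # bounded, as the analysis predicts
```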
$T(n) = aT(n/b) + f(n)$

where $a \ge 1$ and $b > 1$ are constants and $f(n)$ is an asymptotically positive function. Depending upon the values of $a$, $b$ and the function $f(n)$, the master method has the three cases discussed above. A more general result covers recurrences of the form

$T(n) = a_1T(b_1n) + a_2T(b_2n) + f(n)$

with each $a_i > 0$ and $0 < b_i < 1$. Its solution is

$T(n) = \Theta\left(n^p\left(1 + \int_1^n \frac{f(u)}{u^{p+1}}\,du\right)\right)$

provided that $p$ satisfies

$a_1b_1^p + a_2b_2^p = 1$

Example: Consider the recurrence
$T(n) = 2T(n/4) + 3T(n/6) + \Theta(n\log n)$
Here $a_1 = 2$, $b_1 = 1/4$, $a_2 = 3$, $b_2 = 1/6$, so $p$ must satisfy

$a_1b_1^p + a_2b_2^p = 2\times(1/4)^p + 3\times(1/6)^p = 1$
$p = 1$ satisfies the above equation. The solution is
$\begin{aligned}
T(n) & = \Theta\left(n\left(1 + \int_1^n \frac{u\log u}{u^2}\,du\right)\right)\\
&= \Theta\left(n\left(1 + \frac{\log^2 n}{2}\right)\right)\\
&= \Theta(n\log^2 n)
\end{aligned}$
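The value of $p$ can also be found numerically. The sketch below solves $2(1/4)^p + 3(1/6)^p = 1$ by simple bisection (the left-hand side is strictly decreasing in $p$), recovering $p = 1$:

```python
# Solve the characteristic condition a1*b1^p + a2*b2^p = 1 for the
# example T(n) = 2T(n/4) + 3T(n/6) + Theta(n log n) by bisection.
def g(p):
    # g is strictly decreasing; g(0) = 4 > 0 and g(5) < 0.
    return 2 * (1 / 4) ** p + 3 * (1 / 6) ** p - 1

lo, hi = 0.0, 5.0
for _ in range(100):
    mid = (lo + hi) / 2
    if g(mid) > 0:
        lo = mid
    else:
        hi = mid

p = (lo + hi) / 2
print(round(p, 6))  # 1.0, matching the hand calculation
```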
Back substitution method
$T(n) = \begin{cases} 2T(n/2) + n & \text{if } n > 1\\ 1 & \text{if } n = 1 \end{cases} \qquad (6)$
$T(n/2) = 2T\left(\frac{n}{4}\right) + \frac{n}{2}$
Now we back substitute the value of $T(n/2)$ into $T(n)$:
$T(n) = 2^2T\left(\frac{n}{2^2}\right) + 2n$
We proceed in a similar way
$\begin{aligned}
T(n) &= 2^3T\left(\frac{n}{2^3}\right) + 3n\\
&= 2^4T\left(\frac{n}{2^4}\right) + 4n\\
&\;\;\vdots\\
&= 2^kT\left(\frac{n}{2^k}\right) + kn
\end{aligned}$
Now we should use the boundary (base) condition, i.e. $T(1) = 1$. In order to use the boundary condition, the quantity inside $T(\cdot)$ must be 1, i.e.
$\frac{n}{2^k} = 1 \implies n = 2^k$

Taking $\log_2$ on both sides,

$k = \log_2 n$
The equation (6) becomes
$\begin{aligned}
T(n) &= 2^{\log_2 n}T(1) + n\log_2 n\\
& = nT(1) + n\log_2 n\\
& = n\log_2 n + n
\end{aligned}$
The correctness of the above running time can be proved using induction. Put $n = 2, 4, 8, 16, \ldots$ and you can easily verify that the guessed running time is actually correct.
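Following that suggestion, the sketch below evaluates the recurrence directly for powers of two and compares it with the closed form $n\log_2 n + n$:

```python
# Verify the back-substitution result T(n) = n*log2(n) + n for
# T(n) = 2T(n/2) + n, T(1) = 1, on powers of two.
def T(n):
    if n == 1:
        return 1  # boundary condition T(1) = 1
    return 2 * T(n // 2) + n

for k in range(11):
    n = 2 ** k
    assert T(n) == n * k + n  # k = log2(n)
```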
2.4: Solving Recurrence Relations
Master Method is a direct way to get the solution. The master method works only for the following type of recurrences, or for recurrences that can be transformed into this type:

$T(n) = aT(n/b) + f(n)$, where $a \ge 1$ and $b > 1$
1. If $f(n) = O(n^c)$ where $c < \log_b a$, then $T(n) = \Theta(n^{\log_b a})$
The master method is mainly derived from the recurrence tree method. If we draw the recurrence tree of $T(n) = aT(n/b) + f(n)$, we can see that the work done at the root is $f(n)$, and the work done at all the leaves is $\Theta(n^c)$ where $c = \log_b a$. The height of the recurrence tree is $\log_b n$.
Merge Sort: $T(n) = 2T(n/2) + \Theta(n)$. It falls in case 2, as $c$ is 1 and $\log_b a$ is also 1. So the solution is $\Theta(n\log n)$
1) It is not necessary that a recurrence of the form $T(n) = aT(n/b) + f(n)$ can be solved using the Master Theorem. The given three cases have some gaps between them. For example, the recurrence $T(n) = 2T(n/2) + n/\log n$ cannot be solved using the master method.
2) If $f(n) = \Theta(n^c\log^k n)$ for some constant $k \ge 0$ and $c = \log_b a$, then $T(n) = \Theta(n^c\log^{k+1} n)$
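The three cases can be packaged into a small helper for recurrences whose driving function is polynomial, $f(n) = \Theta(n^c)$; the function name and output format here are illustrative only:

```python
# Illustrative helper applying the three Master Theorem cases to
# T(n) = a*T(n/b) + Theta(n^c).
import math

def master(a, b, c):
    """Return the asymptotic solution of T(n) = a*T(n/b) + Theta(n^c)."""
    log_b_a = math.log(a, b)
    if math.isclose(c, log_b_a):
        # Case 2: work is spread evenly across the tree levels.
        return f"Theta(n^{c} log n)"
    if c < log_b_a:
        # Case 1: the leaves dominate.
        return f"Theta(n^{log_b_a})"
    # Case 3: the root dominates.
    return f"Theta(n^{c})"

print(master(2, 2, 1))  # Merge Sort: Theta(n^1 log n)
print(master(1, 2, 0))  # Binary Search: Theta(n^0 log n) = Theta(log n)
```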