Recursion review #182


Merged 3 commits on Mar 27, 2025
8 changes: 4 additions & 4 deletions recursion/README.md
@@ -34,7 +34,7 @@ func fibonacci(n int) int {

When formulating recursive algorithms, it is essential to consider the following four rules of recursion:

1. It is imperative to establish a base case; otherwise, the program keeps recursing until it exhausts stack memory and terminates abruptly
2. The algorithm should progress toward the base case at each recursive call
3. Assume that recursive calls work correctly; there is no need to trace every recursive call and perform bookkeeping by hand
4. Use memoization, a technique that caches previously computed results to prevent redundant computation, to enhance the algorithm's efficiency
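The four rules can be sketched together in Go, extending the README's Fibonacci example with memoization (the map-based cache shown here is one common approach, not the only one):

```go
package main

import "fmt"

// fibonacciMemo returns the n-th Fibonacci number, caching each
// result in memo so every sub-problem is computed only once.
// Memoization turns the exponential naive recursion into O(n) time.
func fibonacciMemo(n int, memo map[int]int) int {
	if n <= 1 { // base case (rule 1)
		return n
	}
	if v, ok := memo[n]; ok { // reuse a cached result (rule 4)
		return v
	}
	// each call shrinks n, progressing toward the base case (rule 2)
	memo[n] = fibonacciMemo(n-1, memo) + fibonacciMemo(n-2, memo)
	return memo[n]
}

func main() {
	fmt.Println(fibonacciMemo(40, map[int]int{})) // 102334155
}
```

Rule 3 is implicit in the recursive step: each call simply trusts that `fibonacciMemo(n-1, …)` and `fibonacciMemo(n-2, …)` return correct values.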
@@ -45,9 +45,9 @@ Recursions are often inefficient in both time and space complexity. The number o

There are a few different ways of determining the time complexity of recursive algorithms:

1. Recurrence Relations: This approach involves defining a recurrence relation that expresses the algorithm's time complexity in terms of its sub-problems' time complexity. For example, for the recursive Fibonacci algorithm, the recurrence relation is T(n) = T(n-1) + T(n-2) + O(1), where T(n) represents the time complexity of the algorithm for an input of size n
2. Recursion Trees: This method involves drawing a tree to represent the algorithm's recursive calls. The algorithm's time complexity can be calculated by summing the work done at each level of the tree. For example, for the recursive factorial algorithm, each level of the tree represents a call to the function with a smaller input size, and the work done at each level is constant
3. Master Theorem: This approach is a formula for solving recurrence relations that have the form T(n) = aT(n/b) + f(n). The Master Theorem can be used to quickly determine the time complexity of some [Divide-and-conquer](../dnc) algorithms
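The recurrence-relation view can be made concrete by instrumenting the naive Fibonacci recursion (a sketch; the `calls` counter is added purely for illustration): the number of calls itself satisfies C(n) = C(n-1) + C(n-2) + 1, mirroring T(n) = T(n-1) + T(n-2) + O(1).

```go
package main

import "fmt"

// fibCount returns fib(n) while tallying every recursive call,
// so the call count makes the recurrence T(n) = T(n-1) + T(n-2) + O(1)
// visible as concrete numbers.
func fibCount(n int, calls *int) int {
	*calls++ // one unit of work per invocation
	if n <= 1 {
		return n
	}
	return fibCount(n-1, calls) + fibCount(n-2, calls)
}

func main() {
	for _, n := range []int{10, 20, 25} {
		calls := 0
		fibCount(n, &calls)
		fmt.Printf("n=%d calls=%d\n", n, calls)
	}
}
```

Running this shows the call count growing exponentially with n, which is exactly what solving the recurrence predicts.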

The space complexity of a recursive algorithm is driven by the call stack: each active recursive call stores its own copy of the parameters and local variables in a stack frame until it returns.
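A minimal sketch of this cost, using factorial (function names here are illustrative): the recursive version holds one stack frame per pending multiplication, O(n) space, while an equivalent loop uses O(1) extra space.

```go
package main

import "fmt"

// factorialRecursive keeps one stack frame per pending call,
// so computing factorialRecursive(n) needs O(n) stack space.
func factorialRecursive(n int) int {
	if n <= 1 {
		return 1
	}
	return n * factorialRecursive(n-1)
}

// factorialIterative performs the same computation in a loop,
// using O(1) extra space since no call frames accumulate.
func factorialIterative(n int) int {
	result := 1
	for i := 2; i <= n; i++ {
		result *= i
	}
	return result
}

func main() {
	fmt.Println(factorialRecursive(10), factorialIterative(10)) // both print 3628800
}
```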
