// Recursion - Greg Hogg DSA Course Materials Lecture 6
// Fibonacci
#include <iostream>

// Fibonacci Sequence Calculation - O(2^n) time, O(n) space due to recursion
int fibonacci(int n) {
    if (n == 0) return 0;
    if (n == 1) return 1;
    return fibonacci(n - 1) + fibonacci(n - 2);
}

int main() {
    int n = 12;
    std::cout << "Fibonacci of " << n << " is: " << fibonacci(n) << std::endl;
    return 0;
}
// Reverse printing of Linked List
#include <iostream>
#include <string>

// Singly Linked List Node Definition
class SinglyNode {
public:
    int val;
    SinglyNode* next;
    SinglyNode(int val) : val(val), next(nullptr) {}
    std::string toString() const {
        return std::to_string(val);
    }
};

// Create the Linked List 1 -> 3 -> 4 -> 7
SinglyNode* createLinkedList() {
    SinglyNode* head = new SinglyNode(1);
    head->next = new SinglyNode(3);
    head->next->next = new SinglyNode(4);
    head->next->next->next = new SinglyNode(7);
    return head;
}

// Reverse Print Linked List - O(n) time, O(n) space due to recursion
void reversePrint(SinglyNode* node) {
    if (node == nullptr) return;
    reversePrint(node->next);
    std::cout << "Node value: " << node->val << std::endl;
}

int main() {
    SinglyNode* head = createLinkedList();
    std::cout << "Reversed List:" << std::endl;
    reversePrint(head);
    return 0;
}
// Recursion - Greg Hogg DSA Course Materials Lecture 6
public class Main {
    public static void main(String[] args) {
        int n = 12;
        System.out.println("Fibonacci of " + n + " is: " + fibonacci(n));
    }

    // Fibonacci Sequence Calculation - O(2^n) time, O(n) space
    public static int fibonacci(int n) {
        if (n == 0) return 0;
        if (n == 1) return 1;
        return fibonacci(n - 1) + fibonacci(n - 2);
    }
}
public class Main {
    public static void main(String[] args) {
        SinglyNode head = createLinkedList();
        System.out.println("Reversed List:");
        reversePrint(head);
    }

    // Singly Linked List Node Definition
    static class SinglyNode {
        int val;
        SinglyNode next;

        SinglyNode(int val) {
            this.val = val;
            this.next = null;
        }

        public String toString() {
            return Integer.toString(val);
        }
    }

    // Create a Linked List
    public static SinglyNode createLinkedList() {
        SinglyNode head = new SinglyNode(1);
        head.next = new SinglyNode(3);
        head.next.next = new SinglyNode(4);
        head.next.next.next = new SinglyNode(7);
        return head;
    }

    // Reverse Print Linked List - O(n) time, O(n) space due to recursion
    public static void reversePrint(SinglyNode node) {
        if (node == null) return;
        reversePrint(node.next);
        System.out.println("Node value: " + node);
    }
}
Recursion is a fundamental technique where a solution is built by repeatedly applying the same logic to smaller versions of a problem: a task is reduced step by step, and the outcome of a simpler version of the problem is used to solve the original one. It is especially effective in scenarios involving branching, repetitive breakdown, or hierarchical structures, such as tree traversal or divide-and-conquer strategies. The key is defining when the process should stop and how it should progress.
Every recursive process requires two parts: a base case that signals when to stop, and a recursive case that continues the breakdown. The function keeps calling itself with modified input until it reaches the stopping condition, then works its way back to combine results.
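For instance, a factorial function makes the two parts easy to see. The sketch below is illustrative only and is not part of the lecture code above:

// Illustrative sketch: base case and recursive case in a factorial function.
public class FactorialExample {
    // n! = n * (n - 1) * ... * 1, with 0! defined as 1
    public static long factorial(int n) {
        if (n == 0) return 1;            // base case: stop recursing
        return n * factorial(n - 1);     // recursive case: same problem on smaller input
    }

    public static void main(String[] args) {
        System.out.println("5! = " + factorial(5)); // prints 120
    }
}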
Recursion is ideal for problems with natural hierarchical structure or repetitive branching, such as traversing trees, processing linked lists (like the reverse-print example above), and divide-and-conquer algorithms.
Beginners often forget to define the base case or write one that never triggers. This leads to infinite recursion and program crashes. Understanding the flow of control and drawing the call stack can help debug and refine your logic.
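As a small, hypothetical illustration of that pitfall, the countdown below has a base case that is skipped entirely when the function is called with a negative number; widening the check from n == 0 to n <= 0 fixes it:

// Hypothetical example of a base case that never triggers.
public class BaseCasePitfall {
    // Broken: countdownBroken(-1) recurses until the call stack overflows,
    // because n == 0 is never hit on the way down.
    public static void countdownBroken(int n) {
        if (n == 0) return;
        System.out.println(n);
        countdownBroken(n - 1);
    }

    // Fixed: the base case now covers every input that should stop the recursion.
    public static void countdownFixed(int n) {
        if (n <= 0) return;
        System.out.println(n);
        countdownFixed(n - 1);
    }

    public static void main(String[] args) {
        countdownFixed(3);       // prints 3, 2, 1
        // countdownBroken(-1);  // would crash with a StackOverflowError
    }
}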
Recursion encourages thinking in terms of patterns and structure rather than procedures. Once mastered, it enables you to write elegant, efficient solutions to complex problems — especially those involving repetition, branching, or hierarchical data.
Recursion is a fundamental algorithmic technique where a function solves a problem by calling itself with a subset or modified version of the original input. It is a powerful tool especially suited to problems that can be broken down into smaller subproblems of the same type.
A recursive function is defined in terms of itself. This means the function includes a call to itself within its body. However, without constraints, this would result in an infinite loop. That's why every recursive function must define one or more base cases — situations where the recursion ends and no further self-calls are made.
Every recursive function has two essential components: a base case, which stops the recursion and returns a result directly, and a recursive case, which reduces the problem and calls the function again on the smaller input.
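To see both components in one place, here is a sketch of a recursive sum over a singly linked list, assuming the same kind of node class used in the lecture code above:

// Sketch: base case and recursive case in a recursive list sum.
public class SumListExample {
    static class SinglyNode {
        int val;
        SinglyNode next;
        SinglyNode(int val) { this.val = val; }
    }

    public static int sum(SinglyNode node) {
        if (node == null) return 0;        // base case: an empty list sums to 0
        return node.val + sum(node.next);  // recursive case: head value + sum of the rest
    }

    public static void main(String[] args) {
        SinglyNode head = new SinglyNode(1);
        head.next = new SinglyNode(3);
        head.next.next = new SinglyNode(4);
        head.next.next.next = new SinglyNode(7);
        System.out.println("Sum: " + sum(head)); // prints 15
    }
}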
When a recursive function calls itself, each invocation is added to a call stack, a structure that keeps track of active function calls. Each layer on the stack is called a stack frame, and it holds the function's parameters, its local variables, and the return address that records where execution should resume.
As the function keeps calling itself, new frames are pushed onto the stack. This continues until a base case is encountered. Then, the function starts returning values, and each frame is popped off the stack in reverse order (Last In, First Out), passing control back to the previous call.
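One way to watch frames being pushed and popped is to print a message before and after each recursive call. This trace is an illustrative sketch, not part of the lecture code:

// Illustrative sketch: tracing stack frames as they are pushed and popped.
public class CallStackTrace {
    public static void trace(int n) {
        System.out.println("push frame for n = " + n);
        if (n > 0) {
            trace(n - 1);  // a deeper frame is pushed here
        }
        System.out.println("pop frame for n = " + n);  // printed while unwinding
    }

    public static void main(String[] args) {
        trace(3);
        // Pushes print for n = 3, 2, 1, 0, then pops print for n = 0, 1, 2, 3 (LIFO).
    }
}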
Imagine climbing down a ladder, each rung representing a new recursive call. The base case is the ground. Once you reach it, you climb back up, one rung (or function call) at a time. The recursive call stack ensures that each step is remembered so that the correct sequence of returns occurs.
Since each recursive call adds a new frame to the call stack, the space complexity of a recursive algorithm is directly tied to the maximum depth of recursion. If a recursive function calls itself N times before reaching the base case, the stack will grow to O(N) frames, resulting in O(N) space complexity.
Efficient recursion relies on controlling this depth, often by minimizing repeated work or by converting deep recursion into iterative processes when stack space is a concern.
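For example, the exponential recursive Fibonacci shown earlier can be rewritten iteratively so that it keeps only two running variables instead of O(n) stack frames. This is a sketch of one such conversion, not the lecture's own code:

// Sketch: iterative Fibonacci - O(n) time, O(1) extra space, no recursion depth.
public class FibonacciIterative {
    public static int fibonacci(int n) {
        if (n == 0) return 0;
        int prev = 0, curr = 1;
        for (int i = 2; i <= n; i++) {
            int next = prev + curr;  // same recurrence, applied bottom-up
            prev = curr;
            curr = next;
        }
        return curr;
    }

    public static void main(String[] args) {
        System.out.println("Fibonacci of 12 is: " + fibonacci(12)); // prints 144
    }
}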
Recursion is especially well-suited to problems with a naturally recursive structure. Some common examples include traversing trees and other nested data, walking linked lists (as in the reverse-print example above), and divide-and-conquer algorithms.
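As one illustration, an in-order traversal of a binary tree follows the same base-case/recursive-case pattern; the small tree class below is hypothetical and not part of the lecture code:

// Hypothetical sketch: recursive in-order traversal of a small binary tree.
public class TreeTraversalExample {
    static class TreeNode {
        int val;
        TreeNode left, right;
        TreeNode(int val) { this.val = val; }
    }

    public static void inorder(TreeNode node) {
        if (node == null) return;  // base case: empty subtree
        inorder(node.left);        // recurse into the left subtree
        System.out.println(node.val);
        inorder(node.right);       // recurse into the right subtree
    }

    public static void main(String[] args) {
        TreeNode root = new TreeNode(2);
        root.left = new TreeNode(1);
        root.right = new TreeNode(3);
        inorder(root);  // prints 1, 2, 3
    }
}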
Recursion simplifies many complex problems by breaking them into smaller instances of the same problem. Understanding the role of base cases, recursive steps, and the call stack is essential to using recursion effectively and avoiding common pitfalls like infinite recursion or excessive space usage.