1. How do you analyze the time and space complexity of recursive algorithms? Explain with examples.
Analyzing the complexity of recursive algorithms requires understanding the recursion tree, recurrence relations, and the work done at each level. The approach differs from iterative algorithms because we must account for both the recursive calls and the call stack.

To analyze time complexity:

1. Identify the base case and its complexity, usually O(1).
2. Determine the number of recursive calls made at each level.
3. Calculate the work done in each call, excluding the recursive calls themselves.
4. Determine the depth of recursion, i.e., how many levels deep the recursion goes.
5. Combine these factors using recurrence relations or the recursion tree method.

The recursion tree method visualizes each function call as a node: the root is the initial call, its children are the recursive calls it makes, and each level represents one step of recursion depth. The total running time is the work summed across all nodes. For example, in a binary tree traversal each node is visited exactly once, giving O(n) time.

Recurrence relations express the running time as a function of the input size. For Merge Sort, T(n) = 2T(n/2) + O(n), where 2T(n/2) represents the two recursive calls on half-sized arrays and O(n) is the merge operation. Solving this with the Master Theorem or a recursion tree gives O(n log n).

Common patterns (sketched in code after this answer):

- Linear recursion with one recursive call (factorial): T(n) = T(n-1) + O(1), giving O(n).
- Binary recursion with two calls (naive Fibonacci): T(n) = T(n-1) + T(n-2) + O(1), giving O(2^n).
- Divide and conquer discarding half the input (Binary Search): T(n) = T(n/2) + O(1), giving O(log n).
- Two balanced branches with linear work per level (Merge Sort): T(n) = 2T(n/2) + O(n), giving O(n log n).

Space complexity analysis must consider both the call stack and auxiliary space. The call stack depth equals the maximum recursion depth, since each recursive call adds a frame to the stack: factorial has recursion depth n, giving O(n) stack space, while Binary Search has recursion depth log n, giving O(log n). Auxiliary space is the extra space used within function calls, excluding the stack; Merge Sort, for instance, uses O(n) auxiliary space for its temporary arrays.

Putting these principles together:

- Factorial: one recursive call, O(1) work per call, recursion depth n. Time O(n), space O(n) for the call stack.
- Naive Fibonacci: two recursive calls, O(1) work per call, recursion depth n, but the call tree has O(2^n) total nodes. Time O(2^n), space O(n) for the maximum stack depth.
- Binary Search: one recursive call, O(1) work per call, recursion depth log n. Time O(log n), space O(log n).
- Merge Sort: two recursive calls, O(n) merge work per level, recursion depth log n. Time O(n log n), space O(n) for auxiliary arrays plus O(log n) for the stack.

To optimize recursive algorithms: convert to iteration with an explicit stack or queue to avoid call-stack overhead; memoize the results of repeated recursive calls, as in dynamic programming; apply tail call optimization where the recursive call is the last operation, which allows some compilers to reduce stack space to O(1); and structure divide-and-conquer splits to minimize the work done at each level.
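The following minimal Python sketches illustrate the three recurrence patterns listed above; the function names are illustrative, and the comments restate the recurrences from the answer.

```python
def factorial(n):
    # T(n) = T(n-1) + O(1): one call per level, depth n -> O(n) time, O(n) stack.
    if n <= 1:
        return 1
    return n * factorial(n - 1)

def fib(n):
    # T(n) = T(n-1) + T(n-2) + O(1): the call tree has O(2^n) nodes -> O(2^n) time,
    # but the stack only ever holds one root-to-leaf path -> O(n) space.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

def binary_search(arr, target, lo=0, hi=None):
    # T(n) = T(n/2) + O(1): one call on half the range, depth log n
    # -> O(log n) time and O(log n) stack space.
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)
```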
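For the T(n) = 2T(n/2) + O(n) pattern, here is a sketch of Merge Sort; the comments map each part of the code to a term in the recurrence.

```python
def merge_sort(arr):
    # T(n) = 2T(n/2) + O(n): two half-sized calls plus a linear merge -> O(n log n) time.
    # Space: O(n) for the auxiliary merged lists plus O(log n) recursion depth.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # first T(n/2)
    right = merge_sort(arr[mid:])   # second T(n/2)
    # Linear merge of the two sorted halves: the O(n) term in the recurrence.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```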
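As a concrete example of the memoization optimization, caching the naive Fibonacci collapses the O(2^n) call tree to O(n) distinct subproblems. This sketch uses Python's functools.lru_cache; a plain dictionary works equally well.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # Each value of n is computed once and cached: O(n) distinct calls,
    # O(1) work each -> O(n) time. Space: O(n) cache plus O(n) stack depth.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```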
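Finally, a sketch of replacing the call stack with an explicit one: an iterative preorder traversal that visits each node once. The TreeNode class here is a hypothetical minimal node type, not a standard library class.

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder_iterative(root):
    # Same O(n) time as the recursive traversal, but an explicit list replaces
    # the call stack; its peak size is the tree height, so there is no risk of
    # call-stack overflow on deep trees.
    result, stack = [], [root] if root else []
    while stack:
        node = stack.pop()
        result.append(node.val)
        # Push right before left so the left subtree is processed first.
        if node.right:
            stack.append(node.right)
        if node.left:
            stack.append(node.left)
    return result
```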