Problem Statement
What is algorithm analysis? Explain time complexity and space complexity with examples.
Explanation
Algorithm analysis is the process of determining the computational complexity of algorithms, measuring how their time and space requirements grow as the input size increases. This helps us compare different algorithms and choose the most efficient one for a given problem.
Time complexity measures the amount of time an algorithm takes to complete as a function of input size, counted as the number of basic operations executed. We express it in Big O notation, which gives an upper bound on the growth rate and is most often quoted for the worst case. Common time complexities, from best to worst:
- O(1), constant time: execution time does not depend on input size.
- O(log n), logarithmic time: for example, binary search.
- O(n), linear time: for example, linear search.
- O(n log n): for example, merge sort, and quick sort on average.
- O(n^2), quadratic time: nested loops such as bubble sort.
- O(2^n), exponential time: for example, naive recursive Fibonacci.
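As a rough illustration, the small Python functions below grow at some of these rates; the function names are invented for this sketch and are not part of the problem itself. The comment on each notes the dominant term.

# Illustrative functions showing common growth rates.
# The input 'items' is a Python list of length n.

def get_first(items):
    # O(1): one operation regardless of n
    return items[0]

def contains(items, target):
    # O(n): may inspect every element once
    for x in items:
        if x == target:
            return True
    return False

def has_duplicate(items):
    # O(n^2): nested loops compare every pair of elements
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

def halvings(n):
    # O(log n): the value is halved until it reaches 1,
    # mirroring how binary search shrinks its search range
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count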
Space complexity measures the amount of memory an algorithm uses as a function of input size, including temporary variables and the recursive call stack; in practice the input itself is often excluded and only auxiliary space is counted. For example, O(1) space means a fixed amount of memory regardless of input size, O(n) space means memory grows linearly with the input, and O(n^2) space means memory grows quadratically.
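A minimal Python sketch of the difference, assuming we count only auxiliary space (the helper names are illustrative):

def running_total(items):
    # O(1) auxiliary space: a single accumulator,
    # nothing that grows with the input
    total = 0
    for x in items:
        total += x
    return total

def prefix_sums(items):
    # O(n) auxiliary space: builds a new list as long as the input
    sums = []
    total = 0
    for x in items:
        total += x
        sums.append(total)
    return sums

def pair_sums(items):
    # O(n^2) auxiliary space: one entry per pair of elements
    return [a + b for a in items for b in items]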
For example, linear search has time complexity O(n), because in the worst case it checks every element, and space complexity O(1), because it uses only a few variables. Binary search has time complexity O(log n), because it halves the search space at each step; its space complexity is O(1) for the iterative implementation and O(log n) for the recursive one due to the call stack. Merge sort has time complexity O(n log n) in all cases, because it recursively divides the array in half and merges in linear time, and space complexity O(n) for the temporary arrays used during merging. The Code Solution section below sketches these three examples in code.
Code Solution


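A possible Python sketch of the three examples from the explanation (linear search, iterative binary search, and merge sort), with the time and space complexity of each noted in comments. The function names and the small demo at the end are illustrative choices, not prescribed by the problem.

def linear_search(items, target):
    # Time O(n): worst case scans every element.
    # Space O(1): only loop variables.
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(items, target):
    # Requires a sorted list.
    # Time O(log n): the search range halves each iteration.
    # Space O(1): the iterative version keeps only two indices.
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def merge_sort(items):
    # Time O(n log n) in all cases: log n levels of splitting,
    # each level merging n elements in linear time.
    # Space O(n): temporary lists hold the merged output.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

if __name__ == "__main__":
    data = [7, 2, 9, 4, 1, 5]
    print(linear_search(data, 9))           # 2 (index in the unsorted list)
    print(binary_search(sorted(data), 9))   # 5 (index in the sorted list)
    print(merge_sort(data))                 # [1, 2, 4, 5, 7, 9]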