Big O notation is one of the most fundamental tools for analyzing the cost of an algorithm.

Big O notation is most often used to represent the time complexity of an algorithm: the computational complexity that describes how an algorithm's running time grows as a function of its input size.

Big O describes an asymptotic upper bound, which is commonly used to characterize an algorithm's worst-case runtime.
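For example, linear search runs in O(n) time: in the worst case the target is absent and every one of the n elements must be checked. Here is a minimal sketch in Python (the function name `linear_search` is just illustrative):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent.

    Worst case: target is not in items, so all n elements are
    checked, giving O(n) time.
    """
    for i, value in enumerate(items):  # up to n iterations
        if value == target:
            return i
    return -1

print(linear_search([3, 1, 4, 1, 5], 4))  # 2
print(linear_search([3, 1, 4, 1, 5], 9))  # -1 (worst case: n comparisons)
```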

Asymptotic analysis describes how an algorithm's performance changes as the input size grows.
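This is why only the dominant term matters: constant factors and lower-order terms become negligible for large inputs. For instance, an algorithm that performs 3n² + 5n + 2 operations is simply O(n²):

```latex
f(n) = 3n^2 + 5n + 2 \implies f(n) = O(n^2),
\quad \text{since } 3n^2 + 5n + 2 \le 4n^2 \ \text{for all } n \ge 6.
```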

In data structures and algorithms, Big O notation expresses complexity in simple algebraic terms such as O(1), O(n), or O(n²).
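The sketch below contrasts three of these classes (the helper names are hypothetical, chosen for illustration):

```python
def first_item(items):
    """O(1): one operation, regardless of how large items is."""
    return items[0]

def total(items):
    """O(n): touches each of the n elements exactly once."""
    s = 0
    for x in items:
        s += x
    return s

def has_duplicate(items):
    """O(n^2): compares every pair of elements in the worst case."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False
```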

Understanding tools like Big O helps you decide what does and does not matter when you design algorithms.