ALGORITHM COMPLEXITIES AND BIG-O NOTATION

Ezelu Joseph
4 min read · Jan 10, 2024

--

INTRODUCTION TO ALGORITHM AND ALGORITHM COMPLEXITY

An algorithm is a step-by-step procedure or set of rules designed to perform a specific task or solve a particular problem. It is a well-defined, finite sequence of instructions that, when followed, leads to a solution for a given problem or a desired outcome.

Algorithm complexity refers to the analysis of the efficiency of an algorithm concerning the resources it consumes, typically with a focus on time and space.

TYPES OF ALGORITHM COMPLEXITY

There are two types of algorithm complexity:

  1. Time Complexity — the amount of time an algorithm takes to execute successfully, with respect to the size of its input
  2. Space Complexity — the amount of memory an algorithm needs to execute successfully, with respect to the size of its input

REPRESENTATION OF ALGORITHM COMPLEXITY

Algorithm complexities are represented using asymptotic notations: mathematical tools for describing how an algorithm's time and space requirements grow with the size of its input.

We have 3 Asymptotic notations for representing Algorithm Complexities:

  1. Omega Notation — Best Case Scenario
  2. Theta Notation — Average Case Scenario
  3. Big-O Notation — Worst Case Scenario
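To make the three scenarios concrete, consider a simple linear search (a hypothetical example, not from the article's own code): finding a target in an array takes Ω(1) time when the target happens to be the first element (best case), Θ(n) on average, and O(n) when the target is last or absent (worst case).

```javascript
// Linear search: one algorithm, three asymptotic bounds.
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    // Best case (Omega(1)): the target is the very first element
    if (arr[i] === target) return i;
  }
  // Worst case (O(n)): the target is absent, so every element was checked
  return -1;
}
```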

In this article, I’ll focus on the Big-O Notation, as this is the most widely implemented notation.

BIG-O NOTATION

Big-O Notation is used to represent the “worst case scenario” of an algorithm. It describes the complexity of an algorithm using algebraic terms.

It is expressed in terms of the input size, and it focuses on the larger picture of how runtime grows without getting caught up in minute details such as constant factors and lower-order terms.
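As a sketch of what ignoring the minute details means (this example function is made up for illustration): the code below makes two full passes over its input, so its exact cost is roughly 2n steps, but Big-O drops the constant factor and classifies it simply as O(n).

```javascript
// Two O(n) passes plus constant setup: exact cost ~2n + c, Big-O: O(n)
function sumAndMax(arr) {
  let sum = 0;
  for (const x of arr) sum += x;             // first pass: n steps
  let max = -Infinity;
  for (const x of arr) if (x > max) max = x; // second pass: n steps
  return { sum, max };
}
```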

TIME COMPLEXITY

  • O(1) — Constant Time Complexity
    The algorithm’s runtime remains constant regardless of the size of the input data.
function constantTimeComplexity(arr) {
  // Accessing the first element takes the same time
  // no matter how long the array is
  return arr[0];
}
  • O(n) — Linear Time Complexity
    The algorithm’s runtime is directly proportional to the size of the input data. As the input size increases, the execution time also increases linearly.
function linearTimeComplexity(arr) {
  // Iterating through each element in the array
  arr.forEach(element => {
    console.log(element);
  });
}
  • O(n²) — Quadratic Time Complexity
    The algorithm’s runtime is proportional to the square of the size of the input data. As the input size increases, the execution time increases quadratically.
function quadraticTimeComplexity(arr) {
  // Nested loop iterating through each pair of elements in the array
  arr.forEach(i => {
    arr.forEach(j => {
      console.log(i, j);
    });
  });
}
  • O(n³) — Cubic Time Complexity
    The algorithm’s runtime is proportional to the cube of the size of the input data. As the input size increases, the execution time increases cubically.
function cubicTimeComplexity(arr) {
  // Triple nested loop iterating through each triplet of elements in the array
  arr.forEach(i => {
    arr.forEach(j => {
      arr.forEach(k => {
        console.log(i, j, k);
      });
    });
  });
}
  • O(log n) — Logarithmic Time Complexity
    The algorithm’s runtime grows logarithmically with the size of the input data. As the input size increases, the execution time increases only logarithmically.
function logarithmicTimeComplexity(arr, target) {
  // Binary search example: the search space halves on every iteration
  let low = 0;
  let high = arr.length - 1;

  while (low <= high) {
    const mid = Math.floor((low + high) / 2);

    // Narrow the search range based on the comparison
    if (arr[mid] === target) {
      return mid;
    } else if (arr[mid] < target) {
      low = mid + 1;
    } else {
      high = mid - 1;
    }
  }

  return -1; // Element not found
}

SPACE COMPLEXITY

Space complexity uses the same complexity classes as time complexity, ordered here from least to most space:

  • O(1)
  • O(log n)
  • O(n)
  • O(n²)
  • O(n³)
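For illustration, here are two hypothetical functions contrasting the two most common space classes: the first uses a fixed number of variables no matter the input (O(1) space), while the second allocates a new array whose length grows with the input (O(n) space).

```javascript
// O(1) space: a fixed number of variables, regardless of input size
function sumInPlace(arr) {
  let total = 0;
  for (const x of arr) total += x;
  return total;
}

// O(n) space: builds a new array whose length grows with the input
function doubled(arr) {
  return arr.map(x => x * 2);
}
```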

BIG-O NOTATION OF OBJECTS

Here are the time complexities of common object operations:

  • Insertion — O(1)
  • Removal — O(1)
  • Access — O(1)
  • Search — O(n)
  • Object.keys() — O(n)
  • Object.values() — O(n)
  • Object.entries() — O(n)
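A quick sketch of these operations on a sample object (the object and its values are made up for illustration). Adding, removing, and reading a key are each O(1), while searching values and the Object.* helpers must visit every key:

```javascript
const user = { name: "Ada", role: "admin" };

user.email = "ada@example.com"; // Insertion: O(1)
delete user.role;               // Removal: O(1)
const name = user.name;         // Access: O(1)

// Searching a value and the Object.* helpers visit every key: O(n)
const hasAda = Object.values(user).includes("Ada");
const keys = Object.keys(user);
```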

BIG-O NOTATION OF ARRAYS

  • Insert / remove from end — O(1)
  • Insert / remove from beginning — O(n)
  • Accessing — O(1)
  • Searching — O(n)
  • Push / pop — O(1)
  • Shift, unshift, concat, slice, splice — O(n)
  • forEach, map, filter, reduce — O(n)
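The array operations above can be sketched on a small example (the values are made up for illustration). The key contrast: operations at the end of an array are O(1), while operations at the beginning are O(n) because every other element must shift:

```javascript
const nums = [1, 2, 3];

nums.push(4);    // O(1): append at the end
nums.pop();      // O(1): remove from the end
nums.unshift(0); // O(n): every existing element shifts one slot right
nums.shift();    // O(n): every remaining element shifts one slot left

const first = nums[0];       // O(1): access by index
const idx = nums.indexOf(3); // O(n): linear search
```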

SUMMARY

This article provides an introduction to algorithm complexity and focuses on Big-O notation, specifically emphasizing the “worst case scenario” for algorithms. Algorithm complexity is analyzed concerning the resources it consumes, primarily time and space. Two main types of algorithm complexities are discussed: time complexity, which measures the time an algorithm takes to execute based on the number of inputs, and space complexity, which measures the space needed for an algorithm to execute successfully.

This article introduces three asymptotic notations for representing algorithm complexities: Omega Notation (best case scenario), Theta Notation (average case scenario), and Big-O Notation (worst case scenario). Big-O Notation, the most widely implemented notation, is expressed in terms of the input and provides an overview without getting into minute details.

Time complexities using Big-O Notation are explained, including constant time complexity (O(1)), linear time complexity (O(n)), quadratic time complexity (O(n²)), cubic time complexity (O(n³)), and logarithmic time complexity (O(log n)). Code snippets in JavaScript illustrate each time complexity example.

Space complexity, following a similar sequence as time complexity, is briefly mentioned, and examples of Big-O notation for objects and arrays are provided. The article concludes by presenting time complexities for various operations on objects and arrays.
