Introduction to Sorting and Why It Matters
Sorting means arranging elements in a specific order (usually ascending or descending). It is one of the most important topics in DSA because many problems become much easier once the data is sorted: binary search, two pointers, interval merging, duplicate handling, and greedy strategies all rely on or benefit from sorted input.
Sorting algorithms are commonly compared based on:
- Time complexity (how fast they run)
- Space complexity (extra memory used)
- Stability (whether equal elements keep their original order)
Why Sorting Is Important
- Enables binary search (requires sorted data)
- Simplifies patterns like two pointers and sliding window
- Essential for greedy algorithms and interval problems
- Used in ranking, scheduling, and data processing systems
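As a quick illustration of the first point, Python's standard bisect module performs binary search, which only works once the data is sorted:

```python
import bisect

data = sorted([7, 3, 9, 1, 5])       # binary search requires sorted input
idx = bisect.bisect_left(data, 5)    # O(log n) position lookup
print(data, idx)                     # [1, 3, 5, 7, 9] 2
```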
Key Properties of Sorting Algorithms
- Time Complexity
- Best case
- Average case
- Worst case
- Space Usage
- In-place (O(1) extra space)
- Not in-place (uses extra memory)
- Stability
- Stable sort: equal elements preserve original order
- Unstable sort: order of equal elements may change
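To make stability concrete, here is a small sketch using Python's built-in sorted (which is a stable sort): records that compare equal on the key keep their original relative order.

```python
pairs = [("alice", 2), ("bob", 1), ("carol", 2), ("dave", 1)]
# Sort by score only; names never influence the comparison.
by_score = sorted(pairs, key=lambda p: p[1])
print(by_score)
# Equal scores keep their input order: bob before dave, alice before carol.
```

An unstable sort would be free to emit dave before bob, since their keys compare equal.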
Basic Sorting Algorithms
These algorithms are mainly used for learning and small inputs.
Bubble Sort
Bubble Sort repeatedly compares adjacent elements and swaps them if they are in the wrong order. After each pass, the largest element “bubbles” to the end.
How It Works
- Compare a[0] and a[1], swap if needed
- Compare a[1] and a[2], swap if needed
- Continue to the end of the array
- Repeat passes until no swaps occur
Python Implementation
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr
print(bubble_sort([5, 1, 4, 2, 8]))
Complexity
- Time:
- Best: O(n) (already sorted; the swapped flag allows an early exit)
- Average/Worst: O(n²)
- Space: O(1)
- Stable: Yes
Usage: Educational only; not suitable for large datasets.
Selection Sort
Selection Sort repeatedly selects the minimum element from the unsorted portion and places it at the beginning.
How It Works
- Find the smallest element
- Swap it with the first unsorted position
- Repeat for the remaining list
Python Implementation
def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr
print(selection_sort([64, 25, 12, 22, 11]))
Complexity
- Time: O(n²) (best/average/worst)
- Space: O(1)
- Stable: No (due to swapping)
Usage: Simple, minimal swaps, but inefficient for large inputs.
Insertion Sort
Insertion Sort builds the sorted array one element at a time—similar to sorting cards in hand.
How It Works
- Take the next element
- Shift larger elements to the right
- Insert the element in the correct position
Python Implementation
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
print(insertion_sort([12, 11, 13, 5, 6]))
Complexity
- Best: O(n) (already sorted)
- Average/Worst: O(n²)
- Space: O(1)
- Stable: Yes
Usage: Very good for small or nearly sorted datasets.
Advanced Sorting Algorithms
These are used in real-world systems.
Merge Sort
Merge Sort is a divide-and-conquer algorithm:
- Divide array into halves
- Sort each half recursively
- Merge the sorted halves
Why Merge Sort Is Powerful
- Guaranteed O(n log n) time
- Stable
- Works well for linked lists and large files (external sorting)
Python Implementation
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
print(merge_sort([38, 27, 43, 3, 9, 82, 10]))
Complexity
- Time: O(n log n) (all cases)
- Space: O(n)
- Stable: Yes
Quick Sort
Quick Sort also uses divide and conquer:
- Choose a pivot
- Partition elements smaller and larger than pivot
- Recursively sort partitions
Important Note
Quick Sort is extremely fast in practice, but has worst-case O(n²) if pivot selection is poor.
Python Implementation (Simple Version)
def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    mid = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + mid + quick_sort(right)
print(quick_sort([10, 7, 8, 9, 1, 5]))
Complexity
- Average: O(n log n)
- Worst: O(n²)
- Space: O(n) for this simple version (it builds new lists); an in-place version needs only the O(log n) average recursion stack
- Stable: No
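Since the simple version above allocates new lists, an in-place variant is common in practice. Below is a sketch (the function name is illustrative) using the Lomuto partition scheme with a random pivot, which makes the O(n²) worst case very unlikely:

```python
import random

def quick_sort_inplace(arr, lo=0, hi=None):
    # In-place quick sort: partition around a randomly chosen pivot,
    # then recurse on the two sides. Sorts arr without extra lists.
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    p = random.randint(lo, hi)           # random pivot index
    arr[p], arr[hi] = arr[hi], arr[p]    # move pivot to the end
    pivot = arr[hi]
    i = lo                               # boundary of the "< pivot" region
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]    # place pivot in its final spot
    quick_sort_inplace(arr, lo, i - 1)
    quick_sort_inplace(arr, i + 1, hi)
    return arr

print(quick_sort_inplace([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```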
Heap Sort
Heap Sort uses a heap (priority queue):
- Build a heap
- Repeatedly extract min/max
Python Implementation (Using heapq)
import heapq

def heap_sort(arr):
    heapq.heapify(arr)
    result = []
    while arr:
        result.append(heapq.heappop(arr))
    return result
print(heap_sort([12, 11, 13, 5, 6, 7]))
Complexity
- Time: O(n log n)
- Space: O(n) (in this implementation)
- Stable: No
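The heapq version above allocates a result list. The textbook heap sort instead builds a max-heap inside the array itself and sorts in O(1) extra space; a sketch (function names are illustrative):

```python
def heap_sort_inplace(arr):
    # In-place heap sort: build a max-heap, then repeatedly swap the
    # root (current maximum) to the end and shrink the heap.
    def sift_down(start, end):
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and arr[child] < arr[child + 1]:
                child += 1               # pick the larger child
            if arr[root] < arr[child]:
                arr[root], arr[child] = arr[child], arr[root]
                root = child
            else:
                return

    n = len(arr)
    for start in range(n // 2 - 1, -1, -1):   # heapify: O(n)
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):           # n extractions: O(n log n)
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end - 1)
    return arr

print(heap_sort_inplace([12, 11, 13, 5, 6, 7]))  # [5, 6, 7, 11, 12, 13]
```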
Comparison of Sorting Algorithms
Time Complexity Summary
- O(n²): Bubble, Selection, Insertion
- O(n log n): Merge, Quick (avg), Heap
Stability Summary
- Stable: Bubble, Insertion, Merge
- Unstable: Selection, Quick, Heap
When to Use Which Algorithm
- Small / nearly sorted data: Insertion Sort
- Guaranteed performance + stability: Merge Sort
- Fast average performance: Quick Sort
- Priority-based operations: Heap Sort
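In day-to-day Python code, none of these is hand-written: the built-in sorted() and list.sort() use Timsort, which is stable, O(n log n) in the worst case, and takes advantage of already-sorted runs.

```python
words = ["banana", "Apple", "cherry"]
print(sorted(words, key=str.lower))     # case-insensitive, stable
print(sorted([5, 1, 4], reverse=True))  # descending order
```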
Summary
Sorting is a foundational skill in DSA.
- Basic sorting algorithms build intuition and understanding.
- Advanced algorithms like Merge Sort, Quick Sort, and Heap Sort are used in real systems.
- Choosing the right sorting algorithm depends on input size, memory constraints, and stability requirements.