Greedy algorithms solve problems by making the best choice at the current moment (a locally optimal choice), hoping it leads to a globally optimal solution. Greedy approaches are often simpler and faster than Dynamic Programming—but they only work when the problem satisfies the greedy-choice property.
Introduction to the Greedy Approach
What is a Greedy Algorithm?
A greedy algorithm builds a solution step-by-step and at each step chooses what looks best right now.
Greedy-Choice Property
A problem has the greedy-choice property if:
✅ Making the best local choice always leads to an optimal global solution.
This is not true for all problems—greedy needs the right structure.
When Greedy Works Well
Greedy works well when:
- Local choices never block an optimal solution
- The problem has a provable greedy rule
- Sorting first makes the greedy choices valid (very common)
When Greedy Fails
Greedy can fail when a local best choice blocks a better future result.
Example where greedy fails (coin system):
- coins = [1, 3, 4], amount = 6
- greedy picks: 4 + 1 + 1 → 3 coins
- optimal is: 3 + 3 → 2 coins ✅
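This failure is easy to reproduce. Below is a minimal sketch of the largest-coin-first strategy (the function name `greedy_coin_count` is illustrative, not from any library):

```python
def greedy_coin_count(coins, amount):
    """Greedy coin change: repeatedly take the largest coin that still fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c   # take as many of this coin as possible
        amount %= c
    return count if amount == 0 else None  # None if the amount is unreachable

print(greedy_coin_count([1, 3, 4], 6))  # 3 coins (4 + 1 + 1), but optimal is 2
```

With `[1, 3, 4]` and amount 6, greedy commits to the 4 and is stuck paying the remainder in 1s; the optimal 3 + 3 is never considered.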
Classic Greedy Problems
1) Activity Selection Problem
Problem: Given activities with start and end times, select the maximum number of non-overlapping activities.
Key Greedy Idea
✅ Sort by end time, then always pick the one that ends earliest.
Why it works: Finishing earliest leaves the most room for future activities.
Implementation
```python
def activity_selection(activities):
    # activities = [(start, end), ...]
    activities.sort(key=lambda x: x[1])  # sort by end time
    selected = []
    last_end = float("-inf")
    for start, end in activities:
        if start >= last_end:
            selected.append((start, end))
            last_end = end
    return selected

acts = [(1, 3), (2, 4), (3, 5), (0, 6), (5, 7), (8, 9)]
print(activity_selection(acts))  # [(1, 3), (3, 5), (5, 7), (8, 9)]
```
Complexity
- Time: O(n log n) (sorting)
- Space: O(n) (selected list)
2) Fractional Knapsack Problem
Problem: Items have weights and values. You can take fractions of items. Maximize value under capacity.
Greedy Idea
✅ Take items in decreasing order of value/weight ratio.
Why greedy works here: Fractions are allowed, so choosing best density first always stays optimal.
Implementation
```python
def fractional_knapsack(weights, values, capacity):
    items = []
    for w, v in zip(weights, values):
        items.append((v / w, w, v))  # (ratio, weight, value)
    items.sort(reverse=True)  # sort by ratio descending
    total_value = 0
    for ratio, w, v in items:
        if capacity == 0:
            break
        if w <= capacity:
            total_value += v
            capacity -= w
        else:
            total_value += ratio * capacity
            capacity = 0
    return total_value

weights = [10, 20, 30]
values = [60, 100, 120]
print(fractional_knapsack(weights, values, 50))  # 240.0
```
Complexity
- Time: O(n log n)
- Space: O(n)
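For contrast: if fractions were not allowed (the 0/1 knapsack), the same density-first rule would be suboptimal. A sketch with the items above, where greedy gets 160 but taking the second and third items whole gives 220 (the function name is illustrative):

```python
def greedy_01_knapsack(weights, values, capacity):
    """Density-first greedy applied to 0/1 knapsack -- NOT optimal in general."""
    items = sorted(zip(weights, values), key=lambda wv: wv[1] / wv[0], reverse=True)
    total = 0
    for w, v in items:
        if w <= capacity:  # whole item or nothing: no fractions allowed
            total += v
            capacity -= w
    return total

print(greedy_01_knapsack([10, 20, 30], [60, 100, 120], 50))  # 160, optimal is 220
```

This is exactly why the 0/1 variant is usually solved with Dynamic Programming instead.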
3) Huffman Coding (Data Compression)
Huffman coding assigns shorter binary codes to more frequent characters.
Greedy Idea
✅ Repeatedly combine the two least frequent nodes.
Data Structure Used
- Min-heap (priority queue)
Build Huffman Tree
```python
import heapq
from collections import Counter

class Node:
    def __init__(self, freq, char=None, left=None, right=None):
        self.freq = freq
        self.char = char
        self.left = left
        self.right = right

    def __lt__(self, other):
        return self.freq < other.freq

def build_huffman_tree(text):
    freq = Counter(text)
    heap = [Node(f, ch) for ch, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        merged = Node(a.freq + b.freq, left=a, right=b)
        heapq.heappush(heap, merged)
    return heap[0] if heap else None
```
Generate Codes (DFS Traversal)
```python
def generate_codes(root):
    codes = {}

    def dfs(node, path):
        if not node:
            return
        if node.char is not None:
            codes[node.char] = path
            return
        dfs(node.left, path + "0")
        dfs(node.right, path + "1")

    dfs(root, "")
    return codes

text = "aaabbc"
root = build_huffman_tree(text)
print(generate_codes(root))
```
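Because Huffman codes are prefix-free, decoding can also proceed greedily: consume bits left to right and emit a character as soon as the buffer matches a code. A self-contained sketch (the code table below is one valid assignment for "aaabbc"):

```python
def huffman_decode(bits, codes):
    # codes: {char: bit_string}; prefix-free, so the first match is the symbol
    inverse = {code: ch for ch, code in codes.items()}
    decoded, buf = [], ""
    for bit in bits:
        buf += bit
        if buf in inverse:
            decoded.append(inverse[buf])
            buf = ""
    return "".join(decoded)

codes = {"a": "0", "b": "10", "c": "11"}   # one valid code table for "aaabbc"
print(huffman_decode("000101011", codes))  # aaabbc (9 bits vs 48 bits of ASCII)
```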
Complexity
- Heap build: O(k)
- Merging: O(k log k)
where k = number of unique characters
4) Prim’s Algorithm (Minimum Spanning Tree)
Finds MST of a connected weighted undirected graph.
Greedy Idea
✅ Expand MST by always picking the cheapest edge that connects to a new node.
Implementation (Min Heap)
```python
import heapq

def prim_mst(graph, start=0):
    # graph: {u: [(v, w), ...]}
    visited = set()
    mst_cost = 0
    pq = [(0, start)]  # (weight, node)
    while pq:
        weight, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        mst_cost += weight
        for neighbor, w in graph[node]:
            if neighbor not in visited:
                heapq.heappush(pq, (w, neighbor))
    return mst_cost
```
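A quick sanity check on a 4-node graph whose MST consists of the edges of weight 1, 2, and 3 (`prim_mst` is repeated below so the snippet runs on its own):

```python
import heapq

def prim_mst(graph, start=0):
    # same function as above: graph is {u: [(v, w), ...]}
    visited = set()
    mst_cost = 0
    pq = [(0, start)]
    while pq:
        weight, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        mst_cost += weight
        for neighbor, w in graph[node]:
            if neighbor not in visited:
                heapq.heappush(pq, (w, neighbor))
    return mst_cost

graph = {
    0: [(1, 1), (2, 5), (3, 4)],
    1: [(0, 1), (2, 2)],
    2: [(1, 2), (0, 5), (3, 3)],
    3: [(2, 3), (0, 4)],
}
print(prim_mst(graph))  # 6  (edges 0-1, 1-2, 2-3)
```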
Complexity
- Time: O(E log V)
- Space: O(V + E)
5) Kruskal’s Algorithm (Minimum Spanning Tree)
Builds MST by picking the globally smallest edges.
Greedy Idea
✅ Sort edges by weight, take the next smallest edge that doesn’t form a cycle.
Needs:
- Union-Find (DSU) to detect cycles efficiently.
Union-Find
```python
class DSU:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False
        if self.rank[ra] < self.rank[rb]:
            self.parent[ra] = rb
        elif self.rank[ra] > self.rank[rb]:
            self.parent[rb] = ra
        else:
            self.parent[rb] = ra
            self.rank[ra] += 1
        return True
```
Kruskal Implementation
```python
def kruskal(n, edges):
    # edges: [(w, u, v), ...]
    edges.sort()
    dsu = DSU(n)
    mst_cost = 0
    mst_edges = []
    for w, u, v in edges:
        if dsu.union(u, v):
            mst_cost += w
            mst_edges.append((u, v, w))
    return mst_cost, mst_edges
```
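Running Kruskal on the same 4-node graph gives the same MST cost of 6; the snippet repeats a compact `DSU` and `kruskal` so it runs stand-alone:

```python
class DSU:  # union-find with path compression and union by rank, as above
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])
        return self.parent[x]

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already connected: this edge would form a cycle
        if self.rank[ra] < self.rank[rb]:
            ra, rb = rb, ra  # make ra the higher-rank root
        self.parent[rb] = ra
        if self.rank[ra] == self.rank[rb]:
            self.rank[ra] += 1
        return True

def kruskal(n, edges):
    edges.sort()  # edges: [(w, u, v), ...]
    dsu = DSU(n)
    cost, chosen = 0, []
    for w, u, v in edges:
        if dsu.union(u, v):
            cost += w
            chosen.append((u, v, w))
    return cost, chosen

edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 0, 2)]
print(kruskal(4, edges))  # (6, [(0, 1, 1), (1, 2, 2), (2, 3, 3)])
```

The edges of weight 4 and 5 are rejected because they would close a cycle.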
Complexity
- Sorting edges: O(E log E)
- DSU ops: O(α(n)) amortized, effectively O(1)
- Total: O(E log E)
Summary
Greedy algorithms make locally optimal decisions and work when the problem structure guarantees that those local choices lead to a globally optimal solution.
✅ Common Greedy Problems:
- Activity Selection (sort by end time)
- Fractional Knapsack (sort by value density)
- Huffman Coding (merge two smallest using heap)
- MST: Prim’s (expand using min edge), Kruskal’s (sort edges + DSU)