Understanding Binary Search Trees (BSTs) - A Dev’s Practical Guide

Data structures are the backbone of efficient software. Among them, trees hold a special place, and Binary Search Trees (BSTs) are among the most fundamental and frequently discussed. If you’ve ever needed to store data in a way that allows for quick retrieval, insertion, and deletion while maintaining order, BSTs are a concept you need to grasp.

This post will cut through the academic jargon and provide you with a hands-on, example-driven understanding of BSTs, complete with working Python code that you can run and experiment with.

Let’s get started.

What’s a Tree? (A Quick Recap)

Before we dive into BSTs, let’s briefly touch upon the general concept of a “tree” in computer science.

A tree is a non-linear, hierarchical data structure represented as a set of linked nodes: a single root node sits at the top, and every other node belongs to the subtree of exactly one parent node.

  • Node: The fundamental building block of a tree. It contains a value and references (pointers) to its children.
  • Root: The topmost node of the tree. A tree has only one root node.
  • Child: A node directly connected to another node (its parent) one level below.
  • Parent: A node directly connected to another node (its child) one level above.
  • Leaf: A node with no children.
  • Edge: The connection between two nodes.

What’s a Binary Tree?

A Binary Tree is a specific type of tree data structure where each node has at most two children, typically referred to as the left child and the right child.

This constraint (at most two children) is what makes it “binary.”

What is a Binary Search Tree (BST)?

A Binary Search Tree (BST) is a specialized binary tree that adheres to a specific ordering property for efficient searching, insertion, and deletion operations.

The BST Property states:

  1. All values in the left subtree of a node must be less than the value of the node itself.
  2. All values in the right subtree of a node must be greater than the value of the node itself.
  3. Both the left and right subtrees must also be BSTs.

This property is crucial because it ensures that when you’re looking for a specific value, you can discard an entire subtree at each step (roughly half of the remaining nodes in a balanced tree), much like a binary search on a sorted array.

Advantages of BSTs

  • Efficient Searching, Insertion, and Deletion: On average, these operations take O(log N) time, where N is the number of nodes. That is significantly faster than the O(N) lookups required by linked lists or unsorted arrays.
  • Ordered Data Retrieval: Traversing a BST in a specific way (in-order traversal, which we’ll cover) yields elements in sorted order.
  • Flexibility: Unlike arrays, BSTs can grow and shrink dynamically without requiring reallocation or fixed-size constraints.

Disadvantages of BSTs

  • Worst-Case Performance: If elements are inserted in a sorted (or reverse-sorted) order, a BST can degenerate into a linked list, leading to O(N) performance for all operations. This is why self-balancing BSTs (like AVL trees or Red-Black trees) exist, which automatically adjust their structure to maintain balance. We won’t cover self-balancing trees in this post, but it’s important to know they address this limitation.
  • Memory Overhead: Each node requires memory for its value and pointers to its children, which can be more memory-intensive than arrays for simple data storage.

Building Blocks: The Node Structure

Before we implement BST operations, we need a Node class. This class will represent each element in our tree.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

The Node class is simple:

  • key: The value stored in the node. In a BST, these keys are what we’ll compare.
  • left: A reference to the left child node. Initially None.
  • right: A reference to the right child node. Initially None.

The __str__ and __repr__ methods are just for easier printing and debugging.
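
As a quick aside, here is a minimal sketch (not used in the rest of this post) that checks whether a tree built from these Node objects actually satisfies the BST ordering property described earlier, assuming distinct keys as in the rest of this guide. The is_bst helper is a name introduced just for this illustration:

def is_bst(node, low=float("-inf"), high=float("inf")):
    """Return True if the subtree rooted at `node` satisfies the BST property."""
    if node is None:
        return True
    # Every key must fall strictly between the bounds inherited from its ancestors.
    if not (low < node.key < high):
        return False
    # Left keys must stay below node.key, right keys must stay above it.
    return (is_bst(node.left, low, node.key) and
            is_bst(node.right, node.key, high))

# Quick check on a tiny hand-built tree:
root = Node(50)
root.left, root.right = Node(30), Node(70)
print(is_bst(root))  # True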


BST Operations: Implementation and Examples

Now, let’s implement the core operations of a BST: Insertion, Searching, and Traversal. We’ll also tackle the trickier Deletion operation.

We’ll build these operations into the BinarySearchTree class introduced above, whose only attribute is root.

1. Insertion (insert)

Inserting a new value into a BST involves finding the correct position while maintaining the BST property.

Algorithm:

  1. Start at the root.
  2. If the tree is empty, the new node becomes the root.
  3. If the new value is less than the current node’s value, go to the left child.
  4. If the new value is greater than the current node’s value, go to the right child.
  5. Repeat steps 3-4 until you find an empty spot (a None child), then insert the new node there.

Note: We typically assume distinct values in a BST. If duplicates are allowed, you can decide whether to place them in the left or right subtree (e.g., key <= current_node.key for left, key > current_node.key for right). For this example, we’ll simply skip duplicates.

Here’s the full implementation:
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, key):
        new_node = Node(key)
        if self.root is None:
            self.root = new_node
            print(f"Inserted {key} as root.")
            return

        current = self.root
        parent = None
        while current is not None:
            parent = current
            if key < current.key:
                current = current.left
            elif key > current.key: # Go right for larger keys; duplicates are caught below
                current = current.right
            else: # Key already exists, do not insert
                print(f"Key {key} already exists. Skipping insertion.")
                return

        if key < parent.key:
            parent.left = new_node
            print(f"Inserted {key} to the left of {parent.key}.")
        else:
            parent.right = new_node
            print(f"Inserted {key} to the right of {parent.key}.")

# --- Example Usage for Insertion ---
if __name__ == "__main__":
    print("--- BST Insertion Example ---")
    bst = BinarySearchTree()
    bst.insert(50)
    bst.insert(30)
    bst.insert(70)
    bst.insert(20)
    bst.insert(40)
    bst.insert(60)
    bst.insert(80)
    bst.insert(50) # Try inserting a duplicate

Output:

--- BST Insertion Example ---
Inserted 50 as root.
Inserted 30 to the left of 50.
Inserted 70 to the right of 50.
Inserted 20 to the left of 30.
Inserted 40 to the right of 30.
Inserted 60 to the left of 70.
Inserted 80 to the right of 70.
Key 50 already exists. Skipping insertion.

This sequence builds the following BST structure:

      50
     /  \
    30   70
   / \   / \
  20 40 60 80

2. Searching (search)

Searching for a value in a BST follows a very similar path to insertion.

Algorithm:

  1. Start at the root.
  2. If the current node is None (reached the end of a path without finding the key), the key is not in the tree.
  3. If the key matches the current node’s value, you’ve found it.
  4. If the key is less than the current node’s value, go to the left child.
  5. If the key is greater than the current node’s value, go to the right child.
  6. Repeat steps 2-5 until the key is found or None is reached.

Here’s the search method, with insert included again so the example runs on its own (its print statements are commented out for cleaner output):
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, key):
        new_node = Node(key)
        if self.root is None:
            self.root = new_node
            # print(f"Inserted {key} as root.") # Suppress for cleaner output in search example
            return

        current = self.root
        parent = None
        while current is not None:
            parent = current
            if key < current.key:
                current = current.left
            elif key > current.key:
                current = current.right
            else:
                # print(f"Key {key} already exists. Skipping insertion.") # Suppress
                return

        if key < parent.key:
            parent.left = new_node
            # print(f"Inserted {key} to the left of {parent.key}.") # Suppress
        else:
            parent.right = new_node
            # print(f"Inserted {key} to the right of {parent.key}.") # Suppress

    def search(self, key):
        current = self.root
        while current is not None:
            if key == current.key:
                return current # Key found
            elif key < current.key:
                current = current.left
            else: # key > current.key
                current = current.right
        return None # Key not found

# --- Example Usage for Searching ---
if __name__ == "__main__":
    print("--- BST Search Example ---")
    bst_search = BinarySearchTree()
    bst_search.insert(50)
    bst_search.insert(30)
    bst_search.insert(70)
    bst_search.insert(20)
    bst_search.insert(40)
    bst_search.insert(60)
    bst_search.insert(80)

    found_node = bst_search.search(40)
    if found_node:
        print(f"Key 40 found: {found_node.key}")
    else:
        print("Key 40 not found.")

    found_node = bst_search.search(99)
    if found_node:
        print(f"Key 99 found: {found_node.key}")
    else:
        print("Key 99 not found.")

    found_node = bst_search.search(50)
    if found_node:
        print(f"Key 50 (root) found: {found_node.key}")
    else:
        print("Key 50 (root) not found.")

Output:

--- BST Search Example ---
Key 40 found: 40
Key 99 not found.
Key 50 (root) found: 50

3. Traversal

Traversal is the process of visiting all nodes in a tree. For BSTs, there are three common types of Depth-First Traversals: In-order, Pre-order, and Post-order.

These are typically implemented recursively.

In-order Traversal (Left, Root, Right)

In-order traversal is particularly useful for BSTs because it visits nodes in ascending order of their values. This means printing the nodes during an in-order traversal will give you a sorted list of all keys in the BST.

Algorithm:

  1. Recursively traverse the left subtree.
  2. Visit the current node (e.g., print its key).
  3. Recursively traverse the right subtree.

Here’s the in-order traversal added to the class:
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, key):
        new_node = Node(key)
        if self.root is None:
            self.root = new_node
            print(f"Inserted {key} as root.")
            return

        current = self.root
        parent = None
        while current is not None:
            parent = current
            if key < current.key:
                current = current.left
            elif key > current.key:
                current = current.right
            else:
                print(f"Key {key} already exists. Skipping insertion.")
                return

        if key < parent.key:
            parent.left = new_node
            print(f"Inserted {key} to the left of {parent.key}.")
        else:
            parent.right = new_node
            print(f"Inserted {key} to the right of {parent.key}.")

    def inorder_traversal(self):
        nodes = []
        self._inorder_recursive(self.root, nodes)
        return nodes

    def _inorder_recursive(self, node, nodes_list):
        if node is not None:
            self._inorder_recursive(node.left, nodes_list)
            nodes_list.append(node.key)
            self._inorder_recursive(node.right, nodes_list)

# --- Example Usage for In-order Traversal ---
if __name__ == "__main__":
    print("--- BST In-order Traversal Example ---")
    bst_traversal = BinarySearchTree()
    elements = [50, 30, 70, 20, 40, 60, 80, 10]
    for el in elements:
        bst_traversal.insert(el)

    inorder_result = bst_traversal.inorder_traversal()
    print(f"In-order traversal (sorted): {inorder_result}")

Output:

--- BST In-order Traversal Example ---
Inserted 50 as root.
Inserted 30 to the left of 50.
Inserted 70 to the right of 50.
Inserted 20 to the left of 30.
Inserted 40 to the right of 30.
Inserted 60 to the left of 70.
Inserted 80 to the right of 70.
Inserted 10 to the left of 20.
In-order traversal (sorted): [10, 20, 30, 40, 50, 60, 70, 80]

Notice how the output [10, 20, 30, 40, 50, 60, 70, 80] is perfectly sorted, demonstrating the BST property.
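
This is essentially “tree sort”: insert everything, then read it back with an in-order traversal. A quick sketch using the class above (note that duplicates would be silently skipped by this insert implementation):

data = [42, 7, 19, 88, 3, 56]
tree_sorted = BinarySearchTree()
for value in data:
    tree_sorted.insert(value)

print(tree_sorted.inorder_traversal())  # [3, 7, 19, 42, 56, 88]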

Pre-order Traversal (Root, Left, Right)

Pre-order traversal visits the current node first, then its left subtree, then its right subtree. This traversal is useful for creating a copy of the tree or for representing the tree structure.

Algorithm:

  1. Visit the current node.
  2. Recursively traverse the left subtree.
  3. Recursively traverse the right subtree.

And the pre-order version:
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, key):
        new_node = Node(key)
        if self.root is None:
            self.root = new_node
            # print(f"Inserted {key} as root.") # Suppress for cleaner output
            return

        current = self.root
        parent = None
        while current is not None:
            parent = current
            if key < current.key:
                current = current.left
            elif key > current.key:
                current = current.right
            else:
                # print(f"Key {key} already exists. Skipping insertion.") # Suppress
                return

        if key < parent.key:
            parent.left = new_node
            # print(f"Inserted {key} to the left of {parent.key}.") # Suppress
        else:
            parent.right = new_node
            # print(f"Inserted {key} to the right of {parent.key}.") # Suppress

    def preorder_traversal(self):
        nodes = []
        self._preorder_recursive(self.root, nodes)
        return nodes

    def _preorder_recursive(self, node, nodes_list):
        if node is not None:
            nodes_list.append(node.key)
            self._preorder_recursive(node.left, nodes_list)
            self._preorder_recursive(node.right, nodes_list)

# --- Example Usage for Pre-order Traversal ---
if __name__ == "__main__":
    print("--- BST Pre-order Traversal Example ---")
    bst_traversal_pre = BinarySearchTree()
    elements = [50, 30, 70, 20, 40, 60, 80, 10] # Same elements as before
    for el in elements:
        bst_traversal_pre.insert(el)

    preorder_result = bst_traversal_pre.preorder_traversal()
    print(f"Pre-order traversal: {preorder_result}")

Output:

--- BST Pre-order Traversal Example ---
Pre-order traversal: [50, 30, 20, 10, 40, 70, 60, 80]
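
One handy consequence of visiting the root first: feeding a BST’s pre-order sequence back into an empty BST reproduces exactly the same structure, which is the “copy of the tree” use case mentioned above. A quick sketch, assuming the BinarySearchTree class from this snippet is available in the same module:

original = BinarySearchTree()
for el in [50, 30, 70, 20, 40, 60, 80, 10]:
    original.insert(el)

copy = BinarySearchTree()
for key in original.preorder_traversal():  # root-first order preserves parent/child shape
    copy.insert(key)

print(copy.preorder_traversal() == original.preorder_traversal())  # True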

Post-order Traversal (Left, Right, Root)

Post-order traversal visits the left subtree, then the right subtree, and finally the current node. This is often used for deleting a tree from the bottom up, or for evaluating expression trees.

Algorithm:

  1. Recursively traverse the left subtree.
  2. Recursively traverse the right subtree.
  3. Visit the current node.

And the post-order version:
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, key):
        new_node = Node(key)
        if self.root is None:
            self.root = new_node
            # print(f"Inserted {key} as root.") # Suppress for cleaner output
            return

        current = self.root
        parent = None
        while current is not None:
            parent = current
            if key < current.key:
                current = current.left
            elif key > current.key:
                current = current.right
            else:
                # print(f"Key {key} already exists. Skipping insertion.") # Suppress
                return

        if key < parent.key:
            parent.left = new_node
            # print(f"Inserted {key} to the left of {parent.key}.") # Suppress
        else:
            parent.right = new_node
            # print(f"Inserted {key} to the right of {parent.key}.") # Suppress

    def postorder_traversal(self):
        nodes = []
        self._postorder_recursive(self.root, nodes)
        return nodes

    def _postorder_recursive(self, node, nodes_list):
        if node is not None:
            self._postorder_recursive(node.left, nodes_list)
            self._postorder_recursive(node.right, nodes_list)
            nodes_list.append(node.key)

# --- Example Usage for Post-order Traversal ---
if __name__ == "__main__":
    print("--- BST Post-order Traversal Example ---")
    bst_traversal_post = BinarySearchTree()
    elements = [50, 30, 70, 20, 40, 60, 80, 10] # Same elements as before
    for el in elements:
        bst_traversal_post.insert(el)

    postorder_result = bst_traversal_post.postorder_traversal()
    print(f"Post-order traversal: {postorder_result}")

Output:

--- BST Post-order Traversal Example ---
Post-order traversal: [10, 20, 40, 30, 60, 80, 70, 50]
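
The “expression tree” use mentioned above deserves a quick illustration. The sketch below uses its own tiny ExprNode type (a name introduced here, separate from the BST Node class, since expression trees are not search trees) and evaluates a tree in post-order: children first, then the operator stored at the current node:

import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

class ExprNode:
    def __init__(self, value, left=None, right=None):
        self.value = value  # an operator string for inner nodes, a number for leaves
        self.left = left
        self.right = right

def evaluate(node):
    # Post-order: evaluate the left and right subtrees first, then combine them.
    if node.left is None and node.right is None:
        return node.value
    return OPS[node.value](evaluate(node.left), evaluate(node.right))

# (3 + 4) * 2
expr = ExprNode("*", ExprNode("+", ExprNode(3), ExprNode(4)), ExprNode(2))
print(evaluate(expr))  # 14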

4. Deletion (delete)

Deletion is the most complex operation in a BST because you must remove a node while preserving the BST property. There are three main cases to handle:

Case 1: Node to be deleted is a Leaf Node (no children).

  • Simply remove the node. Update its parent’s pointer (left or right) to None.

Case 2: Node to be deleted has One Child.

  • Replace the node with its single child. Update its parent’s pointer to point to the child.

Case 3: Node to be deleted has Two Children.

  • This is the trickiest case. You cannot simply remove the node as it has two subtrees.
  • The strategy is to find an in-order successor (the smallest node in the right subtree) or an in-order predecessor (the largest node in the left subtree).
  • Replace the deleted node’s value with the successor’s (or predecessor’s) value.
  • Then, recursively delete the successor (or predecessor) node from its original position. Since the successor is the smallest in the right subtree, it will have at most one child (a right child), simplifying its deletion.

We’ll need a helper function to find the minimum value node in a subtree for Case 3.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

    def __str__(self):
        return str(self.key)

    def __repr__(self):
        return f"Node({self.key})"

class BinarySearchTree:
    def __init__(self):
        self.root = None

    def insert(self, key):
        new_node = Node(key)
        if self.root is None:
            self.root = new_node
            print(f"Inserted {key} as root.")
            return

        current = self.root
        parent = None
        while current is not None:
            parent = current
            if key < current.key:
                current = current.left
            elif key > current.key:
                current = current.right
            else:
                print(f"Key {key} already exists. Skipping insertion.")
                return

        if key < parent.key:
            parent.left = new_node
            print(f"Inserted {key} to the left of {parent.key}.")
        else:
            parent.right = new_node
            print(f"Inserted {key} to the right of {parent.key}.")

    def inorder_traversal(self):
        nodes = []
        self._inorder_recursive(self.root, nodes)
        return nodes

    def _inorder_recursive(self, node, nodes_list):
        if node is not None:
            self._inorder_recursive(node.left, nodes_list)
            nodes_list.append(node.key)
            self._inorder_recursive(node.right, nodes_list)

    def _find_min_node(self, node):
        """Helper to find the node with the minimum key in a given subtree."""
        current = node
        while current.left is not None:
            current = current.left
        return current

    def delete(self, key):
        self.root = self._delete_recursive(self.root, key)

    def _delete_recursive(self, node, key):
        # Base Case: If the tree is empty or the key is not found
        if node is None:
            print(f"Key {key} not found for deletion.")
            return node

        # Traverse the tree to find the node to be deleted
        if key < node.key:
            node.left = self._delete_recursive(node.left, key)
        elif key > node.key:
            node.right = self._delete_recursive(node.right, key)
        else: # Node with key is found!
            print(f"Found node {node.key} for deletion.")

            # Cases 1 & 2: node has no children, or only one child
            if node.left is None:
                temp = node.right
                node = None # Drop the local reference; the parent re-links to temp below
                print(f"Deleted {key}. Replaced with its right child (or None).")
                return temp # Return the right child (or None) to link with parent
            elif node.right is None:
                temp = node.left
                node = None # Drop the local reference; the parent re-links to temp below
                print(f"Deleted {key}. Replaced with its left child.")
                return temp # Return the left child to link with parent

            # Case 3: Node has two children
            # Find the in-order successor (smallest in the right subtree)
            successor = self._find_min_node(node.right)
            print(f"Node {key} has two children. Successor is {successor.key}.")

            # Copy the successor's content to this node
            node.key = successor.key

            # Delete the in-order successor from its original position in the right subtree
            node.right = self._delete_recursive(node.right, successor.key)
            print(f"Recursively deleted successor {successor.key}.")

        return node # Return the (possibly modified) current node to its parent

# --- Example Usage for Deletion ---
if __name__ == "__main__":
    print("\n--- BST Deletion Example ---")
    bst_delete = BinarySearchTree()
    elements = [50, 30, 70, 20, 40, 60, 80, 10, 35, 45, 75]
    for el in elements:
        bst_delete.insert(el)

    print("\nInitial In-order Traversal:")
    print(bst_delete.inorder_traversal())

    # Delete a leaf node (10)
    print("\nDeleting 10 (leaf node)...")
    bst_delete.delete(10)
    print("In-order Traversal after deleting 10:")
    print(bst_delete.inorder_traversal())

    # Delete another leaf node (60)
    print("\nDeleting 60 (leaf node)...")
    bst_delete.delete(60)
    print("In-order Traversal after deleting 60:")
    print(bst_delete.inorder_traversal())

    # Delete a node with two children (30)
    print("\nDeleting 30 (node with two children)...")
    bst_delete.delete(30)
    print("In-order Traversal after deleting 30:")
    print(bst_delete.inorder_traversal())

    # Delete the root (50), which has two children
    print("\nDeleting 50 (root node, with two children)...")
    bst_delete.delete(50)
    print("In-order Traversal after deleting 50:")
    print(bst_delete.inorder_traversal())

    # Note: The deletion messages are verbose to help trace the logic.
    # The actual structure of the BST after multiple deletions can be tricky to visualize from just in-order output.

Output:

--- BST Deletion Example ---
Inserted 50 as root.
Inserted 30 to the left of 50.
Inserted 70 to the right of 50.
Inserted 20 to the left of 30.
Inserted 40 to the right of 30.
Inserted 60 to the left of 70.
Inserted 80 to the right of 70.
Inserted 10 to the left of 20.
Inserted 35 to the left of 40.
Inserted 45 to the right of 40.
Inserted 75 to the left of 80.

Initial In-order Traversal:
[10, 20, 30, 35, 40, 45, 50, 60, 70, 75, 80]

Deleting 10 (leaf node)...
Found node 10 for deletion.
Deleted 10. Replaced with its right child (or None).
In-order Traversal after deleting 10:
[20, 30, 35, 40, 45, 50, 60, 70, 75, 80]

Deleting 60 (leaf node)...
Found node 60 for deletion.
Deleted 60. Replaced with its right child (or None).
In-order Traversal after deleting 60:
[20, 30, 35, 40, 45, 50, 70, 75, 80]

Deleting 30 (node with two children)...
Found node 30 for deletion.
Node 30 has two children. Successor is 35.
Found node 35 for deletion.
Deleted 35. Replaced with its right child (or None).
Recursively deleted successor 35.
In-order Traversal after deleting 30:
[20, 35, 40, 45, 50, 70, 75, 80]

Deleting 50 (root node, with two children)...
Found node 50 for deletion.
Node 50 has two children. Successor is 70.
Found node 70 for deletion.
Deleted 70. Replaced with its right child (or None).
Recursively deleted successor 70.
In-order Traversal after deleting 50:
[20, 35, 40, 45, 70, 75, 80]

Time Complexity of BST Operations

Understanding the efficiency of BST operations is crucial.

  • Average Case (Balanced Tree):

    • Search: O(log N)
    • Insertion: O(log N)
    • Deletion: O(log N)
    • Traversal: O(N) (since all nodes must be visited)
    • Reasoning: In a balanced BST, each comparison effectively halves the search space, similar to binary search on an array.
  • Worst Case (Unbalanced Tree):

    • Search: O(N)
    • Insertion: O(N)
    • Deletion: O(N)
    • Traversal: O(N)
    • Reasoning: If elements are inserted in a strictly ascending or descending order (e.g., 10, 20, 30, 40), the BST degenerates into a skewed tree resembling a linked list. In this scenario, operations require traversing every node in the worst case.

Note: The log N performance is the primary reason to use BSTs over simpler structures like linked lists or unsorted arrays for dynamic, ordered data. However, the O(N) worst case is a significant drawback, which led to the development of self-balancing BSTs like AVL Trees and Red-Black Trees. These trees guarantee O(log N) performance by automatically performing rotations to maintain balance.
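
To see the degeneration concretely, here is a small self-contained sketch (it uses its own minimal insert function and a height helper rather than the BinarySearchTree class above, just so it runs on its own) comparing tree height for sorted versus shuffled insertion:

import random

class SimpleNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    if root is None:
        return SimpleNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def height(node):
    return 0 if node is None else 1 + max(height(node.left), height(node.right))

keys = list(range(1, 128))

sorted_root = None
for k in keys:  # ascending insertion: every new node goes to the right
    sorted_root = insert(sorted_root, k)

random.seed(42)
shuffled = keys[:]
random.shuffle(shuffled)
shuffled_root = None
for k in shuffled:  # random insertion: roughly balanced on average
    shuffled_root = insert(shuffled_root, k)

print(f"Height after sorted insertion:   {height(sorted_root)}")    # 127: effectively a linked list
print(f"Height after shuffled insertion: {height(shuffled_root)}")  # far smaller, in line with O(log N)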

Real-World Applications of BSTs

Despite the potential for unbalance, BSTs (and their self-balancing variants) are widely used:

  • Databases and File Systems: For indexing and quick retrieval of records.
  • Symbol Tables/Dictionaries/Maps: In programming languages, where key-value pairs need efficient lookup. Many standard library implementations of maps (like C++ std::map or Java TreeMap) are often built using Red-Black Trees.
  • Priority Queues: Although heaps are more common, BSTs can implement priority queues, allowing efficient retrieval of min/max elements.
  • Routing Tables: In networking, to store and quickly look up routing paths.
  • Computational Geometry: For solving problems like finding nearest neighbors or range queries (a small range-query sketch follows this list).
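
To make the range-query use case concrete, here is a hedged sketch of a helper (range_query is a name introduced here, not a method of the BinarySearchTree class above) that collects every key within [low, high], pruning subtrees that cannot contain matches:

def range_query(node, low, high, result=None):
    """Collect all keys in [low, high] from the subtree rooted at `node`, in sorted order."""
    if result is None:
        result = []
    if node is None:
        return result
    if node.key > low:  # the left subtree may still hold keys >= low
        range_query(node.left, low, high, result)
    if low <= node.key <= high:
        result.append(node.key)
    if node.key < high:  # the right subtree may still hold keys <= high
        range_query(node.right, low, high, result)
    return result

# Usage: range_query(bst.root, 30, 60) returns every key between 30 and 60, already sorted.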

Limitations and What’s Next

As highlighted, the biggest limitation of a plain BST is its potential for becoming unbalanced. When this happens, all the O(log N) average-case benefits vanish, and performance degrades to O(N).

For production systems where consistent performance is critical, you would almost always opt for a self-balancing BST. The most common ones are:

  • AVL Trees: Named after Adelson-Velsky and Landis, these were the first self-balancing BSTs. They maintain balance by ensuring that, for any node, the heights of its left and right subtrees differ by at most 1 (a quick check for this condition is sketched after this list).
  • Red-Black Trees: More complex to implement than AVL trees but used extensively in practice (e.g., in the Linux kernel, the C++ STL, and Java’s TreeMap and TreeSet). They maintain balance by enforcing a set of properties on node colors (red or black).

These self-balancing trees add more complexity to the insertion and deletion algorithms but guarantee logarithmic time complexity for all operations.
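
As a small illustration of the AVL balance condition mentioned above (this is only a check, not a full AVL tree with rotations), here is a sketch:

def subtree_height(node):
    return 0 if node is None else 1 + max(subtree_height(node.left), subtree_height(node.right))

def is_avl_balanced(node):
    """True if, for every node, the left and right subtree heights differ by at most 1."""
    if node is None:
        return True
    if abs(subtree_height(node.left) - subtree_height(node.right)) > 1:
        return False
    return is_avl_balanced(node.left) and is_avl_balanced(node.right)

# is_avl_balanced(bst.root) is True for the balanced tree from the insertion example,
# and False for a tree built by inserting 10, 20, 30, 40 in ascending order.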

Conclusion

Binary Search Trees are a foundational data structure that every developer should understand. Their elegant recursive definition, combined with their average-case efficiency for core operations, makes them incredibly powerful for managing ordered data.

You’ve now seen how to implement the Node structure and the crucial insert, search, inorder_traversal, and delete operations in Python. You also understand the critical difference between average-case O(log N) performance and the worst-case O(N) for an unbalanced tree, setting the stage for learning about self-balancing variants.

Keep experimenting with the code examples. The best way to internalize these concepts is by writing, running, and debugging them yourself. Happy coding!
