Why this is O(log n)
Walking from a leaf to the root of a balanced tree of n nodes takes O(log n) steps, because a height of O(log n) is exactly what "balanced" means.
Caveat. For a *degenerate* tree (a linked list dressed up as a tree), the same code is O(n). The complexity depends on the structure of the input, not just the algorithm. CLRS calls this out as the canonical case where balance invariants matter.
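A minimal sketch of that contrast, using a hypothetical `Node`/`insert`/`search_depth` trio (not the puzzle's actual snippet): the same descent code visits O(log n) nodes in a balanced tree and O(n) in a degenerate one.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Plain BST insertion: no rebalancing, so insertion order decides the shape."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def search_depth(root, key):
    """Number of nodes visited while descending to `key`."""
    depth = 0
    while root is not None:
        depth += 1
        if key == root.key:
            return depth
        root = root.left if key < root.key else root.right
    return depth

# Balanced: insertion order chosen so the 15 keys form a complete tree.
balanced = None
for k in [8, 4, 12, 2, 6, 10, 14, 1, 3, 5, 7, 9, 11, 13, 15]:
    balanced = insert(balanced, k)

# Degenerate: sorted insertion yields a right-leaning chain, i.e. a linked list.
degenerate = None
for k in range(1, 16):
    degenerate = insert(degenerate, k)

print(search_depth(balanced, 15))    # 4 nodes: ~log2 of 15
print(search_depth(degenerate, 15))  # 15 nodes: the whole chain
```

Same `search_depth` code both times; only the input's structure changes the cost.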
How to recognize logarithmic complexity in the wild
O(log n) is the complexity of repeated halving. Classic shape: a loop whose counter is divided (or multiplied) by a constant factor each iteration. Binary search and balanced-tree descent are the textbook examples.
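The halving shape in miniature, as a textbook binary search sketch:

```python
def binary_search(xs, target):
    """The live range [lo, hi) shrinks by half each iteration,
    so the loop body runs O(log n) times for a sorted list of n items."""
    lo, hi = 0, len(xs)
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return -1

evens = list(range(0, 100, 2))       # 50 sorted elements
print(binary_search(evens, 34))      # 34 sits at index 17
print(binary_search(evens, 35))      # odd number absent: -1
```

Spotting `mid = (lo + hi) // 2` (or `i //= 2`, `i *= 2`) in a loop is usually the giveaway.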
The eight rungs of the ladder
Big-O classifies the asymptotic growth of a function, not its wall-clock time. The ladder Bugdle uses has eight rungs ordered from cheapest to most ruinous: O(1), O(log n), O(n), O(n log n), O(n²), O(n³), O(2ⁿ), O(n!). The puzzle's answer sits at rung 2 of 8. With higher/lower hints, k perfectly-read guesses can pin down at most 2ᵏ − 1 rungs (each guess either hits or splits the remainder in two), so eight rungs need four guesses in the worst case; 2³ − 1 = 7 falls one short. The four-guess budget is therefore exactly enough for an errorless binary search over the ladder.
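A small simulation of that guessing game, with the rung list and higher/lower feedback model taken from the description above (the function name is ours, not the game's): binary-searching the ladder hits a worst case of four guesses.

```python
RUNGS = ["O(1)", "O(log n)", "O(n)", "O(n log n)",
         "O(n^2)", "O(n^3)", "O(2^n)", "O(n!)"]

def guesses_needed(answer_index):
    """Binary-search the rungs using higher/lower feedback;
    returns how many guesses a perfect player needs."""
    lo, hi = 0, len(RUNGS) - 1
    guesses = 0
    while True:
        guess = (lo + hi) // 2
        guesses += 1
        if guess == answer_index:
            return guesses
        if guess < answer_index:
            lo = guess + 1   # hint: the answer is costlier
        else:
            hi = guess - 1   # hint: the answer is cheaper

worst = max(guesses_needed(i) for i in range(len(RUNGS)))
print(worst)  # 4: eight rungs exceed the 2**3 - 1 = 7 resolvable in three guesses
```

Note the asymmetry: most rungs fall in three guesses or fewer, but one (the last one probed) forces a fourth.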
Common confusables
Nested loops aren't automatically quadratic — the inner loop has to range over n, not a constant. A loop that always runs ten iterations is still O(1) work per outer iteration. Similarly, a recursive function isn't automatically exponential just because it calls itself twice — if the recursion has overlapping subproblems and you memoise, you collapse back to polynomial. Always ask: what does n really mean here, and how many distinct subproblems are there?
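Both confusables in one sketch (illustrative functions of our own, assuming Python's standard `functools.lru_cache` for the memoisation):

```python
from functools import lru_cache

def constant_inner(xs):
    """Nested loops, but the inner loop runs a fixed 10 iterations:
    total work is 10 * n, i.e. O(n), not O(n^2)."""
    total = 0
    for x in xs:
        for _ in range(10):   # constant bound, independent of len(xs)
            total += x
    return total

@lru_cache(maxsize=None)
def fib(n):
    """Two recursive calls, but the subproblems overlap and are cached,
    so each distinct n is computed once: O(n) work, not O(2^n)."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(constant_inner([1, 2, 3]))  # 60: ten copies of each element
print(fib(20))                    # 6765, computed in ~20 calls, not ~2**20
```

Without the `lru_cache` line, the same `fib` really is exponential; the memo, not the call shape, is what decides the rung.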
External reference: Big O notation — Wikipedia.