This week we covered chapters 15 and 16 of A Common-Sense Guide to Data Structures and Algorithms, 2nd ed., by Jay Wengrow (link). I’ve been coding along in Python on my GitHub.
Our topics today were binary search trees and heaps.
Speeding Up All the Things with Binary Search Trees
- The power of this data structure is that its data is always kept in sorted order, which is what lets search, insertion, and deletion all run in O(log N) time on a balanced tree.
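To see the “always sorted” property concretely, here’s a minimal BST sketch (my own code, not the book’s implementation): an in-order traversal visits the values in sorted order no matter what order they were inserted.

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    # Smaller values go left, larger (or equal) go right.
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def in_order(root):
    # Left subtree, then node, then right subtree: yields sorted order.
    if root is None:
        return []
    return in_order(root.left) + [root.value] + in_order(root.right)

root = None
for v in [50, 25, 75, 10, 33]:
    root = insert(root, v)

print(in_order(root))  # [10, 25, 33, 50, 75]
```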
Keeping Your Priorities Straight with Heaps
- “However, the bottom row can have empty positions, as long as there aren’t any nodes to the right of these empty positions.” I had to read this a few times to understand it. Another way to phrase it: if you scan any level of the heap from left to right, there are no gaps; only the bottom row may be incomplete, and its nodes are packed to the left.
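When the heap is stored as an array (levels have widths 1, 2, 4, …), this “no gaps” rule falls out automatically: slicing the array into levels shows every level is full except possibly the last, which is filled from the left. A quick sketch of that slicing (my own helper, not from the book):

```python
def levels(heap):
    # Split a heap's backing array into its tree levels:
    # widths 1, 2, 4, 8, ... until the array runs out.
    out, i, width = [], 0, 1
    while i < len(heap):
        out.append(heap[i:i + width])
        i += width
        width *= 2
    return out

# Every level is contiguous; only the last one can be short.
print(levels([10, 8, 9, 4, 7, 5]))  # [[10], [8, 9], [4, 7, 5]]
```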
- The concept of weak ordering is interesting. The author points out that you typically don’t search a heap; you only care about what’s on top (the root).
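Python’s standard library illustrates this “only the top matters” usage pattern. `heapq` implements a min-heap on a plain list: you peek at index 0 or pop the smallest item, but there’s no search API at all (the example data here is my own).

```python
import heapq

# Tuples compare by their first element, so the priority number decides order.
tasks = [(3, "low"), (1, "urgent"), (2, "medium")]
heapq.heapify(tasks)  # rearrange the list into min-heap order, O(n)

print(tasks[0])              # peek at the top: (1, 'urgent')
print(heapq.heappop(tasks))  # (1, 'urgent')
print(heapq.heappop(tasks))  # (2, 'medium')
```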
- The heap grows downward: when a level is full, you start a new one and fill it in from left to right.
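In the array representation, “the next open slot left to right” is just the end of the list, so insertion appends and then bubbles the value up. A minimal max-heap insert sketch (my own code, following the standard bubble-up algorithm):

```python
def heap_push(heap, value):
    # Place the new value in the next open slot (the end of the
    # array), then swap it upward while it exceeds its parent.
    heap.append(value)
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2
        if heap[i] <= heap[parent]:
            break  # heap condition restored
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent

heap = []
for v in [4, 7, 10, 2, 9]:
    heap_push(heap, v)

print(heap[0])  # the root is always the maximum: 10
```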
- The author claims heaps can be implemented with linked nodes instead of an array, but I couldn’t find an example. Arrays are probably easier because of proximity: parent and child positions fall out of simple index arithmetic, whereas a linked structure would need extra metadata about the heap itself to track the “last” node. Houston asked why it would matter to the consumer whether it’s an array or a linked structure. This can be an example of a leaky abstraction.
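The proximity argument in a nutshell: with an array, the parent/child relationships and the “last” node all reduce to index arithmetic, no pointers or bookkeeping required (a sketch with my own sample data):

```python
heap = [10, 9, 7, 2, 4]  # a max-heap stored as a plain array

def parent(i): return (i - 1) // 2
def left(i):   return 2 * i + 1
def right(i):  return 2 * i + 2

last = len(heap) - 1  # the "last" node is just the final slot
print(heap[last])                     # 4
print(heap[parent(last)])             # its parent: 9
print(heap[left(0)], heap[right(0)])  # the root's children: 9 7
```

A linked-node heap has none of these shortcuts, which is presumably why every mainstream implementation uses an array underneath.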