@bodil
A data structure that preserves its previous versions when changed.
How is vector formed?
What's the problem with cons lists anyway?
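The problem, in a hedged sketch (not from the talk, using Python tuples as cons cells): indexing a cons list means walking one link per element, so lookup is O(n).

```python
# A cons list as nested pairs: (head, tail), with None as the empty list.
def cons(head, tail):
    return (head, tail)

def nth(lst, i):
    # Walk i links from the front: i steps, hence linear time.
    while i > 0:
        lst = lst[1]
        i -= 1
    return lst[0]

xs = cons(1, cons(2, cons(3, None)))  # the list (1 2 3)
```

Prepending is O(1), but random access, update, and append all degrade to O(n), which is what trie-based vectors set out to fix.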
We need a way to talk about the efficiency of operations on data structures.
Big O notation!
Same amount of work regardless of the size of the data structure.
Number of operations proportional to the size (n) of the data structure.
Number of operations logarithmic to the size of the data structure.
O(1) = constant time
O(log n) = logarithmic time
O(n) = linear time
O(n log n) = linear × logarithmic time
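To make these classes concrete, a small illustration (assumed, not from the talk): counting the steps a linear scan takes versus a binary search over the same sorted data.

```python
def linear_steps(xs, target):
    # O(n): in the worst case, visit every element.
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

def binary_steps(xs, target):
    # O(log n): halve the search range on every step.
    steps, lo, hi = 0, 0, len(xs) - 1
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            break
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

xs = list(range(1024))
# Worst case: 1024 steps linearly, but only ~log2(1024) = 10 or so halvings.
```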
Amortisation: spreading the cost of an occasional expensive operation over many cheap ones.
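The classic example of amortisation (a sketch under my own assumptions, not from the talk) is a growable array that doubles its capacity: most pushes cost one step, an occasional push triggers an O(n) copy, but the total copy work over n pushes stays under 2n, so each push is O(1) amortised.

```python
class GrowableArray:
    def __init__(self):
        self.items = [None]  # capacity 1 to start
        self.size = 0
        self.copies = 0      # total elements copied during resizes

    def push(self, x):
        if self.size == len(self.items):
            # Full: double the capacity, copying every existing element.
            self.items = self.items + [None] * len(self.items)
            self.copies += self.size
        self.items[self.size] = x
        self.size += 1

arr = GrowableArray()
for i in range(1024):
    arr.push(i)
# Copies made at sizes 1, 2, 4, ..., 512: a total of 1023, under 2 per push.
```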
car = contents of address register
cdr = contents of decrement register
cadr = car of cdr = second element
caddr = car of cdr of cdr = third element
cddr = cdr of cdr = third element onward
caar = car of car = first element of first element
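The accessors above can be sketched directly (an assumption of mine, again modelling cons cells as Python tuples):

```python
def car(p): return p[0]                 # head of a pair
def cdr(p): return p[1]                 # tail of a pair

def cadr(p): return car(cdr(p))         # second element
def caddr(p): return car(cdr(cdr(p)))   # third element
def cddr(p): return cdr(cdr(p))         # third element onward
def caar(p): return car(car(p))         # first element of first element

xs = (1, (2, (3, (4, None))))           # the list (1 2 3 4)
```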
TRIES
and the hardest problem in computer science
Tries are prefix- or radix-based search trees.
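A minimal prefix-trie sketch (my own illustration, not code from the talk): each node maps one character to a child, so lookup walks only the key's prefix, independent of how many keys are stored.

```python
class Trie:
    def __init__(self):
        self.children = {}
        self.terminal = False  # marks the end of a stored key

    def insert(self, key):
        node = self
        for ch in key:
            node = node.children.setdefault(ch, Trie())
        node.terminal = True

    def contains(self, key):
        node = self
        for ch in key:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return node.terminal

t = Trie()
t.insert("car")
t.insert("cadr")
```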
Phil Bagwell
Rich Hickey
INVARIANT:
Only the rightmost node at each level can be underfull.
Push/pop back: O(logₖ n)
Push/pop front: O(n)
Lookup: O(logₖ n)
Concat: O(n)
Split: O(n)
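The O(logₖ n) lookup above can be sketched as a radix descent (assumptions: branching factor k = 32 as in Clojure's vectors, so 5 index bits per level; plain Python lists stand in for tree nodes):

```python
BITS = 5
WIDTH = 1 << BITS   # 32-way branching
MASK = WIDTH - 1

def lookup(node, index, shift):
    # Peel off BITS of the index per level, most significant bits first.
    # Depth is log32(n), hence O(log32 n) lookups.
    while shift > 0:
        node = node[(index >> shift) & MASK]
        shift -= BITS
    return node[index & MASK]

# A two-level tree holding 0..127: a root of four full leaves.
leaves = [list(range(i, i + WIDTH)) for i in range(0, 128, WIDTH)]
root = leaves
```

For index 37, the top 5 bits select child 1 of the root and the bottom 5 bits select slot 5 of that leaf.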
Push/pop back: O(1) amortised
Push/pop front: O(n)
Lookup: O(logₖ n)
Concat: O(n)
Split: O(n)
INVARIANT:
Rightmost and leftmost nodes can be underfull.
Push/pop back: O(logₖ n)
Push/pop front: O(logₖ n)
Lookup: O(logₖ n)
Concat: O(n)
Split: O(logₖ n)
Push/pop back: O(1) amortised
Push/pop front: O(1) amortised
Lookup: O(logₖ n)
Concat: O(n)
Split: O(logₖ n)
DO YOU HAVE A MOMENT TO TALK ABOUT
RELAXED RADIX BALANCED TREES
Push/pop: O(logₖ n)
Lookup: O(logₖ n)
Concat: O(logₖ n)
Split: O(logₖ n)
Push/pop: O(1) amortised
Lookup: O(logₖ n)
Concat: O(logₖ n)
Split: O(logₖ n)
BELKA & STRELKA
Bagwell: Ideal Hash Trees
L'Orange: Understanding Clojure's Vectors
Stucki, Rompf, Ureche, Bagwell: RRB Vector
Okasaki: Purely Functional Data Structures
Acar, Charguéraud, Rainey: Chunked Sequences