CSE 326: Data Structures Priority Queues - Washington


CSE 326: Data Structures

Priority Queues Leftist Heaps & Skew Heaps

James Fogarty

Spring 2009

Heap Merge Operation

• Useful operation for priority queues
– Combining two sets of priorities (perhaps to balance them or because of a failure)

• Also useful to simplify heap implementation
– Implement Insert, DeleteMin in terms of Merge

• Insert – Create one-element heap, merge with existing
• DeleteMin – Remove root, merge children
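A minimal sketch (not from the slides) of Insert and DeleteMin expressed in terms of Merge, using a pointer-based heap node. The merge shown here is the naive "smaller root wins, recurse on one subtree" version introduced later in the lecture; the names `Node`, `merge`, `insert`, and `delete_min` are illustrative.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def merge(a, b):
    """Merge two heap-ordered trees; the smaller root becomes the new root."""
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:
        a, b = b, a                      # keep the smaller root on top
    a.right = merge(a.right, b)          # retain a.left, merge into a.right
    return a

def insert(root, key):
    # Insert: create a one-element heap, merge it with the existing heap
    return merge(root, Node(key))

def delete_min(root):
    # DeleteMin: remove the root, merge its two children
    return root.key, merge(root.left, root.right)
```

Repeated delete_min calls then yield the elements in sorted order, since the minimum is always at the root.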

Heap Merge Operation

• first attempt: insert each element from the smaller heap into the larger
– runtime: O(n log n) worst case, O(n) average
• second attempt: concatenate the heap arrays and run buildHeap
– runtime: O(n) worst case
• another approach: retain the existing information in the heaps
– array shifting keeps us at O(n)

Heap Properties

• Recall our two heap properties
– Structure and Ordering

• It is structure that has been holding us back
– We have been throwing out existing ordering information so we can maintain structure

• Why do we have the structure property?

• Let’s see what happens if we abandon it

Heap Merge Basics

• Use a pointer-based binary heap
– Each node has Left and Right child pointers

• Merge two heaps by:
– Choose one root as the merged root
• (choose the smaller to preserve the order property)
– Retain one of its subtrees
– Merge its other subtree with the other root

Heap Merge Basics

[Diagram: merging heaps T1 (root a, subtrees L1 and R1) and T2 (root b, subtrees L2 and R2). When a < b, a becomes the merged root, L1 is retained, and R1 is recursively merged with T2.]

Heap Merge Basics

• How expensive is the merge?
– Constant cost per level recursively visited
– Worst case:
• Deep path in the tree, you recurse down it
• Whoops: O(n)

• Ok, so maybe completely ignoring structure is not the best approach to this
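To see that O(n) worst case concretely, this sketch (mine, not from the slides) builds two heaps that are each one long right path and counts the recursive steps of the naive merge; `right_spine` and the step-counting `merge` are illustrative names.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def right_spine(sorted_keys):
    """Build a heap that is one long right path (a worst case for merge)."""
    root = None
    for k in reversed(sorted_keys):
        root = Node(k, None, root)
    return root

def merge(a, b):
    """Naive merge; also returns the number of recursive steps taken."""
    if a is None or b is None:
        return (a if a is not None else b), 0
    if b.key < a.key:
        a, b = b, a
    a.right, steps = merge(a.right, b)
    return a, steps + 1

n = 100
h1 = right_spine(list(range(0, 2 * n, 2)))   # even keys: 0, 2, ..., 198
h2 = right_spine(list(range(1, 2 * n, 2)))   # odd keys:  1, 3, ..., 199
merged, steps = merge(h1, h2)
# The recursion zig-zags down both right paths, consuming one node per
# step until one spine is exhausted: 2n - 1 steps, i.e., linear in n.
```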

Leftist Heaps

Idea: Avoid recursing down a deep path in the heap

Strategy: Focus all heap maintenance in a shallow portion of the heap

Leftist heaps:
1. Most nodes are on the left
2. All the merging work is done on the right

Definition: Null Path Length

The null path length (npl) of a node x is the number of nodes between x and a null in its subtree; equivalently, npl(x) is the minimum distance from x to a descendant with 0 or 1 children.

• npl(null) = -1
• npl(leaf) = 0
• npl(single-child node) = 0

[Diagram: example tree with the npl value labeled at each node.]

Equivalent definitions:
1. npl(x) is the height of the largest complete subtree rooted at x
2. npl(x) = 1 + min{npl(left(x)), npl(right(x))}
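The recursive definition above translates directly into code. A minimal sketch (names illustrative, not from the slides):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def npl(x):
    """Null path length: npl(x) = 1 + min(npl(left(x)), npl(right(x)))."""
    if x is None:
        return -1                      # npl(null) = -1 by definition
    return 1 + min(npl(x.left), npl(x.right))

leaf = Node(10)                        # no children        -> npl 0
single = Node(5, leaf, None)           # one child (a null) -> npl 0
full = Node(1, single, Node(7))        # two children       -> npl 1
```

Note how the null child of `single` drives its npl to 0 regardless of how deep its other subtree is.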

Leftist Heap Properties

• Heap-order property
– parent's value is ≤ child values (same as before)
– result: minimum element is at the root

• Leftist property
– For every node x, npl(left(x)) ≥ npl(right(x))
– result: tree is at least as “heavy” on the left as on the right

Are leftist trees… complete? balanced? (Neither is required – a leftist tree can be arbitrarily deep on the left.)

Are These Leftist?

[Diagram: several example trees with npl values labeled at each node; some satisfy the leftist property and some violate it.]

Every subtree of a leftist tree is leftist!

Why the leftist property?

• Because it guarantees that:
– The right path is really short compared to the number of nodes in the tree
– For a leftist tree of N nodes, the right path has at most log(N+1) nodes
• See the two proofs we’re about to skip

• Win – Perform all work on the right path, thus obtaining O(log N) merge

Right Path in a Leftist Tree is Short (#1)

Claim: The right path is as short as any path in the tree.

Proof: (By contradiction)
Pick a shorter path of length D1 < D2, where D2 is the length of the right path. Say it diverges from the right path at node x, going into left subtree L while the right path continues into right subtree R.
• npl(L) ≤ D1 - 1, because of the path of length D1 - 1 from L down to a null
• npl(R) ≥ D2 - 1, because every node on the right path is leftist
Then npl(left(x)) < npl(right(x)) – the leftist property at x is violated!

[Diagram: node x with left subtree L on a path of length D1 and right subtree R on the right path of length D2.]

Right Path in a Leftist Tree is Short (#2)

Claim: If the right path has r nodes, then the leftist tree has at least 2^r - 1 nodes.

Proof: (By induction)
Base case: r = 1. The tree has at least 2^1 - 1 = 1 node.
Inductive step: assume true for r' < r. Prove for a tree with a right path of at least r nodes.
1. Right subtree: right path of r-1 nodes ⇒ at least 2^(r-1) - 1 right subtree nodes (by induction)
2. Left subtree: also a right path of length at least r-1 (prior slide) ⇒ at least 2^(r-1) - 1 left subtree nodes (by induction)
Total tree size: (2^(r-1) - 1) + (2^(r-1) - 1) + 1 = 2^r - 1

Leftist Heap Merge

• Merge two heaps by:
– Choose the smaller root as the merged root
– Retain its left subtree (don’t touch it)
– Merge its right subtree with the other root

• Done?
– Need to preserve the leftist property

Merge Continued

Let R’ = Merge(R1, T2). If npl(R’) > npl(L1), swap the children of the merged root so that L1 becomes the right subtree and R’ the left.

[Diagram: root a with children L1 and R’ before the check; after the swap, a has children R’ and L1.]

runtime: O(log n)
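Putting the pieces together, a minimal sketch of leftist-heap merge (names such as `LNode` and `insert` are illustrative, not from the slides). Each node caches its npl; after the recursive merge on the right, children are swapped whenever npl(right) would exceed npl(left).

```python
class LNode:
    def __init__(self, key):
        self.key, self.left, self.right, self.npl = key, None, None, 0

def npl(x):
    return -1 if x is None else x.npl

def merge(a, b):
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:
        a, b = b, a                          # smaller root becomes the root
    a.right = merge(a.right, b)              # left subtree is untouched
    if npl(a.right) > npl(a.left):
        a.left, a.right = a.right, a.left    # swap to restore leftist property
    a.npl = 1 + npl(a.right)                 # cache the updated npl
    return a

def insert(root, key):
    return merge(root, LNode(key))
```

After inserting N elements this way, the right path stays within the log(N+1) bound proved above, so each merge touches only O(log N) nodes.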

Leftist Merge Example

[Diagram: worked example merging two small leftist heaps over the elements 3, 5, 7, 8, 10, 12, 14. The merge recurses down the right paths; a special case arises when one right subtree is null.]

Sewing Up the Example

[Diagram: the recursion unwinds, filling in npl values and swapping children wherever the leftist property would otherwise be violated.]

Done?

Finally…

[Diagram: the final merged leftist heap, rooted at 3, with npl values labeled at each node.]

Seems like a lot of work

• Tradeoff of the leftist heap
– Guaranteed O(log n) merge
– Comes at a cost:
• Added storage for caching subtree npl values
• Added complexity/logic to maintain and check npl
– And the reality is:
• The right side is “often” heavy and requires a switch

• Any wacky ideas in the room?

Skew Heaps

• “Blindly” adjusting version of leftist heaps

• Always switch the left and right children of the root selected to be the merged root

• Don’t keep track of anything, don’t check anything, just always switch them

Skew Heap Merge

[Diagram: merging T1 (root a, subtrees L1 and R1) and T2 (root b, subtrees L2 and R2). When a < b, a becomes the merged root, the merge recurses on its right subtree, and its children are then swapped.]

Only one step per iteration, with children always switched
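A minimal sketch of skew-heap merge (names illustrative): it is the naive merge from earlier with exactly one change, the merged root's children are always swapped, with no npl bookkeeping at all.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def merge(a, b):
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:
        a, b = b, a                      # smaller root becomes the root
    a.right = merge(a.right, b)          # one recursive step...
    a.left, a.right = a.right, a.left    # ...then blindly swap the children
    return a

def insert(root, key):
    return merge(root, Node(key))

def delete_min(root):
    return root.key, merge(root.left, root.right)
```

A single merge can still be slow, but the blind swapping keeps heavy right paths from persisting across operations, which is what the amortized analysis below exploits.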

Skew Heap Merge Example

[Diagram: worked example merging two skew heaps over the elements 3, 5, 7, 8, 10, 12, 14; at each step of the recursion the merged root’s children are swapped.]

Amortized Complexity

Suppose you run an algorithm M times and average the running times.

Amortized complexity: the maximum total number of steps the algorithm takes, in the worst case, over M consecutive operations on inputs of size N, divided by M.

If M operations take total O(M log N) time in the worst case, amortized time per operation is O(log N).

Skew heaps have amortized complexity O(log N).