Lecture 5/15: Sorting
May 15, 2020
Sorting
CS 106B: Programming Abstractions
Spring 2020, Stanford University Computer Science Department
Lecturers: Chris Gregg and Julie Zelenski
Slide 2
Announcements
- Mid-Quarter Assessments are in full swing! Congratulations to those who have already completed their assessments, and best of luck to everyone doing one today or over the weekend! Here are a couple of tips for those of you who have yet to go:
- Make sure to double check that your assessment release time is correctly displayed on the online access portal. This time should be 5 minutes before the time slot that you signed up for on Paperless. Please let Nick know ASAP if the displayed time is incorrect.
- You'll be working through the assessment questions during your preparation time using the BlueBook software. Keep in mind these two things that may differ from how you have used BlueBook previously:
- Do not close BlueBook after your prep time is over! You will want to have it open while you discuss with your section leader.
- Do not submit your exam when you're done! We are not collecting submissions, so you will likely run into an error.
- Assignment 5 will be released over the weekend and due in a week (Friday, May 22).
Introduction to Sorting
- Insertion Sort
- Selection Sort
- Merge Sort
- Quicksort
- Other sorts you might want to look at:
- Radix Sort
- Shell Sort
- Tim Sort
- Heap Sort (we will cover heaps later in the course)
- Bogosort
- A sort you don't want to look at: Bubble Sort
Slide 3
Sorting!
- In general, sorting consists of putting elements into a particular order; most often the order is numerical or lexicographical (i.e., alphabetical).
- In order for a list to be sorted, it must:
- be in nondecreasing order (each element must be no smaller than the previous element)
- be a permutation of the input
- Fundamentally, comparison sorts can do no better than O(n log n) comparisons in the worst case.
- We also need to consider the space complexity: some sorts can be done in place, meaning the sorting does not take extra memory. This can be an important factor when choosing a sorting algorithm!
- A sort can be stable or unstable: a stable sort retains the relative order of elements with the same key from the original unsorted list to the final, sorted, list.
- There are some phenomenal sorting demonstrations available online.
Slide 4
Sorts
- There are many different ways to sort elements in a list.
- We will look at the following sorts in detail:
- Insertion sort
- Selection sort
- Mergesort (which you basically saw in assignment 4!)
- Quicksort
Slide 5
Insertion Sort
- Insertion sort: orders a list of values by repetitively inserting a particular value into a sorted subset of the list
- More specifically:
  - consider the first item to be a sorted sublist of length 1
  - insert the second item into the sorted sublist, shifting the first item if needed
  - insert the third item into the sorted sublist, shifting items 1-2 as needed
  - …
  - repeat until all values have been inserted into their proper positions
Slide 6
Insertion Sort Algorithm
- Assume we have the following values, in this order:
9 5 10 8 12 11 14 2 22 43
- We want to rearrange (sort) the values so that the lowest is on the left, and the highest is on the right, and the entire list is ordered correctly.
- The insertion sort algorithm:
- iterate through the list (starting with the second element)
- at each element, shift the larger neighbors to its left one position to the right until the proper place for the element is found, and place it there.
- The 9 is already in place (as far as we know), so we start with the 5.
- We compare the 5 to the one below it, and see that we have to shuffle the 9 to the right and put the 5 into index 0:
5 9 10 8 12 11 14 2 22 43
- Next, we look at the 10. When we compare it to the 9 to its left, we see that we don't need to make any changes.
- Next, we look at the 8. Compared with its left neighbor, 10, we see that we need to shift 10 to the right. We also need to shift the 9 to the right (because it, too, is greater than 8). Finally, we put the 8 into index 1:
5 9 10 8 12 11 14 2 22 43
becomes
5 8 9 10 12 11 14 2 22 43
- Next, we look at the 12, and it is in the right place compared to the 10 and does not need to move.
- Looking at 11, we see that we need to move the 12 to the right, and put the 11 into index 4.
5 8 9 10 12 11 14 2 22 43
becomes
5 8 9 10 11 12 14 2 22 43
- The 14 is in the correct place (bigger than 12)
- Now we look at the 2. We traverse to the left and see that we need to move the 14, 12, 11, 10, 9, 8, and 5 to the right! Then, we put the 2 into index 0:
5 8 9 10 11 12 14 2 22 43
becomes
2 5 8 9 10 11 12 14 22 43
- The 22 does not need to be moved (it is greater than 14)
- Finally, the 43 doesn't need to be moved, because it is greater than 22.
- We have sorted the array!
- Complexity:
- Worst performance: O(n^2) (a reverse-sorted list forces each element to shift past every element before it)
- Best performance: O(n) (an already-sorted list needs only one comparison per element)
- Average performance: O(n^2), but very fast for small arrays, as it is a simple algorithm.
Slide 7
Insertion sort Code
// Rearranges the elements of v into sorted order.
void insertionSort(Vector<int>& v) {
    for (int i = 1; i < v.size(); i++) {
        int temp = v[i];
        // slide elements right to make room for v[i]
        int j = i;
        while (j >= 1 && v[j - 1] > temp) {
            v[j] = v[j - 1];
            j--;
        }
        v[j] = temp;
    }
}
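To see it run end to end, here is a minimal driver (our sketch, not lecture code) that sorts the slide 6 data; it assumes a recent Stanford C++ library, whose Vector supports initializer lists and stream output:

#include <iostream>
#include "vector.h"  // Stanford Vector, as used in the lecture code

int main() {
    Vector<int> v = {9, 5, 10, 8, 12, 11, 14, 2, 22, 43};  // data from slide 6
    insertionSort(v);
    std::cout << v << std::endl;  // expected: {2, 5, 8, 9, 10, 11, 12, 14, 22, 43}
    return 0;
}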
Slide 8
Selection Sort
- Selection Sort is another in-place sort that has a simple algorithm:
- Find the smallest item in the list, and exchange it with the left-most unsorted element.
- Repeat the process from the first unsorted element.
- Here is a good animation of selection sort.
Slide 9
Selection Sort Example
- Assume we have the following values, in this order:
9 5 10 8 12 11 14 2 22 43
- The algorithm:
- Find the smallest item in the list, and exchange it with the left-most unsorted element.
- Repeat the process from the first unsorted element.
- Selection sort is particularly slow, because it needs to go through the entire unsorted portion of the list each time to find the smallest item.
- For the array above, we first look through each element one at a time, and we determine that the 2 is the lowest. So, we swap it with the 9 at index 0:
2 5 10 8 12 11 14 9 22 43
- Next, starting from index 1, we look through all the rest of the elements and find that the 5 is smallest, and we don't have to swap because it is already at index 1.
- Next, we loop from index 2, and we find that the 8 is lowest. We then swap with index 2:
2 5 8 10 12 11 14 9 22 43
- We continue this process, and end up with the following steps:
2 5 8 9 12 11 14 10 22 43   (swapped the 9 into index 3)
2 5 8 9 10 11 14 12 22 43   (swapped the 10 into index 4)
2 5 8 9 10 11 14 12 22 43   (no swap necessary)
2 5 8 9 10 11 12 14 22 43   (swapped the 12 into index 6)
2 5 8 9 10 11 12 14 22 43   (no swap necessary)
2 5 8 9 10 11 12 14 22 43   (no swap necessary)
2 5 8 9 10 11 12 14 22 43   (no swap necessary)
- Complexity:
- Worst performance: O(n^2)
- Best performance: O(n^2)
- Average performance: O(n^2)
- Bottom line: this is a pretty terrible sort!
- However, there is one use case that makes it useful: if you want to find the "top X" elements for a very small X and a large n, selection sort actually works fine.
- It is easy to see why: it finds the top element, then the next, then the next, scanning the remaining list each time, so you can simply stop after X passes (see the sketch after the selection sort code below).
Slide 10
Selection sort Code
// Rearranges elements of v into sorted order
// using the selection sort algorithm.
void selectionSort(Vector<int>& v) {
    for (int i = 0; i < v.size() - 1; i++) {
        // find index of smallest remaining value
        int min = i;
        for (int j = i + 1; j < v.size(); j++) {
            if (v[j] < v[min]) {
                min = j;
            }
        }
        // swap smallest value into its proper place, v[i]
        if (i != min) {
            int temp = v[i];
            v[i] = v[min];
            v[min] = temp;
        }
    }
}
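Here is the "top X" sketch promised above: a hypothetical selectTopK variant (our code, not the lecture's) that is just selectionSort with the outer loop cut off after k passes, so it costs O(k * n) instead of O(n^2). Flip the comparison to get the k largest instead of the k smallest:

// After this runs, v[0..k-1] holds the k smallest elements in sorted order;
// the rest of the vector is left in arbitrary order.
void selectTopK(Vector<int>& v, int k) {
    for (int i = 0; i < k && i < v.size() - 1; i++) {
        // find index of smallest remaining value
        int min = i;
        for (int j = i + 1; j < v.size(); j++) {
            if (v[j] < v[min]) {
                min = j;
            }
        }
        // swap it into position i
        if (i != min) {
            int temp = v[i];
            v[i] = v[min];
            v[min] = temp;
        }
    }
}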
Slide 11
Merge Sort
- The next sort we are going to talk about is merge sort, which you've already seen in assignment 4!
- Merge sort is another comparison-based sorting algorithm and it is a divide-and-conquer sort.
- As you've seen (and done), merge sort can be coded recursively
- In essence, you are merging sorted lists, e.g.,
L1 = {3,5,11}
L2 = {1,8,10}
merge(L1,L2) = {1,3,5,8,10,11}
Slide 12
Merging two sorted lists
- Merging two sorted lists is straightforward:
L1: 3 5 11 15    L2: 1 8 10    Result so far: <empty>     (compare 3 and 1: take 1)
L1: 3 5 11 15    L2: 1 8 10    Result so far: 1           (compare 3 and 8: take 3)
L1: 3 5 11 15    L2: 1 8 10    Result so far: 1 3         (compare 5 and 8: take 5)
L1: 3 5 11 15    L2: 1 8 10    Result so far: 1 3 5       (compare 11 and 8: take 8)
L1: 3 5 11 15    L2: 1 8 10    Result so far: 1 3 5 8     (compare 11 and 10: take 10)
L1: 3 5 11 15    L2: 1 8 10    Result so far: 1 3 5 8 10  (L2 is exhausted)
- Because L2 is done, we simply have to take the rest (11 and 15, in this case) from L1, and we're done:
L1: 3 5 11 15    L2: 1 8 10    Result: 1 3 5 8 10 11 15
Slide 13
Merge Sort full algorithm
- Here is the full algorithm (which you performed for assignment 4):
- Divide the unsorted list into n sublists, each containing one element (a list of one element is considered sorted).
- Repeatedly merge sublists to produce new sorted sublists until there is only one sublist remaining. This will be the sorted list.
- Code (using vectors instead of queues):
// Rearranges the elements of v into sorted order using
// the merge sort algorithm.
void mergeSort(Vector<int> &vec) {
    int n = vec.size();
    if (n <= 1) return;
    Vector<int> v1;
    Vector<int> v2;
    for (int i = 0; i < n; i++) {
        if (i < n / 2) {
            v1.add(vec[i]);
        } else {
            v2.add(vec[i]);
        }
    }
    mergeSort(v1);
    mergeSort(v2);
    vec.clear();
    merge(vec, v1, v2);
}
- Here is the code to merge two vectors:
// Merges the left/right elements into a sorted result.
// Precondition: left/right are sorted, and vec is empty
void merge(Vector<int> &vec, Vector<int> &v1, Vector<int> &v2) {
    int n1 = v1.size();
    int n2 = v2.size();
    int p1 = 0;
    int p2 = 0;
    while (p1 < n1 && p2 < n2) {
        if (v1[p1] <= v2[p2]) {
            vec.add(v1[p1++]);
        } else {
            vec.add(v2[p2++]);
        }
    }
    while (p1 < n1) {
        vec.add(v1[p1++]);
    }
    while (p2 < n2) {
        vec.add(v2[p2++]);
    }
}
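As a quick check of the two functions above, here is a minimal driver (our sketch; assumes the Stanford Vector header) sorting the vector from the next slide:

#include <iostream>
#include "vector.h"  // Stanford Vector

int main() {
    Vector<int> v = {96, 6, 86, 15, 58, 35, 86, 4, 0};  // data from slide 14
    mergeSort(v);
    std::cout << v << std::endl;  // expected: {0, 4, 6, 15, 35, 58, 86, 86, 96}
    return 0;
}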
Slide 14
Merge Sort, full example
- Assume we have the following vector to sort:
[96 6 86 15 58 35 86 4 0]
- We recursively break into two halves until we are left with single elements (the base case):
[96 6 86 15 58 35 86 4 0]
[96 6 86 15]  [58 35 86 4 0]
[96 6]  [86 15]  [58 35]  [86 4 0]
[96]  [6]  [86]  [15]  [58]  [35]  [86]  [4 0]
                                         [4]  [0]
- Now that we have single-element vectors, we merge back up, two at a time:
[4] + [0] → [0 4]
[96] + [6] → [6 96]    [86] + [15] → [15 86]    [58] + [35] → [35 58]    [86] + [0 4] → [0 4 86]
[6 96] + [15 86] → [6 15 86 96]    [35 58] + [0 4 86] → [0 4 35 58 86]
[6 15 86 96] + [0 4 35 58 86] → [0 4 6 15 35 58 86 86 96]
Slide 15
Merge Sort: Time Complexity
- Time complexity for merge sort:
- Worst case: O(n log n)
- Best case: O(n log n)
- Average case (all cases): O(n log n)
- Why is it O(n log n)?
- We have log2 n levels, because we are splitting the array in two each time (divide and conquer).
- At each level, we have to work on all n elements.
- Therefore, the total complexity is O(n * log2 n), or O(n log n).
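The same bound falls out of the merge sort recurrence; here is a quick sketch, where $c$ is a constant covering the per-element divide and merge work at one level:

T(n) = 2T(n/2) + cn
     = 4T(n/4) + 2cn
     = \dots
     = 2^k T(n/2^k) + k \cdot cn

Setting $k = \log_2 n$ reaches the base case and gives $T(n) = cn + cn \log_2 n = O(n \log n)$.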
Slide 16
Merge Sort: Space Complexity
- Merge sort is the first sort we've looked at that must keep producing more and more lists; this is a tradeoff! Our implementation uses O(n) extra space for the sublists.
- It would be possible to do merge sort in place, where we swap individual elements, but it is difficult and there is more overhead.
- If you're tight on memory, merge sort isn't necessarily the best sort.
Slide 17
Is our merge sort stable?
- A sorting algorithm is stable if elements that are the same end up in the same relative ordering after the end of the sort. For our example above, we can check whether the two 86s (call them 86a, originally at index 2, and 86b, originally at index 6) are in the same order before and after:
[96 6 86a 15 58 35 86b 4 0]
- In other words, at the end, we should have the following, where 86a comes before 86b:
[0 4 6 15 35 58 86a 86b 96]
- You might ask: why does it matter? But what if the values were keys for records that have other fields associated with them, too? If you first sort by one key and then by another, you might want to retain the ordering of elements with the same second key.
- It turns out that our implementation of merge sort is stable, and it hinges on one line in the merge function: if (v1[p1] <= v2[p2]) {
- Your book has a less-than instead of a less-than-or-equal comparison, making its version not stable!
- See here for more information about stable sorting.
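To see why stability matters in the two-key scenario above, here is a small sketch using the C++ standard library's std::sort and std::stable_sort (not the lecture's merge sort; the data and record layout are our own invention): sort records by name first, then stably by score, and records with equal scores keep their name order:

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

int main() {
    // (score, name) records; two records share the score 86
    std::vector<std::pair<int, std::string>> records = {
        {86, "carol"}, {15, "dave"}, {86, "alice"}, {4, "bob"}
    };
    // first sort by the secondary key (name)...
    std::sort(records.begin(), records.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });
    // ...then stably by the primary key (score): alice stays before carol
    std::stable_sort(records.begin(), records.end(),
                     [](const auto& a, const auto& b) { return a.first < b.first; });
    for (const auto& r : records) {
        std::cout << r.first << " " << r.second << std::endl;
    }
    // prints: 4 bob / 15 dave / 86 alice / 86 carol
    return 0;
}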
Slide 18
Quicksort
- Quicksort is a sorting algorithm that is often faster than most other types of sorts.
- However, although it has an average O(n log n) time complexity, it also has a worst-case O(n^2) time complexity, though this rarely occurs if you code it wisely.
- Quicksort is another divide-and-conquer algorithm
- The basic idea is to divide a list into two smaller sub-lists:
- The low elements go on one side of the list (in any order)
- The high elements go on the other side of the list (in any order)
- Then, recursively sort the sub-lists.
- If you're asking yourself, "what do you mean, high and low elements? How do you determine what that means?", that is a good question.
- We choose a pivot element, and then base the high and low elements on that element.
Slide 19
Quicksort Algorithm
- Pick an element, called the pivot, from the list. Choosing the pivot is important, as we will see later.
- Reorder the list so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after the pivot. After this partitioning, the pivot is in its final position. This is called the partition operation.
- Recursively apply the above steps to the sub-list of elements with smaller values and separately to the sub-list of elements with greater values.
- The base case of the recursion is for lists of 0 or 1 elements, which do not need to be sorted (because they are already trivially sorted).
Slide 20
Quicksort: different methods, same underlying algorithm
- We have two ways to perform quicksort:
- The naive algorithm: create new lists for each sub-list, in a similar fashion to merge sort.
- The in-place algorithm: perform the sort by swapping elements in a single list.
Slide 21
Quicksort: Naive implementation
- Assume the following list. We will choose the pivot to be the first element in the list:
pivot = 6
[(6) 5 9 12 3 4]
- We then create three lists, a list with elements less than the pivot, the pivot itself, and a list with elements greater than the pivot:
less than 6: [5 3 4]    pivot: [6]    greater than 6: [9 12]
- Even if all elements go into one of the less than / greater than lists, that's the way it goes.
- We continue this with the sub-lists:
pivots: 5 (left sub-list) and 9 (right sub-list)
[5 3 4] [6] [9 12]
becomes
[3 4] [5] []    [6]    [] [9] [12]
- We continue this with the sub-lists (and there is only one more to sort):
pivot: 3
[3 4] [5] [] [6] [] [9] [12]
becomes
[] [3] [4]    [5] [] [6] [] [9] [12]
- Now we can simply merge back up the sub-lists, and it is easier than merge sort, because they are already sorted completely, from left to right:
[] [3] [4] [5] [] [6] [] [9] [12] becomes [3 4 5 6 9 12]
Slide 22
Quicksort algorithm: Naive code
Vector<int> naiveQuickSort(Vector<int> v) {  // not passed by reference!
    // base case: a list of 0 or 1 elements is already sorted
    if (v.size() < 2) {
        return v;
    }
    int pivot = v[0];  // choose pivot to be the left-most element
    // create two new vectors to partition into
    Vector<int> left, right;
    // put all elements <= pivot into left, and all elements > pivot into right
    for (int i = 1; i < v.size(); i++) {
        if (v[i] <= pivot) {
            left.add(v[i]);
        } else {
            right.add(v[i]);
        }
    }
    left = naiveQuickSort(left);    // recursively sort the left partition
    right = naiveQuickSort(right);  // recursively sort the right partition
    left.add(pivot);                // put the pivot at the end of the left
    return left + right;            // return the combination of left and right
}
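Note that naiveQuickSort returns a brand-new sorted vector and leaves its argument untouched, unlike the in-place sorts above. A minimal usage sketch (ours; assumes the Stanford Vector header):

#include <iostream>
#include "vector.h"  // Stanford Vector

int main() {
    Vector<int> v = {6, 5, 9, 12, 3, 4};     // data from slide 21
    Vector<int> sorted = naiveQuickSort(v);  // v itself is unchanged
    std::cout << sorted << std::endl;        // expected: {3, 4, 5, 6, 9, 12}
    return 0;
}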
Slide 23
Quicksort algorithm: in-place
- The in-place, recursive quicksort algorithm has a prototype in C++ that looks like this:
void quicksort(Vector<int> &v, int start, int finish);
- Pick your pivot as the left element (might not be a good choiceβ¦)
- Traverse the list from the end (right) backwards until the value should be to the left of the pivot, or it hits the left.
- Traverse the list from the beginning (left, after pivot) forwards until the value should be to the right of the pivot, or until it hits the right.
- Swap the pivot with the element where the left and right markers cross, unless that element is the pivot itself.
- This is best described with a detailed example. Assume the following initial list, with the index values below. We will call the function as
quicksort(v, 0, 7);
[56 25 37 58 95 19 73 30]
  0  1  2  3  4  5  6  7
- We first pick a pivot on the left, then start our traversals:
pivot = 56
[(56) 25 37 58 95 19 73 30]
- We then start traversing from the right side towards the left until we find a value that should be to the left of the pivot:
[(56) 25 37 58 95 19 73 30]
(the right marker stops immediately at 30, which is already smaller than 56)
- Then, we traverse the list from the beginning (after the pivot) forwards until the value should be to the right of the pivot:
[(56) 25 37 58 95 19 73 30]
(the left marker stops at 58, the first value larger than the pivot)
- We've reached 58, so we swap the two elements where the left and right cross:
[(56) 25 37 30 95 19 73 58]
- Now we keep traversing from the right towards the left until the value should be to the left of the pivot, or until it hits the left marker:
[(56) 25 37 30 95 19 73 58]
(the right marker stops at 19, which is less than 56)
- Again, traverse from the left marker towards the right until we find a value that should be to the right of the pivot (95 in this case)
[(56) 25 37 30 95 19 73 58]
(the left marker stops at 95, which is greater than 56)
- Now we swap the 95 and the 19:
[(56) 25 37 30 19 95 73 58]
- We then start from the right marker and traverse backwards until the value should be to the left of the pivot, or until it hits the left marker. In this case, it hits the left marker:
[(56) 25 37 30 19 95 73 58]
(the right marker moves left until it meets the left marker at the 19)
- Finally, we swap the pivot with the value we've reached (which will always be smaller than or the same as the pivot):
[19 25 37 30 (56) 95 73 58]
- Notice that the 56 is in its correct place, and will never need to be moved again. We now call quicksort(v, 0, 3) and quicksort(v, 5, 7) recursively, and we will end up with a completely sorted list that we sorted in place.
Slide 24
Quicksort in-place code
/*
 * Rearranges the elements of v into sorted order using
 * a recursive quicksort algorithm.
 */
void quicksort(Vector<int> &vec) {
    quicksort(vec, 0, vec.size() - 1);
}

// helper function that takes start and finish indices
void quicksort(Vector<int> &vec, int start, int finish) {
    if (start >= finish) return;
    int boundary = partition(vec, start, finish);
    quicksort(vec, start, boundary - 1);
    quicksort(vec, boundary + 1, finish);
}
- Here is the partition code:
int partition(Vector<int> &vec, int start, int finish) {
    int pivot = vec[start];
    int lh = start + 1;
    int rh = finish;
    while (true) {
        // move the right marker left past values that belong right of the pivot
        while (lh < rh && vec[rh] >= pivot) rh--;
        // move the left marker right past values that belong left of the pivot
        while (lh < rh && vec[lh] < pivot) lh++;
        if (lh == rh) break;
        // swap the two out-of-place values
        int tmp = vec[lh];
        vec[lh] = vec[rh];
        vec[rh] = tmp;
    }
    // if the markers met on a value >= pivot, the pivot stays at start
    if (vec[lh] >= pivot) return start;
    // otherwise, move the pivot into its final position
    vec[start] = vec[lh];
    vec[lh] = pivot;
    return lh;
}
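And a short driver for the in-place version (our sketch; assumes the Stanford Vector header), using the array from the slide 23 walkthrough:

#include <iostream>
#include "vector.h"  // Stanford Vector

int main() {
    Vector<int> v = {56, 25, 37, 58, 95, 19, 73, 30};  // data from slide 23
    quicksort(v);                 // sorts in place via partition()
    std::cout << v << std::endl;  // expected: {19, 25, 30, 37, 56, 58, 73, 95}
    return 0;
}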
Slide 25
Quicksort Complexity
- Worst-case: O(n^2)
- What if we had a sorted list to begin with and we always picked the first element as the pivot?
- Every value would end up in the sub-list after the pivot, so each partition removes only one element. This degrades to O(n^2) behavior! (One common fix is a random pivot; see the sketch after this list.)
- Best-case: O(n log n) (similar analysis to merge sort)
- Average complexity: O(n log n)
- Not a stable sort!
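Here is the random-pivot sketch mentioned above (our assumption about "coding it wisely"; the lecture code always uses the left element): pick a random position, swap it to the front, and then run the unmodified partition from slide 24. This makes the worst case vanishingly unlikely on any fixed input, including already-sorted lists:

#include <cstdlib>  // rand()

// Hypothetical randomized wrapper around the lecture's partition()
int randomizedPartition(Vector<int>& vec, int start, int finish) {
    int pivotIndex = start + rand() % (finish - start + 1);
    // move the randomly chosen pivot to the front...
    int tmp = vec[start];
    vec[start] = vec[pivotIndex];
    vec[pivotIndex] = tmp;
    // ...then partition exactly as before, with the pivot at vec[start]
    return partition(vec, start, finish);
}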
Slide 26
Recap
Sorting Big-O Cheat Sheet

| Sort      | Worst Case | Best Case  | Average Case |
|-----------|------------|------------|--------------|
| Insertion | O(n^2)     | O(n)       | O(n^2)       |
| Selection | O(n^2)     | O(n^2)     | O(n^2)       |
| Merge     | O(n log n) | O(n log n) | O(n log n)   |
| Quicksort | O(n^2)     | O(n log n) | O(n log n)   |