Last time, we learned how Bubble Sort works and we discovered that it's an order n-squared algorithm. This time, we'll look at another sorting algorithm called Selection Sort. The reading that covers a number of sorting algorithms is still included in the lecture resources for this lecture. The problem is the same as last time: we want to sort an array of values into ascending or descending order, and Selection Sort is another solution to that problem. As with Bubble Sort, there are online visualizations for Selection Sort, and we'll look at one of those before we finish this lecture. Let's go learn how Selection Sort works and analyze its algorithmic complexity.

This is the same code that we used in the previous lecture, except I commented out the code that does Bubble Sort and uncommented the code that does Selection Sort. So let's see how the Selection Sort function works. In this case, unlike Bubble Sort, we're going to work our way from left to right. With Bubble Sort, remember, the sorted part of the array grew from right to left each time we went through the outer loop; this time, the sorted part of the array will grow from left to right as we work our way through the outer loop. So instead of bubbling the largest element over to the right, we're going to select the smallest element in the unsorted range and put it on the left. We won't bubble it, and we'll see the difference as we walk through the algorithm. Our outer loop just goes from zero up to the last valid element in the array. We're going to make an assumption: we'll assume that the smallest element in the range of the array we're still trying to sort is already at the very beginning of that range. The first time through the outer loop, k is zero, so we're assuming the smallest element in the entire array is already at index zero, on the left.
We'll see how we can fix that assumption if we discover we're wrong, but for now, that's what we'll assume. Now we'll go through the unsorted part of the array, starting at k plus one and going all the way to the end. minLocation keeps track of the location of the smallest element we've found so far. If the element we're currently looking at is less than that smallest element, then we've just found a new minLocation, so we set minLocation equal to i here. Once the inner loop is done executing, minLocation holds the location of the smallest number in the range of the array we were trying to sort. So here, we swap the elements at minLocation and at k. That takes the smallest number in the unsorted range and puts it at the beginning of that range. Then we bump k along and do it again until we've sorted the entire array.

The difference between this and Bubble Sort, in addition to this one moving left to right instead of right to left, is that when we find elements in the wrong location, we don't actually swap them; we just keep track of the new minLocation, and we swap only once each iteration through the outer loop instead of potentially swapping many times inside the inner loop.

For our algorithm analysis, the outer loop executes n times, and in the worst case the inner loop executes n minus one times: on the very first iteration, where k is zero, i will go from one up to count minus one, and that's n minus one iterations. Just like in Bubble Sort, with nested loops we have to multiply the outer loop's complexity by the inner loop's complexity. So we have n times n minus one, which gives us n squared minus n, and we can throw away the lower-order n term. So Selection Sort is also order n squared. Let's go take a look at a visualization of Selection Sort. This time I'll search for "Selection Sort visualization."
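The lecture's actual code isn't shown in this transcript, so here is a minimal Python sketch of the algorithm as described; the variable names mirror the lecture's `k` and `minLocation` (rendered in Python style), but the function name is my own choice:

```python
def selection_sort(values):
    """Sort a list in place into ascending order using Selection Sort."""
    count = len(values)
    for k in range(count):                 # sorted region grows from the left
        min_location = k                   # assume the smallest is already at k
        for i in range(k + 1, count):      # scan the unsorted range
            if values[i] < values[min_location]:
                min_location = i           # found a new smallest element
        # one swap per outer-loop iteration: put the smallest at position k
        values[k], values[min_location] = values[min_location], values[k]
    return values
```

For example, `selection_sort([5, 2, 9, 1])` returns `[1, 2, 5, 9]`. Note that the swap happens only after the inner loop finishes, which is exactly the difference from Bubble Sort discussed above.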
Here's a YouTube video we can watch at least part of. As you can see, he's walking along through the values; this is the inner loop. He keeps track of the lowest number he's found so far, and once he's found it by going through the entire set of values, he moves it all the way to the left. So k was zero in that iteration, and k was one on the next iteration. Now we're doing k equals two, and you'll see that green bar, and now k is three, and so on. He's keeping track of the minLocation, and then when he's done with the inner loop, he swaps it with the beginning of the range he was looking at. To recap, in this lecture we learned how Selection Sort works and we discovered that it's also an order n-squared algorithm.
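One point from this lecture worth making concrete is that Selection Sort swaps at most once per outer-loop pass, while Bubble Sort may swap on every inner-loop comparison. This instrumented sketch (the helper names are hypothetical, not from the course code) counts swaps for each algorithm on a worst-case, reverse-sorted input:

```python
def bubble_sort_swaps(values):
    """Bubble Sort a copy of the list; return how many swaps were performed."""
    vals = list(values)
    swaps = 0
    n = len(vals)
    for k in range(n - 1):
        for i in range(n - 1 - k):         # unsorted range shrinks from the right
            if vals[i] > vals[i + 1]:
                vals[i], vals[i + 1] = vals[i + 1], vals[i]
                swaps += 1                 # a swap can happen on every comparison
    return swaps

def selection_sort_swaps(values):
    """Selection Sort a copy of the list; return how many swaps were performed."""
    vals = list(values)
    swaps = 0
    n = len(vals)
    for k in range(n - 1):
        min_location = k
        for i in range(k + 1, n):
            if vals[i] < vals[min_location]:
                min_location = i           # just remember the location; no swap yet
        if min_location != k:
            vals[k], vals[min_location] = vals[min_location], vals[k]
            swaps += 1                     # at most one swap per outer pass
    return swaps

reversed_data = list(range(10, 0, -1))     # worst case: 10 down to 1
```

On this input, Bubble Sort performs n(n-1)/2 swaps (every pair is out of order), while Selection Sort performs far fewer, so both are order n squared in comparisons, but Selection Sort does much less data movement.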