This
week we learned how to make methods more efficient by avoiding redundant calls.
I am pretty familiar with the Fibonacci sequence, but I had never heard of
memoization, so it was a little hard to understand at first. Even though the
concept was completely new, the algorithm itself is pretty clear: it just checks
whether a given calculation is already in the cache and, if so, skips
recomputing it. However, defining a function within a function sounds crazy
to someone coming from a Java background like me. Nevertheless, the idea
is a little similar to a nested for-loop, so after some careful contemplation
the algorithm becomes pretty straightforward. We also learned how to make
quicksort much more efficient and avoid its worst case by randomizing the list
before sorting it, which is pretty reasonable to me: I already knew that
quicksort performs really badly if the list is already sorted. I found the
lab this week very interesting in that it relates a graphical concept to a list,
and referring to each element needs some mathematical knowledge. It was a
little challenging at first when I couldn't understand the structure of the
tree, but it became much easier once I understood the math behind it.
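To make sure I really understood memoization, here is a small sketch of the idea in my own words (my own version, not the exact code from lecture), including the function-within-a-function part that confused me:

```python
def fib(n):
    """Return the nth Fibonacci number, caching results to avoid redundant calls."""
    cache = {}

    def helper(k):
        # If this calculation is already in the cache, there is no need to run it.
        if k in cache:
            return cache[k]
        if k < 2:
            result = k
        else:
            result = helper(k - 1) + helper(k - 2)
        cache[k] = result  # remember the answer for next time
        return result

    return helper(n)

print(fib(30))  # 832040
```

Without the cache this would make millions of redundant calls for fib(30); with it, each value is computed only once.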
Selection
sort is a sorting algorithm that finds the minimum or the maximum element of a
list, puts it at the front or the back, and excludes it from the next
recursion; it always has an order of O(n^2). I know selection sort pretty
well from my Java class in high school, and it has now become so much easier
since we can just use the built-in max function in Python.
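Here is roughly how that looks in Python (a sketch of my own, using the built-in min instead of max, with the "exclude it for the next pass" step made explicit):

```python
def selection_sort(lst):
    """Repeatedly pull out the smallest remaining element and put it at the front."""
    remaining = list(lst)  # copy so the original list is untouched
    result = []
    while remaining:
        smallest = min(remaining)   # the built-in function does the scan for us
        remaining.remove(smallest)  # exclude it from the next pass
        result.append(smallest)     # smallest elements accumulate at the front
    return result

print(selection_sort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5]
```

Each pass still scans the whole remaining list, which is why the order stays O(n^2) even with the built-in min.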
Quick
sort is a sorting algorithm that picks a pivot value and partitions the list so
that values smaller than the pivot end up on its left and values bigger than it
on its right, swapping elements and recursing on each side until the list is
sorted. It has a best order of O(n log n) but a worst order of O(n^2): if the
list is already sorted and the pivot is always the first element, each
partition peels off only one element, so the algorithm goes through roughly the
whole list n times, making it n^2. This inefficiency is why we should randomize
the list first, which makes long sorted runs very unlikely, so that the order
approaches the best case.
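A minimal sketch of the randomized version (this builds new lists rather than swapping in place like the lecture version, but the idea is the same):

```python
import random

def quicksort(lst):
    """Quicksort on a shuffled copy, so a pre-sorted input no longer hits the worst case."""
    shuffled = list(lst)
    random.shuffle(shuffled)  # randomize the list up front
    return _quicksort(shuffled)

def _quicksort(lst):
    if len(lst) <= 1:
        return lst
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x <= pivot]  # values no bigger than the pivot
    bigger = [x for x in lst[1:] if x > pivot]    # values bigger than the pivot
    return _quicksort(smaller) + [pivot] + _quicksort(bigger)

print(quicksort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```

After the shuffle, even a fully sorted input looks random to the partition step, so the expected order is O(n log n).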
Merge
sort is a sorting algorithm that divides the list into halves again and again
until single elements remain, then compares them and merges them back together
in the right order. It has always been the most complicated sorting algorithm
in my opinion, since you have to use both recursion and loops and check many
cases. The code I learned in high school was so long that the teacher didn't
really want to talk about it. It has a constant order of O(n log n) and is
considered one of the most sophisticated and reliable algorithms for any type
of list.
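In Python it turns out to be much shorter than the version I saw in high school. Here is a sketch of my own, with the divide step and the merge step separated:

```python
def merge_sort(lst):
    """Split the list in half, sort each half recursively, then merge the halves."""
    if len(lst) <= 1:
        return list(lst)
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists by repeatedly taking the smaller front element."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these two is non-empty
    merged.extend(right[j:])
    return merged

print(merge_sort([8, 3, 5, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```

The merge step always does linear work per level and there are about log n levels, which is where the O(n log n) in every case comes from.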
I
agree with the author of http://csc148.blogspot.ca/ that Professor Heap's
lectures are very helpful and I particularly like the challenging yet very
intriguing assignments he gave. However, I don't think the tests were very
long: as long as you could figure out how to construct the recursive
algorithms, the allotted time was reasonable. The tests were pretty normal in
my opinion. The questions were not particularly hard, but they did test a
considerable amount of knowledge of the course material.