Thursday, March 27, 2014

Week 11: Sorting and Efficiency

There are many ways to sort, but there is no single best one: every method has its advantages and disadvantages. So far, we have learned several sorting algorithms, such as selection sort, insertion sort, bubble sort, merge sort, and quick sort.

1. Insertion sort

The best case occurs when the list is already sorted; then the complexity is O(n). The worst case occurs when the list is sorted in reverse order, and the time complexity is O(n^2).

To place each new item, insertion sort must compare it with the already-sorted items one by one, so it is moderately inefficient. It is only practical for lists that are almost sorted or that have very few elements.
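Here is a minimal sketch of insertion sort in Python, written from the description above (my own code, not the version shown in lecture):

def insertion_sort(lst):
    """Sort lst in place by inserting each item into the sorted prefix."""
    for i in range(1, len(lst)):
        item = lst[i]
        j = i
        # shift larger items in the sorted prefix one position to the right
        while j > 0 and lst[j - 1] > item:
            lst[j] = lst[j - 1]
            j -= 1
        lst[j] = item

On an already-sorted list the while loop never runs, which is where the O(n) best case comes from.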

2. Selection sort

Selection sort takes the same amount of time no matter what the list looks like, so the best case, worst case, and average case all have the same complexity: O(n^2).

However, it swaps no more than n times, so if swapping elements is expensive, it can still be a reasonably efficient choice.
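A sketch of selection sort, again my own code rather than the lecture version:

def selection_sort(lst):
    """Sort lst in place by repeatedly selecting the smallest remaining item."""
    for i in range(len(lst) - 1):
        smallest = i
        # find the index of the smallest item in lst[i:]
        for j in range(i + 1, len(lst)):
            if lst[j] < lst[smallest]:
                smallest = j
        # exactly one swap per pass, so at most n - 1 swaps in total
        lst[i], lst[smallest] = lst[smallest], lst[i]

The two nested loops run regardless of the input order, which is why every case is O(n^2).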

3. Bubble sort

Bubble sort compares adjacent items and exchanges any pair that is out of order. If we stop as soon as a full pass makes no swaps, the best case is O(n), and the worst case is O(n^2).

This sort is also stable: equal items keep their original relative order.
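A short sketch of bubble sort with the early-exit check that gives the O(n) best case (my own version):

def bubble_sort(lst):
    """Sort lst in place by repeatedly swapping out-of-order neighbours."""
    for end in range(len(lst) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if lst[i] > lst[i + 1]:
                lst[i], lst[i + 1] = lst[i + 1], lst[i]
                swapped = True
        if not swapped:      # no swaps means the list is already sorted
            break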

4. Quick sort

For this algorithm, we first select a pivot, put the items that are less than the pivot on the left and the rest on the right, and then repeat on each part until the pieces have length 1. It can be seen as an improvement on bubble sort. The best-case complexity is O(n log_2 n), while the worst case is O(n^2); the best case occurs when the left and right parts have about the same length.
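A small sketch of quick sort that returns a new list; choosing the first item as the pivot is my own assumption, and other pivot choices work too:

def quick_sort(lst):
    """Return a new sorted list using quick sort."""
    if len(lst) <= 1:
        return lst
    pivot = lst[0]
    smaller = [x for x in lst[1:] if x < pivot]
    larger = [x for x in lst[1:] if x >= pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

With the first item as pivot, an already-sorted list splits as unevenly as possible, which is exactly the O(n^2) worst case mentioned above.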

5. Merge sort

Merge sort is a recursive algorithm: we keep splitting the list into two halves, sort each half, and then merge the sorted halves back together. The best case, average case, and worst case are all O(n log n).
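A compact sketch of merge sort and its merge step (my own version):

def merge_sort(lst):
    """Return a new sorted list using merge sort."""
    if len(lst) <= 1:
        return lst
    mid = len(lst) // 2
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # <= keeps the sort stable
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]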

Summary

1. Insertion sort and bubble sort are usually slow, but when the list is almost sorted they take very little time; in that situation quick sort can actually waste a lot of time.

2. When n is small, we can use selection sort if we do not care about stability; otherwise we can choose insertion sort or bubble sort.

3. When n is large and the keys are in random order, quick sort is well suited, provided stability is not important.

4. When n is large, the keys are already partly sorted, and we want the sort to be stable, merge sort is a good choice.

As we can see, in different situations we should choose different algorithms to sort a list in order to save time and space. Efficiency matters most when there are many elements; if we choose the best-suited algorithm, we can save a great deal of time and space.

Thursday, March 6, 2014

Week 8: Linked list

A linked list is a new class that represents a sequence of values, like the one sketched below.

It has two parts: the first is its value, and the second is the rest of the list, which can itself be regarded as a sub-linked-list. Every time we add a value to the linked list, it goes on top, so like the stack ADT, the first one in is the last one out.

It is really hard work to find the deepest value, and it requires a recursive algorithm.
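The class might look roughly like the sketch below; the attribute names value and rest are my guesses rather than the exact names used in class:

class LinkedList:
    """One node of a linked list: a value plus the rest of the list."""
    def __init__(self, value, rest=None):
        self.value = value
        self.rest = rest        # another LinkedList, or None at the end

def prepend(lnk, value):
    """Return a new linked list with value added on top of lnk."""
    return LinkedList(value, lnk)

def deepest(lnk):
    """Return the deepest (last) value in lnk, using recursion."""
    if lnk.rest is None:        # base case: this node is at the bottom
        return lnk.value
    return deepest(lnk.rest)    # keep digging into the rest of the list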

But the most interesting method in this class, for me, is the reverse function.
It completely reverses the linked list, which is a real challenge: we have to reach the deepest value and dig it out to the top.

I was confused by the whole function for a long time, but I finally figured out how it works.
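A sketch of how such a reverse function can work, using the LinkedList class from the sketch above (my own version, not necessarily the one from class):

def reverse(lnk, new_list=None):
    """Return a new linked list with the values of lnk in reverse order."""
    if lnk is None:                              # end reached: new_list is the reversal
        return new_list
    new_node = LinkedList(lnk.value, new_list)   # put the current value on top
    return reverse(lnk.rest, new_node)           # recurse on the rest of the list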

As we can see, it builds a new linked list out of the original one, creating a new node (new_node) at each step of the recursion.

It is such an amazing way to deal with that problem.