The article is about air pollution and how it relates to health problems. The author states that this relationship is not fully explored and that even the air inside houses is affected. After mentioning the political situation, the author closes with an appeal to the reader to reduce pollution. This seems like a logical structure for this topic. The paragraph lengths appear balanced and the topics are addressed in appropriate depth.
The article seems concise: every point is discussed at sufficient length, and no drawn-out paragraph catches the eye. The division of the text is good; each paragraph discusses one idea. The sentences at the beginning are short and straightforward, but some of the later ones are too long.
The argumentation and explanations seem reasonable and are easy to follow. I personally like the author's analogies, but sometimes they are not appropriate. References are missing, but that is to be expected, as this is not a scientific paper.
LinearSearch(v,l) searches the items in list l for the value v. It returns the position of the value, or n if the value is not in the list, where n is the length of l. The value is compared to every element in the list, denoted by l(i).
- (Initialize counter i.) Set i <- 0.
- (Loop through every item in l.)
- if l(i) = v return i.
- i <- i+1.
- return n.
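The LinearSearch pseudocode above can be sketched in Python as follows; the function name and argument order are my own choices, not part of the original description:

```python
def linear_search(l, v):
    """Return the index of v in l, or len(l) if v is not in the list."""
    i = 0
    while i < len(l):
        if l[i] == v:
            return i  # value found at position i
        i += 1
    return len(l)  # value not present: return n, the length of l
```

For example, `linear_search([3, 1, 4], 4)` returns 2, while `linear_search([3, 1, 4], 9)` returns 3, the length of the list.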
InsertionSort(l) sorts the list l and returns it. It loops through each element of the list, finds its location in the sorted list, and inserts it there. The sorted list consists of every visited item.
- (Initialize counter i.) Set i <- 1.
- (Sort the list.) Loop through every item in l.
- (Initialize counter j.) j is used to search for the correct position to insert l(i). Set j <- i.
- While j is still a valid index (j > 0) and the predecessor of l(j) is greater than l(j) (l(j-1) > l(j)):
- Swap l(j) and l(j-1).
- j <- j-1.
- return l.
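The InsertionSort steps above translate directly into Python; this is a minimal sketch, with the loop bounds taken from the pseudocode:

```python
def insertion_sort(l):
    """Sort list l in place by repeated swapping and return it."""
    for i in range(1, len(l)):
        j = i
        # swap l[j] backwards until its predecessor is no longer greater
        while j > 0 and l[j - 1] > l[j]:
            l[j], l[j - 1] = l[j - 1], l[j]
            j -= 1
    return l
```

Each visited item is swapped backwards into the already-sorted prefix, which matches the description that the sorted list consists of every visited item.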
MergeSort(l) sorts the list l with a divide-and-conquer approach. That means the data is broken down into parts which are processed individually. After that, the preprocessed parts are merged back together again. Overall, the algorithm has a complexity of O(n log n).
Input: unsorted list l.
Output: sorted list l.
The major steps of the algorithm are as follows:
- Recursive subdivision of the list until each contains 1 item.
- Merge sublists until only one is remaining. This is the sorted list.
We now examine these steps in detail.
- If a list is empty or has only one element, it is sorted by definition. In this case, return l.
- left, right <- split(l).
- This evenly divides all items of l into left and right. This has constant complexity, since only the midpoint of l needs to be computed.
- left <- MergeSort(left)
- right <- MergeSort(right)
- Recursive calls are used to further subdivide the sublists. This is done until the whole list is divided into sublists of length one. These sublists are then sorted in the Merge function. The recursion depth is logarithmic, since the inputs for the recursive calls are halved at each recursion step.
- return Merge(left,right)
function Merge(left, right)
This helper function, called by the MergeSort function, actually does the sorting. The merge has linear complexity, because each element of the input lists is merged into the result list exactly once.
- result <- empty_list.
- While neither list is empty, merge both lists using a zipper principle. Both left and right are sorted, meaning the first element of each is the smallest item in that list. To merge the sublists, the first elements of both sublists are compared and the smaller one is appended to result.
- if head(left) <= head(right) then
- result <- result + head(left)
- left <- tail(left)
- else
- result <- result + head(right)
- right <- tail(right)
- head(list) returns the first element of list, and tail(list) returns every element of list except the first one.
- If one of the sublists has any elements left, simply append these elements to result.
- return result. This is a sorted list.
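The MergeSort and Merge pseudocode above can be sketched together in Python; index counters replace the head/tail operations for efficiency, but the zipper principle is the same:

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    result = []
    i = j = 0
    # compare the front elements and append the smaller one
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # at most one of the lists still has elements left; append them
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def merge_sort(l):
    """Sort l by recursive halving and merging, O(n log n) overall."""
    if len(l) <= 1:
        return l  # empty or single-element lists are sorted by definition
    mid = len(l) // 2  # computing the midpoint is the constant-time split
    left = merge_sort(l[:mid])
    right = merge_sort(l[mid:])
    return merge(left, right)
```

For example, `merge_sort([5, 2, 8, 1, 9, 3])` returns `[1, 2, 3, 5, 8, 9]`.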