Thu - April 19, 2007

Lecture 36. 


No class Good Friday.

Epilogue: It was a fun year, and I sincerely hope that all of you got something lasting out of this course.
All quizzes have been placed in a box in front of the School of Computing General Office, Goodwin 557, and will be available for pick-up
whenever Goodwin Hall is open.  

Posted at 09:39 AM    

Lecture 35. 


Quiz #5. 

Posted at 09:31 AM    

Mon - April 2, 2007

Lecture 34. 


I went over solutions to the homework problems for weeks 10 and 11. The solutions are attached in the PDF file below.

Please note: Many students in this class also attend CISC-204, and have a quiz immediately after our quiz #5. It turns out that we can begin a bit early on Wednesday as the class before us has been cancelled. Thus I will come to class at 9:10, and start the quiz at 9:20.
Sol10and11.pdf  

Posted at 12:59 PM    

Fri - March 30, 2007

Lecture 33. 


Today I covered the final topic for the term, Skip Lists. I have attached a PDF file of the presentation I gave in class.
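
For anyone who wants to experiment with the idea after reading the slides, here is a minimal Python sketch of a skip list (illustrative code, not taken from the presentation; the class and method names are my own). Each node carries a tower of forward pointers, and the height of a new node's tower is decided by coin flips, which is what gives the expected O(log n) search and insert costs.

import random

class _Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * level   # forward[i] = next node at level i

class SkipList:
    MAX_LEVEL = 16

    def __init__(self):
        self.head = _Node(None, self.MAX_LEVEL)   # sentinel header node
        self.level = 1                            # number of levels in use

    def _random_level(self):
        lvl = 1
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level - 1, -1, -1):   # drop down level by level
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * self.MAX_LEVEL     # last node seen on each level
        node = self.head
        for i in range(self.level - 1, -1, -1):
            while node.forward[i] is not None and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = _Node(key, lvl)
        for i in range(lvl):                      # splice the new tower in
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new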

SkipList.pdf  

Posted at 10:14 AM    

Wed - March 28, 2007

Lecture 32. 


Today's topic was using mergesort to minimize external memory accesses when the data to be sorted do not fit in main memory. I have attached a PDF file of today's presentation.
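
As a rough illustration of the idea, here is a toy in-memory model in Python (not the file-based version from the slides; the function name and the memory_limit parameter are made up for the example). It sorts memory-sized chunks into runs and then performs a single k-way merge of the runs, so in the external setting each record is read and written only a constant number of times per pass.

import heapq

def external_mergesort(items, memory_limit):
    # Pass 1: sort each chunk that fits in "memory" to produce sorted runs.
    runs = [sorted(items[i:i + memory_limit])
            for i in range(0, len(items), memory_limit)]
    # Pass 2: k-way merge of the runs; only the front record of each run
    # needs to be resident, so external reads are sequential.
    return list(heapq.merge(*runs))

print(external_mergesort([9, 3, 7, 1, 8, 2, 6, 5, 4], memory_limit=3))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]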

Mergesort.pdf  

Posted at 10:41 AM    

Mon - March 26, 2007

Lecture 31. 


I picked up from where I left off last Friday morning, with a more detailed description of the technique for handling an overflow in a 2-4 tree. I then proceeded to a description of how to implement deletions in a 2-4 tree.
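
To make the overflow step concrete, here is a small Python helper sketching one common convention (illustrative code, not code shown in class): when an insertion leaves a node with four keys, the node is split, the first two keys stay, the third key is promoted into the parent, and the fourth key starts a new sibling node.

def split_overflow(keys, children=None):
    # keys: the 4 keys of an overflowing 2-4 tree node, in sorted order.
    # children: its 5 child pointers, or None if the node is a leaf.
    assert len(keys) == 4
    left_keys, promoted, right_keys = keys[:2], keys[2], keys[3:]
    if children is None:
        return left_keys, promoted, right_keys, None, None
    assert len(children) == 5
    return left_keys, promoted, right_keys, children[:3], children[3:]

# Example: a node holding 5, 10, 12, 15 splits into a node with (5, 10),
# a new sibling with (15,), and 12 is sent up to the parent (which may
# itself overflow, so splits can cascade toward the root).
print(split_overflow([5, 10, 12, 15]))
# ([5, 10], 12, [15], None, None)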

As was mentioned on Friday, there are two distinct uses for the multi-way search tree organization: one with a small branching factor, as in 2-4 trees, and the other when the branching factor is large. The latter is the topic of the attached presentation, which explains how the so-called B-tree and B+-tree are used to minimize external memory accesses.

BandB.pdf  

Posted at 12:18 PM    

Fri - March 23, 2007

Lecture 30. 


Multi-way search trees lead to efficient data structures for two separate reasons. The so-called m-way search tree with small m compares favourably with a binary search tree in terms of worst-case asymptotic performance. With large m, an m-way search tree is useful for searching large collections of data that are stored on slow secondary memory.

I gave a general introduction to multi-way search trees, and then went on to the so-called B-tree, and in particular the 2-4 tree. I illustrated through an example how inserts work on a 2-4 tree. I will continue this example on Monday.  

Posted at 03:51 PM    

Wed - March 21, 2007

Lecture 29. 


Quiz #4 was written today.  

Posted at 12:11 PM    

Mon - March 19, 2007

Lecture 28. 


Today I covered Floyd's one-to-all shortest path problem for directed graphs. Negative edge weights are allowed so long as there are no negative cycles.

I then went over the solutions to the assignments in preparation for quiz #4. The solutions are attached below.

Sol8and9.pdf  

Posted at 12:03 PM    

Fri - March 16, 2007

Lecture 27. 


Today we revisited the FWI all-to-all shortest path algorithm. I used the data projector to show a run of the algorithm. A PDF version of the presentation is attached below.

I then returned to the one-to-all shortest path problem and Dijkstra's algorithm that we started on Wednesday. I began where I left off, giving the key points needed to justify the correctness of the algorithm. The algorithm was presented in two forms. The first was a high-level version that left the details of some steps of the implementation vague. Using this version I demonstrated the algorithm on a small example graph. I then refined the implementation, making use of a min-heap to obtain an efficient running time.

Recall that we assume that the input graph is connected. The conclusions regarding Dijkstra's algorithm drawn by the end of the lecture are:

- Using an adjacency matrix representation of the graph uses Ω(|V|²) time and space.
- A straightforward implementation uses O(|V|²) time.
- If the graph is sparse, that is |E| << |V|², then
- we can gain efficiencies in space by using an adjacency list representation of the graph, yielding O(|E|) space complexity.
- we can gain efficiencies in time with a clever use of a heap, yielding O(|E| log |E|) time complexity (a sketch of this version appears below).
- The algorithm's correctness is limited to graphs (or directed graphs) with non-negative edge weights.
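
Here is a minimal Python sketch of the heap-based version (my own illustrative code, not the exact pseudocode from class). It assumes the graph is given as an adjacency list of (neighbour, weight) pairs with non-negative weights; stale heap entries are skipped rather than decreased in place, which is what gives the O(|E| log |E|) bound quoted above.

import heapq

def dijkstra(adj, v0):
    # adj: {u: [(v, w), ...]} with non-negative weights w.
    # Returns a dict of shortest-path weights from v0.
    dist = {v0: 0}
    heap = [(0, v0)]                      # (tentative distance, vertex)
    done = set()                          # the finished vertices (the tree T)
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:                     # stale heap entry, skip it
            continue
        done.add(u)
        for v, w in adj.get(u, []):
            if v not in done and d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

adj = {'a': [('b', 2), ('c', 5)], 'b': [('c', 1), ('d', 4)],
       'c': [('d', 1)], 'd': []}
print(dijkstra(adj, 'a'))   # {'a': 0, 'b': 2, 'c': 3, 'd': 4}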

allToAllShortest.pdf 

Posted at 11:26 AM    

Wed - March 14, 2007

Lecture 26. 


Today we saw several graph algorithms. I described two types of graph traversals, depth-first and breadth-first. These traversal algorithms have a computational complexity that is proportional to the size of the input when they are implemented with an adjacency list. The different traversals have different uses. We saw one immediate use of breadth-first search: finding shortest paths in graphs where all edges have the same weight.
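
For concreteness, here is a short Python sketch of that use of breadth-first search (illustrative code, not from the lecture): because vertices are discovered in order of increasing number of edges from the start vertex, the first time a vertex is reached already gives its shortest distance when all edge weights are equal.

from collections import deque

def bfs_distances(adj, v0):
    # adj: adjacency list {u: [v, ...]}.  Returns the number of edges on a
    # shortest path from v0 to each reachable vertex.
    dist = {v0: 0}
    queue = deque([v0])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in dist:             # first visit = shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

adj = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
print(bfs_distances(adj, 'a'))   # {'a': 0, 'b': 1, 'c': 1, 'd': 2}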

The single-source shortest path problem accepts a weighted connected graph together with a distinguished vertex v0 and reports the weights of shortest paths from v0 to all vertices in the graph. We saw two very easy special cases. For the more general case we are in the midst of developing the so-called Dijkstra's algorithm. The basic idea behind this algorithm is to build a spanning tree that contains all of the requisite shortest paths. The algorithm works on a partition of the vertex set into two subsets U (unknown, or unseen vertices) and T (the shortest path spanning tree). Every iteration of the algorithm moves a vertex from U to T, until the entire tree is built. I will continue with the details of the algorithm and show that by using a heap we can obtain an implementation that runs in O(|E| log |E|) time and O(|E|) space.
 

Posted at 11:31 AM    

Mon - March 12, 2007

Lecture 25. 


Yurai Nuñez gave the lecture today. I like to distinguish between edges in a directed graph and edges in an undirected graph by the use of different kinds of brackets. For example, a directed edge from u to v is an ordered pair (u,v). On the other hand, an undirected edge between u and v is simply the two-element subset {u,v}.

Yurai went over terminology that is used in graph theory. Some of these terms are not quite standard, although the terms that you saw coincide with our textbook and are very common. Sometimes a graph with no loops (an edge from a vertex to itself) or multiple edges (that is, more than one copy of the edge {u,v} in an undirected graph, or (u,v) in a directed graph) is called a "simple graph". All of the graphs considered in our book are simple, so the adjective "simple" is dropped.

Yurai also talked about graph representations. In particular the adjacency matrix, the incidence matrix, and the linked structure called an adjacency list.
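
As a small illustration of two of those representations (my own example, using an undirected graph on vertices 0 through 3), the snippet below builds an adjacency matrix, which takes Θ(|V|²) space, and an adjacency list, which takes Θ(|V| + |E|) space, for the same graph.

# The graph has vertices 0..3 and edges {0,1}, {0,2}, {1,2}, {2,3}.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency matrix: matrix[u][v] == 1 iff {u,v} is an edge.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u][v] = matrix[v][u] = 1      # symmetric: the edges are undirected

# Adjacency list: for each vertex, the list of its neighbours.
adj_list = {u: [] for u in range(n)}
for u, v in edges:
    adj_list[u].append(v)
    adj_list[v].append(u)

print(matrix)     # [[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]]
print(adj_list)   # {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}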

I will continue with graph algorithms for the rest of this week.  

Posted at 02:55 PM    

Fri - March 9, 2007

Lecture 24. 


Today we looked at the FWI algorithm, also known as the Floyd-Warshall algorithm. This simple-to-code algorithm uses an adjacency matrix to determine the weights of shortest paths between all pairs of vertices in a weighted directed graph. The algorithm uses a technique called "dynamic programming". Along the way I defined the "principle of optimality" condition used to justify the correctness of the algorithm.  
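
To show just how simple to code it is, here is a minimal Python sketch (my own illustrative version, assuming the matrix uses float('inf') for missing edges, 0 on the diagonal, and that there are no negative cycles). The outer loop over k allows one more vertex at a time to serve as an intermediate vertex on a path, which is the dynamic programming formulation.

INF = float('inf')

def floyd_warshall(w):
    # w[i][j]: weight of edge (i, j), INF if absent, 0 when i == j.
    n = len(w)
    d = [row[:] for row in w]          # work on a copy of the matrix
    for k in range(n):                 # paths may now pass through vertex k
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

w = [[0, 3, INF, 7],
     [8, 0, 2, INF],
     [5, INF, 0, 1],
     [2, INF, INF, 0]]
print(floyd_warshall(w))
# [[0, 3, 5, 6], [5, 0, 2, 3], [3, 6, 0, 1], [2, 5, 7, 0]]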

Posted at 10:07 AM    

Wed - March 7, 2007

Lecture 23. 


Quiz #3 was written today. Quizzes initialled r.x. were marked by Henry Xiao, and those initialled h.x. were marked by Michael Xiao.
 

Posted at 12:17 PM    

Mon - March 5, 2007

Lecture 22. 


Today I went over the solutions to the last two data structures assignments. They are attached below.
Sol6.pdf
Sol7.pdf  

Posted at 12:03 PM    
