analyse the runtime performance of a (simple) algorithm/program in terms of the size of its inputs, in the average, best, and worst cases.
answer evaluation questions such as the following: If the inputs double in size, will the runtime stay the same, double, quadruple, be squared, or change in some other way? Which algorithm/program should one choose among several alternatives for the same specification? Which data structure should one choose when implementing an algorithm, considering the basic operations (add, modify, delete, find, order) it performs on its data? Does it make a difference in runtime (for large inputs) if one replaces a part of an algorithm/program by a faster alternative?
state in a precise way the assumptions underlying such a runtime performance analysis.
know (and analyse) the runtime performance of the basic operations (add, modify, delete, find, order) for some of the most common (simple) data structures (lists, arrays, first-in first-out queues, priority queues, stacks, trees, heaps).
explain the trade-off of using a so-called greedy algorithm (one that makes opportunistic choices without any sense of regret, and may therefore miss the optimum) for an optimisation problem where the expected runtime performance of an exact algorithm is very bad for the typically occurring size of the inputs.
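The doubling question above can be made concrete by counting basic operations. The sketch below (an illustration, not part of the course material; plain Python is used since the course's implementation language is not named here) compares a linear-time scan with a quadratic all-pairs comparison: doubling the input size doubles the first count but roughly quadruples the second.

```python
# Illustration: counting basic operations to see how runtime
# grows when the input size doubles.

def linear_ops(n):
    # e.g. a single scan over the input: about n operations
    return n

def quadratic_ops(n):
    # e.g. comparing every pair of elements: n*(n-1)/2 comparisons
    return n * (n - 1) // 2

for n in (1000, 2000):
    print(n, linear_ops(n), quadratic_ops(n))
# Doubling n doubles the linear count but roughly quadruples
# the quadratic one (3998000 / 999000 is about 4.002).
```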
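The greedy trade-off can be illustrated with the classic coin-change problem (a standard textbook example, sketched here in plain Python since the course's language is not named): the greedy strategy takes the largest coin that fits and never reconsiders, which is fast but, for some denomination systems, returns more coins than the exhaustive dynamic-programming optimum.

```python
# Greedy vs. exact coin change: greedy is fast but may miss the optimum.

def greedy_change(coins, amount):
    """Repeatedly take the largest coin that still fits (no regret)."""
    result = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            result.append(c)
    return result

def optimal_change(coins, amount):
    """Dynamic programming: the minimal number of coins, or None."""
    best = [0] + [None] * amount
    for a in range(1, amount + 1):
        candidates = [best[a - c] for c in coins
                      if c <= a and best[a - c] is not None]
        best[a] = min(candidates) + 1 if candidates else None
    return best[amount]

coins = [4, 3, 1]
print(greedy_change(coins, 6))   # [4, 1, 1] -- three coins
print(optimal_change(coins, 6))  # 2 -- e.g. 3 + 3
```

With denominations {4, 3, 1} and amount 6, greedy settles for three coins while the optimum is two; for other systems (e.g. standard euro coins) greedy happens to be optimal, which is exactly the kind of trade-off the outcome above asks students to explain.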
Mathematical foundations: asymptotic notation, summations, recurrence relations. Data structures: stacks, first-in first-out queues, trees, heaps, priority queues. Searching: binary search trees, balanced search trees, hash tables. Sorting. Graph algorithms (simple ones). Greedy algorithms. Implementation in a functional programming language of algorithms and data structures.
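As a flavour of the functional implementation style mentioned above (the course's language is not specified here, so this is a hedged sketch in plain Python), a purely functional binary search tree can be written so that insertion returns a new tree and never mutates the old one; both insert and find then take time proportional to the tree's height.

```python
# Sketch of a purely functional binary search tree.
# A tree is either None (empty) or a tuple (left, key, right).

def insert(tree, key):
    """Return a new tree containing key; the old tree is unchanged."""
    if tree is None:
        return (None, key, None)
    left, k, right = tree
    if key < k:
        return (insert(left, key), k, right)
    if key > k:
        return (left, k, insert(right, key))
    return tree  # key already present

def find(tree, key):
    """Return True iff key occurs in the tree."""
    while tree is not None:
        left, k, right = tree
        if key == k:
            return True
        tree = left if key < k else right
    return False

t = None
for x in [5, 2, 8, 1]:
    t = insert(t, x)
print(find(t, 8), find(t, 3))  # True False
```

On a balanced tree both operations cost O(log n); in the worst (fully skewed) case they degrade to O(n), which is the kind of analysis the outcomes above call for.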
Lectures, laboratory work, lessons, and mandatory assignments.