Summary combining lecture notes and parts of the book "Introduction to Algorithms" by T.H. Cormen, C.E. Leiserson, R.L. Rivest, and C. Stein. Lectures were given by prof. F. van Raamsdonk.

X_400614 Data Structures and Algorithms
Index

Chapter 1 - The Role of Algorithms in Computing
Chapter 2 - Getting Started
Chapter 3 - Growth of Functions
Chapter 6 - Heapsort
Chapter 7 - Quicksort
Chapter 8 - Sorting in Linear Time
Chapter 10 - Elementary Data Structures
Chapter 11 - Hash Tables
Chapter 12 - Binary Search Trees
Chapter 13 - AVL Trees
Chapter 15 - Dynamic Programming
Chapter 16 - Greedy Algorithms
Chapter x - Remaining Examples & Dijkstra’s Algorithm

Chapter 1 - The Role of Algorithms in Computing

• An algorithm is a computational procedure that takes an input and produces an output. It solves a
computational problem and is correct if, for every input instance, it produces the correct output.
• Not every problem solved by algorithms has an easily identified set of candidate solutions.
• A data structure is a way to store and organize data in order to facilitate access and modification.
• An algorithm should be chosen wisely, so that it is efficient in both time and space.

Chapter 2 - Getting Started

➤2.1 Insertion Sort
• Insertion sort is a sorting algorithm that sorts by inserting each unsorted element into the already
sorted part of the array. It is an efficient algorithm for sorting a small number of elements.
• It sorts the input in place: it rearranges the elements (keys) within the array, with the sorted elements
occupying a growing subarray at the front and the still unsorted elements occupying the rest.
• Insertion sort runs in Θ(n²) in the average and worst case, and in Θ(n) in the best case.
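As an illustration, a minimal Python sketch of insertion sort as described above (the function name and the use of a Python list are choices of this sketch, not taken from the notes):

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the sorted prefix."""
    for j in range(1, len(a)):          # a[0..j-1] is already sorted
        key = a[j]                      # element to insert next
        i = j - 1
        while i >= 0 and a[i] > key:    # shift larger elements one slot to the right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                  # drop the key into its place
    return a

# Example: insertion_sort([5, 2, 4, 6, 1, 3]) -> [1, 2, 3, 4, 5, 6]
```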

➤2.2 Analyzing Algorithms
• The running time of an algorithm grows with the input size. The running time is the number of steps
executed to produce the output.
• Assuming each execution of the i-th line takes a constant time cᵢ, a statement that takes cᵢ steps and
executes n times contributes cᵢ·n to the total running time.
• The worst-case running time is usually the one considered, because it guarantees that the algorithm
will never take any longer; moreover, for some algorithms the worst case actually occurs fairly often.
• For the rate of growth of the running time, only the leading term is kept and constant factors are
dropped, e.g. 2n² is treated as just n².

➤2.3 Designing Algorithms
2.3.1 The divide-and-conquer approach
• Many algorithms are recursive in structure: they follow a divide-and-conquer approach in which the
problem is divided into smaller subproblems that are solved recursively, after which the results are
combined.
• Merge sort uses such an approach. The elements are divided into two halves, the halves are sorted
recursively using merge sort, and then they are merged.
• A key operation is the merging of two sorted subsequences, done by calling MERGE(A, p, q, r) where
p ≤ q < r. This takes Θ(n) time.
• The sentinel value (∞) at the end of each subsequence signals that the subsequence has run out of
elements; every remaining element of the other subsequence is smaller than ∞, so it is always the one
copied next.
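A minimal Python sketch of this merge step and the surrounding recursion, following the MERGE(A, p, q, r) description above with ∞ as the sentinel (the 0-based indexing and the merge_sort wrapper are choices of this sketch, not taken from the notes):

```python
import math

def merge(A, p, q, r):
    """Merge the sorted subarrays A[p..q] and A[q+1..r] (0-based, inclusive)."""
    left = A[p:q + 1] + [math.inf]        # sentinel marks the end of the subsequence
    right = A[q + 1:r + 1] + [math.inf]
    i = j = 0
    for k in range(p, r + 1):             # always copy the smaller of the two heads
        if left[i] <= right[j]:
            A[k] = left[i]
            i += 1
        else:
            A[k] = right[j]
            j += 1

def merge_sort(A, p, r):
    """Sort A[p..r] by divide-and-conquer."""
    if p < r:
        q = (p + r) // 2
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)

# Example: A = [5, 2, 4, 7, 1, 3]; merge_sort(A, 0, len(A) - 1) leaves A == [1, 2, 3, 4, 5, 7]
```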



2.3.2 Analyzing divide-and-conquer algorithms
• When an algorithm contains a recursive call to itself, its running time is often described by a recursive
equation, a recurrence. It describes the overall running time on a problem of size n in terms of the
running time on smaller inputs than n.
• The running time on a problem of size n is T(n). If the size is small enough such that n ≤ c, then it takes
constant time which is Θ(1).
• Suppose the division yields a subproblems, each 1/b the size of the original problem. Solving all the
subproblems then takes aT(n/b) time. Adding the time to divide, D(n), and the time to combine, C(n),
gives the recurrence.

• T(n) = Θ(1) if n ≤ c, and T(n) = aT(n/b) + D(n) + C(n) otherwise.
• For merge sort with n > 1, the running time breaks down as follows:
- Dividing is done in constant time, D(n) = Θ(1).
- Recursively solving 2 subproblems each of size n/2 will add 2T(n/2) to the running time.
- Merging the subproblems takes Θ(n) as stated before, so C(n) = Θ(n)
• This makes the worst-case recurrence for merge sort T(n) = 2T(n/2) + Θ(n) if n > 1, and T(n) = Θ(1) if n = 1.
• For merge sort, T(n) = Θ(n lg n): the recurrence for n > 1 can be rewritten as T(n) = 2T(n/2) + cn,
where the constant c represents the time required to solve problems of size 1 as well as the time per array
element of the divide and combine steps.
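• A quick way to see why T(n) = 2T(n/2) + cn solves to Θ(n lg n) (a standard recursion-tree argument, added here for illustration): the recursion tree has lg n + 1 levels, each level does cn work in total, so the whole tree does cn(lg n + 1) = Θ(n lg n) work.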

Chapter 3 - Growth of Functions

➤3.1 Asymptotic Notation
• The asymptotic efficiency of algorithms is studied when the size of the input is large enough to make
only the order of growth of the running time relevant.
• Asymptotic notations are well suited to characterizing running times no matter what the input is.
• The Θ-notation is an asymptotically tight bound (=); tight means the running time is pinned down to
within constant factors from above and below (a worked example follows at the end of this list).
• The O-notation is an asymptotic upper bound (≤); it is used when a running time is only bounded from
above. An O-bound on a running time can therefore be much larger than the actual running time.
• The Ω-notation is asymptotic lower bound (≥), it says that an algorithm takes at least a certain amount
of time. It bounds the growth of the running time from below.
• Using asymptotic notation in equations can help eliminate inessential detail and clutter.
• The right-hand side of an equation provides less detail than the left-hand side.
• The o-notation is an asymptotic upper bound that is not tight (<), e.g. 2n = o(n²) but 2n² ≠ o(n²).
• The ω-notation is an asymptotic lower bound that is not tight (>), e.g. n²/2 = ω(n) but n²/2 ≠ ω(n²).
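• As a worked example of the tight bound mentioned above (added here for illustration, not from the notes): 2n² + 3n = Θ(n²), since n² ≤ 2n² + 3n ≤ 3n² for all n ≥ 3, so the constants c₁ = 1, c₂ = 3 and threshold n₀ = 3 witness the bound from below and from above.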

Chapter 6 - Heapsort

➤6.1 Heaps
• The (binary) heap data structure is an array object that can be viewed as a nearly complete binary tree.
• Each node of the tree corresponds to an element of the array.
• An array A that represents a heap has two attributes, A.length which gives the number of elements of the
array and A.heap-size which represents how many elements in the heap are stored in A.
• In a max-heap, which is used for heapsort, the max-heap property is that A[parent(i)] ≥ A[i], so the
largest element is stored at the root. In a min-heap it is the opposite (the index arithmetic behind
parent(i) and the children is sketched after this list).
• The height of a node is the maximal length of a downward path from the node to a leaf (a node without
children). The height of the tree is the height of the root; since a heap with n elements is a nearly
complete binary tree, its height is Θ(lg n).
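A small Python sketch of the array-to-tree mapping behind the heap property above, using the book's 1-based indexing (the helper names and the unused slot A[0] are conventions of this sketch):

```python
# 1-based heap indexing: the root is stored at index 1, A[0] is left unused.
def parent(i):
    return i // 2

def left(i):
    return 2 * i

def right(i):
    return 2 * i + 1

def is_max_heap(A, heap_size):
    """Check the max-heap property A[parent(i)] >= A[i] for every non-root node."""
    return all(A[parent(i)] >= A[i] for i in range(2, heap_size + 1))
```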

➤6.2 Maintaining The Heap Property and Building A Max-Heap
• Maintaining the heap property is done with the procedure MAX-HEAPIFY, which lets the value at A[i]
float down so that the subtree rooted at index i again satisfies the max-heap property.
• By calling BUILD-MAX-HEAP, MAX-HEAPIFY is called bottom-up on all internal nodes, so that
afterwards the whole array satisfies the max-heap property (the array is a heap, not yet sorted).
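A Python sketch of the two procedures described above, again with 1-based indexing and an unused A[0] (an assumption of this sketch, matching the book's pseudocode conventions):

```python
def max_heapify(A, heap_size, i):
    """Let the value at A[i] float down until the subtree rooted at i is a max-heap.
    Assumes the subtrees rooted at left(i) and right(i) already satisfy the property."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]   # swap i with its larger child
        max_heapify(A, heap_size, largest)    # keep floating the value down

def build_max_heap(A, heap_size):
    """Turn an arbitrary array into a max-heap by heapifying every internal node, bottom up."""
    for i in range(heap_size // 2, 0, -1):
        max_heapify(A, heap_size, i)

# Example (index 0 unused):
# A = [None, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
# build_max_heap(A, 10)   ->   A[1:] == [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```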
