Explain the differences between arrays and linked lists. When would you use each?

Arrays:

  • Memory Allocation: Contiguous block of memory.
  • Access Time: O(1) - Constant time for access with an index.
  • Insertion/Deletion: O(n) - Linear time, as it may require shifting elements.
  • Use When: You need fast random access by index and the size of the collection is fixed or known in advance.

Linked Lists:

  • Memory Allocation: Non-contiguous memory, with elements linked by pointers.
  • Access Time: O(n) - Linear time, as you have to traverse the list.
  • Insertion/Deletion: O(1) - Constant time if you already hold a reference to the node (and, in a singly linked list, to its predecessor); locating that node in the first place still costs O(n).
  • Use When: You need frequent insertions and deletions and the size of the collection is dynamic.
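A minimal Python sketch of the contrast (illustrative only; Python's built-in `list` is a dynamic array, and the `Node` class here is a hypothetical singly linked list node):

```python
class Node:
    """A node in a singly linked list."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

# Array (Python list): O(1) access by index, O(n) insert at the front.
arr = [10, 20, 30]
assert arr[1] == 20          # constant-time indexed access
arr.insert(0, 5)             # shifts every element right: O(n)

# Linked list: O(n) access, O(1) insert once you hold a node reference.
head = Node(10, Node(20, Node(30)))
head = Node(5, head)         # O(1) insert at the head: just relink a pointer

# Reading the 3rd element requires walking the list: O(n).
node = head
for _ in range(2):
    node = node.next
assert node.value == 20
```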

What are the time complexities of insertion, deletion, and search in a binary search tree?

For a balanced Binary Search Tree (BST):

  • Insertion: Average: O(log n), Worst: O(n)
  • Deletion: Average: O(log n), Worst: O(n)
  • Search: Average: O(log n), Worst: O(n)

The worst-case scenario occurs when the tree is unbalanced (skewed), resembling a linked list.
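A minimal (unbalanced) BST sketch showing insert and search; inserting sorted keys into this structure produces exactly the skewed, linked-list-like worst case described above:

```python
class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Recursive BST insert: O(log n) average, O(n) if the tree is skewed."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Walk down the tree, choosing a side at each node: O(height)."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

root = None
for k in [8, 3, 10, 1, 6]:
    root = insert(root, k)
assert search(root, 6)
assert not search(root, 7)
```

Self-balancing variants (AVL, red-black trees) keep the height at O(log n) and so guarantee the average-case bounds even in the worst case.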

How would you detect and remove a cycle in a linked list?

Detection: Use Floyd’s Cycle-Finding Algorithm (also known as the Tortoise and Hare algorithm). Two pointers, a slow one that moves one step at a time and a fast one that moves two steps at a time, are used. If they meet, there is a cycle.

Removal:

  1. Once a cycle is detected, keep one pointer at the meeting point and move the other to the head of the list.
  2. Move both pointers one step at a time. The point where they meet again is the start of the loop.
  3. To remove the cycle, traverse to the node just before the start of the loop and set its `next` pointer to `null`.
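The three steps above can be sketched in Python as follows (the `ListNode` class is a hypothetical minimal node type):

```python
class ListNode:
    def __init__(self, value):
        self.value = value
        self.next = None

def detect_and_remove_cycle(head):
    """Return True and break the cycle if one exists, else return False."""
    # Phase 1: Floyd's tortoise and hare. If slow and fast meet, there is a cycle.
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            break
    else:
        return False  # fast hit the end of the list: no cycle

    # Phase 2: move one pointer back to head; stepping both one node at a
    # time, they meet at the start of the cycle.
    slow = head
    while slow is not fast:
        slow = slow.next
        fast = fast.next

    # Phase 3: walk to the node just before the cycle start and cut the link.
    while fast.next is not slow:
        fast = fast.next
    fast.next = None
    return True

# Build 1 -> 2 -> 3 -> 4 -> (back to 3), then repair it.
a, b, c, d = ListNode(1), ListNode(2), ListNode(3), ListNode(4)
a.next, b.next, c.next, d.next = b, c, d, c
assert detect_and_remove_cycle(a)
assert d.next is None
```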

Compare QuickSort, MergeSort, and HeapSort in terms of complexity and practical performance.

  • QuickSort: Average time complexity of O(n log n), but worst-case is O(n^2). It's an in-place sort, so space complexity is O(log n) for the recursion stack. It's generally the fastest in practice but is not stable.
  • MergeSort: Time complexity is always O(n log n). It requires O(n) extra space, making it unsuitable for large datasets in memory-constrained environments. It is a stable sort.
  • HeapSort: Time complexity is always O(n log n). It's an in-place sort with a space complexity of O(1). It's not stable and is generally slower in practice than a well-implemented QuickSort.
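Minimal sketches of the three algorithms for reference (the quicksort below is written out-of-place for brevity, so unlike the in-place textbook version it uses O(n) extra space; the heapsort leans on Python's `heapq` rather than building the heap by hand):

```python
import heapq
import random

def quicksort(items):
    """Average O(n log n); degrades to O(n^2) on adversarial pivot choices."""
    if len(items) <= 1:
        return list(items)
    pivot = items[len(items) // 2]
    return (quicksort([x for x in items if x < pivot])
            + [x for x in items if x == pivot]
            + quicksort([x for x in items if x > pivot]))

def mergesort(items):
    """Always O(n log n); needs O(n) auxiliary space for the merge."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = mergesort(items[:mid]), mergesort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # '<=' keeps equal elements in their original order: this is
        # precisely what makes merge sort stable.
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

def heapsort(items):
    """Always O(n log n): heapify is O(n), each of the n pops is O(log n)."""
    heap = list(items)
    heapq.heapify(heap)
    return [heapq.heappop(heap) for _ in range(len(heap))]

data = random.sample(range(1000), 100)
assert quicksort(data) == sorted(data)
assert mergesort(data) == sorted(data)
assert heapsort(data) == sorted(data)
```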

Explain dynamic programming. Can you solve the classic Knapsack problem using DP?

Dynamic Programming (DP) is a method for solving complex problems by breaking them down into simpler subproblems. It is applicable where problems have overlapping subproblems and optimal substructure. The results of subproblems are stored to avoid re-computation.

The 0/1 Knapsack Problem is a classic DP problem: given a set of items, each with a weight and a value, decide which items to include (each item can be taken at most once, hence "0/1") so that the total weight does not exceed a given capacity and the total value is as large as possible. You can solve it with a DP table that stores the maximum value achievable for each weight capacity.
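A compact sketch of the table-based solution, using the common space-optimized 1D form of the DP table:

```python
def knapsack_01(weights, values, capacity):
    """Return the maximum total value with total weight <= capacity."""
    # dp[w] = best value achievable using weight budget w.
    dp = [0] * (capacity + 1)
    for weight, value in zip(weights, values):
        # Iterate w downwards so each item is counted at most once
        # (iterating upwards would give the *unbounded* knapsack instead).
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w], dp[w - weight] + value)
    return dp[capacity]

# Items (weight, value): (1,1), (3,4), (4,5), (5,7); capacity 7.
# Best choice is weights 3 + 4 for value 4 + 5 = 9.
assert knapsack_01([1, 3, 4, 5], [1, 4, 5, 7], 7) == 9
```

This runs in O(n * capacity) time and O(capacity) space, versus O(n * capacity) space for the full 2D table.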

What is the difference between greedy algorithms and dynamic programming? Provide examples.

Greedy Algorithms: Make the locally optimal choice at each step with the hope of finding a global optimum. They don't reconsider their choices. Example: Activity Selection Problem.

Dynamic Programming: Considers the consequences of each choice by solving subproblems and combining their solutions into the solution of the larger problem; unlike a greedy algorithm, it can recover from choices that are locally suboptimal but globally better. Example: 0/1 Knapsack Problem.
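The Activity Selection Problem mentioned above is a good illustration of a greedy algorithm, because its locally optimal choice (always take the activity that finishes earliest) is provably globally optimal. A short sketch:

```python
def select_activities(intervals):
    """Pick a maximum set of non-overlapping (start, finish) activities."""
    # Greedy rule: sort by finish time and always take the activity
    # that frees up the schedule earliest.
    chosen, last_finish = [], float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11)]
assert select_activities(activities) == [(1, 4), (5, 7), (8, 11)]
```

The same "take the best-looking item now" strategy fails for 0/1 knapsack (e.g. greedily taking the highest-value item can block a better combination), which is why that problem needs DP.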

How do you implement a priority queue? What data structures can be used?

A priority queue is an abstract data type that is like a regular queue but where each element has a "priority" associated with it. An element with high priority is served before an element with low priority.

It can be implemented using various data structures, but the most common efficient choice is a binary heap (a min-heap or max-heap). A heap gives O(log n) insertion and removal of the highest-priority element, plus O(1) access to it; simpler alternatives like an unsorted or sorted list make one of those operations O(n).
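A small sketch using Python's `heapq` module, which maintains a binary min-heap inside a plain list (the tie-breaking counter is an assumption of this sketch, added so that equal-priority items come out in insertion order):

```python
import heapq

class PriorityQueue:
    """Min-priority queue: lower priority number is served first."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # breaks ties FIFO among equal priorities

    def push(self, item, priority):
        # O(log n): sift the new entry up the heap.
        heapq.heappush(self._heap, (priority, self._counter, item))
        self._counter += 1

    def pop(self):
        # O(log n): remove the root and sift the last entry down.
        return heapq.heappop(self._heap)[2]

pq = PriorityQueue()
pq.push("low", 5)
pq.push("urgent", 1)
pq.push("medium", 3)
assert pq.pop() == "urgent"
assert pq.pop() == "medium"
```

For a max-priority queue, a common trick with `heapq` is to negate the priority on push and pop.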