Sunday, September 14, 2014

Time complexity of different data structures

Arrays

  • Inserting: O(1) to write at a known index, since arrays support direct index access; inserting into the middle of a dynamic array is O(n), because the later elements must shift (see the sketch below).
  • Deleting: O(n) if we first have to find the element, O(1) if we already know its position.
  • Searching: O(n) if the array is unsorted, O(log n) if it is sorted and something like binary search is used.
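
A minimal Java sketch of these costs (the class and variable names are just illustrative): writing by index is constant time, a mid-array insert copies elements, and a sorted array allows binary search.

    import java.util.Arrays;

    public class ArrayOps {
        public static void main(String[] args) {
            int[] a = {10, 20, 30, 40};

            // O(1): read or overwrite by index
            int old = a[2];
            a[2] = 35;

            // O(n): inserting 25 at index 1 means every later element shifts right
            int[] b = new int[a.length + 1];
            System.arraycopy(a, 0, b, 0, 1);
            b[1] = 25;
            System.arraycopy(a, 1, b, 2, a.length - 1);

            // O(n) linear search on an unsorted array; O(log n) binary search once sorted
            Arrays.sort(b);
            int pos = Arrays.binarySearch(b, 35);

            System.out.println(Arrays.toString(b) + " -> 35 at index " + pos + ", old a[2] was " + old);
        }
    }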

Linked List:

  • Inserting: O(1) at the head; O(n) anywhere else, since we have to reach that position by traversing the list linearly.
  • Deleting: O(1) at the head; O(n) anywhere else, for the same traversal reason.
  • Searching: O(n) (sketch below).
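
A rough sketch of a hand-rolled singly linked list (Node and SinglyLinkedList are illustrative names, not a standard API), showing the O(1) head insert and the O(n) search.

    public class SinglyLinkedList {
        static class Node {
            int value;
            Node next;
            Node(int value, Node next) { this.value = value; this.next = next; }
        }

        Node head;

        // O(1): the new node simply becomes the head
        void insertAtHead(int value) {
            head = new Node(value, head);
        }

        // O(n): walk the list until the value is found or the list ends
        boolean contains(int value) {
            for (Node n = head; n != null; n = n.next) {
                if (n.value == value) return true;
            }
            return false;
        }

        public static void main(String[] args) {
            SinglyLinkedList list = new SinglyLinkedList();
            list.insertAtHead(3);
            list.insertAtHead(2);
            list.insertAtHead(1);
            System.out.println(list.contains(3)); // true, after scanning to the tail
        }
    }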

Doubly-Linked List:

  • Inserting: O(1) at the head (and at the tail when a tail pointer is kept); O(n) anywhere else, since we still have to traverse to the position.
  • Deleting: O(1) at the head, or at any node we already hold a reference to, because both neighbours are reachable; O(n) if the node must be found first.
  • Searching: O(n) (example below).
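
For a concrete doubly-linked list, java.util.LinkedList can serve as a sketch: head and tail inserts are O(1) because both ends are tracked, while search and removal by value still scan linearly.

    import java.util.LinkedList;

    public class DoublyLinkedDemo {
        public static void main(String[] args) {
            LinkedList<Integer> list = new LinkedList<>(); // doubly-linked under the hood

            list.addFirst(2);   // O(1): insert at the head
            list.addLast(3);    // O(1): insert at the tail (a tail pointer is kept)
            list.addFirst(1);

            System.out.println(list.contains(3));  // O(n) search
            list.remove(Integer.valueOf(2));       // O(n): the node has to be found first
            System.out.println(list);              // [1, 3]
        }
    }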

Stack:

  • Push: O(1)
  • Pop: O(1)
  • Top: O(1)
  • Search (a lookup, if the implementation offers one): O(n), since every element may have to be inspected (sketch below).
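
A small sketch using java.util.ArrayDeque as a stack; push, pop and peek are constant time, and contains() is the O(n) lookup mentioned above.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class StackDemo {
        public static void main(String[] args) {
            Deque<String> stack = new ArrayDeque<>();

            stack.push("a");   // O(1)
            stack.push("b");   // O(1)

            System.out.println(stack.peek());        // O(1) top -> "b"
            System.out.println(stack.pop());         // O(1) -> "b"
            System.out.println(stack.contains("a")); // O(n) scan -> true
        }
    }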

Queue/Deque/Circular Queue:

  • Insert: O(1) (example below)
  • Remove: O(1)
  • Size: O(1)
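
A sketch of the same operations with java.util.ArrayDeque used as a queue/deque; insert, remove and size are all constant time.

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class QueueDemo {
        public static void main(String[] args) {
            Deque<Integer> queue = new ArrayDeque<>();

            queue.offerLast(1);  // O(1) enqueue at the tail
            queue.offerLast(2);
            queue.offerFirst(0); // O(1) insert at the head (deque operation)

            System.out.println(queue.pollFirst()); // O(1) dequeue -> 0
            System.out.println(queue.size());      // O(1) -> 2
        }
    }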

Binary Search Tree:

  • Insert, delete and search: average case O(log n); worst case O(n), when the tree degenerates into a linked list (sketch below).
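
A minimal, unbalanced BST sketch (the class and method names are illustrative). On random input the search path is about log n nodes long; inserting already-sorted keys degenerates the tree into a chain, which is the O(n) worst case.

    public class Bst {
        static class Node {
            int key;
            Node left, right;
            Node(int key) { this.key = key; }
        }

        Node root;

        // Average O(log n); worst case O(n) when the tree is a long chain
        void insert(int key) {
            root = insert(root, key);
        }

        private Node insert(Node node, int key) {
            if (node == null) return new Node(key);
            if (key < node.key) node.left = insert(node.left, key);
            else if (key > node.key) node.right = insert(node.right, key);
            return node;
        }

        // Same cost profile as insert: follow one path down from the root
        boolean contains(int key) {
            Node n = root;
            while (n != null) {
                if (key == n.key) return true;
                n = key < n.key ? n.left : n.right;
            }
            return false;
        }

        public static void main(String[] args) {
            Bst t = new Bst();
            for (int k : new int[]{5, 3, 8, 1}) t.insert(k);
            System.out.println(t.contains(8)); // true
            System.out.println(t.contains(7)); // false
        }
    }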

Heap/PriorityQueue (min/max):

  • findMin/findMax: O(1)
  • insert: O(log n)
  • deleteMin/deleteMax: O(log n)
  • lookup or arbitrary delete (if provided at all): O(n), since we have to scan all the elements; a heap is only partially ordered, unlike a BST (example below).
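
A sketch with java.util.PriorityQueue, which is a binary heap: peek is O(1), offer and poll are O(log n), and removing an arbitrary element has to scan the backing array.

    import java.util.Comparator;
    import java.util.PriorityQueue;

    public class HeapDemo {
        public static void main(String[] args) {
            PriorityQueue<Integer> minHeap = new PriorityQueue<>(); // min-heap by default
            PriorityQueue<Integer> maxHeap = new PriorityQueue<>(Comparator.reverseOrder());

            for (int v : new int[]{5, 1, 4, 2}) {
                minHeap.offer(v);  // O(log n) insert
                maxHeap.offer(v);
            }

            System.out.println(minHeap.peek());                      // O(1) findMin -> 1
            System.out.println(maxHeap.peek());                      // O(1) findMax -> 5
            System.out.println(minHeap.poll());                      // O(log n) deleteMin -> 1
            System.out.println(minHeap.remove(Integer.valueOf(4)));  // O(n) arbitrary delete -> true
        }
    }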

HashMap/Hashtable/HashSet:

  • Insert/Delete: O(1) amortized average case
  • Resize/rehash: O(n)
  • Contains: O(1) average case (sketch below)
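
A small HashMap sketch (the map contents are made up); put, get, containsKey and remove are O(1) on average, with the occasional O(n) resize/rehash hidden inside put.

    import java.util.HashMap;
    import java.util.Map;

    public class HashMapDemo {
        public static void main(String[] args) {
            Map<String, Integer> ages = new HashMap<>();

            ages.put("alice", 30);  // O(1) amortized (an occasional resize/rehash costs O(n))
            ages.put("bob", 25);

            System.out.println(ages.containsKey("alice")); // O(1) average -> true
            System.out.println(ages.get("bob"));           // O(1) average -> 25

            ages.remove("alice");                          // O(1) average
            System.out.println(ages.size());               // 1
        }
    }
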
Special thanks to Stack Overflow!
