A Gentle Introduction to Data Structures: How Queues Work

Queues are one of the most fundamental and widely-used data structures in computer science. They show up everywhere from operating systems to web applications to real-life situations we encounter every day.

As a full-stack developer, having a solid grasp of how queues work, both conceptually and in code, is essential. But don't worry – by the end of this post, you'll be a queue master! Let's jump right in.

The Queue ADT

At its core, a queue is a linear data structure that follows the "First In, First Out" or FIFO principle. This means that the first element added to the queue will always be the first one to be removed.

Think of a queue like the line at a grocery store checkout:

  • The first customer to get in line is the first to be served and exit the line
  • New customers join the line at the back and wait their turn
  • Customers in the middle cannot leave until those ahead of them have been served

In computer science terms, we call the addition of an element to the back of the queue an "enqueue" operation, and the removal of an element from the front a "dequeue" operation.
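
In code, the two operations are symmetric. Here is a minimal sketch using a plain Python list (we'll build a proper class shortly):

```python
queue = []

# Enqueue: add to the back of the queue.
queue.append("first")
queue.append("second")

# Dequeue: remove from the front (index 0).
print(queue.pop(0))  # "first" -- FIFO order
print(queue.pop(0))  # "second"
```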

Here's a simple diagram illustrating the FIFO queue principle:

[Queue FIFO diagram]

Some other key terminology to know:

  • The "front" or "head" refers to the first element in the queue, which is next to be dequeued
  • The "rear" or "tail" is the last element, which was most recently enqueued
  • The queue is "empty" if it contains no elements
  • Some queue implementations have a fixed upper bound on the number of elements the queue can contain, while others allow the capacity to grow dynamically

Queue Operations

The core functions any queue data structure should support include:

  • enqueue(element) – Add an element to the back of the queue
  • dequeue() – Remove and return the element at the front of the queue
  • front() / peek() – Return the front element without removing it
  • isEmpty() – Return true if the queue contains no elements, false otherwise
  • size() / length() – Return the number of elements currently in the queue

Here is how we could define a Queue class with these operations in Python:

class Queue:
  def __init__(self):
    self.items = []

  def isEmpty(self):
    return len(self.items) == 0

  def enqueue(self, item):
    self.items.append(item)

  def dequeue(self):
    if self.isEmpty(): 
      raise Exception("Queue is empty")
    return self.items.pop(0)

  def front(self):
    if self.isEmpty():
      raise Exception("Queue is empty")
    return self.items[0]

  def size(self):
    return len(self.items)

We implement the queue using a standard Python list, with enqueue appending to the back and dequeue removing from the front. isEmpty and size simply check the list length.

Note that in a production implementation, we would want to use a more optimized queue design as we'll discuss later. The key point is that a queue has a strict access pattern – add to back, remove from front.

Complexity Analysis

Let's analyze the time and space complexity of our core queue operations, where n is the number of elements currently in the queue.

Enqueue:

  • Time complexity: O(1) – appending to the back of a Python list is amortized constant time
  • Space complexity: O(1) – enqueueing adds one element to the queue

Dequeue:

  • Time complexity: O(n) – removing from the front of a Python list requires shifting all elements over by one
  • Space complexity: O(1) – dequeuing removes one element from the queue

Front:

  • Time complexity: O(1) – accessing the first list element is constant time
  • Space complexity: O(1) – just returns one element

isEmpty and size:

  • Time complexity: O(1) – finding the list length is constant time
  • Space complexity: O(1) – just returns a boolean or integer

Overall, enqueueing is very efficient while dequeueing is linear time in this implementation since it uses a Python list. By using a different underlying structure like a linked list or ring buffer, we can achieve O(1) time for both enqueue and dequeue.
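
One practical option in Python is the built-in collections.deque, which supports constant-time appends and pops at both ends, making it a natural O(1) queue:

```python
from collections import deque

queue = deque()
queue.append("a")       # enqueue to the back: O(1)
queue.append("b")
queue.append("c")

print(queue.popleft())  # dequeue from the front: O(1), prints "a"
print(queue.popleft())  # prints "b"
print(len(queue))       # 1 element ("c") remaining
```

In practice, reaching for collections.deque is usually the right call in Python before writing a custom queue class.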

Circular Queues

One way to make dequeueing more efficient is to use a circular queue, also known as a ring buffer. This avoids the costly shifting of elements on dequeue.

The idea is that the queue "wraps around" from the end back to the beginning. We maintain two pointers, front and back, to track the current extent of the queue.

[Circular queue diagram]

When we dequeue, we simply increment the front pointer and ignore the dequeued slot. When we enqueue, we increment the back pointer and add the new element there. If back reaches the end, it wraps around to the start again.

Here's a Python implementation of a circular queue with a fixed capacity:

class CircularQueue:
  def __init__(self, capacity):
    self.items = [None] * capacity
    self.capacity = capacity 
    self.front = -1
    self.rear = -1
    self.size = 0

  def isEmpty(self):
    return self.size == 0

  def isFull(self):
    return self.size == self.capacity

  def enqueue(self, item):
    if self.isFull():
      raise Exception("Queue is full")
    self.rear = (self.rear + 1) % self.capacity
    self.items[self.rear] = item
    self.size += 1
    if self.front == -1:
      self.front = self.rear

  def dequeue(self):
    if self.isEmpty():
      raise Exception("Queue is empty")
    item = self.items[self.front]
    self.items[self.front] = None
    self.front = (self.front + 1) % self.capacity
    self.size -= 1
    if self.isEmpty():
      self.front = -1
      self.rear = -1
    return item

  def frontItem(self):
    if self.isEmpty():
      raise Exception("Queue is empty")
    return self.items[self.front]

  def rearItem(self):
    if self.isEmpty():
      raise Exception("Queue is empty")  
    return self.items[self.rear]

  def getSize(self):
    return self.size

The key aspects are:

  • Initialize queue items as fixed size list with capacity
  • Use front and rear pointers that increment with modular arithmetic
  • Check for full queue before enqueue and empty queue before dequeue
  • Enqueue to rear, dequeue from front
  • Reset pointers to -1 when empty

Now enqueue and dequeue are both O(1) time complexity! The tradeoff is we have a fixed capacity.

Priority Queues

Another useful queue variation is the priority queue. In a priority queue, elements are dequeued not based on insertion order, but on a priority value. Think of it like a hospital emergency room triage line.

The two types are:

  1. Ascending priority queue – element with minimum key value is dequeued first
  2. Descending priority queue – element with maximum key value is dequeued first

A common way to implement a priority queue is using a heap data structure. We'll cover heaps in detail in a future post, but the key ideas are:

  • Heaps are complete binary trees where each node's key is no smaller (max-heap) or no larger (min-heap) than its children's keys
  • Heap insertion and deletion are logarithmic time
  • Finding the min/max is constant time (just look at the root)

So by using a heap to track elements, a priority queue can support these core functions:

  • insert(element, key) – add element with priority key – O(log n)
  • getHighestPriority() – return the highest priority element – O(1)
  • removeHighestPriority() – remove and return the highest priority element – O(log n)

Other heap-based functions a priority queue may have:

  • remove(element) – remove a specific element – O(n)
  • changePriority(element, newPriority) – update an element's priority – O(log n) once the element's position is known (finding it takes O(n) without an auxiliary index)

The choice of a min-heap or max-heap depends on whether it should be an ascending or descending priority queue.

Here's a simple Python ascending priority queue using heapq:

import heapq

class PriorityQueue:
  def __init__(self):
    self.items = []

  def isEmpty(self):
    return len(self.items) == 0

  def insert(self, item, priority):
    heapq.heappush(self.items, (priority, item))

  def removeHighest(self):
    if self.isEmpty():
      raise Exception("Priority queue is empty")
    return heapq.heappop(self.items)[1]

  def peekHighest(self):
    if self.isEmpty():
      raise Exception("Priority queue is empty")
    return self.items[0][1]

  def size(self):
    return len(self.items)

heapq is a built-in Python module that provides min-heap functionality. We use tuples (priority, item) so that the heap orders elements by priority value. One caveat: if two entries have equal priorities, Python falls back to comparing the items themselves, which can fail for non-comparable types; the standard fix is to include an insertion counter in the tuple as a tie-breaker.

A priority queue is useful any time you need to efficiently access elements in sorted order or by highest/lowest priority.
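
For example, using heapq directly, the smallest key comes out first; and since heapq only provides a min-heap, a common trick for a descending (max-first) priority queue is to negate the keys:

```python
import heapq

# Ascending priority queue: the minimum key is dequeued first.
pq = []
heapq.heappush(pq, (2, "write report"))
heapq.heappush(pq, (1, "fix outage"))
heapq.heappush(pq, (3, "read email"))
print(heapq.heappop(pq)[1])      # "fix outage" -- key 1 wins

# Descending variant: negate keys so the maximum surfaces first.
max_pq = []
for priority, task in [(2, "b"), (5, "a"), (1, "c")]:
  heapq.heappush(max_pq, (-priority, task))
print(heapq.heappop(max_pq)[1])  # "a" -- key 5 wins
```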

Applications of Queues

Queues are used internally by operating systems and programming languages for managing threads, processes, async events, and more. At the application level, here are some common examples:

  • Web server request handling – enqueue incoming requests from clients and dequeue to process them in order
  • Breadth-first search graph traversal – use queue to track which node to visit next as you traverse a graph level-by-level
  • Cache management – queue tracks which items to evict when the cache reaches capacity (e.g. LRU cache)
  • Print spooling – documents are enqueued for printing and dequeued in order when the printer is ready
  • Event bus – listeners subscribe to events and are notified in FIFO order when those events are published to the queue
  • Message passing – enables asynchronous communication between threads or distributed systems

The common theme is that queues help manage the flow and order of items, requests, or tasks to ensure fair, predictable, and efficient processing.
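
As a concrete illustration of the BFS use case above, here is a minimal sketch over a small made-up adjacency-list graph; the queue guarantees nodes are visited level by level:

```python
from collections import deque

# A small example graph (adjacency lists), made up for illustration.
graph = {
  "A": ["B", "C"],
  "B": ["D"],
  "C": ["D"],
  "D": [],
}

def bfs(start):
  visited = []
  queue = deque([start])        # seed the queue with the start node
  seen = {start}
  while queue:
    node = queue.popleft()      # dequeue the oldest discovered node
    visited.append(node)
    for neighbor in graph[node]:
      if neighbor not in seen:
        seen.add(neighbor)
        queue.append(neighbor)  # enqueue newly discovered neighbors
  return visited

print(bfs("A"))  # ['A', 'B', 'C', 'D'] -- level by level
```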

Comparing Queues and Stacks

Queues are often compared to stacks since both are simple linear data structures with specific access patterns. But they have key differences:

Queues:

  • FIFO (First In First Out) – first element added is first removed
  • Elements added to back/rear and removed from front
  • Used for order, fairness, and flow control
  • Examples: BFS graph traversal, event bus, request handling

Stacks:

  • LIFO (Last In First Out) – last element added is first removed
  • Elements pushed and popped from the same end (top)
  • Used for nested function calls, undo history, parsing
  • Examples: DFS graph traversal, function call stack, parentheses matching

So while both only allow access at the ends, the difference between FIFO and LIFO significantly impacts their usage. Understanding which to use when is an important skill.
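
The contrast is easy to see side by side. A rough sketch using a deque as the queue and a plain list as the stack:

```python
from collections import deque

items = [1, 2, 3]

# Queue: FIFO -- elements come out in insertion order.
queue = deque(items)
fifo_order = [queue.popleft() for _ in items]
print(fifo_order)  # [1, 2, 3]

# Stack: LIFO -- elements come out in reverse insertion order.
stack = list(items)
lifo_order = [stack.pop() for _ in items]
print(lifo_order)  # [3, 2, 1]
```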

Conclusion

Congratulations on making it to the end of this queue journey! We covered the fundamentals of how queues work, from real-world analogies to implementation details to applications. The key points to take away are:

  • Queues are FIFO (First In First Out) data structures
  • Enqueue adds to the back, dequeue removes from front
  • Circular queues use modular arithmetic for efficient enqueue/dequeue
  • Priority queues order elements by a priority rather than FIFO
  • Queues are used to manage order, flow, and efficiency in many contexts

I hope you enjoyed this gentle introduction to queues. The best way to solidify your understanding is to implement them yourself and apply them to real problems.

Stay tuned for more in this data structures series where we'll explore heaps, linked lists, hash tables and more. Happy coding!
