Introduction to Queues

Queues are a fundamental data structure in computer science, widely used in various algorithms and applications. Understanding the queue's basic principles and its differences from other data structures is crucial for solving problems efficiently.

A queue is a linear data structure that follows a specific order in which elements are added and removed. The most defining characteristic of a queue is that it operates on a FIFO (First In, First Out) basis. This means that the first element added to the queue will be the first one to be removed.

Imagine a queue as a line of people waiting for service at a bank. The person who arrives first will be the first to be served and leave the queue. Similarly, in a queue data structure, elements are added to the back (also called the "rear") and removed from the front.

Following is a simple representation of a queue:

[Front] [A] [B] [C] [D] [Rear]
  • Enqueue: Adding an element to the rear of the queue (like E).

  • Dequeue: Removing an element from the front of the queue (like A).

FIFO (First In, First Out) Principle

The FIFO principle is the cornerstone of how queues operate. In a FIFO system, the order of processing elements is strictly determined by the order of their arrival. The first element inserted into the queue is the first one to be removed, ensuring that elements are processed in the same order they were added.

This behavior is crucial in many real-world applications:

  • Task Scheduling: In operating systems, processes are often managed using queues. The first process to enter the queue is the first to be executed, ensuring fairness and efficiency.

  • Customer Service: Whether at a ticket counter or in a call center, customers are usually served in the order they arrive, making queues an ideal structure for managing such scenarios.

Comparison with Stacks and Other Data Structures

While a queue follows the FIFO principle, a stack operates on a LIFO (Last In, First Out) basis. In a stack, the last element added is the first one to be removed, similar to a stack of plates where you can only take the top plate.

  • Queue:

    • Order: FIFO (First In, First Out)

    • Operations:

      • Enqueue: Add an element to the rear

      • Dequeue: Remove an element from the front

    • Use Cases: Task scheduling, print queue management, buffering

  • Stack:

    • Order: LIFO (Last In, First Out)

    • Operations:

      • Push: Add an element to the top

      • Pop: Remove an element from the top

    • Use Cases: Function call management (call stack), undo mechanisms, backtracking algorithms

  • Array/List:

    • Order: Typically, no strict order of processing (depends on how you use it)

    • Operations:

      • Insert: Add an element at a specified position

      • Delete: Remove an element from a specified position

    • Use Cases: General-purpose storage, random access

  • Linked List:

    • Order: Elements are linked in a sequence, but not necessarily FIFO or LIFO

    • Operations:

      • Insert: Add an element at the beginning, end, or middle

      • Delete: Remove an element from anywhere in the list

    • Use Cases: Dynamic data structures, when frequent insertion and deletion are required

Key Differences:

  • Queue vs. Stack: Queues process elements in the order they arrive (FIFO), while stacks process elements in the reverse order (LIFO). Queues are often used where fairness and order are important, whereas stacks are used in scenarios where the most recent item needs to be accessed first.

  • Queue vs. Arrays/Lists: While arrays/lists allow random access to elements, queues maintain a strict order of processing. Queues are more specialized and efficient for scenarios where elements need to be processed in the order they are received.

  • Queue vs. Linked Lists: Queues can be implemented using linked lists, but the key difference lies in how the data is processed. Linked lists allow more flexible insertion and deletion, while queues enforce a strict FIFO order.

Queue Representation Using Arrays

Queues can be implemented using a static array, where a fixed-size array is used to store the elements of the queue. This implementation is straightforward and provides an efficient way to manage a queue, but it comes with some limitations, particularly regarding size and flexibility.

Static Array Implementation

In a static array-based queue, the queue is represented as an array with a fixed size. Two indices are maintained to keep track of the front and rear of the queue:

  • Front: The index of the element at the front of the queue (the first element to be dequeued).

  • Rear: The index of the element at the rear of the queue (the last element that was enqueued).

Initially, both front and rear are set to -1 to indicate that the queue is empty.

Basic Structure:

#include <stdio.h>

#define MAX 100  // Maximum size of the queue

typedef struct Queue {
    int arr[MAX];  // Array to store queue elements
    int front;     // Index of the front element
    int rear;      // Index of the rear element
} Queue;
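
The structure above does not include an initializer. A minimal sketch of one (the name initQueue is an assumption, not part of the original listing) could be:

// Hypothetical helper: mark the queue as empty before first use.
void initQueue(Queue* queue) {
    queue->front = -1;  // -1 signals that the queue holds no elements
    queue->rear = -1;
}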

Operations: Enqueue, Dequeue, Front, Rear

  1. Enqueue Operation:

    The enqueue operation adds an element to the rear of the queue. If the queue is full, the enqueue operation cannot proceed.

    Steps:

    • Check if the queue is full (rear == MAX - 1).

    • If the queue is empty (front == -1), set front to 0.

    • Increment rear and insert the new element at rear.

Code:

    void enqueue(Queue* queue, int item) {
        if (queue->rear == MAX - 1) {
            printf("Queue Overflow\n");
            return;
        }
        if (queue->front == -1) {
            queue->front = 0;
        }
        queue->arr[++queue->rear] = item;
        printf("%d enqueued to queue\n", item);
    }

  2. Dequeue Operation:

    The dequeue operation removes an element from the front of the queue. If the queue is empty, the dequeue operation cannot proceed.

    Steps:

    • Check if the queue is empty (front == -1).

    • Retrieve the element at the front.

    • If front equals rear, the queue becomes empty, so reset front and rear to -1.

    • Otherwise, increment front to remove the element.

Code:

    int dequeue(Queue* queue) {
        if (queue->front == -1) {
            printf("Queue Underflow\n");
            return -1;
        }
        int item = queue->arr[queue->front];
        if (queue->front == queue->rear) {
            queue->front = queue->rear = -1;  // Queue becomes empty
        } else {
            queue->front++;
        }
        return item;
    }

  3. Front Operation:

    The front operation returns the element at the front of the queue without removing it.

    Steps:

    • Check if the queue is empty (front == -1).

    • Return the element at the front index.

Code:

    int front(Queue* queue) {
        if (queue->front == -1) {
            printf("Queue is empty\n");
            return -1;
        }
        return queue->arr[queue->front];
    }

  4. Rear Operation:

    The rear operation returns the element at the rear of the queue without removing it.

    Steps:

    • Check if the queue is empty (rear == -1; equivalently, front == -1).

    • Return the element at the rear index.

Code:

    int rear(Queue* queue) {
        if (queue->rear == -1) {
            printf("Queue is empty\n");
            return -1;
        }
        return queue->arr[queue->rear];
    }
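
Putting the pieces together, a minimal usage sketch (assuming the initQueue helper sketched earlier) might look like this:

int main() {
    Queue queue;
    initQueue(&queue);

    enqueue(&queue, 10);
    enqueue(&queue, 20);
    enqueue(&queue, 30);

    printf("Front element is: %d\n", front(&queue));   // prints 10
    printf("Rear element is: %d\n", rear(&queue));     // prints 30

    printf("%d dequeued from queue\n", dequeue(&queue));  // removes 10
    printf("Front element is now: %d\n", front(&queue));  // prints 20

    return 0;
}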

Complexity Analysis

The operations on a queue implemented using a static array generally have efficient time complexity. However, there are some trade-offs to consider.

  1. Time Complexity:

    • Enqueue (enqueue): O(1) – Adding an element to the rear of the queue is a constant-time operation.

    • Dequeue (dequeue): O(1) – Removing an element from the front of the queue is a constant-time operation.

    • Front (front): O(1) – Accessing the front element is a constant-time operation.

    • Rear (rear): O(1) – Accessing the rear element is a constant-time operation.

  2. Space Complexity:

    • The space complexity is O(n), where n is the maximum size of the queue (MAX). This is because we must allocate memory for the entire array upfront, even if the queue is not full.
  3. Limitations:

    • Fixed Size: The biggest limitation of a static array-based queue is its fixed size. Once the array is full, you cannot add more elements without resizing or reallocating memory, which can be costly.

    • Underutilization: As elements are dequeued, the slots at the front of the array are never reused, so parts of the array sit empty and space is wasted.

Queue Representation Using Linked Lists

Queues can also be implemented using linked lists, which provide a more dynamic and flexible approach compared to array-based implementations. In a linked list-based queue, each element of the queue is represented as a node in a linked list, and the queue can grow or shrink dynamically without the constraints of a fixed size.

Dynamic Linked List Implementation

In a linked list-based queue, each node contains two parts:

  1. Data: The value stored in the node.

  2. Next: A pointer to the next node in the queue.

The queue is managed using two pointers:

  • Front: Points to the front node (the first node) in the queue.

  • Rear: Points to the rear node (the last node) in the queue.

Initially, both front and rear are set to NULL to indicate that the queue is empty.

Basic Structure:

#include <stdio.h>
#include <stdlib.h>

// Define the structure of a node in the linked list
typedef struct Node {
    int data;
    struct Node* next;
} Node;

// Define the structure of the queue
typedef struct Queue {
    Node* front;
    Node* rear;
} Queue;
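
As with the array version, an initializer is implied but not shown. A minimal sketch (initQueue is again an assumed name):

// Hypothetical helper: an empty queue has both pointers set to NULL.
void initQueue(Queue* queue) {
    queue->front = NULL;
    queue->rear = NULL;
}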

Operations: Enqueue, Dequeue, Front, Rear

  1. Enqueue Operation:

    The enqueue operation adds an element to the rear of the queue. A new node is created, and if the queue is empty, both front and rear pointers are updated to point to this new node. Otherwise, the new node is linked to the end of the queue, and rear is updated.

    Steps:

    • Create a new node with the given data.

    • If the queue is empty (rear == NULL), set both front and rear to point to the new node.

    • Otherwise, link the new node to the current rear, and update rear to point to the new node.

Code:

    void enqueue(Queue* queue, int item) {
        Node* newNode = (Node*)malloc(sizeof(Node));
        newNode->data = item;
        newNode->next = NULL;

        if (queue->rear == NULL) {
            queue->front = queue->rear = newNode;
            return;
        }

        queue->rear->next = newNode;
        queue->rear = newNode;
    }

  2. Dequeue Operation:

    The dequeue operation removes an element from the front of the queue. If the queue is empty, the operation cannot proceed. Otherwise, the front node is removed, and front is updated to point to the next node. If the queue becomes empty after the operation, rear is also set to NULL.

    Steps:

    • Check if the queue is empty (front == NULL).

    • Store the front node’s data.

    • Update front to point to the next node.

    • If front becomes NULL, set rear to NULL as well.

    • Free the old front node and return the stored data.

Code:

    int dequeue(Queue* queue) {
        if (queue->front == NULL) {
            printf("Queue Underflow\n");
            return -1;
        }

        Node* temp = queue->front;
        int item = temp->data;
        queue->front = queue->front->next;

        if (queue->front == NULL) {
            queue->rear = NULL;
        }

        free(temp);
        return item;
    }

  3. Front Operation:

    The front operation returns the element at the front of the queue without removing it.

    Steps:

    • Check if the queue is empty (front == NULL).

    • Return the data of the front node.

Code:

    int front(Queue* queue) {
        if (queue->front == NULL) {
            printf("Queue is empty\n");
            return -1;
        }
        return queue->front->data;
    }

  4. Rear Operation:

    The rear operation returns the element at the rear of the queue without removing it.

    Steps:

    • Check if the queue is empty (rear == NULL).

    • Return the data of the rear node.

Code:

    int rear(Queue* queue) {
        if (queue->rear == NULL) {
            printf("Queue is empty\n");
            return -1;
        }
        return queue->rear->data;
    }
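
A minimal usage sketch for the linked list version, again assuming the initQueue helper above:

int main() {
    Queue queue;
    initQueue(&queue);

    enqueue(&queue, 1);
    enqueue(&queue, 2);
    enqueue(&queue, 3);

    printf("Front element is: %d\n", front(&queue));    // prints 1
    printf("Rear element is: %d\n", rear(&queue));      // prints 3

    printf("%d dequeued from queue\n", dequeue(&queue)); // removes 1
    printf("Front element is now: %d\n", front(&queue)); // prints 2

    return 0;
}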

Advantages Over Array-Based Implementation

  1. Dynamic Size:

    • Flexibility: Unlike arrays, linked lists do not require a predetermined size. The queue can grow and shrink dynamically as elements are added or removed, making it ideal for scenarios where the number of elements is unknown or varies widely.

    • No Overflow: In an array-based implementation, the queue has a fixed maximum size, and trying to add more elements than this size causes an overflow. In a linked list-based queue, there's no fixed size, so you only run out of space when the system's memory is exhausted.

  2. Efficient Memory Utilization:

    • No Wasted Space: In a linked list, memory is allocated only when needed, unlike an array where a fixed block of memory is reserved regardless of usage. This leads to better memory utilization, especially in cases where the queue size fluctuates significantly.

    • No Need for Resizing: In arrays, if the queue grows beyond the initially allocated size, you may need to resize the array, which is an expensive operation. Linked lists naturally expand without the need for resizing.

  3. Simpler Dequeue Operation:

    • No Shifting Required: In a static array implementation, dequeuing an element may require shifting all the remaining elements forward (in some implementations). In a linked list, removing the front element is straightforward and does not involve shifting, making the operation more efficient.
  4. Better Performance in Specific Scenarios:

    • Frequent Insertions and Deletions: Linked lists excel in scenarios where frequent insertions and deletions are required, as these operations can be performed efficiently without worrying about resizing or shifting elements.

Queue ADT: Interface and Descriptions

The Queue Abstract Data Type (ADT) defines a queue as a collection of elements with specific operations that follow the First In, First Out (FIFO) principle. The key aspect of an ADT is that it specifies what operations are available for use without detailing how they are implemented. This abstraction allows users to interact with the queue through a consistent interface, regardless of the underlying data structure (array, linked list, etc.).

Defining the Abstract Data Type for a Queue

The Queue ADT provides a conceptual framework for what a queue should do, but it does not prescribe how the queue should be implemented. The primary operations of a Queue ADT are designed to allow the addition, removal, and inspection of elements, as well as checking the state of the queue (whether it is empty or full).

Basic Interface of the Queue ADT:

  1. enqueue(item):

    • Description: Adds an item to the rear of the queue.

    • Purpose: This operation increases the size of the queue by adding a new element at the end. It ensures that the FIFO order is maintained.

    • Preconditions: The queue must not be full (if there is a capacity limit).

    • Postconditions: The queue contains the new item at the rear, and the number of elements in the queue is incremented by one.

  2. dequeue():

    • Description: Removes and returns the item at the front of the queue.

    • Purpose: This operation decreases the size of the queue by removing the element that has been in the queue the longest. It maintains the FIFO order by ensuring that the oldest element is always removed first.

    • Preconditions: The queue must not be empty.

    • Postconditions: The queue no longer contains the front item, and the number of elements in the queue is decremented by one.

  3. peek() (or front()):

    • Description: Returns the item at the front of the queue without removing it.

    • Purpose: This operation allows the user to see the item that is next in line to be dequeued, without altering the state of the queue.

    • Preconditions: The queue must not be empty.

    • Postconditions: The queue remains unchanged.

  4. isEmpty():

    • Description: Returns true if the queue is empty, false otherwise.

    • Purpose: This operation checks whether there are any elements in the queue. It is used to prevent errors when trying to dequeue or peek at an empty queue.

    • Preconditions: None.

    • Postconditions: The state of the queue is not altered.

  5. isFull() (optional, for fixed-size queues):

    • Description: Returns true if the queue is full, false otherwise.

    • Purpose: This operation checks whether the queue has reached its maximum capacity. It is particularly relevant for fixed-size queues implemented using arrays.

    • Preconditions: The queue has a defined maximum size.

    • Postconditions: The state of the queue is not altered.
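
In C, this interface can be expressed as a small set of prototypes that stays the same regardless of the backing data structure. The sketch below is illustrative only; the names and the use of <stdbool.h> are assumptions, not a standard API:

#include <stdbool.h>

typedef struct Queue Queue;            // Opaque type: the layout is hidden from the user

void enqueue(Queue* queue, int item);  // Add an item at the rear
int  dequeue(Queue* queue);            // Remove and return the item at the front
int  peek(const Queue* queue);         // Return the front item without removing it
bool isEmpty(const Queue* queue);      // True if the queue holds no elements
bool isFull(const Queue* queue);       // True if a fixed-capacity queue is full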

Implementation Details Abstracted from the User

One of the key principles of an ADT is that the implementation details are hidden from the user. The user interacts with the queue through the provided interface (e.g., enqueue, dequeue, peek), without needing to know how these operations are carried out internally. This abstraction allows the underlying data structure (whether it's an array, linked list, or some other mechanism) to be changed without affecting the user's code.

Example of Abstraction:

  • Array-Based Queue: If the queue is implemented using an array, the enqueue operation might involve checking for available space, inserting the element at the rear index, and (for a dynamically allocated array) resizing when capacity is reached.

  • Linked List-Based Queue: If the queue is implemented using a linked list, the enqueue operation would involve creating a new node, linking it to the rear of the list, and updating the rear pointer.

Despite these differences, the user simply calls enqueue(item) and does not need to worry about whether the queue is using an array or a linked list behind the scenes. This allows the developer to focus on the higher-level logic of the program without getting bogged down by the implementation specifics of the queue.

Linear Queue Drawbacks

Linear queues, especially those implemented using static arrays, are simple and easy to understand. However, they come with significant limitations that can make them inefficient in certain scenarios. The primary drawbacks include underutilization of space, problems with reaching maximum capacity, and the need for reallocation or shifting elements.

Issues with Static Queues: Underutilization of Space

In a linear queue implemented with a static array, elements are added at the rear and removed from the front. Over time, as elements are dequeued, the space at the front of the array is not reused. This can lead to a situation where there is unused space at the beginning of the array, even though the queue might be full when considering only the elements between the front and rear.

Example:

  • Consider a queue of size 5: [A, B, C, D, E] with front at index 0 and rear at index 4.

  • If you dequeue two elements (A and B), the queue looks like this: [ , , C, D, E] with front at index 2 and rear at index 4.

  • Even though there are only 3 elements in the queue, there is no space left to add new elements because the rear has reached the end of the array. The front space is underutilized.

Problems Caused by the Queue Reaching Maximum Capacity

When a queue implemented using a static array reaches its maximum capacity, no more elements can be enqueued until space is freed by dequeuing elements. This is problematic in dynamic environments where the number of elements can vary significantly. If the array is full, and you attempt to enqueue an element, the queue will experience an overflow condition, even if there is unused space at the beginning of the array.

Example:

  • Following the above scenario, if you attempt to enqueue a new element F, you cannot do so even though the array still has unused space at the beginning ([ , , C, D, E]).

Need for Reallocation or Shifting Elements

To address the problem of underutilization and the queue reaching maximum capacity, one solution is to shift all elements towards the front of the array when a dequeue operation is performed. However, this shifting is computationally expensive (O(n) time complexity) and can significantly degrade performance.

Another approach is to reallocate the array to a larger size when it becomes full, copying all existing elements to the new array. This is also costly in terms of both time and memory.

Example:

  • Shifting elements [C, D, E] to the front after dequeuing A and B results in [C, D, E, , ]. This requires moving each element one or more positions, which is inefficient.
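
For illustration, a shifting variant of dequeue (a hypothetical alternative, not the implementation shown earlier) keeps the front element fixed at index 0 and moves every remaining element forward; the loop is what makes it O(n):

// Hypothetical shifting dequeue for the array-based Queue struct:
// the front element always lives at index 0, and rear == -1 means empty.
int dequeueWithShift(Queue* queue) {
    if (queue->rear == -1) {
        printf("Queue Underflow\n");
        return -1;
    }
    int item = queue->arr[0];
    // Shift the remaining elements one slot toward the front: O(n) moves.
    for (int i = 0; i < queue->rear; i++) {
        queue->arr[i] = queue->arr[i + 1];
    }
    queue->rear--;
    return item;
}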

Circular Queues: Overcoming the Limitations of Linear Queues

Circular queues are designed to overcome the limitations of linear queues by treating the array as a circular buffer. In a circular queue, the positions of the front and rear pointers wrap around to the beginning of the array when they reach the end. This allows the entire array to be utilized efficiently without the need for shifting elements or reallocating space.

Concept and Implementation of Circular Queues

In a circular queue, the rear pointer moves forward as elements are enqueued, and the front pointer moves forward as elements are dequeued. If either pointer reaches the end of the array, it wraps around to the beginning.

Key Points:

  • Enqueue: Add an element at the rear position and move rear to the next position. If rear reaches the end, it wraps around to the beginning.

  • Dequeue: Remove an element from the front position and move front to the next position. If front reaches the end, it wraps around to the beginning.

  • Full Queue Condition: The queue is considered full if (rear + 1) % size == front.

  • Empty Queue Condition: In the implementation below, the queue is empty when front == -1 (a sentinel value). A common alternative treats front == rear as empty, but that convention must leave one slot unused to tell a full queue apart from an empty one.

Circular Queue Representation:

  • Consider an array of size 5 with front and rear initially set to -1 (empty).

  • After enqueueing elements A, B, C, D, E, the queue is full, with front at index 0 and rear at index 4. Once elements are dequeued from the front, the next enqueue wraps rear around to index 0.

How Circular Queues Use Space More Efficiently

Circular queues use space more efficiently because they allow elements to be enqueued in the unused space at the front of the array when the rear reaches the end. This ensures that no space is wasted, unlike in linear queues where unused space at the front cannot be reclaimed without shifting elements.

Example:

  • After dequeuing A and B, front moves to index 2, and rear is at index 4.

  • Enqueueing a new element F will place it at index 0, efficiently utilizing the previously unused space: [F, , C, D, E].

Handling Circular Buffer Wrap-Around Conditions

The key to managing a circular queue is to handle the wrap-around conditions for the front and rear pointers. When these pointers reach the end of the array, they need to wrap around to the beginning:

  • Incrementing rear: (rear + 1) % size

  • Incrementing front: (front + 1) % size

These calculations ensure that the rear and front pointers stay within the bounds of the array and wrap around as needed.

Below is a C implementation of a circular queue using a static array.

#include <stdio.h>
#include <stdlib.h>

#define SIZE 5  // Define the size of the circular queue

typedef struct CircularQueue {
    int items[SIZE];
    int front;
    int rear;
} CircularQueue;

// Initialize the queue
void initializeQueue(CircularQueue* queue) {
    queue->front = -1;
    queue->rear = -1;
}

// Check if the queue is full
int isFull(CircularQueue* queue) {
    return (queue->front == (queue->rear + 1) % SIZE);
}

// Check if the queue is empty
int isEmpty(CircularQueue* queue) {
    return (queue->front == -1);
}

// Enqueue an element into the circular queue
void enqueue(CircularQueue* queue, int item) {
    if (isFull(queue)) {
        printf("Queue is full\n");
        return;
    }
    if (isEmpty(queue)) {
        queue->front = 0;
    }
    queue->rear = (queue->rear + 1) % SIZE;
    queue->items[queue->rear] = item;
    printf("Enqueued: %d\n", item);
}

// Dequeue an element from the circular queue
int dequeue(CircularQueue* queue) {
    if (isEmpty(queue)) {
        printf("Queue is empty\n");
        return -1;
    }
    int item = queue->items[queue->front];
    if (queue->front == queue->rear) {
        // Queue is now empty
        queue->front = queue->rear = -1;
    } else {
        queue->front = (queue->front + 1) % SIZE;
    }
    printf("Dequeued: %d\n", item);
    return item;
}

// Get the front element of the circular queue
int front(CircularQueue* queue) {
    if (isEmpty(queue)) {
        printf("Queue is empty\n");
        return -1;
    }
    return queue->items[queue->front];
}

// Get the rear element of the circular queue
int rear(CircularQueue* queue) {
    if (isEmpty(queue)) {
        printf("Queue is empty\n");
        return -1;
    }
    return queue->items[queue->rear];
}

int main() {
    CircularQueue queue;
    initializeQueue(&queue);

    enqueue(&queue, 1);
    enqueue(&queue, 2);
    enqueue(&queue, 3);
    enqueue(&queue, 4);
    enqueue(&queue, 5);  // Queue is full here

    printf("Front element is: %d\n", front(&queue));
    printf("Rear element is: %d\n", rear(&queue));

    dequeue(&queue);
    dequeue(&queue);

    enqueue(&queue, 6);  // This will wrap around to the beginning
    enqueue(&queue, 7);  // This will also wrap around

    printf("Front element is: %d\n", front(&queue));
    printf("Rear element is: %d\n", rear(&queue));

    return 0;
}

Explanation of the Code

  1. Initialization: The queue is initialized with front and rear set to -1, indicating an empty queue.

  2. Enqueue: Elements are added to the queue. If the queue is full, the operation is aborted. The rear pointer wraps around using modulo operation (rear + 1) % SIZE.

  3. Dequeue: Elements are removed from the front. If the queue becomes empty after dequeueing, both front and rear are reset to -1.

  4. Front and Rear Access: Functions are provided to access the front and rear elements of the queue without removing them.
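
Running the main function above should produce output along these lines, with 6 and 7 wrapping into the slots freed by the two dequeues:

Enqueued: 1
Enqueued: 2
Enqueued: 3
Enqueued: 4
Enqueued: 5
Front element is: 1
Rear element is: 5
Dequeued: 1
Dequeued: 2
Enqueued: 6
Enqueued: 7
Front element is: 3
Rear element is: 7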

Introduction to Deques and Their Versatility

A Deque (pronounced as "deck") stands for Double-Ended Queue. It is a generalized version of a queue that allows insertion and deletion of elements at both the front and rear ends. This versatility makes deques particularly useful in various applications where both LIFO (Last In, First Out) and FIFO (First In, First Out) behaviors are required simultaneously.

Deques can be thought of as a hybrid structure between stacks and queues, combining the functionality of both into one data structure. Unlike simple queues, where you can only insert elements at the rear and remove them from the front, deques allow more flexible operations.

Operations on Deques

Deques support the following fundamental operations:

  1. insertFront(item):

    • Description: Inserts an element at the front of the deque.

    • Purpose: Allows the addition of an element at the beginning of the deque, similar to a stack's push operation but at the opposite end.

  2. insertRear(item):

    • Description: Inserts an element at the rear of the deque.

    • Purpose: This is equivalent to the enqueue operation in a standard queue, adding an element at the end.

  3. deleteFront():

    • Description: Removes and returns the element at the front of the deque.

    • Purpose: This operation removes the element that has been in the deque the longest, similar to the dequeue operation in a standard queue.

  4. deleteRear():

    • Description: Removes and returns the element at the rear of the deque.

    • Purpose: Removes the most recently added element from the rear, similar to a stack's pop operation.

  5. isEmpty():

    • Description: Checks if the deque is empty.

    • Purpose: Used to determine if there are any elements in the deque before performing deletion operations.

  6. isFull() (optional, for fixed-size deques):

    • Description: Checks if the deque is full.

    • Purpose: Relevant for deques implemented with a fixed-size array, this checks if there is room to insert more elements.

Below is a simple C implementation of a deque using a static array:

#include <stdio.h>
#include <stdlib.h>

#define SIZE 5  // Define the size of the deque

typedef struct Deque {
    int arr[SIZE];
    int front;
    int rear;
} Deque;

// Initialize the deque
void initializeDeque(Deque* deque) {
    deque->front = -1;
    deque->rear = 0;
}

// Check if the deque is full
int isFull(Deque* deque) {
    return ((deque->front == 0 && deque->rear == SIZE-1) || (deque->front == deque->rear + 1));
}

// Check if the deque is empty
int isEmpty(Deque* deque) {
    return (deque->front == -1);
}

// Insert an element at the front of the deque
void insertFront(Deque* deque, int item) {
    if (isFull(deque)) {
        printf("Deque is full\n");
        return;
    }

    if (deque->front == -1) {
        deque->front = 0;
        deque->rear = 0;
    } else if (deque->front == 0) {
        deque->front = SIZE - 1;
    } else {
        deque->front = deque->front - 1;
    }

    deque->arr[deque->front] = item;
    printf("Inserted %d at the front\n", item);
}

// Insert an element at the rear of the deque
void insertRear(Deque* deque, int item) {
    if (isFull(deque)) {
        printf("Deque is full\n");
        return;
    }

    if (deque->front == -1) {
        deque->front = 0;
        deque->rear = 0;
    } else if (deque->rear == SIZE - 1) {
        deque->rear = 0;
    } else {
        deque->rear = deque->rear + 1;
    }

    deque->arr[deque->rear] = item;
    printf("Inserted %d at the rear\n", item);
}

// Delete an element from the front of the deque
int deleteFront(Deque* deque) {
    if (isEmpty(deque)) {
        printf("Deque is empty\n");
        return -1;
    }

    int item = deque->arr[deque->front];
    if (deque->front == deque->rear) {
        deque->front = -1;
        deque->rear = -1;
    } else if (deque->front == SIZE - 1) {
        deque->front = 0;
    } else {
        deque->front = deque->front + 1;
    }

    printf("Deleted %d from the front\n", item);
    return item;
}

// Delete an element from the rear of the deque
int deleteRear(Deque* deque) {
    if (isEmpty(deque)) {
        printf("Deque is empty\n");
        return -1;
    }

    int item = deque->arr[deque->rear];
    if (deque->front == deque->rear) {
        deque->front = -1;
        deque->rear = -1;
    } else if (deque->rear == 0) {
        deque->rear = SIZE - 1;
    } else {
        deque->rear = deque->rear - 1;
    }

    printf("Deleted %d from the rear\n", item);
    return item;
}

int main() {
    Deque deque;
    initializeDeque(&deque);

    insertRear(&deque, 5);
    insertRear(&deque, 10);
    insertFront(&deque, 15);
    insertFront(&deque, 20);
    insertRear(&deque, 25);  // Deque is now full

    deleteFront(&deque);
    deleteRear(&deque);

    insertRear(&deque, 30);  // Inserting after deletion

    return 0;
}
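
If the demo above runs as written, it should print output along these lines; note how 15 and 20 wrap to the high indices when inserted at the front, and how 30 reuses the slot freed by deleting 25:

Inserted 5 at the rear
Inserted 10 at the rear
Inserted 15 at the front
Inserted 20 at the front
Inserted 25 at the rear
Deleted 20 from the front
Deleted 25 from the rear
Inserted 30 at the rear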

Applications and Use Cases for Deques

  1. Palindrome Checking:

    • Deques can be used to check if a word or phrase is a palindrome (reads the same forward and backward). Characters are added to the deque, and then characters from the front and rear are compared to check for symmetry. A sketch of this is given after this list.
  2. Sliding Window Problems:

    • Deques are ideal for solving sliding window problems, such as finding the maximum or minimum in a subarray of fixed size. The deque efficiently keeps track of useful elements within the current window, allowing for quick access to the maximum or minimum.
  3. Job Scheduling:

    • Deques can be used in scheduling tasks where jobs can be added or removed from either end of the queue, allowing for more flexible job management.
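
As a concrete illustration of the palindrome use case, here is a minimal sketch built on the array-based deque above. It assumes SIZE has been raised to at least the length of the input string, and isPalindrome is a hypothetical helper; characters are simply stored in the deque's int slots:

#include <string.h>

int isPalindrome(const char* str) {
    Deque deque;
    initializeDeque(&deque);

    // Load every character into the deque, left to right.
    for (size_t i = 0; i < strlen(str); i++) {
        insertRear(&deque, (int)str[i]);
    }

    // Compare symmetric characters from both ends until one (or zero) remains.
    while (!isEmpty(&deque) && deque.front != deque.rear) {
        int first = deleteFront(&deque);
        int last  = deleteRear(&deque);
        if (first != last) {
            return 0;  // Mismatch: not a palindrome
        }
    }
    return 1;  // Every pair matched
}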

Comparing Linear, Circular, and Double-Ended Queues

  1. Linear Queue:

    • Use Case: Simple FIFO operations where elements are only added at the rear and removed from the front.

    • Performance: Linear queues are easy to implement but can suffer from space inefficiency as elements are dequeued, leaving unused space at the front.

    • Memory Usage: Fixed-size and static, so memory can be wasted if the queue is not used efficiently.

  2. Circular Queue:

    • Use Case: Efficiently utilizes space by allowing the rear to wrap around to the front, overcoming the limitations of linear queues.

    • Performance: Slightly more complex than linear queues but ensures no space is wasted.

    • Memory Usage: More efficient use of memory compared to linear queues since it reuses space.

  3. Double-Ended Queue (Deque):

    • Use Case: Versatile operations where elements need to be added or removed from both ends. Useful in applications like palindrome checking and sliding window problems.

    • Performance: More flexible but slightly more complex to implement and manage than simple queues.

    • Memory Usage: Deques can be backed by a fixed-size circular buffer (as in the implementation above) or by a doubly linked list; the linked variant grows and shrinks dynamically, so memory is used efficiently and overflow only occurs when system memory is exhausted.

Scenarios Where One Type of Queue is Preferable Over Others

  • Linear Queue: Preferable in simple scenarios where the order of processing is strictly FIFO, and memory constraints are not an issue. Example: simple job scheduling in a controlled environment.

  • Circular Queue: Ideal for scenarios where space efficiency is crucial, and you want to avoid the drawbacks of unused space in linear queues. Example: buffering data streams in hardware where memory is limited.

  • Deque: Best suited for applications requiring maximum flexibility in how elements are added and removed. Example: maintaining a list of recently accessed pages in a web browser (both front and rear operations are needed).
