Exploring Time and Space Complexities of Shell Sort

This article is a complementary resource to the DSA with Python and DSA with C++ courses.


Shell sort sorts elements by repeatedly dividing the array into subarrays defined by an interval and performing insertion sort on these subarrays.

The interval size reduces after each iteration until it becomes 1, at which point it performs a final insertion sort.

Shell Sort Implementation in Python

def shell_sort(arr):
    n = len(arr)
    interval = n // 2  # Initialize the interval size
    
    while interval > 0:
        for i in range(interval, n):
            temp = arr[i]
            j = i
            while j >= interval and arr[j - interval] > temp:
                arr[j] = arr[j - interval]
                j -= interval
            arr[j] = temp
        interval //= 2  # Reduce the interval size
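To make the interval passes visible, here is a small sketch that reuses the same logic as the listing above but records the array after each pass (the name shell_sort_trace is my own, not part of the original listing):

```python
def shell_sort_trace(arr):
    """Shell sort that records the array state after every interval pass."""
    passes = []
    n = len(arr)
    interval = n // 2
    while interval > 0:
        for i in range(interval, n):
            temp = arr[i]
            j = i
            # Shift larger elements of the interval-subarray to the right
            while j >= interval and arr[j - interval] > temp:
                arr[j] = arr[j - interval]
                j -= interval
            arr[j] = temp
        passes.append((interval, arr.copy()))  # snapshot after this pass
        interval //= 2
    return passes

for gap, state in shell_sort_trace([9, 8, 3, 7, 5, 6, 4, 1]):
    print(f"interval={gap}: {state}")
```

Each pass leaves the array "more sorted" than before, so the final interval-1 pass (plain insertion sort) has very little work left to do.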
Shell Sort Implementation in C++

#include <vector>

template <typename T>
void insertion_sort(std::vector<T>& vec, int interval) {
    int n = vec.size();

    // Perform interval-based insertion sort
    for (int i = interval; i < n; i++) {
        T temp = vec[i];
        int j = i;

        // Shift elements of the sorted segment to find the correct position for temp
        while (j >= interval && vec[j - interval] > temp) {
            vec[j] = vec[j - interval];
            j -= interval;
        }

        // Place the element at its correct position
        vec[j] = temp;
    }
}

template <typename T>
void shell_sort(std::vector<T>& vec) {
    int interval = vec.size() / 2;

    // Reduce the interval and perform insertion sort for each interval
    while (interval > 0) {
        insertion_sort(vec, interval);
        interval /= 2;  // Halve the interval
    }
}

The time complexity of the shell sort algorithm mainly depends on the interval sequence used.

Some commonly used interval sequences for the Shell sort algorithm are:

  • Shell's Original Sequence: N/2, N/4, ..., 1
  • Knuth's Increments: 1, 4, 13, ..., <katex>(3^{k} - 1)/2</katex>
  • Sedgewick's Increments: 1, 8, 23, 77, 281, 1073, 4193, 16577, ...
  • Hibbard's Increments: 1, 3, 7, 15, 31, 63, 127, 255, 511, ...
  • Papernov & Stasevich Increments: 1, 3, 5, 9, 17, 33, 65, ...
  • Pratt Sequence: 1, 2, 3, 4, 6, 8, 9, 12, 16, 18, 24, 27, ... (every number of the form <katex>2^{p} 3^{q}</katex>)
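As a quick sketch, the Knuth and Hibbard increments below a given array size can be generated directly from their closed forms (the function names here are my own, chosen for illustration):

```python
def knuth_gaps(n):
    """Knuth increments (3^k - 1) / 2 smaller than n, largest first."""
    gaps, k = [], 1
    while (3**k - 1) // 2 < n:
        gaps.append((3**k - 1) // 2)
        k += 1
    return gaps[::-1]

def hibbard_gaps(n):
    """Hibbard increments 2^k - 1 smaller than n, largest first."""
    gaps, k = [], 1
    while 2**k - 1 < n:
        gaps.append(2**k - 1)
        k += 1
    return gaps[::-1]

print(knuth_gaps(100))    # [40, 13, 4, 1]
print(hibbard_gaps(100))  # [63, 31, 15, 7, 3, 1]
```

The gaps are returned largest-first because Shell sort starts with the widest interval and finishes with interval 1.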

Best Case Time Complexity: O(n log n)

For an optimal sequence, such as the Hibbard or Sedgewick sequence, shell sort can achieve a best-case time complexity of O(n log n): on an already (or nearly) sorted array, each of the O(log n) passes scans its subarrays without shifting elements.

Worst Case Time Complexity: <katex>O(n^{2})</katex>

For certain interval sequences, like Shell's original halving sequence, the algorithm performs poorly in the worst case, with a time complexity of <katex>O(n^{2})</katex>.

Average Case Time Complexity: <katex>O(n^{1.5})</katex> to O(n log n)

Common sequences like Knuth's yield complexities around <katex>O(n^{1.5})</katex>.

In practice, efficient interval sequences such as the Knuth, Hibbard, Pratt, or Sedgewick sequences are used. Hence, the average time complexity ranges from <katex>O(n^{1.5})</katex> to O(n log n).
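The effect of the interval sequence on the amount of work done can be checked empirically. The sketch below (shell_sort_count and the other names are my own) counts element comparisons for Shell's original halving sequence versus plain insertion sort, which is just Shell sort with a gap list of [1], on the same random input:

```python
import random

def shell_sort_count(arr, gaps):
    """Shell sort with an explicit gap list; returns the comparison count."""
    comparisons = 0
    n = len(arr)
    for gap in gaps:
        for i in range(gap, n):
            temp = arr[i]
            j = i
            while j >= gap:
                comparisons += 1          # one element comparison per test
                if arr[j - gap] <= temp:
                    break
                arr[j] = arr[j - gap]     # shift the larger element right
                j -= gap
            arr[j] = temp
    return comparisons

random.seed(42)
data = random.sample(range(10000), 1000)

# Shell's original sequence: n/2, n/4, ..., 1
halving = []
gap = len(data) // 2
while gap > 0:
    halving.append(gap)
    gap //= 2

shell_cmp = shell_sort_count(data.copy(), halving)
insertion_cmp = shell_sort_count(data.copy(), [1])
print(shell_cmp, insertion_cmp)  # shell sort performs far fewer comparisons
```

Even this simplest gap sequence cuts the comparison count dramatically relative to insertion sort's roughly quadratic behavior on random input.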

Number of Operations vs. Input Size

Space Complexity of Shell Sort: O(1)

Shell sort is an in-place algorithm requiring no additional memory beyond a few variables for interval tracking and swapping. Hence, the space complexity is O(1).


Summary: Time and Space Complexity

  • Best Case Time Complexity: O(n log n)
  • Worst Case Time Complexity: <katex>O(n^{2})</katex>
  • Average Case Time Complexity: <katex>O(n^{1.5})</katex> to O(n log n)
  • Space Complexity: O(1)

When to Use Shell Sort?

When to Use:

  1. Moderately Large Datasets: It's faster than simple algorithms like Insertion Sort and Bubble Sort but less efficient than Quick Sort or Merge Sort for large datasets.
  2. Memory-Constrained Environments: Its in-place sorting mechanism makes it suitable for scenarios where memory usage is critical.

When Not to Use:

  1. Strict Stability Requirements: Shell sort is not a stable sorting algorithm, i.e., the relative order of equal elements may not be preserved. So, it shouldn't be used when you require stability.
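The instability is easy to observe directly: a wide interval can move an element past another element with an equal key. The sketch below (names are my own) sorts (key, tag) records by key only and shows two equal keys ending up in a different relative order than they started in:

```python
def shell_sort_by_key(arr):
    """Shell sort of (key, tag) records, comparing by key only."""
    n = len(arr)
    interval = n // 2
    while interval > 0:
        for i in range(interval, n):
            temp = arr[i]
            j = i
            while j >= interval and arr[j - interval][0] > temp[0]:
                arr[j] = arr[j - interval]
                j -= interval
            arr[j] = temp
        interval //= 2

records = [(1, 'a'), (1, 'b'), (1, 'c'), (0, 'd')]
shell_sort_by_key(records)
print(records)  # [(0, 'd'), (1, 'a'), (1, 'c'), (1, 'b')]
```

The interval-2 pass jumps (1, 'b') over (1, 'c') while making room for (0, 'd'), so the original order of the equal keys 'b' and 'c' is lost, which is exactly what "not stable" means.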