The Queue module is Python 2 specific; in Python 3 it was renamed to the lowercase queue module.

Importing the Queue module in Python 3 will result in an ImportError.

Answer from Abhishek on Stack Overflow
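Because of that rename, code meant to run on both versions often wraps the import in a try/except. A minimal sketch:

```python
try:
    import queue  # Python 3: lowercase module name
except ImportError:
    import Queue as queue  # Python 2: the same module, capitalized

q = queue.Queue()
q.put('task')
item = q.get()
print(item)  # task
```

On Python 3 the first import succeeds; on Python 2 the fallback binds the old module under the new name, so the rest of the code is version-agnostic.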
Python
docs.python.org › 3 › library › queue.html
queue — A synchronized queue class
February 23, 2026 - The module implements three types of queue, which differ only in the order in which the entries are retrieved. In a FIFO queue, the first tasks added are the first retrieved. In a LIFO queue, the most recently added entry is the first retrieved ...
Real Python
realpython.com › queue-in-python
Python Stacks, Queues, and Priority Queues in Practice – Real Python
December 1, 2023 - In this tutorial, you'll take a deep dive into the theory and practice of queues in programming. Along the way, you'll get to know the different types of queues, implement them, and then learn about the higher-level queues in Python's standard library. Be prepared to do a lot of coding.
Discussions

import - python queue vs. Queue? - Stack Overflow
A lot of posts I've seen use "import queue". But others use "import Queue". What is the difference between these two python imports? Is one preferable to the other?
stackoverflow.com
SimpleQueue vs Queue in Python - what is the advantage of using SimpleQueue? - Stack Overflow
The queue — A synchronized queue class simply states that there are fewer functions allowed with SimpleQueue. I need very basic queue functionality for a multithreading application, would it help...
stackoverflow.com
python - queue.Queue vs. collections.deque - Stack Overflow
I need a queue which multiple threads can put stuff into, and multiple threads may read from. Python has at least two queue classes, queue.Queue and collections.deque, with the former seemingly using...
stackoverflow.com
python queue & multiprocessing queue: how they behave? - Stack Overflow
Items placed in the Queue.Queue ... threads inside the same process (using the threading module). The multiprocessing queues are for data interchange between different Python processes...
stackoverflow.com
Great Learning
mygreatlearning.com › blog › it/software development › python queue
Python Queue
October 14, 2024 - The first person who entered the line will get the tickets first and after that, he will be removed from the line, followed by the next person in the lane and so on. A similar logic will be applied in Python Queue too for programs and processes to be executed.
GeeksforGeeks
geeksforgeeks.org › queue-in-python
Queue in Python - GeeksforGeeks
July 9, 2024 - queue is a built-in module of Python which is used to implement a queue. queue.Queue(maxsize) initializes a queue with a maximum size of maxsize. A maxsize of zero '0' means an infinite queue. This Queue follows the FIFO rule.
Python
docs.python.org › 3 › library › asyncio-queue.html
Queues — Python 3.14.4 documentation
February 22, 2026 - Source code: Lib/asyncio/queues.py asyncio queues are designed to be similar to classes of the queue module. Although asyncio queues are not thread-safe, they are designed to be used specifically i...
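For context, the asyncio counterpart can be sketched like this (the coroutine names are illustrative). asyncio.Queue offers the same put/get/task_done/join shape as queue.Queue, but its methods are awaited inside one event loop rather than blocking threads:

```python
import asyncio

async def worker(q, results):
    # Consume items until cancelled; `await q.get()` suspends this
    # coroutine instead of blocking the thread.
    while True:
        item = await q.get()
        results.append(item * 2)
        q.task_done()

async def main():
    q = asyncio.Queue()
    results = []
    consumer = asyncio.create_task(worker(q, results))
    for i in range(3):
        await q.put(i)
    await q.join()      # resumes once task_done() was called for every item
    consumer.cancel()
    return results

results = asyncio.run(main())
print(results)  # [0, 2, 4]
```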
BitDegree
bitdegree.org › learn › python-queue
The Four Types of Python Queue: Definitions and Examples
February 19, 2020 - TL;DR – A Python queue is a linear data structure, similar to a stack.
Top answer
1 of 7
429

queue.Queue and collections.deque serve different purposes. queue.Queue is intended for allowing different threads to communicate using queued messages/data, whereas collections.deque is simply intended as a data structure. That's why queue.Queue has methods like put_nowait(), get_nowait(), and join(), whereas collections.deque doesn't. queue.Queue isn't intended to be used as a collection, which is why it lacks the likes of the in operator.

It boils down to this: if you have multiple threads and you want them to be able to communicate without the need for locks, you're looking for queue.Queue; if you just want a queue or a double-ended queue as a data structure, use collections.deque.

Finally, accessing and manipulating the internal deque of a queue.Queue is playing with fire - you really don't want to be doing that.
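The split described above can be sketched as follows (the worker and the doubling step are illustrative):

```python
import queue
import threading
from collections import deque

# collections.deque as a plain data structure: supports `in`, indexing,
# and fast appends/pops at either end.
d = deque([1, 2, 3])
d.appendleft(0)
popped = d.pop()        # 3
contains = 0 in d       # True; queue.Queue has no `in` operator

# queue.Queue for inter-thread communication: get() blocks until an item
# arrives, and join()/task_done() coordinate completion.
q = queue.Queue()
out = queue.Queue()

def worker():
    item = q.get()      # blocks until the main thread put()s something
    out.put(item * 2)
    q.task_done()

threading.Thread(target=worker).start()
q.put(21)
q.join()                # waits until task_done() was called for every item
result = out.get()      # 42
```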

2 of 7
64

If all you're looking for is a thread-safe way to transfer objects between threads, then both would work (both for FIFO and LIFO). For FIFO:

  • Queue.put() and Queue.get() are thread-safe
  • Deques support thread-safe, memory efficient appends and pops from either side of the deque with approximately the same O(1) performance in either direction.

Note:

  • Other operations on deque might not be thread safe, I'm not sure.
  • deque does not block on pop() or popleft() so you can't base your consumer thread flow on blocking till a new item arrives.

However, it seems that deque has a significant efficiency advantage. Here are some benchmark results in seconds, using CPython 2.7.3, for inserting and removing 100k items:

deque 0.0747888759791
Queue 1.60079066852

Here's the benchmark code (Python 2: note the capitalized Queue module, xrange, and the print statement):

import time
import Queue
import collections

q = collections.deque()
t0 = time.clock()
for i in xrange(100000):
    q.append(1)
for i in xrange(100000):
    q.popleft()
print 'deque', time.clock() - t0

q = Queue.Queue(200000)
t0 = time.clock()
for i in xrange(100000):
    q.put(1)
for i in xrange(100000):
    q.get()
print 'Queue', time.clock() - t0
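For reference, the same benchmark can be sketched in Python 3 (queue replaces the Queue module, range replaces xrange, and time.perf_counter replaces the removed time.clock). Absolute numbers will vary by machine, but deque should still win by a wide margin:

```python
import time
import queue
import collections

N = 100_000

# Benchmark collections.deque: plain appends and pops, no locking.
d = collections.deque()
t0 = time.perf_counter()
for i in range(N):
    d.append(1)
for i in range(N):
    d.popleft()
deque_time = time.perf_counter() - t0
print('deque', deque_time)

# Benchmark queue.Queue: every put()/get() acquires an internal lock.
q = queue.Queue(2 * N)
t0 = time.perf_counter()
for i in range(N):
    q.put(1)
for i in range(N):
    q.get()
queue_time = time.perf_counter() - t0
print('Queue', queue_time)
```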
Stack Abuse
stackabuse.com › guide-to-queues-in-python
Guide to Queues in Python
April 18, 2024 - The first person in line gets served first, and new arrivals join at the end. This is a real-life example of a queue in action! For developers, especially in Python, queues aren't just theoretical constructs from a computer science textbook. They form the underlying architecture in many applications.
dbader.org
dbader.org › blog › queues-in-python
Queues in Python – dbader.org
May 2, 2017 - A regular queue, however, won't re-order the items it carries. You get what you put in, and in exactly that order (remember the pipe example?) Python ships with several queue implementations that each have slightly different characteristics.
W3Schools
w3schools.com › python › python_dsa_queues.asp
Queues with Python
A queue is a linear data structure that follows the First-In-First-Out (FIFO) principle...
Top answer
1 of 2
78

For your second example, you already gave the explanation yourself: Queue is a module, which cannot be called.

For the third example: I assume that you use Queue.Queue together with multiprocessing. A Queue.Queue will not be shared between processes. If the Queue.Queue is declared before the processes then each process will receive a copy of it which is then independent of every other process. Items placed in the Queue.Queue by the parent before starting the children will be available to each child. Items placed in the Queue.Queue by the parent after starting the child will only be available to the parent. Queue.Queue is made for data interchange between different threads inside the same process (using the threading module). The multiprocessing queues are for data interchange between different Python processes. While the API looks similar (it's designed to be that way), the underlying mechanisms are fundamentally different.

  • multiprocessing queues exchange data by pickling (serializing) objects and sending them through pipes.
  • Queue.Queue uses a data structure that is shared between threads and locks/mutexes for correct behaviour.
2 of 2
12

Queue.Queue

  • Was created to work in concurrent environments spawned with the threading module.

  • Each thread shares a reference to the same Queue.Queue object. No copying or serialization of data happens here, and all the threads have access to the same data inside the queue.

multiprocessing.Queue

  • Was created to work in parallel environments spawned with the multiprocessing module.

  • Each process gets access to a copy of the multiprocessing.Queue object. The contents of the queue are copied across the processes via pickle serialization.

Top answer
1 of 2
15

JoinableQueue has the methods join() and task_done(), which Queue doesn't have.


class multiprocessing.Queue([maxsize])

Returns a process shared queue implemented using a pipe and a few locks/semaphores. When a process first puts an item on the queue a feeder thread is started which transfers objects from a buffer into the pipe.

The usual Queue.Empty and Queue.Full exceptions from the standard library's Queue module are raised to signal timeouts.

Queue implements all the methods of Queue.Queue except for task_done() and join().


class multiprocessing.JoinableQueue([maxsize])

JoinableQueue, a Queue subclass, is a queue which additionally has task_done() and join() methods.

task_done()

Indicate that a formerly enqueued task is complete. Used by queue consumer threads. For each get() used to fetch a task, a subsequent call to task_done() tells the queue that the processing on the task is complete.

If a join() is currently blocking, it will resume when all items have been processed (meaning that a task_done() call was received for every item that had been put() into the queue).

Raises a ValueError if called more times than there were items placed in the queue.

join()

Block until all items in the queue have been gotten and processed.

The count of unfinished tasks goes up whenever an item is added to the queue. The count goes down whenever a consumer thread calls task_done() to indicate that the item was retrieved and all work on it is complete. When the count of unfinished tasks drops to zero, join() unblocks.


If you use JoinableQueue then you must call JoinableQueue.task_done() for each task removed from the queue or else the semaphore used to count the number of unfinished tasks may eventually overflow, raising an exception.

2 of 2
6

Based on the documentation, it's hard to be sure that Queue is actually empty. With JoinableQueue you can wait for the queue to empty by calling q.join(). In cases where you want to complete work in distinct batches where you do something discrete at the end of each batch, this could be helpful.

For example, perhaps you process 1000 items at a time through the queue, then send a push notification to a user that you've completed another batch. This would be challenging to implement with a normal Queue.

It might look something like:

import multiprocessing as mp

BATCH_SIZE = 1000
STOP_VALUE = 'STOP'

def consume(q):
  for item in iter(q.get, STOP_VALUE):
    try:
      process(item)
    # Be very defensive about errors since they can corrupt pipes.
    except Exception as e:
      logger.error(e)
    finally:
      q.task_done()

q = mp.JoinableQueue()
with mp.Pool() as pool:
  # Pull items off queue as fast as we can whenever they're ready.
  for _ in range(mp.cpu_count()):
    pool.apply_async(consume, (q,))  # args must be a tuple
  for i in range(0, len(URLS), BATCH_SIZE):
    # Put `BATCH_SIZE` items in queue asynchronously. map_async's callback
    # receives the whole result list, so unpack it into individual items.
    pool.map_async(expensive_func, URLS[i:i+BATCH_SIZE],
                   callback=lambda results: [q.put(r) for r in results])
    # Wait for the queue to empty.
    q.join()
    notify_users()
  # Stop the consumers so we can exit cleanly.
  for _ in range(mp.cpu_count()):
    q.put(STOP_VALUE)

NB: I haven't actually run this code. If you pull items off the queue faster than you put them on, you might finish early. In that case this code sends an update AT LEAST every 1000 items, and maybe more often. For progress updates, that's probably ok. If it's important to be exactly 1000, you could use an mp.Value('i', 0) and check that it's 1000 whenever your join releases.

Reddit
reddit.com › r/learnpython › beginner question on queue and multithreading
r/learnpython on Reddit: Beginner Question on Queue and Multithreading
October 22, 2024 -

I'm starting to dabble with Queue and Multithreading. Conceptually, I get when and where you would want to introduce these into your code. However, I just want to make sure I'm understanding the use case relationship between the two. Most online resources seem to treat the two as conjoined, but this seems to be under the assumption they will be used at larger scales.

My question is: is Queue paired with multithreading just so you can control the number of threads at any given moment without the program ending due to a lack of open threads? So, if I won't need more than 10 threads at a time then incorporating Queue might not be necessary. However, if I plan to have over 100 threads concurrently (assuming this is hitting some kind of processing limitation) then Queue should be incorporated?

Thanks!

Top answer
1 of 3
3
You are way off. It has nothing to do with the number of threads. The first thing you need to understand is a "race condition". Imagine you have a list of tasks that you would like your threads to accomplish. Each thread will run:

while to_do_list:
    task = to_do_list[0]
    del to_do_list[0]  # remove this task from the list
    do_work(task)

When you have several threads working at the same time, it's possible that two threads read the same task from the list at the same time. So basically after threadA reads the task but before threadA has a chance to delete it from the to-do list, threadB has also read the task. In our example the first task in the list is done twice and the second task is skipped. But it could lead to much more serious data corruption. A Queue is simply a list that's protected against this condition. A thread can get and delete from the Queue without worry that another thread interferes. So we use Queues to pass data in and out of threads. But you should note we don't always need to use a Queue in Python, because many of Python's basic operations have the same thread-safe protections built in (thanks to the GIL). An advanced user may use list.pop to do the same thing.
2 of 3
3
Ultimately, the two aren't intrinsically linked: you could use multithreading without queues, and even use a queue without multithreading. But they are commonly used together because the queue serves a useful purpose in coordinating work. Suppose you've got a list of 1000 items to process, and you want to use multithreading to do it in parallel. One option would be to create 1000 threads, give an item to each thread and let them process it. However, this is inefficient: those threads are consuming resources (e.g. memory), and the contention of them all trying to do the same thing at once will likely slow them down. And if we have a million items, it gets even more infeasible. Ultimately, there's no reason why the optimal number of threads should depend on how many items you want to process. Generally, you'll want the same number of threads when processing 100 items as 1 million items, so we usually want to decouple the threads doing the work from the items being processed. One option would be to create, say, 10 threads and divide the work into batches of 100 items, and have each thread process a batch. This works much better, but can still have some downsides: if there's a lot of variance between items, one thread might churn through its batch faster than others, while another might take a long time. So at the end, you may have 9 threads sitting idle waiting for the last thread to finish, when they could potentially finish faster if they could take some of the remaining work. So another option is to have the threads request an item, process it, then request another item and so on, until there are no more items. And this is where Queue comes in. We put all our work items on the queue, then each thread sits in a loop pulling items off the queue and processing them, until the queue is empty. This spreads the work out better.
Queue is written in such a way that retrieving and putting an item is thread-safe, meaning it's protected against simultaneous access potentially corrupting the state (e.g. the same item being given to two different threads, or items being lost when two threads try to add an item at the same time): it guarantees such data races can't happen when using it. And there may be more complex scenarios, such as writing the result onto another queue, which might even have threads of its own pulling from it. Ultimately, the queue is a way of communicating information between the threads in a safe way, usually in a producer/consumer oriented system.
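The producer/consumer pattern described above can be sketched with queue.Queue and a sentinel value (the names and the doubling "work" are illustrative):

```python
import queue
import threading

NUM_WORKERS = 4
SENTINEL = None

def worker(q, results):
    # Each worker loops: pull an item, process it, repeat until the sentinel.
    while True:
        item = q.get()
        if item is SENTINEL:
            break
        results.append(item * 2)   # stand-in for real work

q = queue.Queue()
results = []
threads = [threading.Thread(target=worker, args=(q, results))
           for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for i in range(100):
    q.put(i)
for _ in threads:
    q.put(SENTINEL)   # one sentinel per worker so every loop terminates
for t in threads:
    t.join()
print(len(results))   # 100
```

The thread count stays fixed at NUM_WORKERS no matter how many items go through the queue, which is exactly the decoupling the answer describes.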
Simplilearn
simplilearn.com › home › resources › software development › queue in python: working with queue data structure in python
Queue in Python: Working With Queue Data Structure in Python
March 5, 2026 - A queue is a built-in module of python used in threaded programming. It stores items sequentially in a FIFO manner. Learn all about the queue in python now!
Career Karma
careerkarma.com › blog › python › python queue and deque: a step-by-step guide
Python Queue and Deque: A Step-By-Step Guide | Career Karma
December 1, 2023 - There is a built-in library in Python designed to help you with these types of problems: queues. Queues are similar to stacks in Python, with the main difference being that with a queue, you remove the item least recently added.
Intellipaat
intellipaat.com › home › blog › queue in python – implementation explained
Queue in Python: How to Implement Queue in Python
October 14, 2025 - Now, if you want to know why Python is the preferred language for data science, you can go through this blog on Python Data Science tutorial. Queue is a collection of similar items arranged in a linear order.