See 8.10. Queue — A synchronized queue class

The Queue module implements multi-producer, multi-consumer queues. It is especially useful in threaded programming when information must be exchanged safely between multiple threads.

Answer from Pavel Anossov on Stack Overflow
Python
docs.python.org › 3 › library › queue.html
queue — A synchronized queue class
February 23, 2026 - Source code: Lib/queue.py The queue module implements multi-producer, multi-consumer queues. It is especially useful in threaded programming when information must be exchanged safely between multip...
Discussions

algorithms - How are queues made thread-safe in Python - Software Engineering Stack Exchange
The question was simple, how does Python make Queue thread-safe? My answer was, because of Interpreter Lock (GIL), any time only 1 thread is making a call to get the element from queue while others are sleeping/waiting. More on softwareengineering.stackexchange.com
softwareengineering.stackexchange.com
February 2, 2018
python - Is `queue.Queue` thread-safe, when accessed by several subprocesses using `concurrent.futures.ProcessPoolExecutor`? - Stack Overflow
I've read from blogs that queue.Queue should be thread-safe, but does that mean it's thread-safe under the assumption that the Python interpreter only executes one thread at a time (GIL), or is it also thread-safe in situations using multiprocessing, which side-steps the GIL by using subprocesses ... More on stackoverflow.com
stackoverflow.com
Is it a bad idea to access a Queue from two threads? (c#)
C# has a thread-safe queue class More on reddit.com
r/learnprogramming
July 14, 2022
Coderz Column
coderzcolumn.com › tutorials › python › queue-thread-safe-synchronized-queues-in-python
queue - Thread-Safe Synchronized Queues in Python by Sunny Solanki
We don't need to worry about handling concurrent threads from accessing and modifying queue data structure as it'll be handled by the module itself internally. If you want to use queue data structure in a multithreading environment then it’s ...
Medium
jonbleiberg.medium.com › how-python-keeps-your-queues-thread-safe-4b66a2f3e692
How Python Keeps Your Queues Thread-Safe | by Jon Bleiberg | Medium
October 10, 2023 - Of course, thread safety isn’t only an issue when working with queues. Once you move outside of a single threaded context, all sorts of thorny issues can arise. Python’s standard implementation (CPython) is no exception. Plenty of the code that underlies Python (in particular, critical memory management and garbage collection routines) is not naturally thread safe.
Top answer

You are mistaken that the GIL would make a Python program threadsafe. It only makes the interpreter itself threadsafe.

For example, let's look at a super simple LIFO queue (a.k.a. a stack). We'll ignore that a list can already be used as a stack.

class Stack(object):
    def __init__(self, capacity):
        self.size = 0
        self.storage = [None] * capacity

    def push(self, value):
        # Two separate statements: a thread switch can occur between them.
        self.storage[self.size] = value
        self.size += 1

    def pop(self):
        # Three separate statements, likewise interruptible.
        self.size -= 1
        result = self.storage[self.size]
        self.storage[self.size] = None
        return result

Is this threadsafe? Absolutely not, despite running under the GIL.

Consider this sequence of events:

  • Thread 1 adds a couple of values

    stack = Stack(5)
    stack.push(1)
    stack.push(2)
    stack.push(3)
    

    The state is now storage=[1, 2, 3, None, None], size=3.

  • Thread 1 adds a value stack.push(4) and is suspended before the size can be incremented

    self.storage[self.size] = value
    # interrupted here
    self.size += 1
    

    The state is now storage=[1, 2, 3, 4, None], size=3.

  • Thread 2 removes a value stack.pop() which is 3.

    The state is now storage=[1, 2, None, 4, None], size=2.

  • Thread 1 is resumed

    self.storage[self.size] = value
    # resume here
    self.size += 1
    

    The state is now storage=[1, 2, None, 4, None], size=3.

As a result, the stack is corrupted: the pushed value can't be retrieved, and the top element is empty.
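The interleaving above can be replayed deterministically, without real threads, by performing the two halves of push(4) by hand around a pop(). This is my own illustration, not part of the original answer:

```python
# Replay the answer's interleaving step by step, single-threaded,
# to show exactly how the state gets corrupted.

class Stack(object):
    def __init__(self, capacity):
        self.size = 0
        self.storage = [None] * capacity

    def push(self, value):
        self.storage[self.size] = value
        self.size += 1

    def pop(self):
        self.size -= 1
        result = self.storage[self.size]
        self.storage[self.size] = None
        return result

stack = Stack(5)
for v in (1, 2, 3):
    stack.push(v)

# Thread 1 starts push(4) but is "suspended" after the first statement:
stack.storage[stack.size] = 4   # first half of push(4)
# Thread 2 runs a complete pop() in the meantime:
popped = stack.pop()            # returns 3
# Thread 1 resumes and finishes push(4):
stack.size += 1                 # second half of push(4)

print(popped)         # 3
print(stack.storage)  # [1, 2, None, 4, None]
print(stack.size)     # 3 -- the top slot is None and the 4 is unreachable
```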

The GIL only linearises individual data accesses, but this is almost completely useless to the ordinary Python developer because the order of operations between threads is still unpredictable. I.e. the GIL cannot be used as a Python-level lock; it just guarantees that the values of all variables are up to date (comparable to volatile in C or Java). Python implementations without a GIL must also provide this property for compatibility, e.g. by using volatile memory accesses or their own locks. Jython is a GIL-less implementation that specifically uses threadsafe implementations for dict, list, and so on.
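One way to see why the GIL can't serve as a Python-level lock is to disassemble a compound operation like the size increment in push: a single source line compiles to several bytecode instructions, and a thread switch can occur between any two of them. A sketch of mine, not from the answer:

```python
import dis

def increment(obj):
    obj.size += 1

# The single line expands into separate LOAD / add / STORE_ATTR
# instructions; the GIL makes each instruction atomic, not the group.
instructions = [i.opname for i in dis.get_instructions(increment)]
print(instructions)
```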

Because Python does not guarantee any order of operations between threads, it comes as no surprise that thread-safe data structures must use a lock. For example, the standard library queue.Queue class @v3.6.4 has a mutex member and a few condition variables using that mutex. All data accesses are properly guarded. But note that this class isn't primarily intended as a queue data structure, but as a job queue between multiple threads. A pure data structure would not usually be concerned with locking.
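Following the answer's point that thread-safe data structures must use a lock, here is a minimal sketch, my own and not from the post, of the toy Stack guarded by a threading.Lock:

```python
import threading

class LockedStack(object):
    """The toy Stack from above, with a mutex guarding every access."""

    def __init__(self, capacity):
        self._lock = threading.Lock()
        self.size = 0
        self.storage = [None] * capacity

    def push(self, value):
        # The write and the size increment happen atomically with
        # respect to other push/pop calls.
        with self._lock:
            self.storage[self.size] = value
            self.size += 1

    def pop(self):
        with self._lock:
            self.size -= 1
            result = self.storage[self.size]
            self.storage[self.size] = None
            return result

# Hammer it from several threads; with the lock, no pushes are lost.
stack = LockedStack(400)
threads = [
    threading.Thread(target=lambda: [stack.push(1) for _ in range(100)])
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(stack.size)  # 400
```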

Of course, locks and mutexes stink for various reasons, e.g. because of the possibility of deadlocks, and because acquiring a lock is slow. As a consequence, there's a lot of interest in lock-free data structures. When the hardware provides certain atomic instructions, it is possible to update a data structure with such an atomic operation, e.g. by atomically replacing a pointer. But this tends to be rather difficult to get right.

Python Module of the Week
pymotw.com › 2 › Queue
Queue – A thread-safe FIFO implementation - Python Module of the Week
... The Queue module provides a FIFO implementation suitable for multi-threaded programming. It can be used to pass messages or other data between producer and consumer threads safely. Locking is handled for the caller, so it is simple to have as many threads as you want working with the same ...
Super Fast Python
superfastpython.com › home › tutorials › thread-safe queue in python
Thread-Safe Queue in Python - Super Fast Python
September 12, 2022 - What is the Queue and how can we use it in Python? Run loops using all CPUs, download your FREE book to learn how. Python provides a thread-safe queue in the queue.Queue class.
Reddit
reddit.com › r/learnpython › beginner question on queue and multithreading
r/learnpython on Reddit: Beginner Question on Queue and Multithreading
October 22, 2024 -

I'm starting to dabble with Queue and Multithreading. Conceptually, I get when and where you would want to introduce these into your code. However, I just want to make sure I'm understanding the use case relationship between the two. Most online resources seem to treat the two as conjoined, but this seems to be under the assumption they will be used at larger scales.

My question: is Queue paired with Multithreading just so you can control the number of threads at any given moment without the program ending due to a lack of open threads? So, if I won't need more than 10 threads at a time then incorporating Queue might not be necessary. However, if I plan to have over 100 threads concurrently (assuming this is hitting some kind of processing limitation) then Queue should be incorporated?

Thanks!

Top answer (1 of 3)
You are way off. It has nothing to do with the number of threads. The first thing you need to understand is a "race condition". Imagine you have a list of tasks that you would like your threads to accomplish. Each thread will be

    while to_do_list:
        task = to_do_list[0]
        del to_do_list[0]  # remove this task from the list
        do_work(task)

When you have several threads working at the same time, it's possible that two threads read the same task from the list at the same time. So basically after threadA reads the task but before threadA has a chance to delete it from the todo list, threadB has also read the task. In our example the first task in the list is done twice and the second task is skipped. But it could lead to much more serious data corruption.

A Queue is simply a list that's protected against this condition. A thread can get and delete from the Queue without worry that another thread interferes. So we use Queues to pass data in and out of threads. But you should note we don't always need to use a Queue in Python, because many of Python's basic operations have the same threadsafe protections built in (via the GIL). An advanced user may use list.pop to do the same thing.
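The racy to_do_list loop can be rewritten with queue.Queue; a hedged sketch of mine (do_work is a placeholder name carried over from the post):

```python
import queue
import threading

to_do = queue.Queue()
for task in range(20):
    to_do.put(task)

done = []
done_lock = threading.Lock()

def do_work(task):
    # Placeholder for real work; here we just record the task.
    with done_lock:
        done.append(task)

def worker():
    while True:
        try:
            # get_nowait atomically takes-and-removes one item, so two
            # threads can never receive the same task.
            task = to_do.get_nowait()
        except queue.Empty:
            return
        do_work(task)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(done))  # every task done exactly once
```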
Answer 2 of 3
Ultimately, the two aren't intrinsically linked: you could use multithreading without queues, and even use a queue without multithreading. But they are commonly used together because the queue serves a useful purpose in coordinating work.

Suppose you've got a list of 1000 items to process, and you want to use multithreading to do it in parallel. One option would be to create 1000 threads, give an item to each thread and let them process it. However, this is inefficient: those threads are consuming resources (e.g. memory), and the contention of them all trying to do the same thing at once will likely slow them down. And if we have a million items, it gets even more infeasible. Ultimately, there's no reason why the optimal number of threads should depend on how many items you want to process. Generally, you'll want the same number of threads when processing 100 items as 1 million items, so we usually want to decouple the threads doing the work from the items being processed.

One option would be to create, say, 10 threads, divide the work into batches of 100 items, and have each thread process a batch. This works much better, but can still have downsides: if there's a lot of variance between items, one thread might churn through its batch faster than others, while another might take a long time. So at the end, you may have 9 threads sitting idle waiting for the last thread to finish, when they could potentially finish faster by taking on some of the remaining work.

So another option is to have the threads request an item, process it, then request another item and so on, until there are no more items. And this is where Queue comes in. We put all our work items on the queue, then each thread sits in a loop pulling items off the queue and processing them, until the queue is empty. This spreads the work out better.
Queue is written in such a way that retrieving and putting an item to/from it is threadsafe, meaning it's protected against simultaneous access potentially corrupting its state (e.g. the same item being given to two different threads, or items being lost when two threads try to add an item at the same time). It guarantees such data races can't happen when using it. There may also be more complex scenarios, such as writing results onto another queue, which might have threads of its own pulling from it. Ultimately, the queue is a way of communicating information between threads in a safe way, usually in a producer/consumer oriented system.
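The worker-pool pattern this answer describes, a fixed number of threads pulling items off a shared queue and reporting results on a second queue, can be sketched as follows (a minimal illustration of mine, not code from the post):

```python
import queue
import threading

NUM_WORKERS = 10
work = queue.Queue()
results = queue.Queue()  # a second queue carries results back safely

def worker():
    while True:
        item = work.get()
        if item is None:          # sentinel: no more work
            work.task_done()
            return
        results.put(item * item)  # stand-in for real processing
        work.task_done()

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

for item in range(1000):          # 1000 items, but still only 10 threads
    work.put(item)
for _ in range(NUM_WORKERS):      # one sentinel per worker
    work.put(None)

work.join()                       # block until every item is processed
for t in threads:
    t.join()

squares = sorted(results.get() for _ in range(results.qsize()))
print(len(squares))  # 1000
```

Because the threads outlive any single item, the same 10 workers handle 100 items or a million; only the queue contents change.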
Python Tutorial
pythontutorial.net › home › python concurrency › python thread-safe queue
Python Thread-safe Queue
June 4, 2023 - The built-in queue module allows you to exchange data safely between multiple threads. The Queue class in the queue module implements all required locking semantics. To create a new queue, you import the Queue class from the queue module: from queue import Queue
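Continuing the snippet, a minimal FIFO usage sketch (my own example):

```python
from queue import Queue

q = Queue()      # unbounded FIFO queue
q.put('a')
q.put('b')

first = q.get()  # items come back in insertion order
second = q.get()
print(first, second)  # a b
```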
Python
docs.python.org › 3 › library › asyncio-queue.html
Queues — Python 3.14.4 documentation
February 22, 2026 - Source code: Lib/asyncio/queues.py asyncio queues are designed to be similar to classes of the queue module. Although asyncio queues are not thread-safe, they are designed to be used specifically i...
GitHub
gist.github.com › initbrain › 7023213
Python thread-safe queue example · GitHub
Python thread-safe queue example. GitHub Gist: instantly share code, notes, and snippets.
InformIT
informit.com › articles › article.aspx
2.6 queue: Thread-Safe FIFO Implementation | Python 3 Data Structures | InformIT
The queue module provides a first-in, first-out (FIFO) data structure suitable for multi-threaded programming. It can be used to pass messages or other data between producer and consumer threads safely. Locking is handled for the caller, so many threads can work with the same Queue instance ...
Super Fast Python
superfastpython.com › home › tutorials › thread-safe simplequeue in python
Thread-Safe SimpleQueue in Python - Super Fast Python
September 12, 2022 - What is the SimpleQueue and how can we use it in Python? Run loops using all CPUs, download your FREE book to learn how. Python provides a simple thread-safe queue in the queue.SimpleQueue class.
GitHub
github.com › peter-wangxu › persist-queue
GitHub - peter-wangxu/persist-queue: A thread-safe disk based persistent queue in Python · GitHub
Thread-safe: Supports multi-threaded producers and consumers · Recoverable: Items can be read after process restart · Green-compatible: Works with greenlet or eventlet environments · Multiple serialization: Supports pickle (default), msgpack, ...
iO Flood
ioflood.com › blog › python-queue
Python Queue Class | Usage Guide (With Examples)
February 5, 2024 - Here, we’ll explore common issues you may encounter and offer solutions and workarounds. Python queues are thread-safe, meaning they can be accessed by multiple threads simultaneously without the risk of data corruption.
GitHub
github.com › aio-libs › janus
GitHub - aio-libs/janus: Thread-safe asyncio-aware queue for Python · GitHub
This library is built using a classic thread-safe design....
Pythontic
pythontic.com › synchronized-queue
Synchronized queue classes in Python | Pythontic.com
The Queue module in the Python Standard library provides several kinds of queue classes. All these variants of queues are thread-safe but non-reentrant. They can be used in a producer(s)-consumer(s) threads environment, without writing any synchronization code. The synchronization is built-in ...
Real Python
realpython.com › ref › stdlib › queue
queue | Python Standard Library – Real Python
The Python queue module provides reliable thread-safe implementations of the queue data structure. It is commonly used for task scheduling and managing work between multiple threads.
Python
docs.python.org › 3 › library › threading.html
threading — Thread-based parallelism — Python 3.14.4 ...
... concurrent.futures.ThreadP... being able to retrieve their results when needed. queue provides a thread-safe interface for exchanging data between running threads....