Queue.task_done is not there for the workers' benefit. It is there to support Queue.join.


If I give you a box of work assignments, do I care about when you've taken everything out of the box?

No. I care about when the work is done. Looking at an empty box doesn't tell me that. You and 5 other guys might still be working on stuff you took out of the box.

Queue.task_done lets workers say when a task is done. Someone waiting for all the work to be done with Queue.join will wait until enough task_done calls have been made, not when the queue is empty.
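A minimal sketch of that distinction (hypothetical worker, not from the answer): the queue can be empty well before `join()` unblocks, because `join()` counts `task_done()` calls, not `get()` calls.

```python
import queue
import threading
import time

q = queue.Queue()
results = []

def worker():
    while True:
        item = q.get()           # take an assignment out of the box
        time.sleep(0.1)          # ... the work itself takes a while
        results.append(item * 2)
        q.task_done()            # only now is the task actually *done*

threading.Thread(target=worker, daemon=True).start()

for i in range(5):
    q.put(i)

q.join()  # blocks until task_done() has been called once per put()
print(sorted(results))  # [0, 2, 4, 6, 8]
```

Between a worker's `get()` and its `task_done()`, `q.empty()` can already be `True` while `q.join()` still blocks; that gap is exactly what `task_done()` exists to track.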


eigenfield points out in the comments that it seems really weird for a queue to have task_done/join methods. That's true, but it's really a naming problem. The queue module has bad name choices that make it sound like a general-purpose queue library, when it's really a thread communication library.

It'd be weird for a general-purpose queue to have task_done/join methods, but it's entirely reasonable for an inter-thread message channel to have a way to indicate that messages have been processed. If the class was called thread_communication.MessageChannel instead of queue.Queue and task_done was called message_processed, the intent would be a lot clearer.

(If you need a general-purpose queue rather than an inter-thread message channel, use collections.deque.)
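For comparison, here is what that general-purpose alternative looks like: a plain `collections.deque` is just a data structure, with fast operations at both ends and no task-tracking machinery at all.

```python
from collections import deque

d = deque()
d.append('a')       # enqueue on the right
d.append('b')
d.appendleft('z')   # deques also support O(1) ops on the left

print(d.popleft())  # 'z' -- dequeue from the left (FIFO with append)
print(d.popleft())  # 'a'

# No task_done()/join() here: nothing is waiting on "work" being
# finished, because a deque is not an inter-thread message channel.
```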

Answer from user2357112 on Stack Overflow
Top answer · 1 of 4 · score 116


Answer 2 of 4 · score 25

.task_done() is used to notify .join() that the processing of an item is done.

If you use .join() and don't call .task_done() for every processed item, your script will hang forever.


Ain't nothin' like a short example:

import logging
import queue
import threading
import time

items_queue = queue.Queue()
running = False


def items_queue_worker():
    while running:
        try:
            item = items_queue.get(timeout=0.01)
            try:
                process_item(item)
            finally:
                # Pair every successful get() with a task_done(), even
                # when process_item raises; otherwise join() hangs.
                items_queue.task_done()

        except queue.Empty:
            pass
        except Exception:
            logging.exception('error while processing item')


def process_item(item):
    print('processing {} started...'.format(item))
    time.sleep(0.5)
    print('processing {} done'.format(item))


if __name__ == '__main__':
    running = True

    # Create 10 items_queue_worker threads
    worker_threads = 10
    for _ in range(worker_threads):
        threading.Thread(target=items_queue_worker).start()

    # Populate your queue with data
    for i in range(100):
        items_queue.put(i)

    # Wait for all items to finish processing
    items_queue.join()

    running = False