Queue.task_done is not there for the workers' benefit. It is there to support Queue.join.


If I give you a box of work assignments, do I care about when you've taken everything out of the box?

No. I care about when the work is done. Looking at an empty box doesn't tell me that. You and 5 other guys might still be working on stuff you took out of the box.

Queue.task_done lets workers say when a task is done. Someone waiting for all the work to be done with Queue.join will wait until enough task_done calls have been made, not until the queue is empty.
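The "empty box" point above can be shown in a few lines. This is a minimal single-threaded sketch; it peeks at Queue.unfinished_tasks, which is a CPython implementation detail (not part of the documented API), purely to illustrate the bookkeeping:

```python
import queue

q = queue.Queue()
for i in range(3):
    q.put(i)

# Take everything out of the box: the queue is now empty...
items = [q.get() for _ in range(3)]
assert q.empty()

# ...but the work is not yet "done". join() would still block here,
# because the internal counter of unfinished tasks is still 3.
# (unfinished_tasks is a CPython internal, shown only for illustration.)
assert q.unfinished_tasks == 3

for _ in items:
    q.task_done()

q.join()  # returns immediately: every get() was matched by a task_done()
print("all work accounted for")
```

So join() tracks task_done calls, not queue length: draining the queue alone is never enough to unblock it.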


eigenfield points out in the comments that it seems really weird for a queue to have task_done/join methods. That's true, but it's really a naming problem. The queue module has bad name choices that make it sound like a general-purpose queue library, when it's really a thread communication library.

It'd be weird for a general-purpose queue to have task_done/join methods, but it's entirely reasonable for an inter-thread message channel to have a way to indicate that messages have been processed. If the class were called thread_communication.MessageChannel instead of queue.Queue and task_done were called message_processed, the intent would be a lot clearer.

(If you need a general-purpose queue rather than an inter-thread message channel, use collections.deque.)
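As a quick sketch of that suggestion: collections.deque gives O(1) appends and pops at both ends, with none of the task-tracking machinery:

```python
from collections import deque

d = deque()
d.append("a")       # enqueue at the right
d.append("b")
d.appendleft("z")   # deques also support O(1) operations on the left

print(d.popleft())  # -> "z" (FIFO consumption from the left)
print(d.popleft())  # -> "a"
```

There is no task_done/join here because a plain container has no notion of "work in flight"; that concept only makes sense for the inter-thread channel use case.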

Answer from user2357112 on Stack Overflow
Top answer (1 of 4, score 116)


Answer 2 of 4 (score 25)

.task_done() is used to signal to .join() that the processing of an item is complete.

If you use .join() and don't call .task_done() for every processed item, your script will hang forever.


Ain't nothin' like a short example:

import logging
import queue
import threading
import time

items_queue = queue.Queue()
running = False


def items_queue_worker():
    while running:
        try:
            item = items_queue.get(timeout=0.01)
            if item is None:
                continue

            try:
                process_item(item)
            finally:
                items_queue.task_done()

        except queue.Empty:
            pass
        except Exception:
            # Don't use a bare except here: it would also swallow
            # KeyboardInterrupt and SystemExit.
            logging.exception('error while processing item')


def process_item(item):
    print('processing {} started...'.format(item))
    time.sleep(0.5)
    print('processing {} done'.format(item))


if __name__ == '__main__':
    running = True

    # Create 10 items_queue_worker threads
    worker_threads = 10
    for _ in range(worker_threads):
        threading.Thread(target=items_queue_worker).start()

    # Populate your queue with data
    for i in range(100):
        items_queue.put(i)

    # Wait for all items to finish processing
    items_queue.join()

    running = False
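To make the "hangs forever" warning concrete, here is a minimal sketch of what happens when a consumed item is never marked done. It uses a daemon watchdog thread and a threading.Event so the blocked join() can be observed with a timeout:

```python
import queue
import threading

q = queue.Queue()
q.put("job")
q.get()  # item consumed, but task_done() is never called (the bug)

done = threading.Event()

def waiter():
    q.join()  # blocks until the unfinished-task counter reaches zero
    done.set()

threading.Thread(target=waiter, daemon=True).start()

# The queue is empty, yet join() is still blocked: the task was
# never reported as done.
print(done.wait(timeout=0.2))  # -> False

q.task_done()                  # now the counter reaches zero
print(done.wait(timeout=2))    # -> True
```

This is exactly why the worker above calls task_done() in a finally block: even if process_item raises, the item is still accounted for and join() can eventually return.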
It processes items in the queue one after another. These daemon threads go into an infinite loop, and only exit when the main thread ends. """ while True: print '%s: Looking for the next enclosure' % i url = q.get() print '%s: Downloading:' % i, url # instead of really downloading the URL, # we just pretend and sleep time.sleep(i + 2) q.task_done() # Set up some threads to fetch the enclosures for i in range(num_fetch_threads): worker = Thread(target=downloadEnclosures, args=(i, enclosure_queue,)) worker.setDaemon(True) worker.start() # Download the feed(s) and put the enclosure URLs into # the queue.