The simple way to do this is to use a queue.Queue for the work and start the threads with for _ in range(MAXTHREADS): threading.Thread(target=f, args=(the_queue,)).start(). However, I find this easier to read by subclassing Thread. Your mileage may vary.

import threading
import queue

class Worker(threading.Thread):
    def __init__(self, q, other_arg, *args, **kwargs):
        self.q = q
        self.other_arg = other_arg
        super().__init__(*args, **kwargs)
    def run(self):
        while True:
            try:
                work = self.q.get(timeout=3)  # 3s timeout
            except queue.Empty:
                return
            # do whatever work you have to do on work
            self.q.task_done()

q = queue.Queue()
for ptf in b:
    q.put_nowait(ptf)
for _ in range(20):
    Worker(q, otherarg).start()
q.join()  # blocks until task_done() has been called once for every item put on the queue

If you're insistent on using a plain function, I'd suggest wrapping your targetFunction with something that knows how to get work from the queue.

def wrapper_targetFunc(f, q, somearg):
    while True:
        try:
            work = q.get(timeout=3)  # or whatever
        except queue.Empty:
            return
        f(work, somearg)
        q.task_done()

q = queue.Queue()
for ptf in b:
    q.put_nowait(ptf)
for _ in range(20):
    threading.Thread(target=wrapper_targetFunc,
                     args=(targetFunction, q, otherarg)).start()
q.join()
Answer from Adam Smith on Stack Overflow
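If all you need is to fan a fixed list of work items out to a bounded number of threads, the standard library's concurrent.futures offers a higher-level alternative that manages its own internal work queue. A minimal sketch, where targetFunction, b, and otherarg are stand-ins for your own names:

```python
from concurrent.futures import ThreadPoolExecutor

def targetFunction(work, somearg):
    # stand-in for the real per-item work
    return (work, somearg)

b = ["item1", "item2", "item3"]   # stand-in work items
otherarg = "shared"

# The executor maintains its own internal work queue; map() feeds it
# items and yields the results in input order once they are finished.
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(lambda w: targetFunction(w, otherarg), b))
```

This removes the need for manual put_nowait/task_done bookkeeping, at the cost of less control over the worker loop itself.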
๐ŸŒ
Python
docs.python.org โ€บ 3 โ€บ library โ€บ queue.html
queue โ€” A synchronized queue class
February 23, 2026 - The count of unfinished tasks goes up whenever an item is added to the queue. The count goes down whenever a consumer thread calls task_done() to indicate that the item was retrieved and all work on it is complete. When the count of unfinished tasks drops to zero, join() unblocks. Example of how ...
๐ŸŒ
Troy Fawkes
troyfawkes.com โ€บ home โ€บ blog โ€บ the basics of python multithreading and queues
The Basics of Python Multithreading and Queues - Troy Fawkes
May 13, 2024 - Then something about a daemon, and we start the bugger. Thatโ€™s 10 threads running (remember the infinite loop in do_stuff()?) and waiting for me to put something in the Queue. The rest of the code is the same as the Queue example so Iโ€™m just going to put it all together and let you figure it out:
๐ŸŒ
GitHub
gist.github.com โ€บ arthuralvim โ€บ 356795612606326f08d99c6cd375f796
Example of python queues and multithreading. ยท GitHub
Example of python queues and multithreading. ... This file contains hidden or bidirectional Unicode text that may be interpreted or compiled differently than what appears below. To review, open the file in an editor that reveals hidden Unicode characters. Learn more about bidirectional Unicode characters ... It's perfect. Thank you ... This is great, but queue is a global... What if I have a long running thread1 Thread 2 is a consumer of thread 1 that runs intermittently, but thread 1 is always running/filling its queue.
๐ŸŒ
Python Module of the Week
pymotw.com โ€บ 2 โ€บ Queue
Queue โ€“ A thread-safe FIFO implementation - Python Module of the Week
It is simplistic and unsuitable for actual use, but the skeleton implementation gives us enough code to work with to provide an example of using the Queue module. # System modules from Queue import Queue from threading import Thread import time # Local modules import feedparser # Set up some global variables num_fetch_threads = 2 enclosure_queue = Queue() # A real app wouldn't use hard-coded data...
๐ŸŒ
Super Fast Python
superfastpython.com โ€บ home โ€บ tutorials โ€บ thread-safe queue in python
Thread-Safe Queue in Python - Super Fast Python
September 12, 2022 - You can use a thread-safe queue via the queue.Queue class. In this tutorial, you will discover how to use a thread-safe queue in Python. Letโ€™s get started. Need for a Queue A thread is a thread of execution in a computer program. Every Python program has at least one thread of execution called the main [โ€ฆ]
๐ŸŒ
Python Tutorial
pythontutorial.net โ€บ home โ€บ python concurrency โ€บ python thread-safe queue
Python Thread-safe Queue
June 4, 2023 - Inserting item 1 into the queue ... item 3 Processing item 4 Processing item 5Code language: Python (python) ... Create a daemon thread called consumer_thread and start it immediately....
Find elsewhere
๐ŸŒ
Bogotobogo
bogotobogo.com โ€บ python โ€บ Multithread โ€บ python_multithreading_Synchronization_Producer_Consumer_using_Queue.php
Python Multithreading Tutorial: Producer and consumer with Queue - 2020
Creating a thread and passing arguments ... Condition objects with producer and consumer). In the following example, the Consumer and Producer threads runs indefinitely while checking the status of the queue....
๐ŸŒ
Medium
medium.com โ€บ @surve.aasim โ€บ thread-synchronization-in-python-using-queue-1be0df535a66
Thread Synchronization in Python using threading.Queue | by Aasim | Medium
August 17, 2023 - Threads can concurrently place items into the queue using the put() method and retrieve items using the get() method.
๐ŸŒ
TutorialsPoint
tutorialspoint.com โ€บ how-to-implement-multithreaded-queue-with-python
How to implement Multithreaded queue With Python
November 10, 2020 - ** Starting the thread - Fourth *** The multiplication result for the 5 is - [5, 10, 15, 20, 25] *** The multiplication result for the 10 is - [10, 20, 30, 40, 50, 60, 70, 80, 90, 100] *** The multiplication result for the 3 is - [3, 6, 9] ** Completed the thread - Third ** Completed the thread - Fourth ** Completed the thread - Second ** Completed the thread - First ยท 6. We have successfully implemented queue concept.
๐ŸŒ
Reddit
reddit.com โ€บ r/learnpython โ€บ beginner question on queue and multithreading
r/learnpython on Reddit: Beginner Question on Queue and Multithreading (October 22, 2024)

I'm starting to dabble with Queue and Multithreading. Conceptually, I get when and where you would want to introduce these into your code. However, I just want to make sure I'm understanding the use case relationship between the two. Most online resources seem to treat the two as conjoined, but this seems to be under the assumption they will be used at larger scales.

My question: is Queue paired with multithreading just so you can control the number of threads at any given moment, without the program ending due to a lack of open threads? So, if I won't need more than 10 threads at a time, then incorporating Queue might not be necessary. However, if I plan to have over 100 threads concurrently (assuming this hits some kind of processing limitation), then Queue should be incorporated?

Thanks!

Top answer
1 of 3
You are way off. It has nothing to do with the number of threads. The first thing you need to understand is a "race condition". Imagine you have a list of tasks that you would like your threads to accomplish. Each thread will be running:

    while to_do_list:
        task = to_do_list[0]
        del to_do_list[0]  # remove this task from the list
        do_work(task)

When you have several threads working at the same time, it's possible that two threads read the same task from the list at the same time. So basically, after threadA reads the task but before threadA has a chance to delete it from the to-do list, threadB has also read the task. In our example the first task in the list is done twice and the second task is skipped, but it could lead to much more serious data corruption.

A Queue is simply a list that's protected against this condition. A thread can get and delete from the Queue without worrying that another thread interferes, so we use Queues to pass data in and out of threads. But note that we don't always need to use a Queue in Python, because many of Python's basic operations have the same thread-safe protections built in (thanks to the GIL). An advanced user might use list.pop to do the same thing.
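The race-free version of that loop, using queue.Queue, might look like the following sketch (to_do, done, and worker are illustrative names):

```python
import queue
import threading

to_do = queue.Queue()
for task in range(100):
    to_do.put(task)

done = []

def worker():
    while True:
        try:
            # get_nowait() atomically retrieves AND removes an item, so
            # two threads can never be handed the same task
            task = to_do.get_nowait()
        except queue.Empty:
            return           # queue drained; this worker is finished
        done.append(task)    # list.append is itself atomic under the GIL

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# every task was processed exactly once, by exactly one thread
```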
2 of 3
Ultimately, the two aren't intrinsically linked: you could use multithreading without queues, and even use a queue without multithreading. But they are commonly used together because the queue serves a useful purpose in coordinating work.

Suppose you've got a list of 1000 items to process, and you want to use multithreading to do it in parallel. One option would be to create 1000 threads, give an item to each thread, and let them process it. However, this is inefficient: those threads consume resources (e.g. memory), and the contention of them all trying to do the same thing at once will likely slow them down. And if we have a million items, it gets even more infeasible. Ultimately, there's no reason why the optimal number of threads should depend on how many items you want to process: you'll generally want the same number of threads when processing 100 items as 1 million items, so we usually want to decouple the threads doing the work from the items being processed.

One option would be to create, say, 10 threads, divide the work into batches of 100 items, and have each thread process a batch. This works much better, but can still have downsides: if there's a lot of variance between items, one thread might churn through its batch faster than the others, while another might take a long time. At the end, you may have 9 threads sitting idle waiting for the last thread to finish, when they could finish sooner by taking some of the remaining work.

So another option is to have the threads request an item, process it, then request another item, and so on, until there are no more items. And this is where Queue comes in. We put all our work items on the queue, then each thread sits in a loop pulling items off the queue and processing them, until the queue is empty. This spreads the work out better.
Queue is written in such a way that putting an item onto it and retrieving an item from it are thread-safe, meaning it is protected against simultaneous access corrupting its state (e.g. the same item being given to two different threads, or items being lost when two threads try to add an item at the same time): it guarantees such data races can't happen. There may also be more complex scenarios, such as writing results onto another queue that other threads pull from. Ultimately, the queue is a way of communicating information between threads in a safe way, usually in a producer/consumer oriented system.
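A minimal producer/consumer sketch of the pattern described above, with results flowing out on a second queue (all names are illustrative):

```python
import queue
import threading

work_q = queue.Queue()
result_q = queue.Queue()               # results flow out on a second queue

def consumer():
    while True:
        item = work_q.get()
        if item is None:               # sentinel value: no more work
            work_q.task_done()
            return
        result_q.put(item * item)      # "process" the item
        work_q.task_done()

workers = [threading.Thread(target=consumer) for _ in range(3)]
for w in workers:
    w.start()

for i in range(10):
    work_q.put(i)
for _ in workers:
    work_q.put(None)                   # one sentinel per worker

work_q.join()                          # blocks until every put() is matched by task_done()
for w in workers:
    w.join()
results = sorted(result_q.queue)       # workers have exited, so this read is safe
```

The sentinel-per-worker trick is one common way to shut the consumers down cleanly instead of relying on a get() timeout.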
๐ŸŒ
GeeksforGeeks
geeksforgeeks.org โ€บ python โ€บ python-implementation-of-queues-in-multi-threading
Implementation of Queues in Multi-Threading in Python3 - GeeksforGeeks
July 11, 2025 - If that work is completed, then the process will be allowed to exit and all of the daemon threads are automatically killed (which a the desired outcome want since all of the work is done anyways). To build further familiarity with the concept, it would be a good idea try to look at a slightly more advanced example that will give a better idea of how they can be more useful. ... # run with python3.3 onwards import queue import threading lock = threading.Lock() # we will use this thread lock during the work function # Work function from step 4 def writeToFile(filename, content): lock.acquire() #
๐ŸŒ
Coderz Column
coderzcolumn.com โ€บ tutorials โ€บ python โ€บ queue-thread-safe-synchronized-queues-in-python
queue - Thread-Safe Synchronized Queues in Python by Sunny Solanki
Python provides lock primitives like Lock, RLock, and Semaphore, which can be used to wrap code that modifies shared data, making sure that only one thread executes it and hence that only one thread updates the data at a time. It's quite easy to use these primitives to prevent part of the code from being accessed concurrently, but doing so requires adding code every time we access or modify the shared data, which can result in bugs if locks are not acquired and released in the proper sequence. Python provides a module named queue that handles concurrent access to the queue data structure by applying these lock semantics for us.
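The trade-off can be seen side by side; a sketch comparing a manually locked list with queue.Queue, which locks internally (all names are illustrative):

```python
import queue
import threading

# Manual approach: every access to the shared list must take the lock;
# forgetting it even once reintroduces the race.
shared = []
lock = threading.Lock()

def add_manual(item):
    with lock:
        shared.append(item)

# queue.Queue approach: the locking happens inside put()/get().
q = queue.Queue()

def add_queued(item):
    q.put(item)  # already thread-safe, no explicit lock needed

threads = [threading.Thread(target=add_manual, args=(i,)) for i in range(5)]
threads += [threading.Thread(target=add_queued, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```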
Top answer
1 of 3

Setting the threads to be daemon threads causes them to exit when the main thread is done. But yes, you are correct that your threads will run continuously for as long as there is something in the queue; otherwise get() will block.

The Queue documentation explains this detail.

The Python threading documentation explains the daemon part as well.

The entire Python program exits when no alive non-daemon threads are left.

So, when the queue is emptied, queue.join resumes; then, when the interpreter exits, the daemon threads die.

EDIT: Correction on default behavior for Queue
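A sketch of the behavior this answer describes: daemon workers block forever on get(), q.join() unblocks once every item has been marked task_done(), and the workers are killed when the main thread exits (names are illustrative):

```python
import queue
import threading

q = queue.Queue()
results = []

def worker():
    while True:              # never returns on its own; blocks on get()
        item = q.get()
        results.append(item * 2)
        q.task_done()

for _ in range(4):
    t = threading.Thread(target=worker)
    t.daemon = True          # killed automatically when the main thread exits
    t.start()

for i in range(8):
    q.put(i)

q.join()                     # unblocks once task_done() has matched every put()
# the main thread may now exit; the blocked daemon workers die with it
```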

2 of 3

Your script works fine for me, so I assume you are asking what is going on so you can understand it better. Yes, your subclass puts each thread in an infinite loop, waiting for something to be put in the queue. When something is found, the thread grabs it and does its thing. Then, the critical part: it notifies the queue that it's done with queue.task_done, and resumes waiting for another item in the queue.

While all this is going on with the worker threads, the main thread is waiting (join) until all the tasks in the queue are done, which happens when the threads have called queue.task_done the same number of times as there were messages in the queue. At that point the main thread finishes and exits. Since these are daemon threads, they close down too.

This is cool stuff, threads and queues. It's one of the really good parts of Python. You will hear all kinds of things about how threading in Python is screwed up by the GIL and such. But if you know where to use threads (like in this case, with network I/O), they will really speed things up for you. The general rule: if you are I/O bound, try and test threads; if you are CPU bound, threads are probably not a good idea; maybe try processes instead.

good luck,

Mike
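The I/O-bound rule of thumb above can be demonstrated with simulated I/O: because sleeping threads overlap their waits, the total wall time is roughly one delay rather than the sum of all of them (fake_io stands in for a blocking network call):

```python
import threading
import time

def fake_io(results, i):
    time.sleep(0.2)          # stands in for a blocking network call
    results.append(i)

results = []
start = time.perf_counter()
threads = [threading.Thread(target=fake_io, args=(results, i)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start
# roughly 0.2s total rather than 1.0s sequentially: the sleeps overlap
```

For CPU-bound work the same experiment shows little or no speedup, because the GIL lets only one thread execute Python bytecode at a time.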

๐ŸŒ
GeeksforGeeks
geeksforgeeks.org โ€บ python โ€บ multithreaded-priority-queue-in-python
Multithreaded Priority Queue in Python - GeeksforGeeks
July 12, 2025 - import queue import threading import time thread_exit_Flag = 0 class sample_Thread (threading.Thread): def __init__(self, threadID, name, q): threading.Thread.__init__(self) self.threadID = threadID self.name = name self.q = q def run(self): print ("initializing " + self.name) process_data(self.name, self.q) print ("Exiting " + self.name) # helper function to process data def process_data(threadName, q): while not thread_exit_Flag: queueLock.acquire() if not workQueue.empty(): data = q.get() queueLock.release() print ("% s processing % s" % (threadName, data)) else: queueLock.release() time.sl
๐ŸŒ
GitHub
gist.github.com โ€บ initbrain โ€บ 7023213
Python thread-safe queue example ยท GitHub
Python thread-safe queue example. GitHub Gist: instantly share code, notes, and snippets.
๐ŸŒ
Medium
medium.com โ€บ @shashwat_ds โ€บ a-tiny-multi-threaded-job-queue-in-30-lines-of-python-a344c3f3f7f0
A tiny multi threaded job queue in 30 lines of python | by Shashwat Kumar | Medium
December 11, 2015 - *args allows you to pass variable number of arguments and **kwargs allows named arguments. def start_workers(self): for i in range(self.num_workers): t = Thread(target=self.worker) t.daemon = True t.start()
๐ŸŒ
Medium
basillica.medium.com โ€บ working-with-queues-in-python-a-complete-guide-aa112d310542
Working with Queues in Python โ€” A Complete Guide | by Basillica | Medium
March 27, 2024 - deque provides a thread-safe version called deque.deque that can be used in multi-threaded programs. ... In addition to the built-in data structures, you can also implement a custom queue class in Python.
๐ŸŒ
IBM
developer.ibm.com โ€บ articles โ€บ au-threadingpython
Practical threaded programming with Python
June 3, 2008 - IBM Developer is your one-stop location for getting hands-on training and learning in-demand skills on relevant technologies such as generative AI, data science, AI, and open source.
Top answer
1 of 1

Let's start at the end:

as per expectations, the line no 13 and line no 19 should be same.

Since you get() from the queue in one thread and put() to it in another without using any Lock, you should not expect that nothing will be added to the queue between two lines of the thread function. That is what you are seeing: printing the size at line 13 and then reading the size again at line 14 results in different values.

Also, after, line no 25, it should print Start..(because association_helps should finish its execution and run again)

You print("Start..") before entering the while True loop. Therefore you will not see that print again unless you call the function again.


The following are explanations and examples of how to resolve the races in the put/get threaded queue:

Declare the lock as a global variable.

lock1 = threading.Lock()

Using this lock, let's now ensure that the size of the queue and the expected len(items) end up with the same value.

with lock1:
    print("Line no 13:", queue1.qsize())
    SizeofQueue1 = queue1.qsize()
# and
with lock1:
    queue1.put([i])

This will result in the same expected size:

Line no 13: 9
Line no 19: 9
[[1], [2], [3], [4], [5], [6], [7], [8], [9]]
Line no 25: done

Regarding the print("Start.."), you can simply move it inside the while loop so that it is printed between iterations.

while True:
    print("Start..")
    if queue1.qsize() > 0:
        # The rest of the code
Finally, if you want the items list to contain only the items from the current iteration, you need to clear it. If you don't clear the list between two iterations, the difference will just keep growing.

list.clear() Remove all items from the list. Equivalent to del a[:].

And you will end up with:

while True:
    print("Start..")
    items.clear()
    if queue1.qsize() > 0:
        # The rest of the code
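Putting the fragments above together, a self-contained sketch of the corrected pattern. queue1, items, and the loop bounds are illustrative, and the producer is joined before the consumer runs so the result is deterministic:

```python
import queue
import threading

queue1 = queue.Queue()
lock1 = threading.Lock()
items = []

def producer():
    for i in range(1, 10):
        with lock1:          # pair with the consumer's lock so qsize()
            queue1.put([i])  # cannot change between its read and its use

def consumer():
    while True:
        print("Start..")
        items.clear()        # keep only this iteration's items
        with lock1:
            size = queue1.qsize()
        if size > 0:
            for _ in range(size):
                items.append(queue1.get())
            return           # one pass is enough for this sketch

p = threading.Thread(target=producer)
p.start()
p.join()                     # producer finishes before the consumer runs
consumer()
```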