Just call Executor.shutdown:

shutdown(wait=True)

Signal the executor that it should free any resources that it is using when the currently pending futures are done executing. Calls to Executor.submit() and Executor.map() made after shutdown will raise RuntimeError.

If wait is True then this method will not return until all the pending futures are done executing and the resources associated with the executor have been freed.
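A minimal sketch of that behavior (the `print` task is just a placeholder):

```python
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=1)
executor.submit(print, "ok")
executor.shutdown(wait=True)  # blocks until the pending task is done

# As documented, submit() after shutdown raises RuntimeError.
rejected = False
try:
    executor.submit(print, "too late")
except RuntimeError:
    rejected = True
print("rejected:", rejected)  # rejected: True
```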

However, if you keep track of your futures in a list, you can avoid shutting the executor down (keeping it available for later use) by waiting on them with the futures.wait() function:

concurrent.futures.wait(fs, timeout=None, return_when=ALL_COMPLETED)

Wait for the Future instances (possibly created by different Executor instances) given by fs to complete. Returns a named 2-tuple of sets. The first set, named done, contains the futures that completed (finished or were cancelled) before the wait completed. The second set, named not_done, contains uncompleted futures.

Note that if you don't provide a timeout, it waits until all futures have completed.
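A sketch of this pattern, with placeholder sleep durations chosen so one task outlives the timeout:

```python
import concurrent.futures
import time

def work(n):
    # toy task used only for illustration
    time.sleep(n)
    return n

executor = concurrent.futures.ThreadPoolExecutor(max_workers=2)
futures = [executor.submit(work, n) for n in (0.1, 0.2, 2)]

# With a timeout, wait() returns the sets of finished and pending
# futures without blocking until everything completes.
done, not_done = concurrent.futures.wait(futures, timeout=1)
print(len(done), len(not_done))  # 2 1 -- the slow task is still pending

# The executor is still usable here; shut it down when truly finished.
executor.shutdown(wait=True)
```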

You can also use futures.as_completed() instead; however, you'd have to iterate over it.
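For comparison, iterating futures.as_completed() yields each future as it finishes (the work() task and its durations are placeholders):

```python
import concurrent.futures
import time

def work(n):
    time.sleep(n)
    return n

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
    futures = [executor.submit(work, n) for n in (0.4, 0.1, 0.2)]
    # Results arrive in completion order, not submission order.
    results = [f.result() for f in concurrent.futures.as_completed(futures)]

print(results)  # [0.1, 0.2, 0.4]
```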

Answer from Bakuriu on Stack Overflow

As stated before, one can use Executor.shutdown(wait=True), but also pay attention to the following note in the documentation:

You can avoid having to call this method explicitly if you use the with statement, which will shutdown the Executor (waiting as if Executor.shutdown() were called with wait set to True):

import shutil
with ThreadPoolExecutor(max_workers=4) as e:
    e.submit(shutil.copy, 'src1.txt', 'dest1.txt')
    e.submit(shutil.copy, 'src2.txt', 'dest2.txt')
    e.submit(shutil.copy, 'src3.txt', 'dest3.txt')
    e.submit(shutil.copy, 'src4.txt', 'dest4.txt')
๐ŸŒ
Python
docs.python.org โ€บ 3 โ€บ library โ€บ concurrent.futures.html
concurrent.futures โ€” Launching parallel tasks โ€” Python 3.14 ...
January 30, 2026 - You can avoid having to call this method explicitly if you use the executor as a context manager via the with statement, which will shutdown the Executor (waiting as if Executor.shutdown() were called with wait set to True): import shutil with ThreadPoolExecutor(max_workers=4) as e: ...
๐ŸŒ
Super Fast Python
superfastpython.com โ€บ home โ€บ tutorials โ€บ how to wait for all tasks to finish in the threadpoolexecutor
How to Wait For All Tasks to Finish in the ThreadPoolExecutor - Super Fast Python
September 12, 2022 - You can wait for a task to finish in a ThreadPoolExecutor by calling the wait() module function. In this tutorial you will discover how to wait for tasks to finish in a Python thread pool.
๐ŸŒ
Medium
medium.com โ€บ @superfastpython โ€บ python-threadpoolexecutor-7-day-crash-course-78d4846d5acc
Python ThreadPoolExecutor: 7-Day Crash Course | by Super Fast Python | Medium
December 3, 2023 - How to Use as_completed() with ... need to handle the task results in a consistent way. We could wait for all tasks to be completed via the wait() function or respond to tasks as they complete via the as_completed() ...
๐ŸŒ
DigitalOcean
digitalocean.com โ€บ community โ€บ tutorials โ€บ how-to-use-threadpoolexecutor-in-python-3
How To Use ThreadPoolExecutor in Python 3 | DigitalOcean
June 23, 2020 - A with statement is used to create a ThreadPoolExecutor instance executor that will promptly clean up threads upon completion. Four jobs are submitted to the executor: one for each of the URLs in the wiki_page_urls list. Each call to submit returns a Future instance that is stored in the futures list. The as_completed function waits for each Future get_wiki_page_existence call to complete so we can print its result.
๐ŸŒ
Super Fast Python
superfastpython.com โ€บ home โ€บ tutorials โ€บ wait() vs. as_completed() with the threadpoolexecutor in python
wait() vs. as_completed() With the ThreadPoolExecutor in Python - Super Fast Python
September 12, 2022 - Use wait() when waiting for one or all tasks to complete and use as_completed() when you need results as they are available when using the ThreadPoolExecutor in Python. In this tutorial, you will discover when to use wait() and as_completed() ...
๐ŸŒ
pytz
pythonhosted.org โ€บ futures
concurrent.futures โ€” Asynchronous computation โ€” futures 2.1.3 documentation
Regardless of the value of wait, the entire Python program will not exit until all pending futures are done executing. You can avoid having to call this method explicitly if you use the with statement, which will shutdown the Executor (waiting as if Executor.shutdown were called with wait set to True): import shutil with ThreadPoolExecutor(max_workers=4) as e: e.submit(shutil.copy, 'src1.txt', 'dest1.txt') e.submit(shutil.copy, 'src2.txt', 'dest2.txt') e.submit(shutil.copy, 'src3.txt', 'dest3.txt') e.submit(shutil.copy, 'src3.txt', 'dest4.txt')
๐ŸŒ
Tiew Kee Hui's Blog
tiewkh.github.io โ€บ blog โ€บ python-thread-pool-executor
Shutting Down Python's ThreadPoolExecutor
July 10, 2021 - In Python 3.7 and 3.8, shutdown() only accepts one boolean parameter, wait. When wait = True, Python will wait until all submitted tasks have finished running before shutting down the ThreadPoolExecutor.
Find elsewhere
๐ŸŒ
Python
docs.python.org โ€บ 3.3 โ€บ library โ€บ concurrent.futures.html
17.4. concurrent.futures โ€” Launching parallel tasks โ€” Python 3.3.7 documentation
def wait_on_future(): f = executor.submit(pow, 5, 2) # This will never complete because there is only one worker thread and # it is executing this function. print(f.result()) executor = ThreadPoolExecutor(max_workers=1) executor.submit(wait_on_future)
๐ŸŒ
GeeksforGeeks
geeksforgeeks.org โ€บ how-to-use-threadpoolexecutor-in-python3
How to use ThreadPoolExecutor in Python3 ? - GeeksforGeeks
July 4, 2024 - It must be called after executor.submit() and executor.map() method else it would throw RuntimeError. wait=True makes the method not to return until execution of all threads is done and resources are freed up.
Top answer
1 of 3
16

The call to ThreadPoolExecutor.map does not block until all of its tasks are complete. Use wait to do this.

from concurrent.futures import wait, ALL_COMPLETED
...

futures = [pool.submit(fn, args) for args in arg_list]
wait(futures, timeout=whatever, return_when=ALL_COMPLETED)  # ALL_COMPLETED is actually the default
do_other_stuff()

You could also call list() on the generator returned by pool.map to force the evaluation (which is what you're doing in your original example). If you're not actually using the values returned from the tasks, though, wait is the way to go.
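Both options side by side, as a sketch (square() and the inputs are placeholders):

```python
import time
from concurrent.futures import ThreadPoolExecutor, wait, ALL_COMPLETED

def square(x):
    time.sleep(0.1)
    return x * x

pool = ThreadPoolExecutor(max_workers=4)

# Option 1: force the lazy map() iterator when you need the values.
results = list(pool.map(square, [1, 2, 3, 4]))
print(results)  # [1, 4, 9, 16]

# Option 2: ignore the values and just block until everything is done.
futures = [pool.submit(square, x) for x in [5, 6, 7, 8]]
wait(futures, return_when=ALL_COMPLETED)

pool.shutdown(wait=True)
```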

It's true that the call to Executor.map() does not wait for all futures to finish, because it returns a lazy iterator, as @MisterMiyagi said.

But we can accomplish this by using with:

import time

from concurrent.futures import ThreadPoolExecutor

def hello(i):
    time.sleep(i)
    print(i)

with ThreadPoolExecutor(max_workers=2) as executor:
    executor.map(hello, [1, 2, 3])
print("finish")

# output
# 1
# 2
# 3
# finish

As you can see, finish is printed after 1, 2, 3. It works because Executor has an __exit__() method, whose code is

def __exit__(self, exc_type, exc_val, exc_tb):
    self.shutdown(wait=True)
    return False

The shutdown() method of ThreadPoolExecutor is:

def shutdown(self, wait=True, *, cancel_futures=False):
    with self._shutdown_lock:
        self._shutdown = True
        if cancel_futures:
            # Drain all work items from the queue, and then cancel their
            # associated futures.
            while True:
                try:
                    work_item = self._work_queue.get_nowait()
                except queue.Empty:
                    break
                if work_item is not None:
                    work_item.future.cancel()

        # Send a wake-up to prevent threads calling
        # _work_queue.get(block=True) from permanently blocking.
        self._work_queue.put(None)
    if wait:
        for t in self._threads:
            t.join()
shutdown.__doc__ = _base.Executor.shutdown.__doc__

So by using with, we can wait until all futures finish.
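The cancel_futures parameter visible in that source (added in Python 3.9) can also be used directly to drop queued work at shutdown; a sketch with a placeholder slow() task:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow(n):
    time.sleep(0.2)
    return n

executor = ThreadPoolExecutor(max_workers=1)
futures = [executor.submit(slow, n) for n in range(10)]

# Drain the queue and cancel futures that haven't started yet;
# a task that is already running still completes.
executor.shutdown(wait=True, cancel_futures=True)

cancelled = sum(f.cancelled() for f in futures)
print("cancelled:", cancelled)  # most of the queued tasks never ran
```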

๐ŸŒ
Hacker News
news.ycombinator.com โ€บ item
How to Make Python Wait | Hacker News
December 25, 2019 - Manually dealing with threads and processes is useful if you want to build a framework or a very complex workflow. But chances are you just want to run stuff concurrently, in the background ยท In that case (which is most people case), you really want to use one of the stdlib pools: it takes ...
๐ŸŒ
Python Module of the Week
pymotw.com โ€บ 3 โ€บ concurrent.futures
concurrent.futures โ€” Manage Pools of Concurrent Tasks
March 18, 2018 - from concurrent import futures import threading import time def task(n): print('{}: sleeping {}'.format( threading.current_thread().name, n) ) time.sleep(n / 10) print('{}: done with {}'.format( threading.current_thread().name, n) ) return n / 10 ex = futures.ThreadPoolExecutor(max_workers=2) print('main: starting') f = ex.submit(task, 5) print('main: future: {}'.format(f)) print('main: waiting for results') result = f.result() print('main: result: {}'.format(result)) print('main: future after result: {}'.format(f)) The status of the future changes after the tasks is completed and the result i
๐ŸŒ
Python Engineer
python-engineer.com โ€บ posts โ€บ threadpoolexecutor
How to use ThreadPoolExecutor in Python - Python Engineer
May 2, 2022 - ThreadPoolExecutor provides an interface that abstracts thread management from users and provides a simple API to use a pool of worker threads. It can create threads as and when needed and assign tasks to them. In I/O bound tasks like web scraping, while an HTTP request is waiting for the response, another thread can be spawned to continue scraping other URLs.
๐ŸŒ
Python Forum
python-forum.io โ€บ thread-39372.html
How to timeout a task using the ThreadpoolExecutor?
Hi all. I'm wonder what are the possible approaches to handle timeout of a task that was submitted to an executor ThreadPoolExecutor. I know I can timeout while getting the result like so future.result(timeout=x). But I don't understand how can I ti...
๐ŸŒ
Python.org
discuss.python.org โ€บ python help
Concurrent usage of concurrent.futures.ThreadPoolExecutor - Python Help - Discussions on Python.org
December 8, 2023 - Is concurrent.futures.ThreadPoolExecutor safe to use concurrently? I seem to run into a deadlock when I have a concurrent.futures.ThreadPoolExecutor.map call when the function being mapped submits additional tasks to the same pool and calls result() on them ยท You only have a finite pool of ...
๐ŸŒ
Alexwlchan
alexwlchan.net โ€บ 2019 โ€บ adventures-with-concurrent-futures
Adventures in Python with concurrent.futures โ€“ alexwlchan
In this case, Iโ€™m waiting for the first Future. When itโ€™s done, weโ€™ll go ahead and schedule the next one. ... import concurrent.futures import itertools tasks_to_do = get_tasks_to_do() with concurrent.futures.ThreadPoolExecutor() as executor: # Schedule the first N futures.
๐ŸŒ
Roman's Blog
romanvm.pythonanywhere.com โ€บ post โ€บ executing-parallel-tasks-python-concurrent-futures-15
Executing Parallel Tasks in Python with concurrent.futures - Roman's Blog
May 6, 2018 - 11) def get_results(): with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor: future_results = [executor.submit(worker, i) for i in range(8)] concurrent.futures.wait(future_results) for future in future_results: try: yield ...
๐ŸŒ
GitHub
gist.github.com โ€บ clchiou โ€บ f2608cbe54403edb0b13
Python ThreadPoolExecutor (non-)graceful shutdown ยท GitHub
(default behavior) Removing file a ^Ca is removed Removing file b b is removed Removing file c c is removed Removing file d d is removed Traceback (most recent call last): File "non_graceful_shutdown.py", line 25, in <module> for future in as_completed(futures): File "/usr/lib/python3.4/concurrent/futures/_base.py", line 215, in as_completed waiter.event.wait(wait_timeout) File "/usr/lib/python3.4/threading.py", line 552, in wait signaled = self._cond.wait(timeout) File "/usr/lib/python3.4/threading.py", line 289, in wait waiter.acquire() KeyboardInterrupt
๐ŸŒ
Medium
blog.wahab2.com โ€บ python-threadpoolexecutor-use-cases-for-parallel-processing-3d5c90fd5634
Python ThreadPoolExecutor: Use Cases for Parallel Processing | by Abdul Rafee Wahab | Medium
August 23, 2023 - We then submit each computation task to the thread pool using the submit() function and store the resulting Future objects in a list called futures. We then use the as_completed() function to wait for all tasks to complete and retrieve the results.