I think the Pool class is typically more convenient, but it depends on whether you want your results ordered or unordered.

Say you want to create 4 random strings (e.g., for a random user ID generator):

import multiprocessing as mp
import random
import string

# Define an output queue
output = mp.Queue()

# define an example function
def rand_string(length, output):
    """Generate a random string of digits and lower- and uppercase letters."""
    rand_str = ''.join(random.choice(string.ascii_lowercase
                                     + string.ascii_uppercase
                                     + string.digits)
                       for i in range(length))
    output.put(rand_str)

# Set up a list of processes that we want to run
# (on Windows and macOS, the code from here down should live under an
#  `if __name__ == '__main__':` guard, because those platforms spawn
#  rather than fork worker processes)
processes = [mp.Process(target=rand_string, args=(5, output)) for x in range(4)]

# Run processes
for p in processes:
    p.start()

# Exit the completed processes
for p in processes:
    p.join()

# Get process results from the output queue
results = [output.get() for p in processes]

print(results)

# Output
# ['yzQfA', 'PQpqM', 'SHZYV', 'PSNkD']

Here, the order probably doesn't matter. I am not sure if there is a better way to do it, but if I want to keep track of the results in the order in which the functions were called, I typically return tuples with an ID as the first item, e.g.,

# define an example function
def rand_string(length, pos, output):
    """Generate a random string of digits and lower- and uppercase letters."""
    rand_str = ''.join(random.choice(string.ascii_lowercase
                                     + string.ascii_uppercase
                                     + string.digits)
                       for i in range(length))
    output.put((pos, rand_str))

# Set up a list of processes that we want to run
processes = [mp.Process(target=rand_string, args=(5, x, output)) for x in range(4)]

# Run the processes, wait for them to finish, and collect the tagged results
for p in processes:
    p.start()
for p in processes:
    p.join()
results = [output.get() for p in processes]

print(results)

# Output
# [(1, '5lUya'), (3, 'QQvLr'), (0, 'KAQo6'), (2, 'nj6Q0')]

This then lets me sort the results:

results.sort()
results = [r[1] for r in results]
print(results)

# Output:
# ['KAQo6', '5lUya', 'nj6Q0', 'QQvLr']

The Pool class

Now to your question: How is this different from the Pool class? You'd typically prefer Pool.map when you want an ordered list of results, without jumping through the hoops of creating tuples and sorting them by ID. Thus, I would say it is typically more efficient.

def cube(x):
    return x**3

pool = mp.Pool(processes=4)
results = pool.map(cube, range(1,7))
print(results)

# output:
# [1, 8, 27, 64, 125, 216]
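The ordering guarantee is worth emphasizing: Pool.map returns results in input order even when individual tasks finish out of order. A small sketch to illustrate (the slow_cube helper and its staggered sleeps are made up for this demonstration):

```python
import multiprocessing as mp
import time

def slow_cube(x):
    # later inputs sleep less, so they tend to finish earlier
    time.sleep(0.1 * (7 - x))
    return x ** 3

if __name__ == '__main__':
    with mp.Pool(processes=4) as pool:
        results = pool.map(slow_cube, range(1, 7))
    print(results)  # still [1, 8, 27, 64, 125, 216]
```

If you instead want results as they complete, Pool.imap_unordered yields them in completion order.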

Equivalently, there is also an "apply" method:

pool = mp.Pool(processes=4)
results = [pool.apply(cube, args=(x,)) for x in range(1,7)]
print(results)

# output:
# [1, 8, 27, 64, 125, 216]

Both Pool.apply and Pool.map will block the main program until the result is ready.

Now, you also have Pool.apply_async and Pool.map_async, which do not block: they return an AsyncResult object right away, and you call its get() method to collect the result once the work has finished. In that sense they behave much like the Process class above. The advantage may be that they give you the convenient apply- and map-style interface you know from Python's built-in map (and Python 2's built-in apply).
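A minimal sketch of the async variant, reusing the cube function from above: apply_async hands back AsyncResult handles immediately, and get() blocks only when you actually ask for each value.

```python
import multiprocessing as mp

def cube(x):
    return x ** 3

if __name__ == '__main__':
    with mp.Pool(processes=4) as pool:
        # dispatch all tasks without blocking
        async_results = [pool.apply_async(cube, args=(x,)) for x in range(1, 7)]
        # get() blocks until each individual result is ready
        results = [r.get() for r in async_results]
    print(results)  # [1, 8, 27, 64, 125, 216]
```

Because the handles are iterated in submission order, the collected list comes out ordered even if the tasks complete out of order.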

Answer from user2489252 on Stack Overflow
A second answer from the same Stack Overflow question:

You can easily do this with pypeln:

import pypeln as pl

stage = pl.process.map(
    CreateMatrixMp, 
    range(self.numPixels), 
    workers=poolCount, 
    maxsize=2,
)

# iterate over it in the main process
for x in stage:
    pass  # code

# or convert it to a list
data = list(stage)