To quote the documentation for typing.ParamSpec:

P.args [...] should only be used to annotate *args. P.kwargs [...] should only be used to annotate **kwargs.

So this is not possible with ParamSpec.

The idea of being able to use P.args/P.kwargs in other scopes is not new (see, for example, python/typing issue #1252). But there are fundamental problems with that, which are unlikely to be solved with ParamSpec as is. PEP 612 itself briefly mentions the underlying problem, explaining why ParamSpec was introduced in the first place:

The core problem here is that, by default, parameters in Python can either be called positionally or as a keyword argument. This means we really have three categories (positional-only, positional-or-keyword, keyword-only) we’re trying to jam into two categories. [...] Fundamentally, in order to capture two categories when there are some things that can be in either category, we need a higher level primitive (ParamSpec) to capture all three, and then split them out afterward.
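For concreteness, the three categories the quote refers to can all appear in a single signature (plain Python, nothing ParamSpec-specific; my own illustration):

```python
def f(a: int, /, b: int, *, c: int) -> int:
    # a: positional-only, b: positional-or-keyword, c: keyword-only
    return a + b + c

f(1, 2, c=3)        # OK
f(1, b=2, c=3)      # OK: b may be passed either way
# f(a=1, b=2, c=3)  # TypeError: a is positional-only
# f(1, 2, 3)        # TypeError: c is keyword-only
```

It is the middle category, positional-or-keyword, that makes any tuple/dict split of a signature ambiguous.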

A good example was given by someone in the aforementioned issue:

def foo(x: int) -> None: ...
def get_callable_and_args(x: Callable[P, Any]) -> Tuple[P.args, P.kwargs]: ...

reveal_type(get_callable_and_args(foo))  # <- What should this reveal?

Should we reveal a tuple of int + empty dict? Or should we reveal an empty tuple and a singleton dict that maps x to int? [...] the existence of "positional-or-keyword" arguments in Python makes things highly ambiguous. [...] To fully resolve this kind of issue, there needs to be a way in the type system to represent positional-only, positional-or-keyword, and keyword-only parameters separately. But at that point, it's going to be a different language feature, not ParamSpec anymore.

So I would not hold my breath for ParamSpec to support this any time soon.


If we were talking only about positional arguments, what you want would be possible with TypeVarTuple. PEP 646 specifically suggests that type variable tuples can be used as Callable arguments.

Something like this should then be possible:

from collections.abc import Callable
from typing import Any, TypeVarTuple

Ts = TypeVarTuple("Ts")


def test_function(func: Callable[[*Ts], Any], args: list[tuple[*Ts]]) -> None:
    for a in args:
        func(*a)

I say "should" because it is still hard to verify, since mypy does not fully support PEP 646 yet.

But even so, there is no support for keyword arguments. Basically, as soon as a combination of the three parameter categories comes up, things get tricky fast.

Answer from Daniel Fainberg on Stack Overflow



While you can't reshape ParamSpec like that, with some API juggling you can achieve almost the same effect with minimal boilerplate code whilst retaining static typing. The key is to bind P = ParamSpec("P") to a class, then use that class's methods (which are bound to P) to do the type-checking for you.

First, a demonstration of what the final API could look like. I'm using mypy to generate the errors, but this should be compliant with other type-checkers.

def f1(a: int, b: float, *, c: int) -> float:
    return a + b + c

# Version 1 - sequence of procedural calls
>>> tester = FunctionTester(f1)
>>> tester.add_args(1, 2, c=-1)
>>> tester.add_args(3, 4, c=-2)
>>>
>>> # mypy: Argument 1 to "add_args" of "FunctionTester" has incompatible type "str"; expected "int" [arg-type]
>>> # mypy: Argument "b" to "add_args" of "FunctionTester" has incompatible type "List[<nothing>]"; expected "float" [arg-type]
>>> # mypy: Unexpected keyword argument "d" for "add_args" of "FunctionTester" [call-arg]
>>> # tester.add_args("", b=[], d=object())
>>>
>>> tester.test()
f1(1, 2, c=-1) executed in: 0:00:00.000002
f1(3, 4, c=-2) executed in: 0:00:00.000001

# Version 2 - minimalist API using a context manager which automatically calls the test upon exit
>>> with FunctionTester(f1) as test:
...     test(1, 2, c=-1)
...     test(3, 4, c=-2)
...
...     # mypy: Argument 1 has incompatible type "str"; expected "int" [arg-type]
...     # mypy: Argument "b" has incompatible type "List[<nothing>]"; expected "float" [arg-type]
...     # mypy: Unexpected keyword argument "d" [call-arg]
...     # test("", b=[], d=object())
f1(1, 2, c=-1) executed in: 0:00:00.000001
f1(3, 4, c=-2) executed in: 0:00:00.000001

Now, the implementation:

from types import TracebackType
from typing import (
    Any,
    Callable,
    Generic,
    ParamSpec,
)
from typing_extensions import Self

from datetime import timedelta
from itertools import chain
from time import monotonic


P = ParamSpec("P")


class FunctionTester(Generic[P]):

    _func: Callable[P, Any]
    _arg_groups: list[tuple[tuple[Any, ...], dict[str, Any]]]

    def __init__(self, function_: Callable[P, Any]) -> None:
        self._func = function_
        self._arg_groups = []

    def add_args(self, *args: P.args, **kwargs: P.kwargs) -> None:
        self._arg_groups.append((args, kwargs))

    def test(self) -> None:
        func: Callable[P, Any] = self._func
        for a, kw in self._arg_groups:
            args_str = ", ".join(
                chain(
                    (str(i) for i in a),
                    (f"{k}={v}" for k, v in kw.items()),
                )
            )
            start = monotonic()
            func(*a, **kw)
            print(
                f"{func.__name__}({args_str}) "
                f"executed in: {timedelta(seconds=monotonic() - start)}"
            )
    
    # The following additional methods allow the API in Version 2

    __call__ = add_args

    def __enter__(self) -> Self:
        return self

    def __exit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> None:
        self.test()

If the parameters match exactly, you can use this 1:1. If there is a mismatch, you would also need to use Concatenate and modify it a bit; I have yet to find a generic solution for that case. This is a modified version of the above answer from @Daniil Fajnberg:

from typing import TypeVar, ParamSpec, Callable

P = ParamSpec("P")
T = TypeVar("T")
S = TypeVar("S")

def paramspec_from(_: Callable[P, T]) -> Callable[[Callable[P, S]], Callable[P, S]]:
    def _fnc(fnc: Callable[P, S]) -> Callable[P, S]:
        return fnc
    return _fnc

then you can use it like this:

@paramspec_from(known_function)
def wrapper(*args, **kwargs) -> str:
    foo()
    known_function(*args, **kwargs)
    return "whatever"

The problem is that mypy only looks at the function header to resolve the ParamSpec, not the function body, so it cannot deduce the types of args and kwargs. But if we lift the function into the header, e.g. by passing it into the decorator as an argument, mypy can deduce the parameters correctly. As you can see, we don't even use the passed function (its parameter is named _); we only use it for its type hints.


PEP 612 as well as the documentation of ParamSpec.args and ParamSpec.kwargs are pretty clear on this:


These “properties” can only be used as the annotated types for *args and **kwargs, accessed from a ParamSpec already in scope.

- Source: PEP 612 ("The components of a ParamSpec" -> "Valid use locations")


Both attributes require the annotated parameter to be in scope.

- Source: python.typing module documentation (class typing.ParamSpec -> args/kwargs)


They [parameter specifications] are only valid when used in Concatenate, or as the first argument to Callable, or as parameters for user-defined Generics.

- Source: python.typing module documentation (class typing.ParamSpec, second paragraph)


So no, you cannot use parameter specification args/kwargs without binding it to a concrete Callable in the scope you want to use them in.

I question why you would even want that. If you know that wrapper will always call known_function and you want it to (as you said) have the exact same arguments, then you just annotate it with the same arguments. Example:

def known_function(x: int, y: str) -> bool:
    return str(x) == y


def wrapper(x: int, y: str) -> bool:
    # other things...
    return known_function(x, y)

If you do want wrapper to accept additional arguments aside from those passed on to known_function, then you just include those as well:

def known_function(x: int, y: str) -> bool:
    return str(x) == y


def wrapper(a: float, x: int, y: str) -> bool:
    print(a ** 2)
    return known_function(x, y)

If your argument is that you don't want to repeat yourself because known_function has 42 distinct and complexly typed parameters, then (with all due respect) the design of known_function should be covered in copious amounts of gasoline and set ablaze.


If you insist on dynamically associating the parameter specifications (or are curious about possible workarounds for academic reasons), the following is the best thing I can think of.

You write a protected decorator that is only intended to be used on known_function. (You could even raise an exception if it is called with anything else, to protect your own sanity.) You define your wrapper inside that decorator (and add any additional arguments, if you want any). Thus, you'll be able to annotate its *args/**kwargs with the ParamSpecArgs/ParamSpecKwargs of the decorated function. In this case you probably don't want to use functools.wraps, because the function you receive out of that decorator is probably intended not to replace known_function, but to stand alongside it.

Here is a full working example:

from collections.abc import Callable
from typing import Concatenate, ParamSpec, TypeVar


P = ParamSpec("P")
T = TypeVar("T")


def known_function(x: int, y: str) -> bool:
    """Does thing XY"""
    return str(x) == y


def _decorate(f: Callable[P, T]) -> Callable[Concatenate[float, P], T]:
    if f is not known_function:  # type: ignore[comparison-overlap]
        raise RuntimeError("This is an exclusive decorator.")

    def _wrapper(a: float, /, *args: P.args, **kwargs: P.kwargs) -> T:
        """Also does thing XY, but first does something else."""
        print(a ** 2)
        return f(*args, **kwargs)
    return _wrapper


wrapper = _decorate(known_function)


if __name__ == "__main__":
    print(known_function(1, "2"))
    print(wrapper(3.14, 10, "10"))

Output as expected:

False
9.8596
True

Adding reveal_type(wrapper) to the script and running mypy gives the following:

Revealed type is "def (builtins.float, x: builtins.int, y: builtins.str) -> builtins.bool"

PyCharm also gives the relevant suggestions regarding the function signature, which it infers from having known_function passed into _decorate.

But again, just to be clear, I don't think this is good design. If your "wrapper" is not generic, but instead always calls the same function, you should explicitly annotate it, so that its parameters correspond to that function. After all:

Explicit is better than implicit.

- Zen of Python, line 2


Firstly, some notes about limitations of ParamSpec and Concatenate:

  • Concatenate requires you to always put the argument that is to be taken first in the list, and the final arg must be the ParamSpec.

  • A good way to think about it is that the wrapper "consumes" the first argument, and always expects a function that has a certain structure.

The below example type checks fine for me in pyright/pylance/mypy. I've introduced a few of the classes to check everything type checks - and the return type of "out" is User so it seems fine:

import asyncio
from typing import Any, Callable, Concatenate, Coroutine, cast
import contextlib
from uuid import UUID

@contextlib.asynccontextmanager
async def aclosing():
    try:
        yield AsyncConnection()
    finally:
        pass

class AsyncConnection[T]:
    def cursor(self):
        return aclosing()

class DbDriver:
    def connection(self):
        return aclosing()

def ensure_conn[R, **P](
    func: Callable[Concatenate[AsyncConnection[Any], P], Coroutine[Any, Any, R]],
) -> Callable[P, Coroutine[Any, Any, R]]:
    """Ensure the function has a conn argument. If conn is not provided, generate a new connection and pass it to the function."""

    async def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        # Get named keyword argument conn, or find an AsyncConnection in the args
        kwargs_conn = kwargs.get("conn")
        conn_arg: AsyncConnection[Any] | None = None
        if isinstance(kwargs_conn, AsyncConnection):
            conn_arg = kwargs_conn
        elif not conn_arg:
            for arg in args:
                if isinstance(arg, AsyncConnection):
                    conn_arg = arg
                    break
        if conn_arg:
            # If conn is provided, call the method as is
            return await func(conn_arg, *args, **kwargs)
        else:
            # If conn is not provided, generate a new connection and pass it to the method
            db_driver = DbDriver()
            async with db_driver.connection() as conn:
                return await func(conn, *args, **kwargs)

    return wrapper

class User:
    def __init__(self, uuid: UUID) -> None:
        self.uuid=uuid

@ensure_conn
async def get_user(conn: AsyncConnection, user_id: UUID):
    async with conn.cursor() as cursor:
        return User(user_id)

async def main():
    uuid: UUID = cast(UUID, '519766c5-af86-47ea-9fa9-cee0c0de66b1')
    out = await get_user(uuid)

asyncio.run(main())

Hope this helps!


There is surprisingly little about this online. I was able to find someone else's discussion of this over on python/typing's GitHub, which I distilled using your example.

The crux of this solution is Callback Protocols, which are functionally equivalent to Callable, but additionally enable us to modify the return type of __get__ (essentially removing the self parameter) as is done for standard methods.

from __future__ import annotations

from typing import Any, Callable, Concatenate, Generic, ParamSpec, Protocol, TypeVar

from requests import Request

P = ParamSpec("P")
R = TypeVar("R", covariant=True)


class Method(Protocol, Generic[P, R]):
    def __get__(self, instance: Any, owner: type | None = None) -> Callable[P, R]:
        ...

    def __call__(self_, self: Any, *args: P.args, **kwargs: P.kwargs) -> R:
        ...


def request_wrapper(f: Callable[Concatenate[Any, Request, P], R]) -> Method[P, R]:
    def inner(self, *args: P.args, **kwargs: P.kwargs) -> R:
        return f(self, Request(), *args, **kwargs)

    return inner


class Thing:
    @request_wrapper
    def takes_int_str(self, request: Request, x: int, y: str) -> int:
        print(request)
        return x + 7


thing = Thing()
thing.takes_int_str(1, "a")

Since @Creris asked about the mypy error raised from the definition of inner, which is an apparent bug in mypy with ParamSpec and callback protocols as of mypy==0.991, here is an alternate implementation with no errors:

from __future__ import annotations

from typing import Any, Callable, Concatenate, ParamSpec, TypeVar

from requests import Request

P = ParamSpec("P")
R = TypeVar("R", covariant=True)


def request_wrapper(f: Callable[Concatenate[Any, Request, P], R]) -> Callable[Concatenate[Any, P], R]:
    def inner(self: Any, *args: P.args, **kwargs: P.kwargs) -> R:
        return f(self, Request(), *args, **kwargs)

    return inner


class Thing:
    @request_wrapper
    def takes_int_str(self, request: Request, x: int, y: str) -> int:
        print(request)
        return x + 7


thing = Thing()
thing.takes_int_str(1, "a")