To quote the documentation for typing.ParamSpec:
P.args [...] should only be used to annotate *args. P.kwargs [...] should only be used to annotate **kwargs.
So this is not possible with ParamSpec.
The idea of being able to use P.args/P.kwargs in other scopes is not new. But there are fundamental problems with that, which are unlikely to be solved with ParamSpec as is. In fact, PEP 612 briefly mentioned the underlying issue, explaining why ParamSpec was introduced in the first place:
The core problem here is that, by default, parameters in Python can either be called positionally or as a keyword argument. This means we really have three categories (positional-only, positional-or-keyword, keyword-only) we’re trying to jam into two categories. [...] Fundamentally, in order to capture two categories when there are some things that can be in either category, we need a higher level primitive (ParamSpec) to capture all three, and then split them out afterward.
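To make the quote concrete, the three parameter categories can all appear in a single signature (an illustrative function, not from the PEP):

```python
# The three parameter categories in one signature (illustrative example):
def example(a: int, /, b: int, *, c: int) -> int:
    # a is positional-only, b is positional-or-keyword, c is keyword-only
    return a + b + c

# b belongs to both of the "two categories" a call site offers:
print(example(1, 2, c=3))    # b passed positionally
print(example(1, b=2, c=3))  # b passed by keyword
```

Both calls are equivalent, which is exactly why a clean two-way args/kwargs split does not exist.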
A good example was given by someone in the aforementioned issue:
def foo(x: int) -> None: ...
def get_callable_and_args(x: Callable[P, Any]) -> Tuple[P.args, P.kwargs]: ...
reveal_type(get_callable_and_args(foo)) # <- What should this reveal?
Should we reveal a tuple of int + an empty dict? Or should we reveal an empty tuple and a singleton dict that maps x to int? [...] the existence of "positional-or-keyword" arguments in Python makes things highly ambiguous. [...] To fully resolve this kind of issue, there needs to be a way in the type system to represent positional-only, positional-or-keyword, and keyword-only parameters separately. But at that point, it's going to be a different language feature, not ParamSpec anymore.
So I would not hold my breath for ParamSpec to support this any time soon.
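The ambiguity can also be observed at runtime via inspect (a minimal sketch; foo is the same example function as above):

```python
import inspect

def foo(x: int) -> None: ...

# Both call styles are valid, so there is no single canonical split of
# foo's parameters into P.args and P.kwargs:
#   foo(1)   would correspond to args=(1,), kwargs={}
#   foo(x=1) would correspond to args=(),   kwargs={"x": 1}

# inspect confirms that x belongs to both categories at once:
kind = inspect.signature(foo).parameters["x"].kind
print(kind is inspect.Parameter.POSITIONAL_OR_KEYWORD)  # True
```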
If we were talking only about positional arguments, what you want would be possible with TypeVarTuple. PEP 646 specifically suggests they can be used as Callable arguments.
Something like this should then be possible:
from collections.abc import Callable
from typing import Any, TypeVarTuple
Ts = TypeVarTuple("Ts")
def test_function(func: Callable[[*Ts], Any], args: list[tuple[*Ts]]) -> None:
    for a in args:
        func(*a)
I say "should" because it is still hard to verify, since mypy does not fully support PEP 646 yet.
But even so, there is no support for keyword arguments. Basically, as soon as a combination of the three different parameter categories comes up, things get tricky fast.
Answer from Daniel Fainberg on Stack Overflow
While you can't reshape ParamSpec like that, with some API juggling you can achieve almost the same effect with minimal boilerplate code whilst retaining static typing. The key is to bind P = ParamSpec("P") to a class, then use that class's methods (which are bound to P) to do the type-checking for you.
First, a demonstration of what the final API could look like. I'm using mypy to generate the errors, but this should be compliant with other type-checkers.
def f1(a: int, b: float, *, c: int) -> float:
    return a + b + c
# Version 1 - sequence of procedural calls
>>> tester = FunctionTester(f1)
>>> tester.add_args(1, 2, c=-1)
>>> tester.add_args(3, 4, c=-2)
>>>
>>> # mypy: Argument 1 to "add_args" of "FunctionTester" has incompatible type "str"; expected "int" [arg-type]
>>> # mypy: Argument "b" to "add_args" of "FunctionTester" has incompatible type "List[<nothing>]"; expected "float" [arg-type]
>>> # mypy: Unexpected keyword argument "d" for "add_args" of "FunctionTester" [call-arg]
>>> # tester.add_args("", b=[], d=object())
>>>
>>> tester.test()
f1(1, 2, c=-1) executed in: 0:00:00.000002
f1(3, 4, c=-2) executed in: 0:00:00.000001
# Version 2 - minimalist API using a context manager which automatically calls the test upon exit
>>> with FunctionTester(f1) as test:
... test(1, 2, c=-1)
... test(3, 4, c=-2)
...
... # mypy: Argument 1 has incompatible type "str"; expected "int" [arg-type]
... # mypy: Argument "b" has incompatible type "List[<nothing>]"; expected "float" [arg-type]
... # mypy: Unexpected keyword argument "d" [call-arg]
... # test("", b=[], d=object())
f1(1, 2, c=-1) executed in: 0:00:00.000001
f1(3, 4, c=-2) executed in: 0:00:00.000001
Now, the implementation:
from types import TracebackType
from typing import (
    Any,
    Callable,
    Generic,
    ParamSpec,
)
from typing_extensions import Self
from datetime import timedelta
from itertools import chain
from time import monotonic

P = ParamSpec("P")


class FunctionTester(Generic[P]):
    _func: Callable[P, Any]
    _arg_groups: list[tuple[tuple[Any, ...], dict[str, Any]]]

    def __init__(self, function_: Callable[P, Any]) -> None:
        self._func = function_
        self._arg_groups = []

    def add_args(self, *args: P.args, **kwargs: P.kwargs) -> None:
        self._arg_groups.append((args, kwargs))

    def test(self) -> None:
        func: Callable[P, Any] = self._func
        for a, kw in self._arg_groups:
            args_str = ", ".join(
                chain(
                    (str(i) for i in a),
                    (f"{k}={v}" for k, v in kw.items()),
                )
            )
            start = monotonic()
            func(*a, **kw)
            print(
                f"{func.__name__}({args_str}) "
                f"executed in: {timedelta(seconds=monotonic() - start)}"
            )

    # The following additional methods allow the API in Version 2

    __call__ = add_args

    def __enter__(self) -> Self:
        return self

    def __exit__(
        self,
        exc_type: type[BaseException] | None,
        exc_val: BaseException | None,
        exc_tb: TracebackType | None,
    ) -> None:
        self.test()
If the parameters match exactly, you can use this 1:1. If there is a mismatch, you would also need to use Concatenate and modify it a bit; I have yet to find a generic solution for that case. This is a modified version of the above answer from @Daniil Fajnberg.
from typing import TypeVar, ParamSpec, Callable

P = ParamSpec("P")
T = TypeVar("T")
S = TypeVar("S")


def paramspec_from(_: Callable[P, T]) -> Callable[[Callable[P, S]], Callable[P, S]]:
    def _fnc(fnc: Callable[P, S]) -> Callable[P, S]:
        return fnc
    return _fnc
then you can use it like this:
@paramspec_from(known_function)
def wrapper(*args, **kwargs) -> str:
    foo()  # do other things...
    known_function(*args, **kwargs)
    return "whatever"
The problem is that mypy only looks at the function header to resolve the ParamSpec, not the function body, so it cannot deduce the types of args and kwargs. But if we lift the function into the header, for example by passing it into the decorator as an argument, mypy can deduce the parameters correctly. As you can see, we don't even call the function inside the decorator (we marked its parameter with _); we only use it for its type hints.
PEP 612 as well as the documentation of ParamSpec.args and ParamSpec.kwargs are pretty clear on this:
These “properties” can only be used as the annotated types for *args and **kwargs, accessed from a ParamSpec already in scope.
- Source: PEP 612 ("The components of a ParamSpec" -> "Valid use locations")
Both attributes require the annotated parameter to be in scope.
- Source: python.typing module documentation (class typing.ParamSpec -> args/kwargs)
They [parameter specifications] are only valid when used in Concatenate, or as the first argument to Callable, or as parameters for user-defined Generics.
- Source: python.typing module documentation (class typing.ParamSpec, second paragraph)
So no, you cannot use parameter specification args/kwargs without binding it to a concrete Callable in the scope you want to use them in.
I question why you would even want that. If you know that wrapper will always call known_function and you want it to (as you said) have the exact same arguments, then you just annotate it with the same arguments. Example:
def known_function(x: int, y: str) -> bool:
    return str(x) == y

def wrapper(x: int, y: str) -> bool:
    # other things...
    return known_function(x, y)
If you do want wrapper to accept additional arguments aside from those passed on to known_function, then you just include those as well:
def known_function(x: int, y: str) -> bool:
    return str(x) == y

def wrapper(a: float, x: int, y: str) -> bool:
    print(a ** 2)
    return known_function(x, y)
If your argument is that you don't want to repeat yourself because known_function has 42 distinct and complexly typed parameters, then (with all due respect) the design of known_function should be covered in copious amounts of gasoline and set ablaze.
If you insist to dynamically associate the parameter specifications (or are curious about possible workarounds for academic reasons), the following is the best thing I can think of.
You write a protected decorator that is only intended to be used on known_function. (You could even raise an exception, if it is called with anything else to protect your own sanity.) You define your wrapper inside that decorator (and add any additional arguments, if you want any). Thus, you'll be able to annotate its *args/**kwargs with the ParamSpecArgs/ParamSpecKwargs of the decorated function. In this case you probably don't want to use functools.wraps because the function you receive out of that decorator is probably intended not to replace known_function, but stand alongside it.
Here is a full working example:
from collections.abc import Callable
from typing import Concatenate, ParamSpec, TypeVar

P = ParamSpec("P")
T = TypeVar("T")


def known_function(x: int, y: str) -> bool:
    """Does thing XY"""
    return str(x) == y


def _decorate(f: Callable[P, T]) -> Callable[Concatenate[float, P], T]:
    if f is not known_function:  # type: ignore[comparison-overlap]
        raise RuntimeError("This is an exclusive decorator.")

    def _wrapper(a: float, /, *args: P.args, **kwargs: P.kwargs) -> T:
        """Also does thing XY, but first does something else."""
        print(a ** 2)
        return f(*args, **kwargs)
    return _wrapper


wrapper = _decorate(known_function)


if __name__ == "__main__":
    print(known_function(1, "2"))
    print(wrapper(3.14, 10, "10"))
Output as expected:
False
9.8596
True
Adding reveal_type(wrapper) to the script and running mypy gives the following:
Revealed type is "def (builtins.float, x: builtins.int, y: builtins.str) -> builtins.bool"
PyCharm also gives the relevant suggestions regarding the function signature, which it infers from having known_function passed into _decorate.
But again, just to be clear, I don't think this is good design. If your "wrapper" is not generic, but instead always calls the same function, you should explicitly annotate it, so that its parameters correspond to that function. After all:
Explicit is better than implicit.
- Zen of Python, line 2