When talking about static type checking, it helps to understand the notion of a subtype as distinct from a subclass. (In Python, type and class are synonymous; not so in the type system implemented by tools like mypy.)
A type T is a nominal subtype of type S if we explicitly say it is. Subclassing is a form of nominal subtyping: T is a subtype of S if (but not only if) T is a subclass of S.
A type T is a structural subtype of type S if the structure of T itself is compatible with S. Protocols are Python's implementation of structural subtyping. Shape does not need to be a nominal subtype of IShape (via subclassing) in order to be a structural subtype of IShape (via having an x attribute).
So the point of defining IShape as a Protocol rather than just a superclass of Shape is to support structural subtyping and avoid the need for nominal subtyping (and all the problems that inheritance can introduce).
from typing import Protocol

class IShape(Protocol):
    x: float

# A structural subtype of IShape
# Not a nominal subtype of IShape
class Shape:
    def __init__(self):
        self.x = 3

# Not a structural subtype of IShape
class Unshapely:
    def __init__(self):
        pass

def foo(v: IShape):
    pass

foo(Shape())      # OK
foo(Unshapely())  # Not OK
So is structural subtyping a replacement for nominal subtyping? Not at all. Inheritance has its uses, but when it's your only method of subtyping, it gets used inappropriately. Once you have a distinction between structural and nominal subtyping in your type system, you can use the one that is appropriate to your actual needs.
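As a sketch of how this distinction plays out at runtime (not just under mypy): decorating a Protocol with @runtime_checkable, available since Python 3.8, lets isinstance perform the structural check itself. Reusing the IShape/Shape names from the example above:

```python
# Sketch: a @runtime_checkable Protocol makes isinstance() check for the
# presence of the protocol's members instead of the class hierarchy.
from typing import Protocol, runtime_checkable

@runtime_checkable
class IShape(Protocol):
    x: float

class Shape:
    def __init__(self):
        self.x = 3.0

class Unshapely:
    pass

print(isinstance(Shape(), IShape))      # True: the instance has an `x` attribute
print(isinstance(Unshapely(), IShape))  # False: no `x` attribute
```

Note that the check only tests for the presence of members, not their types or signatures.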
Answer from chepner on Stack Overflow: "Correct way to hint that a class is implementing a Protocol?"
Interfaces with Protocols: why not ditch ABC for good?
Hello, if one finds interfaces useful in Python (>=3.8) and is convinced that static type-checking is a must, then why not ditch ABC and always use Protocols? I understand that the fundamental idea of a protocol is slightly different from an interface, but in practice, I had great success replacing abc's with Protocols without regrets.
With abc you would write (https://docs.python.org/3/library/abc.html):
from abc import ABC, abstractmethod

class Animal(ABC):
    @abstractmethod
    def eat(self, food) -> float:
        pass

Whereas with Protocols it would be:
from typing import Protocol

class Animal(Protocol):
    def eat(self, food) -> float:
        ...

Scores in my subjective scoring system :)
| Capability | ABC | Protocols |
|---|---|---|
| Runtime checking | 1 | 1 (with a decorator) |
| Static checking with mypy | 1 | 1 |
| Explicit interface (`class Dog(Animal):`) | 1 | 1 |
| Implicit interface with duck typing (`class Dog:`) | 0.5 (kind of, with `register`, but it doesn't work with mypy yet) | 1 |
| Default method implementation (`def f(self): return 5`) | -1 (implementations shouldn't be in interfaces) | -1 (same, and mypy doesn't catch this) |
| Callback interface | 0 | 1 |
| Number of code lines | -1 (requires ABC inheritance and `abstractmethod` for every method) | 0 (optional `Protocol` inheritance) |
| Total score | 1.5 | 4 |
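The "callback interface" row deserves a quick illustration. A Protocol with a `__call__` member can type the full signature of a plain function, which an ABC cannot do without forcing the caller to wrap the function in a class. A sketch (names are illustrative):

```python
# Sketch: using a Protocol with __call__ to type a callback's signature.
# Any plain function that matches the signature satisfies it structurally.
from typing import Protocol

class FeedingCallback(Protocol):
    def __call__(self, food: str, amount: float) -> float: ...

def feed_all(callback: FeedingCallback) -> float:
    # mypy checks that `callback` accepts (str, float) and returns float
    return callback("hay", 2.5)

# A plain function matches the protocol -- no inheritance needed.
def log_feeding(food: str, amount: float) -> float:
    return amount

print(feed_all(log_feeding))  # 2.5
```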
So I do not quite see why one should ever use ABC except for legacy reasons. Other (IMHO minor) points in favour of ABC I've seen were about interactions with code editors.
Did I miss anything?
I put more detailed arguments into a Medium article. There are many tutorials on using Protocols, but not many ABC vs Protocols comparisons. I found a comparison of Protocols vs Zope, but we are not using Zope, so it's not very relevant.
New in Python 3.8:
Some of the benefits of interfaces and protocols are type hinting during the development process using tools built into IDEs and static type analysis for detection of errors before runtime. This way, a static analysis tool can tell you when you check your code if you're trying to access any members that are not defined on an object, instead of only finding out at runtime.
The typing.Protocol class was added to Python 3.8 as a mechanism for "structural subtyping." The power behind this is that it can be used as an implicit base class. That is, any class that has members that match the Protocol's defined members is considered to be a subclass of it for purposes of static type analysis.
The basic example given in PEP 544 shows how this can be used.
from typing import Iterable, Protocol

class SupportsClose(Protocol):
    def close(self) -> None:
        ...

class Resource:
    # ...
    def close(self) -> None:
        self.file.close()
        self.lock.release()

def close_all(things: Iterable[SupportsClose]) -> None:
    for thing in things:
        thing.close()

file = open('foo.txt')
resource = Resource()
close_all([file, resource])  # OK!
close_all([1])               # Error: 'int' has no 'close' method
Note: The typing-extensions package backports typing.Protocol for Python 3.5+.
In short, you probably don't need to worry about it at all. Since Python uses duck typing (see also the Wikipedia article for a broader definition), if an object has the right methods, it will simply work; otherwise, exceptions will be raised.
You could possibly have a Piece base class with some methods throwing NotImplementedError to indicate they need to be re-implemented:
Copyclass Piece(object):
def move(<args>):
raise NotImplementedError(optional_error_message)
class Queen(Piece):
def move(<args>):
# Specific implementation for the Queen's movements
# Calling Queen().move(<args>) will work as intended but
class Knight(Piece):
pass
# Knight().move() will raise a NotImplementedError
Alternatively, you could explicitly validate an object you receive to make sure it has all the right methods, or that it is a subclass of Piece, by using isinstance or issubclass.
Note that checking the type may not be considered "Pythonic" by some and using the NotImplementedError approach or the abc module - as mentioned in this very good answer - could be preferable.
Your factory just has to produce instances of objects having the right methods on them.
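As a sketch of the abc-based alternative mentioned above (names are illustrative): with @abstractmethod, an incomplete subclass fails at construction time rather than only when the missing method is eventually called.

```python
# Sketch: abc.ABC + @abstractmethod fails eagerly at instantiation,
# instead of deferring the error to the first call of the missing method.
from abc import ABC, abstractmethod

class Piece(ABC):
    @abstractmethod
    def move(self, to_square: str) -> None: ...

class Queen(Piece):
    def move(self, to_square: str) -> None:
        print(f"Queen to {to_square}")

class Knight(Piece):
    pass  # forgot to implement move()

Queen().move("d4")  # works
try:
    Knight()        # TypeError at construction time, not at call time
except TypeError as e:
    print(e)
```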