The SomeClass class has a custom metaclass. You will need to create a metaclass which inherits from both ABCMeta and this custom metaclass, then use it as the metaclass for MyClass. Without knowing more about this custom metaclass, I cannot determine a correct way to do this in the general case, but it will probably look like one of these possibilities:
class DerivedMeta(ABCMeta, type(SomeClass)):
    pass

class DerivedMeta(type(SomeClass), ABCMeta):
    pass
It's unlikely but possible you will also need to override one or more methods to ensure correct metaclass interactions.
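To make this concrete, here is a minimal sketch of the first variant, using a made-up CustomMeta as a stand-in for whatever metaclass SomeClass actually has:

```python
from abc import ABCMeta, abstractmethod

# Hypothetical stand-ins for the custom metaclass and the class using it.
class CustomMeta(type):
    pass

class SomeClass(metaclass=CustomMeta):
    pass

# The combined metaclass resolves the conflict between ABCMeta and CustomMeta.
class DerivedMeta(ABCMeta, type(SomeClass)):
    pass

class MyClass(SomeClass, metaclass=DerivedMeta):
    @abstractmethod
    def run(self):
        ...

class Concrete(MyClass):
    def run(self):
        return "ok"
```

With this arrangement, instantiating MyClass raises TypeError because of the abstract method, while Concrete instantiates normally.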
— Answer from Kevin on Stack Overflow
This thread is still at the top of the search results, so I wanted to share my complete solution.
I ran into this problem when trying to create an abstract template class meant for PyQt5 widgets, in Python 3.8. I applied @Kevin's solution, creating a new metaclass first. The working code:
from abc import ABC, ABCMeta
from PyQt5.QtWidgets import QWidget, QLabel

class QABCMeta(ABCMeta, type(QWidget)):
    """Metaclass that combines ABC and the Qt metaclass"""
    pass

class TcWidget(ABC, metaclass=QABCMeta):
    """Abstract class, to be multi-inherited together with a Qt item"""
    pass

class TcLabel(QLabel, TcWidget):
    """Label that shows a value"""
    pass

# ...
label = TcLabel()
# ...
Using Python's multiple inheritance (MI) for interfaces, abstract base classes, mixins, or similar techniques is perfectly fine. In most cases, the MRO produces intuitive results.
However, object initialization under multiple inheritance is really tricky. In Python you cannot combine multiple classes via MI unless all participating classes have been designed for it. The issue is that an __init__() method cannot know from which class it will be called or what the signature of the super().__init__() method will be. Effectively, this means that MI-safe constructors:
- must call super().__init__()
- must only take arguments by name, not by position
- must forward any **kwargs to super().__init__()
- must not warn on unexpected arguments
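The rules above can be sketched as cooperative constructors; the class names here are made up for illustration:

```python
class Positioned:
    def __init__(self, *, x=0, y=0, **kwargs):
        # Keyword-only arguments; forward everything else up the MRO.
        self.x = x
        self.y = y
        super().__init__(**kwargs)

class Named:
    def __init__(self, *, name="", **kwargs):
        self.name = name
        super().__init__(**kwargs)

class Sprite(Positioned, Named):
    """MRO: Sprite -> Positioned -> Named -> object.

    Each __init__ consumes its own keywords and forwards the rest,
    so every class in the chain gets initialized exactly once.
    """

s = Sprite(x=3, name="hero")
```

Because each constructor takes only keyword arguments and forwards **kwargs, the same classes can be recombined in any order without breaking initialization.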
Where possible, the better alternative is to avoid __init__() methods for interface-like classes, and instead express requirements through abstract methods. For example, instead of a BananaContainer class, we might write this interface/ABC:
import abc  # abstract base class

class BananaContainer(abc.ABC):
    @property
    @abc.abstractmethod
    def bananas(self) -> list:
        raise NotImplementedError
If a class wants to be a BananaContainer, it would have to implement that property.
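For instance, a conforming class might look like this (a sketch; Basket is a made-up name, and the interface from above is repeated for completeness):

```python
import abc

class BananaContainer(abc.ABC):
    @property
    @abc.abstractmethod
    def bananas(self) -> list:
        raise NotImplementedError

class Basket(BananaContainer):
    """Satisfies the interface by implementing the abstract property."""

    def __init__(self):
        self._bananas = []

    @property
    def bananas(self) -> list:
        return self._bananas

b = Basket()
b.bananas.append("cavendish")
isinstance(b, BananaContainer)  # True
```

Note that no __init__() coordination is needed here: the ABC only demands the property, so Basket's constructor is entirely its own.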
In general, it is perfectly alright if you have a class that inherits from multiple interfaces or mixins. Aside from name collisions, the above __init__() problems, and general API bloat of the class, no noteworthy issues arise.
The second part of your question proposes a capability-based approach instead of using inheritance. Using composition instead of inheritance is often a very good idea. For example, you eliminate the initialization problems by design. It also tends to lead to more explicit APIs that are easier to navigate and avoid name clashes. There should be some method that either returns an object representing a capability, or None if the capability isn't supported.
But these capabilities can be implemented in different ways: either by using normal composition, or by storing the capabilities in your own data structures.
Unless you have special needs for the object model, stick to the language. Store methods in normal object fields, provide normal methods to access them. This leads to a more comfortable API, and is more likely to support auto-completer and type-checkers.
If you need to modify the available capabilities of an object at run-time, and need to introduce new kinds of capabilities at run-time, then using a dictionary may be appropriate. But at this point you are inventing your own object system. This may be a good idea e.g. in games that have complex capability systems where new capabilities shall be defined in configuration files.
Most software does not have these requirements, and does not benefit from that kind of flexibility.
Additionally, Python's built-in object system is flexible enough that you can create new types and new methods without having to create a new object system. Builtins like getattr(), setattr(), hasattr(), and the type() constructor come in handy here.
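A quick sketch of what those builtins allow (the Point class and its attributes are made up):

```python
# Create a new class at runtime with the three-argument type() constructor:
# type(name, bases, namespace).
Point = type("Point", (), {"x": 0, "y": 0})

p = Point()
setattr(p, "x", 5)           # equivalent to p.x = 5
assert getattr(p, "x") == 5  # equivalent to p.x
assert hasattr(p, "y")

# Methods can be attached to the class after the fact.
Point.magnitude = lambda self: (self.x ** 2 + self.y ** 2) ** 0.5
```

This is usually as much dynamism as an application needs before a hand-rolled capability dictionary starts paying for itself.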
I would likely express an object that can have both AppleContainer and BananaContainer capabilities like this:
from typing import Optional

class BananaContainer:
    ...

class AppleContainer:
    ...

class HasCapabilities:
    def __init__(self, x, y, z):
        # somehow determine the appropriate capabilities and initialize them
        self._banana_container = BananaContainer(y) if x else None
        self._apple_container = AppleContainer(y)

    @property
    def as_banana_container(self) -> Optional[BananaContainer]:
        return self._banana_container

    @property
    def as_apple_container(self) -> Optional[AppleContainer]:
        return self._apple_container
o = HasCapabilities(...)
bc = o.as_banana_container
if bc is not None:
    bc.do_banana_things()
Or with Python 3.8 assignment expressions:
if (bc := o.as_banana_container) is not None:
    bc.do_banana_things()
If you want to have some custom mechanisms for reflection over capabilities, you can implement that on top of this solution, with some amount of boilerplate. If we want to be MI-safe, we might declare the following base class that all capability-having classes need to inherit:
class CapabilityReflection:
    # a base implementation so that actual implementations
    # can safely call super()._list_capabilities()
    def _list_capabilities(self):
        return ()

    def all_capabilities(self):
        """Deduplicated set of capabilities that this object supports."""
        return set(self._list_capabilities())

    def get_capability(self, captype):
        """Find a capability by its type. Returns None if not supported."""
        return None
which in the above case would have been implemented as:
class HasCapabilities(CapabilityReflection):
    ...

    def _list_capabilities(self):
        caps = [  # go through properties in case they have been overridden
            self.as_banana_container,
            self.as_apple_container,
        ]
        yield from (cap for cap in caps if cap is not None)
        yield from super()._list_capabilities()

    def get_capability(self, captype):
        if captype == BananaContainer:
            return self.as_banana_container
        if captype == AppleContainer:
            return self.as_apple_container
        return super().get_capability(captype)
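Put together and trimmed down to a single capability, the whole pattern runs like this (a self-contained sketch; the do_banana_things() method and the with_banana flag are made up):

```python
class BananaContainer:
    def do_banana_things(self):
        return "banana"

class CapabilityReflection:
    def _list_capabilities(self):
        return ()

    def all_capabilities(self):
        return set(self._list_capabilities())

    def get_capability(self, captype):
        return None

class HasCapabilities(CapabilityReflection):
    def __init__(self, with_banana=True):
        self._banana = BananaContainer() if with_banana else None

    def _list_capabilities(self):
        if self._banana is not None:
            yield self._banana
        yield from super()._list_capabilities()

    def get_capability(self, captype):
        if captype is BananaContainer:
            return self._banana
        return super().get_capability(captype)

obj = HasCapabilities()
cap = obj.get_capability(BananaContainer)
if cap is not None:
    cap.do_banana_things()
```

The super() calls keep the pattern MI-safe: another capability-bearing base class anywhere in the MRO will have its capabilities picked up automatically.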
In order to ensure a class has some properties, I make base "interface" classes
While this is a common design pattern in statically typed languages, Python programmers consider it more idiomatic to use duck typing. Since the language is dynamically typed, if you have Foo and Bar classes that can both contain bananas, you are free to call unknown.banana on a variable that may be either. If unknown can be an object that doesn't implement banana, you can also use getattr or try/except AttributeError blocks. The explicit interface is just bloat over features the language already supports.
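The duck-typing approach can be sketched like this (Foo, Bar, Baz, and the banana attribute are illustrative names):

```python
class Foo:
    banana = "yellow"

class Bar:
    banana = "green"

class Baz:
    pass  # no banana attribute

def describe(unknown):
    # Duck typing with a fallback: just ask for the attribute.
    return getattr(unknown, "banana", "no banana")

def describe_eafp(unknown):
    # Equivalent "easier to ask forgiveness than permission" style.
    try:
        return unknown.banana
    except AttributeError:
        return "no banana"
```

Both functions accept any object whatsoever; no common base class is required.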
If for any reason you don't want to get rid of these interfaces, then you could at least use multiple inheritance. It exists because it has uses and is correct to use in many cases.
Can it hit us back later with problems like method resolution order or name collisions?
In a multiple inheritance declaration, the first base class has priority when symbols collide. In some cases that's a feature, but you have to be careful this doesn't cause unintended overrides, just as you would when defining methods and properties in a child class.
The capability suggestion is overly defensive about inheritance mechanisms. Making sure you don't accidentally override something is your responsibility, but it shouldn't be a huge burden; if it is, you likely have other problems. In cases where you are not sure of the symbols contained in a class and want to use it as a black box, it may be appropriate to favor composition.
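The priority rule can be seen directly in the MRO (class names are made up):

```python
class A:
    def who(self):
        return "A"

class B:
    def who(self):
        return "B"

# A is listed first, so A.who wins the name collision.
class C(A, B):
    pass

C().who()   # "A"
C.__mro__   # (C, A, B, object)
```

Swapping the base order to `class C(B, A)` would make B.who win instead; inspecting `__mro__` is the quickest way to check which definition a lookup will find.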
There's no technical need to re-inherit ABC, but some linters may complain, and people may argue that a class with abstract methods should explicitly inherit from ABC to make it clear that the class is intentionally still abstract, not a mistake. This especially applies to cases like the following, where you don't explicitly declare any obvious abstract methods:
class C(Base):
    def m(self):
        ...
Did you just miss implementing foo here, or did you intend to keep C an abstract class? Make it explicit by inheriting from ABC.
WithAbstract inherits from Base, which already inherits from abc.ABC, so you don't have to inherit from abc.ABC again.
Unless all of a sudden Base ceases to inherit from abc.ABC and your code breaks.
I don't know about pythonic but I would tend to avoid multiple inheritance. True, it's not as problematic as in other languages like C++ but simple is better than complex.
If all the descendants of Base have to use @abc.abstractmethod decorator, then it's better to make it available from Base to avoid unnecessary copy/paste when creating a new child class.
Unless you use abc.ABCMeta as the metaclass for your class (either explicitly or by inheriting from abc.ABC), using abstractmethod doesn't really do anything.
>>> from abc import abstractmethod, ABC
>>> class Foo:
...     @abstractmethod
...     def bar(self):
...         pass
...
>>> f = Foo()
>>>
Likewise, using ABCMeta doesn't mean much unless you mark at least one method as abstract:
>>> class Bar(ABC):
...     pass
...
>>> b = Bar()
>>>
It's the combination of the two that allows a class to be (nominally) uninstantiable:
>>> class Baz(ABC):
...     @abstractmethod
...     def m(self):
...         pass
...
>>> b = Baz()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Baz with abstract methods m
>>>
(Even then, note that all @abstractmethod does is add the decorated method to a set which the metaclass machinery consults when trying to instantiate the class. It is trivial to defeat that machinery:
>>> Baz.__abstractmethods__
frozenset({'m'})
>>> Baz.__abstractmethods__ = set()
>>> b = Baz()
>>>
)
Note that ABC itself is a trivial class that uses ABCMeta as its metaclass, which makes any of its descendants use it as well.
# Docstring omitted; see
# https://github.com/python/cpython/blob/3.7/Lib/abc.py#L166
# for the original
class ABC(metaclass=ABCMeta):
    __slots__ = ()
What chepner said, and also readability. Inheriting from ABC makes it clear to your readers what you're up to.
>>> from abc import ABC, abstractmethod
>>>
>>> class Foo:
...     @abstractmethod
...     def f(self):
...         pass
...
>>> class Bar(Foo):
...     pass
...
>>> Bar().f()
>>>
>>> class Baz(ABC):
...     @abstractmethod
...     def f(self):
...         pass
...
>>> class Quux(Baz):
...     pass
...
>>> Quux().f()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: Can't instantiate abstract class Quux with abstract methods f