Pretty good!
Overall you have reasonable types, especially Array (though that should only be declared once). But your use of kwargs and setattr interferes with that type safety. There are type-safe ways around this: one would be to make Rational a @dataclass and have its superclass call into fields() or asdict(). Or, without dataclasses, __dict__ might work (so long as Rational stores offset, enumerator and denominator as members from its own constructor).
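As a hedged sketch of the dataclass route (the field names are taken from the review; the real Rational may well look different):

```python
from dataclasses import dataclass, asdict
from typing import List

# Hypothetical sketch: Rational as a dataclass, so a superclass can
# consume its fields type-safely via asdict() instead of **kwargs/setattr.
@dataclass
class Rational:
    offset: float
    enumerator: List[float]
    denominator: List[float]

r = Rational(offset=0.0, enumerator=[1.0, 2.0], denominator=[1.0])
print(asdict(r))  # a plain dict keyed by offset, enumerator, denominator
```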
Though it's unlikely for eval to collide with the built-in eval, it's enough to confuse some syntax highlighters etc. Probably best to rename this.
This:
powers_of_x = [x**i for i in range(len(self.enumerator))]
can be vectorised as
powers_of_x = x ** np.arange(len(self.enumerator))
The __main__ guard is not enough to exclude P, etc. from the global namespace. That code should be moved to a main function.
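A minimal sketch of what that looks like (P here stands in for whatever module-level names your script defines):

```python
# Moving the experiment code into main() keeps names like P out of
# the module's global namespace when the file is imported elsewhere.
def main():
    P = [3, 0, 1]  # hypothetical polynomial coefficients
    print(P)

if __name__ == '__main__':
    main()
```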
Don't assert in production code; raise an exception instead.
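For instance (the function name and message are illustrative, not from your code):

```python
# assert statements are stripped when Python runs with -O, so
# production invariants should raise real exceptions instead.
def require_nonempty(denominator):
    if not denominator:
        raise ValueError("denominator must not be empty")
    return denominator

require_nonempty([1.0])  # passes through unchanged
```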
I was surprised to see that Python's abstract classes don't enforce anything except the override and the method name. I can see why enforcing parameter data types would probably not work in Python, but the number of parameters and the parameter names ought to be enforced.
I've always thought the point of abstract classes was to ensure that any inheritor of the class would work with existing code that runs the abstract methods defined in the superclass. The whole point was to enforce method signatures.
It seems to me that Python's implementation of abstract classes has very little utility. Does anyone even use them? What for?
It's worse than you think. Abstract methods are tracked by name only, so you don't even have to make quack a method in order to instantiate the child class.
class SurrealDuck(Quacker):
    quack = 3

d = SurrealDuck()
print(d.quack)  # Shows 3
There is nothing in the system that enforces that quack is even a callable object, let alone one whose arguments match the abstract method's original. At best, you could subclass ABCMeta and add code yourself to compare type signatures in the child to the originals in the parent, but this would be nontrivial to implement.
(Currently, marking something as "abstract" essentially just adds the name to a frozen set attribute in the parent (Quacker.__abstractmethods__). Making a class instantiable is as simple as setting this attribute to an empty iterable, which is useful for testing.)
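A hedged sketch of that testing trick (Quacker is the example class from above):

```python
from abc import ABC, abstractmethod

class Quacker(ABC):
    @abstractmethod
    def quack(self):
        ...

# Quacker() raises TypeError while __abstractmethods__ is non-empty;
# emptying the frozen set makes the class instantiable, which can be
# handy for building test doubles.
Quacker.__abstractmethods__ = frozenset()
q = Quacker()
```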
I recommend you look at pylint. I ran this code through its static analysis, and on the line where you defined the quack() method, it reported:
Argument number differs from overridden method (arguments-differ)
(https://en.wikipedia.org/wiki/Pylint)
What you're trying to do will just work—but it's a very bad idea.
In general, you don't want to change the signature of a method in incompatible ways when overriding. That's part of the Liskov Substitution Principle.
In Python, there are often good reasons to violate that—inheritance isn't always about subtyping.
But when you're using ABCs to define an interface, that's explicitly about subtyping. That's the sole purpose of ABC subclasses and abstractmethod decorators, so using them to mean anything else is at best highly misleading.
In more detail:
By inheriting from Agent, you are declaring that any instance of Clever_Agent can be used as if it were an Agent. That includes being able to call my_clever_agent.perceive_world(my_observation). In fact, it doesn't just include that; that's the entirety of what it means! If that call will always fail, then no Clever_Agent is an Agent, so it shouldn't claim to be.
In some languages, you occasionally need to fake your way around interface checking, so you can later type-switch and/or "dynamic-cast" back to the actual type. But in Python, that's never necessary. There's no such thing as "a list of Agents", just a list of anything-at-alls. (Unless you're using optional static type checking—but in that case, if you need to get around the static type checking, don't declare a static type just to give yourself a hurdle to get around.)
In Python, you can extend a method beyond its superclass method by adding optional parameters, and that's perfectly valid, because it's still compatible with the explicitly-declared type. For example, this would be a perfectly reasonable thing to do:
class Clever_Agent(Agent):
    def perceive_world(self, observation, prediction=None):
        print('I see %s' % observation)
        if prediction is None:
            print('I have no predictions about what will happen next')
        else:
            print('I think I am going to see %s happen next' % prediction)
Or even this might be reasonable:
class Agent(ABC):
    @abstractmethod
    def perceive_world(self, observation, prediction):
        pass

class Dumb_agent(Agent):
    def perceive_world(self, observation, prediction=None):
        print('I see %s' % observation)
        if prediction is not None:
            print('I am too dumb to make a prediction, but I tried anyway')

class Clever_Agent(Agent):
    def perceive_world(self, observation, prediction):
        print('I see %s' % observation)
        print('I think I am going to see %s happen next' % prediction)
Strictly speaking, overriding an abstract method from a parent class while adding to or changing its signature is not a method override at all; what you are effectively doing is method hiding. A method override always overrides a specific existing method signature in the parent class.
You may find your way around the problem by defining a variant abstract method in your parent class and overriding it where necessary in your subclasses.
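One way that might look (a sketch only; the class and method names are illustrative, not from the question):

```python
from abc import ABC, abstractmethod

class View(ABC):
    @abstractmethod
    def get(self):
        ...

    # Variant method with a wider signature; subclasses that need
    # extra parameters override this instead of changing get() itself.
    def get_filtered(self, *criteria):
        return self.get()

class FilteredView(View):
    def get(self):
        return ["a", "b", "c"]

    def get_filtered(self, *criteria):
        return [item for item in self.get() if item in criteria]
```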
No checks are done on how many arguments concrete implementations take, so there is nothing stopping you from doing this already.
Just define those methods to take whatever parameters you need to accept:
class View(metaclass=ABCMeta):
    @abstractmethod
    def set(self):
        pass

    @abstractmethod
    def get(self):
        pass

class ConcreteView1(View):
    def set(self, param1):
        pass  # implementation

    def get(self, param1, param2):
        pass  # implementation

class ConcreteView2(View):
    def set(self):
        pass  # implementation

    def get(self, param1, param2):
        pass  # implementation
Other tools, such as linters, are a different matter. They could (rightly) claim that the above code violates the Liskov Substitution Principle, because the subclasses and the parent class take different arguments. But it is not Python itself that tells you about this.
Python 3.8:
from abc import ABC, abstractmethod

class SomeAbstractClass(ABC):
    @abstractmethod
    def get(self, *args, **kwargs):
        """
        Returns smth
        """

    @abstractmethod
    def set(self, key, value):
        """
        Sets smth
        """

class Implementation(SomeAbstractClass):
    def set(self, key, value):
        pass

    def get(self, some_var, another_one):
        pass
Works perfectly, no warnings, no problems
There is a third-party package that adds signature checking to abstract methods:
» pip install abcmeta
Giving an override a more restrictive signature violates the Liskov Substitution Principle. But that's not what you should be doing here.
Your code should actually look like this:
from abc import ABC, abstractmethod

class Manager(ABC):
    @abstractmethod
    def connect(self) -> Handle:
        raise NotImplementedError()

class DefaultManager(Manager):
    def connect(self, *, thread_safe: bool = False) -> Handle:
        if thread_safe:
            return ThreadSafeHandle()
        else:
            return DefaultHandle()
(I'm assuming you've got something appropriate to annotate the return types with.)
Manager.connect's signature should reflect the kinds of calls that all subclass connect implementations should accept. It should be the intersection of all valid subclass signatures, not the union.
The signature specified by Manager.connect reflects what Manager promises all Manager instances can do. It promises that they can accept a connect call with 0 arguments. It doesn't promise anything about what they can't do. This Manager.connect signature doesn't accept a thread_safe argument, but that doesn't mean it promises subclasses won't accept that argument.
Your first example would be almost completely fine, as everything the abstract parent accepts is also accepted by the derived class. Strictly speaking, though, the first snippet isn't quite type-correct: only an override with def foo(self, a: Any = default, b: Any = default, *args, **kwargs): would be fully correct.
To be type-correct, the type of the accepted arguments has to get broader, not narrower. So I would take issue with the concrete example, since using a DefaultManager as a Manager would imply one could pass any number of positional arguments to it, and any value for thread_safe.
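As a sketch, a fully type-correct broadened override looks like this (Base and Child are illustrative names, not from the question):

```python
from abc import ABC, abstractmethod
from typing import Any

class Base(ABC):
    @abstractmethod
    def foo(self, a, b):
        ...

# The override accepts everything (self, a, b) accepts -- and more --
# with parameter types at least as broad as the parent's.
class Child(Base):
    def foo(self, a: Any = None, b: Any = None, *args: Any, **kwargs: Any) -> None:
        pass

Child().foo(1, 2)          # the call every Base promises
Child().foo(1, 2, 3, x=4)  # extras the child additionally allows
```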
More concretely, IMO, the best practice here is this:
from abc import ABC, abstractmethod

class Manager(ABC):
    @abstractmethod
    def connect(self):
        raise NotImplementedError()

class DefaultManager(Manager):
    def connect(self, *, thread_safe: bool = False):
        if thread_safe:
            return ThreadSafeHandle()
        else:
            return DefaultHandle()
because everything Manager.connect accepts is also accepted by DefaultManager.connect.