No, model_dump cannot return an exact type.
However, you don't have to use model_dump. Consider providing all the values explicitly instead. This way you won't be able to forget a required field.
Either you get an error that UserCreate has no attribute defined
user = UserCreate(name="John")
# error: "UserCreate" has no attribute "age" [attr-defined]
User(name=user.name, age=user.age)
or that you haven't provided a required attribute
user = UserCreate(name="John")
# error: Missing named argument "age" for "User"
User(name=user.name)
Answer from Paweł Rubin on Stack Overflow
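For context, here is a minimal pair of models (an assumption on my part; the answer above does not show their definitions) under which both errors are reported by a type checker:

```python
from pydantic import BaseModel

class UserCreate(BaseModel):
    # the input model: deliberately missing the "age" field
    name: str

class User(BaseModel):
    # the target model: "age" is required here
    name: str
    age: int
```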
While you can't exactly type-hint the return dict of model_dump, you can kinda "fake" it: define a TypedDict with the same type annotations as your pydantic model and override model_dump inside an if TYPE_CHECKING block.
from typing import TypedDict, TYPE_CHECKING
from pydantic import BaseModel

class UserCreateDict(TypedDict):
    name: str

class UserCreate(BaseModel):
    name: str

    if TYPE_CHECKING:
        # Narrows the return type for type checkers only; at runtime the
        # inherited model_dump implementation is used unchanged.
        def model_dump(self, **kwargs) -> UserCreateDict:
            ...

class User(BaseModel):  # assumed target model with a required extra field
    name: str
    age: int

user = UserCreate(name="John")
User(**user.model_dump())  # type checker reports that "age" is missing
A type checker such as Pylance can then report that age is missing.
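One caveat with this trick: the TypedDict is maintained by hand, so it can silently drift out of sync with the model. A small runtime assertion (a sketch using pydantic v2's model_fields; the check itself is my addition, not part of the original answer) can guard against that:

```python
from typing import TypedDict

from pydantic import BaseModel

class UserCreateDict(TypedDict):
    name: str

class UserCreate(BaseModel):
    name: str

# Fails loudly (e.g. in a unit test) if the TypedDict and the model
# no longer declare the same field names.
assert set(UserCreateDict.__annotations__) == set(UserCreate.model_fields)
```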

Hi folks, I have been pondering this question for some time but haven't been able to resolve it. I searched the internet but didn't find any article or video that helped; most of them cover the syntax and semantics of pydantic, and none addressed what I wanted to know.
I'm asking because, at first glance, pydantic looks helpful: it lets us know exactly what data is flowing through the application and helps us validate that data. But it also takes away some of the simplicity and flexibility that Python gives us. For example, you might assign some intermediary key in a dictionary, pass that dictionary to a function, pop the value out, and use it.
Using libraries like pydantic makes these things take a little more effort, and it adds the overhead of maintaining the pydantic models. In some scenarios, bugs may come up from incorrect use of these kinds of libraries.
To be clear, the purpose of my question is not to question the usability of pydantic. I know it is definitely useful, since it is widely used. I want to understand how to use it well, and where not to use it.
I'm curious about functionality of pydantic. Let's say I have the following class:
from pydantic import BaseModel, Field

class SampleModel(BaseModel):
    positive: int = Field(gt=0, examples=[6])
    non_negative: int = Field(ge=0, examples=[5])
Is there a way to generate a .json or dict object that looks like {'positive': 6, 'non_negative': 5}?
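One possible approach (a sketch, not an official pydantic feature; `example_payload` is a helper name I made up): in pydantic v2, the examples passed to Field are stored on each field's FieldInfo, so you can collect the first declared example of every field:

```python
from pydantic import BaseModel, Field

class SampleModel(BaseModel):
    positive: int = Field(gt=0, examples=[6])
    non_negative: int = Field(ge=0, examples=[5])

def example_payload(model_cls: type[BaseModel]) -> dict:
    # Take the first declared example for every field that has one.
    return {
        name: info.examples[0]
        for name, info in model_cls.model_fields.items()
        if info.examples
    }

print(example_payload(SampleModel))  # {'positive': 6, 'non_negative': 5}
```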
Repo Link: https://github.com/psalvaggio/dynapydantic
What My Project Does
TLDR: It's like `SerializeAsAny`, but for both serialization and validation.
Target Audience
Pydantic users. It is most useful for models that include inheritance trees.
Comparison
I have not seen anything else like it; the project was motivated by this GitHub issue: https://github.com/pydantic/pydantic/issues/11595
I've been working on an extension module for `pydantic` that I think people might find useful. I'll copy/paste my "Motivation" section here:
Consider the following simple class setup:
import pydantic

class Base(pydantic.BaseModel):
    pass

class A(Base):
    field: int

class B(Base):
    field: str

class Model(pydantic.BaseModel):
    val: Base

As expected, we can use A's and B's for Model.val:
>>> m = Model(val=A(field=1))
>>> m
Model(val=A(field=1))
However, we quickly run into trouble when serializing and validating:
>>> m.model_dump()
{'val': {}}
>>> m.model_dump(serialize_as_any=True)
{'val': {'field': 1}}
>>> Model.model_validate(m.model_dump(serialize_as_any=True))
Model(val=Base())

Pydantic provides a solution for serialization via serialize_as_any (and its corresponding field annotation SerializeAsAny), but offers no native solution for the validation half. Currently, the canonical way of doing this is to annotate the field as a discriminated union of all subclasses, where a single field in the model is chosen as the "discriminator". This library, dynapydantic, automates this process.
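For comparison, here is the manual discriminated-union pattern that dynapydantic automates (a sketch with plain pydantic v2; the Literal discriminator fields and the union are written out by hand):

```python
from typing import Annotated, Literal, Union

import pydantic

class A(pydantic.BaseModel):
    name: Literal["A"] = "A"  # hand-written discriminator value
    field: int

class B(pydantic.BaseModel):
    name: Literal["B"] = "B"
    field: str

class Model(pydantic.BaseModel):
    # Every subclass must be listed here manually; dynapydantic's
    # Polymorphic[Base] builds this union for you.
    val: Annotated[Union[A, B], pydantic.Field(discriminator="name")]

m = Model(val=A(field=1))
assert Model.model_validate(m.model_dump()) == m  # round-trips correctly
```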
Let's reframe the above problem with dynapydantic:
import dynapydantic
import pydantic

class Base(
    dynapydantic.SubclassTrackingModel,
    discriminator_field="name",
    discriminator_value_generator=lambda t: t.__name__,
):
    pass

class A(Base):
    field: int

class B(Base):
    field: str

class Model(pydantic.BaseModel):
    val: dynapydantic.Polymorphic[Base]

Now, the same set of operations works as intended:
>>> m = Model(val=A(field=1))
>>> m
Model(val=A(field=1, name='A'))
>>> m.model_dump()
{'val': {'field': 1, 'name': 'A'}}
>>> Model.model_validate(m.model_dump())
Model(val=A(field=1, name='A'))