This has been discussed some time ago and Samuel Colvin said he didn't want to pursue this as a feature for Pydantic.

If you are fine with code generation instead of actual runtime creation of models, you can use the datamodel-code-generator.

To be honest, I struggle to see the use case for generating complex models at runtime: their main purpose is validation, which implies you settle on a correct schema before running your program. But that is just my view.

For simple models I guess you can throw together your own logic for this fairly quickly.
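For instance, a minimal runtime sketch using Pydantic's create_model (the field names here are illustrative, not from the question):

```python
from pydantic import create_model

# Build a simple model at runtime from a hand-written field spec.
fields = {
    "name": (str, ...),  # required field
    "age": (int, 0),     # optional field with a default
}
User = create_model("User", **fields)

user = User(name="Alice")
```

This works for flat schemas; nesting and constraints are where the hand-rolled approach starts to hurt.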

If you do need something more sophisticated, the aforementioned library does offer some extensibility. You should be able to import and inherit from some of its classes, such as the JsonSchemaParser. Maybe that will get you somewhere.

Ultimately I think this becomes non-trivial very quickly, which is why Pydantic's maintainer didn't want to deal with it and why there is a whole separate project for this.

Answer from Daniel Fainberg on Stack Overflow
🌐
Pydantic
docs.pydantic.dev › latest › concepts › json_schema
JSON Schema - Pydantic Validation
The json_schema_extra option can be used to add extra information to the JSON schema, either at the Field level or at the Model level. You can pass a dict or a Callable to json_schema_extra.
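As a quick illustration of the json_schema_extra option described in that snippet (Pydantic v2 API):

```python
from pydantic import BaseModel, Field

class Item(BaseModel):
    # Field level: extra keys are merged into this property's schema
    name: str = Field(json_schema_extra={"examples": ["widget"]})

    # Model level: a dict (or a callable) merged into the model's schema
    model_config = {"json_schema_extra": {"examples": [{"name": "widget"}]}}

schema = Item.model_json_schema()
```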
Top answer
1 of 5
11


2 of 5
7

Updated @Alon's answer to handle nested models:

from typing import Any, Type, Optional
from enum import Enum

from pydantic import BaseModel, Field, create_model


def json_schema_to_base_model(schema: dict[str, Any]) -> Type[BaseModel]:
    type_mapping: dict[str, type] = {
        "string": str,
        "integer": int,
        "number": float,
        "boolean": bool,
        "array": list,
        "object": dict,
    }

    properties = schema.get("properties", {})
    required_fields = schema.get("required", [])
    model_fields = {}

    def process_field(field_name: str, field_props: dict[str, Any]) -> tuple:
        """Recursively processes a field and returns its type and Field instance."""
        json_type = field_props.get("type", "string")
        enum_values = field_props.get("enum")

        # Handle Enums
        if enum_values:
            enum_name: str = f"{field_name.capitalize()}Enum"
            field_type = Enum(enum_name, {v: v for v in enum_values})
        # Handle Nested Objects
        elif json_type == "object" and "properties" in field_props:
            field_type = json_schema_to_base_model(
                field_props
            )  # Recursively create submodel
        # Handle Arrays with Nested Objects
        elif json_type == "array" and "items" in field_props:
            item_props = field_props["items"]
            if item_props.get("type") == "object":
                item_type: type[BaseModel] = json_schema_to_base_model(item_props)
            else:
                item_type: type = type_mapping.get(item_props.get("type"), Any)
            field_type = list[item_type]
        else:
            field_type = type_mapping.get(json_type, Any)

        # Handle default values and optionality
        default_value = field_props.get("default", ...)
        nullable = field_props.get("nullable", False)
        description = field_props.get("title", "")

        if nullable:
            field_type = Optional[field_type]

        if field_name not in required_fields:
            default_value = field_props.get("default", None)

        return field_type, Field(default_value, description=description)

    # Process each field
    for field_name, field_props in properties.items():
        model_fields[field_name] = process_field(field_name, field_props)

    return create_model(schema.get("title", "DynamicModel"), **model_fields)

Example Schema

schema = {
    "title": "User",
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
        "is_active": {"type": "boolean"},
        "address": {
            "type": "object",
            "properties": {
                "street": {"type": "string"},
                "city": {"type": "string"},
                "zipcode": {"type": "integer"},
            },
        },
        "roles": {
            "type": "array",
            "items": {
                "type": "string",
                "enum": ["admin", "user", "guest"]
            }
        }
    },
    "required": ["name", "age"]
}

Generate the Pydantic model

DynamicModel = json_schema_to_base_model(schema)

Example usage

print(DynamicModel.schema_json(indent=2))  # Pydantic v1 API; in v2 use model_json_schema()
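Note that .schema_json() is the Pydantic v1 spelling (still present in v2, but deprecated). A self-contained v2 equivalent, using a hand-written model for illustration:

```python
import json
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

# Pydantic v2 replacements for the v1 .schema_json() / .parse_obj() calls
print(json.dumps(User.model_json_schema(), indent=2))
user = User.model_validate({"name": "Alice", "age": 30})
```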
Discussions

Take 2: create model from JSON schema aka model_load_json_schema()
In short, there is a method for ... from JSON to a Model. The use case is simple: I want to give my users the capability to store arbitrary artifacts. An artifact can be anything, any object: a car, a travel trip, a database cluster, a toilet, you name it. But it has to conform to a predefined schema for that object. So an admin user would say, fine, we create the model schema (using the pydantic ... More on github.com
🌐 github.com
5
11
New Package: Jambo — Convert JSON Schema to Pydantic Models Automatically
lol, you forgot to remove the intro More on reddit.com
🌐 r/Python
29
76
April 10, 2025
How to define pydantic/JSON schema
Hello, I am working on a text translation project. I extract text from pdf, pass it to gpt-4o-mini and instruct it to translate the text to a target language. The text is in the following format: {‘0’:‘text1’, ‘1’:‘text2’,‘2’:‘text3’, …} I require the output in the ... More on community.openai.com
🌐 community.openai.com
1
1
October 22, 2024
python - Pydantic model for JSON Meta Schema - Stack Overflow
I have a use case where user needs to define some JSON Schema for later usage. Right know I am using Pydantic parse user configs and check if they are ok. Does any one know if there exist an library with Pydantic Model for JSON Meta Schema? More on stackoverflow.com
🌐 stackoverflow.com
🌐
PyPI
pypi.org › project › json-schema-to-pydantic
json-schema-to-pydantic · PyPI
3 weeks ago - A Python library for automatically generating Pydantic v2 models from JSON Schema definitions
      » pip install json-schema-to-pydantic
    
Published   Mar 09, 2026
Version   0.4.11
🌐
Medium
medium.com › @kishanbabariya101 › episode-8-json-schema-generation-in-pydantic-9a4c4fee02c8
Episode 8: JSON Schema Generation in Pydantic | by Kishan Babariya | Medium
December 17, 2024 - Feature Description Automatic Schema Generation Generate JSON Schema for all Pydantic models using model_json_schema(). FastAPI Integration Leverage schema generation for interactive API docs and validation.
🌐
Pydantic
docs.pydantic.dev › latest › integrations › datamodel_code_generator
datamodel-code-generator - Pydantic Validation
datamodel-codegen --input person.json --input-file-type jsonschema --output model.py ... { "$id": "person.json", "$schema": "http://json-schema.org/draft-07/schema#", "title": "Person", "type": "object", "properties": { "first_name": { "type": ...
🌐
Reddit
reddit.com › r/python › new package: jambo — convert json schema to pydantic models automatically
r/Python on Reddit: New Package: Jambo — Convert JSON Schema to Pydantic Models Automatically
April 10, 2025 -

🚀 I built Jambo, a tool that converts JSON Schema definitions into Pydantic models — dynamically, with zero config!

What my project does:

  • Takes JSON Schema definitions and automatically converts them into Pydantic models

  • Supports validation for strings, integers, arrays, nested objects, and more

  • Enforces constraints like minLength, maximum, pattern, etc.

  • Built with AI frameworks like LangChain and CrewAI in mind — perfect for structured data workflows

🧪 Quick Example:

from jambo.schema_converter import SchemaConverter

schema = {
    "title": "Person",
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name"],
}

Person = SchemaConverter.build(schema)
print(Person(name="Alice", age=30))

🎯 Target Audience:

  • Developers building AI agent workflows with structured data

  • Anyone needing to convert schemas into validated models quickly

  • Pydantic users who want to skip writing models manually

  • Those working with JSON APIs or dynamic schema generation

🙌 Why I built it:

My name is Vitor Hideyoshi. I needed a tool to dynamically generate models while working on AI agent frameworks — so I decided to build it and share it with others.

Check it out here:

  • GitHub: https://github.com/HideyoshiNakazone/jambo

  • PyPI: https://pypi.org/project/jambo/

Would love to hear what you think! Bug reports, feedback, and PRs all welcome! 😄
#ai #crewai #langchain #jsonschema #pydantic

Find elsewhere
🌐
Pydantic
docs.pydantic.dev › 1.10 › usage › schema
Schema - Pydantic
The description for models is taken from either the docstring of the class or the argument description to the Field class. The schema is generated by default using aliases as keys, but it can be generated using model property names instead by calling MainModel.schema/schema_json(by_alias=False).
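The snippet above uses the v1 schema()/schema_json() names; a short sketch of the same by_alias behavior with the v2 model_json_schema() call (model and field names are illustrative):

```python
from pydantic import BaseModel, Field

class MainModel(BaseModel):
    """Description taken from this docstring."""
    snake_name: str = Field(alias="snakeName")

# Aliases are used as keys by default; pass by_alias=False for property names
by_alias = MainModel.model_json_schema(by_alias=True)
by_name = MainModel.model_json_schema(by_alias=False)
```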
🌐
GitHub
github.com › kreneskyp › jsonschema-pydantic
GitHub - kreneskyp/jsonschema-pydantic: Python library for converting JSON Schemas to Pydantic models
from jsonschema_pydantic import jsonschema_to_pydantic jsonschema = { "type": "object", "properties": { "name": {"type": "string"}, "age": {"type": "integer"}, }, "required": ["name"], } pydantic_model = jsonschema_to_pydantic(jsonschema)
Starred by 20 users
Forked by 4 users
Languages   Python 82.0% | Makefile 17.6% | Dockerfile 0.4%
🌐
PyPI
pypi.org › project › jsonschema-pydantic
jsonschema-pydantic · PyPI
from jsonschema_pydantic import jsonschema_to_pydantic jsonschema = { "type": "object", "properties": { "name": {"type": "string"}, "age": {"type": "integer"}, }, "required": ["name"], } pydantic_model = jsonschema_to_pydantic(jsonschema)
      » pip install jsonschema-pydantic
    
Published   Feb 03, 2024
Version   0.6
🌐
Pydantic
docs.pydantic.dev › 2.4 › concepts › json_schema
JSON Schema - Pydantic
BaseModel.model_dump_json returns a JSON string representation of the dict of the schema. TypeAdapter.dump_json serializes an instance of the adapted type to JSON. TypeAdapter.json_schema generates a JSON schema for the adapted type. The generated JSON schemas are compliant with the following specifications: ... OpenAPI extensions. import json from enum import Enum from typing import Union from typing_extensions import Annotated from pydantic import BaseModel, Field from pydantic.config import ConfigDict class FooBar(BaseModel): count: int size: Union[float, None] = None class Gender(str, Enum
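A minimal example of the TypeAdapter calls mentioned in that snippet (Pydantic v2):

```python
from typing import List
from pydantic import TypeAdapter

# TypeAdapter generates schemas and serializes arbitrary adapted types
adapter = TypeAdapter(List[int])
schema = adapter.json_schema()     # JSON schema for the adapted type
dumped = adapter.dump_json([1, 2, 3])  # serialize an instance to JSON bytes
```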
Top answer
1 of 3
2

One solution is to hack the utils out of datamodel-code-generator, specifically their JsonSchemaParser. This generates an intermediate text representation of all pydantic models which you can then dynamically import. You might reasonably balk at this, but it does allow for self-referencing and multi-model setups at least:

import importlib.util
import json
import re
import sys
from contextlib import contextmanager
from pathlib import Path
from tempfile import NamedTemporaryFile
from types import ModuleType

from datamodel_code_generator.parser.jsonschema import JsonSchemaParser
from pydantic import BaseModel


NON_ALPHANUMERIC = re.compile(r"[^a-zA-Z0-9]+")
UPPER_CAMEL_CASE = re.compile(r"[A-Z][a-zA-Z0-9]+")
LOWER_CAMEL_CASE = re.compile(r"[a-z][a-zA-Z0-9]+")

class BadJsonSchema(Exception):
    pass


def _to_camel_case(name: str) -> str:
    if any(NON_ALPHANUMERIC.finditer(name)):
        return "".join(term.lower().title() for term in NON_ALPHANUMERIC.split(name))
    if UPPER_CAMEL_CASE.match(name):
        return name
    if LOWER_CAMEL_CASE.match(name):
        return name[0].upper() + name[1:]
    raise BadJsonSchema(f"Unknown case used for {name}")


def _load_module_from_file(file_path: Path) -> ModuleType:
    spec = importlib.util.spec_from_file_location(
        name=file_path.stem, location=str(file_path)
    )
    module = importlib.util.module_from_spec(spec)
    sys.modules[file_path.stem] = module
    spec.loader.exec_module(module)
    return module


@contextmanager
def _delete_file_on_completion(file_path: Path):
    try:
        yield
    finally:
        file_path.unlink(missing_ok=True)


def json_schema_to_pydantic_model(json_schema: dict, name_override: str) -> type[BaseModel]:
    json_schema_as_str = json.dumps(json_schema)
    pydantic_models_as_str: str = JsonSchemaParser(json_schema_as_str).parse()

    with NamedTemporaryFile(suffix=".py", delete=False) as temp_file:
        temp_file_path = Path(temp_file.name).resolve()
        temp_file.write(pydantic_models_as_str.encode())

    with _delete_file_on_completion(file_path=temp_file_path):
        module = _load_module_from_file(file_path=temp_file_path)

    main_model_name = _to_camel_case(name=json_schema["title"])
    pydantic_model: type[BaseModel] = module.__dict__[main_model_name]
    # Override the pydantic model/parser name for nicer ValidationError messaging and logging
    pydantic_model.__name__ = name_override
    pydantic_model.parse_obj.__func__.__name__ = name_override
    return pydantic_model

The main drawback, as I see it: datamodel-code-generator has non-dev dependencies on isort and black, which are not ideal to have in your deployments.

2 of 3
2

If I understand correctly, you are looking for a way to generate Pydantic models from JSON schemas. Here is an implementation of a code generator, meaning you feed it a JSON schema and it outputs a Python file with the model definition(s). It is not "at runtime" though. For this, an approach that utilizes the create_model function was also discussed in this issue thread a while back, but as far as I know there is no such feature in Pydantic yet.

If you know that your models will not be too complex, it might be fairly easy to implement a crude version of this yourself. Essentially the properties in a JSON schema are reflected fairly nicely by the __fields__ attribute of a model. You could write a function that takes a parsed JSON schema (i.e. a dictionary) and generates the Field definitions to pass to create_model.
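A crude version of that approach might look like this (a sketch for flat schemas with primitive types only; the function and mapping names are illustrative):

```python
from typing import Any
from pydantic import create_model

# Map JSON-schema properties to (type, default) tuples for create_model.
TYPE_MAP = {"string": str, "integer": int, "number": float, "boolean": bool}

def crude_model(schema: dict) -> type:
    required = set(schema.get("required", []))
    fields: dict[str, Any] = {}
    for name, prop in schema.get("properties", {}).items():
        py_type = TYPE_MAP.get(prop.get("type"), Any)
        # Ellipsis marks a required field; otherwise fall back to the default
        default = ... if name in required else prop.get("default")
        fields[name] = (py_type, default)
    return create_model(schema.get("title", "Model"), **fields)

Point = crude_model({
    "title": "Point",
    "properties": {"x": {"type": "number"}, "y": {"type": "number"}},
    "required": ["x", "y"],
})
p = Point(x=1, y=2)
```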

🌐
Bugbytes
bugbytes.io › posts › pydantic-nested-models-and-json-schemas
Pydantic - Nested Models and JSON Schemas - BugBytes
This defines the fields that exist on the model, the required fields, the types and different formats (for example, UUID string format), and more. Furthermore, this machine-readable JSON schema allows other tools to generate code from the schema. There are also tools that allow you to take JSON Schemas, and generate Pydantic models from the schema!
🌐
GitHub
github.com › pydantic › pydantic › blob › main › docs › concepts › json_schema.md
pydantic/docs/concepts/json_schema.md at main · pydantic/pydantic
The json_schema_extra option can be used to add extra information to the JSON schema, either at the Field level or at the Model level. You can pass a dict or a Callable to json_schema_extra.
Author   pydantic
🌐
Netlify
field-idempotency--pydantic-docs.netlify.app › usage › schema
Schema - pydantic
It's also possible to extend/override the generated JSON schema in a model. To do it, use the Config sub-class attribute schema_extra. For example, you could add examples to the JSON Schema: from pydantic import BaseModel class Person(BaseModel): name: str age: int class Config: schema_extra ...
🌐
Reddit
reddit.com › r/python › jsontopydantic - generate pydantic models from json in the browser
r/Python on Reddit: JSONtoPydantic - Generate Pydantic Models from JSON in the browser
December 4, 2020 -

https://jsontopydantic.com

Hi there! I built this over the weekend and figured someone other than myself might find it useful.

Like many who work with REST APIs in Python, I've recently fallen in love with Pydantic. If you haven't heard of Pydantic, it's a data validation and parsing library that makes working with JSON in Python quite pleasant.

I needed a quick way to generate a Pydantic model from any given sample of JSON, and hacked together this application to do so. You can paste in a valid JSON string, and you'll get a valid Pydantic model back.

This is helpful if you're working with poorly documented APIs, really big objects, or have lots of edge cases to catch.

Check it out and let me know what you think!

Code -> https://github.com/brokenloop/jsontopydantic

🌐
GitHub
github.com › richard-gyiko › json-schema-to-pydantic
GitHub - richard-gyiko/json-schema-to-pydantic
from json_schema_to_pydantic import create_model # Define your JSON Schema schema = { "title": "User", "type": "object", "properties": { "name": {"type": "string"}, "email": {"type": "string", "format": "email"}, "age": {"type": "integer", "minimum": 0} }, "required": ["name", "email"] } # Generate your Pydantic model UserModel = create_model(schema) # Use the model user = UserModel( name="John Doe", email="john@example.com", age=30 ) # Example with relaxed validation RelaxedModel = create_model( { "type": "object", "properties": { "tags": {"type": "array"}, # Array without items schema "metad
Starred by 37 users
Forked by 12 users
Languages   Python 100.0%