# Pydantic > Data validation using Python type hints Pydantic is the most widely used data validation library for Python. Fast and extensible, Pydantic plays nicely with your linters/IDE/brain. Define how data should be in pure, canonical Python 3.9+; validate it with Pydantic. # Concepts documentation An alias is an alternative name for a field, used when serializing and deserializing data. You can specify an alias in the following ways: - `alias` on the Field - must be a `str` - `validation_alias` on the Field - can be an instance of `str`, AliasPath, or AliasChoices - `serialization_alias` on the Field - must be a `str` - `alias_generator` on the Config - can be a callable or an instance of AliasGenerator For examples of how to use `alias`, `validation_alias`, and `serialization_alias`, see [Field aliases](../fields/#field-aliases). ## `AliasPath` and `AliasChoices` API Documentation pydantic.aliases.AliasPath\ pydantic.aliases.AliasChoices Pydantic provides two special types for convenience when using `validation_alias`: `AliasPath` and `AliasChoices`. The `AliasPath` is used to specify a path to a field using aliases. For example: ```python from pydantic import BaseModel, Field, AliasPath class User(BaseModel): first_name: str = Field(validation_alias=AliasPath('names', 0)) last_name: str = Field(validation_alias=AliasPath('names', 1)) user = User.model_validate({'names': ['John', 'Doe']}) # (1)! print(user) #> first_name='John' last_name='Doe' ``` 1. We are using `model_validate` to validate a dictionary using the field aliases. You can see more details about model_validate in the API reference. In the `'first_name'` field, we are using the alias `'names'` and the index `0` to specify the path to the first name. In the `'last_name'` field, we are using the alias `'names'` and the index `1` to specify the path to the last name. `AliasChoices` is used to specify a choice of aliases. For example: ```python from pydantic import BaseModel, Field, AliasChoices class User(BaseModel): first_name: str = Field(validation_alias=AliasChoices('first_name', 'fname')) last_name: str = Field(validation_alias=AliasChoices('last_name', 'lname')) user = User.model_validate({'fname': 'John', 'lname': 'Doe'}) # (1)! print(user) #> first_name='John' last_name='Doe' user = User.model_validate({'first_name': 'John', 'lname': 'Doe'}) # (2)! print(user) #> first_name='John' last_name='Doe' ``` 1. We are using the second alias choice for both fields. 1. We are using the first alias choice for the field `'first_name'` and the second alias choice for the field `'last_name'`. You can also use `AliasChoices` with `AliasPath`: ```python from pydantic import BaseModel, Field, AliasPath, AliasChoices class User(BaseModel): first_name: str = Field(validation_alias=AliasChoices('first_name', AliasPath('names', 0))) last_name: str = Field(validation_alias=AliasChoices('last_name', AliasPath('names', 1))) user = User.model_validate({'first_name': 'John', 'last_name': 'Doe'}) print(user) #> first_name='John' last_name='Doe' user = User.model_validate({'names': ['John', 'Doe']}) print(user) #> first_name='John' last_name='Doe' user = User.model_validate({'names': ['John'], 'last_name': 'Doe'}) print(user) #> first_name='John' last_name='Doe' ``` ## Using alias generators You can use the `alias_generator` parameter of Config to specify a callable (or group of callables, via `AliasGenerator`) that will generate aliases for all fields in a model. 
This is useful if you want to use a consistent naming convention for all fields in a model, but do not want to specify the alias for each field individually.

Note

Pydantic offers three built-in alias generators that you can use out of the box:

- to_pascal
- to_camel
- to_snake

### Using a callable

Here's a basic example using a callable:

```python
from pydantic import BaseModel, ConfigDict


class Tree(BaseModel):
    model_config = ConfigDict(
        alias_generator=lambda field_name: field_name.upper()
    )

    age: int
    height: float
    kind: str


t = Tree.model_validate({'AGE': 12, 'HEIGHT': 1.2, 'KIND': 'oak'})
print(t.model_dump(by_alias=True))
#> {'AGE': 12, 'HEIGHT': 1.2, 'KIND': 'oak'}
```

### Using an `AliasGenerator`

API Documentation

pydantic.aliases.AliasGenerator

`AliasGenerator` is a class that allows you to specify multiple alias generators for a model. You can use an `AliasGenerator` to specify different alias generators for validation and serialization.

This is particularly useful if you need to use different naming conventions for loading and saving data, but you don't want to specify the validation and serialization aliases for each field individually.

For example:

```python
from pydantic import AliasGenerator, BaseModel, ConfigDict


class Tree(BaseModel):
    model_config = ConfigDict(
        alias_generator=AliasGenerator(
            validation_alias=lambda field_name: field_name.upper(),
            serialization_alias=lambda field_name: field_name.title(),
        )
    )

    age: int
    height: float
    kind: str


t = Tree.model_validate({'AGE': 12, 'HEIGHT': 1.2, 'KIND': 'oak'})
print(t.model_dump(by_alias=True))
#> {'Age': 12, 'Height': 1.2, 'Kind': 'oak'}
```

## Alias Precedence

If you specify an `alias` on the Field, it will take precedence over the generated alias by default:

```python
from pydantic import BaseModel, ConfigDict, Field


def to_camel(string: str) -> str:
    return ''.join(word.capitalize() for word in string.split('_'))


class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_camel)

    name: str
    language_code: str = Field(alias='lang')


voice = Voice(Name='Filiz', lang='tr-TR')
print(voice.language_code)
#> tr-TR
print(voice.model_dump(by_alias=True))
#> {'Name': 'Filiz', 'lang': 'tr-TR'}
```

### Alias Priority

You may set `alias_priority` on a field to change this behavior:

- `alias_priority=2`: the alias will *not* be overridden by the alias generator.
- `alias_priority=1`: the alias *will* be overridden by the alias generator.
- `alias_priority` not set:
  - alias is set: the alias will *not* be overridden by the alias generator.
  - alias is not set: the alias *will* be overridden by the alias generator.

The same precedence applies to `validation_alias` and `serialization_alias`. See more about the different field aliases under [field aliases](../fields/#field-aliases).
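For instance, a minimal sketch of `alias_priority=1` in action, where the generated alias wins over the explicitly set one (this uses the built-in `to_camel` generator; the model here is illustrative):

```python
from pydantic import BaseModel, ConfigDict, Field
from pydantic.alias_generators import to_camel


class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_camel)

    # alias_priority=1: the generator's alias ('languageCode')
    # overrides the explicitly set alias ('lang').
    language_code: str = Field(alias='lang', alias_priority=1)


voice = Voice(languageCode='tr-TR')
print(voice.language_code)
#> tr-TR
```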
## Alias Configuration

You can use [`ConfigDict`](../config/) settings or runtime validation/serialization settings to control whether or not aliases are used.

### `ConfigDict` Settings

You can use [configuration settings](../config/) to control, at the model level, whether or not aliases are used for validation and serialization. If you would like to control this behavior across nested models (i.e. beyond the configuration boundary of a single model), use [runtime settings](#runtime-settings).

#### Validation

When validating data, you can enable population of attributes by attribute name, alias, or both. **By default**, Pydantic uses aliases for validation. Further configuration is available via:

- ConfigDict.validate_by_alias: `True` by default
- ConfigDict.validate_by_name: `False` by default

```python
from pydantic import BaseModel, ConfigDict, Field


class Model(BaseModel):
    my_field: str = Field(validation_alias='my_alias')

    model_config = ConfigDict(validate_by_alias=True, validate_by_name=False)


print(repr(Model(my_alias='foo')))  # (1)!
#> Model(my_field='foo')
```

1. The alias `my_alias` is used for validation.

```python
from pydantic import BaseModel, ConfigDict, Field


class Model(BaseModel):
    my_field: str = Field(validation_alias='my_alias')

    model_config = ConfigDict(validate_by_alias=False, validate_by_name=True)


print(repr(Model(my_field='foo')))  # (1)!
#> Model(my_field='foo')
```

1. The attribute name `my_field` is used for validation.

```python
from pydantic import BaseModel, ConfigDict, Field


class Model(BaseModel):
    my_field: str = Field(validation_alias='my_alias')

    model_config = ConfigDict(validate_by_alias=True, validate_by_name=True)


print(repr(Model(my_alias='foo')))  # (1)!
#> Model(my_field='foo')

print(repr(Model(my_field='foo')))  # (2)!
#> Model(my_field='foo')
```

1. The alias `my_alias` is used for validation.
1. The attribute name `my_field` is used for validation.

Warning

You cannot set both `validate_by_alias` and `validate_by_name` to `False`. A [user error](../../errors/usage_errors/#validate-by-alias-and-name-false) is raised in this case.

#### Serialization

When serializing data, you can enable serialization by alias, which is disabled by default. See the ConfigDict.serialize_by_alias API documentation for more details.

```python
from pydantic import BaseModel, ConfigDict, Field


class Model(BaseModel):
    my_field: str = Field(serialization_alias='my_alias')

    model_config = ConfigDict(serialize_by_alias=True)


m = Model(my_field='foo')
print(m.model_dump())  # (1)!
#> {'my_alias': 'foo'}
```

1. The alias `my_alias` is used for serialization.

Note

The fact that serialization by alias is disabled by default is notably inconsistent with the default for validation (where aliases are used by default). We anticipate changing this default in V3.

### Runtime Settings

You can use runtime alias flags to control alias use for validation and serialization on a per-call basis. If you would like to control this behavior on a model level, use [`ConfigDict` settings](#configdict-settings).

#### Validation

When validating data, you can enable population of attributes by attribute name, alias, or both.

The `by_alias` and `by_name` flags are available on the model_validate(), model_validate_json(), and model_validate_strings() methods, as well as the TypeAdapter validation methods. By default:

- `by_alias` is `True`
- `by_name` is `False`

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    my_field: str = Field(validation_alias='my_alias')


m = Model.model_validate(
    {'my_alias': 'foo'},  # (1)!
    by_alias=True,
    by_name=False,
)
print(repr(m))
#> Model(my_field='foo')
```

1. The alias `my_alias` is used for validation.

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    my_field: str = Field(validation_alias='my_alias')


m = Model.model_validate(
    {'my_field': 'foo'}, by_alias=False, by_name=True  # (1)!
)
print(repr(m))
#> Model(my_field='foo')
```

1. The attribute name `my_field` is used for validation.
```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    my_field: str = Field(validation_alias='my_alias')


m = Model.model_validate(
    {'my_alias': 'foo'}, by_alias=True, by_name=True  # (1)!
)
print(repr(m))
#> Model(my_field='foo')

m = Model.model_validate(
    {'my_field': 'foo'}, by_alias=True, by_name=True  # (2)!
)
print(repr(m))
#> Model(my_field='foo')
```

1. The alias `my_alias` is used for validation.
1. The attribute name `my_field` is used for validation.

Warning

You cannot set both `by_alias` and `by_name` to `False`. A [user error](../../errors/usage_errors/#validate-by-alias-and-name-false) is raised in this case.

#### Serialization

When serializing data, you can enable serialization by alias via the `by_alias` flag, which is available on the model_dump() and model_dump_json() methods, as well as the TypeAdapter ones. By default, `by_alias` is `False`.

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    my_field: str = Field(serialization_alias='my_alias')


m = Model(my_field='foo')
print(m.model_dump(by_alias=True))  # (1)!
#> {'my_alias': 'foo'}
```

1. The alias `my_alias` is used for serialization.

Note

The fact that serialization by alias is disabled by default is notably inconsistent with the default for validation (where aliases are used by default). We anticipate changing this default in V3.

The behaviour of Pydantic can be controlled via a variety of configuration values, documented on the ConfigDict class. This page describes how configuration can be specified for Pydantic's supported types.

## Configuration on Pydantic models

On Pydantic models, configuration can be specified in two ways:

- Using the model_config class attribute:

  ```python
  from pydantic import BaseModel, ConfigDict, ValidationError


  class Model(BaseModel):
      model_config = ConfigDict(str_max_length=5)  # (1)!

      v: str


  try:
      m = Model(v='abcdef')
  except ValidationError as e:
      print(e)
      """
      1 validation error for Model
      v
        String should have at most 5 characters [type=string_too_long, input_value='abcdef', input_type=str]
      """
  ```

  1. A plain dictionary (i.e. `{'str_max_length': 5}`) can also be used.

  Note

  In Pydantic V1, the `Config` class was used. This is still supported, but **deprecated**.

- Using class arguments:

  ```python
  from pydantic import BaseModel


  class Model(BaseModel, frozen=True):
      a: str  # (1)!
  ```

  1. Unlike the model_config class attribute, static type checkers will recognize the `frozen` argument, so any instance mutation will be flagged as a type checking error.

## Configuration on Pydantic dataclasses

[Pydantic dataclasses](../dataclasses/) also support configuration (read more in the [dedicated section](../dataclasses/#dataclass-config)).

```python
from pydantic import ConfigDict, ValidationError
from pydantic.dataclasses import dataclass


@dataclass(config=ConfigDict(str_max_length=10, validate_assignment=True))
class User:
    name: str


user = User(name='John Doe')
try:
    user.name = 'x' * 20
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    name
      String should have at most 10 characters [type=string_too_long, input_value='xxxxxxxxxxxxxxxxxxxx', input_type=str]
    """
```

## Configuration on `TypeAdapter`

[Type adapters](../type_adapter/) (using the TypeAdapter class) support configuration by providing a `config` argument.
```python from pydantic import ConfigDict, TypeAdapter ta = TypeAdapter(list[str], config=ConfigDict(coerce_numbers_to_str=True)) print(ta.validate_python([1, 2])) #> ['1', '2'] ``` ## Configuration on other supported types If you are using standard library dataclasses or TypedDict classes, the configuration can be set in two ways: - Using the `__pydantic_config__` class attribute: ```python from dataclasses import dataclass from pydantic import ConfigDict @dataclass class User: __pydantic_config__ = ConfigDict(strict=True) id: int name: str = 'John Doe' ``` - Using the with_config decorator (this avoids static type checking errors with TypedDict): ```python from typing_extensions import TypedDict from pydantic import ConfigDict, with_config @with_config(ConfigDict(str_to_lower=True)) class Model(TypedDict): x: str ``` ## Change behaviour globally If you wish to change the behaviour of Pydantic globally, you can create your own custom parent class with a custom configuration, as the configuration is inherited: ```python from pydantic import BaseModel, ConfigDict class Parent(BaseModel): model_config = ConfigDict(extra='allow') class Model(Parent): x: str m = Model(x='foo', y='bar') print(m.model_dump()) #> {'x': 'foo', 'y': 'bar'} ``` If you provide configuration to the subclasses, it will be *merged* with the parent configuration: ```python from pydantic import BaseModel, ConfigDict class Parent(BaseModel): model_config = ConfigDict(extra='allow', str_to_lower=False) class Model(Parent): model_config = ConfigDict(str_to_lower=True) x: str m = Model(x='FOO', y='bar') print(m.model_dump()) #> {'x': 'foo', 'y': 'bar'} print(Model.model_config) #> {'extra': 'allow', 'str_to_lower': True} ``` Warning If your model inherits from multiple bases, Pydantic currently *doesn't* follow the [MRO](https://docs.python.org/3/glossary.html#term-method-resolution-order). For more details, see [this issue](https://github.com/pydantic/pydantic/issues/9992). ## Configuration propagation Note that when using types that support configuration as field annotations on other types, configuration will *not* be propagated. In the following example, each model has its own "configuration boundary": ```python from pydantic import BaseModel, ConfigDict class User(BaseModel): name: str class Parent(BaseModel): user: User model_config = ConfigDict(str_max_length=2) print(Parent(user={'name': 'John Doe'})) #> user=User(name='John Doe') ``` The following table provides details on how Pydantic converts data during validation in both strict and lax modes. The "Strict" column contains checkmarks for type conversions that are allowed when validating in [Strict Mode](../strict_mode/). | Field Type | Input | Strict | Input Source | Conditions | | --- | --- | --- | --- | --- | | `bool` | `bool` | ✓ | Python & JSON | | | `bool` | `float` | | Python & JSON | Allowed values: `0.0, 1.0`. | | `bool` | `int` | | Python & JSON | Allowed values: `0, 1`. | | `bool` | `str` | | Python & JSON | Allowed values: `'f'`, `'n'`, `'no'`, `'off'`, `'false'`, `'False'`, `'t'`, `'y'`, `'on'`, `'yes'`, `'true'`, `'True'`. | | `bool` | `Decimal` | | Python | Allowed values: `Decimal(0), Decimal(1)`. | | `bytes` | `bytearray` | | Python | | | `bytes` | `bytes` | ✓ | Python | | | `bytes` | `str` | ✓ | JSON | | | `bytes` | `str` | | Python | | | `callable` | `-` | | JSON | Never valid. | | `callable` | `Any` | ✓ | Python | `callable()` check must return `True`. | | `date` | `bytes` | | Python | Format: `YYYY-MM-DD` (UTF-8). 
| | `date` | `date` | ✓ | Python | | | `date` | `datetime` | | Python | Must be exact date, eg. no `H`, `M`, `S`, `f`. | | `date` | `float` | | Python & JSON | Interpreted as seconds or ms from epoch. See [speedate](https://docs.rs/speedate/latest/speedate/). Must be exact date. | | `date` | `int` | | Python & JSON | Interpreted as seconds or ms from epoch. See [speedate](https://docs.rs/speedate/latest/speedate/). Must be exact date. | | `date` | `str` | | Python & JSON | Format: `YYYY-MM-DD`. | | `date` | `Decimal` | | Python | Interpreted as seconds or ms from epoch. See [speedate](https://docs.rs/speedate/latest/speedate/). Must be exact date. | | `datetime` | `bytes` | | Python | Format: `YYYY-MM-DDTHH:MM:SS.f` or `YYYY-MM-DD`. See [speedate](https://docs.rs/speedate/latest/speedate/), (UTF-8). | | `datetime` | `date` | | Python | | | `datetime` | `datetime` | ✓ | Python | | | `datetime` | `float` | | Python & JSON | Interpreted as seconds or ms from epoch, see [speedate](https://docs.rs/speedate/latest/speedate/). | | `datetime` | `int` | | Python & JSON | Interpreted as seconds or ms from epoch, see [speedate](https://docs.rs/speedate/latest/speedate/). | | `datetime` | `str` | | Python & JSON | Format: `YYYY-MM-DDTHH:MM:SS.f` or `YYYY-MM-DD`. See [speedate](https://docs.rs/speedate/latest/speedate/). | | `datetime` | `Decimal` | | Python | Interpreted as seconds or ms from epoch, see [speedate](https://docs.rs/speedate/latest/speedate/). | | `deque` | `deque` | ✓ | Python | | | `deque` | `frozenset` | | Python | | | `deque` | `list` | | Python | | | `deque` | `set` | | Python | | | `deque` | `tuple` | | Python | | | `deque` | `Array` | ✓ | JSON | | | `dict` | `dict` | ✓ | Python | | | `dict` | `Mapping` | | Python | Must implement the mapping interface and have an `items()` method. | | `dict` | `Object` | ✓ | JSON | | | `float` | `bool` | | Python & JSON | | | `float` | `bytes` | | Python | Must match `[0-9]+(\.[0-9]+)?`. | | `float` | `float` | ✓ | Python & JSON | `bool` is explicitly forbidden. | | `float` | `int` | ✓ | Python & JSON | | | `float` | `str` | | Python & JSON | Must match `[0-9]+(\.[0-9]+)?`. | | `float` | `Decimal` | | Python | | | `frozenset` | `deque` | | Python | | | `frozenset` | `dict_keys` | | Python | | | `frozenset` | `dict_values` | | Python | | | `frozenset` | `frozenset` | ✓ | Python | | | `frozenset` | `list` | | Python | | | `frozenset` | `set` | | Python | | | `frozenset` | `tuple` | | Python | | | `frozenset` | `Array` | ✓ | JSON | | | `int` | `bool` | | Python & JSON | | | `int` | `bytes` | | Python | Must be numeric only, e.g. `[0-9]+`. | | `int` | `float` | | Python & JSON | Must be exact int, e.g. `val % 1 == 0`, raises error for `nan`, `inf`. | | `int` | `int` | ✓ | Python & JSON | `bool` is explicitly forbidden. | | `int` | `int` | | Python & JSON | | | `int` | `str` | | Python & JSON | Must be numeric only, e.g. `[0-9]+`. | | `int` | `Decimal` | | Python | Must be exact int, e.g. `val % 1 == 0`. 
| | `list` | `deque` | | Python | | | `list` | `dict_keys` | | Python | | | `list` | `dict_values` | | Python | | | `list` | `frozenset` | | Python | | | `list` | `list` | ✓ | Python | | | `list` | `set` | | Python | | | `list` | `tuple` | | Python | | | `list` | `Array` | ✓ | JSON | | | `namedtuple` | `dict` | ✓ | Python | | | `namedtuple` | `list` | ✓ | Python | | | `namedtuple` | `namedtuple` | ✓ | Python | | | `namedtuple` | `tuple` | ✓ | Python | | | `namedtuple` | `Array` | ✓ | JSON | | | `namedtuple` | `NamedTuple` | ✓ | Python | | | `set` | `deque` | | Python | | | `set` | `dict_keys` | | Python | | | `set` | `dict_values` | | Python | | | `set` | `frozenset` | | Python | | | `set` | `list` | | Python | | | `set` | `set` | ✓ | Python | | | `set` | `tuple` | | Python | | | `set` | `Array` | ✓ | JSON | | | `str` | `bytearray` | | Python | Assumes UTF-8, error on unicode decoding error. | | `str` | `bytes` | | Python | Assumes UTF-8, error on unicode decoding error. | | `str` | `str` | ✓ | Python & JSON | | | `time` | `bytes` | | Python | Format: `HH:MM:SS.FFFFFF`. See [speedate](https://docs.rs/speedate/latest/speedate/). | | `time` | `float` | | Python & JSON | Interpreted as seconds, range `0 - 86399.9*`. | | `time` | `int` | | Python & JSON | Interpreted as seconds, range `0 - 86399`. | | `time` | `str` | | Python & JSON | Format: `HH:MM:SS.FFFFFF`. See [speedate](https://docs.rs/speedate/latest/speedate/). | | `time` | `time` | ✓ | Python | | | `time` | `Decimal` | | Python | Interpreted as seconds, range `0 - 86399.9*`. | | `timedelta` | `bytes` | | Python | Format: `ISO8601`. See [speedate](https://docs.rs/speedate/latest/speedate/), (UTF-8). | | `timedelta` | `float` | | Python & JSON | Interpreted as seconds. | | `timedelta` | `int` | | Python & JSON | Interpreted as seconds. | | `timedelta` | `str` | | Python & JSON | Format: `ISO8601`. See [speedate](https://docs.rs/speedate/latest/speedate/). | | `timedelta` | `timedelta` | ✓ | Python | | | `timedelta` | `Decimal` | | Python | Interpreted as seconds. | | `tuple` | `deque` | | Python | | | `tuple` | `dict_keys` | | Python | | | `tuple` | `dict_values` | | Python | | | `tuple` | `frozenset` | | Python | | | `tuple` | `list` | | Python | | | `tuple` | `set` | | Python | | | `tuple` | `tuple` | ✓ | Python | | | `tuple` | `Array` | ✓ | JSON | | | `type` | `type` | ✓ | Python | | | `Any` | `Any` | ✓ | Python & JSON | | | `ByteSize` | `float` | ✓ | Python & JSON | | | `ByteSize` | `int` | ✓ | Python & JSON | | | `ByteSize` | `str` | ✓ | Python & JSON | | | `ByteSize` | `Decimal` | ✓ | Python | | | `Decimal` | `float` | ✓ | JSON | | | `Decimal` | `float` | | Python & JSON | | | `Decimal` | `int` | ✓ | JSON | | | `Decimal` | `int` | | Python & JSON | | | `Decimal` | `str` | ✓ | JSON | | | `Decimal` | `str` | | Python & JSON | Must match `[0-9]+(\.[0-9]+)?`. | | `Decimal` | `Decimal` | ✓ | Python | | | `Enum` | `Any` | ✓ | JSON | Input value must be convertible to enum values. | | `Enum` | `Any` | | Python | Input value must be convertible to enum values. 
| | `Enum` | `Enum` | ✓ | Python | | | `IPv4Address` | `bytes` | | Python | | | `IPv4Address` | `int` | | Python | integer representing the IP address, must be less than `2**32` | | `IPv4Address` | `str` | ✓ | JSON | | | `IPv4Address` | `str` | | Python & JSON | | | `IPv4Address` | `IPv4Address` | ✓ | Python | | | `IPv4Address` | `IPv4Interface` | ✓ | Python | | | `IPv4Interface` | `bytes` | | Python | | | `IPv4Interface` | `int` | | Python | integer representing the IP address, must be less than `2**32` | | `IPv4Interface` | `str` | ✓ | JSON | | | `IPv4Interface` | `str` | | Python & JSON | | | `IPv4Interface` | `tuple` | | Python | | | `IPv4Interface` | `IPv4Address` | | Python | | | `IPv4Interface` | `IPv4Interface` | ✓ | Python | | | `IPv4Network` | `bytes` | | Python | | | `IPv4Network` | `int` | | Python | integer representing the IP network, must be less than `2**32` | | `IPv4Network` | `str` | ✓ | JSON | | | `IPv4Network` | `str` | | Python & JSON | | | `IPv4Network` | `IPv4Address` | | Python | | | `IPv4Network` | `IPv4Interface` | | Python | | | `IPv4Network` | `IPv4Network` | ✓ | Python | | | `IPv6Address` | `bytes` | | Python | | | `IPv6Address` | `int` | | Python | integer representing the IP address, must be less than `2**128` | | `IPv6Address` | `str` | ✓ | JSON | | | `IPv6Address` | `str` | | Python & JSON | | | `IPv6Address` | `IPv6Address` | ✓ | Python | | | `IPv6Address` | `IPv6Interface` | ✓ | Python | | | `IPv6Interface` | `bytes` | | Python | | | `IPv6Interface` | `int` | | Python | integer representing the IP address, must be less than `2**128` | | `IPv6Interface` | `str` | ✓ | JSON | | | `IPv6Interface` | `str` | | Python & JSON | | | `IPv6Interface` | `tuple` | | Python | | | `IPv6Interface` | `IPv6Address` | | Python | | | `IPv6Interface` | `IPv6Interface` | ✓ | Python | | | `IPv6Network` | `bytes` | | Python | | | `IPv6Network` | `int` | | Python | integer representing the IP address, must be less than `2**128` | | `IPv6Network` | `str` | ✓ | JSON | | | `IPv6Network` | `str` | | Python & JSON | | | `IPv6Network` | `IPv6Address` | | Python | | | `IPv6Network` | `IPv6Interface` | | Python | | | `IPv6Network` | `IPv6Network` | ✓ | Python | | | `InstanceOf` | `-` | | JSON | Never valid. | | `InstanceOf` | `Any` | ✓ | Python | `isinstance()` check must return `True`. | | `IntEnum` | `Any` | ✓ | JSON | Input value must be convertible to enum values. | | `IntEnum` | `Any` | | Python | Input value must be convertible to enum values. | | `IntEnum` | `IntEnum` | ✓ | Python | | | `Iterable` | `deque` | ✓ | Python | | | `Iterable` | `frozenset` | ✓ | Python | | | `Iterable` | `list` | ✓ | Python | | | `Iterable` | `set` | ✓ | Python | | | `Iterable` | `tuple` | ✓ | Python | | | `Iterable` | `Array` | ✓ | JSON | | | `NamedTuple` | `dict` | ✓ | Python | | | `NamedTuple` | `list` | ✓ | Python | | | `NamedTuple` | `namedtuple` | ✓ | Python | | | `NamedTuple` | `tuple` | ✓ | Python | | | `NamedTuple` | `Array` | ✓ | JSON | | | `NamedTuple` | `NamedTuple` | ✓ | Python | | | `None` | `None` | ✓ | Python & JSON | | | `Path` | `str` | ✓ | JSON | | | `Path` | `str` | | Python | | | `Path` | `Path` | ✓ | Python | | | `Pattern` | `bytes` | ✓ | Python | Input must be a valid pattern. | | `Pattern` | `str` | ✓ | Python & JSON | Input must be a valid pattern. 
|
| `Sequence` | `deque` | | Python | |
| `Sequence` | `list` | ✓ | Python | |
| `Sequence` | `tuple` | | Python | |
| `Sequence` | `Array` | ✓ | JSON | |
| `TypedDict` | `dict` | ✓ | Python | |
| `TypedDict` | `Any` | ✓ | Python | |
| `TypedDict` | `Mapping` | | Python | Must implement the mapping interface and have an `items()` method. |
| `TypedDict` | `Object` | ✓ | JSON | |
| `UUID` | `str` | ✓ | JSON | |
| `UUID` | `str` | | Python | |
| `UUID` | `UUID` | ✓ | Python | |

API Documentation

pydantic.dataclasses.dataclass

If you don't want to use Pydantic's BaseModel, you can instead get the same data validation on standard dataclasses.

```python
from datetime import datetime
from typing import Optional

from pydantic.dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str = 'John Doe'
    signup_ts: Optional[datetime] = None


user = User(id='42', signup_ts='2032-06-21T12:00')
print(user)
"""
User(id=42, name='John Doe', signup_ts=datetime.datetime(2032, 6, 21, 12, 0))
"""
```

Note

Keep in mind that Pydantic dataclasses are **not** a replacement for [Pydantic models](../models/). They provide similar functionality to stdlib dataclasses with the addition of Pydantic validation.
There are cases where subclassing using Pydantic models is the better choice. For more information and discussion see [pydantic/pydantic#710](https://github.com/pydantic/pydantic/issues/710).

Similarities between Pydantic dataclasses and models include support for:

- [Configuration](#dataclass-config)
- [Nested](../models/#nested-models) classes
- [Generics](../models/#generic-models)

Some differences between Pydantic dataclasses and models include:

- [validators](#validators-and-initialization-hooks)
- the behavior with the extra configuration value

Similarly to Pydantic models, arguments used to instantiate the dataclass are [copied](../models/#attribute-copies).

To make use of the [various methods](../models/#model-methods-and-properties) to validate, dump and generate a JSON Schema, you can wrap the dataclass with a TypeAdapter and make use of its methods.

You can use both Pydantic's Field() and the stdlib's field() functions:

```python
import dataclasses
from typing import Optional

from pydantic import Field, TypeAdapter
from pydantic.dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str = 'John Doe'
    friends: list[int] = dataclasses.field(default_factory=lambda: [0])
    age: Optional[int] = dataclasses.field(
        default=None,
        metadata={'title': 'The age of the user', 'description': 'do not lie!'},
    )
    height: Optional[int] = Field(None, title='The height in cm', ge=50, le=300)


user = User(id='42')
print(TypeAdapter(User).json_schema())
"""
{
    'properties': {
        'id': {'title': 'Id', 'type': 'integer'},
        'name': {'default': 'John Doe', 'title': 'Name', 'type': 'string'},
        'friends': {
            'items': {'type': 'integer'},
            'title': 'Friends',
            'type': 'array',
        },
        'age': {
            'anyOf': [{'type': 'integer'}, {'type': 'null'}],
            'default': None,
            'description': 'do not lie!',
            'title': 'The age of the user',
        },
        'height': {
            'anyOf': [
                {'maximum': 300, 'minimum': 50, 'type': 'integer'},
                {'type': 'null'},
            ],
            'default': None,
            'title': 'The height in cm',
        },
    },
    'required': ['id'],
    'title': 'User',
    'type': 'object',
}
"""
```

The Pydantic `@dataclass` decorator accepts the same arguments as the standard decorator, with the addition of a `config` parameter.

## Dataclass config

If you want to modify the configuration like you would with a BaseModel, you have two options:

- Use the `config` argument of the decorator.
- Define the configuration with the `__pydantic_config__` attribute.
```python
from pydantic import ConfigDict
from pydantic.dataclasses import dataclass


# Option 1 -- using the decorator argument:
@dataclass(config=ConfigDict(validate_assignment=True))  # (1)!
class MyDataclass1:
    a: int


# Option 2 -- using an attribute:
@dataclass
class MyDataclass2:
    a: int

    __pydantic_config__ = ConfigDict(validate_assignment=True)
```

1. You can read more about `validate_assignment` in the API reference.

Note

While Pydantic dataclasses support the extra configuration value, some default behavior of stdlib dataclasses may prevail. For example, any extra fields present on a Pydantic dataclass with extra set to `'allow'` are omitted in the dataclass's string representation. There is also no way to provide validation [using the `__pydantic_extra__` attribute](../models/#extra-data).

## Rebuilding dataclass schema

The rebuild_dataclass() function can be used to rebuild the core schema of the dataclass. See the [rebuilding model schema](../models/#rebuilding-model-schema) section for more details.

## Stdlib dataclasses and Pydantic dataclasses

### Inherit from stdlib dataclasses

Stdlib dataclasses (nested or not) can also be inherited, and Pydantic will automatically validate all the inherited fields.

```python
import dataclasses

import pydantic


@dataclasses.dataclass
class Z:
    z: int


@dataclasses.dataclass
class Y(Z):
    y: int = 0


@pydantic.dataclasses.dataclass
class X(Y):
    x: int = 0


foo = X(x=b'1', y='2', z='3')
print(foo)
#> X(z=3, y=2, x=1)

try:
    X(z='pika')
except pydantic.ValidationError as e:
    print(e)
    """
    1 validation error for X
    z
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='pika', input_type=str]
    """
```

### Usage of stdlib dataclasses with `BaseModel`

When a standard library dataclass is used within a Pydantic model, a Pydantic dataclass or a TypeAdapter, validation will be applied (and the [configuration](#dataclass-config) stays the same). This means that using a stdlib or a Pydantic dataclass as a field annotation is functionally equivalent.
```python
import dataclasses
from typing import Optional

from pydantic import BaseModel, ConfigDict, ValidationError


@dataclasses.dataclass(frozen=True)
class User:
    name: str


class Foo(BaseModel):
    # Required so that pydantic revalidates the model attributes:
    model_config = ConfigDict(revalidate_instances='always')

    user: Optional[User] = None


# nothing is validated as expected:
user = User(name=['not', 'a', 'string'])
print(user)
#> User(name=['not', 'a', 'string'])

try:
    Foo(user=user)
except ValidationError as e:
    print(e)
    """
    1 validation error for Foo
    user.name
      Input should be a valid string [type=string_type, input_value=['not', 'a', 'string'], input_type=list]
    """

foo = Foo(user=User(name='pika'))
try:
    foo.user.name = 'bulbi'
except dataclasses.FrozenInstanceError as e:
    print(e)
    #> cannot assign to field 'name'
```

### Using custom types

As mentioned above, validation is applied to standard library dataclasses. If you make use of custom types, you will get an error when trying to refer to the dataclass. To circumvent the issue, you can set the arbitrary_types_allowed configuration value on the dataclass:

```python
import dataclasses

from pydantic import BaseModel, ConfigDict
from pydantic.errors import PydanticSchemaGenerationError


class ArbitraryType:
    def __init__(self, value):
        self.value = value

    def __repr__(self):
        return f'ArbitraryType(value={self.value!r})'


@dataclasses.dataclass
class DC:
    a: ArbitraryType
    b: str


# valid as it is a stdlib dataclass without validation:
my_dc = DC(a=ArbitraryType(value=3), b='qwe')

try:

    class Model(BaseModel):
        dc: DC
        other: str

    # invalid as dc is now validated with pydantic, and ArbitraryType is not a known type
    Model(dc=my_dc, other='other')
except PydanticSchemaGenerationError as e:
    print(e.message)
    """
    Unable to generate pydantic-core schema for <class '__main__.ArbitraryType'>. Set `arbitrary_types_allowed=True` in the model_config to ignore this error or implement `__get_pydantic_core_schema__` on your type to fully support it.

    If you got this error by calling handler(<some type>) within `__get_pydantic_core_schema__` then you likely need to call `handler.generate_schema(<some type>)` since we do not call `__get_pydantic_core_schema__` on `<some type>` otherwise to avoid infinite recursion.
    """


# valid as we set arbitrary_types_allowed=True, and that config pushes down to the nested vanilla dataclass
class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    dc: DC
    other: str


m = Model(dc=my_dc, other='other')
print(repr(m))
#> Model(dc=DC(a=ArbitraryType(value=3), b='qwe'), other='other')
```

### Checking if a dataclass is a Pydantic dataclass

Pydantic dataclasses are still considered dataclasses, so using dataclasses.is_dataclass will return `True`.
To check if a type is specifically a pydantic dataclass, you can use the is_pydantic_dataclass function.

```python
import dataclasses

import pydantic


@dataclasses.dataclass
class StdLibDataclass:
    id: int


PydanticDataclass = pydantic.dataclasses.dataclass(StdLibDataclass)

print(dataclasses.is_dataclass(StdLibDataclass))
#> True
print(pydantic.dataclasses.is_pydantic_dataclass(StdLibDataclass))
#> False

print(dataclasses.is_dataclass(PydanticDataclass))
#> True
print(pydantic.dataclasses.is_pydantic_dataclass(PydanticDataclass))
#> True
```

## Validators and initialization hooks

Validators also work with Pydantic dataclasses:

```python
from pydantic import field_validator
from pydantic.dataclasses import dataclass


@dataclass
class DemoDataclass:
    product_id: str  # should be a five-digit string, may have leading zeros

    @field_validator('product_id', mode='before')
    @classmethod
    def convert_int_serial(cls, v):
        if isinstance(v, int):
            v = str(v).zfill(5)
        return v


print(DemoDataclass(product_id='01234'))
#> DemoDataclass(product_id='01234')
print(DemoDataclass(product_id=2468))
#> DemoDataclass(product_id='02468')
```

The dataclass __post_init__() method is also supported, and will be called between the calls to *before* and *after* model validators.

Example

```python
from pydantic_core import ArgsKwargs
from typing_extensions import Self

from pydantic import model_validator
from pydantic.dataclasses import dataclass


@dataclass
class Birth:
    year: int
    month: int
    day: int


@dataclass
class User:
    birth: Birth

    @model_validator(mode='before')
    @classmethod
    def before(cls, values: ArgsKwargs) -> ArgsKwargs:
        print(f'First: {values}')  # (1)!
        """
        First: ArgsKwargs((), {'birth': {'year': 1995, 'month': 3, 'day': 2}})
        """
        return values

    @model_validator(mode='after')
    def after(self) -> Self:
        print(f'Third: {self}')
        #> Third: User(birth=Birth(year=1995, month=3, day=2))
        return self

    def __post_init__(self):
        print(f'Second: {self.birth}')
        #> Second: Birth(year=1995, month=3, day=2)


user = User(**{'birth': {'year': 1995, 'month': 3, 'day': 2}})
```

1. Unlike Pydantic models, the `values` parameter is of type ArgsKwargs.

# Experimental Features

In this section you will find documentation for new, experimental features in Pydantic. These features are subject to change or removal, and we are looking for feedback and suggestions before making them a permanent part of Pydantic. See our [Version Policy](../../version-policy/#experimental-features) for more information on experimental features.

## Feedback

We welcome feedback on experimental features! Please open an issue on the [Pydantic GitHub repository](https://github.com/pydantic/pydantic/issues/new/choose) to share your thoughts, requests, or suggestions. We also encourage you to read through existing feedback and add your thoughts to existing issues.

## Warnings on Import

When you import an experimental feature from the `experimental` module, you'll see a warning message that the feature is experimental. You can disable this warning with the following:

```python
import warnings

from pydantic import PydanticExperimentalWarning

warnings.filterwarnings('ignore', category=PydanticExperimentalWarning)
```

## Pipeline API

Pydantic v2.8.0 introduced an experimental "pipeline" API that allows composing parsing (validation), constraints and transformations in a more type-safe manner than existing APIs. This API is subject to change or removal; we are looking for feedback and suggestions before making it a permanent part of Pydantic.
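As a quick taste before the detailed walkthrough below, here is a minimal sketch (the model and field names are illustrative; it assumes Pydantic 2.8+ with the experimental module available):

```python
from typing import Annotated

from pydantic import BaseModel
from pydantic.experimental.pipeline import validate_as


class User(BaseModel):
    # Validate as str, strip surrounding whitespace, then lowercase.
    name: Annotated[str, validate_as(str).str_strip().str_lower()]


print(User(name='  JOHN  ').name)
#> john
```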
API Documentation
pydantic.experimental.pipeline

Generally, the pipeline API is used to define a sequence of steps to apply to incoming data during validation. The pipeline API is designed to be more type-safe and composable than the existing Pydantic API. Each step in the pipeline can be:

- A validation step that runs pydantic validation on the provided type
- A transformation step that modifies the data
- A constraint step that checks the data against a condition
- A predicate step that checks the data against a condition and raises an error if it returns `False`

Note that the following example attempts to be exhaustive at the cost of complexity: if you find yourself writing this many transformations in type annotations, you may want to consider having a `UserIn` and `UserOut` model (example below) or similar, where you make the transformations via idiomatic plain Python code. These APIs are meant for situations where the code savings are significant and the added complexity is relatively small.

```python
from __future__ import annotations

from datetime import datetime
from typing import Annotated

from pydantic import BaseModel
from pydantic.experimental.pipeline import validate_as


class User(BaseModel):
    name: Annotated[str, validate_as(str).str_lower()]  # (1)!
    age: Annotated[int, validate_as(int).gt(0)]  # (2)!
    username: Annotated[str, validate_as(str).str_pattern(r'[a-z]+')]  # (3)!
    password: Annotated[
        str,
        validate_as(str)
        .transform(str.lower)
        .predicate(lambda x: x != 'password'),  # (4)!
    ]
    favorite_number: Annotated[  # (5)!
        int,
        (validate_as(int) | validate_as(str).str_strip().validate_as(int)).gt(
            0
        ),
    ]
    friends: Annotated[list[User], validate_as(...).len(0, 100)]  # (6)!
    bio: Annotated[
        datetime,
        validate_as(int)
        .transform(lambda x: x / 1_000_000)
        .validate_as(...),  # (7)!
    ]
```

1. Lowercase a string.
1. Constrain an integer to be greater than zero.
1. Constrain a string to match a regex pattern.
1. You can also use the lower level transform, constrain and predicate methods.
1. Use the `|` or `&` operators to combine steps (like a logical OR or AND).
1. Calling `validate_as(...)` with `Ellipsis`, `...`, as the first positional argument implies `validate_as(<field type>)`. Use `validate_as(Any)` to accept any type.
1. You can call `validate_as()` before or after other steps to do pre or post processing.

### Mapping from `BeforeValidator`, `AfterValidator` and `WrapValidator`

The `validate_as` method is a more type-safe way to define `BeforeValidator`, `AfterValidator` and `WrapValidator`:

```python
from typing import Annotated

from pydantic.experimental.pipeline import transform, validate_as

# BeforeValidator
Annotated[int, validate_as(str).str_strip().validate_as(...)]  # (1)!
# AfterValidator
Annotated[int, transform(lambda x: x * 2)]  # (2)!
# WrapValidator
Annotated[
    int,
    validate_as(str)
    .str_strip()
    .validate_as(...)
    .transform(lambda x: x * 2),  # (3)!
]
```

1. Strip whitespace from a string before parsing it as an integer.
1. Multiply an integer by 2 after parsing it.
1. Strip whitespace from a string, validate it as an integer, then multiply it by 2.

### Alternative patterns

There are many alternative patterns to use depending on the scenario.
Just as an example, consider the `UserIn` and `UserOut` pattern mentioned above:

```python
from __future__ import annotations

from pydantic import BaseModel


class UserIn(BaseModel):
    favorite_number: int | str


class UserOut(BaseModel):
    favorite_number: int


def my_api(user: UserIn) -> UserOut:
    favorite_number = user.favorite_number
    if isinstance(favorite_number, str):
        favorite_number = int(user.favorite_number.strip())
    return UserOut(favorite_number=favorite_number)


assert my_api(UserIn(favorite_number=' 1 ')).favorite_number == 1
```

This example uses plain idiomatic Python code that may be easier to understand, type-check, etc. than the examples above. The approach you choose should really depend on your use case. You will have to compare verbosity, performance, ease of returning meaningful errors to your users, etc. to choose the right pattern. Just be mindful of abusing advanced patterns like the pipeline API just because you can.

## Partial Validation

Pydantic v2.10.0 introduces experimental support for "partial validation". This allows you to validate an incomplete JSON string, or a Python object representing incomplete input data.

Partial validation is particularly helpful when processing the output of an LLM, where the model streams structured responses, and you may wish to begin validating the stream while you're still receiving data (e.g. to show partial data to users).

Warning
Partial validation is an experimental feature and may change in future versions of Pydantic. The current implementation should be considered a proof of concept at this time and has a number of [limitations](#limitations-of-partial-validation).

Partial validation can be enabled when using the three validation methods on `TypeAdapter`: TypeAdapter.validate_json(), TypeAdapter.validate_python(), and TypeAdapter.validate_strings(). This allows you to parse and validate incomplete JSON, but also to validate Python objects created by parsing incomplete data of any format.

The `experimental_allow_partial` flag can be passed to these methods to enable partial validation. It can take the following values (and is `False` by default):

- `False` or `'off'` - disable partial validation
- `True` or `'on'` - enable partial validation, but don't support trailing strings
- `'trailing-strings'` - enable partial validation and support trailing strings

`'trailing-strings'` mode
`'trailing-strings'` mode allows for trailing incomplete strings at the end of partial JSON to be included in the output. For example, if you're validating against the following model:

```python
from typing import TypedDict


class Model(TypedDict):
    a: str
    b: str
```

Then the following JSON input would be considered valid, despite the incomplete string at the end:

```json
'{"a": "hello", "b": "wor'
```

And would be validated as:

```python
{'a': 'hello', 'b': 'wor'}
```

`experimental_allow_partial` in action:

```python
from typing import Annotated

from annotated_types import MinLen
from typing_extensions import NotRequired, TypedDict

from pydantic import TypeAdapter


class Foobar(TypedDict):  # (1)!
    a: int
    b: NotRequired[float]
    c: NotRequired[Annotated[str, MinLen(5)]]


ta = TypeAdapter(list[Foobar])

v = ta.validate_json('[{"a": 1, "b"', experimental_allow_partial=True)  # (2)!
print(v)
#> [{'a': 1}]

v = ta.validate_json(
    '[{"a": 1, "b": 1.0, "c": "abcd', experimental_allow_partial=True  # (3)!
)
print(v)
#> [{'a': 1, 'b': 1.0}]

v = ta.validate_json(
    '[{"b": 1.0, "c": "abcde"', experimental_allow_partial=True  # (4)!
)
print(v)
#> []

v = ta.validate_json(
    '[{"a": 1, "b": 1.0, "c": "abcde"},{"a": ', experimental_allow_partial=True
)
print(v)
#> [{'a': 1, 'b': 1.0, 'c': 'abcde'}]

v = ta.validate_python([{'a': 1}], experimental_allow_partial=True)  # (5)!
print(v)
#> [{'a': 1}]

v = ta.validate_python(
    [{'a': 1, 'b': 1.0, 'c': 'abcd'}], experimental_allow_partial=True  # (6)!
)
print(v)
#> [{'a': 1, 'b': 1.0}]

v = ta.validate_json(
    '[{"a": 1, "b": 1.0, "c": "abcdefg',
    experimental_allow_partial='trailing-strings',  # (7)!
)
print(v)
#> [{'a': 1, 'b': 1.0, 'c': 'abcdefg'}]
```

1. The TypedDict `Foobar` has three fields, but only `a` is required, which means that a valid instance of `Foobar` can be created even if the `b` and `c` fields are missing.
1. When parsing JSON, the input is valid JSON up to the point where the string is truncated.
1. In this case, truncation of the input means the value of `c` (`abcd`) is invalid as input to the `c` field, hence it's omitted.
1. The `a` field is required, so validation on the only item in the list fails and the item is dropped.
1. Partial validation also works with Python objects; it has the same semantics as with JSON, except that, of course, you can't have a genuinely "incomplete" Python object.
1. The same as above, but with a Python object: `c` is dropped as it's not required and failed validation.
1. The `trailing-strings` mode allows for incomplete strings at the end of partial JSON to be included in the output. In this case, the input is valid JSON up to the point where the string is truncated, so the last string is included.

### How Partial Validation Works

Partial validation follows the zen of Pydantic: it makes no guarantees about what the input data might have been, but it does guarantee to return a valid instance of the type you required, or raise a validation error.

To do this, the `experimental_allow_partial` flag enables two pieces of behavior:

#### 1. Partial JSON parsing

The [jiter](https://github.com/pydantic/jiter) JSON parser used by Pydantic already supports parsing partial JSON; `experimental_allow_partial` is simply passed to jiter via the `allow_partial` argument.

Note
If you just want pure JSON parsing with support for partial JSON, you can use the [`jiter`](https://pypi.org/project/jiter/) Python library directly, or pass the `allow_partial` argument when calling pydantic_core.from_json.

#### 2. Ignore errors in the last element of the input

Only having access to part of the input data means errors can commonly occur in the last element of the input data. For example:

- if a string has a constraint `MinLen(5)`, when you only see part of the input, validation might fail because part of the string is missing (e.g. `{"name": "Sam` instead of `{"name": "Samuel"}`)
- if an `int` field has a constraint `Ge(10)`, when you only see part of the input, validation might fail because the number is too small (e.g. `1` instead of `10`)
- if a `TypedDict` has 3 required fields, but the partial input only has two of them, validation would fail because some fields are missing
- and so on; there are lots more cases like this

The point is that if you only see part of some valid input data, validation errors can often occur in the last element of a sequence or the last value of a mapping.

To avoid these errors breaking partial validation, Pydantic will ignore ALL errors in the last element of the input data.
Errors in last element ignored

```python
from typing import Annotated

from annotated_types import MinLen

from pydantic import BaseModel, TypeAdapter


class MyModel(BaseModel):
    a: int
    b: Annotated[str, MinLen(5)]


ta = TypeAdapter(list[MyModel])
v = ta.validate_json(
    '[{"a": 1, "b": "12345"}, {"a": 1,',
    experimental_allow_partial=True,
)
print(v)
#> [MyModel(a=1, b='12345')]
```

### Limitations of Partial Validation

#### TypeAdapter only

You can only pass `experimental_allow_partial` to TypeAdapter methods; it's not yet supported via other Pydantic entry points like BaseModel.

#### Types supported

Right now only a subset of collection validators know how to handle partial validation:

- `list`
- `set`
- `frozenset`
- `dict` (as in `dict[X, Y]`)
- `TypedDict` (only non-required fields may be missing, e.g. via NotRequired or total=False)

While you can use `experimental_allow_partial` while validating against types that include other collection validators, those types will be validated "all or nothing", and partial validation will not work on more nested types.

E.g. in the [above](#2-ignore-errors-in-last) example, partial validation works although the second item in the list is dropped completely, since `BaseModel` doesn't (yet) support partial validation.

But partial validation won't work at all in the following example, because `BaseModel` doesn't support partial validation, so it doesn't forward the `allow_partial` instruction down to the list validator in `b`:

```python
from typing import Annotated

from annotated_types import MinLen

from pydantic import BaseModel, TypeAdapter, ValidationError


class MyModel(BaseModel):
    a: int = 1
    b: list[Annotated[str, MinLen(5)]] = []  # (1)!


ta = TypeAdapter(MyModel)
try:
    v = ta.validate_json(
        '{"a": 1, "b": ["12345", "12', experimental_allow_partial=True
    )
except ValidationError as e:
    print(e)
    """
    1 validation error for MyModel
    b.1
      String should have at least 5 characters [type=string_too_short, input_value='12', input_type=str]
    """
```

1. The list validator for `b` doesn't get the `allow_partial` instruction passed down to it by the model validator, so it doesn't know to ignore errors in the last element of the input.

#### Some invalid but complete JSON will be accepted

The way [jiter](https://github.com/pydantic/jiter) (the JSON parser used by Pydantic) works means it's currently not possible to differentiate between complete JSON like `{"a": 1, "b": "12"}` and incomplete JSON like `{"a": 1, "b": "12`.

This means that some invalid JSON will be accepted by Pydantic when using `experimental_allow_partial`, e.g.:

```python
from typing import Annotated

from annotated_types import MinLen
from typing_extensions import TypedDict

from pydantic import TypeAdapter


class Foobar(TypedDict, total=False):
    a: int
    b: Annotated[str, MinLen(5)]


ta = TypeAdapter(Foobar)

v = ta.validate_json(
    '{"a": 1, "b": "12', experimental_allow_partial=True  # (1)!
)
print(v)
#> {'a': 1}

v = ta.validate_json(
    '{"a": 1, "b": "12"}', experimental_allow_partial=True  # (2)!
)
print(v)
#> {'a': 1}
```

1. This will pass validation as expected, although the last field will be omitted as it failed validation.
1. This will also pass validation, since the binary representation of the JSON data passed to pydantic-core is indistinguishable from the previous case.
```python
from typing import Annotated, TypedDict

from annotated_types import MinLen

from pydantic import TypeAdapter


class Foobar(TypedDict, total=False):
    a: int
    b: Annotated[str, MinLen(5)]


ta = TypeAdapter(Foobar)

v = ta.validate_json(
    '{"a": 1, "b": "12', experimental_allow_partial=True  # (1)!
)
print(v)
#> {'a': 1}

v = ta.validate_json(
    '{"a": 1, "b": "12"}', experimental_allow_partial=True  # (2)!
)
print(v)
#> {'a': 1}
```

1. This will pass validation as expected, although the last field will be omitted as it failed validation.
1. This will also pass validation, since the binary representation of the JSON data passed to pydantic-core is indistinguishable from the previous case.

#### Any error in the last field of the input will be ignored

As described [above](#2-ignore-errors-in-last), many errors can result from truncating the input. Rather than trying to specifically ignore errors that could result from truncation, Pydantic ignores all errors in the last element of the input in partial validation mode.

This means clearly invalid data will pass validation if the error is in the last field of the input:

```python
from typing import Annotated

from annotated_types import Ge

from pydantic import TypeAdapter

ta = TypeAdapter(list[Annotated[int, Ge(10)]])
v = ta.validate_python([20, 30, 4], experimental_allow_partial=True)  # (1)!
print(v)
#> [20, 30]

ta = TypeAdapter(list[int])
v = ta.validate_python([1, 2, 'wrong'], experimental_allow_partial=True)  # (2)!
print(v)
#> [1, 2]
```

1. This passes validation even though `4` fails the `Ge(10)` constraint, since the error occurs in the last item of the input, where Pydantic cannot tell whether the data was truncated.
1. This will also pass validation, since the error in the last item is ignored.

## Validation of a callable's arguments

Pydantic provides the @validate_call decorator to perform validation on the provided arguments (and additionally the return type) of a callable. However, it only allows arguments to be provided by actually calling the decorated callable. In some situations, you may want to just *validate* the arguments, such as when loading them from another data source (e.g. JSON data).

For this reason, the experimental generate_arguments_schema() function can be used to construct a core schema, which can later be used with a SchemaValidator.

```python
from pydantic_core import SchemaValidator

from pydantic.experimental.arguments_schema import generate_arguments_schema


def func(p: bool, *args: str, **kwargs: int) -> None: ...


arguments_schema = generate_arguments_schema(func=func)

val = SchemaValidator(arguments_schema, config={'coerce_numbers_to_str': True})

args, kwargs = val.validate_json(
    '{"p": true, "args": ["arg1", 1], "kwargs": {"extra": 1}}'
)
print(args, kwargs)  # (1)!
#> (True, 'arg1', '1') {'extra': 1}
```

1. If you want the validated arguments as a dictionary, you can use the Signature.bind() method:

   ```python
   from inspect import signature

   signature(func).bind(*args, **kwargs).arguments
   #> {'p': True, 'args': ('arg1', '1'), 'kwargs': {'extra': 1}}
   ```

Note
Unlike @validate_call, this core schema will only validate the provided arguments; the underlying callable will *not* be called.

This new core schema will become the default one to be used by @validate_call in Pydantic V3.
Additionally, you can ignore specific parameters by providing a callback, called for every parameter:

```python
from typing import Any

from pydantic_core import SchemaValidator

from pydantic.experimental.arguments_schema import generate_arguments_schema


def func(p: bool, *args: str, **kwargs: int) -> None: ...


def skip_first_parameter(index: int, name: str, annotation: Any) -> Any:
    if index == 0:
        return 'skip'


arguments_schema = generate_arguments_schema(
    func=func,
    parameters_callback=skip_first_parameter,
)

val = SchemaValidator(arguments_schema)

args, kwargs = val.validate_json('{"args": ["arg1"], "kwargs": {"extra": 1}}')
print(args, kwargs)
#> ('arg1',) {'extra': 1}
```

# Fields

API Documentation
pydantic.fields.Field

In this section, we will go through the available mechanisms to customize Pydantic model fields: default values, JSON Schema metadata, constraints, etc.

To do so, the Field() function is used a lot, and behaves the same way as the standard library field() function for dataclasses:

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    name: str = Field(frozen=True)
```

Note
Even though `name` is assigned a value, it is still required and has no default value. If you want to emphasize the fact that a value must be provided, you can use the ellipsis:

```python
class Model(BaseModel):
    name: str = Field(..., frozen=True)
```

However, its usage is discouraged as it doesn't play well with static type checkers.

## The annotated pattern

To apply constraints or attach Field() functions to a model field, Pydantic supports the Annotated typing construct to attach metadata to an annotation:

```python
from typing import Annotated

from pydantic import BaseModel, Field, WithJsonSchema


class Model(BaseModel):
    name: Annotated[str, Field(strict=True), WithJsonSchema({'extra': 'data'})]
```

As far as static type checkers are concerned, `name` is still typed as `str`, but Pydantic leverages the available metadata to add validation logic, type constraints, etc.

Using this pattern has some advantages:

- Using the `f: <type> = Field(...)` form can be confusing and might trick users into thinking `f` has a default value, while in reality it is still required.
- You can provide an arbitrary number of metadata elements for a field. As shown in the example above, the Field() function only supports a limited set of constraints/metadata, and you may have to use different Pydantic utilities such as WithJsonSchema in some cases.
- Types can be made reusable (see the documentation on [custom types](../types/#using-the-annotated-pattern) using this pattern).

However, note that certain arguments to the Field() function (namely, `default`, `default_factory`, and `alias`) are taken into account by static type checkers to synthesize a correct `__init__` method. The annotated pattern is *not* understood by them, so you should use the normal assignment form instead.

Tip
The annotated pattern can also be used to add metadata to specific parts of the type.
For instance, [validation constraints](#field-constraints) can be added this way:

```python
from typing import Annotated

from pydantic import BaseModel, Field


class Model(BaseModel):
    int_list: list[Annotated[int, Field(gt=0)]]
    # Valid: [1, 3]
    # Invalid: [-1, 2]
```

## Default values

Default values for fields can be provided using the normal assignment syntax or by providing a value to the `default` argument:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    # Neither field is required:
    name: str = 'John Doe'
    age: int = Field(default=20)
```

Warning
[In Pydantic V1](../../migration/#required-optional-and-nullable-fields), a type annotated as Any or wrapped by Optional would be given an implicit default of `None` even if no default was explicitly specified. This is no longer the case in Pydantic V2.

You can also pass a callable to the `default_factory` argument that will be called to generate a default value:

```python
from uuid import uuid4

from pydantic import BaseModel, Field


class User(BaseModel):
    id: str = Field(default_factory=lambda: uuid4().hex)
```

The default factory can also take a single required argument, in which case the already validated data will be passed as a dictionary:

```python
from pydantic import BaseModel, EmailStr, Field


class User(BaseModel):
    email: EmailStr
    username: str = Field(default_factory=lambda data: data['email'])


user = User(email='user@example.com')
print(user.username)
#> user@example.com
```

The `data` argument will *only* contain the already validated data, based on the [order of model fields](../models/#field-ordering) (the above example would fail if `username` were to be defined before `email`).

## Validate default values

By default, Pydantic will *not* validate default values. The `validate_default` field parameter (or the validate_default configuration value) can be used to enable this behavior:

```python
from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    age: int = Field(default='twelve', validate_default=True)


try:
    user = User()
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    age
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='twelve', input_type=str]
    """
```

### Mutable default values

A common source of bugs in Python is to use a mutable object as a default value for a function or method argument, as the same instance ends up being reused in each call. The dataclasses module actually raises an error in this case, indicating that you should use a [default factory](https://docs.python.org/3/library/dataclasses.html#default-factory-functions) instead.

While the same thing can be done in Pydantic, it is not required. In the event that the default value is not hashable, Pydantic will create a deep copy of the default value when creating each instance of the model:

```python
from pydantic import BaseModel


class Model(BaseModel):
    item_counts: list[dict[str, int]] = [{}]


m1 = Model()
m1.item_counts[0]['a'] = 1
print(m1.item_counts)
#> [{'a': 1}]

m2 = Model()
print(m2.item_counts)
#> [{}]
```

## Field aliases

Tip
Read more about aliases in the [dedicated section](../alias/).

For validation and serialization, you can define an alias for a field.

There are three ways to define an alias:

- `Field(alias='foo')`
- `Field(validation_alias='foo')`
- `Field(serialization_alias='foo')`

The `alias` parameter is used for both validation *and* serialization.
If you want to use *different* aliases for validation and serialization respectively, you can use the `validation_alias` and `serialization_alias` parameters, which will apply only in their respective use cases.

Here is an example of using the `alias` parameter:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(alias='username')


user = User(username='johndoe')  # (1)!
print(user)
#> name='johndoe'
print(user.model_dump(by_alias=True))  # (2)!
#> {'username': 'johndoe'}
```

1. The alias `'username'` is used for instance creation and validation.
1. We are using model_dump() to convert the model into a serializable format. Note that the `by_alias` keyword argument defaults to `False`, and must be specified explicitly to dump models using the field (serialization) aliases. You can also use ConfigDict.serialize_by_alias to configure this behavior at the model level. When `by_alias=True`, the alias `'username'` is used during serialization.

If you want to use an alias *only* for validation, you can use the `validation_alias` parameter:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(validation_alias='username')


user = User(username='johndoe')  # (1)!
print(user)
#> name='johndoe'
print(user.model_dump(by_alias=True))  # (2)!
#> {'name': 'johndoe'}
```

1. The validation alias `'username'` is used during validation.
1. The field name `'name'` is used during serialization.

If you only want to define an alias for *serialization*, you can use the `serialization_alias` parameter:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(serialization_alias='username')


user = User(name='johndoe')  # (1)!
print(user)
#> name='johndoe'
print(user.model_dump(by_alias=True))  # (2)!
#> {'username': 'johndoe'}
```

1. The field name `'name'` is used for validation.
1. The serialization alias `'username'` is used for serialization.

Alias precedence and priority
In case you use `alias` together with `validation_alias` or `serialization_alias` at the same time, the `validation_alias` will have priority over `alias` for validation, and `serialization_alias` will have priority over `alias` for serialization.

If you provide a value for the alias_generator model setting, you can control the order of precedence for field alias and generated aliases via the `alias_priority` field parameter. You can read more about alias precedence [here](../alias/#alias-precedence).

Static type checking/IDE support
If you provide a value for the `alias` field parameter, static type checkers will use this alias instead of the actual field name to synthesize the `__init__` method:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(alias='username')


user = User(username='johndoe')  # (1)!
```

1. Accepted by type checkers.

This means that when using the validate_by_name model setting (which allows both the field name and alias to be used during model validation), type checkers will error when the actual field name is used:

```python
from pydantic import BaseModel, ConfigDict, Field


class User(BaseModel):
    model_config = ConfigDict(validate_by_name=True)

    name: str = Field(alias='username')


user = User(name='johndoe')  # (1)!
```

1. *Not* accepted by type checkers.
If you still want type checkers to use the field name and not the alias, the [annotated pattern](#the-annotated-pattern) can be used (which is only understood by Pydantic):

```python
from typing import Annotated

from pydantic import BaseModel, ConfigDict, Field


class User(BaseModel):
    model_config = ConfigDict(validate_by_name=True, validate_by_alias=True)

    name: Annotated[str, Field(alias='username')]


user = User(name='johndoe')  # (1)!
user = User(username='johndoe')  # (2)!
```

1. Accepted by type checkers.
1. *Not* accepted by type checkers.

### Validation Alias

Even though Pydantic treats `alias` and `validation_alias` the same when creating model instances, type checkers only understand the `alias` field parameter. As a workaround, you can instead specify both an `alias` and a `serialization_alias` (identical to the field name), as the `serialization_alias` will override the `alias` during serialization. That is, you can replace:

```python
from pydantic import BaseModel, Field


class MyModel(BaseModel):
    my_field: int = Field(validation_alias='myValidationAlias')
```

with:

```python
from pydantic import BaseModel, Field


class MyModel(BaseModel):
    my_field: int = Field(
        alias='myValidationAlias',
        serialization_alias='my_field',
    )


m = MyModel(myValidationAlias=1)
print(m.model_dump(by_alias=True))
#> {'my_field': 1}
```

## Numeric Constraints

There are some keyword arguments that can be used to constrain numeric values:

- `gt` - greater than
- `lt` - less than
- `ge` - greater than or equal to
- `le` - less than or equal to
- `multiple_of` - a multiple of the given number
- `allow_inf_nan` - allow `'inf'`, `'-inf'`, `'nan'` values

Here's an example:

```python
from pydantic import BaseModel, Field


class Foo(BaseModel):
    positive: int = Field(gt=0)
    non_negative: int = Field(ge=0)
    negative: int = Field(lt=0)
    non_positive: int = Field(le=0)
    even: int = Field(multiple_of=2)
    love_for_pydantic: float = Field(allow_inf_nan=True)


foo = Foo(
    positive=1,
    non_negative=0,
    negative=-1,
    non_positive=0,
    even=2,
    love_for_pydantic=float('inf'),
)
print(foo)
"""
positive=1 non_negative=0 negative=-1 non_positive=0 even=2 love_for_pydantic=inf
"""
```

JSON Schema
In the generated JSON schema:

- `gt` and `lt` constraints will be translated to `exclusiveMinimum` and `exclusiveMaximum`.
- `ge` and `le` constraints will be translated to `minimum` and `maximum`.
- `multiple_of` constraint will be translated to `multipleOf`.

The above snippet will generate the following JSON Schema:

```json
{
  "title": "Foo",
  "type": "object",
  "properties": {
    "positive": {
      "title": "Positive",
      "type": "integer",
      "exclusiveMinimum": 0
    },
    "non_negative": {
      "title": "Non Negative",
      "type": "integer",
      "minimum": 0
    },
    "negative": {
      "title": "Negative",
      "type": "integer",
      "exclusiveMaximum": 0
    },
    "non_positive": {
      "title": "Non Positive",
      "type": "integer",
      "maximum": 0
    },
    "even": {
      "title": "Even",
      "type": "integer",
      "multipleOf": 2
    },
    "love_for_pydantic": {
      "title": "Love For Pydantic",
      "type": "number"
    }
  },
  "required": [
    "positive",
    "non_negative",
    "negative",
    "non_positive",
    "even",
    "love_for_pydantic"
  ]
}
```

See the [JSON Schema Draft 2020-12](https://json-schema.org/understanding-json-schema/reference/numeric.html#numeric-types) for more details.

Constraints on compound types
In case you use field constraints with compound types, an error can happen in some cases.
To avoid potential issues, you can use `Annotated`:

```python
from typing import Annotated, Optional

from pydantic import BaseModel, Field


class Foo(BaseModel):
    positive: Optional[Annotated[int, Field(gt=0)]]
    # Can error in some cases, not recommended:
    non_negative: Optional[int] = Field(ge=0)
```

## String Constraints

API Documentation
pydantic.types.StringConstraints

There are fields that can be used to constrain strings:

- `min_length`: Minimum length of the string.
- `max_length`: Maximum length of the string.
- `pattern`: A regular expression that the string must match.

Here's an example:

```python
from pydantic import BaseModel, Field


class Foo(BaseModel):
    short: str = Field(min_length=3)
    long: str = Field(max_length=10)
    regex: str = Field(pattern=r'^\d*$')  # (1)!


foo = Foo(short='foo', long='foobarbaz', regex='123')
print(foo)
#> short='foo' long='foobarbaz' regex='123'
```

1. Only digits are allowed.

JSON Schema
In the generated JSON schema:

- `min_length` constraint will be translated to `minLength`.
- `max_length` constraint will be translated to `maxLength`.
- `pattern` constraint will be translated to `pattern`.

The above snippet will generate the following JSON Schema:

```json
{
  "title": "Foo",
  "type": "object",
  "properties": {
    "short": {
      "title": "Short",
      "type": "string",
      "minLength": 3
    },
    "long": {
      "title": "Long",
      "type": "string",
      "maxLength": 10
    },
    "regex": {
      "title": "Regex",
      "type": "string",
      "pattern": "^\\d*$"
    }
  },
  "required": [
    "short",
    "long",
    "regex"
  ]
}
```

## Decimal Constraints

There are fields that can be used to constrain decimals:

- `max_digits`: Maximum number of digits within the `Decimal`. It does not include a zero before the decimal point or trailing decimal zeroes.
- `decimal_places`: Maximum number of decimal places allowed. It does not include trailing decimal zeroes.

Here's an example:

```python
from decimal import Decimal

from pydantic import BaseModel, Field


class Foo(BaseModel):
    precise: Decimal = Field(max_digits=5, decimal_places=2)


foo = Foo(precise=Decimal('123.45'))
print(foo)
#> precise=Decimal('123.45')
```

## Dataclass Constraints

There are fields that can be used to constrain dataclasses:

- `init`: Whether the field should be included in the `__init__` of the dataclass.
- `init_var`: Whether the field should be seen as an [init-only field](https://docs.python.org/3/library/dataclasses.html#init-only-variables) in the dataclass.
- `kw_only`: Whether the field should be a keyword-only argument in the constructor of the dataclass.

Here's an example:

```python
from pydantic import BaseModel, Field
from pydantic.dataclasses import dataclass


@dataclass
class Foo:
    bar: str
    baz: str = Field(init_var=True)
    qux: str = Field(kw_only=True)


class Model(BaseModel):
    foo: Foo


model = Model(foo=Foo('bar', baz='baz', qux='qux'))
print(model.model_dump())  # (1)!
#> {'foo': {'bar': 'bar', 'qux': 'qux'}}
```

1. The `baz` field is not included in the `model_dump()` output, since it is an init-only field.

## Field Representation

The parameter `repr` can be used to control whether the field should be included in the string representation of the model.

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(repr=True)  # (1)!
    age: int = Field(repr=False)


user = User(name='John', age=42)
print(user)
#> name='John'
```

1. This is the default value.

## Discriminator

The parameter `discriminator` can be used to control the field that will be used to discriminate between different models in a union.
It takes either the name of a field or a `Discriminator` instance. The `Discriminator` approach can be useful when the discriminator fields aren't the same for all the models in the `Union`.

The following example shows how to use `discriminator` with a field name:

```python
from typing import Literal, Union

from pydantic import BaseModel, Field


class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int


class Dog(BaseModel):
    pet_type: Literal['dog']
    age: int


class Model(BaseModel):
    pet: Union[Cat, Dog] = Field(discriminator='pet_type')


print(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}}))  # (1)!
#> pet=Cat(pet_type='cat', age=12)
```

1. See more about [Validating data](../models/#validating-data) in the [Models](../models/) page.

```python
from typing import Literal

from pydantic import BaseModel, Field


class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int


class Dog(BaseModel):
    pet_type: Literal['dog']
    age: int


class Model(BaseModel):
    pet: Cat | Dog = Field(discriminator='pet_type')


print(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}}))  # (1)!
#> pet=Cat(pet_type='cat', age=12)
```

1. See more about [Validating data](../models/#validating-data) in the [Models](../models/) page.

The following example shows how to use the `discriminator` keyword argument with a `Discriminator` instance:

```python
from typing import Annotated, Literal, Union

from pydantic import BaseModel, Discriminator, Field, Tag


class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int


class Dog(BaseModel):
    pet_kind: Literal['dog']
    age: int


def pet_discriminator(v):
    if isinstance(v, dict):
        return v.get('pet_type', v.get('pet_kind'))
    return getattr(v, 'pet_type', getattr(v, 'pet_kind', None))


class Model(BaseModel):
    pet: Union[Annotated[Cat, Tag('cat')], Annotated[Dog, Tag('dog')]] = Field(
        discriminator=Discriminator(pet_discriminator)
    )


print(repr(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}})))
#> Model(pet=Cat(pet_type='cat', age=12))

print(repr(Model.model_validate({'pet': {'pet_kind': 'dog', 'age': 12}})))
#> Model(pet=Dog(pet_kind='dog', age=12))
```

```python
from typing import Annotated, Literal

from pydantic import BaseModel, Discriminator, Field, Tag


class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int


class Dog(BaseModel):
    pet_kind: Literal['dog']
    age: int


def pet_discriminator(v):
    if isinstance(v, dict):
        return v.get('pet_type', v.get('pet_kind'))
    return getattr(v, 'pet_type', getattr(v, 'pet_kind', None))


class Model(BaseModel):
    pet: Annotated[Cat, Tag('cat')] | Annotated[Dog, Tag('dog')] = Field(
        discriminator=Discriminator(pet_discriminator)
    )


print(repr(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}})))
#> Model(pet=Cat(pet_type='cat', age=12))

print(repr(Model.model_validate({'pet': {'pet_kind': 'dog', 'age': 12}})))
#> Model(pet=Dog(pet_kind='dog', age=12))
```

You can also take advantage of `Annotated` to define your discriminated unions. See the [Discriminated Unions](../unions/#discriminated-unions) docs for more details.

## Strict Mode

The `strict` parameter on a Field specifies whether the field should be validated in "strict mode". In strict mode, Pydantic throws an error during validation instead of coercing data on the field where `strict=True`.

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(strict=True)
    age: int = Field(strict=False)  # (1)!


user = User(name='John', age='42')  # (2)!
print(user)
#> name='John' age=42
```

1. This is the default value.
1. The `age` field is not validated in strict mode. Therefore, it can be assigned a string.

See [Strict Mode](../strict_mode/) for more details.

See [Conversion Table](../conversion_table/) for more details on how Pydantic converts data in both strict and lax modes.

## Immutability

The parameter `frozen` is used to emulate the frozen dataclass behaviour: it prevents the field from being assigned a new value after the model is created (immutability). See the [frozen dataclass documentation](https://docs.python.org/3/library/dataclasses.html#frozen-instances) for more details.

```python
from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    name: str = Field(frozen=True)
    age: int


user = User(name='John', age=42)

try:
    user.name = 'Jane'  # (1)!
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    name
      Field is frozen [type=frozen_field, input_value='Jane', input_type=str]
    """
```

1. Since the `name` field is frozen, the assignment is not allowed.

## Exclude

The `exclude` parameter can be used to control which fields should be excluded from the model when exporting the model.

See the following example:

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str
    age: int = Field(exclude=True)


user = User(name='John', age=42)
print(user.model_dump())  # (1)!
#> {'name': 'John'}
```

1. The `age` field is not included in the `model_dump()` output, since it is excluded.

See the [Serialization](../serialization/#model-and-field-level-include-and-exclude) section for more details.

## Deprecated fields

The `deprecated` parameter can be used to mark a field as being deprecated. Doing so will result in:

- a runtime deprecation warning emitted when accessing the field.
- `"deprecated": true` being set in the generated JSON schema.

You can set the `deprecated` parameter as one of:

- A string, which will be used as the deprecation message.
- An instance of the `warnings.deprecated` decorator (or the `typing_extensions` backport).
- A boolean, which will be used to mark the field as deprecated with a default `'deprecated'` deprecation message.

### `deprecated` as a string

```python
from typing import Annotated

from pydantic import BaseModel, Field


class Model(BaseModel):
    deprecated_field: Annotated[int, Field(deprecated='This is deprecated')]


print(Model.model_json_schema()['properties']['deprecated_field'])
#> {'deprecated': True, 'title': 'Deprecated Field', 'type': 'integer'}
```

### `deprecated` via the `warnings.deprecated` decorator

Note
You can only use the `deprecated` decorator in this way if you have `typing_extensions` >= 4.9.0 installed.
```python
import importlib.metadata
from typing import Annotated

from packaging.version import Version
from typing_extensions import deprecated

from pydantic import BaseModel, Field

if Version(importlib.metadata.version('typing_extensions')) >= Version('4.9'):

    class Model(BaseModel):
        deprecated_field: Annotated[int, deprecated('This is deprecated')]

        # Or explicitly using `Field`:
        alt_form: Annotated[
            int, Field(deprecated=deprecated('This is deprecated'))
        ]
```

### `deprecated` as a boolean

```python
from typing import Annotated

from pydantic import BaseModel, Field


class Model(BaseModel):
    deprecated_field: Annotated[int, Field(deprecated=True)]


print(Model.model_json_schema()['properties']['deprecated_field'])
#> {'deprecated': True, 'title': 'Deprecated Field', 'type': 'integer'}
```

Support for `category` and `stacklevel`
The current implementation of this feature does not take into account the `category` and `stacklevel` arguments to the `deprecated` decorator. This might land in a future version of Pydantic.

Accessing a deprecated field in validators
When accessing a deprecated field inside a validator, the deprecation warning will be emitted. You can use catch_warnings to explicitly ignore it:

```python
import warnings

from typing_extensions import Self

from pydantic import BaseModel, Field, model_validator


class Model(BaseModel):
    deprecated_field: int = Field(deprecated='This is deprecated')

    @model_validator(mode='after')
    def validate_model(self) -> Self:
        with warnings.catch_warnings():
            warnings.simplefilter('ignore', DeprecationWarning)
            self.deprecated_field = self.deprecated_field * 2
        return self
```

## Customizing JSON Schema

Some field parameters are used exclusively to customize the generated JSON schema. The parameters in question are:

- `title`
- `description`
- `examples`
- `json_schema_extra`

Read more about JSON schema customization / modification with fields in the [Customizing JSON Schema](../json_schema/#field-level-customization) section of the JSON schema docs.

## The `computed_field` decorator

API Documentation
computed_field

The computed_field decorator can be used to include property or cached_property attributes when serializing a model or dataclass. The property will also be taken into account in the JSON Schema (in serialization mode).

Note
Properties can be useful for fields that are computed from other fields, or for fields that are expensive to compute (and thus are cached if using cached_property). However, note that Pydantic will *not* perform any additional logic on the wrapped property (validation, cache invalidation, etc.).

Here's an example of the JSON schema (in serialization mode) generated for a model with a computed field:

```python
from pydantic import BaseModel, computed_field


class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property  # (1)!
    def volume(self) -> float:
        return self.width * self.height * self.depth


print(Box.model_json_schema(mode='serialization'))
"""
{
    'properties': {
        'width': {'title': 'Width', 'type': 'number'},
        'height': {'title': 'Height', 'type': 'number'},
        'depth': {'title': 'Depth', 'type': 'number'},
        'volume': {'readOnly': True, 'title': 'Volume', 'type': 'number'},
    },
    'required': ['width', 'height', 'depth', 'volume'],
    'title': 'Box',
    'type': 'object',
}
"""
```

Here's an example using the `model_dump` method with a computed field:

```python
from pydantic import BaseModel, computed_field


class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property  # (1)!
    def volume(self) -> float:
        return self.width * self.height * self.depth


b = Box(width=1, height=2, depth=3)
print(b.model_dump())
#> {'width': 1.0, 'height': 2.0, 'depth': 3.0, 'volume': 6.0}
```

1. If not specified, computed_field will implicitly convert the method to a property. However, it is preferable to explicitly use the @property decorator for type checking purposes.

As with regular fields, computed fields can be marked as being deprecated:

```python
from typing_extensions import deprecated

from pydantic import BaseModel, computed_field


class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property
    @deprecated("'volume' is deprecated")
    def volume(self) -> float:
        return self.width * self.height * self.depth
```

# Forward Annotations

Forward annotations (wrapped in quotes), or annotations using the `from __future__ import annotations` [future statement](https://docs.python.org/3/reference/simple_stmts.html#future) (as introduced in [PEP 563](https://www.python.org/dev/peps/pep-0563/)), are supported:

```python
from __future__ import annotations

from pydantic import BaseModel

MyInt = int


class Model(BaseModel):
    a: MyInt
    # Without the future import, equivalent to:
    # a: 'MyInt'


print(Model(a='1'))
#> a=1
```

As shown in the following sections, forward annotations are useful when you want to reference a type that is not yet defined in your code. The internal logic to resolve forward annotations is described in detail in [this section](../../internals/resolving_annotations/).

## Self-referencing (or "Recursive") Models

Models with self-referencing fields are also supported. These annotations will be resolved during model creation.

Within the model, you can either add the `from __future__ import annotations` import or wrap the annotation in a string:

```python
from typing import Optional

from pydantic import BaseModel


class Foo(BaseModel):
    a: int = 123
    sibling: 'Optional[Foo]' = None


print(Foo())
#> a=123 sibling=None
print(Foo(sibling={'a': '321'}))
#> a=123 sibling=Foo(a=321, sibling=None)
```

### Cyclic references

When working with self-referencing recursive models, it is possible that you might encounter cyclic references in validation inputs. For example, this can happen when validating ORM instances with back-references from attributes.
Rather than raising a RecursionError while attempting to validate data with cyclic references, Pydantic is able to detect the cyclic reference and raise an appropriate ValidationError:

```python
from typing import Optional

from pydantic import BaseModel, ValidationError


class ModelA(BaseModel):
    b: 'Optional[ModelB]' = None


class ModelB(BaseModel):
    a: Optional[ModelA] = None


cyclic_data = {}
cyclic_data['a'] = {'b': cyclic_data}
print(cyclic_data)
#> {'a': {'b': {...}}}

try:
    ModelB.model_validate(cyclic_data)
except ValidationError as exc:
    print(exc)
    """
    1 validation error for ModelB
    a.b
      Recursion error - cyclic reference detected [type=recursion_loop, input_value={'a': {'b': {...}}}, input_type=dict]
    """
```

```python
from typing import Optional

from pydantic import BaseModel, ValidationError


class ModelA(BaseModel):
    b: 'Optional[ModelB]' = None


class ModelB(BaseModel):
    a: ModelA | None = None


cyclic_data = {}
cyclic_data['a'] = {'b': cyclic_data}
print(cyclic_data)
#> {'a': {'b': {...}}}

try:
    ModelB.model_validate(cyclic_data)
except ValidationError as exc:
    print(exc)
    """
    1 validation error for ModelB
    a.b
      Recursion error - cyclic reference detected [type=recursion_loop, input_value={'a': {'b': {...}}}, input_type=dict]
    """
```

Because this error is raised without actually exceeding the maximum recursion depth, you can catch and handle the raised ValidationError without needing to worry about the limited remaining recursion depth:

```python
from contextlib import contextmanager
from dataclasses import field
from typing import Iterator

from pydantic import BaseModel, ValidationError, field_validator


def is_recursion_validation_error(exc: ValidationError) -> bool:
    errors = exc.errors()
    return len(errors) == 1 and errors[0]['type'] == 'recursion_loop'


@contextmanager
def suppress_recursion_validation_error() -> Iterator[None]:
    try:
        yield
    except ValidationError as exc:
        if not is_recursion_validation_error(exc):
            raise exc


class Node(BaseModel):
    id: int
    children: list['Node'] = field(default_factory=list)

    @field_validator('children', mode='wrap')
    @classmethod
    def drop_cyclic_references(cls, children, h):
        try:
            return h(children)
        except ValidationError as exc:
            if not (
                is_recursion_validation_error(exc)
                and isinstance(children, list)
            ):
                raise exc

            value_without_cyclic_refs = []
            for child in children:
                with suppress_recursion_validation_error():
                    value_without_cyclic_refs.extend(h([child]))
            return h(value_without_cyclic_refs)


# Create data with cyclic references representing the graph 1 -> 2 -> 3 -> 1
node_data = {'id': 1, 'children': [{'id': 2, 'children': [{'id': 3}]}]}
node_data['children'][0]['children'][0]['children'] = [node_data]

print(Node.model_validate(node_data))
#> id=1 children=[Node(id=2, children=[Node(id=3, children=[])])]
```

```python
from collections.abc import Iterator
from contextlib import contextmanager
from dataclasses import field

from pydantic import BaseModel, ValidationError, field_validator


def is_recursion_validation_error(exc: ValidationError) -> bool:
    errors = exc.errors()
    return len(errors) == 1 and errors[0]['type'] == 'recursion_loop'


@contextmanager
def suppress_recursion_validation_error() -> Iterator[None]:
    try:
        yield
    except ValidationError as exc:
        if not is_recursion_validation_error(exc):
            raise exc


class Node(BaseModel):
    id: int
    children: list['Node'] = field(default_factory=list)

    @field_validator('children', mode='wrap')
    @classmethod
    def drop_cyclic_references(cls, children, h):
        try:
            return h(children)
        except ValidationError as exc:
            if not (
                is_recursion_validation_error(exc)
                and isinstance(children, list)
            ):
                raise exc

            value_without_cyclic_refs = []
            for child in children:
                with suppress_recursion_validation_error():
                    value_without_cyclic_refs.extend(h([child]))
            return h(value_without_cyclic_refs)


# Create data with cyclic references representing the graph 1 -> 2 -> 3 -> 1
node_data = {'id': 1, 'children': [{'id': 2, 'children': [{'id': 3}]}]}
node_data['children'][0]['children'][0]['children'] = [node_data]

print(Node.model_validate(node_data))
#> id=1 children=[Node(id=2, children=[Node(id=3, children=[])])]
```

Similarly, if Pydantic encounters a recursive reference during *serialization*, rather than waiting for the maximum recursion depth to be exceeded, a ValueError is raised immediately:

```python
from pydantic import TypeAdapter

# Create data with cyclic references representing the graph 1 -> 2 -> 3 -> 1
node_data = {'id': 1, 'children': [{'id': 2, 'children': [{'id': 3}]}]}
node_data['children'][0]['children'][0]['children'] = [node_data]

try:
    # Try serializing the circular reference as JSON
    TypeAdapter(dict).dump_json(node_data)
except ValueError as exc:
    print(exc)
    """
    Error serializing to JSON: ValueError: Circular reference detected (id repeated)
    """
```

This can also be handled if desired:

```python
from dataclasses import field
from typing import Any

from pydantic import (
    SerializerFunctionWrapHandler,
    TypeAdapter,
    field_serializer,
)
from pydantic.dataclasses import dataclass


@dataclass
class NodeReference:
    id: int


@dataclass
class Node(NodeReference):
    children: list['Node'] = field(default_factory=list)

    @field_serializer('children', mode='wrap')
    def serialize(
        self, children: list['Node'], handler: SerializerFunctionWrapHandler
    ) -> Any:
        """
        Serialize a list of nodes, handling circular references
        by excluding the children.
        """
        try:
            return handler(children)
        except ValueError as exc:
            if not str(exc).startswith('Circular reference'):
                raise exc

            result = []
            for node in children:
                try:
                    serialized = handler([node])
                except ValueError as exc:
                    if not str(exc).startswith('Circular reference'):
                        raise exc
                    result.append({'id': node.id})
                else:
                    result.append(serialized)
            return result


# Create a cyclic graph:
nodes = [Node(id=1), Node(id=2), Node(id=3)]
nodes[0].children.append(nodes[1])
nodes[1].children.append(nodes[2])
nodes[2].children.append(nodes[0])

print(nodes[0])
#> Node(id=1, children=[Node(id=2, children=[Node(id=3, children=[...])])])

# Serialize the cyclic graph:
print(TypeAdapter(Node).dump_python(nodes[0]))
"""
{
    'id': 1,
    'children': [{'id': 2, 'children': [{'id': 3, 'children': [{'id': 1}]}]}],
}
"""
```

# JSON

## JSON Parsing

API Documentation
pydantic.main.BaseModel.model_validate_json
pydantic.type_adapter.TypeAdapter.validate_json
pydantic_core.from_json

Pydantic provides builtin JSON parsing, which helps achieve:

- Significant performance improvements without the cost of using a 3rd party library
- Support for custom errors
- Support for `strict` specifications

Here's an example of Pydantic's builtin JSON parsing via the model_validate_json method, showcasing the support for `strict` specifications while parsing JSON data that doesn't match the model's type annotations:

```python
from datetime import date

from pydantic import BaseModel, ConfigDict, ValidationError


class Event(BaseModel):
    model_config = ConfigDict(strict=True)

    when: date
    where: tuple[int, int]


json_data = '{"when": "1987-01-28", "where": [51, -1]}'
print(Event.model_validate_json(json_data))  # (1)!
#> when=datetime.date(1987, 1, 28) where=(51, -1)

try:
    Event.model_validate({'when': '1987-01-28', 'where': [51, -1]})  # (2)!
except ValidationError as e:
    print(e)
    """
    2 validation errors for Event
    when
      Input should be a valid date [type=date_type, input_value='1987-01-28', input_type=str]
    where
      Input should be a valid tuple [type=tuple_type, input_value=[51, -1], input_type=list]
    """
```

1. JSON has no `date` or tuple types, but Pydantic knows that, so it allows strings and arrays as inputs, respectively, when parsing JSON directly.
1. If you pass the same values to the model_validate method, Pydantic will raise a validation error because the `strict` configuration is enabled.

In v2.5.0 and above, Pydantic uses [`jiter`](https://docs.rs/jiter/latest/jiter/), a fast and iterable JSON parser, to parse JSON data. Using `jiter` compared to `serde` results in modest performance improvements that will get even better in the future.

The `jiter` JSON parser is almost entirely compatible with the `serde` JSON parser, with one noticeable enhancement being that `jiter` supports deserialization of `inf` and `NaN` values. In the future, `jiter` is intended to enable validation errors to include the location in the original JSON input which contained the invalid value.

### Partial JSON Parsing

**Starting in v2.7.0**, Pydantic's [JSON parser](https://docs.rs/jiter/latest/jiter/) offers support for partial JSON parsing, which is exposed via pydantic_core.from_json. Here's an example of this feature in action:

```python
from pydantic_core import from_json

partial_json_data = '["aa", "bb", "c'  # (1)!

try:
    result = from_json(partial_json_data, allow_partial=False)
except ValueError as e:
    print(e)  # (2)!
    #> EOF while parsing a string at line 1 column 15

result = from_json(partial_json_data, allow_partial=True)
print(result)  # (3)!
#> ['aa', 'bb']
```

1. The JSON list is incomplete - it's missing a closing `"]`
1. When `allow_partial` is set to `False` (the default), a parsing error occurs.
1. When `allow_partial` is set to `True`, part of the input is deserialized successfully.

This also works for deserializing partial dictionaries. For example:

```python
from pydantic_core import from_json

partial_dog_json = '{"breed": "lab", "name": "fluffy", "friends": ["buddy", "spot", "rufus"], "age'
dog_dict = from_json(partial_dog_json, allow_partial=True)
print(dog_dict)
#> {'breed': 'lab', 'name': 'fluffy', 'friends': ['buddy', 'spot', 'rufus']}
```

Validating LLM Output
This feature is particularly beneficial for validating LLM outputs. We've written some blog posts about this topic, which you can find [here](https://pydantic.dev/articles).

In future versions of Pydantic, we expect to expand support for this feature through either Pydantic's other JSON validation functions (pydantic.main.BaseModel.model_validate_json and pydantic.type_adapter.TypeAdapter.validate_json) or model configuration. Stay tuned 🚀!

For now, you can use pydantic_core.from_json in combination with pydantic.main.BaseModel.model_validate to achieve the same result.
Here's an example:

```python
from pydantic_core import from_json

from pydantic import BaseModel


class Dog(BaseModel):
    breed: str
    name: str
    friends: list


partial_dog_json = '{"breed": "lab", "name": "fluffy", "friends": ["buddy", "spot", "rufus"], "age'
dog = Dog.model_validate(from_json(partial_dog_json, allow_partial=True))
print(repr(dog))
#> Dog(breed='lab', name='fluffy', friends=['buddy', 'spot', 'rufus'])
```

Tip
For partial JSON parsing to work reliably, all fields on the model should have default values.

Check out the following example for a more in-depth look at how to use default values with partial JSON parsing:

Using default values with partial JSON parsing

```python
from typing import Annotated, Any, Optional

import pydantic_core

from pydantic import BaseModel, ValidationError, WrapValidator


def default_on_error(v, handler) -> Any:
    """
    Raise a PydanticUseDefault exception if the value is missing.

    This is useful for avoiding errors from partial JSON preventing
    successful validation.
    """
    try:
        return handler(v)
    except ValidationError as exc:
        # there might be other types of errors resulting from partial JSON parsing
        # that you allow here, feel free to customize as needed
        if all(e['type'] == 'missing' for e in exc.errors()):
            raise pydantic_core.PydanticUseDefault()
        else:
            raise


class NestedModel(BaseModel):
    x: int
    y: str


class MyModel(BaseModel):
    foo: Optional[str] = None
    bar: Annotated[
        Optional[tuple[str, int]], WrapValidator(default_on_error)
    ] = None
    nested: Annotated[
        Optional[NestedModel], WrapValidator(default_on_error)
    ] = None


m = MyModel.model_validate(
    pydantic_core.from_json('{"foo": "x", "bar": ["world",', allow_partial=True)
)
print(repr(m))
#> MyModel(foo='x', bar=None, nested=None)


m = MyModel.model_validate(
    pydantic_core.from_json(
        '{"foo": "x", "bar": ["world", 1], "nested": {"x":', allow_partial=True
    )
)
print(repr(m))
#> MyModel(foo='x', bar=('world', 1), nested=None)
```

### Caching Strings

**Starting in v2.7.0**, Pydantic's [JSON parser](https://docs.rs/jiter/latest/jiter/) offers support for configuring how Python strings are cached during JSON parsing and validation (when Python strings are constructed from Rust strings during Python validation, e.g. after `strip_whitespace=True`). The `cache_strings` setting is exposed via both model config and pydantic_core.from_json.

The `cache_strings` setting can take any of the following values:

- `True` or `'all'` (the default): cache all strings
- `'keys'`: cache only dictionary keys; this **only** applies when used with pydantic_core.from_json or when parsing JSON using Json
- `False` or `'none'`: no caching

Using the string caching feature results in performance improvements, but increases memory usage slightly.

String Caching Details

1. Strings are cached using a fully associative cache with a size of [16,384](https://github.com/pydantic/jiter/blob/5bbdcfd22882b7b286416b22f74abd549c7b2fd7/src/py_string_cache.rs#L113).
1. Only strings where `len(string) < 64` are cached.
1. There is some overhead to looking up the cache, which is normally worth it to avoid constructing strings. However, if you know there will be very few repeated strings in your data, you might get a performance boost by disabling this setting with `cache_strings=False`.

## JSON Serialization

API Documentation
pydantic.main.BaseModel.model_dump_json
pydantic.type_adapter.TypeAdapter.dump_json
pydantic_core.to_json

For more information on JSON serialization, see the [Serialization Concepts](../serialization/#modelmodel_dump_json) page.
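As a quick illustration of these entry points (a minimal sketch; the `Meeting` model and its fields are illustrative and not taken from the serialization docs), model_dump_json serializes a model instance directly to a JSON string, applying JSON-mode conversions such as encoding datetime values as ISO 8601 strings:

```python
from datetime import datetime

from pydantic import BaseModel


class Meeting(BaseModel):
    when: datetime
    where: str


m = Meeting(when=datetime(2032, 4, 23, 10, 20, 30), where='Berlin')
# Serialize directly to a compact JSON string; the datetime is
# encoded as an ISO 8601 string:
print(m.model_dump_json())
#> {"when":"2032-04-23T10:20:30","where":"Berlin"}
```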
API Documentation pydantic.json_schema Pydantic allows automatic creation and customization of JSON schemas from models. The generated JSON schemas are compliant with the following specifications: - [JSON Schema Draft 2020-12](https://json-schema.org/draft/2020-12/release-notes.html) - [OpenAPI Specification v3.1.0](https://github.com/OAI/OpenAPI-Specification). ## Generating JSON Schema Use the following functions to generate JSON schema: - BaseModel.model_json_schema returns a jsonable dict of a model's schema. - TypeAdapter.json_schema returns a jsonable dict of an adapted type's schema. Note These methods are not to be confused with BaseModel.model_dump_json and TypeAdapter.dump_json, which serialize instances of the model or adapted type, respectively. These methods return JSON strings. In comparison, BaseModel.model_json_schema and TypeAdapter.json_schema return a jsonable dict representing the JSON schema of the model or adapted type, respectively. on the "jsonable" nature of JSON schema Regarding the "jsonable" nature of the model_json_schema results, calling `json.dumps(m.model_json_schema())` on some `BaseModel` `m` returns a valid JSON string. Similarly, for TypeAdapter.json_schema, calling `json.dumps(TypeAdapter(<some_type>).json_schema())` returns a valid JSON string. Tip Pydantic offers support for both of: 1. [Customizing JSON Schema](#customizing-json-schema) 1. [Customizing the JSON Schema Generation Process](#customizing-the-json-schema-generation-process) The first approach generally has a narrower scope, allowing for customization of the JSON schema for more specific cases and types. The second approach generally has a broader scope, allowing for customization of the JSON schema generation process overall. The same effects can be achieved with either approach, but depending on your use case, one approach might offer a simpler solution than the other. Here's an example of generating JSON schema from a `BaseModel`: ```python import json from enum import Enum from typing import Annotated, Union from pydantic import BaseModel, Field from pydantic.config import ConfigDict class FooBar(BaseModel): count: int size: Union[float, None] = None class Gender(str, Enum): male = 'male' female = 'female' other = 'other' not_given = 'not_given' class MainModel(BaseModel): """ This is the description of the main model """ model_config = ConfigDict(title='Main') foo_bar: FooBar gender: Annotated[Union[Gender, None], Field(alias='Gender')] = None snap: int = Field( default=42, title='The Snap', description='this is the value of snap', gt=30, lt=50, ) main_model_schema = MainModel.model_json_schema() # (1)! print(json.dumps(main_model_schema, indent=2)) # (2)! ``` JSON output: ```json { "$defs": { "FooBar": { "properties": { "count": { "title": "Count", "type": "integer" }, "size": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Size" } }, "required": [ "count" ], "title": "FooBar", "type": "object" }, "Gender": { "enum": [ "male", "female", "other", "not_given" ], "title": "Gender", "type": "string" } }, "description": "This is the description of the main model", "properties": { "foo_bar": { "$ref": "#/$defs/FooBar" }, "Gender": { "anyOf": [ { "$ref": "#/$defs/Gender" }, { "type": "null" } ], "default": null }, "snap": { "default": 42, "description": "this is the value of snap", "exclusiveMaximum": 50, "exclusiveMinimum": 30, "title": "The Snap", "type": "integer" } }, "required": [ "foo_bar" ], "title": "Main", "type": "object" } ``` 1.
This produces a "jsonable" dict of `MainModel`'s schema. 1. Calling `json.dumps` on the schema dict produces a JSON string. ```python import json from enum import Enum from typing import Annotated from pydantic import BaseModel, Field from pydantic.config import ConfigDict class FooBar(BaseModel): count: int size: float | None = None class Gender(str, Enum): male = 'male' female = 'female' other = 'other' not_given = 'not_given' class MainModel(BaseModel): """ This is the description of the main model """ model_config = ConfigDict(title='Main') foo_bar: FooBar gender: Annotated[Gender | None, Field(alias='Gender')] = None snap: int = Field( default=42, title='The Snap', description='this is the value of snap', gt=30, lt=50, ) main_model_schema = MainModel.model_json_schema() # (1)! print(json.dumps(main_model_schema, indent=2)) # (2)! ``` JSON output: ```json { "$defs": { "FooBar": { "properties": { "count": { "title": "Count", "type": "integer" }, "size": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Size" } }, "required": [ "count" ], "title": "FooBar", "type": "object" }, "Gender": { "enum": [ "male", "female", "other", "not_given" ], "title": "Gender", "type": "string" } }, "description": "This is the description of the main model", "properties": { "foo_bar": { "$ref": "#/$defs/FooBar" }, "Gender": { "anyOf": [ { "$ref": "#/$defs/Gender" }, { "type": "null" } ], "default": null }, "snap": { "default": 42, "description": "this is the value of snap", "exclusiveMaximum": 50, "exclusiveMinimum": 30, "title": "The Snap", "type": "integer" } }, "required": [ "foo_bar" ], "title": "Main", "type": "object" } ``` 1. This produces a "jsonable" dict of `MainModel`'s schema. 1. Calling `json.dumps` on the schema dict produces a JSON string. The TypeAdapter class lets you create an object with methods for validating, serializing, and producing JSON schemas for arbitrary types. This serves as a complete replacement for `schema_of` in Pydantic V1 (which is now deprecated). Here's an example of generating JSON schema from a TypeAdapter: ```python from pydantic import TypeAdapter adapter = TypeAdapter(list[int]) print(adapter.json_schema()) #> {'items': {'type': 'integer'}, 'type': 'array'} ``` You can also generate JSON schemas for combinations of BaseModels and TypeAdapters, as shown in this example: ```python import json from typing import Union from pydantic import BaseModel, TypeAdapter class Cat(BaseModel): name: str color: str class Dog(BaseModel): name: str breed: str ta = TypeAdapter(Union[Cat, Dog]) ta_schema = ta.json_schema() print(json.dumps(ta_schema, indent=2)) ``` JSON output: ```json { "$defs": { "Cat": { "properties": { "name": { "title": "Name", "type": "string" }, "color": { "title": "Color", "type": "string" } }, "required": [ "name", "color" ], "title": "Cat", "type": "object" }, "Dog": { "properties": { "name": { "title": "Name", "type": "string" }, "breed": { "title": "Breed", "type": "string" } }, "required": [ "name", "breed" ], "title": "Dog", "type": "object" } }, "anyOf": [ { "$ref": "#/$defs/Cat" }, { "$ref": "#/$defs/Dog" } ] } ``` ### Configuring the `JsonSchemaMode` Specify the mode of JSON schema generation via the `mode` parameter in the model_json_schema and TypeAdapter.json_schema methods. By default, the mode is set to `'validation'`, which produces a JSON schema corresponding to the model's validation schema. 
The JsonSchemaMode is a type alias that represents the available options for the `mode` parameter: - `'validation'` - `'serialization'` Here's an example of how to specify the `mode` parameter, and how it affects the generated JSON schema: ```python from decimal import Decimal from pydantic import BaseModel class Model(BaseModel): a: Decimal = Decimal('12.34') print(Model.model_json_schema(mode='validation')) """ { 'properties': { 'a': { 'anyOf': [{'type': 'number'}, {'type': 'string'}], 'default': '12.34', 'title': 'A', } }, 'title': 'Model', 'type': 'object', } """ print(Model.model_json_schema(mode='serialization')) """ { 'properties': {'a': {'default': '12.34', 'title': 'A', 'type': 'string'}}, 'title': 'Model', 'type': 'object', } """ ``` ## Customizing JSON Schema The generated JSON schema can be customized at both the field level and model level via: 1. [Field-level customization](#field-level-customization) with the Field constructor 1. [Model-level customization](#model-level-customization) with model_config At both the field and model levels, you can use the `json_schema_extra` option to add extra information to the JSON schema. The [Using `json_schema_extra`](#using-json_schema_extra) section below provides more details on this option. For custom types, Pydantic offers other tools for customizing JSON schema generation: 1. [`WithJsonSchema` annotation](#withjsonschema-annotation) 1. [`SkipJsonSchema` annotation](#skipjsonschema-annotation) 1. [Implementing `__get_pydantic_core_schema__`](#implementing_get_pydantic_core_schema) 1. [Implementing `__get_pydantic_json_schema__`](#implementing_get_pydantic_json_schema) ### Field-Level Customization Optionally, the Field function can be used to provide extra information about the field and validations. Some field parameters are used exclusively to customize the generated JSON Schema: - `title`: The title of the field. - `description`: The description of the field. - `examples`: The examples of the field. - `json_schema_extra`: Extra JSON Schema properties to be added to the field. - `field_title_generator`: A function that programmatically sets the field's title, based on its name and info. Here's an example: ```python import json from pydantic import BaseModel, EmailStr, Field, SecretStr class User(BaseModel): age: int = Field(description='Age of the user') email: EmailStr = Field(examples=['marcelo@mail.com']) name: str = Field(title='Username') password: SecretStr = Field( json_schema_extra={ 'title': 'Password', 'description': 'Password of the user', 'examples': ['123456'], } ) print(json.dumps(User.model_json_schema(), indent=2)) ``` JSON output: ```json { "properties": { "age": { "description": "Age of the user", "title": "Age", "type": "integer" }, "email": { "examples": [ "marcelo@mail.com" ], "format": "email", "title": "Email", "type": "string" }, "name": { "title": "Username", "type": "string" }, "password": { "description": "Password of the user", "examples": [ "123456" ], "format": "password", "title": "Password", "type": "string", "writeOnly": true } }, "required": [ "age", "email", "name", "password" ], "title": "User", "type": "object" } ``` #### Unenforced `Field` constraints If Pydantic finds constraints which are not being enforced, an error will be raised. 
If you want to force the constraint to appear in the schema, even though it's not being checked upon parsing, you can use variadic arguments to Field with the raw schema attribute name: ```python from pydantic import BaseModel, Field, PositiveInt try: # this won't work since `PositiveInt` takes precedence over the # constraints defined in `Field`, meaning they're ignored class Model(BaseModel): foo: PositiveInt = Field(lt=10) except ValueError as e: print(e) # if you find yourself needing this, an alternative is to declare # the constraints in `Field` (or you could use `conint()`) # here both constraints will be enforced: class ModelB(BaseModel): # Here both constraints will be applied and the schema # will be generated correctly foo: int = Field(gt=0, lt=10) print(ModelB.model_json_schema()) """ { 'properties': { 'foo': { 'exclusiveMaximum': 10, 'exclusiveMinimum': 0, 'title': 'Foo', 'type': 'integer', } }, 'required': ['foo'], 'title': 'ModelB', 'type': 'object', } """ ``` You can specify JSON schema modifications through the Field constructor, via typing.Annotated, as well: ```python import json from typing import Annotated from uuid import uuid4 from pydantic import BaseModel, Field class Foo(BaseModel): id: Annotated[str, Field(default_factory=lambda: uuid4().hex)] name: Annotated[str, Field(max_length=256)] = Field( 'Bar', title='CustomName' ) print(json.dumps(Foo.model_json_schema(), indent=2)) ``` JSON output: ```json { "properties": { "id": { "title": "Id", "type": "string" }, "name": { "default": "Bar", "maxLength": 256, "title": "CustomName", "type": "string" } }, "title": "Foo", "type": "object" } ``` ### Programmatic field title generation The `field_title_generator` parameter can be used to programmatically generate the title for a field based on its name and info. See the following example: ```python import json from pydantic import BaseModel, Field from pydantic.fields import FieldInfo def make_title(field_name: str, field_info: FieldInfo) -> str: return field_name.upper() class Person(BaseModel): name: str = Field(field_title_generator=make_title) age: int = Field(field_title_generator=make_title) print(json.dumps(Person.model_json_schema(), indent=2)) """ { "properties": { "name": { "title": "NAME", "type": "string" }, "age": { "title": "AGE", "type": "integer" } }, "required": [ "name", "age" ], "title": "Person", "type": "object" } """ ``` ### Model-Level Customization You can also use model config to customize JSON schema generation on a model. Specifically, the following config options are relevant: - title - json_schema_extra - json_schema_mode_override - field_title_generator - model_title_generator ### Using `json_schema_extra` The `json_schema_extra` option can be used to add extra information to the JSON schema, either at the [Field level](#field-level-customization) or at the [Model level](#model-level-customization). You can pass a `dict` or a `Callable` to `json_schema_extra`.
#### Using `json_schema_extra` with a `dict` You can pass a `dict` to `json_schema_extra` to add extra information to the JSON schema: ```python import json from pydantic import BaseModel, ConfigDict class Model(BaseModel): a: str model_config = ConfigDict(json_schema_extra={'examples': [{'a': 'Foo'}]}) print(json.dumps(Model.model_json_schema(), indent=2)) ``` JSON output: ```json { "examples": [ { "a": "Foo" } ], "properties": { "a": { "title": "A", "type": "string" } }, "required": [ "a" ], "title": "Model", "type": "object" } ``` #### Using `json_schema_extra` with a `Callable` You can pass a `Callable` to `json_schema_extra` to modify the JSON schema with a function: ```python import json from pydantic import BaseModel, Field def pop_default(s): s.pop('default') class Model(BaseModel): a: int = Field(default=1, json_schema_extra=pop_default) print(json.dumps(Model.model_json_schema(), indent=2)) ``` JSON output: ```json { "properties": { "a": { "title": "A", "type": "integer" } }, "title": "Model", "type": "object" } ``` #### Merging `json_schema_extra` Starting in v2.9, Pydantic merges `json_schema_extra` dictionaries from annotated types. This pattern offers a more additive approach to merging rather than the previous override behavior. This can be quite helpful for cases of reusing json schema extra information across multiple types. We viewed this change largely as a bug fix, as it resolves unintentional differences in the `json_schema_extra` merging behavior between `BaseModel` and `TypeAdapter` instances - see [this issue](https://github.com/pydantic/pydantic/issues/9210) for more details. ```python import json from typing import Annotated from typing_extensions import TypeAlias from pydantic import Field, TypeAdapter ExternalType: TypeAlias = Annotated[ int, Field(json_schema_extra={'key1': 'value1'}) ] ta = TypeAdapter( Annotated[ExternalType, Field(json_schema_extra={'key2': 'value2'})] ) print(json.dumps(ta.json_schema(), indent=2)) """ { "key1": "value1", "key2": "value2", "type": "integer" } """ ``` ```python import json from typing import Annotated from typing import TypeAlias from pydantic import Field, TypeAdapter ExternalType: TypeAlias = Annotated[ int, Field(json_schema_extra={'key1': 'value1'}) ] ta = TypeAdapter( Annotated[ExternalType, Field(json_schema_extra={'key2': 'value2'})] ) print(json.dumps(ta.json_schema(), indent=2)) """ { "key1": "value1", "key2": "value2", "type": "integer" } """ ``` Note We no longer (and never fully did) support composing a mix of `dict` and `callable` type `json_schema_extra` specifications. If this is a requirement for your use case, please [open a pydantic issue](https://github.com/pydantic/pydantic/issues/new/choose) and explain your situation - we'd be happy to reconsider this decision when presented with a compelling case. ### `WithJsonSchema` annotation API Documentation pydantic.json_schema.WithJsonSchema Tip Using WithJsonSchema is preferred over [implementing `__get_pydantic_json_schema__`](#implementing_get_pydantic_json_schema) for custom types, as it's more simple and less error-prone. The WithJsonSchema annotation can be used to override the generated (base) JSON schema for a given type without the need to implement `__get_pydantic_core_schema__` or `__get_pydantic_json_schema__` on the type itself. Note that this overrides the whole JSON Schema generation process for the field (in the following example, the `'type'` also needs to be provided). 
```python import json from typing import Annotated from pydantic import BaseModel, WithJsonSchema MyInt = Annotated[ int, WithJsonSchema({'type': 'integer', 'examples': [1, 0, -1]}), ] class Model(BaseModel): a: MyInt print(json.dumps(Model.model_json_schema(), indent=2)) ``` JSON output: ```json { "properties": { "a": { "examples": [ 1, 0, -1 ], "title": "A", "type": "integer" } }, "required": [ "a" ], "title": "Model", "type": "object" } ``` Note You might be tempted to use the WithJsonSchema annotation to fine-tune the JSON Schema of fields having [validators](../validators/) attached. Instead, it is recommended to use [the `json_schema_input_type` argument](../validators/#json-schema-and-field-validators). ### `SkipJsonSchema` annotation API Documentation pydantic.json_schema.SkipJsonSchema The SkipJsonSchema annotation can be used to skip an included field (or part of a field's specifications) from the generated JSON schema. See the API docs for more details. ### Implementing `__get_pydantic_core_schema__` Custom types (used as `field_name: TheType` or `field_name: Annotated[TheType, ...]`) as well as `Annotated` metadata (used as `field_name: Annotated[int, SomeMetadata]`) can modify or override the generated schema by implementing `__get_pydantic_core_schema__`. This method receives two positional arguments: 1. The type annotation that corresponds to this type (so in the case of `TheType[T][int]` it would be `TheType[int]`). 1. A handler/callback to call the next implementer of `__get_pydantic_core_schema__`. The handler system works just like [*wrap* field validators](../validators/#field-wrap-validator). In this case the input is the type and the output is a `core_schema`. Here is an example of a custom type that *overrides* the generated `core_schema`: ```python from dataclasses import dataclass from typing import Any from pydantic_core import core_schema from pydantic import BaseModel, GetCoreSchemaHandler @dataclass class CompressedString: dictionary: dict[int, str] text: list[int] def build(self) -> str: return ' '.join([self.dictionary[key] for key in self.text]) @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: assert source is CompressedString return core_schema.no_info_after_validator_function( cls._validate, core_schema.str_schema(), serialization=core_schema.plain_serializer_function_ser_schema( cls._serialize, info_arg=False, return_schema=core_schema.str_schema(), ), ) @staticmethod def _validate(value: str) -> 'CompressedString': inverse_dictionary: dict[str, int] = {} text: list[int] = [] for word in value.split(' '): if word not in inverse_dictionary: inverse_dictionary[word] = len(inverse_dictionary) text.append(inverse_dictionary[word]) return CompressedString( {v: k for k, v in inverse_dictionary.items()}, text ) @staticmethod def _serialize(value: 'CompressedString') -> str: return value.build() class MyModel(BaseModel): value: CompressedString print(MyModel.model_json_schema()) """ { 'properties': {'value': {'title': 'Value', 'type': 'string'}}, 'required': ['value'], 'title': 'MyModel', 'type': 'object', } """ print(MyModel(value='fox fox fox dog fox')) """ value = CompressedString(dictionary={0: 'fox', 1: 'dog'}, text=[0, 0, 0, 1, 0]) """ print(MyModel(value='fox fox fox dog fox').model_dump(mode='json')) #> {'value': 'fox fox fox dog fox'} ``` Since Pydantic would not know how to generate a schema for `CompressedString`, if you call `handler(source)` in its 
`__get_pydantic_core_schema__` method you would get a `pydantic.errors.PydanticSchemaGenerationError` error. This will be the case for most custom types, so you almost never want to call into `handler` for custom types. The process for `Annotated` metadata is much the same except that you can generally call into `handler` to have Pydantic handle generating the schema. ```python from dataclasses import dataclass from typing import Annotated, Any, Sequence from pydantic_core import core_schema from pydantic import BaseModel, GetCoreSchemaHandler, ValidationError @dataclass class RestrictCharacters: alphabet: Sequence[str] def __get_pydantic_core_schema__( self, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if not self.alphabet: raise ValueError('Alphabet may not be empty') schema = handler( source ) # get the CoreSchema from the type / inner constraints if schema['type'] != 'str': raise TypeError('RestrictCharacters can only be applied to strings') return core_schema.no_info_after_validator_function( self.validate, schema, ) def validate(self, value: str) -> str: if any(c not in self.alphabet for c in value): raise ValueError( f'{value!r} is not restricted to {self.alphabet!r}' ) return value class MyModel(BaseModel): value: Annotated[str, RestrictCharacters('ABC')] print(MyModel.model_json_schema()) """ { 'properties': {'value': {'title': 'Value', 'type': 'string'}}, 'required': ['value'], 'title': 'MyModel', 'type': 'object', } """ print(MyModel(value='CBA')) #> value='CBA' try: MyModel(value='XYZ') except ValidationError as e: print(e) """ 1 validation error for MyModel value Value error, 'XYZ' is not restricted to 'ABC' [type=value_error, input_value='XYZ', input_type=str] """ ``` ```python from dataclasses import dataclass from typing import Annotated, Any from collections.abc import Sequence from pydantic_core import core_schema from pydantic import BaseModel, GetCoreSchemaHandler, ValidationError @dataclass class RestrictCharacters: alphabet: Sequence[str] def __get_pydantic_core_schema__( self, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if not self.alphabet: raise ValueError('Alphabet may not be empty') schema = handler( source ) # get the CoreSchema from the type / inner constraints if schema['type'] != 'str': raise TypeError('RestrictCharacters can only be applied to strings') return core_schema.no_info_after_validator_function( self.validate, schema, ) def validate(self, value: str) -> str: if any(c not in self.alphabet for c in value): raise ValueError( f'{value!r} is not restricted to {self.alphabet!r}' ) return value class MyModel(BaseModel): value: Annotated[str, RestrictCharacters('ABC')] print(MyModel.model_json_schema()) """ { 'properties': {'value': {'title': 'Value', 'type': 'string'}}, 'required': ['value'], 'title': 'MyModel', 'type': 'object', } """ print(MyModel(value='CBA')) #> value='CBA' try: MyModel(value='XYZ') except ValidationError as e: print(e) """ 1 validation error for MyModel value Value error, 'XYZ' is not restricted to 'ABC' [type=value_error, input_value='XYZ', input_type=str] """ ``` So far we have been wrapping the schema, but if you just want to *modify* it or *ignore* it you can as well. 
To modify the schema, first call the handler, then mutate the result: ```python from typing import Annotated, Any from pydantic_core import ValidationError, core_schema from pydantic import BaseModel, GetCoreSchemaHandler class SmallString: def __get_pydantic_core_schema__( self, source: type[Any], handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: schema = handler(source) assert schema['type'] == 'str' schema['max_length'] = 10 # modify in place return schema class MyModel(BaseModel): value: Annotated[str, SmallString()] try: MyModel(value='too long!!!!!') except ValidationError as e: print(e) """ 1 validation error for MyModel value String should have at most 10 characters [type=string_too_long, input_value='too long!!!!!', input_type=str] """ ``` Tip Note that you *must* return a schema, even if you are just mutating it in place. To override the schema completely, do not call the handler and return your own `CoreSchema`: ```python from typing import Annotated, Any from pydantic_core import ValidationError, core_schema from pydantic import BaseModel, GetCoreSchemaHandler class AllowAnySubclass: def __get_pydantic_core_schema__( self, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: # we can't call handler since it will fail for arbitrary types def validate(value: Any) -> Any: if not isinstance(value, source): raise ValueError( f'Expected an instance of {source}, got an instance of {type(value)}' ) return value return core_schema.no_info_plain_validator_function(validate) class Foo: pass class Model(BaseModel): f: Annotated[Foo, AllowAnySubclass()] print(Model(f=Foo())) #> f=<__main__.Foo object at 0x0123456789ab> class NotFoo: pass try: Model(f=NotFoo()) except ValidationError as e: print(e) """ 1 validation error for Model f Value error, Expected an instance of <class '__main__.Foo'>, got an instance of <class '__main__.NotFoo'> [type=value_error, input_value=<__main__.NotFoo object at 0x0123456789ab>, input_type=NotFoo] """ ``` ### Implementing `__get_pydantic_json_schema__` You can also implement `__get_pydantic_json_schema__` to modify or override the generated JSON schema. Modifying this method only affects the JSON schema - it doesn't affect the core schema, which is used for validation and serialization.
Here's an example of modifying the generated JSON schema: ```python import json from typing import Any from pydantic_core import core_schema as cs from pydantic import GetCoreSchemaHandler, GetJsonSchemaHandler, TypeAdapter from pydantic.json_schema import JsonSchemaValue class Person: name: str age: int def __init__(self, name: str, age: int): self.name = name self.age = age @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> cs.CoreSchema: return cs.typed_dict_schema( { 'name': cs.typed_dict_field(cs.str_schema()), 'age': cs.typed_dict_field(cs.int_schema()), }, ) @classmethod def __get_pydantic_json_schema__( cls, core_schema: cs.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: json_schema = handler(core_schema) json_schema = handler.resolve_ref_schema(json_schema) json_schema['examples'] = [ { 'name': 'John Doe', 'age': 25, } ] json_schema['title'] = 'Person' return json_schema print(json.dumps(TypeAdapter(Person).json_schema(), indent=2)) ``` JSON output: ```json { "examples": [ { "age": 25, "name": "John Doe" } ], "properties": { "name": { "title": "Name", "type": "string" }, "age": { "title": "Age", "type": "integer" } }, "required": [ "name", "age" ], "title": "Person", "type": "object" } ``` ### Using `field_title_generator` The `field_title_generator` parameter can be used to programmatically generate the title for a field based on its name and info. This is similar to the field level `field_title_generator`, but the `ConfigDict` option will be applied to all fields of the class. See the following example: ```python import json from pydantic import BaseModel, ConfigDict class Person(BaseModel): model_config = ConfigDict( field_title_generator=lambda field_name, field_info: field_name.upper() ) name: str age: int print(json.dumps(Person.model_json_schema(), indent=2)) """ { "properties": { "name": { "title": "NAME", "type": "string" }, "age": { "title": "AGE", "type": "integer" } }, "required": [ "name", "age" ], "title": "Person", "type": "object" } """ ``` ### Using `model_title_generator` The `model_title_generator` config option is similar to the `field_title_generator` option, but it applies to the title of the model itself, and accepts the model class as input. See the following example: ```python import json from pydantic import BaseModel, ConfigDict def make_title(model: type) -> str: return f'Title-{model.__name__}' class Person(BaseModel): model_config = ConfigDict(model_title_generator=make_title) name: str age: int print(json.dumps(Person.model_json_schema(), indent=2)) """ { "properties": { "name": { "title": "Name", "type": "string" }, "age": { "title": "Age", "type": "integer" } }, "required": [ "name", "age" ], "title": "Title-Person", "type": "object" } """ ``` ## JSON schema types Types, custom field types, and constraints (like `max_length`) are mapped to the corresponding spec formats in the following priority order (when there is an equivalent available): 1. [JSON Schema Core](https://json-schema.org/draft/2020-12/json-schema-core) 1. [JSON Schema Validation](https://json-schema.org/draft/2020-12/json-schema-validation) 1. [OpenAPI Data Types](https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.2.md#data-types) 1. The standard `format` JSON field is used to define Pydantic extensions for more complex `string` sub-types. 
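For instance, a Python `datetime` has no JSON-native representation, so it is expressed as a `string` with a `format` marker. A small sketch (the `Appointment` model is hypothetical):

```python
from datetime import datetime

from pydantic import BaseModel


class Appointment(BaseModel):
    start: datetime


# `datetime` maps to {'type': 'string', 'format': 'date-time'}
print(Appointment.model_json_schema()['properties']['start'])
#> {'format': 'date-time', 'title': 'Start', 'type': 'string'}
```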
The field schema mapping from Python or Pydantic to JSON schema is done as follows: {{ schema_mappings_table }} ## Top-level schema generation You can also generate a top-level JSON schema that only includes a list of models and related sub-models in its `$defs`: ```python import json from pydantic import BaseModel from pydantic.json_schema import models_json_schema class Foo(BaseModel): a: str = None class Model(BaseModel): b: Foo class Bar(BaseModel): c: int _, top_level_schema = models_json_schema( [(Model, 'validation'), (Bar, 'validation')], title='My Schema' ) print(json.dumps(top_level_schema, indent=2)) ``` JSON output: ```json { "$defs": { "Bar": { "properties": { "c": { "title": "C", "type": "integer" } }, "required": [ "c" ], "title": "Bar", "type": "object" }, "Foo": { "properties": { "a": { "default": null, "title": "A", "type": "string" } }, "title": "Foo", "type": "object" }, "Model": { "properties": { "b": { "$ref": "#/$defs/Foo" } }, "required": [ "b" ], "title": "Model", "type": "object" } }, "title": "My Schema" } ``` ## Customizing the JSON Schema Generation Process API Documentation pydantic.json_schema If you need custom schema generation, you can use a `schema_generator`, modifying the GenerateJsonSchema class as necessary for your application. The various methods that can be used to produce JSON schema accept a keyword argument `schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema`, and you can pass your custom subclass to these methods in order to use your own approach to generating JSON schema. `GenerateJsonSchema` implements the translation of a type's `pydantic-core` schema into a JSON schema. By design, this class breaks the JSON schema generation process into smaller methods that can be easily overridden in subclasses to modify the "global" approach to generating JSON schema. 
```python from pydantic import BaseModel from pydantic.json_schema import GenerateJsonSchema class MyGenerateJsonSchema(GenerateJsonSchema): def generate(self, schema, mode='validation'): json_schema = super().generate(schema, mode=mode) json_schema['title'] = 'Customize title' json_schema['$schema'] = self.schema_dialect return json_schema class MyModel(BaseModel): x: int print(MyModel.model_json_schema(schema_generator=MyGenerateJsonSchema)) """ { 'properties': {'x': {'title': 'X', 'type': 'integer'}}, 'required': ['x'], 'title': 'Customize title', 'type': 'object', '$schema': 'https://json-schema.org/draft/2020-12/schema', } """ ``` Below is an approach you can use to exclude any fields from the schema that don't have valid json schemas: ```python from typing import Callable from pydantic_core import PydanticOmit, core_schema from pydantic import BaseModel from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class MyGenerateJsonSchema(GenerateJsonSchema): def handle_invalid_for_json_schema( self, schema: core_schema.CoreSchema, error_info: str ) -> JsonSchemaValue: raise PydanticOmit def example_callable(): return 1 class Example(BaseModel): name: str = 'example' function: Callable = example_callable instance_example = Example() validation_schema = instance_example.model_json_schema( schema_generator=MyGenerateJsonSchema, mode='validation' ) print(validation_schema) """ { 'properties': { 'name': {'default': 'example', 'title': 'Name', 'type': 'string'} }, 'title': 'Example', 'type': 'object', } """ ``` ```python from collections.abc import Callable from pydantic_core import PydanticOmit, core_schema from pydantic import BaseModel from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class MyGenerateJsonSchema(GenerateJsonSchema): def handle_invalid_for_json_schema( self, schema: core_schema.CoreSchema, error_info: str ) -> JsonSchemaValue: raise PydanticOmit def example_callable(): return 1 class Example(BaseModel): name: str = 'example' function: Callable = example_callable instance_example = Example() validation_schema = instance_example.model_json_schema( schema_generator=MyGenerateJsonSchema, mode='validation' ) print(validation_schema) """ { 'properties': { 'name': {'default': 'example', 'title': 'Name', 'type': 'string'} }, 'title': 'Example', 'type': 'object', } """ ``` ### JSON schema sorting By default, Pydantic recursively sorts JSON schemas by alphabetically sorting keys. Notably, Pydantic skips sorting the values of the `properties` key, to preserve the order of the fields as they were defined in the model. If you would like to customize this behavior, you can override the `sort` method in your custom `GenerateJsonSchema` subclass. 
The below example uses a no-op `sort` method to disable sorting entirely, which is reflected in the preserved order of the model fields and `json_schema_extra` keys: ```python import json from typing import Optional from pydantic import BaseModel, Field from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class MyGenerateJsonSchema(GenerateJsonSchema): def sort( self, value: JsonSchemaValue, parent_key: Optional[str] = None ) -> JsonSchemaValue: """No-op, we don't want to sort schema values at all.""" return value class Bar(BaseModel): c: str b: str a: str = Field(json_schema_extra={'c': 'hi', 'b': 'hello', 'a': 'world'}) json_schema = Bar.model_json_schema(schema_generator=MyGenerateJsonSchema) print(json.dumps(json_schema, indent=2)) """ { "type": "object", "properties": { "c": { "type": "string", "title": "C" }, "b": { "type": "string", "title": "B" }, "a": { "type": "string", "c": "hi", "b": "hello", "a": "world", "title": "A" } }, "required": [ "c", "b", "a" ], "title": "Bar" } """ ``` ```python import json from pydantic import BaseModel, Field from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue class MyGenerateJsonSchema(GenerateJsonSchema): def sort( self, value: JsonSchemaValue, parent_key: str | None = None ) -> JsonSchemaValue: """No-op, we don't want to sort schema values at all.""" return value class Bar(BaseModel): c: str b: str a: str = Field(json_schema_extra={'c': 'hi', 'b': 'hello', 'a': 'world'}) json_schema = Bar.model_json_schema(schema_generator=MyGenerateJsonSchema) print(json.dumps(json_schema, indent=2)) """ { "type": "object", "properties": { "c": { "type": "string", "title": "C" }, "b": { "type": "string", "title": "B" }, "a": { "type": "string", "c": "hi", "b": "hello", "a": "world", "title": "A" } }, "required": [ "c", "b", "a" ], "title": "Bar" } """ ``` ## Customizing the `$ref`s in JSON Schema The format of `$ref`s can be altered by calling model_json_schema() or model_dump_json() with the `ref_template` keyword argument. The definitions are always stored under the key `$defs`, but a specified prefix can be used for the references. This is useful if you need to extend or modify the JSON schema default definitions location. For example, with OpenAPI: ```python import json from pydantic import BaseModel from pydantic.type_adapter import TypeAdapter class Foo(BaseModel): a: int class Model(BaseModel): a: Foo adapter = TypeAdapter(Model) print( json.dumps( adapter.json_schema(ref_template='#/components/schemas/{model}'), indent=2, ) ) ``` JSON output: ```json { "$defs": { "Foo": { "properties": { "a": { "title": "A", "type": "integer" } }, "required": [ "a" ], "title": "Foo", "type": "object" } }, "properties": { "a": { "$ref": "#/components/schemas/Foo" } }, "required": [ "a" ], "title": "Model", "type": "object" } ``` ## Miscellaneous Notes on JSON Schema Generation - The JSON schema for `Optional` fields indicates that the value `null` is allowed. - The `Decimal` type is exposed in JSON schema (and serialized) as a string. - Since the `namedtuple` type doesn't exist in JSON, a model's JSON schema does not preserve `namedtuple`s as `namedtuple`s. - Sub-models used are added to the `$defs` JSON attribute and referenced, as per the spec. - Sub-models with modifications (via the `Field` class) like a custom title, description, or default value, are recursively included instead of referenced. - The `description` for models is taken from either the docstring of the class or the argument `description` to the `Field` class. 
- The schema is generated by default using aliases as keys, but it can be generated using model property names instead by calling model_json_schema() or model_dump_json() with the `by_alias=False` keyword argument. API Documentation pydantic.main.BaseModel One of the primary ways of defining schema in Pydantic is via models. Models are simply classes which inherit from BaseModel and define fields as annotated attributes. You can think of models as similar to structs in languages like C, or as the requirements of a single endpoint in an API. Models share many similarities with Python's dataclasses, but have been designed with some subtle-yet-important differences that streamline certain workflows related to validation, serialization, and JSON schema generation. You can find more discussion of this in the [Dataclasses](../dataclasses/) section of the docs. Untrusted data can be passed to a model and, after parsing and validation, Pydantic guarantees that the fields of the resultant model instance will conform to the field types defined on the model. Validation — a *deliberate* misnomer ### TL;DR We use the term "validation" to refer to the process of instantiating a model (or other type) that adheres to specified types and constraints. This task, which Pydantic is well known for, is most widely recognized as "validation" in colloquial terms, even though in other contexts the term "validation" may be more restrictive. ______________________________________________________________________ ### The long version The potential confusion around the term "validation" arises from the fact that, strictly speaking, Pydantic's primary focus doesn't align precisely with the dictionary definition of "validation": > ### validation > > *noun* the action of checking or proving the validity or accuracy of something. In Pydantic, the term "validation" refers to the process of instantiating a model (or other type) that adheres to specified types and constraints. Pydantic guarantees the types and constraints of the output, not the input data. This distinction becomes apparent when considering that Pydantic's `ValidationError` is raised when data cannot be successfully parsed into a model instance. While this distinction may initially seem subtle, it holds practical significance. In some cases, "validation" goes beyond just model creation, and can include the copying and coercion of data. This can involve copying arguments passed to the constructor in order to perform coercion to a new type without mutating the original input data. For a more in-depth understanding of the implications for your usage, refer to the [Data Conversion](#data-conversion) and [Attribute Copies](#attribute-copies) sections below. In essence, Pydantic's primary goal is to assure that the resulting structure post-processing (termed "validation") precisely conforms to the applied type hints. Given the widespread adoption of "validation" as the colloquial term for this process, we will consistently use it in our documentation. While the terms "parse" and "validation" were previously used interchangeably, moving forward, we aim to exclusively employ "validate", with "parse" reserved specifically for discussions related to [JSON parsing](../json/). ## Basic model usage Note Pydantic relies heavily on the existing Python typing constructs to define models. 
If you are not familiar with those, the following resources can be useful: - The [Type System Guides](https://typing.readthedocs.io/en/latest/guides/index.html) - The [mypy documentation](https://mypy.readthedocs.io/en/latest/) ```python from pydantic import BaseModel, ConfigDict class User(BaseModel): id: int name: str = 'Jane Doe' model_config = ConfigDict(str_max_length=10) # (1)! ``` 1. Pydantic models support a variety of [configuration values](../config/) (see here for the available configuration values). In this example, `User` is a model with two fields: - `id`, which is an integer and is required - `name`, which is a string and is not required (it has a default value). Fields can be customized in a number of ways using the Field() function. See the [documentation on fields](../fields/) for more information. The model can then be instantiated: ```python user = User(id='123') ``` `user` is an instance of `User`. Initialization of the object will perform all parsing and validation. If no ValidationError exception is raised, you know the resulting model instance is valid. Fields of a model can be accessed as normal attributes of the `user` object: ```python assert user.name == 'Jane Doe' # (1)! assert user.id == 123 # (2)! assert isinstance(user.id, int) ``` 1. `name` wasn't set when `user` was initialized, so the default value was used. The model_fields_set attribute can be inspected to check the field names explicitly set during instantiation. 1. Note that the string `'123'` was coerced to an integer and its value is `123`. More details on Pydantic's coercion logic can be found in the [data conversion](#data-conversion) section. The model instance can be serialized using the model_dump() method: ```python assert user.model_dump() == {'id': 123, 'name': 'Jane Doe'} ``` Calling dict on the instance will also provide a dictionary, but nested fields will not be recursively converted into dictionaries. model_dump() also provides numerous arguments to customize the serialization result. By default, models are mutable and field values can be changed through attribute assignment: ```python user.id = 321 assert user.id == 321 ``` Warning When defining your models, watch out for naming collisions between your field name and its type annotation. For example, the following will not behave as expected and would yield a validation error: ```python from typing import Optional from pydantic import BaseModel class Boo(BaseModel): int: Optional[int] = None m = Boo(int=123) # Will fail to validate. ``` Because of how Python evaluates annotated assignment statements, the statement is equivalent to `int: None = None`, thus leading to a validation error. ### Model methods and properties The example above only shows the tip of the iceberg of what models can do. Models possess the following methods and attributes: - model_validate(): Validates the given object against the Pydantic model. See [Validating data](#validating-data). - model_validate_json(): Validates the given JSON data against the Pydantic model. See [Validating data](#validating-data). - model_construct(): Creates models without running validation. See [Creating models without validation](#creating-models-without-validation). - model_dump(): Returns a dictionary of the model's fields and values. See [Serialization](../serialization/#model_dump). - model_dump_json(): Returns a JSON string representation of model_dump(). See [Serialization](../serialization/#model_dump_json). 
- model_copy(): Returns a copy (by default, shallow copy) of the model. See [Serialization](../serialization/#model_copy). - model_json_schema(): Returns a jsonable dictionary representing the model's JSON Schema. See [JSON Schema](../json_schema/). - model_fields: A mapping between field names and their definitions (FieldInfo instances). - model_computed_fields: A mapping between computed field names and their definitions (ComputedFieldInfo instances). - model_extra: The extra fields set during validation. - model_fields_set: The set of fields which were explicitly provided when the model was initialized. - model_parametrized_name(): Computes the class name for parametrizations of generic classes. - model_post_init(): Performs additional actions after the model is instantiated and all field validators are applied. - model_rebuild(): Rebuilds the model schema, which also supports building recursive generic models. See [Rebuilding model schema](#rebuilding-model-schema). Note See the API documentation of BaseModel for the class definition including a full list of methods and attributes. Tip See [Changes to `pydantic.BaseModel`](../../migration/#changes-to-pydanticbasemodel) in the [Migration Guide](../../migration/) for details on changes from Pydantic V1. ## Data conversion Pydantic may cast input data to force it to conform to model field types, and in some cases this may result in a loss of information. For example: ```python from pydantic import BaseModel class Model(BaseModel): a: int b: float c: str print(Model(a=3.000, b='2.72', c=b'binary data').model_dump()) #> {'a': 3, 'b': 2.72, 'c': 'binary data'} ``` This is a deliberate decision of Pydantic, and is frequently the most useful approach. See [here](https://github.com/pydantic/pydantic/issues/578) for a longer discussion on the subject. Nevertheless, Pydantic provides a [strict mode](../strict_mode/), where no data conversion is performed. Values must be of the same type as the declared field type. This is also the case for collections. In most cases, you shouldn't use abstract container classes; just use a concrete type, such as list: ```python from pydantic import BaseModel class Model(BaseModel): items: list[int] # (1)! print(Model(items=(1, 2, 3))) #> items=[1, 2, 3] ``` 1. In this case, you might be tempted to use the abstract Sequence type to allow both lists and tuples. But Pydantic takes care of converting the tuple input to a list, so in most cases this isn't necessary. Besides, using these abstract types can also lead to [poor validation performance](../performance/#sequence-vs-list-or-tuple-with-mapping-vs-dict), and in general using concrete container types will avoid unnecessary checks. ## Extra data By default, Pydantic models **won't error when you provide extra data**, and these values will simply be ignored: ```python from pydantic import BaseModel class Model(BaseModel): x: int m = Model(x=1, y='a') assert m.model_dump() == {'x': 1} ``` The extra configuration value can be used to control this behavior: ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): x: int model_config = ConfigDict(extra='allow') m = Model(x=1, y='a') # (1)! assert m.model_dump() == {'x': 1, 'y': 'a'} assert m.__pydantic_extra__ == {'y': 'a'} ``` 1. If extra was set to `'forbid'`, this would fail. The configuration can take three values: - `'ignore'`: Providing extra data is ignored (the default). - `'forbid'`: Providing extra data is not permitted.
- `'allow'`: Providing extra data is allowed and stored in the `__pydantic_extra__` dictionary attribute. The `__pydantic_extra__` attribute can be explicitly annotated to provide validation for extra fields. For more details, refer to the extra API documentation. Pydantic dataclasses also support extra data (see the [dataclass configuration](../dataclasses/#dataclass-config) section). ## Nested models More complex hierarchical data structures can be defined using models themselves as types in annotations. ```python from typing import Optional from pydantic import BaseModel class Foo(BaseModel): count: int size: Optional[float] = None class Bar(BaseModel): apple: str = 'x' banana: str = 'y' class Spam(BaseModel): foo: Foo bars: list[Bar] m = Spam(foo={'count': 4}, bars=[{'apple': 'x1'}, {'apple': 'x2'}]) print(m) """ foo=Foo(count=4, size=None) bars=[Bar(apple='x1', banana='y'), Bar(apple='x2', banana='y')] """ print(m.model_dump()) """ { 'foo': {'count': 4, 'size': None}, 'bars': [{'apple': 'x1', 'banana': 'y'}, {'apple': 'x2', 'banana': 'y'}], } """ ``` ```python from pydantic import BaseModel class Foo(BaseModel): count: int size: float | None = None class Bar(BaseModel): apple: str = 'x' banana: str = 'y' class Spam(BaseModel): foo: Foo bars: list[Bar] m = Spam(foo={'count': 4}, bars=[{'apple': 'x1'}, {'apple': 'x2'}]) print(m) """ foo=Foo(count=4, size=None) bars=[Bar(apple='x1', banana='y'), Bar(apple='x2', banana='y')] """ print(m.model_dump()) """ { 'foo': {'count': 4, 'size': None}, 'bars': [{'apple': 'x1', 'banana': 'y'}, {'apple': 'x2', 'banana': 'y'}], } """ ``` Self-referencing models are supported. For more details, see the documentation related to [forward annotations](../forward_annotations/#self-referencing-or-recursive-models). ## Rebuilding model schema When you define a model class in your code, Pydantic will analyze the body of the class to collect a variety of information required to perform validation and serialization, gathered in a core schema. Notably, the model's type annotations are evaluated to understand the valid types for each field (more information can be found in the [Architecture](../../internals/architecture/) documentation). However, it might be the case that annotations refer to symbols not defined when the model class is being created. To circumvent this issue, the model_rebuild() method can be used: ```python from pydantic import BaseModel, PydanticUserError class Foo(BaseModel): x: 'Bar' # (1)! try: Foo.model_json_schema() except PydanticUserError as e: print(e) """ `Foo` is not fully defined; you should define `Bar`, then call `Foo.model_rebuild()`. For further information visit https://errors.pydantic.dev/2/u/class-not-fully-defined """ class Bar(BaseModel): pass Foo.model_rebuild() print(Foo.model_json_schema()) """ { '$defs': {'Bar': {'properties': {}, 'title': 'Bar', 'type': 'object'}}, 'properties': {'x': {'$ref': '#/$defs/Bar'}}, 'required': ['x'], 'title': 'Foo', 'type': 'object', } """ ``` 1. `Bar` is not yet defined when the `Foo` class is being created. For this reason, a [forward annotation](../forward_annotations/) is being used. Pydantic tries to determine when this is necessary automatically, and raises an error if it wasn't done, but you may want to call model_rebuild() proactively when dealing with recursive models or generics. In V2, model_rebuild() replaced `update_forward_refs()` from V1. There are some slight differences with the new behavior.
The biggest change is that when calling model_rebuild() on the outermost model, it builds a core schema used for validation of the whole model (nested models and all), so all types at all levels need to be ready before model_rebuild() is called. ## Arbitrary class instances (Formerly known as "ORM Mode"/`from_orm`). Pydantic models can also be created from arbitrary class instances by reading the instance attributes corresponding to the model field names. One common application of this functionality is integration with object-relational mappings (ORMs). To do this, set the from_attributes config value to `True` (see the documentation on [Configuration](../config/) for more details). The example here uses [SQLAlchemy](https://www.sqlalchemy.org/), but the same approach should work for any ORM. ```python from typing import Annotated from sqlalchemy import ARRAY, String from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column from pydantic import BaseModel, ConfigDict, StringConstraints class Base(DeclarativeBase): pass class CompanyOrm(Base): __tablename__ = 'companies' id: Mapped[int] = mapped_column(primary_key=True, nullable=False) public_key: Mapped[str] = mapped_column( String(20), index=True, nullable=False, unique=True ) domains: Mapped[list[str]] = mapped_column(ARRAY(String(255))) class CompanyModel(BaseModel): model_config = ConfigDict(from_attributes=True) id: int public_key: Annotated[str, StringConstraints(max_length=20)] domains: list[Annotated[str, StringConstraints(max_length=255)]] co_orm = CompanyOrm( id=123, public_key='foobar', domains=['example.com', 'foobar.com'], ) print(co_orm) #> <__main__.CompanyOrm object at 0x0123456789ab> co_model = CompanyModel.model_validate(co_orm) print(co_model) #> id=123 public_key='foobar' domains=['example.com', 'foobar.com'] ``` ### Nested attributes When using attributes to parse models, model instances will be created from both top-level attributes and deeper-nested attributes as appropriate. Here is an example demonstrating the principle: ```python from pydantic import BaseModel, ConfigDict class PetCls: def __init__(self, *, name: str, species: str): self.name = name self.species = species class PersonCls: def __init__(self, *, name: str, age: float = None, pets: list[PetCls]): self.name = name self.age = age self.pets = pets class Pet(BaseModel): model_config = ConfigDict(from_attributes=True) name: str species: str class Person(BaseModel): model_config = ConfigDict(from_attributes=True) name: str age: float = None pets: list[Pet] bones = PetCls(name='Bones', species='dog') orion = PetCls(name='Orion', species='cat') anna = PersonCls(name='Anna', age=20, pets=[bones, orion]) anna_model = Person.model_validate(anna) print(anna_model) """ name='Anna' age=20.0 pets=[Pet(name='Bones', species='dog'), Pet(name='Orion', species='cat')] """ ``` ## Error handling Pydantic will raise a ValidationError exception whenever it finds an error in the data it's validating. A single exception will be raised regardless of the number of errors found, and that validation error will contain information about all of the errors and how they happened. See [Error Handling](../../errors/errors/) for details on standard and custom errors. 
As a demonstration: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): list_of_ints: list[int] a_float: float data = dict( list_of_ints=['1', 2, 'bad'], a_float='not a float', ) try: Model(**data) except ValidationError as e: print(e) """ 2 validation errors for Model list_of_ints.2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='bad', input_type=str] a_float Input should be a valid number, unable to parse string as a number [type=float_parsing, input_value='not a float', input_type=str] """ ``` ## Validating data Pydantic provides three methods on model classes for parsing data: - model_validate(): this is very similar to the `__init__` method of the model, except it takes a dictionary or an object rather than keyword arguments. If the object passed cannot be validated, or if it's not a dictionary or instance of the model in question, a ValidationError will be raised. - model_validate_json(): this validates the provided data as a JSON string or `bytes` object. If your incoming data is a JSON payload, this is generally faster than manually parsing the data into a dictionary first. Learn more about JSON parsing in the [JSON](../json/) section of the docs. - model_validate_strings(): this takes a dictionary (can be nested) with string keys and values and validates the data in JSON mode so that said strings can be coerced into the correct types. ```python from datetime import datetime from typing import Optional from pydantic import BaseModel, ValidationError class User(BaseModel): id: int name: str = 'John Doe' signup_ts: Optional[datetime] = None m = User.model_validate({'id': 123, 'name': 'James'}) print(m) #> id=123 name='James' signup_ts=None try: User.model_validate(['not', 'a', 'dict']) except ValidationError as e: print(e) """ 1 validation error for User Input should be a valid dictionary or instance of User [type=model_type, input_value=['not', 'a', 'dict'], input_type=list] """ m = User.model_validate_json('{"id": 123, "name": "James"}') print(m) #> id=123 name='James' signup_ts=None try: m = User.model_validate_json('{"id": 123, "name": 123}') except ValidationError as e: print(e) """ 1 validation error for User name Input should be a valid string [type=string_type, input_value=123, input_type=int] """ try: m = User.model_validate_json('invalid JSON') except ValidationError as e: print(e) """ 1 validation error for User Invalid JSON: expected value at line 1 column 1 [type=json_invalid, input_value='invalid JSON', input_type=str] """ m = User.model_validate_strings({'id': '123', 'name': 'James'}) print(m) #> id=123 name='James' signup_ts=None m = User.model_validate_strings( {'id': '123', 'name': 'James', 'signup_ts': '2024-04-01T12:00:00'} ) print(m) #> id=123 name='James' signup_ts=datetime.datetime(2024, 4, 1, 12, 0) try: m = User.model_validate_strings( {'id': '123', 'name': 'James', 'signup_ts': '2024-04-01'}, strict=True ) except ValidationError as e: print(e) """ 1 validation error for User signup_ts Input should be a valid datetime, invalid datetime separator, expected `T`, `t`, `_` or space [type=datetime_parsing, input_value='2024-04-01', input_type=str] """ ``` ```python from datetime import datetime from pydantic import BaseModel, ValidationError class User(BaseModel): id: int name: str = 'John Doe' signup_ts: datetime | None = None m = User.model_validate({'id': 123, 'name': 'James'}) print(m) #> id=123 name='James' signup_ts=None try: User.model_validate(['not', 'a',
## Validating data

Pydantic provides three methods on model classes for parsing data:

- model_validate(): this is very similar to the `__init__` method of the model, except it takes a dictionary or an object rather than keyword arguments. If the object passed cannot be validated, or if it's not a dictionary or instance of the model in question, a ValidationError will be raised.
- model_validate_json(): this validates the provided data as a JSON string or `bytes` object. If your incoming data is a JSON payload, this is generally faster than manually parsing the data into a dictionary first. Learn more about JSON parsing in the [JSON](../json/) section of the docs.
- model_validate_strings(): this takes a dictionary (which can be nested) with string keys and values and validates the data in JSON mode so that said strings can be coerced into the correct types.

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel, ValidationError


class User(BaseModel):
    id: int
    name: str = 'John Doe'
    signup_ts: Optional[datetime] = None


m = User.model_validate({'id': 123, 'name': 'James'})
print(m)
#> id=123 name='James' signup_ts=None

try:
    User.model_validate(['not', 'a', 'dict'])
except ValidationError as e:
    print(e)
    """
    1 validation error for User
      Input should be a valid dictionary or instance of User [type=model_type, input_value=['not', 'a', 'dict'], input_type=list]
    """

m = User.model_validate_json('{"id": 123, "name": "James"}')
print(m)
#> id=123 name='James' signup_ts=None

try:
    m = User.model_validate_json('{"id": 123, "name": 123}')
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    name
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    """

try:
    m = User.model_validate_json('invalid JSON')
except ValidationError as e:
    print(e)
    """
    1 validation error for User
      Invalid JSON: expected value at line 1 column 1 [type=json_invalid, input_value='invalid JSON', input_type=str]
    """

m = User.model_validate_strings({'id': '123', 'name': 'James'})
print(m)
#> id=123 name='James' signup_ts=None

m = User.model_validate_strings(
    {'id': '123', 'name': 'James', 'signup_ts': '2024-04-01T12:00:00'}
)
print(m)
#> id=123 name='James' signup_ts=datetime.datetime(2024, 4, 1, 12, 0)

try:
    m = User.model_validate_strings(
        {'id': '123', 'name': 'James', 'signup_ts': '2024-04-01'}, strict=True
    )
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    signup_ts
      Input should be a valid datetime, invalid datetime separator, expected `T`, `t`, `_` or space [type=datetime_parsing, input_value='2024-04-01', input_type=str]
    """
```

If you want to validate serialized data in a format other than JSON, you should load the data into a dictionary yourself and then pass it to model_validate.

Note

Depending on the types and model configs involved, model_validate and model_validate_json may have different validation behavior. If you have data coming from a non-JSON source, but want the same validation behavior and errors you'd get from model_validate_json, our recommendation for now is to either use `model_validate_json(json.dumps(data))`, or use model_validate_strings if the data takes the form of a (potentially nested) dictionary with string keys and values.

Note

If you're passing in an instance of a model to model_validate, you will want to consider setting revalidate_instances in the model's config. If you don't set this value, then validation will be skipped on model instances. See the below example.

Without `revalidate_instances` set, validation is skipped on model instances:

```python
from pydantic import BaseModel


class Model(BaseModel):
    a: int


m = Model(a=0)
# note: setting `validate_assignment` to `True` in the config can prevent this kind of misbehavior.
m.a = 'not an int'

# doesn't raise a validation error even though m is invalid
m2 = Model.model_validate(m)
```

With `revalidate_instances='always'`, the instance is validated again:

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class Model(BaseModel):
    a: int

    model_config = ConfigDict(revalidate_instances='always')


m = Model(a=0)
# note: setting `validate_assignment` to `True` in the config can prevent this kind of misbehavior.
m.a = 'not an int'

try:
    m2 = Model.model_validate(m)
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    a
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='not an int', input_type=str]
    """
```

### Creating models without validation

Pydantic also provides the model_construct() method, which allows models to be created **without validation**.
This can be useful in at least a few cases:

- when working with complex data that is already known to be valid (for performance reasons)
- when one or more of the validator functions are non-idempotent
- when one or more of the validator functions have side effects that you don't want to be triggered.

Warning

model_construct() does not do any validation, meaning it can create models which are invalid. **You should only ever use the model_construct() method with data which has already been validated, or that you definitely trust.**

Note

In Pydantic V2, the performance gap between validation (either with direct instantiation or the `model_validate*` methods) and model_construct() has been narrowed considerably. For simple models, going with validation may even be faster. If you are using model_construct() for performance reasons, you may want to profile your use case before assuming it is actually faster.

Note that for [root models](#rootmodel-and-custom-root-types), the root value can be passed to model_construct() positionally, instead of using a keyword argument.

Here are some additional notes on the behavior of model_construct():

- When we say "no validation is performed", this includes converting dictionaries to model instances. So if you have a field referring to a model type, you will need to convert the inner dictionary to a model yourself.
- If you do not pass keyword arguments for fields with defaults, the default values will still be used.
- For models with private attributes, the `__pydantic_private__` dictionary will be populated the same as it would be when creating the model with validation.
- No `__init__` method from the model or any of its parent classes will be called, even when a custom `__init__` method is defined.

On [extra data](#extra-data) behavior with model_construct():

- For models with extra set to `'allow'`, data not corresponding to fields will be correctly stored in the `__pydantic_extra__` dictionary and saved to the model's `__dict__` attribute.
- For models with extra set to `'ignore'`, data not corresponding to fields will be ignored; that is, not stored in `__pydantic_extra__` or `__dict__` on the instance.
- Unlike when instantiating the model with validation, a call to model_construct() with extra set to `'forbid'` doesn't raise an error in the presence of data not corresponding to fields. Rather, said input data is simply ignored.
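To make a couple of these points concrete, here is a minimal sketch (the `User` model is illustrative): defaults are still applied, but the input is stored as-is, with no coercion or validation:

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str = 'John Doe'


# no validation or coercion happens; the string '123' is stored as-is,
# while the default for `name` is still applied
user = User.model_construct(id='123')
print(repr(user))
#> User(id='123', name='John Doe')
```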
## Generic models

Pydantic supports the creation of generic models to make it easier to reuse a common model structure. Both the new type parameter syntax (introduced by [PEP 695](https://peps.python.org/pep-0695/) in Python 3.12) and the old syntax are supported (refer to [the Python documentation](https://docs.python.org/3/library/typing.html#building-generic-types-and-type-aliases) for more details).

Here is an example using a generic Pydantic model to create an easily-reused HTTP response payload wrapper:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel, ValidationError

DataT = TypeVar('DataT')  # (1)!


class DataModel(BaseModel):
    number: int


class Response(BaseModel, Generic[DataT]):  # (2)!
    data: DataT  # (3)!


print(Response[int](data=1))
#> data=1
print(Response[str](data='value'))
#> data='value'
print(Response[str](data='value').model_dump())
#> {'data': 'value'}

data = DataModel(number=1)
print(Response[DataModel](data=data).model_dump())
#> {'data': {'number': 1}}
try:
    Response[int](data='value')
except ValidationError as e:
    print(e)
    """
    1 validation error for Response[int]
    data
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='value', input_type=str]
    """
```

1. Declare one or more type variables to use to parameterize your model.
1. Declare a Pydantic model that inherits from BaseModel and typing.Generic (in this specific order), and add the list of type variables you declared previously as parameters to the Generic parent.
1. Use the type variables as annotations where you will want to replace them with other types.

```python
from pydantic import BaseModel, ValidationError


class DataModel(BaseModel):
    number: int


class Response[DataT](BaseModel):  # (1)!
    data: DataT  # (2)!


print(Response[int](data=1))
#> data=1
print(Response[str](data='value'))
#> data='value'
print(Response[str](data='value').model_dump())
#> {'data': 'value'}

data = DataModel(number=1)
print(Response[DataModel](data=data).model_dump())
#> {'data': {'number': 1}}
try:
    Response[int](data='value')
except ValidationError as e:
    print(e)
    """
    1 validation error for Response[int]
    data
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='value', input_type=str]
    """
```

1. Declare a Pydantic model and add the list of type variables as type parameters.
1. Use the type variables as annotations where you will want to replace them with other types.

Warning

When parametrizing a model with a concrete type, Pydantic **does not** validate that the provided type is [assignable to the type variable](https://typing.readthedocs.io/en/latest/spec/generics.html#type-variables-with-an-upper-bound) if it has an upper bound.

Any [configuration](../config/), [validation](../validators/) or [serialization](../serialization/) logic set on the generic model will also be applied to the parametrized classes, in the same way as when inheriting from a model class. Any custom methods or attributes will also be inherited.

Generic models also integrate properly with type checkers, so you get all the type checking you would expect if you were to declare a distinct type for each parametrization.

Note

Internally, Pydantic creates subclasses of the generic model at runtime when the generic model class is parametrized. These classes are cached, so there should be minimal overhead introduced by the use of generic models.
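As a quick, illustrative check of that caching (a minimal sketch; the identity assertion assumes repeated parametrizations are served from the cache, as described above):

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class Box(BaseModel, Generic[T]):
    value: T


# the parametrized class should be created once, then served from the cache
assert Box[int] is Box[int]
print(Box[int].__name__)
#> Box[int]
```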
To inherit from a generic model and preserve the fact that it is generic, the subclass must also inherit from Generic:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

TypeX = TypeVar('TypeX')


class BaseClass(BaseModel, Generic[TypeX]):
    X: TypeX


class ChildClass(BaseClass[TypeX], Generic[TypeX]):
    pass


# Parametrize `TypeX` with `int`:
print(ChildClass[int](X=1))
#> X=1
```

You can also create a generic subclass of a model that partially or fully replaces the type variables in the superclass:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

TypeX = TypeVar('TypeX')
TypeY = TypeVar('TypeY')
TypeZ = TypeVar('TypeZ')


class BaseClass(BaseModel, Generic[TypeX, TypeY]):
    x: TypeX
    y: TypeY


class ChildClass(BaseClass[int, TypeY], Generic[TypeY, TypeZ]):
    z: TypeZ


# Parametrize `TypeY` with `str`:
print(ChildClass[str, int](x='1', y='y', z='3'))
#> x=1 y='y' z=3
```

If the name of the concrete subclasses is important, you can also override the default name generation by overriding the model_parametrized_name() method:

```python
from typing import Any, Generic, TypeVar

from pydantic import BaseModel

DataT = TypeVar('DataT')


class Response(BaseModel, Generic[DataT]):
    data: DataT

    @classmethod
    def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
        return f'{params[0].__name__.title()}Response'


print(repr(Response[int](data=1)))
#> IntResponse(data=1)
print(repr(Response[str](data='a')))
#> StrResponse(data='a')
```

You can use parametrized generic models as types in other models:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class ResponseModel(BaseModel, Generic[T]):
    content: T


class Product(BaseModel):
    name: str
    price: float


class Order(BaseModel):
    id: int
    product: ResponseModel[Product]


product = Product(name='Apple', price=0.5)
response = ResponseModel[Product](content=product)
order = Order(id=1, product=response)

print(repr(order))
"""
Order(id=1, product=ResponseModel[Product](content=Product(name='Apple', price=0.5)))
"""
```

Using the same type variable in nested models allows you to enforce typing relationships at different points in your model:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel, ValidationError

T = TypeVar('T')


class InnerT(BaseModel, Generic[T]):
    inner: T


class OuterT(BaseModel, Generic[T]):
    outer: T
    nested: InnerT[T]


nested = InnerT[int](inner=1)
print(OuterT[int](outer=1, nested=nested))
#> outer=1 nested=InnerT[int](inner=1)
try:
    print(OuterT[int](outer='a', nested=InnerT(inner='a')))  # (1)!
except ValidationError as e:
    print(e)
    """
    2 validation errors for OuterT[int]
    outer
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    nested.inner
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    """
```

1. The `OuterT` model is parametrized with `int`, but the data associated with the `T` annotations during validation is of type `str`, leading to validation errors.

Warning

While it may not raise an error, we strongly advise against using parametrized generics in [`isinstance()`](https://docs.python.org/3/library/functions.html#isinstance) checks. For example, you should not do `isinstance(my_model, MyGenericModel[int])`.
However, it is fine to do `isinstance(my_model, MyGenericModel)` (note that, for standard generics, it would raise an error to do a subclass check with a parameterized generic class).

If you need to perform [`isinstance()`](https://docs.python.org/3/library/functions.html#isinstance) checks against parametrized generics, you can do this by subclassing the parametrized generic class:

```python
class MyIntModel(MyGenericModel[int]): ...


isinstance(my_model, MyIntModel)
```

Implementation Details

When using nested generic models, Pydantic sometimes performs revalidation in an attempt to produce the most intuitive validation result. Specifically, if you have a field of type `GenericModel[SomeType]` and you validate data like `GenericModel[SomeCompatibleType]` against this field, we will inspect the data, recognize that the input data is sort of a "loose" subclass of `GenericModel`, and revalidate the contained `SomeCompatibleType` data.

This adds some validation overhead, but makes things more intuitive for cases like that shown below.

```python
from typing import Any, Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class GenericModel(BaseModel, Generic[T]):
    a: T


class Model(BaseModel):
    inner: GenericModel[Any]


print(repr(Model.model_validate(Model(inner=GenericModel[int](a=1)))))
#> Model(inner=GenericModel[Any](a=1))
```

Note that validation will still fail if, for example, you are validating against `GenericModel[int]` and pass in an instance `GenericModel[str](a='not an int')`.

This pattern will also re-trigger any custom validation, such as additional model validators. Validators will be called once on the first pass, validating directly against `GenericModel[Any]`. That validation fails, as `GenericModel[int]` is not a subclass of `GenericModel[Any]`. This relates to the warning above about the complications of using parametrized generics in `isinstance()` and `issubclass()` checks. Then, the validators will be called again on the second pass, during the more lax force-revalidation phase, which succeeds.

To better understand this consequence, see below:

```python
from typing import Any, Generic, Self, TypeVar

from pydantic import BaseModel, model_validator

T = TypeVar('T')


class GenericModel(BaseModel, Generic[T]):
    a: T

    @model_validator(mode='after')
    def validate_after(self: Self) -> Self:
        print('after validator running custom validation...')
        return self


class Model(BaseModel):
    inner: GenericModel[Any]


m = Model.model_validate(Model(inner=GenericModel[int](a=1)))
#> after validator running custom validation...
#> after validator running custom validation...
print(repr(m))
#> Model(inner=GenericModel[Any](a=1))
```

### Validation of unparametrized type variables

When leaving type variables unparametrized, Pydantic treats generic models similarly to how it treats built-in generic types like list and dict:

- If the type variable is [bound](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-upper-bounds) or [constrained](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-constraints) to a specific type, it will be used.
- If the type variable has a default type (as specified by [PEP 696](https://peps.python.org/pep-0696/)), it will be used.
- For unbound or unconstrained type variables, Pydantic will fall back to Any.
```python
from typing import Generic

from typing_extensions import TypeVar

from pydantic import BaseModel, ValidationError

T = TypeVar('T')
U = TypeVar('U', bound=int)
V = TypeVar('V', default=str)


class Model(BaseModel, Generic[T, U, V]):
    t: T
    u: U
    v: V


print(Model(t='t', u=1, v='v'))
#> t='t' u=1 v='v'

try:
    Model(t='t', u='u', v=1)
except ValidationError as exc:
    print(exc)
    """
    2 validation errors for Model
    u
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='u', input_type=str]
    v
      Input should be a valid string [type=string_type, input_value=1, input_type=int]
    """
```

Warning

In some cases, validation against an unparametrized generic model can lead to data loss. Specifically, if a subtype of the type variable upper bound, constraints, or default is being used and the model isn't explicitly parametrized, the resulting type **will not be** the one being provided:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

ItemT = TypeVar('ItemT', bound='ItemBase')


class ItemBase(BaseModel): ...


class IntItem(ItemBase):
    value: int


class ItemHolder(BaseModel, Generic[ItemT]):
    item: ItemT


loaded_data = {'item': {'value': 1}}


print(ItemHolder(**loaded_data))  # (1)!
#> item=ItemBase()

print(ItemHolder[IntItem](**loaded_data))  # (2)!
#> item=IntItem(value=1)
```

1. When the generic isn't parametrized, the input data is validated against the `ItemT` upper bound. Given that `ItemBase` has no fields, the `item` field information is lost.
1. In this case, the type variable is explicitly parametrized, so the input data is validated against the `IntItem` class.

### Serialization of unparametrized type variables

The behavior of serialization differs when using type variables with [upper bounds](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-upper-bounds), [constraints](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-constraints), or a default value:

If a Pydantic model is used in a type variable upper bound and the type variable is never parametrized, then Pydantic will use the upper bound for validation but treat the value as Any in terms of serialization:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel


class ErrorDetails(BaseModel):
    foo: str


ErrorDataT = TypeVar('ErrorDataT', bound=ErrorDetails)


class Error(BaseModel, Generic[ErrorDataT]):
    message: str
    details: ErrorDataT


class MyErrorDetails(ErrorDetails):
    bar: str


# serialized as Any
error = Error(
    message='We just had an error',
    details=MyErrorDetails(foo='var', bar='var2'),
)
assert error.model_dump() == {
    'message': 'We just had an error',
    'details': {
        'foo': 'var',
        'bar': 'var2',
    },
}

# serialized using the concrete parametrization
# note that `'bar': 'var2'` is missing
error = Error[ErrorDetails](
    message='We just had an error',
    details=ErrorDetails(foo='var'),
)
assert error.model_dump() == {
    'message': 'We just had an error',
    'details': {
        'foo': 'var',
    },
}
```

Here's another example of the above behavior, enumerating all permutations regarding bound specification and generic type parametrization:

```python
from typing import Generic, TypeVar

from pydantic import BaseModel

TBound = TypeVar('TBound', bound=BaseModel)
TNoBound = TypeVar('TNoBound')


class IntValue(BaseModel):
    value: int


class ItemBound(BaseModel, Generic[TBound]):
    item: TBound


class ItemNoBound(BaseModel, Generic[TNoBound]):
    item: TNoBound


item_bound_inferred = ItemBound(item=IntValue(value=3))
item_bound_explicit = ItemBound[IntValue](item=IntValue(value=3))
item_no_bound_inferred = ItemNoBound(item=IntValue(value=3))
item_no_bound_explicit = ItemNoBound[IntValue](item=IntValue(value=3))

# calling `print(x.model_dump())` on any of the above instances results in the following:
#> {'item': {'value': 3}}
```

However, if [constraints](https://typing.readthedocs.io/en/latest/reference/generics.html#type-variables-with-constraints) or a default value (as per [PEP 696](https://peps.python.org/pep-0696/)) is being used, then the default type or constraints will be used for both validation and serialization if the type variable is not parametrized. You can override this behavior using [`SerializeAsAny`](../serialization/#serializeasany-annotation):

```python
from typing import Generic

from typing_extensions import TypeVar

from pydantic import BaseModel, SerializeAsAny


class ErrorDetails(BaseModel):
    foo: str


ErrorDataT = TypeVar('ErrorDataT', default=ErrorDetails)


class Error(BaseModel, Generic[ErrorDataT]):
    message: str
    details: ErrorDataT


class MyErrorDetails(ErrorDetails):
    bar: str


# serialized using the default's serializer
error = Error(
    message='We just had an error',
    details=MyErrorDetails(foo='var', bar='var2'),
)
assert error.model_dump() == {
    'message': 'We just had an error',
    'details': {
        'foo': 'var',
    },
}
# If `ErrorDataT` was using an upper bound, `bar` would be present in `details`.


class SerializeAsAnyError(BaseModel, Generic[ErrorDataT]):
    message: str
    details: SerializeAsAny[ErrorDataT]


# serialized as Any
error = SerializeAsAnyError(
    message='We just had an error',
    details=MyErrorDetails(foo='var', bar='baz'),
)
assert error.model_dump() == {
    'message': 'We just had an error',
    'details': {
        'foo': 'var',
        'bar': 'baz',
    },
}
```

## Dynamic model creation

API Documentation

pydantic.main.create_model

There are some occasions where it is desirable to create a model using runtime information to specify the fields. Pydantic provides the create_model() function to allow models to be created dynamically:

```python
from pydantic import BaseModel, create_model

DynamicFoobarModel = create_model('DynamicFoobarModel', foo=str, bar=(int, 123))

# Equivalent to:


class StaticFoobarModel(BaseModel):
    foo: str
    bar: int = 123
```

Field definitions are specified as keyword arguments, and should either be:

- A single element, representing the type annotation of the field.
- A two-tuple, the first element being the type and the second element the assigned value (either a default or the Field() function).
Here is a more advanced example:

```python
from typing import Annotated

from pydantic import BaseModel, Field, PrivateAttr, create_model

DynamicModel = create_model(
    'DynamicModel',
    foo=(str, Field(alias='FOO')),
    bar=Annotated[str, Field(description='Bar field')],
    _private=(int, PrivateAttr(default=1)),
)


class StaticModel(BaseModel):
    foo: str = Field(alias='FOO')
    bar: Annotated[str, Field(description='Bar field')]
    _private: int = PrivateAttr(default=1)
```

The special keyword arguments `__config__` and `__base__` can be used to customize the new model. This includes extending a base model with extra fields.

```python
from pydantic import BaseModel, create_model


class FooModel(BaseModel):
    foo: str
    bar: int = 123


BarModel = create_model(
    'BarModel',
    apple=(str, 'russet'),
    banana=(str, 'yellow'),
    __base__=FooModel,
)
print(BarModel)
#> <class '__main__.BarModel'>
print(BarModel.model_fields.keys())
#> dict_keys(['foo', 'bar', 'apple', 'banana'])
```

You can also add validators by passing a dictionary to the `__validators__` argument.

```python
from pydantic import ValidationError, create_model, field_validator


def alphanum(cls, v):
    assert v.isalnum(), 'must be alphanumeric'
    return v


validators = {
    'username_validator': field_validator('username')(alphanum)  # (1)!
}

UserModel = create_model(
    'UserModel', username=(str, ...), __validators__=validators
)

user = UserModel(username='scolvin')
print(user)
#> username='scolvin'

try:
    UserModel(username='scolvi%n')
except ValidationError as e:
    print(e)
    """
    1 validation error for UserModel
    username
      Assertion failed, must be alphanumeric [type=assertion_error, input_value='scolvi%n', input_type=str]
    """
```

1. Make sure that the validator names do not clash with any of the field names, as internally Pydantic gathers all members into a namespace and mimics the normal creation of a class using the [`types` module utilities](https://docs.python.org/3/library/types.html#dynamic-type-creation).

Note

To pickle a dynamically created model:

- the model must be defined globally
- the `__module__` argument must be provided
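As an illustrative sketch of both requirements (the model and field here are hypothetical; `__module__=__name__` points at the defining module so that pickle can locate the class again on load):

```python
import pickle

from pydantic import create_model

# assigned at module level (globally), with `__module__` provided explicitly
PickleModel = create_model('PickleModel', x=(int, 1), __module__=__name__)

data = pickle.dumps(PickleModel(x=2))
print(pickle.loads(data))
#> x=2
```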
## `RootModel` and custom root types

API Documentation

pydantic.root_model.RootModel

Pydantic models can be defined with a "custom root type" by subclassing pydantic.RootModel.

The root type can be any type supported by Pydantic, and is specified by the generic parameter to `RootModel`. The root value can be passed to the model `__init__` or model_validate via the first and only argument.

Here's an example of how this works:

```python
from pydantic import RootModel

Pets = RootModel[list[str]]
PetsByName = RootModel[dict[str, str]]


print(Pets(['dog', 'cat']))
#> root=['dog', 'cat']
print(Pets(['dog', 'cat']).model_dump_json())
#> ["dog","cat"]
print(Pets.model_validate(['dog', 'cat']))
#> root=['dog', 'cat']
print(Pets.model_json_schema())
"""
{'items': {'type': 'string'}, 'title': 'RootModel[list[str]]', 'type': 'array'}
"""

print(PetsByName({'Otis': 'dog', 'Milo': 'cat'}))
#> root={'Otis': 'dog', 'Milo': 'cat'}
print(PetsByName({'Otis': 'dog', 'Milo': 'cat'}).model_dump_json())
#> {"Otis":"dog","Milo":"cat"}
print(PetsByName.model_validate({'Otis': 'dog', 'Milo': 'cat'}))
#> root={'Otis': 'dog', 'Milo': 'cat'}
```

If you want to access items in the `root` field directly or to iterate over the items, you can implement custom `__iter__` and `__getitem__` functions, as shown in the following example.

```python
from pydantic import RootModel


class Pets(RootModel):
    root: list[str]

    def __iter__(self):
        return iter(self.root)

    def __getitem__(self, item):
        return self.root[item]


pets = Pets.model_validate(['dog', 'cat'])
print(pets[0])
#> dog
print([pet for pet in pets])
#> ['dog', 'cat']
```

You can also create subclasses of the parametrized root model directly:

```python
from pydantic import RootModel


class Pets(RootModel[list[str]]):
    def describe(self) -> str:
        return f'Pets: {", ".join(self.root)}'


my_pets = Pets.model_validate(['dog', 'cat'])

print(my_pets.describe())
#> Pets: dog, cat
```

## Faux immutability

Models can be configured to be immutable via `model_config['frozen'] = True`. When this is set, attempting to change the values of instance attributes will raise errors. See the API reference for more details.

Note

This behavior was achieved in Pydantic V1 via the config setting `allow_mutation = False`. This config flag is deprecated in Pydantic V2, and has been replaced with `frozen`.

Warning

In Python, immutability is not enforced. Developers have the ability to modify objects that are conventionally considered "immutable" if they choose to do so.

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class FooBarModel(BaseModel):
    model_config = ConfigDict(frozen=True)

    a: str
    b: dict


foobar = FooBarModel(a='hello', b={'apple': 'pear'})

try:
    foobar.a = 'different'
except ValidationError as e:
    print(e)
    """
    1 validation error for FooBarModel
    a
      Instance is frozen [type=frozen_instance, input_value='different', input_type=str]
    """

print(foobar.a)
#> hello
print(foobar.b)
#> {'apple': 'pear'}
foobar.b['apple'] = 'grape'
print(foobar.b)
#> {'apple': 'grape'}
```

Trying to change `a` caused an error, and `a` remains unchanged. However, the dict `b` is mutable, and the immutability of `foobar` doesn't stop `b` from being changed.

## Abstract base classes

Pydantic models can be used alongside Python's [Abstract Base Classes](https://docs.python.org/3/library/abc.html) (ABCs).
```python
import abc

from pydantic import BaseModel


class FooBarModel(BaseModel, abc.ABC):
    a: str
    b: int

    @abc.abstractmethod
    def my_abstract_method(self):
        pass
```

## Field ordering

Field order affects models in the following ways:

- field order is preserved in the model [JSON Schema](../json_schema/)
- field order is preserved in [validation errors](#error-handling)
- field order is preserved by [`.model_dump()` and `.model_dump_json()` etc.](../serialization/#model_dump)

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    a: int
    b: int = 2
    c: int = 1
    d: int = 0
    e: float


print(Model.model_fields.keys())
#> dict_keys(['a', 'b', 'c', 'd', 'e'])
m = Model(e=2, a=1)
print(m.model_dump())
#> {'a': 1, 'b': 2, 'c': 1, 'd': 0, 'e': 2.0}
try:
    Model(a='x', b='x', c='x', d='x', e='x')
except ValidationError as err:
    error_locations = [e['loc'] for e in err.errors()]

print(error_locations)
#> [('a',), ('b',), ('c',), ('d',), ('e',)]
```

## Automatically excluded attributes

### Class variables

Attributes annotated with ClassVar are properly treated by Pydantic as class variables, and will not become fields on model instances:

```python
from typing import ClassVar

from pydantic import BaseModel


class Model(BaseModel):
    x: ClassVar[int] = 1
    y: int = 2


m = Model()
print(m)
#> y=2
print(Model.x)
#> 1
```

### Private model attributes

API Documentation

pydantic.fields.PrivateAttr

Attributes whose name has a leading underscore are not treated as fields by Pydantic, and are not included in the model schema. Instead, these are converted into a "private attribute" which is not validated or even set during calls to `__init__`, `model_validate`, etc.

Here is an example of usage:

```python
from datetime import datetime
from random import randint
from typing import Any

from pydantic import BaseModel, PrivateAttr


class TimeAwareModel(BaseModel):
    _processed_at: datetime = PrivateAttr(default_factory=datetime.now)
    _secret_value: int

    def model_post_init(self, context: Any) -> None:
        # this could also be done with `default_factory`:
        self._secret_value = randint(1, 5)


m = TimeAwareModel()
print(m._processed_at)
#> 2032-01-02 03:04:05.000006
print(m._secret_value)
#> 3
```

Private attribute names must start with an underscore to prevent conflicts with model fields. However, dunder names (such as `__attr__`) are not supported, and will be completely ignored in the model definition.

## Model signature

All Pydantic models will have their signature generated based on their fields:

```python
import inspect

from pydantic import BaseModel, Field


class FooModel(BaseModel):
    id: int
    name: str = None
    description: str = 'Foo'
    apple: int = Field(alias='pear')


print(inspect.signature(FooModel))
#> (*, id: int, name: str = None, description: str = 'Foo', pear: int) -> None
```

An accurate signature is useful for introspection purposes and libraries like `FastAPI` or `hypothesis`.

The generated signature will also respect custom `__init__` functions:

```python
import inspect

from pydantic import BaseModel


class MyModel(BaseModel):
    id: int
    info: str = 'Foo'

    def __init__(self, id: int = 1, *, bar: str, **data) -> None:
        """My custom init!"""
        super().__init__(id=id, bar=bar, **data)


print(inspect.signature(MyModel))
#> (id: int = 1, *, bar: str, info: str = 'Foo') -> None
```

To be included in the signature, a field's alias or name must be a valid Python identifier. Pydantic will prioritize a field's alias over its name when generating the signature, but may use the field name if the alias is not a valid Python identifier.

If a field's alias and name are *both* not valid identifiers (which may be possible through exotic use of `create_model`), a `**data` argument will be added. In addition, the `**data` argument will always be present in the signature if `model_config['extra'] == 'allow'`.
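As an illustration of that last point, a minimal sketch (the model name is illustrative, and the exact rendering of the `**data` parameter may vary between versions):

```python
import inspect

from pydantic import BaseModel, ConfigDict


class ExtraModel(BaseModel):
    model_config = ConfigDict(extra='allow')

    x: int


# with extra='allow', a **data catch-all appears in the generated signature
print(inspect.signature(ExtraModel))
#> (*, x: int, **data: Any) -> None
```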
## Structural pattern matching

Pydantic supports structural pattern matching for models, as introduced by [PEP 636](https://peps.python.org/pep-0636/) in Python 3.10.

```python
from pydantic import BaseModel


class Pet(BaseModel):
    name: str
    species: str


a = Pet(name='Bones', species='dog')

match a:
    # match `species` to 'dog', declare and initialize `dog_name`
    case Pet(species='dog', name=dog_name):
        print(f'{dog_name} is a dog')
        #> Bones is a dog
    # default case
    case _:
        print('No dog matched')
```

Note

A match-case statement may seem as if it creates a new model, but don't be fooled; it is just syntactic sugar for getting an attribute and either comparing it or declaring and initializing it.

## Attribute copies

In many cases, arguments passed to the constructor will be copied in order to perform validation and, where necessary, coercion.

In this example, note that the ID of the list changes after the class is constructed because it has been copied during validation:

```python
from pydantic import BaseModel


class C1:
    arr = []

    def __init__(self, in_arr):
        self.arr = in_arr


class C2(BaseModel):
    arr: list[int]


arr_orig = [1, 9, 10, 3]


c1 = C1(arr_orig)
c2 = C2(arr=arr_orig)
print(f'{id(c1.arr) == id(c2.arr)=}')
#> id(c1.arr) == id(c2.arr)=False
```

Note

There are some situations where Pydantic does not copy attributes, such as when passing models; we use the model as is. You can override this behaviour by setting [`model_config['revalidate_instances'] = 'always'`](../../api/config/#pydantic.config.ConfigDict).

# Performance tips

In most cases Pydantic won't be your bottleneck; only follow these tips if you're sure they're necessary.

## In general, use `model_validate_json()` not `model_validate(json.loads(...))`

With `model_validate(json.loads(...))`, the JSON is parsed in Python, converted to a dict, and then validated internally. On the other hand, `model_validate_json()` performs parsing and validation in one step.

There are a few cases where `model_validate(json.loads(...))` may be faster. Specifically, when using a `'before'` or `'wrap'` validator on a model, validation may be faster with the two-step method. You can read more about these special cases in [this discussion](https://github.com/pydantic/pydantic/discussions/6388#discussioncomment-8193105).

Many performance improvements are currently in the works for `pydantic-core`, as discussed [here](https://github.com/pydantic/pydantic/discussions/6388#discussioncomment-8194048). Once these changes are merged, we should be at the point where `model_validate_json()` is always faster than `model_validate(json.loads(...))`.
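Concretely, the two call styles look like this (a minimal sketch; the model is illustrative):

```python
import json

from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str


raw = '{"id": 1, "name": "Ada"}'

# two-step: parse the JSON in Python, then validate the resulting dict
u1 = User.model_validate(json.loads(raw))
# one-step: parsing and validation both happen inside pydantic-core
u2 = User.model_validate_json(raw)

assert u1 == u2
```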
## `TypeAdapter` instantiated once

The idea here is to avoid constructing validators and serializers more than necessary. Each time a `TypeAdapter` is instantiated, it will construct a new validator and serializer. If you're using a `TypeAdapter` in a function, it will be instantiated each time the function is called.

Avoid instantiating the adapter inside the function:

```python
from pydantic import TypeAdapter


def my_func():
    adapter = TypeAdapter(list[int])
    # do something with adapter
```

Instead, instantiate it once, and reuse it:

```python
from pydantic import TypeAdapter

adapter = TypeAdapter(list[int])


def my_func():
    ...
    # do something with adapter
```

## `Sequence` vs `list` or `tuple` with `Mapping` vs `dict`

When using `Sequence`, Pydantic calls `isinstance(value, Sequence)` to check if the value is a sequence. Also, Pydantic will try to validate against different types of sequences, like `list` and `tuple`. If you know the value is a `list` or `tuple`, use `list` or `tuple` instead of `Sequence`.

The same applies to `Mapping` and `dict`. If you know the value is a `dict`, use `dict` instead of `Mapping`.

## Don't do validation when you don't have to, use `Any` to keep the value unchanged

If you don't need to validate a value, use `Any` to keep the value unchanged.

```python
from typing import Any

from pydantic import BaseModel


class Model(BaseModel):
    a: Any


model = Model(a=1)
```

## Avoid extra information via subclasses of primitives

Instead of attaching extra state to a subclass of a primitive type:

```python
class CompletedStr(str):
    def __init__(self, s: str):
        self.s = s
        self.done = False
```

use a model with separate fields:

```python
from pydantic import BaseModel


class CompletedModel(BaseModel):
    s: str
    done: bool = False
```

## Use tagged union, not union

A tagged union (or discriminated union) is a union with a field that indicates which type it is.

```python
from typing import Any, Literal

from pydantic import BaseModel, Field


class DivModel(BaseModel):
    el_type: Literal['div'] = 'div'
    class_name: str | None = None
    children: list[Any] | None = None


class SpanModel(BaseModel):
    el_type: Literal['span'] = 'span'
    class_name: str | None = None
    contents: str | None = None


class ButtonModel(BaseModel):
    el_type: Literal['button'] = 'button'
    class_name: str | None = None
    contents: str | None = None


class InputModel(BaseModel):
    el_type: Literal['input'] = 'input'
    class_name: str | None = None
    value: str | None = None


class Html(BaseModel):
    contents: DivModel | SpanModel | ButtonModel | InputModel = Field(
        discriminator='el_type'
    )
```

See [Discriminated Unions](../unions/#discriminated-unions) for more details.
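As a quick usage sketch (with smaller, illustrative models), the discriminator value in the input lets Pydantic pick the right union member directly, instead of trying each member in turn:

```python
from typing import Annotated, Literal, Union

from pydantic import BaseModel, Field, TypeAdapter


class Cat(BaseModel):
    kind: Literal['cat'] = 'cat'
    meows: int


class Dog(BaseModel):
    kind: Literal['dog'] = 'dog'
    barks: int


# the 'kind' field drives dispatch to the matching member
pet_adapter = TypeAdapter(Annotated[Union[Cat, Dog], Field(discriminator='kind')])

print(repr(pet_adapter.validate_python({'kind': 'dog', 'barks': 3})))
#> Dog(kind='dog', barks=3)
```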
## Use `TypedDict` over nested models

Instead of using nested models, use `TypedDict` to define the structure of the data.

Performance comparison

With a simple benchmark, `TypedDict` is about 2.5x faster than nested models:

```python
from timeit import timeit

from typing_extensions import TypedDict

from pydantic import BaseModel, TypeAdapter


class A(TypedDict):
    a: str
    b: int


class TypedModel(TypedDict):
    a: A


class B(BaseModel):
    a: str
    b: int


class Model(BaseModel):
    b: B


ta = TypeAdapter(TypedModel)
result1 = timeit(
    lambda: ta.validate_python({'a': {'a': 'a', 'b': 2}}), number=10000
)

result2 = timeit(
    lambda: Model.model_validate({'b': {'a': 'a', 'b': 2}}), number=10000
)
print(result2 / result1)
```

## Avoid wrap validators if you really care about performance

Wrap validators are generally slower than other validators. This is because they require that data is materialized in Python during validation. Wrap validators can be incredibly useful for complex validation logic, but if you're looking for the best performance, you should avoid them.

## Failing early with `FailFast`

Starting in v2.8, you can apply the `FailFast` annotation to sequence types to fail early if any item in the sequence fails validation. If you use this annotation, you won't get validation errors for the rest of the items in the sequence if one fails, so you're effectively trading off visibility for performance.

```python
from typing import Annotated

from pydantic import FailFast, TypeAdapter, ValidationError

ta = TypeAdapter(Annotated[list[bool], FailFast()])
try:
    ta.validate_python([True, 'invalid', False, 'also invalid'])
except ValidationError as exc:
    print(exc)
    """
    1 validation error for list[bool]
    1
      Input should be a valid boolean, unable to interpret input [type=bool_parsing, input_value='invalid', input_type=str]
    """
```

Read more about `FailFast` in the API documentation.

# Settings Management

[Pydantic Settings](https://github.com/pydantic/pydantic-settings) provides optional Pydantic features for loading a settings or config class from environment variables or secrets files.

## Installation

Installation is as simple as:

```bash
pip install pydantic-settings
```

## Usage

If you create a model that inherits from `BaseSettings`, the model initialiser will attempt to determine the values of any fields not passed as keyword arguments by reading from the environment. (Default values will still be used if the matching environment variable is not set.)

This makes it easy to:

- Create a clearly-defined, type-hinted application configuration class
- Automatically read modifications to the configuration from environment variables
- Manually override specific settings in the initialiser where desired (e.g. in unit tests)

For example:

```py
from collections.abc import Callable
from typing import Any

from pydantic import (
    AliasChoices,
    AmqpDsn,
    BaseModel,
    Field,
    ImportString,
    PostgresDsn,
    RedisDsn,
)

from pydantic_settings import BaseSettings, SettingsConfigDict


class SubModel(BaseModel):
    foo: str = 'bar'
    apple: int = 1


class Settings(BaseSettings):
    auth_key: str = Field(validation_alias='my_auth_key')  # (1)!

    api_key: str = Field(alias='my_api_key')  # (2)!

    redis_dsn: RedisDsn = Field(
        'redis://user:pass@localhost:6379/1',
        validation_alias=AliasChoices('service_redis_dsn', 'redis_url'),  # (3)!
    )
    pg_dsn: PostgresDsn = 'postgres://user:pass@localhost:5432/foobar'
    amqp_dsn: AmqpDsn = 'amqp://user:pass@localhost:5672/'

    special_function: ImportString[Callable[[Any], Any]] = 'math.cos'  # (4)!

    # to override domains:
    # export my_prefix_domains='["foo.com", "bar.com"]'
    domains: set[str] = set()

    # to override more_settings:
    # export my_prefix_more_settings='{"foo": "x", "apple": 1}'
    more_settings: SubModel = SubModel()

    model_config = SettingsConfigDict(env_prefix='my_prefix_')  # (5)!


print(Settings().model_dump())
"""
{
    'auth_key': 'xxx',
    'api_key': 'xxx',
    'redis_dsn': RedisDsn('redis://user:pass@localhost:6379/1'),
    'pg_dsn': PostgresDsn('postgres://user:pass@localhost:5432/foobar'),
    'amqp_dsn': AmqpDsn('amqp://user:pass@localhost:5672/'),
    'special_function': math.cos,
    'domains': set(),
    'more_settings': {'foo': 'bar', 'apple': 1},
}
"""
```

1. The environment variable name is overridden using `validation_alias`. In this case, the environment variable `my_auth_key` will be read instead of `auth_key`. Check the [`Field` documentation](../fields/) for more information.
1. The environment variable name is overridden using `alias`. In this case, the environment variable `my_api_key` will be used for both validation and serialization instead of `api_key`. Check the [`Field` documentation](../fields/#field-aliases) for more information.
1. The AliasChoices class allows you to have multiple environment variable names for a single field. The first environment variable that is found will be used. Check the [documentation on alias choices](../alias/#aliaspath-and-aliaschoices) for more information.
1. The ImportString class allows you to import an object from a string. In this case, the environment variable `special_function` will be read and the function math.cos will be imported.
1. The `env_prefix` config setting allows you to set a prefix for all environment variables. Check the [Environment variable names documentation](#environment-variable-names) for more information.

## Validation of default values

Unlike pydantic `BaseModel`, default values of `BaseSettings` fields are validated by default. You can disable this behaviour by setting `validate_default=False` either in `model_config` or at the field level with `Field(validate_default=False)`:

```py
from pydantic import Field

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(validate_default=False)

    # default won't be validated
    foo: int = 'test'


print(Settings())
#> foo='test'


class Settings1(BaseSettings):
    # default won't be validated
    foo: int = Field('test', validate_default=False)


print(Settings1())
#> foo='test'
```

Check the [validation of default values](../fields/#validate-default-values) for more information.

## Environment variable names

By default, the environment variable name is the same as the field name.

You can change the prefix for all environment variables by setting the `env_prefix` config setting, or via the `_env_prefix` keyword argument on instantiation:

```py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_prefix='my_prefix_')

    auth_key: str = 'xxx'  # will be read from `my_prefix_auth_key`
```

Note

The default `env_prefix` is `''` (empty string). `env_prefix` is not only for env settings but also for dotenv files, secrets, and other sources.

If you want to change the environment variable name for a single field, you can use an alias.

There are two ways to do this:

- Using `Field(alias=...)` (see `api_key` above)
- Using `Field(validation_alias=...)` (see `auth_key` above)

Check the [`Field` aliases documentation](../fields/#field-aliases) for more information about aliases.

`env_prefix` does not apply to fields with an alias; for those, the environment variable name is the same as the field alias:

```py
from pydantic import Field

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_prefix='my_prefix_')

    foo: str = Field('xxx', alias='FooAlias')  # (1)!
```

1. `env_prefix` will be ignored and the value will be read from the `FooAlias` environment variable.

### Case-sensitivity

By default, environment variable names are case-insensitive.

If you want to make environment variable names case-sensitive, you can set the `case_sensitive` config setting:

```py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(case_sensitive=True)

    redis_host: str = 'localhost'
```

When `case_sensitive` is `True`, the environment variable names must match field names (optionally with a prefix), so in this example `redis_host` could only be modified via `export redis_host`. If you want to name environment variables all upper-case, you should name the attributes all upper-case too. You can still name environment variables anything you like through `Field(validation_alias=...)`.

Case-sensitivity can also be set via the `_case_sensitive` keyword argument on instantiation.

In case of nested models, the `case_sensitive` setting will be applied to all nested models.
```py
import os

from pydantic import BaseModel, ValidationError

from pydantic_settings import BaseSettings


class RedisSettings(BaseModel):
    host: str
    port: int


class Settings(BaseSettings, case_sensitive=True):
    redis: RedisSettings


os.environ['redis'] = '{"host": "localhost", "port": 6379}'
print(Settings().model_dump())
#> {'redis': {'host': 'localhost', 'port': 6379}}
os.environ['redis'] = '{"HOST": "localhost", "port": 6379}'  # (1)!
try:
    Settings()
except ValidationError as e:
    print(e)
    """
    1 validation error for Settings
    redis.host
      Field required [type=missing, input_value={'HOST': 'localhost', 'port': 6379}, input_type=dict]
        For further information visit https://errors.pydantic.dev/2/v/missing
    """
```

1. Note that the `host` field is not found because the environment variable name is `HOST` (all upper-case).

Note

On Windows, Python's `os` module always treats environment variables as case-insensitive, so the `case_sensitive` config setting will have no effect - settings will always be updated ignoring case.

## Parsing environment variable values

By default, environment variables are parsed verbatim, including if the value is empty. You can choose to ignore empty environment variables by setting the `env_ignore_empty` config setting to `True`. This can be useful if you would prefer to use the default value for a field rather than an empty value from the environment.

For most simple field types (such as `int`, `float`, `str`, etc.), the environment variable value is parsed the same way it would be if passed directly to the initialiser (as a string).

Complex types like `list`, `set`, `dict`, and sub-models are populated from the environment by treating the environment variable's value as a JSON-encoded string.

Another way to populate nested complex variables is to configure your model with the `env_nested_delimiter` config setting, then use an environment variable with a name pointing to the nested model's fields. This simply explodes your variable into nested models or dicts: if you define a variable `FOO__BAR__BAZ=123`, it will be converted into `FOO={'BAR': {'BAZ': 123}}`. If you have multiple variables with the same structure, they will be merged.

Note

Sub models have to inherit from `pydantic.BaseModel`. Otherwise, `pydantic-settings` initializes the sub model and collects the values for its fields separately, and you may get unexpected results.

As an example, given the following environment variables:

```bash
# your environment
export V0=0
export SUB_MODEL='{"v1": "json-1", "v2": "json-2"}'
export SUB_MODEL__V2=nested-2
export SUB_MODEL__V3=3
export SUB_MODEL__DEEP__V4=v4
```

You could load them into the following settings model:

```py
from pydantic import BaseModel

from pydantic_settings import BaseSettings, SettingsConfigDict


class DeepSubModel(BaseModel):  # (1)!
    v4: str


class SubModel(BaseModel):  # (2)!
    v1: str
    v2: bytes
    v3: int
    deep: DeepSubModel


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_nested_delimiter='__')

    v0: str
    sub_model: SubModel


print(Settings().model_dump())
"""
{
    'v0': '0',
    'sub_model': {'v1': 'json-1', 'v2': b'nested-2', 'v3': 3, 'deep': {'v4': 'v4'}},
}
"""
```

1. Sub model has to inherit from `pydantic.BaseModel`.
1. Sub model has to inherit from `pydantic.BaseModel`.

`env_nested_delimiter` can be configured via the `model_config` as shown above, or via the `_env_nested_delimiter` keyword argument on instantiation.

By default, environment variables are split by `env_nested_delimiter` into arbitrarily deep nested fields.
You can limit the depth of the nested fields with the `env_nested_max_split` config setting. A common use case where this is particularly useful is two-level-deep settings, where the `env_nested_delimiter` (usually a single `_`) may be a substring of model field names. For example:

```bash
# your environment
export GENERATION_LLM_PROVIDER='anthropic'
export GENERATION_LLM_API_KEY='your-api-key'
export GENERATION_LLM_API_VERSION='2024-03-15'
```

You could load them into the following settings model:

```py
from pydantic import BaseModel

from pydantic_settings import BaseSettings, SettingsConfigDict


class LLMConfig(BaseModel):
    provider: str = 'openai'
    api_key: str
    api_type: str = 'azure'
    api_version: str = '2023-03-15-preview'


class GenerationConfig(BaseSettings):
    model_config = SettingsConfigDict(
        env_nested_delimiter='_', env_nested_max_split=1, env_prefix='GENERATION_'
    )

    llm: LLMConfig
    ...


print(GenerationConfig().model_dump())
"""
{
    'llm': {
        'provider': 'anthropic',
        'api_key': 'your-api-key',
        'api_type': 'azure',
        'api_version': '2024-03-15',
    }
}
"""
```

Without `env_nested_max_split=1` set, `GENERATION_LLM_API_KEY` would be parsed as `llm.api.key` instead of `llm.api_key`, and a `ValidationError` would be raised.

Nested environment variables take precedence over the top-level environment variable JSON (e.g. in the example above, `SUB_MODEL__V2` trumps `SUB_MODEL`).

You may also populate a complex type by providing your own source class.

```py
import json
import os
from typing import Any

from pydantic.fields import FieldInfo

from pydantic_settings import (
    BaseSettings,
    EnvSettingsSource,
    PydanticBaseSettingsSource,
)


class MyCustomSource(EnvSettingsSource):
    def prepare_field_value(
        self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool
    ) -> Any:
        if field_name == 'numbers':
            return [int(x) for x in value.split(',')]
        return json.loads(value)


class Settings(BaseSettings):
    numbers: list[int]

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (MyCustomSource(settings_cls),)


os.environ['numbers'] = '1,2,3'
print(Settings().model_dump())
#> {'numbers': [1, 2, 3]}
```

### Disabling JSON parsing

pydantic-settings by default parses complex types from environment variables as JSON strings. If you want to disable this behavior for a field and parse the value in your own validator, you can annotate the field with [`NoDecode`](../../api/pydantic_settings/#pydantic_settings.NoDecode):

```py
import os
from typing import Annotated

from pydantic import field_validator

from pydantic_settings import BaseSettings, NoDecode


class Settings(BaseSettings):
    numbers: Annotated[list[int], NoDecode]  # (1)!

    @field_validator('numbers', mode='before')
    @classmethod
    def decode_numbers(cls, v: str) -> list[int]:
        return [int(x) for x in v.split(',')]


os.environ['numbers'] = '1,2,3'
print(Settings().model_dump())
#> {'numbers': [1, 2, 3]}
```

1. The `NoDecode` annotation disables JSON parsing for the `numbers` field. The `decode_numbers` field validator will be called to parse the value.
You can also disable JSON parsing for all fields by setting the `enable_decoding` config setting to `False`:

```py
import os

from pydantic import field_validator

from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(enable_decoding=False)

    numbers: list[int]

    @field_validator('numbers', mode='before')
    @classmethod
    def decode_numbers(cls, v: str) -> list[int]:
        return [int(x) for x in v.split(',')]


os.environ['numbers'] = '1,2,3'
print(Settings().model_dump())
#> {'numbers': [1, 2, 3]}
```

You can force JSON parsing for a field by annotating it with [`ForceDecode`](../../api/pydantic_settings/#pydantic_settings.ForceDecode). This will bypass the `enable_decoding` config setting:

```py
import os
from typing import Annotated

from pydantic import field_validator

from pydantic_settings import BaseSettings, ForceDecode, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(enable_decoding=False)

    numbers: Annotated[list[int], ForceDecode]
    numbers1: list[int]  # (1)!

    @field_validator('numbers1', mode='before')
    @classmethod
    def decode_numbers1(cls, v: str) -> list[int]:
        return [int(x) for x in v.split(',')]


os.environ['numbers'] = '["1","2","3"]'
os.environ['numbers1'] = '1,2,3'
print(Settings().model_dump())
#> {'numbers': [1, 2, 3], 'numbers1': [1, 2, 3]}
```

1. The `numbers1` field is not annotated with `ForceDecode`, so it will not be parsed as JSON, and we have to provide a custom validator to parse the value.

## Nested model default partial updates

By default, Pydantic settings does not allow partial updates to nested model default objects. This behavior can be overridden by setting the `nested_model_default_partial_update` flag to `True`, which will allow partial updates on nested model default object fields.

```py
import os

from pydantic import BaseModel

from pydantic_settings import BaseSettings, SettingsConfigDict


class SubModel(BaseModel):
    val: int = 0
    flag: bool = False


class SettingsPartialUpdate(BaseSettings):
    model_config = SettingsConfigDict(
        env_nested_delimiter='__', nested_model_default_partial_update=True
    )

    nested_model: SubModel = SubModel(val=1)


class SettingsNoPartialUpdate(BaseSettings):
    model_config = SettingsConfigDict(
        env_nested_delimiter='__', nested_model_default_partial_update=False
    )

    nested_model: SubModel = SubModel(val=1)


# Apply a partial update to the default object using environment variables
os.environ['NESTED_MODEL__FLAG'] = 'True'

# When partial update is enabled, the existing SubModel instance is updated
# with nested_model.flag=True change
assert SettingsPartialUpdate().model_dump() == {
    'nested_model': {'val': 1, 'flag': True}
}

# When partial update is disabled, a new SubModel instance is instantiated
# with nested_model.flag=True change
assert SettingsNoPartialUpdate().model_dump() == {
    'nested_model': {'val': 0, 'flag': True}
}
```

## Dotenv (.env) support

Dotenv files (generally named `.env`) are a common pattern that makes it easy to use environment variables in a platform-independent manner.

A dotenv file follows the same general principles of all environment variables, and it looks like this:

.env

```bash
# ignore comment
ENVIRONMENT="production"
REDIS_ADDRESS=localhost:6379
MEANING_OF_LIFE=42
MY_VAR='Hello world'
```

Once you have your `.env` file filled with variables, *pydantic* supports loading it in two ways:
1. Setting the `env_file` (and `env_file_encoding` if you don't want the default encoding of your OS) on `model_config` in the `BaseSettings` class:

    ```py
    from pydantic_settings import BaseSettings, SettingsConfigDict


    class Settings(BaseSettings):
        model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')
    ```

1. Instantiating the `BaseSettings` derived class with the `_env_file` keyword argument (and the `_env_file_encoding` if needed):

    ```py
    from pydantic_settings import BaseSettings, SettingsConfigDict


    class Settings(BaseSettings):
        model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')


    settings = Settings(_env_file='prod.env', _env_file_encoding='utf-8')
    ```

In either case, the value of the passed argument can be any valid path or filename, either absolute or relative to the current working directory. From there, *pydantic* will handle everything for you by loading in your variables and validating them.

Note

If a filename is specified for `env_file`, Pydantic will only check the current working directory and won't check any parent directories for the `.env` file.

Even when using a dotenv file, *pydantic* will still read environment variables as well as the dotenv file, **environment variables will always take priority over values loaded from a dotenv file**.

Passing a file path via the `_env_file` keyword argument on instantiation (method 2) will override the value (if any) set on the `model_config` class. If the above snippets were used in conjunction, `prod.env` would be loaded while `.env` would be ignored.

If you need to load multiple dotenv files, you can pass multiple file paths as a tuple or list. The files will be loaded in order, with each file overriding the previous one.

```py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(
        # `.env.prod` takes priority over `.env`
        env_file=('.env', '.env.prod')
    )
```

You can also tell Pydantic not to load any file at all (even if one is set in the `model_config` class) by passing `None` as the `_env_file` keyword argument on instantiation, e.g. `settings = Settings(_env_file=None)`.

Because python-dotenv is used to parse the file, bash-like semantics such as `export` can be used which (depending on your OS and environment) may allow your dotenv file to also be used with `source`, see [python-dotenv's documentation](https://saurabh-kumar.com/python-dotenv/#usages) for more details.

Pydantic settings takes the `extra` config into account for dotenv files: if you set `extra='forbid'` (the *default*) on `model_config` and your dotenv file contains an entry for a field that is not defined in the settings model, a `ValidationError` will be raised when constructing the settings.

For compatibility with pydantic 1.x BaseSettings you should use `extra='ignore'`:

```py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file='.env', extra='ignore')
```

Note

Pydantic settings loads all the values from the dotenv file and passes them to the model, regardless of the model's `env_prefix`. So if you provide extra values in a dotenv file, whether they start with `env_prefix` or not, a `ValidationError` will be raised.

## Command Line Support

Pydantic settings provides integrated CLI support, making it easy to quickly define CLI applications using Pydantic models. There are two primary use cases for Pydantic settings CLI:
1. When using a CLI to override fields in Pydantic models.
1. When using Pydantic models to define CLIs.

By default, the experience is tailored towards use case #1 and builds on the foundations established in [parsing environment variables](#parsing-environment-variable-values). If your use case primarily falls into #2, you will likely want to enable most of the defaults outlined at the end of [creating CLI applications](#creating-cli-applications).

### The Basics

To get started, let's revisit the example presented in [parsing environment variables](#parsing-environment-variable-values) but using a Pydantic settings CLI:

```py
import sys

from pydantic import BaseModel

from pydantic_settings import BaseSettings, SettingsConfigDict


class DeepSubModel(BaseModel):
    v4: str


class SubModel(BaseModel):
    v1: str
    v2: bytes
    v3: int
    deep: DeepSubModel


class Settings(BaseSettings):
    model_config = SettingsConfigDict(cli_parse_args=True)

    v0: str
    sub_model: SubModel


sys.argv = [
    'example.py',
    '--v0=0',
    '--sub_model={"v1": "json-1", "v2": "json-2"}',
    '--sub_model.v2=nested-2',
    '--sub_model.v3=3',
    '--sub_model.deep.v4=v4',
]

print(Settings().model_dump())
"""
{
    'v0': '0',
    'sub_model': {'v1': 'json-1', 'v2': b'nested-2', 'v3': 3, 'deep': {'v4': 'v4'}},
}
"""
```

To enable CLI parsing, we simply set the `cli_parse_args` flag to a valid value, which retains similar connotations as defined in `argparse`.

Note that a CLI settings source is [**the topmost source**](#field-value-priority) by default unless its [priority value is customised](#customise-settings-sources):

```py
import os
import sys

from pydantic_settings import (
    BaseSettings,
    CliSettingsSource,
    PydanticBaseSettingsSource,
)


class Settings(BaseSettings):
    my_foo: str

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return env_settings, CliSettingsSource(settings_cls, cli_parse_args=True)


os.environ['MY_FOO'] = 'from environment'

sys.argv = ['example.py', '--my_foo=from cli']

print(Settings().model_dump())
#> {'my_foo': 'from environment'}
```

#### Lists

CLI argument parsing of lists supports intermixing of any of the below three styles:

- JSON style `--field='[1,2]'`
- Argparse style `--field 1 --field 2`
- Lazy style `--field=1,2`

```py
import sys

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True):
    my_list: list[int]


sys.argv = ['example.py', '--my_list', '[1,2]']
print(Settings().model_dump())
#> {'my_list': [1, 2]}

sys.argv = ['example.py', '--my_list', '1', '--my_list', '2']
print(Settings().model_dump())
#> {'my_list': [1, 2]}

sys.argv = ['example.py', '--my_list', '1,2']
print(Settings().model_dump())
#> {'my_list': [1, 2]}
```

#### Dictionaries

CLI argument parsing of dictionaries supports intermixing of any of the below two styles:

- JSON style `--field='{"k1": 1, "k2": 2}'`
- Environment variable style `--field k1=1 --field k2=2`

These can be used in conjunction with list forms as well (see the sketch below), e.g.:

- `--field k1=1,k2=2 --field k3=3 --field '{"k4": 4}'` etc.
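A minimal runnable sketch of this combined form (the individual dictionary styles are demonstrated in the example that follows):

```py
import sys

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True):
    my_dict: dict[str, int]


# Intermix the lazy list form and the JSON style; the parsed
# key-value pairs are merged into a single dictionary
sys.argv = ['example.py', '--my_dict', 'k1=1,k2=2', '--my_dict', '{"k3": 3}']
print(Settings().model_dump())
#> {'my_dict': {'k1': 1, 'k2': 2, 'k3': 3}}
```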
```py
import sys

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True):
    my_dict: dict[str, int]


sys.argv = ['example.py', '--my_dict', '{"k1":1,"k2":2}']
print(Settings().model_dump())
#> {'my_dict': {'k1': 1, 'k2': 2}}

sys.argv = ['example.py', '--my_dict', 'k1=1', '--my_dict', 'k2=2']
print(Settings().model_dump())
#> {'my_dict': {'k1': 1, 'k2': 2}}
```

#### Literals and Enums

CLI argument parsing converts literals and enums into CLI choices.

```py
import sys
from enum import IntEnum
from typing import Literal

from pydantic_settings import BaseSettings


class Fruit(IntEnum):
    pear = 0
    kiwi = 1
    lime = 2


class Settings(BaseSettings, cli_parse_args=True):
    fruit: Fruit
    pet: Literal['dog', 'cat', 'bird']


sys.argv = ['example.py', '--fruit', 'lime', '--pet', 'cat']
print(Settings().model_dump())
#> {'fruit': <Fruit.lime: 2>, 'pet': 'cat'}
```

#### Aliases

Pydantic field aliases are added as CLI argument aliases. Aliases of length one are converted into short options.

```py
import sys

from pydantic import AliasChoices, AliasPath, Field

from pydantic_settings import BaseSettings


class User(BaseSettings, cli_parse_args=True):
    first_name: str = Field(
        validation_alias=AliasChoices('f', 'fname', AliasPath('name', 0))
    )
    last_name: str = Field(
        validation_alias=AliasChoices('l', 'lname', AliasPath('name', 1))
    )


sys.argv = ['example.py', '--fname', 'John', '--lname', 'Doe']
print(User().model_dump())
#> {'first_name': 'John', 'last_name': 'Doe'}

sys.argv = ['example.py', '-f', 'John', '-l', 'Doe']
print(User().model_dump())
#> {'first_name': 'John', 'last_name': 'Doe'}

sys.argv = ['example.py', '--name', 'John,Doe']
print(User().model_dump())
#> {'first_name': 'John', 'last_name': 'Doe'}

sys.argv = ['example.py', '--name', 'John', '--lname', 'Doe']
print(User().model_dump())
#> {'first_name': 'John', 'last_name': 'Doe'}
```

### Subcommands and Positional Arguments

Subcommands and positional arguments are expressed using the `CliSubCommand` and `CliPositionalArg` annotations. The subcommand annotation can only be applied to required fields (i.e. fields that do not have a default value). Furthermore, subcommands must be a valid type derived from either a pydantic `BaseModel` or a pydantic.dataclasses `dataclass`.

Parsed subcommands can be retrieved from model instances using the `get_subcommand` utility function. If a subcommand is not required, set the `is_required` flag to `False` to disable raising an error if no subcommand is found.

Note

CLI settings subcommands are limited to a single subparser per model. In other words, all subcommands for a model are grouped under a single subparser; it does not allow for multiple subparsers with each subparser having its own set of subcommands. For more information on subparsers, see [argparse subcommands](https://docs.python.org/3/library/argparse.html#sub-commands).

Note

`CliSubCommand` and `CliPositionalArg` are always case sensitive.
```py
import sys

from pydantic import BaseModel

from pydantic_settings import (
    BaseSettings,
    CliPositionalArg,
    CliSubCommand,
    SettingsError,
    get_subcommand,
)


class Init(BaseModel):
    directory: CliPositionalArg[str]


class Clone(BaseModel):
    repository: CliPositionalArg[str]
    directory: CliPositionalArg[str]


class Git(BaseSettings, cli_parse_args=True, cli_exit_on_error=False):
    clone: CliSubCommand[Clone]
    init: CliSubCommand[Init]


# Run without subcommands
sys.argv = ['example.py']
cmd = Git()
assert cmd.model_dump() == {'clone': None, 'init': None}

try:
    # Will raise an error since no subcommand was provided
    get_subcommand(cmd).model_dump()
except SettingsError as err:
    assert str(err) == 'Error: CLI subcommand is required {clone, init}'

# Will not raise an error since subcommand is not required
assert get_subcommand(cmd, is_required=False) is None

# Run the clone subcommand
sys.argv = ['example.py', 'clone', 'repo', 'dest']
cmd = Git()
assert cmd.model_dump() == {
    'clone': {'repository': 'repo', 'directory': 'dest'},
    'init': None,
}

# Returns the subcommand model instance (in this case, 'clone')
assert get_subcommand(cmd).model_dump() == {
    'directory': 'dest',
    'repository': 'repo',
}
```

The `CliSubCommand` and `CliPositionalArg` annotations also support union operations and aliases. For unions of Pydantic models, it is important to remember the [nuances](https://docs.pydantic.dev/latest/concepts/unions/) that can arise during validation. Specifically, for unions of subcommands that are identical in content, it is recommended to break them out into separate `CliSubCommand` fields to avoid any complications. Lastly, the derived subcommand names from unions will be the names of the Pydantic model classes themselves.

When assigning aliases to `CliSubCommand` or `CliPositionalArg` fields, only a single alias can be assigned. For non-union subcommands, aliasing will change the displayed help text and subcommand name. Conversely, for union subcommands, aliasing will have no tangible effect from the perspective of the CLI settings source. Lastly, for positional arguments, aliasing will change the CLI help text displayed for the field.

```py
import sys
from typing import Union

from pydantic import BaseModel, Field

from pydantic_settings import (
    BaseSettings,
    CliPositionalArg,
    CliSubCommand,
    get_subcommand,
)


class Alpha(BaseModel):
    """Alpha Help"""

    cmd_alpha: CliPositionalArg[str] = Field(alias='alpha-cmd')


class Beta(BaseModel):
    """Beta Help"""

    opt_beta: str = Field(alias='opt-beta')


class Gamma(BaseModel):
    """Gamma Help"""

    opt_gamma: str = Field(alias='opt-gamma')


class Root(BaseSettings, cli_parse_args=True, cli_exit_on_error=False):
    alpha_or_beta: CliSubCommand[Union[Alpha, Beta]] = Field(alias='alpha-or-beta-cmd')
    gamma: CliSubCommand[Gamma] = Field(alias='gamma-cmd')


sys.argv = ['example.py', 'Alpha', 'hello']
assert get_subcommand(Root()).model_dump() == {'cmd_alpha': 'hello'}

sys.argv = ['example.py', 'Beta', '--opt-beta=hey']
assert get_subcommand(Root()).model_dump() == {'opt_beta': 'hey'}

sys.argv = ['example.py', 'gamma-cmd', '--opt-gamma=hi']
assert get_subcommand(Root()).model_dump() == {'opt_gamma': 'hi'}
```

### Creating CLI Applications

The `CliApp` class provides two utility methods, `CliApp.run` and `CliApp.run_subcommand`, that can be used to run a Pydantic `BaseSettings`, `BaseModel`, or `pydantic.dataclasses.dataclass` as a CLI application. Primarily, the methods provide structure for running `cli_cmd` methods associated with models.
`CliApp.run` can be used to directly provide the `cli_args` to be parsed, and will run the model's `cli_cmd` method (if defined) after instantiation:

```py
from pydantic_settings import BaseSettings, CliApp


class Settings(BaseSettings):
    this_foo: str

    def cli_cmd(self) -> None:
        # Print the parsed data
        print(self.model_dump())
        #> {'this_foo': 'is such a foo'}

        # Update the parsed data showing cli_cmd ran
        self.this_foo = 'ran the foo cli cmd'


s = CliApp.run(Settings, cli_args=['--this_foo', 'is such a foo'])
print(s.model_dump())
#> {'this_foo': 'ran the foo cli cmd'}
```

Similarly, `CliApp.run_subcommand` can be used in a recursive fashion to run the `cli_cmd` method of a subcommand:

```py
from pydantic import BaseModel

from pydantic_settings import CliApp, CliPositionalArg, CliSubCommand


class Init(BaseModel):
    directory: CliPositionalArg[str]

    def cli_cmd(self) -> None:
        print(f'git init "{self.directory}"')
        #> git init "dir"
        self.directory = 'ran the git init cli cmd'


class Clone(BaseModel):
    repository: CliPositionalArg[str]
    directory: CliPositionalArg[str]

    def cli_cmd(self) -> None:
        print(f'git clone from "{self.repository}" into "{self.directory}"')
        self.directory = 'ran the clone cli cmd'


class Git(BaseModel):
    clone: CliSubCommand[Clone]
    init: CliSubCommand[Init]

    def cli_cmd(self) -> None:
        CliApp.run_subcommand(self)


cmd = CliApp.run(Git, cli_args=['init', 'dir'])
assert cmd.model_dump() == {
    'clone': None,
    'init': {'directory': 'ran the git init cli cmd'},
}
```

Note

Unlike `CliApp.run`, `CliApp.run_subcommand` requires the subcommand model to have a defined `cli_cmd` method.

For `BaseModel` and `pydantic.dataclasses.dataclass` types, `CliApp.run` will internally use the following `BaseSettings` configuration defaults:

- `nested_model_default_partial_update=True`
- `case_sensitive=True`
- `cli_hide_none_type=True`
- `cli_avoid_json=True`
- `cli_enforce_required=True`
- `cli_implicit_flags=True`
- `cli_kebab_case=True`

### Asynchronous CLI Commands

Pydantic settings supports running asynchronous CLI commands via `CliApp.run` and `CliApp.run_subcommand`. With this feature, you can define `async def` methods within your Pydantic models (including subcommands) and have them executed just like their synchronous counterparts. Specifically:

1. Asynchronous methods are supported: You can now mark your `cli_cmd` or similar CLI entrypoint methods as `async def` and have `CliApp` execute them.
1. Subcommands may also be asynchronous: If you have nested CLI subcommands, the final (lowest-level) subcommand methods can likewise be asynchronous.
1. Limit asynchronous methods to final subcommands: Defining parent commands as asynchronous is not recommended, because it can result in additional threads and event loops being created. For best performance and to avoid unnecessary resource usage, only implement your deepest (child) subcommands as `async def`.

Below is a simple example demonstrating an asynchronous top-level command:

```py
from pydantic_settings import BaseSettings, CliApp


class AsyncSettings(BaseSettings):
    async def cli_cmd(self) -> None:
        print('Hello from an async CLI method!')
        #> Hello from an async CLI method!


# If an event loop is already running, a new thread will be used;
# otherwise, asyncio.run() is used to execute this async method.
assert CliApp.run(AsyncSettings, cli_args=[]).model_dump() == {}
```

#### Asynchronous Subcommands

As mentioned above, you can also define subcommands as async.
However, only do so for the leaf (lowest-level) subcommand to avoid spawning new threads and event loops unnecessarily in parent commands:

```py
from pydantic import BaseModel

from pydantic_settings import (
    BaseSettings,
    CliApp,
    CliPositionalArg,
    CliSubCommand,
)


class Clone(BaseModel):
    repository: CliPositionalArg[str]
    directory: CliPositionalArg[str]

    async def cli_cmd(self) -> None:
        # Perform async tasks here, e.g. network or I/O operations
        print(f'Cloning async from "{self.repository}" into "{self.directory}"')
        #> Cloning async from "repo" into "dir"


class Git(BaseSettings):
    clone: CliSubCommand[Clone]

    def cli_cmd(self) -> None:
        # Run the final subcommand (clone). It is recommended to define
        # async methods only at the deepest level.
        CliApp.run_subcommand(self)


assert CliApp.run(Git, cli_args=['clone', 'repo', 'dir']).model_dump() == {
    'clone': {'repository': 'repo', 'directory': 'dir'},
}
```

When executing a subcommand with an asynchronous `cli_cmd`, Pydantic settings automatically detects whether the current thread already has an active event loop. If so, the async command is run in a fresh thread to avoid conflicts. Otherwise, it uses `asyncio.run()` in the current thread. This handling ensures your asynchronous subcommands “just work” without additional manual setup.

### Mutually Exclusive Groups

CLI mutually exclusive groups can be created by inheriting from the `CliMutuallyExclusiveGroup` class.

Note

A `CliMutuallyExclusiveGroup` cannot be used in a union or contain nested models.

```py
from typing import Optional

from pydantic import BaseModel

from pydantic_settings import CliApp, CliMutuallyExclusiveGroup, SettingsError


class Circle(CliMutuallyExclusiveGroup):
    radius: Optional[float] = None
    diameter: Optional[float] = None
    perimeter: Optional[float] = None


class Settings(BaseModel):
    circle: Circle


try:
    CliApp.run(
        Settings,
        cli_args=['--circle.radius=1', '--circle.diameter=2'],
        cli_exit_on_error=False,
    )
except SettingsError as e:
    print(e)
    """
    error parsing CLI: argument --circle.diameter: not allowed with argument --circle.radius
    """
```

### Customizing the CLI Experience

The below flags can be used to customise the CLI experience to your needs.

#### Change the Displayed Program Name

Change the default program name displayed in the help text usage by setting `cli_prog_name`. By default, it will derive the name of the currently executing program from `sys.argv[0]`, just like argparse.

```py
import sys

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_prog_name='appdantic'):
    pass


try:
    sys.argv = ['example.py', '--help']
    Settings()
except SystemExit as e:
    print(e)
    #> 0
"""
usage: appdantic [-h]

options:
  -h, --help  show this help message and exit
"""
```

#### CLI Boolean Flags

Change whether boolean fields should be explicit or implicit by default using the `cli_implicit_flags` setting. By default, boolean fields are "explicit", meaning a boolean value must be explicitly provided on the CLI, e.g. `--flag=True`. Conversely, boolean fields that are "implicit" derive the value from the flag itself, e.g. `--flag,--no-flag`, which removes the need for an explicit value to be passed.

Additionally, the provided `CliImplicitFlag` and `CliExplicitFlag` annotations can be used for more granular control when necessary.
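As a minimal runnable sketch of the implicit form (parsing behavior only; the full explicit/implicit matrix is shown in the example that follows):

```py
import sys

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_implicit_flags=True):
    flag: bool = False


# With implicit flags, the bare flag sets True and --no-flag sets False
sys.argv = ['example.py', '--flag']
print(Settings().model_dump())
#> {'flag': True}

sys.argv = ['example.py', '--no-flag']
print(Settings().model_dump())
#> {'flag': False}
```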
```py
from pydantic_settings import BaseSettings, CliExplicitFlag, CliImplicitFlag


class ExplicitSettings(BaseSettings, cli_parse_args=True):
    """Boolean fields are explicit by default."""

    explicit_req: bool
    """
    --explicit_req bool   (required)
    """

    explicit_opt: bool = False
    """
    --explicit_opt bool   (default: False)
    """

    # Booleans are explicit by default, so must override implicit flags with annotation
    implicit_req: CliImplicitFlag[bool]
    """
    --implicit_req, --no-implicit_req (required)
    """

    implicit_opt: CliImplicitFlag[bool] = False
    """
    --implicit_opt, --no-implicit_opt (default: False)
    """


class ImplicitSettings(BaseSettings, cli_parse_args=True, cli_implicit_flags=True):
    """With cli_implicit_flags=True, boolean fields are implicit by default."""

    # Booleans are implicit by default, so must override explicit flags with annotation
    explicit_req: CliExplicitFlag[bool]
    """
    --explicit_req bool   (required)
    """

    explicit_opt: CliExplicitFlag[bool] = False
    """
    --explicit_opt bool   (default: False)
    """

    implicit_req: bool
    """
    --implicit_req, --no-implicit_req (required)
    """

    implicit_opt: bool = False
    """
    --implicit_opt, --no-implicit_opt (default: False)
    """
```

#### Ignore Unknown Arguments

Change whether to ignore unknown CLI arguments and only parse known ones using `cli_ignore_unknown_args`. By default, the CLI does not ignore any args.

```py
import sys

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_ignore_unknown_args=True):
    good_arg: str


sys.argv = ['example.py', '--bad-arg=bad', 'ANOTHER_BAD_ARG', '--good_arg=hello world']
print(Settings().model_dump())
#> {'good_arg': 'hello world'}
```

#### CLI Kebab Case for Arguments

Change whether CLI arguments should use kebab case by enabling `cli_kebab_case`.

```py
import sys

from pydantic import Field

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_kebab_case=True):
    my_option: str = Field(description='will show as kebab case on CLI')


try:
    sys.argv = ['example.py', '--help']
    Settings()
except SystemExit as e:
    print(e)
    #> 0
"""
usage: example.py [-h] [--my-option str]

options:
  -h, --help       show this help message and exit
  --my-option str  will show as kebab case on CLI (required)
"""
```

#### Change Whether CLI Should Exit on Error

Change whether the CLI internal parser will exit on error or raise a `SettingsError` exception by using `cli_exit_on_error`. By default, the CLI internal parser will exit on error.

```py
import sys

from pydantic_settings import BaseSettings, SettingsError


class Settings(BaseSettings, cli_parse_args=True, cli_exit_on_error=False): ...


try:
    sys.argv = ['example.py', '--bad-arg']
    Settings()
except SettingsError as e:
    print(e)
    #> error parsing CLI: unrecognized arguments: --bad-arg
```

#### Enforce Required Arguments at CLI

Pydantic settings is designed to pull values in from various sources when instantiating a model. This means a field that is required is not strictly required from any single source (e.g. the CLI). Instead, all that matters is that one of the sources provides the required value.

However, if your use case [aligns more with #2](#command-line-support), using Pydantic models to define CLIs, you will likely want required fields to be *strictly required at the CLI*. We can enable this behavior by using `cli_enforce_required`.

Note

A required `CliPositionalArg` field is always strictly required (enforced) at the CLI.
```py
import os
import sys

from pydantic import Field

from pydantic_settings import BaseSettings, SettingsError


class Settings(
    BaseSettings,
    cli_parse_args=True,
    cli_enforce_required=True,
    cli_exit_on_error=False,
):
    my_required_field: str = Field(description='a top level required field')


os.environ['MY_REQUIRED_FIELD'] = 'hello from environment'

try:
    sys.argv = ['example.py']
    Settings()
except SettingsError as e:
    print(e)
    #> error parsing CLI: the following arguments are required: --my_required_field
```

#### Change the None Type Parse String

Change the CLI string value that will be parsed (e.g. "null", "void", "None", etc.) into `None` by setting `cli_parse_none_str`. By default, it will use the `env_parse_none_str` value if set. Otherwise, it will default to "null" if `cli_avoid_json` is `False`, and "None" if `cli_avoid_json` is `True`.

```py
import sys
from typing import Optional

from pydantic import Field

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_parse_none_str='void'):
    v1: Optional[int] = Field(description='the top level v1 option')


sys.argv = ['example.py', '--v1', 'void']
print(Settings().model_dump())
#> {'v1': None}
```

#### Hide None Type Values

Hide `None` values from the CLI help text by enabling `cli_hide_none_type`.

```py
import sys
from typing import Optional

from pydantic import Field

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_hide_none_type=True):
    v0: Optional[str] = Field(description='the top level v0 option')


try:
    sys.argv = ['example.py', '--help']
    Settings()
except SystemExit as e:
    print(e)
    #> 0
"""
usage: example.py [-h] [--v0 str]

options:
  -h, --help  show this help message and exit
  --v0 str    the top level v0 option (required)
"""
```

#### Avoid Adding JSON CLI Options

Avoid adding complex fields that result in JSON strings at the CLI by enabling `cli_avoid_json`.

```py
import sys

from pydantic import BaseModel, Field

from pydantic_settings import BaseSettings


class SubModel(BaseModel):
    v1: int = Field(description='the sub model v1 option')


class Settings(BaseSettings, cli_parse_args=True, cli_avoid_json=True):
    sub_model: SubModel = Field(
        description='The help summary for SubModel related options'
    )


try:
    sys.argv = ['example.py', '--help']
    Settings()
except SystemExit as e:
    print(e)
    #> 0
"""
usage: example.py [-h] [--sub_model.v1 int]

options:
  -h, --help          show this help message and exit

sub_model options:
  The help summary for SubModel related options

  --sub_model.v1 int  the sub model v1 option (required)
"""
```

#### Use Class Docstring for Group Help Text

By default, when populating the group help text for nested models, it is pulled from the field descriptions. Alternatively, we can also configure CLI settings to pull from the class docstring instead.

Note

If the field is a union of nested models, the group help text will always be pulled from the field description, even if `cli_use_class_docs_for_groups` is set to `True`.
```py
import sys

from pydantic import BaseModel, Field

from pydantic_settings import BaseSettings


class SubModel(BaseModel):
    """The help text from the class docstring."""

    v1: int = Field(description='the sub model v1 option')


class Settings(BaseSettings, cli_parse_args=True, cli_use_class_docs_for_groups=True):
    """My application help text."""

    sub_model: SubModel = Field(description='The help text from the field description')


try:
    sys.argv = ['example.py', '--help']
    Settings()
except SystemExit as e:
    print(e)
    #> 0
"""
usage: example.py [-h] [--sub_model JSON] [--sub_model.v1 int]

My application help text.

options:
  -h, --help          show this help message and exit

sub_model options:
  The help text from the class docstring.

  --sub_model JSON    set sub_model from JSON string
  --sub_model.v1 int  the sub model v1 option (required)
"""
```

#### Change the CLI Flag Prefix Character

Change the CLI flag prefix character used in CLI optional arguments by setting `cli_flag_prefix_char`.

```py
import sys

from pydantic import AliasChoices, Field

from pydantic_settings import BaseSettings


class Settings(BaseSettings, cli_parse_args=True, cli_flag_prefix_char='+'):
    my_arg: str = Field(validation_alias=AliasChoices('m', 'my-arg'))


sys.argv = ['example.py', '++my-arg', 'hi']
print(Settings().model_dump())
#> {'my_arg': 'hi'}

sys.argv = ['example.py', '+m', 'hi']
print(Settings().model_dump())
#> {'my_arg': 'hi'}
```

#### Suppressing Fields from CLI Help Text

To suppress a field from the CLI help text, the `CliSuppress` annotation can be used for field types, or the `CLI_SUPPRESS` string constant can be used for field descriptions.

```py
import sys

from pydantic import Field

from pydantic_settings import CLI_SUPPRESS, BaseSettings, CliSuppress


class Settings(BaseSettings, cli_parse_args=True):
    """Suppress fields from CLI help text."""

    field_a: CliSuppress[int] = 0
    field_b: str = Field(default='1', description=CLI_SUPPRESS)


try:
    sys.argv = ['example.py', '--help']
    Settings()
except SystemExit as e:
    print(e)
    #> 0
"""
usage: example.py [-h]

Suppress fields from CLI help text.

options:
  -h, --help  show this help message and exit
"""
```

### Integrating with Existing Parsers

A CLI settings source can be integrated with existing parsers by overriding the default CLI settings source with a user defined one that specifies the `root_parser` object.

```py
import sys
from argparse import ArgumentParser

from pydantic_settings import BaseSettings, CliApp, CliSettingsSource

parser = ArgumentParser()
parser.add_argument('--food', choices=['pear', 'kiwi', 'lime'])


class Settings(BaseSettings):
    name: str = 'Bob'


# Set existing `parser` as the `root_parser` object for the user defined settings source
cli_settings = CliSettingsSource(Settings, root_parser=parser)

# Parse and load CLI settings from the command line into the settings source.
sys.argv = ['example.py', '--food', 'kiwi', '--name', 'waldo']
s = CliApp.run(Settings, cli_settings_source=cli_settings)
print(s.model_dump())
#> {'name': 'waldo'}

# Load CLI settings from pre-parsed arguments. i.e., the parsing occurs elsewhere and we
# just need to load the pre-parsed args into the settings source.
parsed_args = parser.parse_args(['--food', 'kiwi', '--name', 'ralph'])
s = CliApp.run(Settings, cli_args=parsed_args, cli_settings_source=cli_settings)
print(s.model_dump())
#> {'name': 'ralph'}
```

A `CliSettingsSource` connects with a `root_parser` object by using parser methods to add `settings_cls` fields as command line arguments.
The `CliSettingsSource` internal parser representation is based on the `argparse` library, and therefore, requires parser methods that support the same attributes as their `argparse` counterparts. The available parser methods that can be customised, along with their argparse counterparts (the defaults), are listed below:

- `parse_args_method` - (`argparse.ArgumentParser.parse_args`)
- `add_argument_method` - (`argparse.ArgumentParser.add_argument`)
- `add_argument_group_method` - (`argparse.ArgumentParser.add_argument_group`)
- `add_parser_method` - (`argparse._SubParsersAction.add_parser`)
- `add_subparsers_method` - (`argparse.ArgumentParser.add_subparsers`)
- `formatter_class` - (`argparse.RawDescriptionHelpFormatter`)

For a non-argparse parser, the parser methods can be set to `None` if not supported. The CLI settings will only raise an error when connecting to the root parser if a parser method is necessary but set to `None`.

Note

The `formatter_class` is only applied to subcommands. The `CliSettingsSource` never touches or modifies any of the external parser settings to avoid breaking changes. Since subcommands reside on their own internal parser trees, we can safely apply the `formatter_class` settings without breaking the external parser logic.

## Secrets

Placing secret values in files is a common pattern to provide sensitive configuration to an application.

A secret file follows the same principle as a dotenv file except it only contains a single value and the file name is used as the key. A secret file will look like the following:

/var/run/database_password

```text
super_secret_database_password
```

Once you have your secret files, *pydantic* supports loading them in two ways:

1. Setting the `secrets_dir` on `model_config` in a `BaseSettings` class to the directory where your secret files are stored.

```py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(secrets_dir='/var/run')

    database_password: str
```

1. Instantiating the `BaseSettings` derived class with the `_secrets_dir` keyword argument:

```text
settings = Settings(_secrets_dir='/var/run')
```

In either case, the value of the passed argument can be any valid directory, either absolute or relative to the current working directory. **Note that a non-existent directory will only generate a warning**. From there, *pydantic* will handle everything for you by loading in your variables and validating them.

Even when using a secrets directory, *pydantic* will still read environment variables from a dotenv file or the environment; **a dotenv file and environment variables will always take priority over values loaded from the secrets directory**.

Passing a file path via the `_secrets_dir` keyword argument on instantiation (method 2) will override the value (if any) set on the `model_config` class.

If you need to load settings from multiple secrets directories, you can pass multiple paths as a tuple or list. Just like for `env_file`, values from subsequent paths override previous ones.

```python
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    # files in '/run/secrets' take priority over '/var/run'
    model_config = SettingsConfigDict(secrets_dir=('/var/run', '/run/secrets'))

    database_password: str
```

If any of the `secrets_dir` paths is missing, it is ignored and a warning is shown. If any of the `secrets_dir` paths is a file, an error is raised.
### Use Case: Docker Secrets

Docker Secrets can be used to provide sensitive configuration to an application running in a Docker container. To use these secrets in a *pydantic* application the process is simple. For more information regarding creating, managing, and using secrets in Docker, see the official [Docker documentation](https://docs.docker.com/engine/reference/commandline/secret/).

First, define your `Settings` class with a `SettingsConfigDict` that specifies the secrets directory.

```py
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(secrets_dir='/run/secrets')

    my_secret_data: str
```

Note

By default [Docker uses `/run/secrets`](https://docs.docker.com/engine/swarm/secrets/#how-docker-manages-secrets) as the target mount point. If you want to use a different location, change `Config.secrets_dir` accordingly.

Then, create your secret via the Docker CLI:

```bash
printf "This is a secret" | docker secret create my_secret_data -
```

Last, run your application inside a Docker container and supply your newly created secret:

```bash
docker service create --name pydantic-with-secrets --secret my_secret_data pydantic-app:latest
```

## AWS Secrets Manager

You must set one parameter:

- `secret_id`: The AWS secret id

The key name in the secret must follow the same naming convention as the field name. For example, if the key in the secret is named `SqlServerPassword`, the field name must be the same. You can use an alias too.

In AWS Secrets Manager, nested models are supported with the `--` separator in the key name. For example, `SqlServer--Password`. Arrays (e.g. `MySecret--0`, `MySecret--1`) are not supported.

```py
import os

from pydantic import BaseModel

from pydantic_settings import (
    AWSSecretsManagerSettingsSource,
    BaseSettings,
    PydanticBaseSettingsSource,
)


class SubModel(BaseModel):
    a: str


class AWSSecretsManagerSettings(BaseSettings):
    foo: str
    bar: int
    sub: SubModel

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        aws_secrets_manager_settings = AWSSecretsManagerSettingsSource(
            settings_cls,
            os.environ['AWS_SECRETS_MANAGER_SECRET_ID'],
        )
        return (
            init_settings,
            env_settings,
            dotenv_settings,
            file_secret_settings,
            aws_secrets_manager_settings,
        )
```

## Azure Key Vault

You must set two parameters:

- `url`: For example, `https://my-resource.vault.azure.net/`.
- `credential`: If you use `DefaultAzureCredential`, locally you can execute `az login` to get your identity credentials. The identity must have a role assignment (the recommended one is `Key Vault Secrets User`), so you can access the secrets.

The field name must follow the same naming convention as the Key Vault secret name. For example, if the secret is named `SqlServerPassword`, the field name must be the same. You can use an alias too.

In Key Vault, nested models are supported with the `--` separator. For example, `SqlServer--Password`. Key Vault arrays (e.g. `MySecret--0`, `MySecret--1`) are not supported.
```py
import os

from azure.identity import DefaultAzureCredential
from pydantic import BaseModel

from pydantic_settings import (
    AzureKeyVaultSettingsSource,
    BaseSettings,
    PydanticBaseSettingsSource,
)


class SubModel(BaseModel):
    a: str


class AzureKeyVaultSettings(BaseSettings):
    foo: str
    bar: int
    sub: SubModel

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        az_key_vault_settings = AzureKeyVaultSettingsSource(
            settings_cls,
            os.environ['AZURE_KEY_VAULT_URL'],
            DefaultAzureCredential(),
        )
        return (
            init_settings,
            env_settings,
            dotenv_settings,
            file_secret_settings,
            az_key_vault_settings,
        )
```

## Other settings sources

Other settings sources are available for common configuration files:

- `JsonConfigSettingsSource` using `json_file` and `json_file_encoding` arguments
- `PyprojectTomlConfigSettingsSource` using *(optional)* `pyproject_toml_depth` and *(optional)* `pyproject_toml_table_header` arguments
- `TomlConfigSettingsSource` using `toml_file` argument
- `YamlConfigSettingsSource` using `yaml_file` and `yaml_file_encoding` arguments

You can also provide multiple files by providing a list of paths:

```py
toml_file = ['config.default.toml', 'config.custom.toml']
```

To use them, you can use the same mechanism described [here](#customise-settings-sources):

```py
from pydantic import BaseModel

from pydantic_settings import (
    BaseSettings,
    PydanticBaseSettingsSource,
    SettingsConfigDict,
    TomlConfigSettingsSource,
)


class Nested(BaseModel):
    nested_field: str


class Settings(BaseSettings):
    foobar: str
    nested: Nested
    model_config = SettingsConfigDict(toml_file='config.toml')

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (TomlConfigSettingsSource(settings_cls),)
```

This will be able to read the following "config.toml" file, located in your working directory:

```toml
foobar = "Hello"

[nested]
nested_field = "world!"
```

### pyproject.toml

"pyproject.toml" is a standardized file for providing configuration values in Python projects. [PEP 518](https://peps.python.org/pep-0518/#tool-table) defines a `[tool]` table that can be used to provide arbitrary tool configuration. While use of the `[tool]` table is encouraged, `PyprojectTomlConfigSettingsSource` can be used to load variables from any location within the "pyproject.toml" file.

This is controlled by providing `SettingsConfigDict(pyproject_toml_table_header=tuple[str, ...])` where the value is a tuple of header parts. By default, `pyproject_toml_table_header=('tool', 'pydantic-settings')`, which will load variables from the `[tool.pydantic-settings]` table.
```python
from pydantic_settings import (
    BaseSettings,
    PydanticBaseSettingsSource,
    PyprojectTomlConfigSettingsSource,
    SettingsConfigDict,
)


class Settings(BaseSettings):
    """Example loading values from the table used by default."""

    field: str

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (PyprojectTomlConfigSettingsSource(settings_cls),)


class SomeTableSettings(Settings):
    """Example loading values from a user defined table."""

    model_config = SettingsConfigDict(
        pyproject_toml_table_header=('tool', 'some-table')
    )


class RootSettings(Settings):
    """Example loading values from the root of a pyproject.toml file."""

    model_config = SettingsConfigDict(extra='ignore', pyproject_toml_table_header=())
```

This will be able to read the following "pyproject.toml" file, located in your working directory, resulting in `Settings(field='default-table')`, `SomeTableSettings(field='some-table')`, and `RootSettings(field='root')`:

```toml
field = "root"

[tool.pydantic-settings]
field = "default-table"

[tool.some-table]
field = "some-table"
```

By default, `PyprojectTomlConfigSettingsSource` will only look for a "pyproject.toml" in your current working directory. However, there are two options to change this behavior.

- `SettingsConfigDict(pyproject_toml_depth=<depth>)` can be provided to check `<depth>` number of directories **up** in the directory tree for a "pyproject.toml" if one is not found in the current working directory. By default, no parent directories are checked.
- An explicit file path can be provided to the source when it is instantiated (e.g. `PyprojectTomlConfigSettingsSource(settings_cls, Path('~/.config').resolve() / 'pyproject.toml')`). If a file path is provided this way, it will be treated as absolute (no other locations are checked).

```python
from pathlib import Path

from pydantic_settings import (
    BaseSettings,
    PydanticBaseSettingsSource,
    PyprojectTomlConfigSettingsSource,
    SettingsConfigDict,
)


class DiscoverSettings(BaseSettings):
    """Example of discovering a pyproject.toml in parent directories if not in `Path.cwd()`."""

    model_config = SettingsConfigDict(pyproject_toml_depth=2)

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (PyprojectTomlConfigSettingsSource(settings_cls),)


class ExplicitFilePathSettings(BaseSettings):
    """Example of explicitly providing the path to the file to load."""

    field: str

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (
            PyprojectTomlConfigSettingsSource(
                settings_cls, Path('~/.config').resolve() / 'pyproject.toml'
            ),
        )
```

## Field value priority

In the case where a value is specified for the same `Settings` field in multiple ways, the selected value is determined as follows (in descending order of priority):

1. If `cli_parse_args` is enabled, arguments passed in at the CLI.
1. Arguments passed to the `Settings` class initialiser.
1. Environment variables, e.g. `my_prefix_special_function` as described above.
1. Variables loaded from a dotenv (`.env`) file.
1. Variables loaded from the secrets directory.
1. The default field values for the `Settings` model.

## Customise settings sources

If the default order of priority doesn't match your needs, it's possible to change it by overriding the `settings_customise_sources` method of your `Settings` class.

`settings_customise_sources` takes four callables as arguments and returns any number of callables as a tuple. In turn, these callables are called to build the inputs to the fields of the settings class.

Each callable should take an instance of the settings class as its sole argument and return a `dict`.

### Changing Priority

The order of the returned callables decides the priority of inputs; first item is the highest priority.

```py
from pydantic import PostgresDsn

from pydantic_settings import BaseSettings, PydanticBaseSettingsSource


class Settings(BaseSettings):
    database_dsn: PostgresDsn

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return env_settings, init_settings, file_secret_settings


print(Settings(database_dsn='postgres://postgres@localhost:5432/kwargs_db'))
#> database_dsn=PostgresDsn('postgres://postgres@localhost:5432/kwargs_db')
```

By flipping `env_settings` and `init_settings`, environment variables now have precedence over `__init__` kwargs.

### Adding sources

As explained earlier, *pydantic* ships with multiple built-in settings sources. However, you may occasionally need to add your own custom sources; `settings_customise_sources` makes this very easy:

```py
import json
from pathlib import Path
from typing import Any

from pydantic.fields import FieldInfo

from pydantic_settings import (
    BaseSettings,
    PydanticBaseSettingsSource,
    SettingsConfigDict,
)


class JsonConfigSettingsSource(PydanticBaseSettingsSource):
    """
    A simple settings source class that loads variables from a JSON file
    at the project's root.
    Here we happen to choose to use the `env_file_encoding` from Config
    when reading `config.json`
    """

    def get_field_value(
        self, field: FieldInfo, field_name: str
    ) -> tuple[Any, str, bool]:
        encoding = self.config.get('env_file_encoding')
        file_content_json = json.loads(
            Path('tests/example_test_config.json').read_text(encoding)
        )
        field_value = file_content_json.get(field_name)
        return field_value, field_name, False

    def prepare_field_value(
        self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool
    ) -> Any:
        return value

    def __call__(self) -> dict[str, Any]:
        d: dict[str, Any] = {}

        for field_name, field in self.settings_cls.model_fields.items():
            field_value, field_key, value_is_complex = self.get_field_value(
                field, field_name
            )
            field_value = self.prepare_field_value(
                field_name, field, field_value, value_is_complex
            )
            if field_value is not None:
                d[field_key] = field_value

        return d


class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file_encoding='utf-8')

    foobar: str

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        return (
            init_settings,
            JsonConfigSettingsSource(settings_cls),
            env_settings,
            file_secret_settings,
        )


print(Settings())
#> foobar='test'
```

#### Accessing the result of previous sources

Each source of settings can access the output of the previous ones.

```python
from typing import Any

from pydantic.fields import FieldInfo

from pydantic_settings import PydanticBaseSettingsSource


class MyCustomSource(PydanticBaseSettingsSource):
    def get_field_value(
        self, field: FieldInfo, field_name: str
    ) -> tuple[Any, str, bool]: ...

    def __call__(self) -> dict[str, Any]:
        # Retrieve the aggregated settings from previous sources
        current_state = self.current_state
        current_state.get('some_setting')

        # Retrieve settings from all sources individually
        # self.settings_sources_data["SettingsSourceName"]: dict[str, Any]
        settings_sources_data = self.settings_sources_data
        settings_sources_data['SomeSettingsSource'].get('some_setting')

        # Your code here...
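        # (Sketch only, not part of the original example:) a real
        # implementation would build and return this source's values here;
        # an empty dict is used as a runnable placeholder.
        return {}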
```

### Removing sources

You might also want to disable a source:

```py
from pydantic import ValidationError

from pydantic_settings import BaseSettings, PydanticBaseSettingsSource


class Settings(BaseSettings):
    my_api_key: str

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        # here we choose to ignore arguments from init_settings
        return env_settings, file_secret_settings


try:
    Settings(my_api_key='this is ignored')
except ValidationError as exc_info:
    print(exc_info)
    """
    1 validation error for Settings
    my_api_key
      Field required [type=missing, input_value={}, input_type=dict]
        For further information visit https://errors.pydantic.dev/2/v/missing
    """
```

## In-place reloading

In case you want to reload an existing settings instance in place, you can do so by using its `__init__` method:

```py
import os

from pydantic import Field

from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    foo: str = Field('foo')


mutable_settings = Settings()

print(mutable_settings.foo)
#> foo

os.environ['foo'] = 'bar'
print(mutable_settings.foo)
#> foo

mutable_settings.__init__()
print(mutable_settings.foo)
#> bar

os.environ.pop('foo')
mutable_settings.__init__()
print(mutable_settings.foo)
#> foo
```

Beyond accessing model attributes directly via their field names (e.g. `model.foobar`), models can be converted, dumped, serialized, and exported in a number of ways.

Serialize versus dump

Pydantic uses the terms "serialize" and "dump" interchangeably. Both refer to the process of converting a model to a dictionary or JSON-encoded string.

Outside of Pydantic, the word "serialize" usually refers to converting in-memory data into a string or bytes. However, in the context of Pydantic, there is a very close relationship between converting an object from a more structured form (such as a Pydantic model, a dataclass, etc.) into a less structured form comprised of Python built-ins such as dict.

While we could (and on occasion, do) distinguish between these scenarios by using the word "dump" when converting to primitives and "serialize" when converting to string, for practical purposes, we frequently use the word "serialize" to refer to both of these situations, even though it does not always imply conversion to a string or bytes.

## `model.model_dump(...)`

API Documentation

pydantic.main.BaseModel.model_dump

This is the primary way of converting a model to a dictionary. Sub-models will be recursively converted to dictionaries.

By default, the output may contain non-JSON-serializable Python objects. The `mode` argument can be specified as `'json'` to ensure that the output only contains JSON serializable types. Other parameters exist to include or exclude fields, [including nested fields](#advanced-include-and-exclude), or to further customize the serialization behaviour. See the available parameters for more information.

Note

The one exception to sub-models being converted to dictionaries is that [`RootModel`](../models/#rootmodel-and-custom-root-types) and its subclasses will have the `root` field value dumped directly, without a wrapping dictionary. This is also done recursively.

Note

You can use [computed fields](../../api/fields/#pydantic.fields.computed_field) to include `property` and `cached_property` data in the `model.model_dump(...)` output.
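For instance, a minimal sketch of this (the `Rectangle` model below is illustrative, not from the API docs):

```python
from pydantic import BaseModel, computed_field


class Rectangle(BaseModel):
    width: int
    length: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.length


# The computed `area` property is included alongside the regular fields
print(Rectangle(width=3, length=2).model_dump())
#> {'width': 3, 'length': 2, 'area': 6}
```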
Example:

```python
from typing import Any, Optional

from pydantic import BaseModel, Field, Json


class BarModel(BaseModel):
    whatever: int


class FooBarModel(BaseModel):
    banana: Optional[float] = 1.1
    foo: str = Field(serialization_alias='foo_alias')
    bar: BarModel


m = FooBarModel(banana=3.14, foo='hello', bar={'whatever': 123})

# returns a dictionary:
print(m.model_dump())
#> {'banana': 3.14, 'foo': 'hello', 'bar': {'whatever': 123}}
print(m.model_dump(include={'foo', 'bar'}))
#> {'foo': 'hello', 'bar': {'whatever': 123}}
print(m.model_dump(exclude={'foo', 'bar'}))
#> {'banana': 3.14}
print(m.model_dump(by_alias=True))
#> {'banana': 3.14, 'foo_alias': 'hello', 'bar': {'whatever': 123}}
print(
    FooBarModel(foo='hello', bar={'whatever': 123}).model_dump(
        exclude_unset=True
    )
)
#> {'foo': 'hello', 'bar': {'whatever': 123}}
print(
    FooBarModel(banana=1.1, foo='hello', bar={'whatever': 123}).model_dump(
        exclude_defaults=True
    )
)
#> {'foo': 'hello', 'bar': {'whatever': 123}}
print(
    FooBarModel(foo='hello', bar={'whatever': 123}).model_dump(
        exclude_defaults=True
    )
)
#> {'foo': 'hello', 'bar': {'whatever': 123}}
print(
    FooBarModel(banana=None, foo='hello', bar={'whatever': 123}).model_dump(
        exclude_none=True
    )
)
#> {'foo': 'hello', 'bar': {'whatever': 123}}


class Model(BaseModel):
    x: list[Json[Any]]


print(Model(x=['{"a": 1}', '[1, 2]']).model_dump())
#> {'x': [{'a': 1}, [1, 2]]}
print(Model(x=['{"a": 1}', '[1, 2]']).model_dump(round_trip=True))
#> {'x': ['{"a":1}', '[1,2]']}
```

## `model.model_dump_json(...)`

API Documentation

pydantic.main.BaseModel.model_dump_json

The `.model_dump_json()` method serializes a model directly to a JSON-encoded string that is equivalent to the result produced by [`.model_dump()`](#modelmodel_dump). See the available parameters for more information.

Note

Pydantic can serialize many commonly used types to JSON that would otherwise be incompatible with a simple `json.dumps(foobar)` (e.g. `datetime`, `date`, or `UUID`).
```python
from datetime import datetime

from pydantic import BaseModel


class BarModel(BaseModel):
    whatever: int


class FooBarModel(BaseModel):
    foo: datetime
    bar: BarModel


m = FooBarModel(foo=datetime(2032, 6, 1, 12, 13, 14), bar={'whatever': 123})
print(m.model_dump_json())
#> {"foo":"2032-06-01T12:13:14","bar":{"whatever":123}}
print(m.model_dump_json(indent=2))
"""
{
  "foo": "2032-06-01T12:13:14",
  "bar": {
    "whatever": 123
  }
}
"""
```

## `dict(model)` and iteration

Pydantic models can also be converted to dictionaries using `dict(model)`, and you can also iterate over a model's fields using `for field_name, field_value in model:`. With this approach the raw field values are returned, so sub-models will not be converted to dictionaries.

Example:

```python
from pydantic import BaseModel


class BarModel(BaseModel):
    whatever: int


class FooBarModel(BaseModel):
    banana: float
    foo: str
    bar: BarModel


m = FooBarModel(banana=3.14, foo='hello', bar={'whatever': 123})

print(dict(m))
#> {'banana': 3.14, 'foo': 'hello', 'bar': BarModel(whatever=123)}
for name, value in m:
    print(f'{name}: {value}')
    #> banana: 3.14
    #> foo: hello
    #> bar: whatever=123
```

Note also that [`RootModel`](../models/#rootmodel-and-custom-root-types) *does* get converted to a dictionary with the key `'root'`.

## Custom serializers

Pydantic provides several functional serializers to customise how a model is serialized to a dictionary or JSON.

- @field_serializer
- @model_serializer
- PlainSerializer
- WrapSerializer

Serialization can be customised on a field using the @field_serializer decorator, and on a model using the @model_serializer decorator.

```python
from datetime import datetime, timedelta, timezone
from typing import Any

from pydantic import BaseModel, ConfigDict, field_serializer, model_serializer


class WithCustomEncoders(BaseModel):
    model_config = ConfigDict(ser_json_timedelta='iso8601')

    dt: datetime
    diff: timedelta

    @field_serializer('dt')
    def serialize_dt(self, dt: datetime, _info):
        return dt.timestamp()


m = WithCustomEncoders(
    dt=datetime(2032, 6, 1, tzinfo=timezone.utc), diff=timedelta(hours=100)
)
print(m.model_dump_json())
#> {"dt":1969660800.0,"diff":"P4DT4H"}


class Model(BaseModel):
    x: str

    @model_serializer
    def ser_model(self) -> dict[str, Any]:
        return {'x': f'serialized {self.x}'}


print(Model(x='test value').model_dump_json())
#> {"x":"serialized test value"}
```

Note

A single serializer can also be called on all fields by passing the special value '\*' to the @field_serializer decorator.

In addition, PlainSerializer and WrapSerializer enable you to use a function to modify the output of serialization.

Both serializers accept optional arguments including:

- `return_type` specifies the return type for the function. If omitted it will be inferred from the type annotation.
- `when_used` specifies when this serializer should be used. Accepts a string with values 'always', 'unless-none', 'json', and 'json-unless-none'. Defaults to 'always'.

`PlainSerializer` uses a simple function to modify the output of serialization.
```python
from typing import Annotated

from pydantic import BaseModel
from pydantic.functional_serializers import PlainSerializer

FancyInt = Annotated[
    int, PlainSerializer(lambda x: f'{x:,}', return_type=str, when_used='json')
]


class MyModel(BaseModel):
    x: FancyInt


print(MyModel(x=1234).model_dump())
#> {'x': 1234}

print(MyModel(x=1234).model_dump(mode='json'))
#> {'x': '1,234'}
```

`WrapSerializer` receives the raw inputs along with a handler function that applies the standard serialization logic, and can modify the resulting value before returning it as the final output of serialization.

```python
from typing import Annotated, Any

from pydantic import BaseModel, SerializerFunctionWrapHandler
from pydantic.functional_serializers import WrapSerializer


def ser_wrap(v: Any, nxt: SerializerFunctionWrapHandler) -> str:
    return f'{nxt(v + 1):,}'


FancyInt = Annotated[int, WrapSerializer(ser_wrap, when_used='json')]


class MyModel(BaseModel):
    x: FancyInt


print(MyModel(x=1234).model_dump())
#> {'x': 1234}

print(MyModel(x=1234).model_dump(mode='json'))
#> {'x': '1,235'}
```

### Overriding the return type when dumping a model

While the return value of `.model_dump()` can usually be described as `dict[str, Any]`, through the use of `@model_serializer` you can actually cause it to return a value that doesn't match this signature:

```python
from pydantic import BaseModel, model_serializer


class Model(BaseModel):
    x: str

    @model_serializer
    def ser_model(self) -> str:
        return self.x


print(Model(x='not a dict').model_dump())
#> not a dict
```

If you want to do this and still get proper type-checking for this method, you can override `.model_dump()` in an `if TYPE_CHECKING:` block:

```python
from __future__ import annotations

from typing import TYPE_CHECKING, Any, Literal

from pydantic import BaseModel, model_serializer


class Model(BaseModel):
    x: str

    @model_serializer
    def ser_model(self) -> str:
        return self.x

    if TYPE_CHECKING:
        # Ensure type checkers see the correct return type
        def model_dump(
            self,
            *,
            mode: Literal['json', 'python'] | str = 'python',
            include: Any = None,
            exclude: Any = None,
            by_alias: bool | None = False,
            exclude_unset: bool = False,
            exclude_defaults: bool = False,
            exclude_none: bool = False,
            round_trip: bool = False,
            warnings: bool = True,
        ) -> str: ...
```

This trick is actually used in [`RootModel`](../models/#rootmodel-and-custom-root-types) for precisely this purpose.
## Serializing subclasses

### Subclasses of standard types

Subclasses of standard types are automatically dumped like their super-classes:

```python
from datetime import date, timedelta
from typing import Any

from pydantic_core import core_schema

from pydantic import BaseModel, GetCoreSchemaHandler


class DayThisYear(date):
    """
    Contrived example of a special type of date that
    takes an int and interprets it as a day in the current year
    """

    @classmethod
    def __get_pydantic_core_schema__(
        cls, source: type[Any], handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        return core_schema.no_info_after_validator_function(
            cls.validate,
            core_schema.int_schema(),
            serialization=core_schema.format_ser_schema('%Y-%m-%d'),
        )

    @classmethod
    def validate(cls, v: int):
        return date(2023, 1, 1) + timedelta(days=v)


class FooModel(BaseModel):
    date: DayThisYear


m = FooModel(date=300)
print(m.model_dump_json())
#> {"date":"2023-10-28"}
```

### Subclass instances for fields of `BaseModel`, dataclasses, `TypedDict`

When using fields whose annotations are themselves struct-like types (e.g., `BaseModel` subclasses, dataclasses, etc.), the default behavior is to serialize the attribute value as though it was an instance of the annotated type, even if it is a subclass. More specifically, only the fields from the *annotated* type will be included in the dumped object:

```python
from pydantic import BaseModel


class User(BaseModel):
    name: str


class UserLogin(User):
    password: str


class OuterModel(BaseModel):
    user: User


user = UserLogin(name='pydantic', password='hunter2')

m = OuterModel(user=user)
print(m)
#> user=UserLogin(name='pydantic', password='hunter2')
print(m.model_dump())  # note: the password field is not included
#> {'user': {'name': 'pydantic'}}
```

Migration Warning

This behavior is different from how things worked in Pydantic V1, where we would always include all (subclass) fields when recursively dumping models to dicts. The motivation behind this change in behavior is that it helps ensure that you know precisely which fields could be included when serializing, even if subclasses get passed when instantiating the object. In particular, this can help prevent surprises when adding sensitive information like secrets as fields of subclasses.

### Serializing with duck-typing 🦆

What is serialization with duck typing?

Duck-typing serialization is the behavior of serializing an object based on the fields present in the object itself, rather than the fields present in the schema of the object. This means that when an object is serialized, fields present in a subclass, but not in the original schema, will be included in the serialized output.

This behavior was the default in Pydantic V1, but was changed in V2 to help ensure that you know precisely which fields would be included when serializing, even if subclasses get passed when instantiating the object. This helps prevent security risks when serializing subclasses with sensitive information, for example.

If you want v1-style duck-typing serialization behavior, you can use a runtime setting, or annotate individual types.
- Field / type level: use the `SerializeAsAny` annotation - Runtime level: use the `serialize_as_any` flag when calling `model_dump()` or `model_dump_json()` We discuss these options below in more detail: #### `SerializeAsAny` annotation If you want duck-typing serialization behavior, this can be done using the `SerializeAsAny` annotation on a type: ```python from pydantic import BaseModel, SerializeAsAny class User(BaseModel): name: str class UserLogin(User): password: str class OuterModel(BaseModel): as_any: SerializeAsAny[User] as_user: User user = UserLogin(name='pydantic', password='password') print(OuterModel(as_any=user, as_user=user).model_dump()) """ { 'as_any': {'name': 'pydantic', 'password': 'password'}, 'as_user': {'name': 'pydantic'}, } """ ``` When a field is annotated as `SerializeAsAny[<SomeType>]`, the validation behavior will be the same as if it was annotated as `<SomeType>`, and type-checkers like mypy will treat the attribute as having the appropriate type as well. But when serializing, the field will be serialized as though the type hint for the field was `Any`, which is where the name comes from. #### `serialize_as_any` runtime setting The `serialize_as_any` runtime setting can be used to serialize model data with or without duck-typed serialization behavior. `serialize_as_any` can be passed as a keyword argument to the `model_dump()` and `model_dump_json()` methods of `BaseModel`s and `RootModel`s. It can also be passed as a keyword argument to the `dump_python()` and `dump_json()` methods of `TypeAdapter`s. If `serialize_as_any` is set to `True`, the model will be serialized using duck-typed serialization behavior, which means that the model will ignore the schema and instead ask the object itself how it should be serialized. In particular, this means that when model subclasses are serialized, fields present in the subclass but not in the original schema will be included. If `serialize_as_any` is set to `False` (which is the default), the model will be serialized using the schema, which means that fields present in a subclass but not in the original schema will be ignored. Why is this flag useful? Sometimes, you want to make sure that no matter what fields might have been added in subclasses, the serialized object will only have the fields listed in the original type definition. This can be useful if you add something like a `password: str` field in a subclass that you don't want to accidentally include in the serialized output. For example: ```python from pydantic import BaseModel class User(BaseModel): name: str class UserLogin(User): password: str class OuterModel(BaseModel): user1: User user2: User user = UserLogin(name='pydantic', password='password') outer_model = OuterModel(user1=user, user2=user) print(outer_model.model_dump(serialize_as_any=True)) # (1)! """ { 'user1': {'name': 'pydantic', 'password': 'password'}, 'user2': {'name': 'pydantic', 'password': 'password'}, } """ print(outer_model.model_dump(serialize_as_any=False)) # (2)! #> {'user1': {'name': 'pydantic'}, 'user2': {'name': 'pydantic'}} ``` 1. With `serialize_as_any` set to `True`, the result matches that of V1. 1. With `serialize_as_any` set to `False` (the V2 default), fields present on the subclass, but not the base class, are not included in serialization. This setting takes effect with nested and recursive patterns as well.
For example: ```python from pydantic import BaseModel class User(BaseModel): name: str friends: list['User'] class UserLogin(User): password: str class OuterModel(BaseModel): user: User user = UserLogin( name='samuel', password='pydantic-pw', friends=[UserLogin(name='sebastian', password='fastapi-pw', friends=[])], ) print(OuterModel(user=user).model_dump(serialize_as_any=True)) # (1)! """ { 'user': { 'name': 'samuel', 'friends': [ {'name': 'sebastian', 'friends': [], 'password': 'fastapi-pw'} ], 'password': 'pydantic-pw', } } """ print(OuterModel(user=user).model_dump(serialize_as_any=False)) # (2)! """ {'user': {'name': 'samuel', 'friends': [{'name': 'sebastian', 'friends': []}]}} """ ``` 1. Even nested `User` model instances are dumped with fields unique to `User` subclasses. 1. Even nested `User` model instances are dumped without fields unique to `User` subclasses. Note The behavior of the `serialize_as_any` runtime flag is almost the same as the behavior of the `SerializeAsAny` annotation. There are a few nuanced differences that we're working to resolve, but for the most part, you can expect the same behavior from both. See more about the differences in this [active issue](https://github.com/pydantic/pydantic/issues/9049) #### Overriding the `serialize_as_any` default (False) You can override the default setting for `serialize_as_any` by configuring a subclass of `BaseModel` that overrides the default for the `serialize_as_any` parameter to `model_dump()` and `model_dump_json()`, and then use that as the base class (instead of `pydantic.BaseModel`) for any model you want to have this default behavior. For example, you could do the following if you want to use duck-typing serialization by default: ```python from typing import Any from pydantic import BaseModel, SecretStr class MyBaseModel(BaseModel): def model_dump(self, **kwargs) -> dict[str, Any]: return super().model_dump(serialize_as_any=True, **kwargs) def model_dump_json(self, **kwargs) -> str: return super().model_dump_json(serialize_as_any=True, **kwargs) class User(MyBaseModel): name: str class UserInfo(User): password: SecretStr class OuterModel(MyBaseModel): user: User u = OuterModel(user=UserInfo(name='John', password='secret_pw')) print(u.model_dump_json()) # (1)! #> {"user":{"name":"John","password":"**********"}} ``` 1. By default, `model_dump_json` will use duck-typing serialization behavior, which means that the `password` field is included in the output. ## `pickle.dumps(model)` Pydantic models support efficient pickling and unpickling. ```python import pickle from pydantic import BaseModel class FooBarModel(BaseModel): a: str b: int m = FooBarModel(a='hello', b=123) print(m) #> a='hello' b=123 data = pickle.dumps(m) print(data[:20]) #> b'\x80\x04\x95\x95\x00\x00\x00\x00\x00\x00\x00\x8c\x08__main_' m2 = pickle.loads(data) print(m2) #> a='hello' b=123 ``` ## Advanced include and exclude The `model_dump` and `model_dump_json` methods support `include` and `exclude` parameters which can either be sets or dictionaries. 
This allows nested selection of which fields to export: ```python from pydantic import BaseModel, SecretStr class User(BaseModel): id: int username: str password: SecretStr class Transaction(BaseModel): id: str user: User value: int t = Transaction( id='1234567890', user=User(id=42, username='JohnDoe', password='hashedpassword'), value=9876543210, ) # using a set: print(t.model_dump(exclude={'user', 'value'})) #> {'id': '1234567890'} # using a dict: print(t.model_dump(exclude={'user': {'username', 'password'}, 'value': True})) #> {'id': '1234567890', 'user': {'id': 42}} print(t.model_dump(include={'id': True, 'user': {'id'}})) #> {'id': '1234567890', 'user': {'id': 42}} ``` Using `True` indicates that we want to exclude or include an entire key, just as if we included it in a set (note that using `False` isn't supported). This can be done at any depth level. Special care must be taken when including or excluding fields from a list or tuple of submodels or dictionaries. In this scenario, `model_dump` and related methods expect integer keys for element-wise inclusion or exclusion. To exclude a field from **every** member of a list or tuple, the dictionary key `'__all__'` can be used, as shown here: ```python import datetime from pydantic import BaseModel, SecretStr class Country(BaseModel): name: str phone_code: int class Address(BaseModel): post_code: int country: Country class CardDetails(BaseModel): number: SecretStr expires: datetime.date class Hobby(BaseModel): name: str info: str class User(BaseModel): first_name: str second_name: str address: Address card_details: CardDetails hobbies: list[Hobby] user = User( first_name='John', second_name='Doe', address=Address( post_code=123456, country=Country(name='USA', phone_code=1) ), card_details=CardDetails( number='4212934504460000', expires=datetime.date(2020, 5, 1) ), hobbies=[ Hobby(name='Programming', info='Writing code and stuff'), Hobby(name='Gaming', info='Hell Yeah!!!'), ], ) exclude_keys = { 'second_name': True, 'address': {'post_code': True, 'country': {'phone_code'}}, 'card_details': True, # You can exclude fields from specific members of a tuple/list by index: 'hobbies': {-1: {'info'}}, } include_keys = { 'first_name': True, 'address': {'country': {'name'}}, 'hobbies': {0: True, -1: {'name'}}, } # would be the same as user.model_dump(exclude=exclude_keys) in this case: print(user.model_dump(include=include_keys)) """ { 'first_name': 'John', 'address': {'country': {'name': 'USA'}}, 'hobbies': [ {'name': 'Programming', 'info': 'Writing code and stuff'}, {'name': 'Gaming'}, ], } """ # To exclude a field from all members of a nested list or tuple, use "__all__": print(user.model_dump(exclude={'hobbies': {'__all__': {'info'}}})) """ { 'first_name': 'John', 'second_name': 'Doe', 'address': { 'post_code': 123456, 'country': {'name': 'USA', 'phone_code': 1}, }, 'card_details': { 'number': SecretStr('**********'), 'expires': datetime.date(2020, 5, 1), }, 'hobbies': [{'name': 'Programming'}, {'name': 'Gaming'}], } """ ``` The same holds for the `model_dump_json` method. 
### Model- and field-level include and exclude In addition to the explicit `exclude` and `include` arguments passed to the `model_dump` and `model_dump_json` methods, we can also pass an `exclude` argument directly to the `Field` constructor. Setting `exclude` on the field constructor (`Field(exclude=True)`) takes priority over the `exclude`/`include` on `model_dump` and `model_dump_json`: ```python from pydantic import BaseModel, Field, SecretStr class User(BaseModel): id: int username: str password: SecretStr = Field(exclude=True) class Transaction(BaseModel): id: str value: int = Field(exclude=True) t = Transaction( id='1234567890', value=9876543210, ) print(t.model_dump()) #> {'id': '1234567890'} print(t.model_dump(include={'id': True, 'value': True})) # (1)! #> {'id': '1234567890'} ``` 1. `value` is excluded from the output because it is excluded via `Field(exclude=True)`. That being said, setting `exclude` on the field constructor (`Field(exclude=True)`) does not take priority over the `exclude_unset`, `exclude_none`, and `exclude_defaults` parameters on `model_dump` and `model_dump_json`: ```python from typing import Optional from pydantic import BaseModel, Field class Person(BaseModel): name: str age: Optional[int] = Field(None, exclude=False) person = Person(name='Jeremy') print(person.model_dump()) #> {'name': 'Jeremy', 'age': None} print(person.model_dump(exclude_none=True)) # (1)! #> {'name': 'Jeremy'} print(person.model_dump(exclude_unset=True)) # (2)! #> {'name': 'Jeremy'} print(person.model_dump(exclude_defaults=True)) # (3)! #> {'name': 'Jeremy'} ``` 1. `age` excluded from the output because `exclude_none` was set to `True`, and `age` is `None`. 1. `age` excluded from the output because `exclude_unset` was set to `True`, and `age` was not set in the Person constructor. 1. `age` excluded from the output because `exclude_defaults` was set to `True`, and `age` takes the default value of `None`. ## Serialization Context You can pass a context object to the serialization methods, which can be accessed from the `info` parameter to decorated serializer functions. This is useful when you need to dynamically update the serialization behavior at runtime.
For example, if you wanted a field to be dumped depending on a dynamically controllable set of allowed values, this could be done by passing the allowed values by context: ```python from pydantic import BaseModel, SerializationInfo, field_serializer class Model(BaseModel): text: str @field_serializer('text') def remove_stopwords(self, v: str, info: SerializationInfo): context = info.context if context: stopwords = context.get('stopwords', set()) v = ' '.join(w for w in v.split() if w.lower() not in stopwords) return v model = Model.model_construct(**{'text': 'This is an example document'}) print(model.model_dump()) # no context #> {'text': 'This is an example document'} print(model.model_dump(context={'stopwords': ['this', 'is', 'an']})) #> {'text': 'example document'} print(model.model_dump(context={'stopwords': ['document']})) #> {'text': 'This is an example'} ``` Similarly, you can [use a context for validation](../validators/#validation-context). ## `model_copy(...)` API Documentation pydantic.main.BaseModel.model_copy `model_copy()` allows models to be duplicated (with optional updates), which is particularly useful when working with frozen models. Example: ```python from pydantic import BaseModel class BarModel(BaseModel): whatever: int class FooBarModel(BaseModel): banana: float foo: str bar: BarModel m = FooBarModel(banana=3.14, foo='hello', bar={'whatever': 123}) print(m.model_copy(update={'banana': 0})) #> banana=0 foo='hello' bar=BarModel(whatever=123) print(id(m.bar) == id(m.model_copy().bar)) #> True # normal copy gives the same object reference for bar print(id(m.bar) == id(m.model_copy(deep=True).bar)) #> False # deep copy gives a new object reference for `bar` ``` API Documentation pydantic.types.Strict By default, Pydantic will attempt to coerce values to the desired type when possible. For example, you can pass the string `"123"` as the input to an `int` field, and it will be converted to `123`. This coercion behavior is useful in many scenarios — think: UUIDs, URL parameters, HTTP headers, environment variables, user input, etc. However, there are also situations where this is not desirable, and you want Pydantic to error instead of coercing data. To better support this use case, Pydantic provides a "strict mode" that can be enabled on a per-model, per-field, or even per-validation-call basis. When strict mode is enabled, Pydantic will be much less lenient when coercing data, and will instead error if the data is not of the correct type. 
Here is a brief example showing the difference between validation behavior in strict and the default/"lax" mode: ```python from pydantic import BaseModel, ValidationError class MyModel(BaseModel): x: int print(MyModel.model_validate({'x': '123'})) # lax mode #> x=123 try: MyModel.model_validate({'x': '123'}, strict=True) # strict mode except ValidationError as exc: print(exc) """ 1 validation error for MyModel x Input should be a valid integer [type=int_type, input_value='123', input_type=str] """ ``` There are various ways to get strict-mode validation while using Pydantic, which will be discussed in more detail below: - [Passing `strict=True` to the validation methods](#strict-mode-in-method-calls), such as `BaseModel.model_validate`, `TypeAdapter.validate_python`, and similar for JSON - [Using `Field(strict=True)`](#strict-mode-with-field) with fields of a `BaseModel`, `dataclass`, or `TypedDict` - [Using `pydantic.types.Strict` as a type annotation](#strict-mode-with-annotated-strict) on a field - Pydantic provides some type aliases that are already annotated with `Strict`, such as `pydantic.types.StrictInt` - [Using `ConfigDict(strict=True)`](#strict-mode-with-configdict) ## Type coercions in strict mode For most types, when validating data from Python in strict mode, only instances of the exact types are accepted. For example, when validating an `int` field, only instances of `int` are accepted; passing instances of `float` or `str` will result in raising a `ValidationError`. Note that we are looser when validating data from JSON in strict mode. For example, when validating a `UUID` field, instances of `str` will be accepted when validating from JSON, but not from Python: ```python import json from uuid import UUID from pydantic import BaseModel, ValidationError class MyModel(BaseModel): guid: UUID data = {'guid': '12345678-1234-1234-1234-123456789012'} print(MyModel.model_validate(data)) # OK: lax #> guid=UUID('12345678-1234-1234-1234-123456789012') print( MyModel.model_validate_json(json.dumps(data), strict=True) ) # OK: strict, but from json #> guid=UUID('12345678-1234-1234-1234-123456789012') try: MyModel.model_validate(data, strict=True) # Not OK: strict, from python except ValidationError as exc: print(exc.errors(include_url=False)) """ [ { 'type': 'is_instance_of', 'loc': ('guid',), 'msg': 'Input should be an instance of UUID', 'input': '12345678-1234-1234-1234-123456789012', 'ctx': {'class': 'UUID'}, } ] """ ``` For more details about what types are allowed as inputs in strict mode, you can review the [Conversion Table](../conversion_table/). ## Strict mode in method calls All the examples included so far get strict-mode validation through the use of `strict=True` as a keyword argument to the validation methods.
While we have shown this for `BaseModel.model_validate`, this also works with arbitrary types through the use of `TypeAdapter`: ```python from pydantic import TypeAdapter, ValidationError print(TypeAdapter(bool).validate_python('yes')) # OK: lax #> True try: TypeAdapter(bool).validate_python('yes', strict=True) # Not OK: strict except ValidationError as exc: print(exc) """ 1 validation error for bool Input should be a valid boolean [type=bool_type, input_value='yes', input_type=str] """ ``` Note this also works even when using more "complex" types in `TypeAdapter`: ```python from dataclasses import dataclass from pydantic import TypeAdapter, ValidationError @dataclass class MyDataclass: x: int try: TypeAdapter(MyDataclass).validate_python({'x': '123'}, strict=True) except ValidationError as exc: print(exc) """ 1 validation error for MyDataclass Input should be an instance of MyDataclass [type=dataclass_exact_type, input_value={'x': '123'}, input_type=dict] """ ``` This also works with the `TypeAdapter.validate_json` and `BaseModel.model_validate_json` methods: ```python import json from uuid import UUID from pydantic import BaseModel, TypeAdapter, ValidationError try: TypeAdapter(list[int]).validate_json('["1", 2, "3"]', strict=True) except ValidationError as exc: print(exc) """ 2 validation errors for list[int] 0 Input should be a valid integer [type=int_type, input_value='1', input_type=str] 2 Input should be a valid integer [type=int_type, input_value='3', input_type=str] """ class Model(BaseModel): x: int y: UUID data = {'x': '1', 'y': '12345678-1234-1234-1234-123456789012'} try: Model.model_validate(data, strict=True) except ValidationError as exc: # Neither x nor y are valid in strict mode from python: print(exc) """ 2 validation errors for Model x Input should be a valid integer [type=int_type, input_value='1', input_type=str] y Input should be an instance of UUID [type=is_instance_of, input_value='12345678-1234-1234-1234-123456789012', input_type=str] """ json_data = json.dumps(data) try: Model.model_validate_json(json_data, strict=True) except ValidationError as exc: # From JSON, x is still not valid in strict mode, but y is: print(exc) """ 1 validation error for Model x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` ## Strict mode with `Field` For individual fields on a model, you can [set `strict=True` on the field](../../api/fields/#pydantic.fields.Field). This will cause strict-mode validation to be used for that field, even when the validation methods are called without `strict=True`. 
Only the fields for which `strict=True` is set will be affected: ```python from pydantic import BaseModel, Field, ValidationError class User(BaseModel): name: str age: int n_pets: int user = User(name='John', age='42', n_pets='1') print(user) #> name='John' age=42 n_pets=1 class AnotherUser(BaseModel): name: str age: int = Field(strict=True) n_pets: int try: anotheruser = AnotherUser(name='John', age='42', n_pets='1') except ValidationError as e: print(e) """ 1 validation error for AnotherUser age Input should be a valid integer [type=int_type, input_value='42', input_type=str] """ ``` Note that making fields strict will also affect the validation performed when instantiating the model class: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(strict=True) y: int = Field(strict=False) try: Model(x='1', y='2') except ValidationError as exc: print(exc) """ 1 validation error for Model x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` ### Using `Field` as an annotation Note that `Field(strict=True)` (or `Field` with any other keyword arguments) can be used as an annotation if necessary, e.g., when working with `TypedDict`: ```python from typing import Annotated from typing_extensions import TypedDict from pydantic import Field, TypeAdapter, ValidationError class MyDict(TypedDict): x: Annotated[int, Field(strict=True)] try: TypeAdapter(MyDict).validate_python({'x': '1'}) except ValidationError as exc: print(exc) """ 1 validation error for MyDict x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` ## Strict mode with `Annotated[..., Strict()]` API Documentation pydantic.types.Strict Pydantic also provides the [`Strict`](../../api/types/#pydantic.types.Strict) class, which is intended for use as metadata with the `typing.Annotated` class; this annotation indicates that the annotated field should be validated in strict mode: ```python from typing import Annotated from pydantic import BaseModel, Strict, ValidationError class User(BaseModel): name: str age: int is_active: Annotated[bool, Strict()] User(name='David', age=33, is_active=True) try: User(name='David', age=33, is_active='True') except ValidationError as exc: print(exc) """ 1 validation error for User is_active Input should be a valid boolean [type=bool_type, input_value='True', input_type=str] """ ``` This is, in fact, the method used to implement some of the strict-out-of-the-box types provided by Pydantic, such as [`StrictInt`](../../api/types/#pydantic.types.StrictInt).
## Strict mode with `ConfigDict` ### `BaseModel` If you want to enable strict mode for all fields on a complex input type, you can use [`ConfigDict(strict=True)`](../../api/config/#pydantic.config.ConfigDict) in the `model_config`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class User(BaseModel): model_config = ConfigDict(strict=True) name: str age: int is_active: bool try: User(name='David', age='33', is_active='yes') except ValidationError as exc: print(exc) """ 2 validation errors for User age Input should be a valid integer [type=int_type, input_value='33', input_type=str] is_active Input should be a valid boolean [type=bool_type, input_value='yes', input_type=str] """ ``` Note When using `strict=True` through a model's `model_config`, you can still override the strictness of individual fields by setting `strict=False` on individual fields: ```python from pydantic import BaseModel, ConfigDict, Field class User(BaseModel): model_config = ConfigDict(strict=True) name: str age: int = Field(strict=False) ``` Note that strict mode is not recursively applied to nested model fields: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Inner(BaseModel): y: int class Outer(BaseModel): model_config = ConfigDict(strict=True) x: int inner: Inner print(Outer(x=1, inner=Inner(y='2'))) #> x=1 inner=Inner(y=2) try: Outer(x='1', inner=Inner(y='2')) except ValidationError as exc: print(exc) """ 1 validation error for Outer x Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` (This is also the case for dataclasses and `TypedDict`.) If this is undesirable, you should make sure that strict mode is enabled for all the types involved. For example, this can be done for model classes by using a shared base class with `model_config = ConfigDict(strict=True)`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class MyBaseModel(BaseModel): model_config = ConfigDict(strict=True) class Inner(MyBaseModel): y: int class Outer(MyBaseModel): x: int inner: Inner try: Outer.model_validate({'x': 1, 'inner': {'y': '2'}}) except ValidationError as exc: print(exc) """ 1 validation error for Outer inner.y Input should be a valid integer [type=int_type, input_value='2', input_type=str] """ ``` ### Dataclasses and `TypedDict` Pydantic dataclasses behave similarly to the examples shown above with `BaseModel`, just that instead of `model_config` you should use the `config` keyword argument to the `@pydantic.dataclasses.dataclass` decorator. When possible, you can achieve nested strict mode for vanilla dataclasses or `TypedDict` subclasses by annotating fields with the [`pydantic.types.Strict` annotation](#strict-mode-with-annotated-strict). 
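To illustrate the Pydantic dataclass case mentioned above, here is a minimal sketch using the `config` keyword argument (the `User` dataclass and the invalid input are illustrative):

```python
from pydantic import ConfigDict, ValidationError
from pydantic.dataclasses import dataclass


# `config` plays the role that `model_config` plays on a `BaseModel`
@dataclass(config=ConfigDict(strict=True))
class User:
    name: str
    age: int


try:
    # The string '33' is not coerced to an int in strict mode
    User(name='David', age='33')
except ValidationError as exc:
    print(exc)
```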
However, if annotating fields is *not* possible (e.g., when working with third-party types), you can set the config that Pydantic should use for the type by setting the `__pydantic_config__` attribute on the type: ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, ValidationError class Inner(TypedDict): y: int Inner.__pydantic_config__ = ConfigDict(strict=True) class Outer(TypedDict): x: int inner: Inner adapter = TypeAdapter(Outer) print(adapter.validate_python({'x': '1', 'inner': {'y': 2}})) #> {'x': 1, 'inner': {'y': 2}} try: adapter.validate_python({'x': '1', 'inner': {'y': '2'}}) except ValidationError as exc: print(exc) """ 1 validation error for Outer inner.y Input should be a valid integer [type=int_type, input_value='2', input_type=str] """ ``` ### `TypeAdapter` You can also get strict mode through the use of the `config` keyword argument to the [`TypeAdapter`](../../api/type_adapter/) class: ```python from pydantic import ConfigDict, TypeAdapter, ValidationError adapter = TypeAdapter(bool, config=ConfigDict(strict=True)) try: adapter.validate_python('yes') except ValidationError as exc: print(exc) """ 1 validation error for bool Input should be a valid boolean [type=bool_type, input_value='yes', input_type=str] """ ``` ### `@validate_call` Strict mode is also usable with the [`@validate_call`](../../api/validate_call/#pydantic.validate_call_decorator.validate_call) decorator by passing the `config` keyword argument: ```python from pydantic import ConfigDict, ValidationError, validate_call @validate_call(config=ConfigDict(strict=True)) def foo(x: int) -> int: return x try: foo('1') except ValidationError as exc: print(exc) """ 1 validation error for foo 0 Input should be a valid integer [type=int_type, input_value='1', input_type=str] """ ``` You may have types that are not `BaseModel`s that you want to validate data against. Or you may want to validate a `list[SomeModel]`, or dump it to JSON. API Documentation pydantic.type_adapter.TypeAdapter For use cases like this, Pydantic provides TypeAdapter, which can be used for type validation, serialization, and JSON schema generation without needing to create a BaseModel.
A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more): ```python from typing_extensions import TypedDict from pydantic import TypeAdapter, ValidationError class User(TypedDict): name: str id: int user_list_adapter = TypeAdapter(list[User]) user_list = user_list_adapter.validate_python([{'name': 'Fred', 'id': '3'}]) print(repr(user_list)) #> [{'name': 'Fred', 'id': 3}] try: user_list_adapter.validate_python( [{'name': 'Fred', 'id': 'wrong', 'other': 'no'}] ) except ValidationError as e: print(e) """ 1 validation error for list[User] 0.id Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='wrong', input_type=str] """ print(repr(user_list_adapter.dump_json(user_list))) #> b'[{"name":"Fred","id":3}]' ``` `dump_json` returns `bytes` `TypeAdapter`'s `dump_json` method returns a `bytes` object, unlike the corresponding method for `BaseModel`, `model_dump_json`, which returns a `str`. The reason for this discrepancy is that in V1, model dumping returned a `str` type, so this behavior is retained in V2 for backwards compatibility. For the `BaseModel` case, `bytes` are coerced to `str` types, but `bytes` are often the desired end type. Hence, for the new `TypeAdapter` class in V2, the return type is simply `bytes`, which can easily be coerced to a `str` type if desired. Note Despite some overlap in use cases with RootModel, TypeAdapter should not be used as a type annotation for specifying fields of a `BaseModel`, etc. ## Parsing data into a specified type TypeAdapter can be used to apply the parsing logic to populate Pydantic models in a more ad-hoc way. Its `validate_python` method behaves similarly to `BaseModel.model_validate`, but works with arbitrary Pydantic-compatible types. This is especially useful when you want to parse results into a type that is not a direct subclass of BaseModel. For example: ```python from pydantic import BaseModel, TypeAdapter class Item(BaseModel): id: int name: str # `item_data` could come from an API call, e.g., via something like: # item_data = requests.get('https://my-api.com/items').json() item_data = [{'id': 1, 'name': 'My Item'}] items = TypeAdapter(list[Item]).validate_python(item_data) print(items) #> [Item(id=1, name='My Item')] ``` TypeAdapter is capable of parsing data into any of the types Pydantic can handle as fields of a BaseModel. Performance considerations When creating an instance of TypeAdapter, the provided type must be analyzed and converted into a pydantic-core schema. This comes with some non-trivial overhead, so it is recommended to create a `TypeAdapter` for a given type just once and reuse it in loops or other performance-critical code.
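For instance, a common pattern is to build the adapter once at import time and share it. The following is a minimal sketch (the adapter name and helper function are illustrative):

```python
from pydantic import TypeAdapter

# Built once when the module is imported, then shared by every call below
INT_LIST_ADAPTER = TypeAdapter(list[int])


def parse_batches(batches: list[list[str]]) -> list[list[int]]:
    # Reusing the adapter avoids re-analyzing `list[int]` on every call
    return [INT_LIST_ADAPTER.validate_python(batch) for batch in batches]


print(parse_batches([['1', '2'], ['3']]))
#> [[1, 2], [3]]
```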
## Rebuilding a `TypeAdapter`'s schema In v2.10+, `TypeAdapter`s support deferred schema building and manual rebuilds. This is helpful for the case of: - Types with forward references - Types for which core schema builds are expensive When you initialize a TypeAdapter with a type, Pydantic analyzes the type and creates a core schema for it. This core schema contains the information needed to validate and serialize data for that type. See the [architecture documentation](../../internals/architecture/) for more information on core schemas. If you set `defer_build` to `True` when initializing a `TypeAdapter`, Pydantic will defer building the core schema until the first time it is needed (for validation or serialization). In order to manually trigger the building of the core schema, you can call the `rebuild` method on the `TypeAdapter` instance: ```python from pydantic import ConfigDict, TypeAdapter ta = TypeAdapter('MyInt', config=ConfigDict(defer_build=True)) # some time later, the forward reference is defined MyInt = int ta.rebuild() assert ta.validate_python(1) == 1 ``` Where possible, Pydantic uses [standard library types](../../api/standard_library_types/) to define fields, thus smoothing the learning curve. For many useful applications, however, no standard library type exists, so Pydantic implements many commonly used types. There are also more complex types that can be found in the [Pydantic Extra Types](https://github.com/pydantic/pydantic-extra-types) package. If no existing type suits your purpose, you can also implement your [own Pydantic-compatible types](#custom-types) with custom properties and validation. The following sections describe the types supported by Pydantic. - [Standard Library Types](../../api/standard_library_types/) — types from the Python standard library. - [Strict Types](#strict-types) — types that enable you to prevent coercion from compatible types. - [Custom Data Types](#custom-types) — create your own custom data types. - [Field Type Conversions](../conversion_table/) — strict and lax conversion between different field types. ## Type conversion During validation, Pydantic can coerce data into expected types. There are two modes of coercion: strict and lax. See [Conversion Table](../conversion_table/) for more details on how Pydantic converts data in both strict and lax modes. See [Strict mode](../strict_mode/) and [Strict Types](#strict-types) for details on enabling strict coercion. ## Strict Types Pydantic provides the following strict types: - StrictBool - StrictBytes - StrictFloat - StrictInt - StrictStr These types will only pass validation when the validated value is of the respective type or is a subtype of that type. ### Constrained types This behavior is also exposed via the `strict` field of the constrained types and can be combined with a multitude of complex validation rules. See the individual type signatures for supported arguments. - conbytes() - condate() - condecimal() - confloat() - confrozenset() - conint() - conlist() - conset() - constr() The following caveats apply: - `StrictBytes` (and the `strict` option of `conbytes()`) will accept both `bytes` and `bytearray` types. - `StrictInt` (and the `strict` option of `conint()`) will not accept `bool` types, even though `bool` is a subclass of `int` in Python. Other subclasses will work. - `StrictFloat` (and the `strict` option of `confloat()`) will not accept `int`. Besides the above, you can also have a `FiniteFloat` type that will only accept finite values (i.e. not `inf`, `-inf` or `nan`).
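Here is a short sketch illustrating two of these caveats (the model and inputs are illustrative):

```python
from pydantic import BaseModel, StrictBytes, StrictInt, ValidationError


class Model(BaseModel):
    n: StrictInt
    data: StrictBytes


# `bytearray` passes `StrictBytes` validation:
print(Model(n=1, data=bytearray(b'abc')))

try:
    # `bool` is rejected by `StrictInt`, even though `bool` subclasses `int`
    Model(n=True, data=b'abc')
except ValidationError as exc:
    print(exc)
```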
## Custom Types You can also define your own custom data types. There are several ways to achieve it. ### Using the annotated pattern The [annotated pattern](../fields/#the-annotated-pattern) can be used to make types reusable across your code base. For example, to create a type representing a positive integer: ```python from typing import Annotated from pydantic import Field, TypeAdapter, ValidationError PositiveInt = Annotated[int, Field(gt=0)] # (1)! ta = TypeAdapter(PositiveInt) print(ta.validate_python(1)) #> 1 try: ta.validate_python(-1) except ValidationError as exc: print(exc) """ 1 validation error for constrained-int Input should be greater than 0 [type=greater_than, input_value=-1, input_type=int] """ ``` 1. Note that you can also use constraints from the [annotated-types](https://github.com/annotated-types/annotated-types) library to make this Pydantic-agnostic: ```python from annotated_types import Gt PositiveInt = Annotated[int, Gt(0)] ``` #### Adding validation and serialization You can add or override validation, serialization, and JSON schemas to an arbitrary type using the markers that Pydantic exports: ```python from typing import Annotated from pydantic import ( AfterValidator, PlainSerializer, TypeAdapter, WithJsonSchema, ) TruncatedFloat = Annotated[ float, AfterValidator(lambda x: round(x, 1)), PlainSerializer(lambda x: f'{x:.1e}', return_type=str), WithJsonSchema({'type': 'string'}, mode='serialization'), ] ta = TypeAdapter(TruncatedFloat) input = 1.02345 assert input != 1.0 assert ta.validate_python(input) == 1.0 assert ta.dump_json(input) == b'"1.0e+00"' assert ta.json_schema(mode='validation') == {'type': 'number'} assert ta.json_schema(mode='serialization') == {'type': 'string'} ``` #### Generics Type variables can be used within the Annotated type: ```python from typing import Annotated, TypeVar from annotated_types import Gt, Len from pydantic import TypeAdapter, ValidationError T = TypeVar('T') ShortList = Annotated[list[T], Len(max_length=4)] ta = TypeAdapter(ShortList[int]) v = ta.validate_python([1, 2, 3, 4]) assert v == [1, 2, 3, 4] try: ta.validate_python([1, 2, 3, 4, 5]) except ValidationError as exc: print(exc) """ 1 validation error for list[int] List should have at most 4 items after validation, not 5 [type=too_long, input_value=[1, 2, 3, 4, 5], input_type=list] """ PositiveList = list[Annotated[T, Gt(0)]] ta = TypeAdapter(PositiveList[float]) v = ta.validate_python([1.0]) assert type(v[0]) is float try: ta.validate_python([-1.0]) except ValidationError as exc: print(exc) """ 1 validation error for list[constrained-float] 0 Input should be greater than 0 [type=greater_than, input_value=-1.0, input_type=float] """ ``` ### Named type aliases The above examples make use of *implicit* type aliases, assigned to a variable. At runtime, Pydantic has no way of knowing the name of the variable it was assigned to, and this can be problematic for two reasons: - The [JSON Schema](../json_schema/) of the alias won't be converted into a [definition](https://json-schema.org/understanding-json-schema/structuring#defs). This is mostly useful when you are using the alias more than once in a model definition. - In most cases, [recursive type aliases](#named-recursive-types) won't work. 
By leveraging the new [`type` statement](https://typing.readthedocs.io/en/latest/spec/aliases.html#type-statement) (introduced in [PEP 695](https://peps.python.org/pep-0695/)), you can define aliases as follows (the first example uses `TypeAliasType` from `typing_extensions`, which provides the same functionality on Python versions prior to 3.12): ```python from typing import Annotated from annotated_types import Gt from typing_extensions import TypeAliasType from pydantic import BaseModel PositiveIntList = TypeAliasType('PositiveIntList', list[Annotated[int, Gt(0)]]) class Model(BaseModel): x: PositiveIntList y: PositiveIntList print(Model.model_json_schema()) # (1)! """ { '$defs': { 'PositiveIntList': { 'items': {'exclusiveMinimum': 0, 'type': 'integer'}, 'type': 'array', } }, 'properties': { 'x': {'$ref': '#/$defs/PositiveIntList'}, 'y': {'$ref': '#/$defs/PositiveIntList'}, }, 'required': ['x', 'y'], 'title': 'Model', 'type': 'object', } """ ``` 1. If `PositiveIntList` were to be defined as an implicit type alias, its definition would have been duplicated in both `'x'` and `'y'`. ```python from typing import Annotated from annotated_types import Gt from pydantic import BaseModel type PositiveIntList = list[Annotated[int, Gt(0)]] class Model(BaseModel): x: PositiveIntList y: PositiveIntList print(Model.model_json_schema()) # (1)! """ { '$defs': { 'PositiveIntList': { 'items': {'exclusiveMinimum': 0, 'type': 'integer'}, 'type': 'array', } }, 'properties': { 'x': {'$ref': '#/$defs/PositiveIntList'}, 'y': {'$ref': '#/$defs/PositiveIntList'}, }, 'required': ['x', 'y'], 'title': 'Model', 'type': 'object', } """ ``` 1. If `PositiveIntList` were to be defined as an implicit type alias, its definition would have been duplicated in both `'x'` and `'y'`. When to use named type aliases While named (PEP 695) and implicit type aliases are meant to be equivalent for static type checkers, Pydantic will *not* understand field-specific metadata inside named aliases. That is, metadata such as `alias`, `default`, or `deprecated` *cannot* be used: ```python from typing import Annotated from typing_extensions import TypeAliasType from pydantic import BaseModel, Field MyAlias = TypeAliasType('MyAlias', Annotated[int, Field(default=1)]) class Model(BaseModel): x: MyAlias # This is not allowed ``` ```python from typing import Annotated from pydantic import BaseModel, Field type MyAlias = Annotated[int, Field(default=1)] class Model(BaseModel): x: MyAlias # This is not allowed ``` Only metadata that can be applied to the annotated type itself is allowed (e.g. [validation constraints](../fields/#field-constraints) and JSON metadata). Trying to support field-specific metadata would require eagerly inspecting the type alias's `__value__`, and as such Pydantic wouldn't be able to have the alias stored as a JSON Schema definition. Note As with implicit type aliases, type variables can also be used inside the generic alias: ```python from typing import Annotated, TypeVar from annotated_types import Len from typing_extensions import TypeAliasType T = TypeVar('T') ShortList = TypeAliasType( 'ShortList', Annotated[list[T], Len(max_length=4)], type_params=(T,) ) ``` ```python from typing import Annotated, TypeVar from annotated_types import Len type ShortList[T] = Annotated[list[T], Len(max_length=4)] ``` #### Named recursive types Named type aliases should be used whenever you need to define recursive type aliases (1). 1. For several reasons, Pydantic isn't able to support implicit recursive aliases. For instance, it won't be able to resolve [forward annotations](../forward_annotations/) across modules.
For instance, here is an example definition of a JSON type: ```python from typing import Union from typing_extensions import TypeAliasType from pydantic import TypeAdapter Json = TypeAliasType( 'Json', 'Union[dict[str, Json], list[Json], str, int, float, bool, None]', # (1)! ) ta = TypeAdapter(Json) print(ta.json_schema()) """ { '$defs': { 'Json': { 'anyOf': [ { 'additionalProperties': {'$ref': '#/$defs/Json'}, 'type': 'object', }, {'items': {'$ref': '#/$defs/Json'}, 'type': 'array'}, {'type': 'string'}, {'type': 'integer'}, {'type': 'number'}, {'type': 'boolean'}, {'type': 'null'}, ] } }, '$ref': '#/$defs/Json', } """ ``` 1. Wrapping the annotation in quotes is necessary as it is eagerly evaluated (and `Json` has yet to be defined). ```python from pydantic import TypeAdapter type Json = dict[str, Json] | list[Json] | str | int | float | bool | None # (1)! ta = TypeAdapter(Json) print(ta.json_schema()) """ { '$defs': { 'Json': { 'anyOf': [ { 'additionalProperties': {'$ref': '#/$defs/Json'}, 'type': 'object', }, {'items': {'$ref': '#/$defs/Json'}, 'type': 'array'}, {'type': 'string'}, {'type': 'integer'}, {'type': 'number'}, {'type': 'boolean'}, {'type': 'null'}, ] } }, '$ref': '#/$defs/Json', } """ ``` 1. The value of a named type alias is lazily evaluated, so there's no need to use forward annotations. Tip Pydantic defines a JsonValue type as a convenience. ### Customizing validation with `__get_pydantic_core_schema__` To do more extensive customization of how Pydantic handles custom classes, and in particular when you have access to the class or can subclass it, you can implement a special `__get_pydantic_core_schema__` to tell Pydantic how to generate the `pydantic-core` schema. While `pydantic` uses `pydantic-core` internally to handle validation and serialization, it is a new API for Pydantic V2, thus it is one of the areas most likely to be tweaked in the future and you should try to stick to the built-in constructs like those provided by `annotated-types`, `pydantic.Field`, or `BeforeValidator` and so on. You can implement `__get_pydantic_core_schema__` both on a custom type and on metadata intended to be put in `Annotated`. In both cases the API is middleware-like and similar to that of "wrap" validators: you get a `source_type` (which isn't necessarily the same as the class, in particular for generics) and a `handler` that you can call with a type to either call the next metadata in `Annotated` or call into Pydantic's internal schema generation. The simplest no-op implementation calls the handler with the type you are given, then returns that as the result. You can also choose to modify the type before calling the handler, modify the core schema returned by the handler, or not call the handler at all. #### As a method on a custom type The following is an example of a type that uses `__get_pydantic_core_schema__` to customize how it gets validated. This is equivalent to implementing `__get_validators__` in Pydantic V1. ```python from typing import Any from pydantic_core import CoreSchema, core_schema from pydantic import GetCoreSchemaHandler, TypeAdapter class Username(str): @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: return core_schema.no_info_after_validator_function(cls, handler(str)) ta = TypeAdapter(Username) res = ta.validate_python('abc') assert isinstance(res, Username) assert res == 'abc' ``` See [JSON Schema](../json_schema/) for more details on how to customize JSON schemas for custom types. 
#### As an annotation Often you'll want to parametrize your custom type by more than just generic type parameters (which you can do via the type system and will be discussed later). Or you may not actually want (or care) to make an instance of your subclass; you actually want the original type, just with some extra validation done. For example, if you were to implement `pydantic.AfterValidator` (see [Adding validation and serialization](#adding-validation-and-serialization)) yourself, you'd do something similar to the following: ```python from dataclasses import dataclass from typing import Annotated, Any, Callable from pydantic_core import CoreSchema, core_schema from pydantic import BaseModel, GetCoreSchemaHandler @dataclass(frozen=True) # (1)! class MyAfterValidator: func: Callable[[Any], Any] def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: return core_schema.no_info_after_validator_function( self.func, handler(source_type) ) Username = Annotated[str, MyAfterValidator(str.lower)] class Model(BaseModel): name: Username assert Model(name='ABC').name == 'abc' # (2)! ``` 1. The `frozen=True` specification makes `MyAfterValidator` hashable. Without this, a union such as `Username | None` will raise an error. 1. Notice that type checkers will not complain about assigning `'ABC'` to `Username` like they did in the previous example because they do not consider `Username` to be a distinct type from `str`. #### Handling third-party types Another use case for the pattern in the previous section is to handle third-party types. ```python from typing import Annotated, Any from pydantic_core import core_schema from pydantic import ( BaseModel, GetCoreSchemaHandler, GetJsonSchemaHandler, ValidationError, ) from pydantic.json_schema import JsonSchemaValue class ThirdPartyType: """ This is meant to represent a type from a third-party library that wasn't designed with Pydantic integration in mind, and so doesn't have a `pydantic_core.CoreSchema` or anything.
""" x: int def __init__(self): self.x = 0 class _ThirdPartyTypePydanticAnnotation: @classmethod def __get_pydantic_core_schema__( cls, _source_type: Any, _handler: GetCoreSchemaHandler, ) -> core_schema.CoreSchema: """ We return a pydantic_core.CoreSchema that behaves in the following ways: * ints will be parsed as `ThirdPartyType` instances with the int as the x attribute * `ThirdPartyType` instances will be parsed as `ThirdPartyType` instances without any changes * Nothing else will pass validation * Serialization will always return just an int """ def validate_from_int(value: int) -> ThirdPartyType: result = ThirdPartyType() result.x = value return result from_int_schema = core_schema.chain_schema( [ core_schema.int_schema(), core_schema.no_info_plain_validator_function(validate_from_int), ] ) return core_schema.json_or_python_schema( json_schema=from_int_schema, python_schema=core_schema.union_schema( [ # check if it's an instance first before doing any further work core_schema.is_instance_schema(ThirdPartyType), from_int_schema, ] ), serialization=core_schema.plain_serializer_function_ser_schema( lambda instance: instance.x ), ) @classmethod def __get_pydantic_json_schema__( cls, _core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: # Use the same schema that would be used for `int` return handler(core_schema.int_schema()) # We now create an `Annotated` wrapper that we'll use as the annotation for fields on `BaseModel`s, etc. PydanticThirdPartyType = Annotated[ ThirdPartyType, _ThirdPartyTypePydanticAnnotation ] # Create a model class that uses this annotation as a field class Model(BaseModel): third_party_type: PydanticThirdPartyType # Demonstrate that this field is handled correctly, that ints are parsed into `ThirdPartyType`, and that # these instances are also "dumped" directly into ints as expected. m_int = Model(third_party_type=1) assert isinstance(m_int.third_party_type, ThirdPartyType) assert m_int.third_party_type.x == 1 assert m_int.model_dump() == {'third_party_type': 1} # Do the same thing where an instance of ThirdPartyType is passed in instance = ThirdPartyType() assert instance.x == 0 instance.x = 10 m_instance = Model(third_party_type=instance) assert isinstance(m_instance.third_party_type, ThirdPartyType) assert m_instance.third_party_type.x == 10 assert m_instance.model_dump() == {'third_party_type': 10} # Demonstrate that validation errors are raised as expected for invalid inputs try: Model(third_party_type='a') except ValidationError as e: print(e) """ 2 validation errors for Model third_party_type.is-instance[ThirdPartyType] Input should be an instance of ThirdPartyType [type=is_instance_of, input_value='a', input_type=str] third_party_type.chain[int,function-plain[validate_from_int()]] Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ assert Model.model_json_schema() == { 'properties': { 'third_party_type': {'title': 'Third Party Type', 'type': 'integer'} }, 'required': ['third_party_type'], 'title': 'Model', 'type': 'object', } ``` You can use this approach to e.g. define behavior for Pandas or Numpy types. #### Using `GetPydanticSchema` to reduce boilerplate API Documentation pydantic.types.GetPydanticSchema You may notice that the above examples where we create a marker class require a good amount of boilerplate. 
For many simple cases you can greatly minimize this by using `pydantic.GetPydanticSchema`: ```python from typing import Annotated from pydantic_core import core_schema from pydantic import BaseModel, GetPydanticSchema class Model(BaseModel): y: Annotated[ str, GetPydanticSchema( lambda tp, handler: core_schema.no_info_after_validator_function( lambda x: x * 2, handler(tp) ) ), ] assert Model(y='ab').y == 'abab' ``` #### Summary Let's recap: 1. Pydantic provides high level hooks to customize types via `Annotated` like `AfterValidator` and `Field`. Use these when possible. 1. Under the hood these use `pydantic-core` to customize validation, and you can hook into that directly using `GetPydanticSchema` or a marker class with `__get_pydantic_core_schema__`. 1. If you really want a custom type you can implement `__get_pydantic_core_schema__` on the type itself. ### Handling custom generic classes Warning This is an advanced technique that you might not need in the beginning. In most of the cases you will probably be fine with standard Pydantic models. You can use [Generic Classes](https://docs.python.org/3/library/typing.html#typing.Generic) as field types and perform custom validation based on the "type parameters" (or sub-types) with `__get_pydantic_core_schema__`. If the Generic class that you are using as a sub-type has a classmethod `__get_pydantic_core_schema__`, you don't need to use arbitrary_types_allowed for it to work. Because the `source_type` parameter is not the same as the `cls` parameter, you can use `typing.get_args` (or `typing_extensions.get_args`) to extract the generic parameters. Then you can use the `handler` to generate a schema for them by calling `handler.generate_schema`. Note that we do not do something like `handler(get_args(source_type)[0])` because we want to generate an unrelated schema for that generic parameter, not one that is influenced by the current context of `Annotated` metadata and such. This is less important for custom types, but crucial for annotated metadata that modifies schema building. ```python from dataclasses import dataclass from typing import Any, Generic, TypeVar from pydantic_core import CoreSchema, core_schema from typing_extensions import get_args, get_origin from pydantic import ( BaseModel, GetCoreSchemaHandler, ValidationError, ValidatorFunctionWrapHandler, ) ItemType = TypeVar('ItemType') # This is not a pydantic model, it's an arbitrary generic class @dataclass class Owner(Generic[ItemType]): name: str item: ItemType @classmethod def __get_pydantic_core_schema__( cls, source_type: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: origin = get_origin(source_type) if origin is None: # used as `x: Owner` without params origin = source_type item_tp = Any else: item_tp = get_args(source_type)[0] # both calling handler(...) and handler.generate_schema(...) 
# would work, but prefer the latter for conceptual and consistency reasons item_schema = handler.generate_schema(item_tp) def val_item( v: Owner[Any], handler: ValidatorFunctionWrapHandler ) -> Owner[Any]: v.item = handler(v.item) return v python_schema = core_schema.chain_schema( # `chain_schema` means do the following steps in order: [ # Ensure the value is an instance of Owner core_schema.is_instance_schema(cls), # Use the item_schema to validate `item` core_schema.no_info_wrap_validator_function( val_item, item_schema ), ] ) return core_schema.json_or_python_schema( # for JSON accept an object with name and item keys json_schema=core_schema.chain_schema( [ core_schema.typed_dict_schema( { 'name': core_schema.typed_dict_field( core_schema.str_schema() ), 'item': core_schema.typed_dict_field(item_schema), } ), # after validating the json data convert it to python core_schema.no_info_before_validator_function( lambda data: Owner( name=data['name'], item=data['item'] ), # note that we reuse the same schema here as below python_schema, ), ] ), python_schema=python_schema, ) class Car(BaseModel): color: str class House(BaseModel): rooms: int class Model(BaseModel): car_owner: Owner[Car] home_owner: Owner[House] model = Model( car_owner=Owner(name='John', item=Car(color='black')), home_owner=Owner(name='James', item=House(rooms=3)), ) print(model) """ car_owner=Owner(name='John', item=Car(color='black')) home_owner=Owner(name='James', item=House(rooms=3)) """ try: # If the values of the sub-types are invalid, we get an error Model( car_owner=Owner(name='John', item=House(rooms=3)), home_owner=Owner(name='James', item=Car(color='black')), ) except ValidationError as e: print(e) """ 2 validation errors for Model car_owner Input should be a valid dictionary or instance of Car [type=model_type, input_value=House(rooms=3), input_type=House] home_owner Input should be a valid dictionary or instance of House [type=model_type, input_value=Car(color='black'), input_type=Car] """ # Similarly with JSON model = Model.model_validate_json( '{"car_owner":{"name":"John","item":{"color":"black"}},"home_owner":{"name":"James","item":{"rooms":3}}}' ) print(model) """ car_owner=Owner(name='John', item=Car(color='black')) home_owner=Owner(name='James', item=House(rooms=3)) """ try: Model.model_validate_json( '{"car_owner":{"name":"John","item":{"rooms":3}},"home_owner":{"name":"James","item":{"color":"black"}}}' ) except ValidationError as e: print(e) """ 2 validation errors for Model car_owner.item.color Field required [type=missing, input_value={'rooms': 3}, input_type=dict] home_owner.item.rooms Field required [type=missing, input_value={'color': 'black'}, input_type=dict] """ ``` #### Generic containers The same idea can be applied to create generic container types, like a custom `Sequence` type: ```python from typing import Any, Sequence, TypeVar from pydantic_core import ValidationError, core_schema from typing_extensions import get_args from pydantic import BaseModel, GetCoreSchemaHandler T = TypeVar('T') class MySequence(Sequence[T]): def __init__(self, v: Sequence[T]): self.v = v def __getitem__(self, i): return self.v[i] def __len__(self): return len(self.v) @classmethod def __get_pydantic_core_schema__( cls, source: Any, handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: instance_schema = core_schema.is_instance_schema(cls) args = get_args(source) if args: # replace the type and rely on Pydantic to generate the right schema # for `Sequence` sequence_t_schema =
handler.generate_schema(Sequence[args[0]]) else: sequence_t_schema = handler.generate_schema(Sequence) non_instance_schema = core_schema.no_info_after_validator_function( MySequence, sequence_t_schema ) return core_schema.union_schema([instance_schema, non_instance_schema]) class M(BaseModel): model_config = dict(validate_default=True) s1: MySequence = [3] m = M() print(m) #> s1=<__main__.MySequence object at 0x0123456789ab> print(m.s1.v) #> [3] class M(BaseModel): s1: MySequence[int] M(s1=[1]) try: M(s1=['a']) except ValidationError as exc: print(exc) """ 2 validation errors for M s1.is-instance[MySequence] Input should be an instance of MySequence [type=is_instance_of, input_value=['a'], input_type=list] s1.function-after[MySequence(), json-or-python[json=list[int],python=chain[is-instance[Sequence],function-wrap[sequence_validator()]]]].0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ ``` ```python from typing import Any, TypeVar from collections.abc import Sequence from pydantic_core import ValidationError, core_schema from typing import get_args from pydantic import BaseModel, GetCoreSchemaHandler T = TypeVar('T') class MySequence(Sequence[T]): def __init__(self, v: Sequence[T]): self.v = v def __getitem__(self, i): return self.v[i] def __len__(self): return len(self.v) @classmethod def __get_pydantic_core_schema__( cls, source: Any, handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: instance_schema = core_schema.is_instance_schema(cls) args = get_args(source) if args: # replace the type and rely on Pydantic to generate the right schema # for `Sequence` sequence_t_schema = handler.generate_schema(Sequence[args[0]]) else: sequence_t_schema = handler.generate_schema(Sequence) non_instance_schema = core_schema.no_info_after_validator_function( MySequence, sequence_t_schema ) return core_schema.union_schema([instance_schema, non_instance_schema]) class M(BaseModel): model_config = dict(validate_default=True) s1: MySequence = [3] m = M() print(m) #> s1=<__main__.MySequence object at 0x0123456789ab> print(m.s1.v) #> [3] class M(BaseModel): s1: MySequence[int] M(s1=[1]) try: M(s1=['a']) except ValidationError as exc: print(exc) """ 2 validation errors for M s1.is-instance[MySequence] Input should be an instance of MySequence [type=is_instance_of, input_value=['a'], input_type=list] s1.function-after[MySequence(), json-or-python[json=list[int],python=chain[is-instance[Sequence],function-wrap[sequence_validator()]]]].0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ ``` ### Access to field name Note This was not possible with Pydantic V2 to V2.3, it was [re-added](https://github.com/pydantic/pydantic/pull/7542) in Pydantic V2.4. As of Pydantic V2.4, you can access the field name via the `handler.field_name` within `__get_pydantic_core_schema__` and thereby set the field name which will be available from `info.field_name`. 
```python
from typing import Any

from pydantic_core import core_schema

from pydantic import BaseModel, GetCoreSchemaHandler, ValidationInfo


class CustomType:
    """Custom type that stores the field it was used in."""

    def __init__(self, value: int, field_name: str):
        self.value = value
        self.field_name = field_name

    def __repr__(self):
        return f'CustomType<{self.value} {self.field_name!r}>'

    @classmethod
    def validate(cls, value: int, info: ValidationInfo):
        return cls(value, info.field_name)

    @classmethod
    def __get_pydantic_core_schema__(
        cls, source_type: Any, handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        return core_schema.with_info_after_validator_function(
            cls.validate, handler(int), field_name=handler.field_name
        )


class MyModel(BaseModel):
    my_field: CustomType


m = MyModel(my_field=1)
print(m.my_field)
#> CustomType<1 'my_field'>
```

You can also access `field_name` from the markers used with `Annotated`, like AfterValidator.

```python
from typing import Annotated

from pydantic import AfterValidator, BaseModel, ValidationInfo


def my_validators(value: int, info: ValidationInfo):
    return f'<{value} {info.field_name!r}>'


class MyModel(BaseModel):
    my_field: Annotated[int, AfterValidator(my_validators)]


m = MyModel(my_field=1)
print(m.my_field)
#> <1 'my_field'>
```

Unions are fundamentally different to all other types Pydantic validates - instead of requiring all fields/items/values to be valid, unions require only one member to be valid.

This leads to some nuance around how to validate unions:

- which member(s) of the union should you validate data against, and in which order?
- which errors to raise when validation fails?

Validating unions feels like adding another orthogonal dimension to the validation process.

To solve these problems, Pydantic supports three fundamental approaches to validating unions:

1. [left to right mode](#left-to-right-mode) - the simplest approach, each member of the union is tried in order and the first match is returned
1. [smart mode](#smart-mode) - similar to "left to right mode", members are tried in order; however, validation will proceed past the first match to attempt to find a better match; this is the default mode for most union validation
1. [discriminated unions](#discriminated-unions) - only one member of the union is tried, based on a discriminator

Tip

In general, we recommend using [discriminated unions](#discriminated-unions). They are both more performant and more predictable than untagged unions, as they allow you to control which member of the union to validate against.

For complex cases, if you're using untagged unions, it's recommended to use `union_mode='left_to_right'` if you need guarantees about the order of validation attempts against the union members.

If you're looking for incredibly specialized behavior, you can use a [custom validator](../validators/#field-validators).

## Union Modes

### Left to Right Mode

Note

Because this mode often leads to unexpected validation results, it is not the default in Pydantic >=2; instead, `union_mode='smart'` is the default.

With this approach, validation is attempted against each member of the union in the order they're defined, and the first successful validation is accepted as input.

If validation fails on all members, the validation error includes the errors from all members of the union.

`union_mode='left_to_right'` must be set as a [`Field`](../fields/) parameter on union fields where you want to use it.
Union with left to right mode

```python
from typing import Union

from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    id: Union[str, int] = Field(union_mode='left_to_right')


print(User(id=123))
#> id=123
print(User(id='hello'))
#> id='hello'

try:
    User(id=[])
except ValidationError as e:
    print(e)
    """
    2 validation errors for User
    id.str
      Input should be a valid string [type=string_type, input_value=[], input_type=list]
    id.int
      Input should be a valid integer [type=int_type, input_value=[], input_type=list]
    """
```

Union with left to right mode

```python
from pydantic import BaseModel, Field, ValidationError


class User(BaseModel):
    id: str | int = Field(union_mode='left_to_right')


print(User(id=123))
#> id=123
print(User(id='hello'))
#> id='hello'

try:
    User(id=[])
except ValidationError as e:
    print(e)
    """
    2 validation errors for User
    id.str
      Input should be a valid string [type=string_type, input_value=[], input_type=list]
    id.int
      Input should be a valid integer [type=int_type, input_value=[], input_type=list]
    """
```

The order of members is very important in this case, as demonstrated by tweaking the above example:

Union with left to right - unexpected results

```python
from typing import Union

from pydantic import BaseModel, Field


class User(BaseModel):
    id: Union[int, str] = Field(union_mode='left_to_right')


print(User(id=123))  # (1)
#> id=123
print(User(id='456'))  # (2)
#> id=456
```

1. The input is validated against the `int` member and the result is as expected.
1. We're in lax mode and the numeric string `'456'` is valid as input to the first member of the union, `int`. Since that is tried first, we get the surprising result of `id` being an `int` instead of a `str`.

Union with left to right - unexpected results

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    id: int | str = Field(union_mode='left_to_right')


print(User(id=123))  # (1)
#> id=123
print(User(id='456'))  # (2)
#> id=456
```

1. The input is validated against the `int` member and the result is as expected.
1. We're in lax mode and the numeric string `'456'` is valid as input to the first member of the union, `int`. Since that is tried first, we get the surprising result of `id` being an `int` instead of a `str`.

### Smart Mode

Because of the potentially surprising results of `union_mode='left_to_right'`, in Pydantic >=2 the default mode for `Union` validation is `union_mode='smart'`. In this mode, pydantic attempts to select the best match for the input from the union members. The exact algorithm may change between Pydantic minor releases to allow for improvements in both performance and accuracy.

Note

We reserve the right to tweak the internal `smart` matching algorithm in future versions of Pydantic. If you rely on very specific matching behavior, it's recommended to use `union_mode='left_to_right'` or [discriminated unions](#discriminated-unions).

Smart Mode Algorithm

The smart mode algorithm uses two metrics to determine the best match for the input:

1. The number of valid fields set (relevant for models, dataclasses, and typed dicts)
1. The exactness of the match (relevant for all types)

#### Number of valid fields set

Note

This metric was introduced in Pydantic v2.8.0. Prior to this version, only exactness was used to determine the best match.

This metric is currently only relevant for models, dataclasses, and typed dicts.

The greater the number of valid fields set, the better the match. The number of fields set on nested models is also taken into account. These counts bubble up to the top-level union, where the union member with the highest count is considered the best match.

For data types where this metric is relevant, we prioritize this count over exactness. For all other types, we use solely exactness.
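To illustrate the counting, here is a minimal sketch (the `Short` and `Long` models are hypothetical, not taken from the surrounding examples): both members validate the input, but `Long` has more valid fields set, so it wins even though `Short` is the leftmost member.

```python
from typing import Union

from pydantic import BaseModel, TypeAdapter


class Short(BaseModel):
    x: int


class Long(BaseModel):
    x: int
    y: int


adapter = TypeAdapter(Union[Short, Long])

# `Short` also validates this input (the extra key is ignored by default),
# but `Long` sets two valid fields instead of one
print(repr(adapter.validate_python({'x': 1, 'y': 2})))
#> Long(x=1, y=2)
```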
#### Exactness

For `exactness`, Pydantic scores a match of a union member into one of the following three groups (from highest score to lowest score):

- An exact type match, for example an `int` input to a `float | int` union validation is an exact type match for the `int` member
- Validation would have succeeded in [`strict` mode](../strict_mode/)
- Validation would have succeeded in lax mode

The union match which produced the highest exactness score will be considered the best match.

In smart mode, the following steps are taken to try to select the best match for the input:

1. Union members are attempted left to right, with any successful matches scored into one of the three exactness categories described above, with the valid fields set count also tallied.
   - If validation succeeds with an exact type match, that member is returned immediately and following members will not be attempted.
1. After all members have been evaluated, the member with the highest "valid fields set" count is returned.
1. If there's a tie for the highest "valid fields set" count, the exactness score is used as a tiebreaker, and the member with the highest exactness score is returned.
1. If validation failed on all the members, return all the errors.

```python
from typing import Union
from uuid import UUID

from pydantic import BaseModel


class User(BaseModel):
    id: Union[int, str, UUID]
    name: str


user_01 = User(id=123, name='John Doe')
print(user_01)
#> id=123 name='John Doe'
print(user_01.id)
#> 123

user_02 = User(id='1234', name='John Doe')
print(user_02)
#> id='1234' name='John Doe'
print(user_02.id)
#> 1234

user_03_uuid = UUID('cf57432e-809e-4353-adbd-9d5c0d733868')
user_03 = User(id=user_03_uuid, name='John Doe')
print(user_03)
#> id=UUID('cf57432e-809e-4353-adbd-9d5c0d733868') name='John Doe'
print(user_03.id)
#> cf57432e-809e-4353-adbd-9d5c0d733868
print(user_03_uuid.int)
#> 275603287559914445491632874575877060712
```

```python
from uuid import UUID

from pydantic import BaseModel


class User(BaseModel):
    id: int | str | UUID
    name: str


user_01 = User(id=123, name='John Doe')
print(user_01)
#> id=123 name='John Doe'
print(user_01.id)
#> 123

user_02 = User(id='1234', name='John Doe')
print(user_02)
#> id='1234' name='John Doe'
print(user_02.id)
#> 1234

user_03_uuid = UUID('cf57432e-809e-4353-adbd-9d5c0d733868')
user_03 = User(id=user_03_uuid, name='John Doe')
print(user_03)
#> id=UUID('cf57432e-809e-4353-adbd-9d5c0d733868') name='John Doe'
print(user_03.id)
#> cf57432e-809e-4353-adbd-9d5c0d733868
print(user_03_uuid.int)
#> 275603287559914445491632874575877060712
```

## Discriminated Unions

**Discriminated unions are sometimes referred to as "Tagged Unions".**

We can use discriminated unions to more efficiently validate `Union` types, by choosing which member of the union to validate against.
This makes validation more efficient and also avoids a proliferation of errors when validation fails. Adding discriminator to unions also means the generated JSON schema implements the [associated OpenAPI specification](https://github.com/OAI/OpenAPI-Specification/blob/main/versions/3.1.0.md#discriminator-object). ### Discriminated Unions with `str` discriminators Frequently, in the case of a `Union` with multiple models, there is a common field to all members of the union that can be used to distinguish which union case the data should be validated against; this is referred to as the "discriminator" in [OpenAPI](https://swagger.io/docs/specification/data-models/inheritance-and-polymorphism/). To validate models based on that information you can set the same field - let's call it `my_discriminator` - in each of the models with a discriminated value, which is one (or many) `Literal` value(s). For your `Union`, you can set the discriminator in its value: `Field(discriminator='my_discriminator')`. ```python from typing import Literal, Union from pydantic import BaseModel, Field, ValidationError class Cat(BaseModel): pet_type: Literal['cat'] meows: int class Dog(BaseModel): pet_type: Literal['dog'] barks: float class Lizard(BaseModel): pet_type: Literal['reptile', 'lizard'] scales: bool class Model(BaseModel): pet: Union[Cat, Dog, Lizard] = Field(discriminator='pet_type') n: int print(Model(pet={'pet_type': 'dog', 'barks': 3.14}, n=1)) #> pet=Dog(pet_type='dog', barks=3.14) n=1 try: Model(pet={'pet_type': 'dog'}, n=1) except ValidationError as e: print(e) """ 1 validation error for Model pet.dog.barks Field required [type=missing, input_value={'pet_type': 'dog'}, input_type=dict] """ ``` ```python from typing import Literal from pydantic import BaseModel, Field, ValidationError class Cat(BaseModel): pet_type: Literal['cat'] meows: int class Dog(BaseModel): pet_type: Literal['dog'] barks: float class Lizard(BaseModel): pet_type: Literal['reptile', 'lizard'] scales: bool class Model(BaseModel): pet: Cat | Dog | Lizard = Field(discriminator='pet_type') n: int print(Model(pet={'pet_type': 'dog', 'barks': 3.14}, n=1)) #> pet=Dog(pet_type='dog', barks=3.14) n=1 try: Model(pet={'pet_type': 'dog'}, n=1) except ValidationError as e: print(e) """ 1 validation error for Model pet.dog.barks Field required [type=missing, input_value={'pet_type': 'dog'}, input_type=dict] """ ``` ### Discriminated Unions with callable `Discriminator` API Documentation pydantic.types.Discriminator In the case of a `Union` with multiple models, sometimes there isn't a single uniform field across all models that you can use as a discriminator. This is the perfect use case for a callable `Discriminator`. Tip When you're designing callable discriminators, remember that you might have to account for both `dict` and model type inputs. This pattern is similar to that of `mode='before'` validators, where you have to anticipate various forms of input. But wait! You ask, I only anticipate passing in `dict` types, why do I need to account for models? Pydantic uses callable discriminators for serialization as well, at which point the input to your callable is very likely to be a model instance. In the following examples, you'll see that the callable discriminators are designed to handle both `dict` and model inputs. If you don't follow this practice, it's likely that you'll, in the best case, get warnings during serialization, and in the worst case, get runtime errors during validation. 
```python from typing import Annotated, Any, Literal, Union from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) """ ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) """ pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) """ ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) """ ``` ```python from typing import Annotated, Any, Literal from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ ( Annotated[ApplePie, Tag('apple')] | Annotated[PumpkinPie, Tag('pumpkin')] ), Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) """ ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) """ pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) """ ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) """ ``` `Discriminator`s can also be used to validate `Union` types with combinations of models and primitive types. For example: ```python from typing import Annotated, Any, Union from pydantic import BaseModel, Discriminator, Tag, ValidationError def model_x_discriminator(v: Any) -> str: if isinstance(v, int): return 'int' if isinstance(v, (dict, BaseModel)): return 'model' else: # return None if the discriminator value isn't found return None class SpecialValue(BaseModel): value: int class DiscriminatedModel(BaseModel): value: Annotated[ Union[ Annotated[int, Tag('int')], Annotated['SpecialValue', Tag('model')], ], Discriminator(model_x_discriminator), ] model_data = {'value': {'value': 1}} m = DiscriminatedModel.model_validate(model_data) print(m) #> value=SpecialValue(value=1) int_data = {'value': 123} m = DiscriminatedModel.model_validate(int_data) print(m) #> value=123 try: DiscriminatedModel.model_validate({'value': 'not an int or a model'}) except ValidationError as e: print(e) # (1)! """ 1 validation error for DiscriminatedModel value Unable to extract tag using discriminator model_x_discriminator() [type=union_tag_not_found, input_value='not an int or a model', input_type=str] """ ``` 1. 
Notice the callable discriminator function returns `None` if a discriminator value is not found. When `None` is returned, this `union_tag_not_found` error is raised. ```python from typing import Annotated, Any from pydantic import BaseModel, Discriminator, Tag, ValidationError def model_x_discriminator(v: Any) -> str: if isinstance(v, int): return 'int' if isinstance(v, (dict, BaseModel)): return 'model' else: # return None if the discriminator value isn't found return None class SpecialValue(BaseModel): value: int class DiscriminatedModel(BaseModel): value: Annotated[ ( Annotated[int, Tag('int')] | Annotated['SpecialValue', Tag('model')] ), Discriminator(model_x_discriminator), ] model_data = {'value': {'value': 1}} m = DiscriminatedModel.model_validate(model_data) print(m) #> value=SpecialValue(value=1) int_data = {'value': 123} m = DiscriminatedModel.model_validate(int_data) print(m) #> value=123 try: DiscriminatedModel.model_validate({'value': 'not an int or a model'}) except ValidationError as e: print(e) # (1)! """ 1 validation error for DiscriminatedModel value Unable to extract tag using discriminator model_x_discriminator() [type=union_tag_not_found, input_value='not an int or a model', input_type=str] """ ``` 1. Notice the callable discriminator function returns `None` if a discriminator value is not found. When `None` is returned, this `union_tag_not_found` error is raised. Note Using the [annotated pattern](../fields/#the-annotated-pattern) can be handy to regroup the `Union` and `discriminator` information. See the next example for more details. There are a few ways to set a discriminator for a field, all varying slightly in syntax. For `str` discriminators: ```python some_field: Union[...] = Field(discriminator='my_discriminator') some_field: Annotated[Union[...], Field(discriminator='my_discriminator')] ``` For callable `Discriminator`s: ```python some_field: Union[...] = Field(discriminator=Discriminator(...)) some_field: Annotated[Union[...], Discriminator(...)] some_field: Annotated[Union[...], Field(discriminator=Discriminator(...))] ``` Warning Discriminated unions cannot be used with only a single variant, such as `Union[Cat]`. Python changes `Union[T]` into `T` at interpretation time, so it is not possible for `pydantic` to distinguish fields of `Union[T]` from `T`. ### Nested Discriminated Unions Only one discriminator can be set for a field but sometimes you want to combine multiple discriminators. 
You can do it by creating nested `Annotated` types, e.g.: ```python from typing import Annotated, Literal, Union from pydantic import BaseModel, Field, ValidationError class BlackCat(BaseModel): pet_type: Literal['cat'] color: Literal['black'] black_name: str class WhiteCat(BaseModel): pet_type: Literal['cat'] color: Literal['white'] white_name: str Cat = Annotated[Union[BlackCat, WhiteCat], Field(discriminator='color')] class Dog(BaseModel): pet_type: Literal['dog'] name: str Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')] class Model(BaseModel): pet: Pet n: int m = Model(pet={'pet_type': 'cat', 'color': 'black', 'black_name': 'felix'}, n=1) print(m) #> pet=BlackCat(pet_type='cat', color='black', black_name='felix') n=1 try: Model(pet={'pet_type': 'cat', 'color': 'red'}, n='1') except ValidationError as e: print(e) """ 1 validation error for Model pet.cat Input tag 'red' found using 'color' does not match any of the expected tags: 'black', 'white' [type=union_tag_invalid, input_value={'pet_type': 'cat', 'color': 'red'}, input_type=dict] """ try: Model(pet={'pet_type': 'cat', 'color': 'black'}, n='1') except ValidationError as e: print(e) """ 1 validation error for Model pet.cat.black.black_name Field required [type=missing, input_value={'pet_type': 'cat', 'color': 'black'}, input_type=dict] """ ``` Tip If you want to validate data against a union, and solely a union, you can use pydantic's [`TypeAdapter`](../type_adapter/) construct instead of inheriting from the standard `BaseModel`. In the context of the previous example, we have the following: ```python type_adapter = TypeAdapter(Pet) pet = type_adapter.validate_python( {'pet_type': 'cat', 'color': 'black', 'black_name': 'felix'} ) print(repr(pet)) #> BlackCat(pet_type='cat', color='black', black_name='felix') ``` ## Union Validation Errors When `Union` validation fails, error messages can be quite verbose, as they will produce validation errors for each case in the union. This is especially noticeable when dealing with recursive models, where reasons may be generated at each level of recursion. Discriminated unions help to simplify error messages in this case, as validation errors are only produced for the case with a matching discriminator value. You can also customize the error type, message, and context for a `Discriminator` by passing these specifications as parameters to the `Discriminator` constructor, as seen in the example below. 
```python from typing import Annotated, Union from pydantic import BaseModel, Discriminator, Tag, ValidationError # Errors are quite verbose with a normal Union: class Model(BaseModel): x: Union[str, 'Model'] try: Model.model_validate({'x': {'x': {'x': 1}}}) except ValidationError as e: print(e) """ 4 validation errors for Model x.str Input should be a valid string [type=string_type, input_value={'x': {'x': 1}}, input_type=dict] x.Model.x.str Input should be a valid string [type=string_type, input_value={'x': 1}, input_type=dict] x.Model.x.Model.x.str Input should be a valid string [type=string_type, input_value=1, input_type=int] x.Model.x.Model.x.Model Input should be a valid dictionary or instance of Model [type=model_type, input_value=1, input_type=int] """ try: Model.model_validate({'x': {'x': {'x': {}}}}) except ValidationError as e: print(e) """ 4 validation errors for Model x.str Input should be a valid string [type=string_type, input_value={'x': {'x': {}}}, input_type=dict] x.Model.x.str Input should be a valid string [type=string_type, input_value={'x': {}}, input_type=dict] x.Model.x.Model.x.str Input should be a valid string [type=string_type, input_value={}, input_type=dict] x.Model.x.Model.x.Model.x Field required [type=missing, input_value={}, input_type=dict] """ # Errors are much simpler with a discriminated union: def model_x_discriminator(v): if isinstance(v, str): return 'str' if isinstance(v, (dict, BaseModel)): return 'model' class DiscriminatedModel(BaseModel): x: Annotated[ Union[ Annotated[str, Tag('str')], Annotated['DiscriminatedModel', Tag('model')], ], Discriminator( model_x_discriminator, custom_error_type='invalid_union_member', # (1)! custom_error_message='Invalid union member', # (2)! custom_error_context={'discriminator': 'str_or_model'}, # (3)! ), ] try: DiscriminatedModel.model_validate({'x': {'x': {'x': 1}}}) except ValidationError as e: print(e) """ 1 validation error for DiscriminatedModel x.model.x.model.x Invalid union member [type=invalid_union_member, input_value=1, input_type=int] """ try: DiscriminatedModel.model_validate({'x': {'x': {'x': {}}}}) except ValidationError as e: print(e) """ 1 validation error for DiscriminatedModel x.model.x.model.x.model.x Field required [type=missing, input_value={}, input_type=dict] """ # The data is still handled properly when valid: data = {'x': {'x': {'x': 'a'}}} m = DiscriminatedModel.model_validate(data) print(m.model_dump()) #> {'x': {'x': {'x': 'a'}}} ``` 1. `custom_error_type` is the `type` attribute of the `ValidationError` raised when validation fails. 1. `custom_error_message` is the `msg` attribute of the `ValidationError` raised when validation fails. 1. `custom_error_context` is the `ctx` attribute of the `ValidationError` raised when validation fails. You can also simplify error messages by labeling each case with a Tag. 
This is especially useful when you have complex types like those in this example:

```python
from typing import Annotated, Union

from pydantic import AfterValidator, Tag, TypeAdapter, ValidationError

DoubledList = Annotated[list[int], AfterValidator(lambda x: x * 2)]
StringsMap = dict[str, str]


# Not using any `Tag`s for each union case, the errors are not so nice to look at
adapter = TypeAdapter(Union[DoubledList, StringsMap])

try:
    adapter.validate_python(['a'])
except ValidationError as exc_info:
    print(exc_info)
    """
    2 validation errors for union[function-after[(), list[int]],dict[str,str]]
    function-after[(), list[int]].0
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    dict[str,str]
      Input should be a valid dictionary [type=dict_type, input_value=['a'], input_type=list]
    """

tag_adapter = TypeAdapter(
    Union[
        Annotated[DoubledList, Tag('DoubledList')],
        Annotated[StringsMap, Tag('StringsMap')],
    ]
)

try:
    tag_adapter.validate_python(['a'])
except ValidationError as exc_info:
    print(exc_info)
    """
    2 validation errors for union[DoubledList,StringsMap]
    DoubledList.0
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str]
    StringsMap
      Input should be a valid dictionary [type=dict_type, input_value=['a'], input_type=list]
    """
```

API Documentation

pydantic.validate_call_decorator.validate_call

The validate_call() decorator allows the arguments passed to a function to be parsed and validated using the function's annotations before the function is called.

While under the hood this uses the same approach of model creation and initialisation (see [Validators](../validators/) for more details), it provides an extremely easy way to apply validation to your code with minimal boilerplate.

Example of usage:

```python
from pydantic import ValidationError, validate_call


@validate_call
def repeat(s: str, count: int, *, separator: bytes = b'') -> bytes:
    b = s.encode()
    return separator.join(b for _ in range(count))


a = repeat('hello', 3)
print(a)
#> b'hellohellohello'

b = repeat('x', '4', separator=b' ')
print(b)
#> b'x x x x'

try:
    c = repeat('hello', 'wrong')
except ValidationError as exc:
    print(exc)
    """
    1 validation error for repeat
    1
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='wrong', input_type=str]
    """
```

## Parameter types

Parameter types are inferred from type annotations on the function, or as Any if not annotated. All types listed in [types](../types/) can be validated, including Pydantic models and [custom types](../types/#custom-types).

As with the rest of Pydantic, types are by default coerced by the decorator before they're passed to the actual function:

```python
from datetime import date

from pydantic import validate_call


@validate_call
def greater_than(d1: date, d2: date, *, include_equal=False) -> bool:  # (1)!
    if include_equal:
        return d1 >= d2
    else:
        return d1 > d2


d1 = '2000-01-01'  # (2)!
d2 = date(2001, 1, 1)
greater_than(d1, d2, include_equal=True)
```

1. Because `include_equal` has no type annotation, it will be inferred as Any.
1. Although `d1` is a string, it will be converted to a date object.

Type coercion like this can be extremely helpful, but also confusing or not desired (see [model data conversion](../models/#data-conversion)). [Strict mode](../strict_mode/) can be enabled by using a [custom configuration](#custom-configuration).
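For example, here is a minimal sketch of enabling strict mode this way (the `multiply` function is illustrative; the `config` parameter itself is described under [custom configuration](#custom-configuration) below):

```python
from pydantic import ConfigDict, ValidationError, validate_call


@validate_call(config=ConfigDict(strict=True))
def multiply(a: int, b: int) -> int:
    return a * b


print(multiply(2, 3))
#> 6

try:
    multiply('2', 3)  # the lax `str -> int` coercion is disabled in strict mode
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> int_type
```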
Validating the return value By default, the return value of the function is **not** validated. To do so, the `validate_return` argument of the decorator can be set to `True`. ## Function signatures The validate_call() decorator is designed to work with functions using all possible parameter configurations and all possible combinations of these: - Positional or keyword parameters with or without defaults. - Keyword-only parameters: parameters after `*,`. - Positional-only parameters: parameters before `, /`. - Variable positional parameters defined via `*` (often `*args`). - Variable keyword parameters defined via `**` (often `**kwargs`). Example ```python from pydantic import validate_call @validate_call def pos_or_kw(a: int, b: int = 2) -> str: return f'a={a} b={b}' print(pos_or_kw(1, b=3)) #> a=1 b=3 @validate_call def kw_only(*, a: int, b: int = 2) -> str: return f'a={a} b={b}' print(kw_only(a=1)) #> a=1 b=2 print(kw_only(a=1, b=3)) #> a=1 b=3 @validate_call def pos_only(a: int, b: int = 2, /) -> str: return f'a={a} b={b}' print(pos_only(1)) #> a=1 b=2 @validate_call def var_args(*args: int) -> str: return str(args) print(var_args(1)) #> (1,) print(var_args(1, 2, 3)) #> (1, 2, 3) @validate_call def var_kwargs(**kwargs: int) -> str: return str(kwargs) print(var_kwargs(a=1)) #> {'a': 1} print(var_kwargs(a=1, b=2)) #> {'a': 1, 'b': 2} @validate_call def armageddon( a: int, /, b: int, *c: int, d: int, e: int = None, **f: int, ) -> str: return f'a={a} b={b} c={c} d={d} e={e} f={f}' print(armageddon(1, 2, d=3)) #> a=1 b=2 c=() d=3 e=None f={} print(armageddon(1, 2, 3, 4, 5, 6, d=8, e=9, f=10, spam=11)) #> a=1 b=2 c=(3, 4, 5, 6) d=8 e=9 f={'f': 10, 'spam': 11} ``` Unpack for keyword parameters Unpack and typed dictionaries can be used to annotate the variable keyword parameters of a function: ```python from typing_extensions import TypedDict, Unpack from pydantic import validate_call class Point(TypedDict): x: int y: int @validate_call def add_coords(**kwargs: Unpack[Point]) -> int: return kwargs['x'] + kwargs['y'] add_coords(x=1, y=2) ``` For reference, see the [related specification section](https://typing.readthedocs.io/en/latest/spec/callables.html#unpack-for-keyword-arguments) and [PEP 692](https://peps.python.org/pep-0692/). ## Using the Field() function to describe function parameters The [`Field()` function](../fields/) can also be used with the decorator to provide extra information about the field and validations. If you don't make use of the `default` or `default_factory` parameter, it is recommended to use the [annotated pattern](../fields/#the-annotated-pattern) (so that type checkers infer the parameter as being required). Otherwise, the Field() function can be used as a default value (again, to trick type checkers into thinking a default value is provided for the parameter). 
```python
from typing import Annotated

from pydantic import Field, ValidationError, validate_call


@validate_call
def how_many(num: Annotated[int, Field(gt=10)]):
    return num


try:
    how_many(1)
except ValidationError as e:
    print(e)
    """
    1 validation error for how_many
    0
      Input should be greater than 10 [type=greater_than, input_value=1, input_type=int]
    """


@validate_call
def return_value(value: str = Field(default='default value')):
    return value


print(return_value())
#> default value
```

[Aliases](../fields/#field-aliases) can be used with the decorator as normal:

```python
from typing import Annotated

from pydantic import Field, validate_call


@validate_call
def how_many(num: Annotated[int, Field(gt=10, alias='number')]):
    return num


how_many(number=42)
```

## Accessing the original function

The original function which was decorated can still be accessed by using the `raw_function` attribute. This is useful in scenarios where you trust your input arguments and want to call the function in the most efficient way (see [notes on performance](#performance) below):

```python
from pydantic import validate_call


@validate_call
def repeat(s: str, count: int, *, separator: bytes = b'') -> bytes:
    b = s.encode()
    return separator.join(b for _ in range(count))


a = repeat('hello', 3)
print(a)
#> b'hellohellohello'

b = repeat.raw_function('good bye', 2, separator=b', ')
print(b)
#> b'good bye, good bye'
```

## Async functions

validate_call() can also be used on async functions:

```python
import asyncio

from pydantic import PositiveInt, ValidationError, validate_call


class Connection:
    """Stub standing in for a real database connection in this example."""

    async def execute(self, sql, *args):
        return 'testing@example.com'


conn = Connection()


@validate_call
async def get_user_email(user_id: PositiveInt):
    # `conn` is some fictional connection to a database
    email = await conn.execute('select email from users where id=$1', user_id)
    if email is None:
        raise RuntimeError('user not found')
    else:
        return email


async def main():
    email = await get_user_email(123)
    print(email)
    #> testing@example.com
    try:
        await get_user_email(-4)
    except ValidationError as exc:
        print(exc.errors())
        """
        [
            {
                'type': 'greater_than',
                'loc': (0,),
                'msg': 'Input should be greater than 0',
                'input': -4,
                'ctx': {'gt': 0},
                'url': 'https://errors.pydantic.dev/2/v/greater_than',
            }
        ]
        """


asyncio.run(main())
```

## Compatibility with type checkers

As the validate_call() decorator preserves the decorated function's signature, it should be compatible with type checkers (such as mypy and pyright). However, due to current limitations in the Python type system, the [`raw_function`](#accessing-the-original-function) or other attributes won't be recognized, and you will need to suppress the error (usually with a `# type: ignore` comment).
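For example, here is a minimal sketch (the `double` function is illustrative, and the exact ignore code depends on your type checker):

```python
from pydantic import validate_call


@validate_call
def double(x: int) -> int:
    return x * 2


# type checkers don't know about the `raw_function` attribute,
# so the access needs to be silenced
print(double.raw_function(3))  # type: ignore[attr-defined]
#> 6
```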
## Custom configuration

Similarly to Pydantic models, the `config` parameter of the decorator can be used to specify a custom configuration:

```python
from pydantic import ConfigDict, ValidationError, validate_call


class Foobar:
    def __init__(self, v: str):
        self.v = v

    def __add__(self, other: 'Foobar') -> str:
        return f'{self} + {other}'

    def __str__(self) -> str:
        return f'Foobar({self.v})'


@validate_call(config=ConfigDict(arbitrary_types_allowed=True))
def add_foobars(a: Foobar, b: Foobar):
    return a + b


c = add_foobars(Foobar('a'), Foobar('b'))
print(c)
#> Foobar(a) + Foobar(b)

try:
    add_foobars(1, 2)
except ValidationError as e:
    print(e)
    """
    2 validation errors for add_foobars
    0
      Input should be an instance of Foobar [type=is_instance_of, input_value=1, input_type=int]
    1
      Input should be an instance of Foobar [type=is_instance_of, input_value=2, input_type=int]
    """
```

## Extension — validating arguments before calling a function

In some cases, it may be helpful to separate validation of a function's arguments from the function call itself. This might be useful when a particular function is costly/time consuming.

Here's an example of a workaround you can use for that pattern:

```python
from pydantic import validate_call


@validate_call
def validate_foo(a: int, b: int):
    def foo():
        return a + b

    return foo


foo = validate_foo(a=1, b=2)
print(foo())
#> 3
```

## Limitations

### Validation exception

Currently upon validation failure, a standard Pydantic ValidationError is raised (see [model error handling](../models/#error-handling) for details). This is also true for missing required arguments, where Python normally raises a TypeError.

### Performance

We've made a big effort to make Pydantic as performant as possible. While the inspection of the decorated function is only performed once, there will still be a performance impact when making calls to the function compared to using the original function. In many situations, this will have little or no noticeable effect.

However, be aware that validate_call() is not an equivalent or alternative to function definitions in strongly typed languages, and it never will be.

In addition to Pydantic's [built-in validation capabilities](../fields/#field-constraints), you can leverage custom validators at the field and model levels to enforce more complex constraints and ensure the integrity of your data.

Tip

Want to quickly jump to the relevant validator section?

- Field validators
  - [field *after* validators](#field-after-validator)
  - [field *before* validators](#field-before-validator)
  - [field *plain* validators](#field-plain-validator)
  - [field *wrap* validators](#field-wrap-validator)
- Model validators
  - [model *before* validators](#model-before-validator)
  - [model *after* validators](#model-after-validator)
  - [model *wrap* validators](#model-wrap-validator)

## Field validators

API Documentation

pydantic.functional_validators.WrapValidator\
pydantic.functional_validators.PlainValidator\
pydantic.functional_validators.BeforeValidator\
pydantic.functional_validators.AfterValidator\
pydantic.functional_validators.field_validator

In its simplest form, a field validator is a callable taking the value to be validated as an argument and **returning the validated value**.
The callable can perform checks for specific conditions (see [raising validation errors](#raising-validation-errors)) and make changes to the validated value (coercion or mutation). **Four** different types of validators can be used. They can all be defined using the [annotated pattern](../fields/#the-annotated-pattern) or using the field_validator() decorator, applied on a class method: - ***After* validators**: run after Pydantic's internal validation. They are generally more type safe and thus easier to implement. Here is an example of a validator performing a validation check, and returning the value unchanged. ```python from typing import Annotated from pydantic import AfterValidator, BaseModel, ValidationError def is_even(value: int) -> int: if value % 2 == 1: raise ValueError(f'{value} is not an even number') return value # (1)! class Model(BaseModel): number: Annotated[int, AfterValidator(is_even)] try: Model(number=1) except ValidationError as err: print(err) """ 1 validation error for Model number Value error, 1 is not an even number [type=value_error, input_value=1, input_type=int] """ ``` 1. Note that it is important to return the validated value. Here is an example of a validator performing a validation check, and returning the value unchanged, this time using the field_validator() decorator. ```python from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): number: int @field_validator('number', mode='after') # (1)! @classmethod def is_even(cls, value: int) -> int: if value % 2 == 1: raise ValueError(f'{value} is not an even number') return value # (2)! try: Model(number=1) except ValidationError as err: print(err) """ 1 validation error for Model number Value error, 1 is not an even number [type=value_error, input_value=1, input_type=int] """ ``` 1. `'after'` is the default mode for the decorator, and can be omitted. 1. Note that it is important to return the validated value. Example mutating the value Here is an example of a validator making changes to the validated value (no exception is raised). ```python from typing import Annotated from pydantic import AfterValidator, BaseModel def double_number(value: int) -> int: return value * 2 class Model(BaseModel): number: Annotated[int, AfterValidator(double_number)] print(Model(number=2)) #> number=4 ``` ```python from pydantic import BaseModel, field_validator class Model(BaseModel): number: int @field_validator('number', mode='after') # (1)! @classmethod def double_number(cls, value: int) -> int: return value * 2 print(Model(number=2)) #> number=4 ``` 1. `'after'` is the default mode for the decorator, and can be omitted. - ***Before* validators**: run before Pydantic's internal parsing and validation (e.g. coercion of a `str` to an `int`). These are more flexible than [*after* validators](#field-after-validator), but they also have to deal with the raw input, which in theory could be any arbitrary object. You should also avoid mutating the value directly if you are raising a [validation error](#raising-validation-errors) later in your validator function, as the mutated value may be passed to other validators if using [unions](../unions/). The value returned from this callable is then validated against the provided type annotation by Pydantic. ```python from typing import Annotated, Any from pydantic import BaseModel, BeforeValidator, ValidationError def ensure_list(value: Any) -> Any: # (1)! if not isinstance(value, list): # (2)! 
return [value] else: return value class Model(BaseModel): numbers: Annotated[list[int], BeforeValidator(ensure_list)] print(Model(numbers=2)) #> numbers=[2] try: Model(numbers='str') except ValidationError as err: print(err) # (3)! """ 1 validation error for Model numbers.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='str', input_type=str] """ ``` 1. Notice the use of Any as a type hint for `value`. *Before* validators take the raw input, which can be anything. 1. Note that you might want to check for other sequence types (such as tuples) that would normally successfully validate against the `list` type. *Before* validators give you more flexibility, but you have to account for every possible case. 1. Pydantic still performs validation against the `int` type, no matter if our `ensure_list` validator did operations on the original input type. ```python from typing import Any from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): numbers: list[int] @field_validator('numbers', mode='before') @classmethod def ensure_list(cls, value: Any) -> Any: # (1)! if not isinstance(value, list): # (2)! return [value] else: return value print(Model(numbers=2)) #> numbers=[2] try: Model(numbers='str') except ValidationError as err: print(err) # (3)! """ 1 validation error for Model numbers.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='str', input_type=str] """ ``` 1. Notice the use of Any as a type hint for `value`. *Before* validators take the raw input, which can be anything. 1. Note that you might want to check for other sequence types (such as tuples) that would normally successfully validate against the `list` type. *Before* validators give you more flexibility, but you have to account for every possible case. 1. Pydantic still performs validation against the `int` type, no matter if our `ensure_list` validator did operations on the original input type. - ***Plain* validators**: act similarly to *before* validators but they **terminate validation immediately** after returning, so no further validators are called and Pydantic does not do any of its internal validation against the field type. ```python from typing import Annotated, Any from pydantic import BaseModel, PlainValidator def val_number(value: Any) -> Any: if isinstance(value, int): return value * 2 else: return value class Model(BaseModel): number: Annotated[int, PlainValidator(val_number)] print(Model(number=4)) #> number=8 print(Model(number='invalid')) # (1)! #> number='invalid' ``` 1. Although `'invalid'` shouldn't validate against the `int` type, Pydantic accepts the input. ```python from typing import Any from pydantic import BaseModel, field_validator class Model(BaseModel): number: int @field_validator('number', mode='plain') @classmethod def val_number(cls, value: Any) -> Any: if isinstance(value, int): return value * 2 else: return value print(Model(number=4)) #> number=8 print(Model(number='invalid')) # (1)! #> number='invalid' ``` 1. Although `'invalid'` shouldn't validate against the `int` type, Pydantic accepts the input. - ***Wrap* validators**: are the most flexible of all. You can run code before or after Pydantic and other validators process the input, or you can terminate validation immediately, either by returning the value early or by raising an error. 
Such validators must be defined with a **mandatory** extra `handler` parameter: a callable taking the value to be validated as an argument. Internally, this handler will delegate validation of the value to Pydantic. You are free to wrap the call to the handler in a [`try..except`](https://docs.python.org/3/tutorial/errors.html#handling-exceptions) block, or not call it at all. ```python from typing import Any from typing import Annotated from pydantic import BaseModel, Field, ValidationError, ValidatorFunctionWrapHandler, WrapValidator def truncate(value: Any, handler: ValidatorFunctionWrapHandler) -> str: try: return handler(value) except ValidationError as err: if err.errors()[0]['type'] == 'string_too_long': return handler(value[:5]) else: raise class Model(BaseModel): my_string: Annotated[str, Field(max_length=5), WrapValidator(truncate)] print(Model(my_string='abcde')) #> my_string='abcde' print(Model(my_string='abcdef')) #> my_string='abcde' ``` ```python from typing import Any from typing import Annotated from pydantic import BaseModel, Field, ValidationError, ValidatorFunctionWrapHandler, field_validator class Model(BaseModel): my_string: Annotated[str, Field(max_length=5)] @field_validator('my_string', mode='wrap') @classmethod def truncate(cls, value: Any, handler: ValidatorFunctionWrapHandler) -> str: try: return handler(value) except ValidationError as err: if err.errors()[0]['type'] == 'string_too_long': return handler(value[:5]) else: raise print(Model(my_string='abcde')) #> my_string='abcde' print(Model(my_string='abcdef')) #> my_string='abcde' ``` Validation of default values As mentioned in the [fields documentation](../fields/#validate-default-values), default values of fields are *not* validated unless configured to do so, and thus custom validators will not be applied as well. ### Which validator pattern to use While both approaches can achieve the same thing, each pattern provides different benefits. #### Using the annotated pattern One of the key benefits of using the [annotated pattern](../fields/#the-annotated-pattern) is to make validators reusable: ```python from typing import Annotated from pydantic import AfterValidator, BaseModel def is_even(value: int) -> int: if value % 2 == 1: raise ValueError(f'{value} is not an even number') return value EvenNumber = Annotated[int, AfterValidator(is_even)] class Model1(BaseModel): my_number: EvenNumber class Model2(BaseModel): other_number: Annotated[EvenNumber, AfterValidator(lambda v: v + 2)] class Model3(BaseModel): list_of_even_numbers: list[EvenNumber] # (1)! ``` 1. As mentioned in the [annotated pattern](../fields/#the-annotated-pattern) documentation, we can also make use of validators for specific parts of the annotation (in this case, validation is applied for list items, but not the whole list). It is also easier to understand which validators are applied to a type, by just looking at the field annotation. #### Using the decorator pattern One of the key benefits of using the field_validator() decorator is to apply the function to multiple fields: ```python from pydantic import BaseModel, field_validator class Model(BaseModel): f1: str f2: str @field_validator('f1', 'f2', mode='before') @classmethod def capitalize(cls, value: str) -> str: return value.capitalize() ``` Here are a couple additional notes about the decorator usage: - If you want the validator to apply to all fields (including the ones defined in subclasses), you can pass `'*'` as the field name argument. 
- By default, the decorator will ensure the provided field name(s) are defined on the model. If you want to disable this check during class creation, you can do so by passing `False` to the `check_fields` argument. This is useful when the field validator is defined on a base class, and the field is expected to be set on subclasses. ## Model validators API Documentation pydantic.functional_validators.model_validator Validation can also be performed on the entire model's data using the model_validator() decorator. **Three** different types of model validators can be used: - ***After* validators**: run after the whole model has been validated. As such, they are defined as *instance* methods and can be seen as post-initialization hooks. Important note: the validated instance should be returned. ```python from typing_extensions import Self from pydantic import BaseModel, model_validator class UserModel(BaseModel): username: str password: str password_repeat: str @model_validator(mode='after') def check_passwords_match(self) -> Self: if self.password != self.password_repeat: raise ValueError('Passwords do not match') return self ``` - ***Before* validators**: are run before the model is instantiated. These are more flexible than *after* validators, but they also have to deal with the raw input, which in theory could be any arbitrary object. You should also avoid mutating the value directly if you are raising a [validation error](#raising-validation-errors) later in your validator function, as the mutated value may be passed to other validators if using [unions](../unions/). ```python from typing import Any from pydantic import BaseModel, model_validator class UserModel(BaseModel): username: str @model_validator(mode='before') @classmethod def check_card_number_not_present(cls, data: Any) -> Any: # (1)! if isinstance(data, dict): # (2)! if 'card_number' in data: raise ValueError("'card_number' should not be included") return data ``` 1. Notice the use of Any as a type hint for `data`. *Before* validators take the raw input, which can be anything. 1. Most of the time, the input data will be a dictionary (e.g. when calling `UserModel(username='...')`). However, this is not always the case. For instance, if the from_attributes configuration value is set, you might receive an arbitrary class instance for the `data` argument. - ***Wrap* validators**: are the most flexible of all. You can run code before or after Pydantic and other validators process the input data, or you can terminate validation immediately, either by returning the data early or by raising an error. ```python import logging from typing import Any from typing_extensions import Self from pydantic import BaseModel, ModelWrapValidatorHandler, ValidationError, model_validator class UserModel(BaseModel): username: str @model_validator(mode='wrap') @classmethod def log_failed_validation(cls, data: Any, handler: ModelWrapValidatorHandler[Self]) -> Self: try: return handler(data) except ValidationError: logging.error('Model %s failed to validate with data %s', cls, data) raise ``` On inheritance A model validator defined in a base class will be called during the validation of a subclass instance. Overriding a model validator in a subclass will override the base class' validator, and thus only the subclass' version of said validator will be called. ## Raising validation errors To raise a validation error, three types of exceptions can be used: - ValueError: this is the most common exception raised inside validators. 
## Raising validation errors

To raise a validation error, three types of exceptions can be used:

- ValueError: this is the most common exception raised inside validators.
- AssertionError: using the assert statement also works, but be aware that these statements are skipped when Python is run with the -O optimization flag.
- PydanticCustomError: a bit more verbose, but provides extra flexibility:

```python
from pydantic_core import PydanticCustomError

from pydantic import BaseModel, ValidationError, field_validator


class Model(BaseModel):
    x: int

    @field_validator('x', mode='after')
    @classmethod
    def validate_x(cls, v: int) -> int:
        if v % 42 == 0:
            raise PydanticCustomError(
                'the_answer_error',
                '{number} is the answer!',
                {'number': v},
            )
        return v


try:
    Model(x=42 * 2)
except ValidationError as e:
    print(e)
    """
    1 validation error for Model
    x
      84 is the answer! [type=the_answer_error, input_value=84, input_type=int]
    """
```

## Validation info

Both field and model validator callables (in all modes) can optionally take an extra ValidationInfo argument, providing useful extra information, such as:

- [already validated data](#validation-data)
- [user defined context](#validation-context)
- the current validation mode: either `'python'` or `'json'` (see the mode property)
- the current field name (see the field_name property).

### Validation data

For field validators, the already validated data can be accessed using the data property. Here is an example that can be used as an alternative to the [*after* model validator](#model-after-validator) example:

```python
from pydantic import BaseModel, ValidationInfo, field_validator


class UserModel(BaseModel):
    password: str
    password_repeat: str
    username: str

    @field_validator('password_repeat', mode='after')
    @classmethod
    def check_passwords_match(cls, value: str, info: ValidationInfo) -> str:
        if value != info.data['password']:
            raise ValueError('Passwords do not match')
        return value
```

Warning

As validation is performed in the [order fields are defined](../models/#field-ordering), you have to make sure you are not accessing a field that hasn't been validated yet. In the code above, for example, the `username` validated value is not available yet, as it is defined *after* `password_repeat`.

The data property is `None` for [model validators](#model-validators).

### Validation context

You can pass a context object to the [validation methods](../models/#validating-data), which can be accessed inside the validator functions using the context property:

```python
from pydantic import BaseModel, ValidationInfo, field_validator


class Model(BaseModel):
    text: str

    @field_validator('text', mode='after')
    @classmethod
    def remove_stopwords(cls, v: str, info: ValidationInfo) -> str:
        if isinstance(info.context, dict):
            stopwords = info.context.get('stopwords', set())
            v = ' '.join(w for w in v.split() if w.lower() not in stopwords)
        return v


data = {'text': 'This is an example document'}
print(Model.model_validate(data))  # no context
#> text='This is an example document'
print(Model.model_validate(data, context={'stopwords': ['this', 'is', 'an']}))
#> text='example document'
```

Similarly, you can [use a context for serialization](../serialization/#serialization-context).

Providing context when directly instantiating a model

It is currently not possible to provide a context when directly instantiating a model (i.e. when calling `Model(...)`).
You can work around this through the use of a ContextVar and a custom `__init__` method:

```python
from __future__ import annotations

from contextlib import contextmanager
from contextvars import ContextVar
from typing import Any, Generator

from pydantic import BaseModel, ValidationInfo, field_validator

_init_context_var = ContextVar('_init_context_var', default=None)


@contextmanager
def init_context(value: dict[str, Any]) -> Generator[None]:
    token = _init_context_var.set(value)
    try:
        yield
    finally:
        _init_context_var.reset(token)


class Model(BaseModel):
    my_number: int

    def __init__(self, /, **data: Any) -> None:
        self.__pydantic_validator__.validate_python(
            data,
            self_instance=self,
            context=_init_context_var.get(),
        )

    @field_validator('my_number')
    @classmethod
    def multiply_with_context(cls, value: int, info: ValidationInfo) -> int:
        if isinstance(info.context, dict):
            multiplier = info.context.get('multiplier', 1)
            value = value * multiplier
        return value


print(Model(my_number=2))
#> my_number=2

with init_context({'multiplier': 3}):
    print(Model(my_number=2))
#> my_number=6

print(Model(my_number=2))
#> my_number=2
```

## Ordering of validators

When using the [annotated pattern](#using-the-annotated-pattern), the order in which validators are applied is defined as follows: [*before*](#field-before-validator) and [*wrap*](#field-wrap-validator) validators are run from right to left, and [*after*](#field-after-validator) validators are then run from left to right:

```python
from typing import Annotated, Any

from pydantic import AfterValidator, BaseModel, BeforeValidator, WrapValidator

# Placeholder validator functions, so the snippet runs as written:
def runs_1st(value: Any, handler) -> Any: return handler(value)  # wrap
def runs_2nd(value: Any) -> Any: return value  # before
def runs_3rd(value: str) -> str: return value  # after
def runs_4th(value: str) -> str: return value  # after


class Model(BaseModel):
    name: Annotated[
        str,
        AfterValidator(runs_3rd),
        AfterValidator(runs_4th),
        BeforeValidator(runs_2nd),
        WrapValidator(runs_1st),
    ]
```

Internally, validators defined using [the decorator](#using-the-decorator-pattern) are converted to their annotated form counterpart and appended after the field's existing metadata. This means that the same ordering logic applies.
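To see this ordering in action, here's a small runnable sketch (the logging functions are purely illustrative) that prints when each validator runs:

```python
from typing import Annotated, Any

from pydantic import (
    AfterValidator,
    BaseModel,
    BeforeValidator,
    ValidatorFunctionWrapHandler,
    WrapValidator,
)


def wrap_logger(value: Any, handler: ValidatorFunctionWrapHandler) -> Any:
    print('wrap: before handler')
    result = handler(value)
    print('wrap: after handler')
    return result


def before_logger(value: Any) -> Any:
    print('before')
    return value


def after_logger(value: int) -> int:
    print('after')
    return value


class Model(BaseModel):
    x: Annotated[
        int,
        AfterValidator(after_logger),
        BeforeValidator(before_logger),
        WrapValidator(wrap_logger),
    ]


# The rightmost wrap validator starts first, then the before validator,
# then core validation, then the after validator:
Model(x=1)
#> wrap: before handler
#> before
#> after
#> wrap: after handler
```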
## Special types

Pydantic provides a few special utilities that can be used to customize validation.

- InstanceOf can be used to validate that a value is an instance of a given class.

```python
from pydantic import BaseModel, InstanceOf, ValidationError


class Fruit:
    def __repr__(self):
        return self.__class__.__name__


class Banana(Fruit): ...


class Apple(Fruit): ...


class Basket(BaseModel):
    fruits: list[InstanceOf[Fruit]]


print(Basket(fruits=[Banana(), Apple()]))
#> fruits=[Banana, Apple]
try:
    Basket(fruits=[Banana(), 'Apple'])
except ValidationError as e:
    print(e)
    """
    1 validation error for Basket
    fruits.1
      Input should be an instance of Fruit [type=is_instance_of, input_value='Apple', input_type=str]
    """
```

- SkipValidation can be used to skip validation on a field.

```python
from pydantic import BaseModel, SkipValidation


class Model(BaseModel):
    names: list[SkipValidation[str]]


m = Model(names=['foo', 'bar'])
print(m)
#> names=['foo', 'bar']

m = Model(names=['foo', 123])  # (1)!
print(m)
#> names=['foo', 123]
```

1. Note that the validation of the second item is skipped. If it has the wrong type, it will emit a warning during serialization.

- PydanticUseDefault can be used to notify Pydantic that the default value should be used.

```python
from typing import Annotated, Any

from pydantic_core import PydanticUseDefault

from pydantic import BaseModel, BeforeValidator


def default_if_none(value: Any) -> Any:
    if value is None:
        raise PydanticUseDefault()
    return value


class Model(BaseModel):
    name: Annotated[str, BeforeValidator(default_if_none)] = 'default_name'


print(Model(name=None))
#> name='default_name'
```

## JSON Schema and field validators

When using [*before*](#field-before-validator), [*plain*](#field-plain-validator) or [*wrap*](#field-wrap-validator) field validators, the accepted input type may be different from the field annotation. Consider the following example:

```python
from typing import Any

from pydantic import BaseModel, field_validator


class Model(BaseModel):
    value: str

    @field_validator('value', mode='before')
    @classmethod
    def cast_ints(cls, value: Any) -> Any:
        if isinstance(value, int):
            return str(value)
        else:
            return value


print(Model(value='a'))
#> value='a'
print(Model(value=1))
#> value='1'
```

While the type hint for `value` is `str`, the `cast_ints` validator also allows integers. To specify the correct input type, the `json_schema_input_type` argument can be provided:

```python
from typing import Any, Union

from pydantic import BaseModel, field_validator


class Model(BaseModel):
    value: str

    @field_validator(
        'value', mode='before', json_schema_input_type=Union[int, str]
    )
    @classmethod
    def cast_ints(cls, value: Any) -> Any:
        if isinstance(value, int):
            return str(value)
        else:
            return value


print(Model.model_json_schema()['properties']['value'])
#> {'anyOf': [{'type': 'integer'}, {'type': 'string'}], 'title': 'Value'}
```

As a convenience, Pydantic will use the field type if the argument is not provided (unless you are using a [*plain*](#field-plain-validator) validator, in which case `json_schema_input_type` defaults to Any as the field type is completely discarded).

# API documentation

Support for alias configurations.

## AliasPath

```python
AliasPath(first_arg: str, *args: str | int)
```

Usage Documentation

[`AliasPath` and `AliasChoices`](../../concepts/alias/#aliaspath-and-aliaschoices)

A data class used by `validation_alias` as a convenience to create aliases.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `path` | `list[int | str]` | A list of string or integer aliases. |

Source code in `pydantic/aliases.py`

```python
def __init__(self, first_arg: str, *args: str | int) -> None:
    self.path = [first_arg] + list(args)
```

### convert_to_aliases

```python
convert_to_aliases() -> list[str | int]
```

Converts arguments to a list of string or integer aliases.

Returns:

| Type | Description |
| --- | --- |
| `list[str | int]` | The list of aliases. |

Source code in `pydantic/aliases.py`

```python
def convert_to_aliases(self) -> list[str | int]:
    """Converts arguments to a list of string or integer aliases.

    Returns:
        The list of aliases.
    """
    return self.path
```

### search_dict_for_path

```python
search_dict_for_path(d: dict) -> Any
```

Searches a dictionary for the path specified by the alias.

Returns:

| Type | Description |
| --- | --- |
| `Any` | The value at the specified path, or PydanticUndefined if the path is not found. |

Source code in `pydantic/aliases.py`

```python
def search_dict_for_path(self, d: dict) -> Any:
    """Searches a dictionary for the path specified by the alias.

    Returns:
        The value at the specified path, or `PydanticUndefined` if the path is not found.
    """
    v = d
    for k in self.path:
        if isinstance(v, str):
            # disallow indexing into a str, like for AliasPath('x', 0) and x='abc'
            return PydanticUndefined
        try:
            v = v[k]
        except (KeyError, IndexError, TypeError):
            return PydanticUndefined
    return v
```

## AliasChoices

```python
AliasChoices(
    first_choice: str | AliasPath, *choices: str | AliasPath
)
```

Usage Documentation

[`AliasPath` and `AliasChoices`](../../concepts/alias/#aliaspath-and-aliaschoices)

A data class used by `validation_alias` as a convenience to create aliases.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `choices` | `list[str | AliasPath]` | A list of strings or AliasPath instances. |

Source code in `pydantic/aliases.py`

```python
def __init__(self, first_choice: str | AliasPath, *choices: str | AliasPath) -> None:
    self.choices = [first_choice] + list(choices)
```

### convert_to_aliases

```python
convert_to_aliases() -> list[list[str | int]]
```

Converts arguments to a list of lists containing string or integer aliases.

Returns:

| Type | Description |
| --- | --- |
| `list[list[str | int]]` | The list of aliases. |

Source code in `pydantic/aliases.py`

```python
def convert_to_aliases(self) -> list[list[str | int]]:
    """Converts arguments to a list of lists containing string or integer aliases.

    Returns:
        The list of aliases.
    """
    aliases: list[list[str | int]] = []
    for c in self.choices:
        if isinstance(c, AliasPath):
            aliases.append(c.convert_to_aliases())
        else:
            aliases.append([c])
    return aliases
```

## AliasGenerator

```python
AliasGenerator(
    alias: Callable[[str], str] | None = None,
    validation_alias: (
        Callable[[str], str | AliasPath | AliasChoices] | None
    ) = None,
    serialization_alias: Callable[[str], str] | None = None,
)
```

Usage Documentation

[Using an `AliasGenerator`](../../concepts/alias/#using-an-aliasgenerator)

A data class used by `alias_generator` as a convenience to create various aliases.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `alias` | `Callable[[str], str] | None` | A callable that takes a field name and returns an alias for it. |
| `validation_alias` | `Callable[[str], str | AliasPath | AliasChoices] | None` | A callable that takes a field name and returns a validation alias for it. |
| `serialization_alias` | `Callable[[str], str] | None` | A callable that takes a field name and returns a serialization alias for it. |

### generate_aliases

```python
generate_aliases(
    field_name: str,
) -> tuple[
    str | None,
    str | AliasPath | AliasChoices | None,
    str | None,
]
```

Generate `alias`, `validation_alias`, and `serialization_alias` for a field.

Returns:

| Type | Description |
| --- | --- |
| `tuple[str | None, str | AliasPath | AliasChoices | None, str | None]` | A tuple of three aliases: alias, validation alias, and serialization alias, in that order. |

Source code in `pydantic/aliases.py`

```python
def generate_aliases(self, field_name: str) -> tuple[str | None, str | AliasPath | AliasChoices | None, str | None]:
    """Generate `alias`, `validation_alias`, and `serialization_alias` for a field.

    Returns:
        A tuple of three aliases: alias, validation alias, and serialization alias, in that order.
    """
    alias = self._generate_alias('alias', (str,), field_name)
    validation_alias = self._generate_alias('validation_alias', (str, AliasChoices, AliasPath), field_name)
    serialization_alias = self._generate_alias('serialization_alias', (str,), field_name)
    return alias, validation_alias, serialization_alias  # type: ignore
```

Type annotations to use with `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__`.
## GetJsonSchemaHandler Handler to call into the next JSON schema generation function. Attributes: | Name | Type | Description | | --- | --- | --- | | `mode` | `JsonSchemaMode` | Json schema mode, can be validation or serialization. | ### resolve_ref_schema ```python resolve_ref_schema( maybe_ref_json_schema: JsonSchemaValue, ) -> JsonSchemaValue ``` Get the real schema for a `{"$ref": ...}` schema. If the schema given is not a `$ref` schema, it will be returned as is. This means you don't have to check before calling this function. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `maybe_ref_json_schema` | `JsonSchemaValue` | A JsonSchemaValue which may be a $ref schema. | *required* | Raises: | Type | Description | | --- | --- | | `LookupError` | If the ref is not found. | Returns: | Name | Type | Description | | --- | --- | --- | | `JsonSchemaValue` | `JsonSchemaValue` | A JsonSchemaValue that has no $ref. | Source code in `pydantic/annotated_handlers.py` ```python def resolve_ref_schema(self, maybe_ref_json_schema: JsonSchemaValue, /) -> JsonSchemaValue: """Get the real schema for a `{"$ref": ...}` schema. If the schema given is not a `$ref` schema, it will be returned as is. This means you don't have to check before calling this function. Args: maybe_ref_json_schema: A JsonSchemaValue which may be a `$ref` schema. Raises: LookupError: If the ref is not found. Returns: JsonSchemaValue: A JsonSchemaValue that has no `$ref`. """ raise NotImplementedError ``` ## GetCoreSchemaHandler Handler to call into the next CoreSchema schema generation function. ### field_name ```python field_name: str | None ``` Get the name of the closest field to this validator. ### generate_schema ```python generate_schema(source_type: Any) -> CoreSchema ``` Generate a schema unrelated to the current context. Use this function if e.g. you are handling schema generation for a sequence and want to generate a schema for its items. Otherwise, you may end up doing something like applying a `min_length` constraint that was intended for the sequence itself to its items! Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `source_type` | `Any` | The input type. | *required* | Returns: | Name | Type | Description | | --- | --- | --- | | `CoreSchema` | `CoreSchema` | The pydantic-core CoreSchema generated. | Source code in `pydantic/annotated_handlers.py` ```python def generate_schema(self, source_type: Any, /) -> core_schema.CoreSchema: """Generate a schema unrelated to the current context. Use this function if e.g. you are handling schema generation for a sequence and want to generate a schema for its items. Otherwise, you may end up doing something like applying a `min_length` constraint that was intended for the sequence itself to its items! Args: source_type: The input type. Returns: CoreSchema: The `pydantic-core` CoreSchema generated. """ raise NotImplementedError ``` ### resolve_ref_schema ```python resolve_ref_schema( maybe_ref_schema: CoreSchema, ) -> CoreSchema ``` Get the real schema for a `definition-ref` schema. If the schema given is not a `definition-ref` schema, it will be returned as is. This means you don't have to check before calling this function. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `maybe_ref_schema` | `CoreSchema` | A CoreSchema, ref-based or not. | *required* | Raises: | Type | Description | | --- | --- | | `LookupError` | If the ref is not found. 
| Returns: | Type | Description | | --- | --- | | `CoreSchema` | A concrete CoreSchema. | Source code in `pydantic/annotated_handlers.py` ```python def resolve_ref_schema(self, maybe_ref_schema: core_schema.CoreSchema, /) -> core_schema.CoreSchema: """Get the real schema for a `definition-ref` schema. If the schema given is not a `definition-ref` schema, it will be returned as is. This means you don't have to check before calling this function. Args: maybe_ref_schema: A `CoreSchema`, `ref`-based or not. Raises: LookupError: If the `ref` is not found. Returns: A concrete `CoreSchema`. """ raise NotImplementedError ``` Pydantic models are simply classes which inherit from `BaseModel` and define fields as annotated attributes. ## pydantic.BaseModel Usage Documentation [Models](../../concepts/models/) A base class for creating Pydantic models. Attributes: | Name | Type | Description | | --- | --- | --- | | `__class_vars__` | `set[str]` | The names of the class variables defined on the model. | | `__private_attributes__` | `Dict[str, ModelPrivateAttr]` | Metadata about the private attributes of the model. | | `__signature__` | `Signature` | The synthesized __init__ Signature of the model. | | `__pydantic_complete__` | `bool` | Whether model building is completed, or if there are still undefined fields. | | `__pydantic_core_schema__` | `CoreSchema` | The core schema of the model. | | `__pydantic_custom_init__` | `bool` | Whether the model has a custom __init__ function. | | `__pydantic_decorators__` | `DecoratorInfos` | Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1. | | `__pydantic_generic_metadata__` | `PydanticGenericMetadata` | Metadata for generic models; contains data used for a similar purpose to args, origin, parameters in typing-module generics. May eventually be replaced by these. | | `__pydantic_parent_namespace__` | `Dict[str, Any] | None` | Parent namespace of the model, used for automatic rebuilding of models. | | `__pydantic_post_init__` | `None | Literal['model_post_init']` | The name of the post-init method for the model, if defined. | | `__pydantic_root_model__` | `bool` | Whether the model is a RootModel. | | `__pydantic_serializer__` | `SchemaSerializer` | The pydantic-core SchemaSerializer used to dump instances of the model. | | `__pydantic_validator__` | `SchemaValidator | PluggableSchemaValidator` | The pydantic-core SchemaValidator used to validate instances of the model. | | `__pydantic_fields__` | `Dict[str, FieldInfo]` | A dictionary of field names and their corresponding FieldInfo objects. | | `__pydantic_computed_fields__` | `Dict[str, ComputedFieldInfo]` | A dictionary of computed field names and their corresponding ComputedFieldInfo objects. | | `__pydantic_extra__` | `dict[str, Any] | None` | A dictionary containing extra values, if extra is set to 'allow'. | | `__pydantic_fields_set__` | `set[str]` | The names of fields explicitly set during instantiation. | | `__pydantic_private__` | `dict[str, Any] | None` | Values of private attributes set on the model instance. | Source code in `pydantic/main.py` ````python class BaseModel(metaclass=_model_construction.ModelMetaclass): """!!! abstract "Usage Documentation" [Models](../concepts/models.md) A base class for creating Pydantic models. Attributes: __class_vars__: The names of the class variables defined on the model. __private_attributes__: Metadata about the private attributes of the model. 
__signature__: The synthesized `__init__` [`Signature`][inspect.Signature] of the model. __pydantic_complete__: Whether model building is completed, or if there are still undefined fields. __pydantic_core_schema__: The core schema of the model. __pydantic_custom_init__: Whether the model has a custom `__init__` function. __pydantic_decorators__: Metadata containing the decorators defined on the model. This replaces `Model.__validators__` and `Model.__root_validators__` from Pydantic V1. __pydantic_generic_metadata__: Metadata for generic models; contains data used for a similar purpose to __args__, __origin__, __parameters__ in typing-module generics. May eventually be replaced by these. __pydantic_parent_namespace__: Parent namespace of the model, used for automatic rebuilding of models. __pydantic_post_init__: The name of the post-init method for the model, if defined. __pydantic_root_model__: Whether the model is a [`RootModel`][pydantic.root_model.RootModel]. __pydantic_serializer__: The `pydantic-core` `SchemaSerializer` used to dump instances of the model. __pydantic_validator__: The `pydantic-core` `SchemaValidator` used to validate instances of the model. __pydantic_fields__: A dictionary of field names and their corresponding [`FieldInfo`][pydantic.fields.FieldInfo] objects. __pydantic_computed_fields__: A dictionary of computed field names and their corresponding [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects. __pydantic_extra__: A dictionary containing extra values, if [`extra`][pydantic.config.ConfigDict.extra] is set to `'allow'`. __pydantic_fields_set__: The names of fields explicitly set during instantiation. __pydantic_private__: Values of private attributes set on the model instance. """ # Note: Many of the below class vars are defined in the metaclass, but we define them here for type checking purposes. model_config: ClassVar[ConfigDict] = ConfigDict() """ Configuration for the model, should be a dictionary conforming to [`ConfigDict`][pydantic.config.ConfigDict]. """ __class_vars__: ClassVar[set[str]] """The names of the class variables defined on the model.""" __private_attributes__: ClassVar[Dict[str, ModelPrivateAttr]] # noqa: UP006 """Metadata about the private attributes of the model.""" __signature__: ClassVar[Signature] """The synthesized `__init__` [`Signature`][inspect.Signature] of the model.""" __pydantic_complete__: ClassVar[bool] = False """Whether model building is completed, or if there are still undefined fields.""" __pydantic_core_schema__: ClassVar[CoreSchema] """The core schema of the model.""" __pydantic_custom_init__: ClassVar[bool] """Whether the model has a custom `__init__` method.""" # Must be set for `GenerateSchema.model_schema` to work for a plain `BaseModel` annotation. __pydantic_decorators__: ClassVar[_decorators.DecoratorInfos] = _decorators.DecoratorInfos() """Metadata containing the decorators defined on the model. This replaces `Model.__validators__` and `Model.__root_validators__` from Pydantic V1.""" __pydantic_generic_metadata__: ClassVar[_generics.PydanticGenericMetadata] """Metadata for generic models; contains data used for a similar purpose to __args__, __origin__, __parameters__ in typing-module generics. 
May eventually be replaced by these.""" __pydantic_parent_namespace__: ClassVar[Dict[str, Any] | None] = None # noqa: UP006 """Parent namespace of the model, used for automatic rebuilding of models.""" __pydantic_post_init__: ClassVar[None | Literal['model_post_init']] """The name of the post-init method for the model, if defined.""" __pydantic_root_model__: ClassVar[bool] = False """Whether the model is a [`RootModel`][pydantic.root_model.RootModel].""" __pydantic_serializer__: ClassVar[SchemaSerializer] """The `pydantic-core` `SchemaSerializer` used to dump instances of the model.""" __pydantic_validator__: ClassVar[SchemaValidator | PluggableSchemaValidator] """The `pydantic-core` `SchemaValidator` used to validate instances of the model.""" __pydantic_fields__: ClassVar[Dict[str, FieldInfo]] # noqa: UP006 """A dictionary of field names and their corresponding [`FieldInfo`][pydantic.fields.FieldInfo] objects. This replaces `Model.__fields__` from Pydantic V1. """ __pydantic_setattr_handlers__: ClassVar[Dict[str, Callable[[BaseModel, str, Any], None]]] # noqa: UP006 """`__setattr__` handlers. Memoizing the handlers leads to a dramatic performance improvement in `__setattr__`""" __pydantic_computed_fields__: ClassVar[Dict[str, ComputedFieldInfo]] # noqa: UP006 """A dictionary of computed field names and their corresponding [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects.""" __pydantic_extra__: dict[str, Any] | None = _model_construction.NoInitField(init=False) """A dictionary containing extra values, if [`extra`][pydantic.config.ConfigDict.extra] is set to `'allow'`.""" __pydantic_fields_set__: set[str] = _model_construction.NoInitField(init=False) """The names of fields explicitly set during instantiation.""" __pydantic_private__: dict[str, Any] | None = _model_construction.NoInitField(init=False) """Values of private attributes set on the model instance.""" if not TYPE_CHECKING: # Prevent `BaseModel` from being instantiated directly # (defined in an `if not TYPE_CHECKING` block for clarity and to avoid type checking errors): __pydantic_core_schema__ = _mock_val_ser.MockCoreSchema( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', code='base-model-instantiated', ) __pydantic_validator__ = _mock_val_ser.MockValSer( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', val_or_ser='validator', code='base-model-instantiated', ) __pydantic_serializer__ = _mock_val_ser.MockValSer( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', val_or_ser='serializer', code='base-model-instantiated', ) __slots__ = '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__' def __init__(self, /, **data: Any) -> None: """Create a new model by parsing and validating input data from keyword arguments. Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model. `self` is explicitly positional-only to allow `self` as a field name. 
""" # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self) if self is not validated_self: warnings.warn( 'A custom validator is returning a value other than `self`.\n' "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n" 'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.', stacklevel=2, ) # The following line sets a flag that we use to determine when `__init__` gets overridden by the user __init__.__pydantic_base_init__ = True # pyright: ignore[reportFunctionMemberAccess] @_utils.deprecated_instance_property @classmethod def model_fields(cls) -> dict[str, FieldInfo]: """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances. !!! warning Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class. """ return getattr(cls, '__pydantic_fields__', {}) @_utils.deprecated_instance_property @classmethod def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]: """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances. !!! warning Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class. """ return getattr(cls, '__pydantic_computed_fields__', {}) @property def model_extra(self) -> dict[str, Any] | None: """Get extra fields set during validation. Returns: A dictionary of extra fields, or `None` if `config.extra` is not set to `"allow"`. """ return self.__pydantic_extra__ @property def model_fields_set(self) -> set[str]: """Returns the set of fields that have been explicitly set on this model instance. Returns: A set of strings representing the fields that have been set, i.e. that were not filled from defaults. """ return self.__pydantic_fields_set__ @classmethod def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: C901 """Creates a new instance of the `Model` class with validated data. Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data. Default values are respected, but no other validation is performed. !!! note `model_construct()` generally respects the `model_config.extra` setting on the provided model. That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__` and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored. Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in an error if extra values are passed, but they will be ignored. Args: _fields_set: A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute. Otherwise, the field names from the `values` argument will be used. values: Trusted or pre-validated data dictionary. Returns: A new instance of the `Model` class with validated data. 
""" m = cls.__new__(cls) fields_values: dict[str, Any] = {} fields_set = set() for name, field in cls.__pydantic_fields__.items(): if field.alias is not None and field.alias in values: fields_values[name] = values.pop(field.alias) fields_set.add(name) if (name not in fields_set) and (field.validation_alias is not None): validation_aliases: list[str | AliasPath] = ( field.validation_alias.choices if isinstance(field.validation_alias, AliasChoices) else [field.validation_alias] ) for alias in validation_aliases: if isinstance(alias, str) and alias in values: fields_values[name] = values.pop(alias) fields_set.add(name) break elif isinstance(alias, AliasPath): value = alias.search_dict_for_path(values) if value is not PydanticUndefined: fields_values[name] = value fields_set.add(name) break if name not in fields_set: if name in values: fields_values[name] = values.pop(name) fields_set.add(name) elif not field.is_required(): fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values) if _fields_set is None: _fields_set = fields_set _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None _object_setattr(m, '__dict__', fields_values) _object_setattr(m, '__pydantic_fields_set__', _fields_set) if not cls.__pydantic_root_model__: _object_setattr(m, '__pydantic_extra__', _extra) if cls.__pydantic_post_init__: m.model_post_init(None) # update private attributes with values set if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None: for k, v in values.items(): if k in m.__private_attributes__: m.__pydantic_private__[k] = v elif not cls.__pydantic_root_model__: # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist # Since it doesn't, that means that `__pydantic_private__` should be set to None _object_setattr(m, '__pydantic_private__', None) return m def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self: """!!! abstract "Usage Documentation" [`model_copy`](../concepts/serialization.md#model_copy) Returns a copy of the model. !!! note The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]). Args: update: Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data. deep: Set to `True` to make a deep copy of the model. Returns: New model instance. """ copied = self.__deepcopy__() if deep else self.__copy__() if update: if self.model_config.get('extra') == 'allow': for k, v in update.items(): if k in self.__pydantic_fields__: copied.__dict__[k] = v else: if copied.__pydantic_extra__ is None: copied.__pydantic_extra__ = {} copied.__pydantic_extra__[k] = v else: copied.__dict__.update(update) copied.__pydantic_fields_set__.update(update.keys()) return copied def model_dump( self, *, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> dict[str, Any]: """!!! 
abstract "Usage Documentation" [`model_dump`](../concepts/serialization.md#modelmodel_dump) Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. Args: mode: The mode in which `to_python` should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects. include: A set of fields to include in the output. exclude: A set of fields to exclude from the output. context: Additional context to pass to the serializer. by_alias: Whether to use the field's alias in the dictionary key if defined. exclude_unset: Whether to exclude fields that have not been explicitly set. exclude_defaults: Whether to exclude fields that are set to their default value. exclude_none: Whether to exclude fields that have a value of `None`. round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T]. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. fallback: A function to call when an unknown value is encountered. If not provided, a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. Returns: A dictionary representation of the model. """ return self.__pydantic_serializer__.to_python( self, mode=mode, by_alias=by_alias, include=include, exclude=exclude, context=context, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ) def model_dump_json( self, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> str: """!!! abstract "Usage Documentation" [`model_dump_json`](../concepts/serialization.md#modelmodel_dump_json) Generates a JSON representation of the model using Pydantic's `to_json` method. Args: indent: Indentation to use in the JSON output. If None is passed, the output will be compact. include: Field(s) to include in the JSON output. exclude: Field(s) to exclude from the JSON output. context: Additional context to pass to the serializer. by_alias: Whether to serialize using field aliases. exclude_unset: Whether to exclude fields that have not been explicitly set. exclude_defaults: Whether to exclude fields that are set to their default value. exclude_none: Whether to exclude fields that have a value of `None`. round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T]. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. fallback: A function to call when an unknown value is encountered. If not provided, a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. 
Returns: A JSON string representation of the model. """ return self.__pydantic_serializer__.to_json( self, indent=indent, include=include, exclude=exclude, context=context, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ).decode() @classmethod def model_json_schema( cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Generates a JSON schema for a model class. Args: by_alias: Whether to use attribute aliases or not. ref_template: The reference template. schema_generator: To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications mode: The mode in which to generate the schema. Returns: The JSON schema for the given model class. """ return model_json_schema( cls, by_alias=by_alias, ref_template=ref_template, schema_generator=schema_generator, mode=mode ) @classmethod def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str: """Compute the class name for parametrizations of generic classes. This method can be overridden to achieve a custom naming scheme for generic BaseModels. Args: params: Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. Returns: String representing the new class where `params` are passed to `cls` as type variables. Raises: TypeError: Raised when trying to generate concrete names for non-generic models. """ if not issubclass(cls, typing.Generic): raise TypeError('Concrete names should only be generated for generic models.') # Any strings received should represent forward references, so we handle them specially below. # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future, # we may be able to remove this special case. param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params] params_component = ', '.join(param_names) return f'{cls.__name__}[{params_component}]' def model_post_init(self, context: Any, /) -> None: """Override this method to perform additional initialization after `__init__` and `model_construct`. This is useful if you want to do some validation that requires the entire model to be initialized. """ pass @classmethod def model_rebuild( cls, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the model. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Args: force: Whether to force the rebuilding of the model schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: The depth level of the parent namespace, defaults to 2. _types_namespace: The types namespace, defaults to `None`. Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. 
""" if not force and cls.__pydantic_complete__: return None for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'): if attr in cls.__dict__: # Deleting the validator/serializer is necessary as otherwise they can get reused in # pydantic-core. Same applies for the core schema that can be reused in schema generation. delattr(cls, attr) cls.__pydantic_complete__ = False if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {} ns_resolver = _namespace_utils.NsResolver( parent_namespace={**rebuild_ns, **parent_ns}, ) if not cls.__pydantic_fields_complete__: typevars_map = _generics.get_model_typevars_map(cls) try: cls.__pydantic_fields__ = _fields.rebuild_model_fields( cls, ns_resolver=ns_resolver, typevars_map=typevars_map, ) except NameError as e: exc = PydanticUndefinedAnnotation.from_name_error(e) _mock_val_ser.set_model_mocks(cls, f'`{exc.name}`') if raise_errors: raise exc from e if not raise_errors and not cls.__pydantic_fields_complete__: # No need to continue with schema gen, it is guaranteed to fail return False assert cls.__pydantic_fields_complete__ return _model_construction.complete_model_class( cls, _config.ConfigWrapper(cls.model_config, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, ) @classmethod def model_validate( cls, obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self: """Validate a pydantic model instance. Args: obj: The object to validate. strict: Whether to enforce types strictly. from_attributes: Whether to extract data from object attributes. context: Additional context to pass to the validator. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Raises: ValidationError: If the object could not be validated. Returns: The validated model instance. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_python( obj, strict=strict, from_attributes=from_attributes, context=context, by_alias=by_alias, by_name=by_name ) @classmethod def model_validate_json( cls, json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self: """!!! abstract "Usage Documentation" [JSON Parsing](../concepts/json.md#json-parsing) Validate the given JSON data against the Pydantic model. Args: json_data: The JSON data to validate. strict: Whether to enforce types strictly. context: Extra variables to pass to the validator. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Returns: The validated Pydantic model. Raises: ValidationError: If `json_data` is not a JSON string or the object could not be validated. 
""" # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_json( json_data, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) @classmethod def model_validate_strings( cls, obj: Any, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self: """Validate the given object with string data against the Pydantic model. Args: obj: The object containing string data to validate. strict: Whether to enforce types strictly. context: Extra variables to pass to the validator. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Returns: The validated Pydantic model. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_strings( obj, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) @classmethod def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema: # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass. # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to # *not* be called if not overridden. warnings.warn( 'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling ' '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using ' '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected ' 'side effects.', PydanticDeprecatedSince211, stacklevel=2, ) # Logic copied over from `GenerateSchema._model_schema`: schema = cls.__dict__.get('__pydantic_core_schema__') if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema): return cls.__pydantic_core_schema__ return handler(source) @classmethod def __get_pydantic_json_schema__( cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler, /, ) -> JsonSchemaValue: """Hook into generating the model's JSON schema. Args: core_schema: A `pydantic-core` CoreSchema. You can ignore this argument and call the handler with a new CoreSchema, wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`), or just call the handler with the original schema. handler: Call into Pydantic's internal JSON schema generation. This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema generation fails. Since this gets called by `BaseModel.model_json_schema` you can override the `schema_generator` argument to that function to change JSON schema generation globally for a type. Returns: A JSON schema, as a Python object. 
""" return handler(core_schema) @classmethod def __pydantic_init_subclass__(cls, **kwargs: Any) -> None: """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass` only after the class is actually fully initialized. In particular, attributes like `model_fields` will be present when this is called. This is necessary because `__init_subclass__` will always be called by `type.__new__`, and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that `type.__new__` was called in such a manner that the class would already be sufficiently initialized. This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely, any kwargs passed to the class definition that aren't used internally by pydantic. Args: **kwargs: Any keyword arguments passed to the class definition that aren't used internally by pydantic. """ pass def __class_getitem__( cls, typevar_values: type[Any] | tuple[type[Any], ...] ) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef: cached = _generics.get_cached_generic_type_early(cls, typevar_values) if cached is not None: return cached if cls is BaseModel: raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel') if not hasattr(cls, '__parameters__'): raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic') if not cls.__pydantic_generic_metadata__['parameters'] and typing.Generic not in cls.__bases__: raise TypeError(f'{cls} is not a generic class') if not isinstance(typevar_values, tuple): typevar_values = (typevar_values,) # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`, # this gives us `{T: str, U: bool, V: int}`: typevars_map = _generics.map_generic_model_arguments(cls, typevar_values) # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`): typevar_values = tuple(v for v in typevars_map.values()) if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map: submodel = cls # if arguments are equal to parameters it's the same object _generics.set_cached_generic_type(cls, typevar_values, submodel) else: parent_args = cls.__pydantic_generic_metadata__['args'] if not parent_args: args = typevar_values else: args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args) origin = cls.__pydantic_generic_metadata__['origin'] or cls model_name = origin.model_parametrized_name(args) params = tuple( {param: None for param in _generics.iter_contained_typevars(typevars_map.values())} ) # use dict as ordered set with _generics.generic_recursion_self_type(origin, args) as maybe_self_type: cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args) if cached is not None: return cached if maybe_self_type is not None: return maybe_self_type # Attempt to rebuild the origin in case new types have been defined try: # depth 2 gets you above this __class_getitem__ call. # Note that we explicitly provide the parent ns, otherwise # `model_rebuild` will use the parent ns no matter if it is the ns of a module. # We don't want this here, as this has unexpected effects when a model # is being parametrized during a forward annotation evaluation. 
parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {} origin.model_rebuild(_types_namespace=parent_ns) except PydanticUndefinedAnnotation: # It's okay if it fails, it just means there are still undefined types # that could be evaluated later. pass submodel = _generics.create_generic_submodel(model_name, origin, args, params) # Cache the generated model *only* if not in the process of parametrizing # another model. In some valid scenarios, we miss the opportunity to cache # it but in some cases this results in `PydanticRecursiveRef` instances left # on `FieldInfo` annotations: if len(_generics.recursively_defined_type_refs()) == 1: _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args) return submodel def __copy__(self) -> Self: """Returns a shallow copy of the model.""" cls = type(self) m = cls.__new__(cls) _object_setattr(m, '__dict__', copy(self.__dict__)) _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__)) _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__)) if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None: _object_setattr(m, '__pydantic_private__', None) else: _object_setattr( m, '__pydantic_private__', {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, ) return m def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self: """Returns a deep copy of the model.""" cls = type(self) m = cls.__new__(cls) _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo)) _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo)) # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str], # and attempting a deepcopy would be marginally slower. _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__)) if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None: _object_setattr(m, '__pydantic_private__', None) else: _object_setattr( m, '__pydantic_private__', deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo), ) return m if not TYPE_CHECKING: # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access # The same goes for __setattr__ and __delattr__, see: https://github.com/pydantic/pydantic/issues/8643 def __getattr__(self, item: str) -> Any: private_attributes = object.__getattribute__(self, '__private_attributes__') if item in private_attributes: attribute = private_attributes[item] if hasattr(attribute, '__get__'): return attribute.__get__(self, type(self)) # type: ignore try: # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items return self.__pydantic_private__[item] # type: ignore except KeyError as exc: raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc else: # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized. 
# See `BaseModel.__repr_args__` for more details try: pydantic_extra = object.__getattribute__(self, '__pydantic_extra__') except AttributeError: pydantic_extra = None if pydantic_extra: try: return pydantic_extra[item] except KeyError as exc: raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc else: if hasattr(self.__class__, item): return super().__getattribute__(item) # Raises AttributeError if appropriate else: # this is the current error raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') def __setattr__(self, name: str, value: Any) -> None: if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None: setattr_handler(self, name, value) # if None is returned from _setattr_handler, the attribute was set directly elif (setattr_handler := self._setattr_handler(name, value)) is not None: setattr_handler(self, name, value) # call here to not memo on possibly unknown fields self.__pydantic_setattr_handlers__[name] = setattr_handler # memoize the handler for faster access def _setattr_handler(self, name: str, value: Any) -> Callable[[BaseModel, str, Any], None] | None: """Get a handler for setting an attribute on the model instance. Returns: A handler for setting an attribute on the model instance. Used for memoization of the handler. Memoizing the handlers leads to a dramatic performance improvement in `__setattr__` Returns `None` when memoization is not safe, then the attribute is set directly. """ cls = self.__class__ if name in cls.__class_vars__: raise AttributeError( f'{name!r} is a ClassVar of `{cls.__name__}` and cannot be set on an instance. ' f'If you want to set a value on the class, use `{cls.__name__}.{name} = value`.' ) elif not _fields.is_valid_field_name(name): if (attribute := cls.__private_attributes__.get(name)) is not None: if hasattr(attribute, '__set__'): return lambda model, _name, val: attribute.__set__(model, val) else: return _SIMPLE_SETATTR_HANDLERS['private'] else: _object_setattr(self, name, value) return None # Can not return memoized handler with possibly freeform attr names attr = getattr(cls, name, None) # NOTE: We currently special case properties and `cached_property`, but we might need # to generalize this to all data/non-data descriptors at some point. For non-data descriptors # (such as `cached_property`), it isn't obvious though. `cached_property` caches the value # to the instance's `__dict__`, but other non-data descriptors might do things differently. if isinstance(attr, cached_property): return _SIMPLE_SETATTR_HANDLERS['cached_property'] _check_frozen(cls, name, value) # We allow properties to be set only on non frozen models for now (to match dataclasses). # This can be changed if it ever gets requested. 
if isinstance(attr, property): return lambda model, _name, val: attr.__set__(model, val) elif cls.model_config.get('validate_assignment'): return _SIMPLE_SETATTR_HANDLERS['validate_assignment'] elif name not in cls.__pydantic_fields__: if cls.model_config.get('extra') != 'allow': # TODO - matching error raise ValueError(f'"{cls.__name__}" object has no field "{name}"') elif attr is None: # attribute does not exist, so put it in extra self.__pydantic_extra__[name] = value return None # Can not return memoized handler with possibly freeform attr names else: # attribute _does_ exist, and was not in extra, so update it return _SIMPLE_SETATTR_HANDLERS['extra_known'] else: return _SIMPLE_SETATTR_HANDLERS['model_field'] def __delattr__(self, item: str) -> Any: cls = self.__class__ if item in self.__private_attributes__: attribute = self.__private_attributes__[item] if hasattr(attribute, '__delete__'): attribute.__delete__(self) # type: ignore return try: # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items del self.__pydantic_private__[item] # type: ignore return except KeyError as exc: raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc # Allow cached properties to be deleted (even if the class is frozen): attr = getattr(cls, item, None) if isinstance(attr, cached_property): return object.__delattr__(self, item) _check_frozen(cls, name=item, value=None) if item in self.__pydantic_fields__: object.__delattr__(self, item) elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__: del self.__pydantic_extra__[item] else: try: object.__delattr__(self, item) except AttributeError: raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') # Because we make use of `@dataclass_transform()`, `__replace__` is already synthesized by # type checkers, so we define the implementation in this `if not TYPE_CHECKING:` block: def __replace__(self, **changes: Any) -> Self: return self.model_copy(update=changes) def __getstate__(self) -> dict[Any, Any]: private = self.__pydantic_private__ if private: private = {k: v for k, v in private.items() if v is not PydanticUndefined} return { '__dict__': self.__dict__, '__pydantic_extra__': self.__pydantic_extra__, '__pydantic_fields_set__': self.__pydantic_fields_set__, '__pydantic_private__': private, } def __setstate__(self, state: dict[Any, Any]) -> None: _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {})) _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {})) _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {})) _object_setattr(self, '__dict__', state.get('__dict__', {})) if not TYPE_CHECKING: def __eq__(self, other: Any) -> bool: if isinstance(other, BaseModel): # When comparing instances of generic types for equality, as long as all field values are equal, # only require their generic origin types to be equal, rather than exact type equality. # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1). self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ # Perform common checks first if not ( self_type == other_type and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None) and self.__pydantic_extra__ == other.__pydantic_extra__ ): return False # We only want to compare pydantic fields but ignoring fields is costly. 
# We'll perform a fast check first, and fallback only when needed # See GH-7444 and GH-7825 for rationale and a performance benchmark # First, do the fast (and sometimes faulty) __dict__ comparison if self.__dict__ == other.__dict__: # If the check above passes, then pydantic fields are equal, we can return early return True # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return # early if there are no keys to ignore (we would just return False later on anyway) model_fields = type(self).__pydantic_fields__.keys() if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields: return False # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore # Resort to costly filtering of the __dict__ objects # We use operator.itemgetter because it is much faster than dict comprehensions # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute # raises an error in BaseModel.__getattr__ instead of returning the class attribute # So we can use operator.itemgetter() instead of operator.attrgetter() getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL try: return getter(self.__dict__) == getter(other.__dict__) except KeyError: # In rare cases (such as when using the deprecated BaseModel.copy() method), # the __dict__ may not contain all model fields, which is how we can get here. # getter(self.__dict__) is much faster than any 'safe' method that accounts # for missing keys, and wrapping it in a `try` doesn't slow things down much # in the common case. self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__) other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__) return getter(self_fields_proxy) == getter(other_fields_proxy) # other instance is not a BaseModel else: return NotImplemented # delegate to the other item in the comparison if TYPE_CHECKING: # We put `__init_subclass__` in a TYPE_CHECKING block because, even though we want the type-checking benefits # described in the signature of `__init_subclass__` below, we don't want to modify the default behavior of # subclass initialization. def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]): """This signature is included purely to help type-checkers check arguments to class declaration, which provides a way to conveniently set model_config key/value pairs. ```python from pydantic import BaseModel class MyModel(BaseModel, extra='allow'): ... ``` However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any of the config arguments, and will only receive any keyword arguments passed during class initialization that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.) Args: **kwargs: Keyword arguments passed to the class definition, which set model_config Note: You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called *after* the class is fully initialized. 
""" def __iter__(self) -> TupleGenerator: """So `dict(model)` works.""" yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')] extra = self.__pydantic_extra__ if extra: yield from extra.items() def __repr__(self) -> str: return f'{self.__repr_name__()}({self.__repr_str__(", ")})' def __repr_args__(self) -> _repr.ReprArgs: # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__` # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration. computed_fields_repr_args = [ (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr ] for k, v in self.__dict__.items(): field = self.__pydantic_fields__.get(k) if field and field.repr: if v is not self: yield k, v else: yield k, self.__repr_recursion__(v) # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized. # This can happen if a `ValidationError` is raised during initialization and the instance's # repr is generated as part of the exception handling. Therefore, we use `getattr` here # with a fallback, even though the type hints indicate the attribute will always be present. try: pydantic_extra = object.__getattribute__(self, '__pydantic_extra__') except AttributeError: pydantic_extra = None if pydantic_extra is not None: yield from ((k, v) for k, v in pydantic_extra.items()) yield from computed_fields_repr_args # take logic from `_repr.Representation` without the side effects of inheritance, see #5740 __repr_name__ = _repr.Representation.__repr_name__ __repr_recursion__ = _repr.Representation.__repr_recursion__ __repr_str__ = _repr.Representation.__repr_str__ __pretty__ = _repr.Representation.__pretty__ __rich_repr__ = _repr.Representation.__rich_repr__ def __str__(self) -> str: return self.__repr_str__(' ') # ##### Deprecated methods from v1 ##### @property @typing_extensions.deprecated( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', category=None ) def __fields__(self) -> dict[str, FieldInfo]: warnings.warn( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return getattr(type(self), '__pydantic_fields__', {}) @property @typing_extensions.deprecated( 'The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.', category=None, ) def __fields_set__(self) -> set[str]: warnings.warn( 'The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return self.__pydantic_fields_set__ @typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None) def dict( # noqa: D102 self, *, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, ) -> Dict[str, Any]: # noqa UP006 warnings.warn( 'The `dict` method is deprecated; use `model_dump` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return self.model_dump( include=include, exclude=exclude, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) @typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None) def json( # noqa: D102 self, *, include: IncEx 
| None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, # type: ignore[assignment] models_as_dict: bool = PydanticUndefined, # type: ignore[assignment] **dumps_kwargs: Any, ) -> str: warnings.warn( 'The `json` method is deprecated; use `model_dump_json` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) if encoder is not PydanticUndefined: raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.') if models_as_dict is not PydanticUndefined: raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.') if dumps_kwargs: raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.') return self.model_dump_json( include=include, exclude=exclude, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) @classmethod @typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None) def parse_obj(cls, obj: Any) -> Self: # noqa: D102 warnings.warn( 'The `parse_obj` method is deprecated; use `model_validate` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_validate(obj) @classmethod @typing_extensions.deprecated( 'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, ' 'otherwise load the data then use `model_validate` instead.', category=None, ) def parse_raw( # noqa: D102 cls, b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False, ) -> Self: # pragma: no cover warnings.warn( 'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, ' 'otherwise load the data then use `model_validate` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import parse try: obj = parse.load_str_bytes( b, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, ) except (ValueError, TypeError) as exc: import json # try to match V1 if isinstance(exc, UnicodeDecodeError): type_str = 'value_error.unicodedecode' elif isinstance(exc, json.JSONDecodeError): type_str = 'value_error.jsondecode' elif isinstance(exc, ValueError): type_str = 'value_error' else: type_str = 'type_error' # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same error: pydantic_core.InitErrorDetails = { # The type: ignore on the next line is to ignore the requirement of LiteralString 'type': pydantic_core.PydanticCustomError(type_str, str(exc)), # type: ignore 'loc': ('__root__',), 'input': b, } raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error]) return cls.model_validate(obj) @classmethod @typing_extensions.deprecated( 'The `parse_file` method is deprecated; load the data from file, then if your data is JSON ' 'use `model_validate_json`, otherwise `model_validate` instead.', category=None, ) def parse_file( # noqa: D102 cls, path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False, ) -> Self: warnings.warn( 'The `parse_file` method is deprecated; load the data from file, then if your data is JSON ' 'use `model_validate_json`, otherwise `model_validate` instead.', 
category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import parse obj = parse.load_file( path, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, ) return cls.parse_obj(obj) @classmethod @typing_extensions.deprecated( 'The `from_orm` method is deprecated; set ' "`model_config['from_attributes']=True` and use `model_validate` instead.", category=None, ) def from_orm(cls, obj: Any) -> Self: # noqa: D102 warnings.warn( 'The `from_orm` method is deprecated; set ' "`model_config['from_attributes']=True` and use `model_validate` instead.", category=PydanticDeprecatedSince20, stacklevel=2, ) if not cls.model_config.get('from_attributes', None): raise PydanticUserError( 'You must set the config attribute `from_attributes=True` to use from_orm', code=None ) return cls.model_validate(obj) @classmethod @typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None) def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: D102 warnings.warn( 'The `construct` method is deprecated; use `model_construct` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_construct(_fields_set=_fields_set, **values) @typing_extensions.deprecated( 'The `copy` method is deprecated; use `model_copy` instead. ' 'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.', category=None, ) def copy( self, *, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, # noqa UP006 deep: bool = False, ) -> Self: # pragma: no cover """Returns a copy of the model. !!! warning "Deprecated" This method is now deprecated; use `model_copy` instead. If you need `include` or `exclude`, use: ```python {test="skip" lint="skip"} data = self.model_dump(include=include, exclude=exclude, round_trip=True) data = {**data, **(update or {})} copied = self.model_validate(data) ``` Args: include: Optional set or mapping specifying which fields to include in the copied model. exclude: Optional set or mapping specifying which fields to exclude in the copied model. update: Optional dictionary of field-value pairs to override field values in the copied model. deep: If True, the values of fields that are Pydantic models will be deep-copied. Returns: A copy of the model with included, excluded and updated fields as specified. """ warnings.warn( 'The `copy` method is deprecated; use `model_copy` instead. 
' 'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals values = dict( copy_internals._iter( self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False ), **(update or {}), ) if self.__pydantic_private__ is None: private = None else: private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined} if self.__pydantic_extra__ is None: extra: dict[str, Any] | None = None else: extra = self.__pydantic_extra__.copy() for k in list(self.__pydantic_extra__): if k not in values: # k was in the exclude extra.pop(k) for k in list(values): if k in self.__pydantic_extra__: # k must have come from extra extra[k] = values.pop(k) # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg if update: fields_set = self.__pydantic_fields_set__ | update.keys() else: fields_set = set(self.__pydantic_fields_set__) # removing excluded fields from `__pydantic_fields_set__` if exclude: fields_set -= set(exclude) return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep) @classmethod @typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None) def schema( # noqa: D102 cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE ) -> Dict[str, Any]: # noqa UP006 warnings.warn( 'The `schema` method is deprecated; use `model_json_schema` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template) @classmethod @typing_extensions.deprecated( 'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.', category=None, ) def schema_json( # noqa: D102 cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any ) -> str: # pragma: no cover warnings.warn( 'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) import json from .deprecated.json import pydantic_encoder return json.dumps( cls.model_json_schema(by_alias=by_alias, ref_template=ref_template), default=pydantic_encoder, **dumps_kwargs, ) @classmethod @typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None) def validate(cls, value: Any) -> Self: # noqa: D102 warnings.warn( 'The `validate` method is deprecated; use `model_validate` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_validate(value) @classmethod @typing_extensions.deprecated( 'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.', category=None, ) def update_forward_refs(cls, **localns: Any) -> None: # noqa: D102 warnings.warn( 'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) if localns: # pragma: no cover raise TypeError('`localns` arguments are not longer accepted.') cls.model_rebuild(force=True) @typing_extensions.deprecated( 'The private method `_iter` will be removed and should no longer be used.', category=None ) def _iter(self, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_iter` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return 
copy_internals._iter(self, *args, **kwargs) @typing_extensions.deprecated( 'The private method `_copy_and_set_values` will be removed and should no longer be used.', category=None, ) def _copy_and_set_values(self, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_copy_and_set_values` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return copy_internals._copy_and_set_values(self, *args, **kwargs) @classmethod @typing_extensions.deprecated( 'The private method `_get_value` will be removed and should no longer be used.', category=None, ) def _get_value(cls, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_get_value` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return copy_internals._get_value(cls, *args, **kwargs) @typing_extensions.deprecated( 'The private method `_calculate_keys` will be removed and should no longer be used.', category=None, ) def _calculate_keys(self, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_calculate_keys` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return copy_internals._calculate_keys(self, *args, **kwargs) ```` ### __init__ ```python __init__(**data: Any) -> None ``` Raises ValidationError if the input data cannot be validated to form a valid model. `self` is explicitly positional-only to allow `self` as a field name. Source code in `pydantic/main.py` ```python def __init__(self, /, **data: Any) -> None: """Create a new model by parsing and validating input data from keyword arguments. Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be validated to form a valid model. `self` is explicitly positional-only to allow `self` as a field name. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self) if self is not validated_self: warnings.warn( 'A custom validator is returning a value other than `self`.\n' "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n" 'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.', stacklevel=2, ) ``` ### model_config ```python model_config: ConfigDict = ConfigDict() ``` Configuration for the model, should be a dictionary conforming to ConfigDict. ### model_fields ```python model_fields() -> dict[str, FieldInfo] ``` A mapping of field names to their respective FieldInfo instances. Warning Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class. Source code in `pydantic/main.py` ```python @_utils.deprecated_instance_property @classmethod def model_fields(cls) -> dict[str, FieldInfo]: """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances. !!! warning Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class. 
""" return getattr(cls, '__pydantic_fields__', {}) ``` ### model_computed_fields ```python model_computed_fields() -> dict[str, ComputedFieldInfo] ``` A mapping of computed field names to their respective ComputedFieldInfo instances. Warning Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class. Source code in `pydantic/main.py` ```python @_utils.deprecated_instance_property @classmethod def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]: """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances. !!! warning Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class. """ return getattr(cls, '__pydantic_computed_fields__', {}) ``` ### __pydantic_core_schema__ ```python __pydantic_core_schema__: CoreSchema ``` The core schema of the model. ### model_extra ```python model_extra: dict[str, Any] | None ``` Get extra fields set during validation. Returns: | Type | Description | | --- | --- | | `dict[str, Any] | None` | A dictionary of extra fields, or None if config.extra is not set to "allow". | ### model_fields_set ```python model_fields_set: set[str] ``` Returns the set of fields that have been explicitly set on this model instance. Returns: | Type | Description | | --- | --- | | `set[str]` | A set of strings representing the fields that have been set, i.e. that were not filled from defaults. | ### model_construct ```python model_construct( _fields_set: set[str] | None = None, **values: Any ) -> Self ``` Creates a new instance of the `Model` class with validated data. Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data. Default values are respected, but no other validation is performed. Note `model_construct()` generally respects the `model_config.extra` setting on the provided model. That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__` and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored. Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in an error if extra values are passed, but they will be ignored. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `_fields_set` | `set[str] | None` | A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used. | `None` | | `values` | `Any` | Trusted or pre-validated data dictionary. | `{}` | Returns: | Type | Description | | --- | --- | | `Self` | A new instance of the Model class with validated data. | Source code in `pydantic/main.py` ```python @classmethod def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: C901 """Creates a new instance of the `Model` class with validated data. Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data. Default values are respected, but no other validation is performed. !!! note `model_construct()` generally respects the `model_config.extra` setting on the provided model. 
That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__` and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored. Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in an error if extra values are passed, but they will be ignored. Args: _fields_set: A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute. Otherwise, the field names from the `values` argument will be used. values: Trusted or pre-validated data dictionary. Returns: A new instance of the `Model` class with validated data. """ m = cls.__new__(cls) fields_values: dict[str, Any] = {} fields_set = set() for name, field in cls.__pydantic_fields__.items(): if field.alias is not None and field.alias in values: fields_values[name] = values.pop(field.alias) fields_set.add(name) if (name not in fields_set) and (field.validation_alias is not None): validation_aliases: list[str | AliasPath] = ( field.validation_alias.choices if isinstance(field.validation_alias, AliasChoices) else [field.validation_alias] ) for alias in validation_aliases: if isinstance(alias, str) and alias in values: fields_values[name] = values.pop(alias) fields_set.add(name) break elif isinstance(alias, AliasPath): value = alias.search_dict_for_path(values) if value is not PydanticUndefined: fields_values[name] = value fields_set.add(name) break if name not in fields_set: if name in values: fields_values[name] = values.pop(name) fields_set.add(name) elif not field.is_required(): fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values) if _fields_set is None: _fields_set = fields_set _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None _object_setattr(m, '__dict__', fields_values) _object_setattr(m, '__pydantic_fields_set__', _fields_set) if not cls.__pydantic_root_model__: _object_setattr(m, '__pydantic_extra__', _extra) if cls.__pydantic_post_init__: m.model_post_init(None) # update private attributes with values set if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None: for k, v in values.items(): if k in m.__private_attributes__: m.__pydantic_private__[k] = v elif not cls.__pydantic_root_model__: # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist # Since it doesn't, that means that `__pydantic_private__` should be set to None _object_setattr(m, '__pydantic_private__', None) return m ``` ### model_copy ```python model_copy( *, update: Mapping[str, Any] | None = None, deep: bool = False ) -> Self ``` Usage Documentation [`model_copy`](../../concepts/serialization/#model_copy) Returns a copy of the model. Note The underlying instance's __dict__ attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of cached properties). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `update` | `Mapping[str, Any] | None` | Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data. | `None` | | `deep` | `bool` | Set to True to make a deep copy of the model. 
| `False` | Returns: | Type | Description | | --- | --- | | `Self` | New model instance. | Source code in `pydantic/main.py` ```python def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self: """!!! abstract "Usage Documentation" [`model_copy`](../concepts/serialization.md#model_copy) Returns a copy of the model. !!! note The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of [cached properties][functools.cached_property]). Args: update: Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data. deep: Set to `True` to make a deep copy of the model. Returns: New model instance. """ copied = self.__deepcopy__() if deep else self.__copy__() if update: if self.model_config.get('extra') == 'allow': for k, v in update.items(): if k in self.__pydantic_fields__: copied.__dict__[k] = v else: if copied.__pydantic_extra__ is None: copied.__pydantic_extra__ = {} copied.__pydantic_extra__[k] = v else: copied.__dict__.update(update) copied.__pydantic_fields_set__.update(update.keys()) return copied ``` ### model_dump ```python model_dump( *, mode: Literal["json", "python"] | str = "python", include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False ) -> dict[str, Any] ``` Usage Documentation [`model_dump`](../../concepts/serialization/#modelmodel_dump) Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `mode` | `Literal['json', 'python'] | str` | The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects. | `'python'` | | `include` | `IncEx | None` | A set of fields to include in the output. | `None` | | `exclude` | `IncEx | None` | A set of fields to exclude from the output. | `None` | | `context` | `Any | None` | Additional context to pass to the serializer. | `None` | | `by_alias` | `bool | None` | Whether to use the field's alias in the dictionary key if defined. | `None` | | `exclude_unset` | `bool` | Whether to exclude fields that have not been explicitly set. | `False` | | `exclude_defaults` | `bool` | Whether to exclude fields that are set to their default value. | `False` | | `exclude_none` | `bool` | Whether to exclude fields that have a value of None. | `False` | | `round_trip` | `bool` | If True, dumped values should be valid as input for non-idempotent types such as Json[T]. | `False` | | `warnings` | `bool | Literal['none', 'warn', 'error']` | How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | `True` | | `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised. | `None` | | `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. 
| `False` | Returns: | Type | Description | | --- | --- | | `dict[str, Any]` | A dictionary representation of the model. | Source code in `pydantic/main.py` ```python def model_dump( self, *, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> dict[str, Any]: """!!! abstract "Usage Documentation" [`model_dump`](../concepts/serialization.md#modelmodel_dump) Generate a dictionary representation of the model, optionally specifying which fields to include or exclude. Args: mode: The mode in which `to_python` should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects. include: A set of fields to include in the output. exclude: A set of fields to exclude from the output. context: Additional context to pass to the serializer. by_alias: Whether to use the field's alias in the dictionary key if defined. exclude_unset: Whether to exclude fields that have not been explicitly set. exclude_defaults: Whether to exclude fields that are set to their default value. exclude_none: Whether to exclude fields that have a value of `None`. round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T]. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. fallback: A function to call when an unknown value is encountered. If not provided, a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. Returns: A dictionary representation of the model. """ return self.__pydantic_serializer__.to_python( self, mode=mode, by_alias=by_alias, include=include, exclude=exclude, context=context, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ) ``` ### model_dump_json ```python model_dump_json( *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False ) -> str ``` Usage Documentation [`model_dump_json`](../../concepts/serialization/#modelmodel_dump_json) Generates a JSON representation of the model using Pydantic's `to_json` method. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `indent` | `int | None` | Indentation to use in the JSON output. If None is passed, the output will be compact. | `None` | | `include` | `IncEx | None` | Field(s) to include in the JSON output. | `None` | | `exclude` | `IncEx | None` | Field(s) to exclude from the JSON output. | `None` | | `context` | `Any | None` | Additional context to pass to the serializer. 
| `None` | | `by_alias` | `bool | None` | Whether to serialize using field aliases. | `None` | | `exclude_unset` | `bool` | Whether to exclude fields that have not been explicitly set. | `False` | | `exclude_defaults` | `bool` | Whether to exclude fields that are set to their default value. | `False` | | `exclude_none` | `bool` | Whether to exclude fields that have a value of None. | `False` | | `round_trip` | `bool` | If True, dumped values should be valid as input for non-idempotent types such as Json[T]. | `False` | | `warnings` | `bool | Literal['none', 'warn', 'error']` | How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | `True` | | `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised. | `None` | | `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` | Returns: | Type | Description | | --- | --- | | `str` | A JSON string representation of the model. | Source code in `pydantic/main.py` ```python def model_dump_json( self, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> str: """!!! abstract "Usage Documentation" [`model_dump_json`](../concepts/serialization.md#modelmodel_dump_json) Generates a JSON representation of the model using Pydantic's `to_json` method. Args: indent: Indentation to use in the JSON output. If None is passed, the output will be compact. include: Field(s) to include in the JSON output. exclude: Field(s) to exclude from the JSON output. context: Additional context to pass to the serializer. by_alias: Whether to serialize using field aliases. exclude_unset: Whether to exclude fields that have not been explicitly set. exclude_defaults: Whether to exclude fields that are set to their default value. exclude_none: Whether to exclude fields that have a value of `None`. round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T]. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. fallback: A function to call when an unknown value is encountered. If not provided, a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. Returns: A JSON string representation of the model. 
""" return self.__pydantic_serializer__.to_json( self, indent=indent, include=include, exclude=exclude, context=context, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ).decode() ``` ### model_json_schema ```python model_json_schema( by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[ GenerateJsonSchema ] = GenerateJsonSchema, mode: JsonSchemaMode = "validation", ) -> dict[str, Any] ``` Generates a JSON schema for a model class. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `by_alias` | `bool` | Whether to use attribute aliases or not. | `True` | | `ref_template` | `str` | The reference template. | `DEFAULT_REF_TEMPLATE` | | `schema_generator` | `type[GenerateJsonSchema]` | To override the logic used to generate the JSON schema, as a subclass of GenerateJsonSchema with your desired modifications | `GenerateJsonSchema` | | `mode` | `JsonSchemaMode` | The mode in which to generate the schema. | `'validation'` | Returns: | Type | Description | | --- | --- | | `dict[str, Any]` | The JSON schema for the given model class. | Source code in `pydantic/main.py` ```python @classmethod def model_json_schema( cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Generates a JSON schema for a model class. Args: by_alias: Whether to use attribute aliases or not. ref_template: The reference template. schema_generator: To override the logic used to generate the JSON schema, as a subclass of `GenerateJsonSchema` with your desired modifications mode: The mode in which to generate the schema. Returns: The JSON schema for the given model class. """ return model_json_schema( cls, by_alias=by_alias, ref_template=ref_template, schema_generator=schema_generator, mode=mode ) ``` ### model_parametrized_name ```python model_parametrized_name( params: tuple[type[Any], ...] ) -> str ``` Compute the class name for parametrizations of generic classes. This method can be overridden to achieve a custom naming scheme for generic BaseModels. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `params` | `tuple[type[Any], ...]` | Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params. | *required* | Returns: | Type | Description | | --- | --- | | `str` | String representing the new class where params are passed to cls as type variables. | Raises: | Type | Description | | --- | --- | | `TypeError` | Raised when trying to generate concrete names for non-generic models. | Source code in `pydantic/main.py` ```python @classmethod def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str: """Compute the class name for parametrizations of generic classes. This method can be overridden to achieve a custom naming scheme for generic BaseModels. Args: params: Tuple of types of the class. Given a generic class `Model` with 2 type variables and a concrete model `Model[str, int]`, the value `(str, int)` would be passed to `params`. Returns: String representing the new class where `params` are passed to `cls` as type variables. Raises: TypeError: Raised when trying to generate concrete names for non-generic models. 
""" if not issubclass(cls, typing.Generic): raise TypeError('Concrete names should only be generated for generic models.') # Any strings received should represent forward references, so we handle them specially below. # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future, # we may be able to remove this special case. param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params] params_component = ', '.join(param_names) return f'{cls.__name__}[{params_component}]' ``` ### model_post_init ```python model_post_init(context: Any) -> None ``` Override this method to perform additional initialization after `__init__` and `model_construct`. This is useful if you want to do some validation that requires the entire model to be initialized. Source code in `pydantic/main.py` ```python def model_post_init(self, context: Any, /) -> None: """Override this method to perform additional initialization after `__init__` and `model_construct`. This is useful if you want to do some validation that requires the entire model to be initialized. """ pass ``` ### model_rebuild ```python model_rebuild( *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None ) -> bool | None ``` Try to rebuild the pydantic-core schema for the model. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `force` | `bool` | Whether to force the rebuilding of the model schema, defaults to False. | `False` | | `raise_errors` | `bool` | Whether to raise errors, defaults to True. | `True` | | `_parent_namespace_depth` | `int` | The depth level of the parent namespace, defaults to 2. | `2` | | `_types_namespace` | `MappingNamespace | None` | The types namespace, defaults to None. | `None` | Returns: | Type | Description | | --- | --- | | `bool | None` | Returns None if the schema is already "complete" and rebuilding was not required. | | `bool | None` | If rebuilding was required, returns True if rebuilding was successful, otherwise False. | Source code in `pydantic/main.py` ```python @classmethod def model_rebuild( cls, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the model. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Args: force: Whether to force the rebuilding of the model schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: The depth level of the parent namespace, defaults to 2. _types_namespace: The types namespace, defaults to `None`. Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. """ if not force and cls.__pydantic_complete__: return None for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'): if attr in cls.__dict__: # Deleting the validator/serializer is necessary as otherwise they can get reused in # pydantic-core. 
Same applies for the core schema that can be reused in schema generation. delattr(cls, attr) cls.__pydantic_complete__ = False if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {} ns_resolver = _namespace_utils.NsResolver( parent_namespace={**rebuild_ns, **parent_ns}, ) if not cls.__pydantic_fields_complete__: typevars_map = _generics.get_model_typevars_map(cls) try: cls.__pydantic_fields__ = _fields.rebuild_model_fields( cls, ns_resolver=ns_resolver, typevars_map=typevars_map, ) except NameError as e: exc = PydanticUndefinedAnnotation.from_name_error(e) _mock_val_ser.set_model_mocks(cls, f'`{exc.name}`') if raise_errors: raise exc from e if not raise_errors and not cls.__pydantic_fields_complete__: # No need to continue with schema gen, it is guaranteed to fail return False assert cls.__pydantic_fields_complete__ return _model_construction.complete_model_class( cls, _config.ConfigWrapper(cls.model_config, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, ) ``` ### model_validate ```python model_validate( obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None ) -> Self ``` Validate a pydantic model instance. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `obj` | `Any` | The object to validate. | *required* | | `strict` | `bool | None` | Whether to enforce types strictly. | `None` | | `from_attributes` | `bool | None` | Whether to extract data from object attributes. | `None` | | `context` | `Any | None` | Additional context to pass to the validator. | `None` | | `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` | | `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` | Raises: | Type | Description | | --- | --- | | `ValidationError` | If the object could not be validated. | Returns: | Type | Description | | --- | --- | | `Self` | The validated model instance. | Source code in `pydantic/main.py` ```python @classmethod def model_validate( cls, obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self: """Validate a pydantic model instance. Args: obj: The object to validate. strict: Whether to enforce types strictly. from_attributes: Whether to extract data from object attributes. context: Additional context to pass to the validator. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Raises: ValidationError: If the object could not be validated. Returns: The validated model instance. 
""" # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_python( obj, strict=strict, from_attributes=from_attributes, context=context, by_alias=by_alias, by_name=by_name ) ``` ### model_validate_json ```python model_validate_json( json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None ) -> Self ``` Usage Documentation [JSON Parsing](../../concepts/json/#json-parsing) Validate the given JSON data against the Pydantic model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `json_data` | `str | bytes | bytearray` | The JSON data to validate. | *required* | | `strict` | `bool | None` | Whether to enforce types strictly. | `None` | | `context` | `Any | None` | Extra variables to pass to the validator. | `None` | | `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` | | `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` | Returns: | Type | Description | | --- | --- | | `Self` | The validated Pydantic model. | Raises: | Type | Description | | --- | --- | | `ValidationError` | If json_data is not a JSON string or the object could not be validated. | Source code in `pydantic/main.py` ```python @classmethod def model_validate_json( cls, json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self: """!!! abstract "Usage Documentation" [JSON Parsing](../concepts/json.md#json-parsing) Validate the given JSON data against the Pydantic model. Args: json_data: The JSON data to validate. strict: Whether to enforce types strictly. context: Extra variables to pass to the validator. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Returns: The validated Pydantic model. Raises: ValidationError: If `json_data` is not a JSON string or the object could not be validated. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_json( json_data, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) ``` ### model_validate_strings ```python model_validate_strings( obj: Any, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None ) -> Self ``` Validate the given object with string data against the Pydantic model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `obj` | `Any` | The object containing string data to validate. | *required* | | `strict` | `bool | None` | Whether to enforce types strictly. | `None` | | `context` | `Any | None` | Extra variables to pass to the validator. 
| `None` | | `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` | | `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` | Returns: | Type | Description | | --- | --- | | `Self` | The validated Pydantic model. | Source code in `pydantic/main.py` ```python @classmethod def model_validate_strings( cls, obj: Any, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self: """Validate the given object with string data against the Pydantic model. Args: obj: The object containing string data to validate. strict: Whether to enforce types strictly. context: Extra variables to pass to the validator. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Returns: The validated Pydantic model. """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_strings( obj, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) ``` ## pydantic.create_model ```python create_model( model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: None = None, __module__: str = __name__, __validators__: ( dict[str, Callable[..., Any]] | None ) = None, __cls_kwargs__: dict[str, Any] | None = None, **field_definitions: Any | tuple[str, Any], ) -> type[BaseModel] ``` ```python create_model( model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: type[ModelT] | tuple[type[ModelT], ...], __module__: str = __name__, __validators__: ( dict[str, Callable[..., Any]] | None ) = None, __cls_kwargs__: dict[str, Any] | None = None, **field_definitions: Any | tuple[str, Any], ) -> type[ModelT] ``` ```python create_model( model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: ( type[ModelT] | tuple[type[ModelT], ...] | None ) = None, __module__: str | None = None, __validators__: ( dict[str, Callable[..., Any]] | None ) = None, __cls_kwargs__: dict[str, Any] | None = None, **field_definitions: Any | tuple[str, Any], ) -> type[ModelT] ``` Usage Documentation [Dynamic Model Creation](../../concepts/models/#dynamic-model-creation) Dynamically creates and returns a new Pydantic model, in other words, `create_model` dynamically creates a subclass of BaseModel. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `model_name` | `str` | The name of the newly created model. | *required* | | `__config__` | `ConfigDict | None` | The configuration of the new model. | `None` | | `__doc__` | `str | None` | The docstring of the new model. | `None` | | `__base__` | `type[ModelT] | tuple[type[ModelT], ...] | None` | The base class or classes for the new model. | `None` | | `__module__` | `str | None` | The name of the module that the model belongs to; if None, the value is taken from sys.\_getframe(1) | `None` | | `__validators__` | `dict[str, Callable[..., Any]] | None` | A dictionary of methods that validate fields. 
The keys are the names of the validation methods to be added to the model, and the values are the validation methods themselves. You can read more about functional validators here. | `None` | | `__cls_kwargs__` | `dict[str, Any] | None` | A dictionary of keyword arguments for class creation, such as metaclass. | `None` | | `**field_definitions` | `Any | tuple[str, Any]` | Field definitions of the new model. Either: a single element, representing the type annotation of the field. a two-tuple, the first element being the type and the second element the assigned value (either a default or the Field() function). | `{}` | Returns: | Type | Description | | --- | --- | | `type[ModelT]` | The new model. | Raises: | Type | Description | | --- | --- | | `PydanticUserError` | If __base__ and __config__ are both passed. | Source code in `pydantic/main.py` ```python def create_model( # noqa: C901 model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: type[ModelT] | tuple[type[ModelT], ...] | None = None, __module__: str | None = None, __validators__: dict[str, Callable[..., Any]] | None = None, __cls_kwargs__: dict[str, Any] | None = None, # TODO PEP 747: replace `Any` by the TypeForm: **field_definitions: Any | tuple[str, Any], ) -> type[ModelT]: """!!! abstract "Usage Documentation" [Dynamic Model Creation](../concepts/models.md#dynamic-model-creation) Dynamically creates and returns a new Pydantic model, in other words, `create_model` dynamically creates a subclass of [`BaseModel`][pydantic.BaseModel]. Args: model_name: The name of the newly created model. __config__: The configuration of the new model. __doc__: The docstring of the new model. __base__: The base class or classes for the new model. __module__: The name of the module that the model belongs to; if `None`, the value is taken from `sys._getframe(1)` __validators__: A dictionary of methods that validate fields. The keys are the names of the validation methods to be added to the model, and the values are the validation methods themselves. You can read more about functional validators [here](https://docs.pydantic.dev/2.9/concepts/validators/#field-validators). __cls_kwargs__: A dictionary of keyword arguments for class creation, such as `metaclass`. **field_definitions: Field definitions of the new model. Either: - a single element, representing the type annotation of the field. - a two-tuple, the first element being the type and the second element the assigned value (either a default or the [`Field()`][pydantic.Field] function). Returns: The new [model][pydantic.BaseModel]. Raises: PydanticUserError: If `__base__` and `__config__` are both passed. 
""" if __base__ is not None: if __config__ is not None: raise PydanticUserError( 'to avoid confusion `__config__` and `__base__` cannot be used together', code='create-model-config-base', ) if not isinstance(__base__, tuple): __base__ = (__base__,) else: __base__ = (cast('type[ModelT]', BaseModel),) __cls_kwargs__ = __cls_kwargs__ or {} fields: dict[str, Any] = {} annotations: dict[str, Any] = {} for f_name, f_def in field_definitions.items(): if isinstance(f_def, tuple): if len(f_def) != 2: raise PydanticUserError( f'Field definition for {f_name!r} should a single element representing the type or a two-tuple, the first element ' 'being the type and the second element the assigned value (either a default or the `Field()` function).', code='create-model-field-definitions', ) annotations[f_name] = f_def[0] fields[f_name] = f_def[1] else: annotations[f_name] = f_def if __module__ is None: f = sys._getframe(1) __module__ = f.f_globals['__name__'] namespace: dict[str, Any] = {'__annotations__': annotations, '__module__': __module__} if __doc__: namespace.update({'__doc__': __doc__}) if __validators__: namespace.update(__validators__) namespace.update(fields) if __config__: namespace['model_config'] = _config.ConfigWrapper(__config__).config_dict resolved_bases = types.resolve_bases(__base__) meta, ns, kwds = types.prepare_class(model_name, resolved_bases, kwds=__cls_kwargs__) if resolved_bases is not __base__: ns['__orig_bases__'] = __base__ namespace.update(ns) return meta( model_name, resolved_bases, namespace, __pydantic_reset_parent_namespace__=False, _create_model_module=__module__, **kwds, ) ``` Configuration for Pydantic models. ## ConfigDict Bases: `TypedDict` A TypedDict for configuring Pydantic behaviour. ### title ```python title: str | None ``` The title for the generated JSON schema, defaults to the model's name ### model_title_generator ```python model_title_generator: Callable[[type], str] | None ``` A callable that takes a model class and returns the title for it. Defaults to `None`. ### field_title_generator ```python field_title_generator: ( Callable[[str, FieldInfo | ComputedFieldInfo], str] | None ) ``` A callable that takes a field's name and info and returns title for it. Defaults to `None`. ### str_to_lower ```python str_to_lower: bool ``` Whether to convert all characters to lowercase for str types. Defaults to `False`. ### str_to_upper ```python str_to_upper: bool ``` Whether to convert all characters to uppercase for str types. Defaults to `False`. ### str_strip_whitespace ```python str_strip_whitespace: bool ``` Whether to strip leading and trailing whitespace for str types. ### str_min_length ```python str_min_length: int ``` The minimum length for str types. Defaults to `None`. ### str_max_length ```python str_max_length: int | None ``` The maximum length for str types. Defaults to `None`. ### extra ```python extra: ExtraValues | None ``` Whether to ignore, allow, or forbid extra data during model initialization. Defaults to `'ignore'`. Three configuration values are available: - `'ignore'`: Providing extra data is ignored (the default): ```python from pydantic import BaseModel, ConfigDict class User(BaseModel): model_config = ConfigDict(extra='ignore') # (1)! name: str user = User(name='John Doe', age=20) # (2)! print(user) #> name='John Doe' ``` 1. This is the default behaviour. 1. The `age` argument is ignored. 
- `'forbid'`: Providing extra data is not permitted, and a ValidationError will be raised if this is the case: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): x: int model_config = ConfigDict(extra='forbid') try: Model(x=1, y='a') except ValidationError as exc: print(exc) """ 1 validation error for Model y Extra inputs are not permitted [type=extra_forbidden, input_value='a', input_type=str] """ ``` - `'allow'`: Providing extra data is allowed and stored in the `__pydantic_extra__` dictionary attribute: ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): x: int model_config = ConfigDict(extra='allow') m = Model(x=1, y='a') assert m.__pydantic_extra__ == {'y': 'a'} ``` By default, no validation will be applied to these extra items, but you can set a type for the values by overriding the type annotation for `__pydantic_extra__`: ```python from pydantic import BaseModel, ConfigDict, Field, ValidationError class Model(BaseModel): __pydantic_extra__: dict[str, int] = Field(init=False) # (1)! x: int model_config = ConfigDict(extra='allow') try: Model(x=1, y='a') except ValidationError as exc: print(exc) """ 1 validation error for Model y Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ m = Model(x=1, y='2') assert m.x == 1 assert m.y == 2 assert m.model_dump() == {'x': 1, 'y': 2} assert m.__pydantic_extra__ == {'y': 2} ``` 1. The `= Field(init=False)` does not have any effect at runtime, but prevents the `__pydantic_extra__` field from being included as a parameter to the model's `__init__` method by type checkers. ### frozen ```python frozen: bool ``` Whether models are faux-immutable, i.e. whether `__setattr__` is allowed, and also generates a `__hash__()` method for the model. This makes instances of the model potentially hashable if all the attributes are hashable. Defaults to `False`. Note On V1, the inverse of this setting was called `allow_mutation`, and was `True` by default. ### populate_by_name ```python populate_by_name: bool ``` Whether an aliased field may be populated by its name as given by the model attribute, as well as the alias. Defaults to `False`. Warning `populate_by_name` usage is not recommended in v2.11+ and will be deprecated in v3. Instead, you should use the validate_by_name configuration setting. When `validate_by_name=True` and `validate_by_alias=True`, this is strictly equivalent to the previous behavior of `populate_by_name=True`. In v2.11, we also introduced a validate_by_alias setting that provides more fine grained control for validation behavior. Here's how you might go about using the new settings to achieve the same behavior: ```python from pydantic import BaseModel, ConfigDict, Field class Model(BaseModel): model_config = ConfigDict(validate_by_name=True, validate_by_alias=True) my_field: str = Field(alias='my_alias') # (1)! m = Model(my_alias='foo') # (2)! print(m) #> my_field='foo' m = Model(my_field='foo') # (3)! print(m) #> my_field='foo' ``` 1. The field `'my_field'` has an alias `'my_alias'`. 1. The model is populated by the alias `'my_alias'`. 1. The model is populated by the attribute name `'my_field'`. ### use_enum_values ```python use_enum_values: bool ``` Whether to populate models with the `value` property of enums, rather than the raw enum. This may be useful if you want to serialize `model.model_dump()` later. Defaults to `False`.
Note If you have an `Optional[Enum]` value that you set a default for, you need to use `validate_default=True` for said Field to ensure that the `use_enum_values` flag takes effect on the default, as extracting an enum's value occurs during validation, not serialization. ```python from enum import Enum from typing import Optional from pydantic import BaseModel, ConfigDict, Field class SomeEnum(Enum): FOO = 'foo' BAR = 'bar' BAZ = 'baz' class SomeModel(BaseModel): model_config = ConfigDict(use_enum_values=True) some_enum: SomeEnum another_enum: Optional[SomeEnum] = Field( default=SomeEnum.FOO, validate_default=True ) model1 = SomeModel(some_enum=SomeEnum.BAR) print(model1.model_dump()) #> {'some_enum': 'bar', 'another_enum': 'foo'} model2 = SomeModel(some_enum=SomeEnum.BAR, another_enum=SomeEnum.BAZ) print(model2.model_dump()) #> {'some_enum': 'bar', 'another_enum': 'baz'} ``` ### validate_assignment ```python validate_assignment: bool ``` Whether to validate the data when the model is changed. Defaults to `False`. The default behavior of Pydantic is to validate the data when the model is created. In case the user changes the data after the model is created, the model is *not* revalidated. ```python from pydantic import BaseModel class User(BaseModel): name: str user = User(name='John Doe') # (1)! print(user) #> name='John Doe' user.name = 123 # (2)! print(user) #> name=123 ``` 1. The validation happens only when the model is created. 1. The validation does not happen when the data is changed. In case you want to revalidate the model when the data is changed, you can use `validate_assignment=True`: ```python from pydantic import BaseModel, ValidationError class User(BaseModel, validate_assignment=True): # (1)! name: str user = User(name='John Doe') # (2)! print(user) #> name='John Doe' try: user.name = 123 # (3)! except ValidationError as e: print(e) ''' 1 validation error for User name Input should be a valid string [type=string_type, input_value=123, input_type=int] ''' ``` 1. You can either use class keyword arguments, or `model_config` to set `validate_assignment=True`. 1. The validation happens when the model is created. 1. The validation *also* happens when the data is changed. ### arbitrary_types_allowed ```python arbitrary_types_allowed: bool ``` Whether arbitrary types are allowed for field types. Defaults to `False`.
```python from pydantic import BaseModel, ConfigDict, ValidationError # This is not a pydantic model, it's an arbitrary class class Pet: def __init__(self, name: str): self.name = name class Model(BaseModel): model_config = ConfigDict(arbitrary_types_allowed=True) pet: Pet owner: str pet = Pet(name='Hedwig') # A simple check of instance type is used to validate the data model = Model(owner='Harry', pet=pet) print(model) #> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry' print(model.pet) #> <__main__.Pet object at 0x0123456789ab> print(model.pet.name) #> Hedwig print(type(model.pet)) #> <class '__main__.Pet'> try: # If the value is not an instance of the type, it's invalid Model(owner='Harry', pet='Hedwig') except ValidationError as e: print(e) ''' 1 validation error for Model pet Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str] ''' # Nothing in the instance of the arbitrary type is checked # Here name probably should have been a str, but it's not validated pet2 = Pet(name=42) model2 = Model(owner='Harry', pet=pet2) print(model2) #> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry' print(model2.pet) #> <__main__.Pet object at 0x0123456789ab> print(model2.pet.name) #> 42 print(type(model2.pet)) #> <class '__main__.Pet'> ``` ### from_attributes ```python from_attributes: bool ``` Whether to build models and look up discriminators of tagged unions using python object attributes. ### loc_by_alias ```python loc_by_alias: bool ``` Whether to use the actual key provided in the data (e.g. alias) for error `loc`s rather than the field's name. Defaults to `True`. ### alias_generator ```python alias_generator: ( Callable[[str], str] | AliasGenerator | None ) ``` A callable that takes a field name and returns an alias for it or an instance of AliasGenerator. Defaults to `None`. When using a callable, the alias generator is used for both validation and serialization. If you want to use different alias generators for validation and serialization, you can use AliasGenerator instead. If data source field names do not match your code style (e.g. CamelCase fields), you can automatically generate aliases using `alias_generator`. Here's an example with a basic callable: ```python from pydantic import BaseModel, ConfigDict from pydantic.alias_generators import to_pascal class Voice(BaseModel): model_config = ConfigDict(alias_generator=to_pascal) name: str language_code: str voice = Voice(Name='Filiz', LanguageCode='tr-TR') print(voice.language_code) #> tr-TR print(voice.model_dump(by_alias=True)) #> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'} ``` If you want to use different alias generators for validation and serialization, you can use AliasGenerator. ```python from pydantic import AliasGenerator, BaseModel, ConfigDict from pydantic.alias_generators import to_camel, to_pascal class Athlete(BaseModel): first_name: str last_name: str sport: str model_config = ConfigDict( alias_generator=AliasGenerator( validation_alias=to_camel, serialization_alias=to_pascal, ) ) athlete = Athlete(firstName='John', lastName='Doe', sport='track') print(athlete.model_dump(by_alias=True)) #> {'FirstName': 'John', 'LastName': 'Doe', 'Sport': 'track'} ``` Note Pydantic offers three built-in alias generators: to_pascal, to_camel, and to_snake. ### ignored_types ```python ignored_types: tuple[type, ...] ``` A tuple of types that may occur as values of class attributes without annotations. This is typically used for custom descriptors (classes that behave like `property`).
If an attribute is set on a class without an annotation and has a type that is not in this tuple (or otherwise recognized by *pydantic*), an error will be raised. Defaults to `()`. ### allow_inf_nan ```python allow_inf_nan: bool ``` Whether to allow infinity (`+inf` and `-inf`) and NaN values for float and decimal fields. Defaults to `True`. ### json_schema_extra ```python json_schema_extra: JsonDict | JsonSchemaExtraCallable | None ``` A dict or callable to provide extra JSON schema properties. Defaults to `None`. ### json_encoders ```python json_encoders: dict[type[object], JsonEncoder] | None ``` A `dict` of custom JSON encoders for specific types. Defaults to `None`. Deprecated This config option is a carryover from v1. We originally planned to remove it in v2 but didn't have a 1:1 replacement so we are keeping it for now. It is still deprecated and will likely be removed in the future. ### strict ```python strict: bool ``` *(new in V2)* If `True`, strict validation is applied to all fields on the model. By default, Pydantic attempts to coerce values to the correct type, when possible. There are situations in which you may want to disable this behavior, and instead raise an error if a value's type does not match the field's type annotation. To configure strict mode for all fields on a model, you can set `strict=True` on the model. ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): model_config = ConfigDict(strict=True) name: str age: int ``` See [Strict Mode](../../concepts/strict_mode/) for more details. See the [Conversion Table](../../concepts/conversion_table/) for more details on how Pydantic converts data in both strict and lax modes. ### revalidate_instances ```python revalidate_instances: Literal[ "always", "never", "subclass-instances" ] ``` When and how to revalidate models and dataclasses during validation. Accepts the string values of `'never'`, `'always'` and `'subclass-instances'`. Defaults to `'never'`. - `'never'` will not revalidate models and dataclasses during validation - `'always'` will revalidate models and dataclasses during validation - `'subclass-instances'` will revalidate models and dataclasses during validation if the instance's class is a subclass of the model or dataclass By default, model and dataclass instances are not revalidated during validation. ```python from pydantic import BaseModel class User(BaseModel, revalidate_instances='never'): # (1)! hobbies: list[str] class SubUser(User): sins: list[str] class Transaction(BaseModel): user: User my_user = User(hobbies=['reading']) t = Transaction(user=my_user) print(t) #> user=User(hobbies=['reading']) my_user.hobbies = [1] # (2)! t = Transaction(user=my_user) # (3)! print(t) #> user=User(hobbies=[1]) my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) print(t) #> user=SubUser(hobbies=['scuba diving'], sins=['lying']) ``` 1. `revalidate_instances` is set to `'never'` by default. 1. The assignment is not validated, unless you set `validate_assignment` to `True` in the model's config. 1. Since `revalidate_instances` is set to `never`, this is not revalidated. If you want to revalidate instances during validation, you can set `revalidate_instances` to `'always'` in the model's config. ```python from pydantic import BaseModel, ValidationError class User(BaseModel, revalidate_instances='always'): # (1)!
hobbies: list[str] class SubUser(User): sins: list[str] class Transaction(BaseModel): user: User my_user = User(hobbies=['reading']) t = Transaction(user=my_user) print(t) #> user=User(hobbies=['reading']) my_user.hobbies = [1] try: t = Transaction(user=my_user) # (2)! except ValidationError as e: print(e) ''' 1 validation error for Transaction user.hobbies.0 Input should be a valid string [type=string_type, input_value=1, input_type=int] ''' my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) print(t) # (3)! #> user=User(hobbies=['scuba diving']) ``` 1. `revalidate_instances` is set to `'always'`. 1. The model is revalidated, since `revalidate_instances` is set to `'always'`. 1. Using `'never'` we would have gotten `user=SubUser(hobbies=['scuba diving'], sins=['lying'])`. It's also possible to set `revalidate_instances` to `'subclass-instances'` to only revalidate instances of subclasses of the model. ```python from pydantic import BaseModel class User(BaseModel, revalidate_instances='subclass-instances'): # (1)! hobbies: list[str] class SubUser(User): sins: list[str] class Transaction(BaseModel): user: User my_user = User(hobbies=['reading']) t = Transaction(user=my_user) print(t) #> user=User(hobbies=['reading']) my_user.hobbies = [1] t = Transaction(user=my_user) # (2)! print(t) #> user=User(hobbies=[1]) my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying']) t = Transaction(user=my_sub_user) print(t) # (3)! #> user=User(hobbies=['scuba diving']) ``` 1. `revalidate_instances` is set to `'subclass-instances'`. 1. This is not revalidated, since `my_user` is not an instance of a subclass of `User`. 1. Using `'never'` we would have gotten `user=SubUser(hobbies=['scuba diving'], sins=['lying'])`. ### ser_json_timedelta ```python ser_json_timedelta: Literal['iso8601', 'float'] ``` The format of JSON serialized timedeltas. Accepts the string values of `'iso8601'` and `'float'`. Defaults to `'iso8601'`. - `'iso8601'` will serialize timedeltas to ISO 8601 durations. - `'float'` will serialize timedeltas to the total number of seconds. ### ser_json_bytes ```python ser_json_bytes: Literal['utf8', 'base64', 'hex'] ``` The encoding of JSON serialized bytes. Defaults to `'utf8'`. Set equal to `val_json_bytes` to get back an equal value after serialization round trip. - `'utf8'` will serialize bytes to UTF-8 strings. - `'base64'` will serialize bytes to URL safe base64 strings. - `'hex'` will serialize bytes to hexadecimal strings. ### val_json_bytes ```python val_json_bytes: Literal['utf8', 'base64', 'hex'] ``` The encoding of JSON serialized bytes to decode. Defaults to `'utf8'`. Set equal to `ser_json_bytes` to get back an equal value after serialization round trip. - `'utf8'` will deserialize UTF-8 strings to bytes. - `'base64'` will deserialize URL safe base64 strings to bytes. - `'hex'` will deserialize hexadecimal strings to bytes. ### ser_json_inf_nan ```python ser_json_inf_nan: Literal['null', 'constants', 'strings'] ``` The encoding of JSON serialized infinity and NaN float values. Defaults to `'null'`. - `'null'` will serialize infinity and NaN values as `null`. - `'constants'` will serialize infinity and NaN values as `Infinity` and `NaN`. - `'strings'` will serialize infinity as string `"Infinity"` and NaN as string `"NaN"`. ### validate_default ```python validate_default: bool ``` Whether to validate default values during validation. Defaults to `False`.
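To make the effect concrete, here is a minimal sketch (our own, not part of the upstream reference): with the default `validate_default=False`, the bad default below would be accepted silently, whereas with `validate_default=True` it fails as soon as the default is actually used:

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(validate_default=True)

    # The default deliberately violates the annotation; without
    # `validate_default=True` it would be stored as-is.
    x: int = 'not an int'

try:
    Model()  # the default is validated when it is used
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    x
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='not an int', input_type=str]
    '''
```

The same behavior can also be enabled per field via `Field(validate_default=True)`.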
### validate_return ```python validate_return: bool ``` Whether to validate the return value from call validators. Defaults to `False`. ### protected_namespaces ```python protected_namespaces: tuple[str | Pattern[str], ...] ``` A `tuple` of strings and/or patterns that prevent models from having fields with names that conflict with them. For strings, we match on a prefix basis. For example, if 'dog' is in the protected namespace, 'dog_name' will be protected. For patterns, we match on the entire field name. For example, if `re.compile(r'^dog$')` is in the protected namespace, 'dog' will be protected, but 'dog_name' will not be. Defaults to `('model_validate', 'model_dump',)`. The reason we've selected these is to prevent collisions with other validation / dumping formats in the future, for example `model_validate_{some_newly_supported_format}`. Before v2.10, Pydantic used `('model_',)` as the default value for this setting to prevent collisions between model attributes and `BaseModel`'s own methods. This was changed in v2.10 given feedback that this restriction was limiting in AI and data science contexts, where it is common to have fields with names like `model_id`, `model_input`, `model_output`, etc. For more details, see https://github.com/pydantic/pydantic/issues/10315. ```python import warnings from pydantic import BaseModel warnings.filterwarnings('error') # Raise warnings as errors try: class Model(BaseModel): model_dump_something: str except UserWarning as e: print(e) ''' Field "model_dump_something" in Model has conflict with protected namespace "model_dump". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('model_validate',)`. ''' ``` You can customize this behavior using the `protected_namespaces` setting: ```python import re import warnings from pydantic import BaseModel, ConfigDict with warnings.catch_warnings(record=True) as caught_warnings: warnings.simplefilter('always') # Catch all warnings class Model(BaseModel): safe_field: str also_protect_field: str protect_this: str model_config = ConfigDict( protected_namespaces=( 'protect_me_', 'also_protect_', re.compile('^protect_this$'), ) ) for warning in caught_warnings: print(f'{warning.message}') ''' Field "also_protect_field" in Model has conflict with protected namespace "also_protect_". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', re.compile('^protect_this$'))`. Field "protect_this" in Model has conflict with protected namespace "re.compile('^protect_this$')". You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_', 'also_protect_')`. ''' ``` While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error *is* raised if there is an actual collision with an existing attribute: ```python from pydantic import BaseModel, ConfigDict try: class Model(BaseModel): model_validate: str model_config = ConfigDict(protected_namespaces=('model_',)) except NameError as e: print(e) ''' Field "model_validate" conflicts with member <bound method BaseModel.model_validate of <class 'pydantic.main.BaseModel'>> of protected namespace "model_". ''' ``` ### hide_input_in_errors ```python hide_input_in_errors: bool ``` Whether to hide inputs when printing errors. Defaults to `False`. Pydantic shows the input value and type when it raises `ValidationError` during validation.
```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: str try: Model(a=123) except ValidationError as e: print(e) ''' 1 validation error for Model a Input should be a valid string [type=string_type, input_value=123, input_type=int] ''' ``` You can hide the input value and type by setting the `hide_input_in_errors` config to `True`. ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): a: str model_config = ConfigDict(hide_input_in_errors=True) try: Model(a=123) except ValidationError as e: print(e) ''' 1 validation error for Model a Input should be a valid string [type=string_type] ''' ``` ### defer_build ```python defer_build: bool ``` Whether to defer model validator and serializer construction until the first model validation. Defaults to `False`. This can be useful to avoid the overhead of building models which are only used nested within other models, or when you want to manually define the types namespace via `Model.model_rebuild(_types_namespace=...)`. Since v2.10, this setting also applies to pydantic dataclasses and TypeAdapter instances. ### plugin_settings ```python plugin_settings: dict[str, object] | None ``` A `dict` of settings for plugins. Defaults to `None`. ### schema_generator ```python schema_generator: type[GenerateSchema] | None ``` Warning `schema_generator` is deprecated in v2.10. Prior to v2.10, this setting was advertised as highly subject to change. It's possible that this interface may once again become public once the internal core schema generation API is more stable, but that will likely come after significant performance improvements have been made. ### json_schema_serialization_defaults_required ```python json_schema_serialization_defaults_required: bool ``` Whether fields with default values should be marked as required in the serialization schema. Defaults to `False`. This ensures that the serialization schema will reflect the fact that a field with a default will always be present when serializing the model, even though it is not required for validation. However, there are scenarios where this may be undesirable; in particular, if you want to share the schema between validation and serialization, and don't mind fields with defaults being marked as not required during serialization. See [#7209](https://github.com/pydantic/pydantic/issues/7209) for more details. ```python from pydantic import BaseModel, ConfigDict class Model(BaseModel): a: str = 'a' model_config = ConfigDict(json_schema_serialization_defaults_required=True) print(Model.model_json_schema(mode='validation')) ''' { 'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}}, 'title': 'Model', 'type': 'object', } ''' print(Model.model_json_schema(mode='serialization')) ''' { 'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}}, 'required': ['a'], 'title': 'Model', 'type': 'object', } ''' ``` ### json_schema_mode_override ```python json_schema_mode_override: Literal[ "validation", "serialization", None ] ``` If not `None`, the specified mode will be used to generate the JSON schema regardless of what `mode` was passed to the function call. Defaults to `None`. This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema.
It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation and serialization that must both be referenced from the same schema; when this happens, we automatically append `-Input` to the definition reference for the validation schema and `-Output` to the definition reference for the serialization schema. Specifying a `json_schema_mode_override`, though, prevents the conflict between the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes from being added to the definition references. ```python from pydantic import BaseModel, ConfigDict, Json class Model(BaseModel): a: Json[int] # requires a string to validate, but will dump an int print(Model.model_json_schema(mode='serialization')) ''' { 'properties': {'a': {'title': 'A', 'type': 'integer'}}, 'required': ['a'], 'title': 'Model', 'type': 'object', } ''' class ForceInputModel(Model): # the following ensures that even with mode='serialization', we # will get the schema that would be generated for validation. model_config = ConfigDict(json_schema_mode_override='validation') print(ForceInputModel.model_json_schema(mode='serialization')) ''' { 'properties': { 'a': { 'contentMediaType': 'application/json', 'contentSchema': {'type': 'integer'}, 'title': 'A', 'type': 'string', } }, 'required': ['a'], 'title': 'ForceInputModel', 'type': 'object', } ''' ``` ### coerce_numbers_to_str ```python coerce_numbers_to_str: bool ``` If `True`, enables automatic coercion of any `Number` type to `str` in "lax" (non-strict) mode. Defaults to `False`. Pydantic doesn't allow number types (`int`, `float`, `Decimal`) to be coerced as type `str` by default. ```python from decimal import Decimal from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): value: str try: print(Model(value=42)) except ValidationError as e: print(e) ''' 1 validation error for Model value Input should be a valid string [type=string_type, input_value=42, input_type=int] ''' class Model(BaseModel): model_config = ConfigDict(coerce_numbers_to_str=True) value: str repr(Model(value=42).value) #> "42" repr(Model(value=42.13).value) #> "42.13" repr(Model(value=Decimal('42.13')).value) #> "42.13" ``` ### regex_engine ```python regex_engine: Literal['rust-regex', 'python-re'] ``` The regex engine to be used for pattern validation. Defaults to `'rust-regex'`. - `rust-regex` uses the [`regex`](https://docs.rs/regex) Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features. - `python-re` uses the [`re`](https://docs.python.org/3/library/re.html) module, which supports all regex features, but may be slower. Note If you use a compiled regex pattern, the python-re engine will be used regardless of this setting. This is so that flags such as `re.IGNORECASE` are respected.
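As an illustration of that note, here is a small sketch (our own, assuming a reasonably recent Pydantic v2 that accepts compiled patterns in `Field(pattern=...)`) where a compiled pattern routes a single field through `python-re` so that its flags are honored:

```python
import re

from pydantic import BaseModel, Field

class Model(BaseModel):
    # A compiled pattern is always evaluated with the `re` module,
    # so its flags (here IGNORECASE) are respected.
    value: str = Field(pattern=re.compile(r'^abc', re.IGNORECASE))

print(Model(value='ABCdef').value)
#> ABCdef
```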
```python from pydantic import BaseModel, ConfigDict, Field, ValidationError class Model(BaseModel): model_config = ConfigDict(regex_engine='python-re') value: str = Field(pattern=r'^abc(?=def)') print(Model(value='abcdef').value) #> abcdef try: print(Model(value='abxyzcdef')) except ValidationError as e: print(e) ''' 1 validation error for Model value String should match pattern '^abc(?=def)' [type=string_pattern_mismatch, input_value='abxyzcdef', input_type=str] ''' ``` ### validation_error_cause ```python validation_error_cause: bool ``` If `True`, Python exceptions that were part of a validation failure will be shown as an exception group as a cause. Can be useful for debugging. Defaults to `False`. Note Python 3.10 and older don't support exception groups natively. On Python <=3.10, the backport must be installed: `pip install exceptiongroup`. Note The structure of validation errors is likely to change in future Pydantic versions. Pydantic offers no guarantees about their structure. Should be used for visual traceback debugging only. ### use_attribute_docstrings ```python use_attribute_docstrings: bool ``` Whether docstrings of attributes (bare string literals immediately following the attribute declaration) should be used for field descriptions. Defaults to `False`. Available in Pydantic v2.7+. ```python from pydantic import BaseModel, ConfigDict, Field class Model(BaseModel): model_config = ConfigDict(use_attribute_docstrings=True) x: str """ Example of an attribute docstring """ y: int = Field(description="Description in Field") """ Description in Field overrides attribute docstring """ print(Model.model_fields["x"].description) #> Example of an attribute docstring print(Model.model_fields["y"].description) #> Description in Field ``` This requires the source code of the class to be available at runtime. Usage with `TypedDict` and stdlib dataclasses Due to current limitations, attribute docstring detection may not work as expected when using TypedDict and stdlib dataclasses, in particular when: - inheritance is being used. - multiple classes have the same name in the same source file. ### cache_strings ```python cache_strings: bool | Literal['all', 'keys', 'none'] ``` Whether to cache strings to avoid constructing new Python objects. Defaults to `True`. Enabling this setting should significantly improve validation performance while increasing memory usage slightly. - `True` or `'all'` (the default): cache all strings - `'keys'`: cache only dictionary keys - `False` or `'none'`: no caching Note `True` or `'all'` is required to cache strings during general validation because validators don't know if they're in a key or a value. Tip If repeated strings are rare, it's recommended to use `'keys'` or `'none'` to reduce memory usage, as the performance difference is minimal in that case. ### validate_by_alias ```python validate_by_alias: bool ``` Whether an aliased field may be populated by its alias. Defaults to `True`. Note In v2.11, `validate_by_alias` was introduced in conjunction with validate_by_name to empower users with more fine grained validation control. ```python from pydantic import BaseModel, ConfigDict, Field class Model(BaseModel): model_config = ConfigDict(validate_by_alias=False, validate_by_name=True) my_field: str = Field(validation_alias='my_alias') # (1)! m = Model(my_field='foo') # (2)! print(m) #> my_field='foo' ``` 1. The field `'my_field'` has an alias `'my_alias'`. 1. The model can only be populated by the attribute name `'my_field'`. Warning You cannot set both `validate_by_alias` and `validate_by_name` to `False`. This would make it impossible to populate an attribute. See [usage errors](../../errors/usage_errors/#validate-by-alias-and-name-false) for an example.
If you set `validate_by_alias` to `False`, under the hood, Pydantic dynamically sets `validate_by_name` to `True` to ensure that validation can still occur. ### validate_by_name ```python validate_by_name: bool ``` Whether an aliased field may be populated by its name as given by the model attribute. Defaults to `False`. Note In v2.0-v2.10, the `populate_by_name` configuration setting was used to specify whether or not a field could be populated by its name **and** alias. In v2.11, `validate_by_name` was introduced in conjunction with validate_by_alias to empower users with more fine grained validation behavior control. ```python from pydantic import BaseModel, ConfigDict, Field class Model(BaseModel): model_config = ConfigDict(validate_by_name=True, validate_by_alias=True) my_field: str = Field(validation_alias='my_alias') # (1)! m = Model(my_alias='foo') # (2)! print(m) #> my_field='foo' m = Model(my_field='foo') # (3)! print(m) #> my_field='foo' ``` 1. The field `'my_field'` has an alias `'my_alias'`. 1. The model is populated by the alias `'my_alias'`. 1. The model is populated by the attribute name `'my_field'`. Warning You cannot set both `validate_by_alias` and `validate_by_name` to `False`. This would make it impossible to populate an attribute. See [usage errors](../../errors/usage_errors/#validate-by-alias-and-name-false) for an example. ### serialize_by_alias ```python serialize_by_alias: bool ``` Whether an aliased field should be serialized by its alias. Defaults to `False`. Note: In v2.11, `serialize_by_alias` was introduced to address the [popular request](https://github.com/pydantic/pydantic/issues/8379) for consistency with alias behavior for validation and serialization settings. In v3, the default value is expected to change to `True` for consistency with the validation default. ```python from pydantic import BaseModel, ConfigDict, Field class Model(BaseModel): model_config = ConfigDict(serialize_by_alias=True) my_field: str = Field(serialization_alias='my_alias') # (1)! m = Model(my_field='foo') print(m.model_dump()) # (2)! #> {'my_alias': 'foo'} ``` 1. The field `'my_field'` has an alias `'my_alias'`. 1. The model is serialized using the alias `'my_alias'` for the `'my_field'` attribute. ## with_config ```python with_config( *, config: ConfigDict ) -> Callable[[_TypeT], _TypeT] ``` ```python with_config( config: ConfigDict, ) -> Callable[[_TypeT], _TypeT] ``` ```python with_config( **config: Unpack[ConfigDict], ) -> Callable[[_TypeT], _TypeT] ``` ```python with_config( config: ConfigDict | None = None, /, **kwargs: Any ) -> Callable[[_TypeT], _TypeT] ``` Usage Documentation [Configuration with other types](../../concepts/config/#configuration-on-other-supported-types) A convenience decorator to set a [Pydantic configuration](./) on a `TypedDict` or a `dataclass` from the standard library. Although the configuration can be set using the `__pydantic_config__` attribute, it does not play well with type checkers, especially with `TypedDict`. Usage ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, with_config @with_config(ConfigDict(str_to_lower=True)) class TD(TypedDict): x: str ta = TypeAdapter(TD) print(ta.validate_python({'x': 'ABC'})) #> {'x': 'abc'} ``` Source code in `pydantic/config.py` ````python def with_config(config: ConfigDict | None = None, /, **kwargs: Any) -> Callable[[_TypeT], _TypeT]: """!!! 
abstract "Usage Documentation" [Configuration with other types](../concepts/config.md#configuration-on-other-supported-types) A convenience decorator to set a [Pydantic configuration](config.md) on a `TypedDict` or a `dataclass` from the standard library. Although the configuration can be set using the `__pydantic_config__` attribute, it does not play well with type checkers, especially with `TypedDict`. !!! example "Usage" ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, with_config @with_config(ConfigDict(str_to_lower=True)) class TD(TypedDict): x: str ta = TypeAdapter(TD) print(ta.validate_python({'x': 'ABC'})) #> {'x': 'abc'} ``` """ if config is not None and kwargs: raise ValueError('Cannot specify both `config` and keyword arguments') if len(kwargs) == 1 and (kwargs_conf := kwargs.get('config')) is not None: warnings.warn( 'Passing `config` as a keyword argument is deprecated. Pass `config` as a positional argument instead', category=PydanticDeprecatedSince211, stacklevel=2, ) final_config = cast(ConfigDict, kwargs_conf) else: final_config = config if config is not None else cast(ConfigDict, kwargs) def inner(class_: _TypeT, /) -> _TypeT: # Ideally, we would check for `class_` to either be a `TypedDict` or a stdlib dataclass. # However, the `@with_config` decorator can be applied *after* `@dataclass`. To avoid # common mistakes, we at least check for `class_` to not be a Pydantic model. from ._internal._utils import is_model_class if is_model_class(class_): raise PydanticUserError( f'Cannot use `with_config` on {class_.__name__} as it is a Pydantic model', code='with-config-on-model', ) class_.__pydantic_config__ = final_config return class_ return inner ```` ## ExtraValues ```python ExtraValues = Literal['allow', 'ignore', 'forbid'] ``` ## pydantic.alias_generators Alias generators for converting between different capitalization conventions. ### to_pascal ```python to_pascal(snake: str) -> str ``` Convert a snake_case string to PascalCase. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `snake` | `str` | The string to convert. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The PascalCase string. | Source code in `pydantic/alias_generators.py` ```python def to_pascal(snake: str) -> str: """Convert a snake_case string to PascalCase. Args: snake: The string to convert. Returns: The PascalCase string. """ camel = snake.title() return re.sub('([0-9A-Za-z])_(?=[0-9A-Z])', lambda m: m.group(1), camel) ``` ### to_camel ```python to_camel(snake: str) -> str ``` Convert a snake_case string to camelCase. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `snake` | `str` | The string to convert. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The converted camelCase string. | Source code in `pydantic/alias_generators.py` ```python def to_camel(snake: str) -> str: """Convert a snake_case string to camelCase. Args: snake: The string to convert. Returns: The converted camelCase string. """ # If the string is already in camelCase and does not contain a digit followed # by a lowercase letter, return it as it is if re.match('^[a-z]+[A-Za-z0-9]*$', snake) and not re.search(r'\d[a-z]', snake): return snake camel = to_pascal(snake) return re.sub('(^_*[A-Z])', lambda m: m.group(1).lower(), camel) ``` ### to_snake ```python to_snake(camel: str) -> str ``` Convert a PascalCase, camelCase, or kebab-case string to snake_case. 
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `camel` | `str` | The string to convert. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The converted string in snake_case. | Source code in `pydantic/alias_generators.py` ```python def to_snake(camel: str) -> str: """Convert a PascalCase, camelCase, or kebab-case string to snake_case. Args: camel: The string to convert. Returns: The converted string in snake_case. """ # Handle the sequence of uppercase letters followed by a lowercase letter snake = re.sub(r'([A-Z]+)([A-Z][a-z])', lambda m: f'{m.group(1)}_{m.group(2)}', camel) # Insert an underscore between a lowercase letter and an uppercase letter snake = re.sub(r'([a-z])([A-Z])', lambda m: f'{m.group(1)}_{m.group(2)}', snake) # Insert an underscore between a digit and an uppercase letter snake = re.sub(r'([0-9])([A-Z])', lambda m: f'{m.group(1)}_{m.group(2)}', snake) # Insert an underscore between a lowercase letter and a digit snake = re.sub(r'([a-z])([0-9])', lambda m: f'{m.group(1)}_{m.group(2)}', snake) # Replace hyphens with underscores to handle kebab-case snake = snake.replace('-', '_') return snake.lower() ``` Provide an enhanced dataclass that performs validation. ## dataclass ```python dataclass( *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool = False, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = ..., slots: bool = ... ) -> Callable[[type[_T]], type[PydanticDataclass]] ``` ```python dataclass( _cls: type[_T], *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = ..., slots: bool = ... ) -> type[PydanticDataclass] ``` ```python dataclass( *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None ) -> Callable[[type[_T]], type[PydanticDataclass]] ``` ```python dataclass( _cls: type[_T], *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None ) -> type[PydanticDataclass] ``` ```python dataclass( _cls: type[_T] | None = None, *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = False, slots: bool = False ) -> ( Callable[[type[_T]], type[PydanticDataclass]] | type[PydanticDataclass] ) ``` Usage Documentation [`dataclasses`](../../concepts/dataclasses/) A decorator used to create a Pydantic-enhanced dataclass, similar to the standard Python `dataclass`, but with added validation. This function should be used similarly to `dataclasses.dataclass`. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `_cls` | `type[_T] | None` | The target dataclass. | `None` | | `init` | `Literal[False]` | Included for signature compatibility with dataclasses.dataclass, and is passed through to dataclasses.dataclass when appropriate. 
If specified, must be set to False, as pydantic inserts its own __init__ function. | `False` | | `repr` | `bool` | A boolean indicating whether to include the field in the __repr__ output. | `True` | | `eq` | `bool` | Determines if a __eq__ method should be generated for the class. | `True` | | `order` | `bool` | Determines if comparison magic methods should be generated, such as __lt__, but not __eq__. | `False` | | `unsafe_hash` | `bool` | Determines if a __hash__ method should be included in the class, as in dataclasses.dataclass. | `False` | | `frozen` | `bool | None` | Determines if the generated class should be a 'frozen' dataclass, which does not allow its attributes to be modified after it has been initialized. If not set, the value from the provided config argument will be used (and will default to False otherwise). | `None` | | `config` | `ConfigDict | type[object] | None` | The Pydantic config to use for the dataclass. | `None` | | `validate_on_init` | `bool | None` | A deprecated parameter included for backwards compatibility; in V2, all Pydantic dataclasses are validated on init. | `None` | | `kw_only` | `bool` | Determines if __init__ method parameters must be specified by keyword only. Defaults to False. | `False` | | `slots` | `bool` | Determines if the generated class should be a 'slots' dataclass, which does not allow the addition of new attributes after instantiation. | `False` | Returns: | Type | Description | | --- | --- | | `Callable[[type[_T]], type[PydanticDataclass]] | type[PydanticDataclass]` | A decorator that accepts a class as its argument and returns a Pydantic dataclass. | Raises: | Type | Description | | --- | --- | | `AssertionError` | Raised if init is not False or validate_on_init is False. | Source code in `pydantic/dataclasses.py` ```python @dataclass_transform(field_specifiers=(dataclasses.field, Field, PrivateAttr)) def dataclass( _cls: type[_T] | None = None, *, init: Literal[False] = False, repr: bool = True, eq: bool = True, order: bool = False, unsafe_hash: bool = False, frozen: bool | None = None, config: ConfigDict | type[object] | None = None, validate_on_init: bool | None = None, kw_only: bool = False, slots: bool = False, ) -> Callable[[type[_T]], type[PydanticDataclass]] | type[PydanticDataclass]: """!!! abstract "Usage Documentation" [`dataclasses`](../concepts/dataclasses.md) A decorator used to create a Pydantic-enhanced dataclass, similar to the standard Python `dataclass`, but with added validation. This function should be used similarly to `dataclasses.dataclass`. Args: _cls: The target `dataclass`. init: Included for signature compatibility with `dataclasses.dataclass`, and is passed through to `dataclasses.dataclass` when appropriate. If specified, must be set to `False`, as pydantic inserts its own `__init__` function. repr: A boolean indicating whether to include the field in the `__repr__` output. eq: Determines if a `__eq__` method should be generated for the class. order: Determines if comparison magic methods should be generated, such as `__lt__`, but not `__eq__`. unsafe_hash: Determines if a `__hash__` method should be included in the class, as in `dataclasses.dataclass`. frozen: Determines if the generated class should be a 'frozen' `dataclass`, which does not allow its attributes to be modified after it has been initialized. If not set, the value from the provided `config` argument will be used (and will default to `False` otherwise). config: The Pydantic config to use for the `dataclass`. 
validate_on_init: A deprecated parameter included for backwards compatibility; in V2, all Pydantic dataclasses are validated on init. kw_only: Determines if `__init__` method parameters must be specified by keyword only. Defaults to `False`. slots: Determines if the generated class should be a 'slots' `dataclass`, which does not allow the addition of new attributes after instantiation. Returns: A decorator that accepts a class as its argument and returns a Pydantic `dataclass`. Raises: AssertionError: Raised if `init` is not `False` or `validate_on_init` is `False`. """ assert init is False, 'pydantic.dataclasses.dataclass only supports init=False' assert validate_on_init is not False, 'validate_on_init=False is no longer supported' if sys.version_info >= (3, 10): kwargs = {'kw_only': kw_only, 'slots': slots} else: kwargs = {} def make_pydantic_fields_compatible(cls: type[Any]) -> None: """Make sure that stdlib `dataclasses` understands `Field` kwargs like `kw_only` To do that, we simply change `x: int = pydantic.Field(..., kw_only=True)` into `x: int = dataclasses.field(default=pydantic.Field(..., kw_only=True), kw_only=True)` """ for annotation_cls in cls.__mro__: annotations: dict[str, Any] = getattr(annotation_cls, '__annotations__', {}) for field_name in annotations: field_value = getattr(cls, field_name, None) # Process only if this is an instance of `FieldInfo`. if not isinstance(field_value, FieldInfo): continue # Initialize arguments for the standard `dataclasses.field`. field_args: dict = {'default': field_value} # Handle `kw_only` for Python 3.10+ if sys.version_info >= (3, 10) and field_value.kw_only: field_args['kw_only'] = True # Set `repr` attribute if it's explicitly specified to be not `True`. if field_value.repr is not True: field_args['repr'] = field_value.repr setattr(cls, field_name, dataclasses.field(**field_args)) # In Python 3.9, when subclassing, information is pulled from cls.__dict__['__annotations__'] # for annotations, so we must make sure it's initialized before we add to it. if cls.__dict__.get('__annotations__') is None: cls.__annotations__ = {} cls.__annotations__[field_name] = annotations[field_name] def create_dataclass(cls: type[Any]) -> type[PydanticDataclass]: """Create a Pydantic dataclass from a regular dataclass. Args: cls: The class to create the Pydantic dataclass from. Returns: A Pydantic dataclass. """ from ._internal._utils import is_model_class if is_model_class(cls): raise PydanticUserError( f'Cannot create a Pydantic dataclass from {cls.__name__} as it is already a Pydantic model', code='dataclass-on-model', ) original_cls = cls # we warn on conflicting config specifications, but only if the class doesn't have a dataclass base # because a dataclass base might provide a __pydantic_config__ attribute that we don't want to warn about has_dataclass_base = any(dataclasses.is_dataclass(base) for base in cls.__bases__) if not has_dataclass_base and config is not None and hasattr(cls, '__pydantic_config__'): warn( f'`config` is set via both the `dataclass` decorator and `__pydantic_config__` for dataclass {cls.__name__}. 
' f'The `config` specification from `dataclass` decorator will take priority.', category=UserWarning, stacklevel=2, ) # if config is not explicitly provided, try to read it from the type config_dict = config if config is not None else getattr(cls, '__pydantic_config__', None) config_wrapper = _config.ConfigWrapper(config_dict) decorators = _decorators.DecoratorInfos.build(cls) # Keep track of the original __doc__ so that we can restore it after applying the dataclasses decorator # Otherwise, classes with no __doc__ will have their signature added into the JSON schema description, # since dataclasses.dataclass will set this as the __doc__ original_doc = cls.__doc__ if _pydantic_dataclasses.is_builtin_dataclass(cls): # Don't preserve the docstring for vanilla dataclasses, as it may include the signature # This matches v1 behavior, and there was an explicit test for it original_doc = None # We don't want to add validation to the existing std lib dataclass, so we will subclass it # If the class is generic, we need to make sure the subclass also inherits from Generic # with all the same parameters. bases = (cls,) if issubclass(cls, Generic): generic_base = Generic[cls.__parameters__] # type: ignore bases = bases + (generic_base,) cls = types.new_class(cls.__name__, bases) make_pydantic_fields_compatible(cls) # Respect frozen setting from dataclass constructor and fallback to config setting if not provided if frozen is not None: frozen_ = frozen if config_wrapper.frozen: # It's not recommended to define both, as the setting from the dataclass decorator will take priority. warn( f'`frozen` is set via both the `dataclass` decorator and `config` for dataclass {cls.__name__!r}.' 'This is not recommended. The `frozen` specification on `dataclass` will take priority.', category=UserWarning, stacklevel=2, ) else: frozen_ = config_wrapper.frozen or False cls = dataclasses.dataclass( # type: ignore[call-overload] cls, # the value of init here doesn't affect anything except that it makes it easier to generate a signature init=True, repr=repr, eq=eq, order=order, unsafe_hash=unsafe_hash, frozen=frozen_, **kwargs, ) # This is an undocumented attribute to distinguish stdlib/Pydantic dataclasses. # It should be set as early as possible: cls.__is_pydantic_dataclass__ = True cls.__pydantic_decorators__ = decorators # type: ignore cls.__doc__ = original_doc cls.__module__ = original_cls.__module__ cls.__qualname__ = original_cls.__qualname__ cls.__pydantic_complete__ = False # `complete_dataclass` will set it to `True` if successful. # TODO `parent_namespace` is currently None, but we could do the same thing as Pydantic models: # fetch the parent ns using `parent_frame_namespace` (if the dataclass was defined in a function), # and possibly cache it (see the `__pydantic_parent_namespace__` logic for models). _pydantic_dataclasses.complete_dataclass(cls, config_wrapper, raise_errors=False) return cls return create_dataclass if _cls is None else create_dataclass(_cls) ``` ## rebuild_dataclass ```python rebuild_dataclass( cls: type[PydanticDataclass], *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None ) -> bool | None ``` Try to rebuild the pydantic-core schema for the dataclass. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. This is analogous to `BaseModel.model_rebuild`. 
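For instance, the typical flow looks like this (a sketch of our own, assuming the forward-referenced class is defined later in the same module):

```python
from pydantic import TypeAdapter
from pydantic.dataclasses import dataclass, rebuild_dataclass

@dataclass
class Outer:
    inner: 'Inner'  # forward reference; `Inner` is not defined yet,
                    # so the initial schema build is left incomplete

@dataclass
class Inner:
    value: int

# `Inner` is now resolvable, so rebuilding succeeds.
print(rebuild_dataclass(Outer))
#> True

print(TypeAdapter(Outer).validate_python({'inner': {'value': 1}}))
#> Outer(inner=Inner(value=1))
```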
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `type[PydanticDataclass]` | The class to rebuild the pydantic-core schema for. | *required* | | `force` | `bool` | Whether to force the rebuilding of the schema, defaults to False. | `False` | | `raise_errors` | `bool` | Whether to raise errors, defaults to True. | `True` | | `_parent_namespace_depth` | `int` | The depth level of the parent namespace, defaults to 2. | `2` | | `_types_namespace` | `MappingNamespace | None` | The types namespace, defaults to None. | `None` | Returns: | Type | Description | | --- | --- | | `bool | None` | Returns None if the schema is already "complete" and rebuilding was not required. | | `bool | None` | If rebuilding was required, returns True if rebuilding was successful, otherwise False. | Source code in `pydantic/dataclasses.py` ```python def rebuild_dataclass( cls: type[PydanticDataclass], *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the dataclass. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. This is analogous to `BaseModel.model_rebuild`. Args: cls: The class to rebuild the pydantic-core schema for. force: Whether to force the rebuilding of the schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: The depth level of the parent namespace, defaults to 2. _types_namespace: The types namespace, defaults to `None`. Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. """ if not force and cls.__pydantic_complete__: return None for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'): if attr in cls.__dict__: # Deleting the validator/serializer is necessary as otherwise they can get reused in # pydantic-core. Same applies for the core schema that can be reused in schema generation. delattr(cls, attr) cls.__pydantic_complete__ = False if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} ns_resolver = _namespace_utils.NsResolver( parent_namespace=rebuild_ns, ) return _pydantic_dataclasses.complete_dataclass( cls, _config.ConfigWrapper(cls.__pydantic_config__, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, # We could provide a different config instead (with `'defer_build'` set to `True`) # of this explicit `_force_build` argument, but because config can come from the # decorator parameter or the `__pydantic_config__` attribute, `complete_dataclass` # will overwrite `__pydantic_config__` with the provided config above: _force_build=True, ) ``` ## is_pydantic_dataclass ```python is_pydantic_dataclass( class_: type[Any], ) -> TypeGuard[type[PydanticDataclass]] ``` Whether a class is a pydantic dataclass. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `class_` | `type[Any]` | The class. | *required* | Returns: | Type | Description | | --- | --- | | `TypeGuard[type[PydanticDataclass]]` | True if the class is a pydantic dataclass, False otherwise. 
| Source code in `pydantic/dataclasses.py` ```python def is_pydantic_dataclass(class_: type[Any], /) -> TypeGuard[type[PydanticDataclass]]: """Whether a class is a pydantic dataclass. Args: class_: The class. Returns: `True` if the class is a pydantic dataclass, `False` otherwise. """ try: return '__is_pydantic_dataclass__' in class_.__dict__ and dataclasses.is_dataclass(class_) except AttributeError: return False ``` Pydantic-specific errors. ## PydanticErrorMixin ```python PydanticErrorMixin( message: str, *, code: PydanticErrorCodes | None ) ``` A mixin class for common functionality shared by all Pydantic-specific errors. Attributes: | Name | Type | Description | | --- | --- | --- | | `message` | | A message describing the error. | | `code` | | An optional error code from PydanticErrorCodes enum. | Source code in `pydantic/errors.py` ```python def __init__(self, message: str, *, code: PydanticErrorCodes | None) -> None: self.message = message self.code = code ``` ## PydanticUserError ```python PydanticUserError( message: str, *, code: PydanticErrorCodes | None ) ``` Bases: `PydanticErrorMixin`, `TypeError` An error raised due to incorrect use of Pydantic. Source code in `pydantic/errors.py` ```python def __init__(self, message: str, *, code: PydanticErrorCodes | None) -> None: self.message = message self.code = code ``` ## PydanticUndefinedAnnotation ```python PydanticUndefinedAnnotation(name: str, message: str) ``` Bases: `PydanticErrorMixin`, `NameError` A subclass of `NameError` raised when handling undefined annotations during `CoreSchema` generation. Attributes: | Name | Type | Description | | --- | --- | --- | | `name` | | Name of the error. | | `message` | | Description of the error. | Source code in `pydantic/errors.py` ```python def __init__(self, name: str, message: str) -> None: self.name = name super().__init__(message=message, code='undefined-annotation') ``` ### from_name_error ```python from_name_error(name_error: NameError) -> Self ``` Convert a `NameError` to a `PydanticUndefinedAnnotation` error. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `name_error` | `NameError` | NameError to be converted. | *required* | Returns: | Type | Description | | --- | --- | | `Self` | Converted PydanticUndefinedAnnotation error. | Source code in `pydantic/errors.py` ```python @classmethod def from_name_error(cls, name_error: NameError) -> Self: """Convert a `NameError` to a `PydanticUndefinedAnnotation` error. Args: name_error: `NameError` to be converted. Returns: Converted `PydanticUndefinedAnnotation` error. """ try: name = name_error.name # type: ignore # python > 3.10 except AttributeError: name = re.search(r".*'(.+?)'", str(name_error)).group(1) # type: ignore[union-attr] return cls(name=name, message=str(name_error)) ``` ## PydanticImportError ```python PydanticImportError(message: str) ``` Bases: `PydanticErrorMixin`, `ImportError` An error raised when an import fails due to module changes between V1 and V2. Attributes: | Name | Type | Description | | --- | --- | --- | | `message` | | Description of the error. | Source code in `pydantic/errors.py` ```python def __init__(self, message: str) -> None: super().__init__(message, code='import-error') ``` ## PydanticSchemaGenerationError ```python PydanticSchemaGenerationError(message: str) ``` Bases: `PydanticUserError` An error raised during failures to generate a `CoreSchema` for some type. Attributes: | Name | Type | Description | | --- | --- | --- | | `message` | | Description of the error. 
| Source code in `pydantic/errors.py` ```python def __init__(self, message: str) -> None: super().__init__(message, code='schema-for-unknown-type') ``` ## PydanticInvalidForJsonSchema ```python PydanticInvalidForJsonSchema(message: str) ``` Bases: `PydanticUserError` An error raised during failures to generate a JSON schema for some `CoreSchema`. Attributes: | Name | Type | Description | | --- | --- | --- | | `message` | | Description of the error. | Source code in `pydantic/errors.py` ```python def __init__(self, message: str) -> None: super().__init__(message, code='invalid-for-json-schema') ``` ## PydanticForbiddenQualifier ```python PydanticForbiddenQualifier( qualifier: Qualifier, annotation: Any ) ``` Bases: `PydanticUserError` An error raised if a forbidden type qualifier is found in a type annotation. Source code in `pydantic/errors.py` ```python def __init__(self, qualifier: Qualifier, annotation: Any) -> None: super().__init__( message=( f'The annotation {_repr.display_as_type(annotation)!r} contains the {self._qualifier_repr_map[qualifier]!r} ' f'type qualifier, which is invalid in the context it is defined.' ), code=None, ) ``` # Pipeline API Experimental pipeline API functionality. Be careful with this API, it's subject to change. ## \_Pipeline ```python _Pipeline(_steps: tuple[_Step, ...]) ``` Bases: `Generic[_InT, _OutT]` Abstract representation of a chain of validation, transformation, and parsing steps. ### transform ```python transform( func: Callable[[_OutT], _NewOutT] ) -> _Pipeline[_InT, _NewOutT] ``` Transform the output of the previous step. If used as the first step in a pipeline, the type of the field is used. That is, the transformation is applied after the value is parsed to the field's type. Source code in `pydantic/experimental/pipeline.py` ```python def transform( self, func: Callable[[_OutT], _NewOutT], ) -> _Pipeline[_InT, _NewOutT]: """Transform the output of the previous step. If used as the first step in a pipeline, the type of the field is used. That is, the transformation is applied after the value is parsed to the field's type. """ return _Pipeline[_InT, _NewOutT](self._steps + (_Transform(func),)) ``` ### validate_as ```python validate_as( tp: type[_NewOutT], *, strict: bool = ... ) -> _Pipeline[_InT, _NewOutT] ``` ```python validate_as( tp: EllipsisType, *, strict: bool = ... ) -> _Pipeline[_InT, Any] ``` ```python validate_as( tp: type[_NewOutT] | EllipsisType, *, strict: bool = False ) -> _Pipeline[_InT, Any] ``` Validate / parse the input into a new type. If no type is provided, the type of the field is used. Types are parsed in Pydantic's `lax` mode by default, but you can enable `strict` mode by passing `strict=True`. Source code in `pydantic/experimental/pipeline.py` ```python def validate_as(self, tp: type[_NewOutT] | EllipsisType, *, strict: bool = False) -> _Pipeline[_InT, Any]: # type: ignore """Validate / parse the input into a new type. If no type is provided, the type of the field is used. Types are parsed in Pydantic's `lax` mode by default, but you can enable `strict` mode by passing `strict=True`.
""" if isinstance(tp, EllipsisType): return _Pipeline[_InT, Any](self._steps + (_ValidateAs(_FieldTypeMarker, strict=strict),)) return _Pipeline[_InT, _NewOutT](self._steps + (_ValidateAs(tp, strict=strict),)) ``` ### validate_as_deferred ```python validate_as_deferred( func: Callable[[], type[_NewOutT]] ) -> _Pipeline[_InT, _NewOutT] ``` Parse the input into a new type, deferring resolution of the type until the current class is fully defined. This is useful when you need to reference the class in it's own type annotations. Source code in `pydantic/experimental/pipeline.py` ```python def validate_as_deferred(self, func: Callable[[], type[_NewOutT]]) -> _Pipeline[_InT, _NewOutT]: """Parse the input into a new type, deferring resolution of the type until the current class is fully defined. This is useful when you need to reference the class in it's own type annotations. """ return _Pipeline[_InT, _NewOutT](self._steps + (_ValidateAsDefer(func),)) ``` ### constrain ```python constrain(constraint: Ge) -> _Pipeline[_InT, _NewOutGe] ``` ```python constrain(constraint: Gt) -> _Pipeline[_InT, _NewOutGt] ``` ```python constrain(constraint: Le) -> _Pipeline[_InT, _NewOutLe] ``` ```python constrain(constraint: Lt) -> _Pipeline[_InT, _NewOutLt] ``` ```python constrain(constraint: Len) -> _Pipeline[_InT, _NewOutLen] ``` ```python constrain( constraint: MultipleOf, ) -> _Pipeline[_InT, _NewOutT] ``` ```python constrain( constraint: Timezone, ) -> _Pipeline[_InT, _NewOutDatetime] ``` ```python constrain(constraint: Predicate) -> _Pipeline[_InT, _OutT] ``` ```python constrain( constraint: Interval, ) -> _Pipeline[_InT, _NewOutInterval] ``` ```python constrain(constraint: _Eq) -> _Pipeline[_InT, _OutT] ``` ```python constrain(constraint: _NotEq) -> _Pipeline[_InT, _OutT] ``` ```python constrain(constraint: _In) -> _Pipeline[_InT, _OutT] ``` ```python constrain(constraint: _NotIn) -> _Pipeline[_InT, _OutT] ``` ```python constrain( constraint: Pattern[str], ) -> _Pipeline[_InT, _NewOutT] ``` ```python constrain(constraint: _ConstraintAnnotation) -> Any ``` Constrain a value to meet a certain condition. We support most conditions from `annotated_types`, as well as regular expressions. Most of the time you'll be calling a shortcut method like `gt`, `lt`, `len`, etc so you don't need to call this directly. Source code in `pydantic/experimental/pipeline.py` ```python def constrain(self, constraint: _ConstraintAnnotation) -> Any: """Constrain a value to meet a certain condition. We support most conditions from `annotated_types`, as well as regular expressions. Most of the time you'll be calling a shortcut method like `gt`, `lt`, `len`, etc so you don't need to call this directly. """ return _Pipeline[_InT, _OutT](self._steps + (_Constraint(constraint),)) ``` ### predicate ```python predicate( func: Callable[[_NewOutT], bool] ) -> _Pipeline[_InT, _NewOutT] ``` Constrain a value to meet a certain predicate. Source code in `pydantic/experimental/pipeline.py` ```python def predicate(self: _Pipeline[_InT, _NewOutT], func: Callable[[_NewOutT], bool]) -> _Pipeline[_InT, _NewOutT]: """Constrain a value to meet a certain predicate.""" return self.constrain(annotated_types.Predicate(func)) ``` ### gt ```python gt(gt: _NewOutGt) -> _Pipeline[_InT, _NewOutGt] ``` Constrain a value to be greater than a certain value. 
Source code in `pydantic/experimental/pipeline.py` ```python def gt(self: _Pipeline[_InT, _NewOutGt], gt: _NewOutGt) -> _Pipeline[_InT, _NewOutGt]: """Constrain a value to be greater than a certain value.""" return self.constrain(annotated_types.Gt(gt)) ``` ### lt ```python lt(lt: _NewOutLt) -> _Pipeline[_InT, _NewOutLt] ``` Constrain a value to be less than a certain value. Source code in `pydantic/experimental/pipeline.py` ```python def lt(self: _Pipeline[_InT, _NewOutLt], lt: _NewOutLt) -> _Pipeline[_InT, _NewOutLt]: """Constrain a value to be less than a certain value.""" return self.constrain(annotated_types.Lt(lt)) ``` ### ge ```python ge(ge: _NewOutGe) -> _Pipeline[_InT, _NewOutGe] ``` Constrain a value to be greater than or equal to a certain value. Source code in `pydantic/experimental/pipeline.py` ```python def ge(self: _Pipeline[_InT, _NewOutGe], ge: _NewOutGe) -> _Pipeline[_InT, _NewOutGe]: """Constrain a value to be greater than or equal to a certain value.""" return self.constrain(annotated_types.Ge(ge)) ``` ### le ```python le(le: _NewOutLe) -> _Pipeline[_InT, _NewOutLe] ``` Constrain a value to be less than or equal to a certain value. Source code in `pydantic/experimental/pipeline.py` ```python def le(self: _Pipeline[_InT, _NewOutLe], le: _NewOutLe) -> _Pipeline[_InT, _NewOutLe]: """Constrain a value to be less than or equal to a certain value.""" return self.constrain(annotated_types.Le(le)) ``` ### len ```python len( min_len: int, max_len: int | None = None ) -> _Pipeline[_InT, _NewOutLen] ``` Constrain a value to have a certain length. Source code in `pydantic/experimental/pipeline.py` ```python def len(self: _Pipeline[_InT, _NewOutLen], min_len: int, max_len: int | None = None) -> _Pipeline[_InT, _NewOutLen]: """Constrain a value to have a certain length.""" return self.constrain(annotated_types.Len(min_len, max_len)) ``` ### multiple_of ```python multiple_of( multiple_of: _NewOutDiv, ) -> _Pipeline[_InT, _NewOutDiv] ``` ```python multiple_of( multiple_of: _NewOutMod, ) -> _Pipeline[_InT, _NewOutMod] ``` ```python multiple_of(multiple_of: Any) -> _Pipeline[_InT, Any] ``` Constrain a value to be a multiple of a certain number. Source code in `pydantic/experimental/pipeline.py` ```python def multiple_of(self: _Pipeline[_InT, Any], multiple_of: Any) -> _Pipeline[_InT, Any]: """Constrain a value to be a multiple of a certain number.""" return self.constrain(annotated_types.MultipleOf(multiple_of)) ``` ### eq ```python eq(value: _OutT) -> _Pipeline[_InT, _OutT] ``` Constrain a value to be equal to a certain value. Source code in `pydantic/experimental/pipeline.py` ```python def eq(self: _Pipeline[_InT, _OutT], value: _OutT) -> _Pipeline[_InT, _OutT]: """Constrain a value to be equal to a certain value.""" return self.constrain(_Eq(value)) ``` ### not_eq ```python not_eq(value: _OutT) -> _Pipeline[_InT, _OutT] ``` Constrain a value to not be equal to a certain value. Source code in `pydantic/experimental/pipeline.py` ```python def not_eq(self: _Pipeline[_InT, _OutT], value: _OutT) -> _Pipeline[_InT, _OutT]: """Constrain a value to not be equal to a certain value.""" return self.constrain(_NotEq(value)) ``` ### in\_ ```python in_(values: Container[_OutT]) -> _Pipeline[_InT, _OutT] ``` Constrain a value to be in a certain set. 
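A membership constraint reads similarly; here is a minimal sketch (the `Ticket` model is invented for the example):

```python
from typing import Annotated

from pydantic import BaseModel, ValidationError
from pydantic.experimental.pipeline import validate_as


class Ticket(BaseModel):
    # Validate as a str, then require the value to be one of a fixed set.
    status: Annotated[str, validate_as(str).in_({'open', 'closed'})]


print(Ticket(status='open').status)
#> open

try:
    Ticket(status='reopened')  # not in the set, so validation fails
except ValidationError:
    print('invalid status')
    #> invalid status
```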
Source code in `pydantic/experimental/pipeline.py` ```python def in_(self: _Pipeline[_InT, _OutT], values: Container[_OutT]) -> _Pipeline[_InT, _OutT]: """Constrain a value to be in a certain set.""" return self.constrain(_In(values)) ``` ### not_in ```python not_in(values: Container[_OutT]) -> _Pipeline[_InT, _OutT] ``` Constrain a value to not be in a certain set. Source code in `pydantic/experimental/pipeline.py` ```python def not_in(self: _Pipeline[_InT, _OutT], values: Container[_OutT]) -> _Pipeline[_InT, _OutT]: """Constrain a value to not be in a certain set.""" return self.constrain(_NotIn(values)) ``` ### otherwise ```python otherwise( other: _Pipeline[_OtherIn, _OtherOut] ) -> _Pipeline[_InT | _OtherIn, _OutT | _OtherOut] ``` Combine two validation chains, returning the result of the first chain if it succeeds, and the second chain if it fails. Source code in `pydantic/experimental/pipeline.py` ```python def otherwise(self, other: _Pipeline[_OtherIn, _OtherOut]) -> _Pipeline[_InT | _OtherIn, _OutT | _OtherOut]: """Combine two validation chains, returning the result of the first chain if it succeeds, and the second chain if it fails.""" return _Pipeline((_PipelineOr(self, other),)) ``` ### then ```python then( other: _Pipeline[_OutT, _OtherOut] ) -> _Pipeline[_InT, _OtherOut] ``` Pipe the result of one validation chain into another. Source code in `pydantic/experimental/pipeline.py` ```python def then(self, other: _Pipeline[_OutT, _OtherOut]) -> _Pipeline[_InT, _OtherOut]: """Pipe the result of one validation chain into another.""" return _Pipeline((_PipelineAnd(self, other),)) ``` # Arguments schema API Experimental module exposing a function to generate a core schema that validates callable arguments. ## generate_arguments_schema ```python generate_arguments_schema( func: Callable[..., Any], schema_type: Literal[ "arguments", "arguments-v3" ] = "arguments-v3", parameters_callback: ( Callable[[int, str, Any], Literal["skip"] | None] | None ) = None, config: ConfigDict | None = None, ) -> CoreSchema ``` Generate the schema for the arguments of a function. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `func` | `Callable[..., Any]` | The function to generate the schema for. | *required* | | `schema_type` | `Literal['arguments', 'arguments-v3']` | The type of schema to generate. | `'arguments-v3'` | | `parameters_callback` | `Callable[[int, str, Any], Literal['skip'] | None] | None` | A callable that will be invoked for each parameter. The callback should take three required arguments: the index, the name and the type annotation (or Parameter.empty if not annotated) of the parameter. The callback can optionally return 'skip', so that the parameter gets excluded from the resulting schema. | `None` | | `config` | `ConfigDict | None` | The configuration to use. | `None` | Returns: | Type | Description | | --- | --- | | `CoreSchema` | The generated schema. | Source code in `pydantic/experimental/arguments_schema.py` ```python def generate_arguments_schema( func: Callable[..., Any], schema_type: Literal['arguments', 'arguments-v3'] = 'arguments-v3', parameters_callback: Callable[[int, str, Any], Literal['skip'] | None] | None = None, config: ConfigDict | None = None, ) -> CoreSchema: """Generate the schema for the arguments of a function. Args: func: The function to generate the schema for. schema_type: The type of schema to generate. parameters_callback: A callable that will be invoked for each parameter. 
The callback should take three required arguments: the index, the name and the type annotation (or [`Parameter.empty`][inspect.Parameter.empty] if not annotated) of the parameter. The callback can optionally return `'skip'`, so that the parameter gets excluded from the resulting schema. config: The configuration to use. Returns: The generated schema. """ generate_schema = _generate_schema.GenerateSchema( _config.ConfigWrapper(config), ns_resolver=_namespace_utils.NsResolver(namespaces_tuple=_namespace_utils.ns_for_function(func)), ) if schema_type == 'arguments': schema = generate_schema._arguments_schema(func, parameters_callback) # pyright: ignore[reportArgumentType] else: schema = generate_schema._arguments_v3_schema(func, parameters_callback) # pyright: ignore[reportArgumentType] return generate_schema.clean_schema(schema) ``` Defining fields on models. ## Field ```python Field( default: ellipsis, *, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: ( str | AliasPath | AliasChoices | None ) = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: ( Callable[[str, FieldInfo], str] | None ) = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ) = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: SupportsGt | None = _Unset, ge: SupportsGe | None = _Unset, lt: SupportsLt | None = _Unset, le: SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal["smart", "left_to_right"] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs] ) -> Any ``` ```python Field( default: _T, *, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: ( str | AliasPath | AliasChoices | None ) = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: ( Callable[[str, FieldInfo], str] | None ) = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ) = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: SupportsGt | None = _Unset, ge: SupportsGe | None = _Unset, lt: SupportsLt | None = _Unset, le: SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal["smart", "left_to_right"] = _Unset, 
fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs] ) -> _T ``` ```python Field( *, default_factory: ( Callable[[], _T] | Callable[[dict[str, Any]], _T] ), alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: ( str | AliasPath | AliasChoices | None ) = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: ( Callable[[str, FieldInfo], str] | None ) = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ) = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: SupportsGt | None = _Unset, ge: SupportsGe | None = _Unset, lt: SupportsLt | None = _Unset, le: SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal["smart", "left_to_right"] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs] ) -> _T ``` ```python Field( *, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: ( str | AliasPath | AliasChoices | None ) = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: ( Callable[[str, FieldInfo], str] | None ) = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ) = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: SupportsGt | None = _Unset, ge: SupportsGe | None = _Unset, lt: SupportsLt | None = _Unset, le: SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal["smart", "left_to_right"] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs] ) -> Any ``` ```python Field( default: Any = PydanticUndefined, *, default_factory: ( Callable[[], Any] | Callable[[dict[str, Any]], Any] | None ) = _Unset, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: ( str | AliasPath | AliasChoices | None ) = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: ( Callable[[str, FieldInfo], str] | None ) = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | 
None ) = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: SupportsGt | None = _Unset, ge: SupportsGe | None = _Unset, lt: SupportsLt | None = _Unset, le: SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal["smart", "left_to_right"] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs] ) -> Any ``` Usage Documentation [Fields](../../concepts/fields/) Create a field for objects that can be configured. Used to provide extra information about a field, either for the model schema or complex validation. Some arguments apply only to number fields (`int`, `float`, `Decimal`) and some apply only to `str`. Note - Any `_Unset` objects will be replaced by the corresponding value defined in the `_DefaultValues` dictionary. If a key for the `_Unset` object is not found in the `_DefaultValues` dictionary, it will default to `None` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `default` | `Any` | Default value if the field is not set. | `PydanticUndefined` | | `default_factory` | `Callable[[], Any] | Callable[[dict[str, Any]], Any] | None` | A callable to generate the default value. The callable can either take 0 arguments (in which case it is called as is) or a single argument containing the already validated data. | `_Unset` | | `alias` | `str | None` | The name to use for the attribute when validating or serializing by alias. This is often used for things like converting between snake and camel case. | `_Unset` | | `alias_priority` | `int | None` | Priority of the alias. This affects whether an alias generator is used. | `_Unset` | | `validation_alias` | `str | AliasPath | AliasChoices | None` | Like alias, but only affects validation, not serialization. | `_Unset` | | `serialization_alias` | `str | None` | Like alias, but only affects serialization, not validation. | `_Unset` | | `title` | `str | None` | Human-readable title. | `_Unset` | | `field_title_generator` | `Callable[[str, FieldInfo], str] | None` | A callable that takes a field name and returns title for it. | `_Unset` | | `description` | `str | None` | Human-readable description. | `_Unset` | | `examples` | `list[Any] | None` | Example values for this field. | `_Unset` | | `exclude` | `bool | None` | Whether to exclude the field from the model serialization. | `_Unset` | | `discriminator` | `str | Discriminator | None` | Field name or Discriminator for discriminating the type in a tagged union. | `_Unset` | | `deprecated` | `Deprecated | str | bool | None` | A deprecation message, an instance of warnings.deprecated or the typing_extensions.deprecated backport, or a boolean. If True, a default deprecation message will be emitted when accessing the field. | `_Unset` | | `json_schema_extra` | `JsonDict | Callable[[JsonDict], None] | None` | A dict or callable to provide extra JSON schema properties. | `_Unset` | | `frozen` | `bool | None` | Whether the field is frozen. If true, attempts to change the value on an instance will raise an error. 
| `_Unset` | | `validate_default` | `bool | None` | If True, apply validation to the default value every time you create an instance. Otherwise, for performance reasons, the default value of the field is trusted and not validated. | `_Unset` | | `repr` | `bool` | A boolean indicating whether to include the field in the __repr__ output. | `_Unset` | | `init` | `bool | None` | Whether the field should be included in the constructor of the dataclass. (Only applies to dataclasses.) | `_Unset` | | `init_var` | `bool | None` | Whether the field should only be included in the constructor of the dataclass. (Only applies to dataclasses.) | `_Unset` | | `kw_only` | `bool | None` | Whether the field should be a keyword-only argument in the constructor of the dataclass. (Only applies to dataclasses.) | `_Unset` | | `coerce_numbers_to_str` | `bool | None` | Whether to enable coercion of any Number type to str (not applicable in strict mode). | `_Unset` | | `strict` | `bool | None` | If True, strict validation is applied to the field. See Strict Mode for details. | `_Unset` | | `gt` | `SupportsGt | None` | Greater than. If set, value must be greater than this. Only applicable to numbers. | `_Unset` | | `ge` | `SupportsGe | None` | Greater than or equal. If set, value must be greater than or equal to this. Only applicable to numbers. | `_Unset` | | `lt` | `SupportsLt | None` | Less than. If set, value must be less than this. Only applicable to numbers. | `_Unset` | | `le` | `SupportsLe | None` | Less than or equal. If set, value must be less than or equal to this. Only applicable to numbers. | `_Unset` | | `multiple_of` | `float | None` | Value must be a multiple of this. Only applicable to numbers. | `_Unset` | | `min_length` | `int | None` | Minimum length for iterables. | `_Unset` | | `max_length` | `int | None` | Maximum length for iterables. | `_Unset` | | `pattern` | `str | Pattern[str] | None` | Pattern for strings (a regular expression). | `_Unset` | | `allow_inf_nan` | `bool | None` | Allow inf, -inf, nan. Only applicable to float and Decimal numbers. | `_Unset` | | `max_digits` | `int | None` | Maximum number of allowed digits. Only applicable to Decimal numbers. | `_Unset` | | `decimal_places` | `int | None` | Maximum number of decimal places allowed for numbers. | `_Unset` | | `union_mode` | `Literal['smart', 'left_to_right']` | The strategy to apply when validating a union. Can be smart (the default), or left_to_right. See Union Mode for details. | `_Unset` | | `fail_fast` | `bool | None` | If True, validation will stop on the first error. If False, all validation errors will be collected. This option can be applied only to iterable types (list, tuple, set, and frozenset). | `_Unset` | | `extra` | `Unpack[_EmptyKwargs]` | (Deprecated) Extra fields that will be included in the JSON schema. Warning The extra kwargs are deprecated. Use json_schema_extra instead. | `{}` | Returns: | Type | Description | | --- | --- | | `Any` | A new FieldInfo. The return annotation is Any so Field can be used on type-annotated fields without causing a type error.
| Source code in `pydantic/fields.py` ```python def Field( # noqa: C901 default: Any = PydanticUndefined, *, default_factory: Callable[[], Any] | Callable[[dict[str, Any]], Any] | None = _Unset, alias: str | None = _Unset, alias_priority: int | None = _Unset, validation_alias: str | AliasPath | AliasChoices | None = _Unset, serialization_alias: str | None = _Unset, title: str | None = _Unset, field_title_generator: Callable[[str, FieldInfo], str] | None = _Unset, description: str | None = _Unset, examples: list[Any] | None = _Unset, exclude: bool | None = _Unset, discriminator: str | types.Discriminator | None = _Unset, deprecated: Deprecated | str | bool | None = _Unset, json_schema_extra: JsonDict | Callable[[JsonDict], None] | None = _Unset, frozen: bool | None = _Unset, validate_default: bool | None = _Unset, repr: bool = _Unset, init: bool | None = _Unset, init_var: bool | None = _Unset, kw_only: bool | None = _Unset, pattern: str | typing.Pattern[str] | None = _Unset, strict: bool | None = _Unset, coerce_numbers_to_str: bool | None = _Unset, gt: annotated_types.SupportsGt | None = _Unset, ge: annotated_types.SupportsGe | None = _Unset, lt: annotated_types.SupportsLt | None = _Unset, le: annotated_types.SupportsLe | None = _Unset, multiple_of: float | None = _Unset, allow_inf_nan: bool | None = _Unset, max_digits: int | None = _Unset, decimal_places: int | None = _Unset, min_length: int | None = _Unset, max_length: int | None = _Unset, union_mode: Literal['smart', 'left_to_right'] = _Unset, fail_fast: bool | None = _Unset, **extra: Unpack[_EmptyKwargs], ) -> Any: """!!! abstract "Usage Documentation" [Fields](../concepts/fields.md) Create a field for objects that can be configured. Used to provide extra information about a field, either for the model schema or complex validation. Some arguments apply only to number fields (`int`, `float`, `Decimal`) and some apply only to `str`. Note: - Any `_Unset` objects will be replaced by the corresponding value defined in the `_DefaultValues` dictionary. If a key for the `_Unset` object is not found in the `_DefaultValues` dictionary, it will default to `None` Args: default: Default value if the field is not set. default_factory: A callable to generate the default value. The callable can either take 0 arguments (in which case it is called as is) or a single argument containing the already validated data. alias: The name to use for the attribute when validating or serializing by alias. This is often used for things like converting between snake and camel case. alias_priority: Priority of the alias. This affects whether an alias generator is used. validation_alias: Like `alias`, but only affects validation, not serialization. serialization_alias: Like `alias`, but only affects serialization, not validation. title: Human-readable title. field_title_generator: A callable that takes a field name and returns title for it. description: Human-readable description. examples: Example values for this field. exclude: Whether to exclude the field from the model serialization. discriminator: Field name or Discriminator for discriminating the type in a tagged union. deprecated: A deprecation message, an instance of `warnings.deprecated` or the `typing_extensions.deprecated` backport, or a boolean. If `True`, a default deprecation message will be emitted when accessing the field. json_schema_extra: A dict or callable to provide extra JSON schema properties. frozen: Whether the field is frozen. 
If true, attempts to change the value on an instance will raise an error. validate_default: If `True`, apply validation to the default value every time you create an instance. Otherwise, for performance reasons, the default value of the field is trusted and not validated. repr: A boolean indicating whether to include the field in the `__repr__` output. init: Whether the field should be included in the constructor of the dataclass. (Only applies to dataclasses.) init_var: Whether the field should _only_ be included in the constructor of the dataclass. (Only applies to dataclasses.) kw_only: Whether the field should be a keyword-only argument in the constructor of the dataclass. (Only applies to dataclasses.) coerce_numbers_to_str: Whether to enable coercion of any `Number` type to `str` (not applicable in `strict` mode). strict: If `True`, strict validation is applied to the field. See [Strict Mode](../concepts/strict_mode.md) for details. gt: Greater than. If set, value must be greater than this. Only applicable to numbers. ge: Greater than or equal. If set, value must be greater than or equal to this. Only applicable to numbers. lt: Less than. If set, value must be less than this. Only applicable to numbers. le: Less than or equal. If set, value must be less than or equal to this. Only applicable to numbers. multiple_of: Value must be a multiple of this. Only applicable to numbers. min_length: Minimum length for iterables. max_length: Maximum length for iterables. pattern: Pattern for strings (a regular expression). allow_inf_nan: Allow `inf`, `-inf`, `nan`. Only applicable to float and [`Decimal`][decimal.Decimal] numbers. max_digits: Maximum number of allowed digits. Only applicable to [`Decimal`][decimal.Decimal] numbers. decimal_places: Maximum number of decimal places allowed for numbers. union_mode: The strategy to apply when validating a union. Can be `smart` (the default), or `left_to_right`. See [Union Mode](../concepts/unions.md#union-modes) for details. fail_fast: If `True`, validation will stop on the first error. If `False`, all validation errors will be collected. This option can be applied only to iterable types (list, tuple, set, and frozenset). extra: (Deprecated) Extra fields that will be included in the JSON schema. !!! warning Deprecated The `extra` kwargs are deprecated. Use `json_schema_extra` instead. Returns: A new [`FieldInfo`][pydantic.fields.FieldInfo]. The return annotation is `Any` so `Field` can be used on type-annotated fields without causing a type error. """ # Check deprecated and removed params from V1. This logic should eventually be removed.
const = extra.pop('const', None) # type: ignore if const is not None: raise PydanticUserError('`const` is removed, use `Literal` instead', code='removed-kwargs') min_items = extra.pop('min_items', None) # type: ignore if min_items is not None: warn('`min_items` is deprecated and will be removed, use `min_length` instead', DeprecationWarning) if min_length in (None, _Unset): min_length = min_items # type: ignore max_items = extra.pop('max_items', None) # type: ignore if max_items is not None: warn('`max_items` is deprecated and will be removed, use `max_length` instead', DeprecationWarning) if max_length in (None, _Unset): max_length = max_items # type: ignore unique_items = extra.pop('unique_items', None) # type: ignore if unique_items is not None: raise PydanticUserError( ( '`unique_items` is removed, use `Set` instead' '(this feature is discussed in https://github.com/pydantic/pydantic-core/issues/296)' ), code='removed-kwargs', ) allow_mutation = extra.pop('allow_mutation', None) # type: ignore if allow_mutation is not None: warn('`allow_mutation` is deprecated and will be removed. use `frozen` instead', DeprecationWarning) if allow_mutation is False: frozen = True regex = extra.pop('regex', None) # type: ignore if regex is not None: raise PydanticUserError('`regex` is removed. use `pattern` instead', code='removed-kwargs') if extra: warn( 'Using extra keyword arguments on `Field` is deprecated and will be removed.' ' Use `json_schema_extra` instead.' f' (Extra keys: {", ".join(k.__repr__() for k in extra.keys())})', DeprecationWarning, ) if not json_schema_extra or json_schema_extra is _Unset: json_schema_extra = extra # type: ignore if ( validation_alias and validation_alias is not _Unset and not isinstance(validation_alias, (str, AliasChoices, AliasPath)) ): raise TypeError('Invalid `validation_alias` type. it should be `str`, `AliasChoices`, or `AliasPath`') if serialization_alias in (_Unset, None) and isinstance(alias, str): serialization_alias = alias if validation_alias in (_Unset, None): validation_alias = alias include = extra.pop('include', None) # type: ignore if include is not None: warn('`include` is deprecated and does nothing. It will be removed, use `exclude` instead', DeprecationWarning) return FieldInfo.from_field( default, default_factory=default_factory, alias=alias, alias_priority=alias_priority, validation_alias=validation_alias, serialization_alias=serialization_alias, title=title, field_title_generator=field_title_generator, description=description, examples=examples, exclude=exclude, discriminator=discriminator, deprecated=deprecated, json_schema_extra=json_schema_extra, frozen=frozen, pattern=pattern, validate_default=validate_default, repr=repr, init=init, init_var=init_var, kw_only=kw_only, coerce_numbers_to_str=coerce_numbers_to_str, strict=strict, gt=gt, ge=ge, lt=lt, le=le, multiple_of=multiple_of, min_length=min_length, max_length=max_length, allow_inf_nan=allow_inf_nan, max_digits=max_digits, decimal_places=decimal_places, union_mode=union_mode, fail_fast=fail_fast, ) ``` ## FieldInfo ```python FieldInfo(**kwargs: Unpack[_FieldInfoInputs]) ``` Bases: `Representation` This class holds information about a field. `FieldInfo` is used for any field definition regardless of whether the Field() function is explicitly used. Warning You generally shouldn't be creating `FieldInfo` directly, you'll only need to use it when accessing BaseModel `.model_fields` internals. 
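For example, reading field metadata off a model class looks like this (a minimal sketch; the `User` model is invented for the example):

```python
from pydantic import BaseModel, Field


class User(BaseModel):
    name: str = Field(description='The full name of the user')
    age: int = 0


# `model_fields` maps each field name to the `FieldInfo`
# instance Pydantic built from the annotation and `Field()` call:
name_info = User.model_fields['name']
print(name_info.description)
#> The full name of the user
print(name_info.is_required(), User.model_fields['age'].is_required())
#> True False
```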
Attributes: | Name | Type | Description | | --- | --- | --- | | `annotation` | `type[Any] | None` | The type annotation of the field. | | `default` | `Any` | The default value of the field. | | `default_factory` | `Callable[[], Any] | Callable[[dict[str, Any]], Any] | None` | A callable to generate the default value. The callable can either take 0 arguments (in which case it is called as is) or a single argument containing the already validated data. | | `alias` | `str | None` | The alias name of the field. | | `alias_priority` | `int | None` | The priority of the field's alias. | | `validation_alias` | `str | AliasPath | AliasChoices | None` | The validation alias of the field. | | `serialization_alias` | `str | None` | The serialization alias of the field. | | `title` | `str | None` | The title of the field. | | `field_title_generator` | `Callable[[str, FieldInfo], str] | None` | A callable that takes a field name and returns title for it. | | `description` | `str | None` | The description of the field. | | `examples` | `list[Any] | None` | List of examples of the field. | | `exclude` | `bool | None` | Whether to exclude the field from the model serialization. | | `discriminator` | `str | Discriminator | None` | Field name or Discriminator for discriminating the type in a tagged union. | | `deprecated` | `Deprecated | str | bool | None` | A deprecation message, an instance of warnings.deprecated or the typing_extensions.deprecated backport, or a boolean. If True, a default deprecation message will be emitted when accessing the field. | | `json_schema_extra` | `JsonDict | Callable[[JsonDict], None] | None` | A dict or callable to provide extra JSON schema properties. | | `frozen` | `bool | None` | Whether the field is frozen. | | `validate_default` | `bool | None` | Whether to validate the default value of the field. | | `repr` | `bool` | Whether to include the field in representation of the model. | | `init` | `bool | None` | Whether the field should be included in the constructor of the dataclass. | | `init_var` | `bool | None` | Whether the field should only be included in the constructor of the dataclass, and not stored. | | `kw_only` | `bool | None` | Whether the field should be a keyword-only argument in the constructor of the dataclass. | | `metadata` | `list[Any]` | List of metadata constraints. | See the signature of `pydantic.fields.Field` for more details about the expected arguments. Source code in `pydantic/fields.py` ```python def __init__(self, **kwargs: Unpack[_FieldInfoInputs]) -> None: """This class should generally not be initialized directly; instead, use the `pydantic.fields.Field` function or one of the constructor classmethods. See the signature of `pydantic.fields.Field` for more details about the expected arguments. 
""" self._attributes_set = {k: v for k, v in kwargs.items() if v is not _Unset} kwargs = {k: _DefaultValues.get(k) if v is _Unset else v for k, v in kwargs.items()} # type: ignore self.annotation = kwargs.get('annotation') default = kwargs.pop('default', PydanticUndefined) if default is Ellipsis: self.default = PydanticUndefined self._attributes_set.pop('default', None) else: self.default = default self.default_factory = kwargs.pop('default_factory', None) if self.default is not PydanticUndefined and self.default_factory is not None: raise TypeError('cannot specify both default and default_factory') self.alias = kwargs.pop('alias', None) self.validation_alias = kwargs.pop('validation_alias', None) self.serialization_alias = kwargs.pop('serialization_alias', None) alias_is_set = any(alias is not None for alias in (self.alias, self.validation_alias, self.serialization_alias)) self.alias_priority = kwargs.pop('alias_priority', None) or 2 if alias_is_set else None self.title = kwargs.pop('title', None) self.field_title_generator = kwargs.pop('field_title_generator', None) self.description = kwargs.pop('description', None) self.examples = kwargs.pop('examples', None) self.exclude = kwargs.pop('exclude', None) self.discriminator = kwargs.pop('discriminator', None) # For compatibility with FastAPI<=0.110.0, we preserve the existing value if it is not overridden self.deprecated = kwargs.pop('deprecated', getattr(self, 'deprecated', None)) self.repr = kwargs.pop('repr', True) self.json_schema_extra = kwargs.pop('json_schema_extra', None) self.validate_default = kwargs.pop('validate_default', None) self.frozen = kwargs.pop('frozen', None) # currently only used on dataclasses self.init = kwargs.pop('init', None) self.init_var = kwargs.pop('init_var', None) self.kw_only = kwargs.pop('kw_only', None) self.metadata = self._collect_metadata(kwargs) # type: ignore # Private attributes: self._qualifiers: set[Qualifier] = set() # Used to rebuild FieldInfo instances: self._complete = True self._original_annotation: Any = PydanticUndefined self._original_assignment: Any = PydanticUndefined ``` ### from_field ```python from_field( default: Any = PydanticUndefined, **kwargs: Unpack[_FromFieldInfoInputs] ) -> FieldInfo ``` Create a new `FieldInfo` object with the `Field` function. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `default` | `Any` | The default value for the field. Defaults to Undefined. | `PydanticUndefined` | | `**kwargs` | `Unpack[_FromFieldInfoInputs]` | Additional arguments dictionary. | `{}` | Raises: | Type | Description | | --- | --- | | `TypeError` | If 'annotation' is passed as a keyword argument. | Returns: | Type | Description | | --- | --- | | `FieldInfo` | A new FieldInfo object with the given parameters. | Example This is how you can create a field with default value like this: ```python import pydantic class MyModel(pydantic.BaseModel): foo: int = pydantic.Field(4) ``` Source code in `pydantic/fields.py` ````python @staticmethod def from_field(default: Any = PydanticUndefined, **kwargs: Unpack[_FromFieldInfoInputs]) -> FieldInfo: """Create a new `FieldInfo` object with the `Field` function. Args: default: The default value for the field. Defaults to Undefined. **kwargs: Additional arguments dictionary. Raises: TypeError: If 'annotation' is passed as a keyword argument. Returns: A new FieldInfo object with the given parameters. 
Example: This is how you can create a field with default value like this: ```python import pydantic class MyModel(pydantic.BaseModel): foo: int = pydantic.Field(4) ``` """ if 'annotation' in kwargs: raise TypeError('"annotation" is not permitted as a Field keyword argument') return FieldInfo(default=default, **kwargs) ```` ### from_annotation ```python from_annotation( annotation: type[Any], *, _source: AnnotationSource = ANY ) -> FieldInfo ``` Creates a `FieldInfo` instance from a bare annotation. This function is used internally to create a `FieldInfo` from a bare annotation like this: ```python import pydantic class MyModel(pydantic.BaseModel): foo: int # <-- like this ``` We also account for the case where the annotation can be an instance of `Annotated` and where one of the (not first) arguments in `Annotated` is an instance of `FieldInfo`, e.g.: ```python from typing import Annotated import annotated_types import pydantic class MyModel(pydantic.BaseModel): foo: Annotated[int, annotated_types.Gt(42)] bar: Annotated[int, pydantic.Field(gt=42)] ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `annotation` | `type[Any]` | An annotation object. | *required* | Returns: | Type | Description | | --- | --- | | `FieldInfo` | An instance of the field metadata. | Source code in `pydantic/fields.py` ````python @staticmethod def from_annotation(annotation: type[Any], *, _source: AnnotationSource = AnnotationSource.ANY) -> FieldInfo: """Creates a `FieldInfo` instance from a bare annotation. This function is used internally to create a `FieldInfo` from a bare annotation like this: ```python import pydantic class MyModel(pydantic.BaseModel): foo: int # <-- like this ``` We also account for the case where the annotation can be an instance of `Annotated` and where one of the (not first) arguments in `Annotated` is an instance of `FieldInfo`, e.g.: ```python from typing import Annotated import annotated_types import pydantic class MyModel(pydantic.BaseModel): foo: Annotated[int, annotated_types.Gt(42)] bar: Annotated[int, pydantic.Field(gt=42)] ``` Args: annotation: An annotation object. Returns: An instance of the field metadata. """ try: inspected_ann = inspect_annotation( annotation, annotation_source=_source, unpack_type_aliases='skip', ) except ForbiddenQualifier as e: raise PydanticForbiddenQualifier(e.qualifier, annotation) # TODO check for classvar and error? # No assigned value, this happens when using a bare `Final` qualifier (also for other # qualifiers, but they shouldn't appear here). In this case we infer the type as `Any` # because we don't have any assigned value. type_expr: Any = Any if inspected_ann.type is UNKNOWN else inspected_ann.type final = 'final' in inspected_ann.qualifiers metadata = inspected_ann.metadata if not metadata: # No metadata, e.g. `field: int`, or `field: Final[str]`: field_info = FieldInfo(annotation=type_expr, frozen=final or None) field_info._qualifiers = inspected_ann.qualifiers return field_info # With metadata, e.g. 
`field: Annotated[int, Field(...), Gt(1)]`: field_info_annotations = [a for a in metadata if isinstance(a, FieldInfo)] field_info = FieldInfo.merge_field_infos(*field_info_annotations, annotation=type_expr) new_field_info = copy(field_info) new_field_info.annotation = type_expr new_field_info.frozen = final or field_info.frozen field_metadata: list[Any] = [] for a in metadata: if typing_objects.is_deprecated(a): new_field_info.deprecated = a.message elif not isinstance(a, FieldInfo): field_metadata.append(a) else: field_metadata.extend(a.metadata) new_field_info.metadata = field_metadata new_field_info._qualifiers = inspected_ann.qualifiers return new_field_info ```` ### from_annotated_attribute ```python from_annotated_attribute( annotation: type[Any], default: Any, *, _source: AnnotationSource = ANY ) -> FieldInfo ``` Create `FieldInfo` from an annotation with a default value. This is used in cases like the following: ```python from typing import Annotated import annotated_types import pydantic class MyModel(pydantic.BaseModel): foo: int = 4 # <-- like this bar: Annotated[int, annotated_types.Gt(4)] = 4 # <-- or this spam: Annotated[int, pydantic.Field(gt=4)] = 4 # <-- or this ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `annotation` | `type[Any]` | The type annotation of the field. | *required* | | `default` | `Any` | The default value of the field. | *required* | Returns: | Type | Description | | --- | --- | | `FieldInfo` | A field object with the passed values. | Source code in `pydantic/fields.py` ````python @staticmethod def from_annotated_attribute( annotation: type[Any], default: Any, *, _source: AnnotationSource = AnnotationSource.ANY ) -> FieldInfo: """Create `FieldInfo` from an annotation with a default value. This is used in cases like the following: ```python from typing import Annotated import annotated_types import pydantic class MyModel(pydantic.BaseModel): foo: int = 4 # <-- like this bar: Annotated[int, annotated_types.Gt(4)] = 4 # <-- or this spam: Annotated[int, pydantic.Field(gt=4)] = 4 # <-- or this ``` Args: annotation: The type annotation of the field. default: The default value of the field. Returns: A field object with the passed values. """ if annotation is default: raise PydanticUserError( 'Error when building FieldInfo from annotated attribute. ' "Make sure you don't have any field name clashing with a type annotation.", code='unevaluable-type-annotation', ) try: inspected_ann = inspect_annotation( annotation, annotation_source=_source, unpack_type_aliases='skip', ) except ForbiddenQualifier as e: raise PydanticForbiddenQualifier(e.qualifier, annotation) # TODO check for classvar and error? # TODO infer from the default, this can be done in v3 once we treat final fields with # a default as proper fields and not class variables: type_expr: Any = Any if inspected_ann.type is UNKNOWN else inspected_ann.type final = 'final' in inspected_ann.qualifiers metadata = inspected_ann.metadata if isinstance(default, FieldInfo): # e.g. `field: int = Field(...)` default.annotation = type_expr default.metadata += metadata merged_default = FieldInfo.merge_field_infos( *[x for x in metadata if isinstance(x, FieldInfo)], default, annotation=default.annotation, ) merged_default.frozen = final or merged_default.frozen merged_default._qualifiers = inspected_ann.qualifiers return merged_default if isinstance(default, dataclasses.Field): # `collect_dataclass_fields()` passes the dataclass Field as a default. 
pydantic_field = FieldInfo._from_dataclass_field(default) pydantic_field.annotation = type_expr pydantic_field.metadata += metadata pydantic_field = FieldInfo.merge_field_infos( *[x for x in metadata if isinstance(x, FieldInfo)], pydantic_field, annotation=pydantic_field.annotation, ) pydantic_field.frozen = final or pydantic_field.frozen pydantic_field.init_var = 'init_var' in inspected_ann.qualifiers pydantic_field.init = getattr(default, 'init', None) pydantic_field.kw_only = getattr(default, 'kw_only', None) pydantic_field._qualifiers = inspected_ann.qualifiers return pydantic_field if not metadata: # No metadata, e.g. `field: int = ...`, or `field: Final[str] = ...`: field_info = FieldInfo(annotation=type_expr, default=default, frozen=final or None) field_info._qualifiers = inspected_ann.qualifiers return field_info # With metadata, e.g. `field: Annotated[int, Field(...), Gt(1)] = ...`: field_infos = [a for a in metadata if isinstance(a, FieldInfo)] field_info = FieldInfo.merge_field_infos(*field_infos, annotation=type_expr, default=default) field_metadata: list[Any] = [] for a in metadata: if typing_objects.is_deprecated(a): field_info.deprecated = a.message elif not isinstance(a, FieldInfo): field_metadata.append(a) else: field_metadata.extend(a.metadata) field_info.metadata = field_metadata field_info._qualifiers = inspected_ann.qualifiers return field_info ```` ### merge_field_infos ```python merge_field_infos( *field_infos: FieldInfo, **overrides: Any ) -> FieldInfo ``` Merge `FieldInfo` instances keeping only explicitly set attributes. Later `FieldInfo` instances override earlier ones. Returns: | Name | Type | Description | | --- | --- | --- | | `FieldInfo` | `FieldInfo` | A merged FieldInfo instance. | Source code in `pydantic/fields.py` ```python @staticmethod def merge_field_infos(*field_infos: FieldInfo, **overrides: Any) -> FieldInfo: """Merge `FieldInfo` instances keeping only explicitly set attributes. Later `FieldInfo` instances override earlier ones. Returns: FieldInfo: A merged FieldInfo instance. """ if len(field_infos) == 1: # No merging necessary, but we still need to make a copy and apply the overrides field_info = copy(field_infos[0]) field_info._attributes_set.update(overrides) default_override = overrides.pop('default', PydanticUndefined) if default_override is Ellipsis: default_override = PydanticUndefined if default_override is not PydanticUndefined: field_info.default = default_override for k, v in overrides.items(): setattr(field_info, k, v) return field_info # type: ignore merged_field_info_kwargs: dict[str, Any] = {} metadata = {} for field_info in field_infos: attributes_set = field_info._attributes_set.copy() try: json_schema_extra = attributes_set.pop('json_schema_extra') existing_json_schema_extra = merged_field_info_kwargs.get('json_schema_extra') if existing_json_schema_extra is None: merged_field_info_kwargs['json_schema_extra'] = json_schema_extra if isinstance(existing_json_schema_extra, dict): if isinstance(json_schema_extra, dict): merged_field_info_kwargs['json_schema_extra'] = { **existing_json_schema_extra, **json_schema_extra, } if callable(json_schema_extra): warn( 'Composing `dict` and `callable` type `json_schema_extra` is not supported.' 'The `callable` type is being ignored.' 
"If you'd like support for this behavior, please open an issue on pydantic.", PydanticJsonSchemaWarning, ) elif callable(json_schema_extra): # if ever there's a case of a callable, we'll just keep the last json schema extra spec merged_field_info_kwargs['json_schema_extra'] = json_schema_extra except KeyError: pass # later FieldInfo instances override everything except json_schema_extra from earlier FieldInfo instances merged_field_info_kwargs.update(attributes_set) for x in field_info.metadata: if not isinstance(x, FieldInfo): metadata[type(x)] = x merged_field_info_kwargs.update(overrides) field_info = FieldInfo(**merged_field_info_kwargs) field_info.metadata = list(metadata.values()) return field_info ``` ### deprecation_message ```python deprecation_message: str | None ``` The deprecation message to be emitted, or `None` if not set. ### default_factory_takes_validated_data ```python default_factory_takes_validated_data: bool | None ``` Whether the provided default factory callable has a validated data parameter. Returns `None` if no default factory is set. ### get_default ```python get_default( *, call_default_factory: Literal[True], validated_data: dict[str, Any] | None = None ) -> Any ``` ```python get_default( *, call_default_factory: Literal[False] = ... ) -> Any ``` ```python get_default( *, call_default_factory: bool = False, validated_data: dict[str, Any] | None = None ) -> Any ``` Get the default value. We expose an option for whether to call the default_factory (if present), as calling it may result in side effects that we want to avoid. However, there are times when it really should be called (namely, when instantiating a model via `model_construct`). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `call_default_factory` | `bool` | Whether to call the default factory or not. | `False` | | `validated_data` | `dict[str, Any] | None` | The already validated data to be passed to the default factory. | `None` | Returns: | Type | Description | | --- | --- | | `Any` | The default value, calling the default factory if requested or None if not set. | Source code in `pydantic/fields.py` ```python def get_default(self, *, call_default_factory: bool = False, validated_data: dict[str, Any] | None = None) -> Any: """Get the default value. We expose an option for whether to call the default_factory (if present), as calling it may result in side effects that we want to avoid. However, there are times when it really should be called (namely, when instantiating a model via `model_construct`). Args: call_default_factory: Whether to call the default factory or not. validated_data: The already validated data to be passed to the default factory. Returns: The default value, calling the default factory if requested or `None` if not set. """ if self.default_factory is None: return _utils.smart_deepcopy(self.default) elif call_default_factory: if self.default_factory_takes_validated_data: fac = cast('Callable[[dict[str, Any]], Any]', self.default_factory) if validated_data is None: raise ValueError( "The default factory requires the 'validated_data' argument, which was not provided when calling 'get_default'." ) return fac(validated_data) else: fac = cast('Callable[[], Any]', self.default_factory) return fac() else: return None ``` ### is_required ```python is_required() -> bool ``` Check if the field is required (i.e., does not have a default value or factory). Returns: | Type | Description | | --- | --- | | `bool` | True if the field is required, False otherwise. 
| Source code in `pydantic/fields.py` ```python def is_required(self) -> bool: """Check if the field is required (i.e., does not have a default value or factory). Returns: `True` if the field is required, `False` otherwise. """ return self.default is PydanticUndefined and self.default_factory is None ``` ### rebuild_annotation ```python rebuild_annotation() -> Any ``` Attempts to rebuild the original annotation for use in function signatures. If metadata is present, it adds it to the original annotation using `Annotated`. Otherwise, it returns the original annotation as-is. Note that because the metadata has been flattened, the original annotation may not be reconstructed exactly as originally provided, e.g. if the original type had unrecognized annotations, or was annotated with a call to `pydantic.Field`. Returns: | Type | Description | | --- | --- | | `Any` | The rebuilt annotation. | Source code in `pydantic/fields.py` ```python def rebuild_annotation(self) -> Any: """Attempts to rebuild the original annotation for use in function signatures. If metadata is present, it adds it to the original annotation using `Annotated`. Otherwise, it returns the original annotation as-is. Note that because the metadata has been flattened, the original annotation may not be reconstructed exactly as originally provided, e.g. if the original type had unrecognized annotations, or was annotated with a call to `pydantic.Field`. Returns: The rebuilt annotation. """ if not self.metadata: return self.annotation else: # Annotated arguments must be a tuple return Annotated[(self.annotation, *self.metadata)] # type: ignore ``` ### apply_typevars_map ```python apply_typevars_map( typevars_map: Mapping[TypeVar, Any] | None, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> None ``` Apply a `typevars_map` to the annotation. This method is used when analyzing parametrized generic types to replace typevars with their concrete types. This method applies the `typevars_map` to the annotation in place. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `typevars_map` | `Mapping[TypeVar, Any] | None` | A dictionary mapping type variables to their concrete types. | *required* | | `globalns` | `GlobalsNamespace | None` | The globals namespace to use during type annotation evaluation. | `None` | | `localns` | `MappingNamespace | None` | The locals namespace to use during type annotation evaluation. | `None` | See Also pydantic.\_internal.\_generics.replace_types is used for replacing the typevars with their concrete types. Source code in `pydantic/fields.py` ```python def apply_typevars_map( self, typevars_map: Mapping[TypeVar, Any] | None, globalns: GlobalsNamespace | None = None, localns: MappingNamespace | None = None, ) -> None: """Apply a `typevars_map` to the annotation. This method is used when analyzing parametrized generic types to replace typevars with their concrete types. This method applies the `typevars_map` to the annotation in place. Args: typevars_map: A dictionary mapping type variables to their concrete types. globalns: The globals namespace to use during type annotation evaluation. localns: The locals namespace to use during type annotation evaluation. See Also: pydantic._internal._generics.replace_types is used for replacing the typevars with their concrete types. 
""" annotation, _ = _typing_extra.try_eval_type(self.annotation, globalns, localns) self.annotation = _generics.replace_types(annotation, typevars_map) ``` ## PrivateAttr ```python PrivateAttr( default: _T, *, init: Literal[False] = False ) -> _T ``` ```python PrivateAttr( *, default_factory: Callable[[], _T], init: Literal[False] = False ) -> _T ``` ```python PrivateAttr(*, init: Literal[False] = False) -> Any ``` ```python PrivateAttr( default: Any = PydanticUndefined, *, default_factory: Callable[[], Any] | None = None, init: Literal[False] = False ) -> Any ``` Usage Documentation [Private Model Attributes](../../concepts/models/#private-model-attributes) Indicates that an attribute is intended for private use and not handled during normal validation/serialization. Private attributes are not validated by Pydantic, so it's up to you to ensure they are used in a type-safe manner. Private attributes are stored in `__private_attributes__` on the model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `default` | `Any` | The attribute's default value. Defaults to Undefined. | `PydanticUndefined` | | `default_factory` | `Callable[[], Any] | None` | Callable that will be called when a default value is needed for this attribute. If both default and default_factory are set, an error will be raised. | `None` | | `init` | `Literal[False]` | Whether the attribute should be included in the constructor of the dataclass. Always False. | `False` | Returns: | Type | Description | | --- | --- | | `Any` | An instance of ModelPrivateAttr class. | Raises: | Type | Description | | --- | --- | | `ValueError` | If both default and default_factory are set. | Source code in `pydantic/fields.py` ```python def PrivateAttr( default: Any = PydanticUndefined, *, default_factory: Callable[[], Any] | None = None, init: Literal[False] = False, ) -> Any: """!!! abstract "Usage Documentation" [Private Model Attributes](../concepts/models.md#private-model-attributes) Indicates that an attribute is intended for private use and not handled during normal validation/serialization. Private attributes are not validated by Pydantic, so it's up to you to ensure they are used in a type-safe manner. Private attributes are stored in `__private_attributes__` on the model. Args: default: The attribute's default value. Defaults to Undefined. default_factory: Callable that will be called when a default value is needed for this attribute. If both `default` and `default_factory` are set, an error will be raised. init: Whether the attribute should be included in the constructor of the dataclass. Always `False`. Returns: An instance of [`ModelPrivateAttr`][pydantic.fields.ModelPrivateAttr] class. Raises: ValueError: If both `default` and `default_factory` are set. """ if default is not PydanticUndefined and default_factory is not None: raise TypeError('cannot specify both default and default_factory') return ModelPrivateAttr( default, default_factory=default_factory, ) ``` ## ModelPrivateAttr ```python ModelPrivateAttr( default: Any = PydanticUndefined, *, default_factory: Callable[[], Any] | None = None ) ``` Bases: `Representation` A descriptor for private attributes in class models. Warning You generally shouldn't be creating `ModelPrivateAttr` instances directly, instead use `pydantic.fields.PrivateAttr`. (This is similar to `FieldInfo` vs. `Field`.) Attributes: | Name | Type | Description | | --- | --- | --- | | `default` | | The default value of the attribute if not provided. 
| | `default_factory` | | A callable function that generates the default value of the attribute if not provided. | Source code in `pydantic/fields.py` ```python def __init__( self, default: Any = PydanticUndefined, *, default_factory: typing.Callable[[], Any] | None = None ) -> None: if default is Ellipsis: self.default = PydanticUndefined else: self.default = default self.default_factory = default_factory ``` ### get_default ```python get_default() -> Any ``` Retrieve the default value of the object. If `self.default_factory` is `None`, the method will return a deep copy of the `self.default` object. If `self.default_factory` is not `None`, it will call `self.default_factory` and return the value returned. Returns: | Type | Description | | --- | --- | | `Any` | The default value of the object. | Source code in `pydantic/fields.py` ```python def get_default(self) -> Any: """Retrieve the default value of the object. If `self.default_factory` is `None`, the method will return a deep copy of the `self.default` object. If `self.default_factory` is not `None`, it will call `self.default_factory` and return the value returned. Returns: The default value of the object. """ return _utils.smart_deepcopy(self.default) if self.default_factory is None else self.default_factory() ``` ## computed_field ```python computed_field(func: PropertyT) -> PropertyT ``` ```python computed_field( *, alias: str | None = None, alias_priority: int | None = None, title: str | None = None, field_title_generator: ( Callable[[str, ComputedFieldInfo], str] | None ) = None, description: str | None = None, deprecated: Deprecated | str | bool | None = None, examples: list[Any] | None = None, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ) = None, repr: bool = True, return_type: Any = PydanticUndefined ) -> Callable[[PropertyT], PropertyT] ``` ```python computed_field( func: PropertyT | None = None, /, *, alias: str | None = None, alias_priority: int | None = None, title: str | None = None, field_title_generator: ( Callable[[str, ComputedFieldInfo], str] | None ) = None, description: str | None = None, deprecated: Deprecated | str | bool | None = None, examples: list[Any] | None = None, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ) = None, repr: bool | None = None, return_type: Any = PydanticUndefined, ) -> PropertyT | Callable[[PropertyT], PropertyT] ``` Usage Documentation [The `computed_field` decorator](../../concepts/fields/#the-computed_field-decorator) Decorator to include `property` and `cached_property` when serializing models or dataclasses. This is useful for fields that are computed from other fields, or for fields that are expensive to compute and should be cached. ```python from pydantic import BaseModel, computed_field class Rectangle(BaseModel): width: int length: int @computed_field @property def area(self) -> int: return self.width * self.length print(Rectangle(width=3, length=2).model_dump()) #> {'width': 3, 'length': 2, 'area': 6} ``` If applied to functions not yet decorated with `@property` or `@cached_property`, the function is automatically wrapped with `property`. Although this is more concise, you will lose IntelliSense in your IDE and confuse static type checkers; explicit use of `@property` is therefore recommended. Mypy Warning Even with the `@property` or `@cached_property` applied to your function before `@computed_field`, mypy may throw a `Decorated property not supported` error. 
See [mypy issue #1362](https://github.com/python/mypy/issues/1362) for more information. To avoid this error message, add `# type: ignore[prop-decorator]` to the `@computed_field` line. [pyright](https://github.com/microsoft/pyright) supports `@computed_field` without error. ```python import random from pydantic import BaseModel, computed_field class Square(BaseModel): width: float @computed_field def area(self) -> float: # converted to a `property` by `computed_field` return round(self.width**2, 2) @area.setter def area(self, new_area: float) -> None: self.width = new_area**0.5 @computed_field(alias='the magic number', repr=False) def random_number(self) -> int: return random.randint(0, 1_000) square = Square(width=1.3) # `random_number` does not appear in representation print(repr(square)) #> Square(width=1.3, area=1.69) print(square.random_number) #> 3 square.area = 4 print(square.model_dump_json(by_alias=True)) #> {"width":2.0,"area":4.0,"the magic number":3} ``` Overriding with `computed_field` You can't override a field from a parent class with a `computed_field` in the child class. `mypy` complains about this behavior if allowed, and `dataclasses` doesn't allow this pattern either. See the example below: ```python from pydantic import BaseModel, computed_field class Parent(BaseModel): a: str try: class Child(Parent): @computed_field @property def a(self) -> str: return 'new a' except TypeError as e: print(e) ''' Field 'a' of class 'Child' overrides symbol of same name in a parent class. This override with a computed_field is incompatible. ''' ``` Private properties decorated with `@computed_field` have `repr=False` by default. ```python from functools import cached_property from pydantic import BaseModel, computed_field class Model(BaseModel): foo: int @computed_field @cached_property def _private_cached_property(self) -> int: return -self.foo @computed_field @property def _private_property(self) -> int: return -self.foo m = Model(foo=1) print(repr(m)) #> Model(foo=1) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `func` | `PropertyT | None` | the function to wrap. | `None` | | `alias` | `str | None` | alias to use when serializing this computed field, only used when by_alias=True | `None` | | `alias_priority` | `int | None` | priority of the alias. This affects whether an alias generator is used | `None` | | `title` | `str | None` | Title to use when including this computed field in JSON Schema | `None` | | `field_title_generator` | `Callable[[str, ComputedFieldInfo], str] | None` | A callable that takes a field name and returns title for it. | `None` | | `description` | `str | None` | Description to use when including this computed field in JSON Schema, defaults to the function's docstring | `None` | | `deprecated` | `Deprecated | str | bool | None` | A deprecation message (or an instance of warnings.deprecated or the typing_extensions.deprecated backport) to be emitted when accessing the field, or a boolean. This will automatically be set if the property is decorated with the deprecated decorator. | `None` | | `examples` | `list[Any] | None` | Example values to use when including this computed field in JSON Schema | `None` | | `json_schema_extra` | `JsonDict | Callable[[JsonDict], None] | None` | A dict or callable to provide extra JSON schema properties. | `None` | | `repr` | `bool | None` | whether to include this computed field in model repr. Default is False for private properties and True for public properties. 
| `None` | | `return_type` | `Any` | optional return for serialization logic to expect when serializing to JSON, if included this must be correct, otherwise a TypeError is raised. If you don't include a return type Any is used, which does runtime introspection to handle arbitrary objects. | `PydanticUndefined` | Returns: | Type | Description | | --- | --- | | `PropertyT | Callable[[PropertyT], PropertyT]` | A proxy wrapper for the property. | Source code in `pydantic/fields.py` ````python def computed_field( func: PropertyT | None = None, /, *, alias: str | None = None, alias_priority: int | None = None, title: str | None = None, field_title_generator: typing.Callable[[str, ComputedFieldInfo], str] | None = None, description: str | None = None, deprecated: Deprecated | str | bool | None = None, examples: list[Any] | None = None, json_schema_extra: JsonDict | typing.Callable[[JsonDict], None] | None = None, repr: bool | None = None, return_type: Any = PydanticUndefined, ) -> PropertyT | typing.Callable[[PropertyT], PropertyT]: """!!! abstract "Usage Documentation" [The `computed_field` decorator](../concepts/fields.md#the-computed_field-decorator) Decorator to include `property` and `cached_property` when serializing models or dataclasses. This is useful for fields that are computed from other fields, or for fields that are expensive to compute and should be cached. ```python from pydantic import BaseModel, computed_field class Rectangle(BaseModel): width: int length: int @computed_field @property def area(self) -> int: return self.width * self.length print(Rectangle(width=3, length=2).model_dump()) #> {'width': 3, 'length': 2, 'area': 6} ``` If applied to functions not yet decorated with `@property` or `@cached_property`, the function is automatically wrapped with `property`. Although this is more concise, you will lose IntelliSense in your IDE, and confuse static type checkers, thus explicit use of `@property` is recommended. !!! warning "Mypy Warning" Even with the `@property` or `@cached_property` applied to your function before `@computed_field`, mypy may throw a `Decorated property not supported` error. See [mypy issue #1362](https://github.com/python/mypy/issues/1362), for more information. To avoid this error message, add `# type: ignore[prop-decorator]` to the `@computed_field` line. [pyright](https://github.com/microsoft/pyright) supports `@computed_field` without error. ```python import random from pydantic import BaseModel, computed_field class Square(BaseModel): width: float @computed_field def area(self) -> float: # converted to a `property` by `computed_field` return round(self.width**2, 2) @area.setter def area(self, new_area: float) -> None: self.width = new_area**0.5 @computed_field(alias='the magic number', repr=False) def random_number(self) -> int: return random.randint(0, 1_000) square = Square(width=1.3) # `random_number` does not appear in representation print(repr(square)) #> Square(width=1.3, area=1.69) print(square.random_number) #> 3 square.area = 4 print(square.model_dump_json(by_alias=True)) #> {"width":2.0,"area":4.0,"the magic number":3} ``` !!! warning "Overriding with `computed_field`" You can't override a field from a parent class with a `computed_field` in the child class. `mypy` complains about this behavior if allowed, and `dataclasses` doesn't allow this pattern either. 
See the example below: ```python from pydantic import BaseModel, computed_field class Parent(BaseModel): a: str try: class Child(Parent): @computed_field @property def a(self) -> str: return 'new a' except TypeError as e: print(e) ''' Field 'a' of class 'Child' overrides symbol of same name in a parent class. This override with a computed_field is incompatible. ''' ``` Private properties decorated with `@computed_field` have `repr=False` by default. ```python from functools import cached_property from pydantic import BaseModel, computed_field class Model(BaseModel): foo: int @computed_field @cached_property def _private_cached_property(self) -> int: return -self.foo @computed_field @property def _private_property(self) -> int: return -self.foo m = Model(foo=1) print(repr(m)) #> Model(foo=1) ``` Args: func: the function to wrap. alias: alias to use when serializing this computed field, only used when `by_alias=True` alias_priority: priority of the alias. This affects whether an alias generator is used title: Title to use when including this computed field in JSON Schema field_title_generator: A callable that takes a field name and returns title for it. description: Description to use when including this computed field in JSON Schema, defaults to the function's docstring deprecated: A deprecation message (or an instance of `warnings.deprecated` or the `typing_extensions.deprecated` backport). to be emitted when accessing the field. Or a boolean. This will automatically be set if the property is decorated with the `deprecated` decorator. examples: Example values to use when including this computed field in JSON Schema json_schema_extra: A dict or callable to provide extra JSON schema properties. repr: whether to include this computed field in model repr. Default is `False` for private properties and `True` for public properties. return_type: optional return for serialization logic to expect when serializing to JSON, if included this must be correct, otherwise a `TypeError` is raised. If you don't include a return type Any is used, which does runtime introspection to handle arbitrary objects. Returns: A proxy wrapper for the property. 
""" def dec(f: Any) -> Any: nonlocal description, deprecated, return_type, alias_priority unwrapped = _decorators.unwrap_wrapped_function(f) if description is None and unwrapped.__doc__: description = inspect.cleandoc(unwrapped.__doc__) if deprecated is None and hasattr(unwrapped, '__deprecated__'): deprecated = unwrapped.__deprecated__ # if the function isn't already decorated with `@property` (or another descriptor), then we wrap it now f = _decorators.ensure_property(f) alias_priority = (alias_priority or 2) if alias is not None else None if repr is None: repr_: bool = not _wrapped_property_is_private(property_=f) else: repr_ = repr dec_info = ComputedFieldInfo( f, return_type, alias, alias_priority, title, field_title_generator, description, deprecated, examples, json_schema_extra, repr_, ) return _decorators.PydanticDescriptorProxy(f, dec_info) if func is None: return dec else: return dec(func) ```` ## ComputedFieldInfo ```python ComputedFieldInfo( wrapped_property: property, return_type: Any, alias: str | None, alias_priority: int | None, title: str | None, field_title_generator: ( Callable[[str, ComputedFieldInfo], str] | None ), description: str | None, deprecated: Deprecated | str | bool | None, examples: list[Any] | None, json_schema_extra: ( JsonDict | Callable[[JsonDict], None] | None ), repr: bool, ) ``` A container for data from `@computed_field` so that we can access it while building the pydantic-core schema. Attributes: | Name | Type | Description | | --- | --- | --- | | `decorator_repr` | `str` | A class variable representing the decorator string, '@computed_field'. | | `wrapped_property` | `property` | The wrapped computed field property. | | `return_type` | `Any` | The type of the computed field property's return value. | | `alias` | `str | None` | The alias of the property to be used during serialization. | | `alias_priority` | `int | None` | The priority of the alias. This affects whether an alias generator is used. | | `title` | `str | None` | Title of the computed field to include in the serialization JSON schema. | | `field_title_generator` | `Callable[[str, ComputedFieldInfo], str] | None` | A callable that takes a field name and returns title for it. | | `description` | `str | None` | Description of the computed field to include in the serialization JSON schema. | | `deprecated` | `Deprecated | str | bool | None` | A deprecation message, an instance of warnings.deprecated or the typing_extensions.deprecated backport, or a boolean. If True, a default deprecation message will be emitted when accessing the field. | | `examples` | `list[Any] | None` | Example values of the computed field to include in the serialization JSON schema. | | `json_schema_extra` | `JsonDict | Callable[[JsonDict], None] | None` | A dict or callable to provide extra JSON schema properties. | | `repr` | `bool` | A boolean indicating whether to include the field in the repr output. | ### deprecation_message ```python deprecation_message: str | None ``` The deprecation message to be emitted, or `None` if not set. This module contains related classes and functions for serialization. ## FieldPlainSerializer ```python FieldPlainSerializer: TypeAlias = ( "core_schema.SerializerFunction | _Partial" ) ``` A field serializer method or function in `plain` mode. ## FieldWrapSerializer ```python FieldWrapSerializer: TypeAlias = ( "core_schema.WrapSerializerFunction | _Partial" ) ``` A field serializer method or function in `wrap` mode. 
## FieldSerializer ```python FieldSerializer: TypeAlias = ( "FieldPlainSerializer | FieldWrapSerializer" ) ``` A field serializer method or function. ## ModelPlainSerializerWithInfo ```python ModelPlainSerializerWithInfo: TypeAlias = Callable[ [Any, SerializationInfo], Any ] ``` A model serializer method with the `info` argument, in `plain` mode. ## ModelPlainSerializerWithoutInfo ```python ModelPlainSerializerWithoutInfo: TypeAlias = Callable[ [Any], Any ] ``` A model serializer method without the `info` argument, in `plain` mode. ## ModelPlainSerializer ```python ModelPlainSerializer: TypeAlias = ( "ModelPlainSerializerWithInfo | ModelPlainSerializerWithoutInfo" ) ``` A model serializer method in `plain` mode. ## ModelWrapSerializerWithInfo ```python ModelWrapSerializerWithInfo: TypeAlias = Callable[ [Any, SerializerFunctionWrapHandler, SerializationInfo], Any, ] ``` A model serializer method with the `info` argument, in `wrap` mode. ## ModelWrapSerializerWithoutInfo ```python ModelWrapSerializerWithoutInfo: TypeAlias = Callable[ [Any, SerializerFunctionWrapHandler], Any ] ``` A model serializer method without the `info` argument, in `wrap` mode. ## ModelWrapSerializer ```python ModelWrapSerializer: TypeAlias = ( "ModelWrapSerializerWithInfo | ModelWrapSerializerWithoutInfo" ) ``` A model serializer method in `wrap` mode. ## PlainSerializer ```python PlainSerializer( func: SerializerFunction, return_type: Any = PydanticUndefined, when_used: WhenUsed = "always", ) ``` Plain serializers use a function to modify the output of serialization. This is particularly helpful when you want to customize the serialization for annotated types. Consider an input of `list`, which will be serialized into a space-delimited string. ```python from typing import Annotated from pydantic import BaseModel, PlainSerializer CustomStr = Annotated[ list, PlainSerializer(lambda x: ' '.join(x), return_type=str) ] class StudentModel(BaseModel): courses: CustomStr student = StudentModel(courses=['Math', 'Chemistry', 'English']) print(student.model_dump()) #> {'courses': 'Math Chemistry English'} ``` Attributes: | Name | Type | Description | | --- | --- | --- | | `func` | `SerializerFunction` | The serializer function. | | `return_type` | `Any` | The return type for the function. If omitted it will be inferred from the type annotation. | | `when_used` | `WhenUsed` | Determines when this serializer should be used. Accepts a string with values 'always', 'unless-none', 'json', and 'json-unless-none'. Defaults to 'always'. | ## WrapSerializer ```python WrapSerializer( func: WrapSerializerFunction, return_type: Any = PydanticUndefined, when_used: WhenUsed = "always", ) ``` Wrap serializers receive the raw inputs along with a handler function that applies the standard serialization logic, and can modify the resulting value before returning it as the final output of serialization. For example, here's a scenario in which a wrap serializer transforms timezones to UTC **and** utilizes the existing `datetime` serialization logic. ```python from datetime import datetime, timezone from typing import Annotated, Any from pydantic import BaseModel, WrapSerializer class EventDatetime(BaseModel): start: datetime end: datetime def convert_to_utc(value: Any, handler, info) -> dict[str, datetime]: # Note that `handler` can actually help serialize the `value` for # further custom serialization in case it's a subclass. 
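# in JSON mode, `handler` has already serialized the nested datetimes to ISO 8601 strings, # which is why they are parsed back with `fromisoformat` before the UTC conversion below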
partial_result = handler(value, info) if info.mode == 'json': return { k: datetime.fromisoformat(v).astimezone(timezone.utc) for k, v in partial_result.items() } return {k: v.astimezone(timezone.utc) for k, v in partial_result.items()} UTCEventDatetime = Annotated[EventDatetime, WrapSerializer(convert_to_utc)] class EventModel(BaseModel): event_datetime: UTCEventDatetime dt = EventDatetime( start='2024-01-01T07:00:00-08:00', end='2024-01-03T20:00:00+06:00' ) event = EventModel(event_datetime=dt) print(event.model_dump()) ''' { 'event_datetime': { 'start': datetime.datetime( 2024, 1, 1, 15, 0, tzinfo=datetime.timezone.utc ), 'end': datetime.datetime( 2024, 1, 3, 14, 0, tzinfo=datetime.timezone.utc ), } } ''' print(event.model_dump_json()) ''' {"event_datetime":{"start":"2024-01-01T15:00:00Z","end":"2024-01-03T14:00:00Z"}} ''' ``` Attributes: | Name | Type | Description | | --- | --- | --- | | `func` | `WrapSerializerFunction` | The serializer function to be wrapped. | | `return_type` | `Any` | The return type for the function. If omitted it will be inferred from the type annotation. | | `when_used` | `WhenUsed` | Determines when this serializer should be used. Accepts a string with values 'always', 'unless-none', 'json', and 'json-unless-none'. Defaults to 'always'. | ## field_serializer ```python field_serializer( field: str, /, *fields: str, mode: Literal["wrap"], return_type: Any = ..., when_used: WhenUsed = ..., check_fields: bool | None = ..., ) -> Callable[ [_FieldWrapSerializerT], _FieldWrapSerializerT ] ``` ```python field_serializer( field: str, /, *fields: str, mode: Literal["plain"] = ..., return_type: Any = ..., when_used: WhenUsed = ..., check_fields: bool | None = ..., ) -> Callable[ [_FieldPlainSerializerT], _FieldPlainSerializerT ] ``` ```python field_serializer( *fields: str, mode: Literal["plain", "wrap"] = "plain", return_type: Any = PydanticUndefined, when_used: WhenUsed = "always", check_fields: bool | None = None ) -> ( Callable[[_FieldWrapSerializerT], _FieldWrapSerializerT] | Callable[ [_FieldPlainSerializerT], _FieldPlainSerializerT ] ) ``` Decorator that enables custom field serialization. In the below example, a field of type `set` is used to mitigate duplication. A `field_serializer` is used to serialize the data as a sorted list. ```python from typing import Set from pydantic import BaseModel, field_serializer class StudentModel(BaseModel): name: str = 'Jane' courses: Set[str] @field_serializer('courses', when_used='json') def serialize_courses_in_order(self, courses: Set[str]): return sorted(courses) student = StudentModel(courses={'Math', 'Chemistry', 'English'}) print(student.model_dump_json()) #> {"name":"Jane","courses":["Chemistry","English","Math"]} ``` See [Custom serializers](../../concepts/serialization/#custom-serializers) for more information. Four signatures are supported: - `(self, value: Any, info: FieldSerializationInfo)` - `(self, value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo)` - `(value: Any, info: SerializationInfo)` - `(value: Any, nxt: SerializerFunctionWrapHandler, info: SerializationInfo)` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `fields` | `str` | Which field(s) the method should be called on. | `()` | | `mode` | `Literal['plain', 'wrap']` | The serialization mode. plain means the function will be called instead of the default serialization logic, wrap means the function will be called with an argument to optionally call the default serialization logic. 
| `'plain'` | | `return_type` | `Any` | Optional return type for the function, if omitted it will be inferred from the type annotation. | `PydanticUndefined` | | `when_used` | `WhenUsed` | Determines when the serializer will be used for serialization. | `'always'` | | `check_fields` | `bool | None` | Whether to check that the fields actually exist on the model. | `None` | Returns: | Type | Description | | --- | --- | | `Callable[[_FieldWrapSerializerT], _FieldWrapSerializerT] | Callable[[_FieldPlainSerializerT], _FieldPlainSerializerT]` | The decorator function. | Source code in `pydantic/functional_serializers.py` ````python def field_serializer( *fields: str, mode: Literal['plain', 'wrap'] = 'plain', return_type: Any = PydanticUndefined, when_used: WhenUsed = 'always', check_fields: bool | None = None, ) -> ( Callable[[_FieldWrapSerializerT], _FieldWrapSerializerT] | Callable[[_FieldPlainSerializerT], _FieldPlainSerializerT] ): """Decorator that enables custom field serialization. In the below example, a field of type `set` is used to mitigate duplication. A `field_serializer` is used to serialize the data as a sorted list. ```python from typing import Set from pydantic import BaseModel, field_serializer class StudentModel(BaseModel): name: str = 'Jane' courses: Set[str] @field_serializer('courses', when_used='json') def serialize_courses_in_order(self, courses: Set[str]): return sorted(courses) student = StudentModel(courses={'Math', 'Chemistry', 'English'}) print(student.model_dump_json()) #> {"name":"Jane","courses":["Chemistry","English","Math"]} ``` See [Custom serializers](../concepts/serialization.md#custom-serializers) for more information. Four signatures are supported: - `(self, value: Any, info: FieldSerializationInfo)` - `(self, value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo)` - `(value: Any, info: SerializationInfo)` - `(value: Any, nxt: SerializerFunctionWrapHandler, info: SerializationInfo)` Args: fields: Which field(s) the method should be called on. mode: The serialization mode. - `plain` means the function will be called instead of the default serialization logic, - `wrap` means the function will be called with an argument to optionally call the default serialization logic. return_type: Optional return type for the function, if omitted it will be inferred from the type annotation. when_used: Determines when the serializer will be used for serialization. check_fields: Whether to check that the fields actually exist on the model. Returns: The decorator function. """ def dec(f: FieldSerializer) -> _decorators.PydanticDescriptorProxy[Any]: dec_info = _decorators.FieldSerializerDecoratorInfo( fields=fields, mode=mode, return_type=return_type, when_used=when_used, check_fields=check_fields, ) return _decorators.PydanticDescriptorProxy(f, dec_info) # pyright: ignore[reportArgumentType] return dec # pyright: ignore[reportReturnType] ```` ## model_serializer ```python model_serializer( f: _ModelPlainSerializerT, ) -> _ModelPlainSerializerT ``` ```python model_serializer( *, mode: Literal["wrap"], when_used: WhenUsed = "always", return_type: Any = ... ) -> Callable[ [_ModelWrapSerializerT], _ModelWrapSerializerT ] ``` ```python model_serializer( *, mode: Literal["plain"] = ..., when_used: WhenUsed = "always", return_type: Any = ... 
) -> Callable[ [_ModelPlainSerializerT], _ModelPlainSerializerT ] ``` ```python model_serializer( f: ( _ModelPlainSerializerT | _ModelWrapSerializerT | None ) = None, /, *, mode: Literal["plain", "wrap"] = "plain", when_used: WhenUsed = "always", return_type: Any = PydanticUndefined, ) -> ( _ModelPlainSerializerT | Callable[ [_ModelWrapSerializerT], _ModelWrapSerializerT ] | Callable[ [_ModelPlainSerializerT], _ModelPlainSerializerT ] ) ``` Decorator that enables custom model serialization. This is useful when a model needs to be serialized in a customized manner, allowing for flexibility beyond just specific fields. An example would be to serialize temperature to the same temperature scale, such as degrees Celsius. ```python from typing import Literal from pydantic import BaseModel, model_serializer class TemperatureModel(BaseModel): unit: Literal['C', 'F'] value: int @model_serializer() def serialize_model(self): if self.unit == 'F': return {'unit': 'C', 'value': int((self.value - 32) / 1.8)} return {'unit': self.unit, 'value': self.value} temperature = TemperatureModel(unit='F', value=212) print(temperature.model_dump()) #> {'unit': 'C', 'value': 100} ``` Two signatures are supported for `mode='plain'`, which is the default: - `(self)` - `(self, info: SerializationInfo)` And two other signatures for `mode='wrap'`: - `(self, nxt: SerializerFunctionWrapHandler)` - `(self, nxt: SerializerFunctionWrapHandler, info: SerializationInfo)` See [Custom serializers](../../concepts/serialization/#custom-serializers) for more information. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `f` | `_ModelPlainSerializerT | _ModelWrapSerializerT | None` | The function to be decorated. | `None` | | `mode` | `Literal['plain', 'wrap']` | The serialization mode. 'plain' means the function will be called instead of the default serialization logic; 'wrap' means the function will be called with an argument to optionally call the default serialization logic. | `'plain'` | | `when_used` | `WhenUsed` | Determines when this serializer should be used. | `'always'` | | `return_type` | `Any` | The return type for the function. If omitted it will be inferred from the type annotation. | `PydanticUndefined` | Returns: | Type | Description | | --- | --- | | `_ModelPlainSerializerT | Callable[[_ModelWrapSerializerT], _ModelWrapSerializerT] | Callable[[_ModelPlainSerializerT], _ModelPlainSerializerT]` | The decorator function. | Source code in `pydantic/functional_serializers.py` ````python def model_serializer( f: _ModelPlainSerializerT | _ModelWrapSerializerT | None = None, /, *, mode: Literal['plain', 'wrap'] = 'plain', when_used: WhenUsed = 'always', return_type: Any = PydanticUndefined, ) -> ( _ModelPlainSerializerT | Callable[[_ModelWrapSerializerT], _ModelWrapSerializerT] | Callable[[_ModelPlainSerializerT], _ModelPlainSerializerT] ): """Decorator that enables custom model serialization. This is useful when a model needs to be serialized in a customized manner, allowing for flexibility beyond just specific fields. An example would be to serialize temperature to the same temperature scale, such as degrees Celsius. 
```python from typing import Literal from pydantic import BaseModel, model_serializer class TemperatureModel(BaseModel): unit: Literal['C', 'F'] value: int @model_serializer() def serialize_model(self): if self.unit == 'F': return {'unit': 'C', 'value': int((self.value - 32) / 1.8)} return {'unit': self.unit, 'value': self.value} temperature = TemperatureModel(unit='F', value=212) print(temperature.model_dump()) #> {'unit': 'C', 'value': 100} ``` Two signatures are supported for `mode='plain'`, which is the default: - `(self)` - `(self, info: SerializationInfo)` And two other signatures for `mode='wrap'`: - `(self, nxt: SerializerFunctionWrapHandler)` - `(self, nxt: SerializerFunctionWrapHandler, info: SerializationInfo)` See [Custom serializers](../concepts/serialization.md#custom-serializers) for more information. Args: f: The function to be decorated. mode: The serialization mode. - `'plain'` means the function will be called instead of the default serialization logic - `'wrap'` means the function will be called with an argument to optionally call the default serialization logic. when_used: Determines when this serializer should be used. return_type: The return type for the function. If omitted it will be inferred from the type annotation. Returns: The decorator function. """ def dec(f: ModelSerializer) -> _decorators.PydanticDescriptorProxy[Any]: dec_info = _decorators.ModelSerializerDecoratorInfo(mode=mode, return_type=return_type, when_used=when_used) return _decorators.PydanticDescriptorProxy(f, dec_info) if f is None: return dec # pyright: ignore[reportReturnType] else: return dec(f) # pyright: ignore[reportReturnType] ```` This module contains related classes and functions for validation. ## ModelAfterValidatorWithoutInfo ```python ModelAfterValidatorWithoutInfo = Callable[ [_ModelType], _ModelType ] ``` A `@model_validator` decorated function signature. This is used when `mode='after'` and the function does not have an info argument. ## ModelAfterValidator ```python ModelAfterValidator = Callable[ [_ModelType, ValidationInfo], _ModelType ] ``` A `@model_validator` decorated function signature. This is used when `mode='after'`. ## AfterValidator ```python AfterValidator( func: ( NoInfoValidatorFunction | WithInfoValidatorFunction ), ) ``` Usage Documentation [field *after* validators](../../concepts/validators/#field-after-validator) A metadata class that indicates that a validation should be applied **after** the inner validation logic. Attributes: | Name | Type | Description | | --- | --- | --- | | `func` | `NoInfoValidatorFunction | WithInfoValidatorFunction` | The validator function. | Example ```python from typing import Annotated from pydantic import AfterValidator, BaseModel, ValidationError MyInt = Annotated[int, AfterValidator(lambda v: v + 1)] class Model(BaseModel): a: MyInt print(Model(a=1).a) #> 2 try: Model(a='a') except ValidationError as e: print(e.json(indent=2)) ''' [ { "type": "int_parsing", "loc": [ "a" ], "msg": "Input should be a valid integer, unable to parse string as an integer", "input": "a", "url": "https://errors.pydantic.dev/2/v/int_parsing" } ] ''' ``` ## BeforeValidator ```python BeforeValidator( func: ( NoInfoValidatorFunction | WithInfoValidatorFunction ), json_schema_input_type: Any = PydanticUndefined, ) ``` Usage Documentation [field *before* validators](../../concepts/validators/#field-before-validator) A metadata class that indicates that a validation should be applied **before** the inner validation logic. 
Attributes: | Name | Type | Description | | --- | --- | --- | | `func` | `NoInfoValidatorFunction | WithInfoValidatorFunction` | The validator function. | | `json_schema_input_type` | `Any` | The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode). | Example ```python from typing import Annotated from pydantic import BaseModel, BeforeValidator MyInt = Annotated[int, BeforeValidator(lambda v: v + 1)] class Model(BaseModel): a: MyInt print(Model(a=1).a) #> 2 try: Model(a='a') except TypeError as e: print(e) #> can only concatenate str (not "int") to str ``` ## PlainValidator ```python PlainValidator( func: ( NoInfoValidatorFunction | WithInfoValidatorFunction ), json_schema_input_type: Any = Any, ) ``` Usage Documentation [field *plain* validators](../../concepts/validators/#field-plain-validator) A metadata class that indicates that a validation should be applied **instead** of the inner validation logic. Note Before v2.9, `PlainValidator` wasn't always compatible with JSON Schema generation for `mode='validation'`. You can now use the `json_schema_input_type` argument to specify the input type of the function to be used in the JSON schema when `mode='validation'` (the default). See the example below for more details. Attributes: | Name | Type | Description | | --- | --- | --- | | `func` | `NoInfoValidatorFunction | WithInfoValidatorFunction` | The validator function. | | `json_schema_input_type` | `Any` | The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode). If not provided, will default to Any. | Example ```python from typing import Annotated, Union from pydantic import BaseModel, PlainValidator MyInt = Annotated[ int, PlainValidator( lambda v: int(v) + 1, json_schema_input_type=Union[str, int] # (1)! ), ] class Model(BaseModel): a: MyInt print(Model(a='1').a) #> 2 print(Model(a=1).a) #> 2 ``` 1. In this example, we've specified the `json_schema_input_type` as `Union[str, int]` which indicates to the JSON schema generator that in validation mode, the input type for the `a` field can be either a `str` or an `int`. ## WrapValidator ```python WrapValidator( func: ( NoInfoWrapValidatorFunction | WithInfoWrapValidatorFunction ), json_schema_input_type: Any = PydanticUndefined, ) ``` Usage Documentation [field *wrap* validators](../../concepts/validators/#field-wrap-validator) A metadata class that indicates that a validation should be applied **around** the inner validation logic. Attributes: | Name | Type | Description | | --- | --- | --- | | `func` | `NoInfoWrapValidatorFunction | WithInfoWrapValidatorFunction` | The validator function. | | `json_schema_input_type` | `Any` | The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode). 
| ```python from datetime import datetime from typing import Annotated from pydantic import BaseModel, ValidationError, WrapValidator def validate_timestamp(v, handler): if v == 'now': # we don't want to bother with further validation, just return the new value return datetime.now() try: return handler(v) except ValidationError: # validation failed, in this case we want to return a default value return datetime(2000, 1, 1) MyTimestamp = Annotated[datetime, WrapValidator(validate_timestamp)] class Model(BaseModel): a: MyTimestamp print(Model(a='now').a) #> 2032-01-02 03:04:05.000006 print(Model(a='invalid').a) #> 2000-01-01 00:00:00 ``` ## ModelWrapValidatorHandler Bases: `ValidatorFunctionWrapHandler`, `Protocol[_ModelTypeCo]` `@model_validator` decorated function handler argument type. This is used when `mode='wrap'`. ## ModelWrapValidatorWithoutInfo Bases: `Protocol[_ModelType]` A `@model_validator` decorated function signature. This is used when `mode='wrap'` and the function does not have an info argument. ## ModelWrapValidator Bases: `Protocol[_ModelType]` A `@model_validator` decorated function signature. This is used when `mode='wrap'`. ## FreeModelBeforeValidatorWithoutInfo Bases: `Protocol` A `@model_validator` decorated function signature. This is used when `mode='before'` and the function does not have an info argument. ## ModelBeforeValidatorWithoutInfo Bases: `Protocol` A `@model_validator` decorated function signature. This is used when `mode='before'` and the function does not have an info argument. ## FreeModelBeforeValidator Bases: `Protocol` A `@model_validator` decorated function signature. This is used when `mode='before'`. ## ModelBeforeValidator Bases: `Protocol` A `@model_validator` decorated function signature. This is used when `mode='before'`. ## InstanceOf ```python InstanceOf() ``` Generic type for annotating a type that is an instance of a given class. Example ```python from pydantic import BaseModel, InstanceOf, ValidationError class Foo: ... class Bar(BaseModel): foo: InstanceOf[Foo] Bar(foo=Foo()) try: Bar(foo=42) except ValidationError as e: print(e) """ [ │ { │ │ 'type': 'is_instance_of', │ │ 'loc': ('foo',), │ │ 'msg': 'Input should be an instance of Foo', │ │ 'input': 42, │ │ 'ctx': {'class': 'Foo'}, │ │ 'url': 'https://errors.pydantic.dev/0.38.0/v/is_instance_of' │ } ] """ ``` ## SkipValidation ```python SkipValidation() ``` If this is applied as an annotation (e.g., via `x: Annotated[int, SkipValidation]`), validation will be skipped. You can also use `SkipValidation[int]` as a shorthand for `Annotated[int, SkipValidation]`. This can be useful if you want to use a type annotation for documentation/IDE/type-checking purposes, and know that it is safe to skip validation for one or more of the fields. Because this converts the validation schema to `any_schema`, subsequent annotation-applied transformations may not have the expected effects. Therefore, when used, this annotation should generally be the final annotation applied to a type. 
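As a minimal sketch of the behavior described above (the model and value are invented for illustration), note that a skipped field accepts its input as-is, with no validation or coercion:

```python
from pydantic import BaseModel, SkipValidation


class Model(BaseModel):
    # validation (and coercion) is skipped entirely for this field
    a: SkipValidation[int]


m = Model(a='not an int')  # no error is raised
print(repr(m.a))
#> 'not an int'
```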
## field_validator ```python field_validator( field: str, /, *fields: str, mode: Literal["wrap"], check_fields: bool | None = ..., json_schema_input_type: Any = ..., ) -> Callable[[_V2WrapValidatorType], _V2WrapValidatorType] ``` ```python field_validator( field: str, /, *fields: str, mode: Literal["before", "plain"], check_fields: bool | None = ..., json_schema_input_type: Any = ..., ) -> Callable[ [_V2BeforeAfterOrPlainValidatorType], _V2BeforeAfterOrPlainValidatorType, ] ``` ```python field_validator( field: str, /, *fields: str, mode: Literal["after"] = ..., check_fields: bool | None = ..., ) -> Callable[ [_V2BeforeAfterOrPlainValidatorType], _V2BeforeAfterOrPlainValidatorType, ] ``` ```python field_validator( field: str, /, *fields: str, mode: FieldValidatorModes = "after", check_fields: bool | None = None, json_schema_input_type: Any = PydanticUndefined, ) -> Callable[[Any], Any] ``` Usage Documentation [field validators](../../concepts/validators/#field-validators) Decorate methods on the class indicating that they should be used to validate fields. Example usage: ```python from typing import Any from pydantic import ( BaseModel, ValidationError, field_validator, ) class Model(BaseModel): a: str @field_validator('a') @classmethod def ensure_foobar(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v print(repr(Model(a='this is foobar good'))) #> Model(a='this is foobar good') try: Model(a='snap') except ValidationError as exc_info: print(exc_info) ''' 1 validation error for Model a Value error, "foobar" not found in a [type=value_error, input_value='snap', input_type=str] ''' ``` For more in depth examples, see [Field Validators](../../concepts/validators/#field-validators). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `field` | `str` | The first field the field_validator should be called on; this is separate from fields to ensure an error is raised if you don't pass at least one. | *required* | | `*fields` | `str` | Additional field(s) the field_validator should be called on. | `()` | | `mode` | `FieldValidatorModes` | Specifies whether to validate the fields before or after validation. | `'after'` | | `check_fields` | `bool | None` | Whether to check that the fields actually exist on the model. | `None` | | `json_schema_input_type` | `Any` | The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode) and can only be specified when mode is either 'before', 'plain' or 'wrap'. | `PydanticUndefined` | Returns: | Type | Description | | --- | --- | | `Callable[[Any], Any]` | A decorator that can be used to decorate a function to be used as a field_validator. | Raises: | Type | Description | | --- | --- | | `PydanticUserError` | If @field_validator is used bare (with no fields). If the args passed to @field_validator as fields are not strings. If @field_validator is applied to instance methods. | Source code in `pydantic/functional_validators.py` ````python def field_validator( field: str, /, *fields: str, mode: FieldValidatorModes = 'after', check_fields: bool | None = None, json_schema_input_type: Any = PydanticUndefined, ) -> Callable[[Any], Any]: """!!! abstract "Usage Documentation" [field validators](../concepts/validators.md#field-validators) Decorate methods on the class indicating that they should be used to validate fields. 
Example usage: ```python from typing import Any from pydantic import ( BaseModel, ValidationError, field_validator, ) class Model(BaseModel): a: str @field_validator('a') @classmethod def ensure_foobar(cls, v: Any): if 'foobar' not in v: raise ValueError('"foobar" not found in a') return v print(repr(Model(a='this is foobar good'))) #> Model(a='this is foobar good') try: Model(a='snap') except ValidationError as exc_info: print(exc_info) ''' 1 validation error for Model a Value error, "foobar" not found in a [type=value_error, input_value='snap', input_type=str] ''' ``` For more in depth examples, see [Field Validators](../concepts/validators.md#field-validators). Args: field: The first field the `field_validator` should be called on; this is separate from `fields` to ensure an error is raised if you don't pass at least one. *fields: Additional field(s) the `field_validator` should be called on. mode: Specifies whether to validate the fields before or after validation. check_fields: Whether to check that the fields actually exist on the model. json_schema_input_type: The input type of the function. This is only used to generate the appropriate JSON Schema (in validation mode) and can only be specified when `mode` is either `'before'`, `'plain'` or `'wrap'`. Returns: A decorator that can be used to decorate a function to be used as a field_validator. Raises: PydanticUserError: - If `@field_validator` is used bare (with no fields). - If the args passed to `@field_validator` as fields are not strings. - If `@field_validator` is applied to instance methods. """ if isinstance(field, FunctionType): raise PydanticUserError( '`@field_validator` should be used with fields and keyword arguments, not bare. ' "E.g. usage should be `@validator('<field_name>', ...)`", code='validator-no-fields', ) if mode not in ('before', 'plain', 'wrap') and json_schema_input_type is not PydanticUndefined: raise PydanticUserError( f"`json_schema_input_type` can't be used when mode is set to {mode!r}", code='validator-input-type', ) if json_schema_input_type is PydanticUndefined and mode == 'plain': json_schema_input_type = Any fields = field, *fields if not all(isinstance(field, str) for field in fields): raise PydanticUserError( '`@field_validator` fields should be passed as separate string args. ' "E.g. 
usage should be `@validator('<field_name_1>', '<field_name_2>', ...)`", code='validator-invalid-fields', ) def dec( f: Callable[..., Any] | staticmethod[Any, Any] | classmethod[Any, Any, Any], ) -> _decorators.PydanticDescriptorProxy[Any]: if _decorators.is_instance_method_from_sig(f): raise PydanticUserError( '`@field_validator` cannot be applied to instance methods', code='validator-instance-method' ) # auto apply the @classmethod decorator f = _decorators.ensure_classmethod_based_on_signature(f) dec_info = _decorators.FieldValidatorDecoratorInfo( fields=fields, mode=mode, check_fields=check_fields, json_schema_input_type=json_schema_input_type ) return _decorators.PydanticDescriptorProxy(f, dec_info) return dec ```` ## model_validator ```python model_validator(*, mode: Literal["wrap"]) -> Callable[ [_AnyModelWrapValidator[_ModelType]], PydanticDescriptorProxy[ModelValidatorDecoratorInfo], ] ``` ```python model_validator(*, mode: Literal["before"]) -> Callable[ [_AnyModelBeforeValidator], PydanticDescriptorProxy[ModelValidatorDecoratorInfo], ] ``` ```python model_validator(*, mode: Literal["after"]) -> Callable[ [_AnyModelAfterValidator[_ModelType]], PydanticDescriptorProxy[ModelValidatorDecoratorInfo], ] ``` ```python model_validator( *, mode: Literal["wrap", "before", "after"] ) -> Any ``` Usage Documentation [Model Validators](../../concepts/validators/#model-validators) Decorate model methods for validation purposes. Example usage: ```python from typing_extensions import Self from pydantic import BaseModel, ValidationError, model_validator class Square(BaseModel): width: float height: float @model_validator(mode='after') def verify_square(self) -> Self: if self.width != self.height: raise ValueError('width and height do not match') return self s = Square(width=1, height=1) print(repr(s)) #> Square(width=1.0, height=1.0) try: Square(width=1, height=2) except ValidationError as e: print(e) ''' 1 validation error for Square Value error, width and height do not match [type=value_error, input_value={'width': 1, 'height': 2}, input_type=dict] ''' ``` For more in depth examples, see [Model Validators](../../concepts/validators/#model-validators). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `mode` | `Literal['wrap', 'before', 'after']` | A required string literal that specifies the validation mode. It can be one of the following: 'wrap', 'before', or 'after'. | *required* | Returns: | Type | Description | | --- | --- | | `Any` | A decorator that can be used to decorate a function to be used as a model validator. | Source code in `pydantic/functional_validators.py` ````python def model_validator( *, mode: Literal['wrap', 'before', 'after'], ) -> Any: """!!! abstract "Usage Documentation" [Model Validators](../concepts/validators.md#model-validators) Decorate model methods for validation purposes. 
Example usage: ```python from typing_extensions import Self from pydantic import BaseModel, ValidationError, model_validator class Square(BaseModel): width: float height: float @model_validator(mode='after') def verify_square(self) -> Self: if self.width != self.height: raise ValueError('width and height do not match') return self s = Square(width=1, height=1) print(repr(s)) #> Square(width=1.0, height=1.0) try: Square(width=1, height=2) except ValidationError as e: print(e) ''' 1 validation error for Square Value error, width and height do not match [type=value_error, input_value={'width': 1, 'height': 2}, input_type=dict] ''' ``` For more in depth examples, see [Model Validators](../concepts/validators.md#model-validators). Args: mode: A required string literal that specifies the validation mode. It can be one of the following: 'wrap', 'before', or 'after'. Returns: A decorator that can be used to decorate a function to be used as a model validator. """ def dec(f: Any) -> _decorators.PydanticDescriptorProxy[Any]: # auto apply the @classmethod decorator f = _decorators.ensure_classmethod_based_on_signature(f) dec_info = _decorators.ModelValidatorDecoratorInfo(mode=mode) return _decorators.PydanticDescriptorProxy(f, dec_info) return dec ```` Usage Documentation [JSON Schema](../../concepts/json_schema/) The `json_schema` module contains classes and functions to allow the way [JSON Schema](https://json-schema.org/) is generated to be customized. In general you shouldn't need to use this module directly; instead, you can use BaseModel.model_json_schema and TypeAdapter.json_schema. ## CoreSchemaOrFieldType ```python CoreSchemaOrFieldType = Literal[ CoreSchemaType, CoreSchemaFieldType ] ``` A type alias for defined schema types that represents a union of `core_schema.CoreSchemaType` and `core_schema.CoreSchemaFieldType`. ## JsonSchemaValue ```python JsonSchemaValue = dict[str, Any] ``` A type alias for a JSON schema value. This is a dictionary of string keys to arbitrary JSON values. ## JsonSchemaMode ```python JsonSchemaMode = Literal['validation', 'serialization'] ``` A type alias that represents the mode of a JSON schema; either 'validation' or 'serialization'. For some types, the inputs to validation differ from the outputs of serialization. For example, computed fields will only be present when serializing, and should not be provided when validating. This flag provides a way to indicate whether you want the JSON schema required for validation inputs, or that will be matched by serialization outputs. ## JsonSchemaWarningKind ```python JsonSchemaWarningKind = Literal[ "skipped-choice", "non-serializable-default", "skipped-discriminator", ] ``` A type alias representing the kinds of warnings that can be emitted during JSON schema generation. See GenerateJsonSchema.render_warning_message for more details. ## NoDefault ```python NoDefault = object() ``` A sentinel value used to indicate that no default value should be used when generating a JSON Schema for a core schema with a default value. ## DEFAULT_REF_TEMPLATE ```python DEFAULT_REF_TEMPLATE = '#/$defs/{model}' ``` The default format string used to generate reference names. ## PydanticJsonSchemaWarning Bases: `UserWarning` This class is used to emit warnings produced during JSON schema generation. See the GenerateJsonSchema.emit_warning and GenerateJsonSchema.render_warning_message methods for more details; these can be overridden to control warning behavior. 
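To make the `JsonSchemaMode` distinction above concrete before turning to `GenerateJsonSchema`, here is a minimal sketch (the model is invented for illustration): a computed field appears only in the serialization schema, not in the validation schema.

```python
from pydantic import BaseModel, computed_field


class Rect(BaseModel):
    width: int

    @computed_field
    @property
    def double_width(self) -> int:
        return self.width * 2


# computed fields are outputs of serialization, not inputs to validation
print(sorted(Rect.model_json_schema(mode='validation')['properties']))
#> ['width']
print(sorted(Rect.model_json_schema(mode='serialization')['properties']))
#> ['double_width', 'width']
```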
## GenerateJsonSchema ```python GenerateJsonSchema( by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, ) ``` Usage Documentation [Customizing the JSON Schema Generation Process](../../concepts/json_schema/#customizing-the-json-schema-generation-process) A class for generating JSON schemas. This class generates JSON schemas based on configured parameters. The default schema dialect is https://json-schema.org/draft/2020-12/schema. The class uses `by_alias` to configure how fields with multiple names are handled and `ref_template` to format reference names. Attributes: | Name | Type | Description | | --- | --- | --- | | `schema_dialect` | | The JSON schema dialect used to generate the schema. See Declaring a Dialect in the JSON Schema documentation for more information about dialects. | | `ignored_warning_kinds` | `set[JsonSchemaWarningKind]` | Warnings to ignore when generating the schema. self.render_warning_message will do nothing if its argument kind is in ignored_warning_kinds; this value can be modified on subclasses to easily control which warnings are emitted. | | `by_alias` | | Whether to use field aliases when generating the schema. | | `ref_template` | | The format string used when generating reference names. | | `core_to_json_refs` | `dict[CoreModeRef, JsonRef]` | A mapping of core refs to JSON refs. | | `core_to_defs_refs` | `dict[CoreModeRef, DefsRef]` | A mapping of core refs to definition refs. | | `defs_to_core_refs` | `dict[DefsRef, CoreModeRef]` | A mapping of definition refs to core refs. | | `json_to_defs_refs` | `dict[JsonRef, DefsRef]` | A mapping of JSON refs to definition refs. | | `definitions` | `dict[DefsRef, JsonSchemaValue]` | Definitions in the schema. | Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `by_alias` | `bool` | Whether to use field aliases in the generated schemas. | `True` | | `ref_template` | `str` | The format string to use when generating reference names. | `DEFAULT_REF_TEMPLATE` | Raises: | Type | Description | | --- | --- | | `JsonSchemaError` | If the instance of the class is inadvertently reused after generating a schema. | Source code in `pydantic/json_schema.py` ```python def __init__(self, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE): self.by_alias = by_alias self.ref_template = ref_template self.core_to_json_refs: dict[CoreModeRef, JsonRef] = {} self.core_to_defs_refs: dict[CoreModeRef, DefsRef] = {} self.defs_to_core_refs: dict[DefsRef, CoreModeRef] = {} self.json_to_defs_refs: dict[JsonRef, DefsRef] = {} self.definitions: dict[DefsRef, JsonSchemaValue] = {} self._config_wrapper_stack = _config.ConfigWrapperStack(_config.ConfigWrapper({})) self._mode: JsonSchemaMode = 'validation' # The following includes a mapping of a fully-unique defs ref choice to a list of preferred # alternatives, which are generally simpler, such as only including the class name. # At the end of schema generation, we use these to produce a JSON schema with more human-readable # definitions, which would also work better in a generated OpenAPI client, etc. self._prioritized_defsref_choices: dict[DefsRef, list[DefsRef]] = {} self._collision_counter: dict[str, int] = defaultdict(int) self._collision_index: dict[str, int] = {} self._schema_type_to_method = self.build_schema_type_to_method() # When we encounter definitions we need to try to build them immediately # so that they are available to schemas that reference them # But it's possible that CoreSchema was never going to be used # (e.g. 
because the CoreSchema that references it short-circuits its JSON schema generation without needing # the reference), so instead of failing altogether if we can't build a definition, we # store the error raised and re-raise it if we end up needing that def self._core_defs_invalid_for_json_schema: dict[DefsRef, PydanticInvalidForJsonSchema] = {} # This changes to True after generating a schema, to prevent issues caused by accidental reuse # of a single instance of a schema generator self._used = False ``` ### ValidationsMapping This class just contains mappings from core_schema attribute names to the corresponding JSON schema attribute names. While I suspect it is unlikely to be necessary, you can in principle override this class in a subclass of GenerateJsonSchema (by inheriting from GenerateJsonSchema.ValidationsMapping) to change these mappings. ### build_schema_type_to_method ```python build_schema_type_to_method() -> dict[ CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue], ] ``` Builds a dictionary mapping fields to methods for generating JSON schemas. Returns: | Type | Description | | --- | --- | | `dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]` | A dictionary containing the mapping of CoreSchemaOrFieldType to a handler method. | Raises: | Type | Description | | --- | --- | | `TypeError` | If no method has been defined for generating a JSON schema for a given pydantic core schema type. | Source code in `pydantic/json_schema.py` ```python def build_schema_type_to_method( self, ) -> dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]: """Builds a dictionary mapping fields to methods for generating JSON schemas. Returns: A dictionary containing the mapping of `CoreSchemaOrFieldType` to a handler method. Raises: TypeError: If no method has been defined for generating a JSON schema for a given pydantic core schema type. """ mapping: dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]] = {} core_schema_types: list[CoreSchemaOrFieldType] = list(get_literal_values(CoreSchemaOrFieldType)) for key in core_schema_types: method_name = f'{key.replace("-", "_")}_schema' try: mapping[key] = getattr(self, method_name) except AttributeError as e: # pragma: no cover if os.getenv('PYDANTIC_PRIVATE_ALLOW_UNHANDLED_SCHEMA_TYPES'): continue raise TypeError( f'No method for generating JsonSchema for core_schema.type={key!r} ' f'(expected: {type(self).__name__}.{method_name})' ) from e return mapping ``` ### generate_definitions ```python generate_definitions( inputs: Sequence[ tuple[JsonSchemaKeyT, JsonSchemaMode, CoreSchema] ] ) -> tuple[ dict[ tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue, ], dict[DefsRef, JsonSchemaValue], ] ``` Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `inputs` | `Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, CoreSchema]]` | A sequence of tuples, where: The first element is a JSON schema key type. The second element is the JSON mode: either 'validation' or 'serialization'. The third element is a core schema.
| *required* | Returns: | Type | Description | | --- | --- | | `tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]` | A tuple where: The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions. | Raises: | Type | Description | | --- | --- | | `PydanticUserError` | Raised if the JSON schema generator has already been used to generate a JSON schema. | Source code in `pydantic/json_schema.py` ```python def generate_definitions( self, inputs: Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]] ) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]: """Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references. Args: inputs: A sequence of tuples, where: - The first element is a JSON schema key type. - The second element is the JSON mode: either 'validation' or 'serialization'. - The third element is a core schema. Returns: A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) - The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions. Raises: PydanticUserError: Raised if the JSON schema generator has already been used to generate a JSON schema. """ if self._used: raise PydanticUserError( 'This JSON schema generator has already been used to generate a JSON schema. ' f'You must create a new instance of {type(self).__name__} to generate a new JSON schema.', code='json-schema-already-used', ) for _, mode, schema in inputs: self._mode = mode self.generate_inner(schema) definitions_remapping = self._build_definitions_remapping() json_schemas_map: dict[tuple[JsonSchemaKeyT, JsonSchemaMode], DefsRef] = {} for key, mode, schema in inputs: self._mode = mode json_schema = self.generate_inner(schema) json_schemas_map[(key, mode)] = definitions_remapping.remap_json_schema(json_schema) json_schema = {'$defs': self.definitions} json_schema = definitions_remapping.remap_json_schema(json_schema) self._used = True return json_schemas_map, self.sort(json_schema['$defs']) # type: ignore ``` ### generate ```python generate( schema: CoreSchema, mode: JsonSchemaMode = "validation" ) -> JsonSchemaValue ``` Generates a JSON schema for a specified schema in a specified mode. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema` | The core schema to generate a JSON schema for. | *required* | | `mode` | `JsonSchemaMode` | The mode in which to generate the schema. Defaults to 'validation'. | `'validation'` | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | A JSON schema representing the specified schema.
| Raises: | Type | Description | | --- | --- | | `PydanticUserError` | If the JSON schema generator has already been used to generate a JSON schema. | Source code in `pydantic/json_schema.py` ```python def generate(self, schema: CoreSchema, mode: JsonSchemaMode = 'validation') -> JsonSchemaValue: """Generates a JSON schema for a specified schema in a specified mode. Args: schema: The core schema to generate a JSON schema for. mode: The mode in which to generate the schema. Defaults to 'validation'. Returns: A JSON schema representing the specified schema. Raises: PydanticUserError: If the JSON schema generator has already been used to generate a JSON schema. """ self._mode = mode if self._used: raise PydanticUserError( 'This JSON schema generator has already been used to generate a JSON schema. ' f'You must create a new instance of {type(self).__name__} to generate a new JSON schema.', code='json-schema-already-used', ) json_schema: JsonSchemaValue = self.generate_inner(schema) json_ref_counts = self.get_json_ref_counts(json_schema) ref = cast(JsonRef, json_schema.get('$ref')) while ref is not None: # may need to unpack multiple levels ref_json_schema = self.get_schema_from_definitions(ref) if json_ref_counts[ref] == 1 and ref_json_schema is not None and len(json_schema) == 1: # "Unpack" the ref since this is the only reference and there are no sibling keys json_schema = ref_json_schema.copy() # copy to prevent recursive dict reference json_ref_counts[ref] -= 1 ref = cast(JsonRef, json_schema.get('$ref')) ref = None self._garbage_collect_definitions(json_schema) definitions_remapping = self._build_definitions_remapping() if self.definitions: json_schema['$defs'] = self.definitions json_schema = definitions_remapping.remap_json_schema(json_schema) # For now, we will not set the $schema key. However, if desired, this can be easily added by overriding # this method and adding the following line after a call to super().generate(schema): # json_schema['$schema'] = self.schema_dialect self._used = True return self.sort(json_schema) ``` ### generate_inner ```python generate_inner( schema: CoreSchemaOrField, ) -> JsonSchemaValue ``` Generates a JSON schema for a given core schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchemaOrField` | The given core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | TODO: the nested function definitions here seem like bad practice, I'd like to unpack these in a future PR. It'd be great if we could shorten the call stack a bit for JSON schema generation, and I think there's potential for that here. Source code in `pydantic/json_schema.py` ```python def generate_inner(self, schema: CoreSchemaOrField) -> JsonSchemaValue: # noqa: C901 """Generates a JSON schema for a given core schema. Args: schema: The given core schema. Returns: The generated JSON schema. TODO: the nested function definitions here seem like bad practice, I'd like to unpack these in a future PR. It'd be great if we could shorten the call stack a bit for JSON schema generation, and I think there's potential for that here.
""" # If a schema with the same CoreRef has been handled, just return a reference to it # Note that this assumes that it will _never_ be the case that the same CoreRef is used # on types that should have different JSON schemas if 'ref' in schema: core_ref = CoreRef(schema['ref']) # type: ignore[typeddict-item] core_mode_ref = (core_ref, self.mode) if core_mode_ref in self.core_to_defs_refs and self.core_to_defs_refs[core_mode_ref] in self.definitions: return {'$ref': self.core_to_json_refs[core_mode_ref]} def populate_defs(core_schema: CoreSchema, json_schema: JsonSchemaValue) -> JsonSchemaValue: if 'ref' in core_schema: core_ref = CoreRef(core_schema['ref']) # type: ignore[typeddict-item] defs_ref, ref_json_schema = self.get_cache_defs_ref_schema(core_ref) json_ref = JsonRef(ref_json_schema['$ref']) # Replace the schema if it's not a reference to itself # What we want to avoid is having the def be just a ref to itself # which is what would happen if we blindly assigned any if json_schema.get('$ref', None) != json_ref: self.definitions[defs_ref] = json_schema self._core_defs_invalid_for_json_schema.pop(defs_ref, None) json_schema = ref_json_schema return json_schema def handler_func(schema_or_field: CoreSchemaOrField) -> JsonSchemaValue: """Generate a JSON schema based on the input schema. Args: schema_or_field: The core schema to generate a JSON schema from. Returns: The generated JSON schema. Raises: TypeError: If an unexpected schema type is encountered. """ # Generate the core-schema-type-specific bits of the schema generation: json_schema: JsonSchemaValue | None = None if self.mode == 'serialization' and 'serialization' in schema_or_field: # In this case, we skip the JSON Schema generation of the schema # and use the `'serialization'` schema instead (canonical example: # `Annotated[int, PlainSerializer(str)]`). ser_schema = schema_or_field['serialization'] # type: ignore json_schema = self.ser_schema(ser_schema) # It might be that the 'serialization'` is skipped depending on `when_used`. # This is only relevant for `nullable` schemas though, so we special case here. if ( json_schema is not None and ser_schema.get('when_used') in ('unless-none', 'json-unless-none') and schema_or_field['type'] == 'nullable' ): json_schema = self.get_flattened_anyof([{'type': 'null'}, json_schema]) if json_schema is None: if _core_utils.is_core_schema(schema_or_field) or _core_utils.is_core_schema_field(schema_or_field): generate_for_schema_type = self._schema_type_to_method[schema_or_field['type']] json_schema = generate_for_schema_type(schema_or_field) else: raise TypeError(f'Unexpected schema type: schema={schema_or_field}') return json_schema current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, handler_func) metadata = cast(_core_metadata.CoreMetadata, schema.get('metadata', {})) # TODO: I dislike that we have to wrap these basic dict updates in callables, is there any way around this? 
if js_updates := metadata.get('pydantic_js_updates'): def js_updates_handler_func( schema_or_field: CoreSchemaOrField, current_handler: GetJsonSchemaHandler = current_handler, ) -> JsonSchemaValue: json_schema = {**current_handler(schema_or_field), **js_updates} return json_schema current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, js_updates_handler_func) if js_extra := metadata.get('pydantic_js_extra'): def js_extra_handler_func( schema_or_field: CoreSchemaOrField, current_handler: GetJsonSchemaHandler = current_handler, ) -> JsonSchemaValue: json_schema = current_handler(schema_or_field) if isinstance(js_extra, dict): json_schema.update(to_jsonable_python(js_extra)) elif callable(js_extra): # similar to typing issue in _update_class_schema when we're working with callable js extra js_extra(json_schema) # type: ignore return json_schema current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, js_extra_handler_func) for js_modify_function in metadata.get('pydantic_js_functions', ()): def new_handler_func( schema_or_field: CoreSchemaOrField, current_handler: GetJsonSchemaHandler = current_handler, js_modify_function: GetJsonSchemaFunction = js_modify_function, ) -> JsonSchemaValue: json_schema = js_modify_function(schema_or_field, current_handler) if _core_utils.is_core_schema(schema_or_field): json_schema = populate_defs(schema_or_field, json_schema) original_schema = current_handler.resolve_ref_schema(json_schema) ref = json_schema.pop('$ref', None) if ref and json_schema: original_schema.update(json_schema) return original_schema current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, new_handler_func) for js_modify_function in metadata.get('pydantic_js_annotation_functions', ()): def new_handler_func( schema_or_field: CoreSchemaOrField, current_handler: GetJsonSchemaHandler = current_handler, js_modify_function: GetJsonSchemaFunction = js_modify_function, ) -> JsonSchemaValue: return js_modify_function(schema_or_field, current_handler) current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, new_handler_func) json_schema = current_handler(schema) if _core_utils.is_core_schema(schema): json_schema = populate_defs(schema, json_schema) return json_schema ``` ### sort ```python sort( value: JsonSchemaValue, parent_key: str | None = None ) -> JsonSchemaValue ``` Override this method to customize the sorting of the JSON schema (e.g., don't sort at all, sort all keys unconditionally, etc.) By default, alphabetically sort the keys in the JSON schema, skipping the 'properties' and 'default' keys to preserve field definition order. This sort is recursive, so it will sort all nested dictionaries as well. Source code in `pydantic/json_schema.py` ```python def sort(self, value: JsonSchemaValue, parent_key: str | None = None) -> JsonSchemaValue: """Override this method to customize the sorting of the JSON schema (e.g., don't sort at all, sort all keys unconditionally, etc.) By default, alphabetically sort the keys in the JSON schema, skipping the 'properties' and 'default' keys to preserve field definition order. This sort is recursive, so it will sort all nested dictionaries as well. 
""" sorted_dict: dict[str, JsonSchemaValue] = {} keys = value.keys() if parent_key not in ('properties', 'default'): keys = sorted(keys) for key in keys: sorted_dict[key] = self._sort_recursive(value[key], parent_key=key) return sorted_dict ``` ### invalid_schema ```python invalid_schema(schema: InvalidSchema) -> JsonSchemaValue ``` Placeholder - should never be called. Source code in `pydantic/json_schema.py` ```python def invalid_schema(self, schema: core_schema.InvalidSchema) -> JsonSchemaValue: """Placeholder - should never be called.""" raise RuntimeError('Cannot generate schema for invalid_schema. This is a bug! Please report it.') ``` ### any_schema ```python any_schema(schema: AnySchema) -> JsonSchemaValue ``` Generates a JSON schema that matches any value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `AnySchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def any_schema(self, schema: core_schema.AnySchema) -> JsonSchemaValue: """Generates a JSON schema that matches any value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {} ``` ### none_schema ```python none_schema(schema: NoneSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches `None`. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `NoneSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def none_schema(self, schema: core_schema.NoneSchema) -> JsonSchemaValue: """Generates a JSON schema that matches `None`. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'null'} ``` ### bool_schema ```python bool_schema(schema: BoolSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a bool value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `BoolSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def bool_schema(self, schema: core_schema.BoolSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a bool value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'boolean'} ``` ### int_schema ```python int_schema(schema: IntSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches an int value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `IntSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def int_schema(self, schema: core_schema.IntSchema) -> JsonSchemaValue: """Generates a JSON schema that matches an int value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema: dict[str, Any] = {'type': 'integer'} self.update_with_validations(json_schema, schema, self.ValidationsMapping.numeric) json_schema = {k: v for k, v in json_schema.items() if v not in {math.inf, -math.inf}} return json_schema ``` ### float_schema ```python float_schema(schema: FloatSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a float value. 
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `FloatSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def float_schema(self, schema: core_schema.FloatSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a float value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema: dict[str, Any] = {'type': 'number'} self.update_with_validations(json_schema, schema, self.ValidationsMapping.numeric) json_schema = {k: v for k, v in json_schema.items() if v not in {math.inf, -math.inf}} return json_schema ``` ### decimal_schema ```python decimal_schema(schema: DecimalSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a decimal value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DecimalSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def decimal_schema(self, schema: core_schema.DecimalSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a decimal value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = self.str_schema(core_schema.str_schema()) if self.mode == 'validation': multiple_of = schema.get('multiple_of') le = schema.get('le') ge = schema.get('ge') lt = schema.get('lt') gt = schema.get('gt') json_schema = { 'anyOf': [ self.float_schema( core_schema.float_schema( allow_inf_nan=schema.get('allow_inf_nan'), multiple_of=None if multiple_of is None else float(multiple_of), le=None if le is None else float(le), ge=None if ge is None else float(ge), lt=None if lt is None else float(lt), gt=None if gt is None else float(gt), ) ), json_schema, ], } return json_schema ``` ### str_schema ```python str_schema(schema: StringSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a string value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `StringSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def str_schema(self, schema: core_schema.StringSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a string value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = {'type': 'string'} self.update_with_validations(json_schema, schema, self.ValidationsMapping.string) if isinstance(json_schema.get('pattern'), Pattern): # TODO: should we add regex flags to the pattern? json_schema['pattern'] = json_schema.get('pattern').pattern # type: ignore return json_schema ``` ### bytes_schema ```python bytes_schema(schema: BytesSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a bytes value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `BytesSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def bytes_schema(self, schema: core_schema.BytesSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a bytes value. Args: schema: The core schema. Returns: The generated JSON schema. 
""" json_schema = {'type': 'string', 'format': 'base64url' if self._config.ser_json_bytes == 'base64' else 'binary'} self.update_with_validations(json_schema, schema, self.ValidationsMapping.bytes) return json_schema ``` ### date_schema ```python date_schema(schema: DateSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a date value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DateSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def date_schema(self, schema: core_schema.DateSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a date value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'date'} ``` ### time_schema ```python time_schema(schema: TimeSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a time value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `TimeSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def time_schema(self, schema: core_schema.TimeSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a time value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'time'} ``` ### datetime_schema ```python datetime_schema(schema: DatetimeSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a datetime value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DatetimeSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def datetime_schema(self, schema: core_schema.DatetimeSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a datetime value. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'date-time'} ``` ### timedelta_schema ```python timedelta_schema( schema: TimedeltaSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a timedelta value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `TimedeltaSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def timedelta_schema(self, schema: core_schema.TimedeltaSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a timedelta value. Args: schema: The core schema. Returns: The generated JSON schema. """ if self._config.ser_json_timedelta == 'float': return {'type': 'number'} return {'type': 'string', 'format': 'duration'} ``` ### literal_schema ```python literal_schema(schema: LiteralSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a literal value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `LiteralSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. 
| Source code in `pydantic/json_schema.py` ```python def literal_schema(self, schema: core_schema.LiteralSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a literal value. Args: schema: The core schema. Returns: The generated JSON schema. """ expected = [to_jsonable_python(v.value if isinstance(v, Enum) else v) for v in schema['expected']] result: dict[str, Any] = {} if len(expected) == 1: result['const'] = expected[0] else: result['enum'] = expected types = {type(e) for e in expected} if types == {str}: result['type'] = 'string' elif types == {int}: result['type'] = 'integer' elif types == {float}: result['type'] = 'number' elif types == {bool}: result['type'] = 'boolean' elif types == {list}: result['type'] = 'array' elif types == {type(None)}: result['type'] = 'null' return result ``` ### enum_schema ```python enum_schema(schema: EnumSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches an Enum value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `EnumSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def enum_schema(self, schema: core_schema.EnumSchema) -> JsonSchemaValue: """Generates a JSON schema that matches an Enum value. Args: schema: The core schema. Returns: The generated JSON schema. """ enum_type = schema['cls'] description = None if not enum_type.__doc__ else inspect.cleandoc(enum_type.__doc__) if ( description == 'An enumeration.' ): # This is the default value provided by enum.EnumMeta.__new__; don't use it description = None result: dict[str, Any] = {'title': enum_type.__name__, 'description': description} result = {k: v for k, v in result.items() if v is not None} expected = [to_jsonable_python(v.value) for v in schema['members']] result['enum'] = expected types = {type(e) for e in expected} if isinstance(enum_type, str) or types == {str}: result['type'] = 'string' elif isinstance(enum_type, int) or types == {int}: result['type'] = 'integer' elif isinstance(enum_type, float) or types == {float}: result['type'] = 'number' elif types == {bool}: result['type'] = 'boolean' elif types == {list}: result['type'] = 'array' return result ``` ### is_instance_schema ```python is_instance_schema( schema: IsInstanceSchema, ) -> JsonSchemaValue ``` Handles JSON schema generation for a core schema that checks if a value is an instance of a class. Unless overridden in a subclass, this raises an error. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `IsInstanceSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def is_instance_schema(self, schema: core_schema.IsInstanceSchema) -> JsonSchemaValue: """Handles JSON schema generation for a core schema that checks if a value is an instance of a class. Unless overridden in a subclass, this raises an error. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.handle_invalid_for_json_schema(schema, f'core_schema.IsInstanceSchema ({schema["cls"]})') ``` ### is_subclass_schema ```python is_subclass_schema( schema: IsSubclassSchema, ) -> JsonSchemaValue ``` Handles JSON schema generation for a core schema that checks if a value is a subclass of a class. 
For backwards compatibility with v1, this does not raise an error, but can be overridden to change this. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `IsSubclassSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def is_subclass_schema(self, schema: core_schema.IsSubclassSchema) -> JsonSchemaValue: """Handles JSON schema generation for a core schema that checks if a value is a subclass of a class. For backwards compatibility with v1, this does not raise an error, but can be overridden to change this. Args: schema: The core schema. Returns: The generated JSON schema. """ # Note: This is for compatibility with V1; you can override if you want different behavior. return {} ``` ### callable_schema ```python callable_schema(schema: CallableSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a callable value. Unless overridden in a subclass, this raises an error. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CallableSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def callable_schema(self, schema: core_schema.CallableSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a callable value. Unless overridden in a subclass, this raises an error. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.handle_invalid_for_json_schema(schema, 'core_schema.CallableSchema') ``` ### list_schema ```python list_schema(schema: ListSchema) -> JsonSchemaValue ``` Returns a schema that matches a list schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ListSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def list_schema(self, schema: core_schema.ListSchema) -> JsonSchemaValue: """Returns a schema that matches a list schema. Args: schema: The core schema. Returns: The generated JSON schema. """ items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema']) json_schema = {'type': 'array', 'items': items_schema} self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema ``` ### tuple_positional_schema ```python tuple_positional_schema( schema: TupleSchema, ) -> JsonSchemaValue ``` Replaced by `tuple_schema`. Source code in `pydantic/json_schema.py` ```python @deprecated('`tuple_positional_schema` is deprecated. Use `tuple_schema` instead.', category=None) @final def tuple_positional_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue: """Replaced by `tuple_schema`.""" warnings.warn( '`tuple_positional_schema` is deprecated. Use `tuple_schema` instead.', PydanticDeprecatedSince26, stacklevel=2, ) return self.tuple_schema(schema) ``` ### tuple_variable_schema ```python tuple_variable_schema( schema: TupleSchema, ) -> JsonSchemaValue ``` Replaced by `tuple_schema`. Source code in `pydantic/json_schema.py` ```python @deprecated('`tuple_variable_schema` is deprecated. 
Use `tuple_schema` instead.', category=None) @final def tuple_variable_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue: """Replaced by `tuple_schema`.""" warnings.warn( '`tuple_variable_schema` is deprecated. Use `tuple_schema` instead.', PydanticDeprecatedSince26, stacklevel=2, ) return self.tuple_schema(schema) ``` ### tuple_schema ```python tuple_schema(schema: TupleSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a tuple schema e.g. `tuple[int, str, bool]` or `tuple[int, ...]`. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `TupleSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def tuple_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a tuple schema e.g. `tuple[int, str, bool]` or `tuple[int, ...]`. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema: JsonSchemaValue = {'type': 'array'} if 'variadic_item_index' in schema: variadic_item_index = schema['variadic_item_index'] if variadic_item_index > 0: json_schema['minItems'] = variadic_item_index json_schema['prefixItems'] = [ self.generate_inner(item) for item in schema['items_schema'][:variadic_item_index] ] if variadic_item_index + 1 == len(schema['items_schema']): # if the variadic item is the last item, then represent it faithfully json_schema['items'] = self.generate_inner(schema['items_schema'][variadic_item_index]) else: # otherwise, 'items' represents the schema for the variadic # item plus the suffix, so just allow anything for simplicity # for now json_schema['items'] = True else: prefixItems = [self.generate_inner(item) for item in schema['items_schema']] if prefixItems: json_schema['prefixItems'] = prefixItems json_schema['minItems'] = len(prefixItems) json_schema['maxItems'] = len(prefixItems) self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema ``` ### set_schema ```python set_schema(schema: SetSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a set schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `SetSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def set_schema(self, schema: core_schema.SetSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a set schema. Args: schema: The core schema. Returns: The generated JSON schema. """ return self._common_set_schema(schema) ``` ### frozenset_schema ```python frozenset_schema( schema: FrozenSetSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a frozenset schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `FrozenSetSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def frozenset_schema(self, schema: core_schema.FrozenSetSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a frozenset schema. Args: schema: The core schema. Returns: The generated JSON schema. 
""" return self._common_set_schema(schema) ``` ### generator_schema ```python generator_schema( schema: GeneratorSchema, ) -> JsonSchemaValue ``` Returns a JSON schema that represents the provided GeneratorSchema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `GeneratorSchema` | The schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def generator_schema(self, schema: core_schema.GeneratorSchema) -> JsonSchemaValue: """Returns a JSON schema that represents the provided GeneratorSchema. Args: schema: The schema. Returns: The generated JSON schema. """ items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema']) json_schema = {'type': 'array', 'items': items_schema} self.update_with_validations(json_schema, schema, self.ValidationsMapping.array) return json_schema ``` ### dict_schema ```python dict_schema(schema: DictSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a dict schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DictSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def dict_schema(self, schema: core_schema.DictSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a dict schema. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema: JsonSchemaValue = {'type': 'object'} keys_schema = self.generate_inner(schema['keys_schema']).copy() if 'keys_schema' in schema else {} if '$ref' not in keys_schema: keys_pattern = keys_schema.pop('pattern', None) # Don't give a title to patternProperties/propertyNames: keys_schema.pop('title', None) else: # Here, we assume that if the keys schema is a definition reference, # it can't be a simple string core schema (and thus no pattern can exist). # However, this is only in practice (in theory, a definition reference core # schema could be generated for a simple string schema). # Note that we avoid calling `self.resolve_ref_schema`, as it might not exist yet. keys_pattern = None values_schema = self.generate_inner(schema['values_schema']).copy() if 'values_schema' in schema else {} # don't give a title to additionalProperties: values_schema.pop('title', None) if values_schema or keys_pattern is not None: if keys_pattern is None: json_schema['additionalProperties'] = values_schema else: json_schema['patternProperties'] = {keys_pattern: values_schema} else: # for `dict[str, Any]`, we allow any key and any value, since `str` is the default key type json_schema['additionalProperties'] = True if ( # The len check indicates that constraints are probably present: (keys_schema.get('type') == 'string' and len(keys_schema) > 1) # If this is a definition reference schema, it most likely has constraints: or '$ref' in keys_schema ): keys_schema.pop('type', None) json_schema['propertyNames'] = keys_schema self.update_with_validations(json_schema, schema, self.ValidationsMapping.object) return json_schema ``` ### function_before_schema ```python function_before_schema( schema: BeforeValidatorFunctionSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a function-before schema. 
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `BeforeValidatorFunctionSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def function_before_schema(self, schema: core_schema.BeforeValidatorFunctionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a function-before schema. Args: schema: The core schema. Returns: The generated JSON schema. """ if self.mode == 'validation' and (input_schema := schema.get('json_schema_input_schema')): return self.generate_inner(input_schema) return self.generate_inner(schema['schema']) ``` ### function_after_schema ```python function_after_schema( schema: AfterValidatorFunctionSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a function-after schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `AfterValidatorFunctionSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def function_after_schema(self, schema: core_schema.AfterValidatorFunctionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a function-after schema. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) ``` ### function_plain_schema ```python function_plain_schema( schema: PlainValidatorFunctionSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a function-plain schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `PlainValidatorFunctionSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def function_plain_schema(self, schema: core_schema.PlainValidatorFunctionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a function-plain schema. Args: schema: The core schema. Returns: The generated JSON schema. """ if self.mode == 'validation' and (input_schema := schema.get('json_schema_input_schema')): return self.generate_inner(input_schema) return self.handle_invalid_for_json_schema( schema, f'core_schema.PlainValidatorFunctionSchema ({schema["function"]})' ) ``` ### function_wrap_schema ```python function_wrap_schema( schema: WrapValidatorFunctionSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a function-wrap schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `WrapValidatorFunctionSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def function_wrap_schema(self, schema: core_schema.WrapValidatorFunctionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a function-wrap schema. Args: schema: The core schema. Returns: The generated JSON schema. """ if self.mode == 'validation' and (input_schema := schema.get('json_schema_input_schema')): return self.generate_inner(input_schema) return self.generate_inner(schema['schema']) ``` ### default_schema ```python default_schema( schema: WithDefaultSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema with a default value. 
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `WithDefaultSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def default_schema(self, schema: core_schema.WithDefaultSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema with a default value. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = self.generate_inner(schema['schema']) default = self.get_default_value(schema) if default is NoDefault: return json_schema # we reflect the application of custom plain, no-info serializers to defaults for # JSON Schemas viewed in serialization mode: # TODO: improvements along with https://github.com/pydantic/pydantic/issues/8208 if ( self.mode == 'serialization' and (ser_schema := schema['schema'].get('serialization')) and (ser_func := ser_schema.get('function')) and ser_schema.get('type') == 'function-plain' and not ser_schema.get('info_arg') and not (default is None and ser_schema.get('when_used') in ('unless-none', 'json-unless-none')) ): try: default = ser_func(default) # type: ignore except Exception: # It might be that the provided default needs to be validated (read: parsed) first # (assuming `validate_default` is enabled). However, we can't perform # such validation during JSON Schema generation so we don't support # this pattern for now. # (One example is when using `foo: ByteSize = '1MB'`, which validates and # serializes as an int. In this case, `ser_func` is `int` and `int('1MB')` fails). self.emit_warning( 'non-serializable-default', f'Unable to serialize value {default!r} with the plain serializer; excluding default from JSON schema', ) return json_schema try: encoded_default = self.encode_default(default) except pydantic_core.PydanticSerializationError: self.emit_warning( 'non-serializable-default', f'Default value {default} is not JSON serializable; excluding default from JSON schema', ) # Return the inner schema, as though there was no default return json_schema json_schema['default'] = encoded_default return json_schema ``` ### get_default_value ```python get_default_value(schema: WithDefaultSchema) -> Any ``` Get the default value to be used when generating a JSON Schema for a core schema with a default. The default implementation is to use the statically defined default value. This method can be overridden if you want to make use of the default factory. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `WithDefaultSchema` | The 'with-default' core schema. | *required* | Returns: | Type | Description | | --- | --- | | `Any` | The default value to use, or NoDefault if no default value is available. | Source code in `pydantic/json_schema.py` ```python def get_default_value(self, schema: core_schema.WithDefaultSchema) -> Any: """Get the default value to be used when generating a JSON Schema for a core schema with a default. The default implementation is to use the statically defined default value. This method can be overridden if you want to make use of the default factory. Args: schema: The `'with-default'` core schema. Returns: The default value to use, or [`NoDefault`][pydantic.json_schema.NoDefault] if no default value is available. 
""" return schema.get('default', NoDefault) ``` ### nullable_schema ```python nullable_schema(schema: NullableSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that allows null values. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `NullableSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def nullable_schema(self, schema: core_schema.NullableSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that allows null values. Args: schema: The core schema. Returns: The generated JSON schema. """ null_schema = {'type': 'null'} inner_json_schema = self.generate_inner(schema['schema']) if inner_json_schema == null_schema: return null_schema else: # Thanks to the equality check against `null_schema` above, I think 'oneOf' would also be valid here; # I'll use 'anyOf' for now, but it could be changed it if it would work better with some external tooling return self.get_flattened_anyof([inner_json_schema, null_schema]) ``` ### union_schema ```python union_schema(schema: UnionSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that allows values matching any of the given schemas. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `UnionSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def union_schema(self, schema: core_schema.UnionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that allows values matching any of the given schemas. Args: schema: The core schema. Returns: The generated JSON schema. """ generated: list[JsonSchemaValue] = [] choices = schema['choices'] for choice in choices: # choice will be a tuple if an explicit label was provided choice_schema = choice[0] if isinstance(choice, tuple) else choice try: generated.append(self.generate_inner(choice_schema)) except PydanticOmit: continue except PydanticInvalidForJsonSchema as exc: self.emit_warning('skipped-choice', exc.message) if len(generated) == 1: return generated[0] return self.get_flattened_anyof(generated) ``` ### tagged_union_schema ```python tagged_union_schema( schema: TaggedUnionSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `TaggedUnionSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def tagged_union_schema(self, schema: core_schema.TaggedUnionSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value. Args: schema: The core schema. Returns: The generated JSON schema. 
""" generated: dict[str, JsonSchemaValue] = {} for k, v in schema['choices'].items(): if isinstance(k, Enum): k = k.value try: # Use str(k) since keys must be strings for json; while not technically correct, # it's the closest that can be represented in valid JSON generated[str(k)] = self.generate_inner(v).copy() except PydanticOmit: continue except PydanticInvalidForJsonSchema as exc: self.emit_warning('skipped-choice', exc.message) one_of_choices = _deduplicate_schemas(generated.values()) json_schema: JsonSchemaValue = {'oneOf': one_of_choices} # This reflects the v1 behavior; TODO: we should make it possible to exclude OpenAPI stuff from the JSON schema openapi_discriminator = self._extract_discriminator(schema, one_of_choices) if openapi_discriminator is not None: json_schema['discriminator'] = { 'propertyName': openapi_discriminator, 'mapping': {k: v.get('$ref', v) for k, v in generated.items()}, } return json_schema ``` ### chain_schema ```python chain_schema(schema: ChainSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a core_schema.ChainSchema. When generating a schema for validation, we return the validation JSON schema for the first step in the chain. For serialization, we return the serialization JSON schema for the last step in the chain. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ChainSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def chain_schema(self, schema: core_schema.ChainSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a core_schema.ChainSchema. When generating a schema for validation, we return the validation JSON schema for the first step in the chain. For serialization, we return the serialization JSON schema for the last step in the chain. Args: schema: The core schema. Returns: The generated JSON schema. """ step_index = 0 if self.mode == 'validation' else -1 # use first step for validation, last for serialization return self.generate_inner(schema['steps'][step_index]) ``` ### lax_or_strict_schema ```python lax_or_strict_schema( schema: LaxOrStrictSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that allows values matching either the lax schema or the strict schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `LaxOrStrictSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def lax_or_strict_schema(self, schema: core_schema.LaxOrStrictSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that allows values matching either the lax schema or the strict schema. Args: schema: The core schema. Returns: The generated JSON schema. """ # TODO: Need to read the default value off of model config or whatever use_strict = schema.get('strict', False) # TODO: replace this default False # If your JSON schema fails to generate it is probably # because one of the following two branches failed. 
if use_strict: return self.generate_inner(schema['strict_schema']) else: return self.generate_inner(schema['lax_schema']) ``` ### json_or_python_schema ```python json_or_python_schema( schema: JsonOrPythonSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the Python schema. The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override this method. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `JsonOrPythonSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def json_or_python_schema(self, schema: core_schema.JsonOrPythonSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the Python schema. The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override this method. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['json_schema']) ``` ### typed_dict_schema ```python typed_dict_schema( schema: TypedDictSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a typed dict. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `TypedDictSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def typed_dict_schema(self, schema: core_schema.TypedDictSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a typed dict. Args: schema: The core schema. Returns: The generated JSON schema. """ total = schema.get('total', True) named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [ (name, self.field_is_required(field, total), field) for name, field in schema['fields'].items() if self.field_is_present(field) ] if self.mode == 'serialization': named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', []))) cls = schema.get('cls') config = _get_typed_dict_config(cls) with self._config_wrapper_stack.push(config): json_schema = self._named_required_fields_schema(named_required_fields) if cls is not None: self._update_class_schema(json_schema, cls, config) else: extra = config.get('extra') if extra == 'forbid': json_schema['additionalProperties'] = False elif extra == 'allow': json_schema['additionalProperties'] = True return json_schema ``` ### typed_dict_field_schema ```python typed_dict_field_schema( schema: TypedDictField, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a typed dict field. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `TypedDictField` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def typed_dict_field_schema(self, schema: core_schema.TypedDictField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a typed dict field. Args: schema: The core schema. Returns: The generated JSON schema. 
""" return self.generate_inner(schema['schema']) ``` ### dataclass_field_schema ```python dataclass_field_schema( schema: DataclassField, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a dataclass field. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DataclassField` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def dataclass_field_schema(self, schema: core_schema.DataclassField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a dataclass field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) ``` ### model_field_schema ```python model_field_schema(schema: ModelField) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a model field. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ModelField` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def model_field_schema(self, schema: core_schema.ModelField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a model field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) ``` ### computed_field_schema ```python computed_field_schema( schema: ComputedField, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a computed field. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ComputedField` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def computed_field_schema(self, schema: core_schema.ComputedField) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a computed field. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['return_schema']) ``` ### model_schema ```python model_schema(schema: ModelSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ModelSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def model_schema(self, schema: core_schema.ModelSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a model. Args: schema: The core schema. Returns: The generated JSON schema. """ # We do not use schema['model'].model_json_schema() here # because it could lead to inconsistent refs handling, etc. cls = cast('type[BaseModel]', schema['cls']) config = cls.model_config with self._config_wrapper_stack.push(config): json_schema = self.generate_inner(schema['schema']) self._update_class_schema(json_schema, cls, config) return json_schema ``` ### resolve_ref_schema ```python resolve_ref_schema( json_schema: JsonSchemaValue, ) -> JsonSchemaValue ``` Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema. 
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `json_schema` | `JsonSchemaValue` | The schema to resolve. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The resolved schema. | Raises: | Type | Description | | --- | --- | | `RuntimeError` | If the schema reference can't be found in definitions. | Source code in `pydantic/json_schema.py` ```python def resolve_ref_schema(self, json_schema: JsonSchemaValue) -> JsonSchemaValue: """Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema. Args: json_schema: The schema to resolve. Returns: The resolved schema. Raises: RuntimeError: If the schema reference can't be found in definitions. """ while '$ref' in json_schema: ref = json_schema['$ref'] schema_to_update = self.get_schema_from_definitions(JsonRef(ref)) if schema_to_update is None: raise RuntimeError(f'Cannot update undefined schema for $ref={ref}') json_schema = schema_to_update return json_schema ``` ### model_fields_schema ```python model_fields_schema( schema: ModelFieldsSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a model's fields. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ModelFieldsSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def model_fields_schema(self, schema: core_schema.ModelFieldsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a model's fields. Args: schema: The core schema. Returns: The generated JSON schema. """ named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [ (name, self.field_is_required(field, total=True), field) for name, field in schema['fields'].items() if self.field_is_present(field) ] if self.mode == 'serialization': named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', []))) json_schema = self._named_required_fields_schema(named_required_fields) extras_schema = schema.get('extras_schema', None) if extras_schema is not None: schema_to_update = self.resolve_ref_schema(json_schema) schema_to_update['additionalProperties'] = self.generate_inner(extras_schema) return json_schema ``` ### field_is_present ```python field_is_present(field: CoreSchemaField) -> bool ``` Whether the field should be included in the generated JSON schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `field` | `CoreSchemaField` | The schema for the field itself. | *required* | Returns: | Type | Description | | --- | --- | | `bool` | True if the field should be included in the generated JSON schema, False otherwise. | Source code in `pydantic/json_schema.py` ```python def field_is_present(self, field: CoreSchemaField) -> bool: """Whether the field should be included in the generated JSON schema. Args: field: The schema for the field itself. Returns: `True` if the field should be included in the generated JSON schema, `False` otherwise. 
""" if self.mode == 'serialization': # If you still want to include the field in the generated JSON schema, # override this method and return True return not field.get('serialization_exclude') elif self.mode == 'validation': return True else: assert_never(self.mode) ``` ### field_is_required ```python field_is_required( field: ModelField | DataclassField | TypedDictField, total: bool, ) -> bool ``` Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `field` | `ModelField | DataclassField | TypedDictField` | The schema for the field itself. | *required* | | `total` | `bool` | Only applies to TypedDictFields. Indicates if the TypedDict this field belongs to is total, in which case any fields that don't explicitly specify required=False are required. | *required* | Returns: | Type | Description | | --- | --- | | `bool` | True if the field should be marked as required in the generated JSON schema, False otherwise. | Source code in `pydantic/json_schema.py` ```python def field_is_required( self, field: core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField, total: bool, ) -> bool: """Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.). Args: field: The schema for the field itself. total: Only applies to `TypedDictField`s. Indicates if the `TypedDict` this field belongs to is total, in which case any fields that don't explicitly specify `required=False` are required. Returns: `True` if the field should be marked as required in the generated JSON schema, `False` otherwise. """ if self.mode == 'serialization' and self._config.json_schema_serialization_defaults_required: return not field.get('serialization_exclude') else: if field['type'] == 'typed-dict-field': return field.get('required', total) else: return field['schema']['type'] != 'default' ``` ### dataclass_args_schema ```python dataclass_args_schema( schema: DataclassArgsSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a dataclass's constructor arguments. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DataclassArgsSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def dataclass_args_schema(self, schema: core_schema.DataclassArgsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a dataclass's constructor arguments. Args: schema: The core schema. Returns: The generated JSON schema. """ named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [ (field['name'], self.field_is_required(field, total=True), field) for field in schema['fields'] if self.field_is_present(field) ] if self.mode == 'serialization': named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', []))) return self._named_required_fields_schema(named_required_fields) ``` ### dataclass_schema ```python dataclass_schema( schema: DataclassSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a dataclass. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DataclassSchema` | The core schema. 
| *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def dataclass_schema(self, schema: core_schema.DataclassSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a dataclass. Args: schema: The core schema. Returns: The generated JSON schema. """ from ._internal._dataclasses import is_builtin_dataclass cls = schema['cls'] config: ConfigDict = getattr(cls, '__pydantic_config__', cast('ConfigDict', {})) with self._config_wrapper_stack.push(config): json_schema = self.generate_inner(schema['schema']).copy() self._update_class_schema(json_schema, cls, config) # Dataclass-specific handling of description if is_builtin_dataclass(cls): # vanilla dataclass; don't use cls.__doc__ as it will contain the class signature by default description = None else: description = None if cls.__doc__ is None else inspect.cleandoc(cls.__doc__) if description: json_schema['description'] = description return json_schema ``` ### arguments_schema ```python arguments_schema( schema: ArgumentsSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a function's arguments. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ArgumentsSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def arguments_schema(self, schema: core_schema.ArgumentsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's arguments. Args: schema: The core schema. Returns: The generated JSON schema. """ prefer_positional = schema.get('metadata', {}).get('pydantic_js_prefer_positional_arguments') arguments = schema['arguments_schema'] kw_only_arguments = [a for a in arguments if a.get('mode') == 'keyword_only'] kw_or_p_arguments = [a for a in arguments if a.get('mode') in {'positional_or_keyword', None}] p_only_arguments = [a for a in arguments if a.get('mode') == 'positional_only'] var_args_schema = schema.get('var_args_schema') var_kwargs_schema = schema.get('var_kwargs_schema') if prefer_positional: positional_possible = not kw_only_arguments and not var_kwargs_schema if positional_possible: return self.p_arguments_schema(p_only_arguments + kw_or_p_arguments, var_args_schema) keyword_possible = not p_only_arguments and not var_args_schema if keyword_possible: return self.kw_arguments_schema(kw_or_p_arguments + kw_only_arguments, var_kwargs_schema) if not prefer_positional: positional_possible = not kw_only_arguments and not var_kwargs_schema if positional_possible: return self.p_arguments_schema(p_only_arguments + kw_or_p_arguments, var_args_schema) raise PydanticInvalidForJsonSchema( 'Unable to generate JSON schema for arguments validator with positional-only and keyword-only arguments' ) ``` ### kw_arguments_schema ```python kw_arguments_schema( arguments: list[ArgumentsParameter], var_kwargs_schema: CoreSchema | None, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a function's keyword arguments. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `arguments` | `list[ArgumentsParameter]` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. 
| Source code in `pydantic/json_schema.py` ```python def kw_arguments_schema( self, arguments: list[core_schema.ArgumentsParameter], var_kwargs_schema: CoreSchema | None ) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's keyword arguments. Args: arguments: The core schema. Returns: The generated JSON schema. """ properties: dict[str, JsonSchemaValue] = {} required: list[str] = [] for argument in arguments: name = self.get_argument_name(argument) argument_schema = self.generate_inner(argument['schema']).copy() argument_schema['title'] = self.get_title_from_name(name) properties[name] = argument_schema if argument['schema']['type'] != 'default': # This assumes that if the argument has a default value, # the inner schema must be of type WithDefaultSchema. # I believe this is true, but I am not 100% sure required.append(name) json_schema: JsonSchemaValue = {'type': 'object', 'properties': properties} if required: json_schema['required'] = required if var_kwargs_schema: additional_properties_schema = self.generate_inner(var_kwargs_schema) if additional_properties_schema: json_schema['additionalProperties'] = additional_properties_schema else: json_schema['additionalProperties'] = False return json_schema ``` ### p_arguments_schema ```python p_arguments_schema( arguments: list[ArgumentsParameter], var_args_schema: CoreSchema | None, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a function's positional arguments. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `arguments` | `list[ArgumentsParameter]` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def p_arguments_schema( self, arguments: list[core_schema.ArgumentsParameter], var_args_schema: CoreSchema | None ) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's positional arguments. Args: arguments: The core schema. Returns: The generated JSON schema. """ prefix_items: list[JsonSchemaValue] = [] min_items = 0 for argument in arguments: name = self.get_argument_name(argument) argument_schema = self.generate_inner(argument['schema']).copy() argument_schema['title'] = self.get_title_from_name(name) prefix_items.append(argument_schema) if argument['schema']['type'] != 'default': # This assumes that if the argument has a default value, # the inner schema must be of type WithDefaultSchema. # I believe this is true, but I am not 100% sure min_items += 1 json_schema: JsonSchemaValue = {'type': 'array'} if prefix_items: json_schema['prefixItems'] = prefix_items if min_items: json_schema['minItems'] = min_items if var_args_schema: items_schema = self.generate_inner(var_args_schema) if items_schema: json_schema['items'] = items_schema else: json_schema['maxItems'] = len(prefix_items) return json_schema ``` ### get_argument_name ```python get_argument_name( argument: ArgumentsParameter | ArgumentsV3Parameter, ) -> str ``` Retrieves the name of an argument. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `argument` | `ArgumentsParameter | ArgumentsV3Parameter` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The name of the argument. 
| Source code in `pydantic/json_schema.py` ```python def get_argument_name(self, argument: core_schema.ArgumentsParameter | core_schema.ArgumentsV3Parameter) -> str: """Retrieves the name of an argument. Args: argument: The core schema. Returns: The name of the argument. """ name = argument['name'] if self.by_alias: alias = argument.get('alias') if isinstance(alias, str): name = alias else: pass # might want to do something else? return name ``` ### arguments_v3_schema ```python arguments_v3_schema( schema: ArgumentsV3Schema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a function's arguments. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ArgumentsV3Schema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def arguments_v3_schema(self, schema: core_schema.ArgumentsV3Schema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function's arguments. Args: schema: The core schema. Returns: The generated JSON schema. """ arguments = schema['arguments_schema'] properties: dict[str, JsonSchemaValue] = {} required: list[str] = [] for argument in arguments: mode = argument.get('mode', 'positional_or_keyword') name = self.get_argument_name(argument) argument_schema = self.generate_inner(argument['schema']).copy() if mode == 'var_args': argument_schema = {'type': 'array', 'items': argument_schema} elif mode == 'var_kwargs_uniform': argument_schema = {'type': 'object', 'additionalProperties': argument_schema} argument_schema.setdefault('title', self.get_title_from_name(name)) properties[name] = argument_schema if ( (mode == 'var_kwargs_unpacked_typed_dict' and 'required' in argument_schema) or mode not in {'var_args', 'var_kwargs_uniform', 'var_kwargs_unpacked_typed_dict'} and argument['schema']['type'] != 'default' ): # This assumes that if the argument has a default value, # the inner schema must be of type WithDefaultSchema. # I believe this is true, but I am not 100% sure required.append(name) json_schema: JsonSchemaValue = {'type': 'object', 'properties': properties} if required: json_schema['required'] = required return json_schema ``` ### call_schema ```python call_schema(schema: CallSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a function call. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CallSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def call_schema(self, schema: core_schema.CallSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a function call. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['arguments_schema']) ``` ### custom_error_schema ```python custom_error_schema( schema: CustomErrorSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a custom error. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CustomErrorSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. 
| Source code in `pydantic/json_schema.py` ```python def custom_error_schema(self, schema: core_schema.CustomErrorSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a custom error. Args: schema: The core schema. Returns: The generated JSON schema. """ return self.generate_inner(schema['schema']) ``` ### json_schema ```python json_schema(schema: JsonSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a JSON object. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `JsonSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def json_schema(self, schema: core_schema.JsonSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a JSON object. Args: schema: The core schema. Returns: The generated JSON schema. """ content_core_schema = schema.get('schema') or core_schema.any_schema() content_json_schema = self.generate_inner(content_core_schema) if self.mode == 'validation': return {'type': 'string', 'contentMediaType': 'application/json', 'contentSchema': content_json_schema} else: # self.mode == 'serialization' return content_json_schema ``` ### url_schema ```python url_schema(schema: UrlSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a URL. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `UrlSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def url_schema(self, schema: core_schema.UrlSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a URL. Args: schema: The core schema. Returns: The generated JSON schema. """ json_schema = {'type': 'string', 'format': 'uri', 'minLength': 1} self.update_with_validations(json_schema, schema, self.ValidationsMapping.string) return json_schema ``` ### multi_host_url_schema ```python multi_host_url_schema( schema: MultiHostUrlSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `MultiHostUrlSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def multi_host_url_schema(self, schema: core_schema.MultiHostUrlSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts. Args: schema: The core schema. Returns: The generated JSON schema. """ # Note: 'multi-host-uri' is a custom/pydantic-specific format, not part of the JSON Schema spec json_schema = {'type': 'string', 'format': 'multi-host-uri', 'minLength': 1} self.update_with_validations(json_schema, schema, self.ValidationsMapping.string) return json_schema ``` ### uuid_schema ```python uuid_schema(schema: UuidSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a UUID. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `UuidSchema` | The core schema. 
| *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def uuid_schema(self, schema: core_schema.UuidSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a UUID. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string', 'format': 'uuid'} ``` ### definitions_schema ```python definitions_schema( schema: DefinitionsSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that defines a JSON object with definitions. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DefinitionsSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def definitions_schema(self, schema: core_schema.DefinitionsSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that defines a JSON object with definitions. Args: schema: The core schema. Returns: The generated JSON schema. """ for definition in schema['definitions']: try: self.generate_inner(definition) except PydanticInvalidForJsonSchema as e: core_ref: CoreRef = CoreRef(definition['ref']) # type: ignore self._core_defs_invalid_for_json_schema[self.get_defs_ref((core_ref, self.mode))] = e continue return self.generate_inner(schema['schema']) ``` ### definition_ref_schema ```python definition_ref_schema( schema: DefinitionReferenceSchema, ) -> JsonSchemaValue ``` Generates a JSON schema that matches a schema that references a definition. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `DefinitionReferenceSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def definition_ref_schema(self, schema: core_schema.DefinitionReferenceSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a schema that references a definition. Args: schema: The core schema. Returns: The generated JSON schema. """ core_ref = CoreRef(schema['schema_ref']) _, ref_json_schema = self.get_cache_defs_ref_schema(core_ref) return ref_json_schema ``` ### ser_schema ```python ser_schema( schema: ( SerSchema | IncExSeqSerSchema | IncExDictSerSchema ), ) -> JsonSchemaValue | None ``` Generates a JSON schema that matches a schema that defines a serialized object. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `SerSchema | IncExSeqSerSchema | IncExDictSerSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue | None` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def ser_schema( self, schema: core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema ) -> JsonSchemaValue | None: """Generates a JSON schema that matches a schema that defines a serialized object. Args: schema: The core schema. Returns: The generated JSON schema. 
""" schema_type = schema['type'] if schema_type == 'function-plain' or schema_type == 'function-wrap': # PlainSerializerFunctionSerSchema or WrapSerializerFunctionSerSchema return_schema = schema.get('return_schema') if return_schema is not None: return self.generate_inner(return_schema) elif schema_type == 'format' or schema_type == 'to-string': # FormatSerSchema or ToStringSerSchema return self.str_schema(core_schema.str_schema()) elif schema['type'] == 'model': # ModelSerSchema return self.generate_inner(schema['schema']) return None ``` ### complex_schema ```python complex_schema(schema: ComplexSchema) -> JsonSchemaValue ``` Generates a JSON schema that matches a complex number. JSON has no standard way to represent complex numbers. Complex number is not a numeric type. Here we represent complex number as strings following the rule defined by Python. For instance, '1+2j' is an accepted complex string. Details can be found in Python's complex documentation. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `ComplexSchema` | The core schema. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The generated JSON schema. | Source code in `pydantic/json_schema.py` ```python def complex_schema(self, schema: core_schema.ComplexSchema) -> JsonSchemaValue: """Generates a JSON schema that matches a complex number. JSON has no standard way to represent complex numbers. Complex number is not a numeric type. Here we represent complex number as strings following the rule defined by Python. For instance, '1+2j' is an accepted complex string. Details can be found in [Python's `complex` documentation][complex]. Args: schema: The core schema. Returns: The generated JSON schema. """ return {'type': 'string'} ``` ### get_title_from_name ```python get_title_from_name(name: str) -> str ``` Retrieves a title from a name. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `name` | `str` | The name to retrieve a title from. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The title. | Source code in `pydantic/json_schema.py` ```python def get_title_from_name(self, name: str) -> str: """Retrieves a title from a name. Args: name: The name to retrieve a title from. Returns: The title. """ return name.title().replace('_', ' ').strip() ``` ### field_title_should_be_set ```python field_title_should_be_set( schema: CoreSchemaOrField, ) -> bool ``` Returns true if a field with the given schema should have a title set based on the field name. Intuitively, we want this to return true for schemas that wouldn't otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchemaOrField` | The schema to check. | *required* | Returns: | Type | Description | | --- | --- | | `bool` | True if the field should have a title set, False otherwise. | Source code in `pydantic/json_schema.py` ```python def field_title_should_be_set(self, schema: CoreSchemaOrField) -> bool: """Returns true if a field with the given schema should have a title set based on the field name. Intuitively, we want this to return true for schemas that wouldn't otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses). Args: schema: The schema to check. Returns: `True` if the field should have a title set, `False` otherwise. 
""" if _core_utils.is_core_schema_field(schema): if schema['type'] == 'computed-field': field_schema = schema['return_schema'] else: field_schema = schema['schema'] return self.field_title_should_be_set(field_schema) elif _core_utils.is_core_schema(schema): if schema.get('ref'): # things with refs, such as models and enums, should not have titles set return False if schema['type'] in {'default', 'nullable', 'definitions'}: return self.field_title_should_be_set(schema['schema']) # type: ignore[typeddict-item] if _core_utils.is_function_with_inner_schema(schema): return self.field_title_should_be_set(schema['schema']) if schema['type'] == 'definition-ref': # Referenced schemas should not have titles set for the same reason # schemas with refs should not return False return True # anything else should have title set else: raise PydanticInvalidForJsonSchema(f'Unexpected schema type: schema={schema}') # pragma: no cover ``` ### normalize_name ```python normalize_name(name: str) -> str ``` Normalizes a name to be used as a key in a dictionary. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `name` | `str` | The name to normalize. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The normalized name. | Source code in `pydantic/json_schema.py` ```python def normalize_name(self, name: str) -> str: """Normalizes a name to be used as a key in a dictionary. Args: name: The name to normalize. Returns: The normalized name. """ return re.sub(r'[^a-zA-Z0-9.\-_]', '_', name).replace('.', '__') ``` ### get_defs_ref ```python get_defs_ref(core_mode_ref: CoreModeRef) -> DefsRef ``` Override this method to change the way that definitions keys are generated from a core reference. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `core_mode_ref` | `CoreModeRef` | The core reference. | *required* | Returns: | Type | Description | | --- | --- | | `DefsRef` | The definitions key. | Source code in `pydantic/json_schema.py` ```python def get_defs_ref(self, core_mode_ref: CoreModeRef) -> DefsRef: """Override this method to change the way that definitions keys are generated from a core reference. Args: core_mode_ref: The core reference. Returns: The definitions key. """ # Split the core ref into "components"; generic origins and arguments are each separate components core_ref, mode = core_mode_ref components = re.split(r'([\][,])', core_ref) # Remove IDs from each component components = [x.rsplit(':', 1)[0] for x in components] core_ref_no_id = ''.join(components) # Remove everything before the last period from each "component" components = [re.sub(r'(?:[^.[\]]+\.)+((?:[^.[\]]+))', r'\1', x) for x in components] short_ref = ''.join(components) mode_title = _MODE_TITLE_MAPPING[mode] # It is important that the generated defs_ref values be such that at least one choice will not # be generated for any other core_ref. 
Currently, this should be the case because we include # the id of the source type in the core_ref name = DefsRef(self.normalize_name(short_ref)) name_mode = DefsRef(self.normalize_name(short_ref) + f'-{mode_title}') module_qualname = DefsRef(self.normalize_name(core_ref_no_id)) module_qualname_mode = DefsRef(f'{module_qualname}-{mode_title}') module_qualname_id = DefsRef(self.normalize_name(core_ref)) occurrence_index = self._collision_index.get(module_qualname_id) if occurrence_index is None: self._collision_counter[module_qualname] += 1 occurrence_index = self._collision_index[module_qualname_id] = self._collision_counter[module_qualname] module_qualname_occurrence = DefsRef(f'{module_qualname}__{occurrence_index}') module_qualname_occurrence_mode = DefsRef(f'{module_qualname_mode}__{occurrence_index}') self._prioritized_defsref_choices[module_qualname_occurrence_mode] = [ name, name_mode, module_qualname, module_qualname_mode, module_qualname_occurrence, module_qualname_occurrence_mode, ] return module_qualname_occurrence_mode ``` ### get_cache_defs_ref_schema ```python get_cache_defs_ref_schema( core_ref: CoreRef, ) -> tuple[DefsRef, JsonSchemaValue] ``` This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `core_ref` | `CoreRef` | The core reference to get the definitions reference for. | *required* | Returns: | Type | Description | | --- | --- | | `tuple[DefsRef, JsonSchemaValue]` | A tuple of the definitions reference and the JSON schema that will refer to it. | Source code in `pydantic/json_schema.py` ```python def get_cache_defs_ref_schema(self, core_ref: CoreRef) -> tuple[DefsRef, JsonSchemaValue]: """This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition. Args: core_ref: The core reference to get the definitions reference for. Returns: A tuple of the definitions reference and the JSON schema that will refer to it. """ core_mode_ref = (core_ref, self.mode) maybe_defs_ref = self.core_to_defs_refs.get(core_mode_ref) if maybe_defs_ref is not None: json_ref = self.core_to_json_refs[core_mode_ref] return maybe_defs_ref, {'$ref': json_ref} defs_ref = self.get_defs_ref(core_mode_ref) # populate the ref translation mappings self.core_to_defs_refs[core_mode_ref] = defs_ref self.defs_to_core_refs[defs_ref] = core_mode_ref json_ref = JsonRef(self.ref_template.format(model=defs_ref)) self.core_to_json_refs[core_mode_ref] = json_ref self.json_to_defs_refs[json_ref] = defs_ref ref_json_schema = {'$ref': json_ref} return defs_ref, ref_json_schema ``` ### handle_ref_overrides ```python handle_ref_overrides( json_schema: JsonSchemaValue, ) -> JsonSchemaValue ``` Remove any sibling keys that are redundant with the referenced schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `json_schema` | `JsonSchemaValue` | The schema to remove redundant sibling keys from. | *required* | Returns: | Type | Description | | --- | --- | | `JsonSchemaValue` | The schema with redundant sibling keys removed. | Source code in `pydantic/json_schema.py` ```python def handle_ref_overrides(self, json_schema: JsonSchemaValue) -> JsonSchemaValue: """Remove any sibling keys that are redundant with the referenced schema. 
Args: json_schema: The schema to remove redundant sibling keys from. Returns: The schema with redundant sibling keys removed. """ if '$ref' in json_schema: # prevent modifications to the input; this copy may be safe to drop if there is significant overhead json_schema = json_schema.copy() referenced_json_schema = self.get_schema_from_definitions(JsonRef(json_schema['$ref'])) if referenced_json_schema is None: # This can happen when building schemas for models with not-yet-defined references. # It may be a good idea to do a recursive pass at the end of the generation to remove # any redundant override keys. return json_schema for k, v in list(json_schema.items()): if k == '$ref': continue if k in referenced_json_schema and referenced_json_schema[k] == v: del json_schema[k] # redundant key return json_schema ``` ### encode_default ```python encode_default(dft: Any) -> Any ``` Encode a default value to a JSON-serializable value. This is used to encode default values for fields in the generated JSON schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `dft` | `Any` | The default value to encode. | *required* | Returns: | Type | Description | | --- | --- | | `Any` | The encoded default value. | Source code in `pydantic/json_schema.py` ```python def encode_default(self, dft: Any) -> Any: """Encode a default value to a JSON-serializable value. This is used to encode default values for fields in the generated JSON schema. Args: dft: The default value to encode. Returns: The encoded default value. """ from .type_adapter import TypeAdapter, _type_has_config config = self._config try: default = ( dft if _type_has_config(type(dft)) else TypeAdapter(type(dft), config=config.config_dict).dump_python( dft, by_alias=self.by_alias, mode='json' ) ) except PydanticSchemaGenerationError: raise pydantic_core.PydanticSerializationError(f'Unable to encode default value {dft}') return pydantic_core.to_jsonable_python( default, timedelta_mode=config.ser_json_timedelta, bytes_mode=config.ser_json_bytes, by_alias=self.by_alias ) ``` ### update_with_validations ```python update_with_validations( json_schema: JsonSchemaValue, core_schema: CoreSchema, mapping: dict[str, str], ) -> None ``` Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `json_schema` | `JsonSchemaValue` | The JSON schema to update. | *required* | | `core_schema` | `CoreSchema` | The core schema to get the validations from. | *required* | | `mapping` | `dict[str, str]` | A mapping from core_schema attribute names to the corresponding JSON schema attribute names. | *required* | Source code in `pydantic/json_schema.py` ```python def update_with_validations( self, json_schema: JsonSchemaValue, core_schema: CoreSchema, mapping: dict[str, str] ) -> None: """Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema. Args: json_schema: The JSON schema to update. core_schema: The core schema to get the validations from. mapping: A mapping from core_schema attribute names to the corresponding JSON schema attribute names. 
""" for core_key, json_schema_key in mapping.items(): if core_key in core_schema: json_schema[json_schema_key] = core_schema[core_key] ``` ### get_json_ref_counts ```python get_json_ref_counts( json_schema: JsonSchemaValue, ) -> dict[JsonRef, int] ``` Get all values corresponding to the key '$ref' anywhere in the json_schema. Source code in `pydantic/json_schema.py` ```python def get_json_ref_counts(self, json_schema: JsonSchemaValue) -> dict[JsonRef, int]: """Get all values corresponding to the key '$ref' anywhere in the json_schema.""" json_refs: dict[JsonRef, int] = Counter() def _add_json_refs(schema: Any) -> None: if isinstance(schema, dict): if '$ref' in schema: json_ref = JsonRef(schema['$ref']) if not isinstance(json_ref, str): return # in this case, '$ref' might have been the name of a property already_visited = json_ref in json_refs json_refs[json_ref] += 1 if already_visited: return # prevent recursion on a definition that was already visited try: defs_ref = self.json_to_defs_refs[json_ref] if defs_ref in self._core_defs_invalid_for_json_schema: raise self._core_defs_invalid_for_json_schema[defs_ref] _add_json_refs(self.definitions[defs_ref]) except KeyError: if not json_ref.startswith(('http://', 'https://')): raise for k, v in schema.items(): if k == 'examples' and isinstance(v, list): # Skip examples that may contain arbitrary values and references # (see the comment in `_get_all_json_refs` for more details). continue _add_json_refs(v) elif isinstance(schema, list): for v in schema: _add_json_refs(v) _add_json_refs(json_schema) return json_refs ``` ### emit_warning ```python emit_warning( kind: JsonSchemaWarningKind, detail: str ) -> None ``` This method simply emits PydanticJsonSchemaWarnings based on handling in the `warning_message` method. Source code in `pydantic/json_schema.py` ```python def emit_warning(self, kind: JsonSchemaWarningKind, detail: str) -> None: """This method simply emits PydanticJsonSchemaWarnings based on handling in the `warning_message` method.""" message = self.render_warning_message(kind, detail) if message is not None: warnings.warn(message, PydanticJsonSchemaWarning) ``` ### render_warning_message ```python render_warning_message( kind: JsonSchemaWarningKind, detail: str ) -> str | None ``` This method is responsible for ignoring warnings as desired, and for formatting the warning messages. You can override the value of `ignored_warning_kinds` in a subclass of GenerateJsonSchema to modify what warnings are generated. If you want more control, you can override this method; just return None in situations where you don't want warnings to be emitted. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `kind` | `JsonSchemaWarningKind` | The kind of warning to render. It can be one of the following: 'skipped-choice': A choice field was skipped because it had no valid choices. 'non-serializable-default': A default value was skipped because it was not JSON-serializable. | *required* | | `detail` | `str` | A string with additional details about the warning. | *required* | Returns: | Type | Description | | --- | --- | | `str | None` | The formatted warning message, or None if no warning should be emitted. | Source code in `pydantic/json_schema.py` ```python def render_warning_message(self, kind: JsonSchemaWarningKind, detail: str) -> str | None: """This method is responsible for ignoring warnings as desired, and for formatting the warning messages. 
You can override the value of `ignored_warning_kinds` in a subclass of GenerateJsonSchema to modify what warnings are generated. If you want more control, you can override this method; just return None in situations where you don't want warnings to be emitted. Args: kind: The kind of warning to render. It can be one of the following: - 'skipped-choice': A choice field was skipped because it had no valid choices. - 'non-serializable-default': A default value was skipped because it was not JSON-serializable. detail: A string with additional details about the warning. Returns: The formatted warning message, or `None` if no warning should be emitted. """ if kind in self.ignored_warning_kinds: return None return f'{detail} [{kind}]' ``` ## WithJsonSchema ```python WithJsonSchema( json_schema: JsonSchemaValue | None, mode: ( Literal["validation", "serialization"] | None ) = None, ) ``` Usage Documentation [`WithJsonSchema` Annotation](../../concepts/json_schema/#withjsonschema-annotation) Add this as an annotation on a field to override the (base) JSON schema that would be generated for that field. This provides a way to set a JSON schema for types that would otherwise raise errors when producing a JSON schema, such as Callable, or types that have an is-instance core schema, without needing to go so far as creating a custom subclass of pydantic.json_schema.GenerateJsonSchema. Note that any *modifications* to the schema that would normally be made (such as setting the title for model fields) will still be performed. If `mode` is set this will only apply to that schema generation mode, allowing you to set different json schemas for validation and serialization. ## Examples ```python Examples( examples: dict[str, Any], mode: ( Literal["validation", "serialization"] | None ) = None, ) ``` ```python Examples( examples: list[Any], mode: ( Literal["validation", "serialization"] | None ) = None, ) ``` ```python Examples( examples: dict[str, Any] | list[Any], mode: ( Literal["validation", "serialization"] | None ) = None, ) ``` Add examples to a JSON schema. If the JSON Schema already contains examples, the provided examples will be appended. If `mode` is set this will only apply to that schema generation mode, allowing you to add different examples for validation and serialization. Source code in `pydantic/json_schema.py` ```python def __init__( self, examples: dict[str, Any] | list[Any], mode: Literal['validation', 'serialization'] | None = None ) -> None: if isinstance(examples, dict): warnings.warn( 'Using a dict for `examples` is deprecated, use a list instead.', PydanticDeprecatedSince29, stacklevel=2, ) self.examples = examples self.mode = mode ``` ## SkipJsonSchema ```python SkipJsonSchema() ``` Usage Documentation [`SkipJsonSchema` Annotation](../../concepts/json_schema/#skipjsonschema-annotation) Add this as an annotation on a field to skip generating a JSON schema for that field. Example ```python from pprint import pprint from typing import Union from pydantic import BaseModel from pydantic.json_schema import SkipJsonSchema class Model(BaseModel): a: Union[int, None] = None # (1)! b: Union[int, SkipJsonSchema[None]] = None # (2)! c: SkipJsonSchema[Union[int, None]] = None # (3)! pprint(Model.model_json_schema()) ''' { 'properties': { 'a': { 'anyOf': [ {'type': 'integer'}, {'type': 'null'} ], 'default': None, 'title': 'A' }, 'b': { 'default': None, 'title': 'B', 'type': 'integer' } }, 'title': 'Model', 'type': 'object' } ''' ``` 1. 
The integer and null types are both included in the schema for `a`. 1. The integer type is the only type included in the schema for `b`. 1. The entirety of the `c` field is omitted from the schema. ## model_json_schema ```python model_json_schema( cls: type[BaseModel] | type[PydanticDataclass], by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[ GenerateJsonSchema ] = GenerateJsonSchema, mode: JsonSchemaMode = "validation", ) -> dict[str, Any] ``` Utility function to generate a JSON Schema for a model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `type[BaseModel] | type[PydanticDataclass]` | The model class to generate a JSON Schema for. | *required* | | `by_alias` | `bool` | If True (the default), fields will be serialized according to their alias. If False, fields will be serialized according to their attribute name. | `True` | | `ref_template` | `str` | The template to use for generating JSON Schema references. | `DEFAULT_REF_TEMPLATE` | | `schema_generator` | `type[GenerateJsonSchema]` | The class to use for generating the JSON Schema. | `GenerateJsonSchema` | | `mode` | `JsonSchemaMode` | The mode to use for generating the JSON Schema. It can be one of the following: 'validation': Generate a JSON Schema for validating data. 'serialization': Generate a JSON Schema for serializing data. | `'validation'` | Returns: | Type | Description | | --- | --- | | `dict[str, Any]` | The generated JSON Schema. | Source code in `pydantic/json_schema.py` ```python def model_json_schema( cls: type[BaseModel] | type[PydanticDataclass], by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Utility function to generate a JSON Schema for a model. Args: cls: The model class to generate a JSON Schema for. by_alias: If `True` (the default), fields will be serialized according to their alias. If `False`, fields will be serialized according to their attribute name. ref_template: The template to use for generating JSON Schema references. schema_generator: The class to use for generating the JSON Schema. mode: The mode to use for generating the JSON Schema. It can be one of the following: - 'validation': Generate a JSON Schema for validating data. - 'serialization': Generate a JSON Schema for serializing data. Returns: The generated JSON Schema. """ from .main import BaseModel schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template) if isinstance(cls.__pydantic_core_schema__, _mock_val_ser.MockCoreSchema): cls.__pydantic_core_schema__.rebuild() if cls is BaseModel: raise AttributeError('model_json_schema() must be called on a subclass of BaseModel, not BaseModel itself.') assert not isinstance(cls.__pydantic_core_schema__, _mock_val_ser.MockCoreSchema), 'this is a bug! 
please report it' return schema_generator_instance.generate(cls.__pydantic_core_schema__, mode=mode) ``` ## models_json_schema ```python models_json_schema( models: Sequence[ tuple[ type[BaseModel] | type[PydanticDataclass], JsonSchemaMode, ] ], *, by_alias: bool = True, title: str | None = None, description: str | None = None, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[ GenerateJsonSchema ] = GenerateJsonSchema ) -> tuple[ dict[ tuple[ type[BaseModel] | type[PydanticDataclass], JsonSchemaMode, ], JsonSchemaValue, ], JsonSchemaValue, ] ``` Utility function to generate a JSON Schema for multiple models. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `models` | `Sequence[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode]]` | A sequence of tuples of the form (model, mode). | *required* | | `by_alias` | `bool` | Whether field aliases should be used as keys in the generated JSON Schema. | `True` | | `title` | `str | None` | The title of the generated JSON Schema. | `None` | | `description` | `str | None` | The description of the generated JSON Schema. | `None` | | `ref_template` | `str` | The reference template to use for generating JSON Schema references. | `DEFAULT_REF_TEMPLATE` | | `schema_generator` | `type[GenerateJsonSchema]` | The schema generator to use for generating the JSON Schema. | `GenerateJsonSchema` | Returns: | Type | Description | | --- | --- | | `tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]` | A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) - The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys. | Source code in `pydantic/json_schema.py` ```python def models_json_schema( models: Sequence[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode]], *, by_alias: bool = True, title: str | None = None, description: str | None = None, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, ) -> tuple[dict[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]: """Utility function to generate a JSON Schema for multiple models. Args: models: A sequence of tuples of the form (model, mode). by_alias: Whether field aliases should be used as keys in the generated JSON Schema. title: The title of the generated JSON Schema. description: The description of the generated JSON Schema. ref_template: The reference template to use for generating JSON Schema references. schema_generator: The schema generator to use for generating the JSON Schema. Returns: A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) - The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys. 
""" for cls, _ in models: if isinstance(cls.__pydantic_core_schema__, _mock_val_ser.MockCoreSchema): cls.__pydantic_core_schema__.rebuild() instance = schema_generator(by_alias=by_alias, ref_template=ref_template) inputs: list[tuple[type[BaseModel] | type[PydanticDataclass], JsonSchemaMode, CoreSchema]] = [ (m, mode, m.__pydantic_core_schema__) for m, mode in models ] json_schemas_map, definitions = instance.generate_definitions(inputs) json_schema: dict[str, Any] = {} if definitions: json_schema['$defs'] = definitions if title: json_schema['title'] = title if description: json_schema['description'] = description return json_schemas_map, json_schema ``` The networks module contains types for common network-related fields. ## MAX_EMAIL_LENGTH ```python MAX_EMAIL_LENGTH = 2048 ``` Maximum length for an email. A somewhat arbitrary but very generous number compared to what is allowed by most implementations. ## UrlConstraints ```python UrlConstraints( max_length: int | None = None, allowed_schemes: list[str] | None = None, host_required: bool | None = None, default_host: str | None = None, default_port: int | None = None, default_path: str | None = None, ) ``` Url constraints. Attributes: | Name | Type | Description | | --- | --- | --- | | `max_length` | `int | None` | The maximum length of the url. Defaults to None. | | `allowed_schemes` | `list[str] | None` | The allowed schemes. Defaults to None. | | `host_required` | `bool | None` | Whether the host is required. Defaults to None. | | `default_host` | `str | None` | The default host. Defaults to None. | | `default_port` | `int | None` | The default port. Defaults to None. | | `default_path` | `str | None` | The default path. Defaults to None. | ### defined_constraints ```python defined_constraints: dict[str, Any] ``` Fetch a key / value mapping of constraints to values that are not None. Used for core schema updates. ## AnyUrl ```python AnyUrl(url: str | Url | _BaseUrl) ``` Bases: `_BaseUrl` Base type for all URLs. - Any scheme allowed - Top-level domain (TLD) not required - Host not required Assuming an input URL of `http://samuel:pass@example.com:8000/the/path/?query=here#fragment=is;this=bit`, the types export the following properties: - `scheme`: the URL scheme (`http`), always set. - `host`: the URL host (`example.com`). - `username`: optional username if included (`samuel`). - `password`: optional password if included (`pass`). - `port`: optional port (`8000`). - `path`: optional path (`/the/path/`). - `query`: optional URL query (for example, `GET` arguments or "search string", such as `query=here`). - `fragment`: optional fragment (`fragment=is;this=bit`). Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## AnyHttpUrl ```python AnyHttpUrl(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any http or https URL. - TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## HttpUrl ```python HttpUrl(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any http or https URL. - TLD not required - Host not required - Max length 2083 ```python from pydantic import BaseModel, HttpUrl, ValidationError class MyModel(BaseModel): url: HttpUrl m = MyModel(url='http://www.example.com') # (1)! 
print(m.url)
#> http://www.example.com/

try:
    MyModel(url='ftp://invalid.url')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyModel
    url
      URL scheme should be 'http' or 'https' [type=url_scheme, input_value='ftp://invalid.url', input_type=str]
    '''

try:
    MyModel(url='not a url')
except ValidationError as e:
    print(e)
    '''
    1 validation error for MyModel
    url
      Input should be a valid URL, relative URL without a base [type=url_parsing, input_value='not a url', input_type=str]
    '''
```

1. Note: mypy would prefer `m = MyModel(url=HttpUrl('http://www.example.com'))`, but Pydantic will convert the string to an HttpUrl instance anyway.

"International domains" (e.g. a URL where the host or TLD includes non-ascii characters) will be encoded via [punycode](https://en.wikipedia.org/wiki/Punycode) (see [this article](https://www.xudongz.com/blog/2017/idn-phishing/) for a good description of why this is important):

```python
from pydantic import BaseModel, HttpUrl

class MyModel(BaseModel):
    url: HttpUrl

m1 = MyModel(url='http://puny£code.com')
print(m1.url)
#> http://xn--punycode-eja.com/
m2 = MyModel(url='https://www.аррӏе.com/')
print(m2.url)
#> https://www.xn--80ak6aa92e.com/
m3 = MyModel(url='https://www.example.珠宝/')
print(m3.url)
#> https://www.example.xn--pbt977c/
```

Underscores in Hostnames

In Pydantic, underscores are allowed in all parts of a domain except the TLD. Technically this might be wrong - in theory the hostname cannot have underscores, but subdomains can. To explain this, consider the following two cases:

- `exam_ple.co.uk`: the hostname is `exam_ple`, which should not be allowed since it contains an underscore.
- `foo_bar.example.com`: the hostname is `example`, which should be allowed since the underscore is in the subdomain.

Without having an exhaustive list of TLDs, it would be impossible to differentiate between these two. Therefore underscores are allowed, but you can always do further validation in a validator if desired.

Also, Chrome, Firefox, and Safari all currently accept `http://exam_ple.com` as a URL, so we're in good (or at least big) company.

Source code in `pydantic/networks.py`

```python
def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None:
    self._url = _build_type_adapter(self.__class__).validate_python(url)._url
```

## AnyWebsocketUrl

```python
AnyWebsocketUrl(url: str | Url | _BaseUrl)
```

Bases: `AnyUrl`

A type that will accept any ws or wss URL.

- TLD not required
- Host not required

Source code in `pydantic/networks.py`

```python
def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None:
    self._url = _build_type_adapter(self.__class__).validate_python(url)._url
```

## WebsocketUrl

```python
WebsocketUrl(url: str | Url | _BaseUrl)
```

Bases: `AnyUrl`

A type that will accept any ws or wss URL.

- TLD not required
- Host not required
- Max length 2083

Source code in `pydantic/networks.py`

```python
def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None:
    self._url = _build_type_adapter(self.__class__).validate_python(url)._url
```

## FileUrl

```python
FileUrl(url: str | Url | _BaseUrl)
```

Bases: `AnyUrl`

A type that will accept any file URL.

- Host not required

Source code in `pydantic/networks.py`

```python
def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None:
    self._url = _build_type_adapter(self.__class__).validate_python(url)._url
```

## FtpUrl

```python
FtpUrl(url: str | Url | _BaseUrl)
```

Bases: `AnyUrl`

A type that will accept any ftp URL.
- TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## PostgresDsn ```python PostgresDsn(url: str | MultiHostUrl | _BaseMultiHostUrl) ``` Bases: `_BaseMultiHostUrl` A type that will accept any Postgres DSN. - User info required - TLD not required - Host required - Supports multiple hosts If further validation is required, these properties can be used by validators to enforce specific behaviour: ```python from pydantic import ( BaseModel, HttpUrl, PostgresDsn, ValidationError, field_validator, ) class MyModel(BaseModel): url: HttpUrl m = MyModel(url='http://www.example.com') # the repr() method for a url will display all properties of the url print(repr(m.url)) #> HttpUrl('http://www.example.com/') print(m.url.scheme) #> http print(m.url.host) #> www.example.com print(m.url.port) #> 80 class MyDatabaseModel(BaseModel): db: PostgresDsn @field_validator('db') def check_db_name(cls, v): assert v.path and len(v.path) > 1, 'database must be provided' return v m = MyDatabaseModel(db='postgres://user:pass@localhost:5432/foobar') print(m.db) #> postgres://user:pass@localhost:5432/foobar try: MyDatabaseModel(db='postgres://user:pass@localhost:5432') except ValidationError as e: print(e) ''' 1 validation error for MyDatabaseModel db Assertion failed, database must be provided assert (None) + where None = PostgresDsn('postgres://user:pass@localhost:5432').path [type=assertion_error, input_value='postgres://user:pass@localhost:5432', input_type=str] ''' ``` Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreMultiHostUrl | _BaseMultiHostUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ### host ```python host: str ``` The required URL host. ## CockroachDsn ```python CockroachDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any Cockroach DSN. - User info required - TLD not required - Host required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ### host ```python host: str ``` The required URL host. ## AmqpDsn ```python AmqpDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any AMQP DSN. - User info required - TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## RedisDsn ```python RedisDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any Redis DSN. - User info required - TLD not required - Host required (e.g., `rediss://:pass@localhost`) Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ### host ```python host: str ``` The required URL host. ## MongoDsn ```python MongoDsn(url: str | MultiHostUrl | _BaseMultiHostUrl) ``` Bases: `_BaseMultiHostUrl` A type that will accept any MongoDB DSN. - User info not required - Database name not required - Port not required - User info may be passed without user part (e.g., `mongodb://mongodb0.example.com:27017`). 
Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreMultiHostUrl | _BaseMultiHostUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## KafkaDsn ```python KafkaDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any Kafka DSN. - User info required - TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## NatsDsn ```python NatsDsn(url: str | MultiHostUrl | _BaseMultiHostUrl) ``` Bases: `_BaseMultiHostUrl` A type that will accept any NATS DSN. NATS is a connective technology built for the ever increasingly hyper-connected world. It is a single technology that enables applications to securely communicate across any combination of cloud vendors, on-premise, edge, web and mobile, and devices. More: https://nats.io Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreMultiHostUrl | _BaseMultiHostUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## MySQLDsn ```python MySQLDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any MySQL DSN. - User info required - TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## MariaDBDsn ```python MariaDBDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any MariaDB DSN. - User info required - TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## ClickHouseDsn ```python ClickHouseDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any ClickHouse DSN. - User info required - TLD not required - Host not required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ## SnowflakeDsn ```python SnowflakeDsn(url: str | Url | _BaseUrl) ``` Bases: `AnyUrl` A type that will accept any Snowflake DSN. - User info required - TLD not required - Host required Source code in `pydantic/networks.py` ```python def __init__(self, url: str | _CoreUrl | _BaseUrl) -> None: self._url = _build_type_adapter(self.__class__).validate_python(url)._url ``` ### host ```python host: str ``` The required URL host. ## EmailStr Info To use this type, you need to install the optional [`email-validator`](https://github.com/JoshData/python-email-validator) package: ```bash pip install email-validator ``` Validate email addresses. ```python from pydantic import BaseModel, EmailStr class Model(BaseModel): email: EmailStr print(Model(email='contact@mail.com')) #> email='contact@mail.com' ``` ## NameEmail ```python NameEmail(name: str, email: str) ``` Bases: `Representation` Info To use this type, you need to install the optional [`email-validator`](https://github.com/JoshData/python-email-validator) package: ```bash pip install email-validator ``` Validate a name and email address combination, as specified by [RFC 5322](https://datatracker.ietf.org/doc/html/rfc5322#section-3.4). The `NameEmail` has two properties: `name` and `email`. 
In case the `name` is not provided, it's inferred from the email address.

```python
from pydantic import BaseModel, NameEmail

class User(BaseModel):
    email: NameEmail

user = User(email='Fred Bloggs <fred.bloggs@example.com>')
print(user.email)
#> Fred Bloggs <fred.bloggs@example.com>
print(user.email.name)
#> Fred Bloggs

user = User(email='fred.bloggs@example.com')
print(user.email)
#> fred.bloggs <fred.bloggs@example.com>
print(user.email.name)
#> fred.bloggs
```

Source code in `pydantic/networks.py`

```python
def __init__(self, name: str, email: str):
    self.name = name
    self.email = email
```

## IPvAnyAddress

Validate an IPv4 or IPv6 address.

```python
from pydantic import BaseModel
from pydantic.networks import IPvAnyAddress

class IpModel(BaseModel):
    ip: IPvAnyAddress

print(IpModel(ip='127.0.0.1'))
#> ip=IPv4Address('127.0.0.1')

try:
    IpModel(ip='http://www.example.com')
except ValueError as e:
    print(e.errors())
    '''
    [
        {
            'type': 'ip_any_address',
            'loc': ('ip',),
            'msg': 'value is not a valid IPv4 or IPv6 address',
            'input': 'http://www.example.com',
        }
    ]
    '''
```

## IPvAnyInterface

Validate an IPv4 or IPv6 interface.

## IPvAnyNetwork

Validate an IPv4 or IPv6 network.

## validate_email

```python
validate_email(value: str) -> tuple[str, str]
```

Email address validation using [email-validator](https://pypi.org/project/email-validator/).

Returns:

| Type | Description |
| --- | --- |
| `tuple[str, str]` | A tuple containing the local part of the email (or the name for "pretty" email addresses) and the normalized email. |

Raises:

| Type | Description |
| --- | --- |
| `PydanticCustomError` | If the email is invalid. |

Note

Note that:

- Raw IP address (literal) domain parts are not allowed.
- `"John Doe <local_part@domain.com>"` style "pretty" email addresses are processed.
- Spaces are stripped from the beginning and end of addresses, but no error is raised.

Source code in `pydantic/networks.py`

```python
def validate_email(value: str) -> tuple[str, str]:
    """Email address validation using [email-validator](https://pypi.org/project/email-validator/).

    Returns:
        A tuple containing the local part of the email (or the name for "pretty" email addresses)
        and the normalized email.

    Raises:
        PydanticCustomError: If the email is invalid.

    Note:
        Note that:

        * Raw IP address (literal) domain parts are not allowed.
        * `"John Doe <local_part@domain.com>"` style "pretty" email addresses are processed.
        * Spaces are stripped from the beginning and end of addresses, but no error is raised.
    """
    if email_validator is None:
        import_email_validator()

    if len(value) > MAX_EMAIL_LENGTH:
        raise PydanticCustomError(
            'value_error',
            'value is not a valid email address: {reason}',
            {'reason': f'Length must not exceed {MAX_EMAIL_LENGTH} characters'},
        )

    m = pretty_email_regex.fullmatch(value)
    name: str | None = None
    if m:
        unquoted_name, quoted_name, value = m.groups()
        name = unquoted_name or quoted_name

    email = value.strip()

    try:
        parts = email_validator.validate_email(email, check_deliverability=False)
    except email_validator.EmailNotValidError as e:
        raise PydanticCustomError(
            'value_error', 'value is not a valid email address: {reason}', {'reason': str(e.args[0])}
        ) from e

    email = parts.normalized
    assert email is not None
    name = name or parts.local_part
    return name, email
```

## __version__

```python
__version__: str
```

## SchemaValidator

```python
SchemaValidator(
    schema: CoreSchema, config: CoreConfig | None = None
)
```

`SchemaValidator` is the Python wrapper for `pydantic-core`'s Rust validation logic; internally it owns one `CombinedValidator` which may in turn own more `CombinedValidator`s which make up the full schema validator.
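As a quick orientation before the parameter details, here is a minimal sketch (not part of the upstream reference) of building and using a `SchemaValidator` directly with a hand-written core schema; most users get one implicitly via `BaseModel` rather than constructing it themselves:

```python
from pydantic_core import SchemaValidator, ValidationError, core_schema

# Hand-built core schema for a {'name': str, 'age': int} typed dict
schema = core_schema.typed_dict_schema(
    {
        'name': core_schema.typed_dict_field(core_schema.str_schema()),
        'age': core_schema.typed_dict_field(core_schema.int_schema()),
    }
)
v = SchemaValidator(schema)

print(v.validate_python({'name': 'Sam', 'age': '35'}))  # lax mode coerces '35' to 35
#> {'name': 'Sam', 'age': 35}

try:
    v.validate_python({'name': 'Sam', 'age': 'x'})
except ValidationError as exc:
    print(exc.error_count())
    #> 1
```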
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `schema` | `CoreSchema` | The CoreSchema to use for validation. | *required* |
| `config` | `CoreConfig | None` | Optionally a CoreConfig to configure validation. | `None` |

### title

```python
title: str
```

The title of the schema, as used in the heading of ValidationError.__str__().

### validate_python

```python
validate_python(
    input: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    self_instance: Any | None = None,
    allow_partial: (
        bool | Literal["off", "on", "trailing-strings"]
    ) = False,
    by_alias: bool | None = None,
    by_name: bool | None = None
) -> Any
```

Validate a Python object against the schema and return the validated object.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `input` | `Any` | The Python object to validate. | *required* |
| `strict` | `bool | None` | Whether to validate the object in strict mode. If None, the value of CoreConfig.strict is used. | `None` |
| `from_attributes` | `bool | None` | Whether to validate objects as inputs to models by extracting attributes. If None, the value of CoreConfig.from_attributes is used. | `None` |
| `context` | `Any | None` | The context to use for validation, this is passed to functional validators as info.context. | `None` |
| `self_instance` | `Any | None` | An instance of a model to set attributes on during validation; this is used when running validation from the __init__ method of a model. | `None` |
| `allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Whether to allow partial validation; if True errors in the last element of sequences and mappings are ignored. 'trailing-strings' means any final unfinished JSON string is included in the result. | `False` |
| `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If validation fails. |
| `Exception` | Other error types may be raised if internal errors occur. |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The validated object. |

### isinstance_python

```python
isinstance_python(
    input: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    self_instance: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None
) -> bool
```

Similar to validate_python() but returns a boolean.

Arguments match `validate_python()`. This method will not raise `ValidationError`s but will raise internal errors.

Returns:

| Type | Description |
| --- | --- |
| `bool` | True if validation succeeds, False if validation fails. |

### validate_json

```python
validate_json(
    input: str | bytes | bytearray,
    *,
    strict: bool | None = None,
    context: Any | None = None,
    self_instance: Any | None = None,
    allow_partial: (
        bool | Literal["off", "on", "trailing-strings"]
    ) = False,
    by_alias: bool | None = None,
    by_name: bool | None = None
) -> Any
```

Validate JSON data directly against the schema and return the validated Python object.
This method should be significantly faster than `validate_python(json.loads(json_data))` as it avoids the need to create intermediate Python objects. It also handles constructing the correct Python type even in strict mode, where `validate_python(json.loads(json_data))` would fail validation.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `input` | `str | bytes | bytearray` | The JSON data to validate. | *required* |
| `strict` | `bool | None` | Whether to validate the object in strict mode. If None, the value of CoreConfig.strict is used. | `None` |
| `context` | `Any | None` | The context to use for validation, this is passed to functional validators as info.context. | `None` |
| `self_instance` | `Any | None` | An instance of a model to set attributes on from validation. | `None` |
| `allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Whether to allow partial validation; if True incomplete JSON will be parsed successfully and errors in the last element of sequences and mappings are ignored. 'trailing-strings' means any final unfinished JSON string is included in the result. | `False` |
| `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If validation fails or if the JSON data is invalid. |
| `Exception` | Other error types may be raised if internal errors occur. |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The validated Python object. |

### validate_strings

```python
validate_strings(
    input: _StringInput,
    *,
    strict: bool | None = None,
    context: Any | None = None,
    allow_partial: (
        bool | Literal["off", "on", "trailing-strings"]
    ) = False,
    by_alias: bool | None = None,
    by_name: bool | None = None
) -> Any
```

Validate a string against the schema and return the validated Python object.

This is similar to `validate_json` but applies to scenarios where the input will be a string but not JSON data, e.g. URL fragments, query parameters, etc.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `input` | `_StringInput` | The input as a string, or bytes/bytearray if strict=False. | *required* |
| `strict` | `bool | None` | Whether to validate the object in strict mode. If None, the value of CoreConfig.strict is used. | `None` |
| `context` | `Any | None` | The context to use for validation, this is passed to functional validators as info.context. | `None` |
| `allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Whether to allow partial validation; if True errors in the last element of sequences and mappings are ignored. 'trailing-strings' means any final unfinished JSON string is included in the result. | `False` |
| `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If validation fails or if the JSON data is invalid. |
| `Exception` | Other error types may be raised if internal errors occur. |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The validated Python object. |
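To make the two string-oriented methods concrete, here is a minimal sketch (illustrative, not from the upstream reference) using hand-built schemas:

```python
from pydantic_core import SchemaValidator, core_schema

strings = SchemaValidator(core_schema.list_schema(core_schema.str_schema()))

# validate_json parses and validates in a single pass
print(strings.validate_json('["a", "b"]'))
#> ['a', 'b']

# allow_partial tolerates truncated JSON, dropping the unfinished last element
print(strings.validate_json('["a", "b", "c', allow_partial=True))
#> ['a', 'b']

# validate_strings applies to stringified non-JSON input, e.g. query parameters
ints = SchemaValidator(core_schema.int_schema())
print(ints.validate_strings('123'))
#> 123
```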
### validate_assignment

```python
validate_assignment(
    obj: Any,
    field_name: str,
    field_value: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: Any | None = None,
    by_alias: bool | None = None,
    by_name: bool | None = None
) -> (
    dict[str, Any]
    | tuple[dict[str, Any], dict[str, Any] | None, set[str]]
)
```

Validate an assignment to a field on a model.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `obj` | `Any` | The model instance being assigned to. | *required* |
| `field_name` | `str` | The name of the field to validate assignment for. | *required* |
| `field_value` | `Any` | The value to assign to the field. | *required* |
| `strict` | `bool | None` | Whether to validate the object in strict mode. If None, the value of CoreConfig.strict is used. | `None` |
| `from_attributes` | `bool | None` | Whether to validate objects as inputs to models by extracting attributes. If None, the value of CoreConfig.from_attributes is used. | `None` |
| `context` | `Any | None` | The context to use for validation, this is passed to functional validators as info.context. | `None` |
| `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` |
| `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If validation fails. |
| `Exception` | Other error types may be raised if internal errors occur. |

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any] | tuple[dict[str, Any], dict[str, Any] | None, set[str]]` | Either the model dict or a tuple of (model_data, model_extra, fields_set) |

### get_default_value

```python
get_default_value(
    *, strict: bool | None = None, context: Any = None
) -> Some | None
```

Get the default value for the schema, including running default value validation.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `strict` | `bool | None` | Whether to validate the default value in strict mode. If None, the value of CoreConfig.strict is used. | `None` |
| `context` | `Any` | The context to use for validation, this is passed to functional validators as info.context. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `ValidationError` | If validation fails. |
| `Exception` | Other error types may be raised if internal errors occur. |

Returns:

| Type | Description |
| --- | --- |
| `Some | None` | None if the schema has no default value, otherwise a Some containing the default. |

## SchemaSerializer

```python
SchemaSerializer(
    schema: CoreSchema, config: CoreConfig | None = None
)
```

`SchemaSerializer` is the Python wrapper for `pydantic-core`'s Rust serialization logic; internally it owns one `CombinedSerializer` which may in turn own more `CombinedSerializer`s which make up the full schema serializer.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `schema` | `CoreSchema` | The CoreSchema to use for serialization. | *required* |
| `config` | `CoreConfig | None` | Optionally a CoreConfig to configure serialization. | `None` |
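Mirroring the `SchemaValidator` sketch above, here is a minimal illustrative example (not from the upstream reference) of constructing a `SchemaSerializer` by hand:

```python
from datetime import date

from pydantic_core import SchemaSerializer, core_schema

schema = core_schema.typed_dict_schema(
    {
        'name': core_schema.typed_dict_field(core_schema.str_schema()),
        'when': core_schema.typed_dict_field(core_schema.date_schema()),
    }
)
s = SchemaSerializer(schema)

# mode='json' converts values to JSON-compatible types
print(s.to_python({'name': 'launch', 'when': date(2024, 1, 1)}, mode='json'))
#> {'name': 'launch', 'when': '2024-01-01'}

print(s.to_json({'name': 'launch', 'when': date(2024, 1, 1)}))
#> b'{"name":"launch","when":"2024-01-01"}'
```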
### to_python

```python
to_python(
    value: Any,
    *,
    mode: str | None = None,
    include: _IncEx | None = None,
    exclude: _IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: (
        bool | Literal["none", "warn", "error"]
    ) = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: Any | None = None
) -> Any
```

Serialize/marshal a Python object to a Python object including transforming and filtering data.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `value` | `Any` | The Python object to serialize. | *required* |
| `mode` | `str | None` | The serialization mode to use, either 'python' or 'json', defaults to 'python'. In JSON mode, all values are converted to JSON compatible types, e.g. None, int, float, str, list, dict. | `None` |
| `include` | `_IncEx | None` | A set of fields to include, if None all fields are included. | `None` |
| `exclude` | `_IncEx | None` | A set of fields to exclude, if None no fields are excluded. | `None` |
| `by_alias` | `bool | None` | Whether to use the alias names of fields. | `None` |
| `exclude_unset` | `bool` | Whether to exclude fields that are not set, e.g. are not included in __pydantic_fields_set__. | `False` |
| `exclude_defaults` | `bool` | Whether to exclude fields that are equal to their default value. | `False` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of None. | `False` |
| `round_trip` | `bool` | Whether to enable serialization and validation round-trip support. | `False` |
| `warnings` | `bool | Literal['none', 'warn', 'error']` | How to handle invalid fields. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | `True` |
| `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered, if None a PydanticSerializationError error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |
| `context` | `Any | None` | The context to use for serialization, this is passed to functional serializers as info.context. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `PydanticSerializationError` | If serialization fails and no fallback function is provided. |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The serialized Python object. |

### to_json

```python
to_json(
    value: Any,
    *,
    indent: int | None = None,
    include: _IncEx | None = None,
    exclude: _IncEx | None = None,
    by_alias: bool | None = None,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: (
        bool | Literal["none", "warn", "error"]
    ) = True,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: Any | None = None
) -> bytes
```

Serialize a Python object to JSON including transforming and filtering data.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `value` | `Any` | The Python object to serialize. | *required* |
| `indent` | `int | None` | If None, the JSON will be compact, otherwise it will be pretty-printed with the indent provided. | `None` |
| `include` | `_IncEx | None` | A set of fields to include, if None all fields are included. | `None` |
| `exclude` | `_IncEx | None` | A set of fields to exclude, if None no fields are excluded. | `None` |
| `by_alias` | `bool | None` | Whether to use the alias names of fields. | `None` |
| `exclude_unset` | `bool` | Whether to exclude fields that are not set, e.g. are not included in __pydantic_fields_set__. | `False` |
| `exclude_defaults` | `bool` | Whether to exclude fields that are equal to their default value. | `False` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of None. | `False` |
| `round_trip` | `bool` | Whether to enable serialization and validation round-trip support. | `False` |
| `warnings` | `bool | Literal['none', 'warn', 'error']` | How to handle invalid fields. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | `True` |
| `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered, if None a PydanticSerializationError error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |
| `context` | `Any | None` | The context to use for serialization, this is passed to functional serializers as info.context. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `PydanticSerializationError` | If serialization fails and no fallback function is provided. |

Returns:

| Type | Description |
| --- | --- |
| `bytes` | JSON bytes. |

## ValidationError

Bases: `ValueError`

`ValidationError` is the exception raised by `pydantic-core` when validation fails; it contains a list of errors which detail why validation failed.

### title

```python
title: str
```

The title of the error, as used in the heading of `str(validation_error)`.

### from_exception_data

```python
from_exception_data(
    title: str,
    line_errors: list[InitErrorDetails],
    input_type: Literal["python", "json"] = "python",
    hide_input: bool = False,
) -> Self
```

Python constructor for a Validation Error.

The API for constructing validation errors will probably change in the future, hence the static method rather than `__init__`.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `title` | `str` | The title of the error, as used in the heading of str(validation_error) | *required* |
| `line_errors` | `list[InitErrorDetails]` | A list of InitErrorDetails which contain information about errors that occurred during validation. | *required* |
| `input_type` | `Literal['python', 'json']` | Whether the error is for a Python object or JSON. | `'python'` |
| `hide_input` | `bool` | Whether to hide the input value in the error message. | `False` |

### error_count

```python
error_count() -> int
```

Returns:

| Type | Description |
| --- | --- |
| `int` | The number of errors in the validation error. |

### errors

```python
errors(
    *,
    include_url: bool = True,
    include_context: bool = True,
    include_input: bool = True
) -> list[ErrorDetails]
```

Details about each error in the validation error.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `include_url` | `bool` | Whether to include a URL to documentation on the error for each error. | `True` |
| `include_context` | `bool` | Whether to include the context of each error. | `True` |
| `include_input` | `bool` | Whether to include the input value of each error. | `True` |

Returns:

| Type | Description |
| --- | --- |
| `list[ErrorDetails]` | A list of ErrorDetails for each error in the validation error. |
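As a short sketch of how `error_count()` and `errors()` are typically used together (illustrative, not from the upstream reference):

```python
from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    x: int

try:
    Model(x='not an int')
except ValidationError as exc:
    print(exc.error_count())
    #> 1
    print(exc.errors(include_url=False))
    '''
    [
        {
            'type': 'int_parsing',
            'loc': ('x',),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'not an int',
        }
    ]
    '''
```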
### json

```python
json(
    *,
    indent: int | None = None,
    include_url: bool = True,
    include_context: bool = True,
    include_input: bool = True
) -> str
```

Same as errors() but returns a JSON string.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `indent` | `int | None` | The number of spaces to indent the JSON by, or None for no indentation - compact JSON. | `None` |
| `include_url` | `bool` | Whether to include a URL to documentation on the error for each error. | `True` |
| `include_context` | `bool` | Whether to include the context of each error. | `True` |
| `include_input` | `bool` | Whether to include the input value of each error. | `True` |

Returns:

| Type | Description |
| --- | --- |
| `str` | A JSON string. |

## ErrorDetails

Bases: `TypedDict`

### type

```python
type: str
```

The type of error that occurred, this is an identifier designed for programmatic use that will change rarely or never.

`type` is unique for each error message, and can hence be used as an identifier to build custom error messages.

### loc

```python
loc: tuple[int | str, ...]
```

Tuple of strings and ints identifying where in the schema the error occurred.

### msg

```python
msg: str
```

A human readable error message.

### input

```python
input: Any
```

The input data at this `loc` that caused the error.

### ctx

```python
ctx: NotRequired[dict[str, Any]]
```

Values which are required to render the error message, and could hence be useful in rendering custom error messages. Also useful for passing custom error data forward.

### url

```python
url: NotRequired[str]
```

The documentation URL giving information about the error. No URL is available if a PydanticCustomError is used.

## InitErrorDetails

Bases: `TypedDict`

### type

```python
type: str | PydanticCustomError
```

The type of error that occurred, this should be a "slug" identifier that changes rarely or never.

### loc

```python
loc: NotRequired[tuple[int | str, ...]]
```

Tuple of strings and ints identifying where in the schema the error occurred.

### input

```python
input: Any
```

The input data at this `loc` that caused the error.

### ctx

```python
ctx: NotRequired[dict[str, Any]]
```

Values which are required to render the error message, and could hence be useful in rendering custom error messages. Also useful for passing custom error data forward.

## SchemaError

Bases: `Exception`

Information about errors that occur while building a SchemaValidator or SchemaSerializer.

### error_count

```python
error_count() -> int
```

Returns:

| Type | Description |
| --- | --- |
| `int` | The number of errors in the schema. |

### errors

```python
errors() -> list[ErrorDetails]
```

Returns:

| Type | Description |
| --- | --- |
| `list[ErrorDetails]` | A list of ErrorDetails for each error in the schema. |

## PydanticCustomError

```python
PydanticCustomError(
    error_type: LiteralString,
    message_template: LiteralString,
    context: dict[str, Any] | None = None,
)
```

Bases: `ValueError`

A custom exception providing flexible error handling for Pydantic validators.

You can raise this error in custom validators when you'd like flexibility in regards to the error type, message, and context.
Example

```py
from pydantic_core import PydanticCustomError

def custom_validator(v) -> int:
    if v <= 10:
        raise PydanticCustomError('custom_value_error', 'Value must be greater than {value}', {'value': 10, 'extra_context': 'extra_data'})
    return v
```

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `error_type` | `LiteralString` | The error type. | *required* |
| `message_template` | `LiteralString` | The message template. | *required* |
| `context` | `dict[str, Any] | None` | The data to inject into the message template. | `None` |

### context

```python
context: dict[str, Any] | None
```

Values which are required to render the error message, and could hence be useful in passing error data forward.

### type

```python
type: str
```

The error type associated with the error. For consistency with Pydantic, this is typically a snake_case string.

### message_template

```python
message_template: str
```

The message template associated with the error. This is a string that can be formatted with context variables in `{curly_braces}`.

### message

```python
message() -> str
```

The formatted message associated with the error. This presents as the message template with context variables appropriately injected.

## PydanticKnownError

```python
PydanticKnownError(
    error_type: ErrorType,
    context: dict[str, Any] | None = None,
)
```

Bases: `ValueError`

A helper class for raising exceptions that mimic Pydantic's built-in exceptions, with more flexibility in regards to context.

Unlike PydanticCustomError, the `error_type` argument must be a known `ErrorType`.

Example

```py
from pydantic_core import PydanticKnownError

def custom_validator(v) -> int:
    if v <= 10:
        raise PydanticKnownError(error_type='greater_than', context={'gt': 10})
    return v
```

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `error_type` | `ErrorType` | The error type. | *required* |
| `context` | `dict[str, Any] | None` | The data to inject into the message template. | `None` |

### context

```python
context: dict[str, Any] | None
```

Values which are required to render the error message, and could hence be useful in passing error data forward.

### type

```python
type: ErrorType
```

The type of the error.

### message_template

```python
message_template: str
```

The message template associated with the provided error type. This is a string that can be formatted with context variables in `{curly_braces}`.

### message

```python
message() -> str
```

The formatted message associated with the error. This presents as the message template with context variables appropriately injected.

## PydanticOmit

Bases: `Exception`

An exception to signal that a field should be omitted from a generated result.

This could span from omitting a field from a JSON Schema to omitting a field from a serialized result. Upcoming: more robust support for using PydanticOmit in custom serializers is still in development. Right now, this is primarily used in the JSON Schema generation process.
Example

```py
from typing import Callable

from pydantic_core import PydanticOmit

from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue


class MyGenerateJsonSchema(GenerateJsonSchema):
    def handle_invalid_for_json_schema(self, schema, error_info) -> JsonSchemaValue:
        raise PydanticOmit


class Predicate(BaseModel):
    name: str = 'no-op'
    func: Callable = lambda x: x


instance_example = Predicate()

validation_schema = instance_example.model_json_schema(schema_generator=MyGenerateJsonSchema, mode='validation')
print(validation_schema)
'''
{'properties': {'name': {'default': 'no-op', 'title': 'Name', 'type': 'string'}}, 'title': 'Predicate', 'type': 'object'}
'''
```

For a more in-depth example / explanation, see the [customizing JSON schema](../../concepts/json_schema/#customizing-the-json-schema-generation-process) docs.

## PydanticUseDefault

Bases: `Exception`

An exception to signal that standard validation either failed or should be skipped, and the default value should be used instead.

This exception can be raised in custom validation functions to redirect the flow of validation.

Example

```py
from datetime import datetime

from pydantic_core import PydanticUseDefault

from pydantic import BaseModel, field_validator


class Event(BaseModel):
    name: str = 'meeting'
    time: datetime

    @field_validator('name', mode='plain')
    def name_must_be_present(cls, v) -> str:
        if not v or not isinstance(v, str):
            raise PydanticUseDefault()
        return v


event1 = Event(name='party', time=datetime(2024, 1, 1, 12, 0, 0))
print(repr(event1))
# > Event(name='party', time=datetime.datetime(2024, 1, 1, 12, 0))
event2 = Event(time=datetime(2024, 1, 1, 12, 0, 0))
print(repr(event2))
# > Event(name='meeting', time=datetime.datetime(2024, 1, 1, 12, 0))
```

For an additional example, see the [validating partial json data](../../concepts/json/#partial-json-parsing) section of the Pydantic documentation.

## PydanticSerializationError

```python
PydanticSerializationError(message: str)
```

Bases: `ValueError`

An error raised when an issue occurs during serialization.

In custom serializers, this error can be used to indicate that serialization has failed.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `message` | `str` | The message associated with the error. | *required* |

## PydanticSerializationUnexpectedValue

```python
PydanticSerializationUnexpectedValue(message: str)
```

Bases: `ValueError`

An error raised when an unexpected value is encountered during serialization.

This error is often caught and coerced into a warning, as `pydantic-core` generally makes a best attempt at serializing values, in contrast with validation where errors are eagerly raised.

Example

```py
from pydantic import BaseModel, field_serializer
from pydantic_core import PydanticSerializationUnexpectedValue

class BasicPoint(BaseModel):
    x: int
    y: int

    @field_serializer('*')
    def serialize(self, v):
        if not isinstance(v, int):
            raise PydanticSerializationUnexpectedValue(f'Expected type `int`, got {type(v)} with value {v}')
        return v

point = BasicPoint(x=1, y=2)
# some sort of mutation
point.x = 'a'

print(point.model_dump())
'''
UserWarning: Pydantic serializer warnings:
  PydanticSerializationUnexpectedValue(Expected type `int`, got <class 'str'> with value a)
  return self.__pydantic_serializer__.to_python(
{'x': 'a', 'y': 2}
'''
```

This is often used internally in `pydantic-core` when unexpected types are encountered during serialization, but it can also be used by users in custom serializers, as seen above.
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `message` | `str` | The message associated with the unexpected value. | *required* |

## Url

```python
Url(url: str)
```

Bases: `SupportsAllComparisons`

A URL type, internal logic uses the [url rust crate](https://docs.rs/url/latest/url/) originally developed by Mozilla.

## MultiHostUrl

```python
MultiHostUrl(url: str)
```

Bases: `SupportsAllComparisons`

A URL type with support for multiple hosts, as used by some databases for DSNs, e.g. `https://foo.com,bar.com/path`.

Internal URL logic uses the [url rust crate](https://docs.rs/url/latest/url/) originally developed by Mozilla.

## MultiHostHost

Bases: `TypedDict`

A host part of a multi-host URL.

### username

```python
username: str | None
```

The username part of this host, or `None`.

### password

```python
password: str | None
```

The password part of this host, or `None`.

### host

```python
host: str | None
```

The host part of this host, or `None`.

### port

```python
port: int | None
```

The port part of this host, or `None`.

## ArgsKwargs

```python
ArgsKwargs(
    args: tuple[Any, ...],
    kwargs: dict[str, Any] | None = None,
)
```

A construct used to store arguments and keyword arguments for a function call.

This data structure is generally used to store information for core schemas associated with functions (like in an arguments schema). This data structure is also currently used for some validation against dataclasses.

Example

```py
from pydantic.dataclasses import dataclass
from pydantic import model_validator


@dataclass
class Model:
    a: int
    b: int

    @model_validator(mode="before")
    @classmethod
    def no_op_validator(cls, values):
        print(values)
        return values


Model(1, b=2)
#> ArgsKwargs((1,), {"b": 2})

Model(1, 2)
#> ArgsKwargs((1, 2), {})

Model(a=1, b=2)
#> ArgsKwargs((), {"a": 1, "b": 2})
```

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `args` | `tuple[Any, ...]` | The arguments (inherently ordered) for a function call. | *required* |
| `kwargs` | `dict[str, Any] | None` | The keyword arguments for a function call | `None` |

### args

```python
args: tuple[Any, ...]
```

The arguments (inherently ordered) for a function call.

### kwargs

```python
kwargs: dict[str, Any] | None
```

The keyword arguments for a function call.

## Some

Bases: `Generic[_T]`

Similar to Rust's [`Option::Some`](https://doc.rust-lang.org/std/option/enum.Option.html) type, this identifies a value as being present, and provides a way to access it.

Generally used in a union with `None` to differentiate between "some value which could be None" and no value.

### value

```python
value: _T
```

Returns the value wrapped by `Some`.

## TzInfo

Bases: `tzinfo`

A `pydantic-core` implementation of the abstract datetime.tzinfo class.

### tzname

```python
tzname(dt: datetime | None) -> str | None
```

Return the time zone name corresponding to the datetime object *dt*, as a string. For more info, see tzinfo.tzname.

### utcoffset

```python
utcoffset(dt: datetime | None) -> timedelta | None
```

Return offset of local time from UTC, as a timedelta object that is positive east of UTC. If local time is west of UTC, this should be negative. More info can be found at tzinfo.utcoffset.

### dst

```python
dst(dt: datetime | None) -> timedelta | None
```

Return the daylight saving time (DST) adjustment, as a timedelta object or `None` if DST information isn't known. More info can be found at tzinfo.dst.
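As a quick sketch of where `TzInfo` shows up in practice (illustrative; assumes lax validation of an ISO 8601 string carrying a UTC offset):

```python
from datetime import datetime, timedelta

from pydantic import TypeAdapter

# Validating an offset-aware ISO 8601 string yields an aware datetime whose
# tzinfo is pydantic-core's TzInfo implementation.
dt = TypeAdapter(datetime).validate_python('2024-01-01T12:00:00+05:00')
print(dt.utcoffset())
#> 5:00:00
assert dt.utcoffset() == timedelta(hours=5)
print(dt.dst())  # no DST information for a fixed offset
#> None
```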
### fromutc

```python
fromutc(dt: datetime) -> datetime
```

Adjust the date and time data associated with the datetime object *dt*, returning an equivalent datetime in self's local time. More info can be found at tzinfo.fromutc.

## ErrorTypeInfo

Bases: `TypedDict`

Gives information about errors.

### type

```python
type: ErrorType
```

The type of error that occurred, this should be a "slug" identifier that changes rarely or never.

### message_template_python

```python
message_template_python: str
```

String template to render a human readable error message from using context, when the input is Python.

### example_message_python

```python
example_message_python: str
```

Example of a human readable error message, when the input is Python.

### message_template_json

```python
message_template_json: NotRequired[str]
```

String template to render a human readable error message from using context, when the input is JSON data.

### example_message_json

```python
example_message_json: NotRequired[str]
```

Example of a human readable error message, when the input is JSON data.

### example_context

```python
example_context: dict[str, Any] | None
```

Example of context values.

## to_json

```python
to_json(
    value: Any,
    *,
    indent: int | None = None,
    include: _IncEx | None = None,
    exclude: _IncEx | None = None,
    by_alias: bool = True,
    exclude_none: bool = False,
    round_trip: bool = False,
    timedelta_mode: Literal["iso8601", "float"] = "iso8601",
    bytes_mode: Literal["utf8", "base64", "hex"] = "utf8",
    inf_nan_mode: Literal[
        "null", "constants", "strings"
    ] = "constants",
    serialize_unknown: bool = False,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: Any | None = None
) -> bytes
```

Serialize a Python object to JSON including transforming and filtering data.

This is effectively a standalone version of SchemaSerializer.to_json.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `value` | `Any` | The Python object to serialize. | *required* |
| `indent` | `int | None` | If None, the JSON will be compact, otherwise it will be pretty-printed with the indent provided. | `None` |
| `include` | `_IncEx | None` | A set of fields to include, if None all fields are included. | `None` |
| `exclude` | `_IncEx | None` | A set of fields to exclude, if None no fields are excluded. | `None` |
| `by_alias` | `bool` | Whether to use the alias names of fields. | `True` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of None. | `False` |
| `round_trip` | `bool` | Whether to enable serialization and validation round-trip support. | `False` |
| `timedelta_mode` | `Literal['iso8601', 'float']` | How to serialize timedelta objects, either 'iso8601' or 'float'. | `'iso8601'` |
| `bytes_mode` | `Literal['utf8', 'base64', 'hex']` | How to serialize bytes objects, either 'utf8', 'base64', or 'hex'. | `'utf8'` |
| `inf_nan_mode` | `Literal['null', 'constants', 'strings']` | How to serialize Infinity, -Infinity and NaN values, either 'null', 'constants', or 'strings'. | `'constants'` |
| `serialize_unknown` | `bool` | Attempt to serialize unknown types, str(value) will be used, if that fails "<Unserializable {value_type} object>" will be used. | `False` |
| `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered, if None a PydanticSerializationError error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |
| `context` | `Any | None` | The context to use for serialization, this is passed to functional serializers as info.context. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `PydanticSerializationError` | If serialization fails and no fallback function is provided. |

Returns:

| Type | Description |
| --- | --- |
| `bytes` | JSON bytes. |

## from_json

```python
from_json(
    data: str | bytes | bytearray,
    *,
    allow_inf_nan: bool = True,
    cache_strings: (
        bool | Literal["all", "keys", "none"]
    ) = True,
    allow_partial: (
        bool | Literal["off", "on", "trailing-strings"]
    ) = False
) -> Any
```

Deserialize JSON data to a Python object.

This is effectively a faster version of `json.loads()`, with some extra functionality.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `data` | `str | bytes | bytearray` | The JSON data to deserialize. | *required* |
| `allow_inf_nan` | `bool` | Whether to allow Infinity, -Infinity and NaN values as json.loads() does by default. | `True` |
| `cache_strings` | `bool | Literal['all', 'keys', 'none']` | Whether to cache strings to avoid constructing new Python objects, this should have a significant impact on performance while increasing memory usage slightly, all/True means cache all strings, keys means cache only dict keys, none/False means no caching. | `True` |
| `allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Whether to allow partial deserialization, if True JSON data is returned if the end of the input is reached before the full object is deserialized, e.g. `["aa", "bb", "c` would return `['aa', 'bb']`. 'trailing-strings' means any final unfinished JSON string is included in the result. | `False` |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | If deserialization fails. |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The deserialized Python object. |

## to_jsonable_python

```python
to_jsonable_python(
    value: Any,
    *,
    include: _IncEx | None = None,
    exclude: _IncEx | None = None,
    by_alias: bool = True,
    exclude_none: bool = False,
    round_trip: bool = False,
    timedelta_mode: Literal["iso8601", "float"] = "iso8601",
    bytes_mode: Literal["utf8", "base64", "hex"] = "utf8",
    inf_nan_mode: Literal[
        "null", "constants", "strings"
    ] = "constants",
    serialize_unknown: bool = False,
    fallback: Callable[[Any], Any] | None = None,
    serialize_as_any: bool = False,
    context: Any | None = None
) -> Any
```

Serialize/marshal a Python object to a JSON-serializable Python object including transforming and filtering data.

This is effectively a standalone version of SchemaSerializer.to_python(mode='json').

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `value` | `Any` | The Python object to serialize. | *required* |
| `include` | `_IncEx | None` | A set of fields to include, if None all fields are included. | `None` |
| `exclude` | `_IncEx | None` | A set of fields to exclude, if None no fields are excluded. | `None` |
| `by_alias` | `bool` | Whether to use the alias names of fields. | `True` |
| `exclude_none` | `bool` | Whether to exclude fields that have a value of None. | `False` |
| `round_trip` | `bool` | Whether to enable serialization and validation round-trip support. | `False` |
| `timedelta_mode` | `Literal['iso8601', 'float']` | How to serialize timedelta objects, either 'iso8601' or 'float'. | `'iso8601'` |
| `bytes_mode` | `Literal['utf8', 'base64', 'hex']` | How to serialize bytes objects, either 'utf8', 'base64', or 'hex'. | `'utf8'` |
| `inf_nan_mode` | `Literal['null', 'constants', 'strings']` | How to serialize Infinity, -Infinity and NaN values, either 'null', 'constants', or 'strings'. | `'constants'` |
| `serialize_unknown` | `bool` | Attempt to serialize unknown types, str(value) will be used, if that fails "<Unserializable {value_type} object>" will be used. | `False` |
| `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered, if None a PydanticSerializationError error is raised. | `None` |
| `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` |
| `context` | `Any | None` | The context to use for serialization, this is passed to functional serializers as info.context. | `None` |

Raises:

| Type | Description |
| --- | --- |
| `PydanticSerializationError` | If serialization fails and no fallback function is provided. |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The serialized Python object. |

This module contains definitions to build schemas which `pydantic_core` can validate and serialize.

## WhenUsed

```python
WhenUsed = Literal[
    "always", "unless-none", "json", "json-unless-none"
]
```

Values have the following meanings:

- `'always'` means always use
- `'unless-none'` means use unless the value is `None`
- `'json'` means use when serializing to JSON
- `'json-unless-none'` means use when serializing to JSON and the value is not `None`

## CoreConfig

Bases: `TypedDict`

Base class for schema configuration options.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `title` | `str` | The name of the configuration. |
| `strict` | `bool` | Whether the configuration should strictly adhere to specified rules. |
| `extra_fields_behavior` | `ExtraBehavior` | The behavior for handling extra fields. |
| `typed_dict_total` | `bool` | Whether the TypedDict should be considered total. Default is True. |
| `from_attributes` | `bool` | Whether to use attributes for models, dataclasses, and tagged union keys. |
| `loc_by_alias` | `bool` | Whether to use the used alias (or first alias for "field required" errors) instead of field_names to construct error locs. Default is True. |
| `revalidate_instances` | `Literal['always', 'never', 'subclass-instances']` | Whether instances of models and dataclasses should re-validate. Default is 'never'. |
| `validate_default` | `bool` | Whether to validate default values during validation. Default is False. |
| `str_max_length` | `int` | The maximum length for string fields. |
| `str_min_length` | `int` | The minimum length for string fields. |
| `str_strip_whitespace` | `bool` | Whether to strip whitespace from string fields. |
| `str_to_lower` | `bool` | Whether to convert string fields to lowercase. |
| `str_to_upper` | `bool` | Whether to convert string fields to uppercase. |
| `allow_inf_nan` | `bool` | Whether to allow infinity and NaN values for float fields. Default is True. |
| `ser_json_timedelta` | `Literal['iso8601', 'float']` | The serialization option for timedelta values. Default is 'iso8601'. |
| `ser_json_bytes` | `Literal['utf8', 'base64', 'hex']` | The serialization option for bytes values. Default is 'utf8'. |
| `ser_json_inf_nan` | `Literal['null', 'constants', 'strings']` | The serialization option for infinity and NaN values in float fields. Default is 'null'. |
| `val_json_bytes` | `Literal['utf8', 'base64', 'hex']` | The validation option for bytes values, complementing ser_json_bytes. Default is 'utf8'. |
| `hide_input_in_errors` | `bool` | Whether to hide input data from ValidationError representation. |
| `validation_error_cause` | `bool` | Whether to add user Python exceptions to the cause of a ValidationError. Requires exceptiongroup backport pre Python 3.11. |
| `coerce_numbers_to_str` | `bool` | Whether to enable coercion of any Number type to str (not applicable in strict mode). |
| `regex_engine` | `Literal['rust-regex', 'python-re']` | The regex engine to use for regex pattern validation. Default is 'rust-regex'. See StringSchema. |
| `cache_strings` | `Union[bool, Literal['all', 'keys', 'none']]` | Whether to cache strings. Default is True; True or 'all' is required to cache strings during general validation since validators don't know if they're in a key or a value. |
| `validate_by_alias` | `bool` | Whether to use the field's alias when validating against the provided input data. Default is True. |
| `validate_by_name` | `bool` | Whether to use the field's name when validating against the provided input data. Default is False. Replacement for populate_by_name. |
| `serialize_by_alias` | `bool` | Whether to serialize by alias. Default is False, expected to change to True in V3. |

## SerializationInfo

Bases: `Protocol`

### context

```python
context: Any | None
```

Current serialization context.

## ValidationInfo

Bases: `Protocol`

Argument passed to validation functions.

### context

```python
context: Any | None
```

Current validation context.

### config

```python
config: CoreConfig | None
```

The CoreConfig that applies to this validation.

### mode

```python
mode: Literal['python', 'json']
```

The type of input data we are currently validating.

### data

```python
data: dict[str, Any]
```

The data being validated for this model.

### field_name

```python
field_name: str | None
```

The name of the current field being validated if this validator is attached to a model field.

## simple_ser_schema

```python
simple_ser_schema(
    type: ExpectedSerializationTypes,
) -> SimpleSerSchema
```

Returns a schema for serialization with a custom type.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `type` | `ExpectedSerializationTypes` | The type to use for serialization | *required* |

Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py`

```python
def simple_ser_schema(type: ExpectedSerializationTypes) -> SimpleSerSchema:
    """
    Returns a schema for serialization with a custom type.

    Args:
        type: The type to use for serialization
    """
    return SimpleSerSchema(type=type)
```

## plain_serializer_function_ser_schema

```python
plain_serializer_function_ser_schema(
    function: SerializerFunction,
    *,
    is_field_serializer: bool | None = None,
    info_arg: bool | None = None,
    return_schema: CoreSchema | None = None,
    when_used: WhenUsed = "always"
) -> PlainSerializerFunctionSerSchema
```

Returns a schema for serialization with a function, which can be either a "general" or "field" function.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `function` | `SerializerFunction` | The function to use for serialization | *required* |
| `is_field_serializer` | `bool | None` | Whether the serializer is for a field, e.g. takes model as the first argument, and info includes field_name | `None` |
## SerializationInfo Bases: `Protocol` ### context ```python context: Any | None ``` Current serialization context. ## ValidationInfo Bases: `Protocol` Argument passed to validation functions. ### context ```python context: Any | None ``` Current validation context. ### config ```python config: CoreConfig | None ``` The CoreConfig that applies to this validation. ### mode ```python mode: Literal['python', 'json'] ``` The type of input data we are currently validating. ### data ```python data: dict[str, Any] ``` The data being validated for this model. ### field_name ```python field_name: str | None ``` The name of the current field being validated if this validator is attached to a model field.
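For instance, a validator built with `with_info_plain_validator_function` (defined in this module) receives a `ValidationInfo` as its second argument, and `context` can be supplied at call time. A minimal sketch:

```py
from pydantic_core import SchemaValidator, core_schema

def check(value, info):
    # info exposes .mode, .config, .context and .field_name
    assert info.mode == 'python'
    assert info.context == {'user': 'alice'}
    return value

v = SchemaValidator(core_schema.with_info_plain_validator_function(check))
assert v.validate_python('x', context={'user': 'alice'}) == 'x'
```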
## simple_ser_schema ```python simple_ser_schema( type: ExpectedSerializationTypes, ) -> SimpleSerSchema ``` Returns a schema for serialization with a custom type. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `type` | `ExpectedSerializationTypes` | The type to use for serialization | *required* | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def simple_ser_schema(type: ExpectedSerializationTypes) -> SimpleSerSchema: """ Returns a schema for serialization with a custom type. Args: type: The type to use for serialization """ return SimpleSerSchema(type=type) ``` ## plain_serializer_function_ser_schema ```python plain_serializer_function_ser_schema( function: SerializerFunction, *, is_field_serializer: bool | None = None, info_arg: bool | None = None, return_schema: CoreSchema | None = None, when_used: WhenUsed = "always" ) -> PlainSerializerFunctionSerSchema ``` Returns a schema for serialization with a function; it can be either a "general" or "field" function. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `SerializerFunction` | The function to use for serialization | *required* | | `is_field_serializer` | `bool | None` | Whether the serializer is for a field, e.g. takes model as the first argument, and info includes field_name | `None` | | `info_arg` | `bool | None` | Whether the function takes an info argument | `None` | | `return_schema` | `CoreSchema | None` | Schema to use for serializing return value | `None` | | `when_used` | `WhenUsed` | When the function should be called | `'always'` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def plain_serializer_function_ser_schema( function: SerializerFunction, *, is_field_serializer: bool | None = None, info_arg: bool | None = None, return_schema: CoreSchema | None = None, when_used: WhenUsed = 'always', ) -> PlainSerializerFunctionSerSchema: """ Returns a schema for serialization with a function, can be either a "general" or "field" function. Args: function: The function to use for serialization is_field_serializer: Whether the serializer is for a field, e.g. takes `model` as the first argument, and `info` includes `field_name` info_arg: Whether the function takes an `info` argument return_schema: Schema to use for serializing return value when_used: When the function should be called """ if when_used == 'always': # just to avoid extra elements in schema, and to use the actual default defined in rust when_used = None # type: ignore return _dict_not_none( type='function-plain', function=function, is_field_serializer=is_field_serializer, info_arg=info_arg, return_schema=return_schema, when_used=when_used, ) ``` ## wrap_serializer_function_ser_schema ```python wrap_serializer_function_ser_schema( function: WrapSerializerFunction, *, is_field_serializer: bool | None = None, info_arg: bool | None = None, schema: CoreSchema | None = None, return_schema: CoreSchema | None = None, when_used: WhenUsed = "always" ) -> WrapSerializerFunctionSerSchema ``` Returns a schema for serialization with a wrap function; it can be either a "general" or "field" function. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `WrapSerializerFunction` | The function to use for serialization | *required* | | `is_field_serializer` | `bool | None` | Whether the serializer is for a field, e.g. takes model as the first argument, and info includes field_name | `None` | | `info_arg` | `bool | None` | Whether the function takes an info argument | `None` | | `schema` | `CoreSchema | None` | The schema to use for the inner serialization | `None` | | `return_schema` | `CoreSchema | None` | Schema to use for serializing return value | `None` | | `when_used` | `WhenUsed` | When the function should be called | `'always'` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def wrap_serializer_function_ser_schema( function: WrapSerializerFunction, *, is_field_serializer: bool | None = None, info_arg: bool | None = None, schema: CoreSchema | None = None, return_schema: CoreSchema | None = None, when_used: WhenUsed = 'always', ) -> WrapSerializerFunctionSerSchema: """ Returns a schema for serialization with a wrap function, can be either a "general" or "field" function. Args: function: The function to use for serialization is_field_serializer: Whether the serializer is for a field, e.g. takes `model` as the first argument, and `info` includes `field_name` info_arg: Whether the function takes an `info` argument schema: The schema to use for the inner serialization return_schema: Schema to use for serializing return value when_used: When the function should be called """ if when_used == 'always': # just to avoid extra elements in schema, and to use the actual default defined in rust when_used = None # type: ignore return _dict_not_none( type='function-wrap', function=function, is_field_serializer=is_field_serializer, info_arg=info_arg, schema=schema, return_schema=return_schema, when_used=when_used, ) ``` ## format_ser_schema ```python format_ser_schema( formatting_string: str, *, when_used: WhenUsed = "json-unless-none" ) -> FormatSerSchema ``` Returns a schema for serialization using python's `format()` built-in. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `formatting_string` | `str` | String defining the format to use | *required* | | `when_used` | `WhenUsed` | Same meaning as for plain_serializer_function_ser_schema, but with a different default | `'json-unless-none'` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def format_ser_schema(formatting_string: str, *, when_used: WhenUsed = 'json-unless-none') -> FormatSerSchema: """ Returns a schema for serialization using python's `format` method. Args: formatting_string: String defining the format to use when_used: Same meaning as for [general_function_plain_ser_schema], but with a different default """ if when_used == 'json-unless-none': # just to avoid extra elements in schema, and to use the actual default defined in rust when_used = None # type: ignore return _dict_not_none(type='format', formatting_string=formatting_string, when_used=when_used) ``` ## to_string_ser_schema ```python to_string_ser_schema( *, when_used: WhenUsed = "json-unless-none" ) -> ToStringSerSchema ``` Returns a schema for serialization using python's `str()` / `__str__` method. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `when_used` | `WhenUsed` | Same meaning as for plain_serializer_function_ser_schema, but with a different default | `'json-unless-none'` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def to_string_ser_schema(*, when_used: WhenUsed = 'json-unless-none') -> ToStringSerSchema: """ Returns a schema for serialization using python's `str()` / `__str__` method. Args: when_used: Same meaning as for [general_function_plain_ser_schema], but with a different default """ s = dict(type='to-string') if when_used != 'json-unless-none': # just to avoid extra elements in schema, and to use the actual default defined in rust s['when_used'] = when_used return s # type: ignore ```
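To illustrate the `'json-unless-none'` default shared by `format_ser_schema` and `to_string_ser_schema`, a sketch (the exact JSON bytes are assumed):

```py
from datetime import date
from pydantic_core import SchemaSerializer, core_schema

ser = core_schema.format_ser_schema('%Y/%m/%d')
s = SchemaSerializer(core_schema.date_schema(serialization=ser))
assert s.to_json(date(2024, 1, 2)) == b'"2024/01/02"'
# with 'json-unless-none', Python-mode output is left untouched
assert s.to_python(date(2024, 1, 2)) == date(2024, 1, 2)
```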
## model_ser_schema ```python model_ser_schema( cls: type[Any], schema: CoreSchema ) -> ModelSerSchema ``` Returns a schema for serialization using a model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `type[Any]` | The expected class type, used to generate warnings if the wrong type is passed | *required* | | `schema` | `CoreSchema` | Internal schema to use to serialize the model dict | *required* | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def model_ser_schema(cls: type[Any], schema: CoreSchema) -> ModelSerSchema: """ Returns a schema for serialization using a model. Args: cls: The expected class type, used to generate warnings if the wrong type is passed schema: Internal schema to use to serialize the model dict """ return ModelSerSchema(type='model', cls=cls, schema=schema) ``` ## invalid_schema ```python invalid_schema( ref: str | None = None, metadata: dict[str, Any] | None = None, ) -> InvalidSchema ``` Returns an invalid schema, used to indicate that a schema is invalid. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def invalid_schema(ref: str | None = None, metadata: dict[str, Any] | None = None) -> InvalidSchema: """ Returns an invalid schema, used to indicate that a schema is invalid. Args: ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core """ return _dict_not_none(type='invalid', ref=ref, metadata=metadata) ``` ## computed_field ```python computed_field( property_name: str, return_schema: CoreSchema, *, alias: str | None = None, metadata: dict[str, Any] | None = None ) -> ComputedField ``` ComputedFields are properties of a model or dataclass that are included in serialization.
Args: property_name: The name of the property on the model or dataclass return_schema: The schema used for the type returned by the computed field alias: The name to use in the serialized output metadata: Any other information you want to include with the schema, not used by pydantic-core """ return _dict_not_none( type='computed-field', property_name=property_name, return_schema=return_schema, alias=alias, metadata=metadata ) ``` ## any_schema ```python any_schema( *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> AnySchema ``` Returns a schema that matches any value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.any_schema() v = SchemaValidator(schema) assert v.validate_python(1) == 1 ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def any_schema( *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> AnySchema: """ Returns a schema that matches any value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.any_schema() v = SchemaValidator(schema) assert v.validate_python(1) == 1 ``` Args: ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='any', ref=ref, metadata=metadata, serialization=serialization) ```` ## none_schema ```python none_schema( *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> NoneSchema ``` Returns a schema that matches a None value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.none_schema() v = SchemaValidator(schema) assert v.validate_python(None) is None ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def none_schema( *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> NoneSchema: """ Returns a schema that matches a None value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.none_schema() v = SchemaValidator(schema) assert v.validate_python(None) is None ``` Args: ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='none', ref=ref, metadata=metadata, 
serialization=serialization) ```` ## bool_schema ```python bool_schema( strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> BoolSchema ``` Returns a schema that matches a bool value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.bool_schema() v = SchemaValidator(schema) assert v.validate_python('True') is True ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether the value should be a bool or a value that can be converted to a bool | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def bool_schema( strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> BoolSchema: """ Returns a schema that matches a bool value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.bool_schema() v = SchemaValidator(schema) assert v.validate_python('True') is True ``` Args: strict: Whether the value should be a bool or a value that can be converted to a bool ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='bool', strict=strict, ref=ref, metadata=metadata, serialization=serialization) ```` ## int_schema ```python int_schema( *, multiple_of: int | None = None, le: int | None = None, ge: int | None = None, lt: int | None = None, gt: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> IntSchema ``` Returns a schema that matches an int value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.int_schema(multiple_of=2, le=6, ge=2) v = SchemaValidator(schema) assert v.validate_python('4') == 4 ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `multiple_of` | `int | None` | The value must be a multiple of this number | `None` | | `le` | `int | None` | The value must be less than or equal to this number | `None` | | `ge` | `int | None` | The value must be greater than or equal to this number | `None` | | `lt` | `int | None` | The value must be strictly less than this number | `None` | | `gt` | `int | None` | The value must be strictly greater than this number | `None` | | `strict` | `bool | None` | Whether the value should be an int or a value that can be converted to an int | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def int_schema( *, multiple_of: int | None = None, le: int | None = None, ge: int | None = None, lt: int | None = None, gt: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> IntSchema: """ Returns a schema that matches an int value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.int_schema(multiple_of=2, le=6, ge=2) v = SchemaValidator(schema) assert v.validate_python('4') == 4 ``` Args: multiple_of: The value must be a multiple of this number le: The value must be less than or equal to this number ge: The value must be greater than or equal to this number lt: The value must be strictly less than this number gt: The value must be strictly greater than this number strict: Whether the value should be an int or a value that can be converted to an int ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='int', multiple_of=multiple_of, le=le, ge=ge, lt=lt, gt=gt, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
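A sketch of the effect of `strict` (the same pattern applies to the other scalar schemas): in strict mode, coercible inputs such as strings are rejected rather than converted.

```py
from pydantic_core import SchemaValidator, ValidationError, core_schema

v = SchemaValidator(core_schema.int_schema(strict=True))
assert v.validate_python(4) == 4
try:
    v.validate_python('4')  # lax mode would coerce this
    raise AssertionError('should have failed')
except ValidationError:
    pass  # strings are rejected when strict=True
```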
## float_schema ```python float_schema( *, allow_inf_nan: bool | None = None, multiple_of: float | None = None, le: float | None = None, ge: float | None = None, lt: float | None = None, gt: float | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> FloatSchema ``` Returns a schema that matches a float value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.float_schema(le=0.8, ge=0.2) v = SchemaValidator(schema) assert v.validate_python('0.5') == 0.5 ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `allow_inf_nan` | `bool | None` | Whether to allow inf and nan values | `None` | | `multiple_of` | `float | None` | The value must be a multiple of this number | `None` | | `le` | `float | None` | The value must be less than or equal to this number | `None` | | `ge` | `float | None` | The value must be greater than or equal to this number | `None` | | `lt` | `float | None` | The value must be strictly less than this number | `None` | | `gt` | `float | None` | The value must be strictly greater than this number | `None` | | `strict` | `bool | None` | Whether the value should be a float or a value that can be converted to a float | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def float_schema( *, allow_inf_nan: bool | None = None, multiple_of: float | None = None, le: float | None = None, ge: float | None = None, lt: float | None = None, gt: float | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> FloatSchema: """ Returns a schema that matches a float value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.float_schema(le=0.8, ge=0.2) v = SchemaValidator(schema) assert v.validate_python('0.5') == 0.5 ``` Args: allow_inf_nan: Whether to allow inf and nan values multiple_of: The value must be a multiple of this number le: The value must be less than or equal to this number ge: The value must be greater than or equal to this number lt: The value must be strictly less than this number gt: The value must be strictly greater than this number strict: Whether the value should be a float or a value that can be converted to a float ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='float', allow_inf_nan=allow_inf_nan, multiple_of=multiple_of, le=le, ge=ge, lt=lt, gt=gt, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
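A sketch of `allow_inf_nan`, which rejects non-finite floats when set to False:

```py
import math

from pydantic_core import SchemaValidator, ValidationError, core_schema

v = SchemaValidator(core_schema.float_schema(allow_inf_nan=False))
assert v.validate_python(0.5) == 0.5
try:
    v.validate_python(math.inf)
    raise AssertionError('should have failed')
except ValidationError:
    pass  # inf and nan are rejected when allow_inf_nan=False
```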
## decimal_schema ```python decimal_schema( *, allow_inf_nan: bool | None = None, multiple_of: Decimal | None = None, le: Decimal | None = None, ge: Decimal | None = None, lt: Decimal | None = None, gt: Decimal | None = None, max_digits: int | None = None, decimal_places: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> DecimalSchema ``` Returns a schema that matches a decimal value, e.g.: ```py from decimal import Decimal from pydantic_core import SchemaValidator, core_schema schema = core_schema.decimal_schema(le=0.8, ge=0.2) v = SchemaValidator(schema) assert v.validate_python('0.5') == Decimal('0.5') ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `allow_inf_nan` | `bool | None` | Whether to allow inf and nan values | `None` | | `multiple_of` | `Decimal | None` | The value must be a multiple of this number | `None` | | `le` | `Decimal | None` | The value must be less than or equal to this number | `None` | | `ge` | `Decimal | None` | The value must be greater than or equal to this number | `None` | | `lt` | `Decimal | None` | The value must be strictly less than this number | `None` | | `gt` | `Decimal | None` | The value must be strictly greater than this number | `None` | | `max_digits` | `int | None` | The maximum number of decimal digits allowed | `None` | | `decimal_places` | `int | None` | The maximum number of decimal places allowed | `None` | | `strict` | `bool | None` | Whether the value should be a Decimal or a value that can be converted to a Decimal | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def decimal_schema( *, allow_inf_nan: bool | None = None, multiple_of: Decimal | None = None, le: Decimal | None = None, ge: Decimal | None = None, lt: Decimal | None = None, gt: Decimal | None = None, max_digits: int | None = None, decimal_places: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> DecimalSchema: """ Returns a schema that matches a decimal value, e.g.: ```py from decimal import Decimal from pydantic_core import SchemaValidator, core_schema schema = core_schema.decimal_schema(le=0.8, ge=0.2) v = SchemaValidator(schema) assert v.validate_python('0.5') == Decimal('0.5') ``` Args: allow_inf_nan: Whether to allow inf and nan values multiple_of: The value must be a multiple of this number le: The value must be less than or equal to this number ge: The value must be greater than or equal to this number lt: The value must be strictly less than this number gt: The value must be strictly greater than this number max_digits: The maximum number of decimal digits allowed decimal_places: The maximum number of decimal places allowed strict: Whether the value should be a Decimal or a value that can be converted to a Decimal ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='decimal', gt=gt, ge=ge, lt=lt, le=le, max_digits=max_digits, decimal_places=decimal_places, multiple_of=multiple_of, allow_inf_nan=allow_inf_nan, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
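A sketch of the digit constraints, which have no equivalent on `float_schema`:

```py
from decimal import Decimal

from pydantic_core import SchemaValidator, ValidationError, core_schema

v = SchemaValidator(core_schema.decimal_schema(max_digits=4, decimal_places=2))
assert v.validate_python('12.34') == Decimal('12.34')
try:
    v.validate_python('1.234')  # three decimal places, only two allowed
    raise AssertionError('should have failed')
except ValidationError:
    pass
```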
## complex_schema ```python complex_schema( *, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> ComplexSchema ``` Returns a schema that matches a complex value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.complex_schema() v = SchemaValidator(schema) assert v.validate_python('1+2j') == complex(1, 2) assert v.validate_python(complex(1, 2)) == complex(1, 2) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether the value should be a complex object instance or a value that can be converted to a complex object | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def complex_schema( *, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> ComplexSchema: """ Returns a schema that matches a complex value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.complex_schema() v = SchemaValidator(schema) assert v.validate_python('1+2j') == complex(1, 2) assert v.validate_python(complex(1, 2)) == complex(1, 2) ``` Args: strict: Whether the value should be a complex object instance or a value that can be converted to a complex object ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='complex', strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
## str_schema ```python str_schema( *, pattern: str | Pattern[str] | None = None, max_length: int | None = None, min_length: int | None = None, strip_whitespace: bool | None = None, to_lower: bool | None = None, to_upper: bool | None = None, regex_engine: ( Literal["rust-regex", "python-re"] | None ) = None, strict: bool | None = None, coerce_numbers_to_str: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> StringSchema ``` Returns a schema that matches a string value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.str_schema(max_length=10, min_length=2) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `pattern` | `str | Pattern[str] | None` | A regex pattern that the value must match | `None` | | `max_length` | `int | None` | The value must be at most this length | `None` | | `min_length` | `int | None` | The value must be at least this length | `None` | | `strip_whitespace` | `bool | None` | Whether to strip whitespace from the value | `None` | | `to_lower` | `bool | None` | Whether to convert the value to lowercase | `None` | | `to_upper` | `bool | None` | Whether to convert the value to uppercase | `None` | | `regex_engine` | `Literal['rust-regex', 'python-re'] | None` | The regex engine to use for pattern validation. Default is 'rust-regex'. - rust-regex uses the regex Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features. - python-re uses the re module, which supports all regex features, but may be slower. | `None` | | `strict` | `bool | None` | Whether the value should be a string or a value that can be converted to a string | `None` | | `coerce_numbers_to_str` | `bool | None` | Whether to enable coercion of any Number type to str (not applicable in strict mode). | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def str_schema( *, pattern: str | Pattern[str] | None = None, max_length: int | None = None, min_length: int | None = None, strip_whitespace: bool | None = None, to_lower: bool | None = None, to_upper: bool | None = None, regex_engine: Literal['rust-regex', 'python-re'] | None = None, strict: bool | None = None, coerce_numbers_to_str: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> StringSchema: """ Returns a schema that matches a string value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.str_schema(max_length=10, min_length=2) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello' ``` Args: pattern: A regex pattern that the value must match max_length: The value must be at most this length min_length: The value must be at least this length strip_whitespace: Whether to strip whitespace from the value to_lower: Whether to convert the value to lowercase to_upper: Whether to convert the value to uppercase regex_engine: The regex engine to use for pattern validation. Default is 'rust-regex'. - `rust-regex` uses the [`regex`](https://docs.rs/regex) Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features. - `python-re` uses the [`re`](https://docs.python.org/3/library/re.html) module, which supports all regex features, but may be slower. strict: Whether the value should be a string or a value that can be converted to a string coerce_numbers_to_str: Whether to enable coercion of any `Number` type to `str` (not applicable in `strict` mode). ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='str', pattern=pattern, max_length=max_length, min_length=min_length, strip_whitespace=strip_whitespace, to_lower=to_lower, to_upper=to_upper, regex_engine=regex_engine, strict=strict, coerce_numbers_to_str=coerce_numbers_to_str, ref=ref, metadata=metadata, serialization=serialization, ) ````
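A sketch of why the engine matters: lookahead is a backtracking feature, so a pattern using it is assumed to require `python-re`:

```py
from pydantic_core import SchemaValidator, core_schema

# '(?=...)' (lookahead) is not supported by the rust-regex engine
schema = core_schema.str_schema(
    pattern=r'^(?=.*\d).*$', regex_engine='python-re'
)
v = SchemaValidator(schema)
assert v.validate_python('abc1') == 'abc1'  # contains a digit
```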
## bytes_schema ```python bytes_schema( *, max_length: int | None = None, min_length: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> BytesSchema ``` Returns a schema that matches a bytes value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.bytes_schema(max_length=10, min_length=2) v = SchemaValidator(schema) assert v.validate_python(b'hello') == b'hello' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `max_length` | `int | None` | The value must be at most this length | `None` | | `min_length` | `int | None` | The value must be at least this length | `None` | | `strict` | `bool | None` | Whether the value should be a bytes or a value that can be converted to a bytes | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def bytes_schema( *, max_length: int | None = None, min_length: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> BytesSchema: """ Returns a schema that matches a bytes value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.bytes_schema(max_length=10, min_length=2) v = SchemaValidator(schema) assert v.validate_python(b'hello') == b'hello' ``` Args: max_length: The value must be at most this length min_length: The value must be at least this length strict: Whether the value should be a bytes or a value that can be converted to a bytes ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='bytes', max_length=max_length, min_length=min_length, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## date_schema ```python date_schema( *, strict: bool | None = None, le: date | None = None, ge: date | None = None, lt: date | None = None, gt: date | None = None, now_op: Literal["past", "future"] | None = None, now_utc_offset: int | None = None, ref: str | None = None,
metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> DateSchema ``` Returns a schema that matches a date value, e.g.: ```py from datetime import date from pydantic_core import SchemaValidator, core_schema schema = core_schema.date_schema(le=date(2020, 1, 1), ge=date(2019, 1, 1)) v = SchemaValidator(schema) assert v.validate_python(date(2019, 6, 1)) == date(2019, 6, 1) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether the value should be a date or a value that can be converted to a date | `None` | | `le` | `date | None` | The value must be less than or equal to this date | `None` | | `ge` | `date | None` | The value must be greater than or equal to this date | `None` | | `lt` | `date | None` | The value must be strictly less than this date | `None` | | `gt` | `date | None` | The value must be strictly greater than this date | `None` | | `now_op` | `Literal['past', 'future'] | None` | The value must be in the past or future relative to the current date | `None` | | `now_utc_offset` | `int | None` | The value must be in the past or future relative to the current date with this utc offset | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def date_schema( *, strict: bool | None = None, le: date | None = None, ge: date | None = None, lt: date | None = None, gt: date | None = None, now_op: Literal['past', 'future'] | None = None, now_utc_offset: int | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> DateSchema: """ Returns a schema that matches a date value, e.g.: ```py from datetime import date from pydantic_core import SchemaValidator, core_schema schema = core_schema.date_schema(le=date(2020, 1, 1), ge=date(2019, 1, 1)) v = SchemaValidator(schema) assert v.validate_python(date(2019, 6, 1)) == date(2019, 6, 1) ``` Args: strict: Whether the value should be a date or a value that can be converted to a date le: The value must be less than or equal to this date ge: The value must be greater than or equal to this date lt: The value must be strictly less than this date gt: The value must be strictly greater than this date now_op: The value must be in the past or future relative to the current date now_utc_offset: The value must be in the past or future relative to the current date with this utc offset ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='date', strict=strict, le=le, ge=ge, lt=lt, gt=gt, now_op=now_op, now_utc_offset=now_utc_offset, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## time_schema ```python time_schema( *, strict: bool | None = None, le: time | None = None, ge: time | None = None, lt: time | None = None, gt: time | None = None, tz_constraint: ( Literal["aware", "naive"] | int | None ) = None, microseconds_precision: Literal[ "truncate", "error" ] = "truncate", ref: str | None = None, metadata: 
dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> TimeSchema ``` Returns a schema that matches a time value, e.g.: ```py from datetime import time from pydantic_core import SchemaValidator, core_schema schema = core_schema.time_schema(le=time(12, 0, 0), ge=time(6, 0, 0)) v = SchemaValidator(schema) assert v.validate_python(time(9, 0, 0)) == time(9, 0, 0) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether the value should be a time or a value that can be converted to a time | `None` | | `le` | `time | None` | The value must be less than or equal to this time | `None` | | `ge` | `time | None` | The value must be greater than or equal to this time | `None` | | `lt` | `time | None` | The value must be strictly less than this time | `None` | | `gt` | `time | None` | The value must be strictly greater than this time | `None` | | `tz_constraint` | `Literal['aware', 'naive'] | int | None` | The value must be timezone aware or naive, or an int to indicate required tz offset | `None` | | `microseconds_precision` | `Literal['truncate', 'error']` | The behavior when seconds have more than 6 digits or microseconds is too large | `'truncate'` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def time_schema( *, strict: bool | None = None, le: time | None = None, ge: time | None = None, lt: time | None = None, gt: time | None = None, tz_constraint: Literal['aware', 'naive'] | int | None = None, microseconds_precision: Literal['truncate', 'error'] = 'truncate', ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> TimeSchema: """ Returns a schema that matches a time value, e.g.: ```py from datetime import time from pydantic_core import SchemaValidator, core_schema schema = core_schema.time_schema(le=time(12, 0, 0), ge=time(6, 0, 0)) v = SchemaValidator(schema) assert v.validate_python(time(9, 0, 0)) == time(9, 0, 0) ``` Args: strict: Whether the value should be a time or a value that can be converted to a time le: The value must be less than or equal to this time ge: The value must be greater than or equal to this time lt: The value must be strictly less than this time gt: The value must be strictly greater than this time tz_constraint: The value must be timezone aware or naive, or an int to indicate required tz offset microseconds_precision: The behavior when seconds have more than 6 digits or microseconds is too large ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='time', strict=strict, le=le, ge=ge, lt=lt, gt=gt, tz_constraint=tz_constraint, microseconds_precision=microseconds_precision, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## datetime_schema ```python datetime_schema( *, strict: bool | None = None, le: datetime | None = None, ge: datetime | None = None, lt: datetime | None = None, gt: datetime | None = None, now_op: Literal["past", "future"] | 
None = None, tz_constraint: ( Literal["aware", "naive"] | int | None ) = None, now_utc_offset: int | None = None, microseconds_precision: Literal[ "truncate", "error" ] = "truncate", ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> DatetimeSchema ``` Returns a schema that matches a datetime value, e.g.: ```py from datetime import datetime from pydantic_core import SchemaValidator, core_schema schema = core_schema.datetime_schema() v = SchemaValidator(schema) now = datetime.now() assert v.validate_python(str(now)) == now ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether the value should be a datetime or a value that can be converted to a datetime | `None` | | `le` | `datetime | None` | The value must be less than or equal to this datetime | `None` | | `ge` | `datetime | None` | The value must be greater than or equal to this datetime | `None` | | `lt` | `datetime | None` | The value must be strictly less than this datetime | `None` | | `gt` | `datetime | None` | The value must be strictly greater than this datetime | `None` | | `now_op` | `Literal['past', 'future'] | None` | The value must be in the past or future relative to the current datetime | `None` | | `tz_constraint` | `Literal['aware', 'naive'] | int | None` | The value must be timezone aware or naive, or an int to indicate required tz offset TODO: use of a tzinfo where offset changes based on the datetime is not yet supported | `None` | | `now_utc_offset` | `int | None` | The value must be in the past or future relative to the current datetime with this utc offset | `None` | | `microseconds_precision` | `Literal['truncate', 'error']` | The behavior when seconds have more than 6 digits or microseconds is too large | `'truncate'` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def datetime_schema( *, strict: bool | None = None, le: datetime | None = None, ge: datetime | None = None, lt: datetime | None = None, gt: datetime | None = None, now_op: Literal['past', 'future'] | None = None, tz_constraint: Literal['aware', 'naive'] | int | None = None, now_utc_offset: int | None = None, microseconds_precision: Literal['truncate', 'error'] = 'truncate', ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> DatetimeSchema: """ Returns a schema that matches a datetime value, e.g.: ```py from datetime import datetime from pydantic_core import SchemaValidator, core_schema schema = core_schema.datetime_schema() v = SchemaValidator(schema) now = datetime.now() assert v.validate_python(str(now)) == now ``` Args: strict: Whether the value should be a datetime or a value that can be converted to a datetime le: The value must be less than or equal to this datetime ge: The value must be greater than or equal to this datetime lt: The value must be strictly less than this datetime gt: The value must be strictly greater than this datetime now_op: The value must be in the past or future relative to the current datetime tz_constraint: The value must be timezone aware or naive, or an int to 
indicate required tz offset TODO: use of a tzinfo where offset changes based on the datetime is not yet supported now_utc_offset: The value must be in the past or future relative to the current datetime with this utc offset microseconds_precision: The behavior when seconds have more than 6 digits or microseconds is too large ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='datetime', strict=strict, le=le, ge=ge, lt=lt, gt=gt, now_op=now_op, tz_constraint=tz_constraint, now_utc_offset=now_utc_offset, microseconds_precision=microseconds_precision, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## timedelta_schema ```python timedelta_schema( *, strict: bool | None = None, le: timedelta | None = None, ge: timedelta | None = None, lt: timedelta | None = None, gt: timedelta | None = None, microseconds_precision: Literal[ "truncate", "error" ] = "truncate", ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> TimedeltaSchema ``` Returns a schema that matches a timedelta value, e.g.: ```py from datetime import timedelta from pydantic_core import SchemaValidator, core_schema schema = core_schema.timedelta_schema(le=timedelta(days=1), ge=timedelta(days=0)) v = SchemaValidator(schema) assert v.validate_python(timedelta(hours=12)) == timedelta(hours=12) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether the value should be a timedelta or a value that can be converted to a timedelta | `None` | | `le` | `timedelta | None` | The value must be less than or equal to this timedelta | `None` | | `ge` | `timedelta | None` | The value must be greater than or equal to this timedelta | `None` | | `lt` | `timedelta | None` | The value must be strictly less than this timedelta | `None` | | `gt` | `timedelta | None` | The value must be strictly greater than this timedelta | `None` | | `microseconds_precision` | `Literal['truncate', 'error']` | The behavior when seconds have more than 6 digits or microseconds is too large | `'truncate'` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def timedelta_schema( *, strict: bool | None = None, le: timedelta | None = None, ge: timedelta | None = None, lt: timedelta | None = None, gt: timedelta | None = None, microseconds_precision: Literal['truncate', 'error'] = 'truncate', ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> TimedeltaSchema: """ Returns a schema that matches a timedelta value, e.g.: ```py from datetime import timedelta from pydantic_core import SchemaValidator, core_schema schema = core_schema.timedelta_schema(le=timedelta(days=1), ge=timedelta(days=0)) v = SchemaValidator(schema) assert v.validate_python(timedelta(hours=12)) == timedelta(hours=12) ``` Args: strict: Whether the value should be a timedelta or a value that can be converted to a timedelta le: The value must be 
less than or equal to this timedelta ge: The value must be greater than or equal to this timedelta lt: The value must be strictly less than this timedelta gt: The value must be strictly greater than this timedelta microseconds_precision: The behavior when seconds have more than 6 digits or microseconds is too large ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='timedelta', strict=strict, le=le, ge=ge, lt=lt, gt=gt, microseconds_precision=microseconds_precision, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## literal_schema ```python literal_schema( expected: list[Any], *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> LiteralSchema ``` Returns a schema that matches a literal value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.literal_schema(['hello', 'world']) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `expected` | `list[Any]` | The value must be one of these values | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def literal_schema( expected: list[Any], *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> LiteralSchema: """ Returns a schema that matches a literal value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.literal_schema(['hello', 'world']) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello' ``` Args: expected: The value must be one of these values ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='literal', expected=expected, ref=ref, metadata=metadata, serialization=serialization) ```` ## enum_schema ```python enum_schema( cls: Any, members: list[Any], *, sub_type: Literal["str", "int", "float"] | None = None, missing: Callable[[Any], Any] | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> EnumSchema ``` Returns a schema that matches an enum value, e.g.: ```py from enum import Enum from pydantic_core import SchemaValidator, core_schema class Color(Enum): RED = 1 GREEN = 2 BLUE = 3 schema = core_schema.enum_schema(Color, list(Color.__members__.values())) v = SchemaValidator(schema) assert v.validate_python(2) is Color.GREEN ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `Any` | The enum class | *required* | | `members` | `list[Any]` | The members of the enum, generally list(MyEnum.__members__.values()) | *required* | | `sub_type` | `Literal['str', 'int', 'float'] | None` | 
The type of the enum, either 'str', 'int' or 'float', or None for plain enums | `None` | | `missing` | `Callable[[Any], Any] | None` | A function to use when the value is not found in the enum, from _missing_ | `None` | | `strict` | `bool | None` | Whether to use strict mode, defaults to False | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def enum_schema( cls: Any, members: list[Any], *, sub_type: Literal['str', 'int', 'float'] | None = None, missing: Callable[[Any], Any] | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> EnumSchema: """ Returns a schema that matches an enum value, e.g.: ```py from enum import Enum from pydantic_core import SchemaValidator, core_schema class Color(Enum): RED = 1 GREEN = 2 BLUE = 3 schema = core_schema.enum_schema(Color, list(Color.__members__.values())) v = SchemaValidator(schema) assert v.validate_python(2) is Color.GREEN ``` Args: cls: The enum class members: The members of the enum, generally `list(MyEnum.__members__.values())` sub_type: The type of the enum, either 'str', 'int' or 'float', or None for plain enums missing: A function to use when the value is not found in the enum, from `_missing_` strict: Whether to use strict mode, defaults to False ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='enum', cls=cls, members=members, sub_type=sub_type, missing=missing, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
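A sketch of the `missing` hook, assuming it behaves like `Enum._missing_` and may return a member for otherwise-unknown inputs:

```py
from enum import Enum

from pydantic_core import SchemaValidator, core_schema

class Color(Enum):
    RED = 1
    GREEN = 2

def fallback(value):
    # assumption: like _missing_, returning a member accepts the input
    return Color.RED

schema = core_schema.enum_schema(
    Color, list(Color.__members__.values()), missing=fallback
)
v = SchemaValidator(schema)
assert v.validate_python(2) is Color.GREEN
assert v.validate_python(99) is Color.RED  # routed through `missing`
```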
## is_instance_schema ```python is_instance_schema( cls: Any, *, cls_repr: str | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> IsInstanceSchema ``` Returns a schema that checks if a value is an instance of a class, equivalent to python's built-in `isinstance` function, e.g.: ```py from pydantic_core import SchemaValidator, core_schema class A: pass schema = core_schema.is_instance_schema(cls=A) v = SchemaValidator(schema) v.validate_python(A()) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `Any` | The value must be an instance of this class | *required* | | `cls_repr` | `str | None` | If provided this string is used in the validator name instead of repr(cls) | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def is_instance_schema( cls: Any, *, cls_repr: str | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> IsInstanceSchema: """ Returns a schema that checks if a value is an instance of a class, equivalent to python's `isinstance` method, e.g.: ```py from pydantic_core import SchemaValidator, core_schema class A: pass schema = core_schema.is_instance_schema(cls=A) v = SchemaValidator(schema) v.validate_python(A()) ``` Args: cls: The value must be an instance of this class cls_repr: If provided this string is used in the validator name instead of `repr(cls)` ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='is-instance', cls=cls, cls_repr=cls_repr, ref=ref, metadata=metadata, serialization=serialization ) ```` ## is_subclass_schema ```python is_subclass_schema( cls: type[Any], *, cls_repr: str | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> IsInstanceSchema ``` Returns a schema that checks if a value is a subclass of a class, equivalent to python's built-in `issubclass` function, e.g.: ```py from pydantic_core import SchemaValidator, core_schema class A: pass class B(A): pass schema = core_schema.is_subclass_schema(cls=A) v = SchemaValidator(schema) v.validate_python(B) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `type[Any]` | The value must be a subclass of this class | *required* | | `cls_repr` | `str | None` | If provided this string is used in the validator name instead of repr(cls) | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def is_subclass_schema( cls: type[Any], *, cls_repr: str | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> IsInstanceSchema: """ Returns a schema that checks if a value is a subtype of a class, equivalent to python's `issubclass` method, e.g.: ```py from pydantic_core import SchemaValidator, core_schema class A: pass class B(A): pass schema = core_schema.is_subclass_schema(cls=A) v = SchemaValidator(schema) v.validate_python(B) ``` Args: cls: The value must be a subclass of this class cls_repr: If provided this string is used in the validator name instead of `repr(cls)` ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='is-subclass', cls=cls, cls_repr=cls_repr, ref=ref, metadata=metadata, serialization=serialization ) ````
None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def callable_schema( *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> CallableSchema: """ Returns a schema that checks if a value is callable, equivalent to python's built-in `callable` function, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.callable_schema() v = SchemaValidator(schema) v.validate_python(min) ``` Args: ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='callable', ref=ref, metadata=metadata, serialization=serialization) ```` ## list_schema ```python list_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None ) -> ListSchema ``` Returns a schema that matches a list value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.list_schema(core_schema.int_schema(), min_length=0, max_length=10) v = SchemaValidator(schema) assert v.validate_python(['4']) == [4] ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `CoreSchema | None` | The value must be a list of items that match this schema | `None` | | `min_length` | `int | None` | The value must be a list with at least this many items | `None` | | `max_length` | `int | None` | The value must be a list with at most this many items | `None` | | `fail_fast` | `bool | None` | Stop validation on the first error | `None` | | `strict` | `bool | None` | Whether to use strict mode; if True the value must already be a list | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `IncExSeqOrElseSerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def list_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None, ) -> ListSchema: """ Returns a schema that matches a list value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.list_schema(core_schema.int_schema(), min_length=0, max_length=10) v = SchemaValidator(schema) assert v.validate_python(['4']) == [4] ``` Args: items_schema: The value must be a list of items that match this schema min_length: The value must be a list with at least this many items max_length: The value must be a list with at
most this many items fail_fast: Stop validation on the first error strict: Whether to use strict mode; if True the value must already be a list ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='list', items_schema=items_schema, min_length=min_length, max_length=max_length, fail_fast=fail_fast, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
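As a sketch of what `fail_fast` changes: without it every invalid item is reported, with it validation stops at the first error.

```py
from pydantic_core import SchemaValidator, ValidationError, core_schema

schema = core_schema.list_schema(core_schema.int_schema(), fail_fast=True)
v = SchemaValidator(schema)
try:
    v.validate_python(['a', 'b'])  # both items are invalid
except ValidationError as exc:
    assert len(exc.errors()) == 1  # only the first error is collected
```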
## tuple_positional_schema ```python tuple_positional_schema( items_schema: list[CoreSchema], *, extras_schema: CoreSchema | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None ) -> TupleSchema ``` Returns a schema that matches a tuple of schemas, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.tuple_positional_schema( [core_schema.int_schema(), core_schema.str_schema()] ) v = SchemaValidator(schema) assert v.validate_python((1, 'hello')) == (1, 'hello') ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `list[CoreSchema]` | The value must be a tuple with items that match these schemas | *required* | | `extras_schema` | `CoreSchema | None` | The value must be a tuple with items that match this schema This was inspired by JSON schema's prefixItems and items fields. In python's typing.Tuple, you can't specify a type for "extra" items -- they must all be the same type if the length is variable. So this field won't be set from a typing.Tuple annotation on a pydantic model. | `None` | | `strict` | `bool | None` | Whether to use strict mode; if True the value must already be a tuple | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `IncExSeqOrElseSerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def tuple_positional_schema( items_schema: list[CoreSchema], *, extras_schema: CoreSchema | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None, ) -> TupleSchema: """ Returns a schema that matches a tuple of schemas, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.tuple_positional_schema( [core_schema.int_schema(), core_schema.str_schema()] ) v = SchemaValidator(schema) assert v.validate_python((1, 'hello')) == (1, 'hello') ``` Args: items_schema: The value must be a tuple with items that match these schemas extras_schema: The value must be a tuple with items that match this schema This was inspired by JSON schema's `prefixItems` and `items` fields. In python's `typing.Tuple`, you can't specify a type for "extra" items -- they must all be the same type if the length is variable. So this field won't be set from a `typing.Tuple` annotation on a pydantic model. strict: Whether to use strict mode; if True the value must already be a tuple ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ if extras_schema is not None: variadic_item_index = len(items_schema) items_schema = items_schema + [extras_schema] else: variadic_item_index = None return tuple_schema( items_schema=items_schema, variadic_item_index=variadic_item_index, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ````
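A sketch of `extras_schema`: leading items are validated positionally, and any further items against `extras_schema` (as the source above shows, this is rewritten into a `tuple_schema` with a variadic item).

```py
from pydantic_core import SchemaValidator, core_schema

schema = core_schema.tuple_positional_schema(
    [core_schema.int_schema()],
    extras_schema=core_schema.str_schema(),  # items after the first must be str
)
v = SchemaValidator(schema)
assert v.validate_python((1, 'a', 'b')) == (1, 'a', 'b')
```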
## tuple_variable_schema ```python tuple_variable_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None ) -> TupleSchema ``` Returns a schema that matches a tuple of a given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.tuple_variable_schema( items_schema=core_schema.int_schema(), min_length=0, max_length=10 ) v = SchemaValidator(schema) assert v.validate_python(('1', 2, 3)) == (1, 2, 3) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `CoreSchema | None` | The value must be a tuple with items that match this schema | `None` | | `min_length` | `int | None` | The value must be a tuple with at least this many items | `None` | | `max_length` | `int | None` | The value must be a tuple with at most this many items | `None` | | `strict` | `bool | None` | Whether to use strict mode; if True the value must already be a tuple | `None` | | `ref` | `str | None` | Optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `IncExSeqOrElseSerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def tuple_variable_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None, ) -> TupleSchema: """ Returns a schema that matches a tuple of a given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.tuple_variable_schema( items_schema=core_schema.int_schema(), min_length=0, max_length=10 ) v = SchemaValidator(schema) assert v.validate_python(('1', 2, 3)) == (1, 2, 3) ``` Args: items_schema: The value must be a tuple with items that match this schema min_length: The value must be a tuple with at least this many items max_length: The value must be a tuple with at most this many items strict: Whether to use strict mode; if True the value must already be a tuple ref: Optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return tuple_schema( items_schema=[items_schema or any_schema()], variadic_item_index=0, min_length=min_length, max_length=max_length, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## tuple_schema ```python tuple_schema( items_schema: list[CoreSchema], *, variadic_item_index: int | None = None, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None ) -> TupleSchema ``` Returns a schema that matches a tuple of schemas, with an optional variadic item, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.tuple_schema( [core_schema.int_schema(), core_schema.str_schema(), core_schema.float_schema()], variadic_item_index=1, ) v = SchemaValidator(schema) assert v.validate_python((1, 'hello', 'world', 1.5)) == (1, 'hello', 'world', 1.5) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `list[CoreSchema]` | The value must be a tuple with items that match these schemas | *required* | | `variadic_item_index` | `int | None` | The index of the schema in items_schema to be treated as variadic (following PEP 646) | `None` | | `min_length` | `int | None` | The value must be a tuple with at least this many items | `None` | | `max_length` | `int | None` | The value must be a tuple with at most this many items | `None` | | `fail_fast` | `bool | None` | Stop validation on the first error | `None` | | `strict` | `bool | None` | Whether to use strict mode; if True the value must already be a tuple | `None` | | `ref` | `str | None` | Optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `IncExSeqOrElseSerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def tuple_schema( items_schema: list[CoreSchema], *, variadic_item_index: int | None = None, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None, ) -> TupleSchema: """ Returns a schema that matches a tuple of schemas, with an optional variadic item, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.tuple_schema( [core_schema.int_schema(), core_schema.str_schema(), core_schema.float_schema()], variadic_item_index=1, ) v = SchemaValidator(schema) assert v.validate_python((1, 'hello', 'world', 1.5)) == (1, 'hello', 'world', 1.5) ``` Args: items_schema: The value must be a tuple with items that match these schemas variadic_item_index: The index of the schema in `items_schema` to be treated as variadic (following PEP 646) min_length: The value must be a tuple with at least this many items max_length: The value must be a tuple with at most this many items fail_fast: Stop validation on the first error strict: Whether to use strict mode; if True the value must already be a tuple ref: Optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='tuple', items_schema=items_schema, variadic_item_index=variadic_item_index, min_length=min_length, max_length=max_length, fail_fast=fail_fast, strict=strict, ref=ref,
metadata=metadata, serialization=serialization, ) ```` ## set_schema ```python set_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> SetSchema ``` Returns a schema that matches a set of a given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.set_schema( items_schema=core_schema.int_schema(), min_length=0, max_length=10 ) v = SchemaValidator(schema) assert v.validate_python({1, '2', 3}) == {1, 2, 3} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `CoreSchema | None` | The value must be a set with items that match this schema | `None` | | `min_length` | `int | None` | The value must be a set with at least this many items | `None` | | `max_length` | `int | None` | The value must be a set with at most this many items | `None` | | `fail_fast` | `bool | None` | Stop validation on the first error | `None` | | `strict` | `bool | None` | Whether to use strict mode; if True the value must already be a set | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def set_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> SetSchema: """ Returns a schema that matches a set of a given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.set_schema( items_schema=core_schema.int_schema(), min_length=0, max_length=10 ) v = SchemaValidator(schema) assert v.validate_python({1, '2', 3}) == {1, 2, 3} ``` Args: items_schema: The value must be a set with items that match this schema min_length: The value must be a set with at least this many items max_length: The value must be a set with at most this many items fail_fast: Stop validation on the first error strict: Whether to use strict mode; if True the value must already be a set ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='set', items_schema=items_schema, min_length=min_length, max_length=max_length, fail_fast=fail_fast, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## frozenset_schema ```python frozenset_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> FrozenSetSchema ``` Returns a schema that matches a frozenset of a given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.frozenset_schema(
items_schema=core_schema.int_schema(), min_length=0, max_length=10 ) v = SchemaValidator(schema) assert v.validate_python(frozenset(range(3))) == frozenset({0, 1, 2}) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `CoreSchema | None` | The value must be a frozenset with items that match this schema | `None` | | `min_length` | `int | None` | The value must be a frozenset with at least this many items | `None` | | `max_length` | `int | None` | The value must be a frozenset with at most this many items | `None` | | `fail_fast` | `bool | None` | Stop validation on the first error | `None` | | `strict` | `bool | None` | Whether to use strict mode; if True the value must already be a frozenset | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def frozenset_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, fail_fast: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> FrozenSetSchema: """ Returns a schema that matches a frozenset of a given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.frozenset_schema( items_schema=core_schema.int_schema(), min_length=0, max_length=10 ) v = SchemaValidator(schema) assert v.validate_python(frozenset(range(3))) == frozenset({0, 1, 2}) ``` Args: items_schema: The value must be a frozenset with items that match this schema min_length: The value must be a frozenset with at least this many items max_length: The value must be a frozenset with at most this many items fail_fast: Stop validation on the first error strict: Whether to use strict mode; if True the value must already be a frozenset ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='frozenset', items_schema=items_schema, min_length=min_length, max_length=max_length, fail_fast=fail_fast, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## generator_schema ```python generator_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None ) -> GeneratorSchema ``` Returns a schema that matches a generator value, e.g.: ```py from typing import Iterator from pydantic_core import SchemaValidator, core_schema def gen() -> Iterator[int]: yield 1 schema = core_schema.generator_schema(items_schema=core_schema.int_schema()) v = SchemaValidator(schema) v.validate_python(gen()) ``` Unlike other types, validated generators do not raise ValidationErrors eagerly, but instead will raise a ValidationError when a violating value is actually read from the generator. This is to ensure that "validated" generators retain the benefit of lazy evaluation.
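A short sketch of that lazy behavior: `validate_python` succeeds up front, and the error only surfaces when the offending item is read.

```py
from pydantic_core import SchemaValidator, ValidationError, core_schema

def gen():
    yield 'not an int'

schema = core_schema.generator_schema(items_schema=core_schema.int_schema())
v = SchemaValidator(schema)
validated = v.validate_python(gen())  # no error raised here
try:
    next(validated)  # the invalid item is only rejected when it is read
except ValidationError:
    pass
```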
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `items_schema` | `CoreSchema | None` | The value must be a generator with items that match this schema | `None` | | `min_length` | `int | None` | The value must be a generator that yields at least this many items | `None` | | `max_length` | `int | None` | The value must be a generator that yields at most this many items | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `IncExSeqOrElseSerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def generator_schema( items_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: IncExSeqOrElseSerSchema | None = None, ) -> GeneratorSchema: """ Returns a schema that matches a generator value, e.g.: ```py from typing import Iterator from pydantic_core import SchemaValidator, core_schema def gen() -> Iterator[int]: yield 1 schema = core_schema.generator_schema(items_schema=core_schema.int_schema()) v = SchemaValidator(schema) v.validate_python(gen()) ``` Unlike other types, validated generators do not raise ValidationErrors eagerly, but instead will raise a ValidationError when a violating value is actually read from the generator. This is to ensure that "validated" generators retain the benefit of lazy evaluation. Args: items_schema: The value must be a generator with items that match this schema min_length: The value must be a generator that yields at least this many items max_length: The value must be a generator that yields at most this many items ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='generator', items_schema=items_schema, min_length=min_length, max_length=max_length, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## dict_schema ```python dict_schema( keys_schema: CoreSchema | None = None, values_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> DictSchema ``` Returns a schema that matches a dict value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.dict_schema( keys_schema=core_schema.str_schema(), values_schema=core_schema.int_schema() ) v = SchemaValidator(schema) assert v.validate_python({'a': '1', 'b': 2}) == {'a': 1, 'b': 2} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `keys_schema` | `CoreSchema | None` | The value must be a dict with keys that match this schema | `None` | | `values_schema` | `CoreSchema | None` | The value must be a dict with values that match this schema | `None` | | `min_length` | `int | None` | The value must be a dict with at least this many items | `None` | | `max_length` | `int | None` | The value must be a dict with at most this many items | `None` | | `strict` | `bool | None` | Whether the 
keys and values should be validated with strict mode | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def dict_schema( keys_schema: CoreSchema | None = None, values_schema: CoreSchema | None = None, *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> DictSchema: """ Returns a schema that matches a dict value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.dict_schema( keys_schema=core_schema.str_schema(), values_schema=core_schema.int_schema() ) v = SchemaValidator(schema) assert v.validate_python({'a': '1', 'b': 2}) == {'a': 1, 'b': 2} ``` Args: keys_schema: The value must be a dict with keys that match this schema values_schema: The value must be a dict with values that match this schema min_length: The value must be a dict with at least this many items max_length: The value must be a dict with at most this many items strict: Whether the keys and values should be validated with strict mode ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='dict', keys_schema=keys_schema, values_schema=values_schema, min_length=min_length, max_length=max_length, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## no_info_before_validator_function ```python no_info_before_validator_function( function: NoInfoValidatorFunction, schema: CoreSchema, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> BeforeValidatorFunctionSchema ``` Returns a schema that calls a validator function before validating, no `info` argument is provided, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: bytes) -> str: return v.decode() + 'world' func_schema = core_schema.no_info_before_validator_function( function=fn, schema=core_schema.str_schema() ) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `NoInfoValidatorFunction` | The validator function to call | *required* | | `schema` | `CoreSchema` | The schema to validate the output of the validator function | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code 
in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def no_info_before_validator_function( function: NoInfoValidatorFunction, schema: CoreSchema, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> BeforeValidatorFunctionSchema: """ Returns a schema that calls a validator function before validating, no `info` argument is provided, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: bytes) -> str: return v.decode() + 'world' func_schema = core_schema.no_info_before_validator_function( function=fn, schema=core_schema.str_schema() ) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Args: function: The validator function to call schema: The schema to validate the output of the validator function ref: optional unique identifier of the schema, used to reference the schema in other places json_schema_input_schema: The core schema to be used to generate the corresponding JSON Schema input type metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-before', function={'type': 'no-info', 'function': function}, schema=schema, ref=ref, json_schema_input_schema=json_schema_input_schema, metadata=metadata, serialization=serialization, ) ```` ## with_info_before_validator_function ```python with_info_before_validator_function( function: WithInfoValidatorFunction, schema: CoreSchema, *, field_name: str | None = None, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> BeforeValidatorFunctionSchema ``` Returns a schema that calls a validator function before validation, the function is called with an `info` argument, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: bytes, info: core_schema.ValidationInfo) -> str: assert info.data is not None assert info.field_name is not None return v.decode() + 'world' func_schema = core_schema.with_info_before_validator_function( function=fn, schema=core_schema.str_schema(), field_name='a' ) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `WithInfoValidatorFunction` | The validator function to call | *required* | | `field_name` | `str | None` | The name of the field | `None` | | `schema` | `CoreSchema` | The schema to validate the output of the validator function | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def with_info_before_validator_function( 
function: WithInfoValidatorFunction, schema: CoreSchema, *, field_name: str | None = None, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> BeforeValidatorFunctionSchema: """ Returns a schema that calls a validator function before validation, the function is called with an `info` argument, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: bytes, info: core_schema.ValidationInfo) -> str: assert info.data is not None assert info.field_name is not None return v.decode() + 'world' func_schema = core_schema.with_info_before_validator_function( function=fn, schema=core_schema.str_schema(), field_name='a' ) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Args: function: The validator function to call field_name: The name of the field schema: The schema to validate the output of the validator function ref: optional unique identifier of the schema, used to reference the schema in other places json_schema_input_schema: The core schema to be used to generate the corresponding JSON Schema input type metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-before', function=_dict_not_none(type='with-info', function=function, field_name=field_name), schema=schema, ref=ref, json_schema_input_schema=json_schema_input_schema, metadata=metadata, serialization=serialization, ) ```` ## no_info_after_validator_function ```python no_info_after_validator_function( function: NoInfoValidatorFunction, schema: CoreSchema, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> AfterValidatorFunctionSchema ``` Returns a schema that calls a validator function after validating, no `info` argument is provided, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str) -> str: return v + 'world' func_schema = core_schema.no_info_after_validator_function(fn, core_schema.str_schema()) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `NoInfoValidatorFunction` | The validator function to call after the schema is validated | *required* | | `schema` | `CoreSchema` | The schema to validate before the validator function | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def no_info_after_validator_function( function: NoInfoValidatorFunction, schema: CoreSchema, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: 
dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> AfterValidatorFunctionSchema: """ Returns a schema that calls a validator function after validating, no `info` argument is provided, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str) -> str: return v + 'world' func_schema = core_schema.no_info_after_validator_function(fn, core_schema.str_schema()) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Args: function: The validator function to call after the schema is validated schema: The schema to validate before the validator function ref: optional unique identifier of the schema, used to reference the schema in other places json_schema_input_schema: The core schema to be used to generate the corresponding JSON Schema input type metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-after', function={'type': 'no-info', 'function': function}, schema=schema, ref=ref, json_schema_input_schema=json_schema_input_schema, metadata=metadata, serialization=serialization, ) ```` ## with_info_after_validator_function ```python with_info_after_validator_function( function: WithInfoValidatorFunction, schema: CoreSchema, *, field_name: str | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> AfterValidatorFunctionSchema ``` Returns a schema that calls a validator function after validation, the function is called with an `info` argument, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert info.data is not None assert info.field_name is not None return v + 'world' func_schema = core_schema.with_info_after_validator_function( function=fn, schema=core_schema.str_schema(), field_name='a' ) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `WithInfoValidatorFunction` | The validator function to call after the schema is validated | *required* | | `schema` | `CoreSchema` | The schema to validate before the validator function | *required* | | `field_name` | `str | None` | The name of the field this validator is applied to, if any | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def with_info_after_validator_function( function: WithInfoValidatorFunction, schema: CoreSchema, *, field_name: str | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> AfterValidatorFunctionSchema: """ Returns a schema that calls a validator function after validation, the function is called with an `info` argument, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str,
info: core_schema.ValidationInfo) -> str: assert info.data is not None assert info.field_name is not None return v + 'world' func_schema = core_schema.with_info_after_validator_function( function=fn, schema=core_schema.str_schema(), field_name='a' ) schema = core_schema.typed_dict_schema({'a': core_schema.typed_dict_field(func_schema)}) v = SchemaValidator(schema) assert v.validate_python({'a': b'hello '}) == {'a': 'hello world'} ``` Args: function: The validator function to call after the schema is validated schema: The schema to validate before the validator function field_name: The name of the field this validator is applied to, if any ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-after', function=_dict_not_none(type='with-info', function=function, field_name=field_name), schema=schema, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## no_info_wrap_validator_function ```python no_info_wrap_validator_function( function: NoInfoWrapValidatorFunction, schema: CoreSchema, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> WrapValidatorFunctionSchema ``` Returns a schema which calls a function with a `validator` callable argument which can optionally be used to call inner validation with the function logic, this is much like the "onion" implementation of middleware in many popular web frameworks, no `info` argument is passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn( v: str, validator: core_schema.ValidatorFunctionWrapHandler, ) -> str: return validator(input_value=v) + 'world' schema = core_schema.no_info_wrap_validator_function( function=fn, schema=core_schema.str_schema() ) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `NoInfoWrapValidatorFunction` | The validator function to call | *required* | | `schema` | `CoreSchema` | The schema to validate the output of the validator function | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def no_info_wrap_validator_function( function: NoInfoWrapValidatorFunction, schema: CoreSchema, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> WrapValidatorFunctionSchema: """ Returns a schema which calls a function with a `validator` callable argument which can optionally be used to call inner validation with the function logic, this is much like the "onion" implementation of middleware in many popular web frameworks, no `info` argument is passed, e.g.: ```py from pydantic_core import SchemaValidator,
core_schema def fn( v: str, validator: core_schema.ValidatorFunctionWrapHandler, ) -> str: return validator(input_value=v) + 'world' schema = core_schema.no_info_wrap_validator_function( function=fn, schema=core_schema.str_schema() ) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Args: function: The validator function to call schema: The schema to validate the output of the validator function ref: optional unique identifier of the schema, used to reference the schema in other places json_schema_input_schema: The core schema to be used to generate the corresponding JSON Schema input type metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-wrap', function={'type': 'no-info', 'function': function}, schema=schema, json_schema_input_schema=json_schema_input_schema, ref=ref, metadata=metadata, serialization=serialization, ) ````
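One common use of a wrap validator is falling back to a substitute value when the inner schema fails; a minimal sketch (catching the inner `ValidationError` inside the wrap function is the assumed pattern here):

```py
from pydantic_core import SchemaValidator, ValidationError, core_schema

def fallback(v, handler):
    try:
        return handler(v)  # run the inner str_schema validation
    except ValidationError:
        return 'default'  # recover instead of failing

schema = core_schema.no_info_wrap_validator_function(fallback, core_schema.str_schema())
v = SchemaValidator(schema)
assert v.validate_python(123) == 'default'  # 123 is rejected by str_schema
```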
## with_info_wrap_validator_function ```python with_info_wrap_validator_function( function: WithInfoWrapValidatorFunction, schema: CoreSchema, *, field_name: str | None = None, json_schema_input_schema: CoreSchema | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> WrapValidatorFunctionSchema ``` Returns a schema which calls a function with a `validator` callable argument which can optionally be used to call inner validation with the function logic, this is much like the "onion" implementation of middleware in many popular web frameworks, an `info` argument is also passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn( v: str, validator: core_schema.ValidatorFunctionWrapHandler, info: core_schema.ValidationInfo, ) -> str: return validator(input_value=v) + 'world' schema = core_schema.with_info_wrap_validator_function( function=fn, schema=core_schema.str_schema() ) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `WithInfoWrapValidatorFunction` | The validator function to call | *required* | | `schema` | `CoreSchema` | The schema to validate the output of the validator function | *required* | | `field_name` | `str | None` | The name of the field this validator is applied to, if any | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def with_info_wrap_validator_function( function: WithInfoWrapValidatorFunction, schema: CoreSchema, *, field_name: str | None = None, json_schema_input_schema: CoreSchema | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> WrapValidatorFunctionSchema: """ Returns a schema which calls a function with a `validator` callable argument which can optionally be used to call inner validation with the function logic, this is much like the "onion" implementation of middleware in many popular web frameworks, an `info` argument is also passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn( v: str, validator: core_schema.ValidatorFunctionWrapHandler, info: core_schema.ValidationInfo, ) -> str: return validator(input_value=v) + 'world' schema = core_schema.with_info_wrap_validator_function( function=fn, schema=core_schema.str_schema() ) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Args: function: The validator function to call schema: The schema to validate the output of the validator function field_name: The name of the field this validator is applied to, if any json_schema_input_schema: The core schema to be used to generate the corresponding JSON Schema input type ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-wrap', function=_dict_not_none(type='with-info', function=function, field_name=field_name), schema=schema, json_schema_input_schema=json_schema_input_schema, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## no_info_plain_validator_function ```python no_info_plain_validator_function( function: NoInfoValidatorFunction, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> PlainValidatorFunctionSchema ``` Returns a schema that uses the provided function for validation, no `info` argument is passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str) -> str: assert 'hello' in v return v + 'world' schema = core_schema.no_info_plain_validator_function(function=fn) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `NoInfoValidatorFunction` | The validator function to call | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def no_info_plain_validator_function( function: NoInfoValidatorFunction, *, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> PlainValidatorFunctionSchema: """ Returns a schema that uses the provided function for validation, no `info` argument is passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str) -> str: assert 'hello' in v return v + 'world' schema = core_schema.no_info_plain_validator_function(function=fn) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Args: function: The validator function to call ref: optional unique identifier of the schema, used to reference the schema in other places json_schema_input_schema: The core schema to be used to generate
the corresponding JSON Schema input type metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-plain', function={'type': 'no-info', 'function': function}, ref=ref, json_schema_input_schema=json_schema_input_schema, metadata=metadata, serialization=serialization, ) ```` ## with_info_plain_validator_function ```python with_info_plain_validator_function( function: WithInfoValidatorFunction, *, field_name: str | None = None, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> PlainValidatorFunctionSchema ``` Returns a schema that uses the provided function for validation, an `info` argument is passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert 'hello' in v return v + 'world' schema = core_schema.with_info_plain_validator_function(function=fn) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `function` | `WithInfoValidatorFunction` | The validator function to call | *required* | | `field_name` | `str | None` | The name of the field this validator is applied to, if any | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `json_schema_input_schema` | `CoreSchema | None` | The core schema to be used to generate the corresponding JSON Schema input type | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def with_info_plain_validator_function( function: WithInfoValidatorFunction, *, field_name: str | None = None, ref: str | None = None, json_schema_input_schema: CoreSchema | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> PlainValidatorFunctionSchema: """ Returns a schema that uses the provided function for validation, an `info` argument is passed, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert 'hello' in v return v + 'world' schema = core_schema.with_info_plain_validator_function(function=fn) v = SchemaValidator(schema) assert v.validate_python('hello ') == 'hello world' ``` Args: function: The validator function to call field_name: The name of the field this validator is applied to, if any ref: optional unique identifier of the schema, used to reference the schema in other places json_schema_input_schema: The core schema to be used to generate the corresponding JSON Schema input type metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='function-plain', function=_dict_not_none(type='with-info', function=function, field_name=field_name), ref=ref, json_schema_input_schema=json_schema_input_schema, metadata=metadata, serialization=serialization, ) ```` ## with_default_schema ```python with_default_schema( schema: CoreSchema, *, default: Any = PydanticUndefined, default_factory:
Union[ Callable[[], Any], Callable[[dict[str, Any]], Any], None, ] = None, default_factory_takes_data: bool | None = None, on_error: ( Literal["raise", "omit", "default"] | None ) = None, validate_default: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> WithDefaultSchema ``` Returns a schema that adds a default value to the given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.with_default_schema(core_schema.str_schema(), default='hello') wrapper_schema = core_schema.typed_dict_schema( {'a': core_schema.typed_dict_field(schema)} ) v = SchemaValidator(wrapper_schema) assert v.validate_python({}) == v.validate_python({'a': 'hello'}) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema` | The schema to add a default value to | *required* | | `default` | `Any` | The default value to use | `PydanticUndefined` | | `default_factory` | `Union[Callable[[], Any], Callable[[dict[str, Any]], Any], None]` | A callable that returns the default value to use | `None` | | `default_factory_takes_data` | `bool | None` | Whether the default factory takes a validated data argument | `None` | | `on_error` | `Literal['raise', 'omit', 'default'] | None` | What to do if the schema validation fails. One of 'raise', 'omit', 'default' | `None` | | `validate_default` | `bool | None` | Whether the default value should be validated | `None` | | `strict` | `bool | None` | Whether the underlying schema should be validated with strict mode | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def with_default_schema( schema: CoreSchema, *, default: Any = PydanticUndefined, default_factory: Union[Callable[[], Any], Callable[[dict[str, Any]], Any], None] = None, default_factory_takes_data: bool | None = None, on_error: Literal['raise', 'omit', 'default'] | None = None, validate_default: bool | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> WithDefaultSchema: """ Returns a schema that adds a default value to the given schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.with_default_schema(core_schema.str_schema(), default='hello') wrapper_schema = core_schema.typed_dict_schema( {'a': core_schema.typed_dict_field(schema)} ) v = SchemaValidator(wrapper_schema) assert v.validate_python({}) == v.validate_python({'a': 'hello'}) ``` Args: schema: The schema to add a default value to default: The default value to use default_factory: A callable that returns the default value to use default_factory_takes_data: Whether the default factory takes a validated data argument on_error: What to do if the schema validation fails. 
One of 'raise', 'omit', 'default' validate_default: Whether the default value should be validated strict: Whether the underlying schema should be validated with strict mode ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ s = _dict_not_none( type='default', schema=schema, default_factory=default_factory, default_factory_takes_data=default_factory_takes_data, on_error=on_error, validate_default=validate_default, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) if default is not PydanticUndefined: s['default'] = default return s ````
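A sketch of `on_error='default'`, which substitutes the default instead of raising when the wrapped schema fails (behavior assumed from the `on_error` description above):

```py
from pydantic_core import SchemaValidator, core_schema

schema = core_schema.with_default_schema(
    core_schema.int_schema(), default=0, on_error='default'
)
v = SchemaValidator(schema)
assert v.validate_python('not an int') == 0  # validation failed, default used
```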
v.validate_python(1) == 1 ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `choices` | `list[CoreSchema | tuple[CoreSchema, str]]` | The schemas to match. If a tuple, the second item is used as the label for the case. | *required* | | `auto_collapse` | `bool | None` | whether to automatically collapse unions with one element to the inner validator, default true | `None` | | `custom_error_type` | `str | None` | The custom error type to use if the validation fails | `None` | | `custom_error_message` | `str | None` | The custom error message to use if the validation fails | `None` | | `custom_error_context` | `dict[str, str | int] | None` | The custom error context to use if the validation fails | `None` | | `mode` | `Literal['smart', 'left_to_right'] | None` | How to select which choice to return * smart (default) will try to return the choice which is the closest match to the input value * left_to_right will return the first choice in choices which succeeds validation | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def union_schema( choices: list[CoreSchema | tuple[CoreSchema, str]], *, auto_collapse: bool | None = None, custom_error_type: str | None = None, custom_error_message: str | None = None, custom_error_context: dict[str, str | int] | None = None, mode: Literal['smart', 'left_to_right'] | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> UnionSchema: """ Returns a schema that matches a union value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.union_schema([core_schema.str_schema(), core_schema.int_schema()]) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello' assert v.validate_python(1) == 1 ``` Args: choices: The schemas to match. If a tuple, the second item is used as the label for the case. 
auto_collapse: whether to automatically collapse unions with one element to the inner validator, default true custom_error_type: The custom error type to use if the validation fails custom_error_message: The custom error message to use if the validation fails custom_error_context: The custom error context to use if the validation fails mode: How to select which choice to return * `smart` (default) will try to return the choice which is the closest match to the input value * `left_to_right` will return the first choice in `choices` which succeeds validation ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='union', choices=choices, auto_collapse=auto_collapse, custom_error_type=custom_error_type, custom_error_message=custom_error_message, custom_error_context=custom_error_context, mode=mode, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## tagged_union_schema ```python tagged_union_schema( choices: dict[Any, CoreSchema], discriminator: ( str | list[str | int] | list[list[str | int]] | Callable[[Any], Any] ), *, custom_error_type: str | None = None, custom_error_message: str | None = None, custom_error_context: ( dict[str, int | str | float] | None ) = None, strict: bool | None = None, from_attributes: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> TaggedUnionSchema ``` Returns a schema that matches a tagged union value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema apple_schema = core_schema.typed_dict_schema( { 'foo': core_schema.typed_dict_field(core_schema.str_schema()), 'bar': core_schema.typed_dict_field(core_schema.int_schema()), } ) banana_schema = core_schema.typed_dict_schema( { 'foo': core_schema.typed_dict_field(core_schema.str_schema()), 'spam': core_schema.typed_dict_field( core_schema.list_schema(items_schema=core_schema.int_schema()) ), } ) schema = core_schema.tagged_union_schema( choices={ 'apple': apple_schema, 'banana': banana_schema, }, discriminator='foo', ) v = SchemaValidator(schema) assert v.validate_python({'foo': 'apple', 'bar': '123'}) == {'foo': 'apple', 'bar': 123} assert v.validate_python({'foo': 'banana', 'spam': [1, 2, 3]}) == { 'foo': 'banana', 'spam': [1, 2, 3], } ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `choices` | `dict[Any, CoreSchema]` | The schemas to match When retrieving a schema from choices using the discriminator value, if the value is a str, it should be fed back into the choices map until a schema is obtained (This approach is to prevent multiple ownership of a single schema in Rust) | *required* | | `discriminator` | `str | list[str | int] | list[list[str | int]] | Callable[[Any], Any]` | The discriminator to use to determine the schema to use * If discriminator is a str, it is the name of the attribute to use as the discriminator * If discriminator is a list of int/str, it should be used as a "path" to access the discriminator * If discriminator is a list of lists, each inner list is a path, and the first path that exists is used * If discriminator is a callable, it should return the discriminator when called on the value to validate; the callable can return None to indicate that there is no matching discriminator present on the input | *required* | | `custom_error_type` | 
`str | None` | The custom error type to use if the validation fails | `None` | | `custom_error_message` | `str | None` | The custom error message to use if the validation fails | `None` | | `custom_error_context` | `dict[str, int | str | float] | None` | The custom error context to use if the validation fails | `None` | | `strict` | `bool | None` | Whether the underlying schemas should be validated with strict mode | `None` | | `from_attributes` | `bool | None` | Whether to use the attributes of the object to retrieve the discriminator value | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def tagged_union_schema( choices: dict[Any, CoreSchema], discriminator: str | list[str | int] | list[list[str | int]] | Callable[[Any], Any], *, custom_error_type: str | None = None, custom_error_message: str | None = None, custom_error_context: dict[str, int | str | float] | None = None, strict: bool | None = None, from_attributes: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> TaggedUnionSchema: """ Returns a schema that matches a tagged union value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema apple_schema = core_schema.typed_dict_schema( { 'foo': core_schema.typed_dict_field(core_schema.str_schema()), 'bar': core_schema.typed_dict_field(core_schema.int_schema()), } ) banana_schema = core_schema.typed_dict_schema( { 'foo': core_schema.typed_dict_field(core_schema.str_schema()), 'spam': core_schema.typed_dict_field( core_schema.list_schema(items_schema=core_schema.int_schema()) ), } ) schema = core_schema.tagged_union_schema( choices={ 'apple': apple_schema, 'banana': banana_schema, }, discriminator='foo', ) v = SchemaValidator(schema) assert v.validate_python({'foo': 'apple', 'bar': '123'}) == {'foo': 'apple', 'bar': 123} assert v.validate_python({'foo': 'banana', 'spam': [1, 2, 3]}) == { 'foo': 'banana', 'spam': [1, 2, 3], } ``` Args: choices: The schemas to match When retrieving a schema from `choices` using the discriminator value, if the value is a str, it should be fed back into the `choices` map until a schema is obtained (This approach is to prevent multiple ownership of a single schema in Rust) discriminator: The discriminator to use to determine the schema to use * If `discriminator` is a str, it is the name of the attribute to use as the discriminator * If `discriminator` is a list of int/str, it should be used as a "path" to access the discriminator * If `discriminator` is a list of lists, each inner list is a path, and the first path that exists is used * If `discriminator` is a callable, it should return the discriminator when called on the value to validate; the callable can return `None` to indicate that there is no matching discriminator present on the input custom_error_type: The custom error type to use if the validation fails custom_error_message: The custom error message to use if the validation fails custom_error_context: The custom error context to use if the validation fails strict: Whether the underlying schemas should be validated with strict mode from_attributes: Whether to use the 
attributes of the object to retrieve the discriminator value ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='tagged-union', choices=choices, discriminator=discriminator, custom_error_type=custom_error_type, custom_error_message=custom_error_message, custom_error_context=custom_error_context, strict=strict, from_attributes=from_attributes, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## chain_schema ```python chain_schema( steps: list[CoreSchema], *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> ChainSchema ``` Returns a schema that chains the provided validation schemas, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert 'hello' in v return v + ' world' fn_schema = core_schema.with_info_plain_validator_function(function=fn) schema = core_schema.chain_schema( [fn_schema, fn_schema, fn_schema, core_schema.str_schema()] ) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello world world world' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `steps` | `list[CoreSchema]` | The schemas to chain | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def chain_schema( steps: list[CoreSchema], *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> ChainSchema: """ Returns a schema that chains the provided validation schemas, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert 'hello' in v return v + ' world' fn_schema = core_schema.with_info_plain_validator_function(function=fn) schema = core_schema.chain_schema( [fn_schema, fn_schema, fn_schema, core_schema.str_schema()] ) v = SchemaValidator(schema) assert v.validate_python('hello') == 'hello world world world' ``` Args: steps: The schemas to chain ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='chain', steps=steps, ref=ref, metadata=metadata, serialization=serialization) ```` ## lax_or_strict_schema ```python lax_or_strict_schema( lax_schema: CoreSchema, strict_schema: CoreSchema, *, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> LaxOrStrictSchema ``` Returns a schema that uses the lax or strict schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert 'hello' in v return v + ' world' lax_schema = core_schema.int_schema(strict=False) strict_schema = core_schema.int_schema(strict=True) schema = core_schema.lax_or_strict_schema( 
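    # strict=True selects the strict int schema here, so a str like '123' would be rejected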
lax_schema=lax_schema, strict_schema=strict_schema, strict=True ) v = SchemaValidator(schema) assert v.validate_python(123) == 123 schema = core_schema.lax_or_strict_schema( lax_schema=lax_schema, strict_schema=strict_schema, strict=False ) v = SchemaValidator(schema) assert v.validate_python('123') == 123 ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `lax_schema` | `CoreSchema` | The lax schema to use | *required* | | `strict_schema` | `CoreSchema` | The strict schema to use | *required* | | `strict` | `bool | None` | Whether the strict schema should be used | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def lax_or_strict_schema( lax_schema: CoreSchema, strict_schema: CoreSchema, *, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> LaxOrStrictSchema: """ Returns a schema that uses the lax or strict schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema def fn(v: str, info: core_schema.ValidationInfo) -> str: assert 'hello' in v return v + ' world' lax_schema = core_schema.int_schema(strict=False) strict_schema = core_schema.int_schema(strict=True) schema = core_schema.lax_or_strict_schema( lax_schema=lax_schema, strict_schema=strict_schema, strict=True ) v = SchemaValidator(schema) assert v.validate_python(123) == 123 schema = core_schema.lax_or_strict_schema( lax_schema=lax_schema, strict_schema=strict_schema, strict=False ) v = SchemaValidator(schema) assert v.validate_python('123') == 123 ``` Args: lax_schema: The lax schema to use strict_schema: The strict schema to use strict: Whether the strict schema should be used ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='lax-or-strict', lax_schema=lax_schema, strict_schema=strict_schema, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## json_or_python_schema ```python json_or_python_schema( json_schema: CoreSchema, python_schema: CoreSchema, *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> JsonOrPythonSchema ``` Returns a schema that uses the Json or Python schema depending on the input: ```py from pydantic_core import SchemaValidator, ValidationError, core_schema v = SchemaValidator( core_schema.json_or_python_schema( json_schema=core_schema.int_schema(), python_schema=core_schema.int_schema(strict=True), ) ) assert v.validate_json('"123"') == 123 try: v.validate_python('123') except ValidationError: pass else: raise AssertionError('Validation should have failed') ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `json_schema` | `CoreSchema` | The schema to use for Json inputs | *required* | | `python_schema` | `CoreSchema` | The schema to use for Python inputs | *required* | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in 
other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def json_or_python_schema( json_schema: CoreSchema, python_schema: CoreSchema, *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> JsonOrPythonSchema: """ Returns a schema that uses the Json or Python schema depending on the input: ```py from pydantic_core import SchemaValidator, ValidationError, core_schema v = SchemaValidator( core_schema.json_or_python_schema( json_schema=core_schema.int_schema(), python_schema=core_schema.int_schema(strict=True), ) ) assert v.validate_json('"123"') == 123 try: v.validate_python('123') except ValidationError: pass else: raise AssertionError('Validation should have failed') ``` Args: json_schema: The schema to use for Json inputs python_schema: The schema to use for Python inputs ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='json-or-python', json_schema=json_schema, python_schema=python_schema, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## typed_dict_field ```python typed_dict_field( schema: CoreSchema, *, required: bool | None = None, validation_alias: ( str | list[str | int] | list[list[str | int]] | None ) = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, metadata: dict[str, Any] | None = None ) -> TypedDictField ``` Returns a schema that matches a typed dict field, e.g.: ```py from pydantic_core import core_schema field = core_schema.typed_dict_field(schema=core_schema.int_schema(), required=True) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema` | The schema to use for the field | *required* | | `required` | `bool | None` | Whether the field is required, otherwise uses the value from total on the typed dict | `None` | | `validation_alias` | `str | list[str | int] | list[list[str | int]] | None` | The alias(es) to use to find the field in the validation data | `None` | | `serialization_alias` | `str | None` | The alias to use as a key when serializing | `None` | | `serialization_exclude` | `bool | None` | Whether to exclude the field when serializing | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def typed_dict_field( schema: CoreSchema, *, required: bool | None = None, validation_alias: str | list[str | int] | list[list[str | int]] | None = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, metadata: dict[str, Any] | None = None, ) -> TypedDictField: """ Returns a schema that matches a typed dict field, e.g.: ```py from pydantic_core import core_schema field = core_schema.typed_dict_field(schema=core_schema.int_schema(), required=True) ``` Args: schema: The schema to use for the field required: Whether the field is required, otherwise uses the value from `total` on the typed dict validation_alias: The 
alias(es) to use to find the field in the validation data serialization_alias: The alias to use as a key when serializing serialization_exclude: Whether to exclude the field when serializing metadata: Any other information you want to include with the schema, not used by pydantic-core """ return _dict_not_none( type='typed-dict-field', schema=schema, required=required, validation_alias=validation_alias, serialization_alias=serialization_alias, serialization_exclude=serialization_exclude, metadata=metadata, ) ```` ## typed_dict_schema ```python typed_dict_schema( fields: dict[str, TypedDictField], *, cls: type[Any] | None = None, cls_name: str | None = None, computed_fields: list[ComputedField] | None = None, strict: bool | None = None, extras_schema: CoreSchema | None = None, extra_behavior: ExtraBehavior | None = None, total: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, config: CoreConfig | None = None ) -> TypedDictSchema ``` Returns a schema that matches a typed dict, e.g.: ```py from typing_extensions import TypedDict from pydantic_core import SchemaValidator, core_schema class MyTypedDict(TypedDict): a: str wrapper_schema = core_schema.typed_dict_schema( {'a': core_schema.typed_dict_field(core_schema.str_schema())}, cls=MyTypedDict ) v = SchemaValidator(wrapper_schema) assert v.validate_python({'a': 'hello'}) == {'a': 'hello'} ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `fields` | `dict[str, TypedDictField]` | The fields to use for the typed dict | *required* | | `cls` | `type[Any] | None` | The class to use for the typed dict | `None` | | `cls_name` | `str | None` | The name to use in error locations. Falls back to cls.__name__, or the validator name if no class is provided. 
| `None` | | `computed_fields` | `list[ComputedField] | None` | Computed fields to use when serializing the model, only applies when directly inside a model | `None` | | `strict` | `bool | None` | Whether the typed dict is strict | `None` | | `extras_schema` | `CoreSchema | None` | The extra validator to use for the typed dict | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `extra_behavior` | `ExtraBehavior | None` | The extra behavior to use for the typed dict | `None` | | `total` | `bool | None` | Whether the typed dict is total, otherwise uses typed_dict_total from config | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def typed_dict_schema( fields: dict[str, TypedDictField], *, cls: type[Any] | None = None, cls_name: str | None = None, computed_fields: list[ComputedField] | None = None, strict: bool | None = None, extras_schema: CoreSchema | None = None, extra_behavior: ExtraBehavior | None = None, total: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, config: CoreConfig | None = None, ) -> TypedDictSchema: """ Returns a schema that matches a typed dict, e.g.: ```py from typing_extensions import TypedDict from pydantic_core import SchemaValidator, core_schema class MyTypedDict(TypedDict): a: str wrapper_schema = core_schema.typed_dict_schema( {'a': core_schema.typed_dict_field(core_schema.str_schema())}, cls=MyTypedDict ) v = SchemaValidator(wrapper_schema) assert v.validate_python({'a': 'hello'}) == {'a': 'hello'} ``` Args: fields: The fields to use for the typed dict cls: The class to use for the typed dict cls_name: The name to use in error locations. Falls back to `cls.__name__`, or the validator name if no class is provided. 
computed_fields: Computed fields to use when serializing the model, only applies when directly inside a model strict: Whether the typed dict is strict extras_schema: The extra validator to use for the typed dict ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core extra_behavior: The extra behavior to use for the typed dict total: Whether the typed dict is total, otherwise uses `typed_dict_total` from config serialization: Custom serialization schema """ return _dict_not_none( type='typed-dict', fields=fields, cls=cls, cls_name=cls_name, computed_fields=computed_fields, strict=strict, extras_schema=extras_schema, extra_behavior=extra_behavior, total=total, ref=ref, metadata=metadata, serialization=serialization, config=config, ) ```` ## model_field ```python model_field( schema: CoreSchema, *, validation_alias: ( str | list[str | int] | list[list[str | int]] | None ) = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, frozen: bool | None = None, metadata: dict[str, Any] | None = None ) -> ModelField ``` Returns a schema for a model field, e.g.: ```py from pydantic_core import core_schema field = core_schema.model_field(schema=core_schema.int_schema()) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema` | The schema to use for the field | *required* | | `validation_alias` | `str | list[str | int] | list[list[str | int]] | None` | The alias(es) to use to find the field in the validation data | `None` | | `serialization_alias` | `str | None` | The alias to use as a key when serializing | `None` | | `serialization_exclude` | `bool | None` | Whether to exclude the field when serializing | `None` | | `frozen` | `bool | None` | Whether the field is frozen | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def model_field( schema: CoreSchema, *, validation_alias: str | list[str | int] | list[list[str | int]] | None = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, frozen: bool | None = None, metadata: dict[str, Any] | None = None, ) -> ModelField: """ Returns a schema for a model field, e.g.: ```py from pydantic_core import core_schema field = core_schema.model_field(schema=core_schema.int_schema()) ``` Args: schema: The schema to use for the field validation_alias: The alias(es) to use to find the field in the validation data serialization_alias: The alias to use as a key when serializing serialization_exclude: Whether to exclude the field when serializing frozen: Whether the field is frozen metadata: Any other information you want to include with the schema, not used by pydantic-core """ return _dict_not_none( type='model-field', schema=schema, validation_alias=validation_alias, serialization_alias=serialization_alias, serialization_exclude=serialization_exclude, frozen=frozen, metadata=metadata, ) ```` ## model_fields_schema ```python model_fields_schema( fields: dict[str, ModelField], *, model_name: str | None = None, computed_fields: list[ComputedField] | None = None, strict: bool | None = None, extras_schema: CoreSchema | None = None, extras_keys_schema: CoreSchema | None = None, extra_behavior: ExtraBehavior | None = None, from_attributes: 
bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> ModelFieldsSchema ``` Returns a schema that matches the fields of a Pydantic model, e.g.: ```py from pydantic_core import SchemaValidator, core_schema wrapper_schema = core_schema.model_fields_schema( {'a': core_schema.model_field(core_schema.str_schema())} ) v = SchemaValidator(wrapper_schema) print(v.validate_python({'a': 'hello'})) #> ({'a': 'hello'}, None, {'a'}) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `fields` | `dict[str, ModelField]` | The fields of the model | *required* | | `model_name` | `str | None` | The name of the model, used for error messages, defaults to "Model" | `None` | | `computed_fields` | `list[ComputedField] | None` | Computed fields to use when serializing the model, only applies when directly inside a model | `None` | | `strict` | `bool | None` | Whether the model is strict | `None` | | `extras_schema` | `CoreSchema | None` | The schema to use when validating extra input data | `None` | | `extras_keys_schema` | `CoreSchema | None` | The schema to use when validating the keys of extra input data | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `extra_behavior` | `ExtraBehavior | None` | The extra behavior to use for the model fields | `None` | | `from_attributes` | `bool | None` | Whether the model fields should be populated from attributes | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def model_fields_schema( fields: dict[str, ModelField], *, model_name: str | None = None, computed_fields: list[ComputedField] | None = None, strict: bool | None = None, extras_schema: CoreSchema | None = None, extras_keys_schema: CoreSchema | None = None, extra_behavior: ExtraBehavior | None = None, from_attributes: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> ModelFieldsSchema: """ Returns a schema that matches the fields of a Pydantic model, e.g.: ```py from pydantic_core import SchemaValidator, core_schema wrapper_schema = core_schema.model_fields_schema( {'a': core_schema.model_field(core_schema.str_schema())} ) v = SchemaValidator(wrapper_schema) print(v.validate_python({'a': 'hello'})) #> ({'a': 'hello'}, None, {'a'}) ``` Args: fields: The fields of the model model_name: The name of the model, used for error messages, defaults to "Model" computed_fields: Computed fields to use when serializing the model, only applies when directly inside a model strict: Whether the model is strict extras_schema: The schema to use when validating extra input data extras_keys_schema: The schema to use when validating the keys of extra input data ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core extra_behavior: The extra behavior to use for the model fields from_attributes: Whether the model fields should be populated from attributes serialization: Custom serialization schema """ return _dict_not_none( type='model-fields', fields=fields, 
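        # _dict_not_none drops any keyword whose value is None, so unset options are omitted from the schema dict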
model_name=model_name, computed_fields=computed_fields, strict=strict, extras_schema=extras_schema, extras_keys_schema=extras_keys_schema, extra_behavior=extra_behavior, from_attributes=from_attributes, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## model_schema ```python model_schema( cls: type[Any], schema: CoreSchema, *, generic_origin: type[Any] | None = None, custom_init: bool | None = None, root_model: bool | None = None, post_init: str | None = None, revalidate_instances: ( Literal["always", "never", "subclass-instances"] | None ) = None, strict: bool | None = None, frozen: bool | None = None, extra_behavior: ExtraBehavior | None = None, config: CoreConfig | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> ModelSchema ``` A model schema generally contains a typed-dict schema. It will run the typed dict validator, then create a new instance of the class and set the dict and fields set returned from the typed dict validator to `__dict__` and `__pydantic_fields_set__` respectively. Example: ```py from pydantic_core import CoreConfig, SchemaValidator, core_schema class MyModel: __slots__ = ( '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__', ) schema = core_schema.model_schema( cls=MyModel, config=CoreConfig(str_max_length=5), schema=core_schema.model_fields_schema( fields={'a': core_schema.model_field(core_schema.str_schema())}, ), ) v = SchemaValidator(schema) assert v.isinstance_python({'a': 'hello'}) is True assert v.isinstance_python({'a': 'too long'}) is False ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `type[Any]` | The class to use for the model | *required* | | `schema` | `CoreSchema` | The schema to use for the model | *required* | | `generic_origin` | `type[Any] | None` | The origin type used for this model, if it's a parametrized generic. 
Ex, if this model schema represents SomeModel[int], generic_origin is SomeModel | `None` | | `custom_init` | `bool | None` | Whether the model has a custom init method | `None` | | `root_model` | `bool | None` | Whether the model is a RootModel | `None` | | `post_init` | `str | None` | The call after init to use for the model | `None` | | `revalidate_instances` | `Literal['always', 'never', 'subclass-instances'] | None` | whether instances of models and dataclasses (including subclass instances) should re-validate defaults to config.revalidate_instances, else 'never' | `None` | | `strict` | `bool | None` | Whether the model is strict | `None` | | `frozen` | `bool | None` | Whether the model is frozen | `None` | | `extra_behavior` | `ExtraBehavior | None` | The extra behavior to use for the model, used in serialization | `None` | | `config` | `CoreConfig | None` | The config to use for the model | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def model_schema( cls: type[Any], schema: CoreSchema, *, generic_origin: type[Any] | None = None, custom_init: bool | None = None, root_model: bool | None = None, post_init: str | None = None, revalidate_instances: Literal['always', 'never', 'subclass-instances'] | None = None, strict: bool | None = None, frozen: bool | None = None, extra_behavior: ExtraBehavior | None = None, config: CoreConfig | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> ModelSchema: """ A model schema generally contains a typed-dict schema. It will run the typed dict validator, then create a new class and set the dict and fields set returned from the typed dict validator to `__dict__` and `__pydantic_fields_set__` respectively. Example: ```py from pydantic_core import CoreConfig, SchemaValidator, core_schema class MyModel: __slots__ = ( '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__', ) schema = core_schema.model_schema( cls=MyModel, config=CoreConfig(str_max_length=5), schema=core_schema.model_fields_schema( fields={'a': core_schema.model_field(core_schema.str_schema())}, ), ) v = SchemaValidator(schema) assert v.isinstance_python({'a': 'hello'}) is True assert v.isinstance_python({'a': 'too long'}) is False ``` Args: cls: The class to use for the model schema: The schema to use for the model generic_origin: The origin type used for this model, if it's a parametrized generic. 
Ex, if this model schema represents `SomeModel[int]`, generic_origin is `SomeModel` custom_init: Whether the model has a custom init method root_model: Whether the model is a `RootModel` post_init: The call after init to use for the model revalidate_instances: whether instances of models and dataclasses (including subclass instances) should re-validate defaults to config.revalidate_instances, else 'never' strict: Whether the model is strict frozen: Whether the model is frozen extra_behavior: The extra behavior to use for the model, used in serialization config: The config to use for the model ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='model', cls=cls, generic_origin=generic_origin, schema=schema, custom_init=custom_init, root_model=root_model, post_init=post_init, revalidate_instances=revalidate_instances, strict=strict, frozen=frozen, extra_behavior=extra_behavior, config=config, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## dataclass_field ```python dataclass_field( name: str, schema: CoreSchema, *, kw_only: bool | None = None, init: bool | None = None, init_only: bool | None = None, validation_alias: ( str | list[str | int] | list[list[str | int]] | None ) = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, metadata: dict[str, Any] | None = None, frozen: bool | None = None ) -> DataclassField ``` Returns a schema for a dataclass field, e.g.: ```py from pydantic_core import SchemaValidator, core_schema field = core_schema.dataclass_field( name='a', schema=core_schema.str_schema(), kw_only=False ) schema = core_schema.dataclass_args_schema('Foobar', [field]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hello'}) == ({'a': 'hello'}, None) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `name` | `str` | The name to use for the argument parameter | *required* | | `schema` | `CoreSchema` | The schema to use for the argument parameter | *required* | | `kw_only` | `bool | None` | Whether the field can be set with a positional argument as well as a keyword argument | `None` | | `init` | `bool | None` | Whether the field should be validated during initialization | `None` | | `init_only` | `bool | None` | Whether the field should be omitted from __dict__ and passed to __post_init__ | `None` | | `validation_alias` | `str | list[str | int] | list[list[str | int]] | None` | The alias(es) to use to find the field in the validation data | `None` | | `serialization_alias` | `str | None` | The alias to use as a key when serializing | `None` | | `serialization_exclude` | `bool | None` | Whether to exclude the field when serializing | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `frozen` | `bool | None` | Whether the field is frozen | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def dataclass_field( name: str, schema: CoreSchema, *, kw_only: bool | None = None, init: bool | None = None, init_only: bool | None = None, validation_alias: str | list[str | int] | list[list[str | int]] | None = None, serialization_alias: str | None = None, serialization_exclude: bool | None = None, metadata: dict[str, Any] | None = None, frozen: 
bool | None = None, ) -> DataclassField: """ Returns a schema for a dataclass field, e.g.: ```py from pydantic_core import SchemaValidator, core_schema field = core_schema.dataclass_field( name='a', schema=core_schema.str_schema(), kw_only=False ) schema = core_schema.dataclass_args_schema('Foobar', [field]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hello'}) == ({'a': 'hello'}, None) ``` Args: name: The name to use for the argument parameter schema: The schema to use for the argument parameter kw_only: Whether the field can be set with a positional argument as well as a keyword argument init: Whether the field should be validated during initialization init_only: Whether the field should be omitted from `__dict__` and passed to `__post_init__` validation_alias: The alias(es) to use to find the field in the validation data serialization_alias: The alias to use as a key when serializing serialization_exclude: Whether to exclude the field when serializing metadata: Any other information you want to include with the schema, not used by pydantic-core frozen: Whether the field is frozen """ return _dict_not_none( type='dataclass-field', name=name, schema=schema, kw_only=kw_only, init=init, init_only=init_only, validation_alias=validation_alias, serialization_alias=serialization_alias, serialization_exclude=serialization_exclude, metadata=metadata, frozen=frozen, ) ```` ## dataclass_args_schema ```python dataclass_args_schema( dataclass_name: str, fields: list[DataclassField], *, computed_fields: list[ComputedField] | None = None, collect_init_only: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, extra_behavior: ExtraBehavior | None = None ) -> DataclassArgsSchema ``` Returns a schema for validating dataclass arguments, e.g.: ```py from pydantic_core import SchemaValidator, core_schema field_a = core_schema.dataclass_field( name='a', schema=core_schema.str_schema(), kw_only=False ) field_b = core_schema.dataclass_field( name='b', schema=core_schema.bool_schema(), kw_only=False ) schema = core_schema.dataclass_args_schema('Foobar', [field_a, field_b]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hello', 'b': True}) == ({'a': 'hello', 'b': True}, None) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `dataclass_name` | `str` | The name of the dataclass being validated | *required* | | `fields` | `list[DataclassField]` | The fields to use for the dataclass | *required* | | `computed_fields` | `list[ComputedField] | None` | Computed fields to use when serializing the dataclass | `None` | | `collect_init_only` | `bool | None` | Whether to collect init only fields into a dict to pass to __post_init__ | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | | `extra_behavior` | `ExtraBehavior | None` | How to handle extra fields | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def dataclass_args_schema( dataclass_name: str, fields: list[DataclassField], *, computed_fields: list[ComputedField] | None = None, collect_init_only: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, 
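    # extra_behavior below is an ExtraBehavior literal: 'allow', 'forbid' or 'ignore'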
serialization: SerSchema | None = None, extra_behavior: ExtraBehavior | None = None, ) -> DataclassArgsSchema: """ Returns a schema for validating dataclass arguments, e.g.: ```py from pydantic_core import SchemaValidator, core_schema field_a = core_schema.dataclass_field( name='a', schema=core_schema.str_schema(), kw_only=False ) field_b = core_schema.dataclass_field( name='b', schema=core_schema.bool_schema(), kw_only=False ) schema = core_schema.dataclass_args_schema('Foobar', [field_a, field_b]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hello', 'b': True}) == ({'a': 'hello', 'b': True}, None) ``` Args: dataclass_name: The name of the dataclass being validated fields: The fields to use for the dataclass computed_fields: Computed fields to use when serializing the dataclass collect_init_only: Whether to collect init only fields into a dict to pass to `__post_init__` ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema extra_behavior: How to handle extra fields """ return _dict_not_none( type='dataclass-args', dataclass_name=dataclass_name, fields=fields, computed_fields=computed_fields, collect_init_only=collect_init_only, ref=ref, metadata=metadata, serialization=serialization, extra_behavior=extra_behavior, ) ```` ## dataclass_schema ```python dataclass_schema( cls: type[Any], schema: CoreSchema, fields: list[str], *, generic_origin: type[Any] | None = None, cls_name: str | None = None, post_init: bool | None = None, revalidate_instances: ( Literal["always", "never", "subclass-instances"] | None ) = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, frozen: bool | None = None, slots: bool | None = None, config: CoreConfig | None = None ) -> DataclassSchema ``` Returns a schema for a dataclass. As with `ModelSchema`, this schema can only be used as a field within another schema, not as the root type. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cls` | `type[Any]` | The dataclass type, used to perform subclass checks | *required* | | `schema` | `CoreSchema` | The schema to use for the dataclass fields | *required* | | `fields` | `list[str]` | Fields of the dataclass, this is used in serialization and in validation during re-validation and while validating assignment | *required* | | `generic_origin` | `type[Any] | None` | The origin type used for this dataclass, if it's a parametrized generic. 
Ex, if this model schema represents SomeDataclass[int], generic_origin is SomeDataclass | `None` | | `cls_name` | `str | None` | The name to use in error locs, etc; this is useful for generics (default: cls.__name__) | `None` | | `post_init` | `bool | None` | Whether to call __post_init__ after validation | `None` | | `revalidate_instances` | `Literal['always', 'never', 'subclass-instances'] | None` | whether instances of models and dataclasses (including subclass instances) should re-validate defaults to config.revalidate_instances, else 'never' | `None` | | `strict` | `bool | None` | Whether to require an exact instance of cls | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | | `frozen` | `bool | None` | Whether the dataclass is frozen | `None` | | `slots` | `bool | None` | Whether slots=True on the dataclass, means each field is assigned independently, rather than simply setting __dict__, default false | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ```python def dataclass_schema( cls: type[Any], schema: CoreSchema, fields: list[str], *, generic_origin: type[Any] | None = None, cls_name: str | None = None, post_init: bool | None = None, revalidate_instances: Literal['always', 'never', 'subclass-instances'] | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, frozen: bool | None = None, slots: bool | None = None, config: CoreConfig | None = None, ) -> DataclassSchema: """ Returns a schema for a dataclass. As with `ModelSchema`, this schema can only be used as a field within another schema, not as the root type. Args: cls: The dataclass type, used to perform subclass checks schema: The schema to use for the dataclass fields fields: Fields of the dataclass, this is used in serialization and in validation during re-validation and while validating assignment generic_origin: The origin type used for this dataclass, if it's a parametrized generic. 
Ex, if this model schema represents `SomeDataclass[int]`, generic_origin is `SomeDataclass` cls_name: The name to use in error locs, etc; this is useful for generics (default: `cls.__name__`) post_init: Whether to call `__post_init__` after validation revalidate_instances: whether instances of models and dataclasses (including subclass instances) should re-validate defaults to config.revalidate_instances, else 'never' strict: Whether to require an exact instance of `cls` ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema frozen: Whether the dataclass is frozen slots: Whether `slots=True` on the dataclass, means each field is assigned independently, rather than simply setting `__dict__`, default false """ return _dict_not_none( type='dataclass', cls=cls, generic_origin=generic_origin, fields=fields, cls_name=cls_name, schema=schema, post_init=post_init, revalidate_instances=revalidate_instances, strict=strict, ref=ref, metadata=metadata, serialization=serialization, frozen=frozen, slots=slots, config=config, ) ``` ## arguments_parameter ```python arguments_parameter( name: str, schema: CoreSchema, *, mode: ( Literal[ "positional_only", "positional_or_keyword", "keyword_only", ] | None ) = None, alias: ( str | list[str | int] | list[list[str | int]] | None ) = None ) -> ArgumentsParameter ``` Returns a schema that matches an argument parameter, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param = core_schema.arguments_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) schema = core_schema.arguments_schema([param]) v = SchemaValidator(schema) assert v.validate_python(('hello',)) == (('hello',), {}) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `name` | `str` | The name to use for the argument parameter | *required* | | `schema` | `CoreSchema` | The schema to use for the argument parameter | *required* | | `mode` | `Literal['positional_only', 'positional_or_keyword', 'keyword_only'] | None` | The mode to use for the argument parameter | `None` | | `alias` | `str | list[str | int] | list[list[str | int]] | None` | The alias to use for the argument parameter | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def arguments_parameter( name: str, schema: CoreSchema, *, mode: Literal['positional_only', 'positional_or_keyword', 'keyword_only'] | None = None, alias: str | list[str | int] | list[list[str | int]] | None = None, ) -> ArgumentsParameter: """ Returns a schema that matches an argument parameter, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param = core_schema.arguments_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) schema = core_schema.arguments_schema([param]) v = SchemaValidator(schema) assert v.validate_python(('hello',)) == (('hello',), {}) ``` Args: name: The name to use for the argument parameter schema: The schema to use for the argument parameter mode: The mode to use for the argument parameter alias: The alias to use for the argument parameter """ return _dict_not_none(name=name, schema=schema, mode=mode, alias=alias) ```` ## arguments_schema ```python arguments_schema( arguments: list[ArgumentsParameter], *, validate_by_name: bool | None = None, validate_by_alias: bool | None = None, var_args_schema: CoreSchema | 
None = None, var_kwargs_mode: VarKwargsMode | None = None, var_kwargs_schema: CoreSchema | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> ArgumentsSchema ``` Returns a schema that matches an arguments schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param_a = core_schema.arguments_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) param_b = core_schema.arguments_parameter( name='b', schema=core_schema.bool_schema(), mode='positional_only' ) schema = core_schema.arguments_schema([param_a, param_b]) v = SchemaValidator(schema) assert v.validate_python(('hello', True)) == (('hello', True), {}) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `arguments` | `list[ArgumentsParameter]` | The arguments to use for the arguments schema | *required* | | `validate_by_name` | `bool | None` | Whether to populate by the parameter names, defaults to False. | `None` | | `validate_by_alias` | `bool | None` | Whether to populate by the parameter aliases, defaults to True. | `None` | | `var_args_schema` | `CoreSchema | None` | The variable args schema to use for the arguments schema | `None` | | `var_kwargs_mode` | `VarKwargsMode | None` | The validation mode to use for variadic keyword arguments. If 'uniform', every value of the keyword arguments will be validated against the var_kwargs_schema schema. If 'unpacked-typed-dict', the var_kwargs_schema argument must be a typed_dict_schema | `None` | | `var_kwargs_schema` | `CoreSchema | None` | The variable kwargs schema to use for the arguments schema | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def arguments_schema( arguments: list[ArgumentsParameter], *, validate_by_name: bool | None = None, validate_by_alias: bool | None = None, var_args_schema: CoreSchema | None = None, var_kwargs_mode: VarKwargsMode | None = None, var_kwargs_schema: CoreSchema | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> ArgumentsSchema: """ Returns a schema that matches an arguments schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param_a = core_schema.arguments_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) param_b = core_schema.arguments_parameter( name='b', schema=core_schema.bool_schema(), mode='positional_only' ) schema = core_schema.arguments_schema([param_a, param_b]) v = SchemaValidator(schema) assert v.validate_python(('hello', True)) == (('hello', True), {}) ``` Args: arguments: The arguments to use for the arguments schema validate_by_name: Whether to populate by the parameter names, defaults to `False`. validate_by_alias: Whether to populate by the parameter aliases, defaults to `True`. var_args_schema: The variable args schema to use for the arguments schema var_kwargs_mode: The validation mode to use for variadic keyword arguments. If `'uniform'`, every value of the keyword arguments will be validated against the `var_kwargs_schema` schema. 
If `'unpacked-typed-dict'`, the `var_kwargs_schema` argument must be a [`typed_dict_schema`][pydantic_core.core_schema.typed_dict_schema] var_kwargs_schema: The variable kwargs schema to use for the arguments schema ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='arguments', arguments_schema=arguments, validate_by_name=validate_by_name, validate_by_alias=validate_by_alias, var_args_schema=var_args_schema, var_kwargs_mode=var_kwargs_mode, var_kwargs_schema=var_kwargs_schema, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## arguments_v3_parameter ```python arguments_v3_parameter( name: str, schema: CoreSchema, *, mode: ( Literal[ "positional_only", "positional_or_keyword", "keyword_only", "var_args", "var_kwargs_uniform", "var_kwargs_unpacked_typed_dict", ] | None ) = None, alias: ( str | list[str | int] | list[list[str | int]] | None ) = None ) -> ArgumentsV3Parameter ``` Returns a schema that matches an argument parameter, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param = core_schema.arguments_v3_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) schema = core_schema.arguments_v3_schema([param]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hello'}) == (('hello',), {}) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `name` | `str` | The name to use for the argument parameter | *required* | | `schema` | `CoreSchema` | The schema to use for the argument parameter | *required* | | `mode` | `Literal['positional_only', 'positional_or_keyword', 'keyword_only', 'var_args', 'var_kwargs_uniform', 'var_kwargs_unpacked_typed_dict'] | None` | The mode to use for the argument parameter | `None` | | `alias` | `str | list[str | int] | list[list[str | int]] | None` | The alias to use for the argument parameter | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def arguments_v3_parameter( name: str, schema: CoreSchema, *, mode: Literal[ 'positional_only', 'positional_or_keyword', 'keyword_only', 'var_args', 'var_kwargs_uniform', 'var_kwargs_unpacked_typed_dict', ] | None = None, alias: str | list[str | int] | list[list[str | int]] | None = None, ) -> ArgumentsV3Parameter: """ Returns a schema that matches an argument parameter, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param = core_schema.arguments_v3_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) schema = core_schema.arguments_v3_schema([param]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hello'}) == (('hello',), {}) ``` Args: name: The name to use for the argument parameter schema: The schema to use for the argument parameter mode: The mode to use for the argument parameter alias: The alias to use for the argument parameter """ return _dict_not_none(name=name, schema=schema, mode=mode, alias=alias) ```` ## arguments_v3_schema ```python arguments_v3_schema( arguments: list[ArgumentsV3Parameter], *, validate_by_name: bool | None = None, validate_by_alias: bool | None = None, extra_behavior: ( Literal["forbid", "ignore"] | None ) = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> ArgumentsV3Schema ``` Returns a schema that matches an 
arguments schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param_a = core_schema.arguments_v3_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) param_b = core_schema.arguments_v3_parameter( name='kwargs', schema=core_schema.bool_schema(), mode='var_kwargs_uniform' ) schema = core_schema.arguments_v3_schema([param_a, param_b]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hi', 'kwargs': {'b': True}}) == (('hi',), {'b': True}) ``` This schema is currently not used by other Pydantic components. In V3, it will most likely become the default arguments schema for the `'call'` schema. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `arguments` | `list[ArgumentsV3Parameter]` | The arguments to use for the arguments schema. | *required* | | `validate_by_name` | `bool | None` | Whether to populate by the parameter names, defaults to False. | `None` | | `validate_by_alias` | `bool | None` | Whether to populate by the parameter aliases, defaults to True. | `None` | | `extra_behavior` | `Literal['forbid', 'ignore'] | None` | The extra behavior to use. | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places. | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core. | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema. | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def arguments_v3_schema( arguments: list[ArgumentsV3Parameter], *, validate_by_name: bool | None = None, validate_by_alias: bool | None = None, extra_behavior: Literal['forbid', 'ignore'] | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> ArgumentsV3Schema: """ Returns a schema that matches an arguments schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param_a = core_schema.arguments_v3_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) param_b = core_schema.arguments_v3_parameter( name='kwargs', schema=core_schema.bool_schema(), mode='var_kwargs_uniform' ) schema = core_schema.arguments_v3_schema([param_a, param_b]) v = SchemaValidator(schema) assert v.validate_python({'a': 'hi', 'kwargs': {'b': True}}) == (('hi',), {'b': True}) ``` This schema is currently not used by other Pydantic components. In V3, it will most likely become the default arguments schema for the `'call'` schema. Args: arguments: The arguments to use for the arguments schema. validate_by_name: Whether to populate by the parameter names, defaults to `False`. validate_by_alias: Whether to populate by the parameter aliases, defaults to `True`. extra_behavior: The extra behavior to use. ref: optional unique identifier of the schema, used to reference the schema in other places. metadata: Any other information you want to include with the schema, not used by pydantic-core. serialization: Custom serialization schema. 
""" return _dict_not_none( type='arguments-v3', arguments_schema=arguments, validate_by_name=validate_by_name, validate_by_alias=validate_by_alias, extra_behavior=extra_behavior, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## call_schema ```python call_schema( arguments: CoreSchema, function: Callable[..., Any], *, function_name: str | None = None, return_schema: CoreSchema | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> CallSchema ``` Returns a schema that matches an arguments schema, then calls a function, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param_a = core_schema.arguments_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) param_b = core_schema.arguments_parameter( name='b', schema=core_schema.bool_schema(), mode='positional_only' ) args_schema = core_schema.arguments_schema([param_a, param_b]) schema = core_schema.call_schema( arguments=args_schema, function=lambda a, b: a + str(not b), return_schema=core_schema.str_schema(), ) v = SchemaValidator(schema) assert v.validate_python((('hello', True))) == 'helloFalse' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `arguments` | `CoreSchema` | The arguments to use for the arguments schema | *required* | | `function` | `Callable[..., Any]` | The function to use for the call schema | *required* | | `function_name` | `str | None` | The function name to use for the call schema, if not provided function.__name__ is used | `None` | | `return_schema` | `CoreSchema | None` | The return schema to use for the call schema | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def call_schema( arguments: CoreSchema, function: Callable[..., Any], *, function_name: str | None = None, return_schema: CoreSchema | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> CallSchema: """ Returns a schema that matches an arguments schema, then calls a function, e.g.: ```py from pydantic_core import SchemaValidator, core_schema param_a = core_schema.arguments_parameter( name='a', schema=core_schema.str_schema(), mode='positional_only' ) param_b = core_schema.arguments_parameter( name='b', schema=core_schema.bool_schema(), mode='positional_only' ) args_schema = core_schema.arguments_schema([param_a, param_b]) schema = core_schema.call_schema( arguments=args_schema, function=lambda a, b: a + str(not b), return_schema=core_schema.str_schema(), ) v = SchemaValidator(schema) assert v.validate_python((('hello', True))) == 'helloFalse' ``` Args: arguments: The arguments to use for the arguments schema function: The function to use for the call schema function_name: The function name to use for the call schema, if not provided `function.__name__` is used return_schema: The return schema to use for the call schema ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema 
""" return _dict_not_none( type='call', arguments_schema=arguments, function=function, function_name=function_name, return_schema=return_schema, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## custom_error_schema ```python custom_error_schema( schema: CoreSchema, custom_error_type: str, *, custom_error_message: str | None = None, custom_error_context: dict[str, Any] | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> CustomErrorSchema ``` Returns a schema that matches a custom error value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.custom_error_schema( schema=core_schema.int_schema(), custom_error_type='MyError', custom_error_message='Error msg', ) v = SchemaValidator(schema) v.validate_python(1) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema` | The schema to use for the custom error schema | *required* | | `custom_error_type` | `str` | The custom error type to use for the custom error schema | *required* | | `custom_error_message` | `str | None` | The custom error message to use for the custom error schema | `None` | | `custom_error_context` | `dict[str, Any] | None` | The custom error context to use for the custom error schema | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def custom_error_schema( schema: CoreSchema, custom_error_type: str, *, custom_error_message: str | None = None, custom_error_context: dict[str, Any] | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> CustomErrorSchema: """ Returns a schema that matches a custom error value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.custom_error_schema( schema=core_schema.int_schema(), custom_error_type='MyError', custom_error_message='Error msg', ) v = SchemaValidator(schema) v.validate_python(1) ``` Args: schema: The schema to use for the custom error schema custom_error_type: The custom error type to use for the custom error schema custom_error_message: The custom error message to use for the custom error schema custom_error_context: The custom error context to use for the custom error schema ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='custom-error', schema=schema, custom_error_type=custom_error_type, custom_error_message=custom_error_message, custom_error_context=custom_error_context, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## json_schema ```python json_schema( schema: CoreSchema | None = None, *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> JsonSchema ``` Returns a schema that matches a JSON value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema dict_schema = core_schema.model_fields_schema( { 'field_a': 
core_schema.model_field(core_schema.str_schema()), 'field_b': core_schema.model_field(core_schema.bool_schema()), }, ) class MyModel: __slots__ = ( '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__', ) field_a: str field_b: bool json_schema = core_schema.json_schema(schema=dict_schema) schema = core_schema.model_schema(cls=MyModel, schema=json_schema) v = SchemaValidator(schema) m = v.validate_python('{"field_a": "hello", "field_b": true}') assert isinstance(m, MyModel) ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema | None` | The schema to use for the JSON schema | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def json_schema( schema: CoreSchema | None = None, *, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> JsonSchema: """ Returns a schema that matches a JSON value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema dict_schema = core_schema.model_fields_schema( { 'field_a': core_schema.model_field(core_schema.str_schema()), 'field_b': core_schema.model_field(core_schema.bool_schema()), }, ) class MyModel: __slots__ = ( '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__', ) field_a: str field_b: bool json_schema = core_schema.json_schema(schema=dict_schema) schema = core_schema.model_schema(cls=MyModel, schema=json_schema) v = SchemaValidator(schema) m = v.validate_python('{"field_a": "hello", "field_b": true}') assert isinstance(m, MyModel) ``` Args: schema: The schema to use for the JSON schema ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none(type='json', schema=schema, ref=ref, metadata=metadata, serialization=serialization) ```` ## url_schema ```python url_schema( *, max_length: int | None = None, allowed_schemes: list[str] | None = None, host_required: bool | None = None, default_host: str | None = None, default_port: int | None = None, default_path: str | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> UrlSchema ``` Returns a schema that matches a URL value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.url_schema() v = SchemaValidator(schema) print(v.validate_python('https://example.com')) #> https://example.com/ ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `max_length` | `int | None` | The maximum length of the URL | `None` | | `allowed_schemes` | `list[str] | None` | The allowed URL schemes | `None` | | `host_required` | `bool | None` | Whether the URL must have a host | `None` | | `default_host` | `str | None` | The default host to use if the URL does not have a host | `None` | | `default_port` | `int | None` | The default port to use if the URL does not have a port | `None` | | `default_path` | `str | None` | 
The default path to use if the URL does not have a path | `None` | | `strict` | `bool | None` | Whether to use strict URL parsing | `None` | | `ref` | `str | None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def url_schema( *, max_length: int | None = None, allowed_schemes: list[str] | None = None, host_required: bool | None = None, default_host: str | None = None, default_port: int | None = None, default_path: str | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> UrlSchema: """ Returns a schema that matches a URL value, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.url_schema() v = SchemaValidator(schema) print(v.validate_python('https://example.com')) #> https://example.com/ ``` Args: max_length: The maximum length of the URL allowed_schemes: The allowed URL schemes host_required: Whether the URL must have a host default_host: The default host to use if the URL does not have a host default_port: The default port to use if the URL does not have a port default_path: The default path to use if the URL does not have a path strict: Whether to use strict URL parsing ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='url', max_length=max_length, allowed_schemes=allowed_schemes, host_required=host_required, default_host=default_host, default_port=default_port, default_path=default_path, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## multi_host_url_schema ```python multi_host_url_schema( *, max_length: int | None = None, allowed_schemes: list[str] | None = None, host_required: bool | None = None, default_host: str | None = None, default_port: int | None = None, default_path: str | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None ) -> MultiHostUrlSchema ``` Returns a schema that matches a URL value with possibly multiple hosts, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.multi_host_url_schema() v = SchemaValidator(schema) print(v.validate_python('redis://localhost,0.0.0.0,127.0.0.1')) #> redis://localhost,0.0.0.0,127.0.0.1 ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `max_length` | `int | None` | The maximum length of the URL | `None` | | `allowed_schemes` | `list[str] | None` | The allowed URL schemes | `None` | | `host_required` | `bool | None` | Whether the URL must have a host | `None` | | `default_host` | `str | None` | The default host to use if the URL does not have a host | `None` | | `default_port` | `int | None` | The default port to use if the URL does not have a port | `None` | | `default_path` | `str | None` | The default path to use if the URL does not have a path | `None` | | `strict` | `bool | None` | Whether to use strict URL parsing | `None` | | `ref` | `str | 
None` | optional unique identifier of the schema, used to reference the schema in other places | `None` | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def multi_host_url_schema( *, max_length: int | None = None, allowed_schemes: list[str] | None = None, host_required: bool | None = None, default_host: str | None = None, default_port: int | None = None, default_path: str | None = None, strict: bool | None = None, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> MultiHostUrlSchema: """ Returns a schema that matches a URL value with possibly multiple hosts, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.multi_host_url_schema() v = SchemaValidator(schema) print(v.validate_python('redis://localhost,0.0.0.0,127.0.0.1')) #> redis://localhost,0.0.0.0,127.0.0.1 ``` Args: max_length: The maximum length of the URL allowed_schemes: The allowed URL schemes host_required: Whether the URL must have a host default_host: The default host to use if the URL does not have a host default_port: The default port to use if the URL does not have a port default_path: The default path to use if the URL does not have a path strict: Whether to use strict URL parsing ref: optional unique identifier of the schema, used to reference the schema in other places metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='multi-host-url', max_length=max_length, allowed_schemes=allowed_schemes, host_required=host_required, default_host=default_host, default_port=default_port, default_path=default_path, strict=strict, ref=ref, metadata=metadata, serialization=serialization, ) ```` ## definitions_schema ```python definitions_schema( schema: CoreSchema, definitions: list[CoreSchema] ) -> DefinitionsSchema ``` Build a schema that contains both an inner schema and a list of definitions which can be used within the inner schema. ```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.definitions_schema( core_schema.list_schema(core_schema.definition_reference_schema('foobar')), [core_schema.int_schema(ref='foobar')], ) v = SchemaValidator(schema) assert v.validate_python([1, 2, '3']) == [1, 2, 3] ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema` | `CoreSchema` | The inner schema | *required* | | `definitions` | `list[CoreSchema]` | List of definitions which can be referenced within inner schema | *required* | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def definitions_schema(schema: CoreSchema, definitions: list[CoreSchema]) -> DefinitionsSchema: """ Build a schema that contains both an inner schema and a list of definitions which can be used within the inner schema. 
```py from pydantic_core import SchemaValidator, core_schema schema = core_schema.definitions_schema( core_schema.list_schema(core_schema.definition_reference_schema('foobar')), [core_schema.int_schema(ref='foobar')], ) v = SchemaValidator(schema) assert v.validate_python([1, 2, '3']) == [1, 2, 3] ``` Args: schema: The inner schema definitions: List of definitions which can be referenced within inner schema """ return DefinitionsSchema(type='definitions', schema=schema, definitions=definitions) ```` ## definition_reference_schema ```python definition_reference_schema( schema_ref: str, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> DefinitionReferenceSchema ``` Returns a schema that points to a schema stored in "definitions"; this is useful for nested recursive models and also when you want to define validators separately from the main schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema_definition = core_schema.definition_reference_schema('list-schema') schema = core_schema.definitions_schema( schema=schema_definition, definitions=[ core_schema.list_schema(items_schema=schema_definition, ref='list-schema'), ], ) v = SchemaValidator(schema) assert v.validate_python([()]) == [[]] ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `schema_ref` | `str` | The schema ref to use for the definition reference schema | *required* | | `metadata` | `dict[str, Any] | None` | Any other information you want to include with the schema, not used by pydantic-core | `None` | | `serialization` | `SerSchema | None` | Custom serialization schema | `None` | Source code in `.venv/lib/python3.12/site-packages/pydantic_core/core_schema.py` ````python def definition_reference_schema( schema_ref: str, ref: str | None = None, metadata: dict[str, Any] | None = None, serialization: SerSchema | None = None, ) -> DefinitionReferenceSchema: """ Returns a schema that points to a schema stored in "definitions", this is useful for nested recursive models and also when you want to define validators separately from the main schema, e.g.: ```py from pydantic_core import SchemaValidator, core_schema schema_definition = core_schema.definition_reference_schema('list-schema') schema = core_schema.definitions_schema( schema=schema_definition, definitions=[ core_schema.list_schema(items_schema=schema_definition, ref='list-schema'), ], ) v = SchemaValidator(schema) assert v.validate_python([()]) == [[]] ``` Args: schema_ref: The schema ref to use for the definition reference schema metadata: Any other information you want to include with the schema, not used by pydantic-core serialization: Custom serialization schema """ return _dict_not_none( type='definition-ref', schema_ref=schema_ref, ref=ref, metadata=metadata, serialization=serialization ) ```` Color definitions are used as per the CSS3 [CSS Color Module Level 3](http://www.w3.org/TR/css3-color/#svg-color) specification. A few colors have multiple names referring to the same colors, e.g. `grey` and `gray` or `aqua` and `cyan`. In these cases the *last* color when sorted alphabetically takes precedence, e.g. `Color((0, 255, 255)).as_named() == 'cyan'` because "cyan" comes after "aqua". ## RGBA ```python RGBA(r: float, g: float, b: float, alpha: float | None) ``` Internal use only as a representation of a color. 
Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def __init__(self, r: float, g: float, b: float, alpha: float | None): self.r = r self.g = g self.b = b self.alpha = alpha self._tuple: tuple[float, float, float, float | None] = (r, g, b, alpha) ``` ## Color ```python Color(value: ColorType) ``` Bases: `Representation` Represents a color. Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def __init__(self, value: ColorType) -> None: self._rgba: RGBA self._original: ColorType if isinstance(value, (tuple, list)): self._rgba = parse_tuple(value) elif isinstance(value, str): self._rgba = parse_str(value) elif isinstance(value, Color): self._rgba = value._rgba value = value._original else: raise PydanticCustomError( 'color_error', 'value is not a valid color: value must be a tuple, list or string', ) # if we've got here value must be a valid color self._original = value ``` ### original ```python original() -> ColorType ``` Original value passed to `Color`. Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def original(self) -> ColorType: """ Original value passed to `Color`. """ return self._original ``` ### as_named ```python as_named(*, fallback: bool = False) -> str ``` Returns the name of the color if it can be found in the `COLORS_BY_VALUE` dictionary, otherwise returns the hexadecimal representation of the color or raises `ValueError`. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `fallback` | `bool` | If True, falls back to returning the hexadecimal representation of the color instead of raising a ValueError when no named color is found. | `False` | Returns: | Type | Description | | --- | --- | | `str` | The name of the color, or the hexadecimal representation of the color. | Raises: | Type | Description | | --- | --- | | `ValueError` | When no named color is found and fallback is False. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def as_named(self, *, fallback: bool = False) -> str: """ Returns the name of the color if it can be found in `COLORS_BY_VALUE` dictionary, otherwise returns the hexadecimal representation of the color or raises `ValueError`. Args: fallback: If True, falls back to returning the hexadecimal representation of the color instead of raising a ValueError when no named color is found. Returns: The name of the color, or the hexadecimal representation of the color. Raises: ValueError: When no named color is found and fallback is `False`. """ if self._rgba.alpha is not None: return self.as_hex() rgb = cast(Tuple[int, int, int], self.as_rgb_tuple()) if rgb in COLORS_BY_VALUE: return COLORS_BY_VALUE[rgb] else: if fallback: return self.as_hex() else: raise ValueError('no named color found, use fallback=True, as_hex() or as_rgb()') ``` ### as_hex ```python as_hex(format: Literal['short', 'long'] = 'short') -> str ``` Returns the hexadecimal representation of the color. The hex string representing the color can be 3, 4, 6, or 8 characters depending on whether a "short" representation of the color is possible and whether there's an alpha channel. Returns: | Type | Description | | --- | --- | | `str` | The hexadecimal representation of the color. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def as_hex(self, format: Literal['short', 'long'] = 'short') -> str: """Returns the hexadecimal representation of the color. 
Hex string representing the color can be 3, 4, 6, or 8 characters depending on whether a "short" representation of the color is possible and whether there's an alpha channel. Returns: The hexadecimal representation of the color. """ values = [float_to_255(c) for c in self._rgba[:3]] if self._rgba.alpha is not None: values.append(float_to_255(self._rgba.alpha)) as_hex = ''.join(f'{v:02x}' for v in values) if format == 'short' and all(c in repeat_colors for c in values): as_hex = ''.join(as_hex[c] for c in range(0, len(as_hex), 2)) return f'#{as_hex}' ``` ### as_rgb ```python as_rgb() -> str ``` Color as an `rgb(<r>, <g>, <b>)` or `rgba(<r>, <g>, <b>, <a>)` string. Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def as_rgb(self) -> str: """ Color as an `rgb(<r>, <g>, <b>)` or `rgba(<r>, <g>, <b>, <a>)` string. """ if self._rgba.alpha is None: return f'rgb({float_to_255(self._rgba.r)}, {float_to_255(self._rgba.g)}, {float_to_255(self._rgba.b)})' else: return ( f'rgba({float_to_255(self._rgba.r)}, {float_to_255(self._rgba.g)}, {float_to_255(self._rgba.b)}, ' f'{round(self._alpha_float(), 2)})' ) ``` ### as_rgb_tuple ```python as_rgb_tuple(*, alpha: bool | None = None) -> ColorTuple ``` Returns the color as an RGB or RGBA tuple. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `alpha` | `bool | None` | Whether to include the alpha channel. There are three options for this input: None (default): Include alpha only if it's set. (e.g. not None) True: Always include alpha. False: Always omit alpha. | `None` | Returns: | Type | Description | | --- | --- | | `ColorTuple` | A tuple that contains the values of the red, green, and blue channels in the range 0 to 255. If alpha is included, it is in the range 0 to 1. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def as_rgb_tuple(self, *, alpha: bool | None = None) -> ColorTuple: """ Returns the color as an RGB or RGBA tuple. Args: alpha: Whether to include the alpha channel. There are three options for this input: - `None` (default): Include alpha only if it's set. (e.g. not `None`) - `True`: Always include alpha. - `False`: Always omit alpha. Returns: A tuple that contains the values of the red, green, and blue channels in the range 0 to 255. If alpha is included, it is in the range 0 to 1. """ r, g, b = (float_to_255(c) for c in self._rgba[:3]) if alpha is None and self._rgba.alpha is None or alpha is not None and not alpha: return r, g, b else: return r, g, b, self._alpha_float() ``` ### as_hsl ```python as_hsl() -> str ``` Color as an `hsl(<h>, <s>, <l>)` or `hsl(<h>, <s>, <l>, <a>)` string. Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def as_hsl(self) -> str: """ Color as an `hsl(<h>, <s>, <l>)` or `hsl(<h>, <s>, <l>, <a>)` string. """ if self._rgba.alpha is None: h, s, li = self.as_hsl_tuple(alpha=False)  # type: ignore return f'hsl({h * 360:0.0f}, {s:0.0%}, {li:0.0%})' else: h, s, li, a = self.as_hsl_tuple(alpha=True)  # type: ignore return f'hsl({h * 360:0.0f}, {s:0.0%}, {li:0.0%}, {round(a, 2)})' ``` ### as_hsl_tuple ```python as_hsl_tuple(*, alpha: bool | None = None) -> HslColorTuple ``` Returns the color as an HSL or HSLA tuple. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `alpha` | `bool | None` | Whether to include the alpha channel. None (default): Include the alpha channel only if it's set (e.g. not None). True: Always include alpha. False: Always omit alpha. 
| `None` | Returns: | Type | Description | | --- | --- | | `HslColorTuple` | The color as a tuple of hue, saturation, lightness, and alpha (if included). All elements are in the range 0 to 1. | Note This is HSL as used in HTML and most other places, not HLS as used in Python's `colorsys`. Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def as_hsl_tuple(self, *, alpha: bool | None = None) -> HslColorTuple: """ Returns the color as an HSL or HSLA tuple. Args: alpha: Whether to include the alpha channel. - `None` (default): Include the alpha channel only if it's set (e.g. not `None`). - `True`: Always include alpha. - `False`: Always omit alpha. Returns: The color as a tuple of hue, saturation, lightness, and alpha (if included). All elements are in the range 0 to 1. Note: This is HSL as used in HTML and most other places, not HLS as used in Python's `colorsys`. """ h, l, s = rgb_to_hls(self._rgba.r, self._rgba.g, self._rgba.b) if alpha is None: if self._rgba.alpha is None: return h, s, l else: return h, s, l, self._alpha_float() return (h, s, l, self._alpha_float()) if alpha else (h, s, l) ``` ## parse_tuple ```python parse_tuple(value: tuple[Any, ...]) -> RGBA ``` Parse a tuple or list to get RGBA values. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `tuple[Any, ...]` | A tuple or list. | *required* | Returns: | Type | Description | | --- | --- | | `RGBA` | An RGBA tuple parsed from the input tuple. | Raises: | Type | Description | | --- | --- | | `PydanticCustomError` | If tuple is not valid. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def parse_tuple(value: tuple[Any, ...]) -> RGBA: """Parse a tuple or list to get RGBA values. Args: value: A tuple or list. Returns: An `RGBA` tuple parsed from the input tuple. Raises: PydanticCustomError: If tuple is not valid. """ if len(value) == 3: r, g, b = (parse_color_value(v) for v in value) return RGBA(r, g, b, None) elif len(value) == 4: r, g, b = (parse_color_value(v) for v in value[:3]) return RGBA(r, g, b, parse_float_alpha(value[3])) else: raise PydanticCustomError('color_error', 'value is not a valid color: tuples must have length 3 or 4') ``` ## parse_str ```python parse_str(value: str) -> RGBA ``` Parse a string representing a color to an RGBA tuple. Possible formats for the input string include: - named color, see `COLORS_BY_NAME` - hex short, e.g. `fff` (prefix can be `#`, `0x` or nothing) - hex long, e.g. `ffffff` (prefix can be `#`, `0x` or nothing) - `rgb(<r>, <g>, <b>)` - `rgba(<r>, <g>, <b>, <a>)` - `transparent` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `str` | A string representing a color. | *required* | Returns: | Type | Description | | --- | --- | | `RGBA` | An RGBA tuple parsed from the input string. | Raises: | Type | Description | | --- | --- | | `ValueError` | If the input string cannot be parsed to an RGBA tuple. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def parse_str(value: str) -> RGBA: """ Parse a string representing a color to an RGBA tuple. Possible formats for the input string include: * named color, see `COLORS_BY_NAME` * hex short, e.g. `fff` (prefix can be `#`, `0x` or nothing) * hex long, e.g. `ffffff` (prefix can be `#`, `0x` or nothing) * `rgb(<r>, <g>, <b>)` * `rgba(<r>, <g>, <b>, <a>)` * `transparent` Args: value: A string representing a color. Returns: An `RGBA` tuple parsed from the input string. 
Raises: ValueError: If the input string cannot be parsed to an RGBA tuple. """ value_lower = value.lower() if value_lower in COLORS_BY_NAME: r, g, b = COLORS_BY_NAME[value_lower] return ints_to_rgba(r, g, b, None) m = re.fullmatch(r_hex_short, value_lower) if m: *rgb, a = m.groups() r, g, b = (int(v * 2, 16) for v in rgb) alpha = int(a * 2, 16) / 255 if a else None return ints_to_rgba(r, g, b, alpha) m = re.fullmatch(r_hex_long, value_lower) if m: *rgb, a = m.groups() r, g, b = (int(v, 16) for v in rgb) alpha = int(a, 16) / 255 if a else None return ints_to_rgba(r, g, b, alpha) m = re.fullmatch(r_rgb, value_lower) or re.fullmatch(r_rgb_v4_style, value_lower) if m: return ints_to_rgba(*m.groups()) # type: ignore m = re.fullmatch(r_hsl, value_lower) or re.fullmatch(r_hsl_v4_style, value_lower) if m: return parse_hsl(*m.groups()) # type: ignore if value_lower == 'transparent': return RGBA(0, 0, 0, 0) raise PydanticCustomError( 'color_error', 'value is not a valid color: string not recognised as a valid color', ) ``` ## ints_to_rgba ```python ints_to_rgba( r: int | str, g: int | str, b: int | str, alpha: float | None = None, ) -> RGBA ``` Converts integer or string values for RGB color and an optional alpha value to an `RGBA` object. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `r` | `int | str` | An integer or string representing the red color value. | *required* | | `g` | `int | str` | An integer or string representing the green color value. | *required* | | `b` | `int | str` | An integer or string representing the blue color value. | *required* | | `alpha` | `float | None` | A float representing the alpha value. Defaults to None. | `None` | Returns: | Type | Description | | --- | --- | | `RGBA` | An instance of the RGBA class with the corresponding color and alpha values. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def ints_to_rgba( r: int | str, g: int | str, b: int | str, alpha: float | None = None, ) -> RGBA: """ Converts integer or string values for RGB color and an optional alpha value to an `RGBA` object. Args: r: An integer or string representing the red color value. g: An integer or string representing the green color value. b: An integer or string representing the blue color value. alpha: A float representing the alpha value. Defaults to None. Returns: An instance of the `RGBA` class with the corresponding color and alpha values. """ return RGBA( parse_color_value(r), parse_color_value(g), parse_color_value(b), parse_float_alpha(alpha), ) ``` ## parse_color_value ```python parse_color_value( value: int | str, max_val: int = 255 ) -> float ``` Parse the color value provided and return a number between 0 and 1. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `int | str` | An integer or string color value. | *required* | | `max_val` | `int` | Maximum range value. Defaults to 255. | `255` | Raises: | Type | Description | | --- | --- | | `PydanticCustomError` | If the value is not a valid color. | Returns: | Type | Description | | --- | --- | | `float` | A number between 0 and 1. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def parse_color_value(value: int | str, max_val: int = 255) -> float: """ Parse the color value provided and return a number between 0 and 1. Args: value: An integer or string color value. max_val: Maximum range value. Defaults to 255. 
Raises: PydanticCustomError: If the value is not a valid color. Returns: A number between 0 and 1. """ try: color = float(value) except ValueError as e: raise PydanticCustomError( 'color_error', 'value is not a valid color: color values must be a valid number', ) from e if 0 <= color <= max_val: return color / max_val else: raise PydanticCustomError( 'color_error', 'value is not a valid color: color values must be in the range 0 to {max_val}', {'max_val': max_val}, ) ``` ## parse_float_alpha ```python parse_float_alpha( value: None | str | float | int, ) -> float | None ``` Parse an alpha value checking it's a valid float in the range 0 to 1. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `None | str | float | int` | The input value to parse. | *required* | Returns: | Type | Description | | --- | --- | | `float | None` | The parsed value as a float, or None if the value was None or equal to 1. | Raises: | Type | Description | | --- | --- | | `PydanticCustomError` | If the input value cannot be successfully parsed as a float in the expected range. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def parse_float_alpha(value: None | str | float | int) -> float | None: """ Parse an alpha value checking it's a valid float in the range 0 to 1. Args: value: The input value to parse. Returns: The parsed value as a float, or `None` if the value was None or equal to 1. Raises: PydanticCustomError: If the input value cannot be successfully parsed as a float in the expected range. """ if value is None: return None try: if isinstance(value, str) and value.endswith('%'): alpha = float(value[:-1]) / 100 else: alpha = float(value) except ValueError as e: raise PydanticCustomError( 'color_error', 'value is not a valid color: alpha values must be a valid float', ) from e if math.isclose(alpha, 1): return None elif 0 <= alpha <= 1: return alpha else: raise PydanticCustomError( 'color_error', 'value is not a valid color: alpha values must be in the range 0 to 1', ) ``` ## parse_hsl ```python parse_hsl( h: str, h_units: str, sat: str, light: str, alpha: float | None = None, ) -> RGBA ``` Parse raw hue, saturation, lightness, and alpha values and convert to RGBA. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `h` | `str` | The hue value. | *required* | | `h_units` | `str` | The unit for hue value. | *required* | | `sat` | `str` | The saturation value. | *required* | | `light` | `str` | The lightness value. | *required* | | `alpha` | `float | None` | Alpha value. | `None` | Returns: | Type | Description | | --- | --- | | `RGBA` | An instance of RGBA. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def parse_hsl(h: str, h_units: str, sat: str, light: str, alpha: float | None = None) -> RGBA: """ Parse raw hue, saturation, lightness, and alpha values and convert to RGBA. Args: h: The hue value. h_units: The unit for hue value. sat: The saturation value. light: The lightness value. alpha: Alpha value. Returns: An instance of `RGBA`. 
""" s_value, l_value = parse_color_value(sat, 100), parse_color_value(light, 100) h_value = float(h) if h_units in {None, 'deg'}: h_value = h_value % 360 / 360 elif h_units == 'rad': h_value = h_value % rads / rads else: # turns h_value %= 1 r, g, b = hls_to_rgb(h_value, l_value, s_value) return RGBA(r, g, b, parse_float_alpha(alpha)) ``` ## float_to_255 ```python float_to_255(c: float) -> int ``` Converts a float value between 0 and 1 (inclusive) to an integer between 0 and 255 (inclusive). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `c` | `float` | The float value to be converted. Must be between 0 and 1 (inclusive). | *required* | Returns: | Type | Description | | --- | --- | | `int` | The integer equivalent of the given float value rounded to the nearest whole number. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/color.py` ```python def float_to_255(c: float) -> int: """ Converts a float value between 0 and 1 (inclusive) to an integer between 0 and 255 (inclusive). Args: c: The float value to be converted. Must be between 0 and 1 (inclusive). Returns: The integer equivalent of the given float value rounded to the nearest whole number. """ return round(c * 255) ``` The `pydantic_extra_types.coordinate` module provides the Latitude, Longitude, and Coordinate data types. ## Latitude Bases: `float` Latitude value should be between -90 and 90, inclusive. ```py from pydantic import BaseModel from pydantic_extra_types.coordinate import Latitude class Location(BaseModel): latitude: Latitude location = Location(latitude=41.40338) print(location) #> latitude=41.40338 ``` ## Longitude Bases: `float` Longitude value should be between -180 and 180, inclusive. ```py from pydantic import BaseModel from pydantic_extra_types.coordinate import Longitude class Location(BaseModel): longitude: Longitude location = Location(longitude=2.17403) print(location) #> longitude=2.17403 ``` ## Coordinate ```python Coordinate(latitude: Latitude, longitude: Longitude) ``` Bases: `Representation` Coordinate parses Latitude and Longitude. You can use the `Coordinate` data type for storing coordinates. Coordinates can be defined using one of the following formats: 1. Tuple: `(Latitude, Longitude)`. For example: `(41.40338, 2.17403)`. 1. `Coordinate` instance: `Coordinate(latitude=Latitude, longitude=Longitude)`. ```py from pydantic import BaseModel from pydantic_extra_types.coordinate import Coordinate class Location(BaseModel): coordinate: Coordinate location = Location(coordinate=(41.40338, 2.17403)) #> coordinate=Coordinate(latitude=41.40338, longitude=2.17403) ``` Country definitions that are based on the [ISO 3166](https://en.wikipedia.org/wiki/List_of_ISO_3166_country_codes). ## CountryAlpha2 Bases: `str` CountryAlpha2 parses country codes in the [ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2) format. ```py from pydantic import BaseModel from pydantic_extra_types.country import CountryAlpha2 class Product(BaseModel): made_in: CountryAlpha2 product = Product(made_in='ES') print(product) #> made_in='ES' ``` ### alpha3 ```python alpha3: str ``` The country code in the [ISO 3166-1 alpha-3](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-3) format. ### numeric_code ```python numeric_code: str ``` The country code in the [ISO 3166-1 numeric](https://en.wikipedia.org/wiki/ISO_3166-1_numeric) format. ### short_name ```python short_name: str ``` The country short name. 
## CountryAlpha3 Bases: `str` CountryAlpha3 parses country codes in the [ISO 3166-1 alpha-3](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-3) format. ```py from pydantic import BaseModel from pydantic_extra_types.country import CountryAlpha3 class Product(BaseModel): made_in: CountryAlpha3 product = Product(made_in="USA") print(product) #> made_in='USA' ``` ### alpha2 ```python alpha2: str ``` The country code in the [ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2) format. ### numeric_code ```python numeric_code: str ``` The country code in the [ISO 3166-1 numeric](https://en.wikipedia.org/wiki/ISO_3166-1_numeric) format. ### short_name ```python short_name: str ``` The country short name. ## CountryNumericCode Bases: `str` CountryNumericCode parses country codes in the [ISO 3166-1 numeric](https://en.wikipedia.org/wiki/ISO_3166-1_numeric) format. ```py from pydantic import BaseModel from pydantic_extra_types.country import CountryNumericCode class Product(BaseModel): made_in: CountryNumericCode product = Product(made_in="840") print(product) #> made_in='840' ``` ### alpha2 ```python alpha2: str ``` The country code in the [ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2) format. ### alpha3 ```python alpha3: str ``` The country code in the [ISO 3166-1 alpha-3](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-3) format. ### short_name ```python short_name: str ``` The country short name. ## CountryShortName Bases: `str` CountryShortName parses country codes in the short name format. ```py from pydantic import BaseModel from pydantic_extra_types.country import CountryShortName class Product(BaseModel): made_in: CountryShortName product = Product(made_in="United States") print(product) #> made_in='United States' ``` ### alpha2 ```python alpha2: str ``` The country code in the [ISO 3166-1 alpha-2](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-2) format. ### alpha3 ```python alpha3: str ``` The country code in the [ISO 3166-1 alpha-3](https://en.wikipedia.org/wiki/ISO_3166-1_alpha-3) format. ### numeric_code ```python numeric_code: str ``` The country code in the [ISO 3166-1 numeric](https://en.wikipedia.org/wiki/ISO_3166-1_numeric) format. Currency definitions that are based on the [ISO4217](https://en.wikipedia.org/wiki/ISO_4217). ## ISO4217 Bases: `str` ISO4217 parses currency codes in the [ISO 4217](https://en.wikipedia.org/wiki/ISO_4217) format. ```py from pydantic import BaseModel from pydantic_extra_types.currency_code import ISO4217 class Currency(BaseModel): alpha_3: ISO4217 currency = Currency(alpha_3='AED') print(currency) #> alpha_3='AED' ``` ## Currency Bases: `str` Currency parses a currency subset of the [ISO 4217](https://en.wikipedia.org/wiki/ISO_4217) format. It excludes bonds, testing codes, and precious metals. ```py from pydantic import BaseModel from pydantic_extra_types.currency_code import Currency class currency(BaseModel): alpha_3: Currency cur = currency(alpha_3='AED') print(cur) #> alpha_3='AED' ``` The `pydantic_extra_types.isbn` module provides functionality to receive and validate ISBNs. ISBN (International Standard Book Number) is a numeric commercial book identifier which is intended to be unique. This module provides an ISBN type for Pydantic models. ## ISBN Bases: `str` Represents an ISBN and provides methods for conversion, validation, and serialization. 
```py from pydantic import BaseModel from pydantic_extra_types.isbn import ISBN class Book(BaseModel): isbn: ISBN book = Book(isbn="8537809667") print(book) #> isbn='9788537809662' ``` ### validate_isbn_format ```python validate_isbn_format(value: str) -> None ``` Validate an ISBN format from the provided str value. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `str` | The str value representing the ISBN in 10 or 13 digits. | *required* | Raises: | Type | Description | | --- | --- | | `PydanticCustomError` | If the ISBN is not valid. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/isbn.py` ```python @staticmethod def validate_isbn_format(value: str) -> None: """Validate an ISBN format from the provided str value. Args: value: The str value representing the ISBN in 10 or 13 digits. Raises: PydanticCustomError: If the ISBN is not valid. """ isbn_length = len(value) if isbn_length not in (10, 13): raise PydanticCustomError('isbn_length', f'Length for ISBN must be 10 or 13 digits, not {isbn_length}') if isbn_length == 10: if not value[:-1].isdigit() or ((value[-1] != 'X') and (not value[-1].isdigit())): raise PydanticCustomError('isbn10_invalid_characters', 'First 9 digits of ISBN-10 must be integers') if isbn10_digit_calc(value) != value[-1]: raise PydanticCustomError('isbn_invalid_digit_check_isbn10', 'Provided digit is invalid for given ISBN') if isbn_length == 13: if not value.isdigit(): raise PydanticCustomError('isbn13_invalid_characters', 'All digits of ISBN-13 must be integers') if value[:3] not in ('978', '979'): raise PydanticCustomError( 'isbn_invalid_early_characters', 'The first 3 digits of ISBN-13 must be 978 or 979' ) if isbn13_digit_calc(value) != value[-1]: raise PydanticCustomError('isbn_invalid_digit_check_isbn13', 'Provided digit is invalid for given ISBN') ``` ### convert_isbn10_to_isbn13 ```python convert_isbn10_to_isbn13(value: str) -> str ``` Convert an ISBN-10 to ISBN-13. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `str` | The ISBN-10 value to be converted. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The converted ISBN or the original value if no conversion is necessary. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/isbn.py` ```python @staticmethod def convert_isbn10_to_isbn13(value: str) -> str: """Convert an ISBN-10 to ISBN-13. Args: value: The ISBN-10 value to be converted. Returns: The converted ISBN or the original value if no conversion is necessary. """ if len(value) == 10: base_isbn = f'978{value[:-1]}' isbn13_digit = isbn13_digit_calc(base_isbn) return ISBN(f'{base_isbn}{isbn13_digit}') return ISBN(value) ``` ## isbn10_digit_calc ```python isbn10_digit_calc(isbn: str) -> str ``` Calc an ISBN-10 last digit from the provided str value. More information about the validation algorithm is available on [Wikipedia](https://en.wikipedia.org/wiki/ISBN#Check_digits) Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `isbn` | `str` | The str value representing the ISBN in 10 digits. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The calculated last digit of the ISBN-10 value. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/isbn.py` ```python def isbn10_digit_calc(isbn: str) -> str: """Calc an ISBN-10 last digit from the provided str value. 
More information about the validation algorithm is available on [Wikipedia](https://en.wikipedia.org/wiki/ISBN#Check_digits) Args: isbn: The str value representing the ISBN in 10 digits. Returns: The calculated last digit of the ISBN-10 value. """ total = sum(int(digit) * (10 - idx) for idx, digit in enumerate(isbn[:9])) for check_digit in range(1, 11): if (total + check_digit) % 11 == 0: valid_check_digit = 'X' if check_digit == 10 else str(check_digit) return valid_check_digit ``` ## isbn13_digit_calc ```python isbn13_digit_calc(isbn: str) -> str ``` Calc an ISBN-13 last digit from the provided str value. More information about the validation algorithm is available on [Wikipedia](https://en.wikipedia.org/wiki/ISBN#Check_digits) Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `isbn` | `str` | The str value representing the ISBN in 13 digits. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The calculated last digit of the ISBN-13 value. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/isbn.py` ```python def isbn13_digit_calc(isbn: str) -> str: """Calc an ISBN-13 last digit from the provided str value. More information about the validation algorithm is available on [Wikipedia](https://en.wikipedia.org/wiki/ISBN#Check_digits) Args: isbn: The str value representing the ISBN in 13 digits. Returns: The calculated last digit of the ISBN-13 value. """ total = sum(int(digit) * (1 if idx % 2 == 0 else 3) for idx, digit in enumerate(isbn[:12])) check_digit = (10 - (total % 10)) % 10 return str(check_digit) ``` Language definitions that are based on the [ISO 639-3](https://en.wikipedia.org/wiki/ISO_639-3) & [ISO 639-5](https://en.wikipedia.org/wiki/ISO_639-5). ## LanguageInfo ```python LanguageInfo( alpha2: Union[str, None], alpha3: str, name: str ) ``` LanguageInfo is a dataclass that contains the language information. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `alpha2` | `Union[str, None]` | The language code in the ISO 639-1 alpha-2 format. | *required* | | `alpha3` | `str` | The language code in the ISO 639-3 alpha-3 format. | *required* | | `name` | `str` | The language name. | *required* | ## LanguageAlpha2 Bases: `str` LanguageAlpha2 parses language codes in the [ISO 639-1 alpha-2](https://en.wikipedia.org/wiki/ISO_639-1) format. ```py from pydantic import BaseModel from pydantic_extra_types.language_code import LanguageAlpha2 class Movie(BaseModel): audio_lang: LanguageAlpha2 subtitles_lang: LanguageAlpha2 movie = Movie(audio_lang='de', subtitles_lang='fr') print(movie) #> audio_lang='de' subtitles_lang='fr' ``` ### alpha3 ```python alpha3: str ``` The language code in the [ISO 639-3 alpha-3](https://en.wikipedia.org/wiki/ISO_639-3) format. ### name ```python name: str ``` The language name. ## LanguageName Bases: `str` LanguageName parses language names listed in the [ISO 639-3 standard](https://en.wikipedia.org/wiki/ISO_639-3). ```py from pydantic import BaseModel from pydantic_extra_types.language_code import LanguageName class Movie(BaseModel): audio_lang: LanguageName subtitles_lang: LanguageName movie = Movie(audio_lang='Dutch', subtitles_lang='Mandarin Chinese') print(movie) #> audio_lang='Dutch' subtitles_lang='Mandarin Chinese' ``` ### alpha2 ```python alpha2: Union[str, None] ``` The language code in the [ISO 639-1 alpha-2](https://en.wikipedia.org/wiki/ISO_639-1) format. Does not exist for all languages. 
### alpha3 ```python alpha3: str ``` The language code in the [ISO 639-3 alpha-3](https://en.wikipedia.org/wiki/ISO_639-3) format. ## ISO639_3 Bases: `str` ISO639_3 parses language codes in the [ISO 639-3 alpha-3](https://en.wikipedia.org/wiki/ISO_639-3_alpha-3) format. ```py from pydantic import BaseModel from pydantic_extra_types.language_code import ISO639_3 class Language(BaseModel): alpha_3: ISO639_3 lang = Language(alpha_3='ssr') print(lang) #> alpha_3='ssr' ``` ## ISO639_5 Bases: `str` ISO639_5 parses language codes in the [ISO 639-5 alpha-3](https://en.wikipedia.org/wiki/ISO_639-5_alpha-3) format. ```py from pydantic import BaseModel from pydantic_extra_types.language_code import ISO639_5 class Language(BaseModel): alpha_3: ISO639_5 lang = Language(alpha_3='gem') print(lang) #> alpha_3='gem' ``` The MAC address module provides functionality to parse and validate MAC addresses in different formats, such as IEEE 802 MAC-48, EUI-48, EUI-64, or a 20-octet format. ## MacAddress Bases: `str` Represents a MAC address and provides methods for conversion, validation, and serialization. ```py from pydantic import BaseModel from pydantic_extra_types.mac_address import MacAddress class Network(BaseModel): mac_address: MacAddress network = Network(mac_address="00:00:5e:00:53:01") print(network) #> mac_address='00:00:5e:00:53:01' ``` ### validate_mac_address ```python validate_mac_address(value: bytes) -> str ``` Validate a MAC Address from the provided byte value. Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/mac_address.py` ```python @staticmethod def validate_mac_address(value: bytes) -> str: """ Validate a MAC Address from the provided byte value. """ if len(value) < 14: raise PydanticCustomError( 'mac_address_len', 'Length for a {mac_address} MAC address must be {required_length}', {'mac_address': value.decode(), 'required_length': 14}, ) if value[2] in [ord(':'), ord('-')]: if (len(value) + 1) % 3 != 0: raise PydanticCustomError( 'mac_address_format', 'Must have the format xx:xx:xx:xx:xx:xx or xx-xx-xx-xx-xx-xx' ) n = (len(value) + 1) // 3 if n not in (6, 8, 20): raise PydanticCustomError( 'mac_address_format', 'Length for a {mac_address} MAC address must be {required_length}', {'mac_address': value.decode(), 'required_length': (6, 8, 20)}, ) mac_address = bytearray(n) x = 0 for i in range(n): try: byte_value = int(value[x : x + 2], 16) mac_address[i] = byte_value x += 3 except ValueError as e: raise PydanticCustomError('mac_address_format', 'Unrecognized format') from e elif value[4] == ord('.'): if (len(value) + 1) % 5 != 0: raise PydanticCustomError('mac_address_format', 'Must have the format xx.xx.xx.xx.xx.xx') n = 2 * (len(value) + 1) // 5 if n not in (6, 8, 20): raise PydanticCustomError( 'mac_address_format', 'Length for a {mac_address} MAC address must be {required_length}', {'mac_address': value.decode(), 'required_length': (6, 8, 20)}, ) mac_address = bytearray(n) x = 0 for i in range(0, n, 2): try: byte_value = int(value[x : x + 2], 16) mac_address[i] = byte_value byte_value = int(value[x + 2 : x + 4], 16) mac_address[i + 1] = byte_value x += 5 except ValueError as e: raise PydanticCustomError('mac_address_format', 'Unrecognized format') from e else: raise PydanticCustomError('mac_address_format', 'Unrecognized format') return ':'.join(f'{b:02x}' for b in mac_address) ``` The `pydantic_extra_types.payment` module provides the PaymentCardNumber data type. ## PaymentCardBrand Bases: `str`, `Enum` Payment card brands supported by the PaymentCardNumber. 
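The `PaymentCardNumber` reference below carries no usage snippet, so here is a minimal, hedged sketch of how the detected brand surfaces after validation. The number `'4242424242424242'` is the widely published Luhn-valid Visa test card (assumed here; it matches the Visa length rules in `validate_brand` below):

```py
from pydantic import BaseModel
from pydantic_extra_types.payment import PaymentCardBrand, PaymentCardNumber

class Card(BaseModel):
    number: PaymentCardNumber

# assumed Luhn-valid Visa test number; validation also populates bin, last4 and brand
card = Card(number='4242424242424242')
assert card.number.brand == PaymentCardBrand.visa
print(card.number.last4)
#> 4242
```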
## PaymentCardNumber ```python PaymentCardNumber(card_number: str) ``` Bases: `str` A [payment card number](https://en.wikipedia.org/wiki/Payment_card_number). Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/payment.py` ```python def __init__(self, card_number: str): self.validate_digits(card_number) card_number = self.validate_luhn_check_digit(card_number) self.bin = card_number[:6] self.last4 = card_number[-4:] self.brand = self.validate_brand(card_number) ``` ### strip_whitespace ```python strip_whitespace: bool = True ``` Whether to strip whitespace from the input value. ### min_length ```python min_length: int = 12 ``` The minimum length of the card number. ### max_length ```python max_length: int = 19 ``` The maximum length of the card number. ### bin ```python bin: str = card_number[:6] ``` The first 6 digits of the card number. ### last4 ```python last4: str = card_number[-4:] ``` The last 4 digits of the card number. ### brand ```python brand: PaymentCardBrand = validate_brand(card_number) ``` The brand of the card. ### masked ```python masked: str ``` The masked card number. ### validate ```python validate( __input_value: str, _: ValidationInfo ) -> PaymentCardNumber ``` Validate the `PaymentCardNumber` instance. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `__input_value` | `str` | The input value to validate. | *required* | | `_` | `ValidationInfo` | The validation info. | *required* | Returns: | Type | Description | | --- | --- | | `PaymentCardNumber` | The validated PaymentCardNumber instance. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/payment.py` ```python @classmethod def validate(cls, __input_value: str, _: core_schema.ValidationInfo) -> PaymentCardNumber: """Validate the `PaymentCardNumber` instance. Args: __input_value: The input value to validate. _: The validation info. Returns: The validated `PaymentCardNumber` instance. """ return cls(__input_value) ``` ### validate_digits ```python validate_digits(card_number: str) -> None ``` Validate that the card number is all digits. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `card_number` | `str` | The card number to validate. | *required* | Raises: | Type | Description | | --- | --- | | `PydanticCustomError` | If the card number is not all digits. | Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/payment.py` ```python @classmethod def validate_digits(cls, card_number: str) -> None: """Validate that the card number is all digits. Args: card_number: The card number to validate. Raises: PydanticCustomError: If the card number is not all digits. """ if not card_number or not all('0' <= c <= '9' for c in card_number): raise PydanticCustomError('payment_card_number_digits', 'Card number is not all digits') ``` ### validate_luhn_check_digit ```python validate_luhn_check_digit(card_number: str) -> str ``` Validate the payment card number. Based on the [Luhn algorithm](https://en.wikipedia.org/wiki/Luhn_algorithm). Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `card_number` | `str` | The card number to validate. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The validated card number. | Raises: | Type | Description | | --- | --- | | `PydanticCustomError` | If the card number is not valid. 
|

Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/payment.py`

```python
@classmethod
def validate_luhn_check_digit(cls, card_number: str) -> str:
    """Validate the payment card number.

    Based on the [Luhn algorithm](https://en.wikipedia.org/wiki/Luhn_algorithm).

    Args:
        card_number: The card number to validate.

    Returns:
        The validated card number.

    Raises:
        PydanticCustomError: If the card number is not valid.
    """
    sum_ = int(card_number[-1])
    length = len(card_number)
    parity = length % 2
    for i in range(length - 1):
        digit = int(card_number[i])
        if i % 2 == parity:
            digit *= 2
        if digit > 9:
            digit -= 9
        sum_ += digit
    valid = sum_ % 10 == 0

    if not valid:
        raise PydanticCustomError('payment_card_number_luhn', 'Card number is not luhn valid')
    return card_number
```

### validate_brand

```python
validate_brand(card_number: str) -> PaymentCardBrand
```

Validate length based on [BIN](https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number_(IIN)) for major brands.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `card_number` | `str` | The card number to validate. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `PaymentCardBrand` | The validated card brand. |

Raises:

| Type | Description |
| --- | --- |
| `PydanticCustomError` | If the card number is not valid. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/payment.py`

```python
@staticmethod
def validate_brand(card_number: str) -> PaymentCardBrand:
    """Validate length based on [BIN](https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number_(IIN))
    for major brands.

    Args:
        card_number: The card number to validate.

    Returns:
        The validated card brand.

    Raises:
        PydanticCustomError: If the card number is not valid.
    """
    brand = PaymentCardBrand.other
    if card_number[0] == '4':
        brand = PaymentCardBrand.visa
        required_length = [13, 16, 19]
    elif 51 <= int(card_number[:2]) <= 55:
        brand = PaymentCardBrand.mastercard
        required_length = [16]
    elif card_number[:2] in {'34', '37'}:
        brand = PaymentCardBrand.amex
        required_length = [15]
    elif 2200 <= int(card_number[:4]) <= 2204:
        brand = PaymentCardBrand.mir
        required_length = list(range(16, 20))
    elif card_number[:4] in {'5018', '5020', '5038', '5893', '6304', '6759', '6761', '6762', '6763'} or card_number[
        :6
    ] in (
        '676770',
        '676774',
    ):
        brand = PaymentCardBrand.maestro
        required_length = list(range(12, 20))
    elif card_number.startswith('65') or 644 <= int(card_number[:3]) <= 649 or card_number.startswith('6011'):
        brand = PaymentCardBrand.discover
        required_length = list(range(16, 20))
    elif (
        506099 <= int(card_number[:6]) <= 506198
        or 650002 <= int(card_number[:6]) <= 650027
        or 507865 <= int(card_number[:6]) <= 507964
    ):
        brand = PaymentCardBrand.verve
        required_length = [16, 18, 19]
    elif card_number[:4] in {'5019', '4571'}:
        brand = PaymentCardBrand.dankort
        required_length = [16]
    elif card_number.startswith('9792'):
        brand = PaymentCardBrand.troy
        required_length = [16]
    elif card_number[:2] in {'62', '81'}:
        brand = PaymentCardBrand.unionpay
        required_length = [16, 19]
    elif 3528 <= int(card_number[:4]) <= 3589:
        brand = PaymentCardBrand.jcb
        required_length = [16, 19]

    valid = len(card_number) in required_length if brand != PaymentCardBrand.other else True

    if not valid:
        raise PydanticCustomError(
            'payment_card_number_brand',
            f'Length for a {brand} card must be {" or ".join(map(str, required_length))}',
            {'brand': brand, 'required_length': required_length},
        )
    return brand
```
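Putting the digit, Luhn, and brand checks together, here is a minimal usage sketch (the `Card` model and the Luhn-valid test number `4000000000000002` are illustrative, not part of the library):

```py
from pydantic import BaseModel

from pydantic_extra_types.payment import PaymentCardBrand, PaymentCardNumber

class Card(BaseModel):
    number: PaymentCardNumber

# '4000000000000002' is all digits, passes the Luhn check, and has a
# Visa prefix ('4') with one of the allowed Visa lengths (16).
card = Card(number='4000000000000002')
print(card.number.brand == PaymentCardBrand.visa)
#> True
print(card.number.bin, card.number.last4)
#> 400000 0002
```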
Native Pendulum DateTime object implementation. This is a copy of the Pendulum DateTime object, but with a Pydantic CoreSchema implementation. This allows Pydantic to validate the DateTime object.

## DateTime

Bases: `DateTime`

A `pendulum.DateTime` object. At runtime, this type decomposes into pendulum.DateTime automatically. This type exists because Pydantic throws a fit on unknown types.

```python
from pydantic import BaseModel

from pydantic_extra_types.pendulum_dt import DateTime

class test_model(BaseModel):
    dt: DateTime

print(test_model(dt='2021-01-01T00:00:00+00:00'))
#> test_model(dt=DateTime(2021, 1, 1, 0, 0, 0, tzinfo=FixedTimezone(0, name="+00:00")))
```

## Date

Bases: `Date`

A `pendulum.Date` object. At runtime, this type decomposes into pendulum.Date automatically. This type exists because Pydantic throws a fit on unknown types.

```python
from pydantic import BaseModel

from pydantic_extra_types.pendulum_dt import Date

class test_model(BaseModel):
    dt: Date

print(test_model(dt='2021-01-01'))
#> test_model(dt=Date(2021, 1, 1))
```

## Duration

Bases: `Duration`

A `pendulum.Duration` object. At runtime, this type decomposes into pendulum.Duration automatically. This type exists because Pydantic throws a fit on unknown types.

```python
from pydantic import BaseModel

from pydantic_extra_types.pendulum_dt import Duration

class test_model(BaseModel):
    delta_t: Duration

print(test_model(delta_t='P1DT25H'))
#> test_model(delta_t=Duration(days=2, hours=1))
```

The `pydantic_extra_types.phone_numbers` module provides the PhoneNumber data type. This class depends on the [phonenumbers](https://pypi.org/project/phonenumbers/) package, which is a Python port of Google's [libphonenumber](https://github.com/google/libphonenumber/).

## PhoneNumber

Bases: `str`

A wrapper around the [phonenumbers](https://pypi.org/project/phonenumbers/) package, which is a Python port of Google's [libphonenumber](https://github.com/google/libphonenumber/).

### supported_regions

```python
supported_regions: list[str] = []
```

The supported regions. If empty, all regions are supported.

### default_region_code

```python
default_region_code: str | None = None
```

The default region code to use when parsing phone numbers without an international prefix.

### phone_format

```python
phone_format: str = 'RFC3966'
```

The format of the phone number.

## PhoneNumberValidator

```python
PhoneNumberValidator(
    default_region: Optional[str] = None,
    number_format: str = "RFC3966",
    supported_regions: Optional[Sequence[str]] = None,
)
```

A pydantic before validator for phone numbers using the [phonenumbers](https://pypi.org/project/phonenumbers/) package, a Python port of Google's [libphonenumber](https://github.com/google/libphonenumber/).

Intended to be used to create custom pydantic data types using the `typing.Annotated` type construct.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `default_region` | `str | None` | The default region code to use when parsing phone numbers without an international prefix. If None (default), the region must be supplied in the phone number as an international prefix. | `None` |
| `number_format` | `str` | The format of the phone number to return. See phonenumbers.PhoneNumberFormat for valid values. | `'RFC3966'` |
| `supported_regions` | `list[str]` | The supported regions. If empty, all regions are supported (default). | `None` |

Returns:

str: The formatted phone number.

Example:

```py
MyNumberType = Annotated[
    Union[str, phonenumbers.PhoneNumber], PhoneNumberValidator()
]
USNumberType = Annotated[
    Union[str, phonenumbers.PhoneNumber],
    PhoneNumberValidator(supported_regions=['US'], default_region='US'),
]

class SomeModel(BaseModel):
    phone_number: MyNumberType
    us_number: USNumberType
```
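The example above is a fragment; a runnable variant might look like the following sketch (the `Contact` model and the Google switchboard number are illustrative, and the printed value assumes the default `RFC3966` output format):

```py
from typing import Annotated, Union

import phonenumbers
from pydantic import BaseModel

from pydantic_extra_types.phone_numbers import PhoneNumberValidator

USNumberType = Annotated[
    Union[str, phonenumbers.PhoneNumber],
    PhoneNumberValidator(supported_regions=['US'], default_region='US'),
]

class Contact(BaseModel):
    us_number: USNumberType

# The default region lets a national-format number be parsed
# without an international prefix.
contact = Contact(us_number='(650) 253-0000')
print(contact.us_number)
#> tel:+1-650-253-0000
```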
The `pydantic_extra_types.routing_number` module provides the ABARoutingNumber data type.

## ABARoutingNumber

```python
ABARoutingNumber(routing_number: str)
```

Bases: `str`

The `ABARoutingNumber` data type is a string of 9 digits representing an ABA routing transit number. The algorithm used to validate the routing number is described in the [ABA routing transit number](https://en.wikipedia.org/wiki/ABA_routing_transit_number#Check_digit) Wikipedia article.

```py
from pydantic import BaseModel

from pydantic_extra_types.routing_number import ABARoutingNumber

class BankAccount(BaseModel):
    routing_number: ABARoutingNumber

account = BankAccount(routing_number='122105155')
print(account)
#> routing_number='122105155'
```

Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/routing_number.py`

```python
def __init__(self, routing_number: str):
    self._validate_digits(routing_number)
    self._routing_number = self._validate_routing_number(routing_number)
```

Script definitions based on [ISO 15924](https://en.wikipedia.org/wiki/ISO_15924).

## ISO_15924

Bases: `str`

ISO_15924 parses script codes in the [ISO 15924](https://en.wikipedia.org/wiki/ISO_15924) format.

```py
from pydantic import BaseModel

from pydantic_extra_types.language_code import ISO_15924

class Script(BaseModel):
    alpha_4: ISO_15924

script = Script(alpha_4='Java')
print(script)
#> alpha_4='Java'
```

SemanticVersion definition based on the Semantic Versioning specification, [semver](https://semver.org/).

## SemanticVersion

Semantic version based on the approach described in the official [python-semver documentation](https://python-semver.readthedocs.io/en/latest/advanced/combine-pydantic-and-semver.html).

Time zone name validation and serialization module.

## TimeZoneName

Bases: `str`

TimeZoneName is a custom string subclass for validating and serializing timezone names.

The TimeZoneName class uses the IANA Time Zone Database for validation. It supports both strict and non-strict modes for timezone name validation.

#### Examples:

Some examples of using the TimeZoneName class:

##### Normal usage:

```python
from pydantic_extra_types.timezone_name import TimeZoneName
from pydantic import BaseModel

class Location(BaseModel):
    city: str
    timezone: TimeZoneName

loc = Location(city="New York", timezone="America/New_York")
print(loc.timezone)
#> America/New_York
```

##### Non-strict mode:

```python
from pydantic_extra_types.timezone_name import TimeZoneName, timezone_name_settings

@timezone_name_settings(strict=False)
class TZNonStrict(TimeZoneName):
    pass

tz = TZNonStrict("america/new_york")
print(tz)
#> america/new_york
```

## get_timezones

```python
get_timezones() -> Set[str]
```

Determine the timezone provider and return available timezones.
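For example, a quick membership check (a small sketch; the printed result assumes an IANA timezone database is available, e.g. via `tzdata`):

```py
from pydantic_extra_types.timezone_name import get_timezones

# get_timezones() returns the full set of names from the active provider.
print('America/New_York' in get_timezones())
#> True
```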
Source code in `.venv/lib/python3.12/site-packages/pydantic_extra_types/timezone_name.py`

```python
def get_timezones() -> Set[str]:
    """Determine the timezone provider and return available timezones."""
    if _is_available('zoneinfo') and _is_available('tzdata'):  # pragma: no cover
        return _tz_provider_from_zone_info()
    elif _is_available('pytz'):  # pragma: no cover
        if sys.version_info[:2] > (3, 8):
            _warn_about_pytz_usage()
        return _tz_provider_from_pytz()
    else:  # pragma: no cover
        if sys.version_info[:2] == (3, 8):
            raise ImportError('No pytz module found. Please install it with "pip install pytz"')
        raise ImportError('No timezone provider found. Please install tzdata with "pip install tzdata"')
```

The `pydantic_extra_types.ULID` module provides the `ULID` data type. This class depends on the [python-ulid](https://pypi.org/project/python-ulid/) package, which validates against the [ULID spec](https://github.com/ulid/spec#implementations-in-other-languages).

## ULID

```python
ULID(ulid: ULID)
```

Bases: `Representation`

A wrapper around the [python-ulid](https://pypi.org/project/python-ulid/) package, which validates against the [ULID spec](https://github.com/ulid/spec#implementations-in-other-languages).

## BaseSettings

```python
BaseSettings(
    __pydantic_self__,
    _case_sensitive: bool | None = None,
    _nested_model_default_partial_update: (
        bool | None
    ) = None,
    _env_prefix: str | None = None,
    _env_file: DotenvType | None = ENV_FILE_SENTINEL,
    _env_file_encoding: str | None = None,
    _env_ignore_empty: bool | None = None,
    _env_nested_delimiter: str | None = None,
    _env_parse_none_str: str | None = None,
    _env_parse_enums: bool | None = None,
    _cli_prog_name: str | None = None,
    _cli_parse_args: (
        bool | list[str] | tuple[str, ...] | None
    ) = None,
    _cli_settings_source: (
        CliSettingsSource[Any] | None
    ) = None,
    _cli_parse_none_str: str | None = None,
    _cli_hide_none_type: bool | None = None,
    _cli_avoid_json: bool | None = None,
    _cli_enforce_required: bool | None = None,
    _cli_use_class_docs_for_groups: bool | None = None,
    _cli_exit_on_error: bool | None = None,
    _cli_prefix: str | None = None,
    _cli_flag_prefix_char: str | None = None,
    _cli_implicit_flags: bool | None = None,
    _cli_ignore_unknown_args: bool | None = None,
    _cli_kebab_case: bool | None = None,
    _secrets_dir: PathType | None = None,
    **values: Any
)
```

Bases: `BaseModel`

Base class for settings, allowing values to be overridden by environment variables.

This is useful in production for secrets you do not wish to save in code; it plays nicely with docker(-compose), Heroku, and any 12-factor app design.

All the below attributes can be set via `model_config`.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `_case_sensitive` | `bool | None` | Whether environment and CLI variable names should be read with case-sensitivity. Defaults to None. | `None` |
| `_nested_model_default_partial_update` | `bool | None` | Whether to allow partial updates on nested model default object fields. Defaults to False. | `None` |
| `_env_prefix` | `str | None` | Prefix for all environment variables. Defaults to None. | `None` |
| `_env_file` | `DotenvType | None` | The env file(s) to load settings values from. Defaults to Path(''), which means that the value from model_config['env_file'] should be used. You can also pass None to indicate that environment variables should not be loaded from an env file. | `ENV_FILE_SENTINEL` |
| `_env_file_encoding` | `str | None` | The env file encoding, e.g. 'latin-1'. Defaults to None. | `None` |
| `_env_ignore_empty` | `bool | None` | Ignore environment variables where the value is an empty string. Defaults to False. | `None` |
| `_env_nested_delimiter` | `str | None` | The nested env values delimiter. Defaults to None. | `None` |
| `_env_parse_none_str` | `str | None` | The env string value that should be parsed (e.g. "null", "void", "None", etc.) into None type(None). Defaults to None type(None), which means no parsing should occur. | `None` |
| `_env_parse_enums` | `bool | None` | Parse enum field names to values. Defaults to None, which means no parsing should occur. | `None` |
| `_cli_prog_name` | `str | None` | The CLI program name to display in help text. Defaults to None if \_cli_parse_args is None. Otherwise, defaults to sys.argv[0]. | `None` |
| `_cli_parse_args` | `bool | list[str] | tuple[str, ...] | None` | The list of CLI arguments to parse. Defaults to None. If set to True, defaults to sys.argv[1:]. | `None` |
| `_cli_settings_source` | `CliSettingsSource[Any] | None` | Override the default CLI settings source with a user defined instance. Defaults to None. | `None` |
| `_cli_parse_none_str` | `str | None` | The CLI string value that should be parsed (e.g. "null", "void", "None", etc.) into None type(None). Defaults to \_env_parse_none_str value if set. Otherwise, defaults to "null" if \_cli_avoid_json is False, and "None" if \_cli_avoid_json is True. | `None` |
| `_cli_hide_none_type` | `bool | None` | Hide None values in CLI help text. Defaults to False. | `None` |
| `_cli_avoid_json` | `bool | None` | Avoid complex JSON objects in CLI help text. Defaults to False. | `None` |
| `_cli_enforce_required` | `bool | None` | Enforce required fields at the CLI. Defaults to False. | `None` |
| `_cli_use_class_docs_for_groups` | `bool | None` | Use class docstrings in CLI group help text instead of field descriptions. Defaults to False. | `None` |
| `_cli_exit_on_error` | `bool | None` | Determines whether or not the internal parser exits with error info when an error occurs. Defaults to True. | `None` |
| `_cli_prefix` | `str | None` | The root parser command line arguments prefix. Defaults to "". | `None` |
| `_cli_flag_prefix_char` | `str | None` | The flag prefix character to use for CLI optional arguments. Defaults to '-'. | `None` |
| `_cli_implicit_flags` | `bool | None` | Whether bool fields should be implicitly converted into CLI boolean flags. (e.g. --flag, --no-flag). Defaults to False. | `None` |
| `_cli_ignore_unknown_args` | `bool | None` | Whether to ignore unknown CLI args and parse only known ones. Defaults to False. | `None` |
| `_cli_kebab_case` | `bool | None` | CLI args use kebab case. Defaults to False. | `None` |
| `_secrets_dir` | `PathType | None` | The secret files directory or a sequence of directories. Defaults to None. | `None` |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/main.py`

```python
def __init__(
    __pydantic_self__,
    _case_sensitive: bool | None = None,
    _nested_model_default_partial_update: bool | None = None,
    _env_prefix: str | None = None,
    _env_file: DotenvType | None = ENV_FILE_SENTINEL,
    _env_file_encoding: str | None = None,
    _env_ignore_empty: bool | None = None,
    _env_nested_delimiter: str | None = None,
    _env_parse_none_str: str | None = None,
    _env_parse_enums: bool | None = None,
    _cli_prog_name: str | None = None,
    _cli_parse_args: bool | list[str] | tuple[str, ...]
| None = None, _cli_settings_source: CliSettingsSource[Any] | None = None, _cli_parse_none_str: str | None = None, _cli_hide_none_type: bool | None = None, _cli_avoid_json: bool | None = None, _cli_enforce_required: bool | None = None, _cli_use_class_docs_for_groups: bool | None = None, _cli_exit_on_error: bool | None = None, _cli_prefix: str | None = None, _cli_flag_prefix_char: str | None = None, _cli_implicit_flags: bool | None = None, _cli_ignore_unknown_args: bool | None = None, _cli_kebab_case: bool | None = None, _secrets_dir: PathType | None = None, **values: Any, ) -> None: # Uses something other than `self` the first arg to allow "self" as a settable attribute super().__init__( **__pydantic_self__._settings_build_values( values, _case_sensitive=_case_sensitive, _nested_model_default_partial_update=_nested_model_default_partial_update, _env_prefix=_env_prefix, _env_file=_env_file, _env_file_encoding=_env_file_encoding, _env_ignore_empty=_env_ignore_empty, _env_nested_delimiter=_env_nested_delimiter, _env_parse_none_str=_env_parse_none_str, _env_parse_enums=_env_parse_enums, _cli_prog_name=_cli_prog_name, _cli_parse_args=_cli_parse_args, _cli_settings_source=_cli_settings_source, _cli_parse_none_str=_cli_parse_none_str, _cli_hide_none_type=_cli_hide_none_type, _cli_avoid_json=_cli_avoid_json, _cli_enforce_required=_cli_enforce_required, _cli_use_class_docs_for_groups=_cli_use_class_docs_for_groups, _cli_exit_on_error=_cli_exit_on_error, _cli_prefix=_cli_prefix, _cli_flag_prefix_char=_cli_flag_prefix_char, _cli_implicit_flags=_cli_implicit_flags, _cli_ignore_unknown_args=_cli_ignore_unknown_args, _cli_kebab_case=_cli_kebab_case, _secrets_dir=_secrets_dir, ) ) ``` ### settings_customise_sources ```python settings_customise_sources( settings_cls: type[BaseSettings], init_settings: PydanticBaseSettingsSource, env_settings: PydanticBaseSettingsSource, dotenv_settings: PydanticBaseSettingsSource, file_secret_settings: PydanticBaseSettingsSource, ) -> tuple[PydanticBaseSettingsSource, ...] ``` Define the sources and their order for loading the settings values. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `settings_cls` | `type[BaseSettings]` | The Settings class. | *required* | | `init_settings` | `PydanticBaseSettingsSource` | The InitSettingsSource instance. | *required* | | `env_settings` | `PydanticBaseSettingsSource` | The EnvSettingsSource instance. | *required* | | `dotenv_settings` | `PydanticBaseSettingsSource` | The DotEnvSettingsSource instance. | *required* | | `file_secret_settings` | `PydanticBaseSettingsSource` | The SecretsSettingsSource instance. | *required* | Returns: | Type | Description | | --- | --- | | `tuple[PydanticBaseSettingsSource, ...]` | A tuple containing the sources and their order for loading the settings values. | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/main.py` ```python @classmethod def settings_customise_sources( cls, settings_cls: type[BaseSettings], init_settings: PydanticBaseSettingsSource, env_settings: PydanticBaseSettingsSource, dotenv_settings: PydanticBaseSettingsSource, file_secret_settings: PydanticBaseSettingsSource, ) -> tuple[PydanticBaseSettingsSource, ...]: """ Define the sources and their order for loading the settings values. Args: settings_cls: The Settings class. init_settings: The `InitSettingsSource` instance. env_settings: The `EnvSettingsSource` instance. dotenv_settings: The `DotEnvSettingsSource` instance. file_secret_settings: The `SecretsSettingsSource` instance. 
    Returns:
        A tuple containing the sources and their order for loading the settings values.
    """
    return init_settings, env_settings, dotenv_settings, file_secret_settings
```
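Overriding this classmethod is the hook for reordering, dropping, or adding sources. A minimal sketch (the `Settings` model is illustrative) that gives environment variables priority over init kwargs:

```py
from pydantic_settings import BaseSettings, PydanticBaseSettingsSource

class Settings(BaseSettings):
    api_key: str = ''

    @classmethod
    def settings_customise_sources(
        cls,
        settings_cls: type[BaseSettings],
        init_settings: PydanticBaseSettingsSource,
        env_settings: PydanticBaseSettingsSource,
        dotenv_settings: PydanticBaseSettingsSource,
        file_secret_settings: PydanticBaseSettingsSource,
    ) -> tuple[PydanticBaseSettingsSource, ...]:
        # Earlier sources win: swapping the first two entries means an
        # environment variable now beats a value passed to __init__.
        return env_settings, init_settings, dotenv_settings, file_secret_settings
```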
SettingsError: If model_cls does not have a `cli_cmd` entrypoint defined. """ if not (is_pydantic_dataclass(model_cls) or is_model_class(model_cls)): raise SettingsError( f'Error: {model_cls.__name__} is not subclass of BaseModel or pydantic.dataclasses.dataclass' ) cli_settings = None cli_parse_args = True if cli_args is None else cli_args if cli_settings_source is not None: if isinstance(cli_parse_args, (Namespace, SimpleNamespace, dict)): cli_settings = cli_settings_source(parsed_args=cli_parse_args) else: cli_settings = cli_settings_source(args=cli_parse_args) elif isinstance(cli_parse_args, (Namespace, SimpleNamespace, dict)): raise SettingsError('Error: `cli_args` must be list[str] or None when `cli_settings_source` is not used') model_init_data['_cli_parse_args'] = cli_parse_args model_init_data['_cli_exit_on_error'] = cli_exit_on_error model_init_data['_cli_settings_source'] = cli_settings if not issubclass(model_cls, BaseSettings): class CliAppBaseSettings(BaseSettings, model_cls): # type: ignore model_config = SettingsConfigDict( nested_model_default_partial_update=True, case_sensitive=True, cli_hide_none_type=True, cli_avoid_json=True, cli_enforce_required=True, cli_implicit_flags=True, cli_kebab_case=True, ) model = CliAppBaseSettings(**model_init_data) model_init_data = {} for field_name, field_info in model.model_fields.items(): model_init_data[_field_name_for_signature(field_name, field_info)] = getattr(model, field_name) return CliApp._run_cli_cmd(model_cls(**model_init_data), cli_cmd_method_name, is_required=False) ``` ### run_subcommand ```python run_subcommand( model: PydanticModel, cli_exit_on_error: bool | None = None, cli_cmd_method_name: str = "cli_cmd", ) -> PydanticModel ``` Runs the model subcommand. Running a model subcommand requires the `cli_cmd` method to be defined in the nested model subcommand class. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `model` | `PydanticModel` | The model to run the subcommand from. | *required* | | `cli_exit_on_error` | `bool | None` | Determines whether this function exits with error if no subcommand is found. Defaults to model_config cli_exit_on_error value if set. Otherwise, defaults to True. | `None` | | `cli_cmd_method_name` | `str` | The CLI command method name to run. Defaults to "cli_cmd". | `'cli_cmd'` | Returns: | Type | Description | | --- | --- | | `PydanticModel` | The ran subcommand model. | Raises: | Type | Description | | --- | --- | | `SystemExit` | When no subcommand is found and cli_exit_on_error=True (the default). | | `SettingsError` | When no subcommand is found and cli_exit_on_error=False. | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/main.py` ```python @staticmethod def run_subcommand( model: PydanticModel, cli_exit_on_error: bool | None = None, cli_cmd_method_name: str = 'cli_cmd' ) -> PydanticModel: """ Runs the model subcommand. Running a model subcommand requires the `cli_cmd` method to be defined in the nested model subcommand class. Args: model: The model to run the subcommand from. cli_exit_on_error: Determines whether this function exits with error if no subcommand is found. Defaults to model_config `cli_exit_on_error` value if set. Otherwise, defaults to `True`. cli_cmd_method_name: The CLI command method name to run. Defaults to "cli_cmd". Returns: The ran subcommand model. Raises: SystemExit: When no subcommand is found and cli_exit_on_error=`True` (the default). SettingsError: When no subcommand is found and cli_exit_on_error=`False`. 
""" subcommand = get_subcommand(model, is_required=True, cli_exit_on_error=cli_exit_on_error) return CliApp._run_cli_cmd(subcommand, cli_cmd_method_name, is_required=True) ``` ## SettingsConfigDict Bases: `ConfigDict` ### pyproject_toml_depth ```python pyproject_toml_depth: int ``` Number of levels **up** from the current working directory to attempt to find a pyproject.toml file. This is only used when a pyproject.toml file is not found in the current working directory. ### pyproject_toml_table_header ```python pyproject_toml_table_header: tuple[str, ...] ``` Header of the TOML table within a pyproject.toml file to use when filling variables. This is supplied as a `tuple[str, ...]` instead of a `str` to accommodate for headers containing a `.`. For example, `toml_table_header = ("tool", "my.tool", "foo")` can be used to fill variable values from a table with header `[tool."my.tool".foo]`. To use the root table, exclude this config setting or provide an empty tuple. ## CliSettingsSource ```python CliSettingsSource( settings_cls: type[BaseSettings], cli_prog_name: str | None = None, cli_parse_args: ( bool | list[str] | tuple[str, ...] | None ) = None, cli_parse_none_str: str | None = None, cli_hide_none_type: bool | None = None, cli_avoid_json: bool | None = None, cli_enforce_required: bool | None = None, cli_use_class_docs_for_groups: bool | None = None, cli_exit_on_error: bool | None = None, cli_prefix: str | None = None, cli_flag_prefix_char: str | None = None, cli_implicit_flags: bool | None = None, cli_ignore_unknown_args: bool | None = None, cli_kebab_case: bool | None = None, case_sensitive: bool | None = True, root_parser: Any = None, parse_args_method: Callable[..., Any] | None = None, add_argument_method: ( Callable[..., Any] | None ) = add_argument, add_argument_group_method: ( Callable[..., Any] | None ) = add_argument_group, add_parser_method: ( Callable[..., Any] | None ) = add_parser, add_subparsers_method: ( Callable[..., Any] | None ) = add_subparsers, formatter_class: Any = RawDescriptionHelpFormatter, ) ``` Bases: `EnvSettingsSource`, `Generic[T]` Source class for loading settings values from CLI. Note A `CliSettingsSource` connects with a `root_parser` object by using the parser methods to add `settings_cls` fields as command line arguments. The `CliSettingsSource` internal parser representation is based upon the `argparse` parsing library, and therefore, requires the parser methods to support the same attributes as their `argparse` library counterparts. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `cli_prog_name` | `str | None` | The CLI program name to display in help text. Defaults to None if cli_parse_args is None. Otherwse, defaults to sys.argv[0]. | `None` | | `cli_parse_args` | `bool | list[str] | tuple[str, ...] | None` | The list of CLI arguments to parse. Defaults to None. If set to True, defaults to sys.argv[1:]. | `None` | | `cli_parse_none_str` | `str | None` | The CLI string value that should be parsed (e.g. "null", "void", "None", etc.) into None type(None). Defaults to "null" if cli_avoid_json is False, and "None" if cli_avoid_json is True. | `None` | | `cli_hide_none_type` | `bool | None` | Hide None values in CLI help text. Defaults to False. | `None` | | `cli_avoid_json` | `bool | None` | Avoid complex JSON objects in CLI help text. Defaults to False. | `None` | | `cli_enforce_required` | `bool | None` | Enforce required fields at the CLI. Defaults to False. 
## CliSettingsSource

```python
CliSettingsSource(
    settings_cls: type[BaseSettings],
    cli_prog_name: str | None = None,
    cli_parse_args: (
        bool | list[str] | tuple[str, ...] | None
    ) = None,
    cli_parse_none_str: str | None = None,
    cli_hide_none_type: bool | None = None,
    cli_avoid_json: bool | None = None,
    cli_enforce_required: bool | None = None,
    cli_use_class_docs_for_groups: bool | None = None,
    cli_exit_on_error: bool | None = None,
    cli_prefix: str | None = None,
    cli_flag_prefix_char: str | None = None,
    cli_implicit_flags: bool | None = None,
    cli_ignore_unknown_args: bool | None = None,
    cli_kebab_case: bool | None = None,
    case_sensitive: bool | None = True,
    root_parser: Any = None,
    parse_args_method: Callable[..., Any] | None = None,
    add_argument_method: (
        Callable[..., Any] | None
    ) = add_argument,
    add_argument_group_method: (
        Callable[..., Any] | None
    ) = add_argument_group,
    add_parser_method: (
        Callable[..., Any] | None
    ) = add_parser,
    add_subparsers_method: (
        Callable[..., Any] | None
    ) = add_subparsers,
    formatter_class: Any = RawDescriptionHelpFormatter,
)
```

Bases: `EnvSettingsSource`, `Generic[T]`

Source class for loading settings values from CLI.

Note

A `CliSettingsSource` connects with a `root_parser` object by using the parser methods to add `settings_cls` fields as command line arguments. The `CliSettingsSource` internal parser representation is based upon the `argparse` parsing library, and therefore, requires the parser methods to support the same attributes as their `argparse` library counterparts.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `cli_prog_name` | `str | None` | The CLI program name to display in help text. Defaults to None if cli_parse_args is None. Otherwise, defaults to sys.argv[0]. | `None` |
| `cli_parse_args` | `bool | list[str] | tuple[str, ...] | None` | The list of CLI arguments to parse. Defaults to None. If set to True, defaults to sys.argv[1:]. | `None` |
| `cli_parse_none_str` | `str | None` | The CLI string value that should be parsed (e.g. "null", "void", "None", etc.) into None type(None). Defaults to "null" if cli_avoid_json is False, and "None" if cli_avoid_json is True. | `None` |
| `cli_hide_none_type` | `bool | None` | Hide None values in CLI help text. Defaults to False. | `None` |
| `cli_avoid_json` | `bool | None` | Avoid complex JSON objects in CLI help text. Defaults to False. | `None` |
| `cli_enforce_required` | `bool | None` | Enforce required fields at the CLI. Defaults to False. | `None` |
| `cli_use_class_docs_for_groups` | `bool | None` | Use class docstrings in CLI group help text instead of field descriptions. Defaults to False. | `None` |
| `cli_exit_on_error` | `bool | None` | Determines whether or not the internal parser exits with error info when an error occurs. Defaults to True. | `None` |
| `cli_prefix` | `str | None` | Prefix for command line arguments added under the root parser. Defaults to "". | `None` |
| `cli_flag_prefix_char` | `str | None` | The flag prefix character to use for CLI optional arguments. Defaults to '-'. | `None` |
| `cli_implicit_flags` | `bool | None` | Whether bool fields should be implicitly converted into CLI boolean flags. (e.g. --flag, --no-flag). Defaults to False. | `None` |
| `cli_ignore_unknown_args` | `bool | None` | Whether to ignore unknown CLI args and parse only known ones. Defaults to False. | `None` |
| `cli_kebab_case` | `bool | None` | CLI args use kebab case. Defaults to False. | `None` |
| `case_sensitive` | `bool | None` | Whether CLI "--arg" names should be read with case-sensitivity. Defaults to True. Note: Case-insensitive matching is only supported on the internal root parser and does not apply to CLI subcommands. | `True` |
| `root_parser` | `Any` | The root parser object. | `None` |
| `parse_args_method` | `Callable[..., Any] | None` | The root parser parse args method. Defaults to argparse.ArgumentParser.parse_args. | `None` |
| `add_argument_method` | `Callable[..., Any] | None` | The root parser add argument method. Defaults to argparse.ArgumentParser.add_argument. | `add_argument` |
| `add_argument_group_method` | `Callable[..., Any] | None` | The root parser add argument group method. Defaults to argparse.ArgumentParser.add_argument_group. | `add_argument_group` |
| `add_parser_method` | `Callable[..., Any] | None` | The root parser add new parser (sub-command) method. Defaults to argparse.\_SubParsersAction.add_parser. | `add_parser` |
| `add_subparsers_method` | `Callable[..., Any] | None` | The root parser add subparsers (sub-commands) method. Defaults to argparse.ArgumentParser.add_subparsers. | `add_subparsers` |
| `formatter_class` | `Any` | A class for customizing the root parser help text. Defaults to argparse.RawDescriptionHelpFormatter. | `RawDescriptionHelpFormatter` |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
def __init__(
    self,
    settings_cls: type[BaseSettings],
    cli_prog_name: str | None = None,
    cli_parse_args: bool | list[str] | tuple[str, ...]
| None = None, cli_parse_none_str: str | None = None, cli_hide_none_type: bool | None = None, cli_avoid_json: bool | None = None, cli_enforce_required: bool | None = None, cli_use_class_docs_for_groups: bool | None = None, cli_exit_on_error: bool | None = None, cli_prefix: str | None = None, cli_flag_prefix_char: str | None = None, cli_implicit_flags: bool | None = None, cli_ignore_unknown_args: bool | None = None, cli_kebab_case: bool | None = None, case_sensitive: bool | None = True, root_parser: Any = None, parse_args_method: Callable[..., Any] | None = None, add_argument_method: Callable[..., Any] | None = ArgumentParser.add_argument, add_argument_group_method: Callable[..., Any] | None = ArgumentParser.add_argument_group, add_parser_method: Callable[..., Any] | None = _SubParsersAction.add_parser, add_subparsers_method: Callable[..., Any] | None = ArgumentParser.add_subparsers, formatter_class: Any = RawDescriptionHelpFormatter, ) -> None: self.cli_prog_name = ( cli_prog_name if cli_prog_name is not None else settings_cls.model_config.get('cli_prog_name', sys.argv[0]) ) self.cli_hide_none_type = ( cli_hide_none_type if cli_hide_none_type is not None else settings_cls.model_config.get('cli_hide_none_type', False) ) self.cli_avoid_json = ( cli_avoid_json if cli_avoid_json is not None else settings_cls.model_config.get('cli_avoid_json', False) ) if not cli_parse_none_str: cli_parse_none_str = 'None' if self.cli_avoid_json is True else 'null' self.cli_parse_none_str = cli_parse_none_str self.cli_enforce_required = ( cli_enforce_required if cli_enforce_required is not None else settings_cls.model_config.get('cli_enforce_required', False) ) self.cli_use_class_docs_for_groups = ( cli_use_class_docs_for_groups if cli_use_class_docs_for_groups is not None else settings_cls.model_config.get('cli_use_class_docs_for_groups', False) ) self.cli_exit_on_error = ( cli_exit_on_error if cli_exit_on_error is not None else settings_cls.model_config.get('cli_exit_on_error', True) ) self.cli_prefix = cli_prefix if cli_prefix is not None else settings_cls.model_config.get('cli_prefix', '') self.cli_flag_prefix_char = ( cli_flag_prefix_char if cli_flag_prefix_char is not None else settings_cls.model_config.get('cli_flag_prefix_char', '-') ) self._cli_flag_prefix = self.cli_flag_prefix_char * 2 if self.cli_prefix: if cli_prefix.startswith('.') or cli_prefix.endswith('.') or not cli_prefix.replace('.', '').isidentifier(): # type: ignore raise SettingsError(f'CLI settings source prefix is invalid: {cli_prefix}') self.cli_prefix += '.' 
self.cli_implicit_flags = ( cli_implicit_flags if cli_implicit_flags is not None else settings_cls.model_config.get('cli_implicit_flags', False) ) self.cli_ignore_unknown_args = ( cli_ignore_unknown_args if cli_ignore_unknown_args is not None else settings_cls.model_config.get('cli_ignore_unknown_args', False) ) self.cli_kebab_case = ( cli_kebab_case if cli_kebab_case is not None else settings_cls.model_config.get('cli_kebab_case', False) ) case_sensitive = case_sensitive if case_sensitive is not None else True if not case_sensitive and root_parser is not None: raise SettingsError('Case-insensitive matching is only supported on the internal root parser') super().__init__( settings_cls, env_nested_delimiter='.', env_parse_none_str=self.cli_parse_none_str, env_parse_enums=True, env_prefix=self.cli_prefix, case_sensitive=case_sensitive, ) root_parser = ( _CliInternalArgParser( cli_exit_on_error=self.cli_exit_on_error, prog=self.cli_prog_name, description=None if settings_cls.__doc__ is None else dedent(settings_cls.__doc__), formatter_class=formatter_class, prefix_chars=self.cli_flag_prefix_char, allow_abbrev=False, ) if root_parser is None else root_parser ) self._connect_root_parser( root_parser=root_parser, parse_args_method=parse_args_method, add_argument_method=add_argument_method, add_argument_group_method=add_argument_group_method, add_parser_method=add_parser_method, add_subparsers_method=add_subparsers_method, formatter_class=formatter_class, ) if cli_parse_args not in (None, False): if cli_parse_args is True: cli_parse_args = sys.argv[1:] elif not isinstance(cli_parse_args, (list, tuple)): raise SettingsError( f'cli_parse_args must be List[str] or Tuple[str, ...], recieved {type(cli_parse_args)}' ) self._load_env_vars(parsed_args=self._parse_args(self.root_parser, cli_parse_args)) ``` ### root_parser ```python root_parser: T ``` The connected root parser instance. ## DotEnvSettingsSource ```python DotEnvSettingsSource( settings_cls: type[BaseSettings], env_file: DotenvType | None = ENV_FILE_SENTINEL, env_file_encoding: str | None = None, case_sensitive: bool | None = None, env_prefix: str | None = None, env_nested_delimiter: str | None = None, env_ignore_empty: bool | None = None, env_parse_none_str: str | None = None, env_parse_enums: bool | None = None, ) ``` Bases: `EnvSettingsSource` Source class for loading settings values from env files. 
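In day-to-day use this source is driven through model config rather than instantiated directly; a minimal sketch (the file name and field are illustrative):

```py
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Values are read from .env (if present) in addition to the
    # process environment.
    model_config = SettingsConfigDict(env_file='.env', env_file_encoding='utf-8')

    database_url: str = 'sqlite://'
```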
Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], env_file: DotenvType | None = ENV_FILE_SENTINEL, env_file_encoding: str | None = None, case_sensitive: bool | None = None, env_prefix: str | None = None, env_nested_delimiter: str | None = None, env_ignore_empty: bool | None = None, env_parse_none_str: str | None = None, env_parse_enums: bool | None = None, ) -> None: self.env_file = env_file if env_file != ENV_FILE_SENTINEL else settings_cls.model_config.get('env_file') self.env_file_encoding = ( env_file_encoding if env_file_encoding is not None else settings_cls.model_config.get('env_file_encoding') ) super().__init__( settings_cls, case_sensitive, env_prefix, env_nested_delimiter, env_ignore_empty, env_parse_none_str, env_parse_enums, ) ``` ## EnvSettingsSource ```python EnvSettingsSource( settings_cls: type[BaseSettings], case_sensitive: bool | None = None, env_prefix: str | None = None, env_nested_delimiter: str | None = None, env_ignore_empty: bool | None = None, env_parse_none_str: str | None = None, env_parse_enums: bool | None = None, ) ``` Bases: `PydanticBaseEnvSettingsSource` Source class for loading settings values from environment variables. Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], case_sensitive: bool | None = None, env_prefix: str | None = None, env_nested_delimiter: str | None = None, env_ignore_empty: bool | None = None, env_parse_none_str: str | None = None, env_parse_enums: bool | None = None, ) -> None: super().__init__( settings_cls, case_sensitive, env_prefix, env_ignore_empty, env_parse_none_str, env_parse_enums ) self.env_nested_delimiter = ( env_nested_delimiter if env_nested_delimiter is not None else self.config.get('env_nested_delimiter') ) self.env_prefix_len = len(self.env_prefix) self.env_vars = self._load_env_vars() ``` ### get_field_value ```python get_field_value( field: FieldInfo, field_name: str ) -> tuple[Any, str, bool] ``` Gets the value for field from environment variables and a flag to determine whether value is complex. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `field` | `FieldInfo` | The field. | *required* | | `field_name` | `str` | The field name. | *required* | Returns: | Type | Description | | --- | --- | | `tuple[Any, str, bool]` | A tuple that contains the value (None if not found), key, and a flag to determine whether value is complex. | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]: """ Gets the value for field from environment variables and a flag to determine whether value is complex. Args: field: The field. field_name: The field name. Returns: A tuple that contains the value (`None` if not found), key, and a flag to determine whether value is complex. """ env_val: str | None = None for field_key, env_name, value_is_complex in self._extract_field_info(field, field_name): env_val = self.env_vars.get(env_name) if env_val is not None: break return env_val, field_key, value_is_complex ``` ### prepare_field_value ```python prepare_field_value( field_name: str, field: FieldInfo, value: Any, value_is_complex: bool, ) -> Any ``` Prepare value for the field. - Extract value for nested field. - Deserialize value to python object for complex field. 
Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `field` | `FieldInfo` | The field. | *required* |
| `field_name` | `str` | The field name. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The prepared value for the field. |

Raises:

| Type | Description |
| --- | --- |
| `ValueError` | When there is an error deserializing the value for a complex field. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool) -> Any:
    """
    Prepare value for the field.

    * Extract value for nested field.
    * Deserialize value to python object for complex field.

    Args:
        field: The field.
        field_name: The field name.

    Returns:
        The prepared value for the field.

    Raises:
        ValueError: When there is an error deserializing the value for a complex field.
    """
    is_complex, allow_parse_failure = self._field_is_complex(field)
    if self.env_parse_enums:
        enum_val = _annotation_enum_name_to_val(field.annotation, value)
        value = value if enum_val is None else enum_val

    if is_complex or value_is_complex:
        if isinstance(value, EnvNoneType):
            return value
        elif value is None:
            # field is complex but no value found so far, try explode_env_vars
            env_val_built = self.explode_env_vars(field_name, field, self.env_vars)
            if env_val_built:
                return env_val_built
        else:
            # field is complex and there's a value, decode that as JSON, then add explode_env_vars
            try:
                value = self.decode_complex_value(field_name, field, value)
            except ValueError as e:
                if not allow_parse_failure:
                    raise e

            if isinstance(value, dict):
                return deep_update(value, self.explode_env_vars(field_name, field, self.env_vars))
            else:
                return value
    elif value is not None:
        # simplest case, field is not complex, we only need to add the value if it was found
        return value
```

### next_field

```python
next_field(
    field: FieldInfo | Any | None,
    key: str,
    case_sensitive: bool | None = None,
) -> FieldInfo | None
```

Find the field in a sub model by key (env name).

By having the following models:

```py
class SubSubModel(BaseSettings):
    dvals: Dict

class SubModel(BaseSettings):
    vals: list[str]
    sub_sub_model: SubSubModel

class Cfg(BaseSettings):
    sub_model: SubModel
```

Then `next_field(sub_model, 'vals')` returns the `vals` field of the `SubModel` class, and `next_field(sub_model, 'sub_sub_model')` returns the `sub_sub_model` field of the `SubModel` class.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `field` | `FieldInfo | Any | None` | The field. | *required* |
| `key` | `str` | The key (env name). | *required* |
| `case_sensitive` | `bool | None` | Whether to search for key case sensitively. | `None` |

Returns:

| Type | Description |
| --- | --- |
| `FieldInfo | None` | Field if it finds the next field otherwise None. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

````python
def next_field(
    self, field: FieldInfo | Any | None, key: str, case_sensitive: bool | None = None
) -> FieldInfo | None:
    """
    Find the field in a sub model by key (env name).

    By having the following models:

    ```py
    class SubSubModel(BaseSettings):
        dvals: Dict

    class SubModel(BaseSettings):
        vals: list[str]
        sub_sub_model: SubSubModel

    class Cfg(BaseSettings):
        sub_model: SubModel
    ```

    Then:
        next_field(sub_model, 'vals') returns the `vals` field of the `SubModel` class
        next_field(sub_model, 'sub_sub_model') returns the `sub_sub_model` field of the `SubModel` class

    Args:
        field: The field.
        key: The key (env name).
        case_sensitive: Whether to search for key case sensitively.

    Returns:
        Field if it finds the next field otherwise `None`.
    """
    if not field:
        return None

    annotation = field.annotation if isinstance(field, FieldInfo) else field
    if origin_is_union(get_origin(annotation)) or isinstance(annotation, WithArgsTypes):
        for type_ in get_args(annotation):
            type_has_key = self.next_field(type_, key, case_sensitive)
            if type_has_key:
                return type_has_key
    elif is_model_class(annotation) or is_pydantic_dataclass(annotation):
        fields = _get_model_fields(annotation)
        # `case_sensitive is None` is here to be compatible with the old behavior.
        # Has to be removed in V3.
        for field_name, f in fields.items():
            for _, env_name, _ in self._extract_field_info(f, field_name):
                if case_sensitive is None or case_sensitive:
                    if field_name == key or env_name == key:
                        return f
                elif field_name.lower() == key.lower() or env_name.lower() == key.lower():
                    return f
    return None
````

### explode_env_vars

```python
explode_env_vars(
    field_name: str,
    field: FieldInfo,
    env_vars: Mapping[str, str | None],
) -> dict[str, Any]
```

Process env_vars and extract the values of keys containing env_nested_delimiter into nested dictionaries. This is applied to a single field, hence filtering by env_var prefix.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `field_name` | `str` | The field name. | *required* |
| `field` | `FieldInfo` | The field. | *required* |
| `env_vars` | `Mapping[str, str | None]` | Environment variables. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `dict[str, Any]` | A dictionary containing extracted values from nested env values. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
def explode_env_vars(self, field_name: str, field: FieldInfo, env_vars: Mapping[str, str | None]) -> dict[str, Any]:
    """
    Process env_vars and extract the values of keys containing env_nested_delimiter into nested dictionaries.

    This is applied to a single field, hence filtering by env_var prefix.

    Args:
        field_name: The field name.
        field: The field.
        env_vars: Environment variables.

    Returns:
        A dictionary containing extracted values from nested env values.
""" is_dict = lenient_issubclass(get_origin(field.annotation), dict) prefixes = [ f'{env_name}{self.env_nested_delimiter}' for _, env_name, _ in self._extract_field_info(field, field_name) ] result: dict[str, Any] = {} for env_name, env_val in env_vars.items(): if not any(env_name.startswith(prefix) for prefix in prefixes): continue # we remove the prefix before splitting in case the prefix has characters in common with the delimiter env_name_without_prefix = env_name[self.env_prefix_len :] _, *keys, last_key = env_name_without_prefix.split(self.env_nested_delimiter) env_var = result target_field: FieldInfo | None = field for key in keys: target_field = self.next_field(target_field, key, self.case_sensitive) if isinstance(env_var, dict): env_var = env_var.setdefault(key, {}) # get proper field with last_key target_field = self.next_field(target_field, last_key, self.case_sensitive) # check if env_val maps to a complex field and if so, parse the env_val if (target_field or is_dict) and env_val: if target_field: is_complex, allow_json_failure = self._field_is_complex(target_field) else: # nested field type is dict is_complex, allow_json_failure = True, True if is_complex: try: env_val = self.decode_complex_value(last_key, target_field, env_val) # type: ignore except ValueError as e: if not allow_json_failure: raise e if isinstance(env_var, dict): if last_key not in env_var or not isinstance(env_val, EnvNoneType) or env_var[last_key] == {}: env_var[last_key] = env_val return result ``` ## ForceDecode Annotation to force decoding of a field value. ## InitSettingsSource ```python InitSettingsSource( settings_cls: type[BaseSettings], init_kwargs: dict[str, Any], nested_model_default_partial_update: bool | None = None, ) ``` Bases: `PydanticBaseSettingsSource` Source class for loading values provided during settings class initialization. Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], init_kwargs: dict[str, Any], nested_model_default_partial_update: bool | None = None, ): self.init_kwargs = init_kwargs super().__init__(settings_cls) self.nested_model_default_partial_update = ( nested_model_default_partial_update if nested_model_default_partial_update is not None else self.config.get('nested_model_default_partial_update', False) ) ``` ## JsonConfigSettingsSource ```python JsonConfigSettingsSource( settings_cls: type[BaseSettings], json_file: PathType | None = DEFAULT_PATH, json_file_encoding: str | None = None, ) ``` Bases: `InitSettingsSource`, `ConfigFileSourceMixin` A source class that loads variables from a JSON file Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], json_file: PathType | None = DEFAULT_PATH, json_file_encoding: str | None = None, ): self.json_file_path = json_file if json_file != DEFAULT_PATH else settings_cls.model_config.get('json_file') self.json_file_encoding = ( json_file_encoding if json_file_encoding is not None else settings_cls.model_config.get('json_file_encoding') ) self.json_data = self._read_files(self.json_file_path) super().__init__(settings_cls, self.json_data) ``` ## NoDecode Annotation to prevent decoding of a field value. ## PydanticBaseSettingsSource ```python PydanticBaseSettingsSource( settings_cls: type[BaseSettings], ) ``` Bases: `ABC` Abstract base class for settings sources, every settings source classes should inherit from it. 
Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
def __init__(self, settings_cls: type[BaseSettings]):
    self.settings_cls = settings_cls
    self.config = settings_cls.model_config
    self._current_state: dict[str, Any] = {}
    self._settings_sources_data: dict[str, dict[str, Any]] = {}
```

### current_state

```python
current_state: dict[str, Any]
```

The current state of the settings, populated by the previous settings sources.

### settings_sources_data

```python
settings_sources_data: dict[str, dict[str, Any]]
```

The state of all previous settings sources.

### get_field_value

```python
get_field_value(
    field: FieldInfo, field_name: str
) -> tuple[Any, str, bool]
```

Gets the value, the key for model creation, and a flag to determine whether value is complex.

This is an abstract method that should be overridden in every settings source class.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `field` | `FieldInfo` | The field. | *required* |
| `field_name` | `str` | The field name. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `tuple[Any, str, bool]` | A tuple that contains the value, key and a flag to determine whether value is complex. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
@abstractmethod
def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]:
    """
    Gets the value, the key for model creation, and a flag to determine whether value is complex.

    This is an abstract method that should be overridden in every settings source class.

    Args:
        field: The field.
        field_name: The field name.

    Returns:
        A tuple that contains the value, key and a flag to determine whether value is complex.
    """
    pass
```

### field_is_complex

```python
field_is_complex(field: FieldInfo) -> bool
```

Checks whether a field is complex, in which case it will attempt to be parsed as JSON.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `field` | `FieldInfo` | The field. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `bool` | Whether the field is complex. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
def field_is_complex(self, field: FieldInfo) -> bool:
    """
    Checks whether a field is complex, in which case it will attempt to be parsed as JSON.

    Args:
        field: The field.

    Returns:
        Whether the field is complex.
    """
    return _annotation_is_complex(field.annotation, field.metadata)
```

### prepare_field_value

```python
prepare_field_value(
    field_name: str,
    field: FieldInfo,
    value: Any,
    value_is_complex: bool,
) -> Any
```

Prepares the value of a field.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `field_name` | `str` | The field name. | *required* |
| `field` | `FieldInfo` | The field. | *required* |
| `value` | `Any` | The value of the field that has to be prepared. | *required* |
| `value_is_complex` | `bool` | A flag to determine whether value is complex. | *required* |

Returns:

| Type | Description |
| --- | --- |
| `Any` | The prepared value. |

Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py`

```python
def prepare_field_value(self, field_name: str, field: FieldInfo, value: Any, value_is_complex: bool) -> Any:
    """
    Prepares the value of a field.

    Args:
        field_name: The field name.
        field: The field.
        value: The value of the field that has to be prepared.
        value_is_complex: A flag to determine whether value is complex.
Returns: The prepared value. """ if value is not None and (self.field_is_complex(field) or value_is_complex): return self.decode_complex_value(field_name, field, value) return value ``` ### decode_complex_value ```python decode_complex_value( field_name: str, field: FieldInfo, value: Any ) -> Any ``` Decode the value for a complex field Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `field_name` | `str` | The field name. | *required* | | `field` | `FieldInfo` | The field. | *required* | | `value` | `Any` | The value of the field that has to be prepared. | *required* | Returns: | Type | Description | | --- | --- | | `Any` | The decoded value for further preparation | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def decode_complex_value(self, field_name: str, field: FieldInfo, value: Any) -> Any: """ Decode the value for a complex field Args: field_name: The field name. field: The field. value: The value of the field that has to be prepared. Returns: The decoded value for further preparation """ if field and ( NoDecode in field.metadata or (self.config.get('enable_decoding') is False and ForceDecode not in field.metadata) ): return value return json.loads(value) ``` ## PyprojectTomlConfigSettingsSource ```python PyprojectTomlConfigSettingsSource( settings_cls: type[BaseSettings], toml_file: Path | None = None, ) ``` Bases: `TomlConfigSettingsSource` A source class that loads variables from a `pyproject.toml` file. Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], toml_file: Path | None = None, ) -> None: self.toml_file_path = self._pick_pyproject_toml_file( toml_file, settings_cls.model_config.get('pyproject_toml_depth', 0) ) self.toml_table_header: tuple[str, ...] = settings_cls.model_config.get( 'pyproject_toml_table_header', ('tool', 'pydantic-settings') ) self.toml_data = self._read_files(self.toml_file_path) for key in self.toml_table_header: self.toml_data = self.toml_data.get(key, {}) super(TomlConfigSettingsSource, self).__init__(settings_cls, self.toml_data) ``` ## SecretsSettingsSource ```python SecretsSettingsSource( settings_cls: type[BaseSettings], secrets_dir: PathType | None = None, case_sensitive: bool | None = None, env_prefix: str | None = None, env_ignore_empty: bool | None = None, env_parse_none_str: str | None = None, env_parse_enums: bool | None = None, ) ``` Bases: `PydanticBaseEnvSettingsSource` Source class for loading settings values from secret files. Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], secrets_dir: PathType | None = None, case_sensitive: bool | None = None, env_prefix: str | None = None, env_ignore_empty: bool | None = None, env_parse_none_str: str | None = None, env_parse_enums: bool | None = None, ) -> None: super().__init__( settings_cls, case_sensitive, env_prefix, env_ignore_empty, env_parse_none_str, env_parse_enums ) self.secrets_dir = secrets_dir if secrets_dir is not None else self.config.get('secrets_dir') ``` ### find_case_path ```python find_case_path( dir_path: Path, file_name: str, case_sensitive: bool ) -> Path | None ``` Find a file within path's directory matching filename, optionally ignoring case. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `dir_path` | `Path` | Directory path. | *required* | | `file_name` | `str` | File name. 
| *required* | | `case_sensitive` | `bool` | Whether to search for file name case sensitively. | *required* | Returns: | Type | Description | | --- | --- | | `Path | None` | The file path, or None if the file does not exist in the directory. | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python @classmethod def find_case_path(cls, dir_path: Path, file_name: str, case_sensitive: bool) -> Path | None: """ Find a file within path's directory matching filename, optionally ignoring case. Args: dir_path: Directory path. file_name: File name. case_sensitive: Whether to search for file name case sensitively. Returns: The file path, or `None` if the file does not exist in the directory. """ for f in dir_path.iterdir(): if f.name == file_name: return f elif not case_sensitive and f.name.lower() == file_name.lower(): return f return None ``` ### get_field_value ```python get_field_value( field: FieldInfo, field_name: str ) -> tuple[Any, str, bool] ``` Gets the value for field from secret file and a flag to determine whether value is complex. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `field` | `FieldInfo` | The field. | *required* | | `field_name` | `str` | The field name. | *required* | Returns: | Type | Description | | --- | --- | | `tuple[Any, str, bool]` | A tuple that contains the value (None if the file does not exist), key, and a flag to determine whether value is complex. | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def get_field_value(self, field: FieldInfo, field_name: str) -> tuple[Any, str, bool]: """ Gets the value for field from secret file and a flag to determine whether value is complex. Args: field: The field. field_name: The field name. Returns: A tuple that contains the value (`None` if the file does not exist), key, and a flag to determine whether value is complex.
""" for field_key, env_name, value_is_complex in self._extract_field_info(field, field_name): # paths reversed to match the last-wins behaviour of `env_file` for secrets_path in reversed(self.secrets_paths): path = self.find_case_path(secrets_path, env_name, self.case_sensitive) if not path: # path does not exist, we currently don't return a warning for this continue if path.is_file(): return path.read_text().strip(), field_key, value_is_complex else: warnings.warn( f'attempted to load secret file "{path}" but found a {path_type_label(path)} instead.', stacklevel=4, ) return None, field_key, value_is_complex ``` ## TomlConfigSettingsSource ```python TomlConfigSettingsSource( settings_cls: type[BaseSettings], toml_file: PathType | None = DEFAULT_PATH, ) ``` Bases: `InitSettingsSource`, `ConfigFileSourceMixin` A source class that loads variables from a TOML file Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], toml_file: PathType | None = DEFAULT_PATH, ): self.toml_file_path = toml_file if toml_file != DEFAULT_PATH else settings_cls.model_config.get('toml_file') self.toml_data = self._read_files(self.toml_file_path) super().__init__(settings_cls, self.toml_data) ``` ## YamlConfigSettingsSource ```python YamlConfigSettingsSource( settings_cls: type[BaseSettings], yaml_file: PathType | None = DEFAULT_PATH, yaml_file_encoding: str | None = None, ) ``` Bases: `InitSettingsSource`, `ConfigFileSourceMixin` A source class that loads variables from a yaml file Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def __init__( self, settings_cls: type[BaseSettings], yaml_file: PathType | None = DEFAULT_PATH, yaml_file_encoding: str | None = None, ): self.yaml_file_path = yaml_file if yaml_file != DEFAULT_PATH else settings_cls.model_config.get('yaml_file') self.yaml_file_encoding = ( yaml_file_encoding if yaml_file_encoding is not None else settings_cls.model_config.get('yaml_file_encoding') ) self.yaml_data = self._read_files(self.yaml_file_path) super().__init__(settings_cls, self.yaml_data) ``` ## get_subcommand ```python get_subcommand( model: PydanticModel, is_required: bool = True, cli_exit_on_error: bool | None = None, ) -> Optional[PydanticModel] ``` Get the subcommand from a model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `model` | `PydanticModel` | The model to get the subcommand from. | *required* | | `is_required` | `bool` | Determines whether a model must have subcommand set and raises error if not found. Defaults to True. | `True` | | `cli_exit_on_error` | `bool | None` | Determines whether this function exits with error if no subcommand is found. Defaults to model_config cli_exit_on_error value if set. Otherwise, defaults to True. | `None` | Returns: | Type | Description | | --- | --- | | `Optional[PydanticModel]` | The subcommand model if found, otherwise None. | Raises: | Type | Description | | --- | --- | | `SystemExit` | When no subcommand is found and is_required=True and cli_exit_on_error=True (the default). | | `SettingsError` | When no subcommand is found and is_required=True and cli_exit_on_error=False. | Source code in `.venv/lib/python3.12/site-packages/pydantic_settings/sources.py` ```python def get_subcommand( model: PydanticModel, is_required: bool = True, cli_exit_on_error: bool | None = None ) -> Optional[PydanticModel]: """ Get the subcommand from a model. 
Args: model: The model to get the subcommand from. is_required: Determines whether a model must have subcommand set and raises error if not found. Defaults to `True`. cli_exit_on_error: Determines whether this function exits with error if no subcommand is found. Defaults to model_config `cli_exit_on_error` value if set. Otherwise, defaults to `True`. Returns: The subcommand model if found, otherwise `None`. Raises: SystemExit: When no subcommand is found and is_required=`True` and cli_exit_on_error=`True` (the default). SettingsError: When no subcommand is found and is_required=`True` and cli_exit_on_error=`False`. """ model_cls = type(model) if cli_exit_on_error is None and is_model_class(model_cls): model_default = model_cls.model_config.get('cli_exit_on_error') if isinstance(model_default, bool): cli_exit_on_error = model_default if cli_exit_on_error is None: cli_exit_on_error = True subcommands: list[str] = [] for field_name, field_info in _get_model_fields(model_cls).items(): if _CliSubCommand in field_info.metadata: if getattr(model, field_name) is not None: return getattr(model, field_name) subcommands.append(field_name) if is_required: error_message = ( f'Error: CLI subcommand is required {{{", ".join(subcommands)}}}' if subcommands else 'Error: CLI subcommand is required but no subcommands were found.' ) raise SystemExit(error_message) if cli_exit_on_error else SettingsError(error_message) return None ``` RootModel class and type definitions. ## RootModel ```python RootModel( root: RootModelRootType = PydanticUndefined, **data ) ``` Bases: `BaseModel`, `Generic[RootModelRootType]` Usage Documentation [`RootModel` and Custom Root Types](../../concepts/models/#rootmodel-and-custom-root-types) A Pydantic `BaseModel` for the root object of the model. Attributes: | Name | Type | Description | | --- | --- | --- | | `root` | `RootModelRootType` | The root object of the model. | | `__pydantic_root_model__` | | Whether the model is a RootModel. | | `__pydantic_private__` | | Private fields in the model. | | `__pydantic_extra__` | | Extra fields in the model. | Source code in `pydantic/root_model.py` ```python def __init__(self, /, root: RootModelRootType = PydanticUndefined, **data) -> None: # type: ignore __tracebackhide__ = True if data: if root is not PydanticUndefined: raise ValueError( '"RootModel.__init__" accepts either a single positional argument or arbitrary keyword arguments' ) root = data # type: ignore self.__pydantic_validator__.validate_python(root, self_instance=self) ``` ### model_construct ```python model_construct( root: RootModelRootType, _fields_set: set[str] | None = None, ) -> Self ``` Create a new model using the provided root object and update fields set. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `root` | `RootModelRootType` | The root object of the model. | *required* | | `_fields_set` | `set[str] | None` | The set of fields to be updated. | `None` | Returns: | Type | Description | | --- | --- | | `Self` | The new model. | Raises: | Type | Description | | --- | --- | | `NotImplemented` | If the model is not a subclass of RootModel. | Source code in `pydantic/root_model.py` ```python @classmethod def model_construct(cls, root: RootModelRootType, _fields_set: set[str] | None = None) -> Self: # type: ignore """Create a new model using the provided root object and update fields set. Args: root: The root object of the model. _fields_set: The set of fields to be updated. Returns: The new model. 
Raises: NotImplemented: If the model is not a subclass of `RootModel`. """ return super().model_construct(root=root, _fields_set=_fields_set) ``` ### model_dump ```python model_dump( *, mode: Literal["json", "python"] | str = "python", include: Any = None, exclude: Any = None, context: dict[str, Any] | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, serialize_as_any: bool = False ) -> Any ``` This method is included just to get a more accurate return type for type checkers. It is included in this `if TYPE_CHECKING:` block since no override is actually necessary. See the documentation of `BaseModel.model_dump` for more details about the arguments. Generally, this method will have a return type of `RootModelRootType`, assuming that `RootModelRootType` is not a `BaseModel` subclass. If `RootModelRootType` is a `BaseModel` subclass, then the return type will likely be `dict[str, Any]`, as `model_dump` calls are recursive. The return type could even be something different, in the case of a custom serializer. Thus, `Any` is used here to catch all of these cases. Source code in `pydantic/root_model.py` ```python def model_dump( # type: ignore self, *, mode: Literal['json', 'python'] | str = 'python', include: Any = None, exclude: Any = None, context: dict[str, Any] | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, serialize_as_any: bool = False, ) -> Any: """This method is included just to get a more accurate return type for type checkers. It is included in this `if TYPE_CHECKING:` block since no override is actually necessary. See the documentation of `BaseModel.model_dump` for more details about the arguments. Generally, this method will have a return type of `RootModelRootType`, assuming that `RootModelRootType` is not a `BaseModel` subclass. If `RootModelRootType` is a `BaseModel` subclass, then the return type will likely be `dict[str, Any]`, as `model_dump` calls are recursive. The return type could even be something different, in the case of a custom serializer. Thus, `Any` is used here to catch all of these cases. """ ... ``` Pydantic supports many common types from the Python standard library. If you need stricter processing see [Strict Types](../../concepts/types/#strict-types), including if you need to constrain the values allowed (e.g. to require a positive `int`). Note Pydantic still supports older (3.8-) typing constructs like `typing.List` and `typing.Dict`, but it's best practice to use the newer types like `list` and `dict`. ## Booleans A standard `bool` field will raise a `ValidationError` if the value is not one of the following: - A valid boolean (i.e. `True` or `False`), - The integers `0` or `1`, - a `str` which when converted to lower case is one of `'0', 'off', 'f', 'false', 'n', 'no', '1', 'on', 't', 'true', 'y', 'yes'` - a `bytes` which is valid per the previous rule when decoded to `str` Note If you want stricter boolean logic (e.g. a field which only permits `True` and `False`) you can use [`StrictBool`](../types/#pydantic.types.StrictBool). 
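For instance, a minimal sketch of the stricter behavior (the model and field names here are illustrative):

```python
from pydantic import BaseModel, StrictBool, ValidationError


class StrictBooleanModel(BaseModel):
    flag: StrictBool


print(StrictBooleanModel(flag=True))
#> flag=True

try:
    StrictBooleanModel(flag=1)  # lax coercion from `1` is rejected
except ValidationError as e:
    print(e)
    """
    1 validation error for StrictBooleanModel
    flag
      Input should be a valid boolean [type=bool_type, input_value=1, input_type=int]
    """
```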
Here is a script demonstrating some of these behaviors: ```python from pydantic import BaseModel, ValidationError class BooleanModel(BaseModel): bool_value: bool print(BooleanModel(bool_value=False)) #> bool_value=False print(BooleanModel(bool_value='False')) #> bool_value=False print(BooleanModel(bool_value=1)) #> bool_value=True try: BooleanModel(bool_value=[]) except ValidationError as e: print(str(e)) """ 1 validation error for BooleanModel bool_value Input should be a valid boolean [type=bool_type, input_value=[], input_type=list] """ ``` ## Datetime Types Pydantic supports the following [datetime](https://docs.python.org/library/datetime.html#available-types) types: ### datetime.datetime - `datetime` fields will accept values of type: - `datetime`; an existing `datetime` object - `int` or `float`; assumed as Unix time, i.e. seconds (if >= `-2e10` and \<= `2e10`) or milliseconds (if < `-2e10` or > `2e10`) since 1 January 1970 - `str`; the following formats are accepted: - `YYYY-MM-DD[T]HH:MM[:SS[.ffffff]][Z or [±]HH[:]MM]` - `YYYY-MM-DD` is accepted in lax mode, but not in strict mode - `int` or `float` as a string (assumed as Unix time) - datetime.date instances are accepted in lax mode, but not in strict mode ```python from datetime import datetime from pydantic import BaseModel class Event(BaseModel): dt: datetime = None event = Event(dt='2032-04-23T10:20:30.400+02:30') print(event.model_dump()) """ {'dt': datetime.datetime(2032, 4, 23, 10, 20, 30, 400000, tzinfo=TzInfo(+02:30))} """ ``` ### datetime.date - `date` fields will accept values of type: - `date`; an existing `date` object - `int` or `float`; handled the same as described for `datetime` above - `str`; the following formats are accepted: - `YYYY-MM-DD` - `int` or `float` as a string (assumed as Unix time) ```python from datetime import date from pydantic import BaseModel class Birthday(BaseModel): d: date = None my_birthday = Birthday(d=1679616000.0) print(my_birthday.model_dump()) #> {'d': datetime.date(2023, 3, 24)} ``` ### datetime.time - `time` fields will accept values of type: - `time`; an existing `time` object - `str`; the following formats are accepted: - `HH:MM[:SS[.ffffff]][Z or [±]HH[:]MM]` ```python from datetime import time from pydantic import BaseModel class Meeting(BaseModel): t: time = None m = Meeting(t=time(4, 8, 16)) print(m.model_dump()) #> {'t': datetime.time(4, 8, 16)} ``` ### datetime.timedelta - `timedelta` fields will accept values of type: - `timedelta`; an existing `timedelta` object - `int` or `float`; assumed to be seconds - `str`; the following formats are accepted: - `[-][[DD]D,]HH:MM:SS[.ffffff]` - Ex: `'1d,01:02:03.000004'` or `'1D01:02:03.000004'` or `'01:02:03'` - `[±]P[DD]DT[HH]H[MM]M[SS]S` ([ISO 8601](https://en.wikipedia.org/wiki/ISO_8601) format for timedelta) ```python from datetime import timedelta from pydantic import BaseModel class Model(BaseModel): td: timedelta = None m = Model(td='P3DT12H30M5S') print(m.model_dump()) #> {'td': datetime.timedelta(days=3, seconds=45005)} ``` ## Number Types Pydantic supports the following numeric types from the Python standard library: ### int - Pydantic uses `int(v)` to coerce types to an `int`; see [Data conversion](../../concepts/models/#data-conversion) for details on loss of information during data conversion. ### float - Pydantic uses `float(v)` to coerce values to floats. ### enum.IntEnum - Validation: Pydantic checks that the value is a valid `IntEnum` instance.
- Validation for subclass of `enum.IntEnum`: checks that the value is a valid member of the integer enum; see [Enums and Choices](#enum) for more details. ### decimal.Decimal - Validation: Pydantic attempts to convert the value to a string, then passes the string to `Decimal(v)`. - Serialization: Pydantic serializes Decimal types as strings. You can use a custom serializer to override this behavior if desired. For example: ```python from decimal import Decimal from typing import Annotated from pydantic import BaseModel, PlainSerializer class Model(BaseModel): x: Decimal y: Annotated[ Decimal, PlainSerializer( lambda x: float(x), return_type=float, when_used='json' ), ] my_model = Model(x=Decimal('1.1'), y=Decimal('2.1')) print(my_model.model_dump()) # (1)! #> {'x': Decimal('1.1'), 'y': Decimal('2.1')} print(my_model.model_dump(mode='json')) # (2)! #> {'x': '1.1', 'y': 2.1} print(my_model.model_dump_json()) # (3)! #> {"x":"1.1","y":2.1} ``` 1. Using model_dump, both `x` and `y` remain instances of the `Decimal` type. 1. Using model_dump with `mode='json'`, `x` is serialized as a `string`, and `y` is serialized as a `float` because of the custom serializer applied. 1. Using model_dump_json, `x` is serialized as a `string`, and `y` is serialized as a `float` because of the custom serializer applied. ### complex - Validation: Pydantic supports `complex` types or `str` values that can be converted to a `complex` type. - Serialization: Pydantic serializes complex types as strings. ### fractions.Fraction - Validation: Pydantic attempts to convert the value to a `Fraction` using `Fraction(v)`. - Serialization: Pydantic serializes Fraction types as strings. ## Enum Pydantic uses Python's standard enum classes to define choices. `enum.Enum` checks that the value is a valid `Enum` instance. Subclass of `enum.Enum` checks that the value is a valid member of the enum. ```python from enum import Enum, IntEnum from pydantic import BaseModel, ValidationError class FruitEnum(str, Enum): pear = 'pear' banana = 'banana' class ToolEnum(IntEnum): spanner = 1 wrench = 2 class CookingModel(BaseModel): fruit: FruitEnum = FruitEnum.pear tool: ToolEnum = ToolEnum.spanner print(CookingModel()) #> fruit=<FruitEnum.pear: 'pear'> tool=<ToolEnum.spanner: 1> print(CookingModel(tool=2, fruit='banana')) #> fruit=<FruitEnum.banana: 'banana'> tool=<ToolEnum.wrench: 2> try: CookingModel(fruit='other') except ValidationError as e: print(e) """ 1 validation error for CookingModel fruit Input should be 'pear' or 'banana' [type=enum, input_value='other', input_type=str] """ ``` ## Lists and Tuples ### list Allows list, tuple, set, frozenset, deque, or generators and casts to a list. When a generic parameter is provided, the appropriate validation is applied to all items of the list. ```python from typing import Optional from pydantic import BaseModel class Model(BaseModel): simple_list: Optional[list] = None list_of_ints: Optional[list[int]] = None print(Model(simple_list=['1', '2', '3']).simple_list) #> ['1', '2', '3'] print(Model(list_of_ints=['1', '2', '3']).list_of_ints) #> [1, 2, 3] ``` ```python from pydantic import BaseModel class Model(BaseModel): simple_list: list | None = None list_of_ints: list[int] | None = None print(Model(simple_list=['1', '2', '3']).simple_list) #> ['1', '2', '3'] print(Model(list_of_ints=['1', '2', '3']).list_of_ints) #> [1, 2, 3] ``` ### tuple Allows list, tuple, set, frozenset, deque, or generators and casts to a tuple. When generic parameters are provided, the appropriate validation is applied to the respective items of the tuple. ### typing.Tuple Handled the same as `tuple` above.
```python from typing import Optional from pydantic import BaseModel class Model(BaseModel): simple_tuple: Optional[tuple] = None tuple_of_different_types: Optional[tuple[int, float, bool]] = None print(Model(simple_tuple=[1, 2, 3, 4]).simple_tuple) #> (1, 2, 3, 4) print(Model(tuple_of_different_types=[3, 2, 1]).tuple_of_different_types) #> (3, 2.0, True) ``` ```python from pydantic import BaseModel class Model(BaseModel): simple_tuple: tuple | None = None tuple_of_different_types: tuple[int, float, bool] | None = None print(Model(simple_tuple=[1, 2, 3, 4]).simple_tuple) #> (1, 2, 3, 4) print(Model(tuple_of_different_types=[3, 2, 1]).tuple_of_different_types) #> (3, 2.0, True) ``` ### typing.NamedTuple Subclasses of typing.NamedTuple are similar to `tuple`, but create instances of the given `namedtuple` class. Subclasses of collections.namedtuple are similar to subclasses of typing.NamedTuple, but since field types are not specified, all fields are treated as having type Any. ```python from typing import NamedTuple from pydantic import BaseModel, ValidationError class Point(NamedTuple): x: int y: int class Model(BaseModel): p: Point try: Model(p=('1.3', '2')) except ValidationError as e: print(e) """ 1 validation error for Model p.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='1.3', input_type=str] """ ``` ## Deque ### deque Allows list, tuple, set, frozenset, deque, or generators and casts to a deque. When generic parameters are provided, the appropriate validation is applied to the respective items of the `deque`. ### typing.Deque Handled the same as `deque` above. ```python from typing import Deque, Optional from pydantic import BaseModel class Model(BaseModel): deque: Optional[Deque[int]] = None print(Model(deque=[1, 2, 3]).deque) #> deque([1, 2, 3]) ``` ```python from typing import Deque from pydantic import BaseModel class Model(BaseModel): deque: Deque[int] | None = None print(Model(deque=[1, 2, 3]).deque) #> deque([1, 2, 3]) ``` ## Sets ### set Allows list, tuple, set, frozenset, deque, or generators and casts to a set. When a generic parameter is provided, the appropriate validation is applied to all items of the set. ### typing.Set Handled the same as `set` above. ```python from typing import Optional, Set from pydantic import BaseModel class Model(BaseModel): simple_set: Optional[set] = None set_of_ints: Optional[Set[int]] = None print(Model(simple_set={'1', '2', '3'}).simple_set) #> {'1', '2', '3'} print(Model(simple_set=['1', '2', '3']).simple_set) #> {'1', '2', '3'} print(Model(set_of_ints=['1', '2', '3']).set_of_ints) #> {1, 2, 3} ``` ```python from pydantic import BaseModel class Model(BaseModel): simple_set: set | None = None set_of_ints: set[int] | None = None print(Model(simple_set={'1', '2', '3'}).simple_set) #> {'1', '2', '3'} print(Model(simple_set=['1', '2', '3']).simple_set) #> {'1', '2', '3'} print(Model(set_of_ints=['1', '2', '3']).set_of_ints) #> {1, 2, 3} ``` ### frozenset Allows list, tuple, set, frozenset, deque, or generators and casts to a frozenset. When a generic parameter is provided, the appropriate validation is applied to all items of the frozen set. ### typing.FrozenSet Handled the same as `frozenset` above.
```python from typing import FrozenSet, Optional from pydantic import BaseModel class Model(BaseModel): simple_frozenset: Optional[frozenset] = None frozenset_of_ints: Optional[FrozenSet[int]] = None m1 = Model(simple_frozenset=['1', '2', '3']) print(type(m1.simple_frozenset)) #> <class 'frozenset'> print(sorted(m1.simple_frozenset)) #> ['1', '2', '3'] m2 = Model(frozenset_of_ints=['1', '2', '3']) print(type(m2.frozenset_of_ints)) #> <class 'frozenset'> print(sorted(m2.frozenset_of_ints)) #> [1, 2, 3] ``` ```python from pydantic import BaseModel class Model(BaseModel): simple_frozenset: frozenset | None = None frozenset_of_ints: frozenset[int] | None = None m1 = Model(simple_frozenset=['1', '2', '3']) print(type(m1.simple_frozenset)) #> <class 'frozenset'> print(sorted(m1.simple_frozenset)) #> ['1', '2', '3'] m2 = Model(frozenset_of_ints=['1', '2', '3']) print(type(m2.frozenset_of_ints)) #> <class 'frozenset'> print(sorted(m2.frozenset_of_ints)) #> [1, 2, 3] ``` ## Other Iterables ### typing.Sequence This is intended for use when the provided value should meet the requirements of the `Sequence` ABC, and it is desirable to do eager validation of the values in the container. Note that when validation must be performed on the values of the container, the type of the container may not be preserved since validation may end up replacing values. We guarantee that the validated value will be a valid typing.Sequence, but it may have a different type than was provided (generally, it will become a `list`). ### typing.Iterable This is intended for use when the provided value may be an iterable that shouldn't be consumed. See [Infinite Generators](#infinite-generators) below for more detail on parsing and validation. Similar to typing.Sequence, we guarantee that the validated result will be a valid typing.Iterable, but it may have a different type than was provided. In particular, even if a non-generator type such as a `list` is provided, the post-validation value of a field of type typing.Iterable will be a generator. Here is a simple example using typing.Sequence: ```python from typing import Sequence from pydantic import BaseModel class Model(BaseModel): sequence_of_ints: Sequence[int] = None print(Model(sequence_of_ints=[1, 2, 3, 4]).sequence_of_ints) #> [1, 2, 3, 4] print(Model(sequence_of_ints=(1, 2, 3, 4)).sequence_of_ints) #> (1, 2, 3, 4) ``` ```python from collections.abc import Sequence from pydantic import BaseModel class Model(BaseModel): sequence_of_ints: Sequence[int] = None print(Model(sequence_of_ints=[1, 2, 3, 4]).sequence_of_ints) #> [1, 2, 3, 4] print(Model(sequence_of_ints=(1, 2, 3, 4)).sequence_of_ints) #> (1, 2, 3, 4) ``` ### Infinite Generators If you have a generator you want to validate, you can still use `Sequence` as described above. In that case, the generator will be consumed and stored on the model as a list and its values will be validated against the type parameter of the `Sequence` (e.g. `int` in `Sequence[int]`). However, if you have a generator that you *don't* want to be eagerly consumed (e.g.
an infinite generator or a remote data loader), you can use a field of type Iterable: ```python from typing import Iterable from pydantic import BaseModel class Model(BaseModel): infinite: Iterable[int] def infinite_ints(): i = 0 while True: yield i i += 1 m = Model(infinite=infinite_ints()) print(m) """ infinite=ValidatorIterator(index=0, schema=Some(Int(IntValidator { strict: false }))) """ for i in m.infinite: print(i) #> 0 #> 1 #> 2 #> 3 #> 4 #> 5 #> 6 #> 7 #> 8 #> 9 #> 10 if i == 10: break ``` ```python from collections.abc import Iterable from pydantic import BaseModel class Model(BaseModel): infinite: Iterable[int] def infinite_ints(): i = 0 while True: yield i i += 1 m = Model(infinite=infinite_ints()) print(m) """ infinite=ValidatorIterator(index=0, schema=Some(Int(IntValidator { strict: false }))) """ for i in m.infinite: print(i) #> 0 #> 1 #> 2 #> 3 #> 4 #> 5 #> 6 #> 7 #> 8 #> 9 #> 10 if i == 10: break ``` Warning During initial validation, `Iterable` fields only perform a simple check that the provided argument is iterable. To prevent it from being consumed, no validation of the yielded values is performed eagerly. Though the yielded values are not validated eagerly, they are still validated when yielded, and will raise a `ValidationError` at yield time when appropriate: ```python from typing import Iterable from pydantic import BaseModel, ValidationError class Model(BaseModel): int_iterator: Iterable[int] def my_iterator(): yield 13 yield '27' yield 'a' m = Model(int_iterator=my_iterator()) print(next(m.int_iterator)) #> 13 print(next(m.int_iterator)) #> 27 try: next(m.int_iterator) except ValidationError as e: print(e) """ 1 validation error for ValidatorIterator 2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ ``` ```python from collections.abc import Iterable from pydantic import BaseModel, ValidationError class Model(BaseModel): int_iterator: Iterable[int] def my_iterator(): yield 13 yield '27' yield 'a' m = Model(int_iterator=my_iterator()) print(next(m.int_iterator)) #> 13 print(next(m.int_iterator)) #> 27 try: next(m.int_iterator) except ValidationError as e: print(e) """ 1 validation error for ValidatorIterator 2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] """ ``` ## Mapping Types ### dict `dict(v)` is used to attempt to convert a dictionary. ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: dict[str, int] m = Model(x={'foo': 1}) print(m.model_dump()) #> {'x': {'foo': 1}} try: Model(x='test') except ValidationError as e: print(e) """ 1 validation error for Model x Input should be a valid dictionary [type=dict_type, input_value='test', input_type=str] """ ``` ### TypedDict Note This is a new feature of the Python standard library as of Python 3.8. Because of limitations in typing.TypedDict before 3.12, the [typing-extensions](https://pypi.org/project/typing-extensions/) package is required for Python \<3.12. You'll need to import `TypedDict` from `typing_extensions` instead of `typing` and will get a build time error if you don't. TypedDict declares a dictionary type that expects all of its instances to have a certain set of keys, where each key is associated with a value of a consistent type. It is the same as `dict`, but Pydantic will validate the dictionary since keys are annotated.
```python from typing_extensions import TypedDict from pydantic import TypeAdapter, ValidationError class User(TypedDict): name: str id: int ta = TypeAdapter(User) print(ta.validate_python({'name': 'foo', 'id': 1})) #> {'name': 'foo', 'id': 1} try: ta.validate_python({'name': 'foo'}) except ValidationError as e: print(e) """ 1 validation error for User id Field required [type=missing, input_value={'name': 'foo'}, input_type=dict] """ ``` ```python from typing import TypedDict from pydantic import TypeAdapter, ValidationError class User(TypedDict): name: str id: int ta = TypeAdapter(User) print(ta.validate_python({'name': 'foo', 'id': 1})) #> {'name': 'foo', 'id': 1} try: ta.validate_python({'name': 'foo'}) except ValidationError as e: print(e) """ 1 validation error for User id Field required [type=missing, input_value={'name': 'foo'}, input_type=dict] """ ``` You can define `__pydantic_config__` to change the model inherited from TypedDict. See the ConfigDict API reference for more details. ```python from typing import Optional from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, ValidationError # `total=False` means keys are non-required class UserIdentity(TypedDict, total=False): name: Optional[str] surname: str class User(TypedDict): __pydantic_config__ = ConfigDict(extra='forbid') identity: UserIdentity age: int ta = TypeAdapter(User) print( ta.validate_python( {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37} ) ) #> {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37} print( ta.validate_python( {'identity': {'name': None, 'surname': 'John'}, 'age': 37} ) ) #> {'identity': {'name': None, 'surname': 'John'}, 'age': 37} print(ta.validate_python({'identity': {}, 'age': 37})) #> {'identity': {}, 'age': 37} try: ta.validate_python( {'identity': {'name': ['Smith'], 'surname': 'John'}, 'age': 24} ) except ValidationError as e: print(e) """ 1 validation error for User identity.name Input should be a valid string [type=string_type, input_value=['Smith'], input_type=list] """ try: ta.validate_python( { 'identity': {'name': 'Smith', 'surname': 'John'}, 'age': '37', 'email': 'john.smith@me.com', } ) except ValidationError as e: print(e) """ 1 validation error for User email Extra inputs are not permitted [type=extra_forbidden, input_value='john.smith@me.com', input_type=str] """ ``` ```python from typing_extensions import TypedDict from pydantic import ConfigDict, TypeAdapter, ValidationError # `total=False` means keys are non-required class UserIdentity(TypedDict, total=False): name: str | None surname: str class User(TypedDict): __pydantic_config__ = ConfigDict(extra='forbid') identity: UserIdentity age: int ta = TypeAdapter(User) print( ta.validate_python( {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37} ) ) #> {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37} print( ta.validate_python( {'identity': {'name': None, 'surname': 'John'}, 'age': 37} ) ) #> {'identity': {'name': None, 'surname': 'John'}, 'age': 37} print(ta.validate_python({'identity': {}, 'age': 37})) #> {'identity': {}, 'age': 37} try: ta.validate_python( {'identity': {'name': ['Smith'], 'surname': 'John'}, 'age': 24} ) except ValidationError as e: print(e) """ 1 validation error for User identity.name Input should be a valid string [type=string_type, input_value=['Smith'], input_type=list] """ try: ta.validate_python( { 'identity': {'name': 'Smith', 'surname': 'John'}, 'age': '37', 'email': 'john.smith@me.com', } ) except ValidationError as e: print(e) 
""" 1 validation error for User email Extra inputs are not permitted [type=extra_forbidden, input_value='john.smith@me.com', input_type=str] """ ``` ```python from typing import TypedDict from pydantic import ConfigDict, TypeAdapter, ValidationError # `total=False` means keys are non-required class UserIdentity(TypedDict, total=False): name: str | None surname: str class User(TypedDict): __pydantic_config__ = ConfigDict(extra='forbid') identity: UserIdentity age: int ta = TypeAdapter(User) print( ta.validate_python( {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37} ) ) #> {'identity': {'name': 'Smith', 'surname': 'John'}, 'age': 37} print( ta.validate_python( {'identity': {'name': None, 'surname': 'John'}, 'age': 37} ) ) #> {'identity': {'name': None, 'surname': 'John'}, 'age': 37} print(ta.validate_python({'identity': {}, 'age': 37})) #> {'identity': {}, 'age': 37} try: ta.validate_python( {'identity': {'name': ['Smith'], 'surname': 'John'}, 'age': 24} ) except ValidationError as e: print(e) """ 1 validation error for User identity.name Input should be a valid string [type=string_type, input_value=['Smith'], input_type=list] """ try: ta.validate_python( { 'identity': {'name': 'Smith', 'surname': 'John'}, 'age': '37', 'email': 'john.smith@me.com', } ) except ValidationError as e: print(e) """ 1 validation error for User email Extra inputs are not permitted [type=extra_forbidden, input_value='john.smith@me.com', input_type=str] """ ``` ## Callable See below for more detail on parsing and validation Fields can also be of type Callable: ```python from typing import Callable from pydantic import BaseModel class Foo(BaseModel): callback: Callable[[int], int] m = Foo(callback=lambda x: x) print(m) #> callback= at 0x0123456789ab> ``` ```python from collections.abc import Callable from pydantic import BaseModel class Foo(BaseModel): callback: Callable[[int], int] m = Foo(callback=lambda x: x) print(m) #> callback= at 0x0123456789ab> ``` Warning Callable fields only perform a simple check that the argument is callable; no validation of arguments, their types, or the return type is performed. ## IP Address Types - ipaddress.IPv4Address: Uses the type itself for validation by passing the value to `IPv4Address(v)`. - ipaddress.IPv4Interface: Uses the type itself for validation by passing the value to `IPv4Address(v)`. - ipaddress.IPv4Network: Uses the type itself for validation by passing the value to `IPv4Network(v)`. - ipaddress.IPv6Address: Uses the type itself for validation by passing the value to `IPv6Address(v)`. - ipaddress.IPv6Interface: Uses the type itself for validation by passing the value to `IPv6Interface(v)`. - ipaddress.IPv6Network: Uses the type itself for validation by passing the value to `IPv6Network(v)`. See [Network Types](../networks/) for other custom IP address types. ## UUID For UUID, Pydantic tries to use the type itself for validation by passing the value to `UUID(v)`. There's a fallback to `UUID(bytes=v)` for `bytes` and `bytearray`. In case you want to constrain the UUID version, you can check the following types: - UUID1: requires UUID version 1. - UUID3: requires UUID version 3. - UUID4: requires UUID version 4. - UUID5: requires UUID version 5. ## Union Pydantic has extensive support for union validation, both typing.Union and Python 3.10's pipe syntax (`A | B`) are supported. Read more in the [`Unions`](../../concepts/unions/) section of the concepts docs. 
## type Pydantic supports the use of `type[T]` to specify that a field may only accept classes (not instances) that are subclasses of `T`. ```python from pydantic import BaseModel, ValidationError class Foo: pass class Bar(Foo): pass class Other: pass class SimpleModel(BaseModel): just_subclasses: type[Foo] SimpleModel(just_subclasses=Foo) SimpleModel(just_subclasses=Bar) try: SimpleModel(just_subclasses=Other) except ValidationError as e: print(e) """ 1 validation error for SimpleModel just_subclasses Input should be a subclass of Foo [type=is_subclass_of, input_value=<class '__main__.Other'>, input_type=type] """ ``` You may also use `type` to specify that any class is allowed. ```python from pydantic import BaseModel, ValidationError class Foo: pass class LenientSimpleModel(BaseModel): any_class_goes: type LenientSimpleModel(any_class_goes=int) LenientSimpleModel(any_class_goes=Foo) try: LenientSimpleModel(any_class_goes=Foo()) except ValidationError as e: print(e) """ 1 validation error for LenientSimpleModel any_class_goes Input should be a type [type=is_type, input_value=<__main__.Foo object at 0x0123456789ab>, input_type=Foo] """ ``` ## typing.TypeVar TypeVar is supported either unconstrained, constrained, or with a bound. ```python from typing import TypeVar from pydantic import BaseModel Foobar = TypeVar('Foobar') BoundFloat = TypeVar('BoundFloat', bound=float) IntStr = TypeVar('IntStr', int, str) class Model(BaseModel): a: Foobar # equivalent of ": Any" b: BoundFloat # equivalent of ": float" c: IntStr # equivalent of ": Union[int, str]" print(Model(a=[1], b=4.2, c='x')) #> a=[1] b=4.2 c='x' # a may be None print(Model(a=None, b=1, c=1)) #> a=None b=1.0 c=1 ``` ## None Types None, `type(None)`, or `Literal[None]` are all equivalent according to [the typing specification](https://typing.readthedocs.io/en/latest/spec/special-types.html#none). Allows only the `None` value. ## Strings - str: Strings are accepted as-is. - bytes and bytearray are converted using the decode() method. - Enums inheriting from str are converted using the value attribute. All other types cause an error. Strings aren't Sequences While instances of `str` are technically valid instances of the `Sequence[str]` protocol from a type-checker's point of view, this is frequently not intended and is a common source of bugs.
As a result, Pydantic raises a `ValidationError` if you attempt to pass a `str` or `bytes` instance into a field of type `Sequence[str]` or `Sequence[bytes]`: ```python from typing import Optional, Sequence from pydantic import BaseModel, ValidationError class Model(BaseModel): sequence_of_strs: Optional[Sequence[str]] = None sequence_of_bytes: Optional[Sequence[bytes]] = None print(Model(sequence_of_strs=['a', 'bc']).sequence_of_strs) #> ['a', 'bc'] print(Model(sequence_of_strs=('a', 'bc')).sequence_of_strs) #> ('a', 'bc') print(Model(sequence_of_bytes=[b'a', b'bc']).sequence_of_bytes) #> [b'a', b'bc'] print(Model(sequence_of_bytes=(b'a', b'bc')).sequence_of_bytes) #> (b'a', b'bc') try: Model(sequence_of_strs='abc') except ValidationError as e: print(e) """ 1 validation error for Model sequence_of_strs 'str' instances are not allowed as a Sequence value [type=sequence_str, input_value='abc', input_type=str] """ try: Model(sequence_of_bytes=b'abc') except ValidationError as e: print(e) """ 1 validation error for Model sequence_of_bytes 'bytes' instances are not allowed as a Sequence value [type=sequence_str, input_value=b'abc', input_type=bytes] """ ``` ```python from collections.abc import Sequence from pydantic import BaseModel, ValidationError class Model(BaseModel): sequence_of_strs: Sequence[str] | None = None sequence_of_bytes: Sequence[bytes] | None = None print(Model(sequence_of_strs=['a', 'bc']).sequence_of_strs) #> ['a', 'bc'] print(Model(sequence_of_strs=('a', 'bc')).sequence_of_strs) #> ('a', 'bc') print(Model(sequence_of_bytes=[b'a', b'bc']).sequence_of_bytes) #> [b'a', b'bc'] print(Model(sequence_of_bytes=(b'a', b'bc')).sequence_of_bytes) #> (b'a', b'bc') try: Model(sequence_of_strs='abc') except ValidationError as e: print(e) """ 1 validation error for Model sequence_of_strs 'str' instances are not allowed as a Sequence value [type=sequence_str, input_value='abc', input_type=str] """ try: Model(sequence_of_bytes=b'abc') except ValidationError as e: print(e) """ 1 validation error for Model sequence_of_bytes 'bytes' instances are not allowed as a Sequence value [type=sequence_str, input_value=b'abc', input_type=bytes] """ ``` ## Bytes bytes are accepted as-is. bytearray is converted using `bytes(v)`. `str` is converted using `v.encode()`. `int`, `float`, and `Decimal` are coerced using `str(v).encode()`. See [ByteSize](../types/#pydantic.types.ByteSize) for more details.
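A minimal sketch of these coercions (the model name is illustrative); the expected outputs follow directly from the rules above:

```python
from pydantic import BaseModel


class Payload(BaseModel):
    data: bytes


print(Payload(data=b'raw'))
#> data=b'raw'
print(Payload(data=bytearray(b'buf')))  # converted with `bytes(v)`
#> data=b'buf'
print(Payload(data='text'))  # converted with `v.encode()`
#> data=b'text'
print(Payload(data=123))  # coerced with `str(v).encode()`
#> data=b'123'
```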
## typing.Literal Pydantic supports the use of typing.Literal as a lightweight way to specify that a field may accept only specific literal values: ```python from typing import Literal from pydantic import BaseModel, ValidationError class Pie(BaseModel): flavor: Literal['apple', 'pumpkin'] Pie(flavor='apple') Pie(flavor='pumpkin') try: Pie(flavor='cherry') except ValidationError as e: print(str(e)) """ 1 validation error for Pie flavor Input should be 'apple' or 'pumpkin' [type=literal_error, input_value='cherry', input_type=str] """ ``` One benefit of this field type is that it can be used to check for equality with one or more specific values without needing to declare custom validators: ```python from typing import ClassVar, Literal, Union from pydantic import BaseModel, ValidationError class Cake(BaseModel): kind: Literal['cake'] required_utensils: ClassVar[list[str]] = ['fork', 'knife'] class IceCream(BaseModel): kind: Literal['icecream'] required_utensils: ClassVar[list[str]] = ['spoon'] class Meal(BaseModel): dessert: Union[Cake, IceCream] print(type(Meal(dessert={'kind': 'cake'}).dessert).__name__) #> Cake print(type(Meal(dessert={'kind': 'icecream'}).dessert).__name__) #> IceCream try: Meal(dessert={'kind': 'pie'}) except ValidationError as e: print(str(e)) """ 2 validation errors for Meal dessert.Cake.kind Input should be 'cake' [type=literal_error, input_value='pie', input_type=str] dessert.IceCream.kind Input should be 'icecream' [type=literal_error, input_value='pie', input_type=str] """ ``` ```python from typing import ClassVar, Literal from pydantic import BaseModel, ValidationError class Cake(BaseModel): kind: Literal['cake'] required_utensils: ClassVar[list[str]] = ['fork', 'knife'] class IceCream(BaseModel): kind: Literal['icecream'] required_utensils: ClassVar[list[str]] = ['spoon'] class Meal(BaseModel): dessert: Cake | IceCream print(type(Meal(dessert={'kind': 'cake'}).dessert).__name__) #> Cake print(type(Meal(dessert={'kind': 'icecream'}).dessert).__name__) #> IceCream try: Meal(dessert={'kind': 'pie'}) except ValidationError as e: print(str(e)) """ 2 validation errors for Meal dessert.Cake.kind Input should be 'cake' [type=literal_error, input_value='pie', input_type=str] dessert.IceCream.kind Input should be 'icecream' [type=literal_error, input_value='pie', input_type=str] """ ``` With proper ordering in an annotated `Union`, you can use this to parse types of decreasing specificity: ```python from typing import Literal, Optional, Union from pydantic import BaseModel class Dessert(BaseModel): kind: str class Pie(Dessert): kind: Literal['pie'] flavor: Optional[str] class ApplePie(Pie): flavor: Literal['apple'] class PumpkinPie(Pie): flavor: Literal['pumpkin'] class Meal(BaseModel): dessert: Union[ApplePie, PumpkinPie, Pie, Dessert] print(type(Meal(dessert={'kind': 'pie', 'flavor': 'apple'}).dessert).__name__) #> ApplePie print(type(Meal(dessert={'kind': 'pie', 'flavor': 'pumpkin'}).dessert).__name__) #> PumpkinPie print(type(Meal(dessert={'kind': 'pie'}).dessert).__name__) #> Dessert print(type(Meal(dessert={'kind': 'cake'}).dessert).__name__) #> Dessert ``` ```python from typing import Literal from pydantic import BaseModel class Dessert(BaseModel): kind: str class Pie(Dessert): kind: Literal['pie'] flavor: str | None class ApplePie(Pie): flavor: Literal['apple'] class PumpkinPie(Pie): flavor: Literal['pumpkin'] class Meal(BaseModel): dessert: ApplePie | PumpkinPie | Pie | Dessert print(type(Meal(dessert={'kind': 'pie', 'flavor': 'apple'}).dessert).__name__) #> 
ApplePie print(type(Meal(dessert={'kind': 'pie', 'flavor': 'pumpkin'}).dessert).__name__) #> PumpkinPie print(type(Meal(dessert={'kind': 'pie'}).dessert).__name__) #> Dessert print(type(Meal(dessert={'kind': 'cake'}).dessert).__name__) #> Dessert ``` ## typing.Any Allows any value, including `None`. ## typing.Hashable - From Python, supports any data that passes an `isinstance(v, Hashable)` check. - From JSON, first loads the data via an `Any` validator, then checks if the data is hashable with `isinstance(v, Hashable)`. ## typing.Annotated Allows wrapping another type with arbitrary metadata, as per [PEP-593](https://www.python.org/dev/peps/pep-0593/). The `Annotated` hint may contain a single call to the [`Field` function](../../concepts/types/#using-the-annotated-pattern), but otherwise the additional metadata is ignored and the root type is used. ## typing.Pattern Will cause the input value to be passed to `re.compile(v)` to create a regular expression pattern. ## pathlib.Path Simply uses the type itself for validation by passing the value to `Path(v)`. Bases: `Generic[T]` Usage Documentation [`TypeAdapter`](../../concepts/type_adapter/) Type adapters provide a flexible way to perform validation and serialization based on a Python type. A `TypeAdapter` instance exposes some of the functionality from `BaseModel` instance methods for types that do not have such methods (such as dataclasses, primitive types, and more). **Note:** `TypeAdapter` instances are not types, and cannot be used as type annotations for fields. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `type` | `Any` | The type associated with the TypeAdapter. | *required* | | `config` | `ConfigDict | None` | Configuration for the TypeAdapter, should be a dictionary conforming to ConfigDict. Note You cannot provide a configuration when instantiating a TypeAdapter if the type you're using has its own config that cannot be overridden (ex: BaseModel, TypedDict, and dataclass). A type-adapter-config-unused error will be raised in this case. | `None` | | `_parent_depth` | `int` | Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema building, by looking for the globals and locals of this frame. Defaults to 2, which will result in the frame where the TypeAdapter was instantiated. Note This parameter is named with an underscore to suggest its private nature and discourage use. It may be deprecated in a minor version, so we only recommend using it if you're comfortable with potential change in behavior/support. Its default value is 2 because internally, the TypeAdapter class makes another call to fetch the frame. | `2` | | `module` | `str | None` | The module that is passed to the plugin, if provided. | `None` | Attributes: | Name | Type | Description | | --- | --- | --- | | `core_schema` | `CoreSchema` | The core schema for the type. | | `validator` | `SchemaValidator | PluggableSchemaValidator` | The schema validator for the type. | | `serializer` | `SchemaSerializer` | The schema serializer for the type. | | `pydantic_complete` | `bool` | Whether the core schema for the type is successfully built. | Compatibility with `mypy` Depending on the type used, `mypy` might raise an error when instantiating a `TypeAdapter`.
As a workaround, you can explicitly annotate your variable: ```py from typing import Union from pydantic import TypeAdapter ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int]) # type: ignore[arg-type] ``` Namespace management nuances and implementation details Here, we collect some notes on namespace management, and subtle differences from `BaseModel`: `BaseModel` uses its own `__module__` to find out where it was defined and then looks for symbols to resolve forward references in those globals. On the other hand, `TypeAdapter` can be initialized with arbitrary objects, which may not be types and thus do not have a `__module__` available. So instead we look at the globals in our parent stack frame. It is expected that the `ns_resolver` passed to this function will have the correct namespace for the type we're adapting. See the source code for `TypeAdapter.__init__` and `TypeAdapter.rebuild` for various ways to construct this namespace. This works for the case where this function is called in a module that has the target of forward references in its scope, but does not always work for more complex cases. For example, take the following: a.py ```python IntList = list[int] OuterDict = dict[str, 'IntList'] ``` b.py ```python from a import OuterDict from pydantic import TypeAdapter IntList = int # replaces the symbol the forward reference is looking for v = TypeAdapter(OuterDict) v({'x': 1}) # should fail but doesn't ``` If `OuterDict` were a `BaseModel`, this would work because it would resolve the forward reference within the `a.py` namespace. But `TypeAdapter(OuterDict)` can't determine what module `OuterDict` came from. In other words, the assumption that *all* forward references exist in the module we are being called from is not technically always true. Although most of the time it is and it works fine for recursive models and such, `BaseModel`'s behavior isn't perfect either and *can* break in similar ways, so there is no right or wrong between the two. But at the very least this behavior is *subtly* different from `BaseModel`'s. Source code in `pydantic/type_adapter.py` ```python def __init__( self, type: Any, *, config: ConfigDict | None = None, _parent_depth: int = 2, module: str | None = None, ) -> None: if _type_has_config(type) and config is not None: raise PydanticUserError( 'Cannot use `config` when the type is a BaseModel, dataclass or TypedDict.' 
' These types can have their own config and setting the config via the `config`' ' parameter to TypeAdapter will not override it, thus the `config` you passed to' ' TypeAdapter becomes meaningless, which is probably not what you want.', code='type-adapter-config-unused', ) self._type = type self._config = config self._parent_depth = _parent_depth self.pydantic_complete = False parent_frame = self._fetch_parent_frame() if parent_frame is not None: globalns = parent_frame.f_globals # Do not provide a local ns if the type adapter happens to be instantiated at the module level: localns = parent_frame.f_locals if parent_frame.f_locals is not globalns else {} else: globalns = {} localns = {} self._module_name = module or cast(str, globalns.get('__name__', '')) self._init_core_attrs( ns_resolver=_namespace_utils.NsResolver( namespaces_tuple=_namespace_utils.NamespacesTuple(locals=localns, globals=globalns), parent_namespace=localns, ), force=False, ) ``` ## rebuild ```python rebuild( *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None ) -> bool | None ``` Try to rebuild the pydantic-core schema for the adapter's type. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `force` | `bool` | Whether to force the rebuilding of the type adapter's schema, defaults to False. | `False` | | `raise_errors` | `bool` | Whether to raise errors, defaults to True. | `True` | | `_parent_namespace_depth` | `int` | Depth at which to search for the parent frame. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called. | `2` | | `_types_namespace` | `MappingNamespace | None` | An explicit types namespace to use, instead of using the local namespace from the parent frame. Defaults to None. | `None` | Returns: | Type | Description | | --- | --- | | `bool | None` | Returns None if the schema is already "complete" and rebuilding was not required. | | `bool | None` | If rebuilding was required, returns True if rebuilding was successful, otherwise False. | Source code in `pydantic/type_adapter.py` ```python def rebuild( self, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: _namespace_utils.MappingNamespace | None = None, ) -> bool | None: """Try to rebuild the pydantic-core schema for the adapter's type. This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails. Args: force: Whether to force the rebuilding of the type adapter's schema, defaults to `False`. raise_errors: Whether to raise errors, defaults to `True`. _parent_namespace_depth: Depth at which to search for the [parent frame][frame-objects]. This frame is used when resolving forward annotations during schema rebuilding, by looking for the locals of this frame. Defaults to 2, which will result in the frame where the method was called. _types_namespace: An explicit types namespace to use, instead of using the local namespace from the parent frame. Defaults to `None`. Returns: Returns `None` if the schema is already "complete" and rebuilding was not required. 
If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`. """ if not force and self.pydantic_complete: return None if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} # we have to manually fetch globals here because there's no type on the stack of the NsResolver # and so we skip the globalns = get_module_ns_of(typ) call that would normally happen globalns = sys._getframe(max(_parent_namespace_depth - 1, 1)).f_globals ns_resolver = _namespace_utils.NsResolver( namespaces_tuple=_namespace_utils.NamespacesTuple(locals=rebuild_ns, globals=globalns), parent_namespace=rebuild_ns, ) return self._init_core_attrs(ns_resolver=ns_resolver, force=True, raise_errors=raise_errors) ``` ## validate_python ```python validate_python( object: Any, /, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: ( bool | Literal["off", "on", "trailing-strings"] ) = False, by_alias: bool | None = None, by_name: bool | None = None, ) -> T ``` Validate a Python object against the model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `object` | `Any` | The Python object to validate against the model. | *required* | | `strict` | `bool | None` | Whether to strictly check types. | `None` | | `from_attributes` | `bool | None` | Whether to extract data from object attributes. | `None` | | `context` | `dict[str, Any] | None` | Additional context to pass to the validator. | `None` | | `experimental_allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Experimental whether to enable partial validation, e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. | `False` | | `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` | | `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` | Note When using `TypeAdapter` with a Pydantic `dataclass`, the use of the `from_attributes` argument is not supported. Returns: | Type | Description | | --- | --- | | `T` | The validated object. | Source code in `pydantic/type_adapter.py` ```python def validate_python( self, object: Any, /, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False, by_alias: bool | None = None, by_name: bool | None = None, ) -> T: """Validate a Python object against the model. Args: object: The Python object to validate against the model. strict: Whether to strictly check types. from_attributes: Whether to extract data from object attributes. context: Additional context to pass to the validator. experimental_allow_partial: **Experimental** whether to enable [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. 
by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. !!! note When using `TypeAdapter` with a Pydantic `dataclass`, the use of the `from_attributes` argument is not supported. Returns: The validated object. """ if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return self.validator.validate_python( object, strict=strict, from_attributes=from_attributes, context=context, allow_partial=experimental_allow_partial, by_alias=by_alias, by_name=by_name, ) ``` ## validate_json ```python validate_json( data: str | bytes | bytearray, /, *, strict: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: ( bool | Literal["off", "on", "trailing-strings"] ) = False, by_alias: bool | None = None, by_name: bool | None = None, ) -> T ``` Usage Documentation [JSON Parsing](../../concepts/json/#json-parsing) Validate a JSON string or bytes against the model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `data` | `str | bytes | bytearray` | The JSON data to validate against the model. | *required* | | `strict` | `bool | None` | Whether to strictly check types. | `None` | | `context` | `dict[str, Any] | None` | Additional context to use during validation. | `None` | | `experimental_allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Experimental whether to enable partial validation, e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. | `False` | | `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` | | `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` | Returns: | Type | Description | | --- | --- | | `T` | The validated object. | Source code in `pydantic/type_adapter.py` ```python def validate_json( self, data: str | bytes | bytearray, /, *, strict: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False, by_alias: bool | None = None, by_name: bool | None = None, ) -> T: """!!! abstract "Usage Documentation" [JSON Parsing](../concepts/json.md#json-parsing) Validate a JSON string or bytes against the model. Args: data: The JSON data to validate against the model. strict: Whether to strictly check types. context: Additional context to use during validation. experimental_allow_partial: **Experimental** whether to enable [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Returns: The validated object. 
""" if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return self.validator.validate_json( data, strict=strict, context=context, allow_partial=experimental_allow_partial, by_alias=by_alias, by_name=by_name, ) ``` ## validate_strings ```python validate_strings( obj: Any, /, *, strict: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: ( bool | Literal["off", "on", "trailing-strings"] ) = False, by_alias: bool | None = None, by_name: bool | None = None, ) -> T ``` Validate object contains string data against the model. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `obj` | `Any` | The object contains string data to validate. | *required* | | `strict` | `bool | None` | Whether to strictly check types. | `None` | | `context` | `dict[str, Any] | None` | Additional context to use during validation. | `None` | | `experimental_allow_partial` | `bool | Literal['off', 'on', 'trailing-strings']` | Experimental whether to enable partial validation, e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. | `False` | | `by_alias` | `bool | None` | Whether to use the field's alias when validating against the provided input data. | `None` | | `by_name` | `bool | None` | Whether to use the field's name when validating against the provided input data. | `None` | Returns: | Type | Description | | --- | --- | | `T` | The validated object. | Source code in `pydantic/type_adapter.py` ```python def validate_strings( self, obj: Any, /, *, strict: bool | None = None, context: dict[str, Any] | None = None, experimental_allow_partial: bool | Literal['off', 'on', 'trailing-strings'] = False, by_alias: bool | None = None, by_name: bool | None = None, ) -> T: """Validate object contains string data against the model. Args: obj: The object contains string data to validate. strict: Whether to strictly check types. context: Additional context to use during validation. experimental_allow_partial: **Experimental** whether to enable [partial validation](../concepts/experimental.md#partial-validation), e.g. to process streams. * False / 'off': Default behavior, no partial validation. * True / 'on': Enable partial validation. * 'trailing-strings': Enable partial validation and allow trailing strings in the input. by_alias: Whether to use the field's alias when validating against the provided input data. by_name: Whether to use the field's name when validating against the provided input data. Returns: The validated object. """ if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return self.validator.validate_strings( obj, strict=strict, context=context, allow_partial=experimental_allow_partial, by_alias=by_alias, by_name=by_name, ) ``` ## get_default_value ```python get_default_value( *, strict: bool | None = None, context: dict[str, Any] | None = None ) -> Some[T] | None ``` Get the default value for the wrapped type. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether to strictly check types. | `None` | | `context` | `dict[str, Any] | None` | Additional context to pass to the validator. 
| `None` | Returns: | Type | Description | | --- | --- | | `Some[T] | None` | The default value wrapped in a Some if there is one or None if not. | Source code in `pydantic/type_adapter.py` ```python def get_default_value(self, *, strict: bool | None = None, context: dict[str, Any] | None = None) -> Some[T] | None: """Get the default value for the wrapped type. Args: strict: Whether to strictly check types. context: Additional context to pass to the validator. Returns: The default value wrapped in a `Some` if there is one or None if not. """ return self.validator.get_default_value(strict=strict, context=context) ``` ## dump_python ```python dump_python( instance: T, /, *, mode: Literal["json", "python"] = "python", include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, context: dict[str, Any] | None = None, ) -> Any ``` Dump an instance of the adapted type to a Python object. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `instance` | `T` | The Python object to serialize. | *required* | | `mode` | `Literal['json', 'python']` | The output format. | `'python'` | | `include` | `IncEx | None` | Fields to include in the output. | `None` | | `exclude` | `IncEx | None` | Fields to exclude from the output. | `None` | | `by_alias` | `bool | None` | Whether to use alias names for field names. | `None` | | `exclude_unset` | `bool` | Whether to exclude unset fields. | `False` | | `exclude_defaults` | `bool` | Whether to exclude fields with default values. | `False` | | `exclude_none` | `bool` | Whether to exclude fields with None values. | `False` | | `round_trip` | `bool` | Whether to output the serialized data in a way that is compatible with deserialization. | `False` | | `warnings` | `bool | Literal['none', 'warn', 'error']` | How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | `True` | | `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised. | `None` | | `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` | | `context` | `dict[str, Any] | None` | Additional context to pass to the serializer. | `None` | Returns: | Type | Description | | --- | --- | | `Any` | The serialized object. | Source code in `pydantic/type_adapter.py` ```python def dump_python( self, instance: T, /, *, mode: Literal['json', 'python'] = 'python', include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, context: dict[str, Any] | None = None, ) -> Any: """Dump an instance of the adapted type to a Python object. Args: instance: The Python object to serialize. mode: The output format. include: Fields to include in the output. exclude: Fields to exclude from the output. by_alias: Whether to use alias names for field names. exclude_unset: Whether to exclude unset fields. 
exclude_defaults: Whether to exclude fields with default values. exclude_none: Whether to exclude fields with None values. round_trip: Whether to output the serialized data in a way that is compatible with deserialization. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. fallback: A function to call when an unknown value is encountered. If not provided, a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. context: Additional context to pass to the serializer. Returns: The serialized object. """ return self.serializer.to_python( instance, mode=mode, by_alias=by_alias, include=include, exclude=exclude, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, context=context, ) ``` ## dump_json ```python dump_json( instance: T, /, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, context: dict[str, Any] | None = None, ) -> bytes ``` Usage Documentation [JSON Serialization](../../concepts/json/#json-serialization) Serialize an instance of the adapted type to JSON. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `instance` | `T` | The instance to be serialized. | *required* | | `indent` | `int | None` | Number of spaces for JSON indentation. | `None` | | `include` | `IncEx | None` | Fields to include. | `None` | | `exclude` | `IncEx | None` | Fields to exclude. | `None` | | `by_alias` | `bool | None` | Whether to use alias names for field names. | `None` | | `exclude_unset` | `bool` | Whether to exclude unset fields. | `False` | | `exclude_defaults` | `bool` | Whether to exclude fields with default values. | `False` | | `exclude_none` | `bool` | Whether to exclude fields with a value of None. | `False` | | `round_trip` | `bool` | Whether to serialize and deserialize the instance to ensure round-tripping. | `False` | | `warnings` | `bool | Literal['none', 'warn', 'error']` | How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError. | `True` | | `fallback` | `Callable[[Any], Any] | None` | A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised. | `None` | | `serialize_as_any` | `bool` | Whether to serialize fields with duck-typing serialization behavior. | `False` | | `context` | `dict[str, Any] | None` | Additional context to pass to the serializer. | `None` | Returns: | Type | Description | | --- | --- | | `bytes` | The JSON representation of the given instance as bytes. 
| Source code in `pydantic/type_adapter.py` ```python def dump_json( self, instance: T, /, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, context: dict[str, Any] | None = None, ) -> bytes: """!!! abstract "Usage Documentation" [JSON Serialization](../concepts/json.md#json-serialization) Serialize an instance of the adapted type to JSON. Args: instance: The instance to be serialized. indent: Number of spaces for JSON indentation. include: Fields to include. exclude: Fields to exclude. by_alias: Whether to use alias names for field names. exclude_unset: Whether to exclude unset fields. exclude_defaults: Whether to exclude fields with default values. exclude_none: Whether to exclude fields with a value of `None`. round_trip: Whether to serialize and deserialize the instance to ensure round-tripping. warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError]. fallback: A function to call when an unknown value is encountered. If not provided, a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised. serialize_as_any: Whether to serialize fields with duck-typing serialization behavior. context: Additional context to pass to the serializer. Returns: The JSON representation of the given instance as bytes. """ return self.serializer.to_json( instance, indent=indent, include=include, exclude=exclude, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, context=context, ) ``` ## json_schema ```python json_schema( *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[ GenerateJsonSchema ] = GenerateJsonSchema, mode: JsonSchemaMode = "validation" ) -> dict[str, Any] ``` Generate a JSON schema for the adapted type. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `by_alias` | `bool` | Whether to use alias names for field names. | `True` | | `ref_template` | `str` | The format string used for generating $ref strings. | `DEFAULT_REF_TEMPLATE` | | `schema_generator` | `type[GenerateJsonSchema]` | The generator class used for creating the schema. | `GenerateJsonSchema` | | `mode` | `JsonSchemaMode` | The mode to use for schema generation. | `'validation'` | Returns: | Type | Description | | --- | --- | | `dict[str, Any]` | The JSON schema for the model as a dictionary. | Source code in `pydantic/type_adapter.py` ```python def json_schema( self, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]: """Generate a JSON schema for the adapted type. Args: by_alias: Whether to use alias names for field names. ref_template: The format string used for generating $ref strings. schema_generator: The generator class used for creating the schema. mode: The mode to use for schema generation. Returns: The JSON schema for the model as a dictionary. 
""" schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template) if isinstance(self.core_schema, _mock_val_ser.MockCoreSchema): self.core_schema.rebuild() assert not isinstance(self.core_schema, _mock_val_ser.MockCoreSchema), 'this is a bug! please report it' return schema_generator_instance.generate(self.core_schema, mode=mode) ``` ## json_schemas ```python json_schemas( inputs: Iterable[ tuple[ JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any] ] ], /, *, by_alias: bool = True, title: str | None = None, description: str | None = None, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[ GenerateJsonSchema ] = GenerateJsonSchema, ) -> tuple[ dict[ tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue, ], JsonSchemaValue, ] ``` Generate a JSON schema including definitions from multiple type adapters. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `inputs` | `Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]]` | Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema. | *required* | | `by_alias` | `bool` | Whether to use alias names. | `True` | | `title` | `str | None` | The title for the schema. | `None` | | `description` | `str | None` | The description for the schema. | `None` | | `ref_template` | `str` | The format string used for generating $ref strings. | `DEFAULT_REF_TEMPLATE` | | `schema_generator` | `type[GenerateJsonSchema]` | The generator class used for creating the schema. | `GenerateJsonSchema` | Returns: | Type | Description | | --- | --- | | `tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]` | A tuple where: The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.) The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys. | Source code in `pydantic/type_adapter.py` ```python @staticmethod def json_schemas( inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]], /, *, by_alias: bool = True, title: str | None = None, description: str | None = None, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, ) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]: """Generate a JSON schema including definitions from multiple type adapters. Args: inputs: Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema. by_alias: Whether to use alias names. title: The title for the schema. description: The description for the schema. ref_template: The format string used for generating $ref strings. schema_generator: The generator class used for creating the schema. Returns: A tuple where: - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. 
(These schemas may have JsonRef references to definitions that are defined in the second returned element.) - The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys. """ schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template) inputs_ = [] for key, mode, adapter in inputs: # This is the same pattern we follow for model json schemas - we attempt a core schema rebuild if we detect a mock if isinstance(adapter.core_schema, _mock_val_ser.MockCoreSchema): adapter.core_schema.rebuild() assert not isinstance(adapter.core_schema, _mock_val_ser.MockCoreSchema), ( 'this is a bug! please report it' ) inputs_.append((key, mode, adapter.core_schema)) json_schemas_map, definitions = schema_generator_instance.generate_definitions(inputs_) json_schema: dict[str, Any] = {} if definitions: json_schema['$defs'] = definitions if title: json_schema['title'] = title if description: json_schema['description'] = description return json_schemas_map, json_schema ``` ## pydantic.types The types module contains custom types used by pydantic. ### StrictBool ```python StrictBool = Annotated[bool, Strict()] ``` A boolean that must be either `True` or `False`. ### PositiveInt ```python PositiveInt = Annotated[int, Gt(0)] ``` An integer that must be greater than zero. ```python from pydantic import BaseModel, PositiveInt, ValidationError class Model(BaseModel): positive_int: PositiveInt m = Model(positive_int=1) print(repr(m)) #> Model(positive_int=1) try: Model(positive_int=-1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('positive_int',), 'msg': 'Input should be greater than 0', 'input': -1, 'ctx': {'gt': 0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` ### NegativeInt ```python NegativeInt = Annotated[int, Lt(0)] ``` An integer that must be less than zero. ```python from pydantic import BaseModel, NegativeInt, ValidationError class Model(BaseModel): negative_int: NegativeInt m = Model(negative_int=-1) print(repr(m)) #> Model(negative_int=-1) try: Model(negative_int=1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than', 'loc': ('negative_int',), 'msg': 'Input should be less than 0', 'input': 1, 'ctx': {'lt': 0}, 'url': 'https://errors.pydantic.dev/2/v/less_than', } ] ''' ``` ### NonPositiveInt ```python NonPositiveInt = Annotated[int, Le(0)] ``` An integer that must be less than or equal to zero. ```python from pydantic import BaseModel, NonPositiveInt, ValidationError class Model(BaseModel): non_positive_int: NonPositiveInt m = Model(non_positive_int=0) print(repr(m)) #> Model(non_positive_int=0) try: Model(non_positive_int=1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than_equal', 'loc': ('non_positive_int',), 'msg': 'Input should be less than or equal to 0', 'input': 1, 'ctx': {'le': 0}, 'url': 'https://errors.pydantic.dev/2/v/less_than_equal', } ] ''' ``` ### NonNegativeInt ```python NonNegativeInt = Annotated[int, Ge(0)] ``` An integer that must be greater than or equal to zero. 
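Since these shorthand types are plain `Annotated` aliases, spelling the same constraint inline is equivalent. A minimal sketch (the `InlineNonNegativeInt` name is just for illustration), which also shows `TypeAdapter` validating a bare annotated type:

```python
from typing import Annotated

from annotated_types import Ge

from pydantic import NonNegativeInt, TypeAdapter

# NonNegativeInt is defined as Annotated[int, Ge(0)], so an inline
# spelling validates identically:
InlineNonNegativeInt = Annotated[int, Ge(0)]

assert TypeAdapter(NonNegativeInt).validate_python(0) == 0
assert TypeAdapter(InlineNonNegativeInt).validate_python(0) == 0
```

The example below shows the error reported when the bound is violated: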
```python from pydantic import BaseModel, NonNegativeInt, ValidationError class Model(BaseModel): non_negative_int: NonNegativeInt m = Model(non_negative_int=0) print(repr(m)) #> Model(non_negative_int=0) try: Model(non_negative_int=-1) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than_equal', 'loc': ('non_negative_int',), 'msg': 'Input should be greater than or equal to 0', 'input': -1, 'ctx': {'ge': 0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than_equal', } ] ''' ``` ### StrictInt ```python StrictInt = Annotated[int, Strict()] ``` An integer that must be validated in strict mode. ```python from pydantic import BaseModel, StrictInt, ValidationError class StrictIntModel(BaseModel): strict_int: StrictInt try: StrictIntModel(strict_int=3.14159) except ValidationError as e: print(e) ''' 1 validation error for StrictIntModel strict_int Input should be a valid integer [type=int_type, input_value=3.14159, input_type=float] ''' ``` ### PositiveFloat ```python PositiveFloat = Annotated[float, Gt(0)] ``` A float that must be greater than zero. ```python from pydantic import BaseModel, PositiveFloat, ValidationError class Model(BaseModel): positive_float: PositiveFloat m = Model(positive_float=1.0) print(repr(m)) #> Model(positive_float=1.0) try: Model(positive_float=-1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('positive_float',), 'msg': 'Input should be greater than 0', 'input': -1.0, 'ctx': {'gt': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` ### NegativeFloat ```python NegativeFloat = Annotated[float, Lt(0)] ``` A float that must be less than zero. ```python from pydantic import BaseModel, NegativeFloat, ValidationError class Model(BaseModel): negative_float: NegativeFloat m = Model(negative_float=-1.0) print(repr(m)) #> Model(negative_float=-1.0) try: Model(negative_float=1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than', 'loc': ('negative_float',), 'msg': 'Input should be less than 0', 'input': 1.0, 'ctx': {'lt': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/less_than', } ] ''' ``` ### NonPositiveFloat ```python NonPositiveFloat = Annotated[float, Le(0)] ``` A float that must be less than or equal to zero. ```python from pydantic import BaseModel, NonPositiveFloat, ValidationError class Model(BaseModel): non_positive_float: NonPositiveFloat m = Model(non_positive_float=0.0) print(repr(m)) #> Model(non_positive_float=0.0) try: Model(non_positive_float=1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'less_than_equal', 'loc': ('non_positive_float',), 'msg': 'Input should be less than or equal to 0', 'input': 1.0, 'ctx': {'le': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/less_than_equal', } ] ''' ``` ### NonNegativeFloat ```python NonNegativeFloat = Annotated[float, Ge(0)] ``` A float that must be greater than or equal to zero. 
```python from pydantic import BaseModel, NonNegativeFloat, ValidationError class Model(BaseModel): non_negative_float: NonNegativeFloat m = Model(non_negative_float=0.0) print(repr(m)) #> Model(non_negative_float=0.0) try: Model(non_negative_float=-1.0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than_equal', 'loc': ('non_negative_float',), 'msg': 'Input should be greater than or equal to 0', 'input': -1.0, 'ctx': {'ge': 0.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than_equal', } ] ''' ``` ### StrictFloat ```python StrictFloat = Annotated[float, Strict(True)] ``` A float that must be validated in strict mode. ```python from pydantic import BaseModel, StrictFloat, ValidationError class StrictFloatModel(BaseModel): strict_float: StrictFloat try: StrictFloatModel(strict_float='1.0') except ValidationError as e: print(e) ''' 1 validation error for StrictFloatModel strict_float Input should be a valid number [type=float_type, input_value='1.0', input_type=str] ''' ``` ### FiniteFloat ```python FiniteFloat = Annotated[float, AllowInfNan(False)] ``` A float that must be finite (not `-inf`, `inf`, or `nan`). ```python from pydantic import BaseModel, FiniteFloat class Model(BaseModel): finite: FiniteFloat m = Model(finite=1.0) print(m) #> finite=1.0 ``` ### StrictBytes ```python StrictBytes = Annotated[bytes, Strict()] ``` A bytes that must be validated in strict mode. ### StrictStr ```python StrictStr = Annotated[str, Strict()] ``` A string that must be validated in strict mode. ### UUID1 ```python UUID1 = Annotated[UUID, UuidVersion(1)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 1. ```python import uuid from pydantic import UUID1, BaseModel class Model(BaseModel): uuid1: UUID1 Model(uuid1=uuid.uuid1()) ``` ### UUID3 ```python UUID3 = Annotated[UUID, UuidVersion(3)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 3. ```python import uuid from pydantic import UUID3, BaseModel class Model(BaseModel): uuid3: UUID3 Model(uuid3=uuid.uuid3(uuid.NAMESPACE_DNS, 'pydantic.org')) ``` ### UUID4 ```python UUID4 = Annotated[UUID, UuidVersion(4)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 4. ```python import uuid from pydantic import UUID4, BaseModel class Model(BaseModel): uuid4: UUID4 Model(uuid4=uuid.uuid4()) ``` ### UUID5 ```python UUID5 = Annotated[UUID, UuidVersion(5)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 5. ```python import uuid from pydantic import UUID5, BaseModel class Model(BaseModel): uuid5: UUID5 Model(uuid5=uuid.uuid5(uuid.NAMESPACE_DNS, 'pydantic.org')) ``` ### UUID6 ```python UUID6 = Annotated[UUID, UuidVersion(6)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 6. ```python import uuid from pydantic import UUID6, BaseModel class Model(BaseModel): uuid6: UUID6 Model(uuid6=uuid.UUID('1efea953-c2d6-6790-aa0a-69db8c87df97')) ``` ### UUID7 ```python UUID7 = Annotated[UUID, UuidVersion(7)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 7. ```python import uuid from pydantic import UUID7, BaseModel class Model(BaseModel): uuid7: UUID7 Model(uuid7=uuid.UUID('0194fdcb-1c47-7a09-b52c-561154de0b4a')) ``` ### UUID8 ```python UUID8 = Annotated[UUID, UuidVersion(8)] ``` A [UUID](https://docs.python.org/3/library/uuid.html) that must be version 8. 
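String inputs parse to `UUID` instances, and the version constraint carries through to the JSON schema's `format`. A small sketch using the `TypeAdapter` API documented earlier (it reuses the same version-8 literal as the example below):

```python
from pydantic import UUID8, TypeAdapter

ta = TypeAdapter(UUID8)

# A string is parsed into a UUID instance (and the version is checked)
print(repr(ta.validate_python('81a0b92e-6078-8551-9c81-8ccb666bdab8')))
#> UUID('81a0b92e-6078-8551-9c81-8ccb666bdab8')

# The version constraint is reflected in the JSON schema `format`
print(ta.json_schema()['format'])
#> uuid8
```

The usual model-field form: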
```python import uuid from pydantic import UUID8, BaseModel class Model(BaseModel): uuid8: UUID8 Model(uuid8=uuid.UUID('81a0b92e-6078-8551-9c81-8ccb666bdab8')) ``` ### FilePath ```python FilePath = Annotated[Path, PathType('file')] ``` A path that must point to a file. ```python from pathlib import Path from pydantic import BaseModel, FilePath, ValidationError class Model(BaseModel): f: FilePath path = Path('text.txt') path.touch() m = Model(f='text.txt') print(m.model_dump()) #> {'f': PosixPath('text.txt')} path.unlink() path = Path('directory') path.mkdir(exist_ok=True) try: Model(f='directory') # directory except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a file [type=path_not_file, input_value='directory', input_type=str] ''' path.rmdir() try: Model(f='not-exists-file') except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a file [type=path_not_file, input_value='not-exists-file', input_type=str] ''' ``` ### DirectoryPath ```python DirectoryPath = Annotated[Path, PathType('dir')] ``` A path that must point to a directory. ```python from pathlib import Path from pydantic import BaseModel, DirectoryPath, ValidationError class Model(BaseModel): f: DirectoryPath path = Path('directory/') path.mkdir() m = Model(f='directory/') print(m.model_dump()) #> {'f': PosixPath('directory')} path.rmdir() path = Path('file.txt') path.touch() try: Model(f='file.txt') # file except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a directory [type=path_not_directory, input_value='file.txt', input_type=str] ''' path.unlink() try: Model(f='not-exists-directory') except ValidationError as e: print(e) ''' 1 validation error for Model f Path does not point to a directory [type=path_not_directory, input_value='not-exists-directory', input_type=str] ''' ``` ### NewPath ```python NewPath = Annotated[Path, PathType('new')] ``` A path for a new file or directory that must not already exist. The parent directory must already exist. ### SocketPath ```python SocketPath = Annotated[Path, PathType('socket')] ``` A path to an existing socket file ### Base64Bytes ```python Base64Bytes = Annotated[ bytes, EncodedBytes(encoder=Base64Encoder) ] ``` A bytes type that is encoded and decoded using the standard (non-URL-safe) base64 encoder. Note Under the hood, `Base64Bytes` uses the standard library `base64.b64encode` and `base64.b64decode` functions. As a result, attempting to decode url-safe base64 data using the `Base64Bytes` type may fail or produce an incorrect decoding. Warning In versions of Pydantic prior to v2.10, `Base64Bytes` used base64.encodebytes and base64.decodebytes functions. According to the [base64 documentation](https://docs.python.org/3/library/base64.html), these methods are considered legacy implementation, and thus, Pydantic v2.10+ now uses the modern base64.b64encode and base64.b64decode functions. 
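The practical difference between the two encoder generations is small but observable: the legacy `base64.encodebytes` wraps its output with a newline every 76 characters (plus a trailing newline), while `base64.b64encode` produces a single unbroken line. A quick comparison, using only the standard library:

```python
import base64

payload = bytes(100)  # long enough to trigger the legacy 76-character wrapping

# The modern encoder emits one unbroken line:
print(b'\n' in base64.b64encode(payload))
#> False

# The legacy encoder inserts newlines:
print(b'\n' in base64.encodebytes(payload))
#> True
```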
If you'd still like to use these legacy encoders / decoders, you can achieve this by creating a custom annotated type, as follows:

```python
import base64
from typing import Annotated, Literal

from pydantic_core import PydanticCustomError

from pydantic import EncodedBytes, EncoderProtocol


class LegacyBase64Encoder(EncoderProtocol):
    @classmethod
    def decode(cls, data: bytes) -> bytes:
        try:
            return base64.decodebytes(data)
        except ValueError as e:
            raise PydanticCustomError(
                'base64_decode',
                "Base64 decoding error: '{error}'",
                {'error': str(e)},
            )

    @classmethod
    def encode(cls, value: bytes) -> bytes:
        return base64.encodebytes(value)

    @classmethod
    def get_json_format(cls) -> Literal['base64']:
        return 'base64'


LegacyBase64Bytes = Annotated[bytes, EncodedBytes(encoder=LegacyBase64Encoder)]
```

```python
from pydantic import Base64Bytes, BaseModel, ValidationError


class Model(BaseModel):
    base64_bytes: Base64Bytes


# Initialize the model with base64 data
m = Model(base64_bytes=b'VGhpcyBpcyB0aGUgd2F5')

# Access decoded value
print(m.base64_bytes)
#> b'This is the way'

# Serialize into the base64 form
print(m.model_dump())
#> {'base64_bytes': b'VGhpcyBpcyB0aGUgd2F5'}

# Validate base64 data
try:
    print(Model(base64_bytes=b'undecodable').base64_bytes)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    base64_bytes
      Base64 decoding error: 'Incorrect padding' [type=base64_decode, input_value=b'undecodable', input_type=bytes]
    '''
```

### Base64Str

```python
Base64Str = Annotated[
    str, EncodedStr(encoder=Base64Encoder)
]
```

A str type that is encoded and decoded using the standard (non-URL-safe) base64 encoder.

Note

Under the hood, `Base64Str` uses the standard library `base64.b64encode` and `base64.b64decode` functions. As a result, attempting to decode url-safe base64 data using the `Base64Str` type may fail or produce an incorrect decoding.

Warning

In versions of Pydantic prior to v2.10, `Base64Str` used the `base64.encodebytes` and `base64.decodebytes` functions. According to the [base64 documentation](https://docs.python.org/3/library/base64.html), these methods are considered a legacy implementation, and thus, Pydantic v2.10+ now uses the modern `base64.b64encode` and `base64.b64decode` functions.

See the Base64Bytes type for more information on how to replicate the old behavior with the legacy encoders / decoders.

```python
from pydantic import Base64Str, BaseModel, ValidationError


class Model(BaseModel):
    base64_str: Base64Str


# Initialize the model with base64 data
m = Model(base64_str='VGhlc2UgYXJlbid0IHRoZSBkcm9pZHMgeW91J3JlIGxvb2tpbmcgZm9y')

# Access decoded value
print(m.base64_str)
#> These aren't the droids you're looking for

# Serialize into the base64 form
print(m.model_dump())
#> {'base64_str': 'VGhlc2UgYXJlbid0IHRoZSBkcm9pZHMgeW91J3JlIGxvb2tpbmcgZm9y'}

# Validate base64 data
try:
    print(Model(base64_str='undecodable').base64_str)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    base64_str
      Base64 decoding error: 'Incorrect padding' [type=base64_decode, input_value='undecodable', input_type=str]
    '''
```

### Base64UrlBytes

```python
Base64UrlBytes = Annotated[
    bytes, EncodedBytes(encoder=Base64UrlEncoder)
]
```

A bytes type that is encoded and decoded using the URL-safe base64 encoder.

Note

Under the hood, `Base64UrlBytes` uses the standard library `base64.urlsafe_b64encode` and `base64.urlsafe_b64decode` functions. As a result, the `Base64UrlBytes` type can be used to faithfully decode "vanilla" base64 data (using `'+'` and `'/'`).
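That note is easy to verify: standard-alphabet input containing `'+'` and `'/'` still validates, because `base64.urlsafe_b64decode` only translates `'-'` and `'_'` before decoding. A minimal sketch, using only the standard library and the `Base64UrlBytes` type above:

```python
import base64

from pydantic import Base64UrlBytes, BaseModel


class Model(BaseModel):
    data: Base64UrlBytes


# b'++//' is standard (non-URL-safe) base64 for b'\xfb\xef\xff'
vanilla = base64.b64encode(b'\xfb\xef\xff')
print(vanilla)
#> b'++//'

# '+' and '/' pass through `urlsafe_b64decode` untouched, so validation succeeds
print(Model(data=vanilla).data)
#> b'\xfb\xef\xff'
```

The example below exercises the URL-safe alphabet (`'-'` and `'_'` in place of `'+'` and `'/'`):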
```python
from pydantic import Base64UrlBytes, BaseModel


class Model(BaseModel):
    base64url_bytes: Base64UrlBytes


# Initialize the model with base64 data
m = Model(base64url_bytes=b'SHc_dHc-TXc==')
print(m)
#> base64url_bytes=b'Hw?tw>Mw'
```

### Base64UrlStr

```python
Base64UrlStr = Annotated[
    str, EncodedStr(encoder=Base64UrlEncoder)
]
```

A str type that is encoded and decoded using the URL-safe base64 encoder.

Note

Under the hood, `Base64UrlStr` uses the standard library `base64.urlsafe_b64encode` and `base64.urlsafe_b64decode` functions. As a result, the `Base64UrlStr` type can be used to faithfully decode "vanilla" base64 data (using `'+'` and `'/'`).

```python
from pydantic import Base64UrlStr, BaseModel


class Model(BaseModel):
    base64url_str: Base64UrlStr


# Initialize the model with base64 data
m = Model(base64url_str='SHc_dHc-TXc==')
print(m)
#> base64url_str='Hw?tw>Mw'
```

### JsonValue

```python
JsonValue: TypeAlias = Union[
    list["JsonValue"],
    dict[str, "JsonValue"],
    str,
    bool,
    int,
    float,
    None,
]
```

A `JsonValue` is used to represent a value that can be serialized to JSON. It may be one of:

- `list['JsonValue']`
- `dict[str, 'JsonValue']`
- `str`
- `bool`
- `int`
- `float`
- `None`

The following example demonstrates how to use `JsonValue` to validate JSON data, and what kind of errors to expect when input data is not JSON serializable.

```python
import json

from pydantic import BaseModel, JsonValue, ValidationError


class Model(BaseModel):
    j: JsonValue


valid_json_data = {'j': {'a': {'b': {'c': 1, 'd': [2, None]}}}}
invalid_json_data = {'j': {'a': {'b': ...}}}

print(repr(Model.model_validate(valid_json_data)))
#> Model(j={'a': {'b': {'c': 1, 'd': [2, None]}}})
print(repr(Model.model_validate_json(json.dumps(valid_json_data))))
#> Model(j={'a': {'b': {'c': 1, 'd': [2, None]}}})

try:
    Model.model_validate(invalid_json_data)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    j.dict.a.dict.b
      input was not a valid JSON value [type=invalid-json-value, input_value=Ellipsis, input_type=ellipsis]
    '''
```

### OnErrorOmit

```python
OnErrorOmit = Annotated[T, _OnErrorOmit]
```

When used as an item in a list, the key type in a dict, optional values of a TypedDict, etc., this annotation omits the item from the iteration if there is any error validating it. That is, instead of a ValidationError being propagated up and the entire iterable being discarded, any invalid items are discarded and the valid ones are returned.

### Strict

Bases: `PydanticMetadata`, `BaseMetadata`

Usage Documentation

[Strict Mode with `Annotated` `Strict`](../../concepts/strict_mode/#strict-mode-with-annotated-strict)

A field metadata class to indicate that a field should be validated in strict mode. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `strict` | `bool` | Whether to validate the field in strict mode. |

Example

```python
from typing import Annotated

from pydantic.types import Strict

StrictBool = Annotated[bool, Strict()]
```

Source code in `pydantic/types.py`

````python
@_dataclasses.dataclass
class Strict(_fields.PydanticMetadata, BaseMetadata):
    """!!! abstract "Usage Documentation"
        [Strict Mode with `Annotated` `Strict`](../concepts/strict_mode.md#strict-mode-with-annotated-strict)

    A field metadata class to indicate that a field should be validated in strict mode.
Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: strict: Whether to validate the field in strict mode. Example: ```python from typing import Annotated from pydantic.types import Strict StrictBool = Annotated[bool, Strict()] ``` """ strict: bool = True def __hash__(self) -> int: return hash(self.strict) ```` ### AllowInfNan Bases: `PydanticMetadata` A field metadata class to indicate that a field should allow `-inf`, `inf`, and `nan`. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: | Name | Type | Description | | --- | --- | --- | | `allow_inf_nan` | `bool` | Whether to allow -inf, inf, and nan. Defaults to True. | Example ```python from typing import Annotated from pydantic.types import AllowInfNan LaxFloat = Annotated[float, AllowInfNan()] ``` Source code in `pydantic/types.py` ````python @_dataclasses.dataclass class AllowInfNan(_fields.PydanticMetadata): """A field metadata class to indicate that a field should allow `-inf`, `inf`, and `nan`. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: allow_inf_nan: Whether to allow `-inf`, `inf`, and `nan`. Defaults to `True`. Example: ```python from typing import Annotated from pydantic.types import AllowInfNan LaxFloat = Annotated[float, AllowInfNan()] ``` """ allow_inf_nan: bool = True def __hash__(self) -> int: return hash(self.allow_inf_nan) ```` ### StringConstraints Bases: `GroupedMetadata` Usage Documentation [`StringConstraints`](../../concepts/fields/#string-constraints) A field metadata class to apply constraints to `str` types. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: | Name | Type | Description | | --- | --- | --- | | `strip_whitespace` | `bool | None` | Whether to remove leading and trailing whitespace. | | `to_upper` | `bool | None` | Whether to convert the string to uppercase. | | `to_lower` | `bool | None` | Whether to convert the string to lowercase. | | `strict` | `bool | None` | Whether to validate the string in strict mode. | | `min_length` | `int | None` | The minimum length of the string. | | `max_length` | `int | None` | The maximum length of the string. | | `pattern` | `str | Pattern[str] | None` | A regex pattern that the string must match. | Example ```python from typing import Annotated from pydantic.types import StringConstraints ConstrainedStr = Annotated[str, StringConstraints(min_length=1, max_length=10)] ``` Source code in `pydantic/types.py` ````python @_dataclasses.dataclass(frozen=True) class StringConstraints(annotated_types.GroupedMetadata): """!!! abstract "Usage Documentation" [`StringConstraints`](../concepts/fields.md#string-constraints) A field metadata class to apply constraints to `str` types. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below. Attributes: strip_whitespace: Whether to remove leading and trailing whitespace. to_upper: Whether to convert the string to uppercase. to_lower: Whether to convert the string to lowercase. strict: Whether to validate the string in strict mode. min_length: The minimum length of the string. max_length: The maximum length of the string. pattern: A regex pattern that the string must match. 
    Example:
        ```python
        from typing import Annotated

        from pydantic.types import StringConstraints

        ConstrainedStr = Annotated[str, StringConstraints(min_length=1, max_length=10)]
        ```
    """

    strip_whitespace: bool | None = None
    to_upper: bool | None = None
    to_lower: bool | None = None
    strict: bool | None = None
    min_length: int | None = None
    max_length: int | None = None
    pattern: str | Pattern[str] | None = None

    def __iter__(self) -> Iterator[BaseMetadata]:
        if self.min_length is not None:
            yield MinLen(self.min_length)
        if self.max_length is not None:
            yield MaxLen(self.max_length)
        if self.strict is not None:
            yield Strict(self.strict)
        if (
            self.strip_whitespace is not None
            or self.pattern is not None
            or self.to_lower is not None
            or self.to_upper is not None
        ):
            yield _fields.pydantic_general_metadata(
                strip_whitespace=self.strip_whitespace,
                to_upper=self.to_upper,
                to_lower=self.to_lower,
                pattern=self.pattern,
            )
````

### ImportString

A type that can be used to import a Python object from a string.

`ImportString` expects a string and loads the Python object importable at that dotted path. Attributes of modules may be separated from the module by `:` or `.`, e.g. if `'math:cos'` is provided, the resulting field value would be the function `cos`. If a `.` is used and both an attribute and submodule are present at the same path, the module will be preferred.

On model instantiation, pointers will be evaluated and imported. There is some nuance to this behavior, demonstrated in the examples below.

```python
import math

from pydantic import BaseModel, Field, ImportString, ValidationError


class ImportThings(BaseModel):
    obj: ImportString


# A string value will cause an automatic import
my_cos = ImportThings(obj='math.cos')

# You can use the imported function as you would expect
cos_of_0 = my_cos.obj(0)
assert cos_of_0 == 1

# A string whose value cannot be imported will raise an error
try:
    ImportThings(obj='foo.bar')
except ValidationError as e:
    print(e)
    '''
    1 validation error for ImportThings
    obj
      Invalid python path: No module named 'foo.bar' [type=import_error, input_value='foo.bar', input_type=str]
    '''

# Actual python objects can be assigned as well
my_cos = ImportThings(obj=math.cos)
my_cos_2 = ImportThings(obj='math.cos')
my_cos_3 = ImportThings(obj='math:cos')
assert my_cos == my_cos_2 == my_cos_3

# You can set default field value either as Python object:
class ImportThingsDefaultPyObj(BaseModel):
    obj: ImportString = math.cos


# or as a string value (but only if used with `validate_default=True`)
class ImportThingsDefaultString(BaseModel):
    obj: ImportString = Field(default='math.cos', validate_default=True)


my_cos_default1 = ImportThingsDefaultPyObj()
my_cos_default2 = ImportThingsDefaultString()
assert my_cos_default1.obj == my_cos_default2.obj == math.cos

# note: this will not work!
class ImportThingsMissingValidateDefault(BaseModel):
    obj: ImportString = 'math.cos'


my_cos_default3 = ImportThingsMissingValidateDefault()
assert my_cos_default3.obj == 'math.cos'  # just string, not evaluated
```

Serializing an `ImportString` type to JSON is also possible.

```python
from pydantic import BaseModel, ImportString


class ImportThings(BaseModel):
    obj: ImportString


# Create an instance
m = ImportThings(obj='math.cos')
print(m)
#> obj=<built-in function cos>
print(m.model_dump_json())
#> {"obj":"math.cos"}
```

Source code in `pydantic/types.py`

````python
class ImportString:
    """A type that can be used to import a Python object from a string.

    `ImportString` expects a string and loads the Python object importable at that dotted path.
Attributes of modules may be separated from the module by `:` or `.`, e.g. if `'math:cos'` is provided, the resulting field value would be the function `cos`. If a `.` is used and both an attribute and submodule are present at the same path, the module will be preferred. On model instantiation, pointers will be evaluated and imported. There is some nuance to this behavior, demonstrated in the examples below. ```python import math from pydantic import BaseModel, Field, ImportString, ValidationError class ImportThings(BaseModel): obj: ImportString # A string value will cause an automatic import my_cos = ImportThings(obj='math.cos') # You can use the imported function as you would expect cos_of_0 = my_cos.obj(0) assert cos_of_0 == 1 # A string whose value cannot be imported will raise an error try: ImportThings(obj='foo.bar') except ValidationError as e: print(e) ''' 1 validation error for ImportThings obj Invalid python path: No module named 'foo.bar' [type=import_error, input_value='foo.bar', input_type=str] ''' # Actual python objects can be assigned as well my_cos = ImportThings(obj=math.cos) my_cos_2 = ImportThings(obj='math.cos') my_cos_3 = ImportThings(obj='math:cos') assert my_cos == my_cos_2 == my_cos_3 # You can set default field value either as Python object: class ImportThingsDefaultPyObj(BaseModel): obj: ImportString = math.cos # or as a string value (but only if used with `validate_default=True`) class ImportThingsDefaultString(BaseModel): obj: ImportString = Field(default='math.cos', validate_default=True) my_cos_default1 = ImportThingsDefaultPyObj() my_cos_default2 = ImportThingsDefaultString() assert my_cos_default1.obj == my_cos_default2.obj == math.cos # note: this will not work! class ImportThingsMissingValidateDefault(BaseModel): obj: ImportString = 'math.cos' my_cos_default3 = ImportThingsMissingValidateDefault() assert my_cos_default3.obj == 'math.cos' # just string, not evaluated ``` Serializing an `ImportString` type to json is also possible. 
    ```python
    from pydantic import BaseModel, ImportString


    class ImportThings(BaseModel):
        obj: ImportString


    # Create an instance
    m = ImportThings(obj='math.cos')
    print(m)
    #> obj=<built-in function cos>
    print(m.model_dump_json())
    #> {"obj":"math.cos"}
    ```
    """

    @classmethod
    def __class_getitem__(cls, item: AnyType) -> AnyType:
        return Annotated[item, cls()]

    @classmethod
    def __get_pydantic_core_schema__(
        cls, source: type[Any], handler: GetCoreSchemaHandler
    ) -> core_schema.CoreSchema:
        serializer = core_schema.plain_serializer_function_ser_schema(cls._serialize, when_used='json')

        if cls is source:
            # Treat bare usage of ImportString (`schema is None`) as the same as ImportString[Any]
            return core_schema.no_info_plain_validator_function(
                function=_validators.import_string, serialization=serializer
            )
        else:
            return core_schema.no_info_before_validator_function(
                function=_validators.import_string, schema=handler(source), serialization=serializer
            )

    @classmethod
    def __get_pydantic_json_schema__(cls, cs: CoreSchema, handler: GetJsonSchemaHandler) -> JsonSchemaValue:
        return handler(core_schema.str_schema())

    @staticmethod
    def _serialize(v: Any) -> str:
        if isinstance(v, ModuleType):
            return v.__name__
        elif hasattr(v, '__module__') and hasattr(v, '__name__'):
            return f'{v.__module__}.{v.__name__}'
        # Handle special cases for sys.XXX streams
        # if we see more of these, we should consider a more general solution
        elif hasattr(v, 'name'):
            if v.name == '<stdout>':
                return 'sys.stdout'
            elif v.name == '<stdin>':
                return 'sys.stdin'
            elif v.name == '<stderr>':
                return 'sys.stderr'
        else:
            return v

    def __repr__(self) -> str:
        return 'ImportString'
````

### UuidVersion

A field metadata class to indicate a [UUID](https://docs.python.org/3/library/uuid.html) version. Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below.

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `uuid_version` | `Literal[1, 3, 4, 5, 6, 7, 8]` | The version of the UUID. Must be one of 1, 3, 4, 5, 6, 7, or 8. |

Example

```python
from typing import Annotated
from uuid import UUID

from pydantic.types import UuidVersion

UUID1 = Annotated[UUID, UuidVersion(1)]
```

Source code in `pydantic/types.py`

````python
@_dataclasses.dataclass(**_internal_dataclass.slots_true)
class UuidVersion:
    """A field metadata class to indicate a [UUID](https://docs.python.org/3/library/uuid.html) version.

    Use this class as an annotation via [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated), as seen below.

    Attributes:
        uuid_version: The version of the UUID. Must be one of 1, 3, 4, 5, 6, 7, or 8.
Example: ```python from typing import Annotated from uuid import UUID from pydantic.types import UuidVersion UUID1 = Annotated[UUID, UuidVersion(1)] ``` """ uuid_version: Literal[1, 3, 4, 5, 6, 7, 8] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.pop('anyOf', None) # remove the bytes/str union field_schema.update(type='string', format=f'uuid{self.uuid_version}') return field_schema def __get_pydantic_core_schema__(self, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: if isinstance(self, source): # used directly as a type return core_schema.uuid_schema(version=self.uuid_version) else: # update existing schema with self.uuid_version schema = handler(source) _check_annotated_type(schema['type'], 'uuid', self.__class__.__name__) schema['version'] = self.uuid_version # type: ignore return schema def __hash__(self) -> int: return hash(type(self.uuid_version)) ```` ### Json A special type wrapper which loads JSON before parsing. You can use the `Json` data type to make Pydantic first load a raw JSON string before validating the loaded data into the parametrized type: ```python from typing import Any from pydantic import BaseModel, Json, ValidationError class AnyJsonModel(BaseModel): json_obj: Json[Any] class ConstrainedJsonModel(BaseModel): json_obj: Json[list[int]] print(AnyJsonModel(json_obj='{"b": 1}')) #> json_obj={'b': 1} print(ConstrainedJsonModel(json_obj='[1, 2, 3]')) #> json_obj=[1, 2, 3] try: ConstrainedJsonModel(json_obj=12) except ValidationError as e: print(e) ''' 1 validation error for ConstrainedJsonModel json_obj JSON input should be string, bytes or bytearray [type=json_type, input_value=12, input_type=int] ''' try: ConstrainedJsonModel(json_obj='[a, b]') except ValidationError as e: print(e) ''' 1 validation error for ConstrainedJsonModel json_obj Invalid JSON: expected value at line 1 column 2 [type=json_invalid, input_value='[a, b]', input_type=str] ''' try: ConstrainedJsonModel(json_obj='["a", "b"]') except ValidationError as e: print(e) ''' 2 validation errors for ConstrainedJsonModel json_obj.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] json_obj.1 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='b', input_type=str] ''' ``` When you dump the model using `model_dump` or `model_dump_json`, the dumped value will be the result of validation, not the original JSON string. However, you can use the argument `round_trip=True` to get the original JSON string back: ```python from pydantic import BaseModel, Json class ConstrainedJsonModel(BaseModel): json_obj: Json[list[int]] print(ConstrainedJsonModel(json_obj='[1, 2, 3]').model_dump_json()) #> {"json_obj":[1,2,3]} print( ConstrainedJsonModel(json_obj='[1, 2, 3]').model_dump_json(round_trip=True) ) #> {"json_obj":"[1,2,3]"} ``` Source code in `pydantic/types.py` ````python class Json: """A special type wrapper which loads JSON before parsing. 
You can use the `Json` data type to make Pydantic first load a raw JSON string before validating the loaded data into the parametrized type: ```python from typing import Any from pydantic import BaseModel, Json, ValidationError class AnyJsonModel(BaseModel): json_obj: Json[Any] class ConstrainedJsonModel(BaseModel): json_obj: Json[list[int]] print(AnyJsonModel(json_obj='{"b": 1}')) #> json_obj={'b': 1} print(ConstrainedJsonModel(json_obj='[1, 2, 3]')) #> json_obj=[1, 2, 3] try: ConstrainedJsonModel(json_obj=12) except ValidationError as e: print(e) ''' 1 validation error for ConstrainedJsonModel json_obj JSON input should be string, bytes or bytearray [type=json_type, input_value=12, input_type=int] ''' try: ConstrainedJsonModel(json_obj='[a, b]') except ValidationError as e: print(e) ''' 1 validation error for ConstrainedJsonModel json_obj Invalid JSON: expected value at line 1 column 2 [type=json_invalid, input_value='[a, b]', input_type=str] ''' try: ConstrainedJsonModel(json_obj='["a", "b"]') except ValidationError as e: print(e) ''' 2 validation errors for ConstrainedJsonModel json_obj.0 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] json_obj.1 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='b', input_type=str] ''' ``` When you dump the model using `model_dump` or `model_dump_json`, the dumped value will be the result of validation, not the original JSON string. However, you can use the argument `round_trip=True` to get the original JSON string back: ```python from pydantic import BaseModel, Json class ConstrainedJsonModel(BaseModel): json_obj: Json[list[int]] print(ConstrainedJsonModel(json_obj='[1, 2, 3]').model_dump_json()) #> {"json_obj":[1,2,3]} print( ConstrainedJsonModel(json_obj='[1, 2, 3]').model_dump_json(round_trip=True) ) #> {"json_obj":"[1,2,3]"} ``` """ @classmethod def __class_getitem__(cls, item: AnyType) -> AnyType: return Annotated[item, cls()] @classmethod def __get_pydantic_core_schema__(cls, source: Any, handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: if cls is source: return core_schema.json_schema(None) else: return core_schema.json_schema(handler(source)) def __repr__(self) -> str: return 'Json' def __hash__(self) -> int: return hash(type(self)) def __eq__(self, other: Any) -> bool: return type(other) is type(self) ```` ### Secret Bases: `_SecretBase[SecretType]` A generic base class used for defining a field with sensitive information that you do not want to be visible in logging or tracebacks. You may either directly parametrize `Secret` with a type, or subclass from `Secret` with a parametrized type. The benefit of subclassing is that you can define a custom `_display` method, which will be used for `repr()` and `str()` methods. The examples below demonstrate both ways of using `Secret` to create a new secret type. 1. Directly parametrizing `Secret` with a type: ```python from pydantic import BaseModel, Secret SecretBool = Secret[bool] class Model(BaseModel): secret_bool: SecretBool m = Model(secret_bool=True) print(m.model_dump()) #> {'secret_bool': Secret('**********')} print(m.model_dump_json()) #> {"secret_bool":"**********"} print(m.secret_bool.get_secret_value()) #> True ``` 1. 
Subclassing from parametrized `Secret`: ```python from datetime import date from pydantic import BaseModel, Secret class SecretDate(Secret[date]): def _display(self) -> str: return '****/**/**' class Model(BaseModel): secret_date: SecretDate m = Model(secret_date=date(2022, 1, 1)) print(m.model_dump()) #> {'secret_date': SecretDate('****/**/**')} print(m.model_dump_json()) #> {"secret_date":"****/**/**"} print(m.secret_date.get_secret_value()) #> 2022-01-01 ``` The value returned by the `_display` method will be used for `repr()` and `str()`. You can enforce constraints on the underlying type through annotations. For example: ```python from typing import Annotated from pydantic import BaseModel, Field, Secret, ValidationError SecretPosInt = Secret[Annotated[int, Field(gt=0, strict=True)]] class Model(BaseModel): sensitive_int: SecretPosInt m = Model(sensitive_int=42) print(m.model_dump()) #> {'sensitive_int': Secret('**********')} try: m = Model(sensitive_int=-42) # (1)! except ValidationError as exc_info: print(exc_info.errors(include_url=False, include_input=False)) ''' [ { 'type': 'greater_than', 'loc': ('sensitive_int',), 'msg': 'Input should be greater than 0', 'ctx': {'gt': 0}, } ] ''' try: m = Model(sensitive_int='42') # (2)! except ValidationError as exc_info: print(exc_info.errors(include_url=False, include_input=False)) ''' [ { 'type': 'int_type', 'loc': ('sensitive_int',), 'msg': 'Input should be a valid integer', } ] ''' ``` 1. The input value is not greater than 0, so it raises a validation error. 1. The input value is not an integer, so it raises a validation error because the `SecretPosInt` type has strict mode enabled. Source code in `pydantic/types.py` ````python class Secret(_SecretBase[SecretType]): """A generic base class used for defining a field with sensitive information that you do not want to be visible in logging or tracebacks. You may either directly parametrize `Secret` with a type, or subclass from `Secret` with a parametrized type. The benefit of subclassing is that you can define a custom `_display` method, which will be used for `repr()` and `str()` methods. The examples below demonstrate both ways of using `Secret` to create a new secret type. 1. Directly parametrizing `Secret` with a type: ```python from pydantic import BaseModel, Secret SecretBool = Secret[bool] class Model(BaseModel): secret_bool: SecretBool m = Model(secret_bool=True) print(m.model_dump()) #> {'secret_bool': Secret('**********')} print(m.model_dump_json()) #> {"secret_bool":"**********"} print(m.secret_bool.get_secret_value()) #> True ``` 2. Subclassing from parametrized `Secret`: ```python from datetime import date from pydantic import BaseModel, Secret class SecretDate(Secret[date]): def _display(self) -> str: return '****/**/**' class Model(BaseModel): secret_date: SecretDate m = Model(secret_date=date(2022, 1, 1)) print(m.model_dump()) #> {'secret_date': SecretDate('****/**/**')} print(m.model_dump_json()) #> {"secret_date":"****/**/**"} print(m.secret_date.get_secret_value()) #> 2022-01-01 ``` The value returned by the `_display` method will be used for `repr()` and `str()`.
You can enforce constraints on the underlying type through annotations. For example: ```python from typing import Annotated from pydantic import BaseModel, Field, Secret, ValidationError SecretPosInt = Secret[Annotated[int, Field(gt=0, strict=True)]] class Model(BaseModel): sensitive_int: SecretPosInt m = Model(sensitive_int=42) print(m.model_dump()) #> {'sensitive_int': Secret('**********')} try: m = Model(sensitive_int=-42) # (1)! except ValidationError as exc_info: print(exc_info.errors(include_url=False, include_input=False)) ''' [ { 'type': 'greater_than', 'loc': ('sensitive_int',), 'msg': 'Input should be greater than 0', 'ctx': {'gt': 0}, } ] ''' try: m = Model(sensitive_int='42') # (2)! except ValidationError as exc_info: print(exc_info.errors(include_url=False, include_input=False)) ''' [ { 'type': 'int_type', 'loc': ('sensitive_int',), 'msg': 'Input should be a valid integer', } ] ''' ``` 1. The input value is not greater than 0, so it raises a validation error. 2. The input value is not an integer, so it raises a validation error because the `SecretPosInt` type has strict mode enabled. """ def _display(self) -> str | bytes: return '**********' if self.get_secret_value() else '' @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: inner_type = None # if origin_type is Secret, then cls is a GenericAlias, and we can extract the inner type directly origin_type = get_origin(source) if origin_type is not None: inner_type = get_args(source)[0] # otherwise, we need to get the inner type from the base class else: bases = getattr(cls, '__orig_bases__', getattr(cls, '__bases__', [])) for base in bases: if get_origin(base) is Secret: inner_type = get_args(base)[0] if bases == [] or inner_type is None: raise TypeError( f"Can't get secret type from {cls.__name__}. " 'Please use Secret[<type>], or subclass from Secret[<type>] instead.' ) inner_schema = handler.generate_schema(inner_type) # type: ignore def validate_secret_value(value, handler) -> Secret[SecretType]: if isinstance(value, Secret): value = value.get_secret_value() validated_inner = handler(value) return cls(validated_inner) return core_schema.json_or_python_schema( python_schema=core_schema.no_info_wrap_validator_function( validate_secret_value, inner_schema, ), json_schema=core_schema.no_info_after_validator_function(lambda x: cls(x), inner_schema), serialization=core_schema.plain_serializer_function_ser_schema( _serialize_secret, info_arg=True, when_used='always', ), ) __pydantic_serializer__ = SchemaSerializer( core_schema.any_schema( serialization=core_schema.plain_serializer_function_ser_schema( _serialize_secret, info_arg=True, when_used='always', ) ) ) ```` ### SecretStr Bases: `_SecretField[str]` A string used for storing sensitive information that you do not want to be visible in logging or tracebacks. When the secret value is nonempty, it is displayed as `'**********'` instead of the underlying value in calls to `repr()` and `str()`. If the value *is* empty, it is displayed as `''`.
```python from pydantic import BaseModel, SecretStr class User(BaseModel): username: str password: SecretStr user = User(username='scolvin', password='password1') print(user) #> username='scolvin' password=SecretStr('**********') print(user.password.get_secret_value()) #> password1 print((SecretStr('password'), SecretStr(''))) #> (SecretStr('**********'), SecretStr('')) ``` As seen above, by default, SecretStr (and SecretBytes) will be serialized as `**********` when serializing to json. You can use the field_serializer to dump the secret as plain-text when serializing to json. ```python from pydantic import BaseModel, SecretBytes, SecretStr, field_serializer class Model(BaseModel): password: SecretStr password_bytes: SecretBytes @field_serializer('password', 'password_bytes', when_used='json') def dump_secret(self, v): return v.get_secret_value() model = Model(password='IAmSensitive', password_bytes=b'IAmSensitiveBytes') print(model) #> password=SecretStr('**********') password_bytes=SecretBytes(b'**********') print(model.password) #> ********** print(model.model_dump()) ''' { 'password': SecretStr('**********'), 'password_bytes': SecretBytes(b'**********'), } ''' print(model.model_dump_json()) #> {"password":"IAmSensitive","password_bytes":"IAmSensitiveBytes"} ``` Source code in `pydantic/types.py` ````python class SecretStr(_SecretField[str]): """A string used for storing sensitive information that you do not want to be visible in logging or tracebacks. When the secret value is nonempty, it is displayed as `'**********'` instead of the underlying value in calls to `repr()` and `str()`. If the value _is_ empty, it is displayed as `''`. ```python from pydantic import BaseModel, SecretStr class User(BaseModel): username: str password: SecretStr user = User(username='scolvin', password='password1') print(user) #> username='scolvin' password=SecretStr('**********') print(user.password.get_secret_value()) #> password1 print((SecretStr('password'), SecretStr(''))) #> (SecretStr('**********'), SecretStr('')) ``` As seen above, by default, [`SecretStr`][pydantic.types.SecretStr] (and [`SecretBytes`][pydantic.types.SecretBytes]) will be serialized as `**********` when serializing to json. You can use the [`field_serializer`][pydantic.functional_serializers.field_serializer] to dump the secret as plain-text when serializing to json. ```python from pydantic import BaseModel, SecretBytes, SecretStr, field_serializer class Model(BaseModel): password: SecretStr password_bytes: SecretBytes @field_serializer('password', 'password_bytes', when_used='json') def dump_secret(self, v): return v.get_secret_value() model = Model(password='IAmSensitive', password_bytes=b'IAmSensitiveBytes') print(model) #> password=SecretStr('**********') password_bytes=SecretBytes(b'**********') print(model.password) #> ********** print(model.model_dump()) ''' { 'password': SecretStr('**********'), 'password_bytes': SecretBytes(b'**********'), } ''' print(model.model_dump_json()) #> {"password":"IAmSensitive","password_bytes":"IAmSensitiveBytes"} ``` """ _inner_schema: ClassVar[CoreSchema] = core_schema.str_schema() _error_kind: ClassVar[str] = 'string_type' def __len__(self) -> int: return len(self._secret_value) def _display(self) -> str: return _secret_display(self._secret_value) ```` ### SecretBytes Bases: `_SecretField[bytes]` A bytes used for storing sensitive information that you do not want to be visible in logging or tracebacks. It displays `b'**********'` instead of the string value on `repr()` and `str()` calls. 
When the secret value is nonempty, it is displayed as `b'**********'` instead of the underlying value in calls to `repr()` and `str()`. If the value *is* empty, it is displayed as `b''`. ```python from pydantic import BaseModel, SecretBytes class User(BaseModel): username: str password: SecretBytes user = User(username='scolvin', password=b'password1') print(user) #> username='scolvin' password=SecretBytes(b'**********') print(user.password.get_secret_value()) #> b'password1' print((SecretBytes(b'password'), SecretBytes(b''))) #> (SecretBytes(b'**********'), SecretBytes(b'')) ``` Source code in `pydantic/types.py` ````python class SecretBytes(_SecretField[bytes]): """A bytes used for storing sensitive information that you do not want to be visible in logging or tracebacks. It displays `b'**********'` instead of the string value on `repr()` and `str()` calls. When the secret value is nonempty, it is displayed as `b'**********'` instead of the underlying value in calls to `repr()` and `str()`. If the value _is_ empty, it is displayed as `b''`. ```python from pydantic import BaseModel, SecretBytes class User(BaseModel): username: str password: SecretBytes user = User(username='scolvin', password=b'password1') print(user) #> username='scolvin' password=SecretBytes(b'**********') print(user.password.get_secret_value()) #> b'password1' print((SecretBytes(b'password'), SecretBytes(b''))) #> (SecretBytes(b'**********'), SecretBytes(b'')) ``` """ _inner_schema: ClassVar[CoreSchema] = core_schema.bytes_schema() _error_kind: ClassVar[str] = 'bytes_type' def __len__(self) -> int: return len(self._secret_value) def _display(self) -> bytes: return _secret_display(self._secret_value).encode() ```` ### PaymentCardNumber Bases: `str` Based on: https://en.wikipedia.org/wiki/Payment_card_number. Source code in `pydantic/types.py` ```python @deprecated( 'The `PaymentCardNumber` class is deprecated, use `pydantic_extra_types` instead. ' 'See https://docs.pydantic.dev/latest/api/pydantic_extra_types_payment/#pydantic_extra_types.payment.PaymentCardNumber.', category=PydanticDeprecatedSince20, ) class PaymentCardNumber(str): """Based on: https://en.wikipedia.org/wiki/Payment_card_number.""" strip_whitespace: ClassVar[bool] = True min_length: ClassVar[int] = 12 max_length: ClassVar[int] = 19 bin: str last4: str brand: PaymentCardBrand def __init__(self, card_number: str): self.validate_digits(card_number) card_number = self.validate_luhn_check_digit(card_number) self.bin = card_number[:6] self.last4 = card_number[-4:] self.brand = self.validate_brand(card_number) @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: return core_schema.with_info_after_validator_function( cls.validate, core_schema.str_schema( min_length=cls.min_length, max_length=cls.max_length, strip_whitespace=cls.strip_whitespace ), ) @classmethod def validate(cls, input_value: str, /, _: core_schema.ValidationInfo) -> PaymentCardNumber: """Validate the card number and return a `PaymentCardNumber` instance.""" return cls(input_value) @property def masked(self) -> str: """Mask all but the last 4 digits of the card number. Returns: A masked card number string.
""" num_masked = len(self) - 10 # len(bin) + len(last4) == 10 return f'{self.bin}{"*" * num_masked}{self.last4}' @classmethod def validate_digits(cls, card_number: str) -> None: """Validate that the card number is all digits.""" if not card_number.isdigit(): raise PydanticCustomError('payment_card_number_digits', 'Card number is not all digits') @classmethod def validate_luhn_check_digit(cls, card_number: str) -> str: """Based on: https://en.wikipedia.org/wiki/Luhn_algorithm.""" sum_ = int(card_number[-1]) length = len(card_number) parity = length % 2 for i in range(length - 1): digit = int(card_number[i]) if i % 2 == parity: digit *= 2 if digit > 9: digit -= 9 sum_ += digit valid = sum_ % 10 == 0 if not valid: raise PydanticCustomError('payment_card_number_luhn', 'Card number is not luhn valid') return card_number @staticmethod def validate_brand(card_number: str) -> PaymentCardBrand: """Validate length based on BIN for major brands: https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number_(IIN). """ if card_number[0] == '4': brand = PaymentCardBrand.visa elif 51 <= int(card_number[:2]) <= 55: brand = PaymentCardBrand.mastercard elif card_number[:2] in {'34', '37'}: brand = PaymentCardBrand.amex else: brand = PaymentCardBrand.other required_length: None | int | str = None if brand in PaymentCardBrand.mastercard: required_length = 16 valid = len(card_number) == required_length elif brand == PaymentCardBrand.visa: required_length = '13, 16 or 19' valid = len(card_number) in {13, 16, 19} elif brand == PaymentCardBrand.amex: required_length = 15 valid = len(card_number) == required_length else: valid = True if not valid: raise PydanticCustomError( 'payment_card_number_brand', 'Length for a {brand} card must be {required_length}', {'brand': brand, 'required_length': required_length}, ) return brand ``` #### masked ```python masked: str ``` Mask all but the last 4 digits of the card number. Returns: | Type | Description | | --- | --- | | `str` | A masked card number string. | #### validate ```python validate( input_value: str, /, _: ValidationInfo ) -> PaymentCardNumber ``` Validate the card number and return a `PaymentCardNumber` instance. Source code in `pydantic/types.py` ```python @classmethod def validate(cls, input_value: str, /, _: core_schema.ValidationInfo) -> PaymentCardNumber: """Validate the card number and return a `PaymentCardNumber` instance.""" return cls(input_value) ``` #### validate_digits ```python validate_digits(card_number: str) -> None ``` Validate that the card number is all digits. Source code in `pydantic/types.py` ```python @classmethod def validate_digits(cls, card_number: str) -> None: """Validate that the card number is all digits.""" if not card_number.isdigit(): raise PydanticCustomError('payment_card_number_digits', 'Card number is not all digits') ``` #### validate_luhn_check_digit ```python validate_luhn_check_digit(card_number: str) -> str ``` Based on: https://en.wikipedia.org/wiki/Luhn_algorithm. 
Source code in `pydantic/types.py` ```python @classmethod def validate_luhn_check_digit(cls, card_number: str) -> str: """Based on: https://en.wikipedia.org/wiki/Luhn_algorithm.""" sum_ = int(card_number[-1]) length = len(card_number) parity = length % 2 for i in range(length - 1): digit = int(card_number[i]) if i % 2 == parity: digit *= 2 if digit > 9: digit -= 9 sum_ += digit valid = sum_ % 10 == 0 if not valid: raise PydanticCustomError('payment_card_number_luhn', 'Card number is not luhn valid') return card_number ``` #### validate_brand ```python validate_brand(card_number: str) -> PaymentCardBrand ``` Validate length based on BIN for major brands: https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number\_(IIN). Source code in `pydantic/types.py` ```python @staticmethod def validate_brand(card_number: str) -> PaymentCardBrand: """Validate length based on BIN for major brands: https://en.wikipedia.org/wiki/Payment_card_number#Issuer_identification_number_(IIN). """ if card_number[0] == '4': brand = PaymentCardBrand.visa elif 51 <= int(card_number[:2]) <= 55: brand = PaymentCardBrand.mastercard elif card_number[:2] in {'34', '37'}: brand = PaymentCardBrand.amex else: brand = PaymentCardBrand.other required_length: None | int | str = None if brand in PaymentCardBrand.mastercard: required_length = 16 valid = len(card_number) == required_length elif brand == PaymentCardBrand.visa: required_length = '13, 16 or 19' valid = len(card_number) in {13, 16, 19} elif brand == PaymentCardBrand.amex: required_length = 15 valid = len(card_number) == required_length else: valid = True if not valid: raise PydanticCustomError( 'payment_card_number_brand', 'Length for a {brand} card must be {required_length}', {'brand': brand, 'required_length': required_length}, ) return brand ``` ### ByteSize Bases: `int` Converts a string representing a number of bytes with units (such as `'1KB'` or `'11.5MiB'`) into an integer. You can use the `ByteSize` data type to (case-insensitively) convert a string representation of a number of bytes into an integer, and also to print out human-readable strings representing a number of bytes. In conformance with [IEC 80000-13 Standard](https://en.wikipedia.org/wiki/ISO/IEC_80000) we interpret `'1KB'` to mean 1000 bytes, and `'1KiB'` to mean 1024 bytes. In general, including a middle `'i'` will cause the unit to be interpreted as a power of 2, rather than a power of 10 (so, for example, `'1 MB'` is treated as `1_000_000` bytes, whereas `'1 MiB'` is treated as `1_048_576` bytes). Info Note that `1b` will be parsed as "1 byte" and not "1 bit". ```python from pydantic import BaseModel, ByteSize class MyModel(BaseModel): size: ByteSize print(MyModel(size=52000).size) #> 52000 print(MyModel(size='3000 KiB').size) #> 3072000 m = MyModel(size='50 PB') print(m.size.human_readable()) #> 44.4PiB print(m.size.human_readable(decimal=True)) #> 50.0PB print(m.size.human_readable(separator=' ')) #> 44.4 PiB print(m.size.to('TiB')) #> 45474.73508864641 ``` Source code in `pydantic/types.py` ````python class ByteSize(int): """Converts a string representing a number of bytes with units (such as `'1KB'` or `'11.5MiB'`) into an integer. You can use the `ByteSize` data type to (case-insensitively) convert a string representation of a number of bytes into an integer, and also to print out human-readable strings representing a number of bytes. 
In conformance with [IEC 80000-13 Standard](https://en.wikipedia.org/wiki/ISO/IEC_80000) we interpret `'1KB'` to mean 1000 bytes, and `'1KiB'` to mean 1024 bytes. In general, including a middle `'i'` will cause the unit to be interpreted as a power of 2, rather than a power of 10 (so, for example, `'1 MB'` is treated as `1_000_000` bytes, whereas `'1 MiB'` is treated as `1_048_576` bytes). !!! info Note that `1b` will be parsed as "1 byte" and not "1 bit". ```python from pydantic import BaseModel, ByteSize class MyModel(BaseModel): size: ByteSize print(MyModel(size=52000).size) #> 52000 print(MyModel(size='3000 KiB').size) #> 3072000 m = MyModel(size='50 PB') print(m.size.human_readable()) #> 44.4PiB print(m.size.human_readable(decimal=True)) #> 50.0PB print(m.size.human_readable(separator=' ')) #> 44.4 PiB print(m.size.to('TiB')) #> 45474.73508864641 ``` """ byte_sizes = { 'b': 1, 'kb': 10**3, 'mb': 10**6, 'gb': 10**9, 'tb': 10**12, 'pb': 10**15, 'eb': 10**18, 'kib': 2**10, 'mib': 2**20, 'gib': 2**30, 'tib': 2**40, 'pib': 2**50, 'eib': 2**60, 'bit': 1 / 8, 'kbit': 10**3 / 8, 'mbit': 10**6 / 8, 'gbit': 10**9 / 8, 'tbit': 10**12 / 8, 'pbit': 10**15 / 8, 'ebit': 10**18 / 8, 'kibit': 2**10 / 8, 'mibit': 2**20 / 8, 'gibit': 2**30 / 8, 'tibit': 2**40 / 8, 'pibit': 2**50 / 8, 'eibit': 2**60 / 8, } byte_sizes.update({k.lower()[0]: v for k, v in byte_sizes.items() if 'i' not in k}) byte_string_pattern = r'^\s*(\d*\.?\d+)\s*(\w+)?' byte_string_re = re.compile(byte_string_pattern, re.IGNORECASE) @classmethod def __get_pydantic_core_schema__(cls, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: return core_schema.with_info_after_validator_function( function=cls._validate, schema=core_schema.union_schema( [ core_schema.str_schema(pattern=cls.byte_string_pattern), core_schema.int_schema(ge=0), ], custom_error_type='byte_size', custom_error_message='could not parse value and unit from byte string', ), serialization=core_schema.plain_serializer_function_ser_schema( int, return_schema=core_schema.int_schema(ge=0) ), ) @classmethod def _validate(cls, input_value: Any, /, _: core_schema.ValidationInfo) -> ByteSize: try: return cls(int(input_value)) except ValueError: pass str_match = cls.byte_string_re.match(str(input_value)) if str_match is None: raise PydanticCustomError('byte_size', 'could not parse value and unit from byte string') scalar, unit = str_match.groups() if unit is None: unit = 'b' try: unit_mult = cls.byte_sizes[unit.lower()] except KeyError: raise PydanticCustomError('byte_size_unit', 'could not interpret byte unit: {unit}', {'unit': unit}) return cls(int(float(scalar) * unit_mult)) def human_readable(self, decimal: bool = False, separator: str = '') -> str: """Converts a byte size to a human readable string. Args: decimal: If True, use decimal units (e.g. 1000 bytes per KB). If False, use binary units (e.g. 1024 bytes per KiB). separator: A string used to split the value and unit. Defaults to an empty string (''). Returns: A human readable string representation of the byte size. 
""" if decimal: divisor = 1000 units = 'B', 'KB', 'MB', 'GB', 'TB', 'PB' final_unit = 'EB' else: divisor = 1024 units = 'B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB' final_unit = 'EiB' num = float(self) for unit in units: if abs(num) < divisor: if unit == 'B': return f'{num:0.0f}{separator}{unit}' else: return f'{num:0.1f}{separator}{unit}' num /= divisor return f'{num:0.1f}{separator}{final_unit}' def to(self, unit: str) -> float: """Converts a byte size to another unit, including both byte and bit units. Args: unit: The unit to convert to. Must be one of the following: B, KB, MB, GB, TB, PB, EB, KiB, MiB, GiB, TiB, PiB, EiB (byte units) and bit, kbit, mbit, gbit, tbit, pbit, ebit, kibit, mibit, gibit, tibit, pibit, eibit (bit units). Returns: The byte size in the new unit. """ try: unit_div = self.byte_sizes[unit.lower()] except KeyError: raise PydanticCustomError('byte_size_unit', 'Could not interpret byte unit: {unit}', {'unit': unit}) return self / unit_div ```` #### human_readable ```python human_readable( decimal: bool = False, separator: str = "" ) -> str ``` Converts a byte size to a human readable string. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `decimal` | `bool` | If True, use decimal units (e.g. 1000 bytes per KB). If False, use binary units (e.g. 1024 bytes per KiB). | `False` | | `separator` | `str` | A string used to split the value and unit. Defaults to an empty string (''). | `''` | Returns: | Type | Description | | --- | --- | | `str` | A human readable string representation of the byte size. | Source code in `pydantic/types.py` ```python def human_readable(self, decimal: bool = False, separator: str = '') -> str: """Converts a byte size to a human readable string. Args: decimal: If True, use decimal units (e.g. 1000 bytes per KB). If False, use binary units (e.g. 1024 bytes per KiB). separator: A string used to split the value and unit. Defaults to an empty string (''). Returns: A human readable string representation of the byte size. """ if decimal: divisor = 1000 units = 'B', 'KB', 'MB', 'GB', 'TB', 'PB' final_unit = 'EB' else: divisor = 1024 units = 'B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB' final_unit = 'EiB' num = float(self) for unit in units: if abs(num) < divisor: if unit == 'B': return f'{num:0.0f}{separator}{unit}' else: return f'{num:0.1f}{separator}{unit}' num /= divisor return f'{num:0.1f}{separator}{final_unit}' ``` #### to ```python to(unit: str) -> float ``` Converts a byte size to another unit, including both byte and bit units. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `unit` | `str` | The unit to convert to. Must be one of the following: B, KB, MB, GB, TB, PB, EB, KiB, MiB, GiB, TiB, PiB, EiB (byte units) and bit, kbit, mbit, gbit, tbit, pbit, ebit, kibit, mibit, gibit, tibit, pibit, eibit (bit units). | *required* | Returns: | Type | Description | | --- | --- | | `float` | The byte size in the new unit. | Source code in `pydantic/types.py` ```python def to(self, unit: str) -> float: """Converts a byte size to another unit, including both byte and bit units. Args: unit: The unit to convert to. Must be one of the following: B, KB, MB, GB, TB, PB, EB, KiB, MiB, GiB, TiB, PiB, EiB (byte units) and bit, kbit, mbit, gbit, tbit, pbit, ebit, kibit, mibit, gibit, tibit, pibit, eibit (bit units). Returns: The byte size in the new unit. 
""" try: unit_div = self.byte_sizes[unit.lower()] except KeyError: raise PydanticCustomError('byte_size_unit', 'Could not interpret byte unit: {unit}', {'unit': unit}) return self / unit_div ``` ### PastDate A date in the past. Source code in `pydantic/types.py` ```python class PastDate: """A date in the past.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.date_schema(now_op='past') else: schema = handler(source) _check_annotated_type(schema['type'], 'date', cls.__name__) schema['now_op'] = 'past' return schema def __repr__(self) -> str: return 'PastDate' ``` ### FutureDate A date in the future. Source code in `pydantic/types.py` ```python class FutureDate: """A date in the future.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.date_schema(now_op='future') else: schema = handler(source) _check_annotated_type(schema['type'], 'date', cls.__name__) schema['now_op'] = 'future' return schema def __repr__(self) -> str: return 'FutureDate' ``` ### AwareDatetime A datetime that requires timezone info. Source code in `pydantic/types.py` ```python class AwareDatetime: """A datetime that requires timezone info.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(tz_constraint='aware') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['tz_constraint'] = 'aware' return schema def __repr__(self) -> str: return 'AwareDatetime' ``` ### NaiveDatetime A datetime that doesn't require timezone info. Source code in `pydantic/types.py` ```python class NaiveDatetime: """A datetime that doesn't require timezone info.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(tz_constraint='naive') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['tz_constraint'] = 'naive' return schema def __repr__(self) -> str: return 'NaiveDatetime' ``` ### PastDatetime A datetime that must be in the past. Source code in `pydantic/types.py` ```python class PastDatetime: """A datetime that must be in the past.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(now_op='past') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['now_op'] = 'past' return schema def __repr__(self) -> str: return 'PastDatetime' ``` ### FutureDatetime A datetime that must be in the future. 
Source code in `pydantic/types.py` ```python class FutureDatetime: """A datetime that must be in the future.""" @classmethod def __get_pydantic_core_schema__( cls, source: type[Any], handler: GetCoreSchemaHandler ) -> core_schema.CoreSchema: if cls is source: # used directly as a type return core_schema.datetime_schema(now_op='future') else: schema = handler(source) _check_annotated_type(schema['type'], 'datetime', cls.__name__) schema['now_op'] = 'future' return schema def __repr__(self) -> str: return 'FutureDatetime' ``` ### EncoderProtocol Bases: `Protocol` Protocol for encoding and decoding data to and from bytes. Source code in `pydantic/types.py` ```python class EncoderProtocol(Protocol): """Protocol for encoding and decoding data to and from bytes.""" @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data using the encoder. Args: data: The data to decode. Returns: The decoded data. """ ... @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data using the encoder. Args: value: The data to encode. Returns: The encoded data. """ ... @classmethod def get_json_format(cls) -> str: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ ... ``` #### decode ```python decode(data: bytes) -> bytes ``` Decode the data using the encoder. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `data` | `bytes` | The data to decode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The decoded data. | Source code in `pydantic/types.py` ```python @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data using the encoder. Args: data: The data to decode. Returns: The decoded data. """ ... ``` #### encode ```python encode(value: bytes) -> bytes ``` Encode the data using the encoder. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `bytes` | The data to encode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The encoded data. | Source code in `pydantic/types.py` ```python @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data using the encoder. Args: value: The data to encode. Returns: The encoded data. """ ... ``` #### get_json_format ```python get_json_format() -> str ``` Get the JSON format for the encoded data. Returns: | Type | Description | | --- | --- | | `str` | The JSON format for the encoded data. | Source code in `pydantic/types.py` ```python @classmethod def get_json_format(cls) -> str: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ ... ``` ### Base64Encoder Bases: `EncoderProtocol` Standard (non-URL-safe) Base64 encoder. Source code in `pydantic/types.py` ```python class Base64Encoder(EncoderProtocol): """Standard (non-URL-safe) Base64 encoder.""" @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data from base64 encoded bytes to original bytes data. Args: data: The data to decode. Returns: The decoded data. """ try: return base64.b64decode(data) except ValueError as e: raise PydanticCustomError('base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}) @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data from bytes to a base64 encoded bytes. Args: value: The data to encode. Returns: The encoded data. """ return base64.b64encode(value) @classmethod def get_json_format(cls) -> Literal['base64']: """Get the JSON format for the encoded data. 
Returns: The JSON format for the encoded data. """ return 'base64' ``` #### decode ```python decode(data: bytes) -> bytes ``` Decode the data from base64 encoded bytes to original bytes data. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `data` | `bytes` | The data to decode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The decoded data. | Source code in `pydantic/types.py` ```python @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data from base64 encoded bytes to original bytes data. Args: data: The data to decode. Returns: The decoded data. """ try: return base64.b64decode(data) except ValueError as e: raise PydanticCustomError('base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}) ``` #### encode ```python encode(value: bytes) -> bytes ``` Encode the data from bytes to a base64 encoded bytes. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `bytes` | The data to encode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The encoded data. | Source code in `pydantic/types.py` ```python @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data from bytes to a base64 encoded bytes. Args: value: The data to encode. Returns: The encoded data. """ return base64.b64encode(value) ``` #### get_json_format ```python get_json_format() -> Literal['base64'] ``` Get the JSON format for the encoded data. Returns: | Type | Description | | --- | --- | | `Literal['base64']` | The JSON format for the encoded data. | Source code in `pydantic/types.py` ```python @classmethod def get_json_format(cls) -> Literal['base64']: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ return 'base64' ``` ### Base64UrlEncoder Bases: `EncoderProtocol` URL-safe Base64 encoder. Source code in `pydantic/types.py` ```python class Base64UrlEncoder(EncoderProtocol): """URL-safe Base64 encoder.""" @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data from base64 encoded bytes to original bytes data. Args: data: The data to decode. Returns: The decoded data. """ try: return base64.urlsafe_b64decode(data) except ValueError as e: raise PydanticCustomError('base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}) @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data from bytes to a base64 encoded bytes. Args: value: The data to encode. Returns: The encoded data. """ return base64.urlsafe_b64encode(value) @classmethod def get_json_format(cls) -> Literal['base64url']: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ return 'base64url' ``` #### decode ```python decode(data: bytes) -> bytes ``` Decode the data from base64 encoded bytes to original bytes data. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `data` | `bytes` | The data to decode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The decoded data. | Source code in `pydantic/types.py` ```python @classmethod def decode(cls, data: bytes) -> bytes: """Decode the data from base64 encoded bytes to original bytes data. Args: data: The data to decode. Returns: The decoded data. 
""" try: return base64.urlsafe_b64decode(data) except ValueError as e: raise PydanticCustomError('base64_decode', "Base64 decoding error: '{error}'", {'error': str(e)}) ``` #### encode ```python encode(value: bytes) -> bytes ``` Encode the data from bytes to a base64 encoded bytes. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `bytes` | The data to encode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The encoded data. | Source code in `pydantic/types.py` ```python @classmethod def encode(cls, value: bytes) -> bytes: """Encode the data from bytes to a base64 encoded bytes. Args: value: The data to encode. Returns: The encoded data. """ return base64.urlsafe_b64encode(value) ``` #### get_json_format ```python get_json_format() -> Literal['base64url'] ``` Get the JSON format for the encoded data. Returns: | Type | Description | | --- | --- | | `Literal['base64url']` | The JSON format for the encoded data. | Source code in `pydantic/types.py` ```python @classmethod def get_json_format(cls) -> Literal['base64url']: """Get the JSON format for the encoded data. Returns: The JSON format for the encoded data. """ return 'base64url' ``` ### EncodedBytes A bytes type that is encoded and decoded using the specified encoder. `EncodedBytes` needs an encoder that implements `EncoderProtocol` to operate. ```python from typing import Annotated from pydantic import BaseModel, EncodedBytes, EncoderProtocol, ValidationError class MyEncoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: if data == b'**undecodable**': raise ValueError('Cannot decode data') return data[13:] @classmethod def encode(cls, value: bytes) -> bytes: return b'**encoded**: ' + value @classmethod def get_json_format(cls) -> str: return 'my-encoder' MyEncodedBytes = Annotated[bytes, EncodedBytes(encoder=MyEncoder)] class Model(BaseModel): my_encoded_bytes: MyEncodedBytes # Initialize the model with encoded data m = Model(my_encoded_bytes=b'**encoded**: some bytes') # Access decoded value print(m.my_encoded_bytes) #> b'some bytes' # Serialize into the encoded form print(m.model_dump()) #> {'my_encoded_bytes': b'**encoded**: some bytes'} # Validate encoded data try: Model(my_encoded_bytes=b'**undecodable**') except ValidationError as e: print(e) ''' 1 validation error for Model my_encoded_bytes Value error, Cannot decode data [type=value_error, input_value=b'**undecodable**', input_type=bytes] ''' ``` Source code in `pydantic/types.py` ````python @_dataclasses.dataclass(**_internal_dataclass.slots_true) class EncodedBytes: """A bytes type that is encoded and decoded using the specified encoder. `EncodedBytes` needs an encoder that implements `EncoderProtocol` to operate. 
```python from typing import Annotated from pydantic import BaseModel, EncodedBytes, EncoderProtocol, ValidationError class MyEncoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: if data == b'**undecodable**': raise ValueError('Cannot decode data') return data[13:] @classmethod def encode(cls, value: bytes) -> bytes: return b'**encoded**: ' + value @classmethod def get_json_format(cls) -> str: return 'my-encoder' MyEncodedBytes = Annotated[bytes, EncodedBytes(encoder=MyEncoder)] class Model(BaseModel): my_encoded_bytes: MyEncodedBytes # Initialize the model with encoded data m = Model(my_encoded_bytes=b'**encoded**: some bytes') # Access decoded value print(m.my_encoded_bytes) #> b'some bytes' # Serialize into the encoded form print(m.model_dump()) #> {'my_encoded_bytes': b'**encoded**: some bytes'} # Validate encoded data try: Model(my_encoded_bytes=b'**undecodable**') except ValidationError as e: print(e) ''' 1 validation error for Model my_encoded_bytes Value error, Cannot decode data [type=value_error, input_value=b'**undecodable**', input_type=bytes] ''' ``` """ encoder: type[EncoderProtocol] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.update(type='string', format=self.encoder.get_json_format()) return field_schema def __get_pydantic_core_schema__(self, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source) _check_annotated_type(schema['type'], 'bytes', self.__class__.__name__) return core_schema.with_info_after_validator_function( function=self.decode, schema=schema, serialization=core_schema.plain_serializer_function_ser_schema(function=self.encode), ) def decode(self, data: bytes, _: core_schema.ValidationInfo) -> bytes: """Decode the data using the specified encoder. Args: data: The data to decode. Returns: The decoded data. """ return self.encoder.decode(data) def encode(self, value: bytes) -> bytes: """Encode the data using the specified encoder. Args: value: The data to encode. Returns: The encoded data. """ return self.encoder.encode(value) def __hash__(self) -> int: return hash(self.encoder) ```` #### decode ```python decode(data: bytes, _: ValidationInfo) -> bytes ``` Decode the data using the specified encoder. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `data` | `bytes` | The data to decode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The decoded data. | Source code in `pydantic/types.py` ```python def decode(self, data: bytes, _: core_schema.ValidationInfo) -> bytes: """Decode the data using the specified encoder. Args: data: The data to decode. Returns: The decoded data. """ return self.encoder.decode(data) ``` #### encode ```python encode(value: bytes) -> bytes ``` Encode the data using the specified encoder. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `bytes` | The data to encode. | *required* | Returns: | Type | Description | | --- | --- | | `bytes` | The encoded data. | Source code in `pydantic/types.py` ```python def encode(self, value: bytes) -> bytes: """Encode the data using the specified encoder. Args: value: The data to encode. Returns: The encoded data. """ return self.encoder.encode(value) ``` ### EncodedStr A str type that is encoded and decoded using the specified encoder. 
`EncodedStr` needs an encoder that implements `EncoderProtocol` to operate. ```python from typing import Annotated from pydantic import BaseModel, EncodedStr, EncoderProtocol, ValidationError class MyEncoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: if data == b'**undecodable**': raise ValueError('Cannot decode data') return data[13:] @classmethod def encode(cls, value: bytes) -> bytes: return b'**encoded**: ' + value @classmethod def get_json_format(cls) -> str: return 'my-encoder' MyEncodedStr = Annotated[str, EncodedStr(encoder=MyEncoder)] class Model(BaseModel): my_encoded_str: MyEncodedStr # Initialize the model with encoded data m = Model(my_encoded_str='**encoded**: some str') # Access decoded value print(m.my_encoded_str) #> some str # Serialize into the encoded form print(m.model_dump()) #> {'my_encoded_str': '**encoded**: some str'} # Validate encoded data try: Model(my_encoded_str='**undecodable**') except ValidationError as e: print(e) ''' 1 validation error for Model my_encoded_str Value error, Cannot decode data [type=value_error, input_value='**undecodable**', input_type=str] ''' ``` Source code in `pydantic/types.py` ````python @_dataclasses.dataclass(**_internal_dataclass.slots_true) class EncodedStr: """A str type that is encoded and decoded using the specified encoder. `EncodedStr` needs an encoder that implements `EncoderProtocol` to operate. ```python from typing import Annotated from pydantic import BaseModel, EncodedStr, EncoderProtocol, ValidationError class MyEncoder(EncoderProtocol): @classmethod def decode(cls, data: bytes) -> bytes: if data == b'**undecodable**': raise ValueError('Cannot decode data') return data[13:] @classmethod def encode(cls, value: bytes) -> bytes: return b'**encoded**: ' + value @classmethod def get_json_format(cls) -> str: return 'my-encoder' MyEncodedStr = Annotated[str, EncodedStr(encoder=MyEncoder)] class Model(BaseModel): my_encoded_str: MyEncodedStr # Initialize the model with encoded data m = Model(my_encoded_str='**encoded**: some str') # Access decoded value print(m.my_encoded_str) #> some str # Serialize into the encoded form print(m.model_dump()) #> {'my_encoded_str': '**encoded**: some str'} # Validate encoded data try: Model(my_encoded_str='**undecodable**') except ValidationError as e: print(e) ''' 1 validation error for Model my_encoded_str Value error, Cannot decode data [type=value_error, input_value='**undecodable**', input_type=str] ''' ``` """ encoder: type[EncoderProtocol] def __get_pydantic_json_schema__( self, core_schema: core_schema.CoreSchema, handler: GetJsonSchemaHandler ) -> JsonSchemaValue: field_schema = handler(core_schema) field_schema.update(type='string', format=self.encoder.get_json_format()) return field_schema def __get_pydantic_core_schema__(self, source: type[Any], handler: GetCoreSchemaHandler) -> core_schema.CoreSchema: schema = handler(source) _check_annotated_type(schema['type'], 'str', self.__class__.__name__) return core_schema.with_info_after_validator_function( function=self.decode_str, schema=schema, serialization=core_schema.plain_serializer_function_ser_schema(function=self.encode_str), ) def decode_str(self, data: str, _: core_schema.ValidationInfo) -> str: """Decode the data using the specified encoder. Args: data: The data to decode. Returns: The decoded data. """ return self.encoder.decode(data.encode()).decode() def encode_str(self, value: str) -> str: """Encode the data using the specified encoder. Args: value: The data to encode. 
Returns: The encoded data. """ return self.encoder.encode(value.encode()).decode() # noqa: UP008 def __hash__(self) -> int: return hash(self.encoder) ```` #### decode_str ```python decode_str(data: str, _: ValidationInfo) -> str ``` Decode the data using the specified encoder. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `data` | `str` | The data to decode. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The decoded data. | Source code in `pydantic/types.py` ```python def decode_str(self, data: str, _: core_schema.ValidationInfo) -> str: """Decode the data using the specified encoder. Args: data: The data to decode. Returns: The decoded data. """ return self.encoder.decode(data.encode()).decode() ``` #### encode_str ```python encode_str(value: str) -> str ``` Encode the data using the specified encoder. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `value` | `str` | The data to encode. | *required* | Returns: | Type | Description | | --- | --- | | `str` | The encoded data. | Source code in `pydantic/types.py` ```python def encode_str(self, value: str) -> str: """Encode the data using the specified encoder. Args: value: The data to encode. Returns: The encoded data. """ return self.encoder.encode(value.encode()).decode() # noqa: UP008 ``` ### GetPydanticSchema Usage Documentation [Using `GetPydanticSchema` to Reduce Boilerplate](../../concepts/types/#using-getpydanticschema-to-reduce-boilerplate) A convenience class for creating an annotation that provides pydantic custom type hooks. This class is intended to eliminate the need to create a custom "marker" which defines the `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__` custom hook methods. For example, to have a field treated by type checkers as `int`, but by pydantic as `Any`, you can do: ```python from typing import Annotated, Any from pydantic import BaseModel, GetPydanticSchema HandleAsAny = GetPydanticSchema(lambda _s, h: h(Any)) class Model(BaseModel): x: Annotated[int, HandleAsAny] # pydantic sees `x: Any` print(repr(Model(x='abc').x)) #> 'abc' ``` Source code in `pydantic/types.py` ````python @_dataclasses.dataclass(**_internal_dataclass.slots_true) class GetPydanticSchema: """!!! abstract "Usage Documentation" [Using `GetPydanticSchema` to Reduce Boilerplate](../concepts/types.md#using-getpydanticschema-to-reduce-boilerplate) A convenience class for creating an annotation that provides pydantic custom type hooks. This class is intended to eliminate the need to create a custom "marker" which defines the `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__` custom hook methods. 
For example, to have a field treated by type checkers as `int`, but by pydantic as `Any`, you can do: ```python from typing import Annotated, Any from pydantic import BaseModel, GetPydanticSchema HandleAsAny = GetPydanticSchema(lambda _s, h: h(Any)) class Model(BaseModel): x: Annotated[int, HandleAsAny] # pydantic sees `x: Any` print(repr(Model(x='abc').x)) #> 'abc' ``` """ get_pydantic_core_schema: Callable[[Any, GetCoreSchemaHandler], CoreSchema] | None = None get_pydantic_json_schema: Callable[[Any, GetJsonSchemaHandler], JsonSchemaValue] | None = None # Note: we may want to consider adding a convenience staticmethod `def for_type(type_: Any) -> GetPydanticSchema:` # which returns `GetPydanticSchema(lambda _s, h: h(type_))` if not TYPE_CHECKING: # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access def __getattr__(self, item: str) -> Any: """Use this rather than defining `__get_pydantic_core_schema__` etc. to reduce the number of nested calls.""" if item == '__get_pydantic_core_schema__' and self.get_pydantic_core_schema: return self.get_pydantic_core_schema elif item == '__get_pydantic_json_schema__' and self.get_pydantic_json_schema: return self.get_pydantic_json_schema else: return object.__getattribute__(self, item) __hash__ = object.__hash__ ```` ### Tag Provides a way to specify the expected tag to use for a case of a (callable) discriminated union. Also provides a way to label a union case in error messages. When using a callable `Discriminator`, attach a `Tag` to each case in the `Union` to specify the tag that should be used to identify that case. In the example below, the `Tag` is used to specify that if `get_discriminator_value` returns `'apple'`, the input should be validated as an `ApplePie`, and if it returns `'pumpkin'`, the input should be validated as a `PumpkinPie`. The primary role of the `Tag` here is to map the return value from the callable `Discriminator` function to the appropriate member of the `Union` in question. ```python from typing import Annotated, Any, Literal, Union from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) ''' ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) ''' pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) ''' ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) ''' ``` Note You must specify a `Tag` for every case in a `Union` that is associated with a callable `Discriminator`. Failing to do so will result in a `PydanticUserError` with code [`callable-discriminator-no-tag`](../../errors/usage_errors/#callable-discriminator-no-tag).
See the [Discriminated Unions](../../concepts/unions/#discriminated-unions) concepts docs for more details on how to use `Tag`s. Source code in `pydantic/types.py` ````python @_dataclasses.dataclass(**_internal_dataclass.slots_true, frozen=True) class Tag: """Provides a way to specify the expected tag to use for a case of a (callable) discriminated union. Also provides a way to label a union case in error messages. When using a callable `Discriminator`, attach a `Tag` to each case in the `Union` to specify the tag that should be used to identify that case. In the example below, the `Tag` is used to specify that if `get_discriminator_value` returns `'apple'`, the input should be validated as an `ApplePie`, and if it returns `'pumpkin'`, the input should be validated as a `PumpkinPie`. The primary role of the `Tag` here is to map the return value from the callable `Discriminator` function to the appropriate member of the `Union` in question. ```python from typing import Annotated, Any, Literal, Union from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) ''' ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) ''' pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) ''' ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) ''' ``` !!! note You must specify a `Tag` for every case in a `Union` that is associated with a callable `Discriminator`. Failing to do so will result in a `PydanticUserError` with code [`callable-discriminator-no-tag`](../errors/usage_errors.md#callable-discriminator-no-tag). See the [Discriminated Unions] concepts docs for more details on how to use `Tag`s. [Discriminated Unions]: ../concepts/unions.md#discriminated-unions """ tag: str def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: schema = handler(source_type) metadata = cast('CoreMetadata', schema.setdefault('metadata', {})) metadata['pydantic_internal_union_tag_key'] = self.tag return schema ```` ### Discriminator Usage Documentation [Discriminated Unions with `Callable` `Discriminator`](../../concepts/unions/#discriminated-unions-with-callable-discriminator) Provides a way to use a custom callable as the way to extract the value of a union discriminator. This allows you to get validation behavior like you'd get from `Field(discriminator=<field_name>)`, but without needing to have a single shared field across all the union choices. This also makes it possible to handle unions of models and primitive types with discriminated-union-style validation errors.
Finally, this allows you to use a custom callable as the way to identify which member of a union a value belongs to, while still seeing all the performance benefits of a discriminated union. Consider this example, which is much more performant with the use of `Discriminator` and thus a `TaggedUnion` than it would be as a normal `Union`. ```python from typing import Annotated, Any, Literal, Union from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) ''' ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) ''' pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) ''' ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) ''' ``` See the [Discriminated Unions](../../concepts/unions/#discriminated-unions) concepts docs for more details on how to use `Discriminator`s. Source code in `pydantic/types.py` ````python @_dataclasses.dataclass(**_internal_dataclass.slots_true, frozen=True) class Discriminator: """!!! abstract "Usage Documentation" [Discriminated Unions with `Callable` `Discriminator`](../concepts/unions.md#discriminated-unions-with-callable-discriminator) Provides a way to use a custom callable as the way to extract the value of a union discriminator. This allows you to get validation behavior like you'd get from `Field(discriminator=)`, but without needing to have a single shared field across all the union choices. This also makes it possible to handle unions of models and primitive types with discriminated-union-style validation errors. Finally, this allows you to use a custom callable as the way to identify which member of a union a value belongs to, while still seeing all the performance benefits of a discriminated union. Consider this example, which is much more performant with the use of `Discriminator` and thus a `TaggedUnion` than it would be as a normal `Union`. 
```python from typing import Annotated, Any, Literal, Union from pydantic import BaseModel, Discriminator, Tag class Pie(BaseModel): time_to_cook: int num_ingredients: int class ApplePie(Pie): fruit: Literal['apple'] = 'apple' class PumpkinPie(Pie): filling: Literal['pumpkin'] = 'pumpkin' def get_discriminator_value(v: Any) -> str: if isinstance(v, dict): return v.get('fruit', v.get('filling')) return getattr(v, 'fruit', getattr(v, 'filling', None)) class ThanksgivingDinner(BaseModel): dessert: Annotated[ Union[ Annotated[ApplePie, Tag('apple')], Annotated[PumpkinPie, Tag('pumpkin')], ], Discriminator(get_discriminator_value), ] apple_variation = ThanksgivingDinner.model_validate( {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}} ) print(repr(apple_variation)) ''' ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple')) ''' pumpkin_variation = ThanksgivingDinner.model_validate( { 'dessert': { 'filling': 'pumpkin', 'time_to_cook': 40, 'num_ingredients': 6, } } ) print(repr(pumpkin_variation)) ''' ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin')) ''' ``` See the [Discriminated Unions] concepts docs for more details on how to use `Discriminator`s. [Discriminated Unions]: ../concepts/unions.md#discriminated-unions """ discriminator: str | Callable[[Any], Hashable] """The callable or field name for discriminating the type in a tagged union. A `Callable` discriminator must extract the value of the discriminator from the input. A `str` discriminator must be the name of a field to discriminate against. """ custom_error_type: str | None = None """Type to use in [custom errors](../errors/errors.md) replacing the standard discriminated union validation errors. """ custom_error_message: str | None = None """Message to use in custom errors.""" custom_error_context: dict[str, int | str | float] | None = None """Context to use in custom errors.""" def __get_pydantic_core_schema__(self, source_type: Any, handler: GetCoreSchemaHandler) -> CoreSchema: if not is_union_origin(get_origin(source_type)): raise TypeError(f'{type(self).__name__} must be used with a Union type, not {source_type}') if isinstance(self.discriminator, str): from pydantic import Field return handler(Annotated[source_type, Field(discriminator=self.discriminator)]) else: original_schema = handler(source_type) return self._convert_schema(original_schema) def _convert_schema(self, original_schema: core_schema.CoreSchema) -> core_schema.TaggedUnionSchema: if original_schema['type'] != 'union': # This likely indicates that the schema was a single-item union that was simplified. # In this case, we do the same thing we do in # `pydantic._internal._discriminated_union._ApplyInferredDiscriminator._apply_to_root`, namely, # package the generated schema back into a single-item union. 
original_schema = core_schema.union_schema([original_schema]) tagged_union_choices = {} for choice in original_schema['choices']: tag = None if isinstance(choice, tuple): choice, tag = choice metadata = cast('CoreMetadata | None', choice.get('metadata')) if metadata is not None: tag = metadata.get('pydantic_internal_union_tag_key') or tag if tag is None: raise PydanticUserError( f'`Tag` not provided for choice {choice} used with `Discriminator`', code='callable-discriminator-no-tag', ) tagged_union_choices[tag] = choice # Have to do these verbose checks to ensure falsy values ('' and {}) don't get ignored custom_error_type = self.custom_error_type if custom_error_type is None: custom_error_type = original_schema.get('custom_error_type') custom_error_message = self.custom_error_message if custom_error_message is None: custom_error_message = original_schema.get('custom_error_message') custom_error_context = self.custom_error_context if custom_error_context is None: custom_error_context = original_schema.get('custom_error_context') custom_error_type = original_schema.get('custom_error_type') if custom_error_type is None else custom_error_type return core_schema.tagged_union_schema( tagged_union_choices, self.discriminator, custom_error_type=custom_error_type, custom_error_message=custom_error_message, custom_error_context=custom_error_context, strict=original_schema.get('strict'), ref=original_schema.get('ref'), metadata=original_schema.get('metadata'), serialization=original_schema.get('serialization'), ) ```` #### discriminator ```python discriminator: str | Callable[[Any], Hashable] ``` The callable or field name for discriminating the type in a tagged union. A `Callable` discriminator must extract the value of the discriminator from the input. A `str` discriminator must be the name of a field to discriminate against. #### custom_error_type ```python custom_error_type: str | None = None ``` Type to use in [custom errors](../../errors/errors/) replacing the standard discriminated union validation errors. #### custom_error_message ```python custom_error_message: str | None = None ``` Message to use in custom errors. #### custom_error_context ```python custom_error_context: ( dict[str, int | str | float] | None ) = None ``` Context to use in custom errors. ### FailFast Bases: `PydanticMetadata`, `BaseMetadata` A `FailFast` annotation can be used to specify that validation should stop at the first error. This can be useful when you want to validate a large amount of data and you only need to know if it's valid or not. You might want to enable this setting if you want to validate your data faster (basically, if you use this, validation will be more performant with the caveat that you get less information). ```python from typing import Annotated from pydantic import BaseModel, FailFast, ValidationError class Model(BaseModel): x: Annotated[list[int], FailFast()] # This will raise a single error for the first invalid value and stop validation try: obj = Model(x=[1, 2, 'a', 4, 5, 'b', 7, 8, 9, 'c']) except ValidationError as e: print(e) ''' 1 validation error for Model x.2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] ''' ``` Source code in `pydantic/types.py` ````python @_dataclasses.dataclass class FailFast(_fields.PydanticMetadata, BaseMetadata): """A `FailFast` annotation can be used to specify that validation should stop at the first error. 
This can be useful when you want to validate a large amount of data and you only need to know if it's valid or not. You might want to enable this setting if you want to validate your data faster (basically, if you use this, validation will be more performant with the caveat that you get less information). ```python from typing import Annotated from pydantic import BaseModel, FailFast, ValidationError class Model(BaseModel): x: Annotated[list[int], FailFast()] # This will raise a single error for the first invalid value and stop validation try: obj = Model(x=[1, 2, 'a', 4, 5, 'b', 7, 8, 9, 'c']) except ValidationError as e: print(e) ''' 1 validation error for Model x.2 Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='a', input_type=str] ''' ``` """ fail_fast: bool = True ```` ### conint ```python conint( *, strict: bool | None = None, gt: int | None = None, ge: int | None = None, lt: int | None = None, le: int | None = None, multiple_of: int | None = None ) -> type[int] ``` Discouraged This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with Field instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `conint` returns a type, which doesn't play well with static analysis tools. ```python from pydantic import BaseModel, conint class Foo(BaseModel): bar: conint(strict=True, gt=0) ``` ```python from typing import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[int, Field(strict=True, gt=0)] ``` A wrapper around `int` that allows for additional constraints. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether to validate the integer in strict mode. Defaults to None. | `None` | | `gt` | `int | None` | The value must be greater than this. | `None` | | `ge` | `int | None` | The value must be greater than or equal to this. | `None` | | `lt` | `int | None` | The value must be less than this. | `None` | | `le` | `int | None` | The value must be less than or equal to this. | `None` | | `multiple_of` | `int | None` | The value must be a multiple of this. | `None` | Returns: | Type | Description | | --- | --- | | `type[int]` | The wrapped integer type. | ```python from pydantic import BaseModel, ValidationError, conint class ConstrainedExample(BaseModel): constrained_int: conint(gt=1) m = ConstrainedExample(constrained_int=2) print(repr(m)) #> ConstrainedExample(constrained_int=2) try: ConstrainedExample(constrained_int=0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_int',), 'msg': 'Input should be greater than 1', 'input': 0, 'ctx': {'gt': 1}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` Source code in `pydantic/types.py` ````python def conint( *, strict: bool | None = None, gt: int | None = None, ge: int | None = None, lt: int | None = None, le: int | None = None, multiple_of: int | None = None, ) -> type[int]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`Field`][pydantic.fields.Field] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `conint` returns a type, which doesn't play well with static analysis tools. 
=== ":x: Don't do this" ```python from pydantic import BaseModel, conint class Foo(BaseModel): bar: conint(strict=True, gt=0) ``` === ":white_check_mark: Do this" ```python from typing import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[int, Field(strict=True, gt=0)] ``` A wrapper around `int` that allows for additional constraints. Args: strict: Whether to validate the integer in strict mode. Defaults to `None`. gt: The value must be greater than this. ge: The value must be greater than or equal to this. lt: The value must be less than this. le: The value must be less than or equal to this. multiple_of: The value must be a multiple of this. Returns: The wrapped integer type. ```python from pydantic import BaseModel, ValidationError, conint class ConstrainedExample(BaseModel): constrained_int: conint(gt=1) m = ConstrainedExample(constrained_int=2) print(repr(m)) #> ConstrainedExample(constrained_int=2) try: ConstrainedExample(constrained_int=0) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_int',), 'msg': 'Input should be greater than 1', 'input': 0, 'ctx': {'gt': 1}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] int, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), annotated_types.MultipleOf(multiple_of) if multiple_of is not None else None, ] ```` ### confloat ```python confloat( *, strict: bool | None = None, gt: float | None = None, ge: float | None = None, lt: float | None = None, le: float | None = None, multiple_of: float | None = None, allow_inf_nan: bool | None = None ) -> type[float] ``` Discouraged This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with Field instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `confloat` returns a type, which doesn't play well with static analysis tools. ```python from pydantic import BaseModel, confloat class Foo(BaseModel): bar: confloat(strict=True, gt=0) ``` ```python from typing import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[float, Field(strict=True, gt=0)] ``` A wrapper around `float` that allows for additional constraints. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether to validate the float in strict mode. | `None` | | `gt` | `float | None` | The value must be greater than this. | `None` | | `ge` | `float | None` | The value must be greater than or equal to this. | `None` | | `lt` | `float | None` | The value must be less than this. | `None` | | `le` | `float | None` | The value must be less than or equal to this. | `None` | | `multiple_of` | `float | None` | The value must be a multiple of this. | `None` | | `allow_inf_nan` | `bool | None` | Whether to allow -inf, inf, and nan. | `None` | Returns: | Type | Description | | --- | --- | | `type[float]` | The wrapped float type. 
| ```python from pydantic import BaseModel, ValidationError, confloat class ConstrainedExample(BaseModel): constrained_float: confloat(gt=1.0) m = ConstrainedExample(constrained_float=1.1) print(repr(m)) #> ConstrainedExample(constrained_float=1.1) try: ConstrainedExample(constrained_float=0.9) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_float',), 'msg': 'Input should be greater than 1', 'input': 0.9, 'ctx': {'gt': 1.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` Source code in `pydantic/types.py` ````python def confloat( *, strict: bool | None = None, gt: float | None = None, ge: float | None = None, lt: float | None = None, le: float | None = None, multiple_of: float | None = None, allow_inf_nan: bool | None = None, ) -> type[float]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`Field`][pydantic.fields.Field] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `confloat` returns a type, which doesn't play well with static analysis tools. === ":x: Don't do this" ```python from pydantic import BaseModel, confloat class Foo(BaseModel): bar: confloat(strict=True, gt=0) ``` === ":white_check_mark: Do this" ```python from typing import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[float, Field(strict=True, gt=0)] ``` A wrapper around `float` that allows for additional constraints. Args: strict: Whether to validate the float in strict mode. gt: The value must be greater than this. ge: The value must be greater than or equal to this. lt: The value must be less than this. le: The value must be less than or equal to this. multiple_of: The value must be a multiple of this. allow_inf_nan: Whether to allow `-inf`, `inf`, and `nan`. Returns: The wrapped float type. ```python from pydantic import BaseModel, ValidationError, confloat class ConstrainedExample(BaseModel): constrained_float: confloat(gt=1.0) m = ConstrainedExample(constrained_float=1.1) print(repr(m)) #> ConstrainedExample(constrained_float=1.1) try: ConstrainedExample(constrained_float=0.9) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_float',), 'msg': 'Input should be greater than 1', 'input': 0.9, 'ctx': {'gt': 1.0}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] float, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), annotated_types.MultipleOf(multiple_of) if multiple_of is not None else None, AllowInfNan(allow_inf_nan) if allow_inf_nan is not None else None, ] ```` ### conbytes ```python conbytes( *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None ) -> type[bytes] ``` A wrapper around `bytes` that allows for additional constraints. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `min_length` | `int | None` | The minimum length of the bytes. | `None` | | `max_length` | `int | None` | The maximum length of the bytes. | `None` | | `strict` | `bool | None` | Whether to validate the bytes in strict mode. | `None` | Returns: | Type | Description | | --- | --- | | `type[bytes]` | The wrapped bytes type. 
| Source code in `pydantic/types.py` ```python def conbytes( *, min_length: int | None = None, max_length: int | None = None, strict: bool | None = None, ) -> type[bytes]: """A wrapper around `bytes` that allows for additional constraints. Args: min_length: The minimum length of the bytes. max_length: The maximum length of the bytes. strict: Whether to validate the bytes in strict mode. Returns: The wrapped bytes type. """ return Annotated[ # pyright: ignore[reportReturnType] bytes, Strict(strict) if strict is not None else None, annotated_types.Len(min_length or 0, max_length), ] ``` ### constr ```python constr( *, strip_whitespace: bool | None = None, to_upper: bool | None = None, to_lower: bool | None = None, strict: bool | None = None, min_length: int | None = None, max_length: int | None = None, pattern: str | Pattern[str] | None = None ) -> type[str] ``` Discouraged This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with StringConstraints instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `constr` returns a type, which doesn't play well with static analysis tools. ```python from pydantic import BaseModel, constr class Foo(BaseModel): bar: constr(strip_whitespace=True, to_upper=True, pattern=r'^[A-Z]+$') ``` ```python from typing import Annotated from pydantic import BaseModel, StringConstraints class Foo(BaseModel): bar: Annotated[ str, StringConstraints( strip_whitespace=True, to_upper=True, pattern=r'^[A-Z]+$' ), ] ``` A wrapper around `str` that allows for additional constraints. ```python from pydantic import BaseModel, constr class Foo(BaseModel): bar: constr(strip_whitespace=True, to_upper=True) foo = Foo(bar=' hello ') print(foo) #> bar='HELLO' ``` Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strip_whitespace` | `bool | None` | Whether to remove leading and trailing whitespace. | `None` | | `to_upper` | `bool | None` | Whether to turn all characters to uppercase. | `None` | | `to_lower` | `bool | None` | Whether to turn all characters to lowercase. | `None` | | `strict` | `bool | None` | Whether to validate the string in strict mode. | `None` | | `min_length` | `int | None` | The minimum length of the string. | `None` | | `max_length` | `int | None` | The maximum length of the string. | `None` | | `pattern` | `str | Pattern[str] | None` | A regex pattern to validate the string against. | `None` | Returns: | Type | Description | | --- | --- | | `type[str]` | The wrapped string type. | Source code in `pydantic/types.py` ````python def constr( *, strip_whitespace: bool | None = None, to_upper: bool | None = None, to_lower: bool | None = None, strict: bool | None = None, min_length: int | None = None, max_length: int | None = None, pattern: str | Pattern[str] | None = None, ) -> type[str]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`StringConstraints`][pydantic.types.StringConstraints] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `constr` returns a type, which doesn't play well with static analysis tools. 
=== ":x: Don't do this" ```python from pydantic import BaseModel, constr class Foo(BaseModel): bar: constr(strip_whitespace=True, to_upper=True, pattern=r'^[A-Z]+$') ``` === ":white_check_mark: Do this" ```python from typing import Annotated from pydantic import BaseModel, StringConstraints class Foo(BaseModel): bar: Annotated[ str, StringConstraints( strip_whitespace=True, to_upper=True, pattern=r'^[A-Z]+$' ), ] ``` A wrapper around `str` that allows for additional constraints. ```python from pydantic import BaseModel, constr class Foo(BaseModel): bar: constr(strip_whitespace=True, to_upper=True) foo = Foo(bar=' hello ') print(foo) #> bar='HELLO' ``` Args: strip_whitespace: Whether to remove leading and trailing whitespace. to_upper: Whether to turn all characters to uppercase. to_lower: Whether to turn all characters to lowercase. strict: Whether to validate the string in strict mode. min_length: The minimum length of the string. max_length: The maximum length of the string. pattern: A regex pattern to validate the string against. Returns: The wrapped string type. """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] str, StringConstraints( strip_whitespace=strip_whitespace, to_upper=to_upper, to_lower=to_lower, strict=strict, min_length=min_length, max_length=max_length, pattern=pattern, ), ] ```` ### conset ```python conset( item_type: type[HashableItemType], *, min_length: int | None = None, max_length: int | None = None ) -> type[set[HashableItemType]] ``` A wrapper around `typing.Set` that allows for additional constraints. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `item_type` | `type[HashableItemType]` | The type of the items in the set. | *required* | | `min_length` | `int | None` | The minimum length of the set. | `None` | | `max_length` | `int | None` | The maximum length of the set. | `None` | Returns: | Type | Description | | --- | --- | | `type[set[HashableItemType]]` | The wrapped set type. | Source code in `pydantic/types.py` ```python def conset( item_type: type[HashableItemType], *, min_length: int | None = None, max_length: int | None = None ) -> type[set[HashableItemType]]: """A wrapper around `typing.Set` that allows for additional constraints. Args: item_type: The type of the items in the set. min_length: The minimum length of the set. max_length: The maximum length of the set. Returns: The wrapped set type. """ return Annotated[set[item_type], annotated_types.Len(min_length or 0, max_length)] # pyright: ignore[reportReturnType] ``` ### confrozenset ```python confrozenset( item_type: type[HashableItemType], *, min_length: int | None = None, max_length: int | None = None ) -> type[frozenset[HashableItemType]] ``` A wrapper around `typing.FrozenSet` that allows for additional constraints. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `item_type` | `type[HashableItemType]` | The type of the items in the frozenset. | *required* | | `min_length` | `int | None` | The minimum length of the frozenset. | `None` | | `max_length` | `int | None` | The maximum length of the frozenset. | `None` | Returns: | Type | Description | | --- | --- | | `type[frozenset[HashableItemType]]` | The wrapped frozenset type. | Source code in `pydantic/types.py` ```python def confrozenset( item_type: type[HashableItemType], *, min_length: int | None = None, max_length: int | None = None ) -> type[frozenset[HashableItemType]]: """A wrapper around `typing.FrozenSet` that allows for additional constraints. 
Args: item_type: The type of the items in the frozenset. min_length: The minimum length of the frozenset. max_length: The maximum length of the frozenset. Returns: The wrapped frozenset type. """ return Annotated[frozenset[item_type], annotated_types.Len(min_length or 0, max_length)] # pyright: ignore[reportReturnType] ``` ### conlist ```python conlist( item_type: type[AnyItemType], *, min_length: int | None = None, max_length: int | None = None, unique_items: bool | None = None ) -> type[list[AnyItemType]] ``` A wrapper around list that adds validation. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `item_type` | `type[AnyItemType]` | The type of the items in the list. | *required* | | `min_length` | `int | None` | The minimum length of the list. Defaults to None. | `None` | | `max_length` | `int | None` | The maximum length of the list. Defaults to None. | `None` | | `unique_items` | `bool | None` | Whether the items in the list must be unique. Defaults to None. Warning The unique_items parameter is deprecated, use Set instead. See this issue for more details. | `None` | Returns: | Type | Description | | --- | --- | | `type[list[AnyItemType]]` | The wrapped list type. | Source code in `pydantic/types.py` ```python def conlist( item_type: type[AnyItemType], *, min_length: int | None = None, max_length: int | None = None, unique_items: bool | None = None, ) -> type[list[AnyItemType]]: """A wrapper around [`list`][] that adds validation. Args: item_type: The type of the items in the list. min_length: The minimum length of the list. Defaults to None. max_length: The maximum length of the list. Defaults to None. unique_items: Whether the items in the list must be unique. Defaults to None. !!! warning Deprecated The `unique_items` parameter is deprecated, use `Set` instead. See [this issue](https://github.com/pydantic/pydantic-core/issues/296) for more details. Returns: The wrapped list type. """ if unique_items is not None: raise PydanticUserError( ( '`unique_items` is removed, use `Set` instead' '(this feature is discussed in https://github.com/pydantic/pydantic-core/issues/296)' ), code='removed-kwargs', ) return Annotated[list[item_type], annotated_types.Len(min_length or 0, max_length)] # pyright: ignore[reportReturnType] ``` ### condecimal ```python condecimal( *, strict: bool | None = None, gt: int | Decimal | None = None, ge: int | Decimal | None = None, lt: int | Decimal | None = None, le: int | Decimal | None = None, multiple_of: int | Decimal | None = None, max_digits: int | None = None, decimal_places: int | None = None, allow_inf_nan: bool | None = None ) -> type[Decimal] ``` Discouraged This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with Field instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `condecimal` returns a type, which doesn't play well with static analysis tools. ```python from pydantic import BaseModel, condecimal class Foo(BaseModel): bar: condecimal(strict=True, allow_inf_nan=True) ``` ```python from decimal import Decimal from typing import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[Decimal, Field(strict=True, allow_inf_nan=True)] ``` A wrapper around Decimal that adds validation. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether to validate the value in strict mode. Defaults to None. 
| `None` | | `gt` | `int | Decimal | None` | The value must be greater than this. Defaults to None. | `None` | | `ge` | `int | Decimal | None` | The value must be greater than or equal to this. Defaults to None. | `None` | | `lt` | `int | Decimal | None` | The value must be less than this. Defaults to None. | `None` | | `le` | `int | Decimal | None` | The value must be less than or equal to this. Defaults to None. | `None` | | `multiple_of` | `int | Decimal | None` | The value must be a multiple of this. Defaults to None. | `None` | | `max_digits` | `int | None` | The maximum number of digits. Defaults to None. | `None` | | `decimal_places` | `int | None` | The number of decimal places. Defaults to None. | `None` | | `allow_inf_nan` | `bool | None` | Whether to allow infinity and NaN. Defaults to None. | `None` | ```python from decimal import Decimal from pydantic import BaseModel, ValidationError, condecimal class ConstrainedExample(BaseModel): constrained_decimal: condecimal(gt=Decimal('1.0')) m = ConstrainedExample(constrained_decimal=Decimal('1.1')) print(repr(m)) #> ConstrainedExample(constrained_decimal=Decimal('1.1')) try: ConstrainedExample(constrained_decimal=Decimal('0.9')) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_decimal',), 'msg': 'Input should be greater than 1.0', 'input': Decimal('0.9'), 'ctx': {'gt': Decimal('1.0')}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` Source code in `pydantic/types.py` ````python def condecimal( *, strict: bool | None = None, gt: int | Decimal | None = None, ge: int | Decimal | None = None, lt: int | Decimal | None = None, le: int | Decimal | None = None, multiple_of: int | Decimal | None = None, max_digits: int | None = None, decimal_places: int | None = None, allow_inf_nan: bool | None = None, ) -> type[Decimal]: """ !!! warning "Discouraged" This function is **discouraged** in favor of using [`Annotated`](https://docs.python.org/3/library/typing.html#typing.Annotated) with [`Field`][pydantic.fields.Field] instead. This function will be **deprecated** in Pydantic 3.0. The reason is that `condecimal` returns a type, which doesn't play well with static analysis tools. === ":x: Don't do this" ```python from pydantic import BaseModel, condecimal class Foo(BaseModel): bar: condecimal(strict=True, allow_inf_nan=True) ``` === ":white_check_mark: Do this" ```python from decimal import Decimal from typing import Annotated from pydantic import BaseModel, Field class Foo(BaseModel): bar: Annotated[Decimal, Field(strict=True, allow_inf_nan=True)] ``` A wrapper around Decimal that adds validation. Args: strict: Whether to validate the value in strict mode. Defaults to `None`. gt: The value must be greater than this. Defaults to `None`. ge: The value must be greater than or equal to this. Defaults to `None`. lt: The value must be less than this. Defaults to `None`. le: The value must be less than or equal to this. Defaults to `None`. multiple_of: The value must be a multiple of this. Defaults to `None`. max_digits: The maximum number of digits. Defaults to `None`. decimal_places: The number of decimal places. Defaults to `None`. allow_inf_nan: Whether to allow infinity and NaN. Defaults to `None`. 
```python from decimal import Decimal from pydantic import BaseModel, ValidationError, condecimal class ConstrainedExample(BaseModel): constrained_decimal: condecimal(gt=Decimal('1.0')) m = ConstrainedExample(constrained_decimal=Decimal('1.1')) print(repr(m)) #> ConstrainedExample(constrained_decimal=Decimal('1.1')) try: ConstrainedExample(constrained_decimal=Decimal('0.9')) except ValidationError as e: print(e.errors()) ''' [ { 'type': 'greater_than', 'loc': ('constrained_decimal',), 'msg': 'Input should be greater than 1.0', 'input': Decimal('0.9'), 'ctx': {'gt': Decimal('1.0')}, 'url': 'https://errors.pydantic.dev/2/v/greater_than', } ] ''' ``` """ # noqa: D212 return Annotated[ # pyright: ignore[reportReturnType] Decimal, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), annotated_types.MultipleOf(multiple_of) if multiple_of is not None else None, _fields.pydantic_general_metadata(max_digits=max_digits, decimal_places=decimal_places), AllowInfNan(allow_inf_nan) if allow_inf_nan is not None else None, ] ```` ### condate ```python condate( *, strict: bool | None = None, gt: date | None = None, ge: date | None = None, lt: date | None = None, le: date | None = None ) -> type[date] ``` A wrapper for date that adds constraints. Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `strict` | `bool | None` | Whether to validate the date value in strict mode. Defaults to None. | `None` | | `gt` | `date | None` | The value must be greater than this. Defaults to None. | `None` | | `ge` | `date | None` | The value must be greater than or equal to this. Defaults to None. | `None` | | `lt` | `date | None` | The value must be less than this. Defaults to None. | `None` | | `le` | `date | None` | The value must be less than or equal to this. Defaults to None. | `None` | Returns: | Type | Description | | --- | --- | | `type[date]` | A date type with the specified constraints. | Source code in `pydantic/types.py` ```python def condate( *, strict: bool | None = None, gt: date | None = None, ge: date | None = None, lt: date | None = None, le: date | None = None, ) -> type[date]: """A wrapper for date that adds constraints. Args: strict: Whether to validate the date value in strict mode. Defaults to `None`. gt: The value must be greater than this. Defaults to `None`. ge: The value must be greater than or equal to this. Defaults to `None`. lt: The value must be less than this. Defaults to `None`. le: The value must be less than or equal to this. Defaults to `None`. Returns: A date type with the specified constraints. """ return Annotated[ # pyright: ignore[reportReturnType] date, Strict(strict) if strict is not None else None, annotated_types.Interval(gt=gt, ge=ge, lt=lt, le=le), ] ``` Decorator for validating function calls. ## validate_call ```python validate_call( *, config: ConfigDict | None = None, validate_return: bool = False ) -> Callable[[AnyCallableT], AnyCallableT] ``` ```python validate_call(func: AnyCallableT) -> AnyCallableT ``` ```python validate_call( func: AnyCallableT | None = None, /, *, config: ConfigDict | None = None, validate_return: bool = False, ) -> AnyCallableT | Callable[[AnyCallableT], AnyCallableT] ``` Usage Documentation [Validation Decorator](../../concepts/validation_decorator/) Returns a decorated wrapper around the function that validates the arguments and, optionally, the return value. Usage may be either as a plain decorator `@validate_call` or with arguments `@validate_call(...)`. 
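For instance, a minimal sketch of both forms (the `repeat` and `double` functions are made up for illustration):

```python
from pydantic import ValidationError, validate_call


@validate_call
def repeat(text: str, count: int) -> str:
    return text * count


@validate_call(validate_return=True)
def double(value: int) -> int:
    return value * 2


print(repeat('ab', count='3'))  # '3' is coerced to an int
#> ababab
print(double('4'))  # the argument and the return value are both validated
#> 8

try:
    repeat('ab', count='not a number')
except ValidationError as e:
    print(e.error_count())
    #> 1
```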
Parameters: | Name | Type | Description | Default | | --- | --- | --- | --- | | `func` | `AnyCallableT | None` | The function to be decorated. | `None` | | `config` | `ConfigDict | None` | The configuration dictionary. | `None` | | `validate_return` | `bool` | Whether to validate the return value. | `False` | Returns: | Type | Description | | --- | --- | | `AnyCallableT | Callable[[AnyCallableT], AnyCallableT]` | The decorated function. | Source code in `pydantic/validate_call_decorator.py` ```python def validate_call( func: AnyCallableT | None = None, /, *, config: ConfigDict | None = None, validate_return: bool = False, ) -> AnyCallableT | Callable[[AnyCallableT], AnyCallableT]: """!!! abstract "Usage Documentation" [Validation Decorator](../concepts/validation_decorator.md) Returns a decorated wrapper around the function that validates the arguments and, optionally, the return value. Usage may be either as a plain decorator `@validate_call` or with arguments `@validate_call(...)`. Args: func: The function to be decorated. config: The configuration dictionary. validate_return: Whether to validate the return value. Returns: The decorated function. """ parent_namespace = _typing_extra.parent_frame_namespace() def validate(function: AnyCallableT) -> AnyCallableT: _check_function_type(function) validate_call_wrapper = _validate_call.ValidateCallWrapper( cast(_generate_schema.ValidateCallSupportedTypes, function), config, validate_return, parent_namespace ) return _validate_call.update_wrapper_attributes(function, validate_call_wrapper.__call__) # type: ignore if func is not None: return validate(func) else: return validate ``` ## pydantic.__version__ ```python __version__ = VERSION ``` ## pydantic.version.version_info ```python version_info() -> str ``` Return complete version information for Pydantic and its dependencies. Source code in `pydantic/version.py` ```python def version_info() -> str: """Return complete version information for Pydantic and its dependencies.""" import importlib.metadata as importlib_metadata import os import platform import sys from pathlib import Path import pydantic_core._pydantic_core as pdc from ._internal import _git as git # get data about packages that are closely related to pydantic, use pydantic or often conflict with pydantic package_names = { 'email-validator', 'fastapi', 'mypy', 'pydantic-extra-types', 'pydantic-settings', 'pyright', 'typing_extensions', } related_packages = [] for dist in importlib_metadata.distributions(): name = dist.metadata['Name'] if name in package_names: related_packages.append(f'{name}-{dist.version}') pydantic_dir = os.path.abspath(os.path.dirname(os.path.dirname(__file__))) most_recent_commit = ( git.git_revision(pydantic_dir) if git.is_git_repo(pydantic_dir) and git.have_git() else 'unknown' ) info = { 'pydantic version': VERSION, 'pydantic-core version': pdc.__version__, 'pydantic-core build': getattr(pdc, 'build_info', None) or pdc.build_profile, 'install path': Path(__file__).resolve().parent, 'python version': sys.version, 'platform': platform.platform(), 'related packages': ' '.join(related_packages), 'commit': most_recent_commit, } return '\n'.join('{:>30} {}'.format(k + ':', str(v).replace('\n', ' ')) for k, v in info.items()) ``` # Internals Note This section is part of the *internals* documentation, and is partly targeted to contributors. Starting with Pydantic V2, part of the codebase is written in Rust in a separate package called `pydantic-core`. 
This was done partly in order to improve validation and serialization performance (with the cost of limited customization and extendibility of the internal logic). This architecture documentation will first cover how the two `pydantic` and `pydantic-core` packages interact together, then will go through the architecture specifics for various patterns (model definition, validation, serialization, JSON Schema). Usage of the Pydantic library can be divided into two parts: - Model definition, done in the `pydantic` package. - Model validation and serialization, done in the `pydantic-core` package. ## Model definition Whenever a Pydantic BaseModel is defined, the metaclass will analyze the body of the model to collect a number of elements: - Defined annotations to build model fields (collected in the model_fields attribute). - Model configuration, set with model_config. - Additional validators/serializers. - Private attributes, class variables, identification of generic parametrization, etc. ### Communicating between `pydantic` and `pydantic-core`: the core schema We then need a way to communicate the collected information from the model definition to `pydantic-core`, so that validation and serialization are performed accordingly. To do so, Pydantic uses the concept of a core schema: a structured (and serializable) Python dictionary (represented using TypedDict definitions) describing a specific validation and serialization logic. It is the core data structure used to communicate between the `pydantic` and `pydantic-core` packages. Every core schema has a required `type` key, and extra properties depending on this `type`. The generation of a core schema is handled in a single place, by the `GenerateSchema` class (no matter if it is for a Pydantic model or anything else). Note It is not possible to define a custom core schema. A core schema needs to be understood by the `pydantic-core` package, and as such we only support a fixed number of core schema types. This is also part of the reason why the `GenerateSchema` isn't truly exposed and properly documented. The core schema definitions can be found in the pydantic_core.core_schema module. In the case of a Pydantic model, a core schema will be constructed and set as the __pydantic_core_schema__ attribute. To illustrate what a core schema looks like, we will take the example of the bool core schema: ```python class BoolSchema(TypedDict, total=False): type: Required[Literal['bool']] strict: bool ref: str metadata: Any serialization: SerSchema ``` When defining a Pydantic model with a boolean field: ```python from pydantic import BaseModel, Field class Model(BaseModel): foo: bool = Field(strict=True) ``` The core schema for the `foo` field will look like: ```python { 'type': 'bool', 'strict': True, } ``` As seen in the BoolSchema definition, the serialization logic is also defined in the core schema. If we were to define a custom serialization function for `foo` (1), the `serialization` key would look like: 1. For example using the field_serializer decorator: ```python class Model(BaseModel): foo: bool = Field(strict=True) @field_serializer('foo', mode='plain') def serialize_foo(self, value: bool) -> Any: ... ``` ```python { 'type': 'function-plain', 'function': <function Model.serialize_foo at 0x...>, 'is_field_serializer': True, 'info_arg': False, 'return_schema': {'type': 'int'}, } ``` Note that this is also a core schema definition, just that it is only relevant for `pydantic-core` during serialization.
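If you want to see this in practice, the generated core schema can be inspected at runtime. A minimal sketch follows; the exact nesting and keys are implementation details that may vary between Pydantic versions, so treat the printed structure as indicative only:

```python
from pydantic import BaseModel, Field


class Model(BaseModel):
    foo: bool = Field(strict=True)


core_schema = Model.__pydantic_core_schema__
print(core_schema['type'])
#> model

# The `foo` field schema is nested inside the model-fields schema:
print(core_schema['schema']['fields']['foo']['schema'])
#> {'type': 'bool', 'strict': True}
```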
Core schemas cover a broad scope, and are used whenever we want to communicate between the Python and Rust side. While the previous examples were related to validation and serialization, they could in theory be used for anything: error management, extra metadata, etc. ### JSON Schema generation You may have noticed that the previous serialization core schema has a `return_schema` key. This is because the core schema is also used to generate the corresponding JSON Schema. Similar to how the core schema is generated, the JSON Schema generation is handled by the GenerateJsonSchema class. The generate method is the main entry point and is given the core schema of that model. Coming back to our `bool` field example, the bool_schema method will be given the previously generated boolean core schema and will return the following JSON Schema: ```json {"type": "boolean"} ``` ### Customizing the core schema and JSON schema Usage Documentation [Custom types](../../concepts/types/#custom-types) [Implementing `__get_pydantic_core_schema__`](../../concepts/json_schema/#implementing-__get_pydantic_core_schema__) [Implementing `__get_pydantic_json_schema__`](../../concepts/json_schema/#implementing-__get_pydantic_json_schema__) While the `GenerateSchema` and GenerateJsonSchema classes handle the creation of the corresponding schemas, Pydantic offers a way to customize them in some cases, following a wrapper pattern. This customization is done through the `__get_pydantic_core_schema__` and `__get_pydantic_json_schema__` methods. To understand this wrapper pattern, we will take the example of metadata classes used with Annotated, where the `__get_pydantic_core_schema__` method can be used: ```python from typing import Annotated, Any from pydantic_core import CoreSchema from pydantic import GetCoreSchemaHandler, TypeAdapter class MyStrict: @classmethod def __get_pydantic_core_schema__( cls, source: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: schema = handler(source) # (1)! schema['strict'] = True return schema class MyGt: @classmethod def __get_pydantic_core_schema__( cls, source: Any, handler: GetCoreSchemaHandler ) -> CoreSchema: schema = handler(source) # (2)! schema['gt'] = 1 return schema ta = TypeAdapter(Annotated[int, MyStrict(), MyGt()]) ``` 1. `MyStrict` is the first annotation to be applied. At this point, `schema = {'type': 'int'}`. 1. `MyGt` is the last annotation to be applied. At this point, `schema = {'type': 'int', 'strict': True}`. When the `GenerateSchema` class builds the core schema for `Annotated[int, MyStrict(), MyGt()]`, it will create an instance of a `GetCoreSchemaHandler` to be passed to the `MyGt.__get_pydantic_core_schema__` method. (1) 1. In the case of our Annotated pattern, the `GetCoreSchemaHandler` is defined in a nested way. Calling it will recursively call the other `__get_pydantic_core_schema__` methods until it reaches the `int` annotation, where a simple `{'type': 'int'}` schema is returned. The `source` argument depends on the core schema generation pattern. In the case of Annotated, the `source` will be the type being annotated. When [defining a custom type](../../concepts/types/#as-a-method-on-a-custom-type), the `source` will be the actual class where `__get_pydantic_core_schema__` is defined. ## Model validation and serialization While model definition was scoped to the *class* level (i.e. when defining your model), model validation and serialization happen at the *instance* level.
Both these concepts are handled in `pydantic-core` (providing a 5 to 20x performance increase compared to Pydantic V1), by using the previously built core schema. `pydantic-core` exposes a SchemaValidator and SchemaSerializer class to perform these tasks: ```python from pydantic import BaseModel class Model(BaseModel): foo: int model = Model.model_validate({'foo': 1}) # (1)! dumped = model.model_dump() # (2)! ``` 1. The provided data is sent to `pydantic-core` by using the SchemaValidator.validate_python method. `pydantic-core` will validate (following the core schema of the model) the data and populate the model's `__dict__` attribute. 1. The `model` instance is sent to `pydantic-core` by using the SchemaSerializer.to_python method. `pydantic-core` will read the instance's `__dict__` attribute and build the appropriate result (again, following the core schema of the model). Note This section is part of the *internals* documentation, and is partly targeted to contributors. Pydantic heavily relies on type hints at runtime to build schemas for validation, serialization, etc. While type hints were primarily introduced for static type checkers (such as [Mypy](https://www.mypy-lang.org/) or [Pyright](https://github.com/microsoft/pyright/)), they are accessible (and sometimes evaluated) at runtime. This means that the following would fail at runtime, because `Node` has yet to be defined in the current module: ```python class Node: """Binary tree node.""" # NameError: name 'Node' is not defined: def __init__(self, l: Node, r: Node) -> None: self.left = l self.right = r ``` To circumvent this issue, forward references can be used (by wrapping the annotation in quotes). In Python 3.7, [PEP 563](https://peps.python.org/pep-0563/) introduced the concept of *postponed evaluation of annotations*, meaning with the `from __future__ import annotations` [future statement](https://docs.python.org/3/reference/simple_stmts.html#future), type hints are stringified by default: ```python from __future__ import annotations from pydantic import BaseModel class Foo(BaseModel): f: MyType # Given the future import above, this is equivalent to: # f: 'MyType' type MyType = int print(Foo.__annotations__) #> {'f': 'MyType'} ``` ## The challenges of runtime evaluation Static type checkers make use of the AST to analyze the defined annotations. Regarding the previous example, this has the benefit of being able to understand what `MyType` refers to when analyzing the class definition of `Foo`, even if `MyType` isn't yet defined at runtime. However, for runtime tools such as Pydantic, it is more challenging to correctly resolve these forward annotations. The Python standard library provides some tools to do so (typing.get_type_hints(), inspect.get_annotations()), but they come with some limitations. Thus, they are being re-implemented in Pydantic with improved support for edge cases. As Pydantic has grown, it has adapted to support many edge cases requiring irregular patterns for annotation evaluation. Some of these use cases aren't necessarily sound from a static type checking perspective. In v2.10, the internal logic was refactored in an attempt to simplify and standardize annotation evaluation. Admittedly, backwards compatibility posed some challenges, and there is still some noticeable scar tissue in the codebase because of this. There is hope that [PEP 649](https://peps.python.org/pep-0649/) (introduced in Python 3.14) will greatly simplify the process, especially when it comes to dealing with locals of a function.
To evaluate forward references, Pydantic roughly follows the same logic as described in the documentation of the typing.get_type_hints() function. That is, the built-in eval() function is used by passing the forward reference, a global, and a local namespace. The namespace fetching logic is defined in the sections below. ## Resolving annotations at class definition The following example will be used as a reference throughout this section: ```python # module1.py: type MyType = int class Base: f1: 'MyType' # module2.py: from pydantic import BaseModel from module1 import Base type MyType = str def inner() -> None: type InnerType = bool class Model(BaseModel, Base): type LocalType = bytes f2: 'MyType' f3: 'InnerType' f4: 'LocalType' f5: 'UnknownType' type InnerType2 = complex ``` When the `Model` class is being built, different namespaces are at play. For each base class of the `Model`'s MRO (in reverse order — that is, starting with `Base`), the following logic is applied: 1. Fetch the `__annotations__` key from the current base class' `__dict__`, if present. For `Base`, this will be `{'f1': 'MyType'}`. 1. Iterate over the `__annotations__` items and try to evaluate the annotation [1](#fn:1) using a custom wrapper around the built-in eval() function. This function takes two arguments, `globals` and `locals`: - The current module's `__dict__` is naturally used as `globals`. For `Base`, this will be `sys.modules['module1'].__dict__`. - For the `locals` argument, Pydantic will try to resolve symbols in the following namespaces, from highest to lowest priority: - A namespace created on the fly, containing the current class name (`{cls.__name__: cls}`). This is done in order to support recursive references. - The locals of the current class (i.e. `cls.__dict__`). For `Model`, this will include `LocalType`. - The parent namespace of the class, if different from the globals described above. This is the locals of the frame where the class is being defined. For `Base`, because the class is being defined in the module directly, this namespace won't be used as it will result in the globals being used again. For `Model`, the parent namespace is the locals of the frame of `inner()`. 1. If the annotation failed to evaluate, it is kept as is, so that the model can be rebuilt at a later stage. This will be the case for `f5`. The following table lists the resolved type annotations for every field, once the `Model` class has been created: | Field name | Resolved annotation | | --- | --- | | `f1` | int | | `f2` | str | | `f3` | bool | | `f4` | bytes | | `f5` | `'UnknownType'` | ### Limitations and backwards compatibility concerns While the namespace fetching logic is trying to be as accurate as possible, we still face some limitations: - The locals of the current class (`cls.__dict__`) may include irrelevant entries, most of them being dunder attributes. This means that the following annotation: `f: '__doc__'` will successfully (and unexpectedly) be resolved. - When the `Model` class is being created inside a function, we keep a copy of the locals of the frame. This copy only includes the symbols defined in the locals when `Model` is being defined, meaning `InnerType2` won't be included (and will **not be** if doing a model rebuild at a later point!). - To avoid memory leaks, we use weak references to the locals of the function, meaning some forward references might not resolve outside the function (1).
- Locals of the function are only taken into account for Pydantic models, but this pattern does not apply to dataclasses, typed dictionaries or named tuples. 1. Here is an example: ```python def func(): A = int class Model(BaseModel): f: 'A | Forward' return Model Model = func() Model.model_rebuild(_types_namespace={'Forward': str}) # pydantic.errors.PydanticUndefinedAnnotation: name 'A' is not defined ``` For backwards compatibility reasons, and to be able to support valid use cases without having to rebuild models, the namespace logic described above is a bit different when it comes to core schema generation. Taking the following example: ```python from dataclasses import dataclass from pydantic import BaseModel @dataclass class Foo: a: 'Bar | None' = None class Bar(BaseModel): b: Foo ``` Once the fields for `Bar` have been collected (meaning annotations resolved), the `GenerateSchema` class converts every field into a core schema. When it encounters another class-like field type (such as a dataclass), it will try to evaluate annotations, following roughly the same logic as [described above](#resolving-annotations-at-class-definition). However, to evaluate the `'Bar | None'` annotation, `Bar` needs to be present in the globals or locals, which is normally *not* the case: `Bar` is being created, so it is not "assigned" to the current module's `__dict__` at that point. To avoid having to call model_rebuild() on `Bar`, the parent namespace (if `Bar` was to be defined inside a function), [the namespace provided during a model rebuild](#model-rebuild-semantics), and the `{Bar.__name__: Bar}` namespace are included in the locals during annotation evaluation of `Foo` (with the lowest priority) (1). 1. This backwards compatibility logic can introduce some inconsistencies, such as the following: ```python from dataclasses import dataclass from pydantic import BaseModel @dataclass class Foo: # `a` and `b` shouldn't resolve: a: 'Model' b: 'Inner' def func(): Inner = int class Model(BaseModel): foo: Foo Model.__pydantic_complete__ #> True, should be False. ``` ## Resolving annotations when rebuilding a model When a forward reference fails to evaluate, Pydantic will silently fail and stop the core schema generation process. This can be seen by inspecting the `__pydantic_core_schema__` of a model class: ```python from pydantic import BaseModel class Foo(BaseModel): f: 'MyType' Foo.__pydantic_core_schema__ #> <pydantic._internal._mock_val_ser.MockCoreSchema object at 0x...> ``` If you then properly define `MyType`, you can rebuild the model: ```python type MyType = int Foo.model_rebuild() Foo.__pydantic_core_schema__ #> {'type': 'model', 'schema': {...}, ...} ``` The model_rebuild() method uses a *rebuild namespace*, with the following semantics: - If an explicit `_types_namespace` argument is provided, it is used as the rebuild namespace. - If no namespace is provided, the namespace where the method is called will be used as the rebuild namespace. This *rebuild namespace* will be merged with the model's parent namespace (if it was defined in a function) and used as is (see the [backwards compatibility logic](#backwards-compatibility-logic) described above). ______________________________________________________________________ 1. This is done unconditionally, as forward annotations can only be present *as part* of a type hint (e.g. `Optional['int']`), as dictated by the [typing specification](https://typing.readthedocs.io/en/latest/spec/annotations.html#string-annotations).
[↩](#fnref:1 "Jump back to footnote 1 in the text")

# Optional

Pydantic will raise a ValidationError whenever it finds an error in the data it's validating.

Note

Validation code should not raise the ValidationError itself, but rather raise a ValueError or an AssertionError (or subclass thereof) which will be caught and used to populate the final ValidationError. For more details, refer to the [dedicated section](../../concepts/validators/#raising-validation-errors) of the validators documentation.

That ValidationError will contain information about all the errors and how they happened.

You can access these errors in several ways:

| Method | Description |
| --- | --- |
| errors() | Returns a list of ErrorDetails errors found in the input data. |
| error_count() | Returns the number of errors. |
| json() | Returns a JSON representation of the list of errors. |
| `str(e)` | Returns a human-readable representation of the errors. |

The ErrorDetails object is a dictionary. It contains the following:

| Property | Description |
| --- | --- |
| ctx | An optional object which contains values required to render the error message. |
| input | The input provided for validation. |
| loc | The error's location as a list. |
| msg | A human-readable explanation of the error. |
| type | A computer-readable identifier of the error type. |
| url | The documentation URL giving information about the error. |

The first item in the loc list will be the field where the error occurred, and if the field is a [sub-model](../../concepts/models/#nested-models), subsequent items will be present to indicate the nested location of the error.

As a demonstration:

```python
from pydantic import BaseModel, Field, ValidationError, field_validator


class Location(BaseModel):
    lat: float = 0.1
    lng: float = 10.1


class Model(BaseModel):
    is_required: float
    gt_int: int = Field(gt=42)
    list_of_ints: list[int]
    a_float: float
    recursive_model: Location

    @field_validator('a_float', mode='after')
    @classmethod
    def validate_float(cls, value: float) -> float:
        if value > 2.0:
            raise ValueError('Invalid float value')
        return value


data = {
    'list_of_ints': ['1', 2, 'bad'],
    'a_float': 3.0,
    'recursive_model': {'lat': 4.2, 'lng': 'New York'},
    'gt_int': 21,
}

try:
    Model(**data)
except ValidationError as e:
    print(e)
    """
    5 validation errors for Model
    is_required
      Field required [type=missing, input_value={'list_of_ints': ['1', 2,...ew York'}, 'gt_int': 21}, input_type=dict]
    gt_int
      Input should be greater than 42 [type=greater_than, input_value=21, input_type=int]
    list_of_ints.2
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='bad', input_type=str]
    a_float
      Value error, Invalid float value [type=value_error, input_value=3.0, input_type=float]
    recursive_model.lng
      Input should be a valid number, unable to parse string as a number [type=float_parsing, input_value='New York', input_type=str]
    """

try:
    Model(**data)
except ValidationError as e:
    print(e.errors())
    """
    [
        {
            'type': 'missing',
            'loc': ('is_required',),
            'msg': 'Field required',
            'input': {
                'list_of_ints': ['1', 2, 'bad'],
                'a_float': 3.0,
                'recursive_model': {'lat': 4.2, 'lng': 'New York'},
                'gt_int': 21,
            },
            'url': 'https://errors.pydantic.dev/2/v/missing',
        },
        {
            'type': 'greater_than',
            'loc': ('gt_int',),
            'msg': 'Input should be greater than 42',
            'input': 21,
            'ctx': {'gt': 42},
            'url': 'https://errors.pydantic.dev/2/v/greater_than',
        },
        {
            'type': 'int_parsing',
            'loc': ('list_of_ints', 2),
            'msg': 'Input should be a valid integer, unable to parse string as an integer',
            'input': 'bad',
            'url': 'https://errors.pydantic.dev/2/v/int_parsing',
        },
        {
            'type': 'value_error',
            'loc': ('a_float',),
            'msg': 'Value error, Invalid float value',
            'input': 3.0,
            'ctx': {'error': ValueError('Invalid float value')},
            'url': 'https://errors.pydantic.dev/2/v/value_error',
        },
        {
            'type': 'float_parsing',
            'loc': ('recursive_model', 'lng'),
            'msg': 'Input should be a valid number, unable to parse string as a number',
            'input': 'New York',
            'url': 'https://errors.pydantic.dev/2/v/float_parsing',
        },
    ]
    """
```
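The other accessors listed in the table above behave analogously. As a minimal sketch (using a hypothetical two-field model, not the `Model` above), `error_count()` and `json()` can be used like this:

```python
from pydantic import BaseModel, ValidationError


class Item(BaseModel):
    x: int
    y: str


try:
    Item(x='not a number', y=123)  # both fields fail validation
except ValidationError as e:
    print(e.error_count())
    #> 2
    # `e.json()` returns the same details as `e.errors()`, as a JSON string:
    print(e.json())
```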
## Error messages

Pydantic attempts to provide useful default error messages for validation and usage errors, which can be found here:

- [Validation Errors](../validation_errors/): Errors that happen during data validation.
- [Usage Errors](../usage_errors/): Errors that happen when using Pydantic.

### Customize error messages

You can customize error messages by creating a custom error handler.

```python
from pydantic_core import ErrorDetails

from pydantic import BaseModel, HttpUrl, ValidationError

CUSTOM_MESSAGES = {
    'int_parsing': 'This is not an integer! 🤦',
    'url_scheme': 'Hey, use the right URL scheme! I wanted {expected_schemes}.',
}


def convert_errors(
    e: ValidationError, custom_messages: dict[str, str]
) -> list[ErrorDetails]:
    new_errors: list[ErrorDetails] = []
    for error in e.errors():
        custom_message = custom_messages.get(error['type'])
        if custom_message:
            ctx = error.get('ctx')
            error['msg'] = (
                custom_message.format(**ctx) if ctx else custom_message
            )
        new_errors.append(error)
    return new_errors


class Model(BaseModel):
    a: int
    b: HttpUrl


try:
    Model(a='wrong', b='ftp://example.com')
except ValidationError as e:
    errors = convert_errors(e, CUSTOM_MESSAGES)
    print(errors)
    """
    [
        {
            'type': 'int_parsing',
            'loc': ('a',),
            'msg': 'This is not an integer! 🤦',
            'input': 'wrong',
            'url': 'https://errors.pydantic.dev/2/v/int_parsing',
        },
        {
            'type': 'url_scheme',
            'loc': ('b',),
            'msg': "Hey, use the right URL scheme! I wanted 'http' or 'https'.",
            'input': 'ftp://example.com',
            'ctx': {'expected_schemes': "'http' or 'https'"},
            'url': 'https://errors.pydantic.dev/2/v/url_scheme',
        },
    ]
    """
```

A common use case would be to translate error messages. For example, we could translate the error messages above by replacing the `CUSTOM_MESSAGES` dictionary with a dictionary of translations.

Another example is customizing the way that the `'loc'` value of an error is represented.

```python
from typing import Any, Union

from pydantic import BaseModel, ValidationError


def loc_to_dot_sep(loc: tuple[Union[str, int], ...]) -> str:
    path = ''
    for i, x in enumerate(loc):
        if isinstance(x, str):
            if i > 0:
                path += '.'
            path += x
        elif isinstance(x, int):
            path += f'[{x}]'
        else:
            raise TypeError('Unexpected type')
    return path


def convert_errors(e: ValidationError) -> list[dict[str, Any]]:
    new_errors: list[dict[str, Any]] = e.errors()
    for error in new_errors:
        error['loc'] = loc_to_dot_sep(error['loc'])
    return new_errors


class TestNestedModel(BaseModel):
    key: str
    value: str


class TestModel(BaseModel):
    items: list[TestNestedModel]


data = {'items': [{'key': 'foo', 'value': 'bar'}, {'key': 'baz'}]}

try:
    TestModel.model_validate(data)
except ValidationError as e:
    print(e.errors())  # (1)!
    """
    [
        {
            'type': 'missing',
            'loc': ('items', 1, 'value'),
            'msg': 'Field required',
            'input': {'key': 'baz'},
            'url': 'https://errors.pydantic.dev/2/v/missing',
        }
    ]
    """
    pretty_errors = convert_errors(e)
    print(pretty_errors)  # (2)!
    """
    [
        {
            'type': 'missing',
            'loc': 'items[1].value',
            'msg': 'Field required',
            'input': {'key': 'baz'},
            'url': 'https://errors.pydantic.dev/2/v/missing',
        }
    ]
    """
```
""" [ { 'type': 'missing', 'loc': 'items[1].value', 'msg': 'Field required', 'input': {'key': 'baz'}, 'url': 'https://errors.pydantic.dev/2/v/missing', } ] """ ``` 1. By default, `e.errors()` produces a list of errors with `loc` values that take the form of tuples. 1. With our custom `loc_to_dot_sep` function, we've modified the form of the `loc` representation. ```python from typing import Any from pydantic import BaseModel, ValidationError def loc_to_dot_sep(loc: tuple[str | int, ...]) -> str: path = '' for i, x in enumerate(loc): if isinstance(x, str): if i > 0: path += '.' path += x elif isinstance(x, int): path += f'[{x}]' else: raise TypeError('Unexpected type') return path def convert_errors(e: ValidationError) -> list[dict[str, Any]]: new_errors: list[dict[str, Any]] = e.errors() for error in new_errors: error['loc'] = loc_to_dot_sep(error['loc']) return new_errors class TestNestedModel(BaseModel): key: str value: str class TestModel(BaseModel): items: list[TestNestedModel] data = {'items': [{'key': 'foo', 'value': 'bar'}, {'key': 'baz'}]} try: TestModel.model_validate(data) except ValidationError as e: print(e.errors()) # (1)! """ [ { 'type': 'missing', 'loc': ('items', 1, 'value'), 'msg': 'Field required', 'input': {'key': 'baz'}, 'url': 'https://errors.pydantic.dev/2/v/missing', } ] """ pretty_errors = convert_errors(e) print(pretty_errors) # (2)! """ [ { 'type': 'missing', 'loc': 'items[1].value', 'msg': 'Field required', 'input': {'key': 'baz'}, 'url': 'https://errors.pydantic.dev/2/v/missing', } ] """ ``` 1. By default, `e.errors()` produces a list of errors with `loc` values that take the form of tuples. 1. With our custom `loc_to_dot_sep` function, we've modified the form of the `loc` representation. Pydantic attempts to provide useful errors. The following sections provide details on common errors developers may encounter when working with Pydantic, along with suggestions for addressing the error condition. ## Class not fully defined This error is raised when a type referenced in an annotation of a pydantic-validated type (such as a subclass of `BaseModel`, or a pydantic `dataclass`) is not defined: ```python from typing import ForwardRef from pydantic import BaseModel, PydanticUserError UndefinedType = ForwardRef('UndefinedType') class Foobar(BaseModel): a: UndefinedType try: Foobar(a=1) except PydanticUserError as exc_info: assert exc_info.code == 'class-not-fully-defined' ``` Or when the type has been defined after usage: ```python from typing import Optional from pydantic import BaseModel, PydanticUserError class Foo(BaseModel): a: Optional['Bar'] = None try: # this doesn't work, see raised error foo = Foo(a={'b': {'a': None}}) except PydanticUserError as exc_info: assert exc_info.code == 'class-not-fully-defined' class Bar(BaseModel): b: 'Foo' # this works, though foo = Foo(a={'b': {'a': None}}) ``` For BaseModel subclasses, it can be fixed by defining the type and then calling `.model_rebuild()`: ```python from typing import Optional from pydantic import BaseModel class Foo(BaseModel): a: Optional['Bar'] = None class Bar(BaseModel): b: 'Foo' Foo.model_rebuild() foo = Foo(a={'b': {'a': None}}) ``` In other cases, the error message should indicate how to rebuild the class with the appropriate type defined. ## Custom JSON Schema The `__modify_schema__` method is no longer supported in V2. You should use the `__get_pydantic_json_schema__` method instead. The `__modify_schema__` used to receive a single argument representing the JSON schema. 
See the example below:

Old way

```python
from pydantic import BaseModel, PydanticUserError

try:

    class Model(BaseModel):
        @classmethod
        def __modify_schema__(cls, field_schema):
            field_schema.update(examples=['example'])

except PydanticUserError as exc_info:
    assert exc_info.code == 'custom-json-schema'
```

The new method `__get_pydantic_json_schema__` receives two arguments: the first is a dictionary denoted as `CoreSchema`, and the second a callable `handler` that receives a `CoreSchema` as a parameter and returns a JSON schema. See the example below:

New way

```python
from typing import Any

from pydantic_core import CoreSchema

from pydantic import BaseModel, GetJsonSchemaHandler


class Model(BaseModel):
    @classmethod
    def __get_pydantic_json_schema__(
        cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler
    ) -> dict[str, Any]:
        json_schema = super().__get_pydantic_json_schema__(core_schema, handler)
        json_schema = handler.resolve_ref_schema(json_schema)
        json_schema.update(examples=['example'])
        return json_schema


print(Model.model_json_schema())
"""
{'examples': ['example'], 'properties': {}, 'title': 'Model', 'type': 'object'}
"""
```

## Decorator on missing field

This error is raised when you define a decorator with a field that is not valid.

```python
from typing import Any

from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: str

        @field_validator('b')
        def check_b(cls, v: Any):
            return v

except PydanticUserError as exc_info:
    assert exc_info.code == 'decorator-missing-field'
```

You can use `check_fields=False` if you're inheriting from the model and intended this.

```python
from typing import Any

from pydantic import BaseModel, create_model, field_validator


class Model(BaseModel):
    @field_validator('a', check_fields=False)
    def check_a(cls, v: Any):
        return v


model = create_model('FooModel', a=(str, 'cake'), __base__=Model)
```

## Discriminator no field

This error is raised when a model in a discriminated union doesn't define a discriminator field.

```python
from typing import Literal, Union

from pydantic import BaseModel, Field, PydanticUserError


class Cat(BaseModel):
    c: str


class Dog(BaseModel):
    pet_type: Literal['dog']
    d: str


try:

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')
        number: int

except PydanticUserError as exc_info:
    assert exc_info.code == 'discriminator-no-field'
```

## Discriminator alias type

This error is raised when you define a non-string alias on a discriminator field.
```python
from typing import Literal, Union

from pydantic import AliasChoices, BaseModel, Field, PydanticUserError


class Cat(BaseModel):
    pet_type: Literal['cat'] = Field(
        validation_alias=AliasChoices('Pet', 'PET')
    )
    c: str


class Dog(BaseModel):
    pet_type: Literal['dog']
    d: str


try:

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')
        number: int

except PydanticUserError as exc_info:
    assert exc_info.code == 'discriminator-alias-type'
```

## Discriminator needs literal

This error is raised when you define a non-`Literal` type on a discriminator field.

```python
from typing import Literal, Union

from pydantic import BaseModel, Field, PydanticUserError


class Cat(BaseModel):
    pet_type: int
    c: str


class Dog(BaseModel):
    pet_type: Literal['dog']
    d: str


try:

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')
        number: int

except PydanticUserError as exc_info:
    assert exc_info.code == 'discriminator-needs-literal'
```

## Discriminator alias

This error is raised when you define different aliases on discriminator fields.

```python
from typing import Literal, Union

from pydantic import BaseModel, Field, PydanticUserError


class Cat(BaseModel):
    pet_type: Literal['cat'] = Field(validation_alias='PET')
    c: str


class Dog(BaseModel):
    pet_type: Literal['dog'] = Field(validation_alias='Pet')
    d: str


try:

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')
        number: int

except PydanticUserError as exc_info:
    assert exc_info.code == 'discriminator-alias'
```

## Invalid discriminator validator

This error is raised when you use a before, wrap, or plain validator on a discriminator field. This is disallowed because the discriminator field is used to determine the type of the model to use for validation, so you can't use a validator that might change its value.
```python
from typing import Literal, Union

from pydantic import BaseModel, Field, PydanticUserError, field_validator


class Cat(BaseModel):
    pet_type: Literal['cat']

    @field_validator('pet_type', mode='before')
    @classmethod
    def validate_pet_type(cls, v):
        if v == 'kitten':
            return 'cat'
        return v


class Dog(BaseModel):
    pet_type: Literal['dog']


try:

    class Model(BaseModel):
        pet: Union[Cat, Dog] = Field(discriminator='pet_type')
        number: int

except PydanticUserError as exc_info:
    assert exc_info.code == 'discriminator-validator'
```

This can be worked around by using a standard `Union`, dropping the discriminator:

```python
from typing import Literal, Union

from pydantic import BaseModel, field_validator


class Cat(BaseModel):
    pet_type: Literal['cat']

    @field_validator('pet_type', mode='before')
    @classmethod
    def validate_pet_type(cls, v):
        if v == 'kitten':
            return 'cat'
        return v


class Dog(BaseModel):
    pet_type: Literal['dog']


class Model(BaseModel):
    pet: Union[Cat, Dog]


assert Model(pet={'pet_type': 'kitten'}).pet.pet_type == 'cat'
```

## Callable discriminator case with no tag

This error is raised when a `Union` that uses a callable `Discriminator` doesn't have `Tag` annotations for all cases.
```python
from typing import Annotated, Union

from pydantic import BaseModel, Discriminator, PydanticUserError, Tag


def model_x_discriminator(v):
    if isinstance(v, str):
        return 'str'
    if isinstance(v, (dict, BaseModel)):
        return 'model'


# tag missing for both union choices
try:

    class DiscriminatedModel(BaseModel):
        x: Annotated[
            Union[str, 'DiscriminatedModel'],
            Discriminator(model_x_discriminator),
        ]

except PydanticUserError as exc_info:
    assert exc_info.code == 'callable-discriminator-no-tag'

# tag missing for `'DiscriminatedModel'` union choice
try:

    class DiscriminatedModel(BaseModel):
        x: Annotated[
            Union[Annotated[str, Tag('str')], 'DiscriminatedModel'],
            Discriminator(model_x_discriminator),
        ]

except PydanticUserError as exc_info:
    assert exc_info.code == 'callable-discriminator-no-tag'

# tag missing for `str` union choice
try:

    class DiscriminatedModel(BaseModel):
        x: Annotated[
            Union[str, Annotated['DiscriminatedModel', Tag('model')]],
            Discriminator(model_x_discriminator),
        ]

except PydanticUserError as exc_info:
    assert exc_info.code == 'callable-discriminator-no-tag'
```

## `TypedDict` version

This error is raised when you use typing.TypedDict instead of `typing_extensions.TypedDict` on Python < 3.12.
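On older Python versions, the fix is to import `TypedDict` from `typing_extensions` instead. A minimal sketch (the `Movie` type here is hypothetical):

```python
from typing_extensions import TypedDict  # works on Python < 3.12

from pydantic import TypeAdapter


class Movie(TypedDict):
    name: str
    year: int


ta = TypeAdapter(Movie)
print(ta.validate_python({'name': 'The Matrix', 'year': 1999}))
#> {'name': 'The Matrix', 'year': 1999}
```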
## Model parent field overridden

This error is raised when a field defined on a base class was overridden by a non-annotated attribute.

```python
from pydantic import BaseModel, PydanticUserError


class Foo(BaseModel):
    a: float


try:

    class Bar(Foo):
        x: float = 12.3
        a = 123.0

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-field-overridden'
```

## Model field missing annotation

This error is raised when a field doesn't have an annotation.

```python
from pydantic import BaseModel, Field, PydanticUserError

try:

    class Model(BaseModel):
        a = Field('foobar')
        b = None

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-field-missing-annotation'
```

If the field is not meant to be a field, you may be able to resolve the error by annotating it as a `ClassVar`:

```python
from typing import ClassVar

from pydantic import BaseModel


class Model(BaseModel):
    a: ClassVar[str]
```

Or updating `model_config['ignored_types']`:

```python
from pydantic import BaseModel, ConfigDict


class IgnoredType:
    pass


class MyModel(BaseModel):
    model_config = ConfigDict(ignored_types=(IgnoredType,))

    _a = IgnoredType()
    _b: int = IgnoredType()
    _c: IgnoredType
    _d: IgnoredType = IgnoredType()
```

## `Config` and `model_config` both defined

This error is raised when `class Config` and `model_config` are used together.

```python
from pydantic import BaseModel, ConfigDict, PydanticUserError

try:

    class Model(BaseModel):
        model_config = ConfigDict(from_attributes=True)
        a: str

        class Config:
            from_attributes = True

except PydanticUserError as exc_info:
    assert exc_info.code == 'config-both'
```

## Keyword arguments removed

This error is raised when you use keyword arguments that are no longer available in Pydantic V2. For example, `regex` is removed from Pydantic V2:

```python
from pydantic import BaseModel, Field, PydanticUserError

try:

    class Model(BaseModel):
        x: str = Field(regex='test')

except PydanticUserError as exc_info:
    assert exc_info.code == 'removed-kwargs'
```

## Circular reference schema

This error is raised when a circular reference is found that would otherwise result in an infinite recursion.

For example, this is a valid type alias:

```python
type A = list[A] | None
```

while these are not:

```python
type A = A

type B = C
type C = B
```

## JSON schema invalid type

This error is raised when Pydantic fails to generate a JSON schema for some `CoreSchema`.

```python
from pydantic import BaseModel, ImportString, PydanticUserError


class Model(BaseModel):
    a: ImportString


try:
    Model.model_json_schema()
except PydanticUserError as exc_info:
    assert exc_info.code == 'invalid-for-json-schema'
```

## JSON schema already used

This error is raised when the JSON schema generator has already been used to generate a JSON schema. You must create a new instance to generate a new JSON schema.

## BaseModel instantiated

This error is raised when you instantiate `BaseModel` directly. Pydantic models should inherit from `BaseModel`.

```python
from pydantic import BaseModel, PydanticUserError

try:
    BaseModel()
except PydanticUserError as exc_info:
    assert exc_info.code == 'base-model-instantiated'
```

## Undefined annotation

This error is raised when handling undefined annotations during `CoreSchema` generation.

```python
from pydantic import BaseModel, PydanticUndefinedAnnotation


class Model(BaseModel):
    a: 'B'  # noqa F821


try:
    Model.model_rebuild()
except PydanticUndefinedAnnotation as exc_info:
    assert exc_info.code == 'undefined-annotation'
```

## Schema for unknown type

This error is raised when Pydantic fails to generate a `CoreSchema` for some type.

```python
from pydantic import BaseModel, PydanticUserError

try:

    class Model(BaseModel):
        x: 43 = 123

except PydanticUserError as exc_info:
    assert exc_info.code == 'schema-for-unknown-type'
```

## Import error

This error is raised when you try to import an object that was available in Pydantic V1, but has been removed in Pydantic V2.

See the [Migration Guide](../../migration/) for more information.
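As a minimal sketch (not taken from the migration guide), importing `BaseSettings`, which moved to the separate `pydantic-settings` package in V2, raises a `PydanticImportError`:

```python
from pydantic.errors import PydanticImportError

try:
    # `BaseSettings` now lives in the `pydantic-settings` package,
    # so importing it from `pydantic` fails with a helpful message.
    from pydantic import BaseSettings  # noqa: F401
except PydanticImportError as exc_info:
    print(exc_info)
```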
## `create_model` field definitions

This error is raised when you provide invalid field definitions in create_model().

```python
from pydantic import PydanticUserError, create_model

try:
    create_model('FooModel', foo=(str, 'default value', 'more'))
except PydanticUserError as exc_info:
    assert exc_info.code == 'create-model-field-definitions'
```

The field definition syntax can be found in the [dynamic model creation](../../concepts/models/#dynamic-model-creation) documentation.

## `create_model` config base

This error is raised when you use both `__config__` and `__base__` together in `create_model`.

```python
from pydantic import BaseModel, ConfigDict, PydanticUserError, create_model

try:
    config = ConfigDict(frozen=True)
    model = create_model(
        'FooModel', foo=(int, ...), __config__=config, __base__=BaseModel
    )
except PydanticUserError as exc_info:
    assert exc_info.code == 'create-model-config-base'
```

## Validator with no fields

This error is raised when you use a validator bare (with no fields).

```python
from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: str

        @field_validator
        def checker(cls, v):
            return v

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-no-fields'
```

Validators should be used with fields and keyword arguments.

```python
from pydantic import BaseModel, field_validator


class Model(BaseModel):
    a: str

    @field_validator('a')
    def checker(cls, v):
        return v
```

## Invalid validator fields

This error is raised when you use a validator with non-string fields.

```python
from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: str
        b: str

        @field_validator(['a', 'b'])
        def check_fields(cls, v):
            return v

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-invalid-fields'
```

Fields should be passed as separate string arguments:

```python
from pydantic import BaseModel, field_validator


class Model(BaseModel):
    a: str
    b: str

    @field_validator('a', 'b')
    def check_fields(cls, v):
        return v
```

## Validator on instance method

This error is raised when you apply a validator on an instance method.

```python
from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: int = 1

        @field_validator('a')
        def check_a(self, value):
            return value

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-instance-method'
```

## `json_schema_input_type` used with the wrong mode

This error is raised when you explicitly specify a value for the `json_schema_input_type` argument and `mode` isn't set to either `'before'`, `'plain'` or `'wrap'`.

```python
from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: int = 1

        @field_validator('a', mode='after', json_schema_input_type=int)
        @classmethod
        def check_a(cls, value):
            return value

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-input-type'
```

Documenting the JSON Schema input type is only possible for validators where the given value can be anything. That is why it isn't available for `'after'` validators, where the value is first validated against the type annotation.

## Root validator, `pre`, `skip_on_failure`

If you use `@root_validator` with `pre=False` (the default) you MUST specify `skip_on_failure=True`. The `skip_on_failure=False` option is no longer available.

If you were not trying to set `skip_on_failure=False`, you can safely set `skip_on_failure=True`. If you do, this root validator will no longer be called if validation fails for any of the fields.

Please see the [Migration Guide](../../migration/) for more details.
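For reference, a minimal sketch of the accepted form on a hypothetical model (`@root_validator` is deprecated in V2, so the warning is silenced here, as in the example further below):

```python
import warnings

from pydantic import BaseModel, root_validator

warnings.filterwarnings('ignore', category=DeprecationWarning)


class Model(BaseModel):
    x: int

    # `pre=False` (the default), so `skip_on_failure=True` is required:
    @root_validator(skip_on_failure=True)
    def check_model(cls, values):
        return values
```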
## `model_serializer` instance methods

`@model_serializer` must be applied to instance methods.

This error is raised when you apply `model_serializer` on an instance method that does not have `self` as its first argument:

```python
from pydantic import BaseModel, PydanticUserError, model_serializer

try:

    class MyModel(BaseModel):
        a: int

        @model_serializer
        def _serialize(slf, x, y, z):
            return slf

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-serializer-instance-method'
```

Or on a class method:

```python
from pydantic import BaseModel, PydanticUserError, model_serializer

try:

    class MyModel(BaseModel):
        a: int

        @model_serializer
        @classmethod
        def _serialize(self, x, y, z):
            return self

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-serializer-instance-method'
```

## `validator`, `field`, `config`, and `info`

The `field` and `config` parameters are not available in Pydantic V2. Please use the `info` parameter instead.

You can access the configuration via `info.config`, but it is a dictionary instead of an object like it was in Pydantic V1.

The `field` argument is no longer available.
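As an illustrative sketch (the model and config setting here are hypothetical), the configuration can be read from `info` like this:

```python
from pydantic import BaseModel, ConfigDict, ValidationInfo, field_validator


class Model(BaseModel):
    model_config = ConfigDict(str_max_length=10)

    x: str

    @field_validator('x')
    @classmethod
    def check_x(cls, v: str, info: ValidationInfo):
        # In V2 the configuration is a dictionary, not an object:
        assert info.config is not None
        print(info.config.get('str_max_length'))
        return v


Model(x='test')
#> 10
```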
## Pydantic V1 validator signature

This error is raised when you use an unsupported signature for a Pydantic V1-style validator.

```python
import warnings

from pydantic import BaseModel, PydanticUserError, validator

warnings.filterwarnings('ignore', category=DeprecationWarning)

try:

    class Model(BaseModel):
        a: int

        @validator('a')
        def check_a(cls, value, foo):
            return value

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-v1-signature'
```

## Unrecognized `field_validator` signature

This error is raised when a `field_validator` or `model_validator` function has the wrong signature.

```python
from pydantic import BaseModel, PydanticUserError, field_validator

try:

    class Model(BaseModel):
        a: str

        @field_validator('a')
        @classmethod
        def check_a(cls):
            return 'a'

except PydanticUserError as exc_info:
    assert exc_info.code == 'validator-signature'
```

## Unrecognized `field_serializer` signature

This error is raised when the `field_serializer` function has the wrong signature.

```python
from pydantic import BaseModel, PydanticUserError, field_serializer

try:

    class Model(BaseModel):
        x: int

        @field_serializer('x')
        def no_args():
            return 'x'

except PydanticUserError as exc_info:
    assert exc_info.code == 'field-serializer-signature'
```

Valid field serializer signatures are:

```python
from typing import Any

from pydantic import FieldSerializationInfo, SerializerFunctionWrapHandler, field_serializer

# an instance method with the default mode or `mode='plain'`
@field_serializer('x')  # or @field_serializer('x', mode='plain')
def ser_x(self, value: Any, info: FieldSerializationInfo): ...

# a static method or function with the default mode or `mode='plain'`
@field_serializer('x')  # or @field_serializer('x', mode='plain')
@staticmethod
def ser_x(value: Any, info: FieldSerializationInfo): ...

# equivalent to
def ser_x(value: Any, info: FieldSerializationInfo): ...

field_serializer('x')(ser_x)

# an instance method with `mode='wrap'`
@field_serializer('x', mode='wrap')
def ser_x(self, value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo): ...

# a static method or function with `mode='wrap'`
@field_serializer('x', mode='wrap')
@staticmethod
def ser_x(value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo): ...

# equivalent to
def ser_x(value: Any, nxt: SerializerFunctionWrapHandler, info: FieldSerializationInfo): ...

field_serializer('x')(ser_x)

# For all of these, you can also choose to omit the `info` argument, for example:
@field_serializer('x')
def ser_x(self, value: Any): ...

@field_serializer('x', mode='wrap')
def ser_x(self, value: Any, handler: SerializerFunctionWrapHandler): ...
```

## Unrecognized `model_serializer` signature

This error is raised when the `model_serializer` function has the wrong signature.

```python
from pydantic import BaseModel, PydanticUserError, model_serializer

try:

    class MyModel(BaseModel):
        a: int

        @model_serializer
        def _serialize(self, x, y, z):
            return self

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-serializer-signature'
```

Valid model serializer signatures are:

```python
from pydantic import SerializerFunctionWrapHandler, SerializationInfo, model_serializer

# an instance method with the default mode or `mode='plain'`
@model_serializer  # or model_serializer(mode='plain')
def mod_ser(self, info: SerializationInfo): ...

# an instance method with `mode='wrap'`
@model_serializer(mode='wrap')
def mod_ser(self, handler: SerializerFunctionWrapHandler, info: SerializationInfo): ...

# For all of these, you can also choose to omit the `info` argument, for example:
@model_serializer(mode='plain')
def mod_ser(self): ...

@model_serializer(mode='wrap')
def mod_ser(self, handler: SerializerFunctionWrapHandler): ...
```

## Multiple field serializers

This error is raised when multiple `field_serializer` functions are defined for a field.

```python
from pydantic import BaseModel, PydanticUserError, field_serializer

try:

    class MyModel(BaseModel):
        x: int
        y: int

        @field_serializer('x', 'y')
        def serializer1(v):
            return f'{v:,}'

        @field_serializer('x')
        def serializer2(v):
            return v

except PydanticUserError as exc_info:
    assert exc_info.code == 'multiple-field-serializers'
```

## Invalid annotated type

This error is raised when an annotation cannot annotate a type.
```python
from typing import Annotated

from pydantic import BaseModel, FutureDate, PydanticUserError

try:

    class Model(BaseModel):
        foo: Annotated[str, FutureDate()]

except PydanticUserError as exc_info:
    assert exc_info.code == 'invalid-annotated-type'
```

## `config` is unused with `TypeAdapter`

You will get this error if you try to pass `config` to `TypeAdapter` when the type is a type that has its own config that cannot be overridden (currently this is only `BaseModel`, `TypedDict` and `dataclass`):

```python
from typing_extensions import TypedDict

from pydantic import ConfigDict, PydanticUserError, TypeAdapter


class MyTypedDict(TypedDict):
    x: int


try:
    TypeAdapter(MyTypedDict, config=ConfigDict(strict=True))
except PydanticUserError as exc_info:
    assert exc_info.code == 'type-adapter-config-unused'
```

Instead you'll need to subclass the type and override or set the config on it:

```python
from typing_extensions import TypedDict

from pydantic import ConfigDict, TypeAdapter


class MyTypedDict(TypedDict):
    x: int

    # or `model_config = ...` for BaseModel
    __pydantic_config__ = ConfigDict(strict=True)


TypeAdapter(MyTypedDict)  # ok
```

## Cannot specify `model_config['extra']` with `RootModel`

Because `RootModel` is not capable of storing or even accepting extra fields during initialization, we raise an error if you try to specify a value for the config setting `'extra'` when creating a subclass of `RootModel`:

```python
from pydantic import PydanticUserError, RootModel

try:

    class MyRootModel(RootModel):
        model_config = {'extra': 'allow'}
        root: int

except PydanticUserError as exc_info:
    assert exc_info.code == 'root-model-extra'
```

## Cannot evaluate type annotation

Because type annotations are evaluated *after* assignments, you might get unexpected results when using a type annotation name that clashes with one of your fields. We raise an error in the following case:

```python
from datetime import date

from pydantic import BaseModel, Field


class Model(BaseModel):
    date: date = Field(description='A date')
```

As a workaround, you can either use an alias or change your import:

```python
import datetime
# Or `from datetime import date as _date`

from pydantic import BaseModel, Field


class Model(BaseModel):
    date: datetime.date = Field(description='A date')
```

## Incompatible `dataclass` `init` and `extra` settings

Pydantic does not allow the specification of the `extra='allow'` setting on a dataclass while any of the fields have `init=False` set.

Thus, you may not do something like the following:

```python
from pydantic import ConfigDict, Field
from pydantic.dataclasses import dataclass


@dataclass(config=ConfigDict(extra='allow'))
class A:
    a: int = Field(init=False, default=1)
```

The above snippet results in the following error during schema building for the `A` dataclass:

```text
pydantic.errors.PydanticUserError: Field a has `init=False` and dataclass has config setting `extra="allow"`.
This combination is not allowed.
```
## Incompatible `init` and `init_var` settings on `dataclass` field

The `init=False` and `init_var=True` settings are mutually exclusive. Combining them on a dataclass field results in the `PydanticUserError` shown in the example below.

```python
from pydantic import Field
from pydantic.dataclasses import dataclass


@dataclass
class Foo:
    bar: str = Field(init=False, init_var=True)


"""
pydantic.errors.PydanticUserError: Dataclass field bar has init=False and init_var=True, but these are mutually exclusive.
"""
```

## `model_config` is used as a model field

This error is raised when `model_config` is used as the name of a field.

```python
from pydantic import BaseModel, PydanticUserError

try:

    class Model(BaseModel):
        model_config: str

except PydanticUserError as exc_info:
    assert exc_info.code == 'model-config-invalid-field-name'
```

## with_config is used on a `BaseModel` subclass

This error is raised when the with_config decorator is used on a class which is already a Pydantic model (use the `model_config` attribute instead).

```python
from pydantic import BaseModel, PydanticUserError, with_config

try:

    @with_config({'allow_inf_nan': True})
    class Model(BaseModel):
        bar: str

except PydanticUserError as exc_info:
    assert exc_info.code == 'with-config-on-model'
```

## `dataclass` is used on a `BaseModel` subclass

This error is raised when the Pydantic `dataclass` decorator is used on a class which is already a Pydantic model.

```python
from pydantic import BaseModel, PydanticUserError
from pydantic.dataclasses import dataclass

try:

    @dataclass
    class Model(BaseModel):
        bar: str

except PydanticUserError as exc_info:
    assert exc_info.code == 'dataclass-on-model'
```

## Unsupported type for `validate_call`

`validate_call` has some limitations on the callables it can validate. This error is raised when you try to use it with an unsupported callable. Currently the supported callables are functions (including lambdas, but not built-ins), methods, and instances of partial. In the case of partial, the function being partially applied must be one of the supported callables.

### `@classmethod`, `@staticmethod`, and `@property`

These decorators must be put before `validate_call`.

```python
from pydantic import PydanticUserError, validate_call

# error
try:

    class A:
        @validate_call
        @classmethod
        def f1(cls): ...

except PydanticUserError as exc_info:
    assert exc_info.code == 'validate-call-type'


# correct
@classmethod
@validate_call
def f2(cls): ...
```

### Classes

While classes are callables themselves, `validate_call` can't be applied on them, as it needs to know about which method to use (`__init__` or `__new__`) to fetch type annotations.

If you want to validate the constructor of a class, you should put `validate_call` on top of the appropriate method instead.

```python
from pydantic import PydanticUserError, validate_call

# error
try:

    @validate_call
    class A1: ...

except PydanticUserError as exc_info:
    assert exc_info.code == 'validate-call-type'


# correct
class A2:
    @validate_call
    def __init__(self): ...

    @validate_call
    def __new__(cls): ...
```

### Callable instances

Although instances can be callable by implementing a `__call__` method, currently the instances of these types cannot be validated with `validate_call`. This may change in the future, but for now, you should use `validate_call` explicitly on `__call__` instead.

```python
from pydantic import PydanticUserError, validate_call

# error
try:

    class A1:
        def __call__(self): ...

    validate_call(A1())
except PydanticUserError as exc_info:
    assert exc_info.code == 'validate-call-type'


# correct
class A2:
    @validate_call
    def __call__(self): ...
```
### Invalid signature

This is generally less common, but a possible reason is that you are trying to validate a method that doesn't have at least one argument (usually `self`).

```python
from pydantic import PydanticUserError, validate_call

try:

    class A:
        def f(): ...

    validate_call(A().f)
except PydanticUserError as exc_info:
    assert exc_info.code == 'validate-call-type'
```

## Unpack used without a TypedDict

This error is raised when Unpack is used with something other than a TypedDict class object to type hint variadic keyword parameters.

For reference, see the [related specification section](https://typing.readthedocs.io/en/latest/spec/callables.html#unpack-for-keyword-arguments) and [PEP 692](https://peps.python.org/pep-0692/).

```python
from typing_extensions import Unpack

from pydantic import PydanticUserError, validate_call

try:

    @validate_call
    def func(**kwargs: Unpack[int]):
        pass

except PydanticUserError as exc_info:
    assert exc_info.code == 'unpack-typed-dict'
```

## Overlapping unpacked TypedDict fields and arguments

This error is raised when the typed dictionary used to type hint variadic keyword parameters has field names overlapping with other parameters (unless positional only).

For reference, see the [related specification section](https://typing.readthedocs.io/en/latest/spec/callables.html#unpack-for-keyword-arguments) and [PEP 692](https://peps.python.org/pep-0692/).

```python
from typing_extensions import TypedDict, Unpack

from pydantic import PydanticUserError, validate_call


class TD(TypedDict):
    a: int


try:

    @validate_call
    def func(a: int, **kwargs: Unpack[TD]):
        pass

except PydanticUserError as exc_info:
    assert exc_info.code == 'overlapping-unpack-typed-dict'
```

## Invalid `Self` type

Currently, Self can only be used to annotate a field of a class (specifically, subclasses of BaseModel, NamedTuple, TypedDict, or dataclasses). Attempting to use Self in any other way will raise this error.

```python
from typing_extensions import Self

from pydantic import PydanticUserError, validate_call

try:

    @validate_call
    def func(self: Self):
        pass

except PydanticUserError as exc_info:
    assert exc_info.code == 'invalid-self-type'
```

The following example of validate_call() will also raise this error, even though it is correct from a type-checking perspective.
This may be supported in the future.

```python
from typing_extensions import Self

from pydantic import BaseModel, PydanticUserError, validate_call

try:

    class A(BaseModel):
        @validate_call
        def func(self, arg: Self):
            pass

except PydanticUserError as exc_info:
    assert exc_info.code == 'invalid-self-type'
```

## `validate_by_alias` and `validate_by_name` both set to `False`

This error is raised when you set `validate_by_alias` and `validate_by_name` to `False` in the configuration. This is not allowed because it would make it impossible to populate attributes.

```python
from pydantic import BaseModel, ConfigDict, Field, PydanticUserError

try:

    class Model(BaseModel):
        a: int = Field(alias='A')

        model_config = ConfigDict(
            validate_by_alias=False, validate_by_name=False
        )

except PydanticUserError as exc_info:
    assert exc_info.code == 'validate-by-alias-and-name-false'
```

Pydantic attempts to provide useful validation errors. Below are details on common validation errors users may encounter when working with pydantic, together with some suggestions on how to fix them.

## `arguments_type`

This error is raised when an object that would be passed as arguments to a function during validation is not a `tuple`, `list`, or `dict`. Because `NamedTuple` uses function calls in its implementation, this is one way to produce this error:

```python
from typing import NamedTuple

from pydantic import BaseModel, ValidationError


class MyNamedTuple(NamedTuple):
    x: int


class MyModel(BaseModel):
    field: MyNamedTuple


try:
    MyModel.model_validate({'field': 'invalid'})
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'arguments_type'
```

## `assertion_error`

This error is raised when a failing `assert` statement is encountered during validation:

```python
from pydantic import BaseModel, ValidationError, field_validator


class Model(BaseModel):
    x: int

    @field_validator('x')
    @classmethod
    def force_x_positive(cls, v):
        assert v > 0
        return v


try:
    Model(x=-1)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'assertion_error'
```

## `bool_parsing`

This error is raised when the input value is a string that is not valid for coercion to a boolean:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bool


Model(x='true')  # OK

try:
    Model(x='test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bool_parsing'
```

## `bool_type`

This error is raised when the input value's type is not valid for a `bool` field:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bool


try:
    Model(x=None)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bool_type'
```

This error is also raised for strict fields when the input value is not an instance of `bool`.

## `bytes_invalid_encoding`

This error is raised when a `bytes` value is invalid under the configured encoding. In the following example, `b'a'` is invalid hex (odd number of digits).
```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bytes
    model_config = {'val_json_bytes': 'hex'}


try:
    Model(x=b'a')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_invalid_encoding'
```

## `bytes_too_long`

This error is raised when the length of a `bytes` value is greater than the field's `max_length` constraint:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: bytes = Field(max_length=3)


try:
    Model(x=b'test')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_too_long'
```

## `bytes_too_short`

This error is raised when the length of a `bytes` value is less than the field's `min_length` constraint:

```python
from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: bytes = Field(min_length=3)


try:
    Model(x=b't')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_too_short'
```

## `bytes_type`

This error is raised when the input value's type is not valid for a `bytes` field:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: bytes


try:
    Model(x=123)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'bytes_type'
```

This error is also raised for strict fields when the input value is not an instance of `bytes`.

## `callable_type`

This error is raised when the input value is not valid as a `Callable`:

```python
from typing import Any, Callable

from pydantic import BaseModel, ImportString, ValidationError


class Model(BaseModel):
    x: ImportString[Callable[[Any], Any]]


Model(x='math:cos')  # OK

try:
    Model(x='os.path')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'callable_type'
```

## `complex_str_parsing`

This error is raised when the input value is a string but cannot be parsed as a complex number because it does not follow the [rule](https://docs.python.org/3/library/functions.html#complex) in Python:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    num: complex


try:
    # Complex numbers in JSON are expected to be valid complex strings.
    # This value `abc` is not a valid complex string.
    Model.model_validate_json('{"num": "abc"}')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'complex_str_parsing'
```
## `complex_type`

This error is raised when the input value cannot be interpreted as a complex number:

```python
from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    num: complex


try:
    Model(num=False)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'complex_type'
```

## `dataclass_exact_type`

This error is raised when validating a dataclass with `strict=True` and the input is not an instance of the dataclass:

```python
import pydantic.dataclasses
from pydantic import TypeAdapter, ValidationError


@pydantic.dataclasses.dataclass
class MyDataclass:
    x: str


adapter = TypeAdapter(MyDataclass)

print(adapter.validate_python(MyDataclass(x='test'), strict=True))
#> MyDataclass(x='test')
print(adapter.validate_python({'x': 'test'}))
#> MyDataclass(x='test')

try:
    adapter.validate_python({'x': 'test'}, strict=True)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'dataclass_exact_type'
```

## `dataclass_type`

This error is raised when the input value is not valid for a `dataclass` field:

```python
from pydantic import ValidationError, dataclasses


@dataclasses.dataclass
class Inner:
    x: int


@dataclasses.dataclass
class Outer:
    y: Inner


Outer(y=Inner(x=1))  # OK

try:
    Outer(y=1)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'dataclass_type'
```

## `date_from_datetime_inexact`

This error is raised when the input `datetime` value provided for a `date` field has a nonzero time component. For a timestamp to parse into a field of type `date`, the time components must all be zero:

```python
from datetime import date, datetime

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: date


Model(x='2023-01-01')  # OK
Model(x=datetime(2023, 1, 1))  # OK

try:
    Model(x=datetime(2023, 1, 1, 12))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_from_datetime_inexact'
```

## `date_from_datetime_parsing`

This error is raised when the input value is a string that cannot be parsed for a `date` field:

```python
from datetime import date

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: date


try:
    Model(x='XX1494012000')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_from_datetime_parsing'
```

## `date_future`

This error is raised when the input value provided for a `FutureDate` field is not in the future:

```python
from datetime import date

from pydantic import BaseModel, FutureDate, ValidationError


class Model(BaseModel):
    x: FutureDate


try:
    Model(x=date(2000, 1, 1))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_future'
```

## `date_parsing`

This error is raised when validating JSON where the input value is a string that cannot be parsed for a `date` field:

```python
import json
from datetime import date

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: date = Field(strict=True)


try:
    Model.model_validate_json(json.dumps({'x': '1'}))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_parsing'
```

## `date_past`

This error is raised when the value provided for a `PastDate` field is not in the past:

```python
from datetime import date, timedelta

from pydantic import BaseModel, PastDate, ValidationError


class Model(BaseModel):
    x: PastDate


try:
    Model(x=date.today() + timedelta(1))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_past'
```
## `date_type`

This error is raised when the input value's type is not valid for a `date` field:

```python
from datetime import date

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: date


try:
    Model(x=None)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'date_type'
```

This error is also raised for strict fields when the input value is not an instance of `date`.

## `datetime_from_date_parsing`

Note

Support for this error, along with support for parsing datetimes from `yyyy-MM-DD` dates, will be added in `v2.6.0`

This error is raised when the input value is a string that cannot be parsed for a `datetime` field:

```python
from datetime import datetime

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: datetime


try:
    # there is no 13th month
    Model(x='2023-13-01')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_from_date_parsing'
```

## `datetime_future`

This error is raised when the value provided for a `FutureDatetime` field is not in the future:

```python
from datetime import datetime

from pydantic import BaseModel, FutureDatetime, ValidationError


class Model(BaseModel):
    x: FutureDatetime


try:
    Model(x=datetime(2000, 1, 1))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_future'
```

## `datetime_object_invalid`

This error is raised when something about the `datetime` object is not valid:

```python
from datetime import datetime, tzinfo

from pydantic import AwareDatetime, BaseModel, ValidationError


class CustomTz(tzinfo):
    # utcoffset is not implemented!

    def tzname(self, _dt):
        return 'CustomTZ'


class Model(BaseModel):
    x: AwareDatetime


try:
    Model(x=datetime(2023, 1, 1, tzinfo=CustomTz()))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_object_invalid'
```

## `datetime_parsing`

This error is raised when the value is a string that cannot be parsed for a `datetime` field:

```python
import json
from datetime import datetime

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: datetime = Field(strict=True)


try:
    Model.model_validate_json(json.dumps({'x': 'not a datetime'}))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_parsing'
```

## `datetime_past`

This error is raised when the value provided for a `PastDatetime` field is not in the past:

```python
from datetime import datetime, timedelta

from pydantic import BaseModel, PastDatetime, ValidationError


class Model(BaseModel):
    x: PastDatetime


try:
    Model(x=datetime.now() + timedelta(100))
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_past'
```

## `datetime_type`

This error is raised when the input value's type is not valid for a `datetime` field:

```python
from datetime import datetime

from pydantic import BaseModel, ValidationError


class Model(BaseModel):
    x: datetime


try:
    Model(x=None)
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_type'
```

This error is also raised for strict fields when the input value is not an instance of `datetime`.
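A minimal sketch of that strict case: even a well-formed ISO 8601 string is rejected, because it is not a `datetime` instance.

```python
from datetime import datetime

from pydantic import BaseModel, Field, ValidationError


class Model(BaseModel):
    x: datetime = Field(strict=True)


try:
    # a valid ISO string, but not a `datetime` instance
    Model(x='2023-01-01T12:00:00')
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'datetime_type'
```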
## `decimal_max_digits` This error is raised when the value provided for a `Decimal` has too many digits: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(max_digits=3) try: Model(x='42.1234') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_max_digits' ``` ## `decimal_max_places` This error is raised when the value provided for a `Decimal` has too many digits after the decimal point: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(decimal_places=3) try: Model(x='42.1234') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_max_places' ``` ## `decimal_parsing` This error is raised when the value provided for a `Decimal` could not be parsed as a decimal number: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(decimal_places=3) try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_parsing' ``` ## `decimal_type` This error is raised when the value provided for a `Decimal` is of the wrong type: ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(decimal_places=3) try: Model(x=[1, 2, 3]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_type' ``` This error is also raised for strict fields when the input value is not an instance of `Decimal`. ## `decimal_whole_digits` This error is raised when the value provided for a `Decimal` has more digits before the decimal point than `max_digits` - `decimal_places` (as long as both are specified): ```python from decimal import Decimal from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: Decimal = Field(max_digits=6, decimal_places=3) try: Model(x='12345.6') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'decimal_whole_digits' ``` ## `dict_type` This error is raised when the input value's type is not `dict` for a `dict` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: dict try: Model(x=['1', '2']) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'dict_type' ``` ## `enum` This error is raised when the input value is not a member of the field's `enum`: ```python from enum import Enum from pydantic import BaseModel, ValidationError class MyEnum(str, Enum): option = 'option' class Model(BaseModel): x: MyEnum try: Model(x='other_option') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'enum' ``` ## `extra_forbidden` This error is raised when the input value contains extra fields, but `model_config['extra'] == 'forbid'`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): x: str model_config = ConfigDict(extra='forbid') try: Model(x='test', y='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'extra_forbidden' ``` You can read more about the `extra` configuration in the Extra Attributes section.
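As a small additional sketch, the name of the rejected extra key is reported in the error's `loc`:

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(extra='forbid')
    x: str

try:
    Model(x='test', y='test')
except ValidationError as exc:
    # the offending extra key shows up in the error location
    print(exc.errors()[0]['loc'])
    #> ('y',)
```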
## `finite_number` This error is raised when the value is infinite, or too large to be represented as a 64-bit floating point number during validation: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x=2.2250738585072011e308) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'finite_number' ``` ## `float_parsing` This error is raised when the value is a string that can't be parsed as a `float`: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: float try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'float_parsing' ``` ## `float_type` This error is raised when the input value's type is not valid for a `float` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: float try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'float_type' ``` ## `frozen_field` This error is raised when you attempt to assign a value to a field with `frozen=True`, or to delete such a field: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field('test', frozen=True) model = Model() try: model.x = 'test1' except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'frozen_field' try: del model.x except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'frozen_field' ``` ## `frozen_instance` This error is raised when `model_config['frozen'] == True` and you attempt to delete or assign a new value to any of the fields: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): x: int model_config = ConfigDict(frozen=True) m = Model(x=1) try: m.x = 2 except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'frozen_instance' try: del m.x except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'frozen_instance' ``` ## `frozen_set_type` This error is raised when the input value's type is not valid for a `frozenset` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: frozenset try: model = Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'frozen_set_type' ``` ## `get_attribute_error` This error is raised when `model_config['from_attributes'] == True` and an error is raised while reading the attributes: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Foobar: def __init__(self): self.x = 1 @property def y(self): raise RuntimeError('intentional error') class Model(BaseModel): x: int y: str model_config = ConfigDict(from_attributes=True) try: Model.model_validate(Foobar()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'get_attribute_error' ``` ## `greater_than` This error is raised when the value is not greater than the field's `gt` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(gt=10) try: Model(x=10) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'greater_than' ``` ## `greater_than_equal` This error is raised when the value is not greater than or equal to the field's `ge` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(ge=10) try: Model(x=9) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'greater_than_equal' ``` ## `int_from_float` This error is raised when you
provide a `float` value for an `int` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x=0.5) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_from_float' ``` ## `int_parsing` This error is raised when the value can't be parsed as `int`: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_parsing' ``` ## `int_parsing_size` This error is raised when attempting to parse a Python or JSON value from a string outside the maximum range that Python `str` to `int` parsing permits: ```python import json from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int # from Python assert Model(x='1' * 4_300).x == int('1' * 4_300) # OK too_long = '1' * 4_301 try: Model(x=too_long) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_parsing_size' # from JSON try: Model.model_validate_json(json.dumps({'x': too_long})) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_parsing_size' ``` ## `int_type` This error is raised when the input value's type is not valid for an `int` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: int try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'int_type' ``` ## `invalid_key` This error is raised when attempting to validate a `dict` that has a key that is not an instance of `str`: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Model(BaseModel): x: int model_config = ConfigDict(extra='allow') try: Model.model_validate({'x': 1, b'y': 2}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'invalid_key' ``` ## `is_instance_of` This error is raised when the input value is not an instance of the expected type: ```python from pydantic import BaseModel, ConfigDict, ValidationError class Nested: x: str class Model(BaseModel): y: Nested model_config = ConfigDict(arbitrary_types_allowed=True) try: Model(y='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'is_instance_of' ``` ## `is_subclass_of` This error is raised when the input value is not a subclass of the expected type: ```python from pydantic import BaseModel, ValidationError class Nested: x: str class Model(BaseModel): y: type[Nested] try: Model(y='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'is_subclass_of' ``` ## `iterable_type` This error is raised when the input value is not valid as an `Iterable`: ```python from collections.abc import Iterable from pydantic import BaseModel, ValidationError class Model(BaseModel): y: Iterable try: Model(y=123) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'iterable_type' ``` ## `iteration_error` This error is raised when an error occurs during iteration: ```python from pydantic import BaseModel, ValidationError def gen(): yield 1 raise RuntimeError('error') class Model(BaseModel): x: list[int] try: Model(x=gen()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'iteration_error' ``` ## `json_invalid` This error is raised when the input value is not
a valid JSON string: ```python from pydantic import BaseModel, Json, ValidationError class Model(BaseModel): x: Json try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'json_invalid' ``` ## `json_type` This error is raised when the input value is of a type that cannot be parsed as JSON: ```python from pydantic import BaseModel, Json, ValidationError class Model(BaseModel): x: Json try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'json_type' ``` ## `less_than` This error is raised when the input value is not less than the field's `lt` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(lt=10) try: Model(x=10) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'less_than' ``` ## `less_than_equal` This error is raised when the input value is not less than or equal to the field's `le` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(le=10) try: Model(x=11) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'less_than_equal' ``` ## `list_type` This error is raised when the input value's type is not valid for a `list` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: list[int] try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'list_type' ``` ## `literal_error` This error is raised when the input value is not one of the expected literal values: ```python from typing import Literal from pydantic import BaseModel, ValidationError class Model(BaseModel): x: Literal['a', 'b'] Model(x='a') # OK try: Model(x='c') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'literal_error' ``` ## `mapping_type` This error is raised when a problem occurs during validation due to a failure in a call to the methods from the `Mapping` protocol, such as `.items()`: ```python from collections.abc import Mapping from pydantic import BaseModel, ValidationError class BadMapping(Mapping): def items(self): raise ValueError() def __iter__(self): raise ValueError() def __getitem__(self, key): raise ValueError() def __len__(self): return 1 class Model(BaseModel): x: dict[str, str] try: Model(x=BadMapping()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'mapping_type' ``` ## `missing` This error is raised when there are required fields missing from the input value: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: str try: Model() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing' ``` ## `missing_argument` This error is raised when a required positional-or-keyword argument is not passed to a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int): return a try: foo() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing_argument' ``` ## `missing_keyword_only_argument` This error is raised when a required keyword-only argument is not passed to a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(*, a: int): return a try: foo() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing_keyword_only_argument' ``` ## `missing_positional_only_argument` This error is raised when a required 
positional-only argument is not passed to a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int, /): return a try: foo() except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'missing_positional_only_argument' ``` ## `model_attributes_type` This error is raised when the input value is not a valid dictionary, model instance, or instance that fields can be extracted from: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: int b: int # simply validating a dict print(Model.model_validate({'a': 1, 'b': 2})) #> a=1 b=2 class CustomObj: def __init__(self, a, b): self.a = a self.b = b # using from_attributes to extract fields from an object print(Model.model_validate(CustomObj(3, 4), from_attributes=True)) #> a=3 b=4 try: Model.model_validate('not an object', from_attributes=True) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'model_attributes_type' ``` ## `model_type` This error is raised when the input to a model is not an instance of the model or dict: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): a: int b: int # simply validating a dict m = Model.model_validate({'a': 1, 'b': 2}) print(m) #> a=1 b=2 # validating an existing model instance print(Model.model_validate(m)) #> a=1 b=2 try: Model.model_validate('not an object') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'model_type' ``` ## `multiple_argument_values` This error is raised when you provide multiple values for a single argument while calling a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int): return a try: foo(1, a=2) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'multiple_argument_values' ``` ## `multiple_of` This error is raised when the input is not a multiple of a field's `multiple_of` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: int = Field(multiple_of=5) try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'multiple_of' ``` ## `needs_python_object` This type of error is raised when validation is attempted from a format that cannot be converted to a Python object.
For example, we cannot check `isinstance` or `issubclass` from JSON: ```python import json from pydantic import BaseModel, ValidationError class Model(BaseModel): bm: type[BaseModel] try: Model.model_validate_json(json.dumps({'bm': 'not a basemodel class'})) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'needs_python_object' ``` ## `no_such_attribute` This error is raised when `validate_assignment=True` in the config, and you attempt to assign a value to an attribute that is not an existing field: ```python from pydantic import ConfigDict, ValidationError, dataclasses @dataclasses.dataclass(config=ConfigDict(validate_assignment=True)) class MyDataclass: x: int m = MyDataclass(x=1) try: m.y = 10 except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'no_such_attribute' ``` ## `none_required` This error is raised when the input value is not `None` for a field that requires `None`: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: None try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'none_required' ``` Note You may encounter this error when there is a naming collision in your model between a field name and its type. More specifically, this error is likely to be thrown when the default value of that field is `None`. For example, the following would yield the `none_required` validation error since the field `int` is set to a default value of `None` and has the exact same name as its type, which causes problems with validation. ```python from typing import Optional from pydantic import BaseModel class M1(BaseModel): int: Optional[int] = None m = M1(int=123) # errors ``` ## `recursion_loop` This error is raised when a cyclic reference is detected: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: list['Model'] d = {'x': []} d['x'].append(d) try: Model(**d) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'recursion_loop' ``` ## `set_item_not_hashable` This error is raised when an unhashable value is validated against a set or a frozenset: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: set[object] class Unhashable: __hash__ = None try: Model(x=[{'a': 'b'}, Unhashable()]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'set_item_not_hashable' print(repr(exc.errors()[1]['type'])) #> 'set_item_not_hashable' ``` ## `set_type` This error is raised when the value type is not valid for a `set` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: set[int] try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'set_type' ``` ## `string_pattern_mismatch` This error is raised when the input value doesn't match the field's `pattern` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field(pattern='test') try: Model(x='1') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_pattern_mismatch' ``` ## `string_sub_type` This error is raised when the value is an instance of a strict subtype of `str` when the field is strict: ```python from enum import Enum from pydantic import BaseModel, Field,
ValidationError class MyEnum(str, Enum): foo = 'foo' class Model(BaseModel): x: str = Field(strict=True) try: Model(x=MyEnum.foo) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_sub_type' ``` ## `string_too_long` This error is raised when the input value is a string whose length is greater than the field's `max_length` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field(max_length=3) try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_too_long' ``` ## `string_too_short` This error is raised when the input value is a string whose length is less than the field's `min_length` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: str = Field(min_length=3) try: Model(x='t') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_too_short' ``` ## `string_type` This error is raised when the input value's type is not valid for a `str` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: str try: Model(x=1) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_type' ``` This error is also raised for strict fields when the input value is not an instance of `str`. ## `string_unicode` This error is raised when the value cannot be parsed as a Unicode string: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: str try: Model(x=b'\x81') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'string_unicode' ``` ## `time_delta_parsing` This error is raised when the input value is a string that cannot be parsed for a `timedelta` field: ```python from datetime import timedelta from pydantic import BaseModel, ValidationError class Model(BaseModel): x: timedelta try: Model(x='t') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_delta_parsing' ``` ## `time_delta_type` This error is raised when the input value's type is not valid for a `timedelta` field: ```python from datetime import timedelta from pydantic import BaseModel, ValidationError class Model(BaseModel): x: timedelta try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_delta_type' ``` This error is also raised for strict fields when the input value is not an instance of `timedelta`. ## `time_parsing` This error is raised when the input value is a string that cannot be parsed for a `time` field: ```python from datetime import time from pydantic import BaseModel, ValidationError class Model(BaseModel): x: time try: Model(x='25:20:30.400') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_parsing' ``` ## `time_type` This error is raised when the value type is not valid for a `time` field: ```python from datetime import time from pydantic import BaseModel, ValidationError class Model(BaseModel): x: time try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'time_type' ``` This error is also raised for strict fields when the input value is not an instance of `time`. 
## `timezone_aware` This error is raised when the `datetime` value provided for a timezone-aware `datetime` field doesn't have timezone information: ```python from datetime import datetime from pydantic import AwareDatetime, BaseModel, ValidationError class Model(BaseModel): x: AwareDatetime try: Model(x=datetime.now()) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'timezone_aware' ``` ## `timezone_naive` This error is raised when the `datetime` value provided for a timezone-naive `datetime` field has timezone info: ```python from datetime import datetime, timezone from pydantic import BaseModel, NaiveDatetime, ValidationError class Model(BaseModel): x: NaiveDatetime try: Model(x=datetime.now(tz=timezone.utc)) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'timezone_naive' ``` ## `too_long` This error is raised when the input value's length is greater than the field's `max_length` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: list[int] = Field(max_length=3) try: Model(x=[1, 2, 3, 4]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'too_long' ``` ## `too_short` This error is raised when the value length is less than the field's `min_length` constraint: ```python from pydantic import BaseModel, Field, ValidationError class Model(BaseModel): x: list[int] = Field(min_length=3) try: Model(x=[1, 2]) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'too_short' ``` ## `tuple_type` This error is raised when the input value's type is not valid for a `tuple` field: ```python from pydantic import BaseModel, ValidationError class Model(BaseModel): x: tuple[int] try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'tuple_type' ``` This error is also raised for strict fields when the input value is not an instance of `tuple`. 
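For instance, a minimal sketch of the strict case:

```python
from pydantic import BaseModel, Field, ValidationError

class Model(BaseModel):
    x: tuple[int] = Field(strict=True)

try:
    # in strict mode, a list is not coerced to a tuple
    Model(x=[1])
except ValidationError as exc:
    print(repr(exc.errors()[0]['type']))
    #> 'tuple_type'
```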
## `unexpected_keyword_argument` This error is raised when you provide a value by keyword for a positional-only argument while calling a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(a: int, /): return a try: foo(a=2) except ValidationError as exc: print(repr(exc.errors()[1]['type'])) #> 'unexpected_keyword_argument' ``` It is also raised when using pydantic.dataclasses and `extra='forbid'`: ```python from pydantic import TypeAdapter, ValidationError from pydantic.dataclasses import dataclass @dataclass(config={'extra': 'forbid'}) class Foo: bar: int try: TypeAdapter(Foo).validate_python({'bar': 1, 'foobar': 2}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'unexpected_keyword_argument' ``` ## `unexpected_positional_argument` This error is raised when you provide a positional value for a keyword-only argument while calling a function decorated with `validate_call`: ```python from pydantic import ValidationError, validate_call @validate_call def foo(*, a: int): return a try: foo(2) except ValidationError as exc: print(repr(exc.errors()[1]['type'])) #> 'unexpected_positional_argument' ``` ## `union_tag_invalid` This error is raised when the input's discriminator is not one of the expected values: ```python from typing import Literal, Union from pydantic import BaseModel, Field, ValidationError class BlackCat(BaseModel): pet_type: Literal['blackcat'] class WhiteCat(BaseModel): pet_type: Literal['whitecat'] class Model(BaseModel): cat: Union[BlackCat, WhiteCat] = Field(discriminator='pet_type') try: Model(cat={'pet_type': 'dog'}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'union_tag_invalid' ``` ## `union_tag_not_found` This error is raised when it is not possible to extract a discriminator value from the input: ```python from typing import Literal, Union from pydantic import BaseModel, Field, ValidationError class BlackCat(BaseModel): pet_type: Literal['blackcat'] class WhiteCat(BaseModel): pet_type: Literal['whitecat'] class Model(BaseModel): cat: Union[BlackCat, WhiteCat] = Field(discriminator='pet_type') try: Model(cat={'name': 'blackcat'}) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'union_tag_not_found' ``` ## `url_parsing` This error is raised when the input value cannot be parsed as a URL: ```python from pydantic import AnyUrl, BaseModel, ValidationError class Model(BaseModel): x: AnyUrl try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_parsing' ``` ## `url_scheme` This error is raised when the URL scheme is not valid for the URL type of the field: ```python
from pydantic import BaseModel, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl try: Model(x='ftp://example.com') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_scheme' ``` ## `url_syntax_violation` This error is raised when the URL syntax is not valid: ```python from pydantic import BaseModel, Field, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl = Field(strict=True) try: Model(x='http:////example.com') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_syntax_violation' ``` ## `url_too_long` This error is raised when the URL length is greater than 2083: ```python from pydantic import BaseModel, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl try: Model(x='x' * 2084) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_too_long' ``` ## `url_type` This error is raised when the input value's type is not valid for a URL field: ```python from pydantic import BaseModel, HttpUrl, ValidationError class Model(BaseModel): x: HttpUrl try: Model(x=None) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'url_type' ``` ## `uuid_parsing` This error is raised when the input value is a string that cannot be parsed as a UUID: ```python from uuid import UUID from pydantic import BaseModel, ValidationError class Model(BaseModel): u: UUID try: Model(u='12345678-124-1234-1234-567812345678') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'uuid_parsing' ``` ## `uuid_type` This error is raised when the input value's type is not valid for a UUID field (it must be a `str`, `bytes`, or `UUID` instance): ```python from uuid import UUID from pydantic import BaseModel, ValidationError class Model(BaseModel): u: UUID try: Model(u=1234567812412341234567812345678) except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'uuid_type' ``` ## `uuid_version` This error is raised when the input value does not match the expected UUID version: ```python from pydantic import UUID5, BaseModel, ValidationError class Model(BaseModel): u: UUID5 try: Model(u='a6cc5730-2261-11ee-9c43-2eb5a363657c') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'uuid_version' ``` ## `value_error` This error is raised when a `ValueError` is raised during validation: ```python from pydantic import BaseModel, ValidationError, field_validator class Model(BaseModel): x: str @field_validator('x') @classmethod def repeat_b(cls, v): raise ValueError() try: Model(x='test') except ValidationError as exc: print(repr(exc.errors()[0]['type'])) #> 'value_error' ``` This page provides example snippets for creating more complex, custom validators in Pydantic. Many of these examples are adapted from Pydantic issues and discussions, and are intended to showcase the flexibility and power of Pydantic's validation system. ## Custom `datetime` Validator via Annotated Metadata In this example, we'll construct a custom validator, attached to an Annotated type, that ensures a datetime object adheres to a given timezone constraint. The custom validator supports string specification of the timezone, and will raise an error if the datetime object does not have the correct timezone. We use `__get_pydantic_core_schema__` in the validator to customize the schema of the annotated type (in this case, datetime), which allows us to add custom validation logic. Notably, we use a `wrap` validator function so that we can perform operations both before and after the default `pydantic` validation of a datetime.
```python import datetime as dt from dataclasses import dataclass from pprint import pprint from typing import Annotated, Any, Callable, Optional import pytz from pydantic_core import CoreSchema, core_schema from pydantic import ( GetCoreSchemaHandler, PydanticUserError, TypeAdapter, ValidationError, ) @dataclass(frozen=True) class MyDatetimeValidator: tz_constraint: Optional[str] = None def tz_constraint_validator( self, value: dt.datetime, handler: Callable, # (1)! ): """Validate tz_constraint and tz_info.""" # handle naive datetimes if self.tz_constraint is None: assert ( value.tzinfo is None ), 'tz_constraint is None, but provided value is tz-aware.' return handler(value) # validate tz_constraint and tz-aware tzinfo if self.tz_constraint not in pytz.all_timezones: raise PydanticUserError( f'Invalid tz_constraint: {self.tz_constraint}', code='unevaluable-type-annotation', ) result = handler(value) # (2)! assert self.tz_constraint == str( result.tzinfo ), f'Invalid tzinfo: {str(result.tzinfo)}, expected: {self.tz_constraint}' return result def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: return core_schema.no_info_wrap_validator_function( self.tz_constraint_validator, handler(source_type), ) LA = 'America/Los_Angeles' ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(LA)]) print( ta.validate_python(dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LA))) ) #> 2023-01-01 00:00:00-07:53 LONDON = 'Europe/London' try: ta.validate_python( dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LONDON)) ) except ValidationError as ve: pprint(ve.errors(), width=100) """ [{'ctx': {'error': AssertionError('Invalid tzinfo: Europe/London, expected: America/Los_Angeles')}, 'input': datetime.datetime(2023, 1, 1, 0, 0, tzinfo=<DstTzInfo 'Europe/London' LMT-1 day, 23:59:00 STD>), 'loc': (), 'msg': 'Assertion failed, Invalid tzinfo: Europe/London, expected: America/Los_Angeles', 'type': 'assertion_error', 'url': 'https://errors.pydantic.dev/2.8/v/assertion_error'}] """ ``` 1. The `handler` function is what we call to validate the input with standard `pydantic` validation 1. We call the `handler` function to validate the input with standard `pydantic` validation in this wrap validator We can also enforce UTC offset constraints in a similar way. Assuming we have a `lower_bound` and an `upper_bound`, we can create a custom validator to ensure our `datetime` has a UTC offset that is inclusive within the boundary we define: ```python import datetime as dt from dataclasses import dataclass from pprint import pprint from typing import Annotated, Any, Callable import pytz from pydantic_core import CoreSchema, core_schema from pydantic import GetCoreSchemaHandler, TypeAdapter, ValidationError @dataclass(frozen=True) class MyDatetimeValidator: lower_bound: int upper_bound: int def validate_tz_bounds(self, value: dt.datetime, handler: Callable): """Validate and test bounds""" assert value.utcoffset() is not None, 'UTC offset must exist' assert self.lower_bound <= self.upper_bound, 'Invalid bounds' result = handler(value) hours_offset = value.utcoffset().total_seconds() / 3600 assert ( self.lower_bound <= hours_offset <= self.upper_bound ), 'Value out of bounds' return result def __get_pydantic_core_schema__( self, source_type: Any, handler: GetCoreSchemaHandler, ) -> CoreSchema: return core_schema.no_info_wrap_validator_function( self.validate_tz_bounds, handler(source_type), ) LA = 'America/Los_Angeles' # UTC-7 or UTC-8 ta = TypeAdapter(Annotated[dt.datetime, MyDatetimeValidator(-10, -5)]) print( ta.validate_python(dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LA))) ) #> 2023-01-01 00:00:00-07:53 LONDON = 'Europe/London' try: print( ta.validate_python( dt.datetime(2023, 1, 1, 0, 0, tzinfo=pytz.timezone(LONDON)) ) ) except ValidationError as e: pprint(e.errors(), width=100) """ [{'ctx': {'error': AssertionError('Value out of bounds')}, 'input': datetime.datetime(2023, 1, 1, 0, 0, tzinfo=<DstTzInfo 'Europe/London' LMT-1 day, 23:59:00 STD>), 'loc': (), 'msg': 'Assertion failed, Value out of bounds', 'type': 'assertion_error', 'url': 'https://errors.pydantic.dev/2.8/v/assertion_error'}] """ ``` ## Validating Nested Model Fields Here, we demonstrate two ways to validate a field of a nested model, where the validator utilizes data from the parent model. In this example, we construct a validator that checks that each user's password is not in a list of forbidden passwords specified by the parent model. One way to do this is to place a custom validator on the outer model: ```python from typing_extensions import Self from pydantic import BaseModel, ValidationError, model_validator class User(BaseModel): username: str password: str class Organization(BaseModel): forbidden_passwords: list[str] users: list[User] @model_validator(mode='after') def validate_user_passwords(self) -> Self: """Check that user password is not in forbidden list. Raise a validation error if a forbidden password is encountered.""" for user in self.users: current_pw = user.password if current_pw in self.forbidden_passwords: raise ValueError( f'Password {current_pw} is forbidden. Please choose another password for user {user.username}.' ) return self data = { 'forbidden_passwords': ['123'], 'users': [ {'username': 'Spartacat', 'password': '123'}, {'username': 'Iceburgh', 'password': '87'}, ], } try: org = Organization(**data) except ValidationError as e: print(e) """ 1 validation error for Organization Value error, Password 123 is forbidden. Please choose another password for user Spartacat. [type=value_error, input_value={'forbidden_passwords': [...gh', 'password': '87'}]}, input_type=dict] """ ``` Alternatively, a custom validator can be used in the nested model class (`User`), with the forbidden passwords data from the parent model being passed in via validation context. Warning The ability to mutate the context within a validator adds a lot of power to nested validation, but can also lead to confusing or hard-to-debug code. Use this approach at your own risk! ```python from pydantic import BaseModel, ValidationError, ValidationInfo, field_validator class User(BaseModel): username: str password: str @field_validator('password', mode='after') @classmethod def validate_user_passwords( cls, password: str, info: ValidationInfo ) -> str: """Check that user password is not in forbidden list.""" forbidden_passwords = ( info.context.get('forbidden_passwords', []) if info.context else [] ) if password in forbidden_passwords: raise ValueError(f'Password {password} is forbidden.') return password class Organization(BaseModel): forbidden_passwords: list[str] users: list[User] @field_validator('forbidden_passwords', mode='after') @classmethod def add_context(cls, v: list[str], info: ValidationInfo) -> list[str]: if info.context is not None: info.context.update({'forbidden_passwords': v}) return v data = { 'forbidden_passwords': ['123'], 'users': [ {'username': 'Spartacat', 'password': '123'}, {'username': 'Iceburgh', 'password': '87'}, ], } try: org = Organization.model_validate(data, context={}) except ValidationError as e: print(e) """ 1 validation error for Organization users.0.password Value error, Password 123 is forbidden. [type=value_error, input_value='123', input_type=str] """ ``` Note that if no context is passed to `model_validate`, then `info.context` will be `None` and the forbidden passwords list will not get added to the context in the above implementation. As such, `validate_user_passwords` would not carry out the desired password validation. More details about validation context can be found [here](../../concepts/validators/#validation-context). `pydantic` is a great tool for validating data coming from various sources. In this section, we will look at how to validate data from different types of files. Note If you're using any of the below file formats to parse configuration / settings, you might want to consider using the pydantic-settings library, which offers built-in support for parsing this type of data. ## JSON data `.json` files are a common way to store key / value data in a human-readable format. Here is an example of a `.json` file: ```json { "name": "John Doe", "age": 30, "email": "john@example.com" } ``` To validate this data, we can use a `pydantic` model: ```python import pathlib from pydantic import BaseModel, EmailStr, PositiveInt class Person(BaseModel): name: str age: PositiveInt email: EmailStr json_string = pathlib.Path('person.json').read_text() person = Person.model_validate_json(json_string) print(repr(person)) #> Person(name='John Doe', age=30, email='john@example.com') ``` If the data in the file is not valid, `pydantic` will raise a ValidationError.
Let's say we have the following `.json` file: ```json { "age": -30, "email": "not-an-email-address" } ``` This data is flawed for three reasons: 1. It's missing the `name` field. 1. The `age` field is negative. 1. The `email` field is not a valid email address. When we try to validate this data, `pydantic` raises a ValidationError with all of the above issues: ```python import pathlib from pydantic import BaseModel, EmailStr, PositiveInt, ValidationError class Person(BaseModel): name: str age: PositiveInt email: EmailStr json_string = pathlib.Path('person.json').read_text() try: person = Person.model_validate_json(json_string) except ValidationError as err: print(err) """ 3 validation errors for Person name Field required [type=missing, input_value={'age': -30, 'email': 'not-an-email-address'}, input_type=dict] For further information visit https://errors.pydantic.dev/2.10/v/missing age Input should be greater than 0 [type=greater_than, input_value=-30, input_type=int] For further information visit https://errors.pydantic.dev/2.10/v/greater_than email value is not a valid email address: An email address must have an @-sign. [type=value_error, input_value='not-an-email-address', input_type=str] """ ``` Often, it's the case that you have an abundance of a certain type of data within a `.json` file. For example, you might have a list of people: ```json [ { "name": "John Doe", "age": 30, "email": "john@example.com" }, { "name": "Jane Doe", "age": 25, "email": "jane@example.com" } ] ``` In this case, you can validate the data against a `list[Person]` model: ```python import pathlib from pydantic import BaseModel, EmailStr, PositiveInt, TypeAdapter class Person(BaseModel): name: str age: PositiveInt email: EmailStr person_list_adapter = TypeAdapter(list[Person]) # (1)! json_string = pathlib.Path('people.json').read_text() people = person_list_adapter.validate_json(json_string) print(people) #> [Person(name='John Doe', age=30, email='john@example.com'), Person(name='Jane Doe', age=25, email='jane@example.com')] ``` 1. We use TypeAdapter to validate a list of `Person` objects. TypeAdapter is a Pydantic construct used to validate data against a single type. ## JSON lines files Similar to validating a list of objects from a `.json` file, you can validate a list of objects from a `.jsonl` file. `.jsonl` files are a sequence of JSON objects separated by newlines. Consider the following `.jsonl` file: ```json {"name": "John Doe", "age": 30, "email": "john@example.com"} {"name": "Jane Doe", "age": 25, "email": "jane@example.com"} ``` We can validate this data with a similar approach to the one we used for `.json` files: ```python import pathlib from pydantic import BaseModel, EmailStr, PositiveInt class Person(BaseModel): name: str age: PositiveInt email: EmailStr json_lines = pathlib.Path('people.jsonl').read_text().splitlines() people = [Person.model_validate_json(line) for line in json_lines] print(people) #> [Person(name='John Doe', age=30, email='john@example.com'), Person(name='Jane Doe', age=25, email='jane@example.com')] ``` ## CSV files CSV is one of the most common file formats for storing tabular data. To validate data from a CSV file, you can use the `csv` module from the Python standard library to load the data and validate it against a Pydantic model. 
Consider the following CSV file: ```text name,age,email John Doe,30,john@example.com Jane Doe,25,jane@example.com ``` Here's how we validate that data: ```python import csv from pydantic import BaseModel, EmailStr, PositiveInt class Person(BaseModel): name: str age: PositiveInt email: EmailStr with open('people.csv') as f: reader = csv.DictReader(f) people = [Person.model_validate(row) for row in reader] print(people) #> [Person(name='John Doe', age=30, email='john@example.com'), Person(name='Jane Doe', age=25, email='jane@example.com')] ``` ## TOML files TOML files are often used for configuration due to their simplicity and readability. Consider the following TOML file: ```toml name = "John Doe" age = 30 email = "john@example.com" ``` Here's how we validate that data, using the standard library's `tomllib` module (available since Python 3.11): ```python import tomllib from pydantic import BaseModel, EmailStr, PositiveInt class Person(BaseModel): name: str age: PositiveInt email: EmailStr with open('person.toml', 'rb') as f: data = tomllib.load(f) person = Person.model_validate(data) print(repr(person)) #> Person(name='John Doe', age=30, email='john@example.com') ``` Pydantic serves as a great tool for defining models for ORM (object relational mapping) libraries. ORMs are used to map objects to database tables, and vice versa. ## SQLAlchemy Pydantic can pair with SQLAlchemy, as it can be used to define the schema of the database models. Code Duplication If you use Pydantic with SQLAlchemy, you might experience some frustration with code duplication. If you find yourself experiencing this difficulty, you might also consider [`SQLModel`](https://sqlmodel.tiangolo.com/), which integrates Pydantic with SQLAlchemy such that much of the code duplication is eliminated. If you'd prefer to use pure Pydantic with SQLAlchemy, we recommend using Pydantic models alongside SQLAlchemy models as shown in the example below. In this case, we take advantage of Pydantic's aliases feature to name a `Column` after a reserved SQLAlchemy field, thus avoiding conflicts. ```python import sqlalchemy as sa from sqlalchemy.orm import declarative_base from pydantic import BaseModel, ConfigDict, Field class MyModel(BaseModel): model_config = ConfigDict(from_attributes=True) metadata: dict[str, str] = Field(alias='metadata_') Base = declarative_base() class MyTableModel(Base): __tablename__ = 'my_table' id = sa.Column('id', sa.Integer, primary_key=True) # 'metadata' is reserved by SQLAlchemy, hence the '_' metadata_ = sa.Column('metadata', sa.JSON) sql_model = MyTableModel(metadata_={'key': 'val'}, id=1) pydantic_model = MyModel.model_validate(sql_model) print(pydantic_model.model_dump()) #> {'metadata': {'key': 'val'}} print(pydantic_model.model_dump(by_alias=True)) #> {'metadata_': {'key': 'val'}} ``` Note The example above works because aliases have priority over field names for field population. Accessing `SQLModel`'s `metadata` attribute would lead to a `ValidationError`. Pydantic is quite helpful for validating data that goes into and comes out of queues. Below, we'll explore how to validate / serialize data with various queue systems. ## Redis queue Redis is a popular in-memory data structure store. In order to run this example locally, you'll first need to [install Redis](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/) and start the server. Here's a simple example of how you can use Pydantic to: 1. Serialize data to push to the queue 1.
Deserialize and validate data when it's popped from the queue ```python import redis from pydantic import BaseModel, EmailStr class User(BaseModel): id: int name: str email: EmailStr r = redis.Redis(host='localhost', port=6379, db=0) QUEUE_NAME = 'user_queue' def push_to_queue(user_data: User) -> None: serialized_data = user_data.model_dump_json() r.rpush(QUEUE_NAME, serialized_data) print(f'Added to queue: {serialized_data}') user1 = User(id=1, name='John Doe', email='john@example.com') user2 = User(id=2, name='Jane Doe', email='jane@example.com') push_to_queue(user1) #> Added to queue: {"id":1,"name":"John Doe","email":"john@example.com"} push_to_queue(user2) #> Added to queue: {"id":2,"name":"Jane Doe","email":"jane@example.com"} def pop_from_queue() -> None: data = r.lpop(QUEUE_NAME) if data: user = User.model_validate_json(data) print(f'Validated user: {repr(user)}') else: print('Queue is empty') pop_from_queue() #> Validated user: User(id=1, name='John Doe', email='john@example.com') pop_from_queue() #> Validated user: User(id=2, name='Jane Doe', email='jane@example.com') pop_from_queue() #> Queue is empty ``` Pydantic models are a great way to validate and serialize data for requests and responses. Pydantic is instrumental in many web frameworks and libraries, such as FastAPI, Django, Flask, and HTTPX. ## `httpx` requests [`httpx`](https://www.python-httpx.org/) is an HTTP client for Python 3 with synchronous and asynchronous APIs. In the below example, we query the [JSONPlaceholder API](https://jsonplaceholder.typicode.com/) to get a user's data and validate it with a Pydantic model. ```python import httpx from pydantic import BaseModel, EmailStr class User(BaseModel): id: int name: str email: EmailStr url = 'https://jsonplaceholder.typicode.com/users/1' response = httpx.get(url) response.raise_for_status() user = User.model_validate(response.json()) print(repr(user)) #> User(id=1, name='Leanne Graham', email='Sincere@april.biz') ``` The TypeAdapter tool from Pydantic often comes in quite handy when working with HTTP requests. Consider a similar example where we are validating a list of users: ```python from pprint import pprint import httpx from pydantic import BaseModel, EmailStr, TypeAdapter class User(BaseModel): id: int name: str email: EmailStr url = 'https://jsonplaceholder.typicode.com/users/' # (1)! response = httpx.get(url) response.raise_for_status() users_list_adapter = TypeAdapter(list[User]) users = users_list_adapter.validate_python(response.json()) pprint([u.name for u in users]) """ ['Leanne Graham', 'Ervin Howell', 'Clementine Bauch', 'Patricia Lebsack', 'Chelsey Dietrich', 'Mrs. Dennis Schulist', 'Kurtis Weissnat', 'Nicholas Runolfsdottir V', 'Glenna Reichert', 'Clementina DuBuque'] """ ``` 1. Note, we're querying the `/users/` endpoint here to get a list of users. `pydantic` integrates well with AWS Lambda functions. In this guide, we'll discuss how to set up `pydantic` for an AWS Lambda function. ## Installing Python libraries for AWS Lambda functions There are many ways to utilize Python libraries in AWS Lambda functions.
As outlined in the [AWS Lambda documentation](https://docs.aws.amazon.com/lambda/latest/dg/lambda-python.html), the most common approaches include: - Using a [`.zip` file archive](https://docs.aws.amazon.com/lambda/latest/dg/python-package.html) to package your code and dependencies - Using [AWS Lambda Layers](https://docs.aws.amazon.com/lambda/latest/dg/python-layers.html) to share libraries across multiple functions - Using a [container image](https://docs.aws.amazon.com/lambda/latest/dg/python-image.html) to package your code and dependencies All of these approaches can be used with `pydantic`. The best approach for you will depend on your specific requirements and constraints. We'll cover the first two cases more in-depth here, as dependency management with a container image is more straightforward. If you're using a container image, you might find [this comment](https://github.com/pydantic/pydantic/issues/6557#issuecomment-1699456562) helpful for installing `pydantic`. Tip If you use `pydantic` across multiple functions, you may want to consider AWS Lambda Layers, which support seamless sharing of libraries across multiple functions. Regardless of the dependency management approach you choose, it's beneficial to adhere to the guidelines below to ensure a smooth dependency management process. ## Installing `pydantic` for AWS Lambda functions When you're building your `.zip` file archive with your code and dependencies or organizing your `.zip` file for a Lambda Layer, you'll likely use a local virtual environment to install and manage your dependencies. This can be a bit tricky if you're using `pip` because `pip` installs wheels compiled for your local platform, which may not be compatible with the Lambda environment. Thus, we suggest you use a command similar to the following: ```bash pip install \ --platform manylinux2014_x86_64 \ # (1)! --target= \ # (2)! --implementation cp \ # (3)! --python-version 3.10 \ # (4)! --only-binary=:all: \ # (5)! --upgrade pydantic # (6)! ``` 1. Use the platform corresponding to your Lambda runtime. 1. Specify the directory where you want to install the package (often `python` for Lambda Layers). 1. Use the CPython implementation. 1. The Python version must be compatible with the Lambda runtime. 1. This flag ensures that the package is installed using pre-built binary wheels. 1. The latest version of `pydantic` will be installed. ## Troubleshooting ### `no module named 'pydantic_core._pydantic_core'` The ```text no module named `pydantic_core._pydantic_core` ``` error is a common issue that indicates you have installed `pydantic` incorrectly. To debug this issue, you can try the following steps (before the failing import): 1. Check the contents of the installed `pydantic-core` package. Are the compiled library and its type stubs both present? ```python from importlib.metadata import files print([file for file in files('pydantic-core') if file.name.startswith('_pydantic_core')]) """ [PackagePath('pydantic_core/_pydantic_core.pyi'), PackagePath('pydantic_core/_pydantic_core.cpython-312-x86_64-linux-gnu.so')] """ ``` You should expect to see two files like those printed above. The compiled library file will be a `.so` or `.pyd` file with a name that varies according to the OS and Python version. 1. Check that your lambda's Python version is compatible with the compiled library version found above.
```python
import sysconfig

print(sysconfig.get_config_var("EXT_SUFFIX"))
#> '.cpython-312-x86_64-linux-gnu.so'
```

You should expect to see the same suffix here as the compiled library; for example, here the suffix `.cpython-312-x86_64-linux-gnu.so` indeed matches `_pydantic_core.cpython-312-x86_64-linux-gnu.so`.

If these two checks do not match, your build steps have not installed the correct native code for your lambda's target platform. You should adjust your build steps to change which version of the library gets installed.

Most likely errors:

- Your OS or CPU architecture is mismatched (e.g. darwin vs x86_64-linux-gnu). Try passing the correct `--platform` argument to `pip install` when installing your lambda dependencies, or build inside a Linux Docker container for the correct platform. Possible platforms at the moment include `--platform manylinux2014_x86_64` or `--platform manylinux2014_aarch64`, but these may change with a future Pydantic major release.
- Your Python version is mismatched (e.g. `cpython-310` vs `cpython-312`). Try passing the correct `--python-version` argument to `pip install`, or otherwise change the Python version used in your build.

### No package metadata was found for `email-validator`

Pydantic uses `version` from `importlib.metadata` to [check what version](https://github.com/pydantic/pydantic/pull/6033) of `email-validator` is installed. This package versioning mechanism is somewhat incompatible with AWS Lambda, even though it's the industry standard for versioning packages in Python. There are a few ways to fix this issue:

If you're deploying your lambda with the serverless framework, it's likely that the appropriate metadata for the `email-validator` package is not being included in your deployment package. Tools like [`serverless-python-requirements`](https://github.com/serverless/serverless-python-requirements/tree/master) remove metadata to reduce package size. You can fix this issue by setting the `slim` setting to `false` in your `serverless.yml` file:

```yaml
pythonRequirements:
  dockerizePip: non-linux
  slim: false
  fileName: requirements.txt
```

You can read more about this fix, and other `slim` settings that might be relevant, [here](https://biercoff.com/how-to-fix-package-not-found-error-importlib-metadata/).

If you're using a `.zip` archive for your code and/or dependencies, make sure that your package contains the required version metadata. To do this, make sure you include the `dist-info` directory in your `.zip` archive for the `email-validator` package.

This issue has been reported for other popular Python libraries like [`jsonschema`](https://github.com/python-jsonschema/jsonschema/issues/584), so you can read more about the issue and potential fixes there as well.

## Extra Resources

### More Debugging Tips

If you're still struggling with installing `pydantic` for your AWS Lambda, you might consult [this issue](https://github.com/pydantic/pydantic/issues/6557), which covers a variety of problems and solutions encountered by other developers.

### Validating `event` and `context` data

Check out our [blog post](https://pydantic.dev/articles/lambda-intro) to learn more about how to use `pydantic` to validate `event` and `context` data in AWS Lambda functions.
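The core idea is simply to validate the incoming `event` dictionary against a model at the top of your handler. Here's a minimal, hand-rolled sketch of that pattern; the event shape and field names are hypothetical, so adjust them to match your trigger:

```python
from pydantic import BaseModel, ValidationError


class OrderEvent(BaseModel):  # hypothetical event payload for illustration
    order_id: int
    customer_email: str


def handler(event: dict, context: object) -> dict:
    try:
        order = OrderEvent.model_validate(event)
    except ValidationError as exc:
        # Reject malformed events early, returning structured error details.
        return {'statusCode': 400, 'body': exc.json()}
    return {'statusCode': 200, 'body': order.model_dump_json()}
```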
# Code Generation with datamodel-code-generator

The [datamodel-code-generator](https://github.com/koxudaxi/datamodel-code-generator/) project is a library and command-line utility to generate pydantic models from just about any data source, including:

- OpenAPI 3 (YAML/JSON)
- JSON Schema
- JSON/YAML/CSV Data (which will be converted to JSON Schema)
- Python dictionary (which will be converted to JSON Schema)
- GraphQL schema

Whenever you find yourself with any data convertible to JSON but without pydantic models, this tool will allow you to generate type-safe model hierarchies on demand.

## Installation

```bash
pip install datamodel-code-generator
```

## Example

In this case, datamodel-code-generator creates pydantic models from a JSON Schema file.

```bash
datamodel-codegen --input person.json --input-file-type jsonschema --output model.py
```

person.json:

```json
{
  "$id": "person.json",
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "Person",
  "type": "object",
  "properties": {
    "first_name": {
      "type": "string",
      "description": "The person's first name."
    },
    "last_name": {
      "type": "string",
      "description": "The person's last name."
    },
    "age": {
      "description": "Age in years.",
      "type": "integer",
      "minimum": 0
    },
    "pets": {
      "type": "array",
      "items": [
        {
          "$ref": "#/definitions/Pet"
        }
      ]
    },
    "comment": {
      "type": "null"
    }
  },
  "required": ["first_name", "last_name"],
  "definitions": {
    "Pet": {
      "properties": {
        "name": {
          "type": "string"
        },
        "age": {
          "type": "integer"
        }
      }
    }
  }
}
```

model.py:

```python
# generated by datamodel-codegen:
#   filename:  person.json
#   timestamp: 2020-05-19T15:07:31+00:00
from __future__ import annotations

from typing import Any

from pydantic import BaseModel, Field, conint


class Pet(BaseModel):
    name: str | None = None
    age: int | None = None


class Person(BaseModel):
    first_name: str = Field(description="The person's first name.")
    last_name: str = Field(description="The person's last name.")
    age: conint(ge=0) | None = Field(None, description='Age in years.')
    pets: list[Pet] | None = None
    comment: Any | None = None
```

More information can be found in the [official documentation](https://koxudaxi.github.io/datamodel-code-generator/).

Note

**Admission:** I (the primary developer of Pydantic) also develop python-devtools.

[python-devtools](https://python-devtools.helpmanual.io/) (`pip install devtools`) provides a number of tools which are useful during Python development, including `debug()`, an alternative to `print()` that formats output in a way that should be easier to read than `print`, while also showing which file/line the call was made from and what value was printed.

Pydantic integrates with *devtools* by implementing the `__pretty__` method on most public classes.
In particular `debug()` is useful when inspecting models:

```python
from datetime import datetime

from devtools import debug

from pydantic import BaseModel


class Address(BaseModel):
    street: str
    country: str
    lat: float
    lng: float


class User(BaseModel):
    id: int
    name: str
    signup_ts: datetime
    friends: list[int]
    address: Address


user = User(
    id='123',
    name='John Doe',
    signup_ts='2019-06-01 12:22',
    friends=[1234, 4567, 7890],
    address=dict(street='Testing', country='uk', lat=51.5, lng=0),
)
debug(user)
print('\nshould be much easier to read than:\n')
print('user:', user)
```

Will output in your terminal:

```
devtools_example.py:30 user: User(
    id=123,
    name='John Doe',
    signup_ts=datetime.datetime(2019, 6, 1, 12, 22),
    friends=[
        1234,
        4567,
        7890,
    ],
    address=Address(
        street='Testing',
        country='uk',
        lat=51.5,
        lng=0.0,
    ),
) (User)

should be much easier to read than:

user: id=123 name='John Doe' signup_ts=datetime.datetime(2019, 6, 1, 12, 22) friends=[1234, 4567, 7890] address=Address(street='Testing', country='uk', lat=51.5, lng=0.0)
```

Note

`python-devtools` doesn't yet support Python 3.13.

Pydantic uses [MkDocs](https://www.mkdocs.org/) for documentation, together with [mkdocstrings](https://mkdocstrings.github.io/). As such, you can make use of Pydantic's Sphinx object inventory to cross-reference the Pydantic API documentation.

In your [Sphinx configuration](https://www.sphinx-doc.org/en/master/usage/configuration.html), add the following to the [`intersphinx` extension configuration](https://www.sphinx-doc.org/en/master/usage/extensions/intersphinx.html#configuration):

```python
intersphinx_mapping = {
    'pydantic': ('https://docs.pydantic.dev/latest', None),  # (1)!
}
```

1. You can also use `dev` instead of `latest` to target the latest documentation build, up to date with the [`main`](https://github.com/pydantic/pydantic/tree/main) branch.

In your [MkDocs configuration](https://www.mkdocs.org/user-guide/configuration/), add the following import to your [mkdocstrings plugin configuration](https://mkdocstrings.github.io/usage/#cross-references-to-other-projects-inventories):

```yaml
plugins:
- mkdocstrings:
    handlers:
      python:
        import:
        - https://docs.pydantic.dev/latest/objects.inv # (1)!
```

1. You can also use `dev` instead of `latest` to target the latest documentation build, up to date with the [`main`](https://github.com/pydantic/pydantic/tree/main) branch.

[Hypothesis](https://hypothesis.readthedocs.io/) is the Python library for [property-based testing](https://increment.com/testing/in-praise-of-property-based-testing/). Hypothesis can infer how to construct type-annotated classes, and supports builtin types, many standard library types, and generic types from the [`typing`](https://docs.python.org/3/library/typing.html) and [`typing_extensions`](https://pypi.org/project/typing-extensions/) modules by default.

Pydantic v2.0 drops built-in support for Hypothesis and no longer ships with the integrated Hypothesis plugin.

Warning

We are temporarily removing the Hypothesis plugin in favor of studying a different mechanism. For more information, see the issue [annotated-types/annotated-types#37](https://github.com/annotated-types/annotated-types/issues/37).

The Hypothesis plugin may be back in a future release. Subscribe to [pydantic/pydantic#4682](https://github.com/pydantic/pydantic/issues/4682) for updates.
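Until then, you can still drive Pydantic models from Hypothesis by passing explicit strategies rather than relying on plugin-based inference. A minimal sketch (the model and the test are illustrative, not part of any official integration):

```python
from hypothesis import given, strategies as st

from pydantic import BaseModel


class Point(BaseModel):
    x: int
    y: int


# st.builds calls Point(x=..., y=...) with values drawn from the given
# strategies, so no inference for BaseModel subclasses is required.
@given(st.builds(Point, x=st.integers(), y=st.integers()))
def test_point_json_roundtrip(point: Point) -> None:
    assert Point.model_validate_json(point.model_dump_json()) == point
```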
## Flake8 plugin

If using Flake8 in your project, a [plugin](https://pypi.org/project/flake8-pydantic/) is available and can be installed using the following:

```bash
pip install flake8-pydantic
```

The lint errors provided by this plugin are namespaced under the `PYD` prefix (e.g. `PYD001`). To ignore some unwanted rules, the Flake8 configuration can be adapted:

```ini
[flake8]
extend-ignore = PYD001,PYD002
```

The Pydantic documentation is available in the [llms.txt](https://llmstxt.org/) format. This format is defined in Markdown and suited for large language models.

Two formats are available:

- [llms.txt](https://docs.pydantic.dev/latest/llms.txt): a file containing a brief description of the project, along with links to the different sections of the documentation. The structure of this file is described in detail [here](https://llmstxt.org/#format).
- [llms-full.txt](https://docs.pydantic.dev/latest/llms-full.txt): similar to the `llms.txt` file, but with the content of every link included. Note that this file may be too large for some LLMs.

As of today, these files *cannot* be natively leveraged by LLM frameworks or IDEs. Alternatively, an [MCP server](https://modelcontextprotocol.io/) can be implemented to properly parse the `llms.txt` file.

Pydantic integrates seamlessly with **Pydantic Logfire**, an observability platform built by us on the same belief as our open source library: that the most powerful tools can be easy to use.

## Getting Started

Logfire has an out-of-the-box Pydantic integration that lets you understand the data passing through your Pydantic models and get analytics on validations. For existing Pydantic users, it delivers unparalleled insights into your usage of Pydantic models.

[Getting started](https://logfire.pydantic.dev/docs/) with Logfire can be done in three simple steps:

1. Set up your Logfire account.
1. Install the Logfire SDK.
1. Instrument your project.

### Basic Usage

Once you've got Logfire set up, you can start using it to monitor your Pydantic models and get insights into your data validation:

```python
from datetime import date

import logfire

from pydantic import BaseModel

logfire.configure()  # (1)!


class User(BaseModel):
    name: str
    country_code: str
    dob: date


user = User(name='Anne', country_code='USA', dob='2000-01-01')
logfire.info('user processed: {user!r}', user=user)  # (2)!
```

1. The `logfire.configure()` call is all you need to instrument your project with Logfire.
1. The `logfire.info()` call logs the `user` object to Logfire, with builtin support for Pydantic models.

### Pydantic Instrumentation

You can even record information about the validation process automatically by using the builtin [Pydantic integration](https://logfire.pydantic.dev/docs/why-logfire/pydantic/):

```python
from datetime import date

import logfire

from pydantic import BaseModel

logfire.configure()
logfire.instrument_pydantic()  # (1)!


class User(BaseModel):
    name: str
    country_code: str
    dob: date


User(name='Anne', country_code='USA', dob='2000-01-01')
User(name='David', country_code='GBR', dob='invalid-dob')
```

1. The `logfire.instrument_pydantic()` call automatically logs validation information for all Pydantic models in your project.

You'll see each successful and failed validation logged in Logfire, and you can investigate the corresponding spans to get validation details.

Pydantic works well with [mypy](http://mypy-lang.org) right out of the box.
However, Pydantic also ships with a mypy plugin that adds a number of important Pydantic-specific features that improve its ability to type-check your code.

For example, consider the following script:

```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel


class Model(BaseModel):
    age: int
    first_name = 'John'
    last_name: Optional[str] = None
    signup_ts: Optional[datetime] = None
    list_of_ints: list[int]


m = Model(age=42, list_of_ints=[1, '2', b'3'])
print(m.middle_name)  # not a model field!
Model()  # will raise a validation error for age and list_of_ints
```

```python
from datetime import datetime

from pydantic import BaseModel


class Model(BaseModel):
    age: int
    first_name = 'John'
    last_name: str | None = None
    signup_ts: datetime | None = None
    list_of_ints: list[int]


m = Model(age=42, list_of_ints=[1, '2', b'3'])
print(m.middle_name)  # not a model field!
Model()  # will raise a validation error for age and list_of_ints
```

Without any special configuration, mypy does not catch the [missing model field annotation](../../errors/usage_errors/#model-field-missing-annotation), and it complains about the `list_of_ints` argument even though Pydantic parses it correctly:

```text
15: error: List item 1 has incompatible type "str"; expected "int" [list-item]
15: error: List item 2 has incompatible type "bytes"; expected "int" [list-item]
16: error: "Model" has no attribute "middle_name" [attr-defined]
17: error: Missing named argument "age" for "Model" [call-arg]
17: error: Missing named argument "list_of_ints" for "Model" [call-arg]
```

But [with the plugin enabled](#enabling-the-plugin), it gives the correct errors:

```text
9: error: Untyped fields disallowed [pydantic-field]
16: error: "Model" has no attribute "middle_name" [attr-defined]
17: error: Missing named argument "age" for "Model" [call-arg]
17: error: Missing named argument "list_of_ints" for "Model" [call-arg]
```

With the pydantic mypy plugin, you can fearlessly refactor your models knowing mypy will catch any mistakes if your field names or types change.

Note that mypy already supports some features without using the Pydantic plugin, such as synthesizing a `__init__` method for Pydantic models and dataclasses. See the [mypy plugin capabilities](#mypy-plugin-capabilities) for a list of additional features.

## Enabling the Plugin

To enable the plugin, just add `pydantic.mypy` to the list of plugins in your [mypy config file](https://mypy.readthedocs.io/en/latest/config_file.html):

```ini
[mypy]
plugins = pydantic.mypy
```

```toml
[tool.mypy]
plugins = ['pydantic.mypy']
```

Note

If you're using `pydantic.v1` models, you'll need to add `pydantic.v1.mypy` to your list of plugins. See the [plugin configuration](#configuring-the-plugin) for more details.

## Supported mypy versions

Pydantic supports mypy versions released less than six months ago. Older versions may still work with the plugin but won't be tested. The list of released mypy versions can be found [here](https://mypy-lang.org/news.html). Note that the version support policy is subject to change at the discretion of contributors.

## Mypy plugin capabilities

### Generate a `__init__` signature for Pydantic models

- Any required fields that don't have dynamically-determined aliases will be included as required keyword arguments.
- If the validate_by_name model configuration value is set to `True`, the generated signature will use the field names rather than aliases.
- The [`init_forbid_extra`](#init_forbid_extra) and [`init_typed`](#init_typed) plugin configuration values can further fine-tune the synthesized `__init__` method.

### Generate a typed signature for `model_construct`

- The model_construct method is an alternative to model validation when input data is known to be valid and should not be parsed (see the [documentation](../../concepts/models/#creating-models-without-validation)). Because this method performs no runtime validation, static checking is important to detect errors.

### Support for frozen models

- If the frozen configuration is set to `True`, you will get an error if you try to mutate a model field (see [faux immutability](../../concepts/models/#faux-immutability)).

### Respect the type of the `Field`'s `default` and `default_factory`

- A field with both a `default` and a `default_factory` will result in an error during static checking.
- The type of the `default` and `default_factory` value must be compatible with that of the field.

### Warn about the use of untyped fields

- While defining a field without an annotation will result in a [runtime error](../../errors/usage_errors/#model-field-missing-annotation), the plugin will also emit a type checking error.

### Prevent the use of required dynamic aliases

See the documentation of the [`warn_required_dynamic_aliases`](#warn_required_dynamic_aliases) plugin configuration value.

## Configuring the Plugin

To change the values of the plugin settings, create a section in your mypy config file called `[pydantic-mypy]`, and add any key-value pairs for settings you want to override.

A configuration file with all plugin strictness flags enabled (and some other mypy strictness flags, too) might look like:

```ini
[mypy]
plugins = pydantic.mypy

follow_imports = silent
warn_redundant_casts = True
warn_unused_ignores = True
disallow_any_generics = True
no_implicit_reexport = True
disallow_untyped_defs = True

[pydantic-mypy]
init_forbid_extra = True
init_typed = True
warn_required_dynamic_aliases = True
```

```toml
[tool.mypy]
plugins = ["pydantic.mypy"]

follow_imports = "silent"
warn_redundant_casts = true
warn_unused_ignores = true
disallow_any_generics = true
no_implicit_reexport = true
disallow_untyped_defs = true

[tool.pydantic-mypy]
init_forbid_extra = true
init_typed = true
warn_required_dynamic_aliases = true
```

### `init_typed`

Because Pydantic performs [data conversion](../../concepts/models/#data-conversion) by default, the following is still valid at runtime:

```python
class Model(BaseModel):
    a: int


Model(a='1')
```

For this reason, the plugin will use `Any` for field annotations when synthesizing the `__init__` method, unless `init_typed` is set or [strict mode](../../concepts/strict_mode/) is enabled on the model.

### `init_forbid_extra`

By default, Pydantic allows (and ignores) any extra provided argument:

```python
class Model(BaseModel):
    a: int = 1


Model(unrelated=2)
```

For this reason, the plugin will add an extra `**kwargs: Any` parameter when synthesizing the `__init__` method, unless `init_forbid_extra` is set or the model's `extra` configuration is set to `'forbid'`.

### `warn_required_dynamic_aliases`

Whether to error when using a dynamically-determined alias or alias generator on a model with `validate_by_name` set to `False`. If such aliases are present, mypy cannot properly type check calls to `__init__`. In this case, it will default to treating all arguments as not required.
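As a sketch of the kind of model this flag targets (`alias_from_config` is a made-up helper for illustration):

```python
from pydantic import BaseModel, ConfigDict, Field


def alias_from_config(name: str) -> str:  # hypothetical helper
    return name.upper()


class Model(BaseModel):
    model_config = ConfigDict(validate_by_name=False)

    # The alias is only known at runtime, so mypy cannot know which
    # keyword argument __init__ will accept; with this flag enabled,
    # the plugin reports an error here instead of silently treating
    # all arguments as not required.
    x: int = Field(alias=alias_from_config('x'))
```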
Note

Compatibility with `Any` being disallowed

Some mypy configuration options (such as [`disallow_any_explicit`](https://mypy.readthedocs.io/en/stable/config_file.html#confval-disallow_any_explicit)) will error because the synthesized `__init__` method contains `Any` annotations. To circumvent the issue, you will have to enable both `init_forbid_extra` and `init_typed`.

While pydantic will work well with any IDE out of the box, a [PyCharm plugin](https://plugins.jetbrains.com/plugin/12861-pydantic) offering improved pydantic integration is available on the JetBrains Plugins Repository for PyCharm. You can install the plugin for free from the plugin marketplace (PyCharm's Preferences -> Plugins -> Marketplace -> search "pydantic").

The plugin currently supports the following features:

- For `pydantic.BaseModel.__init__`:
  - Inspection
  - Autocompletion
  - Type-checking
- For fields of `pydantic.BaseModel`:
  - Refactor-renaming fields updates `__init__` calls, and affects sub- and super-classes
  - Refactor-renaming `__init__` keyword arguments updates field names, and affects sub- and super-classes

More information can be found on the [official plugin page](https://plugins.jetbrains.com/plugin/12861-pydantic) and [GitHub repository](https://github.com/koxudaxi/pydantic-pycharm-plugin).

Pydantic models may be printed with the [Rich](https://github.com/willmcgugan/rich) library, which will add additional formatting and color to the output. See the Rich documentation on [pretty printing](https://rich.readthedocs.io/en/latest/pretty.html) for more information.

Pydantic works well with any editor or IDE out of the box because it's made on top of standard Python type annotations. When using [Visual Studio Code (VS Code)](https://code.visualstudio.com/), there are some **additional editor features** supported, comparable to the ones provided by the [PyCharm plugin](../pycharm/).

This means that you will have **autocompletion** (or "IntelliSense") and **error checks** for types and required arguments even while creating new Pydantic model instances.

## Configure VS Code

To take advantage of these features, you need to make sure you configure VS Code correctly, using the recommended settings. In case you have a different configuration, here's a short overview of the steps.

### Install Pylance

You should use the [Pylance](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance) extension for VS Code. It is the recommended, next-generation, official VS Code plug-in for Python.

Pylance is installed as part of the [Python Extension for VS Code](https://marketplace.visualstudio.com/items?itemName=ms-python.python) by default, so it should probably just work. Otherwise, you can double-check that it's installed and enabled in your editor.

### Configure your environment

Then you need to make sure your editor knows the [Python environment](https://code.visualstudio.com/docs/python/python-tutorial#_install-and-use-packages) (probably a virtual environment) for your Python project. This would be the environment where you installed Pydantic.

### Configure Pylance

With the default configurations, you will get support for autocompletion, but Pylance might not check for type errors.
You can enable type error checks from Pylance with these steps:

- Open the "User Settings"
- Search for `Type Checking Mode`
- You will find an option under `Python › Analysis: Type Checking Mode`
- Set it to `basic` or `strict` (by default it's `off`)

Now you will not only get autocompletion when creating new Pydantic model instances, but also error checks for **required arguments** and **invalid data types**.

Technical Details

Pylance is the VS Code extension; it's closed source, but free to use. Underneath, Pylance uses an open source tool (also from Microsoft) called [Pyright](https://github.com/microsoft/pyright) that does all the heavy lifting.

You can read more about it in the [Pylance Frequently Asked Questions](https://github.com/microsoft/pylance-release/blob/main/FAQ.md#what-is-the-relationship-between-pylance-pyright-and-the-python-extension).

### Configure mypy

You might also want to configure mypy in VS Code to get mypy error checks inline in your editor (as an alternative or in addition to Pylance). This would include the errors detected by the [Pydantic mypy plugin](../mypy/), if you configured it.

To enable mypy in VS Code, do the following:

- Open the "User Settings"
- Search for `Mypy Enabled`
- You will find an option under `Python › Linting: Mypy Enabled`
- Check the box (by default it's unchecked)

## Tips and tricks

Here are some additional tips and tricks to improve your developer experience when using VS Code with Pydantic.

### Strict errors

The way this additional editor support works is that Pylance will treat your Pydantic models as if they were Python's pure `dataclasses`.

And it will show **strict type error checks** about the data types passed in arguments when creating a new Pydantic model instance. For example, Pylance will report that a `str` of `'23'` is not a valid `int` for the argument `age`. It would expect `age=23` instead of `age='23'`.

Nevertheless, the design, and one of the main features of Pydantic, is that it is very **lenient with data types**. It will actually accept the `str` with value `'23'` and will convert it to an `int` with value `23`.

These strict error checks are **very useful** most of the time and can help you **detect many bugs early**. But there are cases, like with `age='23'`, where they could be inconvenient by reporting a "false positive" error.

---

This example above with `age='23'` is intentionally simple, to show the error and the differences in types. But more common cases where these strict errors would be inconvenient would be when using more sophisticated data types, like `int` values for `datetime` fields, or `dict` values for Pydantic sub-models.

For example, this is valid for Pydantic:

```python
from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


class Quest(BaseModel):
    title: str
    knight: Knight


quest = Quest(
    title='To seek the Holy Grail', knight={'title': 'Sir Lancelot', 'age': 23}
)
```

The type of the field `knight` is declared with the class `Knight` (a Pydantic model) and the code is passing a literal `dict` instead. This is still valid for Pydantic, and the `dict` would be automatically converted to a `Knight` instance. Nevertheless, Pylance would detect it as a type error.

In those cases, there are several ways to disable or ignore strict errors in very specific places, while still preserving them in the rest of the code. Below are several techniques to achieve it.
#### Disable type checks in a line

You can disable the errors for a specific line using a comment of:

```python
# type: ignore
```

or (to be specific to pylance/pyright):

```python
# pyright: ignore
```

([pyright](https://github.com/microsoft/pyright) is the language server used by Pylance.)

Coming back to the example with `age='23'`, it would be:

```python
from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


lancelot = Knight(title='Sir Lancelot', age='23')  # pyright: ignore
```

That way Pylance and mypy will ignore errors in that line.

**Pros**: it's a simple change in that line to remove errors there.

**Cons**: any other error in that line will also be omitted, including type checks, misspelled arguments, required arguments not provided, etc.

#### Override the type of a variable

You can also create a variable with the value you want to use and declare its type explicitly with `Any`.

```python
from typing import Any

from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


age_str: Any = '23'

lancelot = Knight(title='Sir Lancelot', age=age_str)
```

That way Pylance and mypy will interpret the variable `age_str` as if they didn't know its type, instead of knowing it has a type of `str` when an `int` was expected (and then showing the corresponding error).

**Pros**: errors will be ignored only for a specific value, and you will still see any additional errors for the other arguments.

**Cons**: it requires importing `Any` and creating a new variable on a new line for each argument whose errors need ignoring.

#### Override the type of a value with `cast`

The same idea from the previous example can be put on the same line with the help of `cast()`. This way, the type declaration of the value is overridden inline, without requiring another variable.

```python
from typing import Any, cast

from pydantic import BaseModel


class Knight(BaseModel):
    title: str
    age: int
    color: str = 'blue'


lancelot = Knight(title='Sir Lancelot', age=cast(Any, '23'))
```

`cast(Any, '23')` doesn't affect the value; it's still just `'23'`, but now Pylance and mypy will assume it is of type `Any`, which means they will act as if they didn't know the type of the value.

So, this is the equivalent of the previous example, without the additional variable.

**Pros**: errors will be ignored only for a specific value, and you will still see any additional errors for the other arguments. There's no need for additional variables.

**Cons**: it requires importing `Any` and `cast`, and if you are not used to using `cast()`, it could seem strange at first.

### Config in class arguments

Pydantic has a rich set of Model Configurations available. These configurations can be set in the `model_config` attribute on each model:

```python
from pydantic import BaseModel


class Knight(BaseModel):
    model_config = dict(frozen=True)
    title: str
    age: int
    color: str = 'blue'
```

or passed as keyword arguments when defining the model class:

```python
from pydantic import BaseModel


class Knight(BaseModel, frozen=True):
    title: str
    age: int
    color: str = 'blue'
```

The specific configuration **`frozen`** (in beta) has a special meaning. It prevents other code from changing a model instance once it's created, keeping it **"frozen"**.

When using the second version to declare `frozen=True` (with **keyword arguments** in the class definition), Pylance can use it to help you check in your code and **detect errors** when something is trying to set values in a model that is "frozen".
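For example, a minimal sketch (the exact diagnostic wording varies by editor):

```python
from pydantic import BaseModel


class Knight(BaseModel, frozen=True):
    title: str
    age: int


lancelot = Knight(title='Sir Lancelot', age=23)

# Pylance/pyright flags this assignment because the class is frozen;
# at runtime, Pydantic would also reject it with a ValidationError.
lancelot.age = 24
```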
## Adding a default with `Field`

Pylance/pyright requires `default` to be a keyword argument to `Field` in order to infer that the field is optional.

```python
from pydantic import BaseModel, Field


class Knight(BaseModel):
    title: str = Field(default='Sir Lancelot')  # this is okay
    age: int = Field(23)  # this works fine at runtime but will cause an error for pyright


lance = Knight()  # error: Argument missing for parameter "age"
```

This is a limitation of dataclass transforms and cannot be fixed in pydantic.

## Technical Details

Warning

As a Pydantic user, you don't need the details below. Feel free to skip the rest of this section. These details are only useful for other library authors, etc.

This additional editor support works by implementing [Dataclass Transforms (PEP 681)](https://peps.python.org/pep-0681/). The standard was written by Eric Traut, from the Microsoft team, the same author of the open source package Pyright (used by Pylance to provide Python support in VS Code).

The intention of the standard is to provide a way for libraries like Pydantic and others to tell editors and tools that they (the editors) should treat these libraries (e.g. Pydantic) as if they were `dataclasses`, providing autocompletion, type checks, etc.

The standard also includes an [Alternate Form](https://github.com/microsoft/pyright/blob/master/specs/dataclass_transforms.md#alternate-form) that allowed early adopters, like Pydantic, to add support right away, even before the standard was finished and approved. The Alternate Form is already supported by Pyright, so it can be used via Pylance in VS Code.

As it is now an official standard for Python, other editors can also easily add support for it. And authors of other libraries similar to Pydantic can also easily adopt the standard right away (using the "Alternate Form") and get the benefits of these additional editor features.