Fields
API Documentation
In this section, we will go through the available mechanisms to customize Pydantic model fields: default values, JSON Schema metadata, constraints, etc.
To do so, the Field()
function is used a lot, and behaves the same way as
the standard library field()
function for dataclasses:
from pydantic import BaseModel, Field

class Model(BaseModel):
    name: str = Field(frozen=True)
Note
Even though name is assigned a value, it is still required and has no default value. If you want to emphasize the fact that a value must be provided, you can use the ellipsis:
class Model(BaseModel):
    name: str = Field(..., frozen=True)
However, its usage is discouraged as it doesn't play well with static type checkers.
The annotated pattern¶
To apply constraints or attach Field()
functions to a model field, Pydantic
supports the Annotated
typing construct to attach metadata to an annotation:
from typing_extensions import Annotated

from pydantic import BaseModel, Field, WithJsonSchema

class Model(BaseModel):
    name: Annotated[str, Field(strict=True), WithJsonSchema({'extra': 'data'})]
As far as static type checkers are concerned, name is still typed as str, but Pydantic leverages the available metadata to add validation logic, type constraints, etc.
Using this pattern has some advantages:
- Using the f: <type> = Field(...) form can be confusing and might trick users into thinking f has a default value, while in reality it is still required.
- You can provide an arbitrary amount of metadata elements for a field. As shown in the example above, the Field() function only supports a limited set of constraints/metadata, and you may have to use different Pydantic utilities such as WithJsonSchema in some cases.
- Types can be made reusable (see the documentation on custom types using this pattern, and the sketch right after this list).
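For example, a constrained type built with Annotated can be shared across fields and models (a minimal sketch; NonEmptyStr and Product are illustrative names, not part of Pydantic):

from typing_extensions import Annotated

from pydantic import BaseModel, Field

# A reusable constrained type (illustrative name):
NonEmptyStr = Annotated[str, Field(min_length=1)]

class Product(BaseModel):
    name: NonEmptyStr
    category: NonEmptyStr = 'misc'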
However, note that certain arguments to the Field() function (namely, default, default_factory, and alias) are taken into account by static type checkers to synthesize a correct __init__ method. The annotated pattern is not understood by them, so you should use the normal assignment form instead.
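For instance, with the assignment form the default is visible to type checkers, so calling Model() with no arguments is accepted (a small sketch):

from pydantic import BaseModel, Field

class Model(BaseModel):
    # `default` is picked up by static type checkers, making `name`
    # optional in the synthesized `__init__`:
    name: str = Field(default='John Doe', frozen=True)

m = Model()  # Accepted by type checkers
print(m.name)
#> John Doe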
Tip
The annotated pattern can also be used to add metadata to specific parts of the type. For instance, validation constraints can be added this way:
from typing import List

from typing_extensions import Annotated

from pydantic import BaseModel, Field

class Model(BaseModel):
    int_list: List[Annotated[int, Field(gt=0)]]
    # Valid: [1, 3]
    # Invalid: [-1, 2]
Default values¶
Default values for fields can be provided using the normal assignment syntax or by providing a value
to the default
argument:
from pydantic import BaseModel, Field

class User(BaseModel):
    # Both fields aren't required:
    name: str = 'John Doe'
    age: int = Field(default=20)
Warning
In Pydantic V1, a type annotated as Any
or wrapped by Optional
would be given an implicit default of None
even if no
default was explicitly specified. This is no longer the case in Pydantic V2.
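For example, the following field is required in Pydantic V2 even though its type is Optional (a quick illustration):

from typing import Optional

from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    value: Optional[int]  # no implicit `None` default: the field is required

try:
    Model()
except ValidationError as e:
    print(e.errors()[0]['type'])
    #> missing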
You can also pass a callable to the default_factory
argument that will be called to generate a default value:
from uuid import uuid4

from pydantic import BaseModel, Field

class User(BaseModel):
    id: str = Field(default_factory=lambda: uuid4().hex)
The default factory can also take a single required argument, in which case the already validated data will be passed as a dictionary.
from pydantic import BaseModel, EmailStr, Field

class User(BaseModel):
    email: EmailStr
    username: str = Field(default_factory=lambda data: data['email'])

user = User(email='user@example.com')
print(user.username)
#> user@example.com
The data argument will only contain the already validated data, based on the order of model fields (the above example would fail if username were to be defined before email).
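As a sketch of this pitfall (a hypothetical reordering of the model above, not a working example): with username declared first, data does not yet contain 'email' when the factory runs, so the lookup fails:

from pydantic import BaseModel, Field

class BrokenUser(BaseModel):
    # Declared before `email`, so `data` does not contain 'email' yet:
    username: str = Field(default_factory=lambda data: data['email'])
    email: str

# Constructing `BrokenUser(email='user@example.com')` fails because the
# default factory raises a `KeyError` for the missing 'email' key.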
Validate default values¶
By default, Pydantic will not validate default values. The validate_default
field parameter
(or the validate_default
configuration value) can be used
to enable this behavior:
from pydantic import BaseModel, Field, ValidationError

class User(BaseModel):
    age: int = Field(default='twelve', validate_default=True)

try:
    user = User()
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    age
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='twelve', input_type=str]
    """
Mutable default values¶
A common source of bugs in Python is to use a mutable object as a default value for a function or method argument, as the same instance ends up being reused in each call.
The dataclasses
module actually raises an error in this case, indicating that you should use
a default factory instead.
While the same thing can be done in Pydantic, it is not required. In the event that the default value is not hashable, Pydantic will create a deep copy of the default value when creating each instance of the model:
from typing import Dict, List

from pydantic import BaseModel

class Model(BaseModel):
    item_counts: List[Dict[str, int]] = [{}]

m1 = Model()
m1.item_counts[0]['a'] = 1
print(m1.item_counts)
#> [{'a': 1}]

m2 = Model()
print(m2.item_counts)
#> [{}]
Field aliases¶
Tip
Read more about aliases in the dedicated section.
For validation and serialization, you can define an alias for a field.
There are three ways to define an alias:
- Field(alias='foo')
- Field(validation_alias='foo')
- Field(serialization_alias='foo')
The alias
parameter is used for both validation and serialization. If you want to use
different aliases for validation and serialization respectively, you can use the validation_alias
and serialization_alias
parameters, which will apply only in their respective use cases.
Here is an example of using the alias
parameter:
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(alias='username')

user = User(username='johndoe')  # (1)!
print(user)
#> name='johndoe'

print(user.model_dump(by_alias=True))  # (2)!
#> {'username': 'johndoe'}
- The alias 'username' is used for instance creation and validation.
- We are using model_dump() to convert the model into a serializable format. Note that the by_alias keyword argument defaults to False, and must be specified explicitly to dump models using the field (serialization) aliases. When by_alias=True, the alias 'username' is also used during serialization.
If you want to use an alias only for validation, you can use the validation_alias
parameter:
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(validation_alias='username')

user = User(username='johndoe')  # (1)!
print(user)
#> name='johndoe'

print(user.model_dump(by_alias=True))  # (2)!
#> {'name': 'johndoe'}
- The validation alias 'username' is used during validation.
- The field name 'name' is used during serialization.
If you only want to define an alias for serialization, you can use the serialization_alias
parameter:
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(serialization_alias='username')

user = User(name='johndoe')  # (1)!
print(user)
#> name='johndoe'

print(user.model_dump(by_alias=True))  # (2)!
#> {'username': 'johndoe'}
- The field name 'name' is used for validation.
- The serialization alias 'username' is used for serialization.
Alias precedence and priority
In case you use alias
together with validation_alias
or serialization_alias
at the same time,
the validation_alias
will have priority over alias
for validation, and serialization_alias
will have priority
over alias
for serialization.
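For example (an illustrative sketch combining the three parameters; the field and alias names are made up):

from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(
        alias='Name',
        validation_alias='username',  # takes priority over `alias` for validation
        serialization_alias='user_name',  # takes priority over `alias` for serialization
    )

user = User(username='johndoe')
print(user.model_dump(by_alias=True))
#> {'user_name': 'johndoe'}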
If you provide a value for the alias_generator
model setting, you can control the order of precedence for field alias and generated aliases via the alias_priority
field parameter. You can read more about alias precedence here.
Static type checking/IDE support
If you provide a value for the alias
field parameter, static type checkers will use this alias instead
of the actual field name to synthesize the __init__
method:
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(alias='username')

user = User(username='johndoe')  # (1)!
- Accepted by type checkers.
This means that when using the populate_by_name
model
setting (which allows both the field name and alias to be used during model validation), type checkers
will error when the actual field name is used:
from pydantic import BaseModel, ConfigDict, Field

class User(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    name: str = Field(alias='username')

user = User(name='johndoe')  # (1)!
- Not accepted by type checkers.
If you still want type checkers to use the field name and not the alias, the annotated pattern can be used (which is only understood by Pydantic):
from typing_extensions import Annotated

from pydantic import BaseModel, ConfigDict, Field

class User(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    name: Annotated[str, Field(alias='username')]

user = User(name='johndoe')  # (1)!
user = User(username='johndoe')  # (2)!
- Accepted by type checkers.
- Not accepted by type checkers.
Validation Alias
Even though Pydantic treats alias and validation_alias the same when creating model instances, type checkers only understand the alias field parameter. As a workaround, you can instead specify both an alias and a serialization_alias (identical to the field name), as the serialization_alias will override the alias during serialization. For instance, you can replace:
from pydantic import BaseModel, Field

class MyModel(BaseModel):
    my_field: int = Field(validation_alias='myValidationAlias')
with:
from pydantic import BaseModel, Field

class MyModel(BaseModel):
    my_field: int = Field(
        alias='myValidationAlias',
        serialization_alias='my_field',
    )

m = MyModel(myValidationAlias=1)
print(m.model_dump(by_alias=True))
#> {'my_field': 1}
Numeric Constraints¶
There are some keyword arguments that can be used to constrain numeric values:
- gt - greater than
- lt - less than
- ge - greater than or equal to
- le - less than or equal to
- multiple_of - a multiple of the given number
- allow_inf_nan - allow 'inf', '-inf', 'nan' values
Here's an example:
from pydantic import BaseModel, Field

class Foo(BaseModel):
    positive: int = Field(gt=0)
    non_negative: int = Field(ge=0)
    negative: int = Field(lt=0)
    non_positive: int = Field(le=0)
    even: int = Field(multiple_of=2)
    love_for_pydantic: float = Field(allow_inf_nan=True)

foo = Foo(
    positive=1,
    non_negative=0,
    negative=-1,
    non_positive=0,
    even=2,
    love_for_pydantic=float('inf'),
)

print(foo)
"""
positive=1 non_negative=0 negative=-1 non_positive=0 even=2 love_for_pydantic=inf
"""
JSON Schema
In the generated JSON schema:
- gt and lt constraints will be translated to exclusiveMinimum and exclusiveMaximum.
- ge and le constraints will be translated to minimum and maximum.
- The multiple_of constraint will be translated to multipleOf.
The above snippet will generate the following JSON Schema:
{
  "title": "Foo",
  "type": "object",
  "properties": {
    "positive": {
      "title": "Positive",
      "type": "integer",
      "exclusiveMinimum": 0
    },
    "non_negative": {
      "title": "Non Negative",
      "type": "integer",
      "minimum": 0
    },
    "negative": {
      "title": "Negative",
      "type": "integer",
      "exclusiveMaximum": 0
    },
    "non_positive": {
      "title": "Non Positive",
      "type": "integer",
      "maximum": 0
    },
    "even": {
      "title": "Even",
      "type": "integer",
      "multipleOf": 2
    },
    "love_for_pydantic": {
      "title": "Love For Pydantic",
      "type": "number"
    }
  },
  "required": [
    "positive",
    "non_negative",
    "negative",
    "non_positive",
    "even",
    "love_for_pydantic"
  ]
}
See the JSON Schema Draft 2020-12 for more details.
Constraints on compound types
In case you use field constraints with compound types, an error can happen in some cases. To avoid potential issues,
you can use Annotated
:
from typing import Optional

from typing_extensions import Annotated

from pydantic import BaseModel, Field

class Foo(BaseModel):
    positive: Optional[Annotated[int, Field(gt=0)]]
    # Can error in some cases, not recommended:
    non_negative: Optional[int] = Field(ge=0)
String Constraints¶
API Documentation
There are fields that can be used to constrain strings:
- min_length: Minimum length of the string.
- max_length: Maximum length of the string.
- pattern: A regular expression that the string must match.
Here's an example:
from pydantic import BaseModel, Field

class Foo(BaseModel):
    short: str = Field(min_length=3)
    long: str = Field(max_length=10)
    regex: str = Field(pattern=r'^\d*$')  # (1)!

foo = Foo(short='foo', long='foobarbaz', regex='123')
print(foo)
#> short='foo' long='foobarbaz' regex='123'
- Only digits are allowed.
JSON Schema
In the generated JSON schema:
- min_length constraint will be translated to minLength.
- max_length constraint will be translated to maxLength.
- pattern constraint will be translated to pattern.
The above snippet will generate the following JSON Schema:
{
  "title": "Foo",
  "type": "object",
  "properties": {
    "short": {
      "title": "Short",
      "type": "string",
      "minLength": 3
    },
    "long": {
      "title": "Long",
      "type": "string",
      "maxLength": 10
    },
    "regex": {
      "title": "Regex",
      "type": "string",
      "pattern": "^\\d*$"
    }
  },
  "required": [
    "short",
    "long",
    "regex"
  ]
}
Decimal Constraints¶
There are fields that can be used to constrain decimals:
- max_digits: Maximum number of digits within the Decimal. It does not include a zero before the decimal point or trailing decimal zeroes.
- decimal_places: Maximum number of decimal places allowed. It does not include trailing decimal zeroes.
Here's an example:
from decimal import Decimal

from pydantic import BaseModel, Field

class Foo(BaseModel):
    precise: Decimal = Field(max_digits=5, decimal_places=2)

foo = Foo(precise=Decimal('123.45'))
print(foo)
#> precise=Decimal('123.45')
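Values exceeding either limit are rejected with a ValidationError; a brief sketch (the exact error message is omitted):

from decimal import Decimal

from pydantic import BaseModel, Field, ValidationError

class Foo(BaseModel):
    precise: Decimal = Field(max_digits=5, decimal_places=2)

try:
    Foo(precise=Decimal('123.456'))  # 3 decimal places exceeds `decimal_places=2`
except ValidationError:
    print('constraint violated')
    #> constraint violated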
Dataclass Constraints¶
There are fields that can be used to constrain dataclasses:
- init: Whether the field should be included in the __init__ of the dataclass.
- init_var: Whether the field should be seen as an init-only field in the dataclass.
- kw_only: Whether the field should be a keyword-only argument in the constructor of the dataclass.
Here's an example:
from pydantic import BaseModel, Field
from pydantic.dataclasses import dataclass

@dataclass
class Foo:
    bar: str
    baz: str = Field(init_var=True)
    qux: str = Field(kw_only=True)

class Model(BaseModel):
    foo: Foo

model = Model(foo=Foo('bar', baz='baz', qux='qux'))
print(model.model_dump())  # (1)!
#> {'foo': {'bar': 'bar', 'qux': 'qux'}}
- The baz field is not included in the model_dump() output, since it is an init-only field.
Field Representation¶
The parameter repr
can be used to control whether the field should be included in the string
representation of the model.
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(repr=True)  # (1)!
    age: int = Field(repr=False)

user = User(name='John', age=42)
print(user)
#> name='John'
- This is the default value.
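Note that repr only affects the string representation; a field with repr=False is still validated and serialized as usual:

from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(repr=True)
    age: int = Field(repr=False)

user = User(name='John', age=42)
print(user.model_dump())  # `age` is hidden from `repr()`, not from serialization
#> {'name': 'John', 'age': 42}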
Discriminator¶
The parameter discriminator
can be used to control the field that will be used to discriminate between different
models in a union. It takes either the name of a field or a Discriminator
instance. The Discriminator
approach can be useful when the discriminator fields aren't the same for all the models in the Union
.
The following example shows how to use discriminator
with a field name:
from typing import Literal, Union

from pydantic import BaseModel, Field

class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int

class Dog(BaseModel):
    pet_type: Literal['dog']
    age: int

class Model(BaseModel):
    pet: Union[Cat, Dog] = Field(discriminator='pet_type')

print(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}}))  # (1)!
#> pet=Cat(pet_type='cat', age=12)
- See more about Validating data in the Models page.
The following example shows how to use the discriminator
keyword argument with a Discriminator
instance:
from typing import Literal, Union

from typing_extensions import Annotated

from pydantic import BaseModel, Discriminator, Field, Tag

class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int

class Dog(BaseModel):
    pet_kind: Literal['dog']
    age: int

def pet_discriminator(v):
    if isinstance(v, dict):
        return v.get('pet_type', v.get('pet_kind'))
    return getattr(v, 'pet_type', getattr(v, 'pet_kind', None))

class Model(BaseModel):
    pet: Union[Annotated[Cat, Tag('cat')], Annotated[Dog, Tag('dog')]] = Field(
        discriminator=Discriminator(pet_discriminator)
    )

print(repr(Model.model_validate({'pet': {'pet_type': 'cat', 'age': 12}})))
#> Model(pet=Cat(pet_type='cat', age=12))

print(repr(Model.model_validate({'pet': {'pet_kind': 'dog', 'age': 12}})))
#> Model(pet=Dog(pet_kind='dog', age=12))
You can also take advantage of Annotated
to define your discriminated unions.
See the Discriminated Unions docs for more details.
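For example, the union and its discriminator can be packaged into a reusable annotated type (a sketch; Pet is an illustrative alias):

from typing import Literal, Union

from typing_extensions import Annotated

from pydantic import BaseModel, Field

class Cat(BaseModel):
    pet_type: Literal['cat']
    age: int

class Dog(BaseModel):
    pet_type: Literal['dog']
    age: int

# A reusable discriminated union type:
Pet = Annotated[Union[Cat, Dog], Field(discriminator='pet_type')]

class Model(BaseModel):
    pet: Pet

print(Model.model_validate({'pet': {'pet_type': 'dog', 'age': 3}}))
#> pet=Dog(pet_type='dog', age=3)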
Strict Mode¶
The strict
parameter on a Field
specifies whether the field should be validated in "strict mode".
In strict mode, Pydantic throws an error during validation instead of coercing data on the field where strict=True
.
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str = Field(strict=True)  # (1)!
    age: int = Field(strict=False)

user = User(name='John', age='42')  # (2)!
print(user)
#> name='John' age=42
- This is the default value.
- The age field is not validated in strict mode. Therefore, it can be assigned a string.
See Strict Mode for more details.
See Conversion Table for more details on how Pydantic converts data in both strict and lax modes.
Immutability¶
The parameter frozen
is used to emulate the frozen dataclass behaviour. It is used to prevent the field from being
assigned a new value after the model is created (immutability).
See the frozen dataclass documentation for more details.
from pydantic import BaseModel, Field, ValidationError

class User(BaseModel):
    name: str = Field(frozen=True)
    age: int

user = User(name='John', age=42)

try:
    user.name = 'Jane'  # (1)!
except ValidationError as e:
    print(e)
    """
    1 validation error for User
    name
      Field is frozen [type=frozen_field, input_value='Jane', input_type=str]
    """
- Since the name field is frozen, the assignment is not allowed.
Exclude¶
The exclude
parameter can be used to control which fields should be excluded from the
model when exporting the model.
See the following example:
from pydantic import BaseModel, Field

class User(BaseModel):
    name: str
    age: int = Field(exclude=True)

user = User(name='John', age=42)
print(user.model_dump())  # (1)!
#> {'name': 'John'}
- The age field is not included in the model_dump() output, since it is excluded.
See the Serialization section for more details.
Deprecated fields¶
The deprecated
parameter can be used to mark a field as being deprecated. Doing so will result in:
- a runtime deprecation warning emitted when accessing the field.
- "deprecated": true being set in the generated JSON schema.
You can set the deprecated
parameter as one of:
- A string, which will be used as the deprecation message.
- An instance of the warnings.deprecated decorator (or the typing_extensions backport).
- A boolean, which will be used to mark the field as deprecated with a default 'deprecated' deprecation message.
deprecated as a string¶
from typing_extensions import Annotated

from pydantic import BaseModel, Field

class Model(BaseModel):
    deprecated_field: Annotated[int, Field(deprecated='This is deprecated')]

print(Model.model_json_schema()['properties']['deprecated_field'])
#> {'deprecated': True, 'title': 'Deprecated Field', 'type': 'integer'}
deprecated via the warnings.deprecated decorator¶
Note
You can only use the deprecated
decorator in this way if you have
typing_extensions
>= 4.9.0 installed.
import importlib.metadata

from packaging.version import Version
from typing_extensions import Annotated, deprecated

from pydantic import BaseModel, Field

if Version(importlib.metadata.version('typing_extensions')) >= Version('4.9'):

    class Model(BaseModel):
        deprecated_field: Annotated[int, deprecated('This is deprecated')]

        # Or explicitly using `Field`:
        alt_form: Annotated[
            int, Field(deprecated=deprecated('This is deprecated'))
        ]
deprecated as a boolean¶
from typing_extensions import Annotated

from pydantic import BaseModel, Field

class Model(BaseModel):
    deprecated_field: Annotated[int, Field(deprecated=True)]

print(Model.model_json_schema()['properties']['deprecated_field'])
#> {'deprecated': True, 'title': 'Deprecated Field', 'type': 'integer'}
Support for category and stacklevel
The current implementation of this feature does not take into account the category
and stacklevel
arguments to the deprecated
decorator. This might land in a future version of Pydantic.
Accessing a deprecated field in validators
When accessing a deprecated field inside a validator, the deprecation warning will be emitted. You can use
catch_warnings
to explicitly ignore it:
import warnings

from typing_extensions import Self

from pydantic import BaseModel, Field, model_validator

class Model(BaseModel):
    deprecated_field: int = Field(deprecated='This is deprecated')

    @model_validator(mode='after')
    def validate_model(self) -> Self:
        with warnings.catch_warnings():
            warnings.simplefilter('ignore', DeprecationWarning)
            self.deprecated_field = self.deprecated_field * 2
        return self
Customizing JSON Schema¶
Some field parameters are used exclusively to customize the generated JSON schema. The parameters in question are:
- title
- description
- examples
- json_schema_extra
Read more about JSON schema customization / modification with fields in the Customizing JSON Schema section of the JSON schema docs.
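As a brief illustration of these parameters (a minimal sketch; the Item model and the x-internal key are made up):

from pydantic import BaseModel, Field

class Item(BaseModel):
    name: str = Field(
        title='Item name',
        description='The display name of the item.',
        examples=['hammer'],
        json_schema_extra={'x-internal': True},
    )

name_schema = Item.model_json_schema()['properties']['name']
print(name_schema['title'], '|', name_schema['x-internal'])
#> Item name | True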
The computed_field decorator¶
API Documentation
The computed_field
decorator can be used to include property
or
cached_property
attributes when serializing a model or dataclass.
The property will also be taken into account in the JSON Schema (in serialization mode).
Note
Properties can be useful for fields that are computed from other fields, or for fields that are expensive to compute (and thus are cached if using cached_property).
However, note that Pydantic will not perform any additional logic on the wrapped property (validation, cache invalidation, etc.).
Here's an example of the JSON schema (in serialization mode) generated for a model with a computed field:
from pydantic import BaseModel, computed_field

class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property  # (1)!
    def volume(self) -> float:
        return self.width * self.height * self.depth

print(Box.model_json_schema(mode='serialization'))
"""
{
    'properties': {
        'width': {'title': 'Width', 'type': 'number'},
        'height': {'title': 'Height', 'type': 'number'},
        'depth': {'title': 'Depth', 'type': 'number'},
        'volume': {'readOnly': True, 'title': 'Volume', 'type': 'number'},
    },
    'required': ['width', 'height', 'depth', 'volume'],
    'title': 'Box',
    'type': 'object',
}
"""
Here's an example using the model_dump
method with a computed field:
from pydantic import BaseModel, computed_field

class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property  # (1)!
    def volume(self) -> float:
        return self.width * self.height * self.depth

b = Box(width=1, height=2, depth=3)
print(b.model_dump())
#> {'width': 1.0, 'height': 2.0, 'depth': 3.0, 'volume': 6.0}
- If not specified, computed_field will implicitly convert the method to a property. However, it is preferable to explicitly use the @property decorator for type checking purposes.
As with regular fields, computed fields can be marked as being deprecated:
from typing_extensions import deprecated

from pydantic import BaseModel, computed_field

class Box(BaseModel):
    width: float
    height: float
    depth: float

    @computed_field
    @property
    @deprecated("'volume' is deprecated")
    def volume(self) -> float:
        return self.width * self.height * self.depth