Why use Pydantic?¶
Today, Pydantic is downloaded many times a month and used by some of the largest and most recognisable organisations in the world.
It's hard to know why so many people have adopted Pydantic since its inception six years ago, but here are a few guesses.
Type hints powering schema validation¶
The schema that Pydantic validates against is generally defined by Python type hints.
Type hints are great for this since, if you're writing modern Python, you already know how to use them. Using type hints also means that Pydantic integrates well with static typing tools (like mypy and Pyright) and IDEs (like PyCharm and VSCode).
Example - just type hints

```python
from typing import Annotated, Literal

from annotated_types import Gt

from pydantic import BaseModel


class Fruit(BaseModel):
    name: str  # (1)!
    color: Literal['red', 'green']  # (2)!
    weight: Annotated[float, Gt(0)]  # (3)!
    bazam: dict[str, list[tuple[int, bool, float]]]  # (4)!


print(
    Fruit(
        name='Apple',
        color='red',
        weight=4.2,
        bazam={'foobar': [(1, True, 0.1)]},
    )
)
#> name='Apple' color='red' weight=4.2 bazam={'foobar': [(1, True, 0.1)]}
```

1. The `name` field is simply annotated with `str` — any string is allowed.
2. The `Literal` type is used to enforce that `color` is either `'red'` or `'green'`.
3. Even when we want to apply constraints not encapsulated in Python types, we can use `Annotated` and `annotated-types` to enforce constraints while still keeping typing support.
4. "bazam" isn't really an attribute of fruit; it's here to show that arbitrarily complex types can easily be validated.
Learn more
See the documentation on supported types.
Performance¶
Pydantic's core validation logic is implemented in a separate package (`pydantic-core`), where validation for most types is implemented in Rust.
As a result, Pydantic is among the fastest data validation libraries for Python.
Performance Example - Pydantic vs. dedicated code
In general, dedicated code should be much faster than a general-purpose validator, but in this example Pydantic is more than 3x faster than dedicated code when parsing JSON and validating URLs.
```python
import json
import timeit
from urllib.parse import urlparse

import requests

from pydantic import HttpUrl, TypeAdapter

reps = 7
number = 100
r = requests.get('https://api.github.com/emojis')
r.raise_for_status()
emojis_json = r.content


def emojis_pure_python(raw_data):
    data = json.loads(raw_data)
    output = {}
    for key, value in data.items():
        assert isinstance(key, str)
        url = urlparse(value)
        assert url.scheme in ('https', 'http')
        output[key] = url


emojis_pure_python_times = timeit.repeat(
    'emojis_pure_python(emojis_json)',
    globals={
        'emojis_pure_python': emojis_pure_python,
        'emojis_json': emojis_json,
    },
    repeat=reps,
    number=number,
)
print(f'pure python: {min(emojis_pure_python_times) / number * 1000:0.2f}ms')
#> pure python: 5.32ms

type_adapter = TypeAdapter(dict[str, HttpUrl])
emojis_pydantic_times = timeit.repeat(
    'type_adapter.validate_json(emojis_json)',
    globals={
        'type_adapter': type_adapter,
        'HttpUrl': HttpUrl,
        'emojis_json': emojis_json,
    },
    repeat=reps,
    number=number,
)
print(f'pydantic: {min(emojis_pydantic_times) / number * 1000:0.2f}ms')
#> pydantic: 1.54ms

print(
    f'Pydantic {min(emojis_pure_python_times) / min(emojis_pydantic_times):0.2f}x faster'
)
#> Pydantic 3.45x faster
```
Unlike other performance-centric libraries written in compiled languages, Pydantic also has excellent support for customizing validation via functional validators.
Learn more
Samuel Colvin's talk at PyCon 2023 explains how `pydantic-core` works and how it integrates with Pydantic.
Serialization¶
Pydantic provides functionality to serialize models in three ways:

- To a Python `dict` made up of the associated Python objects.
- To a Python `dict` made up only of "jsonable" types.
- To a JSON string.

In all three modes, the output can be customized by excluding specific fields, excluding unset fields, excluding default values, and excluding `None` values.
Example - Serialization 3 ways
```python
from datetime import datetime

from pydantic import BaseModel


class Meeting(BaseModel):
    when: datetime
    where: bytes
    why: str = 'No idea'


m = Meeting(when='2020-01-01T12:00', where='home')
print(m.model_dump(exclude_unset=True))
#> {'when': datetime.datetime(2020, 1, 1, 12, 0), 'where': b'home'}
print(m.model_dump(exclude={'where'}, mode='json'))
#> {'when': '2020-01-01T12:00:00', 'why': 'No idea'}
print(m.model_dump_json(exclude_defaults=True))
#> {"when":"2020-01-01T12:00:00","where":"home"}
```
Learn more
See the documentation on serialization.
JSON Schema¶
A JSON Schema can be generated for any Pydantic schema — allowing self-documenting APIs and integration with a wide variety of tools which support the JSON Schema format.
Example - JSON Schema
```python
from datetime import datetime

from pydantic import BaseModel


class Address(BaseModel):
    street: str
    city: str
    zipcode: str


class Meeting(BaseModel):
    when: datetime
    where: Address
    why: str = 'No idea'


print(Meeting.model_json_schema())
"""
{
    '$defs': {
        'Address': {
            'properties': {
                'street': {'title': 'Street', 'type': 'string'},
                'city': {'title': 'City', 'type': 'string'},
                'zipcode': {'title': 'Zipcode', 'type': 'string'},
            },
            'required': ['street', 'city', 'zipcode'],
            'title': 'Address',
            'type': 'object',
        }
    },
    'properties': {
        'when': {'format': 'date-time', 'title': 'When', 'type': 'string'},
        'where': {'$ref': '#/$defs/Address'},
        'why': {'default': 'No idea', 'title': 'Why', 'type': 'string'},
    },
    'required': ['when', 'where'],
    'title': 'Meeting',
    'type': 'object',
}
"""
```
Pydantic is compliant with the latest version of the JSON Schema specification (2020-12), which is compatible with OpenAPI 3.1.
Learn more
See the documentation on JSON Schema.
Strict mode and data coercion¶
By default, Pydantic is tolerant of common incorrect types and coerces data to the right type — e.g. a numeric string passed to an `int` field will be parsed as an `int`.

Pydantic also has a strict mode, where types are not coerced and a validation error is raised unless the input data exactly matches the expected schema.

But strict mode would be pretty useless when validating JSON data, since JSON doesn't have types matching many common Python types like `datetime`, `UUID` or `bytes`.

To solve this, Pydantic can parse and validate JSON in one step. This allows sensible data conversion (e.g. when parsing strings into `datetime` objects). Since the JSON parsing is implemented in Rust, it's also very performant.
Example - Strict mode that's actually useful
```python
from datetime import datetime

from pydantic import BaseModel, ValidationError


class Meeting(BaseModel):
    when: datetime
    where: bytes


m = Meeting.model_validate({'when': '2020-01-01T12:00', 'where': 'home'})
print(m)
#> when=datetime.datetime(2020, 1, 1, 12, 0) where=b'home'
try:
    m = Meeting.model_validate(
        {'when': '2020-01-01T12:00', 'where': 'home'}, strict=True
    )
except ValidationError as e:
    print(e)
    """
    2 validation errors for Meeting
    when
      Input should be a valid datetime [type=datetime_type, input_value='2020-01-01T12:00', input_type=str]
    where
      Input should be a valid bytes [type=bytes_type, input_value='home', input_type=str]
    """

m_json = Meeting.model_validate_json(
    '{"when": "2020-01-01T12:00", "where": "home"}'
)
print(m_json)
#> when=datetime.datetime(2020, 1, 1, 12, 0) where=b'home'
```
Learn more
See the documentation on strict mode.
Dataclasses, TypedDicts, and more¶
Pydantic provides four ways to create schemas and perform validation and serialization:

- `BaseModel` — Pydantic's own super class with many common utilities available via instance methods.
- Pydantic dataclasses — a wrapper around standard dataclasses with additional validation performed.
- `TypeAdapter` — a general way to adapt any type for validation and serialization. This allows types like `TypedDict` and `NamedTuple` to be validated, as well as simple types (like `int` or `timedelta`) — all supported types can be used with `TypeAdapter`.
- `validate_call` — a decorator to perform validation when calling a function.
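Pydantic dataclasses work much like `BaseModel`; as a minimal sketch (this `Meeting` example is ours, not from the official docs), validation and coercion happen on construction:

```python
from datetime import datetime

from pydantic import dataclasses


@dataclasses.dataclass
class Meeting:
    when: datetime
    where: bytes


# string inputs are coerced during validation, exactly as with BaseModel
m = Meeting(when='2020-01-01T12:00', where='home')
print(m)
#> Meeting(when=datetime.datetime(2020, 1, 1, 12, 0), where=b'home')
```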
Example - schema based on a TypedDict

```python
from datetime import datetime

from typing_extensions import NotRequired, TypedDict

from pydantic import TypeAdapter


class Meeting(TypedDict):
    when: datetime
    where: bytes
    why: NotRequired[str]


meeting_adapter = TypeAdapter(Meeting)
m = meeting_adapter.validate_python(  # (1)!
    {'when': '2020-01-01T12:00', 'where': 'home'}
)
print(m)
#> {'when': datetime.datetime(2020, 1, 1, 12, 0), 'where': b'home'}
meeting_adapter.dump_python(m, exclude={'where'})  # (2)!

print(meeting_adapter.json_schema())  # (3)!
"""
{
    'properties': {
        'when': {'format': 'date-time', 'title': 'When', 'type': 'string'},
        'where': {'format': 'binary', 'title': 'Where', 'type': 'string'},
        'why': {'title': 'Why', 'type': 'string'},
    },
    'required': ['when', 'where'],
    'title': 'Meeting',
    'type': 'object',
}
"""
```

1. `TypeAdapter` for a `TypedDict` performing validation; it can also validate JSON data directly with `validate_json`.
2. `dump_python` to serialise a `TypedDict` to a Python object; it can also serialise to JSON with `dump_json`.
3. `TypeAdapter` can also generate a JSON Schema.
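The `validate_call` decorator deserves a sketch too. This example is our own (the function and its arguments are invented for illustration); arguments are validated and coerced against the type hints before the function body runs:

```python
from datetime import timedelta

from pydantic import validate_call


@validate_call
def schedule_reminder(message: str, delay: timedelta) -> str:
    # by the time we get here, Pydantic has validated (and coerced) both arguments
    return f'{message!r} in {delay.total_seconds():.0f}s'


# the ISO 8601 duration string 'PT30S' is coerced to a timedelta
print(schedule_reminder('stand-up', 'PT30S'))
#> 'stand-up' in 30s
```

Passing something that can't be coerced (e.g. `delay='soon'`) raises a `ValidationError`, just as with model validation.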
Customisation¶
Functional validators and serializers, together with a powerful protocol for custom types, mean the way Pydantic operates can be customized on a per-field or per-type basis.
Customisation Example - wrap validators
"wrap validators" are new in Pydantic V2 and are one of the most powerful ways to customize validation.
```python
from datetime import datetime, timezone
from typing import Any

from pydantic_core.core_schema import ValidatorFunctionWrapHandler

from pydantic import BaseModel, field_validator


class Meeting(BaseModel):
    when: datetime

    @field_validator('when', mode='wrap')
    def when_now(
        cls, input_value: Any, handler: ValidatorFunctionWrapHandler
    ) -> datetime:
        if input_value == 'now':
            return datetime.now()
        when = handler(input_value)
        # in this specific application we know tz naive datetimes are in UTC
        if when.tzinfo is None:
            when = when.replace(tzinfo=timezone.utc)
        return when


print(Meeting(when='2020-01-01T12:00+01:00'))
#> when=datetime.datetime(2020, 1, 1, 12, 0, tzinfo=TzInfo(+01:00))
print(Meeting(when='now'))
#> when=datetime.datetime(2032, 1, 2, 3, 4, 5, 6)
print(Meeting(when='2020-01-01T12:00'))
#> when=datetime.datetime(2020, 1, 1, 12, 0, tzinfo=datetime.timezone.utc)
```
Learn more
See the documentation on validators, custom serializers, and custom types.
Ecosystem¶
At the time of writing there are 466,400 repositories on GitHub and 8,119 packages on PyPI that depend on Pydantic.
Some notable libraries that depend on Pydantic:
- `huggingface/transformers` (138,570 stars)
- `hwchase17/langchain` (99,542 stars)
- `tiangolo/fastapi` (80,497 stars)
- `apache/airflow` (38,577 stars)
- `lm-sys/FastChat` (37,650 stars)
- `microsoft/DeepSpeed` (36,521 stars)
- `OpenBB-finance/OpenBBTerminal` (35,971 stars)
- `gradio-app/gradio` (35,740 stars)
- `ray-project/ray` (35,176 stars)
- `pola-rs/polars` (31,698 stars)
- `Lightning-AI/lightning` (28,902 stars)
- `mindsdb/mindsdb` (27,141 stars)
- `embedchain/embedchain` (24,379 stars)
- `pynecone-io/reflex` (21,558 stars)
- `heartexlabs/label-studio` (20,571 stars)
- `Sanster/lama-cleaner` (20,313 stars)
- `mlflow/mlflow` (19,393 stars)
- `RasaHQ/rasa` (19,337 stars)
- `spotDL/spotify-downloader` (18,604 stars)
- `chroma-core/chroma` (17,393 stars)
- `airbytehq/airbyte` (17,120 stars)
- `openai/evals` (15,437 stars)
- `tiangolo/sqlmodel` (15,127 stars)
- `ydataai/ydata-profiling` (12,687 stars)
- `pyodide/pyodide` (12,653 stars)
- `dagster-io/dagster` (12,440 stars)
- `PaddlePaddle/PaddleNLP` (12,312 stars)
- `matrix-org/synapse` (11,857 stars)
- `lucidrains/DALLE2-pytorch` (11,207 stars)
- `great-expectations/great_expectations` (10,164 stars)
- `modin-project/modin` (10,002 stars)
- `aws/serverless-application-model` (9,402 stars)
- `sqlfluff/sqlfluff` (8,535 stars)
- `replicate/cog` (8,344 stars)
- `autogluon/autogluon` (8,326 stars)
- `lucidrains/imagen-pytorch` (8,164 stars)
- `brycedrennan/imaginAIry` (8,050 stars)
- `vitalik/django-ninja` (7,685 stars)
- `NVlabs/SPADE` (7,632 stars)
- `bridgecrewio/checkov` (7,340 stars)
- `bentoml/BentoML` (7,322 stars)
- `skypilot-org/skypilot` (7,113 stars)
- `apache/iceberg` (6,853 stars)
- `deeppavlov/DeepPavlov` (6,777 stars)
- `PrefectHQ/marvin` (5,454 stars)
- `NVIDIA/NeMo-Guardrails` (4,383 stars)
- `microsoft/FLAML` (4,035 stars)
- `jina-ai/discoart` (3,846 stars)
- `docarray/docarray` (3,007 stars)
- `aws-powertools/powertools-lambda-python` (2,980 stars)
- `roman-right/beanie` (2,172 stars)
- `art049/odmantic` (1,096 stars)

More libraries using Pydantic can be found at `Kludex/awesome-pydantic`.
Organisations using Pydantic¶
Some notable companies and organisations using Pydantic, together with notes on why/how we know they're using it.
The organisations below are included because they match one or more of the following criteria:
- Using Pydantic as a dependency in a public repository.
- Referring traffic to the Pydantic documentation site from an organization-internal domain — specific referrers are not included since they're generally not in the public domain.
- Direct communication between the Pydantic team and engineers employed by the organization about usage of Pydantic within the organization.
We've included some extra detail where appropriate and already in the public domain.
Adobe¶
`adobe/dy-sql` uses Pydantic.
Amazon and AWS¶
- `powertools-lambda-python`
- `awslabs/gluonts`
- AWS sponsored Samuel Colvin $5,000 to work on Pydantic in 2022
Anthropic¶
`anthropics/anthropic-sdk-python` uses Pydantic.
Apple¶
(Based on the criteria described above)
ASML¶
(Based on the criteria described above)
AstraZeneca¶
Multiple repos in the `AstraZeneca` GitHub org depend on Pydantic.
Cisco Systems¶
- Pydantic is listed in their report of Open Source Used In RADKit.
- `cisco/webex-assistant-sdk`
Comcast¶
(Based on the criteria described above)
Datadog¶
- Extensive use of Pydantic in `DataDog/integrations-core` and other repos.
- Communication with engineers from Datadog about how they use Pydantic.
Facebook¶
Multiple repos in the `facebookresearch` GitHub org depend on Pydantic.
GitHub¶
GitHub sponsored Pydantic $750 in 2022
Google¶
Extensive use of Pydantic in `google/turbinia` and other repos.
HSBC¶
(Based on the criteria described above)
IBM¶
Multiple repos in the `IBM` GitHub org depend on Pydantic.
Intel¶
(Based on the criteria described above)
Intuit¶
(Based on the criteria described above)
Intergovernmental Panel on Climate Change¶
Tweet explaining how the IPCC use Pydantic.
JPMorgan¶
(Based on the criteria described above)
Jupyter¶
- The developers of the Jupyter notebook are using Pydantic for subprojects
- Through the FastAPI-based Jupyter server Jupyverse
- FPS's configuration management
Microsoft¶
- The DeepSpeed deep learning optimisation library uses Pydantic extensively
- Multiple repos in the `microsoft` GitHub org depend on Pydantic
- Pydantic is also used in the `Azure` GitHub org
- Comments on GitHub show Microsoft engineers using Pydantic as part of Windows and Office
Molecular Science Software Institute¶
Multiple repos in the `MolSSI` GitHub org depend on Pydantic.
NASA¶
Multiple repos in the `NASA` GitHub org depend on Pydantic.
NASA are also using Pydantic via FastAPI in their JWST project to process images from the James Webb Space Telescope, see this tweet.
Netflix¶
Multiple repos in the `Netflix` GitHub org depend on Pydantic.
NSA¶
The `nsacyber/WALKOFF` repo depends on Pydantic.
NVIDIA¶
Multiple repositories in the `NVIDIA` GitHub org depend on Pydantic.
Their "Omniverse Services" depends on Pydantic according to their documentation.
OpenAI¶
OpenAI use Pydantic for their ChatCompletions API, as per this discussion on GitHub.
Anecdotally, OpenAI use Pydantic extensively for their internal services.
Oracle¶
(Based on the criteria described above)
Palantir¶
(Based on the criteria described above)
Qualcomm¶
(Based on the criteria described above)
Red Hat¶
(Based on the criteria described above)
Revolut¶
Anecdotally, all internal services at Revolut are built with FastAPI and therefore Pydantic.
Robusta¶
The `robusta-dev/robusta` repo depends on Pydantic.
Salesforce¶
Salesforce sponsored Samuel Colvin $10,000 to work on Pydantic in 2022.
Starbucks¶
(Based on the criteria described above)
Texas Instruments¶
(Based on the criteria described above)
Twilio¶
(Based on the criteria described above)
Twitter¶
Twitter's `the-algorithm` repo, where they open sourced their recommendation engine, uses Pydantic.
UK Home Office¶
(Based on the criteria described above)