pydantic `json_encoders` for builtin types (float, int, etc.)
I'm having some unexpected behavior with json encoding of fields like float, int, etc. using pydantic. Here is the documentation for json encoding, for reference.
As an example, this model seems to encode complex just fine, but ignores my float field.
import pydantic as pd

class Model(pd.BaseModel):
    class Config:
        arbitrary_types_allowed = True
        json_encoders = {
            float: lambda x: 'test',
            complex: lambda x: 'test'
        }

    d1: float
    d2: complex

m = Model(d1=1.0, d2=1j)
m.json()
# '{"d1": 1.0, "d2": "test"}'
Can anyone shed light on this behavior and point me in the right direction?
My use case is a custom encoder that detects when a float is numpy.inf and writes it as the string "Infinity" in JSON, rather than the bare Infinity literal the json package emits by default, which is not legal JSON.
Thanks for your help!
Solution 1:[1]
The reason your custom json_encoders entry for float is not working is that pydantic uses json.dumps() for serialization.
If a type is serializable by json.dumps() directly, pydantic will not apply a custom json_encoder to it.
complex, on the other hand, is not serializable by json.dumps(), which is why your custom encoder is applied to that field.
If you still want a custom encoder for builtin types, you can use orjson, as suggested in the pydantic documentation.
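A quick way to confirm this behavior, independent of pydantic: json.dumps serializes float natively and only invokes its default hook (the mechanism json_encoders plugs into) for types it cannot handle itself, such as complex.

```python
import json

# The default hook is only called for types json.dumps cannot serialize natively.
def fallback(obj):
    return 'test'

print(json.dumps({"d1": 1.0, "d2": 1j}, default=fallback))
# {"d1": 1.0, "d2": "test"}
```

The float passes through untouched; only the complex value reaches the hook, mirroring what happens inside pydantic's .json().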
Sample Code:
import orjson
from pydantic import BaseModel

def orjson_dumps(v, *, default):
    for key, value in v.items():
        if isinstance(value, float):
            v[key] = 'test'
        elif isinstance(value, complex):
            v[key] = 'test'
    # orjson.dumps returns bytes; decode to match the str returned by json.dumps
    return orjson.dumps(v, default=default).decode()

class Model(BaseModel):
    class Config:
        arbitrary_types_allowed = True
        json_dumps = orjson_dumps
        json_loads = orjson.loads

    d1: float
    d2: complex

m = Model(d1=1.0, d2=1j)
m.json()  # '{"d1":"test","d2":"test"}'
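For the original use case (emitting "Infinity" as a string instead of the bare Infinity literal), the same json_dumps override pattern works. Here is a minimal sketch using math.inf so it runs without numpy (numpy.inf compares equal to math.inf); note this simple version only inspects top-level values and does not handle -inf or NaN.

```python
import json
import math

def inf_safe_dumps(v, *, default=None):
    # Replace positive infinity with a JSON-legal string before serializing;
    # the stdlib json module would otherwise emit the bare literal Infinity.
    safe = {
        key: ("Infinity" if isinstance(value, float) and value == math.inf else value)
        for key, value in v.items()
    }
    return json.dumps(safe, default=default)

print(inf_safe_dumps({"d1": math.inf, "d2": 1.5}))
# {"d1": "Infinity", "d2": 1.5}
```

Wiring this in via json_dumps = inf_safe_dumps in the model's Config, as done with orjson_dumps above, gives the same behavior for model.json().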
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
