How to resolve "pydantic model is not JSON serializable"

I have the pydantic models below.

from typing import List, Mapping, Optional

from pydantic import BaseModel, Extra


class SubModel(BaseModel):
    columns: Mapping
    key: List[str]
    required: Optional[List[str]]

    class Config:
        anystr_strip_whitespace = True
        extra = Extra.allow
        allow_population_by_field_name = True


class MyModel(BaseModel):
    name: str
    config1: Optional[SubModel]
    config2: Optional[Mapping]

    class Config:
        anystr_strip_whitespace = True
        extra = Extra.allow
        allow_population_by_field_name = True

When I try to call dumps on this, I get a "model is not JSON serializable" error.

from io import BytesIO
from orjson import dumps
    
bucket = s3.Bucket(bucket_name)
bucket.upload(BytesIO(dumps(data)), key, ExtraArgs={'ContentType': 'application/json'})

Error -

TypeError: Type is not JSON serializable: MyModel

data is a normal Python dictionary with one item of type MyModel. I tried to use .json(), but I get "dict has no attribute json".

I am stuck here. Can someone help me?



Solution 1:[1]

I got a similar issue with a FastAPI response and solved it by:

return JSONResponse(content=jsonable_encoder(item), status_code=200)

or simply:

return jsonable_encoder(item)

where jsonable_encoder is:

from fastapi.encoders import jsonable_encoder

More details are here: https://fastapi.tiangolo.com/tutorial/encoder/
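To see why this works, here is a conceptual sketch of what jsonable_encoder achieves (not FastAPI's actual implementation): recursively reduce containers, datetimes, and model-like objects that expose a .dict() method down to JSON-safe primitives.

```python
import datetime
import json

def encode(obj):
    # Conceptual sketch of jsonable_encoder: recursively convert
    # values into types the json module can handle natively.
    if isinstance(obj, dict):
        return {k: encode(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple, set)):
        return [encode(v) for v in obj]
    if isinstance(obj, datetime.datetime):
        return obj.isoformat()
    if hasattr(obj, "dict"):  # pydantic models expose .dict()
        return encode(obj.dict())
    return obj

item = {"when": datetime.datetime(2022, 1, 1), "n": 2}
print(json.dumps(encode(item)))  # {"when": "2022-01-01T00:00:00", "n": 2}
```

The real jsonable_encoder handles many more types (UUIDs, Decimals, dataclasses, custom json_encoders from the model's Config), which is why it is the recommended bridge between pydantic models and any plain JSON serializer.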

Solution 2:[2]

The problem is that pydantic models are not JSON serializable by default. In your case, you can call .dict() on the model to serialize a dict version of it:

from io import BytesIO
from orjson import dumps

bucket = s3.Bucket(bucket_name)
bucket.upload(BytesIO(dumps(data.dict())), key, ExtraArgs={'ContentType': 'application/json'})
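Note that if data is a plain dict that merely contains a model (as described in the question), data.dict() itself will raise AttributeError. One option in that case is to pass a default hook to the serializer so that any object exposing .dict() is converted on the fly; orjson.dumps accepts the same default= keyword as the stdlib json module. A minimal sketch of the idea, using json and a stand-in class in place of orjson and a real pydantic model:

```python
import json

class FakeModel:
    # Stand-in for a pydantic model: any object exposing .dict()
    def __init__(self, name):
        self.name = name

    def dict(self):
        return {"name": self.name}

def to_jsonable(obj):
    # Fallback invoked by the serializer for types it cannot handle;
    # orjson.dumps(data, default=to_jsonable) works the same way.
    if hasattr(obj, "dict"):
        return obj.dict()
    raise TypeError(f"Type is not JSON serializable: {type(obj).__name__}")

data = {"model": FakeModel("demo"), "plain": 1}
print(json.dumps(data, default=to_jsonable))  # {"model": {"name": "demo"}, "plain": 1}
```

This way the dict can mix models and ordinary values without converting each item by hand before serializing.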

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Dharman
Solution 2: Josep Pascual