Custom MLflow scoring_server for model serving

I would like to know whether MLflow currently supports any kind of customization of its scoring_server that would allow registering new endpoints on the published REST API.

By default the scoring server provides the /ping and /invocations endpoints, but I would like to expose additional endpoints alongside those.

I've seen some resources that achieve this kind of behaviour using custom WSGI implementations, but I would like to know whether extending the provided mlflow scoring_server is possible in any way, so that the default support provided by the MLflow-generated Docker images and the deployment management is not lost.
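For context, the kind of WSGI-level extension I have in mind looks roughly like the sketch below. Note the hedging: `make_stub_scoring_app` is a stand-in I wrote for illustration, not a real MLflow API; the idea is that a middleware could serve extra routes itself and delegate everything else (including /ping and /invocations) to whatever WSGI app the scoring server exposes.

```python
import json

def make_stub_scoring_app():
    """Stand-in for the scoring server's WSGI app (hypothetical);
    it only knows about /ping here, for demonstration purposes."""
    def scoring_app(environ, start_response):
        if environ.get("PATH_INFO", "") == "/ping":
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [b"OK"]
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    return scoring_app

def with_extra_endpoints(app, extra_routes):
    """WSGI middleware: handle paths in extra_routes directly,
    delegate every other path to the wrapped scoring app."""
    def wrapped(environ, start_response):
        handler = extra_routes.get(environ.get("PATH_INFO", ""))
        if handler is not None:
            body = json.dumps(handler()).encode("utf-8")
            start_response("200 OK", [("Content-Type", "application/json")])
            return [body]
        return app(environ, start_response)
    return wrapped

# Wrap the (stubbed) scoring app with one additional endpoint.
app = with_extra_endpoints(
    make_stub_scoring_app(),
    {"/health/details": lambda: {"status": "healthy"}},
)
```

This keeps the original app untouched, which is why I was hoping MLflow might offer a supported hook for it rather than requiring a fully custom server.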

I have explored the existing official and unofficial documentation, as well as the existing GitHub issues and the MLflow codebase in its GitHub repository.

I've also explored some alternatives, such as using a custom WSGI server configuration to start the REST API.
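Concretely, by "custom WSGI server configuration" I mean starting the app under a server I control instead of the stock launcher. A minimal sketch using only the standard library's wsgiref (the `app` below is a trivial placeholder, not the real scoring app):

```python
import threading
import urllib.request
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # Placeholder WSGI app; in the real setup this would be the
    # scoring server's app, possibly wrapped with extra endpoints.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]

# Bind to an ephemeral port and serve from a background thread.
server = make_server("127.0.0.1", 0, app)
threading.Thread(target=server.serve_forever, daemon=True).start()

resp = urllib.request.urlopen(
    f"http://127.0.0.1:{server.server_port}/ping"
)
body = resp.read()
server.shutdown()
```

The drawback of this route is exactly what I described above: I would be bypassing MLflow's own serving entrypoint, so the generated Docker images and deployment tooling would no longer manage the server for me.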

Any kind of resource/documentation is greatly appreciated.

Thanks in advance.



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
