Can Vertex AI batch prediction do vectorization with a custom container that has multiple models?
I have a system for automatically translating all text in a document into English, so I have built a fairly complicated Docker container with multiple models inside it. The container exposes an API endpoint that does the following:
- Download the image from the URL in the request (Cloud Storage, S3, ...)
- Detect the paper in the image (1 deep learning model - YOLO)
- Detect text regions in the image (1 deep learning model - ResNet)
- Get the OCR result from the text regions (tesseract)
- Translate the OCR result (1 deep learning model for translation)
- Return the translated result in JSON format.
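To make the flow concrete, the steps above can be sketched as a single request handler. The model objects here are hypothetical stand-ins (not the real YOLO/ResNet/translation models), so this is only a shape sketch of the container's endpoint:

```python
import json

# Hypothetical stand-ins for the real components; each one is an
# assumption standing in for a model that lives inside the container.
def download_image(url):
    # Real code would fetch from Cloud Storage / S3; stubbed here.
    return b"raw-image-bytes-for:" + url.encode()

def detect_paper(image):           # YOLO stand-in
    return image                   # would return the cropped paper region

def detect_text_regions(paper):    # ResNet stand-in
    return [paper]                 # would return text-region crops

def ocr(region):                   # tesseract stand-in
    return "bonjour"               # would return the recognized text

def translate(text):               # translation-model stand-in
    return {"bonjour": "hello"}.get(text, text)

def handle_request(request):
    """One request = one image, mirroring the current endpoint."""
    image = download_image(request["url"])
    paper = detect_paper(image)
    regions = detect_text_regions(paper)
    texts = [ocr(r) for r in regions]
    translated = [translate(t) for t in texts]
    return json.dumps({"translations": translated})
```

Each stage runs on exactly one image per call, which is why the endpoint currently serves only one image at a time.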
All of these steps run inside my container. As you can see, the container is quite complicated, with 4 deep learning models and a high-level library (tesseract). I have used it with Vertex Prediction without scaling and it works well, but it can serve only 1 image at a time.
I am considering Vertex batch prediction, but I have doubts. Can the batch prediction service really vectorize a batch of images when Google doesn't know what is inside my container?
Or should I do the vectorization myself: change my container to accept multiple images per request, merge them into one batch, and then run batched inference with each deep learning model inside the container?
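The second option (doing the batching myself) can be sketched like this. `model_batch_fn` is a hypothetical stand-in for any one of the container's models, assuming it is rewritten to accept a whole batch at once:

```python
def batched_predict(instances, model_batch_fn, batch_size=8):
    """Run a model once per batch of inputs instead of once per input."""
    results = []
    for i in range(0, len(instances), batch_size):
        batch = instances[i:i + batch_size]
        # One vectorized call per batch; the model sees the whole batch.
        results.extend(model_batch_fn(batch))
    return results

# Hypothetical stand-in for one batched model stage: processes the
# whole list in a single call (a real model would stack a tensor batch).
def fake_translate_batch(texts):
    return [t.upper() for t in texts]
```

Each model stage in the pipeline would be wrapped this way, so a request carrying N images triggers roughly N / batch_size model calls per stage instead of N.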
Please help
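For context on how the batch service would feed the container: as far as I understand, Vertex batch prediction reads its input as JSONL (one instance per line) and posts groups of those instances to the container's predict route as `{"instances": [...]}` requests, so the container can receive several instances per call. A hypothetical input file (bucket paths are placeholders) might look like:

```jsonl
{"url": "gs://my-bucket/doc-001.png"}
{"url": "gs://my-bucket/doc-002.png"}
{"url": "gs://my-bucket/doc-003.png"}
```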
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
