How to create a robust microservice architecture for machine learning?

by Nilesh | Last Updated June 15, 2018 15:05

I am trying to create a framework for machine learning. I plan to run three Docker containers: one responsible for training the model, one for serving predictions, and one hosting Redis. I know that each service is supposed to have its own database in order to reduce inter-service dependencies. The model metadata will be stored in the database, and the trained model itself will be stored in the training service's volume, which the prediction service can also access.

Am I thinking about this correctly, or is there a better way to organize it? The reason I have separated the training and prediction services is that prediction should keep working in parallel while training runs. I am using a Flask server for both services.
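To make the shared-volume handoff concrete, here is a minimal sketch of how the prediction service could pick up a freshly trained model without a restart. It assumes the training container pickles the model to a path on the shared volume (the path, the `ModelStore` class, and the pickle format are all illustrative choices, not from the question); a Flask route would then call `store.get()` on each request.

```python
import os
import pickle


class ModelStore:
    """Loads a pickled model from the shared volume and reloads it
    whenever the training service writes a newer file.

    This lets the prediction service keep serving the old model while
    training runs, and switch over once the new artifact lands."""

    def __init__(self, path):
        self.path = path          # e.g. a file on the shared Docker volume
        self._mtime = None        # modification time of the loaded file
        self._model = None        # cached deserialized model

    def get(self):
        mtime = os.path.getmtime(self.path)
        if self._mtime is None or mtime > self._mtime:
            # Newer artifact on disk: reload before serving.
            with open(self.path, "rb") as f:
                self._model = pickle.load(f)
            self._mtime = mtime
        return self._model
```

In a Flask handler this might look like `model = store.get(); return jsonify(model.predict(features))`. Checking the file's mtime per request is cheap; for larger models you could instead have the training service publish a "new model ready" message through the Redis container and reload only on that signal.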
