BentoML is a Python framework for building, shipping, and running machine learning services. It provides high-level APIs for defining an ML service and packaging its artifacts, source code, dependencies, and configurations into a production-friendly format that is ready for deployment.
Use BentoML if you need to:
- Turn your ML model into a REST API server, serverless endpoint, PyPI package, or CLI tool
- Manage the workflow of creating and deploying an ML service
Defining a machine learning service with BentoML is as simple as a few lines of code:
```python
from bentoml import api, env, artifacts, BentoService
from bentoml.artifact import PickleArtifact
from bentoml.handlers import DataframeHandler

@artifacts([PickleArtifact('model')])
@env(conda_pip_dependencies=["scikit-learn"])
class IrisClassifier(BentoService):

    @api(DataframeHandler)
    def predict(self, df):
        return self.artifacts.model.predict(df)
```
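Conceptually, decorators like these attach artifact and API metadata to the service class, which the framework later reads when packaging and serving. A simplified, self-contained sketch of that registration pattern (illustrative only, not BentoML's actual implementation):

```python
def artifacts(artifact_list):
    """Class decorator: record which artifacts the service depends on."""
    def wrap(cls):
        cls._declared_artifacts = list(artifact_list)
        return cls
    return wrap

def api(handler):
    """Method decorator: mark a method as a service API endpoint."""
    def wrap(func):
        func._is_api = True
        func._handler = handler
        return func
    return wrap

@artifacts(["model"])
class ToyService:
    @api("DataframeHandler")
    def predict(self, df):
        # Stand-in for real model inference: return the first column.
        return [row[0] for row in df]

svc = ToyService()
print(svc._declared_artifacts)        # ['model']
print(svc.predict._is_api)            # True
print(svc.predict([[1, 2], [3, 4]]))  # [1, 3]
```

Because the metadata lives on the class itself, the packaging step can discover every declared artifact and API endpoint without the author writing any extra wiring code.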
- Multiple Distribution Formats - Easily package machine learning models and preprocessing code into a format that works best with your chosen deployment target:
- Docker Image - Deploy as a container running a REST API server
- PyPI Package - Integrate into Python applications seamlessly
- CLI tool - Incorporate your model into an Airflow DAG or CI/CD pipeline
- Spark UDF - Run batch inference on large datasets with Spark
- Serverless Function - Host your model on serverless platforms such as AWS Lambda
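For the REST API format, a client would POST serialized dataframe records to the server's prediction endpoint. A minimal sketch of building such a request with the standard library (the port, endpoint path, and JSON shape here are assumptions for illustration, not BentoML's documented contract):

```python
import json
from urllib.request import Request

# Rows of iris features, in the column order the model expects (assumed).
rows = [[5.1, 3.5, 1.4, 0.2], [6.2, 2.8, 4.8, 1.8]]

payload = json.dumps(rows).encode("utf-8")
req = Request(
    "http://localhost:5000/predict",   # hypothetical host and endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_method())  # POST
```

The same payload could be sent from any HTTP client (curl, requests, a browser), which is what makes the REST API format language-agnostic.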
- Multiple Framework Support - BentoML supports a wide range of machine learning frameworks out of the box, including TensorFlow, PyTorch, Scikit-Learn, XGBoost, H2O, and FastAI, and can be easily extended to work with new or custom frameworks.
- Deploy Anywhere - A BentoML-bundled machine learning service can be easily deployed with platforms such as Docker, Kubernetes, Serverless, Airflow, and Clipper, on cloud providers including AWS, Google Cloud, and Azure.
- Custom Runtime Backend - Easily integrate Python preprocessing code with high-performance deep learning runtime backends such as TensorFlow Serving.
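The idea behind this split is that Python handles feature preprocessing while a dedicated runtime serves the model itself. A schematic sketch of that composition (the backend call below is a stub standing in for, e.g., a TensorFlow Serving request; the function names are illustrative):

```python
def preprocess(raw_records):
    """Python-side feature engineering: min-max scale each record to [0, 1]."""
    processed = []
    for record in raw_records:
        lo, hi = min(record), max(record)
        span = (hi - lo) or 1.0  # avoid division by zero for constant rows
        processed.append([(x - lo) / span for x in record])
    return processed

def backend_predict(features):
    """Stub for a high-performance runtime call (e.g. TensorFlow Serving
    over gRPC/REST); here it just scores each record by summing features."""
    return [sum(f) for f in features]

def predict(raw_records):
    # The service composes both halves: Python preprocessing, then
    # inference delegated to the runtime backend.
    return backend_predict(preprocess(raw_records))

print(predict([[2.0, 4.0, 6.0]]))  # [1.5]
```

Keeping preprocessing in Python lets the same feature code run in training and serving, while the heavy model execution stays in an optimized backend.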
- Quick Start
  - Install BentoML
  - Running the quick start project
  - Quick start walkthrough
    - Add BentoML to the notebook and train a classification model
    - Define a machine learning service with BentoML
    - Save the defined ML service as a BentoML service archive
    - Using the BentoML archive
- API Reference
- Using Bento Archive