TensorFlow

About this page

This is the API reference for TensorFlow in BentoML. Please refer to the TensorFlow guide for more information about how to use TensorFlow in BentoML.

Note

You can find more examples for TensorFlow in our BentoML/examples directory.

bentoml.tensorflow.save_model(name: Tag | str, model: tf_ext.KerasModel | tf_ext.Module, *, tf_signatures: tf_ext.ConcreteFunction | None = None, tf_save_options: tf_ext.SaveOptions | None = None, signatures: dict[str, ModelSignature] | dict[str, ModelSignatureDict] | None = None, labels: dict[str, str] | None = None, custom_objects: dict[str, t.Any] | None = None, external_modules: list[ModuleType] | None = None, metadata: dict[str, t.Any] | None = None) → bentoml.Model

Save a model instance to the BentoML model store.

Parameters:
  • name – Name for the given model instance. This should pass the Python identifier check.

  • model – Instance of the model to be saved.

  • tf_signatures – Refer to the Signatures explanation in the TensorFlow documentation for more information.

  • tf_save_options – TensorFlow save options.

  • signatures – Methods to expose for running inference on the target model. Signatures are used when creating Runner instances for serving the model with bentoml.Service; see the sketch after this list.

  • labels – user-defined labels for managing models, e.g. team=nlp, stage=dev

  • custom_objects – user-defined additional python objects to be saved alongside the model, e.g. a tokenizer instance, preprocessor function, model configuration json

  • external_modules – user-defined additional python modules to be saved alongside the model or custom objects, e.g. a tokenizer module, preprocessor module, model configuration module

  • metadata – Custom metadata for the given model.
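
As a hedged sketch of these options used together, the snippet below passes signatures, labels, and metadata to save_model. The model variable and the label/metadata values are illustrative assumptions; the "__call__" key matches the tf.function exposed by the model in the Examples section below:

import bentoml

# Illustrative values only; "__call__" must match a method on the saved model,
# and the labels/metadata shown here are assumed examples, not required keys.
bento_model = bentoml.tensorflow.save_model(
    "native_toy",
    model,
    signatures={"__call__": {"batchable": True, "batch_dim": 0}},
    labels={"team": "nlp", "stage": "dev"},
    metadata={"dataset_version": "v1"},
)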

Raises:

ValueError – If the given model is not trackable.

Returns:

A BentoML Model instance whose tag is in the format name:version, where name is the user-defined model name and version is generated by BentoML.

Return type:

bentoml.Model

Examples:

import tensorflow as tf
import numpy as np
import bentoml

class NativeModel(tf.Module):
    def __init__(self):
        super().__init__()
        # np.asfarray was removed in NumPy 2.0; use np.asarray with an explicit dtype.
        self.weights = np.asarray([[1.0], [1.0], [1.0], [1.0], [1.0]], dtype=np.float64)
        self.dense = lambda inputs: tf.matmul(inputs, self.weights)

    @tf.function(
        input_signature=[tf.TensorSpec(shape=[1, 5], dtype=tf.float64, name="inputs")]
    )
    def __call__(self, inputs):
        return self.dense(inputs)

# then save the given model to the BentoML model store:
model = NativeModel()
bento_model = bentoml.tensorflow.save_model("native_toy", model)

Note

The bentoml.tensorflow.save_model API also supports saving RaggedTensor models and Keras models. If you save a Keras model with bentoml.tensorflow.save_model, the model will be saved in the SavedModel format instead of the H5 format.
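
For example, a minimal sketch of saving a Keras model; the architecture and the name "keras_toy" are illustrative assumptions, and any Keras model works the same way:

import tensorflow as tf
import bentoml

# A tiny illustrative Keras model.
keras_model = tf.keras.Sequential(
    [
        tf.keras.layers.Input(shape=(5,)),
        tf.keras.layers.Dense(1),
    ]
)

# The model is stored in the SavedModel format, not H5.
bento_model = bentoml.tensorflow.save_model("keras_toy", keras_model)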

bentoml.tensorflow.load_model(bento_model: str | Tag | bentoml.Model, device_name: str = '/device:CPU:0') → tf_ext.AutoTrackable | tf_ext.Module

Load a TensorFlow model from the local BentoML model store with the given name.

Parameters:
  • bento_model – Either the tag of the model to get from the store, or a bentoml.Model instance to load the model from.

  • device_name – The device ID to load the model on. The device ID format should be compatible with tf.device.

Returns:

An instance in the SavedModel format loaded from the BentoML model store.

Return type:

SavedModel

Examples:

import bentoml

# load a model back into memory
model = bentoml.tensorflow.load_model("my_tensorflow_model")
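
To load the model onto a specific device, a hedged sketch (this assumes a GPU is available; the device string follows tf.device conventions):

import bentoml

# Load the model onto the first GPU instead of the default CPU device.
model = bentoml.tensorflow.load_model(
    "my_tensorflow_model",
    device_name="/device:GPU:0",
)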

bentoml.tensorflow.get(tag_like: str | Tag) → Model

Get the BentoML model with the given tag.
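
For example, a brief sketch (this assumes the model name exists in the local model store):

import bentoml

# Retrieve the stored model entry, then create a Runner from it.
bento_model = bentoml.tensorflow.get("my_tensorflow_model:latest")
runner = bento_model.to_runner()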