Saving TensorFlow models for AI Explanations

This page explains how to save a TensorFlow model for use with AI Explanations, whether you're using TensorFlow 2.x or TensorFlow 1.15.

TensorFlow 2

If you're working with TensorFlow 2.x, use tf.saved_model.save to save your model. You can specify input signatures when you save it. If you have a single input signature, AI Explanations uses the default serving function for your explanation requests. If you have more than one input signature, specify the signature of your serving default function when you save your model:

tf.saved_model.save(m, model_dir, signatures={
    'serving_default': serving_fn,
    'xai_model': model_fn,  # Required for XAI
})

In this case, AI Explanations uses the model function signature that you saved with the xai_model key for your explanation requests. Use the exact string xai_model for the key.
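For example, you might define the model function as a tf.function with an explicit input signature before saving it under the xai_model key. This is a minimal sketch, not a required pattern; the model m, the input shape [None, 10], and the output key 'outputs' are assumptions for illustration:

# Hedged sketch: wrap the model call in a tf.function so that it can be
# passed as a SavedModel signature. The input shape and the output key
# are illustrative assumptions.
@tf.function(input_signature=[tf.TensorSpec([None, 10], tf.float32)])
def model_fn(features):
  return {'outputs': m(features)}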

If you use a preprocessing function, you also need to specify the signatures for your preprocessing function and your model function. You must use the exact strings xai_preprocess and xai_model as the keys:

tf.saved_model.save(m, model_dir, signatures={
    'serving_default': serving_fn,
    'xai_preprocess': preprocess_fn,  # Required for XAI
    'xai_model': model_fn,  # Required for XAI
})

In this case, AI Explanations uses your preprocessing function and your model function for your explanation requests. Make sure that the output of your preprocessing function matches the input that your model function expects.
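As an illustration of matching the two functions, the preprocessing function below parses serialized tf.Example protos into a dense tensor whose shape matches the model function's input signature. The feature name 'values' and the shapes are assumptions for this sketch, not requirements of AI Explanations:

# Hedged sketch: preprocess_fn returns a [None, 10] float tensor, which is
# exactly what model_fn's input signature declares. Feature names and
# shapes here are illustrative assumptions.
@tf.function(input_signature=[tf.TensorSpec([None], tf.string)])
def preprocess_fn(serialized_examples):
  features = tf.io.parse_example(
      serialized_examples,
      {'values': tf.io.FixedLenFeature([10], tf.float32)})
  return features['values']

@tf.function(input_signature=[tf.TensorSpec([None, 10], tf.float32)])
def model_fn(values):
  return {'outputs': m(values)}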

Learn more about specifying serving signatures in TensorFlow.

Try the full TensorFlow 2 example notebooks.

TensorFlow 1.15

If you're working with TensorFlow 1.15, do not use tf.saved_model.save. This function is not supported by AI Explanations for TensorFlow 1.

If you build and train your model in Keras, you must convert it to a TensorFlow Estimator and then export it to a SavedModel. This section focuses on saving the model. For a full working example, see the example notebooks.

After you build, compile, train, and evaluate your Keras model, complete the following steps:

  • Convert the Keras model to a TensorFlow Estimator, using tf.keras.estimator.model_to_estimator
  • Provide a serving input function, using tf.estimator.export.build_raw_serving_input_receiver_fn
  • Export the model as a SavedModel, using the Estimator's export_saved_model method

# Build, compile, train, and evaluate your Keras model
model = tf.keras.Sequential(...)
model.compile(...)
model.fit(...)
model.evaluate(...)

# Convert your Keras model to an Estimator
keras_estimator = tf.keras.estimator.model_to_estimator(keras_model=model, model_dir='export')

# Define a serving input function appropriate for your model
def serving_input_receiver_fn():
  ...
  return tf.estimator.export.ServingInputReceiver(...)

# Export the SavedModel to Cloud Storage, using your serving input function
export_path = keras_estimator.export_saved_model(
    'gs://' + 'YOUR_BUCKET_NAME',
    serving_input_receiver_fn
).decode('utf-8')

print("Model exported to: ", export_path)

What's next