MNIST Example

Train and export the model

If the model has already been exported, clean up the old export first:

$>rm -rf /tmp/mnist_model

The training and export steps live in serving/tensorflow_serving/example/mnist_saved_model.py; the export part looks like this:

# Export path: <base>/<version>, e.g. /tmp/mnist_model/1
export_path_base = sys.argv[-1]
export_path = os.path.join(
    compat.as_bytes(export_path_base),
    compat.as_bytes(str(FLAGS.model_version)))
print('Exporting trained model to', export_path)

# Write the graph and trained variables as a SavedModel, exposing two
# signatures: a named prediction signature and the default classification one.
builder = tf.saved_model.builder.SavedModelBuilder(export_path)
builder.add_meta_graph_and_variables(
    sess, [tag_constants.SERVING],
    signature_def_map={
        'predict_images':
            prediction_signature,
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            classification_signature,
    },
    legacy_init_op=legacy_init_op)
builder.save()
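
The prediction_signature referenced above is built earlier in the same script and is not part of this excerpt. As a rough sketch (assuming x is the image placeholder and y the softmax output tensor of the trained graph), it can be constructed like this:

tensor_info_x = tf.saved_model.utils.build_tensor_info(x)
tensor_info_y = tf.saved_model.utils.build_tensor_info(y)

# A Predict signature: raw image pixels in, class scores out.
prediction_signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={'images': tensor_info_x},
    outputs={'scores': tensor_info_y},
    method_name=signature_constants.PREDICT_METHOD_NAME)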

Build and run with Bazel

$>bazel build -c opt //tensorflow_serving/example:mnist_saved_model
$>bazel-bin/tensorflow_serving/example/mnist_saved_model /tmp/mnist_model

Inspect the exported model

$>ls /tmp/mnist_model
1
$>ls /tmp/mnist_model/1
saved_model.pb variables
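
If a local TensorFlow installation is available, the saved_model_cli tool that ships with TensorFlow (not with Serving) can also be used to inspect the exported signatures, for example:

$>saved_model_cli show --dir /tmp/mnist_model/1 --all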

Load the model into the Model Server

Load the model using Bazel. Note that --model_base_path points at the directory containing the numbered version subdirectories (here /tmp/mnist_model/), not at a single version directory:

$>bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server
$>bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/

Verify the service

Using Bazel

$>bazel build -c opt //tensorflow_serving/example:mnist_client
$>bazel-bin/tensorflow_serving/example/mnist_client --num_tests=1000 --server=localhost:9000
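
mnist_client downloads the MNIST test set, sends Predict RPCs to the server, and prints the inference error rate. The core of such a client, sketched under the assumption that the 'predict_images' signature exported above takes an 'images' tensor of shape [1, 784] and returns 'scores' (using the gRPC beta API of the Bazel-era examples):

from grpc.beta import implementations
import numpy
import tensorflow as tf

from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

# Connect to the model server started above.
channel = implementations.insecure_channel('localhost', 9000)
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

# Build a PredictRequest against the 'predict_images' signature.
request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'
request.model_spec.signature_name = 'predict_images'
image = numpy.zeros((1, 784), dtype=numpy.float32)  # dummy image for illustration
request.inputs['images'].CopyFrom(
    tf.contrib.util.make_tensor_proto(image, shape=[1, 784]))

# Blocking call with a 10-second timeout; the response carries the 'scores' output.
result = stub.Predict(request, 10.0)
print(result.outputs['scores'])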
