Change in export constructor in freeze_graph.py #1

@markb729

Description

HamedMP - thank you for providing this example. Until I came across it, the lack of documentation and API support for a critical low-level function - along with spurious changes to the graph-export constructor - was causing quite a bit of frustration (and still is; see below).

I was able to get your CIFAR example to work and thought all my problems were solved. However, in versions >= 0.7.1, freeze_graph.py has changed and now requires initializer nodes to be specified, in addition to output node names (as before):

def freeze_graph(input_graph, input_saver, input_binary, input_checkpoint,
                 output_node_names, restore_op_name, filename_tensor_name,
                 output_graph, clear_devices, initializer_nodes):
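
For what it's worth, a hypothetical command-line invocation of the new version might look like the sketch below (paths and node names are placeholders, not taken from my setup; the flags mirror the function parameters above):

    python freeze_graph.py \
      --input_graph=graph.pbtxt \
      --input_checkpoint=model.ckpt \
      --output_graph=frozen_graph.pb \
      --output_node_names=softmax \
      --initializer_nodes=""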

This may not seem like too much of a problem, but my current goal is to adapt your approach to process the inception-v3 model. Essentially, I am attempting to replicate the output of classify_image_graph_def.pb found in http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz.

The ops and constants contained in this graph are substantial: some 1011 instances.

<tensorflow.python.framework.ops.Graph object at 0x120fbe310>

  * OPS *
    Const
    Const_1
    e/initial_value
    e
    e/Assign
    e/read
  * DEFAULT GRAPH *
    Const
    import/DecodeJpeg/contents
    import/DecodeJpeg
    import/Cast
    import/ExpandDims/dim
    ...
    import/pool_3
    import/pool_3/_reshape/shape
    import/pool_3/_reshape
    import/softmax/weights
    import/softmax/biases
    import/softmax/logits/MatMul
    import/softmax/logits
    import/softmax

My attempts to reproduce this const/graph after training (I use the "flowers" subset example from the Inception model for testing) have not been successful. Specifically, during training I first load the supplied Inception-v3 checkpoint (/inception-v3/model.ckpt-157585) and continue training (this uses inception_train.py). Once complete, I can export the new graph with tf.train.write_graph to a GraphDef protobuf; no problems there. I can then read the graph, assemble the variable list using tf.trainable_variables(), and ultimately create a comma-separated list (as per the Python API; the 1011 variables above). But when I pass it to freeze_graph.py, it errors with
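To illustrate what I mean by assembling the comma-separated list, here is a pure-Python sketch (no TensorFlow needed; the variable names are made up). One thing I suspect, though I have not confirmed it, is that v.name in TensorFlow carries a ":0" tensor-output suffix, while the node map inside freeze_graph.py is keyed by op names, which have no suffix:

```python
# Hypothetical variable names as tf.trainable_variables() would report them,
# i.e. with the ":0" output-index suffix attached.
variable_names = ["softmax/weights:0", "softmax/biases:0"]

# Strip the ":0" suffix to get the underlying op (node) names.
node_names = [name.split(":")[0] for name in variable_names]

# The comma-separated string freeze_graph.py expects for output_node_names.
output_node_names = ",".join(node_names)
print(output_node_names)  # softmax/weights,softmax/biases
```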

assert d in name_to_node_map, "%s is not in graph" % d
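
The check itself is easy to reproduce in isolation: every requested name must appear verbatim as a key in the node map, so a single mismatched name (for instance one still carrying a ":0" suffix; the names below are toy examples) trips the assert:

```python
# Toy stand-in for freeze_graph.py's name_to_node_map (node name -> NodeDef).
name_to_node_map = {"import/softmax": None, "import/pool_3": None}

# One valid node name and one name that does not match any key.
requested = "import/softmax,softmax/weights:0"

try:
    for d in requested.split(","):
        assert d in name_to_node_map, "%s is not in graph" % d
except AssertionError as e:
    print(e)  # softmax/weights:0 is not in graph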

This suggests a variable naming issue, perhaps intrinsic to Inception or to how graph_util processes names. Do you have any experience with this issue? It was suggested to me that

Sorry for all of the trouble. We keep a track of pointers to all Tensors of
interest in the Python endpoints dict.

For instance, the logits can be found here:
https://github.com/tensorflow/models/blob/master/inception/inception/slim/inception_model.py#L328

And the normalized predictions can be found here:
https://github.com/tensorflow/models/blob/master/inception/inception/slim/inception_model.py#L329

If you run the script and print out endpoints['predictions'].op.name, this
should provide the name of the output layer.

but I have not worked on this angle yet.
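If it helps anyone following along, the suggestion above can be mocked without TensorFlow: a tensor's .op.name is just its tensor name with the output index removed (the "softmax:0" name below is a placeholder, not necessarily the real predictions tensor):

```python
# Minimal stand-ins for tf.Operation and tf.Tensor, enough to show the
# relationship between a tensor name ("softmax:0") and its op name ("softmax").
class _Op(object):
    def __init__(self, name):
        self.name = name

class _Tensor(object):
    def __init__(self, tensor_name):
        # The op name is the tensor name minus the ":<index>" suffix.
        self.op = _Op(tensor_name.split(":")[0])

endpoints = {"predictions": _Tensor("softmax:0")}
print(endpoints["predictions"].op.name)  # softmax
```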
Thanks!
