Embed more metadata in exported model and read it in native client #2007
Conversation
@@ -724,6 +724,17 @@ def do_graph_freeze(output_file=None, output_node_names=None, variables_blacklis
    if not FLAGS.export_tflite:
        frozen_graph = do_graph_freeze(output_node_names=output_names, variables_blacklist='previous_state_c,previous_state_h')
        frozen_graph.version = int(file_relative_read('GRAPH_VERSION').strip())

        # Add a no-op node to the graph with metadata information to be loaded by the native client
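The idea of the hunk above can be sketched in miniature: attach metadata to a serialized graph as an extra no-op node whose attributes carry strings, so a client can read them without executing the graph. The `NodeDef`/`GraphDef` dataclasses below are simplified stand-ins for TensorFlow's protobuf messages, and `add_metadata_node`/`read_metadata` are hypothetical helpers for illustration; the actual PR manipulates a frozen `tf.GraphDef`:

```python
from dataclasses import dataclass, field

# Minimal stand-ins for TensorFlow's NodeDef/GraphDef protos,
# used here only to illustrate the mechanism.
@dataclass
class NodeDef:
    name: str
    op: str
    attr: dict = field(default_factory=dict)

@dataclass
class GraphDef:
    node: list = field(default_factory=list)

def add_metadata_node(graph, key, value):
    # Append a no-op node whose attribute carries the metadata string.
    graph.node.append(NodeDef(name='metadata_' + key, op='NoOp',
                              attr={key: value.encode('utf-8')}))
    return graph

def read_metadata(graph, key):
    # Scan the node list for the metadata node and decode its attribute.
    for n in graph.node:
        if n.name == 'metadata_' + key and key in n.attr:
            return n.attr[key].decode('utf-8')
    return None

g = add_metadata_node(GraphDef(), 'language', 'en')
print(read_metadata(g, 'language'))  # en
```

Because the node's op is a no-op, it has no effect at inference time; the metadata travels inside the exported graph file itself.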
And we don't break with TF Lite? nice :)
Nope. We don't get the info in the TFLite graph either, unfortunately. I'll try to find a workaround when I have some time.
@@ -73,6 +77,7 @@ def create_flags():
    tf.app.flags.DEFINE_boolean ('export_tflite', False, 'export a graph ready for TF Lite engine')
    tf.app.flags.DEFINE_boolean ('use_seq_length', True, 'have sequence_length in the exported graph (will make tfcompile unhappy)')
    tf.app.flags.DEFINE_integer ('n_steps', 16, 'how many timesteps to process at once by the export graph, higher values mean more latency')
    tf.app.flags.DEFINE_string ('export_model_language', '', 'language the model was trained on. Gets embedded into exported model.')
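The TF1 `tf.app.flags` wrappers above are a thin layer over command-line parsing. For readers without TensorFlow 1.x installed, the new flag can be approximated with `argparse` (a stand-in sketch, not the PR's actual code):

```python
import argparse

# Sketch of the flag above, assuming tf.app.flags.DEFINE_string behaves
# like a string-valued CLI option with a default and a help text.
parser = argparse.ArgumentParser()
parser.add_argument('--export_model_language', default='',
                    help='language the model was trained on. '
                         'Gets embedded into exported model.')

FLAGS = parser.parse_args(['--export_model_language', 'en'])
print(FLAGS.export_model_language)  # en
```

The empty-string default means exporting without the flag simply embeds no language metadata.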
I'd like us to pick a different name; it's confusing. I thought it was the language model we were embedding, and I had to re-read carefully several times. Can the option be named "add_language_metadata"?
How about this:
tf.app.flags.DEFINE_string ('export_language', '', 'language the model was trained on e.g. "en" or "English". Gets embedded into exported model.')
WFM, feels less confusing
LGTM, make sure tests are green, and maybe change the flag name for exporting language metadata
OSX workers are acting up, but the relevant code has been exercised in other completed tests. Merged manually to avoid starting another group here in the PR.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
No description provided.