Stop passing None to compile() in TF examples #29597

Merged · 2 commits · Mar 12, 2024
```diff
@@ -509,7 +509,7 @@ def compute_metrics(p):
             collate_fn=collate_fn,
         ).with_options(dataset_options)
     else:
-        optimizer = None
+        optimizer = "sgd"  # Just write anything because we won't be using it

     if training_args.do_eval:
         eval_dataset = model.prepare_tf_dataset(
```
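Every change in this PR applies the same small pattern: when training is disabled, `compile()` still needs a valid `optimizer` argument, because newer Keras versions reject `None`; any placeholder identifier works, since the optimizer is never stepped in an evaluation-only run. A minimal sketch of that selection logic (`choose_optimizer` and `make_real_optimizer` are illustrative names, not from the PR):

```python
def choose_optimizer(do_train, make_real_optimizer):
    """Pick the value to pass as compile(optimizer=...).

    When training is enabled, build the real optimizer; otherwise return a
    harmless placeholder string, since compile() rejects None in newer Keras
    versions but never touches the optimizer during evaluation or prediction.
    """
    if do_train:
        return make_real_optimizer()
    return "sgd"  # Just write anything because we won't be using it


# Evaluation-only run: the placeholder is returned and never stepped.
optimizer = choose_optimizer(do_train=False, make_real_optimizer=lambda: "adamw")
```

Any string that Keras recognizes as an optimizer identifier would do; the examples settle on `"sgd"` purely for consistency across files.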
examples/tensorflow/multiple-choice/run_swag.py (1 addition, 1 deletion)

```diff
@@ -482,7 +482,7 @@ def preprocess_function(examples):
             adam_global_clipnorm=training_args.max_grad_norm,
         )
     else:
-        optimizer = None
+        optimizer = "sgd"  # Just write anything because we won't be using it
     # Transformers models compute the right loss for their task by default when labels are passed, and will
     # use this for training unless you specify your own loss function in compile().
     model.compile(optimizer=optimizer, metrics=["accuracy"], jit_compile=training_args.xla)
```
examples/tensorflow/question-answering/run_qa.py (1 addition, 1 deletion)

```diff
@@ -706,7 +706,7 @@ def compute_metrics(p: EvalPrediction):
         model.compile(optimizer=optimizer, jit_compile=training_args.xla, metrics=["accuracy"])

     else:
-        model.compile(optimizer=None, jit_compile=training_args.xla, metrics=["accuracy"])
+        model.compile(optimizer="sgd", jit_compile=training_args.xla, metrics=["accuracy"])
        training_dataset = None

     if training_args.do_eval:
```

> **Collaborator:** Can you add a comment here saying the optimizer value doesn't matter, as it won't be used?
>
> **Author:** Done!
examples/tensorflow/summarization/run_summarization.py (1 addition, 1 deletion)

```diff
@@ -621,7 +621,7 @@ def postprocess_text(preds, labels):
             adam_global_clipnorm=training_args.max_grad_norm,
         )
     else:
-        optimizer = None
+        optimizer = "sgd"  # Just write anything because we won't be using it

     # endregion

```
examples/tensorflow/text-classification/README.md (4 additions, 1 deletion)

````diff
@@ -75,7 +75,10 @@ python run_text_classification.py \
 --train_file training_data.json \
 --validation_file validation_data.json \
 --output_dir output/ \
---test_file data_to_predict.json
+--test_file data_to_predict.json \
+--do_train \
+--do_eval \
+--do_predict
 ```

 ## run_glue.py
````
examples/tensorflow/text-classification/run_glue.py (1 addition, 1 deletion)

```diff
@@ -477,7 +477,7 @@ def compute_metrics(preds, label_ids):
             adam_global_clipnorm=training_args.max_grad_norm,
         )
     else:
-        optimizer = "adam"  # Just write anything because we won't be using it
+        optimizer = "sgd"  # Just write anything because we won't be using it
     if is_regression:
         metrics = []
     else:
```
```diff
@@ -526,7 +526,7 @@ def preprocess_function(examples):
             adam_global_clipnorm=training_args.max_grad_norm,
         )
     else:
-        optimizer = None
+        optimizer = "sgd"  # Just use any default
     if is_regression:
         metrics = []
     else:
```
examples/tensorflow/translation/run_translation.py (1 addition, 1 deletion)

```diff
@@ -584,7 +584,7 @@ def preprocess_function(examples):
             adam_global_clipnorm=training_args.max_grad_norm,
         )
     else:
-        optimizer = None
+        optimizer = "sgd"  # Just write anything because we won't be using it
     # endregion

     # region Metric and postprocessing
```