TFX Iris sample #3119
Conversation
/cc @jingzhang36
samples/core/iris/iris_pipeline.py (outdated):

```python
from tfx.components import Trainer
from tfx.components import Transform
from tfx.components.base import executor_spec
from tfx.components.trainer.executor import GenericExecutor
```
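For context, a minimal sketch (assuming TFX 0.21-style APIs; `module_file` and the upstream component handles `transform` and `schema_gen` are placeholders from the broader pipeline) of how `GenericExecutor` is typically wired into the Trainer so a native Keras `run_fn` can be used:

```python
from tfx.components import Trainer
from tfx.components.base import executor_spec
from tfx.components.trainer.executor import GenericExecutor
from tfx.proto import trainer_pb2

# Swap in the generic (Keras-friendly) Trainer executor via custom_executor_spec.
trainer = Trainer(
    module_file=module_file,  # user module defining run_fn (placeholder path)
    custom_executor_spec=executor_spec.ExecutorClassSpec(GenericExecutor),
    examples=transform.outputs['transformed_examples'],
    transform_graph=transform.outputs['transform_graph'],
    schema=schema_gen.outputs['schema'],
    train_args=trainer_pb2.TrainArgs(num_steps=100),  # step counts are illustrative
    eval_args=trainer_pb2.EvalArgs(num_steps=50))
```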
I wonder what would be the best option when we want to import executors from multiple components.
Ummm, that's a good question. I don't see any shortcut for that.
I guess that's partly because we don't often explicitly interact with TFX component executors when authoring a pipeline.
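One option (a sketch assuming TFX 0.21 module paths, not an official recommendation): import the executor modules under aliases rather than the executor classes, so executors from several components can coexist without name collisions:

```python
from tfx.components.base import executor_spec
# Alias the executor modules so names like Executor / GenericExecutor don't clash.
from tfx.components.trainer import executor as trainer_executor
from tfx.components.transform import executor as transform_executor

trainer_executor_spec = executor_spec.ExecutorClassSpec(
    trainer_executor.GenericExecutor)
transform_executor_spec = executor_spec.ExecutorClassSpec(
    transform_executor.Executor)
```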
/lgtm
I have a question regarding the Evaluator output of this pipeline. I did a run of this pipeline, and the Evaluator's output is in my public GCS bucket at gs://jingzhangjz-project-outputs/tfx_iris/8d8d58b1-2826-4533-ae0d-c04a65b2864d/Evaluator/evaluation/30. I parsed the metric file under that bucket with TensorFlow (`import tensorflow as tf`), but the parsed data seems to be missing the metrics field: the records all have slice_key fields, but no metrics field. As a reference, the parsed data from the taxi example in KFP does include the metrics field.
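For reference, a minimal sketch (using the run's output path quoted above; `tfma.load_eval_result` assumes a TFMA 0.21-era API) of reading the Evaluator output through TFMA's own loader rather than parsing the record files by hand:

```python
import tensorflow_model_analysis as tfma

# Evaluator output directory from the run mentioned above.
eval_output_path = ('gs://jingzhangjz-project-outputs/tfx_iris/'
                    '8d8d58b1-2826-4533-ae0d-c04a65b2864d/Evaluator/evaluation/30')

# load_eval_result parses the metrics/plots files written by the Evaluator and
# returns an EvalResult; slicing_metrics pairs each slice key with its metrics.
eval_result = tfma.load_eval_result(eval_output_path)
for slice_key, metric_values in eval_result.slicing_metrics:
    print(slice_key, metric_values)
```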
It seems like a Keras problem IIUC. Switching to vanilla TF solved this.
And if you indeed want to use Keras, you'll need to handcraft the `eval_config`:

```python
eval_config = tfma.EvalConfig(
    model_specs=[
        tfma.ModelSpec(name='candidate', label_key='variety'),
        tfma.ModelSpec(
            name='baseline', label_key='variety', is_baseline=True)
    ],
    slicing_specs=[tfma.SlicingSpec()],
    metrics_specs=[
        tfma.MetricsSpec(metrics=[
            tfma.MetricConfig(
                class_name='SparseCategoricalAccuracy',
                threshold=tfma.config.MetricThreshold(
                    value_threshold=tfma.GenericValueThreshold(
                        lower_bound={'value': 0.9}),
                    change_threshold=tfma.GenericChangeThreshold(
                        direction=tfma.MetricDirection.HIGHER_IS_BETTER,
                        absolute={'value': -1e-10})))
        ])])
```
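A follow-up sketch of passing the handcrafted config to the Evaluator, assuming a TFX version whose Evaluator accepts `eval_config` and `baseline_model`; the component handles `example_gen`, `trainer`, and `model_resolver` are placeholders for the upstream pipeline nodes:

```python
from tfx.components import Evaluator

# The eval_config above drives both metrics computation and model validation;
# the baseline model would come from a latest-blessed-model resolver upstream.
evaluator = Evaluator(
    examples=example_gen.outputs['examples'],
    model=trainer.outputs['model'],
    baseline_model=model_resolver.outputs['model'],
    eval_config=eval_config)
```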
/lgtm
New changes are detected. LGTM label has been removed.
/assign @rmgogogo
@rmgogogo, this is the forked native Keras sample.
/approve
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: neuromage

The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing `/approve` in a comment.
/retest
/retest

/retest
New changes are detected. LGTM label has been removed.
/retest
* init
* update comment
* fix module file
* clean up
* update to beam sample
* add doc of default bucket
* bump viz server tfma version
* update iris sample to keras native version
* update iris sample to keras native version
* pin TFMA
* add readme
* add to sample test corpus
* add prebuilt && update some config
* sync frontend
* update snapshot
* update snapshot
* fix gettingstarted page
* fix unit test
* fix unit test
* update description
* update some comments
* add some dependencies.
Illustrative purpose
~~DO NOT SUBMIT~~ Changed my mind. I think we should include this sample :)

Verified at https://3ed47013582d1554-dot-us-central2.pipelines.googleusercontent.com/#/runs/details/a17f06a2-7799-4ad0-9bde-bc9c275b6355
Environment:
SDK: TFX 0.21.0 + KFP 0.2.4
Runtime: Hosted pipeline 0.2.3