Update parameter docstrings
jond01 committed Dec 14, 2023
1 parent 2d51551 commit 28e80ef
Showing 2 changed files with 19 additions and 7 deletions.
9 changes: 8 additions & 1 deletion batch_inference_v2/batch_inference_v2.py
@@ -186,7 +186,14 @@ def infer(
Can be provided as an input (DataItem) or as a parameter (e.g. string, list, DataFrame).
The default chosen sample set will always be the one that is set in the model artifact itself.
:param last_in_batch_set: Relevant only when `perform_drift_analysis` is `True`.
Whether to mark the monitoring window as completed and allow monitoring without extra inferences.
This flag can (and should) be used only when the model endpoint does not have
model-monitoring set.
If set to `True` (the default behavior), this flag marks the current monitoring window
(on this monitoring endpoint) as completed - the data inferred so far is assumed
to be the complete data for this monitoring window.
Set this flag to `False` if you want to record multiple results in close time
proximity (a "batch set"); in that case, set it to `False` on all batches except
the last one in the set.
Defaults to `None`, which means `mlrun`'s default is used if the parameter exists.
:raises MLRunInvalidArgumentError: If neither `model_path` nor `endpoint_id` is provided,
or if `last_in_batch_set` is provided for an unsupported `mlrun` version.
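
Note (not part of the commit): a minimal usage sketch of the documented `last_in_batch_set` flag, assuming the function is loaded from the mlrun hub as `hub://batch_inference_v2` and run with the `infer` handler shown in the hunk above. The dataset input name, dataset URIs, and endpoint ID below are hypothetical placeholders, not values from this repository.

# Record several inference batches in close time proximity ("batch set"),
# closing the monitoring window only on the last batch.
import mlrun

fn = mlrun.import_function("hub://batch_inference_v2")  # assumed hub URI

batch_uris = [
    "store://datasets/my-project/batch-0",  # hypothetical dataset URIs
    "store://datasets/my-project/batch-1",
    "store://datasets/my-project/batch-2",
]

for i, dataset_uri in enumerate(batch_uris):
    fn.run(
        handler="infer",
        inputs={"dataset": dataset_uri},  # input name is an assumption
        params={
            "endpoint_id": "my-endpoint-id",  # hypothetical model endpoint
            "perform_drift_analysis": True,
            # only the last batch in the set marks the monitoring window as complete
            "last_in_batch_set": i == len(batch_uris) - 1,
        },
    )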
