```
Configuring Pipeline via CLI
Starting pipeline via CLI... Ctrl+C to Quit
WARNING: Logging before InitGoogleLogging() is written to STDERR
W20220629 16:01:32.859344 1126 triton_inference.cpp:248] Failed to connect to Triton at 'ai-engine:8001'. Default gRPC port of (8001) was detected but C++ InferenceClientStage uses HTTP protocol. Retrying with default HTTP port (8000)
Line 1: cannot convert integer from 'version https://git-lfs.github.com/spec/v1': stoi
E20220629 16:01:34.085346 1159 context.cpp:125] main/preprocess-nlp-5; rank: 0; size: 1; tid: 139774159091456: set_exception issued; issuing kill to current runnable. Exception msg: stoi
E20220629 16:01:34.090284 959 runner.cpp:190] Runner::await_join - an exception was caught while awaiting on one or more contexts/instances - rethrowing
E20220629 16:01:34.090341 959 instance.cpp:259] segment::Instance - an exception was caught while awaiting on one or more nodes - rethrowing
E20220629 16:01:34.090363 959 instance.cpp:218] pipeline::Instance - an exception was caught while awaiting on segments - rethrowing
Drop Null Attributes rate[Complete]: 93085messages [00:01, 76784.38messages/s]
Deserialization rate[Complete]: 93085messages [00:01, 76555.85messages/s]
Preprocessing rate[Complete]: 0messages [00:00, ?messages/s]
Inference rate[Complete]: 0inf [00:00, ?inf/s]
Classification rate[Complete]: 0messages [00:00, ?messages/s]
Serialization rate[Complete]: 0messages [00:00, ?messages/s]
```
Describe the bug
The traditional NLP file-based smoke test fails in the Docker runtime image at 7684908.
The stack trace contains an unusual reference: "cannot convert integer from 'version https://git-lfs.github.com/spec/v1': stoi". That string is the first line of a Git LFS pointer file, which suggests the pipeline is parsing an un-fetched LFS pointer instead of the actual data file it stands in for.
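For illustration, a minimal sketch of how the input could be sanity-checked before it reaches the `stoi` call: this helper is hypothetical (not part of Morpheus), and simply tests whether a file still begins with the Git LFS pointer header, meaning `git lfs pull` has not replaced it with the real content.

```python
# Hypothetical pre-flight check; not part of the Morpheus codebase.
LFS_POINTER_PREFIX = b"version https://git-lfs.github.com/spec/v1"

def is_lfs_pointer(path: str) -> bool:
    """Return True if the file starts with the Git LFS pointer header,
    i.e. the real content was never fetched into the image."""
    with open(path, "rb") as f:
        return f.read(len(LFS_POINTER_PREFIX)) == LFS_POINTER_PREFIX
```

If a check like this fired inside the container, the fix would be to fetch the LFS objects (or bake them into the image) rather than copying pointer files into the runtime layer.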
Steps/Code to reproduce bug
Expected behavior
The pipeline run completes without errors.
Environment overview (please complete the following information)
Environment details
https://gist.github.com/pdmack/8f9342321523251bf9ecdd8cd349a029
Additional context
Needed by #237