
Failing test: X-Pack Alerting API Integration Tests.x-pack/test/alerting_api_integration/spaces_only/tests/alerting/builtin_alert_types/index_threshold/alert.ts - alerting api integration spaces only Alerting builtin alertTypes index_threshold alert runs correctly: min grouped #60744

Closed
kibanamachine opened this issue Mar 20, 2020 · 3 comments · Fixed by #60792
Assignees: pmuellr
Labels: failed-test, Team:ResponseOps, v7.7.0

Comments

@kibanamachine (Contributor)

A test failed on a tracked branch

Error: expected 3 to equal 2
    at Assertion.assert (/dev/shm/workspace/kibana/packages/kbn-expect/expect.js:100:11)
    at Assertion.be.Assertion.equal (/dev/shm/workspace/kibana/packages/kbn-expect/expect.js:227:8)
    at Assertion.be (/dev/shm/workspace/kibana/packages/kbn-expect/expect.js:69:22)
    at Context.it (test/alerting_api_integration/spaces_only/tests/alerting/builtin_alert_types/index_threshold/alert.ts:280:27)
    at process._tickCallback (internal/process/next_tick.js:68:7)
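
For context, kbn-expect produces a message of this shape from an equality assertion. A minimal sketch of what the check at alert.ts:280 plausibly looks like (the names and values here are illustrative assumptions, not the actual test code):

    import expect from '@kbn/expect';

    // Hypothetical reconstruction of the failing assertion: the test expected
    // exactly 2 grouped result documents, but a third had already been written
    // by the time the query ran.
    const docs: unknown[] = [{}, {}, {}]; // stands in for docs queried from the test index
    expect(docs.length).to.be(2); // throws: Error: expected 3 to equal 2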

First failure: Jenkins Build

@kibanamachine added the failed-test label Mar 20, 2020
@elasticmachine (Contributor)

Pinging @elastic/kibana-test-triage (failed-test)

@spalger added the Team:ResponseOps label Mar 20, 2020
@elasticmachine (Contributor)

Pinging @elastic/kibana-alerting-services (Team:Alerting Services)

@pmuellr self-assigned this Mar 20, 2020
@pmuellr (Member) commented Mar 20, 2020

I was half-expecting one of these recently written tests to be flaky; I tried to design them to be as flake-free as possible, but a flake leaked in! I did test with the flaky test runner yesterday as well, and it passed 42 times :-)

An easy fix is in the works ...

pmuellr added a commit to pmuellr/kibana that referenced this issue Mar 20, 2020

resolves elastic#60744

This is a fairly complex test, with alerts that run actions that write to
an index which we then run queries over. The tests didn't account for some
slop in all that async activity, but they should now be about as flake-free
as they can be.

pmuellr added four more commits referencing this issue on Mar 20–21, 2020 (to pmuellr/kibana and to elastic/kibana), each carrying the same message.
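
The commit message points at the usual cure for this kind of async slop: poll until at least the expected number of documents has landed, and prefer >= assertions where extra alert executions can race in. A minimal sketch of that pattern (a generic illustration with hypothetical helper names, not necessarily what PR #60792 actually does):

    // Poll until at least `minCount` docs are visible, or time out.
    async function waitForDocCount(
      getCount: () => Promise<number>,
      minCount: number,
      timeoutMs = 30_000,
      intervalMs = 500
    ): Promise<number> {
      const deadline = Date.now() + timeoutMs;
      for (;;) {
        const count = await getCount();
        if (count >= minCount) return count; // enough docs have landed
        if (Date.now() > deadline) {
          throw new Error(`timed out waiting for ${minCount} docs, last saw ${count}`);
        }
        await new Promise((resolve) => setTimeout(resolve, intervalMs));
      }
    }

    // Usage sketch: wait for at least 2 action-written docs, then assert on
    // their contents rather than on an exact total that later runs can exceed.
    // const count = await waitForDocCount(() => countDocsInTestIndex(), 2);

Asserting on a lower bound plus the contents of the docs you did get keeps the test deterministic even when one extra alert execution sneaks in before the query runs.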
@kobelb added the needs-team label Jan 31, 2022
@botelastic removed the needs-team label Jan 31, 2022