
[Synthetics] adjust run_from.id for private locations #156324

Merged

Conversation

dominiqueclarke
Contributor

@dominiqueclarke dominiqueclarke commented May 2, 2023

Release Note

All Synthetics errors that are in progress for private locations when the stack is upgraded will be resolved, and a new error state will take their place.

Summary

Resolves #156320

Adjust the value of the `run_from.id` field.

Before
Both the `run_from.id` and the `run_from.name` were set to the same value.

After
<img width="543" alt="Screen Shot 2023-05-01 at 7 55 24 PM" src="https://user-images.githubusercontent.com/11356435/235551964-bb9c419d-2ada-4a39-bd08-ce6f5021747b.png">
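
For reviewers, a minimal sketch of the intended difference, written as a plain object with made-up values (the real values come from the private location and its agent policy):

```ts
// Illustrative only: hypothetical values showing the intended shape of run_from
// for a monitor on a private location, before and after this change.
const before = {
  run_from: {
    id: 'My private location', // previously duplicated the location name
    name: 'My private location',
  },
};

const after = {
  run_from: {
    id: '4e9f1ab0-0000-4000-8000-000000000000', // the private location's id
    name: 'My private location',                // still the location's display name
  },
};
```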

Testing

  1. Check out this PR from integrations: [Synthetics] adjust run_from.id integrations#6047
  2. In the integrations repo, `cd packages/synthetics`
  3. Run `elastic-package clean`, then `elastic-package build` (you may need to upgrade elastic-package if you're significantly behind)
  4. In your `kibana.dev.yml` file, add `xpack.fleet.registryUrl: https://localhost:8080`
  5. Start Kibana with `NODE_EXTRA_CA_CERTS=$HOME/.elastic-package/profiles/default/certs/kibana/ca-cert.pem yarn start --no-base-path`
  6. Create a private location
  7. Create a monitor assigned to that private location
  8. Inspect the agent policy for that private location, then find the integration policy for that monitor. Ensure the `run_from.id` is the id of the location and the `run_from.name` is the name of the location (a sketch of the expected vars follows the note below).

(If you run into problems setting up your local kibana connected to elastic-package registry, see https://github.com/elastic/security-team/blob/main/docs/cloud-security-posture-team/kibana/local-setup-using-elastic-package.mdx)
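
If you want something concrete to compare against in step 8, here is a minimal sketch of the vars on the monitor's integration policy after the change. The field names follow the fixtures touched in this PR; the id value is made up:

```ts
// Hypothetical example: what the enabled stream's vars should roughly look like
// once the new package is installed. `location_id` feeds run_from.id and should
// hold the private location's id (its agent policy id); `location_name` feeds
// run_from.name and keeps the human-readable name.
const expectedStreamVars = {
  location_name: { value: 'My private location', type: 'text' },
  location_id: { value: '4e9f1ab0-0000-4000-8000-000000000000', type: 'text' },
};
```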

@apmmachine
Contributor

🤖 GitHub comments


Just comment with:

  • /oblt-deploy : Deploy a Kibana instance using the Observability test environments.
  • run elasticsearch-ci/docs : Re-trigger the docs validation. (use unformatted text in the comment!)

@dominiqueclarke dominiqueclarke added the bug, Team:Uptime, v8.8.0, v8.9.0, and release_note:skip labels May 2, 2023
@dominiqueclarke dominiqueclarke marked this pull request as ready for review May 2, 2023 12:45
@dominiqueclarke dominiqueclarke requested a review from a team as a code owner May 2, 2023 12:45
@elasticmachine
Contributor

Pinging @elastic/uptime (Team:uptime)

@shahzad31 shahzad31 requested a review from a team as a code owner May 11, 2023 06:25
@botelastic botelastic bot added the Team:Fleet label May 11, 2023
@elasticmachine
Contributor

Pinging @elastic/fleet (Team:Fleet)

@@ -66,6 +68,12 @@ export const migratePackagePolicyToV880: SavedObjectMigrationFn<PackagePolicy, P
}
}

// set location_id.id to agentPolicyId
if (enabledStream.vars && enabledStream.vars.location_id) {
@dominiqueclarke (Contributor, Author)


This var doesn't yet exist in existing policies, so this condition will never be satisfied. We need to manually add the new variable.

@@ -66,6 +68,12 @@ export const migratePackagePolicyToV880: SavedObjectMigrationFn<PackagePolicy, P
}
}

// set location_id.id to agentPolicyId
if (enabledStream.vars && enabledStream.vars.location_id) {
enabledStream.vars.location_id.value = agentPolicyId;
@dominiqueclarke (Contributor, Author)


So this var doesn't yet exist, but it will in the version of the policy that ships with 8.8.0. Because of that, we may also want to add the type:

    enabledStream.vars.location_id = {
        type: 'text',
        value: agentPolicyId
    };

I'm not sure if creating a brand new variable that doesn't yet exist causes issues, @kpollich
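
Spelling the suggestion out, a minimal sketch of what the migration step could look like (names follow the hunks above; this is just the idea, not necessarily the final Kibana code):

```ts
// Sketch only: mirrors the shape used in migratePackagePolicyToV880 above.
// Pre-8.8.0 policies never had a `location_id` var, so create it outright
// instead of only updating it when present (the guarded update never runs).
interface PackagePolicyVar {
  type?: string;
  value?: unknown;
}

function ensureLocationIdVar(
  vars: Record<string, PackagePolicyVar> | undefined,
  agentPolicyId: string
): void {
  if (!vars) return;
  vars.location_id = {
    type: 'text',
    value: agentPolicyId,
  };
}
```

Called with `enabledStream.vars` and the policy's `agentPolicyId`, this produces the same result as the snippet above even when the var is missing from the existing policy.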

@@ -64,6 +64,7 @@ export const httpPolicy = {
'ssl.verification_mode': { value: null, type: 'text' },
'ssl.supported_protocols': { value: null, type: 'yaml' },
location_name: { value: 'A private location', type: 'text' },
location_id: { value: 'A private location', type: 'text' },
@dominiqueclarke (Contributor, Author)


Remove this. This does not exist in the current policies. It's a bad fixture.

Member

@nchaulet nchaulet left a comment


Fleet migration 🚀

@kibana-ci
Collaborator

💚 Build Succeeded

Metrics [docs]

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app

| id | before | after | diff |
| --- | --- | --- | --- |
| synthetics | 1.2MB | 1.2MB | +35.0B |

Unknown metric groups

ESLint disabled line counts

| id | before | after | diff |
| --- | --- | --- | --- |
| enterpriseSearch | 19 | 21 | +2 |
| securitySolution | 400 | 404 | +4 |
| total | | | +6 |

Total ESLint disabled count

| id | before | after | diff |
| --- | --- | --- | --- |
| enterpriseSearch | 20 | 22 | +2 |
| securitySolution | 480 | 484 | +4 |
| total | | | +6 |

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@dominiqueclarke dominiqueclarke merged commit f59471b into elastic:main May 12, 2023
@dominiqueclarke dominiqueclarke deleted the fix/synthetics-run-from-id branch May 12, 2023 18:30
@dominiqueclarke dominiqueclarke added the release_note:breaking label and removed the release_note:skip label May 12, 2023
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request May 12, 2023
[Synthetics] adjust run_from.id for private locations (#156324)

Co-authored-by: shahzad31 <shahzad31comp@gmail.com>
(cherry picked from commit f59471b)
@kibanamachine
Contributor

💚 All backports created successfully

| Status | Branch | Result |
| --- | --- | --- |
| ✅ | 8.8 | |

Note: Successful backport PRs will be merged automatically after passing CI.

Questions?

Please refer to the Backport tool documentation

kibanamachine added a commit that referenced this pull request May 15, 2023
[Synthetics] adjust run_from.id for private locations (#156324) (#157531)

# Backport

This will backport the following commits from `main` to `8.8`:
- [[Synthetics] adjust run_from.id for private locations (#156324)](#156324)

<!--- Backport version: 8.9.7 -->

### Questions?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)


Co-authored-by: Dominique Clarke <dominique.clarke@elastic.co>
Co-authored-by: Shahzad <shahzad31comp@gmail.com>
jasonrhodes pushed a commit that referenced this pull request May 17, 2023
[Synthetics] adjust run_from.id for private locations (#156324)

Co-authored-by: shahzad31 <shahzad31comp@gmail.com>
Labels
bug, release_note:breaking, Team:Fleet, Team:Uptime, v8.8.0, v8.9.0
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[Synthetics] Observer.name is wrong when using private locations
7 participants