
ref(sentry-metrics): Add async task #29358

Merged · 5 commits · Oct 28, 2021
Conversation

@MeredithAnya (Member) commented Oct 15, 2021

Context

The metrics consumer is responsible for roughly four things after consuming messages from the ingest-metrics topic:

  1. get the integer ids from the indexer (the int version of each string)
  2. translate the message to use those ids instead of strings
  3. produce translated message to snuba-metrics topic
  4. asynchronously pass the relevant message data to the metrics product data model

More historical context can be found in #28431 and #28914.
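The four steps above could be sketched roughly as follows. This is an illustrative toy, not Sentry's actual code: the in-memory `Indexer` stands in for the real indexer service, and steps 3–4 (the Kafka producer and Celery task) are only noted in comments.

```python
# Toy sketch of the consumer's four responsibilities; names and message
# shapes are illustrative, not the real Sentry interfaces.
from typing import Any, Dict, List


class Indexer:
    """Step 1: maps metric strings to integer ids."""

    def __init__(self) -> None:
        self._ids: Dict[str, int] = {}

    def resolve(self, name: str) -> int:
        # Assign the next integer id the first time a string is seen.
        return self._ids.setdefault(name, len(self._ids))


def translate(message: Dict[str, Any], indexer: Indexer) -> Dict[str, Any]:
    """Step 2: rewrite the message to use ids instead of strings."""
    return {**message, "name": indexer.resolve(message["name"])}


def process_batch(batch: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    indexer = Indexer()
    translated = [translate(m, indexer) for m in batch]
    # Step 3 would produce `translated` to the snuba-metrics topic.
    # Step 4 would enqueue the async task with the relevant message data.
    return translated


out = process_batch(
    [{"name": "session", "value": 1.0}, {"name": "session", "value": 2.0}]
)
```

Both messages resolve to the same integer id because the indexer deduplicates on the string name.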

4. asynchronously pass the relevant message data to the metrics product data model

The metrics product data model isn't fleshed out yet, so until that happens it's not clear exactly what information we'll need to pass along. In the meantime, we still want to be able to stress-test the load, so I've added a task that only records some metrics.
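A minimal sketch of such a metrics-only task, with a stub recorder standing in for Sentry's internal metrics helper (the recorder class and metric key are illustrative, not the PR's actual names):

```python
# Hedged sketch: a task that only records throughput metrics about the
# payload it receives, since the product data model isn't defined yet.
from collections import Counter
from typing import Any, Dict, List


class MetricsRecorder:
    """Stand-in for Sentry's internal metrics backend."""

    def __init__(self) -> None:
        self.counters: Counter = Counter()

    def incr(self, key: str, amount: int = 1) -> None:
        self.counters[key] += amount


def process_indexed_metrics(
    messages: List[Dict[str, Any]], metrics: MetricsRecorder
) -> None:
    # Record only how many messages arrived; the payload itself is not
    # consumed yet because the product data model is undefined.
    metrics.incr("indexer.task.messages_received", amount=len(messages))


recorder = MetricsRecorder()
process_indexed_metrics([{"org_id": 1}, {"org_id": 2}], recorder)
```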

@fpacifici (Contributor) left a comment:


Overall no major concerns.
Please see my two comments inline.

@@ -638,6 +639,7 @@ def SOCIAL_AUTH_DEFAULT_USERNAME():
Queue("reports.deliver", routing_key="reports.deliver"),
Queue("reports.prepare", routing_key="reports.prepare"),
Queue("search", routing_key="search"),
Queue("sentry_metrics.indexer", routing_key="sentry_metrics.indexer"),
@fpacifici (Contributor):

Do we have to manually create a queue in production for this? Not sure what the process is.

@MeredithAnya (Member, Author):

@fpacifici I think we let @getsentry/ops know once this is deployed that we should have dedicated workers for this. (and we can make an ops ticket for it)

raise Exception(f"didn't get all the callbacks: {messages_left} left")

# if we have successfully produced messages to the snuba-metrics topic
# then enqueue a task to send the payload to the product metrics data model
process_indexed_metrics.apply_async(kwargs={"messages": batch})
Contributor:
Let's not provide all the content to the celery task. These messages are pretty heavy and the content will be useless.
At this stage you can format the message as you wish, but please let's remove the bucket values. The async task only needs the metadata of each bucket.
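The reviewer's suggestion could look something like the following sketch, assuming the heavy bucket values live under an illustrative `value` field (the real message schema is not shown in this PR):

```python
# Sketch: strip heavy bucket values before enqueueing the celery task,
# passing only per-bucket metadata. Field names are illustrative.
from typing import Any, Dict, List

HEAVY_FIELDS = ("value",)  # bucket values are the bulk of the payload


def to_task_payload(messages: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    """Keep only the metadata of each bucket."""
    return [
        {k: v for k, v in m.items() if k not in HEAVY_FIELDS}
        for m in messages
    ]


slim = to_task_payload(
    [{"org_id": 1, "metric_id": 7, "value": [1.0] * 1000}]
)
```

The slimmed payload keeps `org_id` and `metric_id` but drops the thousand-element value list, which is what makes the original messages heavy.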


@instrumented_task( # type: ignore
name="sentry.sentry_metrics.indexer.tasks.process_indexed_metrics",
queue="sentry_metrics.indexer",
@fpacifici (Contributor):

Is this being created now or is it ready?

@MeredithAnya (Member, Author):

@fpacifici it should be ready to deploy now; once things are deployed, ops can add the dedicated workers.

@MeredithAnya MeredithAnya merged commit b421a19 into master Oct 28, 2021
@MeredithAnya MeredithAnya deleted the sentry-metrics/SNS-737 branch October 28, 2021 20:16
@github-actions github-actions bot locked and limited conversation to collaborators Nov 13, 2021