public API endpoint to export schedule final shifts #2047

Merged · 25 commits · Jun 5, 2023
Changes from 16 commits
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## Unreleased

### Added

- Add public API endpoint to export a schedule's final shifts by @joeyorlando ([#2047](https://github.com/grafana/oncall/pull/2047))

### Fixed

- Fix demo alert for inbound email integration by @vadimkerr ([#2081](https://github.com/grafana/oncall/pull/2081))
135 changes: 135 additions & 0 deletions docs/sources/oncall-api-reference/schedules.md
@@ -197,3 +197,138 @@ curl "{{API_URL}}/api/v1/schedules/SBM7DV7BKFUYU/" \
**HTTP request**

`DELETE {{API_URL}}/api/v1/schedules/<SCHEDULE_ID>/`

# Export a schedule's final shifts

**HTTP request**

```shell
curl "{{API_URL}}/api/v1/schedules/SBM7DV7BKFUYU/final_shifts?start_date=2023-01-01&end_date=2023-02-01" \
--request GET \
--header "Authorization: meowmeowmeow"
```

The above command returns JSON structured in the following way:

```json
{
"count": 12,
"next": null,
"previous": null,
"results": [
{
"user_pk": "UC2CHRT5SD34X",
"shift_start": "2023-01-02T09:00:00Z",
"shift_end": "2023-01-02T17:00:00Z"
},
{
"user_pk": "U7S8H84ARFTGN",
"shift_start": "2023-01-04T09:00:00Z",
"shift_end": "2023-01-04T17:00:00Z"
},
{
"user_pk": "UC2CHRT5SD34X",
"shift_start": "2023-01-06T09:00:00Z",
"shift_end": "2023-01-06T17:00:00Z"
},
{
"user_pk": "U7S8H84ARFTGN",
"shift_start": "2023-01-09T09:00:00Z",
"shift_end": "2023-01-09T17:00:00Z"
},
{
"user_pk": "UC2CHRT5SD34X",
"shift_start": "2023-01-11T09:00:00Z",
"shift_end": "2023-01-11T17:00:00Z"
},
{
"user_pk": "U7S8H84ARFTGN",
"shift_start": "2023-01-13T09:00:00Z",
"shift_end": "2023-01-13T17:00:00Z"
},
{
"user_pk": "UC2CHRT5SD34X",
"shift_start": "2023-01-16T09:00:00Z",
"shift_end": "2023-01-16T17:00:00Z"
},
{
"user_pk": "U7S8H84ARFTGN",
"shift_start": "2023-01-18T09:00:00Z",
"shift_end": "2023-01-18T17:00:00Z"
},
{
"user_pk": "UC2CHRT5SD34X",
"shift_start": "2023-01-20T09:00:00Z",
"shift_end": "2023-01-20T17:00:00Z"
},
{
"user_pk": "U7S8H84ARFTGN",
"shift_start": "2023-01-23T09:00:00Z",
"shift_end": "2023-01-23T17:00:00Z"
},
{
"user_pk": "UC2CHRT5SD34X",
"shift_start": "2023-01-25T09:00:00Z",
"shift_end": "2023-01-25T17:00:00Z"
},
{
"user_pk": "U7S8H84ARFTGN",
"shift_start": "2023-01-27T09:00:00Z",
"shift_end": "2023-01-27T17:00:00Z"
}
]
}
```

## Caveats

Some notes on the `start_date` and `end_date` query parameters:

- they are both required and must be valid ISO 8601 formatted dates (for example, `2023-01-31`)
- `end_date` must be greater than or equal to `start_date`
- `end_date` cannot be more than 365 days after `start_date`

Lastly, this endpoint is currently only available for web schedules; it will return HTTP 400 for schedules
defined via Terraform or iCal.
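
For illustration, here is a minimal Python sketch of how a client might call this endpoint and surface those validation errors. The `fetch_final_shifts` helper and the base URL are hypothetical (the URL mirrors the example script below); the HTTP 400 behavior and the example error messages match the rules above.

```python
import requests

# Assumed base URL, matching the example script further down this page.
API_URL = "https://oncall-prod-us-central-0.grafana.net/oncall"


def fetch_final_shifts(schedule_id, token, start_date, end_date):
    """Hypothetical helper: fetch a schedule's final shifts, raising if the query parameters are rejected."""
    response = requests.get(
        f"{API_URL}/api/v1/schedules/{schedule_id}/final_shifts",
        params={"start_date": start_date, "end_date": end_date},
        headers={"Authorization": token},
    )
    if response.status_code == 400:
        # The response body is a plain string, e.g. "end_date is required" or
        # "start_date must be less than or equal to end_date"
        raise ValueError(response.json())
    response.raise_for_status()
    return response.json()["results"]
```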

## Example script to transform data to .csv for all of your schedules

The following Python script will generate a `.csv` file, `oncall-report-2023-01-01-to-2023-01-31.csv`. This file will
contain two columns, `user_pk` and `hours_on_call`, which represent how many hours each user was on call during the
period from January 1, 2023 through January 31, 2023 (inclusive).

```python
import collections
import csv
import requests
from datetime import datetime

# CUSTOMIZE THE FOLLOWING VARIABLES
START_DATE = "2023-01-01"
END_DATE = "2023-01-31"
OUTPUT_FILE_NAME = f"oncall-report-{START_DATE}-to-{END_DATE}.csv"
MY_ONCALL_API_BASE_URL = "https://oncall-prod-us-central-0.grafana.net/oncall/api/v1/schedules"
MY_ONCALL_API_KEY = "meowmeowwoofwoof"

headers = {"Authorization": MY_ONCALL_API_KEY}
schedule_ids = [schedule["id"] for schedule in requests.get(MY_ONCALL_API_BASE_URL, headers=headers).json()["results"]]
user_on_call_hours = collections.defaultdict(int)

for schedule_id in schedule_ids:
    response = requests.get(
        f"{MY_ONCALL_API_BASE_URL}/{schedule_id}/final_shifts?start_date={START_DATE}&end_date={END_DATE}",
        headers=headers,
    )

    for final_shift in response.json()["results"]:
        # replace the trailing "Z" so datetime.fromisoformat can parse the UTC timestamps on Python < 3.11
        end = datetime.fromisoformat(final_shift["shift_end"].replace("Z", "+00:00"))
        start = datetime.fromisoformat(final_shift["shift_start"].replace("Z", "+00:00"))
        shift_time_in_seconds = (end - start).total_seconds()
        user_on_call_hours[final_shift["user_pk"]] += shift_time_in_seconds / (60 * 60)

with open(OUTPUT_FILE_NAME, "w", newline="") as fp:
    csv_writer = csv.DictWriter(fp, ["user_pk", "hours_on_call"])
    csv_writer.writeheader()

    for user_pk, hours_on_call in user_on_call_hours.items():
        csv_writer.writerow({"user_pk": user_pk, "hours_on_call": hours_on_call})
```
113 changes: 113 additions & 0 deletions engine/apps/public_api/tests/test_schedules.py
@@ -1,3 +1,4 @@
import collections
from unittest.mock import patch

import pytest
@@ -781,3 +782,115 @@ def test_create_ical_schedule_without_ical_url(make_organization_and_user_with_t
    }
    response = client.post(url, data=data, format="json", HTTP_AUTHORIZATION=f"{token}")
    assert response.status_code == status.HTTP_400_BAD_REQUEST


@pytest.mark.django_db
def test_oncall_shifts_request_validation(
    make_organization_and_user_with_token,
    make_schedule,
):
    organization, _, token = make_organization_and_user_with_token()
    ical_schedule = make_schedule(organization, schedule_class=OnCallScheduleICal)
    terraform_schedule = make_schedule(organization, schedule_class=OnCallScheduleCalendar)
    web_schedule = make_schedule(organization, schedule_class=OnCallScheduleWeb)

    schedule_type_validation_msg = "OnCall shifts exports are currently only available for web calendars"
    valid_date_msg = "is not a valid date, must be in any valid ISO 8601 format"

    client = APIClient()

    def _make_request(schedule, query_params=""):
        url = reverse("api-public:schedules-final-shifts", kwargs={"pk": schedule.public_primary_key})
        return client.get(f"{url}{query_params}", format="json", HTTP_AUTHORIZATION=token)

    # only web schedules are allowed for now
    response = _make_request(ical_schedule)
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == schedule_type_validation_msg

    response = _make_request(terraform_schedule)
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == schedule_type_validation_msg

    # query param validation
    response = _make_request(web_schedule, "?start_date=2021-01-01")
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == "end_date is required"

    response = _make_request(web_schedule, "?start_date=asdfasdf")
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == f"start_date {valid_date_msg}"

    response = _make_request(web_schedule, "?end_date=2021-01-01")
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == "start_date is required"

    response = _make_request(web_schedule, "?start_date=2021-01-01&end_date=asdfasdf")
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == f"end_date {valid_date_msg}"

    response = _make_request(web_schedule, "?end_date=2021-01-01&start_date=2022-01-01")
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == "start_date must be less than or equal to end_date"

    response = _make_request(web_schedule, "?end_date=2021-01-01&start_date=2019-12-31")
    assert response.status_code == status.HTTP_400_BAD_REQUEST
    assert response.data == "The difference between start_date and end_date must be less than one year (365 days)"


@pytest.mark.django_db
def test_oncall_shifts_export(
    make_organization_and_user_with_token,
    make_user,
    make_schedule,
    make_on_call_shift,
):
    organization, user1, token = make_organization_and_user_with_token()
    user2 = make_user(organization=organization)
    schedule = make_schedule(organization, schedule_class=OnCallScheduleWeb)

    start_date = timezone.datetime(2023, 1, 1, 9, 0, 0)
    make_on_call_shift(
        organization=organization,
        schedule=schedule,
        shift_type=CustomOnCallShift.TYPE_ROLLING_USERS_EVENT,
        frequency=CustomOnCallShift.FREQUENCY_DAILY,
        priority_level=1,
        interval=1,
        by_day=["MO", "WE", "FR"],
        start=start_date,
        until=start_date + timezone.timedelta(days=28),
        rolling_users=[{user1.pk: user1.public_primary_key}, {user2.pk: user2.public_primary_key}],
        rotation_start=start_date,
        duration=timezone.timedelta(hours=8),
    )

    client = APIClient()

    url = reverse("api-public:schedules-final-shifts", kwargs={"pk": schedule.public_primary_key})
    response = client.get(f"{url}?start_date=2023-01-01&end_date=2023-02-01", format="json", HTTP_AUTHORIZATION=token)
    response_json = response.json()
    shifts = response_json["results"]

    total_time_on_call = collections.defaultdict(int)
    for row in shifts:
        end = timezone.datetime.fromisoformat(row["shift_end"])
        start = timezone.datetime.fromisoformat(row["shift_start"])
        shift_time_in_seconds = (end - start).total_seconds()
        total_time_on_call[row["user_pk"]] += shift_time_in_seconds / (60 * 60)

    assert response.status_code == status.HTTP_200_OK

    # 3 shifts per week x 4 weeks x 8 hours per shift = 96 / 2 users = 48h per user for this period
    expected_time_on_call = 48
    assert total_time_on_call[user1.public_primary_key] == expected_time_on_call
    assert total_time_on_call[user2.public_primary_key] == expected_time_on_call

    # pagination parameters are mocked out for now
    assert response_json["next"] is None
    assert response_json["previous"] is None
    assert response_json["count"] == len(shifts)

94 changes: 94 additions & 0 deletions engine/apps/public_api/views/schedules.py
@@ -1,3 +1,6 @@
import logging
from datetime import date

from django_filters import rest_framework as filters
from rest_framework import status
from rest_framework.decorators import action
@@ -12,13 +15,16 @@
from apps.public_api.throttlers.user_throttle import UserThrottle
from apps.schedules.ical_utils import ical_export_from_schedule
from apps.schedules.models import OnCallSchedule, OnCallScheduleWeb
from apps.schedules.models.on_call_schedule import ScheduleEvents, ScheduleFinalShifts
from apps.slack.tasks import update_slack_user_group_for_schedules
from common.api_helpers.exceptions import BadRequest
from common.api_helpers.filters import ByTeamFilter
from common.api_helpers.mixins import RateLimitHeadersMixin, UpdateSerializerMixin
from common.api_helpers.paginators import FiftyPageSizePaginator
from common.insight_log import EntityEvent, write_resource_insight_log

logger = logging.getLogger(__name__)


class OnCallScheduleChannelView(RateLimitHeadersMixin, UpdateSerializerMixin, ModelViewSet):
    authentication_classes = (ApiTokenAuthentication,)
@@ -120,3 +126,91 @@ def export(self, request, pk):
        # Not using existing get_object method because it requires access to the organization user attribute
        export = ical_export_from_schedule(self.request.auth.schedule)
        return Response(export, status=status.HTTP_200_OK)

    @action(methods=["get"], detail=True)
    def final_shifts(self, request, pk):
        schedule = self.get_object()

        if not isinstance(schedule, OnCallScheduleWeb):
            return Response(
                "OnCall shifts exports are currently only available for web calendars",
                status=status.HTTP_400_BAD_REQUEST,
            )

        start_date_field_name = "start_date"
        end_date_field_name = "end_date"

        def _field_is_required(field_name):
            return Response(f"{field_name} is required", status=status.HTTP_400_BAD_REQUEST)

        def _field_is_invalid_date(field_name):
            return Response(
                f"{field_name} is not a valid date, must be in any valid ISO 8601 format",
                status=status.HTTP_400_BAD_REQUEST,
            )

        def _convert_date(value):
            if not value:
                return None

            try:
                return date.fromisoformat(value)
            except ValueError:
                return None

        start_date_str = request.query_params.get(start_date_field_name, None)
        start_date = _convert_date(start_date_str)

        end_date_str = request.query_params.get(end_date_field_name, None)
        end_date = _convert_date(end_date_str)

        if start_date_str is None:
            return _field_is_required(start_date_field_name)
        elif start_date is None:
            return _field_is_invalid_date(start_date_field_name)
        elif end_date_str is None:
            return _field_is_required(end_date_field_name)
        elif end_date is None:
            return _field_is_invalid_date(end_date_field_name)

        if start_date > end_date:
            return Response(
                f"{start_date_field_name} must be less than or equal to {end_date_field_name}",
                status=status.HTTP_400_BAD_REQUEST,
            )

        days_between_start_and_end = (end_date - start_date).days
        if days_between_start_and_end > 365:
            return Response(
                f"The difference between {start_date_field_name} and {end_date_field_name} must be less than one year (365 days)",
                status=status.HTTP_400_BAD_REQUEST,
            )

        final_schedule_events: ScheduleEvents = schedule.final_events("UTC", start_date, days_between_start_and_end)

        logger.info(
            f"Exporting oncall shifts for schedule {pk} between dates {start_date_str} and {end_date_str}. {len(final_schedule_events)} shift events were found."
        )

        data: ScheduleFinalShifts = [
            {
                "user_pk": user["pk"],
                "shift_start": event["start"],
                "shift_end": event["end"],
            }
            for event in final_schedule_events
            for user in event["users"]
        ]

        # right now we'll "mock out" the pagination related parameters (next and previous)
        # rather than use a Pagination class from drf (as currently it operates on querysets). We've decided on this
        # to make this response schema consistent with the rest of the public API + make it easy to add pagination
        # here in the future (should we decide to migrate "final_shifts" to an actual model)
        return Response(
            {
                "count": len(data),
                "next": None,
                "previous": None,
                "results": data,
            }
        )