
feat: add unit tests for error states in critical workflows #1118

Merged: 40 commits into develop from frontend-tests-error-states on Apr 27, 2021

Conversation

@zwliew (Contributor) commented Apr 9, 2021

Problem

#1110 added tests for the happy paths in critical workflows. It would be a good idea to add tests for the error paths as well to cover our bases.

Workflows:

  1. Creating and sending a new email campaign
  2. Creating and sending a new SMS campaign
  3. Creating and sending a new Telegram campaign
  4. Creating and sending a new protected email campaign
  5. Viewing a protected email message
  6. Unsubscribing from a mailing list

Partially resolves issue #1079

Solution

Features:

  • Added unit tests for various error states that occur during campaign creation

Before & After Screenshots

There are no visual changes.

Tests

These tests have been verified on Travis and Amplify.

  1. Deploy the change on Travis and Amplify.
  2. All of the following tests should run and pass (a representative sketch of one such error-state test follows this list):
  • SMS
    • Saving an invalid template (empty body after sanitization)
    • Attempting to validate an invalid credential
    • Uploading an invalid CSV file
  • Telegram
    • Saving an invalid template (empty body after sanitization)
    • Attempting to validate an invalid credential
    • Uploading an invalid CSV file
  • Email
    • Saving an invalid template (empty subject after sanitization)
    • Uploading an invalid CSV file
  • Protected Email
    • Saving an invalid template (empty subject after sanitization) (same as email)
    • Saving an invalid template (extraneous subject params)
    • Saving an invalid template (missing body params)
    • Saving an invalid template (extraneous body params)
    • Viewing a protected email with an invalid ID
    • Viewing a protected email with an invalid password
    • Uploading an invalid CSV file
  • Write unit tests for the following misc non-error cases (not a priority; simply written for practice):
    • SMS
      • Displays the necessary elements when rendered
      • Next button is disabled when template is empty
      • Next button is enabled when template is filled
      • Character count text reflects the actual number of characters in the textbox
    • Telegram
      • Displays the necessary elements when rendered
      • Next button is disabled when template is empty
      • Next button is enabled when template is filled
    • Email
      • Displays the necessary elements when rendered
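
For reference, here is a minimal sketch of what one of these error-state tests might look like, assuming React Testing Library plus a mock request server (renderTemplatePage and mockApis are helpers from this PR's test files; the import path, field roles, and matched error text below are illustrative assumptions, not the PR's exact code):

```ts
import { screen } from '@testing-library/react'
import userEvent from '@testing-library/user-event'
// server, mockApis, and renderTemplatePage are this repo's test helpers;
// the import path here is hypothetical.
import { server, mockApis, renderTemplatePage } from '../test-utils'

test('displays an error if the subject is empty after sanitization', async () => {
  // Silence the expected console.error caused by submitting an invalid template
  jest.spyOn(console, 'error').mockImplementation(() => {})
  // Make the mocked campaign APIs reject the submission
  server.use(...mockApis(false))
  renderTemplatePage()

  // A subject made up of markup only is sanitized down to an empty string
  userEvent.paste(screen.getByRole('textbox', { name: /subject/i }), '<p></p>')
  userEvent.click(screen.getByRole('button', { name: /next/i }))

  // The page should surface a validation error instead of advancing to the next step
  expect(await screen.findByText(/error/i)).toBeInTheDocument()

  jest.restoreAllMocks()
})
```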

Deploy Notes

The aforementioned tests have been added and should be run during every deployment on Travis and Amplify. If any test fails, the deployment should not go through.

Specifically, Amplify should not publish the new build if any test fails.

@zwliew zwliew changed the base branch from develop to frontend-tests-dashboard April 9, 2021 03:54
@zwliew zwliew added the frontend Frontend label Apr 9, 2021
@zwliew zwliew force-pushed the frontend-tests-dashboard branch 2 times, most recently from 44ec28f to 8931c6d on April 12, 2021 09:59
@zwliew zwliew force-pushed the frontend-tests-error-states branch 3 times, most recently from da76289 to 93f1f0b on April 13, 2021 04:57
@zwliew zwliew marked this pull request as ready for review April 13, 2021 05:00
@zwliew zwliew changed the title from "feat: add unit tests for campaign creation error states" to "feat: add unit tests for error states in critical workflows" Apr 19, 2021
@zwliew zwliew changed the base branch from frontend-tests-dashboard to develop April 19, 2021 09:12
@zwliew zwliew force-pushed the frontend-tests-error-states branch from 93f1f0b to 43cc2ed on April 19, 2021 10:12
@zwliew zwliew changed the base branch from develop to frontend-tests-dashboard April 19, 2021 10:12
@zwliew (Contributor, Author) commented Apr 19, 2021

I've rebased the branch on top of frontend-tests-dashboard.

@miazima (Contributor) left a comment

Here are some suggestions on scenarios we could add; these can be done in a later PR too.

EmailRecipient

  • test for invalid file/valid file scenario

SMSCredential

  • test with invalid phone number like 80000000

Comment on lines +93 to +104
```ts
// Setup
jest.spyOn(console, 'error').mockImplementation(() => {
  // Do nothing. Mock console.error to silence expected errors
  // due to submitting invalid templates to the API
})
server.use(...mockApis(false))
renderTemplatePage()
```
@miazima (Contributor) commented Apr 20, 2021

Possibly the setup and teardown in these tests could be extracted to Jest before/after hooks.
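
For illustration, a rough sketch of that suggestion, assuming the mock server is MSW and reusing the helpers from the snippet above (whether this fits depends on how much the per-test setup actually differs, as discussed below):

```ts
describe('template page error states', () => {
  beforeEach(() => {
    // Silence the expected console.error output from invalid template submissions
    jest.spyOn(console, 'error').mockImplementation(() => {})
    // Make the mocked APIs respond with errors
    server.use(...mockApis(false))
    renderTemplatePage()
  })

  afterEach(() => {
    // Restore console.error and remove the per-test request handlers
    jest.restoreAllMocks()
    server.resetHandlers()
  })

  // ...individual error-state tests go here, without repeating the setup
})
```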

@zwliew (Contributor, Author):

I think it's a bit difficult to do so, since most tests have slightly different setup/teardown phases. For example, some tests mock console.error and some use protected email campaigns.

@miazima (Contributor):

Alright. As for mocking console.error: if you want to do this globally, you can register a setup file in your jest.config.ts; this is something the backend tests currently do to silence console.log messages.
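
A minimal sketch of that approach, assuming a setupFilesAfterEnv entry in the Jest config (the file names here are hypothetical):

```ts
// jest.config.ts -- only the relevant field is shown; file names are hypothetical
export default {
  setupFilesAfterEnv: ['<rootDir>/jest.setup.ts'],
}

// jest.setup.ts -- runs before every test file
beforeAll(() => {
  // Silence console.log (or console.error) globally across the test suite
  jest.spyOn(console, 'log').mockImplementation(() => {})
})

afterAll(() => {
  jest.restoreAllMocks()
})
```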

@zwliew (Contributor, Author):

Good point. I considered doing so, but decided against mocking console.error globally. When fixing test failures, I found that console.error statements provided useful information.

@miazima (Contributor):

Sure, the backend tests currently do that only for console.log, to avoid overcrowding the Travis log output.

```ts
  expect(screen.getByRole('button', { name: /next/i })).toBeInTheDocument()
})

test('displays an error if the subject is empty after sanitization', async () => {
```
@miazima (Contributor):

We could also test for the error messages that are output when invalid templates such as {{test}, {{test 123}}, or {{}} are provided.
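
A hedged sketch of how those cases could be parameterized with test.each (the expected error messages are placeholders, and this assumes the mocked template API is extended to reject each of these inputs with a matching message):

```ts
// Imports and helpers (server, mockApis, renderTemplatePage) as in the earlier sketch
test.each([
  ['{{test}', /unclosed keyword/i],
  ['{{test 123}}', /invalid keyword/i],
  ['{{}}', /empty keyword/i],
])('displays an error when the template contains %s', async (template, expectedError) => {
  server.use(...mockApis(false)) // assumes the mock rejects these templates with matching messages
  renderTemplatePage()

  // paste avoids user-event's special handling of '{' characters
  userEvent.paste(screen.getByRole('textbox'), template)
  userEvent.click(screen.getByRole('button', { name: /next/i }))

  expect(await screen.findByText(expectedError)).toBeInTheDocument()
})
```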

@miazima (Contributor):

Do you want to add these in? I think the reason I gave these examples is that they each output a different error message, but if the error messages are generated on the backend and not the frontend, then we can address this in the backend tests instead.

@zwliew (Contributor, Author) commented Apr 27, 2021

I looked into them initially, but realised that I would need to add some extra cases to the API endpoint mocks to handle them, so I put that on hold 😅

Now that you mention it, having these in the backend tests sounds like a good idea as well! If so, I guess we can postpone adding frontend tests for those cases until later?

```ts
  expect(screen.getByText(/characters/i)).toBeInTheDocument()
})

test('next button is disabled when template is empty', async () => {
```
@miazima (Contributor):

For tests like these, which are common across the channels, is there a way we could share and reuse them?

@zwliew (Contributor, Author) commented Apr 21, 2021

Good point. I could create a common test for these two components, but I think a cleaner option might be to extract the common code in SMSTemplate and TelegramTemplate into a separate component and write a test for it.

I'll explore both options.
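
As a rough sketch of the first option, a shared suite of tests could be parameterized by a render function and called from each channel's test file (all names here are hypothetical):

```ts
import { screen } from '@testing-library/react'
import userEvent from '@testing-library/user-event'

// templateTests.ts -- hypothetical shared suite, parameterized by channel
export function itBehavesLikeATemplateEditor(renderPage: () => void) {
  test('next button is disabled when template is empty', () => {
    renderPage()
    expect(screen.getByRole('button', { name: /next/i })).toBeDisabled()
  })

  test('next button is enabled when template is filled', () => {
    renderPage()
    // paste avoids user-event's special handling of '{' characters
    userEvent.paste(screen.getByRole('textbox'), 'Hello {{recipient}}')
    expect(screen.getByRole('button', { name: /next/i })).toBeEnabled()
  })
}

// SMSTemplate.test.tsx -- hypothetical usage
describe('SMSTemplate', () => {
  itBehavesLikeATemplateEditor(renderSMSTemplatePage)
})

// TelegramTemplate.test.tsx -- hypothetical usage
describe('TelegramTemplate', () => {
  itBehavesLikeATemplateEditor(renderTelegramTemplatePage)
})
```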

@zwliew (Contributor, Author):

I've attempted to extract the common code to a BodyTemplate component here: #1148

@miazima (Contributor):

Maybe this can go in first, and then we can refactor the tests once the other PR is approved.

@zwliew (Contributor, Author):

Ok, sounds good!

Base automatically changed from frontend-tests-dashboard to develop April 26, 2021 04:39
zwliew added 14 commits April 26, 2021 15:21
Make sure to set CI=true explicitly as Amplify doesn't set it by
default. CI=true will run all the tests exactly once and exit.

Also compile translations before running the tests.
Just a simple render test to ensure that the test infrastructure works fine.
This makes more sense as compilation errors are more important than test
errors. This also matches the Amplify build configuration.
Also refactor common API endpoints shared with the Email campaign test
into a separate array.
Also add assertions for testing the credential dropdown for SMS
campaigns.
…paign

Previously, we were selecting the SMS channel button, not the email
button.
@zwliew zwliew force-pushed the frontend-tests-error-states branch from e560206 to b986412 on April 26, 2021 07:26
@zwliew (Contributor, Author) commented Apr 26, 2021

Sorry for the erased commit history -- I've rebased the branch on top of develop. The latest changes start from commit 74b94ea, "feat: mock API responses for invalid CSV files".

@miazima miazima self-requested a review April 27, 2021 04:57
@miazima miazima merged commit 201957c into develop Apr 27, 2021
@miazima miazima deleted the frontend-tests-error-states branch April 27, 2021 04:58
lamkeewei added a commit that referenced this pull request May 5, 2021
* develop:
  refactor: use shared function to initialize models (#1172)
  chore: setup scaffolding for backend tests (#940)
  1.23.0
  fix(frontend): fix frontend test flakiness (#1162)
  fix: update successful delivery status only if error does not exist (#1150)
  chore: upgrade dependencies (#1153)
  feat: add unit tests for error states in critical workflows (#1118)
  feat: support whitelisting domains through `agencies` table (#1141)
  feat: add tests for happy paths in critical workflows (#1110)
  fix: prevent campaign names from causing dashboard rows to overflow (#1147)
  fix(email): Fix SendGrid fallback integration (#1026)
lamkeewei added a commit that referenced this pull request May 5, 2021
* develop:
  feat: refactor msg template components; add telegram character limit (#1148)
  refactor: use shared function to initialize models (#1172)
  chore: setup scaffolding for backend tests (#940)
  1.23.0
  fix(frontend): fix frontend test flakiness (#1162)
  fix: update successful delivery status only if error does not exist (#1150)
  chore: upgrade dependencies (#1153)
  feat: add unit tests for error states in critical workflows (#1118)
  feat: support whitelisting domains through `agencies` table (#1141)
  feat: add tests for happy paths in critical workflows (#1110)
  fix: prevent campaign names from causing dashboard rows to overflow (#1147)