
add tests for panics found by @Koral77 (branch 0.4.x) #1091

Closed

Conversation

jtmoon79 (Contributor)

Issue #1010

@jtmoon79 jtmoon79 changed the title add tests for panics found by @Koral77 add tests for panics found by @Koral77 (branch 0.4.x) May 28, 2023
@jtmoon79 jtmoon79 marked this pull request as ready for review May 28, 2023 06:34
pitdicker (Collaborator)

Why do some of these tests still panic when the underlying issues are fixed?

jtmoon79 (Contributor, Author) commented May 28, 2023

Why do some of these tests still panic when the underlying issues are fixed?

🤷

That's why it's good to add issue-specific tests, so the issue's behavior is definitively known. As the code is fixed, the #[should_panic] attributes should be removed.
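A minimal sketch of that pattern, with a hypothetical `parse_month` standing in for the chrono code under test (the real tests exercise chrono APIs; this stand-in only illustrates the `#[should_panic]` workflow):

```rust
// Hypothetical stand-in for a buggy code path: it panics on
// out-of-range input instead of rejecting it gracefully.
fn parse_month(m: u32) -> u32 {
    assert!((1..=12).contains(&m), "month out of range: {}", m);
    m
}

// Issue-specific regression test. While the bug is open, #[should_panic]
// documents the current (bad) behavior; once the panic is fixed, the
// attribute is removed and the test asserts the correct result instead.
#[test]
#[should_panic]
fn test_issue_1010_panic_example() {
    parse_month(4_294_967_295); // u32::MAX, the kind of input the fuzzer found
}
```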

pitdicker (Collaborator)

  • test_issue_1010_panic7 should panic (it returns None); a month value of 4294967295 doesn't make sense.
  • test_issue_1010_panic6 and test_issue_1010_panic5 have the same issue; a day or ordinal value of 4294967295 doesn't make sense.
  • test_issue_1010_panic8 can't do anything except panic on out-of-range years.
  • test_issue_1010_panic4 used to panic because the input contains Unicode characters, but now it just fails because the format string doesn't match.

I have added tests for these issues when fixing the panics. Maybe best to not add these invalid cases?
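The convention described in the list above — nonsense components yield `None` rather than a panic — can be sketched with toy validators (chrono's real API here is `NaiveDate::from_ymd_opt` and friends; these stand-ins are hypothetical and only avoid the dependency):

```rust
// Toy stand-ins for Option-returning constructors: invalid components
// are rejected with None instead of panicking.
fn month_opt(m: u32) -> Option<u32> {
    (1..=12).contains(&m).then_some(m)
}

fn ordinal_opt(d: u32) -> Option<u32> {
    (1..=366).contains(&d).then_some(d)
}
```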

pitdicker (Collaborator)

With the panics fixed in #1093, test_issue_1010_panic1, test_issue_1010_panic2, and test_issue_1010_panic3 still panic because of the is_ok() check, even though they exercise inputs that should result in an error.
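In other words, once the library change turns the panic into a plain error, a test that still asserts `is_ok()` fails on the assertion itself. A hypothetical sketch (the `parse` function is a stand-in, not chrono's API):

```rust
// Stand-in parser: after a fix, it returns Err instead of panicking.
fn parse(s: &str) -> Result<u32, String> {
    s.parse::<u32>().map_err(|e| e.to_string())
}

// Old shape: panics on the assertion, because the input is *supposed* to fail.
#[test]
#[should_panic]
fn test_old_shape() {
    assert!(parse("not a number").is_ok());
}

// Fixed shape: assert the error the operation should produce.
#[test]
fn test_fixed_shape() {
    assert!(parse("not a number").is_err());
}
```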

@jtmoon79 jtmoon79 force-pushed the Issue1010_panics_tests branch from 857d1e0 to 8650193 Compare May 28, 2023 23:31
jtmoon79 (Contributor, Author) commented May 28, 2023

Good review, @pitdicker. Indeed, I was hasty in throwing in the code from the examples. I updated the cases to be more sensible. What do you think?

These tests are somewhat redundant. However, I'd like to move toward a project habit of adding useful test cases per issue. It gives the project higher "quality confidence". Hopefully @djc agrees.
See my rationale explained in PR #1097
... but maybe this particular issue isn't the best example of such a practice, I'm not sure.

djc (Member) commented May 30, 2023

I'm not interested in adding "negative" tests (that test that some behavior still panics).

Honestly, I'm not a big fan of the current (proposed) practice for doing issue tests. For one thing, it's annoying to name tests by their issue number instead of giving them a somewhat descriptive (while hopefully still concise) name for the behavior they're testing, because it makes it harder to determine what the test is trying to cover; instead, add a comment linking to the issue. Also, we should keep tests closer to the code they cover (typically in a unit test, not in an integration test) rather than keeping a separate suite of "issue" tests.
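A sketch of that convention — a descriptive test name plus an issue link in a comment, kept in a unit-test module next to the code (the names and the `month_opt` validator are hypothetical, not chrono's actual code):

```rust
// Hypothetical validator under test.
fn month_opt(m: u32) -> Option<u32> {
    (1..=12).contains(&m).then_some(m)
}

#[cfg(test)]
mod tests {
    use super::*;

    // Regression test for https://github.com/chronotope/chrono/issues/1010:
    // out-of-range month values must be rejected, not panic.
    #[test]
    fn out_of_range_month_returns_none() {
        assert_eq!(month_opt(4_294_967_295), None);
    }
}
```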

Also, while I think it's somewhat useful to add tests to cover issues that apparently went undetected for a long time, the more important thing we should probably do is set up coverage (using cargo-llvm-cov reporting into codecov.io) to track that new tests actually increase coverage.
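One way to wire that up is a CI step like the following (a sketch of a GitHub Actions fragment; the workflow layout and action versions are assumptions, not the project's actual CI):

```yaml
# Sketch: coverage via cargo-llvm-cov, uploaded to codecov.io.
- name: Install cargo-llvm-cov
  uses: taiki-e/install-action@cargo-llvm-cov
- name: Generate coverage report
  run: cargo llvm-cov --all-features --codecov --output-path codecov.json
- name: Upload to codecov.io
  uses: codecov/codecov-action@v3
  with:
    files: codecov.json
```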

@jtmoon79 jtmoon79 closed this Jun 4, 2023
@jtmoon79 jtmoon79 deleted the Issue1010_panics_tests branch June 4, 2023 18:20