show the name of the skipped test #11
Conversation
Thank you for submitting this PR! I will need to test this in detail, but I won't have time to do that before the first week of October. In the meantime I am leaving some comments in the code.
pytest_dependency.py (Outdated)
@@ -6,9 +6,11 @@
"""

import pytest
# from blessings import Terminal
It does not really make sense to me to keep the commented-out import here. I'd rather remove it entirely. The same remark applies to the other commented-out code below.
In principle, you could put the import in a try statement and fall back to not using blessings on ImportError. On the other hand, I'm not sure it is a good idea to meddle with pytest's coloring scheme in the first place, which is why I'd rather drop it.
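For illustration, a minimal sketch of that fallback; the `_term` and `_maybe_red` names are hypothetical and not part of the plugin:

```python
# Hypothetical sketch of the suggested ImportError fallback: use blessings
# for coloring if it is installed, otherwise pass text through unchanged.
try:
    from blessings import Terminal
    _term = Terminal()
except ImportError:
    _term = None

def _maybe_red(text):
    """Return text colored red when blessings is available, unchanged otherwise."""
    return _term.red(text) if _term else text
```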
You're right. I commented out the code only because the blessings dependency wasn't in setup.py and I didn't want to alter the project too much.
The colors are not really a big deal, especially given the point about not altering pytest's default scheme, so let's forget them. Thanks for pointing that out.
pytest_dependency.py (Outdated)
for i in depends:
    if not (i in self.results and self.results[i].isSuccess()):
        pytest.skip("depends on %s" % i)
        msg = "%s depends on %s (failed)" % (item.name, i)
That is claiming too much. If isSuccess() yields False, that could mean the dependency has not been run at all, has been skipped, or has failed. You would need to check which case applies before claiming it was a failure.
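To make the concern concrete, here is a rough sketch of a message that distinguishes "not run" from "did not succeed" without claiming a failure. It reuses the names from the diff above (self.results, isSuccess(), item.name), but it is only an illustration, not the code that was merged:

```python
# Illustrative sketch only: word the skip reason without asserting that the
# dependency failed, since "not successful" may also mean skipped or not run.
for i in depends:
    if i not in self.results:
        pytest.skip("%s depends on %s, which has not been run" % (item.name, i))
    elif not self.results[i].isSuccess():
        # We only know it did not succeed; it may have been skipped rather than failed.
        pytest.skip("%s depends on %s, which did not succeed" % (item.name, i))
```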
The initial approach was to show the test names involved when something wasn't right, so I have simplified the skip reason message and removed the commented-out code.
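Something along these lines (the exact wording in the PR may differ):

```python
# Sketch of the simplified skip reason: show both test names without
# claiming a specific cause for the unsatisfied dependency.
for i in depends:
    if not (i in self.results and self.results[i].isSuccess()):
        pytest.skip("%s depends on %s" % (item.name, i))
```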
The code looks good to me now. As I said, I still need to test it. I'll get back as soon as I get around to doing that.
OK, I had a look at it, and it works. I still see some issues with the skip messages, but they are more involved to fix. I don't want to defer merging this PR any longer, so I will deal with them in a separate issue. I took the liberty of adding a test and some documentation for this. @asteriogonzalez, I hope you don't mind me pushing directly into your repository; this way I can add my changes to the PR and keep everything in one place.
I would like you to consider adding this minor change, which shows the name of the test that is skipped because of a failed dependency.