
show the name of the skipped test #11

Merged: 7 commits into RKrahl:master on Oct 3, 2017

Conversation

asteriogonzalez (Contributor)

I would like you to consider adding this minor change, which shows the name of the test that is skipped due to a failed dependency.

RKrahl (Owner) commented Sep 24, 2017

Thank you for submitting this PR! I will need to test this in detail, but I won't have time to do that before the first week of October. In the meantime, I'll leave some comments in the code.

@@ -6,9 +6,11 @@
"""

import pytest
# from blessings import Terminal
RKrahl (Owner)
It does not really make sense to me to keep the commented-out import here. I'd rather remove it entirely. The same remark applies to the other commented-out code below.

In principle, you could put the import in a try statement and fall back to not using blessings on ImportError. On the other hand, I'm not sure it is really a good idea to meddle with pytest's coloring scheme in the first place, which is why I'd rather drop it.
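
For illustration only, a minimal sketch of the try/ImportError fallback described above, assuming blessings' Terminal API; the _term and _colored names are made up for this example and are not part of the plugin or the PR:

# Hypothetical sketch: use blessings when it is installed, plain text otherwise.
try:
    from blessings import Terminal
    _term = Terminal()
except ImportError:
    _term = None

def _colored(text):
    # Color the message only when blessings could be imported.
    return _term.red(text) if _term is not None else text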

asteriogonzalez (Contributor, Author)

You're right. I commented out the code just because the blessings dependency wasn't in setup.py, and I didn't want to alter the project too much.

The colors are not really a big deal, especially if it means altering pytest's default scheme, so let's forget it. Thanks for pointing it out.

for i in depends:
    if not(i in self.results and self.results[i].isSuccess()):
        pytest.skip("depends on %s" % i)
        msg = "%s depends on %s (failed)" % (item.name, i)
RKrahl (Owner)

That is claiming too much. If isSuccess() yields False, it could mean the dependency has not been run at all, has been skipped, or has failed. You would need to check which case applies before claiming it was a failure.

asteriogonzalez (Contributor, Author)

The initial approach was to show the names of the tests involved when something wasn't right, so I have simplified the skip reason message and removed the commented-out code.
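
A rough sketch of what the simplified check might look like after this review, keeping a neutral wording that does not claim a failure; the exact message in the merged commit may differ:

for i in depends:
    if not (i in self.results and self.results[i].isSuccess()):
        # Name both tests, but stay neutral about why the dependency
        # did not succeed (not run, skipped, or failed).
        pytest.skip("%s depends on %s" % (item.name, i))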

RKrahl (Owner) commented Sep 26, 2017

The code looks good to me now. As I said, I still need to test it. I'll get back to you as soon as I get around to it.

RKrahl added the enhancement (New feature or request) label on Sep 26, 2017
RKrahl changed the title from "show the name of the failed test and add some color to the output" to "show the name of the skipped test" on Sep 30, 2017
RKrahl added this to the 0.3 milestone on Oct 3, 2017
RKrahl (Owner) commented Oct 3, 2017

Ok, I had a look at it; it works.

I still see some issues with the skip messages, but they are more involved to fix. I don't want to defer merging this PR any longer, so I will deal with them in a separate issue.

I took the liberty of adding a test and some documentation for this. @asteriogonzalez, I hope you don't mind me pushing directly into your repository; this way I can add my changes to the PR and keep everything in one place.
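
For reference, a minimal example of the behavior under discussion, using pytest-dependency's documented markers; with this change the skip reason names the dependent test instead of only the dependency (the exact wording here is illustrative):

import pytest

@pytest.mark.dependency()
def test_a():
    assert False    # fails, so tests depending on it get skipped

@pytest.mark.dependency(depends=["test_a"])
def test_b():
    pass            # skipped with a reason that now includes "test_b"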

RKrahl merged commit fb246e2 into RKrahl:master on Oct 3, 2017