Fix file recursion overflow problems #795
Conversation
In the original issue, you said "We might have to build a HashSet every time and ignore files that are already in it." I think we should have a HashSet so that we don't end up creating …
My only real concern with the …
For a …
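For illustration, here is a minimal sketch of the HashSet idea being discussed above: remember each directory path as it is enumerated and skip any path that has already been seen. It is not code from this PR; the type and member names are hypothetical, and the case-insensitive comparer is an assumption that suits Windows paths.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

internal static class VisitedDirectoryExample
{
    // Enumerate files while remembering which directories were already walked,
    // so a directory reached twice under the same path string is skipped.
    public static IEnumerable<string> EnumerateFilesOnce(string root)
    {
        // Assumption: ordinal case-insensitive comparison, which suits Windows paths.
        var visited = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var pending = new Queue<string>();
        pending.Enqueue(root);

        while (pending.Count > 0)
        {
            string dir = pending.Dequeue();

            // HashSet<T>.Add returns false when the item is already present.
            if (!visited.Add(dir))
            {
                continue;
            }

            foreach (string file in Directory.EnumerateFiles(dir))
            {
                yield return file;
            }

            foreach (string subDir in Directory.EnumerateDirectories(dir))
            {
                pending.Enqueue(subDir);
            }
        }
    }
}
```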
LGTM! Just one suggestion (that may or may not be misguided 🙂)
@rjmholt + @SeeminglyScience you both made good points on the HashMap approach. It could be too expensive.
this LGTM
Oh oh oh, so I now see the problem with the HashSet method -- the same directory deeper down will be given back as a deeper path from …
We couldn't just compare the paths. We'd have to do symlink inspection, and that would be costly.
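As a hedged illustration of why raw path comparison falls short: a symlinked directory is reported under its link path rather than its target, so a set of raw path strings never detects the repeat, and canonicalizing each path costs an extra filesystem call per directory, which is the expense being weighed here. The sketch below uses `Directory.ResolveLinkTarget`, which only exists in .NET 6 and later and may not have been available to this code base, so it is purely illustrative.

```csharp
using System.IO;

internal static class SymlinkCheckExample
{
    // Resolve a directory to a canonical path for comparison.
    // Directory.ResolveLinkTarget requires .NET 6 or later; it returns null
    // when the path is not a link. Each call is an extra filesystem hit,
    // which is the cost being discussed above.
    public static string CanonicalizeForComparison(string path)
    {
        FileSystemInfo target = Directory.ResolveLinkTarget(path, returnFinalTarget: true);
        return target?.FullName ?? Path.GetFullPath(path);
    }
}
```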
We also need to do something about the BuildInfo file, like @rkeithhill suggested.
This seems to be the best suggestion I've seen.
LGTM. Thanks for the logging changes!
Aims to fix PowerShell/vscode-powershell#1613.
Rather than using method recursion to walk the directory tree, we now use a while loop and an explicit stack, and we stop descending at a depth of 64.
It's a naive solution to the problem, but I didn't want to add sophisticated, expensive checks like building a hash map of paths or testing for symlinks, in case they caused a performance hit (although the cost might not matter much).
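A minimal sketch of the approach described above (not the actual PR code): an explicit stack replaces method recursion, and directories beyond a fixed depth are simply not pushed, so a pathological tree such as a symlink cycle can no longer overflow the call stack. The `*.ps1` filter and the placement of the depth constant are illustrative assumptions.

```csharp
using System.Collections.Generic;
using System.IO;

internal static class DepthLimitedEnumeration
{
    private const int MaxDepth = 64;

    // Walk the workspace with an explicit stack instead of method recursion,
    // and stop descending once a directory sits more than MaxDepth levels deep.
    public static IEnumerable<string> EnumerateFiles(string workspaceRoot)
    {
        var pending = new Stack<(string Path, int Depth)>();
        pending.Push((workspaceRoot, 0));

        while (pending.Count > 0)
        {
            (string dir, int depth) = pending.Pop();

            // Assumption: only PowerShell scripts are of interest here.
            foreach (string file in Directory.EnumerateFiles(dir, "*.ps1"))
            {
                yield return file;
            }

            // Don't push children past the depth cap; this bounds the walk
            // even when a symlink cycle makes the tree effectively infinite.
            if (depth >= MaxDepth)
            {
                continue;
            }

            foreach (string subDir in Directory.EnumerateDirectories(dir))
            {
                pending.Push((subDir, depth + 1));
            }
        }
    }
}
```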
I also added a new logging method for recording exceptions that aren't really fatal.
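A hypothetical sketch of the kind of logging helper described here: it records an exception that was caught and handled, at a level that flags the problem without implying a crash. The interface and method names are invented for illustration and are not the project's actual API.

```csharp
using System;

// Both the interface and the extension method below are invented for illustration.
internal interface IEditorLogger
{
    void Write(string level, string message);
}

internal static class HandledExceptionLogging
{
    // Record an exception that was caught and recovered from, at a level
    // that flags it without implying the whole operation failed.
    public static void WriteHandledException(this IEditorLogger logger, string message, Exception e)
    {
        logger.Write("Warning", $"{message}{Environment.NewLine}{e}");
    }
}
```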