
Speed up processing of new files in daemon by caching ASTs #10128

Merged: 10 commits merged into master on Feb 22, 2021

Conversation

@JukkaL (Collaborator) commented Feb 22, 2021

Processing newly installed stub files, in particular, could be quite slow incrementally
in mypy daemon. This is because adding N files results in N steps internally, each of
which adds one file. However, each step parses all remaining files, resulting in
an O(n**2) algorithm.

For example, processing six stubs could take about 40s (when not using a
compiled mypy).

Partially address the issue by caching parsed ASTs during a single increment.
This speeds up the import six use case by about 3x when not using a compiled
mypy. It's still about 3x slower when using daemon, however.
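The idea above can be sketched in a few lines. This is not mypy's actual implementation; `ASTCache`, its methods, and the use of the stdlib `ast` module are illustrative stand-ins for caching parsed trees for the duration of a single increment so that later steps reuse them instead of re-parsing.

```python
# Minimal sketch (not mypy's real code) of caching parsed ASTs across the
# N internal steps of one incremental update.
import ast
from typing import Dict


class ASTCache:
    """Cache parsed ASTs for the duration of a single increment."""

    def __init__(self) -> None:
        self._cache: Dict[str, ast.AST] = {}

    def parse(self, module_id: str, source: str) -> ast.AST:
        # Reuse a previously parsed tree; parsing only happens on a cache miss,
        # turning N-steps-times-N-parses into roughly N parses total.
        if module_id not in self._cache:
            self._cache[module_id] = ast.parse(source)
        return self._cache[module_id]

    def clear(self) -> None:
        # Called once the increment finishes, so stale trees are not reused.
        self._cache.clear()


cache = ASTCache()
tree1 = cache.parse("six", "x = 1")
tree2 = cache.parse("six", "x = 1")
assert tree1 is tree2  # the second call hits the cache; no re-parse
```

The cache is intentionally scoped to one increment (hence `clear()`): a longer-lived cache would have to invalidate entries when files change on disk.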

mypy/build.py Outdated
@@ -1994,8 +2003,14 @@ def parse_file(self) -> None:
return

manager = self.manager

# Can we reuse a previously parsed AST? This avoids redundant work in daemon.
cached = self.id in manager.ast_cache and True
Member
Why do you need and True? Leftover from debugging?

@JukkaL (Collaborator, Author)

Yeah, I'll remove it.
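For reference, a hedged sketch of why the trailing `and True` is redundant: Python's `in` operator already yields a bool, so the membership test alone suffices. The `ast_cache` dict below is illustrative only, not mypy's real `manager.ast_cache` contents.

```python
# Illustrative stand-in for manager.ast_cache: module id -> (tree, errors).
ast_cache = {"six": (object(), [])}

module_id = "six"
# "x in d" is already a bool, so "... and True" adds nothing.
cached = module_id in ast_cache
assert cached is True
assert cached == (module_id in ast_cache and True)  # identical result
```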

@JukkaL JukkaL merged commit 4827f3a into master Feb 22, 2021
@JukkaL JukkaL deleted the cache-ast branch February 22, 2021 18:34