There are certain situations where we rapidly construct and destroy, and then have to reconstruct, the same token stack when hitting bad routes. These cases are rare, but when they do occur they cause a major slowdown, potentially hitting the parse cycle limit. Ideally, when a route fails, we would store the tokenization of complete nodes in some kind of cache that associates them with their starting/ending head locations and the context. If we reach that head location again with the same context, we can pop the cached entry and bypass regular parsing.
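A minimal sketch of what such a cache could look like, assuming hypothetical names (`ParseCache`, `CacheKey`, `CachedNode` are illustrative, not existing types in this codebase) and simplified token/context representations:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CacheKey:
    start: int     # head location where the completed node began
    context: str   # parse context active at that location


@dataclass
class CachedNode:
    end: int              # head location just past the node
    tokens: tuple         # tokenization of the completed node


class ParseCache:
    """Stores tokenizations of complete nodes salvaged from failed routes."""

    def __init__(self):
        self._entries: dict[CacheKey, CachedNode] = {}

    def store(self, start: int, context: str, end: int, tokens) -> None:
        # Called when failing a route: keep any completed nodes for reuse.
        self._entries[CacheKey(start, context)] = CachedNode(end, tuple(tokens))

    def pop(self, start: int, context: str):
        # Called when the head returns to `start` under the same context;
        # returns and removes the cached node, or None to parse normally.
        return self._entries.pop(CacheKey(start, context), None)
```

The key point of the design is that a hit requires both the same head location and the same context, so a cached node is only reused where its original tokenization is still valid; a miss simply falls through to regular parsing.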