Question About LRU Cache Key Implementation #503
Comments
@HsinHeng have you tested what kind of performance degradation the hashing is introducing?
I created a simple script with 3 cases:
The benchmark on my Mac:
As far as I know, crypto is CPU-intensive, so the result is expected. On the other hand, we care about security and token collisions.
Anyway, thank you for your time.
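For reference, below is a minimal sketch of the kind of comparison script described above; the original script and its numbers are not shown in this thread, so the three cases (sha256, md5, raw token) and the iteration count are assumptions for illustration.

```js
'use strict'

const { createHash } = require('node:crypto')

// A token-sized sample string; real JWTs are typically a few hundred bytes.
const token = 'header.payload.signature'.repeat(20)
const ITERATIONS = 1e6

function bench (label, fn) {
  const start = process.hrtime.bigint()
  for (let i = 0; i < ITERATIONS; i++) fn()
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6
  console.log(`${label}: ${elapsedMs.toFixed(1)} ms for ${ITERATIONS} iterations`)
}

// Case 1 and 2: hash the token before using it as a cache key.
bench('sha256(token)', () => createHash('sha256').update(token).digest('hex'))
bench('md5(token)', () => createHash('md5').update(token).digest('hex'))
// Case 3: use the raw token as the key (identity, no hashing).
bench('raw token (identity)', () => token)
```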
Thanks for the detailed feedback, much appreciated! OK, so based on your benchmarks above, it does look like getting rid of the hashing would improve performance, so I'm happy to do that. We discussed it offline with @ilteoood, and what we propose is to allow overriding the cache key generation algorithm while still defaulting to the existing one, primarily to avoid breaking changes and to stay conservative in terms of security and collisions. Would that work for you? Basically, if we do this, the way you'd skip the hashing would be to pass the identity function to the cache key generator option.
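As a rough illustration of that proposal (the option name cacheKeyBuilder and the wrapper shape below are assumptions for illustration, not the library's confirmed API), an overridable key generator with a hashed default could look like this:

```js
'use strict'

const { createHash } = require('node:crypto')

// Default: hash the token before using it as the cache key.
const defaultCacheKeyBuilder = token =>
  createHash('sha256').update(token).digest('hex')

function createVerifierCache ({ cacheSize = 1000, cacheKeyBuilder = defaultCacheKeyBuilder } = {}) {
  const cache = new Map() // stand-in for the real LRU implementation

  return {
    get (token) {
      return cache.get(cacheKeyBuilder(token))
    },
    set (token, payload) {
      if (cache.size >= cacheSize) {
        // Evict the oldest entry (Map preserves insertion order).
        cache.delete(cache.keys().next().value)
      }
      cache.set(cacheKeyBuilder(token), payload)
    }
  }
}

// Skipping the hashing entirely: pass the identity function as the key builder.
const noHashCache = createVerifierCache({ cacheKeyBuilder: token => token })
```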
Sometimes I implement JWT verification in Node.js, or in a gateway like Kong Ingress/NGINX Plus, depending on the product. Actually, I am looking for a decode cache 😆 (decoding a token takes at least two JSON.parse calls). But I am inspired by your lib, it's awesome.
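For illustration only, a decode cache of the kind mentioned here could be a plain Map keyed by the raw token, so the base64/JSON.parse work runs only on a cache miss; the function below is a hypothetical helper, not part of the library's API.

```js
'use strict'

const decodeCache = new Map()

function decodeCached (token) {
  let payload = decodeCache.get(token)
  if (payload === undefined) {
    // A JWT is header.payload.signature; decode only the payload segment here.
    const segment = token.split('.')[1]
    payload = JSON.parse(Buffer.from(segment, 'base64url').toString('utf8'))
    decodeCache.set(token, payload)
  }
  return payload
}
```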
🎉 This issue has been resolved in version 4.0.6 🎉
The release is available on:
Your optic bot 📦🚀
Hi all, thank you for contributing this awesome lib.
I have found that the cache uses hash(token) as the key, rather than the token itself. hash(token) adds a little time complexity, so can we use the token directly as the key?