
Asynchronous cache without cache loader #246

Closed
mklamra opened this issue May 18, 2018 · 14 comments

@mklamra

mklamra commented May 18, 2018

@ben-manes Is there any way to create an asynchronous cache without a cache loader, i.e. a manual asynchronous cache? Caffeine.newBuilder().build() returns a synchronous Cache without a loader, but there is no corresponding async method. The only way to create a manual asynchronous cache now seems to be to call Caffeine.newBuilder().buildAsync with a dummy cache loader that always throws an exception.

@ben-manes
Owner

Not yet. I do plan on adding that in my next sprint of energy.

For now you can use a dummy loader in the form of,

Caffeine.newBuilder().buildAsync(key -> null);

and avoid calling get(key) which would invoke the loader.
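
For reference, a minimal sketch of that workaround, assuming a string-keyed cache (the class name and the expensiveLoad placeholder are illustrative, not from this issue):

import java.util.concurrent.CompletableFuture;

import com.github.benmanes.caffeine.cache.AsyncLoadingCache;
import com.github.benmanes.caffeine.cache.Caffeine;

class ManualAsyncWorkaround {
  // Dummy loader that is never exercised; only the explicit per-call overloads of get are used.
  private final AsyncLoadingCache<String, String> cache =
      Caffeine.newBuilder().buildAsync(key -> null);

  CompletableFuture<String> lookup(String key) {
    // Supply the computation per call instead of relying on the (dummy) loader.
    return cache.get(key, (k, executor) ->
        CompletableFuture.supplyAsync(() -> expensiveLoad(k), executor));
  }

  private String expensiveLoad(String key) {
    return "value-for-" + key;  // stand-in for a real remote call
  }
}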

I wasn't sure if users would have good reasons to not use a cache loader, so I delayed adding a manual AsyncCache until later. However, I did add all the necessary methods to AsyncLoadingCache to make it easy to extract later and not block users. There have been enough users with good rationales that I think it makes sense to add now.

My initial concern was that most users of Cache use the racy getIfPresent, compute, put idiom. Ideally they would use a loading get, but many don't consider the concurrency of a cache stampede. An async cache is more explicitly concurrent, so I more strongly promoted delegating the loading to the cache. My intent was to introduce the API once I better understood why users might want it.
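
To make the contrast concrete, a hedged sketch of the two idioms (class and helper names are illustrative):

import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

class StampedeExample {
  private final Cache<String, String> cache = Caffeine.newBuilder().build();

  // Racy idiom: two threads can both miss and both compute the value (a cache stampede).
  String racyGet(String key) {
    String value = cache.getIfPresent(key);
    if (value == null) {
      value = expensiveLoad(key);
      cache.put(key, value);
    }
    return value;
  }

  // Loading get: concurrent callers for the same key share a single in-flight computation.
  String loadingGet(String key) {
    return cache.get(key, this::expensiveLoad);
  }

  private String expensiveLoad(String key) {
    return "value-for-" + key;  // stand-in for the expensive work
  }
}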

@mklamra
Author

mklamra commented May 18, 2018

Thanks for the explanation. I will use the dummy loader solution for now. I guess you can close this issue unless you want to keep it to track the future change.

@ben-manes
Owner

I'll leave it open and close when I add this feature.

Also if you can explain your scenario, it would be nice to catalog as a reference.

@alexeyOnGitHub

@mklamra is your use-case similar to #243 ?

@mklamra
Author

mklamra commented May 18, 2018

@alexeyOnGitHub My use case is the same as yours in #243. I have an AsyncLoadingCache that caches data from another service. The cache is keyed on user ID. However, the service that I call requires that I provide a request context. The request context is provided to the method that calls the cache. The request context will always be the same for a given user ID. I could therefore use a cache loader and include the request context in the cache key, but that's a really hacky solution. As a workaround, I create the AsyncLoadingCache with a dummy cache loader that always throws an exception and instead provide a loading function when retrieving items from the cache:

cache.get(userId, (key, executor) -> client.get(userId, requestContext));

To avoid the dummy loader, there would need to be a version of Caffeine.newBuilder().buildAsync that did not require a loader.
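
For context, with a loader-free buildAsync (which 2.7 later shipped, as noted further down in this thread) the dummy loader disappears; this sketch reuses the thread's client and requestContext as placeholders and assumes a Data value type:

AsyncCache<String, Data> cache = Caffeine.newBuilder().buildAsync();

// The request context is captured by the per-call mapping function,
// so it never needs to become part of the cache key.
CompletableFuture<Data> result =
    cache.get(userId, (key, executor) -> client.get(key, requestContext));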

@osklyar

osklyar commented May 27, 2018

Thanks @ben-manes!

@yufei-cai

@ben-manes Any plans to include AsyncCache in a release?

My use case is a sliding window of events, say caching all events fired in the last 5 minutes. Cache loading isn't meaningful there.
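
A rough sketch of that shape with the requested manual async cache (the Event type, getId accessor, and the five-minute window are illustrative):

AsyncCache<String, Event> recent = Caffeine.newBuilder()
    .expireAfterWrite(5, TimeUnit.MINUTES)
    .buildAsync();

// Events are recorded as they arrive; there is nothing sensible for a loader to compute.
recent.put(event.getId(), CompletableFuture.completedFuture(event));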

@ben-manes
Owner

Sorry, I'm pretty behind at the moment. Work and a family medical emergency have caused a lot of chaos over the last few months.

I have been working on adding a Map<K, CompletableFuture<V>> view to the async caches, which is 80-90% done; the remainder is finishing the test cases. It's a little quirky on statistics, e.g. computeIfAbsent(key, k -> future) should most likely record the loadTime as the future's. So I think I need to iterate on the stats, or release with stats that are good enough and fix them later, since they aren't rigidly spec'd. If I can get some time to wrap that up, then I should be able to catch up on the other backlog tasks quickly and cut a minor release.
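
For illustration, the kind of view being described, as a hedged sketch assuming an asMap() accessor on the async cache (load is a placeholder):

AsyncCache<String, String> cache = Caffeine.newBuilder().recordStats().buildAsync();
ConcurrentMap<String, CompletableFuture<String>> map = cache.asMap();

// A future inserted directly through the view; the statistics quirk above is about
// how the load time for this entry should be attributed.
map.computeIfAbsent("key", k -> CompletableFuture.supplyAsync(() -> load(k)));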

@ben-manes
Owner

Released 2.7

@mklamra
Author

mklamra commented Feb 26, 2019

Great! Thanks a lot @ben-manes.

@martinariehm

We need async caches with manual loading as well; however, our loading methods are asynchronous. With Caffeine 2.6.x we used synchronous Caches with manual loading, where the value type was a CompletableFuture. This works fine, except for one edge case: if the CompletableFuture completes exceptionally at some point, it is still stored as a valid entry in the cache, since the creation of the CompletableFuture did not throw an exception.
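
For reference, the clean-up that edge case forces in the 2.6.x pattern looks roughly like this (a sketch, not code from this thread; loadAsync is a placeholder):

Cache<String, CompletableFuture<String>> cache = Caffeine.newBuilder().build();

CompletableFuture<String> future = cache.get(key, k -> loadAsync(k));

// Creating the future succeeded, so a later exceptional completion has to be evicted by hand.
future.whenComplete((value, error) -> {
  if (error != null) {
    cache.asMap().remove(key, future);
  }
});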

Is there now, with the 2.7 manual AsyncCache, a way to have those values automatically invalidated, or do we still need to do this manually?

@ben-manes
Owner

Yes, async caches will remove entries when the future completes exceptionally or has a null result value. You could emulate a manual AsyncCache in 2.6.x by using a dummy cache loader, as all of the desired methods were on AsyncLoadingCache. Thanks to the feedback asking for this cache type, that workaround is no longer needed, so please do try 2.7 and let us know if you run into any problems.
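
By contrast, a minimal sketch of the same flow on the 2.7 manual AsyncCache (remoteCall is a placeholder):

AsyncCache<String, String> cache = Caffeine.newBuilder().buildAsync();

CompletableFuture<String> future = cache.get(key, (k, executor) ->
    CompletableFuture.supplyAsync(() -> remoteCall(k), executor));

// If the future completes exceptionally (or with a null value), the cache discards the entry,
// so a later get re-runs the computation instead of replaying the failure.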

@laurentvaills

Thanks for this addition @ben-manes. We had the same use case as @mklamra.

@martinariehm

Seems to work fine, thank you!
