
Request: Some way to determine if future::Cache::get_with() returned cached or fresh value #136

Closed
lilyball opened this issue May 24, 2022 · 4 comments · Fixed by #193
Labels
enhancement New feature or request
Milestone

Comments

@lilyball

I want to know if future::Cache::get_with() returned a cached or a fresh value. This is important for logging purposes. I imagine there are workarounds involving heap-allocating a mutable boolean, setting it in the future I pass to the function, and testing it when the function returns, but that's very awkward. It would be great if the cache could simply tell me whether the value was fresh or cached.

@tatsuya6502 tatsuya6502 added the enhancement New feature or request label May 26, 2022
@tatsuya6502 tatsuya6502 added this to the v0.8.6 milestone May 26, 2022
@lilyball
Author

It occurs to me that the future given to get_with doesn't have a 'static bound, so I can probably just keep a boolean on the stack. That certainly makes it easier; it would just be more elegant if moka could tell me directly.
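For illustration, the stack-boolean workaround can be sketched as below. `MockCache` is a hypothetical synchronous stand-in written for this example; it is not moka's API. The real `future::Cache::get_with` is async, but since the future it takes has no `'static` bound, the same trick applies there.

```rust
use std::cell::Cell;
use std::collections::HashMap;

// Minimal synchronous stand-in for moka's `future::Cache::get_with`,
// used only to illustrate the stack-boolean pattern.
struct MockCache {
    map: HashMap<String, String>,
}

impl MockCache {
    // Runs `init` only when `key` is absent, mimicking `get_with`.
    fn get_with(&mut self, key: String, init: impl FnOnce() -> String) -> String {
        self.map.entry(key).or_insert_with(init).clone()
    }
}

// Returns (first_call_was_fresh, second_call_was_fresh).
fn run() -> (bool, bool) {
    let mut cache = MockCache { map: HashMap::new() };
    let was_fresh = Cell::new(false); // plain boolean on the caller's stack

    let _v = cache.get_with("k".into(), || {
        was_fresh.set(true); // runs only when the value is computed fresh
        "value".into()
    });
    let first = was_fresh.get();

    was_fresh.set(false);
    let _v = cache.get_with("k".into(), || {
        was_fresh.set(true); // skipped: the value is already cached
        "value".into()
    });
    (first, was_fresh.get())
}

fn main() {
    let (first, second) = run();
    println!("first call fresh: {first}, second call fresh: {second}");
}
```

The `Cell<bool>` lets the init closure flip the flag through a shared reference, so no heap allocation is needed.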

@tatsuya6502
Member

tatsuya6502 commented May 26, 2022

It would be great if the cache could simply tell me whether it was fresh or cached.

Technically, this will be very easy to implement. get_with internally tracks whether the value was fresh or cached (code), so it would be easy to return that information as part of the return value. For example, get_with could return a tuple (V, bool) where the bool indicates whether V was fresh. Or it could return a struct with value: V and is_fresh: bool as its fields.

The only problem is that we cannot change get_with's return type from V, as it seems many users are already using it and are happy with it. So (if we take this path) we will need to add another method that works like get_with but has a different return type. And then we will need to think about the other methods, try_get_with and get_with_if: do we need to do the same thing to them?

I am very open to adding such method(s), but I would like to keep the core Cache API small. So we will probably do something like what @ben-manes, the Caffeine author, suggested before (comment):

(In Caffeine) A lot of random functionality, like conditional entry removal or policy inspection, is hidden away under nested objects (asMap(), policy()) to provide the features while keeping the core interface simple to reduce the conceptual weight for learning a new library.

Perhaps something like this?

```rust
// We will keep the current `get_with` as is.
let val = cache.get_with(key, async {...}).await;

// For accessing advanced features, use a new `entry` method,
// which mimics `HashMap`'s `entry` method.
let result = cache.entry(key).or_insert_with(async {...}).await;

// `result` is a struct providing methods to access its `bool` and `V` fields.
if result.is_fresh() {
    info!("Fresh value");
}
let val = result.into_value(); // This consumes `self` and returns `V`.
```

If we can come up with a nice interface, I would also hide the unpopular get_with_if behind it.
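The result type proposed above could be sketched roughly as follows. This is a hypothetical illustration based on the names in the snippet (`is_fresh`, `into_value`), not a confirmed moka API; `Entry::new` is invented here for the demo.

```rust
// Hypothetical sketch of the result type that
// `cache.entry(key).or_insert_with(...)` might return.
pub struct Entry<V> {
    value: V,
    is_fresh: bool,
}

impl<V> Entry<V> {
    // Constructor invented for this sketch.
    pub fn new(value: V, is_fresh: bool) -> Self {
        Self { value, is_fresh }
    }

    /// `true` when the value was computed by this call
    /// rather than served from the cache.
    pub fn is_fresh(&self) -> bool {
        self.is_fresh
    }

    /// Consumes the entry and returns the value.
    pub fn into_value(self) -> V {
        self.value
    }
}

fn main() {
    let result = Entry::new("computed".to_string(), true);
    if result.is_fresh() {
        println!("Fresh value");
    }
    let _val = result.into_value(); // consumes `result`, yields the `V`
}
```

Keeping `get_with` returning a bare `V` while routing the richer result through `entry(...)` preserves backward compatibility, matching the constraint discussed above.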

@ben-manes

Ideally the caller doesn't need to know this for business logic purposes, and we shouldn't corrupt our APIs solely for debug logging. The main reason would be for stats, which in Caffeine has a pluggable StatsCounter for the cases it records. The APIs can't be perfect, so the goal has to be low conceptual weight, correctness / maintainability, and allowing users to solve their core problems. If the latter sometimes has to be a bit of a dirty one-off, then that is a reasonable decision. (See bumper-sticker API design.)

Caffeine has this same problem with the backing ConcurrentHashMap and its compute methods, and similarly with some integrations into others' APIs (most recently Coherence, which requires additional callbacks). A mutable variable that the thread populates during a compute is very inexpensive, short-lived, and an isolated implementation detail. Given all the unique benefits these caches provide, the small code smell seems fine to me.

When native support is not possible, or would be a much worse solution, I think stashing these ad hoc methods somewhere else (e.g. cache.policy()) is a good tradeoff. It is less discoverable, but it unblocks those who ask and need it, and serves as an escape hatch from trying to have a universal API. While I wouldn't add it to Caffeine as not carrying its weight, each language / ecosystem is different, and I am curious to see how @tatsuya6502 evolves his cache to glean his own insights.

@tatsuya6502 tatsuya6502 modified the milestones: v0.8.6, v0.9.1 Jun 27, 2022
@tatsuya6502 tatsuya6502 modified the milestones: v0.9.1, v0.9.2, v0.9.3 Jul 16, 2022
@tatsuya6502 tatsuya6502 modified the milestones: v0.9.3, Backlog Jul 23, 2022
@tatsuya6502 tatsuya6502 modified the milestones: Backlog, v0.10.0 Nov 2, 2022
@tatsuya6502
Member

I am working on this for v0.10.0 release via #193.

@tatsuya6502 tatsuya6502 linked a pull request Nov 2, 2022 that will close this issue