From bfa3fed034ce5a35ee47b470df7215fa138cdc14 Mon Sep 17 00:00:00 2001
From: Bryan Donovan
Date: Tue, 26 Jan 2016 12:40:33 -0800
Subject: [PATCH] README update

---
 README.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index f4a902b..6664559 100644
--- a/README.md
+++ b/README.md
@@ -71,10 +71,16 @@ function getCachedUser(id, cb) {
 Second, node-cache-manager features a built-in memory cache (using [node-lru-cache](https://github.com/isaacs/node-lru-cache)),
 with the standard functions you'd expect in most caches:
 
-    set(key, val, ttl, cb)
+    set(key, val, {ttl: ttl}, cb) // * see note below
     get(key, cb)
     del(key, cb)
 
+    // * Note that depending on the underlying store, you may be able to pass the
+    // ttl as the third param, like this:
+    set(key, val, ttl, cb)
+    // ... or pass no ttl at all:
+    set(key, val, cb)
+
 Third, node-cache-manager lets you set up a tiered cache strategy. This may
 be of limited use in most cases, but imagine a scenario where you expect tons
 of traffic, and don't want to hit your primary cache (like Redis) for every request.
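The patch documents that `set()` may receive its ttl as `{ttl: ttl}`, as a bare number, or not at all, depending on the underlying store. The sketch below is an illustrative stand-in (not node-cache-manager's actual implementation) showing one way a store could normalize those three call shapes; the in-memory backing object and helper logic are assumptions for the example only.

```javascript
// Minimal sketch of the argument-juggling the patch describes: a store-level
// set() accepting (key, val, {ttl: n}, cb), (key, val, n, cb), or (key, val, cb).
// Illustrative only -- not node-cache-manager's real code.
var store = {};

function set(key, val, options, cb) {
  var ttl;
  if (typeof options === 'function') {
    // No ttl passed at all: the third arg is actually the callback.
    cb = options;
  } else if (typeof options === 'number') {
    // ttl passed as a bare number (seconds).
    ttl = options;
  } else if (options && typeof options === 'object') {
    // ttl passed in an options object, as the README's primary form shows.
    ttl = options.ttl;
  }
  store[key] = {
    val: val,
    expires: typeof ttl === 'number' ? Date.now() + ttl * 1000 : Infinity
  };
  if (cb) { cb(null); }
}

function get(key, cb) {
  var entry = store[key];
  var val = entry && Date.now() < entry.expires ? entry.val : undefined;
  cb(null, val);
}

function del(key, cb) {
  delete store[key];
  if (cb) { cb(null); }
}

// All three set() shapes from the patched README work against this sketch:
set('a', 1, {ttl: 60}, function () {});
set('b', 2, 60, function () {});
set('c', 3, function () {});
get('a', function (err, v) { console.log(v); }); // → 1
```

Normalizing on `typeof` like this is why a store can tolerate callers written against either signature: the callback is simply whichever trailing argument is a function.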