@std/cache
This @std package is experimental and its API may change without a major version bump.
Overview
In-memory cache utilities, such as memoization and caches with different expiration policies.
```ts
import { memoize, LruCache, type MemoizationCacheResult } from "@std/cache";
import { assertEquals } from "@std/assert";

const cache = new LruCache<string, MemoizationCacheResult<bigint>>(1000);

// Fibonacci function, which is very slow for n > ~30 if not memoized
const fib = memoize((n: bigint): bigint => {
  return n <= 2n ? 1n : fib(n - 1n) + fib(n - 2n);
}, { cache });

assertEquals(fib(100n), 354224848179261915075n);
```
Add to your project
```sh
deno add jsr:@std/cache
```
See all symbols in @std/cache on JSR.
What is caching?
Caching stores recently used or expensive-to-compute data in memory so you can reuse it faster next time instead of recomputing or refetching. It is helpful when calculations are slow, network calls are frequent, or you access the same data repeatedly.
Memoization is caching for function results: given the same inputs, return the previously computed output. It is great for pure functions (no side effects) where the result depends only on the inputs.
Why use @std/cache?
The package provides a ready-to-use memoize() helper and common cache types so you don't have to rebuild them yourself. It works across Deno, Node.js, Bun, and web workers, and keeps its APIs small and predictable.
LRU
LRU (Least Recently Used) evicts the entry you haven’t used for the longest time. Use it when “recent things are likely needed again.” Good for hot working sets.
TTL
TTL (Time To Live) evicts entries after a fixed duration, regardless of access. Use it when data becomes stale quickly (e.g., config from a service, short-lived tokens).
- Choose eviction by workload: LRU for temporal locality, TTL for freshness, size-based when memory-bound.
- Memoization caches must consider argument identity—objects/functions need stable keys (e.g., serialize or use WeakMap when appropriate).
- Beware of unbounded growth; set sensible limits and measure hit ratios to tune capacity.
- Handle errors/timeouts explicitly—cache only successful results unless failures should be sticky.
Examples
```ts
import { LruCache, memoize, type MemoizationCacheResult } from "@std/cache";

// memoize stores MemoizationCacheResult values, so the cache is typed accordingly.
const cache = new LruCache<string, MemoizationCacheResult<number>>(500);

const fib = memoize(
  (n: number): number => n <= 1 ? n : fib(n - 1) + fib(n - 2),
  { cache },
);
```