keq-cache is a simple alternative to Service Worker caching for projects that cannot enable a Service Worker. There is no need to write cache code by hand; just configure the cache strategy.
```typescript
import { request } from "keq"
import { cache, Strategy, MemoryStorage } from "keq-cache"

const storage = new MemoryStorage()

request
  .use(cache({ storage }))
```
If you invoke `.use(cache({ ... }))` multiple times and want the registrations to share the cache, use the same storage instance for each of them.
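A minimal sketch of that pattern, assuming two modules (the `storage.ts` and `api.ts` file names are illustrative, not part of keq-cache): create the storage once, export it, and import it wherever `cache()` is registered.

```typescript
// storage.ts (hypothetical module) — create one MemoryStorage instance for the whole app
import { MemoryStorage } from "keq-cache"

export const storage = new MemoryStorage()
```

```typescript
// api.ts (hypothetical module) — every cache() registration imports the same instance,
// so responses cached by one registration are visible to the others.
import { request } from "keq"
import { cache } from "keq-cache"
import { storage } from "./storage"

request.use(cache({ storage }))
```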
```typescript
import { request } from "keq"
import { cache, Strategy, MemoryStorage } from "keq-cache"

request
  .use(
    cache({
      storage: new MemoryStorage(),
      rules: [
        {
          pattern: (ctx) => ctx.request.method === "get",
          strategy: Strategy.STALE_WHILE_REVALIDATE,
          ttl: 5 * 60 * 1000,
          key: (ctx) => ctx.request.__url__.href,
          exclude: async (response) => response.status !== 200,
          onNetworkResponse: (response, cachedResponse) => {
            console.log('The network response: ', response)
            console.log('The response that cache hit: ', cachedResponse)
          },
        },
      ],
    })
  )
```
With the above configuration, all GET requests use the StaleWhileRevalidate strategy and cached responses expire after 5 minutes.
You can also override the global configuration when sending a request:
```typescript
import { request } from "keq"
import { cache, Strategy, Eviction } from "keq-cache"

request
  .get("/example")
  .options({
    cache: {
      strategy: Strategy.NETWORK_FIRST,
      key: 'custom-cache-key',
      exclude: async (response) => response.status !== 200,
      ttl: 1000,
    },
  })
```
Name | Default | Description |
---|---|---|
storage | - | See More |
keyFactory | `(context) => context.identifier` | The cache key factory for requests. Requests with the same key share the cache. |
rules.pattern | - | Determines whether a request matches the rule. |
rules.key | - | The cache key factory for requests that match the rule. |
rules.strategy | NetworkFirst | How a response is generated after a fetch is received. See More |
rules.ttl | Infinity | Cache time to live, in milliseconds. |
rules.exclude | - | If it returns true, the request will not be cached. |
rules.onNetworkResponse | undefined | Callback invoked after the network request finishes. |
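As a sketch of the global keyFactory option, using the same `ctx.request` fields the rule examples above rely on, you could key the cache by HTTP method plus full URL so that different methods against the same URL never share an entry:

```typescript
import { request } from "keq"
import { cache, MemoryStorage } from "keq-cache"

request.use(
  cache({
    storage: new MemoryStorage(),
    // Key each cached entry by method plus the full request URL,
    // so e.g. GET and DELETE to the same path never collide.
    keyFactory: (ctx) => `${ctx.request.method} ${ctx.request.__url__.href}`,
  })
)
```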