API Caching
Improve API performance, scalability, and reliability by caching frequent responses at the CDN edge with Medianova’s Advanced API Caching technology.
API Caching reduces latency and server load by temporarily storing API responses at the CDN edge. When the same API endpoint is called repeatedly, the CDN serves the cached response instead of fetching it again from the origin. This approach improves response times, reduces database queries, and enhances overall API stability — especially for high-traffic web or mobile applications.
Medianova’s API Caching supports REST and GraphQL endpoints, with cache lifetimes as short as one second.
Why Use API Caching?
Use API Caching to deliver faster and more resilient API services. It helps you:
Reduce Latency – Serve repetitive API calls instantly from edge caches.
Lower Origin Load – Prevent redundant backend processing and database queries.
Increase Reliability – Handle traffic spikes without degrading performance.
Enhance Security – Protect your origin from DDoS or brute-force attacks by offloading requests to CDN edges.
Optimize Costs – Reduce bandwidth and compute usage through intelligent caching.
Combine API Caching with Dynamic Content Acceleration for optimal performance on hybrid (static + dynamic) workloads.
Key Features
Edge-Level Microcaching – Cache API responses at edge nodes for as little as one second to reduce origin requests.
REST & GraphQL Support – Works seamlessly with modern API architectures and content negotiation formats (JSON, XML).
Automatic Invalidation – Ensures cached responses remain fresh using TTL-based or event-triggered invalidation.
Security Integration – Works with WAF, Rate Limiting, and DDoS Protection to block malicious API traffic.
Analytics & Monitoring – Provides detailed insights into cache hit ratios and latency improvements.
Custom TTL per Endpoint – Configure different caching durations for each API route or resource type.
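The per-endpoint TTL feature amounts to a table mapping API routes to cache lifetimes. A minimal sketch of what such a configuration could look like, assuming prefix-based matching; the route names, TTL values, and function name are illustrative, not Medianova’s actual rule syntax:

```python
# Illustrative per-endpoint TTL table (seconds); not Medianova's actual rule syntax.
ENDPOINT_TTLS = {
    "/products": 3,  # catalog data tolerates a few seconds of staleness
    "/pricing": 2,   # pricing endpoint: short microcache window
    "/search": 1,    # high-traffic, near-real-time results
}
DEFAULT_TTL = 0      # 0 = bypass the cache for unlisted routes


def ttl_for(path: str) -> int:
    """Return the cache lifetime for an API route, defaulting to no caching."""
    for prefix, ttl in ENDPOINT_TTLS.items():
        if path.startswith(prefix):
            return ttl
    return DEFAULT_TTL
```

With this lookup, `ttl_for("/pricing")` yields a 2-second microcache while an unlisted route such as `/auth/login` falls through to the origin on every request.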
How It Works
1. API requests (GET, POST, or GraphQL queries) are sent to the CDN edge.
2. The CDN edge checks whether a valid cached response exists.
3. If found, the CDN serves it immediately — bypassing the origin.
4. If not, the CDN retrieves the data from the origin server, caches the response, and forwards it to the client.
5. Subsequent identical requests are served directly from the cache until the TTL (cache lifetime) expires.
For example, a “product pricing” API endpoint can be cached for 2–3 seconds, allowing thousands of requests to be served from the edge without re-querying the database.
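The lookup/fetch/store flow above can be sketched as a tiny in-memory microcache. This is a simplified model for illustration, not the CDN’s implementation; `fetch_from_origin` stands in for the real origin request:

```python
import time


class EdgeMicrocache:
    """Minimal sketch of the edge lookup/fetch/store flow described above."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # cache key -> (response, expiry timestamp)

    def get(self, key, fetch_from_origin):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry and entry[1] > now:           # valid cached response: serve it
            return entry[0]
        response = fetch_from_origin(key)      # miss or expired: go to the origin
        self._store[key] = (response, now + self.ttl)
        return response


# Example: many identical requests within the TTL hit the origin only once.
origin_calls = 0

def fetch_from_origin(key):
    global origin_calls
    origin_calls += 1
    return {"price": 9.99}

cache = EdgeMicrocache(ttl_seconds=2)
for _ in range(1000):
    cache.get("/pricing", fetch_from_origin)
print(origin_calls)  # prints 1: the origin was queried once, the rest came from cache
```

The same pattern is why even a one-second TTL pays off: under heavy traffic, one origin fetch per second replaces thousands of backend queries.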
Best Practices
Cache read-heavy endpoints (GET requests) that return identical responses.
Avoid caching user-specific or session-based data.
Start with short cache durations (1–5 seconds) and adjust based on traffic analytics.
Use Custom Page Rules to define which endpoints should be excluded from caching (e.g., /checkout, /auth).
Combine API Caching with Rate Limiting for additional protection during high request bursts.
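The exclusion guidance above can be sketched as a simple prefix check layered onto the cache decision. The path prefixes are the examples from the text; the function name and GET-only rule (per the read-heavy-endpoints practice) are illustrative assumptions, not Medianova’s rule engine:

```python
# Routes that should never be cached (user-specific or transactional),
# per the best practices above. Prefixes are illustrative examples.
EXCLUDED_PREFIXES = ("/checkout", "/auth")


def is_cacheable(method: str, path: str) -> bool:
    """Cache only read-heavy GET endpoints, and never the excluded routes."""
    if method != "GET":
        return False
    return not path.startswith(EXCLUDED_PREFIXES)
```

In practice such exclusions are configured as page rules on the CDN rather than in application code, but the decision logic is the same: transactional and session-bound paths always bypass the cache.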