
Developer platform Unkey has written about rebuilding its entire API authentication service from the ground up, moving from serverless Cloudflare Workers to stateful Go servers after re-evaluating the constraints of its serverless architecture. The company states that the move produced a sixfold performance improvement and eliminated workarounds that had come to dominate its engineering effort. Unkey co-founder Andreas Thomas explained that the decision came down to latency: when a service sits in the request path for thousands of applications, "every millisecond matters."

Serverless functions are stateless by design: they spin up, handle a request, and disappear. Any cached data must therefore live elsewhere and be retrieved over the network. Unkey attempted various mitigations, including a multi-tier caching system built on several Cloudflare services, and it optimised cache keys and tuned expiry times, but nothing got around the basic physics of the situation. As Thomas put it, zero network requests are always faster than one network request. No amount of clever caching could match what a stateful server does by default, which is to keep hot data in memory.
Root-cause analysis identified caching latency as the core problem: Cloudflare's cache exceeded 30 milliseconds at the 99th percentile, putting Unkey's sub-10-millisecond response goal out of reach, whereas a stateful server keeps hot data in memory. Statelessness also complicated event batching, since a function that vanishes after each invocation cannot accumulate events across requests. Unkey therefore implemented chproxy, a custom Go proxy that buffers analytics events before forwarding them to its analytics service.
Read at InfoQ