
Redis Deep Dive: Real Engineering Uses Beyond Caching

Introduction

Redis is frequently described as a cache, but production systems use it for much more. It provides fast data structures, atomic operations, and streaming capabilities that enable rate limiting, distributed coordination, and event processing. This post focuses on real engineering use cases with Python.

Use Case 1: Rate Limiting with Sliding Windows

Fixed-window counters can admit up to double the intended rate at window boundaries: a client can exhaust one window's quota just before the reset and the next window's quota just after. A sorted-set approach gives a true sliding window with precise control.
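To see the boundary problem concretely, here is a minimal in-memory fixed-window counter (illustrative only; the class and names are ours, not part of the Redis solution):

```python
import math

RATE = 100
WINDOW_SECONDS = 60


class FixedWindowLimiter:
    """Counts requests per fixed window; the count resets at each boundary."""

    def __init__(self) -> None:
        self.counts: dict[int, int] = {}

    def allow(self, now: float) -> bool:
        window = math.floor(now / WINDOW_SECONDS)
        self.counts[window] = self.counts.get(window, 0) + 1
        return self.counts[window] <= RATE


limiter = FixedWindowLimiter()
# 100 requests just before the boundary, 100 just after: all 200 are
# accepted within a fraction of a second, double the intended rate.
accepted = sum(limiter.allow(59.9) for _ in range(100))
accepted += sum(limiter.allow(60.1) for _ in range(100))
print(accepted)  # 200
```

The sorted-set approach below avoids this by counting events in a rolling 60-second span rather than calendar-aligned windows.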

import time
import uuid

from redis import Redis

redis_client = Redis.from_url("redis://localhost:6379/0")

RATE = 100
WINDOW_SECONDS = 60


def allow_request(user_id: str) -> bool:
    now = time.time()
    key = f"rate:{user_id}"
    pipeline = redis_client.pipeline()
    # Drop events older than the window, record this one under a unique
    # member (bare timestamps can collide), then count what remains.
    pipeline.zremrangebyscore(key, 0, now - WINDOW_SECONDS)
    pipeline.zadd(key, {f"{now}:{uuid.uuid4().hex}": now})
    pipeline.zcard(key)
    pipeline.expire(key, WINDOW_SECONDS)
    _, _, count, _ = pipeline.execute()
    return count <= RATE

Use Case 2: Distributed Locks with Safety Guarantees

A lock is a key set with NX. Use a unique token per holder so a client never releases someone else's lock, and a short TTL so a crashed holder cannot deadlock everyone else.

import uuid

from redis import Redis

redis_client = Redis.from_url("redis://localhost:6379/0")

# Lua check-and-delete: only the holder of the matching token may release.
RELEASE_SCRIPT = """
if redis.call('get', KEYS[1]) == ARGV[1] then
    return redis.call('del', KEYS[1])
end
return 0
"""


def acquire_lock(resource: str, ttl_ms: int = 10_000) -> str | None:
    """Try to take the lock; returns the holder's token, or None if held."""
    token = str(uuid.uuid4())
    # NX: set only if absent. PX: expire after ttl_ms so a crashed holder
    # cannot deadlock other clients.
    success = redis_client.set(resource, token, nx=True, px=ttl_ms)
    return token if success else None


def release_lock(resource: str, token: str) -> bool:
    """Release only if we still hold the lock (token matches)."""
    return redis_client.eval(RELEASE_SCRIPT, 1, resource, token) == 1
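Pairing acquire and release by hand is easy to get wrong. A context manager built on the same SET NX PX plus check-and-delete pattern keeps them together (a sketch; `redis_lock` is our name, and the client is passed in rather than taken from a module global):

```python
import uuid
from contextlib import contextmanager

# Same check-and-delete script as the release step above.
_RELEASE = """
if redis.call('get', KEYS[1]) == ARGV[1] then
    return redis.call('del', KEYS[1])
end
return 0
"""


@contextmanager
def redis_lock(client, resource: str, ttl_ms: int = 10_000):
    token = str(uuid.uuid4())
    if not client.set(resource, token, nx=True, px=ttl_ms):
        raise RuntimeError(f"lock busy: {resource}")
    try:
        yield token
    finally:
        # Runs even if the body raises; deletes only if we still hold it.
        client.eval(_RELEASE, 1, resource, token)
```

Usage is then `with redis_lock(redis_client, "reports:rebuild"): ...`, and the release happens on any exit path.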

Use Case 3: Event Processing with Streams

Streams provide append-only message delivery with per-consumer tracking. For small to medium workloads they can stand in for heavier systems like Kafka; durability is bounded by whatever persistence you configure.

from redis import Redis

redis_client = Redis.from_url("redis://localhost:6379/0")

# Append an entry; Redis assigns a monotonically increasing stream ID.
redis_client.xadd("orders", {"order_id": "ORD-1", "amount": "125.00"})

# "0-0" reads from the start of the stream; block up to 1000 ms if empty.
# Use "$" instead to wait only for entries added after this call.
messages = redis_client.xread({"orders": "0-0"}, count=10, block=1000)
for stream, entries in messages:
    for message_id, data in entries:
        print(message_id, data)  # keys and values arrive as bytes
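For competing workers with explicit acknowledgement, Streams also offer consumer groups. A sketch of the pattern (the group and consumer names are ours, and the client is passed in so the functions stay testable; in practice the except clause would catch redis.exceptions.ResponseError):

```python
def ensure_group(client, stream: str, group: str) -> None:
    # Create the group at the start of the stream; tolerate "already exists".
    try:
        client.xgroup_create(stream, group, id="0", mkstream=True)
    except Exception:
        pass  # BUSYGROUP: the group was created earlier


def consume(client, stream: str, group: str, consumer: str) -> int:
    # ">" asks for entries never delivered to this group before.
    batches = client.xreadgroup(group, consumer, {stream: ">"}, count=10)
    handled = 0
    for _stream, entries in batches:
        for message_id, data in entries:
            # ... process data here ...
            client.xack(stream, group, message_id)  # mark as done
            handled += 1
    return handled
```

Unacknowledged entries stay in the group's pending list, so a worker that crashes mid-batch does not silently lose messages.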

Use Case 4: Cache with Intelligent Invalidation

Cache computed results in hashes or serialized JSON, and pair each logical segment with a version key: bumping the version invalidates the whole segment with a single write, and TTLs on the cached values clean up the orphaned entries.

from redis import Redis

redis_client = Redis.from_url("redis://localhost:6379/0")


def cache_key(segment: str, key: str) -> str:
    # Embed the segment's current version in every cache key.
    version = redis_client.get(f"segment:{segment}:version") or b"1"
    return f"segment:{segment}:{version.decode()}:{key}"


def invalidate_segment(segment: str) -> None:
    # Bumping the version orphans every old key at once; TTLs on the
    # cached values let the orphans expire on their own.
    redis_client.incr(f"segment:{segment}:version")
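The versioning idea extends naturally to a read-through helper (a sketch; `get_or_compute`, `compute_fn`, and the TTL value are our names, and the client is passed in for testability):

```python
import json

CACHE_TTL_SECONDS = 300


def versioned_key(client, segment: str, key: str) -> str:
    version = client.get(f"segment:{segment}:version") or b"1"
    if isinstance(version, bytes):
        version = version.decode()
    return f"segment:{segment}:{version}:{key}"


def get_or_compute(client, segment: str, key: str, compute_fn):
    full_key = versioned_key(client, segment, key)
    cached = client.get(full_key)
    if cached is not None:
        return json.loads(cached)
    value = compute_fn()
    # The TTL ensures keys orphaned by a version bump eventually expire.
    client.set(full_key, json.dumps(value), ex=CACHE_TTL_SECONDS)
    return value
```

After a version bump, the next read misses, recomputes, and writes under the new versioned key; the stale entries simply age out.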

Operational Considerations

  • Configure persistence (RDB snapshots, AOF, or both) based on how much data loss you can tolerate.
  • Monitor memory fragmentation (mem_fragmentation_ratio) and eviction events (evicted_keys) from INFO.
  • Use a maxmemory-policy aligned with the workload (e.g., allkeys-lru for pure caches, noeviction where silent key loss is unacceptable, such as locks and streams).
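A starting point in redis.conf might look like the following (the values are illustrative, not recommendations; tune them to your workload):

```conf
# Evict least-recently-used keys across the whole keyspace when full
maxmemory 2gb
maxmemory-policy allkeys-lru

# RDB snapshot if at least 100 keys changed in the last 300 seconds
save 300 100

# AOF for stronger durability on critical data
appendonly yes
appendfsync everysec
```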

Conclusion

Redis is a multi-purpose data engine. Treat it as a critical system with proper monitoring and runbooks, and it gives you high-performance coordination primitives that are hard to build as reliably yourself.

This post is licensed under CC BY 4.0 by the author.