lru_cache vs singleton in Python — they're not the same thing.

Published: February 12, 2026 at 05:58 PM EST
2 min read
Source: Dev.to


Using @lru_cache for singletons

@lru_cache is often used as a quick way to create a singleton, even though that isn’t its primary purpose. It does have a few benefits:

  • Lazy instantiation – the object isn’t created until it’s actually needed.
  • Less boilerplate – a simple decorator replaces a verbose class definition.
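The pattern in question usually looks like this. A minimal sketch, with `Settings` standing in for any hypothetical dependency you want exactly one of:

```python
from functools import lru_cache

class Settings:
    """Hypothetical dependency; stands in for any object you want one of."""
    def __init__(self) -> None:
        self.db_url = "sqlite:///app.db"

@lru_cache(maxsize=1)
def get_settings() -> Settings:
    # Called lazily on first use; the result is cached, so every later
    # call returns the same object.
    return Settings()

assert get_settings() is get_settings()
```

Because the wrapped function takes no arguments, the cache holds a single entry, which is what makes it behave like a singleton.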

Drawbacks

1. Performance

If the object’s construction is expensive, the first request that triggers lazy creation can suffer a noticeable delay. In a FastAPI application, for example, you might prefer to create such objects eagerly during startup rather than on the first request.
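One way to avoid that first-request penalty is to warm the cache eagerly at startup (in FastAPI, typically from the lifespan handler). A framework-free sketch of the idea, with the expensive construction simulated by a sleep:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=1)
def get_client() -> dict:
    # Hypothetical expensive construction, simulated with a sleep.
    time.sleep(0.05)
    return {"connected": True}

# Eager warm-up at startup: pay the construction cost once,
# before any request arrives.
get_client()

start = time.perf_counter()
client = get_client()  # cache hit; no construction delay
elapsed = time.perf_counter() - start
```

The object is still created through the cached factory, so the rest of the code keeps calling `get_client()` unchanged; only the timing of the first call moves.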

2. Exactly‑once guarantee

The decorator is documented as thread‑safe in terms of internal state — see the official Python docs. However, it does not guarantee that the wrapped factory will be called only once. If two threads invoke the function with the same arguments before the result is cached, two distinct instances can be created. This race condition is usually negligible but becomes more significant for long‑running initialisations.
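If exactly-once construction matters, a lock-protected factory closes that race. A minimal double-checked-locking sketch (an assumed helper written for illustration, not part of the standard library):

```python
import threading

class _Lazy:
    """Exactly-once lazy factory guarded by a lock (illustrative sketch)."""
    def __init__(self, factory):
        self._factory = factory
        self._lock = threading.Lock()
        self._instance = None

    def __call__(self):
        if self._instance is None:          # fast path, no locking
            with self._lock:
                if self._instance is None:  # re-check inside the lock
                    self._instance = self._factory()
        return self._instance

calls = []
get_thing = _Lazy(lambda: calls.append(1) or object())

threads = [threading.Thread(target=get_thing) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Even under concurrent first calls, the factory ran exactly once.
```

Unlike `@lru_cache`, the lock serializes the first construction, so no two threads can both observe an empty cache and build duplicate instances.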

When is it definitely bad to use @lru_cache as a singleton?

  • Multiprocessing scenarios – e.g., when the cached object spawns worker processes (ProcessPoolExecutor). Lazy creation means the first request bears the full cost of spawning, increasing latency and potentially creating multiple pools.
  • Teardown concerns – classic singletons can provide explicit cleanup hooks; @lru_cache has no built‑in mechanism for deterministic termination, which can be problematic for resources that need graceful shutdown.
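By contrast, a classic singleton can expose an explicit shutdown hook. An illustrative sketch, with `ConnectionPool` as a hypothetical resource that needs graceful teardown:

```python
class ConnectionPool:
    """Classic singleton with an explicit shutdown hook (illustrative)."""
    _instance = None

    @classmethod
    def instance(cls) -> "ConnectionPool":
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

    def __init__(self) -> None:
        self.closed = False

    def close(self) -> None:
        # Deterministic teardown point, something @lru_cache does not offer.
        self.closed = True

    @classmethod
    def shutdown(cls) -> None:
        if cls._instance is not None:
            cls._instance.close()
            cls._instance = None

pool = ConnectionPool.instance()
assert pool is ConnectionPool.instance()
ConnectionPool.shutdown()  # application exit: release resources explicitly
```

The `shutdown()` classmethod gives the application a single, well-defined place to release the resource, which is exactly the hook that a cached factory function lacks.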

Is it ever okay to use @lru_cache for singletons?

Yes, with an important caveat: you must accept that, in edge cases, multiple distinct instances might be created. In many situations (say, a duplicate database client that simply goes unused and is garbage-collected) this is harmless. In other contexts, it can lead to subtle bugs.

A good rule of thumb

  • Use @lru_cache(maxsize=1) for lightweight objects where occasional extra instances are harmless.
  • Use a classic singleton pattern (or a lifespan‑managed approach) for objects with a real lifecycle, requiring controlled startup and teardown.
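One practical note for the lightweight case: a function wrapped with `@lru_cache` exposes `cache_clear()`, which at least gives you a manual reset point, handy in test suites:

```python
from functools import lru_cache

@lru_cache(maxsize=1)
def get_config() -> dict:
    return {"debug": False}

first = get_config()
get_config.cache_clear()  # drop the cached instance, e.g. between tests
second = get_config()     # a fresh instance is constructed
```

This is a reset, not a lifecycle: nothing is called on the old instance before it is discarded, so it does not substitute for real teardown.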

Conclusion

As the Zen of Python reminds us:

“Special cases aren’t special enough to break the rules. Although practicality beats purity.”
