LetMeSOThat4U

Thread-safe singleton cache (Python)

In a service I'm developing I'm using three cloud services (five if you count the two additional streaming services I have to create, which are in turn used by those cloud services), and to orchestrate the incoming bits and pieces I need a cache for objects that relate data from all those services. That cache has to be safe to access from various Tornado handlers in an atomic manner.

Now, since Tornado is based on an I/O loop, in theory I shouldn't need locks or other synchronization primitives... but that obviously relies on an implementation detail (the GIL), and it can reportedly cause problems under some circumstances anyway, so to be safe rather than sorry I'm trying to develop a cache that is all of the following:

  • Singleton
  • Thread-safe
  • Creates a default object available under a missing key (roughly collections.defaultdict semantics, as sketched below)
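
By the last point I mean roughly the semantics of collections.defaultdict, i.e. looking up a missing key creates and stores a default object instead of raising KeyError (ServiceState here is just an illustrative placeholder, not a class from my actual service):

from collections import defaultdict

class ServiceState(object):
    def __init__(self):
        self.status = "pending"

cache = defaultdict(ServiceState)

# Accessing a missing key creates and stores a fresh ServiceState.
cache["job-42"].status = "done"
print(cache["job-42"].status)  # -> done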

I'm approaching the subject with some trepidation, as I haven't done much of this kind of work before. This is what I've come up with so far:

import threading


class ThreadSafeSingletonCache(object):
    __me = None
    __cache = None
    __lock = threading.Lock()

    def __new__(cls, *args, **kwargs):
        # Accept the constructor arguments (default_factory) so that
        # calling ThreadSafeSingletonCache(factory) doesn't raise TypeError.
        if cls.__me is None:
            cls.__cache = {}
            cls.__me = super(ThreadSafeSingletonCache, cls).__new__(cls)
        return cls.__me

    def __init__(self, default_factory):
        self.default_factory = default_factory

    def obj_setattr(self, cache_key, obj_attr, value):
        with self.__lock:
            item = self.__cache.setdefault(cache_key, self.default_factory())
            setattr(item, obj_attr, value)

    def get(self, cache_key):
        with self.__lock:
            return self.__cache.setdefault(cache_key, self.default_factory())

    def set(self, cache_key, obj):
        with self.__lock:
            self.__cache[cache_key] = obj
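
For context, here is a rough sketch of how I intend to use it from my handlers (RequestState, UploadDoneHandler and the attribute names are placeholders for this example, not my real code):

import tornado.web

# Placeholder for the object that ties together data coming back from
# the cloud and streaming services.
class RequestState(object):
    def __init__(self):
        self.upload_url = None
        self.stream_id = None

cache = ThreadSafeSingletonCache(RequestState)

class UploadDoneHandler(tornado.web.RequestHandler):
    def post(self, job_id):
        # Update one attribute of the cached object atomically; other
        # handlers working on the same job_id see the same RequestState.
        cache.obj_setattr(job_id, "upload_url", self.get_argument("url"))
        self.write("ok")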

Please point out problems and potential improvements.
