One major problem that none of the other solutions address is that LocMemCache leaks memory when you create and destroy several of them over the life of a single process. django.core.cache.backends.locmem defines several global dictionaries that hold references to every LocalMemCache instance's cache data, and those dictionaries are never emptied.
The following code solves this problem. It started as a combination of @href_'s answer and the cleaner logic used by the code linked in @squarelogic.hayden's comment, which I then refined further.
from uuid import uuid4
from threading import current_thread

from django.core.cache.backends.base import BaseCache
from django.core.cache.backends.locmem import LocMemCache
from django.utils.synch import RWLock

# Global in-memory store of cache data. Keyed by name, to provide multiple
# named local memory caches.
_caches = {}
_expire_info = {}
_locks = {}


class RequestCache(LocMemCache):
    """
    RequestCache is a customized LocMemCache with a destructor, ensuring that creating
    and destroying RequestCache objects over and over doesn't leak memory.
    """
    def __init__(self):
        # We explicitly do not call super() here, because while we want
        # BaseCache.__init__() to run, we *don't* want LocMemCache.__init__() to run.
        BaseCache.__init__(self, {})

        # Use a name that is guaranteed to be unique for each RequestCache instance.
        # This ensures that it will always be safe to call del _caches[self.name] in
        # the destructor, even when multiple threads are doing so at the same time.
        self.name = uuid4()
        self._cache = _caches.setdefault(self.name, {})
        self._expire_info = _expire_info.setdefault(self.name, {})
        self._lock = _locks.setdefault(self.name, RWLock())

    def __del__(self):
        del _caches[self.name]
        del _expire_info[self.name]
        del _locks[self.name]
class RequestCacheMiddleware(object):
    """
    Creates a cache instance that persists only for the duration of the current request.
    """
    _request_caches = {}

    def process_request(self, request):
        # The RequestCache object is keyed on the current thread because each request is
        # processed on a single thread, allowing us to retrieve the correct RequestCache
        # object in the other functions.
        self._request_caches[current_thread()] = RequestCache()

    def process_response(self, request, response):
        self.delete_cache()
        return response

    def process_exception(self, request, exception):
        self.delete_cache()

    @classmethod
    def get_cache(cls):
        """
        Retrieve the current request's cache.

        Returns None if RequestCacheMiddleware is not currently installed via
        MIDDLEWARE_CLASSES, or if there is no active request.
        """
        return cls._request_caches.get(current_thread())

    @classmethod
    def clear_cache(cls):
        """
        Clear the current request's cache.
        """
        cache = cls.get_cache()
        if cache:
            cache.clear()

    @classmethod
    def delete_cache(cls):
        """
        Delete the current request's cache object to avoid leaking memory.
        """
        cache = cls._request_caches.pop(current_thread(), None)
        del cache
EDIT 2016-06-15: I discovered a significantly simpler solution to this problem, and kind of facepalmed for not realizing how easy this should have been from the start.
from django.core.cache.backends.base import BaseCache
from django.core.cache.backends.locmem import LocMemCache
from django.utils.synch import RWLock


class RequestCache(LocMemCache):
    """
    RequestCache is a customized LocMemCache which stores its data cache as an instance attribute,
    rather than a global. It's designed to live only as long as the request object that
    RequestCacheMiddleware attaches it to.
    """
    def __init__(self):
        # We explicitly do not call super() here, because while we want BaseCache.__init__() to run,
        # we *don't* want LocMemCache.__init__() to run, because that would store our caches in its
        # globals.
        BaseCache.__init__(self, {})

        self._cache = {}
        self._expire_info = {}
        self._lock = RWLock()


class RequestCacheMiddleware(object):
    """
    Creates a fresh cache instance as request.cache. The cache instance lives only as long as request does.
    """
    def process_request(self, request):
        request.cache = RequestCache()
With this, you can use request.cache as a cache instance that lives only as long as the request itself does, and that will be fully cleaned up by the garbage collector when the request is done.
If you need access to the request object from a context where it's not normally available, you can use one of the various implementations of a so-called "global request middleware" that can be found online.
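As a sketch of the intended usage pattern, a view helper can memoize an expensive lookup on request.cache for the rest of the request. The stand-in classes and the expensive_lookup function below are hypothetical, used only so the example is self-contained without Django:

```python
class DictCache:
    # Minimal stand-in for RequestCache: the same get()/set() surface, backed by a dict.
    def __init__(self):
        self._data = {}

    def get(self, key, default=None):
        return self._data.get(key, default)

    def set(self, key, value):
        self._data[key] = value


class FakeRequest:
    # Stand-in for django.http.HttpRequest; RequestCacheMiddleware would attach .cache.
    def __init__(self):
        self.cache = DictCache()


calls = []

def expensive_lookup(pk):
    # Hypothetical expensive operation; the call count shows the memoization working.
    calls.append(pk)
    return {"pk": pk}

def get_object_cached(request, pk):
    key = "object:%s" % pk
    obj = request.cache.get(key)
    if obj is None:
        obj = expensive_lookup(pk)
        request.cache.set(key, obj)
    return obj


request = FakeRequest()
first = get_object_cached(request, 42)
second = get_object_cached(request, 42)
print(first is second, len(calls))  # True 1 -- the second call hit the cache
```

Because the cache hangs off the request object itself, there is nothing to key by thread and nothing to clean up explicitly: when the request is garbage-collected, its cache goes with it.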
Be careful with this solution! The _request_caches dictionary keeps filling up as more and more threads are spun up to serve your users, and it never gets cleaned out. Depending on how your web server stores Python globals, this could cause a memory leak. – CoreDumpError 2015-01-09 20:19:16
Shouldn't you clear the cache in the process_response method? – 2015-06-27 04:31:18
Yes - clear the cache in process_response and process_exception - there's a good example of this in the django-cuser middleware plugin. See: https://github.com/Alir3z4/django-cuser/blob/master/cuser/middleware.py – 2015-08-26 14:14:15