Extensible memoizing collections and decorators

Overview

cachetools


This module provides various memoizing collections and decorators, including variants of the Python Standard Library's @functools.lru_cache function decorator.

import urllib.request

from cachetools import cached, LRUCache, TTLCache

# speed up calculating Fibonacci numbers with dynamic programming
@cached(cache={})
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# cache least recently used Python Enhancement Proposals
@cached(cache=LRUCache(maxsize=32))
def get_pep(num):
    url = 'http://www.python.org/dev/peps/pep-%04d/' % num
    with urllib.request.urlopen(url) as s:
        return s.read()

# cache weather data for no longer than ten minutes
@cached(cache=TTLCache(maxsize=1024, ttl=600))
def get_weather(place):
    return owm.weather_at_place(place).get_weather()

For the purposes of this module, a cache is a mutable mapping of a fixed maximum size. When the cache is full, i.e. when adding another item would exceed its maximum size, the cache must choose which item(s) to discard based on a suitable cache algorithm. In general, a cache's size is the total size of its items, and an item's size is a property or function of its value, e.g. the result of sys.getsizeof(value). For the trivial but common case that each item counts as 1, a cache's size is equal to the number of its items, or len(cache).
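The size rule above can be sketched in plain Python (currsize here is an illustrative helper, not the cachetools attribute of the same name):

```python
def currsize(items, getsizeof=lambda value: 1):
    # A cache's size is the sum of its items' sizes; with the
    # trivial getsizeof (each item counts as 1) it equals len(items).
    return sum(getsizeof(value) for value in items.values())

data = {"a": "xxxx", "b": "yyyyy"}
print(currsize(data))       # 2: trivial case, same as len(data)
print(currsize(data, len))  # 9: size-aware, 4 + 5 characters
```

The cachetools cache constructors accept a getsizeof argument that plays exactly this role.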

Multiple cache classes based on different caching algorithms are implemented, and decorators for easily memoizing function and method calls are provided, too.

Installation

cachetools is available from PyPI and can be installed by running:

pip install cachetools


License

Copyright (c) 2014-2021 Thomas Kemmer.

Licensed under the MIT License.

Comments
  • When two @cachedmethod()'s are called with the same arguments, results are comingled

    Hi! Thanks for this useful library 😃

    I encountered an unexpected behavior. I'm using cachedmethod() on several methods in the same class. My expectation is that these could all share a cache object, and would cache their results independently of each other. If two different cached methods were called with the same arguments, I'd expect a mechanism inside the decorator to determine that these are for different methods, and therefore should not share each others' results.

    What I'm seeing instead is that if one cached method is called, and then another is called with the same arguments, the results of the first method are returned.

    import operator

    from cachetools import cachedmethod

    class Cached(object):
    
        def __init__(self, cache, count=0, other_count=5):
            self.cache = cache
            self.count = count
            self.other_count = other_count
    
        @cachedmethod(operator.attrgetter('cache'))
        def get(self, value):
            count = self.count
            self.count += 1
            return count
    
        @cachedmethod(operator.attrgetter('cache'))
        def get_other(self, value):
            other_count = self.other_count
            self.other_count += 1
            return other_count
    
        def test_that_methods_are_cached_independently(self):
            self.assertEqual(cached.get(0), 0)
            self.assertEqual(cached.get(0), 0)
            self.assertEqual(cached.get(1), 1)
            self.assertEqual(cached.get(1), 1)
            self.assertEqual(cached.get_other(0), 5)
            self.assertEqual(cached.get_other(0), 5)
            self.assertEqual(cached.get_other(1), 6)
            self.assertEqual(cached.get_other(1), 6)
            self.assertEqual(cached.get_other(2), 7)
            self.assertEqual(cached.get(2), 2)
    

    It seems to me that this is a bug. Is that correct?

    If so I'll open a PR with the failing test case, and see if I can fix it.
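For reference, the usual remedy (an illustration, not necessarily the fix that landed) is to make the method name part of the cache key, so that methods sharing one cache object cannot collide; with cachetools this is done by passing key=functools.partial(hashkey, 'name') to @cachedmethod, and the idea can be sketched with a plain dict:

```python
class Cached:
    def __init__(self):
        self.cache = {}
        self.count = 0
        self.other_count = 5

    def _memo(self, name, arg, compute):
        # Prefixing the key with the method name keeps two methods
        # that share one cache object from commingling results.
        key = (name, arg)
        if key not in self.cache:
            self.cache[key] = compute()
        return self.cache[key]

    def get(self, value):
        def compute():
            count = self.count
            self.count += 1
            return count
        return self._memo('get', value, compute)

    def get_other(self, value):
        def compute():
            other = self.other_count
            self.other_count += 1
            return other
        return self._memo('get_other', value, compute)

c = Cached()
print(c.get(0), c.get_other(0))  # 0 5: no longer commingled
```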

    opened by paulmelnikow 13
  • TTLCache with "absolute" expiration times

    I need to have a TTLCache that has absolute expiration times.

    This means I have a cache which expires items every full hour (for example), or always at midnight.

    Currently, TTL only seems to support relative values (200s from now, 1000s from now), not "until the clock next hits :00", so the insertion time determines how long an item lives:

    from cachetools.ttl import TTLCache
    
    # Cache for 1h
    c = TTLCache(5, ttl=3600)
    c['a'] = 1
    

    In the above sample, item 'a' will live for 1 hour, but it might expire at 18:30 or 18:34 (exactly one hour after I inserted it). I'd however like it to expire at 18:00, 19:00, 20:00, independently of the insertion time.

    I don't mind creating a subclass of TTLCache for this myself, but TTLCache seems pretty involved/complex and not well documented (what does what), so I'm currently not sure where to start.

    I'll potentially need a different timer, but I'm not sure that will satisfy my requirements (unfortunately, this is not well documented either).

    I could also work with this by somehow modifying the expiry (all items should expire at the same moment, when the clock hits 11:00, 12:00, ...), so calling .expire() after insertion might work (although I'm not sure that will suffice either).

    The math of the expiration is pretty easy (to get to full hours), but I don't understand where I should plug this in:

    from datetime import datetime
    ts = datetime.now().timestamp()
    offset = ts % 3600
    expire > datetime.fromtimestamp(ts - offset + 3600)
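One way to approximate absolute expiry with the existing relative TTL is to compute, at insertion time, the seconds remaining until the next full hour (seconds_until_next_hour is a hypothetical helper, not cachetools API; note that TTLCache's ttl is fixed at construction, so true per-item absolute expiry would need a subclass or a per-item mechanism):

```python
import time

def seconds_until_next_hour(now=None):
    # TTL that makes an entry inserted now expire exactly at the
    # next full hour (hypothetical helper, not part of cachetools).
    ts = time.time() if now is None else now
    return 3600 - (ts % 3600)

print(seconds_until_next_hour(3599.0))  # 1.0: one second to the next hour
print(seconds_until_next_hour(7200.0))  # 3600.0: inserted exactly on the hour
```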
    
    enhancement question 
    opened by xmatthias 12
  • @cachedmethod: Remove 'self' from key arguments

    The whole point of the cache argument of @cachedmethod being a function is to select an appropriate cache for self. So there is little to no reason for passing self to the key function. This was simply an oversight when preparing for cachetools v2.0.0.

    The main question is: is this merely an ordinary bug, or a breaking change that would have to wait until cachetools v3.0.0?

    bug 
    opened by tkem 12
  • Support for decorating coroutines

    The memoization decorators could use support for coroutines:

    @cachetools.cached(cache={})
    async def foo():
        pass
    

    Currently this caches the coroutine itself, which can only be awaited once and will produce an error when awaited again.
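A coroutine-aware variant would need to cache the awaited result rather than the coroutine object; a minimal sketch (async_cached is illustrative, not cachetools API, and unlike a real implementation it does no locking, so concurrent first calls may still compute twice):

```python
import asyncio
import functools

def async_cached(cache):
    # Cache the awaited result, not the coroutine object, so the
    # cached value can be returned any number of times.
    def decorator(func):
        @functools.wraps(func)
        async def wrapper(*args):
            if args in cache:
                return cache[args]
            result = await func(*args)
            cache[args] = result
            return result
        return wrapper
    return decorator

@async_cached({})
async def foo(x):
    return x * 2

print(asyncio.run(foo(21)))  # 42
print(asyncio.run(foo(21)))  # 42 again: awaiting repeatedly is now safe
```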

    enhancement 
    opened by deceze 12
  • cache decorator causes issues with memory leak

    Hey tkem,

    Consider the following naive example:

    from cachetools.func import lfu_cache
    
    class Foo(object):
        @lfu_cache(100)
        def bar(self):
            pass
    
    foo = Foo()
    foo.bar()
    

    foo will not be properly garbage collected because it is still referenced as part of a key inside the cache.

    import gc
    gc.collect()
    referrers = gc.get_referrers(foo)
    assert 1 == len(referrers)
    

    Is there a recommended approach to decorating a method? I can use cachedmethod with a key function that excludes the instance from the hash, but is there a preferred way?
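One pattern that avoids the leak (a sketch, not an official cachetools recommendation) is to store the cache on the instance itself, so the instance and its cache are garbage-collected together:

```python
import gc
import weakref

class Foo:
    def __init__(self):
        # Per-instance cache: no module-level structure holds a
        # reference to self, so the instance can be collected.
        self._bar_cache = {}

    def bar(self, x):
        if x not in self._bar_cache:
            self._bar_cache[x] = x * x  # stand-in for expensive work
        return self._bar_cache[x]

foo = Foo()
foo.bar(3)
ref = weakref.ref(foo)
del foo
gc.collect()
print(ref() is None)  # True: no cache entry keeps the instance alive
```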

    Thank you,

    PJ

    question 
    opened by PJCampi 10
  • Support unlimited capacity

    In my opinion, unlimited cache capacity (maxsize) is useful when using TTLCache. This cache algorithm has its own expiration condition beyond the size limit, so it still works as a "cache" even without any size limit. I want to control cache lifetime with TTL only, not size.

    I've already tried to make an unlimited TTLCache with maxsize=float('inf') and it looks good. But it wasn't perfect because of __repr__:

    >>> c = TTLCache(float('inf'), ttl=3)
    >>> c['a'] = 123
    >>> c.keys()
    ['a']
    >>> c.keys()
    []
    >>> c
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File ".../cachetools/ttl.py", line 142, in __repr__
        return cache_repr(self)
      File ".../cachetools/cache.py", line 35, in __repr__
        self.__currsize,
    TypeError: %d format: a number is required, not float
    

    @tkem, if you agree with this idea, I'll make a merge request.

    enhancement 
    opened by sublee 10
  • Added helper functions to support fine grained control over the cache

    Hi,

    I found the following two operations useful for method calls, but I think they might be useful for function calls as well:

    1. Invalidating only one entry in the cache.
    2. Setting manually a cache entry, given the parameters of the method call

    Here is an example of how I see it being used:

    import operator

    from cachetools import LRUCache, cachedmethod

    class Cached(object):
        def __init__(self):
            self._cache = LRUCache(maxsize=100)
    
        @cachedmethod(operator.attrgetter('_cache'))
        def get_id(self, id):
            return expensive_call(id)
    
        def set_id(self, id, value):
            self.get_id.cache_set(self, value, id)
    
        def del_id(self, id):
            self.get_id.cache_invalidate(self, id)
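The cache_set/cache_invalidate helpers proposed here amount to rebuilding the key the decorator used and mutating the mapping directly; a stdlib sketch (make_key is a hypothetical stand-in that mimics cachetools.keys.hashkey):

```python
def make_key(*args, **kwargs):
    # Mimics the flat hashable tuple that cachetools.keys.hashkey builds.
    return args + tuple(sorted(kwargs.items()))

cache = {}
cache[make_key(42)] = "expensive result"  # manual cache_set
cache.pop(make_key(42), None)             # manual cache_invalidate
print(make_key(42) in cache)  # False: the entry is gone
```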
    

    Tell me what you think, and I will update the function decorator as well, if you decide to merge my commit.

    Cheers, Vlad

    opened by vladwing 10
  • TTLCache - Odd crash on cache item set

    Describe the bug Just got the following crash log in production (cannot reproduce in dev), where setting a specific key has permanently got "stuck", firing an unhandled exception each time cache[key] = value is executed.

    Expected result No exception, or documented exception.

    Actual result Exception

    Reproduction steps Code is more or less like this:

    from cachetools import TTLCache
    
    CACHE_KEY = "prefix:%s"
    CACHE_TIME = 10
    
    cache = TTLCache(maxsize=256, ttl=CACHE_TIME)
    
    def some_fun():
            # ... some code
            cache_key = CACHE_KEY % f"{key_1}:{key_2}:{key_3}"
            exists = cache.get(cache_key)
    
            if exists:
                return exists
            else:
                # grab some info from somewhere
                exists = my_expensive_data
                cache[cache_key] = (exists, other_data)
    

    The cache set call fails with the following:

    Traceback (most recent call last):
      ...redacted....
      File "/var/app/current/file.py", line 80, in some_fun
        cache[cache_key] = (my_expensive_data, other_data)
      File "/var/app/venv/staging-LQM1lest/lib/python3.7/site-packages/cachetools/__init__.py", line 426, in __setitem__
        self.expire(time)
      File "/var/app/venv/staging-LQM1lest/lib/python3.7/site-packages/cachetools/__init__.py", line 480, in expire
        cache_delitem(self, curr.key)
      File "/var/app/venv/staging-LQM1lest/lib/python3.7/site-packages/cachetools/__init__.py", line 94, in __delitem__
        del self.__data[key]
    KeyError: 'prefix:Token:XXX:YYY'
    

    Any thoughts on why would this fail? This code has been running like this for years and it's the first time I observe this behaviour.
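One plausible cause (an assumption; the thread itself does not confirm it) is unsynchronized concurrent access: TTLCache is not thread-safe, and interleaved __setitem__/expire calls from multiple threads can corrupt its internal bookkeeping and surface as a KeyError in expire(). Guarding every access to the shared cache with a lock rules this out:

```python
import threading

lock = threading.Lock()
cache = {}  # stand-in for the shared TTLCache

def cache_set(key, value):
    # Every mutation of the shared cache goes through one lock,
    # so expiry bookkeeping cannot be interleaved between threads.
    with lock:
        cache[key] = value

def cache_get(key):
    with lock:
        return cache.get(key)

cache_set("prefix:token", "data")
print(cache_get("prefix:token"))  # data
```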

    bug 
    opened by cristianoccazinsp 9
  • 5.1.0: sphinx warnings `reference target not found`

    First of all, it is currently not possible to use a straight sphinx-build command to build the documentation out of the source tree:

    + /usr/bin/sphinx-build -n -T -b man docs build/sphinx/man
    Running Sphinx v4.5.0
    making output directory... done
    building [mo]: targets for 0 po files that are out of date
    building [man]: all manpages
    updating environment: [new config] 1 added, 0 changed, 0 removed
    reading sources... [100%] index
    WARNING: autodoc: failed to import class 'Cache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'FIFOCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'LFUCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'LRUCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'MRUCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'RRCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'TTLCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import class 'TLRUCache' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import function 'keys.hashkey' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    WARNING: autodoc: failed to import function 'keys.typedkey' from module 'cachetools'; the following exception was raised:
    No module named 'cachetools'
    looking for now-outdated files... none found
    pickling environment... done
    checking consistency... done
    writing... python-cachetools.3 { } /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:class reference target not found: Cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:class reference target not found: collections.MutableMapping
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:attr reference target not found: maxsize
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:attr reference target not found: currsize
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:meth reference target not found: Cache.__setitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:meth reference target not found: self.popitem
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:class reference target not found: Cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:meth reference target not found: getsizeof
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:meth reference target not found: getsizeof
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:const reference target not found: 1
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:50: WARNING: py:class reference target not found: Cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:193: WARNING: py:meth reference target not found: popitem
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:class reference target not found: collections.defaultdict
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:class reference target not found: Cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:meth reference target not found: __missing__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:meth reference target not found: Cache.__getitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:264: WARNING: py:class reference target not found: weakref.WeakValueDictionary
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:271: WARNING: py:func reference target not found: cachetools.keys.hashkey
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:277: WARNING: py:const reference target not found: None
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:277: WARNING: py:class reference target not found: threading.Lock
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:289: WARNING: py:attr reference target not found: cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:289: WARNING: py:attr reference target not found: cache_key
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:289: WARNING: py:attr reference target not found: cache_lock
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:315: WARNING: py:attr reference target not found: __wrapped__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:356: WARNING: py:const reference target not found: self
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:356: WARNING: py:const reference target not found: cls
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:440: WARNING: py:func reference target not found: cached
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:440: WARNING: py:func reference target not found: cachedmethod
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:473: WARNING: py:func reference target not found: envkey
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:484: WARNING: py:func reference target not found: functools.lru_cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:489: WARNING: py:func reference target not found: functools.lru_cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:489: WARNING: py:const reference target not found: None
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:497: WARNING: py:const reference target not found: True
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:511: WARNING: py:func reference target not found: cache_parameters
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:516: WARNING: py:func reference target not found: cache_info
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:516: WARNING: py:func reference target not found: cache_clear
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:516: WARNING: py:func reference target not found: functools.lru_cache
    done
    build succeeded, 47 warnings.
    

    First batch of warnings can be fixed by patch like below:

    --- a/docs/conf.py~     2022-05-15 20:37:38.000000000 +0000
    +++ b/docs/conf.py      2022-05-17 17:28:17.016511154 +0000
    @@ -1,3 +1,8 @@
    +import os
    +import sys
    +
    +sys.path.append(os.path.abspath('../src'))
    +
     def get_version():
         import configparser
         import pathlib
    

    This patch follows the example configuration file in the Sphinx documentation: https://www.sphinx-doc.org/en/master/usage/configuration.html#example-of-configuration-file

    Then, on building my packages, I'm using the sphinx-build command with the -n switch, which shows warnings about missing references. These are not critical issues. Here is the output with warnings:

    + /usr/bin/sphinx-build -n -T -b man docs build/sphinx/man
    Running Sphinx v4.5.0
    making output directory... done
    building [mo]: targets for 0 po files that are out of date
    building [man]: all manpages
    updating environment: [new config] 1 added, 0 changed, 0 removed
    reading sources... [100%] index
    looking for now-outdated files... none found
    pickling environment... done
    checking consistency... done
    writing... python-cachetools.3 { } /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:class reference target not found: collections.MutableMapping
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:attr reference target not found: maxsize
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:attr reference target not found: currsize
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:meth reference target not found: Cache.__setitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:32: WARNING: py:meth reference target not found: self.popitem
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:meth reference target not found: getsizeof
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:meth reference target not found: getsizeof
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:41: WARNING: py:const reference target not found: 1
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:67: WARNING: py:meth reference target not found: popitem
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:67: WARNING: py:meth reference target not found: popitem
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:67: WARNING: py:meth reference target not found: __getitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:67: WARNING: py:meth reference target not found: __setitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:67: WARNING: py:meth reference target not found: __delitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:104: WARNING: py:func reference target not found: random.choice
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:118: WARNING: py:func reference target not found: time.monotonic
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:140: WARNING: py:meth reference target not found: __setitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:140: WARNING: py:meth reference target not found: __delitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:140: WARNING: py:const reference target not found: None
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:180: WARNING: py:meth reference target not found: __setitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:180: WARNING: py:meth reference target not found: __delitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:180: WARNING: py:const reference target not found: None
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:193: WARNING: py:meth reference target not found: popitem
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:class reference target not found: collections.defaultdict
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:meth reference target not found: __missing__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:213: WARNING: py:meth reference target not found: Cache.__getitem__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:264: WARNING: py:class reference target not found: weakref.WeakValueDictionary
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:277: WARNING: py:const reference target not found: None
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:277: WARNING: py:class reference target not found: threading.Lock
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:289: WARNING: py:attr reference target not found: cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:289: WARNING: py:attr reference target not found: cache_key
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:289: WARNING: py:attr reference target not found: cache_lock
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:315: WARNING: py:attr reference target not found: __wrapped__
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:356: WARNING: py:const reference target not found: self
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:356: WARNING: py:const reference target not found: cls
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:440: WARNING: py:func reference target not found: cached
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:440: WARNING: py:func reference target not found: cachedmethod
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:473: WARNING: py:func reference target not found: envkey
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:484: WARNING: py:func reference target not found: functools.lru_cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:489: WARNING: py:func reference target not found: functools.lru_cache
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:489: WARNING: py:const reference target not found: None
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:497: WARNING: py:const reference target not found: True
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:511: WARNING: py:func reference target not found: cache_parameters
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:516: WARNING: py:func reference target not found: cache_info
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:516: WARNING: py:func reference target not found: cache_clear
    /home/tkloczko/rpmbuild/BUILD/cachetools-5.1.0/docs/index.rst:516: WARNING: py:func reference target not found: functools.lru_cache
    done
    build succeeded, 45 warnings.
    

    You can peek at fixes for this kind of issue in other projects: https://github.com/latchset/jwcrypto/pull/289 https://github.com/click-contrib/sphinx-click/commit/abc31069

    bug 
    opened by kloczek 9
  • Version 4.2.3 breaks other packages

    I do not know if such a minor release was intended to break the API contract, but here is a project that just stopped working because of it:

    https://github.com/wimglenn/johnnydep/blob/8c3f1e9eb81bc43db3b5f9d1f0b4b3a0c187eb82/johnnydep/pipper.py#L21

    I assume this commit is at fault: https://github.com/tkem/cachetools/commit/be507a6234ac6f48ed84052a414e38dfb22aaa8a

    @wimglenn

    opened by tzickel 9
  • Documenting the _Link class used in TTLCache

    Hi, would it be possible to add some documentation regarding the usage of the _Link class in TTLCache? The code is a bit convoluted to read, because the __link attribute refers to something different.

    Will be happy to make a PR from suggestions.

    opened by haiyanghe 9
  • Additional behavioral context for TTL Cache Docu.

    The documentation is currently unclear on how TTLCache actually behaves when a key's value is updated. I added a short sentence to make it more obvious that items are deleted regardless of updates.
    One could also note that only reassignment of a key resets an item's TTL.
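The behavior being documented can be sketched with a toy TTL mapping (TinyTTL is illustrative, not the cachetools implementation): the expiry time is computed when a key is assigned, so only reassignment resets an item's TTL.

```python
import time

class TinyTTL:
    # Toy per-item TTL mapping; illustrative only.
    def __init__(self, ttl, timer=time.monotonic):
        self.ttl, self.timer = ttl, timer
        self.data = {}  # key -> (value, expires_at)

    def __setitem__(self, key, value):
        # Assignment (including reassignment) computes a fresh expiry.
        self.data[key] = (value, self.timer() + self.ttl)

    def __getitem__(self, key):
        value, expires_at = self.data[key]
        if self.timer() >= expires_at:
            del self.data[key]
            raise KeyError(key)
        return value

now = [0.0]
c = TinyTTL(ttl=10, timer=lambda: now[0])
c['a'] = 1
now[0] = 5
c['a'] = 2     # reassignment at t=5 resets the expiry to t=15
now[0] = 12
print(c['a'])  # 2: still alive only because of the reassignment
```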

    opened by TrevisGordan 2
  • Key-level locking which corrects multithreading performance

    A cache is generally applied to functions and methods which are either slow or expensive to execute, in order to minimize both caller latency and stress on underlying services.

    As it stands today, calling a cachetools-cached function multiple times from separate threads with the same key may cause the function body to be evaluated multiple times. This means that a cached, 10-second reference data load may be invoked once per calling thread during the first 10 seconds it is executing, potentially swamping underlying services.

    Cachetools today:

    For example, setting up a REST server (I used FastAPI) to call the following function per request yields multiple calls even though the function is cached. (Note that each timestamped line represents a call to the FastAPI endpoint.)

    This is because @cached only locks access to the cache, not the generation of the value when the key is not present. Between the first call for a key and the completion of that (or a subsequent) call, the wrapped function will always be evaluated.

    cache = TTLCache(maxsize=1024, ttl=600)
    @cached(cache)
    def test(self):
    	print("Function body called")
    	time.sleep(10)
    
    > 2021-09-29 13:29:42,240 [.....
    > Function body called
    > 2021-09-29 13:29:44,137 [.....
    > Function body called
    > 2021-09-29 13:29:45,474 [.....
    > Function body called
    > 2021-09-29 13:29:46,974 [.....
    > Function body called
    > 2021-09-29 13:29:48,527 [.....
    > Function body called
    > 2021-09-29 13:29:50,242 [.....
    > Function body called
    > 2021-09-29 13:29:51,895 [.....
    > Function body called
    > 2021-09-29 13:29:51,895 [.....
    > 2021-09-29 13:29:53.543 [.....
    > 2021-09-29 13:29:57.213 [.....
    > 2021-09-29 13:29:59.753 [.....
    

    Another, more self contained example is as follows:

    from cachetools import TTLCache
    from cachetools.decorators import cached
    from time import sleep
    from concurrent.futures import ThreadPoolExecutor
    
    
    cache = TTLCache(maxsize=100,ttl=600)
    calls=0
    @cached(cache)
    def method(*args):
        global calls
        sleep(1)
        calls+=1
        print("Doing something expensive!")
        return args
    
    with ThreadPoolExecutor(max_workers=5) as executor:
        executor.map(method, ['arg']*10)
        
    print(calls)
    
    > Doing something expensive!
    > Doing something expensive!
    > Doing something expensive!Doing something expensive!
    > Doing something expensive!
    > 5
    

    Cachetools post-fix

    After the fixes which I'm proposing, the expensive underlying function is only executed a single time for each unique (per key) call.

    For the first example:

    cache = TTLCache(maxsize=1024, ttl=600)
    @cached(cache)
    def test(self):
    	print("Function body called")
    	time.sleep(10)
    
    > 2021-09-29 13:59:17,391 [...
    > Function body called
    > 2021-09-29 13:59:17,996 [.... subsequent calls to the API
    > 2021-09-29 13:59:21,140 [.... subsequent calls to the API
    > 2021-09-29 13:59:22,758 [.... subsequent calls to the API
    > 2021-09-29 13:59:24,222 [.... subsequent calls to the API
    > 2021-09-29 13:59:25,740 [.... subsequent calls to the API
    > 2021-09-29 13:59:27,289 [.... Original call unblocks
    > 2021-09-29 13:59:27,290 [.... All subsequent calls unblock once call 1 finishes 
    > 2021-09-29 13:59:27,292 [.... All subsequent calls unblock once call 1 finishes 
    > 2021-09-29 13:59:27,293 [.... All subsequent calls unblock once call 1 finishes 
    > 2021-09-29 13:59:27,293 [.... All subsequent calls unblock once call 1 finishes 
    > 2021-09-29 13:59:27,294 [.... All subsequent calls unblock once call 1 finishes 
    

    I have manually added some commentary to the log lines. Note how the first call hits our expensive function, while subsequent calls wait for it to complete.

    10 seconds after the first call has come in, all other calls return instantly, since the value is now available. The request at 13:59:25 took only two seconds to respond; before the bug fix it would not only have taken 10 seconds, it would also have added more stress to the underlying services called from within test().

    In this second, self contained example, note how only one call is logged to the cached function, even though the code is functionally identical to before.

    from cachetools import TTLCache  # still using cachetools' TTLCache
    from cachetools_fixed.decorators import cached  # fixed @cached decorator
    from time import sleep
    from concurrent.futures import ThreadPoolExecutor


    cache = TTLCache(maxsize=100, ttl=600)
    calls = 0

    @cached(cache)
    def method(*args):
        global calls
        sleep(1)
        calls += 1
        print("Doing something expensive!")
        return args

    with ThreadPoolExecutor(max_workers=5) as executor:
        executor.map(method, ['arg'] * 10)

    print(calls)
    
    > Doing something expensive!
    > 1
    

    I'll also add that key-level locking still works as expected: repeated calls with different keys yield no benefit over the previous implementation, just as before this bug fix.

    opened by northyorkshire 16
  • Invalidating cached values in "cachedmethod"

    Invalidating cached values in "cachedmethod"

    Hi,

    Is there a way to update or remove a cache value returned by cachedmethod?

    For example, in the below code,

    import operator as op
    from dataclasses import dataclass, field
    
    from cachetools import TTLCache, cachedmethod
    
    
    @dataclass
    class A:
        cache: TTLCache = field(init=False)
    
        def __post_init__(self):
            self.cache = TTLCache(maxsize=32, ttl=2 * 3600)
    
        @cachedmethod(cache=op.attrgetter('cache'))
        def compute(self, i):
            print('value of i', i)
            return i * 2
    

    Suppose I want to remove the cached value for a given i from the cache; this is not possible to do directly.

    Meanwhile, I learned about the hashkey function in the keys module. It is not exposed in cachetools' __init__, so it cannot be imported from the top-level package (of course there are hacks around it). Using it, we can update a cached value.


    So, I think there could be two solutions:

    1. Since cachedmethod wraps a method to enable caching, it could, apart from the caching feature, also add functionality to invalidate entries. For example,
    a = A()
    
    print(a.compute(10))
    
    a.compute.remove(10)  # remove could take *args, **kwargs; a better name than `remove` could be chosen
    
    2. Exposing the keys module in cachetools' __init__.py:
    from cachetools.keys import hashkey
    
    a = A()
    
    i = 10
    print(a.compute(i))
    
    del a.cache[hashkey(i)]
    

    I think approach 1 would be more user-friendly.

    enhancement 
    opened by ShivKJ 8
  • Caching exceptions.

    Caching exceptions.

    I have a use case where a decorated function raises exceptions for some values, and I would like these exceptions to be cached and raised again.

    I opened this issue to find out whether something like this is already implemented or has been considered.

    So far what I've thought about this is:

    • the best approach would be to decorate the cache decorators, instead of adding something like a cache_exceptions=True flag to each of them;
    • caching traceback info is a huge NO, so the only way would be to store exception types, with or without arguments, and create and raise a new exception instance; the con is that traceback info will be lost on subsequent calls.

    Your thoughts?

    enhancement 
    opened by WloHu 10
  • cache_clear() and cache_info() would be handy in regular decorators

    cache_clear() and cache_info() would be handy in regular decorators

    Just trying cachetools for the first time. I noticed that the cache_clear and cache_info attributes are available only on the decorators provided for backwards compatibility with functools.lru_cache(). Is there some reason they (or something like them) aren't available on functions memoized with @cached(...)?
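    For what it's worth, lru_cache-style attributes can be bolted onto a plain memoizing decorator by hand. A minimal sketch, with names borrowed from functools.lru_cache (this is not cachetools' API, and `simple_key` stands in for `cachetools.keys.hashkey`):

```python
import functools
from collections import namedtuple

CacheInfo = namedtuple("CacheInfo", "hits misses maxsize currsize")

def simple_key(*args, **kwargs):
    # stand-in for cachetools.keys.hashkey
    return args + tuple(sorted(kwargs.items()))

def cached_with_info(cache, key=simple_key):
    """Sketch of a @cached-like decorator that also exposes
    cache_info() and cache_clear() on the wrapper."""
    def decorator(func):
        hits = misses = 0

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            nonlocal hits, misses
            k = key(*args, **kwargs)
            try:
                value = cache[k]
                hits += 1
                return value
            except KeyError:
                misses += 1
                value = func(*args, **kwargs)
                cache[k] = value
                return value

        def cache_info():
            # cachetools caches have .maxsize; a plain dict does not
            return CacheInfo(hits, misses,
                             getattr(cache, "maxsize", None), len(cache))

        def cache_clear():
            nonlocal hits, misses
            cache.clear()
            hits = misses = 0

        wrapper.cache_info = cache_info
        wrapper.cache_clear = cache_clear
        return wrapper
    return decorator
```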

    enhancement 
    opened by smontanaro 6
Owner
Thomas Kemmer
Automatic caching and invalidation for Django models through the ORM.

Cache Machine Cache Machine provides automatic caching and invalidation for Django models through the ORM. For full docs, see https://cache-machine.re

null 846 Nov 26, 2022
Python disk-backed cache (Django-compatible). Faster than Redis and Memcached. Pure-Python.

DiskCache is an Apache2 licensed disk and file backed cache library, written in pure-Python, and compatible with Django.

Grant Jenks 1.7k Jan 5, 2023
Aircache is an open-source caching and security solution that can be integrated with most decoupled apps that use REST APIs for communicating.

AirCache Aircache is an open-source caching and security solution that can be integrated with most decoupled apps that use REST APIs for communicating

AirCache 2 Dec 22, 2021
Robust, highly tunable and easy-to-integrate in-memory cache solution written in pure Python, with no dependencies.

Omoide Cache Caching doesn't need to be hard anymore. With just a few lines of code Omoide Cache will instantly bring your Python services to the next

Leo Ertuna 2 Aug 14, 2022
Python collections that are backed by a sqlite3 DB and are compatible with the built-in collections

sqlitecollections Python collections that are backed by a sqlite3 DB and are compatible with the built-in collections Installation $ pip install git+

Takeshi OSOEKAWA 11 Feb 3, 2022
Collection of admin fields and decorators to help to create computed or custom fields more friendly and easy way

django-admin-easy Collection of admin fields, decorators and mixin to help to create computed or custom fields more friendly and easy way Installation

Ezequiel Bertti 364 Jan 8, 2023
A Python module for decorators, wrappers and monkey patching.

wrapt The aim of the wrapt module is to provide a transparent object proxy for Python, which can be used as the basis for the construction of function

Graham Dumpleton 1.8k Jan 6, 2023
A lightweight (serverless) native python parallel processing framework based on simple decorators and call graphs.

A lightweight (serverless) native python parallel processing framework based on simple decorators and call graphs, supporting both control flow and dataflow execution paradigms as well as de-centralized CPU & GPU scheduling.

null 102 Jan 6, 2023
Generate Class & Decorators for your FastAPI project ✨🚀

Classes and Decorators to use FastAPI with class based routing. In particular this allows you to construct an instance of a class and have methods of that instance be route handlers for FastAPI & Python 3.8.

Yasser Tahiri 34 Oct 27, 2022
Python Advanced --- numpy, decorators, networking

Python Advanced --- numpy, decorators, networking (and more?) Hello everyone ?? This is the project repo for the "Python Advanced - ..." introductory

Andreas Poehlmann 2 Nov 5, 2021
Strong Typing in Python with Decorators

typy Strong Typing in Python with Decorators Description This light-weight library provides decorators that can be used to implement strongly-typed be

Ekin 0 Feb 6, 2022
Create argparse subcommands with decorators.

python-argparse-subdec This is a very simple Python package that allows one to create argparse's subcommands via function decorators. Usage Create a S

Gustavo José de Sousa 7 Oct 21, 2022
Decorators for maximizing memory utilization with PyTorch & CUDA

torch-max-mem This package provides decorators for memory utilization maximization with PyTorch and CUDA by starting with a maximum parameter size and

Max Berrendorf 10 May 2, 2022
Dude is a very simple framework for writing web scrapers using Python decorators

Dude is a very simple framework for writing web scrapers using Python decorators. The design, inspired by Flask, was to easily build a web scraper in just a few lines of code. Dude has an easy-to-learn syntax.

Ronie Martinez 326 Dec 15, 2022
Desktop application for Windows/macOS users to rotate through custom, preset, and searched-for collections of backgrounds with scheduling and additional settings

Background Revolution (In Development, Alpha Release) What? This will be an application for users to customize their windows backgrounds by uploading

Daniel Agapov 1 Nov 2, 2021
Command-line program to download image galleries and collections from several image hosting sites

gallery-dl gallery-dl is a command-line program to download image galleries and collections from several image hosting sites (see Supported Sites). It

Mike Fährmann 6.4k Jan 6, 2023
Free and open-source digital preservation system designed to maintain standards-based, long-term access to collections of digital objects.

Archivematica By Artefactual Archivematica is a web- and standards-based, open-source application which allows your institution to preserve long-term

Artefactual 338 Dec 16, 2022
EMNLP 2021 Adapting Language Models for Zero-shot Learning by Meta-tuning on Dataset and Prompt Collections

Adapting Language Models for Zero-shot Learning by Meta-tuning on Dataset and Prompt Collections Ruiqi Zhong, Kristy Lee*, Zheng Zhang*, Dan Klein EMN

Ruiqi Zhong 42 Nov 3, 2022
A Blender python script for getting asset browser custom preview images for objects and collections.

asset_snapshot A Blender python script for getting asset browser custom preview images for objects and collections. Installation: Click the code butto

Johnny Matthews 44 Nov 29, 2022