sonic182/onecache

OneCache

Python cache for sync and async code.

The cache uses an LRU eviction algorithm and can optionally have a TTL.
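To illustrate what LRU-with-optional-TTL means, here is a minimal standard-library sketch of the idea. This is an illustration only, not OneCache's actual implementation; the class and method names are hypothetical.

```python
import time
from collections import OrderedDict


class TinyLRUTTL:
    """Minimal LRU cache with optional TTL in milliseconds (illustration only)."""

    def __init__(self, maxsize=512, ttl=None):
        self.maxsize = maxsize
        self.ttl = ttl
        self._data = OrderedDict()  # key -> (value, expires_at)

    def get(self, key, default=None):
        item = self._data.get(key)
        if item is None:
            return default
        value, expires_at = item
        if expires_at is not None and time.monotonic() >= expires_at:
            del self._data[key]  # entry expired, drop it
            return default
        self._data.move_to_end(key)  # mark as most recently used
        return value

    def set(self, key, value):
        expires_at = None
        if self.ttl is not None:
            expires_at = time.monotonic() + self.ttl / 1000  # ttl is in ms
        self._data[key] = (value, expires_at)
        self._data.move_to_end(key)
        if len(self._data) > self.maxsize:
            self._data.popitem(last=False)  # evict least recently used
```

Reads refresh an entry's LRU position, so when the cache is full the key that has gone unread longest is evicted first.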

Tested on Python 3.7, 3.9, 3.11 and PyPy 3.9 on Windows, macOS, and Linux (see the GitHub status badge); it should also work on the versions in between, and may work on Python 3.6.

Usage

import pytest

from onecache import CacheDecorator
from onecache import AsyncCacheDecorator


class Counter:
    def __init__(self, count=0):
        self.count = count


@pytest.mark.asyncio
async def test_async_cache_counter():
    """Test async cache, counter case."""
    counter = Counter()

    @AsyncCacheDecorator()
    async def mycoro(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (await mycoro(counter))
    assert 1 == (await mycoro(counter))


def test_cache_counter():
    """Test async cache, counter case."""
    counter = Counter()

    @CacheDecorator()
    def sample(counter: Counter):
        counter.count += 1
        return counter.count

    assert 1 == (sample(counter))
    assert 1 == (sample(counter))

The decorator classes support the following arguments:

  • maxsize (int): Maximum number of items to be cached. default: 512
  • ttl (int): time to expire in milliseconds; if None, entries do not expire. default: None
  • skip_args (bool): cache the result as if the function had no arguments. default: False
  • cache_class (class): class to use for the cache instance. default: LRUCache
  • refresh_ttl (bool): when using a TTL cache, refresh the key's expiration timestamp on each access. default: False
  • thread_safe (bool): tell the decorator to use a thread-safe lock. default: False
  • max_mem_size (int): maximum memory size in bytes; a ceiling for the sum of cached value sizes. default: None, meaning no limit. Ignored on PyPy, as object sizes may change under JIT compilation.

If the number of records exceeds maxsize, the least recently used entry is dropped.
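The decorator-style memoization described above can be sketched with the standard library. This is a hypothetical stand-in to show the mechanics (keying on arguments, LRU refresh on hit, eviction past maxsize), not OneCache's code; real implementations also handle kwargs, TTL, and thread safety.

```python
import functools
from collections import OrderedDict


def lru_decorator(maxsize=512):
    """Illustrative CacheDecorator-style wrapper (not OneCache's implementation)."""

    def wrapper(func):
        cache = OrderedDict()

        @functools.wraps(func)
        def inner(*args):
            key = args  # a real implementation would also hash kwargs
            if key in cache:
                cache.move_to_end(key)  # refresh LRU position on a hit
                return cache[key]
            result = func(*args)
            cache[key] = result
            if len(cache) > maxsize:
                cache.popitem(last=False)  # drop the least recently used entry
            return result

        inner.cache = cache  # exposed for inspection in this sketch
        return inner

    return wrapper
```

With maxsize=2, calling the wrapped function with a third distinct argument evicts whichever key has gone longest without being accessed.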

Development

Install packages with pip-tools:

pip install pip-tools
pip-compile
pip-compile test-requirements.in
pip-sync requirements.txt test-requirements.txt

Contribute

  1. Fork the repository
  2. Create a branch: feature/your_feature
  3. Commit, push, and open a pull request

Thanks :)