API Rate Limit Decorator

Overview

APIs are a common way to interact with web services, and as the need to consume data grows, so does the number of API calls required to stay up to date with data sources. However, many API providers constrain developers from making too many API calls. This is known as rate limiting and, in the worst case, your application can be banned from making further API calls if it abuses these limits.

This package introduces a function decorator that prevents a function from being called more often than the API provider allows. It should keep API providers from banning your application by conforming to their rate limits.

Installation

PyPI

Add this line to your application's requirements.txt:

ratelimit

And then execute:

$ pip install -r requirements.txt

Or install it yourself:

$ pip install ratelimit

GitHub

Installing the latest version from GitHub:

$ git clone https://github.com/tomasbasham/ratelimit
$ cd ratelimit
$ python setup.py install

Usage

To use this package, simply decorate any function that makes an API call:

from ratelimit import limits

import requests

FIFTEEN_MINUTES = 900

@limits(calls=15, period=FIFTEEN_MINUTES)
def call_api(url):
    response = requests.get(url)

    if response.status_code != 200:
        raise Exception('API response: {}'.format(response.status_code))
    return response

This function will not be able to make more than 15 API calls within a 15-minute period.

The arguments passed to the decorator describe the number of function invocations allowed over a specified time period (in seconds). If no time period is specified, it defaults to 15 minutes (the time window imposed by Twitter).
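For intuition, the decorator's behaviour can be approximated by a fixed-window counter: count calls, reset the count when the window rolls over, and raise once the budget is spent. This is only an illustrative sketch, not the package's actual implementation; the names `fixed_window_limit` and `RateLimitExceeded` are invented here:

```python
import functools
import time


class RateLimitExceeded(Exception):
    """Raised when the call budget for the current window is spent."""


def fixed_window_limit(calls, period):
    """Allow at most `calls` invocations per `period` seconds."""
    def decorator(func):
        state = {"window_start": time.monotonic(), "count": 0}

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            now = time.monotonic()
            if now - state["window_start"] >= period:
                # A new window has begun; reset the budget.
                state["window_start"] = now
                state["count"] = 0
            if state["count"] >= calls:
                raise RateLimitExceeded("too many calls in this window")
            state["count"] += 1
            return func(*args, **kwargs)
        return wrapper
    return decorator
```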

If a decorated function is called more times than allowed within the specified time period, a ratelimit.RateLimitException is raised. This may be used to implement a retry strategy such as an exponential backoff:

from ratelimit import limits, RateLimitException
from backoff import on_exception, expo

import requests

FIFTEEN_MINUTES = 900

@on_exception(expo, RateLimitException, max_tries=8)
@limits(calls=15, period=FIFTEEN_MINUTES)
def call_api(url):
    response = requests.get(url)

    if response.status_code != 200:
        raise Exception('API response: {}'.format(response.status_code))
    return response

Alternatively, to make the current thread sleep until the specified time period has elapsed and then retry the function, use the sleep_and_retry decorator. This ensures that every function invocation eventually succeeds, at the cost of halting the thread.

from ratelimit import limits, sleep_and_retry

import requests

FIFTEEN_MINUTES = 900

@sleep_and_retry
@limits(calls=15, period=FIFTEEN_MINUTES)
def call_api(url):
    response = requests.get(url)

    if response.status_code != 200:
        raise Exception('API response: {}'.format(response.status_code))
    return response

License

This project is licensed under the MIT License.

Comments
  • Running once in x time instead of up to y times in x time

    Just to clarify, let me show an example: when you set it up for 10 calls in 100 seconds, it lets you call it once every ten seconds, which is not exactly the same. See the following example:

    >>> from ratelimit import rate_limited
    >>> import datetime
    >>> import time
    >>> @rate_limited(10,100)
    ... def test(i=1):
    ...     ts = time.time()
    ...     st = datetime.datetime.fromtimestamp(ts).strftime('%H:%M:%S')
    ...     print("{}::  {}".format(i, st))
    ... 
    >>> for i in range(12):
    ...     test()
    ... 
    1::  12:14:05
    1::  12:14:15
    1::  12:14:25
    1::  12:14:35
    1::  12:14:45
    1::  12:14:55
    1::  12:15:05
    1::  12:15:15
    1::  12:15:25
    1::  12:15:35
    1::  12:15:45
    1::  12:15:55
    >>>
    
    
    

    I was expecting the test() function to be called 10 times and then be blocked, but instead there is only one call every 10 seconds. Is there a way to get the behaviour I'm expecting?

    Thanks in advance
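    One way to get the burst behaviour described above is a sliding-window limiter that keeps a timestamp per recent call and only sleeps once the whole budget is spent. The sketch below is not part of the ratelimit package; `sliding_window_limit` is a hypothetical name, shown under the assumption that "10 calls in 100 seconds" should mean "up to 10 back-to-back calls, then wait":

```python
import functools
import time
from collections import deque


def sliding_window_limit(calls, period):
    """Allow up to `calls` invocations in any rolling `period`-second
    window. Unlike spacing calls evenly, this permits bursts."""
    def decorator(func):
        timestamps = deque()

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            now = time.monotonic()
            # Drop timestamps that have aged out of the window.
            while timestamps and now - timestamps[0] >= period:
                timestamps.popleft()
            if len(timestamps) >= calls:
                # Budget spent: wait until the oldest call leaves the window.
                time.sleep(period - (now - timestamps[0]))
                timestamps.popleft()
            timestamps.append(time.monotonic())
            return func(*args, **kwargs)
        return wrapper
    return decorator
```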

    opened by rola93 5
  • Return error instead of waiting

    This is a feature request.

    It would be great if there was a way to return an error if there are too many calls to the limited function, instead of simply waiting and still doing the requested thing later. Perhaps a third parameter in the decorator, which takes an error class?

    opened by ZelphirKaltstahl 4
  • Feature/asyncio compatible decorators

    Hi! This PR implements the idea from #26 to make the limits decorator compatible with async functions and asyncio coroutines. It uses a simple check, inspect.iscoroutinefunction, to dispatch work to the sync/async implementation.

    opened by evemorgen 3
  • BUG? sleep_and_retry only retries once

    If you're really hammering a rate-limited call with multiple threads, the retry can itself raise RateLimitException. I'm using this now, and might add a max_retries parameter:

    import time
    from functools import wraps

    from ratelimit import RateLimitException

    def sleep_and_retry_forever(func):
        '''
        Return a wrapped function that retries rate limit exceptions, sleeping the
        current thread until the rate limit resets. Continues to retry until the
        call makes it through.

        :param function func: The function to decorate.
        :return: Decorated function.
        :rtype: function
        '''
        @wraps(func)
        def wrapper(*args, **kargs):
            '''
            Call the rate limited function. If the function raises a rate limit
            exception, sleep for the remaining time period and retry the function
            until it succeeds.

            :param args: non-keyword variable length argument list to the decorated function.
            :param kargs: keyworded variable length argument list to the decorated function.
            '''
            while True:
                try:
                    return func(*args, **kargs)
                except RateLimitException as exception:
                    time.sleep(exception.period_remaining)
        return wrapper
    
    opened by aarcro 3
  • The rate limit `period` appears to be a `frequency` (inverted logic)

    Hi, I just scratched my head over this for a while then I made the following simple test:

    import datetime
    
    from ratelimit import rate_limited
    
    
    @rate_limited(2)
    def rate_limited_function():
        print("function called at: {}".format(datetime.datetime.now()))
    
    if __name__ == '__main__':
        while True:
            rate_limited_function()
    

    which produces the following output:

    function called at: 2017-02-02 12:51:00.969778
    function called at: 2017-02-02 12:51:01.470701
    function called at: 2017-02-02 12:51:01.971523
    function called at: 2017-02-02 12:51:02.472208
    function called at: 2017-02-02 12:51:02.972889
    function called at: 2017-02-02 12:51:03.473723
    

    i.e. it's rate limiting the function call to once every 0.5 seconds, not once every 2 seconds.

    Conversely, passing a period of 0.5 runs the function every 2 seconds (and curiously it doesn't make the first call for 2 seconds):

    function called at: 2017-02-02 12:58:56.494749
    function called at: 2017-02-02 12:58:58.496957
    function called at: 2017-02-02 12:59:00.499353
    function called at: 2017-02-02 12:59:02.501679
    

    The README says "The argument passed into the decorator imposes the time that must elapse before a method can be called again." and the argument is called period so I believe passing period=2 is supposed to mean "no more than every 2 seconds". But seemingly the logic is inverted.

    There's a very simple, thread-safe, timer-based approach you might like to consider at http://stackoverflow.com/questions/30918772/rate-limiting-python-decorator/30918773#30918773

    opened by fawkesley 3
  • Multiprocess aware rate limit - Redis or Django DB backend ?

    It would be good if ratelimit could store its limit state somewhere like Redis or the database; this would make it more useful under Django, where there can be more than one process running.
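    One possible shape for this (not part of ratelimit) is a fixed-window counter behind a small store interface. The in-memory store below is a single-process stand-in for the test; a Redis-backed store could implement the same `incr` with redis-py's `INCR` plus an `EXPIRE` set on the first increment, so every process shares one counter. All names here are hypothetical:

```python
import time


class InMemoryCounter:
    """Single-process stand-in. A Redis version of incr() could be:
    count = conn.incr(key); if count == 1: conn.expire(key, period)."""
    def __init__(self):
        self._data = {}

    def incr(self, key, period):
        """Increment the counter for `key`, resetting it once the
        current window of `period` seconds has expired."""
        now = time.monotonic()
        count, expires = self._data.get(key, (0, now + period))
        if now >= expires:
            count, expires = 0, now + period
        count += 1
        self._data[key] = (count, expires)
        return count


def allow(store, key, calls, period):
    """Return True if the call identified by `key` is within its budget."""
    return store.incr(key, period) <= calls
```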

    opened by stuaxo 2
  • Shimon/ratelimit/add condition to rate limiter

    This PR includes the additions below:

    1. Adding a condition to RateLimitDecorator that allows 'bypassing' the limit in exceptional cases.
    2. Unit test for the conditional rate limiter.
    3. Add the ability to avoid raising an exception when the limit is reached, while also avoiding calling the function.
    opened by shimon-lb 2
  • Reinstate backwards-compatible `rate_limited` function?

    It seems the decorator is now called limits and rate_limited has disappeared, breaking old code.

    Is it possible to reinstate the old decorator for backwards compatibility?

    Thanks!

    opened by fawkesley 2
  • Improve example in README

    Following our discussion on https://github.com/tomasbasham/ratelimit/issues/1 I thought it would be helpful to update the README with an example using both parameters.

    (I use this library all the time, and I keep having to go back to that issue to remember what the arguments mean!)

    opened by fawkesley 2
  • Concurrency issues

    I noticed the code here has been changed since my last commit.

    I do realize that my version will probably block 2 or more threads from accessing the API call at the same time, thus compromising my intention of multithreading. However, it seems that the new code might have issues on that subject too.

            with lock:
                elapsed = time.time() - last_called[0]
            left_to_wait = frequency - elapsed
            if left_to_wait > 0:
                time.sleep(left_to_wait)
            ret = func(*args, **kargs)
            last_called[0] = time.time()
            return ret
    

    The above code only locks the operation elapsed = time.time() - last_called[0], but not the operation last_called[0] = time.time(). Now, for example, if func call is an operation that blocks for 20 seconds (maybe due to slow API communications), and my frequency is 1 second, and I have 10 threads at the same time. What is going to happen?

    The first thread will acquire the lock, calculate the elapsed, release the lock, run func, which will block for 20 seconds. Note that last_called[0] will not be updated during the 20 seconds! Therefore, during the 20 seconds, all of my other 9 threads will be able to acquire and release the lock, call func. If more than 2 threads call func in one second, the effect of the rate limiter will be compromised.

    Therefore, I propose the change.

    Note that I moved the last_called[0] within the lock structure. The advantage is that if func blocks for a bit, the threads won't block.
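    A sketch of the change described above (a simplified single-rate variant, with invented names, not the actual commit): the next slot is reserved while holding the lock, so a slow `func` can no longer let other threads slip through, and the lock is released before `func` runs so a long call doesn't hold it.

```python
import threading
import time
from functools import wraps


def rate_limited(max_per_second):
    """Space calls at least 1/max_per_second seconds apart, reserving
    the next slot under the lock before calling the function."""
    min_interval = 1.0 / max_per_second
    lock = threading.Lock()
    last_called = [0.0]

    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            with lock:
                elapsed = time.monotonic() - last_called[0]
                left_to_wait = min_interval - elapsed
                if left_to_wait > 0:
                    time.sleep(left_to_wait)
                # Record the slot before releasing the lock; a long-running
                # func no longer lets other threads through early.
                last_called[0] = time.monotonic()
            return func(*args, **kwargs)
        return wrapper
    return decorator
```

    Note that sleeping while holding the lock deliberately queues the waiting threads, which is the behaviour the limiter needs here.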

    I tested the code with the following code:

    import threading
    import time

    from ratelimit import rate_limited

    @rate_limited(2, 1)
    def increment():
        '''
        Increment the counter at most twice every second.
        '''
        time.sleep(5)
        print(time.time())
    
    thread1 = threading.Thread(target = increment)
    thread2 = threading.Thread(target = increment)
    thread3 = threading.Thread(target = increment)
    
    thread1.start()
    thread2.start()
    thread3.start()
    

    The output of the old commit is:

    1502028965.591502028965.591502028965.59

    The output of my new commit is:

    1502029318.34
    1502029318.84
    1502029319.35
    

    Apparently, all three threads called the function at the same second in the old commit, which broke the limit, while the new commit enforces it.

    opened by ginward 2
  • ProcessPoolExecutor support

    Hi there,

    Thanks for this work, really helpful.

    However, I am running into an issue. I need to use ratelimit with concurrent.futures.ProcessPoolExecutor.

    Sadly, I am observing that the decorator is not aware of the function invocations going on on these parallel processes.

    Do you have plans to support such use case? Or could you maybe point me to related work which supports concurrent processes?

    Thanks,

    Alex

    opened by alexcombessie 1
  • Does not work for me

    When calling the function, it throws the exception and dies. What am I doing wrong?

    
    from ratelimit import limits, RateLimitException
    from backoff import on_exception, expo

    import requests

    @on_exception(expo, RateLimitException, max_tries=8)
    @limits(calls=15, period=300)
    def mint_nft_to_address(cust_address):
        global contract_address
        assert contract_address, "Unable to mint. Empty contract address given"
        assert cust_address, "Unable to mint. Empty customer address given"
        assert metadata_directory_ipfs_uri, "Unable to mint. Empty metadata directory url given"
        url = "https://api.nftport.xyz/v0/mints/customizable"
    
        payload = {
            "chain": "polygon",
            "contract_address": contract_address,
            "metadata_uri": metadata_directory_ipfs_uri + "/1",
            "mint_to_address": cust_address
        }
        headers = {
            "Content-Type": "application/json",
            "Authorization": API_KEY
        }
    
        response = requests.post(url, json=payload, headers=headers)
    
        if response.status_code != 200:
            raise Exception('API response: {}'.format(response.status_code))
        print(response.text)
        return response
    
    opened by mikklaos 0
  • Add a “Usage note”?

    Sometimes you need to limit API calls from another package that you do not control. In this case, monkey patching the API may be the best option. But where? In the requests package, calls to .get, .head, and the other convenience methods are funneled to Session.request(), so that is one possibility. The module-level functions also create a temporary Session object and send all invocations through it. As a result, the best way to limit all API calls is:

    from ratelimit import limits
    from requests import Session

    FIFTEEN_MINUTES = 900

    throttled = limits(calls=15, period=FIFTEEN_MINUTES)
    Session.request = throttled(Session.request)
    
    opened by samwyse 0
  • New repo maintainer(s) / alternative projects?

    Hi @tomasbasham, first of all thanks for this handy package. It solves a small but common enough problem such that this is currently the top Google result for "python rate limit" and similar searches.

    It appears that the repo has accumulated a number of unanswered issues and PRs over the last couple years. It's understandable if you don't have time to maintain this, but since there are others who are willing and able to make improvements to it, would you be willing to either:

    • Transfer the repo to a different owner, or
    • Add some users with 'Collaborator' level permissions so they can help maintain the project?

    Plan B

    Otherwise, would anyone else like to volunteer to do the following?

    • Start a fork (if you haven't done so already) that will be a recommended replacement for ratelimit
    • Be available to respond to issues and PRs within a reasonable amount of time
    • Add at least one or two Collaborators to help with this
    • Be willing to transfer the repo to a different owner if at some point in the future you will no longer have time to maintain it

    I would suggest @deckar01's fork, containing changes described in #31, as a good starting point. I would also like to see @evemorgen and @Jude188's changes from issue #26 / PR #39 included. I believe the changes from these two forks could be integrated with the use of aiosqlite to support both sliding log persistence and async calls.

    opened by JWCook 7
  • Support for requests-toolbelt

    I'm using the requests extension called requests-toolbelt, which allows for multi-threaded requests. I believe the algorithm feeds multiple URLs to a requests.Session object and threads them up in a pool.

    Can this be adapted to do rate limit/throttling for APIs that limit requests to something like 20 requests per second?

    https://toolbelt.readthedocs.io/en/latest/

    opened by leeprevost 1