unsync

Unsynchronize asyncio by using an ambient event loop, or executing in separate threads or processes.

Quick Overview

Functions marked with the @unsync decorator will behave in one of the following ways (see the sketch after this list):

  • async functions will run in the unsync.loop event loop, executed from unsync.thread
  • Regular functions will execute in unsync.thread_executor, a ThreadPoolExecutor
    • Useful for I/O-bound work that does not support asyncio
  • Regular functions marked with @unsync(cpu_bound=True) will execute in unsync.process_executor, a ProcessPoolExecutor
    • Useful for CPU-bound work
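
A minimal sketch of the three modes (assuming unsync is installed; the function names and bodies are illustrative placeholders):

import asyncio
import time
from unsync import unsync

@unsync
async def on_loop():
    # async: runs in the ambient unsync.loop, executed from unsync.thread
    await asyncio.sleep(0.1)
    return 'event loop'

@unsync
def on_threads():
    # regular: runs in unsync.thread_executor (a ThreadPoolExecutor)
    time.sleep(0.1)
    return 'thread pool'

@unsync(cpu_bound=True)
def on_processes():
    # cpu_bound: runs in unsync.process_executor (a ProcessPoolExecutor)
    return sum(i * i for i in range(10_000))

if __name__ == '__main__':  # guard needed for the process pool on spawn platforms
    print(on_loop().result(), on_threads().result(), on_processes().result())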

All @unsync functions will return an Unfuture object. This new future type combines the behavior of asyncio.Future and concurrent.futures.Future with the following changes, illustrated in the sketch after the list:

  • Unfuture.set_result is thread-safe, unlike asyncio.Future's
  • Unfuture instances can be awaited, even if made from a concurrent.futures.Future
  • Unfuture.result() is a blocking operation, except in unsync.loop/unsync.thread, where it behaves like asyncio.Future.result and will throw an exception if the future is not done
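
A short sketch of the last two points (blocking_io and consumer are illustrative helpers):

import time
from unsync import unsync

@unsync
def blocking_io():
    # Returns an Unfuture backed by a concurrent.futures.Future.
    time.sleep(0.1)
    return 42

@unsync
async def consumer():
    # Unfutures are awaitable even when made from a concurrent.futures.Future.
    return await blocking_io()

print(consumer().result())  # outside unsync.loop, result() blocks until done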

Examples

Simple Sleep

A simple sleeping example with asyncio:

import asyncio

async def sync_async():
    await asyncio.sleep(1)
    return 'I hate event loops'


async def main():
    future1 = asyncio.create_task(sync_async())
    future2 = asyncio.create_task(sync_async())

    await asyncio.gather(future1, future2)

    print(future1.result() + future2.result())

asyncio.run(main())
# Takes 1 second to run

Same example with unsync:

import asyncio
from unsync import unsync

@unsync
async def unsync_async():
    await asyncio.sleep(1)
    return 'I like decorators'

unfuture1 = unsync_async()
unfuture2 = unsync_async()
print(unfuture1.result() + unfuture2.result())
# Takes 1 second to run

Multi-threading an IO-bound function

Synchronous functions can be made to run asynchronously by executing them in a concurrent.futures.ThreadPoolExecutor. This is easily accomplished by decorating the regular function with @unsync.

import time
from unsync import unsync

@unsync
def non_async_function(seconds):
    time.sleep(seconds)
    return 'Run concurrently!'

start = time.time()
tasks = [non_async_function(0.1) for _ in range(10)]
print([task.result() for task in tasks])
print('Executed in {} seconds'.format(time.time() - start))

Which prints:

['Run concurrently!', 'Run concurrently!', ...]
Executed in 0.10807514190673828 seconds

Continuations

Unfuture.then chains asynchronous calls and returns an Unfuture that wraps both the source and the continuation. The continuation is invoked with the source Unfuture as its first argument. Continuations can be regular functions (which execute synchronously) or @unsync functions; a regular-function continuation is sketched after this example.

import asyncio
import time
from unsync import unsync

@unsync
async def initiate(request):
    await asyncio.sleep(0.1)
    return request + 1

@unsync
async def process(task):
    await asyncio.sleep(0.1)
    return task.result() * 2

start = time.time()
print(initiate(3).then(process).result())
print('Executed in {} seconds'.format(time.time() - start))

Which prints:

8
Executed in 0.20314741134643555 seconds
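
Since continuations can also be plain callables, the same chain works with a regular function, which runs synchronously once the source completes (a sketch reusing initiate from above):

print(initiate(3).then(lambda task: task.result() + 10).result())
# The lambda receives the source Unfuture as its argument; prints 14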

Mixing methods

We'll start by converting a regular synchronous function into a threaded Unfuture which will begin our request.

@unsync
def non_async_function(num):
    time.sleep(0.1)
    return num, num + 1

We may want to refine the result in another function, so we define the following continuation.

@unsync
async def result_continuation(task):
    await asyncio.sleep(0.1)
    num, res = task.result()
    return num, res * 2

We then aggregate all the results into a single dictionary in an async function.

@unsync
async def result_processor(tasks):
    output = {}
    for task in tasks:
        num, res = await task
        output[num] = res
    return output

Executing the full chain of non_async_function → result_continuation → result_processor would look like:

start = time.time()
print(result_processor([non_async_function(i).then(result_continuation) for i in range(10)]).result())
print('Executed in {} seconds'.format(time.time() - start))

Which prints:

{0: 2, 1: 4, 2: 6, 3: 8, 4: 10, 5: 12, 6: 14, 7: 16, 8: 18, 9: 20}
Executed in 0.22115683555603027 seconds

Preserving typing

As far as we know, it is not possible to change the return type of a method or function using a decorator. Therefore, we need a workaround to use IntelliSense properly. In general, you have three options:

  1. Ignore type warnings.

  2. Use a suppression statement where you reach the type warning (see the sketch after this list).

    A. When defining the unsynced method by changing the return type to an Unfuture.

    B. When using the unsynced method.

  3. Wrap the function without a decorator. Example:

    from unsync import unsync, Unfuture

    def function_name(x: str) -> Unfuture[str]:
        async_method = unsync(__function_name_synced)
        return async_method(x)

    def __function_name_synced(x: str) -> str:
        return x + 'a'

    future_result = function_name('b')
    assert future_result.result() == 'ba'
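
For option 2, the suppression might look like the following sketch (fetch is a hypothetical function, and the exact suppression comment depends on your type checker):

from unsync import unsync

@unsync
async def fetch(x: str) -> str:
    return x + 'a'

# At runtime fetch('b') returns an Unfuture[str], so suppress the warning at
# the call site (option 2B) instead of changing the declared return type.
future_result = fetch('b')  # type: ignore
print(future_result.result())  # prints 'ba'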

Custom Event Loops

In order to use a custom event loop, be sure to set the event loop policy before calling any @unsync functions. For example, to use uvloop:

import uvloop
from unsync import unsync

@unsync
async def main():
    # Main entry-point.
    ...

uvloop.install()  # Equivalent to asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
main().result()  # block until main completes

Comments
  • Python 3.9.x compatibility?

    Hi @alex-sherman, thanks a lot for this great API. I just wanted to give it a quick test with Python 3.9.0, but I get the following error. Changing the number of concurrent processes does not change the error (even 1 gives the same error).

    Code:

    from unsync import unsync
    import time
    
    
    @unsync(cpu_bound=True)
    def heavy_calculation(num: int) -> int:
    
        print(f"iteration {num} started")
        return num ** num
    
    
    if __name__ == "__main__":
    
        jobs = [heavy_calculation(i) for i in range(100)]
    
        for job in jobs:
            print(job.result())
    
        print(time.perf_counter())
    
    

    Error:

    Traceback (most recent call last):
      File "/Users/bbb/OneDrive/Docs/-- BW  --/-- Dev/SANDBOX/t_unsync.py", line 35, in <module>
        print(job.result())
      File "/Users/bbb/OneDrive/Docs/-- BW  --/-- Dev/SANDBOX/sandbox/lib/python3.9/site-packages/unsync/unsync.py", line 117, in result
        return self.concurrent_future.result(*args, **kwargs)
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/concurrent/futures/_base.py", line 440, in result
        return self.__get_result()
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/concurrent/futures/_base.py", line 389, in __get_result
        raise self._exception
    KeyError: ('__main__', 'heavy_calculation')
    (sandbox) bbb@Bariss-MacBook-Pro SANDBOX % /Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 5 leaked semaphore objects to clean up at shutdown
      warnings.warn('resource_tracker: There appear to be %d '
    
    opened by IdemenB 6
  • Can't Import unsync on AWS Lambda

    I am trying to use the unsync module with AWS Lambda, but every time I try to import unsync, my Lambda execution fails with the following error.

    "module initialization error: [Errno 38] Function not implemented"

    opened by simplynd 5
  • Generic type Unfuture test.

    Hey Alex,

    This is my suggestion from the typing issue.

    As a side question: do you think that unsync is production-ready? And what can we do to improve the longevity of this package?

    opened by Luttik 5
  • Ability to limit concurrency?

    This looks like a fantastic tool! Very interested in it. I'm trying to see if there's a way to limit concurrency while using this tool. For example, I'd like to make hundreds or thousands of API calls, each of which takes about 10 seconds, but I'd like to limit the load on the server and have at most about 20 calls running at a time. Is that something I could do with this tool? I couldn't find anything in the tests or documentation, so I'm guessing no, but thought I'd check. If not, is that a reasonable feature request, or would that use case mean I should choose a different tool?

    opened by benlindsay 4
  • Any support or alternative for `asyncio.Semaphore`?

    Hi! Thanks for developing unsync, it's a great piece of work. Since I am new to this module and asyncio, I am wondering whether there is a way to apply asyncio.Semaphore while using @unsync? I wrote a program that retrieves data from an API with unsync, but somehow it went wrong like this:

    File "C:\Users\Nature\Miniconda3\lib\asyncio\locks.py", line 92, in __aenter__
        await self.acquire()
      File "C:\Users\Nature\Miniconda3\lib\asyncio\locks.py", line 474, in acquire
        await fut
    RuntimeError: Task <Task pending coro=<fetch_file() running at MyPyCode.py:84>> got Future <Future pending> attached to a different loop
    

    After I remove asyncio.Semaphore, the program works fine but is unable to restrict the number of coroutines. The reason I use asyncio.Semaphore is to avoid effectively launching a DoS attack against the server.

    Original Code

    import os
    from time import perf_counter
    import datetime
    import asyncio
    import aiohttp
    import aiofiles
    from unsync import unsync
    
    BASE_URL = "https://www.ebi.ac.uk/pdbe/api/pdb/entry/"
    DEMO = [
        (BASE_URL+'summary/1a01', '1a01_summary.json'),
        (BASE_URL+'summary/2xyn', '2xyn_summary.json'),
        (BASE_URL+'summary/1miu', '1miu_summary.json'),
        (BASE_URL+'summary/2hev', '2hev_summary.json'),
        (BASE_URL+'summary/3g96', '3g96_summary.json')]
    
    @unsync
    async def download_file(url):
        print(f"[{datetime.datetime.now()}] Start to get file: {url}")
        async with aiohttp.ClientSession() as session:
            async with session.get(url) as resp:
                if resp.status == 200:
                    return await resp.read()
                elif resp.status == 404:
                    return None
                else:
                    mes = "code={resp.status}, message={resp.reason}, headers={resp.headers}"
                    raise Exception(mes.format(resp=resp))
    
    
    @unsync
    async def save_file(path, data):
        print(f"[{datetime.datetime.now()}] Start to save file: {path}")
        async with aiofiles.open(path, 'wb') as jsonFile:
            await jsonFile.write(data)
    
    
    @unsync
    async def fetch_file(semaphore, url, path, rate):
        async with semaphore:
            data = await download_file(url)
            await asyncio.sleep(rate)
            if data is not None:
                await save_file(path, data)
                return path
    
    
    def multi_tasks(workdir, concur_req: int, rate=1.5):
        semaphore = asyncio.Semaphore(concur_req)
        tasks = [fetch_file(semaphore, url, os.path.join(workdir, path), rate) for url, path in DEMO]
        for t in tasks:
            t.result()
    
    
    if __name__ == "__main__":
        workdir = "./"
        t0 = perf_counter()
        multi_tasks(workdir, 4)
        elapsed = perf_counter() - t0
        print(f'downloaded in {elapsed}s')
    

    Behavior (as mentioned above)

    (base) PS C:\GitWorks\temp> python .\MyPyCode.py
    [2020-01-16 22:08:59.948408] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/1a01
    [2020-01-16 22:08:59.972382] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/2xyn
    [2020-01-16 22:08:59.974339] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/1miu
    [2020-01-16 22:08:59.975377] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/2hev
    [2020-01-16 22:09:02.782357] Start to save file: C:\GitWorks\temp\1miu_summary.json
    [2020-01-16 22:09:02.783426] Start to save file: C:\GitWorks\temp\2hev_summary.json
    [2020-01-16 22:09:02.896178] Start to save file: C:\GitWorks\temp\2xyn_summary.json
    [2020-01-16 22:09:02.932120] Start to save file: C:\GitWorks\temp\1a01_summary.json
    Traceback (most recent call last):
      File ".\MyPyCode.py", line 60, in <module>
        multi_tasks(workdir, 4)
      File ".\MyPyCode.py", line 54, in multi_tasks
        t.result()
      File "C:\Users\Nature\Miniconda3\lib\site-packages\unsync\unsync.py", line 112, in result
        return self.future.result()
      File ".\MyPyCode.py", line 41, in fetch_file
        async with semaphore:
      File "C:\Users\Nature\Miniconda3\lib\asyncio\locks.py", line 92, in __aenter__
        await self.acquire()
      File "C:\Users\Nature\Miniconda3\lib\asyncio\locks.py", line 474, in acquire
        await fut
    RuntimeError: Task <Task pending coro=<fetch_file() running at .\MyPyCode.py:41>> got Future <Future pending> attached to a different loop
    

    Code that removes asyncio.Semaphore

    Just modify the following functions; the rest of the code is the same as the original.

    @unsync
    async def fetch_file(url, path, rate):
        data = await download_file(url)
        await asyncio.sleep(rate)
        if data is not None:
            await save_file(path, data)
            return path
    
    def multi_tasks(workdir, concur_req: int, rate=1.5):
        # semaphore = asyncio.Semaphore(concur_req)
        tasks = [fetch_file(url, os.path.join(
            workdir, path), rate) for url, path in DEMO]
        for t in tasks:
            t.result()
    

    Behavior

    (base) PS C:\GitWorks\temp> python .\MyPyCode.py
    [2020-01-16 22:27:25.656445] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/1a01
    [2020-01-16 22:27:25.685366] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/2xyn
    [2020-01-16 22:27:25.686413] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/1miu
    [2020-01-16 22:27:25.688351] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/2hev
    [2020-01-16 22:27:25.712284] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/3g96
    [2020-01-16 22:27:28.868901] Start to save file: C:\GitWorks\temp\1miu_summary.json
    [2020-01-16 22:27:28.918699] Start to save file: C:\GitWorks\temp\2hev_summary.json
    [2020-01-16 22:27:28.921433] Start to save file: C:\GitWorks\temp\1a01_summary.json
    [2020-01-16 22:27:28.923428] Start to save file: C:\GitWorks\temp\2xyn_summary.json
    [2020-01-16 22:27:28.947356] Start to save file: C:\GitWorks\temp\3g96_summary.json
    downloaded in 3.3005259000000002s
    

    It is good to see that the program runs very fast, but it risks becoming a DoS attack on the server as the number of coroutines grows.

    Code that removes @unsync

    Just modify the following functions; the rest of the code is the same as the original.

    # @unsync
    async def download_file(url):
        print(f"[{datetime.datetime.now()}] Start to get file: {url}")
        async with aiohttp.ClientSession() as session:
            async with session.get(url) as resp:
                if resp.status == 200:
                    return await resp.read()
                elif resp.status == 404:
                    return None
                else:
                    mes = "code={resp.status}, message={resp.reason}, headers={resp.headers}"
                    raise Exception(mes.format(resp=resp))
    
    
    # @unsync
    async def save_file(path, data):
        print(f"[{datetime.datetime.now()}] Start to save file: {path}")
        async with aiofiles.open(path, 'wb') as jsonFile:
            await jsonFile.write(data)
    
    
    # @unsync
    async def fetch_file(semaphore, url, path, rate):
        async with semaphore:
            data = await download_file(url)
            await asyncio.sleep(rate)
            if data is not None:
                await save_file(path, data)
                return path
    
    
    def multi_tasks(workdir, concur_req: int, rate=1.5):
        semaphore = asyncio.Semaphore(concur_req)
        '''
        tasks = [fetch_file(semaphore, url, os.path.join(
            workdir, path), rate) for url, path in DEMO]
        for t in tasks:
            t.result()
        '''
        loop = asyncio.get_event_loop()
        tasks = asyncio.gather(*[loop.create_task(fetch_file(semaphore, url, os.path.join(workdir, path), rate))
                                 for url, path in DEMO])
        loop.run_until_complete(tasks)
    

    Behavior

    (base) PS C:\GitWorks\temp> python .\MyPyCode.py
    [2020-01-16 22:21:28.923504] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/1a01
    [2020-01-16 22:21:28.948484] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/2xyn
    [2020-01-16 22:21:28.949433] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/1miu
    [2020-01-16 22:21:28.950435] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/2hev
    [2020-01-16 22:21:32.499267] Start to save file: C:\GitWorks\temp\1a01_summary.json
    [2020-01-16 22:21:32.499759] Start to save file: C:\GitWorks\temp\1miu_summary.json
    [2020-01-16 22:21:32.507789] Start to get file: https://www.ebi.ac.uk/pdbe/api/pdb/entry/summary/3g96
    [2020-01-16 22:21:32.542851] Start to save file: C:\GitWorks\temp\2xyn_summary.json
    [2020-01-16 22:21:32.546636] Start to save file: C:\GitWorks\temp\2hev_summary.json
    [2020-01-16 22:21:35.358457] Start to save file: C:\GitWorks\temp\3g96_summary.json
    downloaded in 6.4380731s
    

    Although it works well without unsync, I would like to apply unsync's mixing-methods feature in my job, which integrates threads and processes.

    Environment

    {
        "platform": "Win10",
        "python": "3.7.1", 
        "aiohttp": "3.6.2",
        "aiofiles": "0.4.0",
        "unsync": "1.2.1"
    }
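
    A plausible workaround (a sketch, untested, and not from this thread): the "attached to a different loop" error suggests the Semaphore binds to the loop it was created on, so creating it inside an @unsync coroutine should bind it to unsync's ambient loop instead:

    @unsync
    async def make_semaphore(concur_req: int):
        # Created while running on unsync.loop, so the Semaphore binds to the
        # same ambient loop that the other @unsync coroutines run on.
        return asyncio.Semaphore(concur_req)

    def multi_tasks(workdir, concur_req: int, rate=1.5):
        semaphore = make_semaphore(concur_req).result()
        tasks = [fetch_file(semaphore, url, os.path.join(workdir, path), rate)
                 for url, path in DEMO]
        for t in tasks:
            t.result()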
    
    opened by NatureGeorge 4
  • Why not use unittest's assertRaises instead of pytest.raises?

    In tests:

        def test_exception(self):
            class TestException(Exception):
                pass
    
            @unsync
            async def error():
                await asyncio.sleep(0.1)
                raise TestException
    
            with raises(TestException):  # pytest.raises
                error().result()
    

    but self.assertRaises would do the same:

            with self.assertRaises(TestException):
                error().result()
    

    I think it would be nice to not have any third-party dependencies at all, even in tests. Or maybe there is some tricky difference?

    opened by and-semakin 4
  • closing old processes

    Hi, after running my script that uses unsync (thank you so much for this great tool!!) for a couple of hours, my OS starts complaining about all the open processes. I read that setting unsync.process_executor = None would clean out old processes, but I'm getting AttributeError: can't set attribute when executing that line in my code.

    opened by 0xstochastic 3
  • Refactors

    • use hasattr instead of getattr(..., None) is None
    • Rename TestEventLoopPolicy class so that pytest doesn't warn about not being able to test that class.
    opened by tusharsadhwani 3
  • Added support for uvloop in case of Mac / Linux systems

    I have been using unsync for some time.

    I thought it would be nice if unsync could support uvloop for async methods, at least on Mac and Linux, since Windows is not yet supported by uvloop.

    Submitting a PR for the same. Let me know if this looks good enough.

    opened by rams3sh 3
  • Defining the amount of threads in use for sync functions

    How is the number of threads that get launched for a synchronous function decorated with @unsync determined? Is there a way to define that number? If it's not possible out of the box, how could I start approaching that?
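
    One possible approach (an assumption about unsync internals, not a documented API): since the Quick Overview above says regular functions execute in unsync.thread_executor, swapping that executor for one with a fixed size before any @unsync function runs might bound the thread count:

    from concurrent.futures import ThreadPoolExecutor
    from unsync import unsync

    # Assumption: unsync.thread_executor is a plain attribute holding the pool
    # used for @unsync-decorated regular functions; replace it before first use.
    unsync.thread_executor = ThreadPoolExecutor(max_workers=4)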

    opened by ivergara 3
  • Add support for preserving attributes of nested instance methods

    This PR expands on #5, which used functools.update_wrapper to ensure that the @unsync decorator is well-behaved and preserves some key attributes of wrapped functions (most notably the function name).

    It turns out, however, that unsync handles wrapping class methods slightly differently from wrapping regular functions, which meant that for two identically named callables wrapped_func:

    class Class:
    
        @unsync
        async def wrapped_func(self): pass
    
    @unsync
    async def wrapped_func(): pass
    

    ...the __name__ attribute of the class method would not be preserved, such that wrapped_func.__name__ != Class().wrapped_func.__name__

    This PR ports the behaviour from #5 to class methods so that attributes of both types of callables are preserved in a similar fashion.

    More details included in the expanded test cases.

    opened by jcass77 3
  • CPU Bound Decorator in Wrapped Functions

    For context, I am a new user of unsync, and I am exploring its use for running concurrent code triggered from a tkinter GUI. I'm currently studying ways of getting data produced by concurrent operations back into tkinter's own event loop. As part of that, I've come across a problem with CPU-bound functions.

    In the following code the cpu_bound_2 function fails with this exception:

    
    Traceback (most recent call last):
      File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/concurrent/futures/process.py", line 243, in _process_worker
        r = call_item.fn(*call_item.args, **call_item.kwargs)
      File "/Users/stephen/Documents/Coding/Scrapbook/env/lib/python3.9/site-packages/unsync/unsync.py", line 98, in _multiprocess_target
        return unsync.unsync_functions[func_name](*args, **kwargs)
    KeyError: ('__main__', 'func')
    """
    
    The above exception was the direct cause of the following exception:
    
    KeyError: ('__main__', 'func')
    
    import math
    import sys
    import time
    
    from unsync import unsync
    
    
    @unsync(cpu_bound=True)
    def cpu_bound_1():
        for number in range(20_000_000):
            math.sqrt(number)
        print("Finishing cpu_bound_1")
        
        
    def threadable():
        @unsync
        def func():
            time.sleep(3)
            print("Finishing threadable")
        return func
    
    
    def cpu_bound_2():
        @unsync(cpu_bound=True)
        def func():
            for number in range(20_000_000):
                math.sqrt(number)
            print("Finishing cpu_bound_2")
        return func
    
    
    def main():
        cpu_bound_1()
        threadable()()
        cpu_bound_2()()
    
    
    if __name__ == '__main__':
        sys.exit(main())
    

    Python 3.9 on macOS Big Sur 11.4.
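
    A likely cause (an assumption read off the traceback, not confirmed in this thread): the worker process looks the function up by its (module, name) key, which cannot resolve a function nested inside another function. A sketch of a workaround is to define the cpu_bound function at module level:

    # Hypothetical fix: a module-level cpu_bound function can be resolved by
    # the worker process via its (module, name) key.
    @unsync(cpu_bound=True)
    def _cpu_work():
        for number in range(20_000_000):
            math.sqrt(number)
        print("Finishing cpu_bound_2")

    def cpu_bound_2():
        return _cpu_work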

    opened by trin5tensa 1
  • Async generators

    Scenarios like these are not currently supported:

    @unsync
    def my_gen(block_size):
        while block := slow_read(block_size):
            yield from block
    
    async def my_consumer():
        async for item in my_gen(1000):
             print(f"Found item: {item}")
    

    They will error out by not finding __aiter__. At first sight this is easy to add, and I was planning to just drop a PR and walk off, but there are a few design decisions here.

    Should we just call next() on whatever thread in the thread pool as __anext__ is called? Or should we trap a thread for the lifetime of the iterator, detecting iterator abandonment using the weakref module? Should that be exposed as an option in the decorator?

    opened by ekevoo 2
  • Typing

    It would be great if the IDE could understand that an unsynced method returns an Unfuture rather than the usual data. And it would be even sweeter if the Unfuture were generic, so that we know which value .result() returns.

    opened by Luttik 5