Overview

ShadowClone

ShadowClone allows you to distribute your long-running tasks dynamically across thousands of serverless functions, giving you results within seconds where they would otherwise have taken hours to complete.

You can make full use of the Free Tiers provided by cloud providers and supercharge your mundane CLI tools with shadow clone jutsu (Naruto style)!

Installation

Please visit the wiki for installation and initial configuration instructions.
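
As a rough sketch of the flow (taken from the Lithops commands that appear in the issues further down; the wiki remains the authoritative guide), setup boils down to building the container runtime and verifying the Lithops configuration:

⚡ lithops runtime build sc-runtime -f Dockerfile   # build the container image used as the serverless runtime
⚡ lithops test                                      # verify the backend/storage settings in ~/.lithops/config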

Usage

⚡ python shadowclone.py -h
usage: shadowclone.py [-h] -i INPUT [-s SPLITNUM] [-o OUTPUT] -c COMMAND

optional arguments:
  -h, --help            show this help message and exit
  -i INPUT, --input INPUT
  -s SPLITNUM, --split SPLITNUM
                        Number of lines per chunk of file
  -o OUTPUT, --output OUTPUT
  -c COMMAND, --command COMMAND
                        command to execute
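
For example, a typical run might look like the command below. The {INPUT} placeholder appears to be substituted with the path of each chunk on the remote instance (inferred from the issue reports further down); the input file name and the httpx path are illustrative:

⚡ python shadowclone.py -i subdomains.txt -s 100 -o results.txt -c "/go/bin/httpx -l {INPUT}"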

How it works

We create a container image during the initial setup and register it as a runtime for our function in AWS/GCP/Azure. When you execute ShadowClone on your computer, instances of that container are activated automatically and remain active only for the duration of their execution. The number of instances to activate is decided dynamically at runtime, based on the size of the input file and the split factor. The input is then split into chunks and distributed equally among the instances, which execute in parallel. For example, if your input file has 10,000 lines and you set the split factor to 100 lines, it will be split into 100 chunks of 100 lines each, and 100 instances will run in parallel!
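
The same fan-out can be pictured with a minimal Lithops sketch. This is only an illustration of the split-and-map pattern, not ShadowClone's actual source; the input file name, chunk size and the httpx command are placeholder assumptions.

# illustration of the split-and-map pattern described above (not ShadowClone's code)
import subprocess
from lithops import FunctionExecutor

def run_on_chunk(chunk):
    # each serverless instance receives one chunk of lines and runs the tool on it
    input_file = "/tmp/input.txt"
    with open(input_file, "w") as f:
        f.write("\n".join(chunk))
    # placeholder command - ShadowClone substitutes whatever you pass via -c
    result = subprocess.run(["/go/bin/httpx", "-l", input_file], capture_output=True, text=True)
    return result.stdout

def split_into_chunks(lines, size):
    # the number of chunks determines how many function invocations are made
    return [lines[i:i + size] for i in range(0, len(lines), size)]

with open("subdomains.txt") as f:               # placeholder input file
    chunks = split_into_chunks(f.read().splitlines(), 100)

fexec = FunctionExecutor()                      # backend/storage come from ~/.lithops/config
fexec.map(run_on_chunk, chunks)                 # one invocation per chunk, all in parallel
for output in fexec.get_result():               # collect and merge the results locally
    print(output, end="")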

Features

  • Extremely fast
  • No need to maintain a VPS (or a fleet of them :))
  • Costs almost nothing per month
    • Compatible with free tiers of most cloud services
  • Cloud agnostic
    • Same script works with AWS, GCP, Azure etc.
  • Supports up to 1,000 parallel invocations
  • Dynamically decides the number of invocations
  • Run any tool in parallel on the cloud
  • Pipe output to other tools (see the example below)
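
To illustrate the last bullet: since the -o flag is optional, the results can presumably be read straight from stdout and piped into another tool. This is an assumption, not verified here; the file names and the downstream commands are illustrative:

⚡ python shadowclone.py -i subdomains.txt -s 100 -c "/go/bin/httpx -l {INPUT}" | tee alive.txt | wc -l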

Comparison

This tool was inspired by the awesome Axiom and Fleex projects, but goes beyond the VPS concept by running the tools on serverless functions and containers instead.

Features              | Axiom/Fleex             | ShadowClone
Instances             | 10-100s*                | 1000s
Cost                  | Per instance/per minute | Mostly Free**
Startup Time          | 4-5 minutes             | 2-3 seconds
Idle Cost             | $++                     | Free
On Demand Scalability | No                      | Yes

*Most cloud providers do not allow spinning up too many instances by default, so you are limited to around 10-15 instances at most. You have to submit a support request to increase this limit.

** AWS and Azure allow 1 million invocations per month for free; Google allows 2 million invocations per month for free. You will be charged only if you go above these limits.

Demo

DNS bruteforcing using a 43 MB file - 34 seconds

asciicast

Running httpx on 94K subdomains in 1 min

asciicast

References

Lithops documentation

Free Tiers

Cloud Provider   | Free Allowance                                                              | Link
Google Functions | 2 million invocations, 400,000 GB-seconds per month                        | Google Cloud Free Program
AWS Lambda       | 1 million invocations, up to 3.2 million seconds of compute time per month | Free Cloud Computing Services - AWS Free Tier
Azure Functions  | 1 million invocations                                                       | Microsoft Azure Free Services

You can, of course, make any number of function invocations per month; the table above only shows how many of those invocations are free. For example, scanning a 100,000-line wordlist with a split factor of 100 lines costs 1,000 invocations, a tiny fraction of the 1-2 million free invocations each provider allows per month.

Similar Tools

Disclaimer

This tool is designed as a proof-of-concept implementation and its use is intended for educational purposes only. Using it to attack targets without prior mutual consent is illegal. It is the end user's responsibility to obey all applicable local, state, and federal laws. The developers assume no liability and are not responsible for any misuse or damage caused by this program.

Comments
  • ARM64

    Is there anywhere to specify ARM64 for AWS Lambda via Lithops?

    Aside from that, modifying the Dockerfile to --platform=linux/amd64 doesn't seem to make it work on an M1 MacBook.

    opened by vysecurity 3
  • Function stuck in pending state

    It worked on the first go.

    Then I added some other tooling to the Dockerfile and used this command to rebuild the runtime image:

    lithops runtime build sc-runtime -f Dockerfile
    

    Since then, I'm no longer able to run the program. Even the Lithops test fails:

    lithops test
    2022-01-24 10:46:09,925 [INFO] lithops.config -- Lithops v2.5.8
    2022-01-24 10:46:10,081 [INFO] lithops.storage.backends.aws_s3.aws_s3 -- S3 client created - Region: eu-west-3
    2022-01-24 10:46:10,124 [INFO] lithops.serverless.backends.aws_lambda.aws_lambda -- AWS Lambda client created - Region: eu-west-3
    2022-01-24 10:46:10,125 [INFO] lithops.invokers -- ExecutorID d5971d-0 | JobID A000 - Selected Runtime: python38 - 512MB
    2022-01-24 10:46:10,218 [INFO] lithops.invokers -- Runtime python38 with 512MB is not yet installed
    2022-01-24 10:46:10,311 [INFO] lithops.serverless.backends.aws_lambda.aws_lambda -- Creating default lambda layer for runtime python38
    Traceback (most recent call last):
      File "/root/tools/ShadowClone/test/bin/lithops", line 8, in <module>
        sys.exit(lithops_cli())
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/lithops/scripts/cli.py", line 168, in test_function
        fexec.call_async(hello, username)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/lithops/executors.py", line 205, in call_async
        runtime_meta = self.invoker.select_runtime(job_id, runtime_memory)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/lithops/invokers.py", line 121, in select_runtime
        runtime_meta = self.compute_handler.create_runtime(self.runtime_name, runtime_memory, runtime_timeout)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/lithops/serverless/serverless.py", line 73, in create_runtime
        return self.backend.create_runtime(runtime_name, memory, timeout=timeout)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 413, in create_runtime
        layer_arn = self._create_layer(runtime_name)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 200, in _create_layer
        response = self.lambda_client.invoke(
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/botocore/client.py", line 391, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/root/tools/ShadowClone/test/lib/python3.8/site-packages/botocore/client.py", line 719, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.errorfactory.ResourceConflictException: An error occurred (ResourceConflictException) when calling the Invoke operation: The operation cannot be performed at this time. The function is currently in the following state: Pending
    

    Already tried:

    • Remove runtime reference in config.py
    • Remove lambda function manually in console
    • Remove S3 bucket and create a new one and add new one in config
    • Changing AWS region

    Would really appreciate some guidance on how to proceed when editing the Dockerfile.

    opened by ghost 2
  • deploy error

    (env) root@zyz:~/ShadowClone# lithops runtime create recon --memory 512 --timeout 800
    2022-12-06 00:33:19,115 [INFO] lithops.config -- Lithops v2.5.8
    2022-12-06 00:33:19,115 [DEBUG] lithops.config -- Loading configuration from /root/.lithops/config
    2022-12-06 00:33:19,133 [DEBUG] lithops.config -- Loading Serverless backend module: aws_lambda
    2022-12-06 00:33:19,192 [DEBUG] lithops.config -- Loading Storage backend module: aws_s3
    2022-12-06 00:33:19,193 [INFO] lithops.scripts.cli -- Creating new lithops runtime: recon
    2022-12-06 00:33:19,193 [DEBUG] lithops.storage.backends.aws_s3.aws_s3 -- Creating S3 client
    2022-12-06 00:33:19,290 [INFO] lithops.storage.backends.aws_s3.aws_s3 -- S3 client created - Region: ap-southeast-1
    2022-12-06 00:33:19,290 [DEBUG] lithops.serverless.backends.aws_lambda.aws_lambda -- Creating AWS Lambda client
    2022-12-06 00:33:19,290 [DEBUG] lithops.serverless.backends.aws_lambda.aws_lambda -- Creating Boto3 AWS Session and Lambda Client
    2022-12-06 00:33:20,206 [INFO] lithops.serverless.backends.aws_lambda.aws_lambda -- AWS Lambda client created - Region: ap-southeast-1
    2022-12-06 00:33:20,207 [DEBUG] lithops.serverless.backends.aws_lambda.aws_lambda -- Creating new Lithops lambda function: lithops_v2-5-8_gzmx_recon_512MB
    Traceback (most recent call last):
      File "/root/ShadowClone/env/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 371, in create_runtime
        response = self.ecr_client.describe_images(repositoryName=repo_name)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/botocore/client.py", line 530, in _api_call
        return self._make_api_call(operation_name, kwargs)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/botocore/client.py", line 960, in _make_api_call
        raise error_class(parsed_response, operation_name)
    botocore.errorfactory.RepositoryNotFoundException: An error occurred (RepositoryNotFoundException) when calling the DescribeImages operation: The repository with name 'lithops_v2-5-8_gzmx/recon' does not exist in the registry with id '866113468777'
    
    During handling of the above exception, another exception occurred:
    
    Traceback (most recent call last):
      File "/root/ShadowClone/env/bin/lithops", line 8, in <module>
        sys.exit(lithops_cli())
      File "/root/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1130, in __call__
        return self.main(*args, **kwargs)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1055, in main
        rv = self.invoke(ctx)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/root/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1657, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/root/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1404, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 760, in invoke
        return __callback(*args, **kwargs)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/lithops/scripts/cli.py", line 434, in create
        runtime_meta = compute_handler.create_runtime(name, mem, timeout=to)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/lithops/serverless/serverless.py", line 73, in create_runtime
        return self.backend.create_runtime(runtime_name, memory, timeout=timeout)
      File "/root/ShadowClone/env/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 376, in create_runtime
        raise Exception('Runtime {} is not deployed to ECR'.format(runtime_name))
    Exception: Runtime recon is not deployed to ECR
    
    

    I thought the problem was with the "role", but I adjusted it according to what you gave on the wiki and it still errors out.

    opened by kepangada 1
  • pickle_db

    Hi, what should I set PICKLE_DB to, or where can I find the bucket-hash.db file?

    python shadowclone.py -i /home/kali/range.txt --split 100 -o test.txt -c "/go/bin/httpx -l {INPUT}"
    2022-10-11 23:07:27,214 [INFO] Splitting input file into chunks of 100 lines
    2022-10-11 23:07:27,219 [INFO] Uploading chunks to storage
    Traceback (most recent call last):
      File "/home/kali/Tools/ShadowClone/shadowclone.py", line 128, in <module>
        db.dump() # save the db to file
      File "/usr/local/lib/python3.10/dist-packages/pickledb.py", line 92, in dump
        json.dump(self.db, open(self.loco, 'wt'))
    FileNotFoundError: [Errno 2] No such file or directory: ''

    in config.py I have the following line:

    PICKLE_DB=""

    FYI, it showed the same errors when running the RUN go install -... /nuclei@latest step, but I skipped the nuclei installation. Thank you!

    opened by georgebecerescu 1
  • Error when running build command

    Hi!

    I've been trying to set up the tool on my Debian 10 box and I'm getting an error when running this command:

    lithops runtime build sc-runtime -f ~/tools/ShadowClone/Dockerfile

    Error snapshot:

    [screenshot of the error output]

    Python version: 3.10.6

    opened by sumgr0 1
  • error output when running

    Hello @fyoorer @vysecurity

    I managed to configure this tool following the steps mentioned, but when I run it, the output only shows "$HOME is not defined". What is actually happening?

    opened by marz-hunter 1
  • Rate limit error while running command

    Any solution for this?

    (env) kkkk:ShadowClone:% python shadowclone.py -i subdomains.txt --split 300 -o httpxresults.txt -c "/go/bin/httpx -l {INPUT} -t 100"                                                                   <main ✗>
    2022-10-20 04:52:29,805 [INFO] Splitting input file into chunks of 300 lines
    2022-10-20 04:52:29,869 [INFO] Uploading chunks to storage
    2022-10-20 04:52:29,886 [INFO] lithops.config -- Lithops v2.5.8
    2022-10-20 04:52:29,928 [INFO] lithops.storage.backends.aws_s3.aws_s3 -- S3 client created - Region: us-east-1
    2022-10-20 04:52:30,062 [INFO] lithops.serverless.backends.aws_lambda.aws_lambda -- AWS Lambda client created - Region: us-east-1
    2022-10-20 04:52:30,064 [INFO] lithops.invokers -- ExecutorID 85b642-0 | JobID M000 - Selected Runtime: lithops_v2-5-8_7vdl/sc-runtime1 - 512MB
    2022-10-20 04:52:37,246 [INFO] lithops.invokers -- ExecutorID 85b642-0 | JobID M000 - Starting function invocation: execute_command() - Total: 167 activations
    exception calling callback for <Future at 0x7fedac13f9d0 state=finished raised Exception>
    Traceback (most recent call last):
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 328, in _invoke_callbacks
        callback(self)
      File "/home/op/env/lib/python3.8/site-packages/lithops/invokers.py", line 439, in _callback
        future.result()
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 437, in result
        return self.__get_result()
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
        raise self._exception
      File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/home/op/env/lib/python3.8/site-packages/lithops/invokers.py", line 370, in _invoke_task
        activation_id = self.compute_handler.invoke(payload)
      File "/home/op/env/lib/python3.8/site-packages/lithops/serverless/serverless.py", line 59, in invoke
        return self.backend.invoke(runtime_name, runtime_memory, job_payload)
      File "/home/op/env/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 577, in invoke
        raise Exception('Error {}: {}'.format(r.status_code, r.text))
    Exception: Error 429: {"Reason":"CallerRateLimitExceeded","Type":"User","message":"Rate exceeded"}
    exception calling callback for <Future at 0x7fed5f95ceb0 state=finished raised Exception>
    Traceback (most recent call last):
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 328, in _invoke_callbacks
        callback(self)
      File "/home/op/env/lib/python3.8/site-packages/lithops/invokers.py", line 439, in _callback
        future.result()
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 437, in result
        return self.__get_result()
      File "/usr/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
        raise self._exception
      File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
        result = self.fn(*self.args, **self.kwargs)
      File "/home/op/env/lib/python3.8/site-packages/lithops/invokers.py", line 370, in _invoke_task
        activation_id = self.compute_handler.invoke(payload)
      File "/home/op/env/lib/python3.8/site-packages/lithops/serverless/serverless.py", line 59, in invoke
        return self.backend.invoke(runtime_name, runtime_memory, job_payload)
      File "/home/op/env/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 577, in invoke
        raise Exception('Error {}: {}'.format(r.status_code, r.text))
    Exception: Error 429: {"Reason":"CallerRateLimitExceeded","Type":"User","message":"Rate exceeded"}
    exception calling callback for <Future at 0x7fedac1ae2b0 state=finished raised Exception>
    
    opened by b1bek 1
  • FFUF Example Usage?

    Hi there,

    Do you have an example of FFUF usage? I keep getting errors parsing the parameters, for example:

    python3 shadowclone.py -i input.txt -o output.txt -s 1 -c "cat {INPUT} | xargs -I @ -c '/go/bin/ffuf -u @ -w wordlist.txt -mc 200'"

    opened by vysecurity 3
  • Could not execute the runtime

    Using AWS Runtime.

    python3 shadowbrute.py -d REDACTED -w subdomain_wordlist.txt
    2022-01-24 16:47:39,414 [INFO] File with same name already exists in the bucket, skipping upload
    2022-01-24 16:47:39,414 [ERROR] Could not execute the runtime.
    

    Could not execute the runtime. How can I debug?

    opened by vysecurity 10
  • getting error while running this command lithops runtime build sc-runtime -f Dockerfile

    2022-01-24 14:04:46,069 [INFO] lithops.config -- Lithops v2.5.8
    2022-01-24 14:04:46,069 [DEBUG] lithops.config -- Loading configuration from /root/.lithops/config
    2022-01-24 14:04:46,139 [DEBUG] lithops.config -- Loading Serverless backend module: aws_lambda
    2022-01-24 14:04:46,240 [DEBUG] lithops.serverless.backends.aws_lambda.aws_lambda -- Creating AWS Lambda client
    2022-01-24 14:04:46,240 [DEBUG] lithops.serverless.backends.aws_lambda.aws_lambda -- Creating Boto3 AWS Session and Lambda Client
    2022-01-24 14:04:47,588 [INFO] lithops.serverless.backends.aws_lambda.aws_lambda -- AWS Lambda client created - Region: us-east-1
    2022-01-24 14:04:47,589 [INFO] lithops.serverless.backends.aws_lambda.aws_lambda -- Going to create runtime -d for AWS Lambda
    2022-01-24 14:04:47,590 [DEBUG] lithops.utils -- Creating function handler zip in lithops_lambda.zip
    Traceback (most recent call last):
      File "/home/cherry/Desktop/ShadowClone/env/bin/lithops", line 8, in <module>
        sys.exit(lithops_cli())
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1128, in __call__
        return self.main(*args, **kwargs)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1053, in main
        rv = self.invoke(ctx)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1659, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 1395, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/core.py", line 754, in invoke
        return __callback(*args, **kwargs)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/click/decorators.py", line 26, in new_func
        return f(get_current_context(), *args, **kwargs)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/lithops/scripts/cli.py", line 465, in build
        compute_handler.build_runtime(name, file, ctx.args)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/lithops/serverless/serverless.py", line 66, in build_runtime
        self.backend.build_runtime(runtime_name, file, extra_args)
      File "/home/cherry/Desktop/ShadowClone/env/lib/python3.8/site-packages/lithops/serverless/backends/aws_lambda/aws_lambda.py", line 319, in build_runtime
        subprocess.check_call(cmd.split())
      File "/usr/lib/python3.8/subprocess.py", line 359, in check_call
        retcode = call(*popenargs, **kwargs)
      File "/usr/lib/python3.8/subprocess.py", line 340, in call
        with Popen(*popenargs, **kwargs) as p:
      File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: 'None'

    opened by falcon319 8