Template server

Overview

A server shell for you to play with. Powered by Django + Nginx + Postgres + Bootstrap + Celery.


Getting started

  1. Install Docker Community Edition
  2. Install docker-compose for Python 3, e.g. pip3 install --user docker-compose
  3. Add your user to the docker group: sudo usermod -a -G docker username. You may have to reboot after this step for the group change to take effect.
  4. Create a .local_params file in the root directory, using .local_params_example as a template. See the "Running jobs" section for details.

You should then use the local-docker-compose script as a drop-in replacement for docker-compose. For example, to start the server you can run local-docker-compose up --build.
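
A typical first launch might look like this (assuming local-docker-compose sits in the project root):

    # Copy the template and fill in your values (see "Running jobs" below)
    cp .local_params_example .local_params
    # Build the images and start all the services
    ./local-docker-compose up --build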

Cleaning up after Docker for a clean rebuild:

  1. ./local-docker-compose rm removes the containers
  2. docker volume prune removes the unused volumes

If you don't explicitly remove the volumes between docker runs, the databases persist, so you can stop the containers and launch them again safely without any loss of data.

Architecture

Docker runs several services: web (which runs Gunicorn), nginx, and db (the Postgres database). Gunicorn serves the Python (Django) code, accesses the database, and cooperates with Nginx. Celery is a background task manager; it needs RabbitMQ as its message broker. Flower is a task monitor powered by Celery, and can be accessed at localhost:5555.
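
As a rough sketch, the wiring in docker-compose.yml might look like the following (service names, images, and commands here are illustrative; the actual file in the repo is authoritative):

    services:
      db:
        image: postgres
      rabbitmq:
        image: rabbitmq
      web:
        build: .
        command: gunicorn server.wsgi
        depends_on: [db, rabbitmq]
      celery:
        build: .
        command: celery -A server worker
        depends_on: [rabbitmq]
      flower:
        build: .
        command: celery -A server flower --port=5555
        ports: ["5555:5555"]
      nginx:
        image: nginx
        ports: ["8080:80"]
        depends_on: [web]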

Structure

All the frontend code is located in server/. The structure of the server/ directory is enforced by Django's conventions, so we have server/server, which holds the server settings (settings.py) as well as config.py. config.py is where the custom variables are kept (e.g. the e-mail login and password for sending messages to users); they are in turn populated from the environment, which is set in docker-compose.yml.
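
The pattern in config.py is roughly the following (a minimal sketch; the variable names match the ones described under ".local_params" below, but the actual file is authoritative):

    # config.py -- read custom variables from the environment,
    # which docker-compose.yml populates from .local_params
    import os

    EMAIL_USER = os.environ.get("EMAIL_USER", "")
    EMAIL_PASS = os.environ.get("EMAIL_PASS", "")
    EMAIL_HOST = os.environ.get("EMAIL_HOST", "smtp.gmail.com")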
core/ contains the app code (an "app" in Django terms). core/templates has all the HTML files, core/static the CSS and JS, and runner/ contains the code for running jobs.

core/ structure

  1. views.py is the main file - it contains the functions that render the pages and handle all the forms and requests (see the sketch after this list). Most of these functions return an HTML response.

  2. urls.py assigns URLs to the functions in views.py.

  3. models.py contains the custom data tables, which are added to the default Django tables. Right now it contains a model for jobs, which you can customize as you wish. The intention, however, was to keep the generic job fields as separate class attributes (job name, IP, etc.) and to store the remaining job-specific parameters as a JSON string in the details_json field. This way we avoid creating a different table for every job type, or endlessly adding new fields to the same job table whenever new job parameters appear.

  4. All the forms on the website are contained in forms.py, and it should be kept that way. These forms are all handled in views.py.

  5. emails.py has the messages we send to users. They use the e-mail address and password specified in server/settings.py, which are in turn taken from environment variables in docker-compose.yml. If these were not specified, you will get an error whenever the server tries to send a message.

  6. env.py is where you should keep your local variables. All the variables in the env dictionary are also passed as context to the HTML templates, so you can refer to them in the templates.
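
To make these pieces concrete, here is a minimal, hypothetical sketch of how a model, a form, a view, and a URL fit together in this layout (all names are illustrative, not the actual code in core/):

    # core/models.py
    from django.db import models

    class Job(models.Model):
        name = models.CharField(max_length=200)
        ip = models.GenericIPAddressField()
        # everything job-specific goes here as a JSON string
        details_json = models.TextField(default="{}")

    # core/forms.py
    from django import forms

    class JobForm(forms.Form):
        name = forms.CharField()
        a = forms.IntegerField()
        b = forms.IntegerField()

    # core/views.py
    import json
    from django.shortcuts import render
    from .forms import JobForm
    from .models import Job

    def submit(request):
        form = JobForm(request.POST or None)
        if form.is_valid():
            Job.objects.create(
                name=form.cleaned_data["name"],
                ip=request.META.get("REMOTE_ADDR", "0.0.0.0"),
                details_json=json.dumps(
                    {"a": form.cleaned_data["a"], "b": form.cleaned_data["b"]}
                ),
            )
        return render(request, "submit.html", {"form": form})

    # core/urls.py
    from django.urls import path
    from . import views

    urlpatterns = [path("submit/", views.submit, name="submit")]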

On first launch

Two users are created:

  1. admin, with password 'admin'. This is a superuser; you should change its password immediately. The admin page is located at http://localhost:8080/admin
  2. anon, which is the account you are logged into once you click the 'use without your own account' button on the login page. It has limited permissions.

A storage/ directory is also created in the root; all the jobs will be kept there.

Jobs

When you run jobs, they are stored in the Docker container in /storage, which by default is mounted in your project root. You can change this in docker-compose.yml. Storage has two directories: tmp/ for temporary files, in case you need to compute or check something before adding the job to the database, and jobs/ with all the jobs.
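
On the host this looks roughly like the following (individual job directory names are illustrative):

    storage/
      tmp/    temporary files for pre-checks before a job enters the database
      jobs/   one directory per submitted job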


Running jobs

Example job

Currently a job performs the addition of two integers. Some additional requirements are included to demonstrate how to use error pop-ups, etc. The task itself is located in models.py.
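
As a sketch, such a task could look like this as a Celery task (illustrative only; the actual implementation lives in models.py):

    from celery import shared_task

    @shared_task
    def add(a: int, b: int) -> int:
        # the template's example job: add two integers
        return a + b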

.local_params

Environment variables holding some paths and the e-mail login and password are stored in .local_params, which is used when you run local-docker-compose. To create the file, use .local_params_example as a template.

Variables for sending e-mails. If you don't specify them, everything will still run, but you will get errors when new users register, etc. If your e-mail is server@gmail.com and the password is password, then the values should be:

EMAIL_USER - server
EMAIL_PASS - password
EMAIL_HOST - smtp.gmail.com

RABBITMQ_USER and RABBITMQ_PASS will be generated and added to .local_params at the first run of local-docker-compose, unless specified by the user.

LOCAL_PORT is the port through which you access the server (default is 8080).

SECRET_KEY is for Django's internal use (it is generated at the first run of local-docker-compose) and should be kept secret.
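
Putting it all together, a filled-in .local_params might look like this (assuming the plain KEY=value format of .local_params_example; all values are examples only):

    EMAIL_USER=server
    EMAIL_PASS=password
    EMAIL_HOST=smtp.gmail.com
    RABBITMQ_USER=guest      # generated at first run if omitted
    RABBITMQ_PASS=changeme   # generated at first run if omitted
    LOCAL_PORT=8080
    SECRET_KEY=...           # generated at first run; keep it secret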
