Plan
Crontab jobs management in Python
Overview
Plan is a Python package for writing and deploying cron jobs. It converts Python code into cron syntax, so managing your cron jobs becomes painless. It is designed for elegance: writing a cron job should take as little code as possible. It is extensible, but comes with several useful job types out of the box. The goal is to make writing crontab files fun and mistake-free. Plan offers:
- one command to create a quickstart example schedule.py file
- an easy way to define a task, its frequency (every), time (at), running path, shell environment, and output
- handling of communication with your crontab process, with run types such as write, update, and clear
Read the docs at http://plan.readthedocs.org/ If anything feels wrong, feedback and pull requests are welcome.
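To make that concrete, here is a minimal schedule.py sketch along the lines of the quickstart mentioned above; the command, script name, path, and frequency values are illustrative, not taken from the Plan docs verbatim:

    # schedule.py - a minimal Plan schedule (illustrative values)
    from plan import Plan

    cron = Plan()

    # run a shell command every day at noon
    cron.command('ls /tmp', every='1.day', at='12:00')
    # run a script from your project's scripts directory once a month
    cron.script('clean_cache.py', path='/web/yourproject/scripts', every='1.month')

    if __name__ == '__main__':
        # 'check' previews the generated cron syntax; other run types
        # include write, update, and clear
        cron.run('check')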
Comments
-
Do not exit on successful cron update
I'm trying to use Plan to add cron commands at Django site startup. When I use the 'update' run type, my Django server goes down, because the 'update' command calls
sys.exit(1)
even on success. I worked around it with the code below, but I think it's ugly:
    # Don't exit on success
    try:
        cron.run("update")
    except SystemExit:
        pass
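The same workaround can be written a bit more tidily with the standard library's contextlib.suppress; this is still only a sketch of the caller-side workaround, not a change inside Plan itself:

    import contextlib

    # Swallow the SystemExit raised by cron.run('update') so the
    # surrounding Django startup code keeps running.
    with contextlib.suppress(SystemExit):
        cron.run('update')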
-
Global distributed crontab
Brilliant module, congrats!
It would be extra awesome if it could be combined with, for instance, @ansible:
https://github.com/ansible/ansible
Often sysadmins have several machines running cron and they forget what is running on which, causing race conditions on services that rely on different states and dependencies.
Having both composable crontab modules and a global view of how crontabs look across machines would be quite an awesome feature.
Have you checked out Chronos, for instance?
http://airbnb.github.io/chronos/
-
add MAILTO to cron groups
Add mailto:
cron = Plan('cron', mailto=['[email protected]', '[email protected]'])
It would generate the output below:
# Begin Plan generated jobs for: zhihu
# jobs will send mail to users below [email protected],[email protected]
Only named cron objects would be supported.
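For context, a hypothetical sketch of the proposed API together with the output the reporter expects; mailto is not an existing Plan parameter here, and a real implementation might instead emit cron's own MAILTO variable:

    from plan import Plan

    # 'mailto' is the keyword argument proposed in this issue (hypothetical).
    cron = Plan('zhihu', mailto=['[email protected]', '[email protected]'])

    # Expected generated block, per the reporter:
    #   # Begin Plan generated jobs for: zhihu
    #   # jobs will send mail to users below [email protected],[email protected]
    # An implementation could also write cron's MAILTO variable directly,
    # e.g. [email protected] (support for multiple comma-separated
    # recipients varies by cron implementation).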
-
Make Python 3 page more positive
For this project, I don't see why the Python 3 page (http://plan.readthedocs.org/python3.html) needs to recommend Python 2. It supports Python 3 and only depends on click, which also supports Python 3.
It's also entirely possible to schedule your cron jobs with Python 3 while the application itself stays on Python 2 (if necessary). The vast majority of packages now support Python 3 (http://python3wos.appspot.com/), so the decision should be left to the user.
-
weekday/weekdays
weekday refers to Monday through Friday. We often say:
- The library is open 9AM to 10PM on weekdays
- Monday is a weekday
So in my opinion weekdays is better than weekday when it means "any day between Monday and Friday". The same argument may apply to weekend and weekends. I'm not a native English speaker so I might be wrong, but supporting the plural form seems harmless.
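A hypothetical sketch of what the suggested plural alias could look like in a schedule, assuming Plan accepts 'weekday' as an every value (as this issue implies); the 'weekdays' spelling is the proposed addition, not an existing option:

    # Both lines would schedule the job for Monday through Friday.
    cron.command('backup.sh', every='weekday', at='12:00')
    cron.command('backup.sh', every='weekdays', at='12:00')  # proposed plural alias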
-
docs: Fix a few typos
There are small typos in:
- docs/job_definition.rst
- docs/run_types.rst
- plan/job.py
Fixes:
- Should read separate rather than seperate.
- Should read insensitive rather than insenstive.
- Should read should rather than shoud.
- Should read separated rather than seperated.
Semi-automated pull request generated by https://github.com/timgates42/meticulous/blob/master/docs/NOTE.md
-
fix the decode error bug when the job params contain non-ASCII words
File "/Users/chenzhang/PycharmProjects/threatbook_spider/visual_spider_web/views/slave_views.py", line 74, in make_cron_tasks cron.run('update') File "/Users/chenzhang/envs/threatbook_spider/lib/python3.6/site-packages/plan/core.py", line 278, in run self.update_crontab(run_type) File "/Users/chenzhang/envs/threatbook_spider/lib/python3.6/site-packages/plan/core.py", line 221, in update_crontab current_crontab = self.read_crontab() File "/Users/chenzhang/envs/threatbook_spider/lib/python3.6/site-packages/plan/core.py", line 202, in read_crontab r = communicate_process(command, universal_newlines=True) File "/Users/chenzhang/envs/threatbook_spider/lib/python3.6/site-packages/plan/utils.py", line 22, in communicate_process output, error = p.communicate(stdin) File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/subprocess.py", line 843, in communicate stdout, stderr = self._communicate(input, endtime, timeout) File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/subprocess.py", line 1554, in _communicate self.stdout.errors) File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/subprocess.py", line 740, in _translate_newlines data = data.decode(encoding, errors) UnicodeDecodeError: 'ascii' codec can't decode byte 0xe6 in position 60898: ordinal not in range(128)