Tools used by Ada Health's internal IT team to deploy and manage a serverless Munki setup.

Overview

Serverless Munki

This repository contains cross-platform code to deploy a production-ready Munki service, complete with AutoPkg, that runs entirely from a single GitHub repository and an AWS S3 bucket. No other infrastructure is required. More specifically, it contains the following:

  • Terraform code to set up a Munki repo in AWS S3.
  • Actions workflows to handle AutoPkg runs and related tasks.
  • Directories for maintaining Munki items and AutoPkg overrides.

How it works

After following the deployment steps outlined below to set up your own GitHub repo and S3 bucket, an Actions workflow runs on a schedule and does the following:

  • Runs any AutoPkg recipes located in your RecipeOverrides/ folder.
  • Imports any new items into the munki_repo/ folder.
  • Commits changes (pkgs, pkgsinfo) for each item to a separate branch.
  • Creates a PR for each new item.
  • Posts results to Slack (if enabled).
  • Syncs approved changes in munki_repo/ to your S3 bucket where the items will be available to client devices.

Deployment

Initial GitHub Setup

First, create a new GitHub repository with Actions enabled. You can then clone this repo and copy its contents into your own private repo by running the following Terminal commands:

git clone git@github.com:adahealth/serverless-munki.git
cd serverless-munki
make init

By default this creates a new directory named my-serverless-munki inside the parent directory of the cloned repo and initializes it as its own Git repository. Now we can install (if you haven't already) and configure Git LFS for your repo. In this example we install Git LFS via Homebrew, but feel free to install it however you like.

brew install git-lfs
make lfs

Then you can go ahead and push your new repo to the Actions-enabled GitHub repository you created earlier.

cd ../my-serverless-munki
git remote add origin <your-github-repo-url>
git branch -M master
git push -u origin master

AWS / Terraform setup

Log in to your AWS account and create an AWS IAM user with the following permissions: AWSLambdaFullAccess, IAMFullAccess, AmazonS3FullAccess, CloudFrontFullAccess. Then create an access key for the user and set the access key ID and secret key as environment variables so that Terraform can authenticate to the AWS provider (a CLI sketch for this IAM setup follows the commands below). If you don't have Terraform installed, install it now.

brew install terraform
export AWS_ACCESS_KEY_ID="<your-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
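
Creating the IAM user and access key can also be scripted. The following is a minimal sketch using the AWS CLI (not something this repo provides), assuming you already have admin credentials configured locally; the user name is just an example and the policy ARNs mirror the permissions listed above:

# Hypothetical user name; adjust to your own naming conventions
aws iam create-user --user-name serverless-munki-deploy
for policy in AWSLambdaFullAccess IAMFullAccess AmazonS3FullAccess CloudFrontFullAccess; do
  aws iam attach-user-policy \
    --user-name serverless-munki-deploy \
    --policy-arn "arn:aws:iam::aws:policy/$policy"
done
# Prints the AccessKeyId and SecretAccessKey to export as shown above
aws iam create-access-key --user-name serverless-munki-deploy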

While we're at it, we can also add both the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as GitHub Actions secrets in our remote repo. They will be used in our Actions workflows when syncing our Munki files to our S3 bucket.
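
If you have the GitHub CLI installed, one way to add the secrets without the web UI is roughly the following, assuming gh is authenticated against your new repo:

gh secret set AWS_ACCESS_KEY_ID --body "$AWS_ACCESS_KEY_ID"
gh secret set AWS_SECRET_ACCESS_KEY --body "$AWS_SECRET_ACCESS_KEY"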

Next, we need to set our Terraform variables for our AWS configuration. Open the /terraform/variables.tf file and adjust the variables to match what you want the bucket to be called, and set the username and password your Munki clients will use to access the repo.

# prefix should be globally unique. Some characters seem to cause issues;
# Something like yourorg_munki might be a good prefix.
variable "prefix" {
  default = "YOU_BETTER_CHANGE_ME"
}

# you'd need to change this only if you have an existing bucket named
# "munki-s3-bucket"
variable "munki_s3_bucket" {
  default = "munki-s3-bucket"
}

# the price class for your CloudFront distribution
# one of PriceClass_All, PriceClass_200, PriceClass_100
variable "price_class" {
  default = "PriceClass_100"
}

# the username your Munki clients will use for BasicAuthentication
variable "username" {
  default = "YOU_BETTER_CHANGE_ME"
}

# the password your Munki clients will use for BasicAuthentication
variable "password" {
  default = "YOU_BETTER_CHANGE_ME"
}

Now we can change into the terraform/ directory and check our Terraform plan.

cd terraform
terraform init
terraform plan

If everything is as expected we can apply the configuration.

terraform apply

That's it for our Munki "server" repository. We can use terraform output to obtain the info needed for client configuration.

terraform output cloudfront_url
# This is your SoftwareRepoURL.

terraform output username
terraform output password
# These are the credentials that your clients will use to access the S3 bucket.

Update the /.github/workflows/sync-repo.yml file to include your bucket ID on line 41.

run: |
  aws s3 sync "$GITHUB_WORKSPACE"/munki_repo s3://<ADD-YOUR-BUCKET-ID-HERE> --exclude '.DS_Store' --exclude '.keep' --delete

The Munki wiki covers configuring your clients to use Basic Authentication with the username and password you've chosen. Be sure also to set Munki's SoftwareRepoURL to "https://<your-cloudfront_url>".
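
As a rough illustration only (the Munki wiki is the authoritative reference), the client-side preferences could be set along these lines, where the placeholders come from your Terraform variables and the Base64 string encodes username:password:

# Run as root on a test client; replace the placeholders with your own values
defaults write /Library/Preferences/ManagedInstalls SoftwareRepoURL "https://<your-cloudfront_url>"
AUTH=$(printf '%s' '<username>:<password>' | base64)
defaults write /Library/Preferences/ManagedInstalls AdditionalHttpHeaders -array "Authorization: Basic $AUTH"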

Slack notifications

To configure Slack notifications, create an incoming webhook in your Slack tenant and add the webhook URL as a GitHub Actions secret named SLACK_WEBHOOK.
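
For example, with the GitHub CLI (the URL below is a placeholder for your own webhook):

gh secret set SLACK_WEBHOOK --body "https://hooks.slack.com/services/<your-webhook-path>"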

Usage

AutoPkg

Add your AutoPkg recipe overrides to the RecipeOverrides/ folder, commit them to your remote repo, and add any necessary parent recipe repos to the .github/workflows/autopkg-run.yml workflow file by appending a repo-add command to the "Add AutoPkg repos" step (a sketch for creating an override locally follows the snippet below).

- name: Add AutoPkg repos
  run: |
    autopkg repo-add recipes
    autopkg repo-add <parent-recipe-repo1>
    autopkg repo-add <parent-recipe-repo2>
    autopkg repo-add <parent-recipe-repo3>
    # etc
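
To create an override locally before committing it, something along these lines should work, assuming AutoPkg is installed on your admin Mac and its override directory points at your checkout's RecipeOverrides/ folder (Firefox.munki is just an example recipe):

# Run from the root of your local checkout
defaults write com.github.autopkg RECIPE_OVERRIDE_DIRS "$(pwd)/RecipeOverrides"
autopkg repo-add recipes
autopkg make-override Firefox.munki
git add RecipeOverrides/
git commit -m "Add Firefox override"
git push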

Every time the autopkg-run workflow is triggered, the following steps happen inside a GitHub Actions runner VM:

  • The repository, containing the AutoPkg overrides and the Munki repo, is checked out.
  • Munki and AutoPkg are installed and configured.
  • Each recipe in the RecipeOverrides directory is run.
  • If AutoPkg imported any new items into Munki, the changes are committed and a PR is created.
  • If enabled, results are posted to Slack.

By default this is scheduled to run at 6 am every day from Monday to Friday. You can change this by editing the schedule in .github/workflows/autopkg-run.yml.

After reviewing and merging any PRs created via the autopkg-run workflow, the sync-repo workflow will be triggered. This will sync any changes in your munki repo to your AWS S3 bucket where they will be available for your clients.
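
Review normally happens in the GitHub UI, but merging from the command line works too; a sketch with the GitHub CLI, where the PR number is a placeholder:

gh pr list
gh pr merge <pr-number> --squash --delete-branch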

Updating recipe trust info

We update recipe trust info by manually running the update-trust-info workflow. Make sure the parent recipe repo is included in the "Add AutoPkg Repos" step in the .github/workflows/update-trust-info.yml file before triggering the workflow run.
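
Assuming the workflow keeps its manual (workflow_dispatch) trigger, it can also be started from the GitHub CLI instead of the web UI:

gh workflow run update-trust-info.yml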

Munki

You can populate and administer your Munki repo in whatever way you are used to by checking out your GitHub repo locally and making the required changes inside the munki_repo folder. When changes are pushed to the remote master branch, they are automatically synced to your S3 bucket via the sync-repo workflow.
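
A typical manual change could look like the sketch below. The app name and package path are placeholders, and munkiimport is assumed to be already configured (munkiimport --configure) to point at the munki_repo/ directory of your checkout:

git clone <your-github-repo-url>
cd my-serverless-munki
/usr/local/munki/munkiimport ~/Downloads/ExampleApp.pkg
git add munki_repo/
git commit -m "Add ExampleApp"
git push origin master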

Clean Repo

The clean-repo workflow will remove older, unused software items from the Munki repo. By default it is scheduled to run every Tuesday at 19:00. You can change this by editing .github/workflows/clean-repo.yml.

Acknowledgements

Terraform Munki Repo module from Graham Gilbert

The autopkg_tools.py script is a fork of Facebook's autopkg_tools.py

The GitHub Actions workflows and this project in general are based heavily on the GitHub Actions AutoPkg setup from Gusto

You might also like...
Use GitHub Actions to create a serverless service.

ActionServerless - Use GitHub Actions to create a serverless service ActionServerless is an action to do some computing and then generate a string/JSO

A serverless template for simply implementing SMS, email, and voice-call notifications using SQS + Lambda

PoC for building an AWS SQS with Lambda notification server. TODO: write and deploy a serverless template for the SQS-related resources (Lambda, SQS); as a PoC, send messages to an SQS FIFO queue via a simple REST API call

A Serverless Application Model stack that persists the $XRP price to the XRPL every minute as a TrustLine. There are no servers, it is effectively a "smart contract" in Python for the XRPL.

xrpl-price-persist-oracle-sam This is a XRPL Oracle that publishes external data into the XRPL. This Oracle was inspired by XRPL-Labs/XRPL-Persist-Pri

The AWS Lambda Serverless Blind XSS App

Ass: The AWS Lambda Serverless Blind XSS App. Setting up an XSS platform on a VPS is too much hassle; with AWS Lambda it only takes a domain name, and environment configuration, HTTPS certificates, privacy, and VPS renewals are no longer a concern. So this is a rewrite of the xless XSS platform for Lambda, using sla

Dante, my discord bot. Open source project in development and not optimized for other filesystems, install and setup script in development

DanteMode (In private development for ~6 months) Dante, my discord bot. Open source project in development and not optimized for other filesystems, in

Unit testing AWS interactions with pytest and moto. These examples demonstrate how to structure, setup, teardown, mock, and conduct unit testing. The source code is only intended to demonstrate unit testing.

Unit Testing Interactions with Amazon Web Services (AWS) Unit testing AWS interactions with pytest and moto. These examples demonstrate how to structu

Hostapd-mac-monitor - Setup a hostapd AP to conntrol the connections of specific MACs

A brief explanation This script provides way to setup a monitoring service of sp

My homeserver setup. Everything managed securely using Portainer.

homeserver-traefik-portainer Features: access all services with free TLS from letsencrypt using your own domain running a side project is super simple

All Tools In One is a Script Developed with Python3. It gathers a total of 14 Discord tools (including a RAT, a Raid Tool, a Token Grabber, a Crash Video Maker, etc). It has a pleasant and intuitive interface to facilitate the use of all with help and explanations for each of them.
Comments
  • [IT-1261] Small updates

    This PR reflects some small changes we made to our internal setup of serverless-munki at Ada which include:

    • The ability to specify recipe names when triggering the autopkg-run workflow manually
    • Switched version pinning on third-party GitHub actions from tags to SHA1 commit hashes
    • Updated the version of Munki used in Actions runners
    opened by stilljake 0
Owner
Ada Health
Supporting better health outcomes and clinical excellence with intelligent technology.
GitHub action to deploy serverless functions to YandexCloud

YandexCloud serverless function deploy action Deploy new serverless function version (including function creation if it does not exist). Inputs yc_acc

Много Лосося 4 Apr 10, 2022
Sail is a free CLI tool to deploy, manage and scale WordPress applications in the DigitalOcean cloud.

Deploy WordPress to DigitalOcean with Sail Sail is a free CLI tool to deploy, manage and scale WordPress applications in the DigitalOcean cloud. Conte

Konstantin Kovshenin 159 Dec 12, 2022
If you have any further questions, contact the details below. Thank you...

⚡ Lynx Userbot ⚡ Userbot Used for Fun on Telegram, and for Maintianing Your Group. This is a Repo Lynx-Userbot. This is Repo was Created by Axel From

null 29 Aug 30, 2021
MONAI Deploy App SDK offers a framework and associated tools to design, develop and verify AI-driven applications in the healthcare imaging domain.

Project MONAI 49 Dec 23, 2022
fair-test is a library to build and deploy FAIR metrics tests APIs supporting the specifications used by the FAIRMetrics working group.

☑️ FAIR test fair-test is a library to build and deploy FAIR metrics tests APIs supporting the specifications used by the FAIRMetrics working group. I

Maastricht University IDS 6 Oct 30, 2022
Represents a Lavalink client used to manage nodes and connections.

lavaplayer Represents a Lavalink client used to manage nodes and connections. setup pip install lavaplayer setup lavalink you need to java 11* LTS or

HazemMeqdad 37 Nov 21, 2022
Crud-python-sqlite: used to manage telephone contacts through python and sqlite

crud-python-sqlite This program is used to manage telephone contacts through python and sqlite. Dependencicas python3 sqlite3 Installation Clone the r

Luis Negrón 0 Jan 24, 2022
💻 A fully functional local AWS cloud stack. Develop and test your cloud & Serverless apps offline!

LocalStack - A fully functional local AWS cloud stack LocalStack provides an easy-to-use test/mocking framework for developing Cloud applications. Cur

LocalStack 45.3k Jan 2, 2023
Cookiecutter templates for Serverless applications using AWS SAM and the Rust programming language.

Cookiecutter SAM template for Lambda functions in Rust This is a Cookiecutter template to create a serverless application based on the Serverless Appl

AWS Samples 24 Nov 11, 2022
ShadowClone allows you to distribute your long running tasks dynamically across thousands of serverless functions and gives you the results within seconds where it would have taken hours to complete

null 240 Jan 6, 2023