Awslogs - AWS CloudWatch logs for Humans™

Overview


awslogs is a simple command line tool for querying groups, streams and events from Amazon CloudWatch logs.

One of the most powerful features is to query events from several streams and consume them (ordered) in pseudo-realtime using your favourite tools such as grep:

$ awslogs get /var/log/syslog ip-10-1.* --start='2h ago' | grep ERROR

Features

  • Aggregate logs across streams.
    • Aggregate all streams in a group.
    • Aggregate streams matching a regular expression.
  • Colored output.
  • List existing groups
    • $ awslogs groups
  • List existing streams
    • $ awslogs streams /var/log/syslog
  • Watch logs as they are created
    • $ awslogs get /var/log/syslog ALL --watch
  • Human-friendly time filtering:
    • --start='23/1/2015 14:23'
    • --start='2h ago'
    • --start='2d ago'
    • --start='2w ago'
    • --start='2d ago' --end='1h ago'
  • Retrieve event metadata:
    • --timestamp Prints the creation timestamp of each event.
    • --ingestion-time Prints the ingestion time of each event.

Example

Running awslogs get /var/log/syslog ALL -s1d will return events from any stream in the /var/log/syslog group generated in the last day.

https://github.com/jorgebastida/awslogs/raw/master/media/screenshot.png

Installation

You can easily install awslogs using pip:

$ pip install awslogs

If you are on OSX El Capitan, use the following (Why? Check Donald Stufft's comment here)

$ pip install awslogs --ignore-installed six

You can also install it with brew:

$ brew install awslogs

Options

  • awslogs groups: List existing groups
  • awslogs streams GROUP: List existing streams within GROUP
  • awslogs get GROUP [STREAM_EXPRESSION]: Get logs matching STREAM_EXPRESSION in GROUP.
    • Expressions can be regular expressions, or the wildcard ALL if you want to match every stream without typing .*.

Note: You need to provide a valid AWS region to all these commands, either with --aws-region or via the AWS_REGION environment variable.

Time options

While querying for logs you can filter events by date using --start (-s) and --end (-e).

  • By minute:

    • --start='2m' Events generated two minutes ago.
    • --start='1 minute' Events generated one minute ago.
    • --start='5 minutes' Events generated five minutes ago.
  • By hours:

    • --start='2h' Events generated two hours ago.
    • --start='1 hour' Events generated one hour ago.
    • --start='5 hours' Events generated five hours ago.
  • By days:

    • --start='2d' Events generated two days ago.
    • --start='1 day' Events generated one day ago.
    • --start='5 days' Events generated five days ago.
  • By weeks:

    • --start='2w' Events generated two weeks ago.
    • --start='1 week' Events generated one week ago.
    • --start='5 weeks' Events generated five weeks ago.
  • Using specific dates:

    • --start='23/1/2015 12:00' Events generated after midday on the 23rd of January 2015.
    • --start='1/1/2015' Events generated after midnight on the 1st of January 2015.
    • --start='Sat Oct 11 17:13:46 UTC 2003' You can use detailed dates too.

    Note: for time parsing, awslogs uses dateutil.

  • All previous examples are applicable for --end -e too.
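Under the hood, these relative expressions resolve to absolute datetimes before being sent to CloudWatch. A rough sketch of the idea in Python (awslogs itself parses these strings with dateutil; the arithmetic below is only illustrative):

```python
from datetime import datetime, timedelta

# What '--start=2h ago' and '--end=1h ago' roughly resolve to,
# relative to the moment the command runs:
now = datetime.utcnow()
start = now - timedelta(hours=2)
end = now - timedelta(hours=1)

# The same absolute window could be passed explicitly:
print(start.strftime('%d/%m/%Y %H:%M'))  # a format --start also accepts
```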

Filter options

You can use --filter-pattern if you want to only retrieve logs which match one CloudWatch Logs Filter pattern. This is helpful if you know precisely what you are looking for, and don't want to download the entire stream.

For example, if you want to download only the REPORT events from a Lambda stream, you can run:

$ awslogs get my_lambda_group --filter-pattern="[r=REPORT,...]"

Full documentation of how to write patterns: http://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/FilterAndPatternSyntax.html

JSON logs

Similar to the aws-cli command, you can use --query to filter each of your JSON log lines and extract certain fields:

$ awslogs get my_lambda_group --query=message

This will display only the message field of each JSON log line.
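What --query=message does is essentially the following (the sample log lines below are made up for illustration):

```python
import json

# Hypothetical JSON log lines, as a Lambda function might emit them:
lines = [
    '{"level": "info", "message": "request handled"}',
    '{"level": "error", "message": "upstream timeout"}',
]

# --query=message keeps only the named field of each line:
for line in lines:
    print(json.loads(line)["message"])
```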

Using third-party endpoints

If you use tools like localstack, fakes3 or others, you can point boto3 at them by changing the endpoint with --aws-endpoint-url.

AWS IAM Permissions

The permissions required to run awslogs are contained within the CloudWatchLogsReadOnlyAccess AWS managed policy. As of 2020-01-13, these are its permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "logs:Describe*",
                "logs:Get*",
                "logs:List*",
                "logs:StartQuery",
                "logs:StopQuery",
                "logs:TestMetricFilter",
                "logs:FilterLogEvents"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
}
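Since the policy grants most actions via wildcards, it can be worth sanity-checking that the read calls a tool like awslogs makes are covered. A small sketch (fnmatch is used here to approximate IAM's wildcard matching, which is an assumption of this example):

```python
import fnmatch

# Actions granted by the managed policy above:
granted = [
    "logs:Describe*", "logs:Get*", "logs:List*",
    "logs:StartQuery", "logs:StopQuery",
    "logs:TestMetricFilter", "logs:FilterLogEvents",
]

# The kind of read-only calls awslogs relies on:
needed = [
    "logs:DescribeLogGroups",
    "logs:DescribeLogStreams",
    "logs:FilterLogEvents",
    "logs:GetLogEvents",
]

for action in needed:
    # IAM wildcards behave like simple prefix globs, so fnmatch is a
    # reasonable stand-in for this check.
    assert any(fnmatch.fnmatch(action, g) for g in granted), action
print("all required actions are covered")
```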

Contribute

  • Fork the repository on GitHub.
  • Write a test which shows that the bug was fixed or that the feature works as expected.
    • Use the tox command to run all the tests in every locally available Python version.
  • Send a pull request and bug the maintainer until it gets merged and published :).

For more instructions see TESTING.rst.

Helpful Links

How to provide AWS credentials to awslogs

Although the most straightforward thing might be to use --aws-access-key-id and --aws-secret-access-key, this will eventually become a pain in the ass.

  • If you only have one AWS account, my personal recommendation would be to configure aws-cli. awslogs will use those credentials if available. If you have multiple AWS profiles managed by aws-cli, just add --profile [PROFILE_NAME] at the end of every awslogs command to use those credentials, or set the AWS_PROFILE env variable.
  • If you don't want to set up aws-cli, I recommend using envdir to make AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY available to awslogs.
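Either way, boto3 (which awslogs uses) picks credentials up from the standard environment variables. A minimal sketch with placeholder values:

```python
import os

# envdir-style: each file in the directory becomes one environment
# variable. Setting them by hand has the same effect (placeholder values):
os.environ["AWS_ACCESS_KEY_ID"] = "AKIAEXAMPLE"
os.environ["AWS_SECRET_ACCESS_KEY"] = "example-secret"

# Any awslogs command run from this process (or a child process)
# will now authenticate with these credentials.
assert "AWS_ACCESS_KEY_ID" in os.environ
```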
Comments
  • Add support for caching temporary session credentials

    Add support for caching temporary session credentials

    This PR adds support for caching temporary session credentials, using the same mechanism as used in the AWS CLI.

    The main benefit of this feature is a much improved experience for users that require MFA for API access. Prior to this PR, the experience is that users have to re-enter MFA token each time they run the command:

    $ awslogs groups
    Enter MFA code: ******
    /my/log/group
    
    $ awslogs get /my/log/group
    Enter MFA code: ******
    ...
    ...
    

    With this PR, the user experience will be that they only need to enter an MFA code once for the lifetime of the session credentials (e.g. once per hour):

    $ awslogs groups
    Enter MFA code: ******
    /my/log/group
    
    $ awslogs get /my/log/group
    ...
    ...
    

    Caching is invoked only if the assume-role provider is active, which happens when you are using a profile that assumes a role:

    $ export AWS_PROFILE=some-account-admin
    $ cat ~/.aws/config
    [profile some-account-admin]
    source_profile=users-account
    role_arn=arn:aws:iam::012345678912:role/admin
    mfa_serial=arn:aws:iam::219876543210:mfa/justin.menga
    region=us-west-2
    
    opened by mixja 31
  • added watch interval.

    added watch interval.

    AWS CloudWatch limits queries to 5 per second, so at most 5 threads can run at a single time.

    Exposed a public variable named interval, to allow the user to change the interval at which awslogs queries Cloudwatch.

    opened by loghen41 9
  • Credentials Fail : NoAuthHandlerFound: No handler was ready to authenticate.

    Credentials Fail : NoAuthHandlerFound: No handler was ready to authenticate.

    Trying to use awslogs for the first time, already had aws-cli setup with credentials but get this error:

    Version: 0.0.1
    Python: 2.7.8 (default, Sep 26 2014, 11:28:16) [GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)]
    boto version: 2.38.0
    Platform: Darwin-14.3.0-x86_64-i386-64bit
    Config: {'color_enabled': True, 'aws_region': u'eu-west-1', 'aws_access_key_id': 'SENSITIVE', 'aws_secret_access_key': 'SENSITIVE', 'func': 'list_groups'}
    Args: ['/usr/local/bin/awslogs', 'groups']

    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/site-packages/awslogs/bin.py", line 115, in main
        logs = AWSLogs(**vars(options))
      File "/usr/local/lib/python2.7/site-packages/awslogs/core.py", line 83, in __init__
        aws_secret_access_key=self.aws_secret_access_key
      File "/usr/local/lib/python2.7/site-packages/awslogs/core.py", line 27, in __init__
        self.connection = boto.logs.connect_to_region(*args, **kwargs)
      File "/usr/local/lib/python2.7/site-packages/boto/logs/__init__.py", line 40, in connect_to_region
        return region.connect(**kw_params)
      File "/usr/local/lib/python2.7/site-packages/boto/regioninfo.py", line 187, in connect
        return self.connection_cls(region=self, **kw_params)
      File "/usr/local/lib/python2.7/site-packages/boto/logs/layer1.py", line 107, in __init__
        super(CloudWatchLogsConnection, self).__init__(**kwargs)
      File "/usr/local/lib/python2.7/site-packages/boto/connection.py", line 1100, in __init__
        provider=provider)
      File "/usr/local/lib/python2.7/site-packages/boto/connection.py", line 569, in __init__
        host, config, self.provider, self._required_auth_capability())
      File "/usr/local/lib/python2.7/site-packages/boto/auth.py", line 987, in get_auth_handler
        'Check your credentials' % (len(names), str(names)))
    NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV4Handler'] Check your credentials

    opened by labithiotis 9
  • Refresh streams while watching.

    Refresh streams while watching.

    • Add a new Thread to refresh the list of streams, as new streams might appear during watching.
    • When no streams match the pattern, the output displays the message 'No streams found, waiting:'. It is then up to the user to wait until a stream gets returned, or to press Ctrl+C to stop the watching process. fixes #60
    opened by ticosax 8
  • next_token unbound after exception

    next_token unbound after exception

    The var next_token is not going to be defined if the Empty exception occurs.

    Wait for the gevent.sleep and continue with the next iteration of the loop.

    opened by agonzalezro 8
  • Improve stream discovery (eventual consistency)

    Improve stream discovery (eventual consistency)

    AWS documentation clearly states that lastEventTimestamp returned by DescribeLogStreams is eventually consistent. Lags of up to 1h are to be expected. Until now, awslogs ignored this fact, as stream discovery was based on user-provided time filters, leading to false negatives. Many users reported this issue.

    This patch embraces eventual consistency by relaxing the time filter used to discover streams. False positives might occur, but querying a stream with no events seems better than inadvertently ignoring a stream with relevant data.
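    The idea can be sketched as follows (the one-hour padding mirrors the documented lag; the exact value used by the patch is an assumption here):

```python
from datetime import datetime, timedelta

# lastEventTimestamp from DescribeLogStreams is eventually consistent,
# so pad the window used for *discovering* streams...
user_start = datetime(2015, 1, 23, 12, 0)           # what the user asked for
discovery_start = user_start - timedelta(hours=1)   # padding value assumed

# ...but still filter the *events* by the user's actual --start, so the
# padding only risks querying empty streams, never returning extra events.
assert discovery_start < user_start
```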

    opened by cykl 7
  • Support `--filter-pattern` option

    Support `--filter-pattern` option

    Writing about AWS Logs CLI tools (http://stackoverflow.com/a/33636237/346478) I realized that awslogs does not support filter patterns.

    Adding filter patterns the way the AWS CLI's --filter-pattern option does would be handy.

    The method CloudWatchLogs.Client.filter_log_events offers the filterPattern parameter for that.

    opened by vlcinsky 7
  • Added aws_endpoint_url cli option

    Added aws_endpoint_url cli option

    To use awslogs with locally running instances of CloudWatch, such as localstack, we need a way to override the endpoint URL.

    This PR adds an --aws-endpoint-url cli flag to change the endpoint that is called by awslogs.

    help 
    opened by fewstera 6
  • Poor usability with MFA

    Poor usability with MFA

    We've been using awslogs for a long time in our team, but recently a change in the way we use AWS accounts made it much less usable.

    We took into use a separate federation account where our IAM users live, and access our other AWS accounts (where we have among other things CloudWatch logs) by assuming a role on the target account from the federation account. The federation account users always have multi-factor authentication (MFA). The AWS profile config for accessing a target account looks something like this:

    ~/.aws/credentials:
    [federation-account]
    aws_access_key_id = xxxx
    aws_secret_access_key = yyyyy
    
    ~/.aws/config:
    [profile assume-role-via-federation-account]
    source_profile = federation-account
    mfa_serial = arn:aws:iam::federation-account-id:mfa/iam-user-name
    role_arn = arn:aws:iam::target-account-id/AssumedRole
    

    It appears that boto3 actually supports this kind of AWS profile out of the box: when awslogs is called with --profile assume-role-via-federation-account in this example, boto3 automatically asks the user to enter an MFA code. Even though this basically works, a major usability issue is that the MFA code is prompted every time an awslogs command is executed. The user also needs to wait for the MFA device to produce a new code if the current one was already used, making it impossible to execute several awslogs commands in rapid succession.

    A good solution would be to configure the boto3 client to use credential cache in the same way as e.g. AWS CLI does. We have a proof of concept for this approach here in a fork (the two latest commits):

    https://github.com/Opetushallitus/awslogs

    This modification seems to solve the problem we have, and doesn't seem to produce any regression when using a profile that doesn't need credential caching. I suppose we'll be using the forked version for the time being, but just wanted to suggest that maybe something like this could be useful in the main project too.

    help 
    opened by juhani-hietikko 6
  • Gracefully exit on SIGPIPE

    Gracefully exit on SIGPIPE

    Piping to utilities that close out their pipe by design (such as head) causes a traceback message when the pipe is closed. This tweak catches that particular error condition and exits, and lets other IO failures continue to escalate.

    Before:

    (venv) $ ./command_line.py get log_group_sanitized ALL | head
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:24:55 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:24:56 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:24:58 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:00 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:01 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:04 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:05 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:05 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:08 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:10 UTC 2017)
    
    ================================================================================
    You've found a bug! Please, raise an issue attaching the following traceback
    https://github.com/jorgebastida/awslogs/issues/new
    --------------------------------------------------------------------------------
    Version: 0.9.0
    Python: 2.7.5 (default, Sep 15 2016, 22:37:39)
    [GCC 4.8.5 20150623 (Red Hat 4.8.5-4)]
    boto3 version: 1.4.4
    Platform: Linux-3.10.0-327.10.1.el7.x86_64-x86_64-with-centos-7.2.1511-Core
    Config: {'output_timestamp_enabled': False, 'output_group_enabled': True, 'end': None, 'aws_secret_access_key': 'SENSITIVE', 'log_stream_name': 'ALL', 'watch': False, 'aws_region': None, 'aws_session_token': 'SENSITIVE', 'start': '5m', 'aws_profile': 'SENSITIVE', 'filter_pattern': None, 'log_group_name': 'log_group_sanitized', 'output_ingestion_time_enabled': False, 'query': None, 'func': 'list_logs', 'aws_access_key_id': 'SENSITIVE', 'color_enabled': True, 'output_stream_enabled': True}
    Args: ['./command_line.py', 'get', 'log_group_sanitized', 'ALL']
    
    Traceback (most recent call last):
      File "/homedir_sanitized/src/awslogs/awslogs/bin.py", line 172, in main
        getattr(logs, options.func)()
      File "/homedir_sanitized/src/awslogs/awslogs/core.py", line 187, in list_logs
        consumer()
      File "/homedir_sanitized/src/awslogs/awslogs/core.py", line 185, in consumer
        sys.stdout.flush()
    IOError: [Errno 32] Broken pipe
    ================================================================================
    

    After:

    (venv) $ ./command_line.py get log_group_sanitized ALL | head
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:33 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:33 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:35 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:35 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:36 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:40 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:40 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:41 UTC 2017)
    log_group_sanitized log_stream_sanitized1 Test log message from 127509dae97b (Wed Apr 12 23:25:45 UTC 2017)
    log_group_sanitized log_stream_sanitized2 Test log message from 4edd0756f357 (Wed Apr 12 23:25:45 UTC 2017)
    (venv) $
    
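    The technique described can be sketched like this (a general sketch of the approach, not the actual patch):

```python
import errno
import sys

def emit(lines):
    """Write lines to stdout, exiting quietly if the reader (e.g. `head`)
    has already closed its end of the pipe."""
    try:
        for line in lines:
            sys.stdout.write(line + "\n")
            sys.stdout.flush()
    except IOError as exc:           # BrokenPipeError is a subclass of OSError/IOError
        if exc.errno == errno.EPIPE:
            sys.exit(0)              # downstream closed the pipe: exit gracefully
        raise                        # any other I/O failure still escalates
```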
    opened by rwolfson 6
  • setup.py rewritten to pbr, tests run by tox

    setup.py rewritten to pbr, tests run by tox

    modified:   .gitignore
    deleted:    AUTHORS
    deleted:    CHANGELOG
    deleted:    MANIFEST.in
    new file:   requirements-py26.txt
    new file:   requirements.txt
    new file:   setup.cfg
    modified:   setup.py
    renamed:    tests.py -> tests/test_it.py
    new file:   tox.ini
    
    needs review 
    opened by vlcinsky 6
  • Is this project still maintained or has it been abandoned?

    Is this project still maintained or has it been abandoned?

    Just wondering if there is any hope that this project will be updated or any PRs be merged any time in the near future. It looks like it's been abandoned but I can't really tell.

    opened by rnhurt 0
  • How do I use hyphenated characters in filter pattern properties?

    How do I use hyphenated characters in filter pattern properties?

    I want to filter by thread name. Thread names contain hyphens. How do I use hyphenated characters in filter pattern properties? I want to match only ORDER-ITEM-1 in the red area shown in the image.

    This is the code I tried. awslogs get /ecs/cafe24-webhook ALL --s='3h' --profile cafe24 --filter-pattern "ORDER ITEM 1"

    This is the code I want. But the filter doesn't work. awslogs get /ecs/cafe24-webhook ALL --s='3h' --profile cafe24 --filter-pattern "ORDER-ITEM-1"

    (screenshot attached to the original issue)
    opened by alsdud154 0
  • Changing the format of the logs

    Changing the format of the logs

    Hi! Thanks for this package! I was looking around and couldn't find a way to change the format of the logs but I thought you probably have a way to do that? For example, how can I remove the logstream name or the timestamp from the logs?

    Example:

    2022-11-03T16:01:04.010000+00:00 2022/11/03/[$LATEST]58asd312e7212131223124ae149 [INFO]    2022-11-03T16:01:04.010Z        70sadasbd-xxasaa9-5b70-a2ea-6860xasdasdas637ee5    Hey this is an actual log
    

    I just want the part Hey this is an actual log

    Thanks!

    opened by fersarr 2
  • allow sorting groups by date created and/or filter on groups created after some time

    allow sorting groups by date created and/or filter on groups created after some time

    I'm trying to use awslogs to grab the name of the newest log group. It would be good to be able to sort by name, or add some "--after" in the get groups statement

    opened by trondhindenes 0
  • Traceback bug :(

    Traceback bug :(

    Version: 0.14.0
    Python: 3.10.8 (main, Oct 12 2022, 09:32:59) [Clang 14.0.0 (clang-1400.0.29.102)]
    boto3 version: 1.22.5
    Platform: macOS-12.6-arm64-arm-64bit
    Args: ['/opt/homebrew/bin/awslogs', 'get', '/aws/lambda/backend-graphql-dev-graphql', '--start=2h ago']
    Config: {'aws_access_key_id': 'SENSITIVE', 'aws_secret_access_key': 'SENSITIVE', 'aws_session_token': 'SENSITIVE', 'aws_profile': 'SENSITIVE', 'aws_region': None, 'aws_endpoint_url': None, 'log_group_name': '/aws/lambda/backend-graphql-dev-graphql', 'log_stream_name': 'ALL', 'filter_pattern': None, 'watch': False, 'watch_interval': 1, 'output_group_enabled': True, 'output_stream_enabled': True, 'output_timestamp_enabled': False, 'output_ingestion_time_enabled': False, 'start': '2h ago', 'end': None, 'color': 'auto', 'query': None, 'func': 'list_logs'}

    Traceback (most recent call last):
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/awslogs/bin.py", line 175, in main
        logs = AWSLogs(**vars(options))
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/awslogs/core.py", line 89, in __init__
        self.client = boto3_client(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/awslogs/core.py", line 44, in boto3_client
        return session.client(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/boto3/session.py", line 299, in client
        return self._session.create_client(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/session.py", line 884, in create_client
        client = client_creator.create_client(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/client.py", line 102, in create_client
        client_args = self._get_client_args(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/client.py", line 384, in _get_client_args
        return args_creator.get_client_args(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/args.py", line 71, in get_client_args
        final_args = self.compute_client_args(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/args.py", line 148, in compute_client_args
        endpoint_config = self._compute_endpoint_config(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/args.py", line 234, in _compute_endpoint_config
        return self._resolve_endpoint(**resolve_endpoint_kwargs)
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/args.py", line 320, in _resolve_endpoint
        return endpoint_bridge.resolve(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/client.py", line 465, in resolve
        resolved = self.endpoint_resolver.construct_endpoint(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/regions.py", line 196, in construct_endpoint
        result = self._endpoint_for_partition(
      File "/opt/homebrew/Cellar/awslogs/0.14.0_3/libexec/lib/python3.10/site-packages/botocore/regions.py", line 230, in _endpoint_for_partition
        raise NoRegionError()
    botocore.exceptions.NoRegionError: You must specify a region.

    opened by WillUK99 1
Owner
Jorge Bastida