Boto is a Python package that provides interfaces to Amazon Web Services.

Overview

Deprecation notice

This package is no longer maintained and has been replaced by Boto3. Issues and pull requests are not reviewed. If you are having an issue with the Boto3 package or the AWS CLI, please open an issue on their respective repositories.

boto

boto 2.49.0

Released: 11-July-2018

Introduction

Boto is a Python package that provides interfaces to Amazon Web Services. Currently, all features work with Python 2.6 and 2.7. Work is under way to support Python 3.3+ in the same codebase. Modules are being ported one at a time with the help of the open source community, so please check below for compatibility with Python 3.3+.

To port a module to Python 3.3+, please view our Contributing Guidelines and the Porting Guide. If you would like, you can open an issue to let others know about your work in progress. Tests must pass on Python 2.6, 2.7, 3.3, and 3.4 for pull requests to be accepted.

Services

At the moment, boto supports:

  • Compute
    • Amazon Elastic Compute Cloud (EC2) (Python 3)
    • Amazon Elastic Map Reduce (EMR) (Python 3)
    • AutoScaling (Python 3)
    • Amazon Kinesis (Python 3)
    • AWS Lambda (Python 3)
    • Amazon EC2 Container Service (Python 3)
  • Content Delivery
    • Amazon CloudFront (Python 3)
  • Database
    • Amazon Relational Database Service (RDS)
    • Amazon DynamoDB (Python 3)
    • Amazon SimpleDB (Python 3)
    • Amazon ElastiCache (Python 3)
    • Amazon Redshift (Python 3)
  • Deployment and Management
    • AWS Elastic Beanstalk (Python 3)
    • AWS CloudFormation (Python 3)
    • AWS Data Pipeline (Python 3)
    • AWS OpsWorks (Python 3)
    • AWS CloudTrail (Python 3)
    • AWS CodeDeploy (Python 3)
  • Administration & Security
    • AWS Identity and Access Management (IAM) (Python 3)
    • AWS Key Management Service (KMS) (Python 3)
    • AWS Config (Python 3)
    • AWS CloudHSM (Python 3)
  • Application Services
    • Amazon CloudSearch (Python 3)
    • Amazon CloudSearch Domain (Python 3)
    • Amazon Elastic Transcoder (Python 3)
    • Amazon Simple Workflow Service (SWF) (Python 3)
    • Amazon Simple Queue Service (SQS) (Python 3)
    • Amazon Simple Notification Service (SNS) (Python 3)
    • Amazon Simple Email Service (SES) (Python 3)
    • Amazon Cognito Identity (Python 3)
    • Amazon Cognito Sync (Python 3)
    • Amazon Machine Learning (Python 3)
  • Monitoring
    • Amazon CloudWatch (EC2 Only) (Python 3)
    • Amazon CloudWatch Logs (Python 3)
  • Networking
    • Amazon Route53 (Python 3)
    • Amazon Route 53 Domains (Python 3)
    • Amazon Virtual Private Cloud (VPC) (Python 3)
    • Elastic Load Balancing (ELB) (Python 3)
    • AWS Direct Connect (Python 3)
  • Payments and Billing
    • Amazon Flexible Payment Service (FPS)
  • Storage
    • Amazon Simple Storage Service (S3) (Python 3)
    • Amazon Glacier (Python 3)
    • Amazon Elastic Block Store (EBS)
    • Google Cloud Storage
  • Workforce
    • Amazon Mechanical Turk
  • Other
    • Marketplace Web Services (Python 3)
    • AWS Support (Python 3)

The goal of boto is to support the full breadth and depth of Amazon Web Services. Beyond AWS, boto also supports other public services such as Google Cloud Storage, as well as private cloud systems like Eucalyptus, OpenStack, and OpenNebula.

Boto is developed mainly using Python 2.6.6 and Python 2.7.3 on Mac OS X and Ubuntu Maverick. It is known to work on other Linux distributions and on Windows. Most of boto requires no additional libraries or packages beyond those distributed with Python. Efforts are made to keep boto compatible with Python 2.5.x, but no guarantees are made.

Installation

Install via pip:

$ pip install boto

Install from source:

$ git clone git://github.com/boto/boto.git
$ cd boto
$ python setup.py install

ChangeLogs

To see what has changed over time in boto, you can check out the release notes at http://docs.pythonboto.org/en/latest/#release-notes

Finding Out More About Boto

The main source code repository for boto can be found on github.com. The boto project uses the gitflow model for branching.

Online documentation is also available. The online documentation includes full API documentation as well as Getting Started Guides for many of the boto modules.

Boto releases can be found on the Python Cheese Shop.

Join our IRC channel #boto on FreeNode. Webchat IRC channel: http://webchat.freenode.net/?channels=boto

Join the boto-users Google Group.

Getting Started with Boto

Your credentials can be passed into the methods that create connections. Alternatively, boto will check for the existence of the following environment variables to ascertain your credentials:

AWS_ACCESS_KEY_ID - Your AWS Access Key ID

AWS_SECRET_ACCESS_KEY - Your AWS Secret Access Key

Credentials and other boto-related settings can also be stored in a boto config file. See the boto config documentation for details.
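For instance, a connection can be created with explicit credentials, or boto can be left to find them on its own (a sketch; the lookup order shown simply mirrors the description above, and the import is deferred so the snippet reads standalone):

```python
import os

def make_s3_connection():
    """Connect to S3 with explicit credentials when both environment
    variables are set, otherwise fall back to boto's own lookup.

    A sketch: boto performs the same environment lookup itself, so this
    only makes the order of precedence explicit.
    """
    import boto  # boto 2.x

    access_key = os.environ.get("AWS_ACCESS_KEY_ID")
    secret_key = os.environ.get("AWS_SECRET_ACCESS_KEY")
    if access_key and secret_key:
        return boto.connect_s3(access_key, secret_key)
    # No explicit keys: boto falls back to its config file, if present
    return boto.connect_s3()
```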

Comments
  • Various improvements around file pointer behavior and get_file/send_file...


    • When setting contents, the file pointer now starts reading content where you positioned it and returns the file pointer up to where you finished reading. This is useful for multipart uploads of large files where it is desirable to split them into different parts.
    • Computing md5 for a file pointer now returns the file pointer to the initial position instead of at the start of the file. It also takes a size of bytes to read from the file pointer which enables it to work well with multipart uploads.
    • If the provider supports chunking, delay the md5 calculation to when the file is being sent on the wire and send the Content-MD5 in the trailer. [THIS IS UNTESTED as I don't have a google storage account and AWS doesn't support chunking for uploads.]
    • Improve the callback so that it doesn't call too many times and also doesn't call twice or more for the final chunk.
    • Fix callback for get_file such that the Key size is available when figuring out cb_count.
    • get_file() now calculates the md5 as the stream is being read, so key.md5 is available after reading the contents in case you want to verify what you read without re-calculating the md5 of a particularly large piece of data.
    • After send_file(), key.size will be accurate for the data that you sent.
    • Added unit tests for most of the work.
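    The file-pointer semantics above are what make splitting a large file into parts work; a minimal sketch of such an upload with boto 2's multipart API (conn, bucket_name, and key_name are placeholders supplied by the caller):

```python
from io import BytesIO

def upload_in_parts(conn, bucket_name, key_name, fileobj,
                    part_size=5 * 1024 * 1024):
    """Upload fileobj to S3 in part_size chunks via a multipart upload.

    A sketch: conn is an open boto S3 connection; bucket_name and
    key_name are placeholders. S3 requires every part except the last
    to be at least 5 MB.
    """
    bucket = conn.get_bucket(bucket_name, validate=False)
    mp = bucket.initiate_multipart_upload(key_name)
    try:
        part_num = 1
        while True:
            chunk = fileobj.read(part_size)
            if not chunk:
                break
            # upload_part_from_file reads from the current file position,
            # which is the behavior this change set pins down
            mp.upload_part_from_file(BytesIO(chunk), part_num)
            part_num += 1
        mp.complete_upload()
    except Exception:
        mp.cancel_upload()
        raise
```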
    feature ec2 
    opened by tpodowd 39
  • "ImportError: No module named boto" when Boto is installed on OS X.

    I've been searching all night for a solution to this and nothing has worked so far. I have Python 2.7.8 installed, and boto is also installed on OS X 10.10.3. The problem is that unless I'm running as my root user, when I try to run the AWS ec2.py script while deploying with Ansible, I get:

    ERROR: Inventory script (ec2.py) had an execution error:
    Traceback (most recent call last):
      File "/Users/peterjohnjoseph/Sites/massvapors.com/ansible/ec2.py", line 121, in <module>
        import boto
    ImportError: No module named boto

    Any idea what I might want to try? I'll give whatever information you need.

    closing-soon-if-no-response 
    opened by peterjohnjoseph 31
  • S3: ExpiredToken error with IAM's temporary credentials

    'ExpiredToken' errors are occasionally thrown when an IAM role's temporary credentials are used.

    From kvs.py (note that the 'connection' and 'bucket' objects are created once and reused for put requests):

    connection = connect_s3()
    bucket = connection.get_bucket(bucket_name, False)

    def put_raw(bucket, key, value, headers=None):
        k = Key(bucket)
        k.key = key

      File "/home/myapp/kvs.py", line 160, in put_raw
        retval = k.set_contents_from_string(value, headers=headers)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/s3/key.py", line 1129, in set_contents_from_string
        encrypt_key=encrypt_key)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/s3/key.py", line 993, in set_contents_from_file
        size=size)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/s3/key.py", line 724, in send_file
        query_args=query_args)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/s3/connection.py", line 470, in make_request
        override_num_retries=override_num_retries)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/connection.py", line 913, in make_request
        return self._mexe(http_request, sender, override_num_retries)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/connection.py", line 790, in _mexe
        request.body, request.headers)
      File "/usr/local/lib/python2.7/dist-packages/boto-2.5.2-py2.7.egg/boto/s3/key.py", line 681, in sender
        response.status, response.reason, body)
    boto.exception.S3ResponseError: S3ResponseError: 400 Bad Request

    ExpiredToken: The provided token has expired.
    (request id: A0E4730BE76CC9F0, host id: VdBvtwjLvBhr5QN1VZ2v/LJeBTxso52+q59kmfspLUdbWWnrz714qhXAMMgnvZYV ...)

    opened by khj1218 30
  • slick53 methods grafted onto to boto route53 connection

    As mentioned on the boto email list, I'd like to see slick53's more intuitive methods available in the boto package. Aside from minor changes to make the code more idiomatic, I've basically left the code alone, though I may revisit it pending review of this request.

    There are some basic unit tests.

    If accepted, I'll submit some changes to the docs.

    opened by rubic 30
  • STSConnection: how to specify anon auth handler?

    When using assume_role_with_web_identity, the doc says : "AssumeRoleWithWebIdentity is an API call that does not require the use of AWS security credentials"

    However, calls to STSConnection() fail with

    2013-06-14 09:30:08,699 boto [ERROR]:Unable to read instance data, giving up
    Traceback (most recent call last):
      File "test.py", line 77, in <module>
        credentials = doGetAccessCredentials(token['id_token'], profile)
      File "test.py", line 17, in doGetAccessCredentials
        conn = boto.sts.STSConnection()
      File "/Library/Python/2.7/site-packages/boto/sts/connection.py", line 83, in __init__
        validate_certs=validate_certs)
      File "/Library/Python/2.7/site-packages/boto/connection.py", line 965, in __init__
        validate_certs=validate_certs)
      File "/Library/Python/2.7/site-packages/boto/connection.py", line 552, in __init__
        host, config, self.provider, self._required_auth_capability())
      File "/Library/Python/2.7/site-packages/boto/auth.py", line 683, in get_auth_handler
        'Check your credentials' % (len(names), str(names)))
    

    This is because STSConnection uses the 'sign-v2' auth handler. How can we tell STSConnection to use the anon handler instead?
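    For reference, later boto releases added an anon flag to STSConnection for exactly this case; a sketch, assuming an installed version that supports it (role_arn, session_name, and token are placeholders supplied by the caller):

```python
def assume_with_web_identity(role_arn, session_name, token):
    """Call AssumeRoleWithWebIdentity without AWS credentials.

    A sketch: assumes a boto version whose STSConnection accepts
    anon=True (selecting the anonymous auth handler). role_arn,
    session_name, and token are placeholders supplied by the caller.
    """
    import boto.sts  # imported lazily so the sketch reads standalone

    conn = boto.sts.STSConnection(anon=True)
    return conn.assume_role_with_web_identity(role_arn, session_name, token)
```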

    opened by sebsto 26
  • The parameter groupName cannot be used with the parameter subnet, again

    Hi, using HEAD, and having checked Issue 526: VPC 2.0: The parameter groupName cannot be used with the parameter subnet.

    I still get the same error:

    conn = boto.connect_vpc(aws_access_key_id=AWS_ACCESS_KEY_ID,
                            aws_secret_access_key=AWS_SECRET_ACCESS_KEY)

    res = conn.run_instances(AMI_ID,
                             instance_type='m1.xlarge',
                             key_name='AWSEC2key',
                             min_count=1,
                             max_count=1,
                             security_groups=['lm'],
                             user_data=None,
                             subnet_id='subnet_f03c1f99',
                             private_ip_address='10.0.0.189',
                             security_group_ids=['sg-2c342940'])

    Traceback (most recent call last):
      File "", line 9, in
      File "/Library/Python/2.6/site-packages/boto-2.0-py2.6.egg/boto/ec2/connection.py", line 625, in run_instances
        return self.get_object('RunInstances', params, Reservation, verb='POST')
      File "/Library/Python/2.6/site-packages/boto-2.0-py2.6.egg/boto/connection.py", line 871, in get_object
        raise self.ResponseError(response.status, response.reason, body)
    boto.exception.EC2ResponseError: EC2ResponseError: 400 Bad Request

    InvalidParameterCombination: The parameter groupName cannot be used with the parameter subnet
    (request id: 0a9bac66-e484-4d87-b4bf-62e845016c7e)

    opened by adhorn 26
  • update cacerts from mozilla.org

    Used mk-ca-bundle.pl locally to generate a recent cacerts.txt file that fixes HTTPS certificate validation on OS X 10.8.x. These certs were built with:

    $ perl mk-ca-bundle.pl -w 64 boto/cacerts/cacerts.txt
    

    fixes boto/boto#1535 closes jtriley/StarCluster#267

    opened by jtriley 23
  • Network acl support

    This adds support for network ACLs. I was not able to find a method AssociateNetworkAcl on this page: http://docs.amazonwebservices.com/AWSEC2/latest/APIReference/query-apis.html

    vpc 
    opened by jsmartin 23
  • Add Amazon DynamoDB online indexing support on High level API

    Hi,

    Yesterday (Jan-29), Amazon added support for adding or deleting global indexes at any time, which is by far the feature I had been most looking forward to using for over a year.

    I've changed the dynamodb2.table.update method, taking backward compatibility into account as well. Documentation amendments and a unit test case have been provided per the contributing guidelines.

    Looking forward to seeing this in the new version of boto.

    feature dynamodb accepted 
    opened by PauloMigAlmeida 20
  • get_bucket has an expensive default

    http://www.appneta.com/blog/s3-list-get-bucket-default/

    When get_bucket is called, by default it also does a full LIST because validate defaults to True. Thus, anyone using this library is accidentally paying way more for S3 than they should. This default should be False.
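    A caller can opt out of the LIST today by passing the flag explicitly; a minimal sketch (conn and bucket_name are placeholders):

```python
def get_bucket_cheaply(conn, bucket_name):
    """Return a bucket handle without the LIST request that the default
    validate=True issues.

    A sketch: conn is an open boto S3 connection and bucket_name is a
    placeholder. With validate=False, a nonexistent bucket only fails
    on first use instead of at lookup time.
    """
    return conn.get_bucket(bucket_name, validate=False)
```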

    opened by jasonroelofs 20
  • Make boto.ec2 work with Python 3.2 and Python 3.3

    A bunch of changes to make boto.ec2 tests pass on Python 3.2 and Python 3.3 while all boto tests continue to pass on Python 2. There has been some manual testing as well to verify that basic functionality works (e.g.: printing all available images) when connecting to Eucalyptus Community Cloud (see here) and EC2 (see here).

    Passing Travis CI build: https://travis-ci.org/msabramo/boto/builds/3465072

    This is the work that I alluded to in #1127.

    opened by msabramo 20
  • insert data into dynamoDB from excel file using python

    {"mobile": {"S": "+9198765432100"}, "taxRegistrationNumber": {"S": "TRN2255"}, "phone": {"S": "+919876543211"}, "companyRegistrationNumber": {"S": "REG0000152"}, "email": {"S": "[email protected]"}}

    I have to insert this data into a table using the map data type. I'm using PutItem, but it's not working.

    opened by ghostakstevelodx 1
  • [Issue] Tag is also required to create the elastic IP

    Please see the AWS official guide for allocate-address; it is possible to assign a tag to the IP address (--tag-specifications).

      allocate-address
    [--domain <value>]
    [--address <value>]
    [--public-ipv4-pool <value>]
    [--network-border-group <value>]
    [--customer-owned-ipv4-pool <value>]
    [--dry-run | --no-dry-run]
    [--tag-specifications <value>]
    [--cli-input-json <value>]
    [--generate-cli-skeleton <value>]
    

    https://github.com/boto/boto/blob/91ba037e54ef521c379263b0ac769c66182527d7/boto/ec2/connection.py#L1830-L1834

    However, I hit the following errors when I passed the TagSpecification to the client.

    opened by yang-wang11 1
  • During data sync between 2 different S3 buckets/accounts String literals transforming into Unicode

    We observed in our production clusters that when syncing between two different accounts/buckets, string literals are transformed into Unicode after a successful migration.

    As a result we are seeing failures when we are trying to read the value after sync.

    opened by GoutamHub 0
  • Add missing regions for S3 Website

    A complete list of supported website endpoints is given at https://docs.aws.amazon.com/general/latest/gr/s3.html#s3_website_region_endpoints; these changes add all the missing endpoints.

    Note that as of today (Feb 2021) the cn-north-1 region isn't listed, but it looks like it is still in use.

    opened by mariocesar 0
  • describe-snapshots doesn't allow for empty values in ~/.aws/config

    This command breaks with my typical ~/.aws/config

    aws --profile default --region us-east-1 ec2 describe-snapshots
    
    'str' object has no attribute 'get'
    

    With values like the following in my ~/.aws/config, it fails:

    [default]
    region = us-east-1
    cli_pager =
    s3 =
    signature_version = s3v4
    

    Note: These blank entries are supported and recommended in other places (or injected by other tools). Not supporting it for some CLI calls feels like a bug.
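    For reference, Python's own configparser accepts such blank entries, which is why other tools emit them freely (a sketch using the config above; note `s3 =` parses as an empty string here, whereas the AWS CLI treats s3 as a nested section):

```python
from configparser import ConfigParser

# The profile from the report, including the blank entries that other
# tools inject
CONFIG_TEXT = """\
[default]
region = us-east-1
cli_pager =
s3 =
signature_version = s3v4
"""

parser = ConfigParser()
parser.read_string(CONFIG_TEXT)

# Blank values parse as empty strings, not as missing keys, so any
# consumer of the parsed profile has to be prepared for ''
profile = dict(parser["default"])
```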

    With simpler values in my ~/.aws/config like the following, it works without issue (probably any set that doesn't have empty values):

    [default]
    region = us-east-1
    

    Environment:

    aws-cli/2.4.17 Python/3.9.10 Darwin/20.5.0 source/x86_64 prompt/off
    

    I haven't done an exhaustive check for this, but I've run about 50 other commands since updating my aws-cli version, and this is the first one that choked on it.

    Thank you for looking at this issue.

    opened by peteristhegreat 0