tfquery: Run SQL queries on your Terraform infrastructure. Query resources and analyze their configuration using a SQL-powered framework.

Overview

🌩️ tfquery 🌩️

Run SQL queries on your Terraform infrastructure. Ask questions that are hard to answer.


🚀 What is tfquery?

tfquery is a framework for running SQL queries on Terraform code. It is built to analyze your Terraform infrastructure, locate resources, run security and compliance checks, spot misconfigured resources, develop CI benchmarks, and much more.

tfquery is designed to help you answer questions about your infrastructure-as-code that are otherwise hard to answer. It lets you query resources and analyze their configuration through a SQL-powered framework.

Why?

Infrastructure-as-code is the de-facto standard today for documenting and deploying infrastructure on cloud providers. As an organization grows, it becomes really hard to understand and analyze the deployed infrastructure. Grepping and searching for resources in Terraform state files is not enough, and because Terraform modules deploy infrastructure dynamically, searching for static resource definitions does not give good visibility either.

With tfquery, you can run SQL queries on Terraform state files, and gain the best possible visibility.


💡 Use tfquery to

  • Have full coverage of your infrastructure (including Amazon AWS, Microsoft Azure, Google Cloud Platform, Alibaba Cloud, IBM Cloud, Oracle Cloud, and many others) without being locked into a specific provider.

  • Analyze the configuration of deployed resources.

  • Develop CI and monitoring checks for cloud infrastructure (see the sketch after this list).

  • Write custom queries to scan Terraform resources.
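
For example, a CI check can be a small script built on tfquery.tfstate.run_query() (documented in the Usage section below). The following is only a minimal sketch: the policy it enforces (no aws_iam_access_key resources tracked in state) and the state-file path are illustrative assumptions, not project defaults.

import sys

import tfquery

# Minimal sketch of a tfquery-based CI check. The rule below (flagging
# aws_iam_access_key resources) and the state-file path are illustrative
# assumptions, not part of tfquery itself.
results = tfquery.tfstate.run_query(
    "terraform.tfstate",
    "select count(*) as count from resources where type = 'aws_iam_access_key';",
)

if results[0]["count"] > 0:
    print("[!] CI check failed: IAM access keys are tracked in Terraform state.")
    sys.exit(1)

print("[+] CI check passed.")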


tfquery vs. Cloud-specific SQL engines?

There are cloud-specific SQL engines that let you run SQL queries to understand the resources in their infrastructure, whether offered as a managed service by the cloud provider or as an open-source tool or product. The main differences tfquery brings:

  • Maintainability: Cloud-specific SQL engines require maintenance whenever new services appear or existing ones introduce breaking changes. tfquery uses Terraform schemas as its standard, so it works across all supported services without needing to be continuously updated with new API specs.

  • Coverage: tfquery covers all the cloud providers that Terraform supports out of the box (thanks to Terraform Providers).


📖 Usage

Run a SQL query on Terraform states

>>> import tfquery
>>>
>>> result = tfquery.tfstate.run_query("terraform.tfstate", "select count(*) from resources")
>>> print(result)
[{'count(*)': 86}]
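
Note that run_query() returns results as a list of dictionaries keyed by the selected column names, so aliasing columns in SQL (for example, count(*) as count) gives cleaner keys to work with.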

Parse all resources from a Terraform state file

>>> import tfquery
>>>
>>> resources = tfquery.tfstate.parse_resources("terraform.tfstate")
>>> print(f"[i] Resources Count: {len(resources)}")
[i] Resources Count: 1475
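
As a short follow-up sketch (continuing the session above), the parsed resources can be grouped by Terraform resource type. This assumes each item returned by parse_resources() is a dictionary carrying a "type" key, mirroring the type column used in the SQL examples; that key name is an assumption, not documented above.

>>> from collections import Counter
>>>
>>> # "type" is assumed to be present on each parsed resource
>>> by_type = Counter(resource["type"] for resource in resources)
>>> print(by_type.most_common(5))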

Advanced Usage

Migrate Version 3 Terraform states to Version 4

This is a parsing helper that migrates older Version 3 Terraform states to a Version 4 state. It adds backward compatibility for Terraform states generated by releases older than Terraform v0.11.

>>> import tfquery
>>>
>>> tfstate_v3 = tfquery.tfstate.load_file("terraform.tfstate")
>>> tfstate_v4 = tfquery.tfstate_v3_migration.upgrade_v3_tfstate(tfstate_v3)

🖲️ Command-Line (tfquery)

tfquery is also available as a CLI tool. It can be used to run SQL queries directly on Terraform states and to import resources into persistent storage.

mazin@hackbox$> tfquery -h

usage: tfquery [-h] [--tfstate TFSTATE] [--tfstate-dir TFSTATE_DIR]
                      [--query QUERY] [--db DB_PATH] [--interactive] [--import]

tfquery: Run SQL queries on your Terraform infrastructure.

optional arguments:
  -h, --help            show this help message and exit
  --tfstate TFSTATE     Terraform .tfstate file.
  --tfstate-dir TFSTATE_DIR
                        Directory of Terraform .tfstate files, for running queries on
                        environments.
  --query QUERY, -q QUERY
                        SQL query to execute.
  --db DB_PATH          DB path (optional. default: temporarily-generated database).
  --interactive, -i     Interactive mode.
  --import              Import tfstate into database.

Examples

  • Run a SQL query on a directory of multiple Terraform states (for multiple workspaces):
$ tfquery -q 'select count(*) as count from resources;'  --tfstate-dir /path/to/terraform-states
[i] DB Path: tfstate.db
[+] Imported 4203 resources from ./prod.tfstate.
[i] DB Path: tfstate.db
[+] Imported 3675 resources from ./nonprod.tfstate.
[i] DB Path: tfstate.db
[+] Imported 463 resources from ./qa.tfstate.
  • Import Terraform states into a database:
$ tfquery --tfstate /path/to/terraform.tfstate --db tfstate.db --import
[i] DB Path: tfstate.db
[+] Imported 386 resources from terraform.tfstate.
  • Run queries on resources imported into a database:
$ tfquery --db tfstate.db -q 'select count(*) as count from resources;'
[
    {
        "count": 386
    }
]

💭 Awesome Queries & Scripts

Find all AWS S3 buckets that do not have versioning enabled

import tfquery, sys
results = tfquery.tfstate.run_query(sys.argv[1], "select * from resources where type = 'aws_s3_bucket'")
for result in results:
    attributes = result["attributes"]
    # Buckets with no versioning block configured at all
    if 'versioning' not in attributes or len(attributes["versioning"]) == 0:
        print(result)
        continue
    # Buckets where versioning is explicitly disabled
    for versioning in attributes["versioning"]:
        if versioning["enabled"] is False:
            print(result)
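
The scripts in this section that read sys.argv[1] expect the path to a Terraform state file as their first command-line argument, for example: python3 s3_versioning_check.py terraform.tfstate (the script filename here is just an example).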

Find all AWS IAM users, and print their ARNs

import tfquery, sys
results = tfquery.tfstate.run_query(sys.argv[1], "select json_extract(attributes, '$.arn') as arn from resources where type = 'aws_iam_user';")
for result in results:
    print(result["arn"])

or

import tfquery, sys
results = tfquery.tfstate.run_query(sys.argv[1], "select attributes from resources where type = 'aws_iam_user';")
for result in results:
    print(result["attributes"]["arn"])
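
Both variants print the same ARNs; the json_extract version performs the extraction inside SQLite, while the second fetches the full attributes and reads the ARN in Python.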

Find all resource types in the environment and show how many instances of each were deployed

import tfquery
results = tfquery.tfstate.run_query("terraform.tfstate", "select type, count(*) as count from resources group by type order by count desc;")
print(results)

Interested in tfquery?

  1. Post a Tweet about the project and tag @mazen160 🙏

  2. 🌟 Star it on GitHub 🌟

  3. Create a PR for a new awesome feature 💛

  4. Would you like to sponsor the project? Contact me by email!


💻 Contribution

Contribution is always welcome! Please feel free to report issues on GitHub and create PRs for new features.

📌 Ideas to Start on

Would you like to contribute to tfquery? Here are some ideas you can start with:

  • Better documentation: it would be great to enhance the documentation with additional examples and queries.

  • CI: Implement CI, along with test Terraform states based on Terragoat.

  • Support dependency lookups for resources: Create a new table called "dependencies", parse V4 Terraform states, and implement a many-to-one relation for resource dependencies (a rough sketch of such a table appears after this list).

  • More V3 --> V4 migration support: currently only V3 resource migration is supported; dependencies are not migrated to the new V4 state. It would be great to continue improving V3 --> V4 support for Terraform states.

  • General validation of the Terraform state parser implementation: Validate the current implementation of the parser and enhance it where possible.

  • Connect resources with the Terraform state base name: In environments with many workspaces, each workspace can have a different name; it would be nice to add a column for the Terraform state file's base name to help with querying across workspaces.

  • tfplan parsing: Allow parsing of tfplan files. This could be a first step toward implementing a new CI security scanner for Terraform deployments.

  • Logo design: a logo design would be great.

  • Web interface representation with coleifer/sqlite-web - Thanks @securityfu for the idea!

  • Add a managed repository of pre-defined queries so teams can query Terraform for different compliance and security controls.

As you can see, there are many ways to contribute. Please help us make the project better for everyone!
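
For the "dependencies" idea above, here is a rough, purely hypothetical sketch of what such a table could look like in the SQLite database. None of these table or column names exist in tfquery today; they are only meant to make the idea concrete.

import sqlite3

# Hypothetical sketch only: a possible schema for a "dependencies" table.
# Table and column names are illustrative and not part of tfquery today.
connection = sqlite3.connect("tfstate.db")
connection.execute(
    """
    CREATE TABLE IF NOT EXISTS dependencies (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        resource_id INTEGER,   -- many-to-one reference to a row in resources
        depends_on TEXT        -- address of the resource being depended on
    )
    """
)
connection.commit()
connection.close()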


Installation

mazin@hackbox$> git clone https://github.com/mazen160/tfquery.git
mazin@hackbox$> cd tfquery
mazin@hackbox$> python3 setup.py install

or

mazin@hackbox$> pip install git+https://github.com/mazen160/tfquery

📄 License

The project is licensed under the MIT License.

💚 Author

Mazin Ahmed

Comments
  • Invalid terraform state error

    Hi 👋 , thanks for doing this.

    I got this error reporting invalid terraform state file:

     9, in validate_tfstate
        raise ValueError("Invalid tfstate file")
    ValueError: Invalid tfstate file
    

    and the terraform state file:

    {
        "version": 3,
        "serial": 1,
        "lineage": "<uuid>",
        "backend": {
            "type": "s3",
            "config": {
                "access_key": null,
                "acl": "private",
                "assume_role_policy": null,
    

    my understanding is the error comes from the validation expecting a key named "terraform_version" instead of "version". However every reference after that seems to be looking for "version", e.g:

    def validate_tfstate(tfstate):
        if "terraform_version" not in tfstate.keys():
            raise ValueError("Invalid tfstate file")
            return False
        if tfstate["version"] < 3:
            raise ValueError("Unsupported tfstate version")
            return False
        return True
    

    Do you think this could be a version issue and should accept both?

    opened by calvernaz 2
  • run_query does not initialise SQLHandler correctly

    This looks like a really cool project that might help us answer some really interesting questions, thanks for starting it! If we end up using a lot I will endeavour to contribute.

    With tfquery installed from both pypi and git, running the below:

    import tfquery
    
    def main():
        state_path = "terraform.tfstate"
        result = tfquery.tfstate.run_query(state_path, "select count(*) from resources")
        print(result)
    
    if __name__ == "__main__":
        main()
    

    was giving me this error:

    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/home/ben/scratch/tfquery/venv/lib/python3.8/site-packages/tfquery/tfstate.py", line 114, in run_query
        s.insert_resources(resources)
      File "/home/ben/scratch/tfquery/venv/lib/python3.8/site-packages/tfquery/sql_handler.py", line 85, in insert_resources
        self.insert_resource(resource)
      File "/home/ben/scratch/tfquery/venv/lib/python3.8/site-packages/tfquery/sql_handler.py", line 99, in insert_resource
        resource["tfstate_file"] = os.path.basename(self.tfstate_file)
      File "/usr/lib/python3.8/posixpath.py", line 142, in basename
        p = os.fspath(p)
    TypeError: expected str, bytes or os.PathLike object, not NoneType
    

    It looks like the line that initialises SQLHandler in run_query should pass the tfstate_file like so:

        s = SQLHandler(in_memory=True, tfstate_file=tfstate_file)
    

    Having made that change locally I can then run queries as expected.

    opened by benabineri 1
  • Tfplan support

    This PR introduces support for tfplan parsing. This should allow running SQL queries on infrastructure changes within Terraform.

    ToDo

    • [x] Update Readme and add examples
    • [x] Add sample data
    opened by mazen160 0