Yamale (ya·ma·lē) - A schema and validator for YAML.

Overview

⚠️ Ensure that your schema definitions come from internal or trusted sources. Yamale does not protect against intentionally malicious schemas.

What's YAML? See the current spec here and an introduction to the syntax here.

Requirements

  • Python 3.6+
  • PyYAML
  • ruamel.yaml (optional)

Install

pip

$ pip install yamale

NOTE: Some platforms (e.g., macOS) may ship with only Python 2 and without pip. Installing Python 3 should also install pip. To preserve system dependencies on the default software, consider installing Python 3 as a local package; replacing the system-provided Python may disrupt other software. macOS users may wish to investigate MacPorts, Homebrew, or building Python 3 from source; in all three cases, Apple's Command Line Tools (CLT) for Xcode may be required. See also Developers, below.

Manual

  1. Download Yamale from: https://github.com/23andMe/Yamale/archive/master.zip
  2. Unzip somewhere temporary
  3. Run python setup.py install (may have to prepend sudo)

Usage

Command line

Yamale can be run from the command line to validate one or many YAML files. Yamale searches the directory you supply (the current directory by default) for YAML files. For each YAML file it finds, it looks in that file's directory for a schema; if there is no schema there, Yamale keeps looking up the directory tree until it finds one. If Yamale cannot find a schema, it will tell you.

Usage:

usage: yamale [-h] [-s SCHEMA] [-n CPU_NUM] [-p PARSER] [--no-strict] [PATH]

Validate yaml files.

positional arguments:
  PATH                  folder to validate. Default is current directory.

optional arguments:
  -h, --help            show this help message and exit
  -s SCHEMA, --schema SCHEMA
                        filename of schema. Default is schema.yaml.
  -n CPU_NUM, --cpu-num CPU_NUM
                        number of CPUs to use. Default is 4.
  -p PARSER, --parser PARSER
                        YAML library to load files. Choices are "ruamel" or
                        "pyyaml" (default).
  --no-strict           Disable strict mode, unexpected elements in the data
                        will be accepted.
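
For example, to validate every YAML file under a configs directory against a single schema file (the paths here are only illustrative):

$ yamale -s ./schemas/schema.yaml ./configs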

API

There are several ways to feed Yamale schema and data files. The simplest way is to let Yamale take care of reading and parsing your YAML files.

All you need to do is supply the files' paths:

# Import Yamale and make a schema object:
import yamale
schema = yamale.make_schema('./schema.yaml')

# Create a Data object
data = yamale.make_data('./data.yaml')

# Validate data against the schema. Throws a ValueError if data is invalid.
yamale.validate(schema, data)

You can pass a string of YAML to make_schema() and make_data() instead of passing a file path by using the content= parameter:

data = yamale.make_data(content="""
name: Bill
age: 26
height: 6.2
awesome: True
""")

If data is valid, nothing will happen. However, if data is invalid, Yamale will raise a YamaleError (a subclass of ValueError) with a message containing all the invalid nodes:

try:
    yamale.validate(schema, data)
    print('Validation success! 👍')
except ValueError as e:
    print('Validation failed!\n%s' % str(e))
    exit(1)

The exception also carries an array of ValidationResult objects in its results attribute:

try:
    yamale.validate(schema, data)
    print('Validation success! 👍')
except YamaleError as e:
    print('Validation failed!\n')
    for result in e.results:
        print("Error validating data '%s' with '%s'\n\t" % (result.data, result.schema))
        for error in result.errors:
            print('\t%s' % error)
    exit(1)

You can also specify an optional parser if you'd like to use ruamel.yaml (which supports YAML 1.2) instead of the default PyYAML:

# Import Yamale and make a schema object, make sure ruamel.yaml is installed already.
import yamale
schema = yamale.make_schema('./schema.yaml', parser='ruamel')

# Create a Data object
data = yamale.make_data('./data.yaml', parser='ruamel')

# Validate data against the schema same as before.
yamale.validate(schema, data)

Schema

⚠️ Ensure that your schema definitions come from internal or trusted sources. Yamale does not protect against intentionally malicious schemas.

To use Yamale you must make a schema. A schema is a valid YAML file with one or more documents inside. Each node terminates in a string which contains valid Yamale syntax. For example, str() represents a String validator.

A basic schema:

name: str()
age: int(max=200)
height: num()
awesome: bool()

And some YAML that validates:

name: Bill
age: 26
height: 6.2
awesome: True

Take a look at the Examples section for more complex schema ideas.

Includes

Schema files may contain more than one YAML document (nodes separated by ---). The first document found will be the base schema. Any additional documents will be treated as Includes. Includes allow you to define a valid structure once and use it several times. They also allow you to do recursion.

A schema with an Include validator:

person1: include('person')
person2: include('person')
---
person:
    name: str()
    age: int()

Some valid YAML:

person1:
    name: Bill
    age: 70

person2:
    name: Jill
    age: 20

Every root node not in the first YAML document will be treated like an include:

person: include('friend')
group: include('family')
---
friend:
    name: str()
family:
    name: str()

Is equivalent to:

person: include('friend')
group: include('family')
---
friend:
    name: str()
---
family:
    name: str()

Recursion

You can get recursion using the Include validator.

This schema:

person: include('human')
---
human:
    name: str()
    age: int()
    friend: include('human', required=False)

Will validate this data:

person:
    name: Bill
    age: 50
    friend:
        name: Jill
        age: 20
        friend:
            name: Will
            age: 10

Adding external includes

After you construct a schema you can add extra, external include definitions by calling schema.add_include(dict). This method takes a dictionary and adds each key as another include.
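
A minimal sketch (the 'postcode' include name and its fields are made up for illustration):

import yamale

schema = yamale.make_schema('./schema.yaml')

# Each key of the dictionary becomes an include the schema can reference by name.
schema.add_include({
    'postcode': {
        'city': 'str()',
        'zip': 'int()',
    }
})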

Strict mode

By default Yamale will provide errors for extra elements present in lists and maps that are not covered by the schema. With strict mode disabled (using the --no-strict command line option), additional elements will not cause any errors. In the API, strict mode can be toggled by passing the strict=True/False flag to the validate function.

It is possible to mix strict and non-strict mode by setting the strict=True/False flag in the include validator, setting the option only for the included validators.
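
A minimal sketch of both options (the schema content is illustrative):

import yamale

# Relax strict mode only for one include; the rest of the schema stays strict.
schema = yamale.make_schema(content="""
person: include('person', strict=False)
---
person:
    name: str()
""")
data = yamale.make_data('./data.yaml')

# Or disable strict mode for the whole validation run.
yamale.validate(schema, data, strict=False)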

Validators

Here are all the validators Yamale knows about. Every validator takes a required keyword telling Yamale whether or not that node must exist. By default every node is required. Example: str(required=False)

You can also require that an optional value is not None by using the none keyword. By default Yamale will accept None as a valid value for a key that's not required. Reject None values with none=False in any validator. Example: str(required=False, none=False).
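
For example, with a hypothetical optional key the difference looks like this (a minimal sketch using inline content):

schema = yamale.make_schema(content="nickname: str(required=False, none=False)")

# Omitting 'nickname' entirely is fine because the key is not required,
# but an explicit null is rejected because of none=False:
data = yamale.make_data(content="nickname: null")
yamale.validate(schema, data)  # raises YamaleError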

Some validators take keywords, some take arguments, and some take both. For instance, the enum() validator takes one or more constants as arguments and the required keyword: enum('a string', 1, False, required=False)

String - str(min=int, max=int, equals=string, starts_with=string, ends_with=string, matches=regex, exclude=string, ignore_case=False, multiline=False, dotall=False)

Validates strings.

  • keywords
    • min: len(string) >= min
    • max: len(string) <= max
    • equals: string == value (add ignore_case=True for case-insensitive checking)
    • starts_with: Accepts only strings starting with given value (add ignore_case=True for case-insensitive checking)
    • matches: Validates the string against a given regex. As with the regex() validator, you can use ignore_case, multiline, and dotall.
    • ends_with: Accepts only strings ending with given value (add ignore_case=True for case-insensitive checking)
    • exclude: Rejects strings that contain any character in the excluded value
    • ignore_case: Validates strings in a case-insensitive manner.
    • multiline: ^ and $ in a pattern match at the beginning and end of each line in a string, in addition to matching at the beginning and end of the entire string. (A pattern matches at the beginning of a string even in multiline mode; see the regex() examples below for a workaround.) Only allowed in conjunction with the matches keyword.
    • dotall: . in a pattern matches newline characters in a validated string, in addition to matching every character that isn't a newline. Only allowed in conjunction with the matches keyword.

Examples:

  • str(max=10, exclude='?!'): Allows only strings less than 11 characters that don't contain ? or !.

Regex - regex([patterns], name=string, ignore_case=False, multiline=False, dotall=False)

Validates strings against one or more regular expressions.

  • arguments: one or more Python regular expression patterns
  • keywords:
    • name: A friendly description for the patterns.
    • ignore_case: Validates strings in a case-insensitive manner.
    • multiline: ^ and $ in a pattern match at the beginning and end of each line in a string in addition to matching at the beginning and end of the entire string. (A pattern matches at the beginning of a string even in multiline mode; see below for a workaround.)
    • dotall: . in a pattern matches newline characters in a validated string in addition to matching every character that isn't a newline.

Examples:

  • regex('^[^?!]{,10}$'): Allows only strings less than 11 characters that don't contain ? or !.
  • regex(r'^(\d+)(\s\1)+$', name='repeated natural'): Allows only strings that contain two or more identical digit sequences, each separated by a whitespace character. Non-matching strings like sugar are rejected with a message like 'sugar' is not a repeated natural.
  • regex('.*^apples$', multiline=True, dotall=True): Allows the string apples as well as multiline strings that contain the line apples.

Integer - int(min=int, max=int)

Validates integers.

  • keywords
    • min: int >= min
    • max: int <= max

Number - num(min=float, max=float)

Validates integers and floats.

  • keywords
    • min: num >= min
    • max: num <= max

Boolean - bool()

Validates booleans.

Null - null()

Validates null values.

Enum - enum([primitives])

Validates from a list of constants.

  • arguments: constants to test equality with

Examples:

  • enum('a string', 1, False): a value can be either 'a string', 1 or False

Day - day(min=date, max=date)

Validates a date in the form of YYYY-MM-DD.

  • keywords
    • min: date >= min
    • max: date <= max

Examples:

  • day(min='2001-01-01', max='2100-01-01'): Only allows dates between 2001-01-01 and 2100-01-01.

Timestamp - timestamp(min=time, max=time)

Validates a timestamp in the form of YYYY-MM-DD HH:MM:SS.

  • keywords
    • min: time >= min
    • max: time <= max

Examples:

  • timestamp(min='2001-01-01 01:00:00', max='2100-01-01 23:00:00'): Only allows times between 2001-01-01 01:00:00 and 2100-01-01 23:00:00.

List - list([validators], min=int, max=int)

Validates lists. If one or more validators are passed to list(), only nodes that pass at least one of those validators will be accepted.

  • arguments: one or more validators to test values with
  • keywords
    • min: len(list) >= min
    • max: len(list) <= max

Examples:

  • list(): Validates any list
  • list(include('custom'), int(), min=4): Only validates lists whose items match the custom include or are integers, and that contain a minimum of 4 items.

Map - map([validators], key=validator, min=int, max=int)

Validates maps. Use when you want a node to contain freeform data. Similar to List, Map takes one or more validators to run against the values of its nodes, and only nodes that pass at least one of those validators will be accepted. By default, only the values of nodes are validated and the keys aren't checked.

  • arguments: one or more validators to test values with
  • keywords
    • key: A validator for the keys of the map.
    • min: len(map) >= min
    • max: len(map) <= max

Examples:

  • map(): Validates any map
  • map(str(), int()): Only validates maps whose values are strings or integers.
  • map(str(), key=int()): Only validates maps whose keys are integers and values are strings. 1: one would be valid but '1': one would not.
  • map(str(), min=1): Only validates a non-empty map.

IP Address - ip()

Validates IPv4 and IPv6 addresses.

  • keywords
    • version: 4 or 6; explicitly force IPv4 or IPv6 validation

Examples:

  • ip(): Allows any valid IPv4 or IPv6 address
  • ip(version=4): Allows any valid IPv4 address
  • ip(version=6): Allows any valid IPv6 address

MAC Address - mac()

Validates MAC addresses.

Examples:

  • mac(): Allows any valid MAC address

Any - any([validators])

Validates against a union of types. Use when a node must contain one and only one of several types. A value is valid if at least one of the listed validators accepts it. If no validators are given, any value is accepted.

  • arguments: validators to test values with (if none are given, any value is allowed; if one or more are given, at least one must match)

Examples:

  • any(int(), null()): Validates either an integer or a null value.
  • any(num(), include('vector')): Validates either a number or an included 'vector' type.
  • any(str(min=3, max=3), str(min=5, max=5), str(min=7, max=7)): Validates a string that is exactly 3, 5, or 7 characters long
  • any(): Allows any value.

Subset - subset([validators], allow_empty=False)

Validates against a subset of types. Unlike the Any validator, this validator allows one or more of several types. As such, it automatically validates against a list. The list is valid if every value can be validated against at least one of the given validators.

  • arguments: validators to test with (at least one; if none is given, a ValueError exception will be raised)
  • keywords:
    • allow_empty: Allow the subset to be empty (and is, therefore, also optional). This overrides the required flag.

Examples:

  • subset(int(), str()): Validates against an integer, a string, or a list of either.
  • subset(int(), str(), allow_empty=True): Same as above, but allows the empty set and makes the subset optional.

Include - include(include_name)

Validates included structures. Must supply the name of a valid include.

  • arguments: single name of a defined include, surrounded by quotes.

Examples:

  • include('person')

Custom validators

It is also possible to add your own custom validators. This is an advanced topic, but here is an example of adding a Date validator and using it in a schema as date():

import yamale
import datetime
from yamale.validators import DefaultValidators, Validator

class Date(Validator):
    """ Custom Date validator """
    tag = 'date'

    def _is_valid(self, value):
        return isinstance(value, datetime.date)

validators = DefaultValidators.copy()  # This is a dictionary
validators[Date.tag] = Date
schema = yamale.make_schema('./schema.yaml', validators=validators)
# Then use `schema` as normal
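
For example, a schema using the new date() tag could then be built and validated like this (the field name is made up; a minimal sketch using inline content):

schema = yamale.make_schema(content="birthday: date()", validators=validators)
data = yamale.make_data(content="birthday: 2000-01-01")
yamale.validate(schema, data)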

Examples

⚠️ Ensure that your schema definitions come from internal or trusted sources. Yamale does not protect against intentionally malicious schemas.

Using keywords

Schema:

optional: str(required=False)
optional_min: int(min=1, required=False)
min: num(min=1.5)
max: int(max=100)

Valid Data:

optional_min: 10
min: 1.6
max: 100

Includes and recursion

Schema:

customerA: include('customer')
customerB: include('customer')
recursion: include('recurse')
---
customer:
    name: str()
    age: int()
    custom: include('custom_type')

custom_type:
    integer: int()

recurse:
    level: int()
    again: include('recurse', required=False)

Valid Data:

customerA:
    name: bob
    age: 900
    custom:
        integer: 1
customerB:
    name: jill
    age: 1
    custom:
        integer: 3
recursion:
    level: 1
    again:
        level: 2
        again:
            level: 3
            again:
                level: 4

Lists

Schema:

list_with_two_types: list(str(), include('variant'))
questions: list(include('question'))
---
variant:
  rsid: str()
  name: str()

question:
  choices: list(include('choices'))
  questions: list(include('question'), required=False)

choices:
  id: str()

Valid Data:

list_with_two_types:
  - 'some'
  - rsid: 'rs123'
    name: 'some SNP'
  - 'thing'
  - rsid: 'rs312'
    name: 'another SNP'
questions:
  - choices:
      - id: 'id_str'
      - id: 'id_str1'
    questions:
      - choices:
        - id: 'id_str'
        - id: 'id_str1'

The data is a list of items without a keyword at the top level

Schema:

list(include('human'), min=2, max=2)

---
human:
  name: str()
  age: int(max=200)
  height: num()
  awesome: bool()

Valid Data:

- name: Bill
  age: 26
  height: 6.2
  awesome: True

- name: Adrian
  age: 23
  height: 6.3
  awesome: True

Writing Tests

To validate YAML files when you run your program's tests, use Yamale's YamaleTestCase.

Example:

import os

from yamale import YamaleTestCase  # assuming YamaleTestCase is exported from the yamale package

class TestYaml(YamaleTestCase):
    base_dir = os.path.dirname(os.path.realpath(__file__))
    schema = 'schema.yaml'
    yaml = 'data.yaml'
    # or yaml = ['data-*.yaml', 'some_data.yaml']

    def runTest(self):
        self.assertTrue(self.validate())

base_dir: String path to prepend to all other paths. This is optional.

schema: String of path to the schema file to use. One schema file per test case.

yaml: String or list of yaml files to validate. Accepts globs.

Developers

Testing

Yamale uses Tox to run its tests against multiple Python versions. To run the tests, first check out Yamale and install Tox, then run make test in Yamale's root directory. You may also have to install the Python versions you want to test against.

NOTE on Python versions: tox.ini specifies the lowest and highest versions of Python supported by Yamale. Unless your development environment is configured to support testing against multiple Python versions, one or more of the test branches may fail. One method of enabling testing against multiple versions of Python is to install pyenv and tox-pyenv and to use pyenv install and pyenv local to ensure that tox is able to locate appropriate Pythons.

Releasing

Yamale uses GitHub Actions to upload new tags to PyPI. To release a new version:

  1. Make a commit with the new version in setup.py.
  2. Run tests for good luck.
  3. Run make release.

GitHub Actions will take care of the rest.

Comments
  • Bugfix/clean output

    PR should fix #64. The idea is that:

    • yamale.validate(schema, data) returns a list of failures that an external module can use as an API.
    • command_line.py
      • if data files are valid, display "Validation Success" and exit 0
      • else, output the list of failures to stdout and exit 1
    • In case of errors like an invalid schema format, a missing schema file, ..., Yamale still raises an exception

    The PR needs improvement, on YamaleTestCase for example. Any suggestions are welcome.

    Arnaud

    opened by abourree 36
  • Feature Adds Day/Timestamp format Constraint

    This PR introduces the format constraint to both the Day and Timestamp validators.

    The PyYAML and ruamel.yaml loaders have been modified to not automatically convert strings to dates or datetimes. Instead, dates and datetimes are loaded as strings. During the validation step, the string values are passed to the Day and Timestamp validators which try to coerce to a date/datetime using the value passed in the format argument of the schema definition file. If the format argument is not used, a default date/datetime parser util.parse_default_date() is used. This parser mimics the default PyYAML and ruamel.yaml behavior.

    This PR also includes updates to the README file and new and updated tests.

    opened by zbremmer 22
  • [Feature] Adds Subset validator

    Similar to the any validator, except it allows combining multiple types (instead of exactly one). This is mostly useful for various Includes, thereby allowing some sort of conditional key-value pairs, such as the following. Granted, this example is over-simplistic and can be resolved in other ways (such as explicitly specifying the mapping values), but this would apply to a recursive definition as well.

    # schema.yaml
    my_types: union(include('type1'), include('type2'))
    ---
    type1: map(str(), key=regex('^my_str(ing)?$', ignore_case=True))
    ---
    type2: map(int(), key=regex('^my_int(eger)?$', ignore_case=True))
    
    # data_success.yaml
    my_types:
      my_string: foo
      my_int: 3
    
    # data_fail.yaml
    my_types:
      my_string: 3  # Not a string
      my_int: foo  # Not an int
    
    opened by idantene 16
  • Handle position with a custom class instead of string

    Add datapath class to replace the 'dotted string' used to indicate the path in the data/schema

    • Fixes issue with nested maps in an include, closes #56
    • Allows top level validators other than fixed map, brought up in the now closed issue #40
    opened by drmull 15
  • Add IP and MAC Address Validator

    This adds a very basic IP address validator. It uses ipaddress from the Python3 stdlib and the backported version for Python2. I'm unsure if ipaddress is in a default install of Python2.7.x. It was in mine, but it's also listed on PyPI as a backport, so I added it to dependencies anyway.

    Anyway, this will let you validate IPv4 and IPv6 addresses, optionally enforcing the version. You could extend this a lot, but I don't want to boil the ocean.

    I know you can do custom validators, but this seems like something that might be common enough that it is worth having upstream.

    I tried to make sure I did all the right things with testing. Please let me know if I missed something.

    opened by supertylerc 12
  • Questions about representation of numbers

    Dear devs,

    Example schema:

    number1: int()
    number2: float()

    1. Why are 1e4 or 1.1e4 not recognized as a valid int? Should be possible imho.
    2. Why does the built-in float validator raise an error on numbers without a decimal point? I.e. 10.0 is valid but 10 is not a valid double. Seems illogical, since people never add "." to such numbers.

    I have created my own validator, which did the job. But maybe this could be supported natively? Thank you very much for a great tool!

    # custom type int64
    class int64_type_validator(Validator):
        """ Custom Int64 validator """
        tag = 'int64'

        def _is_valid(self, value):
            strg = str(value)
            try:
                f = float(value)
                i = np.int64(f)
                # Notes:
                # TBD: Maybe add epsilon?
                # Is 10.1e9 valid input of int64? So far yes. Or do we want to limit AeB?
                return (f - i) == 0
            except:
                return False

    class float_type_validator(Validator):
        """ Custom float validator """
        tag = 'float'

        def _is_valid(self, value):
            strg = str(value)
            try:
                f = float(strg)
                return True
            except:
                return False
    
    opened by ilia-wolke7 10
  • Yamale not able to match multiline or multiple nodes with regex given

    I am trying to validate my filters node with help of a regex pattern which can match multiple lines if it contains the given keyword in it. I checked regex validation for the text outside of yamale and it is working fine but when I pass the same regex onto yamale it is failing saying not a regex match.

    Regex pattern: "((.|\n))custodian-skip((.|\n))"

    Content used for validation:

    filters:
      - type: value
        value_type: age
        key: CreationDateTime
        value: 30
        op: gt
      - "tag:custodian-skip": absent
      - and:
          - "tag:Owner": absent
          - "tag:owner": absent
          - "tag:team": absent

    opened by zendesk-mpalkur 10
  • Add new base validator and base constraint classes

    Add new base classes to validator and constraint that take additional arguments, such as schema, path etc. This allows constraints to use non-primitive validators and resolves issue #133

    My initial intention was to pull out the logic for the non-primitive validators from the schema file and put them in each respective validator class, but then I figured it would be a bit too drastic.

    opened by drmull 9
  • Verifying file with multiple documents of different types

    Thanks for Yamale!

    Have used for a few projects already, but now have a problem. I have a file with several yaml documents, of differing types.

    kind: foo
    # ... more content corresponding to type foo
    ---
    kind: bar
    # ... more content corresponding to type bar
    ---
    ...
    

    [Although they aren't, one could picture them as kubernetes manifests, for instance.]

    Depending on the kind parameter, I want to validate each document against one of several schemas. How can I best accomplish this with Yamale?

    opened by shaunc 8
  • Ability to specify schema and data without a filename

    Currently you use the library like this:

    import yamale
    schema = yamale.make_schema('./schema.yaml')
    
    # Create a Data object
    data = yamale.make_data('./data.yaml')
    
    # Validate data against the schema. Throws a ValueError if data is invalid.
    yamale.validate(schema, data)
    

    It would be really useful if there was an option to provide the schema and data as Python strings instead - that way they could be loaded from a database or from an incoming HTTP request.

    Something like this could work

    import yamale
    
    schema = yamale.make_schema(content="""
    name: str()
    age: int(max=200)
    height: num()
    awesome: bool()
    """)
    
    data = yamale.make_data(content="""
    name: Bill
    age: 26
    height: 6.2
    awesome: True
    """)
    
    yamale.validate(schema, data)
    
    enhancement 
    opened by simonw 8
  • Regular expression validator

    Hi, here's my stab at adding a regular expression validator. I would've created a feature request but one exists already at #40.

    The default fail message was modified, because 'foo' is not a regex didn't make any sense, but further customization of the fail message to something like 'foo' is not a User ID may or may not be something you want.

    I considered allowing regex flags to be passed to re.compile() by using the validator like regex(r'pattern', [r'pattern2',] flags=re.IGNORECASE, name='eggs'), but that looked like it would've required adding the re module (or at least the various flags it defines) to the allowed (very) short list of safe_globals in syntax.parser. So... never mind that. :relieved:

    I also updated the README to document the validator and fix a minor inaccuracy in the existing documentation for String.

    Thanks for reviewing!

    opened by kangtastic 8
  • Make error message parseable

    Can this lib output an error that is parseable? For example, if yamale found an issue in a yaml file it would output something like this:

    authors.yml:10:8; apis.0.path: Required field missing

    so it has the file name, line number, column number and error message, which can be used by other CLI tools

    opened by mychaelgo 0
  • Fail when validation doesn't actually run

    When validating a file, if the schema or the file doesn't actually exist, validation still passes and returns an exit code of 0. If validation of an actual file fails, the exit code is 1; similar behavior would be expected for invalid or missing files.

    If the schema file doesn't exist or the number of matching yaml files is 0, then the exit code should probably be 1.

    yamale -s not-a-schema not-a-file
    Finding yaml files...
    Found 0 yaml files.
    Validating...
    Validation success! 👍
    
    bug small 
    opened by slimm609 1
  • Add CodeQL workflow for GitHub code scanning

    Hi 23andMe/Yamale!

    This is a one-off automatically generated pull request from LGTM.com :robot:. You might have heard that we’ve integrated LGTM’s underlying CodeQL analysis engine natively into GitHub. The result is GitHub code scanning!

    With LGTM fully integrated into code scanning, we are focused on improving CodeQL within the native GitHub code scanning experience. In order to take advantage of current and future improvements to our analysis capabilities, we suggest you enable code scanning on your repository. Please take a look at our blog post for more information.

    This pull request enables code scanning by adding an auto-generated codeql.yml workflow file for GitHub Actions to your repository — take a look! We tested it before opening this pull request, so all should be working :heavy_check_mark:. In fact, you might already have seen some alerts appear on this pull request!

    Where needed and if possible, we’ve adjusted the configuration to the needs of your particular repository. But of course, you should feel free to tweak it further! Check this page for detailed documentation.

    Questions? Check out the FAQ below!

    FAQ

    How often will the code scanning analysis run?

    By default, code scanning will trigger a scan with the CodeQL engine on the following events:

    • On every pull request — to flag up potential security problems for you to investigate before merging a PR.
    • On every push to your default branch and other protected branches — this keeps the analysis results on your repository’s Security tab up to date.
    • Once a week at a fixed time — to make sure you benefit from the latest updated security analysis even when no code was committed or PRs were opened.

    What will this cost?

    Nothing! The CodeQL engine will run inside GitHub Actions, making use of your unlimited free compute minutes for public repositories.

    What types of problems does CodeQL find?

    The CodeQL engine that powers GitHub code scanning is the exact same engine that powers LGTM.com. The exact set of rules has been tweaked slightly, but you should see almost exactly the same types of alerts as you were used to on LGTM.com: we’ve enabled the security-and-quality query suite for you.

    How do I upgrade my CodeQL engine?

    No need! New versions of the CodeQL analysis are constantly deployed on GitHub.com; your repository will automatically benefit from the most recently released version.

    The analysis doesn’t seem to be working

    If you get an error in GitHub Actions that indicates that CodeQL wasn’t able to analyze your code, please follow the instructions here to debug the analysis.

    How do I disable LGTM.com?

    If you have LGTM’s automatic pull request analysis enabled, then you can follow these steps to disable the LGTM pull request analysis. You don’t actually need to remove your repository from LGTM.com; it will automatically be removed in the next few months as part of the deprecation of LGTM.com (more info here).

    Which source code hosting platforms does code scanning support?

    GitHub code scanning is deeply integrated within GitHub itself. If you’d like to scan source code that is hosted elsewhere, we suggest that you create a mirror of that code on GitHub.

    How do I know this PR is legitimate?

    This PR is filed by the official LGTM.com GitHub App, in line with the deprecation timeline that was announced on the official GitHub Blog. The proposed GitHub Action workflow uses the official open source GitHub CodeQL Action. If you have any other questions or concerns, please join the discussion here in the official GitHub community!

    I have another question / how do I get in touch?

    Please join the discussion here to ask further questions and send us suggestions!

    opened by lgtm-com[bot] 0
  • Adding external includes from existing schema

    Hi and thank you for yamale :)

    I would like to reuse some definitions from an old schema to generate a new schema. From your documentation it indeed looks like this should be possible:

    Adding external includes

    After you construct a schema you can add extra, external include definitions by calling schema.add_include(dict). This method takes a dictionary and adds each key as another include.

    I thought it could be done like shown in this dummy example:

    old_schema.yaml

    def_from_old_schema:
        element : str()
        other_element : num()
    

    new_schema.yaml

    list(include('def_from_old_schema'))
    
    import yamale
    
    old_schema = yamale.make_schema('old_schema.yaml')
    new_schema = yamale.make_schema('new_schema.yaml')
    new_schema.add_include(old_schema.dict)
    

    However, this raises a rather convoluted error message (see below). I thought old_schema.dict would have had the correct structure to be used with Schema.add_include(); however, values like 'String((), {})' throw it off. What is the simplest way to generate a dict of the right structure?

    Traceback (most recent call last):
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/syntax/parser.py", line 39, in parse
        tree = ast.parse(validator_string, mode='eval')
      File "/usr/lib/python3.10/ast.py", line 50, in parse
        return compile(source, filename, mode, flags,
    TypeError: compile() arg 1 must be a string, bytes or AST object

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/schema/schema.py", line 47, in _parse_schema_item
        return syntax.parse(expression, validators)
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/syntax/parser.py", line 46, in parse
        raise SyntaxError(
    SyntaxError: Invalid schema expression: 'String((), {})'. compile() arg 1 must be a string, bytes or AST object

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3553, in run_code
        exec(code_obj, self.user_global_ns, self.user_ns)
      File "", line 1, in <cell line: 1>
        new_schema.add_include(old_schema.dict)
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/schema/schema.py", line 26, in add_include
        t = Schema(custom_type, name=include_name,
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/schema/schema.py", line 17, in init
        self._schema = self._process_schema(DataPath(),
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/schema/schema.py", line 36, in _process_schema
        schema_data[key] = self._process_schema(path + DataPath(key),
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/schema/schema.py", line 40, in _process_schema
        schema_data = self._parse_schema_item(path,
      File "/home/c71chilltown/.local/lib/python3.10/site-packages/yamale/schema/schema.py", line 51, in _parse_schema_item
        raise SyntaxError(error)
    SyntaxError: Invalid schema expression: 'String((), {})'. compile() arg 1 must be a string, bytes or AST object at node 'element'

    opened by edager 2
  • Command line argument for custom validators

    First of all, thank you for this great tool. It is a crucial part of our build pipeline to maintain code quality and therefore incredibly useful to our project.

    As Yamale already features a modular design, able to be extended by custom validators, my pull request aims to allow this using command line arguments.

    There are no breaking changes in my pull request and I wrote a test that completes successfully for python 3.6, 3.8 and (with a newer pytest version) 3.10 on my system.

    opened by jhe-iqbis 8
  • Error when dealing with duplicate anchors

    Hello and thank you for this tool.

    Consider the following sample YAML file:

    test:
      - name: &name foo
        properties:
          name: *name
      
      - name: &name bar
        properties:
          name: *name
    

    Note the two anchors with the same name, which is fine according to the YAML specification. Here is a schema to validate this file:

    test: list(include('value'))
    value:
      name: str()
      properties:
        name: str()
    

    Running Yamale with default options produces the following error:

    yaml.composer.ComposerError: found duplicate anchor; first occurrence
      in "test.yaml", line 3, column 11
    second occurrence
      in "test.yaml", line 7, column 11
    

    Same behavior with --parser ruamel but the exception is ruamel.yaml.composer.ComposerError.

    As described here, ruamel.yaml can parse this correctly, but pure=True must be specified when initializing it:

    from ruamel.yaml import YAML
    yaml = YAML(typ='safe', pure=True)
    

    Can you please consider adding a flag to toggle the pure option in ruamel so Yamale doesn't error out in this scenario?

    opened by ghost 1
Releases (4.0.4)
  • 4.0.4(Aug 22, 2022)

  • 4.0.2(Oct 28, 2021)

  • 4.0.0(Oct 11, 2021)

    This release is created to address the following issue: https://github.com/23andMe/Yamale/issues/167

    The change in PR https://github.com/23andMe/Yamale/pull/173 mitigates that specific issue. We are unaware of any backwards incompatibility with the introduction of this fix, but we wanted to increment the major version number in case there are users with more complex schemas than what we test against.

    We've also included the following warning in our README:

    ⚠️ Ensure that your schema definitions come from internal or trusted sources. Yamale does not protect against intentionally malicious schemas.

    Source code(tar.gz)
    Source code(zip)
  • 3.0.8(Aug 3, 2021)

  • 3.0.2(Aug 4, 2020)

    Fixes #119: strict mode was not the default on the command line, but it was for the API. This fix ensures strict mode is the default in all uses.

    Source code(tar.gz)
    Source code(zip)
  • 3.0.0(Jul 10, 2020)

    Note: Due to a packaging bug, users running Python 2.x should pin the major version of Yamale to 2.x.

    We're doing a major version jump to include the following changes:

    • Remove Python 2.x support
    • Make the default validation "strict". The --strict command line option is now replaced with --no-strict for those who want the old behavior. See the README for more details.
    • Prevent int and num validators from accepting bool values. #109
    Source code(tar.gz)
    Source code(zip)
  • 2.2.0(Jun 26, 2020)

  • 2.1.0(Jun 3, 2020)

    • Removed the printing of stacktraces to the command line (#83)
    • Add support for a "key" constraint to the "map" validator (#95)
    • Make any() accept anything (#93)
    • Empty data file should fail if schema requires something (#81)
    • Add a check for an empty schema file (#70)

    Source code(tar.gz)
    Source code(zip)
  • 2.0.1(Sep 10, 2019)

    Fixed a bug when using a schema with a static list and trying to validate a list with a missing element. https://github.com/23andMe/Yamale/issues/66

    Source code(tar.gz)
    Source code(zip)
  • 2.0(Aug 20, 2019)

    This release brings strict mode to Yamale. With strict mode, elements defined in your YAML that aren't specified in the schema will cause a validation error. You can also validate dynamic keys and include validators.

    We've bumped the version to 2.x due to an incompatibility. In the 1.x branch, if all the children of a node are optional, then the parent is optional as well. In the 2.x branch, the parent will no longer be optional in this case.

    Source code(tar.gz)
    Source code(zip)
  • 1.9.0(Mar 4, 2019)

  • 1.8.1(Feb 19, 2019)

  • 1.8.0(Nov 26, 2018)

  • 1.7.1(Nov 6, 2018)

  • 1.4.0(Mar 16, 2015)

  • 1.3.0(Feb 18, 2015)

  • 1.2.0(Nov 7, 2014)

  • 1.1.3(May 5, 2014)
