Static Typing for Python

Overview

Chat at https://gitter.im/python/typing

Documentation and Support

The documentation for Python's static typing can be found at typing.readthedocs.io. You can get help in our support forum or chat with us on Gitter.

Improvements to the type system should be discussed on the typing-sig mailing list, although the issues in this repository contain some historical discussions.

Repository Content

This GitHub repository is used for several things:

Historically, this repository hosted a backport of the typing module for older Python versions. The last released version, supporting Python 2.7 and 3.4, is available on PyPI.

Workflow

See CONTRIBUTING.md for more.

Comments
  • Proposal: syntax for variable and attribute annotations (PEP 526)

    Introduction

    This issue is reserved for substantive work on PEP 526, "Syntax for Variable and Attribute Annotations". For textual nits please comment directly on the latest PR for this PEP in the peps repo.

    I sent a strawman proposal to python-ideas. The feedback was mixed but useful -- people tried to poke holes in it from many angles.

    In this issue I want to arrive at a more solid specification. I'm out of time right now, but here are some notes:

    • Class variables vs. instance variables
    • Specify instance variables in class body vs. in __init__ or __new__
    • Thinking with your runtime hat on vs. your type checking hat
    • Importance of a: <type> vs. how it strikes people the wrong way
    • Tuple unpacking is a mess, let's avoid it entirely
    • Collecting the types in something similar to __annotations__
    • Cost of doing that for locals
    • Cost of introducing a new keyword

    Work in progress here!

    I'm updating the issue description to avoid spamming subscribers to this tracker. I'll keep doing this until we have reasonable discussion.

    Basic proposal

    My basic observation is that introducing a new keyword has two downsides: (a) choice of a good keyword is hard (e.g. it can't be 'var' because that is way too common a variable name, and it can't be 'local' if we want to use it for class variables or globals), and (b) no matter what we choose, we'll still need a __future__ import.

    So I'm proposing something keyword-free:

    a: List[int] = []
    b: Optional[int] = None
    

    The idea is that this is pretty easy to explain to someone who's already familiar with function annotations.

    Multiple types/variables

    An obvious question is whether to allow combining type declarations with tuple unpacking (e.g. a, b, c = x). This leads to (real or perceived) ambiguity, and I propose not to support this. If there's a type annotation there can only be one variable to its left, and one value to its right. This still allows tuple packing (just put the tuple in parentheses) but it disallows tuple unpacking. (It's been proposed to allow multiple parenthesized variable names, or types inside parentheses, but none of these look attractive to me.)

    There's a similar question about what to do about the type of a = b = c = x. My answer to this is the same: let's not go there; if you want to add a type you have to split it up.
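
    A minimal sketch of what the proposal allows and rejects (illustrative only; the commented-out forms are also SyntaxErrors under the eventual PEP 526):

    from typing import Tuple

    point: Tuple[int, int] = (0, 0)   # OK: tuple packing; one target, one value

    # Rejected under the proposal:
    #   a, b: int = 1, 2       # annotation combined with tuple unpacking
    #   a = b = c: int = 0     # annotation in a chained assignment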

    Omitting the initial value

    My next step is to observe that sometimes it's convenient to decouple the type declaration from the initialization. One example is a variable that is initialized in each branch of a big sequence of if/elif/etc. blocks, where you want to declare its type before entering the first if, and there's no convenient initial value (e.g. None is not valid because the type is not Optional[...]). So I propose to allow leaving out the assignment:

    log: Logger
    if develop_mode():
        log = heavy_logger()
    elif production_mode():
        log = fatal_only_logger()
    else:
        log = default_logger()
    log.info("Server starting up...")
    

    The line log: Logger looks a little odd at first but I believe you can get used to it easily. Also, it is again similar to what you can do in function annotations. (However, don't hyper-generalize. A line containing just log by itself means something different -- it's probably a NameError.)

    Note that this is something that you currently can't do with # type comments -- you currently have to put the type on the (lexically) first assignment, like this:

    if develop_mode():
        log = heavy_logger()  # type: Logger
    elif production_mode():
        log = fatal_only_logger()  # (No type declaration here!)
    # etc.
    

    (In this particular example, a type declaration may be needed because heavy_logger() returns a subclass of Logger, while other branches produce different subclasses; in general the type checker shouldn't just compute the common superclass because then a type error would just infer the type object.)

    What about runtime

    Suppose we have a: int -- what should this do at runtime? Is it ignored, or does it initialize a to None, or should we perhaps introduce something new like JavaScript's undefined? I feel quite strongly that it should leave a uninitialized, just as if the line was not there at all.

    Instance variables and class variables

    Based on working with mypy since last December I feel strongly that it's very useful to be able to declare the types of instance variables in class bodies. In fact this is one place where I find the value-less notation (a: int) particularly useful, to declare instance variables that should always be initialized by __init__ (or __new__), e.g. variables whose type is mutable or cannot be None.

    We still need a way to declare class variables, and here I propose some new syntax, prefixing the type with a class keyword:

    class Starship:
        captain: str                      # instance variable without default
        damage: int = 0                   # instance variable with default (stored in class)
        stats: class Dict[str, int] = {}  # class variable with initialization
    

    I do have to admit that this is entirely unproven. PEP 484 and mypy currently don't have a way to distinguish between instance and class variables, and it hasn't been a big problem (though I think I've seen a few mypy bug reports related to mypy's inability to tell the difference).
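
    (For context: the syntax ultimately adopted by PEP 526 spells class variables with a ClassVar marker rather than a class prefix. A sketch of the example above in that final notation:)

    from typing import ClassVar, Dict

    class Starship:
        captain: str                          # instance variable without default
        damage: int = 0                       # instance variable with default
        stats: ClassVar[Dict[str, int]] = {}  # class variable with initialization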

    Capturing the declared types at runtime

    For function annotations, the types are captured in the function's __annotations__ object. It would be an obvious extension of this idea to do the same thing for variable declarations. But where exactly would we store this info? A strawman proposal is to introduce __annotations__ dictionaries at various levels. At each level, the types would go into the __annotations__ dict at that same level. Examples:

    Global variables

    players: Dict[str, Player]
    print(__annotations__)
    

    This would print {'players': Dict[str, Player]} (where the value is the runtime representation of the type Dict[str, Player]).

    Class and instance variables:

    class Starship:
        # Class variables
        hitpoints: class int = 50
        stats: class Dict[str, int] = {}
        # Instance variables
        damage: int = 0
        shield: int = 100
        captain: str  # no initial value
    print(Starship.__annotations__)
    

    This would print a dict with five keys, and corresponding values:

    {'hitpoints': ClassVar[int],  # I'm making this up as a runtime representation of "class int"
     'stats': ClassVar[Dict[str, int]],
     'damage': int,
     'shield': int,
     'captain': str
    }
    

    Finally, locals. Here I think we should not store the types -- the value of having the annotations available locally is just not enough to offset the cost of creating and populating the dictionary on each function call.

    In fact, I don't even think that the type expression should be evaluated during the function execution. So for example:

    def side_effect():
        print("Hello world")
    def foo():
        a: side_effect()
        a = 12
        return a
    foo()
    

    should not print anything. (A type checker would also complain that side_effect() is not a valid type.)

    This is inconsistent with the behavior of

    def foo(a: side_effect()):
        a = 12
        return a
    

    which does print something (at function definition time). But there's a limit to how much consistency I am prepared to propose. (OTOH for globals and class/instance variables I think that there would be some cool use cases for having the information available.)

    Effect of presence of a: <type>

    The presence of a local variable declaration without initialization still has an effect: it ensures that the variable is considered to be a local variable, and it is given a "slot" as if it was assigned to. So, for example:

    def foo():
        a: int
        print(a)
    a = 42
    foo()
    

    will raise UnboundLocalError, not NameError. It's the same as if the code had read

    def foo():
        if False: a = 0
        print(a)
    

    Instance variables inside methods

    Mypy currently supports # type comments on assignments to instance variables (and other things). At least for __init__ (and __new__, and functions called from either) this seems useful, in case you prefer a style where instance variables are declared in __init__ (etc.) rather than in the class body.

    I'd like to support this, at least for cases that obviously refer to instance variables of self. In this case we should probably not update __annotations__.
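
    A minimal sketch of that existing comment style (the class and attribute names here are hypothetical):

    from typing import Optional

    class Account:
        def __init__(self, owner):
            # Mypy reads these comments as declarations of the instance variables' types.
            self.owner = owner   # type: str
            self.balance = None  # type: Optional[float]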

    What about global or nonlocal?

    We should not change global and nonlocal. The reason is that those don't declare new variables, they declare that an existing variable is write-accessible in the current scope. Their type belongs in the scope where they are defined.
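
    A small sketch of that rule (names hypothetical): the annotation lives in the defining scope, and the global statement itself carries no type.

    request_count: int = 0  # declared and annotated in the module scope

    def handle_request() -> None:
        global request_count  # no annotation here; the defining scope already has it
        request_count += 1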

    Redundant declarations

    I propose that the Python compiler should ignore duplicate declarations of the same variable in the same scope. It should also not bother to validate the type expression (other than evaluating it when not in a local scope). It's up to the type checker to complain about this. The following nonsensical snippet should be allowed at runtime:

    a: 2+2
    b: int = 'hello'
    if b:
        b: str
        a: str
    
    opened by gvanrossum 210
  • Decide how to handle str/unicode

    There's a long discussion on this topic in the mypy tracker: https://github.com/python/mypy/issues/1141

    I'm surfacing it here because I can never remember whether that discussion is here, or in the typeshed repo, or in the mypy tracker.

    (Adding str, bytes, unicode, Text, basestring as additional search keywords.)

    opened by gvanrossum 115
  • Define a JSON type

    JSON is such a common interchange format it might make sense to define it as a specific type.

    JSON = t.Union[str, int, float, bool, None, t.Mapping[str, 'JSON'], t.List['JSON']]
    

    Not sure if this should go into typing or be introduced as json.JSONType instead (or if it's even worth it considering the variability of the type).
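
    For illustration, a usage sketch of such a recursive alias (the serialize helper is hypothetical; the alias itself works at runtime, though type checkers only gained support for recursive aliases later):

    import json
    import typing as t

    JSON = t.Union[str, int, float, bool, None, t.Mapping[str, 'JSON'], t.List['JSON']]

    def serialize(value: JSON) -> str:
        return json.dumps(value)

    serialize({'name': 'guido', 'tags': ['python', 'typing'], 'score': 9.5})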

    topic: feature 
    opened by brettcannon 81
  • Proposal: signature copying for kwargs.

    There's a quite common pattern in Python code:

    def function(foo, *args, **kwargs):
        # do something with foo
        other_function(*args, **kwargs)
        # possibly do something else

    def other_function(color: str = ..., temperature: float = ..., style: Stylesheet = ...,
                       timeout: Optional[int] = ..., database_adaptor: Adaptor = ...,
                       strict: bool = ..., output: IO[str] = ..., allow_frogs: bool = ...,
                       mode: SomeEnum = ...):
        # do something with a lot of options
        ...
    

    (A usual subcase of this one is when other_function is actually super().function.) This presents two problems for a static analyzer:

    • the call from function to other_function cannot be type-checked properly because of the *args, **kwargs in the call arguments.
    • there is no sensible way to annotate function, so calls to it are unchecked.

    The problem also affects readability of the code for me (which is one of the main problems that annotations try to address). James Powell from NumFOCUS even gave a PyData talk about the difficulties it brings: https://www.youtube.com/watch?v=MQMbnhSthZQ

    Even if, theoretically, the args/kwargs packing feature of Python can be used with more or less arbitrary data, IMO this use case is common enough to warrant some special treatment. I was thinking of a way to flag this usage, for example (a runtime sketch follows after the list below):

    @delegate_args(other_function)
    def function(foo, *args, **kwargs):
        other_function(*args, **kwargs)
    

    This could hint to an analyzer that:

    • On calls to function, the "extra" arguments are checked to match the signature of other_function
    • The call to other_function is considered valid given that it uses the same arguments (I know that the code above could have modified the content of kwargs, but it's still more checking than what we have now).
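
    At runtime such a marker could simply be a no-op; a minimal sketch (the decorator name and semantics are the proposal's, the implementation is assumed):

    from typing import Any, Callable, TypeVar

    F = TypeVar("F", bound=Callable[..., Any])

    def delegate_args(target: Callable[..., Any]) -> Callable[[F], F]:
        """Does nothing at runtime; exists purely as a signal to a static analyzer."""
        def decorator(func: F) -> F:
            return func
        return decorator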

    For me, even without a static analyzer, the readability benefits of seeing

    @delegate_args(matplotlib.pyplot.plot)
    def plot_valuation(ticker_symbol: str, start: date, end: date, *args, **kwargs): ...
    

    and knowing that plot_valuation accepts any valid arguments from matplotlib's plot function, are worth it.

    opened by dmoisset 66
  • Can we have a generic Type[C]?

    The type object occupies a rather strange place in the type hierarchy:

    >>> type(type) is type
    True
    

    (I'm pretty sure that's a flat lie, since you can't instantiate something from itself, but regardless...)

    >>> isinstance(type, object)
    True
    >>> isinstance(object, type)
    True
    >>> isinstance(type, type)
    True
    

    In Java, the (very rough) equivalent is a class, specifically Class<T>. It's also generic; the type variable refers to the instance. Java has it easy because they don't support metaclasses. Classes are not first class in Java, so their type hierarchy doesn't have to deal with the strange loops shown above.

    I realize metaclasses in their full generality are out of scope (at least for now), but a generic Type[T] like Java's would be nice to have. So far as I can tell from the Mypy documentation, it doesn't currently exist.

    Here's some example code which might like to have this feature:

    def make_foo(class_: Type[T]) -> T:
        # Instantiate and return a T (assuming a zero-argument constructor)
        return class_()
    
    opened by NYKevin 65
  • Kill __subclasscheck__

    Mark Shannon wants me to drop __subclasscheck__ from all type objects (Any, Union etc.). This is somewhat major surgery since it is used by Union simplification and for various type checks. See also #133 and #135.

    opened by gvanrossum 59
  • Allow variadic generics

    C++11 recently introduced the notion of variadic templates, which I believe Python could benefit from in a simplified form.

    The idea is that you can have a generic class with a variable number of type variables, like typing.Tuple has. Here is a real-world variadic-generic class which is not Tuple; it is variadic in TagVar. As you can see, TagVar only appears in tuple contexts. Those tuples are sometimes heterogeneous (Caution: annotations are a homegrown 3.4-compatible mishmash of nonsense), so repeating TagVar as shown is actually incorrect (but it's the closest approximation I could find).

    Here's one possible syntax:

    class MultiField(AbstractField[GetSetVar], Generic[(*TagVar,)]):
        def __init__(self, nbt_names: ty.Sequence[str], *,
                     default: GetSetVar = None) -> None:
            ...

        @abc.abstractmethod
        def to_python(self, *tags: (*TagVar,)) -> GetSetVar:
            ...

        @abc.abstractmethod
        def from_python(self, value: GetSetVar) -> ty.Tuple[(*TagVar,)]:
            ...
    

    This is syntactically valid in Python 3.5 (if a bit ugly with the parentheses and trailing comma, which cannot be omitted without language changes), but doesn't currently work because type variables are not sequences and cannot be unpacked. It could be implemented by adding something like this to the TypeVar class:

    def __iter__(self):
        yield StarredTypeVar(self)
    

    StarredTypeVar would be a wrapper class that prefixes the repr with a star and delegates all other functionality to the wrapped TypeVar.

    Of course, syntax isn't everything; I'd be fine with any syntax that lets me do this. The other immediately obvious syntax is to follow the TypeVar with an ellipsis, which conveniently does not require changes to typing.py. However, that might require disambiguation in some contexts (particularly since Tuple is likely to be involved with these classes).
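
    (For context: variadic generics eventually landed in PEP 646 as TypeVarTuple. A much-simplified sketch of the class above in that final notation, assuming Python 3.11+:)

    from typing import Generic, TypeVar, TypeVarTuple, Unpack

    GetSetVar = TypeVar("GetSetVar")
    Ts = TypeVarTuple("Ts")

    class MultiField(Generic[GetSetVar, Unpack[Ts]]):
        def to_python(self, *tags: Unpack[Ts]) -> GetSetVar: ...
        def from_python(self, value: GetSetVar) -> tuple[Unpack[Ts]]: ...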

    topic: feature 
    opened by NYKevin 58
  • Type for heterogeneous dictionaries with string keys

    I've recently been reading Python code where heterogeneous dictionary objects are used a lot. I mean cases like this:

    foo({'x': 1, 'y': 'z'})
    

    The value for key 'x' must be an integer and the value for key 'y' must be a string. Currently there is no precise way of specifying the type of the argument to foo in the above example. In general, we'd have to fall back to Dict[str, Any], Mapping[str, Union[int, str]] or similar. This loses a lot of information.

    However, we could support such types. Here is a potential syntax:

    def foo(arg: Dict[{'x': int, 'y': str}]) -> ...: ...
    

    Of course, we could also allow Dict[dict(x=int, y=str)] as an equivalent. I don't really love either syntax, though.

    Alternatively, we could omit Dict[...] as redundant:

    def f(arg: dict(x=int, y=str)) -> ...
    

    Using type aliases would often be preferred:

    ArgType = Dict[{'x': int, 'y': str}]
    
    def f(arg: ArgType) -> ...
    

    These types would use structural subtyping, and missing keys could plausibly be okay. So Dict[dict(x=int, y=str)] could be a subtype of Dict[dict(x=int)], and vice versa (!).

    Maybe there should also be a way of deriving subtypes of heterogeneous dictionary types (similar to inheritance) to avoid repetition.

    Maybe we'd also want to support Mapping[...] variants (for read-only access and covariance).

    Some existing languages have types resembling these (at least Hack and TypeScript, I think).
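
    (For context: this need was later met by TypedDict, PEP 589. The example above in that eventual spelling, with keys required by default:)

    from typing import TypedDict

    class Arg(TypedDict):
        x: int
        y: str

    def foo(arg: Arg) -> None: ...

    foo({'x': 1, 'y': 'z'})  # checked structurally: keys and value types must match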

    opened by JukkaL 56
  • Mark Shannon's presentation at the 2017 Language Summit

    @ilevkivskyi @markshannon.

    Mark observed that the typing module uses classes to represent types. This can be expensive, since e.g. the type List[int] really ought to be the tuple (List, int) but it's actually a class object which has a fair amount of overhead (though not as much as early versions of typing.py, since we now cache these class objects).

    If we changed to tuples (or at least to objects simpler than class objects), we'd have a problem: the simpler object couldn't be subclassed. But who subclasses List[int]? Then again, maybe simpler objects aren't the point?

    Mark also pointed out that after

    from typing import List
    class C(List[int]): pass
    print(C.__mro__)
    

    We find that C.__mro__ has 17 items!

    I confirmed this. The roughly equivalent code using collections.abc

    from collections.abc import MutableMapping
    class C(MutableMapping): pass
    print(C.__mro__)
    

    has only 7 items. And subclassing builtins.list

    class C(list): pass
    print(C.__mro__)
    

    has only three.

    This affects performance, e.g. which is faster?

    from typing import Sequence

    class C(list, Sequence[int]): pass
    C().append(1)

    class D(Sequence[int], list): pass
    D().append(1)
    

    One append() call is 10% faster than the other.

    opened by gvanrossum 53
  • Should we change PEP 484 to disable implicit Optional when default = None?

    CC: @ddfisher @JukkaL @markshannon

    PEP 484 currently says:

    An optional type is also automatically assumed when the default value is None, for example::

      def handle_employee(e: Employee = None): ...
    

    This is equivalent to::

      def handle_employee(e: Optional[Employee] = None) -> None: ...
    

    This was intended to save some typing in a common case, but I've received strong feedback from some quarters that this is inconsistent and a bad idea. There are other places where None is allowed, but none of them automatically add Optional (to the contrary).

    So far it hasn't mattered much for mypy users because mypy doesn't have Optional support, but that's soon going to change (the --strict-optional flag is becoming more reliable), so if we're going to change this, now would be a good time. Thoughts? If we don't change this soon it will probably be too late.
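
    For illustration, the two spellings at stake (under the proposed change, a strict-optional checker would flag the implicit form and require the explicit one):

    from typing import Optional

    class Employee: ...

    def handle_employee(e: Employee = None) -> None: ...             # implicit: would become an error
    def handle_employee2(e: Optional[Employee] = None) -> None: ...  # explicit: always fine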

    opened by gvanrossum 52
  • Proposal: Generalize `Callable` to be able to specify argument names and kinds

    Right now you can specify callables with two patterns of arguments (shown here by example):

    • Callable[..., int] takes in any arguments, any number.
    • Callable[[int, str, bool], int] takes in a predetermined number of required positional arguments, none of which have names specified.

    These don't cleanly match the actual types of callable objects in Python. Argument names, whether arguments are optional, and whether arguments are *args or **kwargs do affect the type of a callable. We should be able to spell these things in the type language. Doing so would enable us to correctly write the types of callback functions, for example.

    Callable should take two arguments: an argument list and a return type. The return type is exactly as currently described in PEP 484. The argument list is either:

    • ..., indicating the function can take any arguments at all.
    • Square brackets around a comma-separated series of argument specifiers (argspec for short), indicating particulars about the function's arguments

    An argument specifier is one of:

    • A bare type TYP. This has the same meaning as Arg(TYP)
    • Arg(type, name=None), indicating a positional argument. If the name is specified, the argument must have that name.
    • OptionalArg(type, name=None), indicating an optional positional argument. If the name is specified, the argument must have that name. (An alternate name possibility: OptArg(type, name=None).)
    • StarArg(type), indicating a "star argument" like *args
    • KwArg(type), indicating a "double star argument" like **kwargs. (An alternate name here would be Star2Arg(type).)

    The round parens are an indication that these are not actual types but rather this new argument specifier thing.

    Like the rules for Python function arguments, all positional argspecs must come before all optional argspecs, which must come before zero or one star argspec, which must come before zero or one kw argspec.

    This should be able to spell all function types you can make in python by defining single functions or methods, with the exception of functions that need SelfType to be properly specified, which is an orthogonal concern.

    Some statements I think are true:

    • Callable[[Arg(T1, name='foo'), Arg(T2, name='bar')], R] is a subtype of Callable[[T1, T2], R]
    • Callable[[T1, OptionalArg(T2)], R] is a subtype of Callable[[T1], R]
    • Callable[[StarArg(T1)], R] is a subtype of Callable[[], R] and is also a subtype of Callable[[T1], R] and is also a subtype of Callable[[T1, T1], R] and so on.
    • Callable[[T1, StarArg(T1)], R] is a subtype of Callable[[T1], R] and is also a subtype of Callable[[T1, T1], R] and so on, but is not a subtype of Callable[[], R]
    • If we want to be able to spell overloaded function types we'll need a specific way of combining callable types in the specific way overloading works; this proposal doesn't address that.
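
    (For illustration: mypy later shipped an experimental form of this proposal in the mypy_extensions package, with OptionalArg and StarArg renamed to DefaultArg and VarArg. A sketch using that package, assuming it is installed:)

    from typing import Callable
    from mypy_extensions import Arg, DefaultArg, KwArg, VarArg

    # A callback with a required named argument, an optional one, then *args/**kwargs:
    Callback = Callable[
        [Arg(int, 'x'), DefaultArg(str, 'label'), VarArg(float), KwArg(bool)],
        None,
    ]
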
    opened by sixolet 48
  • Allow `NotRequired[]` to be passed as a `TypeVar`

    I just found myself wanting to do the following:

    from typing import TypedDict, NotRequired, Generic, TypeVar, Union, Never
    
    _T = TypeVar('_T')
    
    Foo = Union['FooSub1[_T]', 'FooSub2[_T]', 'FooSub3[_T]']
    
    class BaseFoo(TypedDict, Generic[_T]):
        bar: _T
    
    class FooSub1(BaseFoo[_T]):
        sub1: str
    
    class FooSub2(BaseFoo[_T]):
        sub2: str
    
    class FooSub3(BaseFoo[_T]):
        sub3: str
    
    def foo(foo_with_bar: Foo[int]):
        reveal_type(foo_with_bar) # Type of "foo_with_bar" is "FooSub1[int] | FooSub2[int] | FooSub3[int]"
    
    def bar(foo_without_bar: Foo[NotRequired[Never]]): # error: "NotRequired" is not allowed in this context
        pass
    

    Mypy and Pyright both disallow passing NotRequired as a TypeVar. However, the only way to avoid that would be to duplicate all subclass definitions, which I am not willing to do: I have more than a dozen, and that much duplication would make the code less maintainable.

    topic: feature 
    opened by not-my-profile 2
  • Typing decorators which enrich a class

    Hi

    I would like to type a decorator which takes as input a class and returns the same class with added attributes, i.e., a subclass. Example:

    import typing
    
    _T = typing.TypeVar("_T")
    
    
    def config(*, prefix: str) -> typing.Callable[[typing.Type[_T]], ???]:
        def wrap(cls: typing.Type[_T]) -> ???:
            class Subclass(cls):  # or could do cls.prefix = prefix instead of subclassing
                prefix: str = prefix
    
            return Subclass
    
        return wrap
    
    
    @config(prefix="test")
    class A:
        ...
    
    
    print(A.prefix)
    

    Been searching for an answer and experimenting for a few days now and can't find anything concrete. Any help would be much appreciated.
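
    One partial workaround (a sketch; it assumes you can live with the checker not knowing about the added attribute) is to declare that the decorator returns the input class unchanged:

    import typing

    _T = typing.TypeVar("_T")

    def config(*, prefix: str) -> typing.Callable[[typing.Type[_T]], typing.Type[_T]]:
        def wrap(cls: typing.Type[_T]) -> typing.Type[_T]:
            cls.prefix = prefix  # type: ignore[attr-defined]  # invisible to the checker
            return cls
        return wrap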

    Thank you

    topic: feature 
    opened by lijok 1
  • Type hinting a non-iterator (non-consumable) iterable

    I want to annotate a function taking an iterable, that is used multiple times. For example:

    from typing import Iterable

    def func(thing: Iterable[str]) -> None:
        for _ in range(10):
            for x in thing:
                do_thing(x)  # do_thing(): some per-item processing
    

    Iterator is a valid Iterable, but passing an Iterator to the function will lead to bugs (it is consumed completely on the first pass, so it is empty on all subsequent iterations). This could potentially be annotated using something like Intersection[Iterable[str], Not[Iterator]] if #213 is accepted, but that's still a workaround. It would be great if there were a type that allows any Iterable that cannot be consumed, like a list, but not the iterator of a list.

    When searching, the only thing I found was an unanswered SO question.
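
    One runtime mitigation (a sketch only; it does not answer the static-typing question) is to reject one-shot iterators explicitly:

    from collections.abc import Iterator
    from typing import Iterable

    def func(thing: Iterable[str]) -> None:
        # An Iterator would be exhausted after the first pass, so refuse it up front.
        if isinstance(thing, Iterator):
            raise TypeError("func() requires a re-iterable, not a one-shot iterator")
        for _ in range(10):
            for x in thing:
                print(x)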

    topic: feature 
    opened by GideonBear 6
  • More consistency in inference rules between type checkers

    I've recently added type annotations to a large library, and have been checking my code using both mypy and pyright. While doing so, I noticed many differences between mypy and pyright in the types they choose to infer. Each type checker has a justification for its choices, but as a user this situation is frustrating, because I rely a lot on inference. It led to a situation where I frequently had to think about "what pyright would do" and "what mypy would do", and scour their issue trackers to understand what's going on: what's a bug and what's a "feature".

    I totally understand that each type checker has been developed independently and influenced by different needs and design choices. I also understand that the ecosystem is very much in flux. I have seen authors of type checkers justify their choices, and rightfully so. Nonetheless, I think it would be a big benefit to the community to specify type inference rules more fully (a PEP?). If typing is seen as part of the Python language (in various PEPs), and type inference is seen as a feature of typing, then that feature should behave consistently.

    I assume it would take much work to define inference rules and reach an agreement that works for all type checkers, and probably considerable work to implement the necessary changes. However, I believe it's beneficial in the long run, especially since, as time goes by, more backwards-compatibility concerns will just pile up.

    Areas where I have noticed considerable differences include redefinitions (mypy's allow_redefinition is only partially consistent with pyright's default behavior), Literals, union vs. join, and overloads. Maybe others I can't recall.

    I'd like to hear what others think about this topic.

    topic: other 
    opened by matangover 8
  • Suggestion: Allow free type variables in type variable bounds that permeate beyond

    Suppose we have the following code:

    from __future__ import annotations

    from dataclasses import dataclass
    from typing import Generic, Protocol, TypeVar

    T_co = TypeVar("T_co", covariant=True)

    class CanProduce(Protocol[T_co]):
        def produce(self) -> T_co:
            ...

    T_in = TypeVar("T_in")
    U = TypeVar("U")

    @dataclass
    class Container(Generic[T_in]):
        value: T_in

        def produce_from_value(self: Container[CanProduce[U]]) -> U:
            return self.value.produce()
    

    This code has an undesirable property: Since T_in is invariant, produce_from_value can only be called on instances of Container whose type parameter is exactly CanProduce[U], for some U:

    class IntProducer(CanProduce[int]):
        def produce(self) -> int:
            return 42
    
    c: Container[IntProducer] = Container(IntProducer())
    c.produce_from_value()  # this produces a type error
    

    It would be ideal if we could communicate that in this method (which need not be an instance method; it could be a standalone function, too), the type parameter behaves as though it were covariant, where any subtype of CanProduce[U] is valid.

    In fact, there exists a mechanism for doing this in other situations. Consider if we have the following non-generic protocol:

    class SupportsIndex(Protocol):
        def __index__(self) -> int:
            ...
    

    We can then make a method index_from_value that works with all subtypes of SupportsIndex:

    SI = TypeVar("SI", bound=SupportsIndex)
    
    @dataclass
    class Container(Generic[T_in]):
        value: T_in
        
        def index_from_value(self: Container[SI]):
            return self.value.__index__
    

    Now, the following is valid:

    c: Container[int] = Container(27)
    c.index_from_value()
    

    Unfortunately, this method cannot be used with generic protocols, because the following is invalid:

    U = TypeVar("U")
    CPU = TypeVar("CPU", bound=CanProduce[U])  # this produces a type error
    

    I suggest altering the restriction to allow this trick to work.

    topic: feature 
    opened by schuelermine 0
  • Annotations for Type factories

    Context: at my company, we have a widely used framework that, at the time of writing, didn't treat good static type hints for framework users as one of its design objectives.

    It makes use of the "type factory" pattern, which can be illustrated with the following (much simplified) example:

    # framework code
    
    # `make_int_in_range_class` is a "type factory" method
    def make_int_in_range_class(lower: int, upper: int):
        # imagine here some very elaborated machinery that constructs the type dynamically
        class IntInRange(int):
            def __init__(self, v: int) -> None:
                if v < lower or v > upper:
                    raise ValueError("not in range")
                self.v = v
            
            # ... plus many more methods, like:
            def custom_serialization(self) -> bytes:
                return b"foo"
        
        return IntInRange
    
    # user code in another file
    
    MyIntInRange = make_int_in_range_class(0, 10)  # Create the `MyIntInRange` Type
    
    def foo(x: MyIntInRange) -> None:  # use `MyIntInRange` type in the annotation 
        print(x)
    
    foo(MyIntInRange(4))  # example usage
    

    When I run mypy on this code I'm rightfully getting

    demo.py: note: In function "foo":
    demo.py:12:12: error: Variable "robotypes_toy_generic.demo.MyIntInRange" is not valid as a type  [valid-type]
        def foo(x: MyIntInRange) -> None:
                   ^
    demo.py:12:12: note: See https://mypy.readthedocs.io/en/latest/common_issues.html#variables-vs-type-aliases
    Found 1 error in 1 file (checked 1 source file)
    

    Note that Pyright seems to be more permissive here and doesn't error out, but this seems to be non-standard behavior from the PEPs' point of view.

    The goal of having the type hint in the first place is twofold:

    1. Documentation.
    2. We can't yet afford to enable the check_untyped_defs = True flag globally (too many errors). But I'd like to remove one obstacle to getting type-check coverage in new code, so it's desirable to have the type hints (however poor they may be). And I'd like to avoid excessive use of Any or type: ignore[untyped-def].

    Ideally, I'd like to have some syntax to tell any type checker that make_int_in_range_class produces a valid type (let's say even Any to make things simple, but maybe it could be some Protocol).

    I was not able to find a good way of doing it short of asking ALL USERS to write some typing lie like

    if TYPE_CHECKING:
      MyIntInRange = Any
    else:
      MyIntInRange = make_int_in_range_class(0, 10)  # Create the `MyIntInRange` Type
    

    This is kind of a sad solution and also we have something like 1000 call sites that would need to be updated like that. So I'm looking for advice on how this could be addressed on the framework level OR if people think it's not too fringy, maybe we could add a new feature in typing for that.

    I was imagining that it could be possible to make something like this work

    def make_int_in_range_class(lower: int, upper: int) -> Type[Any]:
        ...
    
    topic: feature 
    opened by vors 7