Comments by leycec

All comments ranked by humor rating

leycec · 19 days ago · beartype/beartype

...heh. Fortunately for my scarce time with which I'd rather be bashing goblin mobs in some nameless trashy JRPG, @beartype isn't wrong. Whereas pure static type-checkers like `mypy` and `pyright` hallucinate literally everything like some delusional AI with the "Delusion" slider set to 11, @beartype doesn't hallucinate anything.

Behold! @beartype is right about everything:

$ python3.13
>>> print(isinstance(list[int], type))
False

See? Score 1 for @beartype. Score 0 for `mypy` and `pyright`. `list[int]` isn't a type; it's a type hint, but that's not at all the same as a type. Interestingly, the upcoming Python 3.14 release will allow you to type-check type hints in a sane way that satisfies @beartype, `mypy`, and `pyright` alike:

from beartype.door import is_bearable
from beartype import beartype
from typing import (
    Any,
    TypeForm,  # <-- Python 3.14 magic, yo!
)

@beartype
def checked_cast[T](typ: TypeForm[T], val: Any) -> T:  # <-- Truly magic! Truly joy!
    ...
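
For the impatient: `TypeForm` itself adds no runtime behaviour (it only tells static type-checkers what `typ` is), so a working body still needs an actual runtime check. A minimal sketch of one possible body, using @beartype's public door API:

from beartype.door import is_bearable

def checked_cast[T](typ, val: object) -> T:
    # Delegate the actual runtime check to beartype; "typ" is any type hint.
    if not is_bearable(val, typ):
        raise TypeError(f'{val!r} violates type hint {typ!r}')
    return val

print(checked_cast(list[int], [1, 2, 3]))  # <-- [1, 2, 3]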

Interestingly, your exact use case is covered by this section of PEP 747:

There is currently no way to indicate to a type checker that a function accepts type form objects and knows how to work with them. `TypeForm` addresses this limitation. For example, here is a function that checks whether a value is assignable to a specified type and returns `None` if it is not:

def trycast[T](typx: TypeForm[T], value: object) -> T | None: ...

Pretty much your exact use case, right? I know. PEP 747 is baller.

Let's pivot this issue into a feature request for PEP 747 support in @beartype. @beartype basically already internally supports PEP 747 via our private `beartype._util.hint.is_hint()` tester. So adding PEP 747 support to @beartype should really just be a day or two of work for us.

Sadly, we're currently blocked on even getting started on this. Why? Because neither `typing.TypeForm` nor `typing_extensions.TypeForm` exists yet. Actually, the latter technically exists but has yet to be published in a stable release of `typing_extensions`. Somebody ping me when `TypeForm` exists somewhere, please!

Until then, I'll be bashing goblin mobs in some nameless trashy JRPG if you need me. 👺 ⚔ ☠

leycec · 19 days ago · beartype/beartype

...heh. `typing_extensions`, huh? I should probably preface everything that follows by admitting that I adhere to an altogether different development philosophy from most projects – open-source or otherwise. Notably:

  • @beartype implements what is fun. If it ain't fun, @beartype prolly ain't gonna do it. Implementing support for unofficial third-party `typing_extensions` attributes that (A) nobody in the @beartype userbase particularly cares about (as evidenced by the lack of vociferous users threatening to burn down this issue tracker while wielding flame torches precariously perched on pitchforks) and (B) play poorly with their official `typing` equivalents at runtime... doesn't sound very fun. That's pretty much the diametric opposite of fun. If dead-tree dictionaries still existed, the entry for "anti-fun" would surely read:

    Anti-fun, adjective: Implementing support for unofficial third-party `typing_extensions` attributes.

  • @beartype implements what is practical. Usually, what is fun is what is practical. If it's impractical, it's often also opposed to fun. This issue only proves the equivalence. Over 80 real issues threatening to burn down this issue tracker means I have to triage, sadly. Impractical issues inevitably get thrown under the Bus of Time™. Superficially, this seems like such an issue. Is this issue actually impractical? No idea. If no one else complains, it's impractical; else, it's practical. Thankfully, the dictionary entry for "practicality" surely reads:

    Practicality, noun: When multiple living beings that are not AI bots all agree @beartype is doing something bad, there is a practical problem. ~~Curse~~ Bless the humans! 😯

  • `typing_extensions` has been in Development Hell™ for nearly a year. Their last stable release was nearly a year ago, people. Please let that sink in. `typing_extensions` is not a project committed to runtime sanity, stability, or usability.

  • `typing_extensions` fails to conform to the runtime API of the standard `typing` module. Seriously. How annoying is that? Really annoying, guys. Supporting `typing` itself across all actively maintained Python versions is hard enough. Now try adding `typing_extensions`, which fails to export a conformant API, into that mix. Anti-fun impracticality intensifies.

Ultimately, what we have here is a philosophical difference in bad opinions. Because supporting `typing_extensions` is painful, @beartype just does the bare minimum to support the proper subset of `typing_extensions` that users actually care about.

"Let laziness prevail!" — the @beartype motto, probably

Let's talk specifics. Because... why not? What else could possibly go wrong!? 😫

`typing.Any` is different from `typing_extensions.Any`

lolbro. `typing_extensions` is fundamentally busted. You have just demonstrated the irrefutable truth, and it pains us all. Let's not sane-wash a package just because we like that package. This is insane behaviour. Also, this is 100% a critical bug in `typing_extensions`. Period. There is no reasonable argument whereby this could be called anything except a bug.

There is no justifiable reason for `typing.Any` to differ from `typing_extensions.Any`. Those singletons should be the exact same objects. In fact, the same principle of runtime conformance applies to all `typing` attributes. `typing_extensions` should do its utmost to ensure a one-to-one mapping between `typing` and `typing_extensions` attributes by:

  • Immediately dropping support for all unmaintained Python versions. Python 3.8 has already hit its End-of-Life (EOL) and thus constitutes a security risk. Supporting unmaintained Python versions just increases everyone's attack surface, which is already large enough. So, `typing_extensions` should drop support for Python ≤ 3.8 (if it hasn't already). Obviously. That's obvious to everyone. Right?
  • Trivially importing all `typing` attributes guaranteed to exist in Python 3.9 into `typing_extensions`. This includes `typing.Any` and a whole boatload of other attributes (see the sanity check just below this list). Right? This is sensible, obvious, and just mandatory to preserve sanity across third-party runtime introspection of type hints.
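
The desired one-to-one mapping is trivially testable, too. A minimal sanity check (which, per the behaviour reported above, fails under the affected `typing_extensions` releases):

import typing
import typing_extensions

# Backported attributes should be the *same* objects as their standard-library
# equivalents whenever those equivalents already exist.
print(typing.Any is typing_extensions.Any)  # <-- should print True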

Likewise...

`HintPep695TypeAlias` should probably drop the Python 3.12 conditional, as the `typing_extensions.TypeAliasType` backport is perfectly usable in earlier Python versions.

@beartype has no mandatory dependencies. This includes `typing_extensions`, which @beartype generally regards as a mostly distasteful API that never works quite right and was never really intended to be usable at runtime. CPython devs only really intended the `typing` module to be consumed by pure static type-checkers like `mypy` and `pyright`. They consider @beartype to be profane, which... you know. That's fair enough. @beartype is profane. They're not wrong. But @beartype is only profane because so much of the `typing` ecosystem is hostile to runtime type-checking. @beartype has to behave profanely. I had no choice, I swear! 😭

Oh, wells. You type-check with the API you have – not the API you wish you had.

Which leads us to...

However, the next `typing_extensions` release will include a different version, meaning `typing.TypeAliasType is not typing_extensions.TypeAliasType` is `True`!

Guh! Awful intensifies. The body blows just keep coming.

Honestly, I don't even see the point of trying to backport PEP 695. You literally can't do that, because PEP 695 requires low-level changes to CPython's Parsing Expression Grammar (PEG)-based parser. If you take away those changes, you take away the entirety of PEP 695 and are left with... nuthin'.

The PEP 695-compliant `type` statement does tons of cool stuff under Python ≥ 3.12 that you literally can't backport, like:

  • Recursive type hints, which require lazy `type` statement evaluation (a quick sketch follows this list).
  • Fake poor man's unquoted forward references, which also require lazy `type` statement evaluation.
  • Implicit instantiation of `TypeVar`, `TypeVarTuple`, and `ParamSpec` objects. Syntactic sugar finally makes type variables usable by a wide audience. Rejoice! 🥂
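
To make the "can't backport this" point concrete, here's a quick sketch of the first and third bullets. None of this even parses under Python ≤ 3.11:

# Python ≥ 3.12 only. The "type" statement is lazily evaluated, so recursive
# (and forward) references just work without quoting anything.
type Json = None | bool | int | float | str | list[Json] | dict[str, Json]

# PEP 695 syntax also implicitly instantiates the type variable "T" -- no
# explicit TypeVar("T") boilerplate required.
def first[T](items: list[T]) -> T:
    return items[0]

print(first([1, 2, 3]))  # <-- 1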

Take all those things away and what are you actually left with? Not much. I mean, is anything left? Does the PEP 695 backport do anything at all? Even if the PEP 695 backport did one small thing, do @beartype users actually care about that one small thing? Wouldn't @beartype users who actually care about PEP 695 (...which is an absolutely huge time sink just to read, grok, and integrate into their workflows) just require Python ≥ 3.12 as a basic sanity check and use the `type` statement instead? Everyone pretty much wants to require Python ≥ 3.11 anyway, thanks to Python 3.11's miraculous speedup. At that point, it's only a small moon jump from there to Python 3.12.

I dunno, bro. All this kinda seems like massive bike-shedding and bureaucratic make-work for little to no tangible, practical, real-world gain. But... maybe that's just my crabby cabin fever talking. It's a snow blizzard outside and I don't get to ski again until tomorrow morning. <sup>...how can a man live like this!?</sup>

Until then, my surly crankiness surely can't be held against me.

<sup>average winter's day in the @leycec household</sup>

<sup>even @beartype doesn't know what's going on here</sup>

leycec · 29 days ago · beartype/beartype

Resolved by 91f2a83cb222d. Thanks so much for the detailed writeup, @rg936672. Unsurprisingly, you were right about everything. Although this technically isn't a regression of #512, it practically sure feels like one. To ensure this never happens again, I've thoroughly audited the @beartype codebase for anything even remotely smelling like this issue. And... wouldn't you know it!?!

Sure enough. @beartype was making the same erroneous assumption (i.e., that third-party containers define sane `__bool__()` dunder methods) in multiple locations. I've resolved everything that `ripgrep` and a hastily concocted regex enabled me to find, which should be everything. The term "should" is doing a lot of heavy lifting here. 😅

I'm releasing these resolutions as @beartype 0.20.2 tonight. Thanks for sticking it out with @beartype – despite @beartype's occasional lapse in judgement. @beartype meant well. Someday, that will be good enough.

Let me know if your awesome team of crack specialists stumbles into any other explosive @beartype blockers. Until then, may the Creepy Fractal Toroid of Graph Theoretic Enlightenment be with us all.

<sup>the creepy fractal toroid of graph theoretic enlightenment is most pleased with this issue resolution</sup>

leycec · 30 days ago · beartype/beartype

Fascinating question! I was sure we had an open feature request regarding this topic, but... nope. Therefore, let's do this.

You have a couple options here. When you want to tell static type-checkers (like `mypy` and `pyright`) one thing but runtime type-checkers (like @beartype and @pydantic) another thing, your first thought should always be...

`typing.TYPE_CHECKING`: Lying to Everybody Since 2014

The standard `typing.TYPE_CHECKING` boolean allows you to selectively tell each different kind of type-checker a different kind of perfidious lie. In Python, you can never lie enough.

Behold! Lies, lies, and more lies:

from beartype import beartype
from typing import TYPE_CHECKING, Sequence, TypeVar
import ray
from ray._private.worker import RemoteFunction0  # <-- this is the 🦆

T = TypeVar("T")
R = TypeVar("R")

# If this module is currently being statically type-checked,
# define type hints suitable for static type-checking. Yowza.
if TYPE_CHECKING:
    RemoteFunction0RT = RemoteFunction0[R, T]
# Else, this module is currently being runtime type-checked.
# Define type hints suitable for runtime type-checking! Yikes.
else:
    # Ignore this type hint at runtime, because the third-party
    # "ray._private.worker.RemoteFunction0" type hint factory
    # fails to support runtime type-checking. YIIIIIIIIIIIKES.
    RemoteFunction0RT = object  # <-- lolbro


@beartype
def map_records(
    fn: RemoteFunction0RT,
    records: Sequence[T]
) -> R:
    ...


@ray.remote
def mapper(item: int) -> str:
    return str(item)


# this type checks because @ray.remote is typed to return RemoteFunction0
# but fails at runtime because `beartype.infer_hint(mapper)` is actually just Callable
# (or for other similar reasons)
string_items = map_records(mapper, [0, 1, 2, 3])

In theory, that should work. If either `mypy` or `pyright` complain about that, you might need to annotate the above `RemoteFunction0RT` alias as a `typing.TypeAlias`: e.g.,

if TYPE_CHECKING:
    from typing import TypeAlias
    RemoteFunction0RT: TypeAlias = RemoteFunction0[R, T]

But what if you hate that? What if you really, really hate that? You still have options. For example...

`beartype.BeartypeHintOverrides`: Still Awful After All These Years

Sadly, I never got around to properly documenting @beartype hint overrides. Time never permitted; code kept breaking; issues kept exploding. These are the days of our coding lives.

Nonetheless, @beartype hint overrides work perfectly well (probably) for just this sort of icky situation. The tl;dr here is that you can tell @beartype to internally treat certain type hints like certain other type hints. Specifically, tell @beartype to treat `RemoteFunction0[R, T]` as an ignorable `object` type hint:

from beartype import beartype, BeartypeConf, BeartypeHintOverrides
from typing import Sequence, TypeVar
import ray
from ray._private.worker import RemoteFunction0  # <-- this is the 🦆

T = TypeVar("T")
R = TypeVar("R")


# Ignore this type hint at runtime, because the third-party
# "ray._private.worker.RemoteFunction0" type hint factory
# fails to support runtime type-checking. YIIIIIIIIIIIKES.
@beartype(conf=BeartypeConf(hint_overrides=BeartypeHintOverrides({
    RemoteFunction0[R, T]: object})))  # <-- lolbro
def map_records(
    fn: RemoteFunction0[R, T],
    records: Sequence[T]
) -> R:
    ...


@ray.remote
def mapper(item: int) -> str:
    return str(item)


# this type checks because @ray.remote is typed to return RemoteFunction0
# but fails at runtime because `beartype.infer_hint(mapper)` is actually just Callable
# (or for other similar reasons)
string_items = map_records(mapper, [0, 1, 2, 3])

Flex it, @beartype. Flex those burly QA muscles. 💪 🐻

But... Which Is Better!?!

Both are awful. In both cases, you're lying to somebody about something. That's not great. Nobody should lie – especially to the type-checkers that are trying to crush your bugs.

`typing.TYPE_CHECKING` is the industry-standard official solution for this sort of thing. That's what I'd try first. If that falls down, `BeartypeHintOverrides` is always there to pick you back up. That said, I acknowledge the obvious plush elephant in the room:

But... Can't @beartype Just Support This Feature Directly?

Sure! Let's keep this issue open until @beartype supports this feature directly. Ideally, @beartype would allow you to explicitly specify the name of a callable parameter to be ignored via a new `BeartypeConf(ignore_arg_names: Collection[str] = ())` configuration option. For your case, its usage would resemble:

from beartype import beartype, BeartypeConf
from typing import Sequence, TypeVar
import ray
from ray._private.worker import RemoteFunction0  # <-- this is the 🦆

T = TypeVar("T")
R = TypeVar("R")


# Ignore this type hint at runtime, because the third-party
# "ray._private.worker.RemoteFunction0" type hint factory
# fails to support runtime type-checking. YIIIIIIIIIIIKES.
@beartype(conf=BeartypeConf(ignore_arg_names=('fn',)))  # <-- lolbro
def map_records(
    fn: RemoteFunction0[R, T],
    records: Sequence[T]
) -> R:
    ...


@ray.remote
def mapper(item: int) -> str:
    return str(item)


# this type checks because @ray.remote is typed to return RemoteFunction0
# but fails at runtime because `beartype.infer_hint(mapper)` is actually just Callable
# (or for other similar reasons)
string_items = map_records(mapper, [0, 1, 2, 3])

Clearly, that's the simplest approach. Just as clearly, that currently doesn't exist. I'm afraid it's either `typing.TYPE_CHECKING` or `beartype.BeartypeHintOverrides` for the moment. QA duck weeps for your codebase.

<sup>this is qa duck. the heart breaks just looking at qa duck.</sup>

leycec · 2 months ago · beartype/beartype

Yes! So much "Yes!" I too share that dream of a future where Python just works. Because Python in the absence of types just does not work. Interestingly, `mypy` and `pyright` work surprisingly well... until they stop working at all. The breaking point in code complexity seems to be code that uses dark magic – even if it's only once:

  • Override a dunder method. Like, any dunder method. The minute you're overriding dunder methods is the minute you've broken the static type-checker.
  • Call the `eval()` or `exec()` builtins. When you're dynamically evaluating and executing code, you're no longer best friends with static type-checkers. In fact, they're now licking knives while wearing sunglasses whenever they see your codebase.
  • Call the `type()` builtin. Dynamically define a class? Great. Now static type-checkers hate you and all your children.
  • Use metaclass programming. Got a metaclass? Great. Metaclasses taste better than milk. Sadly, static type-checkers disagree. Of course, static type-checkers disagree on everything.
  • Use tensors. Seriously. Static type-checkers just do not know what to do with tensors. y u h8 AI so much?

None of this is that dark... especially tensors. That's just normal.

I usually throw out the full-blown static type-checker in favour of a simpler linter like `pylint` or Ruff. Both combo well with @beartype while still delivering limited static type-checking without the false positives emitted by full-blown static type-checkers. In fact, @beartype is the only project I use `mypy` and `pyright` on – and I only do it here because users really, really want me to do that. The things I do for love. 🥰

Thanks again for being such an amazing @beartype user. I'm delighted that I've eased your burden during live demos. That is soooooo cool. And stressful. Sounds pretty stressful. Just thinking about live demoing anything makes me reach for a Japanese role-playing game while the sweat beads my forehead.

<sup>live demos: truly, i now have a new nightmare</sup>

leycec · 3 months ago · python/cpython

Ho, ho... ho. @beartype maintainer @leycec has been summoned via #129463. As a born contrarian living in a cabin in the Canadian woods with too much time and too little sense, I have a lot of tiresome things to say about this and every other topic. Thankfully, nobody wants to hear it. I'll just pontificate about type hint smells instead.

So Even Type Hints Smell Now, Huh?

A type hint smell is the `typing` equivalent of a code smell. All existing PEP-compliant type hints (including `typing.ForwardRef`) currently satisfy this simple maxim. In fact, all existing PEP-noncompliant third-party type hints supported by @beartype (like `numpy.typing.NDArray[...]`) also satisfy this simple maxim.

It's simple. Thus, it's maximally supported:

from typing import TypeForm

def is_hints_smelly(hint_a: TypeForm, hint_b: TypeForm) -> bool:
    '''
    :data:`True` only if the passed type hints **smell** (i.e.,
    are poorly designed in a manner suggesting catastrophic
    failure by end users).
    '''

    return not (
        repr(hint_a) == repr(hint_b) and
             hint_a  ==      hint_b
    )

There should thus exist a one-to-one correspondence between:

  • The string representation (i.e., `repr()`) of a type hint.
  • The identity (i.e., `is`) or equality (i.e., `==`) of that type hint.

Two type hints that share the same string representation should thus either literally be the same type hint or compare equal to one another. Makes sense, right?

In fact, this maxim makes so much sense that we can pretty much extend it to most object-oriented APIs. An API that violates this smell test isn't necessarily bad per se, but it is suggestive of badness.

Nobody Cares About Your Suggestions, Though

In the case of type hints, this isn't quite a "suggestion."

@beartype really wants this smell test to hold across all type hints. @beartype aggressively memoizes all internal calls on the basis of type hints and their string representations. Since all existing type hints satisfy the above smell test, they also memoize well out-of-the-box with respect to @beartype.
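
To see why, here's a toy sketch (hypothetical, not @beartype's actual internals) of repr()-keyed memoization. A cache like this is only safe if sharing a repr() implies being the same (or an equal) hint – i.e., if the smell test above holds:

# Toy repr()-keyed cache. The cached value is just a counter standing in for
# something expensive, like dynamically generated type-checking code.
_checker_cache: dict[str, int] = {}

def cached_checker(hint: object) -> int:
    key = repr(hint)
    if key not in _checker_cache:
        _checker_cache[key] = len(_checker_cache)  # <-- stand-in for codegen
    return _checker_cache[key]

# Two hints sharing a repr() silently land on the same cache entry.
print(cached_checker(list[int]) == cached_checker(list[int]))  # <-- True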

The proof is in the tedious pudding. Pretend this matters:

# Prove that the "list[int]" type hint isn't smelly.
>>> is_hints_smelly(list[int], list[int])
False  # <-- good!

# Prove that the "Literal['smells', 'good']" type hint isn't smelly.
>>> is_hints_smelly(Literal['smells', 'good'], Literal['smells', 'good'])
False  # <-- good!

# Prove that the "ForwardRef('muh_package.MuhType')" type hint isn't smelly.
>>> is_hints_smelly(ForwardRef('muh_package.MuhType'), ForwardRef('muh_package.MuhType'))
False  # <-- good!

So. We're agreed. Existing type hints don't smell. That's good... isn't it? :thinking: :thought_balloon: :boom:

Would You Please Stop Talking Already

No. I've warned you that I would pontificate! I intend to do just that.

The proposed refactoring does simplify the existing implementation of `typing.ForwardRef`, which is to be applauded. Let us ease the burden of CPython volunteer devs. They have suffered too much already.

The proposed refactoring also violates the above smell test. After merging this PR, `typing.ForwardRef` will now produce smelly type hints. Two forward references to the same referent will still share the same string representations but no longer compare as equal (and certainly won't be identical objects).

So. You Admit You Are Willing to Do Something Then?

No. I'm super-lazy! I just pontificate without writing code. Mostly.

Actually, I lie. I casually inspected the `typing.ForwardRef` implementation for the first time a hot minute ago and... it looks pretty good, actually! It's nearly there.

Sure. CPython could diminish the real-world utility of forward references by gutting the `__eq__()` and `__hash__()` dunder methods. That's not hard. Anyone can destroy something useful. But why bother? Am I right? Deep down, you know I'm probably right. Let us preserve useful things instead.

I have a trivial proposal. It is trivial, because I just did it. If I can do it, literally anyone can do it. I live in a cabin in the woods, people. Let's just resolve the issue in the `__hash__()` dunder method by aligning the implementation of that method with that of the existing `__eq__()` dunder method: e.g.,

    # This is what I'm a-sayin', folks.
    def __hash__(self):
        return hash(
            (self.__forward_arg__, self.__forward_value__)
            if self.__forward_evaluated__ else
            (self.__forward_arg__, self.__forward_module__)
        )

Literally just two more lines of code. That's it. Of course...

Python 3.14: It Is Hell

I acknowledge that Python 3.14 is coming. Indeed, Python 3.14 is almost here. `typing.ForwardRef` is mutating into a shambolic monstrosity under Python 3.14, isn't it? Sure. It probably is.

Preserving the fragrant non-smelliness of `typing.ForwardRef` type hints is still useful, though – Python 3.14 or no Python 3.14. How many more fields could Python 3.14 possibly be adding to `typing.ForwardRef`, anyway? Surely, somebody who is not me wants to extend the above issue resolution to cover those Python 3.14 "improvements."

Somebody? Anybody? Hellllllllllo? :face_exhaling:

tl;dr: Let Us Fix a Thing Rather than Break a Thing

Thus spake @leycec. He don't know much, but he know a thing. This might be that thing.

leycec · 3 months ago · beartype/beartype

...lol. That's brutal. Thanks so much for the detailed writeup, @Glinte. You're absolutely right. Python 3.12 has made a monster and I'm here for it.

Under Python ≥ 3.12, PEP 695 introduced a new class and callable pseudo-scope implicitly instantiating type variables. This new pseudo-scope interacts with PEP 563 (i.e., `from __future__ import annotations`) in new soul-destroying ways that nobody ever imagined in their worst GitHub nightmares. Resolving type hints under PEP 563 is even harder, slower, and more complicated now.

Great. That's just great.

2025: another year in the QA trenches with PEP standards that hate each other. 😭

leycec · 3 months ago · beartype/beartype

WOAH! Django horror show continues and, in fact, intensifies. I now apologise for straining your Django app so far past the breaking point of readable tracebacks. Even I fear to tread where your app is going. Usually, I love code that breaks minds. But this is a bit much.

Still, I can probably explain a bit of the special darkness you're seeing here. Let's see...

I found another bug...

Pretty sure you mean "feature." Right, @rudimichal? Feature. No? You're pretty sure? It's another bug, huh? Gods preserve our spare time. 😭

Based on the ~~eldritch darkness~~ useful traceback you've so helpfully copy-pasted, it looks like @beartype:

  1. Is decorating some `@staticmethod` or `@classmethod` directly defined on each of your Django `Model` subclasses. That's fine.
  2. Then unwraps that `@staticmethod` or `@classmethod` to obtain the underlying pure-Python function you've defined. That's fine.
  3. Dynamically generates code type-checking the value returned by that pure-Python function. That would be fine, except...
  4. The type hint annotating that return is a relative forward reference (i.e., a string referring to a currently undefined type like `"MyOtherModel"`). That would be fine, except...

...then will be called with

owner=<forwardref MyOtherModel(__name_beartype__=None, __scope_name_beartype__=None)>

So friggin' weird! Totally a bug, too. No idea how or why that's happening, either. It shouldn't. Nobody else has ever hit this.

`__name_beartype__` should always be a non-empty string; instead, it's `None` here.

Are you enabling PEP 563 with the `from __future__ import annotations` pragma at the top of your modules? If so, that might explain why nothing makes sense. PEP 563 is pretty hard for any runtime type-checker (including @beartype) to support. In theory, @beartype now fully supports PEP 563. But... maybe it don't.
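
For anyone trying to reproduce, the shape described above boils down to something like this (class names hypothetical, Django stripped out for brevity). With PEP 563 enabled, every annotation below is a string that @beartype must resolve at call time:

from __future__ import annotations  # <-- PEP 563: stringifies all annotations

from beartype import beartype

@beartype
class MuhModel:
    @classmethod
    def make_other(cls) -> MuhOtherModel:  # <-- relative forward reference
        return MuhOtherModel()

class MuhOtherModel:
    pass

print(MuhModel.make_other())  # <-- resolved at call time; should *not* explode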

You tried valiantly to reproduce this with a minimal working example – but failed through no fault of your own, because Django is balls crazy. I feel great despair for your codebase. Let's reopen this issue until Django stops beating @beartype with a stick.

Django: A beautiful but cursed thing.

leycec · 3 months ago · beartype/beartype

I need an automated way of doing it...

Doesn't exist, sadly. Let us commiserate together in shared misery.

`typing` is hell. :sob:

Do you know when the typing types are going away?

Superb question! Alas, the answer is: "I am dumb and know nothing." The best-case answer is 2023. CPython devs want this logic gone now, but they also don't want to enrage everyone still stuck with legacy type hints. The worst-case answer is later this year:

The deprecated functionality may eventually be removed from the `typing` module. Removal will occur no sooner than Python 3.9's end of life, scheduled for October 2025.

And if the subscription sizes are going to be added to the official types?

...heh. Not a snowball's chance in Hell. CPython devs hate runtime typing. Hell, CPython devs hate typing in general. There are still no type hints in CPython's standard library. No way they'd ever add a public API for introspecting type hints to anything outside the `typing` module – which they're actively trying to deprecate, obsolete, and delete as fast as possible.

`typing` is sadness. Cry, emoji cat! :crying_cat_face:

leycec · 4 months ago · google/jaxtyping

Excellence! beartype/beartype@6b3aadfff7f9e4ef1ccde is the first step on this tumultuous voyage into the unknown. Your questions are, of course, apropos and almost certainly highlight deficiencies in my worldview. To wit:

Is naming the method `__instancecheck_str__()` going to be robust to future changes in Python?

...heh. Let's pretend it is. Actually, the hope is that this will eventually metastasize into an actual PEP standard. In the meanwhile, I hope that somebody who is not me will market this to agronholm himself at the `typeguard` issue tracker. Ideally, all runtime type-checkers should transparently support `__instancecheck_str__()`. Right? It's just the right thing to do.

Injecting the suspiciously @beartype-specific substring `__beartype__` into this API would probably inhibit that. Likewise, @beartype would probably never support something named `__typeguard_instancecheck_str__()`. It's a fragile human ego thing, I think. :sweat_smile:
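
For the curious, the consumer side of this protocol is also tiny. A rough sketch (not a spec, just an illustration) of how any runtime type-checker could consult it when isinstance() fails:

def violation_message(obj: object, cls: type) -> str:
    # Like __instancecheck__(), the dunder lives on the metaclass, so looking
    # it up on the class itself yields a method bound to that class.
    describer = getattr(cls, '__instancecheck_str__', None)
    if describer is not None:
        return describer(obj)
    return f'{obj!r} is not an instance of {cls!r}'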

I'm finding the current beartype import hooks a bit... intense.

(╯°□°)╯︵ ┻━┻

5 different options

you're not wrong

`beartype_package` and `beartype_packages` basically overlap;

you're not wrong

`beartype_all` is probably too dangerous to ever be used;

you're not wrong

`beartyping` isn't actually documented besides the one example;

you're not wrong

Wait... I'm beginning to detect a deterministically repeatable pattern here. Allow me to now try but fail to explain:

  • Personally, I just wanted `beartype_this_package()` and `beartype_packages()`. But everybody else who piled onto the pre-release beta test of the `beartype.claw` subpackage wanted all of those other things. Things got out of hand quickly. This is what happens when you do what other people want. Now, @beartype can't back out of any of those decisions without breaking backward compatibility across the Python ecosystem. Thankfully, it's mostly fine. These functions have yet to actually break or do anything bad. So, there's no maintenance burden or technical debt here. They just kinda hang out, doing their own thing. I salute these functions I do not need and never really wanted.
  • I meant to document `beartyping`. But nobody complained, I got tired, and then video games happened.
  • `beartyping` just contextually applies to everything in the body of its `with beartyping(...):` block. I didn't realize that relative imports might break that. I never even considered relative imports anywhere. I always do absolute imports everywhere, because I am obsessive-compulsive and have trust issues. Do relative imports break `beartyping`? I... don't know, actually. I should probably go and test that. But... I probably won't until somebody complains. I'm still tired and video games are still happening.

Assuming `beartyping` actually does what it says it does, this presumably works: <sup>totally doesn't</sup>

# foo/__init__.py
with jaxtyping.install_import_hook("foo"), beartype.beartyping():
    from . import bar
    from . import baz

This definitely should work, too:

# foo/__init__.py
jaxtyping.jaxtype_this_package()  # <-- dis iz h0t
beartype.beartype_this_package()
from . import bar
from . import baz

Likewise it'd be pretty neat if beartype had a pytest hook...

Yes! So much, "Yes!" I actually implemented a plugin a month ago. But then I got bored, never tested anything, and never stuffed that into an actual GitHub repository. @beartype even has an existing `pytest-beartype` repository – but it's still empty. Somebody else was supposed to do all that. But then the holidays and presumably video games happened.

Long story short: "Nobody did nuffin'." :face_exhaling:
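
In the meantime, a stopgap needs no plugin at all – @beartype's existing import hooks already slot into a conftest.py. A minimal sketch (the package name "muh_package" is hypothetical):

# conftest.py
from beartype.claw import beartype_packages

# Runtime type-check the package under test for the whole pytest session.
# This must run before that package is first imported by any test.
beartype_packages(('muh_package',))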

...what's the status of `O(n)` checking in beartype?

Given that @beartype still fails to deeply type-check most standard container types like `dict` and `set` in `O(1)` time, let's just quietly accept that `O(n)` type-checking is a dream beyond the veil of sleep.

That said, does this actually intersect with `jaxtyping`? From @beartype's perspective, isn't a JAX array just this monolithic object that @beartype farms off to `jaxtyping`?

Uhm... Err...

Oh. Wait. I never actually implemented support for `jaxtyping` in @beartype. Welp, that's awkward. Basically, @beartype should be doing the same thing that @beartype currently does for the third-party Pandera type hint package: namely, @beartype should silently ignore all type hints from `jaxtyping`. Is that right?

Previously, I'd assumed that `jaxtyping` was using the standard `__instancecheck__()` mechanism for its type hints and that @beartype didn't have to worry about anything. But that definitely doesn't seem to be the case. Is that right? Should @beartype not be type-checking objects annotated with `jaxtyping` type hints via the standard `isinstance()` builtin?

I'm not even necessarily clear what "trace time" is, frankly. My wife and I are currently sloooowly migrating our data science pipeline from Ye Ol' Mostly Single-threaded NumPy and SciPy World to Speedy Gonzalez JAX World. I must confess that I am dumb, in short.

leycec · over 1 year ago · beartype/beartype

Resolved by ef26913b4ac58d71f. Thanks for your patience, forbearance, and... wait. Was that an unexpected (but altogether welcome) bear pun in my @beartype feature request? You're not wrong. Where there's puns, there's fire. This is your final API for raising human-readable violations:

from beartype import beartype

class MetaclassOfMuhClass(type):
    def __instancecheck_str__(cls, obj: object) -> str:
        return f'{repr(obj)} has disappointed {repr(cls)}... for the last time.'

class MuhClass(object, metaclass=MetaclassOfMuhClass):
    pass

@beartype
def muh_func(muh_obj: MuhClass) -> None:
    pass

muh_func("Some strings, you just can't reach.")

...which now raises the "human"-readable violation message:

beartype.roar.BeartypeCallHintParamViolation: Function __main__.muh_func() parameter
muh_obj="Some strings, you just can't reach." violates type hint <class '__main__.MuhClass'>, as
"Some strings, you just can't reach." has disappointed <class '__main__.MuhClass'>... for the last time.

Note that the string you return from your `__instancecheck_str__()` implementation:

  • Should be a sentence fragment whose first character is typically not capitalized.
  • May optionally be suffixed by a trailing period. In either case, @beartype intelligently does the right thing and ensures that the entire violation message is always suffixed by only a single trailing period.

Bear muscles are glistening with sweat. Flex it, @beartype! :muscle: :bear:

leycec · over 1 year ago · google/jaxtyping

...heh. The age-old PEP 526-compliant Annotated Variable Assignment Runtime Type-checking Problem, huh? That one just never gets old. So, thanks for pinging me on! I love this sort of hacky Python wrangling.

Sadly, you already know everything. This is like when Luke in Empire Strikes Back finally realizes that the little wizened green lizard blob thing is actually the most powerful surviving Jedi in the entire universe. You are that blob thing. Wait. That... no longer sounds complimentary. Let's awkwardly start over.

</ahem>

Two Roads: One That Sucks and One That Sucks a Bit Less

As you astutely surmise, two sucky roads lie before you:

  • Do what `typeguard` does. That is, supercharge `@jaxtyped` to instrument ASTs. `typeguard` code can be a little arduous to reverse-engineer, owing to a general lack of internal commentary in that codebase. From the little to nothing that I've gleaned, the `@typeguard.typechecked` decorator *attempts* to do this. I italicize *attempts*, because I remain unconvinced that that genuinely works in the general case. In-memory classes and callables will be the test failures of us all. @beartype briefly considered doing this and then quickly thought better of it. A bridge too far is a bridge that will explode, plummeting all of us to our shrieking dooms below. <sup>also, i have no idea how to do this</sup>
  • Do what @beartype does. That is, supercharge `install_import_hook` with a new `visit_AnnAssign()` method. This is the least magical and thus best possible approach (a rough sketch follows this list).
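
Here's that least-magical approach in miniature – a rough sketch (not jaxtyping's or @beartype's actual transformer) of a visit_AnnAssign() that appends a runtime check after every annotated assignment, delegating the check itself to @beartype's public door API:

import ast
import copy

class AnnAssignChecker(ast.NodeTransformer):
    def visit_AnnAssign(self, node: ast.AnnAssign):
        self.generic_visit(node)
        # Only instrument simple annotated assignments that actually assign.
        if node.value is None or not isinstance(node.target, ast.Name):
            return node
        # Append "die_if_unbearable(<target>, <annotation>)" after the
        # assignment. The real hook must also inject that import and call
        # ast.fix_missing_locations() before compiling the module.
        check = ast.Expr(value=ast.Call(
            func=ast.Name(id='die_if_unbearable', ctx=ast.Load()),
            args=[
                ast.Name(id=node.target.id, ctx=ast.Load()),
                copy.deepcopy(node.annotation),
            ],
            keywords=[],
        ))
        return [node, ast.copy_location(check, node)]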

Relatedly, you almost certainly want to ~~steal everything not nailed down~~ be inspired by @beartype's own implementation. The proliferation of edge cases makes the implementation non-trivial. Because @beartype and the Power of Asperger's, the code is outrageously commented. You can probably streamline that by... a lot.

Let's Make a Deal: A Match Made in GitHub

Tangentially, would you like @beartype to automate your import hooks for you?

@beartype would be delighted to silently piggyback `jaxtyping` onto its own hunched back. @beartype already provides a rich constellation of import hook primitives (e.g., `beartype_this_package()`, `beartype_packages()`, `beartyping()`). Moreover, anyone who wants @beartype probably also wants `jaxtyping`. Moreover, it's optimally efficient for @beartype to circumvent all of the file I/O and CPU churn associated with the `jaxtyping.install_import_hook()` import hook by instead just directly applying your currently private (and thus horribly risky) `jaxtyping._import_hook.JaxtypingTransformer` subclass where @beartype applies its own `beartype.claw._ast.clawastmain.BeartypeNodeTransformer` subclass.

Specifically, @beartype would automate away everything by:

  • Automatically detecting when `jaxtyping` is importable. Trivial.
  • When `jaxtyping` is importable:
    • Automatically applying the `jaxtyping._import_hook.JaxtypingTransformer` subclass immediately after or before (...not sure which) applying its own `beartype.claw._ast.clawastmain.BeartypeNodeTransformer` subclass. Trivial. If I'm reading your code correctly, @beartype can just do something like:
# In our private "beartype.claw._importlib._clawimpload" submodule:
...
        # Attempt to...
        try:
            # Defer optional dependency imports.
            from jaxtyping._import_hook import JaxtypingTransformer

            # AST transformer decorating typed callables and classes by "jaxtyping".
            #
            # Note that we intentionally pass *NO* third-party "typechecker" to avoid
            # redundantly applying the @beartype.beartype decorator to the same callables
            # and classes twice.
            ast_jaxtyper = JaxtypingTransformer(typechecker=None)

            # Abstract syntax tree (AST) modified by this transformer.
            module_ast = ast_jaxtyper.visit(module_ast)
        # If "jaxtyping" is currently unimportable, silently pretend everything is well.
        except ImportError:
            pass

        # AST transformer decorating typed callables and classes by @beartype.
        ast_beartyper = BeartypeNodeTransformer(
            conf_beartype=self._module_conf_beartype)

        # Abstract syntax tree (AST) modified by this transformer.
        module_ast_beartyped = ast_beartyper.visit(module_ast)

Of course, you don't need to actually depend upon or require @beartype in any way. No changes are needed on your end. Actually, one little change would improve sanity: if you wouldn't mind publicizing `JaxtypingTransformer` somewhere, @beartype can then just publicly import that and no longer worry about destroying itself.

That's it. Super-easy, honestly. Everyone currently getting @beartype would then get `jaxtyping` as well for free. Free is good. Let us give the good people what they want, @patrick-kidger. Smile, benevolent cat! Smile! :smile_cat:

Equinox: You Are Now Good to Go

Oh – and @beartype now officially supports Equinox. 2024: dis goin' be gud.

leycec · 3 months ago · beartype/beartype

So sorry! I desperately want to say, "Yes! Absolutely! All things will be done, including the hardest things!" Sadly, I am fat, slovenly, and like video games too much. Hard things got done for @beartype 0.20.0 – just not the hardest.

`O(n)` full-fat type-checking is the hardest. Given how long complete deep `O(1)` type-checking has taken (and is still taking), I estimate that @beartype 0.21.0, to be released by June 2025, will deliver our first provisional support for `O(n)` full-fat type-checking of some type hints. "Some type hints" probably means the most popular builtin container type hints. So, `list[...]`, `set[...]`, `tuple[...]`. That sort of thing.

That said, I'm kinda weighing options. There's a ton of stuff left to do for deep `O(1)` type-checking – including full support for:

  • `collections.abc.Callable[...]` and `typing.Callable[...]` type hints. Super-hard, but super-critical. Can't believe @beartype still doesn't do this. Here we are. My head is bandaged! 🤕
  • Pydantic-style `@dataclasses.dataclass` fields on assignment. Currently, @beartype only type-checks `@dataclasses.dataclass` fields on initialization. We currently avoid type-checking fields on assignment, because I'm lazy. Sadly, everyone is tired of that excuse. It's a good excuse, though! My laziness is incorrigible.
  • Generator and coroutine parameters, returns, and `yield`s. This is non-trivial, because the only feasible way of type-checking generators and coroutines is in abstract syntax tree (AST) transformations run by `beartype.claw` import hooks. Still, this is technically feasible. This should be done by somebody... someday. We all yearn for that day.

Those are the Big Three™, I think.

@JWCS: You're basically the Beartype Community Manager at this point. Congratulations! I will now weigh you down with awful responsibilities. Do you know of any simple way to make a poll in a GitHub Discussions thread? If it's too hard, let's not bother. But perhaps it's easy. If it is, redirecting this question to @beartype users makes more sense than @leycec just randomly hacking on whatever he wants. What do actual users want? What should I prioritize? No idea. A poll could disclose the truth. If I had my way, I'd just hack on features in the above order. Thus, `O(n)` will never get done within the lifetime of our fat cats. That... isn't great. I want our fat cats to see `O(n)` type-checking! 😻

Resolved by 9b7cd6a45e6d4d2bfac. Our upcoming @beartype 0.21.0 release now fully supports recursive type hints. We give praise to the long winter months here in Canada, which forced me to actually code something for once. Hallelujah! 👼

What's the Catch?

What? Catch? Surely you jest! There's no...

...oh, who am I kidding!?!?!? There are huge catches associated with this resolution. @beartype intentionally does not support older PEP-noncompliant variants of recursive type hints that used stringified forward references, like those used in @alisaifee's original example. @beartype probably could, but there's not much point in supporting non-standard functionality when standardized alternatives exist. Everyone now sees where this is going.

@beartype now fully supports PEP 695-compliant recursive `type` aliases. They're the standard for defining recursive type hints. In fact, they're the only standard for defining recursive type hints. Everything else was just something `mypy` made up.

Recursive `type` aliases now work wonderfully under Python ≥ 3.12 – but that's the gotcha here.

Oh, Gods! Here It Comes!

That's right. You love to hate it. PEP 695 is unusable under Python ≤ 3.11. Attempting to define any `type` alias under Python ≤ 3.11 results in CPython raising an unreadable `"SyntaxError: invalid syntax"` exception.
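
Concretely, there's no graceful degradation here. The failure happens at parse time, before any import hook, decorator, or try/except could possibly intervene:

# Under Python ≤ 3.11, this line fails to even *parse*:
type Primitive = int | str   # SyntaxError: invalid syntax

# ...so you can't hide it behind "if sys.version_info >= (3, 12):" either.
# The whole module refuses to compile.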

In a year or two, this will be significantly less of a hard blocker for everyone. Increasingly, nobody cares about Python ≤ 3.11. Do you care about Python ≤ 3.11? Maybe – but you probably shouldn't, unless your huge userbase is obsessed by Python ≤ 3.11. In that case, you're kinda screwed. You have to choose between your love for recursion and your love for having users.

Tough choice. I'd choose recursion, personally. Users who hate recursion are users you love to hate. 😆

Is That the Only Catch?

Absolutely! Totally! How could anything else possibly go wrong!

...oh, who am I kidding!?!?!? There is yet another huge catch associated with this resolution. @beartype currently does not deeply type-check recursive data structures to a countably infinite depth of nested recursion. Instead, @beartype:

  1. Type-checks recursive data structures to only a single level of nested recursion.
  2. Silently accepts all deeper levels of nested recursion in recursive data structures.

We'll see exactly what that looks like below. For now, let's just accept this is happening. But why is this happening? Coupla reasons, fam:

  • Constant-time `O(1)` time complexity. Deeply type-checking a recursive data structure with recursive height `k` would necessitate linear-time `O(k)` time complexity in @beartype – violating @beartype's fundamental efficiency guarantee.
  • Space safety. Recursion is super-unsafe in Python. Python lacks tail recursion (which sucks) and allocates recursive call frames on the stack rather than the heap (which also sucks). This means that @beartype cannot safely type-check arbitrarily large recursive data structures through recursion. Thankfully, we are all awesome. Therefore, we all know that all recursive algorithms can be implemented iteratively rather than recursively. In theory, this means @beartype could recursively type-check arbitrarily large recursive data structures through iteration rather than recursion. In practice, I am already so tired I can barely see the backs of my eyelids. There may never exist enough lifetimes in the known Universe for me to do that.
  • Time safety. Even an iterative approach to recursive type-checking rapidly bumps up against unpleasant real-world edge cases like infinitely recursive containers (i.e., containers that contain themselves as items like `bad_list = []; bad_list.append(bad_list)`). Of course, an iterative approach could be protected against these edge cases by dynamically generating type-checking code that maintains:
    • For each recursive `type` alias to be type-checked, one set of the IDs of all previously type-checked objects. But now @beartype would need to allocate and append to one friggin' set for each recursive `type` alias for each function call. Space and time efficiency rapidly spirals into the gutter and then clutches its aching head like in a depressing Leaving Las Vegas scene.

Only Python ≥ 3.12!? Only one layer of recursion!?

Oh, Gods! There It Is! We're Screwed!

I know! I know! There are so many exclamation points happening here!

Therefore, allow me to exhibit how utterly awesome @beartype's support for PEP 695-compliant recursive `type` aliases is. Behold! @alisaifee's three-year-old <sup>...my gods. where did the time and all my hair go?</sup> example, rewritten to use `type` aliases fully supported by @beartype 0.21.0:

from typing import Union
from typing_extensions import TypeGuard
import beartype

# PEP 695 type aliases. They just work under Python ≥ 3.12. Let's not talk about what happens
# under Python ≤ 3.11, because my face is already numb enough.
type Primitive = Union[int, bool, float, str, bytes]

# Note that PEP 695 explicitly supports unquoted forward references. Kindly do *NOT* quote
# recursive references to the same type alias. Just name this alias without using strings.
type Response = Union[
    Primitive,
    list[Response],  # <-- recursion erupts!
    set[Primitive],
    dict[Primitive, Response]  # <-- more recursion erupts! it's recursion everywhere! my eyes!
]


def valid_keys(
    dct: Union[
        dict[Primitive, Response],
        dict[
            Response, Response
        ],  # needed as this is what the typeguard is filtering out.
    ]
) -> TypeGuard[dict[Primitive, Response]]:
    return all(isinstance(k, (int, bool, float, str, bytes)) for k in dct.keys())


@beartype.beartype
def whynot(x: Response) -> dict[Primitive, Response]:
    resp = {}

    if isinstance(x, list):
        it = iter(x)
        resp = dict(zip(it, it))
    elif isinstance(x, dict):
        resp = x
    else:
        raise ValueError()

    if valid_keys(resp):
        return resp
    raise ValueError()


# These are all valid calls! @beartype happily accepts these, much like @beartype accepts you.
print(whynot({"bar": "baz"}))  # {"bar": "baz"}
print(whynot(["bar", "baz"]))  # {"bar": "baz"}
print(whynot({"bar": 1, b"baz": [2, 3]}))  # {"bar": 1, b"baz": [2,3]}
print(whynot(["bar", 1, b"baz", [2, 3]]))  # {"bar": 1, b"baz": [2,3]}
print(whynot({"bar": [1, {2, 3}]}))  # {"bar": [1, {2,3}]}
print(whynot(["bar", [1, {2, 3}]]))  # {"bar": [1, {2,3}]}
print(whynot(["bar", [1, {2, 3}]]))  # {"bar": [1, {2,3}]}

# This is an invalid call! Boooooooo. @beartype rejects this, much like @beartype rejects your
# coworker's bugs.
print(
    whynot({frozenset({"bar"}): [1, {2, 3}]})
)

...which prints the expected output and raises the expected exception:

{'bar': 'baz'}
{'bar': 'baz'}
{'bar': 1, b'baz': [2, 3]}
{'bar': 1, b'baz': [2, 3]}
{'bar': [1, {2, 3}]}
{'bar': [1, {2, 3}]}
{'bar': [1, {2, 3}]}
Traceback (most recent call last):
  File "/home/leycec/tmp/mopy.py", line 49, in <module>
    whynot({frozenset({"bar"}): [1, {2, 3}]})
    ~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<@beartype(__main__.whynot) at 0x7aeef3c94720>", line 122, in whynot
beartype.roar.BeartypeCallHintParamViolation: Function __main__.whynot() parameter
x={frozenset({'bar'}): [1, {2, 3}]} violates type hint Response, as dict
{frozenset({'bar'}): [1, {2, 3}]}:
* Not list or set.
* Not bytes, float, str, bool, or int.
* dict key frozenset "frozenset({'bar'})" not bytes, float, str, bool, or int.

Praise be to Bear Cat. This is Bear Cat:

<sup>hi! dis iz bear cat. bear cat iz doing' just fine. just clawin' yer bugz. tnx fer askin', bub.</sup>

Guh! Thanks so much for the detailed writeup, @thetianshuhuang. Sadly...

Currently (0.20.2) it seems that generic `TypedDict` and `NamedTuple` aren't supported.

ohnoes. You're the first @beartype user to combine both `typing.TypedDict` and `typing.NamedTuple` with `typing.Generic[T]`. Unsurprisingly, they catastrophically blow up. Surprisingly, I'll fix this immediately. This is a critical bug. Clearly, @beartype should just transparently support this valid use case without catastrophically blowing up. My smile is sweaty, indeed. 😅
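
For posterity, the combination in question looks roughly like this (class names hypothetical). Both forms are legal in the standard typing module since Python 3.11 (and earlier via typing_extensions):

from typing import Generic, NamedTuple, TypedDict, TypeVar

T = TypeVar('T')

class MuhTuple(NamedTuple, Generic[T]):  # <-- generic NamedTuple
    item: T

class MuhDict(TypedDict, Generic[T]):    # <-- generic TypedDict
    payload: T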

I was on the cusp of publishing our upcoming @beartype 0.21.0 release, too! The GitHub issue tracker always blows up the best laid plans of bears, mice, and men.
</gulp>

leycec · 17 days ago · beartype/beartype

There are some edge cases though:

...heh. Subscripted generics, huh? Thankfully, that is something @beartype now does. My Gods! @beartype supports something! I am shocked, too. 😲

PEP 728 will add extra complexity...

...heh. You're trying to kill me. Thankfully, I have a mantra against PEPs like this: "I didn't see that. If no one complains about that, that means it doesn't exist and goes away."

and one important thing is to always perform such validation, even if no remaining arguments are found. In that case, an empty dict should be used to validate against the typed dict

Right. That makes sense. Gotta reject calls failing to pass all requisite keyword arguments.

After a cursory once-over, I note a metric ton of edge cases. Some of which @beartype can conveniently just ignore for an initial draft implementation; others of which, not so much:

So yeah depending on how arguments are validated, this can be implemented more or less easily.

So what you're saying is: "It's gonna hurt. It's gonna hurt real bad, bro." 😆

leycec · 19 days ago · beartype/beartype

Rejoice! This is next up on my list. @beartype should never raise exceptions on anything that seems valid, which makes this a high priority. But...

OMGGGGGGGGGGGGGG– Seriously. I cannot believe how utterly intense the `TensorDict` API is. This is the most outrageously verbose `dir()` output I've ever seen Python spit out:

# Uhh... wat? Tensorclasses don't even have a distinguishable class?
# You've gotta be friggin' kidding me here. What is this ugly madness?
>>> print(type(MyData))
type  # <-- ...you don't say

# My eyes and your eyes are now both bleeding.
>>> print(dir(MyData))
['__abs__', '__add__', '__and__', '__annotations__', '__bool__', '__class__',
'__dataclass_fields__', '__dataclass_params__', '__delattr__', '__dict__',
'__dir__', '__doc__', '__enter__', '__eq__', '__exit__', '__expected_keys__',
'__firstlineno__', '__format__', '__ge__', '__getattr__', '__getattribute__', 
'__getitem__', '__getitems__', '__getstate__', '__gt__', '__hash__', '__iadd__', 
'__imul__', '__init__', '__init_subclass__', '__invert__', '__ipow__', '__isub__',
'__itruediv__', '__le__', '__len__', '__lt__', '__match_args__', '__module__', 
'__mul__', '__ne__', '__neg__', '__new__', '__or__', '__pow__', '__radd__', 
'__rand__', '__reduce__', '__reduce_ex__', '__replace__', '__repr__', '__rmul__', 
'__ror__', '__rpow__', '__rsub__', '__rtruediv__', '__rxor__', '__setattr__', 
'__setitem__', '__setstate__', '__sizeof__', '__static_attributes__', '__str__', 
'__sub__', '__subclasshook__', '__torch_function__', '__truediv__', '__weakref__',
'__xor__', '_add_batch_dim', '_apply_nest', '_autocast', '_check_batch_size', 
'_check_device', '_check_dim_name', '_check_unlock', '_clone', '_clone_recurse', 
'_data', '_default_get', '_erase_names', '_exclude', '_fast_apply', 
'_flatten_keys_inplace', '_flatten_keys_outplace', '_from_dict_validated', 
'_from_module', '_from_tensordict', '_frozen', '_get_at_str', '_get_at_tuple', 
'_get_names_idx', '_get_str', '_get_sub_tensordict', '_get_tuple', 
'_get_tuple_maybe_non_tensor', '_grad', '_has_names', '_is_non_tensor', 
'_is_tensorclass', '_items_list', '_load_memmap', '_map', '_maybe_names', 
'_maybe_remove_batch_dim', '_memmap_', '_multithread_apply_flat', 
'_multithread_apply_nest', '_multithread_rebuild', '_new_unsafe', '_nocast', 
'_permute', '_propagate_lock', '_propagate_unlock', '_reduce_get_metadata', 
'_remove_batch_dim', '_repeat', '_select', '_set_at_str', '_set_at_tuple', 
'_set_str', '_set_tuple', '_shadow', '_to_module', '_type_hints', '_unbind', 
'_values_list', 'abs', 'abs_', 'acos', 'acos_', 'add', 'add_', 'addcdiv', 
'addcdiv_', 'addcmul', 'addcmul_', 'all', 'amax', 'amin', 'any', 'apply', 
'apply_', 'as_tensor', 'asin', 'asin_', 'atan', 'atan_', 'auto_batch_size_', 
'auto_device_', 'batch_dims', 'batch_size', 'bfloat16', 'bitwise_and', 'bool', 
'bytes', 'cat', 'cat_from_tensordict', 'cat_tensors', 'ceil', 'ceil_', 'chunk', 
'clamp', 'clamp_max', 'clamp_max_', 'clamp_min', 'clamp_min_', 'clear', 
'clear_device_', 'clear_refs_for_compile_', 'clone', 'complex128', 'complex32', 
'complex64', 'consolidate', 'contiguous', 'copy', 'copy_', 'copy_at_', 'cos', 
'cos_', 'cosh', 'cosh_', 'cpu', 'create_nested', 'cuda', 'cummax', 'cummin', 
'data', 'data_ptr', 'del_', 'densify', 'depth', 'detach', 'detach_', 'device', 
'dim', 'div', 'div_', 'double', 'dtype', 'dumps', 'empty', 'entry_class', 'erf', 
'erf_', 'erfc', 'erfc_', 'exclude', 'exp', 'exp_', 'expand', 'expand_as', 'expm1', 
'expm1_', 'fields', 'fill_', 'filter_empty_', 'filter_non_tensor_data', 'flatten', 
'flatten_keys', 'float', 'float16', 'float32', 'float64', 'floor', 'floor_', 
'frac', 'frac_', 'from_any', 'from_consolidated', 'from_dataclass', 'from_dict', 
'from_dict_instance', 'from_h5', 'from_module', 'from_modules', 'from_namedtuple', 
'from_pytree', 'from_struct_array', 'from_tensordict', 'from_tuple', 'fromkeys', 
'gather', 'gather_and_stack', 'get', 'get_at', 'get_item_shape', 'get_non_tensor', 
'grad', 'half', 'int', 'int16', 'int32', 'int64', 'int8', 'irecv', 
'is_consolidated', 'is_contiguous', 'is_cpu', 'is_cuda', 'is_empty', 
'is_floating_point', 'is_locked', 'is_memmap', 'is_meta', 'is_shared', 'isend', 
'isfinite', 'isnan', 'isneginf', 'isposinf', 'isreal', 'items', 'keys', 
'lazy_stack', 'lerp', 'lerp_', 'lgamma', 'lgamma_', 'load', 'load_', 
'load_memmap', 'load_memmap_', 'load_state_dict', 'lock_', 'log', 'log10', 
'log10_', 'log1p', 'log1p_', 'log2', 'log2_', 'log_', 'logical_and', 'logsumexp', 
'make_memmap', 'make_memmap_from_storage', 'make_memmap_from_tensor', 'map', 
'map_iter', 'masked_fill', 'masked_fill_', 'masked_select', 'max', 'maximum', 
'maximum_', 'maybe_dense_stack', 'mean', 'memmap', 'memmap_', 'memmap_like', 
'memmap_refresh_', 'min', 'minimum', 'minimum_', 'mul', 'mul_', 'named_apply', 
'names', 'nanmean', 'nansum', 'ndim', 'ndimension', 'neg', 'neg_', 'new_empty', 
'new_full', 'new_ones', 'new_tensor', 'new_zeros', 'non_tensor_items', 'norm', 
'numel', 'numpy', 'param_count', 'permute', 'pin_memory', 'pin_memory_', 'pop', 
'popitem', 'pow', 'pow_', 'prod', 'qint32', 'qint8', 'quint4x2', 'quint8', 
'reciprocal', 'reciprocal_', 'record_stream', 'recv', 'reduce', 'refine_names', 
'rename', 'rename_', 'rename_key_', 'repeat', 'repeat_interleave', 'replace', 
'requires_grad', 'requires_grad_', 'reshape', 'round', 'round_', 'save', 
'saved_path', 'select', 'send', 'separates', 'set', 'set_', 'set_at_', 
'set_non_tensor', 'setdefault', 'shape', 'share_memory_', 'sigmoid', 'sigmoid_', 
'sign', 'sign_', 'sin', 'sin_', 'sinh', 'sinh_', 'size', 'softmax', 'sorted_keys', 
'split', 'split_keys', 'sqrt', 'sqrt_', 'squeeze', 'stack', 
'stack_from_tensordict', 'stack_tensors', 'state_dict', 'std', 'sub', 'sub_', 
'sum', 'tan', 'tan_', 'tanh', 'tanh_', 'to', 'to_dict', 'to_h5', 'to_module', 
'to_namedtuple', 'to_padded_tensor', 'to_pytree', 'to_struct_array', 
'to_tensordict', 'transpose', 'trunc', 'trunc_', 'type', 'uint16', 'uint32', 
'uint64', 'uint8', 'unbind', 'unflatten', 'unflatten_keys', 'unlock_', 
'unsqueeze', 'update', 'update_', 'update_at_', 'values', 'var', 'view', 'where', 
'zero_', 'zero_grad']

WTTTTTTTTTTTTTTTTF

I mean... just... wtf!? There's even an attribute called

cummin
up there, which just raises even more questions than it answers. 🤣

I genuinely have no idea where to start with this beastly nightmare. My new life goal for 2025 is to just figure out how to even detect

@tensorclass
-decorated types. Even that appears to be oddly impossible, because
@tensorclass
-decorated types don't have distinguishable types. Their type is literally the
type
superclass, which I've never seen before.
type
conveys no meaningful semantics and thus provides no means of detecting
@tensorclass
-decorated types.
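
For the morbidly curious, here's the complaint in miniature. This is only a sketch assuming tensordict and torch are installed; the MyData class is purely illustrative:

from tensordict import tensorclass
import torch

@tensorclass
class MyData:
    images: torch.Tensor

# The decorated class is metaclass-wise just plain "type": there's no custom
# metaclass for @beartype to isinstance() against.
print(type(MyData))  # <class 'type'>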

Who designed this API horror show? They should feel bad, because they're bad. First I sigh. Then I facepalm. 🤦

leycec29 days agogchq/Vanguard

Thanks for your magnanimous patience, all. @beartype

0.20.2
now guarantees satisfaction or I will eat a handful of cat fur live on GitHub. Let's hope it doesn't come to that. 😸 😓

leycecabout 1 month agobeartype/beartype

Oh, salubrious hierophant of all that is profane and yet concurrently sacred in the hallowed halls of ML! Truly, your wizened name is recorded in the Akashic Data Science Records, @davidfstr. Even @beartype genuflects before the QA prowess of an elder Python statesman. Actually... wait. My balding hair and wrinkled brow that looks like a blobfish suggest that I'm the elder Python statesman now. Hear my plaintive lament, GitHub:

"Oh, how did this happen to me..."

Let's see what we have here:

Specifically, the TypeForm-enhanced MyPy is now rightfully throwing shade at beartype’s usage of TypeForm.

A pox on mypy's house! You do not throw shade onto @beartype's filthy porch that hasn't been cleaned in seven years, mypy. @beartype throws shade onto your probably very cleanly porch, mypy!

Oh, very well. You can't argue with mypy. Well, you can. I do all the time while clenching my skinny fists tightly in the perfidious blackness of a Friday all-nighter. To no avail, though. Mypy ignores @beartype's baleful pleas in the night and furtive hand-wringing. This must be what it feels like to be on the bottom of a power-law distribution. 😅

Let's Get Pragmatic 'Cause All This Poetry Be Exhaustin' Me

My inner rural redneck can only take so many pithy Shakespearean musings on the Furies of QA. Please, allow me to speak like a normal person for the first time in my life.

Thanks so much for the detailed shell script! Indeed, it worked perfectly. I can confirm that @beartype is throwing spunky chunks all over mypy's newfound

TypeForm
support. Thankfully, this is all super-trivial stuff. The hypnotic power of autism will resolve all of mypy's justifiable concerns that I would rather ignore until they catastrophically blow up this issue tracker... by tonight? Maybe?

Indeed, let us see if @leycec actually does anything. Who is this @leycec guy, anyway!?

leycec2 months agobeartype/beartype

Resolved by 3f18100213f9a8e... uhh, maybe. It's best to assume nothing with this issue, which shall hereafter be referred to as "The Neverending Issue." I'm tired, I'm cranky, and I'm all out of luck dragons.

Would you mind giving this a final try? Your Django use case is the ultimate torture test. And I do emphasize "torture." Pretty sure we both now need to seek psychotherapy on a couch:

pip install git+https://github.com/beartype/beartype.git@3f18100213f9a8e396d2ae7d204461df7ecea610

Thanks again for your phenomenal efforts with this @beartype beast, @rudimichal. You've gone above and beyond the call of Pythonic QA. Truly, a magical miracle happened here tonight.

<sup>would you trust a face that looks like this? @leycec would</sup>

leycec3 months agobeartype/beartype

Guh! This is, indeed, an explosion in badness. Looks like Sphinx is doing insane, inhumane, and profane things in

conf.py
scripts – including:

  • Forcing PEP 563 by injecting
    from __future__ import annotations
    into the top of all
    conf.py
    scripts, regardless of whether users actually want PEP 563 or understand the harmful ramifications of enabling PEP 563.
  • Executing
    conf.py
    scripts by:
    1. Reading the entire contents of those files into a local in-memory string variable.
    2. Dynamically evaluating those strings as executable Python code by calling the
      exec()
      builtin (roughly as sketched below).
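
In other words, something like this minimal sketch, which is emphatically not Sphinx's actual implementation and writes a throwaway conf.py on the spot just so the sketch runs:

from pathlib import Path

# A throwaway stand-in for a real Sphinx "conf.py" (just so this sketch runs).
Path('conf.py').write_text('project = "my-docs"\n')

# 1. Read the entire file into an in-memory string...
conf_source = Path('conf.py').read_text()

# 2. ...then dynamically evaluate that string as executable Python code.
conf_namespace: dict = {}
exec(compile(conf_source, 'conf.py', 'exec'), conf_namespace)
print(conf_namespace['project'])  # my-docs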

As @beartype's exception message implies, all of this is madness. Although none of this is @beartype's fault, it's easy to blame the bear. After all, code that works without

@beartype
should still work with
@beartype
. Right? Fair enough. In @beartype's defense, though...

It's hard to work with crazy. Sphinx is very much crazy and "ackin' cray-cray." The only reason @beartype itself uses Sphinx for documentation is that I didn't even know about MkDocs at the time. In hindsight, a little more research there would have been helpful. Because of this and similar issues, I'm now considering refactoring all of @beartype's documentation from Sphinx to MkDocs. Ain't nobody got time for Sphinx and its "special-needs" doco stack.

But What About This Awful Issue!?!?

Hmmm. No idea, honestly. @beartype should clearly do something that it's not currently doing. But... what? One obvious first step would be to improve the resiliency of @beartype's PEP 563 resolver.

Clearly, resolving the string

"None"
to the
None
singleton should be trivial. @beartype shouldn't need
func.__module__
to be defined for that. That's a sensible first action item. Then, at least we can say we did something. And doing something is half the battle. 😜

leycec3 months agobeartype/beartype

you seemed to know them outside of github

I only wish. Oh, QA Gods above, how I wish.

We'd have so much fun IRL. We'd Naruto-run; we'd pull all-nighters; we'd drink thirteen litres of God's Green Crack Ass Coffee; we'd take turns reading the first 15,000 pages of Brandon Sanderson's Stormlight Archive to each other at 3:27AM each night and I'd do all the voices of the bad guys; <sup>...geh heh heh</sup> we'd hard-core board game and video game through all the modern geek classics until the Kobe beef cows come home and the crows caw the dawn alight. Race for the Galaxy! Final Fantasy XIV! Underwater Cities! Metaphor: ReFantazio! Twilight Imperium! Tekken 7! Through the Ages! Dynasty Warriors: Origins! We'd finish up with a five-day Mystery Science Theatre 3000 marathon until we can no longer feel our hands. Oh, what fun we'd have.

Back to reality. For better or worse, I live in a cabin in the Canadian wilderness. Usually, that is for the better. But it's times like this that make me momentarily miss the dense urban geek-filled landscape of Leh Big City™. Where mah homies at? 🏙

leycec3 months agobeartype/beartype

Fully agreed. You are correct about everything. I actually just finished up the changelog writeup a hot minute ago. It's impossible to believe, but I'm releasing @beartype

0.20.0rc0
first thing tomorrow night. There's just something about those late Friday nights that screams:

"Release me from my shackles! Gods! Releaaaaaaaaase me!" — zombie @leycec, probably

leycec3 months agobeartype/beartype

lolbro. Baymax intensifies and then feels deflated. <sup>...get it, "deflated"? you see, that was a pun because... oh, nmind.</sup>

typing
sure is super-weird, huh? You have now noticed that deprecated PEP 484-compliant
typing
type hint factories behave nothing like PEP 585-compliant builtin type hint factories. Behold! True horror!

>>> from typing import List, Union
>>> list[Union]
list[Union]  # <-- guess that's okay, i guess? thanks, pep 585!
>>> List[Union]
...
TypeError: Plain typing.Union is not valid as type argument
# ^-- thanks fer nuthin', pep 484.

PEP 585 and 484 disagree about everything. PEP 585 takes precedence, because PEP 585 destroyed PEP 484. I don't know. Neither does anyone else. It's probably best to assume PEP 585 has the right of it – especially because that assumption simplifies everything and makes everything orthogonal.

Basically:

  • Any type hint factory can be unsubscripted.
  • When unsubscripted, a type hint factory defaults to subscription by the
    typing.Any
    singleton.

Sadly, this was never actually formalized anywhere. PEP 484 strongly hints <sup>...get it? i'll stop now</sup> that this is the case, but never comes right out and says it. Because it's easier to be permissive, @beartype assumes that this is the case.
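
Concretely, that permissive assumption plays out like this minimal sketch, in which take_any_list() is just an illustrative function:

from beartype import beartype
from typing import List

@beartype
def take_any_list(arg: List) -> None:  # <-- bare "List" treated like "List[Any]"
    pass

# No violation: under the implicit "Any" default, item types go unchecked.
take_any_list([0xBEA2, 'typed', None])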

Thus ends the tale of the 5th QA battle. PEP 585: You win the day! :flags:

leycec4 months agobeartype/beartype

Ho, ho... ho. Alas, you just hit @beartype's last well-known bug. :sob:

I last attempted to type-check defaults with the ancient @beartype 0.18.0 release. I thought I'd tested all possible edge cases. I was wrong... dead wrong. I was so wrong that I temporarily broke PyTorch, OpenAI, and Microsoft for a day. I hit the panic button, immediately reverted type-checking of defaults, released @beartype 0.18.2 without that, <sup>...don't even ask what happened to @beartype 0.18.1; just don't</sup> and then permanently pulled @beartype 0.18.0 and 0.18.1 from PyPI after everyone begged me to.

I am now deathly afraid of type-checking defaults and run screaming whenever anyone suggests @beartype start doing that. Allow me to show you how brutal type-checking defaults can get with a common edge case:

from beartype import beartype

@beartype
def what_could_possibly(go_wrong: 'ThisIsAwful | None' = None): pass

class ThisIsAwful(object): pass

See what's awful there? The

go_wrong
parameter is annotated by the type hint
'ThisIsAwful | None'
embedding a stringified forward reference to the
ThisIsAwful
class. At the time that
@beartype
is decorating the
what_could_possibly()
function, the
ThisIsAwful
class is undefined. This implies that the type hint
'ThisIsAwful | None'
is undefined as well.

However, type-checking the default for that parameter requires that the type hint

'ThisIsAwful | None'
be defined! But... it isn't. It just isn't.

There are various ways around this, but they're all non-trivial and require varying degrees of cleverness. I am not a clever man, @ilyapoz. I am a dumb man who just bangs on keyboards a lot.

I acknowledge that this is a super-annoying deficit in @beartype. Please bug me about this if I still haven't resolved #154 in a year or two. Until then, I'd prefer not to break PyTorch, OpenAI, or Microsoft again. Someone would probably punch me if I did that again.

leycec4 months agobeartype/beartype

Guh! Justifiable begging intensifies. You're all painfully right. I should have rolled out a patch release months ago with all of these minor issue resolutions. Sadly, tests are currently failing, CI is exploding, and QA is out the window.

Please bear with @leycec as he desperately plugs the widening holes in the bursting dam.

<sup>@beartype leakier than a squirrel whose only fashion accessory is an acorn hat</sup>

leycec2 months agobeartype/beartype

Totally. The badness is appalling. A dramatic increase in space and time complexity proportional to the number of forward references used throughout a codebase is probably the least of the badness. Exactly as you say, forward reference proxies preventing garbage collection by erroneously maintaining a death grip on stale objects scares the digested food out of me. It's important that nobody else notices this. Wait. Are these discussions public!? ohnoooooooooo—

I'd better tackle this before anybody else notices this. Let us hang our heads in exhaustion. 😞

In the spirit of fraternity and making @beartype look better, let us quietly close this. Technically, this is all TensorDict's fault. TensorDict devs hate type-checking. There's just not much you can do when ideology and computer science collide. 😩

<sup>shocked cat is as shocked as @leycec and @pablovela5620 are right now</sup>

...lol.

importlib_metadata
is totally raving mad. I unhappily confirm that eldritch abominations lurk here. Indeed, your minimal working example raises:

beartype.roar.BeartypeCallHintParamViolation: Function __main__.main() parameter
package_path=[PackagePath('_beartype.pth'), PackagePath('beartype-0.20.1.dist-info/INSTALLER'),
Package...')] violates type hint list[importlib.metadata.PackagePath], as list index 1 item
<class "importlib_metadata.PackagePath"> "PackagePath('beartype-0.20.1.dist-info/INSTALLER')"
not instance of <class "importlib.metadata.PackagePath">.

Their paper-thin justifications at python/importlib_metadata#300 for monkey-patching CPython's standard library are equally banal and nonsensical:

...several users have indicated that maintaining compatibility for this interface would be important to them even if there are no users affected by it.

If there are no users affected by it, then why is this important to users? Make madness make sense. Nobody should maintain a blatantly broken and ill-advised monkey-patch that breaks the standard library and thus valid use cases like runtime introspection, which is way more important than any of their contrived hand-waving. Also:

See miurahr/aqtinstall#221 for an affected user. Although that use-case is no longer affected.

A user that is no longer affected is, by definition, not an affected user. My feelings about all of this:

I throw my hands up in the air! 🙌
And then I throw up everywhere. 🤮 

Against my better judgement, you have piqued my reluctant curiosity:

I "solved" my case by identifying under which python version

importlib_metadata
was imported. Then using
cast
and
sys.version_info
I was able to get both
pyright
and
beartype
working.

Would you mind sharing what you did? From the @beartype perspective, we can probably transparently "resolve" this for everybody by:

  1. Detecting whether
    importlib_metadata
    has been imported (i.e., is in
    sys.modules
    ).
  2. If so, internally expanding all references to the
    importlib.metadata.PackagePath
    class across all type hints to either:
    • The monkey-patched
      importlib_metadata.PackagePath
      class.
    • The union
      importlib_metadata.PackagePath | importlib.metadata.PackagePath
      comprising both classes, which is probably a bit safer. (A user-side flavour of that union is sketched below.)
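
Until @beartype transparently does that, an impatient user could hand-roll the union themselves. A minimal sketch, in which the PackagePathHint alias is hypothetical rather than anything @beartype ships:

import importlib.metadata
from beartype import beartype

try:
    # If the backport is importable, accept *both* PackagePath classes.
    import importlib_metadata
    PackagePathHint = (
        importlib.metadata.PackagePath | importlib_metadata.PackagePath)
except ImportError:
    # Else, the standard library class alone suffices.
    PackagePathHint = importlib.metadata.PackagePath

@beartype
def main(package_path: list[PackagePathHint]) -> None:
    ...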

<sup>@beartype wishes it could unsee this issue</sup>

leycecabout 1 month agobeartype/beartype

BRILLIANT! Thank you soooooooooooooo much for spending your extremely valuable weekend on the intersection of

mypy
and @beartype. I feel profound gratitude. You are the QA man.

This

typing
one-liner is also exceedingly clever:

ANY: Hint = Any

I... never would have thought of that. Because I am a Canadian redneck, I probably would've just blindly chucked

type: ignore[cause-i-sed-so]
everywhere and called it a bad commit. Your fine-grained approach here and elsewhere is vastly preferable.

Let us merge this immediately for the good of everything that is holy and Pythonic. :partying_face: :snake: :face_holding_back_tears:

leycecabout 2 months agobeartype/beartype

this is one of the best library I have come accross and use it everyday!

Thanks so much for your kind and heartfelt words! That means so much. Let us now hug. 🤗

I have created a custom behaved class...

...uh oh. I got a bad feelin' about this one, Boss.

I suppose that if I just decorate my init method with beartype I will therefore get an error.

I... have no idea, actually. Your use case is super-interesting, though. Would you mind trying to just brute-force decorate everything with the

@beartype
decorator? It might work. @beartype is magical. If @beartype raises an exception, would you mind copy-pasting the entire
Traceback
into a comment below? That'd be wonderful. Yet again, I hug. 🤗

If @beartype does raise an exception, you have more than a few ~~horrible kludges that destroy sanity~~ valid options available. I suspect that @beartype probably won't raise exceptions, though. Why? Because @beartype currently raises no exceptions for

dataclasses.field
objects. Why? Because...

How did you do to bypass this bottleneck in dataclasses?

I didn't. It was glorious. I'm lazy and prefer to play video games. So, I didn't do anything. Specifically, @beartype itself actually doesn't do anything to officially support

@dataclasses.dataclass
. @beartype just natively supports dataclasses out-of-the-box. Sometimes, batteries really are included. Nifty, huh? We thought so, too.
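
To make "out-of-the-box" concrete, here's a minimal sketch; the Transition class and its fields are illustrative rather than anything from your codebase:

from dataclasses import dataclass
from beartype import beartype

@beartype   # <-- decorate the class; the generated __init__() gets type-checked too
@dataclass
class Transition:
    state: int
    reward: float = 0.0

Transition(state=1, reward=0.5)    # <-- fine
# Transition(state='oops')         # <-- would raise BeartypeCallHintParamViolation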

Unlike static type-checkers like

mypy
or
pyright
, @beartype type-checks at runtime.
@dataclasses.dataclass
is mostly just runtime magic driven by dark dunder methods. Static type-checkers hate runtime magic driven by dark dunder methods. @beartype, though? @beartype looooooooves black magic. In fact, @beartype itself is entirely black magic "under the hood." I don't even know how @beartype works anymore. I just take it on faith that it does – mostly because unit tests have yet to disagree.

Can I then directly decorate TransitionClass's init with beartype?

Sure! Why not? Let's give that a try. When everything blows up, I'll be here to cry myself to sleep and then eventually do something about it.

@beartype currently ignores the PEP 681-compliant

@typing.dataclass_transform()
decorator. @beartype currently ignores as much as it can. We've never needed to detect or do anything about
@typing.dataclass_transform()
before, but... who knows? If your use case blows up, @beartype might need to start doing something about that.

Let's rub our square chins thoughtfully while doing nothing.

<sup>man with bat helmet teaches @beartype how to do it</sup>

leycec2 months agobeartype/beartype

YAAAAAAAAAAY! Thanks so much for deciphering our unabashed failures, @rudimichal. It... was totally our fault after all. @beartype has fallen down and cannot get up.

Now that you've done all the heavy and painful lifting that cost you your very will to live, this should be trivial. I'll stumble onwards and upwards from here by:

  1. Swapping out that clearly awful
    DICT_EMPTY
    implementation with @beartype's own empty
    FrozenDict
    implementation: e.g.,
# Everywhere we do something awful like this...
type_scope = DICT_EMPTY

# ...we need to instead do something glorious like this!
from beartype._data.kind.datakinddict import FROZEN_DICT_EMPTY
type_scope = FROZEN_DICT_EMPTY
  2. Probably removing
    DICT_EMPTY
    entirely from the codebase. It's just too risky to have a xenomorph face-hugger like that hanging around.

Looks like there's gonna be a @beartype

0.20.0rc2
after all. Sadness becomes me. 😭

leycec3 months agobeartype/plum

Probably resolved by @beartype

0.20.0rc0
. Since I tested nothing, please do reopen this if still broken. Likewise, since @beartype
0.20.0rc0
is only a release candidate, interested users will need to manually update @beartype as follows: <sup>...for the moment</sup>

pip install --upgrade --pre beartype     # <-- bugger the bugs

We salute you who are about to dispatch iterables, Plum users. 💪

leycec3 months agobeartype/beartype

beartype.typing.Protocol works!

...lolbro. Super-weird, but I'm delighted. I barely understand how any of this insane

Protocol
machinery works myself. I'm getting the feeling that CPython devs are no wiser.

We may be on the worst timeline, but at least we'll always have Zelda. :smile:

leycec3 months agobeartype/beartype

...heh. Standardization intensifies.

Is that actually allowed?

By @beartype? Sure! Absolutely. Of course. @beartype just makes up everything as it goes, anyway. @beartype has supported absolute stringified references for... Jesus. Has it really been a decade now? It really has. It's been a decade. What have I been doing with my life?

Havin' fun. That's what! :sunglasses:

...but PEP 484 and the current version in the spec don't mention it as a possibility.

It's PEP 563, actually. Consider this example from PEP 563:

from __future__ import annotations
import typing

if typing.TYPE_CHECKING:
    import expensive_mod

def a_func(arg: expensive_mod.SomeClass) -> None:
    a_var: expensive_mod.SomeClass = arg
    ...

typing.TYPE_CHECKING
unconditionally evaluates to
False
at runtime. Likewise,
from __future__ import annotations
activates PEP 563 which then stringifies the type hint
expensive_mod.SomeClass
to
"expensive_mod.SomeClass"
. Altogether, this means that CPython reduces the above to the following at runtime:

import typing

def a_func(arg: 'expensive_mod.SomeClass') -> None:
    a_var: expensive_mod.SomeClass = arg
    ...

This then implies that

'expensive_mod.SomeClass'
is a valid forward reference to the type
SomeClass
defined by the top-level package or module
expensive_mod
.

From @beartype's perspective,

typing.TYPE_CHECKING
and
from __future__ import annotations
basically don't exist. All that exists are stringified forward references. Ergo, @beartype transparently supports stringified forward references – because it has to. @beartype has no choice or say in the matter.

From @beartype's perspective, there's no need to explicitly import anything. Whether users import things in

if typing.TYPE_CHECKING:
blocks is irrelevant to @beartype, because @beartype can't see those things. They're hidden from view. They might as well not exist in the first place.

Given all of the above, why would there be any need to explicitly import anything in an

if typing.TYPE_CHECKING:
block, anyway? Hidden imports are just useless boilerplate. Absolute stringified references already convey all the metadata that @beartype and other runtime type-checkers need to unambiguously resolve forward references to external types.

It's all kinda silly, honestly. Boilerplate is bad. Python isn't Java. We're supposed to be the agile, anti-bureaucratic guys who wear flower-print Hawaiian t-shirts in the room. Nobody wears suits here.

I think mypy might be smart enough to notice that the submodule is not necessarily imported and complain...

I consider that stupidity rather than smart. Static type-checkers just waste everyone's time with this sort of pablum. There isn't really any error or issue with absolute stringified references. Static type-checkers just made up a mythical problem that doesn't actually exist.

Oh, well. @beartype continues shrugging.

Another solution is to keep the

TYPE_CHECKING
import, but then define a
__getattr__()
module hook to do the import, for runtime checkers to use.

Clever, but also awful. Nobody would ever do that. That's the problem, right? If we consider this sort of kludge deeply and earnestly, we come to the unmistakable conclusion that nobody should ever do that.

Seriously. That's maximum bureaucracy, boilerplate, and DRY violations. Can you imagine literally every single codebase that uses the

typing.TYPE_CHECKING
reference antipattern having to also now define one
__getattr__()
function per module just to satisfy... what? @beartype? Nobody cares about @beartype. Most third-party packages that don't actively use @beartype are tired of their userbase complaining about @beartype.

The Ultimate Solution Emerges, Ultimately

Ultimately, the solution is for @beartype to automate all of this on behalf of users. When a user activates a

beartype.claw
import hook, an abstract syntax tree (AST) transformation in @beartype should:

  • Detect the
    typing.TYPE_CHECKING
    reference antipattern.
  • Issue one non-fatal warning for each
    if typing.TYPE_CHECKING:
    block containing one or more hidden imports.
  • Automatically "repair" the antipattern by injecting one global-scoped @beartype forward reference per hidden import into the AST of the current module.

That is, @beartype will transform:

if typing.TYPE_CHECKING:
    from A.B import C, D

...into something resembling this:

from beartype._check.forward.reference.fwdref import BeartypeForwardReference
C = BeartypeForwardReference('A.B', 'C')
D = BeartypeForwardReference('A.B', 'D')

In the above example,

C
and
D
are now objects that transparently proxy
isinstance()
and
issubclass()
checks against those external types defined elsewhere.

Fun stuff. I vomit. :vomiting_face:

leycec4 months agobeartype/beartype

Ho, ho, ho. Merry Christmas to all and to all a good QA! It's wonderful that something finally broke in @beartype for somebody. This repository has been so depressingly quiet lately. I miss Them Good Ol' Days™ when @beartype used to break every minute and I got to talk to everybody for issues on end. :sob:

perfect_software_that_actually_works = lonely_software_devbro

I at least want to have a lil chat with @leycec :))

This is the nicest thing anyone's ever said to me. :smile:

Wait. There was an actual issue here, wasn't there? Let's just take a looksie here. Aaaaaaand... @TeamSpen210's right about everything, like he always is. :laughing:

Second Way Really Sucks

In particular:

@lru_cache
@staticmethod
def _heavy_func(x):

Right. Python hates that, unfortunately. As @TeamSpen210 suggests,

@staticmethod
is a decorator that creates a full-blown descriptor object. Python really wants that to be the outermost decorator. Which leaves us with...

First Way Kinda Sucks, Too

@staticmethod
@lru_cache
def _heavy_func(x):

@beartype hates that, huh? Technically, @beartype and @TeamSpen210 aren't wrong.

@lru_cache
is a decorator that creates a full-blown pseudo-callable (i.e., an object that wouldn't normally be callable except that it defines the
__call__()
dunder method, so it is callable – technically).
@staticmethod
, on the other hand, probably expects a method. It's in the name, right? Static method.

Technicalities Suck, Too (So, Everything Sucks)

That said, technicalities suck. It'd be nice if this use case "just worked out of the box" even if it doesn't particularly make sense when you squint at the low-level details. Nobody really cares about the low-level details, anyway. They just want this madness called "Python" to just work for once.

This means this isn't quite a bug, really. More of a feature request, right? Sadly, this might take me a bit of time to hack on. I'm currently hip-deep in finally implementing deep type-checking support for

Iterable[...]
type hints, which is a super-high priority ticket.

Hacks for Great Glory of Turkey

@TeamSpen210's erudite hack of just disabling type-checking for

_heavy_func()
is certainly one way of circumventing this issue: e.g.,

from beartype import beartype, BeartypeConf, BeartypeStrategy
from functools import lru_cache

class A:
  def some_func(self, x):
    return self._heavy_func(x)

  # Should work. Does it? No idea, bro. May Santa bless this code.
  @beartype(conf=BeartypeConf(strategy=BeartypeStrategy.O0))
  @staticmethod
  @lru_cache
  def _heavy_func(x):
     # do some work
     return 42

That works – sorta. But as you astutely recognize, @ilyapoz (aka his majesty, aka unkulunkulu, aka The Man with the Magnificent Hat), disabling type-checking sucks.

Can you get what you want without disabling type-checking? Sure! No problem. You just have to do what @leycec does. Ignore

@lru_cache
and spin up your own caching decorator. This is what @beartype does. Why? Because
@lru_cache
is slow, which defeats the whole point of caching and memoizing stuff in the first place.

Behold: A Caching Decorator So Fast It Could Blind You

from collections.abc import Callable
from functools import wraps

SENTINEL = object()
'''
Sentinel object of arbitrary value.

This object is internally leveraged by various utility functions to identify
erroneous and edge-case input (e.g., iterables of insufficient length).
'''

def callable_cached[T: Callable](func: T) -> T:
    '''
    **Memoize** (i.e., efficiently re-raise all exceptions previously raised by
    the decorated callable when passed the same parameters (i.e., parameters
    that evaluate as equals) as a prior call to that callable if any *or* return
    all values previously returned by that callable otherwise rather than
    inefficiently recalling that callable) the passed callable.

    Specifically, this decorator (in order):

    #. Creates:

       * A local dictionary mapping parameters passed to this callable with the
         values returned by this callable when passed those parameters.
       * A local dictionary mapping parameters passed to this callable with the
         exceptions raised by this callable when passed those parameters.

    #. Creates and returns a closure transparently wrapping this callable with
       memoization. Specifically, this wrapper (in order):

       #. Tests whether this callable has already been called at least once
          with the passed parameters by lookup of those parameters in these
          dictionaries.
       #. If this callable previously raised an exception when passed these
          parameters, this wrapper re-raises the same exception.
       #. Else if this callable returned a value when passed these parameters,
          this wrapper re-returns the same value.
       #. Else, this wrapper:

          #. Calls that callable with those parameters.
          #. If that call raised an exception:

             #. Caches that exception with those parameters in that dictionary.
             #. Raises that exception.

          #. Else:

             #. Caches the value returned by that call with those parameters in
                that dictionary.
             #. Returns that value.

    Caveats
    -------
    **The decorated callable must accept no keyword parameters.** While this
    decorator previously memoized keyword parameters, doing so incurred
    significant performance penalties defeating the purpose of caching. This
    decorator now intentionally memoizes *only* positional parameters.

    **The decorated callable must accept no variadic positional parameters.**
    While memoizing variadic parameters would of course be feasible, this
    decorator has yet to implement support for doing so.

    **The decorated callable should not be a property method** (i.e., either a
    property getter, setter, or deleter subsequently decorated by the
    :class:`property` decorator). Technically, this decorator *can* be used to
    memoize property methods; pragmatically, doing so would be sufficiently
    inefficient as to defeat the intention of memoizing in the first place.

    Efficiency
    ----------
    For efficiency, consider calling the decorated callable with only:

    * **Hashable** (i.e., immutable) arguments. While technically supported,
      every call to the decorated callable passed one or more unhashable
      arguments (e.g., mutable containers like lists and dictionaries) will
      silently *not* be memoized. Equivalently, only calls passed only hashable
      arguments will be memoized. This flexibility enables decorated callables
      to accept unhashable PEP-compliant type hints. Although *all*
      PEP-noncompliant and *most* PEP-compliant type hints are hashable, some
      sadly are not. These include:

      * :pep:`585`-compliant type hints subscripted by one or more unhashable
        objects (e.g., ``collections.abc.Callable[[], str]``, the `PEP
        585`_-compliant type hint annotating piths accepting callables
        accepting no parameters and returning strings).
      * :pep:`586`-compliant type hints subscripted by an unhashable object
        (e.g., ``typing.Literal[[]]``, a literal empty list).
      * :pep:`593`-compliant type hints subscripted by one or more unhashable
        objects (e.g., ``typing.Annotated[typing.Any, []]``, the
        :attr:`typing.Any` singleton annotated by an empty list).

    **This decorator is intentionally not implemented in terms of the stdlib**
    :func:`functools.lru_cache` **decorator,** as that decorator is inefficient
    in the special case of unbounded caching with ``maxsize=None``. Why? Because
    that decorator insists on unconditionally recording irrelevant statistics
    like cache misses and hits. While bounding the number of cached values is
    advisable in the general case (e.g., to avoid exhausting memory merely for
    optional caching), parameters and returns cached by this package are
    sufficiently small in size to render such bounding irrelevant.

    Consider the
    :func:`beartype._util.hint.pep.utilpeptest.is_hint_pep_type_typing`
    function, for example. Each call to that function only accepts a single
    class and returns a boolean. Under conservative assumptions of 4 bytes of
    storage per class reference and 4 bytes of storage per boolean reference,
    each call to that function requires caching at most 8 bytes of storage.
    Again, under conservative assumptions of at most 1024 unique type
    annotations for the average downstream consumer, memoizing that function in
    full requires at most 1024 * 8 == 8192 bytes or ~8KB of storage. Clearly,
    8KB of overhead is sufficiently negligible to obviate any space concerns
    that would warrant an LRU cache in the first place.

    Parameters
    ----------
    func : T
        Callable to be memoized.

    Returns
    -------
    T
        Closure wrapping this callable with memoization.

    Raises
    ------
    ValueError
        If this callable accepts a variadic positional parameter (e.g.,
        ``*args``).
    '''
    assert callable(func), f'{repr(func)} not callable.'

    # Dictionary mapping a tuple of all flattened parameters passed to each
    # prior call of the decorated callable with the value returned by that call
    # if any (i.e., if that call did *NOT* raise an exception).
    args_flat_to_return_value: dict[tuple, object] = {}

    # get() method of this dictionary, localized for efficiency.
    args_flat_to_return_value_get = args_flat_to_return_value.get

    # Dictionary mapping a tuple of all flattened parameters passed to each
    # prior call of the decorated callable with the exception raised by that
    # call if any (i.e., if that call raised an exception).
    args_flat_to_exception: dict[tuple, Exception] = {}

    # get() method of this dictionary, localized for efficiency.
    args_flat_to_exception_get = args_flat_to_exception.get

    @wraps(func)
    def _callable_cached(*args):
        f'''
        Memoized variant of the {func.__name__}() callable.

        See Also
        --------
        :func:`.callable_cached`
            Further details.
        '''

        # Object representing all passed positional arguments to be used as the
        # key of various memoized dictionaries, defined as either...
        args_flat = (
            # If passed only one positional argument, minimize space consumption
            # by flattening this tuple of only that argument into that argument.
            # Since tuple items are necessarily hashable, this argument is
            # necessarily hashable and thus permissible as a dictionary key;
            args[0]
            if len(args) == 1 else
            # Else, one or more positional arguments are passed. In this case,
            # reuse this tuple as is.
            args
        )

        # Attempt to...
        try:
            # Exception raised by a prior call to the decorated callable when
            # passed these parameters *OR* the sentinel placeholder otherwise
            # (i.e., if this callable either has yet to be called with these
            # parameters *OR* has but failed to raise an exception).
            #
            # Note that:
            # * This statement raises a "TypeError" exception if any item of
            #   this flattened tuple is unhashable.
            # * A sentinel placeholder (e.g., "SENTINEL") is *NOT* needed here.
            #   The values of the "args_flat_to_exception" dictionary are
            #   guaranteed to *ALL* be exceptions. Since "None" is *NOT* an
            #   exception, disambiguation between "None" and valid dictionary
            #   values is *NOT* needed here. Although a sentinel placeholder
            #   could still be employed, doing so would slightly reduce
            #   efficiency for *NO* real-world gain.
            exception = args_flat_to_exception_get(args_flat)

            # If this callable previously raised an exception when called with
            # these parameters, re-raise the same exception.
            if exception:
                raise exception  # pyright: ignore
            # Else, this callable either has yet to be called with these
            # parameters *OR* has but failed to raise an exception.

            # Value returned by a prior call to the decorated callable when
            # passed these parameters *OR* a sentinel placeholder otherwise
            # (i.e., if this callable has yet to be passed these parameters).
            return_value = args_flat_to_return_value_get(args_flat, SENTINEL)

            # If this callable has already been called with these parameters,
            # return the value returned by that prior call.
            if return_value is not SENTINEL:
                return return_value
            # Else, this callable has yet to be called with these parameters.

            # Attempt to...
            try:
                # Call this parameter with these parameters and cache the value
                # returned by this call to these parameters.
                return_value = args_flat_to_return_value[args_flat] = func(
                    *args)
            # If this call raised an exception...
            except Exception as exception:
                # Cache this exception to these parameters.
                args_flat_to_exception[args_flat] = exception

                # Re-raise this exception.
                raise exception
        # If one or more objects either passed to *OR* returned from this call
        # are unhashable, perform this call as is *WITHOUT* memoization. While
        # non-ideal, stability is better than raising a fatal exception.
        except TypeError:
            #FIXME: If testing, emit a non-fatal warning or possibly even raise
            #a fatal exception. In either case, we want our test suite to notify
            #us about this.
            return func(*args)

        # Return this value.
        return return_value

    # Return this wrapper.
    return _callable_cached  # type: ignore[return-value]

It's trivial, I swear. The 5,000 lines of documentation just make it look non-trivial. Feel free to:

  • Strip out the docstrings and comments.
  • Relicense that under any license you like.

Given that, the following should now work for your use case right now. You could wait for the next @beartype release – or you could just do this now:

from beartype import beartype

class A:
  def some_func(self, x):
    return self._heavy_func(x)

  # Should work. Does it? No idea, bro. May Santa bless this code.
  @beartype
  @staticmethod
  @callable_cached  # <-- no @lru_cache, no cry
  def _heavy_func(x):
     # do some work
     return 42

:partying_face: :santa: :christmas_tree:

leycecover 1 year agogoogle/jaxtyping

Woooooooah. Indeed, I see terrifying – yet ultimately justifiable – shenanigans that sadden me. In the absence of a standardized plugin system for runtime-static type-checkers generically supported by both @beartype and

typeguard
, though, what else you gonna do? Right? So you force the issue by just doing everything yourself. Still, I feel a gnawing fear. That's some pretty gnarly AST munging just to circumvent the lack of a sane plugin API in other people's code.

@beartype and

typeguard
alike are clearly failing
jaxtyping
here. But... it's not just
jaxtyping
we're failing. It's any third-party package publishing custom type hint factories like
jaxtyping.Float[...]
, really.

My only wish for 2024 is to stop failing

jaxtyping
. That, and for our Maine Coons to finally cough up their pernicious hairballs. Just give it up already, cats! :cat2:

Wow. You're... stunning!

I'm working on a type system for a machine learning code base

Yes! Yes! So much "Yes!" ML-AI-LLM type systems is, indeed, the way. Let us tame this horrifying beast of Wild West-style bugginess. Carnegie Mellon getting its money's worth right here.

Thanks so much for that deep dive into @patrick-kidger's hot new

jaxtyping
generic array API, too. As always,
jaxtyping
is leagues ahead of the competition.
jaxtyping
is so far out ahead, it's just a blur on the horizon. Also:

... especially when weird proprietary data gets involved

😄 😆 🤣 😅 😢 😭

Now, if only @beartype would do something and help

jaxtyping
for once.

anything is an improvement over the status quo (nothing is typed, nothing is checked).

You... are... so... funny. It's all true, of course. That's why it's funny. But it's also sad, which also tends to be the case with funny stuff. You'd think the status quo would have long since moved past Ye Old Duck Typing to something that actually works. Yet here we are.

Type-checking: still just for the outlier devs after all these decades, huh? 😩

They also have a nice property that they can be declared independently and used interchangeably like protocols, which helps when multiple projects want to share interfaces, but don't want to agree on a shared dependency.

That's... absolutely true. Honestly, I never thought of that. That's a compelling argument in favour of the permissive API style that

TypedDict
promotes.
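
For instance, consider this tiny sketch with made-up PointA and PointB declarations. Two projects declare structurally identical TypedDict interfaces without sharing a dependency, and static type-checkers happily treat them as interchangeable:

from typing import TypedDict

class PointA(TypedDict):   # declared by "project A"
    x: float
    y: float

class PointB(TypedDict):   # independently declared by "project B"
    x: float
    y: float

def consume(point: PointB) -> float:
    return point['x'] + point['y']

point: PointA = {'x': 1.0, 'y': 2.0}
consume(point)  # <-- structurally compatible: no shared dependency required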

By popular demand, I've been deprioritizing deep type-checking of

TypedDict
- and
NamedTuple
-based data structures in favour of
@dataclass
-based structures. But you're right.
TypedDict
has demonstrable value, too. Thanks for convincing me to reprioritize
TypedDict
. I've pushed that higher in my mental stack of feature requests. Let's hope this fragile stack of Jenga blocks doesn't collapse.

I've internally detailed a solid plan to support generic

TypedDict
and
NamedTuple
type hints. In theory, this should only take a week or so. In practice, we'll probably still be here years later.

@beartype: if you know, you know. 😉

leycec11 days agobeartype/beartype

...woah. AI slop. I actually love AI slop. The autism in me acknowledges the autism in AI. I'm delighted and deeply honoured that AI even knows that @beartype exists. That's something I never thought I'd see. Thank you, AI. This robot emoji is for you. 🤖

Sadly, AI doesn't quite have the best advice here. Option:

  1. Moving type-checking to a wrapper function isn't the worst possible thing that you can do (unlike Option 2.), but it's not exactly the best possible thing that you can do either. Most users implicitly type-check everything with
    beartype.claw.beartype_this_package()
    . Technically, you can make that work with Option 1. – but to do so you need to manually do even more stuff like importing
    @typing.no_type_check
    and then decorating your inner private
    _map_records_impl()
    function with that. It works (sorta), but it's crude, clumsy, and inefficient. 🤭
  2. Replacing your type hints with
    typing.Any
    is the worst possible thing that you can do, because that destroys your type hints. What's the point, right? You'd might as well just delete your type hints entirely and be done with it. So... that's not great. 🤮

Thankfully, the

typing.TYPE_CHECKING
approach I suggested above is by far and away the best option:

from beartype import beartype
from typing import TYPE_CHECKING, Sequence, TypeVar
import ray
from ray._private.worker import RemoteFunction0  # <-- this is the 🦆 

T = TypeVar("T")
R = TypeVar("R)

# If this module is currently being statically type-checked,
# define type hints suitable for static type-checking. Yowza.
if TYPE_CHECKING:
    RemoteFunction0RT = RemoteFunction0[R, T]
# Else, this module is currently being runtime type-checked.
# Define type hints suitable for runtime type-checking! Yikes.
else:
    # Ignore this type hint at runtime, because the third-party
    # "ray._private.worker.RemoteFunction0" type hint factory
    # fails to support runtime type-checking. YIIIIIIIIIIIKES.
    RemoteFunction0RT = object  # <-- lolbro


@beartype
def map_records(
    fn: RemoteFunction0RT,
    records: Sequence[T]
) -> R:
    ...


@ray.remote
def mapper(item: int) -> str:
    return str(item)


# this type checks because @ray.remote is typed to return RemoteFunction0
# but fails at runtime because `beartype.infer_hint(mapper)` is actually just Callable
# (or for other similar reasons)
string_items = map_records(mapper, [0, 1, 2, 3])

No need for inefficient and awkward wrapper functions. No need for type hint destruction via

typing.Any
. It just works in the standard way while preserving static type-checking.

@beartype: 'cause AI don't know it's paw from its paw-paw. 🥝 <sup><-- this is not a paw, AI</sup>

leycec19 days agobeartype/beartype

Ho, ho. It's @beartype's favourite computer vision roboticizer! And... he's back with a horrifying new issue I can't do much about. I'm not quite certain what "the injected request/evt parameters get stolen away by the @beartype decorator" means, but I am quite certain that Gradio is behaving suspiciously. Sadly...

is there some sort of "exclude" parameters that I could use with from BeartypeConf?

Not yet. We all feel sadness about this. All there is is a recent feature request from two weeks ago where somebody else also desperately begs for an

option. Clearly, I should make that happen. Clearly, I am lazy. These two goals are diametrically opposed. Yet...

What the @beartype people want, the @beartype people get. Until the people get what they deserve, the best that you can do is Poor Man's Ignore Parameter Technique™:

from typing import TYPE_CHECKING

# If we're statically type-checking under mypy or pyright, always tell the truth.
if TYPE_CHECKING:
    request_hint = gr.Request
    evt_hint = SelectionChange
# Else, we're runtime type-checking under @beartype. In this case, LIE, LIE, LIE!!!
# Specifically, ignore all problematic parameters. @beartype doesn't see what
# @beartype doesn't want to see.
else:
    request_hint = object
    evt_hint = object

def register_keypoint(
    active_recording_id: str,
    current_timeline: str,
    current_time: float,
    request: request_hint,
    evt: evt_hint,
):
    ...

If that still doesn't work, I'm afraid you have no recourse but to enter a dangerous hibernal dream state in cryogenic storage for fifteen years. You will be known globally as Sleeping Pablo Vela Beauty. When you wake up, perhaps things will be better. 😅

leycec30 days agobeartype/beartype

Ho, ho... ho.

nptyping
is no longer actively maintained. Ergo, this feature request no longer has anything to live for. Let us quietly close this issue with sad (yet firm) apologies. Weep for the feature that could have been, @beartype users! 😭

leycecabout 1 month agopython/mypy

@davidfstr: @beartype is now down to only 31 air-quoted "errors" against this PR. Hallelujah! We give praise to Guido. That said, more than a few of these "errors" appear to be false positives in this PR's current implementation of

typing.TypeForm
. Notably, this PR appears to be:

  • Failing to match

    typing.Any
    as a valid
    typing.TypeForm
    .
    Any
    had better be a valid type hint. Right? I will eat my own mustache if
    Any
    isn't a valid type hint. Everywhere @beartype attempts in vain to return
    Any
    from a function annotated as returning
    TypeForm[Any]
    , this PR loudly complains that:

    beartype/_check/convert/_convcoerce.py:191: error: Incompatible return value type
    (got "<typing special form>", expected "TypeForm[Any]")  [return-value]

I'll report back if I find any other stragglers. Altogether and all in all, though, I'm exceptionally impressed with this madness. Thanks so much for all your valiant efforts here! Together, we'll build a better type-checker. And it shall be known as...

mybeartypepy.
:smile:

leycecabout 1 month agobeartype/beartype

...heh. So, it turns out this is technically a duplicate of feature request #128. Resolving this issue effectively means implementing full-blown support for recursive type hints. That's not a bad thing, though. That's a great thing! We've been meaning to do this for literally years, but never did because this feature request scares me. I am not afraid to admit the truth.

It's time to face my fears. I have no choice. @beartype is now blowing up on recursive type hints. Let's rock this.

<sup>unsure exactly what is happening here... but @leycec likes it</sup>

leycec2 months agobeartype/beartype

lol. Truly, IPython is playing with madness:

...the root cause is that

co_firstlineno
is determined incorrectly, while most other properties of the function object are correct. Incorrectness of
co_firstlineno
actually leads to the whole
__code__
being incorrect.

I tracked down the issue to the level of the IPython magic: everything breaks if we attempt to add any decorator to the

ast
transformations list. The most interesting part is that adding even the pointer-copying decorator fails, even if wrapped with
functools.wraps
or
wrapt
.

The body blows just keep coming.

<sup>ipython devs: "because we dgaf"</sup>

leycec2 months agobeartype/beartype

OMG! Thanks so much for the deep-dive on PEP 695. I truly thought @beartype had that nailed down to the floor at this point. Clearly, @beartype's work is still cut out for it. PEP 695 is like a summer camp slasher flick: it's wearing a hockey mask, it's holding an axe, it's breathing heavily, and it just keeps on coming. 😷

I'll promptly attend to this someday. Tragically, this delays our upcoming @beartype

0.20.0rc1
release candidate by yet another lifetime. Guess I'll have to push out @beartype
0.20.0rc1
in my next next reincarnation, huh?

leycecover 1 year agobeartype/beartype

with maybe a little bit of snow?

...we got nuthin', man! A whole boatload of nuthin'. What is this, Jamaica? Because this feels like Jamaica.

Hey just before doing this release, did you have time to check my comment on #311?

Gah! So sorry about that. Your awesome response somehow dropped right off my radar. Clearly, I need a new radar. The one in our basement appears to be broken. :sob:

I totally see where you're coming from there. Thankfully, Python ≥ 3.12 + @beartype 0.17.0 does actually provide a standardized solution for what you want to do: PEP 695-compliant

type
aliases.
type
aliases have a simple name (which @beartype will present to users in exception messages) but evaluate to an arbitrarily complex type hint (which @beartype will use behind-the-scenes to perform actual type-checking).

To quote the official article on

type
aliases:

Type aliases are useful for simplifying complex type signatures. For example:

from collections.abc import Sequence

type ConnectionOptions = dict[str, str]
type Address = tuple[str, int]
type Server = tuple[Address, ConnectionOptions]

def broadcast_message(message: str, servers: Sequence[Server]) -> None:
    ...

# The static type checker will treat the previous type signature as
# being exactly equivalent to this one.
def broadcast_message(
        message: str,
        servers: Sequence[tuple[tuple[str, int], dict[str, str]]]) -> None:
    ...

For your use case of the complex

numpy.typing.ArrayLike
union, you could simply define a
type
alias whose name is
ArrayLike
and whose value is
numpy.typing.ArrayLike
. In fact, NumPy itself should do just that to simplify everyone's lives already. Right? Release the code hounds, Watson! :poodle:

from numpy.typing import ArrayLike as np_ArrayLike

# Simple type alias of a complex type hint: arise, you fiend! Arise to glory!
type ArrayLike = np_ArrayLike

What's the Catch, Bub?

Two catches:

  • This requires Python ≥ 3.12. When I say "This requires Python ≥ 3.12," I mean really requires. If you even try to define a
    type
    alias in Python < 3.12, you'll get a non-human-readable
    SyntaxError
    – even if you try to hide that
    type
    alias from older Python versions with
    if
    conditionals like
    if version_info < (3, 12):
    . Hiding doesn't work. You either need to:
    • Isolate all
      type
      aliases to a unique submodule imported only under Python ≥ 3.12.
    • Dynamically declare
      type
      aliases with the
      exec()
      builtin. This is the lazy way. Thus, this is what we do below.
  • This requires @beartype ≥ 0.17.0, which has yet to be officially released. <sup>...heh.</sup>

You are now thinking: "

type
aliases seriously suck, dude. Seriously." Fear not, @leycec is here. Allow me to now dispel all your fears with insane code that should make you cringe. I say, "Embrace that cringe."

Let's gooooooooooooo:

from numpy.typing import ArrayLike as np_ArrayLike
from sys import version_info
from typing import TYPE_CHECKING

# If we're being statically type-checked, declare an old-school type alias. (A
# bare "type" statement here would *still* be a compile-time "SyntaxError" under
# Python < 3.12, even though this block never executes at runtime.)
if TYPE_CHECKING:
    from typing import TypeAlias
    ArrayLike: TypeAlias = np_ArrayLike
# If the current Python interpreter ≥ 3.12, dynamically declare a type alias to
# avoid "SyntaxError" complaints from older Python interpreters.
elif version_info >= (3, 12):
    exec('type ArrayLike = np_ArrayLike')  # <-- don't ask. don't tell.
# Else, the current Python interpreter sucks. In this case, define a stupid
# obsolete old-school type alias with deprecated "typing" attributes. Just. Do. It.
else:
    from typing import TypeAlias  # <-- deprecated, but who cares
    ArrayLike: TypeAlias = np_ArrayLike

...heh. What could be simpler, huh? :face_with_spiral_eyes:

leycec3 months agobeartype/beartype

Ho, ho, ho. Merry 2025, fellow Nintendude @rbroderi! With the power of friendship, Gannon is going down this time. <sup>...yeah right. gannon'll just be back in five minutes. we all know it. no real point in even defeating that guy, is there? dude has more lives than a Maine Coon cat hopped up on catnip.</sup>

Did you perhaps mean

die_if_unbearable()
rather than
is_bearable()
in this example? The former raises exceptions on type-checking violations; the latter only returns boolean
False
on type-checking violations.
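
The difference in miniature:

from beartype.door import die_if_unbearable, is_bearable

is_bearable('not an int', int)          # <-- False: merely reports the violation
# die_if_unbearable('not an int', int)  # <-- would raise a type-checking violation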

Also, there's so much insanity happening in this example that I'm at a genuine loss for words. Actually, that's a lie. I'm still typing hyperbolic verbosity like a Maine Coon cat hopped up on catnip. Still, I kinda have no idea what's even going on in that example.

What's

win32com.client.Dispatch("Word.Application")
, anyway? Is that really just an instance of
_Word
? No idea.

Oh – and

typing.cast()
is basically pointless. From the little that I understand about all of this insanity, it's faster, easier, and more readable to just use PEP 526-style variable assignment annotations: e.g.,

# Pretty sure this is the same thing... just faster and easier. 
word: _Word = win32com.client.Dispatch("Word.Application")

This also has the benefit of being implicitly type-checked by @beartype – assuming you use

beartype.claw
import hooks, anyway. And now both of our faces are exhausted. What a way to start 2025. :weary:

leycecabout 1 month agobeartype/beartype

Wait - so you mean that actually Beartype parses traditional string forward references...

...heh. Of course, silly! <sup>i say "silly" in the cheekiest way possible. no shade intended. @beartype loves its wonderful userbase! this especially includes aspiring, idealistic, young PhD candidates such as yourself who are on the cusp of positively improving this troubled world.</sup>

@beartype fully parses arbitrarily complex stringified type hints, including modern QA madness under Python ≥ 3.12 like:

from __future__ import annotations  # <-- enable PEP 563, which implicitly stringifies everything

def oh_my_gods[T](  # <-- enable PEP 695, which implicitly instantiates "T" into a "TypeVar"
   this_is_horrible: T | list[T] | MyUndefinedType) -> (  # <-- PEP 695 intensifies
   T | set[T] | MyOtherUndefinedType):  # <-- more PEP 695, more horror
   ...

The above is effectively equivalent to this older syntax in older Python versions:

from typing import TypeVar

T = TypeVar('T')

def oh_my_gods(
   this_is_horrible: 'T | list[T] | MyUndefinedType') -> (
   'T | set[T] | MyOtherUndefinedType'):
   ...

@beartype then has to unparse those stringified type hints back into their constituent components. Crucially, this includes replacing the strings

'MyUndefinedType'
and
'MyOtherUndefinedType'
with actual usable objects known as forward reference proxies (i.e., objects proxying those undefined types by dynamically deferring the resolution of those types with just-in-time (JIT) implementations of the
__instancecheck__()
and
__subclasscheck__()
dunder methods underlying the
isinstance()
and
issubclass()
builtins).
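If that sounds like black magic, here's a toy sketch of the general idea – emphatically not @beartype's actual internals, and every name below is invented for illustration:

import sys

class _ForwardRefProxyMeta(type):
    # Resolve the referenced type lazily – only when somebody actually checks.
    def _referent(cls):
        return getattr(sys.modules[cls.__ref_module__], cls.__ref_name__)

    def __instancecheck__(cls, obj):
        return isinstance(obj, cls._referent())

    def __subclasscheck__(cls, subcls):
        return issubclass(subcls, cls._referent())

def make_forward_ref_proxy(name: str, module: str) -> type:
    # Create a proxy type standing in for the not-yet-defined type "name".
    return _ForwardRefProxyMeta(name, (), {
        '__ref_name__': name, '__ref_module__': module})

# The proxy can be created before "MyUndefinedType" exists...
proxy = make_forward_ref_proxy('MyUndefinedType', '__main__')

class MyUndefinedType: ...  # ...which only needs to exist by the time we check.
print(isinstance(MyUndefinedType(), proxy))  # True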

Basically, @beartype is crazy... in the best way possible. If it's a PEP standard, @beartype almost certainly supports it. Okay, probably supports. Okay, sometimes supports. 😅

Thanks again for playing along with @beartype. I'd love to document all of the eldritch darkness that @beartype is doing these days... but then I'd never get anything else done. I keep hoping somebody else will want to submit a pull request (PR) improving our documentation... but nobody wants to do that, either. Which is understandable. Nuthin' worse than writing docos on a free weekend when you could just be playing video games all night instead.

<sup>a typical saturday night at the @leycec household descends into madness</sup>

leycecabout 1 month agobeartype/beartype

Sorry for the shameful delay! I am bad. You are all wonderful. I was busy beating TensorDict into submission, which I have now done to my glib satisfaction. Let's get dirty with dataclasses, everyone.

Interestingly, new polling data just in says everyone wants me to turn @beartype into a general-purpose competitor to Pydantic by improving our type-checking of

@dataclasses.dataclass
fields. Thus, this is probably what @beartype 0.21.0 is going to be all about. Probably. 😅

But... dataclasses frighten me. At least we have @TeamSpen210:

Dataclasses do have something for introspection - you can call

dataclasses.fields()
to get a tuple of field objects, telling you what fields were defined, their types, default values etc. With that you could probably wrap
__init__
, after it sets the attributes go and do each check.

Yes! So much yes. I did not even know about this

dataclasses.fields()
thing. Now that I know about that thing, I'll probably avoid calling that thing (because it is slow) and instead violate privacy encapsulation by directly peeking inside dataclasses. Mu-hah... ah. Predictably, violating privacy encapsulation will blow up in my face. So be it! @beartype users are willing to assume that risk.

It also looks like @beartype probably wants to type-check fields inside a handrolled

__post_init__()
dunder method. Honestly, dataclasses are madness. They technically allow users to override the
__init__()
and/or
__post_init__()
methods. @beartype could probably wrap either with type-checking. Wrapping
__post_init__()
just feels a bit cleaner and more in the ephemeral spirit of the
@dataclasses.dataclass
API.

WAIT...

The Dataclass API Sure Is Somethin', Ain't It?

It looks like

__post_init__()
won't work at all for @beartype. The
@dataclass
decorator dynamically generates the
__init__()
method at decoration time. Crucially, this includes introspecting whether the currently decorated class declared a
__post_init__()
method or not. By the later time that the
@beartype
decorator has arrived,
@dataclass
has already generated the
__init__()
method.
@beartype
cannot just monkey-patch a new
__post_init__()
method in. Well,
@beartype
can – but the existing
__init__
implementation will just ignore that new
__post_init__()
.

So... yet again, I'm pretty sure @TeamSpen210 is right about everything. @beartype has no choice but to replace the existing

__init__()
of the decorated dataclass with a new
__init__()
that type-checks everything.
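For the morbidly curious, here's a rough sketch of the wrapping approach @TeamSpen210 suggested – not @beartype's actual implementation, just the flavour of it (all names below are made up):

import dataclasses
from functools import wraps
from beartype.door import die_if_unbearable

def beartype_dataclass(cls):
    # Wrap the @dataclass-generated __init__() to type-check every field
    # immediately after that __init__() assigns them.
    init_old = cls.__init__

    @wraps(init_old)
    def init_new(self, *args, **kwargs):
        init_old(self, *args, **kwargs)
        for field in dataclasses.fields(cls):
            # Note: under PEP 563, field.type is a string; ignored here for brevity.
            die_if_unbearable(getattr(self, field.name), field.type)

    cls.__init__ = init_new
    return cls

@beartype_dataclass
@dataclasses.dataclass
class Bear:
    claws: int
    name: str

Bear(claws=10, name='Yogi')     # fine
Bear(claws='ten', name='Yogi')  # raises a type-checking violation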

Beartype could potentially provide an implementation there which adds validators to all fields, which will make it check whenever the fields are assigned.

Hmmm... Maybe. Press "F" to doubt, though. Overriding the

__setattr__()
dunder method is probably still the optimal approach. Ideally, whatever @beartype does should require no manual input or intervention from the user. It should "just work", batteries-included, out-of-the-box.

Will @beartype ever do dataclasses? Tune in next month as @leycec passes out on a keyboard. 😴

leycec3 months agobeartype/beartype

Awwwwwwwww! You're so nice, @Glinte. I love it when my amazing userbase shares in the communal horror. Indeed, I may have suffered for us all and the sins that PEP 563 has committed, but my heart has already been steeled in the icy forge of twenty Canadian winters. ❄ 🌨 ☃ 🧊

PEP 563 can't compare to waking up with a beard full of icicles. If you know, you know.

<sup>in the utopian world I dream of, warm beards are free of icicles</sup>

leycec3 months agobeartype/beartype

...lol. Thanks so much for catching this really ugly edge case. You're the ultimate QA turtle. Did I mention that we love turtles where we live? Everybody does here. We live in the cold wetlands of Ontario, Canada – where mosquitoes breed by the billions. We've all got bumper stickers like "I stop for turtles." Your GitHub username warms my heart.

What were we talking about again? Oh. Right. Type mocking. Pretty bizarre stuff, huh? Static type-checkers aren't wrong about this, per se – but they're not necessarily right either. Overriding

__class__
as a
@property
is already so far off into the DeepEndOfDarkPython™ that static type-checking doesn't even really apply anymore.
mypy
and
pyright
always choke whenever you start overriding dunder attributes. They just can't cope with Python's penchant for dark magic, bubbling cauldrons, and ill-determined runtime semantics.

Your initial implementation of

@__class__.setter
seems fine, too. Sure... it's pretty odd. Nobody should be resetting the
__class__
dunder attribute. But if they do, at least your approach preserves working code semantics. Raising
NotImplementedError
instead would probably break those semantics with an unreadable exception. Maybe. Anyyyyway.
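For anyone else wading into the DeepEndOfDarkPython™, here's a minimal, made-up illustration of the trick under discussion – not the submitter's actual code:

class MockedInt:
    # isinstance() falls back on "obj.__class__" when "type(obj)" alone doesn't
    # satisfy the check – which is exactly what type mocking exploits.
    @property
    def __class__(self):
        return int

fake = MockedInt()
print(type(fake))             # <class '__main__.MockedInt'>
print(isinstance(fake, int))  # True – type mocking in action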

It's all baseless conjecture and pride-posturing at this point. Probably nobody cares about

@__class__.setter
except
mypy
and
pyright
– and even they don't really care. This is way outside their narrow wheelhouse and theological mandate from Heaven Guido.

Let us just ship your amazing fix. BOOM. It's in. This head exploding proves that you're awesome: :open_mouth: :exploding_head: :boom:

leycec4 months agobeartype/beartype

That is an honorable achievement

In a similar way, surviving a moose trampling is an honorable achievement.

We have a meme with my wife. I worked at a big tech company in my country and when I was coming late I was saying that "Yandex is broken and I need to fix it".

...oh, my aching sides. Stop! You're killin' me here! My ribs hurt. Seriously. They hurt.

That's the funniest thing I've heard all week – and I live with three smelly cats and a non-smelly wife in a cramped cabin in the woods. So, you know it's hijinx all day long.

Yandex is a delightful engine. I'm honoured to be supporting a former Yandex warrior. Whenever Google fails me, I roll up my sleeves, stick a fake cigar made of candy into my mouth, and start muttering orthodox catechisms to the search-engine Gods:

"Don't fail me now, Yandex. It's 3:27AM. Find the unfindable, you ugly bastard. You're @beartype's final hopium."

<sup>Yandex: even Power Rangers knows.</sup>

leycecover 1 year agobeartype/beartype

Oh, you're most welcome. Thanks so much for your kind and hilarious comments. Also, you're not wrong:

...I still prefer my use of NewType in #311.

...yeah. They botched the implementation of

type
aliases. It's a shame they can't be used in open-source or even most commercial offerings. Ain't nobody gonna require Python ≥ 3.12 anytime soon. That said,
NewType
does still have the disadvantage of angering mypy and
pyright
. I think?

Behold! Code that pacifies mypy and

pyright
:

from numpy.typing import ArrayLike as np_ArrayLike
from typing import TYPE_CHECKING, NewType, TypeAlias

# If we're being statically type-checked, declare an obsolete type alias that pacifies mypy.
if TYPE_CHECKING:
    ArrayLike: TypeAlias = np_ArrayLike
# Else, we are normal Python. In this case, define a clever type alias that angers mypy.
else:
    ArrayLike = NewType('ArrayLike', np_ArrayLike)

But when you've gone that far, it's only one small step further to full madness:

from numpy.typing import ArrayLike as np_ArrayLike
from sys import version_info
from typing import TYPE_CHECKING, NewType, TypeAlias

# If we're being statically type-checked, declare an obsolete type alias that pacifies mypy.
if TYPE_CHECKING:
    ArrayLike: TypeAlias = np_ArrayLike
# Else, we are normal Python. If we are Python ≥ 3.12, dynamically declare a type alias to
# avoid "SyntaxError" complaints from older Python interpreters.
elif version_info >= (3, 12):
    exec('type ArrayLike = np_ArrayLike')  # <-- don't ask. don't tell.
# Else, we are obsolete Python. In this case, define a clever type alias that angers mypy.
else:
    ArrayLike = NewType('ArrayLike', np_ArrayLike)

I'll stop now. :rofl:

leycecover 1 year agobeartype/beartype

Ugh. It deeply saddens me to close this issue out unresolved, but... yeah. Python leaves us little choice here. It's unlikely that

typing
devs intend to do what we want; they're mostly mono-focused on pure-static type-checking, despite the year currently reading 2024. I grunt in consternation.

This issue tracker has become a festering hive of bugs and feature requests. In a futile effort to restore a modicum of hygiene, let's quietly close this and pretend

typing.TypeGuard[...]
actually did something useful. More grunting in consternation. :face_exhaling:

leycecabout 1 month agobeartype/beartype

...and then three years passed.

I am now almost entirely bald. My younger self is weeping to read this. Before the last hair follicle falls, @beartype will support recursive type aliases! As expected, @beartype will only support PEP 695-style recursive type aliases like

type NestedDict = Dict[str, Union[FileInformation, NestedDict]]
under Python ≥ 3.12. All older variants of recursive type hints have since been deprecated (and were basically impossible to support at runtime, anyway). If you need recursive type hints, this sadly means you need to require Python ≥ 3.12. Don't blame @leycec. (He means well.)
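For instance, something like this (assuming Python ≥ 3.12 and a @beartype release new enough to ship this support; "FileTree" is a made-up name):

from beartype.door import is_bearable

# A PEP 695 type alias recursively referencing itself.
type FileTree = dict[str, str | FileTree]

print(is_bearable({'src': {'main.py': 'print("hi")'}}, FileTree))  # True
print(is_bearable({'src': 42}, FileTree))                          # False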

In short: it's happening. It's happening slowly. But it's happening. Rejoice, young people!

<sup>that moment when you realize no one is rejoicing with you</sup>

Resolved by 9b7cd6a45e6d4d2bfac. Through what could only be called a divine fit of madness, @beartype now supports indirect recursion in PEP 695-compliant type aliases. Thanks so much for keeping me on my frozen toes during the loooooooong winter months here in Canada. You've helped make @beartype such a better QA toolchain. This shocked Maine Coon cat apparently named ViralHog is for you, @EtaoinWu. 😻

<sup>@beartype now knows that feeling too, ViralHog</sup>

lol. I don't quite understand what's going on [read: i got no friggin' clue, bro], but our upcoming @beartype

0.21.0
release now appears to transparently support the
typing_extensions.Unpack
backport. Notably, your original example:

import sys
from beartype import beartype
from typing_extensions import TypedDict, Unpack

class Dummy(TypedDict):
    """I am here for debug."""

    i_am_useless: bool

@beartype
def main(**dummy: Unpack[Dummy]) -> None:
    print(f"Am I useless ? {dummy}.")

...now usefully prints:

Am I useless ? {'i_am_useless': False}.

If I'm mistaken and indeed dumb, please do reopen this! Until then, let's choose to optimistically believe that everything works despite me doing nothing. Wow. Typing sure is a thing, huh? 😅

leycec19 days agobeartype/beartype

Oh, right... Of course! I am dum-dum. Apparently, Python 3.8 still exists.

<sup>poor naked drunk vampire guy can't get a break</sup>

leycecabout 1 month agobeartype/beartype

I'm not sure I understand exactly what do you mean by adhering to the runtime API of the standard typing module?

typing-extensions
doesn't work at runtime. Literally. It just doesn't work. At least, it doesn't work at runtime in the way that the standard
typing
module works
– which is the only way that matters. If
typing-extensions
isn't backporting
typing
behaviour, it's not a valid backport from @beartype's admittedly grotesque runtime perspective.

I'm super-shocked you've never hit a

typing-extensions
issue in Pydantic or related projects like
typing-inspection
before. @beartype hits
typing-extensions
issues all the friggin' time! It happens so frequently I had to put the "friggin'" adjective there. That's when you know it's bad.

Real-world Example or It Didn't Happen

Wonderful @beartype user @RomainBrault just hit issue #509 with the

typing_extensions.Unpack
backport:

Hello, it's me again 💀

I am using rather extensively

Unpack
to type my kwargs with
TypedDict
. However Unpack was only introduced with Python 3.11. So for Python < 3.11 I'm using typing_extensions 🙈.

# /// script
# requires-python = ">=3.9"
# dependencies = [
#     "beartype==0.20.0",
#     "typing-extensions==4.12.2; python_full_version<'3.11.0'",
# ]
# ///

import sys
from typing import TypedDict

from beartype import beartype


if sys.version_info >= (3, 11):
    from typing import Unpack
else:
    from typing_extensions import Unpack


class Dummy(TypedDict):
    """I am here for debug."""

    i_am_useless: bool


@beartype
def main(**dummy: Unpack[Dummy]) -> None:
    print(f"Am I useless ? {dummy}.")


if __name__ == "__main__":
    main(i_am_useless=False)

The following code failed for Python < 3.11 (tested with

uv run --python=3.10 ...
) with the error:

Traceback (most recent call last):
  File ".../Git/test_beartype_2/test.py", line 28, in <module>
    def main(**dummy: Unpack[Dummy]) -> None:
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/decorcache.py", line 74, in beartype
    return beartype_object(obj, conf)
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/decorcore.py", line 87, in beartype_object
    _beartype_object_fatal(obj, conf=conf, **kwargs)
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/decorcore.py", line 137, in _beartype_object_fatal
    beartype_nontype(obj, **kwargs)  # type: ignore[return-value]
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/_decornontype.py", line 249, in beartype_nontype
    return beartype_func(obj, **kwargs)  # type: ignore[return-value]
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/_decornontype.py", line 336, in beartype_func
    func_wrapper_code = generate_code(decor_meta)
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/wrap/wrapmain.py", line 122, in generate_code
    code_check_params = _code_check_args(decor_meta)
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/wrap/_wrapargs.py", line 416, in code_check_args
    reraise_exception_placeholder(
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_util/error/utilerrraise.py", line 137, in reraise_exception_placeholder
    raise exception.with_traceback(exception.__traceback__)
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_decor/wrap/_wrapargs.py", line 269, in code_check_args
    hint_or_sane = sanify_hint_root_func(
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_check/convert/convsanify.py", line 200, in sanify_hint_root_func
    hint_or_sane = reduce_hint(
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_check/convert/_reduce/redhint.py", line 353, in reduce_hint
    hint = _reduce_hint_uncached(
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_check/convert/_reduce/redhint.py", line 628, in _reduce_hint_uncached
    hint = hint_reducer(
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_check/convert/_reduce/_pep/pep484/redpep484typevar.py", line 156, in reduce_hint_pep484_typevar
    hint = get_hint_pep484_typevar_bound_or_none(hint, exception_prefix)  # pyright: ignore
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_util/cache/utilcachecall.py", line 249, in _callable_cached
    raise exception
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_util/cache/utilcachecall.py", line 241, in _callable_cached
    return_value = args_flat_to_return_value[args_flat] = func(
  File ".../.cache/uv/environments-v2/test-a5b3c20387222ad4/lib/python3.10/site-packages/beartype/_util/hint/pep/proposal/pep484/pep484typevar.py", line 189, in get_hint_pep484_typevar_bound_or_none
    if hint.__constraints__:
  File ".../.local/share/uv/python/cpython-3.10.16-linux-x86_64-gnu/lib/python3.10/typing.py", line 984, in __getattr__
    raise AttributeError(attr)
AttributeError: __constraints__

Do you have any idea how to use beartype and unpack for Python < 3.11 ? If I can get rid of typing_extensions I am happy.

what's a bit odd it that the error seems to be coming from typing and not typing_extensions.

I could "unpack" manually the kwargs, but it would make the function signature very long and my linter would be mad at me, so I would like to keep using

Unpack
.

tl;dr

  • @beartype supports
    typing.Unpack[...]
    perfectly well. I think. Don't quote us on that. 😓
  • @beartype totally doesn't support
    typing_extensions.Unpack[...]
    at all. Why? No idea. The exception message is unreadable jibberish suggesting a moral failing in the
    typing_extensions.Unpack
    backport.

In theory, @beartype should transparently support all

typing-extensions
backports out-of-the-box without anyone (especially me) having to do anything. In practice,
typing-extensions
backports just do not work at runtime. Why? No idea. It's kinda above my pay grade and interest.

Sure, @beartype might be at fault here – but it kinda seems like

typing-extensions
is to blame. That exception message is suspiciously smelly. Like, why is the standard
typing
module even being involved here? What do type variables have to do with the
typing_extensions.Unpack
backport? No idea. It really does seem like
typing-extensions
is not adequately testing its backports against the runtime API of the
typing
module. Ideally, there should be a one-to-one correspondence between the runtime behaviour of
typing
and
typing-extensions
attributes. But... there just isn't.

And there just isn't because

typing-extensions
probably isn't doing enough testing. If they did, these sorts of runtime conformance issues wouldn't keep arising again and again... and again. Or maybe @beartype is to blame for everything. Totally possible. Either way, it's just too much unpaid volunteer investment for too little gain. I am facepalming myself here in the winter darkness of a Canadian cabin.

<sup>@leycec after staring down yet another inscrutable

typing-extensions
backport failure</sup>

leycecabout 2 months agobeartype/beartype

Yoinked! May @beartype

0.20.0rc0
never curse this realm again. ☠

leycecabout 2 months agobeartype/beartype

lolbro. Good Ol' FreeBSD, eh? In theory, @beartype continues to love us some *BSD. In practice, @beartype still lacks official *BSD support. This issue is thus a duplicate of #261, which has been around since before the dawn of Guido.

Please, somebody. Submit a PR improving *BSD support. Until then, emoji cat regretfully closes this duplicate issue. 😿

leycec3 months agobeartype/beartype

Oh... boy. Wonderful clue. You're almost certainly correct as you always are. Thank you so much! I never would have figured that out on my own. This is like 90's-era LucasArts point-and-click adventure games all over again. I need a strategy guide but I have no lunch money.

I hadn't realized that Sphinx only conditionally installs the

sphinx.testing
subpackage when the
[test]
extra is explicitly installed. That's... kinda crazy. It does make a peculiar sort of head-scratching sense that I've come to associate with Sphinx. Still, extras usually only install external optional dependencies. I've never seen an extra that installs an internal subpackage before. Sphinx probably uses some sort of arcane
setuptools
madness to perform this install-time logic too, I bet.

Sphinx: where good documentation and youthful dreams go to die. 🤦

leycec3 months agobeartype/plum

Probably resolved by @beartype

. Since I tested nothing, please do reopen this if still broken. Likewise, since @beartype
0.20.0rc0
is only a release candidate, interested users will need to manually update @beartype as follows: <sup>...for the moment</sup>

pip install --upgrade --pre beartype     # <-- bugger the bugs

We salute you who are about to dispatch generics, Plum users. 💪

leycec3 months agobeartype/beartype

Resolved by c30a5fd1b48d41f55bb1c. Thanks so much for your patience, His Majesty. This resolution (and a whole lot more!) will appear shortly in our upcoming @beartype 0.20.0 release.

I'll drop you a line when @beartype 0.20.0 lands. Until then, have a wonderful time repairing all of the broken code in this world. Your wife truly must have infinite patience. :smile:

leycec3 months agobeartype/beartype

I can always use the

typing
package on PyPI.

Ha-ha! I had a similar (yet perhaps even more perverse) idea. Just:

  • Embed a copy of the existing
    typing
    module inside your codebase before CPython 3.15 [or whatevahs] finally disembowels it.

The only downside there is a potential licensing conflict, depending on whatever license you've chosen. But... yeah. Other than licensing, that's probably the optimal solution. It's just a single file, so it's trivially embeddable. You control its contents, so you're guaranteed that it will always satisfy your use case.

Oh, and... Happy 2025! Let us see if Canada can go a year without being violently invaded by the U.S. :partying_face:

leycecover 1 year agoleycec/fsnrnue

It is my absolute pleasure to be able to contribute.

When I'm done with my daily school work, tinkering is one of my hobbies.

We are weeb. Thus, we tinker. :laughing:

I've also created the Fate/Hollow Ataraxia installed on lutris since for some reason no one had done that.

That's... super-hyper-awesome, actually. Thanks so much for your service to humanity. If you're amenable, we'd love to host that here as well. In fact, let's eventually do the whole Nasuverse while we're at it. This hype train stops for no Servant. I am now thinking of a F/HA Lutris installer hosted here with a local filename like:

  • lutris/fate-hollow-ataraxia.yml
    .

Please do feel free to commit something like that. Of course, studies are the ultimate priority. Please ace everything. Show those normies your true power.

Done!

Apart from that, the movies seem to be a bit choppy/screen tear-y.

No worries at all. This is still a dramatic improvement over where we were before, which was nowhere. Even Angra Mainyu approves.

Choppiness and screen tearing is more than likely to be a Wine-GE issue rather than a "your system is a literal potato" issue. Even in 2024, Wine still commonly hiccups on DSP tasks like video decoding. Prolly just Wine being wine. Sakura shrugs.

...and uploaded to lutris.

Oh, right! I almost forgot to mention: please do attempt to revise the Lutris-hosted F/SN Ultimate Edition installer script yourself, but... please don't be surprised if Lutris moderators reject your efforts. They can kinda be jerkish about that sort of thing, sometimes. It's all the luck of the draw on which admin you pull, really. I recall @kaasknak attempting to do this and failing hard a year or two ago.

If that happens to you, fear not! I'll happily upload the script myself when you've finalized everything. Just let me know if this needs to happen and this will happen.

Thanks again for the tremendous tinkering, volunteerism, and enthusiasm! Love to see some much-needed Linux love for the Nasuverse. This GitHub issue be like:

leycecabout 1 month agobeartype/beartype

@RomainBrault! My QA man! Python 3.14 alpha releases are out already, huh? Yikes. Guess it's time to bear up and bring the flamethrowers out. This is gonna be rough.

TypedDict
breaking is probably the least of anyone's concerns under Python 3.14. @beartype definitely isn't ready for that prime time. Python 3.14 dramatically changes how annotations are stored and processed. Expect major breakage if you're throwing @beartype at Python 3.14. It'll probably take a few months for me to refactor everything to fully support Python 3.14.

Until then... this issue is still good to know. Thanks so much for the heads up. Let us resurrect this! QA Necromancy: "I invoke thee."

leycec3 months agobeartype/beartype

Possibly resolved by 29ebee81dacb. I have no idea what I'm doing anymore. It is Saturday night and I shouldn't even be sitting in front of a computer. I did, anyway. Maybe this fixes something for you? I have no answers for Django and even more questions:

pip install git+https://github.com/beartype/beartype.git@29ebee81dacb53af97f4218411ba98783d056d15

Let's start somewhere random. How about... here? This looks sufficiently crazy:

I found another bug... This time with

descriptor.__set_name__
if you are saving the owner argument for future use. If descriptors are black magic, then this is vantablack magic so I think it can be ignored 😛 Also, I couldn't create a simple example for the bug. I even tried creating a small Django project from scratch but everything worked (one time I would prefer things to not work...).

Ho, ho. I can explain what's happening here. Whenever you use a forward reference like this:

from __future__ import annotations

@beartype
def muh_func(muh_arg: muh_package.SomeType) -> None: ...

PEP 563 and CPython stringify that forward reference like this:

@beartype
def muh_func(muh_arg: 'muh_package.SomeType') -> None: ...  # <-- STRINGIFIED!

@beartype
then internally converts that stringified forward reference into a forward reference proxy (i.e., a private type internal to the @beartype codebase that proxies all
isinstance()
and
issubclass()
checks against the type referred to by that forward reference). Crucially, forward reference proxies are themselves types. That's the only way proxying
isinstance()
and
issubclass()
checks could possibly work, right?

What you are seeing is thus not a bug but exactly the intended behaviour. The first time:

  • Your code accesses a descriptor-based class attribute on one of your classes, CPython calls the
    descriptor.__set_name__()
    dunder method with an
    owner
    parameter set to that class. So far, so good.
  • Someone's code (which could be @beartype's or Django's or possibly even your own, depending on how much dark magic introspection your code is doing) accesses the same descriptor-based class attribute on one of @beartype's forward reference proxies, CPython yet again calls the same
    descriptor.__set_name__()
    dunder method with an
    owner
    parameter set to that forward reference proxy. You hate this. Yet, it makes sense. In fact, this is the only sensible thing for CPython to do here. If CPython behaved in any other way, then it would be CPython that was bugged. Thankfully, CPython isn't bugged. CPython is doing what it's supposed to. And... so is @beartype. Probably. We give praise!
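If __set_name__() itself is unfamiliar, here's a generic, Django-free illustration of when CPython calls it and what it receives – all names invented for the example:

class Recorded:
    # CPython calls __set_name__() when the owning class is created, passing
    # that class as "owner" and the attribute name as "name".
    def __set_name__(self, owner, name):
        print(f'__set_name__ called with owner={owner.__name__}, name={name}')
        self.owner, self.name = owner, name

    def __get__(self, instance, owner=None):
        return f'{self.owner.__name__}.{self.name}'

class MuhClass:
    muh_attr = Recorded()   # prints: __set_name__ called with owner=MuhClass, name=muh_attr

print(MuhClass().muh_attr)  # MuhClass.muh_attr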

I suppose your question then becomes: "Sure. But why is @beartype code or Django code or maybe somebody else's code accessing my class attributes on @beartype's internal forward reference proxies to my classes, bro?"

No idea, bro. I'm a dum-dum. It's Saturday. Let's just play video games instead of thinking about all this madness. 🎮 🙀 🎮

Also, this...

I changed

beartype/_check/forward/reference/fwdrefmeta.py:BeartypeForwardRefMeta.__instancecheck__
to catch
BeartypeCallHintForwardRefException
and
return True
. The idea was "if you can't resolve a forward reference to type check something, assume everything is fine".

This is awful... and hilarious! I'm extremely impressed you delved that deep into @beartype's labyrinthine Hall of Mirrors without your eyeballs exploding. Well done, @rudimichal. Well done. My applause is all yours tonight.

@beartype 0.21.0 will automatically repair all of these sorts of

TYPE_CHECKING
import forward reference issues. You won't need to maintain your eyeball-exploding monkey-patch for too long. The next @beartype stable release will be released sometime in the summer of 2025, probably. It'll convert imports isolated to
if TYPE_CHECKING:
blocks (which don't work at runtime, as you've noted) into @beartype's forward reference proxies (which do work at runtime).

Until then, please bear with us. 🐻

leycec3 months agobeartype/beartype

this beartype group of people

...oh, gods. I accidentally made a tribe. Now, a tribal dichotomy has emerged. There's a normative in-group comprised of tribal members named "Beartype Group of People" and a nonnormative out-group comprised of non-tribal members named "Surely Damned Heathens."

Kinda funny. Kinda sad. But mostly funny. Truly, to know @beartype is to know humans.

but why discourage exception handling altogether?

Gah! So sorry. You're totally right. I didn't mean to imply that exception handling is unconditionally bad – especially in the sense of Dijkstra's infamous "Considered Harmful" manifesto. I love exception handling, actually. Without exception handling, we'd all be eating rocks and flinging "night soil" at one another. We'd have to go back to the ugly days of horrid languages with unpronounceable names like C, COBOL, and Fortran.

Exception handling isn't bad. Exception handling is great! It's Easier to Ask Forgiveness than Permission (EAFP) that's... not so good. I consider EAFP an abuse of exception handling. In most languages, exceptions are only supposed to be thrown under exceptional circumstances – like, when something exceptionally bad has happened. This makes sense. It's exception handling, after all. It's right there in the name. It's supposed to be the exception. Not the common case.

EAFP flips this common-sense notion on its head by intentionally throwing and catching exceptions in common use cases. In most implementations of EAFP, nothing catastrophic has actually happened. Exception handling is just being abused to detect a common case. Even that would be fine if:

  • (A) Python wasn't already the slowest language in existence and...
  • (B) Exception handling in Python wasn't already the slowest feature in the slowest language in existence.

EAFP compounds this by frequently triggering the slowest feature in the slowest language. This makes EAFP the slowest possible implementation of any decision problem.

Let's talk specifics. The

except ...:
branch of exception handling in Python is slow. Therefore, use cases that frequently trigger the
except ...:
branch of exception handling in Python are also slow. But EAFP is this use case. Therefore, EAFP is slow – very, very slow.

Examples or EAFP didn't happen. In this example, we define a function

is_booly()
returning
True
only if the passed arbitrary object is booly (i.e., is convertible to a meaningful
bool
by defining either the
__bool__()
or
__len__()
dunder attributes). @beartype actually does this frequently. So, this is a real-world example that matters to us (but probably nobody else, which is fair enough):

def is_booly(obj: object) -> bool:
    # EAFP: poke at each dunder attribute and catch the inevitable AttributeError.
    try:
        getattr(obj, '__bool__')
        return True
    except AttributeError:
        pass
    try:
        getattr(obj, '__len__')
        return True
    except AttributeError:
        return False

It's trivial. It's concise. It's also slow as molasses. Thankfully, we can implement the same

is_booly()
function without EAFP. It's non-trivial. It's long-winded. It's also fast as lightning:

from beartype.typing import Protocol

def is_booly(obj: object) -> bool:
    return isinstance(obj, _BoolLike)

class _SupportsBool(Protocol):
    def __bool__(self) -> bool: ...

class _SupportsLen(Protocol):
    def __len__(self) -> int: ...

_BoolLike = (_SupportsBool, _SupportsLen)    

Note the lack of exception handling in the latter case. The result is the same. The implementations are anything but.

@beartype walks the road less travelled. Sometimes, you don't care about speed – but @beartype really cares about speed. It's pretty much the only thing @beartype cares about. @beartype cares way more about speed than anything else, including conformance to PEP standards. @beartype does care about PEP standards... but only when PEP standards don't get in the way of things we care more about, like speed. When they do, @beartype just ignores PEP standards and prefers speed.

@beartype: It's weird. Apparently, it's also a tribe. Gods! What have I wrought? I meant well. 😭

leycecover 1 year agobeartype/beartype

Note to future self: "See this comment at #216 for a fully working example abusing

typing.NewType
,
typing.TypeAlias
, and
type
aliases. I invoke thee, madness!"

Note to past self: "Invest in everything by this guy named Sam Altman."

Woohoo! Thanks so much for this awesome PR, wise bearded bear bro. Because you are both wise and bearded, you are indeed right about everything – especially:

This updates the signature of the

ipython
argument to be the superclass
InteractiveShell
...

...indeed. Totally makes sense. Not all interactive shells are terminals. Pyodide: you are the case in point.

Let's merge this ASAP. Tests are spuriously failing, which is another issue altogether. Oh, and...

a source distribution

.tar.gz
on PyPI (preferred by most downstreams)

ohnoes.

ipython-beartype
isn't even shipping tarballs? What is this profane madness!? Let's open up a separate issue about that. I'll ping you there. :rightwards_hand: :bell:

leycecabout 1 month agobeartype/beartype

OHNOEEEEEEEEEEEEEEES!!!!

I... actually have no idea whether that code is PEP-compliant either. Still, it's horrifying. At the least, @beartype should probably be implicitly subscripting all unsubscripted type aliases like the

WithInt
in
T: WithInt
by
typing.Any
. That is, @beartype should silently expand
T: WithInt
to
T: WithInt[Any]
. Probably.

It's nice that that makes this issue go away. Still, something isn't right. This is indicative of a deeper recursion issue that will probably blow up unless we investigate. Instead, let's just play video games tonight. 😅

leycecabout 2 months agoleycec/raiagent

Closing this issue saddens me. I'd loooooove to keep this open indefinitely. I adored ZeroNet... and still do. Sadly, I'm on the cusp of archiving this Gentoo overlay. @beartype has eaten all of my free time. Now, @beartype has indigestion and I no longer have time. I have so little time I don't even have time to properly close this issue out.

Weep for the Gentoo overlay that once was, emoji cat! Weep. 😿

leycecabout 2 months agobeartype/beartype

Actually... I just mic-dropped @beartype

. So, you can just straight-up upgrade to that:

pip install --upgrade beartype     # <-- bugger the bugs

Would you mind giving that a try? Report back and I shall do something. Else, nothing. 😆

leycec2 months agobeartype/beartype

Resolved by 08693b8972a7304a. PEP 695: this is the day that @beartype conquers you. Just give it up already, PEP 695! You're never gonna defeat @beartype. Somehow, we will type-check you.

This resolution is slated for release with @beartype

0.20.0rc1
– the next (and hopefully last) release candidate of the @beartype
0.20.0
dev cycle. In theory, I drop @beartype
0.20.0rc1
tomorrow. Hopes and prayers for my free time.

Because you are @EtaoinWu, you'll continue to break and stress-test @beartype against horrifying features like PEP 695. I can't thank you enough for the intensity of your dedication to @beartype, PEP 695, and QA in general. You're awesome! This resolution is for you. Smiley cat + friendly unicorn for life. 😸

leycec2 months agobeartype/beartype

Resolved by 2b5009f72fd2ce. Congratulations, @woutdenolf! You've won a new @beartype release. That... probably wasn't what you were hoping for, but it's all we've got. 🤣

This is our last action item for @beartype

0.20.0rc1
– our last release candidate for the @beartype
0.20.0
dev cycle, maybe. Let's pretend that I'm actually publishing this tomorrow or Friday evening.

Thanks again for the heads-up! This issue closure is for you and the wonderful things you do.

leycec3 months agobeartype/beartype

Resolved by ca335c51c6113c. Technically, it's only a pre-release. @WeepingClown13 is currently frowning. Everyone else, feel free to smile half-heartedly.

@beartype

0.20.0rc0
: It's better than nuthin'. 😺

leycec3 months agobeartype/beartype

Resolved by e7577c8334b. Or, rather, we just exercised with unit tests that this has actually been working as intended for years. Well, isn't that special? No idea why #469 erupted with such weirdness, but... it looks like that's a separate conundrum entirely. Weird! :godmode:

leycec4 months agobeartype/beartype

OMG. It... it's happening! QA Christmas is finally happening, @rbroderi!

<sup>midget @leycec with fur begins delivering the Christmas goods</sup>

leycec5 months agobeartype/beartype

And sometimes instead of an explicit error, it will be a warning message with an automatic conversion to the right type. I want to test for those warnings showing up. But I also want to flag violations of type hints outside of those tests, since such violations are unexpected.

Ah, ha. Indeed, it all becomes clearer in the crystal ball. I see a dark future for your test suite. :crystal_ball:

However...

(And if they are only warning messages, they wouldn't necessarily fail the test)

Ah, ha! Actually, it's generally advisable to fail tests on any warnings.

pytest
should probably default to that behaviour out-of-the-box but doesn't, because doing so would break backward compatibility. Still, treating warnings as test failures is really the only sane QA default. If
pytest
had been created in 2024 (rather than a lot earlier), that would almost certainly have been the default. Indeed, the current
pytest
default strips essential context and metadata from warnings and is thus largely useless. (See the
pytest.ini
snippet below for an example of this.)

Thankfully, configuring

pytest
to fail tests on warnings is trivial. Since this is what the @beartype test suite does, I speak from a place of wisdom and suffering. Just:

  • Add a new top-level
    pytest.ini
    file in the root directory of the GitHub repository hosting your Python package (...if you haven't already).
  • Add a new
    filterwarnings = error
    entry to this file resembling:
# The following pytest-specific section specifier is mandatory, despite this
# file's unambiguous basename of "pytest.ini". One is enraged by bureaucracy!
[pytest]

# Newline-delimited list of all custom warning filters applied by this test
# suite. Recognized strings include:
# * "default", printing the first occurrence of matching warnings for each
#   location (module + line number) where the warning is issued.
#   Unsurprisingly, this is pytest's default. Surprisingly, the resulting
#   output is overly fine-grained to the point of stripping all caller context
#   and thus being mostly useless: e.g.,
#       betse_test/func/sim/solve/test_sim_fast.py::test_cli_sim_fast
#         /usr/lib/python3.7/site-packages/numpy/core/_asarray.py:83:
#         VisibleDeprecationWarning: Creating an ndarray from ragged nested
#         sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays
#         with different lengths or shapes) is deprecated. If you meant to do
#         this, you must specify 'dtype=object' when creating the ndarray
#           return array(a, dtype, copy=False, order=order)
#   Note that pytest reports the warning as originating from within a private
#   NumPy submodule, which it technically does but which communicates no
#   practical meaning with respect to our codebase.
# * "error", turning matching warnings into exceptions. Ideally, pytest would
#   support a filter printing tracebacks on warnings. Since it fails to do so,
#   implicitly printing tracebacks by coercing non-fatal warnings into fatal
#   exceptions is our next best least worst solution.
filterwarnings =
    # Implicitly coerce all non-fatal warnings into fatal exceptions.
    error

    # Avoid coercing all non-fatal warnings matching one or more of the
    # following patterns formatted as:
    #     ignore:{warning_message}:{warning_classname}:{module_name}
    #
    # ...where:
    # * "{warning_message}" is a regular expression matching the plaintext of
    #   the warning message to be ignored.
    # * "{warning_classname}" is the fully-qualified classname of the warning
    #   to be ignored.
    # * "{module_name}" is the fully-qualified name of the module emitting the
    #   warning to be ignored.
    #
    # For example:
    #     ignore:^Use of \.\. or absolute path in a resource path.*:DeprecationWarning:pkg_resources
    #
    # See also:
    # * https://docs.python.org/3/library/warnings.html and
    # * https://docs.pytest.org/en/latest/warnings.html
    ignore:^'cgi' is deprecated and slated for removal in Python 3\.13$:DeprecationWarning:babel.messages.catalog

The above real-world example even demonstrates how to "whitelist" various warnings that you'd rather just silently ignore than treat as fatal test failures.

Even if I have to do some monkey patching of beartype in my test script I would be pretty happy!

...heh. I too love me some monkey-patching. Truly, I see that you are a gentleman coder as well.

Sadly, the @beartype codebase is less amenable to monkey-patching than most. Basically, what I'm saying is that @beartype is a labyrinthine nightmare of grotesquely non-Euclidean proportions. It's best "not to go there." Even I shudder to go there. The beast is hungry. Don't feed the beast, @MilesCranmer. :laughing:

This means that this probably will have to linger until somebody finds a spare volunteer week or three for this. If you're keen to go there, I'd be happy to walk you through a PR implementing this. It seems quite tractable, but... yeah. The @beartype codebase isn't for everybody. If you genuinely think you have the time, energy, and interest, we can definitely make this happen in less than a thousand lines of code – probably a lot less.

But no promises. :wink:

Phenomenal. And... I'm still stunned at how fast you respond on GitHub. I'm pretty sure that's 0-minute turn-around. How? How do you respond instantaneously!? It's almost like... telepathy. Like, your hands aren't even moving on the keyboard. Yet, messages are being written. Please teach me your dark secrets! 🙏

leycec11 days agobeartype/beartype

Thanks so much! And... I'm really sorry about my delayed response to this. I've been hip-deep in PEP 695 recursive type aliases for the past month, which is more brutal than I could have possibly imagined. I think it's finally caught up with me. 🥴 <sup><-- sad</sup>

leycecabout 1 month agobeartype/beartype

Resolved by 2af8e4f6c07ea. I shall bravely endeavour to push out a new stable @beartype

0.20.1
release tomorrow night. Until then, thanks again for this awesome issue. Your username may be inscrutable, @rg936672, but...

WAIT.

936672
.
#936672
. That's the hex code for a deep shade of pink, right?
r
and
g
could then refer to
red
and
green
. Therefore, I unwisely choose to interpret your username as
red-green-pink
. I solved the puzzle! Now, I can sleep tonight. 😴

leycecabout 1 month agobeartype/beartype

Resolved by 48627eb7abe... supposedly. Would you mind testing this on your end when you get a chance? The magical one-liner that scares even me is:

uv pip install git+https://github.com/beartype/beartype.git@48627eb7abe4725280083f472b76e59d61254e42

Thanks so much, wonderful QA comrade! Long live both the Eurozone and Canada. May Python be with us always. 🐍 🇪‍🇺 🇨‍🇦

leycecabout 2 months agobeartype/beartype

Sure! No worries. I can definitely yank @beartype

0.20.0rc0
. That said, shouldn't the most recent @beartype
0.20.0rc2
release candidate (if anything) be installed? It strikes me as oddball that the outdated @beartype
0.20.0rc0
release candidate would ever be installed under any circumstances... unless someone explicitly demanded that with a hard version pin.

pip
, huh? Who can even understand its follies and its foibles anymore? 🤔

leycecabout 2 months agobeartype/beartype

I am basking in the rays of

uv
(literally, as it is summer here in NZ).

lolol. My wife (a Canadian) and I met in Lyttelton Harbor on the outskirts of Christchurch a literal lifetime ago. So much love for New Zealand, where we both retain Permanent Residency. If the U.S. implements its horrifying threats to violently annex Canada, we're on the first Air New Zealand flight back to New Zealand.

Though... probably the North Island this time. We just can't do Dunedin again. Well, I could. I have Scots blood. I could live in a wet burlap sack filled with brown mold. My wife? Not so much. 😆

Thanks again for getting back to me so fast on this! You're amazing. Long live Aotearoa, may her clouds always glow against the shadows of this darkening world.

leycecabout 2 months agobeartype/beartype

Ho, ho. I am chuckling deep inside even as I shake my head in stunned disbelief. CPython devs have made a funny.

Back on track! Would you mind trying our most recent release candidate:

? @beartype
0.20.0
features a gamut of improvements to forward reference proxies, which seems to be the underlying culprit in this issue. In theory, upgrading might even help:

pip install --upgrade --pre beartype     # <-- bugs begone

If that fails to fix you up, we panic like it's 2025. 😨

leycec2 months agobeartype/beartype

OMG! Thank you so much. You've made my weekend. The

repo is now live with you as its sole Admin. You are our only hope, @tusharsadhwani. 🤩 🌟 💥

In your honour, I've chosen to forego playing video games this weekend. Instead, I'm hacking on @beartype issues. Tonight, I slay all these bugs for you. Guh! What am I doing...

leycec3 months agobeartype/beartype

Gah! So sorry for the extreme delay... and for shamefully breaking this PR.* I knew I should have merged this immediately, sight unseen. Instead, I was dumb. Now one of us must suffer by manually resolving the growing laundry list of merge conflicts. Pain intensifies. :hurtrealbad:

How does beartype check such assignments at runtime, while the module is executing?

...heh. @beartype doesn't. That's just it. That's where all the cleverness and madness lies.

Modern runtime type-checkers like @beartype and

typeguard
aren't actually runtime type-checkers (in the pure sense of the word "runtime") anymore. We're not doing what everyone thinks we're doing. @beartype and
typeguard
are hybrid third-generation runtime-static type-checkers. That is, @beartype and
typeguard
perform both runtime type-checking and static type-checking.

In @beartype's case, @beartype performs a preliminary phase of static type-checking on each module importation before CPython executes that module. How?

beartype.claw
import hooks (e.g.,
beartype.claw.beartype_this_package()
). Like all import hooks,
beartype.claw
import hooks register import path hooks that then dynamically apply abstract syntax tree (AST) transformations on each module importation. Pretty fun stuff.

Specifically,

beartype.claw
import hooks transform...

# PEP 526-compliant annotated variable assignments like this...
muh_attr: MuhTypeHint = muh_value

# ...into runtime type-checking validation like this.
from beartype.door import die_if_unbearable
muh_attr = muh_value
die_if_unbearable(muh_attr, MuhTypeHint)

Works flawlessly without issue or edge cases in both @beartype and

typeguard
. Certainly, invasive hacks are neither required nor desired. I'm personally allergic to fragile kludges that fall apart when you squint at them. I don't tolerate any of that sort of thing in my codebase. Most of us in this space are similar.

@beartype is hard enough to maintain without hacks. :laughing:
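For the morbidly curious, a toy ast.NodeTransformer sketching the flavour of that transformation – emphatically not @beartype's real code, and it omits the "die_if_unbearable" import that a real hook would also inject:

import ast

class CheckAnnAssign(ast.NodeTransformer):
    # Follow each simple annotated assignment (e.g., "x: int = 1") with an
    # explicit runtime check of the assigned value against its annotation.
    def visit_AnnAssign(self, node):
        if node.value is None or not isinstance(node.target, ast.Name):
            return node
        check = ast.Expr(ast.Call(
            func=ast.Name('die_if_unbearable', ctx=ast.Load()),
            args=[ast.Name(node.target.id, ctx=ast.Load()), node.annotation],
            keywords=[],
        ))
        return [node, check]

tree = CheckAnnAssign().visit(ast.parse("muh_attr: int = 'oops'"))
print(ast.unparse(ast.fix_missing_locations(tree)))
# muh_attr: int = 'oops'
# die_if_unbearable(muh_attr, int)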

Base classes are not annotations.

Hard disagree with that. The whole point of PEP 484-, 544-, and 585-compliant generics is to subclass a type hint, which typing systems then use to validate those generics. The hints that generics subclass absolutely satisfy the PEP 747-compliant

typing.TypeForm
annotation. They're 100% PEP-compliant type hints.

For example:

from collections.abc import Iterable, Container
class IterableContainer[T](Iterable[T], Container[T]): ...

Iterable[T]
and
Container[T]
both satisfy
typing.TypeForm
. Ergo, they are type hints. They're also "base classes" in the sense that the
IterableContainer
subclass "subclasses" those "base classes." Of course, we know better. Type erasure erases those "base classes" into the
__orig_bases__
dunder attribute.

Still, at least semantically, they're "base classes." When

IterableContainer
is subscripted by a child type hint (e.g., as
IterableContainer[str]
), that child type hint then dynamically replaces the type variable
T
parametrizing those base classes. So:

# This type hint...
IterableContainerStr = IterableContainer[str]

# ...is effectively equivalent to this type.
class IterableContainerStr(Iterable[str], Container[str]): ...

All of the above only works because

Iterable[T]
and
Container[T]
are both type hints.
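A quick peek at what type erasure actually leaves behind (spelled with an old-school TypeVar here so it runs on older Pythons too):

from collections.abc import Container, Iterable
from typing import TypeVar

T = TypeVar('T')

class IterableContainer(Iterable[T], Container[T]): ...

# Type erasure strips the subscripts from the actual runtime bases...
print(IterableContainer.__bases__)
# ...but squirrels the original subscripted type hints away here (roughly
# "(collections.abc.Iterable[~T], collections.abc.Container[~T])").
print(IterableContainer.__orig_bases__)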

Anyway, arguments about semantics are entirely beside the point. The point is that real-world generics can currently subclass type hints subscripted by forward references to types that are currently undefined: e.g.,

class IterableContainerStr(Iterable[WhatTheHeck], Container[WhatTheHeck]): ...
class WhatTheHeck(object): ...

The

IterableContainerStr
type subclasses the
Iterable[WhatTheHeck]
and
Container[WhatTheHeck]
type hints referring to the
WhatTheHeck
type. Since
WhatTheHeck
is still undefined at the time the
IterableContainerStr
type is declared, PEP 649 should ideally have something to say about this.

Does PEP 649 have something to say about this? Probably not, huh. So, do users just have to continue manually stringifying forward references in generics? Is the following really the only way to reference undefined types in generics:

class IterableContainerStr(Iterable['WhatTheHeck'], Container['WhatTheHeck']): ...
class WhatTheHeck(object): ...

Kinda weird that users can use unquoted forward references in some syntactic contexts but not in others. Ideally, users would be able to globally use unquoted forward references in all possible valid syntactic contexts for a type hint, right? The alternative is mass confusion and frustration.

Oh, well. Guess we're going with mass confusion and frustration. :face_exhaling:

Forward references work, but they must be defined when you actually evaluate the type alias.

So... what you're saying is that forward references don't work in

type
statements. The definition of a forward reference is "something that safely refers to something else that has yet to be defined without blowing up when you access it." But
type
aliases referencing types that have yet to be defined blow up when you access them. So... they actually prohibit rather than support forward references.
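Concretely, under Python ≥ 3.12 (names invented for the example):

type AnimalOrVegetable = Animal | Vegetable  # neither type is defined yet

# Creating the alias succeeds, because its value is lazily evaluated...
print(AnimalOrVegetable)

# ...but introspection blows up the moment you touch "__value__".
try:
    AnimalOrVegetable.__value__
except NameError as err:
    print(err)  # name 'Animal' is not defined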

For efficiency, @beartype dynamically generates type-checking code at decoration time. If a user annotates a function parameter with a

type
alias referencing types that have yet to be defined, @beartype needs that alias to be introspectable without blowing up.

Thankfully...

annotationlib.call_evaluate_functions(Animals.__evaluate_value__, Format.FORWARDREF)

Superb! That's exactly what @beartype needs. I mean, the syntax is pretty awful. It'd be nice if

type
aliases just behaved like this by default, because this is what everybody wants by default. Nobody wants
type
aliases that spontaneously explode whenever you directly access their
__value__
attribute. They want
type
aliases that work robustly across all use cases without exploding.

Still, I'll take it. @beartype takes whatever scraps it can get. :smiling_face_with_tear:

PEP 695 explicitly instructs users to embed strings in PEP 604-style new unions

Where do you see that?

I mean, it's right there in PEP 695:

# A type alias that includes a forward reference
type AnimalOrVegetable = Animal | "Vegetable"

Animal | "Vegetable"
isn't syntactically valid Python. Period. You can't take the union of a
str
. The
str
builtin doesn't support the
__or__()
dunder method, nor should the
str
builtin support the
__or__()
dunder method.

It's like nobody even peer-reviewed that thing. I sigh.

</sigh>

I would recommend against looking too much at the PEPs only...

Respectfully disagree. Standards matter. In fact, standards are pretty much the only thing that matter to @beartype. For type-checkers, everything else is just gossip and hearsay. We're here to enforce uniform standards – whatever those standards might be. All other documentation (although occasionally valuable) is ultimately pretty incidental to the rigorous work performed by type-checkers.

Since the value of type aliases is lazily evaluated, you shouldn't need to use strings in them.

Under Python ≤ 3.13,

type
aliases absolutely are not lazily evaluated from @beartype's perspective. @beartype needs "lazily evaluated" to mean "unquoted forward references are transparently replaced with objects proxying
__instancecheck__()
and
__subclasscheck__()
calls that internally resolve the external types those unquoted forward references refer to at the last possible moment." Because
type
aliases currently don't behave that way, users absolutely need to use strings in
type
aliases to safely refer to types that have yet to be defined. For @beartype users,
type
aliases definitely do not support unquoted forward references.

Thankfully, that seems to be a moot point under Python ≥ 3.14. @beartype gives thanks and praise. :bow:

And... Now We're Both Tired

This is what happens when @leycec writes essays on a Friday night. Everyone ends up passed out on the futon in a puddle of their sputum. :drooling_face: :sleeping_bed:

leycecover 1 year agobeartype/beartype

Resolved by b5ed824. Gradually adopt a QA bear cub today, the @beartype way:

# In your "your_package.__init__" submodule:
from beartype import BeartypeConf
from beartype.claw import beartype_this_package

beartype_this_package(conf=BeartypeConf(
    violation_param_type=UserWarning,
    violation_return_type=UserWarning,
))

Thanks yet again, @justinchuby and @kasium. This flexed bear bicep is for you. :muscle: :bear: