JavaScript's == famously equates 0, '', and false with one another in confusing ways (while null loose-equals only undefined, not 0); TypeScript's as assertions can fool the compiler; Python's truthiness rules make if obj: false for [], 0, None, and ''; SQL silently casts between strings and numbers — every language has a coercion ruleset that becomes a foot-gun when authors forget the rules.
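The JavaScript case is easy to demonstrate: loose equality's coercion rules are not even transitive, and null sits outside the 0/''/false cluster entirely.

```javascript
// Loose equality (==) coerces before comparing:
console.log(0 == '');           // true  — '' coerces to the number 0
console.log(0 == false);        // true  — false coerces to 0
console.log('' == false);       // true
console.log(null == 0);         // false — null does NOT join the cluster
console.log(null == undefined); // true  — the only thing null loose-equals
console.log(0 === '');          // false — strict equality, no coercion
```

Strict equality (===) sidesteps the whole table: different types are simply unequal.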

Shapes that bite: leading-zero strings were octal to parseInt in old JS and are decimal now, so parseInt("08") once returned 0 (a generation of off-by-one bugs lived in that change). String comparison of numeric strings ("10" < "9" is true, because strings compare character by character). Numeric coercion of dates (new Date() minus another new Date() is a number of milliseconds, not a Duration). Implicit null→0 conversion in arithmetic.
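Each of those shapes fits on a line:

```javascript
// Numeric strings compare lexicographically, not numerically:
console.log('10' < '9');    // true — '1' sorts before '9'

// Subtracting Dates coerces both to milliseconds-since-epoch:
const elapsed = new Date('2024-01-02') - new Date('2024-01-01');
console.log(elapsed);       // 86400000 — a bare number, not a Duration

// null coerces to 0 in arithmetic; undefined coerces to NaN:
console.log(null + 1);      // 1
console.log(undefined + 1); // NaN
```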

The defense is strict modes and explicit conversions. ===, not ==. Number() and String(), not +. Schemas that parse and reject, not coerce. Treat "the language did something for me" as a code smell, not a feature.
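A minimal sketch of "parse and reject, not coerce" — the helper name `toCount` is hypothetical, standing in for any boundary validator:

```javascript
// Hypothetical boundary parser: accepts only digit strings,
// throws on anything else instead of silently coercing.
// (A bare Number() call would quietly map '' to 0.)
function toCount(input) {
  if (typeof input !== 'string' || !/^\d+$/.test(input)) {
    throw new TypeError(`expected a digit string, got ${JSON.stringify(input)}`);
  }
  return Number(input); // explicit, decimal, no octal surprises
}

console.log(toCount('42')); // 42
// toCount('')    → throws, where Number('') would give 0
// toCount('4.5') → throws, where Number('4.5') would give 4.5
```

The point is the contract: bad input fails loudly at the edge, so the interior of the program never has to re-ask what type it is holding.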

Review heuristic

Hunt the diff for implicit conversions: bare + between unknown types, == instead of ===, JS truthiness checks where a strict null-check was meant. Each one is an opportunity for the language to do something the author didn't intend.
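The truthiness-vs-null-check case is the one most worth a worked example, because the buggy version reads naturally. The function names here are illustrative:

```javascript
// Truthiness check: treats 0 and '' as "missing" along with null/undefined.
function labelLoose(count) {
  return count ? `${count} items` : 'no data';
}

// Strict null-check: only null and undefined mean "missing".
function labelStrict(count) {
  return count !== null && count !== undefined ? `${count} items` : 'no data';
}

console.log(labelLoose(0));  // 'no data' — 0 is falsy; almost certainly a bug
console.log(labelStrict(0)); // '0 items'
```

In review, any `if (x)` guarding a value that can legitimately be 0, '', or false is exactly the pattern to flag.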