The type checker that shipped with v0.1.0-alpha was 815 lines. It handled primitives, function types, and basic inference. Enough to type-check arithmetic and catch obvious mistakes. Not enough to compile anything interesting without fighting it.
By v0.1.5, it was over 4,000 lines, and the changes weren’t incremental. Whole categories of programs that should have worked — programs that did work at runtime — were being rejected or flagged with spurious warnings.
Literal coercion
This was the one that kept biting me. Hew’s default integer is i64, but the language has i8, i16, i32, u8, u16, u32, u64, and f64. Before literal coercion, writing a function that takes an i32 parameter meant you couldn’t just pass 42 — the literal was an i64, the parameter wanted an i32, and the type checker refused.
fn set_port(port: i32) {
// ...
}
// Before: type error — expected i32, got i64
set_port(8080);
// Workaround: explicit cast everywhere
set_port(8080 as i32);

Every integer literal forced a cast. Technically correct, completely unusable. (This is how you get people to close the tab.)
The fix was teaching the checker to recognize when an integer literal appears in a context with a known expected type. If the literal’s value fits in the target type, the coercion is implicit — no cast needed. 42 in an i32 context becomes an i32. 255 in a u8 context becomes a u8. Try to coerce 300 into a u8 and you still get an error, because that’s actually wrong.
fn set_port(port: i32) {
// ...
}
// After: literal coercion — 8080 fits in i32, just works
set_port(8080);
let buffer_size: u16 = 4096; // coerced, no cast
let flags: u8 = 0xFF; // coerced
let big: u8 = 300; // error: 300 doesn't fit in u8

The implementation walks the expected type down into function arguments, let bindings with type annotations, struct field initializers, and array elements. Anywhere the checker already knows what type it wants, it checks whether the literal fits. One function, called from about a dozen sites in the checker.
Explicit casts with as
Literal coercion handles the easy case — when you know the type at the point of the literal. But programs also need to convert between numeric types at runtime, where the value isn’t known at compile time.
let bytes_read: i64 = read(buffer);
let progress: f64 = bytes_read as f64 / total as f64 * 100.0;
let code: i32 = response.status();
let narrow: i16 = code as i16;

The as keyword handles numeric widening, narrowing, and int-to-float conversions. Widening is always safe. Narrowing truncates, same as C and Rust — if you cast 100000 to i16, you get whatever the low 16 bits say, and the compiler trusts that you meant it. Float-to-int truncates toward zero.
The cast validation was the source of a bug that took longer to find than the feature took to implement. The checker was rejecting f64 as i64 as an invalid cast because the type compatibility table only listed integer-to-integer conversions. Float-to-int was handled in codegen but not in the checker, so the MLIR backend would happily emit llvm.fptosi while the type checker complained about the same line. Programs compiled with warnings that said they shouldn’t work — and then worked fine. (Nothing erodes trust in diagnostics faster than that.)
The bytes type
Raw byte data isn’t text, and pretending it is causes exactly the kind of bugs that Go’s []byte-to-string conversions were designed to make visible. Network sockets, binary file reads, FFI buffers — none of these are UTF-8 strings, and Hew’s string type (UTF-8, length-prefixed, immutable) shouldn’t be anywhere near them.
bytes is a distinct primitive — not an alias for Vec<u8>, not a string with different encoding rules. It’s an owned, growable buffer of raw octets with its own methods:
let buf: bytes = bytes::new(1024);
buf.push(0xFF);
buf.extend(other_buf);
let slice = buf.slice(0, 16);
// Explicit conversion — not implicit
let text: string = buf.to_utf8(); // can fail
let raw: bytes = text.to_bytes(); // always succeeds

Making bytes a primitive rather than a type alias meant the type checker could enforce the boundary between text and binary data. You can’t pass bytes where string is expected, or vice versa, without an explicit conversion. The runtime representation is different too — bytes doesn’t carry encoding metadata, doesn’t validate on construction, and uses a simpler allocator path.
Indirect enums
This one came from trying to implement a tree type:
enum Tree {
Leaf(i64),
Node(Tree, Tree), // problem: infinite size
}

A Tree contains two Tree values. The struct layout is recursive — the size depends on itself. C++ solves this with pointers. Rust solves it with Box<T>. Hew needed its own answer.
Indirect enums put heap-allocated boxes around recursive variants automatically. The indirect keyword marks either a single variant or the entire enum:
enum Tree {
Leaf(i64),
indirect Node(Tree, Tree), // heap-allocated
}
// Or mark the whole enum
indirect enum Expr {
Literal(i64),
Add(Expr, Expr),
Mul(Expr, Expr),
}

The type checker validates that indirect is only applied to variants that actually need it — variants containing the enclosing enum type, directly or transitively. Marking a non-recursive variant as indirect is a warning, not an error, because it’s wasteful but not wrong. Codegen emits pointer indirection for indirect variants and inline storage for the rest, so Leaf(i64) in the Tree example is still a flat value — no allocation on the hot path.
Array-to-Vec coercion
Array literals and Vec are different types. An array [1, 2, 3] has a fixed size known at compile time. A Vec<i64> is heap-allocated and growable. But in practice, you almost always want to initialize a Vec from a literal:
// Before: construct a Vec, then push
let names: Vec<string> = Vec::new();
names.push("alice");
names.push("bob");
// After: array literal coerces to Vec
let names: Vec<string> = ["alice", "bob"];

When the expected type is Vec<T> and the expression is an array literal whose elements are all T (or coercible to T — literal coercion composes with this), the checker inserts an implicit conversion. Codegen allocates a Vec with the right capacity and copies the elements in. No intermediate array, no reallocation.
This interacts with literal coercion in a way that felt almost accidental but turned out to be exactly right:
let ports: Vec<i32> = [8080, 8443, 9090];

Two coercion rules fire: the array coerces to Vec<i32>, and each integer literal coerces from i64 to i32. The line just works.
What the checker looks like now
The original type checker was a single pass over the AST — check each expression, return a type, report mismatches. That worked when “checking” meant comparing two primitives. It broke down the moment you needed context flowing downward — expected types from function parameters, let bindings, struct fields — not just results flowing upward.
The current checker does bidirectional type checking. Expected types propagate down into expressions, and inferred types propagate back up. Literal coercion, array-to-Vec coercion, and enum variant construction all depend on the downward flow, while cast validation, method resolution, and generic instantiation depend on the upward flow — neither direction works alone.
The v0.1.7 WASM rebuild brought all of this into the browser playground — same checker compiled to WebAssembly, same diagnostics you’d see from hew build, running as you type.