2023-02-15 • 9 minutes required • Anima Omnium

What Austral Proves

Austral is a novel systems programming language designed by Fernando Borretti:

You can think of it as Rust: The Good Parts or a modernized, stripped-down Ada. It features a strong static type system, linear types, capability-based security, and strong modularity.

I’ve been thinking a lot about Austral recently. It bundles a lot of the right features together in one place.

Austral is designed for safety-critical systems software. Its philosophy is that if a programmer looks at some code, they should be able to say exactly what it does, down to the functions it calls and the assembly it emits. This goal has informed a number of design decisions: the language is unambiguous, resources are modeled with linear types, and library permissions are constrained using capabilities. It wouldn’t be a stretch to say that Austral is a language Dijkstra would love: it guides you towards writing perfect programs.

Austral is a simple language. Like Go, it is designed to be small enough to fit in the mind of a single programmer. The language has a compact 100-page specification; it’s not a big language:

A central constraint in the design of Austral is the language should be easy to implement. Not just because I was the only person writing the compiler, but because I want Austral to be a hundred year language, where you can reliably run code from decades ago (Common Lisp is like this: it is possible). If, for whatever reason, the source code of the bootstrapping compiler was lost, it would be trivial to rewrite it again from the ~100 pages of spec (roughly half of which describes the rationale for various design decisions).

To help you cobble together a rough picture of Austral, I’ll say it has an Ada-like syntax with Rust-like semantics. Like Rust, it has traits (typeclasses) for modeling interfaces. Austral also leverages the type system to model resources, which it does through the use of Linear Types. (Rust uses Affine Types, which are similar, but a little less restrictive.)

Austral divides values into two type Universes: free values, which are small and can be copied freely, and linear values, which must be used exactly once. This constraint ensures that there is one—and only one—handle to a linear value at any given point in the program.

Austral’s power to model resources stems from its Linear type system. Most data—like bools, ints, and small structs—can be modeled using normal free values. Resources—like memory, file descriptors, and database connections—are modeled as linear values. Linear values use the type system to statically ensure that there is only one handle to a resource at any given point in the program. Modeling resources is essential in systems programming, and Austral, like Rust, will catch memory-safety bugs (and other similar classes of errors) at compile-time.
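
To make the "one handle, used exactly once" rule concrete, here is a minimal Rust sketch of the closest mainstream analogue; FileHandle and close are hypothetical stand-ins, not real APIs. Rust's affine rules reject a second use of a moved value at compile time; a linear system like Austral's goes one step further and also rejects never using it.

// Hypothetical resource type; any non-Copy type behaves this way in Rust.
struct FileHandle {
    fd: i32,
}

// Consumes the handle, the closest thing to an explicit destructor.
fn close(handle: FileHandle) {
    println!("closing fd {}", handle.fd);
}

fn main() {
    let handle = FileHandle { fd: 3 };
    close(handle);      // `handle` is moved (used) here

    // close(handle);   // error[E0382]: use of moved value: `handle`
    // The compiler guarantees there is only one live handle to the resource.

    let forgotten = FileHandle { fd: 4 };
    // Rust (affine: "at most once") lets `forgotten` fall out of scope unused;
    // a linear system ("exactly once") like Austral's would reject this.
}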

Austral includes a few innovations over Rust, such as a novel borrow syntax for annotating regional lifetimes:

-- `x` is a linear value: it must be used exactly once.
let x: Lin := make();
borrow x as ref in R do
   -- Here, we can refer to the region by
   -- its name, `R`.
   let ref2: &[T, R] := transform(ref);
end;
-- The borrow has ended, so the linear value
-- can now be consumed.
consume(x);
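
For contrast, here is a rough Rust sketch of the same shape, with placeholder types (Lin, Wrapped, transform, consume); it is a loose analogue, not a translation. The lifetime parameter plays approximately the role of Austral's region name R, but it is declared on the function signature, or inferred, rather than being introduced at the borrow site.

struct Lin;                    // stand-in for some linear-ish resource
struct Wrapped<'r>(&'r Lin);   // stand-in for the transformed reference

// The lifetime `'r` plays roughly the role of Austral's region `R`,
// but it lives in the function signature, not at the borrow expression.
fn transform<'r>(r: &'r Lin) -> Wrapped<'r> {
    Wrapped(r)
}

fn consume(x: Lin) {
    let _ = x; // explicit cleanup would go here
}

fn main() {
    let x = Lin;
    {
        let r = &x;               // the borrow; its region is inferred
        let _ref2 = transform(r); // the result is tied to that (unnamed) region
    }                             // the borrow ends with the block
    consume(x);                   // the value can now be moved and consumed
}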

Because it is a simpler language by design, however, Austral is a bit more restrictive than Rust in a few areas. The largest restrictions that come to mind are:

  1. Linear Types require explicit destructors to clean up resources, because all values must be used exactly once. This can get a little verbose. Rust uses Affine Types (values must be used at most once), meaning that resources are cleaned up automatically according to Rust’s Drop semantics.

  2. Austral uses lexical regions to model lifetimes, meaning that the lifetimes of values are tracked at the lexical (block) level. Rust uses fine-grained non-lexical lifetimes, so there are some valid programs that Rust permits but Austral disallows (see the sketch after this list).

  3. Austral, like Rust, uses a borrow checker. As borrow checkers tie resources to the stack, and stacks map to the one-hole context of cycle-free inductive datatypes, it is impossible to express data structures that contain cycles in Rust or Austral (without an escape hatch like unsafe or another level of indirection).
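
To illustrate the second point, here is a small Rust program of my own, not taken from the Austral docs, of the kind non-lexical lifetimes accept: the shared borrow is lexically still in scope when the vector is mutated, so a strictly lexical checker would reject it, while Rust sees that the borrow's last use has already passed.

fn main() {
    let mut names = vec![String::from("ada"), String::from("rust")];

    let first = &names[0];        // shared borrow of `names` begins here
    println!("first = {first}");  // ...and, under NLL, ends at this last use

    // A lexical-region checker would consider `first` live until the end of
    // the enclosing block and reject this mutation; Rust's non-lexical
    // lifetimes accept it.
    names.push(String::from("austral"));
}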

It’s important to note that these restrictions are purposeful: Requiring explicit resource cleanup ensures that the drop-order is well-defined, and that there is no invisible performance impact due to automatically-inserted drop calls; lexical regions keep the compiler implementation, and thus the language itself, small and easy to reason about; and if you’re reaching for cyclic data at the application-level, you’re probably looking to model relations, in which case just use (a linear handle to) a relational database.
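
Concretely, here is a small Rust sketch of the "invisible" drops being traded away (Resource is just a stand-in type): nothing in the source mentions cleanup, yet the compiler inserts the Drop calls at the end of the scope, in reverse declaration order.

struct Resource(&'static str);

impl Drop for Resource {
    fn drop(&mut self) {
        // Imagine this closes a file or frees a buffer.
        println!("dropping {}", self.0);
    }
}

fn main() {
    let _first = Resource("first");
    let _second = Resource("second");
    println!("end of main");
    // No cleanup appears in the source. The compiler inserts the drops here,
    // in reverse declaration order: "dropping second", then "dropping first".
    // Austral would require an explicit destructor call for each value instead.
}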

Borretti has said that these restrictions exist to keep the language simple. By any metric, I’d say he’s succeeded. Heck, the linear type inference engine at the core of the language is only 600 lines long. No, seriously, take a look.

Rust, on the other hand, is a huge, complex language. In my mind, I pinned a lot of this complexity on Rust’s borrow checker and type system. Having worked with Rust for a while, I’d argue that a good amount of this complexity is accidental: when Rust took its first steps, no mainstream language sported a borrow-checker. The dichotomy for any new language at the time was slow-but-memory-safe or fast-but-footgunned. Rust proved this dichotomy false, but getting to that point wasn’t an easy road. (Polonius, anyone?)

While more restrictive than Rust, Austral, with its 600-line linear-type borrow-checker, proves that implementing compiler-enforced resource safety isn’t as hard as it once was; with the benefit of hindsight, we can avoid some of the accidental complexity originally thought necessary.

If you can express linear types in less code than a garbage collector, why not just use linear types to manage memory? A lot of theoretical groundwork laid in decades past, once dismissed as impractical, is finally becoming tractable and workable. We’re in a golden era of programming language research. LLVM, the Language Server Protocol, better resources, and bigger communities have turned designing a new compiler from a PhD thesis into a weekend project. There’s no excuse to ignore recent innovations. Linear types in 600 lines of code is quite the accomplishment, but it’s indicative of a larger trend.

I think we’re finally starting to get a handle on software engineering as a discipline. Gone are the wild-west days of PHP and CGI, shotgun debugging and panic deploys. Linear types, capabilities, algebraic effects, and program verification are old tools—finally becoming tractable—that let us understand, limit, and shape a program’s behavior, as Software Engineers.

Dijkstra grew disillusioned with Computing Science as a discipline: the tools simply weren’t there yet. Austral proves that this is no longer the case. The tools are on the table; what do we build from here?

It’s too early to say what will happen with Austral in the long run, but at the very least it has been designed for longevity. Like Rust before it, Austral started with a bootstrapping compiler written in OCaml; it seems that the goal is now to write a self-hosting compiler in Austral itself. I hope the language finds its niche (safety-critical embedded?) and quickly grows to do much more.

There will be some growing pains. At some point, as Rust discovered, the restrictions imposed by lexical regions will have to be re-evaluated. The package ecosystem is nascent, so while I wouldn’t build your next company in the language quite yet, it might be fun (and useful!) to port over a little low-level library. (Forget Rust, Rewrite it in Austral!)

To recap: Austral is a systems programming language that uses linear types to model resources. It’s small, and it’s working proof that borrow checkers need not be complex. I hope that Austral continues to grow and build on its innovations; I hope that future languages follow the example Austral sets and build on prior research in a way that makes it practically tractable. What’s Austral for dependent types? Automatic parallelization? Distributed computing? Time will tell.
