# How a Zig IDE Could Work

After parsing, the Ast is converted to an intermediate representation, Zir. This is where Zig diverges a bit from more typical statically compiled languages. Zir actually resembles something like Python’s bytecode — an intermediate representation that an interpreter for a dynamically-typed language would use. That’s because it is an interpreter’s IR — the next stage would use Zir to evaluate comptime.

Ast and Zir infra is good. It is per-file, so it naturally just works in an IDE.

# The evolution of Facebook’s iOS app architecture

After years of iteration, the Facebook codebase does not resemble a typical iOS codebase:

  • It’s full of C++, Objective-C(++), and Swift.
  • It has dozens of dynamically loaded libraries (dylibs), and so many classes that they can’t be loaded into Xcode at once.
  • There is almost zero raw usage of Apple’s SDK — everything has been wrapped or replaced by an in-house abstraction.
  • The app makes heavy use of code generation, spurred by Buck, our custom build system.
  • Without heavy caching from our build system, engineers would have to spend an entire workday waiting for the app to build.
  • FBiOS was never intentionally architected this way. The app’s codebase reflects 10 years of evolution, spurred by technical decisions necessary to support the growing number of engineers working on the app, its stability, and, above all, the user experience.

It must be either brilliant to work on or an absolute pain.

# Writability of Programming Languages

Main takeaways:

1. Commonplace syntax should be easy to type
2. // for comments is easier to type than #
3. Python’s indentation style is easy since you only need to use TAB (no end or {})
4. JS/C# lambda expressions using => are concise and easy to write
5. Short keywords like for in let var are easy to type
6. Using . for attributes (Python) is superior to $ (R)
7. >> is easier than |> or %>% for piping
8. Ruby’s usage of @ for @classvar is simpler than self.classvar
9. The ternary operator ?: is easy to write because it’s at the bottom right of the keyboard

Only the US keyboard layout is taken into account, but it’s interesting anyway.

# Speeding up the JavaScript ecosystem – eslint

In today’s world, for-of loops are supported everywhere, so I patched the package again and swapped the function implementation for the original one from the source. This single change saves about 400ms. I’m always impressed by how much CPU time we burn on wasteful polyfills or outdated downtranspilation.
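For a sense of what that wasted work looks like, here’s an illustrative comparison (my own sketch, not the actual eslint code): a plain for-of loop next to roughly the shape of hand-driven iterator code that ES5-targeted transpilation emits.

```javascript
// Modern code: the engine drives the iterator protocol natively.
function sumModern(values) {
  let total = 0;
  for (const v of values) total += v;
  return total;
}

// Roughly the shape of ES5-targeted output: the iterator protocol
// is driven by hand, adding property lookups, calls, and cleanup
// bookkeeping to every loop.
function sumTranspiled(values) {
  let total = 0;
  const iterator = values[Symbol.iterator]();
  let step;
  try {
    while (!(step = iterator.next()).done) {
      total += step.value;
    }
  } finally {
    // Close the iterator if the loop exited early.
    if (step && !step.done && typeof iterator.return === "function") {
      iterator.return();
    }
  }
  return total;
}

console.log(sumModern([1, 2, 3]), sumTranspiled([1, 2, 3])); // 6 6
```

Both versions compute the same thing; the second just spends more of its time on the iteration machinery itself, which is exactly the overhead the patch removed.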

So many libraries in our ecosystem suffer from this issue. I really wish there were a way to update them all with one click. Maybe we need a reverse transpiler that detects downtranspilation patterns and converts them back to modern code.

# Carving the Scheduler Out of Our Orchestrator

To turn our Docker transmogrification into a platform, we need to go from running a single job to running hundreds of thousands. That’s an engineering problem with a name: Orchestration.

Great article on how orchestrators work, and why Fly.io wrote their own.

# 30,000 lines of SwiftUI in production later: We love it but you know there was going to be a “but”

It took a few hours to fall in love with SwiftUI. So much so that we instantly decided to abandon a cross-platform codebase and go fully native on iOS. timing.is shipped this month on the App Store. It was built entirely in SwiftUI. Its development took several months. It would have been less if SwiftUI just gave. Unfortunately, repeatedly it would take.

Can confirm: SwiftUI is nice and easy when it works, but in general, it’s a giant pain. It’s embarrassing for Apple to release a framework in such a half-baked state.

It was fun at least 51% of the time. But let’s talk about the <= 49% that wasn’t.

Imagine writing lots of code that works well, then getting stuck for a few days on a tiny thing that just doesn’t want to work. That’s the SwiftUI experience.

# Top-level await in JavaScript REPLs is a hack

Today I learned that top-level await in JavaScript REPLs, such as Chrome’s Developer Tools console and Node.js REPL, is a BIG HACK!

Top-level await is a feature of JavaScript that allows you to use await outside of an async function, for example:

await Promise.resolve("cool")

Before it was added, the only way to use await was to wrap it in an async function:

(async () => await Promise.resolve("cool"))()

However, top-level await only works within modules. This is a problem for REPLs, since they don’t make a module for every expression that you type. They basically use eval() on it, running in the global scope: if you type x = 1 or var x = 1, you expect x to be a global variable.
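Both halves of that problem can be demonstrated in plain Node, outside any REPL (a small sketch of my own, not code from either implementation): `await` fails to parse in eval’d code, while indirect eval really does run in the global scope.

```javascript
// eval() always parses its argument as classic script code, never
// as a module, so a top-level await is rejected at parse time.
let parseError;
try {
  eval("await Promise.resolve('cool');");
} catch (e) {
  parseError = e;
}
console.log(parseError instanceof SyntaxError); // true

// Meanwhile, indirect eval runs in the global scope, which is why
// typing `var x = 1` into a REPL is expected to create a global.
(0, eval)("var replDemo = 1;");
console.log(globalThis.replDemo); // 1
```

So a naive eval-based REPL gets the global-variable behavior users expect, but chokes on the very first `await`.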

So, await shouldn’t work. But it does! How? I initially thought they implemented some kind of special eval-level await in V8. Nope!

Turns out REPLs parse your expression and rewrite it into an async function! Both Node and Chrome use the acorn.js parser for that 🤯

If you type

await Promise.resolve('cool')

they turn it into

(async () => {return (await Promise.resolve('cool'));})()

What about global variables, though? If you type var x = 1 and wrap it in a function, the var will be local to the function. But we want it to be global.

Here’s the trick — they also rewrite variable definitions:

var x = 1; await Promise.resolve('ok');

turns into:

(async () => {void (x = 1); return (await Promise.resolve('ok'));})()

They strip var/let/const from what are now function-scoped variables and just assign the values to globals. Notice that your const will not actually be a const, but that’s fine, since REPLs allow redeclaring and reassigning top-level const and let anyway.
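The whole trick fits in a few lines. Here’s a toy rewriter (my own simplification — the real implementations parse with acorn and transform the AST) that strips declarations and wraps the source in an async IIFE. It naively splits on semicolons, so it breaks on string literals containing `;`, nested statements, and much else.

```javascript
// Toy version of the REPL top-level-await rewrite. Only handles
// flat, semicolon-separated statements.
function rewriteTopLevelAwait(src) {
  // Fast path: skip the rewrite machinery entirely when the source
  // can't contain an await (Chrome's check looks for "async").
  if (!src.includes("await")) return src;
  // Strip declaration keywords so the assignments hit the global
  // scope (a cruder take on the `void (x = 1)` rewrite).
  const stripped = src.replace(/\b(?:var|let|const)\s+/g, "");
  // Wrap in an async IIFE, returning the last statement's value.
  const parts = stripped.split(";").map(s => s.trim()).filter(Boolean);
  const last = parts.pop();
  return `(async () => { ${parts.map(p => p + "; ").join("")}return (${last}); })()`;
}

console.log(rewriteTopLevelAwait("var x = 1; await Promise.resolve('ok')"));
// (async () => { x = 1; return (await Promise.resolve('ok')); })()
```

Feeding the result back through indirect eval yields a promise that resolves to 'ok', with x landing on the global object — the same observable behavior the real REPLs produce.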

One funny thing: before running this whole parsing machinery, Chrome DevTools checks whether rewriting is needed at all by looking for “async” in your code. So, if you type console.log("async work"), it will execute a tiny bit slower than console.log("meetings") 😜

If you want more details, read the processTopLevelAwait function in Node’s source.

# Let’s build GPT: from scratch, in code, spelled out

Andrej Karpathy’s lecture:

We build a Generatively Pretrained Transformer (GPT), following the paper “Attention is All You Need” and OpenAI’s GPT-2 / GPT-3. We talk about connections to ChatGPT, which has taken the world by storm. We watch GitHub Copilot, itself a GPT, help us write a GPT (meta :D!)