# 30,000 lines of SwiftUI in production later: We love it but you know there was going to be a “but”

It took a few hours to fall in love with SwiftUI. So much so that we instantly decided to abandon a cross-platform codebase and go fully native on iOS. timing.is shipped this month on the App Store. It was built entirely in SwiftUI. Its development took several months. It would have been less if SwiftUI just gave. Unfortunately, repeatedly it would take.

Can confirm: SwiftUI is nice and easy when it works, but in general it’s a giant pain. It’s embarrassing for Apple to release a framework in such a half-baked state.

It was fun at least 51% of the time. But let’s talk about the <= 49% that wasn’t.

Imagine writing lots of code that works well, then getting stuck for a few days on a tiny thing that just doesn’t want to work. That’s the SwiftUI experience.

# Top-level await in JavaScript REPLs is a hack

Today I learned that top-level await in JavaScript REPLs, such as Chrome’s Developer Tools console and Node.js REPL, is a BIG HACK!

Top-level await is a feature of JavaScript that allows you to use await outside of an async function, for example:

await Promise.resolve("cool")

Before it was added, the only way to use await was to wrap it in an async function:

(async () => await Promise.resolve("cool"))()

However, top-level await only works within modules. This is a problem for REPLs, since they don’t make a module for every expression that you type. They basically use eval() on it, running in the global scope: if you type x = 1 or var x = 1, you expect x to be a global variable.
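
A quick way to see the problem — a small sketch, assuming a plain Node script run in a non-module context:

```javascript
// Outside a module (and outside an async function), `await` is just an
// ordinary identifier, so `await Promise.resolve('cool')` fails to parse.
// eval() inherits that non-module context, which is why a REPL can't
// simply eval() your input when it contains a top-level await.
let error;
try {
  eval("await Promise.resolve('cool')");
} catch (e) {
  error = e;
}
console.log(error instanceof SyntaxError); // → true
```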

So, await shouldn’t work. But it does work! How? I initially thought they implemented some kind of special eval-level await in V8. Nope!

Turns out REPLs parse your expression and rewrite it into an async function! Both Node and Chrome use the acorn parser for that 🤯

If you type

await Promise.resolve('cool')

they turn it into

(async () => {return (await Promise.resolve('cool'));})()

What about global variables, though? If you type var x = 1 and wrap it in a function, the var will be local to that function. But we want it to be global.

Here’s the trick — they also rewrite variable definitions:

var x = 1; await Promise.resolve('ok');

turns into:

(async () => {void (x = 1); return (await Promise.resolve('ok'));})()

They strip var/let/const from what are now function-scoped variables and just assign the values to globals. Notice that your const will not actually be a const, but that’s fine, since REPLs allow redeclaring and reassigning top-level const and let anyway.

One funny thing: before running this whole parsing machinery, Chrome DevTools checks whether rewriting is needed at all by looking for “async” in your code. So if you type console.log("async work"), it will execute a tiny bit slower than console.log("meetings") 😜

If you want more details, read the processTopLevelAwait function in Node’s source.

# Delphi VCL & FMX Libraries for Python

The DelphiVCL and DelphiFMX libraries for Python are a set of Python modules that put the robust and mature VCL and FireMonkey (FMX) GUI libraries in the hands of Python developers.

Check out this sample:

from delphivcl import *

class MainForm(Form):

    def __init__(self, owner):
        self.Caption = "A VCL Form..."
        self.SetBounds(10, 10, 500, 400)
        self.Position = "poScreenCenter"

        self.lblHello = Label(self)
        self.lblHello.SetProps(Parent=self, Caption="Hello DelphiVCL for Python")
        self.lblHello.SetBounds(10, 10, 300, 24)

        self.OnClose = self.__on_form_close

    def __on_form_close(self, sender, action):
        action.Value = caFree

def main():
    Application.Initialize()
    Application.Title = "Hello Python"
    Main = MainForm(Application)
    Main.Show()
    Application.Run()
    Main.Destroy()

main()

Very nice and nostalgic! I used VCL a lot in the 2000s, and was very productive with it. This version is not open source, but has some kind of freeware license. I personally wouldn’t use any non-platform-provided UI library that’s not open source, for practical reasons.

# Production Twitter on One Machine? 100Gbps NICs and NVMe are fast

In this post I’ll attempt the fun stunt of designing a system that could serve the full production load of Twitter, with most of the features intact, on a single (very powerful) machine. I’ll start by showing off a Rust prototype of the core tweet distribution data structure handling 35x full load by fitting the hot set in RAM and parallelizing with atomics, and then do the math on how modern high-performance storage and networking might let you serve a close-to-fully-featured Twitter on one machine.


# Color Formats in CSS

A nice introduction to CSS color formats by Joshua Comeau. I learned about the upcoming lch():

LCH is a color format that aims to be perceptually uniform to humans. Two colors with an equivalent “lightness” value should feel equally light!

Looks great!

Also, TIL about that rgb(r g b / a) thing I’ve seen a few times:

For most of CSS’ existence, we specified RGB colors using a slightly different syntax.

This changed in CSS Colors level 4, which introduces a standardized notation used across newer color formats. rgba() isn’t explicitly deprecated, but it’s recommended to use the newer format (fortunately, browser support is excellent).
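
For reference, here are the legacy and modern notations side by side — a small sketch, with made-up illustrative color values:

```css
.badge {
  /* legacy comma syntax; alpha needs the separate rgba() function */
  background: rgba(255, 0, 0, 0.5);

  /* CSS Color Level 4 space-separated syntax; alpha goes after a slash */
  background: rgb(255 0 0 / 0.5);

  /* lch(lightness chroma hue) — equal lightness values feel equally light */
  background: lch(52% 107 40);
}
```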