# Writability of Programming Languages

Main takeaways:

1. Commonplace syntax should be easy to type
2. // for comments is easier to type than #
3. Python’s indentation style is easy since you only need to use TAB (no end or {})
4. JS/C# lambda expressions using => are concise and easy to write
5. Short keywords like for in let var are easy to type
6. Using . for attributes (Python) is superior to $ (R)
7. >> is easier than |> or %>% for piping
8. Ruby’s @ prefix (as in @var) is simpler than Python’s self.var
9. The ternary operator ?: is easy to write because it’s at the bottom right of the keyboard
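
As a quick illustration (my own sketch, not from the article), several of these constructs combine naturally in a couple of lines of JavaScript:

// Arrow lambda (4), short keywords (5), dot attributes (6), ternary (9):
const greet = (user) => user.name ? `hi, ${user.name}` : "hi, stranger";
for (const name of ["Ada", ""]) console.log(greet({ name }));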

It only takes the US keyboard layout into account, but it’s interesting anyway.

# Speeding up the JavaScript ecosystem – eslint

In today’s world, for-of loops are supported everywhere, so I patched the package again and swapped the downtranspiled function for the original implementation from the source. This single change saves about 400 ms. I’m always impressed by how much CPU time we burn on wasteful polyfills or outdated downtranspilation.
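
To make the waste concrete, here’s a rough sketch (my own, not eslint’s actual code) of what downleveled iteration tends to look like next to the modern loop; real transpiler output adds even more, like try/finally cleanup and array fast paths:

// Roughly the shape of downtranspiled iteration (illustrative only):
function sumTranspiled(iterable) {
  var total = 0;
  var iterator = iterable[Symbol.iterator]();
  var step;
  while (!(step = iterator.next()).done) {
    total += step.value;
  }
  return total;
}

// The modern equivalent, supported natively by every current runtime:
function sumModern(iterable) {
  let total = 0;
  for (const value of iterable) total += value;
  return total;
}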

So many libraries in our ecosystem suffer from this issue. I really wish there were a way to update them all with one click. Maybe we need a reverse transpiler that detects downtranspilation patterns and converts the code back to modern syntax.

# 30,000 lines of SwiftUI in production later: We love it but you know there was going to be a “but”

It took a few hours to fall in love with SwiftUI. So much so that we instantly decided to abandon a cross-platform codebase and go fully native on iOS. timing.is shipped this month on the App Store, built entirely in SwiftUI. Its development took several months; it would have taken less if SwiftUI just gave. Unfortunately, repeatedly, it would take.

Can confirm: SwiftUI is nice and easy when it works, but in general, it’s a giant pain. It’s embarrassing for Apple to release a framework in such a half-baked state.

It was fun at least 51% of the time. But let’s talk about the <= 49% that wasn’t.

Imagine writing lots of code that works well, and then getting stuck for a few days on a tiny thing that just doesn’t want to work. That’s the SwiftUI experience.

# Top-level await in JavaScript REPLs is a hack

Today I learned that top-level await in JavaScript REPLs, such as Chrome’s Developer Tools console and Node.js REPL, is a BIG HACK!

Top-level await is a feature of JavaScript that allows you to use await outside of an async function, for example:

await Promise.resolve("cool")

Before it was added, the only way to use await was to wrap it in an async function:

(async () => await Promise.resolve("cool"))()

However, top-level await only works within modules. This is a problem for REPLs, since they don’t make a module for every expression that you type. They basically use eval() on it, running in the global scope: if you type x = 1 or var x = 1, you expect x to be a global variable.
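
To illustrate the global-scope point (my own example, not from any REPL’s source), an indirect eval runs in the global scope, so a var declaration becomes a global:

// Calling eval through a reference makes it an *indirect* eval,
// which always runs in the global scope:
const indirectEval = eval;
indirectEval("var x = 1");
console.log(globalThis.x); // 1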

So, await shouldn’t work. But it does! How? I initially thought they implemented some kind of special eval-level await in V8. Nope!

Turns out REPLs parse your expression and rewrite it into an async function! Both Node and Chrome use the acorn.js parser for that 🤯

If you type

await Promise.resolve('cool')

they turn it into

(async () => {return (await Promise.resolve('cool'));})()

What about global variables, though? If you type var x = 1 and wrap it in a function, the var will be local to the function. But we want it to be global.

Here’s the trick: they also rewrite variable declarations:

var x = 1; await Promise.resolve('ok');

turns into:

(async () => {void (x = 1); return (await Promise.resolve('ok'));})()

They strip var/let/const from what are now function-scoped variables and just assign the values to globals. Notice that your const will not actually be const, but that’s fine, since REPLs allow redeclaring and reassigning top-level const and let anyway.

One funny thing: before running this whole parser machinery, Chrome DevTools checks whether rewriting is needed at all by looking for “async” in your code. So, if you type console.log("async work"), it will execute a tiny bit slower than console.log("meetings") 😜
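
Putting the pieces together, here’s a toy version of the rewrite using acorn. This is loudly a sketch of the idea, not Node’s actual processTopLevelAwait: it only handles a single expression statement and skips the var/let/const hoisting described above:

// Toy REPL rewrite: wrap a top-level await expression in an async IIFE.
const acorn = require("acorn");

function rewriteForRepl(src) {
  // Cheap substring pre-check (Chrome reportedly looks for "async";
  // we look for "await" to keep the toy simple):
  if (!src.includes("await")) return src;
  const ast = acorn.parse(src, {
    ecmaVersion: "latest",
    allowAwaitOutsideFunction: true, // accept top-level await in "script" code
  });
  // Only handle the simplest case: exactly one expression statement
  if (ast.body.length !== 1 || ast.body[0].type !== "ExpressionStatement") {
    return src;
  }
  const expr = ast.body[0].expression;
  return `(async () => { return (${src.slice(expr.start, expr.end)}); })()`;
}

console.log(rewriteForRepl("await Promise.resolve('cool')"));
// -> (async () => { return (await Promise.resolve('cool')); })()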

If you want more details, read this processTopLevelAwait function in Node.

# Delphi VCL & FMX Libraries for Python

The DelphiVCL and DelphiFMX libraries for Python are a set of Python modules that put the robust and mature VCL and FireMonkey (FMX) GUI libraries in the hands of Python developers.

Check out this sample:

from delphivcl import *

class MainForm(Form):

    def __init__(self, owner):
        self.Caption = "A VCL Form..."
        self.SetBounds(10, 10, 500, 400)
        self.Position = "poScreenCenter"

        self.lblHello = Label(self)
        self.lblHello.SetProps(Parent=self, 
            Caption="Hello DelphiVCL for Python")
        self.lblHello.SetBounds(10, 10, 300, 24)

        self.OnClose = self.__on_form_close


    def __on_form_close(self, sender, action):
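        # caFree tells the VCL to free the form object when the window closes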
        action.Value = caFree


def main():
    Application.Initialize()
    Application.Title = "Hello Python"
    Main = MainForm(Application)
    Main.Show()
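    # FreeConsole() (a Windows API call) detaches the console so only the GUI shows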
    FreeConsole()
    Application.Run()
    Main.Destroy()


main()

Very nice and nostalgic! I used VCL a lot in the 2000s, and was very productive with it. This version is not open source, but has some kind of freeware license. I personally wouldn’t use any non-platform-provided UI library that’s not open source, for practical reasons.

# Production Twitter on One Machine? 100Gbps NICs and NVMe are fast

In this post I’ll attempt the fun stunt of designing a system that could serve the full production load of Twitter with most of the features intact on a single (very powerful) machine. I’ll start by showing off a Rust prototype of the core tweet distribution data structure handling 35x full load by fitting the hot set in RAM and parallelizing with atomics, and then do the math on how modern high-performance storage and networking might let you serve a close-to-fully-featured Twitter on one machine.

Fun!