The power that new tooling unlocks

Note: this is the first post that I’ll try submitting to Hacker News for discussion/feedback. I do have a bunch of things I’d like to write about, but I noticed that since I never make my writing visible to other people, there’s really no incentive for me to actually write it. Maybe some external interaction will push me to write more.


Go watch episode 231 of Smarter Every Day. It’s a tour of a rocket factory. The entire thing is very cool to watch, but if you’re in a hurry and just want to know what that video has to do with software development tools, I suggest watching the 90-second segment starting at 8:22 and ending at 9:52 (feel free to come back here after that timestamp). What Tory Bruno says at the end of that segment is very important: “It’s sort of an interesting thing, in the real world, how the engineering tools that are available dictate the kind of designs that we use”.

Ever since watching that video about two months ago, I still pause and think about this quote from time to time. It’s the kind of thing that sounds obvious, but you don’t really notice it until someone else points it out to you. I like to think of this as an extension to Conway’s law: the design of a system is not only influenced by an organization’s communication structure, but also by the tools available to it. And if the organization is also the one creating its own tools, well… you’ll see a much stronger reflection of its communication structure in its designs.

Now, read this entry on Joe Armstrong’s old blog. You can take all sorts of things away from it, but I like to see it as an anecdote on the power that the right tools unlock. If you just want to transfer files between two machines and all you have are FTP servers full of features that require lots of configuration before doing anything, you’ll have to go through a lot of crap before achieving what you want.

This comment on Hacker News is another anecdote about the same thing. I’ll quote part of it: “…which made using the tool change from ’let’s go grab lunch while this finishes’ to ’let’s stream the logs through this live’”.

I believe the tools we use in software engineering have a much bigger impact on how we design software than most people think. And that’s why I think we should never be satisfied with the tools we have. There are two ways that I see new tools unlocking new potential:

Doing a job faster

A tool can do a job faster, perhaps a lot faster. If it does, it unlocks all sorts of cool use cases that people didn’t expect to be possible. The trick is that it’s really hard to anticipate these new use cases, so you never know whether it’s worth investing time in making some tool faster. And even if you do make a tool faster, you still need the right experience and the right mindset to try new things before you can start to figure out the different scenarios that benefit from the faster tool.

This may sound more pessimistic than it’s intended to be. Sure, a faster tool will at least save you time on the task you need it to perform. But let’s briefly appeal to imagination here. Take that 50x faster tool from the HN comment I linked earlier. Suppose that tool is used in a system at your company that pulls lots of logs from customer machines, aggregates them in some way, and shows them to the customer in some nice UI. Before the tool became 50x faster, your company’s product could only show aggregated log data the next day - which means a customer would only get any value out of the system 24 hours after the logs were generated. Now, with the faster tool, you can stream the results to the customer! That’s a lot of value right there.

However, customer logs are somewhat sensitive. It’s hard for your company to earn customers’ trust because your product still requires access to the raw logs. Especially with all these privacy regulations, customers are more concerned about sending log data to third parties (well, that’s probably not true at all, but I’d like to believe it. We’re in imagination land here). But hey - you just made a tool 50x faster. You can now run it on your customers’ machines (and they’re fine with the small performance impact of this new tool), and pull only the aggregated data out of it. Now your product becomes appealing to a different set of customers, and you just got a lot more business for your company. That has a lot more impact.
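To make that hypothetical a bit more concrete, here’s a toy sketch in Go of what “run the aggregation on the customer’s machine and ship only the summary” could look like. Everything here is made up to match the imaginary scenario (the log levels, the idea of printing a JSON summary) - it’s not any real product’s design, just the shape of the idea: read raw log lines locally, count them, and send a tiny summary instead of the logs themselves.

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// A toy "edge aggregator": the raw log lines never leave the machine;
// only the per-level counts do.
func main() {
	counts := map[string]int{}

	scanner := bufio.NewScanner(os.Stdin)
	for scanner.Scan() {
		line := scanner.Text()
		switch {
		case strings.Contains(line, "ERROR"):
			counts["error"]++
		case strings.Contains(line, "WARN"):
			counts["warn"]++
		default:
			counts["other"]++
		}
	}

	// Emit a small JSON summary - this is all that would get sent upstream.
	summary, _ := json.Marshal(counts)
	fmt.Println(string(summary))
}
```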

We just explored two outcomes of having a much faster tool for the job. One of them is very straightforward and gives you some benefits, but the other one has a lot more upside. The difference is that the latter required more experience (in this case business experience: knowing what keeps people from signing up for your product) and the right mindset to try a completely different approach (“we’re a service business, but what if we tried running this tool on our customers’ machines?”). That’s what I meant when I said it’s really hard to figure out these new use cases just from making things faster.

We can leave imagination land and look at something real for once: cloud gaming. There are a bunch of game streaming services now (Google Stadia, GeForce Now, Xbox xCloud), all made possible because, among many other advancements, a lot of things in gaming became faster (it wasn’t just hardware, although faster hardware certainly helps). Faster gaming by itself would just let me play games at higher frame rates on my computer. It took a mix of other things to unlock this new use case.

Creating a different paradigm

I like the example of programming languages here, because they’ve been part of every day of my life for over a decade now (just using them - I’m not smart enough to create a new language yet). I’ll be boring and just reference Go and Rust. Both are relatively new languages, but they require you to take different approaches when writing code. In doing so, they simplify tasks that are way more complicated in other languages. In Go’s case, I like how its channels let you write concurrent code with very little ceremony. In Rust’s case, the compile-time checks and guarantees simplify the entire software development cycle (you can eliminate a bunch of steps before shipping code because the compiler ensures whole classes of bad things, like data races, won’t happen).
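As a tiny illustration of the Go point - a minimal sketch with made-up work, not anything from a real codebase - here’s the kind of thing channels make almost effortless: fan some work out to goroutines and collect the results back over a channel, with no locks or condition variables in sight.

```go
package main

import (
	"fmt"
	"strings"
)

// process stands in for whatever per-item work you'd want to run concurrently.
func process(s string) string {
	return strings.ToUpper(s)
}

func main() {
	inputs := []string{"tools", "unlock", "designs"}
	results := make(chan string, len(inputs))

	// Fan out: one goroutine per input, each sending its result on the channel.
	for _, in := range inputs {
		go func(in string) {
			results <- process(in)
		}(in)
	}

	// Collect exactly as many results as work items were sent.
	for range inputs {
		fmt.Println(<-results)
	}
}
```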

Those two languages are new tools you can use when developing software, and (if used in the right situations) they simplify tasks that used to be more complicated or slower. Generally, the use cases they enable are straightforward; the trick is that building the tools is a lot of work. It’s not as simple as “make something faster” (not that I’m trying to diminish the effort needed to make things faster).

Always invest in tooling

Go back to that video snippet I shared at the beginning of this post. Recall how Tory mentions that the old pattern for the rocket barrel was designed in the ’90s. Looking at the Wikipedia entry for Vulcan, I think it’s safe to say that it took around 15 years to get a new rocket barrel design. That doesn’t necessarily mean it took that long for new tools to become available, but just think about that time interval. 15 years is about as long as I’ve been programming. That’s a lot of time.

There’s still a lot of potential out there. Lots of new use cases we haven’t even thought about yet. Those will only be possible if you’re always investing in new tooling, and especially if you’re also looking at new tools that other people are building. One day, you might just find the right combination of a new tool, the experience and the right mindset to unlock cool new things for all of us.