
Concurrency's Future: Structured and Unstructured

Episode #194 • Jun 27, 2022 • Subscriber-Only

There are amazing features of Swift concurrency that don’t quite fit into our narrative of examining it through the lens of past concurrency tools. Instead, we’ll examine them through the lens of a past programming paradigm, structured programming, and see what it has to say about structured concurrency.



Introduction

This is yet another example of how difficult multithreaded programming can be. Just because we have extremely powerful tools for preventing data races doesn’t mean we have removed the possibility of non-determinism creeping into our code. Simply by firing off a bunch of concurrent tasks at once we introduce some non-determinism into the system, based on how the system schedules and prioritizes all of those tasks. If we don’t want that kind of non-determinism, then we shouldn’t be performing concurrent work.

But the issue of non-determinism is completely separate from the issue of data races, and Swift’s tools are tuned to address data races, not non-determinism.
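As a tiny, hypothetical illustration of that point (our own sketch, not code from the episode): the following task group has no shared mutable state, and so no data races, yet the order of its output can change from run to run simply because the scheduler decides when each child task runs:

```swift
func raceFreeButNonDeterministic() async {
  await withTaskGroup(of: Void.self) { group in
    for index in 1...5 {
      group.addTask {
        // Simulate a small, variable amount of work.
        try? await Task.sleep(nanoseconds: UInt64.random(in: 1_000_000...10_000_000))
        print("Finished task", index)
      }
    }
  }
}
```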

We’ve now seen how Swift’s new concurrency tools compare to many of the other tools on Apple’s platforms, including threads, operation queues, dispatch queues and the Combine framework. And in pretty much every category that we considered, Swift’s new concurrency tools blew the old tools out of the water:

  • First, the concepts of asynchrony and concurrency are now baked directly into the language rather than bolted on as a library. Swift can now express when a function needs to perform asynchronous work, using the new async keyword, and it can express types and functions that can be used concurrently, using the new Sendable protocol and @Sendable attribute (see the first sketch after this list).

  • Second, although we don’t explicitly manage something like a thread pool or an execution queue, Swift somehow allows spinning up many thousands of concurrent tasks without exploding the number of threads created. In fact, a max of only 10 threads seems to be created on our computers, as the second sketch after this list demonstrates.

  • Third, tasks have all the features that threads and queues had, such as priority, cooperative cancellation and storage, but in each case tasks massively improve the situation over the older tools. Cancellation is deeply ingrained into the system so that the cancellation of a top-level task trickles down to its child tasks, and task storage is now inherited from parent task to child task, allowing you to nest locals in complex yet understandable ways. The third sketch below shows both behaviors.

  • Fourth, although Swift’s concurrency runtime limits us to a small number of threads in the cooperative thread pool, Swift does give us tools that help us not clog up that pool. Using things like non-blocking asynchronous functions and Task.yield, we suspend our functions to allow other tasks to use our thread, and then once we are ready to resume, a thread is automatically provided to us, as in the fourth sketch below.

  • Fifth, and perhaps most exciting, Swift now provides a first-class type for synchronizing and isolating mutable data in such a way that the compiler understands when you might have used it incorrectly. They’re called actors, and they allow you to write code that largely looks like simple, synchronous code, but under the hood it is locking and unlocking access to the mutable data, as the final sketch below shows.
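To make that recap concrete, here are a few small sketches. They are our own illustrations with made-up names, not code from the episode. First, asynchrony and concurrency expressed directly in the language: an async function that suspends while it performs asynchronous work, a Sendable type, and a @Sendable closure:

```swift
import Foundation

// `async` marks a function that can suspend while it performs asynchronous work.
func fetchByteCount(from url: URL) async throws -> Int {
  let (data, _) = try await URLSession.shared.data(from: url)
  return data.count
}

// `Sendable` marks a type whose values are safe to share across concurrent tasks.
struct Settings: Sendable {
  var retryCount: Int
}

// `@Sendable` marks a closure that is safe to run concurrently.
func perform(_ work: @escaping @Sendable () async -> Void) {
  Task { await work() }
}
```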
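Second, a quick experiment (again hypothetical code of our own) that fires off thousands of tasks and records which threads they actually run on. Even with 10,000 tasks, only a handful of distinct threads ever show up:

```swift
import Foundation

// An actor that collects thread descriptions without data races.
actor ThreadNameCollector {
  private(set) var names: Set<String> = []
  func insert(_ name: String) { names.insert(name) }
}

func observeThreadPool() async {
  let collector = ThreadNameCollector()
  await withTaskGroup(of: Void.self) { group in
    for _ in 1...10_000 {
      group.addTask {
        // Record the thread this child task happens to run on.
        await collector.insert(Thread.current.description)
      }
    }
  }
  print("Distinct threads used:", await collector.names.count)
}
```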
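Third, a sketch of how cancellation and task-local values flow from a parent task to its children, using a hypothetical RequestID task local of our own invention:

```swift
enum RequestID {
  @TaskLocal static var current: String = "unknown"
}

func fetchAll() async {
  await RequestID.$current.withValue("req-42") {
    await withTaskGroup(of: Void.self) { group in
      for index in 1...3 {
        group.addTask {
          // Child tasks inherit the task-local value bound in the parent...
          print(RequestID.current, "starting child", index)
          // ...and they observe the parent's cancellation with no extra plumbing.
          guard !Task.isCancelled else { return }
          try? await Task.sleep(nanoseconds: 100_000_000)
        }
      }
      // Cancelling the group cancels every child task that is still running.
      group.cancelAll()
    }
  }
}
```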
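Fourth, a long-running, CPU-bound function (also our own toy example) that periodically calls Task.yield so that it doesn’t hog one of the pool’s few threads:

```swift
func checksum(of bytes: [UInt8]) async -> Int {
  var sum = 0
  for (index, byte) in bytes.enumerated() {
    sum = (sum &+ Int(byte)) & 0xFFFF
    // Every so often, suspend and hand the thread back to the cooperative pool.
    if index.isMultiple(of: 10_000) {
      await Task.yield()
    }
  }
  return sum
}
```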
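And fifth, a minimal actor (our own example, not the episode’s) that isolates mutable state. Callers write what looks like ordinary synchronous code, and the compiler forces an await at the boundary so that access is serialized:

```swift
actor Counter {
  private(set) var value = 0

  func increment() -> Int {
    value += 1
    return value
  }
}

func exerciseCounter() async {
  let counter = Counter()
  await withTaskGroup(of: Void.self) { group in
    for _ in 1...1_000 {
      group.addTask { _ = await counter.increment() }
    }
  }
  // All 1,000 increments landed, with no data races and no manual locking.
  print(await counter.value)  // 1000
}
```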

It’s already pretty impressive for Swift to accomplish so much so quickly. But there’s even more. Swift’s new concurrency tools allow us to write our asynchronous and concurrent code in a style that is substantially different from how we wrote our code with threads and queues. There are other features of Swift concurrency that are so unique that there’s just nothing in the older concurrency tools, such as threads and queues, that we can really compare them to.

So, we’d like to take one more episode in this series on concurrency to discuss the amazing features that don’t quite fit into our narrative of looking at concurrency through the lens of the past.

And we will begin by discussing the concept of structured concurrency. Well really, let’s back up a bit and talk about structured programming in general so that we know why structured concurrency is such a big deal.

Most modern, popular languages are primarily “structured programming languages”, so there’s a very good chance that you have never really programmed in an “unstructured” way. To put it simply, structured programming is a paradigm that aims to make programs read linearly from top to bottom. Doing so can help you compartmentalize parts of the program as black boxes so that you don’t have to be intimately familiar with all of their details at all times. The bread and butter of structured programming are tools like conditionals, loops, function calls and recursion.

This may seem very intuitive and obvious to you, but back in the 1950s it wasn’t so clear. At that time human-readable programming languages were still quite nascent, and so those languages had tools that made a lot of sense for how the code was run at a low level on the machine, but were difficult for humans to fully understand.

An example of such a tool is the jump command. It allows you to redirect the flow of execution of the program to any other part of the program. Swift doesn’t have this tool, at least not in full generality, but let’s look at what it could have looked like.
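Swift has no general-purpose jump, so as a rough stand-in (our own illustration, not the episode’s example), here is a tiny loop written first in ordinary structured style, and then again in a jump-like style where an explicit “program counter” and a switch simulate goto. The two compute the same thing, but the second forces you to trace the jumps in your head:

```swift
// Structured version: the control flow reads straight down the page.
// Assumes n >= 1.
func sumStructured(upTo n: Int) -> Int {
  var total = 0
  for i in 1...n {
    total += i
  }
  return total
}

// Jump-style version: a "program counter" and a switch simulate `goto`.
// Assumes n >= 1.
func sumWithJumps(upTo n: Int) -> Int {
  var total = 0
  var i = 1
  var line = 0
  while true {
    switch line {
    case 0:          // loop body
      total += i
      line = 1
    case 1:          // "jump" back to the body, or past the loop
      i += 1
      line = i <= n ? 0 : 2
    default:         // after the loop
      return total
    }
  }
}
```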

Structured programming


