Swift Concurrency Explained: GCD, Operation Queues, and Async/Await

Concurrency is the ability of an app to perform multiple tasks at once, and it’s a crucial concept for any app that needs to do several things at the same time while staying efficient and usable. Thankfully Swift has made great strides with concurrency, and now provides simple tools for writing robust apps that are responsive and enjoyable to use.

In this article we’ll explore two main approaches to concurrency in Swift. We’ll start with the classic model: threads themselves, Grand Central Dispatch, and Operation Queues. Afterwards we’ll look at async/await, a more modern, simpler approach that has recently been added to the arsenal of Swift tools at our disposal.

But before that, a quick look at the key concepts needed to fully grasp concurrency and threading.

What Is Concurrency?

Concurrency means an app can run multiple tasks or processes independently of each other’s life cycles. This improves the overall performance and responsiveness of an app, since it allows multiple operations to make progress at the same time, whether or not they actually run in parallel.

What are threads?

We’re going to talk about threads in more detail soon, so here’s a quick explanation of what they actually are:

A thread is the context in which operations are executed. We can think of each thread as a small, independent worker who has a single task to execute, and will execute it independently.

We can manually create threads, and use them, with some simple code:

let thread = Thread {
    print("Here we can perform our task")
}

//Nothing runs until we explicitly start the thread
thread.start()

Note: Even though we’ve shown the code above, creating threads manually is not usually recommended. It should only be used for very specific, low-level tasks. Using abstractions like Grand Central Dispatch, which we will explain shortly, is safer, simpler, and generally more advisable.

Classic Concurrency in Swift

We will now look at the older concurrency models in Swift, including Grand Central Dispatch (GCD) and Operation Queues. We could look directly at threads but, as we’ve just explained, they’re not the most recommended method of implementing concurrency, so we’ll put that to one side.

1. Grand Central Dispatch (GCD)

Grand Central Dispatch (GCD) is one of the oldest and most commonly used concurrency tools in iOS and macOS development. GCD is a low-level API that enables developers to manage tasks asynchronously, using dispatch queues. These queues can run tasks either serially (one task at a time) or concurrently (multiple tasks at once).

Queues

One of the key concepts is that of Queues. These are essentially lists of tasks that need to be executed in a specific order. There are many types of queues, so let’s take a look:

Serial queues and concurrent queues

Serial queues guarantee tasks are executed one after the other, in the order they were added. Concurrent queues, meanwhile, can run many tasks simultaneously, as long as the system has resources (such as CPU cores) available to do so.

They are very similar in usage, as shown here:

//Serial:
let serialQueue = DispatchQueue(label: "mySerialQueue")

serialQueue.async {
    print("1 started")
    print("1 finished")
}
serialQueue.async {
    print("2 started")
    print("2 finished")
}

//Concurrent
let concurrentQueue = DispatchQueue(label: "myConcurrentQueue", attributes: .concurrent)

concurrentQueue.async {
    print("3 started")
    print("3 finished")
}
concurrentQueue.async {
    print("4 started")
    print("4 finished")
}

While the serial queue will always produce the output:

1 started
1 finished
2 started
2 finished

It’s likely that the concurrent queue will interleave its output differently, so it could be:

3 started
4 started
3 finished
4 finished

or

3 started
4 started
4 finished
3 finished
Since these tasks are just quick print statements, the concurrent queue will often happen to produce the same neatly ordered output as the serial one. With longer, more complex operations, however, the ordering is very likely to differ.

Global Queues

Global Queues are system-provided concurrent queues that let us choose a Quality of Service (QoS) level, which determines the priority with which their tasks get executed. There are several QoS levels to note:

  • .userInteractive: The highest priority for a Global Queue. This is to be used in tasks that require immediate results, usually triggered by user actions.
  • .userInitiated: This has lower priority than userInteractive but it should also be completed quickly to provide a good user experience.
  • .default: The default priority for global queues. Tasks at this level run after any higher-priority work and before any lower-priority work; in practice it sits in the middle of the QoS hierarchy.
  • .utility: This is usually used for longer-running tasks that the user is not actively tracking, like downloading a higher-resolution asset to use in the app.
  • .background: The lowest priority. Tasks with this priority can run in the background without interfering with anything, even if they take a while to complete.

They are all similar when it comes to usage, as we can see in this example:

DispatchQueue.global(qos: .userInteractive).async {
    print("My userInteractive task")
}

DispatchQueue.global(qos: .userInitiated).async {
    print("My userInitiated task")
}

DispatchQueue.global(qos: .default).async {
    print("My default task")
}

DispatchQueue.global(qos: .utility).async {
    print("My utility task")
}

DispatchQueue.global(qos: .background).async {
    print("My background task")
}

The main Queue

The main Queue is a special case: it’s the only queue we can rely on to update our UI. This means that if we use it for a task that takes a long time to complete, our app’s UI will freeze until that task finishes.

Thus, it is common practice to run more time-consuming tasks on other Queues, and then to get the results back to the main Queue to update our App’s UI:

DispatchQueue.global(qos: .background).async {
    print("My background task that might take a while to complete")

    DispatchQueue.main.async {
        //Update the UI here after completing the work done on the background global queue
    }
}

Important: You should not use the main queue for expensive/slow computing, since that will freeze your UI and impact your users’ experience.

2. Operation Queues

Operation Queues are built on top of GCD, but they offer a higher-level abstraction over it. This makes them simpler to learn and use, while still providing enough tools to control task execution.

They allow us to add operations, which are instances of the Operation class, to a queue, manage dependencies between operations (there’s an example of this further down), and configure the maximum number of concurrent operations we’d like to have.

To get a better idea, let’s look at a simple example of Operation Queues:


let operationQueue = OperationQueue()

let operation1 = BlockOperation {
    // Perform a Task
    print("Task 1 started")
}

let operation2 = BlockOperation {
    // Perform another Task
    print("Task 2 started")
}

operationQueue.addOperations([operation1, operation2], waitUntilFinished: false)

Here, our Operation Queue will manage two BlockOperation objects running concurrently. We can also set them to run sequentially if we configure the Operation Queue that way. There are many important things we can do on our OperationQueues, including:

let operationQueue = OperationQueue()

//Define how many concurrent operations we'd like;
//if we set it to 1, the queue effectively becomes a serial queue
operationQueue.maxConcurrentOperationCount = 1

//Define the QoS of our queue, as we saw earlier for Global Queues
operationQueue.qualityOfService = .background

//Define a name to help us identify the queue
operationQueue.name = "myQueue"

//Block the current thread until all operations in the queue have finished, which is useful in some scenarios
operationQueue.waitUntilAllOperationsAreFinished()

//Cancel all pending operations, for example if their results are no longer needed
operationQueue.cancelAllOperations()
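One feature worth calling out, mentioned above, is dependencies between operations. Here’s a minimal sketch of how addDependency forces one operation to wait for another; the download and parse blocks are just placeholders for illustration:

let queue = OperationQueue()

let downloadOperation = BlockOperation {
    print("Downloading data")
}

let parseOperation = BlockOperation {
    print("Parsing data")
}

//parseOperation will not start until downloadOperation has finished,
//even if the queue allows concurrent execution
parseOperation.addDependency(downloadOperation)

queue.addOperations([downloadOperation, parseOperation], waitUntilFinished: false)

This is handy when one piece of work genuinely depends on another’s result, and it’s something GCD doesn’t give us out of the box.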

Modern Concurrency: Async/Await in Swift

Now that we’ve seen the basics of Classic concurrency in Swift, let’s have a look at the new kid on the block: Async/Await.

This was introduced in Swift 5.5 and drastically improves how we write and handle concurrency. Everything becomes much simpler and easier to use, without losing any capability.

The Basics of Async/Await

We use async to mark a function or method as asynchronous. We then use await at the call site to indicate that execution needs to wait for the async function or method to finish.

It really is that simple. We don’t need blocks or callbacks, like we saw earlier.

Let’s look at how it all works in practice:


func makeWebRequest() async -> String {
    // Simulate a network request by sleeping for two seconds
    try? await Task.sleep(nanoseconds: 2 * 1_000_000_000)
    return "Data is here"
}

Task {
    let data = await makeWebRequest()
    print(data)
}

  • makeWebRequest is marked with async, indicating that it performs an asynchronous operation.
  • The await keyword is used to call makeWebRequest() and wait for its result.
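For contrast, here’s roughly what the same flow looks like with the completion-handler style we no longer need. This is just an illustrative sketch: the function name and the two-second delay mirror the async example above rather than any real API.

//Callback-based equivalent, shown only for comparison
func makeWebRequestWithCallback(completion: @escaping (String) -> Void) {
    DispatchQueue.global(qos: .utility).asyncAfter(deadline: .now() + 2) {
        completion("Data is here")
    }
}

makeWebRequestWithCallback { data in
    //The result arrives inside a closure instead of being returned directly
    print(data)
}

With async/await the result simply comes back as a return value, which is what makes the earlier version so much easier to read and reason about.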

Async Let and Task Groups

Swift 5.5 introduced structured concurrency, a new way of managing asynchronous tasks that makes the lifecycle of tasks more predictable and easier to manage. Structured concurrency ensures that asynchronous tasks are completed or canceled in a controlled manner.

Swift now allows us to create concurrent child tasks using async let, and to manage multiple tasks together with TaskGroup.


func fetchDataFromMultipleSources() async {
    async let data1 = fetchDataFromSource1()
    async let data2 = fetchDataFromSource2()

    let result1 = await data1
    let result2 = await data2

    print("Data1: \\(result1), Data2: \\(result2)")
}
  • async let is used to create two asynchronous tasks that run concurrently.
  • The Tasks are awaited individually, and their results are processed.

Task groups allow you to create multiple asynchronous tasks, aggregate their results, and handle their cancellation together.


func fetchMultipleData() async {
    await withTaskGroup(of: String.self) { group in
        group.addTask {
            return await fetchDataFromSource1()
        }

        group.addTask {
            return await fetchDataFromSource2()
        }

        for await result in group {
            print(result)
        }
    }
}

Here, withTaskGroup manages multiple concurrent tasks, and the for await loop collects their results in a safe, structured manner.
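If we want the group to hand back an aggregated value rather than just printing, withTaskGroup can also return a result from its closure. A small sketch, again assuming the same fetchDataFromSource1/fetchDataFromSource2 helpers from the earlier examples:

func fetchAllData() async -> [String] {
    await withTaskGroup(of: String.self) { group in
        group.addTask { await fetchDataFromSource1() }
        group.addTask { await fetchDataFromSource2() }

        //Collect every child task's result into a single array
        var results: [String] = []
        for await result in group {
            results.append(result)
        }
        return results
    }
}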

Summing up

If you’re going to take three things away from this post, we urge you to remember the following.

Avoid blocking the main thread, regardless of which concurrency model you’re using.

Use async/await when possible, because it is relatively simple and easy to use.

Grand Central Dispatch, once understood, is not hard to use, and we can leverage it for our background tasks.

If you remember these things, you’ll have a blast with Swift Concurrency. Happy coding!


Flávio Silvério

Interested in how things work since I was little, I love to try and figure out simple ways to solve complex issues. In my free time you’ll find me strolling around a park with my family, or trying to learn something new. You can contact me on LinkedIn or GitHub.
