Grand Central Dispatch (GCD, Dispatch, or libdispatch) consists of language features, runtime libraries, and system enhancements that provide comprehensive support for concurrent code execution on Apple platforms (iOS, macOS, etc.) that run on multi-core hardware.
GCD is a low-level API that operates at the system level and is built on top of threads. Under the hood, it manages a shared thread pool. With GCD, you add blocks of code or work items to dispatch queues, and GCD decides which thread to execute them on.
GCD can help improve your app’s responsiveness by deferring computationally expensive tasks to the background. It also accommodates the needs of all running applications, matching them to the available system resources in a balanced fashion.
The decision of when to start a task is entirely up to GCD. If the execution time of one task overlaps with another, it’s up to GCD to determine if it should run on a different core — if one is available — or instead perform a context switch to run a different task.
GCD operates on dispatch queues through a class aptly named DispatchQueue. You submit units of work to a queue, and GCD executes them in FIFO order (first in, first out), guaranteeing that the first task submitted is the first one started.
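Here’s a minimal sketch of submitting two closures to a queue and letting GCD pick the thread (the queue label and printed messages are placeholders):

import Dispatch

let queue = DispatchQueue(label: "com.example.worker") // serial by default

queue.async {
  print("First task")  // starts first, because submission order is FIFO
}
queue.async {
  print("Second task") // on a serial queue, starts only after the first task finishes
}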
Dispatch queues are thread-safe, meaning you can simultaneously access them from multiple threads. Queues can be either serial or concurrent.
Serial queues guarantee that only one task runs at any given time. GCD controls the execution timing. You won’t know the amount of time between one task ending and the next one beginning.
Concurrent queues allow multiple tasks to run at the same time. The queue guarantees tasks start in the order you add them. Tasks can finish in any order, and you have no knowledge of the time it will take for the next task to start, nor the number of tasks running at any given time.
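To illustrate the difference, here’s a short sketch (the queue labels and messages are placeholders, and the exact interleaving depends on the system):

let serialQueue = DispatchQueue(label: "com.example.serial")
let concurrentQueue = DispatchQueue(
  label: "com.example.concurrent",
  attributes: .concurrent)

for i in 1...3 {
  serialQueue.async { print("Serial task \(i)") }        // always runs one at a time, in order
}
for i in 1...3 {
  concurrentQueue.async { print("Concurrent task \(i)") } // starts in order, but tasks may overlap and finish in any order
}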
The main queue runs on the main thread and is a serial queue.
Global queues are concurrent queues shared by the whole system. Four such queues exist, each with a different priority: high, default, low and background. The background priority queue has the lowest priority, and its I/O activity is throttled to minimize negative system impact.
Custom queues are queues you create that can be serial or concurrent. Requests in these queues end up in one of the global queues.
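All three kinds of queues are reached through the same API. A brief sketch (the custom label is a placeholder):

DispatchQueue.main.async {
  // the serial main queue: UI updates go here
}
DispatchQueue.global(qos: .utility).async {
  // a shared concurrent global queue: background work goes here
}
let customQueue = DispatchQueue(label: "com.example.custom") // a custom serial queue you create
customQueue.async {
  // serialized work goes here
}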
User interactive — This represents tasks that must complete immediately to provide a nice user experience. Use it for UI updates, event handling and small workloads that require low latency. The total amount of work done in this class during the execution of your app should be small. This should run on the main thread.
User initiated — The user initiates these asynchronous tasks from the UI. Use them when the user is waiting for immediate results and for tasks required to continue user interaction. They execute in the high-priority global queue.
Utility — This represents long-running tasks, typically with a user-visible progress indicator. Use it for computations, I/O, networking, continuous data feeds and similar tasks. This class is designed to be energy efficient. This gets mapped into the low-priority global queue.
Background — This represents tasks the user isn’t directly aware of. Use it for prefetching, maintenance and other tasks that don’t require user interaction and aren’t time-sensitive. This gets mapped into the background priority global queue.
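In code, you request a quality of service either when asking for a global queue or when creating your own queue. A short sketch (the label is a placeholder):

// Ask for the global queue that backs .userInitiated work.
DispatchQueue.global(qos: .userInitiated).async {
  // work the user is actively waiting on
}

// Or bake a QoS into a custom queue at creation time.
let utilityQueue = DispatchQueue(label: "com.example.utility", qos: .utility)
utilityQueue.async {
  // long-running, energy-efficient work
}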
With GCD, you can dispatch a task either synchronously or asynchronously.
A synchronous function returns control to the caller after the task completes. You can schedule a unit of work synchronously by calling DispatchQueue.sync(execute:).
An asynchronous function returns immediately, ordering the task to start but not waiting for it to complete. Thus, an asynchronous function doesn’t block the current thread of execution from proceeding to the next function. You can schedule a unit of work asynchronously by calling DispatchQueue.async(execute:).
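The difference shows up in what the calling thread does while the work runs. A minimal sketch (the queue label and messages are placeholders):

let queue = DispatchQueue(label: "com.example.demo")

queue.sync {
  print("sync work")   // the caller waits here until this closure finishes
}
print("after sync")    // always prints after "sync work"

queue.async {
  print("async work")  // the caller does not wait for this
}
print("after async")   // may print before or after "async work"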
In general, you want to use async when you need to perform a network-based or CPU-intensive task in the background without blocking the current thread.
Here’s a quick guide on how and when to use the various queues with async:

Main queue: A common choice to update the UI after completing work in a task on a concurrent queue.
Global queue: A common choice to perform non-UI work in the background.
Custom serial queue: A good choice when you want to perform background work serially and track it. Because only one task executes at a time, this avoids resource contention and race conditions.

Here’s a quick overview of when and where to use sync:

Main queue: Be very careful here. Calling sync targeting the main queue from the main thread will deadlock and freeze your app.
Global queue: A good candidate for synchronizing work through dispatch barriers, or when waiting for a small task to complete so you can perform further processing.
Custom serial queue: Be very careful here, too. If you’re running on a queue and call sync targeting that same queue, you’ll create a deadlock.
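For instance, the first async case above, updating the UI after background work, typically looks like the following sketch (loadAndFilterImage and imageView are hypothetical placeholders):

DispatchQueue.global(qos: .userInitiated).async {
  let image = loadAndFilterImage()  // hypothetical expensive work, off the main thread
  DispatchQueue.main.async {
    imageView.image = image         // hypothetical UI update, back on the main queue
  }
}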
You’ve heard about tasks quite a bit by now. For the purposes of this tutorial, you can consider a task to be a closure. Closures are self-contained, callable blocks of code you can store and pass around.
Each task you submit to a DispatchQueue is a DispatchWorkItem. You can configure the behavior of a DispatchWorkItem, such as its QoS class or whether to spawn a new detached thread.
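A brief sketch of wrapping work in a DispatchWorkItem (the queue label and messages are placeholders):

let workItem = DispatchWorkItem(qos: .userInitiated) {
  print("running work item")
}
let queue = DispatchQueue(label: "com.example.workItems")
queue.async(execute: workItem)
workItem.notify(queue: .main) {
  print("work item finished") // runs on the main queue after the item completes
}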
DispatchQueue allows you to delay task execution with asyncAfter(). Don’t use this to solve race conditions or other timing bugs through hacks like introducing delays. Instead, use this when you want a task to run at a specific time.
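For example, a minimal sketch that runs a closure roughly two seconds from now on the main queue (the delay and message are placeholders):

DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
  print("Runs about two seconds later on the main queue")
}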
GCD provides an elegant solution for creating a read/write lock using dispatch barriers. Dispatch barriers are a group of functions acting as a serial-style bottleneck when working with concurrent queues.
When you submit a DispatchWorkItem to a dispatch queue, you can set flags to indicate that it should be the only item executing on the specified queue at that time. This means all items submitted to the queue before the dispatch barrier must complete before the work item executes.
When the work item’s turn arrives, the barrier executes it and ensures the queue doesn’t run any other tasks during that time. Once it finishes, the queue returns to its default behavior.
A dispatch barrier allows us to create a synchronization point within a concurrent dispatch queue. In normal operation, the queue acts just like a normal concurrent queue. But when the barrier is executing, it acts as a serial queue. After the barrier finishes, the queue goes back to being a normal concurrent queue.
private let concurrentPhotoQueue = DispatchQueue(
  label: "com.raywenderlich.GooglyPuff.photoQueue",
  attributes: .concurrent)

func addPhoto(_ photo: Photo) {
  concurrentPhotoQueue.async(flags: .barrier) { [weak self] in
    guard let self = self else {
      return
    }
    self.unsafePhotos.append(photo)
    DispatchQueue.main.async { [weak self] in
      self?.postContentAddedNotification()
    }
  }
}
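The barrier covers the write side; reads can go through the same concurrent queue with sync, so they run in parallel with each other but never overlap a barrier write. A sketch of how the matching read might look, assuming the unsafePhotos array and Photo type from the snippet above:

var photos: [Photo] {
  var photosCopy: [Photo]!
  // sync on the concurrent queue: reads can run concurrently,
  // but each waits for any in-flight barrier write to finish.
  concurrentPhotoQueue.sync {
    photosCopy = self.unsafePhotos
  }
  return photosCopy
}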
With dispatch groups, you can group together multiple tasks. Then, you can either wait for them to complete or receive a notification once they finish. Tasks can be asynchronous or synchronous and can even run on different queues.
DispatchGroup manages dispatch groups. You’ll first look at its wait method. This blocks your current thread until all the group’s enqueued tasks finish.
DispatchQueue.global(qos: .userInitiated).async {
  var storedError: NSError?
  let downloadGroup = DispatchGroup()
  for address in [
    PhotoURLString.overlyAttachedGirlfriend,
    PhotoURLString.successKid,
    PhotoURLString.lotsOfFaces
  ] {
    guard let url = URL(string: address) else { return }
    downloadGroup.enter()
    let photo = DownloadPhoto(url: url) { _, error in
      storedError = error
      downloadGroup.leave()
    }
    PhotoManager.shared.addPhoto(photo)
  }
  downloadGroup.wait()
  DispatchQueue.main.async {
    completion?(storedError)
  }
}
Dispatch groups are a good candidate for all types of queues. Be wary, though, of using dispatch groups on the main queue when you’re waiting synchronously for all work to complete, since you don’t want to hold up the main thread.
Dispatching asynchronously to another queue, then blocking that thread with wait, is clumsy. Fortunately, there’s a better way: DispatchGroup can instead notify you when all the group’s tasks are complete.
var storedError: NSError?
let downloadGroup = DispatchGroup()
for address in [
  PhotoURLString.overlyAttachedGirlfriend,
  PhotoURLString.successKid,
  PhotoURLString.lotsOfFaces
] {
  guard let url = URL(string: address) else { return }
  downloadGroup.enter()
  let photo = DownloadPhoto(url: url) { _, error in
    storedError = error
    downloadGroup.leave()
  }
  PhotoManager.shared.addPhoto(photo)
}
downloadGroup.notify(queue: DispatchQueue.main) {
  completion?(storedError)
}
You can only cancel a DispatchWorkItem before it reaches the head of a queue and starts executing.
var storedError: NSError?
let downloadGroup = DispatchGroup()
var addresses = [
  PhotoURLString.overlyAttachedGirlfriend,
  PhotoURLString.successKid,
  PhotoURLString.lotsOfFaces
]
addresses += addresses + addresses
var blocks: [DispatchWorkItem] = []

for index in 0..<addresses.count {
  downloadGroup.enter()
  let block = DispatchWorkItem(flags: .inheritQoS) {
    let address = addresses[index]
    guard let url = URL(string: address) else {
      downloadGroup.leave()
      return
    }
    let photo = DownloadPhoto(url: url) { _, error in
      storedError = error
      downloadGroup.leave()
    }
    PhotoManager.shared.addPhoto(photo)
  }
  blocks.append(block)
  DispatchQueue.main.async(execute: block)
}

for block in blocks[3..<blocks.count] {
  let cancel = Bool.random()
  if cancel {
    block.cancel()
    downloadGroup.leave()
  }
}

downloadGroup.notify(queue: DispatchQueue.main) {
  completion?(storedError)
}
Dispatch sources are a C-based mechanism for processing specific types of system events asynchronously. A dispatch source encapsulates information about a particular type of system event and submits a specific block object or function to a dispatch queue whenever that event occurs. You can use dispatch sources to monitor the following types of system events: