What I discovered is quite fascinating!
A few weeks ago, I read an article by Wojciech Kulik in which he talks about some of the shortcomings of the Swift Concurrency framework. In one section, Wojciech briefly mentions thread explosion, and how Swift Concurrency can prevent it from happening by capping the number of threads at the number of CPU cores so that the system never overcommits.
This made me wonder… is it really so? How does it work behind the scenes? Can we trick the system into creating more threads than there are CPU cores?
We are going to answer all these questions in this article. So without further ado, let’s dive straight in.
So, what is thread explosion? Thread explosion is a situation where a system has a large number of threads running simultaneously, which eventually causes performance problems and memory overhead.
There is no clear-cut answer to how many threads are considered too many. As a general benchmark, we can refer to the example given in the WWDC videos, whereby a system running 16 times more threads than it has CPU cores is said to be undergoing thread explosion.
Since Grand Central Dispatch (GCD) does not have a built-in mechanism that prevents thread explosion, it is very easy to trigger one using a dispatch queue. Consider the following code:
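The original snippet is not reproduced here, so below is a minimal sketch of what such code could look like. The `HeavyWork` helper, its `dispatchGlobal(seconds:)` function, the use of `DispatchQueue.global()`, and the 3-second duration are my assumptions; only the count of 150 comes from the description that follows.

```swift
import Foundation

// Hypothetical helper that simulates a long-running, blocking piece of work.
final class HeavyWork {

    /// Dispatches a work item to the global concurrent queue and blocks
    /// the underlying thread for the given number of seconds.
    static func dispatchGlobal(seconds: UInt32) {
        DispatchQueue.global().async {
            sleep(seconds) // keeps the thread occupied, so GCD keeps spawning new ones
        }
    }
}

// Fire off 150 blocking work items at once.
// GCD imposes no per-core limit here, so the thread count explodes.
for _ in 1...150 {
    HeavyWork.dispatchGlobal(seconds: 3)
}
```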
Once executed, the above code will spawn a total of 150 threads, causing a thread explosion. This can be verified by pausing the execution and checking the Debug Navigator.
Now that you have learned how to trigger a thread explosion, let’s try to execute the same code using Swift Concurrency and see what happens.
As we all know, there are 3 levels of task priority in Swift Concurrency, namely userInitiated, utility, and background, where userInitiated has the highest priority, followed by utility, and background has the lowest priority. So let’s go ahead and update our HeavyWork class accordingly:
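The updated class is not shown in this excerpt, so here is a rough sketch of how HeavyWork could be adapted for Swift Concurrency, with one helper per priority level. The function names and the blocking `sleep` are assumptions; the print inside each task logs the time at which the task actually starts running on a thread, which is what the logs below treat as the creation time.

```swift
import Foundation

final class HeavyWork {

    static func runUserInitiatedTask(seconds: UInt32) {
        Task(priority: .userInitiated) {
            print("userInitiated started: \(Date())")
            sleep(seconds) // simulate blocking, CPU-bound work
        }
    }

    static func runUtilityTask(seconds: UInt32) {
        Task(priority: .utility) {
            print("utility started: \(Date())")
            sleep(seconds)
        }
    }

    static func runBackgroundTask(seconds: UInt32) {
        Task(priority: .background) {
            print("background started: \(Date())")
            sleep(seconds)
        }
    }
}
```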
Every time a task is created, we will print out the creation time. We can then use it to see what is happening behind the scenes.
With the updated HeavyWork class in place, let’s start with the first test.
Test 1: Creating tasks with the same priority level
This test is basically the same as the dispatch queue example we saw earlier, but instead of using GCD, we’ll use Task from Swift Concurrency to spawn the threads.
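A sketch of what this test could look like, reusing the assumed helpers above (the count of 150 mirrors the earlier GCD example):

```swift
// Spawn 150 tasks, all at the same userInitiated priority.
for _ in 1...150 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}
```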
The following logs were captured from the Xcode console.
As you can see (from the task creation times), thread creation stopped once the thread count reached 6, which perfectly matches the number of CPU cores of my 6-core iPhone 12. New tasks are only started after an ongoing task has completed its execution. As a result, there can be a maximum of 6 threads running simultaneously at any time.
Note:
The iOS simulator always limits the maximum thread count to 1, regardless of the device selected. So make sure to run the above test on a real device for a more accurate result.
For a clearer picture of what’s really happening behind the scenes, let’s pause the execution.
It looks like everything we just saw is controlled by a concurrent queue named “com.apple.root.user-initiated-qos.cooperative”.
Based on the above observation, it is safe to say that this is how Swift Concurrency prevents thread explosion from happening: it keeps a dedicated cooperative queue that limits the maximum number of threads so that it does not exceed the number of CPU cores.
Test 2: Creating all tasks at once from high to low priority level
Now, let’s go a little deeper by adding tasks with different priorities to the test.
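A sketch of this test, again using the assumed helpers (the batch size of 50 per priority level is an assumption):

```swift
// Highest priority first…
for _ in 1...50 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}

// …then utility…
for _ in 1...50 {
    HeavyWork.runUtilityTask(seconds: 3)
}

// …and background last.
for _ in 1...50 {
    HeavyWork.runBackgroundTask(seconds: 3)
}
```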
Note that we are creating the highest priority (userInitiated) tasks first, followed by utility and background. Based on our previous observation, I was expecting to see 3 queues with 6 threads running simultaneously in each queue, meaning a total of 18 threads would be spawned. Surprisingly, that is not the case. Take a look at the following screenshot:
As you can see, both the utility and background queues limit their maximum thread count to 1 once the higher priority (userInitiated) queue is saturated. In other words, we can have at most 8 concurrent threads in this test.
This is such an interesting find! Saturating the high-priority queue will somehow prevent other low-priority queues from generating more threads.
But what if we reverse the order of the priority levels? Let’s find out!
Test 3: Creating all tasks at once from low to high priority level
First, let’s update the execution code:
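A sketch of the reversed version, assuming the same helpers and batch size as before:

```swift
// Lowest priority first…
for _ in 1...50 {
    HeavyWork.runBackgroundTask(seconds: 3)
}

// …then utility…
for _ in 1...50 {
    HeavyWork.runUtilityTask(seconds: 3)
}

// …and userInitiated last.
for _ in 1...50 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}
```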
Here is the result:
The result we get is exactly the same as “Test 2”.
It seems that the system is smart enough to give way to the higher priority tasks and run them first, even though we started the lowest priority tasks first. Furthermore, the system still prevents us from creating more than 8 concurrent threads, so we are still unable to cause a thread explosion in this test. Good job, Apple!
Test 4: Creating tasks from low to high priority level with breaks in between
In real-life situations, it is very unlikely that we would start a bunch of tasks with different priority levels all at once. So let’s make the situation more realistic by adding a short break between each for loop. Note that we are still using the low-to-high order in this test.
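Here is a sketch of how those breaks could be added. The 2-second pause is an assumption; any duration long enough to let the lower-priority queues spin up their threads should do.

```swift
// Lowest priority first, then pause so its queue gets a head start.
for _ in 1...50 {
    HeavyWork.runBackgroundTask(seconds: 3)
}
sleep(2)

for _ in 1...50 {
    HeavyWork.runUtilityTask(seconds: 3)
}
sleep(2)

for _ in 1...50 {
    HeavyWork.runUserInitiatedTask(seconds: 3)
}
```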
The result we get is quite interesting.
As you can see, after the second break, all 3 queues are running multiple threads. It seems that if we start a low-priority queue first and let it run for a while, the high-priority queues will not suppress the thread creation of the low-priority queues.
I have executed this test twice; the maximum number of threads varies slightly between runs, but it works out to roughly 3 times the number of CPU cores.
Is this considered thread explosion?
I don’t think so, because 3 times more threads than CPU cores is still well below the 16-times threshold I mentioned earlier. In fact, I think Apple intentionally allows this in order to strike a better balance between execution performance and multi-threading overhead. Hit me up on Twitter if you have other perspectives; I’d really like to hear your thoughts.