Summary
In this chapter, we covered the basics of concurrency and the essential concepts and terminology you need to know in order to understand the upcoming topics of multithreading and multi-processing.
Specifically, we discussed the following:
- The definitions of concurrency and parallelism – each parallel task needs its own processor unit, while concurrent tasks can share a single processor.
- How concurrent tasks share a single processor unit: a task scheduler manages the processor time and divides it between the tasks, which leads to a number of context switches and a different interleaving of the tasks on each run.
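The scheduling behavior described above can be sketched in a few lines. This is a minimal illustration (Python is assumed here; the names `worker`, `task-1`, and `task-2` are made up for the example): two threads share the processor, and the order in which their steps land in the shared list depends on how the scheduler interleaves them.

```python
import threading

results = []  # records the interleaving the scheduler produced

def worker(name, steps):
    # Each task performs several small steps; the scheduler may
    # switch threads between any two steps, so the order of
    # entries in `results` can differ from run to run.
    for i in range(steps):
        results.append((name, i))

# Two concurrent tasks sharing the same processor time.
t1 = threading.Thread(target=worker, args=("task-1", 3))
t2 = threading.Thread(target=worker, args=("task-2", 3))
t1.start()
t2.start()
t1.join()
t2.join()
```

After both threads finish, `results` always holds six entries, but their relative order is one of many possible interleavings.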
- An introduction to blocking instructions. We also explained the patterns that suggest when we need concurrency, and how a single task can be broken into two or three concurrent tasks.
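The effect of splitting blocking work into concurrent tasks can be shown with a small sketch (Python is assumed; `time.sleep` stands in for a real blocking instruction such as a network read, and `blocking_fetch` is a hypothetical name). Run sequentially, the waits add up; run concurrently, they overlap.

```python
import threading
import time

def blocking_fetch(delay):
    # Stand-in for a blocking instruction (e.g. waiting on I/O):
    # the thread can do nothing else until the wait completes.
    time.sleep(delay)

# Sequential: the total wait is the sum of the two delays.
start = time.perf_counter()
blocking_fetch(0.1)
blocking_fetch(0.1)
sequential = time.perf_counter() - start

# Concurrent: the two waits overlap, so the total is
# roughly the length of the single longest delay.
start = time.perf_counter()
threads = [threading.Thread(target=blocking_fetch, args=(0.1,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
concurrent = time.perf_counter() - start
```

Here `sequential` comes out near 0.2 seconds while `concurrent` stays near 0.1 seconds, because the threads spend their blocking time waiting in parallel rather than one after the other.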
- We described what a shared state is, and showed how a shared state can lead to serious concurrency issues like...
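A classic shared-state problem is the lost update: two tasks perform a read-modify-write on the same variable without coordination. The sketch below (Python is assumed; `counter` and `increment` are illustrative names) deliberately splits the update into separate read and write steps so a context switch can land between them.

```python
import threading

counter = 0  # shared state, accessed by both threads

def increment(n):
    global counter
    for _ in range(n):
        # Read-modify-write is not atomic: the scheduler can
        # switch threads between the read and the write below,
        # so one thread's update may overwrite the other's.
        tmp = counter
        counter = tmp + 1

threads = [threading.Thread(target=increment, args=(100_000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# `counter` may end up below 200_000 when updates are lost.
```

Whether updates are actually lost on a given run depends on where the context switches fall; the point is that the final value is no longer guaranteed, which is exactly the kind of issue that motivates the synchronization tools covered later.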