Concurrency and Parallelism
Concurrency is the ability to run independent units of work using the same resources (CPU, disk, memory) at the same time. The simplest example of this is an operating system. Modern operating systems allow many users to run jobs simultaneously. This time-sharing of machine resources is transparent to the users, making it appear to each user as if they had the entire machine to themselves. This takes advantage of the likelihood that these simultaneous jobs will have different resource requirements. If users took turns monopolizing the machine, any unused resources would be wasted. By overlapping jobs with differing needs, resources can be used efficiently to increase overall job throughput.
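To make this concrete, the following is a minimal sketch in Go (the language, the simulateJob helper, and the job names are illustrative choices, not anything prescribed by this text). Several jobs with different waiting times run concurrently, so the total elapsed time is close to the longest individual job rather than the sum of all of them.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// simulateJob stands in for a job that spends most of its time waiting
// on a slower resource such as disk or network I/O.
func simulateJob(name string, ioWait time.Duration, wg *sync.WaitGroup) {
	defer wg.Done()
	// While this job sleeps (waiting on "I/O"), the scheduler can run
	// other jobs on the same CPU, so the resource is not left idle.
	time.Sleep(ioWait)
	fmt.Printf("%s finished after waiting %v\n", name, ioWait)
}

func main() {
	start := time.Now()
	var wg sync.WaitGroup

	// Three jobs with different resource needs run concurrently.
	// Elapsed time is roughly the longest single wait (~300ms),
	// not the sum of all three (~600ms).
	jobs := map[string]time.Duration{
		"report": 300 * time.Millisecond,
		"backup": 200 * time.Millisecond,
		"query":  100 * time.Millisecond,
	}
	for name, wait := range jobs {
		wg.Add(1)
		go simulateJob(name, wait, &wg)
	}
	wg.Wait()

	fmt.Printf("all jobs done in %v\n", time.Since(start))
}
```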
Parallelism is the ability to break a unit of work into smaller sub-tasks, which can then be executed concurrently. By running concurrently, an overall solution can be reached more quickly than without parallelism. There are a number of different forms of parallelism. Few problems exhibit only one of these forms; most combine several of them, as in the sketch below.
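The sketch below (again in Go; parallelSum and its per-CPU chunking scheme are hypothetical, not a prescribed algorithm) illustrates the idea: one unit of work, summing a slice of numbers, is broken into sub-tasks that run concurrently on the available CPUs and whose partial results are then combined.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelSum splits the input into one chunk per available CPU, sums the
// chunks concurrently, and combines the partial results at the end.
func parallelSum(nums []int) int {
	if len(nums) == 0 {
		return 0
	}
	workers := runtime.NumCPU()
	if workers > len(nums) {
		workers = len(nums)
	}
	partial := make([]int, workers)
	chunk := (len(nums) + workers - 1) / workers

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		start := w * chunk
		end := start + chunk
		if end > len(nums) {
			end = len(nums)
		}
		wg.Add(1)
		go func(w, start, end int) {
			defer wg.Done()
			for _, n := range nums[start:end] {
				partial[w] += n // each worker writes only its own slot
			}
		}(w, start, end)
	}
	wg.Wait()

	total := 0
	for _, p := range partial {
		total += p
	}
	return total
}

func main() {
	nums := make([]int, 1_000_000)
	for i := range nums {
		nums[i] = i + 1
	}
	fmt.Println(parallelSum(nums)) // 500000500000
}
```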