
Concurrency vs. Parallelism

 

Understanding Concurrency

Meaning

Doing two or more different tasks: making progress on all of them during overlapping time periods, but not necessarily at the same instant (the tasks take turns).

Analogy

Consider three well-diggers, each assigned the task of digging their own 100 ft deep well. They have only one well-drilling tool, which they must share.

In this analogy, the work of all the well-diggers is in progression. It is slow because they need to share the tool and take turns. Here, the drilling tool represents the hardware resources.

 

Understanding Parallelism

Meaning

Doing two or more different tasks: executing all of them literally at the same instant, each on its own hardware resource.

Analogy

Consider three well-diggers, each assigned the task of digging their own 100 ft deep well, and each with their own well-drilling tool.

In this analogy, the work of all the well-diggers is in progression. It is fast because they don't need to take turns and can perform the task in parallel. Here, the drilling tool represents the hardware resources.

 

Understanding Singularism

Meaning

Doing two or more different tasks: strictly one after another; a task must be finished completely before the next one begins.

Analogy

Consider three well-diggers, each assigned the task of digging their own 100 ft deep well, and each with their own well-drilling tool.

In this analogy, the work of all the well-diggers is NOT in progression: only one digger works at a time, and must finish their well completely before the next one starts. The number of drilling tools provided does not affect the result.

 

Concurrency vs. Parallelism vs. Singularism

Time taken (performance) to complete a given task, shortest to longest: Parallelism < Concurrency < Singularism.

Concurrency doesn't give speed, but it does give progression.

Parallelism gives speed as well as progression, but it demands more hardware resources.

Today's computer systems, in general, support both concurrency and parallelism.

 

Analogy to Multi-threading

                    | Well Digging Analogy        | Threading Environment
Workers             | Well diggers                | Threads
Resources           | Digging tool                | CPU, memory, etc.
Transition          | One well digger to another  | Context switching
Work to accomplish  | 100 ft well                 | Work to be done by each thread

 

Multi-threading Environment

Concurrent execution of threads: the threads take turns on the same CPU, interleaved through context switching.

Parallel execution of threads: the threads run at the same instant, each on its own CPU.

Singular execution of threads: one thread runs to completion before the next one starts.
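As a minimal sketch of a multi-threading environment, the following C program (function names are illustrative) starts two POSIX threads that each do independent work. Whether the OS runs them concurrently on one CPU or in parallel on two is the scheduler's decision, not the program's:

```c
#include <pthread.h>

/* Two "well-diggers": each thread makes progress on its own counter.
 * The scheduler decides whether they run concurrently (time-sliced on
 * one CPU) or in parallel (on different CPUs). */

static void *count_up(void *arg) {
    long *total = arg;
    for (int i = 0; i < 100000; i++)
        (*total)++;              /* each thread writes only its own counter */
    return NULL;
}

long run_two_diggers(void) {
    pthread_t t1, t2;
    long a = 0, b = 0;
    pthread_create(&t1, NULL, count_up, &a);
    pthread_create(&t2, NULL, count_up, &b);
    pthread_join(t1, NULL);      /* wait for both wells to be dug */
    pthread_join(t2, NULL);
    return a + b;                /* 200000: both tasks completed */
}
```

Because the two counters are separate variables, the work is non-overlapping and no synchronization is needed; compile with `-pthread`.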

 

Why Do We Need Concurrency?

Why is concurrency still used in many modern computer systems when it is relatively poor in performance?

Let's consider a scenario where hundreds of threads are running on your system at a time, but only a finite number of CPUs is available. Then all live threads have to share the CPUs.

 

[Figure: parallelism and concurrency]

 

Having either parallelism alone or concurrency alone will incur poor performance. Think why!
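A program can ask how many CPUs it actually has; when live threads outnumber this figure, the surplus threads must share CPUs through context switching. A small sketch, assuming the common `_SC_NPROCESSORS_ONLN` extension to `sysconf()` (available on Linux and most Unix systems):

```c
#include <unistd.h>

/* Query the number of CPUs currently online. If the system runs more
 * live threads than this, the excess must share CPUs (concurrency).
 * _SC_NPROCESSORS_ONLN is a widely supported POSIX extension, not part
 * of the base standard. */
long online_cpus(void) {
    long n = sysconf(_SC_NPROCESSORS_ONLN);
    return (n > 0) ? n : 1;   /* fall back to 1 if the query fails */
}
```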

 

Concurrent Process Design

We need concurrency because some process designs demand it, as the following two cases show:

Concurrent Process Design - Case 1

When a process needs to wait for an I/O operation while continuing to make progress on the rest of its task, multi-threading is necessary.

 

[Figure: concurrent process design - Case 1]
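Case 1 can be sketched as follows (the pipe stands in for any blocking I/O source; all names are illustrative): one thread blocks on a read while the main thread keeps computing.

```c
#include <pthread.h>
#include <unistd.h>

/* Case 1 sketch: the worker thread blocks on I/O while the main thread
 * continues CPU work. Without a second thread, the blocking read()
 * would stall the whole process. */

static void *io_worker(void *arg) {
    int fd = *(int *)arg;
    char buf[1];
    read(fd, buf, 1);                 /* blocks until data arrives */
    return (void *)(long)buf[0];
}

int overlap_io_and_compute(void) {
    int fds[2];
    pipe(fds);                        /* fds[0]: read end, fds[1]: write end */
    pthread_t t;
    pthread_create(&t, NULL, io_worker, &fds[0]);

    long sum = 0;
    for (int i = 1; i <= 1000; i++)   /* CPU work proceeds while the   */
        sum += i;                     /* worker is blocked on read()   */

    write(fds[1], "x", 1);            /* "I/O completes"; worker unblocks */
    void *ret;
    pthread_join(t, &ret);
    close(fds[0]);
    close(fds[1]);
    return (char)(long)ret == 'x' && sum == 500500;
}
```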

 

Concurrent Process Design - Case 2

When the given work can be split into smaller independent chunks, which can then be processed by worker threads.

 

[Figure: concurrent process design - Case 2]
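Case 2 can be sketched as a parallel sum: the range 1..n is split into independent chunks, each handed to a worker thread (the chunk count and names are illustrative):

```c
#include <pthread.h>

/* Case 2 sketch: split summing 1..n into independent chunks, one
 * worker thread per chunk, then combine the partial results. */

#define NWORKERS 4

struct chunk { long lo, hi, sum; };

static void *sum_chunk(void *arg) {
    struct chunk *c = arg;
    c->sum = 0;
    for (long i = c->lo; i <= c->hi; i++)
        c->sum += i;                  /* each thread touches only its chunk */
    return NULL;
}

long parallel_sum(long n) {
    pthread_t tid[NWORKERS];
    struct chunk c[NWORKERS];
    long per = n / NWORKERS;
    for (int w = 0; w < NWORKERS; w++) {
        c[w].lo = w * per + 1;
        c[w].hi = (w == NWORKERS - 1) ? n : (w + 1) * per;
        pthread_create(&tid[w], NULL, sum_chunk, &c[w]);
    }
    long total = 0;
    for (int w = 0; w < NWORKERS; w++) {
        pthread_join(tid[w], NULL);
        total += c[w].sum;            /* combine independent partial sums */
    }
    return total;
}
```

Because the chunks are non-overlapping, the workers need no synchronization; the only coordination point is the `pthread_join` before combining results.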

 

 

Why Are Threads Called Light-Weight Processes?

Threads are called light-weight processes because: all threads of a process share their parent process's address space (code, data, heap, open files), so creating a thread and context-switching between threads is much cheaper than creating or switching whole processes.
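The shared address space is easy to demonstrate: a write by one thread to a global variable is immediately visible to the main thread, with no copying between them (a minimal sketch, illustrative names):

```c
#include <pthread.h>

/* Threads share their process's address space: a global written by one
 * thread is the very same memory the main thread reads. Nothing is
 * copied, which is part of what makes threads "light-weight". */

static int shared_flag = 0;

static void *set_flag(void *arg) {
    (void)arg;
    shared_flag = 42;     /* same memory the main thread sees */
    return NULL;
}

int shared_address_space_demo(void) {
    pthread_t t;
    pthread_create(&t, NULL, set_flag, NULL);
    pthread_join(t, NULL);
    return shared_flag;   /* the thread's write is visible here */
}
```

Contrast this with `fork()`, where the child gets its own copy of the address space and a write to a global would stay invisible to the parent.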

 

Overlapping vs. Non-Overlapping Work

If the threads in a process access the same shared data (e.g., a global variable), then the work done by those threads is called overlapping work.

Threads within a process may or may not need synchronization, depending on whether the work they are doing is overlapping or not. So, don't just blindly go for synchronization without analyzing the work the threads are supposed to do.
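A sketch of overlapping work: two threads increment the same global counter, so a mutex is required to serialize the overlapping accesses (without it, concurrent `++` operations could lose updates):

```c
#include <pthread.h>

/* Overlapping work: both threads update the SAME counter, so the
 * critical section must be protected by a mutex. */

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static long shared_counter = 0;

static void *add_many(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);    /* serialize the overlapping access */
        shared_counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

long overlapping_demo(void) {
    shared_counter = 0;
    pthread_t t1, t2;
    pthread_create(&t1, NULL, add_many, NULL);
    pthread_create(&t2, NULL, add_many, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return shared_counter;            /* always 200000 with the mutex */
}
```

Remove the lock/unlock pair and the result becomes nondeterministic, which is exactly the analysis step the paragraph above asks for: synchronize only when the work actually overlaps.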

 

Summary

When multiple threads (of the same or different processes) share the same CPU, we get concurrency, whether their work is overlapping or non-overlapping.

When multiple threads (of the same or different processes) run on different CPUs with non-overlapping work, we get parallelism.

 

 

References

Sagar, A. (2022). Part A - Multithreading & Thread Synchronization - Pthreads [Video file]. Retrieved from https://www.udemy.com/course/multithreading_parta/