Parallel computing is a type of computation in which many calculations or
processes are carried out simultaneously. Large problems can often be divided
into smaller ones, which can then be solved at the same time. There are
several forms of parallel computing: bit-level, instruction-level, data,
and task parallelism. Parallelism has long been employed in
high-performance computing, but has gained broader interest due to the physical
constraints preventing frequency scaling. As power consumption (and
consequently heat generation) by computers has become a concern in recent
years, parallel computing has become the dominant paradigm in computer
architecture, mainly in the form of multi-core processors.
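The idea of dividing a large problem into smaller ones that are solved at the same time can be illustrated with a minimal sketch in Python. The names `parallel_sum` and `partial_sum` are hypothetical, and `multiprocessing.Pool` is just one of several ways to distribute work across cores; the point is only to show a data-parallel decomposition:

```python
from multiprocessing import Pool


def partial_sum(chunk):
    # Each worker solves its own smaller problem independently.
    return sum(chunk)


def parallel_sum(data, workers=4):
    # Divide the large problem into roughly equal chunks, one per worker.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Solve the chunks simultaneously, then combine the partial results.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    # The parallel result matches the sequential one.
    print(parallel_sum(list(range(10))))  # prints 45
```

Each chunk is summed in a separate process, so on a multi-core machine the partial sums genuinely run at the same time; the final combination step is sequential, which is why such decompositions speed up the divisible part of a problem but not the part that must run serially.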