After analyzing the concurrency in a problem, perhaps by using the patterns in the Finding Concurrency design space, the next task is to refine the design and move it closer to a program that can execute tasks concurrently by mapping the concurrency onto multiple units of execution (UEs) running on a parallel computer.
Of the countless ways to define an algorithm structure, most follow one of six basic design patterns. These patterns make up the Algorithm Structure design space. The figure shows the patterns in the design space and their relationship to the other design spaces.
The key issue at this stage is to decide which pattern or patterns are most appropriate for the problem. In making this decision, various forces such as simplicity, portability, scalability, and efficiency may pull the design in different directions. The features of the target platform must also be taken into account. There is usually a major organizing principle
implied by the concurrency that helps choose a pattern. This principle usually falls into one of three categories: organization by tasks, organization by data decomposition, or organization by flow of data.
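To make the first two categories concrete, the following minimal sketch (not from the text; the class, interface, and method names are illustrative assumptions) contrasts a design organized by tasks, where a collection of independent work items drives the design and each item is handed to a pool of UEs, with a design organized by data decomposition, where the way a large array is partitioned into chunks drives the design. Organization by flow of data is illustrated by the pipeline sketch after the next paragraph.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative only: "WorkItem" and the method names are assumptions made
// for this sketch, not part of the pattern language itself.
public class OrganizingPrinciples {

    interface WorkItem { void process(); }

    // Organization by tasks: the design is driven by a collection of
    // independent tasks; each task becomes one unit of work for the pool.
    static void byTasks(List<WorkItem> items, int numUEs) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(numUEs);
        List<Future<?>> pending = new ArrayList<>();
        for (WorkItem item : items) {
            pending.add(pool.submit(item::process));
        }
        for (Future<?> f : pending) f.get();   // wait for all tasks to finish
        pool.shutdown();
    }

    // Organization by data decomposition: the design is driven by how a
    // large data structure is split into chunks, roughly one chunk per UE.
    static void byDataDecomposition(double[] data, int numUEs) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(numUEs);
        int chunk = (data.length + numUEs - 1) / numUEs;
        List<Future<?>> pending = new ArrayList<>();
        for (int ue = 0; ue < numUEs; ue++) {
            final int lo = ue * chunk;
            final int hi = Math.min(data.length, lo + chunk);
            pending.add(pool.submit(() -> {
                for (int i = lo; i < hi; i++) {
                    data[i] = data[i] * 2.0;   // stand-in for real per-element work
                }
            }));
        }
        for (Future<?> f : pending) f.get();   // wait for all chunks to finish
        pool.shutdown();
    }
}
```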
The most effective parallel algorithm design may make use of multiple algorithm structures (combined hierarchically, compositionally, or in sequence). For example, the very top level of the design is often a sequential composition of one or more Algorithm Structure patterns. Other designs may be organized hierarchically, with one pattern used to organize the interaction of the major task groups and other patterns used to organize tasks within the groups -- for example, an instance of Pipeline in which individual stages are instances of Task Parallelism.
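As a rough illustration of that last composition, the sketch below (a minimal example; the class name, sentinel value, and stage structure are assumptions made for this sketch) connects two pipeline stages with a blocking queue. The second stage fans each arriving item out to a small thread pool, so tasks within that stage run in parallel while the two stages themselves overlap in pipeline fashion.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative composition: a two-stage pipeline in which the second stage
// is itself organized as task parallelism over a small thread pool.
public class PipelineOfTaskParallelism {

    private static final int END = -1;  // sentinel marking the end of the stream

    public static void main(String[] args) throws Exception {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(16);

        // Stage 1: produce a stream of items (here, just the integers 0..99).
        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 100; i++) queue.put(i);
                queue.put(END);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        // Stage 2: consume items and process each one as an independent task
        // on a pool, i.e. Task Parallelism nested inside a pipeline stage.
        ExecutorService stage2Pool = Executors.newFixedThreadPool(4);
        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    int item = queue.take();
                    if (item == END) break;
                    stage2Pool.submit(() ->
                        System.out.println("processed " + item * item));
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
        stage2Pool.shutdown();
        stage2Pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```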