Parallel Computing Concepts
Summary
This concept module, intended for persons with a modest background in CS, introduces a core set of parallel computing concepts that CS students should know in preparation for the era of manycore computing. Topics include categories of parallelism (pipelining, data parallelism, task parallelism), parallel speedup and Amdahl's Law, an elementary presentation of concurrency issues (definition and examples of deadlock, identifying race conditions, avoiding races using atomic actions), and selected concurrent programming strategies (message passing, Java-like per-object synchronization). The module also provides application notes to motivate and lay the groundwork for subsequent modules, including basic concepts of data-intensive scalable computing (DISC) and applications of DISC in industry.
Optional short programming exercises will illustrate selected concepts. For example, a program with a predictable and observable race condition may be provided, available for experimental modification; a Java-like synchronized feature could then be applied to prevent those race conditions.
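The following is a minimal sketch of the kind of exercise described above, written in Java and not part of the module's provided materials. Two threads repeatedly increment a shared counter; because the increment is a read-modify-write sequence, the unsynchronized version usually loses updates, while Java's per-object synchronization removes the race.

    // Hypothetical illustration: a predictable, observable race condition.
    public class RaceDemo {
        private long count = 0;

        // Unsynchronized increment: the read-modify-write of "count++"
        // can interleave between threads, losing updates.
        void incrementUnsafe() {
            count++;
        }

        // The per-object lock makes the increment atomic with respect
        // to other synchronized methods on the same object.
        synchronized void incrementSafe() {
            count++;
        }

        public static void main(String[] args) throws InterruptedException {
            RaceDemo demo = new RaceDemo();
            Runnable work = () -> {
                for (int i = 0; i < 1_000_000; i++) {
                    demo.incrementUnsafe();   // change to incrementSafe() to remove the race
                }
            };
            Thread t1 = new Thread(work);
            Thread t2 = new Thread(work);
            t1.start();
            t2.start();
            t1.join();
            t2.join();
            // Expected 2,000,000; the unsafe version typically prints less.
            System.out.println("count = " + demo.count);
        }
    }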
Module Characteristics
Languages Supported: Any
Relevant Parallel Computing Concepts: Data Parallelism, Task Parallelism, Message Passing, Shared Memory, Distributed
Recommended Teaching Level: Intermediate, Advanced
Possible Course Use: Hardware Design, Software Design, Algorithm Design, Parallel Computing Systems, Programming Languages
Learning Goals
- Given a description of a problem to solve, students should be able to determine whether it calls for a data-parallel or a task-parallel solution.
- Given a description of a parallelizable algorithm, students should be able to apply Amdahl's Law to determine the maximum speedup of that algorithm using N processors (see the sketch after this list).
- Given a language-specific library and an associated back-end platform, students will be able to implement at least one simple algorithm using data parallelism and at least one using task parallelism.
- Students will be able to distinguish DISC from other forms of parallel computation, describe the operation of the map-reduce strategy for DISC in general terms, and list significant applications of DISC in industry.
- Given an example of parallel code, students will be able to identify race conditions and describe how to remove them using a synchronization primitive.
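As an illustration of the Amdahl's Law goal above, the short Java sketch below (not part of the module's materials) evaluates the standard bound speedup(N) = 1 / ((1 - P) + P/N), where P is the fraction of the work that can be parallelized; the assumed value P = 0.90 is chosen only for the example.

    // Illustrative sketch of Amdahl's Law: maximum speedup on N processors.
    public class AmdahlsLaw {

        // speedup(N) = 1 / ((1 - p) + p / n), where p is the parallel fraction.
        static double speedup(double p, int n) {
            return 1.0 / ((1.0 - p) + p / n);
        }

        public static void main(String[] args) {
            double p = 0.90;  // assume 90% of the work is parallelizable
            for (int n : new int[] {1, 2, 4, 8, 16, 1024}) {
                System.out.printf("N = %4d  speedup = %.2f%n", n, speedup(p, n));
            }
            // As N grows, the speedup approaches 1 / (1 - p) = 10,
            // the limit imposed by the serial portion of the work.
        }
    }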
Context for Use
Description and Teaching Materials
You can visit the module in your browser:
Parallel Computing Concepts
or you can download the module in PDF, LaTeX, or Word format.
PDF format: Parallel Computing Concepts.pdf
LaTeX format: Parallel Computing Concepts.tar.gz
Word format: Parallel Computing Concepts.docx
Teaching Notes and Tips
Assessment
Assessment to be completed later...