COMP 322: Fundamentals of Parallel Programming (Spring 2023)

 


Instructor:

Mackale Joyner, DH 2063

Head TAs:

Jonathan Cai (hw), Paul Jiang (lab 1pm), William Su (lab 4pm)

Admin Assistant:

Annepha Hurlock, annepha@rice.edu, DH 3122, 713-348-5186

Undergraduate TAs:

Mohamed Abead, Chase Hartsell, Taha Hasan, Harrison Huang, Jerry Jiang, Jasmine Lee, Michelle Lee, Hung Nguyen, Quang Nguyen, Ryan Ramos, Oscar Reynozo, Delaney Schultz, Tina Wen, Raiyan Zannat, Kailin Zhang

Piazza site:

https://piazza.com/rice/spring2022/comp322 (Piazza is the preferred medium for all course communications, but you can also send email to comp322-staff at rice dot edu if needed)

Cross-listing:

ELEC 323

Lecture location:

TBD

Lecture times:

MWF 1:00pm - 1:50pm

Lab locations:

TBD

Lab times:

Mon 3:00pm - 3:50pm

Tue 4:00pm - 4:50pm

Course Syllabus

A summary PDF file containing the course syllabus can be found here.  Much of the syllabus information is also included below on this course web site, along with some additional details that are not included in the syllabus.

...

The desired learning outcomes fall into three major areas (course modules):

1) Parallelism: functional programming, Java streams, creation and coordination of parallelism (async, finish), abstract performance metrics (work, critical paths), Amdahl's Law, weak vs. strong scaling, data races and determinism, data race avoidance (immutability, futures, accumulators, dataflow), deadlock avoidance, abstract vs. real performance (granularity, scalability), collective & point-to-point synchronization (phasers, barriers), parallel algorithms, systolic algorithms.

...

3) Locality & Distribution: memory hierarchies, locality, cache affinity, data movement, message-passing (MPI), communication overheads (bandwidth, latency), MapReduce, accelerators, GPGPUs, CUDA, OpenCL.
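
To make the task-parallel constructs listed under Module 1 a bit more concrete, here is a minimal sketch of divide-and-conquer parallelism in standard Java using the JDK's fork/join framework. It is not HJ-lib code, and the class name, threshold, and array contents below are illustrative assumptions rather than anything taken from the course materials:

    import java.util.Arrays;
    import java.util.concurrent.ForkJoinPool;
    import java.util.concurrent.RecursiveTask;

    // Illustrative parallel array sum. fork() creates a child task and join()
    // waits for its result; the sequential threshold controls task granularity.
    class SumTask extends RecursiveTask<Long> {
        private static final int THRESHOLD = 4096;   // assumed cutoff, tune as needed
        private final int[] a;
        private final int lo, hi;

        SumTask(int[] a, int lo, int hi) { this.a = a; this.lo = lo; this.hi = hi; }

        @Override
        protected Long compute() {
            if (hi - lo <= THRESHOLD) {               // small enough: run sequentially
                long sum = 0;
                for (int i = lo; i < hi; i++) sum += a[i];
                return sum;
            }
            int mid = (lo + hi) >>> 1;
            SumTask left = new SumTask(a, lo, mid);
            left.fork();                              // spawn the left half as a task
            long rightSum = new SumTask(a, mid, hi).compute();  // do the right half here
            return left.join() + rightSum;            // wait for the left half, then combine
        }

        public static void main(String[] args) {
            int[] a = new int[1_000_000];
            Arrays.fill(a, 1);
            long total = ForkJoinPool.commonPool().invoke(new SumTask(a, 0, a.length));
            System.out.println(total);                // expect 1000000
        }
    }

Here fork() plays roughly the role of task creation (async), join() the role of task termination (finish), and the sequential threshold is one way to manage granularity, which in turn affects the work and critical-path metrics discussed in Module 1.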

To achieve these learning outcomes, each class period will include time for both instructor lectures and in-class exercises based on assigned reading and videos.  The lab exercises will be used to help students gain hands-on programming experience with the concepts introduced in the lectures.

To ensure that students gain a strong knowledge of parallel programming foundations, the classes and homework will place equal emphasis on both theory and practice. The programming component of the course will mostly use the Habanero-Java Library (HJ-lib), a pedagogic extension to the Java language developed in the Habanero Extreme Scale Software Research project at Rice University.  The course will also introduce you to real-world parallel programming models including Java Concurrency, MapReduce, MPI, OpenCL and CUDA. An important goal is that, at the end of COMP 322, you should feel comfortable programming in any parallel language for which you are familiar with the underlying sequential language (Java or C). Any parallel programming primitives that you encounter in the future should be easily recognizable based on the fundamentals studied in COMP 322.
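
As a small taste of one of those real-world models (Java Concurrency), the sketch below expresses a simple future-based computation with the standard java.util.concurrent.CompletableFuture API. HJ-lib's own future construct is covered in the Module 1 handout and lectures; the helper method and numbers here are made up for illustration:

    import java.util.concurrent.CompletableFuture;

    // Two independent tasks run asynchronously and return futures; join()
    // blocks until each result is ready, after which the results are combined.
    public class FutureSketch {
        public static void main(String[] args) {
            CompletableFuture<Long> lower =
                CompletableFuture.supplyAsync(() -> rangeSum(0, 500_000));
            CompletableFuture<Long> upper =
                CompletableFuture.supplyAsync(() -> rangeSum(500_000, 1_000_000));
            long total = lower.join() + upper.join();
            System.out.println(total);   // expect 499999500000
        }

        // Stand-in for a more expensive computation (sums the integers in [from, to)).
        static long rangeSum(long from, long to) {
            long sum = 0;
            for (long i = from; i < to; i++) sum += i;
            return sum;
        }
    }

Calling join() on a CompletableFuture blocks until its value is available, which is what guarantees that the final combination step sees both results.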

...

There are no required textbooks for the class. Instead, lecture handouts are provided for each module as follows.  You are expected to read the relevant sections in each lecture handout before coming to the lecture.  We will also provide a number of references in the slides and handouts. The links to the latest versions of the lecture handouts are included below:

  • Module 1 handout (Parallelism)
  • Module 2 handout (Concurrency)

...

There are also a few optional textbooks that we will draw from during the course.  You are encouraged to get copies of any or all of these books.  They will serve as useful references both during and after this course:

...

Finally, here are some additional resources that may be helpful for you:

Lecture Schedule

 

 


Week | Day | Date (2023) | Lecture | Assigned Reading | Assigned Videos (see Canvas site for video links) | In-class Worksheets | Slides | Work Assigned | Work Due | Worksheet Solutions
1 | Mon | Jan 09 | Lecture 1: Introduction | | | worksheet1 | lec1-slides | | | WS1-solution
1 | Wed | Jan 11 | Lecture 2: Functional Programming | GList.java | | worksheet2 | lec02-slides | | | WS2-solution
1 | Fri | Jan 13 | Lecture 3: Higher order functions | | | worksheet3 | lec3-slides | | | WS3-solution
2 | Mon | Jan 16 | No class: MLK Day | | | | | | |
2 | Wed | Jan 18 | Lecture 4: Lazy Computation | LazyList.java, Lazy.java | | worksheet4 | lec4-slides | | | WS4-solution
2 | Fri | Jan 20 | Lecture 5: Java Streams | Module 1: Section 2.1 | | worksheet5 | lec5-slides | Homework 1 | | WS5-solution
3 | Mon | Jan 23 | Lecture 6: Map Reduce with Java Streams | Module 1: Section 2.4 | Topic 2.4 Lecture, Topic 2.4 Demonstration | worksheet6 | lec6-slides | | | WS6-solution
3 | Wed | Jan 25 | Lecture 7: Futures | Module 1: Section 2.1 | Topic 2.1 Lecture, Topic 2.1 Demonstration | worksheet7 | lec7-slides | | | WS7-solution
3 | Fri | Jan 27 | Lecture 8: Computation Graphs, Ideal Parallelism | Module 1: Sections 1.2, 1.3 | Topic 1.2 Lecture, Topic 1.2 Demonstration, Topic 1.3 Lecture, Topic 1.3 Demonstration | worksheet8 | lec8-slides | | Quiz for Unit 1 | WS8-solution
4 | Mon | Jan 30 | Lecture 9: Async, Finish, Data-Driven Tasks | Module 1: Sections 1.1, 4.5 | Topic 1.1 Lecture, Topic 1.1 Demonstration, Topic 4.5 Lecture, Topic 4.5 Demonstration | worksheet9 | lec9-slides | | | WS9-solution
4 | Wed | Feb 01 | Lecture 10: Event-based programming model | | | worksheet10 | lec10-slides | | Homework 1 | WS10-solution
4 | Fri | Feb 03 | Lecture 11: GUI programming as an example of event-based programming, futures/callbacks in GUI programming | | | worksheet11 | lec11-slides | Homework 2 | | WS11-solution
5 | Mon | Feb 06 | Lecture 12: Scheduling/executing computation graphs. Abstract performance metrics | Module 1: Section 1.4 | Topic 1.4 Lecture, Topic 1.4 Demonstration | worksheet12 | lec12-slides | | Quiz for Unit 2 | WS12-solution
5 | Wed | Feb 08 | Lecture 13: Parallel Speedup, Critical Path, Amdahl's Law | Module 1: Section 1.5 | Topic 1.5 Lecture, Topic 1.5 Demonstration | worksheet13 | lec13-slides | Homework 3 (includes 2 intermediate checkpoints) | | WS13-solution
5 | Fri | Feb 10 | No class: Spring Recess | | | | | | |
6 | Mon | Feb 13 | Lecture 14: Accumulation and reduction. Finish accumulators | Module 1: Section 2.3 | Topic 2.3 Lecture, Topic 2.3 Demonstration | worksheet14 | lec14-slides | | | WS14-solution
6 | Wed | Feb 15 | Lecture 15: Recursive Task Parallelism | | | worksheet15 | lec15-slides | | | WS15-solution
6 | Fri | Feb 17 | Lecture 16: Data Races, Functional & Structural Determinism | Module 1: Sections 2.5, 2.6 | Topic 2.5 Lecture, Topic 2.5 Demonstration, Topic 2.6 Lecture, Topic 2.6 Demonstration | worksheet16 | lec16-slides | Quiz for Unit 3 | Homework 2 | WS16-solution
7 | Mon | Feb 20 | Lecture 17: Midterm Review | | | | lec17-slides | | |
7 | Wed | Feb 22 | Lecture 18: Limitations of Functional Parallelism. Abstract vs. Real Performance. Cutoff Strategy | | | worksheet18 | lec18-slides | | | WS18-solution
7 | Fri | Feb 24 | Lecture 19: Fork/Join Programming Model. OS Threads. Scheduler Pattern | | Topic 2.7 Lecture, Topic 2.7 Demonstration, Topic 2.8 Lecture, Topic 2.8 Demonstration | worksheet19 | lec19-slides | | Homework 3, Checkpoint-1 | WS19-solution
8 | Mon | Feb 27 | Lecture 20: Confinement & Monitor Pattern. Critical Sections. Global Lock | Module 2: Sections 5.1, 5.2, 5.6 | Topic 5.1 Lecture, Topic 5.1 Demonstration, Topic 5.2 Lecture, Topic 5.2 Demonstration, Topic 5.6 Lecture, Topic 5.6 Demonstration | worksheet20 | lec20-slides | | | WS20-solution
8 | Wed | Mar 01 | Lecture 21: Atomic Variables, Synchronized Statements | Module 2: Sections 5.4, 7.2 | Topic 5.4 Lecture, Topic 5.4 Demonstration, Topic 7.2 Lecture | worksheet21 | lec21-slides | | | WS21-solution
8 | Fri | Mar 03 | Lecture 22: Parallel Spanning Tree, Other Graph Algorithms | | | worksheet22 | lec22-slides | Homework 4 | Quiz for Unit 4 | WS22-solution
9 | Mon | Mar 06 | Lecture 23: Java Threads and Locks | Module 2: Sections 7.1, 7.3 | Topic 7.1 Lecture, Topic 7.3 Lecture | worksheet23 | lec23-slides | | | WS23-solution
9 | Wed | Mar 08 | Lecture 24: Java Locks - Soundness and Progress Guarantees | Module 2: Section 7.5 | Topic 7.5 Lecture | worksheet24 | lec24-slides | | | WS24-solution
9 | Fri | Mar 10 | Lecture 25: Dining Philosophers Problem | Module 2: Section 7.6 | Topic 7.6 Lecture | worksheet25 | lec25-slides | | | WS25-solution
- | M-F | Mar 13 - Mar 17 | No class: Spring Break | | | | | | |
10 | Mon | Mar 20 | Lecture 26: N-Body Problem, Applications and Implementations | | | worksheet26 | lec26-slides | | | WS26-solution
10 | Wed | Mar 22 | Lecture 27: Read-Write Locks, Linearizability of Concurrent Objects | Module 2: Sections 7.3, 7.4 | Topic 7.3 Lecture, Topic 7.4 Lecture | worksheet27 | lec27-slides | | Homework 3, Checkpoint-2 | WS27-solution
10 | Fri | Mar 24 | Lecture 28: Message-Passing Programming Model with Actors | Module 2: Sections 6.1, 6.2 | Topic 6.1 Lecture, Topic 6.1 Demonstration, Topic 6.2 Lecture, Topic 6.2 Demonstration | worksheet28 | lec28-slides | | | WS28-solution
11 | Mon | Mar 27 | Lecture 29: Active Object Pattern. Combining Actors with Task Parallelism | Module 2: Sections 6.3, 6.4 | Topic 6.3 Lecture, Topic 6.3 Demonstration, Topic 6.4 Lecture, Topic 6.4 Demonstration | worksheet29 | lec29-slides | | | WS29-solution
11 | Wed | Mar 29 | Lecture 30: Task Affinity and Locality. Memory Hierarchy | | | worksheet30 | lec30-slides | | | WS30-solution
11 | Fri | Mar 31 | Lecture 31: Data-Parallel Programming Model. Loop-Level Parallelism, Loop Chunking | Module 1: Sections 3.1, 3.2, 3.3 | Topic 3.1 Lecture, Topic 3.1 Demonstration, Topic 3.2 Lecture, Topic 3.2 Demonstration, Topic 3.3 Lecture, Topic 3.3 Demonstration | worksheet31 | lec31-slides | Homework 5 | Homework 4 | WS31-solution
12 | Mon | Apr 03 | Lecture 32: Barrier Synchronization with Phasers | Module 1: Section 3.4 | Topic 3.4 Lecture, Topic 3.4 Demonstration | worksheet32 | lec32-slides | | | WS32-solution
12 | Wed | Apr 05 | Lecture 33: Stencil Computation. Point-to-point Synchronization with Phasers | Module 1: Sections 4.2, 4.3 | Topic 4.2 Lecture, Topic 4.2 Demonstration, Topic 4.3 Lecture, Topic 4.3 Demonstration | worksheet33 | lec33-slides | | | WS33-solution
12 | Fri | Apr 07 | Lecture 34: Fuzzy Barriers with Phasers | Module 1: Section 4.1 | Topic 4.1 Lecture, Topic 4.1 Demonstration | worksheet34 | lec34-slides | | | WS34-solution
13 | Mon | Apr 10 | Lecture 35: Eureka-style Speculative Task Parallelism | | | worksheet35 | lec35-slides | | | WS35-solution
13 | Wed | Apr 12 | Lecture 36: Scan Pattern. Parallel Prefix Sum | | | worksheet36 | lec36-slides | | | WS36-solution
13 | Fri | Apr 14 | Lecture 37: Parallel Prefix Sum Applications | | | worksheet37 | lec37-slides | | |
14 | Mon | Apr 17 | Lecture 38: Overview of Other Models and Frameworks | | | | lec38-slides | | |
14 | Wed | Apr 19 | Lecture 39: Course Review (Lectures 19-38) | | | | lec39-slides | | |
14 | Fri | Apr 21 | Lecture 40: Course Review (Lectures 19-38) | | | | lec40-slides | | Homework 5 |

Lab Schedule

Lab # | Date (2023) | Topic | Handouts | Examples
1 | Jan 09 | Infrastructure setup | lab0-handout |
- | Jan 16 | No lab this week (MLK) | |
2 | Jan 23 | Functional Programming | lab2-handout |
3 | Jan 30 | Java Streams | lab3-handout |
4 | Feb 06 | Futures | lab4-handout |
5 | Feb 13 | Data-Driven Tasks | lab5-handout |
- | Feb 20 | No lab this week (Midterm) | |
6 | Feb 27 | Async / Finish | lab6-handout |
7 | Mar 06 | Recursive Task Cutoff Strategy | lab7-handout |
- | Mar 13 | No lab this week (Spring Break) | |
8 | Mar 20 | Java Threads, Java Locks | lab8-handout |
9 | Mar 27 | Concurrent Lists | lab9-handout |
10 | Apr 03 | Actors | lab10-handout |
11 | Apr 10 | Loop Parallelism | lab11-handout |
- | Apr 17 | No lab this week | |

Grading, Honor Code Policy, Processes and Procedures

Grading will be based on your performance on four homework assignments (weighted 40% in all), two exams (weighted 40% in all), weekly lab exercises (weighted 10% in all), online quizzes (weighted 5% in all), and in-class worksheets (weighted 5% in all).

The purpose of the homework is to give you practice in solving problems that deepen your understanding of concepts introduced in class. Homework is due on the dates and times specified in the course schedule.  No late submissions (other than those using slip days mentioned below) will be accepted.

The slip day policy for COMP 322 is similar to that of COMP 321. All students will be given 3 slip days to use throughout the semester. When you use a slip day, you will receive up to 24 additional hours to complete the assignment. You may use these slip days in any way you see fit (3 days on one assignment, 1 day each on 3 assignments, etc.). Slip days will be automatically tracked using the README.md file. Other than slip days, no extensions will be given unless there are exceptional circumstances (such as severe sickness, not because you have too much other work). Such extensions must be requested and approved by the instructor (via e-mail, phone, or in person) before the due date for the assignment. Last-minute requests are likely to be denied.

Labs must be checked off by a TA by the following Monday at 11:59pm.

Worksheets should be completed in class for full credit.  For partial credit, a worksheet can be turned in by the deadline listed in Canvas, so that solutions to the worksheets can be discussed in the next class.

You will be expected to follow the Honor Code in all homework and exams.  The following policies will apply to different work products in the course:

  • In-class worksheets: You are free to discuss all aspects of in-class worksheets with your other classmates, the teaching assistants and the professor during the class. You can work in a group and write down the solution that you obtained as a group. If you work on the worksheet outside of class (e.g., due to an absence), then it must be entirely your individual effort, without discussion with any other students.  If you use any material from external sources, you must provide proper attribution.
  • Weekly lab assignments: You are free to discuss all aspects of lab assignments with your other classmates, the teaching assistants and the professor during the lab.  However, all code and reports that you submit are expected to be the result of your individual effort. If you work on the lab outside of class (e.g., due to an absence), then it must be entirely your individual effort, without discussion with any other students.  If you use any material from external sources, you must provide proper attribution (as shown here).
  • Homework: All submitted homework is expected to be the result of your individual effort. You are free to discuss course material and approaches to problems with your other classmates, the teaching assistants and the professor, but you should never misrepresent someone else’s work as your own. If you use any material from external sources, you must provide proper attribution.
  • Quizzes: Each online quiz will be an open-notes individual test.  The student may consult their course materials and notes when taking the quizzes, but may not consult any other external sources.
  • Exams: Each exam will be an open-book, open-notes, and open-computer individual written test, which must be completed within a specified time limit.

...