COMP 322: Fundamentals of Parallel Programming (Spring 2024)

...


Instructor: Mackale Joyner, DH 2063

Head TAs: Jonathan Cai (hw), Paul Jiang (lab 1pm), William Su (lab 4pm)

TAs: Haotian Dang, Andrew Ondara, Stefan Boskovic, Huzaifa Ali, Raahim Absar

Piazza site: https://piazza.com/rice/spring2024/comp322 (Piazza is the preferred medium for all course communications, but you can also send email to comp322-staff at rice dot edu if needed)

Cross-listing: ELEC 323

Lecture location: Herzstein Amp

Lecture times: MWF 1:00pm - 1:50pm

Lab locations: Mon (Brockman 101), Tue (Herzstein Amp)

Lab times: Mon 3:00pm - 3:50pm (SB, HA, AO), Tue 4:00pm - 4:50pm (RA, HD)

Course Syllabus

A summary PDF file containing the course syllabus can be found here.  Much of the syllabus information is also included below on this course web site, along with some additional details that are not included in the syllabus.

...

The desired learning outcomes fall into three major areas (course modules):

1) Parallelism: functional programming, Java streams, creation and coordination of parallelism (async, finish), abstract performance metrics (work, critical paths), Amdahl's Law, weak vs. strong scaling, data races and determinism, data race avoidance (immutability, futures, accumulators, dataflow), deadlock avoidance, abstract vs. real performance (granularity, scalability), collective & point-to-point synchronization (phasers, barriers), parallel algorithms, systolic algorithms. (A short code sketch illustrating future tasks appears after this list.)

...

3) Locality & Distribution: memory hierarchies, locality, cache affinity, data movement, message-passing (MPI), communication overheads (bandwidth, latency), MapReduce, accelerators, GPGPUs, CUDA, OpenCL.
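
As a small illustration of the future-task style listed in item 1 above, here is a minimal sketch in plain Java. It uses java.util.concurrent.CompletableFuture as a stand-in for the HJ-lib future construct used in class; the class name FutureSum and the input array are invented for this example:

    import java.util.concurrent.CompletableFuture;
    import java.util.stream.IntStream;

    public class FutureSum {
        public static void main(String[] args) {
            int[] a = IntStream.rangeClosed(1, 1_000_000).toArray();
            int mid = a.length / 2;

            // Two future tasks compute the two halves in parallel (think "async"),
            // and the join() calls act as the synchronization point (think "finish").
            CompletableFuture<Long> lo = CompletableFuture.supplyAsync(() -> sum(a, 0, mid));
            CompletableFuture<Long> hi = CompletableFuture.supplyAsync(() -> sum(a, mid, a.length));

            long total = lo.join() + hi.join();
            System.out.println("sum = " + total); // prints sum = 500000500000
        }

        private static long sum(int[] a, int from, int to) {
            long s = 0;
            for (int i = from; i < to; i++) s += a[i];
            return s;
        }
    }

Each supplyAsync call plays roughly the role of a future task here, and join() is the blocking read of the future's value; the analogous two-task structure is what the futures and async/finish constructs covered in Module 1 express.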

To achieve these learning outcomes, each class period will include time for both instructor lectures and in-class exercises based on assigned reading and videos.  The lab exercises will be used to help students gain hands-on programming experience with the concepts introduced in the lectures.

To ensure that students gain a strong knowledge of parallel programming foundations, the classes and homework will place equal emphasis on both theory and practice. The programming component of the course will mostly use the Habanero-Java Library (HJ-lib) pedagogic extension to the Java language developed in the Habanero Extreme Scale Software Research project at Rice University.  The course will also introduce you to real-world parallel programming models including Java Concurrency, MapReduce, MPI, OpenCL and CUDA. An important goal is that, at the end of COMP 322, you should feel comfortable programming in any parallel language for which you are familiar with the underlying sequential language (Java or C). Any parallel programming primitives that you encounter in the future should be easily recognizable based on the fundamentals studied in COMP 322.
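
As a taste of the Java streams and MapReduce style mentioned above, here is a minimal word-count sketch using the standard java.util.stream API. The word list and the class name StreamWordCount are invented for illustration and are not taken from the course materials:

    import java.util.Arrays;
    import java.util.List;
    import java.util.Map;
    import java.util.function.Function;
    import java.util.stream.Collectors;

    public class StreamWordCount {
        public static void main(String[] args) {
            List<String> words = Arrays.asList(
                    "async", "finish", "future", "async", "phaser", "future", "async");

            // The "map" step classifies each word by itself; the "reduce" step
            // counts how many words fall into each group.
            Map<String, Long> counts = words.parallelStream()
                    .collect(Collectors.groupingByConcurrent(Function.identity(),
                                                             Collectors.counting()));

            System.out.println(counts); // e.g. {async=3, finish=1, future=2, phaser=1}
        }
    }

Because the source is a parallelStream(), the grouping and counting work can be split across worker threads; the sequential version of the same program differs only in calling stream() instead of parallelStream().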

...

There are no required textbooks for the class. Instead, lecture handouts are provided for each module as follows.  You are expected to read the relevant sections in each lecture handout before coming to the lecture.  We will also provide a number of references in the slides and handouts. The links to the latest versions of the lecture handouts are included below:

  • Module 1 handout (Parallelism)
  • Module 2 handout (Concurrency)

There is no lecture handout for Module 3 (Distribution and Locality).  The instructors will refer you to optional resources to supplement the lecture slides and videos.

There are also a few optional textbooks that we will draw from during the course.  You are encouraged to get copies of any or all of these books.  They will serve as useful references both during and after this course:

 

Finally, here are some additional resources that may be helpful for you:

Lecture Schedule

 

Each row lists the day and date (2024), the lecture, the assigned reading and videos where applicable (see the Canvas site for video links), the in-class worksheet and slides, any work assigned or due, and the worksheet solution.

Week 1
  Mon Jan 08 | Lecture 1: Introduction | worksheet1 | lec1-slides | WS1-solution
  Wed Jan 10 | Lecture 2: Functional Programming | worksheet2 | lec02-slides | WS2-solution
  Fri Jan 12 | Lecture 3: Higher order functions | worksheet3 | lec3-slides | WS3-solution

Week 2
  Mon Jan 15 | No class: MLK
  Wed Jan 17 | Lecture 4: Lazy Computation | worksheet4 | lec4-slides | WS4-solution
  Fri Jan 19 | Lecture 5: Java Streams | worksheet5 | lec5-slides | Homework 1 assigned | WS5-solution

Week 3
  Mon Jan 22 | Lecture 6: Map Reduce with Java Streams | Module 1: Section 2.4; Topic 2.4 Lecture & Demonstration | worksheet6 | lec6-slides | WS6-solution
  Wed Jan 24 | Lecture 7: Futures | Module 1: Section 2.1; Topic 2.1 Lecture & Demonstration | worksheet7 | lec7-slides | WS7-solution
  Fri Jan 26 | Lecture 8: Async, Finish, Computation Graphs | Module 1: Sections 1.1, 1.2; Topic 1.1 & 1.2 Lectures & Demonstrations | worksheet8 | lec8-slides | WS8-solution

Week 4
  Mon Jan 29 | Lecture 9: Ideal Parallelism, Data-Driven Tasks | Module 1: Sections 1.3, 4.5; Topic 1.3 & 4.5 Lectures & Demonstrations | worksheet9 | lec9-slides | WS9-solution
  Wed Jan 31 | Lecture 10: Event-based programming model | worksheet10 | lec10-slides | Homework 1 due | WS10-solution
  Fri Feb 02 | Lecture 11: GUI programming, Scheduling/executing computation graphs | Module 1: Section 1.4; Topic 1.4 Lecture & Demonstration | worksheet11 | lec11-slides | Homework 2 assigned | WS11-solution

Week 5
  Mon Feb 05 | Lecture 12: Abstract performance metrics, Parallel Speedup, Amdahl's Law | Module 1: Section 1.5; Topic 1.5 Lecture & Demonstration | worksheet12 | lec12-slides | WS12-solution
  Wed Feb 07 | Lecture 13: Accumulation and reduction. Finish accumulators | Module 1: Section 2.3; Topic 2.3 Lecture & Demonstration | worksheet13 | lec13-slides | WS13-solution
  Fri Feb 09 | No class: Spring Recess

Week 6
  Mon Feb 12 | Lecture 14: Data Races, Functional & Structural Determinism | Module 1: Sections 2.5, 2.6; Topic 2.5 & 2.6 Lectures & Demonstrations | worksheet14 | lec14-slides | WS14-solution
  Wed Feb 14 | Lecture 15: Limitations of Functional parallelism. Abstract vs. real performance. Cutoff Strategy | worksheet15 | lec15-slides | Homework 2 due | WS15-solution
  Fri Feb 16 | Lecture 16: Recursive Task Parallelism | worksheet16 | lec16-slides | Homework 3 assigned | WS16-solution

Week 7
  Mon Feb 19 | Lecture 17: Midterm Review | lec17-slides
  Wed Feb 21 | Lecture 18: Midterm Review | lec18-slides
  Fri Feb 23 | Lecture 19: Fork/Join programming model. OS Threads. Scheduler Pattern | Topic 2.7 & 2.8 Lectures & Demonstrations | worksheet19 | lec19-slides | WS19-solution

Week 8
  Mon Feb 26 | Lecture 20: Data-Parallel Programming model. Loop-Level Parallelism, Loop Chunking | Module 1: Sections 3.1, 3.2, 3.3; Topic 3.1, 3.2 & 3.3 Lectures & Demonstrations | worksheet20 | lec20-slides | WS20-solution
  Wed Feb 28 | Lecture 21: Barrier Synchronization with Phasers | Module 1: Section 3.4; Topic 3.4 Lecture & Demonstration | worksheet21 | lec21-slides | WS21-solution
  Fri Mar 01 | Lecture 22: Stencil computation. Point-to-point Synchronization with Phasers | Module 1: Sections 4.2, 4.3; Topic 4.2 & 4.3 Lectures & Demonstrations | worksheet22 | lec22-slides | WS22-solution

Week 9
  Mon Mar 04 | Lecture 23: Fuzzy Barriers with Phasers | Module 1: Section 4.1; Topic 4.1 Lecture & Demonstration | worksheet23 | lec23-slides | Homework 3 (CP 1) due | WS23-solution
  Wed Mar 06 | Lecture 24: Confinement & Monitor Pattern. Critical sections. Global lock | Module 2: Sections 5.1, 5.2; Topic 5.1, 5.2 & 5.6 Lectures & Demonstrations | worksheet24 | lec24-slides | WS24-solution
  Fri Mar 08 | Lecture 25: Atomic variables, Synchronized statements | Module 2: Sections 5.4, 7.2; Topic 5.4 Lecture & Demonstration, Topic 7.2 Lecture | worksheet25 | lec25-slides | WS25-solution

  Mon Mar 11 | No class: Spring Break
  Wed Mar 13 | No class: Spring Break
  Fri Mar 15 | No class: Spring Break

Week 10
  Mon Mar 18 | Lecture 26: Java Threads and Locks | Module 2: Sections 7.1, 7.3; Topic 7.1 & 7.3 Lectures | worksheet26 | lec26-slides | WS26-solution
  Wed Mar 20 | Lecture 27: Read-Write Locks, Soundness and progress guarantees | Module 2: Section 7.3; Topic 7.3 & 7.5 Lectures | worksheet27 | lec27-slides | Homework 3 (CP 2) due | WS27-solution
  Fri Mar 22 | Lecture 28: Dining Philosophers Problem | Topic 7.6 Lecture | worksheet28 | lec28-slides | WS28-solution

Week 11
  Mon Mar 25 | Lecture 29: Linearizability of Concurrent Objects | Module 2: Section 7.4; Topic 7.4 Lecture | worksheet29 | lec29-slides | WS29-solution
  Wed Mar 27 | Lecture 30: Parallel Spanning Tree, other graph algorithms | worksheet30 | lec30-slides | WS30-solution
  Fri Mar 29 | Lecture 31: Message-Passing programming model with Actors | Module 2: Sections 6.1, 6.2; Topic 6.1 & 6.2 Lectures & Demonstrations | worksheet31 | lec31-slides | WS31-solution

Week 12
  Mon Apr 01 | Lecture 32: Active Object Pattern. Combining Actors with task parallelism | Module 2: Sections 6.3, 6.4; Topic 6.3 & 6.4 Lectures & Demonstrations | worksheet32 | lec32-slides | Homework 4 assigned; Homework 3 (All) due | WS32-solution
  Wed Apr 03 | Lecture 33: Task Affinity and locality. Memory hierarchy | worksheet33 | lec33-slides | WS33-solution
  Fri Apr 05 | Lecture 34: Eureka-style Speculative Task Parallelism | worksheet34 | lec34-slides | WS34-solution

Week 13
  Mon Apr 08 | No class: Solar Eclipse
  Wed Apr 10 | Lecture 35: Scan Pattern. Parallel Prefix Sum | worksheet35 | lec35-slides | Homework 4 (CP 1) due | WS35-solution
  Fri Apr 12 | Lecture 36: Parallel Prefix Sum applications | worksheet36 | lec36-slides | WS36-solution

Week 14
  Mon Apr 15 | Lecture 37: Overview of other models and frameworks | lec37-slides
  Wed Apr 17 | Lecture 38: Course Review (Lectures 19-34) | lec38-slides | Homework 4 (All) due
  Fri Apr 19 | Lecture 39: Course Review (Lectures 19-34) | lec39-slides





Lab Schedule

Each row lists the lab number, the date (2024), the topic, and the handout(s).

  1 | Jan 08 | Infrastructure Setup | lab0-handout
  - | Jan 15 | No lab this week (MLK)
  2 | Jan 22 | Functional Programming | lab2-handout
  3 | Jan 29 | Futures | lab3-handout
  4 | Feb 05 | Data-Driven Tasks | lab4-handout
  - | Feb 12 | No lab this week
  - | Feb 19 | No lab this week (Midterm Exam)
  5 | Feb 26 | Loop-level Parallelism | lab5-handout, lab5-intro
  6 | Mar 04 | Recursive Task Cutoff Strategy | lab6-handout
  - | Mar 11 | No lab this week (Spring Break)
  7 | Mar 18 | Java Threads | lab7-handout
  8 | Mar 25 | Concurrent Lists | lab8-handout
  9 | Apr 01 | Actors | lab9-handout
  - | Apr 08 | No lab this week (Solar Eclipse)
  - | Apr 15 | No lab this week

Grading, Honor Code Policy, Processes and Procedures

Grading will be based on your performance on four homework assignments (weighted 40% in all), two exams (weighted 40% in all), weekly lab exercises (weighted 10% in all), online quizzes (weighted 5% in all), and in-class worksheets (weighted 5% in all).

The purpose of the homework is to give you practice in solving problems that deepen your understanding of concepts introduced in class. Homework is due on the dates and times specified in the course schedule.  No late submissions (other than those using slip days mentioned below) will be accepted.

The slip day policy for COMP 322 is similar to that of COMP 321. All students will be given 3 slip days to use throughout the semester. When you use a slip day, you will receive up to 24 additional hours to complete the assignment. You may use these slip days in any way you see fit (3 days on one assignment, 1 day each on 3 assignments, etc.). Slip days will be automatically tracked through the Autograder; more details are available later in this document and in the Autograder user guide (README.md). Other than slip days, no extensions will be given unless there are exceptional circumstances (such as severe sickness, not because you have too much other work). Such extensions must be requested and approved by the instructor (via e-mail, phone, or in person) before the due date for the assignment. Last minute requests are likely to be denied.

Labs must be checked off by a TA by the following Monday at 11:59pm.

Worksheets should be completed in class for full credit.  For partial credit, a worksheet can be turned in by the deadline listed in Canvas, so that solutions to the worksheets can be discussed in the next class.

You will be expected to follow the Honor Code in all homework and exams.  The following policies will apply to different work products in the course:

  • In-class worksheets: You are free to discuss all aspects of in-class worksheets with your other classmates, the teaching assistants and the professor during the class. You can work in a group and write down the solution that you obtained as a group. If you work on the worksheet outside of class (e.g., due to an absence), then it must be entirely your individual effort, without discussion with any other students.  If you use any material from external sources, you must provide proper attribution.
  • Weekly lab assignments: You are free to discuss all aspects of lab assignments with your other classmates, the teaching assistants and the professor during the lab.  However, all code and reports that you submit are expected to be the result of your individual effort. If you work on the lab outside of class (e.g., due to an absence), then it must be entirely your individual effort, without discussion with any other students.  If you use any material from external sources, you must provide proper attribution (as shown here).
  • Homework: All submitted homework is expected to be the result of your individual effort. You are free to discuss course material and approaches to problems with your other classmates, the teaching assistants and the professor, but you should never misrepresent someone else’s work as your own. If you use any material from external sources, you must provide proper attribution.
  • Quizzes: Each online quiz will be an open-notes individual test.  The student may consult their course materials and notes when taking the quizzes, but may not consult any other external sources.
  • Exams: Each exam will be an open-book, open-notes, and open-computer individual written test, which must be completed within a specified time limit.

...