
Institutional Effectiveness & Assessment: Assessment Guidelines


Establish clear, measurable expected outcomes of student learning.



The mission communicates the overarching purpose of the degree program. It should include a brief statement of the general values and principles that guide the curriculum and of the learning experiences provided by the program. Indicate how the mission aligns with the missions of the department, college, and university.

Use the following prompts as a guide:

  1. What is the degree program name?
  2. What is the educational purpose of the program?
  3. What is the primary function of the program?
  4. What are the primary activities or learning experiences provided by the program?
  5. Does the mission implicitly or explicitly align the program with the mission of the department, college, and university?


Learning outcomes convey the specific knowledge, skills, and abilities that students should be able to demonstrate at the end of the degree program. Well-written outcomes include concrete action verbs that appropriately specify the level of learning indicated by the program.

Use the following prompts as a guide:

  1. What level of learning (Bloom's action verb) is taking place?
  2. What knowledge, skill, or ability should students demonstrate at the end of the program?

External Resources

What are the characteristics of well-stated learning outcomes? Blog post by Linda Suskie

Bloom's Action Verbs, Resource by University of Nebraska

A learning outcome generator, Resource by Indiana University


Quick Tips

Use strong action verbs to describe learning

  • Original: The students will understand basic components of 80s Pop Culture.
  • Revised: Students graduating from the BA program in 80s Pop Culture will identify (a) relevant musicians, (b) TV shows and movies, (c) fads, and (d) technology of the period.

Verbs like "know", "understand", and "appreciate" are not measurable. Action verbs from established learning taxonomies (e.g., Bloom's), such as identify, explain, examine, analyze, create, develop, and evaluate, are much more descriptive of the learning process. The revised outcome also expands on the major components of 80s Pop Culture.

Select the action verb that represents the highest level of learning

  • Original: Students graduating from the BA program in 80s Pop Culture will identify, explain, summarize, and examine relevant musicians.
  • Revised: Students graduating from the BA program in 80s Pop Culture will examine relevant musicians.

Identify and examine represent two different levels of learning. We strongly suggest using one action verb per student learning outcome. Because examine is a higher level of learning than identify, we can assume that students who can examine are already able to identify.





Ensure that students have sufficient opportunities to achieve those outcomes.



A curriculum map makes visible how courses in a curriculum align to the program's learning outcomes. In its simplest version, the curriculum map is built on a two-dimensional matrix, with the outcomes arrayed across the top and courses listed down the left side. A mark is made in the box where a course addresses an outcome.
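As an illustration only, the two-dimensional matrix described above can be sketched in a few lines of Python. The course and outcome names below are hypothetical, not taken from any real program:

```python
# Hypothetical curriculum map: outcomes across the top, courses down the left.
# All course and outcome names here are invented for illustration.
outcomes = ["SLO1", "SLO2", "SLO3"]
courses = {
    "PCUL 101": {"SLO1"},
    "PCUL 201": {"SLO1", "SLO2"},
    "PCUL 300": {"SLO2", "SLO3"},
}

# Print the matrix, with an "X" where a course addresses an outcome.
print(f"{'':12}" + "".join(f"{slo:>6}" for slo in outcomes))
for course, addressed in courses.items():
    marks = "".join(f"{'X' if slo in addressed else '':>6}" for slo in outcomes)
    print(f"{course:<12}{marks}")
```

In practice most programs build this same grid in a spreadsheet; the point is simply that each row is a course, each column an outcome, and each mark a claim that the course addresses that outcome.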

Use the following prompts to create a curriculum map:

  1. What are we mapping and why?
  2. What parts of the learning environment are included or left out by this approach?
  3. Who should be involved in the conversations?


External Resources

Example Curriculum Maps and Toolkit, Resource by the National Institute of Learning Outcomes Assessment (NILOA)

See an example of a graduate program curriculum map, Resource by University of Wyoming

See examples of undergraduate program curriculum maps, Resource by Washington State University



Creating a curriculum map helps individual faculty connect the dots between their course and the goals of the program. It also helps determine where assessment information should be collected.

Use the following prompts to collect information from faculty:

  1. Which student learning outcomes are taught in this course?
  2. What assignments demonstrate each of these outcomes?

External Resources

Download a Curriculum Mapping Tool (excel) to share with faculty, Resource by Carnegie Mellon

Follow these steps to develop a curriculum map, Resource by Washington State University

An important part of creating or updating a curriculum map is talking with faculty about what is happening in their courses and within the curriculum. This exercise improves communication about curriculum and promotes program coherence.

Use the following prompts to guide a review of the curriculum map:

  1. How is this outcome taught in the course and to what level (introduced, reinforced, mastered)?
  2. What is happening in all course sections vs. what is happening in only some course sections?

External Resources

Curriculum Mapping Instructions and Examples, Resource by Texas A&M University

Levels of Emphasis Curriculum Mapping Scale, Resource by Wabash Center


Quick Tips

Through a program discussion, determine what level of learning is taking place, across all instructors, for each course. Use this scale to categorize the development of student learning across the curriculum.

Introduce "I" = students are introduced to the outcome

Reinforce "R" = the outcome is reinforced, and students are afforded opportunities to practice

Mastery "M" = students have had sufficient practice and can now demonstrate mastery


Analyzing the curriculum map can help inform decisions about course offerings, sequencing, and scheduling. The map can also reveal strengths and weaknesses within the curriculum.

Use the following prompts to guide a programmatic discussion about the curriculum:

  1. In the key courses, are all outcomes addressed, in a logical order?
  2. Do all the key courses address at least one outcome?
  3. Do some outcomes get more coverage than others?
  4. Do students get practice on all the outcomes before being assessed?
  5. Do all students, regardless of which electives they choose, experience a coherent progression and coverage of all outcomes?
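Once the map records an I/R/M level for each course-outcome pair, some of the discussion prompts above can be checked mechanically. A sketch, using invented course names, outcomes, and levels:

```python
# Hypothetical I/R/M curriculum map; all names and levels are invented.
curriculum_map = {
    "PCUL 101": {"SLO1": "I", "SLO2": "I"},
    "PCUL 201": {"SLO1": "R", "SLO2": "R", "SLO3": "I"},
    "PCUL 400": {"SLO1": "M", "SLO3": "R"},
}
outcomes = ["SLO1", "SLO2", "SLO3"]

# Does every outcome reach mastery ("M") somewhere in the curriculum?
mastered = {slo for levels in curriculum_map.values()
            for slo, lvl in levels.items() if lvl == "M"}
gaps = [slo for slo in outcomes if slo not in mastered]
print("Outcomes never assessed at mastery:", gaps)

# Do some outcomes get more coverage than others?
coverage = {slo: sum(slo in levels for levels in curriculum_map.values())
            for slo in outcomes}
print("Courses covering each outcome:", coverage)
```

The output of checks like these is a starting point for the programmatic discussion, not a substitute for it: a gap flagged by the script still needs faculty conversation about whether it is real or an artifact of how the map was filled in.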

External Resources

Use the Curriculum Mapping Toolkit to guide departmental discussions and decision making, Resource by the National Institute of Learning Outcomes Assessment (NILOA)




Systematically gather, analyze, and interpret evidence to determine how well students are meeting our expectations.



Well-developed measures collect information that is relevant, meaningful, and actionable to the program. Two measures should be used to evaluate each student learning outcome and should include at least one direct measure.

Measures are opportunities for programs to collect information about how well students are demonstrating or performing the Student Learning Outcomes (SLOs).

Use the following prompts as a guide:

  1. Where and how are students demonstrating the learning outcome?
  2. What is the purpose of the measure and how does it relate to the outcome?
  3. How is the measure of student learning evaluated (rubric, faculty panel, answer key, survey, etc.)?
  4. What scale, criteria, or standard is used to evaluate the student learning outcome?
  5. How is this consistently measured across administrations?
  6. What makes this measure trustworthy and useful?

External Resources

Choosing an assessment method, Resource by University of Hawai'i at Manoa

VALUE Rubrics, Valid Assessment of Learning in Undergraduate Education, Resource by Association of American Colleges & Universities


Quick Tips

Choose direct measures (e.g. assignments) first and indirect measures (e.g. surveys) second

If the program wants to measure students' ability to "write a cogent argument about how a political event in the 80s shaped pop culture", then...

  • Original: Senior student satisfaction survey
  • Revised: Capstone argument paper and senior student satisfaction survey - questions related to confidence in written communication skills

Exemplary assessment uses two measures to evaluate student learning. Direct measures of student work like tests, papers, and projects provide the most compelling assessment evidence. Indirect measures, like surveys, can be useful as a supplement to the direct measures.

Show the connection between measures and outcomes by adding detail

The overall satisfaction rate of the senior student survey doesn't tell us very much about written communication. It's best to report the specific survey questions or constructs that relate to the learning outcome.

Demonstrate mastery of learning with upper division coursework

If the program wants to measure students' ability to "apply theories to solve real world problems", then...

  • Original: PCUL 201 project
  • Revised: PCUL 300/400 level project and rubric

It's best to collect programmatic assessment data throughout the curriculum in places where students are asked to demonstrate mastery or advanced levels of learning. An upper division course project is a better example of "application skills" than a lower or intermediate level course project.

Develop useful and meaningful data with shared assessments

  • Revised: PCUL 300/400 level project and program-developed and accepted rubric*

Here the program has developed a shared rubric to evaluate this project. The rubric was adapted based on a nationally developed rubric in this area. This measure is more reliable and valid than course projects that are assessed with different instruments/rubrics.

*This is an assessment best practice.


A Target is an established achievement level that states how well and how many students in the program should be able to demonstrate a particular knowledge or skill.

Use the following prompts as a guide:

  1. What is the expected standard of performance?
  2. How many students should be able to achieve this standard of performance?

Quick Tips

Indicate performance standard and percentage of population

  • Original: Students will score B or better on the test
  • Revised: 80% of students will score 75% or better on the test

The revised target articulates the performance level as well as the number of students expected to reach the target.


Results simply report how well students are performing against the goals set by the program. The target achievement status (e.g., exceeded, met, partially met, not met) should also be indicated.

Use the following prompts as a guide:

  1. Do the results report on the information described in the target?
  2. If using percentages or some other calculated final tally, what are the numbers involved in creating the final result? (e.g., 87/94 = 92.55%)
  3. Did the program clearly state achievement of the target (target status)?

Quick Tips


Convert the target into a finding with your results

  • Original: 70%
  • Revised: 70% (70/100) of students scored 75% or better on the designated test questions related to the outcome (methodology)

The finding should mimic the format of the target (see above). Adding the number of students represented in the percentage adds context and value to the data.
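The target-and-finding arithmetic above can be sketched as a small script. The scores, thresholds, and status cutoffs below are invented for illustration; each program defines its own standards:

```python
# Hypothetical data for the example target
# "80% of students will score 75% or better on the test".
scores = [82, 91, 60, 75, 78, 55, 88, 70, 95, 74]  # invented student scores
threshold, target_rate = 75, 0.80

passing = sum(s >= threshold for s in scores)
rate = passing / len(scores)

# Report the finding in the same format as the target, with counts for context.
finding = f"{rate:.0%} ({passing}/{len(scores)}) of students scored {threshold}% or better"

# The "partially met" cutoff here is illustrative, not a standard rule.
status = ("met" if rate >= target_rate
          else "partially met" if rate >= 0.5 * target_rate
          else "not met")
print(finding, "- target", status)
```

Keeping the raw counts alongside the percentage, as the revised finding above does, makes small cohorts visible: 70% means something quite different for 70/100 students than for 7/10.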





Use the resulting information to understand and improve student learning (Suskie, 2014).



This is where programs answer the "so what?" question. This asks programs to extrapolate meaning from the results and provide additional detail or context to fully explain the results to an outside reader. Various levels of analysis could be conducted to make sense of the information. It is especially important to compare learning environments and analyze the results over time to look for trends. This is an opportunity for faculty to make sense of the results against the larger landscape of the program and factors impacting the student learning outcome.

Use the following prompts as a guide:

  1. What are the strengths and weaknesses of student learning in this area?
  2. For programs with both online and face-to-face degree options: how does the performance of these unique learning environments compare?
  3. How do the results compare to previous years?
  4. How do the results fit into the larger landscape of student learning in the program?
  5. How were results shared within the program?

Quick Tips

Summarize and reflect on the data

  • Original: Test question item analysis attached.
  • Revised: Based on the test question item analysis (attached), students succeeded in answering the methodology questions. However, they struggled to answer questions when graphs and tables were involved. These questions also required proficiency in math. These results are up slightly from last year, but still below the program's target goal. The curriculum committee will review these results and determine how to improve this outcome across the upper division courses.

This asks programs to follow up and describe completed action plans, modifications, or improvements made by the program. This is an opportunity for programs to tell their story and connect the dots between the student learning outcome, assessment results that prompted action, and the modifications that were made. The program should outline and determine the impact of their changes on student learning.

Use the following prompts as a guide:

  1. Why were the changes made? (e.g., the student learning outcome and the information that prompted action)
  2. What changes were made during the year or in previous years that impacted student learning?
  3. What impact did this have on student learning?

Quick Tips

Write a learning improvement in three simple steps

  • Original: Results improved from last year.
  • Revised: Over the past three years, methodology related scores were below established program targets. Extra emphasis in this area was implemented across three key courses. Course assignments and rubrics were revised within the last two years. As a result, we have seen methodology scores improve.

A well-written improvement includes three parts: 1) a recap of the data/context that spurred action, 2) a description of the actions taken, and 3) the subsequent expected or reported results.

Describe actions taken by the program and consider their impact

  • Original: Supplemental materials will be created.
  • Revised: Over the last few years, the program saw a dip in exam scores. Exam preparation and tutoring sessions were held this year. We saw a minimal increase in scores this year. This is something the program continues to focus on. Additional interventions are planned (see our action plan to create supplemental materials).

This statement describes actions taken (past tense) by the program. Actions that will take place (future tense) are action plans.


This is where programs answer the "now what?" question. Programs should explain their process for sharing and using assessment results to make decisions in areas such as curriculum, pedagogy, and other aspects that impact learning. The strength of assessment is not that it provides quick fixes for a problem, but that it promotes active, informed, and systematic improvement of a program through discussion among faculty. This is an opportunity to review student learning data and make decisions as a program.

Use the following prompts as a guide:

  1. How is assessment information about the quality of learning shared and used for program decision making in areas such as curriculum, pedagogy, and other aspects that impact learning?
  2. What actions do the results suggest need to be implemented?
  3. What concrete actions will the program take to sustain or improve this outcome? What is the timeframe of these actions?

Quick Tips

Outline tasks to improve student learning

  • Original: Faculty will meet to discuss the results of the assessment data and any necessary changes.
  • Revised: Faculty met and discussed the assessment results. Two faculty members and a graduate student will develop supplemental materials to improve the weak areas of student performance. They will complete this work during the fall semester and pilot the resources in the spring. They will share their feedback and materials with the faculty at the program retreat.

Action plans should report just that, intended actions. Plans to discuss data are not sufficient action items. Work with your program to review results, interpret information, and draft appropriate action plans.