Institutional Effectiveness and Research Resources



Glossary of Assessment Terms for Dallas Baptist University

(These are terms that DBU has standardized for internal use. DBU recognizes that the usage of these terms is fluid and varies from institution to institution and from expert to expert.)

Alignment: The practice of ensuring that assessments are based on course and program outcomes, that course outcomes are based on program outcomes, that program outcomes are derived from the program mission, and that the program mission is derived from and governed by the University mission.

Assessment: The assignment or task that students complete to demonstrate their mastery of a particular learning outcome. (Ex. Essay, Presentation, Test)

Assessment Review: A meeting of faculty stakeholders to review results of student performance on an assessment tool to identify whether targets have been met and where improvements in student learning may be made.

Assessment Tool: The assessment and scoring method used to determine student-learning success.

Benchmark: The actual score or standard that is the minimally acceptable level to demonstrate learning. (Ex. Level 3 or Above on a Rubric, 10 out of possible 12 points on a test question)

Curriculum Map: A listing of the program goals and learner-centered outcomes cross-referenced to the courses and specific assessments used to measure student mastery of the outcomes.

Diagnostic Data: Data gathered from assessment of knowledge, skills, or attributes prior to instruction.

Dimension: A sub-score on a test or rubric that can be used to pinpoint areas of learning weakness that should be addressed for student-learning improvement.

Formative Data: Data gathered from assessment of performance during the learning process to guide instructional efforts.

Plan Item: One section of a program annual report entered into the Campus Labs Planning platform, detailing the specifics of yearly program assessments, assessment review, formative data, action plan, and/or summative data.

Plan Item One: The section of a program annual report entered into the Campus Labs Planning platform that details one program goal, one outcome, and a description of the assessment tool for the annual report.

Plan Item Two: The section of a program annual report entered into the Campus Labs Planning platform that details the diagnostic or formative data from an outcome assessment, the participants and date of the assessment review, and the conclusions drawn from student performance on the assessment.

Plan Item Three: The section of a program annual report entered into the Campus Labs Planning platform that details the action plan for instructional improvement implemented during the academic year of the report and the results of that implementation, drawn from a subsequent assessment of student learning.

Program Goal: A broad statement describing the knowledge, dispositions, and skills a graduate of the program is expected to have.

Program Learner-Centered Outcome: A student-centered statement describing a specific, observable, measurable demonstration of knowledge, skill, or disposition expected from graduates of the program.

Rotation Chart: A listing of the program learner-centered outcomes cross-referenced to the academic year in which each is assessed and reported on in an annual report.  All outcomes should be assessed and reported on within a five-year time frame.

Scoring Method: The standard or criteria applied to the assessment to meaningfully measure students’ level of mastery of a particular learning outcome. (Ex. Rubric, Test Answer Key, etc.)

Student Learning Data: Quantified information regarding student success in meeting an outcome of the program. This data may be diagnostic, formative, or summative, and should reveal useful information to drive learning and program improvement.

Summative Data: Data gathered from assessment of learning at the end of instructional efforts.

Target: The proportion of students expected to meet the benchmark performance level. (Ex. 75% of students will score at level 3 or above on the Critical Thinking Rubric)

            An aspirational target may be set higher than the minimally acceptable level of achievement.

            A critical target may be set to flag completely unacceptable levels of performance.
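As a rough illustration of how a benchmark and a target interact, the sketch below checks whether the proportion of students scoring at or above a benchmark rubric level reaches a target percentage. (The function name, scores, and thresholds are invented for illustration; they do not represent DBU data or practice.)

```python
def target_met(scores, benchmark=3, target=0.75):
    """Return True when the share of students scoring at or above
    the benchmark level reaches the target proportion."""
    at_or_above = sum(1 for s in scores if s >= benchmark)
    return at_or_above / len(scores) >= target

# Ten hypothetical rubric scores; 8 of 10 are at level 3 or above,
# so 80% meets the 75% target.
print(target_met([4, 3, 2, 3, 4, 3, 1, 3, 4, 3]))  # True
```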

Transparency: Clarification for students and faculty of which learning outcome a specific assessment measures, what steps are required to complete the assessment, what the criteria are for demonstrating mastery, what the scoring method is, and what a production or performance of mastery should “look like.”

Additional Reading

  • Guskey, Thomas R. “Does Pre-Assessment Work?” Educational Leadership, February 2018, 52–57. EBSCOhost.
  • Suskie, Linda. Assessing Student Learning: A Common Sense Guide. John Wiley & Sons, 2018.
  • Suskie, Linda. Five Dimensions of Quality. John Wiley & Sons, 2015.
  • Walvoord, Barbara E. Assessment Clear and Simple. 2nd ed. John Wiley & Sons, 2010.