Analyze and Use Results
Analysis of quantitative data generally provides a clear understanding of how well targets have been met (for example, whether 80% of students are "competent" in oral communication) or establishes baselines for future assessments. Analyses can be as straightforward as an item-by-item tabulation of rubric scores, calculating the percentage of students receiving each score. Depending on the sample size and the demographic information collected, other statistical tests can be performed, such as difference-of-means tests (do majors and non-majors, or first-year and fourth-year students, perform differently?). Institutional Research and Analytics can advise those curious about various analysis options.
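The tabulation and comparison described above can be sketched in a few lines of Python. The rubric scores and group splits below are invented for illustration; a real analysis would use the program's own data, and a formal difference-of-means test (e.g., Welch's t-test via scipy.stats.ttest_ind) would supplement the raw means shown here.

```python
from collections import Counter
from statistics import mean

# Hypothetical rubric scores (1-4 scale), one per student.
scores = [4, 3, 3, 2, 4, 3, 1, 3, 4, 2, 3, 3]

# Item-by-item tabulation: percent of students at each rubric level.
counts = Counter(scores)
for level in sorted(counts):
    pct = 100 * counts[level] / len(scores)
    print(f"Level {level}: {counts[level]} students ({pct:.0f}%)")

# Target check: did at least 80% score "competent" (3 or 4)?
competent = sum(1 for s in scores if s >= 3) / len(scores)
print(f"Competent: {competent:.0%} (target: 80%)")

# Informal difference-of-means comparison (hypothetical majors vs. non-majors).
majors = [4, 3, 3, 4, 3, 4]
non_majors = [2, 3, 2, 3, 3, 2]
print(f"Majors mean: {mean(majors):.2f}, non-majors mean: {mean(non_majors):.2f}")
```

With this invented data, 75% of students meet the competency bar, so the 80% target would not be met.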
Qualitative data can be analyzed both formally and informally. Informally, analysts can read through interview or focus group responses and summarize them. More systematic analysis can be conducted through content analysis, which tallies key words or concepts. Survey results often include qualitative open-ended responses as well as quantitative scaled answers (e.g., "very satisfied" to "very dissatisfied").
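A minimal content-analysis tally might look like the following sketch. The open-ended responses and the keyword list are invented examples; in practice, the keywords would come from the program's learning outcomes or emerge from an initial read of the responses.

```python
from collections import Counter
import re

# Hypothetical open-ended survey responses.
responses = [
    "The feedback on my presentations helped me improve.",
    "More feedback and practice presenting would help.",
    "Group work was useful, but I wanted more feedback.",
]

# Key concepts chosen for illustration.
keywords = ["feedback", "practice", "group"]

# Tally how often each keyword appears across all responses.
tally = Counter()
for text in responses:
    words = re.findall(r"[a-z]+", text.lower())
    for kw in keywords:
        tally[kw] += words.count(kw)

print(tally)  # Counter({'feedback': 3, 'practice': 1, 'group': 1})
```

Simple exact-match counting like this misses synonyms and word variants ("presenting" vs. "presentation"), which is why a human read-through remains an important complement.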
During the analysis stage, the validity and reliability of the assessment tool should be examined. Did the methods measure what you intended (validity), and are they likely to yield the same findings each time they are employed (reliability)? This process need not be sophisticated; what matters is taking the time to reflect on the assessment process. For more information on validity and reliability, see the article "Scoring Rubric Development: Validity and Reliability" in Practical Assessment, Research & Evaluation.
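One unsophisticated but useful reliability check, assuming two raters scored the same set of student artifacts, is percent exact agreement. The scores below are invented; a more robust statistic such as Cohen's kappa would also correct for chance agreement.

```python
# Hypothetical rubric scores from two raters on the same eight artifacts.
rater_a = [3, 4, 2, 3, 3, 4, 2, 3]
rater_b = [3, 4, 3, 3, 2, 4, 2, 3]

# Percent exact agreement: fraction of artifacts where both raters
# assigned the identical score.
agree = sum(a == b for a, b in zip(rater_a, rater_b))
pct_agreement = agree / len(rater_a)
print(f"Exact agreement: {pct_agreement:.0%}")
```

Low agreement would suggest the rubric's performance levels need clearer descriptions or that raters need a norming session before scoring.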
Summarize and Share Assessment Results
Share the assessment results with faculty colleagues, a committee, or department leadership to elicit their interpretation.
- If results affirm that student learning meets faculty expectations, then include that finding in the subsequent report.
- If results are inconclusive or conflict with "on the ground" perceptions, then either refine the assessment process for next time or continue the assessment to see if additional data yield clearer results.
- If results reveal that student performance does not meet faculty expectations, then initiate planning to improve student learning.
Design, Plan, Implement, and Document Improvements as Needed
- Improvements can range from minor changes in course content to major changes in the program.
- Improvements can consist of changes to the assessment process or instrument.
- Describe and document improvements planned, undertaken, or completed.
- If the assessment found that outcomes were achieved and faculty are confident in the validity of the process, then a conclusion of "no change needed" is a legitimate result.