
Data Collection, Analysis, and Reporting

After conducting your assessment, you can begin the critical process of collecting evidence, analyzing data, and reporting the findings or results to your stakeholders in order to prompt thoughtful discussions on next steps. 


Collecting Data

Whether you are reviewing test results or survey responses, begin by taking stock of the assessment artifacts and evidence at your disposal. Review what data sources may already exist within the University or within your school, department, or program. Determine what additional data may need to be collected at a local level, the process or policy for collection, and a responsible point person. It is recommended to retain a copy of assessment instruments and artifacts, raw data results, and data analyses for your records and future reference.

Like assessment methods, data can be direct or indirect in nature. Using both types of evidence can provide multiple perspectives in the assessment process. 

  • Direct evidence: tests, rubrics, certification or licensure exams, or field experience evaluations.
  • Indirect evidence: job placement rates, salaries, retention rates, graduation rates, course grades, and surveys of students or alumni.

Data Analysis

The primary purpose of data analysis is not only to determine whether the benchmark was met, but also to understand how and why a favorable or unfavorable result occurred. In this way, programs can gain a deeper understanding of what works well, what needs improvement, and which interventions can help move the needle toward more effective student learning outcomes.

  • Look for themes. Are there any patterns in the data?
  • Look for anomalies or outliers. Is there anything that doesn't "fit" that warrants closer examination?
  • Does the data corroborate or contradict classroom experiences or other anecdotal interactions with students? 
  • How does the data align with trends noted by national or regional organizations in your discipline? 
  • Are there external factors that may have impacted the results? Consider, for example, the impact of COVID-19 on program delivery formats in recent years. 
  • Upon reviewing the data, what questions do you have? What conclusion have you drawn? What recommendations do you have? 
  • Did the assessment method yield data that is appropriate for the objective, or do you need to make a change to the mapping, metrics, methods or benchmarks you previously set? 

Adapted from "Assessment Data Review Worksheet" by the University of South Carolina
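For programs working with numeric results, the first two prompts above (themes and outliers) can be sketched in a few lines of Python. The rubric scores and the benchmark of 3.0 below are hypothetical placeholders, not real program data:

```python
import statistics

# Hypothetical rubric scores (1-4 scale) for one learning outcome.
scores = [3.2, 3.5, 2.9, 3.1, 1.4, 3.3, 3.0, 3.6, 2.8, 3.4]
benchmark = 3.0  # assumed target: a mean score of 3.0 or higher

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)

# Flag scores more than two standard deviations from the mean
# as possible outliers that warrant a closer look.
outliers = [s for s in scores if abs(s - mean) > 2 * stdev]

print(f"Mean: {mean:.2f} (benchmark {'met' if mean >= benchmark else 'not met'})")
print(f"Possible outliers: {outliers}")
```

A summary statistic alone can hide the story: here the mean clears the benchmark, but the flagged low score is exactly the kind of anomaly the questions above ask you to examine before drawing conclusions.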


Reporting Results & Findings

  • Keep data summaries short, sweet, and easy to interpret.
  • Use concise, simple charts, graphs, lists, or PowerPoint slides. Avoid narrative reports. 
  • Aggregate data first, then drill down into details.
  • Link results to your learning objectives and benchmarks to provide clear outcomes related to your established standards. 
  • Consider the audience receiving the report, and what data they might need to inform decision-making. 
  • Engage in meaningful discussion with stakeholders to determine what the data means for your program.  
  • As you collect evidence over time, provide historical trend data, such as a year-over-year analysis.

Adapted from "Understanding and Using Assessment Results" by Linda Suskie (2008).
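The year-over-year analysis suggested above can likewise be sketched simply. The yearly mean scores in this example are hypothetical placeholders for whatever metric your program tracks:

```python
# Hypothetical mean rubric scores for one learning objective, by year,
# used to illustrate a simple year-over-year trend summary.
yearly_means = {2021: 2.7, 2022: 2.9, 2023: 3.1}

years = sorted(yearly_means)
for prev, curr in zip(years, years[1:]):
    # Compare each year to the one before it and note the direction.
    change = yearly_means[curr] - yearly_means[prev]
    direction = "up" if change > 0 else "down" if change < 0 else "flat"
    print(f"{prev} -> {curr}: {yearly_means[curr]:.1f} ({direction} {abs(change):.1f})")
```

Even a plain table of such deltas gives stakeholders a clearer picture than a single year's snapshot, and it fits the guidance above to aggregate first and keep summaries easy to interpret.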