Analyzing Student Learning Outcomes Assessment Data

Assuming SLOs already exist and are stated adequately (see WRITING APPROPRIATE SLOs, available from the OIRPA), this handout provides a brief introduction to five basic steps in analyzing SLO assessment data.

STEP ONE: Selecting Assessment Methods – Data cannot be analyzed until they are collected. Consequently, it is necessary to identify for each SLO one or more appropriate methods to “measure,” or determine, the level to which the desired outcome was achieved or is being achieved by students. Authors are wise to keep potential assessment methods in mind when writing SLOs; doing so greatly assists in selecting and crafting achievable, measurable SLOs. Numerous methods are possible, but in general an appropriate assessment method must (a) be appropriate to the particular SLO, (b) fit faculty and/or staff’s comfort and skill level, and (c) yield data suitable for determining whether the specific SLO was achieved.
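
As a concrete illustration, the sketch below pairs hypothetical SLOs with candidate assessment methods; all SLO wording and method names are invented for the example and are not drawn from any actual program. Recording the pairing explicitly while drafting helps confirm each outcome is actually measurable (condition (a) above).

```python
# Hypothetical SLO-to-method pairings; names are illustrative only.
candidate_methods = {
    "Students can analyze a data set using descriptive statistics":
        ["embedded exam questions", "capstone project rubric"],
    "Students can communicate technical results in writing":
        ["writing portfolio rubric", "internship supervisor evaluation"],
}

for slo, methods in candidate_methods.items():
    print(f"SLO: {slo}")
    for method in methods:
        print(f"  candidate method: {method}")
```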

For more information on assessment methods read the handout, METHODS TO ASSESS STUDENT LEARNING OUTCOMES (SLOs), available from the OIRPA.

STEP TWO: Collecting Data – Once suitable assessment methods are identified for each SLO, faculty and/or staff must determine when, where, how, and by whom the methods will be implemented and the resulting data collected. At CBU, data collection is planned at two levels. First, an Overall Assessment Plan (OPlan) is formulated; it identifies the program’s SLOs, assessment methods, where each outcome is addressed, criteria for success, and assessment frequency. The OPlan is written once and updated as needed. Second, in a Yearly Assessment Plan (YPlan) the program identifies the specific SLOs it intends to assess that year (based on the frequency guidelines in the OPlan) and the specific assessment methods, spelling out exactly when, how, and by whom the data will be collected.
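
The handout does not prescribe a format for these plans, but a minimal sketch of what one OPlan entry and a derived YPlan entry might capture, with all field names and values invented for illustration, could look like this:

```python
from dataclasses import dataclass

@dataclass
class OPlanEntry:
    """One entry in an Overall Assessment Plan (hypothetical fields)."""
    slo: str               # the outcome statement
    methods: list          # assessment method(s) for this SLO
    addressed_in: list     # where the outcome is addressed
    criterion: str         # minimum aggregate level counted as success
    frequency_years: int   # how often this SLO is assessed

@dataclass
class YPlanEntry:
    """One entry in a Yearly Assessment Plan, derived from the OPlan."""
    slo: str
    method: str
    when: str              # e.g., "Spring semester, final exam"
    how: str               # e.g., "embedded questions scored with shared rubric"
    who: str               # person responsible for collecting the data

# Invented example values:
oplan = OPlanEntry(
    slo="Students can interpret descriptive statistics",
    methods=["embedded exam questions"],
    addressed_in=["MATH 201", "Capstone"],
    criterion="mean exam score of at least 75",
    frequency_years=2,
)
yplan = YPlanEntry(
    slo=oplan.slo,
    method=oplan.methods[0],
    when="Spring semester, final exam",
    how="four embedded questions scored with the shared rubric",
    who="course instructor of record",
)
print(oplan, yplan, sep="\n")
```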

After planning the data collection process, the data are collected in keeping with the plan.

STEP THREE: Analyzing Data – Adequately stated SLOs include a “criterion level,” though not necessarily as part of the SLO statement itself. At CBU a criterion level for each SLO is identified in the program’s Overall Assessment Plan (OPlan). A criterion level, selected by faculty, is the minimum level students must achieve in order to demonstrate SLO achievement; it applies not to individual students but to all students in the major or minor (i.e., aggregated data). Examples include a specific percentile score on a major field exam, an average performance evaluation, the percentage of students completing an internship, the number of students admitted to graduate school, or a narrative description of student performance. Two types of data, explained briefly below, are collected and analyzed to determine if criteria are met:

Quantitative Data – Assessment data measured numerically (counts, scores, percentages, etc.) are most often summarized using simple charts, graphs, tables, and descriptive statistics (mean, median, mode, standard deviation, percentage, etc.). Which quantitative analysis method is best depends on (a) the specific assessment method, (b) the type of data collected (nominal, ordinal, interval, or ratio), and (c) the audience receiving and using the results. No one analysis method is best, but means (averages) and percentages are used most frequently. Refer to an introductory statistics text to learn more about basic numerical analysis. The OIRPA is available to consult on quantitative data analysis.
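
For example, a minimal sketch using only Python’s standard library, with invented scores and an invented criterion, might summarize aggregated exam results and compare the mean against the OPlan criterion:

```python
import statistics

# Hypothetical aggregated exam scores for all students in the major.
scores = [72, 85, 64, 90, 78, 81, 69, 88, 74, 93]
criterion = 75  # invented criterion: mean score of at least 75

summary = {
    "n": len(scores),
    "mean": statistics.mean(scores),
    "median": statistics.median(scores),
    "stdev": statistics.stdev(scores),
    "pct_at_or_above_70": 100 * sum(s >= 70 for s in scores) / len(scores),
}

for label, value in summary.items():
    text = f"{value:.1f}" if isinstance(value, float) else str(value)
    print(f"{label}: {text}")

print("criterion met" if summary["mean"] >= criterion else "criterion not met")
```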

Qualitative Data – SLOs assessed using qualitative methods focus on words and descriptions and produce verbal or narrative data. These types of data are collected through focus groups, interviews, open-ended questionnaires, and other less structured methodologies. Generally speaking, qualitative assessment favors words and descriptions over numbers. Abundant assistance on using qualitative assessment is available on the internet, and the OIRPA is available to consult on qualitative data analysis. (NOTE: Many qualitative methods are “quantifiable,” meaning the qualitative data can be summarized using numbers. For example, art faculty use a numeric rubric to apply their professional judgment in assessing student portfolios.)
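
As a sketch of that quantification idea, suppose (hypothetically) that three raters score each portfolio on a 1-to-4 rubric; the individual judgments can then be aggregated numerically and compared against a criterion:

```python
import statistics

# Hypothetical rubric scores: three raters, each scoring 1 (poor) to 4 (excellent).
portfolio_ratings = {
    "portfolio_01": [3, 4, 3],
    "portfolio_02": [2, 2, 3],
    "portfolio_03": [4, 4, 4],
    "portfolio_04": [3, 2, 3],
}
criterion = 3.0  # invented criterion: overall mean rubric score of at least 3.0

per_portfolio = {name: statistics.mean(r) for name, r in portfolio_ratings.items()}
overall = statistics.mean(per_portfolio.values())

for name, avg in per_portfolio.items():
    print(f"{name}: mean rubric score {avg:.2f}")
status = "meets" if overall >= criterion else "falls below"
print(f"overall mean {overall:.2f} {status} the criterion")
```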

STEP FOUR: Using Results to Make Decisions (evaluation) – Once an appropriate analysis technique is applied to the assessment data and results are in hand, the next step entails making decisions based on those results. Compare the results with the established criteria to answer the following generic “so-what” questions: First, do the data affirm the SLO was achieved, or is being achieved, at the desired criterion level? Second, depending on the answer to the first question, what action or actions are warranted by the data?

Answering the generic questions leads to one or more subsequent actions (see the sketch after this list):

  1. Do nothing and retain the SLO; continue on the same path.
  2. Identify what minor adjustments in the instructional and/or assessment methods are needed to facilitate achieving the SLO; or perhaps just more time is needed.
  3. Determine and implement major adjustments in the instructional and/or assessment methods as needed to facilitate achieving the SLO.
  4. Delete the SLO and replace it with a more appropriate SLO.
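
A minimal sketch of this decision step, with invented gap thresholds, might encode the four options as follows; in practice faculty judgment, not a formula, drives the choice:

```python
def recommend_action(result: float, criterion: float) -> str:
    """Map an assessment result to one of the four actions listed above.
    The gap thresholds are invented for illustration only."""
    gap = criterion - result
    if gap <= 0:
        return "1. Criterion met: retain the SLO and continue on the same path."
    if gap <= 0.05 * criterion:
        return "2. Near miss: make minor adjustments, or simply allow more time."
    if gap <= 0.25 * criterion:
        return "3. Clear shortfall: make major instructional/assessment adjustments."
    return "4. Persistent large gap: replace the SLO with a more appropriate one."

print(recommend_action(result=79.4, criterion=75))  # hypothetical values
```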

STEP FIVE: Reporting Results – Two questions arise: first, to whom are the results reported, and second, how are the results reported? The audience receiving the report usually dictates the reporting method(s). In most cases the audience is either “official” or “unofficial.” Official audiences include presidents, deans, faculty leaders, etc.; people to whom the results must be reported. In this context reporting methods often must conform to expected standards and/or formats; for example, an annual report containing specific information and filed in a prescribed manner, such as a web-based reporting system or similar highly structured procedure. Note that decisions made in Step Three (analysis methods) can be influenced by reporting requirements.

At CBU an “official” report is required each year. All programs must file a Yearly Assessment Report indicating the SLOs assessed during the past year, including actions taken or planned based on the assessment results.
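
As one illustration of such a highly structured format, a minimal sketch, with field names invented rather than taken from CBU’s actual report template, might assemble the required elements into a short text report:

```python
# Hypothetical Yearly Assessment Report entry; the real template may differ.
report = {
    "slo": "Students can interpret descriptive statistics",
    "method": "embedded final-exam questions",
    "criterion": "mean score of at least 75",
    "result": "mean score 79.4 (n = 10)",
    "decision": "criterion met; SLO retained",
    "actions": "none required; continue current instruction",
}

print("YEARLY ASSESSMENT REPORT (sample entry)")
for field, value in report.items():
    print(f"{field.upper():<10} {value}")
```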

Unofficial audiences are persons who might find the results interesting but are not required to receive or act on them. In these cases the reporting method is far more flexible; options range from simple handouts to slick multi-color brochures. Factors such as time and resources influence selecting the best reporting method(s).

Regardless of the specific audience and reporting methods, it is imperative that the data be current, accurate, and appropriately analyzed. Likewise, clearly stating what actions were taken or are planned is essential.

IMPORTANT CONSIDERATIONS – Please keep the following facts in mind:

  1. SLOs are rarely achieved at the desired criterion level the first time they are assessed. It is perfectly acceptable to assess SLOs multiple times over several years, carefully documenting incremental improvements without expecting total perfection.
  2. Time is often the key factor; it may be necessary to give an SLO more time to “mature.” Assessment is NOT a one-time event but an ongoing marathon.
  3. Progress in assessing SLOs is far more important than exacting precision; sustained effort and patience are essential. Like any skill, the more assessment is practiced, the better the performance becomes.
  4. It is not sufficient merely to collect and analyze assessment data; results must inform decisions about program improvements.
