Category Archives: SRTE

All Students Can Recognize Good Teaching: Just Ask Them

This fall semester is about over and soon students will be responding to the SRTEs. For many students, the opportunity to give feedback to their instructors is a phenomenon that begins in college. That trend, however, is changing. According to the article “Why Kids Should Grade Teachers,” a growing number of American schools are asking their students, some as young as kindergarten, to evaluate their teachers.

Some interesting findings are emerging. Student ratings tend to be fairly stable from class to class and from fall to spring. Race and income don’t have much impact on results. What is clear is that students are looking for a classroom where the teacher has control and makes learning challenging. And Memphis has become the first school system in the country to link survey results to teachers’ annual reviews, with the surveys counting for 5% of a teacher’s evaluation.

A variety of questions are being asked on these surveys, but the five most strongly correlated with student learning are listed below:

1. Students in this class treat the teacher with respect.

2. My classmates behave the way my teacher wants them to.

3. Our class stays busy and doesn’t waste time.

4. In this class, we learn a lot almost every day.

5. In this class, we learn to correct our mistakes.

What can those of us in higher education learn from this article? Are we, for example, asking the right questions? Are we using the information we gather from students to define quality teaching and to inform better practice?

Student Ratings of Teaching Effectiveness: Online SRTEs

This posting is prompted by a recent comment from a faculty member expressing concern about the decreased response rates seen with online SRTEs. Over the past year, a number of faculty have also reported hearing that the Schreyer Institute says the response rates for SRTEs have not decreased. This is incorrect.

Instead, what we have communicated is that
a) the response rate decrease was expected, but
b) average ratings have held steady.

Our research has shown that response rates have decreased by 20-25%. We fully expected a decrease because the assessment has moved out of the classroom, so students are no longer a “captive audience.” In fact, the decrease we have experienced at Penn State is smaller than we expected and much smaller than other institutions have experienced. The average response rates by college and campus are posted online, and all are above 50%.

The reports above also show that average scores have held steady. At the request of the University Faculty Senate’s Committee on Faculty Affairs and the Vice Provost for Academic Affairs, we have been monitoring the SRTE results. Other than the overall decrease in response rates, we have found no patterns that can be attributed to the online SRTEs. In other words, we see no evidence that faculty have been ‘harmed’ by the decreased response rates. Vice Provost Blannie Bowen has indicated that unless we see a significant negative impact on faculty, we will continue with the online administration of the SRTEs.

We have requested that campus and college administrators communicate to faculty review committees and academic unit heads that response rate decreases should not be over-interpreted or attributed to the actions of any individual faculty member.

If a faculty member believes that their results are the exception to the trend, i.e., that their average scores have decreased solely because of the online administration, that should be communicated to their academic unit head. The University Faculty Senate’s Statement of Practices for the Evaluation of Teaching Effectiveness for Promotion and Tenure states that “If there is some reason to explain the results or the absence of results in a particular case, the appropriate academic administrator shall make a note to that effect in the dossier” (see p. 3, section I.A.11.a.2).

Please remember that the paper SRTEs had more than 20 years to become embedded in student culture. We need to give the online SRTEs some time too. We are still in the transition phase, in which many of our current students experienced paper SRTEs and are now completing them online. It will be a few years before all students have submitted SRTEs only online. Institutions that have been administering student ratings online for far longer than we have indicate that the response rates do rebound.

The Schreyer Institute is currently gathering information from faculty who have response rates of 70% or higher, and we will continue to add to this site. The faculty included in this project come from a wide variety of colleges, campuses, and disciplines, and they teach courses with enrollments from 30 to 493. We deliberately excluded graduate and upper-level majors courses, in which it might be easier to get higher response rates.

What this project indicates is what we’ve always known: that faculty are the most important determinant of students’ participation. In short, students make the effort to submit ratings when they feel that “someone is listening.” The faculty with the highest response rates communicate that they value students’ views, take suggestions for improvement seriously, and tell their students about changes made as a result of student feedback. Some faculty do this by talking about the SRTEs; others never mention the SRTEs but instead gather student feedback throughout the semester, which creates a culture of feedback among their students. None of these faculty discuss the fact that SRTEs are required for P&T and reappointment.

Faculty Tips: Getting Students to Do the SRTEs

As is the case at most universities, end-of-semester student evaluations are important at Penn State: In addition to providing valuable information about what’s working (or not) in a course, the SRTEs factor into decisions about promotion and tenure, or rehiring of non-tenure-track faculty.

Since Penn State’s move to online SRTEs, many faculty members have expressed concern about fewer students filling out the forms. Yet some faculty still achieve impressive response rates, and staff at the Schreyer Institute realized that those faculty have insights that could help everyone. So we asked faculty what their secrets are (the respondents had at least a 70% SRTE response rate and at least 30 students in their classes). 

Some of the faculty said they tell students about ways SRTE feedback has allowed them to improve their classes; some mentioned the importance of gathering feedback throughout the semester; and some had still other strategies. You can find tips from faculty in their own words here. We’ll be adding to the list over time, so make sure to stop back. And if you have a tip of your own, please add a comment to this blog!


Timely Reminders

As the end of the semester looms nearer, some reminders:

–Many of our students are starting to “hit the wall” around this time in the semester. (If only they knew that so are their faculty.) If you have major projects due in the next few weeks, consider checking in systematically with your students to gauge their progress. How many of them have completed the literature review for their research papers? Have their groups been meeting regularly to prep for their upcoming class presentations? Ideally, you have built these checkpoints into your syllabus; even if you haven’t, consider whether you can check in informally now.

–Make sure your grade book is current. I’ve sometimes been guilty of waiting until the last minute to compile my end-of-semester grades. Not recommended, unless you like stress. Consider whether your grades are all up to date, and if not, there’s no time like the present.

–Prepare for those SRTEs. Although end-of-semester student evaluations are still a few weeks away, David Perlmutter offers some good tips about interpreting them — and preparing students to take them — in the Chronicle of Higher Education. One interesting suggestion: “In the case of course objectives, on the first day of class, lay them out carefully, noting that they are also spelled out in the syllabus. In a later class, perhaps the one previous to the session in which you will hand out the evaluations, reiterate your course objectives and explain how they have been achieved. That’s not pandering to students; that’s transparent teaching.”

–Remember that extra hour we all get this Sunday, when we turn the clocks back one hour. You might use it to sleep or to catch up on academic work. Don’t worry though — we’ll have to pay for this next year when we “spring forward.”

Online teaching and promotion and tenure

Yesterday the Institute hosted a panel discussion, “How can the teaching of online courses be evaluated for P&T?,” with panelists Keith Bailey, David DiBiase, Diane Parente, and Angela Linse.
Many PSU folks joined us, both face-to-face in 315 Rider Building and via Polycom from Brandywine, Fayette, and Erie. Dave and the folks from the Dutton e-Education Institute created a nice peer review guide for online courses that the College of EMS utilizes. One thing Dave mentioned is that, using this guide, faculty who have not taught online would still be capable of providing a quality review of an online course. This sparked some interesting debate among both panelists and attendees. Keith Bailey and Diane Parente offered their methods of online course peer review as well. Both Dave and Diane encourage faculty reviewers to be ‘in’ the online environment the teacher is using for the course for at least a month, sometimes for the entire semester. Keith indicated this could be a tough sell for faculty, who already have a great deal of commitments vying for their time. Another interesting point was raised from Brandywine, which has established a guideline that keeps most tenure-track instructors away from online instruction. Some of this is due to SRTE ratings typically being six-tenths of a point lower for online instructors than for resident instructors.

All in all, it was a great discussion, and you could tell the panelists and some of the participants were passionate about the dialog. Hopefully we can build on this enthusiasm and continue working toward a consistent, quality-driven method for evaluating the growing number of online courses offered around PSU.