
Allowing Student Notes for Exams without Encouraging Cheating

The University Testing Center has just notified faculty that because of increased usage, Center staff cannot collect notes that faculty allow students to use during an exam.  Alternatives exist for faculty-created handouts (e.g. embedding them in ANGEL exam question sets), but how to deal with student-created notes is another challenge.

Many faculty allow students to bring in a page of notes to their exams because in preparing that page of notes, students review the material, synthesize it, and think about what is most important. All of these help students learn! 

However, other faculty are concerned that if students are allowed to bring their own notes, some will take the opportunity to copy exam questions and pass them on to other students.  Is it possible to discourage such behavior but still allow student-created notes?

Using a bank of test items and randomly drawing questions provides each student with a unique exam. If the bank is large and the questions sufficiently varied, there is little advantage to copying questions and sharing them with other students.  Faculty who use question banks should also take steps to ensure that each test is of comparable difficulty. Subdividing questions into different levels of difficulty and drawing a specified percent from each level is a good method to ensure that each unique exam is equally difficult.
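For readers who like to see the idea in concrete form, here is a minimal sketch in Python of the stratified random draw described above. The question bank, difficulty labels, and percentages are made-up examples, and a learning management system typically handles the equivalent steps for you when you configure a question set; this is only an illustration of the logic.

    import random

    # Hypothetical question bank grouped by difficulty level (illustrative data only).
    question_bank = {
        "easy":   ["E1", "E2", "E3", "E4", "E5", "E6", "E7", "E8"],
        "medium": ["M1", "M2", "M3", "M4", "M5", "M6", "M7", "M8"],
        "hard":   ["H1", "H2", "H3", "H4", "H5", "H6"],
    }

    # Assumed blueprint: the share of the exam drawn from each level (sums to 1.0).
    blueprint = {"easy": 0.4, "medium": 0.4, "hard": 0.2}

    def build_exam(bank, blueprint, total_questions, seed=None):
        """Draw one unique exam by sampling the specified share of questions
        from each difficulty level, then shuffling the combined set."""
        rng = random.Random(seed)
        exam = []
        for level, share in blueprint.items():
            exam.extend(rng.sample(bank[level], round(total_questions * share)))
        rng.shuffle(exam)
        return exam

    # Each student gets an independently drawn exam of comparable difficulty.
    print(build_exam(question_bank, blueprint, total_questions=10, seed=1))

Because every exam draws the same proportion of questions from each difficulty level, the exams differ in content but remain comparable in difficulty.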

A faculty member can also provide students with blank but distinctively marked note paper to use for their exam notes.  To deter students from adding notes during the exam, ask them to return the note paper in class after taking the test.  Choose a marking that is difficult to replicate and easy to identify as an item students are allowed to use during the test.

While neither strategy is guaranteed to be 100% effective, both of them:
  a) communicate to students that cheating is unacceptable, and
  b) make it more difficult to cheat.  

Please share your thoughts on student-created notes and anti-cheating efforts. 

Replacing misconceptions and myths

I just finished reading an article in The Chronicle of Higher Education about why incorrect or inaccurate ideas and information persist even in the face of overwhelming contradictory evidence. The story is “Why Lies Often Stick Better Than Truth.”

The educational research literature on misconceptions has clearly shown that learning can be significantly impacted by contrary pre-existing beliefs and conceptions.  What is really great about this article is that it provides a link to the Debunking Handbook, which has some excellent suggestions that faculty should find interesting.  It’s a quick and very useful read!

Cook, J., & Lewandowsky, S. (2011). The Debunking Handbook. St. Lucia, Australia: University of Queensland.

Part-time Faculty welcome at Penn State’s teaching center

The article below, from The Chronicle of Higher Education, reports that part-time faculty (also known as adjuncts) feel that they lack instructional resources.  Please help us get the word out to all part-time faculty teaching Penn State students that they are welcome to work with the Schreyer Institute for Teaching Excellence instructional consultants, participate in our programs, and access our resources!

Adjuncts’ Working Conditions Affect Student Learning, Report Says
By Audrey Williams June
Short-notice hiring and a lack of instructional resources are major impediments to effective teaching, says the report, based on a survey of adjuncts last fall.

“Syllabus,” a new resource that may help with “grade inflation” investigations

A recent article in the Chronicle, “A New Journal Brings Peer Review to the College Syllabus,” tells us about a journal called Syllabus.  Not only are the example syllabi a good source of ideas, but the journal also has the potential to be an excellent resource for faculty who want to calibrate their syllabi against others’.

Why would you want to calibrate your syllabi?  I sometimes recommend this course of action to faculty and administrators concerned about “grade inflation.”  Accusations of grade inflation are typically based solely on the preponderance of A-grades.  While skepticism is understandable, rarely do critics provide substantive evidence that those A-grades are undeserved. 

Two common assumptions underlying claims of grade inflation are:

  1. Grading standards are not high enough
  2. Students are not being asked to do enough work for an A-grade

Comparing syllabi is one way to investigate both of these concerns.  If a faculty member is told by colleagues that her course is “too easy,” she can investigate whether faculty teaching similar courses at other institutions use a similar grading scale.  If this faculty member is using 80% as the boundary between an A and a B, but everyone else is using 90%, then she might indeed be viewed as too lenient.  However, if her standards for 80% are equivalent to another faculty member’s expectations for 90%, then she may be able to justify her grade distribution, and student work may provide supporting evidence that her grades are not inflated.

Final Exam Resources for Students and Faculty

Lynn Sorenson, a leader in instructional development circles, recently shared some resources about final exams–one for students and one for faculty–from the BYU Center for Teaching & Learning.  They were developed by Michael C. Johnson and Scott Schaefermeyer and their teams.

Let us know what you think.

   How Do You Have a Successful Finals Week?
       http://ctl-dev.byu.edu/learning-tips/how-do-you-have-successful-finals-week

   Final Exam Options for Faculty
       http://ctl-dev.byu.edu/teaching-tips/final-exam-experience

Student Ratings of Teaching Effectiveness: Online SRTEs

This posting is prompted by a recent comment from a faculty member expressing concern about the decreased response rates seen with Online SRTEs.  A number of faculty over the past year have also communicated that they’ve heard that the Schreyer Institute says the response rates for SRTEs have not decreased.  This is incorrect.

Instead, what we have communicated is that
a) the response rate decrease was expected, but
b) average ratings have held steady.

Our research has shown that response rates have decreased by 20-25%.  We fully expected a decrease because the assessment moved out of the classroom, so students are no longer a “captive audience.”  In fact, the decrease we have experienced at Penn State is smaller than we expected and much smaller than other institutions have experienced.  The average response rates by college and campus are posted online (see http://www.srte.psu.edu/OnlineReports/), and all are above 50%.

The reports above also show that the average scores have held steady.  At the request of the University Faculty Senate’s Committee on Faculty Affairs and the Vice Provost for Academic Affairs, we have been monitoring the SRTE results.  Other than the overall decrease in response rates, we have found no patterns that can be attributed to the online SRTEs.  In other words, we see no evidence that faculty have been ‘harmed’ by the decreased response rates.  Vice Provost Blannie Bowen has indicated that unless we see a significant negative impact on faculty, we will continue with the online administration of the SRTEs.

We have requested that campus and college administrators communicate to faculty review committees and academic unit heads that response rate decreases not be over-interpreted or attributed to the actions of any individual faculty member.

If a faculty member believes that his or her results are the exception to the trend, i.e., that average scores have decreased solely because of the online administration, that concern should be communicated to the academic unit head.  The University Faculty Senate’s Statement of Practices for the Evaluation of Teaching Effectiveness for Promotion and Tenure states that “If there is some reason to explain the results or the absence of results in a particular case, the appropriate academic administrator shall make a note to that effect in the dossier” (see p. 3, section I.A.11.a.2).

Please remember that the paper SRTEs had more than 20 years to become embedded in student culture.  We need to give the online SRTEs some time too.  We are still in the transition phase where many of our current students experienced paper SRTEs and now are having to do them online.  It will be a few years before all students have only submitted SRTEs online.  Institutions that have been administering student ratings online for far longer than we have indicate that the response rates do rebound.

The Schreyer Institute is currently gathering information from faculty who have response rates of 70% or higher (see http://www.srte.psu.edu/ResponseRate/), and we will continue to add to this site.  The faculty included in this project come from a wide variety of colleges, campuses, and disciplines, and they teach courses with enrollments ranging from 30 to 493.  We deliberately excluded graduate and upper-level majors courses, in which it might be easier to get higher response rates.

What this project indicates is what we’ve always known: faculty are the most important determinant of students’ participation.  In short, students make the effort to submit ratings when they feel that “someone is listening.”  The faculty with the highest response rates communicate that they value students’ views, that they take suggestions for improvement seriously, and they tell their students about changes made as a result of student feedback.  Some faculty do this by talking about the SRTEs; others never mention SRTEs but instead gather student feedback throughout the semester, which creates a culture of feedback among their students.  None of these faculty discuss the fact that SRTEs are required for P&T and reappointment.

How valuable are teaching centers? Stepping Up for Outcomes Assessment.

I just returned from the 2012 Annual Meeting of the American Association of Colleges & Universities (AAC&U) in Washington DC.  I was honored to be a panelist in a session focused on the role of teaching centers in institutional transformation.  The panelists provided examples of how teaching centers are collaborating with other units to advance institutional change.  My fellow panelists included Phyllis Worthy Dawkins, Provost and Senior Vice President of Dillard University; Peter Felton, Assistant Provost at Elon University; and Virginia Lee, a consultant with her own company.  All of us are very involved in the professional society for faculty developers in higher ed (podnetwork.org).

My contribution was to briefly talk about the role of the Schreyer Institute for Teaching Excellence in Penn State’s program and student learning outcomes assessment initiatives.  This led me to ponder why I think it is so important for us to take on both leadership and collaborative roles at Penn State.  My short answers:

  1. Teaching centers and faculty developers have valuable knowledge and skills to offer the teaching community;
  2. If we don’t collaborate and lead, the university is at risk of losing a valuable resource because no one will know how valuable we are!

If no one knows how valuable you are, how valuable are you really?

One way for the institution to recognize the value of teaching centers is for us to step up to the plate and take on areas, tasks, and projects that either no other unit wants or that are likely to be difficult (another panelist talked about Gen Ed revision!).

Assessment is a case in point.  Course, Program, and Institutional Assessment offer a great opportunity to further establish our value. 

So, what did Stepping Up for Outcomes Assessment entail?  First, it did not involve us in the role of assessment enforcer, nor did it involve us gathering and interpreting evidence.  Instead, we have helped faculty and administrators responsible for student learning outcomes assessment to meet their obligations.

A colleague at the University of Washington (J. Turns) coined the phrase “Assessment?  I hate it.  What is it?”, which captures what we’ve done quite well!

We decided to step up and provide:

Information.  When first entering the assessment arena, faculty and administrators have lots of questions (Why do we have to do this?  Why is it important?  Why don’t course grades count?  What exactly am I supposed to do?!?  Is our disciplinary accreditation evidence sufficient?)

Opportunities for intra- and interdisciplinary discussions via workshops, conferences, and meetings

Guidance about the process of assessment

Examples (goals, outcomes, plans)

Templates (curricular mapping, identifying and developing goals and outcomes, reports)

Feedback on assessment plans

Success stories

As we became more involved, we took on maintenance and further development of the University’s assessment website (assess.psu.edu), which has become the “go-to” place for information and updates on the Penn State approach to assessment.  (We also regularly hear from colleagues at other institutions about how they have used these resources.)

Stay tuned!  The Schreyer Institute and Penn State’s assessment story continues to evolve and mature.  And never forget, we are always looking for new opportunities to become even more valuable to our community of teachers and learners. Visit or contact us.

Faculty: Please remind your students about the University Teaching Awards

Recently, a reminder appeared in the Penn State Newswires that nominations for the Undergraduate Teaching Awards are accepted year-round at http://www.schreyerinstitute.psu.edu/AwardsForm/. 

Our students take courses with some pretty stellar teachers here at Penn State.  During this difficult semester, that is a good thing to remember. 

How about saying something like this:

“I know some of my faculty colleagues are among the best teachers here at Penn State.  Have you considered nominating one of them for an Undergraduate Teaching Award?”

Perhaps you could suggest that if they’ve recently given positive ratings to one of their other teachers, that faculty member might also deserve a teaching award.

Or, as a last resort, you could remind them simply because I’m asking for your help reaching out to students.  After all, students spend the most time experiencing faculty teaching excellence–we need to hear from them!

If you can’t remember the URL above, it is the first link listed in a search for “teaching awards” from the Penn State homepage.

Difficult Dialogues in the Classroom: Resources for Penn State Teachers

NEW!!! For more information, see “Difficult Dialogues” 

In light of the tragic events continuing to unfold at Penn State, we at the Schreyer Institute for Teaching Excellence recognize that there are many faculty and graduate student instructors who, while experiencing their own shock, anger, and sadness, are also looking to help their students process these events in a thoughtful and productive way.

Teachable moments can be found in this terrible situation, and they can provide much-needed support and healing through the learning process. Destiny Aman, with input from a number of other Institute consultants, developed the following tips for teachers interested in incorporating difficult dialogues into their courses.

  1. Set Ground Rules. Encourage students to practice empathetic listening, use “I-statements,” and avoid personal attacks.
  2. Start with a guided reflective writing exercise – give students a chance to write about what they’re feeling and experiencing, but also incorporate questions to stimulate critical thinking.
  3. To give all students a chance to participate, and reduce the chance that a few individuals will dominate discussion, incorporate think-pair-share activities, dyads, or other discussion techniques that allow students to talk and process their ideas in smaller groups prior to speaking in the larger class setting.
  4. To the degree possible, connect the situation to course material and learning goals (keep in mind that while this might not directly relate to your course content, such discussions do overlap with learning goals such as critical thinking, reflection, and peer-learning).
  5. Recognize your own experience and role in the dialogue. Do not respond angrily or shut down students whose positions you disagree with – this will result in defensiveness and have a negative effect on student learning.
  6. End the session with a Critical Incident Questionnaire, and follow up with the topic as needed during the next class session or via email.

Even in classes where course content does not overlap with these events, the learning environment will continue to be affected as news breaks on a daily (or even hourly) basis. Many students are struggling emotionally and psychologically, and may find it difficult to focus on course material. All students should be made aware of resource centers on campus where they can find a supportive, safe, and productive space to process their experience. These include:

Counseling & Psychological Services
Center for Women Students
LGBTA Resource Center
Pasquerilla Spiritual Center

As always, the Schreyer Institute for Teaching Excellence stands with all who teach at Penn State, and especially now as you work through this difficult time. If you have questions, concerns, or would like more suggestions related to teaching at Penn State, we are happy to schedule an individual consultation.

Reacting to the Past

Are any faculty at Penn State using Reacting to the Past?  I just learned about it, and it sounds fascinating.  This is a program developed at Barnard College by anthropologists and historians that involves students in role-playing games as a way to learn not only about important events and transition points in history, but also to develop writing and communication skills. The games emphasize historical contingency, but can also highlight the role that individual people can play in shaping history.

An article about it by David Walsh, editor of George Mason University’s History News Network, indicates that it is flexible enough to allow faculty to emphasize different aspects of history, historical scholarship, and even different learning objectives. 

Description from the Reacting to the Past website:

“Reacting to the Past (RTTP) consists of elaborate games, set in the past, in which students are assigned roles informed by classic texts in the history of ideas. Class sessions are run entirely by students; instructors advise and guide students and grade their oral and written work. It seeks to draw students into the past, promote engagement with big ideas, and improve intellectual and academic skills.”

I’ve not had a chance to delve deeply into it, but I wonder if any faculty members have taken this into cyberspace–it certainly seems to have potential.