Monthly Archives: April 2012

Free Range Learners

Ever wonder why students don’t read your carefully-chosen course materials and instead look for other sources of information on the web?

I personally have always found this habit annoying. A recent (albeit informal) study published in the Chronicle of Higher Education suggests students have good reasons for treating the web as their textbook. Most notably, students surfing the web may be looking for the same information found in their textbooks, but in a format that is more understandable to them.

This strikes me as an area ripe for further inquiry. I mean, who hasn’t looked for another source of information when a given source doesn’t make sense?





Getting Inspired at the Undergraduate Exhibition

Undergraduates get a lot of bad press these days. I guess that has always been true, but sometimes the laments about the “millennial generation” seem especially loud. So I found it a useful counterbalance to serve as a judge for this year’s Undergraduate Exhibition.

Picture the scene yesterday in the HUB-Robeson Center’s Alumni Hall: Dozens and dozens of research posters produced by Penn State undergraduates, with representation from the arts and humanities, engineering, health and life sciences, physical sciences, social and behavioral sciences, and course-based projects. I was a judge for social and behavioral sciences and got a chance to talk with some extremely smart and articulate undergraduates.

Strolling around afterward, I enjoyed seeing the variety inherent in the research — everything from posters on biosynthesis of Thiostrepton A to analysis of a poet’s oeuvre to an examination of ways that infants’ crawling behaviors affect their communicative development.

It was a humbling and inspiring way to spend an hour.

Learning Analytics: Tread Carefully

Over the last 8-10 months, a handful of folks from the Schreyer Institute, Teaching and Learning with Technology, and the Office of Institutional Planning and Assessment have discussed and researched the topic of learning analytics. If you've never heard the term before, the easiest way to explain it is by looking at companies like Netflix and Amazon. These companies leverage your personal renting or buying habits, comparing you to hundreds (or thousands) of similar users to provide recommendations on what to rent or purchase next. Learning analytics applies these same practices in support of education. Specifically, learning analytics is:

 “the use of analytic techniques to help target instructional, curricular, and support resources to support the achievement of specific learning goals” (Barneveld et al., 2012).
Many of these early efforts, such as the Signals project at Purdue University, live within a university’s course management system. These tools generate a risk assessment for each student in a course by taking into account the student’s demographic and historical data (age, gender, past GPA, SAT scores, etc.) and then combining it with CMS activity data (number of logins, grade book data, number of forum posts, etc.). A faculty member usually initiates the risk assessment and then receives a ‘risk level’ for each student. At this point, the faculty member can intervene with those at high risk of not succeeding in the course.
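To make the mechanics concrete, here is a minimal sketch of the kind of calculation such a tool performs. Everything in it is invented for illustration; the signals, weights, and thresholds are assumptions, not details of Signals or any real system.

```python
def risk_level(gpa, logins_per_week, forum_posts, gradebook_pct):
    """Toy risk score: a weighted blend of historical data and CMS activity.

    The weights and cutoffs below are made up for illustration; a real
    system would fit them to institutional data.
    """
    # Normalize each signal to the 0..1 range, where 1 is the strongest
    # success signal, then combine with (hypothetical) weights.
    score = (
        0.4 * min(gpa / 4.0, 1.0)
        + 0.2 * min(logins_per_week / 5.0, 1.0)
        + 0.2 * min(forum_posts / 10.0, 1.0)
        + 0.2 * min(gradebook_pct / 100.0, 1.0)
    )
    # Bucket the blended score into the 'risk level' a faculty member sees.
    if score >= 0.7:
        return "low"
    if score >= 0.4:
        return "medium"
    return "high"
```

An engaged student with a strong record (say, a 3.8 GPA, daily logins, active forum participation) lands in the low-risk bucket, while a student with a weak gradebook and no CMS activity surfaces as high risk, prompting an intervention.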
Overall, I think this is a wonderful idea, and the folks at Purdue ran a few studies that illustrate how effective this system can be at keeping students at a “C” or above. A new system from Austin Peay State University that exists outside of the CMS just crossed my desk. This system resides at the course registration level. When students log in to register for a course, they are presented with two pieces of information:
  • The highest-rated courses you should be taking (based on your major, semester course load, and other data)
  • Your predicted grade for the course (generated by comparing your transcript to 10 years’ worth of data from students with similar characteristics).
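The grade prediction can be pictured as a nearest-neighbor comparison: find the past students most similar to you and average their outcomes. The sketch below is purely illustrative; the features, similarity measure, and record format are my own assumptions, not details of the Austin Peay system.

```python
def predict_grade(student, history, k=3):
    """Predict a course grade from the k most similar past students.

    `student` is a (gpa, credits_completed) pair; each `history` record is
    (gpa, credits_completed, letter_grade). The features, the distance
    function, and k are all made-up stand-ins for illustration.
    """
    points = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

    def distance(a, b):
        # Crude similarity: GPA difference plus a scaled credit difference.
        return abs(a[0] - b[0]) + abs(a[1] - b[1]) / 30.0

    # Rank past students by similarity and keep the k closest.
    neighbors = sorted(history, key=lambda rec: distance(student, rec[:2]))[:k]
    avg = sum(points[grade] for _, _, grade in neighbors) / len(neighbors)

    # Map the averaged grade points back to a letter.
    for letter, cutoff in [("A", 3.5), ("B", 2.5), ("C", 1.5), ("D", 0.5)]:
        if avg >= cutoff:
            return letter
    return "F"
```

The averaging step is exactly where the self-fulfilling-prophecy worry bites: the prediction reflects what similar students did in the past, not what this student could do.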
This is a very interesting idea, but how will students interpret the data? Will they intentionally register for courses where the system predicts a “B” or above, even if they are not interested in the content? Will students register only for highly recommended courses and not pursue other interests? A colleague mentioned that this could quickly spiral into a self-fulfilling prophecy, with students taking the prediction data as fact and never deviating from the recommendations. Scary stuff.
With any learning analytics system, a key challenge will be educating and training the end users on how to best leverage the data. In many instances, that means being skeptical of the data, and using it as one of many different factors that contribute to a decision that, in the end, contributes to student success.

Final Exam Resources for Students and Faculty

Lynn Sorenson, a leader in instructional development circles, recently shared some resources about final exams–one for students and one for faculty–from the BYU Center for Teaching & Learning.  They were developed by Michael C. Johnson and Scott Schaefermeyer and their teams.

Let us know what you think.

How Do You Have a Successful Finals Week?

Final Exam Options for Faculty

Active Learning

I have always believed that students learn when they are willing to learn and take responsibility for their own learning. That does not mean, however, that the role of the instructor is unimportant. Instructors still play an important part in students’ learning journeys. Instead of training students to be passive learners, instructors can build an environment where students are eager to learn actively, facilitate students’ thinking and exploration, and create opportunities for students to transfer what they learn in class to real-life situations.

Here are two articles from Faculty Focus, written by Maryellen Weimer, on key principles and strategies of active learning. You may want to take a look and see whether any of them fit your teaching style and your students’ learning needs.

The articles are here:

Five Key Principles of Active Learning

Active-Learning Ideas for Large Classes: Simple to Complex

Getting to Know

While preparing Community in the Classroom this past week, Andrew Porter and I discussed the question:

How do we keep our students from feeling isolated in class?

Our first answer, in simple terms, was: Get-To-Know-Each-Other.

A classroom where students feel known enables them to feel that they belong.  Belonging breeds motivation and risk-taking.  Motivation and risk-taking generate learning, and learning is our goal.  But…

….getting to know each other means getting closer to the personal than can feel comfortable.

To help students down this road, have them answer low-risk, entertaining questions at some point during class; over the semester, this helps break the proverbial ice.  No matter the size of your class, these questions can get conversation going between pairs, rows, sections, everybody.  Ten minutes of getting to know each other may seem like a lot, but it can lead to much richer learning over the long term.

Below I’ve included 35 get-to-know-you-with-minimal-risk questions compiled and contributed by Heather Holleman, lecturer in English. Enjoy!

1.  What is the most interesting course you have ever taken in school?

2.  What is your favorite quotation?

3.  What is one item you might keep forever?

4.  What were you known for in high school?  Did you have any nicknames?

5.  If you could have witnessed any event in sports history, what would it be?

6.  What is something you consider beautiful?

7.  What was your first CD or song you played over and over again?

8.  What accomplishment are you most proud of?

9.  If you could be an apprentice to any person, living or deceased, from whom would you want to learn?

10.  What are three things that make you happy?

11.  What’s one movie you think everyone should see?  What’s a movie you think nobody should see?

12.  Who inspires you?

13.  What’s one thing you want to do before you die?

14.  Get in groups of three people.  What’s the most bizarre thing you have in common?

15.  Whenever you are having a bad day, what is the best thing you can do to help cheer yourself up?

16.  Have you ever experienced something unexplainable or supernatural?

17.  What was your best Halloween costume?

18.  You can choose the question you want to ask the class.

19.  What was the last thing you Googled?

20.  What YouTube video do you watch over and over?

21.  What’s the kindest act you’ve ever witnessed?

22.  Tell us one thing you know you do well (a talent?) and one thing you know you don’t.

23.  What is your favorite way to procrastinate?

24.  What is your favorite home-cooked meal?

25.  What was your favorite childhood toy?

26.  What do you do other than study?  What clubs are you involved in?

27.  What was your first job?

28.  Any brushes with fame?

29.  What’s the story behind your name?

30.  Do you believe in anything that most people might not believe in?

31.  I wish everyone would___________________

32.  What’s the best sound effect you can make?

33.  What’s the funniest thing you did as a kid that people still talk about today?

34.  What was the last thing you bought on eBay?

35.  Tell us something quirky about you. 


Student Ratings of Teaching Effectiveness: Online SRTEs

This posting is prompted by a recent comment from a faculty member expressing concern about the decreased response rates seen with Online SRTEs.  A number of faculty over the past year have also communicated that they’ve heard that the Schreyer Institute says the response rates for SRTEs have not decreased.  This is incorrect.

Instead, what we have communicated is that
a) the response rate decrease was expected, but
b) that average ratings have held steady. 

Our research has shown that response rates have decreased by 20-25%.  We fully expected a decrease because the assessment moved out of the classroom, so students are no longer a “captive audience.”  In fact, the decrease we have experienced at Penn State is smaller than we expected and much smaller than other institutions have experienced.  The average response rates by college and campus are posted online, and all are above 50%.

The reports mentioned above also show that average scores have held steady.  At the request of the University Faculty Senate’s Committee on Faculty Affairs and the Vice Provost for Academic Affairs, we have been monitoring the SRTE results. Other than the overall decrease in response rates, we have found no patterns that can be attributed to the online SRTEs.  In other words, we see no evidence that faculty have been ‘harmed’ by the decreased response rates.  Vice Provost Blannie Bowen has indicated that unless we see a significant negative impact on faculty, we will continue with the online administration of the SRTEs.

We have asked campus and college administrators to communicate to faculty review committees and academic unit heads that response rate decreases should not be over-interpreted or attributed to the actions of any individual faculty member.

If a faculty member believes that their results are an exception to the trend, i.e., that their average scores have decreased solely because of the online administration, that should be communicated to their academic unit head.  The University Faculty Senate’s Statement of Practices for the Evaluation of Teaching Effectiveness for Promotion and Tenure states that “If there is some reason to explain the results or the absence of results in a particular case, the appropriate academic administrator shall make a note to that effect in the dossier” (see p. 3, section I.A.11.a.2).

Please remember that the paper SRTEs had more than 20 years to become embedded in student culture.  We need to give the online SRTEs some time, too.  We are still in a transition phase in which many current students experienced paper SRTEs and now complete them online.  It will be a few years before all students have submitted SRTEs only online.  Institutions that have administered student ratings online for far longer than we have indicate that response rates do rebound.

The Schreyer Institute is currently gathering information from faculty who have response rates of 70% or higher, and we will continue to add to this collection.  The faculty included in this project come from a wide variety of colleges, campuses, and disciplines and teach courses with enrollments from 30 to 493.  We deliberately excluded graduate courses and upper-level courses for majors, in which it might be easier to get higher response rates.

What this project indicates is what we’ve always known: that faculty are the most important determinant of student participation.  In short, students make the effort to submit ratings when they feel that “someone is listening.”  The faculty with the highest response rates communicate that they value students’ views, take suggestions for improvement seriously, and tell their students about changes made as a result of student feedback.  Some faculty do this by talking about the SRTEs; others never mention SRTEs but instead gather student feedback throughout the semester, which creates a culture of feedback among students.  None of these faculty mention that SRTEs are required for P&T and reappointment.