Monthly Archives: May 2011

Show and Tell: Teaching with Technology

This list is updated several times a year and is used for several Schreyer presentations and seminars. If you have good examples of technology being used for education, please feel free to add them to our list.

I created a one-page handout, distributed during seminars, that often accompanies the examples on this page.

Yammer is a social network tool that was originally created for business use. Nearly 80% of Fortune 500 companies use Yammer for internal communication and collaboration. Penn State recently adopted Yammer across the University, both for staff and faculty collaboration and for course support. Several faculty are piloting Yammer to support course work in fall 2012 across a wide variety of disciplines.

The primary purpose most faculty gravitate toward when using Yammer is collaboration, with an emphasis on teams. Yammer looks a lot like Facebook and is very user-friendly in terms of facilitating discourse. Yammer also includes a file repository and a “Pages” tool that allows students to collaboratively create documents.

WordPress is an open source web publishing platform recently adopted by Penn State. The functionality of WordPress is very similar to our current blogging platform, blogs.psu.edu. WordPress does offer some additional functionality, more customization options and a cleaner, easier-to-use interface. In terms of pedagogy, instructors are using WordPress in similar ways to the blog platform.

Blogs (blogs.psu.edu)
The focus here is on various pedagogies as they relate to blogging, as well as introducing the audience to the PSU blog platform. Here’s a link to the Getting Started Guide, and the full handbook (pdf).

Examples:

  • SITE Blog: institutional, multi-author blog.
  • Philosophy blog: faculty using a personal blog as a multi-author blog, adding students. NOTE that this is a single blog that the entire class can create entries on. This has been a very successful model for Chris Long as it pertains to creating a blogging community of users (vs. everyone having separate blogs). See Chris’s rubric for assessing student blog use.
  • SCIED 458: the instructors who co-teach this course use this site as the hub of the course, in addition to the blog. Illustrates the power and flexibility of the PSU blog platform to also power websites.
  • ePort: Good example of an eportfolio, something Education is leveraging a lot these days for students that will eventually go out and become teachers.

Media Commons (mediacommons.psu.edu)
The focus here is on the services offered via Media Commons and Media and Technology Services (MTSS) at PSU. Key points include:

  • The ability for students to rent equipment from MTSS
  • Media Commons facilities and professional staff that assist students in editing video
  • Linking video projects to learning objectives and pedagogy

Examples: 

  • French 401: interesting example of a video that might replace a student presentation in class.  This method can open up more class time for instructors (vs. losing class time due to many student presentations).

Educational Gaming Commons (EGC) (gaming.psu.edu)
The focus here is on the link between games and education, and how the EGC can support faculty along these lines. They offer things like engagement awards, where faculty can submit short proposals to have games custom-built for a course. The EGC lab also offers software and consoles that students can use for course-related assignments, and the room can be reserved for course use.

Examples:

  • Lab: 6A Findlay Commons @ University Park
  • Chemblaster: an example of a game built for introductory chemistry courses.
  • SimHealth: using an off-the-shelf software package for health policy.
  • EconU: A game used to teach intro economic concepts.
  • Gamification: This is actually a blog, but it’s using a gamification strategy.  On this page, students can see who has the most commented-on blog entry.  This is, in essence, a leaderboard and worked well in this course to motivate students to contribute more meaningful posts.

Training Services (its.psu.edu/training)
Training Services primarily deals with technology training, both face-to-face and online. The website typically lists all the training programs each semester. Topics that faculty might be interested in include ANGEL, blogs, podcasting and video.

Examples:

  • Lynda.com videos: through a university-wide contract, all PSU employees and students have access to Lynda.com technical training.  This website has thousands of video-based training modules on a vast number of technologies, primarily software packages.

Wikispaces (wikispaces.psu.edu)
I typically focus on teamwork and collaboration as it applies to Wikispaces, but also stress the open nature of the platform to support many different uses (for instance, collaboration on grants). 

Examples:

  • Biology 110: course delivery
  • IST 440w: research collaboration – requires special permission (email pursel[at]psu.edu)
  • PSY 525: course hub

Flipping the Classroom
Education Technology Services is currently working with several faculty around the idea of flipping the classroom. In this model, students primarily watch recorded lectures or interact with other one-way informational resources outside of class, and in-class time is then used for various active learning activities. Penn State provides several resources on this model.

VoiceThread (voicethread.psu.edu)

VoiceThread is a relatively new tool available to all Penn State users, although several folks from PSU have used the service for years. VoiceThread allows you to create media-rich dialogs using text, images and video. One of the advantages of VoiceThread is that it runs within a web browser, so you don’t need to worry about downloading any specialized software.

Examples:

Lecture Capture (capture.psu.edu)
The University is currently piloting various lecture capture technologies: specialized software for recording lectures and making them available online in video format. Currently no public examples from Penn State are available, but several instructors are recording their lectures and making them available to students as course resources. You can learn more about lecture capture, specifically what variables people are examining in this field, through this document drafted by the Schreyer Institute and ETS.

Clickers (Student Response System)
Penn State supports i>clicker, a specific type of clicker leveraged by a wide variety of faculty at University Park, often in large lecture courses. 

iTunes University (itunes.psu.edu)
This is an iTunes interface to a large collection of Penn State podcasts and related files and resources. In addition to many PR-related podcasts, such as interviews with faculty and coaches and recruiting messages, faculty are using iTunes University to record lectures, post interviews with industry professionals and have students produce podcasts as class assignments. Most courses are not accessible, but the site does provide some ‘open’ and demo courses to view.

Post last updated: 10/16/12

Student Engagement

At this year’s Teaching and Learning with Technology Symposium, a main theme was community engagement.  Leading up to the Symposium, several units and individuals around campus created videos and media to help answer the question “What does community engagement mean to me?”

This is a Prezi that several folks from the Schreyer Institute worked on to answer that question from the Institute’s point of view.


Assessing Teamwork

Many of us incorporate some level of teamwork in our courses. I typically teach in the College of Information Sciences and Technology, where nearly every course has some level of teaming. In my course in particular, IST 446, over half of the points are based on team assignments. With this much emphasis on teaming, it is often difficult to assess team work fairly. I tried something a bit different this semester, and want to share what I felt were the pros and cons of my method.

First, I used the Comprehensive Assessment of Team Member Effectiveness (CATME) system that originated at Purdue University. This is a web-based tool that allows faculty (regardless of university) to upload a course list with team data and create assessments in which students rate one another on 13 different categories related to teaming (all drawn from teaming research and literature). This year, I used the following categories:

  • Contribution to work
  • Interacting with teammates
  • Keeping Team on track
  • Expecting quality
  • Team satisfaction

The system generates emails to all students who need to complete team evaluations. This is a 40-point assignment in my course, requiring each student to log in to the system and evaluate their teammates. I tell them that if they complete the surveys for each team member, they always start out with a 40/40. Then, based on their peers’ evaluations, I make adjustments to the assignment grade. Here are the steps I used:

  1. I created a ‘team average’ based on the scores of each team member. For instance, if we had 4 people per team, I simply added up their total scores and divided by 4 for the average.
  2. Next, I compared each individual’s average to the team average. Individuals who fell a whole point or more below it (on a 5-point scale) received a letter-grade decrease for the assignment. Assuming the team average was a 4.2 and someone received an average of a 3.1, that student just went from an “A” to an “A-”.
  3. I then went in .50 intervals, so if the student had, for example, a 2.5, I would drop the student to a “B+”.
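The three steps above can be sketched as a small function. This is just an illustration: the letter-grade ladder and the exact cutoff arithmetic are my assumptions, since the post only gives two worked examples.

```python
# A sketch of the three grading steps above. The thresholds match the post;
# the letter-grade ladder and the rounding choices are my own assumptions.

GRADE_LADDER = ["A", "A-", "B+", "B", "B-", "C+", "C", "C-", "D", "F"]

def adjusted_grade(individual_avg, team_avg, step=0.5):
    """Full credit within a point of the team average; one letter-grade step
    down at a whole point below it, plus one more per `step` interval beyond."""
    deficit = team_avg - individual_avg
    if deficit < 1.0:                       # within a point: no penalty
        return GRADE_LADDER[0]
    drops = 1 + int((deficit - 1.0) // step)
    return GRADE_LADDER[min(drops, len(GRADE_LADDER) - 1)]

# The worked examples from steps 2 and 3:
print(adjusted_grade(3.1, 4.2))             # A-
print(adjusted_grade(2.5, 4.2))             # B+
```

Passing `step=0.25` instead shows the revision proposed at the end of this post: the same 2.5 average would then drop one step further, to a “B”.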

Overall, this was good in concept but I implemented it poorly. Students who were rated poorly by their peers brought the team average down very low, in essence making it very difficult for anyone to fall a whole point below that average. If I did this again, I would probably use .25 intervals to adjust grades.

Also, I did not find a way to incorporate the qualitative comments the system collects. On one team, for example, 3 of the 4 members commented on the lack of participation of a single group member. My quantitative method did drop that student from an “A” to an “A-” for the assignment, but it was clear to me that I should have dropped the student further. I simply had no standardized method for letting written feedback affect the grade.

Lastly, the CATME system does a lot of interesting analysis for you, and highlights specific students that meet certain criteria. For instance, one student was flagged as “Overconfident – The team’s average rating for this student is less than 3, and the student has rated themselves more than one point higher on average than this rating.” Another student was flagged “Personality Conflict – This student has rated one or more team members a 2 or less, but the median rating of the student(s) by the other team members gives a score of at least 3. Perhaps this student just didn’t get along with the student(s) that received poor ratings?”

Finally, several students were marked as “High Performer – This condition indicates that the average rating for this student by the other members of the team is more than half a point higher than the overall average rating of the team. The student’s average rating must be higher than a 3.5 to qualify.”
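The three flags quoted above can be sketched in code. The data layout (a nested dict of rater → ratee scores on the 5-point scale) and every name below are my assumptions for illustration, not CATME's actual implementation.

```python
from statistics import mean, median

def catme_flags(ratings):
    """ratings[rater][ratee] = score the rater gave the ratee (self-ratings included)."""
    students = list(ratings)
    # Average rating each student received from the *other* team members.
    peer_avg = {s: mean(ratings[r][s] for r in students if r != s)
                for s in students}
    team_avg = mean(peer_avg.values())
    flags = {s: [] for s in students}
    for s in students:
        # Overconfident: peer average under 3, self-rating more than a point higher.
        if peer_avg[s] < 3 and ratings[s][s] > peer_avg[s] + 1:
            flags[s].append("Overconfident")
        # High Performer: more than half a point above the team average, and above 3.5.
        if peer_avg[s] > team_avg + 0.5 and peer_avg[s] > 3.5:
            flags[s].append("High Performer")
        # Personality Conflict: s rated someone 2 or less whom the rest rate at least 3.
        low_rated = [t for t in students if t != s and ratings[s][t] <= 2]
        if any(median(ratings[r][t] for r in students if r not in (s, t)) >= 3
               for t in low_rated):
            flags[s].append("Personality Conflict")
    return flags

# Hypothetical 4-person team: dave under-contributes but rates himself highly.
ratings = {
    "alice": {"alice": 5, "bob": 4, "carol": 4, "dave": 2},
    "bob":   {"alice": 5, "bob": 4, "carol": 4, "dave": 2},
    "carol": {"alice": 5, "bob": 4, "carol": 4, "dave": 2},
    "dave":  {"alice": 3, "bob": 3, "carol": 3, "dave": 4},
}
# Here dave is flagged "Overconfident" and alice "High Performer".
print(catme_flags(ratings))
```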

My method this year didn’t do a good enough job penalizing poor performers, or rewarding high performers.  I think moving to .25 increments from the team mean will help me better penalize poor performers, but I’m still not sure how to best reward high performers based on the current structure of the assignment. Ideas?