I’m working through a great article that deals with learning in a web 2.0 world called Minds on Fire: Open Education, the Long Tail, and Learning 2.0. John Seely Brown and Richard Adler move through the changing dynamics of teaching and learning in a social, web 2.0 space, and talk specifically about social learning and the changing pedagogies it brings.
Ann pointed me to an interesting article from Inside Higher Ed that deals with adding a human element to online learning. Douglas Hersh is Dean of Educational Programs and Technology at Santa Barbara City College and shares his experiences going from Blackboard to Moodle, specifically focusing on a real-time video application they now run inside the open source system.
Hersh contends that this ‘human element’ via the real-time video and audio (implemented with Skype) helps keep students involved and adds a much needed level of social presence to the course experience. While I do agree with most of this, the article also draws in another very good point from Reggie Smith, president of the United States Distance Learning Association. Smith reinforces the idea that while real-time video and audio can be great, helpful tools, it’s the overall design of the course that has the larger impact and dictates success or failure.
Also worth noting is that Hersh conducted research on his own course students (n=145) looking at satisfaction and a few other variables. His research showed that students were more satisfied when using the video-rich features that allowed direct, real-time interaction. I applaud all researchers’ efforts toward this type of data collection and reporting, but I am starting to find that the instructor plays such a big role in the outcome of this type of research and represents a variable that is very difficult to control. I’ve seen folks here at PSU try to replicate methods from different professors but end up with vastly different results. The context or discipline of the course also has a big impact on this, especially when we examine tool use.
Time to get back to work on that meta-review of technology-enhanced learning environments!
I’m in the process of collecting recent articles on technology and how it can be used collaboratively for learning, specifically seeking out meta-analysis articles and articles on blogs in education. I came across a nice meta-analysis today from Resta and Laferriere (2007) titled “Technology in Support of Collaborative Learning”.
Throughout the article, the authors touch on research in this area over the last 20 years, interjecting recommendations for future research. One of the recommendations that stands out:
“Future computer supported collaborative learning (CSCL) studies should focus less attention on the question of whether CSCL is better than face-to-face collaborative learning, but rather focus on what is uniquely feasible with new technology…and the different ecologies and affordances of CSCL environments and tools that are diverging further and further from face-to-face learning environments.”
I agree with this assessment, especially coming off a dissertation that compared two different technology environments, trying to identify which one is ‘better’ for certain tasks. I wonder if I should have spent less time comparing, and more time examining the ‘unique feasibility’ of each platform in specific contexts. The second part of the quote about different ecologies and affordances also rings true. Many of the collaborative tools used today, like Facebook, Flickr and YouTube, have immense potential for learning, but each one has a very specific ecology that makes it unique for specific contexts. As these technologies mature alongside the LMS/CMS environments we are now using, will the gap between the two grow wider or smaller?
We are working with the Penn State World Campus on the creation of a course aimed at faculty and other PSU folks that are tasked with creating an online course. We are currently in the design process of figuring out what the course should look like and determining what topics we should be covering.
What better way to help us flesh this out than to ask the community? I received several responses from the Penn State Excellence in Teaching Ning community, members of the Schreyer’s Institute as well as past colleagues that were tasked with creating online courses.
After doing a very quick and dirty analysis, the comments fell into about 14 categories. Some of the categories had a bit of an overlap (for instance, “design” might encompass things such as “content quality” and “scaling”). Based on the analysis, four categories received 6 or more mentions (n = 16).
1. Technology – Not many surprises here. Faculty asked about the appropriateness of asynchronous tools vs. synchronous tools in certain contexts, how to keep up with changing technologies and how to use technology for group work, assignments and meeting online.
One interesting theme that emerged was ‘what can I use vs. what should I use?’ Without having more context, I can only guess at what faculty were referring to but I have a general idea this question ties to policy and maybe even FERPA. Sometimes faculty want to use certain technologies, but are afraid that they might be breaking rules or policies.
2. Time management – Many faculty asked about the time it takes to develop an online course. In terms of teaching courses, one respondent said “it feels like a 24/7 endeavor”. Both designing and teaching an online course are very large time commitments, especially when maintaining a high level of quality.
3. Collaboration – I was somewhat surprised at the frequency with which collaboration was mentioned. Questions dealt with the types of personnel faculty would collaborate with, who sets the deadlines and timelines, what instructional designers are, and why faculty have to work with a team.
4. Design – This could have been number one or last on the list, depending on how inclusive you want to get with the word ‘design’. Some questions involved converting a face-to-face course to online while others mentioned how to migrate certain types of assignments meant for synchronous, face-to-face courses to an online course.
Overall, I did not find many surprises during the small data analysis aside from the collaboration questions. The collaboration question is a tough one to answer, as most units have their own workflow and personnel that work with faculty on online course design and development.
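For the curious, the quick-and-dirty tally behind the four top categories can be sketched in a few lines of Python. The labels and counts below are hypothetical stand-ins rather than the actual survey responses, which I’m not publishing here; the point is just the shape of the analysis, counting mentions per category and keeping those at or above the cut-off.

```python
from collections import Counter

# Hypothetical re-creation of the tally. These category labels and
# counts are illustrative stand-ins, not the real survey data.
mentions = (
    ["technology"] * 7
    + ["time management"] * 6
    + ["collaboration"] * 6
    + ["design"] * 6
    + ["assessment"] * 3
    + ["content quality"] * 2
)

counts = Counter(mentions)

# Keep only categories with 6 or more mentions, mirroring the cut-off
# described above.
top = [(cat, n) for cat, n in counts.most_common() if n >= 6]
print(top)  # four categories make the cut
```

With real data, the only change would be building `mentions` from the coded responses instead of hard-coding it.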