18th Annual Sloan Consortium Conference on Online Learning: Reflections

Notable sessions

Ten Strategies to Enhance Collaborative Learning in an Online Course
October 10, 2012 – 3:00pm
David Wicks (Seattle Pacific University, US)
Andrew Lumpe (Seattle Pacific University, US)
David Denton (Seattle Pacific University, US)

  1. Design appropriate projects – they should require collaboration, have an appropriate length, and be complex or challenging; keep the Community of Inquiry (COI) framework in mind
  2. Choose suitable collaborative tools
  3. Plan teams – let students pick, consider requirements, homogeneous or heterogeneous?
  4. Use a collaborative script – http://tinyurl.com/collab-script
  5. Organize the project into phases with milestones (see the sketch after this list)
  6. Set individual and group deadlines – everyone has a voice
  7. Provide training for the technology
  8. Build in reflection on the process
  9. Assess individuals and the group after each phase, with lots of feedback
  10. Assess deliverables after each phase
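
To make strategies 5, 6, 9, and 10 concrete, here is a minimal sketch of a phased project plan with individual and group deadlines and assessment points after each phase. This is my own illustration in Python (the project title, phase names, and dates are invented), not something shown in the session:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Phase:
    """One phase of a collaborative project (strategy 5: phases and milestones)."""
    name: str
    individual_deadline: date  # each member submits their own contribution (strategy 6)
    group_deadline: date       # the team submits the shared deliverable (strategy 6)
    deliverable: str

@dataclass
class CollaborativeProject:
    title: str
    phases: list[Phase] = field(default_factory=list)

    def assessment_points(self):
        """Yield an individual and a group assessment per phase (strategies 9 and 10)."""
        for phase in self.phases:
            yield (phase.name, "individual", phase.individual_deadline)
            yield (phase.name, "group + deliverable", phase.group_deadline)

# Hypothetical example project
project = CollaborativeProject(
    title="Community needs analysis",
    phases=[
        Phase("Planning", date(2012, 10, 22), date(2012, 10, 26), "team charter"),
        Phase("Research", date(2012, 11, 5), date(2012, 11, 9), "annotated bibliography"),
        Phase("Report", date(2012, 11, 26), date(2012, 11, 30), "final report"),
    ],
)

for name, kind, due in project.assessment_points():
    print(f"{name}: {kind} assessment due {due}")
```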

October 11, 2012 – 8:50am
Sebastian Thrun (Udacity, Google, US)

  • mission: education for everyone
  • higher ed in crisis – cost (rising at twice the rate of inflation) and debt (the next bubble; Penn State was #1 in borrowing at $160 million last year)
  • knowledge checks embedded in the video
  • Salman Khan: separate teaching from credentialing
  • 160K classrooms of one
  • $1/student/class
  • adaptive learning – at their own pace, multiple paths, multi-dimensional assessment
  • impact on universities? faculty?

October 12, 2012 – 10:40am
Ray Schroeder (University of Illinois – Springfield, US)
Karen Vignare (MSUglobal, Michigan State University, US)

  • MOOC in three weeks?
  • success due to: internet, cost of tech, Great Recession
  • lots of new LMSs (Google, iTunesU, etc.)
  • University of the People – $500K from the Gates Foundation to get accredited
  • Factors to consider at scale: other languages and cultures, distributed engagement, assessment (machine-graded or peer review), gathering data, and emerging credentialing models (badges)
  • MSU looked to Metropolitan Agriculture – new program opportunity, their specialty, international need
  • Open content was a challenge (Creative Commons Attribution license)
  • Used WordPress, Adobe Connect
  • 160 hours in WordPress with another 2 months

October 10, 2012 – 9:00am
Ray Schroeder (University of Illinois – Springfield, US)
Michele Gribbins (University of Illinois – Springfield, US)
  • no PPT! use web tools like Google Sites instead – it worked OK… navigation and flow were awkward… what about a Google Docs presentation?
  • while not billed as a workshop… there were some interesting discussions

October 10, 2012 – 12:00pm
Amanda Rockinson-Szapkiw (Liberty University, US)

  • no significant difference between eTexts and traditional texts
  • eText users exhibited different cognitive strategies and aggregated their notes, which had a significant impact on how they studied for the course; they liked the search features and portability, but the 6-month loan was a negative, some elements weren't readable, and navigation quality varied from text to text; they want to see more interactivity


Session topics worthy of mention:

  • Rubrics – holistic vs. analytic; involve students in creating the rubric; start with observable and measurable outcomes; use 4-8 criteria; use an even-numbered achievement scale (2-6 levels), listed high to low; use percentages (see the sketch after this list)
  • Simulations – SimWriter authoring software; the traditional roles (writers, directors, actors, sound/film crews) can be covered by fewer people
  • Gaming – check out Lee Sheldon; start with fun; grading was a hassle; instructor buy-in is essential; they used both cooperative and competitive games
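
As a concrete (and entirely hypothetical) illustration of the rubric advice above – a handful of weighted criteria, an even-numbered achievement scale listed high to low, and percentage scoring – here is a minimal sketch in Python; the criteria, weights, and labels are my own, not from the session:

```python
# Hypothetical analytic rubric: 4 criteria with weights that sum to 1.0.
RUBRIC_WEIGHTS = {
    "Argument quality": 0.30,
    "Use of evidence": 0.30,
    "Organization": 0.20,
    "Mechanics": 0.20,
}

# Even-numbered (4-level) achievement scale, listed high to low,
# mapped to the fraction of each criterion's weight that it earns.
SCALE = {"Exemplary": 1.00, "Proficient": 0.75, "Developing": 0.50, "Beginning": 0.25}

def percentage_score(ratings: dict[str, str]) -> float:
    """Convert one student's per-criterion ratings into a percentage score."""
    return 100 * sum(weight * SCALE[ratings[criterion]]
                     for criterion, weight in RUBRIC_WEIGHTS.items())

ratings = {
    "Argument quality": "Proficient",
    "Use of evidence": "Exemplary",
    "Organization": "Developing",
    "Mechanics": "Proficient",
}
print(f"{percentage_score(ratings):.1f}%")  # 77.5%
```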

Overall

  • good use of Poll Everywhere – though presenters should allow time to respond and reflect
  • the Twitter applet for gathering questions worked well in the sessions where I saw it used
  • QR codes everywhere
  • over 34 Penn Staters attended
  • nice location: Disney World
  • the schedule of sessions was unwieldy and out of date – they should have used sched.org or something similar
  • initially at least, the electronic evaluations were accessible via QR code only
  • ePoster sessions were not well organized – they could be improved with table numbers and by listing locations in the catalog