NMC Webinar Presentations

http://catalyst.navigator.nmc.org/gallery/

Future of Learning Environments
Josephine Hofmann & Anna Hoberg

  • Workplace learning changing requirements
    • learning has to be a part of daily work (not rigid training)
    • just-in-time learning
  • Learning 2.0 – learning w/in working process, requires more from learners and from learning design
  • Focus for Design Grid
  • Management Audit Framework – looking at multiple aspects of a company before designing learning; they conduct interviews, then create a complex graphic to summarize their findings
    [screenshot: learning environment audit]
  • Learning 2.0 wrap-up: “assure future innovation, offers approaches for the demographic change, prepare employees for a dynamic, permanent changing environment”
  • big challenge: cultural change

Hands-on Info Tech Virtual Lab Powered by Cloud Computing
Peng Li, East Carolina University

  • HP catalyst project team
  • large distance education (DE) student population (about 100 students)
  • abstract: secure, scalable, remote lab learning environment allows for learning anytime and anywhere
  • installed HP servers, virtual labs, application image library
    [screenshot: virtual labs cloud computing diagram]
  • physical labs are too difficult to maintain
  • 1 server can replace multiple hardware computers
  • decentralized approach – students install their own labs; they need powerful computers, and instructors cannot monitor work or provide help
  • centralized approach – using multiple cloud systems, on-demand, highly scalable
    [screenshot: virtual lab project status]
  • virtualization is not simulation (SL is a virtual world simulator); these labs run real IT applications
  • reservation system on a blade server (see the hypothetical sketch after this list)
  • setting up and maintaining a cloud computing system is not easy
  • assessment: most students liked the virtual labs, which helped them understand topics and develop hands-on skills; it was easy to monitor work, seek help, and collect resource data
  • spreading due dates reduces load; most use happens in the evening
  • high-speed internet and Firefox are required
  • additional space and memory are required to support more students
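
The presentation did not include code, but the centralized, reservation-based pool is easy to picture with a small sketch. The following Python snippet is purely a hypothetical illustration (the class and function names are mine, not part of the ECU/HP project): it models a fixed pool of virtual lab machines and rejects bookings once the concurrency limit for a time window is reached.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    # Hypothetical sketch only: a reservation gate for a shared pool of lab VMs.
    # It illustrates why a centralized pool scales better than per-student installs
    # and why spreading due dates (and therefore reservations) reduces peak load.

    @dataclass
    class Reservation:
        student: str
        start: datetime
        end: datetime

        def overlaps(self, other: "Reservation") -> bool:
            return self.start < other.end and other.start < self.end

    @dataclass
    class VirtualLabPool:
        capacity: int                          # VMs the server can run at once
        reservations: list = field(default_factory=list)

        def book(self, student: str, start: datetime, hours: int) -> bool:
            """Accept a booking only if the time window still has free capacity."""
            request = Reservation(student, start, start + timedelta(hours=hours))
            concurrent = sum(1 for r in self.reservations if r.overlaps(request))
            if concurrent >= self.capacity:
                return False                   # window full; pick another slot
            self.reservations.append(request)
            return True

    if __name__ == "__main__":
        pool = VirtualLabPool(capacity=2)
        evening = datetime(2012, 3, 1, 19, 0)
        print(pool.book("alice", evening, hours=2))   # True
        print(pool.book("bob", evening, hours=2))     # True
        print(pool.book("carol", evening, hours=2))   # False – peak window is full

The last line is the whole point of the "spread due dates" bullet: if everyone books the same evening slot, a fixed pool saturates, so staggering deadlines spreads the reservations out.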

Computational Chemistry Infrastructure
Isaac K’Owino

  • audio problems – great opening video
  • virtual chemistry tools VLab 1.6.4 and ChemLab 2.0
  • http://www.modelscience.com/products.html?ref=home&link=chemlab
  • grad, undergrad, and HS students work together
  • encourages hands-on experience
  • students don’t need physical labs when they can learn with these virtual labs
  • awesome collaboration and opportunities to make huge impacts
  • http://irydium.chem.cmu.edu/find.php

Reflections
I logged on today specifically to tune into the presentation on virtual labs powered by cloud computing. It was a very interesting presentation, and I wonder whether there are aspects of this project that we could benefit from here at IST or elsewhere around PSU. We’re already using virtual labs at IST, but I’ve heard that scalability is an issue and that there are concerns we’re starting to use demonstrations instead of virtualization.

The project that really grabbed my attention was the last presentation from Isaac in Kenya. Although there were audio problems to begin with, I was impressed with the work Isaac has been doing with collaboration from around the world and the awesome impact they appear to be having with HS, undergrad and graduate-level students.

NMC Learning Analytics Online Workshop Reflections

Event Listing http://www.nmc.org/events/learning-analytics-webinar
On iTunesU http://itunes.apple.com/itunes-u/nmc-horizon-connect-learning/id489707147

NERLA – NorthEast Regional Learning Analytics
David Wedaman, from Brandeis, and others are working on creating a Learning Analytics Center that will open resources for schools. I wonder if IST and Shelby Thayer with Outreach would be interested in this.

Tom Haymes, Houston Community College – Lessons learned and take-aways

  • “Garbage in, garbage out” led to a discussion of “what is learning and how do we want to measure it?”
  • You gotta start “right” in order to get something useful out of the project
  • Measure skills rather than knowledge
  • Gamification tie-in [Tom mentioned someone, but I missed the name, will ask for the contact]
  • Technology won’t be expensive, the planning and analysis will be
  • His project is going to be open source via a Gates grant!
  • The Three-E Strategy for Overcoming Resistance to Technological Change http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/TheThreeEStrategyforOvercoming/163448

Amber Stubbs – An Introduction to Corpus Linguistics

  • Computational linguistics, analyzing text
  • She has a book coming out in 2012
  • A corpus is a collection of natural language data – used for plagiarism detection, speech detection, and machine translation
  • Toolkits – NLTK and MALLET and Weka
  • Unsupervised tasks – pour in the data and see what comes out (plagiarism detection)
  • Supervised tasks – annotate the data to get more accurate output; uses training data (example: document classification; see the sketch after this list)
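
Since the supervised/unsupervised distinction is the technical core of this talk, here is a minimal sketch of a supervised document-classification task in Python using NLTK (one of the toolkits Amber mentioned). It is adapted from NLTK's standard Naive Bayes example, not from her presentation, and it uses NLTK's bundled movie_reviews corpus as a stand-in for annotated training data.

    import random
    import nltk
    from nltk.corpus import movie_reviews

    # Supervised corpus task: documents already annotated with a label
    # ("pos"/"neg") become training data for a classifier.
    # One-time setup: nltk.download("movie_reviews")

    def document_features(words, vocabulary):
        """Simple bag-of-words features: which vocabulary words appear."""
        word_set = set(words)
        return {f"contains({w})": (w in word_set) for w in vocabulary}

    # Each document is (list of words, category label) – the annotation step.
    documents = [(list(movie_reviews.words(fid)), category)
                 for category in movie_reviews.categories()
                 for fid in movie_reviews.fileids(category)]
    random.shuffle(documents)

    # Use the 2,000 most frequent corpus words as the feature vocabulary.
    all_words = nltk.FreqDist(w.lower() for w in movie_reviews.words())
    vocabulary = [w for w, _ in all_words.most_common(2000)]

    featuresets = [(document_features(words, vocabulary), label)
                   for words, label in documents]
    train_set, test_set = featuresets[200:], featuresets[:200]

    classifier = nltk.NaiveBayesClassifier.train(train_set)
    print(nltk.classify.accuracy(classifier, test_set))
    classifier.show_most_informative_features(5)

An unsupervised task, by contrast, would skip the annotation step entirely – pour the raw documents into a clustering or topic-modeling pass and see what structure comes out.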

Resources
http://www.educause.edu/blog/pkurkowski/ELIReleasesNewBriefonLearningA/229163
NMC Horizon Report > 2012 Higher Ed Edition http://www.nmc.org/publications/horizon-report-2012-higher-ed-edition
2012 NMC Horizon Project Short List http://www.nmc.org/publications/horizon-report-2012-higher-ed-edition

Reflection
PSU World Campus is currently considering the Pearson LMS. How does Pearson measure success? That definition drives the analytics behind everything from student papers to graduation rates. I wonder how Pearson has communicated what their approach is. Would their decisions be the same ones we would agree on? How could we know without a clear discussion internally and then with Pearson?