Tuesday, September 9, 2008

Web Analytics in Education

A few months ago, I saw a presentation by Clint Rogers on web analytics at BYU. Since then, I have been thinking a lot about the possibilities of web analytics for education. I even told my friends that I am pretty sure that whoever figures out how to do it right first is going to start an industry. I was pleased to see that Clint was teaching a seminar on that very topic this fall at BYU so I am sitting in on it even though I have almost no spare time right now. As a result, several upcoming posts will be related to this topic.

So what could we learn from web analytics that we don't already know and what could we do with that knowledge?

My brainstorm:

  • We could know exactly who looked at what and for how long. We could know which of the ten things we thought they absolutely had to read they actually did read (or at least left open on their browser) and for how long, and then correlate that to their scores to see if they really did need to read those ten things or not.
  • We could find out if the $5000 simulation we built gets more actual student face time than the $500 game.
  • We could provide approach A 50% of the time and approach B 50% of the time and correlate to outcomes to see if one has better results.
  • We could identify learners who are not logging in, or clicking randomly, or only doing the quizzes and intervene by notifying them automatically (but as if we are human) that we have noticed this pattern and we are concerned (a human would read the reply, of course).
  • We could possibly identify profiles of people who are cheating.
  • We could find out if online students really do cram the entire course into the last three weeks of the semester and still get an A- on the final, and reflect on how we feel about that.
  • We could discover that you only need to skim this particular course to get a B-.
  • We could discover that if you only read the intro and the summaries of each lesson you get a passing C.
  • We could discover that those who do all the optional quizzes and pace themselves to complete three lessons a week get an A, and then tell new students at the beginning of the course about this pattern for success to help them invest in good study practices. And, if they fall off the wagon, we could remind them that their current, not-so-hot learning patterns correlate with a D for 90% of the students last semester who fell into this pattern and didn't change by October 1st. In fact, profiling the behavior of high-performing students, or of those who get off to a rough start and recover, or of those who spend the least amount of time in the course but get the highest grades, or, or, or..., is, I think, one of the most interesting areas that could be investigated. It could lead to a lot of good advice for others taking the course, and to entire course redesigns that make them more lean, mean, and precisely helpful -- especially if we can profile students' entry characteristics and then correlate them to success patterns for those specific characteristics.
    "Dear student, According to the survey and your past grades, you are very similar to 86 students who took this course in the last 2 years. These students also 'enjoyed working on their own' but 'felt that they learned slower than most' and had similar grades to yours in the prerequisite courses. Students with this profile were most successful in this course when they followed these study habits: yada yada. However, most of these students were more inclined to follow these less effective patterns: yida yida. We have sophisticated tools that can produce a weekly report showing how close your study habits are to those of students with your profile who were successful in the past, and warning you if you fall into the less effective learning patterns common to students with your profile. Would you like us to send this report to you?"
None of this feels like TLC for the student, but I believe that the hard numbers and statistical patterns can be presented in a very human, non-threatening, helpful way that really will help students feel like the course designers/instructors know them, are there to help them, and have this almost magical insight into how they can improve their performance in the course. Maybe not. But it is very much worth a try.

Wednesday, April 9, 2008

Fire that (Fictitious) Employee! Unanticipated Consequences of Using Narrative in Instructional Design

I went to an interesting presentation at my local chapter of ISPI by Andrew Wolff of PricewaterhouseCoopers. He talked about how they had recently begun using the simplest, cheapest versions of narrative and humor in their training. For example, to help their people understand the technical side of one of their businesses, they show a series of photos with audio in which a guy gets a call at the end of the week telling him he needs to do a report on the chipset the company is selling. He is about to ignore the request and go home when his cell phone starts to talk to him. They have still drawings of a little talking cellphone that change every ten seconds or so in an "animation," and this cell phone has a cartoon-y character voice. The talking cellphone tells the guy about the importance of the chips inside it to the business's bottom line and takes him on a tour of the factory. They also had a confidentiality training in which a story plays out where a character makes simple mistakes that lead to a major breach of security for the company. These innovations were not very expensive and didn't take much longer than a vanilla course to produce. Among the effects, the ones that stood out to me the most were:

1. With no promotion whatsoever of the new course other than word of mouth, training completion for the company went from something like 90% of people finishing in the last three days before the deadline to 90%+ finishing in the first three days the course was available.

2. In the case of the security training, partners in the firm were calling the training department in the first few days after the training was released, trying to get the (fictional) character in the training fired for her negligence.

Now you tell me some other strategy that would have led to similar outcomes. And think of what the company stands to gain by shaving three months off of the amount of time it takes for all of their people to complete required training. And imagine the employees of your company talking to each other in the halls about the great confidentiality training they just completed and how you don't want to miss it. Sounds like some kind of training department fantasy. One that I think many of us would like to be in.

Wednesday, April 2, 2008

Where is Emotion, Engagement, and Aesthetics in the Learning Sciences?

I am looking at the index of The Cambridge Handbook of the Learning Sciences. I am surprised to find:
  • No entry for "engagement"
  • No entry for "aesthetics"
  • Only 2 pages under "emotions," one of which refers to this passage:
    We need a better understanding of the intertwining of affective, relational, and communicative aspects of learning interactions. How do emotional responses mediate learning, and how do they emerge from learning? (p.29)
    Good question!
  • Only 2 pages under "narratives"
  • Under "motivation," which has 21 sub-headings and 55 page references, there are a handful of possibly relevant subheadings: "Attention and motivation," with 1 page listed, "boredom and motivation," with 2 pages listed, "deep level engagement and motivation," 1 page listed, "emotions and motivation," 1 page listed.
Could someone clue me in to what a learning scientist might call "emotions," "aesthetics," "engagement," and "narrative"? Could the field really have attended so little to these issues? I realize that the Handbook is hardly the entire corpus of the field, but I guess I was hoping to find a bit more than I did.

Thursday, March 27, 2008

Planning for Engagement in Instructional Design

Pat Parrish had an engagement plan for his course. It looked* something like this:


He thought that students would generally start with lower levels of engagement, that their engagement would grow as they learned new material and completed assignments, that it would plateau in the middle of the course, and then rise to a climax near the end of the course when the applications of their learning became more apparent to them. Pat measured the engagement of each student throughout the course. Each line represents one student's reported engagement on a per-module basis. This* is what he found:



Wow. Try following any one path through the chart. Now compare it to any other path on the chart. Explain why the two paths are different. Good luck.

So what's the point? Two points.

First point: Pat is way ahead of the curve. Pat actually had an engagement plan. He actually thought about how each part of the course would engage learners and to what extent he thought it would. He actually implemented learning activities to reflect his plan. Have any of the rest of us really tried that? Do we have a plan for engaging students in any systematic way? Do we have a picture of the ideal engagement arc of our course in our heads? Or are we just focused on achieving learning objectives (somehow) or, worse, content coverage, and hope/assume that engagement will happen? Or are we resigned to the sad fact that learners choose to be engaged or not engaged, period, not my problem? Is this how a screenwriter, a playwright, or a music composer would think about their audience? I believe that while learners do have a choice to engage, we also have a choice of deciding how seriously we are going to try to reach out to them in engaging ways. How determined of a suitor of meaningful student engagement are we going to be?

Second point: Pat's students were all over the map. And we have no idea why (though I imagine that Pat has some guesses). Most of us have students like Pat's. I would bet valuable property that 95% of us would find a similar, random-looking set of curves if we tried the same experiment in our courses. We have no idea why they are or aren't engaged. We have no idea why, in module 7, one student who had been averaging between a 6 and a 4 dropped to a 2 while six other students posted increasing engagement scores for the same module. This is the sort of thing that I feel like we really, really need to know. We should be able to read these patterns, perhaps not easily or perfectly, but we should at least have a sense of why these things are happening.
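A first analytical step toward "reading" charts like Pat's is simply locating where they disagree: the modules where student reports spread the furthest are the places most worth investigating with interviews or a design review. A sketch, using made-up engagement ratings on a hypothetical 1-7 scale:

```python
from statistics import stdev

# Hypothetical per-student engagement ratings, one value per course module.
engagement = {
    "s1": [4, 5, 5, 6, 2, 6, 7],
    "s2": [6, 6, 5, 5, 5, 6, 6],
    "s3": [3, 4, 6, 2, 6, 5, 7],
}

# Spread per module: large values mark modules where students diverge most.
n_modules = len(next(iter(engagement.values())))
spread = [stdev(ratings[m] for ratings in engagement.values())
          for m in range(n_modules)]

# The two most divergent modules -- the bends in the curves to ask about first.
hotspots = sorted(range(n_modules), key=lambda m: spread[m], reverse=True)[:2]
```

This only points at where the interesting variation is; explaining it still takes talking to the students.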

If we want to engage learners, and I do, we had better start finding ways to create and understand charts like Pat's. We are at the starting point of this kind of research (to my knowledge; if I am wrong, please let me know). The point when everything looks like random chaos. But it isn't random. There is a reason for every bend up or down on those curves. Let's go find out what is going on so we can design engagement into our learning experiences. Yes, the learners have to choose, but let's give them every reason to choose to engage.

*Graphics used by permission; taken from an AECT 2007 presentation. Update: looks like Pat put the paper on his website.

Thursday, March 20, 2008

Real-time Measure of Learner Engagement

I want to conduct this study:

Twenty college students (or learners of your choice) are given a means and a prompt (and a reward) to answer the following two questions every five minutes for all of their waking hours over the course of a day/week:

1. What are you doing right now (if different than your previous answer to this question)?
2. How engaged do you feel right now on a scale from 1 (I am bored to tears! Save me!) to 5 (Shhh! Go away, I am busy!)?

This would allow us to formulate some baseline data to see where learning experiences fall in the overall spectrum of a learner's environment in terms of engagement. I am guessing that most of them fall into the bottom twentieth percentile, most of the time. If so, that can't be good. This kind of study could also be instrumental in identifying those exceptional learning experiences that are maxing out the scales. Think we might want to look at those particular experiences a little more closely?
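The "engagement baseline" calculation this study implies is straightforward: rank each learning-experience sample against everything else the learner did that day. A sketch with entirely invented experience-sampling records (the activity labels, ratings, and which activities count as "learning" are all hypothetical):

```python
from statistics import mean

# Hypothetical experience-sampling records: (activity, engagement 1-5),
# one sample every five minutes of one student's day.
samples = [
    ("breakfast", 2), ("commute", 1), ("lecture", 2), ("lecture", 1),
    ("gaming", 5), ("lab", 4), ("lunch", 3), ("lecture", 2),
    ("study", 3), ("gaming", 5), ("tv", 3), ("study", 2),
]

def percentile_rank(value, population):
    """Fraction of the population scoring strictly below `value`."""
    return sum(1 for v in population if v < value) / len(population)

all_ratings = [r for _, r in samples]
learning = [r for a, r in samples if a in {"lecture", "lab", "study"}]

# Average percentile of learning experiences within the whole day: values
# near 0 mean learning sits at the bottom of the learner's engagement spectrum.
baseline = mean(percentile_rank(r, all_ratings) for r in learning)
```

With real data, a baseline like this could test the "bottom twentieth percentile" hunch directly, and flag the rare learning experiences that rank near the top of the day.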

Or imagine this variation: You are the instructor of a course. Each student has a little engagement meter asking them to rate their engagement on a scale from one to five every five minutes during your class time. You videotape the class. You synchronize the video with the data. You chart engagement across time. Where you see peaks (hopefully) and valleys (inevitably) you jump to that part of the tape to see what was/wasn't going on. How much do you think you could improve your course after just one session of this? After three? Five? Do you have the courage to have the data reported to you in real-time while teaching the class? (Would that even be a good idea?)
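Finding the spots on the tape to jump to is a simple turning-point search over the class-average ratings. A sketch, with an invented five-minute engagement series:

```python
# Hypothetical class-average engagement (1-5), one reading every five minutes;
# index i corresponds to minute 5*i of the class video.
series = [3.1, 3.4, 4.6, 3.0, 2.1, 2.4, 3.8, 4.9, 4.2, 2.5]

def turning_points(xs):
    """Indices of strict local peaks and valleys to jump to on the tape."""
    peaks = [i for i in range(1, len(xs) - 1) if xs[i - 1] < xs[i] > xs[i + 1]]
    valleys = [i for i in range(1, len(xs) - 1) if xs[i - 1] > xs[i] < xs[i + 1]]
    return peaks, valleys

peaks, valleys = turning_points(series)
timestamps = [f"{5 * i} min" for i in peaks + valleys]  # where to scrub the video
```

Real classroom data would be noisier than this, so some smoothing before the search would probably be needed; the principle is the same.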

If you know of anyone anywhere who is doing anything along these lines, please let me know. I am aware of classroom clickers, but not aware of anyone using them to measure engagement throughout class time to create an engagement graph. And I have never heard of anyone trying to establish an "engagement baseline" for learners that compares their learning experiences to the rest of the daily experiences in their life.

In closing, could you please rate your level of engagement with this blog post on a scale from one to five? ;)

Monday, March 10, 2008

On Beyond ADDIE: Narrative, Aesthetics, & Learner Emotion

I cut my teeth in instructional design by designing a web-based air quality meteorology course. It all started when, one day, the director of my organization walked down the hall, stopping in at each office, asking, "Does anybody want to build a course for NOAA?" I said, "Sure!" and got to work.

At the time, I had no practical experience in the field to speak of. I had taken one class called "Introduction to Instructional Design." So I opened up my Principles of Instructional Design textbook from the class and started from page one. It was a great resource and got me up and running quickly. Two meteorology graduate students from NCSU and I produced the entire course that summer. To my knowledge, it was one of the first self-contained courses online and it can still be found here (if you visit, please make sure you picture the state of the web in 1996 -- Hotmail was launched that summer and there were a whopping 342,081 websites online).

With time and experience, however, I have come to feel that ADDIE-type approaches like the one in my Principles of Instructional Design book too often fail to account for the humanness of the learner. While you can use those methods to consider the humanity of your audience, you can also fulfill every prescribed step and entirely miss it.

What would I add? Let's start with three very powerful, very underutilized forces:

1. Narrative: Story has been used to bind people together in shared knowledge and understanding for thousands of years. It is arguably the first instructional strategy ever used to convey essential cultural knowledge to the rising generations. It's an essential aspect of virtually every culture on the planet. We are wired for narrative. We think in narrative, we speak in narrative, we even dream in narrative. We perceive our very existence as an unfolding narrative. We collectively pay billions of dollars to experience well-crafted (and not so well-crafted) narrative. Narrative design needs to be deeply understood and routinely practiced in our field.

How many instructional designers have even heard of the field of narratology? How many designers have studied the construction of a documentary, a screenplay, a dance performance, a musical composition? We are starting to scratch the surface with our recent attention to role-play scenarios and gaming, but have far, far to go.

2. Aesthetics: Human beings respond powerfully to aesthetic design. Every decision we make, like it or not, is mediated by our subjective perceptions. The "Bottomless Soup" study done by Brian Wansink, a recent winner of the Ig Nobel Prize for nutrition (who also has a book on the subject, Mindless Eating: Why We Eat More Than We Think), demonstrates this beautifully. And, of course, aesthetics don't only make us fat. They can relax us, orient us, inspire us, enliven us. Aesthetics are much more than the surface qualities of an object; they extend to encompass the richness of our experience, and the best applications of aesthetic design can embody and express layers of meaning in a profound, prereflective way. Patrick Parrish is starting the conversation in our field. This conversation needs to be accelerated and expanded.

3. Learner Emotion: Human beings feel, and what they feel influences their readiness to learn, their willingness to learn, how much they actually learn, and whether they will (ever) decide to learn about a particular topic again. As Russ Osguthorpe asks, "If they got an A in the class, and tell us that they never want to see that content again in their lives, have they really learned what we intended to teach them?" Emotions can work for or against learning. In order to account for emotion in our learning design, we need to know what learners are feeling before, during, and after learning experiences occur.

We have a whole science devoted to measuring learning before, during, and after learning experiences and, ostensibly, ways to intervene based on what is learned from these assessments. Where is the science and technique of measuring the learners' emotions? What are the best practices of how to intervene based on what is learned? What makes us think we can teach effectively if we only know what learners know and not how they feel? How they feel about learning this topic, how they feel about their ability to learn this topic, how they feel right now in this learning session during this learning activity? Engagement is both a cognitive and an emotional experience. Can you imagine a flow experience in a learning setting that was devoid of emotion? Can you imagine an overwhelmed, bored, distracted, or anxiety-filled learner maximizing their learning?

We teach human beings. Let's start designing human experiences as well as learning experiences.

Thursday, February 28, 2008

Holistic Education vs. Holistic Learning vs. Holistic Learning Experiences

tsharvey (who also happens to be my brother-in-law) asked this question and I thought it was worth responding to in a post rather than just a comment.

tsharvey said:
I don't think I've ever encountered the term 'holistic learning experiences' —of course, I'm not in the instructional design field. While I can speculate what you might mean by holistic learning experiences (and how it might differ from ideas such as integrated learning environments), can you clarify or provide examples of such approaches?

My answer: There is a Holistic Education movement that is more expansive than what I am referring to; you can read about it here or at Wikipedia. If you are interested, a colleague of mine recommends this book:
Holistic Education: An Analysis of Its Ideas and Nature (The Foundations of Holistic Education Series, V. 8)*
(*Amazon link -- I am an Amazon associate -- but you can also buy it here and I won't get a dime.)

There is also something called "holistic learning" by Patrick G. Love and Anne Goodsell Love. They define this as
the integration of intellectual, social, and emotional aspects of undergraduate student learning. (reference)

That is closer, since I am interested in all three aspects of learning. So what exactly do I mean by "holistic learning experiences"? What I am getting at is that, in the field of instructional design, most models require an analysis of the content that breaks it down into ever smaller pieces until you have all these little parts of knowledge. Then, many models say, you teach this piece this way and that piece that way, and when you are done teaching all of the pieces, the learner will know what they need to know. This can become a fragmented, decontextualized kind of experience. Our university system offers an analogy: individual disciplines can get so focused on themselves that they lose sight of the big picture and fail to collaborate with each other on cross-disciplinary issues.

When I say holistic learning experience, I mean designing the experience with the whole as well as the parts in mind. What will the overall structure be? How will we make sure the parts are related to each other meaningfully? When is it better for the learner to experience the material in larger chunks (and, at times, with more ambiguity)? This is one of the reasons I am interested in narrative, because I believe that narrative design has to pay attention to both the whole and the parts at the same time to be successful, and I think my field can learn from that.

"Holistic learning experiences" is a term I made up. I am open to suggestions if a better one comes to mind.

Joseph

Tuesday, February 26, 2008

Learning as a Human Experience

I was trained as an instructional designer and have worked as one now for over a decade. As Brenda Bannan-Ritland said to me recently, "I honor that tradition." But I feel drawn to other disciplines that can inform it, unconventional design approaches that can improve it, and a new emphasis on learners as human beings, whose hearts are inseparably connected to their heads. When we learn, we feel, and not enough designers of learning experiences care enough about that. And it is worth caring about. It is worth designing for. It can make all the difference.

In this blog, I explore the many, many places outside the field (and a few inside) where designers have chosen to account for the human experience in a holistic way. I explore what it feels like to be a learner and how design can impact that experience. I explore how we can make learning rewarding to the human mind and enriching to the human spirit.

Please join me.

Joseph