Sunday, November 9, 2008

Reflecting on my processing of the Zoomerang results

I received some very useful feedback from the Zoomerang survey…I really appreciate it when people take the time to share their thoughts and ideas. I haven’t posted to this blog for a while, as I have been focused on the course (grading the second Culminating Project) and making appropriate adjustments based on the feedback.

Only about half the group responded to the Zoomerang survey. Right or wrong, my positive spin on it is that those who did not respond are satisfied with the direction of the course. At least I can’t think of a time when I provided an opportunity to share feedback and those who were unhappy about a particular aspect of the course didn’t take advantage of it. [Note: It also has a lot to do with how you ask for feedback, the questions you use. I learned from Marty Tessmer – formative evaluation expert – that you have to ask questions in a way that invites a response and assumes there is always something that can be improved. So, a question like “What three things would you change and why?” signals that there are at least three things one should be able to suggest for improvement. The invitation, and the structure of the questions, can definitely change the response rate and the quality of the responses received.]

Of the responses I received, there were several positive comments, which is good. It is helpful to have reinforcement for what I am doing and what I have designed. Positive comments included things about my attentiveness, the quality and quantity of feedback on projects, the nature of the projects, the flexibility to resubmit projects for more points, and the Duarte and Reynolds readings.

I also received several constructively critical comments. Some of the issues shared I immediately addressed. For example, because of comments about the workload being too heavy, I eliminated a Hands-on/Minds-on project to make more space and time for folks working on the final Culminating Project. There were also comments about wishing there were more discussions, so I added a discussion as a way to help the group process Marty Tessmer’s book on designing online tutorials.

Some comments I could not address right now. For example, I received negative comments about the Mayer text. Although I can’t fix it for this term, I will find another way to expose students to Mayer’s principles without using his text next year.
Some of the critical comments presented challenges for me because:
  • They were inconsistent (some folks liking a particular aspect, and others not)
  • They were about aspects of the course to which I am committed from an educational perspective (in this case, I just haven’t made the case well enough, I am assuming)
  • They are about structural issues with the eCollege shell that are out of my control, or that would require a fairly kludgy workaround to design my way around
  • They are more related to the individuals than the course. At least I think they are…

Let me say a bit more about that last one… Honestly, as an instructional designer, I point my finger at myself as much as possible. It gives me comfort to think that there are things I can do – or do differently – to improve the chances that students’ motivation to learn will be enhanced. But, sometimes, I receive a few comments to a survey like this one that feel more like an abdication of student responsibility than something I can directly address.

For example, a specific comment I received had to do with being consistently confused about due dates. Because it is so easy to lose track of due dates in an online course, I standardized on a single weekly due date – end of day on Sundays. There are three times during the course when the due dates differ, and those are related to three sets of peer reviews due on Thursdays. The calendar of graded activities (in the Syllabus) and the Weekly Agendas – both of which include due-date information – have been available since the start of the course. My assumption was that people would rely on whatever method they use to track course due dates. For example, I mapped all the due dates into my daytimer so they would be included in my overall view of my week. I also printed out the syllabus and weekly agendas so I could check items off as we went (in fact, I designed the weekly agendas to function as a checklist).

Was my assumption faulty? Yes, or I wouldn’t have received the comments. Was my assumption unreasonable? No. This is a graduate-level course, with students who are midway through their programs. Online courses require a proficient level of self-directedness…folks have to be able to manage time, resources, and energy for themselves. Is there something I could do to make things easier to track? I suppose so; I’m just surprised I need to.

Where does this leave me? I’ve made some positive adjustments for the remainder of the semester, and I have a list of things – like the Mayer text – to reconsider for next time. The feedback – as always – was very useful to my thinking. I always appreciate it when folks take the time to share their thoughts and ideas, even when it isn’t what I hoped or wanted to hear. If you ask for feedback, you have to be prepared for what you receive, and be willing to take appropriate action based on it. I think I accomplished this.

1 comment:

Instructional Design Resources for eLearning said...

I felt the class was very organized. As grad students, it is up to us to keep track of due dates, e.g., “you have three weeks to complete this assignment.”

Also I do think Mayer exposed me to very thorough research and analysis. I am a little disappointed to hear people complaining about his book.