
DVEETO: A Quick and Concise Survey of Learner Perceptions on Instructional Design

6/3/2019

Gathering learner feedback is essential to iterative instructional design. Perceptions of the learning experience from the learner's point of view let us know how we did things wrong(ish) the last time and how to do them better(ish) next time.

Was this assignment too hard? Did it take too long? Do students even care about this video? Was this a good module?

Does their experience fit with my expectations?

These are some of the questions I'm seeking to answer in my courses. To address them, I need to survey my learners' perspectives on content and design - without boring them to death with yet another survey. To accomplish this, I have settled on a simple tool that gathers what I consider the most valuable learner perceptions quickly and concisely: I call it DVEETO.

DVEETO [DAH-VEE-TOH] is an acronym for: Difficulty, Value, Engagement, Effort, Time, and Opinion. It's a short survey (1 - 3 minutes) that is attached to some target, such as a test, assignment, reading, module, etc.

I will explain the instrument and my rationale below.

Difficulty

Easy - Normal - Hard
The difficulty scale reports learners' perception of the personal challenge, or skill required, to meet the target's requirements.

Value

Low - Moderate - High
The value scale reports learners' perception of the usefulness the target provided them in exchange for their time and effort.

Engagement

Low - Moderate - High
The engagement scale reports learners' perception of attention, motivation, and interest toward the target.

Effort

Low - Moderate - High
The effort scale reports learners' perception of their personal commitment toward the target. Effort is a hedge against the other indicators, which reflect more on the design than on the learner's own motivation.

Time

dd:hh:mm - modifiable
Learners' self-report of time spent on task, or of exposure to the target.

Opinion

Optional essay or short answer
Opinion is optional written feedback. It can be a qualitative summary, highlights, lowlights, suggestions for improvement, etc.
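If you want to wire DVEETO into a form tool or a spreadsheet, the whole instrument fits in a few lines. Here is a minimal sketch in Python - the field names and scale labels are the ones above, while the dictionary layout and the parse_time helper are just one illustrative way to encode them:

```python
# A minimal sketch of the DVEETO instrument as a data structure.
# Field names and scale labels come from the descriptions above;
# everything else is illustrative.
DVEETO_FIELDS = {
    "difficulty": ["Easy", "Normal", "Hard"],
    "value":      ["Low", "Moderate", "High"],
    "engagement": ["Low", "Moderate", "High"],
    "effort":     ["Low", "Moderate", "High"],
    "time":       "dd:hh:mm",  # free entry; the format is modifiable
    "opinion":    None,        # optional essay or short answer
}

def parse_time(dd_hh_mm: str) -> int:
    """Convert a 'dd:hh:mm' entry into total minutes."""
    days, hours, minutes = (int(part) for part in dd_hh_mm.split(":"))
    return (days * 24 + hours) * 60 + minutes
```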

USES

I've used the information from this survey in two ways: (1) to provide formative feedback that allows me to be responsive to learner cohorts: Do they need more remedial instruction? Are they taking too long on the readings? Did they enjoy this activity? These are elements I can use to tune instruction in motion: add more activities like this one, divide the remaining readings, post remedial content, etc. (2) To provide summative feedback that informs the next design version. If 80% of my students thought the readings were too hard, maybe it's time to reevaluate or replace the text, or to provide additional support. I can also track the feedback throughout the course and across student groups.
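As a rough illustration of the summative case, tallying the responses is trivial. The sketch below uses hypothetical data and an arbitrary 80% threshold, purely to show the shape of the analysis:

```python
from collections import Counter

# Hypothetical responses for a single target (e.g., a reading),
# using the scale labels defined earlier.
responses = [
    {"difficulty": "Hard", "value": "High", "engagement": "Moderate"},
    {"difficulty": "Hard", "value": "Moderate", "engagement": "Low"},
    {"difficulty": "Normal", "value": "High", "engagement": "High"},
]

difficulty = Counter(r["difficulty"] for r in responses)
pct_hard = 100 * difficulty["Hard"] / len(responses)

print(f"{pct_hard:.0f}% of learners rated this target 'Hard'")
if pct_hard >= 80:  # arbitrary threshold - tune to taste
    print("Consider additional support, or a different text.")
```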

There are several useful insights this simple instrument can yield. Learners may report that an activity was difficult but very engaging and valuable, or that an assignment was valuable but took too long and wasn't engaging. The former might encourage the development of similar activities; the latter might spur a redesign.

Importantly, it also gives learners the opportunity to provide direct feedback on what might be improved (or what exactly was so great) through the opinion field - so they can tell you exactly what you did wrong and you can fix it.

LEARNER REFLECTION

I've found that this tool also helps learners reflect on their experiences, even if briefly. This is especially useful when DVEETO is attached to mid-level experiences like online modules or multi-tiered assignments or activities, since learners have to recall their experiences and their own sense of commitment (effort) as a whole in order to respond. Since reflection tends to be good for long-term retention, this tool can be a cheap and easy way to promote it.

LIMITATIONS

This is not a research-grade instrument. It's not meant to be. It's intended to provide useful feedback for course development and instruction that learners will actually complete. It allows me to compare what my design expectations were against their real experiences and make informed changes. I would not use this to justify grand declarations about how a particular course element improved student learning outcomes.

The number of possible responses is intentionally limited (3), and I advise against adding more options. More options take longer and require more thinking. Long, reflective thinking is fine for end-of-course/training surveys; that's not what this is for. No one wants to spend 10 minutes on a survey covering a 5-minute video.

That said, you may want to modify the questions.

MODIFICATIONS


As-is, this tool may not be for you. I work primarily in a higher-education environment, teaching undergraduates and training faculty, so you may find that you don't need some of these fields, or that one is missing that would be useful in your context. I think these are fairly universal, but all six questions are not always necessary.

In fact, I actually recommend dropping as many questions as you can - but no more than that. If you are using the survey on the above-mentioned 5-minute video, do you really need to know how long your learners spent on it? I'm going to guess it was about 5 minutes.
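Continuing the earlier sketch, trimming the instrument down is a one-liner. The build_survey helper below is hypothetical, and the choice of fields to keep is just an example:

```python
# The DVEETO fields as sketched earlier.
DVEETO_FIELDS = {
    "difficulty": ["Easy", "Normal", "Hard"],
    "value":      ["Low", "Moderate", "High"],
    "engagement": ["Low", "Moderate", "High"],
    "effort":     ["Low", "Moderate", "High"],
    "time":       "dd:hh:mm",
    "opinion":    None,
}

def build_survey(fields, keep):
    """Keep only the DVEETO fields that make sense for a given target."""
    return {name: scale for name, scale in fields.items() if name in keep}

# For a 5-minute video, Time tells you nothing new - drop it.
video_survey = build_survey(DVEETO_FIELDS, keep={"value", "engagement", "opinion"})
```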

Again, this isn't some rigorously tested research instrument, so mix and match at your discretion. Just don't make it an hour long, because no one is going to take the time to complete it - unless you make them, and then they will hate you and your dumb survey.

WHAT ABOUT ASSESSMENT?

What about it? This is not an assessment tool; it's a feedback tool. You aren't going to use it to find out whether learners actually learned the content - that's a separate issue. However, it can be attached to items such as tests or assignments to inform you of learners' perceptions of them. It would be interesting if, say, your learners reported that a test was easy but ended up failing it miserably. There is probably something there for you to fix.
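If you do attach it to a test, a crude cross-check of perception against performance is easy to automate. The numbers and thresholds below are made up, purely to illustrate the idea:

```python
# Hypothetical cross-check: perceived difficulty vs. actual performance.
# 'pct_easy' would come from the DVEETO survey, 'mean_score' from the gradebook.
pct_easy = 0.75    # fraction of learners who rated the test "Easy"
mean_score = 0.52  # mean score on the test

if pct_easy > 0.5 and mean_score < 0.6:  # arbitrary cutoffs
    print("Learners thought the test was easy but performed poorly - "
          "look for misconceptions or a misaligned assessment.")
```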

It's also worth noting that assessment by itself is not the best way to judge the effectiveness of your instructional design. Learners may simply be highly motivated to get through your awful course because they really, really want to be a doctor or an engineer, or to be in compliance with your company's new policy, or so their parents don't take away their video games.

Don't be fooled into thinking that performance alone is a good indicator of effective design. Ask your learners.

