How not to make questionnaires to collect feedback from language students


How do your customers like your courses? Most language schools administer some sort of feedback questionnaire after each course. Students are asked to complete it anonymously. The school administration is convinced that this will yield objective information on which teachers perform well and which do not.

Very rarely does a school consult a specialist in constructing questionnaires and conducting surveys, such as a sociologist or psychologist. Why? Because it seems so simple. Anyone can come up with a list of questions and construct some kind of rating scale.

Distributing badly constructed questionnaires can have unpleasant consequences:

1.     You may end up with a lot of data but not know how to interpret it.
2.     You may create resentment among your teachers, who feel they are not being treated fairly.
3.     You might damage the image of your school, because the way the questionnaire is constructed “primes” respondents to think that something must be seriously wrong.

(An extreme, hypothetical example: what would happen if parents of kindergarten children were asked questions like these: Have you observed any signs of infectious illness in your child’s educator? How sure are you that he behaves properly with your child as soon as you leave?)

When I audit language schools and their quality management systems, these are typical questions I ask when we talk about questionnaires:

1.     Who was authorized to develop the questionnaire? Where is this written down (for example, in the school’s quality handbook)?
2.     To what degree have the employees (i.e., the teachers) participated in its development?
3.     Have experts been consulted, and on what basis were they selected?
4.     How is the questionnaire’s content linked to other QM documents, like the school’s strategy or standard operating procedures for teachers?
5.     Have teachers been briefed before the questionnaire is distributed, and how?
6.     Is there a written procedure on how to apply the questionnaire (what do you say when handing it out)?
7.     How are results analyzed and archived?
8.     How often is the questionnaire applied?
9.     Have you piloted the questionnaire with a select group of students? Have you checked how respondents understand your questions? Do all customer groups understand them (also children or minority groups)?
10.  Do you ask respondents how they feel after answering the questions?
11.  Does your system allow you to statistically correlate answers and to compare repeated surveys? Can you track individual students and identify trends?
12.  What program do you use for statistics (for example, SPSS)? How have the relevant personnel been trained in this area?
13.  How do you know whether observed differences are statistically significant or not? (A sketch of such a check follows this list.)
14.  How do you know which questions are more important than others?
15.  Do you combine individual questions into overarching factors (productivity, likability, materials, etc.)? Do you weight all questions equally? (See the second sketch after this list.)
16.  How many questions do you really need to measure each factor?
17.  Are there any leading questions that push respondents toward a particular answer, especially when they are not paying close attention while completing the questionnaire?
18.  How do you improve upon the current questionnaire? Do you have regular internal QM audits?
19.  Do you publish the results, or part of them? Do participants receive the results?
20.  How are the results communicated to the teachers? Individually or in a group meeting? Is there a written procedure for that?
21.  How do you follow up on whether the survey has led to improvements?
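
(For question 13: the following is a minimal sketch, in Python rather than SPSS, of what such a check could look like. The data, column names, and ratings are invented purely for illustration; the point is only that “is this difference significant?” is a concrete question a few lines of analysis can answer.)

```python
# Minimal sketch with invented data: do ratings for one course differ
# significantly between two survey rounds, or is the gap just noise?
import pandas as pd
from scipy import stats

# Each row is one anonymous questionnaire; ratings are on a 1-5 scale.
spring = pd.DataFrame({"clarity": [4, 5, 3, 4, 4, 5], "materials": [3, 4, 4, 3, 5, 4]})
autumn = pd.DataFrame({"clarity": [3, 3, 4, 2, 3, 4], "materials": [4, 4, 3, 4, 4, 5]})

for item in ["clarity", "materials"]:
    # The Mann-Whitney U test is a reasonable default for small, ordinal rating data.
    _, p_value = stats.mannwhitneyu(spring[item], autumn[item], alternative="two-sided")
    print(f"{item}: spring mean {spring[item].mean():.2f}, "
          f"autumn mean {autumn[item].mean():.2f}, p = {p_value:.3f}")
    # A large p-value means the apparent drop may be random variation,
    # not grounds for confronting a teacher.
```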

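(For question 15: a second minimal sketch, again with invented item names and weights, showing what it means to collapse individual questions into one overarching factor instead of reading twenty items in isolation. In practice, the weights, and whether the items even belong to the same factor, should come from a specialist or a proper factor analysis, not from gut feeling.)

```python
# Minimal sketch with invented item names and weights: combining several
# questionnaire items into a single "productivity" factor score per student.
import pandas as pd

responses = pd.DataFrame({
    "explains_clearly": [4, 5, 3, 4],
    "useful_homework":  [3, 4, 4, 5],
    "progress_felt":    [5, 4, 4, 4],
})

# Choosing weights is a deliberate decision, not an accident of equal averaging;
# these values are purely illustrative and sum to 1.0.
weights = {"explains_clearly": 0.5, "useful_homework": 0.2, "progress_felt": 0.3}

productivity = sum(responses[item] * w for item, w in weights.items())
print(productivity)  # one weighted factor score per respondent, still on a 1-5 scale
```
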
Just some questions. In most cases, I do not get a satisfactory answer to most of them. One administrator told me he wasn’t interested in overcomplicating things and making them “scientifically” sound. Well, if your goal is just to create trouble with teachers and students and end up with a data mess you don’t know what to do with, please go ahead! If not, take your time and consult a specialist.

Here are some of the most blatant mistakes, apart from ignoring the questions above, and how they can be avoided.

1.     The questionnaire is too long. More than 20 questions is much too long. It shows that you don’t know which questions really matter.
2.     Most questions are closed questions that can be answered only with a rating or a yes/no. Use more open-ended questions, such as: What made this course particularly productive/unproductive for me? What surprised me? What new language-learning techniques did I take away from it?
3.     In some countries, people like to phrase questions negatively: Do you feel ignored by your teacher? How bad is your teacher at explaining? Either keep a balance, or avoid such questions altogether: How does your teacher explain? How much attention do you need during class? To what degree do you get it?
4.     Questionnaires may have an inherent bias toward one particular way of teaching. I have observed that the bias is often in favor of teacher-centered classes, which means student-centered teachers will receive lower ratings. Make sure all teaching styles (and course levels!) are measured fairly.
5.     Questions are phrased ambiguously or with multiple negations, which makes them harder to understand.
6.     Many questionnaires can be reduced to a single question: How did you LIKE the course/teacher/school? Focusing on like/dislike distracts attention from the real questions: How productive has the course been? What results have I achieved? The real test of a course comes when students, often months later, are abroad and put their abilities into practice. Many students who “enjoyed” the course (because it kept them in a cozy comfort zone with a “nice” professor) experience the shock of being unable to communicate. What will they think of your school then?


Tell me what you have encountered and what you think, or send me your questions. If you would like me to hold a live seminar at your school on feedback, and performance appraisal in general, just send me an e-mail.

If you want to read more about quality management in language teaching, please see the other articles on this blog.


Stay tuned!

Gerhard


About the GO Method
The GO Method is a quality management system for language schools. It conforms to key elements of the ISO 9001 standard, while being more specific on teaching-related issues. Customers get access to easily adaptable document templates.
Check us out at The GO Method.

About me
Psychologist and polyglot from Hamburg, Germany (*1979). Married with children. MA in psychology from the University of Hamburg. More than 15 years of experience in quality management and foreign language teaching. Coordinator of the GO Method network, with representatives in more than 90 countries worldwide.
Connect with me on
Linkedin or send me an e-mail.


