"The better the score, the more prestigious your University will appear"
the logic does not escape the students asked to complete the survey in Spring each year.
Nor does the logic escape those who wrote the 22 questions of the survey. Each is couched in positive terms, seeking agreement with a satisfied position on (amongst other things)
Staff - e.g. Q2 "Staff have made the subject interesting"
Feedback - e.g. Q9 "Feedback on my work has helped me clarify things I did not understand"
and Overall satisfaction - Q22 "Overall, I am satisfied with the quality of the course"
And then, to ensure a positive result, only those responding "definitely agree" or "mostly agree" (scores of 5 and 4, respectively) are counted; the remainder go unreported.
"Mostly agree" is an interesting concept....."I agree that most of my last flight from India was satisfactory - then we tried to land at Birmingham in a crosswind" (see the YouTube footage HERE).
"Mostly agree" is an interesting concept....."I agree that most of my last flight from India was satisfactory - then we tried to land at Birmingham in a crosswind" (see the YouTube footage HERE).
Now, if that were a survey conducted by, say, a luxury hotel during happy hour on the club floor, we'd laugh and say, "whoever designed this survey must have scraped a bare pass in their GCSE Marketing Research!" And yet, as experienced researchers and academics, we do not. Instead we worry and fret over response rates, marginal changes in satisfaction scores and league table positions, and then marshal resources to tackle perceived problems that a few disgruntled respondents have mentioned after being pressed by the polling company, by email and phone, for weeks...
Actually I think I'll start beating my wife. It promises to give more interesting and authentic results.