This paper describes the development of a distance teaching survey instrument and outlines the differences created in the teaching and learning environments where students and academic staff may never meet face-to-face and where much of the 'teaching' in the form of study materials is developed well in advance of knowing who the students are.
The grant has resulted in the production of two distinct evaluation products and processes: the Distance Education Students' Unit Evaluation or DESUE and the Distance Education Support Services Evaluation or DESSE. This paper outlines the purpose of the DESUE evaluation instrument and how it is administered. The DESSE evaluation instrument is described in a paper presented at HERDSA (Boyd, Fox & Herrmann, 1996) and is not discussed in this paper.
In Valuing University Teaching, a report commissioned by the University Academic Board and the Teaching and Learning Advisory Committee, staff indicated that they supported the regular use of student evaluations of teaching, but they were critical of the SAT questionnaire, observing that it did not meet their needs and was outdated (Baker, 1993). This view is supported by Gibbs, Habeshaw and Habeshaw (1989, p. 15), who state in their review of Curtin's six-factor SAT survey that it 'is very difficult for any brief set of teaching rating scales not to embody particular values or assumptions about how teaching ought to be conducted.' As a result of an investigation into evaluating and revising Curtin's appraisal of teaching systems, the university decided to pilot the Student Evaluation of Educational Quality or SEEQ. SEEQ was developed:
... by Professor Herbert Marsh of the University of Western Sydney (Macarthur), an internationally recognised expert in this area of psychometrics. It was established that SEEQ had been exhaustively evaluated in many universities, in many different teaching and learning contexts, and over a considerable period of time (e.g., Marsh, 1982; Abrami, 1989; Marsh & Hocevar, 1991). SEEQ had also been the subject of a DEET Evaluations and Investigations Program (Marsh & Roche, 1994). The validity, reliability, generalisability and applicability of the instrument were beyond doubt (Latchem, 1995, p.3).

SEEQ was designed for use in conventional face-to-face teaching contexts. However, if SEEQ were shown to meet the staff's needs, it was envisaged that it might be adapted for student evaluation of other forms of teaching and learning - for example, clinical supervision, laboratory work, teaching supervision, interactive multimedia, self-directed learning, computer managed learning, postgraduate supervision and distance learning.
SEEQ operates as a quality cycle in which:
Evaluating distance teaching and learning is problematic for several reasons. First, much of the 'teaching' is developed well before individual students are known, and that teaching is embedded in the study materials. Second, the staff who develop the teaching materials may not be teaching the unit, and those teaching may not be the same people who mark the assignments; there may be more than one tutor and more than one marker per unit, and individual staff roles within the teaching of one unit may not be clear. Third, if the unit's study materials were mailed late or were below par, the students' responses may not truly reflect the quality of the tutor's performance. Distance education should therefore be seen as a method of mediated teaching and learning that requires particular and careful evaluation processes and instruments, as Cresswell and Hobson (1996) state: 'with distance education, problems are compounded because there is less opportunity for discussion to clear up possible misunderstandings'.
The SEEQ instrument has a set front page, with 9 factors and 31 questions which cannot be altered in any way. Staff reviewing SEEQ for distance purposes found that many of its items become ambiguous when applied to distance teaching. Nevertheless, certain central SEEQ structures and processes could be included in the distance education instrument to ensure some level of 'equivalence' with the SEEQ.
In developing the Distance Education Students' Unit Evaluation or DESUE, sample evaluation processes and questionnaires from six universities offering distance education were also reviewed, along with the supplementary questions supplied with SEEQ, which had been gathered from multiple sources (Marsh & Roche, 1995, p. 7). In designing DESUE, careful attention was given to the components of teaching effectiveness to be measured, and distinctions were drawn between the teaching, the teaching materials and the assessment. Poorly worded questions, inappropriate items, and heterogeneous items summarised as an average were accepted as yielding unreliable or uninformative data.
DESUE focuses on five factors:
The SEEQ quality cycle outlined in the four paragraphs above (Harrison, 1996) was considered pivotal to the DESUE, and the first stage pilot of the instrument was offered in second semester, 1996.
DESUE, like SEEQ, is not simply an appraisal instrument but a self-development package comprising:
Tutors are informed that they cannot alter any of the DESUE items on the front of the survey form but that they may devise their own rating items by developing the 'Tutor Supplied Items' section. All questions use the same 9-point response scale.
If tutors decide to develop their own 'Tutor Supplied Items', these are printed on a separate sheet and copies made for each student. Tutors are advised to provide written instructions on the 'Tutor Supplied Items' form asking the students to respond to each item in the appropriately numbered answer space.
The preferred time for administering DESUE is during the final week before exams. If this is not possible, the survey should be administered as closely as possible to the final week to maximise the reliability of students' responses. As administration of the DESUE is carried out by the Teaching Learning Group (TLG), tutors need to inform TLG two or three weeks beforehand. TLG will then send the DESUE forms and the necessary supplementary information, including any tutor supplied items, to students by post.
As DESUE was piloted during second semester, 1996, no feedback from tutors using the form has been received.
Boyd, A., Fox, R. & Herrmann, A. (1996). An investigation: Using student feedback for staff development. In S. Leong & D. Kirkpatrick (Eds.), Different approaches: Theory and practice in higher education. Proceedings of the 1996 Annual Conference of the Higher Education and Research Development Society of Australasia (HERDSA) (pp. 58-60). Perth: HERDSA.
Cresswell, R. & Hobson, P. (1996). Fallacies and assumptions in the use of student evaluation of distance education teaching materials. Distance Education, 17(1), 132-144.
Fox, R., Boyd, A. & Herrmann, A. (1995). Attitudes and access to technology in distance education. In L. Summers (Ed.), A focus on learning. Proceedings of the Teaching Learning Forum '95 (pp. 89-93). Perth, WA: Edith Cowan University.
Gibbs, G., Habeshaw, S. & Habeshaw, T. (1989). 53 interesting ways to appraise your teaching (2nd ed.). Bristol: Technological and Educational Services.
Harrison, A. (1996, July). Enhancing the Student Evaluation of Educational Quality (SEEQ). A memo prepared for staff involved in stage 2 pilot of the SEEQ at Curtin. Perth: Teaching Learning Group, Curtin University.
Herrmann, A. & Fox, R. (1992). Gender differences in the perceptions of distance education students. In Research Forum '92. Perth, Western Australia: Western Australian Institute of Educational Research.
Latchem, C. R. (1996). Student evaluation of teaching at Curtin University: Piloting the Student Evaluation of Educational Quality (SEEQ). A report to the Promotions Review Committee, the Teaching and Learning Advisory Committee, and University Academic Board. Perth: Teaching Learning Group, Curtin University.
Marsh, H. W. & Roche, L. A. (1992). The use of student evaluations of university teaching in different settings: The applicability paradigm. Australian Journal of Education, 36, 278-300.
Sochacki, R. (1996, November). Enrolment monitor of external load by course and unit base. A report to the Teaching Learning Group. Perth: University Planning and Statistics, Curtin University.
Please cite as: Fox, R., Boyd, A. & Herrmann, A. (1997). What form should student evaluation of distance teaching take? In R. Pospisil & L. Willcoxson (Eds.), Learning Through Teaching (pp. 106-110). Proceedings of the 6th Annual Teaching Learning Forum, Murdoch University, February 1997. Perth: Murdoch University. http://lsn.curtin.edu.au/tlf/tlf1997/fox1.html