Teaching and Learning Forum 97

What form should student evaluation of distance teaching take?

Robert Fox, Anna Boyd and Allan Herrmann
Teaching Learning Group
Curtin University of Technology


Introduction

The current emphasis on quality assurance across all Australian universities has led to the widespread use of questionnaires asking students for feedback on various aspects of teaching effectiveness. Marsh and Roche confirm the view that Australian universities are increasingly relying on students' evaluations of teaching effectiveness as feedback to lecturers, in the hope that the information obtained will lead to improvements in teaching, student course selection, personnel decisions and research on teaching (1992, p. 278). In institutions with a distance education profile, questionnaires on teaching effectiveness are provided alongside, or included in, questionnaires on distance study materials and on management and administrative support services. Curtin University is no exception and has provided routine student feedback mechanisms for the past two years, based on a series of evaluation trials conducted between 1991 and 1994 (Herrmann & Fox, 1992; Fox, Boyd & Herrmann, 1995). The present evaluation processes are the result of a Teaching Development Grant, awarded through the Teaching Learning Advisory Committee and the Teaching Learning Group, to review the existing evaluation procedures and to develop, trial and report on new evaluation processes.

The grant has resulted in the production of two distinct evaluation products and processes: the Distance Education Students' Unit Evaluation or DESUE and the Distance Education Support Services Evaluation or DESSE. This paper outlines the purpose of the DESUE evaluation instrument, describes how it is administered, and discusses the differences created in teaching and learning environments where students and academic staff may never meet face-to-face and where much of the 'teaching', in the form of study materials, is developed well in advance of knowing who the students are. The DESSE evaluation instrument is described in a paper presented at HERDSA (Boyd, Fox & Herrmann, 1996) and is not discussed in this paper.

Distance Education Students' Unit Evaluation or DESUE

Since 1980, teaching staff at Curtin have had access to the Student Appraisal of Teaching or SAT instrument, designed to provide diagnostic feedback useful for the improvement of teaching and to measure teaching effectiveness for use in administrative decision making - in the latter case, to support applications for promotion and incremental progression. The use of SAT has always been voluntary and, apart from its use for promotion and incremental progression, it is confidential. SAT has, however, only been available for use with on-campus classes. With the growing number of distance courses and distance students, representing up to 5% of the total university EFTSU (University Statistician's Report, 1996), providing staff with such an instrument for use in the distance mode became an essential task.

In the report Valuing University Teaching, commissioned by the University Academic Board's Teaching and Learning Advisory Committee, staff indicated that they supported the regular use of student evaluations of teaching but were critical of the SAT questionnaire, observing that it did not meet their needs and was outdated (Baker, 1993). This view is consistent with Gibbs, Habeshaw and Habeshaw (1989, p. 15), who state of brief rating surveys such as Curtin's six-factor SAT that it 'is very difficult for any brief set of teaching rating scales not to embody particular values or assumptions about how teaching ought to be conducted.' As a result of an investigation into evaluating and revising Curtin's appraisal of teaching systems, the university decided to pilot the Student Evaluation of Educational Quality or SEEQ. SEEQ was developed:

... by Professor Herbert Marsh of the University of Western Sydney (Macarthur), an internationally recognised expert in this area of psychometrics. It was established that SEEQ had been exhaustively evaluated in many universities, in many different teaching and learning contexts, and over a considerable period of time (e.g., Marsh, 1982; Abrami, 1989; Marsh & Hocevar, 1991). SEEQ had also been the subject of a DEET Evaluations and Investigations Program (Marsh & Roche, 1994). The validity, reliability, generalisability and applicability of the instrument were beyond doubt (Latchem, 1996, p. 3).
SEEQ was designed for use in conventional face-to-face teaching contexts. However, if SEEQ were shown to meet staff needs, it was envisaged that it might be adapted for student evaluation of other forms of teaching and learning - for example, clinical supervision, laboratory work, teaching supervision, interactive multimedia, self-directed learning, computer managed learning, postgraduate supervision and distance learning.

SEEQ operates as a quality cycle in which:

  1. The lecturer reflects upon what it means for them to be 'a good teacher' and on which attributes they wish the students to appraise. Using the 9 factors on the front of SEEQ, which translate into 31 questions, together with their own lecturer-supplied questions on the reverse of the form, they rate themselves before inviting the students to appraise their teaching. This enables them to compare their perceptions with those of the students.

  2. The lecturer invites the students to rate his/her performance using the SEEQ instrument which provides valid/reliable questions and feedback in regard to the 9 factors, answers to the lecturer-supplied questions and open-ended comments, all of which are contextualised by the background subject/class characteristics data.

  3. The lecturer receives the students' evaluations in the form of mean, standard error (SE) and percentile data, which can be used for benchmarking and for comparison with the self-rating survey.

  4. Equipped with this feedback, the lecturer may then engage in action research or reflective practice to enquire further into the nature of his/her performance. In addition, the lecturer may ask for Targeted Teaching Strategy Booklets, which provide 'hints and tips' on improving practice in the areas covered by the 9 factors (Harrison, 1996).
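The feedback in step 3 amounts to simple summary statistics. As a minimal illustrative sketch (the ratings and comparison-group figures below are invented, and the actual SEEQ reports are produced centrally, not by this code), the mean, standard error and percentile rank for one factor's ratings might be computed as follows:

```python
# Illustrative summary statistics for one SEEQ/DESUE factor.
# Ratings are on the 9-point agreement scale; all numbers are hypothetical.

from statistics import mean, stdev
from bisect import bisect_left


def summarise(ratings):
    """Return (mean, standard error) for one factor's ratings."""
    m = mean(ratings)
    se = stdev(ratings) / len(ratings) ** 0.5  # SE = sample SD / sqrt(n)
    return m, se


def percentile_rank(own_mean, comparison_means):
    """Percentage of comparison-group unit means falling below own_mean."""
    ranked = sorted(comparison_means)
    return 100.0 * bisect_left(ranked, own_mean) / len(ranked)


ratings = [7, 8, 6, 9, 7, 8, 5, 7]  # one factor, one unit (invented data)
m, se = summarise(ratings)
rank = percentile_rank(m, [5.2, 6.0, 6.8, 7.5, 8.1])  # invented benchmark means
```

A lecturer could then read, for example, that their unit mean of 7.1 sits at the 60th percentile of the comparison group, and weigh that against their own self-rating for the same factor.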
The quality cycle outlined above is considered appropriate and desirable for application in distance teaching and learning contexts, and the SEEQ instrument and process constitute a well researched and trialled system. However, for the purposes of its use in distance education, a number of important issues needed to be addressed.

Evaluating distance teaching and learning is problematic for several reasons. Firstly, much of the 'teaching' is developed well before individual students are known, and that 'teaching' is embedded in the study materials. Secondly, the staff who develop the teaching materials may not be teaching the unit, and those teaching may not be the same people who mark the assignments; there may be more than one tutor and more than one marker per unit, and individual staff roles within the teaching of one unit may not be clear. Thirdly, if the unit's study materials were mailed late, or the materials were below par, the students' responses may not truly reflect the quality of the tutor's performance. Distance education should therefore be seen as a method of mediated teaching and learning that requires particular and careful evaluation processes and instruments. As Cresswell and Hobson (1996) state: 'with distance education, problems are compounded because there is less opportunity for discussion to clear up possible misunderstandings'.

The SEEQ instrument has a set front page, with 9 factors and 31 questions, which cannot be altered in any way. Staff involved in reviewing SEEQ for distance purposes encountered many areas of ambiguity when the instrument was applied to distance teaching. Yet certain central SEEQ structures and processes could be included in the distance education instrument to ensure some level of 'equivalence' with SEEQ.

In developing the Distance Education Students' Unit Evaluation or DESUE, sample evaluation processes and questionnaires from six universities offering distance education were also reviewed, along with SEEQ-supplied supplementary questions that had been gathered from multiple sources (Marsh & Roche, 1995, p. 7). In designing DESUE, careful attention was given to the components of teaching effectiveness to be measured, and some distinctions between the teaching, the teaching materials and the assessment were made. It was accepted that poorly worded questions, inappropriate items, or heterogeneous items summarised as a single average would yield unreliable or uninformative data.

DESUE focuses on five factors:

These factors translate into 28 questions, to which students indicate the extent of their agreement or disagreement by circling numbers on a 9-point scale anchored by the labels Strongly Disagree, Disagree, Neutral, Agree and Strongly Agree.

The SEEQ quality cycle outlined in the four steps above (Harrison, 1996) was considered pivotal to the DESUE, and the first-stage pilot of the instrument was offered in second semester, 1996.

DESUE, like SEEQ, is not simply an appraisal instrument but a self-development package comprising:

Administering DESUE

Teaching staff electing to pilot DESUE are sent a copy of the DESUE form that will be sent to students and a copy of the Tutor Self-Rating Survey which enables tutors to record their own judgements about their teaching, the study materials and study processes of a particular unit and compare these with the students' perceptions. This self-assessment is seen as an important component of the reflective practice and self-development approach embodied in DESUE. Tutors are reminded that the results will remain strictly confidential and that only they will be given copies of the ratings and summary reports.

Tutors are informed that they cannot alter any of the DESUE items on the front of the survey form but that they may devise their own rating items by developing the 'Tutor Supplied Items' section. All questions use the same 9-point response scale.

If tutors decide to develop their own 'Tutor Supplied Items', these are printed on a separate sheet and copies made for each student. Tutors are advised to provide written instructions on the 'Tutor Supplied Items' form asking students to respond to each item in the appropriately numbered answer space.

The preferred time for administering DESUE is during the final week before exams. If this is not possible, the survey should be administered as closely as possible to the final week to maximise the reliability of students' responses. As administration of the DESUE is carried out by TLG, tutors need to inform TLG two or three weeks beforehand. TLG will then post the DESUE forms and the necessary supplementary information, including any 'Tutor Supplied Items', to students.

On completion of the survey

The surveys are processed by TLG using SPSS. Summary Reports, together with a guide to interpreting the students' evaluations, are then forwarded to staff along with a selection of targeted Teaching Strategy Booklets.

As DESUE was still being piloted during second semester, 1996, no feedback from tutors using the form has yet been received.

Conclusion

Distance education teachers now have a confidential facility to evaluate their teaching effectiveness and that of the study materials and study processes within a distance unit. As DESUE is itself evaluated and eventually validated, distance teachers will be able to include the results of the student evaluations in their teaching portfolios for promotion purposes and for appraisal purposes generally. DESUE can also be used as part of a self-development package. However, it is important to note that DESUE is only one way of collecting and analysing feedback from students, and that data collected from the routine DESSE survey (Boyd, Fox & Herrmann, 1996), as well as individual communications between teachers and students, help to construct a fuller and more useful picture of our distance students' views and needs.

References

Baker, R. G. (1993). Valuing university teaching and learning: Academic staff perceptions. A report prepared for the University Academic Board: Teaching and Learning Advisory Committee. Perth: Faculty of Education, Curtin University of Technology.

Boyd, A., Fox, R. & Herrmann, A. (1996). An investigation: using student feedback for staff development. In Leong, S. and Kirkpatrick, D. (Eds.), Different approaches: Theory and practice in higher education. Proceedings of the 1996 Annual Conference of the Higher Education Research and Development Society of Australasia (HERDSA), (pp. 58-60). Perth: HERDSA.

Cresswell, R. and Hobson, P. (1996). Fallacies and assumptions in the use of student evaluation of distance education teaching materials. Distance Education, 17(1), 132-144.

Fox, R., Boyd, A. and Herrmann, A. (1995). Attitudes and access to technology in distance education. In L. Summers (Ed.), A focus on learning. Proceedings of the Teaching Learning Forum'95. (pp.89-93). Perth, WA: Edith Cowan.

Gibbs, G., Habeshaw, S. and Habeshaw, T. (1989). 53 interesting ways to appraise your teaching. 2nd Ed. Bristol: Technological and Educational Services.

Harrison, A. (1996, July). Enhancing the Student Evaluation of Educational Quality (SEEQ). A memo prepared for staff involved in stage 2 pilot of the SEEQ at Curtin. Perth: Teaching Learning Group, Curtin University.

Herrmann, A. & Fox, R. (1992). Gender differences in the perceptions of distance education students. In Research Forum '92. Perth, Western Australia: Western Australian Institute of Educational Research.

Latchem, C. R. (1996). Student Evaluation Of Teaching At Curtin University: Piloting The Student Evaluation Of Educational Quality (SEEQ). A report to the Promotions Review Committee, The Teaching and Learning Advisory Committee, and University Academic Board. Perth: Teaching Learning Group, Curtin University.

Marsh, H. W. & Roche, L. A. (1992). The use of student evaluations of university teaching in different settings: The applicability paradigm. Australian Journal of Education, 36, 278-300.

Sochacki, R. (1996, November). Enrolment Monitor of external load by course and unit base. A report to the Teaching Learning Group. Perth: University Planning and Statistics, Curtin University.

Please cite as: Fox, R., Boyd, A. and Herrmann, A. (1997). What form should student evaluation of distance teaching take? In Pospisil, R. and Willcoxson, L. (Eds), Learning Through Teaching, p106-110. Proceedings of the 6th Annual Teaching Learning Forum, Murdoch University, February 1997. Perth: Murdoch University. http://lsn.curtin.edu.au/tlf/tlf1997/fox1.html

