As part of a CUTSD grant, we developed a video package to encourage primary teacher education students to think carefully about the many dimensions of teaching and learning in school classrooms, such as questioning, sequencing activities and making judgements about children's understandings. Our evaluation of the first cohort of students to use the material was based on the range of their responses to classroom situations, which we classified on levels ranging from superficial to deeper reflective thinking. Given the varying levels of interaction possible with the material, we have had some success in meeting our objective, but we found we need to provide more support to develop students' skills in higher level thinking about teaching and learning processes. Considering ways to encourage such higher level thinking led to modifications in the way the video was used with students in 1999.
Australia-wide research supports our view that teachers are uncomfortable with science content, feel untrained, tend not to include science as a key subject, and often teach it from a textbook rather than through hands-on practical activities (Smith and Neale, 1989). There is growing support for the view that an effective way to address this problem is to examine the links between teachers' own substantive content knowledge and their pedagogical content knowledge. The focus on pedagogical content knowledge has been stimulated by the proposal that unless teachers have scientific models to contrast with student models, they are unlikely to be able to foster their students' conceptual change. This process of transforming subject matter knowledge into a form which makes it teachable to a particular group of children (Geddis, 1993) is the key to our proposal. We are supported by a widespread trend towards a focus on content in primary science (Kruger and Summers, 1989), as seen in the Oxford University Primary School Teachers and Science (PSTS) Project. The choice of a video format in a higher education context is well supported (Davis, 1993; Laurillard, 1993; Slaughter, 1990).
A major project based at the Open University (Tresman and Fox, 1994) also tackled the problem of primary science, with a focus on distance learning using the BBC broadcast system. These materials do not translate well into the Australian context and, more importantly, do not include in their scope a strong linkage between content and process which would directly relate improvements in the quality of student-child interaction to improvements in the individual student teacher's science knowledge. We wish first to establish a powerful need for, and interest in, extending personal science knowledge, stimulated by practical realities. The workbooks accompanying the UK material bear too strong a resemblance to high school science textbooks, which we wish to avoid in our presentation, as we have found high levels of resistance among our students to working with materials they associate with high school failure and disinterest.
Current materials do not sufficiently support student development in two areas that are critical to their success as primary teachers: they do not actively promote development of their personal science content knowledge beyond its present level; and they do not provide focused and personalised feedback on the success of their interactions with children while discussing science activities.
Why did the bulb light up? is an instructional, interactive video (90 mins) funded by CUTSD and designed to tackle a perceived problem with existing unit materials for external (off-campus) students in primary teacher education.
Our approach to countering this problem was to provide video footage of student interactions with primary age children engaged in science activities such as floating and sinking, layers of liquids, and the chemical energy in batteries and circuits.
These were presented as case studies in video format. Each case study was supported by a written commentary from a university tutor, or video of the students talking about the science concepts and principles involved (the content), and the processes of the interactions. Off-campus students watched these case studies, and commented on what they saw. They were able to extend their content knowledge by reading linked information on the concepts and answering questions, then view the video again and comment on how well the student teacher handled the content aspect of the interaction.
Tutor commentary included in the video guided their reflections on the quality of the interaction from the point of view of questioning technique, appropriateness of response from the students, evidence of changes in the child's understanding, and making valid judgements about children's science concepts.
We have found that critically analysing the interactions of others in a guided fashion is a very powerful way for students to develop their own skills. Of course this happens in the school experience component of their teacher education course, but frequently not with a science content expert, which significantly limits what student teachers learn about science specifically.
Students are generally loath to make comments that reveal problems they are having with the science content, or how this is reducing the effectiveness of their discussions and questioning of the child. If tutors can see the student-child interaction, they can give much more personalised comments specific to the interaction, link these precisely to instances in the video, and respond to strengths and problems that are often not included in existing written assignments.
To link the method of instruction to the method of assessment, external students were asked in the examination to analyse a transcript of a small group science lesson. They demonstrated their content knowledge by commenting on the stage of the child's conceptual development, and their interaction skills by commenting on the strengths of the student's interaction with a child and suggesting alternatives to any less appropriate aspects.
It seemed to be like a waste of time as the students wouldn't learn the correct parts of the tongue.
I think it would be better to encourage students to analyse and interpret their own results.
So in terms of learning outcomes, younger children are learning to investigate. ... Older children practise investigating a problem looking for evidence.
I felt that one teacher was able to engage the students better. I think the learning was more beneficial for the first group than the 2nd.
I believe these students fit into Piaget's concrete operational stage of development where they need to experience something in order to be able to explain it.
You have to be careful about the concept of 'science' that you are portraying - the idea of some authority 'out there' that the students are inadvertently being compared against is disturbing.
Children will also learn that by talking about the issues before carrying out the experiment they have a context set for their learning, a reason for doing the experiment, that investigating different ideas is a way to learn and construct new knowledge.
|       | External students (n = 29)                          | Internal students (n = 47) |
|       | Transcript 1 | Transcript 2 | Exam (Transcript 3)  | Exam                       |
| Range | 1 to 3       | 1 to 3       | 1 to 4              | 1 to 4                     |
There was an overall improvement in the external students' responses in terms of the levels demonstrated, from a mean of 1.64 at Transcript 1 to 2.30 at the end of the course. The external students had a slightly higher mean score than the internal students on the exam transcript. The range of responses also widened, with more external students demonstrating Level 4 responses at the end of the course than in their first assignment. However, the average level was Level 2 for both internal and external students.
Comparison of internal and external students (see Table 2) was through analysis of the examination scripts (Transcript 3). The number of responses at each level for the 27 external students (with corresponding numbers for the 47 internal students in parentheses) was: Level 1: 1 (7); Level 2: 18 (26); Level 3: 7 (10); and Level 4: 1 (4). It would seem the video supported external students to move into Level 2, as the proportion of external students at Level 1 (3.4%) at the end of the course was lower than that of internal students (14.8%).
|         | External students (n = 29) | Internal students (n = 47) |
|         | Exam (Transcript 3)        | Exam                       |
| Level 1 | 1                          | 7                          |
| Level 2 | 18                         | 26                         |
| Level 3 | 7                          | 10                         |
| Level 4 | 1                          | 4                          |
Most students in both groups demonstrated an awareness of teaching and learning of science with primary children at Level 2. The emphasis in the course on Student Outcome Statements seemed to influence the type of responses students gave. The internal students had more scores at the highest and lowest levels.
Generally students found the videos helpful, particularly as they showed them what was wanted from the course, and provided clear ideas on a starting point to develop their own lesson.
Analysis of how students commented on the transcript during assignments and the examination showed that most responses remained at early levels of development of reflective practice.
Student teachers tended to focus on the surface features of the lessons, such as where the children sat, whether there was enough equipment, and whether the teacher was organised. Too few were able to comment at more sophisticated levels, such as on the impact of the lesson on the children's learning in science, the use of questions to draw out children's understandings, or the conceptual development of children as shown through their own questions.
Analysis supported the observation that the nature of the questions in each of the items, in both the examination and the assignments, did not encourage higher level analysis of students' views about science or about issues in science teaching. As a result, only exceptional students achieved the higher levels.
Continued monitoring of the project through student feedback is a key feature, as we believe its success will lie in part in what our students tell us, so we will continue to collect students' feedback about the support the video provides.
Clearly, the project has been successful in terms of encouraging greater student engagement with important teaching-learning concepts. However, we also need to do more to encourage higher level thinking. Reworking the assignments for external students has led to an increase in the types of questions that may prompt Level 3 responses. For example:
T:  Is that all right? OK, so what have you understood about circuits?
S3: They need electricity to, to run.
T:  What about, how, how can you make a circuit, yes?
S2: They have to be all joined together.
T:  That's right, it has to be joined together correctly, all right (?). And what about a battery, what does a battery have, has?
T:  Good. Good girl. It has stored energy sitting in there, ready to be used... all right... and what is your task Matthew?
S5: To make the, um, bulb go, um... (T: Great) turn on...
T:  And what are you doing, you're experimenting to...?
S2: To see if... some materials work..
T:  If they will conduct electricity?
Geddis, A. N. (1993). Transforming subject-matter knowledge: The role of pedagogical content knowledge in learning to reflect on teaching. International Journal of Science Education, 15(6), 673-683.
Kruger, C., & Summers, M. (1989). An investigation of some primary teachers' understanding of changes in materials. School Science Review, 71(255), 17-27.
Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational technology. London: Routledge.
Slaughter, T. M. (1990). Teaching with media. University of Melbourne: CSHE.
Smith, D. C., & Neale, D. C. (1989). The construction of subject matter knowledge in primary science teaching. Teaching and Teacher Education, 5(1), 1-20.
Tresman, S., & Fox, D. (1994). Reflections into action: Meeting the in-service needs in primary science. British Journal of In-Service Education, 20(2), 231-244.
Please cite as: Schibeci, R., Hickey, R. and Speering, W. (1999). How do we encourage higher level thinking in students? In K. Martin, N. Stanley and N. Davison (Eds), Teaching in the Disciplines/ Learning in Context, 360-366. Proceedings of the 8th Annual Teaching Learning Forum, The University of Western Australia, February 1999. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1999/schibeci.html