Evaluating Classroom Practices Using Qualitative Research Methods:
Defining and Refining the Process
Barbara Burks Fasse & Janet L. Kolodner
College of Computing
Georgia Institute of Technology
Atlanta, GA 30332-0280
{bfasse, jlk}@cc.gatech.edu
Formative evaluation of a curriculum development project is a complex undertaking. Understanding the success and potential of a curriculum unit requires more than documentation of what students have learned (or not learned) and how well they can use what they've learned. In order to refine a curriculum unit appropriately, we also need to understand what was responsible for those results. This requires making a mapping between the intentions of the curriculum design, the ways those intentions were enacted, and the results that accrued. It requires, as well, understanding the affordances intended by the designed learning environment, and investigating, for each, whether it was present, how easy it was to recognize, how easy it was to use, and, if it was not used, why not. We need to know how a teacher taught the material: her/his style, approach, methods, and rapport with students. We need to know how receptive the students were and what was going on in the classroom besides those things we expected. Such understanding is essential both to refinement of individual curriculum units and to development of principles for the design of effective learning environments.
Prolonged engagement and extensive observation are central to gaining an in-depth understanding of a classroom. This task calls for qualitative methodology. While it is labor intensive and requires patience as the emergent design and its results unfold (Fasse, 1993), documentation of a classroom's context is invaluable to the progress and success of implementing a curriculum design in real-world classrooms. But this kind of analysis requires methods that go beyond the tools of any one methodological approach. Ethnographic methods, for example, can be used to help us understand the social interactions in the environment and the affordances made available and ignored or made use of. But an orthodox ethnographic report, i.e., "written cultural description" (Spradley, 1980), is time-consuming and inappropriate for focused analysis of the intentions built into the design of an environment at this stage of development. Comprehensive understanding of the ins and outs of the learning environments we develop requires a melding of several well-known methodologies, along with development of strategies and tactics for data collection and analysis that allow us to identify essential features of a learning environment without spending all of our resources on evaluation.
We present our methodology-in-progress and how we got to it. Our evaluation is of several Learning by Design units. In Learning by Design (Kolodner et al., 1998), middle-school students learn science through a design approach. A posed design challenge (e.g., design a propulsion system for a miniature vehicle that will allow it to go over two hills; design a way of managing the erosion on a coastal island) provides students with motivation for inquiry. Attempts to address the design challenge are interleaved with investigative activities, allowing students to refine their understanding of key concepts, their ability to carry out important science processes, their ability to be planful, communicative, collaborative, and reflective, and their solutions to the design challenge, all at the same time. A system of classroom activities, informed by case-based reasoning (Kolodner, 1993), problem-based learning (Barrows, 1985), communities of learners (Brown), and cognitive apprenticeship (Collins, Brown, Duguid), is designed to promote learning and to acculturate students into an environment that values sharing of ideas, investigating for the purpose of informing a community, informed decision making, justifying based on evidence, building on what others have done, and critical evaluation.
Currently, in fall 1999, Learning By Design is up and running in seven schools in several geographically and demographically diverse counties in the Atlanta area. We are running a field test of two physical science units and piloting a series of earth science units. The physical science and earth science units are at different levels of development. The field test for the physical science materials is at the polishing level. Although interested in the specifics of what's working, we are focusing on learning issues in these classrooms. The earth science materials are being newly piloted. Our evaluations in those classrooms are aimed at determining what does and doesn't work and how we could make the materials better. In both efforts, we put attention into teacher development issues: what allows teachers to be successful LBD implementers.
Current Qualitative Evaluation Methodology
Our evaluation methodology is quite intricate, though in the aggregate, we are using case study design to answer our "how" and "why" questions (Fasse, 1993; Merriam, 1988; Yin, 1984). Since case study design does not lay claim to methodology unique to itself, we are drawing from standard qualitative methods such as participant observation, interviews, and videotaped accounts (Fasse, 1993). They are being used in separate though intersecting components of the research project. In one of the case study components, for example, two student ethnographers are visiting physical science classrooms twice a week to understand the experience of LBD through the eyes of two groups of students. What do they experience as students? What kind of help from the teacher (and from peers) contributes to their success? What confuses them? How does their understanding progress? How well are they working together, and what kinds of extra help do they need to work together well? We carry this out in two classrooms using our well-developed units; one teacher is quite proficient, and one is still learning. We want to understand the affordances provided by our materials and by the teacher for the students. We are learning from this, as well, some of the affordances our materials provide and don't provide for teachers.
But such detailed evaluation is inappropriate for our under-development earth science units and too time-consuming to use across all of our physical science classrooms. On the other hand, we need to understand how different teachers with different styles make the affordances of LBD available to students, how students are responding, student levels of engagement, what's difficult for students, what teachers do to make those difficult things doable in some classes, and so on. For this, we are following four strategies. First, we've developed two observation instruments to help observers focus their observations in all of the classrooms. While this flies in the face of qualitative methodology, we do have a practical need to make sure that our untrained observers include the taken-for-granted world in their notes. We visit each teacher at least once a week for observation. We observe more frequently those teachers from whom we feel we can learn specific things. Second, we interleave thick description (Geertz, 1983) from our observations with description derived from video documentary. There are not enough of us to send two people at a time to classrooms, and we learned last year that when observers are charged with the task of simultaneously taking field notes and video recording, one suffers at the hand of the other. Because we need both extensive thick description and video evidence of what we are observing, we video in our classrooms once every three weeks in place of field notes. Third, we meet every two weeks for the purpose of triangulation (Goetz & LeCompte, 1984; Lincoln & Guba, 1985; Measor, 1985; Merriam, 1988; Spradley, 1980), that is, to review what we are seeing, draw out what we are learning, provide advice for curriculum developers, and refine our observational strategies. Fourth, we meet with our teachers in focus groups every six weeks to learn what works and doesn't work in their classrooms and to allow them to share their experiences with each other.
Observation Instruments: The two instruments we've developed have different functions. The Immediate Indicators Tool (IIT) (Fasse, Holbrook, & Gray, 1999) is designed to help observers record a quick "snapshot" of the modality of the classroom environment. Observers make a judgment and place an "X" along a continuum between discrete items we've identified as indicators of classroom environment (e.g., "Displays in the classroom are: All student made vs. All purchased"; "Materials are dispensed by: Teacher on request only vs. Self-serve, student managed"). All items are checklist-type, and the user is instructed to briefly justify, explain, or describe their notations after leaving the classroom. The purpose of this document is to remind the observers that everything in the classroom/school environment is data, not to be dismissed as minutiae.
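To make the form of such an item concrete, the following is a minimal sketch of one hypothetical way a continuum item and an observer's notation could be encoded for later analysis. The field names and the 0-10 scale are our illustrative assumptions; the IIT itself is a paper form marked by hand.

    from dataclasses import dataclass

    @dataclass
    class ContinuumItem:
        # One IIT-style indicator: a continuum between two opposing
        # classroom descriptors. The 0-10 scale and field names are
        # illustrative assumptions, not part of the actual instrument.
        prompt: str              # what the observer is judging
        left_pole: str           # one extreme of the continuum
        right_pole: str          # the opposite extreme
        rating: int = 5          # observer's "X": 0 = left pole, 10 = right pole
        justification: str = ""  # brief note written after leaving the classroom

    # The two items quoted in the text:
    displays = ContinuumItem("Displays in the classroom are:",
                             "All student made", "All purchased")
    materials = ContinuumItem("Materials are dispensed by:",
                              "Teacher on request only",
                              "Self-serve, student managed")

    # A hypothetical observer's notation for one visit:
    displays.rating = 2
    displays.justification = "Walls show student whiteboards and design sketches."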
The other observational tool, the Observational Prompt Tool (OPT) (Holbrook, Gray, & Fasse, 1999), is an exhaustive, detailed list of LBD elements that can be used as a reminder or tutorial to help the observer focus his/her field notes. It prompts for what to look for during individual, small-group, and whole-class activities, what to look for during particular kinds of activities (e.g., gallery walks, messing about, whiteboarding), and what to look for when certain goals are active (e.g., generating questions for inquiry, investigation). Table 1 shows two selections from the Observation Handbook: a selection of guidelines about what to look for in teachers' interactions with students and a selection about gallery walks (fancy show and tell). An observer would use both sets of guidelines while observing a gallery walk, focusing both on the mechanics of the gallery walk and on the teacher's use of questioning to help students learn from their own and peers' presentations.
Questioning:
- What are teacher questions about?
- What are student questions about?
- What question types are being used?
- Purpose of teacher questions?
- How does teacher deal with off-topic questions?
(Each question has a menu of types and a set of examples associated with it.)

Gallery Walks:
- Who initiates the session?
- Who displays the artifact?
- Who asks questions?
- Who gives feedback?
- In what ways is feedback constructive?
- Are comparisons made between groups or to previous work of the presenting group?
Table 1: Excerpts from the Observational Prompt Tool
Both instruments were developed as a direct response to previous implementations. We learned then how difficult it is for untrained observers to take useful field notes. We helped student observers learn what LBD was about, learn about observation, and learn what to look for. Nonetheless, their field notes were all over the place. When told to profile engagement, for example, the description would read "engagement is good, the students are listening to the teacher." These new tools were created, on the one hand, to help our observers focus and, on the other hand, to help them understand the kinds of things they ought to be documenting. Both instruments are used in LBD and non-LBD classrooms (we need to analyze what's going on in non-LBD classes to understand what special affordances LBD provides).
Video documentation: The importance of video recording is twofold. First, it provides an archive for substantiating and revisiting our findings. Second, the tape is useful for micro-ethnography. As stated above, we are dedicating some of our manpower exclusively to videotaping a single identified group in one classroom on the north side of town and one on the south side of town twice weekly throughout the run of the program. While our focus right now in evaluating those tapes is to understand LBD from the students' point of view, we expect to be able to glean much more from those tapes: documentation of teacher development, documentation of conceptual change in students, and so on. The taping being done once every three weeks in other classrooms provides our archive; by focusing the taping where the action is, we will have a variety of examples of teachers and students in several different configurations.
Classroom visitations: Our observation team includes trained ethnographers, practiced observers, student researchers, and a teacher liaison, all doing passive to moderate participant observation and interviews, both formal and informal (Goetz & LeCompte, 1984; Measor, 1985; Merriam, 1988; Spradley, 1980). The student researchers are undertaking the in-depth microgenetic analysis in two classrooms. One member of our ethnography team visits each classroom about once a month to understand its culture, but most of her time is going into twice-weekly observations of two of our most promising physical science teachers. Our other ethnographer does the same in the classroom of one more of our promising physical science teachers. These three teachers exemplify, for us, the best-intentioned novice LBD teacher. Each has a different kind of intuitive understanding of what LBD is, and each is a strong teacher, but all are beginning LBD practitioners, and the expertise among the three teachers is quite varied. Some know science better than the others do, some have experience focusing on science process, and so on. They get things mostly right, but in different ways. We've learned many things from these teachers about teacher development and about making LBD work. For example, the "rules of thumb charts" that we added to LBD recently are working well to draw connections between the design challenge students are working on and the science they are learning. These teachers show us how to make those charts work. We've learned, as well, that we need to help teachers be more deliberate in stressing the planful aspects of design and that we need to figure out a way of managing planning so that it combines hands-on work with materials and the cognitive work of designing. The other observers on the project are making weekly or biweekly visits to the remaining physical science classes and to the earth science classes, getting periodic "snapshots" of those classes and understanding what works and doesn't work in the earth science units. Our teacher liaison visits teachers in their classes once every two weeks and speaks to them every week on the phone to find out what is working and what isn't in their classrooms.
The data's audit trail includes hand-written field notes, the two observational instruments (IIT and OPT), expanded written accounts, transcriptions of audiotapes, and written summations of the videotapes (Lincoln & Guba, 1985; Spradley, 1980). We have experimented with and plan to continue using NUD*IST, a specialized database for qualitative research organization and analysis, as an aid to managing the assortment of data from multiple sources.
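To illustrate how such a multi-source audit trail stays queryable for triangulation, here is a minimal sketch of one way the records and a triangulation lookup might be modeled. It is not NUD*IST's interface (we use NUD*IST as a packaged tool); the record fields, codes, and example data shown are our assumptions for illustration only.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Record:
        # One audit-trail entry: field notes, an IIT/OPT form, an expanded
        # account, an audiotape transcript, or a videotape summary.
        source: str      # e.g., "field notes", "OPT", "video summary"
        classroom: str   # which class the record documents
        collected: date
        text: str
        codes: list[str] = field(default_factory=list)  # analytic codes

    def triangulate(records, classroom, code):
        # Pull every record about one classroom that carries a given code,
        # so a claim can be checked across independent data sources.
        return [r for r in records
                if r.classroom == classroom and code in r.codes]

    # Hypothetical example: do field notes and video summaries agree about
    # what happens during gallery walks in one classroom?
    trail = [
        Record("field notes", "PS-1", date(1999, 10, 4),
               "Students questioned each group during the gallery walk.",
               ["gallery walk", "engagement"]),
        Record("video summary", "PS-1", date(1999, 10, 18),
               "Tape shows peers offering design critiques at the whiteboards.",
               ["gallery walk", "feedback"]),
    ]
    evidence = triangulate(trail, "PS-1", "gallery walk")
    print(len(evidence), "records from", {r.source for r in evidence})

The point of the sketch is only that every claim raised in our debriefing meetings can, in principle, be traced back to dated records from more than one source.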
Weekly meetings: An invaluable element of our formative evaluation plan has been the triangulation that has occurred during regularly scheduled (weekly, as much as possible) debriefing meetings that include observers and curriculum designers. In the early stages of the pilot work, spring 1998, the weekly sessions served as a debriefing for the curriculum designers to learn about what was going on in the field and as an opportunity for the ethnographer to determine the next focus in the emergent design of the research itself. Later, during the first field test period, fall 1998, when we had multiple observers, the weekly meetings served as a venue for sharing observations among the larger group and for refining observation focus. This oral account of the observations proved to be of great value to the curriculum development staff (even though, as stated earlier, the field notes themselves were less useful than we would have liked).
Currently, we are observing in more classrooms than previously, and these meetings are serving several purposes: (i) observers are learning from each other, (ii) through comparisons of what we are seeing in different classrooms, we are able to form hypotheses about the kinds of teacher qualities that make for successful LBD implementation, (iii) those same comparisons help us to understand what needs to be included in our teacher development materials and workshops, (iv) we glean understanding of refinements needed in the curriculum or in the way we've written pieces of it, (v) we are learning about teacher development, and (vi) we can refine our observation strategies as needed. The data being reported for discussion fall into two categories: learning issues and practical matters. As we develop lists of both, we devote later discussions to each.
How We Got to Where We Are
From the earliest days of the LBD project, a qualitative component has been included to provide contextual understanding of the classroom for curriculum development team members. But the role of the qualitative component has changed over the course of the project. Early in the development of LBD, predating most of today's staff, an ethnographer was on board to provide a thick description of the culture (Gertzman & Kolodner, 1996). At that time, the program was morphing from problem-based learning (Barrows, 1985) into something more like its current LBD configuration. There were ideas and projects and mini-units for teachers to incorporate into their curriculum, but it had yet to become the comprehensive, freestanding curriculum it is today. A couple of pioneering classroom teachers were experimenting with adapting these projects to their settings (Hmelo et al., 2000). It was the ethnographer's task to educate the curriculum design team on the affordances and limitations of the classroom so that the project could be tailored to fit the real world. The ethnographic reports kept the design team's ideas and good intentions grounded in the reality of the classroom.
In spring of 1998, when we moved toward piloting our first units, the focus of the qualitative component shifted away from an ethnographic account to case study design. Four brave teachers were trying out units that we had designed based on experiences in those pioneering classrooms. Ethnography of schools and classrooms was no longer useful. Now the project called for the use of qualitative methods to monitor the day-to-day progress of the students and teachers as they put theory into practice. The ethnographer traveled the circuit between the four schools, gathering the "what's happening here" story and reporting it back to the design team in extensive field notes and oral narrative on a weekly, and sometimes daily, basis. Occasionally, members of the curriculum development team were also in the field taking their baby's pulse. There were two areas of focus: the practical and the theoretical. The data informed our knowledge of curriculum development, learning theory, teacher training, and teacher support. There was a need to know which ideas were working and which were flops, and under what circumstances each occurred. There was a need to know practical things like how the design of the j-hook impeded the trajectory of the vehicles and the role of wheel size in the success of the trials. There was a need to find out if and how LBD encouraged or enhanced learning. And then there emerged another, unpredicted though central, focus of observation: the role of the teacher in establishing the culture of the LBD classroom. It is from the ethnographer's observations during this first full-scale implementation that we began to develop a real understanding of the pedagogy required for the success of LBD in the classroom. Observable characteristics of teaching style and classroom management techniques were identified as predictors of success. These notions were then incorporated into the summer workshop for teachers and into the publication of the first LBD teacher and student texts, as well as being used to provide teacher support during the implementation.
By the fall of 1998, the LBD project advanced to the field test level as more teachers (8) joined. At this point, we provided textbooks for students and teacher handbooks for teachers. Our goals were to continue using the qualitative data to inform our growing knowledge of the practical, theoretical, and pedagogical aspects of LBD in the classrooms. However, there was an additional desire to record actual evidence of students and teachers experiencing the ah-ha moments that LBD is designed to encourage (e.g., engagement, reflection, science talk, case-based reasoning).
This is where we are today, though with more teachers (12) and with more units. Our current observations are targeted towards both piloting and field-testing. In our earth science classes, we are aiming to answer the questions we asked of our physical science units during spring 1998. In our physical science classes, we focus on understanding LBD's affordances and on documenting learning as it occurs.
We are, of course, integrating qualitative methodology with traditional quantitative assessments of what students know, performance assessments showing us what they are able to do, and analysis of embedded assessments: the documentation students create as they engage in LBD's activities. At each stage in our need to know, the qualitative component of the research has been restructured to fit the new needs, whether with an ethnographic narrative of the classroom culture, field notes from participant observation, or videotape. The development continues.
Acknowledgments
This research has been supported in part by the National Science Foundation (ESI-9553583), the McDonnell Foundation, the BellSouth Foundation, and the EduTech Institute (with funding from the Woodruff Foundation). The views expressed are those of the authors.
References
Barrows, H. S. (1985). How to design a problem-based curriculum for the preclinical years. NY: Springer.
Brown,
Collins,
Fasse, B. B. (1993). "No Guarantees": An ethnography of transition to parenthood in normative lifespan development. Doctoral dissertation, Georgia State University, Atlanta, GA.
Fasse, B., Holbrook, J., & Gray, J. (1999). Immediate Indicators Tool (IIT). Learning By Design Project document. Georgia Institute of Technology, Atlanta, GA.
Geertz, C. (1983). Thick description: Toward an interpretive theory of culture. In R.M. Emerson (Ed.), Contemporary field research (pp. 37-59). USA: Waveland Press.
Gertzman, A. & Kolodner, J. L. (1996). A Case Study of Problem-Based Learning in a Middle-School Science Class: Lessons Learned. Proceedings of the Second International Conference on the Learning Sciences, Evanston/Chicago, IL, July 1996.
Goetz, J.P. & LeCompte, M.D. (1984). Ethnography and qualitative design in educational research. Orlando, FL: Academic Press.
Hmelo, C.E., Holton, D.L. & Kolodner, J.L. (2000). Designing to Learn about Complex Systems. Journal of the Learning Sciences.
Holbrook, J., Gray, J., & Fasse, B. (1999). Observational Prompt Tool (OPT). Learning By Design Project document. Georgia Institute of Technology, Atlanta, GA.
Kolodner, J. L. (1993). Case-Based Reasoning. San Mateo, CA: Morgan Kaufmann.
Kolodner, J. L., Crismond, D., Gray, J., Holbrook, J., & Puntambekar, S. (1998). Learning by Design from Theory to Practice. Proceedings of the Third International Conference of the Learning Sciences '98, pp. 16-22.
Lincoln, Y.S. & Guba, E.G. (1985). Naturalistic inquiry. CA: Sage.
Measor, L. (1985). Interviewing: A strategy in qualitative research. In R.G. Burgess (Ed.), Strategies of educational research (pp. 55-77). Philadelphia, PA: Falmer Press, Taylor & Francis.
Merriam, S.B. (1988). Case study research in education: A qualitative approach. CA: Jossey-Bass.
Spradley, J.P. (1980). Participant observation. NY: Holt, Rinehart & Winston.
Yin, R. K. (1984). Case study research: Design and methods. CA: Sage.