GVU Center, College of Computing,
EduTech Institute & Office of Information Technology
Georgia Institute of Technology, Atlanta, GA, USA
NEC Kansai C&C Research Laboratory, Osaka, JAPAN
To test this hypothesis, we initiated the Classroom 2000 project at Georgia Tech. The Classroom 2000 project is applying a variety of ubiquitous computing technologies -- electronic whiteboards, personal pen-based interfaces, and the World-Wide Web -- together with software to facilitate automatic capture and content-based access of multimedia information in the educational setting of the university classroom. The goal of the project is to evaluate and understand the effect of ubiquitous computing on the educational experience, both in terms of how it improves current practice and suggests new forms of education. This paper reports on our initial experiences developing and using a number of classroom prototypes.
Successful uses of ubiquitous technology will fundamentally alter some forms of education, but in ways that are hard to predict. To begin to explore the effects of such technology, our approach is to introduce novel technologies gradually into the traditional classroom lecture setting and see what happens. We do not intend to replace the traditional lecture-based style of pedagogy, at least not initially. However, much of the information in a lecture is inefficiently recorded or lost, and we can use simple capturing techniques to improve the situation. The result is a more complete record of what occurred during the lecture in a form that can be reviewed more easily.
Our work has been greatly influenced by the work at Xerox PARC on ubiquitous computing [18, 19] and on tools to support electronic capture and access of collaborative activities [9, 10]. We wanted to capture information provided by the teacher during a lecture, so the electronic whiteboard capabilities of the Xerox LiveWorks LiveBoard [4] were inviting. We also wanted to provide the students with an electronic notebook for taking notes during class that could serve as the basis for review afterwards. The Marquee note-taking prototype developed at PARC [17] and the Filochat prototype developed at Hewlett-Packard Labs [20] both came close to what we wanted to put in the hands of the students. Marquee provided a simple mechanism for producing notes with a pen-based interface that also created automatic indexing into a video stream. Filochat used a pen-based PC to capture electronic annotations that served as indices into a digital audio stream. We have also investigated paper-based solutions to note-taking, similar to the work done by Stifelman [14]. The implicit connection between the note-taking device and alternate information streams (audio and/or video) is a common theme that has also been explored at MIT's Media Lab [7] and at Apple [3].
With the availability of ubiquitous information technologies, such as the World-Wide Web, most universities are able to provide students with access to vast repositories of educational materials. It is quickly becoming the norm for individual courses at many universities to have their own Web page that serves as a central clearinghouse for all course documentation. While this use of the Web has some obvious advantages for both instructor and student, it does not take an active role in assisting learning and teaching. We wanted to view the whole classroom experience as a multimedia Web authoring task and provide ways to capture and relate information before, during and after an actual classroom session. A pivotal design decision for the Classroom 2000 prototypes, one that ruled out some existing solutions, was to make all information accessible via the Web. The hardships incurred by this decision were far outweighed by the ease of cross-platform distribution, a must for our student population. This more active use of the Web infrastructure is in tune with some recent applications of WWW technology in education [8, 12, 5]. Our major contribution beyond this existing work is the concentration on in-class capture of information that is then augmented through coordination with other classroom information via the Web.
Several of the research prototypes cited above have been subjected to some form of evaluation to determine both usability and usefulness. The two most substantial evaluation studies have been conducted at PARC and Hewlett-Packard. For a two-year period at PARC, a suite of tools for capture, automated indexing and integration, and access [9] was used to support a process of intellectual property management [10]. At Hewlett-Packard, the Filochat system was evaluated in field and laboratory studies, generating both qualitative information on the reaction of users to the technology and also quantitative information comparing the accuracy, efficiency and confidence of Filochat with paper-based and Dictaphone systems [20]. The initial evaluation we report in this paper is based on a 10-week experiment in a live classroom and provides both quantitative and qualitative evaluation of student reaction to the technology. This evaluation is more of a feasibility study than a proper usability or utility study as was done at PARC.
Many researchers investigate the effect of technology in education. There is an important distinction for research in this area, based on whether the research is focused on education or on technology. We have taken a technology focus in our work so far, as evidenced by the way we describe our work and evaluate its impact. Before we can honestly assess the educational impact of ubiquitous computing technology, we must first develop robust, though not necessarily perfect, prototypes that have been tested in real environments. This is a serious challenge, especially when dealing with off-the-shelf ubiquitous computing technology that is anything but robust. Our technology-driven approach to educational technology contrasts with a more education-driven focus, as demonstrated by Wan and Johnson [16] or by Guzdial et al. [6], in which the purpose of the research is to understand and inform theories on learning. In the wider arena of educational technology, there must be both forms of research.
We do not suggest that this is a complete categorization of teaching styles, and we also recognize that some teachers may choose to combine teaching styles within a course or even within a single class session. Attempting to provide support for each of these teaching styles in simultaneously developed prototypes allows us the opportunity to identify general features of an ideal system that can support all classes.
Different teaching styles provide more or less support for the various recording styles. For example, students who highlight probably receive good support in a presentation-style lecture, since they can annotate their own copy of the slides during the lecture.
We built three separate prototypes to suit different teaching styles and to allow us to experiment with different technology in the hands of the students and teacher. One prototype used Apple MessagePads for note-taking and another used pen-based PCs. One class used an electronic whiteboard (the LiveBoard) while others simply used a projector attached to a workstation that the instructor used to display slides or notes.
To control the engineering problem of designing and maintaining distinct prototypes, we devised an overall common architecture or organization that each prototype would obey. The inspiration for this common architecture came from the movie industry, in which the development of a single movie is divided into three distinct phases -- pre-production, live recording and post-production. Table 1 summarizes the main differences between the prototypes by the activities and technology used in the various phases of production. We will now describe what each of these phases means in the context of our development.
Table 1: Summary of the prototypes by the activities and technology used in each phase of production.
 | Human-Computer Interaction | Artificial Intelligence | Future Computing Environments |
teaching style | presentation | public notes | discussion |
enrollment | 25 grad students | 60 undergrad CS majors | 15 grad students |
live recording (teacher) | ClassPad on LiveBoard captured navigation and annotation (Fig. 1) | LCD projector to display Web notes; no capture | LCD projector to display outline and Web pages; no capture |
live recording (students) | ClassPad on pen-based PC captured navigation and annotation (Fig. 1) | paper notes; no capture | outline annotator on MessagePad to capture outline entry notes (right side of Fig. 1) |
live recording (classroom) | single digital audio stream recording | single analog audio-video stream recording | single analog audio-video stream recording |
post-production | log file, annotated slides and keyword text used by PERL script to create audio-enhanced, searchable Web notes (Fig. 2) | audio and video links added to HTML notes manually (Fig. 3); video digitized to QuickTime packets | PERL script transforms Newton data into audio-enhanced outline with notes (Fig. 3) |
We currently support only the presentation of static information within the lecture. That means we support lectures that include writing on a whiteboard, using overhead transparencies or slides. We do not support the presentation of videos or other dynamic information, such as a computer simulation. Support for dynamic information is a future consideration.
Teachers are already overworked, so we were very concerned with minimizing the impact the technology had on the preparation of lecture materials. Most teachers already have some form of lecture material that they will want to reuse. The more we required a lecturer to recreate that information, the less likely they were to adopt the Classroom 2000 technology. To minimize preparation effort, we adopted several strategies.
For one class, Human-Computer Interaction (HCI), a presentation-style class, we adopted PostScript as the universal representation for material prepared by the teacher. We used public domain filtering programs to transform the PostScript file into whatever image format was necessary for use in the class. In another class, an introduction to Artificial Intelligence (AI), a public-notes-style class, the notes were in the form of a LaTeX document and were to be presented in class and made available to students via a Web browser. We used existing filters to convert the LaTeX source to HTML. The third class was a seminar on Future Computing Environments (FCE). Since it was a discussion-style class, a template file was provided for discussion leaders to prepare an agenda for the class. The completed template was then automatically converted into a format (Newton Book) for our own Apple MessagePad note-taking program.
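As a concrete illustration, the kind of pre-production filtering described above can be wrapped in a short script. The Perl sketch below is ours, not the project's actual tool chain: it assumes ImageMagick's convert utility (which invokes Ghostscript for PostScript input) is available, and the directory and file names are illustrative only.

#!/usr/bin/perl -w
# Hypothetical pre-production sketch: turn each prepared PostScript slide
# (slides/*.ps) into a GIF that ClassPad and ordinary Web browsers can
# display.  Assumes ImageMagick's `convert' (which calls Ghostscript for
# PostScript input) is on the PATH; names are illustrative only.
use strict;

foreach my $ps (glob("slides/*.ps")) {
    (my $gif = $ps) =~ s/\.ps$/.gif/;
    # Skip slides whose GIF is already newer than the PostScript source.
    next if -e $gif && -M $gif < -M $ps;
    system("convert", $ps, $gif) == 0
        or warn "conversion failed for $ps\n";
}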
We used cameras and microphones placed at various locations around the room to capture one or more video and audio streams, all of which were eventually digitized. This recording required no extra effort on the part of the participants.
None of the existing applications we had available for the LiveBoard allowed us to easily log pen events and convert the resulting annotated slides into a form (e.g., GIF) that was easily displayed on all Web browsers. Other similar commercial products suffered the same limitation. As a result, we had to write our own electronic whiteboard application, a Visual Basic prototype called ClassPad, whose interface is shown on the left side of Figure 1. We used ClassPad in the presentation-style HCI course to present prepared slides and allow for public annotation by the teacher. ClassPad preserves all annotations made to a series of prepared slides. In addition, ClassPad creates a time-stamped log of when the user navigates between slides and when each slide was annotated with the pen (defined as a pen-down followed by a pen-up sequence). This captured information is used in the post-production phase described in the next section.
We developed ClassPad for a presentation teaching style, but it is also appropriate for private notes or discussion-style classes in which the teacher simply wanted to have a blank surface upon which to write. In fact, several lectures in the HCI class used the private notes style. However, ClassPad was not appropriate for the public notes teaching style of the AI class, because the notes were displayed as Web pages. There is a serious registration problem that must be solved by the developer of the capture tool in order to provide persistent pen annotations for a markup language such as HTML. The rendered image (including the HTML text and any associated pen annotations) depends on characteristics of the bounding window, so it is much more difficult to register the pen annotation with the underlying text. Because of this difficulty, we did not capture pen annotations in the AI class. The only captured data in that class were the audio and video streams.
Figure 2: Frame-based Web presentation of lecture notes with annotations and audio links. This figure shows the notes made by the teacher for an actual lecture in the HCI class. Student notes are similar in appearance. The top frame is a sequence of thumbnail images of all slides for that lecture. The user selects one thumbnail image, and the full-sized image is shown in the lower right frame. The lower left frame contains a list of keywords associated with the slide (none shown here), the automatically-generated audio links representing each time the slide was entered during the lecture (one time in this example), and a link to a form that allows keyword searching across all slides for the entire course.
We also used ClassPad on pen-based PCs, resulting in the electronic student notebook. This version of the electronic notebook was suitable for both the verbatim and highlighting modes of note-taking, but it was designed with the highlighting student note-taker in mind. The student would see the same information that the teacher put up on the electronic whiteboard and could annotate it with personal comments to make certain points clearer, as shown on the left side of Figure 1. Students could flip through the class notes the same way the lecturer did (though the units were not synchronized, so the student was free to browse the slides at their own pace) and write whatever notes they wanted on top of the slides. The navigation between slides and student annotations were logged by ClassPad.
We also investigated the use of smaller PDA-style electronic notebooks, such as the Apple MessagePad. The MessagePad's resolution made it infeasible to use the same philosophy of note-taking used in ClassPad. The prepared slide images would not have been legible, and there would have been little space on the screen for taking notes. Instead, the MessagePad note-taking application, shown on the right side of Figure 1, provides an outline of the lecture. A time-stamped note (called a "slide" in the actual interface in Figure 1) is associated with each entry in the outline. Touching an entry with the pen causes its note to appear (there is one note available per outline entry) and the student then writes in the note. This outline note-taker logs the first time each note was revealed.
Recall that the ClassPad application generates a log of when the teacher or student advances from one slide to the next and when an annotation was made. When reconstructing the annotated views for later review, these logged events are used to tie the static information (prepared slides with student/teacher annotations) to the audio or video stream associated with that class. Figure 2 shows an example of an automatically-generated Web presentation from a single lecture in the HCI class with audio-enhanced links. The top frame shows thumbnail images of all slides from the lecture. The selected thumbnail image is magnified in the lower right frame. The lower left frame is divided into three main sections: a keywords section shows words associated with the slide to facilitate content-based search; an audio section lists automatically-generated audio links indicating times in the lecture when that slide was visited; and a search link provides access to a search form for simple keyword search across all lecture notes. When an audio link is selected, an audio client is launched and begins playing the recorded lecture from that point in the lecture. We built our own streaming, indexable audio server and client players for this purpose.
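The following Perl sketch conveys the flavor of this post-production step. It is not the actual Classroom 2000 script: the log format shown in the __DATA__ section and the audio-server URL scheme (a start parameter giving an offset in seconds) are assumptions made for illustration, since the paper does not specify either.

#!/usr/bin/perl -w
# Hypothetical post-production sketch: read a ClassPad-style log of
# time-stamped slide-entry events and emit, for each slide, the HTML
# audio links that start playback of the lecture recording at the
# moment the slide was entered.  Log format and URL scheme are
# illustrative, not the actual Classroom 2000 ones.
use strict;

my $audio_base = "http://example.gatech.edu/audio/hci-lecture12";
my %entries;                    # slide number -> list of entry times (seconds)

while (<DATA>) {
    chomp;
    my ($time, $event, $slide) = split /\s+/;
    push @{ $entries{$slide} }, $time if $event eq 'GOTO';
}

foreach my $slide (sort { $a <=> $b } keys %entries) {
    print qq{<h3>Slide $slide</h3>\n};
    foreach my $t (@{ $entries{$slide} }) {
        my ($m, $s) = (int($t / 60), $t % 60);
        printf qq{<a href="%s?start=%d">Audio from %d:%02d</a><br>\n},
               $audio_base, $t, $m, $s;
    }
}

__DATA__
0    GOTO 1
185  GOTO 2
642  GOTO 3
1010 GOTO 2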
The static nature of slides in the presentation teaching style makes it easy to automatically generate audio links. For other teaching styles, it is not always a simple matter to attach the audio links to parts of the prepared material (see the discussion of the registration problem in Section 4.2). On the left side of Figure 3 is another example of an automatically generated Web page containing audio links, generated from output in the discussion-style FCE seminar using the Apple MessagePad as the note-taking device. The Web-accessible notes show the prepared outline augmented with notes inserted at the right location. Selecting a note launches the audio player at the point in the discussion in which the note was initially generated. It is possible to hide and reveal these annotations, so that the original discussion outline can be seen alone, if desired.
We did not have tools to automatically generate audio- or video-enhanced review material for the public-notes-style AI course. Instead, audio and video links were generated manually from the videotaped lecture, and the analog video was digitized into a single audio file and segments of QuickTime video. On the right side of Figure 3 is an example of a lecture with audio (marked with an "A") and video (marked with a "V") links manually added. It is an interesting research question to ask how recorded information from the lecture (e.g., gestures gleaned from the video recording, segmenting the audio) can be processed to determine when audio links should be created and how they can meaningfully be attached to the material [2].
Students kept a journal of their notes and reactions to the technology throughout the course. At the end of the course, 24 of the 25 students filled out a questionnaire that investigated their reactions to the use of the technology in the class. The objective questions asked about overall impressions of the use of technology in the class and then specifically about the ClassPad note-taking application and the use of the LiveBoard with Web-based review notes. These questions were rated on the following scale: 1 (strongly disagree); 2 (disagree); 3 (neutral); 4 (agree); and 5 (strongly agree). We asked questions about overall impression, ease of use, whether the technology made aspects of the class more effective, how the technology affected class participation and whether the technology contributed to learning the particular subject matter of the course (in this case HCI). Table 2 summarizes the results of the objective portion of the questionnaire.
Table 2: Summary of the objective questionnaire results, rated from 1 (strongly disagree) to 5 (strongly agree). O = overall technology; N = electronic notebook (ClassPad); L = LiveBoard and Web notes.
 | Topic (# of questions) | Avg. (sigma) |
O | Was desirable technology (11) | 3.67 (.98) |
 | Was easy to use (2) | 3.02 (1.23) |
 | Increased effectiveness of class (9) | 3.62 (.99) |
 | Improved class participation (2) | 3.40 (.88) |
 | Contributed to learning subject (2) | 3.94 (.86) |
N | Was desirable technology (1) | 3.13 (1.03) |
 | Was easy to use (3) | 3.13 (1.14) |
 | Increased effectiveness of class (1) | 2.88 (.90) |
 | Helped me take fewer notes (2) | 2.85 (.87) |
L | Was desirable technology (2) | 3.87 (.82) |
 | Was easy to use (3) | 3.68 (1.09) |
 | Increased effectiveness of class (1) | 3.29 (1.04) |
 | Helped me take fewer notes (2) | 2.88 (1.00) |
The results show an overall positive reaction to the prototype. The strongest positive reaction is in how the prototype was perceived to contribute to learning the particular subject matter, and this is not surprising. The course was on HCI and the students were themselves experiencing a new interface in the classroom. In addition, the project work was based on developing and evaluating ideas for new Classroom 2000 prototypes, and the students appreciated the authenticity of redesigning a system they were currently using.
One of the initial goals of Classroom 2000 was to examine the effect of personal interfaces in the classroom. Our initial observations show that the students were most negative toward the personal electronic notebooks (see the next section for qualitative justification). The LiveBoard and Web notes together comprised the most desirable technology from the students' perspective.
Several in-class usability evaluation exercises on the LiveBoard and the group presentations at the end of the class were both very popular. Both of these activities involved more than just the teacher interacting with the LiveBoard. But many students did not feel the LiveBoard was any better than an overhead or chalkboard when used exclusively by the teacher in a lecture mode.
The majority of the electronic notebooks used in the class were palmtop PCs (specifically, Dauphin DTR-1s with a 7-inch diagonal screen) and while the students found them good for drawing pictures, in general the screens were too small. In addition, the response time of the units was slow relative to the LiveBoard. This made it difficult for students to navigate between slides as easily as the teacher. Also, a number of the palmtops were unreliable machines that would crash during lectures.
Students found the Web-based review notes interesting in their novelty and useful for examining their own notes and the teacher's notes. Several students found the notes very useful on the occasions when they missed class or did not pay close enough attention to some point during class. Despite the relative ubiquity of Web browsers on campus and in student rooms, several students still desired to have a printed copy of their notes because then they would be easier to carry around and easier to review. The Web notes were not always quickly accessible (especially over telephone lines) and sometimes hard to read.
We were unable to keep logs of use of the audio server, so we cannot give a quantitative indication of its use, but we have determined that audio annotations were not used very much. Only 4 of the students in the class noted in their journals that they had made consistent use of the audio features in more than just full playback mode. There were two reasons for the overall lack of use of the audio. First, students did not have regular access to the correct platform for listening to the audio and we were unable at that time to provide a cross-platform audio player. Second, the set-up of the audio service was too difficult for some students to bear, so they did not bother. Despite this minimal usage of the audio features, several students who did manage to use the audio found it particularly useful to clarify their own notes.
Of particular interest is how electronic notebooks promote different and possibly more effective note-taking strategies. An overview of the electronic notes taken by students reveals that initially most students would write quite a bit on the electronic slide, even if what was written was exactly what the teacher was writing on the LiveBoard. When questioned afterwards, several students who used the electronic notebook throughout the class noted that they felt their note-taking became more economical as the course progressed. This is in spite of an apparent lack of use of the audio features. Upon further investigation, these students revealed that even without audio services, merely having the teacher's notes available after class saved them from the sometimes mundane task of copying. One student in the class, however, stated a preference for writing down everything himself, even if what he wrote was identical to the teacher's notes that he could obtain later. This again points out the importance of recognizing different learning styles and supporting as many as possible. We are in the process of completing a more thorough quantitative analysis of the student notes and will report on those findings later.
All Web pages produced for this class (shown in Figure 2) were publicly viewable, a conscious choice of the students in the class. Students were unaware that they could edit their own Web notes, and we had not provided any easy way to do the editing. Several students commented on this shortcoming of the static review notes, remarking that it was their habit to revise and rewrite their notes. This represents both a technical and social failure of the prototype to support long-term use of class notes. We are now concentrating on providing a better interface for revising notes.
From the instructor's perspective, there were several advantages and disadvantages. The ClassPad application running on the LiveBoard was easy to use and was responsive enough to allow for a natural level of interaction. All of the lecture material for this class was available from a previous section of the course, but as the quarter progressed, it was judged necessary to modify the format of the slides. At the request of several people, the slides were redone to increase the amount of whitespace available for making annotations. The ClassPad logging was effective, even though we had intended to provide per-annotation audio links instead of per-slide links. One drawback of our system, however, was the requirement that the teacher load all slides to be visited during one lecture prior to the beginning of the lecture. This seemingly simple requirement caused a problem twice in the course when the teacher wanted to refer to a slide from a previous lecture but was unable to do so because the capture semantics of ClassPad would not have correctly logged the remainder of the class.
One pleasant surprise came the first time the ClassPad application running on the LiveBoard misbehaved in class and would not load the slides for the lecture. The lecture proceeded more in the private notes style. In time, we began to appreciate ClassPad as being well-suited to the private notes style of teaching and we plan to take advantage of that in the future to attract other teachers into our experiments.
In the FCE seminar, we both videotaped and used the MessagePad outline-annotator. We provided a template file to prepare the outline of the class discussion; this was a useful service for the presenters. The student note-takers found it easy to understand how the outliner application worked, but did not find it all that useful to attach notes to the outline. The main problem was that discussion in the class did not follow the outline very closely. When the note-taker wanted to jot down a thought, it was hard to determine which entry in the outline to choose for annotation. Sometimes, the choice of entry was entirely arbitrary, and the resulting enhanced Web page looked somewhat confusing. As the quarter progressed, we altered the application by removing the outline and allowing the user to bring up blank, time-stamped note pages for writing whatever came to mind. This simpler interface, similar in spirit to Stifelman's audio notebook [14] was much less confusing to the user, but was only used in the class a few times.
One great advantage of our prototyping approach has been the feedback from users. Students offered many suggestions for possible extensions of Classroom 2000 that we had not initially considered. Some of the more promising suggestions are:
This architectural division is not ideal, however. One drawback has been a limited interpretation of what occurs in post-production. Up to now, post-production has simply meant the generation of media-integrated notes based on multiple streams of information captured during the live recording phase. It has not included support for the user in accessing and modifying those notes. Work at Xerox PARC has identified tools to support this separate access phase [9] and we would be wise in the future to focus more effort there as well.
Annotation is simplified when the underlying image -- such as a slide used in a presentation-style lecture -- does not change. This approach does not work for all lecture styles. It would be better if the prepared material used for presentation could differ in form or content from the material used for review. The material for presentation must be readable when projected and fairly terse, as reading lots of text off a wall display is not effective. The material for review should be more like a user-modifiable textbook, suitable for a personal display and containing more explanations.
We are aiming to be able to replay the entire lecture experience, including multiple video views and all student interactions with their computers in class. We need to handle richer media sources at a finer level of granularity. For example, the student should be able to ask during review, ``What was the lecturer saying when I wrote this?'' while pointing to some arbitrary annotation, as in [10, 14, 20]. Or the student might want to find the notes associated with a live demonstration that occurred at some point in the class. The solution we have produced for indexing and reviewing an audio stream for the class is immediately transferable to video, keyboard and mouse events, and pen strokes. A constraint at the moment is efficient storage and delivery of the richer media types.
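A minimal Perl sketch of the index lookup behind such a query appears below. It assumes every captured event (pen stroke, keystroke, video frame) carries a wall-clock timestamp; the function name and the ten-second lead-in are illustrative choices, not part of the current system.

#!/usr/bin/perl -w
# Hypothetical sketch: map the wall-clock time of a pen stroke to an
# offset into the lecture recording, backing up a few seconds so that
# playback includes the context preceding the stroke.
use strict;

my $LEAD_IN = 10;                             # seconds of context before the stroke

sub playback_offset {
    my ($stroke_time, $lecture_start) = @_;   # both in epoch seconds
    my $offset = $stroke_time - $lecture_start - $LEAD_IN;
    return $offset > 0 ? $offset : 0;         # clamp to start of recording
}

# Example: a stroke made 14 minutes and 3 seconds into the lecture
my $start  = 844_000_000;
my $stroke = $start + 14 * 60 + 3;
printf "start playback at %d seconds into the recording\n",
       playback_offset($stroke, $start);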
Some students noted that they can type faster than they can write and that typed-in information is immediately available to content-based search mechanisms. We stayed away from keyboards because we felt the constant tapping of keys would be a distraction in class. We also feel that the purpose of Classroom 2000 is not to enable students to take more notes, but rather to help them take notes more efficiently.
The use of an electronic whiteboard was universally favored in the classroom, and the LiveBoard provided an excellent, albeit expensive, solution. It is still too small, both in terms of physical size and screen resolution (VGA), to consider replacing existing whiteboards. It is roughly the size of a whiteboard in an office or small meeting room and about a third the size in real estate of even the smallest classroom whiteboards. Our experience shows that the computational capability of the LiveBoard is useful for both encouraging group exercises in a class and also creating an accurate record of some class activity. We recommend that researchers investigate ways to provide larger scale interactive surfaces, both in terms of display size and resolution.
Though each Classroom 2000 prototype system is different in terms of technology provided, information captured and media-integrated review materials produced, they all follow a common organizational theme. We separate the functionality of the system into three distinct phases. Pre-production activity prepares all information leading up to the classroom interaction. Live recording captures various streams of information and actions during a lecture. Post-production activity generates multimedia-enhanced Web pages for a summary of the classroom activity.
Along with developing a variety of prototypes to support different teaching and learning styles, we have had the opportunity in one case to conduct an extended evaluation of the effect of the technology on the teaching and learning experience. Though we are not yet able to provide an assessment of how Classroom 2000 enhances learning, our preliminary evaluation does reveal a favorable student impression. Most encouraging was the response toward the use of the electronic whiteboard and Web notes. Least encouraging was the response toward the personalized electronic notebooks. We understand a lot of the misgivings with our initial prototype notebooks. They were too small and too slow and left the students feeling that their notes were unavailable for revision after class. Armed with these insights, we will continue to explore this valuable avenue of research in future computing environments for education.
Published in the Proceedings of Multimedia '96