Professor Jeremy Cooperstock

PHOTO: OWEN EGAN

Putting technology in its place

BRONWYN CHESTER | A little bit of history was made last week. For the first time at McGill and, most likely, at any institute of higher learning in Canada, a class was taught using "automated lecture capture."

What this means is that from the instant Jeremy Cooperstock began his class on Artificial Intelligence, a computer captured everything the students saw or heard, including the notes handwritten on his digital tablet, which are projected onto a larger screen for the students to see. Within two to three minutes of the lecture's end, students were able to view or review the content on the class website.

To make all of this work, the classroom contains a video camera, microphone and two computers, one of which acts as an electronic whiteboard while the other records the audio and video of the lecture. Apart from the digital tablet, the computers are invisible.

Thanks to the computers' ability to digitize the lecture contents for the website, students gain a powerful tool with which to review class material and view the instructor in action. From the instructor's point of view, no extra work is required beyond giving the lecture.
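The article doesn't detail the software, but in software terms the publishing step it describes amounts to a simple pipeline. Below is a minimal sketch, assuming hypothetical file paths and formats; it illustrates the idea rather than Cooperstock's actual system.

```python
import datetime
import shutil
from pathlib import Path

# Sketch of the publishing step of an automated lecture-capture
# pipeline: bundle the recorded audio/video and the digitized
# whiteboard notes into a timestamped directory served by the
# class website. Paths and file formats here are hypothetical.

WEB_ROOT = Path("/var/www/course/lectures")   # hypothetical target

def publish_lecture(av_recording: Path, whiteboard_notes: Path) -> Path:
    """Copy captured material to the website; the lecturer does
    nothing beyond giving the lecture."""
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M")
    out_dir = WEB_ROOT / stamp
    out_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy(av_recording, out_dir / "lecture.avi")
    shutil.copy(whiteboard_notes, out_dir / "notes.pdf")
    return out_dir
```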

What's the purpose of such a degree of accessibility to the lecture?

This is just step one in developing a computer-augmented teaching and learning environment, says Cooperstock, a professor of electrical and computer engineering. Soon, Cooperstock will hook up a second video camera so that the students' reactions, comments and discussion may be computer-recorded. He also wants their feedback on his teaching so that he may improve. Already, they may comment on his teaching via e-mail or in an anonymous feedback forum.

The idea for captured student feedback comes from Janet Blatter, a PhD student in education and a collaborator in the teaching project, who will be studying the use of feedback in teaching. Ralph Harris, professor of mining and metallurgical engineering, leads the project, which is funded in part by the Royal Bank of Canada's Teaching Innovation Fund.

"Eventually," says the 31-year-old Cooperstock, "we would like to see an artificial intelligence system provide instantaneous feedback but, for now, we rely on students."

In other words, by using the students' feedback on the style and content of his teaching, Cooperstock hopes to program their measures of good teaching into the recording computer. If, for instance, his arm-waving is getting distracting or he's speaking too quickly, the computer will notify him. "It could be through background audio, something subtle like the sound of an engine in the distance," he says, "or something more obvious like a flashing light in the back of the class."
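As a toy illustration of that notification loop, consider the sketch below; the metrics, thresholds and cues are invented for the example and are not the measures the students will actually define.

```python
# Toy sketch of the envisioned instructor-feedback loop.
# All metrics and thresholds here are hypothetical.

SPEECH_RATE_LIMIT = 170    # words per minute (invented threshold)
GESTURE_RATE_LIMIT = 12    # arm movements per minute (invented)

def feedback_cues(speech_wpm: float, gestures_per_min: float) -> list:
    """Return the classroom cues to emit, if any."""
    cues = []
    if speech_wpm > SPEECH_RATE_LIMIT:
        cues.append("play distant-engine audio")      # subtle cue
    if gestures_per_min > GESTURE_RATE_LIMIT:
        cues.append("flash light at back of class")   # obvious cue
    return cues

print(feedback_cues(185, 8))   # -> ['play distant-engine audio']
```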

"This is a whole area of research in human-machine interaction," he continues, adding that "the students think the idea is hilarious."

Mightn't such feedback even be annoying? Indeed, says Cooperstock, but the lecturer will always have the option of selecting how much feedback is generated. In fact, it's one of Cooperstock's tenets of the brave new world of computerized devices that the user always has the upper hand; manual override, invisibility of the devices and feedback (regarding breakdowns or normal function) are the design principles he refined in his PhD work.
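Those principles suggest a simple programming pattern: automation may act on a device, but any manual action takes precedence and suppresses further automatic control. A minimal sketch, with an invented Device class standing in for a real appliance:

```python
# Minimal sketch of the "user has the upper hand" principle.
# The Device class is invented for illustration.

class Device:
    def __init__(self, name):
        self.name = name
        self.level = 0
        self.manual_hold = False   # set once the user intervenes

    def auto_set(self, level):
        """Automatic adjustment; deferred after a manual override."""
        if self.manual_hold:
            return f"{self.name}: automation suppressed (manual override)"
        self.level = level
        return f"{self.name}: automatically set to {level}"

    def manual_set(self, level):
        """Manual control always wins and silences the automation."""
        self.manual_hold = True
        self.level = level
        return f"{self.name}: manually set to {level}"

lights = Device("lights")
print(lights.auto_set(70))     # automation acts
print(lights.manual_set(30))   # user overrides
print(lights.auto_set(90))     # automation now defers to the user
```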

Cooperstock's vision of the "computer-augmented environment" stems from his frustration with the complexity of today's electronic devices, the tyranny of too many buttons. As a graduate student at the University of Toronto three years ago, working on the Ontario Telepresence Project (whose goal was to research human communication through videoconferencing technology), he was struck by the irony of the situation. "Here we were, a bunch of PhDs, and unable to conduct a single meeting without constant human intervention in the technology."

"What then must it be like for the general public?" he asks rhetorically, noting that one-third of Americans don't know how to set the time on their VCRs much less program them.

From that point on, Cooperstock became a man with a mission: to let the machines, via computers, run themselves, freeing people to do what they are best at -- higher-level thinking, creation and exchange.

His first post-graduate stop was at Sony, in Japan, where he developed a voice-controlled VCR whose only switch is the on-off. (Given the cost of the computer required, we're still far from being able to order around our VCRs. But it's coming, says Cooperstock, who uses his voice to command the VCR in his lab.)
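In software terms, a voice interface like that reduces to mapping recognized phrases onto device actions. The sketch below is purely illustrative: the speech recognizer is assumed to exist and the command vocabulary is invented; it is not Sony's or Cooperstock's implementation.

```python
# Illustrative voice-command dispatch for a VCR; the recognizer
# is assumed and the command set invented.

def handle_utterance(utterance, vcr_state):
    """Map a recognized phrase to a VCR action; ignore anything else,
    keeping the device unobtrusive rather than raising errors."""
    commands = {
        "play": "playing",
        "stop": "stopped",
        "record": "recording",
        "rewind": "rewinding",
    }
    new_state = commands.get(utterance.strip().lower())
    return new_state if new_state else vcr_state

state = "stopped"
state = handle_utterance("Play", state)
state = handle_utterance("mumble mumble", state)  # unrecognized: no change
print(state)   # -> playing
```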

Since coming to McGill in 1997, Cooperstock has put "his hands everywhere" that computer-augmented environments make sense, hence his collaboration with education, with other departments in engineering and, now, with the faculties of music and medicine. The latter two are interested in the work he is doing with departmental colleagues James Clark and Vincent Hayward on the "shared reality environment."

In a sort of extension of videoconferencing, Cooperstock, Clark and Hayward are now working with Zack Settel and Wieslaw Woszcyk from music, and Michael Century from Communications, to develop a number of "shared reality" rooms as a first step toward "distance" rehearsing.

The project, funded at $500,000 over four years thanks to a Canada Foundation for Innovation award won last year by Cooperstock and Clark, will begin this March with two jazz musicians. Each in a shared reality room of their own, equipped with a large rear-projection video screen and a multi-channel sound system, the musicians will attempt to learn a piece together.

In situations of greater distance, sound delay and synchronization are factors that will have to be considered, says Cooperstock, and "we'll be observing to see at what point the music breaks down." When the technology is perfected, it ought to simplify the lives of international musicians who now must travel great distances in order to rehearse.
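The article gives no figures, but the synchronization constraint is easy to estimate. Assuming numbers commonly cited for networked music (ensemble playing tolerates very roughly 25 ms of delay, about the time sound takes to cross a large stage, and signals in optical fibre travel at roughly 200,000 km/s), a back-of-envelope check looks like this:

```python
# Back-of-envelope latency check for distributed rehearsal.
# Assumed figures (not from the article): ~25 ms delay tolerance
# and ~200,000 km/s signal speed in optical fibre.

FIBRE_SPEED_KM_S = 200_000   # approximate signal speed in fibre
TOLERANCE_MS = 25            # rough comfort limit for playing together

def one_way_delay_ms(distance_km, processing_ms=5.0):
    """Propagation delay plus an assumed fixed coding/processing cost."""
    return distance_km / FIBRE_SPEED_KM_S * 1000 + processing_ms

for route, km in [("across campus", 1),
                  ("Montreal to Toronto", 540),
                  ("Montreal to Tokyo", 10_300)]:
    delay = one_way_delay_ms(km)
    verdict = "fine" if delay <= TOLERANCE_MS else "music may break down"
    print(f"{route}: {delay:.1f} ms one way -> {verdict}")
```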

But, says Cooperstock, such technology will have a broader impact in other domains such as medicine, where students may learn the rules of the operating theatre without having to be there, or where a surgeon in the city, for instance, may be able to help a colleague during an operation in a remote setting. One of the extraordinary aspects of "shared reality" is that it involves not only simultaneous viewing and hearing but also a degree of shared haptic reality (tactile and physical sensations). This is Hayward's specialty, and it will involve the sensing, digitizing and transmission of such sensations as the resistance of the scalpel as it cuts through tissue -- all in real time.
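A sense-digitize-transmit loop of that sort typically runs at a much higher rate than video; haptic rendering is commonly clocked near 1 kHz because touch is sensitive to very small delays. The sketch below shows the shape of such a loop, with the sensor and network calls as stand-ins rather than a real device API:

```python
import struct
import time

# Hypothetical sketch of a haptic streaming loop: sample a force
# sensor, digitize the reading, send it to the remote room. The
# sensor and network functions are stand-ins, not a real device API.

SAMPLE_HZ = 1000   # haptic loops are commonly run near 1 kHz

def read_scalpel_resistance():
    """Stand-in for the force sensor on the instrument."""
    return 0.42    # newtons, dummy value

def send_sample(packet):
    """Stand-in for a UDP send to the remote shared-reality room."""
    pass

def stream_haptics(duration_s):
    period = 1.0 / SAMPLE_HZ
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        force = read_scalpel_resistance()
        send_sample(struct.pack("!d", force))   # 8-byte network-order float
        time.sleep(period)   # crude pacing; real systems time more tightly

stream_haptics(0.01)   # stream for 10 ms as a demo
```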

So much for the adage: you can't be in two places at once!

Cooperstock is also consulting on the design of a "reactive house" for the Ontario Science Centre's millennium exhibit. He and six students will get to play with the "robot turned inside out": a house that greets you, assesses your mood and suggests appropriate music, turns on the television or stereo depending on which way you swivel your chair, and so on -- all done through unobtrusive audio-visual sensors and, of course, computerized devices.
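The chair example boils down to a simple sensor-to-rule mapping. As a toy illustration (the bearings and thresholds are invented):

```python
# Toy rule for the chair behaviour described above; device
# positions and thresholds are invented for illustration.

def angular_distance(a, b):
    """Smallest absolute difference between two bearings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def chair_action(bearing_deg):
    """Assume the television sits at 0 degrees and the stereo at 180."""
    if angular_distance(bearing_deg, 0) < 45:
        return "turn on television"
    if angular_distance(bearing_deg, 180) < 45:
        return "turn on stereo"
    return "do nothing"

print(chair_action(10))    # facing the TV
print(chair_action(175))   # facing the stereo
```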

And what about breakdowns? Cooperstock recently addressed the question in a talk titled "Why People Who Build MS-Windows Shouldn't Design Homes."

While the development of the above "house" will be for fun, the so-called "smart home" truly is coming, piece by piece, to a neighbourhood near us, and Cooperstock is adamant that residents have a manual override for all appliances. Otherwise, the possible disasters in such highly wired shelters could be the stuff of tragedy -- or comedy. Imagine the scene where the computerized stereo won't let you turn off the Spice Girls!