Beating the bug


McGill Reporter
January 13, 2000 - Volume 32 Number 08


What was endlessly hyped as the worst technology glitch in history, commonly called the Y2K computer problem or millennium bug, looks to be nothing but a bad memory of the '90s.

You know the story: Without fine-tuning or replacement parts, many computers couldn't make the transition from 1999 to 2000. Doomsayers were predicting everything from the end of e-mail to a worldwide blackout.
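
The core of the problem was mundane: to save memory, much older software stored only the last two digits of the year, so when "99" rolled over to "00" any calculation that assumed years only get bigger suddenly ran backwards. Here is a minimal sketch of the bug and the usual "windowing" fix, written in Python purely for illustration (no system mentioned in this article actually ran this code):

```python
# Illustrative sketch of the classic two-digit-year bug; hypothetical code,
# not drawn from any system described in the article.

def years_between(start_yy, end_yy):
    """Naive difference between two two-digit years, as much legacy
    software computed it."""
    return end_yy - start_yy

print(years_between(97, 99))   # 2   -- fine while both dates are 19xx
print(years_between(97, 0))    # -97 -- the 1999 -> 2000 rollover breaks it

def expand_year(yy, pivot=50):
    """Common 'windowing' remediation: two-digit years below the pivot
    are read as 20xx, the rest as 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(0) - expand_year(97))   # 3 -- the interval is right again
```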

Now that 2000 has arrived without any apparent problems — power, water and PCs have all been running smoothly since Jan. 1 — lots of people are asking if there ever was a real computer problem to begin with.

But to claim that the Y2K computer glitch didn't really exist or was merely hyped up, says Vice-Principal (Information Systems and Technology) Bruce Pennycook, "is as ridiculous as saying we all needed to build shelters in preparation for a catastrophe."

Most PCs have run uninterrupted since the stroke of midnight on New Year's Eve even though many aren't Y2K compliant, he says, but that doesn't mean the huge mainframe computers that banks, businesses, governments and universities like McGill need to operate would have kept working properly after 2000 had they not been upgraded.

The main difference between regular PCs and mainframe monsters, he explains, "is that most PCs aren't used to run databases and aren't date-dependent." McGill's main computers, however, are date-dependent, and need to calculate dates correctly for everything from a student's scheduled year of graduation to loan payments.

"We need to know that those numbers are right," Pennycook says.

But it wasn't just at the Registrar's Office or accounting departments where numbers needed to add up.

McGill also needed to ensure that departments which conduct lengthy research projects were pouring their data into Y2K-friendly computers in order to safeguard results.

"With the legal department, we examined the areas where McGill might be liable and addressed them," Pennycook says, pointing to sponsored laboratory research as an example of where a fund-provider could have had legal recourse against McGill had it been able to prove the University had neglected to protect its investment i.e., computer-dependent research.

Making certain McGill's computer network was Y2K-ready was an ongoing, five-year project, says Tanya Steinberg, McGill's Y2K project manager, who oversaw the hitch-free transition of the University's computers into the new millennium. "So far, we aren't finding any problems and I hope that continues."

If McGill's computers are running efficiently today, Steinberg says, it's because they were debugged.

"If you look around the world a lot of money was spent on making computers Y2K compliant," she says. "And the computer problem turned out to be a non-event because of all this work."

While it's been widely estimated that $1 billion to $2 billion was spent to upgrade computers in North America alone, Steinberg says it would be impossible to say how much McGill spent on its Y2K computer initiative.

"Since McGill is so decentralized, there's no way to account for exactly what was upgraded," she says, noting that updating some computers required new software in some cases, but that other machines only needed a few hours of labour.

Even though McGill has an estimated 7,000 computers scattered across its campuses, says Pennycook, the University didn't need to hire an army of computer technicians to fix the machines.

"In most cases, computers that required updating were fixed by (McGill employees) in the course of their jobs," he says.

Had McGill's computer system come crashing down, the University was prepared with a crisis centre in the James Ferrier Building on Dec. 31. It was there that eight University employees, including Pennycook, Steinberg and Gary Bernstein, McGill Telecom director and acting director of the Computing Centre, sacrificed their millennial celebrations to make sure the University's computer network kept running smoothly.

"It turned out to be a pretty boring evening," Bernstein admits, since there were no crises to handle because the University was so well prepared. "When one little problem did show up, we were like, 'Finally.' And that took 10 minutes to fix."

This being a leap year, however, Bernstein says Feb. 29 remains a date to watch. "We still need to be vigilant about that," he says, noting McGill has already taken steps to be ready.
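
The Feb. 29 worry comes from the leap-year rule itself: century years are normally not leap years, but years divisible by 400 are, so 2000 has a Feb. 29 even though 1900 did not. Software that implemented only the truncated rule would skip the date entirely. A small illustrative comparison of the full rule and the shortcut (not code from any McGill system):

```python
def is_leap_correct(year):
    """Full Gregorian rule: divisible by 4, except centuries,
    unless the century is divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_truncated(year):
    """The shortcut some older software used: every century year
    is treated as a non-leap year."""
    return year % 4 == 0 and year % 100 != 0

print(is_leap_correct(2000), is_leap_truncated(2000))  # True False
# A system using the truncated rule would reject or mis-date Feb. 29, 2000.
```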

As for minor Y2K-related computer problems still cropping up over the year, Steinberg says it remains a possibility. Bugs might take that long to be discovered, she says, "since most people don't use every file or program in their computers every day."

Anyone who still doubts that the Y2K spending was justified should note that most computers need occasional upgrades anyway, says Pennycook. "We only fixed what we had to."

At the Students' Society of McGill University, a total of $5,000 was invested to be Y2K-ready, says Kevin McPhee, vice-president (operations). But those investments, upgrades to the point-of-sale system for the Sadie's stores and the computerized bar-serving system at Gert's Pub, were due to be made anyway. "Even if there were no (Y2K) problems," says McPhee, "I was still glad we invested the time and money to be ready."
