It was hard to believe, but the student insisted it was true. He had discovered that compact discs from a major record company, Sony BMG, were installing dangerous software on people’s computers, without notice. The graduate student, Alex Halderman (now a professor at the University of Michigan), was a wizard in the lab. As experienced computer security researchers, Alex and I knew what we should do: First, go back to the lab and triple-check everything. Second, warn the public.
But by this point, in 2005, the real second step was to call a lawyer. Security research was fast becoming a legal minefield, and we wanted to make sure we wouldn’t run afoul of the Digital Millennium Copyright Act. We weren’t afraid that our research results were wrong. What scared us was having to admit in public that we had done the research at all.
Meanwhile, hundreds of thousands of people were inserting tainted music CDs into their computers and receiving spyware. In fact, the CDs went beyond installing unauthorized software on the user’s computer. They also installed a “rootkit”—they modified the Windows operating system to create an invisible area that couldn’t be detected by ordinary measures, and in many cases couldn’t be discovered even by virus checkers. The unwanted CD software installed itself in the invisible area, but the rootkit also provided a safe harbor for any other virus that wanted to exploit it. Needless to say, this was a big security problem for users. Our professional code told us that we had to warn them immediately. But our experience with the law told us to wait.
The law that we feared, the DMCA, was passed in 1998 but has been back in the news lately because it prohibits unlocking cellphones and interferes with access by people with disabilities. But its impact on research has been just as dramatic. Security researchers have long studied consumer technologies, to understand how they work, how they can fail, and how users can protect themselves from malfunctions and security flaws. This research benefits the public by making complex technologies more transparent. At the same time, it teaches the technology community how to design better, safer products in the future. These benefits depend on researchers being free to dissect products and talk about what they find.