Building a secure operating system with Roger R. Schell

Roger R. Schell is an authority on high-assurance computing and spent more than 20 years in the U.S. Air Force before working in private industry. As one of the lead authors of the U.S. Department of Defense Trusted Computer System Evaluation Criteria (TCSEC), known as the Orange Book, Schell has first-hand knowledge of the standards required for classified computer systems. Published in 1983 by the National Computer Security Center, where he served as deputy director, the TCSEC was replaced in 2005 by an international standard, the Common Criteria for Information Technology Security Evaluation.

The co-founder and vice president of Gemini Computers Inc., Schell led the development of the Gemini Multiprocessing Secure Operating System, known as GEMSOS. In 2001, he founded Aesec Corp., which acquired Gemini Computers and its security kernel in 2003. He also served as the corporate security architect at Novell.

Marcus Ranum spoke with Schell, now a professor of engineering at the University of Southern California Viterbi School of Engineering, about the security practices of the U.S. government, the National Security Agency's A1-class systems (Gemini was one) and whether building a secure operating system is even feasible at this point.

Editor’s Note: This Q&A has been edited for clarity and length. 

I cursed the Trusted Computer System Evaluation Criteria as a young pup, and it took me 20 years to work my way around to realizing it was way ahead of its time. Are you frustrated to see how computing has been moving downhill in terms of trustworthiness?

Roger R. Schell: It certainly has been disappointing. I suppose that with 22 years of military experience, I learned that ‘you fight with what you have,’ and I tend not to get frustrated personally. I try to hope for better outcomes and to influence a better direction.

It appears that the U.S. is putting backdoors in all kinds of systems, while ignoring the potential for people doing it to us. How do you see that trend?

Schell: I don’t know if you saw or remember the article I wrote about 40 years ago on the electronic Air Force and how computers were its Achilles’ heel. I asserted that the primary problem we had to address was subversion. Colin Carter at IBM Research characterized that paper as the seminal paper in information warfare. I described, in sort of broad brushstrokes, what people today would recognize as Stuxnet. I and others saw that this was what people had to address when things really mattered.

I remember that paper; it made an impression on me, along with Ken Thompson’s “Reflections on Trusting Trust” [Communications of the ACM, August 1984], which you no doubt recall. It seems that there was a brief period of time where people were concerned with trusted software distribution and controlling releases. But now, what have we got? Automatic download is built into everything.

Schell: I think there are different factors: There are the issues of government policy, and then there are the issues of business. I did an interview on long-range research a couple of years ago and noted that there are huge vested interests that don’t want those issues successfully addressed. I won’t say that’s the cause, but, taken together, I think it’s very hard to get a focused effort to address the problem.

That was very delicately put. I’ve phrased that as ‘The computer security industry doesn’t exist to build secure systems; it exists to get customers’ money.’

Is there any particular moment that made you realize that trust and integrity were important to systems?

Schell: I got introduced to the issues of subversion early in my life. My father was a migrant farmworker with a fifth-grade education, the son of immigrant parents. We didn’t have a lot of resources, so we made do with what we had and found creative ways of accomplishing our ends.

One of my first experiences with subversion was when I was in grade school. We had a teacher we did not particularly appreciate, so we stuffed a raw potato up his exhaust pipe. It had the usual effect: He’d drive for a mile or so, his car would stop and he’d try to figure out why, and after a bit it would go again. That was my first introduction to subversion: He’d look for all the usual things, and yet it didn’t work properly. It was a question of the need for integrity throughout the system.

One of the hardest things for me to understand, at one point, was that computers tend to fail at single points, so we can diagnose a failure in terms of what has changed. Subversion-induced failure breaks that model badly.

Schell: In the town where I was in high school, when Halloween night came, for two years running I turned off all the streetlights in the town. That was a case where what happened wasn’t obvious, and the second year I was surprised that they let it happen again.

Not that I want you to confess to anything, but … this would have been before the smart grid, so wouldn’t you have had to gain access to some physical switch somewhere?

Schell: That’s what you would think, and so did they. I could watch the power company building, and they ran around checking all the usual things to see what the problem was. But it turned out that towns at that time had gotten tired of having lamplighters go out and turn the lights on, so there was an electric eye that would turn the lights on when it got dark. I took a can, strapped a few batteries around the side of it, put a little light bulb inside and set the can over the electric eye.
