Kelly Marshall pointed out this article in eWeek to me and asked what my take was. In the article, Richard Clarke (the former White House security czar) is quoted as saying that developers should be held responsible for the poor state of security in their applications:
To solve the problem, Clarke called on the government to put pressure on the software industry to develop and maintain secure coding practices.
"The reason you have people breaking into your software all over the place is because your software sucks," he told conference attendees. "I don't like the idea of 'buyer beware.' It was great in the 14th century, but I think we've moved beyond [that]."
Clarke also encouraged enterprises to get together and inform their vendors that they're not happy with the security of their software.
"Industries should establish what they want from the software industry," he said. "Let's allow these industries to get together and say what they expect. If they need an antitrust exemption for that, let's give it to them."From Clarke: Hold Developers Accountable for Software Insecurity
Referenced Thu May 20 2004 16:09:34 GMT-0600
I used to be a formal methods guy. Specifically, I built mathematical models of a computer system's intended behavior and of its structure, and then showed through mathematical analysis (i.e., proof) that the behavior followed from the structure. I did it because it was fun. I told other people I did it because it was important. That is, being able to use mathematics to analyze artifacts is one of the fundamental processes of any engineering discipline. I still believe that at some point we've got to be able to apply that kind of analysis to the problems of computer correctness, but we're not there yet. For real-world problems, the analytical techniques we have now fall far short.
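To give a feel for what that means, here is a toy sketch in Lean 4 (a modern proof assistant; the work described above predates it, and the names myMax and myMax_spec are mine, purely for illustration). The definition plays the role of the structure, the theorem states the intended behavior, and the proof shows the behavior follows from the structure:

    -- The "structure": an implementation of a tiny component.
    def myMax (a b : Nat) : Nat :=
      if a ≥ b then a else b

    -- The "behavior": a specification saying the result bounds both inputs.
    -- The proof is the mathematical analysis connecting the two.
    theorem myMax_spec (a b : Nat) : a ≤ myMax a b ∧ b ≤ myMax a b := by
      unfold myMax
      constructor <;> (split <;> omega)  -- case-split on the `if`,
                                         -- then linear arithmetic

The trouble is that real systems are millions of lines, not three, and that is where the analytical techniques give out.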
The problem with programmers is that we want to be "engineers" but we don't want any of the limitations or responsibilities of other engineering disciplines. Being held responsible for your designs is part and parcel of any engineering discipline where the design has significant public impact. Believe me, if a bridge falls down, the engineer who signed off on it will find themselves testifying in court.
This is a topic that has been discussed for decades and yet we're still not there. Part of the problem is that we don't really know what standards we'd hold people to. For example, if I sign off on a design for a piece of software and a buffer overrun causes a security problem, that's not a design (i.e., engineering) problem; it's an implementation problem. If the design of a bridge is good but it falls down due to poor welds, that's not the engineer's fault.
So, how do we ensure the welds on a bridge are good? Through standards, best practices, well-understood review processes, and so on. A whole body of standards and regulation exists in other industries that just doesn't exist yet in software. For example, is it poor practice to use C++ even though it's a language in which buffer overrun problems are pitifully hard to avoid? Would you testify in court that another engineer had no business using a memory-unsafe language? Probably not.
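To make the C++ point concrete, here is a minimal sketch (the function names are mine, purely for illustration) of the classic pattern at issue: a fixed-size stack buffer and an unchecked copy, next to a version using a type that manages its own bounds:

    #include <cstring>
    #include <string>

    // The classic overrun: input longer than 15 characters writes
    // past the end of buf, corrupting the stack.
    void unsafe_copy(const char* input) {
        char buf[16];
        std::strcpy(buf, input);  // no length check on input
    }

    // The same operation with a bounds-respecting type.
    void safe_copy(const char* input) {
        std::string buf(input);   // std::string allocates to fit; no overrun
    }

The compiler happily accepts both; nothing in the language marks the first as negligent, which is exactly why "no business using that language" is such a hard claim to make under oath.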
Even so, I think regulation, either self-imposed or imposed by the government, is inevitable. The way those other industries got their large bodies of best practices and processes is through the social process that results when people are held responsible for their errors. The only way to get better software is to hold someone responsible. To date, we haven't been willing to do that, but the pressure is growing.