October 11, 2002
by Andy Oram
American Reporter Correspondent
CAMBRIDGE, MASS.—On September 18, 2002, a federal security agency released a long-awaited draft containing recommendations for protecting the nation’s computers and networks from attack.
Little ensued but yawns and snickers. Security experts called the draft weak, vague, "a public relations statement," and worse. They referred to more specific recommendations that existed in earlier, non-public drafts, and accused the authors of removing them because they would be costly for powerful business interests.
But I find the draft of the National Strategy to Secure Cyberspace, by the President’s Critical Infrastructure Protection Board (headed by Richard A. Clarke), to be a beneficial plan for action. It offers potential precisely because it does not micromanage the security process, but lays out broad needs and sets a direction—in other words, because it shows leadership, a quality that the Bush Administration rarely displays but too often feigns.
When security experts fault the strategic draft for not being specific enough, I believe they are really faulting the rest of the government and business communities. They sense that this document will be relegated to the file cabinet.
Certainly, after the initial flurry of press, the draft has disappeared as thoroughly as if it had been buried in centuries of debris. But let us look at what it offers and judge it by what it says.
First of all, I must stop and express some relief. The draft contains none of the hysterical stop-gaps that one might well expect in the current climate of fear.
It doesn’t call for spending billions on the immediate purchase of security products that will make some software company executive happy but that nobody will know how to use properly. It doesn’t join the current mania for "trusted systems," which really means dumbed-down computers that tie data and content to programs approved by copyright holders. One finds in the draft no scapegoating, no punitive demands for intrusive investigations and draconian punishments.
Instead, the document lays out the slow and steady paces by which the country and its allies can achieve a reasonable amount of protection. It advises each large enterprise, for instance, to form a "security council" containing its chief officers, as well as partnerships with others in its industry and with public agencies.
For government agencies, the document explains the need for "a continuing cycle of risk assessment." It explicitly rejects the old approach where security was "tacked on."
The report also stresses the communal aspects of security. Because attacks target a huge number of diverse users and organizations with astonishing speed (witness the spread of computer viruses), cybersecurity requires something of an epidemiological approach. The draft recognizes this in its many provisions for "information sharing."
This can be fostered by a range of clearinghouses and "information sharing and analysis centers" serving "various sectors" of the economy. It can also be promoted by a "network operations center," which some overly suspicious critics assumed would turn into a spy outfit, but seems a perfectly benign resource for accumulating and disseminating information about vulnerabilities.
Funding is addressed. The report calls for "Federally funded near-term IT security research and development," grants to universities for the training of professionals, and other measures that cost real money. (Fat chance getting Congress or the President to act on these.)
Bravely, the authors insist on the "centrality of maintaining privacy." They write, "the National Strategy incorporates privacy principles—not just in one section of the Strategy, but in all facets. The overriding aim is to reach toward solutions that both enhance security and protect privacy and civil liberties."
Such reassurances are crucial at this historical juncture, with so much encroachment on civil liberties around the world. Whether the authors really mean it could be questioned, though, for they endorse the Convention on Cybercrime created by the Council of Europe. This convention promulgates rules requiring Internet Service Providers to keep customer information for a period of months ("data retention") in case law enforcement should need it. In doing so, it dramatically backpedals on some thirty years of growing privacy protection in Europe.
Personally, I was struck by the document’s statement, "Each American who depends on cyberspace, the network of information networks, must secure that part that they own or for which they are responsible." This is exactly what I said in an American Reporter article long ago titled "Cyber Hygiene, Not Cyber Fortress Protects Our Networks" and repeated shortly after the September 11 attacks in "Cyber-security: Uncle Sam Needs You."
Several times the draft takes pains to say that each individual in an organization, not just "cybersecurity professionals," needs training to understand his or her role in security. Insider threats are also mentioned.
The authors understand the complexities of today’s "borderless network" with its instant messaging, email attachments, and Web services. The draft says, "Placing a wall around the perimeter of a network is not adequate to achieve security." This has been a frequent refrain of security experts.
"Early warning for the entire Federal community starts first with detection by individual agencies," the draft stresses, "not incident response centers at the FBI, GSA, DOD, or elsewhere. The latter can only know what is reported to them."
These recognitions of current realities—of the distributed nature of security—are precisely what make the document run afoul of critics. A report by staffers at the tech site C|Net News said that the report "punts" and concluded that "it would not scope out any bold new ground in this effort."
The Center for Strategic and International Studies (CSIS) "called the report flawed since it did not demand new laws or regulations aimed at Internet companies." This seems to be the thrust of Steven Kirschbaum, CEO of Secure Information Systems, to whom is attributed the ambiguous comment, "The first rule of having any security policy is you have to have enforcement."
What on earth are these people looking for? Docking the pay of system administrators $100 for each open port? Critics of the document seem eager for precisely the punitive kind of approach that won’t work.
Meaningful security requires buy-in at every level of society and careful, educated improvement. There is plenty of accountability in the draft report, but you have to look deep for it. For example:
Suggestions are made to subject businesses to "cybersecurity audits" by "outside auditors."
The powerful incentive of insurance is brought into the picture, as many security experts have recommended. As with other forms of risk, insurance companies can base premiums on the measures corporations take to secure their systems, and this will have more than a ripple effect going all the way back to hardware and software vendors.
Government agencies are admonished to include security in "job and program performance" and to subject contractors to similar scrutiny. The draft considers "limiting contract awards to service providers that meet specific published criteria" for security.
Financial analysts and investors are advised to ask companies about security programs before investing.
Even small businesses come under the lens, as the draft suggests requiring an "IT security checklist" for loans from the Small Business Administration.
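The insurance incentive mentioned above is, at bottom, simple arithmetic. Here is a toy sketch (in Python; the dollar figures, discount rates, and measure names are entirely hypothetical, not drawn from the draft) of how an insurer might price cyber-risk by discounting a base premium for each verifiable security measure a company has in place:

```python
# Toy model of risk-based cyber-insurance pricing. All numbers are
# hypothetical; the point is only that premiums can reward security.

BASE_PREMIUM = 100_000  # hypothetical annual premium in dollars

# Hypothetical discount rates for measures the draft encourages.
DISCOUNTS = {
    "outside_audit": 0.15,     # passed an outside cybersecurity audit
    "security_council": 0.05,  # chief officers sit on a security council
    "staff_training": 0.10,    # all staff trained, not just specialists
}

def premium(measures):
    """Apply each measure's discount multiplicatively to the base rate."""
    price = BASE_PREMIUM
    for m in measures:
        price *= 1 - DISCOUNTS[m]
    return round(price)

print(premium([]))                                   # 100000: no measures
print(premium(["outside_audit", "staff_training"]))  # 76500
```

A company that can document an audit and company-wide training pays measurably less, and that price signal propagates back through its choices of vendors and practices—without any new law being passed.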
What of all the provisions that mysteriously disappeared from earlier drafts? From experience with other drafts, I can testify that this doesn’t necessarily mean the authors were cowed by special interests. It could simply mean that they were rationally persuaded that seemingly obvious measures contained hidden gotchas and were not worth promoting.
There are certainly measures that organizations should take right now to secure their systems, but I don’t see how they can be legislated directly.
For instance, it is often suggested that victims sue software companies for security lapses (or other bugs in code, for that matter), but this would be unfair unless the vendor has demonstrated flagrant and reckless irresponsibility. Such lawsuits might well be justified when a company knows about a serious bug and knows how to fix it, but continues to ship the product for months without doing so.
But unfortunately, the state of the fifty-year-old software field is nowhere near mature enough that we can justly hold programmers responsible for bugs. While we have made progress in many areas—for instance, we have programming languages that rule out the dreaded buffer overflows responsible for most security flaws—we are always running behind. This is because new programming technologies keep being invented (for instance, popular Web technologies such as ASP and PHP for building dynamic pages) and we cannot anticipate their security ramifications.
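The claim about languages that rule out buffer overflows can be made concrete. A classic C-style overflow happens when code copies data past the end of a fixed-size buffer and silently corrupts adjacent memory; a bounds-checked runtime turns the same mistake into an immediate, harmless error. A minimal sketch (in Python, chosen purely for illustration; the draft itself contains no code):

```python
# Minimal illustration of why bounds-checked languages rule out classic
# buffer overflows: the out-of-range write is rejected at runtime
# instead of silently corrupting adjacent memory, as unchecked C would.

buffer = [0] * 8  # a fixed-size "buffer" of 8 slots

def copy_into(buf, data):
    """Copy data into buf element by element, with no length check --
    the same logic as C's unchecked strcpy()."""
    for i, byte in enumerate(data):
        buf[i] = byte  # raises IndexError once i >= len(buf)

try:
    copy_into(buffer, list(range(16)))  # 16 items into an 8-slot buffer
    overflow_caught = False
except IndexError:
    overflow_caught = True  # the runtime stopped the overflow

print(overflow_caught)  # True: the bad write was refused, not executed
print(buffer)           # only the first 8 slots were written
```

In C, the same loop would keep writing past the buffer and hand an attacker a foothold; here the language itself enforces the bound, which is exactly the kind of progress the paragraph above refers to.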
Liability for software bugs would fall particularly hard on the field of free and open source software, which ironically provides the most robust and secure products.
So it goes with other security measures. Thus, consequences for leaving corporate data and servers unprotected would best be assigned not by laws but by insurance policies, just as the draft report recommends.
While I do not align myself with the published criticisms of this strategic draft, I certainly see some flaws in it.
First, the draft never defines what actually constitutes a cyberattack. Perhaps the drafters felt that the issue was already familiar and that they effectively defined it through a series of illustrative incidents. But in the absence of a formal definition—a kind of mission statement—dangers lie in drifting off into murky, unrelated goals.
The draft itself demonstrates these pitfalls. For instance, it encompasses the "Consumer Sentinel" project through which the Federal Trade Commission addresses online fraud. This has nothing to do with terrorism.
Similarly, cybercrime treaties will inevitably include measures to stop copyright infringement that encroach on users’ privacy and extend the power of content owners far beyond the current state of copyright laws.
Worse still, the draft recommends that parents install filters to limit the online content their children can access. This simplistic advice ignores the extensive problems found with such filters. Anyway, it’s not pertinent. If I wanted to infect a lot of computers in America with malicious software, I would probably do it by offering a site that looked very appropriate for children.
The proposal for certifying security professionals may also be misguided. Certain basic habits of good security can be delineated, but I don’t know how well they can be formally taught or tested. The particular vulnerabilities and attacks change so frequently that no certification can ensure public preparedness.
In its eagerness to find channels for progress, the document may encourage too great a proliferation of institutions that supposedly will inform and guide us. People are more responsive to a single institution they know well than to a multiplicity of foggy sources of information competing for their attention.
Do we really need an "Information Integration Program Office" to "coordinate the sharing of essential information nationwide"? Or the "Task Force on Computer and Network Security" (http://www.educause.edu/security/) for educational institutions? The "Partnership for Critical Infrastructure Security" for companies? All the "information sharing and analysis centers"? What’s wrong with well-respected organizations that computer professionals listen to now, like CERT and the SANS Institute?
Finally, I am puzzled by what seems to be an unstated agenda behind the call to "review Federal and State regulations and laws that impede market forces from contributing to enhanced cybersecurity." Here I sense a bit of the bias seen by other critics. Who is trying to weasel out of oversight through this maneuver?
On the whole, though, I think the strategic draft states accurately what we need to do to protect our information systems. The onus falls on those responsible for implementing the report.
The Bush administration, for all its bluster about protecting the public, has lagged on such basic prophylactic measures as dismantling the nuclear facilities of the former Soviet Union. Security measures at public facilities and airports remain woefully underfunded. And domestically, the administration seems most intent on demoralizing the dock workers who are responsible for bringing most of our goods into the country. Smart move, Mr. President.
Security professionals, and Americans in general, have plenty to complain about in the way our leaders have handled the terrorist threat. But this document is not the proper butt of complaints. Here, unlike so many other issues, security professionals should be pressuring our leaders to step forward and actually carry out the measures they have signed up for.
Member, Computer Professionals for Social Responsibility
Editor, O’Reilly & Associates