The Cybersecurity Act of 2009: Trying to create order from chaos

The cyber arena is filled with the effluvia of vendor-driven agendas and political wrangling for budgetary dollars. As a topic, cyber security is especially vulnerable because what leadership and expertise exist are concentrated in so few individuals that consensus can literally be driven by the people sitting in one room. Consider Dr. Eugene Spafford's recent testimony to the Senate Commerce Committee on how few doctoral students the academic pipeline actually produces. Cyber security may be one of the last fields where a person with relatively little academic training can be a substantive contributor, though that model has not served us well: after more than 40 years of computing, little has moved us toward a secure environment.

For those that are interested, my first response is to say that "CYBERSECURITY" is TWO words: "Cyber" and "Security." To answer some specific needs, Senators Snowe and Rockefeller are presenting the Cybersecurity Act of 2009. The draft of the act I reviewed runs about 51 pages and has an interesting scope. I'm not going to go over the entire draft (since it will likely change), but I will present the table of contents:

Sec. 1. Short title; table of contents.
Sec. 2. Findings.
Sec. 3. Cybersecurity Advisory Panel.
Sec. 4. Real-time cybersecurity dashboard.
Sec. 5. State and regional cybersecurity enhancement program.
Sec. 6. NIST standards development and compliance.
Sec. 7. Licensing and certification of cybersecurity professionals.
Sec. 8. Review of NTIA domain name contracts.
Sec. 9. Secure domain name addressing system.
Sec. 10. Promoting cybersecurity awareness.
Sec. 11. Federal cybersecurity research and development.
Sec. 12. Federal Cyber Scholarship-for-Service program.
Sec. 13. Cybersecurity competition and challenge.
Sec. 14. Public-private clearinghouse.
Sec. 15. Cybersecurity risk management report.
Sec. 16. Legal framework review and report.
Sec. 17. Authentication and civil liberties report.
Sec. 18. Cybersecurity responsibilities and authorities.
Sec. 19. Quadrennial cyber review.
Sec. 20. Joint intelligence threat assessment.
Sec. 21. International norms and cybersecurity deterrence measures.
Sec. 22. Federal Secure Products and Services Acquisitions Board.
Sec. 23. Definitions.

Section 3 of the act seems to reconstitute the President's Information Technology Advisory Committee (PITAC) Cyber Security Working Group. The basic charge and mission of this group are nearly identical; in fact, PITAC is directly referenced as foundational early in the act (page 8). The cyber security advisory panel will look at trends and developments, progress toward implementing a strategy, the balance among the different components, whether goals and priorities are being met, and whether societal and civil liberty concerns are adequately addressed.

As a core function, each of these items has appeared in numerous reports, recommendations, and foundational literature for many years. There is nothing new here, even in the codification of the act; these are all things that have existed. The difference is that this act makes sweeping changes across the broad landscape of cyber security, so implementing a new and improved version of PITAC makes sense. More importantly, it will give a voice to the various industry partners, government agencies, and academic institutions.

Section 5 of the act directs the creation of regional cyber security centers for the promotion and implementation of cyber security standards. As I read this section I had visions of the state intelligence fusion centers being remade into cyber centers. In practice, though, it looks more like the Centers of Academic Excellence program, which has been expanded well beyond its original concept. The cyber security centers will exist to transfer and promote standards, processes, and techniques developed by the National Institute of Standards and Technology (NIST). These centers are not research facilities but applied response entities that actively promote standards and react to cyber incidents. From an applied view, they look more like emergency medical services than force protection specialists.

NIST is going to be required to engage in some specific activities. As stated in Section 6, NIST will have to derive and develop cyber security metrics so that the economic impact of cyber security failures can be adequately evaluated. The business methods of return on investment (ROI) and risk analysis tools (e.g., expected loss averaging) are inaccurate and abysmal measures. New methods of calculating actual loss and impact are going to be required as computing ubiquity reaches monolithic scale.
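To make the measurement problem concrete, here is a rough sketch, with entirely made-up numbers, of the textbook expected-loss arithmetic (single loss expectancy times annualized rate of occurrence) that the act is implicitly asking NIST to improve upon; the figures and the C++ framing are mine, not the act's.

```cpp
#include <iostream>

// A rough sketch of the textbook "expected loss" calculation.
// All figures are hypothetical and purely illustrative.
int main() {
    const double asset_value     = 2000000.0; // hypothetical value of the system at risk
    const double exposure_factor = 0.25;      // fraction of value lost per incident
    const double annual_rate     = 0.5;       // estimated incidents per year

    const double sle = asset_value * exposure_factor; // single loss expectancy
    const double ale = sle * annual_rate;              // annualized loss expectancy

    // The classic justification: buy a control only if it costs less than the
    // ALE it removes. The measure says nothing about cascading failures,
    // reputational damage, or interdependent critical infrastructure.
    std::cout << "SLE: $" << sle << "  ALE: $" << ale << '\n';
    return 0;
}
```

Every input in that calculation is a guess, which is precisely why the call for real metrics matters.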

The licensing and certification of cyber security professionals in section 7 is going to be the most debated section. A key phrase from the draft act is “…it shall be unlawful for any individual to engage in business in the United States, or to be employed in the United States, as a provider of cybersecurity (sic) services to any Federal agency or an information system or network designated by the President, or the President’s designee, as a critical infrastructure information system or network, who is not licensed and certified under the program.”

Discussion of this point has already been buzzing around the blogosphere. The questions: is this a barrier to entry? Will it increase professionalism? Will it make security more or less likely? Are comparisons to doctors and lawyers, who are licensed, appropriate? Almost every substantive profession, from electrician to plumber, has some type of licensure to professionalize the discipline. Yet the argument will be that computer security is different. Perhaps it is that very argument that has created the current security fiasco. As long as information technology and information security are "different," there can be no basic standardization. I suspect the arguments on either side are fallacious.

The assessment of secure coding education is incredibly interesting to me. Section 11, Federal cybersecurity research and development, calls for an assessment of the number of students who earned undergraduate degrees in computer science, or in other programs whose graduates have a substantial probability of being engaged in software design after graduation, as well as the number of students who completed substantive education in secure coding. They also want a description of what that education looks like.

Let me save them the trouble. Computer science programs have only recently seen an increase in enrollment; the computer science nerdgasm has finally bottomed out, but the reality remains. Information technology programs, though, are still growing. The hyper-specificity of the computer science discipline has been superseded by the application-oriented information technology disciplines: where computer science devolves into intricate detail, the information technology programs actually do something. The metric-driven, formal-methods, desk-checking routines of computer science will never permeate the open source culture of hobbyists and volunteers. Further, the "many eyes" theory of open source is a myth; most users are free riders who have never contributed to an open source project.

The use of "garbage collected" or "framework" languages such as C# and Java has decreased students' understanding of basic coding tasks. Memory leaks, buffer overflows, and bounds checking are left to the framework, which inherently decreases understanding of the programming task. In an effort to lower the educational barriers to entry, these languages have undermined the principles of software coding. An object-oriented language like C++ has all of the original structures found in C but enforces good coding practices (or the code doesn't work). The introduction of secure coding practices, and the premise that it should happen early, are both part of the software curriculum literature. Somewhere in the rush to teach more students, the basic premise that software coding is hard has been lost.
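To illustrate what gets hidden, here is a minimal C++ sketch, my own illustrative example rather than anything from the act or a particular curriculum, of the bounds checking that a garbage-collected framework performs silently on the student's behalf.

```cpp
#include <cstring>
#include <iostream>
#include <stdexcept>
#include <string>
#include <vector>

// Illustrative only: in C and C++ the bounds check belongs to the programmer.
int main() {
    const std::string input(16, 'A');   // attacker-sized input, larger than the buffer
    char buf[8];

    // The unsafe pattern a student must learn to recognize:
    //   std::strcpy(buf, input.c_str());   // writes past buf[7]: undefined behavior

    // Disciplined alternative one: check the length yourself before copying.
    if (input.size() + 1 <= sizeof(buf)) {
        std::strcpy(buf, input.c_str());
    } else {
        std::cout << "input too large for buffer, copy refused\n";
    }

    // Disciplined alternative two: use a container whose checked accessor
    // fails loudly instead of silently corrupting memory.
    std::vector<char> checked(8);
    try {
        checked.at(15) = 'A';           // throws std::out_of_range
    } catch (const std::out_of_range&) {
        std::cout << "out-of-bounds write refused\n";
    }
    return 0;
}
```

A student raised entirely on C# or Java never writes that check, never sees the failure mode, and so never internalizes why it exists.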

There is also another snake in the grass for software assurance: the reliance on tools rather than skills has eroded those skills and replaced them with a tool mindset. Software creation is a tool-heavy field of endeavor, but a back-to-basics coding requirement will prepare programmers for the future far better than tools that obscure and hide their mistakes. Academia is pushed by industry to use the latest and greatest tools as a principle of vocationalization, and this is an inherent problem for the larger national community. There is also significant inertia, engendered by corporations such as Microsoft (C#) and Sun Microsystems (Java), toward teaching with their latest tools or languages, which creates false capabilities and stovepipes in the national software programming infrastructure. A far better language for the larger community (though nowhere near as easy), ISO-standardized C++ has free tools, no barrier to entry, enforces good coding practices, hides no errors, requires better understanding, and has no industry advocates.

Even in my own institution I fight against the inherent Microsoft/Sun/Apple biases and the use of high-level framework languages like C#. It is nearly impossible to overcome the inertia and the issues caused by this kind of bias. We have an entire software assurance course that taxes students' capabilities and enforces good programming practices, but there is little to be done when few if any students understand a fundamental language like C++, having spent most of their time in C#. Software programming should be hard; it is a difficult and time-consuming task. The only way to secure software is to get back to basics and do high-quality programming, starting in the university and supported by industry.

Another sub-section, labeled cybersecurity (sic) modeling and testbeds (sic), caught my eye. The ability to realistically model the cyber environment is very interesting to me, and the National Cyber Range proposed by DARPA even found me traveling to Washington, DC to find out more. The result, though, was very disappointing. Rather than DARPA truly expanding the science, the incremental plodding would continue: current tools would be integrated rather than any technological leap attempted. I currently have a design course running with senior undergraduates who have more imagination and drive than I have seen in the DARPA funding announcements, but the industry-focused contract systems won't see imagination and creativity funded at a university. What we will see is mediocrity. There is a fairly large body of literature on how to solve testbed and modeling issues within the safety of a laboratory. With very little money ($400K) my students could build something spectacular, but there is no funding mechanism.

There is a lot of money to be made in keeping cyber security and information assurance and security in the dark ages. The issues identified in the Cybersecurity Act of 2009 have been discussed in depth since the early 1970s. The recommendations are nothing new. The problems are solvable, yet we continue to not solve them. Pedantic discussion of cyber Pearl Harbors and Katrinas continues to be the opening gambit of salacious books. Yet in his testimony to the Senate a few weeks ago, Eugene Spafford mentioned that these catastrophic disasters are occurring continuously; our understanding of them is simply not keeping up with the scope of the problem.

There are a variety of reasons that information technology problems do not get solved. The computer science establishment within academia enjoys a certain aura of exclusivity as long as computers are special and not utilitarian. Computer science departments, couched in either the mathematics discipline or the engineering discipline, present additional levels of difficulty beyond the other science, technology, engineering, and mathematics (STEM) disciplines. As such, they are given access to resources above and beyond the normal and carry a certain notoriety. The student population has noticed this exclusivity, however, and until recently computer science enrollment declined. Barriers to entry couched in discussions of failing K-12 STEM preparedness miss the actual issue: hubris within the computer science discipline.

The industries that serve the information technology environment earn a profit premium from products that are obsoleted by their own flaws. The upgrade and long-term support costs represent an enormous profit motive for providing poor technology implementations. The economic models are horribly upside down for any incentive to produce quality: why would any software application or operating system provider produce software so superior that it never needs upgrades? Expecting the egalitarian software industry to truly solve a problem that would put it out of business borders on the moronic. There is a lot of money to be made by producing products that have to be replaced every few years, something the auto industry knows a lot about. This leads to the basic flaw in the Cybersecurity Act of 2009.

Licensure of cyber security specialists and providers would seem to solve so many problems. It is, though, horribly flawed, and analogous to licensing building inspectors: a licensed inspector can point out the flaws, but it is the building contractor who must fix them. Licensure of security experts without licensure of software programmers, providers, and companies will fail. The information technology industry has maintained a mystique of uniqueness through obfuscation and a peculiar celebration of mediocrity. It is a terrible truth that cyber security as a realizable goal is a myth. Over 40 years of literature, discussion, law, and behavioral influence have not been able to change this, because the culture of the discipline and the profit motives of the companies are at odds with change.

Until the barriers to understanding are shattered by treating information technology as a common utility like water, gas, electricity, and air, security will not be possible. The business world needs to expel ideas of "return on investment" and other profit metrics when considering information technology; information technology is a cost of doing business, just like electricity. At the same time, information technology practitioners need to give up the dictatorial, autocratic, grand vizier attitude of system ownership. The only thing riding on these kinds of dramatic behavioral and strategic changes is national security.
