It is interesting how we forget that information assurance and security is not always a technology problem. If you go to a major security conference there will be plenty of hardware vendors standing around, but I’ve never seen psychologists or psychiatrists trotting out methods of determining insider threats. Sabotage and destruction of company assets have been happening for a long time. Our own innate amusement at beating up a rental car, or driving the company van a little harder than we would our own car, is a symptom of this perverse attitude toward the company’s property. When we expand the scope beyond our own actions and start considering the actions of hostile outsiders, the risks become greater. With distance, attribution as a capability is perceived to shrink.
OSS, January 1944, "Simple Sabotage Field Manual (Provisional)", declassified
- Sabotage afflicts corporate and government networks in various forms and fashions. Little things go a long way. The introduction gives explicit instructions, but one thing that becomes obvious is that the mundane, not the strange or fanciful, are the elements of sabotage. Is this true in the information assurance realm of the corporation? Is this similar to or different from what we find in the computer and data security environments?
- Of course, the surly, non-cooperative attitude, up to and including stupidity, is among the most dangerous tools of the office worker. Sabotage doesn’t take brilliance; sometimes it takes the genius of feigned stupidity (Introduction, bullet D).
- The effect is limited, and the saboteur may feel discouraged, unless they feel they are part of a larger group. Is this basic principle not very close to the collectivism and mobbing that the Internet has allowed to flourish? Is this very capability not in fact inherent in social media culture?
- As you work through each of the discussions, replace "saboteur" with "user" and analyze the discussion in detail.
- After reading this document, does the concern about cyber terrorism and cyber warfare seem meager? Be sure to read pages 29 through 31 closely. Do these sabotage events seem similar to any information assurance issues?
OTA, January 1992, "Technology Against Terrorism: Structuring Security", NTIS #PB92-152529
- If there is any area that has become an obvious terrorist target in the public’s eye, it has to be aviation security. Bruce Schneier has on occasion discussed the concept of "security theater". I want you to look at the date this document was prepared and filter the things that happened nine years later through that temporal lens.
- The authors discuss in depth the issues of sharing information between agencies and the problems that come with it (page 47). Of course, all sharing problems are reportedly solved (page 48).
- One area we don’t think about in reducing risk is the redundant research and the lack of intellectual dissemination and aggregation that occur when people fail to work together (page 49). In the field of information assurance and security this becomes very obvious, considering the length of time we’ve been fighting the same problems.
- One element we come back to often is the idea of the human in the system. Users are blamed for many things; management is slovenly in dealing with problems. Humans have been part of the issue in security for a long time (page 80).
- Another interesting aspect is that, with all the effort put toward human factors, the FAA is "ill-prepared to identify and address possible human-factors concerns" (page 81). This is blamed on aspects of the technologies. How would you describe the corresponding information assurance and security issues? Why would these be similar problems between computer security and aviation security? Reading the report, what similar security problems can you identify?
- Only recently has the TSA been thinking about passenger profiling. The document discusses this succinctly and in depth (page 85). One element of this: why would you allow a threat to get into the airport, let alone board an airplane? Why would you let bad packets or damaging payloads into your network at all? And once they are there, are you sure you want to rely on the airplane being able to sustain damage and survive (page 11)?
- Of course it took them 20 years, but millimeter wave technology is finally making its way into airports (page 117). Read this section carefully and think about the time to implementation, the changes in the equipment that occurred, and the barriers. What are the barriers to information security? What similar technologies are currently unavailable because of similar barriers in the information security environment?
- Look at the recommendations (page 23). Are these consistent with other recommendations we’ve seen?
- Minimize changes to government oversight and regulation. Several of the infrastructures have a long history of government regulation, with a clear legislative mandate and a record of success. We consciously avoided proposing significant changes in regulation.
- Build on that which exists. It will be easier and faster to implement, more effective, and more likely to be accepted than creating something new.
- Depend on voluntary cooperation. Partnerships between industry and government will be more effective and efficient than legislation or regulation.
- Start with the owners and operators. They have a strong economic stake in protecting their assets and maximizing customer satisfaction. They understand the infrastructures and have experience in responding to outages.
- Practice continuous improvement. Take action in affordable increments. There is no “magic bullet” solution. Aim not only to protect the infrastructures, but also to enhance them.
- Coordinate security with maintenance and upgrades. Security should be incorporated in planned maintenance and scheduled upgrades.
- Promote government leadership by example. Government-owned facilities should be among the first to adopt best practices, active risk management, and improved security planning.
- The totality of the information saturation and the scope of the outage data quickly overwhelm the cursory viewer. We’ve talked about information saturation before. Looking at the graphic "Coincidence or attack" (page 29), the reality starts to set in. Now imagine coordinating this with your enterprise information security program.
- See the information pyramid (page 30) and think about this. If the previous graphic (page 29) is accurate and figure 6 is the result, think back to the discussion on intelligence as a lossy system. Is there any possible way to capture the bulk of information risks, threats, and vulnerabilities regardless of recommendations? Depending on how you answer that, several other elements rapidly become obvious.
- To address the warning and notification requirements, a structure is suggested. In corporations we often try to mirror government organizations to make sure processes are aligned with regulations. Think about mirroring this (page 50):
- An Office of National Infrastructure Assurance in the White House to serve as the focal point for infrastructure assurance.
- A National Infrastructure Assurance Council of prominent infrastructure corporate leaders, representatives of state and local government, and Cabinet officers to address infrastructure assurance policy issues and make appropriate recommendations to the President.
- An Infrastructure Assurance Support Office to provide functional support and management of the federal organizations involved in infrastructure assurance, as well as providing direct assistance to the public and private sector partnership effort.
- A federal Lead Agency for each sector to take the initiative in bringing together the owners and operators to create a means for sharing information that is acceptable to all.
- A Sector Infrastructure Assurance Coordinator for each infrastructure to function as a “clearing house,” organizing information sharing activities, protecting the information provided by each participant, and acting as a channel for information to, and from, the government.
- An Information Sharing and Analysis Center consisting of government and industry representatives working together to receive information from all sources, analyze it to draw conclusions about what is happening within the infrastructures, and appropriately inform government and private sector users.
- A Warning Center designed to provide operational warning of a physical or cyber attack on the infrastructures.
- The banal idiocy of all reports is found when this homily is trotted out: "In many families, children are more computer literate than their parents." (page 69). In how many ways is that wrong? The ability to send SMS or use MySpace or Facebook is not computer literacy any more than watching Wishbone is reading a book.
- The legal system itself in many ways challenges the ability to make computing systems more secure. There are specific legal impediments to simply testing that still have not been fixed (page 86). Consider them in detail and think about what the secondary and tertiary effects might be if they were removed.
- Look at the graphics (page 6 and page 18); these show what is commonly referred to as a cross-domain attack. One domain or terrain is used to affect another, or, through hybridization, to impact the other. This was written in 1997 but still is not well understood in most circles.
- The attack discussed as Eligible Receiver is interesting (page 8). Consider the methodology and then think about whether it would be legal to do the same today. Remember this was written in 1997 (thirteen years ago).
- The entire systemic vulnerability spectrum can be summed up in the following quote from page 10.
“In sum, technology and change produce better service at lower cost, new markets and more efficient processes throughout the nation and indeed the world. As a result, we depend more than ever on infrastructure services. But at the same time, market forces result in a diffusion of accountability, a decrease in “end-to-end” or system-wide analysis and responsibility, less research and development investment, and a reduction in reserve capacity. Today’s processes are more efficient, but they lack the redundant characteristics that gave their predecessors more resilience.”
Looking at chapter 3 (page 11), take all of the technologies/services and draw a matrix or grid to determine which ones require which others (a dependency diagram). What is the critical path? A minimal sketch of the exercise follows below.
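The sketch below shows the mechanics of the exercise in Python. The service names and the dependencies among them are illustrative assumptions, not figures taken from chapter 3; the point is building the dependency matrix and walking the longest dependency chain.

```python
"""Sketch: build a dependency matrix for a handful of infrastructure services
and find the longest chain of dependencies (the "critical path").
The services and dependencies are assumed for illustration only."""

from functools import lru_cache

# requires[x] = services that x cannot operate without (assumed, acyclic)
requires = {
    "electric power": [],
    "telecommunications": ["electric power"],
    "banking/finance": ["electric power", "telecommunications"],
    "water supply": ["electric power"],
    "transportation": ["electric power", "telecommunications"],
    "emergency services": ["telecommunications", "transportation", "water supply"],
}

services = sorted(requires)

# Dependency matrix: an "X" means the row service requires the column service.
print(f"{'':<20}" + "".join(f"{s[:12]:<14}" for s in services))
for row in services:
    marks = "".join(f"{'X' if col in requires[row] else '.':<14}" for col in services)
    print(f"{row:<20}{marks}")

@lru_cache(maxsize=None)
def chain(service: str) -> tuple:
    """Longest dependency chain starting at `service` (assumes no cycles)."""
    deps = requires[service]
    if not deps:
        return (service,)
    return (service,) + max((chain(d) for d in deps), key=len)

critical = max((chain(s) for s in services), key=len)
print("\nCritical path:", " -> ".join(critical))
```

With these assumed dependencies the longest chain runs from emergency services back to electric power; swap in the actual services from chapter 3 and the same walk exposes the real critical path.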
February 2003, The National Strategy to Secure Cyberspace
- The report has five critical priorities (page 11). Read them carefully. How many of the recommendations are for private industry and how many for the government? Balance these recommendations against the recommendations for fixing aviation security in 1992.
- What are the similarities between the structure of the recommendations in each report (aviation v. cyber)? Have we implemented these priorities? If not, why not? What will it take to cause a sea change in how people consider cyber security?
- Are the threats discussed (page 21) overblown? Are the threats as discussed given substance in the report?
- Sometimes you read a report and you are suddenly struck between the eyes by an obvious, huge missing issue. Which level is missing from the discussion of "Threat and Vulnerability" (page 22)? The report has significant biases that follow a certain obvious missing consideration. Part of this is likely due to the biases of the writers, but the flaw is huge.
- We’re glad to see protecting privacy and civil liberties is once again not number one on the list of principles (page 29).
- What is cyber space (cyberspace? cyber-space?) (page 34)?
- Tucked deep in this report is "address verification" (page 46). What would that mean to the Internet, and what would the secondary and tertiary effects of "address verification" be? One possible reading is sketched after this list.
- The authors of this document talk about promoting awareness of cyber space security issues (page 52). Within the realm of information assurance and security that is likely happening. However, what do you think about outside of the technical realm? An interesting paper for my undergraduates would be to gather ALL of the computer security conferences for a year and get the attendance records, then compare that to, say, the "Knights of Columbus" or Rotary attendance records.
- Do you find it interesting that Institutions of Higher Education (IHEs) are listed as a problem rather than a solution (page 55)? IHEs are blamed for the exploitation of their networks but are not listed as a way to actually solve it. This is part of the bias discussed previously. The fix is listed as "training" (page 56).
- A common theme is attribution, but the language "withstand attack regardless of the origin of the attack" (page 64) is interesting. What would that mean to the enterprise and corporate network?
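One way to read "address verification" (page 46), and it bears on the attribution question in the last bullet, is validating that traffic actually originates where it claims to, in the spirit of ingress filtering. The sketch below assumes a hypothetical policy mapping network interfaces to the source prefixes expected on them; the interface names and prefixes are made up for illustration, not taken from the report.

```python
"""Sketch: one reading of "address verification" — reject packets whose claimed
source address does not belong to a prefix expected on the interface they
arrived on. Interfaces and prefixes below are hypothetical."""

from ipaddress import ip_address, ip_network

# Assumed policy: which source prefixes are legitimate on each edge interface.
expected_sources = {
    "customer-edge-1": [ip_network("192.0.2.0/24")],
    "customer-edge-2": [ip_network("198.51.100.0/24")],
}

def verify_source(interface: str, claimed_source: str) -> bool:
    """Return True only if the claimed source address is plausible for the interface."""
    prefixes = expected_sources.get(interface, [])
    return any(ip_address(claimed_source) in prefix for prefix in prefixes)

print(verify_source("customer-edge-1", "192.0.2.17"))    # True: expected prefix
print(verify_source("customer-edge-1", "198.51.100.5"))  # False: spoofed or off-site source
```

The secondary and tertiary effects to think about: who maintains the policy, what breaks when addresses legitimately move, and what the enterprise gains in attribution by refusing traffic it cannot verify.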