Research Note: Defining attacker knowledge, skill, and ability

Introduction

There is an area of largely unexplored value at the intersection of information assurance and security, standardized education, cognitive methods, and the assessment of those methods. At this intersection lie the fundamentals of profiling and evaluating the knowledge, skills, and abilities of the adversarial network user. Within it an investigator can evaluate an adversarial computer user's knowledge, skills, and abilities, identify where those were acquired, and from that build a reasonably accurate model of the expected capabilities of this type of adversary. Of particular interest are what the attacker might do next, where the attacker might have come from, and what level of sophistication is required of the protector of information assets.

Figure 1 A model of the perceived dissertation topic and the areas of knowledge required for success.

The ability of investigators to accurately profile the attack vectors used by the adversarial computer user population within the infrastructure, and the skills required to initiate those attacks, would be a great boon to future investigations. Understanding where that population learns the knowledge, skills, and abilities needed to prosecute an attack would also create an opportunity for early intervention. This is a relatively onerous task, whether dealing with the corporations that hold the relevant data repositories or with the privacy issues raised by harvesting data in the wild. The methodology to accomplish it is observational and data intensive across the three domains of knowledge.

Information Assurance and Forensic Analysis Issues

The value of digitized information assets continues to rise as their criticality and fundamental requirements become apparent. Criminal enterprise focuses on the junction of the possible and the adversarial, and this nature of computer crime continues to require specific expertise. The knowledge, skills, and abilities of people who use and abuse computers constitute a specific type of threat to infrastructure systems. Geographical boundaries and time constraints no longer limit the criminal activity; it has long been known that the enterprise of criminal action has a widely expanded scope and threat envelope (Parker & Nycum, 1984). This increased level of criminal activity has been perceived at all levels of society. The number of security officers has grown alongside the validated and realized threat to information assets. This growth in security professionals has also been realized at the university level and has led to procedures for dealing with the domain of education and the expectations of privacy while investigating computer-based criminal activity (Axlerod & R., 1999).

Models exist that detail the areas of information assurance and security and the specific risks to information assets within the enterprise. A new model can be developed by the author that breaks this diverse group of security services into manageable chunks. Historically, security services have existed as "secs" labeled with their governmental service classifier.

Figure 2 Historical Security Service Classifiers (attribution pending, diagram by author)

A new model was designed that took into account the different types of security and organized them into a model of security services. The original three services were extended with two more, creating a cohesive set of security services that has become the de facto standard for understanding how security is implemented (Maconachy, Schou, Ragsdale, & Welch, 2001). The author considers the concepts of non-repudiation and authentication largely unneeded at this point, but they remain enumerated in several policy documents.

Figure 3 The five security services as discussed (Diagram by author) (Maconachy, Schou, Ragsdale, & Welch, 2001)
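
For concreteness, the five services can be captured as a simple enumeration. The snippet below is only an illustrative sketch, assuming the original three services are confidentiality, integrity, and availability and the two additions are authentication and non-repudiation, as described above.

    from enum import Enum

    class SecurityService(Enum):
        """The five security services of the model discussed above."""
        CONFIDENTIALITY = "confidentiality"   # one of the original three
        INTEGRITY = "integrity"               # one of the original three
        AVAILABILITY = "availability"         # one of the original three
        AUTHENTICATION = "authentication"     # later addition
        NON_REPUDIATION = "non-repudiation"   # later addition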

A model is needed that allows the specific areas and security concepts to be classified for analysis. This ontological deconstruction occurs within the scope of determining where the business and process concerns of information assurance and security intersect to create comparatively similar needs. The model, further adapted by the author, sets out areas of software assurance, systems assurance, and operational assurance that are inclusive of the security services (Maconachy, Schou, Ragsdale, & Welch, 2001) and security types (attribution pending).

Figure 4 Security areas and their associated areas of concern

One issue is the "pushing out" of security to divergent user groups, specifically the expectation that users will battle organized crime when organized crime uses computers. We would not expect a user to take on a crime ring face to face, yet we do when computers are involved. Involving users in the process is complex, whether the user group simply exchanges information or is deeply involved in investigatory tactics (Vreede, Hengst, & Sol, 1995). The cultural shift between organized crime and the concept of "hackers" has even spawned a debate about the lexicon itself: is the person involved in computer crime a hacker or a cracker? Though identifying why somebody uses a computer maliciously may appear to be supposition, there are well-known reasons people give for "hacking." As the law enforcement community has matured into these concepts, so has the criminal world, which has contributed over the last decade to a dichotomy of participation on the part of companies now unwilling to work with law enforcement (Sukhai, 2004).

The little-known investigatory arm of computer crime is the field of computer forensics. Few people know or understand what computer forensics covers, though most likely associate the word forensics with the law. The technology of forensics has expanded the power of law enforcement to investigate criminal cases and to access information previously thought lost or of low value (Fernandez, Stephen, Mario, & Dulal, 2005). The educational and technological changes apparent in society are also mirrored by its criminal element, which bypasses supposedly secure systems quite easily (Fernandez, Stephen, Mario, & Dulal, 2005). The discipline of forensics is based on the concept that people always leave a digital trail through their activities on a computer. For example, deleting a file in the Windows operating system appears to move the file to the Recycle Bin, and emptying the Recycle Bin appears to destroy the file. Neither is actually true: only the file's directory entry is moved and then removed. The file system marks the space as free only after that last step, and until the space is overwritten the file's contents remain on disk.
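
To make the digital-trail point concrete, the sketch below scans a raw disk image for a well-known file signature, a simple form of file carving, showing that "deleted" content can often still be located in unallocated space. It is an illustrative sketch only; the image path "disk.img" is a hypothetical placeholder, and headers that straddle a read boundary are missed by this simplified approach.

    # Minimal file-carving sketch: look for JPEG headers in a raw disk image.
    # "disk.img" is a hypothetical path; any raw image can be substituted.
    JPEG_HEADER = b"\xff\xd8\xff"  # magic bytes that begin a JPEG file
    CHUNK_SIZE = 1024 * 1024       # read the image one megabyte at a time

    def find_jpeg_offsets(image_path: str) -> list[int]:
        """Return byte offsets where JPEG headers appear in the image.
        Headers spanning a chunk boundary are not detected in this sketch."""
        offsets = []
        position = 0
        with open(image_path, "rb") as image:
            while True:
                chunk = image.read(CHUNK_SIZE)
                if not chunk:
                    break
                start = 0
                while True:
                    hit = chunk.find(JPEG_HEADER, start)
                    if hit == -1:
                        break
                    offsets.append(position + hit)
                    start = hit + 1
                position += len(chunk)
        return offsets

    if __name__ == "__main__":
        for offset in find_jpeg_offsets("disk.img"):
            print(f"Possible JPEG header at byte offset {offset}")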

The high-tech cognitive divide around computer forensics has created onerous and often dangerous elements in criminal prosecution, such as un-sworn citizens acting as law enforcement officers and completing forensic investigations (Harrison, Heuston, Mocas, Morrissey, & Richardson, 2004). Though technical skill is better now than in 2004, the concern is still valid. Beyond the credibility of the investigators there is also the issue of the data itself: is it possible to investigate a system while it is still live? This question has hounded the forensics world for years as live forensic analysis is weighed against its risks (Adelstein, 2006; Carrier & Spafford, 2003). Investigating malware all but requires working with a live system. The problem is that neither side of the argument addresses the credibility issue of a discipline based only on tools and not on science.

The path toward cyber forensics has not been without issues. Constraining and protecting digital evidence with cryptographic services has helped preserve the evidentiary nature of forensic data (Hosmer, 2006), but even with this procedural rigor the discipline of computer forensics may be endangered by turning a blind eye to scientific analysis. Since the definition of computer forensics has purposely not been given here, the area of endeavor can be further muddied when computer forensics is recast as network forensics. This subtle name change directs the investigator away from studying the hard drive or data of a system and back toward the network layer. That is a valid and understood demarcation point until the systems become entangled somewhere in between. Historically, network attacks have targeted the protocol layer, the physical layer, or the data traveling on the line (Laurie, 2004). Unfortunately, the data to be forensically analyzed is still found on the storage devices of a system, since data on a network has a short time to live. One answer is to not separate these areas at all and simply dump everything into a bucket called cyber.
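
As an illustration of the cryptographic protection mentioned above, the sketch below hashes an evidence file at acquisition and verifies the hash later, showing that any change to the data would be detectable. This is only a minimal example and is not the digital evidence bag format itself; the file name "evidence.img" is a hypothetical placeholder.

    # Sketch of evidence-integrity checking with a cryptographic hash.
    # "evidence.img" is a hypothetical acquired image; any file works.
    import hashlib

    def sha256_of_file(path: str) -> str:
        """Compute the SHA-256 digest of a file, reading it in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as handle:
            for block in iter(lambda: handle.read(65536), b""):
                digest.update(block)
        return digest.hexdigest()

    if __name__ == "__main__":
        acquisition_hash = sha256_of_file("evidence.img")   # recorded at seizure
        # ... analysis is performed on a working copy in the meantime ...
        verification_hash = sha256_of_file("evidence.img")  # re-checked later
        if verification_hash == acquisition_hash:
            print("Hashes match; contents unchanged since acquisition.")
        else:
            print("Hash mismatch; evidentiary integrity is in question.")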

Curricula Models and Standards as a Profiling Tool

A variety of standards and curricula models exist for information security. The assessment and evaluation of curricular standards as applied to the instructional process is well documented, and substantial literature exists on the implementation and assessment of outcome-based objectives and other learning models and taxonomies (Dark, 2004; Dark & Winstead, 2005; Kamali, Liles, Winer, Jiang, & Nicolai, 2005; Logan & Clarkson, 2005).

Balancing an understanding of the attacker's mental processes against the knowledge requirements of the attack, the skills involved and the associated curricula can be evaluated to locate where that knowledge might have been gleaned, since there are larger patterns to knowledge transference. Though little more than a cataloging process, this identification of knowledge, skills, and abilities within the different curriculum models, approached from the position of information assurance and security, has not been done. A taxonomy of the models and the associated learning elements exists for teaching information assurance and security (Dark, Ekstrom, & Lunt, 2005), but not one built by academia for the sole purpose of identifying the skills of hackers. Further, the associated protective skills and development strategies are well developed and enumerated in multiple places by the federal government as they apply to the process of information assurance and security (National Security Telecommunications and Information Systems, 1994, 1997a, 1997b, 1997c, 2000).
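
One hypothetical way to begin such a cataloging effort is a simple lookup table from observed attacker skills to the curriculum topics where those skills are commonly taught. The sketch below is purely illustrative; the skill names and course topics are assumptions, not entries drawn from any published standard or curriculum.

    # Illustrative mapping from observed attacker skills to curriculum topics.
    # All entries are hypothetical examples, not taken from a published standard.
    SKILL_TO_CURRICULUM = {
        "sql_injection": ["database design", "secure web programming"],
        "packet_crafting": ["computer networks", "network security"],
        "log_tampering": ["operating systems", "system administration"],
        "password_cracking": ["cryptography", "authentication systems"],
    }

    def curriculum_sources(observed_skills: list[str]) -> set[str]:
        """Return the curriculum topics that could have supplied the skills."""
        topics: set[str] = set()
        for skill in observed_skills:
            topics.update(SKILL_TO_CURRICULUM.get(skill, []))
        return topics

    if __name__ == "__main__":
        print(curriculum_sources(["sql_injection", "log_tampering"]))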

Preparing a Test Model

The test model, or hypothetical model, should provide some benefit in determining the skill set of an antagonistic user. Since the Internet was never really developed with security in mind (Parker & Nycum, 1984), we must keep security front and center while also recognizing that destruction is simple. Protecting systems and software from intrusion is a difficult task that requires diligence and resources (Oates-Lewandowski, 2005).

Figure 5 Possible example of an attack and corresponding understanding of the adversary.

If the model as designed were able to determine the skills and characteristics of the attack (even an automated one), then the skills of the adversary could be modeled and determined fairly quickly. Within the model, if the attacker were found to be using only an automated script (an easily available tool), the inferred skill requirement might be lessened. If the perpetrator then hid their tracks successfully or cleverly, the model might instead suggest that the perpetrator had significantly more skill.
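
The sketch below is one hypothetical way such reasoning could be encoded: a small rule-based scorer that raises or lowers an estimated skill level based on indicators observed in the attack. The indicator names and weights are assumptions chosen for illustration, not a validated instrument.

    # Hypothetical rule-based estimate of attacker skill from observed
    # indicators of an attack. Indicators and weights are illustrative only.
    SKILL_RULES = {
        "used_public_script": -2,   # relied on an easily available automated tool
        "custom_exploit": +3,       # exploit appears purpose-built
        "cleared_logs": +2,         # tracks hidden after the intrusion
        "misconfigured_proxy": -1,  # sloppy attempt at anonymity
    }

    def estimate_skill(indicators: set[str], baseline: int = 5) -> int:
        """Return a 0-10 skill estimate, starting from a neutral baseline."""
        score = baseline + sum(SKILL_RULES.get(i, 0) for i in indicators)
        return max(0, min(10, score))

    if __name__ == "__main__":
        # A script-kiddie profile versus a more careful adversary.
        print(estimate_skill({"used_public_script", "misconfigured_proxy"}))  # low
        print(estimate_skill({"custom_exploit", "cleared_logs"}))             # high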

Cognitive Models and Learning

It is of great importance to identify the methods adversarial computer users employ to learn. A variety of taxonomies allow knowledge acquisition to be identified and categorized. This student needs considerably more information before this area of the research can be fully identified and evaluated.

Methodology and Data Repositories

The proposed methodology is to acquire the data repositories of a (CONFIDENTIAL) company that will make its large repository of attack vector data available for research. The database reportedly contains millions of attacks and a substantial amount of corresponding data on the location and identity of the adversarial users in question. This data has been examined from multiple views, but never applied to conceptual learning models and curricula models for evaluating knowledge, skills, and abilities.

Most likely a small sample of the data can be analyzed successfully by hand. However, using natural language processing and ontological elements, it may be possible to automate the parsing and rule-based evaluation of a substantial portion of the data once a small portion has been keyed and coded (that is, once definitions have been developed). This process is not yet fully developed, and neither is the cognitive model or the evaluation of the adversarial user.
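
As a rough sketch of what that rule-based coding might look like, the example below tags free-text attack records with codes from a small hand-built dictionary. The codes, keyword patterns, and sample record are assumptions made for illustration; the real rule sets would come from the keyed and coded sample described above.

    # Hypothetical rule-based coding of free-text attack records.
    # Codes and keyword patterns stand in for definitions developed by hand.
    import re

    CODING_RULES = {
        "AUTOMATED_TOOL": [r"\bnmap\b", r"\bscript\b", r"\bscanner\b"],
        "SOCIAL_ENGINEERING": [r"phish", r"pretext", r"impersonat"],
        "ANTI_FORENSICS": [r"delet\w+ (the )?logs?", r"wip\w+ (the )?logs?", r"timestomp"],
    }

    def code_record(text: str) -> list[str]:
        """Return the codes whose keyword patterns match the record."""
        lowered = text.lower()
        return [code for code, patterns in CODING_RULES.items()
                if any(re.search(p, lowered) for p in patterns)]

    if __name__ == "__main__":
        sample = "Attacker ran an nmap scan, then deleted logs on the host."
        print(code_record(sample))  # ['AUTOMATED_TOOL', 'ANTI_FORENSICS']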

Works Cited

Adelstein, F. (2006). Live forensics: diagnosing your system without killing it first. Commun. ACM, 49(2), 63-66.

Axlerod, H., & R., J. D. (1999). Crime and punishment in cyberspace: dealing with law enforcement and the courts. Paper presented at the Proceedings of the 27th annual ACM SIGUCCS conference on User services: Mile high expectations, Denver, Colorado, United States.

Carrier, B. D., & Spafford, E. H. (2003). Getting Physical With the Digital Investigation Process. International Journal of Digital Evidence, 2(2).

Dark, M. J., Ekstrom, J. J., & Lunt, B. M. (2005). Integration of information assurance and security into the IT2005 model curriculum. Paper presented at the Proceedings of the 6th conference on Information technology education, Newark, NJ, USA.

Dark, M. J. (2004). Assessing student performance outcomes in an information security risk assessment, service learning course. Paper presented at the Proceedings of the 5th conference on Information technology education, Salt Lake City, UT, USA.

Dark, M. J., & Winstead, J. (2005). Using educational theory and moral psychology to inform the teaching of ethics in computing. Paper presented at the Proceedings of the 2nd annual conference on Information security curriculum development, Kennesaw, Georgia.

Fernandez, J. D., Stephen, S., Mario, G., & Dulal, K. (2005). Computer forensics: a critical need in computer science programs. J. Comput. Small Coll., 20(4), 315-322.

Harrison, W., Heuston, G., Mocas, S., Morrissey, M., & Richardson, J. (2004). High-tech forensics. Commun. ACM, 47(7), 48-52.

Hosmer, C. (2006). Digital evidence bag. Commun. ACM, 49(2), 69-70.

Kamali, R., Liles, S., Winer, C., Jiang, K., & Nicolai, B. (2005). An implementation of the SIGITE model curriculum. Paper presented at the Proceedings of the 6th conference on Information technology education, Newark, NJ, USA.

Laurie, B. (2004). Network Forensics. Queue, 2(4), 50-56.

Logan, P. Y., & Clarkson, A. (2005). Teaching students to hack: curriculum issues in information security. Paper presented at the Proceedings of the 36th SIGCSE technical symposium on Computer science education, St. Louis, Missouri, USA.

Maconachy, W. V., Schou, C. D., Ragsdale, D., & Welch, D. (2001). A Model for Information Assurance: An Integrated Approach. Paper presented at the 2001 IEEE Information Assurance Workshop, West Point, NY.

National Security Telecommunications and Information Systems. (1994). National Training Standard for Information Systems Security (INFOSEC) Professionals. Retrieved April 20, 2006. from http://www.cnss.gov/Assets/pdf/nstissi_4011.pdf.

National Security Telecommunications and Information Systems. (1997a). National Training Standard for Designated Approving Authority (DAA). Retrieved April 20, 2006. from http://www.cnss.gov/Assets/pdf/cnssi_4012.pdf.

National Security Telecommunications and Information Systems. (1997b). National Training Standard for Information Systems Security Officers (ISSO). Retrieved April 20, 2006. from http://www.cnss.gov/Assets/pdf/cnssi_4014.pdf.

National Security Telecommunications and Information Systems. (1997c). National Training Standard for System Administrators in Information Systems Security (INFOSEC). Retrieved April 20, 2006. from http://www.cnss.gov/Assets/pdf/cnssi_4013.pdf.

National Security Telecommunications and Information Systems. (2000). National Training Standard for System Certifiers. Retrieved April 20, 2006. from http://www.cnss.gov/Assets/pdf/nstissi_4015.pdf.

Oates-Lewandowski, J. (2005). Creating a culture of technical caution: addressing the issues of security, privacy protection and the ethical use of technology. Paper presented at the Proceedings of the 33rd annual ACM SIGUCCS conference on User services, Monterey, CA, USA.

Parker, D. B., & Nycum, S. H. (1984). Computer crime. Commun. ACM, 27(4), 313-315.

Sukhai, N. B. (2004). Hacking and cybercrime. Paper presented at the Proceedings of the 1st annual conference on Information security curriculum development, Kennesaw, Georgia.

Vreede, G.-J. d., Hengst, S. O. d., & Sol, H. G. (1995). Facilitating user involvement in information system design and development with GSS: the organized crime case. Paper presented at the Proceedings of the 1995 ACM SIGCPR conference on Supporting teams, groups, and learning inside and outside the IS function reinventing IS, Nashville, Tennessee, United States.

 
