Distributed Computer Forensics: Challenges and Possible Solutions

Abstract

When the Internet began, a small group of users was involved in the project, and those users knew each other quite well. Considering the current state of the Internet and the variety of transactions taking place on any one connected computer, the breadth and totality of the connected society is staggering. On any single client machine, instant messenger traffic from multiple vendors, email from multiple servers, worldwide web browsing, torrent clients sharing hosted collaborative materials, vendor health monitoring of the PC via the Simple Network Management Protocol, and printer traffic all coexist. The computing continuum has moved from users who knew each other, and single machines running single applications connected to other single machines, to a totality of interconnected and distributed computing environments sharing a variety of resources.

The amount of storage required for the data may not be vast on a single box, but considered across the number of machines in the world accessing any website from anywhere, the volume becomes immense. The data processing demands of projects like sequencing the human genome are known to be enormous, yet when criminal intent and criminal activity enter the mix there is an expectation that the computer forensic investigator will rapidly acquire and analyze the information necessary to prosecute a suspect in a timely manner and with absolute precision.
That is where the problems are soon realized. Tool-based methodologies built on vendor-driven technologies abandon a principled, forensic-science-structured solution. The size and scope of digital investigation at the enterprise level have rapidly escalated to the point where vendor-based "bag and tag" forensic practices may not be nearly enough; this paper discusses those issues.

Distributed Computer Forensics: Challenges and Possible Solutions

Introduction

There is a large variety of issues in dealing with distributed environments and the forensic analysis of that domain. Incredibly large volumes of data exist within applications such as Enterprise Resource Planning (ERP), and as mail systems grow larger, with calendaring and other functionality added to already enormous systems, the challenge of securing them and of investigating breaches increases as well (Cartwright, 2003). The volume of material generated by large computing enterprises is not human-readable in a lifetime, let alone within the scope of a trial or litigation.

The forensic investigator works within a problem space where scope and capability are determined by outside agencies and where requirements may not match resources. The problem space is dictated and defined by the alleged perpetrator, while the investigator may only pick from the tools currently approved within his or her organization. The requirements within many public law enforcement entities are set by courts and oversight committees that may not be the agencies funding equipment purchases. Technology may slowly creep ahead, or literally leap ahead, of the skills and tools available to the forensic investigator or criminal evidence technician. Simply put, the existence of a tool or strategy does not mean it will be used or understood, because of the current rules and funding streams. The forensic investigation is tied to the two poles of funding and vendor tools, and both hamper the actual forensic process.

Computer forensics has outgrown the personal computer era

In large-scale enterprises with highly distributed environments, or data-intensive environments, information of an evidentiary nature may not exist in a cohesive form until it is queried (Blankenhorn, Huebner, & Cook). The query, or request for the data, then creates a data stream that is delivered to the client. Think of a Google search: a query generated on the local machine, recorded in server logs, returning pointers to websites that in turn point to other websites. Though not literal, that is how large-scale enterprise applications build the dataset delivered from multiple databases to the desktop. Until the data request, the data on any individual machine may not be complete or, more importantly, of probative value in a court of law. To make matters worse, that data is ephemeral and will likely cease to exist in its current state upon the next transaction.
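
To make the point concrete, the following minimal sketch (with entirely hypothetical node names and record fields) shows how a record of evidentiary interest can be assembled from fragments held on several nodes only at query time; once the response is discarded, no single machine holds the cohesive record.

```python
# A minimal sketch of query-time assembly in a distributed store.
# Node names, record IDs, and fields are hypothetical.

from dataclasses import dataclass

@dataclass
class Shard:
    """One node in a distributed store holding fragments of records."""
    node_id: str
    fragments: dict  # record_id -> partial field data

def query(shards: list, record_id: str) -> dict:
    """Assemble a cohesive record from fragments scattered across nodes.

    The merged dict exists only for the lifetime of this call; once the
    caller discards it, no single machine holds the complete record.
    """
    record = {}
    for shard in shards:
        record.update(shard.fragments.get(record_id, {}))
    return record

shards = [
    Shard("db-east", {"tx42": {"account": "A-1009"}}),
    Shard("db-west", {"tx42": {"amount": 5000}}),
    Shard("db-eu",   {"tx42": {"timestamp": "2006-10-02T14:03Z"}}),
]
print(query(shards, "tx42"))  # cohesive only here, then ephemeral
```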

In an environment where data is dynamic and constantly moving, defining where and how to retrieve data of evidentiary interest is important, as is determining whether it is of value to the various law enforcement and investigating agencies. Data of evidentiary interest may be obfuscated by the application, by cluster computing, or by the network type (e.g., wireless networks). In a growing number of cases the data environment has grown so large that only portions or segments of the data are within the scope of a possible investigation. These technological changes have created challenges for computer forensic investigation, where large-scale, highly dynamic, distributed, data-intensive computing environments are overtaking the knowledge, skills, and abilities of disk-based, static computer forensic practice. In this morass of declining capabilities the science of the forensic investigator is abandoned.

Significance of the problem

A large amount of the computer forensic literature deals with distributing the effort of forensic analysis over larger and faster parallel processing platforms. As the scope and size of investigations increase to include substantially larger data environments, it may be problematic to base investigations simply on data repositories such as hard disks. The physical representation of the computer forensic environment may no longer be the disk geometry and magnetic media of a hard disk; it may instead reside on the network, in the wireless environment, or within a variety of applications and data environments. The body of literature frames the task of sharing the processing load across larger data sources as yielding faster, more scalable, and more robust applications, much like the Internet itself (Golden & Roussev, 2005).

Unfortunately, the literature is less specific about how to analyze large-scale parallel processing environments where data sources may be shared and partitioned (highly segmented) as part of data hiding (security) or protection (redundancy and disaster recovery) measures. This type of analysis is important to organizations and governmental entities investigating financial or regulatory issues. There is a very small body of literature on network analysis that might be used as evidence in a court of law. Further, this type of analysis may be important for searching large-scale networks and for active traitor tracing as done by governments looking for spies and terrorists. Of greater significance is the leading nature of this research. Currently, large-scale network analysis and large-scale data store analysis are difficult, and where not difficult, they are filled with potential process and procedural issues. The forensic science of this growing area of interest is routinely ignored and has almost no penetration among vendors. These issues with large-scale network forensic analysis would likely lead to difficulties in prosecuting and validating evidentiary information from an investigation conducted under current rules of computer forensics modeled on static hard disk analysis.

There is another piece of the puzzle in dealing with large-scale networks: the data is transmitted but not normally utilized as evidence. If the network is not considered part of the specific evidence domain, the ability to monitor the network may not even be considered. When looking at the evidentiary trail, the focus is often solely on specific data held in digitized form on a disk. This ignores the criminal threat and the forensic analysis of networks, such as Intrusion Detection Systems (IDS) in wireless environments, and signal analysis to identify a suspect based on participation in a network through a wireless link (Hall, Barbeau, & Kranakis). There may be an opening for opposing counsel to claim that exculpatory evidence was abandoned and the forensic process was suspect.

The issue of the network being part of the larger environment for forensic analysis belongs to a larger continuum of computer-based forensics. The digital hierarchy of evidence includes data objects such as files and directories; digital evidence such as transmissions (e.g., wireless); physical items such as the computer or Personal Digital Assistant (PDA); original digital evidence, including things like hard drives; duplicate digital evidence, as in drive images; and copies, which may not be exact but are independent of the original item (Whitcomb, 2002). This hierarchy of evidentiary types is important to understand: as resources are directed toward one form of the computer forensic science, the comparative lack of interest and research in another area has unintended consequences. In the network, and specifically in enterprise networks, the general role of network forensics is to identify misuse and attacks in such a way as to provide something of probative value for charging a suspect with a crime (Mukkamala & Sung, 2003). Classification of the attack method will usually follow a hierarchical structure: Internet-based attack, internal attack, protocol attack, application attack, social engineering attack, and so on through the levels of the network (Laurie, 2004). Though in some cases the data will be used simply for insurance claims resulting from loss, or for decisions on human resource actions against employees, the analysis and preservation of evidence carry equal importance, since in many cases non-criminal activities still end up in court as civil matters.

Distributed or Enterprise Forensic Issues

The overall objective of this paper is to provide a substantial review of the literature, further defining the problem statement, and to work the articulated problem statement into a possible set of solutions. By working this specific problem the body of knowledge should be further defined and expanded. The specific objective is a literature review of this area together with the identification of possible solutions.

For the purposes of this project some topics were not addressed, as they were either considered inclusive of other related issues or outside the current discussion. Peer-to-Peer (P2P) networking was not discussed, as the very nature of the topic has been suggested to be a de rigueur violation of copyright law. Though the author(s) will not state that to be the case, the discussion of cluster computing and grid computing is complete without P2P being specifically addressed: P2P is by its nature also a large-scale, highly redundant, high-speed network and is fully covered by the concepts articulated in that section.

Trusted Computing and its initiatives as a vendor-based solution were considered for inclusion in this paper. However, though one paper discussed the possibility of forensic vaults as a portion of trusted computing, and thereby the idea that trusted computing could be used successfully in a clustered environment, the topic was not included. Trusted computing is a vendor-led initiative, and many of the methods discussed have much more to do with single-computer traditional forensics than with the scope of this paper.

As discussed in the section on grid and cluster computing, grid computing was not included as a specific topic. The concept of clustering (smaller in nature) provides a fairly accurate picture of the issues with grid computing, with the caveat that grid computing has issues beyond those of clustered environments. Grid computing is much more open than clustering and may not be a good tool for the forensic world to consider unless the computing grid is fully protected and kept on private networks, away from the larger-scale computing grid platforms of the Internet (Golden & Roussev, 2005).

Dealing With the Problems of Distributed Forensic Analysis

The criminal and crime aspects of computers were defined early in the rise of the computer. It is interesting that the computer forensics world expends its energy on specific crimes when computer abuse and crime associated with, but not directly related to, the computer may be just as prevalent, and have been discussed as such for quite some time (Kling, 1981). The computer as a silent, near-perfect witness seems to have been lost, with most experts instead seeing the computer as a crime scene.

It becomes apparent that computer crimes have different components, which can be juxtaposed as computers used to victimize other computers and computers used to directly victimize people. This separation expands the scope of crimes associated with computers and challenges some previously held concepts of what computer crime actually comprises (Gordon & Ford, 2006).

Computer forensics is defined formally as a discipline where forensic computing is "… the process of identifying, preserving, analyzing and presenting digital evidence in a manner that is legally acceptable" (McKemmish, 1999). Within the broader area of computer forensics there are subdisciplines. Hal Berghel has described Internet forensics as a subdiscipline of computer forensics and attempted to distinguish it, arguing further that Internet forensics may be maturing as a discipline (Berghel, 2003). Internet forensics is important to understanding the problem, since the distributed nature of the data supplies a substantial subset of the likely solutions and the knowledge needed to define the problem.

The study and dissection of forensic computing as a discipline has included several topics of interest to the definition of computer forensics and e-crime: specific crimes and the associated skills to catalog and forensically analyze them, along with computer science, law, information systems, and social science (Hannan, Turner, & Broucek, 2003). A specifically important skill set is investigation skills: the ability to keep a computer forensics or computer crime investigation on course and within the scope of local laws and courts (Hannan et al., 2003).

Furthering the definition of Internet forensics, and more importantly the idea of high-volume, data-intensive, volatile environments, is the question of where it might be used. The military and national security applications are varied; the tools, concepts, and analysis methods would allow military commanders to identify specific information quickly within the battle environment. This is part of the required military operational capability for rapid response in real time (Giordano & Maciag, 2002).

Much of the concept of digital or computer forensics resides in the tools and methods of assessing and analyzing hard disks after the acquisition of evidence from a suspect computer. Beyond the issues already identified in acquiring evidence, there are others. The concept of what constitutes a computer is starting to degrade under ubiquitous computing, as cell phones, PDAs, and music devices take on computer-like functionality. Distributed computing and highly volatile network-based computing are increasing as storage media become transitory and unknown in high-volume, data-intensive distributed networks.

Then there is the amount of data on the network, which can be staggering. There currently exist networks that transfer 2.5 terabits a second and can be expanded to 900 terabits a second ("STC Builds the Largest Data Network in the Region & Adopts the Fastest Routing System in the History of Telecom Industry," 2006). Reaching into a stream that large and copying evidentiary-level data is nearly impossible if the investigator were to try to use current tools and forensic protocols. In a distributed environment with live forensic analysis, the investigator will have to have forensically supportable protocols to deal with the specific changes that will occur while the evidence is being analyzed. With the current state of forensics favoring disk-based forensics and its associated tools (Mercuri, 2005), which stress "exact" matches between what is seen and what is captured, the evidentiary trail of large-scale network analysis will fail.
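
One hedged illustration of what a forensically supportable live-capture protocol might record: since a multi-terabit stream can never be re-read to verify an "exact match," each captured segment could be hashed and time-stamped as it arrives, producing a custody log that can later be verified even though the stream itself is gone. The segment source below is a stand-in; this is a sketch of the idea, not an established protocol.

```python
# A sketch of a tamper-evident custody log for live stream capture.
# The "segments" here stand in for captured traffic bursts.

import hashlib
import time

def log_segment(segment: bytes, log: list) -> None:
    """Record a verifiable entry for one captured stream segment."""
    entry = {
        "captured_at": time.time(),
        "length": len(segment),
        "sha256": hashlib.sha256(segment).hexdigest(),
    }
    log.append(entry)

custody_log = []
for segment in (b"packet-burst-1", b"packet-burst-2"):  # stand-in data
    log_segment(segment, custody_log)

for entry in custody_log:
    print(entry)
```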

If forensic computing is the process of identifying and preserving digital evidence for analysis and presentation (McKemmish, 1999), then the problem for large-scale network analysis is likely found in preserving evidence in a state relative to how it was found. With static disk-based tools, using forensic copying and preparing a case based on digitally fingerprinted storage devices is feasible. Large-scale, highly dynamic forensic analysis is like predicting the weather in multiple locations at the same time. The large-scale networks adopted by users and industry are moving forward much faster than the tool and protocol development that computer forensic investigators have counted upon. Most computer forensic investigators are not forensic scientists, and the few forensic scientists have rarely considered distributed computing. The problem remains that the technology and legal requirements may be in contradiction, and the relative lack of advancement in technology is a large and substantial issue for digital and forensic computing (Palmer, 2001).

Large-scale networks are indicative of the involvement of the Internet. As a dynamic distributed network, the Internet defies static description and limitation of scope. Where the Internet begins or ends, and what its purpose is, are valid questions without good answers. The investigator has to create an artificial demarcation line and declare that, for the purposes of the investigation, the boundaries of the network lie at that point. Network forensics, and specifically Internet forensics, covers the scope of the Internet and uses many of the tools the attacker might use (Berghel, 2003). When dealing with the relatively static environment of disk analysis, the investigator has the relative luxury of time and evidentiary containment. Unfortunately, the individual dealing with a distributed network must expect a much more sophisticated user and, more importantly, a relative rush in the acquisition stage of the forensic process (Berghel, 2003). The large-scale, highly volatile nature of the Internet and wide-area networks suggests that time may be the primary enemy and that "good enough" may become part of the lexicon of the computer forensic investigator. There are areas within forensics where decay, and rates of decay, are measured using scientific method to ascertain the nature of the forensic degradation process without losing probative value. The sustaining science of forensics provides for analysis even when the original or "best" evidence has degraded or continues to degrade. The corollaries within the digital evidence world remain inconsistent even in static disk-based analysis. As an example, subjecting a disk to extremely strong magnetic fields may create changes on the disk and thereby change the hashed value of the forensic image with respect to the original.
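
The hash sensitivity underlying that example is easy to demonstrate. In the sketch below, a stand-in "disk image" differs from the original by a single flipped bit, yet its SHA-256 digest diverges completely, which is exactly why exact-match hashing offers no graceful handling of degraded evidence.

```python
# A minimal demonstration: one flipped bit in a degraded image makes
# its hash diverge entirely from the original acquisition hash, even
# though virtually all of the probative content is intact.

import hashlib

original = bytearray(b"\x00" * 1024)   # stand-in for a disk image
degraded = bytearray(original)
degraded[512] ^= 0x01                  # one bit changed by field exposure

print(hashlib.sha256(bytes(original)).hexdigest())
print(hashlib.sha256(bytes(degraded)).hexdigest())  # completely different digest
```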

The discussion of whether to unplug a machine and use static forensic tools (Carrier, 2006) or attempt live forensic analysis (Adelstein, 2006) has been debated extensively. In one case there are the risks of live analysis leading to poor evidentiary-level proof and the possibility that the criminal agents have led the investigator awry. From another viewpoint, the large-scale, large-volume, highly volatile environment inherent in computer networks can only be analyzed in a live or volatile manner. What is missing from the discussion is that, though proponents of tools and technology created in the absence of scientific method would prefer those tools be used, the same technological creep of increasing storage capacities and larger networks has made the discussion moot.

Similarly the evidence of criminal intrusion into a network or of a traitor’s transactions within a network may only exist as transient data incapable of being traced or found in a storage medium that could be analyzed using current computer forensic tools. Even imaging random access memory may not be sufficient if all that exists in memory is a pointer to data somewhere else. The underlying computer forensic science must be expanded and generalized to allow for this type of forensics.

A common assumption within a network investigation is that a physical and reliable network connection exists at the physical layer. The ubiquity of networks and the pervasiveness of wireless networking, however, make this a different case for a computer or digital evidence investigation. When application or protocol analysis fails, the investigator can continue down through the network layer until reaching the physical layer when analyzing criminal or traitor behavior in a network. That is, until the network connections become wireless. When the physical layer of the network is no longer twisted-pair wire between stations or servers, the last controlled segment in the network becomes the wireless access point in use by the hostile entity. Unfortunately, in large-scale networks beyond simple 802.11x Wi-Fi, the geographical relationship can be orders of magnitude larger than the 100 meters of a standard 2.4 GHz 802.11x network. WiMAX (802.16) has a coverage area measured in tens of miles, and in excess of 50 miles in radius. It would not be uncharacteristic for an entire metropolitan area under the umbrella of WiMAX to be considered possible suspects in a network intrusion or hostile activity on a large-scale metropolitan wireless network.
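
The scale of that suspect pool can be illustrated with simple geometry. The sketch below (hypothetical coordinates and device names) uses the haversine formula to test whether devices fall within a 50-mile WiMAX cell; substituting a 100-meter Wi-Fi radius shrinks the candidate population by orders of magnitude.

```python
# Haversine range check for candidate devices around an access point.
# Coordinates and device names are illustrative only.

import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in miles."""
    r = 3958.8  # Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

access_point = (38.90, -77.03)  # hypothetical WiMAX base station
devices = [("laptop-a", 38.91, -77.04), ("laptop-b", 40.44, -79.99)]

for name, lat, lon in devices:
    d = haversine_miles(*access_point, lat, lon)
    print(name, f"{d:.1f} mi", "in WiMAX range" if d <= 50 else "out of range")
```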

The problem is that while network forensics might detect the existence of a perpetrator in the network, and evidence of the crime may exist, there is no actual witness to the crime itself. In normal forensics the equipment and disks of a perpetrator stand as silent witnesses to the acts of the human agent. With the existence of the wireless network the physical forensic link breaks and new challenges begin. There are methods that can tie a suspect to a particular location. Using radio frequency transmitter fingerprinting, the investigator has a chance in real time to either eliminate suspects from consideration or include them within scope (Hall, Barbeau, & Kranakis, 2005). The technology of radio frequency fingerprinting as an intrusion detection system is not new, but if studied it could be raised to a level suitable for evidentiary analysis.
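
The matching step of such a scheme can be sketched simply. Assuming feature vectors have already been extracted from transceiver turn-on transients (the hard, hardware-dependent part), identification reduces to comparing an observed vector against enrolled profiles; the names, features, and threshold below are illustrative only.

```python
# A sketch of the matching step in RF fingerprinting: score a captured
# transient feature vector against enrolled transmitter profiles.
# Feature extraction from raw RF samples is assumed, not shown.

import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

enrolled = {
    "nic-0a1b": [0.82, 0.11, 0.43],   # hypothetical transient features
    "nic-9f3c": [0.15, 0.77, 0.60],
}

def match(observed, threshold=0.2):
    """Return the closest enrolled transmitter, or None if nothing is near."""
    name, profile = min(enrolled.items(), key=lambda kv: euclidean(observed, kv[1]))
    return name if euclidean(observed, profile) <= threshold else None

print(match([0.80, 0.13, 0.45]))  # likely "nic-0a1b"
print(match([0.50, 0.50, 0.50]))  # None: unknown transmitter
```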

The standard computer forensics approach to volatility is to attempt to contain the data environment and capture the current state of the medium under scrutiny. The ability to capture that state accurately and completely is constrained by storage size: in a large-scale storage area network, capturing the entirety of the storage would require a substantial investment in a similar storage area network. Computer forensic investigators already face this issue with RAID 5 storage systems. Capturing the data of a highly volatile network will be equally difficult, and that volatility exists within a single machine and even at relatively small scales. Where network forensic investigators have used virtual machine technology successfully is in trapping and capturing live data from attack scenarios (Garfinkel & Rosenblum, 2003).

For all of the same reasons that security advocates and computer forensics researchers like using virtual machines for the acquisition of data and evidentiary analysis, hostile entities also enjoy the use and application of virtual machine technologies (Ren & Jin, 2005). The obfuscation made possible by a virtual machine that has no specific evidentiary fingerprint, and more importantly can exist solely in volatile memory leaving little to no long-term storage signature, should be substantially worrisome to the computer forensics investigator.

There are methods of saving and forensically protecting the memory of a computing system, but in general they require that devices or tools be installed prior to the investigation and that a fairly substantial amount of work be done in advance (Carrier & Grand, 2004). This creates a situation where insurance is effectively being paid, in equipment costs and in the structuring of the network environment, against an unknown risk. Such a scheme would likely not work against a user implementing a virtual machine, since that user will likely know that a device for forensically analyzing the computing system is in place.

The only situation that might cause an investigator more issues is a Knoppix ("Knoppix," n.d.) Linux distribution booting inside a virtual machine application. This creates an incredibly volatile evidentiary environment where evidence will not be readily accessible, or possibly even retrievable. A Knoppix environment booted inside a virtual machine will not leave a trail of its use on the host machine, and with the "go-back" technologies in most virtual machines it will automatically erase its own trail of evidence.

Distributed computing comes in a few different flavors. Grid computing and cluster computing share similarities but are slightly different. Clustering is about shared resource capability; a cluster usually exists within the same geographic region and most likely within a small or local area network (Microsoft, 2006). Cluster computing is typically used for shared resources and for fault tolerance or redundancy, where computers share resources as needed in a high-volume data environment and are managed by a centralized resource, whereas grid computing is decentralized and may be found in a much larger network of loosely coordinated nodes (Grid Computing, 2006).

For the scope of this paper cluster computing will be the primary discussion: anything that applies to clusters will apply to grid computing, and although grid computing has a variety of substantial issues of its own, the concepts of cluster computing still apply. Across cluster computing's variety of architectures and vendor methods there is a commonality of goals: they attempt to level the load or safeguard data by processing chunks of the data in different areas of the cluster. Though a primary scheduling node is common within a cluster, the individual chunks of data being processed may only pass through the secondary or processing nodes. A piece of data, or a request for processing, entering the cluster is prioritized and scheduled. Depending on the size of the job, the scheduler will tear it apart and sequence it out to a variety of other processors or machines. With some technologies the service support processor or cluster manager makes all determinations, while other vendor-specific applications split out the job and let the leaves of the cluster handle processing.

Until the central scheduling processor issues the query to the active nodes within the cluster, the actual data may not exist in a cohesive form, and then it exists only for the length of the query. Caching and performance improvements can change that, but in general queries last only as long as the session in which they were created. This kind of batch processing or chunking should not be confused with high-availability systems where data is cached or randomly updated.
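
A simplified sketch of that scheduling pattern follows: a head node partitions a job, farms the chunks out to worker processes standing in for cluster nodes, and merges the partial results. No worker ever holds the whole job, and the cohesive result exists only after the merge, mirroring the evidentiary problem described above. The chunking policy here is illustrative, not any vendor's scheduler.

```python
# A toy head-node scheduler: partition, dispatch, merge.
# Worker processes stand in for the processing nodes of a cluster.

from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk: bytes) -> int:
    """Stand-in worker task: each node sees only its fragment of the job."""
    return sum(chunk)  # e.g., a partial checksum or partial scan result

def schedule(job: bytes, chunk_size: int = 4) -> int:
    """Head-node role: partition, dispatch, and merge partial results."""
    chunks = [job[i:i + chunk_size] for i in range(0, len(job), chunk_size)]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(process_chunk, chunks))
    return sum(partials)  # the cohesive result exists only after the merge

if __name__ == "__main__":
    print(schedule(bytes(range(16))))
```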

It has been suggested that computer forensic investigators invest in high-performance cluster and grid computing to solve some of the problems inherent in the large-volume environment and the analysis phases of forensic investigations (Golden & Roussev, 2006). The fact remains that grid or cluster computing may be the only method currently available for building high-performance, network-enhanced forensic tools for analyzing large-volume evidentiary data sets (Golden & Roussev, 2005). The large-scale environment has the necessary high-performance characteristics, but there are evidentiary process issues in taking a piece of evidence that should be static (remaining the same through the investigation all the way to the appellate court) and spawning its analysis out to a variety of computers in an extremely virtual environment. With appropriate limits on what investigators use distributed computing for, file analysis, data carving, and other distributable tasks should be possible (Golden & Roussev, 2005). Few investigators, though, will be able to testify to the method by which the reported evidence was found, and the idea of specificity may be an issue for courts considering the search and seizure aspects of a computing system.
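
As a hedged example of such an appropriately limited task, the sketch below distributes a signature search (a core step in data carving) across worker processes while first hashing the complete image, so every reported offset can be re-verified against the static evidence. The window size and JPEG signature are illustrative choices, not a prescribed protocol.

```python
# Distributed signature search over a disk image, with a whole-image
# hash tying the parallel results back to the static evidence.

import hashlib
from concurrent.futures import ProcessPoolExecutor

SIGNATURE = b"\xff\xd8\xff"  # JPEG header, a common carving target

def scan_window(args):
    """Worker: report every signature offset inside one window."""
    offset, window = args
    return [offset + i for i in range(len(window))
            if window[i:i + len(SIGNATURE)] == SIGNATURE]

def distributed_carve(image: bytes, window_size: int = 1 << 16):
    image_hash = hashlib.sha256(image).hexdigest()  # ties results to evidence
    overlap = len(SIGNATURE) - 1  # so matches straddling windows aren't lost
    windows = [(i, image[i:i + window_size + overlap])
               for i in range(0, len(image), window_size)]
    with ProcessPoolExecutor() as pool:
        hits = [h for sub in pool.map(scan_window, windows) for h in sub]
    return image_hash, sorted(hits)

if __name__ == "__main__":
    image = b"\x00" * 100 + SIGNATURE + b"\x00" * 100  # stand-in image
    print(distributed_carve(image))
```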

One concept that has to be addressed when thinking about large-scale distributed environments is what is running in those environments as applications and services. It would be easy to imagine a simple scientific application crunching numbers in a limited manner, but it is far more likely that the distributed nature will increase reliance on large-scale networks carrying email, surveillance camera traffic, automated door locks, fire alarm system traffic, heating and cooling information, and much more in a heterogeneous network environment (Cartwright, 2003). The totality of the transported traffic could also be the processing and storage within the environment, further complicating the investigator's task when air-conditioning data could be transiting through a user's computer as part of a computing grid strategy.

Conclusions, discussions, and recommendations

This paper has shown that the variety of issues in the large-scale, high-volume, data-intensive environment makes the computer forensic investigator's task extremely difficult. With the exception of highly volatile data encryption scenarios, the forensic investigator can overcome the obfuscation and avoid the issues of data loss when seizing a high-value asset.

These issues can be solved with the same tools that cause further issues (e.g., grid and cluster computing). The use of distributed computing environments to increase investigator speed and accuracy is a viable alternative to the normal "single threaded" investigation. The broader computer forensics community must address the issues of distributed computing as they affect the evidentiary cycle. Cloud computing in particular is a growing threat to the computer forensics investigator's bias toward static disk analysis. User environments will continue to become more interactive and to rely on technologies that push the envelope of distributed and large-scale networks. The computer forensic investigator must be ready to address those issues.

 

References

Adelstein, F. (2006). Live Forensics: Diagnosing Your System Without Killing It First. Communications of the ACM, 49(2), 63-66.

Berghel, H. (2003). The Discipline of Internet Forensics. Communications of the ACM, 46(8), 15-20.

Blankenhorn, C. A., Huebner, E., & Cook, M. Forensic Investigation of Data in Live High Volume Environments. Retrieved October 2, 2006, from http://www.cit.uws.edu.au/compsci/computerforensics/Technical%2520Reports/Blankenhorn2005.doc

Carrier, B. D. (2006). Risks of Live Digital Forensic Analysis. Communications of the ACM, 49(2), 56-61.

Carrier, B. D., & Grand, J. (2004). A Hardware-Based Memory Acquisition Procedure for Digital Investigations. Digital Investigation, 1(1).

Cartwright, D. (2003). Architectural Innovations for Enterprise Forensics. Paper presented at the 1st Australian Computer, Network & Information Forensics Conference, Perth, Western Australia.

Garfinkel, T., & Rosenblum, M. (2003, 6-7 February 2003). A Virtual Machine Introspection Based Architecture for Intrusion Detection. Paper presented at the 2003 Network and Distributed System Security Symposium (NDSS), San Diego, California.

Giordano, J., & Maciag, C. (2002). Cyber Forensics: A Military Operations Perspective. International Journal of Digital Evidence, 1(2).

Golden, G. R. I., & Roussev, V. (2005). Scalpel: A Frugal, High Performance File Carver. Digital Forensic Research Workshop.

Golden, G. R. I., & Roussev, V. (2006). Next-Generation Digital Forensics. Communications of the ACM, 49(2), 76-80.

Gordon, S., & Ford, R. (2006). On the Definition and Classification of Cybercrime. Journal in Computer Virology, 2(1), 13-20.

Grid Computing (2006). Grid Computing. Retrieved December 2, 2006, from http://www.gridcomputing.com

Hall, J., Barbeau, M., & Kranakis, E. Enhancing Intrusion Detection in Wireless Networks Using Radio Frequency Fingerprinting (Extended Abstract).

Hall, J., Barbeau, M., & Kranakis, E. (2005). Radio Frequency Fingerprinting for Intrusion Detection in Wireless Networks. DRAFT. Retrieved from http://www.scs.carleton.ca/~jhall2/Publications/IEEETDSC.pdf

Hannan, M., Turner, P., & Broucek, V. (2003). Refining the Taxonomy of Forensic Computing in the Era of E-crime: Insights from a Survey of Australian Forensic Computing Investigation (FCI) Teams. 4th Australian Information Warfare and IT Security Conference, Adelaide, SA, Australia.

Kling, R. (1981). Computer Abuse and Computer Crime as Organizational Activities. SIGCAS Comput. Soc., 11(4), 12-24.

Knoppix. (n.d.). Retrieved from http://www.knoppix.net/

Laurie, B. (2004). Network Forensics. Queue, 2(4), 50-56.

McKemmish, R. (1999). What is Forensic Computing? Trends and Issues in Crime and Criminal Justice, No. 118.

Mercuri, R. (2005). Challenges in Forensic Computing. Communications of the ACM, 48(12), 17-21.

Microsoft (2006). Windows Server 2003 Clustering Service. Retrieved December 2, 2006, from http://www.microsoft.com/windowsserver2003/technologies/clustering/default.mspx

Mukkamala, S., & Sung, A. H. (2003). Identifying Significant Features for Network Forensic Analysis Using Artificial Intelligent Techniques. International Journal of Digital Evidence, 1(4).

Palmer, G. (2001). A Road Map for Digital Forensic Research (No. DTR-T001-01 FINAL). Utica, NY: Air Force Research Laboratory, Rome Research Site, Information Directorate/Defensive Information Warfare Branch.

Ren, W., & Jin, H. (2005). Honeynet Based Distributed Adaptive Network Forensics and Active Real Time Investigation. Paper presented at the 2005 ACM Symposium on Applied Computing.

STC Builds the Largest Data Network in the Region & Adopts the Fastest Routing System in the History of Telecom Industry (2006).  IT & Telecom. Retrieved December 2, 2006, from http://www.menareport.com/en/business,IT_and_Telecom/205049

Whitcomb, C. M. (2002). An Historical Perspective of Digital Evidence: A Forensic Scientist's View. International Journal of Digital Evidence, 1(1).