We should call it cyber forensics rather than digital or computer forensics. Why? Within the current technology paradigm, information is found in three states: storage, transmission, or processing. Regardless of the mechanism of storage, transmission, or processing, a science should be able to adapt to its environment in such a way as to be current today and tomorrow. When we step outside the bounds of incident handling and into the realm of forensics, where liberty or life is on the line, what the investigator presents as fact needs to be impeccable. What is in a name matters.
Much of the foundational computer forensics work has concerned artifacts stored on disk or other media, which can be replicated in the laboratory. This allows us to determine how something that shouldn’t be there got there, or how something that should be there went missing. In the world of computers and technology, storage forensics is the low-hanging fruit and likely the easiest to instantiate: static elements are well known, and the physical principles are well described.
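Replicability in storage forensics rests on showing that a working copy is bit-for-bit identical to the original medium. As a minimal sketch (the file names in the comments are hypothetical), a cryptographic digest such as SHA-256 can demonstrate that an image has not changed between acquisition and analysis:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks
    so even large disk images fit in constant memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# A working copy is sound for analysis only if its digest matches
# the digest taken at acquisition time, e.g. (hypothetical names):
#   assert sha256_of("evidence.dd") == sha256_of("working_copy.dd")
```

In practice an examiner records the acquisition digest in the case notes, so any later hash of the image either confirms or refutes its integrity.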
This argument, though, has teetered on the precipice of active versus static forensics and the arguments for and against each viewpoint. In many experts' opinion, the foundational principle of replicable results is nullified in the active forensics paradigm. Yet destruction of some evidence in the process of examination is accepted as valid in drug testing, serology, genetics, and other forensic sciences. The active forensics paradigm, though, could be considered to disrupt, degrade, or destroy the reliability of the entirety of the analyzed evidence.
What this discussion has done in the past is put awareness of the science above the process of judgment and into the realm of evidence. In the choice between static and active examination of stored data, choosing static may expunge exculpatory evidence of malefactors from the record. Thus the investigator has tipped the scales of justice extraordinarily against the defendant. If the investigator has followed a known process in the active forensics arena, then the evidence should be well vetted, though most assuredly there will be some changes. This discussion provides a jumping-off point to the idea of transmission and processing forensics, where all activity occurs in an active environment.
Transmission forensics deals with the active environment, where information may never be written to disk or memory. This concept causes some consternation for forensic investigators wrapped around the idea of easily defined information objects like pictures or documents. In the world of virtualization and cloud-based elastic environments, evidence of adversarial activity or criminality may be found in the transmission medium: the actual information transiting the network, the telemetry of the network, and the various data points that can be defined on a network.
The architectural representation of a network can provide evidence of whether an entity could even have had access to a particular information asset. This very high-level examination must be grounded in science that can be replicated and must use rules that are standardized. As an example, suppose an investigator stated, “The entity of interest could not have accessed this node based on the Ethernet architecture.” Ignoring or failing to account for the wireless access mechanisms that could have been used would create a credibility gap for the investigator. The protocols and various strategies for investigating networks must be backed by the principles of forensic soundness; only when the two are brought together do we have a good handle on the evidence.
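One way to bring forensic soundness to recorded transmission data is to make the capture tamper-evident. A minimal sketch, assuming flow records arrive as dictionaries (the field names are illustrative, not from any particular tool): each record's digest incorporates the previous digest, so altering any earlier record invalidates every digest after it:

```python
import hashlib
import json

def chain_records(records):
    """Link capture records into a hash chain; any later alteration of an
    earlier record changes its digest and every subsequent digest."""
    digest = b"\x00" * 32  # genesis value before the first record
    chained = []
    for rec in records:
        payload = json.dumps(rec, sort_keys=True).encode()  # canonical form
        digest = hashlib.sha256(digest + payload).digest()
        chained.append({"record": rec, "digest": digest.hex()})
    return chained

# Illustrative flow telemetry (hypothetical values):
flows = [
    {"ts": "2023-01-01T00:00:00Z", "src": "10.0.0.5", "dst": "10.0.0.9", "bytes": 4096},
    {"ts": "2023-01-01T00:00:02Z", "src": "10.0.0.9", "dst": "10.0.0.5", "bytes": 512},
]
chained = chain_records(flows)
```

The design choice here mirrors the fidelity point above: the chain cannot prove the instruments recorded the truth, only that the record has not changed since capture.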
Another element of forensics is the concept of processing. There has been a fundamental expansion of the concepts of malware analysis and reverse engineering. An expression of software may have elements of adversarial impact that are not found in the storage or transmission structures. The behaviors and structures of the malware are evidence, much the same as any other actor's behaviors might be. Due to the dynamic and transitory nature of processing, the forensics may be limited to behavior analysis.
Processing forensics is more than just malware analysis. Trusted dynamic software execution may be analyzed to determine whether criminal or other agency can be identified, so processing forensics may be utilized in more venues than originally considered (beyond malware forensics). Computer science students are often taught about lethal radiation dosage rates as examples of poor programming; the process of identifying that kind of error is often tangential to processing forensics. Similarly, evaluation of data from accident recorders may in some cases be entirely dynamic.
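Behavior analysis of executing software can be illustrated with Python's own tracing hook. This is a toy sketch, not a real dynamic-analysis tool: it records which functions execute during a run, the kind of behavioral trace processing forensics relies on when no static artifact exists.

```python
import sys

def trace_calls(func, *args, **kwargs):
    """Run func while recording every function call made during its
    execution -- a toy analogue of a dynamic behavioral trace."""
    calls = []

    def tracer(frame, event, arg):
        if event == "call":
            calls.append(frame.f_code.co_name)
        return tracer

    sys.settrace(tracer)
    try:
        result = func(*args, **kwargs)
    finally:
        sys.settrace(None)  # always remove the hook
    return result, calls

# Hypothetical workload standing in for software under examination:
def helper(x):
    return x * 2

def workload(x):
    return helper(x) + 1

result, behavior = trace_calls(workload, 3)
# behavior lists the functions that actually ran: ['workload', 'helper']
```

Note the self-referencing problem discussed below: the observation runs on the same interpreter as the subject, which is exactly where opportunities for forensic resistance arise.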
While attempting not to belabor the point: the storage forensics model is the most static. The transmission forensics model is dynamic in execution, but it can be recorded remotely within the fidelity of the instruments available. The processing forensics paradigm is the most dynamic and seems to require agency within the system itself (as things are being processed) to evaluate what is going on, which makes it the most volatile for evidence. This kind of self-referencing (using the processor to report on itself) is filled with opportunities for forensic resistance.
These three states of information (processing, storage, and transmission) are independent of the current technology plane. Regardless of future technologies, they are assured to remain relevant, as they are descriptors of states rather than tools. This is why the term “cyber” is used to describe the forensics discipline of information rather than “computer” or “digital.” As more and more information becomes cyber (billboards are not cyber, but music CDs are), the concept of meta enters; that is a topic for another day. Cyber is a holistic term, often overused, but seemingly useful in this case.