Researchers, including Dr Petra Leimich, Prof Bill Buchanan and Josh Harrison, in The Cyber Academy at Edinburgh Napier, have created a new way of analysing the traces of evidence left in the memory of computers running within Big Data infrastructures.
In traditional digital forensics, analysts examine static traces from hard disks, but increasingly the evidence is found within Cloud-based systems, where the trails of evidence are found on Cloud-based disk systems. Within a Hadoop cluster we have a number of computers, across which we can run tasks that can take hours, days or even months to complete.
Overall, the task is broken up into threads, which are then run across the cluster, with redundancy built in so that data and processes can be replicated across the infrastructure. If one computer or disk fails, the data or process can be recovered. In this way we create a robust environment for Big Data analysis.
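As a rough illustration of the replication idea (this is a toy sketch, not Hadoop code; the node names and replication factor are made up, though HDFS does default to three copies of each block), each block of data is placed on several distinct nodes, so the failure of any single node loses nothing:

```python
import random

REPLICATION_FACTOR = 3  # HDFS's default: three copies of every block


def place_blocks(blocks, nodes, replication=REPLICATION_FACTOR):
    """Assign each block to `replication` distinct nodes."""
    return {block: random.sample(nodes, replication) for block in blocks}


def recoverable(placement, failed_node):
    """A block survives the failure if at least one replica lives elsewhere."""
    return all(any(n != failed_node for n in replicas)
               for replicas in placement.values())


nodes = [f"node{i}" for i in range(5)]
placement = place_blocks(["blk_1", "blk_2", "blk_3"], nodes)
print(recoverable(placement, "node0"))  # True: replicas remain on other nodes
```

With three replicas per block, any single node failure still leaves at least two live copies, which is what makes the cluster robust.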
Another major change in digital investigations has been the move from analysing static data on disks towards investigating live data - known as live forensics. Here the device is left powered on and its RAM is investigated. This changes investigations, as a great deal of information can be gained while the device is still powered on, such as usernames and passwords, whereas if it is powered off, much of that user information might be lost.
In a paper outlined here, the researchers outline methods by which traces of evidence can be found within the RAM of a Hadoop cluster - one of the most commonly used Big Data infrastructures. In-memory analysis causes the least disruption to the business processes of the cluster, as most companies would not be able to shut down a Hadoop cluster while it is in operation.
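The paper's exact technique isn't reproduced here, but the general idea of carving artefacts out of a memory capture can be sketched as a simple scan of the raw image for recognisable ASCII patterns - for example, HDFS block identifiers, which appear as "blk_" followed by a number (the dump bytes below are invented for illustration):

```python
import re

# HDFS block files are named "blk_<id>"; the id may be negative in older versions.
BLOCK_ID = re.compile(rb"blk_-?\d+")


def find_artefacts(memory_image: bytes, pattern=BLOCK_ID):
    """Return (offset, text) pairs for every pattern match in the raw image."""
    return [(m.start(), m.group().decode("ascii"))
            for m in pattern.finditer(memory_image)]


# Toy stand-in for a RAM capture; real use would read a dump file,
# e.g. open("ram.dump", "rb").read()
dump = b"\x00\x00user=alice\x00blk_1073741825\x00\xff\xffblk_1073741826\x00"
for offset, artefact in find_artefacts(dump):
    print(f"{offset:#06x}  {artefact}")
```

The same pattern-matching approach extends to usernames, file paths or job identifiers, which is why a live capture of cluster memory can be so informative.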
You can read more about the research team here.