ACPO Guidelines

The Association of Chief Police Officers (ACPO), in association with 7Safe, has recently released its updated guide to collecting electronic evidence. While the entire document makes for an interesting read, I found pages 18 and 19, "Network forensics and volatile data," most interesting.

The section begins with a reference back to Principle 2 of the guidelines, which states:

In circumstances where a person finds it necessary to access original data held on a computer or on storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.

Sounds good, right? We should also look at Principle 1, which states:

No action taken by law enforcement agencies or their agents should change data held on a computer or storage media which may subsequently be relied upon in court.

Also, Principle 3 states:

An audit trail or other record of all processes applied to computer-based electronic evidence should be created and preserved. An independent third party should be able to examine those processes and achieve the same result.

Okay, I'm at a loss here. Collecting volatile data inherently changes the state of the system, as well as the contents of the storage media (e.g., Prefetch files, Registry contents, the pagefile, etc.), and the process used to collect the volatile data cannot later be used by a third party to "achieve the same result", as the state of the system at the time the data is collected cannot be reproduced.
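Just to make that concrete, here's a minimal sketch of the reproducibility problem; "dump_memory" is a placeholder for whatever imaging tool is in use, not a real utility, and it's assumed to write a raw image to the path it's given. Two back-to-back captures from a live system will hash differently, which is exactly why a third party can't "achieve the same result":

```python
import hashlib
import subprocess

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

# "dump_memory" is a placeholder for whatever memory imaging tool
# is actually in use; it is assumed to write a raw image to the
# path given as its argument.
for image in ("first.raw", "second.raw"):
    subprocess.run(["dump_memory", image], check=True)

# On a live system these two digests will not match.
print(sha256_of("first.raw"))
print(sha256_of("second.raw"))
```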

That being said, let's move on...page 18, in the "Network forensics and volatile data" section, includes the following:

By profiling the forensic footprint of trusted volatile data forensic tools, an investigator will be in a position to understand the impact of using such tools and will therefore consider this during the investigation and when presenting evidence.

It's interesting that this says "profiling the forensic footprint" but says nothing about error rates or statistics of any kind. I fully agree that this sort of profiling needs to be done, but I would hope that the results would be compiled and made available via a resource such as the ForensicWiki, so that not every examiner has to run every test of every tool.
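In the spirit of that recommendation, here's a rough sketch of what profiling a tool's footprint might look like; the tool path and the watched directory are assumptions for illustration. It snapshots file metadata before and after the tool runs and reports what was created or modified:

```python
import os
import subprocess

def snapshot(root):
    """Map each file under root to its (size, mtime) metadata."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
                state[path] = (st.st_size, st.st_mtime)
            except OSError:
                pass  # file vanished or is locked; skip it
    return state

# Assumptions for illustration: the tool under test and a
# directory known to pick up artifacts when programs run.
TOOL = ["C:\\tools\\pslist.exe"]
WATCHED = "C:\\Windows\\Prefetch"

before = snapshot(WATCHED)
subprocess.run(TOOL, check=True)
after = snapshot(WATCHED)

for path in sorted(set(after) - set(before)):
    print("created: ", path)
for path in sorted(set(before) & set(after)):
    if before[path] != after[path]:
        print("modified:", path)
```

Of course, a real profile would also need to cover Registry keys, the pagefile, and so on; this only catches file system changes in one directory.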

Here's another interesting tidbit...

Considering a potential Trojan defence...

Exactly!

Continuing on through the document, I can't say that I agree with the sequence given for collecting volatile data...specifically, the binary dump of physical memory should really be first, not last. That way, you can collect the contents of physical memory in as near a pristine state as possible. I also have to question the use of the term "bootable" to describe the platform from which the tools should be run, as booting to this media would inherently destroy the very volatile data you're attempting to collect.
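For what it's worth, here's a sketch of the ordering I'd prefer; the memory imaging tool name and the output location are placeholders, and the point is simply that the binary memory dump happens before anything else touches the system:

```python
import subprocess

CASE_DIR = "E:\\case"  # assumed output location on external media

# Physical memory first, before anything else touches the system;
# "memdump.exe" is a placeholder for a profiled imaging tool that
# writes a raw image to the path given as its argument.
subprocess.run(["memdump.exe", CASE_DIR + "\\memory.raw"], check=True)

# Then the remaining volatile data, each step's output captured
# to its own file on the external media.
STEPS = [
    ("netstat_ano.txt", ["netstat", "-ano"]),   # network connections
    ("tasklist_v.txt",  ["tasklist", "/v"]),    # running processes
    ("logged_on.txt",   ["query", "user"]),     # logged-on users
]
for outfile, cmd in STEPS:
    with open(CASE_DIR + "\\" + outfile, "w") as out:
        subprocess.run(cmd, stdout=out)
```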

Going back to my concerns (the part where I said I was "at a loss") above, I found this near the end of the section:

By accessing the devices, data may be added, violating Principle 1 but, if the logging mechanism is researched prior to investigation, the forensic footprints added during investigation may be taken into consideration and therefore Principle 2 can be complied with.

Ah, there we go...so if we profile our trusted tools and document their "forensic footprints", then we can identify our own (the investigator's) footprints on the storage media, much like a CSI following a specific route into and out of a crime scene so that she can say, "Yes, those are my footprints."
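Taking the analogy one step further, even something as simple as the following sketch would tie each collection step back to a tool's known footprint and build the audit trail Principle 3 asks for; the footprint profile shown is a hypothetical example, not actual test results:

```python
import datetime
import hashlib
import json

# Hypothetical footprint profile; a real one would come from the
# kind of tool testing described above.
FOOTPRINTS = {
    "pslist.exe": ["creates a Prefetch entry (PSLIST.EXE-*.pf)",
                   "updates the UserAssist Registry key"],
}

def log_action(logfile, tool_name, tool_path):
    """Append an audit-trail record: when the tool was run, the
    hash of the binary, and the footprint it is known to leave."""
    with open(tool_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "tool": tool_name,
        "sha256": digest,
        "expected_footprint": FOOTPRINTS.get(tool_name, ["unprofiled!"]),
    }
    with open(logfile, "a") as log:
        log.write(json.dumps(record) + "\n")

log_action("E:\\case\\audit.jsonl", "pslist.exe", "C:\\tools\\pslist.exe")
```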

Thoughts?