I received some questions from readers and I thought
that I'd take a crack at them...
With consumer and commercial hard drives getting larger
and larger (soon there will be a consumer-level 1TB HD)
what effect will the sheer volume of information have
on the forensics community in general, and the
open-source forensics community in particular? Does the
future include more automated tools? Distributed forensic
examinations? Will the open-source forensic
community be able to keep up with the commercial tools?
For the most part, these appear to be separate
questions, but I'll treat them as pretty much the
same one.
This is nothing new...the challenge of increasing volumes
has affected the forensic community since...well...since
there was storage. Capacities have always increased...such
is the nature of a consumer market. So what's the answer?
Well, in a nutshell...the Age of Nintendo Forensics is
dead. What I mean by that is that new techniques,
processes and methodologies need to be employed to
meet these challenges. We can no longer rely on
simply pushing a button...we have to understand
what it is we're looking for, and where to
look for it.
Notice I didn't say "developed". Rather, I said "employed".
Yes, it was my intention to imply that these techniques are
already available, and actually used by a number of
practitioners. However, the vast majority of the digital
forensics community appears to rely on the traditional
approach to computer forensics analysis, which can be
likened to killing the victim of a crime and performing
an autopsy to see what happened.
Live analysis techniques need to be employed more often.
Many traditionalists say, "I won't use these
techniques until they've been proven in court."
Well, consider this...at one point, computer forensics
hadn't been used in court, either, and had to be proven.
Suffice it to say, computing technology grows in complexity
...this applies to the hardware, as well as the software.
Accordingly, forensic techniques need to keep pace.
The traditional techniques are still usable...I'm not
saying that they aren't...but practitioners can no longer
learn one thing and hope to stay in business as complexity
and change surround them. We need to grow in knowledge
and understanding in order to keep up. This includes an
understanding of Locard's Exchange Principle, as well as
of what actions or conditions lead to the creation and
modification of certain artifacts...from that, we
understand that the absence of an artifact is in
itself an artifact.
Some of the challenges introduced by Windows Vista
include extracting live memory and (presumably) an
increase in use of EFS by consumers. How can these
challenges best be addressed by the forensics community?
The challenge of extracting live memory has been an
issue since Windows 2003 SP1, when user-mode access
to the \\.\PhysicalMemory object was restricted.
However, has it really been a challenge, per se?
How many folks have obtained memory dumps, and of
those, how have they been used? It isn't as if
the vast majority of the community has been using
physical memory as a source of evidence and
presenting it in court.
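If you're curious what the restriction actually looks
like, here's a minimal sketch...this assumes Python
with ctypes on a Windows box, and it simply attempts
to open the \Device\PhysicalMemory section object
read-only from user mode (the same object the old
dd-style tools opened via NtOpenSection). This is an
illustration, not acquisition tooling:

    import ctypes
    from ctypes import wintypes

    ntdll = ctypes.WinDLL("ntdll")

    class UNICODE_STRING(ctypes.Structure):
        _fields_ = [("Length", ctypes.c_ushort),
                    ("MaximumLength", ctypes.c_ushort),
                    ("Buffer", ctypes.c_wchar_p)]

    class OBJECT_ATTRIBUTES(ctypes.Structure):
        _fields_ = [("Length", ctypes.c_ulong),
                    ("RootDirectory", wintypes.HANDLE),
                    ("ObjectName", ctypes.POINTER(UNICODE_STRING)),
                    ("Attributes", ctypes.c_ulong),
                    ("SecurityDescriptor", ctypes.c_void_p),
                    ("SecurityQualityOfService", ctypes.c_void_p)]

    SECTION_MAP_READ = 0x0004
    OBJ_CASE_INSENSITIVE = 0x40

    # Keep a reference to the buffer so it isn't freed
    # while the UNICODE_STRING still points at it
    buf = ctypes.c_wchar_p("\\Device\\PhysicalMemory")
    name = UNICODE_STRING()
    ntdll.RtlInitUnicodeString(ctypes.byref(name), buf)

    attrs = OBJECT_ATTRIBUTES()
    attrs.Length = ctypes.sizeof(attrs)
    attrs.ObjectName = ctypes.pointer(name)
    attrs.Attributes = OBJ_CASE_INSENSITIVE

    handle = wintypes.HANDLE()
    status = ntdll.NtOpenSection(ctypes.byref(handle),
                                 SECTION_MAP_READ,
                                 ctypes.byref(attrs))

    if status == 0:
        print("Opened the physical memory section from user mode")
    else:
        # 0xC0000022 is STATUS_ACCESS_DENIED
        print("NtOpenSection failed: 0x%08X" % (status & 0xFFFFFFFF))

On XP or 2003 pre-SP1 (run as Administrator), the open
succeeds; on 2003 SP1 and later, user mode gets
STATUS_ACCESS_DENIED, which is exactly the restriction
described above.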
That being said, these challenges will be addressed
by a small portion of the forensics community.
That core group (and it may be dispersed) has
developed processes and methodologies, as well as
documentation, which it will use, and even
publish. However, it may be a while before the
rest of the "community" of forensic practitioners
catches on and begins using these on a regular
basis.
Since Vista was specifically asked about, I'll
throw this out there...there is a Registry value
called NtfsDisableLastAccessUpdate, and yes, it
does exactly what it sounds like it does. If this
value is set to "1", then the operating system
will not update last access times on files.
From NT through 2003, this value defaulted to
"0" (last access times were updated), and setting
it to "1" was recommended for high-volume file
servers in order to increase performance. However,
on Vista, the value is set to "1" by default, so
last access times are not updated. What
this means is that examiners are going to have
to use other techniques for determining timelines
of activity, etc., and that may even require
Registry analysis.
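If you want to see where a live box stands, here's a
quick sketch using Python's standard winreg module
(on an acquired image, you'd parse the raw SYSTEM
hive offline instead...this is just an illustration):

    import winreg

    # NtfsDisableLastAccessUpdate lives under the
    # FileSystem control key
    KEY = r"SYSTEM\CurrentControlSet\Control\FileSystem"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
        try:
            value, _ = winreg.QueryValueEx(
                k, "NtfsDisableLastAccessUpdate")
        except OSError:
            # Value absent...last access updates are on,
            # which was the default from NT through 2003
            value = 0

    if value == 1:
        print("Last access time updates are OFF (Vista default)")
    else:
        print("Last access time updates are ON")

Keep in mind that "CurrentControlSet" only exists on a
live system; in the raw SYSTEM hive, you'd look under
the ControlSetNNN key indicated by the Select key.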
Do you think there are ethical challenges
specific to forensic examiners or
incident handlers? What do you think is
the most effective method of oversight in
small offices where there may only be one
examiner? Would you say that strong personal
ethics are more important than technical
skill for an examiner/incident handler?
I would suggest that ethics is not something
that is restricted to forensic examiners alone,
but is just as important in all fields. I think
that in any field, a wake-up call comes when
someone falls into that "who are you to
question me" trap and then gets caught.
I also think that is an important evolution
and good for the community at large.
In this kind of work it is important to
remain humble and to constantly keep checking
ourselves. There's nothing wrong with going to
someone else and asking, "Hey, did I do enough
here, or is this telling me what I think it's
telling me?" Overall, I think that builds trust,
and strengthens the individuals involved.
Thoughts? Comments? Email me, or leave a comment...