Randomness

Writing Books...and whatnot...
As I've mentioned, Windows Registry Forensics is nearing completion (I'm reviewing the proofs now...) and according to Syngress, will be available in January. When something like this is coming out, as with WFA 2/e, the biggest question I tend to get is, "Did you cover the differences/changes in Windows versions?"

Most times, I'm taken aback by this question. I was at the CSI 2010 conference recently, and went by the Syngress table. I talked to someone who was looking at a copy of WFA 2/e, and once he realized that I was the author, he asked me that question. In order to get a better idea of how to answer it, I turned it around and asked him, "...like what?" To that, I got a blank stare in response.

I think...and please, correct me if I'm wrong here...but the real, underlying question is, what are the new sources and locations of potentially valuable data?

In many ways, this is something of a loaded question...because, yes, there are some things that are different between versions of Windows, and in particular between the venerable Windows XP and the shiny new Windows 7. But I could sit here for the better part of the day talking about the differences and never, not once, hit on anything of value to you or to your examinations. Some file paths and Registry locations have changed...while some haven't. Some file formats have changed, while new ones have been added. Some of the new file types in Windows 7...for example, sticky notes...are a bit questionable when it comes to their potential value as evidence.

My suggestion to the community is that if you want to see this kind of thing, start talking. What good does it do to stand there asking "...did you cover all of the changes?" AFTER the book has been written and published, and you're holding it in your hands (with your thumb on the table of contents)? Seriously. If you've got a concern, ask someone. I think you'll be surprised...if nothing else, that you're not the only person with that question. Sometimes, that's all it takes...

Registry Analysis
Speaking of working on the book (the proofs, really), one of the things that really intrigues me about Registry analysis is...well...Registry analysis.

One of the most valuable uses for Registry analysis that I've found is that you can go into the Registry and see historical access to a range of resources...USB removable storage devices (thumb drives, digital cameras, etc.), network shares, remote systems (via RDP or VNC), files that no longer exist on the system, etc. In many cases, you can tie a good deal of this activity to a user, as well as to a specific time frame. You can see what a user searched for on the system.

I've performed examinations of systems where there was a question regarding digital images...we'll just leave it at that (most of us know where I'm going with this...). Someone had claimed that the images had been put on the system by malware, and yet the Registry clearly showed not only access to the files in question, but also showed an association with a specific viewing application, and time stamps indicating when the files had been accessed. These time stamps from the Registry corresponded to file system metadata, as well as metadata from Prefetch files associated with the viewing application(s) (think multiple data sources for timelines). The Registry also indicated access to files no longer on the system, as well as files that were on other media.
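The "multiple data sources for timelines" idea can be sketched in code. The snippet below is a minimal, illustrative Python sketch, not any particular tool: it converts a Registry-style FILETIME value (Registry key LastWrite times are stored as 64-bit counts of 100-nanosecond intervals since 1601) to a UTC datetime, then merges hypothetical events from the Registry, the file system, and Prefetch into a single sorted timeline. All of the event names and timestamp values here are made up for illustration.

```python
from datetime import datetime, timezone

# Offset between the Windows FILETIME epoch (1601-01-01) and the Unix
# epoch (1970-01-01), expressed in 100-nanosecond ticks.
EPOCH_DIFF = 116444736000000000

def filetime_to_datetime(ft):
    """Convert a 64-bit Windows FILETIME value to a UTC datetime.

    Registry key LastWrite times are stored in this format.
    """
    return datetime.fromtimestamp((ft - EPOCH_DIFF) / 10_000_000, tz=timezone.utc)

# Hypothetical events from three sources -- Registry key LastWrite times,
# file system metadata, and Prefetch last-run times. The descriptions and
# timestamp values are purely illustrative.
events = [
    ("Registry", "RecentDocs key LastWrite", filetime_to_datetime(129333024000000000)),
    ("FileSystem", "image.jpg last accessed", datetime(2010, 11, 2, 14, 5, tzinfo=timezone.utc)),
    ("Prefetch", "VIEWER.EXE last run", datetime(2010, 11, 2, 14, 6, tzinfo=timezone.utc)),
]

def build_timeline(events):
    """Merge events from all sources into one chronologically sorted timeline."""
    return sorted(events, key=lambda e: e[2])

for source, desc, when in build_timeline(events):
    print(f"{when.isoformat()}  [{source}] {desc}")
```

The point isn't the code itself; it's that once everything is normalized to a common time format, timestamps from completely different artifacts can corroborate (or contradict) each other in a single view.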

I'm really looking forward to the book coming out for two reasons...one is to see it and hold it. The other is to see what folks think, in hopes that the content will spur discussion (what else is needed) as well as a greater move toward recognizing the Registry as a valuable forensic resource.

FIRST
I recently attended a portion of a FIRST technical colloquium put on by IBM. Thanks to Richard Bejtlich for sponsoring me so I could attend.

I saw David Bianco (I'd swear that David is Cory Altheide's fraternal twin...) from Richard's team at GE talk about what they were doing, and the biggest take-away for me was the fact that they collect what's practical, as opposed to everything that's possible. They've got a phased approach to their network monitoring coverage, which has been prioritized; much like the Great Wall, in the first phase of their rollout they don't so much look at what's coming into their infrastructure as what's trying to go out. This clearly demonstrates that considerable thought has been put into their approach, taking into account what needs to be covered, staffing levels, etc., and coordinating all of these resources.

In part, this is something of a breath of fresh air when applied to the IR/DF communities. Too often, I think, we tend to take an "I need EVERYTHING possible" approach, and end up losing sight of our goals. I really liked what David said about getting everything practical, as I can really see how it applies to the IR/DF field. Have you ever seen someone who will arrive on-site as a responder and state that their strategy is to image everything? By the time you're done imaging, the customer is no longer interested in your findings, as they've likely been fined or gone out of business.

Another thing that came out of the discussion surrounding the presentation was that they have a tiered approach for their analysts, with a progressive career path, so that analysts coming in at a particular level have goals that they can strive for in order to progress to the next level. This is a different view from what I had seen when I got into the security industry in 1997, and I think that it's an excellent approach.

Again, thanks, Richard. I was saddened to not see Eoghan Casey at the colloquium.