Book & Attending BlackHat DC
Recently, Troy Larson (bio and picture) graciously offered to review/tech edit my upcoming book, so to start, I asked the editor to send him two chapters: the one that covers volatile data collection and the one that covers Registry analysis. Here's a quote that I received from Troy regarding the Registry analysis chapter:

"I really liked the registry chapter. It is worth the price of the book alone."

Sweet! In the same email, Troy also mentioned that there wasn't as much info on Vista as there was on the other versions of Windows covered (2000, XP, and 2003), but he also said that this is largely due to the fact that there just isn't that much information available yet.

Speaking of the book, I found it on Amazon this morning, available for pre-order.
I spent most of yesterday at BlackHat DC and caught the first three presentations in the Forensics track...well, most of them. I caught all of Kevin Mandia's presentation, as well as part of the presentation that followed on web app forensics, and then after lunch I went to Nick Petroni and AAron Walters' presentation on VolaTools. Unfortunately, I didn't get to stay for the entire presentation.
Kevin's presentation was very interesting, addressing the need for live response. He made several comments about the need for "fast and light" response.
IMHO, there is a need for a greater understanding of live response...not just the technical aspects of it, but how those technical aspects can be applied to and affect the business infrastructure of an organization.
Following the presentation, I took a few minutes to jot down some notes and thoughts...I started by attempting to classify incidents (think of "Agent Smith" in The Matrix, classifying humans as a "virus") in terms of the speed of response required for each. I started by defining speed of response in general terms ("ASAP", etc.) and then attempted to become more specific, though not down to minutes and seconds. Regardless of the angle I approached the problem from, the one thing that kept coming to mind was business requirements...what are the needs of the business? Sure, we've got all these great technical things we can do, but as Kevin pointed out (as did Nick and AAron), a lot of the folks he's talked to have said, "we've already got all this data (file system data, etc.) to analyze...we don't do live response because the last thing we need is more data!"
I guess the question at this point is: is the data that you're currently collecting for analysis meeting your business needs? If you're backlogged six months in your analysis...probably not. Depending upon the nature (or class) of the incident and your business environment, it may (and most likely will) be useful to rapidly collect and analyze a specific subset of volatile data during live response, so that you can get a better picture of the issue and progress through your analysis more accurately and efficiently.
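Just to make the idea concrete, here's a purely hypothetical sketch of what such a classification might look like as a simple data structure...the incident classes and response windows below are made up for illustration, not a published taxonomy:

    # Purely hypothetical sketch -- the incident classes and response
    # windows below are made-up examples, not a published taxonomy.
    from dataclasses import dataclass

    @dataclass
    class IncidentClass:
        name: str
        response_speed: str         # general terms first ("ASAP"), not minutes and seconds
        live_response_useful: bool  # would a targeted volatile-data collection help?

    CLASSES = [
        IncidentClass("active intrusion", "ASAP", True),
        IncidentClass("malware infection", "same business day", True),
        IncidentClass("acceptable-use violation", "as scheduled", False),
    ]

    for c in CLASSES:
        print("%s: respond %s; live response %s" % (
            c.name, c.response_speed,
            "recommended" if c.live_response_useful else "optional"))

The point isn't the specific classes...it's that the mapping from incident class to response speed is driven by business requirements, not by what's technically possible.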
One of the things Kevin pointed out is that malware (can anyone verify that Kevin only said "malware" six times? I wasn't keeping track) now includes the capability of modifying the MAC times of files written to the system. This is an anti-forensic technique that severely hampers those using the current "Nintendo forensics" approach to analysis.
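To illustrate just how low the bar is, here's a minimal sketch (the file path is hypothetical) showing that rewriting a file's last-accessed and last-modified times takes a single Python standard-library call...changing the creation time on Windows additionally requires the Win32 SetFileTime API, which isn't shown here:

    # Minimal sketch of how trivially MAC times can be altered: os.utime()
    # rewrites a file's last-accessed and last-modified times in a single
    # standard-library call. (Changing the creation time on Windows also
    # requires the Win32 SetFileTime API, not shown here.)
    import os
    import time

    target = "dropped_file.exe"  # hypothetical path, for illustration only

    # Back-date the file by roughly one year.
    fake = time.time() - 365 * 24 * 60 * 60
    os.utime(target, (fake, fake))  # (atime, mtime)

    print("mtime now reports:", time.ctime(os.stat(target).st_mtime))

If a timeline built solely from file system timestamps is your whole analysis, this is all it takes to break it.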
I think that some people are beginning to recognize the need for live response and how it can be useful in an investigation. However, I think the issue now is one of education...how do those currently doing response of any kind (live or otherwise) break out of their current mindset and shift gears? By moving to a more holistic approach and using live response where applicable, and in the appropriate manner, they will begin to see just how useful this activity can be.
The presentation I found most interesting was Nick and AAron's "VolaTools" talk. Once Nick finished the overview (presenting some great information on why you'd want to use tools like this), AAron kicked off into the "how" portion of the talk. I think their approach is an excellent one, in that they've identified a major stumbling block for this kind of analysis...the available tools are not included with EnCase, they are not "push the Find-All-Evidence button" kinds of tools, and people aren't using them because the tools themselves are simply too different. In a nutshell, there's a confidence factor that needs to be addressed. While Nick and AAron's tools are written in Python, the Basic version provides a command-line interface that allows the user to extract information from a RAM dump in nearly the same manner as on a live system...to list the modules used, you would use a "dlllist" command (as opposed to listdlls.exe), and to list handles, you'd use "handles".
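For the sake of illustration (this is not VolaTools' actual source, just a sketch of the design idea as I understood it from the talk), the approach boils down to a dispatch table that maps familiar live-response command names onto routines that parse the RAM dump:

    # Illustrative sketch only -- not VolaTools' actual code. The idea:
    # map familiar live-response command names onto routines that parse a
    # RAM dump, so the analyst's existing workflow carries over.
    import sys

    def dlllist(dump_path):
        # A real tool would walk each process's loader lists in the image;
        # this is just a placeholder.
        print("[dlllist] would enumerate loaded modules in %s" % dump_path)

    def handles(dump_path):
        # Likewise, a real implementation would parse the handle tables.
        print("[handles] would enumerate open handles in %s" % dump_path)

    COMMANDS = {"dlllist": dlllist, "handles": handles}

    if __name__ == "__main__":
        if len(sys.argv) != 3 or sys.argv[1] not in COMMANDS:
            print("usage: volasketch.py {%s} <ram_dump>" % "|".join(COMMANDS))
            sys.exit(1)
        COMMANDS[sys.argv[1]](sys.argv[2])

That familiarity is exactly what addresses the confidence factor...the analyst types nearly the same command against a memory image that they would run against a live box.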
I can't wait to download and take a look at their tools...these two guys are really bright and have just moved Windows memory analysis forward by a couple of huge leaps. And thanks to guys like Jesse Kornblum for things such as his Buffalo paper, because we can use AAron and Nick's work as a basis for incorporating the pagefile into memory analysis, as well.
Oh, and while I did meet both Ovie and Bret, unfortunately Ovie wasn't wearing his CyberSpeak t-shirt, so I guess I don't get to have a beer with Bret! ;-(
One final thing about BlackHat DC...seeing Jesse do his impression of a sardine was worth the price of admission! Speaking of Jesse, it looks like he's been busy on the ForensicWiki...