Hakin9 articles
I returned from a trip this morning to find two copies of the most recent edition of Hakin9 on my desk, with the first of three articles I've written on timeline creation and analysis. This first article is more of an introduction to the topic, and my hope is that anyone reading the articles is able to understand what I'm trying to get across, and see the usefulness and the power of this technique. Personally, I've used this technique on several examinations, all to spectacular effect.
Something that's very interesting (and validating) about this edition is Ismael Valenzuela's "My ERP got hacked - An introduction to computer forensics, pt II" article. Not only does Ismael make use of RegRipper, but he also walks through some techniques for parsing data (i.e., Event Logs/.evt files, IE browser history/index.dat file, etc.) in forensic analysis...very cool stuff, indeed! While Ismael's article does not explicitly develop a timeline, there are some data collection and analysis techniques illustrated in the article that are pretty spot on and very useful.
The second article in the series (I'm told that it will be in the next edition) is a hands-on walk-through, using a freely available image file that can be downloaded from the Internet as a basis for actually creating a timeline. While this is still a very manual process, I firmly believe the benefits of this technique far outweigh the "costs" (i.e., having to extract files and run CLI tools, etc.).
The third and final article (which I'm working on now) is a wrap-up, showing some alternative and advanced techniques that have proven (for me, anyway) to be extremely useful in getting data to include in the timeline. I've also pointed out a couple of areas where we need coverage with respect to converting the retrieved data into something that we can include in a timeline.
Overall, I think that the biggest issue with timeline creation and analysis at this point is the sheer volume of data that's available, and how we can go about doing a bit of data reduction. For example, I have yet to find a suitable technique for data visualization on the front end, when you have all of this data to go through. Clustered dots showing various activity (i.e., file system, Event Log, etc.) don't particularly make a great deal of sense to me, largely due to the fact that things such as software updates and normal operating system activity tend to create a great deal of "noise", whereas the compromise or the malware activity falls into what Pete Silberman of Mandiant referred to as "least frequency of occurrence". So spitting things out in ASCII format so that the analyst can do...well...analysis seems, to me, to be the most effective way to go at this point.
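To make the "least frequency of occurrence" idea concrete, here's a minimal sketch of what that kind of data reduction might look like over timeline events. The event structure, field names, and sample data are my own assumptions for illustration; the point is simply that routine OS noise repeats constantly, while compromise artifacts tend to show up rarely.

```python
# Hypothetical sketch of "least frequency of occurrence" (LFO) data
# reduction over timeline events. Field names ("time", "desc") and the
# sample data below are assumptions for illustration only.
from collections import Counter

def lfo_filter(events, key, max_count=1):
    """Keep only events whose key value occurs at most max_count times.

    Frequent entries (software updates, normal OS churn) are treated as
    noise; rare entries surface for the analyst to review.
    """
    counts = Counter(key(e) for e in events)
    return [e for e in events if counts[key(e)] <= max_count]

# Noisy repeated file system activity vs. a single rare event
events = [
    {"time": "2009-07-01 10:00:00", "desc": "svchost.exe accessed"},
    {"time": "2009-07-01 10:00:01", "desc": "svchost.exe accessed"},
    {"time": "2009-07-01 10:00:02", "desc": "svchost.exe accessed"},
    {"time": "2009-07-01 10:03:12", "desc": "Service installed: badsvc"},
]

rare = lfo_filter(events, key=lambda e: e["desc"])
for e in rare:
    print(e["time"], e["desc"])
```

In this toy example, the three repeated svchost.exe entries drop out and only the single service-installation event remains, which is the sort of thing an analyst would want to eyeball first.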
Once the analyst has nailed down the events in question, essentially separating the wheat from the chaff, that is the time for visualization techniques, particularly for reporting. I've seen and referred to some techniques for doing this, including Simile and using Excel to generate something usable.