Timeline Creation Tools
As time progresses, we look at the tools we have available, tweak the ones we have, and sometimes create new tools to cover new capabilities. Recently, someone was kind enough to take the time to post feedback on their experiences with the timeline tools I released in the Win4n6 Yahoo Group a while ago, and I took the opportunity to update some of the tools based on that feedback. Below are the tools I updated, and what I did to update them:
pref.pl - removed the path to the directory where the Prefetch files are kept; the feedback made an excellent point - we don't want to confuse the user
evtparse.pl - updated this script to (a) dump the sequence of event records along with their "time generated" timestamps, and (b) process all .evt files in a directory, rather than requiring the user to enter one command line for each file
jobparse.pl - created this one recently, for parsing Scheduled Task .job files (NOT the schedlgu.txt log file); includes output in TLN format (see the sketch following this list)
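For anyone who hasn't worked with the TLN format yet, it's a simple pipe-delimited, five-field layout: Time|Source|Host|User|Description. Here's a minimal Perl sketch of emitting an event in that layout; the host, user, and description values below are hypothetical, and this is not code lifted from jobparse.pl itself:

```perl
#!/usr/bin/perl
# Minimal sketch of the five-field TLN layout: Time|Source|Host|User|Description
# The sample event below is hypothetical, not actual jobparse.pl output.
use strict;
use warnings;

sub tln_line {
    my ($epoch, $source, $host, $user, $desc) = @_;
    # Time is a 32-bit Unix epoch value, normalized to UTC
    return join('|', $epoch, $source, $host, $user, $desc);
}

# Hypothetical example: the last run time from a Scheduled Task .job file
print tln_line(1234567890, "JOB", "WKSTN-01", "", "MyTask.job - task last run"), "\n";
# => 1234567890|JOB|WKSTN-01||MyTask.job - task last run
```

Because every event from every source normalizes to this one line format, output from the different parsers can simply be concatenated and sorted by the first field to build the timeline.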
Now, these updated tools have NOT been included in the toolset available in the group, largely because my second Hakin9 article - the one where I provide a hands-on walk-through of the tools - should be coming out in the near future, and I don't want to confuse anyone. Also, the feedback (which I greatly appreciate) pointed out that this is still largely a manual process, and I realize that this can be an impediment to a lot of forensic examiners. Maybe I need to provide training on using these tools, so that more folks can see for themselves the real power of this analysis technique.
Another thing I really need to emphasize about timeline generation is how powerful it can be for optimizing triage and analysis. Let's say you're responding to a large-ish incident, and it's clear that you need a means of getting some analysis done in parallel while the rest of the data is being collected. On-site staff can collect file system metadata and specific files from acquired images while verifying the image file systems, and ship that data off to another analyst for timeline generation and analysis. Given an 80 or 160GB image, collecting the file system metadata and archiving selected files extracted from the image means that you're sending off several MB of data, rather than GB. In addition, you're not actually sending file contents...so in the case of response activities involving a data breach, you can get analysis done by shipping this data off without sending the actual sensitive data itself...file names and paths != file contents.
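To make the size difference concrete, here's a minimal Perl sketch of what that metadata collection step might look like: it just walks a mounted or exposed file system and emits one small pipe-delimited line of MAC times per file. This is an illustration of the idea, not one of the released tools, and the F:\ mount point is hypothetical:

```perl
#!/usr/bin/perl
# Sketch: walk a mounted/exposed file system and dump per-file MAC times
# as one pipe-delimited text line per file - MBs of text instead of GBs
# of image, and no file contents. NOT one of the released tools; the
# default root of F:\ is a hypothetical mount point.
use strict;
use warnings;
use File::Find;

my $root = shift || "F:\\";

find({ no_chdir => 1, wanted => sub {
    return unless -f $File::Find::name;
    # stat() fields: 8 = atime, 9 = mtime, 10 = ctime
    my ($atime, $mtime, $ctime) = (stat($File::Find::name))[8, 9, 10];
    print join('|', $mtime, $atime, $ctime, $File::Find::name), "\n";
}}, $root);
```

Redirect that output to a file and you have the file system portion of the timeline data, ready to ship.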
So consider this scenario...on-site staff are in the process of acquiring systems (or, perhaps, the organization's own incident responders are acquiring memory dumps and images), and part of that process is to verify the acquired images by opening the image file in FTK Imager. Now, you may only have a few team members on-site, all trying to collect a considerable amount of data; not just images, but also network diagrams, data flows, etc. So, their new process is to verify the file system of each image, and then run the appropriate tools to collect file system metadata, as well as various files (e.g., .evt, .pf, .job, Registry, etc.), zip them up, and ship them off for analysis (a rough sketch of that collection step follows below). Put these in the hands of someone skilled and practiced in the use of the timeline creation tools, and you will very quickly get a timeline of activity from each system. This can help you quickly narrow down what you're looking for or at, as well as help you scope other systems that may be involved in the incident. And you haven't contributed to the exposure of sensitive data!
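As a rough sketch of what that "zip them up" step might look like, the following assumes the Archive::Zip module is available; the F:\ mount point and triage.zip name are hypothetical, and Registry hive files (which carry no extension) would need to be added by explicit path:

```perl
#!/usr/bin/perl
# Rough sketch of the collection step: sweep .evt, .pf, and .job files
# from a mounted image into one archive for shipping to the analyst.
# Hypothetical paths/names; Registry hives have no extension, so they
# would be added by explicit path (e.g., under \Windows\system32\config).
use strict;
use warnings;
use File::Find;
use Archive::Zip qw(:ERROR_CODES);

my $root = shift || "F:\\";
my $zip  = Archive::Zip->new();

find({ no_chdir => 1, wanted => sub {
    if (-f $File::Find::name && $File::Find::name =~ /\.(evt|pf|job)$/i) {
        $zip->addFile($File::Find::name);
    }
}}, $root);

$zip->writeToFileNamed('triage.zip') == AZ_OK
    or die "Could not write triage.zip\n";
```

The resulting archive is small enough to send over just about any link, and it contains everything the off-site analyst needs to run the timeline tools.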