Good Reading, Tools


Reading
Cylance Blog - Uncommon Event Log Analysis - some great stuff here showing what can be found with respect to indirect or "consequential" artifacts, particularly within the Windows Event Logs on Vista systems and above.  The author does a good job of pointing out how useful information can be found in some pretty unusual places on Windows systems.  I'd be interested to see how things fall out when a timeline is assembled, as that's how I most often locate indirect artifacts.

Cylance Blog - Uncommon Handle Analysis - another blog post by Gary Colomb, this one involving the analysis of handles in memory.  I liked the approach taken, wherein Gary explains the why, and provides a tool for the how.  A number of years ago, I had written a Perl script that would parse the output of the MS SysInternals tool handle.exe (run as handle -a) and sort the handles found based on least frequency of occurrence, in order to do something similar to what's described in the post.
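
Just to illustrate the idea (this is not the original script), here's a quick sketch of that kind of least-frequency-of-occurrence sorting; the handle line layout it assumes can vary between versions of handle.exe, so treat the regex as an assumption:

```perl
#! c:\perl\bin\perl.exe
# least_freq.pl - sketch: parse "handle -a" output and list named objects,
# rarest first.  Assumes the usual "  <hex>: <type>  <name>" handle line
# layout, which may vary between versions of handle.exe.
# usage: handle -a > handles.txt, then: least_freq.pl handles.txt
use strict;
use warnings;

my %count = ();

while (<>) {
	chomp;
	# named-object handle lines only; unnamed handles are skipped
	if (/^\s+[0-9A-Fa-f]+:\s+(\S+)\s+(.*\S)/) {
		$count{$1."|".$2}++;
	}
}

# least frequently occurring objects come out first
foreach my $obj (sort {$count{$a} <=> $count{$b}} keys %count) {
	printf "%-5d %s\n", $count{$obj}, $obj;
}
```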

Security BrainDump - Bugbear found some interesting ZeroAccess artifacts; many of the artifacts are similar to those seen in other variants of ZA, as well as in other malware families (e.g., file system tunneling), but in this case the click-fraud artifacts appeared in the systemprofile folder...that's very interesting. 
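
For anyone who hasn't run into file system tunneling before, here's a rough sketch you can run on a test box to see the behavior for yourself; it assumes that Perl on Win32 reports the file creation time in the ctime slot of stat(), which is the usual behavior but worth verifying locally:

```perl
# tunnel_test.pl - sketch: observe file system tunneling on a live Windows
# box.  Deleting a file and recreating one with the same name in the same
# directory within the (default) ~15 second window should preserve the
# original creation time.  File name below is just a test placeholder.
use strict;
use warnings;

my $file = "tunnel_test.txt";

open(my $fh, '>', $file) or die "$file: $!";
print $fh "first\n";
close($fh);
my $created1 = (stat($file))[10];   # on Win32, ctime is the creation time

sleep 5;                            # stay inside the tunneling window
unlink $file or die "unlink: $!";

open($fh, '>', $file) or die "$file: $!";
print $fh "second\n";
close($fh);
my $created2 = (stat($file))[10];

# with tunneling in effect, the "new" file inherits the original creation time
print "first : ".scalar(localtime($created1))."\n";
print "second: ".scalar(localtime($created2))."\n";
```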

SpiderLabs Anterior - The White X - this was an interesting and insightful read, in that it fits right in with Chris Pogue's Sniper Forensics presentations, particularly when he talks about 'expert eyes'.  One thing Chris is absolutely correct about is that we, as a community, need to continue to shift our focus away from tools and more toward methodologies and processes.  Corey Harrell has said the same thing, and I really believe this to be true.  While others have suggested that the tools help to make non-experts useful, I would suggest that the usefulness of these "non-experts" is extremely limited.  I'm not suggesting that one has to be an expert in mechanical engineering and combustion engine design in order to drive a car; rather, I'm simply saying that we have to understand the underlying data structures and what the tools are doing when we run them, and focus on the analysis process.

Java Web Vulnerability Mitigation on Windows - Great blog post that is very timely, and includes information that can be used in conjunction with RegRipper in order to determine the initial infection vector (IIV) during analysis.
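
As an illustration of the RegRipper angle, here's a minimal sketch of what a plugin to pull the installed JRE version from the Software hive might look like; the key path and the boilerplate follow the usual plugin conventions, but treat it as a starting point rather than a finished plugin:

```perl
# java_ver.pl - illustrative RegRipper-style plugin sketch; pulls the current
# JRE version from the Software hive as one data point toward nailing down a
# Java-based initial infection vector.  Key path and boilerplate are
# assumptions based on standard RegRipper plugin conventions.
package java_ver;
use strict;

my %config = (hive          => "Software",
              hasShortDescr => 1,
              hasDescr      => 0,
              hasRefs       => 0,
              osmask        => 22,
              version       => 20130301);

sub getConfig{return %config}
sub getShortDescr { return "Gets installed JRE version"; }
sub getDescr{}
sub getRefs {}
sub getHive {return $config{hive};}
sub getVersion {return $config{version};}

sub pluginmain {
	my $class = shift;
	my $hive  = shift;
	my $reg = Parse::Win32Registry->new($hive);
	my $root_key = $reg->get_root_key;

	my $key_path = "JavaSoft\\Java Runtime Environment";
	if (my $key = $root_key->get_subkey($key_path)) {
		::rptMsg($key_path);
		::rptMsg("LastWrite time: ".gmtime($key->get_timestamp())." Z");
		my $cv = $key->get_value("CurrentVersion");
		::rptMsg("CurrentVersion = ".$cv->get_data()) if ($cv);
	}
	else {
		::rptMsg($key_path." not found.");
	}
}
1;
```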

ForkSec Blog - a "new" blog I saw referenced on Twitter one morning; I started my reading with the post reviewing the viaExtract demo.  I don't do any mobile forensics at the moment, but I did enjoy reading the post, as well as seeing the reference to Santoku Linux.

Tools
win-sshfs - an SSH (SFTP) file system for Windows - I haven't tried this one, but it does look interesting.

4Discovery recently announced that they'd released a number of tools to assist in forensic analysis.  I downloaded and ran two of the tools, LinkParser and shellbagger.  I ran LinkParser against a legit LNK file, pulled from a system, that contained only a header and a shell item ID list (no LinkInfo block), and LinkParser didn't display anything.  I also ran LinkParser against a couple of LNK files that I have been using to test my own tools, and it did not seem to parse the shell item ID lists.  I then ran shellbagger against some test data I've been working with and found that, similar to other popular tools, it missed some shell items completely.  I did notice that when the tool found a GUID it didn't know, it said so...but it didn't display the GUID in the GUI so that the analyst could look it up.  I haven't yet had a chance to run some of the other tools, and there are reportedly more coming out in the future, so keep an eye on the web site.
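
As a side note, a quick check of the LNK header flags will tell you whether a shell item ID list or a LinkInfo block should even be there before deciding that a parser missed something; here's a short sketch based on the offsets and flag values documented in MS-SHLLINK:

```perl
# lnk_flags.pl - sketch: read an LNK file's header and report whether a shell
# item ID list and/or a LinkInfo block are present, per the LinkFlags field
# described in MS-SHLLINK (HeaderSize at offset 0, LinkFlags at offset 20).
use strict;
use warnings;

my $file = shift || die "You must enter a filename.\n";
open(my $fh, '<', $file) or die "$file: $!";
binmode($fh);

my $hdr;
read($fh, $hdr, 0x4C) == 0x4C or die "Short read; not a complete LNK header.\n";
close($fh);

my $size  = unpack("V", substr($hdr, 0, 4));
my $flags = unpack("V", substr($hdr, 20, 4));
die "Not an LNK file (HeaderSize != 0x4C).\n" unless ($size == 0x4C);

printf "LinkFlags           : 0x%08x\n", $flags;
printf "HasLinkTargetIDList : %s\n", ($flags & 0x01) ? "yes" : "no";
printf "HasLinkInfo         : %s\n", ($flags & 0x02) ? "yes" : "no";
```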

ShadowKit - I saw via Chad Tilbury on G+ recently that ShadowKit v1.6 is available.  Here's another blog post that talks about how to use ShadowKit; the process for setting up your image to be accessed is identical to the process I laid out in WFAT 3/e...so I guess I'm having a little difficulty seeing the advantages of this tool over native tools such as vssadmin + mklink, beyond the fact that it provides a GUI.
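
For reference, once the image has been mounted as a volume, the native-tool process amounts to something along these lines, from an elevated prompt (the drive letter and shadow copy number here are just examples):

```
C:\>vssadmin list shadows /for=F:
C:\>mklink /d C:\vsc23 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy23\
```

The trailing backslash on the device path matters; once the symlink is created, the VSC can be browsed or scanned like any other directory, and removed later with rmdir.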

Autopsy - Now has a graphical timeline feature; right now, this feature only appears to include file system metadata, but the approach certainly has potential.  Based on my experience with timeline analysis, I don't see the immediate value in bringing graphical features to the front end of timeline analysis; other tools take a similar approach, and most often I'm not looking for where or when the greatest number of events occur...I'm usually looking for a needle in a stack of needles.  However, I do see the potential of this technique.  Specifically, adding Registry, Windows Event Log, and other events will only increase the amount of data, but one means of addressing this would be to include alerts in the timeline data, and then show all events as one color and alerts as another.  Alerts could be based on either direct or indirect/consequential artifacts, and can be extremely valuable in a number of types of cases, directing the analyst's attention to critical areas for analysis.
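
To illustrate the sort of alerting I have in mind, here's a rough sketch that walks a TLN-format events file (time|source|system|user|description) and tags events containing certain keywords as alerts, so that a front end could render them in a different color; the keyword list is purely illustrative:

```perl
# tln_alerts.pl - sketch: tag events in a TLN-format events file as alerts
# based on simple keyword hits in the description field, so alerts could be
# displayed in a different color from everything else.
use strict;
use warnings;

my @alerts = ("systemprofile", "temp", "appdata");   # example indicators only

while (<>) {
	chomp;
	next if ($_ eq "");
	my ($time, $source, $system, $user, $descr) = split(/\|/, $_, 5);
	next unless (defined $descr);
	my $tag = "event";
	foreach my $a (@alerts) {
		if (index(lc($descr), $a) > -1) {
			$tag = "ALERT";
			last;
		}
	}
	print $tag."|".$_."\n";
}
```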

NTFS TriForce - David Cowen has released the public beta of his NTFS TriForce tool.  I didn't see David's presentation on this tool, but I did get to listen to the recording of his DFIROnline presentation - the individual artifacts that David describes are very useful, but the real value is obtained when they're all combined.

Auto-rip - Corey has unleashed auto-rip; Corey's done a great job of automating data collection and initial analysis, with the key to this automation being that Corey knows and understands EXACTLY what he's doing, and why, when he launches auto-rip.  This is really the key to automating any DFIR task...while some will say that "it goes without saying", too often there is a lack of understanding with respect to the underlying data structures and their context when automated tools are run.
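
To be clear about what that sort of automation amounts to, here's a stripped-down sketch of the idea...run rip.pl against each hive with a plugins file and drop the output into the case folder; all of the paths and profile names below are placeholders, not Corey's actual ones:

```perl
# mini_rip.pl - stripped-down sketch of automating RegRipper: run rip.pl
# against each hive with a plugins file (-f) and save the output to the case
# folder.  Paths and profile names are placeholders.
use strict;
use warnings;

my $case = "D:\\cases\\example";    # hypothetical mounted image/case path

my %hives = ("software" => $case."\\Windows\\System32\\config\\SOFTWARE",
             "system"   => $case."\\Windows\\System32\\config\\SYSTEM",
             "ntuser"   => $case."\\Users\\user\\NTUSER.DAT");

foreach my $profile (keys %hives) {
	my $out = $case."\\reports\\".$profile."_rip.txt";
	system("rip.pl -r \"".$hives{$profile}."\" -f ".$profile." > \"".$out."\"");
}
```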

WebLogParser - Eric Zimmerman has released a log parser with geolocation, DNS lookups, and more.