Brain Droppings
NoVA Forensics Meetup
The next NoVA Forensics Meetup will be held on Wed, 1 June 2011, from 7-8:30pm. As for a location, I met with the great folks at Reverse Space, a hackerspace in Herndon where several of the members have an interest in forensics. Thanks to Carl and Richard for taking the time to meet with me, and for offering to host our meetings.
I hope that we get a big turnout for our currently scheduled presentation, titled "Build your own packet capture engine".
Our July meetup is scheduled for Wednesday, 6 July, and we've already got an offer of a presentation on setting up virtual machines for dynamic malware analysis.
Beyond presentation topics, I'd like to get suggestions regarding how we can expand our following; for example, Chris from the NoVA Hackers group told me that they follow the AHA participation model. I'd like the development of this group to be a group effort, and as such I'll be asking participants and attendees for thoughts, ideas, and comments (and even to volunteer their own efforts) regarding how this group can grow. For example, do we need a mailing list, or is the Win4n6 Group sufficient? If you have anything that you'd like to offer up, please feel free to drop me a line.
Breakin' In
Speaking of the NoVA Forensics Meetup, at our last meeting, one of our guests asked me how to go about getting into the business. I tried to give a coherent answer, but as with many things, this question is one of those that have been marinating for some time, not just in my brain housing group, but within the community.
From my own perspective, when interviewing someone for a forensics position, I'm most interested in what they can do...I'm not so much interested in whether someone is an expert in a particular vendor's application. I'm more interested in methodology and process: what problems you've solved, where you've stumbled, and what you've learned. In short, are you tied to a single application, or do you fall back on a process or methodology? How do you go about solving problems? When you do something in particular (adding or skipping a step in your process), do you have a reason for doing so?
But the question really goes much deeper than that, doesn't it? How does one find out about available positions and what it really takes to fill them? One way to find available positions and job listings is via searches on Monster and Indeed.com. Another is to take part in communities, such as the...[cough]...NoVA Forensics Meetup, or online communities such as lists and forums.
Breaches
eWeek recently (6 May) published an article regarding the Sony breach, written by Fahmida Rashad, which started off by stating:
Sony could have prevented the breach if they’d applied some fundamental security measures...
Sometimes, I don't know about that. Is it really possible to say that, just because _a_ way was found to access the network, these "fundamental security measures" would have prevented the breach?
The article went on to quote Eugene Spafford's comments that Sony failed to employ a firewall, and used outdated versions of their web server. 'Spaf' testified before Congress on 4 May, where these statements were apparently made.
Interestingly, a BBC News article from 4 May indicates that at least some of the data stolen was from an "outdated database".
The eWeek article also indicates (as did other articles) that Data Forte, Guidance Software and Protiviti were forensics firms hired to address the breach.
As an aside, there was another statement made within the article that caught my interest:
“There are no consequences for many companies that under-invest in security,” Philip Lieberman, CEO of Lieberman Software, told eWEEK.
As a responder and analyst, I deal in facts. When I've been asked to assist in breach investigations, I have done so by addressing the questions posed to me through analysis of the available data. I do not often have knowledge of what occurred with respect to regulatory or legislative oversight. Now and again, I have seen news articles that mention some of the fallout of the incidents I've been involved with, but I don't see many of these. What I find interesting about Lieberman's statement is that, whatever the facts of any given case, this is the perception.
The Big Data Problem
I read a couple of interesting (albeit apparently diametrically opposed) posts recently; one was Corey Harrell's Triaging My Way (shoutz to Frank Sinatra) post where Corey talked about focusing on the data needed to answer the specific questions of your case. Corey's post provides an excellent example of a triage process in which specific data is extracted/accessed based on specific questions. If there is a question about the web browsing habits of a specific user, there are a number of specific locations an analyst can go within the system to get information to answer that question.
The other blog post was Marcus Thompson's We have a problem, part II post, which says, in part, that we (forensic analysts) have a "big data" problem, given the ever-increasing volume (and decreasing cost) of storage media. Now, I'm old enough to remember when you could boot a computer off of a 5 1/4" floppy disk, remove that disk and insert the storage disk that held your documents...before the time of hard drives that were actually installed in systems. This glut of storage media naturally leads to backlogs in analysis, as well as in intelligence collection.
I would suggest that the "big data" problem is particularly an issue in the face of traditional analysis techniques. Traditional techniques applied to Corey's example (above) dictate that all potential sources of media be collected, and keyword searches run. Wait...what? Well, no wonder we have backlogs! If I'm interested in a particular web site that the user may have visited, why would I run a keyword search across all of the EXEs and DLLs in the system32 directory? While there may be files on the 1TB USB-connected external hard drive, what is the likelihood that the user's web browser history is stored there? And why would I examine the contents of the Administrator (or any other) account profile if it hasn't been accessed in two years?
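To illustrate the difference (and this is just a sketch of the idea, not any particular tool or anyone's methodology), here's a minimal Perl example of question-driven triage: given the question "what sites did this user visit?", check only the locations likely to hold that user's browser history on an XP-era system. The drive letter, user name, and artifact paths are assumptions for illustration.
#!/usr/bin/perl
# triage_history.pl - sketch only: check the handful of locations likely to
# hold a specific user's web browser history, rather than keyword-searching
# the entire image. Paths are illustrative XP-era locations and assume the
# image has been mounted read-only as the F: drive.
use strict;
use warnings;
my $user = shift || die "Usage: triage_history.pl <username>\n";
my $base = "F:/Documents and Settings/$user";
# Candidate artifacts for the "what sites did this user visit?" question
my @targets = ("$base/Local Settings/History/History.IE5/index.dat",
               "$base/Local Settings/Temporary Internet Files/Content.IE5",
               "$base/Application Data/Mozilla/Firefox/Profiles");
foreach my $t (@targets) {
    if (-e $t) {
        print "[+] Found    : $t\n";
    }
    else {
        print "[-] Not found: $t\n";
    }
}
The script itself isn't the point; the point is that the question drives which data gets touched, which is pretty much the opposite of acquiring everything and running keyword searches across all of it.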
Another variant on this issue was discussed, in part, in Mike Viscuso's excellent Understanding APT presentation (at the recent AccessData User's Conference)...the presentation indicates that the threat isn't really terribly "advanced", but that it does make detection "just hard enough".
Writing Open Source Tools
This is a topic that came up when Cory and I were working on DFwOST...Cory thought that it would be a good section to add, and I agreed, but for the life of me, I couldn't find a place to put it in the book where it just didn't seem awkward. I still think that it's important, in part because open source tools come from somewhere, but also because I think that a lot more folks out there really have something to contribute to the community as a whole.
To start off, my own motivation for writing open source tools is simply to solve a problem or address something that I've encountered. This is where RegRipper came from...I found that I'd been looking at many of the same Registry keys/values over and over again, and had built up quite a few scripts. As such, I wanted a "better" (that's sort of relative, isn't it??) way to manage these things, particularly when there were so many, and they seemed to use a lot of the same code over and over.
I write tools in Perl because it's widely available and there are a LOT of resources available for anyone interested in learning to use it...even if just to read it. I know the same is true for Python, but back in '98-'99 when I started teaching myself Perl, I did so because the network monitoring guys in our office were looking for folks who could write Perl, and infosec work was as hard to sell back then as forensic analysis is now.
When I write Perl scripts, I (in most cases) try to document the code enough so that someone can at least open the script in Notepad and read the comments to see what the script does. I don't always try for the most elegant solution or the fewest keystrokes, as spelling the steps out not only lets someone see more clearly what was done, but also lets someone else adapt the code: simply comment out the lines in question and modify the script to meet your own needs.
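To make that a bit more concrete, here's a minimal, commented sketch of the kind of one-off Registry script that RegRipper eventually grew out of. It uses the Parse::Win32Registry module (the same module RegRipper itself is built on); the hive path comes from the command line, and the RunMRU key is just one example of the sort of key/value I kept finding myself checking by hand.
#!/usr/bin/perl
# runmru.pl - sketch of a one-off Registry script, the kind of thing that
# eventually turned into RegRipper plugins. Opens an NTUSER.DAT hive with
# Parse::Win32Registry and dumps the RunMRU key (commands typed into the
# Start->Run box), along with the key's LastWrite time.
use strict;
use warnings;
use Parse::Win32Registry;
my $hive = shift || die "Usage: runmru.pl <path to NTUSER.DAT>\n";
my $reg  = Parse::Win32Registry->new($hive)
           or die "Could not open $hive as a Registry hive\n";
my $root = $reg->get_root_key();
# The key of interest; swap in another path (and rename the script) to
# repurpose it...which is pretty much how these one-offs multiplied.
my $path = 'Software\\Microsoft\\Windows\\CurrentVersion\\Explorer\\RunMRU';
if (my $key = $root->get_subkey($path)) {
    print "Key      : $path\n";
    print "LastWrite: ".$key->get_timestamp_as_string()."\n\n";
    foreach my $val ($key->get_list_of_values()) {
        printf "%-10s %s\n", $val->get_name(), $val->get_data();
    }
}
else {
    print "$path not found in this hive.\n";
}
Every one of those early scripts was basically the same open-the-hive, find-the-key, dump-the-values boilerplate with a different key path, which is what eventually argued for pulling the common code into a framework and leaving the key-specific bits as plugins.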
DFF
Speaking of open source tools, one of the tools discussed in DFwOST is the Digital Forensics Framework, of which version 1.1.0 was recently released. This version includes a couple of updates, as well as a bug fix to the ntfs module. I've downloaded it and got it running nicely on a Windows XP system...great work and a huge thanks to the DFF folks for their work. Be sure to check out the DFF blog for some tips on how you can use this open source forensic analysis application.