Links and Updates
CyberSpeak
Ovie's got a new podcast up, be sure to check it out. In this podcast, Ovie interviews John Goldfoot, lawyer and author of the new paper, The Physical Computer and the 4th Amendment, published in the Berkeley Journal of Criminal Law. The overall idea of the paper is to no longer view the computer as a system of containers, but simply as a container, in itself.
Ovie's podcasts are great listening and very educational. If you're one of those people who has to have a bit of coffee before becoming coherent in the morning, the CyberSpeak podcast is a great accompaniment for your Starbucks.
I especially like John's reference to "the dog that did not bark", a Sherlock Holmes allusion to the idea that the absence of evidence that would be present had something happened suggests that it did not happen. This is very similar to something I've said and published in my books for a long time...the absence of an artifact where you would expect to find one is, itself, an artifact.
Takeaway from the podcast...don't use the S-word!
Anti-Malware Tools of Note
Klaus is back, with a great post pointing out anti-malware tools of note! I've always found Klaus's posts to be chock full of great information about the various tools that he finds, and in many cases, has tried and posted his sort-of-Consumer-Reports review of the tool. I find posts like this to be very useful, particularly as I tend to get a lot of "...we think this computer has/had malware on it; find it..." cases, and as such, I have what I believe to be a comprehensive process, in a checklist, that includes (but is not limited to) running AV scans. That process includes specific steps that require me to check the timeline of the system to determine which AV scanner(s), if any, have already been run on the system. Many times, the system will have a fairly up-to-date scanner (Symantec or McAfee) running (as indicated by Application Event Log records, AV logs, etc.), and in a good number of cases, I've found that an Administrator had also installed and run Spybot Search & Destroy, and maybe even the MalwareBytes Anti-Malware scanner. If none of the tools that have already been used on the system found any threats, then it's incumbent upon the analyst to NOT simply re-use those tools; after all, if scanner A didn't find anything, how much value are you really providing to the customer by running your edition of scanner A against the mounted image and finding no threats?
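One way to start that "which scanners have already run?" check is simply to look for the artifact directories that AV products leave behind on a mounted image. The sketch below is a minimal illustration of the idea; the product-to-path mapping is an example assumption of mine, not a definitive list for any product, and a real process would also parse the Application Event Log and the products' own logs.

```python
import os

# Illustrative (NOT exhaustive or authoritative) locations where some AV
# products leave artifacts under a mounted Windows image; every entry here
# is an example assumption, not a vendor-documented path.
AV_ARTIFACT_HINTS = {
    "Symantec": ("ProgramData", "Symantec"),
    "McAfee": ("ProgramData", "McAfee"),
    "Spybot Search & Destroy": ("ProgramData", "Spybot - Search & Destroy"),
    "MalwareBytes": ("ProgramData", "Malwarebytes"),
}

def find_av_traces(mount_root):
    """Return (product, path) pairs for AV artifact directories found
    under the root of a mounted image."""
    hits = []
    for product, parts in AV_ARTIFACT_HINTS.items():
        path = os.path.join(mount_root, *parts)
        if os.path.isdir(path):
            hits.append((product, path))
    return hits
```

Knowing that, say, an up-to-date McAfee install already scanned the system tells you which scanner NOT to bother running again.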
Compromises
I recently ran across this little gem, which states some pretty interesting "statistics". In this case, I use quotes around that last term, as I'm not entirely sure that something wasn't edited out of the article prior to publication. For example, the biggest issue I see with the article is that I don't understand how you can survey 583 businesses and then extrapolate their responses to "90% of US businesses"; this suggests that the 583 respondents, without any indication of their makeup, are a representative sampling of all US businesses. The article does point to this Ponemon Institute report.
Based on the survey results, there seem to be some interesting, albeit not surprising findings...
"...59% of respondents claimed that the worst consequences of the attacks were theft of information, followed by business disruption in second place."
This is interesting, as more and more organizations seem to be coming to the conclusion that data theft is an issue. It wasn't so long ago that data theft was something of an afterthought for many of the organizations I encountered.
Another interesting quote from the article:
"Employee mobile devices and laptops are believed to be the most likely entry point through which serious attacks are unleashed."
I found this particularly interesting, because as a responder, in most of the cases I've worked, there is very often little done to attempt to determine the "most likely entry point". I'm guessing that terms such as "believed" and "most likely" are intended to convey speculation, rather than assertions supported by fact.
One particular quote stood out from the article; Dr. Larry Ponemon reportedly stated, “conventional network security methods need to improve in order to curtail internal and external threats.”
Now, I realize that this is likely taken out of context, but "conventional network security methods" are likely not those things that security professionals have been harping on for years; these security methods are most likely what the respondents have in place, not what they should have in place. The fact is that many organizations may not have the ability to detect and effectively respond to incidents, which not only makes the threats appear to be extremely sophisticated, but also ends up costing much more in the long run.
The Value of Data
My last blog post, in which I asked, who defines the value of data, generated some interesting comments and discussion. Based on the discussion and further thought, I'm of the opinion that, right or wrong, and due to several factors, the value of data is entirely subjective and most often up to the analyst.
Think about it...how do most analysis engagements start? Ultimately, regardless of where or how the overall engagement originates, an analyst is presented with data (an acquired image, logs, etc.) and a set of requirements or goals. Where did these goals come from? Many times, the "customer" (business organization, investigator, etc.) has stated their goals, from their perspective, with little to no translation or understanding of/expertise in the subject (beyond what they've read in the industry trade journal du jour, that is...). These are then passed to the analyst via a management layer that often similarly lacks expertise, and the analyst is left to discern the goals of the analysis without any interaction or exchange with the "customer" or "end user".
Now, consider the issue of context...who determines the contextual value of the data? Again...the analyst. Isn't it the analyst who needs to know about the data in the first place? And then isn't it the analyst who presents the data to someone else, be it a "customer" or "end user"? If the analyst's training has made them an expert at calculating MFT data runs, but that training didn't address the Windows Registry, then who determines the contextual value of all of the available data if not all of the data is analyzed and/or presented?
No, I'm not busting on analysts, not at all...in fact, I'm not making derogatory remarks about anyone. What I am suggesting is that we take a look at the overall DFIR process and consider the effect that it has on studies and surveys such as the one mentioned earlier in this post. Incidents of all kinds are clearly causing some pretty significant, detrimental effects on organizations, but IMHO and based on my experience over the years, as well as talking with others in the industry, it's pretty clear to me that incidents and compromises are becoming more targeted and focused, and that the battle ultimately ensues between the attacker and the target's management.
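As an aside, for anyone unfamiliar with the "MFT data runs" mentioned above: NTFS stores the cluster mapping for a non-resident attribute as a run list, where each run's header byte carries two nibbles (bytes-of-length in the low nibble, bytes-of-offset in the high nibble) and each offset is a signed, little-endian delta from the previous run. A minimal decoder, based on that published format, might look like this:

```python
def decode_data_runs(raw):
    """Decode an NTFS data run list into (cluster_offset, length) pairs.
    Offsets are absolute cluster numbers; each stored offset is a signed
    little-endian delta from the previous run's offset."""
    runs = []
    pos = 0
    prev_offset = 0
    while pos < len(raw) and raw[pos] != 0x00:  # 0x00 header terminates the list
        header = raw[pos]
        len_size = header & 0x0F   # low nibble: bytes holding the run length
        off_size = header >> 4     # high nibble: bytes holding the offset delta
        pos += 1
        length = int.from_bytes(raw[pos:pos + len_size], "little")
        pos += len_size
        delta = int.from_bytes(raw[pos:pos + off_size], "little", signed=True)
        pos += off_size
        prev_offset += delta       # offsets chain off the previous run
        runs.append((prev_offset, length))
    return runs
```

For example, the run list bytes 21 18 34 56 decode to a single run of 0x18 (24) clusters starting at cluster 0x5634; a subsequent run with a negative delta maps a fragment that sits earlier on disk.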
Early this year, I read and wrote a review for CyberCrime and Espionage, by John Pirc and Will Gragido. Two statements I made in that review are:
What is deemed "adequate and reasonable" security is often decided by those with budgeting concerns/constraints, but with little understanding of the risk or the threat.
Compliance comes down to the auditor versus the attacker, with the target infrastructure as the stage. The attacker is not constrained by a specific compliance "standard"; in fact, the attacker may actually use that "standard" and compliance to it against the infrastructure itself.
I believe that these statements still apply, and do so, in fact, quite well. However, I'd like to combine them into one: a data breach comes down to management versus the attacker. Attackers are neither concerned with nor constrained by budgets. Attackers pry the cover off of technologies such as Windows XP, Windows 7, Windows Media Player, Adobe Reader, etc., and dig deep into internal workings, knowing that their targets don't do the same. Management determines the overall security culture of their organization, as well as the use of resources to implement security. Management also determines whether their organization will focus on protecting their data, or simply meeting the letter of the law with respect to compliance.
Don Weber, a former US Marine, recently tweeted, "In their minds #LULZSEC ~== WWII French Resistance". That's very appropriate...when the Germans invaded France, they didn't know the lay of the land the way the resistance fighters did; sure, they could navigate from point to point with a compass, and move in force with tanks and troops, but the resistance fighters knew the land, and knew how to move from point to point in a stealthy manner, remaining out of sight by using the terrain to their advantage.
Tool Update
John the Ripper, a password cracker for Unix, Windows, DOS, BeOS, and OpenVMS, has seen an update intended to make it a bit faster with respect to cracking passwords. There's a great deal of information available in the Openwall wiki for the tool.
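For anyone new to this class of tool, the core of wordlist-mode cracking is conceptually simple: hash each candidate word and compare it to the target hash. The toy sketch below illustrates only that concept; the function name and the choice of MD5 are my own, and John the Ripper itself goes far beyond this, with word-mangling rules, incremental mode, heavily optimized hash code, and support for many hash formats.

```python
import hashlib

def dictionary_attack(target_hash, wordlist, algo="md5"):
    """Toy wordlist-mode cracker: hash each candidate and compare it to
    target_hash (a hex digest). Returns the matching plaintext, or None.
    Purely illustrative; real crackers add mangling rules and speed."""
    for word in wordlist:
        if hashlib.new(algo, word.encode()).hexdigest() == target_hash:
            return word
    return None
```

The speed improvements in the JtR update matter precisely because this inner loop runs millions of times per second in a real cracking session.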
Selective Imaging
I ran across a reference to this thesis paper from Johannes Stüttgen via Twitter this morning. The paper is well worth reading, and very informative, but at the same time, I can say from experience that there are a LOT of responders and analysts out there (myself included) who have already gone through the same sort of thought process and come up with our own methods for addressing the same issues raised by the author. Now, that is not to say that the paper doesn't have value...not at all. I think that after having read through the paper, it's going to provide something to everyone; for example, it's going to present a thought and exploration process to the new analyst who doesn't have confidence in his or her own abilities and experience. We often work in isolation, on our teams, and may not have someone more experienced to "bounce" things off of, and if you don't feel that you can reach out to someone else in the community, this paper is a great resource. I also think that the paper may open the eyes of some more experienced analysts, providing insights into things that may not have been considered during the development of their own processes.
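To make the selective-imaging idea concrete, here's a minimal sketch of one possible approach: walk a mounted image, copy only files matching some criteria, and record a hash for each so the selection is verifiable later. This is my own illustration, not the method from the paper; the extension filter is a placeholder for whatever criteria (paths, timestamps, signatures) an analyst's own process would use.

```python
import hashlib
import os
import shutil

def selective_acquire(src_root, dest_root, extensions=(".evtx", ".pf", ".log")):
    """Copy only files matching the given extensions from a mounted image,
    recording an MD5 for each so the selection can be verified later.
    Returns a manifest of (relative_path, md5_hexdigest) tuples."""
    manifest = []
    for dirpath, _, filenames in os.walk(src_root):
        for name in filenames:
            if not name.lower().endswith(extensions):
                continue  # outside the selection criteria
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, src_root)
            dest = os.path.join(dest_root, rel)
            os.makedirs(os.path.dirname(dest) or dest_root, exist_ok=True)
            shutil.copy2(src, dest)  # preserves timestamps where possible
            with open(src, "rb") as f:
                digest = hashlib.md5(f.read()).hexdigest()
            manifest.append((rel, digest))
    return manifest
```

The manifest is the important part: whatever subset you acquire, you need to be able to document and verify exactly what was (and was not) collected.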