Resources, Link Mashup
Monitoring
MS's Sysmon was recently updated to version 3.2, adding the capture of opens for raw read access to disks and volumes. If you're interested in monitoring your infrastructure and performing threat hunting at all, I'd highly recommend that you consider installing something like this on your systems. While Sysmon is not nearly as fully featured as something like Carbon Black, employing Sysmon along with centralized log collection and filtering will provide a level of visibility that you likely hadn't previously imagined possible.
This page talks about using Sysmon and NXLog.
The fine analysts of the Dell SecureWorks CTU-SO recently had an article posted that describes what the bad guys like to do with Windows Event Logs, and both of the case studies could be "caught" with the right instrumentation in place. You can also use process creation monitoring (via Sysmon, or some other means) to detect when an intruder is living off the land within your environment.
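One way to put process creation monitoring to work against "living off the land" is to watch for parent/child process pairings that shouldn't occur in normal use. A minimal sketch in Python, assuming you've already got process creation records (say, Sysmon Event ID 1) extracted to parent/child image paths; the pairs listed are illustrative examples, not a complete rule set:

```python
import ntpath

# Illustrative parent/child pairings that rarely occur legitimately;
# a real rule set would be tuned to your environment.
SUSPICIOUS_PAIRS = {
    ("winword.exe", "powershell.exe"),
    ("winword.exe", "cmd.exe"),
    ("outlook.exe", "wscript.exe"),
}

def is_suspicious(parent_image: str, child_image: str) -> bool:
    """Return True if the parent/child image pair matches a known-bad pattern."""
    parent = ntpath.basename(parent_image).lower()
    child = ntpath.basename(child_image).lower()
    return (parent, child) in SUSPICIOUS_PAIRS

if __name__ == "__main__":
    print(is_suspicious(r"C:\Program Files\Microsoft Office\winword.exe",
                        r"C:\Windows\System32\cmd.exe"))  # True
```

The point isn't the specific pairs; it's that once you have the telemetry centralized, checks like this are trivial to run at scale.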
The key to effective monitoring and subsequent threat hunting is visibility, which is achieved through telemetry and instrumentation. How are bad guys able to persist within an infrastructure for a year or more without being detected? It's not that they aren't doing stuff, it's that they're doing stuff that isn't detected due to a lack of visibility.
MS KB article 3004375 outlines how to improve Windows command-line auditing, and this post from LogRhythm discusses how to enable PowerShell command-line logging (another post discussing the same thing is here). The MS KB article gives you some basic information regarding process creation, and Sysmon provides much more insight. Regardless of which option you choose, however, all are useless unless you're doing some sort of centralized log collection and filtering, so be sure to incorporate the necessary and appropriate logs into your SIEM, and get those filters written.
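Once command-line auditing is enabled and records are flowing, you still have to pull the interesting fields out of them. Here's a minimal sketch of extracting the process name and command line from a Security event exported as XML (e.g., a 4688 record saved from Event Viewer); the sample record is hand-built for illustration, and the namespace URI is the one Windows Event Log XML normally carries:

```python
import xml.etree.ElementTree as ET

# Default namespace used by Windows Event Log XML exports.
NS = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}

def extract_process_fields(event_xml: str) -> dict:
    """Return the EventData Name/value pairs from an exported event record."""
    root = ET.fromstring(event_xml)
    return {data.get("Name"): data.text
            for data in root.findall(".//e:EventData/e:Data", NS)}

# Hand-built sample of a process-creation record for illustration.
SAMPLE = """<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <EventData>
    <Data Name="NewProcessName">C:\\Windows\\System32\\cmd.exe</Data>
    <Data Name="CommandLine">cmd.exe /c whoami</Data>
  </EventData>
</Event>"""

fields = extract_process_fields(SAMPLE)
print(fields["CommandLine"])
```

Feed something like this from your collection pipeline into your filters, and the command lines become searchable rather than buried.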
Windows Event Logs
Speaking of Windows Event Logs, sometimes it can be very difficult to find information regarding various event source/ID pairs. Microsoft has a great deal of information available regarding Windows Event Log records, and I very often can easily find the pages with a quick Google search. For example, I recently found this page on Firewall Rule Processing events, based on a question I saw in an online forum.
From Deus Ex Machina, you can look up a wide range of Windows Event Log records here or here. I've found both to be very useful, and I've used these sites more than once to get information about *.evtx records that I couldn't find anyplace else.
Another source of information about Windows Event Log records and how they can be used can often be one of the TechNet blogs. For example, here's a really good blog post from Jessica Payne regarding tracking lateral movement...
With respect to the Windows Event Logs, I've been looking at ways to increase instrumentation on Windows systems, and something I would recommend is putting triggers in place for various activities, and writing a record to the Windows Event Log. I found this blog post recently that discusses using PowerShell to write to the Windows Event Log, so whatever you trap or trigger on a system can launch the appropriate command or run a batch file that contains the command. Of course, in a networked environment, I'd highly recommend a SIEM be set up, as well.
One thought regarding filtering and analyzing Windows Event Log records sent to a SIEM: when looking at various Windows Event Log records, we have to look at them in the context of the system, rather than in isolation, as what they actually refer to can be very different. A suspicious record related to WMI, for example, when viewed in isolation, may turn out to be part of known and documented activity when viewed in the context of the system.
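That context-over-isolation idea can be expressed very simply in a filter. A minimal sketch, where the per-host baseline of documented activity (hostnames and descriptions here are hypothetical) decides whether a record is expected or needs analyst review:

```python
# known_good maps a hostname to the set of documented activity for that
# system; the contents below are hypothetical examples.
def triage(host: str, description: str, known_good: dict) -> str:
    """Classify a record in the context of its host, not in isolation."""
    baseline = known_good.get(host, set())
    return "documented" if description in baseline else "review"

KNOWN_GOOD = {
    "SQLSRV01": {"WMI: SCCM inventory query"},
}

# The same WMI record means different things on different systems.
print(triage("SQLSRV01", "WMI: SCCM inventory query", KNOWN_GOOD))  # documented
print(triage("WKSTN07", "WMI: SCCM inventory query", KNOWN_GOOD))   # review
```

The baseline itself is the hard part; the filter is easy once the documentation exists.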
Analysis
PoorBillionaire recently released a Windows Prefetch Parser, which is reportedly capable of handling *.pf files from XP systems all the way up through Windows 10 systems. On 19 Jan, Eric Zimmerman did the same, making his own Prefetch parser available.
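Even before running a full parser, you can get quick wins just from the Prefetch file naming convention, which embeds the executable name and an eight-hex-digit path hash. A minimal sketch (this parses only the file name, not the *.pf contents the tools above handle):

```python
import re

# Prefetch files follow the convention <EXENAME>-<8 hex digit hash>.pf,
# e.g. CALC.EXE-0FE8F3A9.pf; the hash is derived from the executable's path.
PF_RE = re.compile(r"^(?P<exe>.+)-(?P<hash>[0-9A-F]{8})\.pf$", re.IGNORECASE)

def parse_pf_name(filename: str):
    """Return (executable name, path hash) from a *.pf file name, or None."""
    m = PF_RE.match(filename)
    if not m:
        return None
    return m.group("exe"), m.group("hash").upper()

print(parse_pf_name("CALC.EXE-0FE8F3A9.pf"))  # ('CALC.EXE', '0FE8F3A9')
```

Two *.pf files with the same executable name but different hashes, for example, tell you the same-named binary ran from two different paths, which is worth a closer look.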
Having tools available is great, but what we really need to do is talk about how those tools can be used most effectively as part of our analysis. There's no single correct way to use a tool; the real issue becomes, how do you correctly interpret the data once you have it?
I recently encountered a "tale of two analysts", where both had access to the same data. One analyst did not parse the ShimCache data at all as part of their analysis, while the other did and misinterpreted the information that the tool (whichever one that was) displayed for them.
So, my point is that having tools to parse data is great, but if the focus is tools and parsing data, but not analyzing and correctly interpreting the data, what have the tools really gotten us?
Creating a Timeline
I was browsing around recently and ran across an older blog post (yeah, I know it's like 18 months old...), and in the very beginning of that post, something caught my eye. Specifically, a couple of quotes from the blog post:
...my reasons for carrying this out after the filesystem timeline is purely down to the time it takes to process.
...and...
The problem with it though is the sheer amount of information it can contain! It is very important when working with a super timeline to have a pivot point to allow you to narrow down the time frame you are interested in.
The post also states that timeline analysis is an extremely powerful tool, and I agree, 100%. What I would offer to analysts is a more deliberate approach to timeline analysis, based on what Chris Pogue coined as Sniper Forensics.
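The pivot-point approach from those quotes maps directly to code. A minimal sketch of narrowing a super timeline to a window around a pivot; the event tuples and window sizes are illustrative, not from any particular tool:

```python
from datetime import datetime, timedelta

def pivot_window(events, pivot, before=timedelta(minutes=5),
                 after=timedelta(minutes=5)):
    """Return only the events within [pivot - before, pivot + after]."""
    lo, hi = pivot - before, pivot + after
    return [e for e in sorted(events) if lo <= e[0] <= hi]

# Illustrative super-timeline entries: (timestamp, source, description).
timeline = [
    (datetime(2016, 1, 20, 10, 0), "MFT", "malware.exe created"),
    (datetime(2016, 1, 20, 10, 2), "EVTX", "4688: malware.exe launched"),
    (datetime(2016, 1, 20, 14, 30), "EVTX", "unrelated logon"),
]

# Pivot on the time of the initial indicator and look 5 minutes either way.
for event in pivot_window(timeline, datetime(2016, 1, 20, 10, 1)):
    print(event)
```

This is the "sniper" part: you don't read the whole timeline, you pivot on an indicator and examine the surrounding activity.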
Speaking of analysis, the folks at RSA released a really good look at analyzing carrier files used during a phish. The post provides a pretty thorough walk-through of the tool and techniques used to parse through an old (or should I say, "OLE") style MS Word document to identify and analyze embedded macros.
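A quick first step when triaging suspected carrier files like those is simply checking the file signature; "old style" Word documents are OLE compound files with a well-known 8-byte magic number. A minimal sketch (this only identifies the container format; macro extraction is what the RSA post and tools like oledump cover):

```python
# OLE compound file (structured storage) signature, as used by the
# legacy binary *.doc format that can carry VBA macros.
OLE_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"

def is_ole(header: bytes) -> bool:
    """True if the first 8 bytes match the OLE compound-file signature."""
    return header[:8] == OLE_MAGIC

# In practice you'd pass open(path, "rb").read(8); bytes shown for clarity.
print(is_ole(b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1" + b"..."))  # True
print(is_ole(b"PK\x03\x04"))                                 # False (ZIP/OOXML)
```

A *.doc that doesn't carry this signature (or a *.docx that does) is itself an anomaly worth noting.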
PowerShell
Not long ago, I ran across an interesting artifact...a folder with the following name:
C:\Users\user\AppData\Local\Microsoft\Windows\PowerShell\CommandAnalysis\
The folder contained an index file, and a bunch of files with names that follow the format "PowerShell_AnalysisCacheEntry_GUID". Doing some research into this, I ran across this BoyWonder blog post, which seems to indicate that this is a cache (yeah, okay, that's in the name, I get it...), and possibly used for functionality similar to auto-complete. It doesn't appear to illustrate what was run, though. For that, you might want to see the LogRhythm link earlier in this post.
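If you want to sweep for these cache entries across systems, matching the naming convention is straightforward. A minimal sketch, assuming the GUID portion follows the standard 8-4-4-4-12 hex format (an assumption on my part, based on the file names observed):

```python
import re

# Standard 8-4-4-4-12 GUID format; assumed for the cache entry file names.
GUID = r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"
CACHE_RE = re.compile(rf"^PowerShell_AnalysisCacheEntry_(?P<guid>{GUID})$",
                      re.IGNORECASE)

def cache_entry_guid(filename: str):
    """Return the GUID from a CommandAnalysis cache entry file name, or None."""
    m = CACHE_RE.match(filename)
    return m.group("guid") if m else None

print(cache_entry_guid(
    "PowerShell_AnalysisCacheEntry_0f1e2d3c-4b5a-6978-8a9b-0c1d2e3f4a5b"))
```

The presence of the entries alone doesn't tell you what was typed, per the above, but their timestamps can still help place PowerShell use on a system.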
As it turned out, the folder path I listed above was part of legitimate activity performed by an administrator.
Posted by 0x000216 on Wednesday, January 20, 2016