Mining MSRC analysis for forensic info
Anyone who's followed this blog for a while is familiar with...call them my "rants"...against AV vendors and the information they post about malware; specifically, AV vendors and malware analysts have, right in front of them, information that is extremely useful to incident responders and forensic analysts, but they do not release or share it, because they do not recognize its value. This could be due to the AV mindset, or it could be due to their business model (the more I think about that, the more it sounds like a chicken-egg discussion...).
When I was on the IBM ISS ERS team, we did a good deal of malware response. In several instances, team members were on-site with an AV vendor rep, whose focus was to get a copy of the malware to his RE team so that an updated signature file could be provided to the customer. However, in the time it takes to get all this done, the customer is hemorrhaging...systems are getting infected and re-infected, data may be flooding off of the infrastructure, etc. Relying on known malware characteristics, our team members were able to assist in stemming the tide and getting the customer on the road to recovery, even in the face of polymorphic malware.
What I find useful sometimes is to look at malware write-ups from several sites, and search across the 'net (via Google) to see what others may be saying about either the malware or specific artifacts.
I watched this video recently, in which Bruce Dang of Microsoft's MSRC talked about analyzing StuxNet to figure out what it did/does. The video is of a conference presentation, and I'd have to say that if you can get past Bruce saying "what the f*ck" way too many times, there's some really good information that he discusses, not just for malware RE folks, but also for forensic analysts. Here are some things I came away with after watching the video:
Real analysis involves symbiotic relationships. I've found this to be very true in some of the analysis I've done. I have worked very closely with our own RE guy, giving him copies of the malware, dependency files (i.e., DLLs), and information such as paths, Registry keys, etc. In return, I've received unique strings, domain names, etc., which I've rolled back into iterative analysis. As such, we've been able to develop analysis that is much greater than the sum of its parts. This is also a good reason to keep a copy of Windows Internals on your bookshelf, and a copy of Malware Analyst's Cookbook within easy reach.
Malware may behave differently based on the ecosystem. I've seen a number of times where malware behaves differently based on the ecosystem it infects. For example, Zeus takes different steps depending on whether the infected user has Administrator rights or not. I've seen other malware infections be greatly hampered by the fact that the user who got infected was a regular user and not an admin...indicating that the variant does not have a mechanism for checking for and handling different privilege levels. Based on what Bruce discussed in his presentation, StuxNet takes different steps depending upon the version of Windows (i.e., XP vs. Vista+) that it's running on.
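The privilege check that drives this kind of branching is trivial to express. As a rough illustration (my own sketch, not any particular sample's code), a process can determine whether it's running elevated like so:

```python
import os
import sys

def running_elevated():
    """Return True if the current process has admin/root privileges.

    Illustrates the kind of check malware performs before choosing an
    infection path. The Windows branch uses the shell32 helper; the
    POSIX branch checks the effective UID.
    """
    if sys.platform == "win32":
        import ctypes
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    return os.geteuid() == 0
```

A variant that skips this check entirely just assumes admin rights, which is exactly why such samples stall out on a regular-user account.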
Task Scheduler. I hear the question all the time, "what's different in Windows 7, as compared to XP?" Well, this seems to be a never-ending list. Oy. Vista systems (and above) use Task Scheduler 2.0, which is different from the version that runs on XP/Windows 2003 in a number of ways. For example, TS 1.0 .job files are binary, whereas TS 2.0 files are XML based. Also, according to Bruce's presentation, when a task is created, a checksum for the task .job file is computed and stored in the Registry. Before the task is run, the checksum is recalculated and compared to the stored value, to check for corruption. Bruce stated that when StuxNet hit, the hash algorithm used was CRC32, and that generating collisions for this algorithm is relatively easy...because that's part of what StuxNet does. Bruce mentioned that the algorithm has since been updated to SHA-256.
The Registry key in question is:
HKLM\Software\Microsoft\Windows NT\CurrentVersion\Schedule\TaskCache
A lot more research needs to be done regarding how forensic analysts (and incident responders) can parse and use the information in this key, and in its subkeys and values.
MOF files. Bruce mentioned in his presentation that Windows has a thread that continually polls the system32\wbem\mof directory looking for new files, and when it finds one, processes it. In short, MOF files are compiled scripts, and StuxNet used such a file to launch an executable; drop the file in as Guest, and the executable referenced in the file gets run as System.
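As a toy model of that behavior (my own sketch, not Windows code), the polling pattern is simple: each pass over a drop directory acts on any file it hasn't seen before, regardless of who put it there:

```python
import os

def poll_for_new_files(directory, seen, handler):
    """One polling pass over a drop directory.

    Invokes handler on any file not already in the `seen` set, then
    records it. A privileged component doing this on a world-writable
    directory is exactly the privilege-escalation shape described above:
    a low-privilege user drops a file, a high-privilege thread acts on it.
    """
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and name not in seen:
            seen.add(name)
            handler(path)
```

From a forensic standpoint, this also means file creation times in that directory can be meaningful timeline artifacts.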
Management needs actionable information. This is true in a number of situations, not just the kind of analysis work that Bruce was performing; it applies to IR and DF tasks as well. Sure, analysts can find a lot of "neat" and extremely technical stuff, but the hard part...and what we're paid to do...is to translate that wealth of technical information into actionable intelligence that the customer can use to make decisions within their environment. What good does it do a customer if you pile a 70-page report on them, expecting them to sift through it for data and figure out how to use it? I've actually seen analysts write reports, and when I've asked about the significance or usefulness of specific items, been told, "...they can Google it." Uh...no. So, through experience, Bruce's point is well-taken...analysts sift through all of the data to produce the nuggets, then filter those to produce actionable intelligence that someone else can use to make decisions.
A final thought, not based specifically on the video...it helps forensic analysts and incident responders to engage sources that are ancillary to their field, and not directly related specifically to what we do every day. This helps us to see the forest for the trees, as it were...