The system of the present invention is a behavioral analysis and response engine consisting of, but not limited to, seven subparts: a log collector and personally identifiable information tokenizer, a rules database, a threat profiling engine, a dynamic network analysis engine, a micro simulation/threat analysis engine, a decision processor, and a communication engine.
The system's methods emulate the decision-making ability of a human investigator to predict future threats. The system uses a graph-based approach via semantic networks to compare event inputs across multiple logging channels against threat profiles, and applies game theory to predict future outcomes. The system follows the structure of a traditional lexical knowledge base, with nodes representing threat descriptors and edges representing the semantic relations between them. The rules are based on known event types and are assigned associated weights. The inference engine runs micro-simulations based on forward chaining of events, in which satisfied antecedents fire and assert their consequents. The knowledge base that the system relies on is the rules database. The rules are applied to the threat descriptors in the knowledge base to deduce the likelihood that a collection of threat descriptors, fired in the right sequence, will produce a future loss event.
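The forward-chaining micro-simulation described above can be sketched in outline as follows. This is a minimal illustration only; the descriptor names, rules, and weights are hypothetical placeholders rather than contents of the actual rules database, and the real engine's weighting and sequencing logic is more involved.

```python
# Minimal sketch of forward chaining over weighted threat-descriptor rules.
# All descriptor names and weights below are illustrative assumptions.

from dataclasses import dataclass


@dataclass(frozen=True)
class Rule:
    antecedents: frozenset  # threat descriptors that must all be asserted
    consequent: str         # descriptor asserted when the rule fires
    weight: float           # contribution toward the loss-event likelihood


def forward_chain(facts, rules):
    """Fire rules whose antecedents are satisfied until a fixpoint is reached.

    Returns the full set of asserted descriptors and an accumulated score
    clamped to [0, 1] as a rough likelihood of a future loss event.
    """
    asserted = set(facts)
    score = 0.0
    fired = set()
    changed = True
    while changed:
        changed = False
        for rule in rules:
            if rule not in fired and rule.antecedents <= asserted:
                asserted.add(rule.consequent)
                score += rule.weight
                fired.add(rule)
                changed = True
    return asserted, min(score, 1.0)


# Hypothetical rules: a termination notice plus an email spike chains
# forward to a data-exfiltration risk descriptor.
rules = [
    Rule(frozenset({"hr_termination_notice"}), "disgruntled_actor", 0.3),
    Rule(frozenset({"disgruntled_actor", "mass_email_spike"}),
         "data_exfiltration_risk", 0.5),
]

facts = {"hr_termination_notice", "mass_email_spike"}
descriptors, likelihood = forward_chain(facts, rules)
print(descriptors, likelihood)  # likelihood 0.8 under these example weights
```

Note how the second rule can only fire after the first asserts its consequent: the sequence of firings, not any single event, is what raises the likelihood.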
The multi-channel distributed behavioral analysis architecture of the present invention provides a software solution to the major operational challenges faced in providing an early warning system for impending cyber security events. Most cyber security events are premeditated. However, many current cyber security defense technologies address only the real-time detection of a software vulnerability, the presence of malware (known or unknown “zero day”), anomalies from pre-established data points, or the signature of an active security event. The system and method of the invention described herein introduce a technique that provides data collection, assessment, and alerting capability prior to the occurrence of an event, based on threat actor behavior.
The system and method described herein automate many aspects of the process a human investigator would use to collect independent data points in order to determine the likelihood of a future event, and provide an improved means of analyzing the collected data.
Neighborhood Watch Analogy
A simple analogy that describes a basic concept of the system is an automated “neighborhood watch,” with improvements to that basic concept. In a neighborhood watch, homeowners observe suspicious activity, such as an unfamiliar person walking around the neighborhood. If they see an adult male walk by a neighbor's home and attempt to peer into the windows at 6 am each day for several days, they are likely to call the neighbor to alert them and/or call the police. The person may not have taken any action that would set off an alarm (i.e., they are peering from the driveway and have not yet attempted to come onto the property), but their current behavior gives a strong indication of future behavior.
The system described herein automates this manual process and provides an added advantage in that it can alert on the impending event before it occurs. This is a further advantage over current solutions, which operate more like the alarm system on the house: they alert only after the person has taken an action to break into the house.
The system of the present invention also has the ability to detect a significant change in the probability and type of event. Now, take the same scenario and add the element of a husband who leaves for work at 5:30 am and a wife who is home alone, and the adult male walks by and takes a picture of the wife, who is visible from the kitchen window in her bathrobe making breakfast each morning at 6 am. The likely outcome based on the new event data has changed from a potential robbery to robbery and/or assault.
The system described herein also updates the likelihood and type of event that may occur as new data is reported by the data collection engine.
FBI Investigator Analogy and Velocity of Events (Temporal Data) as an Indicator of How Soon a Future Event Is Likely to Occur
Another analogy is the process an investigator may use to profile a potential terrorist and determine the point at which enough of the right type of data has been collected, and the timeframe in which this data was generated [temporal], in order to make a decision on taking action [Colin Powell's 40-70% axiom is an example of the first part of this]. The investigator may “collect the dots” by gathering data points on purchasing patterns, financial records, phone calls [to whom and where they are calling], travel history, and past criminal records. They then “connect the dots” by looking for relationships between the data points. For example, if the person has a history of communication with known terrorist groups and has recently traveled to a known area where terrorists congregate, then the investigator may begin to pay close attention to that person's next set of activities.
The temporal data, in addition to the behavior data, has a significant effect on when the investigator may feel the need to step in and take action. The data collected above (a history of communication and recent travel to a terrorist location) may not prompt the investigator to detain the suspect. But if, within one week of return, the suspect received a $10,000 wire into his bank account, purchased one (1) ton of fertilizer, nitroglycerin, and detonator cord, bought a disposable cell phone, and rented a moving truck, then the investigator will most likely step in immediately and detain the suspect for questioning rather than wait for the next likely event to occur.
The system described herein also uses temporal data in an improved manner to determine not only what the likelihood of an event may be but when it may occur.
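One simple way the temporal clustering of events could inform an estimate of how soon an event may occur is sketched below. The window size, timestamps, and scoring heuristic are hypothetical assumptions for illustration; the system's actual temporal analysis is not limited to this heuristic.

```python
# Hedged sketch: scoring the "velocity" of correlated events. A burst of
# events inside a short recent window suggests the future event is imminent,
# whereas the same events spread over months do not. Window size and
# timestamps are illustrative placeholders.

from datetime import datetime, timedelta


def imminence_score(event_times, window=timedelta(days=7)):
    """Fraction of observed events falling inside the most recent window.

    Returns a value in [0, 1]; higher means the events are clustered
    close to the latest observation, i.e., higher velocity.
    """
    if not event_times:
        return 0.0
    latest = max(event_times)
    recent = [t for t in event_times if latest - t <= window]
    return len(recent) / len(event_times)


# Same four events, spread over months vs. clustered in a single week.
slow = [datetime(2023, m, 1) for m in (1, 3, 5, 7)]
burst = [datetime(2023, 7, d) for d in (1, 2, 3, 4)]

print(imminence_score(slow))   # 0.25: only the latest event is "recent"
print(imminence_score(burst))  # 1.0: all events inside the window
```

In the investigator analogy above, the wire transfer, fertilizer purchase, and truck rental occurring within one week of return would score like the burst case, prompting immediate action.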
It is important to note that the system and method described herein differ distinctly from anomaly-based behavioral detection systems in that this system is based on threat storylines and actor profiles rather than on detecting variance from predetermined data points. For example, an anomaly-based detection system will report a significant change in the number of emails sent in a given day by an actor, which may or may not indicate a true threat (is it spam, a wedding/birthday invitation, or a holiday greeting?); this can lead to numerous false positives and gives no information regarding the actor's intent. The system described herein would not fire an alert on this behavior unless it was correlated with other events indicating that the activity was a true threat, such as a prior email from HR indicating that the employee was terminated. The system can analyze the data against threat scenarios to determine whether the actor is simply updating their contact list with new information or may be attempting to send out sensitive or derogatory information about the entity as a parting shot. The system can surmise that it would be unlikely for the employee to send out a positive mass communication after a termination (other than an address update), whereas an anomaly detection system would not know the difference between the two scenarios.
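The contrast between anomaly-based alerting and storyline-based correlation can be sketched as follows. The event names, baseline counts, and z-score threshold are illustrative assumptions, not part of the invention's actual rules database.

```python
# Hedged sketch: an anomaly detector flags any large deviation from baseline
# (spam, invitations, and holiday greetings included), while a storyline-based
# check alerts only when the deviation is correlated with a qualifying prior
# event. Names and thresholds are illustrative assumptions.

from statistics import mean, stdev


def anomaly_alert(daily_counts, today, threshold=3.0):
    """Classic anomaly detection: alert on a large z-score deviation from
    the historical baseline, with no notion of the actor's intent."""
    mu, sigma = mean(daily_counts), stdev(daily_counts)
    return (today - mu) / sigma > threshold


def storyline_alert(email_spike, correlated_events):
    """Storyline-based check: an email spike alone is ignored; it alerts only
    when correlated with a qualifying prior event, e.g. a termination notice
    from HR (a hypothetical descriptor name)."""
    return email_spike and "hr_termination_notice" in correlated_events


baseline = [10, 12, 9, 11, 10, 13, 11]          # typical daily email counts
spike = anomaly_alert(baseline, today=200)       # True: any spike is flagged
alone = storyline_alert(spike, set())            # False: spike alone ignored
correlated = storyline_alert(spike, {"hr_termination_notice"})  # True
print(spike, alone, correlated)
```

The difference in false-positive behavior is visible directly: the anomaly detector fires on the spike regardless of cause, while the storyline check stays silent until the spike is tied to an event that gives it threatening intent.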