You have to read every single policy. You have to read every single email that comes into and goes out of your business. You have to follow every single chat session that every single employee opens. And you have to correlate them all to find patterns that indicate risks and violations.
Now, that’s a regulatory burden for you. Oh, and the new rules go into effect December 1.
Whether you are the head of a desk who carries supervisory responsibilities, the CCO who needs to make sure that compliance happens across the entire firm, or the IT executive who has to build a smart way to deal with this problem, the task at hand seems an impossible one. The fact is that firms are doing what they can to appear to comply, and regulators are doing what they can to help guide firms. But the complete solution is going to require advanced analytic technology that can handle words, sentiment, entities, relationships, and multisource behavioral profiling. Caveman tools won't fix this space-age problem.
There are three levels of e-communications surveillance that firms are going to deploy over the next year. Only one of them is both strategic and future-proof. Executives have a rare opportunity to reduce cost and complexity, while increasing visibility and accountability. How? By leveraging advanced text analytics that can look across information sources and types. Let’s look at the three stages of building a next-generation e-communications surveillance solution.
Level 1: Lexicon-based reviews
Most firms today have rudimentary systems that do simple character matching against bodies of text found in emails and chat transcripts, looking for indicators of violations such as suspect words and phrases ("call me later," "heat tickets"). Regular expressions, which date to the 1960s, remain the workhorse of this kind of character matching today -- but that really should be considered "Level 0" surveillance.
These systems are helping with compliance, and many firms rely on them considerably today. This is because back in 2007, when FINRA issued its guidance on surveillance compliance, advanced text analytics existed only in the realm of computer science academia. What firms are using now is essentially a cluster-bomb way to catch everything that is suspect. In other words, it flags a bunch of text that shouldn't be suspect (creating false positives), much as a cluster bomb damages everything near it. The problem is that a vast amount of benign communication gets caught in the net along with the potentially problematic messages. That's where it gets expensive. That's where Level 2 surveillance comes in.
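To make the false-positive problem concrete, here is a minimal sketch of lexicon-based screening. The phrase list and the sample messages are invented for illustration; real lexicons run to thousands of entries.

```python
import re

# Hypothetical lexicon of suspect phrases (invented for this sketch).
LEXICON = [r"call me later", r"heat tickets?", r"off the books"]
PATTERN = re.compile("|".join(LEXICON), re.IGNORECASE)

def screen(message: str) -> list:
    """Return every lexicon phrase found in the message."""
    return PATTERN.findall(message)

# A perfectly benign message still trips the trap -- the classic false positive.
print(screen("Busy now, call me later about the birthday cake."))
```

The screen has no notion of who is talking, about what, or why -- it matches characters, so the innocuous birthday message is flagged exactly like a genuinely suspect one.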
Level 2: Advanced text analytics
Level 2 of a next-generation e-communications surveillance solution requires intelligence to be added to the screening process with the goal of reducing false positives. The system needs to do more than match characters. To gain a higher confidence rating on matches, the system must be capable of understanding that certain concepts communicated between two particular individuals within a certain geographical area are significant. That means the system must have advanced text analytics capabilities that support entity extraction (people, places, organizations), key phrase extraction, and neural-network-based sentiment detection. This is easy to map out but difficult to execute at a high level.
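As a toy illustration of the Level 2 idea, the sketch below enriches a message with entity tags and a crude sentiment signal before any alerting decision is made. The gazetteer, the sentiment word list, and all names are invented assumptions; production systems use trained NLP models rather than lookup tables.

```python
# Hypothetical gazetteer of known entities (invented for this sketch).
ENTITIES = {
    "Acme Capital": "ORG",
    "London": "PLACE",
    "J. Smith": "PERSON",
}

# Tiny illustrative negative-sentiment word list.
NEGATIVE = {"worried", "hide", "problem"}

def enrich(message: str) -> dict:
    """Tag known entities and assign a crude sentiment label."""
    found = {name: label for name, label in ENTITIES.items() if name in message}
    tokens = {t.strip(".,").lower() for t in message.split()}
    sentiment = "negative" if tokens & NEGATIVE else "neutral"
    return {"entities": found, "sentiment": sentiment}

result = enrich("J. Smith is worried we need to hide the London position.")
```

Even this toy version shows the shift: instead of a bare character match, the reviewer sees who is involved, where, and with what tone -- the raw material for a confidence score.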
The Level 2 system must also be aware of the context of entities and phrases in relation to each other. The system must have an elegant language to succinctly express and build intelligent and dynamic linguistic traps. This kind of optimization of language analysis is a meaningful increase in the value of the system and will significantly reduce costly manual intervention required to investigate the false positives generated by systems of yesterday. Better, yes -- but good enough? Not by a long shot. The next-generation e-communications solution must support statistical approaches to analyzing the text characteristics that matter most.
Level 3: Statistical analysis
The 250 largest financial firms in the Americas are the ones that truly need to craft a Level 3 system. They have the same requirement as smaller firms: Monitor every single communication and check for policy violations. But the larger firms have such a high volume and complexity of communications and communications platforms that only the most robust technology solution will deliver the desired combination of compliance confidence and cost effectiveness. Some examples of the kinds of statistical approaches supported by the next-generation e-communications solution:
- Concentration – What are the statistically significant occurrences of communications with regard to counterparties?
- Volume – Are there significant deviations from the normal volume of communications? Are there new risky linguistic concepts?
- Profiles – How do the communications vary relative to “peers”?
- Network – Are new counterparties exposing new risks based on the social network of those counterparties?
- Predictive analytics – Over 60 percent of fraud involves collusion. How do you find colluding perpetrators? With a "more like this" style of behavior analysis.
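The volume check above can be sketched with basic statistics: score today's message count against a trader's own history and flag large deviations. The counts, threshold, and function name are assumptions for illustration; a real system would also model seasonality and peer baselines.

```python
import statistics

def volume_z(history, today):
    """Z-score of today's message count against the historical baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return (today - mean) / stdev

# Hypothetical daily counts between one trader and one counterparty.
history = [42, 38, 45, 40, 44, 39, 41, 43]
today = 97

ALERT_THRESHOLD = 3.0  # flag anything more than 3 standard deviations out
z = volume_z(history, today)
if z > ALERT_THRESHOLD:
    print(f"volume alert: z-score {z:.1f}")
```

The same pattern generalizes to the concentration and profile checks: compute a baseline distribution, then surface the statistically significant outliers instead of reviewing everything.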
All this amounts to a high-tech way to evolve surveillance from rudimentary character-matching to screening that is intelligent and risk-aware. Now, if you can connect to a variety of sources (Sharepoint for policies, chat logs from multiple vendors, exchange or other email systems) and give operators the flexibility to create and change parameters, while supporting real-time feed to any system, report, or dashboard, then you have it: the next-generation e-communications surveillance solution.
And it is in development at several firms today.

Julio Gomez is Industry Strategy Partner for Knowledgent, where he leads efforts to help global financial services enterprises solve their most difficult data problems to power game-changing solutions.