Untangling chaos: Data in the financial world is “a total mess,” says former regulator Allan Mendelowitz, who supports a data standard that could give banks and regulators a clearer view of cash flow and risk.
By Penny Crosman, Editor-at-large, American Banker
May 24, 2016
Allan Mendelowitz, a former top regulator and current president of the ACTUS Financial Research Foundation, is a man on a mission.
He wants to fix the “total mess” of data standards in the financial world. The chief risk officer at a large bank recently complained to Mendelowitz that all trading data sits on traders’ desks; management and risk officers don’t get to see it until the day is over.
“It’s absolutely absurd,” Mendelowitz said in an interview last week.
Working with a group of data scientists, former regulators, economists and academics that includes Thomas Day, co-managing partner of software firm FinRenaissance, Mendelowitz has helped to create open source software designed to bring order to the way financial data is collected and analyzed.
It’s called ACTUS (an acronym for Algorithmic Contract Types Unified Standards) and the basic idea is this: most of what happens in a bank — mortgages, deposit accounts, checking accounts, credit card agreements, commercial loans, credit default swaps, etc. — boils down to some kind of contract. Party A will pay party B an amount of X under Y and Z terms.
Though these contracts may run hundreds of pages and be written in language that even lawyers might find confusing, most can be rewritten as mathematical equations, the ACTUS group claims. The group has created more than 30 such algorithms that turn legalistic contracts into equations and numbers that computers can read and understand.
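The idea of a contract type as an algorithm can be illustrated with a toy sketch. This is a hypothetical simplification for illustration only, not the actual ACTUS specification: a fixed-rate, fully amortizing loan reduced to a function that deterministically expands the contract’s terms into its cash-flow schedule.

```python
from dataclasses import dataclass

@dataclass
class AnnuityContract:
    """Toy stand-in for an ACTUS-style contract type: a fixed-rate,
    fully amortizing loan. Real ACTUS contract types are far richer."""
    notional: float      # amount party A lends to party B
    annual_rate: float   # fixed annual interest rate
    n_months: int        # term in months

    def cash_flows(self):
        """Expand the contract terms into a deterministic payment schedule."""
        r = self.annual_rate / 12
        # Standard annuity payment formula
        pmt = self.notional * r / (1 - (1 + r) ** -self.n_months)
        balance = self.notional
        schedule = []
        for month in range(1, self.n_months + 1):
            interest = balance * r
            principal = pmt - interest
            balance -= principal
            schedule.append((month, round(pmt, 2), round(interest, 2),
                             round(principal, 2), round(balance, 2)))
        return schedule

# A 30-year, 5% mortgage expressed as data a computer can analyze
loan = AnnuityContract(notional=200_000, annual_rate=0.05, n_months=360)
flows = loan.cash_flows()
assert abs(flows[-1][-1]) < 1.0  # balance amortizes to (roughly) zero
```

Because the schedule is computed rather than buried in legal prose, any downstream system — a risk engine, a liquidity dashboard, a regulator’s aggregator — can consume the same representation.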
When banks use ACTUS and link their day-to-day systems to it, they will get a live view of the status of all their contracts. That should enable them to monitor cash flows, see liquidity positions, identify risk problems and view counterparty risk.
With such a system in place, Jamie Dimon, who in mid-April 2012 dismissed the London Whale’s large credit default swap positions as “a tempest in a teacup,” might have seen exactly how much money the infamous trader had put at risk before he racked up $6 billion in losses.
“The London Whale was subject to risk oversight the way it’s done in the big banks,” Mendelowitz said. “He was given a value at risk or VAR limit, which is the cap on the maximum amount he was permitted to risk losing over a relatively short time period. Quantifying exposure relative to the VAR limit is based on a model. The London Whale exceeded his VAR limit, so he changed his model. There’s no way a chief risk officer can have insight into the implications of such trades without having access to the granular data.”
A View of Systemic Risk
Mendelowitz isn’t just trying to help banks, but also their regulators. Giving an agency like the Office of Financial Research access to live ACTUS information streamed from the largest U.S. institutions would let supervisors see problems before they blow up.
“When the crisis hit in 2008, it became readily apparent no one had a clue what was going on,” he said. Regulators allowed Lehman Brothers to fail, without having any visibility into the effects on its counterparties.
The agencies want to know how banks are connected to one another, Day said.
“How is JPMorgan connected to Bank of America, what’s the network effect? That’s called systemic risk,” he said in an interview. “That’s what the Federal Reserve System is supposed to be measuring and managing. A single bank can fail, but contagion and transmission of the virus, one bank to the next, is what causes a financial crisis.”
For regulators to get a consolidated view of all banks and their trading and lending positions with one another requires the industry-wide adoption of a data standard, Mendelowitz and Day noted.
ACTUS works with an ontology the EDM Council developed, the Financial Industry Business Ontology (FIBO), which acts as the data standard. It harmonizes data across repositories in a common language, much as the Rosetta Stone let people read a message in hieroglyphs by providing the same text in Greek. So, for instance, if one bank calls a customer number field “cust-no” in its databases and another uses “cust#,” both could be mapped to a common data scheme so a computer would recognize they mean the same thing.
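That Rosetta Stone mapping can be sketched as a simple lookup. The bank names, native field names, and canonical terms below are illustrative, not actual FIBO vocabulary:

```python
# Hypothetical per-bank field names mapped to one shared canonical term.
FIELD_MAP = {
    "bank_a": {"cust-no": "customerIdentifier", "bal": "outstandingBalance"},
    "bank_b": {"cust#": "customerIdentifier", "balance_amt": "outstandingBalance"},
}

def harmonize(bank: str, record: dict) -> dict:
    """Rename a bank's native fields to the shared ontology's terms,
    leaving any unmapped fields unchanged."""
    mapping = FIELD_MAP[bank]
    return {mapping.get(k, k): v for k, v in record.items()}

# Two banks, two naming conventions, one harmonized record
a = harmonize("bank_a", {"cust-no": "12345", "bal": 980.50})
b = harmonize("bank_b", {"cust#": "12345", "balance_amt": 980.50})
assert a == b  # a computer now sees that both records mean the same thing
```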
“We provide the standard definition of what’s in the contract and feed it into [ACTUS’s] standard cash flow algorithms,” said Mike Atkin, managing director of the EDM Council.
The contract-level data is needed to do forward-looking analysis, measure how much pressure is building up in the system, and evaluate who is vulnerable, Mendelowitz said. “Unless you have a way of representing the obligations, the financial contracts, in a standard that supports analysis, even with legal entity identifiers, you’re like a bird with one wing and you can’t fly.”
Why a Bank Might Want ACTUS
ACTUS can allow institutions to obtain a view of risky behaviors (as in the London Whale case) and stave off a major loss, embarrassment and/or regulatory scrutiny. A bank could also gain better insight into its own liquidity, cash flow, and income and run what-if analyses.
A bank could use ACTUS to feed more precise data into its market, behavior, and counterparty risk models.
It could also help banks meet regulators’ data requests, for instance for stress tests and for compliance with BCBS 239, the Basel Committee on Banking Supervision’s principles for effective risk data aggregation and risk reporting.
Theoretically, it could reduce banks’ regulatory burden. Mendelowitz said using ACTUS as the standard for granular financial data could eventually reduce the cost of stress testing and the burden of regulatory reporting.
“The stress tests are unbelievably expensive, drawn-out exercises,” he said. “It takes the banks months and hundreds of millions of dollars to do the stress tests. If the feds collected granular balance sheet data in the right data standard on an ongoing basis, you could stress the entire system at will — even daily — at a fraction of the cost incurred now.”
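The claim that granular data makes stressing the system cheap can be sketched in a few lines. The discount model and figures here are deliberately crude assumptions for illustration: once contracts are already expanded into cash flows, applying a rate shock is just arithmetic over the portfolio.

```python
def present_value(cash_flows, annual_rate):
    """Present value of (month, amount) cash flows at a flat annual
    discount rate -- a deliberately crude toy model."""
    m = annual_rate / 12
    return sum(amt / (1 + m) ** month for month, amt in cash_flows)

# Toy portfolio: two contracts already expanded to granular cash flows.
portfolio = [
    [(1, 1000.0), (2, 1000.0), (3, 101000.0)],   # short-dated bond-like contract
    [(6, 50000.0), (12, 52000.0)],               # two bullet payments
]

base = sum(present_value(cf, 0.02) for cf in portfolio)
shocked = sum(present_value(cf, 0.05) for cf in portfolio)  # +300bp rate shock
loss = base - shocked
assert loss > 0  # rising rates reduce the value of fixed future receipts
```

With contract-level data collected on an ongoing basis, rerunning this kind of scenario across the whole system is a cheap recomputation rather than a months-long data-gathering exercise, which is the point Mendelowitz is making.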
Banks might not want regulators peering into every contract they hold. However, Mendelowitz said they don’t really have a choice. “The regulators already have the right to demand anything they want from the banks,” he said. “The problem now is, they’re demanding all sorts of stuff that imposes extremely high costs on the banks and the banks don’t get any benefit from it. They do not use the data and analytics in the regulatory reports to manage their risk or run their banks. By creating this computational infrastructure and relying on an algorithmic representation of the components of the balance sheet, suddenly this gives the banks something they can use to better understand risk, and gives the regulators something they can use to understand risk without imposing excessive reporting costs on the regulated financial institutions.”
Banks could also gain operating efficiencies by fixing data problems. One investment bank executive estimated his company could reduce operating expenses by 20% to 30% by using ACTUS and FIBO to do straight-through processing.
“If you’ve got systems that don’t align with each other, you’re spending a lot of time reconciling and cross-referencing, and that winds up being expensive,” Atkin said.
Implementation of ACTUS throughout a bank should take eight to nine months, Day estimated.
To be sure, there are likely to be challenges, including the fact that the industry is mired in 30- to 40-year-old technology and resistant to change. If bank regulators mandate the use of ACTUS, bankers are likely to complain and regulators may back off.
But the mission makes a lot of sense. People throughout the industry have been talking about common data standards and having a systemic view of risk since before the financial crisis. It would behoove stakeholders to take a close look at both FIBO and ACTUS and consider their usefulness.
Editor at Large Penny Crosman welcomes feedback at firstname.lastname@example.org.