For the first time, the trifecta of CCAR, “free” Floating Point arithmetic, and parallel Numerical Optimization makes it feasible to automate implementation of the Firm’s capital allocation plan at the Accrual Portfolio security level. Let’s cover the background of the capital allocation plan, then the elements of the trifecta and how they change things.
Banks can hold, in aggregate, millions of contractual positions or securities in their Accrual Portfolio, representing deposits, commercial loans, consumer loans, and other tradable securities. These positions can be originated, held, and managed by thousands of regional branches distributed around the country or around the world. Due to the computational limitations of current banking systems, banks cannot rapidly simulate the Accrual Portfolio’s balance fluctuations, rates of return, and default charges at the position/security/contract level; a single worst-case risk scenario can take a couple of hours to execute on a large multiprocessor box or grid. Newer competitive systems can simulate the Accrual Portfolio at the security/contract level on a single core in a couple of seconds.
In the Bank’s view, each Accrual Portfolio position belongs to one of two sets, Assets or Liabilities. The Liabilities, including deposits and Fed Funds, provide the cash needed to fund the Assets, including credit cards and mortgages. Assets generally throw off 5-6x more cash than the Liabilities consume: a customer paying 3% on his $1000 credit card line throws off more cash to the Bank than the 0.51% rate the Bank pays on a $1000 deposit account consumes. The Bank Treasury manages the cash flows from Assets to Liabilities, maintains the Balance Sheet of Assets and Liabilities to make sure all assets are funded, and implements the Bank’s discretionary capital allocation plan with the excess Asset cash flow, i.e., the cash left over after meeting Liability contractual obligations and regulatory reserve requirements. If you know accurately when excess cash is available, you are free to use it to purchase new assets according to the discretionary capital allocation plan. If the Bank purchases a new asset before the excess cash is available, the Treasury must fund some of the position from the money market. Funding from the money market is slightly more expensive than funding with free cash, and thus drives down the Bank’s Net Interest Margin. If the Bank waits too long to purchase a new asset, the realized Net Interest Margin will be somewhat smaller than it could have been in theory.
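The funding-timing trade-off above can be made concrete with a toy calculation. The rates below reuse the 3%/0.51% example from the text; the 0.75% money-market rate is an invented illustration, not a quoted figure.

```python
# Toy illustration of why funding timing matters: an asset funded with
# free excess cash earns the full spread, while the same asset funded
# from the money market earns the spread minus the funding premium,
# dragging down Net Interest Margin. All rates are hypothetical.

def annual_net_interest(asset_balance, asset_rate, funding_balance, funding_rate):
    """Annual cash thrown off by an asset minus cash consumed by its funding."""
    return asset_balance * asset_rate - funding_balance * funding_rate

# Case 1: $1000 credit card line at 3% funded by a $1000 deposit at 0.51%.
free_cash_case = annual_net_interest(1000, 0.03, 1000, 0.0051)

# Case 2: same asset, funded from the money market at a hypothetical 0.75%
# because the purchase preceded the arrival of the excess cash.
money_market_case = annual_net_interest(1000, 0.03, 1000, 0.0075)

print(free_cash_case)     # 24.9
print(money_market_case)  # 22.5
```

The 2.4-dollar gap per $1000 of balance is exactly the NIM drag the Treasury tries to avoid by timing purchases to the arrival of free cash.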
You can see how the NIMo problem decomposes into:
1.) A computationally intensive daily Balance Sheet simulation of millions of securities
2.) A less computationally demanding numerical optimization of what and when new investments are to be made, and
3.) Embedding this optimization computation in a control loop that accepts feedback from anywhere in the capital allocation process, from regional branches exceeding, missing, or delaying their new business quotas to market shocks such as a CNY devaluation or the Swiss National Bank abandoning the EUR peg.
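The three-part decomposition above can be sketched as a minimal control loop. The functions `simulate_balance_sheet`, `optimize_allocations`, and `read_feedback` are hypothetical stand-ins for the three components, not an existing API.

```python
# Minimal sketch of the NIMo decomposition: daily simulation, allocation
# optimization, and a feedback-driven control loop. All names and numbers
# are invented placeholders.

def simulate_balance_sheet(positions, market):
    """Step 1: simulate balances, returns, and defaults security-by-security."""
    # Stub: a real version runs the full daily Balance Sheet simulation.
    return {"excess_cash": 1_000_000.0}

def optimize_allocations(sim_result, plan):
    """Step 2: decide what and when to buy with the available excess cash."""
    ticket = min(sim_result["excess_cash"], plan["max_ticket"])
    return [("UST_2Y", ticket)]  # hypothetical asset and ticket size

def read_feedback():
    """Step 3: branch quota misses, market shocks, and other feedback."""
    return {"quota_delta": 0, "market_shock": None}

def control_loop(positions, market, plan, days):
    trades = []
    for _ in range(days):
        sim = simulate_balance_sheet(positions, market)
        trades.extend(optimize_allocations(sim, plan))
        feedback = read_feedback()
        if feedback["market_shock"]:
            market = feedback["market_shock"]  # re-simulate under the new market
    return trades

trades = control_loop(positions=[], market={}, plan={"max_ticket": 500_000.0}, days=2)
print(trades)  # two days -> two tickets of 500000.0 each
```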
All large U.S. Banks have been running CCAR development and U.S. Federal Reserve reporting programs over the past several years. These CCAR projects are cleaning up the centralized global balance sheet data (positions, indicatives, market data) inside the banks so that the worst-case liquidity scenarios reported periodically to the Fed make sense. The novel idea presented in this note is to make these CCAR sunk costs produce operational efficiency and additional revenue in banks with large balance sheets, using NIMo. We propose simulating the Firm’s full Balance Sheet, in the expected case rather than the worst case, to direct automated global capital allocation implementation.
“Free” Floating Point is the label recently applied to contemporary commodity processors executing dozens of floating-point operations per core per clock, on average. Typically you need vectorized code emitted by your compiler to get the “free” FLOPS, and historically this has been only sporadically accomplished in Wall Street shops. That fact is not lost on Intel with its new Code Modernization effort.
Despite recent worries that Skylake delays demonstrate the end of Moore’s Law, there may still be some wave to ride from the perspective of floating-point algorithm designers. The Golden Age of Floating Point computing could continue to increase the FLOPS supply through 2020: another 8x in per-core throughput, from wider vector issue on commodity processors or from on-chip GPUs, could be on the drawing board. One of the things you can do with these “free” FLOPS is make CCAR full Balance Sheet simulation fast on a small commodity microprocessor core footprint. And even though the numerical optimization starts off cubic in expected asymptotic complexity, the computation is very tractable and can probably run on a desktop.
Assume for a moment that the Accrual Portfolio security balance, rate-of-return, and default models can be improved to pull down the expected-case approximation error; for example, standard Mortgage Backed Security prepayment-model techniques can be applied to pools of deposits and credit card accounts. Then there are enough “free” FLOPS to simulate a full multi-trillion-dollar balance sheet security-by-security under a stochastic market model (e.g., Libor Market Model – LMM) in a Monte Carlo derivation of the expected Net Interest Margin or Net Interest Revenue. That is, of course, provided your Firm is on board with the Intel Code Modernization idea.
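A heavily simplified sketch of that Monte Carlo derivation is below. The one-factor random-walk rate model is a placeholder for a full LMM, and the spreads, deposit-rate floor, balances, and volatility are all invented for illustration; a real run would price each of millions of securities per path rather than one aggregate book.

```python
# Hedged sketch of a Monte Carlo expected-NIM calculation over a toy
# balanced book. The short-rate model and all parameters are hypothetical.
import random

def simulate_nim_paths(n_paths, n_steps, r0=0.02, vol=0.002, seed=7):
    random.seed(seed)
    assets = liabilities = 2.0e12  # balanced multi-trillion-dollar book, USD
    nims = []
    for _ in range(n_paths):
        r = r0
        margins = []
        for _ in range(n_steps):
            r += random.gauss(0.0, vol)         # toy short-rate step
            asset_yield = r + 0.025             # hypothetical asset spread
            deposit_rate = max(0.0, r - 0.015)  # deposits lag rates, floored at 0
            nii = assets * asset_yield - liabilities * deposit_rate
            margins.append(nii / assets)        # per-step margin
        nims.append(sum(margins) / n_steps)
    return sum(nims) / n_paths                  # expected NIM across paths

expected_nim = simulate_nim_paths(n_paths=1000, n_steps=12)
print(round(expected_nim * 1e4, 1), "bps")  # slightly below 400 bps here:
                                            # the deposit floor adds convexity
```

The point of the Monte Carlo is visible even in the toy: the deposit-rate floor makes NIM a nonlinear function of the rate path, so the expected NIM is not simply the NIM at the expected rate.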
Net Interest Margin (NIM) is the difference between the rates of return of the Accrual Portfolio Assets and Liabilities on the Balance Sheet. NIM Optimization is a natural problem to attack given a fast balance sheet simulator. Why is that?
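As a concrete instance of the definition, NIM is conventionally computed as net interest income over earning assets. The dollar figures below are made up, chosen only to land near the roughly 300 bps level discussed next.

```python
# NIM as conventionally computed: annual net interest income over
# earning assets, expressed in basis points. Figures are illustrative.
interest_income = 60.0e9   # cash thrown off by the Assets, USD/yr
interest_expense = 12.0e9  # cash consumed by the Liabilities, USD/yr
earning_assets = 1.6e12    # balance sheet size, USD

nim_bps = (interest_income - interest_expense) / earning_assets * 1e4
print(nim_bps)  # 300.0
```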
Figure 1: St Louis Fed – Net Interest Margin all U.S. Banks
NIM is reported in basis points (bps), or hundredths of a percent. Figure 1 shows the St. Louis Fed data for Net Interest Margin for all U.S. Banks. It is currently near a 30-year low, just above 300 bps. So the banks’ core business of funding commercial and consumer loans with deposits is generating a historically small return.
Figure 2: Top 20 Global Banks by Assets (Acuity)
What is a basis point of NIM worth in dollars? That depends on the size of your balance sheet. Figure 2 shows the Top 20 Global Banks by Assets. On the Balance Sheet the Assets must be offset by the Liabilities, so the NIM rate multiplied by the Assets gives the Net Interest Revenue. A basis point is therefore worth roughly $150MM to $250MM annually to the banks listed in Figure 2.
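The arithmetic behind that dollar range is one multiplication, using round balance-sheet sizes in the neighborhood of the Figure 2 banks (illustrative figures, not taken from the table):

```python
# Annual Net Interest Revenue impact of one basis point of NIM,
# for round balance-sheet sizes near those in Figure 2.
BP = 1e-4  # one basis point = 0.01%

for assets in (1.5e12, 2.0e12, 2.5e12):  # USD
    print(f"${assets / 1e12:.1f}T assets: 1 bp of NIM = ${assets * BP / 1e6:.0f}MM/yr")
# $1.5T assets: 1 bp of NIM = $150MM/yr
# $2.0T assets: 1 bp of NIM = $200MM/yr
# $2.5T assets: 1 bp of NIM = $250MM/yr
```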
Figure 3: Historical Net Interest Margin
(FDIC, Gruenberg, 2013)
How much variability is there in the NIM reported by banks historically? The graph in Figure 3 shows a wide dispersion in historically reported NIM figures; for example, in 2005 there was about 100 bps of NIM dispersion between banks of various sizes. NIM levels appear to converge during market crises such as 2002 and 2008-9 and then disperse again. Not all of this dispersion is due to the capital allocation plan or its implementation; occasionally banks incur unexpected costs that drive down NIM. But some of the dispersion is due to the Firm’s discretionary capital allocation plan and its implementation.
The idea in this note is to provide an alternative for banks currently steering a rocket with a joystick. We propose a control loop that numerically optimizes NIM/NIR over the full Bank Balance Sheet, Assets and Liabilities, at the security level in the Accrual Portfolio. Banks certainly have state-of-the-art capital planning processes in place, but they are not driven directly from the Accrual Portfolio security level; if they were, CCAR would already be complete. We are still gathering hard data, but we plausibly expect to automate Bank NIM growth on the order of 10 bps per annum. Moreover, there appears to be sufficient FP capacity to numerically optimize NIM using all of the target market’s assets (e.g., starting with U.S. Fed data). The applications of this technology are abundant.