James Vickery, Federal Reserve Bank of New York, here. There is a picture on the Fed website.
James Vickery is an Assistant Vice President in the Research and Statistics Group of the Federal Reserve Bank of New York, where he has worked since 2004. Mr. Vickery’s research focuses on financial intermediation and banking. Recent and ongoing research topics include capital adequacy, scale economies in banking, mortgage design, interbank markets, and bank organizational complexity. A separate line of research studies financial innovation and insurance in emerging market economies. In addition, Mr. Vickery contributes to the Federal Reserve’s financial stability and monetary policy responsibilities, including serving on the Model Oversight Group that directs the Fed’s supervisory stress testing models. He also teaches as an Adjunct Assistant Professor at the NYU Stern School of Business. Mr. Vickery completed a PhD in Economics at MIT in 2004. Prior to graduate school he was a research analyst at the Reserve Bank of Australia.
The Stochastic Programming Society, here. Going to do some of the setup for the Princeton Bank Consortium here. Pink, I will stick with sorting through references, and PBC will focus on the NIM optimization numbers and commentary.
Stochastic programming is a framework for modeling optimization problems that involve uncertainty. Whereas deterministic optimization problems are formulated with known parameters, real-world problems almost invariably include some unknown parameters. When the parameters are known only within certain bounds, one approach to tackling such problems is called robust optimization. Here the goal is to find a solution that is feasible for all such data and optimal in some sense. Stochastic programming models are similar in style but take advantage of the fact that probability distributions governing the data are known or can be estimated. The goal here is to find some policy that is feasible for all (or almost all) the possible data instances and maximizes the expectation of some function of the decisions and the random variables. More generally, such models are formulated, solved analytically or numerically, and analyzed in order to provide useful information to a decision-maker.
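To make the framework concrete: a small two-stage stochastic program can be written out in "extensive form" over a finite set of scenarios and solved as an ordinary LP. The sketch below is a minimal newsvendor-style example (all numbers invented; assumes SciPy is available), where the first-stage order is chosen before demand is known and second-stage sales are the recourse decision.

```python
# Extensive form of a tiny two-stage stochastic program: a newsvendor
# problem with three equally likely demand scenarios, solved as one LP.
# All parameters are illustrative, not from any referenced model.
import numpy as np
from scipy.optimize import linprog

cost, price = 1.0, 2.0                 # unit purchase cost, unit sale price
demands = np.array([50.0, 100.0, 150.0])
probs = np.full(3, 1.0 / 3.0)          # scenario probabilities

# Decision vector: [x, y_1, y_2, y_3]
#   x   = first-stage order quantity (chosen before demand is known)
#   y_s = second-stage sales in scenario s (recourse)
# linprog minimizes, so we minimize  cost*x - sum_s probs[s]*price*y_s,
# i.e., maximize expected profit.
c = np.concatenate(([cost], -probs * price))

# Sales cannot exceed the order: y_s - x <= 0 for each scenario
A_ub = np.hstack([-np.ones((3, 1)), np.eye(3)])
b_ub = np.zeros(3)

# Sales also cannot exceed realized demand: 0 <= y_s <= d_s
bounds = [(0, None)] + [(0, d) for d in demands]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
order = res.x[0]
expected_profit = -res.fun
print(f"optimal order: {order:.1f}, expected profit: {expected_profit:.2f}")
```

The "here-and-now" decision `x` hedges against all three futures at once, which is exactly what distinguishes this from solving three deterministic problems separately.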
To find out more about stochastic programming, a good place to start is A Tutorial on Stochastic Programming by Alexander Shapiro and Andy Philpott. This tutorial is aimed at readers with some acquaintance with optimization and probability theory; for example, graduate students in operations research, or academics/practitioners from a different field of operations research.
The older Stochastic Programming Introduction by Andy Philpott is aimed at readers with a less formal background in operations research, for example managers in industry who want to know more about what stochastic programming might offer them without delving too deeply into details.
In addition, tutorials on current research areas are being developed. The main idea is that those who’ve read the introduction and want to find out more about specific topics will have a good place to start. COSP will be inviting experts to write pages for each of these areas. This collection of introductions is edited by David Morton, Andy Philpott, and Maarten van der Vlerk.
Georgia Tech crew, here: Shapiro and Nemirovski.
Shapiro and Philpott, 2007, A Tutorial on Stochastic Programming, here. Decent bibliography at the conclusion of the tutorial.
van der Vlerk, Stochastic Programming Bibliography, here.
NEOS Guide, here. Wisconsin folks.
Pioneers of Stochastic Programming, here. Prekopa, Rockafellar, Ziemba, Dantzig links.
Stochastic Programming Links, here. Old (2002); lots of dead links.
Birge JR and Judice P, Long-Term Bank Balance Sheet Management: Estimation and Simulation of Risk-Factors, 2012, here. QRM must be coming out of Northwestern or Chicago somehow. We propose a dynamic framework which encompasses the main risks in balance sheets of banks in an integrated fashion. Our contributions are fourfold: 1) solving a simple one-period model that describes the optimal bank policy under credit risk; 2) estimating the long-term stochastic processes underlying the risk factors in the balance sheet, taking into account the credit and interest rate cycles; 3) simulating several scenarios for interest rates and charge-offs; and 4) describing the equations that govern the evolution of the balance sheet in the long run. The models that we use address momentum and the interaction between different rates. Our results enable simulation of bank balance sheets over time given a bank's lending strategy and provide a basis for an optimization model to determine bank asset-liability management strategy endogenously. Birge, Dempster, Shapiro, Prekopa, Nemirovski, Rockafellar, Ruszczynski.
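The abstract above mentions simulating scenarios for interest rates. Purely as a generic illustration of that step (this is not the Birge–Judice model; the process choice and every parameter below are my own invented placeholders), here is a minimal mean-reverting Vasicek-style short-rate simulator, dr = kappa*(theta - r)*dt + sigma*dW, discretized with Euler steps:

```python
# Hedged sketch: Vasicek-style short-rate scenario simulation with
# Euler discretization. Parameters are illustrative only.
import numpy as np

def simulate_rate_paths(r0, kappa, theta, sigma, years, steps_per_year,
                        n_paths, seed=0):
    """Return an array of shape (n_paths, n_steps + 1) of simulated rates."""
    rng = np.random.default_rng(seed)
    dt = 1.0 / steps_per_year
    n_steps = years * steps_per_year
    paths = np.empty((n_paths, n_steps + 1))
    paths[:, 0] = r0
    for t in range(n_steps):
        shock = rng.standard_normal(n_paths)
        paths[:, t + 1] = (paths[:, t]
                           + kappa * (theta - paths[:, t]) * dt
                           + sigma * np.sqrt(dt) * shock)
    return paths

# Made-up parameters: start at 1%, revert toward a 3% long-run mean.
paths = simulate_rate_paths(r0=0.01, kappa=0.5, theta=0.03, sigma=0.01,
                            years=10, steps_per_year=12, n_paths=1000)
print(paths.shape)          # (1000, 121)
print(paths[:, -1].mean())  # drifts toward theta = 0.03
```

A real balance-sheet framework would couple rate paths like these to charge-off dynamics, which is exactly the interaction the paper says its models address.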
Why is Capital One’s 2015 NIM such an outlier relative to Bank Holding Companies (BHCs) with comparable and larger asset bases? Capital One’s NIM is double the average bank NIM in 2016. Why is that? Press coverage over the past couple of years attributes COF’s outperformance to going long risky credits and managing the write-downs. Maybe they are right and COF is just good at making and managing loans, and has been for a decade. On the other hand, maybe COF is running a very different quantitative model than its competitors. COF has $300+ BN of assets in 2016 and is making 300 more bps per year on its assets than the average competitor BHC makes on theirs. Banks’ NIMs are at 30-year lows, but not at COF. Maybe the competitors can learn from COF. We discuss some of the quantitative possibilities.
- Sort out references
- cover and assess QRM, Bancware, Polymaths, Oracle Financial …
- clean up flow start to finish
- Tighten up abstract and summary.
- Find some COF folks to comment
Capital One’s NIM
Bank NIM is down, and interest rates are expected to stay down (see Rieder). St. Louis Federal Reserve figures show that the current average BHC NIM is at a 30-year low.
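For concreteness, net interest margin is net interest income (interest earned minus interest paid) divided by average earning assets. A toy calculation with made-up figures, not taken from any bank's filings:

```python
# NIM arithmetic with hypothetical numbers (not actual bank data).
def net_interest_margin(interest_income, interest_expense, avg_earning_assets):
    """NIM = (interest income - interest expense) / average earning assets."""
    return (interest_income - interest_expense) / avg_earning_assets

# Hypothetical bank: $12B interest income, $3B interest expense,
# on $300B of average earning assets.
nim = net_interest_margin(12e9, 3e9, 300e9)
print(f"NIM = {nim:.2%}")  # NIM = 3.00%
```

At that scale, a 300 bps NIM edge over a competitor is roughly $9B a year of additional net interest income, which is why the gap in the figures is worth explaining.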
Capital One’s NIM is not at a 30-year low. In fact, they buy assets on the market from ING, GE, Chevy Chase, etc. and still maintain their historic NIM levels. COF’s assets grew 8x from 2004 to 2015, yet the NIM stayed in a narrow range, occasionally busting out the top side. How do you do that? We will review the evidence that the correct figurative question is not “What’s in their Wallet?” but “How do they choose what’s in their Wallet?” We will make the case that COF is running some sort of dynamic stochastic optimization to implement its capital allocation plan. COF’s Wallet is filled with securities through an LP/NLP optimization process and possibly some dynamic programming feedback control process. That is how they make 300 bps more than almost everyone else in a punishingly low interest rate environment. Moreover, other than the competition for assets, this is more of an internal efficiency game than a zero-sum game where someone has to lose for there to be a winner. COF has a free hand in this game at the moment because no one else knows how to play.
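We have no visibility into COF's actual models, so the following is pure speculation about the shape such a process could take: a toy single-period LP that allocates a normalized balance sheet across asset classes to maximize expected spread (yield minus expected charge-offs minus funding cost) subject to a stylized risk-weighted capital constraint and concentration caps. Every asset class, yield, loss rate, and constraint below is invented for illustration.

```python
# Speculative sketch only: toy balance-sheet allocation LP. None of the
# numbers or constraints reflect any actual bank's model or data.
import numpy as np
from scipy.optimize import linprog

assets = ["card", "auto", "commercial", "treasuries"]
exp_yield = np.array([0.12, 0.06, 0.04, 0.02])       # expected gross yields
exp_chargeoff = np.array([0.045, 0.015, 0.005, 0.0])  # expected loss rates
risk_weight = np.array([1.0, 0.75, 1.0, 0.0])         # stylized RWA weights
funding_cost = 0.015                                  # uniform cost of funds

rwa_cap = 0.8     # risk-weighted assets <= 0.8 of the balance sheet
max_single = 0.5  # concentration cap per asset class

# Maximize sum_i w_i * (yield_i - chargeoff_i - funding_cost);
# linprog minimizes, so negate the spreads.
c = -(exp_yield - exp_chargeoff - funding_cost)

A_ub = risk_weight.reshape(1, -1)   # capital constraint
b_ub = np.array([rwa_cap])
A_eq = np.ones((1, 4))              # fully invest the balance sheet
b_eq = np.array([1.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, max_single)] * 4, method="highs")
weights = dict(zip(assets, res.x))
nim = -res.fun
print(weights, f"expected NIM: {nim:.2%}")
```

Even this toy version shows the qualitative story: the optimizer loads up on the high-spread asset to its caps, spends the remaining capital budget on the next-best spread per unit of RWA, and parks the rest in zero-risk-weight assets. A dynamic version would re-solve this as rates, charge-offs, and the feasible asset supply evolve, which is the feedback-control flavor suggested above.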
Capital One Investor Relations, here. I think if we poke around here we can get an estimate of where they are vis-à-vis Net Interest Margin Optimization.
Capgemini, Doing Business The Digital Way: How Capital One Fundamentally Disrupted the Financial Services Industry, here.
Here’s a quick exercise. What links these different outfits: online bank ING Direct; Bankons, a mobile startup that creates geo-located offers; Bundle, a Citibank spin-off that specializes in analysis of spend data; Sail, a mobile point-of-sale card-swiping device? Here’s a hint: all of these companies were acquired to bolster the digital services of a leading financial services organization. We are talking about Capital One. Since its founding as a credit card company in 1988, Capital One Financial Corp. has grown into a diversified bank with more than 65 million customer accounts worldwide. It is not hard to see why Capital One is investing heavily in digital technologies. It conducts over 80,000 big data experiments a year. Currently, 75% of customer interactions with Capital One are digital, and this number is only expected to grow. In Q4 2013, Capital One was one of the most visited websites, with 40 million unique online visitors.
Donal Byrne, Tabb Forum, Rethinking Speed in Financial Markets, Part 4: The Need for Machine-Time Data, here. He said/wrote microseconds, … Funny. And the Corvil guys are reasonable for the most part in my experience. It is like Dr. Evil asking for 1 million dollars ransom for not destroying the world in 2000 something. I think the clock frequency is written in the computer box, no? You would think calling it a microprocessor CLOCK would be a hint, but I guess you can never be sure.
A machine world is different. Machines act much faster than humans. Their idea of real time is much closer to a microsecond. Roughly a million times faster. I refer to this as “machine real time” or “machine time” for short. We define machine time as the time within which a machine can act or make a decision. We therefore need a machine-time watch for a machine-time world.