James Vickery Papers

A Structural View of U.S. Bank Holding Companies, July 2012.

Do Big Banks Have Lower Operating Costs?, 2014.

Available For Sale? Understanding Bank Securities Portfolios, Feb. 2015.

References

Capital One, Third Quarter Earnings, Oct. 25, here.

Birge and Judice, June 2012, Long-Term Bank Balance Sheet Management: Estimation and Simulation of Risk-Factors, here.

John R. Birge, faculty page at Chicago, here. Google Scholar page, here. Introduction to Stochastic Programming (book), 2011, here.

 

James Vickery

James Vickery, Federal Reserve Bank of New York, here. There is a picture on the Fed website.

James Vickery is an Assistant Vice President in the Research and Statistics Group of the Federal Reserve Bank of New York, where he has worked since 2004. Mr. Vickery’s research focuses on financial intermediation and banking. Recent and ongoing research topics include capital adequacy, scale economies in banking, mortgage design, interbank markets, and bank organizational complexity. A separate line of research studies financial innovation and insurance in emerging market economies. In addition, Mr. Vickery contributes to the Federal Reserve’s financial stability and monetary policy responsibilities, including serving on the Model Oversight Group that directs the Fed’s supervisory stress testing models. He also teaches as an Adjunct Assistant Professor at the NYU Stern School of Business. Mr. Vickery completed a PhD in Economics from MIT in 2004. Prior to graduate school he was a research analyst at the Reserve Bank of Australia.

Stochastic Programming

The Stochastic Programming Society, here. Going to do some of the setup for the Princeton Bank Consortium here. Pink: I will stick with sorting through references, and PBC will focus on the NIM optimization numbers and commentary.

Stochastic programming is a framework for modeling optimization problems that involve uncertainty. Whereas deterministic optimization problems are formulated with known parameters, real world problems almost invariably include some unknown parameters. When the parameters are known only within certain bounds, one approach to tackling such problems is called robust optimization. Here the goal is to find a solution which is feasible for all such data and optimal in some sense. Stochastic programming models are similar in style but take advantage of the fact that probability distributions governing the data are known or can be estimated. The goal here is to find some policy that is feasible for all (or almost all) the possible data instances and maximizes the expectation of some function of the decisions and the random variables. More generally, such models are formulated, solved analytically or numerically, and analyzed in order to provide useful information to a decision-maker.
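To make the scenario structure concrete, here is a minimal two-stage sketch in Python (a toy newsvendor-style problem, not drawn from any of the references below; all numbers are assumptions). The first-stage decision is made before the uncertainty resolves, the second-stage recourse is chosen per scenario, and the deterministic equivalent LP maximizes expected profit over the scenarios.

```python
# Toy two-stage stochastic program solved as its deterministic equivalent LP.
# All parameters are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

c, r = 1.0, 1.5                       # unit cost (stage 1) and unit revenue (stage 2)
demand = np.array([80., 100., 140.])  # demand scenarios
prob = np.array([0.3, 0.5, 0.2])      # scenario probabilities
S = len(demand)

# Decision vector: [x, y_1, ..., y_S]; x = quantity bought now, y_s = units sold in scenario s.
# Maximize sum_s p_s * r * y_s - c * x, i.e. minimize the negative.
obj = np.concatenate(([c], -prob * r))

# Recourse constraints y_s <= x, rewritten as -x + y_s <= 0.
A_ub = np.zeros((S, S + 1))
A_ub[:, 0] = -1.0
A_ub[np.arange(S), 1 + np.arange(S)] = 1.0
b_ub = np.zeros(S)

# Bounds: x >= 0 and 0 <= y_s <= d_s.
bounds = [(0, None)] + [(0, d) for d in demand]

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("first-stage quantity:", res.x[0], "expected profit:", -res.fun)
```

The first-stage quantity hedges across the scenarios rather than optimizing for any single one of them, which is the essential difference from solving a deterministic problem at, say, the mean demand.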

To find out more about stochastic programming, a good place to start is A Tutorial on Stochastic Programming by Alexander Shapiro and Andy Philpott. The tutorial is aimed at readers with some acquaintance with optimization and probability theory, for example graduate students in operations research, or academics and practitioners from related fields.

The older Stochastic Programming Introduction by Andy Philpott is aimed at readers with a less formal background in operations research, for example managers in industry who want to know more about what stochastic programming might offer them without delving too deeply into details.

In addition, tutorials on current research areas are being developed. The main idea is that those who’ve read the introduction and want to find out more about specific topics will have a good place to start. COSP will be inviting experts to write pages for each of these areas. This collection of introductions is edited by David Morton, Andy Philpott, and Maarten van der Vlerk.

Georgia Tech crew, here. Shapiro and Nemirovski.

Rutgers, here. Ruszczynski, here.

Shapiro and Philpott, 2007, A Tutorial on Stochastic Programming, here. OK bibliography at the conclusion of the tutorial.

Powell, Warren B., Computational Stochastic Programming, here. See also Castle Labs, here. Princeton OR.

van der Vlerk, Stochastic Programming Bibliography, here.

NEOS Guide, here. Wisconsin folks.

Pioneers of Stochastic Programming, here. Prekopa, Rockafellar, Ziemba, Dantzig links.

Stochastic Programming Links, here. Old (2002); lots of dead links.

Stochastic Programming References

Birge JR and Judice P, Long-Term Bank Balance Sheet Management: Estimation and Simulation of Risk-Factors, 2012, here. QRM must be coming out of Northwestern or Chicago somehow.

We propose a dynamic framework which encompasses the main risks in balance sheets of banks in an integrated fashion. Our contributions are fourfold: 1) solving a simple one-period model that describes the optimal bank policy under credit risk; 2) estimating the long-term stochastic processes underlying the risk factors in the balance sheet, taking into account the credit and interest rate cycles; 3) simulating several scenarios for interest rates and charge-offs; and 4) describing the equations that govern the evolution of the balance sheet in the long run. The models that we use address momentum and the interaction between different rates. Our results enable simulation of bank balance sheets over time given a bank's lending strategy and provide a basis for an optimization model to determine bank asset-liability management strategy endogenously.
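For a sense of what step 3) in the abstract could look like mechanically, here is a minimal sketch (not the paper's model; all parameters are assumptions) that simulates joint interest-rate and charge-off paths as correlated, mean-reverting AR(1) processes:

```python
# Scenario generation sketch: correlated AR(1) paths for an interest rate and a
# charge-off rate. Parameters are illustrative assumptions, not estimates.
import numpy as np

rng = np.random.default_rng(0)
T, n_scen = 120, 1000                  # 120 months, 1000 scenarios
mu = np.array([0.03, 0.02])            # long-run means: [rate, charge-off]
phi = np.array([0.95, 0.90])           # monthly persistence
sigma = np.array([0.002, 0.003])       # shock volatilities
corr = 0.3                             # rate / charge-off shock correlation
L = np.linalg.cholesky(np.array([[1.0, corr], [corr, 1.0]]))

paths = np.empty((n_scen, T, 2))
x = np.tile(mu, (n_scen, 1))           # start each scenario at the long-run mean
for t in range(T):
    shocks = rng.standard_normal((n_scen, 2)) @ L.T * sigma
    x = mu + phi * (x - mu) + shocks   # mean-reverting update
    paths[:, t, :] = x

print("mean terminal rate:", paths[:, -1, 0].mean())
print("mean terminal charge-off rate:", paths[:, -1, 1].mean())
```

A full implementation along the lines of the abstract would estimate the persistence, volatility, and correlation parameters from data and add the momentum and cross-rate interaction terms the authors describe.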

Birge, Dempster, Shapiro, Prekopa, Nemirovski, Rockafellar, Ruszczynski

What’s In Their Wallet Draft

whats-in-their-wallet-v2


  • Sort out references
  • Cover and assess QRM, Bancware, Polymaths, Oracle Financial …
  • Clean up flow start to finish
  • Tighten up abstract and summary
  • Find some COF folks to comment

What’s In Their Wallet?

Abstract:
Why is the bank Capital One’s 2015 NIM such an outlier relative to Bank Holding Companies (BHCs) with comparable and larger asset bases? Capital One’s NIM is double the average bank NIM in 2016. Why is that? Press coverage over the past couple of years attributes COF’s outperformance to going long risky credits and managing the write-downs. Maybe that is right and COF has simply been good at making and managing loans for a decade. On the other hand, maybe COF is running a very different quantitative model than its competitors. COF has $300+ BN of assets in 2016 and is making 300 bps more per year on its assets than the average competitor BHC makes on theirs. Banks’ NIMs are at 30-year lows, but not at COF. Maybe the competitors can learn from COF. We discuss some of the quantitative possibilities.
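For scale, taking the abstract’s figures at face value: 300 bps on roughly $300 BN of assets is about 0.03 × $300 BN ≈ $9 BN of additional net interest income per year relative to an average competitor with the same asset base.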

Capital One’s NIM

Bank NIM is down and interest rates are expected to stay down (see Rieder). St. Louis Federal Reserve figures show that the current average BHC NIM is at a 30-year low.

Capital One’s NIM is not at a 30-year low. In fact, they buy assets on the market from ING, GE, Chevy Chase, etc. and still maintain their historic NIM levels. COF assets grew 8x from 2004 to 2015, yet the NIM level stayed in a narrow range, occasionally breaking out to the top side. How do you do that? We will review the evidence that the correct figurative question is not “What’s in their Wallet?” but “How do they choose what’s in their Wallet?” We will make the case that COF is running some sort of dynamic stochastic optimization to implement its capital allocation plan. COF’s Wallet is filled with securities through an LP/NLP optimization process and possibly some dynamic programming feedback control process. That is how they make 300 bps more than nearly everyone else in a punishingly low interest rate environment. Moreover, other than the competition for assets, this is more of an internal efficiency game than a zero-sum game in which someone has to lose for there to be a winner. COF has a free hand in this game at the moment because no one else knows how to play.
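As a rough sketch of the kind of allocation step this paragraph speculates about (emphatically not COF’s actual model; the asset classes, yields, charge-off rates, funding cost, and policy limits below are all assumed), a single-period LP that chooses balance-sheet weights to maximize a risk-adjusted margin looks like this:

```python
# Toy single-period NIM allocation LP -- purely illustrative assumptions throughout.
import numpy as np
from scipy.optimize import linprog

assets    = ["card", "auto", "commercial", "securities"]
yields    = np.array([0.145, 0.065, 0.040, 0.025])   # gross yields (assumed)
chargeoff = np.array([0.045, 0.015, 0.005, 0.000])   # expected losses (assumed)
fund_cost = 0.012                                     # blended funding cost (assumed)

# Maximize sum_i w_i * (yield_i - chargeoff_i); linprog minimizes, so negate.
obj = -(yields - chargeoff)

# Weights sum to 1 (the whole balance sheet), with assumed concentration limits.
A_eq, b_eq = np.ones((1, 4)), np.array([1.0])
limits = [(0.0, 0.45), (0.0, 0.35), (0.0, 0.50), (0.05, 1.0)]

res = linprog(obj, A_eq=A_eq, b_eq=b_eq, bounds=limits)
nim = -res.fun - fund_cost
print(dict(zip(assets, res.x.round(3))), "NIM:", round(nim, 4))
```

The dynamic, stochastic version alluded to above would replace the fixed yields and charge-offs with simulated paths (as in the Birge and Judice setup) and re-solve or apply a feedback policy over time; even the toy version shows where the “how do they choose what’s in their Wallet” lever sits.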