Titan X

Chris Angelini, Tom’s Hardware, Update: Nvidia Titan X Pascal 12GB Review, here. Nvidia named their product on the recommendation of Heywood Ja Blowme and Munch Ma’Quchi.

You have a knack for trading the British Pound against the Japanese Yen. You have a killer hot sauce recipe, and it’s in distribution worldwide. You just made partner at your father-in-law’s firm. Whatever the case, you’re in that elite group that doesn’t really worry about money. You have the beach house, the Bentley, and the Bulgari. And now Nvidia has a graphics card for your gaming PC: the Titan X. It’s built on a new GP102 graphics processor featuring 3584 CUDA cores, backed by 12GB of GDDR5X memory on a 384-bit bus, and offered unapologetically at $1200.

Hot Chips 28

IEEE, Hot Chips 28, Program, here.

Sunday 8/21: Tutorials
8:00 AM – 9:00 AM: Breakfast
9:00 AM – 12:30 PM: Tutorial 1: Using Next-Generation Memory Technologies: DRAM and Beyond
12:30 PM – 1:45 PM: Lunch
1:45 PM – 5:00 PM: Tutorial 2: 3D Depth for Consumers: From Sensors to Apps
5:00 PM – 6:00 PM: Reception
Monday 8/22: Conference Day 1
8:30 AM – 9:30 AM: Breakfast
9:30 AM – 9:45 AM: Introduction
9:45 AM – 11:15 AM: GPUs and HPC Processors
11:15 AM – 11:45 AM: Break
11:45 AM – 12:45 PM: Processing on the Go: Mobile Devices
12:45 PM – 2:15 PM: Lunch
2:15 PM – 3:00 PM: Keynote 1: Mixed Reality
3:00 PM – 4:00 PM: Energy-Efficient Computing: Low-Power SoCs
4:00 PM – 4:30 PM: Break
4:30 PM – 6:00 PM: Vision and Image Processing
6:00 PM – 7:00 PM: Reception
Tuesday 8/23: Conference Day 2
7:30 AM – 8:30 AM: Breakfast
8:30 AM – 10:00 AM: Interconnects: Microns to Kilometers
10:00 AM – 10:30 AM: Break
10:30 AM – 12:00 PM: Emerging Embedded
12:00 PM – 1:15 PM: Lunch
1:15 PM – 2:15 PM: Keynote 2: Are We There Yet? Silicon in Self-Driving Cars.
2:15 PM – 3:15 PM: Many-Core Chips
3:15 PM – 3:45 PM: Break
3:45 PM – 5:15 PM: Dealing with Big Data
5:15 PM – 5:45 PM: Break
5:45 PM – 7:15 PM: High-Performance Processors
7:15 PM – 7:30 PM: Closing Remarks

Banking Book Securities Plan

To get started, we run some back-testing sample code:

  1. Pick a list of representative securities to build a portfolio (sec).
  2. Generate cashflow and default data quarterly for 5 year history (sec).
  3. Build models for each security that fit the cashflow and default data with interpolating polynomials of suitable degree. They should hit the generated data points exactly, which is like having a forecast model with essentially zero modeling error (sec). A sketch of this step follows the list.
  4. Build a portfolio from these products – maybe match some of the aggregate asset and liability sizes to NY Fed figures (sec).
  5. Run nim on this portfolio for 5y of back-testing (nim). The results should match the input data by definition.
  6. Run the LP on the modeled data to get the optimal capital allocation at 5y, 4y, 3y, 2y, and 1y (nimo).
  7. Run the P&L attribution, optimal versus realized (nimo).
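To make steps 2 and 3 concrete, here is a minimal Python sketch under made-up assumptions (one security, placeholder cashflow and default numbers, not real data): with 20 quarterly observations, a degree-19 interpolating polynomial in time reproduces every point exactly, which is the near-zero-modeling-error property the back-test relies on.

    import numpy as np

    # Step 2: 5 years of quarterly history = 20 observations for one security.
    # The cashflow and default numbers are made-up placeholders, not real data.
    rng = np.random.default_rng(0)
    quarters = np.arange(20)                                  # quarter index t = 0 .. 19
    cashflows = 100.0 + 2.5 * quarters + rng.normal(0, 5, 20)
    default_rates = 0.010 + 0.0005 * quarters

    # Step 3: a degree n-1 polynomial through n distinct points hits every observation exactly.
    def fit_exact(t, y):
        return np.polynomial.Polynomial.fit(t, y, deg=len(t) - 1)

    cashflow_model = fit_exact(quarters, cashflows)
    default_model = fit_exact(quarters, default_rates)

    # Back-test check: the models replay the observed history with ~zero modeling error.
    print("max cashflow fit error:", np.max(np.abs(cashflow_model(quarters) - cashflows)))
    print("max default fit error:", np.max(np.abs(default_model(quarters) - default_rates)))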

Notice we are going to make the cashflow and default models depend only on time, with a function choice that replicates exactly the observed cashflows and defaults. We will substitute models depending on market data at a subsequent step. Then we can put in the Monte Carlo forecasting step to compute the portfolio’s expected NIM prior to the optimization step.
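As a stand-in for that later step, here is a hedged sketch of what the Monte Carlo expected-NIM calculation could look like once the models depend on simulated market data. The random-walk rate model, the spreads, and the balance sizes are all assumed for illustration; this shows the shape of the calculation, not the actual nim/nimo implementation.

    import numpy as np

    rng = np.random.default_rng(42)

    # Toy balance sheet (placeholder sizes) and assumed spreads over the short rate.
    assets, liabilities = 1000.0, 900.0
    asset_spread, funding_spread = 0.030, 0.010

    # Very simple Gaussian random-walk short-rate paths, quarterly steps over 5 years.
    def simulate_short_rate(r0=0.02, n_quarters=20, n_paths=10000, vol=0.004):
        shocks = rng.normal(0.0, vol, size=(n_paths, n_quarters))
        return np.maximum(r0 + np.cumsum(shocks, axis=1), 0.0)

    rates = simulate_short_rate()
    income = assets * (rates + asset_spread) / 4.0            # quarterly interest income per path
    expense = liabilities * (rates + funding_spread) / 4.0    # quarterly interest expense per path
    nim_paths = (income - expense).sum(axis=1) / assets       # cumulative NIM over the horizon

    print("expected NIM over the 5y horizon:", nim_paths.mean())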

 

Note there is also the option of running the optimization after the Adverse and Severely Adverse Fed scenarios. Probably not very interesting in terms of P&L, but maybe there are other applications.

 

Bank Liabilities

These are the US Federal Reserve security classifications for Bank Liabilities:

  • Interbank Transactions
  • Federal Reserve Float
  • Loans to Domestic Banks
  • Interbank Transactions with Foreign Banks
  • Checkable Deposits
  • Checkable Deposits due Federal Gov
  • Checkable Deposits Private Domestic
  • Checkable Deposits Foreign
  • Large Time Deposits
  • Fed Funds
  • Repos
  • Debt Securities
  • Open Market Paper
  • Corp and Foreign Bonds
  • FHLB advances
  • Sallie Mae Loans
  • Taxes Payable
  • Holding Company Transactions
  • Misc Transactions

Bank Assets

  • Vault Cash
  • Reserves
  • Fed Funds
  • Repo
  • Debt Securities
  • Open Market Paper
  • Treasuries
  • Agencies
  • Residential Mortgage Passthroughs
  • Commercial Mortgage Passthroughs
  • CMO Structured MBS
  • Agency Backed MBS/CMO
  • Munis
  • Corporate and foreign bonds
  • Private Commercial MBS
  • Private Residential CMOs
  • Loans
  • Mortgages
  • Consumer Credit
  • Corporate Equities
  • Mutual Fund Shares
  • Direct Foreign Investment
  • Misc Assets
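One way these classifications feed step 4 of the plan above is as a simple keyed balance sheet whose category weights get calibrated to the aggregate figures cited below. In this sketch the weights are placeholders I made up; only the $14.5 tn total comes from the Fed release discussed next.

    # Portfolio skeleton keyed by the Fed classifications above (step 4 of the plan).
    # The weights are placeholders; only the $14.5 tn total is from the Fed release cited below.
    TOTAL_ASSETS = 14.5e12  # USD, total assets of large commercial banks

    asset_weights = {
        "Vault Cash": 0.02,
        "Reserves": 0.08,
        "Treasuries": 0.10,
        "Agency Backed MBS/CMO": 0.15,
        "Loans": 0.35,
        "Mortgages": 0.20,
        "Consumer Credit": 0.07,
        "Misc Assets": 0.03,
    }
    assert abs(sum(asset_weights.values()) - 1.0) < 1e-9  # weights cover the whole balance sheet

    portfolio = {name: w * TOTAL_ASSETS for name, w in asset_weights.items()}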

 

Federal Reserve Stats on Large Commercial Banks

US Federal Reserve, Large Commercial Banks with Consolidated Assets of $300 Million or More, here. These stats show total and domestic assets for about 1,700 banks: $14.5 tn total and $13.1 tn domestic. The results are available historically back to 2003.

 The Federal Reserve Board compiles quarterly data on domestically chartered insured commercial banks that have consolidated assets of $300 million or more and releases the data about twelve weeks after the end of each quarter. The data are obtained from the Consolidated Reports of Condition and Income filed quarterly by banks (FFIEC 031 and 041) and from other information in the Board’s National Information Center database. Banks that are located in U.S. territories and possessions are not included in the table.

US Federal Reserve, Dodd-Frank Act Stress Tests, here. Comprehensive Capital Analysis and Review, here. Six years of scenarios (Severely Adverse, Adverse). The Supervisory historical data gives a quarterly view of top-level market and macroeconomic variables back to 1976. It will be useful when NIMO moves to forecasting security models and stochastic market models. This data is listed under the tab Banking Information & Regulation and the menu item Stress Tests and Capital Planning.

US Federal Reserve, Bank Assets and Liabilities, here. Data Download Program. Financial Accounts Guide, here. This table lists the breakdown of products from 2016Q1 back quarterly to 2014Q3 and annually back to 2012.

 

Comparison of Open-Source Linear Programming Solvers

Gearhart et al., Sandia National Labs, Comparison of Open-Source Linear Programming Solvers, 2013, here. I am thinking I will start off with Google or maybe GNU for NIMO. It looks like all roads lead to CPLEX in the long run. The idea is to fit all the Fed historical data for the interest rate and credit risk of the aggregate USD BHC Assets and Liabilities with interpolating polynomials of high enough degree that the modeling error is effectively zero for quarterly simulation over several years. We will fit the data to match the aggregate amounts reported, but we bypass the ability to forecast for the moment. Then we can run the forward optimization using the error-free models from one historical time point t1 to another historical time point t2. Once we know the optimal capital allocation for the period, we can run the P&L explanatories back from t2 to t1 to do the attribution. If we can get the BHC portfolio breakdown for the period (t1, t2), then we might be able to make attribution observations that hold some interest on a per-BHC (bank) basis.

When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
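To make the “start with Google or maybe GNU” idea concrete, here is a minimal capital-allocation LP written against OR-Tools’ GLOP solver (the Google route). The per-category spreads and concentration caps are placeholder numbers; in NIMO the objective and constraints would come from the fitted, error-free models described above. The LP itself is solver-agnostic, so moving it onto GLPK, CLP, or CPLEX later is mostly a change of interface.

    from ortools.linear_solver import pywraplp

    # Placeholder per-category net spreads and concentration caps (illustrative only).
    spreads = {"Loans": 0.035, "Mortgages": 0.025, "Treasuries": 0.010, "Agency MBS": 0.018}
    caps = {"Loans": 0.50, "Mortgages": 0.40, "Treasuries": 1.00, "Agency MBS": 0.30}
    total = 1.0  # allocate the whole (normalized) balance sheet

    solver = pywraplp.Solver.CreateSolver("GLOP")  # Google's open-source LP solver
    x = {c: solver.NumVar(0.0, caps[c] * total, c) for c in spreads}

    solver.Add(sum(x.values()) == total)  # fully invested
    solver.Maximize(sum(spreads[c] * x[c] for c in spreads))

    if solver.Solve() == pywraplp.Solver.OPTIMAL:
        for c, v in x.items():
            print(f"{c:>12s}: {v.solution_value():.2f}")
        print("optimal projected NIM:", solver.Objective().Value())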
