Over the decades there have been many views on what the bank of the future would be. Some ideas have been radical and some have been transitional, while others never really took hold. This particular view begins in Pawtuxet, Rhode Island, on the eastern seaboard of the United States. By all accounts a lovely place to visit, Pawtuxet is well known for scenic harbour views and boating along its historic river corridor. But in the early 19th century, textile mills and coastal trade dominated its landscape. As the community thrived and local businesses grew, the Pawtuxet Bank emerged. As was common for its time, the bank was a partnership, and its directors were mostly merchant-manufacturers.
The Pawtuxet Bank’s directors shared personal liability in the event of loan default, or if the bank itself failed. In “The Structure of Early Banks in Southeastern New England”, Naomi R. Lamoreaux recounts the events of June 1840 when the bank’s stockholders presented themselves before the Rhode Island General Assembly. The group appeared seeking permission to reduce the bank’s capitalization from $87,750 to $78,000 in order to cover losses sustained due to the death of John Pettis, one of the bank’s directors:
Pettis died in 1838 with notes worth $8,800 outstanding at the bank and endorsements amounting to at least another $1,500…(t)his loss was not sufficiently large to cause the bank to fail. Nor did depositors or the bank’s own noteholders suffer. Most (91 percent) of the bank’s loans were backed by capital rather than notes or deposits, and the stockholders simply absorbed the loss.
We are unlikely to return to the capitalization levels or strict regional focus employed by the gentlemen of Pawtuxet. There are however crucial lessons to be learned when examining the structure and scope of early financial institutions. When we talk about addressing concerns over capital, funding, and liquidity, it just might be that what we need for the bank of tomorrow is not a new model, but rather one that takes inspiration from the bank of yesterday.
In the early history of banking, each partner made decisions knowing they shared liability if the bank failed. As a result, decisions about to whom credit should be extended were not taken lightly. Unambitious business owners with a slight but steady production of widgets were considered ideal customers. Less attractive was the dubious repayment potential of innovative or entrepreneurial types. The latter group represented the potential of a phenomenal return on investment, but only if an unproven product or process succeeded.
This philosophy was in keeping with the dominant banking theory of the 18th and early 19th centuries. The real-bills doctrine proposed that banks should restrict the extension of credit to customers involved in the transfer of existing products only. Real-bills supporters argued that by basing loans on the security of actual goods, any individual bank’s liquidity was ensured.
While there is an admirable simplicity in tying loans to tangible goods, the skyline of the 19th century was starting to change. Towers, factories, and infrastructure projects needed to be built, with no existing goods to offer as security for funding. And these projects were poised to generate great returns.
The unlimited liability model was ill-suited to finance these projects. When shareholders’ money was directly on the line, banks had good reason to avoid speculative ventures. Because the incentives for self-discipline were so high, banks often lent only to those they knew best: local businessmen, often engaged in the same type of industry as the bank’s shareholders. Bank funds frequently became personal resources for the shareholders themselves, and this type of insider lending would often make up the majority of a bank’s exposures.
As the world began to change, so did banks. By the late 1850s, Great Britain had moved towards limited liability, with France following suit in 1867. As with unlimited liability, the logic was easy enough to follow: if a bank could diversify its investor base, there would be a greater availability of credit and capital. The adjustment would correct what had proved to be an all-too-effective check on risk: with personal obligation on the line, banks weren’t interested in funding anything risky.
In “Early American Banking: The Significance of the Corporate Form,” Richard Sylla suggests that the tipping point away from unlimited liability originated with the New York Free Banking law of 1838 which stated, “no shareholder of any such association shall be liable in his individual capacity for any contract, debt or engagement of such association.” New York’s free banking law didn’t just make limited liability possible. It opened the door for the incorporation of banks and the freedom from personal obligation.
One step forward, two steps back
As banks moved beyond their villages in search of capital and opportunities, the strategies and measurements used in their operation changed as well. Instead of prudence being the only driver, customer profitability and shareholder value became ongoing concerns.
Expansions, mergers, and deregulation replaced local partnerships with a mandate to maximize customer bases and profitability. Operating at the extreme opposite of the early 19th-century model were institutions like Alfinanz, an offshore administration factory that functioned as a back office for a global network of financial advisors, intermediaries, and brokers.
As banks moved from private partnerships to public corporations, shareholder demands added another voice to how bank capital and risk would be managed. Enhanced returns were a factor in banks’ decisions to pursue diversification, more complex transactions such as structured products, and other strategies that gained support from managers operating with limited liability.
In the modern era of banking, even the idea of ‘who is a customer’ was up for grabs. In the 1990s, First Manhattan Consulting Group took a leading role in introducing profit-based segments to banking. First Manhattan came to prominence with the now famous conclusion that only 20% of a bank’s customers were profitable. Their idea of focusing only on profitable customers was attractive to banks seeking to improve low revenue growth, particularly in core retail products. The concept also encouraged mergers and the creation of larger banks, which were better positioned to take advantage of segmentation opportunities.
Today we see banks retreating from these drivers and measures, often forced to adjust strategy by regulation, and perhaps stepping back from acting “in loco parentis”. Shareholder demands, which once focused exclusively on the creation of shareholder value, must now be balanced against closer regulatory scrutiny and the need to protect customer interests.
Diversification brought its own set of challenges, as it did not spread risk as well as hoped. The credit crisis demonstrated that market risk and credit risk can appear in unexpected ways, and that maintaining strong liquidity positions was more crucial than many had realized.
All the short term profitability in the world cannot help if the system isn’t stable. And today, if you want stability, every discussion must begin with the importance of access to capital.
Capital: the once and future king
In his memoir On the Brink: Inside the Race to Stop the Collapse of the Global Financial System, former U.S. Secretary of the Treasury Henry Paulson reflects back on the credit crisis. One of his conclusions is that the financial system contained too much leverage, much of which was buried in complex structured products:
Today it is generally understood that banks and investment banks in the U.S., Europe, and the rest of the world did not have enough capital. Less well understood is the important role that liquidity needs to play in bolstering the safety and stability of banks…(f)inancial institutions that rely heavily on short-term borrowings need to have plenty of cash on hand for bad times. And many didn’t.
Politicians and regulators have joined hands on capital, proposing measures that would lead to banks holding more of it. Many banks have argued against this approach, claiming that additional capital requirements would affect performance and competition. Yet recent investigations into the correlation between bank capital and profitability suggest that holding additional capital may not be a bad thing. Which is encouraging, since banks will likely have to do it anyway.
Allen Berger and Christa Bouwman’s interests are reflected in the title of their recent paper, “How Does Capital Affect Bank Performance During Financial Crises?” The authors examine the effects of capital on bank performance, and how these effects differ between normal times, banking crises, and market crises. The empirical evidence led Berger and Bouwman to the following conclusions:
First, capital enhances the performance of all sizes of banks during banking crises. Second, during normal times and market crises, capital helps only small banks unambiguously in all performance dimensions; it helps medium and large banks improve only profitability during market crises and only market share during normal times.
Empirical evidence, regulatory measures, and perhaps common sense dictate that holding additional capital is a worthy goal for banks. Yet even if banks wanted to raise capital thresholds, it isn’t as easy as flipping a switch. Capital is no longer cheap or readily available, and funding sources have shrunk. Banks of the future must focus on preserving and leveraging available capital, and on making that capital work harder.
Part of this focus must be organizational. Allocation of capital can no longer be controlled at a business unit, subsidiary, country, or branch level. It needs to be allocated at the point of doing business: to specific customers, business lines, and even individual transactions. Dynamic capital leads to a radically different structure, in which the treasury becomes the ‘owner’ of capital, lending it to deal makers on demand.
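To make the idea concrete, here is a minimal sketch, in Python, of a treasury ‘owning’ a finite capital pool and lending it out deal by deal. The class, figures, and hurdle-rate test are hypothetical illustrations of the concept, not a reference implementation.

```python
# A minimal sketch of transaction-level capital allocation, assuming a central
# treasury "owns" a finite capital pool and lends it to deals on demand.
# All names and figures here are illustrative, not an actual product API.

from dataclasses import dataclass, field


@dataclass
class Treasury:
    capital_pool: float                      # total capital available to lend
    allocations: dict = field(default_factory=dict)

    def request_capital(self, deal_id: str, amount: float,
                        hurdle_rate: float, expected_return: float) -> bool:
        """Grant capital to a single transaction only if it clears the hurdle
        and the pool can absorb it; otherwise the deal is declined."""
        if expected_return < hurdle_rate:
            return False                     # deal does not earn its capital
        if amount > self.capital_pool:
            return False                     # pool exhausted at this moment
        self.capital_pool -= amount
        self.allocations[deal_id] = amount
        return True

    def release_capital(self, deal_id: str) -> None:
        """Return capital to the pool when a transaction matures or unwinds."""
        self.capital_pool += self.allocations.pop(deal_id, 0.0)


treasury = Treasury(capital_pool=100_000_000)
approved = treasury.request_capital("FX-SWAP-0042", amount=5_000_000,
                                    hurdle_rate=0.12, expected_return=0.15)
print(approved, treasury.capital_pool)       # True 95000000
```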
A dynamic treasury requires a deep understanding of the uses and cost of capital, connected to the technological ability to ‘solve’ the problem of Big Data. In the sidebars to this article, my colleagues have expanded on the linked topics of dynamic capital and managing complex data.
Banking on the past
In the early 19th century, the UK limited banking partnerships to six members. No one is suggesting banks return to this restriction. But if we look at the various aspects of the unlimited liability banking model, many of its tendencies are echoed in today’s calls from regulators and stakeholders.
Long-dated compensation reform and shareholder ‘say on pay’ programs can be seen as measures intended to update the shared liability and sense of ownership that partners used to bring to banks. The credit crisis has driven home the importance of liquidity, and the fact that raising capital can be expensive, if it can be acquired at all in times of crisis. In a way this reflects the notion early bankers held that capital was expensive, and that bringing in additional funds or partners would dilute earnings.
The insider lending and specialization of early banks gave way to diversification and fewer restrictions on portfolios; today that shift is being balanced by technologically enabled means of knowing customers better. Enhanced collateral management and approaches like CVA can be used to gain a deeper understanding of capital exposures before entering into an agreement.
If banks are to thrive in the future, preserving and leveraging available capital are crucial steps. Dynamic allocation, enabled by a treasury that quickly and effectively puts available capital to prudent use, could be the defining characteristic of the bank of tomorrow. From the outside, such institutions would look nothing like the Pawtuxet Bank, but they would be related in spirit.
Dynamic Capital Management: Francis Lacan
To visualize the concept of dynamic capital management, think of flying an aircraft as close as possible to the ground. If you fly too high, the cost of fuel becomes unreasonable. You cannot fly below the ground, because that option simply doesn’t exist. The goal is to seek out the most efficient middle course, one that best mimics the changing landscape below.
Managing capital dynamically would enable a bank to determine, on a day-by-day and month-by-month basis, the most efficient flight path for capital and allocate it accordingly. Optimization and anticipation are the two poles guiding such decisions, and between them resides a large set of constraints. Basel III and its liquidity coverage ratio have restricted certain freedoms, particularly in terms of asset qualification. The other set of constraints is risk management: liquidity is increasingly subject to risk management because there are many dependencies between funding liquidity and other risks.
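To illustrate one of those constraints, the short sketch below computes the Basel III liquidity coverage ratio in its simplest form: high-quality liquid assets divided by net cash outflows over a 30-day stress horizon, with inflows capped at 75% of outflows. The figures are invented for the example.

```python
# A simple illustration of the Basel III liquidity coverage ratio (LCR) as a
# constraint: high-quality liquid assets (HQLA) must cover at least 100% of
# net cash outflows expected over a 30-day stress period. Figures are invented.

def liquidity_coverage_ratio(hqla: float, outflows_30d: float,
                             inflows_30d: float) -> float:
    # Inflows are capped at 75% of outflows under the Basel III standard.
    net_outflows = outflows_30d - min(inflows_30d, 0.75 * outflows_30d)
    return hqla / net_outflows


lcr = liquidity_coverage_ratio(hqla=100.0, outflows_30d=200.0, inflows_30d=90.0)
print(f"LCR = {lcr:.0%}")                                   # LCR = 91%
print("compliant" if lcr >= 1.0 else "breach: raise HQLA or cut outflows")
```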
It seems liquidity is following a similar path to what happened with capital and solvency. Banks didn’t invest much in economic capital, but the strong requirement to look at regulatory capital acted as an incentive to build more analytics, report more rigorously, and become more serious about addressing uncertainty with the right tools. What is more complex for capital management is connecting all the sources of information. Pulling cash flow across entities and supporting good cash management still has a lot of room to evolve. There are, for example, too many overlapping information systems that are not very good at talking to one another. Overcoming this hurdle would be a huge step towards active capital management, rebalancing, and optimization.
The current baseline for automation is extremely rudimentary. The only truly mechanical element is the planning of short-term inflows and outflows within the treasury, because their contractual commitments are relatively easy to model. The rest is treated as shocks, and the focus is largely on the shock scenarios regulators are asking banks to address. As a result, banks are being pushed to model with greater consistency what may happen under the different uncertainties tied to cash flows.
In the short term, banks will have to continue on the foundations of operational management of cash and collateral, addressing regulatory requirements for cash flow modeling and forecasting, asset qualification, and scenario modeling. Together, these elements will provide a rugged foundation to move towards automated decision making, and eventually, a more automated approach to at least some aspects of capital management.
This prediction comes with a number of ‘ifs’: if you have a very good, trustworthy, aggregated pool of all internal and external cash balances in all currencies, and if you have access to a very good repository for your treasury operations so you can see your money market positions across those currencies, you could to an extent begin to automate capital allocations for particular areas of the business. Decisions on how to refinance each of these currencies, and perhaps rebalance positions into a smaller number of currencies to save costs or even optimize the risk profile of certain transactions, could in theory be automated.
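As a hypothetical illustration of that first step, the sketch below takes an aggregated, per-currency view of cash balances and flags which currencies need refinancing and which small long balances could be swept into core currencies. The currencies, balances, and threshold are assumptions made purely for the example.

```python
# A hypothetical sketch of the kind of automation described above: given a
# trustworthy aggregated view of cash balances per currency, flag shortfalls
# to refinance and small long tails to consolidate into core currencies.
# Thresholds, currencies, and balances are illustrative assumptions only.

balances = {                                   # all in USD-equivalent terms
    "USD": 42_000_000,
    "EUR": -8_500_000,
    "GBP": 1_200_000,
    "CHF": 350_000,
    "JPY": -90_000_000 * 0.0067,               # yen balance at an assumed rate
}

CONSOLIDATION_THRESHOLD = 500_000              # small balances cost more than they earn

to_refinance = {ccy: bal for ccy, bal in balances.items() if bal < 0}
to_consolidate = {ccy: bal for ccy, bal in balances.items()
                  if 0 < bal < CONSOLIDATION_THRESHOLD}

print("Refinance shortfalls:", to_refinance)          # e.g. EUR, JPY
print("Sweep into core currencies:", to_consolidate)  # e.g. CHF
```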
This won’t happen tomorrow. But with the proper foundation, we have the technology to make dynamic capital management part of the bank of the future.
Data Complexity: Leo Armer
In the Pawtuxet model, a small number of operating partners owned the bank’s data. It was their responsibility to collect information about their clients, and use this knowledge to guide business decisions.
Banks today have challenges managing data, in large part because the acts of collecting and analyzing information have become so separated. The greater this disconnect, the more important transparency becomes.
For both banks and clients, it’s crucial to be able to ask: “If this is my risk number, where did it originate? How do I track it? How can I see which systems it passed through, and what happened to it along the way?” Being able to take a number from a balance sheet or a general ledger and drill back to its origin provides a huge amount of confidence.
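A minimal sketch of that drill-back idea follows, assuming each reported figure carries a record of the systems it passed through and the transformations applied along the way. The system names, figure identifier, and values are hypothetical, not a reference architecture.

```python
# Minimal data-lineage sketch: every reported figure keeps a trail of the
# systems it passed through and what happened to it at each step.
# All names and numbers below are invented for illustration.

from dataclasses import dataclass


@dataclass
class LineageStep:
    system: str            # e.g. "trade capture", "risk engine", "general ledger"
    transformation: str    # what was done to the figure at this step
    value: float


def drill_back(figure_id: str, lineage_store: dict) -> list:
    """Return the full path of a reported number back to its origin."""
    return lineage_store.get(figure_id, [])


lineage_store = {
    "VAR-2024-Q1": [
        LineageStep("trade capture", "raw positions booked", 1_250_000.0),
        LineageStep("risk engine", "99% 1-day VaR computed", 312_400.0),
        LineageStep("general ledger", "aggregated to balance-sheet line", 312_000.0),
    ]
}

for step in drill_back("VAR-2024-Q1", lineage_store):
    print(f"{step.system:15} | {step.transformation:35} | {step.value:,.0f}")
```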
In order to make good decisions, you need to see the big picture. If data complexity is viewed purely as a technological issue, its strategic importance can be overlooked. When institutions attack data issues purely from an IT perspective, rules are created, transformations take place, and the data is considered ‘clean’ after going through a reconciliation process. Various systems and approaches are employed to ensure that the numbers coming out of the front-office system match the numbers coming out of the general ledger, and that these match the numbers coming from treasury.
The problem is, as much as you can clean the data on a Monday, unless you change the people or method of entering that data, it’s going to need cleaning up again on Tuesday.
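A toy version of such a reconciliation check, comparing the same books across three invented sources and flagging breaks above a tolerance, might look like this:

```python
# Toy reconciliation check in the spirit described above: compare the same
# figures as reported by the front-office system, the general ledger, and
# treasury, and flag any break above a tolerance. Sources and numbers are
# invented for illustration.

front_office   = {"book_A": 10_500_000.00, "book_B": 7_250_000.00}
general_ledger = {"book_A": 10_500_000.00, "book_B": 7_248_100.00}
treasury       = {"book_A": 10_500_000.00, "book_B": 7_250_000.00}

TOLERANCE = 1_000.00    # differences below this are ignored

for book in front_office:
    values = {"front office": front_office[book],
              "general ledger": general_ledger[book],
              "treasury": treasury[book]}
    spread = max(values.values()) - min(values.values())
    if spread > TOLERANCE:
        print(f"Break on {book}: {values} (spread {spread:,.2f})")
    else:
        print(f"{book} reconciled")
```

Of course, as the paragraph above notes, a check like this only tells you the data needs cleaning again; it does not fix the way the data is entered.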
Today, a few firms are approaching data complexity from a business perspective. They have put their main focus on creating a single data warehouse where all the information is stored. This approach is based on the insight that every piece of data has a golden source: a reliable point of origin before it gets passed through different hands and different teams. It becomes as much about changing mental attitudes as it is about technical architectures.
Creating a golden source for data becomes even more crucial in light of what has happened in the last couple of years. CVA charges, for example, occur when a bank puts a variable fee on top of a deal depending on whom it is trading with.
If your bank were to trade with another bank with which it had a long history and deep insight into its credit status, that bank would likely get a better price for the trade than a small finance company from Greece that might be looking less solvent.
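The sketch below shows, in simplified form, why the all-in price differs: a unilateral CVA charge computed as loss-given-default times discounted expected exposure times the marginal default probability of the counterparty. All inputs are illustrative assumptions.

```python
# Simplified unilateral CVA sketch: CVA ~= (1 - recovery) * sum over periods of
# discounted expected exposure * marginal default probability. The exposures,
# default probabilities, and discounting are illustrative only.

def simple_cva(expected_exposures, default_probs, recovery=0.4, discount=0.98):
    """expected_exposures[i] and default_probs[i] refer to period i."""
    lgd = 1.0 - recovery
    return sum(lgd * ee * pd * discount ** (i + 1)
               for i, (ee, pd) in enumerate(zip(expected_exposures, default_probs)))


exposures = [1_000_000, 900_000, 750_000]        # expected exposure per year

strong_bank = simple_cva(exposures, [0.002, 0.003, 0.004])
weak_counterparty = simple_cva(exposures, [0.030, 0.040, 0.050])

print(f"CVA vs. well-known bank:     {strong_bank:,.0f}")
print(f"CVA vs. weaker counterparty: {weak_counterparty:,.0f}")
# The weaker name attracts a visibly larger charge, i.e. a worse all-in price.
```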
In these transactions, where is the golden source? Is it in the middle office or front office data? Who is taking ownership over the trades? They can’t be processed the way they used to be; otherwise you’re swimming against the current demand for real-time responses. If it takes six or seven days to work out what happened when a counterparty defaults, you’re too far behind the curve.
Discussions about appointing a CDO, or Chief Data Officer, are becoming more common. A CDO, or at least an institutional mindset that data quality is crucial and strategically relevant, can help banks evolve beyond workarounds and create a repository of golden-source data. Through a framework that provides standards, direction, and architecture to departments throughout the organization, the bank of the future can overcome data complexity.