How the banks threw away their client data

Banks didn’t need to work hard to get good client data before the dawn of the computer age because they already had it. Lending and investment decisions were taken by local managers who knew clients personally and understood discretion. The era of mega-mergers and the arrival of the mainframe demolished much of that knowledge: if it didn’t fit into a bank’s database, it didn’t exist.

Traditional branch and relationship banking

The more respectable of my two grandfathers was a bank manager; the more rakish one ran East-End boozers. The bank manager, who retired young in the 1930s as a successful investor and businessman, was an expert at client data, although he wouldn’t have called it that – he’d have used words like character or credit.

At the time, every high street had several branches of local or regional banks, and lending decisions were taken on the spot by the manager, or for larger amounts by regional managers, but still based on local knowledge. Bank managers lived amongst their clients as respected and important members of the community, ranking alongside the doctor and priest. They knew the credit of an individual not just from the transactions in his or her account, but from a myriad of small pieces of what today we would call data and metadata: house and business ownership, character, family, habits and reputation.

Almost all business was done face-to-face and customers could wander through the front doors into a sometimes lavish banking hall. The employees would often know them and greet them by name. Client data gathering was done by a highly distributed human network, backed up with paper files. Decisions could be made fast, with none of today’s frustration of dealing with automated chat or a call centre. The story was similar in most advanced economies: in Germany the local Sparkassen, and in the US strong regional banks, financed businesses, while other local institutions, often mutuals, funded house ownership.

Strong regional and local banks

My grandfather worked for the Manchester and Liverpool District Banking Company, known as the District Bank, a good example of the strength and virtues of local and regional banks at the time. It was founded in 1829 as Manchester and Liverpool began to boom with engineering and manufacturing, then mostly the textile and cotton trade, Britain’s largest export sector until foreign competition took off after WWI. Its founders were local businessmen who had started out broking stocks, and they knew their area and the businesses in it – an area that in the 1930s still hung on to much of its Victorian prosperity, with newer industries like aerospace and chemicals starting up.

By the 1930s it had 150 branches across the North and North-West of England. The manager would know most customers by name, and would often interview them before accounts were opened or cheque books were issued: face-to-face data acquisition. Successful businesses thrived because of the strength of their banking relationships and the data embedded in them.

So someone like Bill Parkinson started his now worldwide Lifting Gear Hire business with an overdraft of £2,500 secured on his house. His manager at the local NatWest, which had just merged with and re-branded the District, decided to lend to him the first time they met, on his own judgement and without any higher approvals. As his business grew, it was all financed with loans secured on the equipment he was leasing out to engineering customers. He’s written about it in Lifting the World. Read it if you have a moment – it’s excellent.

Good data but antiquated delivery

Yet it wasn’t all rosy, and there were a lot of problems with banking then compared to now. Cheques could take days to settle and clear, and lending was prey to all the prejudices and snobbery you might expect in a more traditional society. If you were unlucky enough to have a poor or lazy bank manager, then all the valuable data about your potential and character was worthless. Lack of local competition meant you probably had nowhere else to turn. Bank charges were high even for mundane transactions, and bank opening hours were ludicrously short by today’s standards – branches often closed by 3pm, since all accounts had to be balanced by hand and cash counted and reconciled.

Even worse, most people were shut out of the system completely and couldn’t even get access to an account. As a result, most working people lived entirely cash-based lives with weekly pay packets. Opening an account often required recommendations from at least two existing customers, and even then you faced an interview with an intimidating individual like my grandfather, who, despite his many strengths and huge talents, was never the sunniest or easiest character to deal with.

Mergers damaged local knowledge

UK banks like the District began to modernise in the 60s and 70s, and mergers took off. The old District Bank was acquired by National Provincial, which in turn merged with the Westminster Bank to form the National Westminster Bank in 1970, helping to turn the “Big Five” and the “Little Six” into the “Big Four” UK banks that we know today – the Midland having since been subsumed into HSBC. The rationale for such mergers was the then-current vogue for “economies of scale”, which in banking really meant the perceived need to share the very expensive investments in new technology that were just starting in the new era of mainframes.

In truth, like most mega-mergers, it was driven more by fashion and ego. It was the era of conglomerates, vertical integration and financial engineering. Governments and economists were in love with economic planning, and mergers were seen to create efficient ‘national champions’. The ambition was to have a nationwide branch network, and overlapping local branches were closed to realise cost savings. Client service, relationships and data were not the priority and were often lost in the process. Industrial strategy in banking, as in many other industries like cars, gave little thought to the needs of the end customer. Luckily for the banks, regulation protected them from the foreign competition that swept away many other loopy national champions.

Mainframes murdered it

The UK clearing banks were among the earliest adopters of computing and mainframes in the 1960s. Some of the early machines, like the Emidec 1100 and the Burroughs range, look hilarious by our standards, before the epochal IBM System/360 and later the Z-series came to dominate. Much of the code was custom-written in COBOL, a language so dated that it used to be shorthand for obsolescence, yet it is still used in almost half of all UK banking systems.

The modern fintech world looks at such systems and chortles in merriment. Mainframes are still the backbone of UK bank transaction systems and are a huge challenge to adapt to the modern world of apps and open banking. But at the time they brought huge gains in efficiency, as banks could move beyond paper ledgers and manual reconciliation. For the first time, a bank could, in theory, have a single live snapshot of its business. New technology like cashpoints, which Barclays and the UK banks were the first to introduce, allowed cash to be withdrawn from any machine outside opening hours.

The other big benefit, from the perspective of London management, was efficiency and cost-saving. Applications for any financial product could be entered onto the mainframe, laboriously transferred from paper of course, and then decisions made centrally. Local staff in branches gradually saw their decision-making power ebb away and were increasingly seen by management as retail assistants, with pay and treatment to match.

The tyranny of the relational database and integration

The first banking systems were “electronic bookkeeping systems” which gradually evolved into much larger centralised relational databases – effectively huge spreadsheets of information linked with unique keys. Many were built in-house, although increasingly they were bought in from IBM and new startups like Oracle and Sybase. The problem was that relational databases are inflexible and only store the basic data that the schema demands. Even worse, storage used to be expensive, initially on paper punch cards and later on disk. Even in 1967 a one-megabyte hard disk cost almost £400,000 – similar storage today would cost a penny. Storing anything but basic transaction and account data was prohibitively expensive, so all of the soft data about character, family, community and history was largely lost: there was no field on the mainframe form that captured it.
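
To make this concrete, here is a toy sketch in Python, using an invented three-column schema (not anything a real bank ran), showing how a rigid relational table silently drops whatever its columns don’t allow for:

```python
import sqlite3

# Toy illustration only -- an invented schema, not any real bank's.
# A relational table keeps exactly the columns its schema defines and nothing else.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE account (
        account_no   INTEGER PRIMARY KEY,
        name         TEXT NOT NULL,
        balance_gbp  REAL NOT NULL
    )
""")

# What the branch manager actually knew about a customer...
customer = {
    "account_no": 10123,
    "name": "J. Smith",
    "balance_gbp": 250.0,
    # ...and the "soft" data with no column to live in:
    "notes": "Owns his house outright, reliable engineer, good local reputation",
}

# Only the fields the schema demands survive the insert; the notes are simply dropped.
conn.execute(
    "INSERT INTO account (account_no, name, balance_gbp) VALUES (?, ?, ?)",
    (customer["account_no"], customer["name"], customer["balance_gbp"]),
)

print(conn.execute("SELECT * FROM account").fetchall())
# [(10123, 'J. Smith', 250.0)] -- character, family and reputation are gone.
```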

Today we assume that computer systems can exchange data in microseconds, but from the 1960s onwards integration was a constant nightmare: the Burroughs mainframe in Willesden couldn’t talk to the NCR terminals in the branch, or even to the cheque-scanning system, without lengthy custom programming, and neither could communicate easily with the IBM mainframe in another region. The workaround was often humans laboriously retyping information from one system to another or, at best, queuing up changes to be applied once a day – batch processing. As bank mergers took off, more and more disparate systems had to be connected or migrated, and the results could be disastrous. Economies of scale were elusive and failures were common, so bank management became even more conservative about change and innovation. Client data technology was increasingly seen as a potentially career-ending problem rather than an opportunity.
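
For readers who haven’t met it, the sketch below shows the basic shape of batch processing, with an invented flat-file record format rather than any real interchange standard: changes pile up in a queue during the day and nothing reaches the ledger until the overnight run.

```python
from collections import defaultdict

# A minimal sketch of batch processing, assuming a hypothetical flat-file
# record format -- not any real bank's. Branch systems that couldn't talk to
# the central mainframe wrote each change to a file during the day; one
# overnight run applied the whole queue at once.

# One record per line: account_no|amount (credits positive, debits negative)
days_queue = [
    "10123|+250.00",
    "10123|-40.00",
    "20456|+1200.00",
]

def run_nightly_batch(records, balances):
    """Apply every queued record in one pass; nothing is visible until the run."""
    for record in records:
        account_no, amount = record.split("|")
        balances[account_no] += float(amount)
    return balances

balances = defaultdict(float)
print(dict(run_nightly_batch(days_queue, balances)))
# {'10123': 210.0, '20456': 1200.0} -- correct, but always a day out of date.
```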

Even banks that bit the bullet and migrated legacy systems to a single new platform had some well-publicised disasters, like TSB’s complete system failure, which cost it £366m in compensation and repair. Banks seemed by now to be stuck with poor client data. All but the smallest credit decisions were now made centrally, or at least regionally, and loan administration was equally computer-driven. The sympathetic, wise and informed local bank manager was rapidly turning into a senior shop assistant.

The rise of the regulator and data protection

By the time of the 2008 global banking crash, NatWest had been swallowed by RBS, and the publicity around banking tech failures convinced the new FCA and PRA that they had to intervene and ensure banks managed their tech infrastructure much more carefully. Similar changes happened across the G7. Infrastructure is now much more robust as a result, but management are, if anything, even more careful and conservative about tech and data innovation. Any change to a legacy system is, probably rightly, seen as a huge risk.

The introduction of stronger data protection regulations (GDPR) in 2018 has compounded the problem. Consumers have the right to demand access to, or deletion of, any data that a bank holds on them. That’s simple enough for a new fintech with a single integrated tech platform, but much harder for traditional banks with unlinked legacy systems and CRMs that may hold data stretching back decades. A lot of old client data has been discarded as a result. New initiatives that might hold or gather personal data are often seen as dangerously risky too.
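
A rough sketch of why that fan-out hurts, with store names and customer keys invented purely for illustration: an erasure request is only straightforward when every system holding the customer’s data can be found and keyed, which a single integrated platform makes trivial and a patchwork of unlinked legacy systems does not.

```python
# Hypothetical store names and keys, for illustration only; a real bank's
# estate would be far larger and messier.
legacy_stores = {
    "core_banking":    {"cust_10123": {"name": "J. Smith", "balance": 210.0}},
    "mortgage_system": {"JSMITH-88":  {"property": "12 High St", "term_years": 25}},
    "old_crm_archive": {"smith_j":    {"notes": "complaint logged 1997"}},
}

def erase_customer(stores, keys_by_store):
    """Delete a customer's records everywhere -- but only if every system's key is known."""
    missing = [s for s in stores if s not in keys_by_store]
    if missing:
        # With unlinked legacy systems there is often no shared customer ID,
        # so the request cannot be completed with confidence.
        raise LookupError(f"No known key for: {missing}")
    for store, key in keys_by_store.items():
        stores[store].pop(key, None)

# A single integrated platform would need one ID; here each system needs its own mapping.
erase_customer(legacy_stores, {
    "core_banking": "cust_10123",
    "mortgage_system": "JSMITH-88",
    "old_crm_archive": "smith_j",
})
print(legacy_stores)  # all three records removed, but only because every key was mapped
```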

Over decades the banks have moved from strong local knowledge of the client to centralised impersonality. Telephone and online banking have almost completely broken the idea of a human relationship or understanding of the client.


This is the first draft of a blog and we’re always interested to hear corrections or comments. Do email us at [email protected] if you have any.

In our next blog we’ll cover the impact of poor client data and later, how we are working on fixing the problem and giving full control to the client.
