Banking Fintech Security

The real reason for core systems refreshment

Written by Chris Skinner

Are banks’ core systems secure enough for 21st century banking? Chris Skinner finds out.

I’ve been advocating for some time that banks should refresh their core systems: a complete renewal of the back end. Everyone tells me I’m an idiot for saying so. It’s impossible, stupid, naive and impractical. OK, I hear that. I know it’s not going to be easy, but if a bank runs systems built before Mark Zuckerberg was born, how can those systems be fit for the real-time, free world of the mobile internet?

But here’s the thing: I don’t advocate the renewal purely to be fit to market to the 21st century consumer using contextual data analytics, although that’s useful and virtually impossible when you have fragmented back-end systems. Equally, I don’t say you need to do this purely to enable consistency of access across digital media, although that’s a great improvement over the inconsistency created by having multiple channels of legacy. Nor do I say it just because old systems typically work on overnight batch updates that cannot keep up with real-time needs. Finally, I don’t say it because old systems regularly suffer glitches, although they do (you’ll find a selection from this year at the end of this blog entry). No. Replacing core systems gives you a lot of benefits, including:

  • Real-time provision of service
  • Consistency of data
  • Ability to leverage deep data analytics
  • Single view of the customer
  • Enterprise information leverage.

But perhaps the greatest benefit of consolidating into a single service is risk management. This is evidenced by a fascinating article in the Harvard Business Review this month, talking about lessons in cybersecurity from the US Department of Defence. The focus of the article is the risk factors of cyberattack, which, as you can imagine, the Pentagon takes fairly seriously. The aim is to provide a few lessons for business to learn, and here are a few headlines:

  • From September 2014 to June 2015, the US military repelled more than 30 million known malicious attacks at the boundaries of its networks. Of the small number that did get through, less than 0.1% compromised systems in any way.
  • In a 2014 study by the Ponemon Institute, the average annualized cost of cybercrime incurred by a benchmark sample of US companies was $12.7m, a 96% increase in five years. Meanwhile, the time it took to resolve a cyberattack had increased by 33% on average, and the average cost incurred to resolve a single attack totalled more than $1.6m.
  • Over the past three years, intrusions into critical US infrastructure – systems that control operations in the chemical, electrical, water and transport sectors – have increased 17-fold.
  • The US Department of Defence experiences 41 million scans, probes and attacks a month.
  • The annual global cost of cybercrime against consumers is $113bn [2013 Norton Report, Symantec].
  • The Department of Defence is consolidating 15,000 networks into a single, unified architecture.

That last point is the critical one, and perhaps the key paragraph in the article describes what the Department of Defence is doing to overcome the issue of cyberattack:

“Back in 2009, the Defense Department comprised seven million devices operating across 15,000 network enclaves, all run by different system administrators, who configured their parts of the network to different standards. It was not a recipe for security or efficiency. [The creation of US Cyber Command] brought network operations across the entire .mil domain under the authority of one four-star officer. The department simultaneously began to consolidate its sprawling networks, collapsing the 15,000 systems into a single, unified architecture called the Joint Information Environment. What once was a jumble of more than 100,000 network administrators with different chains of command, standards, and protocols is evolving toward a tightly run cadre of elite network defenders.”

And, although the US Cyber Command has been upgrading the military’s technology to quickly detect anomalies, “one key lesson of the military’s experience is that while technical upgrades are important, minimizing human error is even more crucial.”

This is why the Pentagon treats security as a culture challenge rather than a technological challenge. At the heart of that culture are six interconnected principles:

  • Integrity
  • Depth of knowledge
  • Procedural compliance
  • Forceful backup
  • A questioning attitude
  • Formality in communication.

It’s a useful insight into the way in which the military are approaching cyberdefence, and the key is to ensure not only that the technologies are up to date but, more importantly, that the people are trained to be vigilant.

Here are a few glitches in the UK since 1 June 2015:

This article is reproduced with kind permission. Some minor changes have been made to reflect BankNXT style considerations.

About the author

Chris Skinner

Chris Skinner is an independent commentator on the financial markets through the Finanser, and chair of the European networking forum the Financial Services Club, which he founded in 2004. He is an author of numerous books covering everything from European regulations in banking through to the credit crisis, to the future of banking.
