Phil Teplitzky

Shift in the Computing Model for the Financial Services Sector

Updated: Nov 3, 2021


There has, in recent years, been a significant paradigm shift in computing. Under the original computing model, you wrote a generalized program and brought separate sets of data to that program.

This was the model used back when Alan Turing designed the Bombe to break the Nazis’ Enigma cipher, and by the people who built ENIAC at the Moore School of Electrical Engineering to compute artillery ballistic tables. And this is the way computing continued for many years: you wrote an application, and you brought different sets of data (files) to it. For those in the financial services sector, it meant that overnight you ran the Accounts Payable, DDA, or TDA application against that day’s files. Essential activities were always built on algorithms, data structures, and files.

But recently there has been a significant shift in the model – today we bring the applications to the data! With the advent of modern databases, data warehouses, and data lakes, many programs share the same data structures at the same time, and that is a fundamental change in the computing model.

What are the implications of this change in the fundamental paradigm? As I see it, there are six key changes that financial services organizations need to take into account. These are:

  1. Data takes on a new role. It becomes ever more important that the data in the shared structures be accurate, correct, authorized, and timely, and that it support the needs of an ongoing concern: for example, that it be available for backup and recovery and resilient enough to meet data security requirements.

  2. Data Governance and Master Data Standards, Procedures, and Guidelines must be observed. All programs that access the shared data structures must follow the Data Governance Standards, or the integrity and reliability of the data will degrade.

  3. CRUD (Create, Read, Update, and Delete) analysis assumes a new level of importance. When there was one application and one data structure, the CRUD analysis was limited to which part of the program was doing what. Now the essential question is which part of which program is doing what to which data, a far more complex and challenging task; a sketch of such a cross-application CRUD matrix follows this list.

  4. Edit and validation coding must be consistent across all applications, so the same rules are enforced no matter which program touches the data; a minimal shared-validation sketch also follows this list.

  5. Documentation must be kept up to date and must accurately reflect what each program is actually doing against the shared data.

  6. The System of Internal Control and the Audit Processes must be upgraded to account for this more complex and interdependent ecosystem.
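
To make the cross-application CRUD matrix in point 3 concrete, here is a minimal sketch in Python. The application names, tables, and access records are hypothetical placeholders; in practice the inventory would come from code scans, a data catalog, or database audit logs rather than being hard-coded.

```python
# Minimal sketch of a cross-application CRUD matrix.
# Application names, tables, and access records are hypothetical examples;
# a real inventory would come from code scans or database audit logs.
from collections import defaultdict

# (application, table, operation) -- operation is one of C, R, U, D
accesses = [
    ("AccountsPayable", "VENDOR_MASTER", "R"),
    ("AccountsPayable", "PAYMENTS", "C"),
    ("DDA_Posting", "CUSTOMER_MASTER", "R"),
    ("DDA_Posting", "ACCOUNT_BALANCES", "U"),
    ("TDA_Maturities", "ACCOUNT_BALANCES", "U"),
    ("DataWarehouseETL", "ACCOUNT_BALANCES", "R"),
]

# Build the matrix: table -> application -> set of operations
matrix = defaultdict(lambda: defaultdict(set))
for app, table, op in accesses:
    matrix[table][app].add(op)

# Print the matrix and flag tables written by more than one application,
# since those are where cross-program CRUD conflicts can arise.
for table, apps in sorted(matrix.items()):
    writers = [a for a, ops in apps.items() if ops & {"C", "U", "D"}]
    flag = "  <-- multiple writers" if len(writers) > 1 else ""
    ops_str = ", ".join(f"{a}:{''.join(sorted(ops))}" for a, ops in sorted(apps.items()))
    print(f"{table}: {ops_str}{flag}")
```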
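
On the consistency of edit and validation coding in point 4, one common pattern (a general illustration, not a prescription from this article) is to centralize the rules in a single shared module that every application calls before it writes to the shared structures. The field names and rules below are hypothetical.

```python
# Hypothetical shared validation module: every application that writes
# customer records calls these same checks, so the rules cannot drift
# between programs. Field names and rules are illustrative only.
import re
from datetime import date

ACCOUNT_ID_PATTERN = re.compile(r"^\d{10}$")  # assumed 10-digit account IDs

def validate_customer_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    if not ACCOUNT_ID_PATTERN.match(record.get("account_id", "")):
        errors.append("account_id must be a 10-digit number")
    if record.get("balance") is None or record["balance"] < 0:
        errors.append("balance must be present and non-negative")
    opened = record.get("opened_on")
    if not isinstance(opened, date) or opened > date.today():
        errors.append("opened_on must be a date no later than today")
    return errors

# Usage: every writing application performs the same check before persisting.
record = {"account_id": "0123456789", "balance": 250.00, "opened_on": date(2020, 5, 1)}
problems = validate_customer_record(record)
if problems:
    raise ValueError("; ".join(problems))
```

Because every writing program calls the same function, a rule change is made once and takes effect everywhere, which is exactly the consistency a shared data environment demands.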

In summary, the paradigm shift and the increased importance of the shared data environment require a corresponding change in the System Development Life Cycle (SDLC) and related organizational structures. But that’s a topic for another day.

About The Author

Mr. Teplitzky is a member of HP Marin’s Executive Office and is the firm’s CTO. He has more than forty years of experience in technical and managerial positions in both corporate and consulting environments. He was a National Director at Coopers & Lybrand, responsible for data and architecture, in addition to being a Founding Partner of The Plagman Group. Mr. Teplitzky was also the CIO at The Harry Fox Agency, CTO for The VitaminShoppe.com, and Head of Data Architecture for Citibank Retail.
