Forward Thinking

Technology is Changing Cultures

Long Predicted, Finally Reality

18 November 2020

Remote working has tested the resilience of fund managers’ IT infrastructures and the effectiveness of their governance structures. The pandemic has also confirmed how important it is to have sufficient operational expertise to create and curate high quality data across all business domains. If there was ever any doubt, technological and operational capabilities are now acknowledged as business-critical competencies.

Threats can create opportunities

After a long period of significant remote working it is obvious that savings can be made on city centre office space without impairing overall business performance. We will all work in offices again, but in smarter and more flexible ways than before the pandemic. Firms fortunate enough to have some flexibility in their accommodation costs are reallocating budgets from premises to projects – investing to further reduce costs and improve competitiveness, with payback periods of only one or two years. 

As for technology infrastructure, the pandemic has demonstrated that cloud computing is now mature and safe, and surely calls into question any further investment in on-premises IT infrastructure by firms for which managing hardware is not a core competence.

Business Services – not “back-offices”

In the main, boards understand the importance of IT & Operations in their roles of servicing, advancing and protecting the wider business and its clients. An efficient Operations team ensures that investment decisions are made on the basis of accurate and complete data. Furthermore, Operations supports the roll-out of new products and is often able to prevent or rectify operating errors that occur in other parts of the organisation before they affect clients or the firm’s P&L. Without effective and resilient IT & Operations functions, firms cannot manage their risks and reputations. The so-called “back-office” has finally been acknowledged as a business enabler.

Leading firms are increasing “build the firm” budgets for projects that simultaneously reduce operating costs, lower operational risks and advance front-to-back resilience through the use of cloud computing.

Enriching C-Suite pools with technology expertise

Over the years, many fund managers have appointed NEDs with technology backgrounds to their boards. This helps ensure that the technology and data strategy is understood and supported at the highest level of an organisation. At the executive level, a number of C-Suite positions have also been created, such as Chief Data Officer, Chief Transformation Officer and Chief Digital Officer, to complement the more traditional COO and CTO roles.

However, representation on a committee is secondary. What matters is that teams collaborate and think critically and creatively about how to progress together. COOs and CTOs have always been champions for change, with remits and skill-sets well beyond those of a subject matter expert.

The New Now

Hierarchical structures need to be gradually adjusted to deal with the obvious current challenges to firms’ cultures and modi operandi. Operating in a COVID-19 environment for a long period of time is likely to redefine the meaning of business culture, and rigid governance structures must be relaxed to ensure the acquisition, development and retention of talent.

A successful digital transformation of an organisation requires talent, technology, trust and above all, teamwork. 

markus.ruetimann@aprexo.com

A Data Mastering Solution for Sustainability

5 October 2020

Surface - Control - Use

ESG data brings critical insights into opportunities and risks and is rapidly evolving to become an integral component of the investment data set and decision framework. Firms are seeking to innovate with new products and also to integrate ESG factors into existing funds. Demand for differentiated ways of managing ESG data is high across many parts of the asset management ecosystem.

Whether the ESG data is used to identify alpha, profile and manage risk, or enhance corporate stewardship and environmental engagement, each requirement creates its own data challenges. Integrating multiple sources of ESG data into complex multi-asset investment processes presents a major challenge for most firms.

The situation is made even more complex by the lack of standardised ratings and methodologies. Each security will be rated differently by each data provider. Accordingly, any ESG-ready data store needs to be flexible by design, and readily adaptable once in use, in order to support changing requirements from internal and external clients.
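
To make that flexibility concrete, here is a minimal sketch in Python of one way such a store could be shaped; the names and structure are our own illustrative assumptions, not a description of Aprexo’s schema. Keying each value by security, provider and field lets divergent provider methodologies sit side by side, and adding a new field requires no schema change.

    from dataclasses import dataclass
    from typing import Any

    # Illustrative sketch only: ratings keyed by (security, provider, field)
    # so that each provider's differing view of the same security coexists.

    @dataclass(frozen=True)
    class ESGKey:
        security: str    # e.g. an ISIN
        provider: str    # each provider may rate the same security differently
        field: str       # e.g. "esg_score", "carbon_intensity"

    class ESGStore:
        def __init__(self):
            self._data: dict[ESGKey, Any] = {}

        def put(self, security: str, provider: str, field: str, value: Any) -> None:
            self._data[ESGKey(security, provider, field)] = value

        def ratings(self, security: str, field: str) -> dict[str, Any]:
            # All providers' views of one field, side by side.
            return {k.provider: v for k, v in self._data.items()
                    if k.security == security and k.field == field}

    store = ESGStore()
    store.put("US4592001014", "ProviderA", "esg_score", "AA")   # letter scale
    store.put("US4592001014", "ProviderB", "esg_score", 71.5)   # numeric scale
    print(store.ratings("US4592001014", "esg_score"))
    # {'ProviderA': 'AA', 'ProviderB': 71.5} -- the divergence is preserved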

Aprexo’s very successful ‘Surface - Control - Use’ paradigm for every aspect of the asset management data universe can incorporate any ESG analytics and data into its extensible database, at each stage of the investment process.

The advent of this new class of investment data presents an opportunity to manage it in a modern, bitemporal way, fully lineaged and with the provenance of all data items recorded at the most granular atomic level. Only then can real business value be created from it. This can be done by storing ESG data in Aprexo’s immutable and bitemporal Data Mastering Solution, which facilitates easy access to everything in it thanks to its API-led design.

Aprexo’s DMS supports workflows from research, portfolio construction and analytics through to risk and reporting. Our solution captures, validates and maps ESG factors to different business functions across the investible issuer and securities universe. It delivers the widest possible view of this emerging data set and thereby drives informed decision making.

markus.ruetimann@aprexo.com

The Nature of Data

21 September 2020

When designing a Data Mastering System (DMS), it’s all about the data. Data comes in many shapes and forms, and within every set of data there are hidden rules and complexities that are important to understand in order to process, store, surface, control and use that data; its nature, if you will. This is the third of three articles about the nature of data.

Part 3. The Timeliness of Data

In many financial systems today, timeliness of data is a real issue. 

If we look back 20 years and consider retail bank accounts, knowing your balance and what you had spent meant walking into a branch and asking, or waiting for a monthly statement in the post. Fast forward 10 years, and telephone and online banking eased access, but even then your balance and list of transactions were updated only once a day and were not available until the following day. Today retail bank accounts provide near real-time updates of balances and transactions, and challenger banks such as Revolut, Monzo and Starling offer mobile alerts as transactions are processed; buying a coffee and seeing the money leave your bank account before you’ve had your first sip is now a common occurrence.

In asset management, most systems still operate as retail banking did over 10 years ago: an accurate view of positions (the balances) is not available until the next day. Whilst clearly far from ideal, this is still widely accepted as the standard.

So, what is the point? Well, if you have a near real-time view of positions, you are able to make more informed decisions. Knowing how much cash you have to spend minute by minute allows for better investment of that cash; knowing if a mistake occurs in near real-time allows for correction of that mistake sooner, potentially limiting the damage; knowing if the market is moving against you and seeing how it affects your portfolio tick by tick, allows for immediate action; and knowing your costs intraday gives you a chance to optimise those costs. More information is generally considered better, more timely information doubly so.
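
As a minimal sketch of what this looks like in practice (the names and structures below are illustrative assumptions, not any particular product’s API), an event-driven book applies each transaction the moment it arrives, so positions and investible cash are current to the last tick rather than to yesterday’s batch:

    from collections import defaultdict
    from datetime import datetime

    # Illustrative near real-time position keeping: each transaction event
    # updates positions and cash as it arrives, not in an overnight batch.

    class LivePositions:
        def __init__(self, opening_cash: float):
            self.cash = opening_cash
            self.positions: dict[str, int] = defaultdict(int)

        def on_transaction(self, security: str, quantity: int,
                           price: float, when: datetime) -> None:
            # Apply the event immediately; the view is current to the last tick.
            self.positions[security] += quantity
            self.cash -= quantity * price
            print(f"{when:%H:%M:%S} {security}: {self.positions[security]} "
                  f"shares, cash {self.cash:,.2f}")

    book = LivePositions(opening_cash=5_000_000.0)
    book.on_transaction("IBM", 10_000, 130.0, datetime(2020, 9, 21, 9, 31, 5))
    book.on_transaction("IBM", -4_000, 131.2, datetime(2020, 9, 21, 11, 2, 48))
    # Investible cash is known minute by minute, not at tomorrow's batch.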

Some in asset management don’t see the need for a near real-time system, which is understandable as many portfolios are only rebalanced 2 or 3 times a week, sometimes even less. But if you were to buy or design a modern system today, why wouldn’t you choose a near real-time one, for all the benefits above and the new ones which will arise in the future? Today, would you sign up for a retail bank account from 10 years ago or would you go with one that sent you real-time mobile alerts?

In many financial systems much of the data created intraday is not usable until the next day, which leaves opportunities for investment returns, and for cost reductions, on the table. It is, as the saying goes, playing with one hand tied behind your back!

markus.ruetimann@aprexo.com

The Nature of Data

14 September 2020

When designing a Data Mastering System (DMS), it’s all about the data. Data comes in many shapes and forms, and within every set of data there are hidden rules and complexities that are important to understand in order to process, store, surface, control and use that data; its nature, if you will. This is the second of three articles about the nature of data.

Part 2. The Temporality of Data

Understanding how time applies to data is a critical concept when it comes to modelling data correctly.

Most financial systems, in broad terms, have what is referred to as “static” data, which has no associated date, and “time-series” data, which has a single associated “valid for” business date. Examples of static data are country or currency data used for reference purposes, and this data is valid across all dates; whereas examples of time-series data are stock prices and holdings, which are valid for a given business date.

So where is the problem with this approach? Consider the following scenario: a portfolio manager (PM) at 6pm on Monday evening runs an end of day (EOD) holdings report for a portfolio and notes that it holds 1M shares of IBM. At 1am on Tuesday morning a purchase of 200K shares of IBM that was traded on Monday, but delayed, hits the system. At 7am on Tuesday morning the PM runs Monday’s EOD holdings report again, but now finds that it has changed to reflect a holding of 1.2M shares of IBM. Given this, how does the PM confirm that the holdings were read correctly the previous evening? If they were, how does the PM find out what happened overnight? The related cash accounts might now be overdrawn and have incurred fees due to the purchase of the unexpected 200K shares of IBM, so was this a mistake or a cost that couldn’t have been avoided? 

To investigate, the PM could just recreate the EOD holdings report from Monday at 6pm that showed 1M shares of IBM and compare the differences, right? Unfortunately not. The problem is that the holdings data only contains a single business date and does not track the date and time at which it was added to or updated in the system. The PM can only ask for the holdings data using this business date, Monday, which returns only the latest data for that business date; in this case, 1.2M shares of IBM. To be fair, in most systems it would be possible to look at the transaction history and work out what occurred overnight and why, but this would take time and effort to diagnose given an incomplete understanding of what had happened in the first place.
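
The dead end can be seen in a few lines of code. In the sketch below (a minimal illustration with assumed names, not any real system’s schema), holdings are keyed only by business date, so the late trade overwrites Monday’s figure in place and the 6pm view is unrecoverable:

    from datetime import date

    # Single-date model: holdings keyed by (portfolio, security, business date).
    holdings: dict[tuple[str, str, date], int] = {}
    monday = date(2020, 9, 14)

    # Monday 6pm: the EOD report reads 1M shares.
    holdings[("PORT1", "IBM", monday)] = 1_000_000

    # Tuesday 1am: the delayed 200K purchase updates the same key in place.
    holdings[("PORT1", "IBM", monday)] = 1_200_000

    # Tuesday 7am: the only answer available for Monday is the new one.
    print(holdings[("PORT1", "IBM", monday)])  # 1200000 -- the 1M view is gone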

There is a better way: “bi-temporality”! As the name suggests, this means two timelines for data. In practice it means adding to all data the date and time at which the system came to know about it, i.e. when it was loaded into, committed to, or received by the system. Static data thus becomes a time-series of when it was known to the system, and stock prices and holdings become a “bi-temporal” time-series, where a valid-for business date and a when-known system date-time are tracked independently.

Given this, it is then possible for the PM to run a report asking the system what it knew at 6pm on Monday evening (also known as “as-at” a given system date and time) about the EOD holdings for Monday (also known as “as-of” a given business date), versus what the system knows now about the EOD holdings for Monday. A well-designed bi-temporal data system allows, for instance, a PM to run a report for what a given portfolio looked like to the system 6 months ago, at 2:49pm on a Tuesday.
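
Extending the earlier sketch makes the two timelines concrete. The store below is a minimal illustration with assumed names (not Aprexo’s API): rows are only ever appended, each carrying both a business date and a system time, so the same “as-of Monday” question gives different, and correct, answers depending on the “as-at” time of the query:

    from dataclasses import dataclass
    from datetime import datetime, date
    from typing import Optional

    # Illustrative bi-temporal holdings store; names are assumptions.

    @dataclass(frozen=True)
    class Holding:
        portfolio: str
        security: str
        quantity: int
        business_date: date      # "as-of": the date the holding is valid for
        system_time: datetime    # "as-at": when the system learned of it

    class BitemporalStore:
        def __init__(self):
            self._rows: list[Holding] = []   # append-only: rows never change

        def record(self, holding: Holding) -> None:
            self._rows.append(holding)

        def holdings(self, portfolio: str, security: str,
                     as_of: date, as_at: datetime) -> Optional[Holding]:
            # Latest row for the business date, considering only what the
            # system knew at 'as_at'.
            candidates = [r for r in self._rows
                          if r.portfolio == portfolio and r.security == security
                          and r.business_date == as_of and r.system_time <= as_at]
            return max(candidates, key=lambda r: r.system_time, default=None)

    store = BitemporalStore()
    monday = date(2020, 9, 14)
    store.record(Holding("PORT1", "IBM", 1_000_000, monday,
                         datetime(2020, 9, 14, 17, 30)))
    # Delayed trade arrives at 1am Tuesday, still valid for Monday:
    store.record(Holding("PORT1", "IBM", 1_200_000, monday,
                         datetime(2020, 9, 15, 1, 0)))

    # What did the system believe at 6pm Monday?  -> 1,000,000 shares
    print(store.holdings("PORT1", "IBM", monday, datetime(2020, 9, 14, 18, 0)))
    # What does it believe on Tuesday morning?    -> 1,200,000 shares
    print(store.holdings("PORT1", "IBM", monday, datetime(2020, 9, 15, 7, 0)))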

Bi-temporality is a powerful concept that creates the ability to “time-travel” through data. It enables a fundamental understanding of data and the decisions that were taken using that data. This provides value across many areas of finance: from compliance, audit and risk, where a given report or set of data can be explained and reproduced exactly, to research and portfolio management, where learning from the past is key to investing for the future.

markus.ruetimann@aprexo.com

The Nature of Data

9 September 2020

When designing a Data Mastering System (DMS), it’s all about the data. Data comes in many shapes and forms, and within every set of data there are hidden rules and complexities that are important to understand in order to process, store, surface, control and use that data; its nature, if you will. This is the first of three articles about the nature of data.

Part 1. The Mutability of Data

The dictionary definition of mutability is “the liability or tendency to change”. This is apt, as nearly all systems used in finance today mutate data as a matter of course, and this can create several “liabilities”.

Data that is mutable can be changed, overwritten and deleted, and when this happens the previous version of that data is lost. Why is this a problem? Consider this scenario: a portfolio manager (PM) holds 1M shares of IBM, bought for $130 per share, and the share price in the system then changes to $150. The PM instructs the trading desk to sell those equities at best market price, believing that to be $150, but the price increase was due to a market data error and in fact the market price had decreased to $110. In the system, the $150 intraday price is corrected to $110, overwritten with no record of it ever having been $150. How does the PM justify their decision and the resulting loss, and how do they explain it to clients without an audit trail of what happened? How is the market data error investigated without any evidence?

But surely no system would do this; it seems fundamental? You would be surprised at how many systems today overwrite intraday prices and positions!

So, what is the solution? That would be immutability: “something which cannot change after it has been created”. Mutable data can change after a user or system has read it, or even while they are reading it, which can cause data inconsistencies and invalid results; it is not possible to go back and see what the data looked like before it was changed, and there is no history from which to understand what happened in the past. Immutable data, by contrast, has many benefits: it can be copied around without fear of those copies being out of date; two or more users or systems can access it at the same time, since it is unchanging and read-only; and for audit or look-back purposes it can be relied upon to always be the same. It is akin to an author halfway through writing a book: the pages behind are fixed (bar redrafts!) and can always be relied upon to be the same, whereas the pages ahead have yet to be written, but once they are, they too will be fixed, having been added to the pages behind.

Now in reality, very little data can be considered never to change after it has been created, and even in those cases, such as a birth date, data may have been created incorrectly and therefore needs to be able to change in order to be corrected. If this is achieved by creating a new version of the data rather than changing it in place and consequently overwriting it, then each new data version is immutable, with all of the above benefits. To take the author/book analogy again: an author cannot change the books already out in the world that have been sold; they are immutable! But an author can produce a 2nd edition of a previous work, a new version. The 1st and 2nd editions can then be compared for differences, and to understand the changes.
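
The price scenario above can be replayed with versions instead of overwrites. The sketch below is a minimal illustration with assumed names, not any particular vendor’s API: the erroneous $150 tick and its $110 correction are both retained, so the audit trail the PM needs survives the correction:

    from dataclasses import dataclass
    from datetime import datetime

    # Illustrative append-only, versioned price store.

    @dataclass(frozen=True)   # frozen: each version is itself immutable
    class PriceVersion:
        security: str
        price: float
        version: int
        recorded_at: datetime

    class VersionedPriceStore:
        def __init__(self):
            self._versions: dict[str, list[PriceVersion]] = {}

        def set_price(self, security: str, price: float) -> PriceVersion:
            # A "change" appends a new version; nothing is overwritten.
            history = self._versions.setdefault(security, [])
            v = PriceVersion(security, price, len(history) + 1, datetime.now())
            history.append(v)
            return v

        def latest(self, security: str) -> PriceVersion:
            return self._versions[security][-1]

        def history(self, security: str) -> list[PriceVersion]:
            return list(self._versions[security])

    store = VersionedPriceStore()
    store.set_price("IBM", 130.0)   # purchase price
    store.set_price("IBM", 150.0)   # erroneous market data tick
    store.set_price("IBM", 110.0)   # correction: appended, not overwritten

    print(store.latest("IBM").price)        # 110.0
    for v in store.history("IBM"):          # the full audit trail survives
        print(v.version, v.price, v.recorded_at)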

In financial systems, auditability and compliance are paramount, and inadequate controls in these areas can potentially lead to large fines and reputational damage. Immutable data, where nothing is ever overwritten or deleted, is a key advantage here.

markus.ruetimann@aprexo.com

Climbing the Legacy Mountain

7 July 2020

Addressing legacy systems and processes is a mountain most financial services firms will have to climb eventually. History will show that for many it was the sad events of COVID-19 which forced the start of their journey. In the post-pandemic world it will be early adopters of new technology that benefit not only from new functionality, but also from mobilising their organisations to make step-changes in their operating models.

In a very short period of time COVID-19 has brought the investment industry to an inflection point where digital client engagement (mobile and otherwise), operational resilience, enterprise risk reduction and the need for substantial ongoing cost savings dominate senior executives’ priorities now and for the foreseeable future. Technology-driven change, implemented through modular system enhancements and process simplification, is now being funded by mandatory rather than discretionary budgets. Operational efficiency and good data management provide the critical foundation for investment alpha.

'A journey of a thousand miles begins with a single step' Lao Tzu

Addressing data challenges will form the basis of the first step.

'The effective control and management of data has been one of the central issues facing the asset management industry for several years. The challenges around providing high quality data that can be consistently and timely delivered across the whole organisation has been further heightened in recent years by a) the increased demands on that data from not only investment, operations and client reporting functions but by other areas such as risk management, regulatory reporting, product etc; and b) the multiplicity of systems which have grown over time across functions which are all hungry to consume the same reliable and consistent data' George Efthimiou, former Global COO, HSBC Global Asset Management

In our professional and private lives, we have to cope every day with more voluminous and less readily digestible data. Data is both an asset and a liability for firms. It is an asset if access is easy and we are able to make key decisions based on complete and accurate data, available on demand. That asset quickly becomes a liability, however, when the data is fraught with inaccuracies and access and availability are poor.

In volatile markets, having intra-day access to near real-time, accurate investment data, such as portfolios’ investible cash balances and forecasts, has become a necessity for asset managers. New data-focused technologies, such as Aprexo’s Data Mastering Solution, play a vital role in future operating platforms. Transactions and events, in addition to the positions they impact, are becoming the atoms of the new modus operandi. Creating an atomic chain gives data a valuable lineage, benefiting every part of the investment management operating model. Investment decision making, regulatory oversight, and client and executive reporting increasingly require DMS technologies.
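
As a simple illustration of this atomic approach (a sketch with assumed names, not Aprexo’s implementation), a position can be derived entirely from its underlying transactions and carry with it the identifier of every event that produced it, which is the lineage referred to above:

    from dataclasses import dataclass, field
    from datetime import datetime

    # Illustrative sketch: transactions and events as the "atoms" from
    # which positions are derived, so every position carries its lineage.

    @dataclass(frozen=True)
    class Transaction:
        tx_id: str
        security: str
        quantity: int            # signed: buys positive, sells negative
        event_time: datetime

    @dataclass
    class Position:
        security: str
        quantity: int = 0
        lineage: list[str] = field(default_factory=list)  # contributing tx ids

    def build_position(security: str, transactions: list[Transaction]) -> Position:
        pos = Position(security)
        for tx in sorted(transactions, key=lambda t: t.event_time):
            if tx.security == security:
                pos.quantity += tx.quantity
                pos.lineage.append(tx.tx_id)   # the atomic chain behind the number
        return pos

    txs = [
        Transaction("T1", "IBM", 1_000_000, datetime(2020, 7, 6, 14, 0)),
        Transaction("T2", "IBM", 200_000, datetime(2020, 7, 6, 16, 45)),
    ]
    pos = build_position("IBM", txs)
    print(pos.quantity)   # 1200000
    print(pos.lineage)    # ['T1', 'T2'] -- every atom that produced the position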

For many asset managers and owners, the ‘fly in the data ointment’ is too often the legacy systems the data is tied to. The consequences are a multiplicity of interfaces to maintain, data richness being restricted to what the lowest common denominator in the chain can handle, and manual oversight and intervention. Scalability is absent, and front-office confidence in outputs is low.

Technology is gradually liberating the asset management industry from this legacy. Big data infrastructures are emerging, helping to capture and analyse vast amounts of structured and unstructured data. Trials of Robotic Process Automation are bearing fruit in many places. Acceptance of cloud computing, and of the concomitant need for cloud-native applications, is now high, and the Software-as-a-Service paradigm is accepted by all leading firms.

For asset servicers, a DMS constitutes an integral part of a modern Data-as-a-Service offering. The economic impacts of COVID-19 are likely to spark another wave of outsourcing of middle-office functions by Tier 2 and Tier 3 asset managers. In anticipation of this, global securities services providers have already confirmed their renewed interest in a foundational DMS. Some have tried to build one in-house; few have yet succeeded.

markus.ruetimann@aprexo.com