Here’s the Truth About Solving Your Distribution Data

Nick Dekker, Will Basnett

If you were to ask the Head of Distribution at your asset management firm, “What are the top three blockers to our growth?”, what do you think they would say? In our experience, data makes that list more often than not. From the perspective of Distribution (in its broadest functional sense including Sales, Marketing and Client Service), Heads of Distribution and their leadership teams often express concerns that are rooted in a lack of good quality, timely data.

According to our 2020 Data Operating Models Survey, most firms (61%) are still at the stage of “getting organised” – building the foundations to enable their firm to spend more time getting value from data, and less time manually collating, checking and transforming it.

So, what’s holding Distribution functions back from becoming more data-driven? We see a few reasons typically given:

  • Quality – Data is perceived to be unreliable, so firms postpone initiatives such as client reporting or analytics until it’s fixed by an enterprise data programme. Spoiler alert: the enterprise data programme often fails to deliver, or Distribution is deprioritised.
  • Relevance – Data is transformed to meet the needs of other functions and isn’t usable for Distribution. For example, AUM and flows data is usually adapted for Finance – it doesn’t represent the way Distribution sees the world, and so isn’t fit for purpose.
  • Accessibility – Data isn’t available in a timely manner or at all internally, before you even think about publishing it in fund documents or on client portals.


We have recently helped five leading asset management firms with the design and implementation of data-driven transformations impacting Distribution. In our experience, there are three key lessons:

  1. Focus on building strong foundations – and ignore shiny objects
  2. Connect your data – or risk looking disconnected to your clients
  3. Democratise your data – both internally and externally

1. Focus on building strong foundations – and ignore shiny objects

Identify your ‘golden sources’ that will deliver value

Without clear, single master sources of data, deriving even basic information like a list of all funds can prove difficult. Successful firms identify and master core operational datasets.

It’s not glamorous, but it’s the foundation on which everything else is built. Firms do not need to wait for their enterprise data programme to deliver a data warehouse or Master Data Management solution. It is entirely realistic to define robust strategic data models for those operational datasets now and implement them on available technology such as a CRM; firms can transition to strategic platforms later while retaining the underlying data model.
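To make this tangible, here is a minimal sketch of what a ‘golden source’ fund master record might look like. The entity and field names are purely illustrative assumptions, not a prescribed model – the value lies in having one agreed definition that every consuming system reuses rather than re-keys, whether it lives in a CRM object or a simple database table:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical 'golden source' record for the fund/product master.
# Field names are illustrative; the point is a single agreed definition
# that downstream systems (CRM, reporting, client portals) all reuse.
@dataclass(frozen=True)
class FundMasterRecord:
    fund_id: str                 # internal unique identifier (master key)
    fund_name: str               # official marketing name
    isin: Optional[str]          # external identifier, if listed
    base_currency: str           # e.g. "GBP"
    investment_objective: str    # single approved wording for all documents
    launch_date: date
    status: str                  # e.g. "live", "soft-closed", "closed"

# Example record; in practice this would be mastered once and distributed
# to every consuming system rather than re-keyed locally.
global_equity = FundMasterRecord(
    fund_id="F-0001",
    fund_name="Example Global Equity Fund",
    isin="GB00B0000000",
    base_currency="GBP",
    investment_objective="Long-term capital growth from global equities.",
    launch_date=date(2015, 6, 1),
    status="live",
)
```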

Investment in data must start at source

Asset Managers have invested heavily in Distribution technology over recent years, yet many are struggling to achieve the ROI they anticipated – largely because the same investment was not made in data integrity.

“42% of Asset Managers say that Senior Management are very concerned about Data Governance across the business”
Source: Alpha FMC 2020 Data Operating Models Survey

In our experience, most new technology initiatives are at some point constrained by data quality – typically resulting in these projects costing more and taking longer to deliver.

Furthermore, in today’s digital environment, Distribution teams are feeling the impacts of incorrect data more than ever:

  • The embarrassment for the Client Services team when a misspelled name in the CRM ends up in a client report; or worse…
  • An input error in the product master leading to the wrong fund investment objectives being displayed in a regulatory document (such as a KIID) or by a third-party fund platform

Reshaping the culture around data quality takes time, but the appointment of data owners and stewards, along with the use of transparent quality metrics, is an effective way to create trust in the completeness, accuracy, and timeliness of data.
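As a purely illustrative sketch – the rules, thresholds and field names below are assumptions, not any standard – transparent quality metrics can be as simple as a few checks run over mastered records and published to data owners and stewards:

```python
from datetime import date

# Hypothetical client records pulled from the CRM; fields are illustrative.
clients = [
    {"client_id": "C-001", "name": "Acme Pension Scheme", "domicile": "UK", "last_reviewed": date(2021, 1, 10)},
    {"client_id": "C-002", "name": "",                    "domicile": "DE", "last_reviewed": date(2019, 5, 3)},
    {"client_id": "C-003", "name": "Example Insurer",     "domicile": None, "last_reviewed": date(2020, 11, 20)},
]

REQUIRED_FIELDS = ["name", "domicile"]   # completeness rule (assumed)
MAX_AGE_DAYS = 365                       # timeliness rule (assumed)

def completeness(records, fields):
    """Share of records where every required field is populated."""
    complete = sum(all(r.get(f) for f in fields) for r in records)
    return complete / len(records)

def timeliness(records, as_of, max_age_days):
    """Share of records reviewed within the allowed age."""
    fresh = sum((as_of - r["last_reviewed"]).days <= max_age_days for r in records)
    return fresh / len(records)

as_of = date(2021, 3, 1)
print(f"Completeness: {completeness(clients, REQUIRED_FIELDS):.0%}")   # 33%
print(f"Timeliness:   {timeliness(clients, as_of, MAX_AGE_DAYS):.0%}") # 67%
```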

2. Connect your data – or risk looking disconnected to your clients

Bring together datasets to view the world the way you want to

Most asset managers struggle to organise and reconcile multiple datasets, which leaves them with a fragmented view of the client and an inability to see outside of a product-centric lens.

For example, by combining mastered datasets with external data, firms can build an enterprise Client Book of Record (CBOR). This provides a significantly more accurate view of AUM and flows across the firm for clients, prospects and influencers. Importantly, it goes beyond the basic finance/legal view to capture the full value of the client relationship. For more on the enterprise CBOR, see the article in The Alpha Outlook 2021.
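For illustration only – the datasets, field names and hierarchy below are made up, not a prescribed CBOR schema – combining an internal flows extract with a mastered client hierarchy and an external platform feed might look like this:

```python
import pandas as pd

# Internal, Finance-oriented flows by fund and account (illustrative).
internal_flows = pd.DataFrame({
    "account_id": ["A1", "A2", "A3"],
    "fund_id":    ["F-0001", "F-0001", "F-0002"],
    "net_flow":   [5_000_000, -1_000_000, 2_500_000],
})

# Mastered client hierarchy: which accounts roll up to which client (illustrative).
client_hierarchy = pd.DataFrame({
    "account_id": ["A1", "A2", "A3"],
    "client_id":  ["C-001", "C-001", "C-002"],
})

# External data, e.g. assets held via an intermediary platform (illustrative).
external_holdings = pd.DataFrame({
    "client_id": ["C-001", "C-002"],
    "platform_aum": [120_000_000, 40_000_000],
})

# Roll flows up to client level, then join the external view alongside it.
cbor = (
    internal_flows
    .merge(client_hierarchy, on="account_id")
    .groupby("client_id", as_index=False)["net_flow"].sum()
    .merge(external_holdings, on="client_id", how="left")
)
print(cbor)
#   client_id  net_flow  platform_aum
# 0     C-001   4000000     120000000
# 1     C-002   2500000      40000000
```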

Distribution teams rarely see how disorderly they look to buyers – yet that disorganisation is often all the client sees. Without clarity of information, firms risk an incomplete or inaccurate view of clients, leading at best to missed opportunities, and at worst to lost mandates, clients and flow.


Be the architect of your own fund distribution success

Complex legacy enterprise architectures have typically caused key data sources to end up in disparate departmental databases and data warehouses. As a result, technology solutions have proliferated via inefficient point-to-point solutioning with heavy reliance on unsupported manual data manipulation to make sure that data is processed and checked for accuracy.

In our experience, Asset Managers are trying to solve current architectural challenges either by tackling problems in isolation (e.g. what is our data warehousing strategy? Can we decommission this expensive legacy application?) or by buying new, best-of-breed applications. Only later do they realise that these do not solve the underlying data model and architectural issues that are causing the inefficiencies in the first place.

Often the best solution is to step back and first review and address the broader architectural landscape: ensure all applications within the stack are fully integrated, fit for purpose and supported by the underlying data model. Working from that strong foundation, firms can maximise the strategic value of their existing platforms, reduce risk by decommissioning redundant legacy applications, and scale for the future by connecting new technology seamlessly into the existing ecosystem.

3. Democratise your data – both internally and externally

Data is ‘available’ but is it ‘accessible’?

The processes, technology and use cases that rely on trusted data have continued to grow significantly. The challenge is now shifting from one of data availability (can I get the data?), to one of data accessibility (can I get the data easily?).

Leading firms are developing a dedicated information provisioning layer over the existing architecture. This means stored, wrangled and connected data can be quickly transformed and presented to customers, providing a ‘one-stop-shop’ for their complex data requirements.


Data is now a service

The technology and processes supporting this architectural design can be matured further into a full-service delivery operating model – sketched in outline after the list below – that:

  • tracks inbound and outbound data to support data quality monitoring;
  • allows scheduling with alerting to help operational teams manage SLAs and support their existing BAU processes;
  • embeds data governance for each data point; and
  • has DevOps capabilities to manage technical change.
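Purely as an illustration of the idea – the feed definition, field names and SLA rule below are assumptions, not a product configuration – such a service might describe each managed feed declaratively and alert the operations team when an SLA is breached:

```python
from datetime import datetime
from typing import Optional

# Hypothetical definition of one managed data feed within the service model.
FEED = {
    "name": "daily_fund_prices",
    "owner": "Product Data Steward",      # embedded governance: a named owner per dataset
    "schedule": "07:00 Europe/London",    # when the inbound feed is expected
    "sla_delivery_by": "08:00",           # the time operational teams commit to
    "quality_checks": ["completeness", "price_tolerance"],
}

def sla_breached(expected_by: datetime, delivered_at: Optional[datetime]) -> bool:
    """Flag a breach if the feed arrives late or not at all, so the ops team is alerted."""
    return delivered_at is None or delivered_at > expected_by

# Example: the feed lands 25 minutes after its committed delivery time.
expected = datetime(2021, 3, 1, 8, 0)
delivered = datetime(2021, 3, 1, 8, 25)
if sla_breached(expected, delivered):
    print(f"ALERT: {FEED['name']} missed its {FEED['sla_delivery_by']} SLA")
```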

With an established service delivery model in place, not only can a better standard of data be provided to internal customers, but opportunities for efficiency with external clients also begin to present themselves.

One of the most common efficiencies we see our clients exploiting is in the provision of fund information to third-party distributors. An information provision ‘service’ allows data to be reconciled and formatted in any required structure, on a repeatable basis. This makes it easier for firms to get monthly data into the right format for an intermediary or a third-party fund data aggregator or publisher. Overall, data can be put in the hands of distributors more quickly, at lower cost, and with fewer mistakes.
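As a minimal sketch, assuming a hypothetical internal fund master and an equally hypothetical aggregator layout, the repeatable formatting step might look like this:

```python
import csv
from io import StringIO

# Internal fund master records (illustrative field names).
funds = [
    {"fund_id": "F-0001", "fund_name": "Example Global Equity Fund",
     "isin": "GB00B0000000", "ocf": 0.0075, "nav": 1.2345, "nav_date": "2021-02-26"},
]

# Hypothetical column layout required by a third-party aggregator.
DISTRIBUTOR_COLUMNS = ["ISIN", "Fund Name", "NAV", "NAV Date", "OCF (%)"]

def to_distributor_feed(records):
    """Render internal records into the distributor's required CSV structure."""
    buffer = StringIO()
    writer = csv.DictWriter(buffer, fieldnames=DISTRIBUTOR_COLUMNS, lineterminator="\n")
    writer.writeheader()
    for r in records:
        writer.writerow({
            "ISIN": r["isin"],
            "Fund Name": r["fund_name"],
            "NAV": f"{r['nav']:.4f}",
            "NAV Date": r["nav_date"],
            "OCF (%)": f"{r['ocf'] * 100:.2f}",   # convert decimal charge to a percentage
        })
    return buffer.getvalue()

print(to_distributor_feed(funds))
# ISIN,Fund Name,NAV,NAV Date,OCF (%)
# GB00B0000000,Example Global Equity Fund,1.2345,2021-02-26,0.75
```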

How does all of this help you?

In our experience helping clients with data-driven Distribution projects, the business case and benefits can be compelling. We have highlighted five important benefits: speed, client relationships, headcount, risk, and pricing.

It’s not only the Distribution function that benefits: your COO will value a simplified and more scalable operating model; your CTO will appreciate a streamlined technology estate and a more modern, loosely coupled architecture; your CRO will be supportive because of the regulatory risk reduction; and your CFO will see the benefits to the bottom line.

The truth is, there’s nothing holding you back

To recap, this is how some successful firms are moving past the typical Distribution data blockers:

  1. Focus on building strong foundations – and ignore shiny objects
    Identify the ‘golden sources’ that will deliver value and implement strategic data models.
  2. Connect your data – or risk looking disconnected to your clients
    Bring together datasets to view the world the way you want to, and enable a complete view of clients, prospects, and influencers.
  3. Democratise your data – both internally and externally
    Make data easy to use and treat data as a service.

If you would like to see a step change in the quality of Distribution data in your organisation, there are two easy actions you can take right now to move this forward:

  1. Share this article with those in your organisation who have an interest in fixing Distribution data or who own the Distribution change budget.
  2. Ask the authors a question on any aspect of this article and set up a further conversation.

Why Alpha?

Alpha is the leader in implementing and enhancing CRM systems for the asset management industry. Building on years of experience and successful client projects, we have accumulated a deep knowledge of the operational data models and architecture required to optimally position Distribution teams for the future.

We have supported clients at every stage of data and technology maturity. We typically start with a Distribution health-check or design project, demonstrate our value quickly, and continue supporting our clients on their journeys to become more data-driven.

We’d love to show you how we’ve been helping leading asset managers’ Distribution teams, and would be delighted to hear from you.

About the Authors

Nick Dekker
Director

Nick Dekker is a Director at Alpha FMC, has 12 years’ asset management industry experience, and is Alpha’s lead for Distribution Operations in the UK. He helps Distribution functions improve their data, client operations, and client reporting, which often involves getting the basics right as set out in this article. He is currently leading a data-driven Distribution transformation at a well-known UK asset manager.

Will Basnett
Senior Manager

Will is a Senior Manager within the Distribution practice at Alpha and has 10 years’ experience consulting within the financial services industry. Will has extensive experience leading and delivering transformational change through technology and data programmes. He has spent the last 18 months implementing most of the recommendations in this article to support fund distribution at a leading global asset manager.