Client AUM, Flow and Revenue data is arguably the most valuable data set within the enterprise. What function or executive does not need to understand the economic performance of clients across products?
With analytics powered by this data, an executive can articulate a clear case for resource allocation, cost control and investment decisions from KPIs including:
- The economic performance and future potential of clients – by client segment, geography, business unit, channel and region
- The performance of products and strategies – by client & distributor, client segment and region
- The performance of sales and marketing teams, with measures such as annualised revenue built on accurate views of net new money
- The logic for an acquisition or merger, including assessments of the performance of the combined and legacy businesses post-integration
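To make the first-order arithmetic behind these KPIs concrete, here is a minimal sketch of annualised revenue on net new money. The figures and fee rate are hypothetical illustrations, not benchmarks, and a production revenue model would be far richer:

```python
# Minimal sketch: annualised revenue attributable to net new money (NNM).
# All flows and fee rates below are hypothetical illustrations.

def net_new_money(inflows: float, outflows: float) -> float:
    """Net new money = gross inflows minus gross outflows (excludes market movement)."""
    return inflows - outflows

def annualised_revenue(nnm: float, fee_rate_bps: float) -> float:
    """Annualised revenue impact of net new money at a given fee rate in basis points."""
    return nnm * fee_rate_bps / 10_000

nnm = net_new_money(inflows=250_000_000, outflows=180_000_000)  # 70mn net inflow
print(annualised_revenue(nnm, fee_rate_bps=45))                 # at a 45 bps management fee
```

The point of even a toy model like this is that every input – gross flows, fee schedules, client mapping – must come from a trusted, reconciled source, or the KPI inherits the errors.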
The industry is rapidly realising the potential of, and necessity for, an enterprise-wide data set
The data transformation underway in Investment Management is running at full steam and has been accelerated by the pandemic. Leading managers believe that Master Data Management (MDM) capabilities – including shared ontology, semantics, governance and stewardship processes – enable enterprise-wide data sharing and allow them to outperform peers that have not invested in this capability. Of the data domains across the enterprise, mastering Client AUM, Flow & Revenue data has a high ROI and should be a top priority for every manager.
Paradoxically, the extreme value and demand for this data set across all the areas of the firm have actually blocked investment in a single strategic store of data with cutting-edge MDM capabilities to govern and exploit data in a secure, controlled & agile manner.
Indeed, many asset managers tell us the reverse has happened: the management and use of this data are an ungoverned ‘free for all’ characterised by:
- Multiple stores of similar data maintained across the firm
- Tactical, manual patches to address chronic data quality issues in the AUM & Flow data set
- Expensive and complex processing of the data, partially duplicated across functions, creating confusion where clarity was sought
- Complex reconciliation processes to align functional views of the truth at the top of the house
- Distrust in the accuracy and reliability of all of the data, resulting in reversion to spreadsheets and other tactical measures
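The reconciliation burden described in the last two bullets can be illustrated with a toy sketch. The client IDs and balances are hypothetical, and it assumes each function keys its own store of AUM by client ID:

```python
# Toy sketch: surfacing breaks between two functional views of client AUM.
# Client IDs and balances (in GBP mn) are hypothetical.

finance_view = {"C001": 120.0, "C002": 75.5, "C003": 40.0}
sales_view   = {"C001": 120.0, "C002": 80.0, "C004": 15.0}

def reconcile(a: dict, b: dict, tolerance: float = 0.01) -> list:
    """Return (client, a_value, b_value) for every break between the two views."""
    breaks = []
    for client in sorted(set(a) | set(b)):
        va, vb = a.get(client), b.get(client)
        if va is None or vb is None or abs(va - vb) > tolerance:
            breaks.append((client, va, vb))
    return breaks

for client, fin, sal in reconcile(finance_view, sales_view):
    print(f"{client}: finance={fin} sales={sal}")
```

With dozens of functional stores rather than two, each with its own keys, timing and adjustments, this pairwise alignment becomes the complex, standing reconciliation effort the bullets describe.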
The impact of this ‘free for all’ approach is:
- An unsustainable and avoidable cost of £100 mn ($137 mn) p.a. incurred by managers that govern the data in a non-strategic manner
- ‘Missed’ decisions at functional & enterprise level because data badged as ‘directionally accurate’ is too high level or inaccurate to support the required decision
Our other Insight articles, such as “How zombie rebates ruin relationships” and “How we can tackle the industry’s chronic rebate issue” examine the root causes of this paradox. Below we outline how leading managers have acted to resolve it. This prescription of initiatives is examined in further detail in other articles in this series.
5 key factors to successfully master AUM, Flow & Revenue data
1. Quantify the total lifetime enterprise cost of bad data
Include both the hard costs of duplicative data management processes and expensive reconciliation work, and also – which takes more creativity – the opportunity cost of blocked strategic agility and data-driven management processes.
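A back-of-envelope cost model helps make this quantification concrete. The categories, figures and discount rate below are hypothetical placeholders, not benchmarks:

```python
# Back-of-envelope sketch of a total-cost-of-bad-data model.
# Categories and figures (GBP mn p.a.) are hypothetical placeholders.

hard_costs = {                       # directly observable costs
    "duplicated data processing": 3.2,
    "manual reconciliation teams": 2.1,
    "tactical data-quality patches": 0.9,
}
soft_costs = {                       # estimated opportunity costs
    "delayed or missed decisions": 4.0,
    "blocked data-driven initiatives": 2.5,
}

def lifetime_cost(annual: float, years: int, discount_rate: float = 0.05) -> float:
    """Present value of a recurring annual cost over the planning horizon."""
    return sum(annual / (1 + discount_rate) ** t for t in range(1, years + 1))

annual_total = sum(hard_costs.values()) + sum(soft_costs.values())
print(f"annual: £{annual_total:.1f}mn, 5-year PV: £{lifetime_cost(annual_total, 5):.1f}mn")
```

The hard-cost lines can be lifted from budgets; the soft-cost lines are where the creativity comes in, and where the case for strategic investment is usually won or lost.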
2. Iteratively evidence the business value
Focus on high-value use cases across the Enterprise in: Strategy, Finance, Sales, Marketing, Service, Risk & Compliance, Product Management, etc. Examples include monitoring and reporting the performance of clients and sales teams post-acquisition or merger – in both the legacy businesses and the newly combined entity.
3. Define a comprehensive data integration strategy from day 1
If duplicated functional silos are to be decommissioned and replaced by a single strategic source of Client Master AUM, Flow & Revenue data for the Enterprise, there must be the means to transport the right data to the right data consumers at the point of usage – at the right time. Otherwise, tactical sources will spring up to fill the gap. This is a tall order that is frequently not resourced – a common reason why enterprise initiatives fail.
4. Build resiliency into the solution architecture
Business-level Service Level Agreements (SLAs) in line with ‘data as a service’ principles should be supported by a secure, agile change capability – necessary to accommodate the rapidly escalating numbers of data sources and data volumes, poor data quality and fast-moving analytical and data science requirements.
5. Cultivate senior sponsorship and business engagement
To persuade functional leaders that the enterprise solution will solve their specific requirements and not become yet another corporate pyramid in the sand. This requires political capital and an ability to engage with technologists and data engineers to hash out a roadmap that demands multi-year investment while delivering progressively better data every quarter. Many investment managers have senior executives with the organisational ‘mana’ to sponsor complex data projects, but these sponsors are often not digital and data natives, and can inadvertently miss critical decisions, over-simplify requirements or make compromises that fatally undermine the effectiveness of the solution.