How do we manage the ever-growing intricacies of market data spend? Market data teams face the challenge of balancing rising costs and stringent compliance demands with ever-evolving business requirements, legacy setups and the introduction of emerging technologies such as AI. These were set to be the key themes driving the day’s discussions, and I was excited to join them.
As co-host with Financial.com, TRG Screen was proud to support DKF Amsterdam 2024 and contribute to the debate. I had the privilege of delivering a keynote and participating in a panel discussion to explore these topics alongside industry leaders.
Keynote highlights
In my keynote, ‘Proactive Reference Data Management: From Challenges to Strategic Advantage,’ I spoke about how firms can tackle the rising costs and inefficiencies of managing reference data, and why the first step is proactive visibility, a challenge that mirrors the event’s theme of ‘untangling the spaghetti.’
Many of the financial firms we work with share a familiar story: what begins as a simple, centralized setup grows into a fragmented web of opaque data sources and legacy systems, with soaring costs as the consequence. These aren’t just operational headaches; they have a direct and far-reaching impact on a firm’s ability to ensure compliance, operate efficiently and stay competitive.
I outlined how a proactive approach can transform reference data management:
- Take real-time control: Moving beyond reactive, after-the-fact, invoice-centric approaches. This step might look small, but it unlocks substantial benefits for organizations.
- Drive cost and time efficiencies: Using analytics and actionable insights to uncover inefficiencies and identify cost-saving opportunities. With a proactive approach and the right analytics technology, companies reduce their reference data costs by 25% on average.
- Shift the power balance: Enabling data buyers to negotiate from a position of strength, armed with facts and data.
The highlight of my talk was a preview of a generative AI-powered query tool we’ve been developing. It supplements the powerful reporting already built into our technology, enabling users to ask natural-language questions about their data—such as “How can I reduce costs for this data source?” or “Which systems are driving fee-liable attributes?” I believe GenAI’s ability to ingest large amounts of data and contextualize it to specific use cases will revolutionize how organizations derive insights and actions.
The reception to the demo was fantastic, and it sparked some great follow-up conversations during the networking sessions.
Panel highlights
Later, I joined a panel discussion eloquently titled ‘Master Data Management: How to effectively manage the market data streams in your firm with the aid of AI and in an industry with growing regulatory developments.’ The session brought together perspectives from across the industry to discuss strategies for navigating the growing complexity of market data.
One of the key points we explored was how firms can centralize and streamline their data flows. I emphasized the importance of the first step: proactive monitoring, implementing technologies that give organizations real-time visibility and control. There are, of course, other important elements, such as centralizing data distribution and data piping, and these were tackled by my colleagues on the panel.
We also delved into the role of AI in managing market data more efficiently. Building on themes from my keynote, I shared how AI can go beyond providing insights to offering actionable recommendations, such as optimizing data licensing or configuring compliance rules automatically. I strongly believe GenAI will play a big part in how optimization and compliance are approached, but there is work to be done: on the technology itself, to make it reliable enough for production use, and, of course, on the IP protection side.
The panel also touched on emerging regulatory developments such as DORA (the Digital Operational Resilience Act) and the European Consolidated Tape Provider (CTP). These discussions reflected a broader conversation at the event about the need for firms to remain agile in order to adapt to the changing compliance landscape.
Other topics du jour
Hearing from other industry leaders throughout the day provided valuable insights into where market data management technologies and services are heading.
Unsurprisingly, the recurring theme was the potential of generative AI. Discussions frequently circled back to how AI could streamline operations, enhance compliance and bridge gaps in decision-making when handling complex data ecosystems. The conversations weren’t just about generating insights but about how AI could enable more intuitive, actionable approaches to data challenges, acting as a digital ‘advisor’ that helps firms turn complexity into clear next steps.
Looking ahead
What stood out to me was the shared recognition that firms are at a crossroads. Rising costs, budgetary and regulatory pressures and rapidly evolving technology like AI are forcing firms to rethink how they manage market data. Firms that want to be successful should take control and be proactive.
The challenge isn’t just about untangling the spaghetti; it’s about reimagining how we value and use reference data as a strategic asset. By combining proactive management, human expertise and AI-driven innovation, firms can position themselves to navigate this complexity and seize new opportunities.
As the day closed with lively networking conversations, one question lingered: Are we ready to adapt quickly enough to define the future of market data management, or will the spaghetti remain a tangled mess? We think that with the right technology, expertise and granular data, it’s very possible to untangle. What do you think? Does your team have the tools and insight it needs to go from a tangled mess to a sense of order and control over your reference data management?