Plumery has announced the launch of AI Fabric, a solution that provides an AI-ready foundation for AI-assisted digital banking.
Built on an event-driven data mesh, Plumery’s new solution gives financial institutions a standardised way to connect AI and generative AI models and agents to banking data, reducing the need for bespoke system integrations. AI Fabric moves institutions towards an event-driven, API-first architecture designed to facilitate innovation.
Removing fragmentation for FIs
Most financial institutions struggle to operationalise AI because their data is fragmented across legacy cores, channels, and point-to-point integrations. Plumery’s AI Fabric addresses this by allowing organisations to plug in and swap AI capabilities as the ecosystem evolves, exposing high-quality, domain-oriented banking events and data streams in a consistent, governed, and reusable way.
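Plumery has not published AI Fabric’s API, but the plug-in-and-swap idea can be illustrated with a minimal sketch. All names here (`TransactionEvent`, `register`, `score`, the model labels) are hypothetical: the point is that AI capabilities consume one governed event contract, so swapping a model does not require rebuilding the integration.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Hypothetical domain-oriented banking event: one governed,
# reusable contract that any AI consumer can subscribe to.
@dataclass(frozen=True)
class TransactionEvent:
    account_id: str
    amount: float
    currency: str
    category: str

# Registry of pluggable AI capabilities keyed by name; swapping a
# model means re-registering a handler, not rebuilding integrations.
capabilities: Dict[str, Callable[[TransactionEvent], str]] = {}

def register(name: str, handler: Callable[[TransactionEvent], str]) -> None:
    capabilities[name] = handler

def score(name: str, event: TransactionEvent) -> str:
    return capabilities[name](event)

# Two interchangeable "models" consuming the same event contract.
register("fraud-v1", lambda e: "review" if e.amount > 10_000 else "ok")
register("fraud-v2", lambda e: "review" if e.amount > 5_000 else "ok")

event = TransactionEvent("acc-42", 7_500.0, "EUR", "transfer")
print(score("fraud-v1", event))  # ok
print(score("fraud-v2", event))  # review
```

Because the event schema is the stable interface, upgrading from "fraud-v1" to "fraud-v2" touches only the registry, not the banking systems that produce the data.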
Additionally, the platform decouples systems of record from systems of engagement and intelligence, supporting long-term agility for financial institutions. At the same time, minimising point-to-point integrations and one-off data pipelines enables financial institutions to reduce operational complexity and technical debt, allowing them to evolve more cost-effectively and safely while maintaining predictability.
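The decoupling described above is the classic publish/subscribe pattern: a system of record publishes an event once, and any number of engagement or intelligence systems consume it, instead of each pair maintaining its own integration. The following in-memory sketch is illustrative only; the topic name and payload fields are invented, and a production setup would use an event broker rather than a Python dictionary.

```python
from collections import defaultdict
from typing import Callable, DefaultDict, List

# Minimal in-memory event bus: the system of record publishes once,
# and any number of downstream systems subscribe independently.
subscribers: DefaultDict[str, List[Callable[[dict], None]]] = defaultdict(list)

def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
    subscribers[topic].append(handler)

def publish(topic: str, payload: dict) -> None:
    for handler in subscribers[topic]:
        handler(payload)

# Engagement and intelligence systems attach without the core
# banking system knowing they exist.
received: List[str] = []
subscribe("payments.settled", lambda p: received.append(f"notify:{p['id']}"))
subscribe("payments.settled", lambda p: received.append(f"ai-score:{p['id']}"))

# One event emitted by the system of record reaches both consumers.
publish("payments.settled", {"id": "pmt-001", "amount": 250.0})
print(received)  # ['notify:pmt-001', 'ai-score:pmt-001']
```

Adding a new AI consumer here is one `subscribe` call; in a point-to-point design it would be another bespoke pipeline against the core.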
Moreover, with clear data lineage, ownership, and control, financial institutions find it easier to explain decisions, manage model risk, and meet regulatory demands, reducing compliance complexity as AI adoption scales.
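One common way to make lineage and ownership explicit, sketched below with invented field names, is to wrap every event in a governed envelope that records its source system, owning domain, and schema version, so a model’s inputs can later be traced for review. This is a generic illustration, not Plumery’s actual data model.

```python
from dataclasses import dataclass, field
import datetime

# Hypothetical governed event envelope: each event carries lineage
# metadata so decisions made on it can be traced and explained.
@dataclass
class GovernedEvent:
    payload: dict
    source: str          # system of record that produced the event
    owner: str           # domain team accountable for the data
    schema_version: str  # contract version consumed by models
    produced_at: str = field(
        default_factory=lambda: datetime.datetime.now(
            datetime.timezone.utc
        ).isoformat()
    )

def audit_trail(event: GovernedEvent) -> str:
    """Summarise where a model's input came from, for model-risk review."""
    return (f"{event.source} (owner: {event.owner}, "
            f"schema: {event.schema_version})")

evt = GovernedEvent(
    payload={"account_id": "acc-42", "balance": 1_200.0},
    source="core-banking",
    owner="deposits-domain",
    schema_version="1.2.0",
)
print(audit_trail(evt))  # core-banking (owner: deposits-domain, schema: 1.2.0)
```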
Financial institutions that are not yet ready to operationalise AI can also use Plumery’s AI Fabric to lay the foundation for the technology, ensuring they can scale securely as priorities, budgets, and markets change.
Expanding on this, Ben Goldin, Founder and CEO of Plumery, said that AI Fabric gives financial institutions a standard, bank-grade way to bring AI to their tools and data without rebuilding integrations for every model. He added that the event-driven data mesh architecture streamlines this by changing how banking data is produced, shared, and consumed, rather than stacking another AI layer on top of fragmented systems.