Modern system for insurance data management

Insurance

Modernizing product data ingestion and distribution

Development of a robust framework and a secure, automated solution to ensure smooth migration of data from legacy systems to AWS while maintaining flexibility in distribution across file formats and channels.

Client
A large NY-based life insurance and investment company
Goal
Create a secure, automated solution for data ingestion and a robust framework for distribution across channels
Tools and Technologies
Python, PySpark, AWS Glue/Redshift/Lambda/S3/Aurora, Stonebranch, Jira, GitHub
Business Challenge

The client used a legacy product data infrastructure (PACE) and other systems that provided neither fully secure access nor efficient quality checks. This affected system integration, data ingestion, and distribution.

Workflows and checks were not adequately automated, and the systems did not offer a reusable framework to generate and deliver outbound data files aligned with business requirements.

Solution
  • Created reusable and scalable ETL/ELT pipelines using Python and AWS services
  • Integrated Stonebranch for orchestration and automated job scheduling, with monitoring mechanisms and alerts 
  • Tuned Redshift queries and optimized data ingestion processes to reduce latency and improve throughput
  • Defined data specifications and output formats as per business needs
  • Built a configurable pipeline to create dynamic CSV/Excel files from Redshift views
  • Automated file delivery via email/SFTP monitored and orchestrated by Stonebranch 
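The configurable pipeline above can be illustrated with a minimal sketch. This is not the client's actual implementation: the config keys (`columns`, `delimiter`) and field names are hypothetical, and the rows would in practice be fetched from a Redshift view rather than passed in directly.

```python
import csv
import io

def render_outbound_file(rows, config):
    """Render dict rows into delimited text per an output config.

    `config` is an illustrative dict, e.g. {"columns": [...], "delimiter": "|"};
    the real specification format is assumed, not taken from the case study.
    """
    if not rows:
        return ""
    # Column order and delimiter come from the per-file configuration,
    # so new output formats need a config change, not new code.
    columns = config.get("columns", list(rows[0].keys()))
    delimiter = config.get("delimiter", ",")
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns, delimiter=delimiter,
                            extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

The same config object could also carry delivery details (email vs. SFTP) for the orchestration layer to act on.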
Outcomes
  • Delivered a reusable framework for ingesting and distributing data across existing and new products
  • Streamlined operations and improved data accessibility
  • Enhanced performance and scalability 
  • Ensured better data quality and governance with automation and structured reusability 
Contact

Our experts can help you find the right solutions to meet your needs.

Get in touch
Explore the world with Iris. Follow us on social media today.

Modern insurance product data services platform

Insurance

Streamlining product data access and delivery

Development of a centralized platform for product data services that can distribute critical data securely, enforce governance, scale with business needs, and integrate smoothly with downstream applications.

Client
A large NY-based life insurance and investment services provider
Goal
Create a unified, scalable architecture for secure, standardized sharing of data with downstream services
Tools and Technologies
Java 21, Spring Boot, AWS, Jenkins, Stonebranch, Jira, Microservices, Redshift, Aurora, DynamoDB, Angular, JavaScript, MySQL, EKS, SQS
Business Challenge

The client used a product data mart that lacked a secure, standardized method to share data with downstream services, affecting data governance and consistency across applications. The client also faced the risk of disrupting tightly integrated legacy data flows while shifting from the legacy PACE system, which supports critical AEM microservices, to a PDP mart.

The client needed a new, scalable system, and the modernization had to be carried out with minimal disruption to existing systems.

Solution
  • Implemented a dual-solution strategy to modernize data access and delivery. Key elements included:

    • Built a microservices-based platform and deployed on AWS
    • Enabled secure, flexible API access to the product data mart using tagged identifiers
    • Ensured a scalable design with minimal code changes for expansion
    • Allowed existing AEM microservices to operate without changes during and post-transition
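The tagged-identifier access pattern above can be sketched in a few lines. This is an assumption-laden illustration, not the client's API: the tag names, record fields, and in-memory "mart" below are all hypothetical stand-ins for the real product data mart.

```python
# Hypothetical stand-in for the product data mart: downstream services
# request data by a stable tag rather than by internal keys, so the
# mart's internals can change without breaking callers.
PRODUCT_MART = {
    "prod-401k-2024": {"name": "401k Advantage", "status": "active"},
    "prod-term-life": {"name": "Term Life 20", "status": "active"},
}

def get_product_by_tag(tag, mart=PRODUCT_MART):
    """Resolve a tagged identifier to a product record, or None if unknown."""
    record = mart.get(tag)
    if record is None:
        return None
    # Return a copy so callers cannot mutate the mart in place.
    return dict(record)
```

Because consumers only ever see tags, the platform can remap a tag to a new backing table or service without any change on the consumer side.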
Outcomes
  • Improved efficiency and security by enabling standardized, governed access to product and entity data 
  • Achieved faster delivery and reduced manual effort
  • Ensured seamless integration while modernizing the backend
  • Maintained front-end stability of AEM microservices

Modern insurance platform for loan processing

Insurance

Modernized platform enhances loan processing and UI/UX

Platform modernization with system integration, functional enhancements, and UI transformation helps streamline processes, minimize duplication, deliver strong ROI, and grow customer retention by a projected 50%.

Client
A large NY-based life insurance and investment company
Goal
Modernize product data management services to enhance loan processing and improve user experience
Tools and Technologies
Angular 15, .Net 4.8, .NET 8.0, AWS EC2, SQL Server, PostgreSQL, Stonebranch, SharePoint, IIS
Business Challenge

The client operated a critical legacy application for loan processing that had significant operational inefficiencies and user-experience shortcomings. The application hindered the speed and accuracy of loan disbursements, affecting both internal operations and customer satisfaction.

The goal was to modernize the platform with scalable solutions that could enable secure distribution of critical data and better governance, while integrating smoothly with downstream applications.

Solution
  • Implemented a holistic modernization approach with a focus on system integration, functional enhancements, and UI transformation 
  • Streamlined deal creation and approval through a third-party system with robust verification and compliance features
  • Enabled multi-loan support under a single deal
  • Ensured consistency by introducing a standardized component library with reusable UI components
  • Revamped the interface with intuitive design, responsive layouts, and improved user experience
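The multi-loan-per-deal capability implies a simple containment model. The sketch below is purely illustrative: the `Deal` and `Loan` names, fields, and approval rule are assumptions, not the client's actual schema or business logic.

```python
from dataclasses import dataclass, field

@dataclass
class Loan:
    loan_id: str
    amount: float
    verified: bool = False  # set by the third-party verification step

@dataclass
class Deal:
    deal_id: str
    loans: list = field(default_factory=list)

    def add_loan(self, loan):
        self.loans.append(loan)

    def total_amount(self):
        return sum(loan.amount for loan in self.loans)

    def ready_for_approval(self):
        # Illustrative rule: a deal moves to approval only when it has
        # at least one loan and every loan has passed verification.
        return bool(self.loans) and all(loan.verified for loan in self.loans)
```

Keeping loans as children of a single deal is what lets approval, verification, and reporting run once per deal rather than once per loan.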
Outcomes
  • Enhanced efficiency, scalability, and user-centric loan operations 
  • Improved ROI through greater efficiency, better user experience, and agility
  • Reduced verification effort by 30–55% and enabled focus on higher-value tasks
  • Streamlined processes and minimized data duplication
  • Ensured consistent design and efficient development
  • Increased customer satisfaction, with a projected 50% rise in retention

Eagle Access Data Platform Transforms Accounting

Insurance

Eagle Access Data Platform Transforms Accounting

A centralized and secure cloud-based financial data warehouse provides a single source of truth for a major insurer, improving analytics and reducing risk.

Client
A large NY-based life insurance and investment company
Goal
Consolidate multiple accounting systems into a centralized, reliable data warehouse to improve reporting and decision-making
Tools and Technologies
BNY Eagle Access, Python, Oracle, AWS EC2 and S3, SQL Server, Jira, Stonebranch
Business Challenge

The insurer faced growing inefficiencies due to siloed accounting systems that lacked integration, consistency, and scalability. Reporting processes were time-consuming, error-prone, and lacked real-time visibility, hindering timely business and investment decisions. A centralized solution was needed to ingest and unify data from disparate systems like SAP GL, Singularity, and Loan Management into a single, trusted platform to support strategic financial insights and reduce operational complexity.

Solution
  • Built a centralized accounting data warehouse using the Eagle Access secure private cloud environment
  • Ingested and standardized data from SAP GL, Singularity, and Loan Management systems
  • Used AWS EC2, S3, and Stonebranch Universal Automation Controller for cloud infrastructure and job orchestration
  • Enabled real-time reporting via Tableau integration and migration of legacy dashboards
  • Improved data accuracy and consistency through robust validation and automation
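The validation step above can be sketched as a simple record gate applied before data lands in the warehouse. The field names and rules below are illustrative assumptions, not the insurer's actual controls.

```python
# Illustrative required fields for an ingested accounting record.
REQUIRED_FIELDS = ("account_id", "amount", "source_system", "posting_date")

def validate_record(record):
    """Return a list of validation errors for one ingested record."""
    errors = []
    for f in REQUIRED_FIELDS:
        if record.get(f) in (None, ""):
            errors.append(f"missing field: {f}")
    amount = record.get("amount")
    if amount is not None and not isinstance(amount, (int, float)):
        errors.append("amount is not numeric")
    return errors

def split_valid_invalid(records):
    """Partition records so only clean rows reach the warehouse."""
    valid, invalid = [], []
    for r in records:
        (valid if not validate_record(r) else invalid).append(r)
    return valid, invalid
```

Rejected records would typically be routed to a quarantine area for correction rather than silently dropped.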
Outcomes
  • Created a unified source of truth for all accounting data
  • Enabled faster, more accurate reporting and analytics, improving business and investment decision-making
  • Reduced data silos and improved accessibility across systems
  • Minimized infrastructure complexity and operational risk with secure private cloud hosting
  • Enhanced efficiency through automated data processing and orchestration

Data migration to cloud expedites credit risk functions

Banking

Data migration to cloud expedites credit risk functions

Migrating on-premises models and data to the cloud enhances financial forecasting, sensitivity analysis, and time-to-market.

Client
A leading North American bank
Goal
Migrate credit risk data and SAS-based analytics models from on-premises data warehouse to AWS to enhance functionality
Tools and Technologies
AWS Glue, Redshift, DataSync, Athena, CloudWatch, SageMaker; Apache Airflow; Delta Lake; Power BI
Business Challenge

The credit risk unit of a major bank aimed to migrate SAS-based analytics models containing data for financial forecasting and sensitivity analysis to Amazon SageMaker.

This was to leverage benefits such as enhanced scalability, improved maintenance for MLOps engineers, and better developer experience. It also sought to migrate credit risk data from a Netezza-based on-premises data warehouse to AWS, utilizing a data lake on AWS S3 and a data warehouse on Redshift to support model migration.

Solution
  • Decoupled data workload processing from relational systems using a phased approach, focusing on historical migration, transformation complexities, data volumes, and ingestion frequencies of incremental loads
  • Developed a flexible ETL framework using DataSync for extracting data to AWS as flat files from Netezza
  • Transformed data in S3 layers using Glue ETL and moved it to the Redshift data warehouse
  • Enabled Glue integration with Delta Lake for incremental data workloads
  • Built ETL workflows using Step Functions and orchestrated concurrent workflow runs using Apache Airflow
  • Architected data shift from Netezza to AWS, leveraging a flexible ETL framework
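The incremental-load pattern behind the Glue and Delta Lake integration above is typically watermark-based: pick only rows newer than the last successfully processed timestamp. The sketch below shows just that selection logic, with assumed field names (`updated_at`), not the bank's actual schema or Glue job code.

```python
def select_incremental(rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark.

    `rows` are dicts with an "updated_at" ISO-8601 string (an assumed
    field); ISO timestamps compare correctly as strings.
    """
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    # Advance the watermark only if something new was seen, so a quiet
    # run never moves it backwards.
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark
```

In a real pipeline the watermark would be persisted (e.g. in a job-state table) between runs; here it is passed in and returned for clarity.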
Outcomes
  • Enhanced financial forecasting and sensitivity analysis operations with analytical models and data migrated to the AWS public cloud
  • Expedited time-to-market catering to client’s downstream consumption needs through Power BI and Amazon SageMaker

Generative AI platform for business use cases

Commercial & Corporate Banking

Gen AI platform offers future-ready capabilities

An intuitive Generative AI platform provides AI, integration, and end-use capabilities, along with forward-looking roadmaps, for risk teams and business users at a leading bank.

Client
A leading North American bank
Goal
Provide a wide range of AI capabilities for various risk and business teams and avoid building fragmented, outdated systems
Tools and Technologies
Amazon Bedrock and Titan V2, pgvector, Faiss, OpenSearch, Llama 7B, Claude Sonnet 3.5 and 3.7
Business Challenge

The Enterprise Risk function at a leading North American bank initiated a Generative AI (Gen AI) solution to offer a wide range of AI capabilities, including document intelligence, summarization, generation, translation, and more.

As the project evolved through proofs of concept and pilots, a key challenge emerged: the risk of creating a fragmented ecosystem with an overwhelming array of unmanageable bespoke solutions, model integrations, and reliance on potentially outdated models and libraries.

Solution

Based on prior engagements across clients, our team delivered thought leadership around how to develop and deliver capabilities using a platform approach. We also set up a Minimum Viable Product Team to iterate on new problem areas and solution approaches. Platform development includes generalized capabilities for:

  • Setting up document ingestion pipelines, with choice of parsing approaches, embedding models and vector index stores
  • A factory model, with configurations for integrating new parsers, embedding models, LLM interfaces, etc., to quickly bring new capabilities to the platform
  • User management, SSO integration, entitlements management
  • API integration to bring in information/data from internal and external sources
  • Platform support for pgvector, Faiss, OpenSearch, Amazon Titan V2, Llama 7B, and Claude Sonnet 3.5 and 3.7
  • Intuitive chat interface for end users and for AI Masters: designated business users trained in prompt engineering and other techniques to assemble new AI/Gen AI capabilities through configuration
Outcomes
  • A future-ready Gen AI platform that can easily incorporate new capabilities and updates
  • Multiple specific capabilities, called skills, for use by various risk teams and business users
  • A forward-looking roadmap, including the ability to compose more complex capabilities from atomic ones

Custom analytics enable faster business decisions

Brokerage, Wealth & Asset Management

Custom analytics enable faster business decisions

Optimized data and on-demand analytics deliver faster business insights and better user experience for asset management firm

Client
A U.S.-based asset management company
Goal
Streamline and improve data and analytics capabilities for enhanced user experiences
Tools and Technologies
Java, React JS, MS SQL Server, Spring Boot, GitHub, Jenkins
Business Challenge

The client captures voluminous data from multiple internal and external sources. Business users lacked quick, on-demand capabilities to generate customized portfolio analytics on attributes such as average quality, yield to maturity, and average coupon.

The client's teams were spending enormous manual effort and elapsed time (approximately 12-15 hours) responding to requests for proposals from their respective clients.

Solution

Iris implemented a data acquisition and analytics system with pre-processing capabilities for grouping, classifying, and handling historical data.

A data dictionary was established for key concepts, such as asset classes and industry classifications, enabling end users to access data for analytical computation. The analytics engine was refactored, optimized, and integrated into the streamlined investment performance data infrastructure.

The team developed an interactive self-service capability, allowing business users to track data availability, perform advanced searches, generate custom analytics, visualize information, and utilize the insights for decision-making.
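One of the on-demand analytics mentioned above, a market-value-weighted average of a holding attribute such as coupon or yield to maturity, can be sketched as follows. The field names (`market_value`, `coupon`) are illustrative assumptions, not the firm's actual data dictionary.

```python
def weighted_average(holdings, attribute, weight_field="market_value"):
    """Weighted average of `attribute` across holdings.

    Returns None when total weight is zero, so callers can distinguish
    "no data" from a genuine zero result.
    """
    total_weight = sum(h[weight_field] for h in holdings)
    if total_weight == 0:
        return None
    return sum(h[attribute] * h[weight_field] for h in holdings) / total_weight
```

The same helper serves average coupon, average quality scores, or yield to maturity simply by changing the `attribute` argument.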

Outcomes

The solution brought several benefits to the client, including:

  • Simplified data access to generate custom analytics for end users
  • Eliminated manual processing and the need for complex queries
  • Enhanced the stakeholder experience
  • Reduced response time to client RFPs by over 50%

Data warehouse enhances client communications

Asset Management

Investment data warehouse enhances client communications

Account management and marketing teams of an investment bank acquired improved multi-channel client communications and portfolio management capabilities with a data warehouse and a single source of truth.

Client

A U.S.-based investment bank

Goal

Improve data collation and information quality for enhanced marketing and client reporting functions

Tools and technologies

Composite C1, Oracle DB, PostgreSQL, Vermilion Reporting Suite, Python, MS SQL Server, React.js

BUSINESS CHALLENGE

The client’s existing investment data structure lacked a single source of truth for investment and performance data. The account management and marketing teams were making significant manual efforts to track portfolio performance, identify opportunities and ensure accurate client reporting. The time-consuming and manual processes of generating marketing exhibits and client reports were highly error-prone.

SOLUTION

Iris implemented a comprehensive investment data infrastructure for a single source of truth and improved reporting capabilities for marketing content and client report generation. An automated Quality Assurance process was instituted to validate the information in critical marketing materials, such as fact sheets, snapshots, sales kits, and flyers, against the respective data source systems. Retail and institutional portals were developed to provide a consolidated view of portfolios, with the ability to drill down to underlying assets, AUM (Assets Under Management) trends, incentives, commissions, and active opportunities.
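The automated QA process described above amounts to checking each value printed in a marketing exhibit against the source system of record, within a tolerance. The sketch below illustrates only that core comparison; the metric names and tolerance are hypothetical.

```python
def find_discrepancies(exhibit_values, source_values, tolerance=1e-6):
    """Return metric names whose exhibit value drifts from the source.

    A metric missing from the source entirely is also flagged, since an
    unverifiable number is as risky as a wrong one.
    """
    discrepancies = []
    for metric, shown in exhibit_values.items():
        expected = source_values.get(metric)
        if expected is None or abs(shown - expected) > tolerance:
            discrepancies.append(metric)
    return sorted(discrepancies)
```

Run per fact sheet or sales kit, an empty result means the exhibit matches its source systems and can be released.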

OUTCOMES

The new data infrastructure delivered a holistic, on-demand view of investment details, including performance characteristics, breakdowns, attributions, and holdings, to the client's marketing team and account managers with: 

  • ~95% reduction in performance data and exhibit information discrepancies
  • ~60% improvement in operational efficiency in core marketing and client reporting functions


Cloud Data Lakehouse for Single Source of Truth

Life Sciences

Cloud Data Lakehouse for Single Source of Truth

Data systems built on legacy technologies were delaying month-end activities and actionable insights for finance and sales teams. Transformation of data and BI applications to MS Azure delivered "Order to Cash" sales analytics on the cloud.

Client

A medical devices and fertility solutions company

Goal

Establish a cloud data warehouse for a single source of truth and timely month-end activities

Tools and technologies

Azure Data Factory, Azure Data Lake, Power BI, Synapse Analytics

BUSINESS CHALLENGE

The client had multiple instances of ERPs, sales systems, and warehouses built on obsolete technologies and frameworks. The existing systems siloed the data, resulting in inconsistent versions of the truth. The client's finance and sales teams were struggling to reconcile data offline and feed it back into the ERPs, causing significant delays in month-end activities. As the systems of record were not synced and legacy reports were built on interim warehouses, line managers and executive teams were unable to extract actionable, comprehensive insights. A solution to onboard and integrate new datasets on an ongoing basis was also required to support mergers and acquisitions.

SOLUTION

A strategy to transform data and BI applications to MS Azure was finalized. The transformation was executed in phases and included discovery, report rationalization, and the foundational build of a global reporting system.

The solution included data ingestion with Azure Data Factory, data storage and processing using Azure Data Lake and Synapse Analytics, and reports and dashboards with Power BI. A utility to accelerate the onboarding of new data entities was conceptualized and delivered to onboard and integrate new datasets in support of mergers and acquisitions.
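The onboarding-accelerator idea above is config-driven: a new dataset is described by a small configuration, from which the ingestion steps are derived, rather than hand-building a pipeline per source. The sketch below is a hypothetical illustration; the config keys and step names are assumptions, not the actual utility.

```python
def build_onboarding_plan(dataset_config):
    """Derive an ordered list of pipeline steps from a dataset config.

    `dataset_config` uses illustrative keys: "name", "source", and an
    optional "deduplicate" flag.
    """
    steps = [
        f"ingest:{dataset_config['source']}",
        f"land:{dataset_config['name']}",
    ]
    if dataset_config.get("deduplicate", False):
        steps.append(f"dedupe:{dataset_config['name']}")
    steps.append(f"publish:{dataset_config['name']}")
    return steps
```

During an acquisition, onboarding the acquired company's datasets then reduces to writing one config entry per dataset.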

OUTCOMES

Iris data practitioners helped the client overcome key challenges and advanced data warehousing capabilities by:

  • Establishing a "single version of the truth" that enabled data-driven decisions and timely completion of month-end and other critical activities 
  • Delivering "order to cash" analytics, covering subject areas such as sales, inventory, shipping, and finance, on Power BI
  • Facilitating the generation of actionable, insightful reports and dashboards, allowing "self-service" consumption for the business leadership


Software transformation gets compliance for bank

Risk & Compliance

Software transformation gets FDIC compliance for bank

A world-renowned investment bank achieves timely compliance with new QFC (Qualified Financial Contracts) and FDIC (Federal Deposit Insurance Corp.) regulations through holistic system transformation and extensive QA and testing.

Client

A global investment bank

Goal

To have a unified functional validation system for FDIC compliance

Tools and technologies

SQL Server, Sybase, Data Lake, UTM, .NET, DTA, Control-M, ALM, JIRA, Git, RLM, Nexus, Unix, WinSCP, Putty, Python, PyCharm, Confluence, Rabacus, SNS, and Datawatch

BUSINESS CHALLENGE

The client was mandated to comply with new QFC (Qualified Financial Contracts) regulations. It also needed to perform in-depth functional validation across a revamped data platform to ensure it could process, review, and submit to the FDIC (Federal Deposit Insurance Corp.), in a timely manner, the required daily reports on the open QFC positions of all its counterparties. The project entailed immediate availability and processing of accurate QFC information at the close of each business day to swiftly assess data and note exceptions and exclusions for early corrective action. It also aimed to help the client meet stringent deadlines with varied report formats. Any breach or delay in compliance could bring hefty fines and reputational damage to the bank.

SOLUTION

Iris revamped the entire system and performed end-to-end quality assurance and testing across the new regulatory reporting platform. This meant validating the transformed multi-layer database, user interface (UI), business process rules, and downstream applications. We identified and solved workflow design gaps affecting data reporting on all open positions, agreements, margins, collaterals, and corporate entities, thus enhancing the capability for addressing irregularities. Our experts established an integrated and collaborative system, commanding transaction and reference data within a single platform by incorporating 166 distinct controls pertaining to data completeness, accuracy, consistency, and timeliness within a strategic framework.
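The 166 controls mentioned above can be run uniformly if each control is just a named check over a dataset. The sketch below shows that framing with a single completeness control; the field names and the two example controls are illustrative stand-ins, not the actual control catalog.

```python
def completeness(rows, field):
    """A sample control: every row must carry a non-empty value for `field`."""
    return all(r.get(field) not in (None, "") for r in rows)

def run_controls(rows, controls):
    """Run each named control over the dataset; return {name: pass/fail}.

    `controls` maps a control name to a callable taking the rows, so
    completeness, accuracy, consistency, and timeliness checks all plug
    into the same runner.
    """
    return {name: check(rows) for name, check in controls.items()}
```

A daily report run would then gate FDIC submission on every control in the catalog returning True.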

OUTCOMES

Our quality assurance and testing solution delivered the following impacts:
  • Faster and more efficient internal analysis with highly accurate QFC open positions
  • 100% compliance with timing and format of required daily QFC report submissions to the FDIC
  • Significant decrease in exceptions before go-live and zero critical defects delivered post-go-live
  • An intuitive UI dashboard reflecting the real-time status of critical underlying data volumes, leakages, job runs, and other statistics
