

Client
A large NY-based life insurance and investment company
Goal
Create a secure, automated solution for data ingestion and a robust framework for distribution across channels
Tools and Technologies
Python, PySpark, AWS Glue/Redshift/Lambda/S3/Aurora, Stonebranch, Jira, GitHub
Business Challenge
The client used a legacy product data infrastructure (PACE) and other systems that neither provided fully secure access nor enabled efficient quality checks. This hampered system integration, data ingestion, and distribution.
Workflows and checks were not adequately automated, and they did not offer a reusable framework to generate and deliver outbound data files aligned with business requirements.

Solution
- Created reusable and scalable ETL/ELT pipelines using Python and AWS services
- Integrated Stonebranch for orchestration and automated job scheduling, with monitoring mechanisms and alerts
- Tuned Redshift queries and optimized data ingestion processes to reduce latency and improve throughput
- Defined data specifications and output formats as per business needs
- Built a configurable pipeline to create dynamic CSV/Excel files from Redshift views
- Automated file delivery via email/SFTP monitored and orchestrated by Stonebranch
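As a sketch of how such a configurable export step might look, the snippet below dumps a database view to CSV from a per-file configuration. All names (views, columns, config keys) are illustrative, and sqlite3 stands in for Redshift:

```python
import csv
import io
import sqlite3

# Hypothetical per-file configuration; in the real pipeline each entry
# would name a Redshift view and the delivery channel (email/SFTP).
EXPORT_CONFIG = [
    {"view": "product_rates", "filename": "product_rates.csv", "delimiter": ","},
]

def export_view(conn, view: str, delimiter: str = ",") -> str:
    """Dump a view (or table) to CSV text, headers included."""
    cur = conn.execute(f"SELECT * FROM {view}")
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    writer.writerow([col[0] for col in cur.description])
    writer.writerows(cur.fetchall())
    return buf.getvalue()

# Demo with sqlite3 standing in for Redshift.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product_rates (product TEXT, rate REAL)")
conn.executemany("INSERT INTO product_rates VALUES (?, ?)",
                 [("Term Life", 0.031), ("Whole Life", 0.045)])
for cfg in EXPORT_CONFIG:
    payload = export_view(conn, cfg["view"], cfg["delimiter"])
    print(payload.splitlines()[0])  # header row
```

Driving the exports from configuration rather than code is what makes the pipeline reusable across new products: adding an output file means adding a config entry, not a new script.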

Outcomes
- Delivered improved data distribution and a reusable framework for ingesting and distributing data across existing and new products
- Streamlined operations and improved data accessibility
- Enhanced performance and scalability
- Ensured better data quality and governance with automation and structured reusability

Our experts can help you find the right solutions to meet your needs.
Modern insurance product data services platform



Client
A large NY-based life insurance and investment services provider
Goal
Create a unified, scalable architecture for secure, standardized sharing of data with downstream services
Tools and Technologies
Java 21, Spring Boot, AWS, Jenkins, Stonebranch, Jira, Microservices, Redshift, Aurora, DynamoDB, Angular, JavaScript, MySQL, EKS, SQS
Business Challenge
The client used a product data mart that lacked a secure, standardized method to share data with downstream services, which affected data governance and consistency across applications. It also faced the risk of disrupting tightly integrated legacy data flows while shifting from a legacy PACE system, which supports critical AEM microservices, to a PDP mart.
The client needed a new system that was scalable, and the modernization had to be carried out with minimal disruption to existing systems.

Solution
Implemented a dual-solution strategy to modernize data access and delivery. Key elements included:
- Built a microservices-based platform and deployed it on AWS
- Enabled secure, flexible API access to the product data mart using tagged identifiers
- Ensured a scalable design with minimal code changes for expansion
- Allowed existing AEM microservices to operate without changes during and post-transition
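The tagged-identifier access pattern can be sketched as a registry that maps tags to governed views, so callers never address mart tables directly. This is shown in Python for brevity (the platform itself is Java/Spring Boot), and every tag, view, and field name here is illustrative:

```python
from dataclasses import dataclass

# Hypothetical tag registry: each tagged identifier maps to a governed
# view in the product data mart.
TAG_REGISTRY = {
    "product:annuity": "pdp.vw_annuity_products",
    "entity:carrier": "pdp.vw_carriers",
}

@dataclass
class DataRequest:
    tag: str
    fields: tuple

def resolve(request: DataRequest) -> str:
    """Translate a tagged request into a SQL statement against the mart."""
    view = TAG_REGISTRY.get(request.tag)
    if view is None:
        raise KeyError(f"unknown tag: {request.tag}")
    return f"SELECT {', '.join(request.fields)} FROM {view}"

print(resolve(DataRequest("product:annuity", ("product_id", "name"))))
```

Because the registry is the only place tags bind to physical views, new data sets can be exposed (or relocated) without changing any consumer code.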

Outcomes
- Improved efficiency and security by enabling standardized, governed access to product and entity data
- Achieved faster delivery and reduced manual effort
- Ensured seamless integration while modernizing the backend
- Maintained front-end stability of AEM microservices

Modern insurance platform for loan processing



Client
A large NY-based life insurance and investment company
Goal
Modernize product data management services to enhance loan processing and improve user experience
Tools and Technologies
Angular 15, .NET 4.8, .NET 8.0, AWS EC2, SQL Server, PostgreSQL, Stonebranch, SharePoint, IIS
Business Challenge
The client operated a critical legacy application for loan processing that had significant operational inefficiencies and shortcomings in user experience. These hindered the speed and accuracy of loan disbursements, affecting both internal operations and customer satisfaction.
The goal was to modernize the platform with scalable solutions that could enable secure distribution of critical data and better governance, while integrating smoothly with downstream applications.

Solution
- Implemented a holistic modernization approach with a focus on system integration, functional enhancements, and UI transformation
- Streamlined deal creation and approval through a third-party system with robust verification and compliance features
- Enabled multi-loan support under a single deal
- Ensured consistency by introducing a standardized component library with reusable UI components
- Revamped the interface with intuitive design, responsive layouts, and improved user experience
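The multi-loan-per-deal model can be illustrated with a minimal data structure. This is a Python sketch with assumed names (the platform itself is Angular/.NET), not the application's actual domain model:

```python
from dataclasses import dataclass, field

@dataclass
class Loan:
    loan_id: str
    amount: float
    verified: bool = False  # set once compliance verification passes

@dataclass
class Deal:
    deal_id: str
    loans: list = field(default_factory=list)

    def add_loan(self, loan: Loan) -> None:
        self.loans.append(loan)

    def total_exposure(self) -> float:
        return sum(loan.amount for loan in self.loans)

    def ready_for_approval(self) -> bool:
        # A deal is approvable only once every loan passes verification.
        return bool(self.loans) and all(loan.verified for loan in self.loans)

deal = Deal("D-1001")
deal.add_loan(Loan("L-1", 250_000.0, verified=True))
deal.add_loan(Loan("L-2", 100_000.0, verified=True))
print(deal.total_exposure(), deal.ready_for_approval())
```

Keeping verification status on each loan while approval lives on the deal is what lets several loans move through the pipeline independently under one approval workflow.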

Outcomes
- Enhanced efficiency, scalability, and user-centric loan operations
- Improved ROI through greater efficiency, better user experience, and agility
- Reduced verification effort by 30–55% and enabled focus on higher-value tasks
- Streamlined processes and minimized data duplication
- Ensured consistent design and efficient development
- Increased customer satisfaction, with a projected 50% rise in retention

Eagle Access Data Platform Transforms Accounting



Client
A large NY-based life insurance and investment company
Goal
Consolidate multiple accounting systems into a centralized, reliable data warehouse to improve reporting and decision-making
Tools and Technologies
BNY Eagle Access, Python, Oracle, AWS EC2 and S3, SQL Server, Jira, Stonebranch
Business Challenge
The insurer faced growing inefficiencies due to siloed accounting systems that lacked integration, consistency, and scalability. Reporting processes were time-consuming, error-prone, and lacked real-time visibility—hindering timely business and investment decisions. A centralized solution was needed to ingest and unify data from disparate systems like SAP GL, Singularity, and Loan Management into a single, trusted platform to support strategic financial insights and reduce operational complexity.

Solution
- Built a centralized accounting data warehouse using the Eagle Access secure private cloud environment
- Ingested and standardized data from SAP GL, Singularity, and Loan Management systems
- Used AWS EC2, S3, and Stonebranch Universal Automation Controller for cloud infrastructure and job orchestration
- Enabled real-time reporting via Tableau integration and migration of legacy dashboards
- Improved data accuracy and consistency through robust validation and automation
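A validation pass of the kind used when standardizing feeds from SAP GL, Singularity, and Loan Management can be sketched as a per-record check before records enter the warehouse. The field names and rules here are assumptions for illustration:

```python
# Fields every standardized accounting record is assumed to carry.
REQUIRED_FIELDS = {"account_id", "posting_date", "amount", "source_system"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    amount = record.get("amount")
    if amount is not None and not isinstance(amount, (int, float)):
        errors.append("amount must be numeric")
    return errors

good = {"account_id": "A1", "posting_date": "2024-01-31",
        "amount": 1200.50, "source_system": "SAP GL"}
bad = {"account_id": "A2", "amount": "12,00"}
print(validate_record(good))  # []
print(validate_record(bad))   # missing fields plus a non-numeric amount
```

Collecting all errors per record, rather than failing on the first, makes it possible to report complete data-quality statistics per source system on each run.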

Outcomes
- Created a unified source of truth for all accounting data
- Enabled faster, more accurate reporting and analytics, improving business and investment decision-making
- Reduced data silos and improved accessibility across systems
- Minimized infrastructure complexity and operational risk with secure private cloud hosting
- Enhanced efficiency through automated data processing and orchestration

Data migration to cloud expedites credit risk functions



Client
A leading North American bank
Goal
Migrate credit risk data and SAS-based analytics models from on-premises data warehouse to AWS to enhance functionality
Tools and Technologies
AWS Glue, Redshift, DataSync, Athena, CloudWatch, SageMaker; Apache Airflow; Delta Lake; Power BI
Business Challenge
The credit risk unit of a major bank aimed to migrate SAS-based analytics models containing data for financial forecasting and sensitivity analysis to Amazon SageMaker.
This was to leverage benefits such as enhanced scalability, improved maintenance for MLOps engineers, and better developer experience. It also sought to migrate credit risk data from a Netezza-based on-premises data warehouse to AWS, utilizing a data lake on AWS S3 and a data warehouse on Redshift to support model migration.

Solution
- Decoupled data workload processing from relational systems using a phased approach, focusing on historical migration, transformation complexity, data volumes, and the ingestion frequencies of incremental loads
- Developed a flexible ETL framework using AWS DataSync to extract data from Netezza to AWS as flat files
- Transformed data in S3 layers using Glue ETL and moved it to the Redshift data warehouse
- Enabled Glue integration with Delta Lake for incremental data workloads
- Built ETL workflows using AWS Step Functions and orchestrated concurrent runs of the workflows using Apache Airflow
- Architected the data shift from Netezza to AWS, leveraging the flexible ETL framework
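The bookkeeping behind incremental loads can be sketched as a watermark check: each run processes only rows newer than the last recorded high-water mark. The column names and timestamps below are illustrative, not the bank's schema:

```python
def incremental_batch(rows, watermark):
    """Select rows newer than the watermark and return the advanced watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

rows = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-02-01"},
    {"id": 3, "updated_at": "2024-03-01"},
]
batch, wm = incremental_batch(rows, "2024-01-15")
print([r["id"] for r in batch], wm)  # [2, 3] 2024-03-01
```

In the actual pipeline this role is played by Glue job bookmarks and Delta Lake transaction state; the sketch only shows why each incremental run stays proportional to new data rather than to total history.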

Outcomes
- Enhanced financial forecasting and sensitivity analysis operations with analytical models and data migrated to the AWS public cloud
- Expedited time-to-market catering to client’s downstream consumption needs through Power BI and Amazon SageMaker

Generative AI platform for business use cases


Commercial & Corporate Banking
Gen AI platform offers future-ready capabilities

Client
A leading North American bank
Goal
Provide a wide range of AI capabilities for various risk and business teams and avoid building fragmented, outdated systems
Tools and Technologies
Amazon Bedrock and Titan V2, pgvector, Faiss, OpenSearch, Llama 7B, Claude Sonnet 3.5 and 3.7
Business Challenge
The Enterprise Risk function at a leading North American bank initiated a Generative AI (Gen AI) solution to offer a wide range of AI capabilities, including document intelligence, summarization, generation, translation, and more.
As the project evolved through proofs of concept and pilots, a key challenge emerged: the risk of creating a fragmented ecosystem with an overwhelming array of unmanageable bespoke solutions, model integrations, and reliance on potentially outdated models and libraries.

Solution
Based on prior engagements across clients, our team delivered thought leadership on developing and delivering capabilities using a platform approach. We also set up a Minimum Viable Product team to iterate on new problem areas and solution approaches. Platform development included generalized capabilities for:
- Setting up document ingestion pipelines, with a choice of parsing approaches, embedding models, and vector index stores
- A factory model, with configurations for integrating new parsers, embedding models, LLM interfaces, etc., to quickly bring new capabilities to the platform
- User management, SSO integration, and entitlements management
- API integration to bring in information/data from internal and external sources
- Platform support for pgvector, Faiss, OpenSearch, Amazon Titan V2, Llama 7B, and Claude Sonnet 3.5 and 3.7
- An intuitive chat interface for end users and for AI Masters (designated business users trained in prompt engineering and other techniques to assemble new AI/Gen AI capabilities for users through configuration)
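The factory model above can be sketched as a registry that maps configuration names to components, so that adding a parser or embedding model means registering it rather than touching pipeline code. The component names here are placeholders, not the platform's actual modules:

```python
# Registry of parser implementations, keyed by the name used in config.
PARSERS = {}

def register_parser(name):
    """Decorator that makes a parser available to the factory under `name`."""
    def wrap(fn):
        PARSERS[name] = fn
        return fn
    return wrap

@register_parser("plain")
def parse_plain(doc: str) -> list:
    return doc.split()

@register_parser("lower")
def parse_lower(doc: str) -> list:
    return doc.lower().split()

def build_pipeline(config: dict):
    """Assemble an ingestion step from configuration instead of code."""
    parser = PARSERS[config["parser"]]
    return lambda doc: parser(doc)

pipeline = build_pipeline({"parser": "lower"})
print(pipeline("Risk Summary"))  # ['risk', 'summary']
```

The same registration pattern extends to embedding models and vector stores, which is what keeps the platform from fragmenting into bespoke per-use-case integrations.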

Outcomes
- A future-ready Gen AI platform that can easily incorporate new capabilities and updates
- Multiple specific capabilities, called skills, for use by various risk teams and business users
- A forward-looking roadmap, including ability to compose more complex capabilities using atomic capabilities

Custom analytics enable faster business decisions


Brokerage, Wealth & Asset Management

Client
U.S.-based asset management company
Goal
Streamline and improve data and analytics capabilities for enhanced user experiences
Tools and Technologies
Java, React JS, MS SQL Server, Spring Boot, GitHub, Jenkins
Business Challenge
The client captures voluminous data from multiple internal and external sources, but business users lacked quick, on-demand capabilities to generate customized portfolio analytics on attributes such as average quality, yield to maturity, and average coupon.
The client's teams were spending enormous manual effort and elapsed time (approximately 12-15 hours) responding to requests for proposals from their respective clients.

Solution
Iris implemented a data acquisition and analytics system with pre-processing capabilities for grouping, classifying, and handling historical data.
A data dictionary was established for key concepts, such as asset classes and industry classifications, enabling end users to access data for analytical computation. The analytics engine was refactored, optimized, and integrated into the streamlined investment performance data infrastructure.
The team developed an interactive self-service capability, allowing business users to track data availability, perform advanced searches, generate custom analytics, visualize information, and utilize the insights for decision-making.
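Portfolio attributes such as average coupon and yield to maturity are typically computed as market-value-weighted averages across holdings. A minimal illustration with made-up holdings (not the client's data or engine):

```python
def weighted_average(holdings, attribute):
    """Market-value-weighted average of a per-holding attribute."""
    total_mv = sum(h["market_value"] for h in holdings)
    return sum(h[attribute] * h["market_value"] for h in holdings) / total_mv

portfolio = [
    {"market_value": 600.0, "coupon": 4.0, "ytm": 4.5},
    {"market_value": 400.0, "coupon": 3.0, "ytm": 3.5},
]
print(weighted_average(portfolio, "coupon"))  # 3.6
print(weighted_average(portfolio, "ytm"))     # 4.1
```

Pre-computing such aggregates over classified, historical holdings is what turns a 12-15 hour manual RFP response into an on-demand query.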

Outcomes
The solution brought several benefits to the client, including:
- Simplified data access to generate custom analytics for end users
- Eliminated manual processing and the need for complex queries
- Enhanced the stakeholder experience
- Reduced response time to client RFPs by over 50%

Data warehouse enhances client communications

Client
A U.S.-based investment bank
Goal
Improve data collation and information quality for enhanced marketing and client reporting functions
Tools and Technologies
Composite C1, Oracle DB, PostgreSQL, Vermilion Reporting Suite, Python, MS SQL Server, React.js


Business Challenge
The client's existing investment data structure lacked a single source of truth for investment and performance data. The account management and marketing teams were expending significant manual effort to track portfolio performance, identify opportunities, and ensure accurate client reporting. The time-consuming, manual processes for generating marketing exhibits and client reports were highly error-prone.

Solution
Iris implemented a comprehensive investment data infrastructure for a single source of truth and improved reporting capabilities for marketing content and client report generation. An automated Quality Assurance process was instituted to validate the information in critical marketing materials, such as fact sheets, snapshots, sales kits, and flyers, against the respective data source systems. Retail and institutional portals were developed to provide a consolidated view of portfolios, with the ability to drill down to underlying assets, AUM (Assets Under Management) trends, incentives, commissions, and active opportunities.
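The automated QA process can be sketched as a tolerance-based reconciliation of the values printed in a marketing exhibit against the source system. The field names and tolerance below are assumptions for illustration:

```python
def reconcile(exhibit: dict, source: dict, tolerance: float = 1e-4) -> list:
    """Return the fields whose exhibit value drifts from the source system."""
    return [k for k in source
            if k not in exhibit or abs(exhibit[k] - source[k]) > tolerance]

# Source-system truth vs. values extracted from a fact sheet.
source = {"one_yr_return": 0.0812, "aum_millions": 1520.4}
exhibit = {"one_yr_return": 0.0812, "aum_millions": 1519.0}
print(reconcile(exhibit, source))  # ['aum_millions']
```

Running such a check across every fact sheet, snapshot, sales kit, and flyer is how discrepancies are caught before materials reach clients rather than after.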

Outcomes
The new data infrastructure delivered a holistic, on-demand view of investment details, including performance characteristics, breakdowns, attributions, and holdings, to the client's marketing team and account managers with:
- ~95% reduction in performance data and exhibit information discrepancies
- ~60% improvement in operational efficiency in core marketing and client reporting functions
Cloud Data Lakehouse for Single Source of Truth

Client
A medical devices and fertility solutions company
Goal
Establish a cloud data warehouse for a single source of truth and timely month-end activities
Tools and Technologies
Azure Data Factory, Azure Data Lake, Power BI, Synapse Analytics


Business Challenge
The client had multiple instances of ERPs, sales systems, and warehouses built on obsolete technology and frameworks. The existing systems siloed the data, resulting in inconsistent versions of the truth. The client's finance and sales teams were struggling to reconcile data offline and feed it back into the ERPs, causing significant delays in month-end activities. As the systems of record were not synced and legacy reports were built on the interim warehouses, line managers and executive teams were unable to extract actionable, comprehensive insights. A solution to onboard and integrate new datasets on an ongoing basis was also required to support mergers and acquisitions.

Solution
A strategy to migrate data and BI applications to MS Azure was finalized. The transformation was executed in phases and included discovery, report rationalization, and the foundational build of a global system of reporting.
The solution included data ingestion with Azure Data Factory, data storage and processing using Azure Data Lake and Synapse Analytics, and reports and dashboards with Power BI. A utility to accelerate the onboarding of new data entities was conceptualized and delivered to integrate new datasets in support of mergers and acquisitions.
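The onboarding utility's declarative approach might be sketched as follows: each new dataset from an acquired entity is described by a small configuration, and the ingestion steps are derived from it rather than hand-built per source. This is a Python sketch with assumed field and step names, not the utility's actual implementation:

```python
def onboarding_plan(dataset: dict) -> list:
    """Derive ordered ingestion steps from a declarative dataset description."""
    steps = [
        f"ingest {dataset['source']} via Azure Data Factory",
        f"land raw files in the {dataset['lake_zone']} zone of the data lake",
    ]
    if dataset.get("needs_dedup"):
        steps.append("deduplicate on business keys in Synapse")
    steps.append(f"publish curated table {dataset['target_table']}")
    return steps

plan = onboarding_plan({"source": "acquired_erp.sales_orders",
                        "lake_zone": "raw",
                        "needs_dedup": True,
                        "target_table": "curated.sales_orders"})
for step in plan:
    print(step)
```

Describing datasets declaratively is what lets each acquisition add sources without a bespoke pipeline build.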

Outcomes
Iris data practitioners helped the client overcome key challenges and advance data warehousing capabilities by:
- Establishing a "single version of the truth" that enabled data-driven decisions and timely completion of month-end and other critical activities
- Delivering analytics for cash business processes, including the subject areas of sales, inventory, shipping, and finance, on Power BI
- Facilitating the generation of actionable, insightful reports and dashboards, allowing "self-service" consumption for the business leadership
Software transformation gets compliance for bank

Client
A global investment bank
Goal
Build a unified functional validation system for FDIC compliance
Tools and Technologies
SQL Server, Sybase, Data Lake, UTM, .NET, DTA, Control-M, ALM, JIRA, Git, RLM, Nexus, Unix, WinSCP, Putty, Python, PyCharm, Confluence, Rabacus, SNS, and Datawatch


Business Challenge

Solution

Outcomes
- Faster and more efficient internal analysis with highly accurate QFC open positions
- 100% compliance with timing and format of required daily QFC report submissions to the FDIC
- Significant decrease in exceptions before the platform's go-live and zero critical defects delivered post-go-live
- An intuitive UI dashboard reflecting the real-time status of critical underlying data volumes, leakages, job runs, and other statistics
