
Banking & Financial Services

Quality engineering optimizes a DLT platform

Reliability, availability, scalability, observability, and resilience ensured; release cycle time and testing time improve by 75% and 80%, respectively.

Client
A leading provider of financial services digitization solutions
Goal
Reliability assurance for a distributed ledger technology (DLT) platform
Tools and Technologies
Kotlin, Java, HTTP Client, AWS, Azure, GCP, G42, OCP, AKS, EKS, Docker, Kubernetes, Helm Charts, Terraform
Business Challenge

A leader in Blockchain-based digital financial services required assurance for the non-GUI (graphical user interface), command line interface (CLI), microservices and Representational State Transfer (REST) APIs of a distributed ledger technology (DLT) platform, as well as platform reliability assurance on Azure and AWS Kubernetes services (AKS, EKS) covering availability, scalability, observability, monitoring and resilience (disaster recovery). The client also wanted capacity recommendations and identification of any performance bottlenecks (whether impacting throughput or individual transaction latency), along with comprehensive automation coverage for older and newer product versions and management of frequent deliveries of multiple DLT product versions every month.

Solution
  • 130+ DApps were developed and enhanced on the existing automation framework for terminal CLI and cluster utilities
  • Quality engineering was streamlined with real-time dashboarding via Grafana and Prometheus
  • Coverage for older and newer versions of the DLT platform was automated to support smooth, frequent deliveries and confidence in releases
  • The test case management tool, Xray, was implemented for transparent automation coverage
  • Utilities were developed to execute a testing suite for AKS, EKS and local Mac/Windows/Linux cluster environments on a daily or as-needed basis (see the sketch after this list)
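
A brief sketch of the cross-environment idea follows. The delivered framework was built in Kotlin and Java; the Python dispatcher below is purely illustrative, and the context names, suite path and report location are hypothetical assumptions rather than the actual implementation.

    # Hypothetical sketch: run the same regression suite against several
    # Kubernetes contexts (AKS, EKS, local) on a daily or as-needed basis.
    import subprocess
    import sys

    # Illustrative context names; real contexts would come from kubeconfig.
    CLUSTER_CONTEXTS = ["aks-qa", "eks-qa", "docker-desktop"]

    def run_suite(context: str) -> bool:
        """Point kubectl at the target cluster, then run the regression suite."""
        subprocess.run(["kubectl", "config", "use-context", context], check=True)
        result = subprocess.run(
            [sys.executable, "-m", "pytest", "tests/regression",
             "--junitxml", f"reports/{context}.xml"]
        )
        return result.returncode == 0

    if __name__ == "__main__":
        failures = [ctx for ctx in CLUSTER_CONTEXTS if not run_suite(ctx)]
        print("Failed contexts:", failures or "none")
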
Outcomes
  • Automation shortened release cycles from once a month to once a week, and testing time was reduced by 80%
  • Test automation coverage of 2,000 test cases was developed, with a pass rate of 96% in daily runs
  • Compatibility was created across AWS-EKS, Azure-AKS, Mac, Windows, Linux and local cluster
  • Increased efficiency in deliverables, along with annual savings of $350K in test case management
  • An average throughput of 25 complete workflows per second was sustained
  • The 95th percentile flow completion time was kept below 10 seconds


BANKING

Conversational assistant boosts AML product assurance

Gen AI-powered responses improve the turnaround time for technical support on recurring issues, resulting in a highly efficient product assurance process.

Client
A large global bank
Goal
Improve turnaround time to provide technical support for the application support and global product assurance teams
Tools and Technologies
React, Sentence–Bidirectional Encoder Representations from Transformers (S-BERT), Facebook AI Similarity Search (FAISS), and Llama-2-7B-chat
Business Challenge

The application support and global product assurance teams of a large global bank faced numerous challenges in delivering efficient and timely technical support, as they had to manually identify solutions to recurring problems within the Known Error Database (KEDB), comprising documents in various formats. With the high volume of support requests and limited availability of teams across multiple time zones, a large backlog of unresolved issues developed, leading to higher support costs.

Solution

Our team developed a conversational assistant using Gen AI by:

  • Building an interactive customized React-based front-end
  • Ringfencing a corpus of problems and solutions documented in the KEDB
  • Parsing, formatting and extracting text chunks from source documents and creating vector embeddings using Sentence–Bidirectional Encoder Representations from Transformers (S-BERT)
  • Storing these in a Facebook AI Similarity Search (FAISS) vector database
  • Leveraging a local Large Language Model (Llama-2-7B-chat) to generate summarized responses (see the sketch after this list)
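
A minimal sketch of this retrieval flow is shown below. It assumes the sentence-transformers and faiss Python packages; the sample KEDB chunks, the all-MiniLM-L6-v2 model choice and the generate() helper standing in for the local Llama-2-7B-chat call are illustrative assumptions, not the production implementation.

    # Sketch: embed KEDB chunks with an S-BERT model, index them in FAISS,
    # and pass the best matches to a local LLM for a summarized answer.
    import faiss
    import numpy as np
    from sentence_transformers import SentenceTransformer

    embedder = SentenceTransformer("all-MiniLM-L6-v2")  # an S-BERT-family model

    # In production these chunks come from parsing the KEDB documents.
    kedb_chunks = [
        "If the batch scheduler hangs on job X, restart the agent service.",
        "Access errors on report Y usually indicate an expired entitlement.",
    ]
    vectors = embedder.encode(kedb_chunks, normalize_embeddings=True)

    index = faiss.IndexFlatIP(int(vectors.shape[1]))  # cosine similarity on unit vectors
    index.add(np.asarray(vectors, dtype="float32"))

    def retrieve(question: str, k: int = 2) -> list[str]:
        q = embedder.encode([question], normalize_embeddings=True)
        _, ids = index.search(np.asarray(q, dtype="float32"), k)
        return [kedb_chunks[i] for i in ids[0]]

    def answer(question: str) -> str:
        context = "\n".join(retrieve(question))
        prompt = f"Using only this context:\n{context}\n\nQuestion: {question}\nAnswer:"
        return generate(prompt)  # generate() is a placeholder for the local Llama-2-7B-chat call
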
Outcomes

The responses generated using the Llama-2-7B LLM were impressive and significantly reduced overall effort. Future enhancements to the assistant would involve:

  • Creating support tickets based on information collected from users
  • Categorizing tickets based on the nature of the problem
  • Automating repetitive tasks such as access requests, data volume enquiries and dashboard updates
  • Auto-triaging support requests by asking users a series of questions to determine the severity and urgency of the problem



INSURANCE

AI-powered summarization boosts compliance workflow

Gen AI-enabled conversational assistant substantially simplifies access to underwriting policies and procedures across multiple, complex documents.

Client
A leading specialty property and casualty insurer
Goal
Improve underwriters’ ability to review policy submissions by providing easier access to information stored across multiple, voluminous documents.
Tools and Technologies
Azure OpenAI Service, React, Azure Cognitive Services, Llama-2-7B-chat, OpenAI GPT 3.5-Turbo, text-embedding-ada-002 and all-MiniLM-L6-v2
Business Challenge

The underwriters at a leading specialty property and casualty insurer must refer to multiple documents and handbooks, each running to several hundred pages, to understand the policies and procedures key to the underwriting process. Significant effort was required to repeatedly consult these documents for each policy submission.

Solution

A Gen-AI enabled conversational assistant for summarizing information was developed by:

  • Building a React-based customized interactive front end
  • Ringfencing a knowledge corpus of specific documents (e.g., an insurance handbook, loss adjustment and business indicator manuals, etc.)
  • Leveraging OpenAI embeddings and LLMs through Azure OpenAI Service along with Azure Cognitive Services for search and summarization with citations
  • Developing a similar interface in the Iris-Azure environment with a local LLM (Llama-2-7B-chat) and embedding model (all-MiniLM-L6-v2) to compare responses (see the sketch after this list)
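
The Azure-based flow can be roughly sketched with the Azure SDKs for Python. The endpoint URLs, index name, field names and deployment name below are placeholders, and the production front end was React rather than a script; treat this as a hedged illustration of the search-then-summarize-with-citations pattern.

    # Sketch: pull passages from an Azure Cognitive Search index, then ask an
    # Azure OpenAI GPT-3.5-Turbo deployment to summarize them with citations.
    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents import SearchClient
    from openai import AzureOpenAI

    search = SearchClient(
        endpoint="https://<search-service>.search.windows.net",  # placeholder
        index_name="underwriting-handbooks",                     # placeholder
        credential=AzureKeyCredential("<search-key>"),
    )
    llm = AzureOpenAI(
        azure_endpoint="https://<aoai-resource>.openai.azure.com",  # placeholder
        api_key="<aoai-key>",
        api_version="2024-02-01",
    )

    def summarize(question: str) -> str:
        hits = search.search(question, top=5)
        # 'id' and 'content' are assumed index field names.
        context = "\n".join(f"[{h['id']}] {h['content']}" for h in hits)
        chat = llm.chat.completions.create(
            model="gpt-35-turbo",  # Azure deployment name (placeholder)
            messages=[
                {"role": "system",
                 "content": "Answer from the excerpts and cite each source's [id]."},
                {"role": "user", "content": f"{context}\n\nQuestion: {question}"},
            ],
        )
        return chat.choices[0].message.content
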
Outcomes

Underwriters significantly streamlined the work of confirming that policy constructs align with applicable policies and procedures and of flagging potential compliance issues in complex cases.

The linguistic search and summarization capabilities of the OpenAI GPT 3.5-Turbo LLM (170 bn parameters) were found to be impressive. Notably, the local LLM (Llama-2-7B-chat), with much fewer parameters (7 bn), also produced acceptable results for this use case.



BANKING

Automated financial analysis reduces manual effort

Analysts in a large North American bank's commercial lending and credit risk operations can intelligently retrieve information across multiple documents.

Client
Commercial lending and credit risk units of a large North American bank
Goal
Automated retrieval of information from multiple financial statements enabling data-driven insights and decision-making
Tools and Technologies
OpenAI API (GPT-3.5 Turbo), LlamaIndex, LangChain, PDF Reader
Business Challenge

A leading North American bank had large commercial lending and credit risk units. Analysts in those units typically refer to numerous sections of a financial statement, including balance sheets, cash flows, and income statements, supplemented by footnotes and leadership commentaries, to extract decision-making insights. Switching between multiple pages of different documents was labor-intensive and made the analysis more difficult.

Solution

Many of these tasks were automated using Gen AI tools. Key steps included:

  • Ingest multiple URLs of financial statements
  • Convert these to text using the PDF Reader library
  • Build vector indices using LlamaIndex
  • Create text segments and corresponding vector embeddings using OpenAI’s API, for storage in a multimodal vector database (e.g., Deep Lake)
  • Compose graphs of keyword indices for vector stores to combine data across documents
  • Break down complex queries into multiple searchable parts using LlamaIndex’s DecomposeQueryTransform (see the sketch after this list)
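
The core of the pipeline can be sketched with LlamaIndex as below. This is a simplified illustration: it builds one combined vector index for cross-document questions rather than reproducing the keyword-index graphs and DecomposeQueryTransform used in the delivered solution; the folder paths and query are hypothetical, OPENAI_API_KEY is assumed to be set, and import paths vary between LlamaIndex versions.

    # Sketch: index downloaded financial statements and query across them.
    # LlamaIndex defaults to OpenAI embeddings and GPT-3.5 when a key is set.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    # One folder of PDF statements per fiscal year (illustrative paths).
    docs = (SimpleDirectoryReader("statements/fy2022").load_data()
            + SimpleDirectoryReader("statements/fy2023").load_data())

    # A single vector index over both statements; the delivered solution
    # composed keyword-index graphs and decomposed complex queries instead.
    index = VectorStoreIndex.from_documents(docs)
    engine = index.as_query_engine(similarity_top_k=5)

    print(engine.query(
        "How did operating cash flow change year over year, "
        "and what do the footnotes attribute the change to?"
    ))
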
Outcomes

The solution delivered impressive results in financial analysis, notably reducing manual effort when multiple documents were involved. Since the approach is still largely linguistic in nature, considerable prompt engineering may be required to generate accurate responses, and the limited semantic awareness of Large Language Models (LLMs) means queries may need to include qualifying information to obtain precise answers.



BROKERAGE & WEALTH

Next generation chatbot eases data access

Gen AI tools help users of a retail brokerage trading platform obtain information related to their specific needs and complex queries.

Client
Large U.S.-based Brokerage and Wealth Management Firm
Goal
Enable a large number of users to readily access summarized information contained in voluminous documents.
Tools and Technologies
Google Dialogflow ES, Pinecone, LlamaIndex, OpenAI API (GPT-3.5 Turbo)
Business Challenge

A large U.S.-based brokerage and wealth management client has a large number of users on its retail trading platform, which offers sophisticated trading capabilities. Although extensive information was documented in hundreds of pages of product and process manuals, it was difficult for users to access and understand information related to their specific needs (e.g., How is margin calculated? What are Rolling Strategies? Explain Beta Weighting).

Solution

Our Gen AI solution encompassed:

  • Building a user-friendly interactive chatbot using Dialogflow in Google Cloud
  • Ringfencing a knowledge corpus comprising specific documents to be searched against and summarized (e.g., 200-page product manual, website FAQ content)
  • Using a vector database to store vectors from the corpus and extract relevant context for user queries
  • Interfacing the vector database with the OpenAI API to analyze vector-matched contexts and generate summarized responses (see the sketch after this list)
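
The fulfillment logic behind the chatbot can be sketched as below. It assumes the openai and pinecone Python packages; the index name, metadata field and API keys are placeholders, and in production this logic would sit behind a Dialogflow fulfillment webhook rather than a standalone function.

    # Sketch: embed the user's question, pull the closest manual passages
    # from Pinecone, and have GPT-3.5-Turbo compose a summarized answer.
    from openai import OpenAI
    from pinecone import Pinecone

    openai_client = OpenAI()                        # reads OPENAI_API_KEY
    pc = Pinecone(api_key="<pinecone-key>")         # placeholder
    manuals = pc.Index("trading-platform-manuals")  # placeholder index name

    def answer(question: str) -> str:
        emb = openai_client.embeddings.create(
            model="text-embedding-ada-002", input=question
        ).data[0].embedding
        matches = manuals.query(vector=emb, top_k=3, include_metadata=True).matches
        context = "\n".join(m.metadata["text"] for m in matches)  # 'text' field is assumed
        chat = openai_client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system",
                 "content": "Answer using only the provided manual excerpts."},
                {"role": "user", "content": f"{context}\n\nQuestion: {question}"},
            ],
        )
        return chat.choices[0].message.content

    # e.g., answer("How is margin calculated?") is returned to Dialogflow
    # as the fulfillment response.
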
Outcomes

The OpenAI GPT-3.5 turbo LLM (170 bn parameters) delivered impressive linguistic search and summarization capabilities in dealing with information requests. Prompt engineering and training are crucial to secure those outcomes.

In the case of a rich domain such as a trading platform, users may expect additional capabilities, such as:

  • API integration, to support requests requiring retrieval of account/user specific information, and
  • Augmentation of linguistic approaches with semantics to deliver enhanced capabilities.



PROFESSIONAL SERVICES

Release automation reduces testing time by 80%

DevOps implementation and release automation improved testing time, product quality, and global reach for a leading multi-level marketing company.

Client
A leading multi-level marketing company
Goal
Shorten the release cycle and improve product quality
Technology Tools
Amazon CloudWatch, Elasticsearch, Bitbucket, Jenkins, Amazon ECR, Docker, and Kubernetes
Business Challenge

The client's commercial off-the-shelf (COTS) applications were built using substandard code branching methods, causing product quality issues. The absence of a release process, combined with manual integration and deployment, was elongating release cycles. Manual configuration and setup of these applications were also leading to extended downtime. Missing functional, smoke, and regression test cases added to the unstable development environment. The database migration process was manual, resulting in delays, data quality issues, and higher costs.

Solution
  • Code branching and integration strategy for defects / hotfixes in major and minor releases​
  • Single-click application deployment, including environment creation, approval and deployment activities​
  • Global DevOps platform implementation with a launch pad for applications to onboard other countries​
  • Automated configuration and deployment of COTS applications and databases​
  • Automation suite with 90% coverage of smoke and regression test cases​
  • Static and dynamic analysis implementations to ensure code quality and address configuration issues​
Outcomes

Automation of release cycles delivered the following benefits to the client:

  • Release cycle shortened from once a month to once per week
  • MTTR reduced by 6 hrs
  • Downtime decreased to <4 hours from 8 hours
  • Product quality improved, with defect leakage reduced by 75%
  • Testing time reduced by 80%
  • Reach expanded to global geographies
  • Availability, scalability, and fault tolerance enhanced for microservices-based applications


LIFE SCIENCES

Automated app & infra deployment improves scalability

Automated app and infra deployment with DevOps implementation helps a leading medical company launch applications in new geographies, improve time-to-market, and reduce the total cost of ownership.

Client
North America-based fertility and genomics company
Goal
Expand business reach, reduce time-to-market, and support critical compliance
Technology Tools
.NET 5, Vue.js, AWS Secrets Manager, AWS Transfer Family, Amazon RDS, Amazon EKS, Amazon Route 53, Amazon CloudFront, Terraform, GitLab
Business Challenge

The client wanted to expand its reach to Canada, Europe, and APAC regions to meet the requirements of a 10x increase in its user base. Legacy application infrastructure and code built on an old tech stack, with high technical debt, were slowing down the rollout of new features, making the client less competitive. The infra-deployment process was only partially automated, stretching the time-to-market to three months. The total cost of ownership was relatively high. HIPAA and PII compliance were also not supported.

Solution

Iris modernized the application into microservices, built the infrastructure using Terraform and automated its provisioning and configuration.

  • Application developed using .NET 5 and Vue.js
  • Architecture transformed into cloud-native
  • AWS Managed Services, including Secrets Manager, AWS Transfer Family, RDS, EKS, Route 53, CloudFront, and S3, configured using Terraform
  • EKS Cluster and associated components provisioned via Terraform
  • App pushed to container registry using GitLab pipeline
  • Secrets (API keys, database connection strings, etc.) and app images moved to the EKS cluster using an S3 bucket and Helm
  • Static code analysis, coverage and vulnerability scans integrated to ensure code quality and reduce configuration issues
Outcomes
Our DevOps solution enabled the client to achieve significant benefits, including:
  • Application launch in Canada and Europe; Asia Pacific release in the pipeline
  • HIPAA and PII compliance
  • 5x scalability improvement from weekly average usage
  • Time-to-market reduced from three months to three weeks
  • Total cost of ownership lowered by 50%


PROFESSIONAL SERVICES

New platform transforms transaction processing

Platform transformation and multi-cloud integration improve a multinational publishing company's order management, time-to-market and performance.

Client
Multinational publishing, media, and educational company
Goal
Improve order management and transaction processing capabilities
Technology Tools
AWS EKS, Kong, Salesforce Commerce Cloud (SFCC), Salesforce CRM, Jenkins, Sumo Logic, Datadog
Business Challenge

The client's order management platform was complex and had scalability issues, causing a poor customer experience and loss of revenue. The platform was hosted on Oracle cloud, with data stored in different repositories. Services were also hosted in the Oracle cloud and used the Business Intelligence Cloud Connector (BICC) extract to fetch order details from Oracle databases. The low performance of customer-facing applications was causing latency and very high transaction processing times.

Solution

Team Iris transformed Oracle-based SOA services into six microservices and migrated them to AWS EKS for autoscaling with self-healing and monitoring capabilities.

We developed services for publishing data to Salesforce CRM for quick order processing and conversions. The BICC system for diversified information and order history was enabled with real-time integration between Oracle Fusion and materialized views for data consumption.

Post migration, these services were registered in Kong for discovery, and a CI/CD pipeline was created for deployment using Jenkins. Sumo Logic was used for monitoring the logs, and Datadog was used to observe latency, anomalies and other metrics.

Outcomes

The order management platform transformation delivered the following benefits to the client:

  • System performance improved by 70%
  • Transaction processing capability increased by 4x
  • Order processing capabilities were enhanced by 200%
  • Total cost of ownership (TCO) was reduced by 30%


Banking & Financial Services

Next-gen Quality Engineering for Blockchain-DLT platform

Quality engineering implementation helps a digital financial services client smooth the legacy migration of its Blockchain-DLT (distributed ledger technology) platform by advancing automation coverage and patch delivery efficiencies.

Client

A leading digital financial services company

Goal

Blockchain-DLT platform assurance with improved automation coverage

Tools and technologies

Amazon Elastic Kubernetes Service (EKS), Azure Kubernetes Services (AKS), Docker, Terraform, Helm Charts, Microservices, Kotlin, Xray

BUSINESS CHALLENGE

The client's legacy DLT platform did not support cloud capabilities with the Blockchain-DLT tech stack. The non-GUI (graphical user interface) and CLI (command line interface)-based platform lacked a microservices architecture and cluster resilience. The REST (Representational State Transfer) API-based platform did not support platform assurance validation at the backend. Automation coverage for legacy and newer versions of the products was very low. Support for delivery patches was insufficient, impacting the delivery of multiple versions of R3 products each month.

SOLUTION

Iris developed multiple CorDapps to support automation around DLT-platform functionalities and enhanced the CLI-based and cluster utilities in the existing R3 automation framework. The team implemented the test case management tool Xray to improve test automation coverage for legacy and newer versions of the Corda platform, enabling smooth and frequent patch deliveries every month. The quality engineering process was streamlined on the team's Kanban board by modifying the workflows. Iris also introduced the ability to execute a testing suite on a daily or as-needed basis for AKS, EKS and local Mac/Windows/Linux cluster environments.

OUTCOMES

The Blockchain-DLT reliability assurance solution enabled the client to attain:

  • Improved automation coverage of the DLT platform, with 900 test cases and a pass rate of 96% in daily runs
  • Compatibility across AWS-EKS, Azure-AKS, Mac, Windows, Linux, and local clusters
  • Increased efficiency in deliverables with an annual $35K savings in the test case management area



Manufacturing

Tech stack automation expedites script development by 3x

Manual processes across the multi-technology stack were severely affecting script development cycles in terms of time, effort and cost. Iris' application-agnostic test automation framework and DevOps integration helped the client reduce script development time and cost significantly.

Client

A leading building supplies manufacturing company

Goal

To support a 30+ application stack with UI, end-to-end (E2E), API, performance and mobile automation, along with DevOps pipeline integration

Tools and technologies

.NET Core, PeopleSoft, Salesforce, WMS, JavaScript, Angular, FoxPro, C#, Selenium, SpecFlow, RestSharp, NUnit, Mobile Center/Emulators, Allure, Jira, Azure Pipeline, GitHub

BUSINESS CHALLENGE

The client had technology stacks comprising diverse technologies that were difficult to manage. Substantial manual effort and time were spent on integrating checkpoints, elongating the development process. Validating end-to-end business flows across different applications was the prime challenge. Reporting processes were also scattered across the entire application stack, making them fragile.

SOLUTION

Iris developed a robust, application-agnostic test automation framework to support the client’s multiple technology stacks. Following a behavior-driven development (BDD) approach to align acceptance criteria with stakeholders, we built business and application layers over the common utilities in the core framework. Our experts identified E2E business flows to validate the downstream impact of changes and automated the entire stack through a shift-left approach. Azure DevOps integration enabled a common dashboard for reporting. The client attained complete version control to track production health and enforce strong validations.
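
The layering idea can be illustrated with a small sketch. The delivered framework was built on C#, SpecFlow and RestSharp; the Python classes and names below are hypothetical and stand in for the core, application and business layers described above.

    # Sketch of the layered, application-agnostic structure:
    # a core layer of shared utilities, application layers that wrap one
    # system each, and a business layer expressing an end-to-end flow.

    class CoreFramework:
        """Core layer: cross-application utilities (sessions, waits, reporting)."""
        def __init__(self):
            self.steps = []
        def step(self, description: str):
            self.steps.append(description)  # stands in for Allure/Azure DevOps reporting

    class OrderPortal(CoreFramework):
        """Application layer: wraps one system's UI/API interactions."""
        def create_order(self, sku: str, qty: int) -> str:
            self.step(f"create order {sku} x{qty}")
            return "ORD-001"                # placeholder for a Selenium/API call

    class Warehouse(CoreFramework):
        """Application layer for the downstream WMS."""
        def order_status(self, order_id: str) -> str:
            self.step(f"check WMS status for {order_id}")
            return "ALLOCATED"              # placeholder for a WMS API call

    def test_order_flows_downstream():
        """Business layer (BDD style): Given an order is placed, Then the WMS allocates it."""
        order_id = OrderPortal().create_order("SKU-42", qty=2)
        assert Warehouse().order_status(order_id) == "ALLOCATED"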

OUTCOMES

The Iris automation solution enabled the client to surpass several business goals. Key outcomes of the delivered solution included:

  • ~65% Increase in automation coverage
  • 100+ Pipelines for in-scope applications across multiple environments
  • 3700+ Test Automation scripts execution per sprint cycle achieved across applications
  • 3X Faster script development of behavior-driven test cases 
  • Multi-day manual test effort reduced to a few hours of automated regression 
  • 70% Reduction in effort

Contact

Our experts can help you find the right solutions to meet your needs.

Get in touch
Copyright © 2024 Iris Software, Inc. All rights reserved