
Banking & Financial Services

Quality engineering optimizes a DLT platform

Reliability, availability, scalability, observability, and resilience ensured; release cycle time and testing time improved by 75% and 80%, respectively.

Client
A leading provider of financial services digitization solutions
Goal
Reliability assurance for a distributed ledger technology (DLT) platform
Tools and Technologies
Kotlin, Java, HTTP Client, AWS, Azure, GCP, G42, OCP, AKS, EKS, Docker, Kubernetes, Helm Charts, Terraform
Business Challenge

A leader in Blockchain-based digital financial services required assurance for the non-GUI (Graphical User Interface), Command Line Interface (CLI), microservices and Representational State Transfer (REST) API components of a Distributed Ledger Technology (DLT) platform, as well as platform reliability assurance on Azure and AWS services (AKS, EKS) covering availability, scalability, observability, monitoring and resilience (disaster recovery). The client also wanted capacity recommendations and identification of any performance bottlenecks, whether impacting throughput or individual transaction latency. In addition, it required comprehensive automation coverage for older and newer product versions and management of frequent deliveries of multiple DLT product versions each month.

Solution
  • 130+ DApps (decentralized applications) were developed and enhanced on the existing automation framework for terminal CLI and cluster utilities
  • Quality engineering was streamlined with real-time dashboarding via Grafana and Prometheus
  • Coverage for older and newer versions of the DLT platform was automated to support smooth, frequent deliveries and confidence in releases
  • The test case management tool, Xray, was implemented for transparent automation coverage
  • Utilities were developed to execute the testing suite on AKS, EKS and local Mac/Windows/Linux cluster environments on a daily or as-needed basis (a simplified sketch follows this list)
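As an illustration of the last item, the sketch below shows how a single entry point could switch a kubectl context between EKS, AKS and a local cluster and then launch a test suite. The actual utilities in this engagement were built in Kotlin/Java; this Python version, with hypothetical context names and an assumed Gradle entry point, only conveys the idea.

```python
import argparse
import subprocess
import sys

# Hypothetical kubeconfig context names; the real utilities were Kotlin/Java-based.
CONTEXTS = {
    "eks": "arn:aws:eks:us-east-1:123456789012:cluster/dlt-test",
    "aks": "dlt-test-aks",
    "local": "docker-desktop",
}

def run(cmd: list[str]) -> None:
    """Echo and execute a shell command, failing fast on a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main() -> None:
    parser = argparse.ArgumentParser(description="Run the DLT test suite against a chosen cluster")
    parser.add_argument("target", choices=CONTEXTS, help="eks, aks or local")
    parser.add_argument("--suite", default="regression", help="test suite tag to execute")
    args = parser.parse_args()

    # Point kubectl at the chosen cluster, then launch the suite (Gradle entry point assumed).
    run(["kubectl", "config", "use-context", CONTEXTS[args.target]])
    run(["./gradlew", "test", f"-Dsuite={args.suite}"])

if __name__ == "__main__":
    try:
        main()
    except subprocess.CalledProcessError as exc:
        sys.exit(exc.returncode)
```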
Outcomes
  • Automation shortened release cycles from once a month to once a week, and leads' testing time was reduced by 80%
  • Test automation coverage with 2,000 test cases (TCs) was developed, with a 96% pass rate in daily runs
  • Compatibility was created across AWS-EKS, Azure-AKS, Mac, Windows, Linux and local cluster environments
  • Increased efficiency in deliverables was demonstrated, along with annual savings of $350K on TCMs
  • An average throughput of 25 complete workflows per second was sustained
  • The 95th percentile flow-completion time was kept under 10 seconds
Productionizing Generative AI Pilots

Get scalable solutions and unlock insights from information siloed across an enterprise by automating data extraction, streamlining workflows, and leveraging AI models.




    Enterprises have vast amounts of unstructured information, such as onboarding documents, contracts, financial statements, customer interaction records and Confluence pages, with valuable information siloed across formats and systems.

    Generative AI is now starting to unlock new capabilities, with vector databases and Large Language Models (LLMs) tapping into unstructured information using natural language, enabling faster insight generation and decision-making. The advent of LLMs, exemplified by the publicly available ChatGPT, has been a game-changer for information retrieval and contextual question answering. As LLMs evolve, they are no longer limited to text; they are becoming multi-modal, capable of interpreting charts and images. With a large number of offerings available, it is easy to develop Proofs of Concept (PoCs) and pilot applications. To derive meaningful value, however, these PoCs and pilots need to be productionized and delivered at significant scale.

    PoCs and pilots deal with only the tip of the iceberg; productionizing needs to address a lot more that does not readily meet the eye. To scale the extraction and indexing of information, a pipeline needs to be established. Ideally it would be event-driven: as new documents are generated and become available, possibly through an S3 document store and SQS (Simple Queue Service), the pipeline parses them for metadata, chunks the text, creates vector embeddings, and persists the metadata and embeddings to suitable stores. Logging, exception handling, notifications and automated retries are also needed for when the pipeline encounters issues.
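    As a concrete illustration, here is a minimal Python sketch of such an event-driven pipeline, assuming S3 event notifications are delivered to an SQS queue. The queue URL is a placeholder, and embed() and persist() stand in for whichever embedding model and metadata/vector stores are chosen; failed messages are simply left on the queue so SQS can redeliver them.

```python
import json
import logging

import boto3  # AWS SDK; the queue below is an illustrative placeholder

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

sqs = boto3.client("sqs")
s3 = boto3.client("s3")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/new-documents"  # hypothetical

def chunk(text: str, size: int = 1000, overlap: int = 100) -> list[str]:
    """Split text into overlapping chunks ready for embedding."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

def embed(chunks: list[str]) -> list[list[float]]:
    """Placeholder: call the chosen embedding model here."""
    raise NotImplementedError

def persist(doc_key: str, chunks: list[str], vectors: list[list[float]]) -> None:
    """Placeholder: write metadata and vectors to the chosen persistence stores."""
    raise NotImplementedError

def poll_once() -> None:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=5, WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        try:
            event = json.loads(msg["Body"])                 # S3 event notification payload
            record = event["Records"][0]["s3"]
            bucket, key = record["bucket"]["name"], record["object"]["key"]
            text = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            pieces = chunk(text)
            persist(key, pieces, embed(pieces))
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
            log.info("Indexed %s (%d chunks)", key, len(pieces))
        except Exception:
            # The message is not deleted, so SQS redelivers it after the visibility
            # timeout; a dead-letter queue would catch repeated failures.
            log.exception("Failed to process message; leaving it for retry")

if __name__ == "__main__":
    while True:
        poll_once()
```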

    While developing pilot applications using Generative AI is easy, teams need to carefully work through a number of additional considerations to take these applications to production, scale the volume of documents and the user base, and deliver full value. Scaling is easier when done across multiple RAG (Retrieval-Augmented Generation) applications, using conventional NLP (Natural Language Processing) and classification techniques to direct each user request to the RAG pipeline best suited to the query. Implementing the capabilities required to productionize Generative AI applications using LLMs in a phased manner will ensure that value can be scaled as the overall solution architecture and infrastructure are enhanced.
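    The routing idea can be sketched in a few lines of Python. The pipelines and keyword rules below are purely illustrative; in practice each pipeline would wrap its own retriever, prompt and LLM, and the classifier would typically be a trained NLP model rather than keyword matching.

```python
from typing import Callable

# Hypothetical RAG pipelines; in practice each wraps its own retriever, prompt and LLM.
def policy_rag(question: str) -> str:
    return f"[policy pipeline] answer to: {question}"

def financials_rag(question: str) -> str:
    return f"[financials pipeline] answer to: {question}"

def general_rag(question: str) -> str:
    return f"[general pipeline] answer to: {question}"

ROUTES: dict[str, Callable[[str], str]] = {
    "policy": policy_rag,
    "financials": financials_rag,
    "general": general_rag,
}

def classify(question: str) -> str:
    """Toy intent classifier: keyword rules stand in for a trained NLP/classification model."""
    q = question.lower()
    if any(w in q for w in ("policy", "procedure", "compliance")):
        return "policy"
    if any(w in q for w in ("balance sheet", "cash flow", "income statement")):
        return "financials"
    return "general"

def answer(question: str) -> str:
    # Route the request to the pipeline selected by the classifier.
    return ROUTES[classify(question)](question)

print(answer("Which procedure governs exceptions to the underwriting policy?"))
```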

    Read our perspective paper for more insights on Productionizing Generative AI Pilots.





      Leveraging Generative AI for asset tokenization

      Enhance efficiency, security, and user experience by leveraging Generative AI in DLT-based asset tokenization.




        Asset tokenization is the process of converting ownership rights of an asset that has traditionally resided within legacy or traditional systems into a digital token on a Distributed Ledger Technology (DLT) platform. This transformation enables numerous benefits, including fractional ownership, 24/7 availability, easier transferability, and enhanced liquidity.

        Developing and deploying a comprehensive asset tokenization system on DLT is a full-scale software development endeavor encompassing all SDLC phases. Every phase presents challenges, including technology complexities, evolving business use cases, non-standardization, scarcity of resources, and reluctance to adopt.

        As asset tokenization emerges as an essential solution for financial institutions, the integration of Gen AI amplifies customer value. Institutions can achieve unprecedented efficiency, accuracy, and innovation by leveraging Gen AI's capabilities throughout the asset tokenization process.

        Gen AI is set to play a pivotal role in improving asset tokenization by contributing to the different phases of its implementation. Gen AI can assist both during the implementation phase and beforehand, as it can help produce synthetic financial data that closely resembles real market conditions and can be used to conduct stress tests and other simulations, helping to strengthen the platform.

        Read our Perspective Paper for more insights into the key phases, benefits and the road map to asset tokenization.





          NC Tech Association Leadership Summit 2024

          Join Michel Abranches, Senior Client Partner, at the NC Tech Association’s annual Summit to learn the latest on tech transformation, building resilient tech teams, and adaptive leadership.

          Iris Software will participate in the exclusive, attendance-capped annual Leadership Summit hosted by the NC Tech Association, alongside its board of directors and advisors, on August 7 & 8, 2024. Our representative, Senior Client Partner Michel Abranches, will be among the executives gathering for the Summit at the Pinehurst Resort in Pinehurst, NC, to network and discuss a variety of topics relevant to tech leaders and the projects and associates they manage.

          The theme of this year’s summit is Adaptive Leadership. The event includes keynote addresses, executive workshops, and two panel discussions on ‘Why digital transformation is more about people than technology’ and ‘Building resilient tech teams: the power of emotional intelligence.’ 

          As a technology provider to Fortune 500 and other leading global enterprises for more than 30 years, Iris is a trusted choice for leaders who want to realize the full potential of digital transformation. We deliver complex, mission-critical software engineering, application development, and advanced tech solutions that enhance business competitiveness and achieve key outcomes. Our agile, collaborative, right-sized teams and high-trust, high-performance, award-winning culture ensure clients enjoy top value and experience.  

          Contact Michel Abranches, based in our Charlotte, NC office, or visit www.irissoftware.com for details and success stories about our innovative approach and how we are leveraging the latest in AI / Gen AI / ML, Automation, Cloud, DevOps, Data Science, Enterprise Analytics, Integrations, and Quality Engineering.

          Asset tokenization transforming global finance

          Real-world asset tokenization can transform financial markets

          Integration with Distributed Ledger Technologies is critical to realizing the full potential of tokenization.




            The global financial markets create and deal in multiple asset classes, including equities, bonds, forex, derivatives, and real estate investments, each of which constitutes a multi-trillion-dollar market. These traditional markets encounter numerous challenges in terms of time and cost, which impede accessibility, fund liquidity, and operational efficiency. Consequently, the expected free flow of capital is hindered, leading to fragmented, and occasionally limited, inclusion of investors.

            In response to these challenges, today's financial services industry seeks to explore innovative avenues, leveraging advancements such as Distributed Ledger Technology (DLT). Using DLTs, it is feasible to tokenize assets, thus enabling issuance, trading, servicing and settlement digitally, not just in whole units, but also in fractions.

            Asset tokenization is the process of converting and portraying the unique properties of a real-world asset, including ownership and rights, on a Distributed Ledger Technology (DLT) platform. Digital and physical real-world assets, such as real estate, stocks, bonds, and commodities, are depicted by tokens with distinctive symbols and cryptographic features. These tokens exhibit specific behavior as part of an executable program on a blockchain.

            Many domains, especially financial institutions, have started recognizing the benefits of tokenization and begun to explore this technology. Some of the benefits are fractional ownership, increased liquidity, efficient transfer of ownership, ownership representation and programmability.

            With the recent surge in the adoption of tokenization, a diverse array of platforms has emerged, paving the way for broader success, but at the same time creating fragmented islands of ledgers and related assets. As capabilities mature and adoption grows, interconnectivity and interoperability across ledgers representing different institutions issuing/servicing different assets could improve, creating a better integrated market landscape. This would be critical to realizing the promise of asset tokenization using DLT.

            Read our Perspective Paper for more insights on asset tokenization and its potential to overcome the challenges, the underlying technology, successful use cases, and issues associated with implementation.





              How Low-code Empowers Mission-critical End Users

              Low-code platforms enable rapid conversions to technology-managed applications that provide end users with rich interfaces, powerful configurations, easy integrations, and enhanced controls.





                Many large and small enterprises utilize business-managed applications (BMAs) in their value chain to supplement technology-managed applications (TMAs). BMAs are applications or software that end users create or procure off-the-shelf and implement on their own; these typically are low-code or no-code software applications. Such BMAs offer the ability to automate or augment team-specific processes or information to enable enterprise-critical decision-making.

                Technology teams build and manage TMAs to do much of the heavy lifting, enabling business unit workflows and transactions and automating manual processes. TMAs are often the source systems for analytics and intelligence engines that draw on data warehouses, marts, lakes, lakehouses, etc. BMAs dominate the last mile in how these data infrastructures support critical reporting and decision-making.

                While BMAs deliver value and simplify complex processes, they bring with them a large set of challenges in security, opacity, controls, collaboration, traceability and audit. Therefore, on an ongoing basis, business-critical BMAs that have become relatively mature in their capabilities must be industrialized with optimal time and investment. Low-code platforms provide the right blend of ease of development, flexibility and governance, enabling the rapid conversion of BMAs to TMAs with predictable timelines and low-cost, high-quality output.

                Read our Perspective Paper for more insights on using low-code platforms to convert BMAs to TMAs that provide end users with rich interfaces, powerful configurations, easy integrations, and enhanced controls.





                  Anti-Money Laundering & Know-Your-Customer

                  Conversational assistant boosts AML product assurance

                  Gen AI-powered responses improve the turnaround time to provide technical support for recurring issues, resulting in a highly efficient product assurance process.

                  Client
                  A large global bank
                  Goal
                  Improve turnaround time to provide technical support for the application support and global product assurance teams
                  Tools and Technologies
                  React, Sentence–Bidirectional Encoder Representations from Transformers (S-BERT), Facebook AI Similarity Search (FAISS), and Llama-2-7B-chat
                  Business Challenge

                  The application support and global product assurance teams of a large global bank faced numerous challenges in delivering efficient and timely technical support, as they had to manually identify solutions to recurring problems within the Known Error Database (KEDB), which comprised documents in various formats. With a high volume of support requests and limited availability of teams across multiple time zones, a large backlog of unresolved issues developed, leading to higher support costs.

                  Solution

                  Our team developed a conversational assistant using Gen AI by:

                  • Building an interactive customized React-based front-end
                  • Ringfencing a corpus of problems and solutions documented in the KEDB
                  • Parsing, formatting and extracting text chunks from source documents and creating vector embeddings using Sentence–Bidirectional Encoder Representations from Transformers (S-BERT)
                  • Storing these in a Facebook AI Similarity Search (FAISS) vector database
                  • Leveraging a local Large Language Model (Llama-2-7B-chat) to generate summarized responses (a simplified sketch of this retrieval flow follows the list)
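                  As an illustration of the retrieval side of this assistant, the Python sketch below embeds a couple of invented KEDB entries with an S-BERT model, indexes them in FAISS, and retrieves the closest entries for a question. The llm_summarize() placeholder stands in for the local Llama-2-7B-chat call, and the model name and sample entries are assumptions rather than details of the actual solution.

```python
import faiss                                             # Facebook AI Similarity Search
from sentence_transformers import SentenceTransformer   # S-BERT embeddings

# Invented KEDB entries: each is a known error plus its documented fix.
kedb_chunks = [
    "ERR-1043: batch feed stalls when the upstream file is empty; re-trigger the loader job.",
    "ERR-2210: dashboard totals mismatch after month-end; rebuild the aggregation cube.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")        # illustrative S-BERT model
vectors = encoder.encode(kedb_chunks, convert_to_numpy=True).astype("float32")

index = faiss.IndexFlatL2(vectors.shape[1])              # exact L2 nearest-neighbour search
index.add(vectors)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k KEDB entries closest to the user's question."""
    q = encoder.encode([question], convert_to_numpy=True).astype("float32")
    _, idx = index.search(q, k)
    return [kedb_chunks[i] for i in idx[0]]

def llm_summarize(question: str, context: list[str]) -> str:
    """Placeholder for the local Llama-2-7B-chat call that writes the final answer."""
    prompt = "Answer using only this context:\n" + "\n".join(context) + f"\n\nQ: {question}\nA:"
    raise NotImplementedError(prompt)

print(retrieve("Why do the dashboard totals not match after month-end?"))
```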
                  Outcomes

                  The responses generated using the Llama-2-7B LLM were impressive and significantly reduced overall effort. Future enhancements to the assistant would involve:

                  • Creating support tickets based on information collected from users
                  • Categorizing tickets based on the nature of the problem
                  • Automating repetitive tasks such as access requests, data volume enquiries and dashboard updates
                  • Auto-triaging support requests by asking users a series of questions to determine the severity and urgency of the problem

                  Insurance

                  AI-powered summarization boosts compliance workflow

                  Gen AI-enabled conversational assistant substantially simplifies access to underwriting policies and procedures across multiple, complex documents.

                  Client
                  A leading specialty property and casualty insurer
                  Goal
                  Improve underwriters’ ability to review policy submissions by providing easier access to information stored across multiple, voluminous documents.
                  Tools and Technologies
                  Azure OpenAI Service, React, Azure Cognitive Services, Llama-2-7B-chat, OpenAI GPT 3.5-Turbo, text-embedding-ada-002 and all-MiniLM-L6-v2
                  Business Challenge

                  The underwriters working with a leading specialty property and casualty insurer have to refer to multiple documents and handbooks, each running to several hundred pages, to understand the policies and procedures that are key to the underwriting process. Significant effort was required to continually refer to these documents for each policy submission.

                  Solution

                  A Gen-AI enabled conversational assistant for summarizing information was developed by:

                  • Building a React-based customized interactive front end
                  • Ringfencing a knowledge corpus of specific documents (e.g., an insurance handbook and loss adjustment and business indicator manuals)
                  • Leveraging OpenAI embeddings and LLMs through Azure OpenAI Service along with Azure Cognitive Services for search and summarization with citations
                  • Developing a similar interface in the Iris-Azure environment with a local LLM (Llama-2-7B-chat) and embedding model (all-MiniLM-L6-v2) to compare responses (a simplified sketch of the Azure-based flow follows this list)
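                  A minimal sketch of the Azure-based retrieval-and-summarization flow is shown below. The index name, field names, deployment name and environment variables are assumptions for illustration; the actual solution also returned citations alongside the summarized answer.

```python
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient   # Azure Cognitive Search
from openai import AzureOpenAI                     # Azure OpenAI Service

# Endpoint, key, index, deployment and field names below are illustrative placeholders.
search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="underwriting-handbook",
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)
llm = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)

def answer(question: str, k: int = 4) -> str:
    # 1. Retrieve the most relevant handbook passages, keeping source names for citations.
    hits = list(search.search(search_text=question, top=k))
    context = "\n\n".join(f"[{h['source']}] {h['content']}" for h in hits)  # assumed index fields

    # 2. Ask the GPT-3.5 Turbo deployment to answer only from those passages, citing sources.
    response = llm.chat.completions.create(
        model="gpt-35-turbo",  # assumed Azure deployment name
        messages=[
            {"role": "system",
             "content": "Answer strictly from the provided context and cite sources in [brackets]."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("What loss adjustment documentation is required for a commercial property claim?"))
```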
                  Outcomes

                  Underwriters significantly streamlined the activities needed to ensure that policy constructs align with applicable policies and procedures and to identify potential compliance issues in complex cases.

                  The linguistic search and summarization capabilities of the OpenAI GPT-3.5 Turbo LLM (170 bn parameters) were found to be impressive. Notably, the local LLM (Llama-2-7B-chat), with far fewer parameters (7 bn), also produced acceptable results for this use case.

                  Commercial & Corporate Banking

                  Automated financial analysis reduces manual effort

                  Analysts in a large North American bank's commercial lending and credit risk operations can intelligently source information across multiple documents.

                  Client
                  Commercial lending and credit risk units of a large North American bank
                  Goal
                  Automated retrieval of information from multiple financial statements enabling data-driven insights and decision-making
                  Tools and Technologies
                  OpenAI API (GPT-3.5 Turbo), LlamaIndex, LangChain, PDF Reader
                  Business Challenge

                  A leading North American bank had large commercial lending and credit risk units. Analysts in those units typically refer to numerous sections of a financial statement, including balance sheets, cash flows, and income statements, supplemented by footnotes and leadership commentaries, to extract decision-making insights. Switching between multiple pages of different documents was labor-intensive and made the analysis more difficult.

                  Solution

                  Many of these tasks were automated using Gen AI tools through the following steps:

                  • Ingest multiple URLs of financial statements
                  • Convert these to text using the PDF Reader library
                  • Build vector indices using LlamaIndex
                  • Create text segments and corresponding vector embeddings using OpenAI’s API for storage in a multimodal vector database (e.g., Deep Lake)
                  • Compose graphs of keyword indices for vector stores to combine data across documents
                  • Break down complex queries into multiple searchable parts using LlamaIndex’s DecomposeQueryTransform library (a simplified sketch of the indexing and querying flow follows this list)
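                  The sketch below illustrates the indexing and querying flow with LlamaIndex, building one vector index per statement and querying each in turn. It is a simplified stand-in for the graph composition and query decomposition used in the actual solution: the file paths are hypothetical, import paths may differ across llama-index versions, and the OpenAI embeddings and LLM are picked up from OPENAI_API_KEY by default.

```python
# Import paths follow recent llama-index releases and may differ in older versions.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Hypothetical local copies of the downloaded financial statements (PDFs).
statements = {
    "10-K": "data/company_10k.pdf",
    "Q2 report": "data/company_q2.pdf",
}

# One vector index and query engine per document; the reader extracts text from the PDFs.
engines = {}
for name, path in statements.items():
    docs = SimpleDirectoryReader(input_files=[path]).load_data()
    engines[name] = VectorStoreIndex.from_documents(docs).as_query_engine()

def analyze(question: str) -> dict[str, str]:
    """Ask each statement the same question and return per-document answers for comparison
    (a naive stand-in for composing keyword-index graphs and decomposing complex queries)."""
    return {name: str(engine.query(question)) for name, engine in engines.items()}

for doc, ans in analyze("How did operating cash flow change year over year?").items():
    print(f"{doc}: {ans}")
```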
                  Outcomes

                  The solution delivered impressive results in financial analysis, notably reducing manual effort when multiple documents were involved. Since the approach is still largely linguistic in nature, considerable prompt engineering may be required to generate accurate responses. Because Large Language Models (LLMs) lack deeper semantic awareness, responses can be limited, which may require including qualifying information in queries.

                  The State of Central Bank Digital Currency

                  Innovations in digital currencies could redefine the concept of money and transform payments and banking systems.





                    Central banking institutions have emerged as key players in the world of banking and money. They play a pivotal role in shaping economic and monetary policies, maintaining financial system stability, and overseeing currency issuance. A manifestation of the evolving interplay between central banks, money, and the forces that shape financial systems is the advent of Central Bank Digital Currency (CBDC). Many drivers have led central banks to explore CBDC: declining cash payments, the rise of digital payments and alternative currencies, and disruptive forces in the form of fin-tech innovations that continually reshape the payment landscape.

                    Central banks are receptive towards recent technological advances and well-suited to the digital currency experiment, leveraging their inherent role of upholding the well-being of the monetary framework to innovate and facilitate a trustworthy and efficient monetary system.

                    As of 2023, 130 countries, representing 98% of global GDP, were known to be exploring a CBDC solution. Sixty-four of them were in an advanced phase of exploration (development, pilot, or launch), focused on lower costs for consumers and merchants, offline payments, robust security, and a higher level of privacy and transparency. Over 70% of these countries were evaluating distributed ledger technology (DLT)-based solutions.

                    While still at a very nascent stage in terms of overall adoption for CBDC, the future of currency promises to be increasingly digital, supported by various innovations and maturation. CBDC has the potential to bring about a paradigm shift, particularly in the financial industry, redefining the way in which money, as we know it, exchanges hands.

                    Read our perspective paper to learn more about CBDCs – the rationale for their existence, the factors driving their implementation, potential ramifications for the financial landscape, and challenges associated with their adoption.




