Banking & Financial Services

Quality engineering optimizes a DLT platform

Reliability, availability, scalability, observability, and resilience were ensured, while release cycles shortened by 75% and testing time by 80%.

Client
A leading provider of financial services digitization solutions
Goal
Reliability assurance for a distributed ledger technology (DLT) platform
Tools and Technologies
Kotlin, Java, HTTP Client, AWS, Azure, GCP, G42, OCP, AKS, EKS, Docker, Kubernetes, Helm Charts, Terraform
Business Challenge

A leader in Blockchain-based digital financial services required assurance for the non-GUI (Graphical User Interface), Command Line Interface (CLI), microservices, and Representational State Transfer (REST) API components of a Distributed Ledger Technology (DLT) platform, as well as platform reliability assurance on Azure and AWS services (AKS, EKS) covering availability, scalability, observability, monitoring, and resilience (disaster recovery). It also wanted to identify capacity recommendations and any performance bottlenecks, whether impacting throughput or individual transaction latency. Finally, it required comprehensive automation coverage for older and newer product versions and management of frequent deliveries of multiple DLT product versions each month.

Solution
  • 130+ DApps were developed to support automation, and the terminal CLI and cluster utilities in the existing automation framework were enhanced
  • Quality engineering was streamlined with real-time dashboarding via Grafana and Prometheus
  • Coverage for older and newer versions of the DLT platform was automated, enabling smooth, frequent deliveries and confidence in releases
  • The test case management tool, Xray, was implemented for transparent automation coverage
  • Utilities were developed to execute a testing suite for AKS, EKS, and local Mac/Windows/Linux cluster environments on a daily or as-needed basis (a sketch of such a runner follows this list)
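To make the last item concrete, here is a minimal sketch of what an environment-parameterized suite runner could look like; the environment names, kubeconfig contexts, and pytest invocation are illustrative assumptions, not the client's actual utility.

import argparse
import os
import subprocess
import sys

# Illustrative target environments; each maps to a kubeconfig context.
ENVIRONMENTS = {
    "eks": "aws-eks-test",
    "aks": "azure-aks-test",
    "local": "docker-desktop",  # local Mac/Windows/Linux clusters
}

def run_suite(env: str) -> int:
    """Run the test suite against the chosen cluster environment."""
    # Tests read KUBE_CONTEXT to decide which cluster to exercise.
    child_env = dict(os.environ, KUBE_CONTEXT=ENVIRONMENTS[env])
    return subprocess.call(
        [sys.executable, "-m", "pytest", "tests/", "--maxfail=5"],
        env=child_env,
    )

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Run the platform test suite")
    parser.add_argument("env", choices=ENVIRONMENTS)  # eks | aks | local
    sys.exit(run_suite(parser.parse_args().env))

A daily run would invoke the same entry point from a scheduler or CI pipeline; an on-demand run would invoke it manually with the desired environment.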
Outcomes
  • Automation shortened release cycles from 1x/month to 1x/week, and testing time for leads was reduced by 80%
  • Test automation coverage of 2,000 test cases was developed, with a 96% pass rate in daily runs
  • Compatibility was created across AWS-EKS, Azure-AKS, Mac, Windows, Linux, and local clusters
  • Increased efficiency in deliverables was demonstrated, along with annual savings of $350K in test case management
  • An average throughput of 25 complete workflows per second was sustained
  • A 95th percentile flow completion time of under 10 seconds was achieved

Productionizing Generative AI Pilots

Get scalable solutions and unlock insights from information siloed across an enterprise by automating data extraction, streamlining workflows, and leveraging large language models.




Enterprises have vast amounts of unstructured information, such as onboarding documents, contracts, financial statements, customer interaction records, Confluence pages, etc., with valuable information siloed across formats and systems.

Generative AI is now starting to unlock new capabilities, with vector databases and Large Language Models (LLMs) tapping into unstructured information using natural language, enabling faster insight generation and decision-making. The advent of LLMs, exemplified by the publicly available ChatGPT, has been a game-changer for information retrieval and contextual question answering. As LLMs evolve, they're not just limited to text; they're becoming multi-modal, capable of interpreting charts and images. With a large number of offerings, it is very easy to develop Proofs of Concept (PoCs) and pilot applications. However, to derive meaningful value, the PoCs and pilots need to be productionized and delivered at significant scale.

PoCs/pilots deal with only the tip of the iceberg; productionizing must address a lot more that does not readily meet the eye. To scale the extraction and indexing of information, we need to establish a pipeline that, ideally, is event-driven: as new documents are generated and become available, possibly through an S3 document store and SQS (Simple Queue Service), the pipeline initiates parsing of documents for metadata, chunking, creation of vector embeddings, and persistence of the metadata and embeddings to suitable stores. The pipeline also needs logging, exception handling, notification, and automated retries when it encounters issues.
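As a rough sketch of such an event-driven pipeline, the skeleton below consumes S3 "object created" events from an SQS queue; the queue URL is hypothetical, and the parse/chunk/embed/persist helpers are placeholders for your parser, chunker, embedding model, and persistence stores.

import json
import logging

import boto3  # assumes AWS credentials are configured in the environment

logger = logging.getLogger("ingest")
s3 = boto3.client("s3")
sqs = boto3.client("sqs")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/new-docs"  # hypothetical

def parse_metadata(key: str, raw: bytes) -> dict:
    # Placeholder: a real parser would extract titles, dates, entities, etc.
    return {"source_key": key, "size_bytes": len(raw)}

def chunk(raw: bytes, size: int = 2000) -> list:
    # Placeholder: naive fixed-size chunking; production systems typically
    # use structure-aware splitters.
    text = raw.decode("utf-8", errors="ignore")
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(piece: str) -> list:
    raise NotImplementedError("call your embedding model here")

def persist(meta: dict, index: int, piece: str, vector: list) -> None:
    raise NotImplementedError("write to your metadata and vector stores here")

def handle_message(msg: dict) -> None:
    """Process one S3 'object created' event: fetch, parse, chunk, embed, persist."""
    for record in json.loads(msg["Body"]).get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        meta = parse_metadata(key, raw)
        for i, piece in enumerate(chunk(raw)):
            persist(meta, i, piece, embed(piece))

def run() -> None:
    """Long-poll the queue. Failed messages are logged and left
    unacknowledged, so SQS redelivers them (automated retries) and can
    route repeat failures to a dead-letter queue for notification."""
    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10,
                                   WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            try:
                handle_message(msg)
                sqs.delete_message(QueueUrl=QUEUE_URL,
                                   ReceiptHandle=msg["ReceiptHandle"])
            except Exception:
                logger.exception("ingest failed; message will be retried")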

While developing pilot applications using Generative AI is easy, teams need to carefully work through a number of additional considerations to take these applications to production, scale the volume of documents and the user base, and deliver full value. It is easier to do this across multiple RAG (Retrieval-Augmented Generation) applications by using conventional NLP (Natural Language Processing) and classification techniques to direct user requests to the appropriate RAG pipeline for each type of query. Implementing the capabilities required to productionize Generative AI applications using LLMs in a phased manner ensures that value can scale as the overall solution architecture and infrastructure are enhanced.
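A minimal sketch of that routing idea, with a toy keyword classifier standing in for a trained intent model; the pipeline names, keyword sets, and RAG functions are all hypothetical.

from typing import Callable, Dict

# Hypothetical RAG pipelines; each would wrap its own retriever, prompt,
# and LLM call.
def contracts_rag(query: str) -> str:
    raise NotImplementedError

def financials_rag(query: str) -> str:
    raise NotImplementedError

def general_rag(query: str) -> str:
    raise NotImplementedError

PIPELINES: Dict[str, Callable[[str], str]] = {
    "contracts": contracts_rag,
    "financials": financials_rag,
    "general": general_rag,
}

# Toy keyword sets; a production router would use a trained classifier or
# richer NLP features.
KEYWORDS = {
    "contracts": {"clause", "agreement", "term", "party", "obligation"},
    "financials": {"revenue", "balance", "statement", "quarter", "earnings"},
}

def classify(query: str) -> str:
    """Pick the pipeline whose keyword set best overlaps the query."""
    words = set(query.lower().split())
    best = max(KEYWORDS, key=lambda k: len(words & KEYWORDS[k]))
    return best if words & KEYWORDS[best] else "general"

def answer(query: str) -> str:
    return PIPELINES[classify(query)](query)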

    Read our perspective paper for more insights on Productionizing Generative AI Pilots.





      How Gen AI Can Transform Software Engineering

Unlocking efficiency across the software development lifecycle, enabling faster delivery and higher-quality outputs.




        Generative AI has enormous potential for business use cases, and its application to software engineering is equally promising.

        In our experience, development activities, including automated test and deployment scripts, account for only 30-50% of the time and effort spent across the software engineering lifecycle. Within that, only a fraction of the time and effort is spent in actual coding. Hence, to realize the true promise of Generative AI in software engineering, we need to look across the entire lifecycle.

A typical software engineering lifecycle involves a number of different personas (Product Owner, Business Analyst, Architect, Quality Assurance/Tech Leads, Developer, Quality/DevSecOps/Platform Engineers), each using their own tools and producing a distinct set of artifacts. Integrating these different tools through a combination of Gen AI software engineering extensions and services will help streamline the flow of artifacts through the lifecycle, formalize the hand-off reviews, enable automated derivation of initial versions of related artifacts, and more.

        As an art-of-the-possible exercise, we developed extensions (for VS Code IDE and Chrome Browser at this time) incorporating the above considerations. Our early experimentation suggests that Generative AI has the potential to enable more complete and consistent artifacts. This results in higher quality, productivity and agility, reducing churn and cycle time, across parts of the software engineering lifecycle that AI coding assistants do not currently address.

Complementary approaches that automate repetitive activities through smart templating, leveraging Generative AI alongside traditional artifact generation and completion techniques, can help save time, let the team focus on higher-value activities, and improve overall satisfaction. However, there are key considerations in doing this at scale across many teams and team members. To enable teams to become high-performing, the Gen AI software engineering extensions and services need to provide capabilities for standardizing and templatizing common solution patterns (archetypes) and for formalizing the definition and automation of doneness criteria for each artifact type.
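To illustrate the last point, here is a minimal sketch of how per-artifact doneness criteria might be formalized so hand-off reviews can be automated; the artifact types and checks are hypothetical examples.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class DonenessCheck:
    name: str
    passed: Callable[[str], bool]  # inspects the artifact's text

# Hypothetical criteria; real checks could combine rule-based tests with
# LLM-assisted reviews.
CHECKS: Dict[str, List[DonenessCheck]] = {
    "user_story": [
        DonenessCheck("has acceptance criteria",
                      lambda text: "acceptance criteria" in text.lower()),
        DonenessCheck("mentions non-functional requirements",
                      lambda text: "non-functional" in text.lower()),
    ],
    "api_spec": [
        DonenessCheck("documents error responses",
                      lambda text: "400" in text or "500" in text),
    ],
}

def doneness_report(artifact_type: str, text: str) -> Dict[str, bool]:
    """Run every registered check for the artifact type; a failing report
    can block the hand-off to the next persona in the lifecycle."""
    return {c.name: c.passed(text) for c in CHECKS.get(artifact_type, [])}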

        Read our perspective paper for more insights on How Gen AI Can Transform Software Engineering through streamlined processes, automated tasks, and augmented collaboration, bringing faster, higher-quality software delivery.






          NC Tech Association Leadership Summit 2024


          Join Michel Abranches, Senior Client Partner, at the NC Tech Association’s annual Summit to learn the latest on tech transformation, building resilient tech teams, and adaptive leadership.

Iris Software will participate in the exclusive, attendance-capped annual Leadership Summit hosted by the NC Tech Association, along with its board of directors and advisors, on August 7 & 8, 2024. Our representative, Senior Client Partner Michel Abranches, will be among the executives gathering at the Pinehurst Resort in Pinehurst, NC, to network and discuss a variety of topics relevant to tech leaders and the projects and associates they manage.

          The theme of this year’s summit is Adaptive Leadership. The event includes keynote addresses, executive workshops, and two panel discussions on ‘Why digital transformation is more about people than technology’ and ‘Building resilient tech teams: the power of emotional intelligence.’ 

          As a technology provider to Fortune 500 and other leading global enterprises for more than 30 years, Iris is a trusted choice for leaders who want to realize the full potential of digital transformation. We deliver complex, mission-critical software engineering, application development, and advanced tech solutions that enhance business competitiveness and achieve key outcomes. Our agile, collaborative, right-sized teams and high-trust, high-performance, award-winning culture ensure clients enjoy top value and experience.  

          Contact Michel Abranches, based in our Charlotte, NC office, or visit www.irissoftware.com for details and success stories about our innovative approach and how we are leveraging the latest in AI / Gen AI / ML, Automation, Cloud, DevOps, Data Science, Enterprise Analytics, Integrations, and Quality Engineering.

          Real world asset tokenization can transform financial markets

          Integration with Distributed Ledger Technologies is critical to realizing the full potential of tokenization.




The global financial markets create and deal in multiple asset classes, including equities, bonds, forex, derivatives, and real estate investments, each constituting a multi-trillion-dollar market. These traditional markets encounter numerous challenges in terms of time and cost, which impede accessibility, fund liquidity, and operational efficiency. Consequently, the expected free flow of capital is hindered, leading to fragmented, and occasionally limited, inclusion of investors.

            In response to these challenges, today's financial services industry seeks to explore innovative avenues, leveraging advancements such as Distributed Ledger Technology (DLT). Using DLTs, it is feasible to tokenize assets, thus enabling issuance, trading, servicing and settlement digitally, not just in whole units, but also in fractions.

            Asset tokenization is the process of converting and portraying the unique properties of a real-world asset, including ownership and rights, on a Distributed Ledger Technology (DLT) platform. Digital and physical real-world assets, such as real estate, stocks, bonds, and commodities, are depicted by tokens with distinctive symbols and cryptographic features. These tokens exhibit specific behavior as part of an executable program on a blockchain.

Many sectors, especially financial institutions, have started recognizing the benefits of tokenization and begun to explore this technology. Among these benefits are fractional ownership, increased liquidity, efficient transfer of ownership, ownership representation, and programmability.
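As a toy illustration only, and not any specific platform's API, a fractional-ownership token could be modeled as below; on an actual DLT, this logic would live in a smart contract executing on the ledger.

from dataclasses import dataclass, field
from typing import Dict

@dataclass
class AssetToken:
    """Toy model of a tokenized real-world asset with fractional ownership."""
    symbol: str
    total_units: int
    holdings: Dict[str, int] = field(default_factory=dict)

    def issue(self, issuer: str) -> None:
        # All fractional units start with the issuer.
        self.holdings = {issuer: self.total_units}

    def transfer(self, sender: str, receiver: str, units: int) -> None:
        # Programmability: the token enforces its own transfer rules.
        if self.holdings.get(sender, 0) < units:
            raise ValueError("insufficient holdings")
        self.holdings[sender] -= units
        self.holdings[receiver] = self.holdings.get(receiver, 0) + units

# A property tokenized into 10,000 fractional units:
token = AssetToken(symbol="PROP-123", total_units=10_000)
token.issue("issuer")
token.transfer("issuer", "investor-a", 250)  # investor A acquires 2.5%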

            With the recent surge in the adoption of tokenization, a diverse array of platforms has emerged, paving the way for broader success, but at the same time creating fragmented islands of ledgers and related assets. As capabilities mature and adoption grows, interconnectivity and interoperability across ledgers representing different institutions issuing/servicing different assets could improve, creating a better integrated market landscape. This would be critical to realizing the promise of asset tokenization using DLT.

            Read our Perspective Paper for more insights on asset tokenization and its potential to overcome the challenges, the underlying technology, successful use cases, and issues associated with implementation.





              How Low-code Empowers Mission-critical End Users

              Low-code platforms enable rapid conversions to technology-managed applications that provide end users with rich interfaces, powerful configurations, easy integrations, and enhanced controls.





                Many large and small enterprises utilize business-managed applications (BMAs) in their value chain to supplement technology-managed applications (TMAs). BMAs are applications or software that end users create or procure off-the-shelf and implement on their own; these typically are low-code or no-code software applications. Such BMAs offer the ability to automate or augment team-specific processes or information to enable enterprise-critical decision-making.

Technology teams build and manage TMAs to do much of the heavy lifting: enabling business unit workflows and transactions and automating manual processes. TMAs are often the source systems for the analytics and intelligence engines that draw on data warehouses, marts, lakes, lakehouses, etc. BMAs dominate the last mile in how these data infrastructures support critical reporting and decision-making.

While BMAs deliver value and simplify complex processes, they bring with them a large set of challenges in security, opacity, controls, collaboration, traceability, and audit. Therefore, on an ongoing basis, business-critical BMAs that have become relatively mature in their capabilities must be industrialized with optimal time and investment. Low-code platforms provide the right blend of ease of development, flexibility, and governance, enabling the rapid conversion of BMAs to TMAs with predictable timelines and low-cost, high-quality output.

                Read our Perspective Paper for more insights on using low-code platforms to convert BMAs to TMAs that provide end users with rich interfaces, powerful configurations, easy integrations, and enhanced controls.





                  The State of Central Bank Digital Currency

                  Innovations in digital currencies could redefine the concept of money and transform payments and banking systems.





                    Central banking institutions have emerged as key players in the world of banking and money. They play a pivotal role in shaping economic and monetary policies, maintaining financial system stability, and overseeing currency issuance. A manifestation of the evolving interplay between central banks, money, and the forces that shape financial systems is the advent of Central Bank Digital Currency (CBDC). Many drivers have led central banks to explore CBDC: declining cash payments, the rise of digital payments and alternative currencies, and disruptive forces in the form of fin-tech innovations that continually reshape the payment landscape.

                    Central banks are receptive towards recent technological advances and well-suited to the digital currency experiment, leveraging their inherent role of upholding the well-being of the monetary framework to innovate and facilitate a trustworthy and efficient monetary system.

As of 2023, 130 countries, representing 98% of global GDP, were known to be exploring a CBDC solution. Sixty-four of them were in an advanced phase of exploration (development, pilot, or launch), focused on lower costs for consumers and merchants, offline payments, robust security, and a higher level of privacy and transparency. Over 70% of these countries were evaluating distributed ledger technology (DLT)-based solutions.

                    While still at a very nascent stage in terms of overall adoption for CBDC, the future of currency promises to be increasingly digital, supported by various innovations and maturation. CBDC has the potential to bring about a paradigm shift, particularly in the financial industry, redefining the way in which money, as we know it, exchanges hands.

                    Read our perspective paper to learn more about CBDCs – the rationale for their existence, the factors driving their implementation, potential ramifications for the financial landscape, and challenges associated with their adoption.





                      Banking & Financial Services

                      Next-gen Quality Engineering for Blockchain-DLT platform

Quality engineering implementation helps a digital financial services client smooth the legacy migration of its Blockchain-DLT (Distributed Ledger Technology) platform by advancing automation coverage and patch delivery efficiencies.

                      Client

                      A leading digital financial services company

                      Goal

Blockchain-DLT platform assurance with improved automation coverage

                      Tools and technologies

                      Amazon Elastic Kubernetes Service (EKS), Azure Kubernetes Services (AKS), Docker, Terraform, Helm Charts, Microservices, Kotlin, Xray

                      BUSINESS CHALLENGE

The client's legacy DLT platform did not support cloud capabilities with the Blockchain-DLT tech stack. The non-GUI (Graphical User Interface), CLI (Command Line Interface)-based platform lacked a microservices architecture and cluster resilience. The REST (Representational State Transfer) API-based platform did not support platform assurance validation at the backend. Automation coverage for legacy and newer versions of the products was very low. Support for delivery patches was insufficient, impacting the delivery of multiple versions of R3 products each month.

                      SOLUTION

Iris developed multiple CorDapps to support automation around DLT-platform functionalities and enhanced the CLI-based and cluster utilities in the existing R3 automation framework. The team implemented the test case management tool Xray to improve test automation coverage for legacy and newer versions of the Corda platform, enabling smooth and frequent patch deliveries every month. The quality engineering process was streamlined on the team's Kanban board by modifying the workflows. Iris also introduced the ability to execute a testing suite on a daily or as-needed basis for AKS, EKS, and local Mac/Windows/Linux cluster environments.

                      OUTCOMES

                      The Blockchain-DLT reliability assurance solution enabled the client to attain:

• Improved automation coverage of the DLT platform, with 900 test cases and a 96% pass rate in daily runs
• Compatibility across AWS-EKS, Azure-AKS, Mac, Windows, Linux, and local clusters
• Increased efficiency in deliverables, with annual savings of $35K in test case management

                      Banking

                      Test Automation Speeds Model Risk Management System

                      Automated testing for a top international bank's model risk management system brings efficiency and reliability.

                      Client

                      Top international bank

                      Goal

                      Fully automate the model risk management system framework to improve quality and confidence in testing results

                      Tools and technologies

                      Java, Selenium, Maven, TestNG, Git

                      BUSINESS CHALLENGE

The client's existing model risk framework handled functional testing aspects and risk scenarios inefficiently due to the lack of an end-to-end testing framework. Built on redundant, hard-to-debug, and non-scalable code, the system was unreliable for model risk testing. Test cases and controls were maintained and executed in Excel, eliminating the ability to run workflows in parallel, compromising testing results, increasing testing effort, and even delaying production launches in some cases. The client's prime challenges were scaling testing through automation, running data-driven, end-to-end test flows, and restoring confidence in test results.

                      SOLUTION

Iris built a lightweight, scalable new framework providing 100% automated regression testing of functional test cases. Using simplified, customizable code that separated automation utilities from test functions, Iris' solution brought multiple improvements. Among them was faster test execution due to significantly reduced manual effort. It also delivered better quality and stability through early identification of testing issues, enabling immediate corrective action. Another advantage was adaptability to multiple application areas, thanks to the maintainability and traceability of the code employed.
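The case study's framework was built with Java, Selenium, and TestNG; purely to illustrate the separation of automation utilities from test functions, here is a minimal Python/Selenium sketch with a hypothetical application URL and page element.

from selenium import webdriver
from selenium.webdriver.common.by import By

class ModelRiskPages:
    """Utility layer: locators and browser actions live here, so tests stay
    readable and maintenance is localized to one place."""
    def __init__(self, driver, base_url: str):
        self.driver = driver
        self.base_url = base_url

    def open_model(self, model_id: str) -> None:
        self.driver.get(f"{self.base_url}/models/{model_id}")  # hypothetical route

    def risk_score(self) -> float:
        return float(self.driver.find_element(By.ID, "risk-score").text)

def test_model_risk_score_is_reported():
    """Test layer: expresses the check only; no locators or browser plumbing."""
    driver = webdriver.Chrome()
    try:
        pages = ModelRiskPages(driver, "https://mrm.example.internal")  # hypothetical
        pages.open_model("M-1042")
        assert pages.risk_score() >= 0.0
    finally:
        driver.quit()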

                      OUTCOMES

                      The client experienced several positive effects from the new, fully-automated solution:

                      • Acquired a 100% stable, scalable, reusable test framework
                      • ROI of 72%; payback period of less than 8 months
                      • 20% reduction in testing efforts for faster time to market
                      • Significant decrease in time required for ongoing maintenance of test scripts

                      Manufacturing

                      Tech stack automation expedites script development by 3x

Manual processes across a multi-technology stack were severely affecting script development cycles in terms of time, effort, and cost. Iris' application-agnostic test automation framework and DevOps integration helped the client significantly reduce script development time and cost.

                      Client

                      A leading building supplies manufacturing company

                      Goal

To support a 30+ application stack for UI, E2E, API, performance, and mobile automation, along with DevOps pipeline integration

                      Tools and technologies

.NET Core, PeopleSoft, Salesforce, WMS, JavaScript, Angular, FoxPro, C#, Selenium, SpecFlow, RestSharp, NUnit, Mobile Center/Emulators, Allure, Jira, Azure Pipelines, GitHub

                      BUSINESS CHALLENGE

The client had technology stacks comprising diverse technologies that were difficult to manage. Substantial manual effort and time were spent on integrating checkpoints, elongating the development process. Validating end-to-end business flows across different applications was the prime challenge. Reporting processes were also scattered across the entire application stack, making them fragile.

                      SOLUTION

Iris developed a robust, application-agnostic test automation framework to support the client's multiple technology stacks. Following the Behavior-driven Development (BDD) approach to align acceptance criteria with stakeholders, we built business and application layers of common utilities into the core framework. Our experts identified E2E business flows to validate the downstream impact of changes and automated the entire stack through a shift-left approach. Azure DevOps integration enabled a common dashboard for reporting. The client attained complete version control to track production health and enforce strong validations.
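The client's framework used SpecFlow and C#; simply to illustrate the BDD approach of aligning acceptance criteria with stakeholders, here is a minimal behave-style sketch in Python, with a hypothetical end-to-end order flow and placeholder framework utilities.

# order_flow.feature — Gherkin shared with stakeholders as acceptance criteria:
#   Feature: End-to-end order flow
#     Scenario: Order placed in Salesforce reaches the warehouse system
#       Given a customer order is created in Salesforce
#       When the order is synced downstream
#       Then the WMS shows the order as ready to pick

# steps/order_flow_steps.py — step definitions calling the framework's
# common utility layer (create_order/sync_order/wms_status are placeholders):
from behave import given, when, then

def create_order(sku: str, qty: int) -> str:
    raise NotImplementedError("framework utility: create order via UI/API")

def sync_order(order_id: str) -> None:
    raise NotImplementedError("framework utility: trigger downstream sync")

def wms_status(order_id: str) -> str:
    raise NotImplementedError("framework utility: query WMS status")

@given("a customer order is created in Salesforce")
def step_create_order(context):
    context.order_id = create_order(sku="BRICK-RED", qty=100)

@when("the order is synced downstream")
def step_sync(context):
    sync_order(context.order_id)

@then("the WMS shows the order as ready to pick")
def step_check_wms(context):
    assert wms_status(context.order_id) == "READY_TO_PICK"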

                      OUTCOMES

Iris' automation solution enabled the client to surpass several business goals. Key outcomes of the delivered solution included:

• ~65% increase in automation coverage
• 100+ pipelines for in-scope applications across multiple environments
• 3,700+ test automation script executions per sprint cycle across applications
• 3x faster development of behavior-driven test scripts
• Multi-day manual test effort reduced to a few hours of automated regression
• 70% reduction in overall testing effort

                      Copyright © 2024 Iris Software, Inc. All rights reserved