diff --git "a/data.jsonl" "b/data.jsonl" --- "a/data.jsonl" +++ "b/data.jsonl" @@ -8,6 +8,7 @@ {"text": "# Infosys Whitepaper \nTitle: Need of a Test Maturity Model \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n PERSPECTIVE NEED FOR A COMPREHENSIVE TEST MATURITY MODEL - Reghunath Balaraman, Harish Krishnankutty Abstract Constant change and the ever-growing complexity of business have made it necessary for IT organizations, and specifically Test/QA organizations, to conduct thorough and periodic introspection of their processes and delivery capabilities. This is necessary to ensure that at all times the Test/QA organization, and its systems and processes, are relevant and available to support business needs. While there are multiple maturity models in the marketplace to help this process, they are not yet comprehensive enough and fail to provide today\u2019s dynamic businesses the much-needed flexibility and power of customization. The need of the hour is a comprehensive Test/QA maturity assessment model that not only answers the requirements of customization and flexibility, but also ensures relevance in today\u2019s complex delivery structures of multi-vendor scenarios, multi-location engagements, global delivery models, etc.\n\n The global economic crisis and revolutionary technology trends have changed the role of IT organizations in supporting business growth. Though the recession created a scarcity of capital for IT investments, the demands and expectations on the ability of IT to quickly adapt and support business have only increased manifold. In addition, rapid and revolutionary changes in technologies are forcing companies to recast their entire IT landscape. All these factors together have created a complex environment where the demand for change from the business is high, the capital for investment is scarce and time-to-market is critical to success. 
The constant change and ever-growing complexity of the business environment, and the risks associated with it, have made it necessary for organizations to conduct thorough and periodic introspection of their processes and delivery capabilities to ensure they operate in an efficient, effective and agile manner. A key requirement, above all, is to ensure that adequate controls are in place to guarantee quality in processes and outcomes, no matter the extent of change being introduced in the business or technology landscape of the organization. This is where the Test/QA organization\u2019s capability comes under the scanner. An organization\u2019s ability to assure and control the quality of its IT systems and processes largely determines the success or failure of the business in capturing, servicing and expanding its client base. Since quality and reliability play such a significant role in determining the current and future course of business outcomes, it is imperative that the quality assurance function itself is evaluated periodically for the relevance, effectiveness and efficiency of its processes, practices and systems. An objective self-introspection is the ideal first step. However, more often than not, QA organizations fall short of using this process to unearth gaps in their current systems and practices. Moreover, many organizations may have lost touch with the ever-evolving world of QA and be unaware of the leading practices and systems available today. This necessitates an independent assessment of the organization\u2019s QA practices to benchmark them against the practices prevalent in the industry and to answer that all-important question: \u201cwhere do we stand in comparison with industry standards?\u201d The assessment of maturity in testing processes also becomes critical in laying out the blueprint for a QA/testing transformation program that would establish the function as fit-for-purpose, and often world-leading. 
External Document \u00a9 2018 Infosys Limited \n\n Limitations of traditional approaches to Test/QA maturity assessment There are several models, proprietary and otherwise, available for assessing the maturity of IT processes and systems, including quality assurance. Most of these are developed and promoted as models that help an organization certify its capabilities in one or more areas of the software development lifecycle. Like all other models and frameworks that lead to certification, these maturity models have a fixed framework for an organization to operate within, and provide very little flexibility to address specific assessment needs. Further, these models fail to help organizations assess overall process maturity due to the following limitations: Inability to accommodate and account for heterogeneous delivery structures Over the last decade or so, most organizations have evolved into a heterogeneous composition of internal staff and service providers, delivering services through global delivery models with diverse talent, disparate processes, etc. All this has made assessing an organization\u2019s process maturity increasingly difficult. The existing maturity models in the marketplace are not flexible enough to accommodate these complex delivery structures created through multi-vendor scenarios, multi-location engagements focusing on selective parts of the software development lifecycle, etc. This significantly reduces the overall effectiveness of the output provided by existing maturity models and their applicability to the client situation. Focused on comprehensive certification rather than required capabilities Most conventional models are \u201ccertification focused\u201d and can help organizations assess their IT process capabilities and get certified. 
They are exhaustive in their coverage of process areas and answer the question, \u201chow comprehensive are the processes and practices to service diverse sets of users of the QA services?\u201d Such a certification is often a much-needed qualification for IT service provider organizations to highlight their process capability and maturity to diverse clients and prospects. However, most non-IT businesses maintain IT divisions to support their business and are more interested in selectively developing the required capabilities of their respective IT groups, leading to efficient business processes and better business outcomes. Hence the focus of maturity assessments in these organizations is not certification, but the ability to deliver specific business outcomes. Since the traditional assessment models are often certification-focused, most non-IT businesses find it an overhead to go through an exhaustive assessment process that does not help them answer the question, \u201chow effective are my organization\u2019s QA processes and practices in ensuring the quality of my business outcomes?\u201d Staged vs. continuous models for growth in maturity The majority of certification models follow a staged approach, which means that the organization has to satisfy all the requirements of a particular level and get certified at that level before becoming eligible to progress to the next one. But most organizations are selective in their focus and want to develop those areas that are relevant and necessary to their business, rather than meeting all the requirements just to get certified at a particular level. 
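The staged-versus-continuous contrast can be made concrete with a small sketch. Everything below is illustrative: the dimension names and the scores are hypothetical, not drawn from any particular maturity model:

```python
# Illustrative sketch: staged vs. continuous views of test maturity.
# Dimension names and scores are hypothetical assumptions.

def staged_level(scores):
    """Staged model: the certified level is gated by the weakest
    dimension - every requirement of a level must be met to pass it."""
    return min(scores.values())

def continuous_profile(scores):
    """Continuous model: each dimension keeps its own maturity score,
    so strengths and gaps remain individually visible."""
    return sorted(scores.items(), key=lambda kv: kv[1])

org = {"test process": 4, "automation": 3, "governance": 2, "skills": 4}

print(staged_level(org))        # 2 - one weak area caps the whole rating
print(continuous_profile(org))  # governance surfaces first as the gap to invest in
```

Under the staged view the weakest dimension caps the entire rating, while the continuous view keeps each dimension visible, which is the selective, business-relevant focus argued for above.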
Because of the staged approach to certification, such maturity models do not present organizations with a good view of where their current capabilities stand with respect to what the organization actually needs. Lack of focus on QA The existing maturity models primarily focus on software development and treat testing as a phase in the software development lifecycle. However, testing has since evolved into a mature and specialized discipline in the software industry, and hence the ability of the traditional models to assess QA/testing processes and practices to the required level of detail is very limited. They fall short for organizations that have realized the importance of an independent testing team and want to manage the QA maturity mapping process as an independent entity. Hence, the various dimensions of the test organization should be given adequate focus in the maturity assessment approach, covering the process, people and technology aspects of testing. A comprehensive model for assessing an organization\u2019s test capabilities and its ability to handle transformational programs Now that we have looked at the shortcomings of the traditional models of QA assessment, it is time to answer that all-important question: \u201cwhat should a comprehensive model for assessing QA/Test maturity look like?\u201d The key attributes of a comprehensive QA/Test assessment framework can be summed up as follows: Provide business-comprehensible, decision-aiding results The model should allow for selective assessment of the parameters relevant to maturity in the context of the business. The results of the assessment should help the business identify and plot possible immaturity in its systems and processes, using lead indicators of negative business impact. 
These indicators should help senior management decide whether to go for a detailed assessment of maturity before any adverse effect on the business is felt. Choice of business-relevant factors and focus areas The model should be flexible enough to provide the right level of focus on the various factors that the business deems relevant and that contribute to the overall maturity index. For example, an organization that depends on one or more service providers for its key IT services may want strong governance and gating mechanisms, while another organization that does testing in-house and leverages vendors for development will have a much wider focus on maturity in processes and practices. Basically, the model should be flexible enough to account for the intent of the assessment, as outlined by the organization. Detailed and comprehensive view of areas of improvement and strengths The model should also help determine the maturity of the testing organization in a detailed manner. The methods and systems of the model should provide a robust mechanism for objectively calculating the maturity level of the testing organization, based on the behaviors the organization exhibits. It should provide the members of the QA organization with a detailed view of the areas of strength (to be retained) and the areas of improvement. The model should enable the testing organization to understand the measures that should be implemented at a granular level, rather than at a high level, and thereby help the organization focus on its key QA dimensions and strengthen their maturity. A frame of reference for improvement initiatives The comprehensive maturity model should provide the organization with a roadmap to move its QA/testing processes and practices to a higher level of maturity and effectiveness. 
It should provide a reference framework for selective improvement of capabilities, keeping in mind the business context and organizational objectives. This will help the organization design a roadmap for improvement and devise ways to implement it effectively. Conclusion To meet the needs of a dynamic business environment and a rapidly evolving technology space, IT organizations need to respond quickly and efficiently with high-quality, high-reliability and cost-effective processes and systems. This calls for a robust and scalable QA organization that can guard and ensure the quality of the solutions that are put into operation, and periodically assess its own capabilities and maturity to ensure business relevance and effectiveness. Hence a comprehensive QA maturity model that assists organizations in this assessment should move away from certification-based models with \u201cgeneric\u201d and \u201chard-to-customize\u201d stages, to a model that is adaptable to the context in which the business operates. It needs to be a model that evaluates the factors influencing the maturity and quality of processes at a detailed level and helps the organization embed quality and maturity in its processes, governance and development of key competencies. This would help ensure the maturity of operations and promote continuous improvement and innovation throughout the organization. \n\n About the \nAuthors Reghunath Balaraman (Reghunath_Balaraman@infosys.com) Reghunath Balaraman is a Principal Consultant with over 16 years of experience. A postgraduate in engineering and management, he has been working closely with several large organizations to assess the maturity of their test and QA organizations and to help them build mature and scalable QA organizations. 
Reghunath is also well versed in several industry models for assessing the maturity of software testing. Harish Krishnankutty (Harish_T@infosys.com) Harish Krishnankutty is an Industry Principal in Infosys\u2019 Independent Validation Solutions unit and has over 14 years of experience in the IT industry. His area of specialization is QA consulting and program management, and he has extensive expertise in the design and implementation of Testing Centers of Excellence/Managed QA services. Currently, he focuses on the development of tools, IP and services in different areas of specialized testing, including Test Automation, SOA Testing, Security Testing, User Experience Testing, Data Warehouse Testing and Test Data Management. \n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. 
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: DAST Automation for Secure, Swift DevSecOps Cloud Releases \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Abstract DevSecOps adoption in the cloud goes well beyond merely managing continuous integration and continuous deployment (CI/CD) cycles. Its primary focus is security automation. This white paper examines the barriers organizations face when they begin their DevSecOps journey, and beyond. It highlights one of the crucial stages of security testing, known as Dynamic Application Security Testing (DAST). It explores the challenges and advantages of effectively integrating DAST into the CI/CD pipeline, both on-premises and in the cloud. The paper delineates the best practices for DAST tool selection and toolchain set-up, which assist in shift-left testing and cloud security workflows that offer efficient security validation of deployments with risk-based prompt responses. DAST AUTOMATION FOR SECURE, SWIFT DEVSECOPS CLOUD RELEASES WHITE PAPER \n\n Traditional security practices involve security personnel running tests, reviewing findings, and providing developers with recommendations for modifications. This process, including threat modeling, conducting compliance checks, and carrying out architectural risk analysis and management, is time-consuming and incongruous with the speed of DevOps. Some of these practices are challenging to automate, leading to a security and DevOps imbalance. To overcome these challenges, many organizations have shifted to an agile DevOps delivery model. However, this exerts significant pressure on DevOps to achieve speed with security as part of the CI/CD pipeline. 
As a result, release timelines and quality have been impacted due to the absence of important security checks or the deployment of vulnerable code under time pressure. Even as DevOps was evolving, the industry concurrently fast-tracked its cloud transformation roadmap. Most organizations shifted their focus to delivering highly scalable applications built on customized modern architectures with 24/7 digital services. These applications include a wide-ranging stack of advanced tiers, technologies, and microservices, backed by leading cloud platforms such as AWS, GCP, and Azure. Despite the accelerated digital transformations, a large number of organizations continue to harbor concerns about security. The year-end cybercrime statistics provide good reason to do so: 1. The global average cost of a data breach is an estimated US $4.35 million, as per IBM\u2019s 2022 data breach report [1] 2. Cybercrime cost the world US $7 trillion in 2022 and is set to reach US $10.5 trillion by 2025, according to Cybersecurity Ventures [2] Evidently, security is an important consideration in cloud migration planning. Speed and agility are imperatives while introducing security to DevOps processes. Integrating automated security checks directly into the CI/CD pipeline enables DevOps to evolve into DevSecOps. DevSecOps is a flexible collaboration between development, security, and IT operations. It integrates security principles and practices into the DevOps life cycle to accelerate application releases securely and confidently. Moreover, it adds value to business by reducing cost, improving the scope for innovation, speeding recovery, and implementing security by design. 
Studies project DevSecOps to reach a market size of between US $20 billion and US $40 billion by the end of 2030. \n\n As enterprises race to get on the DevSecOps bandwagon, IT teams continue to experience issues: \u2022 60% find DevSecOps technically challenging [3] \u2022 38% report a lack of education and adequate skills around DevSecOps [3] \u2022 94% of security and 93% of development teams report an impact from talent shortage [1] Some of the typical challenges that IT teams face when integrating security into DevOps on-premises or in the cloud are: People/culture challenges: \u2022 Lack of awareness among developers of secure coding practices and processes \u2022 Lack of collaboration and of cohesive, skillful teams with development, operations, and security experts Process challenges: \u2022 Security and compliance remain an afterthought \u2022 Inability to fully automate traditional manual security practices to integrate into DevSecOps \u2022 Difficulty running continuous security assessments without manual intervention Tools/technology challenges: \u2022 Tool selection, complexity, and integration problems \u2022 Configuration management issues \u2022 Prolonged code scanning and consumption of resources DevSecOps implementation challenges Focusing on each phase of the modern software development life cycle (SDLC) can help strategically resolve DevSecOps implementation challenges arising from people, processes, and technology. Integrating different types of security testing for each stage can help overcome the issues more effectively (Figure 1). 
Figure 1: Modern SDLC with DevSecOps and Types of Security Testing (pipeline stages Plan, Code, Build, Test, Release, Deploy, and Operate, mapped to threat modelling, software composition analysis and secret management, secure code analysis and Docker linting, dynamic application security testing, network vulnerability assessments, system/cloud hardening, and cloud configuration reviews) \n\n What is DAST? DAST is the technique of identifying the vulnerabilities and touchpoints of an application while it is running. DAST is easy even for beginners to get started on without in-depth coding experience. However, DAST requires a subject matter expert (SME) in the area of security to configure and set up the tool. An SME with good spidering techniques can build rules and configure the correct filters to ensure better coverage, improve the effectiveness of the DAST scan, and reduce false positives. Best practices to integrate DAST with CI/CD Besides adopting best practices, the CI/CD environment needs to be test-ready; a basic test set-up is listed below. There can be several alternatives to the set-up based on the toolset selection. The following diagram depicts a sample (see Figure 2). Figure 2: DevSecOps Lab Set-up \u2022 Integrate the DAST scan in the CI/CD production pipeline after provisioning the essential compute resources, provided the scan will take under 15 minutes to complete. If not, create a separate pipeline in a non-production environment \u2022 Create separate jobs for each test in the case of large applications. 
E.g., SQL injection and XSS, among others \u2022 Consider onboarding an SME with expertise in spidering techniques, as the value created through scans is directly proportional to the skills applied \u2022 Roll out security tools in phases based on usage, from elementary to advanced \u2022 Fail builds that report critical or high-severity issues \u2022 Save time building test scripts from scratch by leveraging existing scripts from the functional automation team \u2022 Provide links to knowledge pages in the scan outputs for additional assistance \u2022 Pick tools that provide APIs \u2022 Keep the framework simple and modular \u2022 Control the scope and false positives locally instead of maintaining a central database \u2022 Adopt an everything-as-code strategy, as it is easy to maintain The basic test set-up comprises: \u2022 A developer machine for testing locally \u2022 A code repository for version control \u2022 A CI/CD server for integrations and running tests with the help of a slave/runner \u2022 A staging environment \n\n Right tool selection With its heavy reliance on tools, DevSecOps enables the automation of engineering processes, such as making security testing repeatable, increasing testing speed, and providing early qualitative feedback on application security. 
Therefore, selecting the appropriate security testing tools for specific types of security testing, and applying the correct configuration in the CI/CD pipeline, is critical. Challenges in tool selection and best practices Common pitfalls \u2022 Lack of standards in tool selection \u2022 Security issues from tool complexity and integration \u2022 Inadequate training, skills, and documentation \u2022 Configuration challenges Best practices in tool selection \u2022 Expert coverage of tool standards \u2022 Essential documentation and security support \u2022 Potential for optimal tool performance, including language coverage, open-source or commercial options, the ability to ignore issues, incident severity categories, failure on issues, and results reporting features \u2022 Cloud technology support \u2022 Availability of customization and integration capabilities with other tools in the toolchain \u2022 Continuous vulnerability assessment capability Best practices in tool implementation \u2022 Create an enhanced set of customized rules for tools to ensure optimum scans and reliable outcomes \u2022 Plan incremental scans to reduce the overall time taken \u2022 Use artificial intelligence (AI) capabilities to optimize the analysis of vulnerabilities reported by tools \u2022 Aim for zero-touch automation \u2022 Consider built-in quality through automated gating of the build against the desired security standards After selecting the CI/CD and DAST tools, the next step is to set up a pre-production or staging environment and deploy the web application. This set-up enables DAST to run in the CI/CD pipeline as a part of integration testing. Let us consider an example using the widely available open-source DAST tool, Zed Attack Proxy (ZAP). 
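Before the detailed considerations, it may help to see what the build-gating mechanics can look like. The sketch below is hypothetical: it assumes a simplified JSON report shape (a flat list of alerts with `name` and `risk` fields), which is not ZAP's exact export schema and would need to be adapted to the real report format:

```python
import json

# Hypothetical CI gate: fail the build when the DAST report contains
# critical- or high-severity alerts. The report layout below is an
# assumed, simplified shape, not the exact ZAP export schema.
BLOCKING = {"Critical", "High"}

def gate_build(report_text, blocking=BLOCKING):
    """Return (passed, names_of_blocking_alerts) for a JSON scan report."""
    alerts = json.loads(report_text).get("alerts", [])
    blockers = [a["name"] for a in alerts if a.get("risk") in blocking]
    return (not blockers, blockers)

sample = json.dumps({"alerts": [
    {"name": "Cookie Without Secure Flag", "risk": "Low"},
    {"name": "SQL Injection", "risk": "High"},
]})

passed, blockers = gate_build(sample)
print(passed)    # False - the High-severity finding blocks the release
print(blockers)  # ['SQL Injection']
```

In a real pipeline job the script would end with something like `sys.exit(0 if passed else 1)` so that a blocking finding fails the stage, in line with the fail-the-build practice above.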
Some of the key considerations for integrating DAST in the CI/CD pipeline using ZAP (see Figure 3) are listed below: \u2022 Test on the developer machine before moving the code to the CI/CD server and the GitLab CI/CD \u2022 Set up the CI/CD server and GitLab. Ensure ZAP container readiness with Selenium on Firefox, along with custom scripts \u2022 Reuse the functional automation scripts, modifying them only for security testing use cases and data requirements \u2022 Push all the custom scripts to the Git server and pull the latest code. Run the pipeline after meeting all prerequisites \n\n DevSecOps with DAST in the cloud Integrating DAST with cloud CI/CD requires a different approach. 
Approach: \u2022 Identify, leverage, and integrate cloud-native CI/CD services, continuous logging and monitoring services, auditing and governance services, and operations services with regular CI/CD tools \u2013 mainly DAST \u2022 Control all CI/CD jobs with a server-and-slave architecture, using containers such as Docker alongside cloud orchestration tools to build and deploy applications. An effective DAST DevSecOps in cloud architecture appears as shown in Figure 4. Best practices \u2022 Control access to pipeline resources using identity and access management (IAM) roles and security policies \u2022 Always encrypt data in transit and at rest \u2022 Store sensitive information, such as API tokens and passwords, in the Secrets Manager Key steps 1. The user commits the code to a code repository 2. The tool builds artifacts and uploads them to the artifact library 3. Integrated tools help perform the SCA and SAST tests 4. Reports of critical/high-severity vulnerabilities from the SCA and SAST scans go to the security dashboard for fixing 5. Code deployment to the staging environment takes place if reports indicate no vulnerabilities, or only ones marked to be ignored 6. Successful deployment triggers a DAST tool, such as OWASP ZAP, for scanning 7. The user repeats steps 4 to 6 in the event of a vulnerability detection 8. If no vulnerabilities are reported, the workflow triggers an approval email 9. Receipt of approval schedules automatic deployment to production Figure 4: DAST DevSecOps in Cloud Workflow \n\n \u00a9 2023 Infosys Limited, Bengaluru, India. All Rights Reserved. 
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected About the authors Kedar J Mankar Kedar J Mankar is a global delivery lead for Cyber Security testing at Infosys. He has extensive experience across different software testing types. He has led large-scale delivery and transformation programs for global Fortune 500 customers and delivered value through different COEs with innovation at the core. He has experience working with and handling teams in functional, data, automation, DevOps, performance and security testing across multiple geographies and verticals. Amlan Sahoo Amlan Sahoo has over 27 years in the IT industry in application development and testing. He is currently the head of the Cyber Security testing division. He has a proven track record in managing and leading transformation programs with large teams for Fortune 50 clients, managing deliveries across multiple geographies and verticals. He also has four IEEE and one IASTED publications to his credit on bringing efficiencies to heterogeneous software architectures. Vamsi Kishore Vamsi Kishore Sukla is a security consultant with over 8 years of professional experience in the security field, specializing in application security testing, cloud security testing, and network vulnerability assessments following OWASP standards and CIS benchmarks. 
With a deep understanding of the latest security trends and tools, he provides comprehensive security solutions to ensure the safety and integrity of organizations and clients. Conclusion DevOps is becoming a reality much faster than we anticipate. However, there should be no compromise on security testing, to avoid delayed deployments and the risk of releasing software with security vulnerabilities. Successful DevSecOps requires integrating security at every stage of DevOps, enabling DevOps teams on security practices, enhancing the partnership between DevOps teams and security SMEs, automating security testing to the extent possible, and shifting security left for early feedback. By leveraging the best practices recommended in this paper, organizations can achieve more secure releases and speed them up by as much as 15%, both on-premises and in the cloud. References 1. https://www.cobalt.io/blog/cybersecurity-statistics-2023 2. https://cybersecurityventures.com/boardroom-cybersecurity-report/ 3. https://strongdm.com/blog/devsecops-statistics \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Data Archival Testing \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n WHITE PAPER DATA ARCHIVAL TESTING Abstract Today, there is an exponential rise in the amount of data being generated by organizations. This explosion of data increases IT infrastructure needs and has an immense impact on important business decisions that depend on proficient data analytics. These challenges have made data archival extremely important from a data management perspective. Data archival testing is becoming increasingly important for businesses, as it helps address these challenges, validate the accuracy and quality of archived data and improve the performance of related applications. 
The paper is aimed at helping readers better understand the space of data archival testing, its implementation and the associated benefits. \n\n Introduction One of the most important aspects of managing a business today is managing its data growth. For most organizations, the cost of managing data outpaces the cost of storing it. Operational analytics and business intelligence reporting usually require active operational data. Data that has no current requirement or usage, known as inactive data, can be archived to safe and secure storage. Data archiving becomes important for companies that want to manage their data growth without compromising on the quality of the data that resides in their production systems. Many CIOs and CTOs are reworking their data retention policies and their data archival and retrieval strategies because of an increased demand for data storage, reduced application performance and the need to comply with ever-changing legislation and regulations. [1] Data Archival Testing \u2013 Test Planning Data archival is the process of moving data that is not required for operational, analytical or reporting purposes to offline storage. A data retrieval mechanism is developed to restore data from the offline storage. The common challenges faced during data archival are: \u2022 Inaccurate or irrelevant data in data archives \u2022 Difficulty in the data retrieval process from the data archives Data archival testing helps address these challenges. While devising the data archival test plan, the following factors need to be taken into consideration: Data Dependencies There are many intricate data dependencies in an enterprise\u2019s architecture. The data that is archived should include the complete business objects along with metadata, which helps retain the referential integrity of data across related tables and applications. 
Data archival testing needs to validate that all related data is archived together for easy interpretation during storage and retrieval. Data Encoding The encoding of data in the archival database depends on the underlying hardware for certain types of data. For example, when numerical fields such as integers are archived, data archival testing needs to ensure that the related hardware encoding information is archived with them, so that the data can later be retrieved and displayed correctly on a different set of hardware.<|endoftext|>Data Retrieval Data needs to be retrieved from archives for regulatory, legal and business needs. The validation of the data retrieval process ensures that the archived data is easily accessed, retrieved and displayed in a format which can be clearly interpreted without any time-consuming manual intervention.<|endoftext|>Data Archival Testing \u2013 Implementation The data archival testing process includes validating the processes for data archival, data deletion and data retrieval. Figure 1 below describes the different stages of a data archival testing process, the business drivers, the different types of data that can be archived and the various offline storage modes. Figure 1: The Data Archival Testing Process External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n Test the Data Archival process 1 \u2022 Testing the Data Archival process ensures that the business entities that are archived include master data, transaction data, metadata and reference data \u2022 Validates the storage mechanism and that the archived data is stored in the correct format. The data also has to be tested for hardware independence Test the Data Deletion process 2 \u2022 Inactive data needs to be archived and moved to a secure storage for retrieval at a later point and then deleted from all active applications using it.
This validation process would verify that the data deletion process has not caused errors in any existing applications and dashboards \u2022 When archived data is deleted from systems, verify that the applications and reports conform to their performance requirements Test the Data Retrieval process 3 \u2022 Data that has been archived needs to be easily identified and accessible in case of any legal or business needs \u2022 For scenarios that involve urgent data retrieval, the retrieval process needs to be validated to complete within a defined time period Benefits of Data Archival Testing The benefits of data archival testing are often interrelated and have a significant impact on the IT infrastructure costs for a business. Some of the benefits are: \u2022 Reduced storage costs: Only the data that is relevant gets archived, and only for a defined time period, which reduces hardware costs and maintenance costs significantly.<|endoftext|>\u2022 Improved application performance: Data is retrieved faster and the network performs better as only relevant data is present in the production environment. All these factors enhance application performance.<|endoftext|>\u2022 Minimized business outages: Archived data that is deleted from production systems does not have an impact on the related applications\u2019 performance and functionality, leading to smooth business operations.<|endoftext|>\u2022 Data compliance: Easy retrieval and availability of archived data ensures higher compliance with legal and regulatory requirements.<|endoftext|>Accomplishing all these benefits determines the success of a data archival test strategy.<|endoftext|>Conclusion Due to the critical business needs for data retention, regulatory and compliance requirements and a cost-effective way to access archived data, many businesses have started realizing the value of and adopting data archival testing.
Therefore, an organization\u2019s comprehensive test strategy needs to include a data archival test strategy which facilitates smooth business operations, ensures fulfillment of all data requirements, maintains data quality and reduces infrastructure costs.<|endoftext|>\n\n---\n\n Page: 4 / 4 \n\n---\n\n About the Author Naju D. Mohan Naju is a Group Project Manager with Infosys, with about 15 years of IT experience. She is currently managing specialized testing services like SOA testing, Data Warehouse testing and Test Data Management for many leading clients in the retail sector.<|endoftext|>REFERENCES 1. \u2018Data overload puts UK retail sector under pressure\u2019, Continuity Central, February 2009 2. \u2018Data Archiving, Purging and Retrieval Methods for Enterprises\u2019, Database Journal, January 2011 \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document.
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} +{"text": "# Infosys POV \nTitle: Data Imperatives in IT MA&D in Life Sciences Industry \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 10 \n\n---\n\n An Infosys Consulting Perspective Consulting@Infosys.com | InfosysConsultingInsights.com DATA IMPERATIVES IN IT MA&D IN LIFE SCIENCES INDUSTRY \n\n---\n\n Page: 2 / 10 \n\n---\n\n FOREWORD Larger macroeconomic headwinds (first the pandemic, then rising interest rates and now recessionary fears) are pushing organizations to resort to mergers, acquisitions, and divestitures (MA&D) as a strategic lever to achieve higher market share, acquire new capabilities, and/or refocus strategy on core business to improve financial performance. The average annual global MA&D value was approximately $3.6 trillion in the 2011-20 cycle and increased to $5.9 trillion in 2021, highlighting the growing importance of MA&D in meeting future business needs. The life sciences industry is increasingly looking at MA&Ds to acquire new specialty/generic drug lines (and related market pipelines) in pharmaceuticals, specialized capabilities in the diagnostics and digital health sector, niche research and development capabilities for effective drug discovery around \u201cspecialty drugs\u201d and patented IP data around experimental drugs.<|endoftext|>There is growing emphasis on antitrust regulations, regulatory reporting, disclosure requirements, and overall deal approval processes. Compliance with these directly relates to the way entity data is managed (before and after the MA&D transaction). Multiple data types, including financial, operational, people, supplier, and customer data, come into scope. This requires organizations to carefully design and execute their data strategy.
Despite the growing importance of data strategy in MA&D transactions, just 24% of organizations included CIOs in pre-merger planning4, and there are multiple examples from the industry that underline what is at stake: Abbott Laboratories\u2019 acquisition of Alere was delayed due to regulatory concerns over market concentration1 and anti-competitive effects2, Pfizer and Allergan terminated their planned merger after a change in treasury rules made the tax benefits less attractive3, and there are many more.<|endoftext|>Data strategy design and execution start with the definition of business metrics and alignment on a value measurement approach. After metrics are defined and accepted, linkage to source systems, standardization of data element definitions and management of metadata along with master data ownership are key to accurately measuring and interpreting these metrics. Data qualification, especially in regulated industries, is critical to understanding and managing qualified data (GxP) and the related platforms and applications involved. Finally, a performance-oriented and scalable data integration methodology, followed by an overarching process and governance mechanism, is necessary for ensuring ongoing quality and compliance.<|endoftext|>A poorly designed data strategy and execution often leads to an ambiguous understanding of key metrics and underlying data elements, incongruent data standards and unclear ownership, resulting in faulty data integration, inaccurate transaction records, and ultimately unreliable insights and legal complications. An effective way to overcome these pitfalls is to define a robust data design and execution strategy covering key elements that address the distinctive needs of the life sciences industry.<|endoftext|>Data Imperatives in IT MA&D in Life Sciences Industry | \u00a9 2023 Infosys Consulting \n\n---\n\n Page: 3 / 10 \n\n---\n\n Mergers, acquisitions, and divestitures (MA&D) are strategic channels for growth.
Multiple benefits can be achieved through an effective MA&D transaction, including exponential growth, entry into new markets, optimized cost savings, and improved competitiveness.<|endoftext|>The life sciences industry has been experiencing rapid growth and transformation in recent years, fueled by innovations in R&D, regulatory changes, and technological advancements in provider and payer domains. MA&D transactions have become a vital strategic tool for organizations to expand their portfolios, access new markets and improve their competitive positions. Given the complex nature of MA&D transactions, they require comprehensive due diligence and planning before, during, and after the transaction. Data, being the fundamental building block of any organization, is a critical factor in this due diligence and planning. It is also one of the most commonly overlooked factors. In this article, we highlight key elements of data strategy and design within a MA&D transaction and typical pitfalls along with ways to overcome them.<|endoftext|>MA&D transactions in the life sciences industry are increasingly subject to higher scrutiny from regulatory bodies to ensure greater transparency and better shareholder and consumer protection. There are three key regulation types in place: 1. Greater financial and operational transparency: A. India \u2013 Foreign Exchange Management Act (FEMA), SEBI Laws B. USA \u2013 Securities Act, Securities Exchange Act C. Europe \u2013 European Union Merger Law 2. Better intellectual property protection: Patents, trademarks, copyrights, trade secrets, designs, data protection.<|endoftext|>3. Higher fair play and consumer protection: A. India \u2013 Competition Act B. USA \u2013 Federal Antitrust Laws C. Europe \u2013 Competition Law According to an analyst report, the average MA&D failure rate is ~70%4.
A key reason for this high failure rate is difficulty in integrating the two entities,5 especially with respect to culture, operational ways of working, revenue recognition and performance incentives. All these aspects are directly impacted by the way data is designed and managed. Despite this importance, just 24% of organizations included CIOs in pre-merger planning3. Effective data management is key to adhering to these regulatory requirements and ensuring that data is properly collected, analyzed, and reported throughout the transaction process.<|endoftext|>Introduction \n\n---\n\n Page: 4 / 10 \n\n---\n\n MA&Ds are fundamentally complex transactions that impact the business entities, systems, processes, and data of the organizations involved. There are eight elements that underpin data strategy and execution.<|endoftext|>Fig 1 \u2013 Key elements of data strategy within a MA&D transaction. 1. Business metrics and measurement: Defining metrics to evaluate the performance of the target entity is critical. It is important that all entities involved in the transaction clearly define and agree upon the metrics which define success; noteworthy metrics in the life sciences industry include clinical trial outcomes, regulatory approval timelines, molecule discovery rates, drug pipeline progress and GxP compliance metrics. These metrics articulate the objectives and key results of the target entity. Agreement on accurate metrics and their measurement logic improves operational and financial transparency, thereby promoting adoption of the integration/divestiture decision.<|endoftext|>2. Data policies and standards: It is essential to establish a common set of data standards and policies to maintain data assets in the target environment.
This involves defining standard data formats, structures, and rules for data management and establishing governance policies to ensure security, privacy, compliance, and protection of data. In a merger or a divestiture scenario, data policies for the resulting entities are driven by target business needs and operational requirements.<|endoftext|>Key elements of data strategy and execution \n\n---\n\n Page: 5 / 10 \n\n---\n\n 3. Metadata management: Metadata helps to classify, manage, and interpret master data.<|endoftext|>Managing metadata is essential in ensuring standardization of data elements across systems, e.g., customer ID, distribution channel codes, clinical trial identifiers, drug classification codes, etc.<|endoftext|>Effective metadata management promotes improved data consistency, better data quality, governance, compliance, and security. Like data policies and standards, metadata management standards are driven by target business needs and operational requirements.<|endoftext|>4. Master data management: Data ownership is crucial in MA&Ds because it determines the accountabilities and responsibilities necessary for maintaining the data assets.<|endoftext|>Defining master data ownership during the pre/post-close phases is critical in ensuring a smooth transition to integrated operations6. All parties involved must align on clear ownership for gaining access to, maintaining, and governing the master data assets after the transaction. Establishing data stewardship roles and processes to maintain master data is essential to avoid pitfalls such as delays in integration, legal disputes, and potential regulatory penalties. In addition, clear data ownership contributes to better intellectual property protection in a MA&D transaction.
This ownership also means managing data at a product level with an assured level of data quality, making it easier for users to extract valuable insights and intelligence.<|endoftext|>5. Data lineage management: MA&D transactions create large data assets, which increasingly become interconnected, complex, and challenging to work with. Data lineage tracks the flow of data from source to destination, noting any changes in its journey across different systems. This allows for tracing data origins, evaluating data accuracy and pinpointing potential risks, enabling risk management and thus elevating the probability of success of MA&D transactions.<|endoftext|>6. Data qualification: A crucial element for consideration is the qualification of data into GxP and non-GxP. GxP data is subject to stringent regulations, while non-GxP data has fewer regulatory constraints. Proper data qualification enables organizations to manage GxP data in compliance with regulatory guidelines and handle non-GxP data as appropriate for its intended use. This helps in the adoption of efficient data management processes, especially from an extract, transform and load perspective. It also emphasizes the relevance of the systems that will hold the regulatory data, thus ensuring the required controls are in place when interacting with such systems.<|endoftext|>\n\n---\n\n Page: 6 / 10 \n\n---\n\n 7. Data integration: Effective integration of data across systems such as clinical trial databases, product development pipelines, and sales and marketing platforms into a single, unified environment is critical for the new entity to make effective decisions. Integration of data requires a consistent understanding of data and minimization of data redundancies. This helps the new entity gain a better and more accurate understanding of its business and operational data, thereby expediting envisioned synergy realization.
It also increases operational efficiency by streamlining internal processes and reducing duplication of effort, thereby improving the risk profile. Effective data integration is essential for achieving information protection and transparency in a MA&D transaction.<|endoftext|>8. Data governance: Data governance is a crucial element for managing \u201cdata at rest\u201d and \u201cdata in motion\u201d. Robust data governance establishes policies, processes and controls to manage data throughout the life cycle. An effective data governance framework ensures both \u201cdata in motion\u201d and \u201cdata at rest\u201d are adequately protected while tracking data health in a near real-time manner, thereby fostering trust with regulators, customers, and partners.<|endoftext|>Common pitfalls in a MA&D and ways to overcome them Data design and execution to support an integration/divestiture transaction is often complicated and stressful. However, with the right interventions, organizations can navigate around these complications. An ineffective data strategy can have far-reaching consequences, such as reduced financial and operational transparency, compromised intellectual property protection, decreased fair play among the entities involved and weakened consumer protection. We have identified six common pitfalls and their impact.<|endoftext|>Fig 2 \u2013 Critical elements of pitfalls in a MA&D transaction \n\n---\n\n Page: 7 / 10 \n\n---\n\n 1. Data ownership: One of the most common pitfalls in MA&D transactions is limited clarity around ownership of data assets in the target state. The issue is particularly pronounced when the organizations involved have multiple focus areas with data stored in a single system but without proper segregation and ownership. For example, an organization may have three focus areas such as BioSimilars, BioPharma and Med Devices.
Data on these focus areas may be stored in one system but not segregated based on focus areas. MA&D in any one of these areas will impose a significant challenge in terms of data segregation and dependency identification. Ambiguities regarding data asset ownership often lead to intellectual property disputes, faulty data integration, and challenges in extracting data specific to a new entity. To avoid such confusion, it is essential to establish data ownership early in the transaction and assign data stewards to manage data at rest as well as in motion.<|endoftext|>2. Business metrics: Organizations involved in a MA&D transaction may prioritize select GxP and non-GxP metrics based on their distinctive strategic objectives and market priorities. For example, in a MA&D involving a generic and a specialty drug maker, the generic drug maker might emphasize GxP metrics such as manufacturing quality and regulatory submission timelines, as well as non-GxP metrics such as market share and cost efficiency. On the other hand, the specialty drug maker might focus on GxP metrics such as clinical trial data quality and patient safety, and non-GxP metrics such as R&D pipeline growth and innovative therapy development. Given these diverse priorities, establishing common performance criteria for the new entity might be a challenge. Moreover, a lack of uniformity in the underlying logic for measuring performance may further exacerbate the issue. Organizations must establish uniform metrics and underlying measurement criteria that are reflective of the strategic priorities of the target entity.<|endoftext|>3. Data standards: Organizations also face roadblocks when they fail to establish common definitions for data elements. The resulting inconsistency in data standards increases the risk of inaccurate transaction records. Such inaccuracies can impair decision-making during critical stages of the MA&D and might even jeopardize the overall success of the transaction.
Creating a unified data dictionary and standardizing data definitions across all entities involved is essential to mitigate such risks.<|endoftext|>4. Data lineage: A common pitfall is related to the replication of source data elements across multiple source systems. Replication of data elements in multiple systems increases the complexity of managing data and leads to additional synchronization overheads.<|endoftext|>Establishing standardized data lineage practices along with synchronized replication processes through automated tools is key to increasing data congruency.<|endoftext|>\n\n---\n\n Page: 8 / 10 \n\n---\n\n 5. Data governance: Another common challenge encountered during MA&Ds arises from ineffective and inconsistent data governance processes. Inconsistent data governance processes decrease the accuracy of the inferences and insights which can be derived from datasets. A consistent data governance process ensures data protection and regulatory compliance.<|endoftext|>6. Knowledge management: Heavy reliance on individuals makes knowledge retention vulnerable to personnel changes. To overcome this challenge, organizations must develop a knowledge management capability that is not solely dependent on people but facilitated through a set of processes and tools. A robust knowledge management capability enables effective and efficient use of data during the transaction.<|endoftext|>A well-designed data strategy is complemented by an effective execution plan. By proactively identifying potential challenges and implementing mitigating solutions, organizations can effectively navigate through the complexities and maximize value realization from a MA&D transaction.
Effective data strategy and execution can safeguard the success of the transaction and ensure that the resulting entity (or entities) operates efficiently and effectively.<|endoftext|>About the CIO advisory practice at Infosys Consulting Over the next five years, CIOs will lead their organizations towards fundamentally new ways of doing business. The CIO Advisory practice at Infosys Consulting is helping organizations all over the world transform their operating model to succeed in the new normal \u2013 scaling up digitization and cloud transformation programs, optimizing costs, and accelerating value realization. Our solutions focus on the big-ticket value items on the C-suite agenda, providing a deep link between business and IT to help you lead with influence.<|endoftext|>\n\n---\n\n Page: 9 / 10 \n\n---\n\n MEET THE AUTHORS Inder Neel Dua inder_dua@infosys.com Inder is a Partner with Infosys Consulting and leads the life sciences practice in India. He has enabled large-scale programs in the areas of digital transformation, process re-engineering and managed services.<|endoftext|>Anurag Sehgal anurag.sehgal@infosys.com Anurag is an Associate Partner with Infosys Consulting and leads the CIO advisory practice in India. He has enabled large and medium-scale clients to deliver sustainable results from multiple IT transformation initiatives.<|endoftext|>Ayan Saha ayan.saha@infosys.com Ayan is a Principal with the CIO advisory practice in Infosys Consulting.
He has helped clients on business transformation initiatives focusing on IT M&A, including operating model transformation.<|endoftext|>Manu A R manu.ramaswamy@infosys.com Manu is a Senior Consultant with the CIO advisory practice in Infosys Consulting.<|endoftext|>He has assisted clients on technology transformation initiatives in the areas of IT M&A and cloud transformation.<|endoftext|>Sambit Choudhury sambit.choudhury@infosys.com Sambit is a Senior Consultant with the CIO advisory practice in Infosys Consulting. His primary focus areas include enterprise transformation with IT M&A as a lever. He has helped clients in areas of IT due diligence, integration, and divestitures.<|endoftext|>1 FTC Requires Abbott Laboratories to Divest Two Types of Point-Of-Care Medical Testing Devices as Condition of Acquiring Alere Inc.<|endoftext|>2 EU clears Abbott acquisition of Alere subject to divestments | Reuters 3 Pfizer formally abandons $160bn Allergan deal after US tax inversion clampdown | Pharmaceuticals industry | The Guardian 4 Why, and when, CIOs deserve a seat at the M&A negotiating table | CIO 4 The New M&A Playbook - Article - Faculty & Research - Harvard Business School (hbs.edu) 5 Don\u2019t Make This Common M&A Mistake (hbr.org) 6 6 ways to improve data management and interim operational reporting during an M&A transaction \n\n---\n\n Page: 10 / 10 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts who lead complex change agendas driven by disruptive technology.
With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2023 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Need for data masking in a data-centric world \nAuthor: Infosys Limited \nFormat: PDF 1.6 \n\n---\n\n Page: 1 / 4 \n\n---\n\n VIEW POINT Abstract With data gaining increasing prominence as the foundation of organizational operations and business, ensuring data security is emerging as a top priority. It is critical to safeguard sensitive data and customer privacy; failure to do so can lead to financial and reputational losses. Thus, there is a rising demand to protect personally identifiable information during transfer within organizations as well as across the external ecosystem. This paper highlights the need for data masking solutions.
It also explains how customized data masking solutions can be used in today\u2019s data-centric world.<|endoftext|>Paromita Shome, Senior Project Manager, Infosys Limited NEED FOR DATA MASKING IN A DATA-CENTRIC WORLD \n\n---\n\n Page: 2 / 4 \n\n---\n\n External Document \u00a9 2018 Infosys Limited Introduction The key differentiator for today\u2019s businesses is how they leverage data. Thus, ensuring data security is of utmost importance, particularly for organizations that deal with sensitive data. However, this can be challenging because data that is marked critical and sensitive often needs to be accessed by different departments within an organization. Without a well-defined enterprise-wide data access management strategy, securing data transfer can be difficult. The failure to properly control the handling of sensitive information can lead to dangerous data breaches with far-reaching negative effects. For instance, a 2017 report by the Ponemon Institute titled \u2018Cost of a Data Breach Study, 2017\u20191 found that: \u2022 The average consolidated total cost of a data breach is US $3.62 million \u2022 The average size of a data breach (number of records lost or stolen) increased by 1.8% in the past year \u2022 The average cost of a data breach is US $141 per record \u2022 Any incident \u2013 either in-house, through a third party or a combination of both \u2013 can attract penalties of US $19.30 per record. Thus, for a mere 100,000 records, the cost of a data breach can be as high as US $1.9 million These statistics indicate that the consequences of data breaches go beyond financial losses. They also affect the organization\u2019s reputation, leading to loss of customer and stakeholder trust.
Thus, it is imperative for organizations to adopt robust solutions that manage sensitive data to avert reputational damage and financial losses.<|endoftext|>\n\n---\n\n Page: 3 / 4 \n\n---\n\n Data masking as a solution Data masking refers to hiding data such that sensitive information is not revealed. It can be used for various testing or development activities. The most common use cases for data masking are: \u2022 Ensuring compliance with stringent data regulations \u2013 Nowadays, there are many emerging protocols that mandate strict security compliance, such as the Health Insurance Portability and Accountability Act (HIPAA) and the General Data Protection Regulation (GDPR). These norms do not allow organizations to transfer personal information such as personally identifiable information (PII), payment card information (PCI) and personal health information (PHI) \u2022 Securely transferring data between project teams \u2013 With the increasing popularity of offshore models, project management teams are concerned about how data is shared for execution. For instance, sharing production data raises concerns about the risk of data being misused/mishandled during transition. Thus, project teams need to build an environment that closely mimics production environments and can be used for functionality validation. This requires hiding sensitive information when converting and executing production data. It is important to note that data sensitivity varies across regions. Organizations with global operations are often governed by different laws. Hence, the demand for data security and the potential impact of any breach differ based on the operating regions. Thus, having an overarching data privacy strategy is paramount to ensure that sensitive data remains protected. This calls for a joint data protection strategy that includes vendors in offshore and near-shore models as well.<|endoftext|>Types of data masking There are various masking models or algorithms that can be leveraged to address the above use cases. These ensure data integrity while adhering to masking demands. The most common types are: \u2022 Substitution or random replacement of data with substitute data \u2022 Shuffling or randomizing existing values vertically across a data set/column \u2022 Data encryption by replacing sensitive values with arithmetically formulated data and using an encryption key to view the data \u2022 Deleting the input data for sensitive fields and replacing it with a null value to prevent visibility of the data element \u2022 Replacing the input value with another value in the lookup table While the above models enable straightforward masking, they cannot be applied to all cases, thus creating the need for customized data masking. Customized data masking uses an indirect masking technique where certain business rules must be adhered to along with encryption, as shown in Fig 1.<|endoftext|>In the figure, the source data \u2013 WBAPD11040WF70037 \u2013 is received from any source system like RDBMS/flat files. The business rules state that: 1. There should be no change in the first 10 characters post masking 2. The next 2 letters should be substituted with letters post masking 3. The last 5 numerals should be substituted with numerals only post masking Fig 1: A technical approach to customized data masking [Figure: source data (WBAPD11040WF70037) from an RDBMS or flat file passes through a data splitter into the splits WBAPD11040, WF and 70037; the TDM masking tool (PL/SQL) masks the splits, which are then concatenated into the masked output WBAPD11040AG81726 and published to an RDBMS or flat file] \n\n---\n\n Page: 4 / 4 \n\n---\n\n \u00a9 2019 Infosys Limited, Bengaluru, India. All Rights Reserved.
Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected As part of customized masking, the source data is passed through a data splitter. The single source value is then split into individual segments based on the business rules, as shown in Fig 1.<|endoftext|>After this, each individually split segment is run through the data masking tool, and the masking algorithm defined in the tool is executed for each item, yielding masked data as output. For each segment, the masking type is selected based on the business rule and then the encryption is applied. The three individual segments of masked data are finally concatenated before being published at the output, which can be a database or a flat file. The masked data for the reference source value now reads as WBAPD11040AG81726. This output data still holds the validity of the source input data, but its sensitive segments are substituted, in line with the business rules, with values that do not exist in the source.
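The split-mask-concatenate flow described above can be sketched in Python. This is a minimal illustration of the three business rules from Fig 1, not the whitepaper's TDM tool / PLSQL implementation; the function name and the random substitution scheme are assumptions:

```python
import random
import string

def mask_vin(source, seed=None):
    """Apply the Fig 1 business rules to a 17-character value:
    keep characters 1-10 unchanged, substitute letters for
    characters 11-12, and substitute numerals for the last 5."""
    rng = random.Random(seed)
    # Data splitter: cut the source value into the three rule-defined segments
    fixed, letters, digits = source[:10], source[10:12], source[12:]
    # Rule 2: replace the two letters with randomly chosen letters
    masked_letters = "".join(rng.choice(string.ascii_uppercase) for _ in letters)
    # Rule 3: replace the five numerals with randomly chosen numerals
    masked_digits = "".join(rng.choice(string.digits) for _ in digits)
    # Data concatenation: join the masked segments back into one value
    return fixed + masked_letters + masked_digits

masked = mask_vin("WBAPD11040WF70037")
assert masked[:10] == "WBAPD11040"   # rule 1: prefix unchanged
assert masked[10:12].isalpha()       # rule 2: letters stay letters
assert masked[12:].isdigit()         # rule 3: numerals stay numerals
```

Passing a fixed seed makes the masking repeatable across runs, which can be useful when the same source value must mask to the same output in every refresh of a test data set.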
Hence, this can be utilized in any non-production environment.<|endoftext|>Customized masking can be used in various other scenarios such as: \u2022 To randomly generate a number that passes Luhn\u2019s algorithm, ensuring the masked data remains valid wherever the source data is checked against Luhn\u2019s algorithm \u2022 To check number variance in a range between \u2018x\u2019 and \u2018y\u2019 where the input values will be replaced with a random value between the border values, and the decimal points are changed \u2022 To check number variance of around +/- * % where a random percentage value between defined borders will be added to the input value Conclusion As the demand for safeguarding sensitive data increases, organizations need effective solutions that support data masking capabilities. Two key areas where data masking is of prime importance are ensuring compliance with data regulations and protecting data while it is transferred to different environments during testing. While there are several readily available tools for data masking, some datasets require specialized solutions. Customized data masking tools can help organizations hide source data using encryption and business rules, allowing safe transfer while adhering to various global regulatory norms. This not only saves manual effort during testing but averts huge losses through financial penalties and reputational damages arising from data breaches.<|endoftext|>References 1. 
https://securityintelligence.com \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Infosys Test Automation Accelerator \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n WHITE PAPER HOW TO ENSURE DATA QUALITY DURING DATA MIGRATION Naju D Mohan, Delivery Manager \n\n---\n\n Page: 2 / 4 \n\n---\n\n Introduction In today\u2019s business world, change is the only constant and changes make their appearances in various forms, some of which are: \u2022 Mergers and acquisitions \u2022 New compliance requirements \u2022 New Package implementations \u2022 Migration to new technologies such as the cloud \u2022 Big data programs Being data driven, the business has to upgrade and keep its intelligence up- to-date to realize the benefits of these changes. So in short, all these changes result in data migrations. Most of the time, it is assumed that data migration is an IT problem 2. All the visible changes and the actions lie with the IT team, so the business moves on, putting the entire burden of data migration management on the IT team. Mergers and acquisitions and compliance requirements clearly stand out as having its origin with the business team. So does the decision to implement a CRM, loyalty or HR package with its beginning at the business department. The need to optimize operating costs and make intelligent decisions and act in real-time, leads the business to migrate to cloud and embark on big data programs. But the onus of the migration management, often, lies with the IT team.<|endoftext|>It must be clearly understood that any data migration without the business leading the program has a high rate of failure. Business has to not just care about data migration but command it.<|endoftext|>Who is the Primary Owner? According to Gartner, 83% of the data migration programs fail to meet expectations, running over time and budget1.<|endoftext|>Some key reasons for this are: 1. 
Poor Understanding About Data Migration Complexity \u2022 The focus on data migration is lost in the excitement of the new package implementation, migration to cloud or big data initiatives \u2022 Most often, it is assumed that data fits one-to-one into the new system \u2022 All the attention is on the implementation of the new business processes, with little or no focus on data migration 2. Lack of Proper Attention to Data \u2022 Lack of data governance and proper tools for data migration can impact the quality of data loaded into the new system \u2022 Mergers and acquisitions can introduce new data sources and diverse data formats \u2022 Huge volumes of data may force us to overlook whether the data is still relevant for the business 3. Late Identification of Risks \u2022 Poor data quality of the source systems and lack of documentation or inaccurate data models would be identified late in the migration cycle \u2022 Lack of clarity on the job flows and data integrity relationships across source systems would cause data load failures Why Such a High Failure Rate for Data Migration Programs? External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n An innovative data migration test strategy is critical to the success of the change initiatives undertaken by the business. The test strategy should be prepared in close collaboration with the business team, as it is a vital stakeholder that initiated the change resulting in data migration. The two principal components which should be considered as part of the test strategy are: 1. Risk-Based Testing The data volumes involved in data migration projects emphasize the need for risk-based testing to provide optimum test coverage with the least risk of failure. A master test strategy can be created through proactive analysis with the business and third parties. Tables can be prioritized and bucketed based on the business criticality and sensitivity of data. 
A composite key agreed with the business can be used to select sample rows for validation in tables with billions of rows. 2. Data Compliance Testing It is very important that the quality assurance (QA) team is aware of the business requirements that necessitated the data migration, because the change may have been made to meet new government regulations or compliance requirements. The test strategy must have a separate section to validate that the data meets all compliance regulations and standards such as Basel II, Sarbanes-Oxley (SOX), etc. Is There a Right Validation Strategy for Data Migration? A Test Approach Giving Proper Attention to Data Data migration, as mentioned earlier, is often a by-product of a major initiative undertaken by the company. So in a majority of scenarios, there would be an existing application which was performing the same functionality. It is suitable to adopt a parallel testing approach, which saves the effort spent understanding the system functionality. The testing can be done in parallel with development in sprints, following an agile approach to avoid the risk of failure at the last moment.<|endoftext|>1. Metadata Validation Data migration testing considers information that describes the location of each source, such as the database name, filename, table name, field or column name, and the characteristics of each column, such as its length and type, as part of metadata. Metadata validation must be done before the actual data content is validated, which helps in the early identification of defects which could be repeated across several rows of data.<|endoftext|>2. Data Reconciliation Use automated data comparison techniques and tools for column-to-column data comparison. There could be duplicate data in legacy systems, and it has to be validated that this is merged and exists as a single entity in the migrated system. 
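A column-to-column comparison that also validates the merge of legacy duplicates can be sketched as follows. This is an illustrative sketch only; the key and column names are invented, and real migrations would typically run such checks through SQL or a data comparison tool:

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target record sets column by column.
    Duplicate source rows sharing the same key are expected to
    appear as a single merged entity in the target."""
    src = {}
    for row in source_rows:          # collapse legacy duplicates by key
        src.setdefault(row[key], row)
    tgt = {row[key]: row for row in target_rows}
    mismatches = []
    if len(tgt) != len(src):         # row-count reconciliation
        mismatches.append(("row_count", len(src), len(tgt)))
    for k, s_row in src.items():
        t_row = tgt.get(k)
        if t_row is None:
            mismatches.append((k, "missing_in_target", None))
            continue
        for col, s_val in s_row.items():   # column-to-column comparison
            if t_row.get(col) != s_val:
                mismatches.append((k, col, (s_val, t_row.get(col))))
    return mismatches

# Duplicate legacy record (id 1) must exist once in the migrated system
legacy = [{"id": 1, "name": "Ann"}, {"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
migrated = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
assert reconcile(legacy, migrated, "id") == []
```

An empty result means every merged source entity was found intact in the target; any tuple returned pinpoints the key and column that failed reconciliation.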
Sometimes the destination data stores do not support the data types from the source, and hence the storage of data in such columns has to be validated for truncation and precision. There could be new fields in the destination data store, and it has to be validated that these fields are filled with values as per the business rule for the entity.<|endoftext|>Benefits A well thought-out data migration validation strategy helps make the data migration highly predictable and paves the way for a first-time-right release. Regular business involvement helps to maintain the testing focus on critical business requirements. A successful implementation of the shift-left approach in the migration test strategy helps identify defects early and saves cost.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 4 \n\n---\n\n Case Study: Re-Platforming of Existing HP NEOVIEW Data Warehouse to Teradata The Client One of the largest supermarket chains in the United Kingdom which offers online shopping, DVD rentals, financial services, and multiple store locations.<|endoftext|>The Objectives \u2022 To complete the re-platform of HP Neoview to Teradata and re-platform associated services before HP discontinued support for Neoview \u2022 To migrate the existing IT business services currently operating against a Neoview data warehouse onto a Teradata warehouse with minimal disruption \u2022 To improve the performance of current Ab-Initio ETL batch processes and reporting services using Microstrategy, SAS, Pentaho, and Touchpoint The QA Solution The validation strategy was devised to ensure that the project delivered a like-for-like \u2018lift-and-shift\u2019. This project had environmental challenges and dependencies throughout the entire execution cycle. The SIT phase overcame all the challenges by devising strategies that departed from the traditional testing approach in terms of flexibility and agility. 
The testing team maintained close collaboration with the development and infrastructure teams while retaining its independent reporting structure. The approach was to maximize defect capture within the constraints placed on test execution.<|endoftext|>The plan was to test individual tracks independently in a static environment and then run an end-to-end SIT where all the applications / tracks were integrated. Testing always prioritized key business functions such as sales transaction management, merchandise and range planning, demand management, inventory management, price and promotion management, etc.<|endoftext|>The Benefits \u2022 15% reduction in effort through automation using in-house tools \u2022 100% satisfaction in test output through flexibility and transparency in every testing activity, achieved through statistical models to define the acceptance baseline End Notes 1. Gartner, \u201cRisks and Challenges in Data Migrations and Conversions,\u201d February 2009 2. https://www.hds.com/go/cost-efficiency/pdf/white-paper-reducing-costs-and-risks-for-data-migrations.pdf \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. 
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: End-to-end test automation \u2013 A behavior-driven and tool-agnostic approach \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n PERSPECTIVE END-TO-END TEST AUTOMATION \u2013 A BEHAVIOR-DRIVEN AND TOOL- AGNOSTIC APPROACH Anand Avinash Tambey Product Technical Architect, Infosys Abstract In today\u2019s fast changing world, IT is under constant pressure to deliver new applications faster and cheaper. The expectation from the quality assurance (QA) organization is to make sure that all the applications are tuned to deliver to every rising user expectation across devices, locations, and typically, at no additional cost. And with an exponential growth in the diversity and number of end users in almost all sectors, requirements are fairly fluctuating and demanding too.<|endoftext|>\n\n---\n\n Page: 2 / 8 \n\n---\n\n Let us discuss these approaches in detail to face the above challenges. How to face these challenges Challenges l Technical complexity l Infrastructure, licensing, and training costs l Late involvement of users l Late involvement of testers Approaches l De-skilling l Using open source stack l Using behavior-driven techniques l Utilizing the testers and users effectively This fast-paced software engineering advancement is also posing challenges to software engineers to build an ecosystem that enables rapid prototyping and design, agile development and testing, and fully automated deployment. For the QA community, this translates to a need to maximize automation across all stages of the software engineering and development life cycle and more importantly, do it in an integrated fashion. Consequently, `extreme automation\u2019 is the new mantra for success. And there is more. 
According to the 2014 State of DevOps report, high-performing organizations are still deploying code 30 times more frequently, with 50 percent fewer failures than their lower-performing counterparts. High IT performance leads to strong business performance, helping boost productivity, profitability, and market share. The report counts automated testing as one of the top practices correlated with reducing the lead time for changes.<|endoftext|>However, automated testing poses its own set of challenges. The latest technology stacks offer multiple choices of test automation tools, platform-specific add-ins, and scripting languages. There is no inherent support available for a generic, tool-agnostic, and scriptless approach with easy migration from one tool to another. Therefore, a significant investment in training, building expertise, and script development is required to utilize these tools effectively. The cost and associated challenges inadvertently affect time-to-market, profitability, and productivity, although this also creates an opportunity to resolve the issues using a combination of an innovative tool-agnostic approach and the latest industry practices such as behavior-driven development (BDD) and behavior-driven testing (BDT).<|endoftext|>Business challenges A persistent need of businesses is to reduce the time between development and deployment. QA needs to evolve and transform to facilitate this. And this transformation requires a paradigm shift from conventional QA in terms of automation achieved in each life cycle stage and across multiple layers of architecture. Technical complexity The technology and platform stack is not limited to traditional desktop and the web for current application portfolios. 
It extends to multiple operating systems (platforms), mobile devices, and the newest responsive web applications.<|endoftext|>Infrastructure, licensing, and training costs To test diverse applications, multiple test automation tools need to be procured (license cost), a testing environment needs to be set up (infrastructure), and the technical skills of the team need to be brought up to speed with training and self-learning / experimentation (effort). Late involvement of users The end user is not involved in the development process until acceptance testing and is unaware of whether the implemented system meets her requirements. There is no direct traceability between the requirements and the implemented system features. Late involvement of testers Testing and automation also need to start much earlier in the life cycle (Shift-Left), with agility achieved through the amalgamation of the technical and domain skills of the team, as well as the end user. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n Using open-source stack To reduce the cost of commercial tool licenses and infrastructure, utilize open-source tools and platforms.<|endoftext|>De-skilling Easy modeling of requirements and system behaviors, an accelerated framework, and automated script generators reduce the learning curve and the dependency on expert technical skills.<|endoftext|>Using behavior-driven techniques Behavior-driven development and testing (BDD and BDT) is a way to reduce the gap between the end user and the actual software being built. It is also called \u2018specification by example\u2019. It uses natural language to describe the \u2018desired behavior\u2019 of the system in a common notation that can be understood by domain experts, developers, testers, and the client alike, improving communication. 
It is a refinement of practices such as test-driven development (TDD) and acceptance test-driven development (ATDD).<|endoftext|>The idea behind this approach is to describe the behaviors of the system being built and tested. The main advantage is that the tests verifying the behaviors reflect the actual business requirements / user stories and generate the live documentation of the requirements, that is, successful stories and features as test results. Therefore, the test results generated can be read and understood by a non-technical person, such as a project sponsor, a domain expert, or a business analyst, and the tests can be validated. Utilizing testers and users effectively Our accelerated automation approach provides a simple modeling interface for a scriptless experience and thereby utilizes non-technical staff effectively. It introduces the `outside-in\u2019 software development methodology along with BDT, which has changed the tester\u2019s role dramatically in recent years, and bridges the communication gap between business and technology. It focuses on implementing and verifying only those behaviors that contribute most directly to the business outcomes.<|endoftext|>Solution approach Our solution approach is threefold, to resolve challenges in a holistic way. It applies a behavior-driven testing approach with a tool-agnostic automation framework, while following an integrated test life cycle vision. It ensures that business users and analysts are involved.<|endoftext|>It provides the flexibility of using any tool chosen and helps save cost and effort. It also provides a simple migration path to switch between tools and platforms, if such a case arises. 
With an integrated test life cycle approach, it ensures seamless communication between multiple stakeholders and leverages industry-standard tools.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 8 \n\n---\n\n Finally, it introduces automation in early stages to realize the benefit of Shift-Left.<|endoftext|>Behavior-driven test (BDT) To facilitate real-time traceability of user stories / requirements and to aid live documentation, we have implemented a BDT approach across multiple Infosys projects as described below.<|endoftext|>This approach acted as a single point of continuous interaction between the tester and the business users. The testers divide the user story into various scenarios in a feature file.<|endoftext|>\u2022 These scenarios are written using the Gherkin language. They include the business situation, the pre-conditions, the data to be used, and the acceptance criteria.<|endoftext|>\u2022 The end user signs off the features / scenarios. This gives the user control to execute and validate the scenarios based on the data as per the user\u2019s need, and to bring up feature reports / dashboards. \u2022 The underlying technical implementation is abstracted from the business user.<|endoftext|>\u2022 The tester creates the underlying test scripts for the scenarios, which could be a business layer test script, a service test script, or a UI automated test script.<|endoftext|>\u2022 The tool then converts the scenarios to \u2018step definitions\u2019, which act as the binder between the test scripts and the scenarios. 
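The binding described above, where step definitions connect Gherkin scenario lines to underlying test scripts, can be illustrated with a small, library-free Python sketch. In practice a BDD tool such as Cucumber (or a Python equivalent like behave) performs this matching; the scenario text and step functions here are invented for illustration:

```python
import re

# A scenario as it might appear in a feature file, written in Gherkin
SCENARIO = """\
Given the account balance is 100
When the account holder withdraws 40
Then the remaining balance should be 60
"""

STEPS = []  # registry of (pattern, function) step definitions

def step(pattern):
    """Register a step definition binding a Gherkin line to a test script."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

account = {"balance": 0}

@step(r"Given the account balance is (\d+)")
def set_balance(amount):
    account["balance"] = int(amount)

@step(r"When the account holder withdraws (\d+)")
def withdraw(amount):
    account["balance"] -= int(amount)

@step(r"Then the remaining balance should be (\d+)")
def check_balance(expected):
    assert account["balance"] == int(expected)

# The runner matches each scenario line to its step definition and executes it
for line in SCENARIO.strip().splitlines():
    for pattern, fn in STEPS:
        match = pattern.fullmatch(line.strip())
        if match:
            fn(*match.groups())
            break
    else:
        raise AssertionError(f"No step definition for: {line}")
```

The scenario text stays readable by a business user, while the captured groups in each pattern carry the test data into the underlying scripts, which is exactly the single point / interface role the step definitions play.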
This ensures that a single point / interface is used to execute any type of tests.<|endoftext|>Figure: The BDT workflow \u2013 feature files and step definitions connect the tester, the user story, and the end user to a UI testing framework (BDT), giving zero distance with the user, early awareness of feature success / failure, Shift-Left test automation from day one, and reports in the user\u2019s language. Figure: A Cucumber-JVM Jenkins report plugin feature overview for a build, showing passing and failing statistics per feature, scenario, and step. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 5 / 8 \n\n---\n\n Tool- / platform-agnostic solution (BDD release) Tool-agnostic approach To remove dependencies on technical skills, tools, and platforms, our solution proposes modeling of system behaviors using generic English-like language via an intuitive user interface. This model would be agnostic to any specific tool or UI platform, and reusable.<|endoftext|>This model will be translated into a widely acknowledged format, XML, and will act as an input to generate automation scripts for specific tools and platforms. 
Finally, it would integrate with a continuous integration platform to achieve end-to-end automation of the build-test-deploy cycle.<|endoftext|>Figure: BDD modeling UI \u2013 a scriptless, tool- / platform-agnostic implementation in which the business analyst / end user models user stories as features, fields, and step definitions; a script generator produces automation scripts for tools such as QTP, Selenium, SAHI, and Protractor, which the tester executes against the AUT under a continuous integration build-test-deploy cycle. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 6 / 8 \n\n---\n\n Integrated test life cycle The role of the traditional tester and end users is changing in the era of DevOps and Shift-Left. The integrated-solution approach enables a larger stakeholder base to contribute towards the quality of the system under development. 
It also ensures the satisfaction of stakeholders via early validation and early feedback of system progress while using industry-standard toolsets seamlessly.<|endoftext|>Benefits Reduced time-to-market \u2022 Shift-Left, early automation, and early life cycle validation \u2022 Single-click generation and execution of automated scripts Reduced cost \u2022 40\u201360 percent reduction in effort for automated test case generation over manual testing \u2022 Detailed error reporting reduces defect reporting effort considerably \u2022 Easy maintenance of requirements, stories, features, and the automated test suite \u2022 No additional cost involved in building integration components for test management tools (HP-ALM) \u2022 The agnostic approach works for a broad range of applications irrespective of tools, technology, and platform Improved quality \u2022 Enhanced business user participation and satisfaction due to the live documentation of features and user stories, available at their fingertips \u2022 Developer, tester, and client collaboration possible due to a common language \u2022 High defect detection rates (95\u201399 percent) due to high test coverage Figure: Test life cycle management view \u2013 across requirement analysis, test design, test execution, and test reporting, the BA generates requirements and features and designs business processes in a BDD & UML modeling tool, the automation expert enhances automation libraries, the tester generates and executes automation scripts through the tool-agnostic, platform-agnostic automation accelerator, and the QA manager / end user receives feature and metrics reports, with integration to popular tools such as HP ALM/QC, IBM RQM, JIRA, and Jenkins. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 7 / 8 \n\n---\n\n References https://puppetlabs.com/sites/default/files/2014-state-of-devops-report.pdf http://www.ibm.com/developerworks/library/a-automating-ria/ http://guide.agilealliance.org/guide/bdd.html http://www.infosysblogs.com/testing-services/2015/08/extreme_automation_ the_need_fo.html Conclusion Our solution approach is the first step towards reducing the complexity of test automation and making it more useful for the end user by providing early and continuous feedback on the incremental system development. 
Moreover, it advances automation at every level to achieve rapid development and faster time-to-market objectives.<|endoftext|>With the advent of multiple technologies and high-end devices knocking at the door, using a tool- and platform-agnostic approach will increase overall productivity while reducing the cost of ownership. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} @@ -18,13 +19,16 @@ {"text": "# Infosys Whitepaper \nTitle: Big data testing for a leading global brewer \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER MOVING FRAGMENTED TEST DATA MANAGEMENT TOWARDS A CENTRALIZED APPROACH Abstract Test Data Management (TDM) ensures managing test data requests in an automated way to ensure a high degree of test coverage by providing the right data, in the right quantity, and at the right time, in non-production environments. Automated TDM service facilitates test data management across test environments through a structured approach of data subsetting, cleansing, gold copy creation, data refresh, and sensitive data masking. 
Typically, a centralized TDM system with well-defined processes is more effective than the traditional manual or decentralized approach, but in some cases, a decentralized approach is adopted. This paper takes a deeper dive into the considerations for the centralization of TDM processes within enterprise IT.<|endoftext|>\n\n---\n\n Page: 2 / 8 \n\n---\n\n Introduction In most organizations where TDM is in its infancy, test data-related activities are done by the individual project teams themselves. There will not be a dedicated team identified or a process defined to handle test data requests. Such projects with a primitive TDM approach have several drawbacks: \u2022 Lack of defined ownership for the test environment and test data setup: results in unintentionally losing the test data setup or data overstepping \u2022 Unavailability of data setup for testing end-to-end scenarios: Lack of data setup between inter-dependent and third-party applications \u2022 Lack of referential integrity defined in the databases: Absence of primary-foreign key relationships defined in the database makes it difficult to identify related tables and generate the correct test data set \u2022 Insufficient data available for performance load testing: Manually generating bulk data is a tedious task and less feasible \u2022 Increased number of defects due to incorrect test data: Leads to re-work and time lost unnecessarily analyzing issues caused by incorrect test data used for testing \u2022 Outdated test data in the QA database: Periodic refresh of test data does not happen from production \u2022 Inability to provision data since data is unavailable: Lack of the mechanism required for generating synthetic data \u2022 Risk of exposing sensitive data to testing teams: Sensitive fields need to be masked before provisioning for testing \u2022 Multiple copies of data: Storage costs can be reduced by maintaining only the required gold copies and refreshing and reusing gold copies after major releases 
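The core TDM activities named in the abstract above, data subsetting, sensitive data masking, and gold copy creation, can be sketched as follows. This is an illustrative sketch with an invented record layout and a trivial masking rule; real TDM tools drive these steps from workflow configuration:

```python
import copy

# Stand-in for a production table (invented layout)
PRODUCTION = [
    {"id": 1, "region": "EU", "name": "Ann Lee", "ssn": "123-45-6789"},
    {"id": 2, "region": "US", "name": "Bob Roy", "ssn": "987-65-4321"},
    {"id": 3, "region": "EU", "name": "Cal Kim", "ssn": "555-44-3333"},
]
SENSITIVE = {"name", "ssn"}  # fields to mask before provisioning

def subset(rows, predicate):
    """Data subsetting: keep only the rows a test actually needs,
    copied so production data is never modified."""
    return [copy.deepcopy(r) for r in rows if predicate(r)]

def mask(rows):
    """Sensitive data masking: replace sensitive values so testers
    never see real production data."""
    for row in rows:
        for field in SENSITIVE & row.keys():
            row[field] = "X" * len(row[field])
    return rows

# Gold copy: a masked subset, to be refreshed periodically from production
gold_copy = mask(subset(PRODUCTION, lambda r: r["region"] == "EU"))
```

The gold copy can then be restored into any non-production environment on demand, which is what allows a TDM team to serve repeated test data requests without going back to production each time.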
Having a well-defined practice for handling all the test data-related requirements across all non-production environments in an organization is the essence of TDM. Aimed at addressing all the above-stated issues, it will bring in more control and make TDM more effective.<|endoftext|>Based on the TDM requirement type, organizations can opt for either a decentralized or a centralized approach. This paper gives a detailed view of both approaches and highlights how the centralized approach is more efficient and beneficial.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n Centralized TDM deals with consolidating the test data provisioning for all non-production environments across the organization. It provides a systematic approach to analyze and provision test data.<|endoftext|>Pros \u2022 A well-established TDM team with a workflow-based mechanism for managing test data requests \u2022 Reduced latency in provisioning test data with quick turnaround time \u2022 Automated end-to-end approach with tools and processes \u2022 Reduced infrastructure cost by storing only the required data for provisioning in the gold copy database \u2022 Reduced risk of incorrect test data, resulting in fewer defects \u2022 Resolution of data overstepping issues by the TDM team \u2022 Periodic refresh of the gold copy makes the latest data available for testing by the QA team \u2022 Reusable masking configurations and test data generation scripts provide quick turnaround time \u2022 Easy handling of complex end-to-end test scenarios that require data setup across heterogeneous data sources having federated relationships, through centralized test data management \u2022 Creation of bulk data which is relationally intact for non-functional testing requirements, achieved using automated solutions \u2022 Varied techniques available for creating synthetic data in scenarios where source data is not available for provisioning Cons \u2022 Considerable time 
and effort is required to consolidate the TDM across various portfolios \u2022 High knowledge acquisition effort required to understand the different application data models \u2022 Sporadic bottlenecks and dependency on the TDM team in case of high workload from all LOBs Centralized TDM External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 8 \n\n---\n\n Decentralized TDM It is not necessary that all applications in different portfolios in the organization\u2019s landscape be covered under the consolidated TDM umbrella. There are instances where some applications can follow the decentralized TDM approach. This is mostly determined by the level of integration between the applications, the technologies supported, data sensitivity, environment constraints, etc. For example, data in HR, infrastructure applications, etc., may be independent and not related to marketing, sales, inventory, or corporate data. These systems, hence, can adopt a decentralized TDM approach and need to be handled outside the centralized umbrella.<|endoftext|>Pros \u2022 Minimal effort required to set up TDM for individual applications \u2022 Good understanding of the respective application data models, which makes the team capable of addressing test data requests quickly Cons \u2022 Multiple copies of data without ownership, because individual teams store separate copies of production data \u2022 Unmasked sensitive data in non-production environments can lead to a security breach \u2022 Less uniformity in standards and processes \u2022 Increase in data overstepping issues \u2022 Minimal automation may be present, with a lack of coordinated processes \u2022 Limitations in setting up data across multiple data sources due to decentralized systems. 
Data set up in one application may not be in sync with other inter-dependent applications. (TDM landscape: sales, corporate, HR, infrastructure, inventory, and marketing portfolios, split between independent TDM and consolidated TDM.) Centralized TDM implementation approaches Primarily, there are two approaches for implementing centralized test data management within an organization: \u2022 Big Bang approach: In this approach, all major applications under the TDM scope in the organization are identified, test data requirements across applications are analyzed, and gold copies for these applications are created in one go. A TDM team is set up to address test data needs for all the applications. This approach will take considerable time for the initial setup, and knowledge of the application stack across the organization\u2019s portfolio is a must. Another key challenge with this approach is keeping up with the database (DB) changes happening in production during the initial setup \u2022 Incremental approach: In this approach, based on the business requirements, TDM is established for an application or a prioritized set of applications. Separate test data management implementations are carried out, which can be progressively integrated. The TDM team addresses the test data needs as soon as the gold copies for the applications are set up. In this approach, TDM is more manageable and can reap early benefits. TDM set up for a smaller set of applications takes less time compared to the Big Bang approach. A phased approach for TDM implementation Centralized and automated test data management implementations can follow a phased approach.
Each stage has a defined set of activities to achieve its goals. These stages are: \u2022 Analysis \u2022 Design \u2022 Implementation \u2022 Steady State From the initial assessment phase, the implementation moves to a stabilized state, expanding TDM services to other domains and portfolios in the organization, and working toward continuous improvement along the way. The timelines proposed in the diagram are highly indicative. The time duration for each phase will depend on factors like: \u2022 TDM requirement complexity \u2022 Number of portfolios or applications involved \u2022 Knowledge or understanding of the application landscape or database models Figure 1: TDM Landscape \n\n---\n\n Page: 5 / 8 \n\n---\n\n Figure 2: Phased TDM Approach When to go for centralized TDM? \u2022 Applications or technologies used are mostly compatible with tools in the market \u2022 The scope of TDM for applications across various portfolios continues throughout the life cycle of the application \u2022 Incoming TDM requests for an application or application clusters are fairly high \u2022 Technologies are widely supported and not disparate \u2022 A high number of inter-dependent systems require data set up across systems for end-to-end testing When to go for decentralized TDM? \u2022 The nature of portfolios or departments within the organization is highly decentralized \u2022 A specific TDM process is required for a prioritized set of applications within a short span of time \u2022 The scope of TDM is limited to the project and does not continue after the project is complete \u2022 Disparate or obsolete technologies used in the project are not supported by common TDM tools \u2022 Limited number of dependent / external applications \u2022 Need for test data provisioning is very low and the request flow is manageable Common TDM challenges and resolutions 1.
Inconsistent data relationships A well-defined data relationship between database objects is a key factor for data subsetting, masking, and data provisioning. It is often observed that, in the case of legacy applications, relationships are not present in the database layer. The business rules and logical constraints may be applied at the application level, but will be poorly defined at the database level. Logical database model architectures may not be available in most cases. Impact \u2022 Data subsetting, data masking, and data provisioning get affected \u2022 Data integrity will not be maintained Resolution \u2022 Understand the application and database structure, the relevance of tables, and how they are related, with the help of an SME / DBA \u2022 Analyze and understand the database structure using data model artifacts \u2022 Validate the logically-related entities and confirm with a business analyst 2. Unclear test data requirements Teams requesting data sometimes lack information about which data sources would have the related data that needs to be set up. In some scenarios, test data requirements can be very complex, like testing an end-to-end scenario with data spread across multiple databases or across multiple tables. Impact \u2022 Inaccurate test data Resolution \u2022 Understand the requirement from a QA perspective \u2022 Understand the database entities involved and their relationships 3. Lack of application knowledge System or application knowledge, especially of the data sources under the TDM scope, is a prerequisite for the TDM team.
If teams possess a limited knowledge about the application, External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 6 / 8 \n\n---\n\n it will result in writing incorrect test cases, raising ambiguous test data requirements, and finally, provisioning inaccurate data.<|endoftext|>Impact \u2022 Inaccurate test data \u2022 Increased defects due to incorrect test data Resolution \u2022 Understand the application with the help of SMEs \u2022 Understand the database entities involved and the relationships 4. Corrupted gold copy Most projects will have a gold copy database available from where data will be provisioned to the lower environments. If the gold copy is not refreshed periodically, or the data in the gold copy has been tampered with, it can cause issues while provisioning data.<|endoftext|>Impact \u2022 Inaccurate test data Resolution \u2022 Periodically refresh gold copy database \u2022 Restrict access to gold copy database 5. Data overstepping If the same set of test data is used by multiple teams for testing, it can lead to conflicts and the test results will not be as expected.<|endoftext|>Impact \u2022 Affects test execution \u2022 Incorrect test results \u2022 Rework in test data provisioning and test execution Resolution \u2022 Data has to be reserved \u2022 Centralized TDM team can handle the test data requirements 6. Identifying correct sensitive fields and masking techniques While masking any application database, it is important that the correct sensitive fields are identified for masking. Also, what is important is that relevant masking techniques are applied to these fields. For example, email id should be masked in such a way that the email id format is retained. Otherwise, while using the masked email id, it might break the application. 
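The email example above can be sketched concretely. The snippet below is a rough illustration only, not a prescribed implementation: the function names and the hash-based scheme are assumptions. It shows format-preserving masking, where only the local part of an address is replaced so the result still parses as an email, using a deterministic transform so the same input always masks to the same output.

```python
import hashlib

def mask_value(value: str, prefix: str = "mask") -> str:
    # Deterministic: the same input always masks to the same output,
    # so a masked value stays consistent wherever it is referenced.
    digest = hashlib.sha256(value.encode("utf-8")).hexdigest()[:10]
    return f"{prefix}_{digest}"

def mask_email(email: str) -> str:
    # Format-preserving: only the local part is masked, so the result
    # still looks like a valid email address to the application.
    local, _, domain = email.partition("@")
    return f"{mask_value(local)}@{domain}"

masked = mask_email("jane.doe@example.com")
# The masked value keeps the '@domain' structure intact.
```

Because the transform is deterministic, the same call applied to a value in two different tables yields identical masked output, which is what keeps related records usable after masking.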
Another point to consider: while masking primary key columns, the masking has to be applied consistently to the child tables where those primary key columns are referenced. Impact \u2022 Data inconsistency across tables \u2022 Unnecessarily masked data Resolution \u2022 Identify sensitive fields belonging to categories such as PII, PHI, PCI, financial data, etc. \u2022 Apply relevant masking techniques that preserve the format of the data Best practices Some of the best practices that can be adopted while implementing test data management processes in projects are listed below: \u2022 Automate TDM processes with reusable templates and checklists \u2022 Improve test data coverage and test data reuse by provisioning and preserving the right data \u2022 Analyze TDM metrics and take corrective actions \u2022 Automate data refreshes to the gold copy using scripts \u2022 Implement batch mode data masking to improve performance without exposing sensitive data to testing teams \u2022 Use test data for configuring the masking rules, which can be replaced with production data for actual execution; thus, production data is not exposed to the execution team \u2022 Maintain reusable configuration scripts for masking similar data (for example, similar data for a different region) \u2022 Develop automation scripts to automate any manual TDM-related activities \u2022 Develop data relationship architecture diagrams for the most commonly used provisioning tables, which can be used as a reference \n\n---\n\n Page: 7 / 8 \n\n---\n\n Summary Reduced cost and improved management with faster time-to-market are the key points for any successful program. Centralized and automated test data management provides an organized approach to managing test data requirements across the organization in a better and more efficient way.
Only the required masked, subsetted, and reusable data sets are stored centrally as gold copies, which are used for provisioning by the testing teams. Most of the TDM tools available in the market offer web-based solutions, which act as a single interface for both the testing and provisioning teams. Testing teams can place the test data request and provisioning teams can address the request from a single portal. All test data requests are tracked using a single solution. A centralized, automated TDM system with streamlined processes introduces increased accuracy and predictability to the entire testing process. Implementing centralized test data management is certainly beneficial compared to the decentralized approach. Glossary Acronym Definition TDM Test Data Management PII Personally Identifiable Information PHI Personal Health Information PCI Payment Card Information SME Subject Matter Expert PoC Proof of Concept Case study \u2013 centralized TDM for a leading pharmacy client Overview The client has a complex IT landscape with data spread across multiple portfolios including marketing, sales, corporate, pharmacy, and supply chain. Some of the applications across the portfolios have a federated relationship with related data.
The TDM service engagement requirement was to establish a well-defined TDM process and governance, which will address all the test data-related requests for the projects under different portfolios, gradually expand the TDM services to newer portfolios, and finally consolidate under the same umbrella.<|endoftext|>Problem statement \u2022 Identify the test data and data masking requirements in different portfolios and application databases \u2022 Perform gap analysis for the existing TDM processes \u2022 Establish a defined test data management process and governance \u2022 Implement an automated TDM process using the right tools \u2022 Test data provisioning for functional, automation, and performance testing teams \u2022 Metrics-based approach for the evaluation of test data management implementation Challenges \u2022 Complex IT landscape with heterogeneous data source types \u2022 Lack of defined test data management processes / strategy \u2022 Manual TDM activities for data subsetting and masking \u2022 Lack of integrated data across systems \u2022 Sensitive data being moved to a non-production environment without masking \u2022 Huge cycle time for generating test data, impacting test execution schedules Solution approach \u2022 Established a centralized TDM team to provision test data for functional and non-functional testing \u2022 Deployed a web-based, self-service tool for the testing teams to place the data request and provisioning \u2022 Masked data is provisioned to testing teams ensuring compliance to PIPEDA (Personal Information Protection and Electronic Documents Act) \u2022 Established automated TDM processes and capabilities across portfolios \u2022 End-to-end testing made easy by synching up test data across interdependent applications Benefits / value-adds \u2022 20% reduction in test data provisioning cycle time \u2022 Production data not exposed to testing teams \u2022 Repository of reusable masking and test data generation scripts \u2022 
Automated TDM services reduced test data related defects to zero, resulting in quality deliverables. (TDM service rollout map: the supply chain, pharmacy, corporate, sales, and marketing portfolios, with interdependent data, were rolled out progressively over a period of 2 to 20 months.) \n\n---\n\n Page: 8 / 8 \n\n---\n\n Seema Varghese is a Technical Test Lead with Infosys, having 11 years of IT experience, including leading teams and developing testing expertise in different domains (retail, pharmacy, and telecom). She has worked in data migration and data warehouse testing projects. She also has experience handling TDM and data masking projects. \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: A framework to increase ROI through quality data \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER A FRAMEWORK TO INCREASE ROI THROUGH QUALITY DATA Kuriakose K. K., Senior Project Manager \n\n---\n\n Page: 2 / 8 \n\n---\n\n External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n The perception of data across organizations is changing.
Data is no longer just one of the components of business. It has turned out to be \u2018the business.\u2019 Today, information is viewed as a lifeline by senior management and by many other departments for decision making, customer interactions, and optimum operations. An organization\u2019s success heavily depends on how well it is able to understand and leverage its data. Unfortunately, there continues to be a high amount of inaccurate data within enterprises today, despite the multiple solutions and systems developed to counter it. Nowadays, a major portion of the data used for decision making is collected from external sources through a variety of channels. This often results in poor data quality, which has a negative effect on an organization\u2019s ability to make decisions, run the business, budget, market, and satisfy customers. Organizations that fail to control the quality of their data are unable to sustain themselves in today\u2019s data-centric world. Any data-driven effort needs a strong focus on data quality, which implies that organizations looking for success must prioritize data accuracy and accessibility. It is essential for them to interact with consumers, vendors, suppliers, and third parties in countless ways, by exploring diverse new methods of communication. Information is key for areas like inventory management, shipment, and marketing. The objective of this paper is to analyze the principal challenges with data across a few key business functions and discuss a framework that can bring down the erroneous data getting pumped in and out of an enterprise system. Marketing: If we have accurate information on who our customers are and what their needs are, we have hit gold in marketing terms. This is easier said than done, as today we have neither accurate nor sufficient information about customers.
We can surely gather information about customers from various sources like the website, physical store, mobile application, call center, face-to-face interactions, catalogues, etc. But one can never be sure whether these are the same or different sets of people consuming your services. Nor is there any certainty that the information is accurate, as most of these channels accept data directly with limited to no validation. Now let\u2019s assume that we have done all possible validations and identified the target group of customers; there is still no defined method of reaching them: should it be through email, telephonic conversations, social media, the physical address, etc.? Let\u2019s drill down into one of these mediums: the physical address. The catch: the customer has many addresses, such as those registered for the credit card, the savings bank account, the driving license, and office purposes. Shipping The current status of shipments is constantly added to enterprise systems through shipping vendors like DHL Express, DHL Parcel, United States Postal Service (USPS), United Parcel Service (UPS), FedEx, Canada Post, LaserShip, OnTrac, and Hermes. Most of these vendors do not even share shipment history, hence organizations are forced to store and link this continuous flow of information. Many times, incorrect data gets fed into the system through these external sources. We see issues like the order shipment date falling before the order booking date. This results in: \u2022 Orders lost in transit \u2022 Incorrect shipping addresses \u2022 Orders sent to the wrong address \u2022 Shipping wrong items \u2022 Late shipments \n\n---\n\n Page: 4 / 8 \n\n---\n\n Inventory Inventory management can help a manufacturer / supplier improve accuracy, cost savings, and speed. This in turn helps organizations have better control over operations and reduce the cost of goods. Today, most manufacturers are facing challenges in inventory management systems.
A few challenges are listed below: \u2022 Limited standardization in management systems, business users, inventory integration, and movement checkpoints \u2022 Limited inventory reconciliation at regular intervals \u2022 Data discrepancies between demand planning and inventory planning systems \u2022 Improper logging of inventory information \u2022 Inaccurate data fed into forecasting systems Banking Financial organizations are required to meet regulatory compliance requirements according to the law of the land to avoid instances such as the housing crisis. At the same time, data quality issues lead to transparency and accountability problems. Hence, the quality of data for banking needs to be measured along the dimensions of completeness, accuracy, consistency, duplication, and integrity. There is also a need to ensure that the information being shared complies with information privacy and protection laws and regulations. Pharma The pharmaceutical industry receives warnings at regular intervals for falsifying, altering, or failing to protect essential quality data on its drug manufacturing processes and their validation, resulting in huge business risks. According to US Food and Drug Administration (USFDA) regulations, pharma companies are mandated to maintain manufacturing and drug testing data. Many times, issues occur due to human data entry errors and machine errors like data recording failures. Violations of these regulations have even resulted in the shutdown of plants, causing huge losses. Today\u2019s state Many organizations are constantly investing in data quality to improve their efficiency and customer interactions through data insights. A majority of companies suffer from common data errors like incomplete or missing data, outdated information, and inaccurate data. This level of inaccurate data jeopardizes businesses that rely on business intelligence for key decisions.
An organization\u2019s current level of maturity can be assessed from the data quality maturity model given below: Insurance When it comes to the insurance industry, data not only helps run operations, but also helps insurers ensure that claims have the required and correct information. Generally, the following issues are found in claims data: \u2022 Invalid diagnosis codes \u2022 Incorrect pin codes of hospitals \u2022 Incomplete forms with missing crucial data like gender, patients\u2019 stay in hospital, etc. \u2022 Inaccurate age data \n\n---\n\n Page: 5 / 8 \n\n---\n\n Automated data quality framework: All of this calls for a strong quality framework that can validate standard business rules against the processed data coming from external sources into enterprise systems. This framework should be able to report incorrect data and related information. The framework should have easily configurable rules and threshold values that business users can set as simple text directly through a user interface. The framework can connect to almost all kinds of data sources \u2014 mainframes, file systems, relational database management systems (RDBMS), analytical databases such as columnar, massively parallel processing (MPP), and in-memory databases, NoSQL databases, Hadoop, web services, packaged enterprise applications, OLAP applications, software as a service, and cloud-based applications. The details of common business rules are also collected by our subject matter experts (SMEs) in retail, consumer packaged goods (CPG), logistics, manufacturing, banking, healthcare, insurance, life sciences, capital markets, financial services, cards and payments, energy, utilities, communications, and services.
This helped in the creation of a backbone for our standard quality framework where one can add / remove rules according to the specific business need.<|endoftext|>The user can pick a set of business rules and schedule it according to their need. An automated report gets generated which is emailed to the concerned parties. It is recommended to go with open source solution to bring down the cost of development and maintenance of the tool. We have used a combination of tools--Talend and Python scripts for the development. This framework can be based out of other open source solutions like KETL, Pentaho Data Integrator - Kettle, Talend Open Source Data Integrator, Scriptella, Jaspersoft ETL, GeoKettle, CloverETL, HPCC Systems, Jedox, Python, Apatar. The framework can also be enhanced further to carry out data profiling and data cleansing on an \u201cas- needed\u201d basis.<|endoftext|>Infosys Automated data quality framework consists of a configurable data quality (DQ) framework with built-in, reusable rules across domains with the following: \u2022 Standard business rules which can validate the processed data and information from third parties \u2022 Framework reports incorrect data and related information crossing the thresholds \u2022 Capability to easily configure rules and threshold values independently \u2022 Daily automated report generation, post job completion, enables independent operations for business Data quality maturity model Think & Act Local Think Global & Act Local Think Global & Act collectively Think & Act Global Matured Data Governance model \u2022 Limited awareness of data quality \u2022 Certain defined rules for data quality and integration for a specific module based on production issues encountered \u2022 Duplication of data across systems with no established system for data quality validation \u2022 Data inconsistency across systems \u2022 Organizations / programs accepting the impact of inconsistent, inaccurate, or unreliable data \u2022 
Steps initiated to identify corrupt data \u2022 Gains are defined more at the project level \u2022 Establishment of a well-defined and unified data governance model \u2022 Regular checks and reporting of established business rules and data quality at a defined frequency \u2022 Organization shifts toward managing data as a business-critical asset \u2022 System information and data are trusted \u2022 Key metrics for data quality are tracked against the defined variance percentage \u2022 Action items are tracked to closure for any variances beyond the agreed limit \u2022 ROI for data quality projects is tracked \n\n---\n\n Page: 6 / 8 \n\n---\n\n Realizing the return on investment (ROI) for data quality Today, businesses need relevant data to make informed decisions. Decisions and communications based on bad data carry substantial risks to business performance. For any data-driven organization, it is important to ensure that the utmost standards of data quality are met and that the organization has scheduled processes to validate the quality of the data being pumped in and out of the organization. We also need to ensure that a structured methodology is followed in data quality metric definition and its validation at regular intervals.
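The rule-and-threshold style of validation described earlier can be sketched in a few lines. The rule names, record fields, and threshold values below are illustrative assumptions, not part of the Infosys framework itself; the point is simply that each rule pairs a predicate with the failure rate the business tolerates, and any rule whose failure rate crosses its threshold is flagged in the report.

```python
# Illustrative sketch of configurable data quality rules with thresholds.
# Each rule is a predicate plus the failure rate the business tolerates.
rules = [
    {"name": "age_in_range", "check": lambda r: 0 < r.get("age", -1) < 120,
     "threshold": 0.05},  # flag if more than 5% of records fail
    {"name": "pin_code_present", "check": lambda r: bool(r.get("pin_code")),
     "threshold": 0.0},   # no missing pin codes tolerated
]

def run_rules(records, rules):
    report = []
    for rule in rules:
        failures = sum(1 for r in records if not rule["check"](r))
        rate = failures / len(records) if records else 0.0
        report.append({"rule": rule["name"],
                       "failure_rate": round(rate, 3),
                       "breached": rate > rule["threshold"]})
    return report

claims = [
    {"age": 34, "pin_code": "560100"},
    {"age": 250, "pin_code": "560037"},  # invalid age
    {"age": 41, "pin_code": ""},         # missing pin code
]
report = run_rules(claims, rules)
```

A scheduler can run such checks at the agreed frequency and email the resulting report, mirroring the automated daily reporting described above.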
Few possible outcomes of successful implementation of a strong data quality framework are: Marketing: Accurate data helps drive more effective campaigns for the intended target audience Shipping: Cost savings and operational efficiencies achieved with basic address validation and order-related data quality checks Inventory: Faster turnover of stock Insurance: Complete information about client\u2019s risk exposure enabling more accurate decisions on policy costs Banking: Ability to detect fraud patterns and improved customer service Pharma: Gain more compliance as per FDA regulations External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 7 / 8 \n\n---\n\n External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: The future of enterprise test automation \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER RPA: THE FUTURE OF ENTERPRISE TEST AUTOMATION \n\n---\n\n Page: 2 / 8 \n\n---\n\n RPA: The future of enterprise test automation If one word defines the 21st century, it would be speed. 
Never before has progress happened at such a breakneck pace. We have moved from discrete jumps of innovation to continuous improvement and versioning. The cycle time to production is at an all-time low. In this era of constant innovation where the supreme need is to stay ahead of the competition and drive exceptional user experiences, product quality deployed in production is paramount to ensure that speed does not derail the product as a whole. A robust testing mechanism ensures quality while allowing faster release and shorter time to market \u2013 so essential for that competitive edge. Today, USD 550bn is spent on testing and validation annually. It is also the second largest IT community in the world. That is a significant investment and effort being put into this space already, but is it delivering results? In the past five or so years, there has been a push from CXOs, based on recommendations from industry experts and analysts, to go for extreme automation. Companies have been adopting multiple tools, opensource technologies, and building enterprise automation frameworks. This attempt to achieve end-to-end automation has created a mammoth network of tool sets in the organization that may or may not work well with each other. This is how test automation was done conventionally; it still requires elaborate effort to build test scripts, significant recurring investment for subscription and licensing, and training and knowhow for multiple tools. By some estimates, traditional testing can take up to 40% of total development time \u2013 that is untenable in the agile and DevOps modes companies operate in today. What if this ongoing effort can be eliminated? What if the need for multiple tools can be done away with? Enter Robotic Process Automation (RPA) in testing. 
While originally not built for testing, RPA tools show great potential to make testing more productive, more efficient, and help get more features to the market faster \u2013 giving them an edge over conventional tools (see Fig 1). The state of testing and validation in the enterprise Product features Traditional automation tools RPA Tools Coding Knowledge \u2022 Coding knowledge is essential to develop automated scripts \u2022 Programming knowledge and effort is needed to build the framework, generic reusable utilities and libraries \u2022 These tools offer codeless automation. Developing automated scripts requires some effort for configuration and workflow design. However, coding is minimal compared to traditional tools \u2022 Generic reusable utilities are available as plug-and-play components Maintenance Extensive maintenance effort required Minimal test maintenance effort required Cognitive automation No support for cognitive automation RPA tools are popular for supporting cognitive automation by leveraging AI Plugin support Limited plugins are available for different technologies Plugins are available for all leading technologies Orchestration and load distribution Load distribution during execution requires additional effort to develop the utilities and set up the infrastructure This feature is available in most RPA tools. For example, feature of a popular RPA tool helps in load distribution during execution without any additional effort aside from configuration Automation development Productivity Test development productivity is low since custom coding is required most of the time Test development productivity is high as most generic activities are available as plug-and-play OCR for text recognition This feature is not available This feature is available in all RPA tools Advanced image recognition This feature is not available. 
Either additional scripting or a third-party tool is needed to support this This feature is available in all RPA tools In-built screen and data scraping wizards This feature is not available and requires integration with other tools This feature is available in all RPA tools Fig. 1: Traditional automation tools vs. RPA External Document \u00a9 2020 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n RPA \u2013 the next natural evolution of testing automation In the last decade, automation has evolved and matured along with changing technologies. As discussed, automation in testing is not new, but its effectiveness has been a challenge \u2013 especially the associated expense and lack of skill sets. RPA can cut through the maze of tool sets within an enterprise, replacing them with a single tool that can talk to heterogeneous technology environments. From writing stubs, to record and playback, to modular and scriptless testing, and now to bots, we are witnessing a natural evolution of test automation. In this 6th Gen testing brought about by RPA orchestration, an army of bots will drastically change the time, effort, and energy required for testing and validation. We are heading towards test automation that requires no script and no touch, works across heterogeneous platforms, creates extreme automation, and allows integration with open source and other tools. According to Forrester\u00b9, \u201cRPA brings production environment strengths to the table.\u201d This translates into production-level governance, a wide variety of use cases, and orchestration of complex processes via layers of automation. RPA allows companies to democratize automation very rapidly within the testing organization. RPA has an advantage over traditional tools in that it can be deployed where they fail to deliver results (see Fig 2).
For instance, when: \u2022 the testing landscape is heterogeneous with complex data flows \u2022 there is a need for attended and unattended process validation \u2022 there is a need to validate digital systems In such scenarios, an RPA solution can greatly simplify building bots quickly and deploying them with minimal technical know-how \u2013 simple enough for even business stakeholders to understand. \n\n---\n\n Page: 4 / 8 \n\n---\n\n However, the challenges in testing are not limited to writing and executing test cases. The automation also needs to handle the periphery of testing activities \u2013 validating that all components of the environment are up and running and that test data is available on time. This dependency on peripheral activities, and on the teams running them, can cost valuable time. For instance, for a large banking client, this dependency put a lot of pressure on the testing team to finish a sprint in 5 days. Using RPA, we were able to automate the batch monitoring and batch rendering process. We also automated the synthetic test data creation and data mining processes, reducing time to market by 40%. To really provide value for testing and validation, RPA needs to provide some testing-specific capabilities such as: A Cohesive Automation Platform: Enabling enterprises to leverage the full potential of automation with process discovery and process automation (attended, unattended, and UI-based), combined with process orchestration capabilities. This should include a test automation interface that can bridge the gap between test management tools and automated test cases. A workflow-driven test automation approach can make the solution business-centric.
Native AI Capabilities: A cognitive engine can leverage various data sources to deliver pervasive intelligence across process design, management, and execution. Security and Scalability: The solution should allow running multiple bots on a single virtual machine, have robust access management with a credential vault built into the product, and offer out-of-the-box technology-specific adaptors. \n\nFig 2. How RPA tools address traditional testing challenges \n\u2022 Test data management \u2013 Data-driven testing is supported by many traditional tools. RPA can manage data from files like Excel/JSON/XML/DB and use these for testing. \n\u2022 Testing in different environments \u2013 End-to-end business processes navigate through various environments like mainframe/web/DB/client-server applications. RPA tools can easily integrate the process across multiple systems, simplifying business orchestration and end-to-end testing compared to other testing tools. \n\u2022 Traceability \u2013 While RPA tools do not directly provide test script traceability, there are methods to enable this functionality. For instance, user stories/requirements stored in JIRA can be integrated with RPA automation scripts using Excel mappings to create a wrapper that triggers execution. \n\u2022 Script versioning \u2013 A batch process can be implemented in the RPA tool to address this. \n\u2022 CI-CD integration \u2013 Available in most RPA tools. \n\u2022 Reporting and defect logging \u2013 RPA tools have comprehensive dashboards that showcase defects, which can be logged in Excel or JIRA through a suitable wrapper. \n\u2022 Error handling \u2013 Available in all RPA tools. \n\n---\n\n Page: 5 / 8 \n\n---\n\n AssistEdge for Testing in Action One of our clients, a large investment company based in Singapore, realized the benefits of RPA-based testing when it helped them save 60% of testing effort.
They were running a legacy modernization program using mainframe systems, which are notoriously difficult to automate with traditional automation tools. RPA, with its AI and OCR capabilities and its ability to traverse and handle almost any technology, was easily able to automate 800+ test cases on the mainframe. In another instance, a large banking client was using package-based applications that used multiple technologies to build different screens. It is difficult to integrate multiple tools in such a scenario. With RPA, we were able to automate the end-to-end workflow for each application using just one tool. This helped reduce the overall maintenance effort by over 30%. Another one of our clients was facing a quality assurance (QA) challenge where bots were being deployed without testing. We developed specific QA bots with added exception handlers to check whether a bot actually handles exceptions and, if it fails, whether it returns to its original state. By validating the bots, we improved overall efficiency by 30%. \n\n---\n\n Page: 6 / 8 \n\n---\n\n Take advantage of the Edge The COVID-19 pandemic has accelerated organizations\u2019 need to be hyper-productive. Companies are realizing that they have to transform to build the capabilities that will prepare them for the future. Companies are thinking of ways to drive efficiency and effectiveness to a level not seen before. There is a strong push for automation to play a central role in making that happen. This is also reflected in the testing domain, where any opportunity for improvement is welcome. Realizing the need to drive testing efficiencies and reduce manual effort, organizations want to adopt RPA in testing. We are at a tipping point where the benefits of RPA adoption are clear; what is needed is the first step towards replacing existing frameworks.
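The bot-validation idea described above \u2013 QA bots that check whether a bot handles exceptions and returns to its original state \u2013 can be sketched in miniature. Everything here (the Bot class, validate_recovery) is an illustrative stand-in, not a real RPA API:

```python
# Hypothetical sketch of the QA-bot idea: inject a failing step into a
# bot's workflow and verify that (a) the exception is handled and
# (b) the bot rolls its environment back to the original state.

class Bot:
    """A toy bot that mutates shared state and rolls back on failure."""
    def __init__(self, state):
        self.state = state

    def run(self, steps):
        snapshot = dict(self.state)          # remember the original state
        try:
            for step in steps:
                step(self.state)
            return "completed"
        except Exception:
            self.state.clear()               # exception handled: restore
            self.state.update(snapshot)      # the pre-run snapshot
            return "recovered"

def validate_recovery(bot, failing_steps):
    """QA check: after a failure, the bot must report 'recovered'
    and its state must equal the state it started with."""
    before = dict(bot.state)
    outcome = bot.run(failing_steps)
    return outcome == "recovered" and bot.state == before

bot = Bot({"record": "original"})
steps = [lambda s: s.update(record="changed"),
         lambda s: (_ for _ in ()).throw(RuntimeError("injected fault"))]
print(validate_recovery(bot, steps))  # True: fault handled, state restored
```

In a real engagement the "state" would be screens, queues, or database rows inspected by the QA bot, but the pass/fail criterion is the same: handled exception plus restored state.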
Recognizing the potential of RPA in testing, EdgeVerve and Infosys Validation Solutions (IVS) have been helping clients simplify and scale up test automation with AssistEdge for testing. AssistEdge brings the experience of handling tens of thousands of processes across different technologies and environments to test automation, helping navigate heterogeneous environments with ease. By building an army of bots for functional, regression, and user acceptance testing, it can help achieve 100% test automation with incredible accuracy. In addition to being faster to build and deploy, AssistEdge reduces the overall time to value as well as the investment needed to deploy and manage RPA infrastructure. Infosys Validation Solutions\u2019 (IVS) engineering-led QA capabilities enable enterprises to effortlessly scale up testing in real time, delivering unprecedented accuracy, flexibility, and speed to market. With its vast experience, IVS enables clients across industry verticals to successfully implement an automated testing strategy, allowing them to move away from tedious and error-prone manual testing, thereby improving performance and software quality while simultaneously delivering effort and cost savings. The journey to RPA-based test automation has to begin somewhere, and those that adopt faster will hold a competitive advantage through faster realization of benefits. The question is: are you willing to take the leap? Want to know more about the potential of RPA in testing? Write to us at askus@infosys.com \n\n---\n\n Page: 7 / 8 \n\n---\n\n About AssistEdge AssistEdge offers a cohesive automation platform that enables enterprises to scale in their automation journey. It offers enterprises a comprehensive suite of products enabling them to drive initiatives around process discovery, intelligent automation and digital workforce orchestration.
AssistEdge has helped enterprises unlock value in the form of reduced service time, faster sales cycles, better resource allocation, accelerated revenue recognition and improved efficiency, among others. About EdgeVerve EdgeVerve Systems Limited, a wholly owned subsidiary of Infosys, is a global leader in AI and Automation, helping clients thrive in their digital transformation journey. Our mission is to create a world where our technology augments human intelligence and creates possibilities for enterprises to thrive. Our comprehensive product portfolio across AI (Infosys Nia), Automation (AssistEdge) and AI-enabled Business Applications (TradeEdge, FinXEdge, ProcureEdge) helps businesses develop deeper connections with stakeholders, power continuous innovation and accelerate growth in the digital world. Today EdgeVerve\u2019s products are used by global corporations across financial services, insurance, retail, consumer & packaged goods, life sciences, manufacturing, telecom and utilities. Visit us to know how enterprises across the world are thriving with the help of our technology. https://www.edgeverve.com/ About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With over three decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight.
Our always-on learning agenda drives our clients\u2019 continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem. Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next. \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2020 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document.
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY About the \nAuthors Vasudeva Naidu AVP \u2013 Delivery Head, Infosys Validation Solutions Sateesh Seetharamiah VP \u2013 Global Product Head \u2013 AssistEdge, EdgeVerve References: \u00b9 \u201cRPA And Test Automation Are More Friends Than Foes\u201d, Forrester Research, Inc., May 15, 2020 https://www.infosys.com/services/it-services/validation-solution/white-papers/documents/rpa-tool-testing.pdf https://www.infosys.com/services/it-services/validation-solution/documents/automation-testing-assistedge.pdf https://www.ibeta.com/risks-of-not-testing-software-properly/#:~:text=The%20cost%20to%20fix%20bugs,profit%20loss%20during%20software%20downtime. \n\n\n***\n\n\n "}
{"text": "# Infosys POV \nTitle: Energy Transition: Hydrogen for Net Zero \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 11 \n\n---\n\n An Infosys Consulting Perspective By Sundara Sambasivam & Shivank Saxena Consulting@Infosys.com | InfosysConsultingInsights.com Energy Transition: Hydrogen for Net Zero \n\n---\n\n Page: 2 / 11 \n\n---\n\n Energy transition: Hydrogen for Net Zero | \u00a9 2022 Infosys Consulting 2 Energy transition: Hydrogen for Net Zero The pressure to reduce carbon emissions to achieve the target of net zero emissions by 2050 is ever-increasing. There is no silver bullet, no \u2018one-size-fits-all\u2019 solution to address this challenge. At this point in time, many different energy sources with varying levels of investment are being explored and tested to enable our transition towards net zero. Hydrogen (H2) is one of the most abundant elements found in nature. It is considered a key component in the decarbonization of industry, opening new frontiers and complementing existing solutions.
This series of papers aims to share some interesting perspectives on this sector, the associated challenges, and why hydrogen could play a significant role in the decarbonisation agenda. Current limitations in technology, scaling challenges, and feasibility concerns are just some of the reasons it has not yet been fully harnessed. However, hydrogen has significant potential to support this challenging journey towards net zero. \n\n---\n\n Page: 3 / 11 \n\n---\n\n Types of Hydrogen Both the production source and the process used define the hydrogen type, and a range of hydrogen types are produced today based on production method and source (The hydrogen colour chart, 2022). \n\n---\n\n Page: 4 / 11 \n\n---\n\n MARKET OUTLOOK - Production & Economies Production and demand outlook According to the International Energy Agency\u2019s 2021 report on hydrogen, only 0.49 Mt of hydrogen was produced via electrolysis. Although this was only 0.5% of overall global production, the outlook on green and blue hydrogen is promising. It has become an essential element of any state policy on energy transition for net zero. By 2050, more than 80% of production is estimated to be green or blue hydrogen. Demand will primarily be driven by power, transport, and industry, where demand for green hydrogen has the potential to grow 200% by 2050. Figure 2: Global hydrogen production and demand outlook (Harnessing Green Hydrogen: Opportunities for Deep Decarbonization in India, 2022) \n\n---\n\n Page: 5 / 11 \n\n---\n\n Economic outlook The current hydrogen production costs from different methods are listed in Figure 3 (Hydrogen Strategy: Enabling a low-carbon Economy, 2020). Coal and other fossil fuel-based production is inexpensive at around 2 USD/kg.
Prices increase by 10 to 20% when using carbon capture and storage (CCS). Electrolysis powered by renewable energy (RE) is the most expensive at 5 to 10 USD/kg and is not currently cost-competitive. This needs to fall to 2 USD/kg or lower in the next decade to compete directly with fossil fuels as an energy source. Several elements will play a critical role in driving down the cost of the end-to-end supply chain of production and distribution. These include higher levels of innovation through research and development (R&D) and the right investment in disruptive digital technologies like artificial intelligence, the Internet of Things, blockchain-based smart contracts and certificates, and digital twins. Figure 3: Hydrogen production costs by source and method (Hydrogen Strategy: Enabling a low-carbon economy, 2020) \n\n---\n\n Page: 6 / 11 \n\n---\n\n Economic outlook Renewables and electrolyser costs drive green hydrogen prices, and both are showing declining trends. Electrolyser costs are expected to fall by 30% in the next ten years (Harnessing Green Hydrogen, 2022). Industrial manufacturers like Siemens Energy and Linde have already started setting up some of the world\u2019s biggest electrolyser production facilities in line with the European Union\u2019s (EU) strategy (REPowerEU plan, May 2022) for fuel diversification, which will need a 27 billion EUR direct investment in domestic electrolysers and hydrogen distribution in the EU, excluding the investment in solar and wind electricity (REPowerEU Plan, 2022). The US, on the other hand, has announced future investments of up to 9 billion USD from 2022 to 2026 through its \u2018Infrastructure Investments and Jobs Act\u2019 (Garc\u00eda-Herrero et al, 2022).
The key difference is that US policy plans to use both blue and green hydrogen in the fuel mix, while the EU views blue hydrogen as a temporary solution only. Based on policy support and market conditions, the industry will decide on a future roadmap. Green credits and green hydrogen trading can turn many fossil fuel-dependent countries into future energy suppliers. Various states and corporates are funding greenfield and brownfield projects, which has created financing opportunities for venture capital, underwriters, and insurance firms. Figure 4: Renewables and electrolyser cost outlook (Harnessing Green Hydrogen, 2022) \n\n---\n\n Page: 7 / 11 \n\n---\n\n Economic outlook Figure 5: Renewables and electrolyser cost outlook (Harnessing Green Hydrogen, 2022) \n\n---\n\n Page: 8 / 11 \n\n---\n\n Figure 6: Hydrogen value chain opportunities Hydrogen Value Chain Opportunities Figure 6 outlines the end-to-end value chain, from production and electrolyser plant setup, operations in conjunction with RE parks, storage (long- and short-term), and distribution (liquified or gaseous), to consumption applications (power, transportation, and industries). It gives an overview of the current usage of hydrogen in industry applications. The new emerging areas with significant growth opportunities are primarily transportation (heavy-duty vehicles and shipping), long-term energy storage (sub-surface), and green ammonia (production and energy carrier). Hydrogen can contribute directly to decarbonising the biggest polluters like steel, refineries, and ammonia production. Although hydrogen burns clean, its production is not clean: hydrogen production from fossil fuels resulted in 900 Mt of CO2 emissions in 2020 (Global Hydrogen Review 2021, 2021).
High demand for green and blue hydrogen and hydrogen-based fuels could avoid up to 60 Gt of CO2 emissions between 2021 and 2050, accounting for a 6% reduction in total cumulative emissions (Hydrogen, 2022). Some of the biggest polluters in the transportation sector include long-haul freight, heavy-duty vehicles, maritime, and jet fuel, and decarbonizing them is not easy. By 2050, green ammonia could meet 25% of shipping fuel demand, helping meet the International Maritime Organization\u2019s goal of reducing CO2 emissions by 50% from 2008 levels. Hydrogen fuel cells can power short-distance services such as ferry journeys (Harnessing Green Hydrogen, 2022). With air travel growth, a significant increase in carbon footprint is expected in aviation, which already has the highest carbon emission intensity. Options like hydrogen fuel cells, hydrogen turbines, and hydrogen-based electrolytic synthetic fuel exist to decarbonize aviation, but each option has its merits and demerits. Big corporations like Airbus and start-ups like ZeroAvia have already presented their roadmaps for hydrogen-based aircraft in the next decade. For buildings, hydrogen can be blended into existing gas networks for both residential and commercial complexes. It can also be used by boilers and fuel cells. Its biggest promise is in long-term energy storage, which will impart stability to renewables-based generation and grid operations. Today, new gas turbines can also use hydrogen as a fuel component. Opportunities for the industry \n\n---\n\n Page: 9 / 11 \n\n---\n\n What\u2019s next? In our next articles, we will discuss the challenges of this emerging sector, some exciting industry projects underway around hydrogen, and the support and digital solutions needed to help pave the way to net zero.
Infosys Consulting achieved its net zero goals 30 years ahead of time and is working to help our partners in their energy transition journey towards their own net zero goals. \n\n---\n\n Page: 10 / 11 \n\n---\n\n MEET THE EXPERTS Sundara Sambasivam Associate Partner - Services, Utilities, Resources and Energy Practice Sundara.Sambasivam@infosys.com Sources \u2022 Garc\u00eda-Herrero, A., Tagliapietra, S. & Vorsatz, V. (2021), \u2018Hydrogen development strategies: a global perspective\u2019, Bruegel, August, [Online], Link [Accessed: 21 Nov 2022]. \u2022 \u2018Global Hydrogen Review 2021\u2019, (2021), International Energy Agency: IEA, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Harnessing Green Hydrogen: Opportunities for Deep Decarbonisation in India\u2019, (2022), Niti Aayog & Rocky Mountain Institute (RMI), June, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Hydrogen\u2019, (2021), International Energy Agency: IEA, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Hydrogen\u2019, (2022), International Energy Agency: IEA, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Hydrogen Strategy: Enabling A Low-Carbon Economy\u2019, (2020), U.S.
Department of Energy, July, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018REPowerEU Plan\u2019, (2022), European Commission, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018The hydrogen colour chart\u2019, (2022), National Grid, [Online], Link [Accessed: 16 Nov 2022]. Shivank Saxena Senior Consultant - Services, Utilities, Resources and Energy Practice Shivank01@infosys.com With over 22 years of global experience, Sundar has led a number of business and digital transformation and outcome-based efficiency turnaround programmes across Energy and Utilities (Transmission & Distribution). Sundar is excited to collaborate with and help our clients navigate the journey of energy transition towards their net zero ambitions. With over 11 years of experience, Shivank has led digital transformation projects, enabling end-to-end systems delivery for clients across industries and sectors. He has ensured sustained value delivery on multiple engagements by building roadmaps and driving planning-to-execution for various business-led initiatives. He is passionate about supporting the industry in meeting its net zero goals, and currently helps clients innovate to drive energy transition initiatives. \n\n---\n\n Page: 11 / 11 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts who lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage.
To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2022 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "}
{"text": "# Infosys Whitepaper \nTitle: Leveraging advanced validation techniques for retail size optimization \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER LEVERAGING ADVANCED VALIDATION TECHNIQUES FOR RETAIL SIZE OPTIMIZATION - Divya Mohan C, Chitra Sylaja \n\n---\n\n Page: 2 / 8 \n\n---\n\n Introduction In today\u2019s highly volatile business environment, retailers that want to remain profitable must be able to predict customer demand and ensure availability of the right products in the right store at the right time. This is a challenging task when merchandise such as apparel and footwear is offered in a range of sizes. To maximize revenue and profitability, retailers need a strategy that allows them to sell goods at full price while reducing markdowns. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n How does size optimization help retailers? Size optimization transforms historical sales and inventory data into size-demand intelligence.
This enables smart buying and allocation at the size level to match customer needs at each store. The size profile optimization (SPO) application provides optimized size curves for products at the store level. SPO uses past sales history to deliver the right sizes to the right stores and reduces the imbalance between consumer demand and inventory. By leveraging a combination of back-end data crunching technologies and front-end size profiling tools, SPO enables allocators, buying coordinators and buyers to leverage sales and inventory data and create precise style/color size scales for each store. This degree of specificity can help buyers and allocators create size profiles that recover lost sales, improve the size balance across the retail chain and launch new products and categories based on previous successes. Key challenges in size optimization Retailers find it challenging to procure and maintain the right mix of merchandise in the right size at the right store. The store-specific size profile plays an important role in helping retailers make assortment and allocation decisions. As competition evolves, new challenges arise and retailers need suitable approaches to counter them. Ability to understand consumer demand Customer demand varies across stores. Understanding the size-level demand at each store for different products is critical to meeting each unique customer demand. This requirement can be accurately captured and analyzed through granular data that represents size profiles for every store/product. Historical performance is not true customer demand As size profiling is based mainly on sales history, a cleansed history is necessary to generate an accurate size profile. Typically, historical data comprises profitable sales, lost sales and markdowns.
To get relevant size profiling data, it becomes necessary to filter out inventory data related to stock-outs, markdowns, margins, etc., from the consolidated data. This ensures that the right data set is used to design better analytical models and prevent biases arising from extreme data points. Analyze and process large volumes of data Gathering information at the granular level of store and size can generate large data volumes that are difficult to handle and analyze. Retailers may find it challenging to derive impactful business decisions from such large data sets. Key influencers in size optimization: Predicting customer decisions Consumers are always looking for specific merchandise; say, for instance, a shoe that carries the name of a basketball legend. If the store does not have this particular product, it should be able to offer the customer a suitable alternative through smart product grouping. Size profile optimization can help retailers predict similar products and position them appropriately within the store. \n\n---\n\n Page: 4 / 8 \n\n---\n\n Region-specific demand for products Certain products have high customer demand in specific regions. Size profile optimization can analyze historical data to ensure adequate stock of high-demand products in these stores. Reducing imbalance between consumer demand and inventory Customer demand for merchandise depends on factors such as holiday seasons and upcoming special events like a marathon. Size optimization can predict the type of merchandise that needs to be in stock during these seasons. This ensures that the inventory is never out of stock for such products. Recovering lost sales On the other hand, owing to inaccurate or excessive merchandise allocation, some stores are forced to conduct clearance sales at the end of a season to reduce their inventory.
SPO can assist retailers in allocating accurate size profiles, thereby ensuring only the required inventory is stocked to meet existing business demand. Allocate new products/new stores based on similar products/stores Size profile optimization goes beyond allocating size profiles based on historical sales and inventory data. Retail merchandising mandates that the allocation of a new article to any store is profitable to the company. The SPO engine can map a new article to similar existing articles and create profiles based on these. Similarly, SPO can map a new store to similar existing stores and create relevant profiles. Figure: Demand transfer walk \u2013 switch points across the product hierarchy (class, style, style-color, style-color/size), illustrated with short-sleeve and long-sleeve men\u2019s tops in sizes S\u2013XXL. \n\n---\n\n Page: 5 / 8 \n\n---\n\n Functional testing Functional testing is performed to ensure that size profiling is done according to business expectations. There are four key ways to conduct functional testing: 1. Validate pre-load data 2. Analyze data in SPO 3. Validate the size profile engine based on business rules 4. Validate business intelligence reports Validation of pre-load data Data from various sources such as point-of-sale (POS) terminals, digital stores, etc., is loaded into the database. The raw data can exist in any form (flat file or XML). To verify that the right data is fed into the database, different validation techniques can be used.
These include: \u2022 Comparing data received from different input source systems through XML/flat file format with data available in the intermediate database \u2022 Ensuring that data is loaded according to business rules defined in the system \u2022 Ensuring the data loaded from the source to intermediate databases conforms to the mapping sheet specified in the requirement Analysis of data in SPO Retail merchants possess large amounts of historical data accumulated over several years that can be fed into the size profile engine for profile generation. Testing teams should ensure that the correct data and possible scenarios are sampled and transferred to the size engine. Figure: Data flow from POS and digital stores into the intermediate database and on to SPO. Why validation is critical for size optimization Previously, retailers were unaware of the importance of size optimization. They would randomly determine an average size profile and apply it across all stores and, occasionally, across geographic locations. Retailers lacked the right approach to leverage the large amount of historical data readily available. This often resulted in early-season stock-outs in some stores and markdowns for the same merchandise in other stores. Additionally, retailers struggled to make intelligent data-driven decisions owing to the lack of automated approaches and techniques to validate data quality and integrity issues. Validation strategy Validation involves processing a large amount of data from various sources according to the format specified by the size profile engine. A validation strategy can help retailers meet customer demand by accurately projecting each store\u2019s future sales and inventory needs. It can support the business goals of the retailer, i.e., reduce markdowns, increase full-price sales and drive higher revenue.
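The pre-load checks described above boil down to record-by-record reconciliation between the source feed and the intermediate database, driven by a mapping sheet. A minimal sketch in Python follows; the table name, column names, and feed layout are illustrative assumptions, not the actual SPO schema:

```python
import sqlite3

def validate_preload(source_rows, conn):
    """Compare each source record with the intermediate DB (per a
    hypothetical mapping: store->store_id, sku->sku_code, qty->units_sold).
    Returns a list of mismatch descriptions; an empty list means pass."""
    issues = []
    cur = conn.cursor()
    for row in source_rows:
        cur.execute(
            "SELECT units_sold FROM sales_stage WHERE store_id=? AND sku_code=?",
            (row["store"], row["sku"]),
        )
        hit = cur.fetchone()
        if hit is None:
            issues.append(f"missing in DB: {row['store']}/{row['sku']}")
        elif hit[0] != row["qty"]:
            issues.append(f"qty mismatch for {row['store']}/{row['sku']}: "
                          f"feed={row['qty']} db={hit[0]}")
    return issues

# Toy data standing in for a POS flat file and the intermediate database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_stage (store_id TEXT, sku_code TEXT, units_sold INT)")
conn.executemany("INSERT INTO sales_stage VALUES (?,?,?)",
                 [("S01", "SHOE-M", 40), ("S01", "SHOE-L", 25)])
feed = [{"store": "S01", "sku": "SHOE-M", "qty": 40},
        {"store": "S01", "sku": "SHOE-L", "qty": 30}]
print(validate_preload(feed, conn))  # one qty mismatch reported for SHOE-L
```

In practice the source side would be parsed from the actual flat file or XML and the mapping would be read from the requirement's mapping sheet, but the reconciliation loop is the same.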
Besides validation, size profiling also includes functional and performance testing. \n\n---\n\n Page: 6 / 8 \n\n---\n\n Non-functional testing Any size profile optimization project involves processing a large volume of structured data. Performance testing ensures that size profile engines perform optimally and that size profiles are generated within the stipulated time limits to support business needs. For best results, the test environment for performance testing should be similar to the production environment. Further, if the performance service level agreement (SLA) is not met, then the advantages of size profile optimization are lost. Performance can be monitored using different tools available in the market. The typical performance testing check-points are: \u2022 Ensure data movement in each stage is completed according to the SLA \u2022 Monitor system performance at maximum data load Validation of reports Once the size profiles are generated, business users can compare the profiles for different products and allocate them based on analytical reports drawn using business intelligence report-generation mechanisms. Analytical reports are generated based on the business rule set. 
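The performance check-points above lend themselves to a simple timing harness that flags a stage whenever its data movement exceeds the agreed SLA. This is a sketch only; the stage names and SLA values are illustrative, and a real run would wrap the actual load jobs:

```python
import time

# Illustrative SLAs (seconds) per data-movement stage; real values would
# come from the performance SLA agreed with the business.
STAGE_SLAS = {"source_to_staging": 2.0, "staging_to_spo": 3.0}

def run_stage_with_sla(name, stage_fn):
    """Run one data-movement stage and report whether it met its SLA."""
    start = time.perf_counter()
    stage_fn()  # the actual ETL job would be invoked here
    elapsed = time.perf_counter() - start
    return {"stage": name, "elapsed": elapsed,
            "met_sla": elapsed <= STAGE_SLAS[name]}

# Demo: a stand-in stage that finishes quickly, well inside its SLA.
result = run_stage_with_sla("source_to_staging", lambda: time.sleep(0.01))
print(result["met_sla"])  # True
```

In practice the same wrapper can be pointed at each stage of the load (source to staging, staging to the size profile engine) and the results fed into the nightly test report.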
The testing team validates the accuracy of the report data against data from the data warehouse and verifies the usefulness of the information displayed to the business. The reports generated by the size profile engine provide the following key details: \u2022 Allocation by store \u2013 How many articles of a particular size have been allocated to a particular store \u2022 Allocation percentage at various levels such as class, style, style-color, concept, etc. \u2022 Effectiveness of size profile \u2013 Business can measure the effectiveness of size profiles in improving allocation to stores Validation of the size profile engine based on business rules Once data is fed into the size profile engine, it needs to be processed according to business rules specified within the system. Business rules are set to analyze the accuracy of size profiling. The size profile engine\u2019s processing can be checked using these validation techniques: \u2022 In cases where the business rule should exclude stock-out data and sales data with a margin filter greater than 10% for a particular set of merchandise, the validation team verifies that the size profile engine has not considered such data for profile generation \u2022 The validation team has to ensure that relevant data is used to determine the appropriate profile for the introduction of a new article/store. 
Often, the data used may be incorrect owing to non-availability of relevant data for the new article/store. To execute high-level validation of business rules, the following validation techniques can be used by validation teams: \u2022 Compare data on new products with data on existing/similar products to verify that a similar size profile is generated \u2022 Ensure that the correct sample of data is selected for verifying all the business rules \u2022 Monitor and verify that size profiles are generated for every size of a particular style/color of a product \u2022 Ensure that the total size profile generated for a particular style/color of an article adds up to 100% \n\n---\n\n Page: 7 / 8 \n\n---\n\n Conclusion Size profile optimization helps retailers effectively stock the right sizes in stores based on various parameters, thereby enabling them to maximize profit, reduce markdowns and recover lost sales. Historical sales and inventory data is analyzed and transformed to drive critical business decisions. Here, data quality and data analysis play a vital role. By leveraging the right validation strategy with appropriate validation techniques, retailers can ensure that all possible business scenarios are considered and accurate data is chosen for size optimization decisions. \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. 
Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: An Insight into Microservices Testing Strategies \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER AN INSIGHT INTO MICROSERVICES TESTING STRATEGIES Arvind Sundar, Technical Test Lead Abstract The ever-changing business needs of the industry necessitate that technologies adapt and align themselves to meet demands and, in the process of doing so, give rise to newer techniques and fundamental methods of architecture in software design. In the context of software design, the evolution of \u201cmicroservices\u201d is the result of such an activity and its impact percolates down to the teams working on building and testing software in the newer schemes of architecture. This white paper illustrates the challenges that the testing world has to deal with and the effective strategies that can be envisaged to overcome them while testing applications designed with a microservices architecture. The paper can serve as a guide to anyone who wants an insight into microservices and would like to know more about testing methodologies that can be developed and successfully applied while working within such a landscape. \n\n---\n\n Page: 2 / 8 \n\n---\n\n Microservices attempt to streamline the software architecture of an application by breaking it down into smaller units surrounding the business needs of the application. 
The expected benefits include systems that are more resilient, easily scalable and flexible, and that can be developed quickly and independently by individual sets of smaller teams. Formulating an effective testing strategy for such a system is a daunting task. A combination of testing methods, along with tools and frameworks that can provide support at every layer of testing, is key, as is a good knowledge of how to go about testing at each stage of the test life cycle. More often than not, the traditional methods of testing have proven to be ineffective in an agile world where changes are dynamic. The inclusion of independent micro-units that have to be thoroughly tested before their integration into the larger application only increases the complexity of testing. The risk of failure, and the cost of correction after the services are integrated, is immense. Hence, there is a compelling need to have a successful test strategy in place for testing applications designed with such an architecture. Introduction \n\n---\n\n Page: 3 / 8 \n\n---\n\n The definition of what qualifies as a microservice is quite varied and debatable, with some SOA (service-oriented architecture) purists arguing that the principles of microservices are the same as those of SOA and that, fundamentally, the two are one and the same. However, others disagree and view microservices as a new addition to software architectural styles, although there are similarities with SOA in the concepts of design. 
Thus, a simpler and easier approach to understanding what the microservices architecture is about would be to understand its key features: \u2022 Self-contained and componentized \u2022 Decentralized data management \u2022 Resilient to failures \u2022 Built around a single business need \u2022 Reasonably small (micro) The points above are not necessarily must-haves for a service to be called a microservice, but rather are \u2018good-to-haves.\u2019 Nor is the list a closed one, as it can also include other features that are common among implementations of a microservices architecture. However, the points provide a perspective of what can be termed a microservice. Now that we know what defines a microservice, let us look at the challenges it poses to testers. The distributed and independent nature of microservices development poses a plethora of challenges to the testing team. Since microservices are typically developed by small teams working on multiple technologies and frameworks, and are integrated over lightweight protocols (usually REST over HTTPS, though this is not mandatory), testing teams may be inclined to use the Web API testing tools built around SOA testing. This, however, could prove to be a costly mistake, as the timely availability of all services for testing is not guaranteed, given that they are developed by different teams. Furthermore, the individual services are expected to be independent of each other although they are interconnected with one another. In such an environment, a key factor in defining a good test strategy is understanding the right amount of testing required at each point in the test life cycle. Additionally, if these services integrate with another service or API that is exposed externally, or is built to be exposed to the outside world as a service to consumers, then a simple API testing tool would prove to be ineffective. 
With microservices, unlike SOA, there is no need for a service-level aggregator like an ESB (enterprise service bus), and data storage is expected to be managed by the individual unit. This complicates the extraction of logs during testing and data verification, which is extremely important in ensuring there are no surprises during integration. The availability of a dedicated test environment is also not guaranteed, as development would be agile and not integrated. Microservices architecture Challenges in testing microservices \n\n---\n\n Page: 4 / 8 \n\n---\n\n In order to overcome the challenges outlined above, it is imperative that the test manager or lead in charge of defining the test strategy appreciates the importance of Mike Cohn\u2019s Test Pyramid[i] and is able to draw an inference of the amount of testing required. The pictorial view emphasizes the need for a bottom-up approach to testing. It also draws attention to the number of tests and, in turn, the automation effort that needs to be factored in at each stage. The representation of the pyramid has been slightly altered for the various phases in microservices testing. These are: i. Unit testing The scope of unit testing is internal to the service and, in terms of volume, unit tests are the largest in number. Unit tests should ideally be automated, depending on the development language and the framework used within the service. ii. Contract testing Contract testing is integral to microservices testing and can be of two types, as explained below. The right method can be decided based on the end purpose that the microservice would cater to and how the interfaces with the consumers are defined. a) Integration contract testing: Testing is carried out using a test double (mock or stub) that replicates a service that is to be consumed. 
The testing done with the test double is documented, and this set needs to be periodically verified against the real service to ensure that there are no changes to the service exposed by the provider. b) Consumer-driven contract testing: In this case, consumers define the way in which they would consume the service via consumer contracts that can be in a mutually agreed schema and language. Here, the provider of the service is entrusted with copies of the individual contracts from all the consumers. The provider can then test the service against these contracts to ensure that there is no confusion about the expectations, in case changes are made to the service. iii. Integration testing Integration testing is possible in case there is an available test or staging environment where the individual microservices can be integrated before they are deployed. Another type of integration testing can be envisaged if there is an interface to an externally exposed service and the developer of the service provides a testing or sandbox version. The reliance on integration tests for verification is generally low in case a consumer-driven contract approach is followed. iv. End-to-end testing It is usually advised that the top layer of testing be a minimal set, since a failure is not expected at this point. Locating a point of failure through end-to-end testing of a microservices architecture can be very difficult and expensive to debug. [Figure: Mike Cohn\u2019s Testing Pyramid \u2013 unit testing at the base, then contract testing, integration testing, and E2E/UI testing at the top; the number of tests decreases, while the scope of testing and execution time increase, toward the top.] Approach to testing microservices and testing phases \n\n---\n\n Page: 5 / 8 \n\n---\n\n \u2022 For unit testing, it would be ideal to use a framework like xUnit (NUnit or JUnit). The change in data internal to the application needs to be verified, apart from checking the functional logic. 
For example, if reserving an item returns a reservation ID on success in the response to a REST call, the same needs to be verified within the service for persistence during unit testing. \u2022 The next phase of testing is contract testing. In case there are several dissimilar consumers of the service within the application, it is recommended to use a tool that enables consumer-driven contract testing. Open source tools like Pact, Pacto, or Janus can be used. This is discussed in further detail in the last example; in the context of this example, we will assume that there is only a single consumer of the service. For such a condition, a test stub or a mock can be used for integration contract testing. [Figure: \u2018Select an item\u2019 and \u2018Reserve an item\u2019 services communicating over REST over HTTPS (item ID, date, reservation ID), with a mocked service; unit testing, integration testing and integration contract testing scopes are marked.] Data being passed between the services needs to be verified and validated using tools like SoapUI \u2013 for example, an item number being passed from the service that selects an item to the one that reserves it. \u2022 E2E tests should ensure that dependency between microservices is tested at least in one flow, though extensive testing is not necessary. 
For example, an item being purchased should trigger both the \u2018select\u2019 and \u2018reserve\u2019 microservices. Testing scenarios and test strategy In order to get a clear understanding of how testing can be carried out in different scenarios, let us look at a few examples that can help elucidate the context of testing and provide a deeper insight into the test strategies used in these cases. \u2022 Scenario 1: Testing between microservices internal to an application or residing within the same application This would be the most commonly encountered scenario, where small sets of teams work on redesigning an application by breaking it down into microservices from a monolithic architecture. In this example, we can consider an e-commerce application in which a) selecting an item and b) reserving an item are modelled as individual services. We also assume there is close interaction between these two services and that the parameters are defined using agreed schemas and standards. \n\n---\n\n Page: 6 / 8 \n\n---\n\n \u2022 Scenario 2: Testing between internal microservices and a third-party service Here, we look at a scenario where a service within an application consumes or interacts with an external API. In this example, we consider a retail application where paying for an item is modelled as a microservice that interacts with the PayPal API exposed for authenticating the purchase. Let us look at the testing strategy in each phase of the test cycle in this case: \u2022 Unit tests should ensure that the service model caters to the requirements defined for interacting with the external service, while also ensuring that internal logic is maintained. Since there is an external dependency, there is a need to ensure that requirements are clearly defined and hence, documenting them remains key. A TDD approach is suggested where possible, and any of the popular frameworks discussed in the previous example can be chosen for this. \u2022 Contract testing can be used in this case to test the expectations of the consumer microservice, that is, the application\u2019s internal service, decoupling it from the dependency on the external web service being available. In this context, test doubles, created using tools like Mockito or Mountebank, can be used to stand in for the PayPal API\u2019s implementation and tested against. This is essentially integration contract testing and, again, needs to be verified against a live instance of the external service periodically, to ensure that there is no change to the external service that has been published and is being consumed. \u2022 Integration tests can be executed if the third-party application developer provides a sandbox (e.g. PayPal\u2019s Sandbox API[ii]) for testing. Live testing for integration is not recommended. If no sandbox is available, integration contract testing needs to be exercised thoroughly for verification of the integration. \u2022 E2E tests should ensure that there are no failures in other workflows that might integrate with the internal service. Also, a few monitoring tests can be set up to ensure that there are no surprises. 
In this example, selecting and purchasing an item (including payment) can be considered an E2E test that can run at regular, pre-defined intervals to spot any changes or breaks. [Figure: The application\u2019s \u2018Pay for an item\u2019 service calling the PayPal (external) API over REST over HTTPS through a firewall, with a test double/virtual provider and the PayPal sandbox API; unit, contract and integration testing scopes are marked.] \n\n---\n\n Page: 7 / 8 \n\n---\n\n \u2022 Scenario 3: Testing for a microservice that is to be exposed to the public domain Consider an e-commerce application where retailers can check the availability of an item by invoking a Web API. \u2022 Unit tests should cover testing for the various functions that the service defines. Following TDD can help here to ensure that the requirements are clearly validated during unit testing. Unit tests should also ensure that data persistence within the service is taken care of and passed on to other services that it might interact with. \u2022 Contract testing \u2013 In this example, consumers need to be set up using tools that help define contracts. Also, the expectations from a consumer\u2019s perspective need to be understood. The consumer should be well-defined and in line with the expectations in the live situation, and contracts should be collated and agreed upon. Once the consumer contracts are validated, a consumer-driven contract approach to testing can be followed. It is assumed that in this scenario there would be multiple consumers and hence individual consumer contracts for each of them. For example, in the above context, a local retailer and an international retailer can have different methods and parameters of invocation. Both need to be tested by setting up contracts accordingly. 
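The consumer-driven arrangement described here can be sketched by collating one contract per consumer and replaying all of them against the provider's response. The field names and contract shapes below are hypothetical, and tools such as Pact automate this exchange in practice; this is only a minimal illustration of the idea:

```python
# Hypothetical consumer contracts for an item-availability service: each
# consumer states the response fields (and types) it relies on.
LOCAL_RETAILER_CONTRACT = {"item_id": str, "in_stock": bool}
INTL_RETAILER_CONTRACT = {"item_id": str, "in_stock": bool, "currency": str}

def satisfies(contract, response):
    """Does a provider response honour one consumer's contract?"""
    return all(
        field in response and isinstance(response[field], ftype)
        for field, ftype in contract.items()
    )

def verify_all(contracts, response):
    """The provider replays every collated consumer contract before release."""
    return {name: satisfies(c, response) for name, c in contracts.items()}

provider_response = {"item_id": "A100", "in_stock": True, "currency": "USD"}
print(verify_all(
    {"local": LOCAL_RETAILER_CONTRACT, "intl": INTL_RETAILER_CONTRACT},
    provider_response,
))  # {'local': True, 'intl': True}
```

If the provider later drops or retypes a field, the corresponding consumer's entry flips to False, flagging the breaking change before release.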
It is also assumed that consumers subscribe to the contract method of notifying the provider of the way they would consume the service, and of the expectations they have from it, via consumer contracts. \u2022 E2E tests \u2013 A minimal set of E2E tests would be expected in this case, since interactions with external third parties are key here. [Figure: The application verified against two virtual consumers over REST over HTTPS, each with its own consumer contract; unit testing and consumer-driven contract testing scopes are marked.] \n\n---\n\n Page: 8 / 8 \n\n---\n\n In conclusion Improvements in software architecture have led to fundamental changes in the way applications are designed and tested. Teams working on testing applications developed in the microservices architecture need to educate themselves on the behavior of such services, as well as stay informed of the latest tools and strategies that can help deal with the challenges they could potentially encounter. Furthermore, there should be a clear consensus on the test strategy and approach to testing. A consumer-driven contract approach is suggested, as it is a better way to mitigate risk when services are exposed to an assorted and disparate set of consumers, and it further helps the provider deal with changes without impacting the consumer. Ensuring that the required amount of testing is focused at the correct time, with the most suitable tools, would ensure that organizations are able to deal with testing in such an environment and meet the demands of the customer. References: [i] https://www.mountaingoatsoftware.com/blog/the-forgotten-layer-of-the-test-automation-pyramid [ii] https://www.sandbox.paypal.com \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. 
Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Modernizing enterprise systems in healthcare \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER MODERNIZING ENTERPRISE SYSTEMS IN HEALTHCARE An end-to-end testing approach for greater predictability and quality Abstract As digital technologies, smart wearables and remote monitoring capabilities penetrate healthcare, traditional healthcare companies are unable to keep up with end-user expectations. Under pressure to adopt rapid transformation, these organizations are looking for robust, end-to-end testing procedures. This paper explains various end-to-end testing approaches within the four main modernization techniques for healthcare companies. The analysis presented here acts as a guideline for healthcare leaders to make strategic and informed decisions on how to modernize their systems based on the needs of their end-users. \n\n---\n\n Page: 2 / 8 \n\n---\n\n Introduction Sustainability in healthcare is a looming challenge, particularly as the fusion of disruptive innovations such as digitization, Internet-of-Things and smart wearables enables remote and real-time health tracking, diagnosis and management. To succeed in such an environment, healthcare organizations rely heavily on IT. 
Thus, using the latest end-to-end testing approaches becomes essential to: \u2022 Ensure that all applications operate as a single entity with multi-module interactions \u2022 Maintain performance/non-functional scenarios within the desired limits \u2022 Identify bottlenecks and dependencies ahead of time so that the business can take appropriate actions Testing challenges in healthcare modernization Business transformation in healthcare is complex because of the challenges in maintaining integrity between different types of customer needs and health-related plans. Modernizing healthcare software applications mandates enabling a multi-directional flow of information across multiple systems, which can complicate the entire healthcare workflow application. Further, failures or errors in systems outside the enterprise environment can adversely affect the performance of applications with which they are integrated. To address such challenges, it is important to determine the right method and types of end-to-end testing. This will optimize application performance by testing it across all layers, from the front-end to the back-end, along with its interfaces and endpoints. Typically, most healthcare organizations use multi-tier structures with multiple end-users, making end-to-end testing very complex. Launching a new product in such a multi-directional business scenario requires extensive user testing. 
Thus, to enable end-to-end (E2E) testing, health insurance companies must first understand what customers expect from their healthcare providers and identify how they can meet these expectations in shorter timelines. [Fig 1: A typical multi-tier healthcare business \u2013 pictorial diagram of consumers, member/find-doctor/claim/payment/provider web pages, a firewall, web servers, services, rules engines, and cloud, Oracle and big data stores, connected through XML, API calls, SOA calls, service calls and database updates.] \n\n---\n\n Page: 3 / 8 \n\n---\n\n Infosys solution End-to-end testing approaches for different modernization techniques Infosys leverages four modernization techniques to help healthcare organizations enable end-to-end testing. These techniques are: 1. Re-engineering technique Use-case: Best-suited in cases where companies need to digitize healthcare product marketing to different constituents of a state through online retail. This modernization technique is useful when venturing into new markets or when retiring obsolete technologies due to high maintenance costs. 
It leverages the following end-to-end testing approaches: \u2022 Simulation testing: User-centric interaction testing is performed based on different behavior events, like usability testing, cross-browser compatibility and mobile testing \u2022 Compliance testing: This testing is needed for security protocols and financial boundary testing as per mandates defined by state and central governments \u2022 Blend testing: This combines functional and structural testing into a single approach and is essential for any healthcare digitization transformation strategy \u2022 Universal automation: This is a new approach that automates the acceptance of changed features in applications through browser recognition, payment gateways, etc. \u2022 Risk-based testing: This focuses on testing a few components or critical defects that are identified as high-risk functions, have significant complexity in business operations and can impact key features \u2022 Continuous automation testing: Based on continuous integration and continuous delivery, this testing is for real as well as virtual features that are proposed for projects in minor transition \u2022 Recognition accuracy testing: This tests non-textual data like images, pictorial figures, feelings, fingerprinting, etc., using a virtual augmented framework Benefits \u2013 This testing approach provides higher returns on investment, by nearly 80-90% in short spans within 5-7 iterations. It also improves co-ordination, accuracy and reusability of data in ensuing runs, thus providing a robust and reusable testing option through cutting-edge technology. Risks \u2013 Diversified technology exposure is critical to support such big-bang transformation, and limited technical knowledge may result in uncovering fewer quality issues. Further, rebuilding the enterprise framework can be costly. 2. 
Replacing or Retiring technique Use case: Best-suited when one needs to move contract and legal documentation from healthcare insurance and hospitals to a separate online portal. This modernization technique is used when there is a need for more control and accuracy. Migratory functions are clustered as units and can be renovated easily without disturbing other applications. Here, end-to-end testing focuses on components that undergo gradual replacement or are retired, as described below: \u2022 Plug-and-play testing: This is usually executed when testing teams employ different types of tools for automation scripting or when different types of technologies are involved in testing \u2022 Web service-based testing: This is a mechanism or medium of communication by which two or more applications exchange data, irrespective of the underlying architecture and technology \u2022 Neutrality testing: This is typically used when the existing platform is replaced with a new one without altering the final business outcomes or end-user experiences \u2022 Parallel testing: This analyzes several applications or sub-elements of one application simultaneously and in the same instance, using agile or waterfall models, in order to reduce test time \u2022 Assembly testing: This reveals precise interactions among modules as per user requirements. It is used when functions are grouped into a logical entity and alliances are needed \u2022 Usability testing: Usability testing covers learnability, memorability, adeptness, and customer satisfaction indices to determine how easy the application is for end-users to use Benefits \u2013 This modernization approach provides more structure and control to end-to-end testing, with 15-20% effort reduction. It ensures effective application testing with the option of reverting to the native state on demand. Further, it requires only 5-7% effort for automation changes during build. 
Risks \u2013 Project overrun can occur without proper supervision. Additionally, it requires repeated testing of the same regression suite even for small deployments. \n\n---\n\n Page: 4 / 8 \n\n---\n\n 3. Re-fronting technique Use case: Best-suited when adding an encryption logic protocol is required for sensitive claim-related information passing through a web service. This approach is used when end-users want to use the same data efficiently and quickly without investing in expensive infrastructure set-up. It covers virtualization, non-functional testing and regression testing, as described below: \u2022 Virtualization testing: This simulates multiple users to check the performance of the new technology while it interacts with existing applications \u2022 Non-functional testing: Certain features like technology compatibility, platform integrity, exception handling, help analysis, impact exploration, and application availability fall under the purview of non-functional testing \u2022 Regression testing: The regression re-run approach is used when there is a slight change in functionality but the overall system behavior has not changed Benefits \u2013 This approach simplifies localized defect resolution. Here, end-to-end testing is more stable as changes are limited and specific. Further, the cost of running E2E test cases is lower as the regression suite can be easily automated. Risks \u2013 Frequent patch changes can lower productivity and increase maintenance cost. Further, repeated testing of the same emergency bug fix can reduce long-term RoI. 4. Re-platforming technique Use case: Best-suited when upgrading billing/payments databases to recent versions is needed due to license renewals. Re-platforming of applications is primarily done in areas where businesses aim to minimize maintenance costs with cost-effective technology. 
This modernization technique uses migration, acceptance, intrusive, and volume testing approaches as described below:
\u2022 Migration testing: This is used when ensuring data integrity is the most important factor during technology upgrades
\u2022 Acceptance testing: Acceptance testing ensures that applications moved to a new platform gain the same end-user acceptance as before
\u2022 Intrusive testing: Also known as negative testing, this approach determines the effect of introducing unexpected variables into the system or overall application
\u2022 Volume testing: This evaluates the stability of applications by ingesting a huge number of records
Benefits \u2013 This approach simplifies end-to-end testing as predicted business outcomes are achieved. It lowers testing cost, thus reducing the total cost of operations and time-to-market, and does not require additional infrastructure or specialized licensing tools. Further, it increases testing penetration by reusing scenarios, data and execution strategies.
Risks \u2013 Re-platforming may warrant additional testing of critical business flows to ensure functional defects are caught early to avoid cost impact.
Also, conducting testing on the new platform requires proper training. \n\n---\n\n Page: 5 / 8 \n\n---\n\n Comparative analysis of various testing approaches
The following table depicts a matrix of end-to-end test approaches along with modernization techniques in healthcare. The matrix illustrates which E2E testing method is best-suited to the four different modernization techniques. While \u2018yes\u2019 and \u2018no\u2019 represent absolute outcomes, it is important to note that \u2018maybe\u2019 results depend on how critical the business needs are and whether the approach is actually cost-effective when considering the overall business operations.
Modernization techniques for end-to-end testing approaches:
| Testing approach | Re-engineer | Remediate or replace | Re-front | Re-platform |
|---|---|---|---|---|
| Simulation testing | Yes | Yes | No | No |
| Compliance testing | Yes | Yes | No | No |
| Blend testing | Yes | Yes | No | No |
| Universal testing | Yes | Yes | Yes | No |
| Risk-based testing | Yes | Yes | No | No |
| Plug-and-play testing | No | Yes | Yes | Yes |
| Web service-based testing | Yes | Yes | Yes | Yes |
| Agile testing | No | Yes | Maybe | No |
| Parallel testing | No | Yes | Yes | No |
| Virtualization testing | No | No | Yes | Yes |
| Usability testing | Yes | Yes | No | No |
| Recognition testing | Yes | No | Yes | Maybe |
| Regression testing | No | No | Yes | Yes |
| Migration testing | Yes | No | Maybe | Yes |
| Assembly testing | Yes | Yes | Yes | No |
| Volume testing | Yes | No | No | Yes |
| Intrusive testing | Yes | No | No | No |
| Acceptance testing | Yes | Maybe | No | Yes |
Table 1: Comparison of different end-to-end test approaches and modernization techniques \n\n---\n\n Page: 6 / 8 \n\n---\n\n Case study
Business need \u2013 A healthcare company with three verticals for policyholders, doctors and claims wanted to remodel their business portfolio to adopt new technologies and meet customer demand.
While the policy procuring system and the consumer premium collection system were digitized, the company decided to re-engineer the claims processing system from DB2 to a big data-based system. As the claims vertical interacted with doctor and hospital web portals, the company also wanted to transform the portals gradually, component-wise, in order to give doctors sufficient time to acquaint themselves with the new digitized system.
Solution approach \u2013 To support the company\u2019s hybrid transformation project, we used an end-to-end testing strategy that leveraged complementary test approaches from different modernization techniques across the three verticals, as described below:
\u2022 For policyholders: The customer procurement system was treated with a combination of re-front modernization and big bang transformation. Blend testing was used with continuous automation, followed by web service testing and assembly testing
\u2022 For providers (doctors/hospitals): Here, we used a combination of assembly, regression re-run and agile testing to ensure gradual changes, since agile testing methodology is best-suited for scenarios where constituents are deployed slowly over a period of time
\u2022 For claims: Claims is a crucial vertical. Thus, skeleton scripts, virtualization and migration testing methods were used for their stability and lower risk when migrating from DB2 to big data
As each vertical of the company had different business needs, different types of modernization were needed to suit various end-users.
Fig 2: End-to-end testing approach for the three verticals \n\n---\n\n Page: 7 / 8 \n\n---\n\n The road ahead
In the future, more consumers will embrace digitization and the uber-connectedness of wearables and mobile devices that can track the user\u2019s health through in-built monitoring systems.
Thus, as a higher number of service operators orchestrate multiple domains, we can expect greater challenges ahead for end-to-end testing. This makes it imperative to leverage DevOps and analytics-based testing capabilities along with modernization approaches.
Conclusion
Disruptive technologies are creating avenues for healthcare providers to deliver virtual treatment, advice and services. However, this requires some degree of IT modernization, for which end-to-end testing is crucial. There are various approaches that can be used to enable re-engineering, replacing, re-fronting, and re-platforming modernization techniques. Each testing approach has its benefits and risks and must be chosen based on end-user expectations. Thus, it is important for business leaders to be aware of these in order to make the right decision for their IT modernization journey. The right approach can offer significant cost advantages, accelerate time-to-market and ensure a seamless end-user experience.
Authors
Dipayan Bhattacharya, Project Manager, Infosys Validation Solutions
Amit Kumar Nanda, Group Project Manager, Infosys Validation Solutions \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. 
Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "}
{"text": "# Infosys POV \nTitle: Next-gen Process Mining powers Oil & Gas transformation \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 13 \n\n---\n\n An Infosys Consulting Perspective By Sachin Padhye, Naveen Kamakoti, Shruti Jayaraman and Sohini De Consulting@Infosys.com | InfosysConsultingInsights.com Next-gen Process Mining powers Oil & Gas transformation \n\n---\n\n Page: 2 / 13 \n\n---\n\n Oil & Gas transformation | \u00a9 2023 Infosys Consulting \nProcess Mining technology: A key enabler to transform Oil & Gas \nTechnology continues to be a reliable and indispensable enabler to transform the operations of Oil and Gas companies. As a growing trend, the broader intent of incorporating technology is the use of operational data to support analytics and fact-based decision-making. 
However, the increasing complexity of core and supplementary processes, coupled with the limited agility of legacy and monolithic IT systems, adds a constant challenge to continuous process improvement. The agility of business processes and operations depends on the ability to capture real-time data and perform large-scale analyses to generate actionable insights on demand and help steer and nudge key metrics and key performance indicators (KPIs). One such technology framework with ever-growing adoption is Process Mining, especially due to its evolution from limited process-discovery-based applications to centralized platforms for integrated process automation. \n\n---\n\n Page: 3 / 13 \n\n---\n\n Process Mining uses detailed data from business processes
Process Mining is the practice of using data from various sources to analyze, baseline and improve business processes. The concept of Process Mining is built on the pillars of analysis techniques using artificial intelligence (AI) and machine learning (ML). It is an approach to analyze, optimize, and improve complex operational processes. Powered by event data logs and data science tools, Process Mining helps identify process variations and bottlenecks and gathers quantitative insights into process flows. It also helps address performance and compliance-related issues in processes. The following high-level steps are involved in a typical process mining lifecycle journey:
| Step | Description | Tools used |
|---|---|---|
| 1. Data collection | Collect data from various sources, such as event logs, databases, and operational data stores. | Data extraction tools, such as ETL tools, log parsers, or database connectors. |
| 2. Data pre-processing | Clean, filter, and normalize data to ensure consistency and accuracy. | Data cleaning and preparation tools, such as Python or R scripts. |
| 3. Process discovery | Create a process model based on the collected data. | Process Mining tools, such as Disco, ProM, or Celonis. |
| 4. Conformance checking | Compare the process model with the collected data to identify deviations, errors, or inefficiencies in the process. | Conformance checking tools, such as Disco, ProM, or Celonis. |
| 5. Process enhancement | Optimize the process model to improve efficiency, reduce costs, and enhance quality. | Process simulation and optimization tools, such as Arena, Simul8, or ProModel. |
| 6. Process monitoring | Continuously track and analyze process data to identify potential issues, bottlenecks, or opportunities for improvement. | Process monitoring tools, such as Celonis, Splunk, ELK, or Graylog. |
| 7. Process visualization | Create graphical representations of the process model and process data to help stakeholders understand the process and identify areas for improvement. | Data visualization tools, such as Celonis, Tableau, Power BI, or QlikView. |
\n\n---\n\n Page: 4 / 13 \n\n---\n\n Evolution of Process Mining
From being a niche technology used in research-oriented projects to a completely integrated cross-functional collaboration platform, Process Mining has evolved and matured for broad adoption. 
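The process discovery step in the lifecycle above is, at its core, the derivation of a model from an event log. A minimal sketch of the idea follows, using an invented purchase-order event log and a directly-follows graph, the basic structure underlying the discovery algorithms in tools such as Disco, ProM or Celonis:

```python
from collections import Counter, defaultdict

# Invented illustrative event log: (case_id, activity), ordered by timestamp per case.
event_log = [
    ("c1", "Create PO"), ("c1", "Approve PO"), ("c1", "Receive Goods"), ("c1", "Pay Invoice"),
    ("c2", "Create PO"), ("c2", "Approve PO"), ("c2", "Pay Invoice"),
    ("c3", "Create PO"), ("c3", "Receive Goods"), ("c3", "Pay Invoice"),
]

def directly_follows(log):
    """Count how often activity a is directly followed by activity b within a case."""
    traces = defaultdict(list)
    for case_id, activity in log:
        traces[case_id].append(activity)
    dfg = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

for (a, b), count in sorted(directly_follows(event_log).items()):
    print(f"{a} -> {b}: {count}")
```

Edge counts like these expose process variants immediately: here one case skips "Receive Goods" and one skips "Approve PO", the kind of deviation the conformance-checking step then flags against a reference model.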
Here is a summary of this evolution:
| | First Generation: Process discovery (1988 \u2013 2004) | Second Generation: Process conformance (2004 \u2013 2011) | Third Generation: User-centered process (2011 \u2013 2016) | Fourth Generation: Integrated process automation (2016 \u2013 Present) |
|---|---|---|---|---|
| Summary | Focus on discovery of process models from event logs | Introduction of conformance checking and process enhancements | Shift towards more user-centered and interactive approaches | Expansion of process mining beyond event logs to include other types of data and processes |
| | Use of process discovery algorithms and process tree visualizations | Integration of multiple perspectives and data sources | Focus on business process management and improvement | Integration of process mining with other technologies such as AI, IoT, and blockchain |
| | Limited support for large and complex processes | Focus on quality control, compliance, and audit trails | Integration of social, organizational, and environmental factors | Increased focus on automation, robotics, and digital transformation |
| | Challenges in handling noise, concurrency, and infrequent behavior | Use of data mining and machine learning techniques | Increased emphasis on big data, cloud computing, and distributed systems | Development of new techniques such as predictive process monitoring and prescriptive analytics |
| People | Primarily academic researchers and process experts | Involvement of business stakeholders and end-users in process mining projects | Involvement of a wider range of stakeholders including end-users, IT staff, and top management | Involvement of a wide range of stakeholders including business users, IT staff, data scientists, and process experts |
| | Minimal involvement of business stakeholders and end-users | Increasing emphasis on collaboration and communication | Greater emphasis on user needs and user experience | Greater emphasis on cross-functional collaboration and co-creation |
| Process | Emphasis on process modeling and analysis | Shift towards process improvement and optimization | Increased integration of process mining with business strategy and management practices | Integration of process mining with digital transformation and innovation initiatives |
| | Limited focus on process improvement and optimization | Greater attention to business objectives and value creation | Greater emphasis on continuous improvement and innovation | |
| Systems | Basic computing tools and algorithms | Development of more sophisticated algorithms and methods | Greater use of cloud computing, big data, and advanced analytics | Integration of AI, IoT, Blockchain, and Advanced Analytics |
| | Primarily desktop-based software | Increased use of enterprise-level software systems | Integration with other digital technologies such as social media and mobile devices | |
| | Use of event logs and basic data mining techniques | Integration of multiple data sources and formats | Greater use of process automation and robotic process automation (RPA) | |
\n\n---\n\n Page: 5 / 13 \n\n---\n\n Process Mining impacts multiple areas in Oil & Gas
The Oil and Gas industry is complex and dynamic, with significant data generated across all areas. For Oil and Gas companies, Process Mining can be particularly important because of the complex and highly regulated nature of their operations. Here are some specific ways in which Process Mining can benefit Oil and Gas companies:
\u2022 Operational efficiency: Process Mining can help identify inefficiencies in processes, such as bottlenecks or unnecessary steps, and suggest ways to streamline them. This can lead to cost savings and better use of resources.
\u2022 Regulatory compliance: Oil and Gas companies are subject to numerous regulations and standards, such as those related to environmental protection and worker safety. 
Process Mining can help ensure that these regulations are being followed and identify areas where improvements are needed.
\u2022 Operational safety: Safety is a top priority for Oil and Gas companies, and Process Mining can help identify potential hazards and risks. By analyzing data from sensors, equipment, and other sources, companies can identify patterns that may indicate an increased risk of accidents or equipment failure.
\u2022 Optimized maintenance: Process Mining can help companies optimize maintenance schedules by analyzing data from equipment and other sources to identify when maintenance is needed. This can help prevent unplanned downtime and reduce maintenance costs.
\u2022 Customer satisfaction: Oil and Gas companies may interact with customers in various ways, such as through fuel delivery or service stations. Process Mining can help companies understand how customers interact with their services and identify ways to improve the customer experience. \n\n---\n\n Page: 6 / 13 \n\n---\n\n Pre-requisites for Process Mining
Certain conditions need to be met before leveraging Process Mining effectively. These can broadly be grouped under People, Process and Technology.
People
\u2022 Business analysts with domain knowledge of the processes to be analyzed
\u2022 Data scientists with expertise in data analysis and statistical modelling
\u2022 IT professionals with expertise in data management and system integration
\u2022 Process owners or subject matter experts who can provide feedback on the accuracy and relevance of Process Mining results
\u2022 Change management experts who can help manage the organizational changes that may result from Process Mining initiatives
\u2022 Leadership support
Process
\u2022 Processes: Well-defined processes with documented workflows and procedures
\u2022 Data capture: Access to event logs or other data sources that capture process data
\u2022 Tech infrastructure: Availability of the required hardware or software infrastructure to support Process Mining activities
\u2022 Strategic objectives: Alignment with the organization\u2019s strategic objectives and goals
\u2022 Compliance: Compliance with legal and regulatory requirements, such as data privacy laws
\u2022 Agility: The organization\u2019s ability and culture to adopt new frameworks for continuous improvement \n\n---\n\n Page: 7 / 13 \n\n---\n\n System
\u2022 Access to data: Access to digitized processes and/or processes with event/case data and relevant data sources, including event logs, databases, and other data repositories
\u2022 Process data: Process Mining software to extract and analyze process data
\u2022 Data tools: Data cleaning, transformation, and normalization tools to prepare data for analysis
\u2022 Process model: Process modelling software to create process models
\u2022 Visualization: Visualization software to create dashboards and reports
\u2022 Investment: Continued investment in technology platforms and relevant features \n\n---\n\n Page: 8 / 13 \n\n---\n\n Case studies
The following case studies cite instances where Process Mining helped a US-based Oil and Gas major realize efficiencies and optimize resources using Celonis.
Approach
\u2022 Key AS-IS process flows for these processes were modeled in ARIS to begin with. This provided an understanding of the current pain points and areas of improvement.
\u2022 This process model was leveraged to identify the data availability in applications across each of the steps.
\u2022 The journey: a case was traced through all its states, and the data captured from the previous step was mapped against this to ensure data consistency.
\u2022 This data was imported into the Celonis Execution Management system to create a data model. 
\u2022 Based on this data model, multiple process analysis dashboards and components were created to track various metrics and KPIs across key dimensions such as time, vendors and locations.
Process Mining in upstream logistics \n\n---\n\n Page: 9 / 13 \n\n---\n\n
| Business/process area | Common challenges | Potential process mining gains | Value levers impacted |
|---|---|---|---|
| Standard enterprise processes (order-to-cash, procure-to-pay) | Manual interventions; form corrections; rate changes, data mismatches | Reduction of TAT; improved no-touch processing; automation | Operational efficiency; regulatory compliance; customer satisfaction |
| Supply chain management | High complexity of supply chain; visibility limited among all stakeholders; best practices not well-defined | Reduction of process lead time; cost reduction by removing bottlenecks; full transparency of process | Operational efficiency; optimized maintenance; customer satisfaction |
| Vessel schedule optimization | Multiple rigs covered by the same vessel; route planning done at the last minute | Effective route planning to reduce fuel costs and optimize time | Operational efficiency; operational safety; optimized maintenance |
| Helicopter schedule optimization | High cost due to over-utilization; unnoticed maintenance risks | Incorporate best practices for utilization; monitoring risks | Operational efficiency; regulatory compliance; operational safety; optimized maintenance |
| Warehouse management | Inefficient warehouse layout; lack of process automation; warehouse inventory inaccuracy; warehouse utilization inaccuracy | Root cause analysis for layout; forecasting data for inventory utilization and avoiding stock-outs; enhanced customer management | Operational efficiency; customer satisfaction |
| Fleet management | High fuel cost; under-utilized assets | Improved fleet efficiency and routing; KPI monitoring to improve utilization | Operational efficiency; customer satisfaction |
| Vendor management | Manual processes, poor automation; high rental costs and under-utilized equipment; end-to-end system integration not available | SLA improvement; contract visibility and optimization | Operational efficiency; customer satisfaction |
\n\n---\n\n Page: 10 / 13 \n\n---\n\n Reference industry use cases
Large integrated Oil & Gas major: One of the largest Oil and Gas companies in the world has been using Process Mining to improve the efficiency of its drilling operations. By analyzing data from drilling rigs, this company was able to identify inefficiencies and areas for improvement, such as reducing idle time and optimizing drilling parameters. As a result, the firm was able to reduce drilling time and costs while improving safety and environmental performance.
A European Oil & Gas company: This company used Process Mining to optimize its maintenance processes for offshore platforms. By analyzing maintenance data, the firm was able to identify patterns and trends which improved the reliability of its equipment, reduced downtime, and lowered maintenance costs. The company also used Process Mining to identify opportunities for process standardization and optimization, resulting in further improvements in efficiency and cost savings.
A large National Oil Corporation (NOC): This NOC used Process Mining to improve its customer service processes. By analyzing customer service data, the NOC was able to identify areas where it could improve its service levels, such as reducing response times and increasing the accuracy of billing. The company also used Process Mining to optimize its meter reading processes, resulting in significant cost savings. \n\n---\n\n Page: 11 / 13 \n\n---\n\n Process Mining encourages sustainable growth
Oil and Gas companies operate in a complex environment with multiple interconnected processes, making it challenging to identify inefficiencies and areas for improvement. 
Process Mining provides a valuable tool for these companies to gain insights into their operational processes by analyzing data from various sources. By applying Process Mining techniques, Oil and Gas companies can identify bottlenecks, reduce costs, improve efficiency, and enhance the quality of their products and services. The benefits of Process Mining include improved compliance, enhanced decision-making, and increased operational efficiency. Therefore, implementing this technology can help Oil and Gas companies stay competitive and achieve sustainable growth in an ever-changing industry. \n\n---\n\n Page: 12 / 13 \n\n---\n\n MEET THE EXPERTS
SACHIN PADHYE, Associate Partner, SURE, Sachin.Padhye@infosys.com
Sachin works with large Oil and Gas companies in the upstream, midstream, and downstream areas to frame their digital strategy across customer and employee experiences. He helps clients quantify value, beginning with industry opportunities and ending with decisions built on big data, analytical tools, visualizations and narratives. His current focus is digital data monetization, where he helps companies put a monetary value on the data used to execute their digital strategy.
NAVEEN KAMAKOTI, Principal, SURE, Venkata_Kamakoti@infosys.com
Naveen has over 18 years\u2019 experience in digital business transformation initiatives, focusing on process consulting, re-engineering and mining, as well as business architecture and consulting across information and professional services, plus Oil & Gas (upstream) domains. He leads the process consulting and transformation community of practice for Infosys Consulting.
SHRUTI JAYARAMAN, Senior Consultant, SURE, Shruti.Jayaraman@infosys.com
Shruti has four years\u2019 experience in business process improvement and digital transformation initiatives with a focus on process modelling, analysis, and mining. 
She has worked with upstream Oil & Gas clients across financial planning, process design and optimization, third-party hiring and government reporting areas for the last two years. She has delivered training in process modelling using ARIS and has worked with Agile methodologies.
SOHINI DE, Consultant, SURE, Sohini.De@infosys.com
Sohini has over four years\u2019 experience in process transformation initiatives focusing on business process improvement, process design, modeling and mining. She has two years\u2019 experience in the upstream energy industry in marine logistics and integrity inspection. She has conducted training on the ARIS Designer platform for process modeling and has hands-on experience working with Agile methodologies. \n\n---\n\n Page: 13 / 13 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng
About Infosys Consulting
Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts who lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2022 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. 
Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "}
{"text": "# Infosys Whitepaper \nTitle: Trends in Performance Testing and Engineering \u2013 Perform or Perish \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n WHITE PAPER TRENDS IN PERFORMANCE TESTING AND ENGINEERING \u2013 PERFORM OR PERISH Hemalatha Murugesan \n\n---\n\n Page: 2 / 4 \n\n---\n\n One is spoilt for choice in today\u2019s consumerist and materialistic world, leaving end users highly excited, vulnerable and extremely demanding. The array and diversity of choices is not limited to technology, gadgets, smartphones, wearable devices, sports vehicles, FMCG goods, white goods, tourism, food, etc., but extends into every single aspect of day-to-day life. In today\u2019s world, no business can survive if its product, service or training is taken to market without being online \u2013 aka digitization and mobilization. Stepping back two decades, one wonders how business was done and reached various parts of the globe! With intense competition and the drive to extend their market footprint, organizations are launching multiple products and services catering to different user groups, age sectors and geographies, with customization, personalization or rather \u201cmood-based\u201d offerings coupled with analytics, user preferences, predictions, etc. 
Businesses and IT organizations are moving at a rapid pace to roll out launches using the latest cutting-edge technology, and migrating to newer technologies, with the objective of not only retaining existing customers but also adding to their base and becoming the market-dominant leader. As such, every application catering to diverse populations 24x7, 365 days a year must be Available, Scalable for future growth, Predictable and Reliable, lest end users lose their tolerance. What was earlier a norm of 8 seconds for a page to render has now reduced to less than 2 seconds, and with almost all launches going the \u201capp\u201d way, expected response times are in milliseconds. DevOps, Agile, SMAC, migration to cloud, VPN, Big Data, MongoDB, Cassandra \u2013 phew \u2013 the list is endless, with newer technologies and tools being launched by the day to address the ever-expanding technology landscape. The rush to absorb these technologies is also increasing, leaving application performance highly vulnerable. There is a significant change in the way Performance Testing and Engineering, including monitoring, is being performed; it is continuously evolving and becoming more complex. With increased digitization and mobilization being the norm, data analytics and testing at scale will play a major role in application performance to ensure a better customer experience. DevOps and Agile development will \u201cshift left\u201d, forcing Performance Testing and Engineering teams to make early assumptions on customer behaviors and needs, viable experiences and growth spikes, to run tests quickly and validate those assumptions. 
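One simple way such response-time expectations are operationalized in a shift-left pipeline is a percentile gate on measured latencies. The sketch below is illustrative only: `fetch_page` is a hypothetical stand-in for the operation under test, and the 2-second 95th-percentile budget is assumed from the expectations described above:

```python
import math
import random
import time

def fetch_page() -> float:
    """Hypothetical operation under test; returns elapsed seconds for one request."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.001, 0.005))  # stand-in for rendering a real page
    return time.perf_counter() - start

def p95(samples):
    """Nearest-rank 95th percentile of observed latencies."""
    ordered = sorted(samples)
    index = max(0, math.ceil(0.95 * len(ordered)) - 1)
    return ordered[index]

latencies = [fetch_page() for _ in range(100)]
budget_seconds = 2.0  # assumed sub-2-second page budget
assert p95(latencies) < budget_seconds, "95th percentile exceeds the response-time budget"
print(f"p95 = {p95(latencies) * 1000:.1f} ms")
```

In practice dedicated load-test tools generate the traffic and concurrency; the point here is only the gating logic, which can run on every build so regressions surface early rather than just before rollout.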
Early predictability, leveraging analytics on application performance, will change gears in the way we approach performance testing and engineering.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n Performance Driven Development (PDD), i.e., a PE focus right from requirements through production rollout to post-production monitoring, is one of the key trends observed, as it helps in early bottleneck identification and tuning. With DevOps adoption calling for a close handshake between Development and Operations teams to address real production volumetrics, PDD helps achieve this to a large extent. This in turn demands adoption of APM tools and methodologies.<|endoftext|>The focus has now shifted to early involvement of Performance Engineering in the application lifecycle \u2013 proactive early validation of PE, rather than leaving it to be addressed just prior to rollout, which was the reactive approach followed earlier. Due to this, the industry is seeing increasing launches of tools and processes supporting a proactive PE approach.<|endoftext|>Businesses are demanding launches at a faster pace with high Availability and Resiliency, yet no compromise on quality and security \u2013 all this at a lower TCO! Automation is key across the SDLC and widely prevalent across testing types. As such, every activity \u2013 NFR gathering, scripting, modeling, test environment setup, releases, configuration management, etc. \u2013 is getting automated, inclusive of performance testing and engineering activities through these phases as well. A Performance Engineering framework adapted to Agile/CI-CD methodology for web-based, thick-client, batch-job-based apps, etc. needs to be developed, suited to the ever-changing technology landscape.<|endoftext|>Financial institutions have been among the front runners in IT adoption, as they need to meet regulatory compliance requirements, inclusive of performance.
With multiple banking packages/products to choose from \u2013 trading platforms and investment management products like Appian Way, middle- and back-office products like Alteryx, Analytical Workbenches, etc. \u2013 clients are looking for standard benchmarks/baselines of these products and their impact on PE before rolling out a full-blown implementation. With almost all apps offering mobile channels to interact with their systems, there is an intense need to do PE at every stage, at every component and layer, and across all stacks. A few impactful trends are: Omni-channel retail customer experience \u2013 performance testing and engineering to ensure a consistent user experience across various touch points in application rewrite or new application development projects.<|endoftext|>Technology and infrastructure rationalization \u2013 mostly driven by cost optimization and compliance requirements; PT&E is done to ensure zero disruption to service and user experience during data center migration/consolidation, technology stack upgrades, or movement from on-premise to cloud.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 4 \n\n---\n\n Conclusion For any organization to survive in today\u2019s competitive world, it is important that its products/applications are Scalable, Predictable, and Available, exciting the user and thereby ensuring loyalty as well as conversion to business. However rich the application features are when functionally tested, if the application does not respond to the expectations of the user, it is only natural to lose the customer. With changing and demanding trends, it is important that Performance Testing and Engineering are considered at all layers and components for the successful use of the products launched.<|endoftext|>Hemalatha Murugesan currently heads Performance Testing and Engineering in IVS at Infosys.
She has been involved in setting up, pioneering, incubating, and evolving emerging testing services like cloud testing, test data management, infrastructure testing, virtualization, TEMS, and other upcoming specialized services at Infosys. Hemalatha has been instrumental in developing the Enterprise Performance Testing Solutions, which offers performance testing solutions and services, and has also set up the state-of-the-art performance testing lab, both at Infosys.<|endoftext|>Bulk customer data handling \u2013 Retail and institutional customers are given more control over their data. As a result, interfaces such as dashboards, search, profiles, and homepages are becoming more interactive and data-heavy. PT&E ensures the performance SLAs are within acceptable limits for all user interactions.<|endoftext|>Tackle integration challenges \u2013 PT&E has been carried out to deal with scalability and performance issues arising from enterprise and partner integration, middleware upgrades, etc.<|endoftext|>With intense pressure to reduce cost, banks are looking at embracing cloud, DC consolidation, and solutions around them. They are consolidating their LOBs and tools by encouraging COE/NFT factory setups to reduce cost. Banks are also moving to deploying software like SAP, PEGA, and Siebel due to its low maintenance cost and more predictable quality compared to home-grown solutions. Besides, PE for apps hosted in cloud and virtualized environments is also picking up due to the on-demand resource provisioning and sharable hardware infrastructure that minimize TCO. Performance simulation and engineering of a Day in the Life \u2013 for example, in a line of business \u2013 is done through end-to-end PT of disparate systems. An example workflow of a mortgage loan, mutual fund, credit rating process, etc.
is assessed for performance simulation.<|endoftext|>While the request has always been to have the exact production environment with the right volume for performance testing, the move is to Test Right with the Next Best \u2013 an environment whose performance on production can be predictably replayed. Hardware capacity planning and seamless integration into the PT/PE framework are needed, especially for new automated infrastructure spawned through Chef recipes, automated platform build-outs, and other large infrastructure-related programs owing to mergers and acquisitions, data center migration, etc.<|endoftext|>DB virtualization for PT&E also seems to be another emerging trend, though not implemented on a large scale today as compared to service virtualization. Service virtualization and Agile, component, or layered performance testing and engineering are also gaining prevalence, as there are so many components and interfaces in financial products, with production monitoring and capacity prediction modeling based on them. Another trend we are seeing in the retail space is applications built using microservices, Docker containers, etc., which require tweaking the monitoring and analysis approach. An interesting emerging trend is the hybrid datacenter approach, i.e., part of the system is hosted in the cloud while part of it is hosted in a permanent data center. This requires expanding the list of performance KPIs to cover all aspects of both DCs.<|endoftext|>An upcoming hot trend is front-end performance testing and engineering, due to RIA/Web 2.0 popularity and the need to provide the same personalized user experience across various media.<|endoftext|>\u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved.
Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Performance testing Internet of Things (IoT) \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n VIEW POINT PERFORMANCE TESTING INTERNET OF THINGS (IOT) - Yakub Reddy Gurijala Senior Technology Architect \n\n---\n\n Page: 2 / 4 \n\n---\n\n External Document \u00a9 2018 Infosys Limited The Internet of Things (IoT) is a network of connected systems, devices, and sensors, and this connectivity enables these objects to share data. It is a platform that allows managing the data and controlling the devices remotely as required.<|endoftext|>IoT has gained momentum in recent years due to Internet availability and the evolution of cloud and microservices. According to Gartner, IoT connected devices are growing at 30 percent year-on-year and there will be 20 billion devices connected by 2020 (more than the human population). IoT business is growing at 22 percent year-on-year and will reach US$3010 billion. People, government, and business will be hugely affected by IoT in the coming years, resulting in smart cities, smart homes, smart hospitals, and so on. IoT devices produce data continuously.
This data needs to be saved and analyzed for future decisions, and these decisions may be immediate or may be taken later using business intelligence (BI) analytics. IoT helps in improving operational performance and cost optimization. To achieve this, IoT systems must be built for high performance and scalability. To measure these two key attributes of an IoT application, it is important to understand the business value for which it is built. In addition, to measure performance, it is necessary to simulate real-world workload models, which can be created using business requirements, historic data and future growth requirements, types of devices, network conditions, usage patterns, and geographic spread. Application usage patterns are arrived at by analyzing the IoT application logs for peak hours and normal hours. Using these data points, different workload conditions (real-world load tests / simulations) can be created for peak usage, normal usage, future growth, and daylong / multiday simulations.<|endoftext|>IoT performance testing (PT) is a little different from traditional performance testing. The following table illustrates the differences between traditional PT and IoT PT.<|endoftext|>Because of these differences, IoT PT poses a lot of challenges to performance engineers.
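The log-driven workload modeling described above can be sketched in a few lines of Python. This is a minimal illustration, not Infosys\u2019s actual tooling: the hourly bucketing and the above-mean rule for classifying peak hours are assumptions chosen for simplicity.

```python
from collections import Counter
from datetime import datetime

def build_workload_model(log_timestamps):
    """Derive a simple workload model from request timestamps.

    Buckets requests per hour of day, then classifies each observed hour
    as 'peak' (above the mean hourly volume) or 'normal'.
    """
    per_hour = Counter(ts.hour for ts in log_timestamps)
    if not per_hour:
        return {"peak_hours": [], "normal_hours": [], "peak_rps": 0.0}
    mean = sum(per_hour.values()) / len(per_hour)
    peak_hours = sorted(h for h, n in per_hour.items() if n > mean)
    normal_hours = sorted(h for h in per_hour if h not in peak_hours)
    # Requests per second during the busiest hour, a seed value for a peak-load test
    peak_rps = max(per_hour.values()) / 3600.0
    return {"peak_hours": peak_hours, "normal_hours": normal_hours, "peak_rps": peak_rps}

# Synthetic log: a quiet hour (02:00) and a busy hour (14:00)
logs = [datetime(2018, 1, 1, 2, 0, s % 60) for s in range(100)]
logs += [datetime(2018, 1, 1, 14, 0, s % 60) for s in range(900)]
model = build_workload_model(logs)
```

A real model would also fold in future-growth multipliers, device mix, and geographic spread, as the text notes; those are omitted here.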
The sections below describe the different challenges posed by IoT applications and the Infosys solution elements for each of these challenges.<|endoftext|>Some of the key differences between traditional PT and IoT PT:

Key difference | Traditional PT | IoT PT
Simulation | Simulation of users | Simulation of devices / sensors
Scale | A few hundred to a few thousand users | A few thousand to a few million devices
Amount of data | Sends and receives a large amount of data per request | Sends and receives minimal data per request, but data is shared continuously at time intervals
Protocols | Uses standard protocols to communicate | Uses non-standard and new protocols to communicate
Requests / responses | In most cases, users create the requests and receive the responses | Generally, IoT devices create requests and receive responses, and also receive requests and provide responses
BI | Only a few applications have BI as part of testing | BI will be a part of IoT; performance needs to be measured by applying load on the IoT app

\n\n---\n\n Page: 3 / 4 \n\n---\n\n External Document \u00a9 2018 Infosys Limited Performance testing challenges Protocols and performance testing tool IoT does not have a standard protocol set to establish connectivity between the IoT application and devices. The IoT protocols used range across HTTP, AllJoyn, IoTivity, MQTT, CoAP, AMQP, and more. These protocols are still in the early phases of development, and different IoT solution vendors come up with specific protocol standards (sets). These protocols are continuously evolving with IoT applications. Since these are new technologies / protocols, current performance testing tools may or may not support them. Geographical spread and network conditions IoT devices / sensors are spread across the world and use different networks to connect to the IoT servers to send and receive data.
As part of performance testing, there is a need to simulate devices from different locations (to simulate latency) with the required network technologies like 2G, 3G, 4G, Bluetooth, etc.<|endoftext|>Load conditions It is necessary to load test the applications by simulating real-world conditions. These patterns are complex in nature, and it is extremely difficult to collect and predict the data. To recreate real-world load conditions, we may end up simulating millions of devices.<|endoftext|>Real-time decision making Some IoT implementations may require the data from a device to be processed at runtime, with the corresponding decision taken based on the data received. These decisions are generally notifications / requests to different devices / sensors or different systems, which perform a particular action. As part of testing, these notifications / requests need to be monitored for performance (the time taken to generate the notification / request from the data received by the IoT application). IoT application monitoring and BI processing Monitoring is essential for any application. It helps understand the system behavior under real-world conditions. For IoT applications, both the application and the backend BI systems need to be monitored. This helps understand data processing, both in terms of volume and accuracy. Infosys IoT PT solution Infosys created a comprehensive framework using JMeter to support all the needs of IoT PT. Protocols and performance testing tool Infosys selected JMeter as the performance test tool to conduct PT. JMeter already has support for most of the IoT protocols, like HTTP, CoAP, AMQP, MQTT, and Kafka. As IoT is an emerging area, new protocols are being developed over time. To on-board new protocols, Infosys has come up with a protocol framework using protocol SDKs and extending JMeter.
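To make concrete what simulated device traffic looks like \u2013 many devices, small payloads, fixed reporting intervals, as contrasted with bulky per-request web traffic in the table earlier \u2013 here is a stdlib-only Python sketch. The device IDs, payload fields, and reporting period are illustrative assumptions, not part of any real protocol driver.

```python
import json
import random

def telemetry_payload(device_id, t):
    """One small, periodic IoT message (contrast with a bulky web request)."""
    return json.dumps({
        "device": device_id,
        "ts": t,
        "temp_c": round(random.uniform(18.0, 28.0), 2),  # fake sensor reading
    })

def simulate_devices(n_devices, n_intervals, period_s=10):
    """Emit one reading per device per interval, like sensors reporting on a timer."""
    messages = []
    for interval in range(n_intervals):
        t = interval * period_s
        for d in range(n_devices):
            messages.append(telemetry_payload(f"dev-{d}", t))
    return messages

# 1000 devices reporting every 10 seconds for one minute
msgs = simulate_devices(n_devices=1000, n_intervals=6)
```

In a real test, each generated message would be handed to a protocol driver (e.g. an MQTT publish) rather than collected in a list; the point here is only the shape of the load.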
Using these JMeter extensions, scripts can be prepared to simulate new protocol requests and devices.<|endoftext|>Geographical spread and network conditions To simulate geographical spread, JMeter is integrated with cloud solutions like Amazon Web Services (AWS) to set up load generators across different geographies. Using AWS integration, JMeter is able to generate traffic from different locations of the world to the IoT application, to mimic the geographical spread and network latency. Infosys has an in-house IP-based solution, the Infosys Network Simulation tool (iNITS), to simulate the different network conditions required for any requests that use the transmission control protocol (TCP). We have integrated the iNITS solution with JMeter to simulate the different network conditions required by IoT PT.<|endoftext|>Load conditions To collect accurate real-world scenarios, Infosys developed different tools / frameworks like a non-functional requirements (NFR) questionnaire, workload modeling tools, and others. These tools / frameworks reduce the requirement-gathering effort and collect the information more accurately. To simulate millions of devices, JMeter is integrated with the cloud using automated scripts. These scripts create the required number of load generators in the cloud, set up JMeter, copy the scripts and test data, execute the tests, collect the results, shut down the load generators that were created, and process the results.<|endoftext|>Real-time decision making Notifications, which are sent to other devices / sensors / systems, need to be monitored using stubs / service virtualization technologies. IoT application logs are collected and analyzed for the processing time and response time of the real-time processing and decision-making scenarios under different load conditions.<|endoftext|>IoT application monitoring and BI processing Infosys created a predefined process / performance metrics collection to monitor the systems (web / app / database layers) deployed in the cloud and the data center.
These metrics are analyzed to uncover possible performance bottlenecks. If BI systems are built using batch jobs, then enough test data needs to be created using performance test scripts, and the batch jobs executed, to monitor the BI system. If real-time BI systems are implemented using hot channels, then the BI systems need to be monitored as part of different performance tests by generating different amounts of data per second / minute / hour. Using this approach, IoT applications are comprehensively monitored and performance results are benchmarked against different load conditions.<|endoftext|>IoT PT resources Infosys presently has 1200+ performance testing resources with experience in testing different types of applications, technologies, and tools, and more than 500 employees have working experience with JMeter. Infosys has dedicated resources trained on IoT performance test frameworks (JMeter, new protocols, network simulation, and IoT monitoring). These resources continuously explore opportunities to improve the framework, tools, and protocols supported.<|endoftext|>\n\n---\n\n Page: 4 / 4 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document.
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected Infosys IoT PT \u2013 Key features and benefits. Features: support for different communication protocols such as HTTP, REST over HTTP, MQTT, AMQP, CoAP, Kafka, and WebSockets; support for different network simulations for all types of protocols; a framework to onboard new protocols; cloud-based load generation, with automated scripts available to generate the load from the cloud.<|endoftext|>Benefits: as the solution is based on an open-source tool, there is no license cost for the performance test tool, only the cost of network simulation; no hardware is needed, as device simulation can be done from the cloud; quick onboarding of new protocols; faster time to market.<|endoftext|>Conclusion Infosys created a comprehensive solution for IoT performance testing, which covers the specific needs / demands of IoT. Currently, the solution supports all leading IoT protocols and network simulations. The Infosys IoT performance solution is very cost-effective when compared to any standard performance test tool.<|endoftext|>We have a dedicated workforce trained in IoT performance testing to support the growing demands of IoT PT. Using the Infosys IoT PT solution, clients can save 80 to 90 percent of tool cost and reduce go-to-market time by 20 percent.<|endoftext|>References http://www.gartner.com/newsroom/id/3165317 https://www.infosys.com/IT-services/validation-solutions/white-papers/Documents/successful-network-impact-testing.pdf \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: QA Strategy to Succeed in the Digital Age \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n VIEW POINT QA STRATEGY TO SUCCEED IN THE DIGITAL AGE \n\n---\n\n Page: 2 / 4 \n\n---\n\n \u201cDigital\u201d is the new buzzword for organizations. Across industries, organizations are at various stages of their digital transformation journeys.
For some, this might mean reimagining their entire businesses around digital technologies; for others, incorporating aspects of digital into their existing ways of working.<|endoftext|>Examples of this are abundant across industries. In retail, the margin between online and offline is blurring. Retailers are incorporating technologies such as augmented reality and beacons to provide interactive in-store experiences to users. Targeted campaigns are delivered to customers based on their proximity to a store. On one hand, customers have the option to order online and pick up in-store; on the other, they can pay in-store and have the purchase delivered at home. In the insurance industry, companies are enabling customers to purchase insurance anytime, anywhere, using pre-populated information from their Facebook profiles.<|endoftext|>What this means is that IT is now at the forefront of business transformation. Technology is the driver for business improvements and hence IT departments have a greater role to play in business success. Consequently, the stakes are higher than ever for IT to deliver better value, faster and more efficiently.<|endoftext|>However, the path to a successful digital transformation is fraught with multiple challenges, a key one being inherent to the most important aspect of digital transformations \u2013 the changing nature of customer interactions. While the digital revolution brings organizations newer models and channels of interaction with customers, the success of businesses is also becoming increasingly dependent on the quality of these interactions. The end customer experience is now the single most important factor in a business\u2019s success. Customers today are more demanding, and much more likely to switch loyalties if the customer experience is not up to their expectations.
Thus, the most important factor for the success of digital transformations is ensuring a superlative end customer experience through the quality assurance function.<|endoftext|>However, can traditional testing organizations that follow age-old ways provide quality assurance in the new scheme of things? To answer this, let us look at some of the imperatives of digital assurance.<|endoftext|>Focus on Customer Experience As discussed already, the nature of customer interactions has undergone a great transformation in the recent past. Businesses are increasingly engaging with customers through a multitude of channels such as web, mobile, and social media, in addition to the existing traditional channels. A single customer transaction can now span multiple online and offline channels. Hence, customer experience across each of these channels is important; but so is providing similar and seamless experiences across all channels, as well as maintaining consistency in messaging throughout. This also requires a change in the approach to quality assurance. QA needs to shift focus from traditional functional validation to customer experience validation across the digital landscape. This requires a 360\u00b0 view of quality, encompassing functional and non-functional aspects, and cutting across channels and technologies. The anytime-anywhere nature of customer transactions poses challenges in all aspects of testing. With the increase in online transactions, the usage of cloud infrastructure, the multitude of interconnected applications and devices, and the advent of big data analytics, there are newer challenges to application security and data privacy. Ensuring the security of applications from any breaches, along with adherence to security and data privacy guidelines, is essential for ensuring a good customer experience and business continuity. Comprehensive security assurance is thus a key component of digital assurance.
Application performance is another key determinant of success. Users are much more likely to uninstall an app or abandon an online transaction at the slightest reduction in application performance. Unlike traditional QA, performance evaluation needs to be incorporated at all stages of the application development lifecycle. Performance evaluation needs to be augmented with performance monitoring in production to ensure the availability of business-critical applications. Strategies for compatibility, usability, and accessibility testing should also be optimized to cover multiple customer touch points and technologies like desktop, mobile, and other connected devices. There is also an increasing focus on providing personalized experiences to customers. In addition to functional validation, personalized content validation across channels, and validation of digital content and assets, also need to be incorporated. Another aspect of the digital world is the constant customer feedback and inputs, which have become important drivers for business decisions. Companies are co-creating products with customers, or using customer inputs to improve existing products. This is also now extending to using customer inputs to improve IT platforms and services. In this constantly evolving landscape, a continuous feedback mechanism is also important for QA organizations to understand end customer requirements and preempt customer issues. End customer feedback, learnings from production, and findings from previous testing cycles can all serve as inputs to continuously improve testing effectiveness and efficiency, and provide a truly 360\u00b0 view of application quality. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n Manage Complexity An important challenge that digital transformation brings about is the increasing complexity of the application landscape.
The IT landscape now needs to support multiple newer applications built on disparate technologies. The interconnectedness of applications, as well as the requirement to test them on different device configurations, poses additional challenges for the QA teams. On one hand, assurance needs to be provided for all application layers, from the database to the UI, to isolate issues; on the other hand, end-to-end business process assurance encompassing multiple applications is equally crucial. The testing strategy should balance these requirements and provide optimal test coverage, ensuring early isolation of issues. A well-planned approach involving service virtualization, a judicious mix of automation tools, test data management, and an optimized testing scope should be implemented.<|endoftext|>Increase Agility With the digital revolution, newer technologies are being adopted at a much faster pace. Organizations are now piloting newer and sometimes unproven technologies in a bid to enhance their business. For QA teams to support this effectively, they have to be extremely nimble and quick to learn. Teams should be tuned in to technological changes, be able to innovate quickly, and come up with optimal solutions for new testing challenges. In general, development cycles are getting progressively shorter, with businesses vying to provide better features, faster. Development methodologies are moving to Agile and DevOps. Consequently, there is increasing pressure on QA teams to reduce turnaround time and deliver code to production. This has to be balanced with the requirement to support more and more devices and platforms, which needs a two-pronged approach: optimize testing requirements, and increase the speed of testing. With limited time to test, it is crucial to adopt methods to optimize testing requirements, so that the time is well spent on validating critical functionalities. While automation has been the key enabler to increase testing effectiveness, it should not be limited to test execution alone. It should also encompass the entire testing lifecycle \u2013 from requirements analysis to reporting. Efficiencies need to be built into the testing process by a combination of tools, accelerators, and reusable test artifacts. Early automation strategies can be deployed to ensure the availability of automated test scripts for system testing. To conclude, an assurance strategy in the digital world has to address the following: \u2022 Focus on customer experience, rather than functional validation \u2022 Provide 360\u00b0 assurance, encompassing different aspects of testing as well as end-to-end validation \u2022 Focus on continuous learning and innovation \u2022 Continuously optimize and accelerate testing<|endoftext|>Thus, the need of the hour is a holistic assurance strategy encompassing all aspects of validation, which is also optimized for the changing application landscape.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 4 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document.
Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Quantifying Customer Experience for Quality Assurance in the Digital Era \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER QUANTIFYING CUSTOMER EXPERIENCE FOR QUALITY ASSURANCE IN THE DIGITAL ERA Abstract Since the pandemic, the new normal demands increased digitalization across all industry sectors. Ensuring a top-class customer experience has become crucial for all digital customer interactions through multiple channels like web, mobile, chatbot, etc. Customer experience is an area in which neither the aesthetics nor the content can be compromised, as that would lead to severe negative business impact. This paper explains various automation strategies that can enable QA teams to provide a unified experience to end customers across multiple channels. The focus is to identify the key attributes of customer experience and suggest metrics that can be used to measure its effectiveness.<|endoftext|>\n\n---\n\n Page: 2 / 8 \n\n---\n\n External Document \u00a9 2022 Infosys Limited Introduction Customer experience has always been a dynamic topic, as it is becoming more personalized day by day and varies according to individual preferences. It is hard to measure customer experience, which makes the work even more difficult for Quality Assurance teams.
The factors which amplify the customer experience include not only functional and visual factors like front-end aesthetics, user interface, user experience, etc., but also non-functional and social aspects like omnichannel engagements, social media presence, customer sentiments, accessibility, security, performance, etc.<|endoftext|>Enterprises encounter various challenges in providing a unified experience to their end customers across multiple channels, such as: \u2022 Lack of information, or mismatched information \u2022 Quality of content that is not up to standard \u2022 Lack of usability in cross-navigation to make it intuitive and self-guided \u2022 Inconsistent look and feel and functional flow across various channels \u2022 Improper content placement \u2022 Inappropriate format and alignment \u2022 Performance issues across local and global regions \u2022 Violation of security guidelines \u2022 Nonconformance to the Web Content Accessibility Guidelines (WCAG) \u2022 Lack of social media integration Why do we need to measure the Customer Experience? Quality Assurance is required in all these functional, non-functional, and social aspects of customer experience. Since customer experience is hyper-personalized in the digital era, a persona-based experience measurement is required.
Conventional Quality Assurance practices need to change to comprehensively evaluate all aspects of the customer journey across multiple channels. \n\nFigure 1 Challenges in Quality Assurance of Customer Experience: \n\u2022 Lack of a single view of the factors affecting customer experience \n\u2022 Traditional testing fails to adapt to real-time learning and lacks a feedback loop \n\u2022 Lack of a persona-based test strategy \n\u2022 Quantifiable CX measurements not available \n\u2022 Adapting the experience unique to each customer \n\u2022 Testing is inward-focused rather than customer-focused \n\u2022 Testing based on business/technical requirements, resulting in gaps against customers\u2019 expectations \n\u2022 Vast sea of social messages and user feedback data from social media platforms \n\n---\n\n Page: 3 / 8 \n\n---\n\n Experience Validation Needs to Cover Multiple Areas of a Customer Journey While organizations try to focus on enhancing the customer experience, there are various areas that need to be validated and remediated independently for functional, non-functional, and social aspects. The current testing trend covers the basic functional and statistical aspects, while emerging testing areas will cover behavioral aspects and focus more on a customer-centric approach, such as using AI to enhance the quality of the digital impression with personalized customizations.
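The persona-based measurement called for above can be prototyped as a simple test-matrix generator that crosses personas with CX check areas. This is a minimal sketch; the `Persona` fields, the persona names, and the check list are illustrative assumptions, not part of any Infosys tooling:

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Persona:
    """Illustrative customer persona used to parametrize CX checks."""
    name: str
    channel: str          # e.g. "web", "mobile", "chatbot"
    locale: str           # drives internationalization/localization checks
    assistive_tech: bool  # triggers WCAG / screen-reader checks

def build_test_matrix(personas, checks):
    """Cross every persona with every CX check area to get concrete cases."""
    return [(p, c) for p, c in product(personas, checks)]

personas = [
    Persona("commuter", "mobile", "en-US", False),
    Persona("low-vision user", "web", "en-GB", True),
]
checks = ["visual conformance", "content quality", "performance", "accessibility"]

matrix = build_test_matrix(personas, checks)
print(len(matrix))  # 2 personas x 4 checks = 8 cases
```

Crossing personas with check areas keeps coverage explicit: adding one persona or one area automatically widens the matrix instead of relying on ad-hoc test selection.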
The table below lists the areas where quality assurance is required, along with popular tools for automation. \n\n| Sr No | Area | Key Aspects / Metrics | Current Testing Trend | Emerging Testing Trend | Tools |\n| --- | --- | --- | --- | --- | --- |\n| 1 | Visual Conformance | Webpage content alignment, font size, font color, web links, images, audio files, video files, forms, tabular content, color scheme, font scheme, navigation buttons, theme, etc. | A/B testing, Style guide check, Font check, Color check, Usability testing, Readability testing | Persona-based testing | Siteimprove, Applitools, SortSite |\n| 2 | Content | Checking whether the images, videos, audio, text, tables, forms, links, etc. are up to the standards | A/B testing, Voice quality testing, Streaming media testing, Compatibility testing, Internationalization/Localization testing | Personalized UX testing, CSS3 animation testing, 2D illustrations, AI-powered translators | Siteimprove, SortSite |\n| 3 | Performance of webpage | Loading speed, Time to Title, DNS lookup speed, Requests per second, Conversion rate, Time to First Byte, Time to Interact, Error rate | Performance testing, Network testing, Cross-browser testing, Multiple-device testing, Multiple-OS testing | Performance engineering, AI in performance testing, Chaos engineering | GTMetrix, Pingdom Tool, Google Lighthouse, WebPageTest, etc. |\n| 4 | Security | Conformance with security standards across geographies: secured transactions, cyber security, biometric security, user account security | Application security testing, Cyber assurance, Biometric testing, Payment testing | Blockchain testing, Brain-Computer Interface (BCI) testing, Penetration testing, Facial recognition | Sucuri SiteCheck, Mozilla Observatory, Acunetix, Wapiti |\n| 5 | Usability | Navigation on website, visibility, readability, chatbot integrations, user interface | Usability testing, Readability testing, Eye tracking, Screen reader validation, Chatbot testing | AI-led design testing, Emotion tracking, Movement tracking | Hotjar, Google Analytics, Delighted, SurveyMonkey, UserZoom |\n| 6 | Web Accessibility | Conformance to web accessibility guidelines as per geography | Checking conformance to guidelines (Web Content Accessibility Guidelines (WCAG), Disability Discrimination Act (DDA), etc.) | Persona-based accessibility testing | Level Access, AXE, Siteimprove, SortSite |\n| 7 | Customer Analytics | Net Promoter Score, Customer Effort Score, Customer Satisfaction, Customer Lifetime Value, Customer Churn Rate, Average Resolution Time, Conversion Rate, Percentage of new sessions, Pages per session | Sentiment analytics, Crowd testing, Real-time analytics, Social media analytics, IoT testing | AR/VR testing, Immersive testing | Sprout Social, Buffer, Google Analytics, Hootsuite |\n| 8 | Social Media Integration | Click-through rate, measuring engagement, influence, brand awareness | Measuring social media engagement, Social media analytics | AR/VR testing, Advertising playbook, Streaming data validation | Sprout Social, Buffer, Google Analytics, etc. |\n\nTable 1 Holistic Customer Experience Validation and Trends \n\n---\n\n Page: 4 / 8 \n\n---\n\n Emerging Trends in Customer Experience Validation Below are a few of the emerging trends that can help enhance the customer experience. QA teams can use quantifiable attributes to understand exactly where their focus is required.
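Several of the page-performance metrics named in Table 1 (Time to First Byte, error rate) can be computed directly from ordinary request logs. A minimal sketch, assuming an illustrative log-record shape (`ttfb_ms`, `load_ms`, `status` are made-up field names, not from any specific tool):

```python
from statistics import mean

def performance_summary(requests):
    """Summarize page-performance metrics of the kind listed in Table 1.

    Each record is a dict with illustrative fields:
    ttfb_ms (time to first byte), load_ms (full page load), status (HTTP code).
    """
    errors = [r for r in requests if r["status"] >= 400]
    return {
        "avg_ttfb_ms": mean(r["ttfb_ms"] for r in requests),
        "avg_load_ms": mean(r["load_ms"] for r in requests),
        "error_rate": len(errors) / len(requests),
    }

sample = [
    {"ttfb_ms": 120, "load_ms": 900, "status": 200},
    {"ttfb_ms": 180, "load_ms": 1400, "status": 200},
    {"ttfb_ms": 150, "load_ms": 1100, "status": 500},
]
summary = performance_summary(sample)
print(summary)  # one failed request out of three drives the error rate
```

In practice the same summary would be fed by a tool such as Lighthouse or WebPageTest rather than hand-built logs; the point is only that each metric in the table reduces to a simple aggregate.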
Telemetry Analysis using AI/ML in Customer Experience Telemetry data collected from various sources can be utilized to analyze the customer experience and implement the appropriate corrective action. These sources could be social media feeds, the various testing tools mentioned in Table 1, web pages, etc. Analytics is normally done through custom-built accelerators using AI/ML techniques. Some of the common analytics are listed below: \n\u2022 Sentiment Analytics: The sentiment of a message is analyzed as positive, negative, or neutral \n\u2022 Intent Analytics: Identifies the intent as marketing, query, opinion, etc. \n\u2022 Contextual Semantic Search (CSS): An intelligent search algorithm that filters messages into a given concept. Unlike keyword-based search, here the search is done on a dump of social media messages for a concept (e.g., Price, Quality, etc.) using AI techniques. \n\u2022 Multilingual Sentiment Analytics: Analyzes sentiment across languages \n\u2022 Text Analytics, Text Cleansing, Clustering: Extracting meaning from text through language identification, sentence breaking, sentence clustering, etc. \n\u2022 Response Tag Analysis: Filters pricing, performance, and support issues \n\u2022 Named Entity Recognition (NER): Identifies who is saying what in social media posts and classifies it \n\u2022 Feature Extraction from Text: Transforms text using bag-of-words and bag-of-n-grams \n\u2022 Classification Algorithms: Classification algorithms assign tags and create categories according to the content.
It has broad applications such as sentiment analysis, topic labeling, spam detection, and intent detection. \n\u2022 Image Analytics: Identifies the context of an image, categorizes images, and sorts them according to gender, age, facial expression, objects, actions, scenes, topic, and sentiment. \n\nComputer Vision Computer Vision helps derive meaningful information from images, objects, and videos. With the hyper-personalization of customer experience, we need an intelligent and integrated customer experience that can be personalized for each person. While AI plays an important role in analyzing the data and recommending corrective actions, Computer Vision helps capture objects, facial expressions, etc., and image processing technology can be leveraged to interpret the customer response. \n\nChatbot A chatbot is artificial intelligence software that can simulate a conversation (or chat) with a user. Chatbots have become a very important mode of communication, and most enterprises use them for customer interactions, especially in the new normal scenario. Some of the metrics to measure customer experience using a chatbot are: \n1. Customer Satisfaction: This metric determines the efficiency and effectiveness of the chatbot. Questions that can be included are: \n\u2022 Was the chatbot able to understand the customer\u2019s query? \n\u2022 Was the response relevant to the specific query? \n\u2022 Was the query transferred to a specific agent in case of non-resolution? \n2. Activity Volume: How frequently is the chatbot used? Is usage increasing or decreasing? \n3. Completion Rates: This metric measures the time the customer took and the levels of questions asked. It also captures the instances when the customer opted to get resolution from an agent and left the chatbot.
This will help identify opportunities to improve the chatbot further by improving its comprehension and scripts and adding other functionalities. \n4. Reuse Rates: This metric provides insight into reuse of the chatbot by the same customer. It also enables a deeper dive into the results of the customer satisfaction metric, helps us understand the new-user vs. old-user usage ratio, and allows us to draw conclusions on the reusability and adoption of the chatbot by customers. \n5. Speech Analytics Feedback: Here, speech analytics can be used to examine customer interactions with service agents. Specific elements to note include the tone of the call, the customer\u2019s frustration level, the customer\u2019s knowledge level, ease of use, etc. \nMeasuring Tools Even though various tools are available from startups like BotAnalytics, BotCore, CharBase, Dashbot, etc., most QA teams measure chatbot performance parameters through AI/ML utilities. \n\n---\n\n Page: 5 / 8 \n\n---\n\n Alternative Reality Alternative Reality includes augmented reality (AR), virtual reality (VR), and mixed reality. AR adds value to the customer experience of an enterprise in many ways by providing an interactive environment, helping it stay ahead of its competitors. The data points used to measure it overlap with those of website and app metrics, with the addition of a few new points. Some additional metrics to measure customer experience in Alternative Reality are: \n1. Dwell Time: Total time spent on the platform; more time spent on the platform is the positive outcome. \n2. Engagement: Interaction with the platform; the more the engagement, the better the outcome. \n3. Recall: Ability to remember; a higher recall rate indicates proper attention and indicates the effectiveness of the platform. \n4. Sentiment: Reaction, classified as positive, negative, or neutral.
This will assist in understanding the sentiment. \n5. Hardware Used: Desktop, laptop, tablet, mobile, etc. \nMeasuring Tools There is not much automation yet in AR/VR experience validation; custom-built utilities using the Unity framework can be explored to measure the AR/VR experience. \n\nBrain-Computer Interface A brain-computer interface (BCI) is a system that measures activity of the central nervous system (CNS) and converts it into artificial output that replaces, restores, enhances, supplements, or improves natural CNS output, thereby changing the ongoing interactions between the CNS and its external or internal environment. BCI will help personalize the user experience by understanding the brain signals of a user. Metrics to measure customer experience in BCI: \n1. Speed: Speed of the user\u2019s reaction; the higher the speed, the greater the user\u2019s interest in the digital print. \n2. Intensity: The intensity of the user\u2019s reaction towards a digital presence helps in understanding the user\u2019s likes and dislikes. \n3. Reaction: Helps understand the different reactions to a digital interaction. \nMeasuring Tools Open-source tools like OpenEXP, Psychtoolbox, etc. can be leveraged to build custom utilities for measuring the above metrics. {{ img-description : a person is holding an ipad with location & shopping app in the view, in the style of photobashing, rounded shapes, light emerald and violet, blink-and-you-miss-it detail, precision, david brayne, sharp focus }} \n\n---\n\n Page: 6 / 8 \n\n---\n\n With multiple channels available to interact with end customers, companies are keen to ensure digital quality assurance in a faster and more continuous way. To reduce time to market, customer experience assurance should be automated with an ever greater infusion of AI and ML.
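The sentiment analytics described earlier classifies each message as positive, negative, or neutral. A minimal rule-based sketch of that pipeline shape is below; a real implementation would use trained AI/ML models rather than this illustrative keyword lexicon:

```python
from collections import Counter

# Illustrative lexicons only; production systems use trained ML classifiers.
POSITIVE = {"love", "great", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "bad", "crash"}

def sentiment(message: str) -> str:
    """Classify one message as positive/negative/neutral by lexicon overlap."""
    words = {w.strip(".,!?") for w in message.lower().split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

feed = [
    "Love the new checkout, really fast",
    "The app is slow and broken",
    "Delivery arrived on Tuesday",
]
summary = Counter(sentiment(m) for m in feed)
print(summary)  # one positive, one negative, one neutral message
```

The same loop shape extends to intent analytics or response-tag analysis: swap the scoring function, keep the aggregation over the social-media feed.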
Further, quality assurance should operate in an end-to-end manner, where the developer can ensure quality even before the application is passed to QA. With the adoption of DevSecOps, customer experience assurance should be an ongoing process that goes beyond the conventional QA phase. Some of the technical challenges in automation are: \n\u2022 Services offered by the company should provide a seamless experience across all distribution channels (Web, mobile, Doc, etc.) \n\u2022 Early assurance during development \n\u2022 Ensuring regulatory compliance \n\u2022 A collaboration environment for developers, testers, and auditors with proper governance \n\u2022 On-demand service availability \n\u2022 Automating remediation and Continuous Integration \n\u2022 Actionable insights \n\u2022 A scoring mechanism to benchmark \n\u2022 Integration with test and development tools \nThe above challenges call for a fully automated customer experience platform, as depicted below. \n\nAutomation in Customer Experience Assurance \nFigure 2 Automation approach for evaluating holistic customer experience \nAn automation approach should be comprehensive enough to provide a collaboration environment for testers, developers, auditors, and customers. It needs accelerators or external tools to measure and analyze the various aspects of customer experience. Cognitive analysis to ensure continuous improvement in customer experience is a key success factor for every enterprise. As shown in the figure, complete automation can never be achieved, as some assistive or manual verification is required; for example, the JAWS screen reader to test text-to-speech output.
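One challenge listed above is a scoring mechanism to benchmark. One plausible shape is a weighted composite of per-area sub-scores; the areas and weights below are illustrative assumptions, not an Infosys benchmark:

```python
# Areas and weights are illustrative assumptions (percentages, sum to 100).
WEIGHTS = {"visual": 20, "content": 20, "performance": 25,
           "accessibility": 20, "sentiment": 15}

def cx_score(subscores: dict) -> float:
    """Weighted composite CX score on a 0-100 scale from per-area sub-scores."""
    if set(subscores) != set(WEIGHTS):
        raise ValueError("a sub-score is required for every area")
    return sum(WEIGHTS[area] * s for area, s in subscores.items()) / 100

web_channel = {"visual": 90, "content": 80, "performance": 70,
               "accessibility": 85, "sentiment": 75}
print(cx_score(web_channel))  # 79.75
```

Computing one number per channel makes cross-channel comparison possible; the weights themselves would be calibrated against business priorities before any real benchmarking.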
Also, the platform needs integration capabilities with external tools for end-to-end test automation. \n{{ img-description : Figure 2 platform components: IDE plugins for shift-left remediation; online experience audit services; APIs and CI/CD plugins; accessibility analyzer; sentiment analytics; visual consistency checker; Google APIs; intelligent application crawler; cognitive analysis; dashboards & reports; cloud environments with multiple browsers & devices; pCloudy; Applitools; ALM; Jira; subscription & administration; scheduler; tool adapters; external IPs; accelerators; usability analyzer; legend: user touch points, platform components, accelerators/tools, others, manual assistive technologies }} \n\n---\n\n Page: 7 / 8 \n\n---\n\n Conclusion As the digital world moves towards personalization, QA teams should work on data analytics and focus on analyzing user behavior and activities, leveraging the various available testing tools. They should also focus on adopting new and emerging testing areas like AI-based testing, persona-based testing, immersive testing, 2D illustration testing, etc. These new testing areas can help identify the issues faced in providing the best customer experience, quantify the customer experience, and improve it. Since a considerable amount of time, money, and effort is put into QA, to ensure good ROI the QA team should treat customer experience as a persona-based experience and work on all the major aspects mentioned above.
QA teams should look beyond the normal hygiene followed for digital platforms, dig deeper, and adopt a customer-centric approach in order to make digital footprints suitable for the user in all aspects. {{ img-description : hands holding a phone with a star rating on it, in the style of interactive experiences, futurist claims, expert draftsmanship, meticulous attention to detail, precision and detail-oriented }} \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2022 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\nAbout the Author \nSaji V.S, Principal Technology Architect \n\nReferences \n1. Customer Experience Validation - Offerings | Infosys \n2. https://www.gartner.com/imagesrv/summits/docs/na/customer-360/C360_2011_brochure_FINAL.pdf \n3. 
The Future of CX 2022, a trends report by Freshworks \n\n\n***\n\n\n "} +{"text": "# Infosys POV \nTitle: Reinventing the CSP Product Lifecycle Management for Digital Ecosystems \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 16 \n\n---\n\n An Infosys Consulting Perspective By Sagar Roongta, Kiran Amin and Thiag Karunanithi Consulting@Infosys.com | InfosysConsultingInsights.com REINVENTING THE CSP PRODUCT LIFECYCLE MANAGEMENT FOR DIGITAL ECOSYSTEMS How to sustainably manage product portfolio complexity in the digital age? \n\n---\n\n Page: 2 / 16 \n\n---\n\n CONTENTS \n1. Introduction \n2. Unified PLM Framework \n3. Components of Unified PLM \n4. Recommendations for CSPs \n5. Infosys PLM Maturity Model \nReinventing the CSP Product Lifecycle Management for the Digital Ecosystems \u00a9 2023 Infosys Consulting \n\n---\n\n Page: 3 / 16 \n\n---\n\n Generating new sources of revenue & free cash flows are the top priorities for CSP CEOs in 2023 In our regular interactions with Communication Service Providers (CSPs) across Asia Pacific, increasing revenue growth is a key priority for executive leadership. The traditional revenue sources have diminished, while any price increase will prove to be an extremely sensitive subject for price-savvy customers. The COVID-19 pandemic allowed CSPs to emerge unscathed, or at least to re-think how they now commit substantial investments to expand their 5G networks and open additional revenue sources. As per Gartner\u2019s 2023 Board of Directors Survey, 46% of boards wanted to expand into new product lines to create new growth opportunities. In this environment, CSPs have expanded to occupy the role of digital ecosystem gateways, acting as marketplace operators where consumer and enterprise customers can buy bundled service offerings within an enclosed ecosystem of the CSP and its partners.
These ecosystem partners vary from digital content providers, gaming, fintech, financial services, cybersecurity, and insurance companies to health-tech companies seeking access to the digital-ready customer portfolio of CSPs. Through these mass-personalized offerings, CSPs could increase customer stickiness and reinforce core business goals. For the end-consumer, their CSP becomes not just a connectivity provider but a one-stop-shop for digital services. Consequently, as per a recent IDC report, one in three CSPs is expected to generate more than 15% of their overall revenue from new digital products and services, compared to one in six in 2020. 1,2 INTRODUCTION How should CSPs manage their product portfolios as they launch a plethora of new 5G, digital, and innovative offerings? \n\n---\n\n Page: 4 / 16 \n\n---\n\n As an ecosystem provider, CSPs would be operating in a dramatically different business model. Newer, open, and more complex offerings must be introduced to markets quickly, which requires more investment and collaboration, while the lifecycle of each product must be better controlled from financial and technological perspectives. However, the existing CSP product portfolio is already overly complex, and the product development process is highly bureaucratic as it currently functions on legacy systems, processes, and historical ways of working. Furthermore, mistakes or shortcomings perceived in the products or product designs reach the market because the company cannot react to market changes quickly enough.
The slowness of an end-to-end process means that companies are unable to bring their products to market in rhythm with customers\u2019 wishes, market changes, and set timetables, or to collect the greatest possible product margin. Subsequently, CSPs are forced to undergo expensive product rationalization exercises to cull redundant or non-profitable offerings. Hence, to succeed in the dynamic ecosystem era, CSPs would need to reimagine how they develop new products whilst innovating and managing their lifecycles. This requires a deep dive into their existing product lifecycle approach, moving from a piecemeal activity to a next-generation product lifecycle management approach. How, and at what level, each company conducts its product lifecycle management implementation depends on several factors that can be explained through the Unified PLM Framework. For sustainable digital ecosystem success, CSPs need to re-imagine their PLM activities as a strategic initiative Did You Know? According to Bain\u2019s Digital GPS Benchmark, more than half of respondents from the telecom industry cited automation of back-office operations like PLM as their top digital priority3 \n\n---\n\n Page: 5 / 16 \n\n---\n\n The Unified Product Lifecycle Management (Unified PLM) is a comprehensive approach to implementing product lifecycle strategy in a telecom organization. It integrates a comprehensive PLM strategy, a modular product architecture, an efficient PLM process design, and an enabling data & technology architecture that saves time, reduces product complexity, and excels in a multi-party environment.
It creates a framework to capture insights across lifecycle phases to rapidly create new offerings, while having an automated process to right-size unused offerings.4 It starts with setting up a PLM strategy and governance structures that define the stakeholders critical to implementing a product architecture relevant for the digital age, one that need not be complex but simplified and modularized. Subsequently, the enabling processes and technology are implemented to execute the product portfolio and lifecycle strategy. The Unified PLM framework has five key components. \nUnified PLM is a holistic framework for PMs to save time and reduce costs in a multi-party world \nUNIFIED PLM FRAMEWORK \nUnified PLM Framework components: \n\u2022 PLM Strategy: PLM maturity assessment; product portfolio & PLM alignment; PLM governance framework; incorporation of product variants \n\u2022 Organization & People: organizational structure; roles and responsibilities; skills & resources allocated; multi-party collaboration \n\u2022 Product Design: modular marketing product structure; product rules governance; modular process design; reusability of components \n\u2022 PLM Process Excellence: strict process stage gates; IT change & configuration management; process improvement initiatives; product retirement process \n\u2022 Data & Technology: process support systems; decision support systems; product data management and policies; process efficiency tools \n\n---\n\n Page: 6 / 16 \n\n---\n\n The PLM strategy is the foundational rail on which the organization embarks on transforming its PLM activities. It is an optimal alignment between the need for innovation and marketing priorities with a governance mechanism that effectively addresses the PLM priorities.
It allows for seamless synchronization of product development, market management, and retirement processes. It also sets the necessary governance and control mechanisms to detect and mitigate potential threats. However, CSPs should be wary of implementing a one-size-fits-all approach to all products in the portfolio; they should treat various products differently based on their operating model, product complexity, and lifecycle behavior. For example, a device bundled with mobile plans has a limited shelf-life, hence the PLM strategy should be agile enough to respond faster to any type of change, while enterprise products have a longer shelf-life to allow for market development and sales cycles. Hence, an effective PLM strategy creates a framework for managing distinct product variants within the strategic priorities. \nUnified PLM begins from strategy to create the rails on which people & products are organized \nCOMPONENTS OF UNIFIED PLM \nPLM Strategy sub-components: 1. PLM Governance 2. Alignment to Product Portfolio Strategy 3. PLM Process Variants \n\n---\n\n Page: 7 / 16 \n\n---\n\n PLM activities cut across different stakeholders such as product marketing, sales, technology, customer management, business intelligence, and finance teams. In an unstructured environment, these activities are conducted on an ad-hoc basis by individual functions, and consequently, conflicts between stakeholders are quite common. Various case studies have demonstrated that more conflicts lead to a higher probability of final product failure.
Hence, for sustainable PLM success, an organizational structure must be present where all the relevant departments and teams can efficiently collaborate and coordinate. CSPs have traditionally implemented a divisional structure or a matrix structure where group managers from distinct functions collaborate to develop new products. However, with the number of parties spreading across multiple organizations, new organizational models must be considered. New structures include offering-management squads or agile product teams that independently execute the PLM strategy within their organizations. These squads are accountable for the entire lifecycle of an offering, including validating the need in the marketplace and conducting the impact analysis on engineering, sales, support, and budgeting. In addition, these squads have the mandate to work across business units and disciplines to harness the company\u2019s entire arsenal of talent and knowledge base. These structures can be customized based on process variants and the strategic necessities of the company. \nOrganization & People sub-components: 1. PLM Organization 2. Responsibility Assignment 3. Employee Empowerment \n\n---\n\n Page: 8 / 16 \n\n---\n\n What is modularization? Modularization is the activity of dividing a product or process into logical and interchangeable modules.
The objective is to create a flexible system that enables the creation of different configurations while reducing the need to create unique building blocks each time.6 The benefits of a modular system are numerous: \n\u2022 Higher efficiency, as modules can be consolidated across different products \n\u2022 Higher agility, as changes and modifications can be isolated to specific modules, enabling others to remain unchanged \n\u2022 Higher flexibility, as mass customization of individual modules can be achieved at scale \nThe product design component aims to enable product component reusability by defining the constraints and rules for decomposing the product functionality into meaningful modules with a coherent product data model. For CSPs, the product structure includes breaking down the product offer and reusing the individual service modules from a market, technical, and operational perspective. A modular market perspective includes cross-linkage between aspects like tariff plans, fees, market segments, etc. A technical perspective includes decomposing the product offering into individual product, service, and resource modules. Finally, an operational perspective includes reusing process modules such as fulfillment, assurance, and billing processes independent of the product type. Important advantages of a modular product design include faster time to market, more efficient development, better innovation ability, and lower operational costs: complex systems become easier to manage, parallel activities can operate independently, existing components can be reused, and faults can be localized faster. The product model defined as per the SID framework by TM Forum is a classic example of a modular product architecture.5 \nProduct Design sub-components: 1. Modular Marketing Product Design 2. Product Rules Governance 3. Modular Process Design \n{{ img-description : chart comparing the relative cost efficiency of customization, standardization, and modularization as the number of product variants offered to customers increases; modularization retains cost efficiency at high variant counts }} \n\n---\n\n Page: 9 / 16 \n\n---\n\n A typical PLM process includes six phases: ideation, design, development, go-to-market, sales, and finally, retirement. Most CSPs have a well-documented product lifecycle management process covering end-to-end activities; however, a process excellence framework is rarer, or if one does exist, it needs to be applied more effectively. PLM process excellence embeds continuous improvement in the PLM process, aligning with the strategic PLM goals. It includes unambiguously defining the relevant activities, their sequence, data requirements, and configuration. In addition, it defines the roles and responsibilities of the product organization throughout the value chain to ensure proper execution. The critical elements within PLM process excellence include the definition of stage gates that ensure the product satisfies the minimum criteria to move to the next lifecycle phase, standardization of the various process variants, and a regular retirement process.7 A regular, rules-based product retirement process is critical in making the product offerings more targeted and manageable, and eliminates the need for product rationalization exercises every few years.
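The rules-based retirement process described above can be sketched as a simple filter over portfolio metrics, with a human stage gate still approving the actual sunset. The field names and thresholds here are illustrative assumptions, not part of the Infosys framework:

```python
from dataclasses import dataclass

@dataclass
class Offer:
    """Illustrative product-offer record with lifecycle metrics."""
    name: str
    active_subscribers: int
    monthly_margin: float        # contribution margin
    months_since_last_sale: int

def retirement_candidates(portfolio, min_subs=100, min_margin=0.0, stale_months=6):
    """Flag offers that trip any retirement rule; a stage-gate review
    would still approve the actual sunset."""
    flagged = []
    for o in portfolio:
        if (o.active_subscribers < min_subs
                or o.monthly_margin < min_margin
                or o.months_since_last_sale >= stale_months):
            flagged.append(o.name)
    return flagged

portfolio = [
    Offer("5G Unlimited", 120_000, 1.2e6, 0),
    Offer("Legacy ISDN Add-on", 40, -2_000.0, 14),
    Offer("Gaming Bundle", 8_000, 50_000.0, 1),
]
print(retirement_candidates(portfolio))  # only the legacy add-on is flagged
```

Running such a filter on a schedule is what turns retirement into a regular process rather than a periodic rationalization exercise; the thresholds would be set by the PLM governance body.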
PLM Process Excellence sub-components (over the standard PLM process): 1. Continuous Process Improvement 2. Strict Process Stage Gates 3. IT Change & Configuration Management 4. Retirement Management \n\n---\n\n Page: 10 / 16 \n\n---\n\n The objective of the Data & Technology component is to provide frameworks that increase the efficiency of PLM process execution while making it more efficacious. Informed by the product design and PLM process components, it helps in implementing these components in a business environment. Product data management forms the backbone for managing and controlling the lifecycle of a product. It spans from the market research and business planning data to the subsequent product performance and eventual retirement justification. A well-constructed PDM framework enables all stakeholders to capture, communicate, and disseminate all heterogeneous data throughout the product lifecycle. It helps speed up product development, reduce errors, and increase the efficiency of resources. The technology component includes two key elements \u2013 process support systems and decision support systems. Process support systems include RPA or workflow management systems for automated process implementation, decision gate evaluation, process compliance, and automated triggers. Decision support systems include machine learning and AI-based tools that improve decision-making capabilities to identify new product opportunities and proactively retire products that are underperforming. \nData & Technology sub-components: 1. Process Support Systems 2. Decision Support Systems 3. Product Data Management \n\n---\n\n Page: 11 / 16 \n\n---\n\n More than ever, customers are expecting more for less.
To manage current and future needs, CSPs must understand their own core capabilities and have the expertise, agility, and flexibility to react to demand and ultimately remain relevant. Developing a strategy that balances internal barriers and constraints while integrating organizational goals and vision is by no means an easy task. External competition has further intensified the PLM playing field, making it essential for CSPs to understand the \u2018real\u2019 opportunity within \u2013 one that, if managed and executed well, will create value across the board.<|endoftext|>CSPs across the world have approached PLM initiatives in ways that vary across a broad spectrum, from ad hoc manual activities to the use of AI and machine learning algorithms that recommend product lifecycle actions. The effectiveness of these initiatives is naturally determined by internal as well as external factors. However, the implementation of any PLM project requires extensive change in intra- and inter-organizational processes, new types of skills and capabilities and, more than that, an organization-wide cultural and strategic transformation. Hence, any PLM initiative requires strategic commitment and resources.8 As a first step towards that initiative, a maturity model can help CSPs assess the as-is state while giving them a guided path to advance their PLM capabilities for the future. Evaluate your as-is PLM capabilities by determining your organization\u2019s PLM maturity. RECOMMENDATIONS FOR CSPs \n\n---\n\n Page: 12 / 16 \n\n---\n\n PLM Maturity Model Infosys Consulting has developed the PLM Maturity Model based on industry research, best practices and external trends to assess relative CSP performance on PLM. 
It is a scientific method to rate the performance of each individual component against best-in-class industry standards and identify opportunities for improvement. It considers not only existing PLM initiatives but also the relevant market trends, customer readiness and competitive profile to rate a CSP\u2019s maturity state.<|endoftext|>Infosys\u2019s PLM Maturity Model categorizes CSPs into five maturity levels: 1. Ad-hoc: Ad-hoc is the preliminary maturity state where there is no evidence of a PLM strategy and vision. This stage is often characterized by inconsistent processes, a monolithic product structure and the absence of an enabling process and technological framework. The lifecycle activities are often executed on a case-by-case basis by individual functions in the organization for a specific need. The PLM maturity model assesses relative performance to recommend the next course of action. INFOSYS PLM MATURITY MODEL [Figure: Infosys PLM Maturity Model] \n\n---\n\n Page: 13 / 16 \n\n---\n\n 2. Structured: In a structured state, a high-level PLM strategy and governance is present, which defines PLM objectives and participating stakeholders. This structured state incorporates a basic process and governance framework to increase efficiency, with a singular focus on reducing time to market. 3.<|endoftext|>Integrated: An integrated maturity state increases the PLM coverage, with different product variants and parties included in the PLM activities. Considering disparate product variants, integrated-state CSPs are able to modularize their product architecture so that it is scalable across different product offerings and operating models.<|endoftext|>4. Automated: In an automated maturity state, CSPs incorporate multiple systems and tools to automate their PLM implementation. 
This includes the use of a centralized product data management capability and the use of process support systems to track and manage the product lifecycle stage. 5. Adaptive: CSPs in the adaptive maturity state increasingly use AI and machine learning algorithms to analyze and predict the performance of the marketed products. AI-based tools help CSPs analyze customer behavior to recommend product offering ideas and process automation opportunities for better PLM outcomes. Common Pitfalls On their path to achieving PLM transformation, CSPs need to avoid certain common pitfalls: \u2022 Lack of executive commitment \u2013 PLM cannot be treated as a siloed initiative; its sponsorship must have the right organizational backing.<|endoftext|>\u2022 Fragmented governance \u2013 PLM is not defined, or is limited, with no real controls.<|endoftext|>\u2022 Limited process standardization across departments \u2013 making PLM a free-for-all rather than centralized and cohesive. \u2022 Lack of empowerment and resources allocated to the PLM organization to implement change. \u2022 Poor visibility of product data \u2013 needed to support decision-making and track product performance. \u2022 Inability to utilize new emerging technologies \u2013 use newer technologies such as advanced data analytics tools, AI and ML to support the PLM process. \u2022 Proliferation of products \u2013 with the improvement in product go-to-market timelines, avoid creating superfluous new offerings. Although daunting, it is not an impossible task to move away from more traditional CSP behaviors; the key to progression is to first understand where you currently are and how you can move incrementally in the right direction.<|endoftext|>Hence, it is now very important for CSPs to consider a self-maturity assessment. 
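As an illustration only (the paper does not describe the actual scoring mechanics of the Infosys model), the five maturity levels can be thought of as an ordered scale, with a toy rule mapping per-component assessments to an overall level:

```python
# Hypothetical sketch: the five PLM maturity levels as an ordered scale.
# The weakest-link scoring rule is an invented illustration, not the
# Infosys PLM Maturity Model's actual method.

LEVELS = ["Ad-hoc", "Structured", "Integrated", "Automated", "Adaptive"]

def maturity_level(component_scores: dict[str, int]) -> str:
    """Map per-component scores (1-5) to an overall level."""
    # Treat the CSP as only as mature as its least mature PLM component.
    return LEVELS[min(component_scores.values()) - 1]

assessment = {"product_design": 3, "process_excellence": 2, "data_technology": 4}
```

Here `maturity_level(assessment)` reports the "Structured" state, pointing at process excellence as the component holding the organization back.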
\n\n---\n\n Page: 14 / 16 \n\n---\n\n Take the Maturity Health Check NOW The Infosys Consulting maturity health check will allow CSPs to: \u2022 Evaluate PLM health and performance against best-in-class operationalized processes and technology implementations \u2022 Quickly identify and pinpoint the areas with the biggest improvement potential and business gain \u2022 Execute a plan for taking the current as-is state and \u201cupgrading\u201d to the next maturity state \u2022 Use a scientific approach to objectively assess performance against company-wide PLM objectives \u2022 Implement a PLM monitoring framework to assess the different components on a continual basis MEET THE EXPERTS AUTHORS KIRAN AMIN Senior Principal Singapore +65 9742 7657 Kiran.Amin@infosysconsulting.com SAGAR ROONGTA Consultant Singapore +65 8264 6036 Sagar.Roongta@infosysconsulting.com THIAG KARUNANITHI Associate Partner Australia +61 4064 2736 0 Thiag.Karunanithi@infosys.com \n\n---\n\n Page: 15 / 16 \n\n---\n\n 1. https://www.gartner.com/en/articles/see-the-key-findings-from-the-gartner-2023-board-of-directors-survey 2. https://www.idc.com/getdoc.jsp?containerId=prAP49619722 3. https://www.bain.com/insights/digital-transformation-what-matters-most-in-your-sector-interactive/ 4. https://www.researchgate.net/publication/204100092_Next_Generation_Telco_Product_Lifecycle_Management_-_How_to_Overcome_Complexity_in_Product_Management_by_Implementing_Best-Practice_PLM 5. https://www.tmforum.org/oda/information-systems/information-framework-sid/ 6. https://www.modularmanagement.com/blog/all-you-need-to-know-about-modularization 7. https://www.productfocus.com/product-management-resources/infographics/product-management-lifecycle/ 8. 
https://link.springer.com/article/10.1007/s00170-013-5529-1 REFERENCES 15 \n\n---\n\n Page: 16 / 16 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts who lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2023 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. 
\n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: The right approach to testing interoperability of healthcare APIs under FHIR \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n VIEW POINT THE RIGHT APPROACH TO TESTING INTEROPERABILITY OF HEALTHCARE APIs UNDER FHIR \n\n---\n\n Page: 2 / 8 \n\n---\n\n External Document \u00a9 2020 Infosys Limited Abstract Historically, the lack of mutual data exchange on patient health between entities in the healthcare industry has impaired the quality of patient care. This has led to poor health outcomes and increased costs for patients. The Trump administration\u2019s MyHealthEData initiative has the stated objective of placing the patient at the center of the US healthcare system. The initiative aims to promote interoperability of patient health data between entities using the latest technologies such as cloud and APIs.<|endoftext|>This paper outlines the test approach to ensure compliance with the CMS (Centers for Medicare and Medicaid Services) payor policy mandate effective July 2021. The paper explains 3 simple steps for interoperability of cloud/on-premises healthcare APIs between payors and providers as per Fast Healthcare Interoperability Resources (FHIR) guidelines using the Infosys FHIR testing solution. This solution has been built on the proven Infosys Interoperability Test Automation Framework that leverages open-source tools, cloud components, and test servers.<|endoftext|>\n\n---\n\n Page: 3 / 8 \n\n---\n\n Introduction All healthcare payors, providers, and stakeholders need to ensure that their systems are truly interoperable based on Fast Healthcare Interoperability Resources (FHIR) guidelines. Organizations are seeking to minimize the manual effort involved in ensuring data integrity and real-world testing as per these guidelines. 
The need of the hour is test accelerators that can seamlessly integrate with FHIR cloud/on-premises servers to run automated conformance and data validation tests.<|endoftext|>Fragmented data consolidation Currently, organizations have non-standard data consolidation architectures due to the nature of their source systems. To be CMS (Centers for Medicare and Medicaid Services) certified, organizations must implement a FHIR-compliant microservices-based architecture for API enablement, including authentication and security policies. This makes it more complex for testers to validate data accuracy and performance at the FHIR layer (cloud/on-premises) in addition to specification conformance validation.<|endoftext|>Any test strategy to achieve FHIR compliance must be carefully structured to mitigate all known challenges. To begin with, organizations must finalize their end-to-end API operating model, cloud/on-premises implementation strategy, consent management, and intermediate data aggregation strategy so as to understand the various stages that need FHIR testing. Key challenges Adoption and interpretation of FHIR standards FHIR standards have base rules, constraints, and other metadata that make them complex to interpret and to derive conformance scenarios from. FHIR-regulated payors and providers will be required to implement FHIR-compatible cloud/on-premises-based healthcare APIs considering different market segments like Medicare, Medicaid, and any other state-specific inclusions. They also need to enable various FHIR resources such as diagnostics, medication, care provision, billing, payments, and coverage to be accurate and accessible as per the standard SLAs for patients and providers.<|endoftext|>Real-world testing This is a gray area at this initial stage because of the lack of a SMART application (a user-facing application that connects to payors or providers for a patient\u2019s health records). 
This limits the process of simulating and validating complete end-to-end testing for FHIR conformance. A payor or provider must qualify as FHIR compliant both internally and externally through acceptance from consumers. To achieve acceptance, a payor needs to conduct multiple tests to cover data integrity, conformance, and consumer registration and consent as per FHIR guidelines. Tests need to be conducted between peer-to-peer servers, from client to server, and from a standalone server to a proven FHIR-compatible test framework.<|endoftext|>Infosys 3-step process for interoperability testing Infosys recommends these 3 simple steps to achieve interoperability of cloud/on-premises healthcare APIs between payors and providers as per FHIR guidelines. Step 1 \u2013 Functional/non-functional testing The primary focus areas of functional testing are: \u2022 Conformance (structure and behavior) validation based on FHIR specifications \u2022 Data validation of cloud/on-premises-based healthcare APIs through the original source or from intermediate data aggregation or virtualization. If an organization considers any data aggregation before FHIR mapping, then additional testing around data quality should be done to ensure source data is aggregated correctly at the FHIR layer. Non-functional testing such as performance and security testing is equally important given that APIs are exposed externally using OAuth 2.0 or OpenID Connect protocols.<|endoftext|>\n\n---\n\n Page: 4 / 8 \n\n---\n\n Step 2 \u2013 Regulatory compliance testing This validation should ensure that the payor or provider complies with the guidelines defined by CMS and FHIR. The primary focus of regulatory compliance testing is on features mandated by HealthIT standards, which include self-discoverability, the capability statement, authentication/authorization, FHIR conformance, and so on. 
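A minimal example of the kind of conformance check described above: per the FHIR specification, a server publishes its CapabilityStatement at the `[base]/metadata` endpoint, and a compliance test can assert that mandated resources are discoverable there. The required-resource set and the sample statement below are illustrative assumptions, not the full CMS rule:

```python
# Sketch of a CapabilityStatement conformance check. FHIR servers publish
# this document at GET [base]/metadata; the required-resource list here is
# an illustrative assumption.

def check_capability(stmt: dict, required: set[str]) -> list[str]:
    """Return conformance failures found in a server's CapabilityStatement."""
    failures = []
    if stmt.get("resourceType") != "CapabilityStatement":
        failures.append("metadata endpoint did not return a CapabilityStatement")
    # Collect the resource types the server advertises under rest[].resource[].
    supported = {
        res["type"]
        for rest in stmt.get("rest", [])
        for res in rest.get("resource", [])
    }
    for resource in sorted(required - supported):
        failures.append(f"required FHIR resource not supported: {resource}")
    return failures

# Example: a server advertising Patient and Coverage, but not
# ExplanationOfBenefit.
stmt = {
    "resourceType": "CapabilityStatement",
    "fhirVersion": "4.0.1",
    "rest": [{"mode": "server",
              "resource": [{"type": "Patient"}, {"type": "Coverage"}]}],
}
failures = check_capability(stmt, {"Patient", "Coverage", "ExplanationOfBenefit"})
```

In a live test, the statement would first be fetched with an HTTP GET against the server's metadata endpoint and then passed to the check.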
Every payor must undergo this testing so that they can pass the CMS certification within the given deadline.<|endoftext|>\n\n---\n\n Page: 5 / 8 \n\n---\n\n Step 3 \u2013 End-user beta testing There are several third-party healthcare applications (cloud/on-premises) under development by Apple, Amazon, Google, and other payors/vendors that are registered in their respective developer portals. Our strategy involves partnering with these vendors and using their beta applications to integrate with payor FHIR servers. This will provide payors early confidence in terms of usability and conformance with regulation.<|endoftext|>Figure 1 below depicts an end-to-end automated test-driven development flow of the Infosys Interoperability Test Automation Framework covering the 3-step testing procedure. The framework leverages the Infosys FHIR Testing Solution, open-source tools, cloud components, and test servers to achieve FHIR compliance seamlessly across various healthcare entities within a short time to market.<|endoftext|>Figure 1: Infosys Interoperability Test Automation Framework \n\n---\n\n Page: 6 / 8 \n\n---\n\n The road ahead Today, organizations are focused on getting their FHIR-compliant cloud/on-premises-based patient access API and provider directory API up and running. However, organizations must also think of other policies specified in the CMS final rules. These include payor-to-payor data sharing, improving the user experience for the beneficiary, and admission/discharge/transfer event notifications. With such requirements in mind, payors and vendors need to think about developing a scalable test automation framework that not only works between payors and consumers but also across payors. 
Infosys is enhancing the capabilities of its FHIR testing solution so that it is future-ready.<|endoftext|>\n\n---\n\n Page: 7 / 8 \n\n---\n\n Conclusion Establishing end-to-end testing for FHIR guidelines compliance across healthcare payors and third-party vendors comes with its own challenges and complexities. The way forward is defined by the final CMS interoperability rule. A solution that ensures fast and effective delivery of FHIR rules to consumers should focus on two key aspects: \u2022 Building configurable FHIR accelerators with the ability to support cloud/on-premises FHIR servers \u2022 Identifying FHIR-compliant consumers for continuous testing practices. The end goal is to build a test approach that can be scaled rapidly based on the rate of increase in FHIR adoption by payors, providers, and consumers.<|endoftext|>\n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2020 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. 
For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected References: https://www.cms.gov/Regulations-and-Guidance/Guidance/Interoperability/index https://inferno.healthit.gov/inferno/ https://www.aegis.net/touchstone.html https://projectcrucible.org/ https://www.logicahealth.org/solutions/fhir-sandbox/ https://fhir.cerner.com/smart/ https://apievangelist.com/2019/09/18/creating-a-postman-collection-for-the-fast-healthcare-interoperability-resources-fhir-specification/ https://hapifhir.io/hapi-fhir/docs/validation/introduction.html About the \nAuthor Amit Kumar Nanda, Group Project Manager, Infosys \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Test automation framework \u2013 how to choose the right one for digital transformation? \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n VIEW POINT TEST AUTOMATION FRAMEWORK \u2013 HOW TO CHOOSE THE RIGHT ONE FOR DIGITAL TRANSFORMATION? \n\n---\n\n Page: 2 / 4 \n\n---\n\n What is a test automation framework and what are its different types? A test automation framework is a combination of guidelines, coding standards, concepts, practices, processes, project hierarchies, reporting mechanisms, and test data to support automation testing. A tester follows these guidelines while automating applications to take advantage of various productivity benefits.<|endoftext|>Introduction Digitalization and the disruption caused by the adoption of digital technologies are rapidly changing the world. Speed matters a lot in all IT operations, and this calls for a paradigm shift in quality assurance (QA). Quality at high speed is the key focus in digital assurance, and organizations want to deliver quality products much faster than ever before. This is making QA teams bank on test automation. From the initial automation of regression tests, the industry is moving towards progressive automation and day one automation. 
At the same time, extreme automation and zero touch automation are the buzzwords in the QA world these days. Various advancements have evolved in the area of automation testing. However, it is critical that organizations choose the right automation framework, as this choice is a key factor in the success of automation. In this document, we will explore the different types of automation frameworks, and how to choose the right framework to help achieve the digital assurance goals of the organization.<|endoftext|>There are many types of test automation frameworks available in the market, and the most popular ones are listed here: Linear, Functional Decomposition/Modular, Data Driven, Keyword Driven, Hybrid, and BDD. Each of these frameworks has its own characteristics and features.<|endoftext|>Let us now examine some of the popular frameworks and understand their pros, cons, and usability recommendations: Keyword-driven framework In the keyword-driven framework, testers create various keywords and associate different actions or functions with each of these keywords. A function library contains the logic to read the keywords and call and perform the associated actions. Generally, test scenarios are written in Excel sheets, and the driver script reads the scenario and performs test execution. This is used in situations where the testers who create test scripts have less programming expertise, whereas framework creation is done by automation experts.<|endoftext|>Hybrid framework The hybrid automation framework is created by combining distinct features of two or more frameworks. This enhances the strengths of the different frameworks and mitigates their weaknesses. It is highly robust, flexible, and more maintainable. However, it requires strong technical expertise to design and maintain.<|endoftext|>Behavior-driven development framework The behavior-driven development (BDD) framework automates validations in a format that is easily readable and understandable by business analysts, developers, testers, etc. Such frameworks do not necessarily require the user to be acquainted with any programming language. There are different tools available for BDD, like Cucumber, JBehave, and more, which work along with other test automation tools. This framework is more suitable for applications using agile methodology and where user stories and early automation are required. It focuses on the behavior of the system rather than the implementation aspect of the system. The traceability between requirements and scripts is maintained throughout, and test scripts are easy for business users to understand.<|endoftext|>Pillars of the right framework for the digital era Automation can improve quality and lead to higher testing efficiency. Hence, it is important to plan it well and make the right choice of tools and frameworks. When test automation uses the right framework based on the context, it yields great benefits. Hence, it is worth understanding the key requirements of the framework before choosing the right one.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n Some key aspects of an automation framework to look for during the digital assurance journey are provided below: Extreme automation Digital transformation programs, big data, cloud, and mobility are changing the way testing is being done. Leaders in testing are moving towards extreme automation to achieve a faster time to market. Extreme automation is the key, and automating every part of the testing process instead of just regression is crucial now. A framework which is more scalable and facilitates lifecycle automation as well as broader test coverage is needed for digital assurance programs.<|endoftext|>Technology and tool agnostic approach The landscape of tools in QA is becoming wider day by day. 
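The keyword-driven pattern described above can be sketched in a few lines. The keyword names, the scenario rows, and the tiny function library below are invented for illustration; a real framework would typically read the scenario table from an Excel sheet and drive a UI automation tool:

```python
# Minimal keyword-driven runner: a function library maps keywords to
# actions, and a driver script walks a scenario table. All names are
# illustrative, not from any specific framework.

def open_app(state, target):
    state["screen"] = target            # pretend to launch the app

def enter_text(state, text):
    state.setdefault("input", []).append(text)

def verify_screen(state, expected):
    assert state["screen"] == expected, f"expected screen {expected}"

FUNCTION_LIBRARY = {
    "OpenApp": open_app,
    "EnterText": enter_text,
    "VerifyScreen": verify_screen,
}

# Scenario table as a tester would author it (normally an Excel sheet).
scenario = [
    ("OpenApp", "login"),
    ("EnterText", "alice"),
    ("VerifyScreen", "login"),
]

def run(scenario):
    """Driver script: read each row and dispatch to the keyword's action."""
    state = {}
    for keyword, argument in scenario:
        FUNCTION_LIBRARY[keyword](state, argument)
    return state

result = run(scenario)
```

The split of roles is visible in the sketch: testers only edit the scenario table, while automation experts maintain the function library and driver.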
There are too many tools and frameworks, which poses a lot of integration challenges. Hence, it is imperative to choose a framework which is technology and tool agnostic and supports various tools and technologies. The framework needs to address the enterprise-level automation strategy and goals instead of catering to just a single project\u2019s goals.<|endoftext|>Scriptless capabilities Automate the automation, and look out for scriptless automation avenues. Most software testers and business users find it challenging to learn programming languages such as Java, Visual Basic, etc. well enough to write the scripts that test automation demands. There are frameworks and accelerators available with user-friendly graphical user interfaces (GUI) which help to create automation scripts in a much easier way than having to know and write code in any specific programming language. Choosing a framework which helps to create a test script from a recorded script or based on the input from a spreadsheet will help in accelerating automation and reducing dependency on skilled resources.<|endoftext|>True shift-left attitude Digitalization and frequent releases call for day one automation. Gone are the days when the automation team would wait until the application is built and start automation activities thereafter. The need of the hour is to shift to the extreme left and start automation during the requirement gathering phase of the systems development life cycle (SDLC) itself. An automation framework with an exhaustive reusable library and support for BDD will help both business users and QA teams to start automation activities early in the life cycle.<|endoftext|>Omnichannel, mobility, and cloud features Organizations today are focusing more on digital assurance, but it is important to test real user behaviors and to test on multiple devices such as various mobile phones, tablets, and platforms. 
Hence, the chosen test automation framework needs to facilitate testing on multiple devices to ensure a uniform experience across devices. If the framework supports the reuse of the script used for online or desktop testing for mobile testing as well, with minimal rework, it will save significant effort. When addressing the multifaceted needs of mobile testing, conducting comprehensive testing across hundreds of different devices, brands, models, and operating system combinations is tedious. A framework that facilitates integration with cloud infrastructure will be an added advantage.<|endoftext|>\n\n---\n\n Page: 4 / 4 \n\n---\n\n Zero touch automation As DevOps is slowly taking over the IT landscape, it is vital to reduce the distance between development and deployment. Test scripts need to be executed in an unattended manner without requiring much manual intervention. Remote execution, parallel execution, zero touch execution, and execution from continuous integration tools like Jenkins and Hudson, when supported by the automation framework, will help a lot in managing multiple sprints and shorter cycles better. Seamless integration With a plethora of tools being used in the application development and testing landscape, it is important that the automation tool and the framework chosen facilitate integration with various tools. Hence, it is imperative that the chosen automation framework and tool integrate with the test management tools, defect tracking tools, build tools, analytics tools, and continuous integration tools in the landscape.<|endoftext|>User-friendly reporting Agile and DevOps have brought the business, development, and QA teams together. The ability to run a high volume of tests is of little use if the results of the tests are not easy to understand for the various stakeholders involved. 
The framework has to facilitate automatic generation of test execution reports and show the results in an easy-to-read format. Though most market tools provide a few reporting options, these are often neither self-explanatory nor adequate. Hence, a framework with good reporting capabilities such as HTML reports, a live execution dashboard, screenshots in case of failures, and video recording of the execution will be very helpful. An automation framework facilitating detailed test result reporting reduces the overall effort to a great extent.<|endoftext|>About the author Indumathi Devi, a project manager with Infosys, has 13+ years of experience in software testing. She has effectively executed a multitude of automation projects and designed and developed automation frameworks. Using her strong working knowledge of multiple test automation tools, including open source and commercial ones, Indu has worked with numerous clients in implementing robust test automation solutions.<|endoftext|>Conclusion No one size fits all. This perfectly holds true when it comes to framework selection. Since every project is unique, the challenges, duration, and tool choices may vary. Organizations seeking agility in their business processes need to onboard robust test automation solutions that ensure superior software quality. Successful test automation frameworks for digital assurance are the ones which support extreme automation, omnichannel testing, and zero touch execution of test scripts, and have some or all of the key aspects detailed above. We recommend that organizations select an automation framework that can lead to smarter automation, better overall results, productivity benefits, and cost efficiencies in the highly dynamic digital landscape.<|endoftext|>\u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. 
Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: The right testing strategy for AI systems - An Infosys viewpoint \nAuthor: Infosys Limited \nFormat: PDF 1.6 \n\n---\n\n Page: 1 / 8 \n\n---\n\n PERSPECTIVE THE RIGHT TESTING STRATEGY FOR AI SYSTEMS AN INFOSYS VIEWPOINT VENKATESH IYENGAR, AVP - Group Practice Engagement Manager, Infosys SUNDARESA SUBRAMANIAN G, Practice Engagement Manager, Infosys \n\n---\n\n Page: 2 / 8 \n\n---\n\n External Document \u00a9 2018 Infosys Limited Abstract Over the years, organizations have invested significantly in optimizing their testing processes to ensure continuous releases of high-quality software. When it comes to artificial intelligence, however, testing is more challenging owing to the complexity of AI. Thus, organizations need a different approach to test their AI frameworks and systems to ensure that these meet the desired goals. This paper examines some key failure points in AI frameworks. It also outlines how these failures can be avoided using four main use cases that are critical to ensuring a well-functioning AI system.<|endoftext|>The hierarchy of evolution of AI Introduction Experts in nearly every field are in a race to discover how to replicate brain functions \u2013 wholly or partially. In fact, by 2025, the value of the artificial intelligence (AI) market will surpass US $100 billion1. 
For corporate organizations, investments in AI are made with the goal of amplifying human potential, improving efficiency and optimizing processes. However, it is important to be aware that AI too is prone to error owing to its complexity. Let us first understand what makes AI systems different from traditional software systems: S.No Software systems AI systems 1 Features \u2013 Software is deterministic, i.e., it is pre-programmed to provide a specific output based on a given set of inputs Features \u2013 Artificial intelligence/machine learning (AI/ML) is non-deterministic, i.e., the algorithm can behave differently for different runs 2 Accuracy \u2013 The accuracy of software depends on the skill of the programmer, and software is deemed successful if it produces an output in accordance with its design Accuracy \u2013 The accuracy of AI algorithms depends on the training set and data inputs 3 Programming \u2013 All software functions are designed based on if-then and for loops to convert input data to output data Programming \u2013 Different input and output combinations are fed to the machine, based on which it learns and defines the function 4 Errors \u2013 When software encounters an error, remediation depends on human intelligence or a coded exit function Errors \u2013 AI systems have self-healing capabilities whereby they resume operations after handling exceptions/errors Fig 1: Hierarchy of AI\u2019s evolution and techniques Feedback \u2013 From sensors, devices, apps, and systems Visualization \u2013 Custom apps, connected devices, web, and bots Machine learning and analytics \u2013 Cognitive learning/algorithms Input data conditioning \u2013 Big data stores and data lakes Data sources \u2013 Dynamic or static sources like text, image, speech, sensor, video, and touch \n\n---\n\n Page: 3 / 8 \n\n---\n\n 
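The determinism contrast in row 1 of the comparison above can be illustrated with a toy sketch (illustrative code, not from the paper): a conventional function always maps the same input to the same output, while a "trained" model's result also depends on its random initialization, so different runs can yield different models.

```python
import random

def deterministic_square(x):
    # Traditional software: the same input always yields the same output
    return x * x

def train_toy_model(data, seed):
    # Toy "training" loop: the result depends on a random starting weight,
    # so different runs (different seeds) can converge to different models
    rng = random.Random(seed)
    weight = rng.uniform(0.5, 1.5)  # random initialization
    for x, y in data:               # one crude correction pass over the data
        weight += 0.1 * (y - weight * x) * x
    return weight

data = [(1.0, 2.0), (2.0, 4.0)]     # points on the line y = 2x
print(deterministic_square(3))      # always 9
print(train_toy_model(data, seed=0))  # differs from...
print(train_toy_model(data, seed=1))  # ...this run
```

This is why, as the table notes, AI accuracy must be judged against the training set and data inputs rather than against a fixed expected output.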
External Document \u00a9 2018 Infosys Limited The above figure shows the sequential stages of AI algorithms. While each stage is necessary for successful AI programs, there are some typical failure points that exist within each stage. These must be carefully identified using the right testing technique as shown in the table below: S.No Evolution stage in AI Typical failure points How can they be detected in testing 1 Data sources \u2013 Dynamic or static sources \u2022 Issues of correctness, completeness and appropriateness of source data quality and formatting \u2022 Variety and velocity of dynamic data resulting in errors \u2022 Heterogeneous data sources \u2022 Automated data quality checks \u2022 Ability to handle heterogeneous data during comparison \u2022 Data transformation testing \u2022 Sampling and aggregate strategies 2 Input data conditioning \u2013 Big data stores and data lakes \u2022 Incorrect data load rules and data duplicates \u2022 Data nodes partition failure \u2022 Truncated data and data drops \u2022 Data ingestion testing \u2022 Knowledge of development model and codes \u2022 Understanding data needed for testing \u2022 Ability to subset and create test data sets 3 ML and analytics \u2013 Cognitive learning/algorithms \u2022 Determining how data is split for training and testing \u2022 Out-of-sample errors like new behavior in previously unseen data sets \u2022 Failure to understand data relationships between entities and tables \u2022 Algorithm testing \u2022 System testing \u2022 Regression testing 4 Visualization \u2013 Custom apps, connected devices, web, and bots \u2022 Incorrectly coded rules in custom applications resulting in data issues \u2022 Formatting and data reconciliation issues between reports and the back-end \u2022 Communication failure in middleware systems/APIs resulting in disconnected data communication and visualization \u2022 API testing \u2022 End-to-end 
functional testing and automation \u2022 Testing of analytical models \u2022 Reconciliation with development models 5 Feedback \u2013 From sensors, devices, apps, and systems \u2022 Incorrectly coded rules in custom applications resulting in data issues \u2022 Propagation of false positives at the feedback stage resulting in incorrect predictions \u2022 Optical character recognition (OCR) testing \u2022 Speech, image and natural language processing (NLP) testing \u2022 RPA testing \u2022 Chatbot testing frameworks The right testing strategy for AI systems Given that there are several failure points, the test strategy for any AI system must be carefully structured to mitigate the risk of failure. To begin with, organizations must first understand the various stages in an AI framework as shown in Fig 1. With this understanding, they will be able to define a comprehensive test strategy with specific testing techniques across the entire framework. Here are four key AI use cases that must be tested to ensure proper AI system functioning: \u2022 Testing standalone cognitive features such as natural language processing (NLP), speech recognition, image recognition, and optical character recognition (OCR) \u2022 Testing AI platforms such as IBM Watson, Infosys NIA, Azure Machine Learning Studio, Microsoft Oxford, and Google DeepMind \u2022 Testing ML-based analytical models \u2022 Testing AI-powered solutions such as virtual assistants and robotic process automation (RPA) \n\n---\n\n Page: 4 / 8 \n\n---\n\n External Document \u00a9 2018 Infosys Limited Use case 1: Testing standalone cognitive features Natural language processing (NLP) \u2022 Test for \u2018precision\u2019 in keyword return, i.e., the fraction of relevant instances among the total instances retrieved \u2022 Test for \u2018recall\u2019, i.e., the fraction of relevant instances that are retrieved out of the total number of relevant instances available \u2022 Test for true positives (TPs), true negatives (TNs), 
false positives (FPs), and false negatives (FNs). Ensure that FPs and FNs are within the defined error/fallout range Speech recognition inputs \u2022 Conduct basic testing of the speech recognition software to see if the system recognizes speech inputs \u2022 Test for pattern recognition to determine if the system can identify when a unique phrase is repeated several times in a known accent and whether it can identify the same phrase when it is repeated in a different accent \u2022 Test deep learning, the ability to differentiate between \u2018New York\u2019 and \u2018Newark\u2019 \u2022 Test how speech translates to response. For example, a query of \u201cFind me a place I can drink coffee\u201d should not generate a response with coffee shops and driving directions. Instead, it should point to a public place or park where one can enjoy his/her coffee Image recognition \u2022 Test the image recognition algorithm through basic forms and features \u2022 Test supervised learning by distorting or blurring the image to determine the extent of recognition by the algorithm \u2022 Test pattern recognition by replacing cartoons with the real image like showing a real dog instead of a cartoon dog \u2022 Test deep learning using scenarios to see if the system can find a portion of an object in a larger image canvas and complete a specific action Optical character recognition \u2022 Test OCR and optical word recognition (OWR) basics by using character or word inputs for the system to recognize \u2022 Test supervised learning to see if the system can recognize characters or words from printed, written or cursive scripts \u2022 Test deep learning, i.e., whether the system can recognize characters or words from skewed, speckled or binarized (when color is converted to grayscale) documents \u2022 Test constrained outputs by introducing a new word in a document that already has a defined lexicon with permitted words Use case 2: Testing AI platforms Testing any platform that hosts 
an AI framework is complex. Typically, it follows many of the steps used during functional testing. Data source and conditioning testing \u2022 Verify the quality of data from various systems \u2013 data correctness, completeness and appropriateness along with format checks, data lineage checks and pattern analysis \u2022 Verify transformation rules and logic applied on raw data to get the desired output format. The testing methodology/automation framework should function irrespective of the nature of data \u2013 tables, flat files or big data \u2022 Verify that the output queries or programs provide the intended data output \u2022 Test for positive and negative scenarios Algorithm testing \u2022 Split input data for learning and for the algorithm \u2022 If the algorithm uses ambiguous datasets, i.e., the output for a single input is not known, the software should be tested by feeding a set of inputs and checking if the output is related. Such relationships must be soundly established to ensure that algorithms do not have defects \u2022 Check the cumulative accuracy of hits (TPs and TNs) over misses (FPs and FNs) API integration \u2022 Verify input request and response from each application programming interface (API) \u2022 Verify request response pairs \u2022 Test communication between components \u2013 input and response returned as well as response format and correctness \u2022 Conduct integration testing of API and algorithms and verify reconciliation/visualization of output System/regression testing \u2022 Conduct end-to-end implementation testing for specific use cases, i.e., provide an input, verify data ingestion and quality, test the algorithms, verify communication through the API layer, and reconcile the final output on the data visualization platform with expected output \u2022 Check for system security, i.e., static and dynamic security testing \u2022 Conduct user interface and regression testing of the systems \n\n---\n\n Page: 5 / 8 \n\n---\n\n 
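The precision, recall, and cumulative-accuracy checks described above for NLP and algorithm testing all reduce to simple ratios over the confusion counts. A minimal sketch (the counts below are hypothetical, for illustration only):

```python
def precision(tp, fp):
    # Fraction of retrieved (flagged) instances that are actually relevant
    return tp / (tp + fp)

def recall(tp, fn):
    # Fraction of all relevant instances that were successfully retrieved
    return tp / (tp + fn)

def cumulative_accuracy(tp, tn, fp, fn):
    # "Hits" (TPs and TNs) over all predictions, as described for algorithm testing
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical confusion counts from an NLP keyword-retrieval test run
tp, tn, fp, fn = 80, 90, 20, 10
print(precision(tp, fp))                    # 0.8
print(recall(tp, fn))                       # ~0.889
print(cumulative_accuracy(tp, tn, fp, fn))  # 0.85
```

In practice the tester would compute these per test run and assert that FPs and FNs stay within the defined error/fallout range.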
External Document \u00a9 2018 Infosys Limited Use case 3: Testing ML-based analytical models Organizations build analytical models for three main purposes, as shown in Fig 2. Fig 2: Types and purposes of analytical models The validation strategy used while testing the analytical model involves the following three steps: \u2022 Split the historical data into \u2018test\u2019 and \u2018train\u2019 datasets \u2022 Train and test the model based on the generated datasets \u2022 Report the accuracy of the model for the various generated scenarios \n\n---\n\n Page: 6 / 8 \n\n---\n\n Fig 3: Testing analytical models Use case 4: Testing of AI-powered solutions Chatbot testing framework \u2022 Test the chatbot framework using semantically equivalent sentences and create an automated library for this purpose \u2022 Maintain configurations of basic and advanced semantically equivalent sentences with formal and informal tones and complex words \u2022 Automate the end-to-end scenario (requesting the chatbot, getting a response and validating the response action against the accepted output) \u2022 Generate automated scripts in Python for execution RPA testing framework \u2022 Use open source automation or functional testing tools (Selenium, Sikuli, Robot Class, AutoIT) for multiple applications \u2022 Use flexible test scripts with the ability to switch between machine language programming (where required as an input to the robot) and high-level language for functional automation \u2022 Use a combination of pattern, text, voice, image, and optical character recognition testing techniques with functional automation for true end-to-end testing of applications While testing a model, it is critical to do the following to ensure success: \u2022 
Devise the right strategy to split and subset historical dataset using deep knowledge of development model and code to understand how it works on data \u2022 Model the end-to-end evaluation strategy to train and recreate model in test environments with associated components \u2022 Customize test automation to optimize testing throughput and predictability by leveraging customized solutions to split the dataset, evaluate the model and enable reporting \n\n---\n\n Page: 7 / 8 \n\n---\n\n External Document \u00a9 2018 Infosys Limited Conclusion AI frameworks typically follow 5 stages \u2013 learning from various data sources, input data conditioning, machine learning and analytics, visualization, and feedback. Each stage has specific failure points that can be identified using several techniques. Thus, when testing the AI systems, QA departments must clearly define the test strategy by considering the various challenges and failure points across all stages. Some of the important testing use cases to be considered are testing standalone cognitive features, AI platforms, ML-based analytical models, and AI- powered solutions. Such a comprehensive testing strategy will help organizations streamline their AI frameworks and minimize failures, thereby improving output quality and accuracy. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. 
Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} @@ -40,6 +44,9 @@ {"text": "# Infosys Whitepaper \nTitle: Test Factory Setup for SAP Applications \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n PERSPECTIVE TEST FACTORY SETUP FOR SAP APPLICATIONS - Barry Cooper-Brown, Diageo Chandur Ludhani and Sailesh Chandrasekaran, Infosys Abstract Emergence of IT enabled business growth is compelling organizations to give testing and testing related strategies the much needed importance. By now, it has been reaffirmed that the cost of a single development defect can snowball to many times the original cost, if not discovered until the QA phase of testing and eventually showing up in the Production Environment.<|endoftext|>However, when it comes to SAP there are unique testing challenges to deal with. The challenges of SAP testing will present you with both tradeoffs that need to be considered and the choices that need to be made about the kind of testing that is needed for your QA organization. The following point-of-view has been written based on the engagement between a major drinks manufacturer and Infosys, describing a successful approach of setting up a Test Factory to manage testing of SAP applications.<|endoftext|>\n\n---\n\n Page: 2 / 8 \n\n---\n\n Challenges in managing changes in SAP application which indirectly gets mapped to reaping benefits for all its stakeholders namely the customers, employees, shareholders, etc. Often these investments do not bear fruits and are in turn viewed as a cost. 
For example, a delay in readying the application for regulatory changes, could lead to serious consequences for the organisation in the region within which the changes were mandated. Another such example could be with encountering incidents in the live environment or facing downtime with particular applications leading to severe business disruption. Organisations are constantly on the lookout for innovative ways to help adapt quickly to these changes in the SAP business processes. Their ability to do so also facilitates: \u2022 Improvement of delivery confidence with every change deployed \u2022 Reduction in cost of every change implemented \u2022 Ability to contract the overall lead time required for such activities thereby allowing more frequent releases An organisation\u2019s inability to do so, leads to a host of challenges to the employees who interface with SAP for their day-to-day activities: \u2022 Long time for deploying the SAP changes, means more business application downtime \u2022 Leakage of defects to production\u2013 hampering their day to day operations \u2022 High number of defects detected during User Acceptance Testing (UAT) resulting in a delay of final application go-live. The question therefore is whether there is a single and efficient solution available to organisations in managing their SAP related QA/Testing operations, inexpensively and efficiently? The concept of Test Factory and the offering under the business tag of NEM (New Engagement Models) is gaining traction across the globe. 
The Test Factory, also known as a Managed Test Service or Testing Centre of Excellence, acts as an independent function in the SDLC, supplanting the existing set of processes with a more agile, efficient and repeatable one.<|endoftext|>In our next section, we explore the business needs of SAP enabled organisations to deploy a Test Factory model for their QA organisation.<|endoftext|>Today\u2019s unrelenting economic and financial pressures, coupled with far too many internal and external factors (beyond the control of any organisation\u2019s circle of influence), are compelling organisations to change and tailor their business processes accordingly. These changes, more often than not, necessitate a change in the supporting business applications themselves (for example, SAP, Oracle, etc.). Acquisition of new business, opening of new facilities, introduction of new business lines, consolidation of service lines, and regulatory changes such as changes in tax laws or reporting needs all have to be reflected in SAP, and incorporating these changes is fairly complex. Any update, whether major, moderate or minor, needs to be completely analysed from several dimensions, keeping in mind that these are all large business applications. The result is that organisations end up managing projects involving a high number of interrelated or moving parts. Further, SAP applications span geographies, often need heavy customisations to suit local requirements (government, language, etc.) and have multiple vendors running the applications. Every investment that an organisation makes in its IT systems is channelled towards ensuring a smooth running External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n Understanding the Business Need \u2022 Inconsistency in application test quality across teams and \u2022 Lack of usage of appropriate testing tools 2. 
Governance Challenges Organisations running SAP are continuously engaged in taking up large change programmes or rolling out applications to newer regions. A large change programme or rollout requires the QA function to constantly generate status reports and deal with various risks and issues.<|endoftext|>Absence of defined processes, metrics to track progress, risk management and inability to consolidate reports frequently, leads to a governance challenge with respect to managing decentralised QA teams. Another common problem encountered with decentralised QA teams is with the large amount of time consumed in assimilating and consolidating information for status reporting from various regional teams and resolution of risks and issues. Further, under the decentralised structure, teams lack adoption of uniform processes and hence there are bound to be differences in the content and structure of status reporting and the way risks are identified and dealt with. 3. High Cost of Testing / Maintenance Sound testing processes and deep business knowledge are pre-requisites to testing of SAP applications. We have also learnt earlier that a large amount of effort is spent in running the QA function for a SAP enabled organisation. Majority of organisations dedicate a large number of resources for testing of SAP releases. In addition to this, SAP testing involves testing some portions of business functionalities repeatedly and often decentralised teams lack the benefits associated with the reusability aspect. This can be cited as an additional reason for the inflated costs of SAP testing in a decentralised model.<|endoftext|>Project based QA teams primarily look at testing from a very narrow project point of view and often miss the holistic implication of the changes from a complete business landscape perspective. This leads to high efforts from the business users during UAT and large number of defects getting identified in the later stages of testing. 
In addition to this, the non-functional testing aspects such as Performance, Security etc., are also overlooked in the initial phases of testing leading to high amount of re-work and maintenance costs downstream.<|endoftext|>4. Lower Delivery Confidence and Higher Time-to-Market There is very low confidence on delivery of release considering the testing in the earlier phases is not really focussed on business knowledge. This results in a high percentage of defect identification in later stages of SDLC. In the absence of benchmark metrics, there is no opportunity to measure the test execution productivity, often leading to increased durations of testing cycles.<|endoftext|>While these may sound like age old problems and issues, these are indeed the common issues across organisations. These pitfalls are the reasons why organisations find themselves grappling with an expensive and a non-yielding QA function.<|endoftext|>Listed below are some of the common pitfalls encountered by SAP enabled organisations in running their QA functions - 1. Decentralised Testing This model of testing is usually prevalent in organisations which have undergone mergers or made acquisitions. Testing in such organisations is carried out using a decentralised model where no common testing processes and methodologies exist. Each Line of business (LOB) has its own processes and differences exist even within units of the same LOB. In most cases, testing is managed by the development team itself, being aligned directly to each project.<|endoftext|>This model does not provide clear delineation between the build and test functions. If there are any inefficiencies or delays in build, then the same is compensated for in the testing phase by either compressing the testing timelines or by moving forward with inadequate coverage of business scenarios. Individual teams often adopt the approach of testing with a self defined set of testing processes and scope limited to their project. 
The result is the lack of co-ordination when it comes to delivering together with other ongoing projects. This not only results in severe delays in the programme go-live, but also leads to a severe compromise of the quality and quantity of testing that is necessary.<|endoftext|>Most common limitations of this approach: \u2022 Lack of defined uniform testing processes and global governance based on metrics \u2022 Duplication of test effort \u2022 Non \u2013 conformance of testing timelines External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 8 \n\n---\n\n So, What is a Test Factory? Having looked at what is ailing SAP based organisations, setting up a Test Factory can be the most definitive solution available in the market currently. Test Factory is a centralised testing model that brings together people and infrastructure into a shared services function adopting standardising processes, effective usage of tools, high reuse and optimising resource utilisation in order to generate required benefits for the organisation.<|endoftext|>Let us broadly explore the solution that a Test Factory can provide \u2013 a. Test Factory acts as an independent function in the SDLC and resolves the very first ailment by having a clear delineation between the Build and Test functions of an organisation.<|endoftext|>b. Test Factory is setup as a centralised QA function which brings in uniform process adoption and an enterprise wide QA approach with easier governance. Test Factory is also setup with the attributes of a more agile, efficient and repeatable set of processes. 
Test Factory can also be operated in the new engagement model (NEM) format which helps in measuring the business value linked to the services offered, example; pricing is based on the work performed instead of traditional Time & Material models.<|endoftext|>Implementing a Test Factory The entire process of implementing a Test Factory involves 3 major phases \u2013 Solution Definition Solution Design Solution Implementation 3 Solution Definition Phase \u201cBuilding the Case for Organisational Buy- in\u201d One of the most essential starting points of the entire Test Factory setup involves assessing the existing organisational test processes, determining the maturity level of the processes and deriving the gaps observed. The solution definition phase involves arranging for one-on-one or group interview sessions with various stakeholders, in the existing ecosystem, and understanding the various pros and cons of the existing processes. Alternatively or additionally, a questionnaire pertaining to the respective areas of the stakeholders can be used to help document the same. For assessing the maturity, organisations are spoilt for choices with widely known Test Maturity models such as the TMMi, TMAP, TPI or the ITMM (Infosys Test Maturity Model).<|endoftext|>The ITMM is a well-blended model which builds upon the standard Test Maturity models and also adds further dimensions to its fabric in being able to evolve constantly to the changing business context. The assessment results show the current level of maturity of the organisation\u2019s processes. It is of utmost importance at this stage to bring together the leadership team of the organisation and showcase the various process improvements and benefits of moving the organisation to the higher levels of maturity. On the basis of the agreed level of maturity to be targeted, the ITMM model allows for a continuous improvement process to be imbibed into the organisation. 
Once an agreement is reached, a roadmap is devised on how the solution is to be designed and implemented. Solution Design Phase \u201cStructuring A Winning Solution\u201d The Solution design phase is a core component in the Test Factory setup process and involves designing processes on three dimensions of the ITMM model \u2013 \u2022 Test Engineering Dimension, covering the focus areas of Requirements Gathering, Test Strategy, Testing tools, Test Data and Environment \u2022 Test Management dimension, covering the focus areas of Estimation, Test Planning, Communications, Defect Management and Knowledge Management \u2022 Test Governance dimension, covering the focus areas of Test Methodology, Test policy, Organisational structure and Test Metrics The most important key areas to focus would be to design the processes for - \u2022 Test Methodology Defining various types of testing to External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 5 / 8 \n\n---\n\n be performed, estimation techniques, entry and exit criteria for each testing phase, testing environment set-up, operating model and various input and output artifacts \u2022 Test Governance Defining the governance structure and chalking out clear test roles and responsibilities \u2022 Test Factory Structure Defining the various communication paths within and outside the Test Factory \u2022 Metrics, KPIs and SLAs Defining the various testing related metrics and ensuring agreement on the various SLAs and KPIs for each role and processes in the Test Factory \u2022 Knowledge Management Framework Defining a centralised service to allow effortless and effective sharing of knowledge between teams, across knowledge assets In addition, teams may create, test data management processes, a catalogue of the testing services, non-functional test services methodology, Guidelines for various Testing tools and Testing Policies. 
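As an illustration of the kind of metric and SLA definitions mentioned above, a Test Factory might track defect leakage from System Integration Testing (SIT) into UAT against an agreed threshold. The metric name, counts, and 5% threshold below are assumptions for this sketch, not ITMM specifics:

```python
# Hypothetical sketch of a Test Factory KPI check; the metric definition
# and the 5% SLA threshold are illustrative assumptions, not ITMM specifics.

def defect_leakage(uat_defects, sit_defects):
    # Defects that escaped SIT into UAT, as a fraction
    # of all defects found across both phases
    total = sit_defects + uat_defects
    return uat_defects / total if total else 0.0

def meets_sla(leakage, threshold=0.05):
    # Example SLA: leakage into UAT must not exceed 5%
    return leakage <= threshold

leak = defect_leakage(uat_defects=3, sit_defects=97)
print(leak)             # 0.03
print(meets_sla(leak))  # True
```

Automating such checks per release gives the governance dimension the objective, metrics-based reporting the model calls for.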
The solution design phase is a highly collaborative process in which the design and delivery teams play equal roles. It involves both, fine tuning some of the current processes and completely revamping the rest. The implementation of the solution in the right manner, and with the right amount of calibration, can bring about bountiful benefits to the organisation in having a sound testing process.<|endoftext|>Solution Implementation Phase \u201cWalk the Talk\u201d Depending on the level of maturity that organisations choose to attain, this phase needs a good deal of time to be invested. The time taken to nurture the processes and imbibe them could range anywhere from six months up to two years, depending on the organisational buy-in and focus in implementing the same. Having a good amount of time on hand, organisations also have the option of choosing to implement the processes in either a staggered manner or with a big bang approach. In general, it is advisable that a staggered approach be chosen.<|endoftext|>In a staggered approach, the implementation team collaborates with the champion or manager of the Test factory prioritising the areas lacking basic maturity and identifying a pilot release in which the updated processes can be put to test. This allows the implementation team to lay out checkpoints where any anomalies can be corrected. At the end of the pilot implementation, a survey can be conducted with the stakeholders in determining the success and failures in the implementation. 
The lessons learnt at the pilot implementation stage are a crucial input to the next phase of implementation.<|endoftext|>It is important to look at some of the frequently encountered challenges associated with the solution implementation phase - \u2022 Aversion to change This often is a sticky issue with teams unwilling to adapt to new processes as it involves moving away from the comfort zone \u2022 Poorly adapted processes and communications This is an indication that the impacted teams are not aware of new processes and are not well trained \u2022 Handing over testing to Test Factory Traditional approach of testing by business users due to lack of business knowledge by testing team is one of the most challenging change management aspects to deal with It is therefore a task for both the implementation team and the Leadership team of the organisation in addressing these challenges / change management. The task of the implementation team lies in devising a thorough training plan for the various teams involved, designing user manuals and guidelines for any reference required to the new processes.<|endoftext|>On the other hand, the task of the leadership team is to put together a strong communication plan, listing the benefits that accrue to, both the impacted teams and the business benefit in adapting the changes. In certain situations, grievance redressal efforts and holding communication forums is a good way to engage with the teams.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 6 / 8 \n\n---\n\n Figure: Test Maturity Model External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 7 / 8 \n\n---\n\n Benefits of a Test Factory We have managed to explore the story of the Test Factory setup with the sound process framework forming the base, but the real icing on the cake is to see the benefits that accrue post setting up of the Test Factory.<|endoftext|>Test Factory brings about a set of both Qualitative and Quantitative benefits. 
Some of the qualitative, hard-hitting benefits include: \u2022 Ensures high levels of repeatability, predictability and test coverage \u2022 Delivers business-requirements-driven, end-to-end testing \u2013 this helps identify critical defects and requirement gaps that would typically surface only during the UAT phase \u2022 Defines key metrics that assist in tracking and enable effective governance at every stage of the programme \u2022 Builds a reusable set of artifacts and test designs, helping compress test planning and design timelines \u2022 Reduces the UAT phase, saving on delivery timelines \u2022 Builds understanding of core business processes and the core regression library \u2022 Uses tools effectively, enabling complete traceability from requirements to test cases and defects during the various phases of the project \u2022 Accelerates test automation, helping reduce cycle time and execution costs \u2022 Provides a platform for more frequent releases annually \u2013 this benefits the business and end users by moving away from the erstwhile lower release frequency, which meant long waits for important business/IT changes \u2022 Acts as a one-stop shop for various testing needs \u2013 performance testing, security testing, UAT support, etc.<|endoftext|>Quantitative benefits include: \u2022 An estimated cost saving of 50% owing to reuse of the test design artifacts \u2022 An expected 40% reduction in execution cost with automation of test build and execution \u2022 Near-zero defect leakage from system integration testing to UAT and go-live, ensuring faster time to market and less production downtime. Conclusion Setting up a Test Factory can give organisations a unique insight into how they can successfully tailor and reinvent the traditional onsite-offshore model to ensure effective and comprehensive application testing. 
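To make the arithmetic behind such claims concrete, here is a purely illustrative sketch. Only the 50% design-reuse and 40% execution-automation percentages come from the text above; the baseline budget split is a hypothetical assumption, not Infosys data.

```python
# Hypothetical baseline split of an annual testing budget (assumed figures,
# NOT from the whitepaper), in arbitrary currency units.
baseline = {"design": 40_000.0, "execution": 60_000.0}

# Savings quoted in the text: ~50% on design via artifact reuse,
# ~40% on execution via test automation.
savings = {"design": 0.50, "execution": 0.40}

# Cost per phase after applying the quoted savings.
after = {phase: cost * (1 - savings[phase]) for phase, cost in baseline.items()}
total_saved = sum(baseline.values()) - sum(after.values())
# With this particular split, the blended saving works out to 44% of the total.
```

The blended figure depends entirely on the assumed budget split; the point is only that the two quoted percentages apply to different cost pools and do not simply add up.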
In addition to the cost benefits, with the adoption of the Test Factory approach, organisations emerge with an integrated, comprehensive, SLA-driven QA organisation with tightly knit processes. One of the most promising advantages of a Test Factory setup is the ability to add further testing-related services without having to significantly tweak the underlying process framework. This helps the organisation quickly on-board and implement the testing skills and processes required to support new business initiatives oriented towards upcoming areas like cloud, mobility, social commerce, etc. The Test Factory model is a welcome addition to the portfolio of offerings from service firms and is definitely a force to reckon with in the foreseeable future.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 8 / 8 \n\n---\n\n About the \nAuthors Barry Cooper-Brown Test Manager, Diageo Barry is Test Manager with Diageo plc and is based in London. He has over 20 years of experience working with SAP across many disciplines, from Basis to programme delivery. He has worked for over 14 years within the FMCG sector and has specialised in the delivery of SAP projects and programmes. Currently Barry is the Test Factory Manager for Diageo, responsible for the implementation and running of a global test organisation across multiple regions and SAP solutions.<|endoftext|>Chandur Ludhani Principal Consultant, Infosys Chandur is a principal consultant with the Retail, CPG and Life Sciences unit of Infosys and has over 15 years of experience in ERP product development, testing, implementation and support. As part of his testing engagements, he has helped clients set up the processes for various types of testing services \u2013 manual testing, automation, etc. 
leading to Testing Centers of Excellence.<|endoftext|>Sailesh Chandrasekaran Senior Consultant, Infosys Sailesh is a senior consultant and has over 6 years of experience working for clients in the Retail, Banking and Financial Services industries. He helps clients in assessing the maturity of their test organizations, improving their testing processes and transforming them into Centers of Excellence.<|endoftext|>\u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: 5G testing holds the key to empower healthcare industry \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n WHITE PAPER 5G TESTING HOLDS THE KEY TO EMPOWER HEALTHCARE INDUSTRY Abstract COVID-19 has unleashed uncertainty on organizations, limiting their visibility and ability to strategize for the future. Technology, though, continues to evolve and has played a major role in helping deal with this crisis. When combined with AI and IoT, 5G becomes a potent technology across industries and domains, bringing unprecedented empowerment and superior customer service. This paper explores the impact of 5G on the healthcare industry. 
It also examines why 5G testing is important when supporting healthcare services and functions.<|endoftext|>\n\n---\n\n Page: 2 / 4 \n\n---\n\n External Document \u00a9 2020 Infosys Limited About 5G Mobile communication has evolved rapidly with changing technologies. 5G represents the latest generation of cellular mobile communication, characterized by ultra-reliable low latency communication (URLLC), enhanced mobile broadband (eMBB) and massive machine type communication (mMTC). These capabilities enable data to be transferred at very high speeds with extremely low latency. As the Internet of Things ecosystem widens, 5G has the ability to support the capture of enormous volumes of data as well as provide the computing power to process it across billions of devices. This will lead to superior customer experience with unprecedented insights and capabilities, resulting in a new digital world. Fig 1: Defining characteristics of 5G Testing 5G in healthcare 5G will become the backbone of telemedicine or remote healthcare in the future. Remote healthcare involves frequent but distant monitoring of patients using a range of devices that capture vital parameters. This data must be continuously transmitted to doctors for real-time monitoring and assistance. It also includes video consultations for diagnosis and precision medicine prescription, especially for people in care homes. On a larger scale, 5G can improve overall healthcare capabilities through innovations like robotic-assisted laser surgery, where doctors can use machines to perform complex procedures with greater precision and flexibility.<|endoftext|>Remote healthcare requires devices as well as apps that enable real-time patient monitoring. AR/VR can provide an immersive user experience while artificial intelligence and machine learning (AI/ML) provide descriptive, prescriptive and, more importantly, predictive diagnostics. Each of these technologies is interconnected. 
They often converge to create use cases that provide next-gen healthcare services, so it is critical that these technologies are tested and certified, and 5G testing in turn is important to ensure the network can support them. Let us examine some of the use cases of 5G testing in the healthcare industry. Use case 1: Healthcare devices Healthcare devices include wearables used by patients to monitor vital parameters like heart rate, speech, body mass index, facial expressions, and more. Real-time video consultations involve using video devices for consultations, remote-assisted surgeries and training. Some of the key considerations for testing these two types of devices are: \u2022 High reliability to ensure uninterrupted service \u2022 Minimal to zero latency for critical medical procedures like remote-assisted surgeries [Fig 1 \u2013 defining characteristics of 5G: eMBB (enhanced mobile broadband), mMTC (massive machine-type communication), URLLC (ultra-reliable low-latency communication); 5G drivers: 1 million devices per km2, 10 Gbps peak data rate, 1 ms latency, 10X battery life for low-power devices, one network for multiple industries, mobility at 500 km/h (high-speed railway), 99.999% reliable, 99.999% available] \n\n---\n\n Page: 3 / 4 \n\n---\n\n \u2022 Ensuring that sensors monitoring vital parameters send real-time alerts to the patient\u2019s/doctor\u2019s handheld devices \u2022 Interoperability testing for many devices from different vendors \u2022 Compatibility testing to ensure that the devices can integrate with applications that are built to support real-time service, high-speed performance and superior user experience Use case 2: Healthcare apps Medical apps can help deliver a range of healthcare services covering: \u2022 Mental health \u2013 These aid in tracking psychological/behavioral patterns, substance addiction and emotional well-being \u2022 Patient medication \u2013
These apps provide medication reminders, maintain medication history, document online prescriptions, and more \u2022 Telemedicine \u2013 These apps aid diagnostics, real-time consultations, and monitoring of patient progress, among other services \u2022 Wellness \u2013 These apps help users maintain fitness and exercise regimes, follow diet prescriptions, monitor food intake, and practice meditation Some of the key focus areas for testing healthcare apps are: \u2022 User interface and experience (UI/UX) testing to ensure an enhanced customer experience \u2022 Non-functional requirements (NFR) testing for the performance and security of apps that deliver real-time patient data, including large imaging files/videos, on a 5G network \u2022 Crowd testing of apps for varied user experience and localization \u2022 Device compatibility testing to support the vast number of devices that will run on 5G Use case 3: AR/VR AR/VR aims to create real-time immersive experiences that can empower telemedicine and remote patient monitoring. 
It requires very high data rate transmissions and low latencies to deliver healthcare services covering: \u2022 Simulation of different conditions using sound, vision or 3D image rendering that can be transmitted from connected ambulances to operating rooms for advanced medical care \u2022 Real-time training for medical students \u2022 Treatment of patients with various phobias \u2022 Early detection of diseases \u2022 Various types of therapies to support physical and mental wellbeing Some of the key focus areas for testing include: \u2022 Checking that the networks can support a real-time immersive experience through: \u2022 High-speed data transfer rates \u2022 Ultra-low latency with no lag in the experience \u2022 High bandwidth \u2022 Checking that a range of hardware devices will work with a user\u2019s smartphone to create a good mobile VR experience Use case 4: AI/ML It is evident that, in the near future, AI/ML will play a significant role across healthcare functions and services by helping diagnose illnesses earlier and prescribe the right treatment to patients. 5G will be critical in enabling healthcare functions that involve analyzing massive volumes of data. 
These healthcare functions include: \u2022 AI-based imaging diagnostics that provide doctors with insights about diseases and severity \u2022 ML-enabled digital biomarkers that analyze and enable early and accurate detection of Alzheimer\u2019s and dementia \u2022 Assisting in clinical trials and research to observe patient responses to new drugs and their behavior patterns \u2022 Collecting large volumes of critical patient data from healthcare apps and devices to predict the occurrence of potential diseases through ML algorithms Some of the key focus areas for testing include: \u2022 Testing for cognitive features like speech recognition, image recognition, OCR, etc.<|endoftext|>\u2022 Implementing robotic process automation (RPA) for common functions like recurring medication prescriptions, appointments and periodic medical reports \u2022 Big data testing for both structured and unstructured data. As large volumes of data are transmitted and received by every device, app and piece of equipment, this data will need to be ingested and stored to derive insights that aid in next-gen predictive and preventive healthcare Testing on cloud In addition to the above test focus areas, cloud testing will be a common test function for the entire healthcare ecosystem.<|endoftext|>Cloud can support the core network through network functions virtualization (NFV). It can also enable software-defined networking (SDN), a must-have capability for 5G. Moreover, cloud will also host the various healthcare devices and equipment in the healthcare ecosystem. Thus, cloud testing will be critical to ensure smooth performance, covering functional, non-functional and network testing.<|endoftext|>These use cases provide an overview of the impact of 5G in healthcare with a focus on hospitalization, preventive healthcare, device monitoring, and patient wellbeing. 
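As a concrete illustration of what big data testing for device payloads can involve, here is a minimal, hypothetical sketch in Python. The field names, metrics and physiological ranges are illustrative assumptions for this example, not part of any Infosys solution or standard.

```python
# Hypothetical sketch: validating mixed healthcare-device payloads before
# ingestion. Field names and ranges are illustrative, not from the paper.

REQUIRED_FIELDS = {"device_id", "timestamp", "metric", "value"}

# Plausible physiological ranges per metric (illustrative values only).
VALID_RANGES = {
    "heart_rate": (20, 250),       # beats per minute
    "body_mass_index": (10, 80),   # kg/m^2
}

def validate_payload(payload: dict) -> list:
    """Return a list of human-readable issues; an empty list means the
    record is fit for ingestion into the analytics store."""
    issues = []
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
        return issues
    lo_hi = VALID_RANGES.get(payload["metric"])
    if lo_hi and not (lo_hi[0] <= payload["value"] <= lo_hi[1]):
        issues.append(f"{payload['metric']} out of range: {payload['value']}")
    return issues
```

A real pipeline would apply checks like these at scale to both structured records and metadata extracted from unstructured files, but the principle of rejecting malformed or implausible readings before they reach the analytics store is the same.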
There are other areas such as pharmaceuticals, insurance, compliance, etc., that will also leverage 5G to deliver their services, creating a connected healthcare ecosystem.<|endoftext|>\n\n---\n\n Page: 4 / 4 \n\n---\n\n \u00a9 2020 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected References https://www.cnet.com/news/covid-19-has-pushed-health-care-online-and-5g-will-make-it-better/ https://www.uschamber.com/series/above-the-fold/how-innovation-accelerating-meet-coronavirus-challenges https://www.pwc.co.uk/communications/assets/5g-healthcare.pdf https://www.ericsson.com/en/blog/2018/6/why-compatibility-and-5g-interoperability-are-crucial-for-success https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4029126/ https://www.scnsoft.com/healthcare/mobile/patient-apps https://pharmaphorum.com/digital/four-ways-ai-and-machine-learning-will-transform-healthcare-in-2020/ Conclusion 5G is emerging as the backbone of future healthcare, catering to a wide range of healthcare services from ambulances to operating theatres, apps to diagnostic equipment, physical illnesses to mental wellbeing, and more. Technologies like cloud, AR/VR and AI/ML will play a key role in driving a technological revolution within healthcare. 
As technologies and equipment for monitoring and diagnostics become more sophisticated, 5G will act as the high-speed expressway, allowing devices to exchange data at much faster speeds. 5G will support a variety of remote healthcare use cases such as early disease detection, at-home patient monitoring, precision surgeries, and distant medical training, to name a few. Thus, 5G testing will be crucial to ensure that it unlocks the potential of modern technologies in healthcare.<|endoftext|>About the \nAuthor Sumanth Dakshinamurthy Principal Consultant, Infosys Validation Solutions \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Testing IoT Applications - A Perspective \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n VIEW POINT TESTING IOT APPLICATIONS - A PERSPECTIVE - Manjunatha Gurulingaiah Kukkuru, Principal Research Analyst \n\n---\n\n Page: 2 / 4 \n\n---\n\n Introduction The Internet of Things (IoT) is a network of physical objects (devices, vehicles, buildings, and other items) that are embedded with electronics, software, sensors, and network connectivity to collect and exchange data.<|endoftext|>According to a recent report by McKinsey, around 30 billion objects may be connected through IoT by 2020. Enterprises are adopting IoT solutions for the benefits they offer, such as optimization of operations, reduction in costs, and improvement in efficiency. The development and adoption of IoT is being driven by multiple factors, including easily available low-cost sensors, increased bandwidth and processing power, widespread usage of smartphones, availability of big data analysis tools, and the scalability of internet protocol version 6. 
Organizations are now starting to focus on external benefits such as generating revenues from IoT-enabled products, services, and customer experiences.<|endoftext|>IoT: A web of interconnected layers The following figure shows a reference architecture for IoT, comprising multiple layers built on top of each other to create industry-specific solutions. The components in each layer include devices, protocols, and modules that need to work in sync in order to effectively convert data to information, and subsequently to insights. [Figure: IoT reference architecture \u2013 a device layer (sensors, barcodes, RFID, smartphones, actuators, drones, GPS, wearables, smart meters, connected machines), a data ingest & transformation layer (protocol adapters, queue listeners and command interpreters over Wi-Fi, LAN, WAN, satellite and cellular), a data processing layer (complex event processor, big data storage, analytics and machine learning, device management, ticketing, maps, command center), and a class of applications (tracking & tracing, site & employee safety, remote monitoring, eTraceability, process visibility & automation, predictive analytics, risk, fraud & warranty analytics) spanning domains such as smart grid, manufacturing, logistics, agriculture, buildings, healthcare, retail, residences, oil, mining and pharma] \u2022 Device layer: Consists of various devices like sensors, wearables, smart meters, radio frequency identification (RFID) tags, smartphones, drones, etc. With such a diverse set of devices, a huge set of standard and custom communication protocols \u2014 including ZigBee, BACnet, LLRP, and Modbus \u2014 are implemented. 
\u2022 Data ingestion and transformation layer: Data from the device layer is transformed through different protocols into a standard format for further processing by the data processing layer. This data could come from sensors, actuators, wearables, RFID, etc., received via TCP/IP socket communication, messaging protocols like MQTT, AMQP, CoAP, DDS, and Kafka, or HTTP/HTTPS over REST APIs. \u2022 Data processing layer: With data available from millions of devices, performing image, preventive, and predictive analytics on batch data provides meaningful insights. Modules like a \u2018complex event processor\u2019 enable the analysis of transformed data by performing real-time streaming analytics \u2014 such as filtering, correlation, pattern-matching, etc. Additionally, multiple APIs for geo-maps, reporting, ticketing, device provisioning, communication, and various other modules aid in the quick creation of dashboards. \u2022 Applications layer: With the availability of such rich datasets from a multitude of devices, a gamut of applications can be developed for resource efficiency, tracking and tracing, remote monitoring, predictive analytics, process visibility and automation, etc., and applied across different industries and segments.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n Unique characteristics and requirements of IoT systems Compared to other applications, IoT applications are characterized by several unique factors, such as: \u2022 Combination of hardware, sensors, connectors, gateways, and application software in a single system \u2022 Real-time stream analytics / complex event processing \u2022 Support for data volume, velocity, variety, and veracity \u2022 Visualization of large-scale data Challenges that thwart IoT testing These characteristics consequently present a unique set of challenges when it comes to testing IoT applications. 
The primary challenges include: \u2022 Dynamic environment: Unlike application testing performed in a defined environment, IoT has a very dynamic environment with millions of sensors and different devices working in conjunction with intelligent software \u2022 Real-time complexity: IoT applications can have multiple real-time scenarios, and their use cases are extremely complex \u2022 Scalability of the system: Creating a test environment to assess functionality along with scalability and reliability is challenging Apart from the above, several factors present operational challenges: \u2022 Related subsystems and components owned by third-party units \u2022 A complex set of use cases from which to create test cases and data \u2022 Hardware quality and accuracy \u2022 Security and privacy issues \u2022 Safety concerns Types of IoT testing The complex architecture of IoT systems and their unique characteristics mandate various types of tests across all system components. To ensure that the scalability, performance, and security of IoT applications are up to the mark, the following types of tests are recommended: Edge testing Several emerging industrial IoT applications require coordinated, real-time analytics at the \u2018edge\u2019 of a network, using algorithms that demand computation at scale over high data volume and velocity. However, the networks connecting these edge devices often fail to provide sufficient capability, bandwidth, and reliability. Thus, edge testing is essential for any IoT application. Protocol and device interoperability testing IoT communication protocol and device interoperability testing involves assessing the ability of protocols and devices to interoperate seamlessly across different standards and specifications. Security and privacy testing This includes security aspects like data protection, device identity authentication, encryption / decryption, and trust in cloud computing. 
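To make the idea of protocol and device interoperability testing concrete, here is a minimal, hypothetical sketch in Python. The two wire formats, field names and the canonical record are illustrative assumptions, not a real gateway API; real interoperability testing would exercise actual protocol stacks such as MQTT or Modbus.

```python
# Hypothetical sketch of a protocol-interoperability check: simulated devices
# report the same temperature reading through different message formats, and
# the test asserts that a gateway normalizer yields one canonical record.
import json

def from_json_device(raw: bytes) -> dict:
    """e.g. an HTTP/REST or MQTT device publishing a JSON payload."""
    msg = json.loads(raw)
    return {"sensor": msg["id"], "temp_c": float(msg["temp"])}

def from_csv_device(raw: bytes) -> dict:
    """e.g. a legacy gateway exporting 'id,temp' lines."""
    sensor, temp = raw.decode().strip().split(",")
    return {"sensor": sensor, "temp_c": float(temp)}

def test_interoperability():
    # One physical reading, two wire formats, one canonical record expected.
    json_raw = b'{"id": "t-7", "temp": "21.5"}'
    csv_raw = b"t-7,21.5\n"
    assert from_json_device(json_raw) == from_csv_device(csv_raw)
```

The same pattern scales up: for each supported protocol, feed a known reading through the real adapter and assert that every path produces an identical canonical record.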
Network impact testing Network impact testing involves measuring the qualitative and quantitative performance of a deployed IoT application in real network conditions. This can include testing IoT devices for a combination of network size, topology, and environment conditions. Performance and real-time testing This covers complex aspects like timing analysis, load testing, real-time stream analytics, and time-bound outputs, under the extremes of data volume, velocity, variety, and veracity. End user application testing This covers all functional and non-functional use cases of an IoT application, including user experience and usability testing. \n\n---\n\n Page: 4 / 4 \n\n---\n\n Infosys IoT Validation solution Infosys has developed a comprehensive quality assurance (QA) strategy to handle the unique requirements and challenges associated with validating IoT applications. The Infosys IoT Validation solution enables testing with a combination of actual devices, tools, and frameworks. In addition, the Infosys IoT Test Framework provides all the capabilities required to perform functional validation, load simulation, and security verification. It can easily integrate with various IoT protocols and platforms, thus providing interoperability. This is just a glimpse of our capabilities, as we have various tools and solutions that can be leveraged to perform end-to-end testing of IoT solutions. To find out more about our IoT services, download the IoT testing flyer here. \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. 
Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} +{"text": "# Infosys POV \nTitle: PowerPoint Presentation \nAuthor: Chloe Hibbert \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 30 \n\n---\n\n An Infosys Consulting Perspective Led by Olu Adegoke Consulting@Infosys.com | InfosysConsultingInsights.com The Future of Telco a four-part series \n\n---\n\n Page: 2 / 30 \n\n---\n\n Beyond connectivity | \u00a9 2023 Infosys Consulting THE FUTURE OF TELCO In this series, experts outline the four key trends influencing and driving telecom companies to the future. Contents Part 1: Beyond connectivity: How communication service providers can monetize emerging B2B growth opportunities By Ravi Jayanthi Ravi Jayanthi explains how telcos can best monetize emerging growth opportunities in the enterprise segment to remain competitive. Part 2: The new telecom operating model: Exponential growth opportunities By Alastair Birt Alastair Birt outlines why telecom companies need to creatively dismantle the old ways and rise from the ashes stronger, wise, and ready to seize the future within their grasp. Part 3: Telecoms: Why talent is the next frontier for competitive advantage By Mick Burn and Stanislava Gaspar (n\u00e9e Stoyanova) Mick Burn and Stanislava Gaspar explain how radical transformation will affect HR strategy and why betting on people is essential for success. 
Part 4: Getting value from AI&A in the telecom industry By James Thornhill It\u2019s clear that the digital future will be driven by Artificial Intelligence and Automation (AI&A) \u2013 especially in the telecom industry. James Thornhill outlines AI&A opportunities and why CSPs should ensure they have the right strategic initiatives to grow. Contributors: Olu Adegoke and Gaurav Kapoor Editor: Vica Granville \n\n---\n\n Page: 3 / 30 \n\n---\n\n \n\n---\n\n Page: 4 / 30 \n\n---\n\n BEYOND CONNECTIVITY: HOW COMMUNICATION SERVICE PROVIDERS CAN MONETIZE B2B GROWTH OPPORTUNITIES Companies looking to operate normally today are facing formidable obstacles. High geopolitical risks, economic uncertainty, labor shortages, and the realignment of supply chains have forced industries to adapt and readjust almost immediately post-pandemic.<|endoftext|>The digital transformation and workforce reinvention required are reshaping demand for information and communications technology (ICT) solutions and are drawing on the rapid evolution of advanced technologies. According to Ericsson and Arthur D. Little, these new digitalization opportunities are providing telecom companies with a significant opportunity in the B2B market to capitalize on a topline growth potential of 35%. But what is needed to grow the B2B telecommunication market, and what hurdles do telcos need to overcome to grasp this potential? \n\n---\n\n Page: 5 / 30 \n\n---\n\n Understanding the potential Before outlining how telcos can succeed in the B2B market, it\u2019s important to understand where the opportunity is coming from. 
Due to turbulent market forces, many organizations are already at various stages of: \u2022 Developing new technological solutions \u2022 Improving service delivery \u2022 Increasing operational efficiency \u2022 Reducing cost \u2022 Gaining competitive advantage \u2022 Meeting rising customer expectations Such digital transformations require ICT solutions that provide integrated connectivity and security for digitally connected devices, data, and applications. In addition, the rapid evolution of cloud, artificial intelligence (AI), machine learning (ML), and automation technologies is bringing incredible value to businesses. Companies can now aim to provide employees and customers with secure, high-speed, reliable, low-latency mobile networks with edge computing capabilities. However, many telecom companies are failing to seize the opportunity before them. They\u2019re losing out to forward-thinking hyperscale cloud providers (HCPs) by relying on outdated systems, processes, and operating models. What hurdles will telcos need to overcome? To ensure future growth, leaders will need to repackage the intrinsic value of the network with innovative ways to bring extrinsic value to customers. As such, organizations must expand the communication service provider (CSP) operating model from a network provider to a flexible end-to-end business platform provider. This will expand their role in the value chain. \n\n---\n\n Page: 6 / 30 \n\n---\n\n But to do that, telecom companies will need to overcome several obstacles: \u2022 The lack of industry domain knowledge that\u2019s essential for innovation at the edge and value-based selling \u2022 The lack of necessary relationships to influence enterprises on their digitalization journey. Many telcos are, therefore, missing out on business-level (vs. 
connectivity-level) conversations to create relevancy \u2022 The absence of skills necessary to implement and adopt use cases for enterprises. Adoption is made more complex as there\u2019s a lack of subject matter experts focused on domains and customer ecosystems \u2022 Many telcos aren\u2019t replicating use cases across enterprises, and therefore fail to amortize the transformation investment \u2022 There\u2019s an uneven distribution of spectrum assets among CSPs that is hindering large-scale deployment of 5G, and thereby hampering use case adoption \u2022 Companies are failing to embrace open-source technologies. This prevents them from accelerating innovation and reducing costs \u2022 Competition from hyperscalers. Their ability to spend the same on telecom infrastructure services as Tier-1 CSPs has enabled them to expand their involvement in the telecom industry/value chain. This includes edge computing and private wireless networks How can telecoms capitalize on the B2B market opportunity and overcome these challenges? To cater to new segments and opportunity areas, telcos must expand their role. Organizations must transform from simply being a network provider to becoming a service enabler and creator. This is a complex undertaking to take on alone. To be successful, CSPs need to show considerable flexibility in how they deploy innovative business models. This includes working with HCPs and service integrators (SIs) to secure a stake in the B2B market. Technology works better when it\u2019s built together with partners, especially when dealing with a complex ecosystem across a gamut of customer segments, as well as growth areas with varying business outcomes, capabilities, and systems of differentiation. 
Establishing a rich partnership ecosystem will help both market leaders and aspirants to build a complete portfolio and confidently embark on a journey of scale. \n\n---\n\n Page: 7 / 30 \n\n---\n\n Overcoming the various challenges to transform current ways of working isn\u2019t an easy undertaking. But there are four crucial factors that should be considered in your service strategy and go-to-market (GTM) playbook to help. These form a go-to-market playbook for telecoms to capitalize on the B2B market opportunity: \u2022 B2B2X revenue: Create extrinsic value with an outside-in approach to enterprise customers. \u2022 Innovation/GTM: Build a strong partner ecosystem and focus on value-based selling. \u2022 Adoption: Simplify use cases for customer adoption and integrate their ecosystem end-to-end (E2E) within your business. \u2022 Scale: Industrialize network engineering and operations \u2013 essential for operating at scale. \n\n---\n\n Page: 8 / 30 \n\n---\n\n Key takeaways Telcos can no longer justify network investments for cost efficiency and competitive parity alone. They must focus on creating a path for growth, leveraging technology to shape problems that drive innovation and differentiation. They need to bring what\u2019s next to life and start thinking outside of the box, building a marketplace that will serve growth. That\u2019s the story of scale that telecom companies need to prioritize in partnership with SIs and HCPs. With boots on the ground, such partners are key to driving growth both through GTM and building out intellectual property. This will be a significant undertaking. As such, it must be executed in waves to align with market readiness. It will require CXOs to commit to innovation, industry collaboration, and long-term investments. 
With a growth potential of 35% to the topline through network-enabled digitalization, the potential returns are worth the risk. \n\n---\n\n Page: 9 / 30 \n\n---\n\n \n\n---\n\n Page: 10 / 30 \n\n---\n\n THE NEW TELECOM OPERATING MODEL: EXPONENTIAL GROWTH OPPORTUNITIES Most telecom companies claim to be well on their way to making the shift from communication service providers (CSPs) to digital service providers (DSPs). They draw attention to simplified portfolios, improved customer experiences, a shift to digital channels, and higher NPS as evidence of success. However, when comparing financial performance over the last decade, the telecoms sector falls far short of the \u2018double-digit\u2019 compound growth rates of hyperscalers, OTT players, and software-as-a-service (SAAS) platform disrupters. New operating model | \u00a9 2023 Infosys Consulting 10 \n\n---\n\n Page: 11 / 30 \n\n---\n\n The case for change and the future of telco The telecom industry\u2019s earlier period of stellar growth depended on three factors: 1. The competitive advantage of owning the underlying network. 2. Regional presence and cultural affinity with their target customers. 3. An underserved generation of customers, with limited competition. Today, it\u2019s a very different story, as next-generation customers continue to shift spend and loyalty to whichever brand provides the most utility, experience, and value. On the other hand, research shows that customers are ready and willing to let telecom providers help them in many other areas of their life. Many of these aren\u2019t small, insignificant areas, nor is this list exhaustive. These are large addressable markets, with many use cases, and potential partners. So, why is it proving so difficult? Existing telco products infiltrate deeply into customers\u2019 lives, both work and play. 
The largest operators have massive customer and business scale, and an existing platform for billing and service management. They also have a real-time stream of interaction, content, and location data to mine for opportunities and enable personalization of any service launched. But when it comes to building new businesses, telecom operators have historically struggled to scale adjacent business opportunities. A 2021 survey of telco CXOs highlighted corporate culture as the overarching problem, with any new business hampered by: \u2022 Cumbersome corporate processes \u2022 A lack of buy-in from senior management \u2022 Pressure for short-term results in a business where legacy network margins create an inertia to take risks Telcos now find themselves at a crossroads. For the next decade, they can either tweak their business and operating models to achieve incremental gains or make the bold choice to reinvent their value-creation formula across a larger set of use cases, and, if possible, over a larger addressable market. \n\n---\n\n Page: 12 / 30 \n\n---\n\n The future: An \u2018ecosystem orchestrator\u2019 and platform partner The more complex something becomes, the more important it is to simplify. Revenues flow to those who simplify \u201cthe new\u201d \u2013 to those who are able to reduce the friction of buying and using by guiding customers through each stage of their journey, from first recognizing a need to becoming wedded to a service. Hyperscalers, OTT, and SAAS players have operated on the edge of this curve for the last 10 years, delivering great value to shareholders. 
We believe that the next generation of telco must emulate these traits and embark on an ambitious organizational transformation. Telecom companies need to become the first point of contact for customers when they want innovation, education, and excellence across a much wider set of use cases. This will require a significant shift in organizational brain, muscle, and talent. 1. Partner with a broader set of successful growth companies Leverage and combine the strengths of other partners \u2013 many outside the established telco ecosystem. At the same time, organizations should sprinkle their own differentiation and creativity into each of the combined solutions they market. Such partnerships provide the manpower, experience, and laser-focus needed to succeed at a task that would otherwise be impossible to execute alone. 2. Adopt and emulate a SAAS-centric growth playbook Transform how you develop and sell products, and the way your people and digital platforms engage with customers at every stage of the lifecycle. This means continually scanning and embedding the most successful SAAS strategies and tactics into your operating model. 3. Transform into an \u2018orchestrator of value\u2019 in the connectivity ecosystem In other words, build the capabilities to operate above this layer of software, service partners, and global networks. This will require developing into a bi-modal enterprise \u2013 one which knows both when it\u2019s best to \u2018own\u2019 the relationship with the customer, and when it\u2019s not, focusing on making partners successful instead. 4. Optimize the use of customer and partner data \u2026to target, personalize, and deliver a broader set of value-add services for customers. Forget \u201cquad-play\u201d and 30+ segments; the future is \u2018mega-play\u2019 to every single unique customer. 
To achieve this, telcos need to build a scalable, partner-curated set of products and services. They also need to hyper-personalize the offer and the delivery into the hands/devices of every individual, household, and community. 5. Consciously reconstruct the enterprise into a use-case-driven organization This should be equipped with a lean core which provides the scalable, personalized digital platform and rich database on which to build out their business. At the same time, it should free the outer ring to aggressively chase products and service-markets which fit within their curated partner ecosystem. \n\n---\n\n Page: 13 / 30 \n\n---\n\n In recent years, we\u2019ve seen more and more evidence that telcos can successfully incubate new products and services, achieving healthy top-line growth and M&A activity around these new service models and business units. The future is already being created \n\n---\n\n Page: 14 / 30 \n\n---\n\n Opportunity ahead: Telco for good Emboldened by this success, we believe that the telecoms industry is now entering a new phase of transformation: From DSP to ecosystem orchestrators. As such, there\u2019ll be a wave of new products, service models, and partnerships across the entire ecosystem, fueled by broader societal change. This is just the beginning. Connectivity is the central nervous system of our society. Industries will experience unprecedented technological change in the next 10 years, as AI, 5G, Blockchain, AR/VR, Web3, and a convergence of far-reaching technologies take hold and reshape the way we all live and work. Market success for the telecom industry will rely on many factors, including the five areas of transformation mentioned earlier in the article. Not forgetting attracting and retaining the right partnerships and talent to transform, unencumbered by the past. 
\n\n---\n\n Page: 15 / 30 \n\n---\n\n \n\n---\n\n Page: 16 / 30 \n\n---\n\n TELECOMS: WHY TALENT IS THE NEXT FRONTIER FOR COMPETITIVE ADVANTAGE Talent for competitive advantage | \u00a9 2023 Infosys Consulting The fundamental shift in the telecoms industry from being communication service providers (CSPs) to becoming ecosystem orchestrators presents an interesting challenge to CHROs. It requires them to innovate and drive a future-fit HR strategy, acting as key enablers of successful business evolution and growth. The ever-changing digital economy and unprecedented levels of disruption have made it imperative for telecom companies to transform. HR has a critical role to play here, empowering telcos to deliver the overall vision through people. Empower growth through talented people Recent Gartner research shows that the workforce is one of the Top 3 priorities for CEOs in 2023. In fact, 50% of HR leaders expect increased talent competition over the next six months and 46% anticipate attrition will remain high for in-demand roles in 2023. The shift to becoming ecosystem orchestrators, the need to adopt an \u2018AI first\u2019 vision, and the drive to maximize B2B market potential have set new skills requirements for the future of telco. The industry is becoming increasingly attractive to top talent who would like to work for businesses that foster innovation and digital excellence. However, there is a shortage of talent and strong market competition for critical future skills and subject matter expertise. This raises the bar for HR leaders to re-imagine the employee value proposition (EVP) and the end-to-end employee experience. They need to find effective new ways to attract and retain talent and harness future skills, while helping facilitate business changes. 
As such, it\u2019s essential for HR to evolve towards delivering advanced capabilities and enhanced AI-enabled digital experiences. \n\n---\n\n Page: 17 / 30 \n\n---\n\n 1. Human-centered employee value proposition In the post-pandemic world of work, there\u2019s been a strong shift towards \u2018human-centered\u2019 EVP and culture, and flexible working arrangements. People are now seeking to gain emotional value in their employment, which means feeling understood, cared for, invested in, empowered, and valued. Alongside commitments to sustainability, net-zero ambitions, and aspirations for green operations, employees need to feel that the purpose of the organization they work for resonates with them. Telcos are uniquely challenged and positioned to focus on sustainable growth and minimize their carbon footprint. Digitization, 5G, Internet of Things (IoT), and cloud computing are key to achieving these ambitions, and the industry has been taking significant steps to reduce the impact on the environment. In addition, the strong business focus on diversity, equity, and inclusion is allowing companies to harness the power of diverse talent and unleash creativity. For example, tackling the challenge of social mobility or providing reasonable adjustments to people with disabilities will expand talent pools and further strengthen the employer brand. Such a strong value proposition can go a long way in retaining employees, thereby addressing some of the industry\u2019s high attrition rates while also reducing the costs to recruit \u2013 key challenges impacting the bottom lines for most telcos today. 2. Employee experience and digital excellence in HR In recent years, employee experience has been at the forefront of HR priorities and continues to be critical for winning the competition for talent. For telecom companies, customer experience is at the heart of the business. 
As such, they utilize ongoing transformation activities which focus on truly personalizing the experience and leveraging AI insights in real-time to address fundamental needs. The same approach should be consistently applied when re-imagining the employee experience and driving digital excellence in HR. \n\n---\n\n Page: 18 / 30 \n\n---\n\n The three pillars of a successful employee experience strategy are digital, emotional, and physical. DIGITAL Digitizing workforce products to exceed employee expectations and improve productivity by utilizing human-centered design methodologies and best-in-class technology. EMOTIONAL Placing employee listening at the core, gaining insights to provide employees with a meaningful purpose to work, whilst actively promoting a cohesive emotional environment that considers mental health and wellbeing. PHYSICAL Recognizing unique working styles and collaboration demands to promote productivity and innovation, and transforming office buildings into smart, connected environments primed for hybrid working. Providing efficient, seamless, and appealing experiences across the employee lifecycle and the significant employee moments is a key differentiator from competitors. 3. Skills and capabilities for the future To support the strategic business agenda for growth, HR leaders in the telecom industry have been building foundations to define critical future skills, assess potential gaps, and develop strategies to effectively address the talent shortages across new skills in demand. The adoption of new AI-powered technologies has been a critical lever to accelerate the skills transformation, leverage both external and internal talent pools, and develop in-house future-fit skills and capabilities by re- and up-skilling the existing workforce. 
AI-driven talent marketplaces help organizations embrace agile talent mobility, breaking down barriers to internal moves and career progression. AI-assisted learning provides consumer-grade personalized learning experiences, while targeting organizational skills gaps. It also helps employees discover new learning opportunities that support career progression whilst helping build internal talent pipelines. In-house training capabilities, targeted talent pools, and ongoing lifelong learning opportunities optimize the re- and up-skilling process. As such, they save money in the long term and futureproof both people and companies. \n\n---\n\n Page: 19 / 30 \n\n---\n\n 4. Smart AI can empower and complement frontline workers Whether it\u2019s a technician on their way to a customer or a call center rep dealing with a complaint, there are many factors that AI can help telcos track and streamline to improve employee wellbeing. These smart tools use AI and machine learning to improve shift allocation, optimally determining who is needed when and for how long. This makes work flexible and fits into today\u2019s desired hybrid working model, as well as adjusting staff misalignment in real-time. It can also identify what caused delays, helping telcos keep their customers promptly informed and implement required steps for improvement. In addition, AI-based smart coaching can provide training pre- and post-work, creating daily and weekly team reports for training and recognition. Such systems have improved workforce management by 90%, reducing complaints and creating an agile and encouraging work environment. 
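To make the shift-allocation idea concrete \u2013 deciding who is needed when, and for how long \u2013 here is a minimal greedy sketch. It is purely illustrative: the worker names and demand figures are invented, and a real AI-driven workforce tool would layer ML demand forecasting, fairness constraints, and real-time re-planning on top of logic like this.

```python
# Illustrative greedy shift allocation (not a product described in the text):
# cover hourly staffing demand from worker availability.

def allocate_shifts(demand, availability, max_hours=8):
    """demand: {hour: staff_needed}; availability: {worker: set of hours}.
    Returns {hour: [workers assigned]}, filling scarce workers first."""
    schedule = {hour: [] for hour in demand}
    hours_worked = {worker: 0 for worker in availability}
    for hour in sorted(demand):
        # Most-constrained workers first, so flexible ones stay free for gaps.
        candidates = sorted(
            (w for w in availability
             if hour in availability[w] and hours_worked[w] < max_hours),
            key=lambda w: len(availability[w]),
        )
        for worker in candidates[: demand[hour]]:
            schedule[hour].append(worker)
            hours_worked[worker] += 1
    return schedule

# Hypothetical demand and availability:
print(allocate_shifts({9: 2, 10: 1},
                      {"ana": {9, 10}, "ben": {9}, "caz": {10}}))
# -> {9: ['ben', 'ana'], 10: ['caz']}
```

The same structure scales to days and skills by enriching the keys; the point is only that \u201cwho is needed when, and for how long\u201d reduces to a constrained assignment problem.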
The future of smart AI in the telco space is bright: it has the potential to significantly enrich frontline work and drive operational efficiencies for those that embrace emerging tech opportunities to the fullest. \n\n---\n\n Page: 20 / 30 \n\n---\n\n Key takeaways The telecoms industry is on a disruptive and creative growth trajectory, and CHROs have a leading role in empowering the transformation by focusing on the following strategic areas: \u2022 Foster consumer-grade employee experience and well-being, and support a diverse workforce \u2022 Become an employer of choice and magnet for critical talent \u2022 Ensure availability of future-fit skills and develop strong internal talent pools \u2022 Digitize HR and adopt agile ways of working \u2022 Facilitate change and help shape the right business operating model We believe that the human element is the true differentiator for success. By drawing employees together to tackle problems in new ways, it\u2019s possible to transform organizations from the inside out, creating tangible change for a more sustainable future. \n\n---\n\n Page: 21 / 30 \n\n---\n\n \n\n---\n\n Page: 22 / 30 \n\n---\n\n GETTING VALUE FROM AI&A IN THE TELECOM INDUSTRY In this series we\u2019ve already mentioned the difficulties the telecom industry has had in remaining competitive. We explained how historically poor customer service, complex business operations, and financial pressure have negatively impacted the industry\u2019s ability to compete. This is especially true when looking at the track record of communication service providers (CSPs) in comparison with hyperscalers and over-the-top providers who excel in the experience they offer. 
The good news is that Artificial Intelligence and Automation (AI&A) can play a significant role in helping telcos gain the competitive edge they\u2019ve been seeking. This is something that many telecom companies have already understood, as the global AI market in the industry is projected to reach $14.9 billion by 2027. The problem is investing for investment\u2019s sake, risking a poor return on investment (ROI) and losing the competitive potential AI&A opportunities promise. So how can CSPs invest wisely to best utilize AI&A for future growth? How can AI&A improve CSP operations? Before answering how best to invest, it\u2019s important to understand how AI&A can help telcos overcome the challenges they currently face. There are four key opportunity areas where AI&A can be applied: 1. Digital customer engagement: AI&A can improve the customer experience through direct engagement, e.g., chat. It can also drive hyper-personalization, targeting each customer directly and effectively. 2. Network transformation: Network operations and AIOps can be optimized, from process automation, machine learning, and reasoning to zero-touch predictive maintenance/self-healing. These technologies can also support network rollout and field force optimization. 3. Lean telco operations: Multiple AI&A use cases can be applied in lead-to-cash and service assurance. Business and operations support systems (BSS/OSS), as well as supply chain management, can also be augmented. 4. Revenue growth: B2B services are optimized due to real-time data insights and other advanced technologies. Overall, sales and services are also optimized. 22 Value from AI&A | \u00a9 2023 Infosys Consulting \n\n---\n\n Page: 23 / 30 \n\n---\n\n Clearly, each opportunity area presents many applications for AI&A, but where does a CSP place its bets? 
There are several factors that make this tricky. On the one hand, most CSP value streams will yield benefits \u2013 for example, both customer-facing lead-to-cash and operational record-to-analyze offer happy hunting grounds. Within each value stream there is a range of addressable opportunities, both straightforward and complex with differing ROIs, from simple data entry to complex back-office virtual agents. On the other hand, a wide variety of new and established technologies are crowding the market, which can be applied individually or in concert \u2013 for example, workflow, decision management, and robotic process automation, through to emerging Generative AI. Moreover, stakeholders have differing levels of enthusiasm, ranging from outright resistance and box ticking to strong advocacy \u2013 possibly depending on certain technology types or vendors. This makes it difficult for CSPs to understand which technologies would best suit their operations and business goals. AI&A maturity and governance models also abound, often themselves siloed, and they must now contend with new ethical AI regulation. Some applications of AI may seem appealing but expose the enterprise to risk, for example in recruiting or the management of critical network infrastructure. The benefits that some of these technologies provide will only be felt in the long term. For instance, curating quality data is essential to most higher-value AI use cases but can be highly complex, taking years. Finally, acquiring, developing, and retaining AI skills will remain challenging for the foreseeable future. Finding the right North Star, organization, and roles to help identify, roll out, and operate AI&A is also a challenge. It\u2019s therefore hard to have visibility and make informed decisions. Investment can be applied in an ad hoc fashion, the easy opportunities missed or held up, and the more difficult but interesting ones overserved. 
Over-strategizing and over-governing risks accumulating high overheads, with local enthusiasm and innovation stifled. Too little planning and control leads to losing precious budget and focusing on a plethora of local initiatives and technologies, driven by the enthusiastic, with unknown returns. Why is it difficult to know how best to invest? \n\n---\n\n Page: 24 / 30 \n\n---\n\n CSPs can start to improve their ROI by focusing on the following areas: Where do CSPs spend to get the most return on investment? 1. Make an inventory of your investments by value stream Depending on the model used (e.g., TM Forum, APQC), there are around 25-30 value streams in a CSP, more if separated by market segments. Most should see a measure of investment; however, attention should be focused on business-critical value streams where AI&A can add most value. CSP strategies vary, but most customer-facing value streams such as lead-to-cash should be high-focus, as well as key operational areas such as service and network planning, assurance, and lifecycle management. Opinions on relative levels of investment may also vary, but there should be visibility and direction. Targeted improvements to value stream business outcomes, in the form of business-critical KPIs, can provide achievable North Stars for programs. \n\n---\n\n Page: 25 / 30 \n\n---\n\n 2. Use a range of technology There\u2019s no single killer technology; each macro pain point will have a subset of AI&A technologies that can be considered. If potential is being addressed, the organization should see: \u2022 A range of simple to complex challenges being addressed across most value streams \u2022 The highest business impact from the biggest investment in the value stream \u2022 A range of established and new technologies being used 3. 
Check pain points In CSPs, each value stream has relatively well-known macro pain points where AI&A can be applied, some straightforward \u2013 e.g., automation of swivel-chair data entry between and within BSS and OSS systems \u2013 others higher value but more complex to address. The latter include recommending an optimum price point or guiding back-office staff through labyrinthine business rules for complex B2B products. \n\n---\n\n Page: 26 / 30 \n\n---\n\n 4. Direct the investment Taking this AI&A portfolio view can provide a framework to direct investment to the most appropriate areas for a CSP in the early stages of exploiting AI&A, or to augment more mature strategies and governance. To do this successfully, ask yourself the following questions: If AI&A is being exploited well, then a CSP should expect to see initiatives in certain areas, as well as a range of tactical and strategic endeavors. For example, in lead-to-cash a range of investments should be seen across all areas of the value stream. \u2022 Are we addressing the low-hanging fruit? \u2022 Is investment being skewed toward an area where the enthusiasts are, as opposed to where most value can be gained? \u2022 Are we avoiding the difficult areas that are strategically important? \u2022 Are we ignoring certain technology types, or applying others where there are better options? \u2022 Are we chasing too many fads or ignoring great innovations? \n\n---\n\n Page: 27 / 30 \n\n---\n\n This can also be further developed to construct an end-to-end view of AI&A initiatives and their aggregated impacts on value stream performance. This then raises further questions: \u2022 Are initiatives creating siloed improvements but not impacting key end-to-end metrics, or are the overall impacts unknown? \u2022 Has the aggregated customer and employee experience been considered to create value? 
\u2022 Are we consolidating efforts (data, process, tools) at least within or across related value streams, e.g., lead-to-cash, request-to-change, service assurance? \u2022 What is the balance between creating quality data architectures vs. speed of execution and business benefit, especially for \u2018low-hanging fruit\u2019? CSPs should expect their internal capabilities or external partners in AI&A to proactively seek to create such views. They can also expect these stakeholders to come to the table with points of view on where investment should be made \u2013 basing investment on generic discovery phases, where consultants or internal AI&A subject matter experts learn their industry, is questionable. \n\n---\n\n Page: 28 / 30 \n\n---\n\n Key takeaways Prioritize, manage, and track activity through a value stream lens, applying widely understood knowledge of pain points. This will ensure appropriate coverage, using a range of established and emerging technology to manage risk and achieve ROI. \n\n---\n\n Page: 29 / 30 \n\n---\n\n MEET THE EXPERTS \u201cThe lines between digital and physical retail will continue to blur\u201d MICK BURN Partner, Head of T&O EMEA Michael_Burn@infosys.com RAVI JAYANTHI Associate Partner, CMT Ravi.Jayanthi@infosys.com ALASTAIR BIRT Associate Partner, CMT Alastair_Birt@infosys.com JAMES THORNHILL Associate Partner, CMT James.Thornhill@infosys.com GAURAV KAPOOR Associate Partner, CMT Gaurav_Kapoor01@infosys.com STANISLAVA GASPAR (n\u00e9e Stoyanova) Principal, T&O Stanislava.Gaspar@infosys.com OLU ADEGOKE Partner, Global Practice Head, CMT Olu.Adegoke@infosys.com \n\n---\n\n Page: 30 / 30 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. 
Our consultants are industry experts who lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2022 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "}
{"text": "# Infosys POV \nTitle: The synergy between AI and cloud transformation \nAuthor: Infosys Consulting \nFormat: PDF 1.6 \n\n---\n\n Page: 1 / 10 \n\n---\n\n An Infosys Consulting Perspective By Wahid Sattar & Ross Oldbury Consulting@Infosys.com | InfosysConsultingInsights.com The synergy between AI and cloud transformation How prepared are you for the future of work and innovation? 
\n\n---\n\n Page: 2 / 10 \n\n---\n\n The synergy between AI and cloud transformation | \u00a9 2023 Infosys Consulting 2 Introduction The exponential growth of data and the increasing complexity of business operations have necessitated the adoption of cloud computing as a critical enabler of digital transformation, with spending on end-user cloud computing services hitting a staggering $600 billion in 2023. In parallel, artificial intelligence (AI) has emerged as a game-changer in the technology landscape, with 2023 spending expected to hit between $160 billion and $400 billion, transforming the way businesses operate and driving innovation across various domains. This paper explores the relationship between AI and cloud transformation, highlighting the synergies, strategies, and benefits of their integration, along with the key technologies that underpin their relationship. Joint spend on cloud computing and AI could reach up to $1 trillion by the end of 2023. \n\n---\n\n Page: 3 / 10 \n\n---\n\n The emergence of AI and cloud computing AI and cloud computing have emerged as two of the most disruptive technologies in recent years. While AI is transforming the way we process data and automate tasks, cloud computing is enabling organizations to scale and optimize their operations. The convergence of these two technologies has created new possibilities for businesses to leverage data and gain insights, streamline processes, and drive innovation. 1. Natural Language Processing (NLP) and Chatbots: NLP is an AI technology that allows machines to understand and interpret human language. Chatbots are AI-powered applications that use NLP to interact with users in a conversational manner. 
Together, NLP and chatbots are enabling businesses to provide more personalized and efficient customer service, while reducing the workload on human customer service representatives. Think of Mondly: a chatbot transforming how people learn languages by combining innovative NLP, chatbot, and NPC (non-playable character) features. In April 2023, OpenAI, the creator behind the much-publicized ChatGPT, closed a $300 million share sale, valuing the company at nearly $30 billion. 2. AIOps, or Artificial Intelligence for IT Operations, is an emerging technology that uses machine learning algorithms and advanced analytics to automate and improve IT operations. This technology is gaining popularity among companies as it enables them to proactively detect and resolve issues before they impact business operations. Deploying AIOps involves several steps. Firstly, companies need to collect large volumes of IT infrastructure data, including logs, metrics, and events, from their monitoring tools. Next, this data is processed and analyzed using machine learning algorithms that can recognize patterns and identify anomalies. \n\n---\n\n Page: 4 / 10 \n\n---\n\n These insights are then used to detect and isolate problems in real-time and provide a proactive defense. A major player in this area is Broadcom, which completed its acquisition of VMware, the largest cloud computing and virtualization acquisition to date. 3. Edge-to-cloud: Edge computing is a cloud computing technology that enables data processing and analysis to be done at the edge of the network, or closer to the source of the data, such as smart meters. By leveraging edge computing, businesses can reduce latency and improve the performance of cloud-based applications, while also improving data security and reducing bandwidth costs. The market is expected to grow to over $50 billion (TechTarget). 4. 
Cloud-based IoT platforms: Cloud-based Internet of Things (IoT) platforms are enabling businesses to connect and manage IoT devices and sensors from a centralized location. By leveraging the power of cloud computing and AI, these platforms are enabling businesses to gain insights into IoT data and automate IoT-based tasks, such as predictive maintenance. 5. Cloud-based collaboration: Cloud-based collaboration tools, such as Microsoft Teams and Slack, are enabling businesses to improve communication and teamwork across the organization. By providing a centralized platform for communication and collaboration, these tools are helping to streamline business operations and improve productivity. Microsoft is currently testing its flagship AI collaboration offering, Microsoft Copilot, which is built on OpenAI\u2019s GPT-4 large language model (LLM) \u2013 sure to be a game-changer. \n\n---\n\n Page: 5 / 10 \n\n---\n\n Examples of AI and cloud transformation in various industries Healthcare The healthcare industry has been leveraging the synergy between AI and cloud transformation to improve patient outcomes and reduce costs. AI-enabled cloud services are being used to develop predictive models to identify patients who are at risk of developing chronic diseases, detect diseases at an early stage, and develop personalized treatment plans. Cloud computing is also enabling healthcare providers to store, manage and analyze large volumes of patient data, making it easier to identify patterns and trends, and make informed decisions. The global healthcare cloud infrastructure spend is expected to reach over $60 billion by the end of 2023, and analysts estimate the industry could save ~$200 billion through further AI investment. 
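Returning briefly to the AIOps pipeline outlined in the technology list above (collect logs, metrics and events, then use machine learning to recognize patterns and flag anomalies), the anomaly-detection step can be illustrated with a minimal, hypothetical sketch. The rolling-window z-score approach, window size and threshold here are illustrative assumptions, not any product's actual mechanism:

```python
from collections import deque
from statistics import mean, stdev

# Illustrative sketch only: flag metric samples that deviate strongly from
# recent behavior, the simplest form of the "recognize patterns, identify
# anomalies" step described above. Window and threshold are invented defaults.
def detect_anomalies(metric_stream, window=20, z_threshold=3.0):
    """Yield (index, value) for samples more than z_threshold standard
    deviations away from the rolling mean of the previous `window` samples."""
    history = deque(maxlen=window)
    for i, value in enumerate(metric_stream):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                yield i, value
        history.append(value)
```

A production AIOps platform would replace this with trained models over multi-dimensional telemetry, but the shape is the same: learn "normal" from recent data, then surface deviations before they impact business operations.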
Manufacturing The manufacturing industry has been using AI-enabled cloud services to improve operational efficiency, reduce downtime and improve product quality; spending on cloud infrastructure is expected to increase to over $350 billion, with AI spending expected to hit ~$15 billion. AI algorithms are being used to analyze large volumes of production data to identify patterns and anomalies, enabling manufacturers to predict machine failures and schedule maintenance proactively. Cloud computing is also being used to store and manage production data, making it easier to analyze and optimize operations, and enable predictive maintenance. \n\n---\n\n Page: 6 / 10 \n\n---\n\n Financial services The financial services industry is using AI and cloud computing to improve risk management, reduce fraud, and improve customer experience. AI-enabled services are being used to develop predictive models to identify potential fraud, detect anomalies, and improve fraud prevention \u2013 it is estimated that banks have saved over 860 million hours through the use of chatbots (Forbes, DataBank). Cloud computing is also being used to store and manage transaction data, making it easier to analyze customer behavior and provide personalized recommendations. Retail The retail industry is leveraging the synergy between AI and cloud computing to improve customer experience and optimize operations. AI-enabled services are being used to develop personalized recommendations, enable visual search, and optimize supply chain management. Cloud computing is also being used to store and manage customer data, making it easier to analyze behavior and provide personalized recommendations. Retail cloud spend is expected to reach over $30 billion, and AI innovations in the metaverse, personalization, ethics and delivery are expected to significantly transform the customer experience. 
\n\n---\n\n Page: 7 / 10 \n\n---\n\n Cloud strategy with AI and innovation As businesses continue to leverage the power of cloud computing to streamline operations, reduce costs, and drive innovation, the importance of having a cloud strategy in place has become increasingly clear. One of the key drivers of cloud strategy is cloud innovation, which refers to the development and deployment of new cloud-based technologies, services, and applications that can help businesses stay competitive and drive growth. By leveraging the latest advances in cloud computing, businesses can gain a competitive edge by being able to rapidly innovate, scale, and adapt to changing market conditions. The integration of cloud computing and artificial intelligence (AI) has become a major focus for cloud innovation in recent years. By leveraging AI technologies such as machine learning, natural language processing, and computer vision, businesses can gain insights and automate tasks that were once impossible to achieve at scale. For example, by using AI-enabled cloud services, businesses can automate routine tasks such as data entry, processing, and analysis, freeing up valuable time and resources that can be used to focus on higher-value activities. Cloud innovation is also driving the development of new cloud-based services and applications that can help businesses drive growth and stay competitive. For example, cloud-based analytics services are helping businesses gain insights into customer behavior, while cloud-based collaboration tools are improving communication and teamwork across the organization. 
Cloud-based services are also helping businesses to streamline their operations, reduce costs, and improve efficiency. \n\n---\n\n Page: 8 / 10 \n\n---\n\n However, as businesses continue to embrace cloud innovation, they must also be aware of the risks and challenges associated with cloud adoption. These can include issues such as data privacy and security, vendor lock-in, and the need for skilled personnel to manage cloud operations. In the absence of a ready-made skilled workforce, organizations will need to put the right platforms and processes in place to develop and train their workforce to manage these technologies effectively. To mitigate these risks, businesses need to develop a comprehensive cloud strategy that addresses these challenges and ensures that they are well-equipped to handle them. What\u2019s next? The synergy between AI and cloud transformation is transforming the way businesses operate and driving innovation across various domains. The key technologies that underpin this relationship, including machine learning, natural language processing, cloud infrastructure, and edge computing, are enabling businesses to leverage data and gain insights, streamline processes, and drive innovation. AI-enabled cloud services are helping businesses to extract insights and automate tasks, making it easier to scale operations and drive innovation. By leveraging the latest advances in cloud computing and AI, businesses can gain a competitive edge by being able to rapidly innovate, scale, and adapt to changing market conditions. 
However, to succeed in the cloud era, businesses must also develop a comprehensive cloud strategy that addresses the risks, challenges, and regulatory and geographical requirements associated with cloud adoption and ensures that they are well-positioned to achieve their goals and objectives. At Infosys Consulting, we help our clients navigate the world of AI and cloud transformation \u2013 let us help you Navigate Your Next. \n\n---\n\n Page: 9 / 10 \n\n---\n\n MEET THE EXPERTS Ross Oldbury, Associate Partner \u2013 CIO Advisory Practice. Ross has over 25 years of enterprise IT experience. He started with roles in engineering and infrastructure, then moved into strategic cloud consultancy on Azure and AWS. More recently, Ross has focused on complex cloud transformations and alliances partner networks. Ross.Oldbury@infosysconsulting.com Wahid Sattar, Senior Principal \u2013 CIO Advisory Practice. With over 15 years\u2019 experience, Wahid is a pragmatic business transformation and technology enablement specialist who delivers across the CIO, CTO and digital spectrum. Wahid.Sattar@infosysconsulting.com \n\n---\n\n Page: 10 / 10 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts who lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. 
To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2023 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "} +{"text": "# Infosys POV \nTitle: Thriving in the Post-Covid Insurance Growth \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 28 \n\n---\n\n An Infosys Consulting Perspective By Aaro Kauppinen, Sagar Roongta, and Jitin Sharma Consulting@Infosys.com | InfosysConsultingInsights.com THRIVING IN THE POST-COVID INSURANCE GROWTH Insurance in Emerging South-East Asia \n\n---\n\n Page: 2 / 28 \n\n---\n\n CONTENTS 2 1.<|endoftext|>Introduction 2.<|endoftext|>Operating Environment Changes 3.<|endoftext|>Changed Consumer Preferences 4.<|endoftext|>Succeeding in the Next Normal 5.<|endoftext|>Recommendations for Insurers Thriving in the Post-Covid Insurance Growth \u00a9 2023 Infosys Consulting \n\n---\n\n Page: 3 / 28 \n\n---\n\n Successful insurers gain market share through CX and personalization Emerging South-East Asian markets are seeing tremendous growth in insurance. We expect +9-11% Gross Written Premiums (GWP) growth year-on-year until 2025 across the region, contributing to incremental 34-40 billion USD in GWP. 
This is much welcomed after the COVID-induced setback. Despite COVID and other regionally impacting headwinds, like China\u2019s extended economic slowdown or the war in Ukraine, several insurers have been able to grow their premium income through this tumultuous period. Our analysis of the success stories in the region highlights three high-impact opportunities for insurers to grow at a market-beating rate: 1. Personalized Customer Experience, 2. Distribution and Ecosystem, and 3. Product Innovation. In addition to these three high-impact opportunities, we identified several other opportunities for insurers to improve their market position: scenario-based business planning, growth market investments, tailored pricing and commercial excellence, product portfolio rationalization, operational efficiency, AI and analytical customer models, and wearables and the Internet of Things. What should insurers in South-East Asia do to reap the benefits of the post-COVID growth momentum? \n\n---\n\n Page: 4 / 28 \n\n---\n\n INTRODUCTION This point of view is structured into 4 chapters. First, we explore the macro-level operating environment impacts of COVID on insurance and apply a driver-based model to forecast insurance growth in the region. Second, we summarize the changed consumer preferences towards insurance. Third, we highlight key success stories of insurers that were able to beat the market throughout this tumultuous period. And in our fourth chapter, we summarize the key learnings on how insurers could accelerate their growth in the post-COVID economy. Product innovation provided two growth opportunities for insurers: response to market changes and increased relevancy. 
The most agile insurers were able to release new products covering COVID-related risks. Financial and insurance literacy has not developed at the rate of digital technology adoption; targeted, customer-friendly products have been a success driver for insurers to penetrate previously uninsured segments. Distribution and sales are still largely driven through agency and broker channels in the region. Individuals still want personal contact to understand the relatively complex insurance products. Supporting agents and brokers with digital tools and training, and thereby enabling the digital customer experience through an intermediary, is a major growth driver. Ecosystem participation, including bancassurance, is prevalent throughout the region. Providing the right offers and tools for the right partners through the right ecosystems is key to growth. Personalized customer experience is a digital- and analytics-driven approach to improving insurance income, GWP, customer satisfaction, and the cost-to-income ratio. Targeted and tailored messaging from marketing, through sales, underwriting, claims and the rest of the customer journey has proven its value in the region. Consumers in emerging South-East Asia are generally favorable to sharing their data for personalized experiences, and they have adopted mobile and digital technologies at rates that match or even beat the developed APAC markets. \n\n---\n\n Page: 5 / 28 \n\n---\n\n The COVID-induced drop in the economy was succeeded by accelerated growth The immediate impact of COVID was restrictive government policies and a reduction in economic activity. GDP for emerging South-East Asia dropped by 4.3% in 2020 from 2019 levels and recovered only in the following year, 2021. 
Despite the recovery, with accelerated GDP growth of 8.0%, GDP will not reach the pre-COVID growth trajectory before 2025.1 China has been the principal driver of economic growth in Asia, and its restrictive COVID policies extended the depressed economic growth. The sudden loosening of China\u2019s restrictions has resulted in a mixed immediate response. We expect that the loosened policies will eventually re-accelerate its economic engine, bringing growth for the entire region. Furthermore, the war in Ukraine and the resulting sanctions have led to a further reduction of growth prospects; South-East Asia is geographically relatively far from the war, but the impacts on trade, food and energy have extended to the region.2 OPERATING ENVIRONMENT CHANGES [Figure: Emerging South-East Asia GDP by country, 2017-2025 (B USD), covering Vietnam, Malaysia, the Philippines, Thailand and Indonesia; annotated growth rates +6.8%, +8.0%, +7.9% and -4.3% (2020). The others include Myanmar, Cambodia and Laos.] KEY ACTIONS FOR INSURERS \u2713 Prepare for growth: SEA economies will exceed pre-pandemic growth rates. \u2713 Use scenario-based business planning: discrete world events dramatically impact the macro-economic environment. \n\n---\n\n Page: 6 / 28 \n\n---\n\n Out of the South-East Asian markets, we analyzed Indonesia, Thailand, Malaysia, Vietnam, and the Philippines, which together constitute 97% of the region\u2019s GDP. The COVID-induced financial hardship led to overall reduced spending on insurance. Pre-COVID, from 2016 to 2019, total GWP had grown 7.1% annually (in USD). From 2019 to 2020 it shrank by 1.1%. 
Post-COVID, in 2021, insurance spending increased by 6.8%.3, 4, 5, 6, 7 We forecast GWP to grow 9-11% year-over-year until 2025, aligned with the elevated GDP growth. Vietnam and Indonesia especially are expected to be the growth drivers, contributing ~50% of the total growth of the 5 countries. OPERATING ENVIRONMENT CHANGES KEY ACTIONS FOR INSURERS \u2713 Focus on growth markets: especially Vietnam and Indonesia. \u2713 Proactive pricing: inflation-driven growth needs active management of policy prices; adjust the policy price review cadence to match increased claim costs and market requirements. \u2713 Differentiated product adjustments: increase prices for premium products, which are less price-sensitive; reduce the coverage, and therefore the exposure, of basic products to alleviate price-increase pressure on highly price-sensitive customer groups. [Figure: GWP in SEA emerging markets, 2016-2025 (B USD), by country: Vietnam, Malaysia, Philippines, Thailand, Indonesia.] \n\n---\n\n Page: 7 / 28 \n\n---\n\n OPERATING ENVIRONMENT CHANGES The 2020 dip in GWP resulted in higher insurance penetration, as GDP declined faster than GWP. We forecast mild insurance penetration development in all countries except Vietnam. 
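As a side note on the arithmetic behind growth figures such as the 7.1% pre-COVID rate, the compound annual growth rate (CAGR) used throughout this paper can be computed as follows. The sketch and its sample numbers are purely illustrative:

```python
def cagr(begin_value, end_value, years):
    """Compound annual growth rate between two values `years` apart."""
    return (end_value / begin_value) ** (1.0 / years) - 1.0

# Purely illustrative numbers: a market growing from 100 to 121 B USD
# over two years compounds at exactly 10% per year.
growth = cagr(100.0, 121.0, 2)
```

The same formula underlies the per-country growth trends and forecasts discussed in this chapter.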
Vietnam\u2019s insurance penetration is expected to grow by 30% by 2025, driven by its current low baseline, strong economic recovery, legalization of online insurance, urbanization, a growing middle-class population, and easier access to insurance, especially through the bancassurance channel.8 GWP growth trend and forecast (annual GWP growth by country): \n Country | 2016-2019 | 2019-2020 | 2020-2021 | Forecast 22-25 | Value-add B USD (22-25) \n Indonesia | 4.5% | -9.7% | -1.1% | 9% \u2013 12% | 8.7 \u2013 10.4 \n Malaysia | 5.8% | 1.2% | 6.6% | 8% \u2013 9% | 7.2 \u2013 8.0 \n Philippines | 4.2% | 4.0% | 36.7% | 6% \u2013 10% | 1.9 \u2013 3.4 \n Thailand | 7.8% | -1.9% | 2.1% | ~6% | 7.2 \u2013 8.0 \n Vietnam | 20.9% | 17.0% | 18.0% | 18% \u2013 20% | 9.1 \u2013 10.1 \n Total | 7.1% | -1.1% | 6.8% | 9% \u2013 11% | 34 \u2013 40 \n Insurance penetration forecast (GWP / GDP): \n Country | 2019 | 2021 | 2025 FC \n Thailand | 5.08% | 5.39% | 5.4% \n Malaysia | 4.53% | 4.52% | 4.6% \n Vietnam | 2.10% | 2.60% | 3.3% \n Philippines | 1.54% | 2.10% | 2.1% \n Indonesia | 1.73% | 1.46% | 1.7% \n GWP forecast model: macroeconomic drivers \u2022 GDP (USD) \u2022 Population \u2022 GDP / capita (USD) \u2022 Inflation \u2022 Unemployment; insurance market \u2022 Gross Written Premium \u2022 Insurance Penetration. Driver-based model forecast: \u2022 2 macro-economic driver models: insurance penetration and insurance density \u2022 Models based on 2015-2021, excluding 2020 (the COVID year) \u2022 P-value for both models <0.005 \u2022 GWP model is a weighted average of both models for best fit against actuals \u2022 GWP forecast based on the driver-based model applied to IMF\u2019s macro-economic forecast (October 2022) \u2022 Insurance penetration and density calculated from forecast GWP against IMF\u2019s relevant macro drivers, GDP and population \n\n---\n\n Page: 8 / 28 \n\n---\n\n OPERATING ENVIRONMENT CHANGES Our model suggests that insurance density continues to develop largely at pre-COVID rates. 
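The driver-based forecast just described can be sketched in code: two simple trend models, one for insurance penetration (GWP / GDP) and one for insurance density (GWP / population), fitted on history excluding the 2020 COVID year and blended as a weighted average. This is an illustrative reconstruction, not the authors' actual model; the linear-trend form and the 0.5 weight are assumptions:

```python
def fit_linear_trend(years, values):
    """Ordinary least-squares fit of value = a + b * year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values)) / \
        sum((x - mean_x) ** 2 for x in years)
    a = mean_y - b * mean_x
    return a, b

def forecast_gwp(history, macro_forecast, target_year, weight=0.5):
    """history: {year: (gwp, gdp, population)}; macro_forecast: (gdp, pop).

    Fits a penetration trend and a density trend on all years except the
    2020 COVID year, projects both to target_year, and blends them.
    """
    years = [y for y in sorted(history) if y != 2020]      # drop COVID year
    pen = [history[y][0] / history[y][1] for y in years]   # GWP / GDP
    den = [history[y][0] / history[y][2] for y in years]   # GWP / population
    a_p, b_p = fit_linear_trend(years, pen)
    a_d, b_d = fit_linear_trend(years, den)
    gdp_f, pop_f = macro_forecast
    gwp_via_pen = (a_p + b_p * target_year) * gdp_f        # penetration model
    gwp_via_den = (a_d + b_d * target_year) * pop_f        # density model
    return weight * gwp_via_pen + (1 - weight) * gwp_via_den
```

In the paper's version the two models are fitted against real macro drivers and the weight is tuned for best fit against actuals; the sketch only shows the shape of blending two driver models into one GWP forecast.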
Vietnam is the only country in our model that is expected to see a significant take-off from its peers, Indonesia and the Philippines. The share of the population that is insured depends on insurance type and country. For example, in Thailand, 75% of the population are covered by health insurance and the rest are covered by national healthcare systems. In the Philippines and Vietnam, circa 90% of the population are covered by national health insurance. In Malaysia and Indonesia, on the other hand, over 50% and 35% of the population, respectively, do not have health coverage.9, 10, 11, 12, 13 In summary, while COVID induced a bump in the road, we forecast that future growth will be 40% faster than pre-COVID. The growth is driven by increased digital adoption promoting easier access, urbanization and the growth of the middle class generating more demand, and increased interest in health and life insurance due to the pandemic. [Figure: Insurance density in SEA emerging markets, 2016-2025 (GWP / person, USD), by country: Vietnam, Malaysia, Philippines, Thailand, Indonesia.] KEY ACTIONS FOR INSURERS \u2713 Reap the growth now, while the pandemic is still fresh in memory. \u2713 Seize the opportunity in growth markets, especially the growing urban middle class. \u2713 Focus on reach and easy-to-understand offerings in underinsured markets. \n\n---\n\n Page: 9 / 28 \n\n---\n\n Three years of digitalization in three months CHANGED CONSUMER PREFERENCES Overall, COVID increased attention to insurance, especially health and life products. The persistence of COVID, with its several waves and increasing number of variants, has kept health topics, including insurance, in the public discourse. 
Not all news has been positive though, e.g., regarding insurers\u2019 inability to pay claims, or the rehabilitation of insurers facing bankruptcy. Despite the controversies, we see that the demand for insurance has grown in the region. Firstly, COVID deeply changed the way people behave and what they expect. In general, it forced everyone to adopt digital channels one way or another. Secondly, the impacts of COVID were very personal in nature; we see different responses between user groups depending on how they were impacted by COVID. In the next sub-chapters, we address both effects and then summarize the best practices for insurers. \n\n---\n\n Page: 10 / 28 \n\n---\n\n COVID doubled digital channel adoption in a single year CHANGED CONSUMER PREFERENCES Working from home and social distancing measures led consumers to adopt digital technologies much faster than anticipated pre-COVID. In many respects, the emerging APAC markets have surpassed the developed APAC markets in digital adoption. For example, in 2022, 43% of people in the emerging markets managed their policies digitally, while only 23% in the developed markets did so. In financial services in the emerging Asia-Pacific, digital banking usage nearly doubled from 55% to 88% of the population from 2017 to 2021. While many survey respondents say that they are interested in buying insurance through an online channel, only 25% of those willing to buy insurance online actually do so. The blockers include limited online channel availability and difficulties using the channel.14, 15, 16 Digital insurance adoption may also have been inhibited by regulatory restrictions; e.g., in Thailand and Vietnam, online insurance sales have been limited by the regulator. 
The Philippines, on the other hand, has seen preference for the online channel as the principal interaction channel double from under 30% to more than 60% from pre- to post-COVID.17 In emerging South-East Asia, the digital channel effectively means the mobile channel. Due to limitations of public infrastructure, the smartphone is the principal tool for accessing the Internet and online services in both urban and rural areas. The digital channel is not limited to the direct channel; consumers expect digital interactions embedded in the human interaction with an agent or a broker. Nearly half of the people in the region say that they would want at least some human interaction when purchasing new policies. Omnichannel interactions, where one part of the customer\u2019s journey is digital and another part is handled through traditional channels, are even more important during the transition and reset into the post-COVID world.18 [Figure: Philippines insurance customers' channel preference, pre- vs post-COVID: in person 55% to 20%, by phone 16% to 18%, digitally 28% to 60%, other 1% to 2%.] KEY ACTIONS FOR INSURERS \u2713 Double down on digital channels, especially mobile. \u2713 Extend digitalization through the agent and broker channels. \u2713 Build omnichannel capabilities for seamless customer journeys across digital and non-digital touchpoints. \n\n---\n\n Page: 11 / 28 \n\n---\n\n COVID is a strong use case for the benefits of behavioral segmentation CHANGED CONSUMER PREFERENCES Some people were financially relatively unimpacted by COVID, while others might have needed to tap into their savings, lose a job, or make other adjustments in their lives. E.g., in the Philippines, unemployment doubled from 5.1% to 10.4% from 2019 to 2020. 
Those most impacted by COVID were 5-10% more interested in acquiring insurance than their less-impacted peers. Very importantly, only 50% of people who were negatively impacted by COVID were interested in sharing their personal data for personalized pricing, while 80% of those who were less impacted were willing to share that detail. Depending on the country, 66-82% of respondents in emerging South-East Asia were willing to share personal data to get a customized insurance plan.9, 19 [Figure: Philippines insurance customers\u2019 willingness to share personal data for a personalized price: 80% among those less impacted by COVID vs 50% among those more impacted.] \n\n---\n\n Page: 12 / 28 \n\n---\n\n CHANGED CONSUMER PREFERENCES COVID induced a wave of insurance terminations, but also new policies. For example, in Philippines life insurance, switching increased by 35% from pre-COVID levels. A study in the USA examined similar switching behavior and identified price, experience, and product as the most significant reasons for switching insurance. The same was applicable for both life and non-life. 
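As an illustration of the behavioral models discussed around switching, a toy churn-propensity score over the three drivers named above (price, experience, product) might look as follows. Every feature name and weight here is invented for illustration; a real model would be trained on policy-level data:

```python
import math

# Toy illustration only: hand-set weights over invented features standing in
# for the three switching drivers (price, experience, product).
WEIGHTS = {
    "price_increase_pct": 0.08,      # premium increase at renewal, in %
    "poor_claims_experience": 1.2,   # 1.0 if the last claim went badly
    "product_fit_gap": 0.9,          # 0..1 mismatch between needs and cover
}
BIAS = -3.0

def churn_propensity(features):
    """Logistic score in (0, 1); higher means more likely to switch."""
    z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))
```

A policyholder facing a large price hike and a poor claims experience scores far higher than an undisturbed one, which is exactly the signal a retention team would act on with a save offer or pricing adjustment.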
This switching behavior can be forecasted and prevented using advanced behavioral models.20 [Figure: Philippines life insurance policies, 2017-2021: policies in effect at end of year (x1,000), with new policies, terminated policies and estimated switches as % of total; estimated switches up +35%.] KEY ACTIONS FOR INSURERS \u2713 Apply dynamic pricing and contextual offers to increase conversion and average policy value. \u2713 Use AI and customer behavioral models to predict and prevent customer churn. \n\n---\n\n Page: 13 / 28 \n\n---\n\n Customer experience drivers that make an impact CHANGED CONSUMER PREFERENCES Advances in digitalization enable 5 key customer experience trends: personalization, the Internet of Things, immediate response, ecosystem offerings, and product innovation. All these trends have been further accelerated by digital adoption among both customers and insurers. Insurers who succeed in customer experience can expect up to 15% higher premium income and a 10-40% reduction in operating expenses compared to their less successful peers.21 Personalization covers topics such as targeted marketing, product tailoring / bundling, dynamic pricing and policy administration. Data-driven approaches allow individual, customer-specific tailoring as long as the marketing materials, products, prices, customer journeys, and other content or processes are parametrized in such a way that an algorithm can define the appropriate approach based on the targeted customer\u2019s information. Personalization improves reach, and increases click-through rates and conversion, the number of policies per customer, and the average value of the policies. The Internet of Things is typically used to gather health- and usage-related information, e.g., from smartwatches, smartphones or a car\u2019s systems. 
This influx of data will enable improved targeting and customer risk assessments. With the customer\u2019s consent and suitable apps, the same data can be used, e.g., to alert when accidents occur, identify fraud, collect details for a claim application, and recommend medical visits for preventive health. IoT is currently a nice-to-have value-add, but it will become more pervasive across the insurance value chain as IoT devices become more prevalent. Pioneering customers are already asking for and expecting insurance products that collect their data for tailored pricing. \n\n---\n\n Page: 14 / 28 \n\n---\n\n CHANGED CONSUMER PREFERENCES Immediate response is the customer\u2019s experience of straight-through processing. Whether the process is underwriting or claim handling, customers increasingly expect immediate responses. These customer expectations of the digital world arise from leading non-insurance digital services: e.g., Netflix provisions all of its content immediately after subscription; Grab immediately shows the car and driver\u2019s details, including estimated time of arrival, once a ride is ordered. These expectations are liquid: customers expect them from all digital services, and increasingly also from the B2B interactions they have. Process mining, automation, digital applications, e-KYC, e-signatures, intelligent assessment of customer claims, and automated fraud checks are a few opportunities insurers are pursuing to improve their response times. Operational uplifts focus on turn-around time, but they also reduce manual effort and operational cost, and improve the customer experience of each transaction. The ecosystem trend enables insurers to serve their customers as part of a larger ecosystem. Insurers\u2019 typical partners include banks as a distribution channel for bancassurance. 
The insurance ecosystem may extend to insurtech, 3rd-party data exchanges / provision, customer risk analyses, white-labeled underwriting, retail and business collaboration for distributing bundled insurance products, alliances between healthcare and insurance, and financial consolidation / asset management. The ecosystem will need a robust technology platform to facilitate the layered exchanges through APIs; it may be that the platform provider is yet another 3rd party in the ecosystem. Ecosystem and platform offerings will require careful disaggregation of the business model: targeted white-label products and services focus on volume and efficiency, whereas holistic service provision requires a market-leading customer experience layer and a plethora of enticing offers from the ecosystem. Ecosystems enable growth and efficiency based on where the insurer wants to focus. Product innovation is supercharged by digital capabilities. New digital technologies enable, e.g., usage-based pricing, modular offerings, parametrized products, disaggregated insurance policies, and low-value micro-insurance. Further, digitalization enables supercharging of the product innovation and market testing processes, e.g., through simulation and automated A/B testing. We have witnessed that a range of specific, customer-friendly insurance policies has enabled faster growth compared to policies with extensive and difficult-to-understand coverage. In emerging markets with limited insurance literacy, simplification and concreteness have been successful. 
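The automated A/B product testing mentioned above boils down to a standard comparison of conversion rates between two product variants. A minimal sketch, with invented sample figures, using a two-proportion z-test:

```python
from math import sqrt

def ab_z_score(conversions_a, n_a, conversions_b, n_b):
    """Two-proportion z-statistic; |z| greater than ~1.96 suggests a real
    difference in conversion rates at the usual 5% significance level."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented figures: a simplified policy variant converting 15% vs the
# original's 10%, on 1,000 quote journeys each.
z = ab_z_score(100, 1000, 150, 1000)
```

An automated testing pipeline would run such comparisons continuously across product variants and route traffic toward the winners; the statistics themselves are this simple.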
The most market-responsive insurers were able to provide targeted health insurance products for COVID and for the potential side effects of COVID vaccinations. KEY ACTIONS FOR INSURERS \u2713 Focus on customer experience with a dedicated team owning the customer experience end-to-end \u2713 Innovate and launch new products to market alone or as part of an ecosystem \u2713 Treat operational excellence as a driver for customer experience \n\n---\n\n Page: 15 / 28 \n\n---\n\n Selected high-performing insurers and their growth relative to market True stories of driving insurance growth from the market SUCCEEDING IN THE NEXT NORMAL In 2020, businesses needed to respond quickly to the abruptly changed business environment. Financially, insurers saw increased claims and a general downturn in sales. At the operational level, work-from-home policies mandated a massive shift to remote working, forcing insurers to make their private on-premises networks accessible remotely. That shift in connectivity digitalized parts of many customer-facing and back-end processes, resulting in hybrid workflows. Essentially, COVID compressed insurers\u2019 digital transformation plans from 3 years to 3 months. The most successful insurers were able to capitalize on digital, mobile-first channels when interacting with their customers. They were able to retrain their workforce to use the new digital technologies and utilize agile ways of working to seize opportunities as they arose. Furthermore, leveraging partnerships helped customers gain access to services, bringing value to the entire ecosystem. We analyzed the Philippines, Malaysia, and Thailand markets to identify the leaders through COVID. We highlight 3 success stories showcasing growth through the pandemic and market-share gains in their respective markets. 
Insurer | Country | Segment | CAGR 2017-2021 (company vs. segment) | Company growth relative to segment | Growth drivers (\u2713 among: Personalized CX, New Product Innovation, Distribution & Ecosystem)\nPrudential | Thailand | Life | 7.5% vs. 0.2% | 47x | \u2713 \u2713\nAIA | Thailand | Life | 4.4% vs. 0.2% | 28x | \u2713 \u2713 \u2713\nEtiqa | Malaysia | Non-Life | 7.3% vs. 0.5% | 13x | \u2713 \u2713 \u2713\nAllianz | Malaysia | Non-Life | 3.1% vs. 0.5% | 5.8x | \u2713 \u2713 \u2713\nMalayan | Philippines | Non-Life | 12.7% vs. 2.7% | 4.8x | \u2713 \u2713\nAllianz | Philippines | Life | 54.9% vs. 11.8% | 4.7x | \u2713 \u2713 \u2713\nFWD | Philippines | Life | 43.4% vs. 11.8% | 3.7x | \u2713 \u2713 \u2713\nPioneer | Philippines | Non-Life | 9.5% vs. 2.7% | 3.6x | \u2713 \u2713\nBangkok | Thailand | Non-Life | 10.5% vs. 4.7% | 2.2x | \u2713 \u2713\nDhipaya | Thailand | Non-Life | 10.0% vs. 4.7% | 2.1x | \u2713 \u2713\nEtiqa | Malaysia | Life | 13.9% vs. 7.0% | 2.0x | \u2713 \u2713\nAllianz | Malaysia | Life | 11.7% vs. 7.0% | 1.7x | \u2713 \u2713 \u2713\nPrudential | Malaysia | Life | 8.7% vs. 7.0% | 1.3x | \u2713 \u2713\nThe insurers presented in the table have significant size in their market and market-beating growth rates. This therefore leaves out fast-growing companies that are comparatively small in their market. Information is collated from each country\u2019s official insurance statistics and may differ from companies\u2019 own annual reports. \n\n---\n\n Page: 16 / 28 \n\n---\n\n [Chart: Thailand life insurers GWP (B USD), 2017-2021] Success Story \u2013 AIA, Thailand SUCCEEDING IN THE NEXT NORMAL Thailand is the largest insurance market in ASEAN. Its insurance penetration reached 5.4% in 2021, the highest among its peer countries. Its GWP has grown at a CAGR of 2.9% from 2017 to 2021. The life insurance segment contributes close to 75% of the overall GWP in Thailand. However, the progress of the life segment has been tumultuous, with a 0.1% CAGR. 
2021 showed 2.0% GWP growth after two consecutive years of decline. The largest insurer in the industry, AIA Thailand, was also the fastest growing during the period. Its GWP grew at 3.5% CAGR, increasing its market share from 21% to 25%. AIA Thailand\u2019s success can be attributed to three key strategic initiatives: a shift towards higher-margin products, tech-driven agency development programs, and a digitalized customer experience. AIA made a strategic shift towards selling higher-margin products such as unit-linked and long-term protection products, which became more popular due to the pandemic. As a result, in 2021, AIA Thailand\u2019s profitability from the life segment improved by 19 percentage points compared to the previous year, from 71% to 90%. AIA also shrewdly used bundling strategies to gain additional revenue. It generated 20 M USD in additional revenue by cross-selling its protection products to the 17,000+ customers looking for COVID vaccine side-effect protection.22 [Chart legend: CAGRs of Thailand life insurers and segment; * FWD Thailand acquired Siam City Insurance in 2019.] \n\n---\n\n Page: 17 / 28 \n\n---\n\n SUCCEEDING IN THE NEXT NORMAL AIA deployed digital solutions for agency productivity and quality improvements. Its award-winning agency talent development program embeds analytics throughout the agency\u2019s development lifecycle, providing a one-stop solution with a digitalized and seamless experience from prospecting, candidate assessment and selection, to personalized coaching and developmental programs. Through this, AIA can identify high-potential agents early and create customized support for their retention and continuous development.23 AIA strives to become Thailand\u2019s first digital insurer, with omnichannel digital customer experiences to serve new and existing customers. 
Besides its telehealth and telemedicine programs, it launched ALive, a personal assistant app for the underserved young-families segment (18-34 years old). Also available to non-AIA customers, the app gives users access to a wide range of services and a community for parenting. The app had over a million downloads with a rating of 4.5/5.0 on Play Store. The app generated 20% online-to-offline leads and reactivated dormant customers to drive sales.24 The AIA iService app has also brought substantial operational efficiencies. Through the iService app, customers can manage their own policies and make claims. AIA was able to resolve 76% of minor health claims within the same day as of 2021.25, 22 KEY ACTIONS FOR INSURERS \u2713 Identify underserved customer segments to create outcome-based offerings with a pull-based sales model \u2713 Employ HR analytics to identify high-potential talent and handhold them with AI-powered learning paths for sustained agent productivity \u2713 Use agency management platforms for agency differentiation, to drive higher productivity and improved agent retention \n\n---\n\n Page: 18 / 28 \n\n---\n\n SUCCESS STORY \u2013 FWD, Philippines SUCCEEDING IN THE NEXT NORMAL The Philippines\u2019 insurance penetration has hovered around 1.6%. However, the pandemic caused a step-change in insurance purchases, with insurance penetration increasing to 2.1% in 2021. As part of this step-change, life insurance growth averaged 11.8% CAGR. FWD\u2019s life insurance GWP has grown close to four times faster than the market, at 43.4% CAGR, increasing its market share from 2% in 2016 to 6% in 2021. It has become the 6th largest life insurer in the Philippines. 
This remarkable, continuous growth was built on three key pillars: creating innovative holistic product solutions, increasing accessibility through ecosystems and partnerships, and utilizing digital technologies for superior customer experience. [Chart: Philippines life insurance GWP (B USD), 2017-2021] \n\n---\n\n Page: 19 / 28 \n\n---\n\n SUCCEEDING IN THE NEXT NORMAL In 2021, FWD introduced new holistic product solutions like \u2018Babyproof\u2019, \u2018Family Hero\u2019, \u2018Manifest\u2019, and \u2018Health and Wellbeing\u2019 that catered to end-to-end consumer needs. These products targeted young adults and families and were promoted extensively on digital channels to increase their accessibility during the pandemic. These tailored products targeted consumers who were relatively new in their insurance journey and provided guidance on how to build and secure long-term wealth.26 FWD built multiple partnerships to increase product distribution. For example, it partnered with a pawnshop chain to make affordable insurance plans and digital solutions available to the pawnshop\u2019s 30 million customers across its 2,500 branches nationwide. In another partnership, with insurance aggregator Kwik.Insure, FWD introduced a chatbot, \u201cFi\u201d, in its customer servicing processes. The chatbot can handle customer product enquiries, product recommendations, agent appointments, and claims activities. 
The chatbot received a customer rating of 4.9/5 and increased operational efficiency by freeing agents to focus on more complex tasks.26 KEY ACTIONS FOR INSURERS \u2713 Take advantage of bundling and unbundling products to create targeted and tailored offerings \u2713 Forge partnerships to increase sales beyond traditional channels \u2713 Use self-help tools, like chatbots and AI, to improve customer experiences while also realizing operational efficiencies \n\n---\n\n Page: 20 / 28 \n\n---\n\n [Chart: Malaysia life insurers GWP (B USD), 2017-2021] Success Story \u2013 Allianz Life, Malaysia SUCCEEDING IN THE NEXT NORMAL Malaysia is the second largest insurance market in the ASEAN region after Thailand. Close to 70% of the total 18.3 B USD GWP in 2021 was life insurance, which grew at a 5.5% CAGR. Malaysian life insurance growth was driven by increased demand during COVID and the ability of insurers to sell and provide services digitally. Among the biggest 5 life insurers, Allianz was the fastest growing, with a CAGR of 9.2% from 2017 to 2021, roughly twice as fast as the market. Its market share improved from 7% to 9% in the same period.27 There were three major drivers of its outperformance: the launch of new innovative offerings, digitalization of processes, and a variety of agency development programs. Allianz launched innovative new offerings that targeted evolving customer needs and contained minimal negative surprises. In 2021, it launched \u2018PreciousCover\u2019 and \u2018BabyCover\u2019, which provided coverage for mother and child through the pre- and post-natal period with added benefits such as mental health coverage, hospitalization benefits, and juvenile critical illness. It also launched endowment products with simplified underwriting and guaranteed acceptance rules to address customer hesitance during the pandemic. 
Allianz also participated in a government-endorsed micro-insurance program, Perlindungan Tenang, which increased penetration among the bottom 40% of households, primarily via digital channels.27 [Chart legend: CAGRs of Malaysia life insurers and segment] \n\n---\n\n Page: 21 / 28 \n\n---\n\n SUCCEEDING IN THE NEXT NORMAL Digitalization of processes was also critical, especially during COVID, when Allianz could quickly pivot to digital distribution and policy administration. It enabled an end-to-end digital claims experience, from submission to reimbursement, through its MyAllianz customer portal and mobile app. Customers could also access their medical cards and guarantee letters at any time via the mobile app. It incorporated OCR, facial recognition, and a video call option in its KYC processes, which allowed agents to remotely perform end-to-end customer onboarding in just five minutes for non-complex cases.27 This digitalization initiative was extended to its agency development programs. Agency partners contributed 70+% of the written premiums for Allianz. The insurer launched a series of online training sessions on remote sales, product training, digital marketing, use of digital tools, and soft skills to uplift agent productivity. Its agency partners valued the trainings, with an average attendance of 800 agents per session. 
To make agency partner growth more sustainable, Allianz Life also launched a 24-month \u201cC.E.O Program\u201d in 2019 for its agency partners to build a quality talent pipeline and enhance their capabilities.28 KEY ACTIONS FOR INSURERS \u2713 Use needs-based analysis to design new product offerings \u2713 Address customer hesitance for online purchases by simplifying product offerings and removing negative surprises \u2713 Use automation & AI-based tools for routine tasks to reduce turnaround times \u2713 Actively engage agency partners and new partnership models to enhance productivity and product accessibility \n\n---\n\n Page: 22 / 28 \n\n---\n\n We recommend a customer-experience driven strategy The best way for insurers to achieve market-beating growth in emerging South-East Asian markets is to double down on personalized customer experience, digital omnichannel across the E2E customer journey, and customer-friendly product innovation. The success stories highlighted reinforce this recommendation, which is justified by the global and regional trends. There are 6 key areas for insurers to embark on this journey. We have witnessed massive growth for insurers who have implemented all or part of these actions. The requirements and prioritization depend on the insurer, the market, and, most importantly, leadership ambition. 
Infosys Consulting is the partner of choice for insurers seeking transformations, whether laser-focused interventions or enterprise-wide step-changes. RECOMMENDATIONS FOR INSURERS 1. Define a customer-experience-led growth strategy 2. Design the to-be customer experience 3. Lead ecosystems for growth 4. Implement product innovation at scale 5. Define the to-be technologies needed 6. Design the to-be processes and operating model \n\n---\n\n Page: 23 / 28 \n\n---\n\n [Figure: Infosys Consulting approach to customer-experience-driven strategy \u2013 persona modelling of target customer segments (goals, needs, expectations, preferences), omnichannel journey design across touchpoints and channels (direct, agent, broker, portal, 3rd party; web, app, wearables, phone call, video call, mail, API, in person, social), and the supporting operating model (culture, technology & data, organization & roles, talent, ways of working, processes, governance, KPIs & control)] 1 Define a customer-experience-led growth strategy Articulate the leadership ambition for the customer experience vision relative to your market competition for each customer segment. Set concrete goals for growth, market share, NPS, customers\u2019 brand perception, and ecosystem participation. Prepare the growth-focused business case to justify the investments needed for the customer experience uplift. Define a concrete roadmap ahead. 2 Design the to-be customer experience Define the to-be end-to-end customer journeys across the touchpoints that would realize the customer experience vision. Define the levels of personalization and contextualization expected for every interaction, from marketing to processes, products, and pricing. Do not overlook the enterprise and SME customer journeys. 
Prioritize the growth-driving customer experience of marketing, sales, and distribution. The to-be customer experience sets requirements for the insurer\u2019s operations, especially around technology, data, processes, and people. Prioritize the requirements based on the overall customer experience uplift potential against the strategic growth goals. \n\n---\n\n Page: 24 / 28 \n\n---\n\n RECOMMENDATIONS FOR INSURERS 3 Lead ecosystems for growth Define the ecosystem vision, articulating the two sides of the ecosystem strategy: what the insurer-centric ecosystem is, and, on the other hand, what roles the insurer takes in the ecosystems of others. Define the experience for agents, brokers, banks, healthcare, repair shops, and any other relevant partners in the ecosystem. Focus the strategy on growth drivers, especially around marketing presence, distribution channels, customer experience, and product mix. When creating an insurance-centric ecosystem, define the business model for partnering. Define the technology platform required to realize the ecosystem ambition. 5 Define the to-be technologies needed Technology and data changes are the concrete steps in the customer-experience driven transformation. The technology and data architecture need to be designed to fulfil the customer experience requirements. The personalization, product innovation, and omnichannel requirements especially will likely require legacy technology uplifts. Data will be the key enabler for personalized experiences; data supply chains from sources to databases to analysis and usage will be crucial for realizing the growth. 
Define the data requirements; design the data engineering and data science practices and systems required. 4 Implement product innovation at scale New, customer-friendly insurance products are easy to design on paper, but legacy systems and processes might hinder their realization. Set up an innovation pipeline with an industrialized approach for each of the stages from idea incubation to prototyping, market testing, market launch, and scaling. Product portfolio requirements will increase substantially from modularization, disaggregation, parametrization, and personalization. Define the requirements for technology and data to realize the new products. 6 Design the to-be processes and operating model Powered by the new technology, design the to-be processes to realize the target customer experience. Focus on providing near-immediate turn-around times for all customer-facing activities through process elimination, simplification, automation, and parallelization. Support in-house and 3rd-party manual processes, e.g., with AI image analysis, next-best-action, recommendation engines, and price calculators. Define the new capabilities and roles needed, set training paths, and recalibrate the staffing needed to operate the new processes. \n\n---\n\n Page: 25 / 28 \n\n---\n\n 1. https://www.imf.org/en/Publications/WEO/weo-database/2022/October 2. https://www.imf.org/en/Blogs/Articles/2022/10/13/asia-sails-into-headwinds-from-rate-hikes-war-and-china-slowdown 3. https://stats.oecd.org/Index.aspx?QueryId=25444 4. https://www.insurance.gov.ph/ 5. https://www.bakermckenzie.com/-/media/files/locations/thailand/insurance-outlook-14th-edition.pdf 6. https://www.lexology.com/library/detail.aspx?g=7ce23ebe-600e-482d-aeb8-fe8e386c498c 7. https://mof.gov.vn/webcenter/portal/btcvn 8. 
https://www.febis.org/2022/04/01/vietnam-s-insurance-market-to-see-double-digit-growth-in-2022/ 9. https://equityhealthj.biomedcentral.com/articles/10.1186/s12939-021-01578-0 10. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6008596/ 11. https://www.globaldata.com/media/insurance/general-insurance-industry-vietnam-reach-3-5bn-2026-forecasts-globaldata/ 12. https://pubmed.ncbi.nlm.nih.gov/33853361/ 13. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9532955/#:~:text=Results%3A%20The%20prevalence%20of%20health,the%20Indonesian%20population%20is%2062.3%25. 14. https://services.google.com/fh/files/misc/e_conomy_sea_2022_report.pdf 15. https://www.swissre.com/reinsurance/life-and-health/l-h-risk-trends/consumers-embrace-digitalisation-points-better-protected-future.html 16. https://www.mckinsey.com/industries/financial-services/our-insights/emerging-markets-leap-forward-in-digital-banking-innovation-and-adoption 17. https://www.ey.com/en_ph/insurance/why-insurers-must-adapt-to-meet-the-changing-philippine-landscap 18. https://www.swissre.com/dam/jcr:f5cfd028-b7c9-4958-a5c3-06f7d2e720fc/ep_en_digital-adoption-in-personal-pc-insurance-in-southern-asia-webb.pdf 19. https://www2.deloitte.com/content/dam/Deloitte/sg/Documents/consumer-business/sea-cb-gacs-study-perspective.pdf 20. https://www.bain.com/insights/how-insurance-customers-are-responding-to-COVID/ 21. https://insuranceblog.accenture.com/reimagining-end-to-end-customer-experience-drive-growth REFERENCES \n\n---\n\n Page: 26 / 28 \n\n---\n\n 22. https://www.aia.com/content/dam/group/en/docs/annual-report/Annual Report 2021_E.pdf.coredownload.inline.pdf 23. https://www.aia.com.hk/en/about-aia/about-us/media-centre/press-releases/2022/aia-press-release-20221014 24. https://www.the-digital-insurer.com/award-application/aia-thailand-alive-powered-by-aia/ 25. 
https://www.bangkokpost.com/business/2286146/aia-aiming-to-help-1bn-people-across-asia-to-live-healthier-longer-better-lives-by-2030 26. https://www.fwd.com.ph/-/media/pdf/documents/fwd-ph-annual-report-2021-2.pdf?rev=20b476bd356540fe93fb6a0b02ccb947 27. https://www.allianz.com.my/content/dam/onemarketing/azmb/wwwallianzcommy/pdf/financial-reports/annual-reports/AllianzAnnualReport2021.pdf 28. https://www.allianz.com.my/content/dam/onemarketing/azmb/wwwallianzcommy/personal/campaigns/allianz-ceo-programme/AllianzCEOProgrammeBrochure_EN.pdf \n\n---\n\n Page: 27 / 28 \n\n---\n\n MEET THE EXPERTS JITIN SHARMA Associate Partner Singapore +65 8849 7197 Jitin.Sharma@infosysconsulting.com SAGAR ROONGTA Consultant Singapore +65 8264 6036 Sagar.Roongta@infosysconsulting.com AARO KAUPPINEN Principal Singapore +65 8870 2590 Aaro.Kauppinen@infosysconsulting.com \n\n---\n\n Page: 28 / 28 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts that lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2023 Infosys Limited, Bengaluru, India. All Rights Reserved. 
Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Transforming Quality Assurance Organizations by Enabling Dev\u201dT\u201dOps \nAuthor: Infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 4 \n\n---\n\n VIEW POINT Abstract Despite the rising popularity of DevOps in enabling agile development, its potential to transform the testing lifecycle is largely untapped. This paper examines the importance of DevOps in today\u2019s business landscape and explains why it should play a greater role in testing/QA organizations (Dev\u201dT\u201dOps). It also focuses on implementation view of DevOps along with its measurable benefits.<|endoftext|>TRANSFORMING QUALITY ASSURANCE ORGANIZATIONS BY ENABLING DEV\u201dT\u201dOPS \n\n---\n\n Page: 2 / 4 \n\n---\n\n Development Repository Validations Environment Provisioning Integration DevOps DevOps \u2013 The \u2018agile\u2019 catalyst The rapid pace of change in the present economy requires organizations to find new ways to quickly overcome business and technology challenges. The winning organizations are those that possess the will to transform and discover smarter ways of working. Today, there is an ever-increasing demand for quality software, which places greater focus on optimizing all stages of the software delivery lifecycle (SDLC). 
Thus, many players are adopting agile methodologies to stay relevant and achieve reliable and predictable business outcomes. In such a competitive landscape, DevOps has already become a vital catalyst as the preferred solution to address technical challenges such as: \u2022 Digital transformation: omnichannel experience, end-to-end business process workflow implementation, IoT device integrations \u2022 Cloud deployment: rapid application development (code publishing, security keys, container/stack/queue creation, full/half/partial deployments); on-demand virtual machines (auto-scaling, machine images); tracking of real-time usage (data, volume, and transactions) \u2022 Data: data replication, data provisioning, data deployment \u2022 APIs: microservices integration, service virtualization More importantly, DevOps is expanding beyond technology and is significantly impacting people as well as processes. DevTOps \u2013 Implementing DevOps in testing DevOps is a set of practices and processes that foster better collaboration and communication between developers and other professionals involved in operations. Despite its importance in enabling agility in development lifecycles, DevOps is often considered an add-on capability. Many organizations are unable to tap its full potential to transform testing lifecycles, which typically consume nearly 30% of their budgets or schedules. Principally, DevOps is about building a collaborative culture, adopting agile methodologies, exploiting automation to accelerate innovation, and providing rapid feedback to achieve common goals. As such, DevOps mandates close involvement with the QA organization. 
DevOps in QA is essential for many reasons, chief among them enabling rapid testing cycles, ensuring reliable finished products, providing predictable business outcomes, reducing cost, and improving SLAs. The time has come for enterprises to introduce DevOps in testing. Enabling DevTOps will help QA organizations promote DevOps across the SDLC for superior benefits. To achieve objectives such as speed and agility, many tool sets and point solutions (open source / COTS) are available. External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 4 \n\n---\n\n Over the past 15 years, test automation has come a long way. Techniques such as behavior-driven development (BDD) and acceptance test-driven development (ATDD) are enabling the much-needed shift-left in QA. Further, automation tools are being deployed intelligently across the software test lifecycle (STLC), thereby improving automation coverage beyond user interfaces (UIs), irrespective of disparate technologies. This increased automation coverage provides a great platform to practice DevTOps. In a continuous integration (CI) and continuous delivery (CD) pipeline, end-to-end automation is key. The pipeline consists of build automation, continuous integration, test automation, and deployment automation. The pipeline starts by building the binaries: cloning and caching the repository, hooking into or polling for changes, and merging the code branches to create the deliverables. This ensures that newly developed features are continuously integrated with the central code base. Each build compiles and unit-tests the code. Developers are informed of the health of their check-ins in real time, and failing check-ins are prohibited. Once the build is successful, it is automatically deployed for testing to ensure that it meets all desired exit criteria. 
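The check-in feedback loop described above can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the stage names (merge_branches, compile, unit_tests, deploy_to_test) and the notify_developer hook are hypothetical placeholders for real CI tooling.

```python
# Illustrative CI quality-gate sketch: every check-in runs the stages in
# order, and the first failing stage blocks the change and notifies the
# developer in real time. All names here are hypothetical.

def notify_developer(failed_stage):
    # Placeholder for a real notification hook (email, chat, CI dashboard).
    print(f"check-in blocked: stage '{failed_stage}' failed")

def run_checkin_pipeline(stages):
    """Run stages in order; stop at the first failure (the quality gate)."""
    executed = []
    for name, stage in stages:
        passed = stage()
        executed.append((name, passed))
        if not passed:
            notify_developer(name)  # real-time feedback on check-in health
            break                   # failing check-ins are prohibited
    return executed

stages = [
    ("merge_branches", lambda: True),   # merge code branches into the main line
    ("compile", lambda: True),          # build the binaries
    ("unit_tests", lambda: False),      # a failing gate blocks the check-in
    ("deploy_to_test", lambda: True),   # never reached when the gate fails
]

result = run_checkin_pipeline(stages)
```

A real pipeline would replace the lambdas with build-tool and test-runner invocations, but the gating logic is the same: later stages run only when every earlier stage has passed.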
Test automation is extended beyond functional testing to several non-functional areas such as security, performance, user experience, and compliance. Defects are automatically reported in the defect management tools. Once the application has passed the quality gates, it moves to the deployment pipeline. The deployment pipeline is supported by platform provisioning, environment validation, and system configuration management and verification. Deployments, whether on-premise or to a hosted environment such as Amazon Web Services (AWS), are automated, allowing environments to be provisioned or torn down automatically. Deployment to staging / production environments happens in a controlled manner, where changes are deployed partially or fully. The deployment is fully automated and executes in just a few minutes. Deployment automation ensures the application\u2019s quality and reliability, as well as its future scalability. Deployed applications are continuously monitored on a preventive basis: application and system logs are scanned regularly, and alerts are sent in case of issues. 
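The preventive log monitoring described above can be sketched as follows. This is a minimal illustration under assumed error patterns (ERROR, FATAL, OutOfMemory) and a pluggable alert callback; it is not tied to any specific monitoring product.

```python
# Illustrative preventive monitoring sketch: scan application/system log
# lines for known error patterns and raise an alert when issues are found.
# The patterns and the alert mechanism are assumptions for illustration.
import re

ERROR_PATTERN = re.compile(r"\b(ERROR|FATAL|OutOfMemory)\b")

def scan_log(lines):
    """Return the log lines that match known error patterns."""
    return [line for line in lines if ERROR_PATTERN.search(line)]

def monitor(lines, alert):
    """Scan a batch of log lines and send an alert if any issues are found."""
    issues = scan_log(lines)
    if issues:
        alert(issues)  # e.g., page the on-call engineer or open a ticket
    return issues

log_batch = [
    "2018-06-01 12:00:01 INFO  request served in 120 ms",
    "2018-06-01 12:00:02 ERROR connection pool exhausted",
    "2018-06-01 12:00:03 FATAL OutOfMemory in worker-3",
]

alerts = []
issues = monitor(log_batch, alerts.extend)
```

In production this scan would run continuously against streaming logs, and the alert callback could also trigger the automatic scaling actions mentioned next.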
Infrastructure resources such as memory and storage are controlled automatically, and scaling up or down happens in real time. [Figure: Continuous integration and continuous delivery pipeline \u2013 build automation (repository management, cloning/caching, hooking/polling, branch merging, code compilation, unit testing), test automation (functional, security, performance, and UX test clusters with defect automation), and deployment automation (platform provisioning, environment validation, configuration management, full/partial deployment, application and system log monitoring, scaling up/down)] External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 4 \n\n---\n\n Benefits of DevTOps Mature IT services organizations are already reaping the benefits of DevTOps investments across people, processes, and technologies. In these organizations, developers and testers collaborate better to achieve common goals and deliver top-quality products. DevTOps helps QA organizations prevent defects rather than detect them later. In doing so, DevTOps delivers measurable success and tangible benefits such as shorter timeframes to verify and implement changes, reduced average time taken to recover from production incidents, reduced failure rates during production changes, and greater bandwidth for innovation. Further, the return on investment in DevTOps is faster and more significant compared to other investments made by QA organizations. 
The key areas of improved ROI include lower cost due to reduced downtime/wait-time, faster time-to-market when launching new features, improved customer satisfaction and more opportunities to innovate.<|endoftext|>Conclusion Top quality and greater speed in development and operations are the main drivers for adopting DevOps. DevOps helps organizations accurately track changes to source code, manage development, testing and operational complexities and ensure self-healing. As new tools and techniques enable higher test automation coverage, there is a need to enable DevOps in testing as well. However, this can be challenging for traditional testers as they must shift to the center of development and operations activities and contribute to overall product vision and strategy. Such a change is critical to improving the cadence of production releases. QA organizations must focus on making strategic investments in their own operations by implementing DevTOps to enable software development engineers in test (SDETs) and focus on defect prevention rather than detection. By adopting DevTOps, QA teams can reduce production failures, recover quickly from production incidents, launch new features faster, and improve customer satisfaction \u2013 all of which improve ROI. Investing in DevTOps in QA drives innovation in testing that will continue to generate returns for the organization.<|endoftext|>\nAuthor Manish Kumar Pandey ManishK_Pandey@Infosys.com Delivery Manager, IVS Consulting \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. 
Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Leveraging Database Virtualization for Test Data Management \nAuthor: infosys Limited \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 8 \n\n---\n\n WHITE PAPER LEVERAGING DATABASE VIRTUALIZATION FOR TEST DATA MANAGEMENT Vikas Dewangan, Senior Technology Architect, Infosys Abstract Database virtualization is an emerging trend in test data management (TDM) and is all set to bring about a revolution in the provisioning of non-production data. These tools offer efficient virtual cloning of production databases for software application development and testing, without having to physically replicate the entire database. Coupled with powerful features like taking point-in- time snapshots, write backs to the source database, and rolling back changes; these tools have the potential to bring in significant cost savings and efficiency improvements. In this paper, we explore the concept of database virtualization, industry tool features, and how these tools can be leveraged to improve TDM.<|endoftext|>\n\n---\n\n Page: 2 / 8 \n\n---\n\n Introduction Virtualization technologies have been around for a while and were first introduced in the era of mainframe systems. However, it is only in the recent past, due to advances in technology, that virtualization has been gaining popularity. The term \u2018virtualization\u2019 refers to encapsulating a particular hardware or software resource so that multiple systems can make use of it. 
By means of virtualization, hardware resources can be utilized more effectively through consolidation, resulting in significant cost savings. Further, virtualization reduces hardware requirements, which in turn lowers energy consumption and supports green technology initiatives. Given these advantages, virtualization will continue to see increasing adoption in the years to come. There are a number of popular virtualization technologies today, such as server, storage, database and desktop virtualization. Of these, database virtualization is the focus of this paper.<|endoftext|>The field of test data management (TDM) concerns itself with the provisioning of production-like data in non-production environments, such as those used for application development and testing, and is thus a related area. Database virtualization is an emerging trend in TDM, with the potential to offer significant cost advantages. In this paper we will explore how database virtualization can be optimally leveraged to increase TDM effectiveness.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 3 / 8 \n\n---\n\n Traditional approach The common approach taken by most enterprises today is to have physical copies of the production database in each non-production environment. Non-production environments include environments for software development, testing, and training.<|endoftext|>It is not uncommon to find even 10 \u2013 12 such production copies in most enterprises. These physical copies may contain either an entire copy or a subset of the production database. With each additional environment needed, there is an increase in physical storage requirements, database administration needs and the time required to set up these databases. Refreshing these non-production databases is a time-consuming and costly exercise. 
Often, due to contentions, the availability of development and test environments becomes a bottleneck in the software development life cycle and can lead to increased overall turnaround times. In summary, the traditional approach is complex and costly.<|endoftext|>Database virtualization \u2013 a better approach With the coming of age of database virtualization, a gamut of opportunities for optimization of non-production environments has opened up. Several enterprises have started taking cognizance of this trend and are actively exploring how best they can leverage this technology for deriving maximum business value. So what exactly is database virtualization and how does it work? In the case of a database virtualization-based approach, the virtualization engine creates multiple virtual clones of the production database. These virtual copies of the database can be utilized in the same way as the actual physical copies. They will be transparent to the applications in these environments and will continue to function with these virtual database instances in exactly the same manner as they did with the physical non-production databases. There is no need to create subsets of production data and entire datasets can be made available to the relevant applications.<|endoftext|>Presented below is a simplified view of the traditional data refresh process for non- production environments compared with a database virtualization-based approach. 
It should be noted that in some cases there may be an intermediate database between the production and non-production environments, which may be used for masking or sub-setting purposes.<|endoftext|>[Figure: traditional ETL approach vs. database virtualization approach for non-production environments. The traditional approach uses a database ETL / sub-setting utility to create physical copies of the production database in each dev/test environment; the virtualization approach uses a database virtualization engine to create virtual clones (Virtual Clone 1, 2, 3) of the production database.] Key features of database virtualization tools Today, there are a number of vendors offering products in the database virtualization space. Let us take a look at some of the important features and benefits that these products (though not all) provide: 1. Provision database clones: The functionality of rapidly provisioning virtual database clones is at the heart of all database virtualization tools. These virtual clones can typically be created in a matter of hours, as opposed to physical databases, which may take up to several days to set up and load with data. 2. Resource segregation: Each virtual database instance is a private copy and is unaffected by changes made to other virtual clones (\u2018ring fencing\u2019). 3. Lower resource utilization: Database virtualization tools implement highly-optimized compression algorithms to minimize the in-memory footprint of each cloned database instance. Usually, data that is duplicated across clones is maintained as a common block across instances, while modified data is stored privately for each clone, leading to optimization of the memory footprint. 4. Ease of refresh: Unlike refreshing physical databases, refreshing an existing virtual database clone with the latest production data is relatively simple and straightforward. 
The clones being in-memory instances, the refresh process is much faster and efficient as there are no physical writes. Most database virtualization engines will refresh the virtual instance with changes only, rather than purging and refreshing the entire database 5. Point-in-time snapshots: Often, in test environments, there is a need to take snapshots of multiple databases at the same point in time to ensure consistency of the test data. In the traditional approach, extracting data from multiple databases seldom happens simultaneously and there is some time difference between data extraction processes. This leads to referential integrity issues across databases (as several transactions may have taken place during this time difference), which have to be fixed in non-production environments. With database virtualization engines, it is possible to obtain a virtual clone of each source database against a specific point-in-time. This results in maintaining the referential integrity and consistency of data across the provisioned virtual database instances.<|endoftext|>External Document \u00a9 2018 Infosys Limited \n\n---\n\n Page: 4 / 8 \n\n---\n\n 6. Synchronization with source database: The updates made to the virtual database clones can be written back to the intermediate database if required. For example, this might be required if the data in the virtual clone has been augmented for testing purposes and a need arises to replicate this to other test environments. This feature can also be used to help maintain a \u2018gold copy\u2019 of the database.<|endoftext|>7. Self-service: Virtualization tools, once setup, are easy to use and can reduce the dependency on database administrators (DBAs) when compared to the traditional approach where the physical copy of the database needs to be refreshed. Further, the TDM team as well as developers and testers, can be provided the facility to obtain, via self-service, virtual copies of the given source databases.<|endoftext|>8. 
Roll back to a previous state: Most database virtualization tools provide a feature for rolling back the database state to a previous date. This is especially useful for testing teams. Often, test data gets corrupted or changed after a test cycle. By using the roll back feature, the test data can be restored to a previous consistent state to enable another round of testing.<|endoftext|>9. Roll forward: After rolling back, some database virtualization tools provide the facility to roll forward, if required, to a specified date in the database timeline.<|endoftext|>10. Ease of administration: Database virtualization products centralize all administrative tasks, leading to ease of administration. There is no need to account for differences in environment configurations in each non-production environment where the target database is to be deployed. These features have the potential to bring significant cost savings and efficiency improvements from a TDM perspective.<|endoftext|>TDM use cases TDM processes can be broadly divided into data refresh and data provisioning processes. The data refresh process involves refreshing a non-production database with required datasets. The data provisioning process is concerned with mining this data and allocating identified records to end users. \n\n---\n\n Page: 5 / 8 \n\n---\n\n Database virtualization is applicable to the former process, viz. data refresh. Let us look at some relevant cases of how it can be leveraged for TDM: a) Database virtualization in non-production environments As we have seen, in the traditional approach, each non-production database is a physical copy or subset of the production copy. With a database virtualization tool, the non-production environments are virtual database clones. 
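The ring fencing, snapshot, and rollback behaviour described in the feature list can be illustrated with a toy copy-on-write clone. Real virtualization engines work at the storage-block level; this Python sketch (all names invented) only mirrors the idea that unchanged data is shared while each clone's modifications stay private:

```python
class VirtualClone:
    """Toy copy-on-write clone: unchanged records are read from the
    shared source; writes go into a private per-clone overlay."""

    def __init__(self, source):
        self._source = source      # shared with every clone, never modified
        self._overlay = {}         # this clone's private changes
        self._snapshots = []       # saved overlay states (point-in-time)

    def read(self, key):
        return self._overlay.get(key, self._source.get(key))

    def write(self, key, value):
        self._overlay[key] = value  # the source stays untouched

    def snapshot(self):
        self._snapshots.append(dict(self._overlay))

    def rollback(self):
        """Restore this clone to its most recent snapshot."""
        self._overlay = self._snapshots.pop()


production = {"cust_1": "Alice", "cust_2": "Bob"}
clone_a, clone_b = VirtualClone(production), VirtualClone(production)

clone_a.snapshot()                 # point-in-time state
clone_a.write("cust_1", "MASKED")  # private to clone_a ("ring fencing")
print(clone_a.read("cust_1"))      # MASKED
print(clone_b.read("cust_1"))      # Alice - other clones are unaffected
clone_a.rollback()
print(clone_a.read("cust_1"))      # Alice - restored to the snapshot
```

Only the overlay grows with modifications, which is why virtual clones need far less storage than physical copies.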
An added advantage here is that if the data in any environment is corrupted \u2013 let\u2019s say after a round of testing \u2013 it is very easy to revert the data back to its original state. Some database virtualization tools provide support for data masking, which is a key requirement of most TDM implementations. A simplified depiction of this scenario is shown below: [Figure: non-production environment set up by virtual cloning of the production database \u2013 a database virtualization engine, optionally with data masking, creates Virtual Clones 1, 2 and 3 from the production database.] b) Database virtualization with data masking Data masking is often a key TDM requirement, and not all database virtualization tools support it. In this case, an intermediate database can be set up, which will store the masked data and serve as the source database for the database virtualization engine. The intermediate database might be a full or partial copy of the production database. Data masking can be achieved either by using industry-standard TDM tools or by custom scripts.<|endoftext|>[Figure: non-production environment set up with data masking and virtual cloning \u2013 data is extracted/sub-setted from the production database into an intermediate database, masked by a data masking utility, and then virtually cloned by the database virtualization engine.] c) Database virtualization with data masking and synthetic data creation In practice, there may be several data requirements from data consumers that cannot be fulfilled from production databases, as the data is unavailable. This may be the case, for example, in new development programs that require data for testing. In this case, a synthetic data creation tool can be used to generate data, which in turn can be inserted into the intermediate database. 
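A sketch of the masking step that populates such an intermediate database. The column names and masking rule here are invented for illustration; a real implementation would use a TDM tool or scripts against the actual schema:

```python
import hashlib

def mask_email(value):
    # Deterministic but irreversible stand-in for the real address.
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"user_{digest}@example.com"

def mask_rows(rows, rules):
    """Apply per-column masking functions; non-sensitive columns pass through."""
    return [
        {col: rules[col](val) if col in rules else val
         for col, val in row.items()}
        for row in rows
    ]

production_rows = [
    {"id": 1, "email": "alice@corp.example", "country": "IN"},
    {"id": 2, "email": "bob@corp.example", "country": "US"},
]
masked = mask_rows(production_rows, {"email": mask_email})
print(masked[0]["country"])  # IN - non-sensitive column, unchanged
```

Deterministic masking (same input, same mask) preserves joins across tables, which matters when the masked intermediate database feeds many virtual clones.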
As the intermediate database is used as the source for the database virtualization engine, the virtual database clones will reflect the generated data in addition to the production data.<|endoftext|>[Figure: non-production environment set up with synthetic data generation and virtual cloning \u2013 production data is extracted/sub-setted into the intermediate database, masked, and augmented by a data generation tool before the database virtualization engine creates the virtual clones.] Benefits for TDM The key benefits that database virtualization can provide for TDM are: 1. Faster test data setup and refresh: Database virtualization can be leveraged for rapidly provisioning a database clone from a source database; for example, production, disaster recovery, a gold copy etc. Similarly, refreshing a virtual database clone can be executed quite quickly.<|endoftext|>2. Cost savings: Test databases, being virtual clones, do not require physical storage space. Also, database virtualization tools do not maintain full copies of the data; only the changes in data are tracked and logged. This results in significant savings in hardware costs.<|endoftext|>3. Increased reuse: Most database virtualization tools provide point-in-time snapshots. Thus, if there is a need, for example, to repeat a round of testing or a specific test case, it is easy to roll back the database to a previous state. \n\n---\n\n Page: 6 / 8 \n\n---\n\n 4. Increased test environment availability: As virtual database clones have significantly lower resource utilization compared to physical databases, it is feasible to have a larger number of test databases at any given point in time. This can significantly reduce environment contention and increase productivity.<|endoftext|>5. 
Enable TDM for DevOps: DevOps teams can rapidly provision data for dev and test environments using self-service features, thus enabling continuous integration.<|endoftext|>Best practices While considering database virtualization for TDM, the following best practices should be followed: 1. Conduct product fitment: There are several robust database virtualization products available in the market today, with significantly varying feature-sets and technology compatibility. Hence, it is imperative to carry out a comprehensive product fitment analysis before selecting the right product, keeping in mind both current and long-term needs.<|endoftext|>2. Protect sensitive data: Production databases often contain sensitive information that, if exposed, can result in serious data breaches. It is imperative to have a mechanism in place to mask sensitive information so that the virtual database clones provisioned for development and testing are properly desensitized. One strategy could be to leverage a database virtualization product that inherently supports data masking, or to set up a staging database containing desensitized data as the source database for the virtualization engine (instead of using the production database directly as the source). 3. Plan for data reusability: As database states can be easily rolled back and rolled forward, a data reuse strategy should be put in place to effectively utilize this feature for relevant scenarios (for example, regression testing cycles). 4. Enable self-service: Relevant development / testing team members should be provided access to create virtual database clones to decrease the dependency on DBAs. 5. 
Monitor performance metrics: Key database performance metrics, such as SQL query response time, need to be monitored before and after virtualization, so that corrective action may be taken accordingly.<|endoftext|>Database virtualization product examples Listed below are a few database virtualization tools available in the market today, which can create virtual database clones. As this technology is still maturing and evolving, the databases supported by these vendors are likely to increase soon:

# | Tool | Databases supported
1 | Delphix DaaS | Oracle, SQL Server, Sybase, DB2, PostgreSQL, MySQL
2 | Actifio Copy Data Virtualization | Oracle, MS SQL Server
3 | NetApp FlexClone | Oracle
4 | EMC XtremIO | Oracle, MS SQL Server
5 | VMware vFabric Data Director | Oracle, SQL Server, MySQL

\n\n---\n\n Page: 7 / 8 \n\n---\n\n Conclusion As compared to the traditional approach, database virtualization offers significant opportunities in terms of cost optimization and efficiency improvements in non-production environments. The features offered by these tools are specifically useful for databases used for development and testing purposes. The key benefits of leveraging database virtualization for TDM are faster test data setup, cost savings in disk space and increased reusability of test data. Moreover, the ability to provide self-service, coupled with features like ease of refresh, point-in-time snapshots, write-backs to the source database and the ability to roll back any changes, can be a game-changer in the effective management of test data. Enterprises would be well advised to take a serious look at database virtualization tools, irrespective of whether they have a formal TDM implementation in place.<|endoftext|>\n\n---\n\n Page: 8 / 8 \n\n---\n\n \u00a9 2018 Infosys Limited, Bengaluru, India. All Rights Reserved. 
Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. For more information, contact askus@infosys.com Infosys.com | NYSE: INFY Stay Connected \n\n\n***\n\n\n "} {"text": "# Infosys Whitepaper \nTitle: Today, the most popular form of evaluating an individual capabilities and competency level in any organization is done by Appr \nAuthor: Infosys Technologies \nFormat: PDF 1.4 \n\n---\n\n Page: 1 / 12 \n\n---\n\n Presented at the 3rd International Software Testing Conference, India The When & How of Test Automation Vivek Motwani Programmer Analyst Product Competency Center Infosys Technologies Limited, Bangalore Abstract Test automation has always been looked upon as a magic formula to improve the quality processes of products/applications right from the day when first commercial product/application was made. But when one actually starts automating the testing, the ground realities are realised. More often than not, the teething troubles of deciding the right time to go for automation, defining the scope of automation and selection of right tool for automation are over-whelming in the first place. And even if these teething troubles are overcome, the automation tool developed is usually inefficient as lots of important considerations are over-looked. 
This paper aims to suggest solutions to these issues, and best practices to be followed while doing the automation so as to maximise the efficiency of the automation tool developed. Introduction Automation is the only long-term solution for reduced costs in software testing and better quality products. But these aims are achieved only when certain best practices are followed before and while developing the automation suite. Howard Fear has aptly stated, \u201cTake care with test automation. When done well, it can bring many benefits. When not, it can be a very expensive exercise, resulting in frustration\u201d. [3] More often than not, after automating the testing of a product, the automation team finds the automation tool more of a headache because of the unplanned and thoughtless approach adopted while developing the tool. Generally, lots of effort is spent in developing the tool, only to discover that the tool is limited in scope, lacks user-friendliness and requires frequent re-work every now and then. \u00a9 QAI India Ltd, 3rd Annual International Software Testing Conference, 2001. No use for profit permitted. \n\n---\n\n Page: 2 / 12 \n\n---\n\n And if sufficient care is exercised and proper practices are followed before and while automating the same product/application, the resulting automation tool not only saves time and effort, but is also a thing of beauty in itself because of the user-friendliness, flexibility, reusability and extensibility it ensures. Let us, therefore, discuss what needs to be taken care of before going for test automation and also while actually doing the automation. Automation when? (The Desiderata) Lots of effort has to be spent even before you actually start with automation. 
It needs to be ensured that the following things have been taken care of: - Stability of the product/application is ensured: The first thing that needs to be ensured is that the product/application is fairly stable in terms of functionality. Even if it is slated to incorporate new features, the new features should not disturb the existing functionality. There is no sense in automating the testing of a product that is expected to change functionality-wise. Besides, the error messages generated by the product/application should remain consistent across different releases. If the testing is GUI-based, then it needs to be ascertained that future releases of the product will not undergo GUI changes which might prove critical for the automation suite. Interface to be tested has been identified: Three different interfaces a product might have are command line interfaces (CLIs), application-programming interfaces (APIs), and graphical user interfaces (GUIs). Some may have all three, but many will have only one or two. These are the interfaces that are available to you for your testing. By their nature, APIs and command line interfaces are easier to automate than GUIs. Find out if your product has either one; sometimes these are hidden or meant for internal use only. After this, you need to decide which interface\u2019s testing has to be automated. Some relevant points are: \n\n---\n\n Page: 3 / 12 \n\n---\n\n - GUI test automation is more difficult than test automation of the other two interfaces. This is because, firstly, GUI test automation will invariably require some manual script writing. Secondly, there will always be some amount of technical challenge in getting the tool to work with your product. 
Thirdly, GUI test automation involves keeping up with design changes made to a GUI. GUIs are notorious for being modified and redesigned throughout the development process. - Despite the reasons for not depending on GUI test automation as the basis for testing your product functionality, the GUI still needs to be tested, of course, and you may choose to automate these tests. But you should have additional tests you can depend on to test core product functionality that will not break when the GUI is redesigned. These tests will need to work through a different interface: a command line or API. - In order to simplify the testing of an API, you may want to bind it to an interpreter, such as Tcl, Perl or even Python. This enables interactive testing and should also speed up the development cycle for your automated tests. Working with APIs may also allow you to automate unit tests for individual product components. Scope of automation has been defined: Before setting out to automate the testing of your application/product, it is essential to define the scope/intended coverage of the automation tool. The scope may encompass functionality testing, regression testing or simply acceptance testing. You can even select to automate the testing of certain particular features or certain selective testcases of different features. Individual testcases to be automated have been identified: The automation suite should be looked upon as a baseline test suite to be used in conjunction with manual testing, rather than as a replacement for it. It should aim at reducing the manual testing effort gradually, but not doing away with manual testing altogether. It needs to be understood that automation can aid the manual testing effort but cannot replace manual testing. What machines are good at and humans are slow at should be chosen for automation. 
Setting realistic goals in the early stages of test automation is important for achieving long-term success. \n\n---\n\n Page: 4 / 12 \n\n---\n\n So, even after defining the scope of the automation tool in terms of acceptance/regression testing, etc., it needs to be made sure that the following kinds of testcases are excluded from the scope of automation: - Testcases that are long and complicated and require manual inspection/intervention in between. - Testcases that take a tremendous amount of time to automate and whose re-usability is difficult to ensure even when automated. - Testcases pertaining to usability testing. Usability testing means testing in a true end-user environment in order to check whether the system operates properly in accordance with the exact set of processes and steps applied by the end-user, including the user interface and an estimation of system convenience. It\u2019s very important to include the right testcases in the suite. If the selection of testcases for the automation suite is not meticulous, you might end up discovering nothing really important about the software you are testing even if you develop a highly robust and reliable test suite. Testcases have been fine-tuned: The testcases need to be fine-tuned for automation. The expectations from testcases meant for automation are widely different from those for manual testing. The salient features that need to be taken care of include: - Manual regression tests are usually documented so that each test picks up after the preceding test, taking advantage of any objects or records that may already have been created. Manual testers can usually figure out what is going on. A common mistake is to use the same approach with automated tests. 
But because of this approach, a failure in one test will topple successive tests. Moreover, these tests also cannot be run individually. This makes it difficult to use the automated test to help analyze legitimate failures. So, it is advised to revamp the testcases so as to make them independent. Each testcase should set up its own test environment. \n\n---\n\n Page: 5 / 12 \n\n---\n\n - The testcases need to be equipped with proper test-data. E.g. \u2013 if there is a testcase for uploading a file, then it should explicitly state which file to upload. If there is a testcase for creating a folder with invalid characters, then it should state which characters to use for creating the folder. Such fine-tuning of the testcases before starting automation reduces the actual time needed to develop the tool. It also guarantees that the tool actually executes the testcases in a way that checks the desired functionality. The right tool has to be decided: There are hundreds of automation tools available in the market. A careful effort has to go into deciding which tool would be most suitable for automating the testing of your product/application. The following criteria would be useful in making the decision: - Is the automation suite required to work on different platforms? If platform-independence is required, the demands on the automation suite will be very high. E.g. \u2013 if the suite has to support different flavors of Unix, it might be suitable to go for platform-independent tools like Perl. - If the testing to be automated is GUI-based, it might be preferable to use a tool like SilkTest, WinRunner, Rational Robot, etc. But every tool will have its own technical limitations that prevent efficient automation. 
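The earlier advice on test-case independence, where each testcase sets up its own environment and carries its own explicit test data, can be sketched with Python's unittest module (the testcases themselves are invented for illustration):

```python
import tempfile
import unittest
from pathlib import Path

class FolderTests(unittest.TestCase):
    """Each testcase builds its own environment, so tests can run
    individually and one failure cannot topple the next."""

    def setUp(self):
        # A fresh, private workspace for every single test.
        self._tmp = tempfile.TemporaryDirectory()
        self.root = Path(self._tmp.name)

    def tearDown(self):
        self._tmp.cleanup()

    def test_create_folder(self):
        (self.root / "reports").mkdir()
        self.assertTrue((self.root / "reports").is_dir())

    def test_upload_uses_explicit_test_data(self):
        # The test data is named explicitly, as a fine-tuned
        # testcase document should require.
        sample = self.root / "upload_me.txt"
        sample.write_text("payload")
        self.assertEqual(sample.read_text(), "payload")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(FolderTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())
```

Because neither test depends on state left behind by the other, either can be run alone, and a legitimate failure in one is easy to analyze in isolation.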
So, it is necessary to evaluate testing tools against the critical interfaces of the application that need to be automated. - Sometimes, it might be best to develop a scripting tool using a suitable scripting language instead of going for the ready-made tools available in the market. This is especially preferable when the testing is on the server side. The right mode (script recording/script development) has been decided: Most GUI automation tools have a feature called \u2018record and playback\u2019 or \u2018capture replay\u2019. Using this feature, you execute the test manually while the test tool sits in the background and remembers what you do. It then generates a script that you can run to re-execute the test. Script development, on the other hand, implies writing the scripts for running the testcases in the language used by the tool. Script development is akin to programming in a language like C or C++, but the purpose is to execute the testcases in an automated style. \n\n---\n\n Page: 6 / 12 \n\n---\n\n If you are going for GUI test automation, then points worth considering while making the decision are: - The record-and-playback approach to creating test scripts and test suites is easy to develop but difficult to maintain. - Error recovery cannot be incorporated by just recording a test script. - In data-driven tests, the reusability of recorded test scripts will be very limited. Creation of test data and its integration with the test scripts is the time-consuming part. When a function is coded for the same purpose with a data input file, maximum re-usability and ease are ensured. More often than not, it will be required to strike a careful balance between the two modes, instead of using one of them alone. 
Using the recording mode alone will render the automation suite non-reusable, while using the scripting mode alone will require a larger investment of effort and time. Though a middle path is generally suggested, it is worthwhile spending some time to decide the right mode, or the right mix of modes, for the application/product under consideration. Most of the further discussion will be useful only when the right mix is adopted or scripting is followed throughout. All in all, the suggested steps to be followed before starting with automation are depicted in the figure below: \n\nFig. 1: The pre-automation stage cycle \n1. Check the stability of the product/application \n2. Decide the interface to be tested \n3. Define the scope of automation \n4. Identify individual testcases \n5. Fine-tune the testcase documents \n6. Decide the right tool \n7. Decide the right mode/right mix of modes \n\nAutomation how? (The Regimen) \n\nAfter taking care of the above stipulations, the right direction has been identified and the stage is set to go for full-fledged automation. But in order to reach the destination, a lot more needs attention. So, here we go: \n\nFollowing proper test scripting standards: Automated testing involves a mini-development cycle. So, proper coding standards for test scripts should be prepared, and checklists should be developed for the review of test scripts. On the whole, all the software practices followed in the development of an application or a product, where applicable to the development of the automation suite, should be put in place. Whatever tool is chosen, ultimately a tool will be only as good as the process being used to implement it. 
Identifying common steps and converting them into functions: At the outset, the steps common to different testcases should be identified and converted into functions. Such functions can be placed in a common file, from where different testcases can call them with suitable parameters as per the need. This encourages reusability of code and saves effort and time. Besides, these functions can be used again when newer testcases are added to the automation suite at a later stage. \n\nIdentifying other peripheral functions: After the functions stated above have been identified, it is advisable to identify the peripheral functions that will be required by all the testcases in general. E.g. - a separate function for writing into log files can be written, taking the error message, the severity level of the message, and the path and name of the log file as input parameters. Depending on the requirements, more such reusable functions can be identified. Such functions simplify and streamline the process of test script development in the long run. \n\nProviding room for extensibility: The automation suite should be written in such a manner that additional testcases can be added to it. The additional testcases may cater to testing enhanced functionality of an existing feature as well as testing new features incorporated in the application/product. The suite's architecture should be extensible both in terms of being able to add more functions and in terms of being able to add more testcases that call the existing/new functions. \n\nGenerating proper logs: A common problem is what to do when automated tests fail. Failure analysis is often difficult. Was this a false alarm or not? 
Did the test fail because of a flaw in the test suite, a mistake in setting up for the tests, or an actual defect in the product? Hence, the suite should generate logs of its own. But an otherwise good automation suite with ambiguous logging messages is worse than manual testing. A few points that need care from the logging point of view are: \n- An ideal automation suite should explicitly check for common setup mistakes before running tests and generate detailed logs of its own. Logging needs to be as user-friendly as possible. \n- The logging should be done in a manner that facilitates statistical analysis of the results. This implies that the log file should record the results in a format that can be processed by parsing, so that useful statistics can be generated. \n\nIndependence of selective execution: The scripts should be written/arranged in such a manner that they provide the independence of executing individual testcases, or at least the testcases belonging to the same module. This is important when the need is not to execute the entire suite but to verify particular bugs. \n\nSignal handling and clean exit on abrupt termination: It needs to be ensured that the suite does all the clean-up when terminated abruptly, consciously or unconsciously, by the user. The script may need to handle the termination/kill signal for a while so as to get the time for clean-up (and maybe complete the currently executing testcase, if the suite so desires). Such signal handling is extremely important in some particular cases, e.g. when an automation suite is run through the command line on a Unix terminal as a foreground process and the user presses Ctrl-C to stop the suite for whatever reason. 
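A minimal sketch of such a termination handler, in Python for illustration; the configuration dictionaries and function names are assumptions for the example, not from the paper:

```python
import signal
import sys

# State saved before the suite edits it (values invented for the example).
ORIGINAL_CONFIG = {"timeout": 30}
current_config = {"timeout": 5}   # the suite's temporary change

def restore_config():
    """Undo whatever the suite changed, leaving the product as it was found."""
    current_config.clear()
    current_config.update(ORIGINAL_CONFIG)

def handle_termination(signum, frame):
    # Trap Ctrl-C (SIGINT) / kill (SIGTERM): clean up first, then exit.
    restore_config()
    sys.exit(1)

signal.signal(signal.SIGINT, handle_termination)
signal.signal(signal.SIGTERM, handle_termination)
```

With the handlers registered, an abrupt stop still leaves the configuration restored before the process exits.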
The suite might have changed some configuration or properties files before it received the signal; if those changes are not reverted before the suite terminates, the system is left in an inconsistent state. \n\nSelf-sufficiency in individual testcases: Testcases should not depend on preceding testcases for execution. If there is a dependency on testcases occurring earlier in the sequence, the subsequent testcases will fail for no real reason. If such a dependence is unavoidable, then the error message logged when a testcase fails because of the failure of a preceding testcase should be explanatory enough. \n\nEquipped with test data: The automation suite should be equipped with all the test data required by the different testcases. The test data may consist of simple data inputs required by the testcases to supply parameters to the functions, for testing different conditions like numeric input, alphanumeric input, non-alphanumeric input, etc. It may as well consist of specific files to be supplied to the testcases to test particular functionality of the application/product. The automation suite has to be accompanied by such test data, and this test data has to be prepared for the suite with precision. Example: a particular feature of the application/product may have to be tested with files of different sizes, say 0 bytes, 64 KB, 1 MB, 30 MB, etc. So the suite requires files of precisely these sizes. All such files may be kept in a particular folder from where the suite picks them up. The regular input data required by the functions as parameters can be supplied through input data files; the individual testcases may parse the parameters while reading them from the input data files. The tools available in the market support different types of files as data input files. Example: WinRunner uses Excel sheets for reading data, while SilkTest uses .dat files. 
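The precisely sized files from the example above can themselves be produced by a setup step rather than maintained by hand. A sketch in Python for illustration (the file names are invented, and the 30 MB case is omitted to keep the demo small):

```python
import os
import tempfile

# The paper's example sizes; the 30 MB file is left out of the sketch.
SIZES = {"empty.dat": 0, "small.dat": 64 * 1024, "medium.dat": 1024 * 1024}

def create_test_files(directory):
    """Write files of exactly the required sizes into the test-data folder."""
    for name, size in SIZES.items():
        with open(os.path.join(directory, name), "wb") as f:
            f.write(b"\0" * size)

data_dir = tempfile.mkdtemp()
create_test_files(data_dir)
sizes_on_disk = {n: os.path.getsize(os.path.join(data_dir, n)) for n in SIZES}
```

Generating the files this way guarantees the sizes are exact on every machine the suite runs on.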
Dynamic generation of names for temporary files and input data: Sometimes, the automation suite needs to create certain temporary files. If the suite does not delete the temporary files it created, they will get overwritten in the next run of the suite. Besides, if a file by the same name exists even before the first run of the suite, that file may get overwritten in the first run itself. The consequence is even worse if there is no write permission on the already existing file: the script will fail to overwrite it, and the testcase might eventually abort. Similar problems arise when the suite contains a positive testcase like creating a folder with a given name. If the suite does not delete this folder as part of the clean-up process, the testcase fails unnecessarily when the suite is run again and tries to create a folder with the same name. A solution to all such problems is to generate the names for temporary files and all such input data dynamically at run time. This way the names will not conflict with those of existing files and fresh data. Such dynamic generation of names can be accomplished in several ways. One typical way is to take the microsecond part of the current time and append it to a name; the probability of a conflict between any two names then becomes extremely small (on the order of 10^-6). \n\nCleaning up: It has to be ensured that the automation suite brings the application/product back to the original state it was in before the suite executed. If any configuration or properties file was changed for the execution of some testcase, the changes must be reverted. 
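The time-based naming scheme described above can be sketched as follows (Python for illustration; in practice a standard facility such as Python's tempfile module gives stronger uniqueness guarantees than a timestamp suffix):

```python
import time

def unique_name(prefix):
    """Append the microsecond part of the current time to avoid name clashes."""
    microseconds = int(time.time() * 1_000_000) % 1_000_000
    return f"{prefix}_{microseconds:06d}.tmp"

# Each run of the suite gets a fresh name, e.g. "scratch_482913.tmp".
name = unique_name("scratch")
```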
If the suite generates some temporary files, it should delete them towards the end. \n\nIncorporating user-friendliness: The automation suite should be as user-friendly as possible. Some basic points for ensuring user-friendliness are: \n- The user should have the freedom to put the test data files anywhere on the machine. \n- The suite can be run from anywhere on the machine. \n- It can be installed anywhere on the machine. \n- Once it is run, the suite should not require any manual intervention till completion; the user should be able to run the suite unattended. \nFor incorporating such user-friendliness, the suite needs to be designed properly. A separate configuration file can be created that contains all the variables the user might want to change. E.g. - the user might want the log files to be generated on the desktop instead of at a hard-coded path, or might want the suite to pick up the test data/files from a directory of his choice. All such entities can be placed in the configuration file in the form of variables that the user can change easily, and the suite can read these variables from the configuration file every time it is run. With such a design, all the user needs to do before running the script is change the configuration file as per his needs. Thus, the user gets a tremendous amount of flexibility and independence. \n\nDeveloping an efficient error recovery routine: An error recovery routine enables the test suite to run continuously, unattended. The function of this routine is to anticipate errors, decide on corrective action, log the error and proceed with the next test, if possible. E.g. 
- If the application under test terminates unexpectedly, the routine should recognize the interruption and restart the application. This prevents cascading effects and the reporting of wrong defects after a test suite execution. In a nutshell, it ensures that failures in test execution are effectively trapped and interpreted, and that the suite continues to run. Without such an error recovery system, unattended automated test suite runs will never take off; manual presence will become a necessity during test suite execution. \n\nTest scripts for test data setup and cleaning up: If the automation suite does not take care of test data setup, it will have to be done manually by the user, which defeats much of the purpose of test automation. This becomes all the more important when the test data setup requirements are huge and, as a result, the whole exercise becomes highly time-consuming. Hence, an ideal automation suite should incorporate dedicated scripts for test data setup. These scripts are executed before any other functionality test is executed on the product. E.g. - when the application/product in focus is an ERP suite or banking software, the test data setup alone may take 3-4 man-days of effort; with automation in place for this setup, the effort is reduced drastically. Similarly, scripts for cleaning up should also be incorporated in the automation suite. Such scripts aim at bringing the application back to the ground state it was in before the automation suite was run, i.e., they undo all the changes that any testcase in the suite brought about while executing. E.g. - if there is a testcase for creating a folder, the clean-up action will delete this folder. 
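The error recovery and clean-continue behaviour described in this section can be sketched as a driver loop; a hedged illustration in Python, where the testcases and the restart stub are invented for the example:

```python
results = []

def restart_application():
    # Stand-in for relaunching the application under test after a crash.
    results.append(("recovery", "application restarted"))

def run_suite(testcases):
    """Run every testcase; trap failures, log them, recover, and move on."""
    for name, test in testcases:
        try:
            test()
            results.append((name, "pass"))
        except Exception as exc:
            # Error recovery: log the failure, restore a usable state,
            # then continue with the next test instead of stopping the run.
            results.append((name, f"fail: {exc}"))
            restart_application()

def tc_ok():
    pass

def tc_crash():
    raise RuntimeError("application terminated unexpectedly")

run_suite([("TC1", tc_ok), ("TC2", tc_crash), ("TC3", tc_ok)])
```

The key property is that TC2's failure is trapped and logged, yet TC3 still runs, so the suite can execute unattended.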
Testing the test scripts: Test scripts should be tested before they are used in a test suite. Testing of all test scripts should be planned as part of the test automation activity, and adequate tests need to be performed for each test script. Since simulating and rectifying test errors is a difficult and time-consuming process, reporting false errors can cost more and defeat the objective. The goal for the automation team has to be that a test program should never report a false problem. All scripts should satisfy the following criteria: \n- When given a valid input, they produce the correct output. \n- When given an invalid input, they correctly and gracefully reject the input. \n- They do not hang or crash, given either valid or invalid input. \n\nConclusion \n\nTest automation is a great idea, but it is not a magic wand. Proper time and effort have to be spent on the development of the test automation suite, and the key is to follow the right processes. In the eagerness to achieve fast results, the desirable processes are often compromised, and that is the reason why, more often than not, test automation only promises and raises hopes but fails to deliver. \n\nAcknowledgement \n\nI wish to express my sincere gratitude towards Sridhar Kumanduri and KiranKumar Marri for sharing their experience in test automation with me and giving me extremely valuable feedback. 
\n\n\n***\n\n\n "} @@ -177,7 +184,6 @@ {"text": "Infosys Press Release (PR) \nTitle: Infosys Recognized as Global Top Employer for the Second Consecutive Year; Ranked #1 in India Again \nAuthor: ['Infosys Limited'] Infosys (NSE, BSE, NYSE: INFY), a global leader in next-generation digital services and consulting, has been recognized by Top Employers Institute among the Global Top Employers for the second consecutive year. Infosys was ranked #1 Top Employer in India, in recognition of its best-in-class people practices and consistency in delivering employee experience globally. Infosys is one of 11 companies worldwide to receive this recognition.<|endoftext|> Infosys has been named Top Employer across the following regions and top-ranked in 16 of the 22 countries: Asia Pacific \u2013 India, Australia, New Zealand, Singapore, Japan, and China North America \u2013 USA, Canada and Mexico Middle East \u2013 UAE, Oman and Bahrain Europe \u2013 U.K, Ireland, France, Belgium, Netherlands, Germany, Switzerland, Sweden, Romania, and Poland Krish Shankar, Executive Vice President and Group Head of Human Resource Development, Infosys, said, \u201cWe are delighted to be awarded Global Top Employer again this year. This comes at a time when we have strengthened our approach to employee engagement, making it more purposeful and morale-boosting. We continue to significantly invest in digital learning for our workforce creating new avenues for their growth. Infosys\u2019 internal talent marketplace also helps them move continuously upward in the value chain, delivering on our promise of \u2018careers that never stand still\u2019. 
This recognition by the Top Employers Institute is a testimony to our concerted efforts to make it possible for every Infosys employee to navigate further, sustained by our culture and values.\u201d The Top Employers Institute program certifies organizations based on their HR Best Practices across 6 HR domains consisting of 20 areas such as People Strategy, Work Environment, Talent Acquisition, Learning, Well-being, Diversity & Inclusion, and more. For the evaluation, Top Employers Institute conducted a detailed assessment of Infosys\u2019 people practices through the HR Best Practices assessment in 22 countries. The Top Employer Certification highlighted Infosys\u2019 focus on supporting their employee\u2019s well-being and experience, especially during the pandemic. It also reflects the Company\u2019s Environment, Social and Governance (ESG) vision and commitment to its workforce.<|endoftext|> David Plink, CEO, Top Employers Institute said, \u201cReflecting on the demanding year that has, like the year before it, impacted organizations across the world, our Global Top Employers have continued to prioritize going above and beyond the norm to maintain their excellent people practices in the workplace. As a global Top Employer, Infosys has proven its unwavering commitment to employees on a global scale, joining a niche group of companies that have achieved a certification through the Top Employers Program. We are excited to celebrate and applaud them for their achievement in 2022.\u201d About Top Employers Institute Top Employers Institute is the global authority on recognizing excellence in People Practices. We help accelerate these practices to enrich the world of work. Through the Top Employers Institute Certification Program, participating companies can be validated, certified, and recognized as an employer of choice. Established over 30 years ago, Top Employers Institute has certified over 1857 organizations in 123 countries/regions. 
These certified Top Employers positively impact the lives of over 8 million employees globally. Top Employers Institute. For a better world of work.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NSE, BSE, NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. 
The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2021. These filings are available at www.sec.gov. 
Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For more information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Positioned as a Leader in IDC MarketScape: Worldwide B2B Commerce Services for Industrial Manufacturing 2021 Vendor Assessment \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in IDC MarketScape: Worldwide B2B Commerce Services for Industrial Manufacturing 2021 Vendor Assessment Infosys (NSE, BSE, NYSE: INFY) has been positioned as a Leader in IDC MarketScape: Worldwide B2B Commerce Services for Industrial Manufacturing 2021 Vendor Assessment (Doc #EUR148242121, October 2021). The study highlights Infosys\u2019 industry know-how, proven B2B digital commerce expertise, and ability to streamline B2B commerce operations for its clients through the Infosys Digital Store services solution.<|endoftext|> For the study, the IDC MarketScape assessed and analyzed eight IT service providers based on their scope of engagement, deep digital commerce expertise, and ability to deliver projects related to B2B business models. Infosys was recognized for its extensive capabilities in helping industrial manufacturing clients in their B2B transformation initiatives. According to the report, \u201cInfosys has invested significantly over the past several years to bolster its capabilities in helping industrial manufacturing clients in their B2B transformation initiatives. 
Investments related to its B2B commerce capabilities include acquisitions in digital customer experience (e.g., Blue Acorn iCi) and digital design studios (e.g., WONGDOODY) to deliver human-centric experiences as well as in consulting and implementation capabilities related to Salesforce (e.g., Fluido, Simplus).\u201d Further, the IDC MarketScape\u2019s evaluation highlights Infosys\u2019 strengths, including the following: Infosys has a comprehensive set of prebuilt solutions based on strategic partners as well as its own IP dedicated to the needs of the B2B industrial manufacturing domain A significant share of Infosys' project services business comes from projects with clients in industrial manufacturing, which accordingly plays into domain expertise for this industry Infosys has strong industry domain know-how in industrial manufacturing and proven project experience related to B2B digital commerce Stefanie Naujoks, Research Director, Manufacturing Insights Europe, at IDC, \u201cInfosys has strong industry domain know-how in industrial manufacturing and proven project experience related to B2B digital commerce. Client reference interviews in particular revealed that clients appreciate Infosys' flexibility and ability to scale and excellent feedback with regards to client relationship and account management.\u201d Jasmeet Singh, Executive Vice President and Global Head of Manufacturing, Infosys, said, \u201cThe recognition by the IDC MarketScape exemplifies our continued investments in transforming B2B and B2B2C commerce operations for our industrial manufacturing clients. We will leverage our digital commerce and digital experience expertise along with partner solutions to effectively address the commerce requirements of industrial manufacturers. 
Our capability to bring together consulting, technology and business process management provides our industrial manufacturing clients with an accountable and dependable commerce transformation partner.\u201d About IDC MarketScape IDC MarketScape vendor assessment model is designed to provide an overview of the competitive fitness of ICT (information and communications technology) suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor\u2019s position within a given market. IDC MarketScape provides a clear framework in which the product and service offerings, capabilities and strategies, and current and future market success factors of IT and telecommunications vendors can be meaningfully compared. The framework also provides technology buyers with a 360-degree assessment of the strengths and weaknesses of current and prospective vendors.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NSE, BSE, NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2021. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For more information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Financial information for the Third Quarter ended December 31, 2021 \nAuthor: ['Infosys Limited'] Results for the Third Quarter ended December 31, 2021 Infosys announces results for the third quarter ended December 31, 2021 on Wednesday, January 12, 2022.<|endoftext|> Schedule of events Press release on schedule of events Highlights (IND AS) Standalone and consolidated results and Regulation 33 auditors reports Press releases IFRS USD | IFRS INR Fact Sheet Download Financial Statements IFRS Financial Information Ind AS Standalone | Consolidated Additional information Download Management's comments on the results January 12, 2022 5:00 p.m. (IST) Archived Webcast of Press Conference | Transcript Webcast of conference call January 12, 2022 6:30 p.m. 
(IST) Archived Earnings conference call - Audio | Transcript \n\n\n***\n\n\n "} -{"text": "Infosys Press Release (PR) \nTitle: \nAuthor: [] \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Positioned as a Leader in Everest Group Microsoft Dynamics 365 Services PEAK Matrix\u00ae Assessment 2021 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in Everest Group Microsoft Dynamics 365 Services PEAK Matrix\u00ae Assessment 2021 Infosys (NSE, BSE, NYSE: INFY), a global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group Microsoft Dynamics 365 Services PEAK Matrix\u00ae Assessment 2021. Infosys was recognized for scoring high in assessment of market impact, vision and capability. The report highlighted Infosys for its ability to successfully execute large-scale, multi-continent, end-to-end Dynamics 365 services leveraging Infosys Cobalt suite of services, solutions and platforms, underpinned by its strong global delivery network.<|endoftext|> For the report, Everest Group evaluated 18 leading service providers with Microsoft Dynamics 365 services in the scope of work and basis their market adoption, portfolio mix, value delivered, vision and strategy, innovations and investments, and delivery footprint. 
The assessment is based on Everest Group's annual RFI process for the calendar year 2020, interactions with leading service providers, client reference checks, and ongoing analysis of the Microsoft Dynamics 365 services market.<|endoftext|> The assessment highlights Infosys\u2019 strengths in the following areas: a talent pool with extensive experience in Microsoft Dynamics 365 and strong technical expertise, along with organizational change management capabilities; deep domain expertise, specifically in industries such as banking, electronics and technology, and energy and utilities, to aid clients in their journey; client appreciation of Infosys\u2019 overall talent management strategy and account management capabilities, with clients acknowledging that they would leverage Infosys for future requirements; and advisory services, guiding clients in creating transformation roadmaps and shaping the business case, along with implementation capabilities and expertise across the Microsoft portfolio. Satish HC, Executive Vice President, Co-Head of Delivery, Infosys, said, \u201cOur excellent performance in Everest Group\u2019s PEAK Matrix\u00ae showcases our deep domain knowledge and differentiated offerings backed by our Cobalt suite of services and significant investments in the space. As an accredited Microsoft partner, Infosys will continue to offer large-scale, multi-continent, end-to-end Dynamics 365 services and time-to-market advantages while transforming user experience in a simplified and intuitive manner. I strongly believe that our endeavor in accelerating customers\u2019 transformation journeys, underscored by a wide scope of data and analytics services and a global delivery footprint, has contributed towards this position. 
With Infosys Cobalt in our arsenal, we will continue to maximize the business value for our clients through extensive IT and cloud-native service capabilities.\u201d \u201cEnterprises\u2019 strong focus on digital transformation has led to the rapid adoption of cloud-based enterprise platforms. The adoption of Dynamics 365 is driven by quicker time-to-market, better affordability, and close integration with the Microsoft stack,\u201d said Yugal Joshi, Partner, Everest Group. \"Infosys has invested in building verticalized solutions and a framework for Dynamics 365, such as Smart Retail, Modern CX for banking, and Housing management to deliver industry-specific solutions. It has established a good ecosystem with partners such as SK Global and To-Increase to enhance its delivery capabilities. Clients recognize Infosys\u2019 experience in Dynamics 365, talent management strategy, and organization change management capabilities. Taken together, these capabilities have helped position Infosys as a Leader on Everest Group\u2019s Microsoft Dynamics 365 Services PEAK Matrix\u00ae Assessment 2021.\u201d Complimentary custom copies of Microsoft Dynamics 365 Services PEAK Matrix\u00ae Assessment 2021 reports can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NSE, BSE, NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2021. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For more information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Positioned as a Leader in the IDC MarketScape: Worldwide Managed Multicloud Services 2021 Vendor Assessment \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the IDC MarketScape: Worldwide Managed Multicloud Services 2021 Vendor Assessment Infosys (NSE, BSE, NYSE: INFY), a global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in the IDC MarketScape: Worldwide Managed Multicloud Services 2021 Vendor Assessment (Doc #US45977020, October 2021). 
The report highlights Infosys\u2019 comprehensive approach to delivering managed cloud services, as part of Infosys Cobalt, its ability to efficiently operate cloud workloads, and offer seamless cloud-powered enterprise transformation for clients.<|endoftext|> For the report, IDC MarketScape evaluated 14 managed multicloud service providers based on their service coverage and lifecycles, portfolio, revenue, and partner network. As a leader, Infosys was recognized for its technology and domain expertise, broad repository of assets, tools, solutions, platforms and accelerators that are integral to Infosys Cobalt. The report also highlighted Cobalt Community, which includes Infosys employees, customers, cloud partners, and start-ups, an expanding ecosystem of technology and innovation hubs that give Infosys the ability to deliver faster time to market for its global clients. IDC MarketScape further recognized the role of Infosys\u2019 Cloud Managed Services in enhancing resilience and agility of its clients\u2019 IT ecosystem on the cloud.<|endoftext|> The assessment highlighted Infosys\u2019 strengths in the following areas: Infosys showcased strength in migration and modernization of applications to the cloud using managed multicloud services.<|endoftext|> Infosys also exceeded market averages in using managed multicloud services for blockchain on public clouds (spanning IaaS, PaaS, and/or SaaS) as well as for its total number of centers of excellence (COEs)/labs (physical locations) to support these services.<|endoftext|> From a business perspective, Infosys exceeded market standards in the share of worldwide managed multicloud services business generated from applications (e.g., ERP, SCM, CRM) and applications development/deployment using multicloud environments along with the share of managed multicloud services business generated from its own sales resources.<|endoftext|> Finally, Infosys exceeded the industry standard for its client retention rate and was 
highly rated by customers for cost savings' effectiveness.<|endoftext|> \"With the breadth of its managed multicloud services resources, Infosys is positioned to help clients with their cloud strategies as well as ensure that enterprises can meet the demands of a hyperdynamic market and increasing need to ensure business resilience,\u201d said David Tapper, Program Vice President, Outsourcing and Managed Cloud Services, IDC. \"In supporting enterprises with their multicloud requirements, clients indicate that Infosys can meet SLAs, deliver cost-effective solutions, enable access to full array of public cloud providers, and provide business and technology expertise.\" Narsimha Rao Mannepalli, Executive Vice-President, Head of Cloud & Infrastructure Solutions, Infosys, said, \"We are delighted to be recognized as a Leader in the IDC MarketScape 2021 for Worldwide Managed Multicloud Services. This is an acknowledgement of our strong capability in this space, and our focus on a customer-centric approach. Leveraging our Cobalt portfolio, deep contextual knowledge, rich partner ecosystem and industry expertise across verticals, we will continue to support our clients grow their business with speed, scale, and agility.\u201d To read this report, please visit: https://www.infosys.com/services/cloud-cobalt/analyst-reports/leader-worldwide-managed-multicloud-services2021.html IDC MarketScape vendor assessment model is designed to provide an overview of the competitive fitness of ICT (information and communications technology) suppliers in a given market. The research methodology utilizes a rigorous scoring methodology based on both qualitative and quantitative criteria that results in a single graphical illustration of each vendor\u2019s position within a given market. 
IDC MarketScape provides a clear framework in which the product and service offerings, capabilities and strategies, and current and future market success factors of IT and telecommunications vendors can be meaningfully compared. The framework also provides technology buyers with a 360-degree assessment of the strengths and weaknesses of current and prospective vendors.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NSE, BSE, NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. 
The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2021. These filings are available at www.sec.gov. 
Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For more information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Equinox Collaborates with Packable to Help Amplify its Direct to Consumer e-Commerce Offerings for its Brand Partners \nAuthor: ['Infosys Limited'] Infosys (NYSE: INFY), a global leader in next-generation digital services and consulting, today announced its collaboration with Packable, a leading e-commerce company with a proprietary tech-enabled offering, sitting at the intersection of brands, marketplaces and customers. Packable recently announced a merger with Highland Transcend Partners, setting it on the path to becoming a public company. Through the collaboration with Packable, Infosys will integrate its flagship human-centric digital commerce platform, Infosys Equinox, with Packable IQ (Packable\u2019s proprietary e-commerce platform). The strategic collaboration will strengthen Packable\u2019s ability to offer its brand partners an engaging, innovative, and agile Direct to Consumer (D2C) platform: \u201cD2C-in-a-box.\u201d The proliferation of e-commerce and digital channels means it is increasingly critical for brands to develop and execute innovative D2C strategies to help them win e-commerce shoppers through unique, personalized and innovative customer engagement. Packable \u2013 equipped with Infosys Equinox\u2019s microservices-based, API-first, and cloud-native design \u2013 will be better placed to add greater value to its brand partners via the new D2C-in-a-box offering. 
Infosys Equinox combined with Packable IQ\u2019s intelligent pricing, consumer transaction data, smart inventory management and extensive fulfillment capabilities will create a highly competitive D2C platform to run and manage a brand\u2019s e-commerce website and operations. It will also enable brands to create curated D2C journeys ready to be launched in a matter of weeks.<|endoftext|> The Infosys Equinox-powered cloud-native D2C platform will bring together the best of Packable and Infosys for brands: Packable IQ along with Infosys Equinox\u2019s end-to-end commerce-as-a-service for enterprises, to drive results throughout the purchase lifecycle. The solution will create a repository of complementary and collectively exhaustive services that can be easily integrated with existing core systems or new platforms to deliver headless commerce capabilities coupled with real time analytics and insights on-demand. It will also provide brands with digital advances including conversational commerce, augmented reality, voice and social commerce, enabling them to engage with consumers through rich and hyper-personalized experiences.<|endoftext|> Karmesh Vaswani, Executive Vice President & Global Head Consumer, Retail & Logistics, Infosys, said, \u201cOur clients are seeking predictive and customized playbooks to win in e-commerce marketplaces. Powerful stories made to stick digitally, smart analytics and algorithms, rich personalized experiences and staying a step ahead of the consumer are key. They also need platform capability to overcome the inertia of mainstream enterprise technology stacks. We are delighted that the Packable-Infosys Equinox strategic collaboration will enable brands with autonomous capabilities to place the right offerings before the right consumer, at the right moment, and at the right price.\u201d Andrew Vagenas, Chief Executive Officer, Packable, said, \u201cWe\u2019re thrilled to partner with Infosys. 
This exciting partnership marks another milestone in the execution of Packable\u2019s strategy of augmenting our D2C platform ecosystem to accelerate brand partners\u2019 revenues and profitability across e-commerce channels. As we continue our journey to becoming a public company, we\u2019re diligently looking for partnerships to help bring the highest-quality services to customers, and this agreement with Infosys Equinox allows us to do just that.\u201d Ash Mehra, Chief Information Officer, Packable, said, \u201cAt Packable, we are actively deepening our relationships with brand partners of all stripes, from household-name consumer products companies to Digitally Native Brands. This partnership with Infosys Equinox will enable us to provide even more value-additive services to our brand partners, continuing to set them up for success in the age of e-commerce.\u201d About Packable Packable is a leading e-commerce company with a proprietary technology platform that empowers brands throughout the transaction lifecycle by providing them with tech-enabled inventory planning and data analytics, marketing, marketplace management, logistics and distribution, customer experience and support. Founded in 2010, Packable has approximately 1,000 employees, including a premier team of e-commerce experts, connecting consumers to their favorite brands on online marketplaces such as Amazon, Walmart, Google, eBay, Target, Kroger and Facebook, becoming one of the largest marketplace sellers in North America. By combining the end-to-end commerce lifecycle in one platform, Packable acts as a comprehensive service provider and empowers its brand partners to avoid disparate and inefficient points of sale. 
Additionally, since Packable helps facilitate the vast E-commerce lifecycle, it gains access to rich customer transaction data, providing it with differentiated data insights that it uses to optimize its platform and benefit its brand partners.<|endoftext|> To learn more about Packable, which announced on September 9, 2021 that it plans to become a public company through a merger with Highland Transcend Partners (NYSE: HTPA), a special purpose acquisition company (SPAC), please visit: packable.com. Upon completion of the transaction, Packable expects to be listed on the NASDAQ.<|endoftext|> About Infosys Equinox Infosys Equinox is the flagship human-centric digital commerce platform of Infosys. The platform helps brands provide omnichannel and memorable shopping experiences to their customers. With a future-ready architecture and integrated commerce ecosystem, Infosys Equinox provides an end-to-end commerce platform covering all facets of an enterprise\u2019s e-commerce needs.<|endoftext|> Visit https://www.infosysequinox.com/ to see how Infosys Equinox can help your enterprise deliver hyper-segmented, personalized omnichannel commerce experiences for B2B and B2C buyers.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit https://www.infosys.com/about.htm to see how Infosys (NSE, BSE, NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal 
or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2021. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media Contacts: Packable: Packable-SVC@sardverb.com Infosys: PR_Global@Infosys.com \n\n\n***\n\n\n "} @@ -230,7 +236,6 @@ {"text": "Infosys Press Release (PR) \nTitle: Infosys to Onboard Award-Winning Experience Design Agency, Carter Digital \nAuthor: ['Infosys Limited'] Infosys (NYSE: INFY), a global leader in next-generation digital services and consulting, today announced a definitive agreement to purchase assets and onboard employees of Carter Digital, one of Australia\u2019s leading and award-winning experience design agencies. 
This asset takeover strengthens Infosys\u2019 global design and experience offerings, demonstrates its continued commitment to bringing innovative thinking, talent and creativity to its clients, and provides effective global digital solutions.<|endoftext|> Carter brings to Infosys expertise in human-centered design, experiential and enhanced digital transformation, and customer interaction, and will also cement WONGDOODY, an Infosys brand, into the Australasian market. The agency is known for its holistic approach and \u2018people first, design later\u2019 mantra, delivering services to connect digital to physical experiences in the consumer, commerce, technical and corporate environments, backed with data analysis, analytics, and creative expertise, to drive compelling, purposeful outcomes.<|endoftext|> With services that include business and creative strategy, research and insights, branded commerce and digital product development, user and customer experiences, interaction, experiential and creative design, and consumer and product design, Carter delivers enriched, purpose-led experiences for brands across arts, culture, education, tourism, events, start-ups and healthcare.<|endoftext|> Together with Infosys\u2019 earlier acquisition of WONGDOODY, which offers creative and marketing services, Carter brings complementary capabilities to help global CMOs and businesses thrive in a digital commerce world. 
As part of Infosys\u2019 global design and experience offering, Carter Digital will be rebranded as WONGDOODY and join its network of studios across Seattle, Los Angeles, New York, Providence, Houston, and London, as well as design hubs in five Indian cities.<|endoftext|> Andrew Groth, Senior Vice President and Region Head for Australia and New Zealand, said, \u201cAustralia is a strategic market for Infosys, and the company has enjoyed strong and consistent growth serving marquee clients across a range of industries, from telecom and financial services to utilities and the public sector. As digital experience becomes a critical differentiator in most enterprise transformations, the addition of Carter\u2019s capabilities reaffirms our commitment to help clients navigate their digital priorities with a complete end-to-end offering.\u201d Ben Weiner, CEO, WONGDOODY, an Infosys company, added, \u201cIn Carter, we have found kindred spirits who align with the cultures of both WONGDOODY and Infosys. We are very excited to bring their capabilities to Infosys\u2019 clients in a market where the opportunity to add layers of digital strategy, customer experience, and design is significant and compelling. We are excited to welcome Carter Digital to the Infosys family.\u201d \u201cCarrying the WONGDOODY flag into our region provides us the ability to turbocharge our delivery. This, along with the backing of Infosys, means we now have the instant depth and scalability to meet the growing needs and expectations of our current and future clients,\u201d said Paul Beardsell, Founder & Managing Director, Carter Digital.<|endoftext|> James Noble, Founder & Chief Creative Officer of Carter Digital, added, \u201cWe\u2019re excited to be joining WONGDOODY, an Infosys company. Being a like-minded, internationally recognised human experience and brand engagement agency creates enormous opportunities for us in the Australasian market. 
This enables us to further our industry-leading work, connecting us to new capabilities, and enhancing our partners' success.\" This is an asset purchase and the transaction is expected to close during the fourth quarter of fiscal 2021, subject to customary closing conditions.<|endoftext|> About Carter Digital Putting people ahead of everything else, Carter delivers human-centric, data-driven outcomes to transform the way customers interact with businesses in a rapidly changing digital world. Delivering experiences that surprise and delight the people using them, Carter enables clients to exceed audience needs, grow market share and deepen engagement. Carter has achieved sustained success in a world where technology and expectations constantly adapt to customer behaviours.<|endoftext|> Winning numerous industry accolades and awards on behalf of its clients since 2010 is an acknowledgement of the consistent results Carter delivers.<|endoftext|> About WONGDOODY, an Infosys company WONGDOODY is an award-winning creative agency and the global experience-and-design platform for Infosys. The company is recognized for branding, retail, and consumer insights. With offices in Seattle, New York, Los Angeles, Providence, and across the globe; WONGDOODY clients have included Amazon, Honda, and a wide range of Fortune 500 companies.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Quarterly Results and Filings of Q3 2020 \nAuthor: ['Infosys Limited'] Results for the Third Quarter ended December 31, 2020 Infosys will announce results for the third quarter ended December 31, 2020 on Wednesday, January 13, 2021.<|endoftext|> Schedule of events Press release on schedule of events Highlights (IND AS) Standalone and consolidated results and Regulation 33 auditors' reports Press releases IFRS USD | IFRS INR Fact Sheet Download Financial Statements IFRS Financial Information Ind AS Standalone | Consolidated Additional information Download Management's comments on the results January 13, 2021 4:30 p.m. (IST) Archived Webcast of Press Conference | Transcript Webcast of conference call January 13, 2021 6:30 p.m. 
(IST) Archived Earnings conference call - Audio | Transcript \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Implements Global Warranty Solution to Simplify Factory Warranty Processes for Johnson Controls \nAuthor: ['Infosys Limited'] Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has successfully implemented a global warranty solution on SAP S/4 HANA across all Johnson Controls ducted products. This enables Johnson Controls, the global leader in smart and sustainable buildings, to simplify factory warranty processes, enhance visibility into assets, and become more responsive to customers.<|endoftext|> Johnson Controls selected Infosys as a technology services partner for its in-depth knowledge of business priorities and its ability, built over two decades, to develop best-fit solutions for Johnson Controls. Infosys replaced the legacy system with a global warranty platform solution, leveraging the latest UI/UX technologies such as SAP Fiori combined with SAP S/4 HANA. These technologies, along with the application of the agile methodology for execution, delivered an end-to-end, integrated, and centralized warranty process for Johnson Controls. The solution also offers a digital-first experience for both B2B and B2C customers while supporting equipment integration and warranty claims on finished products.<|endoftext|> Krzysztof Soltan, Vice President Information Technology - Building Solutions North America & Global Retail at Johnson Controls, said, \u201cWarranty management of assets in smart and healthy buildings is playing an increasingly important role at a process as well as operational level. 
In order to address the next generation digital needs, we partnered with Infosys to create a solution for our customers with most relevant information through Johnson Controls OpenBlue digital platform.\u201d Jasmeet Singh, Executive Vice President and Global Head of Manufacturing, Infosys, said, \u201cThe implementation of the Global Warranty Solution on SAP S/4 is a testament to our 20-year partnership with Johnson Controls. The global warranty solution integrates seamlessly with legacy systems to harmonize business processes. Through this transformation journey, we will also help Johnson Controls develop digital strategies for warranty processes. With a comprehensive warranty solution, Johnson Controls will be able to provide digital consumers with a superior after-sales experience.\u201d About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Press Release (PR) \nTitle: \nAuthor: [] \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Daimler and Infosys Announce Strategic Partnership to Drive Hybrid Cloud-powered Innovation & IT Infrastructure Transformation in the Automotive Sector \nAuthor: ['Infosys Limited'] Daimler AG and Infosys (NYSE: INFY), today announced a long-term strategic partnership for a technology-driven IT infrastructure transformation. After the receipt of all regulatory approvals, Daimler AG will transform its IT operating model and infrastructure landscape across workplace services, service desk, data center, networks and SAP Basis together with Infosys. The partnership will enable the company to deepen its focus on software engineering and to establish a fully scalable on-demand digital IT infrastructure and anytime-anywhere workplace. 
The collaboration will empower Daimler to strengthen its IT capabilities, and Infosys, its automotive expertise.<|endoftext|> As software becomes modular, digital infrastructure continues to play an important role in defragmentation. Daimler will work towards a model that ensures a robust IT infrastructure across its plants and regions, and supports consolidation of its data centers, scaling its IT operations, and bringing innovations to the fore. Some of the key deliverables from this partnership include: A smart hybrid cloud, leveraging Infosys Cobalt and leading cloud providers, accelerating the multi-cloud journey with a focus on open source adoption A carbon-neutral solution, by consolidating and rationalizing data centers across all regions A standardized technology stack, by bringing in an ecosystem of best-of-breed partners Creation of a state-of-the-art Zero Trust network with seamless technology upgrades A persona-driven, cognitive, AI-powered anytime-anywhere workplace solution that empowers end-users As a part of this partnership, automotive IT infrastructure experts based out of Germany, wider Europe, the U.S. and the APAC region will transition from Daimler AG to Infosys. Infosys is well placed to realize this transition, having integrated more than 16,000 employees through other partnerships in recent years with high acceptance, retention and satisfaction rates. The transfer will also enable Infosys to bolster and grow its automotive business, while offering Daimler employees strong prospects for long-term career growth and development.<|endoftext|> \u201cWe are excited about this partnership and the opportunity to support Daimler AG\u2019s automotive vision. As we embark on this journey, we will bring together capabilities, ecosystems and a hybrid cloud infrastructure that will shape new experiences for Daimler AG and the industry at large. 
Infosys has deep expertise in helping our clients across the globe navigate their digital journeys, and as part of this strategic partnership, we look forward to setting a new standard for the automotive industry,\u201d said Salil Parekh, Chief Executive Officer, Infosys.<|endoftext|> Talking about the partnership, Jan Brecht, Chief Information Officer, Daimler and Mercedes-Benz, said, \u201cSoftware becomes modular and IT infrastructure becomes big. Daimler will take three steps at once to transform its IT infrastructure: consolidation, scaling and modernization. We need to think infrastructure beyond the size of our company. With Infosys we found a partner to scale, to innovate and to speed up. Moreover, this is a strategic partnership for Daimler\u2019s IT capabilities and Infosys\u2019 automotive expertise. Infosys wants to grow with us in the automotive industry, which gives career opportunities for our employees. With this partnership, Daimler also strengthens its overall technology investment and partnership strategy.\u201d Daimler at a glance Daimler AG is one of the world's most successful automotive companies. With its Mercedes-Benz Cars & Vans, Daimler Trucks & Buses and Daimler Mobility divisions, the Group is one of the leading global suppliers of premium cars and one of the world's largest manufacturers of commercial vehicles. Daimler Mobility offers financing, leasing, fleet management, investments, credit card and insurance brokerage as well as innovative mobility services. The company founders, Gottlieb Daimler and Carl Benz, made history by inventing the automobile in 1886. As a pioneer of automotive engineering, Daimler sees shaping the future of mobility in a safe and sustainable way as both a motivation and obligation. The company's focus therefore remains on innovative and green technologies as well as on safe and superior vehicles that both captivate and inspire. 
Daimler continues to invest systematically in the development of efficient powertrains \u2013 from high-tech combustion engines and hybrid vehicles to all-electric powertrains with battery or fuel cell \u2013 with the goal of making locally emission-free driving possible in the long term. The company's efforts are also focused on the intelligent connectivity of its vehicles, autonomous driving and new mobility concepts. Daimler regards it as its aspiration and obligation to live up to its responsibility to society and the environment. Daimler sells its vehicles and services in nearly every country of the world and has production facilities in Europe, North and South America, Asia and Africa. In addition to Mercedes-Benz, the world's most valuable luxury automotive brand (source: Interbrand study, 20 Oct. 2020), and Mercedes-AMG, Mercedes-Maybach and Mercedes me, its brand portfolio includes smart, EQ, Freightliner, Western Star, BharatBenz, FUSO, Setra and Thomas Built Buses as well as the brands of Daimler Mobility: Mercedes-Benz Bank, Mercedes-Benz Financial Services and Daimler Truck Financial. The company is listed on the Frankfurt and Stuttgart stock exchanges (ticker symbol DAI). In 2019, the Group had a workforce of around 298,700 and sold 3.3 million vehicles. Group revenues amounted to \u20ac172.7 billion and Group EBIT to \u20ac4.3 billion.<|endoftext|> Further information on Daimler is available at www.media.daimler.com and www.daimler.com About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. 
We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service 
contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: RBL Bank Embraces Finacle Digital Banking Solution Suite on Cloud with Containerized Platform \nAuthor: ['Infosys Limited'] RBL Bank Embraces Finacle Digital Banking Solution Suite on Cloud with Containerized Platform Infosys Finacle, part of EdgeVerve Systems, a wholly owned subsidiary of Infosys, and RBL Bank, one of India\u2019s leading private sector banks, announced that the bank will migrate from an on-premise deployment to a modern Cloud Native Computing Foundation (CNCF) Certified, Kubernetes managed, containerized ecosystem.<|endoftext|> The shift will enable RBL Bank to power its business with a state-of-the-art private cloud architecture, enabling it to cost-effectively scale at will, while delivering new 
digital banking capabilities at speed. The bank will gain from the automated, self-service capabilities of the cloud-native architecture to realize the greater responsiveness, agility and reliability required to succeed in today\u2019s digital reality.<|endoftext|> Highlights: Finacle\u2019s extensive open API (application programming interface) repository will provide the agility required to seamlessly integrate and co-innovate with ecosystem partners, one of the key focus areas for the bank. The bank will also co-innovate with Infosys Finacle in enriching the suite.<|endoftext|> The bank has also upgraded its API foundation with Finacle\u2019s Digital Accelerator APIs. The solution\u2019s microservices architecture enables the bank to easily scale up to manage surges in services, on demand, across traditional, modern and emerging channels. The bank is now processing, on average, six times more mixed-channel transactions per day than last year.<|endoftext|> Finacle Digital Banking Solution Suite will help enhance digitization and automation across the enterprise, leading to significantly improved customer experience, greater STP (straight-through processing), and lower operational costs.<|endoftext|> Venkatramana Gosavi, Senior Vice President & Global Head of Sales, Infosys Finacle, said, \u201cToday, more than ever, customers demand round-the-clock, personalized banking services \u2013 on banks\u2019 digital channels as well as third-party applications. RBL Bank has always been committed to staying contemporary when it comes to digital banking paradigms to successfully meet customer demands. 
We believe the upgraded digital banking platform, on a cloud-native architecture, will help accelerate innovation at RBL, enable deeper customer engagements and drive extensive automation to achieve operational excellence.\u201d Sankarson Banerjee, Chief Information Officer, RBL Bank, said, \u201cEven as our customers shift and readjust how they bank in these challenging times, it is our continuous endeavor to provide them with a world class customer experience. When combined with the flexibility to elastically scale our applications and microservices, we will be better placed to meet our digital transformation goals. We\u2019re investing in this new platform to gain exponential benefits in reducing costs, increasing efficiency, lowering cost to serve and ultimately, a better customer experience. Finacle\u2019s microservices and API based architecture forms the foundation on which we react to market requirements faster and remain ever ready to serve digitally native customers.\u201d About RBL Bank RBL Bank is one of India's fastest-growing private sector banks with an expanding presence across the country. The Bank offers specialized services under six business verticals, namely: Corporate & Institutional Banking, Commercial Banking, Branch & Business Banking, Retail Assets, Development Banking and Financial Inclusion, Treasury and Financial Markets Operations. It currently services over 8.49 million customers through a network of 1,631 offices (386 branches & 1,245 BC branches) spread across 28 Indian States and Union Territories. To know more, visit https://www.rblbank.com/ Infosys Finacle Finacle is the industry-leading digital banking solution suite from EdgeVerve Systems, a wholly owned product subsidiary of Infosys. Finacle helps traditional and emerging financial institutions drive truly digital transformation to achieve frictionless customer experiences, larger ecosystem play, insights-driven interactions and ubiquitous automation. 
Today, banks in over 100 countries rely on Finacle to service more than a billion consumers and 1.3 billion accounts.<|endoftext|> Finacle solutions address the core banking, omnichannel banking, payments, treasury, origination, liquidity management, Islamic banking, wealth management, analytics, artificial intelligence, and blockchain requirements of financial institutions to drive business excellence. An assessment of the top 1250 banks in the world reveals that institutions powered by the Finacle Core Banking Solution, on average, enjoy a cost-to-income ratio 7.2 percentage points lower than that of others.<|endoftext|> To know more, visit www.finacle.com Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. 
The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. 
Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: El Paso Water Selects Infosys as its Strategic Partner for Customer Service Transformation \nAuthor: ['Infosys Limited'] Infosys (NYSE: INFY), a global leader in next-generation digital services and consulting, today announced a strategic partnership with El Paso Water (EPWater), a municipal utility in El Paso, Texas, to transform its legacy customer information systems (CIS) with Oracle Utilities Customer to Meter (C2M). As part of this engagement, Infosys will leverage its Preconfigured Accelerator for Customer Experience (PACE), to digitize EPWater\u2019s customer engagement and billing platforms.<|endoftext|> A platinum level member of the Oracle Partner Network (OPN), Infosys will accelerate the implementation of Oracle Utilities C2M on a platform-as-a-service (PaaS) model, along with cloud solutions for Customer Self Service and Mobile Workforce Management, with its PACE framework to deliver accuracy and efficiency. This empowers EPWater with an agile and flexible platform that commits to minimum customization with improved system agility and interoperability. Infosys\u2019 Robotic Process Assistant on proprietary AssistEdge framework will enable EPWater to seamlessly automate repetitive processes. The implementation allows EPWater to leverage new tools and processes to elevate customer experience and bolster employee productivity. 
As part of this partnership, Infosys will simplify and modernize EPWater\u2019s IT landscape with a scalable architecture to improve audit control mechanisms and financial transparency.<|endoftext|> Marcela Navarrete, Vice President at EPWater, said, \u201cThis is an ambitious undertaking with multiple system upgrades simultaneously, but it\u2019s a necessary project to help us make a leap forward to improve both efficiency and customer satisfaction.\u201d Ashiss Kumar Dash, SVP and Segment Head - Services, Utilities, Resources, Energy, Infosys, said, \u201cWe are delighted to partner with EPWater in their customer service transformation journey. In our past implementations, we have seen our utility clients reap numerous benefits from the flexibility, nimbleness, and cost-effectiveness of Infosys PACE solution on the Oracle C2M and Customer Care and Billing (CC&B) platforms. With deep knowledge in areas of Customer Care and Billing, C2M, meter data, and mobile workforce management guided by industry-best practices, Infosys is committed to deliver superior customer experience and employee engagement.\u201d About EPWater With oversight by the Public Service Board, El Paso Water provides water, wastewater, reclamation and stormwater management services for residential and commercial customers in the City of El Paso and wholesale services for some areas in El Paso County. The utility is recognized as a national leader for its innovative water supply strategy that includes water reuse, inland desalination and conservation.<|endoftext|> Visit epwater.org for more information.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. 
We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. 
The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. 
Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} @@ -247,26 +252,26 @@ {"text": "Infosys Press Release (PR) \nTitle: Infosys and Ellen MacArthur Foundation Partner to Drive Forward Circular Economy \nAuthor: ['Infosys Limited'] Infosys and Ellen MacArthur Foundation Partner to Drive Forward Circular Economy Infosys (NYSE: INFY), a global leader in next-generation digital services and consulting, has partnered with the Ellen MacArthur Foundation charity. The organisations will work together to accelerate the global transition to a circular economy.<|endoftext|> The Foundation works with its Network of Strategic Partners, Partners, and Members. Infosys has joined the Network as a Partner. The Network includes some of the world\u2019s leading and most influential organisations, including businesses, governments, educators, innovators, and investors, to drive systemic change.<|endoftext|> Infosys CEO, Salil Parekh and Dame Ellen MacArthur discuss the importance of building a circular economy.<|endoftext|> The collaboration coincides with the launch of the strategic Sustainable Business Unit within Infosys, which will enable customers to better incorporate circular designs into their products, services, and supply chains.<|endoftext|> The Ellen MacArthur Foundation develops and promotes the idea of a circular economy. It works with, and inspires, business, academia, policymakers, and institutions to mobilise systems solutions at scale, globally. 
The circular economy offers an alternative to the linear \u2018take, make, waste\u2019 economy \u2014 one which is better for people, the economy, and the environment. The circular economy is based on three key principles \u2014 design waste out of the system, keep products and materials in use, and regenerate natural systems.<|endoftext|> Infosys will focus on aligning its digital transformation toolkit \u2014 Live Enterprise Suite \u2014 with the Foundation\u2019s circular economy performance measurement tool, Circulytics. Circulytics enables companies to measure their circular economy performance and identify opportunities to adopt, or further embed, circular practices, thereby driving the transition to a circular economy. Infosys will be able to achieve circular design of products, services, and supply chains much more quickly by reusing and repurposing customers\u2019 existing technology stacks, rather than replacing them.<|endoftext|> The Partnership follows the recent announcement that Infosys has become a PAS 2060 certified carbon neutral company \u2013 30 years ahead of the timeline set out in the Paris Agreement on climate change. Additionally, 34 of the company\u2019s buildings have the highest level of green building certification and no wastewater is discharged from any of its campuses. Now, as a system integrator dedicated to doing the right thing since inception in 1981, Infosys is well placed to take the technology conversation forward on circularity.<|endoftext|> Corey Glickman, Head of Sustainable Business, Infosys, said: \u201cBeing a Partner of the Ellen MacArthur Foundation is a valued relationship that allows us to continue at pace the work we\u2019ve already been doing to promote the importance of efficient practices and supply chains. 
Infosys believes there is a symbiotic relationship between digitisation and sustainability and through aligned strategies and clever design \u2014 particularly on circularity \u2014 you can achieve both, with just one pocket of spending.\u201d James George, Network Development Lead, Ellen MacArthur Foundation, said: \u201cI am very excited to see how this relationship develops and deepens over the next few years. As a global leader in next-gen digital platforms, Infosys will bring a calibre of expertise and knowledge that will further help the Network to realise its digital ambition, which will support the transition towards a circular economy. As a Partner with the Foundation, Infosys have drawn a line in the sand, to embrace a circular economic framework as part of their future value proposition.\u201d About Ellen MacArthur Foundation The Ellen MacArthur Foundation is a UK-based charity committed to the creation of a circular economy that tackles some of the biggest challenges of our time, such as waste, pollution, and climate change. A circular economy designs out waste and pollution, keeps products and materials in use, and regenerates natural systems, creating benefits for society, the environment, and the economy. The Ellen MacArthur Foundation works closely with designers, businesses, educators, and policymakers around the world to achieve this.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Science Foundation to Announce the Winners of the 12th Infosys Prize \nAuthor: ['Infosys Limited'] Infosys Science Foundation (ISF) will announce the winners of the Infosys Prize 2020 on the evening of December 02, 2020, virtually. The Infosys Prize, which was instituted to elevate the prestige of science and research, also aims to inspire the youth to choose a vocation in research. The prize for each category comprises a pure gold medal, a citation, and a prize purse of USD 100,000 (or its equivalent in Rupees) this year.<|endoftext|> Infosys Science Foundation will felicitate scholars across six categories - Engineering and Computer Sciences, Humanities, Life Sciences, Mathematical Sciences, Physical Sciences, and Social Sciences. 
A distinguished jury, comprising leaders in each of these fields, evaluates the work of the nominees against the standards of international research, placing the winners on par with the finest researchers in the world.<|endoftext|> \u201cOver the last 12 years, the Infosys Prize has chosen the very best contemporary researchers and scientists who have gone on to distinguish themselves further. This reaffirms our faith in our process and purpose. We realized that there was a serious need to bring science to the fore and make it fashionable again, especially for the younger generation who need to see contemporary role models in these fields and be inspired by them,\u201d said N.R. Narayana Murthy, Founder, Infosys, President - Board of Trustees, Infosys Science Foundation.<|endoftext|> Since its inception in 2009, Infosys Science Foundation has felicitated the work of 68 laureates from not just institutes like the IITs, IISc, ISIs, and NCBS but also CSIR labs across the country, niche research institutes like JNCASR and Harish-Chandra Research Institute, among others. This time last year, Abhijit Banerjee and Esther Duflo, early winners of this Prize, won the Nobel Memorial Prize in Economics. Manjul Bhargava and Akshay Venkatesh went on to win the Fields Medal \u2013 one of the highest honors in mathematics, awarded only once every four years to those under 40 years of age. Gagandeep Kang became the first woman from India to be elected as a member of the Royal Society. This year, Ashoke Sen and Thanu Padmanabhan, who were awarded the Infosys Prize in 2009, ranked in the top 30 in their field in a list of the world\u2019s top 2 percent of leading scientists, according to a paper published by Stanford researchers.<|endoftext|> Winners for this year will be announced and felicitated in a virtual ceremony on December 2, 2020.<|endoftext|> Agenda of the event: Opening address by N. R. 
Narayana Murthy - Founder, Infosys, President - Board of Trustees, Infosys Science Foundation; Address by the chief guest \u2013 Abel Prize winner, S. R. Srinivasa Varadhan; Announcement of winners across six categories by the chairs of each jury panel; Vote of thanks by Salil Parekh \u2013 CEO and MD, Infosys. About the Infosys Science Foundation The Infosys Prize is awarded by the Infosys Science Foundation, a not-for-profit trust set up in 2009. The award is given annually to honor outstanding achievements of contemporary researchers and scientists across six categories: Engineering and Computer Sciences, Humanities, Life Sciences, Mathematical Sciences, Physical Sciences and Social Sciences. Each prize consists of a gold medal, a citation and a purse of USD 100,000. The award intends to celebrate success in research and stand as a marker of excellence in these fields.<|endoftext|> Prof. Kaushik Basu | Prof. Arvind | Prof. Shrinivas Kulkarni | Prof. Akeel Bilgrami | Dr. Mriganka Sur | Prof. Chandrashekhar Khare | Srinath Batni | K. Dinesh | S. Gopalakrishnan | N. R. Narayana Murthy | Nandan Nilekani | T. V. Mohandas Pai | Shibulal S.D.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} {"text": "Infosys Press Release (PR) \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate the cloud-native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus fostering client confidence; Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements; Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses; Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements; Extensive partnerships with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients. \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds,\u201d said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.\u201d<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem. Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next. \n\nSafe Harbor \nCertain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law. \n\nMedia contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "}
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} -{"text": "Infosys Blog \nTitle: Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 \nAuthor: ['Infosys Limited'] Infosys Positioned as a Leader in the Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 Infosys (NYSE: INFY), the global leader in next-generation digital services and consulting, today announced that it has been positioned as a Leader in Everest Group\u2019s PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020. Infosys was recognized for its ability to help organizations augment their digital capabilities, modernize their core systems, and deliver design-led experiences in an agile manner. 
Backed by deep domain expertise and experience, Infosys leverages platforms such as Infosys PolyCloud Platform and Infosys Cloud Native Development Platform, part of Infosys Cobalt, to simplify and accelerate cloud native journey for its clients.<|endoftext|> Everest Group assessed 21 leading service providers through a multi-phased research and analysis process for their vision and capabilities in the cloud-native applications development space. Infosys\u2019 cloud-native application development services include API, microservices, PaaS, observability, security, and DevSecOps.<|endoftext|> The key highlights of the report include: Design thinking approach and joint workshops with clients that have helped build and demonstrate POCs, thus, fostering client confidence Mature set of tools and accelerators that enable predictability and consistency in its cloud-native engagements Strong pool of domain experts across industry verticals, which enables it to contextualize cloud-native solutions with a better understanding of clients\u2019 businesses Infosys\u2019 upskilling initiatives that help provide consistent and quality delivery teams in cloud-native engagements Extensive partnership with ISVs and cloud service providers to develop joint solutions and enhanced service offerings for clients \u201cRapidly evolving market conditions have put unprecedented pressure on enterprises to differentiate themselves and find more agile, scalable, and cost-effective means to develop applications. In response, they are increasingly relying on cloud-native development,\u201d said Alisha Mittal, Practice Director, Everest Group. \u201cInfosys is enabling its clients to develop resilient cloud-native applications leveraging Infosys Cobalt, a set of services, solutions, and platforms for enterprises to accelerate their cloud journey. 
Infosys\u2019 clients also appreciate its talent initiatives, design thinking approach, and domain expertise across industry verticals.\u201d \u201cCloud native applications and technologies are the way forward to drive innovation, resilience and deliver well-recognized business value to customers. It is an ideal approach for enterprises that are looking to build and run responsive, scalable, and fault-agnostic apps across public, private, or hybrid clouds\u201d, said Shaji Mathew, Executive Vice President, Infosys. \u201cOur positioning as a Leader in the report validates our deep domain knowledge backed by offerings from Infosys Cobalt to contextualize cloud-native solutions specific to our clients\u2019 businesses across industry verticals.<|endoftext|> A complimentary custom copy of Everest Group PEAK Matrix\u00ae for Cloud-native Application Development Service Providers 2020 can be accessed here.<|endoftext|> About Infosys Infosys is a global leader in next-generation digital services and consulting. We enable clients in 46 countries to navigate their digital transformation. With nearly four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. 
Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.<|endoftext|> Visit www.infosys.com to see how Infosys (NYSE: INFY) can help your enterprise navigate your next.<|endoftext|> Safe Harbor Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of 
governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2020. These filings are available at www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.<|endoftext|> Media contacts: For further information, please contact: PR_Global@infosys.com \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Apple Vision Pro: Revolutionizing Augmented Reality Experiences \nAuthor: ['Infosys Limited'] Apple Vision Pro: Revolutionizing Augmented Reality Experiences Welcome to the future of augmented reality! Apple Vision Pro is set to redefine the way we interact with digital content, blending the virtual world seamlessly with reality. With its cutting-edge features, intuitive controls, and sleek design, this AR headset is poised to revolutionize the way we work, play, and explore. Let\u2019s dive into the impressive capabilities and potential use cases of the Apple Vision Pro.<|endoftext|> Unmatched Visuals: At the heart of the Apple Vision Pro lies a 4K micro-OLED display, boasting a staggering 65 times more pixel density than an iPhone screen. 
This immersive display ensures breathtaking visuals, bringing digital content to life with exceptional clarity and detail. Whether you\u2019re gaming, designing, or consuming media, the Apple Vision Pro will transport you to a whole new dimension of visual experiences.<|endoftext|> Seamless Interaction: With the absence of a physical controller, the Apple Vision Pro takes user interaction to the next level. Using advanced sensors, including those for eye and hand tracking, this AR headset allows for natural and intuitive control. By simply moving your eyes or gesturing with your hands, you can navigate menus, interact with virtual objects, and immerse yourself in a world of possibilities.<|endoftext|> Powerful Processing: The M2 and R1 chips power the Apple Vision Pro, enabling real-time processing and seamless performance. These chips ensure smooth rendering of complex AR content, allowing for fluid interactions and an unparalleled user experience. With the Apple Vision Pro, you can say goodbye to lag and latency, and fully embrace the limitless potential of augmented reality.<|endoftext|> Unmatched Sensor Array: The Apple Vision Pro is equipped with an array of sensors that revolutionize how we perceive and interact with the world around us. From tracking eye movement and hand gestures to capturing facial expressions and depth information, this headset delivers a truly immersive experience. Two down cameras, two side cameras, two IR illuminators, two LiDAR scanners, and two True Depth cameras work in harmony to provide precise tracking and depth perception, further blurring the line between the physical and virtual realms.<|endoftext|> Immersive and Authentic: Apple Vision Pro offers two distinct modes: immersive mode and pass-through mode. In immersive mode, users can completely immerse themselves in virtual environments, enjoying interactive experiences like gaming, virtual tours, and creative design. 
In pass-through mode, the headset seamlessly integrates virtual content into the real world, allowing for enhanced productivity, navigation, and information overlays.<|endoftext|> Uncompromised Security: Your privacy and security are of utmost importance with the Apple Vision Pro. The device features Optic ID, an innovative authentication method that uses optical technology to verify your identity seamlessly. This ensures that only authorized users can access sensitive information or perform secure actions within the AR environment.<|endoftext|> Apple Ecosystem Integration: With a focus on a seamless user experience, the Apple Vision Pro is designed exclusively for Apple apps. This tight integration with the Apple ecosystem allows for seamless syncing, sharing, and compatibility across various devices. Whether you\u2019re using your iPhone, iPad, or Mac, the Apple Vision Pro seamlessly integrates with your existing workflows and enhances your productivity.<|endoftext|> Use-cases for vision pro Remote Collaboration and Communication: AR headsets can revolutionize remote work by enabling immersive and interactive collaboration regardless of physical location. Teams can come together virtually, view and manipulate 3D models, share information in real-time, and communicate through augmented video calls, enhancing productivity and fostering seamless collaboration.<|endoftext|> Education and Training: Augmented reality headsets can transform the learning experience by overlaying digital content onto the real world. Students can engage in interactive lessons, explore historical sites, dissect virtual organisms, or practice hands-on skills in simulated environments. AR headsets provide an immersive and dynamic educational experience that enhances comprehension and retention.<|endoftext|> Healthcare and Medical Training: AR headsets have immense potential in healthcare settings. 
Surgeons can benefit from real-time visualizations and augmented guidance during complex procedures, allowing for precise and efficient interventions. Medical students can practice virtual surgeries, study anatomy with interactive overlays, and gain hands-on experience in a safe and controlled environment.<|endoftext|> Enhanced Surgical Visualization: The Apple Vision Pro can revolutionize surgical procedures by providing real-time visualizations and augmented guidance. Surgeons can overlay important information, such as preoperative imaging scans, patient vitals, and surgical plans directly onto their field of view. This augmented assistance enables precise and efficient interventions, leading to improved surgical outcomes and patient safety.<|endoftext|> Training and Simulation: Medical students and professionals can greatly benefit from the immersive training experiences offered by the Apple Vision Pro. Virtual simulations allow users to practice complex procedures, such as surgeries, in a controlled and realistic environment. With the headset\u2019s accurate tracking and depth perception, trainees can hone their skills, learn from their mistakes, and gain valuable hands-on experience before entering the operating room.<|endoftext|> Anatomy Education and Visualization: Studying and understanding human anatomy is a crucial part of medical education. The Apple Vision Pro can enhance traditional learning methods by overlaying interactive anatomical models onto the real world. Students can visualize and manipulate virtual organs, systems, and structures, gaining a deeper understanding of the human body. This immersive approach improves comprehension and retention, ultimately leading to more proficient healthcare professionals.<|endoftext|> Retail and E-Commerce: AR headsets offer virtual try-on experiences, enabling customers to see how products look or fit before purchasing. 
Users can virtually place furniture in their homes, try on virtual fashion items, or visualize how a new paint color would look on their walls. These immersive shopping experiences enhance customer engagement, reduce returns, and improve overall customer satisfaction.<|endoftext|> Architecture and Design: Architects and designers can use AR headsets to visualize and modify 3D models of buildings or interior spaces in real-time. Clients can walk through virtual representations of projects, making design decisions and providing feedback before construction begins. This streamlines the design process, improves communication, and reduces costly errors.<|endoftext|> Tourism and Travel: AR headsets can enhance the travel experience by providing interactive and informative overlays on landmarks, historical sites, and tourist attractions. Users can access real-time information, translations, and augmented guides, enriching their understanding of the places they visit and offering a more immersive and personalized travel experience.<|endoftext|> Gaming and Entertainment: Augmented reality headsets bring gaming to a whole new level by overlaying virtual elements onto the real world. Users can engage in interactive multiplayer games, experience immersive storytelling, and explore virtual worlds in their own surroundings, blurring the boundaries between the digital and physical realms.<|endoftext|> Industrial Maintenance and Repair: AR headsets can assist technicians and engineers in performing maintenance and repairs by providing step-by-step visual instructions, overlaying diagnostic information, and offering real-time guidance. This reduces downtime, improves efficiency, and enhances worker safety.<|endoftext|> The Apple Vision Pro AR headset will raise the bar for augmented reality hardware in the years to come. We will see a wide range of applications and innovations built around it. 
Its lightweight design, powerful processing capabilities, unmatched sensor array, and intuitive controls make it a game-changer for professionals, gamers, and enthusiasts alike. With its stunning visuals and immersive experiences, the Apple Vision Pro opens up a new realm of possibilities, enabling us to explore, create, and connect in ways we\u2019ve never imagined. Get ready to step into the future with the Apple Vision Pro AR headset.<|endoftext|> \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: End to End view of OTA for Connected Vehicles \nAuthor: ['Infosys Limited'] End to End view of OTA for Connected Vehicles Background \u2013 What is OTA for connected vehicle Over-the-Air (OTA) software updates for connected vehicles became increasingly necessary with advanced electronics, control units & technology embedded in the vehicles which need firmware(s) & software to function.<|endoftext|> OTA updates enable automobile manufacturers to push software packages remotely through the wireless connection. Wireless connection can be through Communication Modules (DCMs) or Local Wi-Fi connections. 
The software packages for connected vehicles are developed and pushed for new features and bug fixes, similar to smartphones and other electronic devices.<|endoftext|> The ability to update software in vehicles remotely saves vehicle manufacturers and regional dealers from arranging customer visits and providing manual technical assistance, along with the related time & manpower load.<|endoftext|> The necessity of OTA \u2013 Debriefed objective points Connected vehicles are loaded with heavy and complex software packages for connected services, vehicle controls, sensors, vehicle and driver safety features, autonomous driving, and much more.<|endoftext|> It is crucial and obligatory for vehicle manufacturers and OEMs to keep in-vehicle software packs and operating systems bug-free and up-to-date with enhanced and new features.<|endoftext|> This demand for frequent and swift software updates pushes manufacturers and OEMs to adopt OTA mechanisms for the key necessities and/or benefits listed below.<|endoftext|> Drawbacks of time-consuming manual software updates, customer-to-dealer visits, and wired set-ups Automated OTA updates can deliver software/firmware fixes and enhancements remotely and swiftly, improving vehicle functionality, safety, and security. 
The fast-paced evolution of connected vehicles and services pushing heavy demand on the OTA In the ever-evolving world of connected vehicles and their features, OTA software updates are inevitable for manufacturers to be swift and efficient to deliver software updates, improve vehicle performance, and enhance overall customer delight.<|endoftext|> E2E \u2013 Systems and Business Functions Enablement of OTA for connected vehicles involves cross-functional system domains and/or functional capabilities listed below.<|endoftext|> Data Collection and Data Analysis of versions and status of in-vehicle software(s) Regulation Compliance and Management UNECE-WP-29 for cybersecurity and software updates, And UNECE-R156 & R155 for component-specific functionalities Legal Compliance to GDPR and Customer Consent Management Fully automated Campaign Configuration and E2E OTA Rollout Management Notification Management to keep customers informed about software updates, and collect consent Product Management, Research, and SDLC of the Software Package(s) Data Analytics and Reporting to judge the success rate of the OTA rollouts Customer management for failed OTA to coordinate with the customer for wired updates Bench set-up with CAN-bus for wired updates Typical-E2E-System-Domains-Functions-and-Workflow-for-Vehicle-OTATechnical capabilities Telematics \u2013 To collect vehicle data and send signals to change the configuration of the vehicle Cellular Network \u2013 Telematics data is transmitted over a cellular network, such as 3G, 4G LTE, or 5G. Firmware Over-the-Air (FOTA) \u2013 FOTA technology enables the wireless transfer, installation, and rollbacks Differential (Delta) Updates enable installation of just a changed part of the software, not the whole package Secure Communication, E2E encryption, digital signatures & certificates, and authentication to protect the OTA update from unauthorized access and tampering. 
Remote Software Management Platforms (Campaign Consoles) to create and package software updates, and schedule deployments in batches to track status and performance. Remote Device Management and Monitoring for remote provisioning, resets, and restarts Along with the technical capabilities, the technical protocols used for OTA are Open Mobile Alliance Device Management (OMA-DM) MQTT (Message Queuing Telemetry Transport) HTTPS (Hypertext Transfer Protocol Secure) CoAP (Constrained Application Protocol) SOTA (Software Over-The-Air) protocols Current Challenges and Limitations Although OTA for vehicles has been in practice for more than two decades, the challenges and limitations are also growing alongside the complexity of connected services and demanding customer experience and regulations.<|endoftext|> Some of the prominent challenges are explained below.<|endoftext|> \n\n\n***\n\n\n "}
{"text": "Infosys Blog \nTitle: Optimize Maintenance Costs through Predictive Maintenance \nAuthor: ['Infosys Limited'] Optimize Maintenance Costs through Predictive Maintenance Predictive Maintenance: Predictive maintenance is a strategy of servicing equipment only when needed, reducing unexpected outages. This proactive analysis can help increase equipment life and reduce product delays by cutting equipment changeovers/downtime.<|endoftext|> Following are a few highlights of Predictive Maintenance: Enables the organization to monitor assets remotely, in real time, and maintain a digital record of the transaction details.<|endoftext|> Helps to monitor the asset\u2019s location and its utilization in integration with IoT.<|endoftext|> End-to-end visibility with real-time analytics enables improved productivity Optimize logistics of the parts and ensure proper maintenance planning.<|endoftext|> Significance and Management of Predictive Maintenance: Predictive maintenance is a key component of Industry 4.0. 
Improper maintenance management and strategies can impact the operational efficiency of the organization along with its profitability, i.e., effective maintenance practices determine the ability to operate reliably and profitably. To be competitive, companies need to minimize unplanned plant/equipment downtime and in turn optimize maintenance costs. Implementing best maintenance practices, processes, and applications can yield good returns.<|endoftext|> Predictive analytics is used to predict asset failures and to generate actionable insights in real time. Data from different sources, like IoT, M2M, etc., is required to establish an effective predictive maintenance system and to decide whether maintenance operations are needed. For example, maintenance management systems contain information on maintenance manuals, parts of equipment, maintenance reports, etc.<|endoftext|> Advanced analytics capabilities (like Oracle Analytics Cloud) are very critical for maintenance optimization, i.e., for analytics and visualizations. This helps the organization in predictive analytics besides descriptive analytics. 
Machine learning and data science methods are used to build the predictive maintenance models.<|endoftext|> Highlighting elements of the Predictive Maintenance system: Asset Monitoring: Monitor assets remotely in real time, collect information from the physical world into digital form, and optimize the asset lifecycle by leveraging Artificial Intelligence Data Analytics: Analyze asset data streams with analytics tools to deliver visualizations of real-time data and failure predictions by advanced analytics and ML algorithms, before failures happen, resulting in optimized maintenance planning.<|endoftext|> Maintenance Optimization: Based on AI, data insights, and subsequent predictive actions, automation (like using sensor data) of maintenance Work Order creation, technician assignment, and optimal maintenance schedule recommendations.<|endoftext|> Integrate with IoT: Monitors the asset\u2019s location and its utilization in association with IoT. Real-time data integration with the physical asset and the IoT application.<|endoftext|> Optimized Operations: Optimization of business processes through real-time data-driven decisions. This also reduces operations costs. A complete view of assets and equipment helps the organization know the location of an asset and its lifecycle details.<|endoftext|> Key Benefits: Following are key benefits from Predictive Maintenance Equipment Uptime increase Reduction in breakdowns Increase in Productivity Reduction in Maintenance Costs Poor maintenance strategies can affect maintenance operations efficiency and impact organizational profits. In today\u2019s world, to be competitive, organizations in asset-oriented industries need to have a proper maintenance strategy which results in reduction of unplanned downtime and optimization of maintenance costs. 
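The blog notes that machine learning and data science methods are used to build predictive maintenance models. As a loose illustration only (not from the blog, and far simpler than a trained ML model), a basic condition-monitoring rule might flag an asset when a rolling average of its sensor readings drifts past an alert threshold; the function names, window size, and threshold below are all hypothetical:

```python
# Illustrative sketch only: a minimal condition-monitoring rule that flags an
# asset for maintenance when the rolling average of its sensor readings
# (e.g., vibration amplitude) exceeds a threshold. A production system would
# instead train ML models on historical IoT failure data.

def rolling_mean(values, window):
    """Average of the most recent `window` readings."""
    recent = values[-window:]
    return sum(recent) / len(recent)

def needs_maintenance(readings, window=3, threshold=80.0):
    """Return True when the recent average exceeds the alert threshold."""
    if len(readings) < window:
        return False  # not enough data to decide yet
    return rolling_mean(readings, window) > threshold

# A healthy asset stays flat; a degrading one trends upward.
print(needs_maintenance([42.0, 45.1, 43.8, 44.2]))       # False
print(needs_maintenance([42.0, 55.3, 71.9, 84.6, 92.2])) # True
```

In practice the fixed threshold would be replaced by a model learned from labeled failure histories, which is what makes the maintenance "predictive" rather than merely reactive.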
Hence, an effective Predictive Maintenance strategy in place enables organizations to monitor their assets in real time, integrate assets and collect data from different sources, analyze the data and translate it into meaningful insights, and finally convert those insights into prescriptive actions in an automated manner, optimizing maintenance activities and costs.<|endoftext|> Ref: https://www.oracle.com/a/ocom/docs/applications/supply-chain-management/oracle-future-ready-predictive-maintenance-brief.pdf https://www.oracle.com/a/ocom/docs/applications/supply-chain-management/oracle-future-ready-predictive-maintenance-info.pdf https://www.oracle.com/ae/data-platform/predictive-maintenance/ \n\n\n***\n\n\n "}
{"text": "Infosys Blog \nTitle: Exploring the Future of Programming with GitHub Copilot: Revolutionizing Business Efficiency \nAuthor: ['Infosys Limited'] Exploring the Future of Programming with GitHub Copilot: Revolutionizing Business Efficiency Introduction: In the fast-paced world of software development, businesses are constantly seeking innovative tools and technologies to streamline their programming processes, enhance productivity, and deliver high-quality code. One such groundbreaking advancement is GitHub Copilot, an AI-powered code completion tool developed by OpenAI and GitHub. This solution has the potential to reshape the future of programming, offering an intelligent assistant that can assist developers in writing code faster and more efficiently. In this blog post, we will delve into the transformative power of GitHub Copilot and its implications for businesses in various sectors.<|endoftext|> Enhanced Development Speed and Efficiency: GitHub Copilot is designed to save developers valuable time by suggesting code snippets, completing lines of code, and providing context-aware code suggestions in real-time. 
This AI-powered assistant has been trained on a vast array of publicly available code, making it capable of understanding and generating code for multiple programming languages and frameworks. By automating repetitive and mundane coding tasks, developers can focus their energy on more complex problem-solving and innovation, leading to faster development cycles and reduced time-to-market for businesses.<|endoftext|> Improved Code Quality and Consistency: Maintaining code quality and consistency is crucial for any software development project. GitHub Copilot acts as an intelligent pair programmer, offering suggestions based on best practices and common coding patterns. It can help catch syntax errors, highlight potential bugs, and recommend alternative code implementations. This ensures that the code produced is of higher quality, minimizing the occurrence of bugs and vulnerabilities. With GitHub Copilot\u2019s assistance, businesses can significantly enhance their software reliability and reduce the costs associated with debugging and maintenance.<|endoftext|> Accelerated Learning and Onboarding: In addition to its code completion capabilities, GitHub Copilot acts as a learning tool, making it particularly valuable for businesses when onboarding new developers. As the assistant suggests code snippets and explains the underlying concepts, it assists developers in learning new programming languages and frameworks more efficiently. This can greatly reduce the learning curve for new team members, enabling them to become productive contributors in a shorter period. By leveraging GitHub Copilot\u2019s AI capabilities, businesses can facilitate knowledge transfer, improve collaboration, and foster a more cohesive and efficient development team.<|endoftext|> Tailored Solutions for Industry-Specific Needs: Different industries have unique programming requirements, and GitHub Copilot can adapt to specific domain needs. 
By training the AI model on industry-specific codebases, businesses can create tailored versions of GitHub Copilot that cater to their specific programming challenges. For example, a financial institution can train Copilot on their proprietary financial models and algorithms, ensuring that the suggestions provided align with their industry regulations and practices. This customization opens up new possibilities for businesses, empowering them to accelerate development in their respective fields.<|endoftext|> Ethical Considerations and Human Oversight: While the potential of GitHub Copilot is remarkable, it is essential to address potential ethical concerns. GitHub Copilot is an AI assistant that offers suggestions, but it is not infallible. Human oversight is crucial to review and validate the code generated by Copilot to ensure adherence to business requirements, security standards, and legal compliance. Additionally, businesses should be mindful of any potential biases in the training data and work towards creating more inclusive and equitable software solutions.<|endoftext|> Conclusion: GitHub Copilot represents an exciting leap forward in programming technology, offering businesses the opportunity to significantly enhance their development processes. By harnessing the power of AI, Copilot empowers developers to write code faster, improve code quality, accelerate learning, and adapt to industry-specific needs. As businesses embrace this transformative tool, it is crucial to maintain a balance between automation and human expertise, leveraging the capabilities of GitHub Copilot while ensuring human oversight and ethical considerations. 
The future of programming with GitHub Copilot is promising, and businesses that embrace this innovation will undoubtedly gain a competitive edge in the ever \n\n\n***\n\n\n "}
{"text": "Infosys Blog \nTitle: Future of Manufacturing with Augmented Reality \nAuthor: ['Infosys Limited'] Future of Manufacturing with Augmented Reality Augmented Reality (AR) is a technology that enhances real-world objects by adding computer-generated information, graphics, and sounds, thus enhancing the perception of the real world. From a discrete manufacturing perspective, AR is used to visualize, monitor, and control the manufacturing process in real-time, leading to improved efficiency, quality, and safety.<|endoftext|> Augmented Reality vs. Virtual Reality: Augmented Reality differs from Virtual Reality (VR) in that AR blends virtual information with the real-world environment, whereas VR immerses the user in a fully virtual environment. Augmented Reality is more suitable for the manufacturing industry as it allows the user to stay engaged with the physical world while augmenting it with relevant information.<|endoftext|> Practical Applications: The manufacturing industry has been quick to adopt AR technology due to its practicality and ability to provide real-time assistance to the workforce. Some practical applications of AR in manufacturing are: Training and Education: AR technology can provide an interactive and immersive training experience to new employees, enabling them to learn quickly and efficiently. Jaguar Land Rover, for example, uses AR headsets to train their employees in assembly line processes.<|endoftext|> 
Maintenance and Repair: AR can be used to provide real-time assistance to maintenance and repair personnel, helping them diagnose and fix issues more quickly. Boeing, for example, uses AR headsets to assist their technicians in wiring harness installation.<|endoftext|> Quality Control: AR can be used to visualize and monitor the manufacturing process in real-time, leading to improved quality control. Volvo, for example, uses AR technology to visualize and test car designs in real-time.<|endoftext|> Remote Assistance: Using AR glasses and mobile devices, experts can provide real-time guidance to onsite workers by digitally overlaying instructions, annotations, and visual cues on the real world, thus enabling smooth troubleshooting.<|endoftext|> Ergonomics and Safety: While using AR technology, workers can leverage guidance via on-screen text, annotations, or voice with respect to their posture and the method to handle the equipment, so that the work happens in an efficient way while giving utmost importance to worker safety.<|endoftext|> Augmented Reality Probable Use Cases in Industry 5.0: Industry 5.0 is the next phase in the evolution of the manufacturing industry, where humans and machines work together in a harmonious and collaborative environment. 
Augmented Reality technology is likely to play a significant role in Industry 5.0, with the following probable use cases: Collaborative Robotics: AR technology can be used to enhance the collaboration between humans and robots in the manufacturing process, leading to improved efficiency and safety.<|endoftext|> Real-time Data Visualization: AR technology can be used to provide real-time data visualization to the workforce, enabling them to make informed decisions quickly and efficiently.<|endoftext|> Predictive Maintenance: AR technology can be used to monitor the manufacturing process in real-time, enabling predictive maintenance and reducing downtime.<|endoftext|> Conclusion: Augmented Reality technology is transforming the manufacturing industry by enhancing efficiency, quality, and safety. As the manufacturing industry moves towards Industry 5.0, the role of AR technology is likely to become even more significant, with increased collaboration between humans and machines and real-time data visualization. The manufacturing industry needs to embrace AR technology to stay competitive and achieve better outcomes.<|endoftext|> \n\n\n***\n\n\n "}
{"text": "Infosys Blog \nTitle: Data Privacy \u2013 More Fines, More Awareness \nAuthor: ['Infosys Limited'] Data Privacy \u2013 More Fines, More Awareness Data Privacy Trends in 2023 Data Privacy is becoming a very popular topic worldwide. Privacy is a right of every individual in the world. As the number of data privacy laws is increasing, companies need to focus on protecting users\u2019 privacy & comply with the new regulations. 
As awareness of data privacy increases globally, how a company manages personal data under these laws & regulations affects people\u2019s trust, which in turn affects the company\u2019s profit margins. So, companies are now trying to comply with the rules & regulations prescribed by each country\u2019s government. Also, to deter data privacy breaches, the EU\u2019s GDPR (General Data Protection Regulation) imposes heavy penalties on companies found misusing or mishandling data. 1. Global rise in data privacy laws & regulations Since the introduction of GDPR in 2018, there has been significant growth in data privacy regulations, & companies are now investing in making data more secure, avoiding third-party cookies, etc. Today, 100+ countries have proper data privacy laws & regulations, & this number is rising significantly. It is expected that 75% of the global population will have their private information covered by such regulations by the end of 2024. 2. Companies will invest more in privacy technologies Companies will spend heavily in the race to protect data & avoid penalties. Advertisers & marketing agencies rely on information-sharing models; however, this will soon be curtailed by new laws & regulations. Google launched its Privacy Sandbox in 2019 & is currently working on the Trust Token API to replace third-party cookies. This will greatly enhance the user experience & the security of personal data. 3. More privacy-related fines will be charged to service providers Big tech companies are often charged with fines & penalties for data breaches. 
Meta, one of the world\u2019s top companies, faces an estimated fine of \u20ac1 billion for GDPR breaches. Moreover, the IDPC (Irish Data Protection Commission) has more than 40 open inquiries into other big tech companies. Cookies and other tracking technologies are also evolving with time, so website owners need to continuously update their privacy policies & process data accordingly. 4. Increase in requests and complaints of data subjects Data subjects, or users, are becoming aware of their rights to protect their personal data from fraudsters, cookies, or any other agent of data breaches. Users are getting to know their rights to know, update, delete, edit, or otherwise control their information. So, there is a significant increase in complaints about data handling. In 2020, India banned the famous mobile game PUBG. The biggest reason for the ban was data privacy concerns. The game servers were not in India but in China, and it was believed that the data was misused, stolen, & transferred to other entities. As data subjects become aware of their rights, they will start to prefer first-party data handlers that are more secure, transparent, and protective & have full control over data. 5. Greater transparency in the collection and processing of personal data Data is priceless. Data contains valuable information that can create a business & also destroy it. According to a user privacy survey, users are likely to change their service providers as they become more aware of their data. Businesses that handle data properly and comply with regulations will see an increase in active users compared to others. Below are some of the huge fines levied on big companies due to data breaches. Top fines 1. Amazon GDPR fine \u2013 \u20ac746 million On July 16th, 2021, Amazon Inc. 
was fined \u20ac746 million ($888 million) for violating the GDPR. More than 10,000 people had filed complaints against Amazon in May 2018 through a French privacy rights group. The CNPD opened an investigation into how Amazon handles & processes its user data and found infringements in Amazon\u2019s advertising & targeting system, which operated without user consent. 2. Meta GDPR fine \u2013 \u20ac405 million On September 5th, 2022, Meta Ireland was fined \u20ac405 million under the GDPR for infringements in processing the personal data of children without a valid legal basis. According to the Data Protection Commission (DPC), personal data such as email addresses & phone numbers of Instagram business accounts of children aged 13 to 17 were automatically displayed. Meta failed to provide proper information to children in clear & plain language, lacked organizational & technical measures, & also failed to conduct a Data Protection Impact Assessment where processing resulted in a high risk to the rights & freedom of child users. 3. Meta GDPR fine \u2013 \u20ac265 million An investigation was launched by the DPA in 2021 after several media outlets reported that a Facebook data set containing highly personal data had been made available on a hacking platform. 533 million users were affected by this leak, which disclosed personal information such as email addresses & phone numbers to third parties without authorization & consent. So, on November 25th, 2022, the DPA imposed a \u20ac265 million fine on Meta after reviewing the Facebook Search, Messenger Contact Importer, & Instagram Contact Importer tools. 4. 
WhatsApp GDPR fine \u2013 \u20ac225 million On 2nd September 2021, Ireland\u2019s DPC imposed a \u20ac225 million fine on WhatsApp Ireland, a Facebook-owned VoIP & messaging service, after a 3-year investigation. The decision was issued after the EDPB (European Data Protection Board) intervened and asked the DPC to reassess the proposed fine, considering the transparency infringements, the calculation of the fine, & the period for WhatsApp to comply. 5. Google LLC fine \u2013 \u20ac90 million On December 31, 2021, a \u20ac90 million fine was imposed on Google LLC France because users could not refuse cookies on YouTube as easily as they could accept them. The CNIL concluded that the cookie refusal mechanism was much more complex than it should be, which led users to accept cookies, benefiting the company\u2019s advertising & user targeting. A further \u20ac100 per day fine was imposed on the company until it provided a mechanism to refuse cookies as simple as the one for accepting them. Importance of Data Privacy Data will help you to improve the quality of life for the people you support: Improving quality is primary among the reasons why organizations should be using data. By allowing you to measure and take action, an effective data system can enable your organization to improve the quality of people\u2019s lives. Data allows you to monitor the health of important systems in your organization: By utilizing data for quality monitoring, organizations are able to respond to challenges before they become full-blown crises. Effective quality monitoring will allow your organization to be proactive rather than reactive and will support the organization in maintaining best practices over time. So, choosing a wise data manager is particularly important; iEDPS is one example. WHY iEDPS? 
The patented iEDPS (Infosys Enterprise Data Privacy Suite) provides enterprise-class data privacy & data management. It enables organizations to de-risk & protect sensitive data, bundled with advanced test data management capabilities. iEDPS helps manage all data needs and enables an organization to adhere to global regulatory standards such as GDPR, CCPA, HIPAA, PIPEDA, GLBA, ITAR, and other global and local regulations. iEDPS can be deployed on any platform and supports all major databases and file systems. iEDPS uses data masking to hide sensitive data in repositories: it identifies sensitive data & performs static & dynamic masking adhering to the prescribed global standards. iEDPS is also flexible enough to accommodate newer client requirements for other types of data protection algorithms, which are developed, tested, & delivered while maintaining data privacy across the development, production, & testing teams to avoid data breaches. \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: The Rising Peril Of Disinformation \nAuthor: ['Infosys Limited'] The Rising Peril Of Disinformation The influence of social platforms on people has led to a tremendous rise in the amount of disinformation being spread across the globe. The complex and vast environment of the internet has increased the scale of information available to the user at each moment. People rely on news spread on social media as they have everything at their fingertips. They can easily generate, share, and interact with billions of users worldwide. But there are two sides to the same coin. On one side, it gives rise to several opportunities; on the other, it gives rise to several dangers, as people are unaware of the fake information being spread. Most people are hardly aware of the sources that provide this information. 
Many news pieces on social media come from non-credible sources. More than 50% of users believe that such news is fake, but the rest of the population trusts this false information and spreads it, regardless of the fact that they are misleading others. Nowadays, \u2018clickbait\u2019 is also a major issue; even reputed media channels have started using it to generate engagement for their articles. Case Study of Disinformation Covid-19 was a trending topic on the internet over the past two years, and the amount of disinformation spread was very high. People shared news without even validating the information. The Covid-19 pandemic has affected millions of people around the globe. Many people tried to make the public aware of the disastrous effects of the pandemic, but an equal number simply spread disinformation, which was much more infectious. This false spread of information became so rampant that the World Health Organization coined the term \u201cinfodemic\u201d to describe the conspiracies and unsubstantiated claims surrounding the outbreak. The major disinformation concerned home remedies for combating the virus, including drinking water every 15 minutes, ingesting garlic, exposure to heat, and so on. All of these were rejected by the World Health Organization, yet they are still being spread as preventive measures despite being explicitly dispelled. How To Spot and Stop Disinformation? It\u2019s always important to have a critical mindset while reading online. Always make sure you are reading from a trustworthy news provider. Sometimes we are attracted by certain headlines, but the news turns out to be very different; this technique of attracting users and increasing views is known as \u2018clickbait\u2019. 
A fact check should be done by media outlets before sharing information. If something feels wrong while reading an article, always dig into the references and try to understand what they are claiming; but sometimes they can trick us with fake URLs as well. Disinformation can also spread from person to person. Someone may tell their friends or family a false story, and when they share it with others, it amplifies the information to a wider audience. Always make sure that when you share anything with others, you have clear knowledge of it. Detecting and responding to disinformation is necessary, and it requires a customized approach that meets the requirements and sensitivities of the organization. Continuous monitoring is required to assess the risks arising from the spread of disinformation. Always try to educate yourself about disinformation and how to spot it. Trust your instincts, and don\u2019t let others mislead you. You can always find the truth with a little bit of work. Conclusion One of the major factors in the rise of disinformation is Artificial Intelligence. Disinformation groups utilizing AI can output massive quantities of fabricated articles, blurring the line between what the public thinks is fact and what they think is fiction. This is a difficult problem to solve. To tackle it, organizations need to develop a better understanding of modern digital information ecosystems. Developing a system to record and log noticeable troublesome content is important and useful. Forming relationships with credible information sources and trusted journalists helps prevent the spread of disinformation. 
Working with like-minded organizations is also essential to mitigate the effects of disinformation. The right tools and strategies prevent disinformation from impacting your organization and your work. Consistent monitoring is required to ensure content is verified before being processed. iEDPS (Infosys Enterprise Data Privacy Suite) makes sure that information is relevant and secure by protecting sensitive fields with several encryption techniques that support data privacy and ensure that sensitive data is masked. \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Hard Interest to Negative Interest to ECR and Green Credits \u2013 Fee Billing as an opportunity \nAuthor: ['Infosys Limited'] Hard Interest to Negative Interest to ECR and Green Credits \u2013 Fee Billing as an opportunity From a long period of ultra-low interest rates, when it was taken for granted that rates would never go higher again, to an era where rates rose by over 500%, touching 5.25% within a year, the pandemic era has been marked by some crazy moves in the deposit and money markets. This blog covers only the impact on banks\u2019 fee billing systems and the challenges of handling such gyrations. There was a time, not in the very distant past, when interest rates went from positive territory to below zero. This essentially meant that banks were being penalized by the Fed and the ECB (European Central Bank) for maintaining deposits with them. In the absence of any interest-yielding avenues, this forced banks to start charging corporates for parking excess money with them. This inverted the basic business proposition of bank lending, where banks earn money by sourcing deposits at a lower rate and lending to corporates at a higher rate, with the difference shown as profit. 
However, the other role of banks, as safe custodians of money and providers of liquidity on demand, came to the fore. Large corporates with significant cash flows and no avenues to park excess funds overnight had no choice but to pay banks for this service, as they needed the safety and liquidity banks provided. The part of the Dodd-Frank Act that allowed banks to pay interest on DDAs (Demand Deposit Accounts) had little use in the ultra-low and negative interest rate regimes, leaving corporates unable to earn interest on their deposits. The very purpose of earnings credit, which banks gave in lieu of hard interest after corporate treasurers and banks negotiated compensation for the various fees charged on deposits on which interest payment was illegal, diminished once the Dodd-Frank Act allowed interest on deposits. Now, with a sudden spike in interest rates of over 5% within a year, corporates can again earn decent interest on their DDAs. This has impacted several areas of banks\u2019 business, including their transaction fee billing systems. One of the few bright spots for banks in recent years has been the performance of global transaction revenues, which continued to rise even as other avenues declined. This pushed banks to continue upgrading their billing systems and to separate the charging function from their core systems across various processors and channels. This \u201cmiddle layer\u201d pulled data from disparate customer management systems and transaction processing systems and combined it to provide complex and personalized pricing and billing. 
These systems also provide a centralized view of the entire customer relationship and use it to deliver differentiated service. A key challenge banks face is moving seamlessly from charging earnings credit through legacy account analysis or fee billing systems to charging hard interest, or a hybrid of the two, within their current fee billing systems. Billing systems should enable easy migration from an ECR (Earnings Credit Rate) to a hard-interest pricing model for specific products, or a hybrid model where accumulated earnings credit is either set off against specific fee lines or paid out as hard interest. They should also enable seamless sunsetting of negative interest rates, if charged in past years, and a switch to paying out hard interest or EC (Earnings Credit). Another challenge is the increasing requirement to support ESG (Environmental, Social & Governance) goals for banks\u2019 customers. An interesting use case here would be for banks to partner with green projects and adjust accumulated EC towards such projects, yielding \u201cGreen Credits\u201d that count towards ESG norms. Billing systems should facilitate the onboarding of such partners and projects and be in a position to offer Green Credits to customers. Banks can also use this as a revenue stream, giving customers the opportunity to earn Green Credits and sharing part of that revenue with the partners. Banks now expect their fee billing systems to easily onboard and share revenue with partners, and to present partners with a detailed break-up of that revenue share to earn their trust of being treated fairly. Thus, fee billing systems are converging towards a bank-in-a-box or BaaS (Banking as a Service) model. 
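The hybrid ECR model, where accumulated earnings credit is set off against fee lines and any interest is paid separately, can be sketched in a few lines of Python. This is a hedged illustration only, not any bank's actual billing logic: the ACT/360 day-count convention, the figures, and the function names are all assumptions made for the example.

```python
# Hypothetical sketch of a hybrid earnings-credit / hard-interest settlement.
# Conventions (ACT/360) and names are illustrative assumptions, not a real system.

def earnings_credit(avg_collected_balance, ecr_rate, days):
    """Earnings credit accrued over the period (ACT/360 convention)."""
    return avg_collected_balance * ecr_rate * days / 360

def settle_hybrid(balance, ecr_rate, hard_rate, days, fees):
    """Offset accrued earnings credit against the period's fees;
    hard interest on the balance accrues independently."""
    credit = earnings_credit(balance, ecr_rate, days)
    fee_offset = min(credit, fees)          # credit can only cover fees due
    residual_fees = fees - fee_offset       # remainder is billed to the client
    hard_interest = balance * hard_rate * days / 360
    return {"fee_offset": round(fee_offset, 2),
            "fees_due": round(residual_fees, 2),
            "hard_interest": round(hard_interest, 2)}
```

For example, a $1m average balance over 30 days at a 2% ECR and 5% hard rate against $2,000 of fees would offset about $1,666.67 of fees, leaving $333.33 billable, alongside roughly $4,166.67 of hard interest. A real billing system would layer on negative-rate sunsetting, product-level pricing, and Green Credit adjustments as described above.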
Unless banks\u2019 fee billing systems evolve to support such requirements, banks will be caught on the back foot, not nimble enough to capture such opportunities. Do your current account analysis or fee billing systems allow such options seamlessly, without rewiring or hard coding and the huge maintenance and development overheads that result? Do your customers get to see their Green Credits and the contributing projects? Are your contributing partners able to see their revenue share transparently? Are banks able to leverage and be nimble enough to grab such opportunities? Infosys Revenue Management Platform is a cutting-edge fee billing solution that gives banks the expertise and ability to launch such personalised services rapidly. \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: How Secure Data Mining helps in Test Data Preparation \nAuthor: ['Infosys Limited'] How Secure Data Mining helps in Test Data Preparation Nowadays, organizations worldwide store a lot of data used for testing and development purposes. The more data an organization produces, the more difficult it becomes to make sense of it and derive meaningful insights from it. An ingenious solution to this issue is the data mining process. Data mining identifies meaningful relationships in an organization\u2019s raw data and is typically done to predict future data. Data mining deals with large numbers of datasets using various techniques. What is Data Mining? Data mining is a technique used by organizations to turn raw data into usable information, involving multiple techniques and methods. Data mining techniques analyze data based on the patterns and connections present in it. 
It aids in researching future trends by analyzing past data and also helps in identifying relationships and correlations among the data. Steps in Data Mining Setting Objectives \u2013 Every organization should set an objective of what data it wants and how it can be organized. This is where data scientists and stakeholders come together to define a business problem to which data mining can be applied. Data Preparation \u2013 This step identifies the correct data based on the objectives set and the type of data source involved. The data is filtered or cleaned as needed. Data Processing \u2013 Applying the data mining techniques/models, i.e., identifying the relationships/patterns/correlations in the data. Evaluating Results \u2013 Evaluating the results obtained from the data mining models and deploying further if required. Data Mining Techniques Association: Refers to the process of finding correlations between different types of data. The goal of association rule mining, given a set of transactions, is to find rules that allow us to predict the occurrence of a specific item based on the occurrences of the other items in the transaction. Classification: The process of predicting new data, i.e., putting your data in buckets based on specific shared qualities and characteristics. 
The most challenging aspect of classification is determining which categories one should place data into. Clustering: Similar to classification, clustering loosely puts data in buckets based on similarities. The difference is that classification requires creating categories, while clustering is about finding similarities regardless of category. Advanced Data Mining Techniques Artificial Intelligence: Some artificial intelligence techniques help the user classify data. The technique mainly used is Natural Language Processing (NLP), which helps in identifying insights from larger datasets. Machine Learning: In data mining, machine learning refers to programming software to predict future patterns and behaviours without being explicitly programmed to do so. A data analyst can use the Python and R programming languages to apply machine learning in a data mining context. In the market, there are many data privacy products with data mining features. One key product is an Infosys offering, Infosys Enterprise Data Privacy Suite (iEDPS), a data privacy solution that has been on the market for over 10 years. iEDPS Product Details Infosys Enterprise Data Privacy Suite (iEDPS) is a patented enterprise-class data privacy suite that enables users to protect and de-risk sensitive data. iEDPS is a one-stop solution for the protection of confidential, sensitive, private, and personally identifiable information within enterprise repositories. It supports various databases like Oracle, SQL Server, etc., and various file types like delimited, fixed-length, XML, JSON, etc. iEDPS has many functionalities to identify and protect sensitive fields/data. 
Some of them are below: discovery of sensitive fields; masking and subsetting of data (sensitive fields); data generation; support for more than 180 masking algorithms, including encryption and deterministic lookup-file-based algorithms; and various built-in static and dynamic masking capabilities. iEDPS is an easy-to-use data privacy product that helps automate data protection and privacy across an enterprise. How iEDPS Helps in Data Mining iEDPS supports a query-based data mining feature. Users can create a connection to any supported database and build a template in the form of a query. Users can prepare multiple query-based templates, i.e., queries to retrieve data based on the criteria they need. All these templates are stored in iEDPS and can be used by testers to retrieve data by directly executing the template; the results give the filtered data. This results in the following: users/testers don\u2019t need to search the entire set of huge data; instead, they can run the pre-made templates and get the correct data that is required. Increased self-service Reduces effort spent on test data preparation Removes dependency on personnel Based on the above observations, the iEDPS data mining feature helps the end user with test data preparation and reduces the effort involved. \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Unified Tool for Minimum Acceptance Test, Business Process Automation and More \nAuthor: ['Infosys Limited'] Unified Tool for Minimum Acceptance Test, Business Process Automation and More Overview \u2018Any One Can Automate Testing\u2019 is an open-source testing automation tool developed by Infosys for regression testing. It is designed to reduce testing cycle time during quarterly patch updates from Oracle for their products/applications, volume testing, etc. 
Alternatively, it has also been utilized for efficient business process handling, resulting in significant time and resource savings. This blog describes those alternative applications of the tool to manage work more efficiently and increase productivity. Introduction to the \u2018Any One Can Automate Testing\u2019 tool This testing tool was developed primarily for the use areas given below. System Setup and Configuration can be done for a new instance. Updates to multiple configuration parameters in an existing instance are also possible with this tool. Automated Regression Testing In cloud-based systems, patch deployment happens frequently from the application owner, such as Oracle for Merchandise Foundation Cloud Service. After every patch deployment and version upgrade, regression testing ensures that the system is working fine. In agile implementations, as code is deployed to an instance, business-critical scenarios are tested to ensure that key functionalities are not adversely impacted. While the tool was originally planned for use with Oracle Retail modules like Merchandising, Planning, POS (Point of Sale), etc., its architecture allows it to be extended to any system in the client landscape with minimal investment. The tool helps in volume testing by creating larger numbers of data points with comparatively less effort; for example, creating multiple regular items in Merchandise Foundation Cloud Service. It can also be used for processing large data volumes from the UI, for example, loading sales data. In collaboration with business users, usability testing ensures the environment/system is fit for use. 
With this tool, a typical set of basic activities, pre-recorded earlier, can be tested. Typical Workflow This Selenium-based tool records a process, marks data input points, and adds result-capturing nodes (such as screenshots to be taken and/or back-end data to be captured from the database). The whole process is then re-executed with new input data points. A simplified flow chart shows these steps in more detail. Rerun or execution of an automated flow can also be done in a multithreaded way, in parallel. Alternative Usage of Tool Beyond the primary usage of the tool for regression and volume testing, it can easily automate repetitive business user actions. Users record the process steps once, and the tool executes them on multiple sets of data to achieve the desired business/process outcome. This frees up user bandwidth and allows them to focus on operational excellence and innovation initiatives. Use Cases Here are a few illustrative scenarios where this tool has supported business scenarios as well as issue resolution processes, and hence added value. Benefits of this tool Conclusion \u2018Any One Can Automate Testing\u2019 is dynamic and flexible, has demonstrated capabilities to handle business scenarios alongside testing, and is continuously upgraded by the Infosys team. 
It can very easily help in addressing more varied business issue resolution and in handling data-intensive processes. \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Tiny ML \u2013 Machine Learning on resource constrained devices \nAuthor: ['Infosys Limited'] Tiny ML \u2013 Machine Learning on resource constrained devices Preface: Tiny Machine Learning (TinyML) is the practice of deploying machine learning models on resource-constrained devices, such as microcontrollers or Internet of Things (IoT) devices, with limited processing power, memory, and energy budgets. Benefits: Real time: By bringing machine learning to the edge, TinyML enables real-time processing and reduced latency. Operate on resource-constrained devices: TinyML models are designed to be lightweight, compact, and energy-efficient, enabling them to operate efficiently on resource-constrained devices. Instant decisions: TinyML enables on-device inference, allowing edge devices to make decisions instantly without relying on cloud connectivity or external processing. Contained data transfer: TinyML reduces the amount of data transmitted by performing inference locally. Offline operation: Some edge devices operate in environments with intermittent or no network connectivity. 
Operating on the device itself supports offline use. Cost reduction: Deploying machine learning models on resource-constrained edge devices reduces reliance on expensive cloud infrastructure and continuous data transmission. Privacy and security: Transmitting sensitive data to the cloud for processing raises privacy and security concerns, which on-device inference avoids. Use Cases: Telematics and Usage-Based Insurance (UBI): TinyML can be used in telematics devices to collect and analyze data on driving behavior, allowing insurers to assess risk more accurately and offer usage-based insurance policies tailored to individual driving patterns. Claims Processing and Fraud Detection: By deploying TinyML models on edge devices, insurers can quickly assess damage, estimate repair costs, and identify potential fraud in real time. Property Risk Assessment: By deploying TinyML models on IoT sensors and smart home gadgets, insurance companies can track variables like temperature, humidity, water leaks, and smoke detection in real time. This enables early risk detection, prompt alarms, and proactive risk mitigation steps. Additionally, it may result in more precise underwriting and customised insurance pricing. Fraud Detection: Running ML on the device for fraud detection has several advantages over cloud ML, including real-time processing, decreased latency, greater privacy, and less reliance on network access. Retail Operations and Inventory Management: TinyML may be used to improve retail operations and inventory management. TinyML models on edge devices can analyze data from sensors and cameras to detect product availability, check stock levels, and track consumer behavior. 
This allows for more precise demand forecasts, more effective inventory management, and more personalized client experiences.<|endoftext|> Difference between IoT and Tiny ML: TinyML involves deployment of machine learning models on edge devices to enhance efficiency, reduce latency, improve privacy, and enable intelligent functionality. TinyML optimizes and compresses machine learning models to run efficiently on devices with limited resources.<|endoftext|> IoT connects physical devices, sensors, and objects to enable data collection, communication, automation, data sharing, and remote control. Sensors, actuators, communication protocols, cloud computing, and data analytics are just a few of the technologies that make up the Internet of Things (IoT). Device communication, data management, and control are made possible by IoT technology.<|endoftext|> TinyML focuses on deploying ML models on edge devices for real-time decision-making, while IoT is a broader concept encompassing device connectivity, data sharing, and automation.<|endoftext|> Future Growth: ABI Research, a worldwide technology market advisory, predicts that the TinyML market will grow strongly, with shipments of TinyML-enabled IoT devices rising from 15.2 million in 2020 to 2.5 billion in 2030. Each of these devices enables the use of TinyML; one can imagine the need and opportunity this represents.<|endoftext|> Conclusion: TinyML involves optimizing and compressing ML models to run efficiently on these devices, enhancing efficiency, reducing latency, improving privacy, and enabling intelligent applications in various domains such as industrial automation, healthcare, smart homes, agriculture, and more.
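The optimization and compression mentioned here typically rely on techniques such as post-training quantization. A minimal, framework-free sketch of the idea follows; a real deployment would use a toolchain such as TensorFlow Lite for Microcontrollers, and the weights shown are illustrative:

```python
# Minimal sketch of post-training 8-bit quantization, one core technique
# behind TinyML model compression. Pure stdlib; a real deployment would
# use a toolchain such as TensorFlow Lite for Microcontrollers.

def quantize(weights, num_bits=8):
    """Map float weights to signed integers with a single scale factor."""
    qmax = 2 ** (num_bits - 1) - 1               # 127 for int8
    scale = max(abs(w) for w in weights) / qmax or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights at inference time."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.99, -0.07]   # illustrative values
q, scale = quantize(weights)
approx = dequantize(q, scale)
# Every recovered weight lies within one quantization step of the original.
assert all(abs(a - b) <= scale for a, b in zip(weights, approx))
```

Storing weights as int8 instead of 32-bit floats shrinks the model roughly fourfold while keeping a small, bounded rounding error per weight, which is what makes inference feasible on microcontroller-class memory budgets.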
It empowers edge devices to perform inference and analysis locally, minimizing the need for constant cloud connectivity and enabling intelligent functionality at the edge.<|endoftext|> \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Google\u2019s AI Advancements Revolutionize Product Mapping and Search \nAuthor: ['Infosys Limited'] Google\u2019s AI Advancements Revolutionize Product Mapping and Search Introduction: In recent years, Google has harnessed the power of artificial intelligence (AI) to transform the way products are mapped and searched. By leveraging cutting-edge technologies such as computer vision and machine learning, Google has significantly enhanced its search capabilities, particularly in the realm of e-commerce. This article explores how Google\u2019s AI advancements have revolutionized product mapping and search, offering users a more intuitive and efficient way to discover and purchase products.<|endoftext|> Computer Vision and Visual Search: At the core of Google\u2019s AI-driven product mapping and search lies computer vision, a technology that enables machines to understand and analyze images and videos. Through sophisticated computer vision algorithms, Google can recognize objects, including various products, within images or videos. This capability has been leveraged to empower visual search, allowing users to upload images or enter keywords to find visually similar products across multiple online retailers. By employing AI in this manner, Google can match the visual attributes of products and deliver relevant search results to users.<|endoftext|> Structured Data Extraction: Another critical application of AI in product mapping and search is the extraction of structured data from websites. Google\u2019s AI algorithms can crawl and analyze web pages, extracting essential information such as product descriptions, prices, and availability. 
This structured data is then indexed and utilized to power search results, making it easier for users to find specific products. By automating this process with AI, Google can provide up-to-date and accurate product information, enhancing the overall search experience.<|endoftext|> Integration with Google Maps: Google\u2019s AI advancements have not been limited to online search alone but have also extended to its mapping services. Through machine learning techniques, Google can extract valuable information from satellite imagery, street view data, and other sources to identify and map various points of interest, including businesses and retail locations. This integration of AI-driven mapping with product search allows users to find nearby stores, explore store layouts, and even locate specific products within a store. By leveraging AI in this manner, Google provides users with a comprehensive and convenient way to navigate the physical retail landscape.<|endoftext|> Personalized Recommendations and Shopping Assistance: Google\u2019s AI capabilities extend beyond traditional search and mapping functionalities. By analyzing user behavior and preferences, Google can offer personalized product recommendations, tailored to individual interests and needs. Through machine learning algorithms, Google learns from user interactions, purchase history, and online activity to suggest relevant products, making the shopping experience more personalized and engaging.<|endoftext|> Conclusion: Google\u2019s AI advancements have transformed the landscape of product mapping and search, enabling users to navigate and discover products with unprecedented ease and accuracy. By harnessing the power of computer vision, extracting structured data, integrating with mapping services, and providing personalized recommendations, Google has created a seamless and intuitive search experience. 
As AI continues to advance, we can expect further innovations from Google, revolutionizing the way we discover and interact with products in the online and offline realms.<|endoftext|> \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Harnessing Efficiency and Performance with Microsoft Graph \nAuthor: ['Infosys Limited'] Harnessing Efficiency and Performance with Microsoft Graph In the ever-evolving landscape of software development; efficiency and performance are key considerations. When working with Microsoft 365 services, developers can take advantage of the Graph Batch API. This powerful feature allows bundling multiple API requests into a single batch, significantly reducing network round trips and optimizing overall performance.<|endoftext|> The Graph Batch API is an integral part of Microsoft Graph, a unified API endpoint that provides access to various Microsoft 365 services. By leveraging the Batch API, developers can group multiple API requests into a single batch, resulting in improved efficiency and reduced latency. Rather than sending individual requests for each operation, the Batch API enables you to combine related operations into one request, minimizing network overhead.<|endoftext|> Benefits of the Graph Batch API: \u2013 Enhanced Performance: Traditional single-item operations can introduce significant overhead due to network latency and the need for multiple round trips to the database. However, by batching multiple graph operations together, the Graph Batch API reduces these latencies and minimizes the number of requests sent to the database. This batching approach dramatically improves performance, enabling faster processing of large volumes of data and more efficient utilization of system resources.<|endoftext|> Reduced Network Overhead: The Graph Batch API allows you to combine multiple graph operations into a single batch request, reducing the overall network overhead. 
Instead of sending individual requests for each operation, the Graph Batch API sends a single request with all the operations bundled together. This approach significantly reduces network traffic, especially when dealing with large datasets or distributed databases, leading to improved scalability and reduced infrastructure costs.<|endoftext|> Atomicity and Consistency: Maintaining data integrity is vital when working with graph databases. The Graph Batch API provides atomicity and consistency guarantees for batch operations. Atomicity ensures that either all operations within a batch succeed, or none of them do. This property guarantees the integrity of your data, especially when performing multiple updates or modifications simultaneously. Consistency ensures that the graph database remains in a valid state throughout the execution of the batch operations.<|endoftext|> Transactional Integrity: The Graph Batch API supports transactional operations, allowing you to execute a sequence of graph operations as a single transaction. This means that if any operation within the batch fails, the entire transaction will be rolled back, ensuring transactional integrity.<|endoftext|> Implementation of the Graph Batch API: Implementing the Graph Batch API involves constructing a JSON batch request payload. Each individual request is represented as a separate entry, with its own id, HTTP method, and relative URL, within the payload\u2019s requests array. You can include operations for different resources or services in a single batch. The batch request is sent to the /v1.0/$batch endpoint of the Microsoft Graph API.<|endoftext|> To make the implementation process easier, Microsoft provides comprehensive documentation and code samples that outline the necessary steps. Developers can refer to these resources to understand the structure and syntax of batch requests, as well as any limitations imposed by the API.<|endoftext|> Use Cases and Scenarios: The Graph Batch API proves invaluable in various scenarios.
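To make this concrete, Microsoft Graph accepts a JSON-formatted batch body at the /v1.0/$batch endpoint. A minimal sketch in Python; the access token and the two bundled requests are placeholder assumptions:

```python
# Sketch of a Microsoft Graph JSON batch request.
# "ACCESS_TOKEN" is a placeholder -- obtain a real token through your
# app's OAuth flow before sending the request.
import json
import urllib.request

GRAPH_BATCH_URL = "https://graph.microsoft.com/v1.0/$batch"

def build_batch(calls):
    """Bundle several Graph calls into one $batch payload.

    Each entry needs a unique id, an HTTP method, and a URL relative
    to the Graph API root.
    """
    return {"requests": [
        {"id": str(i + 1), "method": method, "url": url}
        for i, (method, url) in enumerate(calls)
    ]}

payload = build_batch([
    ("GET", "/me/messages?$top=5"),      # recent mail
    ("GET", "/me/drive/root/children"),  # OneDrive root listing
])

req = urllib.request.Request(
    GRAPH_BATCH_URL,
    data=json.dumps(payload).encode(),
    headers={"Authorization": "Bearer ACCESS_TOKEN",
             "Content-Type": "application/json"},
)
# urllib.request.urlopen(req) would return a JSON body whose "responses"
# array carries one status code and body per bundled request; the call is
# left out so the sketch runs without credentials.
```

Note that the service answers with one entry per request id in the responses array, so callers should check each entry individually rather than assume the whole bundle succeeded.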
For instance, if an application needs to retrieve data from multiple Microsoft 365 services simultaneously, it can bundle these requests into a single batch, improving overall efficiency. Similarly, when updating multiple resources, such as creating or modifying files, folders, or emails, the Batch API ensures atomicity and consistency in the operations.<|endoftext|> In summary, the Graph Batch API is a powerful tool that empowers developers to optimize their interactions with Microsoft 365 services. By consolidating multiple operations into a single batch, developers can achieve improved efficiency, reduced latency, and enhanced performance. Embracing the Graph Batch API unlocks the potential for streamlined and effective integration with Microsoft Graph.<|endoftext|> \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Apple Vision Pro: The Future of Spatial Computing \nAuthor: ['Infosys Limited'] Apple Vision Pro: The Future of Spatial Computing Introduction Apple has always been a frontrunner when it comes to innovation in technology. Be it the first touch-screen iPod, the first touch-screen iPhone, the first-of-its-kind line of MacBooks and iMacs with unmatched features, or the design and fabrication of M1 processors to power its devices.<|endoftext|> Apple has lived up to the expectations of technologists this year too and launched its innovative new product, Apple Vision Pro, at WWDC 2023. It\u2019s the world\u2019s first wearable spatial computer and uses a dual-chip design: Apple\u2019s M1 and the newly introduced R1 chip. It also introduces a fully three-dimensional user interface controlled by natural and intuitive inputs like our hands, eyes and voice. Users can navigate apps with their eyes, select with a tap of their fingers and give commands to Siri by voice.
It has been called \u201cthe beginning of a new era for computing\u201d by Tim Cook, Apple\u2019s CEO.<|endoftext|> Vision Pro Vision Pro is also Apple\u2019s first 3D camera. It uses separate ultra-high-resolution displays for both eyes, with a combined resolution of 23 million pixels, along with 12 cameras, 5 sensors and 6 microphones, all inside the headset. The display lens can also be customised with power lenses for users who wear spectacles. All these, coupled with spatial audio, provide an incredibly immersive and engaging experience to the user. It can scale movies to provide a movie theatre experience or can be connected to a game controller to play favourite games on a massive screen with incredible spatial audio.<|endoftext|> EyeSight Apple has also ensured that the user is not isolated from the outside world. The apps are displayed to the user in the physical world. When someone is in the room, the user\u2019s eyes are visible to them; Apple calls this feature EyeSight. Panoramas wrap around the user as if the user is at that very place.<|endoftext|> Persona Vision Pro can be connected to a Mac just by looking at it, thereby converting a 13-inch screen to a massive one. Apple FaceTime calls can be initiated, and it feels as if the other person is in front of the user in life size. The user\u2019s \u201cPersona\u201d is created by Vision Pro during setup, and that is what others see, with all your expressions, during a video call.<|endoftext|> Optic ID Privacy and user data security is always Apple\u2019s priority. Vision Pro brings in an all-new privacy and security feature known as \u201cOptic ID\u201d. It is a secure system that uses the uniqueness of the iris to authenticate the user and to unlock and sign into the device. Optic ID data is fully encrypted and stored locally.<|endoftext|> visionOS and visionOS SDK Apple has also launched \u201cthe first ever Operating System for Spatial Computing\u201d, and it\u2019s called visionOS.
It offers spatial building blocks like Windows, Volumes and Spaces to embed 3D content. It\u2019s an all-new platform with familiar tools and can be used to build apps and games for Vision Pro. visionOS makes use of SwiftUI, RealityKit, ARKit and accessibility features to provide an immersive user experience. Siri is also accessible in visionOS while wearing the headset.<|endoftext|> The visionOS SDK, along with updated Xcode, the visionOS simulator, Reality Composer Pro and sample code, is expected to be available by the end of June 2023.<|endoftext|> Conclusion Vision Pro is definitely something we have imagined in our future, now made available to us in the present by Apple. We have seen similar things in sci-fi movies and dreamt of the world using it. The technology is new and niche. With Apple also announcing a partnership with Disney to offer new experiences for Vision Pro and enhanced ways to watch shows on Disney+, it will definitely find its early buyers. Vision Pro will be available for sale sometime early next year.<|endoftext|> \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Enhance User Productivity and Improve Adoption of Oracle Content Management \nAuthor: ['Infosys Limited'] Enhance User Productivity and Improve Adoption of Oracle Content Management Businesses around the world prefer to manage the content for their usage centrally. This content could be related to different functions/processes such as sales, marketing, policies, training etc. Once this content is available in a single repository, websites are generally developed for the end users to access this content.<|endoftext|> The key challenge is ease of access to this content whenever and wherever the user needs it. Apart from websites, what if a channel like an intelligent chatbot were available for end users to search, fetch and access the content? Can this chatbot be seamlessly integrated with commonly used tools like Microsoft Teams or Slack?
Oracle to the rescue Oracle is a worldwide leader in providing SaaS and PaaS based solutions spread across Customer Experience, ERP, HCM and more. Apart from these key solutions, Oracle also provides peripheral cloud services for Content Management and Conversational AI (chatbots).<|endoftext|> Oracle Content Management provides the capability to create, publish and manage various types of content (documents, videos, images etc.).<|endoftext|> Oracle Digital Assistant provides an AI platform to create conversational experiences for business applications through chat and voice interfaces.<|endoftext|> Let\u2019s have a look at how these two cloud services can work in tandem to provide a seamless experience for accessing content easily and quickly.<|endoftext|> Oracle Content Management (OCM) Oracle Content Management is a cloud-based content hub. It is a PaaS service provided by Oracle Cloud Infrastructure (OCI). It offers powerful collaboration capabilities to streamline the creation and delivery of content and improve employee engagement.<|endoftext|> Organizations can create a repository of content which can be consumed by various users. This content can be published to different channels like websites, mobile, chat etc.<|endoftext|> Key Solution Components are: Repository: A repository is a logical storage location for all the assets. It is an entity which manages all the files/folders in a structured order.<|endoftext|>
Asset: An asset can be a content item that represents an individual piece of content, such as product literature, compliance documents, a blog post, case study, or a digital asset that represents an image, video, or other type of media.<|endoftext|> Taxonomy: A taxonomy is a hierarchy of categories to allow asset categorization and help users find assets. It represents how content across the organization is defined and classified. For example, define taxonomies for products, branches, compliance type, roles or any other hierarchy of subject categories that is relevant for your organization.<|endoftext|> Integration: Oracle Content Management provides REST APIs for content delivery. These APIs can be used by consuming systems, for example chatbots, to search, fetch and display the content.<|endoftext|> Oracle Digital Assistant (ODA) Oracle Digital Assistant enables development of chatbots which understand natural language and can be interacted with through voice or text. ODA, powered by Natural Language Processing (NLP), can understand the user query and respond appropriately as per the skills it is trained for. ODA has API-based integration capabilities which can fetch information from various sources.<|endoftext|> Natural Language Processing: Allows creation of skills to cater to the user\u2019s inputs and commands.
The Digital Assistant takes care of the processing using inbuilt algorithms to understand the inputs.<|endoftext|> Conversational flow and context: The context of the chat is maintained and, based on the user input, the appropriate flow is invoked.<|endoftext|> Enterprise Integration: Custom components can be created to integrate with multiple applications using APIs to fetch data and send data to the users.<|endoftext|> Multi-Channel Support: The Digital Assistant can be integrated with various channels, such as a website or MS Teams, which carry the conversations back and forth from users on various messaging platforms to the digital assistant and its various skill bots.<|endoftext|> Functional Flow Oracle Digital Assistant integrated with Oracle Content Management can provide seamless access to content stored in the cloud.<|endoftext|> The following example flow supports searching and accessing documents by different means.<|endoftext|> User can search documents by Categories.<|endoftext|> User can search documents by Name.<|endoftext|> User can search documents by Metadata (Content/\nAuthor) Search by Categories The user has an option to search by category hierarchy by choosing the category and then the subcategory. The system displays all documents under the subcategory hierarchy as hyperlinks which the user can click and download.<|endoftext|> Search by Document Name The user has an option to search for a document using the document name.
The system searches the documents in the entire repository based on the document name and shows the documents as hyperlinks which the user can click and download.<|endoftext|> Search by Document Metadata (Content/\nAuthor) The user has an option to search for a document using document metadata. The system searches the entire repository based on document content, document author and other metadata and shows the documents as hyperlinks which the user can click and download.<|endoftext|> Architecture The following diagram depicts the typical ODA architecture.<|endoftext|> In this solution, ODA integrates with OCM to read metadata of the documents based on user inputs.<|endoftext|> A skill is created in ODA with different intents such as \u201cList All Categories\u201d, \u201cSearch Document By Content\u201d and \u201cSearch Document by Name\u201d for each of the options that the user can choose. Based on the user selection, the respective dialog flow is executed.<|endoftext|> System components built in Node.js have the logic to integrate with OCM and retrieve data based on user inputs, which ODA then displays as results.<|endoftext|> ODA has inbuilt capabilities for authentication using OAuth, which is set up to enable authentication for APIs of external systems.<|endoftext|> OCM has built-in Content Delivery APIs which are used to search documents and retrieve document metadata to generate clickable URLs for document download.<|endoftext|> Below are the APIs which are used for this integration.<|endoftext|> Get Taxonomy: https://<>/content/published/api/v1.1/taxonomies?channelToken=<>&expand=children Get Categories: https://<>/content/published/api/v1.1/taxonomies/<>/categories?channelToken=<> Get Documents (By name, by metadata): https://<>/content/published/api/v1.1/items?channelToken=<> This skill can be used in different channels such as MS Teams, Intranet, Slack for use by end users.
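A custom component might call the Get Documents API above roughly as follows (shown in Python for brevity; the components in this solution are built in Node.js). The host, channel token, and the exact q filter grammar are placeholders and assumptions; consult the Content Delivery REST API reference for your OCM version:

```python
# Illustrative helper for the OCM content-delivery "items" search.
# OCM_HOST and CHANNEL_TOKEN are placeholders for instance-specific values.
import urllib.parse

OCM_HOST = "https://<>"   # your OCM instance base URL (placeholder)
CHANNEL_TOKEN = "<>"      # publishing channel token (placeholder)

def items_url(document_name):
    # The q filter shown here is an assumed example of a name-based
    # search; the exact query grammar depends on the API version.
    params = urllib.parse.urlencode({
        "channelToken": CHANNEL_TOKEN,
        "q": f'(name eq "{document_name}")',
    })
    return f"{OCM_HOST}/content/published/api/v1.1/items?{params}"

# The returned JSON lists matching assets; their metadata can be used to
# build the clickable download hyperlinks shown to the chatbot user.
```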
This skill can also be integrated with any other ODA based chatbot such as Oracle CX Sales in MS Teams which will provide a single window for end users for all conversations related to CRM.<|endoftext|> Benefits Integration of OCM with ODA provides the following benefits: \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Part 3 : DevOps practices for a better ERM \nAuthor: ['Infosys Limited'] Part 3 : DevOps practices for a better ERM In the previous blog series, we covered the basic understanding of ERM and how it gets integrated with other ITIL processes in an enterprise.<|endoftext|> While we know that release management is an integral part of DevOps, in this blog we will see how other DevOps practices help achieve a better ERM.<|endoftext|> Release Management is about how releases are planned, scheduled, and managed throughout the software development lifecycle, and this remains true in the DevOps scenario too. In fact, the need for frequent releases, and speed-to-market without any compromise on quality and security, demands moving beyond traditional release management.<|endoftext|> Let us look at some of the DevOps practices and how they help us strengthen the software releases.<|endoftext|> Branching and merging strategy: When teams look at faster deployment or working on parallel releases, implementing the right branching strategy would be the key. The branching strategy will not only focus on facilitating the development process but will also define how each feature, enhancement, or bug fix is released to production in a controlled manner. The right branching strategy, coupled with appropriate CICD automation and controls, helps in parallel development, optimizes development workflow, and facilitates a structured and faster release.<|endoftext|> CICD Automation: CICD automation helps orchestrate the entire build, test, and deploy cycle. 
This ensures that any code change is quickly and efficiently integrated with minimal manual intervention to reduce the delivery time. CICD automation also ensures that quality and security checks, which are typically deferred towards the product release timeframe, are instead built early into the system for faster feedback, thereby ensuring smoother releases. A few CICD orchestration tools also assist in building approvals into the pipeline, ensuring that wait times and follow-ups surrounding manual approvals throughout the lifecycle stages are minimized or eliminated entirely, hence speeding up the release process.<|endoftext|> Quality: With continuous validation practices, software testing is done in an integrated and collaborative approach. Before it can be released, the application software must pass several functional and non-functional testing steps. The key to ensuring software quality is to design the test strategy, create test plans, standardize the test environments, automate as many test cases as feasible, and lastly integrate into the CICD pipeline for automated execution and automated inspections. Furthermore, quality gates, when implemented into the pipeline, validate coding standards, code coverage, and successful completion of various types of tests. This iterative testing till the release ensures that quality is embedded into the system and that any audit/compliance requirements for the release are met in an automated manner much earlier in the lifecycle.<|endoftext|> Security: Security is one of the most significant components of the release management validation process. DevSecOps assists in integrating security checks into the development and testing stages, resulting in the early discovery of security concerns (shift left) to help avoid costly code changes later. There are various stages in ensuring the security of the application.
Software Composition Analysis (SCA) helps identify open-source software vulnerabilities, license compliance, and maintainability issues. Static Application Security Testing (SAST) helps in detecting vulnerabilities in the source code. Both SAST and SCA can be integrated into the CI pipeline in the development stage. DAST (Dynamic Application Security Testing) on the other hand helps in detecting run-time vulnerabilities in the application and can be integrated into the CD pipeline.<|endoftext|> Infrastructure provisioning: The readiness and availability of various consistent and scalable environments for executing different stages are key to the validation of the application software to be released. DevOps practices around the integration of configuration management, infra-as-code, dynamic environments, container environments, etc. help teams build consistent environments as and when needed without any manual intervention. In addition to cost optimization, these practices help reduce complexity and human errors, build release confidence, achieve scalable infra and consistent infra and at a faster speed.<|endoftext|> Auditing and Traceability: The release management process necessitates auditing and traceability of requirements throughout its lifecycle. DevOps pipelines should build these auditing and logging so that the real-time status of the application across environments and lifecycle execution stages is available. Further, such audits and logs can make the compliance checks during the release process much smoother.<|endoftext|> Deployment automation and strategy: Continuous deployment automates the practice of deploying software to production quickly and efficiently. Deployment automation ensures that deployments are fast, consistent, and repeatable across the environments. 
This also helps achieve managed access control and traceability, which is an important compliance requirement for a successful release.<|endoftext|> There are different deployment strategies available, and teams can choose the best one based on the impact of changes to the system and the end-users. Some of the popular deployment strategies are blue-green, canary, rolling, recreate, etc. Every deployment strategy requires teams to also work on corresponding rollback strategies so that in case of release failures, the system can be brought back to its previous working state as quickly and efficiently as possible.<|endoftext|> Monitoring and closed feedback: The release process doesn\u2019t stop at deploying application software to production. Monitoring and tracking the performance of the release, identifying issues, and generating action items for application teams to react to are critical to the release process. DevOps helps build monitoring throughout the software lifecycle and integrates with ALM tools to generate user stories for actionable items.<|endoftext|> In the next series in this blog, we will look at tools that help orchestrate the entire release management process.<|endoftext|> \n\n\n***\n\n\n "} +{"text": "Infosys Blog \nTitle: Summary on Apple WWDC-2023 \nAuthor: ['Infosys Limited'] Summary on Apple WWDC-2023 Apple Vision Pro The long-rumored mixed reality headset, Apple Vision Pro, was presented at the Worldwide Developers Conference (WWDC). With this spatial computer, users may interact with both digital information and the real environment while maintaining social connections.<|endoftext|> iOS 17 Apple drops support for the iPhone 8 and X with iOS 17. These are the key features in the new iOS: 1. Transcript of a live voicemail 2. Better autocorrection 3. Contact posters 4. Simpler Namedrop and Airdrop 5. Standby mode 6.
Journal App iPadOS 17 Widgets, redesigned Lock Screens, and Live Activities on the Lock Screen (track deliveries, scores, and numerous timers) are all included. The iPad is getting a Health app.<|endoftext|> macOS Sonoma Sonoma brings new imagery for desktops and screensavers. Widgets can be placed on the desktop.<|endoftext|> tvOS 17 tvOS 17 brings a redesigned Control Center and Siri Remote improvements. FaceTime comes to Apple TV, working with the iPhone as the camera/mic, and a Continuity Camera API is available for tvOS.<|endoftext|> watchOS 10 There have been \u201ccomprehensive redesigns\u201d made to watchOS 10: widgets, a redesigned trophy case, new watch faces for Snoopy and Woodstock, the Activity app with corner icons, and the World Clock with customizable background colours, accessible by using the digital crown.<|endoftext|> 15-inch MacBook Air The new MacBook Air has a 15.3-inch Liquid Retina display, making it the largest MacBook Air to date. The display has a brightness of more than 500 nits and can show up to 1 billion colours. It should be incredibly enjoyable to use.<|endoftext|> New Mac Pro With the release of the new Mac Pro, Apple completed its move to Apple silicon and unveiled the new M2 Ultra CPU. Similar to the M1 Ultra, it is a big chip that requires a lot of cooling.<|endoftext|> Additional updates \n\n\n***\n\n\n "}
According to the Infosys Digital Commerce Radar 2023 report, MACH can cope with the continuously evolving technology and customer needs. As a result, businesses can develop software solutions that can be easily extended without the risk of destabilizing the entire system. Moreover, MACH allows for creating specialized services that can dynamically connect, enabling businesses to scale their systems. Blogs 1 and 2 dealt with an overview of the architecture and its primary advantages.<|endoftext|> Enterprises may find MACH architecture promising, but hurdles still exist. As discussed in our earlier blog, implementing a MACH architecture demands considerable time, resources and expertise depending on the business needs. It\u2019s certainly not about adopting it because it is a flexible and evolved architecture system with a surplus of benefits. Yes, headless and composable architecture sounds exciting, but does your organization need it? In essence, enterprises should consider several technical and business implications before adopting MACH architecture.<|endoftext|> Here are three questions that enterprises must find answers to help make the critical decision. Answering these questions should clearly indicate the path ahead with MACH.<|endoftext|> 1. Does your business require a shift to composable architecture? First, enterprises must be confident about the triggers for a move to MACH architecture. Remember that while MACH architecture is well-suited for distributed computing and promises business agility, innovation and flexibility, it leads to a much more complex landscape as it integrates with diverse solution components. For example, MACH architecture may be overkill if an organization develops a software system that entails less flexibility or scalability in business capabilities. 
In this case, MACH architecture may instead unnecessarily increase development complexity and costs, and the organization must assess the RoI of MACH architecture to get an accurate picture. The company may be better placed to stick with a traditional architecture. It\u2019s a tradeoff between getting best-of-breed components and investing significant resources and money.<|endoftext|> 2. Will the shift to MACH align with the business vision? Enterprises must assess whether the aspired technology landscape with MACH will help them grow revenues. In addition, they must determine if the investment in MACH yields a suitable RoI for their business. It\u2019s a given that MACH will usher in more capabilities and flexibility to cater to specific business needs. But enterprises must check if their business context necessitates these superior, flexible capabilities at their correspondingly higher cost. It\u2019s a matter of deciding whether to start on a clean slate and build a system in a highly composable manner or to select packaged business capabilities that offer limited flexibility. Depending on its plans and prospects, the business may well decide that a monolith is adequate for its purpose. On the other hand, it could opt for a MACH solution if it anticipates more growth in the future, as MACH is a future-proof approach. It all depends on what the business situation warrants.<|endoftext|> 3. Can your IT team handle the complexities of a MACH architecture? Developing and maintaining a MACH system requires a high degree of technical expertise. If an organization lacks the expertise or resources to work with MACH architecture, and the technical governance to direct its strategic technical vision, then MACH is not the right path for it. In addition, the issue gets more complicated for those without a significant internal IT team, as MACH solutions imply dealing with multiple vendors, and there is no single point of contact like in monolithic systems. 
So, an enterprise must carefully assess its performance and satisfaction with its existing monolithic systems, because moving away from the status quo comes at a cost that must make business sense and be worth pursuing. Plus, as many solutions come together in the case of MACH architecture, it requires an IT leader with a solid understanding of the technology vision and the ability to execute in complete alignment with the business vision. This means investing in technical governance to guide the effort in the proper direction. Clearly, those enterprises with a limited IT setup must deliberate before switching to MACH solutions that demand a more sophisticated and mature structure.<|endoftext|> Conclusion: Weighing the Pros and Cons of MACH Architecture In conclusion, the MACH architecture offers businesses critical flexibility, agility, and scalability, but it is not a one-size-fits-all solution. Enterprises must carefully evaluate their technical and business needs and weigh the benefits of adopting MACH against the costs and tradeoffs involved. By answering the three questions discussed here, enterprises can make an informed decision on whether to adopt the MACH architecture and chart a path that best aligns with their business goals and objectives.<|endoftext|> \n\n\n***\n\n\n "} {"text": "Infosys Blog \nTitle: Saying goodbye to large monolithic systems in favor of MACH \nAuthor: ['Infosys Limited'] Saying goodbye to large monolithic systems in favor of MACH In a business environment rapidly turning to digital technology to survive and thrive, the underlying technology infrastructure that powers the enterprise must be tuned to respond accordingly. Scalability, speed, agility, resilience and flexibility are terms that we hear a lot today. At the same time, the emphasis on providing personalized customer experiences while boosting productivity of resources and revenues remains high. 
Enterprises are thus primed for progress but held back by monolithic systems.<|endoftext|> Monolithic systems tend to be less flexible, agile, reliable and resilient \u2013 clearly unsuited for the dynamic changes that are so characteristic of these times. However, most companies still run their businesses on monolithic legacy systems. Companies realize this is not the most conducive foundation to support digital transformation, enhance customer experiences and drive business growth, and so they seek alternatives.<|endoftext|> That\u2019s where MACH architecture (Microservices-based, API-first, Cloud-native SaaS and Headless) can play a major role. MACH architecture enables organizations to build modern, customer-experience focused, cloud-native, and scalable systems that can support digital transformation initiatives at a fast clip. The four pillars that comprise MACH architecture make it so powerful \u2013 API-first \u2013 APIs are designed and developed before any user interface or other system components are built.<|endoftext|> Cloud-native \u2013 SaaS solutions make scaling and automatically updating the components easier. In addition, it is possible to establish multi-tenant SaaS that is provisioned on demand, self-serviced and consumed as a service.<|endoftext|> Microservices-based \u2013 individual pieces of business functionality that are independently developed, deployed and managed.<|endoftext|> Headless \u2013 the front end is completely decoupled from the backend and can be changed independently without disturbing it.<|endoftext|> Thanks to its composable nature, MACH\u2019s distinct advantage is that it offers flexibility to business users to replace solution components in their system landscape in a plug-and-play manner, enabling them to quickly cater to ever-evolving business needs. Such a fast response is almost unimaginable with a monolith. 
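As a purely illustrative sketch (not from the blog, and with all names hypothetical), the API-first and headless pillars described above amount to exposing a business capability only through an API contract, so that interchangeable front ends can consume it independently:

```python
# Hypothetical sketch of the API-first and headless pillars:
# one business capability behind an API contract, consumed by
# interchangeable "heads". All names here are illustrative.
from dataclasses import dataclass


@dataclass
class Product:
    sku: str
    name: str
    price: float


class ProductAPI:
    """API-first: the contract, not a UI, is the integration point."""

    def __init__(self) -> None:
        self._catalog = {"SKU-1": Product("SKU-1", "Espresso Machine", 199.0)}

    def get_product(self, sku: str) -> dict:
        p = self._catalog[sku]
        return {"sku": p.sku, "name": p.name, "price": p.price}


# Headless: two independent front ends render the same API response,
# and either can be swapped without touching the backend.
def web_head(api: ProductAPI, sku: str) -> str:
    d = api.get_product(sku)
    return f"<h1>{d['name']}</h1><p>${d['price']:.2f}</p>"


def mobile_head(api: ProductAPI, sku: str) -> str:
    d = api.get_product(sku)
    return f"{d['name']} ({d['sku']}): ${d['price']:.2f}"


api = ProductAPI()
print(web_head(api, "SKU-1"))     # web markup
print(mobile_head(api, "SKU-1"))  # compact mobile text
```

In this toy setup, swapping `ProductAPI` for another vendor's API-compatible service, or adding a third head, touches no other component, which is the plug-and-play property the blog describes.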
Moreover, monolithic systems imply vendor lock-ins, further restricting flexibility.<|endoftext|> Consider a typical commerce platform with multiple capabilities, such as a digital experience platform, order management system, payment, vouchers and promotions. Monolithic systems are typically single-vendor platforms. All capabilities are built as an extension of this platform\u2019s base capabilities, limiting an enterprise\u2019s choice to only what the vendor offers. However, with a MACH-based approach, enterprises can build next-generation platforms with components from multiple vendors, each of which can be independently replaced if it cannot keep pace with evolving business demands. So, suddenly, the system becomes much more flexible as each component is decoupled. Each component can be selected based on business demand and its capabilities. For instance, a commerce platform can integrate with an external search engine because it is API-based, and easily switch to another that offers superior outcomes.<|endoftext|> Another huge appeal of MACH architecture is that all components are SaaS-based or cloud-native. As a result, MACH architecture will reduce the infrastructure footprint that the enterprise needs to manage. The enterprise now has the bandwidth to build business capabilities instead of focusing on hosting needs.<|endoftext|> MACH architecture offers several advantages over monolithic architecture. MACH is a more flexible and scalable architecture that enables faster and more frequent deployments, reduces dependency on a single technology stack, allows for greater agility in responding to changing business needs, and promotes better overall system resilience. We discuss the many advantages that MACH architecture offers in the next blog in this three-part series.<|endoftext|> It appears that enterprises must quickly shift to MACH platforms. 
MACH Alliance research showed that four-fifths of its respondents strongly intended to increase MACH elements in their architecture in the future, as they believe that will help them get ahead of the competition[1]. In addition, almost half the respondents desired completely composable platforms, while over 50% wanted completely cloud-driven platforms.<|endoftext|> According to the Infosys Digital Commerce Radar 2023 study, companies seem to be adopting MACH platforms.<|endoftext|> Figure 1 \u2013 Implementation status of different platform design architectures. While all three types of architecture exist today, the trend is to move towards flexible and technologically advanced options.<|endoftext|> However, implementing a MACH architecture can consume significant time, resources and expertise based on the business needs. Moreover, the complexity will increase with a distributed systems layout. Therefore, the decision to shift from a monolithic architecture to MACH should involve a careful evaluation of the organization\u2019s goals, needs, and resources and a thorough assessment of the technical feasibility and costs of the transition. If the evaluation gives the go-ahead for a shift to MACH, then it\u2019s worth the investment.<|endoftext|> At Infosys, we are keenly aware of the intricacies of making this decision. That\u2019s why the third blog in this three-part series will focus on the key considerations for companies before taking the MACH road.<|endoftext|> [1] Global 2022 Research Shows MACH Adoption Is High On The Agenda for Tech Leaders (machalliance.org) \n\n\n***\n\n\n "} {"text": "Infosys Blog \nTitle: Top 5 benefits of MACH \nAuthor: ['Infosys Limited'] Top 5 benefits of MACH MACH architecture (Microservices-based, API-first, Cloud-native, and Headless) is a popular approach to building modern, flexible, and scalable applications. In this blog, we discuss its top five benefits and how it can help businesses achieve their goals.<|endoftext|> 1. 
Flexibility: The modular or composable nature of the architecture allows flexibility in choosing the best technology stack for each service, rather than being limited to a single technology stack as in a monolithic architecture. Moreover, it is designed to plug and play, making it easy for business users to access the best-fit solution and enhance performance. In addition, its independent nature allows enterprises to experiment with new technologies without fear of disrupting other parts of the application, and to easily add new features or services to their applications while ensuring high performance and availability. As a result, MACH future-proofs business solutions by helping them match their business capabilities to ever-evolving needs.<|endoftext|> 2. Scalability: Given the design of this architecture, each component can be scaled independently, providing enterprises the ability to scale only what is necessary. As a result, resources can be allocated more efficiently, and developers can focus on improving the performance of specific services rather than the entire application. In contrast, scaling a monolithic architecture is challenging as all application components are tightly coupled. So, scaling any component implies scaling the entire application, amplifying costs and effort. Furthermore, scaling the entire application can result in over-provisioning resources, which can also be wasteful.<|endoftext|> 3. Faster time-to-market: Because MACH is modular, organizations can achieve faster time-to-market, as each service can be developed, tested, and deployed independently. Additionally, because each microservice is smaller and more focused, it is easier for developers to understand and modify the code, which reduces development time. This is a big departure from how traditional monolithic architecture handled changes \u2013 where change was time-consuming and laborious due to the big, single block of software.<|endoftext|> 4. 
Richer and personalized experience: MACH\u2019s headless approach enables a more consistent and personalized user experience across channels. Custom user interfaces optimized for specific devices or platforms can emerge, resulting in a more tailored and enriched experience for the end user. Additionally, headless architecture allows for easier customization of the user interface.<|endoftext|> 5. Lower TCO: The microservices-based approach enables cost savings through reduced infrastructure overhead, efficient resource utilization, faster development and deployment, and lower maintenance costs. These benefits make MACH architecture an attractive option for organizations looking to reduce costs while improving application performance and scalability.<|endoftext|> \n\n\n***\n\n\n "} {"text": "# Infosys POV \nTitle: Data Imperatives in IT MA&D in Life Sciences Industry \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 10 \n\n---\n\n An Infosys Consulting Perspective Consulting@Infosys.com | InfosysConsultingInsights.com DATA IMPERATIVES IN IT MA&D IN LIFE SCIENCES INDUSTRY \n\n---\n\n Page: 2 / 10 \n\n---\n\n 2 FOREWORD Larger macroeconomic headwinds (first the pandemic, then rising interest rates and now recessionary fears) are pushing organizations to resort to mergers, acquisitions, and divestitures (MA&D) as a strategic lever to achieve higher market share, acquire new capabilities, and/or refocus strategy on core business to improve financial performance. The average annual global MA&D value was approximately $3.6 trillion in the 2011-20 cycle and increased to $5.9 trillion in 2021, highlighting the growing importance of MA&D in meeting future business needs. 
The life sciences industry is increasingly looking at MA&Ds to acquire new specialty/generic drug lines (and related market pipelines) in pharmaceuticals, specialized capabilities in the diagnostics and digital health sector, niche research and development capabilities for effective drug discovery around \u201cspecialty drugs\u201d, and patented IP data around experimental drugs.<|endoftext|>There is growing emphasis on antitrust regulations, regulatory reporting, disclosure requirements, and overall deal approval processes. Compliance with these directly relates to the way entity data is managed (before and after the MA&D transaction). Multiple data types including financial, operational, people, supplier, and customer data come into remit. This requires organizations to carefully design and execute their data strategy. There are multiple examples from the industry which showcase the growing importance of data strategy in MA&D transactions, and yet just 24% of organizations included CIOs in pre-merger planning4. Abbott Laboratories\u2019 acquisition of Alere was delayed due to regulatory concerns over market concentration1 and anti-competition2, Pfizer and Allergan terminated their planned merger after a change in treasury rules made tax benefits less attractive3, and there are many more such cases.<|endoftext|>Data strategy design and execution start with the definition of business metrics and alignment on a value measurement approach. After metrics are defined and accepted, linkage to source systems, standardization of data element definitions and management of metadata along with master data ownership are key to accurately measuring and interpreting these metrics. Data qualification, especially in regulated industries, is critical to understanding and managing qualified (GxP) data and the related platforms & applications involved. 
Finally, a performance-oriented and scalable data integration methodology, followed by an overarching process and governance mechanism, is necessary for ensuring ongoing quality and compliance.<|endoftext|>A poorly designed data strategy and execution often leads to ambiguous understanding of key metrics and underlying data elements, incongruent data standards and unclear ownerships - resulting in faulty data integration, inaccurate transaction records, and ultimately unreliable insights and legal complications. An effective way to overcome these pitfalls is to define a robust data design and execution strategy covering key elements addressing distinctive needs of the life sciences industry.<|endoftext|>Data Imperatives in IT MA&D in Life Sciences Industry | \u00a9 2023 Infosys Consulting \n\n---\n\n Page: 3 / 10 \n\n---\n\n Mergers, acquisitions, and divestitures (MA&D) are strategic channels for growth. Multiple benefits can be achieved through an effective MA&D transaction, including exponential growth, entry to new markets, optimized cost savings, and improved competitiveness.<|endoftext|>The life sciences industry has been experiencing rapid growth and transformation in recent years, fueled by innovations in R&D, regulatory changes, and technological advancements in provider and payer domains. MA&D transactions have become a vital strategic tool for organizations to expand their portfolios, access new markets and improve their competitive positions. Given the complex nature of MA&D transactions, they require comprehensive due diligence and planning before, during, and after the transaction. Data, being the fundamental building block of any organization, is a critical factor in this due diligence and planning. It is also one of the most commonly overlooked factors. 
In this article, we highlight key elements of data strategy and design within a MA&D transaction and typical pitfalls along with ways to overcome them.<|endoftext|>MA&D transactions in the life sciences industry are increasingly subject to higher scrutiny from regulatory bodies to ensure greater transparencies and better shareholder and consumer protection. There are three key regulation types in place: 1. Greater financial and operational transparencies: A. India \u2013 Foreign Exchange Management Act (FEMA), SEBI Laws B. USA \u2013 Securities Act, Securities Exchange Act C. Europe \u2013 European Union Merger Law 2. Better intellectual property protection: Patents, trademarks, copyrights, trade secrets, designs, data protection.<|endoftext|>3. Higher fair play and consumer protection: A. India \u2013 Competition Act B. USA \u2013 Federal Antitrust Laws C. Europe \u2013 Competition Law According to an analyst report, the average MA&D failure rate is ~70%4. A key reason for this high failure rate is difficulty in integrating the two entities5, especially with respect to culture, operational ways of working, revenue recognition and performance incentives. All these aspects are directly impacted by the way data is designed and managed. Despite the importance, just 24% of organizations included CIOs in pre-merger planning3. Effective data management is key to adhering to these regulatory requirements and ensuring that data is properly collected, analyzed, and reported throughout the transaction process.<|endoftext|>Introduction \n\n---\n\n Page: 4 / 10 \n\n---\n\n MA&Ds are fundamentally complex transactions that impact business entities, systems, processes, and data of the organizations involved. There are eight elements which underpin data strategy and execution.<|endoftext|>Fig 1 \u2013 Key elements of data strategy within a MA&D transaction. 1. 
Business metrics and measurement: Defining metrics to evaluate the performance of the target entity is critical. It is important that all entities involved in the transaction clearly define and agree upon the metrics which define success; noteworthy metrics in the life sciences industry include clinical trial outcomes, regulatory approval timelines, molecule discovery rates, drug pipeline progress and GxP compliance metrics. These metrics articulate objectives and key results of the target entity. Agreement on accurate metrics and their measurement logic improves operational and financial transparencies, thereby promoting adoption of the integration / divestiture decision.<|endoftext|>2. Data policies and standards: It is essential to establish a common set of data standards and policies to maintain data assets in the target environment. This involves defining standard data formats, structures, and rules for data management and establishing governance policies to ensure security, privacy, compliance, and protection of data. In a merger or a divestiture scenario, data policies for resulting entities are driven by target business needs and operational requirements.<|endoftext|>Key elements of data strategy and execution \n\n---\n\n Page: 5 / 10 \n\n---\n\n 3. Metadata management: Metadata helps to classify, manage, and interpret master data.<|endoftext|>Managing metadata is essential in ensuring standardization of data elements across systems e.g., customer ID, distribution channel codes, clinical trial identifiers, drug classification codes, etc.<|endoftext|>Effective metadata management promotes improved data consistency, better data quality, governance, compliance, and security. Like data policies and standards, metadata management standards are driven by target business needs and operational requirements.<|endoftext|>4. 
Master data management: Data ownership is crucial in MA&Ds because it determines the necessary accountabilities and responsibilities towards maintaining the data assets.<|endoftext|>Defining master data ownership during the pre/post-close phases is critical in ensuring a smooth transition to integrated operations6. All parties involved must align on clear ownership for gaining access to, maintaining, and governing the master data assets after the transaction. Establishing data stewardship roles and processes to maintain master data is essential to avoid pitfalls such as delays in integration, legal disputes, and potential regulatory penalties. In addition, clear data ownership contributes to better intellectual property protection in a MA&D transaction. This ownership also means managing data at a product level with a promise of a required level of data quality, making it easier for users to extract valuable insights and intelligence.<|endoftext|>5. Data lineage management: MA&D transactions create large data assets, which increasingly become interconnected, complex, and challenging to work with. Data lineage tracks the flow of data from source to destination, noting any changes in its journey across different systems. This allows for tracing data origins, evaluating data accuracy and pinpointing potential risks, enabling risk management and thus elevating the probability of success of MA&D transactions.<|endoftext|>6. Data qualification: A crucial element for consideration is the qualification of data into GxP and non-GxP. GxP data is subject to stringent regulations, while non-GxP data has fewer regulatory constraints. Proper data qualification enables organizations to manage GxP data in compliance with regulatory guidelines and handle non-GxP data as appropriate for its intended use. This helps in the adoption of efficient data management processes, especially from an extract, transform and load perspective. 
It also emphasizes the relevance of the systems that will hold the regulatory data, ensuring the required controls are in place when interacting with such systems.<|endoftext|>\n\n---\n\n Page: 6 / 10 \n\n---\n\n 7. Data integration: Effective integration of data across systems such as clinical trial databases, product development pipelines, and sales and marketing platforms into a single, unified environment is critical for the new entity to make effective decisions. Integration of data requires a consistent understanding of data and minimization of data redundancies. This helps the new entity gain a better and more accurate understanding of its business and operational data, thereby expediting envisioned synergy realization. It also increases operational efficiency by streamlining internal processes and reducing duplication of effort, thereby improving the risk profile. Effective data integration is essential for achieving information protection and transparency in a MA&D transaction.<|endoftext|>8. Data governance: Data governance is a crucial element for managing \u201cdata at rest\u201d and \u201cdata in motion\u201d. Robust data governance establishes policies, processes and controls to manage data throughout the life cycle. An effective data governance framework ensures both \u201cdata in motion\u201d and \u201cdata at rest\u201d are adequately protected while tracking data health in a near real-time manner, thereby fostering trust with regulators, customers, and partners.<|endoftext|>Common pitfalls in a MA&D and ways to overcome them Data design and execution to support an integration / divestiture transaction is often complicated and stressful. However, with the right interventions, organizations can navigate around these complications. 
An ineffective data strategy can have far-reaching consequences, such as reduced financial and operational transparency, compromised intellectual property protection, decreased fair play among entities involved and weakened consumer protection. We have identified six common pitfalls and their impact.<|endoftext|>Fig 2 \u2013 Critical elements of pitfalls in a MA&D transaction \n\n---\n\n Page: 7 / 10 \n\n---\n\n 1. Data ownership: One of the most common pitfalls in MA&D transactions is limited clarity around ownership of data assets in the target state. The issue is particularly pronounced when the organizations involved have multiple focus areas with data stored in a single system but without proper segregation and ownership. For example, an organization may have three focus areas such as BioSimilars, BioPharma and Med Devices. Data on these focus areas may be stored in one system but not segregated based on focus areas. MA&D in any one of these areas will impose a significant challenge in terms of data segregation and dependency identification. Ambiguities regarding data asset ownership often lead to intellectual property disputes, faulty data integration, and challenges extracting data specific to a new entity. To avoid such confusion, it is essential to establish data ownership early in the transaction and assign data stewards to manage data at rest as well as in motion.<|endoftext|>2. Business metrics: Organizations involved in a MA&D transaction may prioritize select GxP and non-GxP metrics based on their distinctive strategic objectives and market priorities. For example, in a MA&D involving a generic and a specialty drug maker, the generic drug maker might emphasize GxP metrics such as manufacturing quality and regulatory submission timelines, as well as non-GxP metrics such as market share and cost efficiency. 
On the other hand, specialty drug makers might focus on GxP metrics such as clinical trial data quality and patient safety, and non-GxP metrics such as R&D pipeline growth and innovative therapy development. Given these diverse priorities, establishing common performance criteria for the new entity might be a challenge. Moreover, a lack of uniformity in the underlying logic for measuring performance may further exacerbate the issue. Organizations must establish uniform metrics and underlying measurement criteria that are reflective of the strategic priorities of the target entity.<|endoftext|>3. Data standards: Organizations also face roadblocks when they fail to establish common definitions for data elements. The resulting inconsistency in data standards increases the risk of inaccurate transaction records. Such inaccuracies can impair decision-making during critical stages of the MA&D and might even jeopardize the overall success of the transaction. Creating a unified data dictionary and standardizing data definitions across all entities involved is essential to mitigate such risks.<|endoftext|>4. Data lineage: A common pitfall is related to the replication of source data elements across multiple source systems. Replication of data elements in multiple systems increases the complexity of managing data and leads to additional synchronization overheads.<|endoftext|>Establishing standardized data lineage practices along with synchronized replication processes through automated tools is key to increasing data congruency.<|endoftext|>\n\n---\n\n Page: 8 / 10 \n\n---\n\n 5. Data governance: Another common challenge encountered during MA&Ds arises from ineffective and inconsistent data governance processes. Inconsistent data governance processes decrease the accuracy of inferences and insights that can be derived from datasets. 
A consistent data governance process ensures data protection and regulatory compliance.<|endoftext|>6. Knowledge management: Heavy reliance on individuals makes knowledge retention vulnerable to personnel changes. To overcome this challenge, organizations must develop a knowledge management capability that is not solely dependent on people but facilitated through a set of processes and tools. A robust knowledge management capability enables effective and efficient use of data during the transaction.<|endoftext|>A well-designed data strategy is complemented by an effective execution plan. By proactively identifying potential challenges and implementing mitigating solutions, organizations can effectively navigate through the complexities and maximize value realization from a MA&D transaction. Effective data strategy and execution can safeguard the success of the transaction and ensure that the resulting entity(ies) operates efficiently and effectively.<|endoftext|>About the CIO advisory practice at Infosys Consulting Over the next five years, CIOs will lead their organizations towards fundamentally new ways of doing business. The CIO Advisory practice at Infosys Consulting is helping organizations all over the world transform their operating model to succeed in the new normal \u2013 scaling up digitization and cloud transformation programs, optimizing costs, and accelerating value realization. Our solutions focus on the big-ticket value items on the C-suite agenda, providing a deep link between business and IT to help you lead with influence.<|endoftext|>\n\n---\n\n Page: 9 / 10 \n\n---\n\n MEET THE AUTHORS Inder Neel Dua inder_dua@infosys.com Inder is a Partner with Infosys Consulting and leads the life sciences practice in India. 
He has enabled large-scale programs in the areas of digital transformation, process re-engineering and managed services.<|endoftext|>Anurag Sehgal anurag.sehgal@infosys.com Anurag is an Associate Partner with Infosys Consulting and leads the CIO advisory practice in India. He has enabled large and medium-scale clients to deliver sustainable results from multiple IT transformation initiatives.<|endoftext|>Ayan Saha ayan.saha@infosys.com Ayan is a Principal with the CIO advisory practice in Infosys Consulting. He has helped clients on business transformation initiatives focusing on IT M&A, including operating model transformation.<|endoftext|>Manu A R manu.ramaswamy@infosys.com Manu is a Senior Consultant with the CIO advisory practice in Infosys Consulting.<|endoftext|>He has assisted clients on technology transformation initiatives in the areas of IT M&A and cloud transformation.<|endoftext|>Sambit Choudhury sambit.choudhury@infosys.com Sambit is a Senior Consultant with the CIO advisory practice in Infosys Consulting. His primary focus areas include enterprise transformation with IT M&A as a lever. 
He has helped clients in the areas of IT due diligence, integration, and divestitures.<|endoftext|>1 FTC Requires Abbott Laboratories to Divest Two Types of Point-Of-Care Medical Testing Devices as Condition of Acquiring Alere Inc.<|endoftext|>2 EU clears Abbott acquisition of Alere subject to divestments | Reuters 3 Pfizer formally abandons $160bn Allergan deal after US tax inversion clampdown | Pharmaceuticals industry | The Guardian 4 Why, and when, CIOs deserve a seat at the M&A negotiating table | CIO 4 The New M&A Playbook - Article - Faculty & Research - Harvard Business School (hbs.edu) 5 Don\u2019t Make This Common M&A Mistake (hbr.org) 6 6 ways to improve data management and interim operational reporting during an M&A transaction \n\n---\n\n Page: 10 / 10 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng About Infosys Consulting Infosys Consulting is a global management consulting firm helping some of the world\u2019s most recognizable brands transform and innovate. Our consultants are industry experts who lead complex change agendas driven by disruptive technology. With offices in 20 countries and backed by the power of the global Infosys brand, our teams help the C-suite navigate today\u2019s digital landscape to win market share and create shareholder value for lasting competitive advantage. To see our ideas in action, or to join a new type of consulting firm, visit us at www.InfosysConsultingInsights.com. For more information, contact consulting@infosys.com \u00a9 2023 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. 
Infosys acknowledges the proprietary rights of other companies to the trademarks, product names, and other such intellectual property rights mentioned in this document. Except as expressly permitted, neither this document nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printed, photocopied, recorded or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document. \n\n\n***\n\n\n "} {"text": "# Infosys POV \nTitle: Energy Transition: Hydrogen for Net Zero \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 11 \n\n---\n\n An Infosys Consulting Perspective By Sundara Sambasivam & Shivank Saxena Consulting@Infosys.com | InfosysConsultingInsights.com Energy Transition Hydrogen for Net Zero \n\n---\n\n Page: 2 / 11 \n\n---\n\n Energy transition: Hydrogen for Net Zero | \u00a9 2022 Infosys Consulting 2 Energy transition: Hydrogen for Net Zero The pressure to reduce carbon emissions to achieve the target of net zero emissions by 2050 is ever-increasing. There is no silver bullet, no \u2018one-size-fits-all\u2019 solution to address this challenge. Many different energy sources, with varying levels of investment, are currently being explored and tested to enable our transition towards net zero. Hydrogen (H2) is one of the most abundant elements found in nature. It is considered a key component for decarbonising industry, opening new frontiers and complementing existing solutions. This series of papers aims to share some interesting perspectives on this sector, the associated challenges, and why hydrogen could play a significant role in the decarbonisation agenda. Current limitations in technology, scaling challenges, and feasibility concerns are just some of the reasons it has not yet been fully harnessed. 
However, hydrogen has significant potential to manage this challenging journey towards net zero. \n\n---\n\n Page: 3 / 11 \n\n---\n\n Types of Hydrogen Both the production source and the process used define the hydrogen type. Below is a list of the diverse hydrogen types produced today, based on production method and source (The hydrogen colour chart, 2022). \n\n---\n\n Page: 4 / 11 \n\n---\n\n MARKET OUTLOOK - Production & Economies Production and demand outlook According to the International Energy Agency\u2019s 2021 report on hydrogen, only 0.49 Mt of hydrogen was produced via electrolysis. Although this was only 0.5% of overall global production, the outlook on green and blue hydrogen is promising. It has become an essential element of any state policy on energy transition for net zero. By 2050, more than 80% of production is estimated to be green or blue hydrogen. Demand will primarily be driven by power, transport, and industry, where demand for green hydrogen has the potential to grow 200% by 2050. Figure 2: Global hydrogen production and demand outlook (Harnessing Green Hydrogen: Opportunities for Deep Decarbonization in India, 2022) \n\n---\n\n Page: 5 / 11 \n\n---\n\n Economic outlook The current hydrogen production costs from different methods are listed in Figure 3 (Hydrogen Strategy: Enabling a low-carbon Economy, 2020). Coal and other fossil fuel-based production is inexpensive at around 2 USD/kg. Prices increase by 10 to 20% when using carbon capture and storage (CCS). Electrolysis powered by renewable energy (RE) is the most expensive at 5 to 10 USD/kg and is not currently price-competitive. 
This needs to decrease to 2 USD/kg or lower in the next decade to compete directly with fossil fuels as an energy source. Several elements will play a critical role in driving down the cost of the end-to-end supply chain of production and distribution. These include higher levels of innovation through research and development (R&D) and the right investment in disruptive digital technologies such as artificial intelligence, the Internet of Things, blockchain smart contracts and certificates, and digital twins. Figure 3: Hydrogen production costs by source and method (Hydrogen Strategy: Enabling a low-carbon economy, 2020) \n\n---\n\n Page: 6 / 11 \n\n---\n\n Economic outlook Renewables and electrolyser costs drive green hydrogen prices, and both are showing declining trends. Electrolyser costs are expected to fall by 30% in the next ten years (Harnessing Green Hydrogen, 2022). Industrial manufacturers like Siemens Energy and Linde have already started setting up some of the world\u2019s biggest electrolyser production facilities in line with the European Union\u2019s (EU) strategy for fuel diversification (REPowerEU plan, May 2022), which will require 27 billion EUR of direct investment in domestic electrolysers and hydrogen distribution in the EU, excluding investment in solar and wind electricity (REPowerEU Plan, 2022). The US, on the other hand, has announced future investments of up to 9 billion USD from 2022 to 2026 through its \u2018Infrastructure Investments and Jobs Act\u2019 (Garc\u00eda-Herrero et al, 2022). The key difference is that US policy plans to use both blue and green hydrogen in the fuel mix, while the EU views blue hydrogen as a temporary solution only. Based on policy support and market conditions, the industry will decide on a future roadmap. 
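As a rough back-of-the-envelope check, the cost figures quoted above (around 2 USD/kg for fossil-based production, a 10 to 20% premium with CCS, 5 to 10 USD/kg for renewable electrolysis, and an expected 30% electrolyser cost decline over ten years) can be related to each other in a few lines of Python. The inputs come straight from the sources cited above; the arithmetic itself is only illustrative.

```python
# Illustrative arithmetic on the hydrogen cost figures quoted above.
fossil = 2.0                                  # USD/kg, coal / fossil-based production
ccs_range = (fossil * 1.10, fossil * 1.20)    # +10-20% with carbon capture and storage
green_range = (5.0, 10.0)                     # USD/kg, renewable-powered electrolysis

# Reduction needed for green hydrogen to reach the ~2 USD/kg parity point.
cut_low = 1 - fossil / green_range[0]         # from the cheap end of the range
cut_high = 1 - fossil / green_range[1]        # from the expensive end

# A 30% electrolyser cost fall over ten years, expressed as a compound annual rate.
annual_decline = 1 - 0.70 ** (1 / 10)

print(f"Blue hydrogen (fossil + CCS): {ccs_range[0]:.1f}-{ccs_range[1]:.1f} USD/kg")
print(f"Green hydrogen must fall by {cut_low:.0%}-{cut_high:.0%} to reach parity")
print(f"A 30% fall over 10 years is ~{annual_decline:.1%} per year")
```

In other words, reaching parity with unabated fossil production implies a 60 to 80% cost reduction for green hydrogen, while the gap to blue hydrogen (roughly 2.2 to 2.4 USD/kg) is slightly smaller; this is why the text stresses R&D and investment in the end-to-end supply chain.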
Green credits and green hydrogen trading can turn many fossil fuel-dependent countries into future energy suppliers. Various states and corporates are funding greenfield and brownfield projects, which has created financing opportunities for venture capital, underwriters, and insurance firms. Figure 4: Renewables and electrolyser cost outlook (Harnessing Green Hydrogen, 2022) \n\n---\n\n Page: 7 / 11 \n\n---\n\n Economic outlook Figure 5: Renewables and electrolyser cost outlook (Harnessing Green Hydrogen, 2022) \n\n---\n\n Page: 8 / 11 \n\n---\n\n Figure 6: Hydrogen value chain opportunities Hydrogen Value Chain Opportunities Figure 6 outlines the end-to-end value chain, from production and electrolyser plant setup, operations in conjunction with RE parks, storage (long- and short-term), and distribution (liquified or gaseous), to consumption applications (power, transportation, and industries). It gives an overview of the current usage of hydrogen in industry applications. The new emerging areas where significant opportunities exist for growth are primarily transportation (heavy-duty vehicles and shipping), long-term energy storage (sub-surface), and green ammonia (production and energy carrier). Hydrogen can contribute directly to decarbonising the biggest polluters like steel, refineries, and ammonia production. Although hydrogen has a clean burn, its production is not clean. Hydrogen production from fossil fuels resulted in 900 Mt of CO2 emissions in the year 2020 (Global Hydrogen Review 2021, 2021). 
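Combining two figures quoted above (0.49 Mt of electrolytic hydrogen being 0.5% of global production, and 900 Mt of CO2 from fossil-based production in 2020) gives a rough implied carbon intensity for fossil-based hydrogen. This is an illustrative sketch that assumes essentially all non-electrolytic production in 2020 was fossil-based:

```python
# Rough implied figures from the numbers quoted above (illustrative only).
electrolytic_mt = 0.49        # Mt H2 produced via electrolysis in 2020 (IEA)
electrolytic_share = 0.005    # 0.5% of overall global production
total_mt = electrolytic_mt / electrolytic_share   # implied global H2 output
fossil_mt = total_mt - electrolytic_mt            # assume the rest is fossil-based

co2_mt = 900.0                # Mt CO2 from fossil-based H2 production in 2020
intensity = co2_mt / fossil_mt                    # kg CO2 per kg H2 (Mt/Mt cancels)

print(f"Implied global production: ~{total_mt:.0f} Mt H2")
print(f"Implied intensity of fossil-based H2: ~{intensity:.1f} kg CO2/kg H2")
```

Roughly 9 kg of CO2 per kg of hydrogen produced: this is why a clean burn does not, by itself, make hydrogen clean.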
High demand for green and blue hydrogen and hydrogen-based fuels could avoid up to 60 Gt of CO2 emissions between 2021 and 2050, a reduction of 6% of total cumulative emissions (Hydrogen, 2022). Some of the biggest polluters in the transportation sector include long-haul freight, heavy-duty vehicles, maritime shipping, and jet fuel. Decarbonizing them is not easy. By 2050, green ammonia could meet 25% of shipping fuel demand, helping meet the International Maritime Organization\u2019s goal of reducing CO2 emissions by 50% from 2008 levels. Hydrogen fuel cells can power short-distance services such as ferry journeys (Harnessing Green Hydrogen, 2022). With air travel growing, a significant increase in the carbon footprint of aviation is expected, a sector that already has the highest carbon emission intensity. Options such as hydrogen fuel cells, hydrogen turbines, and hydrogen-based electrolytic synthetic fuel exist to decarbonize aviation, but each has its merits and demerits. Big corporations like Airbus and start-ups like ZeroAvia have already presented their roadmaps for hydrogen-powered aircraft in the next decade. For buildings, hydrogen can be blended into existing gas networks for both residential and commercial complexes. It can also be used by boilers and fuel cells. Its biggest promise is in long-term energy storage, which will impart stability to renewables-based generation and grid operations. Today, new gas turbines can also use hydrogen as a fuel component. Opportunities for the industry \n\n---\n\n Page: 9 / 11 \n\n---\n\n What\u2019s next? In our next articles, we will discuss the challenges of this emerging sector, some exciting industry projects underway around hydrogen, and the support and digital solutions needed to help pave the way to net zero. 
Infosys Consulting achieved its net zero goals 30 years ahead of time and is working to help our partners in their energy transition journeys towards their own net zero goals. \n\n---\n\n Page: 10 / 11 \n\n---\n\n MEET THE EXPERTS Sundara Sambasivam Associate Partner - Services, Utilities, Resources and Energy Practice Sundara.Sambasivam@infosys.com With over 22 years of global experience, Sundar has led a number of business and digital transformation and outcome-based efficiency turnaround programmes across Energy and Utilities (Transmission & Distribution). Sundar is excited to collaborate with and help our clients navigate the journey of energy transition towards their net zero ambitions. Shivank Saxena Senior Consultant - Services, Utilities, Resources and Energy Practice Shivank01@infosys.com With over 11 years of experience, Shivank has led digital transformation projects, enabling end-to-end systems\u2019 delivery for clients across industries and sectors. He has ensured sustained value delivery on multiple engagements by building roadmaps and driving planning-to-execution for various business-led initiatives. He is passionate about supporting the industry in meeting its net zero goals, and currently helps clients innovate to drive energy transition initiatives. Sources \u2022 Garc\u00eda-Herrero, A., Tagliapietra, S. & Vorsatz, V. (2021), \u2018Hydrogen development strategies: a global perspective\u2019, Bruegel, August, [Online], Link [Accessed: 21 Nov 2022]. \u2022 \u2018Global Hydrogen Review 2021\u2019, (2021), International Energy Agency: IEA, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Harnessing Green Hydrogen: Opportunities for Deep Decarbonisation in India\u2019, (2022), Niti Aayog & Rocky Mountain Institute (RMI), June, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Hydrogen\u2019, (2021), International Energy Agency: IEA, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Hydrogen\u2019, (2022), International Energy Agency: IEA, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018Hydrogen Strategy: Enabling A Low-Carbon Economy\u2019, (2020), U.S. Department of Energy, July, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018REPowerEU Plan\u2019, (2022), European Commission, [Online], Link [Accessed: 16 Nov 2022]. \u2022 \u2018The hydrogen colour chart\u2019, (2022), National Grid, [Online], Link [Accessed: 16 Nov 2022]. \n\n---\n\n Page: 11 / 11 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng 
For more information, contact consulting@infosys.com \u00a9 2022 Infosys Limited, Bengaluru, India. All Rights Reserved. \n\n\n***\n\n\n "} {"text": "# Infosys POV \nTitle: Next-gen Process Mining powers Oil & Gas transformation \nAuthor: Infosys Consulting \nFormat: PDF 1.7 \n\n---\n\n Page: 1 / 13 \n\n---\n\n An Infosys Consulting Perspective By Sachin Padhye, Naveen Kamakoti, Shruti Jayaraman and Sohini De Consulting@Infosys.com | InfosysConsultingInsights.com Next-gen Process Mining powers Oil & Gas transformation \n\n---\n\n Page: 2 / 13 \n\n---\n\n Oil & Gas transformation | \u00a9 2023 Infosys Consulting 2 Process Mining technology: A key enabler to transform Oil & Gas Technology continues to be a reliable and indispensable enabler in transforming the operations of Oil and Gas companies. As a growing trend, the broader intent of incorporating technology is the use of operational data to support analytics and fact-based decision-making. 
However, the increasing complexity of core and supplementary processes, coupled with the limited agility of legacy and monolithic IT systems, poses a constant challenge to continuous process improvement. The agility of business processes and operations depends on the ability to capture real-time data and perform large-scale analyses to generate actionable insights on demand, helping steer key metrics and key performance indicators (KPIs). One such technology framework with ever-growing adoption is Process Mining, especially due to its evolution from limited, process-discovery-based applications to centralized platforms for integrated process automation. \n\n---\n\n Page: 3 / 13 \n\n---\n\n Process Mining uses detailed data from business processes Process Mining is the practice of using data from various sources to analyze, baseline and improve business processes. The concept of Process Mining is built on analysis techniques using artificial intelligence (AI) and machine learning (ML). It is an approach to analyze, optimize, and improve complex operational processes. Powered by event data logs and data science tools, Process Mining helps identify process variations and bottlenecks and gathers quantitative insights into process flows. It also helps address performance and compliance-related issues in processes. A typical Process Mining lifecycle involves the following high-level steps: \n1. Data collection: collect data from various sources, such as event logs, databases, and operational data stores. Tools: data extraction tools, such as ETL tools, log parsers, or database connectors. \n2. Data pre-processing: clean, filter, and normalize data to ensure consistency and accuracy. Tools: data cleaning and preparation tools, such as Python or R scripts. \n3. Process discovery: create a process model based on the collected data. Tools: Process Mining tools, such as Disco, ProM, or Celonis. \n4. Conformance checking: compare the process model with the collected data to identify deviations, errors, or inefficiencies in the process. Tools: conformance checking tools, such as Disco, ProM, or Celonis. \n5. Process enhancement: optimize the process model to improve efficiency, reduce costs, and enhance quality. Tools: process simulation and optimization tools, such as Arena, Simul8, or ProModel. \n6. Process monitoring: continuously track and analyze process data to identify potential issues, bottlenecks, or opportunities for improvement. Tools: process monitoring tools, such as Celonis, Splunk, ELK, or Graylog. \n7. Process visualization: create graphical representations of the process model and process data to help stakeholders understand the process and identify areas for improvement. Tools: data visualization tools, such as Celonis, Tableau, Power BI, or QlikView. \n\n---\n\n Page: 4 / 13 \n\n---\n\n Evolution of Process Mining From being a niche technology used in research-oriented projects to a completely integrated cross-functional collaboration platform, Process Mining has evolved and matured for broad adoption. 
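To make the data collection, discovery and conformance steps of the lifecycle above concrete, here is a minimal, dependency-free sketch of the core idea behind process discovery: deriving a directly-follows graph from an event log and checking a trace against it. Commercial tools such as Celonis or ProM implement far richer algorithms; the event log below is invented purely for illustration.

```python
from collections import Counter

# Data collection: a toy event log of (case_id, activity) pairs, already
# ordered by timestamp within each case. Invented data for illustration.
event_log = [
    ("c1", "Create PO"), ("c1", "Approve PO"), ("c1", "Receive Goods"), ("c1", "Pay Invoice"),
    ("c2", "Create PO"), ("c2", "Approve PO"), ("c2", "Pay Invoice"),
    ("c3", "Create PO"), ("c3", "Receive Goods"), ("c3", "Pay Invoice"),
]

# Process discovery: count how often each activity directly follows another.
def discover_dfg(log):
    traces = {}
    for case, activity in log:
        traces.setdefault(case, []).append(activity)
    dfg = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# Conformance checking: flag transitions in a trace never seen in the model.
def deviations(trace, dfg):
    return [(a, b) for a, b in zip(trace, trace[1:]) if (a, b) not in dfg]

dfg = discover_dfg(event_log)
print(sorted(dfg.items()))
print(deviations(["Create PO", "Pay Invoice", "Receive Goods"], dfg))
```

Frequencies on the directly-follows edges are what surface bottlenecks and process variants; the unseen transitions are the deviations a conformance checker would report.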
Here is a summary of this evolution: \nFirst Generation \u2013 Process discovery (1988 \u2013 2004): focus on discovery of process models from event logs; use of process discovery algorithms and process tree visualizations; limited support for large and complex processes; challenges in handling noise, concurrency, and infrequent behavior. People: primarily academic researchers and process experts, with minimal involvement of business stakeholders and end-users. Process: emphasis on process modeling and analysis, with limited focus on process improvement and optimization. Systems: basic computing tools and algorithms; primarily desktop-based software; use of event logs and basic data mining techniques. \nSecond Generation \u2013 Process conformance (2004 \u2013 2011): introduction of conformance checking and process enhancements; integration of multiple perspectives and data sources; focus on quality control, compliance, and audit trails; use of data mining and machine learning techniques. People: involvement of business stakeholders and end-users in process mining projects, with increasing emphasis on collaboration and communication. Process: shift towards process improvement and optimization, with greater attention to business objectives and value creation. Systems: development of more sophisticated algorithms and methods; increased use of enterprise-level software systems; integration of multiple data sources and formats. \nThird Generation \u2013 User-centered process (2011 \u2013 2016): shift towards more user-centered and interactive approaches; focus on business process management and improvement; integration of social, organizational, and environmental factors; increased emphasis on big data, cloud computing, and distributed systems. People: involvement of a wider range of stakeholders including end-users, IT staff, and top management, with greater emphasis on user needs and user experience. Process: increased integration of process mining with business strategy and management practices, and greater emphasis on continuous improvement and innovation. Systems: greater use of cloud computing, big data, and advanced analytics; integration with other digital technologies such as social media and mobile devices. \nFourth Generation \u2013 Integrated process automation (2016 \u2013 present): expansion of process mining beyond event logs to include other types of data and processes; integration of process mining with other technologies such as AI, IoT, and blockchain; increased focus on automation, robotics, and digital transformation; development of new techniques such as predictive process monitoring and prescriptive analytics. People: involvement of a wide range of stakeholders including business users, IT staff, data scientists, and process experts, with greater emphasis on cross-functional collaboration and co-creation. Process: integration of process mining with digital transformation and innovation initiatives. Systems: integration of AI, IoT, blockchain, and advanced analytics; greater use of process automation and robotic process automation (RPA). \n\n---\n\n Page: 5 / 13 \n\n---\n\n Process Mining impacts multiple areas in Oil & Gas The Oil and Gas industry is complex and dynamic, with significant data generated across all areas. For Oil and Gas companies, Process Mining can be particularly important because of the complex and highly regulated nature of their operations. Here are some specific ways in which Process Mining can benefit Oil and Gas companies: \nOperational efficiency: Process Mining can help identify inefficiencies in processes, such as bottlenecks or unnecessary steps, and suggest ways to streamline them. This can lead to cost savings and better use of resources. \nRegulatory compliance: Oil and Gas companies are subject to numerous regulations and standards, such as those related to environmental protection and worker safety. 
Process Mining can help ensure that these regulations are being followed and identify areas where improvements are needed. \nOperational safety: safety is a top priority for Oil and Gas companies, and Process Mining can help identify potential hazards and risks. By analyzing data from sensors, equipment, and other sources, companies can identify patterns that may indicate an increased risk of accidents or equipment failure. \nOptimized maintenance: Process Mining can help companies optimize maintenance schedules by analyzing data from equipment and other sources to identify when maintenance is needed. This can help prevent unplanned downtime and reduce maintenance costs. \nCustomer satisfaction: Oil and Gas companies may interact with customers in various ways, such as through fuel delivery or service stations. Process Mining can help companies understand how customers interact with their services and identify ways to improve the customer experience. \n\n---\n\n Page: 6 / 13 \n\n---\n\n Pre-requisites for Process Mining Certain conditions need to be met before leveraging Process Mining effectively. These can broadly be grouped under People, Process and Technology. \nPeople: \n\u2022 Business analysts with domain knowledge of the processes to be analyzed. \n\u2022 Data scientists with expertise in data analysis and statistical modelling. \n\u2022 IT professionals with expertise in data management and system integration. \n\u2022 Process owners or subject matter experts who can provide feedback on the accuracy and relevance of Process Mining results. \n\u2022 Change management experts who can help manage the organizational changes that may result from Process Mining initiatives. \n\u2022 Leadership support. \nProcess: \n\u2022 Well-defined processes with documented workflows and procedures. \n\u2022 Compliance with legal and regulatory requirements, such as data privacy laws. 
\u2022 Availability of the required hardware and software infrastructure to support Process Mining activities. \n\u2022 Access to event logs or other data sources that capture process data. \n\u2022 Alignment with the organization\u2019s strategic objectives and goals. \n\u2022 Agility: the organization\u2019s ability and culture to adopt new frameworks for continuous improvement. \n\n---\n\n Page: 7 / 13 \n\n---\n\n System: \n\u2022 Access to data: access to digitized processes and/or processes with event/case data and relevant data sources, including event logs, databases, and other data repositories. \n\u2022 Data tools: data cleaning, transformation, and normalization tools to prepare data for analysis. \n\u2022 Process data: Process Mining software to extract and analyze process data. \n\u2022 Process model: process modelling software to create process models. \n\u2022 Visualization: visualization software to create dashboards and reports. \n\u2022 Investment: continued investment in technology platforms and relevant features. \n\n---\n\n Page: 8 / 13 \n\n---\n\n Case studies The following case studies cite instances where Process Mining helped a US-based Oil and Gas major realize efficiencies and optimize resources using Celonis. Approach \n\u2022 Key AS-IS process flows for these processes were modeled in ARIS to begin with. This provided an understanding of the current pain points and areas of improvement. \n\u2022 This process model was leveraged to identify the data availability in applications across each of the steps. \n\u2022 The journey: a case was created through all its states, and the data captured from the previous step was mapped against this to ensure data consistency. \n\u2022 This data was imported into the Celonis Execution Management system to create a data model. 
\u2022 Based on this data model, multiple process analysis dashboards and components were created to track various metrics and KPIs across key dimensions such as time, vendors, and locations. Process Mining in upstream logistics \n\n---\n\n Page: 9 / 13 \n\n---\n\n Business/process areas, their common challenges, potential Process Mining gains, and value levers impacted: \nStandard enterprise processes (order-to-cash, procure-to-pay): challenges include manual interventions, form corrections, and rate changes/data mismatches; gains include reduction of TAT, improved no-touch processing, and automation; value levers: operational efficiency, regulatory compliance, customer satisfaction. \nSupply chain management: challenges include high supply chain complexity, limited visibility among stakeholders, and poorly defined best practices; gains include reduction of process lead time, cost reduction by removing bottlenecks, and full process transparency; value levers: operational efficiency, optimized maintenance, customer satisfaction. \nVessel schedule optimization: challenges include multiple rigs covered by the same vessel and last-minute route planning; gains include effective route planning to reduce fuel costs and optimize time; value levers: operational efficiency, operational safety, optimized maintenance. \nHelicopter schedule optimization: challenges include high cost due to over-utilization and unnoticed maintenance risks; gains include incorporating best practices for utilization and monitoring risks; value levers: operational efficiency, regulatory compliance, operational safety, optimized maintenance. \nWarehouse management: challenges include an inefficient warehouse layout, lack of process automation, and inaccurate inventory and utilization data; gains include root cause analysis for layout, forecasting data for inventory utilization and avoiding stock-outs, and enhanced customer management; value levers: operational efficiency, customer satisfaction. \nFleet management: challenges include high fuel cost and under-utilized assets; gains include improved fleet efficiency and routing, and KPI monitoring to improve utilization; value levers: operational efficiency, customer satisfaction. \nVendor management: challenges include manual processes and poor automation, high rental costs with under-utilized equipment, and lack of end-to-end system integration; gains include SLA improvement and contract visibility and optimization; value levers: operational efficiency, customer satisfaction. \n\n---\n\n Page: 10 / 13 \n\n---\n\n Reference industry use cases Large integrated Oil & Gas major One of the largest Oil and Gas companies in the world has been using Process Mining to improve the efficiency of its drilling operations. By analyzing data from drilling rigs, this company was able to identify inefficiencies and areas for improvement, such as reducing idle time and optimizing drilling parameters. As a result, the firm was able to reduce drilling time and costs while improving safety and environmental performance. A European Oil & Gas company This company used Process Mining to optimize its maintenance processes for offshore platforms. By analyzing maintenance data, the firm was able to identify patterns and trends which improved the reliability of its equipment, reduced downtime, and lowered maintenance costs. The company also used Process Mining to identify opportunities for process standardization and optimization, resulting in further improvements in efficiency and cost savings. A large National Oil Corporation (NOC) This NOC used Process Mining to improve its customer service processes. By analyzing customer service data, the NOC was able to identify areas where it could improve its service levels, such as reducing response times and increasing the accuracy of billing. The company also used Process Mining to optimize its meter reading processes, resulting in significant cost savings. \n\n---\n\n Page: 11 / 13 \n\n---\n\n Process Mining encourages sustainable growth Oil and Gas companies operate in a complex environment with multiple interconnected processes, making it challenging to identify inefficiencies and areas for improvement. 
Process Mining provides a valuable tool for these companies to gain insights into their operational processes by analyzing data from various sources. By applying Process Mining techniques, Oil and Gas companies can identify bottlenecks, reduce costs, improve efficiency, and enhance the quality of their products and services. The benefits include improved compliance, enhanced decision-making, and increased operational efficiency. Implementing this technology can therefore help Oil and Gas companies stay competitive and achieve sustainable growth in an ever-changing industry. \n\n---\n\n Page: 12 / 13 \n\n---\n\n MEET THE EXPERTS SACHIN PADHYE Associate Partner, SURE Sachin.Padhye@infosys.com Sachin works with large Oil and Gas companies in the upstream, midstream, and downstream areas to frame their digital strategy across customer and employee experiences. He helps clients quantify value, beginning with industry opportunities and ending with decisions built on big data, analytical tools, visualizations and narratives. His current focus is digital data monetization, where he helps companies put a monetary value on the data that is used to execute their digital strategy. NAVEEN KAMAKOTI Principal, SURE Venkata_Kamakoti@infosys.com Naveen has over 18 years\u2019 experience in digital business transformation initiatives, focusing on process consulting, re-engineering and mining, as well as business architecture and consulting across information and professional services, plus Oil & Gas (upstream) domains. He leads the process consulting and transformation community of practice for Infosys Consulting. SHRUTI JAYARAMAN Senior Consultant, SURE Shruti.Jayaraman@infosys.com Shruti has four years\u2019 experience in business process improvement and digital transformation initiatives with a focus on process modelling, analysis, and mining. 
She has worked with upstream Oil & Gas clients across financial planning, process design and optimization, third-party hiring and government reporting areas for the last two years. She has delivered training in process modelling using ARIS and has worked with Agile methodologies. SOHINI DE Consultant, SURE Sohini.De@infosys.com Sohini has over four years\u2019 experience in process transformation initiatives focusing on business process improvement, process design, modeling and mining. She has two years\u2019 experience in the upstream energy industry in marine logistics and integrity inspection. She has conducted training on the ARIS Designer platform for process modeling and has hands-on experience working with Agile methodologies. \n\n---\n\n Page: 13 / 13 \n\n---\n\n consulting@Infosys.com InfosysConsultingInsights.com LinkedIn: /company/infosysconsulting Twitter: @infosysconsltng For more information, contact consulting@infosys.com \u00a9 2022 Infosys Limited, Bengaluru, India. All Rights Reserved. 
\n\n\n***\n\n\n "}