Agile in Testing Practice: How Engineering Testing Practices Drive Productivity and Quality

Agile, as the name suggests, implies speed. Agile testing, therefore, is the practice of rapidly validating software against the client's requirements. As soon as a build is available, testing begins so that defects are identified early. As testers, we strive to provide active feedback throughout the process rather than sitting in the audience at the other end. The emphasis is on the quality of the deliverable despite short time frames, which in turn helps reduce the cost of the entire development cycle. Given the rate of change around us, it is often difficult or impossible to predict how applications will evolve over time.

Market conditions change rapidly, end-user needs evolve, and new competitive threats emerge without warning. In many situations, it isn't possible to define requirements fully before the project begins. It is critical to be agile enough to respond to a fluid business environment. Fluidity implies change, and change is expensive, particularly when it is uncontrolled or poorly managed. One of the most compelling characteristics of the agile approach is its ability to reduce the cost of change during the software development process. Does recognizing the challenges posed by modern realities mean abandoning valuable software engineering principles, concepts, methods, and tools? Absolutely not! Like all engineering disciplines, software engineering continues to evolve, and it can adapt to meet the challenges posed by the demand for agility.

Benefits of Agile Testing

  • Development and testing activities are concurrent.
  • Everyone works as a team towards a common goal, and everyone is responsible for quality.
  • Continuous integration and continuous customer feedback.
  • Less risk of a last-minute time squeeze.
  • Working software is developed and delivered to the customer frequently, along with documentation.

Strategies for Agile Testing

01. Early Software Testing:

  • Test as early as possible: the potential impact of a defect rises exponentially over time (this isn't always true, but it is a reasonable concern). In fact, many agile developers prefer a test-first approach.
  • Test as often as possible and, more importantly, as effectively as possible, to increase the chances of identifying defects. Although this increases costs in the short term, studies have shown that greater investment in testing reduces the total cost of ownership of a system thanks to improved quality.
  • Align testing investments with the direct and indirect costs of a failed deployment. Some applications need testing protocols that are mandated by regulatory frameworks.
  • Pair testing: just like pair programming and modelling in pairs, it is an exceptionally good idea.

02. Testing Throughout the Lifecycle:

Testing activities vary throughout the lifecycle. During Iteration 0, initial setup tasks are performed. This includes identifying the people who will be on the external testing team and identifying, and potentially installing, testing tools. If the project has a deadline, it is important to identify the "End Game" (release) date. A significant amount of testing occurs during construction iterations: agilists test often, test early, and usually test first. Regardless of style, the true goal should be to test, not to plan to test, and certainly not to write comprehensive documentation about how testing will happen at some future point. Agilists still plan and still write documentation, but the focus is on high-value activities such as actual testing. During the End Game, full system and acceptance testing are the focus areas. The testing effort is greatly reduced at the End Game because rigorous testing has largely been accomplished during earlier stages of the product development lifecycle.

03. Testing During Construction Iteration:

There are two aspects to confirmatory testing: agile acceptance testing and developer testing, both of which are automated to enable continuous regression testing throughout the lifecycle. Confirmatory testing is the agile equivalent of testing to the specification: acceptance tests typically focus on the requirements specification, while developer tests focus on the design specification. Both concepts are applications of the agile practice of single-sourcing information whenever possible. Agile acceptance testing is a mix of traditional functional testing and traditional acceptance testing, because the development team and its stakeholders do it collaboratively. Developer testing is a mix of traditional unit testing and traditional class-integration testing. The goal is to look for coding errors, achieve coverage (if not full path testing), and ensure that the system meets the current intent of its stakeholders. Developer testing is often done in a test-first manner, where a single test is written and then just enough production code is written to fulfil that test. This test-first approach is considered a detailed design activity first and a testing activity second. Automation is an important aspect of construction testing because evolutionary projects increase the need for regression testing. Acceptance test cases can be generated from use cases and scenario definitions, or from process diagrams such as UML activity diagrams or flowcharts.
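The test-first rhythm described above can be sketched with Python's standard `unittest` module. The `Cart` class and its behaviour are hypothetical, used only to illustrate the cycle of writing the test before writing just enough production code to pass it:

```python
import unittest

# Step 2: production code, written *after* the test below, and only
# enough of it to make that test pass.
class Cart:
    """Hypothetical shopping cart used to illustrate test-first development."""

    def __init__(self):
        self._items = []

    def add(self, name, price, qty=1):
        self._items.append((name, price, qty))

    def total(self):
        return sum(price * qty for _, price, qty in self._items)


# Step 1: the developer writes this (initially failing) test first.
# It doubles as a detailed design statement for Cart's interface.
class CartTotalTest(unittest.TestCase):
    def test_total_sums_price_times_quantity(self):
        cart = Cart()
        cart.add("book", 10.0, qty=2)
        cart.add("pen", 1.5)
        self.assertEqual(cart.total(), 21.5)


if __name__ == "__main__":
    unittest.main(exit=False)
```

Because the test exists before the code, it immediately joins the automated regression suite that runs on every build.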

04. Investigative Testing:

Investigative testing unearths potential problems in the form of defect stories, the agile equivalent of a defect report. A defect story is treated as a form of requirement: it is estimated, prioritized, and added to the requirements stack. The need to fix a defect is a type of requirement, so it makes perfect sense to address it like any other requirement. As would be expected, during the End Game the only requirement type being worked on is defect stories. Good investigative testing reveals problems the developers missed long before they become too expensive to address. It also provides feedback to management that the team is delivering high-quality working software on a regular basis.
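Treating a defect story as just another prioritized requirement can be sketched as follows. The `BacklogItem` shape, the lower-is-more-urgent priority scheme, and the story titles are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass(order=True)
class BacklogItem:
    """A unit of work on the requirements stack; lower priority value = more urgent."""
    priority: int
    title: str = field(compare=False)
    estimate_points: int = field(compare=False)
    kind: str = field(compare=False, default="feature")

def defect_story(title, priority, estimate_points):
    """A defect report expressed as a requirement, so it is estimated and
    prioritized alongside feature stories (illustrative helper)."""
    return BacklogItem(priority, title, estimate_points, kind="defect")

backlog = [
    BacklogItem(2, "Support coupon codes", 5),
    defect_story("Checkout total ignores quantity", 1, 2),
    BacklogItem(3, "Export order history", 8),
]
backlog.sort()  # defects compete for position like any other requirement
```

Sorting the combined stack means a severe defect can outrank a new feature without any special-case process for bug handling.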

Consider a particular scenario of “product purchase” that can be schematically represented using UML notation as shown below. The ROI case study provides a quick overview of how agile testing creates value in this case.

Case Study: ROI on Test Automation

Application Name: XYZ

Number of Test Cases: 4,000 (sample number)

Manual Execution Effort (hours): 336 (considering 5 minutes per test-case execution)

Automated Execution Effort (hours): 132 (considering 2 minutes per test script)

Test Iterations Planned Yearly: 20 (each iteration requires execution of 2,000 test cases)

Total Projected Hours Saved: (336 * 10) - (132 * 20) = 720

Total Annual Savings ($): 720 * 50 = 36,000 (considering an FTE rate of $50 per hour)

Savings %: (720 / 3,360) * 100 = 21.4%

Automation Effort Estimate (hours): 4,000 (considering 2 hours for scripting one test case)

Automation Effort FTE Cost ($): 4,000 * 60 = 240,000 (considering an FTE rate of $60 per hour)
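The arithmetic in the case study can be reproduced with a short script. All figures (336 manual hours, 132 automated hours, 20 iterations, $50 per hour) come straight from the table above, including its convention of counting 20 half-suite manual runs as 10 full-suite runs:

```python
# ROI figures from the case study table (rounded hours as stated there).
manual_hours_full_suite = 336      # 4,000 cases at ~5 minutes each
automated_hours_full_suite = 132   # 4,000 scripts at ~2 minutes each
iterations_per_year = 20
fte_rate_per_hour = 50

# 20 iterations of 2,000 cases = 10 full-suite manual runs, as the table computes.
manual_hours_per_year = manual_hours_full_suite * 10
# The table charges the full automated-suite time for each of the 20 iterations.
automated_hours_per_year = automated_hours_full_suite * iterations_per_year

hours_saved = manual_hours_per_year - automated_hours_per_year
dollars_saved = hours_saved * fte_rate_per_hour
savings_pct = hours_saved / manual_hours_per_year * 100

print(hours_saved, dollars_saved, round(savings_pct, 1))  # 720 36000 21.4
```

Laying the arithmetic out this way also makes it easy to re-run the ROI model with an organization's own suite size, execution times, and rates.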

 

Automated Testing:

The main challenges associated with automated testing are:

  1. Selection of an Automation Tool – A variety of automation tools are available in the market, and choosing a good one is a major challenge that organizations often face. Commercial tools can be expensive, while open-source tools may not be reliable. Additionally, in some cases, organizations may not have sufficient expertise to make optimum use of the testing tools.
  2. No Defined Process for Executing an Automation Project – Automation is akin to project execution: requirements, framework design, and test-case execution must be aligned. The lack of a systematic approach and process makes successful automation tough. Just as processes, guidelines, and checklists are defined for the project, it is a best practice to have guidelines, processes, and checklists for automation as well.
  3. Availability of the Right Resources – The right set of resources is a must when attempting automation. Resources should be skilled enough to design and code robust scripts, so that minimum time is required for debugging during maintenance.
  4. Commitment from the Customer or Management – Automation is a time-consuming and resource-intensive task. Customer or management commitment is required to extract the real benefits of automation.

Automation Testing with Selenium

Selenium is an excellent open-source tool for driving automated testing of web-based applications. The only cost is the effort that goes into designing and developing scripts for the application; no separate lifecycle design for automation needs to be developed, and the Selenium automation lifecycle fits well into existing automation lifecycles. Given the popularity of Selenium as an automation tool, there is plenty of information online about best practices. Selenium scripts can be written in several programming languages, such as C#, Java, PHP, Ruby, etc., unlike many commercial tools that support a single scripting language. Since Selenium scripting can be done in a language of choice, finding the right resources is simpler. Finally, because Selenium has no licensing costs, engineering leadership and teams recognize that their only investments in test automation are infrastructure and developer effort.
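A minimal sketch of what a Selenium script for a product-purchase flow might look like, using Selenium's Python bindings, is shown below. The URL, element IDs, and confirmation check are hypothetical placeholders, and the selenium import is deferred into the function so the sketch can be loaded and its locators inspected without a browser or driver installed:

```python
# Hypothetical locators for a product-purchase flow; the ("id", value)
# pairs mirror Selenium's (By.ID, value) convention.
LOCATORS = {
    "search_box": ("id", "search"),
    "add_to_cart": ("id", "add-to-cart"),
    "checkout": ("id", "checkout"),
}

def purchase_product(base_url, product_name):
    """Drive a hypothetical product-purchase scenario with Selenium WebDriver.

    Requires `pip install selenium` plus a matching browser driver; the
    import is local so merely loading this sketch needs no browser.
    """
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()  # or Firefox(), Edge(), ...
    try:
        driver.get(base_url)
        driver.find_element(By.ID, LOCATORS["search_box"][1]).send_keys(product_name)
        driver.find_element(By.ID, LOCATORS["add_to_cart"][1]).click()
        driver.find_element(By.ID, LOCATORS["checkout"][1]).click()
        # Placeholder success check; a real suite would assert on page content.
        return "order-confirmation" in driver.current_url
    finally:
        driver.quit()
```

Keeping locators in one place, as above, is a common Selenium practice: when the UI changes, only the locator table needs updating, which keeps script maintenance cheap.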

Software quality directly impacts the financial performance of the business. The example above shows that test automation can drive improved quality along with significant savings. A carefully chosen test strategy brings together several elements such as automation, people skills, workflow & processes, tool selection and culture.

At Encora, we integrate test automation in the DevOps cycle to drive engineering efficiency & effectiveness. Some of the ways we achieve this goal are by:


  • Building automation plans to maximize code coverage & create coherent test environments
  • Augmenting open-source QA test automation platforms with Encora-built accelerators
  • Implementing best practices for cross-platform testing with Behavior Driven Development (BDD) integration
  • Enabling test case generation, maintenance & scalability using AI
  • Integrating Test Environment Management & Test Data Management into the automation cycle
