Scalable High-Load Testing Framework for a Large Enterprise
A migration software provider set out to accelerate product development and drive international scale. The product stores large volumes of data about clients' employees and automates its transformation. Our high-load testing framework was designed to simulate real-world scenarios by generating significant volumes of virtual user traffic and data load.
Industry:
Automation processes
Services:
Software Testing
Location:
US
Challenge
Manual testing: Manual testing methods are not efficient enough to detect and resolve problems quickly and accurately. This leads to significant delays and increased costs when developing and rolling out new features and modules. Manual testing is also prone to human error, which can result in inaccurate data and severe bugs slipping through.
Scalability: Scalability is another major challenge in building a high-load testing framework. As enterprise software grows in size and complexity, testing becomes harder, and a scalable framework is needed to test the software thoroughly, quickly, and cost-effectively.
High-load testing scenarios: The framework must handle high-load scenarios, for example simulating thousands or even millions of simultaneous users accessing the software. This puts significant strain on the framework itself, which must be designed to withstand high levels of stress and load.
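To make the idea concrete, the sketch below shows one way to drive many concurrent virtual users against an endpoint. It is a minimal illustration only: the URL, user count, and request volume are hypothetical placeholders, and a real run would typically rely on a dedicated load tool with ramp-up, distribution, and reporting.

```typescript
// Minimal virtual-user load sketch (hypothetical endpoint and numbers).
// A dedicated tool would normally handle ramp-up, reporting, and
// distributed execution; this only shows the core idea.

const TARGET_URL = "https://staging.example.com/api/employees"; // hypothetical
const VIRTUAL_USERS = 1_000;      // concurrent simulated users
const REQUESTS_PER_USER = 50;     // requests each user issues

async function virtualUser(id: number): Promise<number> {
  let failures = 0;
  for (let i = 0; i < REQUESTS_PER_USER; i++) {
    try {
      const res = await fetch(TARGET_URL, { headers: { "x-virtual-user": String(id) } });
      if (!res.ok) failures++;
    } catch {
      failures++;
    }
  }
  return failures;
}

async function main() {
  const start = Date.now();
  // Launch all virtual users concurrently and wait for them to finish.
  const failures = await Promise.all(
    Array.from({ length: VIRTUAL_USERS }, (_, id) => virtualUser(id)),
  );
  const totalFailures = failures.reduce((a, b) => a + b, 0);
  const totalRequests = VIRTUAL_USERS * REQUESTS_PER_USER;
  console.log(`Sent ${totalRequests} requests in ${(Date.now() - start) / 1000}s`);
  console.log(`Error rate: ${((totalFailures / totalRequests) * 100).toFixed(2)}%`);
}

main();
```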
Solution
Scalable testing framework
We implemented a scalable automation testing framework that accelerated product development and ensured high code quality. It was designed to support a very large number of automated test cases.
Our framework comprises 5,000+ automated tests that run asynchronously in 20 threads overnight; only 5-10% of products operate test suites at this scale. The established test environment is used to catch possible bugs in new features before the code is deployed to production.
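The case study does not name the test runner, so the configuration below is a minimal sketch assuming a Playwright-style runner. The worker count mirrors the 20-thread nightly run described above; the paths, reporters, and environment variable are hypothetical.

```typescript
// playwright.config.ts — a minimal sketch, assuming a Playwright-style runner.
import { defineConfig } from "@playwright/test";

export default defineConfig({
  testDir: "./tests",          // hypothetical location of the automated specs
  fullyParallel: true,         // run independent tests concurrently
  workers: 20,                 // 20 parallel workers, matching the nightly run
  retries: 1,                  // retry once to filter out flaky failures
  reporter: [["html", { open: "never" }], ["junit", { outputFile: "results.xml" }]],
  use: {
    baseURL: process.env.BASE_URL,   // points at the pre-production environment
    trace: "retain-on-failure",      // keep traces only for failed tests
  },
});
```

A nightly CI job can then run the whole suite against the staging environment so that failures surface before the code reaches production.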
UI components standardization
Establishing solid communication between the frontend and QA teams made it possible to arrive at a shared component pattern for the automation framework. This accelerated the creation of automated tests and boosted overall product development, since both teams now use unified patterns to build new components. The biggest benefit of this approach is higher quality with less code, which improves readability, eases maintenance, and saves team resources overall.
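For illustration, a shared component object in such a framework might look like the sketch below. The component name, selectors, and test-id contract are hypothetical, and the runner is again assumed to be Playwright-style rather than the client's actual tooling.

```typescript
// A minimal sketch of a shared component object (hypothetical names/selectors).
// Because the frontend builds every dropdown from the same pattern, one
// component class can be reused by every test that touches a dropdown.
import { Locator, Page, expect } from "@playwright/test";

export class Dropdown {
  private readonly root: Locator;

  constructor(page: Page, testId: string) {
    // All dropdowns expose the same data-testid contract agreed with frontend.
    this.root = page.getByTestId(testId);
  }

  async select(option: string): Promise<void> {
    await this.root.click();                                   // open the menu
    await this.root.getByRole("option", { name: option }).click();
  }

  async expectSelected(option: string): Promise<void> {
    await expect(this.root).toContainText(option);
  }
}

// Usage in a spec: the test never repeats low-level selectors.
// const country = new Dropdown(page, "country-dropdown");
// await country.select("United States");
// await country.expectSelected("United States");
```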
Optimized sprint workflow
Our QA engineers adopted an optimized process to integrate closely with the development team and shorten the time needed to release a product increment. We had to iterate on the product quickly: each sprint lasted only a week, which put pressure on the teams.
Now testing starts at the requirements elicitation and validation stage. This approach lets us begin writing automated test cases for the back end and front end before the development team implements the features.
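As a hedged example of what this shift-left approach can look like in practice, a spec can be drafted directly from the acceptance criteria and kept out of the failing set until the feature lands. The endpoint, payload, and field names below are hypothetical, and a Playwright-style runner is assumed.

```typescript
// A sketch of a test written at the requirements stage (hypothetical API shape).
// `test.fixme` keeps the spec in the suite but skips it until the feature exists,
// so the assertions are ready the moment development finishes.
import { test, expect } from "@playwright/test";

test.fixme("employee records are transformed to the target schema", async ({ request }) => {
  // Drafted directly from the acceptance criteria, before implementation.
  // Assumes baseURL is configured for the test environment.
  const response = await request.post("/api/transformations", {
    data: { sourceSystem: "legacy-hr", batchSize: 100 }, // hypothetical payload
  });
  expect(response.ok()).toBeTruthy();

  const body = await response.json();
  expect(body.status).toBe("queued");       // expected behaviour from the spec
  expect(body.records).toHaveLength(100);
});
```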
Technologies Used
To deliver a robust, reliable migration product capable of handling large volumes of data, our team leveraged a carefully selected technology stack built to stand up to the rigorous demands of the market.
About the team
To tackle all the challenges, we assigned a team of eight Automation QA Engineers and a QA Automation Lead to work closely with the client's small autonomous product development teams. This allowed us to seamlessly integrate our team into their existing infrastructures and workflows, ensuring a smooth and efficient process from start to finish.
Impact
While working on this product, our automation team delivered 90%+ test coverage and continues to look for ways to improve the framework. The optimized workflow shortened iterations by up to one week.
Techstack's automation QA engineers own responsibility for the state and quality of the whole product. They are an essential part of the product development team and proactively evaluate every feature before it is deployed to production.
Up to 85% of regression bugs were spotted by the autotests
Regression bugs identification time decreased from 1 month to 19 hours
15% of bugs spotted before or during development
Test automation creation speed increased by 75-80%