Setting Up an Automation Testing Framework From Scratch in a Late-stage Project

The product is a suite of user-friendly, powerful desktop utilities and graphic editors, consolidated under one brand to provide a seamless, holistic user experience.

Industry:

Design Software

Services:

Software Testing

Location:

Netherlands

01

Challenge

Comprehensive understanding of the project goals: Our team had to work through multiple iterations of assessment, evaluation, and refinement to arrive at a framework that met the product needs and was scalable to accommodate future growth and changes.

Limited time: The product was in its late stages, and time was of the essence. Our team faced this challenge head-on, mobilizing quickly and efficiently to build a framework that would streamline our testing process. Despite the time constraints, we remained focused on quality, ensuring that our tests were thorough and effective.

The right tools and resources: One of the critical challenges was finding automation testing tools that fit the product's context. The market offers a myriad of automation tools, each with different features and capabilities, so our team had to thoroughly evaluate and test multiple options to identify the best fit for the project.

Integration with the existing software: We also faced challenges integrating the automation framework with the existing tools and infrastructure. In particular, the framework had to plug into the continuous integration and delivery (CI/CD) pipeline, enabling fast, seamless integration of automated tests with project builds and deployments.

02

Solution

The team grew from 0 to 8 members and included both manual and automation QA engineers. We applied approaches tailored to the product's needs and goals to:

  • Introduce double-stage Kanban sprints, in which the QA team tested completed features while the development team worked on new features in parallel.

  • Optimize the smoke test to save time before releases.

  • Introduce testing in different environments (Trunk, Staging) to ensure impeccable release quality.

  • Divide the test team into sub-teams, each focused on its own product area, allowing engineers to concentrate and raising overall quality.

The product includes a graphic design editor that lets users apply various graphic effects. Manually verifying that each effect is applied correctly is a cumbersome and error-prone task, so we built a library of graphic-effect reference images to detect errors automatically.

We used a per-pixel comparison algorithm built on the System.Drawing.Common NuGet package, which provides Bitmap objects for loading the reference and actual images. The algorithm computes a match percentage, from which we can determine whether an effect was applied correctly.

This library eliminates most of the routine visual checks and frees up time for quality assurance of other parts of the application.
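To illustrate the idea, here is a minimal sketch of a per-pixel match-percentage check. The team's actual library is built on .NET's System.Drawing.Common Bitmap API; this Python version, with images modeled as flat lists of (R, G, B) tuples and a hypothetical `tolerance` parameter, only demonstrates the general technique.

```python
def match_percentage(reference, actual, tolerance=0):
    """Return the percentage of pixels in `actual` whose color channels
    all fall within `tolerance` of the corresponding reference pixel."""
    if len(reference) != len(actual):
        raise ValueError("images must have the same dimensions")
    # Count pixels where every channel difference is within tolerance.
    matching = sum(
        1
        for ref, act in zip(reference, actual)
        if all(abs(r - a) <= tolerance for r, a in zip(ref, act))
    )
    return 100.0 * matching / len(reference)

# Example: a 2x2 "image" where one pixel drifts from the reference.
ref_img = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]
act_img = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (250, 250, 250)]

print(match_percentage(ref_img, act_img))               # 75.0 (exact match)
print(match_percentage(ref_img, act_img, tolerance=8))  # 100.0 (within tolerance)
```

A tolerance threshold like this is a common way to absorb harmless rounding differences in rendering while still flagging an effect that was applied incorrectly.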

03

About the team

The team grew from 0 to 9 members during our cooperation.

Team composition

  • QA Manager

    1

  • QA Engineers

    7

  • Front-end developer

    1

04

Impact

  • Unit tests: 5,000+

  • Automated tests: 2,500+

  • Releases we took part in: 50+

  • Team: from 0 to 8 members

Over the course of our cooperation, the product became financially self-sufficient. Parallel development and testing sprints ensured uninterrupted software delivery.
