Report testing presents many difficulties: it requires thorough planning, strong analytical skills and a high degree of concentration.
The greatest challenges for manual QA engineers are:
- A large number of reports with different logic and layout for each
- A continuous investigation process
- Testing various types of tasks simultaneously, such as a new report, enhancement, performance and regression
- An in-depth knowledge of differences between various databases, servers and reporting applications
- A thorough understanding of the report creation process and the ability to understand stored procedure code for white-box testing
Report logic and report layout verification require significant time and can only be done manually. To provide high-quality testing, a QA team must ensure not only that the required changes were made, but also that nothing else was broken. Because even a simple change to one object can produce a long list of impacted objects, the QA team needed an automated solution that could perform the regression testing.
In addition, report testing demands extensive data verification work, comparison of different file types, performance measurement, comparison of certain database objects and so on. This routine work also required automation.
Reporting QAs needed a solution that would be simple to use, provide readable test results and follow the same testing strategy across different reporting applications and projects.
A reporting automation testing approach had to solve the following tasks:
- Improve the quality of testing by maximizing the scope of verifications
- Reduce time spent on routine work and concentrate on deeper reports analysis
- Create a comprehensive testing application to simplify the manual testing process
Sprinterra delivered results by:
- Maximizing testing coverage through automation tools
- Providing continuous data quality verification without involvement of human resources
- Providing an automated tool for performance testing
- Providing regression applications and automated tools which are written using open source instruments
- Using a data-driven approach for regression tests
Approach and Solution
The Sprinterra Automation QA team started with a simple solution based on open source technologies using a data-driven approach to run regression tests.
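The data-driven idea can be sketched as follows; this is a minimal illustration, and the table name, column names and stub report runner below are hypothetical, not part of Sprinterra's actual implementation:

```python
import sqlite3

def load_test_cases(conn):
    """Read report test cases (report name + parameters) from the local testing DB.

    The test_cases table is a hypothetical example of where input data could live.
    """
    rows = conn.execute(
        "SELECT report_name, parameters FROM test_cases ORDER BY id"
    ).fetchall()
    return [{"report": r[0], "parameters": r[1]} for r in rows]

def run_regression(conn, run_report):
    """Run every stored test case through a report runner and collect the outputs."""
    results = []
    for case in load_test_cases(conn):
        output = run_report(case["report"], case["parameters"])
        results.append((case["report"], output))
    return results

# Demo with an in-memory database and a stub report runner.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE test_cases (id INTEGER PRIMARY KEY, report_name TEXT, parameters TEXT)"
)
conn.executemany(
    "INSERT INTO test_cases (report_name, parameters) VALUES (?, ?)",
    [("SalesSummary", "year=2023"), ("Inventory", "region=EU")],
)
conn.commit()

results = run_regression(conn, lambda name, params: f"{name}({params}) OK")
```

The key point of the data-driven approach is that adding a new test case means adding a row of data, not writing new test code.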
This solution evolved into a unique regression strategy that can be applied to any new reporting application or reporting project. Although each project requires some customization for its particular needs, all of the projects follow the same strategy:
- QAs choose the objects, such as reports or stored procedures, for the regression run and begin the regression.
- The automated solution reads the input data from a local testing database.
- The automated application launches the reporting application and runs all existing test cases for the selected objects/reports.
- Once report execution is completed, report outputs are downloaded and compared to baselines, i.e. the expected results for the reports. Comparison is available for .csv, .xls and .pdf formats.
- When mismatches are found, a comparison file is created and all the differences are highlighted, which is extremely useful for further analysis by the QA team.
- Once all reports are executed, outputs are downloaded, compared to the expected results and an email with regression results is sent. This email contains general statistics on regression results and details for each particular test case.
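The compare-to-baseline step above can be sketched for .csv outputs as follows; this is an illustrative cell-by-cell comparison on inlined strings (a real run would read the downloaded output and baseline files):

```python
import csv
import io

def compare_to_baseline(actual_csv, baseline_csv):
    """Compare a report's CSV output to its baseline cell by cell.

    Returns a list of (row, column, baseline_value, actual_value) mismatches,
    which is the information a comparison file would highlight for the QA team.
    """
    actual = list(csv.reader(io.StringIO(actual_csv)))
    baseline = list(csv.reader(io.StringIO(baseline_csv)))
    mismatches = []
    for r in range(max(len(actual), len(baseline))):
        a_row = actual[r] if r < len(actual) else []
        b_row = baseline[r] if r < len(baseline) else []
        for c in range(max(len(a_row), len(b_row))):
            a = a_row[c] if c < len(a_row) else ""
            b = b_row[c] if c < len(b_row) else ""
            if a != b:
                mismatches.append((r, c, b, a))
    return mismatches

baseline = "region,total\nEU,100\nUS,250\n"
actual = "region,total\nEU,100\nUS,260\n"
diffs = compare_to_baseline(actual, baseline)
```

An empty mismatch list means the report passed; a non-empty list pinpoints exactly which cells changed, which is what makes the comparison files useful for further analysis.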
The solution also provides:
- The ability to manually override test case parameters for specific verifications on different environments
- Options to start regression from any stage
- A “compare only” option, which runs only the comparison stage of a regression
The Sprinterra QA team created additional automated tools, including:
- “SPPerf tool” – a powerful tool for stored procedure verification. It gathers performance statistics such as running time and I/O counts for different database types. It can also be used for load testing, as it compares stored procedure run results across different environments or simply runs an unlimited number of different stored procedure calls.
- “Excel Compare Tool” – a highly customizable VB script for Excel. It allows the user to compare two Excel files with many additional options, such as ignoring specific rows and columns, manually selecting key columns and verifying rows that were split and placed into one of the compared files. It then marks comparison results as matched or mismatched for all cells and offers additional information for mismatched ones.
- “Regression UI Tool” – a graphical user interface for regression packed into a JAR archive, offering the QA a much more comfortable experience while regression testing. It simplifies all the manual routine work required for regression testing, including running reports, selecting specific options, adding and updating test cases and more. It also runs regression as a desktop application without the need for additional installed software, which simplifies usage across large teams.
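The key-column comparison idea behind the Excel Compare Tool can be sketched on plain row data; this is an assumption-laden simplification (the real tool is a VB script operating on Excel files), with made-up sample rows and an ignored timestamp column:

```python
def compare_by_key(baseline_rows, actual_rows, key_col, ignore_cols=()):
    """Match rows by a key column and compare the remaining cells.

    Rows may appear in any order in the two inputs; they are paired by the
    value in key_col. Returns {key: [(column_index, baseline, actual), ...]}
    containing only keys whose rows differ in a non-ignored column.
    """
    actual_by_key = {row[key_col]: row for row in actual_rows}
    diffs = {}
    for row in baseline_rows:
        key = row[key_col]
        other = actual_by_key.get(key, [])
        cell_diffs = []
        for c in range(max(len(row), len(other))):
            if c in ignore_cols:
                continue
            b = row[c] if c < len(row) else ""
            a = other[c] if c < len(other) else ""
            if a != b:
                cell_diffs.append((c, b, a))
        if cell_diffs:
            diffs[key] = cell_diffs
    return diffs

baseline = [["R1", "100", "2023-01-01"], ["R2", "250", "2023-01-02"]]
actual = [["R2", "260", "2023-05-09"], ["R1", "100", "2023-05-09"]]
# Ignore the timestamp column (index 2) when comparing.
diffs = compare_by_key(baseline, actual, key_col=0, ignore_cols=(2,))
```

Matching rows by a key column rather than by position is what makes the comparison robust when rows are reordered or split between the two files.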
The solution delivers the following benefits:
- A standardized design saves human resources and allows regression to be run at any time without the need for specific skills.
- GUI-based design deploys regression as an application to any VM without the need for additional resources such as dedicated servers or databases.
- A local-resource usage approach permits starting any number of regression applications simultaneously from different machines without shared-resource conflicts.
- The reduced need for human and hardware resources allows regression testing to run continuously, increasing product quality and lowering the risk of missed bugs.
- Additional verification for project-related application functionality allows for catching missed “non-project-related” bugs.
- The statistics gathering and email notification system delivers regression testing results with detailed statistics without the need for manual results verification.
- Results gathering and sharing permits tracking changes across all stories.
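The notification step can be sketched as follows; this only assembles the message (sending would use an SMTP connection), and the addresses and result fields are placeholders, not Sprinterra's actual format:

```python
from email.message import EmailMessage

def build_regression_email(results):
    """Build an email with overall pass/fail statistics plus per-test-case details.

    results is a list of (test_case_name, passed) pairs.
    """
    passed = sum(1 for _, ok in results if ok)
    failed = len(results) - passed
    msg = EmailMessage()
    msg["Subject"] = f"Regression results: {passed} passed, {failed} failed"
    msg["From"] = "regression@example.com"   # placeholder address
    msg["To"] = "qa-team@example.com"        # placeholder address
    details = "\n".join(
        f"{name}: {'PASS' if ok else 'FAIL'}" for name, ok in results
    )
    msg.set_content(f"Test case details:\n{details}")
    return msg

msg = build_regression_email([("SalesSummary", True), ("Inventory", False)])
```

Putting the aggregate statistics in the subject line lets the team triage a regression run at a glance, with per-case details available in the body.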
Key characteristics of the approach:
- Standardized design for all reporting projects
- Platform-independent approach
- Functional extension and customization
- Easy-to-understand design and results
- Exception handling at any stage
- Additional verifications as a side effect of main testing
Technologies used:
- Selenium WebDriver
- Apache POI