Reporting Results

The more carefully you analyze the data before you collect it, the more flexibility you will have in reporting. The suggestions that follow are a sample of reports you might find useful:

  • A list of applications that failed the compatibility tests.
    This report requires follow-up to resolve the problems and retest the applications.

  • For each business unit, the total number of applications for each priority level.

  • For each business unit, the total number of applications that are untested.
    Include the percentage of untested applications. You can use this report to track which groups are keeping up and which are not, and as an incentive for groups that have fallen behind in their testing.
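The per-business-unit counts described above can be produced with a simple aggregation. The following sketch assumes a hypothetical inventory of application records with `business_unit` and `tested` fields; adapt the field names to your own tracking data.

```python
from collections import defaultdict

def untested_report(applications):
    """Per business unit: total apps, untested apps, and percent untested."""
    totals = defaultdict(int)
    untested = defaultdict(int)
    for app in applications:
        totals[app["business_unit"]] += 1
        if not app["tested"]:
            untested[app["business_unit"]] += 1
    return {
        unit: {
            "total": totals[unit],
            "untested": untested[unit],
            "percent_untested": round(100 * untested[unit] / totals[unit], 1),
        }
        for unit in totals
    }

# Hypothetical inventory records for illustration only.
apps = [
    {"name": "Payroll", "business_unit": "Finance", "tested": True},
    {"name": "Ledger", "business_unit": "Finance", "tested": False},
    {"name": "CRM", "business_unit": "Sales", "tested": False},
]
print(untested_report(apps))
```

The same aggregation extends naturally to the per-priority-level report: add a second key (business unit, priority level) when tallying.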

If you report progress by whether applications passed or failed, consider whether you need to show relative progress or actual numbers. You can show relative progress with enhancements such as color schemes or graphics, which give your audience a sense of progress without presenting numbers that can be misleading. If you need to show actual numbers, you might want to weight or group them by the priority of the applications. For example, a report showing only that 10 applications passed and one failed might not present an accurate status: if the 10 that passed were special utilities used intermittently by a few users, and the one that failed was critical to running your day-to-day business, the report would not give a complete picture.
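One way to weight results by priority, as suggested above, is to score each application by a numeric weight for its priority level, so that a failed critical application outweighs many passed low-priority utilities. This is a minimal sketch; the weight values and priority names are illustrative assumptions, not prescribed values.

```python
def weighted_pass_rate(results, weights):
    """Pass rate where each application counts according to its priority weight."""
    total = sum(weights[app["priority"]] for app in results)
    passed = sum(weights[app["priority"]] for app in results if app["passed"])
    return 100 * passed / total

# Illustrative weights: a critical application counts 10x a low-priority utility.
weights = {"critical": 10, "standard": 3, "low": 1}

# The 10-passed, one-failed example: 10 low-priority utilities pass,
# one critical application fails.
results = [{"priority": "low", "passed": True}] * 10 + [
    {"priority": "critical", "passed": False}
]
print(f"{weighted_pass_rate(results, weights):.0f}%")  # 50%, versus 91% unweighted
```

With these sample weights, the scenario that looks like a 91 percent success rate by raw count reports only a 50 percent weighted pass rate, which better reflects the impact of the one critical failure.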

If you have testers post issues at a specific place, such as a Web site, you might want to provide a report of open and closed issues.

Create and distribute reports to management and testing participants after each testing event, and periodically as needed. If you have a Web site for your testing project, you might include the ability to run online reports.