Three test engineers and I analyzed the requirements, tracked issues arising from them, and designed test scenarios and test cases. Three more test engineers joined us during the execution and release phase. Since development and testing were independent business units, the test engineers were typically loaned resources assigned on an availability basis.
We faced the challenge of frequently changing test engineers, on top of everything that comes with testing a data-intensive, high-impact financial auditing application. UI/UX design was also a concern, so we outsourced that work, which simplified our part of the project.
Here is how we used two tools available in our development environment to help us improve productivity:
- SQL Server: Our software accepted input in the form of XML files. We created bulk test data (based on sample data provided by our clients) in SQL Server tables, then used a simple FOR XML clause to generate basic XML files. Minor find-and-replace tweaks turned these into our actual input files (see the T-SQL sketch below). We attached these test data files to each test case in our test case management tool for use during the execution cycle, so engineers joining the team simply had to use this data to verify the expected results.
- SharePoint Server:
We were given a project website on SPS for our internal collaboration, and we created the following lists on this site:
- Test scenarios list: Before we created detailed test cases, we nailed down the top-level test scenarios. This helped us gauge the extent of testing needed and get a feel for the changes being made to the system in each iteration. This list also served as an introductory chapter for new members.
Each item in this list had three owners: a business analyst (BA) owner, a dev owner, and a QA owner. The QA owner was typically the one who created the scenario. The dev owner was the one who would use the scenario during preliminary testing before marking a piece of code as ready-to-test. The BA owner came into the picture if anyone on the team asked for clarification on an item or if the requirements document needed updating because of a scenario.
- Queries list: Queries regarding requirements or behavior were logged and tracked here. Each query had a user to whom it was currently assigned. The test manager monitored aged queries and made sure they were resolved satisfactorily. Often these queries ended up simplifying requirements that were conflicting or ambiguous, or helped improve the technical design. Sometimes a query turned out to be a genuine issue, in which case it was transferred to our defect tracking tool. We made sure to add the test case/defect id from the other system to this list so we could establish traceability.
The advantage SPS provided over .xls sheets was that we did not have to juggle shared workbooks and private copies: all information was kept in one location for everyone to see. Team members also set up alerts on each of these lists to get notified of any changes.
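To make the FOR XML step concrete, here is a minimal T-SQL sketch. The AuditTransactions table and its columns are hypothetical stand-ins; our real tables were built from the sample data our clients provided.

```sql
-- Hypothetical table standing in for the client-derived schema.
CREATE TABLE AuditTransactions (
    TransactionId INT PRIMARY KEY,
    AccountCode   VARCHAR(20),
    Amount        DECIMAL(18, 2),
    PostedOn      DATE
);

-- Bulk test rows (the real data was generated from client samples).
INSERT INTO AuditTransactions VALUES (1, 'ACC-1001', 2500.00, '2008-03-01');
INSERT INTO AuditTransactions VALUES (2, 'ACC-1002', 975.50, '2008-03-02');

-- FOR XML PATH emits one <Transaction> element per row under a
-- single <Transactions> root; TransactionId becomes an attribute.
SELECT
    TransactionId AS '@id',
    AccountCode   AS 'Account',
    Amount        AS 'Amount',
    PostedOn      AS 'PostedOn'
FROM AuditTransactions
FOR XML PATH('Transaction'), ROOT('Transactions');
```

The query returns a single XML fragment, which a few find-and-replace tweaks (root element name, namespaces, and the like) turned into a ready-to-use input file.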
These are just two examples of how we used everyday tools in unusual ways to meet our specific needs.