Weekend Testing Session No.10
Date and Time: 3rd October 2009, 3pm - 5pm IST
Application: Converber v2.1.0
This session was different, just like every other session.
The mission was unique as the testers were not supposed to hunt for bugs.
FLASH NEWS: DO NOT HUNT FOR BUGS!!!
How often do you see that?
Then what was this session about?
The following details were given to the testers:
Context: A client wants Beta Testers for testing Converber. You have to prove to them that you are the best Beta Tester for them.
How will you be judged?
You have to give a list of Test Ideas/Scenarios you would cover if selected as the Beta Tester.
Based on the list of Test Ideas/Scenarios, you'd be selected/rejected as the Beta Tester.
Most of the testers were surprised by the mission and set out to achieve it at 3pm IST sharp.
It was a different experience for me too. I was busy browsing through different articles on the Exploratory Testing approach to identify as many quality criteria as possible.
Twenty minutes had already passed and I had not even launched the application.
Luckily, I found these two documents:
How Do You Spell Testing - James Bach and
Heuristic Test Strategy Model
I found some testers interested in finding bugs in some modules of the application.
Being the facilitator, I reminded them of the mission.
It was challenging for most of the testers.
While some found it hard to put their ideas on paper, others could not resist the urge to hunt bugs.
This particular session went by so quickly that before we knew it, it was already 4pm IST. We stopped generating test ideas!!!
One of the challenges faced by Tejas was not to start testing as soon as an idea popped up in his mind. Dhanasekar echoed Tejas's concern - he too started testing as soon as he came up with any idea.
Rajesh wanted to clarify what a test idea meant and how it is different from a test case. I felt it was too late for this question to come up as the testing session was already over. Rajesh learnt that he could have gathered valuable information had he raised this question at the start of the testing session itself.
Dhanasekar and Tejas were of the opinion that there are too many terminologies to add to the confusion.
We started off with Dhanasekar sharing his experience, challenges and learning from this session. He found it easier to hunt for bugs than to document test ideas and promised to work on that aspect. The major challenge he had to tackle was not to test the test ideas he generated. Being unclear about the mission did not help his cause either.
He got diverted on finding a crash and started investigating it. He realized the importance of questioning, which could have saved him a lot of time.
His biggest learning was to
"FOCUS ON THE MISSION" and he was of the opinion that this exercise would help him present his test ideas better.
Rajesh started off by sharing his experience. His favorite subject being Maths, he loved testing this application as it involved a lot of mathematical calculations. His limited knowledge of different conversions forced him to experiment only with those units he was comfortable with. As he did not ask questions to clarify what a test idea was, he ended up hunting for bugs alongside generating test ideas.
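One classic test idea for a conversion application like Converber is round-trip consistency: converting a value to another unit and back should return (approximately) the original value. Here is a minimal sketch of that idea in Python; the conversion factors are standard definitions, but the `convert` helper and the tolerance are illustrative assumptions, not Converber's actual implementation:

```python
# Round-trip test idea for a unit converter (illustrative sketch).
# The convert() helper and tolerance are assumptions for this example.

FACTORS = {
    ("km", "mi"): 0.621371,   # kilometres to miles
    ("kg", "lb"): 2.204623,   # kilograms to pounds
}

def convert(value, src, dst):
    """Convert value from src unit to dst unit."""
    if (src, dst) == ("c", "f"):          # temperature needs an offset
        return value * 9 / 5 + 32
    if (src, dst) == ("f", "c"):
        return (value - 32) * 5 / 9
    if (src, dst) in FACTORS:
        return value * FACTORS[(src, dst)]
    if (dst, src) in FACTORS:
        return value / FACTORS[(dst, src)]
    raise ValueError(f"unknown conversion {src} -> {dst}")

def round_trip_ok(value, src, dst, tol=1e-6):
    """Test idea: value -> dst -> src should give back value within tolerance."""
    back = convert(convert(value, src, dst), dst, src)
    return abs(back - value) <= tol * max(1.0, abs(value))

# A few sample checks a beta tester might include in a test-idea list:
for value, src, dst in [(100, "km", "mi"), (72.5, "kg", "lb"), (37, "c", "f")]:
    assert round_trip_ok(value, src, dst)
```

The point of such an idea is that it covers every unit pair with one rule, instead of memorizing individual conversion values — useful when, like Rajesh, a tester knows only some of the domains well.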
His biggest learning for the day was
"THE IMPORTANCE OF QUESTIONING". He had read about questioning the stakeholders for more information, and today gave him practical experience of doing exactly that.
Dhanasekar added a valuable point: it is difficult to generate test ideas just by looking at the GUI. I'd say that's another trap: the mission did not specify that the application should not be used. Questioning can help us steer clear of such traps.
The general challenge most of the testers faced was highlighted by Sushant:
...even though we may not want to hunt for bugs, our eyes find them anyway...
Satish re-framed the mission statement:
"The mission is to find ways to identify the bugs"
Vivek was next to present his experiences. He faced difficulty in defining test scenarios and decided to give a broad outline of them instead. When I shared the Heuristic Test Strategy Model link with him, he was happy that being a part of Weekend Testing was expanding his knowledge base.
Sushant tested the application keeping in mind the age group of the audience. He has a habit of testing any application from user-perspective. As he was exploring the application, he found some issues which he could not ignore. He also highlighted how being in an informal environment helped him think and test better.
He was confident that such Weekend sessions would prepare him for the tough environment at office.
Satish brought a different perspective: he searched for failures in the previous releases and modified his test scenarios based on them. Satish concentrated only on generating test ideas, and learnt the application as part of it.
The biggest challenge was the lack of knowledge of the categories in the application.
He had never attempted a beta test before, and this entire exercise proved to be his greatest learning. He stressed that participating in this exercise increased his confidence.
As testers, we have to concentrate on the mission and not on hunting bugs. Many testers find this difficult :)
Tejas had a major challenge: an unclear requirement. Even he fell into the trap of not questioning.
He highlighted the importance of questioning to clarify the requirement.
One more important point which came up in this discussion was that it is OK to fail in front of friends than failing in front of stakeholders.
He promised to pay more attention to recording test ideas in a systematic way.
Next, I had to present my learning and experiences.
I listed the two links which helped me and the quality criteria I concentrated on.
We had a discussion on the difference between Claims Testing and Acceptance Testing.
My learning was to improve my knowledge on the different quality criteria used to test any application.
We had a further discussion on each other's lists of test ideas. Every tester had to justify his list of test ideas and why he should be selected as the BETA TESTER.
The test reports were really interesting and covered a lot of different ideas.
Ajay: "I have taken care of more quality criteria: Functional, Usability, System Configuration, Data, Domain, Performance, Claims Testing and Operating System, and hence increased coverage."
Vivek: "I can cover different versions of different OS. Installation and Functional testing would mean good coverage"
Tejas: "I can discover most of the functional bugs"
Satish: "Functional, Usability and Scenario Testing were my main focus areas."
Rajesh: "I concentrated on Functionality, Usability, Domain, Installation, Upgrade, Performance and Claims Testing"
Finally after the poll, Rajesh was selected as the BETA TESTER.
"It is more important to meet the mission than knowing the terminologies"
Even though Rajesh did not know exactly what a test idea meant, what mattered was that his list of ideas increased coverage.
This session was lively, with an interesting mission, discussions, learning and polling.
Thanks to all the testers. See you all in WT Session No. 11