Something interesting happened when I was testing with my team.
"Amazing!!! I'm surprised how could the programmer fix this issue without any side-effects"
This is a blog where I (Ajay Balamurugadas) write about my experiences with SOFTWARE, testing software, enjoying defects in software and my learnings in software testing :)
So my question is: exploratory testing is any testing to the extent that the tester actively controls the design of the tests as those tests are performed and uses information gained while testing to design new and better tests.
Date: 05th Dec 2009
Time: 2.30pm - 5.30pm IST
Parimala Shankaraiah and myself were ready to start the session.
Yes, two facilitators to organize a testing session with nearly 25 registered testers.
And James Bach is on Skype waiting for any questions from the testers.
WOW... All set and what's this... I can't find Parimala online.
Beep Beep: Message on Mobile:
Powercut at my place
- Parimala
An SOS was sent to Sharath Byregowda, and at 2.25pm IST there was a power cut at my place too.
Oops... Both Parimala and myself rush to the nearest cyber cafe near our home.
Both Parimala and myself login just in time from two different locations.
I started adding the new members for chat while Parimala took over the task of sending the email to those present on time.
Once group chat was initiated, we exchanged roles. I contacted James on twitter.
So with James on standby, Parimala on chat clarifying the questions from testers, I was sending email to every tester who had not yet received the email disclosing the details of the WT 17 session.
Half an hour in, and everything seemed settled.
Slowly questions started cropping up in the discussion.
Initially questions were re-directed to James on twitter.
Later, the email id and Skype id of James were given to testers for direct interaction. Thanks, James, for your precious time.
Literally, two hours just flew. Finally an email thanking the testers and highlighting the deliverables was drafted and kept ready. While testers were busy testing, Parimala was drafting the email and I was adding the new email ids to the list.
At 5pm IST, we sent out the email.
Testing was stopped at 5.30pm IST and reports were trickling in since 5.15pm.
Surprise element: There was a last-minute change in the specifications, and those who asked James about the specification got more details.
Only the testers who asked questions got to know more information...
Questioning: Very important skill of a tester.
Finally, all the reports came in by 5.45pm.
Parimala and myself logged out to have our lunch :)
Thank you Weekend Testers.
See you in next WT session .
Mumbai: 12 Nov 2009
Hyderabad: 16 Nov 2009
Chennai: 13 Nov 2009
Bangalore: 17-18 Nov 2009
Course Description
Rapid testing is a complete methodology designed for today’s testing, in which we’re dealing with complex products, constant change, and turbulent schedules. It's an approach to testing that begins with developing personal skills and extends to the ultimate mission of software testing: lighting the way of the project by evaluating the product. The approach is consistent with, and a follow-on to, many of the concepts and principles introduced in the book Lessons Learned in Software Testing: A Context-Driven Approach by Kaner, Bach, and Pettichord. In this interactive workshop, Michael Bolton, co-author (with James Bach) of the Rapid Software Testing course, introduces testers, managers, developers, and any other interested parties to the philosophy and practice of Rapid Software Testing through lecture, stories, discussions, and “minds-on” exercises that simulate important aspects of real software testing problems.
Contact Details
Bangalore/Hyderabad: Akshay Raj
(M): +91-9845176034
(P): +91-080-41574806/7/9
akshay.r@edistatesting.com, training@edistatesting.com
Chennai: Harsha Bhat
(M): +91-9845098916
harsha.bhat@edistatesting.com
Delhi: Divya Raturi
(M): +91-9871252501
divya.raturi@qaiglobal.com
Mumbai: Kishor Parab
(M): +91-9821251126
kishor.parab@qaiglobal.com
I have registered. When will you register?
Five Lessons Learnt from my last Testing project
1. Every project teaches you something new if you are ready to learn.
I tested a web-based application for the first time and was introduced to terms like IISRESET, HOSTS, PROXY SERVER, RAD GRID, AJAX, HTTPS, DOMAIN NAME and DATABASE SERVER.
2. A tester's role is to meet the mission.
I found 73 bugs in 6 hours and had to close 69 bugs because they were not at all important from the customer's perspective.
Previous Project: Found 710 issues and 685 were fixed. Everyone appreciated me.
This project: Found 73 issues and 4 were fixed. Everyone asked me to concentrate on features which the customer would use.
3. Test on the expected environment.
Tested for two weeks on an environment which was not the environment at the customer end. Now I feel, why did we waste those two weeks?
4. Test the environment first.
We believed a technical person had set up the environment correctly, but the environment was wrong. The first build went to the customer and failed miserably. On testing the environment, we found that a simple mistake meant we had tested on the wrong environment for two months.
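Lesson 4 can be partly automated. As a minimal sketch (the hostname and expected address below are placeholders; a real check would use the application's domain name and the known test-server IP), a script run before each test cycle can confirm that the machine resolves the application's host to the server you think you are testing:

```python
import socket

def check_environment(hostname: str, expected_ip: str) -> bool:
    """Return True only if hostname resolves to the server we intend to test.

    A stale HOSTS-file entry or a wrong DNS/proxy setting shows up here
    before a single test is run against the wrong server.
    """
    try:
        actual_ip = socket.gethostbyname(hostname)
    except socket.gaierror:
        # Hostname does not resolve at all: the environment is broken.
        return False
    return actual_ip == expected_ip

# 'localhost' should always resolve locally; in a real project you would
# check the application's domain against the known test-server address.
print(check_environment("localhost", "127.0.0.1"))
```

Running one such check at the start of a cycle is cheap insurance against repeating a two-month mistake.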
5. Work as one team towards one goal.
It is good to interact with programmers, product managers, tech support, sales manager, QA head, Development Head. Everyone has something new(read different) to contribute.
WOW, I'm very happy that after three years of testing in office, this is my first product release to market.
I'd say worth the wait for three years.
So, when's the next release BOSS? I'm waiting :)
Date: 10th October 2009
Time: 3pm – 5pm IST
Session details:
Notalon:
Download link: Source Forge
Lecture: CSTER
Reference: Wikipedia
Mission: LISTEN to the lecture and Prepare the notes from the lecture.
Deliverable: The notes must be exported to a pdf file using Notalon application.
Weekend Testing Session No.10
Date and Time: 03rd October 2009, 3pm - 5pm IST
Application: Converber v 2.1.0
This session was different just like every other session.
The mission was unique as the testers were not supposed to hunt for bugs.
FLASH NEWS: DO NOT HUNT FOR BUGS!!!
How often do you see that?
Then what was this session about?
Following details were given to testers:
Context: A client wants Beta Testers for testing Converber. You have to prove to them that you are the best Beta Tester for them.
How will you be judged?
You have to give a list of Test Ideas/Scenarios you would cover if selected as the Beta Tester.
Based on the list of Test Ideas/Scenarios, you'd be selected/rejected as the Beta Tester.
Most of the testers were surprised with the mission and set out to achieve the mission at 3pm IST sharp.
It was a different experience for me too. I was busy browsing through different articles on Exploratory Testing Approach to identify most of the quality criteria.
Twenty minutes had already passed, and I had not even launched the application.
Luckily, I found these two documents:
How Do You Spell Testing - James Bach and
Heuristic Test Strategy Model
I found some testers interested in finding bugs in some modules of the application.
Being the facilitator, I reminded them of the mission.
It was challenging for most of the testers.
While some were finding it hard to put ideas to paper, some could not resist the idea of hunting bugs.
This particular session went so quickly that we realized that it was already 4pm IST. We stopped generating test ideas!!!
One of the challenges faced by Tejas was not to start testing as soon as an idea popped up in his mind. Dhanasekar echoed Tejas's concern - he too started testing as soon as he came up with any idea.
Rajesh wanted to clarify what a test idea meant and how it is different from a test case. I felt it was too late for this question to come up as the testing session was already over. Rajesh learnt that he could have gathered valuable information had he raised this question at the start of the testing session itself.
Dhanasekar and Tejas were of the opinion that there are too many terminologies to add to the confusion.
We started off with Dhanasekar sharing his experience, challenges and learning from this session. He found it easier to hunt for bugs than to document test ideas and promised to work on that aspect. The major challenge he had to tackle was not to test the test ideas he generated. Being unclear about the mission did not help his cause either.
He got diverted on finding a crash and started investigating it. He realized the importance of questioning, which could have saved him a lot of time.
His biggest learning was to
"FOCUS ON THE MISSION" and he was of the opinion that this exercise would help him present his test ideas better.
"THE IMPORTANCE OF QUESTIONING". He had read about questioning the stakeholders for more information, and today was his practical experience of questioning the stakeholders.
...even though we may not want to hunt for bugs, our eyes find them anyway...
"The mission is to find ways to identify the bugs"
"PERSEVERANCE"
Date: 26th September 2009
Time 3pm - 5pm IST
Product Tested: SplashUp
Mission: To find Functional bugs in the Splashup application
Last week, we had tested the same application with testers choosing different quality criteria.
This week, we had to test the same application with one quality criteria - "Functionality" as the base.
Testers: Ajay Balamurugadas, Amit Kulkarni, Dhanasekar Subramaniam, Gunjan Sethi, Karan Indra, Parimala Shankaraiah, Poulami Ghosh, Rajesh Iyer and Suja C S.
This was the first session where I was not moderating and only testing the product.
I was happy that I could dedicate more time for testing.
We tested from 3pm to 4pm and started the discussion session at 4pm sharp.
Poulami started off the discussion. This being her first experience with BWT, she used an Exploratory Testing approach to guide her. She wanted to get a feel of the product before she could concentrate on issues in the application.
She found the "Layers" feature interesting enough to continue her focused testing on the Layers and Filters features. Happy with her first BWT experience, she promised to attend more sessions before passing any feedback to the team.
Poulami found the application very user-friendly, though the Auto-Crop feature was not working.
Rajesh was next to describe his experiences. He was interested more in the Sign Up feature of the product. Having created an email address with username of 132 characters length, he was unable to login. Though the email was created successfully, an error message greeted him on Login.
Rajesh and I had a discussion about an error message popping up on the screen if the webcam was not connected. We were not sure whether Flash or the SplashUp application generated the error. While I felt the error was application-specific, Rajesh was of the opinion that it was similar to generic Flash messages.
I was happy that Rajesh enjoyed testing the application. He also felt that this was a good application to test.
Once Rajesh was done with his description, Amit took over. Amit was frustrated that the application was not user-friendly. The absence of help files and the lack of support for other image formats posed a serious question regarding the scope of the application.
One of the highlights of Amit's description was the bug he discovered. Moving the error message out of visible window area made it disappear.
He felt that such bugs were common in similar applications, and they made him wonder if the application was really tested before release.
Someone had to cool Amit's frustration on the product and Dhanasekar took centre stage. Like Poulami, he too was a first timer to BWT. He had no experience of testing any imaging software and hence concentrated on the different file types for the application.
One of the bugs found by Dhanasekar was the "Improper handling of unsupported file formats".
This made me wonder how different people look at the same application in different ways, and how the thought process of each individual varies under the same circumstances.
The only concern he expressed was the lack of prior knowledge of the product being tested. BWT's purpose of letting testers test with less information about the product would otherwise be defeated. The thrill of testing an application when one knows nothing about it is different from testing a known application.
There is less chance of getting biased if one does not know much about an application. Amit was also of the opinion that exploring a product without much information is good, as testers get to learn a lot of new things.
What followed next interested me: Suja's description of her testing approach. After the initial "get to know the product" phase, Suja divided her tests into "Happy Testing" and "Negative Cases".
I feel this is a very narrow way of modelling the application. It was good to see other testers actively participating in the discussion. Even Suja wanted the application to have more documentation to help the user. The experience with BWT was good and she was happy.
Gunjan was next, and her previous experience in testing imaging software helped her. Using an Exploratory approach, she went on different tours of the product. She found some bugs with the Zoom and Filters features. Her logical approach to testing the application was a different experience compared to the last BWT session she attended.
Her only concern was that it took some time to know some features.
Next was my turn. Only testing and no moderating was in itself a different experience for me. This application had lots of bugs and if one is Bug-hungry, I'd recommend this application.
One of the strange bugs I discovered was making the Menu bar disappear. I also learnt about a lot of different bugs.
The purpose of BWT is achieved if a tester goes back with some learning. :)
Amit asked a very important question:
How many of you tried using the application with only the keyboard?
I replied in the negative: if it failed, that would be a usability issue, and the mission was to find functionality issues.
Karan's summary was rocking. He had typed everything in a notepad and just pasted everything at once on his turn.
Following an Exploratory approach to some extent, he felt the application was not user friendly. He was confident that with time, this application could be developed into a full-fledged application.
Parimala - the moderator for the session was the last one to present.
Lack of dedicated time for testing was her main concern.
A new software for her, being a curious tester, she explored and learnt most of it quickly. She tested the Tools section of the application till time permitted.
Overall, the session was good, coupled with strange bugs and discussions about them.
The only concern was: It was fast and discussions were not full-fledged.
We will improve on this next time.
Thanks to all the testers, I learnt some new bugs.
Interested to join us in next session? Email to weekendtesting@gmail.com
See you all in BWT 10.
Till then, ENJOY TESTING :)
Update: Please find the Test Report shared at Scribd.
Friends,
Weekend is nearing and so is our testing session. Please confirm your participation for the "BWT Session No.9"
Date: Saturday 26th September 2009
Time: 3pm – 5pm IST
Please be online on Gmail (visible mode) by 2.30pm IST.
You’d be provided download details.
Testing session: 3pm – 4pm IST
Discussion Time: 4pm – 5pm IST
Please send an email to weekendtesting@gmail.com with the subject “BWT 9 Confirmed Participant”.
We’ll include you for the session once we receive an email.
For more details, contact weekendtesting@gmail.com
20th September 2009, 9pm -11pm IST would be etched in the minds of six testers who got together online to test the ‘Areca Backup’ application. This session marked the eighth session of BWT. It was exactly two months since the concept of ‘Bangalore Weekend Testers’ originated.
List of Participants: Ajay Balamurugadas, Bhargavi, Gunjan, Parimala Shankaraiah, Sudhakar, Vasupratha
Application: Areca Backup
In their own words…
“Areca-Backup is a file backup software that supports incremental, image and delta backup on local drives or FTP servers. Areca-Backup also allows you to browse your backups and navigate among different version of the files contained in your archives.”
All the testers were geared up for the testing session.
The application had been downloaded.
What next?
What about the MISSION?
The mission for this session was as special as the session.
Following mission was given to the testers:
Mission -: You have to choose one of the quality criteria out of the following –
Installability / Usability / Performance / Reliability / Compatibility /Testability.
Choose one quality criterion and stick to it for the entire session while testing the ‘Areca Backup’ application.
Special THANKS to Pradeep who suggested this mission.
Each tester was very enthusiastic on hearing the mission and started their journey of exploring the product in order to find valuable information.
The testing session started at 09.03pm and lasted till 10.03pm IST.
We had the Discussion session soon after the testing session.
Each tester was supposed to reveal the mission they chose, the approach followed during the testing session. The tester had to highlight any challenges they faced and any specific learning from this session. The individual tester’s experience of the BWT 8 session was the icing on the cake :)
Sudhakar, the first tester to send in his Test Report, started off the discussion session.
He was very clear in his mission:
“Find issues” and he chose Usability as the Quality Criteria.
One interesting thing about his whole approach to test this application was his expectation: “The application should guide the user”
With his focus on data validation, Sudhakar was frustrated by the not-so-good validation implemented.
One major challenge, apart from the poor validation, was the time taken to understand the application. Not understanding the product fully prevented him from exploring it to a greater degree. Finally, Sudhakar felt that other than the lack of time, the overall experience of participating in this session was good.
We moved on to Vasupratha’s experience.
Vasupratha echoed Sudhakar’s concern about the lack of time. Usability was the quality criteria once again. Vasupratha felt that additional time for testing would have helped in better exploration of the product.
Next turn was Parimala’s.
A different Quality Criteria: Installation was chosen.
The mission set by her was straight forward:
“To test the installability of Areca 7.1.5”
Following an Exploratory approach, Parimala gave a lot of valuable information in terms of bugs. As the number of installation steps was minimal, Parimala did not face a lot of challenges.
At the same time a particular intermittent bug was playing hide and seek with her. :)
Parimala learnt some new scenarios to test once she took up Installation as the criteria. The new learning (New Scenarios) helped her do a round of Scenario Testing.
With this being a good experience, she wanted to do some functional testing in the near future. :)
Gunjan was ready to share her experiences.
Her mission was decided more because of the circumstances than her choice.
Usability was her first choice, but the application, when launched, gave an error about a missing .dll file. So Gunjan shifted her focus from Usability to Installability, as she had to uninstall and reinstall the application.
With an exploratory approach to her rescue, Gunjan delved deep into issues in installation and uninstallation. Some interesting issues greeted Gunjan even though a System Restore was tried to get the application working.
The help file was one of the sources of information from which she tried out scenarios. Her biggest learning was to ensure the system is in the correct condition before testing any application.
Gunjan being a first timer to BWT enjoyed herself and found it interesting to think “OUT OF THE BOX”. This was the first time she tested any software out of her office work.
Now, it was the turn of Bhargavi.
Bhargavi’s mission focused on finding problems with Performance as the Quality Criteria.
Following an Exploratory approach, Bhargavi faced many challenges.
The major challenge was the difficulty in understanding the features and knowing where to start and how to start modeling the application.
Some bugs pertaining to other quality criteria slowed down Bhargavi’s progress.
She had her share of learning too. As she took the “Performance” quality criteria which she hadn’t tested before, she learnt new ideas to test. This boosted her confidence. Bhargavi enjoyed testing the product with a different perspective – Focus on only one quality criteria.
Her tests forced 100% CPU usage as well as low disk space.
The mission taught Bhargavi, who habitually concentrated on all quality criteria, to focus on one particular quality criterion. Parimala added a point about how testers find interesting and intriguing issues when the focus is on a small part of the application.
Finally, it was my turn.
I chose “Performance” as the quality criteria for a simple reason: I had never tested for Performance alone before. I too followed an Exploratory approach, with a toolkit consisting of Process Explorer, Windows Task Manager, MS Excel, Notepad, WebEx Recorder and the Date and Time Properties window.
The biggest challenge for me was to learn the product quickly.
Help file helped me to some extent.
Once I understood how to backup, I started with 23GB folder and that was my biggest mistake of the day. :(
Expecting backup software to handle 23GB of data and back it up within 15 minutes was very foolish on my part. I ended up spending fifteen minutes watching the progress bar of the backup process.
On trying with a 4MB file, backup process completed within a matter of few seconds.
I glanced through the report which was generated after backup. A bug in the report took away my precious ten minutes.
The biggest learning I had out of this exercise was to prepare test data while the system was being modeled. Also, having unrealistic goals (read: a 23GB folder) does not help the cause.
Later, I tried with 30MB, 60MB and 90MB folder to monitor the performance. But it was almost the end of testing session.
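Timing runs like these can be scripted so that each data size gets an identical, repeatable measurement. A minimal sketch in Python, using a plain zip archive as a stand-in for Areca's backup step (folder names and sizes are illustrative, scaled down so the script runs quickly):

```python
import os
import shutil
import tempfile
import time

def make_test_folder(root: str, size_bytes: int) -> str:
    """Create a folder containing one file of the given size (test data)."""
    folder = os.path.join(root, f"data_{size_bytes}")
    os.makedirs(folder)
    with open(os.path.join(folder, "payload.bin"), "wb") as f:
        f.write(os.urandom(size_bytes))
    return folder

def timed_backup(folder: str, dest_base: str) -> float:
    """Archive the folder (stand-in for a backup run); return elapsed seconds."""
    start = time.perf_counter()
    shutil.make_archive(dest_base, "zip", folder)
    return time.perf_counter() - start

with tempfile.TemporaryDirectory() as tmp:
    # Scaled-down stand-ins for the 30MB / 60MB / 90MB folders.
    for size in (1_000_000, 2_000_000, 3_000_000):
        src = make_test_folder(tmp, size)
        elapsed = timed_backup(src, os.path.join(tmp, f"backup_{size}"))
        print(f"{size} bytes -> {elapsed:.3f}s")
```

Preparing the graded test folders up front, as the lesson above suggests, means the timing loop itself takes seconds instead of eating into the session.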
Bharath highlighted the value www.testersdesk.com added in testing the Performance criteria.
Experience was good as it marked the successful completion of two months of Weekend Testing.
Every BWT session gave me a different experience.
Right from BWT 1, where Parimala and I tested www.vischeck.com, to BWT 8, every session has been a unique, thought-provoking learning experience.
I’d like to thank all the testers, Pradeep and the BWT members: Manoj, Parimala and Sharath for their continuous support and hard work.
Looking forward to BWT 9: A new product, new testers and a new experience.
See you there. :)
This is a post highlighting the conversation I had with one of my programmers.
Programmer (P): Hi Ajay! I need your help in reproducing the defect #abcdef
Defect #abcdef:
Step 1: ...
Step 2: Enter a value 50 in the text field.
Step3: ...
Ajay (A): Sure, How can I help you?
P: I'm unable to reproduce the defect.
A: Which OS are you trying it on? I had logged it on Win 2003.
P: Yes, I know that. It would be easier if I could reproduce that on Win XP.
A: (he he smiles) OK, I'll reproduce the defect first on Win 2003 and then we could try on Win XP also.
P: OK, great.
A: Step1, Step2, Step3 and here it is, REPRODUCIBLE!!!
P: Ok, let me try it. Step1, Step2 and Step3 and ??? Where's the defect?
A: Oh!!! Let me try again. Step1, Step2 and Step3 and again REPRODUCIBLE!!!
P: (Smiles) Step1, Step2 and Step3 : NOT REPRODUCIBLE
Silence for few seconds.
A: (thinking what could be different) Hmmm, maybe the speed with which I execute the steps is different from your speed of execution.
P: Maybe. Very little chance of that happening.
A: OK, let me try consecutive times. Step1, 2, 3: REPRODUCIBLE. Step1, 2, 3: REPRODUCIBLE.
P: Oh!!! How do YOU reproduce that Man!!! See, Step1, 2, 3: NOT REPRODUCIBLE.
P: Maybe you are pressing 'Enter' after entering the number.
A: Hmmm, I'm not pressing Enter key.
P: Then maybe single click causes this problem. Or maybe double click to select the field before entering the number.
A: (thinking WOW, so many factors!!! Let him go on)
P: Or this might happen if you select the text by 'Click and Drag' using mouse
P: OK, I'll look into this issue. Thanks for your time.
---------- END OF CONVERSATION ---------
This particular conversation refreshed my memory of how many different factors affect a single entry in a text field.
> Operating System
> Response time
> If the focus is on the field or not
> After entering the number, did the user press Enter or Tab?
> Did the user double click on the field to select the default text?
> Did the user delete the text before typing in the new text?
> Did the user press 'Delete' or 'Backspace'?
I'm sure there are many more factors related to the single entry in the text field.
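A list like this is the raw material for a combination table. A minimal sketch (the factor values are illustrative, taken from the list above; a real matrix would likely use pairwise selection rather than the full product):

```python
from itertools import product

# Factors affecting a single entry in a text field (from the list above).
# The values for each factor are illustrative, not exhaustive.
factors = {
    "os": ["Win 2003", "Win XP"],
    "focus": ["field focused", "field not focused"],
    "commit_key": ["Enter", "Tab", "none"],
    "selection": ["double-click select", "click-and-drag select", "no selection"],
    "clear_method": ["Delete", "Backspace", "typed over"],
}

# Full cartesian product: each tuple is one distinct test condition.
conditions = list(product(*factors.values()))
print(len(conditions))  # 2 * 2 * 3 * 3 * 3 = 108 combinations
```

Even this toy list yields 108 conditions, which is exactly why a conversation with the programmer about which factors actually matter is so valuable.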
The point I want to highlight here is "How useful is it to have a conversation with a programmer about the product?"
In my case, it was useful. Have you experienced such an interaction?
Do let me know.
Please feel free to share such experiences(Good & Bad).
Friends,
Weekend is nearing and so is our testing session.
BWT Session No.8
Date: Sunday 20th Sep 2009
Time: 9pm – 11pm IST
Please be online on Gmail (visible mode) by 8.30pm IST. You’d be provided download details.
Testing session: 9pm – 10pm IST
Discussion Time: 10pm – 11pm IST
Please send an email to weekendtesting@gmail.com with the subject
“BWT 8 Confirmed Participant”.
We’ll include you for the session once we receive an email.
For more details, contact weekendtesting@gmail.com
Friends,
Weekend is nearing and so is our testing session.
BWT Session No.7
Date: Sunday 13th Sep 2009
Time: 5pm – 7pm IST
Please be online on Gmail (visible mode) by 4.30pm IST. You’d be provided download details.
Testing session: 5 – 6pm IST
Discussion Time: 6 – 7pm IST
Please send an email to weekendtesting@gmail.com with the subject
“BWT 7 Confirmed Participant”.
We’ll include you for the session once we receive an email.
For more details, contact weekendtesting@gmail.com
This is a Silent Post to all the Silent Readers of my Blog.
A silent post which is not supposed to say anything but just the word "Thanks"
How many times have you enjoyed a software while testing it?
One such occasion was in the Bangalore Weekend Testers Session held on Saturday, the 5th of September 2009 3pm - 5pm IST.
Twelve testers agreed to test TuxPaint, a free, award-winning drawing program for children ages 3 to 12. It combines an easy-to-use interface, fun sound effects, and an encouraging cartoon mascot who guides children as they use the program.
WOW!!! The kid in every tester came out and literally played with the software, such that 60 bugs surfaced, carefully noted down by the testers at work.
Amazing to know that if you enjoy doing something, you can excel at it too!!!
Please do treat yourself to this software, available at www.tuxpaint.org.
If you want to know the issues beforehand, check out the list of bugs found by us here.
Thanks to Bill Kendrick who permitted us to test and publish the report.
I'm sorry if you missed this week's session too but you have a chance next week.
If you want to register for next week, email to weekendtesting@gmail.com and watch out this space.
Friends,
Next weekend is nearing and so is our testing session.
BWT Session No.6
Date: Saturday 5th Sep 2009
Time: 3pm – 5pm IST
Please be online on Gmail (visible mode) by 2.30pm IST. You’d be provided download details.
Testing session: 3 – 4pm IST
Discussion Time: 4 – 5pm IST
Please send an email to weekendtesting@gmail.com with the subject
“BWT6 Confirmed Participant”.
We’ll include you for the session once we receive an email.
For more details, contact weekendtesting@gmail.com
I’m very happy to have participated in five consecutive BWT Sessions.
Thanks to all the members for their active participation.
The 5th BWT Session was on 30th August 2009 from 9.30pm to 11.30pm IST.
After a hectic day of writing my MS exams, it was time for the BWT Session.
I logged in at 8.30pm IST to find Amit and Anup online.
Slowly as the clock ticked 9.15pm IST, members started joining.
Finally at 9.30pm IST, we were a group of seven testers ready to prove a point.
“Every product has bugs even if it’s a Google Product”
Application to Test: Google Calendar
Testing Session: 9.30pm to 10.30pm IST
Discussion Session: 10.30pm to 11.30pm IST.
From the learning perspective, it was challenging for me even though I could find bugs in the product.
After the testing session, discussions were good, heated and interesting.
Members discussed their plan of attack, their learning experiences and their feel for the product, rated the product, and finally submitted their reports.
Some of the questions out of the discussion:
1. Why do some people assume that if it’s a Google product, it should not have bugs?
2. Should a product encourage easy learning? Is that an issue if it doesn’t?
3. Should products be compared during testing? To what extent that comparison must be done?
Expecting all the questions here???
No way am I going to list out all the questions, answers, comments and opinions here.
If you are interested to join us, email to weekendtesting@gmail.com
Please find the Test Report shared at Scribd and a question at TestRepublic.
Hope to see you in the next weekend testing session. :)
Friends,
Next weekend is nearing and so is our testing session.
BWT Session No.5
Date: Sunday 30th Aug 2009
Time: 09.30 – 11.30pm IST
Please be online on Gmail (visible mode) by 9pm IST. You’d be provided download details.
Testing session: 09.30 – 10.30pm IST
Discussion Time: 10.30 – 11.30pm IST
Please send an email to weekendtesting@gmail.com with the subject
“BWT5 Confirmed Participant”.
We’ll include you for the session once we receive an email.
For more details, contact weekendtesting@gmail.com
After a nice lunch, I started pinging others to add to the Group chat.
Once every registered member confirmed their presence, we started this week's session: Bangalore Weekend Testing Session No. 4.
Application: Freemind 0.9.0_RC5
Testing was done on Windows XP SP3 and Windows Vista 32 bit.
Testers: Ajay, Anil, Parimala, Rajesh, Ravisuriya and Vivek.
Time: 3.30pm - 6.00pm IST.
As promised, the test report is an improvement. You can check that out for yourself. We followed a common template this time.
Discussion was better than last time. Instead of telling the bugs in a round-robin manner, we discussed what we felt, what we tested, why we tested, what we learnt. Problems faced, challenges, tools used, questions and ideas were exchanged. And we had a lot of fun discussing.
Some of the questions which came up in the discussion:
1. Should a tester learn the product to find bugs? Follow this question on Test Republic here.
2. How does working without pressure bring out the best in some testers?
3. Should we test the full application bit by bit or any one feature fully?
And lots more...
Please find the list of issues shared at Scribd.
It is OK to make mistakes, but repeating the same mistake becomes a SIN. :)
If you want to enjoy the BWT's 5th session, watch out for this space and register by sending an email to weekendtesting@gmail.com
Friends,
Next weekend is ready and so is our testing session.
BWT Session No.4
Date: Saturday 22nd Aug 2009
Time: 03.30 – 05.30pm IST
Please be online on Gmail (visible mode) by 3pm IST. You’d be provided download details.
Testing session: 03.30 – 04.30pm IST
Discussion Time: 04.30 – 05.30pm IST
Please send an email to weekendtesting@gmail.com with the subject
“Confirmed Participant”.
We’ll include you for the session once we receive an email.
For more details, contact weekendtesting@gmail.com
Will it work?
Will it be good?
Will it be enjoyable?
Can we manage?
Will everyone benefit out of it?
Will everyone have fun out of it?
Will it be a learning experience?
Will everyone agree to our motto?
Will there be heated arguments?
Will everyone come on time?
How many bugs will we find?
How long will the session be?
How will we coordinate?
Do we need more than one application?
What if we face any distractions?
What if there is a power cut?
Ufff, all these questions were answered once the Bangalore Weekend Testing Session started.
Everyone came on time and wow what a session we had!!!!
Date: 15th August 2009
Session started at 9.30pm IST and ended at 11.30pm IST.
Testing session: 9.30pm to 10.30pm
Discussion Session: 10.30pm to 11.30pm
Every member participated actively and bugs flowed (literally), so much so that the discussion time was extended by half an hour, from 11.00pm to 11.30pm IST. :)
I'll not hide the report anymore.
Please find the report shared at Scribd.
Most Important:
We promise to improve our bug reporting skills along with suitable screenshots.
Happy Weekend Testing!!!
Friends,
Get ready for the next session of Weekend Testing.
Date: Aug 15th 2009
Time: 09.30pm IST (4pm GMT)
If you would like to join the session or need more details,
please send an email to weekendtesting@gmail.com.
Hope to see you there.
-Bangalore Weekend Testers
Testers involved:
Ajay Balamurugadas, Sharath Byregowda and M V Manoj
Website Tested:
www.tinyurl.com
Mission:
To find bugs.
Start Time:
Aug 09 2009, 0022hrs IST
End Time:
Aug 09 2009, 0145hrs IST
About the website:
TinyURL is a web service that provides short aliases for redirection of long URLs. Kevin Gilbertson, a web developer, launched the service in January 2002 so that he would be able to link directly to newsgroup postings which frequently had long and cumbersome addresses.
Our Testing Session highlights:
The entire session was conducted over Group Chat on Gmail.
We started our search for a software to test around 1145hrs on Aug 08th 2009.
The search continued for an hour with Eyeos, Barcode4J and Piwik grabbing our attention.
Lack of prerequisites for the above applications forced us to test TinyURL.
Bugs were communicated to the group as and when they were found.
Our tests built on what we learnt from each other's tests.
Sharath's tests focussed on the security issues. Wish we had a proxy network set up; he'd have loved to test behind one.
Manoj's tests focussed on the usability and Custom Alias feature. He highlighted the disadvantages of using Custom alias feature.
I focussed on the general functionality of the website.
It was fun testing coupled with good learning for me.
Though the testing session lasted just over an hour, the lessons gained will stay with us for a long time. Thanks to Sharath and Manoj for their determination and passion at odd hours on a Sunday.
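For anyone curious about the kind of general-functionality checks I ran, here is a hypothetical sketch of how such boundary-value inputs for a URL-shortening service could be generated. The function name, base URL and the specific cases below are my illustration, not a record of the exact inputs used in the session; the actual testing was done by hand.

```python
def edge_case_urls(base="http://example.com/"):
    """Build a handful of boundary-value inputs for a URL-shortening service."""
    cases = [
        base,                                # plain short URL
        base + "a" * 2000,                   # very long path
        base + "path with spaces",           # unencoded spaces
        base + "query?a=1&b=2#fragment",     # query string and fragment
        base + "%E0%A4%A8%E0%A4%AE",         # percent-encoded Unicode
        "ftp://example.com/file",            # non-HTTP scheme
        "not-a-url-at-all",                  # invalid input
    ]
    return cases

# Print each candidate input, truncated for readability.
for url in edge_case_urls():
    print(len(url), url[:60])
```

Each input probes a different question: does the service accept it, reject it gracefully, or shorten it into a link that breaks on redirection?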
Please find the testing report shared at Scribd.
If you would like to participate in such testing on weekends, drop an email to me at :
weekendtesting@gmail.com
Date: 01 August 2009 Saturday
Do you like fame?
Last week, I was lucky to have captured this image live.
This is about a traffic signal: not the movie, but the signalling device used to control traffic. On one of the main roads near my residence, the traffic signal displayed one of those rare-to-be-seen visuals.
As seen in the image, this seems to be a signal to guide the pedestrian.
I would expect only one image to be displayed at this traffic signal.
Either the
Congratulations !!!
The ‘Hands on software testing training’ has brought in a bold change.
Are you brave enough to appreciate these ‘testers’?
Do you still believe that ‘training through slides’ is better than ‘hands on training’?
Then read no further…
The results might shock you.
The Test EXPERIENCE Reports are out and available at http://testertested.qualityfrog.com/erpstt.pdf
How do you feel?
Don’t you think we need to support this unique school of software testing?
How can I support this initiative?
Send an email to isupport@etifinishingschool.com with
-Your Name
-Your Designation
-Your Organization name
-Your Web Address
Friends, there is never a wrong time to do the right thing!!!
Who knows, a few years down the line, this may be the turning point in the history of software testing.
Be proud to extend your full-fledged support.
Thank you for your valuable time.