
Monday, August 9, 2010

Interesting Virus and funny bug investigation

Cool, the last post was not that bad compared to a typical blog post from the web. In terms of format, the last paragraph had 2-3 extra line breaks. Hopefully, I won't repeat the same mistake in this post.

Today I reached the office a few minutes late. Two of my team members had already started working on the build released on Friday.

Incident No.1

I heard one of the two team members talk about a VIRUS. At first I thought she was talking about the VIRUS character from the '3 Idiots' movie. On seeing the systems guy, I understood that it was the other VIRUS - the one we worry about. The network cables were disconnected and the systems guy was busy checking the security updates, patches and other vital information.
Hmmm, just when I thought 'one resource down' for the day, the systems guy started laughing loudly and my colleague was smiling. I wondered what had happened and what was so funny. The systems guy left, and I went to my colleague to know more about the incident.

# My colleague had called the systems guy. Her exact words were: 'A pop up says that there are 7 virus detected and the xxxxxx antivirus has not detected the virus. Please come fast. I've disconnected the cables.'
# How could this happen? Why did the antivirus not detect the virus? How did the virus breach the antivirus barrier? How risky was this virus? How many files and computers were affected?
# The simple reason: it was NOT a virus. It was one of those funny ads on a website that try to distract the user and install some junk toolbar on the browser.

No wonder the systems guy was laughing so loudly.

Learning:
1. My colleague was so focused on the application under test that she failed to look at the bigger picture. Is this an example of 'Inattentional Blindness' or the 'Lack of DeFocus principle'?
2. She could have investigated a bit more before calling the systems guy.
3. She could have called for help within the team.

Incident No.2:
The colleague next to me had to reproduce a customer issue for the programming team.
What was the scenario? Let me describe it. Our software is used to print a photo along with a footer containing details of the photo. Around 4-6 lines were printed as the footer, with the first line reserved for the title of the photo.
What was the issue:
The first line of the footer was not printed completely in Japanese. The programmer was not able to reproduce the issue, and the customer had attached a screenshot of the problem as a PDF file. The PDF clearly highlighted how the first line of the footer was not printed completely.

My colleague's approach:
As he was not familiar with Japanese, he wanted to reproduce the issue in English. He printed the PDF and found that the first line of the footer was not printed completely. He emailed the programmer saying that the issue was reproducible.
Five minutes later, the programmer was in my colleague's cubicle and what came out of the small discussion was a bit funny.

My colleague had printed the PDF without enabling the footer setting. As the PDF contained the screenshot of the issue, my colleague thought that the footer itself was not printed correctly. :)

Learning:
1. Carelessness or lack of focus?
2. Pressure to reproduce an issue.
3. Importance of bug investigation skills.

Two incidents in one day... Let's see in the next blog post if there are any other interesting experiences.


Tuesday, July 6, 2010

Answer to a Testing Challenge

My answer to the challenge by Michael Alexander


Here is the challenge:

I'll describe my approach after Michael describes his. I do not want to spoil the fun.

My answer:





Update: I beat my own score ;)


Saturday, October 10, 2009

Are you a good note taker? Experience Report of WT 11


Date: 10th October 2009
Time: 3pm – 5pm IST

Session details:
Notalon:
Download link: Source Forge
Lecture: CSTER
Reference: Wikipedia

Mission: LISTEN to the lecture and prepare notes from it.
Deliverable: The notes must be exported to a PDF file using the Notalon application.

The best note-taker would be judged on the following parameters:
1. Content - coverage of the lecture content.
2. Easy-to-read notes.
3. Good usage of Notalon features.

Duration: 1 hour.

We started the testing session at 3pm sharp.
The video, being only six minutes long, allowed the testers to play it again and again.

Some testers were not clear about the mission, and they asked questions until they had enough information to meet it. It was good to see testers question. Questioning is a very important skill, and such exercises help testers improve it.

While some testers had questions related to the mission, others had questions totally irrelevant to it. Focusing on the mission is important; sometimes testers can get distracted by other interesting aspects of the testing activity.
Once all the participants emailed their report, the entire list of reports was then sent to everyone.

About the session, Sushant summed it up in one sentence:
“Testers had to take help of all the senses”.

Poulami started the discussion. She was very new to these kinds of exercises. The entire mission of listening to a lecture and taking notes with the help of a new application was in itself a big challenge for her. She took up this challenge and enjoyed multitasking.

She felt such exercises would help hone her multitasking skills. One more important point highlighted by Poulami was the increased concentration needed to take notes. When asked if she would have concentrated so much had she been listening to the lecture alone, she replied in the negative. She learnt the importance of being detail-oriented while also looking at the bigger picture.

Sushant was one of those testers who thoroughly enjoyed this session. He felt that such exercises would help testers improve their listening skills and their ability to comprehend a lecture. He was happy that such exercises break the monotony of office activity. He highlighted the importance of filtering out the most important points in a lecture and reducing them to a couple of statements.

Next, Karan described his experience. As he was not clear about the mission, he asked questions to get a clear idea of it. He gave more importance to listening to the lecture and just jotted down what was present in the slides. According to him, these kinds of exercises help beat boredom. An interesting point in his description was that this exercise increased his self-confidence.

It is always good to know that such testing sessions help testers who are so busy testing that they do not learn anything new.
Karan felt that his biggest learning was to be smart in doing things and not be intimidated by the mission or the application.

Bhargavi was next and shared an interesting point. She was trying to understand the relation between the three links: the application, the lecture and the reference. The only challenge for her was managing the entire activity, and being late to the session did not help her cause. She promised to try this exercise again and share her experiences.

She liked the mission for this session and was of the opinion that such exercises would definitely make a difference in testers' careers. Her learning was to join the session on time :) She appreciated being able to learn and use the Notalon application quickly.

Most of the testers promised to use the Notalon application instead of Notepad. The feature to export to PDF is cool. We also discussed whether anyone had used the Preferences menu to change the PDF settings.

It was my turn to share my experiences. I felt very happy completing this whole exercise. I made use of the borders and fonts features to improve the overall look of the PDF document. My approach was to pay attention to the video and simultaneously take notes in one shot, without pausing the video.

Once I finished one round of the video, I played it again and again to hunt for missing ideas/words. Playing the video five times helped me frame my notes better. Please find my test report here.

I felt that if we made our own notes for all such testing videos,
we could learn a lot more than by just listening to them.
Bhargavi and Karan agreed on that point.

And regarding best note taker of the day, Sushant won the title with his excellent summary of the entire lecture in addition to detailed notes.

It was a happy learning session, and everyone is looking forward to WT12.
Meet you next weekend :)


Sunday, October 4, 2009

Do not find Bugs but Generate Test Ideas :) WT session 10

Weekend Testing Session No.10



Date and Time: 03rd October 2009, 3pm - 5pm IST
Application: Converber v 2.1.0

This session was different just like every other session.
The mission was unique as the testers were not supposed to hunt for bugs.

FLASH NEWS: DO NOT HUNT FOR BUGS!!!
How often do you see that?

Then what was this session about?

Following details were given to testers:

Context: A client wants Beta Testers for testing Converber. You have to prove to them that you are the best Beta Tester for the job.
How will you be judged?
You have to give a list of Test Ideas/Scenarios you would cover if selected as the Beta Tester.

Based on the list of Test Ideas/Scenarios, you'd be selected/rejected as the Beta Tester.


Most of the testers were surprised with the mission and set out to achieve the mission at 3pm IST sharp.

It was a different experience for me too. I was busy browsing through different articles on Exploratory Testing Approach to identify most of the quality criteria.

Twenty minutes in, and I had not even launched the application.

Luckily, I found these two documents:
How Do You Spell Testing - James Bach and

Heuristic Test Strategy Model

I found some testers interested in finding bugs in some modules of the application.
Being the facilitator, I reminded them of the mission.

It was challenging for most of the testers.
While some found it hard to put ideas on paper, others could not resist the urge to hunt for bugs.

This particular session went by so quickly that before we realized it, it was already 4pm IST. We stopped generating test ideas!!!

One of the challenges Tejas faced was not starting to test as soon as an idea popped into his mind. Dhanasekar echoed Tejas's concern - he too started testing as soon as he came up with an idea.

Rajesh wanted to clarify what a test idea meant and how it differs from a test case. I felt it was too late for this question, as the testing session was already over. Rajesh learnt that he could have gathered valuable information had he raised it at the start of the session.

Dhanasekar and Tejas were of the opinion that there are too many terminologies to add to the confusion.

We started off with Dhanasekar sharing his experience, challenges and learning from this session. He found it easier to hunt for bugs than to document test ideas, and promised to work on that aspect. The major challenge he had to tackle was not testing the test ideas he generated. Being unclear about the mission did not help his cause either.

He got diverted on finding a crash and started investigating it. He realized the importance of questioning, which could have saved him a lot of time.

His biggest learning was to

"FOCUS ON THE MISSION"
and he was of the opinion that this exercise would help him present his test ideas better.

Rajesh started off by sharing his experience. Maths being his favorite subject, he loved testing this application as it involved a lot of mathematical calculations. His limited knowledge of different conversions forced him to experiment only with those units he was comfortable with. As he did not ask questions to clarify what a test idea was, he was hunting for bugs alongside the task of generating test ideas.

His biggest learning for the day was
"THE IMPORTANCE OF QUESTIONING"
He had read about questioning the stakeholders for more information, and today he got practical experience of doing so.

Dhanasekar added a valuable point: it is difficult to generate test ideas just by looking at the GUI. I'd say that's another trap: the mission did not specify that the application should not be used. Questioning can help us clear such traps.

The general challenge most of the testers faced was highlighted by Sushant:
...even though we may not want to hunt for bugs, but eyes find them out...

Sathish re-framed the mission statement:
"The mission is to find ways to identify the bugs"


Vivek was next to present his experiences. He faced difficulty in defining test scenarios and decided to give a broad outline of them. When I shared this link with him: Heuristic Test Strategy Model, he was happy that being part of Weekend Testing increases his knowledge base.

Sushant tested the application keeping in mind the age group of its audience. He has a habit of testing every application from the user's perspective. As he was exploring the application, he found some issues he could not ignore. He also highlighted how being in an informal environment helped him think and test better.

He was confident that such Weekend sessions would prepare him for the tough environment at office.

Satish brought with him a different perspective: he searched for failures in previous releases and modified his test scenarios based on them. Satish concentrated only on generating test ideas, and as part of that, he learnt the application.

The biggest challenge was his lack of knowledge of the categories in the application.
He had never attempted a beta test, and this entire exercise itself proved to be the greatest learning. He stressed that participating in this exercise increased his confidence.

As testers, we have to concentrate on the mission and not on hunting bugs. Many testers find this difficult :)

Tejas had a major challenge: an unclear requirement. He too fell into the trap of not questioning.

He highlighted the importance of
"PERSEVERANCE"

One more important point that came up in this discussion was that it is better to fail in front of friends than in front of stakeholders.

He resolved to give more attention to recording test ideas in a systematic way.

Next, I had to present my learning and experiences.

I listed the two links which helped me and the quality criteria I concentrated on.
We had a discussion on the difference between Claims Testing and Acceptance Testing.
My learning was to improve my knowledge on the different quality criteria used to test any application.

We had a further discussion on each other's lists of test ideas. Every tester had to justify his list and why he should be selected as the BETA TESTER.

The test reports were really interesting and covered a lot of different ideas.

Ajay: "I have taken care of more quality criteria : Functional, Usability, System Configuration, Data, Domain, Performance, Claims Testing and Operating System and hence increased coverage."
Vivek: "I can cover different versions of different OS. Installation and Functional testing would mean good coverage"
Tejas: "I can discover most of the functional bugs"
Satish: "Functional, Usability and Scenario Testing were my main focus areas."
Rajesh: "I concentrated on Functionality, Usability, Domain, Installation, Upgrade, Performance and Claims Testing"

Finally after the poll, Rajesh was selected as the BETA TESTER.

Congratulations Rajesh!!!

"It is more important to meet the mission than knowing the terminologies"
Even though Rajesh did not understand what a test idea meant, what mattered was that his list of ideas increased coverage.

This session was lively, with an interesting mission, discussions, learning and polling.

Thanks to all the testers. See you all in WT Session No. 11


Sunday, September 27, 2009

One Mission - Many Approaches: BWT 9 Experience Report


Date: 26th September 2009
Time 3pm - 5pm IST
Product Tested: SplashUp

Mission: To find Functional bugs in the Splashup application

Last week, we had tested the same application, with testers choosing different quality criteria.
This week, we had to test the same application with one quality criterion - "Functionality" - as the base.

Testers: Ajay Balamurugadas, Amit Kulkarni, Dhanasekar Subramaniam, Gunjan Sethi, Karan Indra, Parimala Shankaraiah, Poulami Ghosh, Rajesh Iyer and Suja C S.

This was the first session where I was not moderating and only testing the product.
I was happy that I could dedicate more time for testing.

We tested from 3pm to 4pm and started the discussion session at 4pm sharp.

Poulami started off the discussion. This being her first experience with BWT, she used an exploratory testing approach to guide her. She wanted to get a feel for the product before concentrating on issues in the application.

She found the "Layers" feature interesting enough to continue her focused testing on the Layers and Filters features. Happy with her first BWT experience, she promised to attend more sessions before passing any feedback to the team.

Poulami found the application very user-friendly, though the Auto-Crop feature was not working.

Rajesh was next to describe his experiences. He was more interested in the Sign Up feature of the product. Having created an email address with a username 132 characters long, he was unable to log in. Though the email address was created successfully, an error message greeted him on login.

Rajesh and I had a discussion about an error message popping up on the screen when a webcam was not connected. We were not sure whether Flash or the SplashUp application generated the error. While I felt that the error was application-specific, Rajesh was of the opinion that it was similar to generic Flash messages.

I was happy that Rajesh enjoyed testing the application. He also felt that this was a good application to test.

Once Rajesh was done with his description, Amit took over. Amit was frustrated with the application being non-user-friendly. The absence of help files and the lack of support for other image formats posed a serious question regarding the scope of the application.

One of the highlights of Amit's description was the bug he discovered: moving the error message out of the visible window area made it disappear.
He felt that such bugs were common in similar applications and made him wonder whether the application was really tested before release.

Someone had to cool Amit's frustration with the product, and Dhanasekar took centre stage. Like Poulami, he too was a first-timer at BWT. He had no experience testing any imaging software and hence concentrated on the different file types for the application.

One of the bugs found by Dhanasekar was the "Improper handling of unsupported file formats".

This made me wonder how different people look at the same application in different ways, and how the thought process of each individual varies under the same circumstances.

The only concern he expressed was the lack of prior knowledge of the product being tested, though providing that knowledge would defeat BWT's purpose of letting testers test with little information about the product. The thrill of testing an application one knows nothing about is quite different from testing a known application.

There is less chance of getting biased if one does not have much information about an application. Amit was also of the opinion that exploring a product without much information is good, as testers get to learn a lot of new things.

What followed next interested me: Suja's description of her testing approach. After the initial "get to know the product" session, Suja divided her tests into "Happy Testing" and "Negative Cases".

I feel this is a very narrow way of modelling the application. It was good to see the other testers actively participating in the discussion. Suja too wanted the application to have more documentation to help the user. Her experience with BWT was good and she was happy.

Gunjan was next, and her previous experience in testing imaging software helped her. Using an exploratory approach, she went on different tours of the product. She found some bugs in the Zoom and Filters features. Her logical approach to testing the application was a different experience compared to the last BWT session she attended.

Her only concern was that it took some time to learn some of the features.

Next was my turn. Only testing and no moderating was in itself a different experience for me. This application has lots of bugs; if you are bug-hungry, I'd recommend it.

One of the strange bugs I discovered was a way to make the menu bar disappear. I also learnt about a lot of different bugs.

The purpose of BWT is achieved if a tester goes back with some learning. :)

Amit asked a very important question:
How many of you tried using the application only with their keyboard?
I replied in the negative: if that failed, it would be a usability issue, and the mission was to find functionality issues.

Karan's summary was rocking. He had typed everything in Notepad and simply pasted it all at once on his turn.
Following an exploratory approach to some extent, he felt the application was not user-friendly. He was confident that, with time, this could be developed into a full-fledged application.

Parimala, the moderator for the session, was the last one to present.
Lack of dedicated time for testing was her main concern.
The software was new to her, but being a curious tester, she explored and learnt most of it quickly. She tested the Tools section of the application for as long as time permitted.

Overall, the session was good, coupled with strange bugs and discussions about them.
The only concern was that it went fast and the discussions were not full-fledged.

We will improve on this next time.
Thanks to all the testers; I learnt about some new bugs.
Interested in joining us for the next session? Email weekendtesting@gmail.com

See you all in BWT 10.
Till then, ENJOY TESTING :)

Update: Please find the Test Report shared at Scribd.


Friday, September 25, 2009

Weekend Testing Session No. 9

Friends,

Weekend is nearing and so is our testing session. Please confirm your participation for the "BWT Session No.9"

Date: Saturday 26th September 2009
Time: 3pm – 5pm IST

Please be online on Gmail (visible mode) by 2.30pm IST.
You’d be provided download details.

Testing session: 3pm – 4pm IST
Discussion Time: 4pm – 5pm IST

Please send an email to weekendtesting@gmail.com with the subject “BWT 9 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Monday, September 21, 2009

Challenge of dimensions of quality : BWT 8 : Experience Report

20th September 2009, 9pm - 11pm IST will be etched in the minds of the six testers who got together online to test the ‘Areca Backup’ application. This session marked the eighth session of BWT, exactly two months since the concept of ‘Bangalore Weekend Testers’ originated.

List of Participants: Ajay Balamurugadas, Bhargavi, Gunjan, Parimala Shankaraiah, Sudhakar, Vasupratha

Application: Areca Backup

In their own words…
“Areca-Backup is a file backup software that supports incremental, image and delta backup on local drives or FTP servers. Areca-Backup also allows you to browse your backups and navigate among different versions of the files contained in your archives.”

All the testers were geared up for the testing session.
The application had been downloaded.

What next?
What about the MISSION?

The mission for this session was as special as the session.

Following mission was given to the testers:
Mission: You have to choose one of the following quality criteria –
Installability / Usability / Performance / Reliability / Compatibility / Testability.
Choose one quality criterion and stick to it for the entire session while testing the ‘Areca Backup’ application.


Special THANKS to Pradeep who suggested this mission.

Each tester was very enthusiastic on hearing the mission and started their journey of exploring the product to find valuable information.
The testing session started at 09.03pm and lasted till 10.03pm IST.
We had the Discussion session soon after the testing session.

Each tester was supposed to reveal the mission they chose and the approach they followed during the testing session, highlighting any challenges they faced and any specific learning. Each tester’s experience of the BWT 8 session was the icing on the cake :)

Sudhakar, the first tester to send in his Test Report, started off the discussion session.
He was very clear in his mission:
“Find issues” and he chose Usability as the Quality Criteria.
One interesting thing about his whole approach to test this application was his expectation: “The application should guide the user”

With a focus on data validations, Sudhakar was frustrated at the not-so-good validation implemented.
One major challenge, apart from the poor validation, was the time taken to understand the application. Not understanding the product fully prevented him from exploring it to a greater degree. Finally, Sudhakar felt that, other than the lack of time, the overall experience of participating in this session was good.

We moved on to Vasupratha’s experience.
Vasupratha echoed Sudhakar’s concern about the lack of time. Usability was the quality criterion once again. Vasupratha felt that additional time would have helped in exploring the product better.

Next turn was Parimala’s.
A different quality criterion, Installation, was chosen.
The mission she set was straightforward:
“To test the installability of Areca 7.1.5”
Following an exploratory approach, Parimala provided a lot of valuable information in terms of bugs. As the number of installation steps was minimal, Parimala did not face many challenges.
At the same time a particular intermittent bug was playing hide and seek with her. :)

Parimala learnt some new scenarios to test once she took up Installation as her criterion. The new scenarios helped her do a round of scenario testing.
With this being a good experience, she wanted to do some functional testing in the near future. :)

Gunjan was ready to share her experiences.
Her mission was decided more by circumstance than by choice.
Usability was her first choice, but when launched, the application gave an error about a missing .dll file. So Gunjan shifted her focus from Usability to Installability, as she had to uninstall and reinstall the application.

With an exploratory approach to her rescue, Gunjan delved deep into issues in installation and uninstallation. Some interesting issues greeted her, even after a System Restore was tried to get the application working.
The help file was one of the sources of information from which she tried out scenarios. Her biggest learning was to ensure the system is in a correct state before testing any application.
Gunjan, being a first-timer at BWT, enjoyed herself and found it interesting to think “OUT OF THE BOX”. This was the first time she had tested any software outside of her office work.

Now, it was the turn of Bhargavi.
Bhargavi’s mission focused on finding problems with Performance as the Quality Criteria.
Following an exploratory approach, Bhargavi tackled many challenges with ease.
The major challenge was the difficulty in understanding the features and knowing where and how to start modeling the application.
Some bugs pertaining to other quality criteria slowed down Bhargavi’s progress.

She had her share of learning too. As she took up the “Performance” quality criterion, which she hadn’t tested before, she learnt new ideas to test, and this boosted her confidence. Bhargavi enjoyed testing the product from a different perspective – focusing on only one quality criterion.
Her tests forced 100% CPU usage as well as low disk space.

The mission taught Bhargavi, who habitually concentrated on all quality criteria, to concentrate on one particular criterion. Parimala added a point about how testers find interesting and intriguing issues when the focus is on a small part of the application.

Finally, it was my turn.
I chose “Performance” as my quality criterion for a simple reason: I had never tested for performance alone before. I too followed an exploratory approach, with a toolkit consisting of Process Explorer, Windows Task Manager, MS Excel, Notepad, WebEx Recorder and the Date and Time Properties window.

The biggest challenge for me was to learn the product quickly.
Help file helped me to some extent.
Once I understood how to back up, I started with a 23GB folder, and that was my biggest mistake of the day. :(

Expecting the backup software to handle 23GB of data and finish within 15 minutes was very foolish on my part. As a result, I spent fifteen minutes watching the progress bar of the backup process.

When I tried a 4MB file, the backup completed within a few seconds.
I glanced through the report generated after the backup. A bug in the report took away ten precious minutes.

The biggest learning I had from this exercise was to prepare test data while modeling the system. Also, having unrealistic goals (read: a 23GB folder) does not help the cause.

Later, I tried 30MB, 60MB and 90MB folders to monitor the performance, but it was almost the end of the testing session.
Bharath highlighted the value www.testersdesk.com added in testing the Performance criterion.
The experience was good, as it marked the successful completion of two months of Weekend Testing.

Every BWT session gave me a different experience.
Right from BWT 1, where Parimala and I tested www.vischeck.com, to BWT 8, every session has been a unique learning and thought-provoking experience.

I’d like to thank all the testers, Pradeep and the BWT members: Manoj, Parimala and Sharath for their continuous support and hard work.

Looking forward to BWT 9: A new product, new testers and a new experience.
See you there. :)


Friday, September 18, 2009

Weekend Testing Session No.8

Friends,

Weekend is nearing and so is our testing session.

BWT Session No.8

Date: Sunday 20th Sep 2009
Time: 9pm – 11pm IST

Please be online on Gmail (visible mode) by 8.30pm IST. You’d be provided download details.

Testing session: 9pm – 10pm IST
Discussion Time: 10pm – 11pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT 8 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Friday, September 11, 2009

Weekend Testing Session No. 7

Friends,

Weekend is nearing and so is our testing session.

BWT Session No.7

Date: Sunday 13th Sep 2009
Time: 5pm – 7pm IST

Please be online on Gmail (visible mode) by 4.30pm IST. You’d be provided download details.

Testing session: 5 – 6pm IST
Discussion Time: 6 – 7pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT 7 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Sunday, September 6, 2009

Weekend Testing Session 6 - TuxPaint



How many times have you enjoyed a piece of software while testing it?

One such occasion was in the Bangalore Weekend Testers Session held on Saturday, the 5th of September 2009 3pm - 5pm IST.

Twelve testers agreed to test TuxPaint, a free, award-winning drawing program for children ages 3 to 12. It combines an easy-to-use interface, fun sound effects, and an encouraging cartoon mascot who guides children as they use the program.

WOW!!! The kid in every tester came out and literally played with the software, such that 60 bugs surfaced, carefully noted down by the testers at work.

Amazing to know that if you enjoy doing something, you can excel at it too!!!

Please do treat yourself to this software, available at www.tuxpaint.org.
If you want to know the issues beforehand, check out the list of bugs found by us here.

Thanks to Bill Kendrick who permitted us to test and publish the report.

I'm sorry if you missed this week's session too, but you have a chance next week.
If you want to register for next week, email weekendtesting@gmail.com and watch this space.

Thursday, September 3, 2009

Weekend Testing Session 6

Friends,

Next weekend is nearing and so is our testing session.

BWT Session No.6

Date: Saturday 5th Sep 2009
Time: 3pm – 5pm IST

Please be online on Gmail (visible mode) by 2.30pm IST. You’ll be provided the download details.

Testing session: 3 – 4pm IST
Discussion Time: 4 – 5pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT6 Confirmed Participant”.

We’ll include you in the session once we receive your email.

For more details, contact weekendtesting@gmail.com

Monday, August 31, 2009

BWT Session No. 5: Google Calendar


I’m very happy to have participated in five consecutive BWT Sessions.
Thanks to all the members for their active participation.

The 5th BWT Session was on 30th August 2009 from 9.30pm to 11.30pm IST.

After a hectic day of writing my MS exams, it was time for the BWT Session.

I logged in at 8.30pm IST to find Amit and Anup online.
Slowly as the clock ticked 9.15pm IST, members started joining.

Finally at 9.30pm IST, we were a group of seven testers ready to prove a point.

Every product has bugs, even if it’s a Google product.

Application to Test: Google Calendar
Testing Session: 9.30pm to 10.30pm IST
Discussion Session: 10.30pm to 11.30pm IST.

From the learning perspective, it was challenging for me even though I could find bugs in the product.

After the testing session, the discussions were good, heated and interesting.
Members discussed their plan of attack, their learning experience and their feel of the product, rated the product, and finally submitted their reports.

Some of the questions out of the discussion:
1. Why do some people assume that if it’s a Google product, it should not have bugs?
2. Should a product encourage easy learning? Is that an issue if it doesn’t?
3. Should products be compared during testing? To what extent must that comparison be done?

Expecting all the questions here???

No way! I’m not going to list out all the questions, answers, comments and opinions here.
If you are interested in joining us, email weekendtesting@gmail.com

Please find the Test Report shared at Scribd and a question at TestRepublic.

Hope to see you in the next weekend testing session. :)

Thursday, August 27, 2009

Weekend Testing Session 5

Friends,

Next weekend is nearing and so is our testing session.

BWT Session No.5

Date: Sunday 30th Aug 2009
Time: 09.30 – 11.30pm IST

Please be online on Gmail (visible mode) by 9pm IST. You’ll be provided the download details.

Testing session: 09.30 – 10.30pm IST
Discussion Time: 10.30 – 11.30pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT5 Confirmed Participant”.

We’ll include you in the session once we receive your email.

For more details, contact weekendtesting@gmail.com

Saturday, August 22, 2009

Fourth BWT session was even better :)


After a nice lunch, I started pinging others to add them to the group chat.
Once every registered member confirmed their presence, we started this week's session.

Bangalore Weekend Testing Session No. 4

Application: Freemind 0.9.0_RC5

Testing was done on Windows XP SP3 and Windows Vista 32 bit.

Testers: Ajay, Anil, Parimala, Rajesh, Ravisuriya and Vivek.

Time: 3.30pm - 6.00pm IST.

As promised, the test report is an improvement. You can check that out for yourself. We followed a common template this time.

The discussion was better than last time. Instead of reading out the bugs in a round-robin manner, we discussed what we felt, what we tested, why we tested it, and what we learnt. Problems faced, challenges, tools used, questions and ideas were exchanged. And we had a lot of fun discussing.

Some of the questions which came up in the discussion:
1. Should a tester learn the product to find bugs? Follow this question on Test Republic here.
2. How does working without pressure bring out the best in some testers?
3. Should we test the full application bit by bit or any one feature fully?

And lots more...

Please find the list of issues shared at Scribd.

It is OK to make a mistake, but repeating the same mistake becomes a SIN. :)

If you want to enjoy BWT's 5th session, watch this space and register by sending an email to weekendtesting@gmail.com

Sunday, August 16, 2009

Weekend Testing Session Report.




Will it work?
Will it be good?
Will it be enjoyable?

Can we manage?
Will everyone benefit out of it?
Will everyone have fun out of it?
Will it be a learning experience?
Will everyone agree to our motto?
Will there be heated arguments?
Will everyone come on time?

How many bugs will we find?
How long will the session be?

How will we coordinate?
Do we need more than one software?


What if we face any distractions?
What if there is a power cut?

Ufff, all these questions were answered once the Bangalore Weekend Testing Session started.
Everyone came on time and wow, what a session we had!!!!

Date: 15th August 2009
Session started at 9.30pm IST and ended at 11.30pm IST.

Testing session: 9.30pm to 10.30pm
Discussion Session: 10.30pm to 11.30pm

Every member participated actively and bugs flowed (literally), so much so that the discussion time was extended from 10.30pm - 11.00pm to 10.30pm - 11.30pm IST. :)

I'll not hide the report anymore.
Please find the report shared at Scribd.

Most Important:
We promise to improve our bug reporting skills along with suitable screenshots.

Happy Weekend Testing!!!
