Showing posts with label "exploratory testing". Show all posts

Tuesday, April 11, 2017

Quarterly Update: Conferences, Workshops, Meetups and Learning

This post is a quick update about the past few months.

Test With Ajay website was launched

Launched www.TestWithAjay.com as a one-stop source for my blog posts, tweets, articles, books and workshop announcements. This will be my online portfolio going forward.
Upcoming workshops include

April 15 (Bangalore)

Web Performance Engineering | Test With Ajay 
More Details 

April 28 (Bangalore)
Mobile Application Testing | Agile Testing Alliance 
More Details 

April 29 (Bangalore)
50+ Tools: Know, Learn and Apply in your Testing | STeP-IN Forum
More Details 

I also conduct online workshops over Skype chat. Deepan Kumar recently took my online workshop.
If you are interested, you can ping me on Skype (ajay184f is my Skype id)

Conducted Exploratory Testing workshop for Agile Testing Days Asia audience
At the last minute there were multiple registrations, and around 40 testers attended my Exploratory Testing workshop. The workshop was a success for me: it was quite interactive and I managed to squeeze most of the concepts into an eight-hour window. The true success of any workshop is when the participants go back to their organization and implement some of the learnings. I have not heard back from any of those testers, which bothers me to some extent.

A one-day workshop is not a magic pill to solve all your problems. It needs continuous effort from the testers and the management across a few pilot projects to see the expected end results. And I am always ready to help people who are struggling to implement the learnings from my workshop.

Attended Agile Testing Days Asia and Agile India 2017 conferences
As part of the sponsor booth, I spent most of my time at these conferences at the sponsor stall. There were many interesting conversations, and I saw how people hold assumptions that are very hard to dislodge or alter.

Agile Testing Days came to Asia for the first time and it was a success considering the enthusiasm shown by participants in Agile Games and also the fully packed sessions.

This was my first time at the Agile India conference and I was blown away by the sheer number of participants every day. With so many well-known speakers spread over 5-7 days, they had all the bases covered.

Presented at Global Testing Retreat, Pune
The topic was: "Agile Teams: The Best Test of a Tester's Skills". The talk was 25 minutes instead of the usual 45, which is a good idea too, as one need not wait a whole 45 minutes for the next talk.

There were a lot of new faces and talks which fascinated me. Looking forward to the next year's conference.
Exploratory Testing workshop for Global Testing Retreat audience
This was a three day course and we had good discussions on various aspects of testing. There were lots of hands-on exercises and each concept was followed by an exercise.

Started using Encode app
I am not sure how I stumbled on this app but I am happy that I did. I am loving the lessons and the exercises for JavaScript in this app.

Started learning Java from 'Java for Testers' book
Alan Richardson has done a great job explaining every concept in detail. Though I am finding it hard to understand a few concepts, I WANT to learn Java and I am on it.

Agile Testing Alliance Meetup 
I presented at the meetup organized by Agile Testing Alliance on "Problem Solving Techniques". I am planning to create a workshop out of it soon.


And I need to get ready for office. Till next time, enjoy testing.


Sunday, November 17, 2013

Agile Testing Days: Presentation Slides Explained

I was very happy to be at Agile Testing Days, 2013.
This is how I felt when I was leaving Potsdam, Berlin.
My promise on twitter:
And here are the slides explained. The complete slide deck can be found here.
Before I started the session, I showed most of the slides to the audience with a one-liner on what each slide was about. I then gave the audience the option to choose some other session if they did not like what would follow for the next 45 minutes. This way, I would have wasted only 2 minutes of someone's time and not 45 minutes.

No one left after the quick overview of all the slides. 
This slide was on screen even before I started the session. I kept this slide to lighten the atmosphere. Most of the talks I have attended have a first slide with the topic name, the presenter's name and maybe the company logo.
To ensure that there were no red flags about my understanding of "Agile Testing", I used this slide. I did not read through the contents as it is already known to many. I highlighted that Agile Testing used "Agile Manifesto" as a guideline and customer satisfaction by delivering quickly is more important than following any process. When no one raised any red flag, I moved on to this slide:
With the above slide in background, I explained what Exploratory Testing is, how there are different definitions but highlight one common theme:
If the next test is influenced by learning from the previous test, you are applying exploratory testing approach.
I then asked the audience what they felt about Exploratory Testing. The whole group came up with a list of words/topics while I played the role of note taker. The beauty of this exercise was that each word/topic seemed to lead to a new word/topic.
Once we were done with Pros of Exploratory Testing (ET), we also discussed a bit on Cons of ET as per those who complain against ET.
On the next slide, we focused on the ET skills highlighted by Jon Bach.
There are some pre-requisites to do exploratory testing which can stand up to scrutiny. Everyone can test but is your testing valuable? Why do so many people reject ET and force testers to follow a scripted approach?
Good testing requires skill and good testers work on their skills.
As each link explains the myths in detail, I did not spend more time on this slide.
The following slide seems to need more explanation. Let me explain.
Skills:
Work on your skills. Do not just restrict to testing related skills. Learn from other disciplines. Spend time practicing the skills. Only those who work on the skills will survive.

Experience:
Experience matters a lot. Try to experience as many different contexts as possible. Varied experiences, and experience in a particular context, help you think of different and useful test ideas.

Customers/Context:
What is the use of any product if it does not solve customers' problems? Do you understand your customers and the context well enough to design your test strategy? I do understand that customers are one of the factors in context.

Risks:
My first question to the product owners and the programmers in my company: What is the biggest risk you feel with this feature? What are you worried about the most? The answers help me a lot in understanding the product and the project a bit more in detail.

Exploration:
This is related to the "Experience" point. In any aspect of the project, pay additional attention to the exploration path. Do not restrict it to just testing. Explore in the true sense - to investigate.


Testing:
Finally, a tester with good testing skills who is skilled at Exploratory Testing will be able to help any project, not just Agile projects. This tweet sums up the essence of my talk:


Last few slides are self explanatory. And here are some of the photos from twitter:

Finally, I ended my session with a call for questions. Hope this blog post serves the purpose of the session. I enjoyed the session where most of us were involved and shared thoughts. 
If you have any questions, feel free to email me at ajay184f@gmail.com

Regards,
Ajay Balamurugadas


Tuesday, October 29, 2013

Agile Testing Days - Day 1: A mind map

Day1: Agile Testing Days
Tutorial I attended: http://www.agiletestingdays.com/program.php?id=297

“Compressing Test Time with Exploratory Methods: A Practicum”


Saturday, July 20, 2013

(Free) Hands-On Training on Software Testing

Course Overview
Update: The seats have been filled and registrations have closed. Thanks.
This time, I am conducting this course in collaboration with STeP-IN Forum and the target audience is testers with experience between 0 - 3 years.

Schedule:
Date: August 1st to August 30th (excluding weekends)
Time: 06.00 am to 07.30 am IST
Link to Register for the course: http://stepinforum.org/software-testing-training

Course Overview:
As highlighted in the mind map, this training will focus on the following topics:

Basics of Software Testing
We will start with understanding basic terms like bug - issue - quality - defect. We will definitely NOT go through the V-Model, Waterfall and many other such terms, which are slowly losing their importance in today's testing world.

Test Ideas
This session will focus on how to generate test ideas and learn from different sources to test any product. We will also see that software testing is not only about testing Functionality.

Bug Hunting
There is no fun without bugs. So, how do we find them? How is bug investigation different from bug hunting? How to find Sev 1 bugs?

Tools
We will definitely be using many tools in our sessions. We will also focus on how to scout for resources and tools in particular.

Test Reporting
Once a tester completes the test execution, (s)he should be able to provide a professional test report. We will create different reports and get feedback from the group.

Generic
Does your learning stop after a course or workshop? How can one learn about software testing every day? We will go through a few important areas for self-learning.
=====================================================================
Link to Register for the course: http://stepinforum.org/software-testing-training


Wednesday, December 28, 2011

Four Quick Questions While Testing

This is a quick post in between a testing session. Right now I am testing www.ibibo.com and they have a 'Friend Suggest' feature where ibibo suggests friends for you.
Here is the screenshot from ibibo.com
Friend Suggest
I assume that friends are suggested based on some algorithm. The number of mutual friends is also highlighted.
I clicked on 'Add as a friend' against one of the profiles displayed. A popup was displayed.
Once you click on 'Add as Friend' link
While it is good to inform the user about an action he performed, as a user I was not able to perform any other action till I dismissed the popup by clicking on the OK or close button.

Compare that to what Facebook does. Even Facebook has the Friend Suggest feature by another name: 'People you may know'.
Facebook Friend Suggest

Once I clicked on 'Add Friend' link, the particular profile slowly disappeared after displaying 'Friend request sent'.
Do you see the difference? Is there a problem here?

After this observation, I made some notes. A few questions which helped me find a problem were:

* What is the purpose of the feature?
* How will the users use it?
* Are there any problems in what you observe? Primarily any inconsistency or usability issues?
* Have you seen such a feature elsewhere? Comparable products heuristic?

Let us go through each of the questions in detail.
Purpose of the feature
As a tester, do you understand the purpose of the feature? What problem does the feature try to solve? As the product is a social networking site, the more friends, the better. The feature is trying to help you add friends. In the case of ibibo, does it really help?

Users using the feature
As a user, is it easy to understand the feature? Is there help available? Is it user-friendly? Can the user use all the feature's functionality? How is the first impression? Is the feature easily visible or hidden behind a lot of junk?

Usability Issues
This morning, I read the latest blog post by Michael Bolton - http://www.developsense.com/blog/2011/12/why-checking-is-not-enough/ on why checking is not enough.
I am not sure how a check might find the level of user's frustration or happiness.
When I was not able to click on any other link till I dismissed the confirmation popup, I was slightly frustrated.
Confusion, frustration, delight and other emotions - how will a check handle it?
As a tester, are you aware of your emotions when you test, after the tests?

Comparable with similar products heuristic
Finally, as a tester, make use of the similar products heuristic if given a chance. Just as I write this post, I hear the dialogue from 'First Blood II' - the mind is the best weapon. So true.

Maybe, next time too - I will use the four questions as a quick heuristic.
What is the purpose, who will use it, any usability issues, how the competitor handles it?
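The four questions can even travel with you as a tiny checklist in your session notes. Here is a toy sketch of my own (not part of the original post - the function name and the sample observation are made up for illustration):

```python
# The four quick questions, kept as a reusable checklist for session notes.
QUICK_QUESTIONS = [
    "What is the purpose of the feature?",
    "How will the users use it?",
    "Are there any inconsistency or usability problems?",
    "How does a comparable product handle it?",
]

def session_note(feature, observations):
    """Pair each heuristic question with the tester's observation,
    or mark the question as still open."""
    answers = {q: observations.get(q, "open - investigate") for q in QUICK_QUESTIONS}
    return {"feature": feature, "answers": answers}

# Example: noting the Friend Suggest popup observation against question 3.
note = session_note(
    "Friend Suggest",
    {QUICK_QUESTIONS[2]: "Popup blocks all other actions until dismissed."},
)
```

Any question left unanswered at the end of a session is an invitation for the next one.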


Friday, December 11, 2009

Touring and Modeling: Scripted Vs Exploratory

Discussions on Twitter help.

Fast, open to the public for views, and private if required.

So, while discussing software testing on Twitter - more specifically "Touring" and "Modeling" with Shrini Kulkarni - some questions cropped up.

1. Is touring a way to explore a model in software testing?
2. Does touring happen in Scripted Testing? To what extent?
3. Does Touring happen more in Exploratory testing than Scripted testing?

Touring - a way to explore a model, is a continuous way to build a better model. Once a better model is built, a new tour to explore it in more detail could be carried out.

As mentioned by James Bach in "Exploratory Testing Explained",

exploratory testing is any testing to the extent that the tester actively controls the design of the tests as those tests are performed and uses information gained while testing to design new and better tests.
So my question is:

We follow a restricted tour in Scripted Testing. In Scripted Testing, our next test is not based on the result/information gained from the previous tour. Are we not following a strict roadmap, where the touring is restricted?

Isn't touring and modeling a one time activity in Scripted Testing? A tour may be done to gain information to write the scripts.

In Exploratory Testing, isn't Touring and Modeling a continuous activity to gain more valuable information?

Are they both - "Touring and Modeling" in a loop in Exploratory Testing?

Final point: "Touring and Modeling" is restricted in Scripted Testing and "Touring and Modeling" is a continuous activity in Exploratory Testing.
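One way to picture that final point - my own sketch, not from the Twitter discussion, with hypothetical stand-in functions (execute, next_test): scripted testing runs a route fixed up front, while exploratory testing feeds each result back into choosing the next test.

```python
def scripted_run(tests, execute):
    """Scripted testing: the tour is fixed up front;
    results do not alter the route."""
    return [execute(test) for test in tests]

def exploratory_run(first_test, execute, next_test, budget=5):
    """Exploratory testing: each result feeds back into
    choosing the next test."""
    results, test = [], first_test
    for _ in range(budget):
        result = execute(test)
        results.append(result)
        # Touring and modeling in a loop: the model (next_test)
        # uses what we just learned to pick the next tour.
        test = next_test(test, result)
    return results
```

The `budget` parameter stands in for the session time-box: exploration stops when the time runs out, not when a fixed list is exhausted.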

What do you think?

Feel free to correct me, comment, discuss, question, argue and finally tour the model I have in my mind.



Thursday, November 5, 2009

Golden Chance to meet Michael Bolton in India!!!

QAI India
Organized by
Edista Testing Institute
Learn Rapid Software Testing with Michael Bolton
Mumbai: 12 Nov 2009 | Hyderabad: 16 Nov 2009
Chennai: 13 Nov 2009 | Bangalore: 17-18 Nov 2009


Course Description

Rapid testing is a complete methodology designed for today's testing, in which we're dealing with complex products, constant change, and turbulent schedules. It's an approach to testing that begins with developing personal skills and extends to the ultimate mission of software testing: lighting the way of the project by evaluating the product. The approach is consistent with, and a follow-on to, many of the concepts and principles introduced in the book Lessons Learned in Software Testing: A Context-Driven Approach by Kaner, Bach, and Pettichord. In this interactive workshop, Michael Bolton, the co-author (with James Bach) of the Rapid Software Testing course, introduces testers, managers, developers, and any other interested parties to the philosophy and practice of Rapid Software Testing through lecture, stories, discussions, and "minds-on" exercises that simulate important aspects of real software testing problems.

Contact Details
Bangalore/Hyderabad: Akshay Raj
(M): +91-9845176034
(P): +91-080-41574806/7/9
akshay.r@edistatesting.com, training@edistatesting.com

Chennai: Harsha Bhat
(M): +91-9845098916
harsha.bhat@edistatesting.com

Delhi: Divya Raturi
(M): +91-9871252501
divya.raturi@qaiglobal.com

Mumbai: Kishor Parab
(M): +91-9821251126
kishor.parab@qaiglobal.com

I have registered. When will you register?

See you at the workshop.


Sunday, October 4, 2009

Do not find Bugs but Generate Test Ideas :) WT session 10

Weekend Testing Session No.10



Date and Time: 03rd October 2009, 3pm - 5pm IST
Application: Converber v 2.1.0

This session was different just like every other session.
The mission was unique as the testers were not supposed to hunt for bugs.

FLASH NEWS: DO NOT HUNT FOR BUGS!!!
How often do you see that?

Then what was this session about?

Following details were given to testers:

Context: A client wants Beta Testers for testing Converber. You have to prove to them that you are the best Beta Tester for them.
How will you be judged?
You have to give a list of Test Ideas/Scenarios you would cover if selected as the Beta Tester.

Based on the list of Test Ideas/Scenarios, you'd be selected/rejected as the Beta Tester.


Most of the testers were surprised with the mission and set out to achieve the mission at 3pm IST sharp.

It was a different experience for me too. I was busy browsing through different articles on Exploratory Testing Approach to identify most of the quality criteria.

Twenty minutes had already passed and I had not even launched the application.

Luckily, I found these two documents:
How Do You Spell Testing - James Bach and

Heuristic Test Strategy Model

I found some testers interested in finding bugs in some modules of the application.
Being the facilitator, I reminded them of the mission.

It was challenging for most of the testers.
While some were finding it hard to put ideas to paper, some could not resist the idea of hunting bugs.

This particular session went so quickly that we realized that it was already 4pm IST. We stopped generating test ideas!!!

One of the challenges faced by Tejas was not to start testing as soon as an idea popped up in his mind. Dhanasekar echoed Tejas's concern - he too started testing as soon as he came up with any idea.

Rajesh wanted to clarify what a test idea meant and how it is different from a test case. I felt it was too late for this question to come up as the testing session was already over. Rajesh learnt that he could have gathered valuable information had he raised this question at the start of the testing session itself.

Dhanasekar and Tejas were of the opinion that there are too many terminologies to add to the confusion.

We started off with Dhanasekar sharing his experience, challenges and learning from this session. He found it easier to hunt for bugs than to document test ideas, and promised to work on that aspect. The major challenge he had to tackle was not to test each test idea as soon as he generated it. Being unclear about the mission did not help his cause either.

He got diverted by a crash he found and started investigating it. He realized the importance of questioning, which could have saved him a lot of time.

His biggest learning was to

"FOCUS ON THE MISSION"
and he was of the opinion that this exercise would help him present his test ideas better.

Rajesh started off by sharing his experience. His favorite subject being Maths, he loved to test this application as it involved a lot of mathematical calculations. His limited knowledge about different conversions forced him to experiment with only those units which he was comfortable with. As he did not ask questions to clarify what a test idea was, he was hunting for bugs along with the task of generating the test ideas.

His biggest learning for the day was
"THE IMPORTANCE OF QUESTIONING"
He had read about questioning the stakeholders for more information and today was his practical experience of questioning the stakeholders.

Dhanasekar added a valuable point that it is difficult to generate test ideas just by looking at the GUI. I'd say that's another trap: the mission did not specify that the application should not be used. Questioning can help us clear traps.

The general challenge most of the testers faced was highlighted by Sushant:
...even though we may not want to hunt for bugs, but eyes find them out...

Sathish re-framed the mission statement:
"The mission is to find ways to identify the bugs"


Vivek was next to present his experiences. He faced difficulty in defining the test scenarios. He decided to give a broad idea of his test scenarios. As I shared with him this link : Heuristic Test Strategy Model, he was happy that being a part of Weekend Testing increases his knowledge base.

Sushant tested the application keeping in mind the age group of the audience. He has a habit of testing any application from user-perspective. As he was exploring the application, he found some issues which he could not ignore. He also highlighted how being in an informal environment helped him think and test better.

He was confident that such Weekend sessions would prepare him for the tough environment at office.

Satish brought with him a different perspective - he searched for failures in the previous releases. Based on the failures, he modified his test scenarios. Satish concentrated only on the generation of test ideas. As part of it, he learnt the application.

The biggest challenge was the lack of knowledge of the categories in the application.
He had never attempted a Beta test and this entire exercise itself proved to be the greatest learning. He stressed that participating in this exercise increased his confidence.

As a tester, we have to concentrate on the mission and not on hunting bugs. Many testers find it difficult :)

Tejas had a major challenge: Unclear requirement. Even he fell into the trap of not questioning.

He highlighted the importance of
"PERSEVERANCE"

One more important point which came up in this discussion was that it is better to fail in front of friends than in front of stakeholders.

He promised to give more attention to recording test ideas in a systematic way.

Next, I had to present my learning and experiences.

I listed the two links which helped me and the quality criteria I concentrated on.
We had a discussion on the difference between Claims Testing and Acceptance Testing.
My learning was to improve my knowledge on the different quality criteria used to test any application.

We had a further discussion on each other's list of test ideas. Every tester had to justify his list of test ideas and why he must be selected as the BETA TESTER.

The test reports were really interesting and covered a lot of different ideas.

Ajay: "I have taken care of more quality criteria : Functional, Usability, System Configuration, Data, Domain, Performance, Claims Testing and Operating System and hence increased coverage."
Vivek: "I can cover different versions of different OS. Installation and Functional testing would mean good coverage"
Tejas: "I can discover most of the functional bugs"
Satish: "Functional, Usability and Scenario Testing were my main focus areas."
Rajesh: "I concentrated on Functionality, Usability, Domain, Installation, Upgrade, Performance and Claims Testing"

Finally after the poll, Rajesh was selected as the BETA TESTER.

Congratulations Rajesh!!!

"It is more important to meet the mission than knowing the terminologies"
Even though Rajesh did not understand what a test idea meant, what mattered was that his list of ideas increased coverage.

This session was lively with interesting mission, discussions, learning and polling.

Thanks to all the testers. See you all in WT Session No. 11


Sunday, September 27, 2009

One Mission - Many Approaches: BWT 9 Experience Report


Date: 26th September 2009
Time 3pm - 5pm IST
Product Tested: SplashUp

Mission: To find Functional bugs in the Splashup application

Last week, we had tested the same application with testers choosing different quality criteria.
This week, we had to test the same application with one quality criteria - "Functionality" as the base.

Testers: Ajay Balamurugadas, Amit Kulkarni, Dhanasekar Subramaniam, Gunjan Sethi, Karan Indra, Parimala Shankaraiah, Poulami Ghosh, Rajesh Iyer and Suja C S.

This was the first session where I was not moderating and only testing the product.
I was happy that I could dedicate more time for testing.

We tested from 3pm to 4pm and started the discussion session at 4pm sharp.

Poulami started off the discussion. This being her first experience with BWT, she used an Exploratory Testing approach to guide her. She wanted to get a feel of the product before she could concentrate on issues in the application.

She found the "Layers" feature interesting enough to continue her focused testing on the Layers and Filters features. Happy with her first BWT experience, she promised to attend more sessions before passing any feedback to the team.

Poulami found the application very user-friendly and found the Auto-Crop feature not working.

Rajesh was next to describe his experiences. He was more interested in the Sign Up feature of the product. Having created an email address with a username 132 characters long, he was unable to log in. Though the email account was created successfully, an error message greeted him on login.

Rajesh and I had a discussion about an error message popping up on the screen if a webcam was not connected. We were not sure if Flash or the SplashUp application generated the error. While I felt that the error was application specific, Rajesh was of the opinion that it was similar to generic Flash messages.

I was happy that Rajesh enjoyed testing the application. He also felt that this was a good application to test.

Once Rajesh was done with his description, Amit took over. Amit was frustrated with the application being non user-friendly. The absence of help files and the lack of support for other image formats posed a serious question regarding the scope of the application.

One of the highlights of Amit's description was the bug he discovered: moving the error message out of the visible window area made it disappear.
He felt that such bugs were common in similar applications, and they made him wonder whether the application was really tested before release.

Someone had to cool Amit's frustration with the product, and Dhanasekar took centre stage. Like Poulami, he too was a first-timer to BWT. He had no experience of testing any imaging software and hence concentrated on the different file types for the application.

One of the bugs found by Dhanasekar was the "Improper handling of unsupported file formats".

This made me wonder how different people look at the same application in different ways, and how the thought process of each individual varied under the same circumstances.

The only concern he expressed was the lack of prior knowledge of the product being tested. Giving prior information, though, would defeat BWT's purpose of letting testers test with little information about the product. The thrill of testing an application one knows nothing about is quite different from testing a known application.

There is less chance of getting biased if one does not know much information about an application. Amit also was of the opinion that exploring a product without much information is good as testers get to learn a lot of new things.

What followed next interested me: Suja's description of her testing approach. After the initial "get to know the product" session, Suja divided her tests into "Happy Testing" and "Negative Cases".

I feel this is a very narrow way of modelling the application. It was good to see other testers actively participating in the discussion. Even Suja wanted the application to have more documentation to help the user. The experience with BWT was good and she was happy.

Gunjan was next, and her previous experience in testing imaging software helped her. Using an Exploratory approach, she went on different tours of the product. She found some bugs in the Zoom and Filters features. Her logical approach to testing the application was a different experience when compared to the last BWT session she attended.

Her only concern was that it took some time to know some features.

Next was my turn. Only testing and no moderating was in itself a different experience for me. This application had lots of bugs; if one is bug-hungry, I'd recommend this application.

One of the strange bugs I discovered was a way to make the menu bar disappear. I also came across a lot of different bugs.

The purpose of BWT is achieved if a tester goes back with some learning. :)

Amit asked a very important question:
How many of you tried using the application only with their keyboard?
I replied in the negative: if that failed, it would be a usability issue, and the mission was to find functionality issues.

Karan's summary was rocking. He had typed everything in a notepad and just pasted everything at once on his turn.
Following an Exploratory approach to some extent, he felt the application was not user friendly. He was confident that with time, this application could be developed into a full-fledged application.

Parimala - the moderator for the session was the last one to present.
Lack of dedicated time for testing was her main concern.
The software was new to her, but being a curious tester, she explored and learnt most of it quickly. She tested the Tools section of the application for as long as time permitted.
Overall, the session was good, coupled with strange bugs and discussions about them.
The only concern was that it was fast and the discussions were not full-fledged.

We will improve on this next time.
Thanks to all the testers; I learnt about some new bugs.
Interested to join us in next session? Email to weekendtesting@gmail.com

See you all in BWT 10.
Till then, ENJOY TESTING :)

Update: Please find the Test Report shared at Scribd.


Friday, September 25, 2009

Weekend Testing Session No. 9

Friends,

Weekend is nearing and so is our testing session. Please confirm your participation for the "BWT Session No.9"

Date: Saturday 26th September 2009
Time: 3pm – 5pm IST

Please be online on Gmail (visible mode) by 2.30pm IST.
You’d be provided download details.

Testing session: 3pm – 4pm IST
Discussion Time: 4pm – 5pm IST

Please send an email to weekendtesting@gmail.com with the subject “BWT 9 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Monday, September 21, 2009

Challenge of dimensions of quality : BWT 8 : Experience Report

20th September 2009, 9pm -11pm IST would be etched in the minds of six testers who got together online to test the ‘Areca Backup’ application. This session marked the eighth session of BWT. It was exactly two months since the concept of ‘Bangalore Weekend Testers’ originated.

List of Participants: Ajay Balamurugadas, Bhargavi, Gunjan, Parimala Shankaraiah, Sudhakar, Vasupratha

Application: Areca Backup

In their own words…
"Areca-Backup is a file backup software that supports incremental, image and delta backup on local drives or FTP servers. Areca-Backup also allows you to browse your backups and navigate among different versions of the files contained in your archives."

All the testers were geared up for the testing session.
The application had been downloaded.

What next?
What about the MISSION?

The mission for this session was as special as the session.

The following mission was given to the testers:
Mission: You have to choose one of the following quality criteria -
Installability / Usability / Performance / Reliability / Compatibility / Testability.
Choose one quality criterion and stick to it for the entire session while testing the 'Areca Backup' application.


Special THANKS to Pradeep who suggested this mission.

Each tester was very enthusiastic about the mission and started their journey of exploring the product in order to find valuable information.
The testing session started at 09.03pm and lasted till 10.03pm IST.
We had the Discussion session soon after the testing session.

Each tester was supposed to reveal the mission they chose and the approach they followed during the testing session, highlighting any challenges they faced and any specific learning from the session. Each tester’s experience of the BWT 8 session was the icing on the cake :)

Sudhakar, the first tester to send in his Test Report, started off the discussion session.
He was very clear in his mission:
“Find issues”, and he chose Usability as the Quality Criteria.
One interesting thing about his approach to testing this application was his expectation: “The application should guide the user.”

With a focus on data validations, Sudhakar was frustrated by the poor validation implemented.
One major challenge, apart from the poor validation, was the time taken to understand the application. Not understanding the product fully prevented him from exploring it in greater depth. Finally, Sudhakar felt that apart from the lack of time, the overall experience of participating in the session was good.

We moved on to Vasupratha’s experience.
Vasupratha echoed Sudhakar’s concern about the lack of time. Usability was the quality criteria once again. Vasupratha felt that additional time for testing would have helped in better exploration of the product.

Next turn was Parimala’s.
A different Quality Criteria: Installation was chosen.
The mission set by her was straightforward:
“To test the installability of Areca 7.1.5”
Following an exploratory approach, Parimala uncovered a lot of valuable information in the form of bugs. As the number of installation steps was minimal, Parimala did not face many challenges.
At the same time, a particular intermittent bug kept playing hide and seek with her. :)

Parimala learnt some new scenarios to test once she took up Installation as the criteria. The new learning (New Scenarios) helped her do a round of Scenario Testing.
With this being a good experience, she wanted to do some functional testing in the near future. :)

Gunjan was ready to share her experiences.
Her mission was decided more by circumstance than by choice.
Usability was her first choice, but the application threw an error about a missing .dll file on launch. So Gunjan shifted her focus from Usability to Installability, as she had to uninstall and reinstall the application.

With an exploratory approach to her rescue, Gunjan delved deep into issues in installation and uninstallation. Some interesting issues greeted her even after she tried System Restore to get the application working.
The help file was one of the sources of information from which she drew scenarios to try. Her biggest learning was to ensure the system is in a known-good state before testing any application.
Gunjan, a first-timer at BWT, enjoyed herself and found it interesting to think “OUT OF THE BOX”. This was the first time she had tested any software outside her office work.

Now, it was the turn of Bhargavi.
Bhargavi’s mission focused on finding problems with Performance as the Quality Criteria.
Despite following an exploratory approach, Bhargavi faced many challenges.
The major challenge was the difficulty in understanding the features and knowing where to start and how to begin modeling the application.
Some bugs pertaining to other quality criteria slowed down Bhargavi’s progress.

She had her share of learning too. As she took the “Performance” quality criteria which she hadn’t tested before, she learnt new ideas to test. This boosted her confidence. Bhargavi enjoyed testing the product with a different perspective – Focus on only one quality criteria.
Her tests forced 100% CPU usage as well as low disk space.

The mission taught Bhargavi, who habitually concentrated on all quality criteria, to focus on one in particular. Parimala added a point about how testers find interesting and intriguing issues when the focus is on a small part of the application.
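Bhargavi's trick of forcing 100% CPU usage can be reproduced with a simple load generator. Below is a minimal sketch, not from the session itself, assuming a Python environment; the duration and process count are illustrative.

```python
import multiprocessing
import time

def burn(seconds):
    """Busy-loop to keep one CPU core fully loaded for the given duration."""
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass  # pure spinning, no sleep, so the core stays at 100%

if __name__ == "__main__":
    # One spinning process per core drives overall CPU usage to ~100%,
    # letting you observe how the application under test behaves under load.
    procs = [multiprocessing.Process(target=burn, args=(2,))
             for _ in range(multiprocessing.cpu_count())]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

While the load runs, a tool such as Process Explorer or Task Manager can confirm the CPU is saturated and show how the application under test copes.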

Finally, it was my turn.
I chose “Performance” as the quality criteria for a simple reason: I had never tested for Performance alone before. I too followed an exploratory approach, with a toolkit consisting of Process Explorer, Windows Task Manager, MS Excel, Notepad, WebEx Recorder and the Date and Time Properties window.

The biggest challenge for me was to learn the product quickly.
The help file helped me to some extent.
Once I understood how to back up, I started with a 23GB folder, and that was my biggest mistake of the day. :(

Expecting the backup software to handle 23GB of data within 15 minutes was very foolish on my part. As a result, I spent fifteen minutes watching the progress bar of the backup process.

On trying a 4MB file, the backup completed within a few seconds.
I glanced through the report generated after the backup. A bug in the report took away ten precious minutes.

My biggest learning from this exercise was to prepare test data while modeling the system. Also, unrealistic goals (read: a 23GB folder) do not help the cause.
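Preparing test data in advance is easy to automate. Here is a minimal sketch, assuming a Python environment; the folder names and sizes are illustrative, not from the actual session.

```python
import os

def make_test_file(path, size_mb):
    """Write a file of roughly size_mb megabytes of filler data."""
    chunk = b"0" * (1024 * 1024)  # 1 MB of filler
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)

# Prepare folders of increasing size before the session starts,
# so no testing time is wasted generating data mid-session.
for size in (5, 10, 20):
    folder = "testdata_{}mb".format(size)
    os.makedirs(folder, exist_ok=True)
    make_test_file(os.path.join(folder, "data.bin"), size)
```

Running this once before the session gives you a graded set of inputs, from trivially small to the largest size you realistically expect the tool to handle.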

Later, I tried 30MB, 60MB and 90MB folders to monitor the performance, but it was almost the end of the testing session.
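To compare how backup time grows with folder size, the elapsed time of each run can be measured. A small sketch, using shutil.make_archive as a stand-in for the real backup tool (the folder contents here are throwaway examples):

```python
import os
import shutil
import tempfile
import time

def timed_backup(src_dir, dest_dir):
    """Archive src_dir into dest_dir; return (archive path, elapsed seconds)."""
    start = time.perf_counter()
    archive = shutil.make_archive(
        os.path.join(dest_dir, "backup"), "zip", src_dir)
    return archive, time.perf_counter() - start

# Back up a small throwaway folder and report the timing.
src = tempfile.mkdtemp()
with open(os.path.join(src, "sample.txt"), "w") as f:
    f.write("sample data " * 1000)
dest = tempfile.mkdtemp()
archive, elapsed = timed_backup(src, dest)
print("Backed up to {} in {:.3f}s".format(archive, elapsed))
```

Recording the elapsed time for each folder size (say, 30MB, 60MB, 90MB) into a spreadsheet makes any non-linear growth in backup time easy to spot.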
Bharath highlighted the value www.testersdesk.com added in testing the Performance criteria.
Experience was good as it marked the successful completion of two months of Weekend Testing.

Every BWT session gave me a different experience.
Right from BWT 1, where Parimala and I tested www.vischeck.com, to BWT 8, every session has been a unique, thought-provoking learning experience.

I’d like to thank all the testers, Pradeep and the BWT members: Manoj, Parimala and Sharath for their continuous support and hard work.

Looking forward to BWT 9: A new product, new testers and a new experience.
See you there. :)


Friday, September 18, 2009

Weekend Testing Session No.8

Friends,

Weekend is nearing and so is our testing session.

BWT Session No.8

Date: Sunday 20th Sep 2009
Time: 9pm – 11pm IST

Please be online on Gmail (visible mode) by 8.30pm IST. You’d be provided download details.

Testing session: 9pm – 10pm IST
Discussion Time: 10pm – 11pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT 8 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Friday, September 11, 2009

Weekend Testing Session No. 7

Friends,

Weekend is nearing and so is our testing session.

BWT Session No.7

Date: Sunday 13th Sep 2009
Time: 5pm – 7pm IST

Please be online on Gmail (visible mode) by 4.30pm IST. You’d be provided download details.

Testing session: 5 – 6pm IST
Discussion Time: 6 – 7pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT 7 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Thursday, September 3, 2009

Weekend Testing Session 6

Friends,

Next weekend is nearing and so is our testing session.

BWT Session No.6

Date: Saturday 5th Sep 2009
Time: 3pm – 5pm IST

Please be online on Gmail (visible mode) by 2.30pm IST. You’d be provided download details.

Testing session: 3 – 4pm IST
Discussion Time: 4 – 5pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT6 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Monday, August 31, 2009

BWT Session No. 5: Google Calendar


I’m very happy to have participated in five consecutive BWT Sessions.
Thanks to all the members for their active participation.

The 5th BWT Session was on 30th August 2009 from 9.30pm to 11.30pm IST.

After a hectic day of writing my MS exams, it was time for the BWT Session.

I logged in at 8.30pm IST to find Amit and Anup online.
Slowly as the clock ticked 9.15pm IST, members started joining.

Finally at 9.30pm IST, we were a group of seven testers ready to prove a point.

“Every product has bugs, even if it’s a Google product.”

Application to Test: Google Calendar
Testing Session: 9.30pm to 10.30pm IST
Discussion Session: 10.30pm to 11.30pm IST.

From the learning perspective, it was challenging for me even though I could find bugs in the product.

After the testing session, the discussions were good, heated and interesting.
Members discussed their plan of attack, their learning experiences and their feel of the product, rated the product and finally submitted their reports.

Some of the questions out of the discussion:
1. Why do some people assume that if it’s a Google product, it should not have bugs?
2. Should a product encourage easy learning? Is that an issue if it doesn’t?
3. Should products be compared during testing? To what extent that comparison must be done?

Expecting all the questions here???

No way am I going to list out all the questions, answers, comments and opinions here.
If you are interested in joining us, email weekendtesting@gmail.com

Please find the Test Report shared at Scribd and a question at TestRepublic.

Hope to see you in the next weekend testing session. :)


Thursday, August 27, 2009

Weekend Testing Session 5

Friends,

Next weekend is nearing and so is our testing session.

BWT Session No.5

Date: Sunday 30th Aug 2009
Time: 09.30 – 11.30pm IST

Please be online on Gmail (visible mode) by 9pm IST. You’d be provided download details.

Testing session: 09.30 – 10.30pm IST
Discussion Time: 10.30 – 11.30pm IST

Please send an email to weekendtesting@gmail.com with the subject
“BWT5 Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com


Saturday, August 22, 2009

Fourth BWT session was even better :)


After a nice lunch, I started pinging others to add them to the group chat.
Once every registered member confirmed their presence, we started this week's session: Bangalore Weekend Testing Session No. 4.

Application: Freemind 0.9.0_RC5

Testing was done on Windows XP SP3 and Windows Vista 32 bit.

Testers: Ajay, Anil, Parimala, Rajesh, Ravisuriya and Vivek.

Time: 3.30pm - 6.00pm IST.

As promised, the test report is an improvement. You can check that out for yourself. We followed a common template this time.

The discussion was better than last time. Instead of listing bugs in a round-robin manner, we discussed what we felt, what we tested, why we tested it and what we learnt. Problems faced, challenges, tools used, questions and ideas were exchanged. And we had a lot of fun discussing.

Some of the questions which came up in the discussion:
1. Should a tester learn the product to find bugs? Follow this question on Test Republic here.
2. How does working without pressure bring out the best in some testers?
3. Should we test the full application bit by bit, or any one feature fully?

And lots more...

Please find the list of issues shared at Scribd.

It is OK to make mistakes, but to repeat the same mistake becomes a SIN. :)

If you want to enjoy the BWT's 5th session, watch out for this space and register by sending an email to weekendtesting@gmail.com


Thursday, August 20, 2009

Weekend Testing Session No. 4

Friends,

Next weekend is nearing and so is our testing session.

BWT Session No.4

Date: Saturday 22nd Aug 2009
Time: 03.30 – 05.30pm IST

Please be online on Gmail (visible mode) by 3pm IST. You’d be provided download details.

Testing session: 03.30 – 04.30pm IST
Discussion Time: 04.30 – 05.30pm IST

Please send an email to weekendtesting@gmail.com with the subject
“Confirmed Participant”.

We’ll include you for the session once we receive an email.

For more details, contact weekendtesting@gmail.com
