Sunday, May 2, 2021

A growing problem to be solved soon


Photo by Alex Knight on Unsplash

Somehow the current batch of testers is more inclined towards automation; they call themselves SDETs / automation engineers and look down on the role of the functional tester. I am not here to stir up another debate about which term is the right one to use. My concern lies elsewhere: the understanding of "testing" as a craft and the focus on "testing skills".

This batch of testers has no good answers to some basic questions: 
  • Why do we test? ("We test to automate" - yes, I was shocked to hear this answer) 

  • How do you test this product? ("I will automate using -" Automation is just one activity, chosen based on the context. Don't approach testing with tools; use tools if they help you in testing) 

  • What kind of bugs have you found? (UI, Usability..) 

  • When was the last time you designed your tests? ("Oh, we received them from the manual or functional team and went ahead and automated them") 

Garbage In - Garbage Out

All I can say is: garbage in, garbage out! I am not against automation. Automation is very useful; it helps you achieve things you could not do without tools. 

Without the right test ideas, what are you going to automate? 
Won't the checks be incorrect or far too shallow? Think about it. 
Are the checks even valuable? Will they answer the questions that matter? 

So, without strong test ideas, I don't see how you can add value as a tester!
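To make the "garbage in, garbage out" point concrete, here is a minimal sketch (the apply_discount function is hypothetical, not from any real product): a single happy-path check automated verbatim, versus checks derived from actual test ideas such as boundaries and invalid input.

```python
def apply_discount(price: float, percent: float) -> float:
    """Hypothetical feature under test: discount a price by a percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# The shallow check received "from the manual team" and automated as-is:
assert apply_discount(100, 10) == 90.0

# Checks a tester who designs tests would add:
assert apply_discount(100, 0) == 100.0    # boundary: no discount
assert apply_discount(100, 100) == 0.0    # boundary: full discount
assert apply_discount(0, 50) == 0.0       # boundary: free item
try:
    apply_discount(100, 150)              # invalid input: over 100%
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError for percent > 100")
```

Automating only the first check is easy; deciding that the other four matter is the testing skill.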

Systematic Product Modelling 
Have you seen testers who say they have no idea about a feature because they have not tested it, even though it is part of the same product they are testing? On top of that, the two features may even interact with each other. Why would one not want to learn about the whole product? 

  • Do we spend enough time, and apply the right techniques, to model the whole application? 

  • Do we think across layers - UI, API, DB?

  • Do we think about the different users - the first-timer, the legacy user, the one who just started, the one who has not updated?

  • Do we understand how the data flows across features and how it gets transformed?

  • Do we know the defaults?

  • Do we know how the product is marketed?

  • Do we know the complaints the users raise to our customer support teams?

  • Without modeling the overall application, of what use is the automation skill?

  • Isn't it a very narrow-minded approach?

  • Won't the power of automation be better used if you know exactly where to apply it, rather than applying it at every opportunity? 
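One way to make this modelling systematic is to cross the dimensions above, so coverage gaps become visible. A sketch (the persona and layer names are illustrative, drawn from the questions above rather than any real product):

```python
from itertools import product

# Dimensions of the product model: who uses it, and through which layer.
personas = ["first-timer", "legacy user", "just started", "not updated"]
layers = ["UI", "API", "DB"]

# Each combination is a candidate test idea; automation comes afterwards,
# and only for the combinations where it actually helps.
test_ideas = [f"{persona} via {layer}" for persona, layer in product(personas, layers)]

assert len(test_ideas) == 12  # 4 personas x 3 layers
```

The point is not the code but the habit: enumerate the model first, then decide which cells deserve automation.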

Who also encourages them?
Think of all the questions that are asked in the interview. 
  • How much do we focus on the fundamentals? 

  • Do we jump straight to the tools?

  • Do we fill positions even when we know a tester is not well-rounded?

  • Do we give higher emphasis to automation skills?
Check out the job descriptions being posted as well: no mention of the core skills, while tools form the majority of the JD. With such JDs, we can't blame the testers.

Soon, we will reach a state where testers have no idea what testing is, what the bigger picture looks like, how quality is multi-dimensional, or how one can add value. We must get away from being robots with tools, ready to automate anything and everything without understanding testing.

The next time you have a chance to influence any of the above three factors - "deciding what to automate", "knowledge of the product", "interviewing the testers" - dive in and help the industry move ahead!

1 comment:

Anonymous said...

Good article, it defines very well the meaning of automation, which in my opinion is the most important part in relation to software testing.


