I'm not an expert on automation, but I've been around long enough to see the discussions and debates.
This is currently expressed in Chris McMahon's latest blog: Reviewing "Context Driven Approach to Automation in Testing".
My question is: should we, as a community, be talking about this? Is there any way we can make progress on this idea/theme of test automation without stressful arguments? How can we communicate our experiences and ideas helpfully, to move our software testing craft forward?
(Note/edit: Alister Scott has also added to the conversation).
If there is one specific point we should be discussing, it is how to lobby the vendors to stop professing that their tools are the answer to all your problems, especially at management level.
We definitely should be discussing automation in the context of testing, but personally I am bored of the division in the arguments.
When approaching a problem (testing or otherwise), having a broad toolkit at your disposal must supersede a single, potentially constrained approach. If you primarily automate tests in isolation, or based on paper requirements, then lacking the ability to dig deeper with stakeholders, to gain further context on the ask, will be a constraining factor.
Likewise, if you have no ability to utilise tools and rely on manual effort alone, you will find time and coverage to be constraining factors, irrespective of the customer value.
I feel that a quality, rounded tester is someone with a broad skill-set. They can facilitate discussions; know how to lead the quest for context and expand the ask; can assess risk in both technical and business domains; can automate quick/dirty/throw-away tests/checks and can build a suitable test framework (not just a test library); can read/write (and understand) several languages addressing front-end/back-end/integration needs; can communicate with techies and non-techies alike; and look forward in the industry rather than only backward.
It's a big challenge for most testers to achieve this level of maturity, generally because we box ourselves in as a cog in the process and stay within our comfort zone.
But this challenge is one that I have set myself, and against which I judge those whom I employ.
Automated software testing is a verification process in which the basic test steps, such as initialization, execution, analysis, and delivery of results, are performed automatically by automated testing tools.
Manual testing is performed by a human sitting in front of a computer, carefully executing the test steps. Automation testing means using an automation tool to execute your test case suite. The automation software can also enter test data into the System Under Test, compare expected and actual results, and generate detailed test reports.
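As a minimal sketch of that loop (feed data in, compare expected against actual, report), here is a small example using Python's standard `unittest` runner. The `apply_discount` function is a hypothetical stand-in for the System Under Test, not from the original discussion.

```python
import unittest

# Hypothetical stand-in for the System Under Test (SUT).
def apply_discount(price, percent):
    """Return the price reduced by the given percentage, rounded to 2 dp."""
    return round(price * (1 - percent / 100.0), 2)

class DiscountTests(unittest.TestCase):
    """Each test enters data into the SUT and compares expected vs actual."""

    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(100.00, 10), 90.00)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

if __name__ == "__main__":
    # The runner executes every test and prints a pass/fail report.
    unittest.main(exit=False)
```

The report the runner prints is exactly the kind of artefact a tool generates automatically, which a human would otherwise have to record by hand for each scripted step.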
I'm coming to this a bit late, but beyond (and before) the nitty-gritty technical issues, I think there are a few things that are really important for us as a community: helping the people we work for (clients/employers) understand what we, as testers, mean when we talk about automation, and why those further up the food chain might consider using (or not using) it.
i) There are so many tools a tester can use to increase the granularity with which they can view a system or the speed & pressure with which they can interrogate that system. All of these could have some claim to be 'automation' but most don't 'drive' the system as a user would. I would rather use a SQL query to review the posted contents of a database table than look through it 'manually' row by row any day! There's lots of good stuff by James B and others on this. Sure, the UI tools have their uses but quite often when I speak to clients & project sponsors that's ALL they're thinking about when they hear the magic word 'automation'. They often feel that if they don't have the UI tool then they haven't 'automated' and thus have been left behind.
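To illustrate the SQL-query-over-manual-review point above, here is a small self-contained sketch using Python's built-in `sqlite3`. The table name, columns, and data are illustrative assumptions, not from any real system.

```python
import sqlite3

# In-memory database standing in for a real posted-transactions table;
# the schema and rows below are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE postings (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO postings VALUES (?, ?, ?)",
    [(1, 120.50, "POSTED"), (2, 75.00, "POSTED"), (3, 9.99, "FAILED")],
)

# One targeted query replaces a row-by-row manual review:
# did every posting land in the expected status?
failed = conn.execute(
    "SELECT id, amount FROM postings WHERE status != 'POSTED'"
).fetchall()
print(failed)  # any rows returned are candidates for investigation
```

Nothing here "drives" the system as a user would, yet it is still tooling that sharpens what the tester can see, which is the broader sense of 'automation' being argued for.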
ii) The decision to automate or not (in whatever form - let's call it 'Supportive Tooling') is, unsurprisingly, often cost based (time and money). The problem is that cost is often only ascribed to the first iteration of the product. I.e. if I don't buy this bit of kit and don't have a supportive tooling framework, I can get the first iteration of the product into production faster/cheaper. Maybe, but how does that stack up over the lifetime of the product (for a financial platform that might be 7-8 years)? It's not just the regression testing overheads associated with future releases. What about the testing around quarterly patching cycles, DR testing, infrastructure updates and all those other things that happen in an actual workplace? How often are you going to make some poor tester grind through a stack of rigid manual scripted tests to ascertain that basic system functionality is unchanged? That's my definition of 'cruel and unusual'.....It's very costly over the lifetime of an application and it can make the organisation inflexible to change.
iii) The chosen 'automation' approach within an organisation can have a significant impact on what is expected of the cadre of currently employed testers. My experience is that these people are often drawn from a very mixed bag of backgrounds and abilities (both business domain & technical). They will all however have JOB DESCRIPTIONS, and playing with these can be like crumbling a bit of C4 into your first cigarette of the day.......
This is NOT by any means an exhaustive list - but my tea levels have dropped to critical.....
Yeah! Because it will help build up a network that brings in lots of new information on automation, QA, software testing, etc. And share only the thoughts you have experienced yourself, not things heard here and there, because that might be the start of an argument.
As well, guys, I would like to know: is there any company apart from IBM, TCS, and Astegic that is best at software testing?