I'm not an expert on Automation, but I've been around long enough to see the discussions and debates.
This is currently expressed in Chris McMahon's latest blog: Reviewing "Context Driven Approach to Automation in Testing".
My question is: should we, as a community, be talking about this? Is there any way we can make progress on this idea of test automation without stressful arguments? How can we communicate our experiences and ideas in a way that helps move our software testing craft forward?
(Note/edit: Alister Scott has also added to the conversation).
Automation is part of a tool set for testers, and the single most valuable tool any tester can possess is a curious, functioning mind. It's not just creating automated tests: it's building setup and teardown routines, putting together data generation scripts, running tool-assisted data comparisons, running tools to scan a codebase for security vulnerabilities (something I'm doing right now, as it happens).
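To make the "data generation scripts" point concrete, here is a minimal sketch of the kind of throwaway setup tooling a tester might write. All names and fields here are hypothetical, purely for illustration:

```python
import csv
import random
import string

def random_user(i):
    """Build one fake user record; the field names are illustrative."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return {"id": i, "username": name, "email": f"{name}@example.com"}

def generate_users(path, count=10):
    """Write `count` fake users to a CSV file for test setup."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "username", "email"])
        writer.writeheader()
        for i in range(count):
            writer.writerow(random_user(i))

generate_users("test_users.csv", count=5)
```

A matching teardown routine would simply delete the file (or truncate the table) so each run starts from a known state.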
There's a tendency on the part of managers to see automation and automated testing as a magic bullet that will fix all the problems, not least because many of the big box tools are marketed that way (and in my opinion, often lack flexibility to handle the more complicated scenarios without developer assistance or a coding tester/SDET), and this shows in the job boards and such.
Communication is the only way this will change, and it seems to happen one organization at a time, not through management courses or management seminars.
If there is one specific thing we should be discussing, it is how to lobby the vendors to stop professing that their tools are the answer to all your problems, especially at management level.
"We ought to be talking about Automation as a means to an end, and not the end itself."
"Automated testing is a noble goal, but should be seen as part of a tester's skillset (not their ONLY required skill), alongside the ability to write good test conditions, and provide the insight that testers can bring to a team."
YES! There are cases where almost everything can be automated; there are cases where some things can be automated, and there are cases where almost nothing can be automated. A good tester can use it when available (efficiency), work without it when it's not, and know when to do which.
We definitely should be discussing automation in the context of testing, but personally I am bored of the division in the arguments.
When approaching a problem (testing or not), having a broad toolkit at your disposal beats a single, potentially constrained approach. If you primarily automate tests in isolation or from paper requirements, then lacking the ability to dig deeper with stakeholders, to gain further context of the ask, will be a constraining factor.
Likewise, if you have no ability to utilise tools and rely on manual effort alone, you will find time and coverage constraining factors, irrespective of the customer value.
I feel that a quality, rounded tester is someone who has a broad skill-set. They can facilitate discussions; know how to lead in the quest for context and expanding the ask; can assess risk in both technical and business domains; can automate in terms of quick/dirty/throwaway tests and checks, and can build a suitable test framework (not just a test library); can read, write, and understand several languages addressing front-end, back-end, and integration needs; can communicate with techies and non-techies alike; and are forward-looking in the industry, not just regressive.
It's a big challenge for most testers to achieve this level of maturity, generally because we box ourselves in as a cog in the process and stay within our comfort zone.
But this challenge is one that I have set myself, and one I judge those whom I employ against.
Automated software testing is a process of software verification in which the basic steps, such as initialization, execution, analysis, and delivery of results, are performed automatically by automated testing tools.
Manual testing is performed by a human sitting in front of a computer, carefully executing the test steps. Automated testing means using an automation tool to execute your test case suite. The automation software can also enter test data into the system under test, compare expected and actual results, and generate detailed test reports.
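The enter-data / compare-results / report cycle described above can be sketched in a few lines. The function under test here (`discount`) is entirely hypothetical, just a stand-in for whatever the tool drives:

```python
def discount(price, percent):
    """Hypothetical function under test."""
    return round(price * (1 - percent / 100), 2)

# Each case: input data the tool would feed in, plus the expected result.
cases = [
    ((100.0, 10), 90.0),
    ((59.99, 0), 59.99),
    ((20.0, 50), 10.0),
]

report = []
for args, expected in cases:
    actual = discount(*args)
    report.append({"input": args, "expected": expected,
                   "actual": actual, "passed": actual == expected})

failures = [r for r in report if not r["passed"]]
print(f"{len(cases) - len(failures)}/{len(cases)} checks passed")  # 3/3 checks passed
```

A real tool wraps exactly this loop in scheduling, environment setup, and richer reporting, but the core comparison is no more mysterious than this.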
I'm coming to this a bit late, but beyond (and before) the nitty-gritty technical issues, I think there are a few things that are really important for us as a community: helping the people we work for (clients, employers...) understand what we're talking about when we as testers talk about automation, and why those further up the food chain might consider using (or not using) it.
i) There are so many tools a tester can use to increase the granularity with which they can view a system or the speed & pressure with which they can interrogate that system. All of these could have some claim to be 'automation' but most don't 'drive' the system as a user would. I would rather use a SQL query to review the posted contents of a database table than look through it 'manually' row by row any day! There's lots of good stuff by James B and others on this. Sure, the UI tools have their uses but quite often when I speak to clients & project sponsors that's ALL they're thinking about when they hear the magic word 'automation'. They often feel that if they don't have the UI tool then they haven't 'automated' and thus have been left behind.
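The SQL-over-eyeballing point can be made concrete with a tiny sketch. The table and column names below are hypothetical, and an in-memory SQLite database stands in for the real one:

```python
import sqlite3

# In-memory stand-in for a posted-transactions table; the schema is invented.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE postings (id INTEGER, account TEXT, amount REAL, status TEXT)"
)
conn.executemany(
    "INSERT INTO postings VALUES (?, ?, ?, ?)",
    [(1, "ACC-1", 250.0, "POSTED"),
     (2, "ACC-2", -75.5, "POSTED"),
     (3, "ACC-1", 300.0, "PENDING")],
)

# One query answers what scrolling row by row would take far longer to confirm:
# how many rows actually posted, and do the posted amounts sum as expected?
count, total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM postings WHERE status = 'POSTED'"
).fetchone()
print(count, total)  # 2 174.5
```

Nothing here "drives the system as a user would", yet it is every bit a test tool, and far cheaper than a UI suite for this job.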
ii) The decision to automate or not (in whatever form - let's call it 'Supportive Tooling') is, unsurprisingly, often cost based (time and money). The problem is that cost is often only ascribed to the first iteration of the product, i.e. "If I don't buy this bit of kit and don't build a supportive tooling framework, I can get the first iteration of the product into production faster and cheaper." Maybe, but how does that stack up over the lifetime of the product (for a financial platform that might be 7-8 years)? It's not just the regression testing overheads associated with future releases. What about the testing around quarterly patching cycles, DR testing, infrastructure updates and all those other things that happen in an actual workplace? How often are you going to make some poor tester grind through a stack of rigid manual scripted tests to ascertain that basic system functionality is unchanged? That's my definition of 'cruel and unusual'... It's very costly over the lifetime of an application and it can make the organisation inflexible to change.
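A back-of-envelope calculation shows how the lifetime framing changes the answer. Every number below is hypothetical, chosen only to illustrate the shape of the comparison:

```python
# Hypothetical figures purely to illustrate the lifetime-cost comparison.
manual_days_per_cycle = 5      # grinding through scripted regression by hand
cycles_per_year = 6            # releases + patch cycles + DR tests, say
years = 7                      # plausible lifetime of a financial platform

manual_total = manual_days_per_cycle * cycles_per_year * years

build_cost_days = 40           # up-front cost of a supportive tooling framework
automated_days_per_cycle = 1   # reviewing results, investigating failures
automated_total = build_cost_days + automated_days_per_cycle * cycles_per_year * years

print(manual_total, automated_total)  # 210 82
```

Judged on iteration one alone the tooling looks like a 40-day loss; judged over the lifetime it is a large saving. The real numbers will differ everywhere, but the asymmetry is the point.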
iii) The chosen 'automation' approach within an organisation can have a significant impact on what is expected of the cadre of currently employed testers. My experience is that these people are often drawn from a very mixed bag of backgrounds and abilities (both business domain and technical). They will all, however, have JOB DESCRIPTIONS, and playing with these can be like crumbling a bit of C4 into your first cigarette of the day...
This is NOT by any means an exhaustive list - but my tea levels have dropped to critical.....
Yeah! Because it will help build up a network which will bring lots of new information about automation, QA, software testing, etc. And share only the thoughts you have experienced yourself, not things heard here and there, because that can be the start of an argument.
Also, guys, I would like to know: is there any company, apart from IBM, TCS, and Astegic, that is best in software testing?
In the last couple of years I've seen just how wonderfully varied the software development industry is, from attending the National Software Testing Conference last summer to UKSTAR in London this week. There were testers from the media industry, the financial industry, the medical industry, the defense industry. Testers are testing mobile apps, web services, medical devices and critical systems. Some have made huge leaps in efficiency by adopting Agile and Scrum. Some have tried those methodologies and ended up binning them after a few short months as they simply did not work for them. And when all was said and done, what I took away was that everyone does things their own way, however best suits their product under development, their team, their environment, their industry. There is no 'one size fits all' approach to software testing. I think this is largely down to the politics of each company's leadership hierarchy, to individual industry standards and practices, and not least to budget.

If you're going to ask the question "would automation work for us?", you need to lay all your testing activities out on the table and analyse how simple and repetitive those tests are, and how much time it's taking you to complete those blocks of activity. Almost everyone could benefit from automating simple user interface 'does it work' checks. Making sure a button can be pressed or a field can have a value entered into it isn't testing; that's checking, and checking can be automated. A true test is executing the application with the INTENT of finding defects: pushing it and pulling it and driving over it until it falls over.

Automating the boring checking allows the tester to really get to grips with being what we really are: investigators, searching for the bad guys, those dag-nasty bugs. You can trim days and even weeks off your testing activity if the time-consuming, low-value chaff is automated.
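A scripted 'does it work' check can be sketched without any heavyweight tooling. In practice you would point a browser driver such as Selenium at the live page; here, as a minimal stand-in, a standard-library parser checks a static, hypothetical login form for the controls a user would need:

```python
from html.parser import HTMLParser

class ControlFinder(HTMLParser):
    """Collect form controls so a check can confirm they exist."""
    def __init__(self):
        super().__init__()
        self.buttons = []
        self.inputs = []

    def handle_starttag(self, tag, attrs):
        if tag == "button":
            self.buttons.append(dict(attrs))
        elif tag == "input":
            self.inputs.append(dict(attrs))

# Hypothetical login-page markup standing in for a rendered UI.
page = """
<form action="/login">
  <input type="text" name="username">
  <input type="password" name="password">
  <button type="submit">Log in</button>
</form>
"""

finder = ControlFinder()
finder.feed(page)

# The boring 'is it there' checks a tool can repeat on every single build.
assert any(b.get("type") == "submit" for b in finder.buttons)
assert any(i.get("name") == "username" for i in finder.inputs)
print("smoke checks passed")  # prints "smoke checks passed"
```

Once checks like these run unattended on every build, the human time they used to consume is freed up for the investigative testing described above.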
Can you please suggest some automation testing tools? I will start working on an automation project next week.
The discussion is very much useful.