Peer to Peer: ILTA's Quarterly Magazine
Summer 2021
Issue link: https://epubs.iltanet.org/i/1388375

• False sense of security – the goal is to reduce manual testing while making sure new features don't break the application. But automated tests provide validation only at the end of a test run, which is different from a person observing every step, and some aspects of the user experience cannot be duplicated by automation. And again, if the tests themselves contain defects, automation simply preserves the problem.

• Maintenance – for every application and system update, tests must be reviewed and updated. If updating the tests takes more effort than rewriting them, something has gone wrong. Good planning and a continuing investment in maintenance are needed to manage this effectively.

• Organizational support – choosing tools, training and experimenting all need an integrated approach at the organizational level. The infrastructure needed for automation is often hard to justify for only one or a few applications, so you need support from management. Communicating the real costs and benefits is critical to acquiring that support.

• Technical issues – tools require detailed technical knowledge and specialist skills. Software that was not designed to be testable can also be a challenge, and anything that is "real time" can be tough. Identifying technical issues up front and choosing the best approach is key.

• Lack of investment – the initial phase of test automation is expensive if you are doing it on your own: design and build, reusable functions, licensing costs, operating costs. Even free tools cost time to learn and maintain. Your scope has to cover these costs, or you may have to give up test automation partway through.

• Team cohesion – get the whole team involved in identifying test automation objectives and setting targets, building evidence from historical data, even doing a proof of concept. The entire team needs to understand the goals, scope and timeframe, discuss what should and should not be automated, and get the right infrastructure in place to support it.

• "Testware" is software – even when the test process is reasonably well defined, many standard software development challenges and requirements are involved. Test instructions live in scripts, programs or frameworks, and they are still source code, so you have to apply coding principles (a minimal sketch follows at the end of this article):
  • Source control management
  • Coding best practices and coding standards
  • Continuous integration best practices
  • Documentation and versioning

Considerations For Test Automation

There are numerous areas to review and consider in starting a test automation journey:

Quality Assurance and Test Maturity
• Is your team mature in its approach to quality? Do you follow a good quality process for your software implementation projects and for ongoing manual testing of your applications and integrations? If yes, you may be ready to consider automated testing.
• A strong, high-quality test process is important when transitioning to test automation.
• When deciding what to automate and when, consider software that is stable and needs to be tested on an iterative basis, repeated smoke testing, static and repetitive tests, and data-driven testing with large data sets.

Total Cost of Ownership
• Tool costs – purchase and licensing costs.
• Time costs – defining the approach and strategy, implementation and execution.
• People costs – skills, training, maintenance.
• Equipment costs – infrastructure, or device costs for mobile testing.

Training
• Team members need time to learn the tools.
• Training in execution and in reviewing results.
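To illustrate the "testware is software" and data-driven testing points above, here is a minimal sketch of a parametrized automated test. It assumes a hypothetical calculate_invoice_total() function in a billing module of the application under test; the function name, the data values and the pytest framework are illustrative assumptions, not taken from the article.

```python
# A minimal sketch of a data-driven automated test. The point is that test
# code is source code: it lives in version control, follows coding standards,
# and runs in continuous integration like any other program.
import pytest

from billing import calculate_invoice_total  # hypothetical module under test

# Each tuple is one data row: (line_item_amounts, tax_rate, expected_total).
# Large data sets can be loaded from a CSV file or database instead of
# being written inline.
TEST_CASES = [
    ([100.00, 250.00], 0.08, 378.00),
    ([0.00], 0.08, 0.00),
    ([19.99, 5.01], 0.00, 25.00),
]


@pytest.mark.parametrize("amounts, tax_rate, expected", TEST_CASES)
def test_invoice_total(amounts, tax_rate, expected):
    # One assertion per data row; the framework reports each row as a
    # separate test result.
    assert calculate_invoice_total(amounts, tax_rate) == pytest.approx(expected)
```

In a data-driven style like this, extending coverage usually means adding a data row rather than writing a new script, which also keeps the maintenance burden discussed above under control.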
