How to Evaluate Telecom Testing Tools

We’ve talked a little bit already on this blog about how automated testing is becoming a virtual necessity for telco operators. As increasing device fragmentation and the proliferation of different protocols continue to inflate the number of use cases that require testing, human testers are struggling to keep up.

But, if you’re reading this, you probably know all that. More than that, you’re probably almost ready to take the plunge and seriously consider an automated testing solution for your business. The question at this point is, how do you choose the right tool?

Obviously, there is no one easy answer to this question—but here are a few considerations that might help you identify testing tools that can improve your test coverage and increase your testing ROI.

Ease of Scripting

Ultimately, success or failure in adopting new technology tends to come down to its implementation: Is it incorporated into existing test frameworks in such a way as to promote ease of use? Does it actually provide results in a way that users can access and turn into added value?

No matter how fancy or sophisticated a piece of software (or hardware) is, its real value is going to be limited by the user’s ability to understand and interact with the technology. For businesses in the telecom sphere, this means that a valuable solution will often have to be usable by both technical and non-technical personnel—meaning that it will have to be evaluated by two different standards of usability.

For the first category (technical users), we find that a good proxy for how well a solution will slot into existing frameworks is ease of scripting. How easy or difficult is it for users to define parameters and use cases in the appropriate programming language?

How quickly are users able to learn any new scripting languages and use them to extract value from the tool at hand? The answers to these questions will tell you a lot about overall usability.
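To make this concrete, here is a rough sketch (in plain Python, with purely illustrative class and field names, not any real product's API) of the kind of low-friction scripting you'd hope to see: an engineer should be able to define a use case and its pass/fail criteria in a handful of readable lines.

```python
from dataclasses import dataclass

# Hypothetical sketch of a scriptable test definition. The names
# (VoipTestCase, max_setup_ms, etc.) are illustrative assumptions,
# not taken from any actual testing tool.
@dataclass
class VoipTestCase:
    name: str
    caller: str
    callee: str
    codec: str = "G.711"
    max_setup_ms: int = 2000  # fail if call setup takes longer than this

    def evaluate(self, measured_setup_ms: int) -> bool:
        """Pass/fail judgment against the defined threshold."""
        return measured_setup_ms <= self.max_setup_ms

# Defining a new use case should be roughly this terse:
case = VoipTestCase("basic-call", caller="+15550100", callee="+15550101")
print(case.evaluate(1450))  # a 1.45 s setup passes the 2 s threshold
```

If defining a test like this takes pages of boilerplate instead, that's a warning sign about how the tool will scale across your team.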

Non-technical Usability

If a tool requires too much time for scripting use cases and defining frameworks for technical users, the potential time and cost savings can quickly go out the window. By the same token, if non-technical users can’t interact with the tool in the desired way, any time saved can be cancelled out once the testing process moves beyond the engineer’s desk.

A tool that maximizes the ease of use for technical users could easily fail to prioritize non-technical users, which would make it difficult to expand the impact of the test automation into the rest of the organization.

If, for example, you’re automating your VoIP tests, and the executive making the go/no-go decision for a particular rollout of network updates can’t access or understand the results of those tests, then from a decision-making standpoint you’ve added little to no value.

If, by contrast, users with limited programming ability are still able to navigate the test infrastructure to pinpoint areas of improvement and understand service levels, you create an environment where decision-makers are empowered by data.


Reporting Quality

Speaking of being empowered by data—the next big consideration when evaluating a telecom test automation tool is the quality of reporting being offered. First off, you’ll want to see what formats the reports can actually be exported to.

If a piece of software only lets you view reports in some arcane file type and can’t export to Excel spreadsheets or other common formats, then you run the risk that the results of your tests will be siloed and inaccessible.

If, on the contrary, you can store test data in an Excel file, or visualize it cleanly and easily through the software interface itself, you put your company in a position where even users who are not directly involved in the nitty gritty of your test operations can analyze and understand the results.
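As a quick illustration of why this matters, even something as simple as a CSV export (which Excel opens directly) can be enough to get results out of the engineering silo. The sketch below is hypothetical: the field names and file name are assumptions for the example, not any tool's actual output format.

```python
import csv

# Illustrative test results; the field names here are assumptions
# chosen for the example, not a real tool's report schema.
results = [
    {"test": "voip-basic-call", "status": "PASS", "setup_ms": 1450},
    {"test": "voip-hold-resume", "status": "FAIL", "setup_ms": 2600},
]

# Writing results as CSV makes them openable in Excel or any
# spreadsheet tool, so non-technical stakeholders can read them.
with open("test_report.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["test", "status", "setup_ms"])
    writer.writeheader()
    writer.writerows(results)
```

A vendor whose tool can't manage at least this level of interoperability is effectively asking every report to pass through an engineer before anyone else can use it.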

If you’re evaluating a test solution that uses a keyword-based framework, the results of your tests should be easy to filter by keyword.
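In practice, keyword filtering can be as simple as the sketch below. Again, the tags and result structure are invented for illustration; the point is that slicing results by keyword should be a one-step operation, not a manual search.

```python
# Hypothetical keyword-tagged results; tags and fields are
# illustrative, not from any real framework.
results = [
    {"test": "register-sip", "keywords": {"sip", "registration"}, "status": "PASS"},
    {"test": "basic-call",   "keywords": {"sip", "voice"},        "status": "PASS"},
    {"test": "sms-delivery", "keywords": {"sms"},                 "status": "FAIL"},
]

def filter_by_keyword(results, keyword):
    """Return only the results tagged with the given keyword."""
    return [r for r in results if keyword in r["keywords"]]

print([r["test"] for r in filter_by_keyword(results, "sip")])
# → ['register-sip', 'basic-call']
```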

Vendor Quality

Here, things get a little bit nebulous. Whether or not a testing solution successfully meets the needs of both technical and non-technical users, and whether or not it offers readable, usable reporting functionality, should be pretty clear from a little research or a quick trial.

But what do we mean by vendor quality? Should the vendor really matter if the product itself has been deemed satisfactory? Well, as a matter of fact it should. Why? Because the quality of the product doesn’t necessarily tell you everything you need to know about the quality of support you’ll receive in using the product.

This is the type of thing that you can often ascertain from online vendor reviews, but you should also make sure that any potential partner has a clear support plan in place that they can communicate to you.

Also, given the complexities of the telecom domain when it comes to protocols, network elements, diversity of devices and operating systems, etc., you might give strong consideration to how much telecom-specific experience your vendor can offer.

Have they worked with telecoms in the past to automate testing? Do they specialize in the field? If so, they may be better positioned to help you with issues that are specific to your industry.


Cost and Scalability

Last but not least, we have what is arguably the biggest factor in your return on investment: the investment itself. Cost is, of course, always a major consideration—but it’s complicated by the fact that ROI is difficult to measure in advance with any precision.

Sure, you might find that you want to balance cost against likely payoff, but until you’ve actually adopted a solution and used it for a while it will be difficult to assign a precise dollar (or euro) figure to that payoff.

For this reason, one of the best ways to tackle the uncertainty inherent in cost calculations is to seek out solutions that can be deployed on a limited basis and then scaled up or down as needed.

Scalability is, of course, a valuable asset on its own, but in this context it offers you the ability to know what you’re getting into before you make a companywide commitment to a single solution.