Everybody wants to be part of the next big thing, whether that's some shiny new test management strategy or an emerging technology like Artificial Intelligence (AI). In fact, so many people want to be part of the AI movement that there's a real tendency to describe things as AI when they're really just automation or advanced analytics. This puts decision-makers on the test bench in something of a tricky spot—you need to perform your due diligence to make sure you're getting what you think you are out of a solution.
When it comes to Machine Learning (ML)—a specific sub-concept within the larger AI-sphere—there's less confusion. Why? Because ML refers to a particular set of processes that revolve around structuring and analyzing unstructured data in order to find insights and draw conclusions. Thus, while there are potentially infinite ways AI might transform test automation, there are a few defined, concrete applications for ML in the test lab. These include Quality of Experience (QoE) testing, SLA monitoring, and fraud detection, all of which can significantly contribute to achieving ROI for your test operations.
What is Machine Learning?
Before we talk about some of the specific ways ML can impact test automation, let's first figure out precisely what ML means. Per the MIT Technology Review, “Machine-learning algorithms use statistics to find patterns in massive amounts of data. Here, data encompasses a lot of things—numbers, words, images, clicks, what have you. If it can be digitally stored, it can be fed into a machine-learning algorithm.” It notes that recommendation engines on sites like Netflix and YouTube are often powered by this kind of statistical technology—to say nothing of mission-critical business insights. Using machine learning algorithms like the ones being described, you can extract value from data that would be impossible for a human to comb through.
For our purposes, the most important applications of machine learning technology are those in which it's combined with streaming algorithms. These algorithms sequence random, unstructured data to create a “sketch” that stakeholders can understand. For instance, throughout your automated testing workflows, you might be collecting tons of data points on different network states, along with signal traces, CDRs, etc. Though you probably have to pay to store this data somewhere, there's not much you can do to extract value from it by hand. Enter the streaming algorithm: by properly “querying” the system, data can immediately be extracted without sifting through and synchronizing tons of detailed records. Maybe there are dependencies in the arrangements you've made with your roaming partners that interfere with your legacy fallback procedures, which might account for a recent uptick in dropped calls. Conversely, you might discover snippets of code in your OSS/BSS systems that are adversely impacting user experience on your network.
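To make the idea of a streaming "sketch" concrete, here is a minimal example of one classic such algorithm, the Count-Min sketch, which estimates event frequencies over an unbounded stream in fixed memory rather than storing every record. The call-drop cause codes below are hypothetical stand-ins for the kinds of records a test lab might stream:

```python
import hashlib

class CountMinSketch:
    """Approximate frequency counts over a data stream in fixed memory."""

    def __init__(self, width=256, depth=4):
        self.width, self.depth = width, depth
        self.table = [[0] * width for _ in range(depth)]

    def _hashes(self, item):
        # One independent-ish hash per row, derived from SHA-256.
        for i in range(self.depth):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.width

    def add(self, item):
        for row, idx in enumerate(self._hashes(item)):
            self.table[row][idx] += 1

    def estimate(self, item):
        # Collisions can only inflate counts, so the minimum over all
        # rows is the tightest (over-)estimate of the true frequency.
        return min(self.table[row][idx] for row, idx in enumerate(self._hashes(item)))

# Hypothetical stream of call-drop cause codes from automated test runs.
stream = ["handover_fail"] * 120 + ["codec_mismatch"] * 30 + ["roaming_auth"] * 5

sketch = CountMinSketch()
for event in stream:
    sketch.add(event)

print(sketch.estimate("handover_fail"))  # close to the true count of 120
```

The point is the trade-off: instead of synchronizing every detailed record, the sketch answers frequency "queries" immediately, at the cost of a small, bounded overestimate.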
How Machine Learning Impacts Churn Rates
As part of their broader ability to wrangle large and complicated caches of data, ML algorithms can also help testers add new metrics and KPIs to their existing test reports, localize faults more precisely, and improve their services. To wit, ML can be a big help when it comes to generating QoE (quality of experience) metrics to complement your existing QoS (quality of service) measurements. By modeling your audio quality as a mean opinion score using POLQA (Perceptual Objective Listening Quality Analysis), for instance, you can go beyond the usual purview of an end-to-end test to answer some critical questions:
- How would subscribers rate the audio quality they're experiencing, above and beyond the fact that the audio is connected?
- What factors might be negatively impacting audio QoE for your subscribers, and how can you address them?
- What critical facets of the user experience are your more objective QoS metrics leaving out or glossing over?
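A true perceptual score like POLQA requires the actual audio signal, but the relationship between QoS measurements and a mean opinion score (MOS) can be sketched with a simplified E-model-style heuristic. The coefficients below are rough, uncalibrated approximations chosen for illustration; only the final R-to-MOS mapping follows the standard ITU-T G.107 formula:

```python
def estimate_mos(latency_ms: float, packet_loss_pct: float, jitter_ms: float = 0.0) -> float:
    """Rough MOS estimate (1.0–4.5) from QoS measurements.

    A heuristic stand-in for a perceptual score such as POLQA;
    impairment coefficients here are illustrative, not calibrated.
    """
    # Fold jitter into effective delay (a common heuristic), plus codec overhead.
    d = latency_ms + 2 * jitter_ms + 10.0
    r = 93.2  # default transmission-rating baseline
    # Delay impairment: mild below ~160 ms, steeper beyond.
    if d < 160:
        r -= d / 40.0
    else:
        r -= (d - 120.0) / 10.0
    # Packet-loss impairment (illustrative linear penalty).
    r -= 2.5 * packet_loss_pct
    r = max(0.0, min(100.0, r))
    # Standard R-factor-to-MOS mapping (ITU-T G.107).
    return 1.0 + 0.035 * r + r * (r - 60.0) * (100.0 - r) * 7e-6

print(round(estimate_mos(latency_ms=40, packet_loss_pct=0.5), 2))   # healthy network
print(round(estimate_mos(latency_ms=300, packet_loss_pct=3.0), 2))  # degraded link
```

Even this toy model shows why QoE matters: two calls that both "connect" can land at very different predicted opinion scores once delay and loss are accounted for.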
Since user satisfaction is more closely correlated with the kinds of intangibles that QoE tracks than with objective QoS measures like latency and packet loss, combining QoE and QoS in this way can go a long way towards mitigating the factors that drive user churn. With machine learning algorithms synthesizing these network realities for you, you can finally pinpoint the issues that are causing users to seek other providers or carriers, and work to address them in an end-to-end manner.
Challenges in Machine Learning Integration
Hopefully, this doesn't sound too good to be true—because it isn't. This is real technology being deployed by telco testers worldwide to improve customer experience and decrease churn. It's also being used to speed up service verification across the board via a reduction in manual effort. The question is, how can you deploy it within your operation? For starters, you need to collect and aggregate all of the test data that your ML algorithms will be combing through; this means using a centralized test management system that can handle distributed testing processes with a high degree of visibility.
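That aggregation step can be sketched very simply. The snippet below rolls hypothetical per-site test records (invented field names, not any particular system's schema) up into per-site KPIs of the kind an ML pipeline would consume:

```python
from collections import defaultdict

# Hypothetical per-site test records, as a centralized system might ingest them.
records = [
    {"site": "berlin", "test": "voice", "latency_ms": 45,  "dropped": False},
    {"site": "berlin", "test": "data",  "latency_ms": 60,  "dropped": False},
    {"site": "madrid", "test": "voice", "latency_ms": 120, "dropped": True},
    {"site": "madrid", "test": "voice", "latency_ms": 110, "dropped": False},
    {"site": "madrid", "test": "data",  "latency_ms": 70,  "dropped": False},
]

# Group distributed results by site.
by_site = defaultdict(list)
for r in records:
    by_site[r["site"]].append(r)

# Reduce each group to KPIs suitable as ML features.
kpis = {
    site: {
        "avg_latency_ms": sum(r["latency_ms"] for r in rs) / len(rs),
        "drop_rate": sum(r["dropped"] for r in rs) / len(rs),
        "runs": len(rs),
    }
    for site, rs in by_site.items()
}
print(kpis["madrid"])  # per-site KPIs ready to feed downstream models
```

In practice this reduction would run inside the centralized test management system, but the principle is the same: visibility first, then features, then insights.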
Such a system can also help you leverage new network topologies like network slicing, SDN/NFV (Software Defined Networking/Network Function Virtualization), and cell densification, improving network operations through the increased visibility that ML methods provide. The more comprehensive your data collection, the more relevant the machine learning processes become and the better the insights they return. The better those insights get, the more effectively you can save time for human testers and engineers, address customer concerns before they lead to subscriber churn, and create a more stable and effective network.
Again, this isn't science fiction—this is something that testers can employ right now to improve their processes, boost their ROI, and future-proof their offerings. The trick is to make sure you build out your testing processes with a genuinely flexible and data-driven framework. This will give you the foundation you need to take full advantage of machine learning-based improvements, to say nothing of any new technologies that emerge in the next few years.