Delivering Effective AI for Telecom Companies: Trusted, Open, Hybrid

It’s not a surprise that in today’s challenging economic landscape, rising costs pose a significant threat to the telecommunications industry. Consider that in 2022, Bain & Company was predicting that telcos would grapple with increased personnel costs and escalating operating costs due to inflation. And here we are.

To combat these challenges, telcos must proactively seek opportunities to streamline operations and optimize revenue streams. From embracing automation to digital transformation initiatives, data plays a significant role in powering cost control strategies. According to consulting firm BCG’s 2024 Telco Value Creators Report, leading telcos are “radically optimizing costs through next-generation network architecture and core-to-cloud transformation” as well as “exploring ways to deploy GenAI (generative AI) that will transform each step in the industry’s value chain.”

Adapting to these economic shifts is crucial for telcos not only to survive but also to thrive in an increasingly competitive market. With the strategic use of open-source solutions and generative AI, the industry can not only implement cost-effective approaches but also pave the way for enhanced efficiency and scalability.

“Because gen AI democratizes access to powerful capabilities, any telco—a small operator or large incumbent—can reshape customer expectations and its organizational efficiency. In doing so, they can potentially narrow previously unassailable competitive advantages and overturn long-standing barriers to growth.” (McKinsey)

How Cloudera powers cost-control strategies

Drawing on Cloudera’s experience delivering data and AI to the telecommunications industry, we conducted an internal assessment of the areas where we and our customers have jointly built the most value. Here are ten areas where Cloudera has enabled telcos to leverage open source and generative AI to implement cost-control measures.

Ten Strategies for Success

1. Reduce cloud costs by running large regular data workloads on-prem
The public cloud is flexible, agile, and accessible: a great option for a business looking to innovate fast and develop new ideas. Costs can rise and fall with demand, but as new workloads mature, that variability dies away. As Rakuten CMO Geoff Hollingworth has said, “[a]s soon as a workload gains known predictable load, move it to private cloud for simple economic reasons.” Most telecom companies experience “sticker shock” when public cloud operations go into full production and seek FinOps and other solutions to help; careful workload management can accelerate cost reduction by offering more efficient alternatives.

2. Deploy Flink for high volume data streaming
One major carrier is ingesting 3PB every single day into its Cloudera environment, mostly network data, logs and telemetry from its extensive network across the USA. That’s 3,000 terabytes, every single day. At that volume, old-school batch file processing simply doesn’t work, and real-time streaming is imperative. Flink, supported by Cloudera for many years through SQL Stream Builder, is one of the latest innovations in the Apache stable, designed to meet higher volume and velocity demands while significantly lowering the cost of high-volume data operations. Combined with Kafka and NiFi, open-source data streaming operates seamlessly at carrier-grade network scale.
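To make the idea concrete, the core computation a Flink SQL streaming job performs over a Kafka topic, a continuous windowed aggregation, can be sketched in plain Python. The telemetry values and the 60-second window below are purely illustrative, not drawn from any carrier’s deployment:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_secs=60):
    """Group (timestamp, value) telemetry events into fixed, non-overlapping
    windows and compute each window's average -- the kind of aggregation a
    Flink SQL job would run continuously over a live stream."""
    windows = defaultdict(list)
    for ts, value in events:
        # Align each event to the start of its tumbling window
        windows[ts // window_secs * window_secs].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(windows.items())}

# Hypothetical latency samples: (epoch seconds, milliseconds)
events = [(0, 10.0), (30, 20.0), (61, 40.0), (119, 60.0), (120, 5.0)]
print(tumbling_window_avg(events))  # {0: 15.0, 60: 50.0, 120: 5.0}
```

In a real deployment, Flink runs this logic incrementally and in parallel across the cluster, emitting results as each window closes rather than after the fact.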

3. Network anomaly detection as the grounding for network automation
As telcos seek to reduce the cost of operations, the network persists as their highest-cost domain. Automating network operations remains the nirvana for the industry, and to a large extent that begins in service assurance with network anomaly detection. While traditional tools like fault and performance management have their uses, automated anomaly detection in network streams can trigger configuration changes or capacity reallocation to offset potential issues, reducing the incidence and impact of network degradations. TIM Sparkle in Italy, one of the largest global network providers in the world, has been using Cloudera CDP to support network anomaly detection for many years.
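As a minimal illustration of the statistical idea behind anomaly detection (production systems use far richer models), a simple z-score check over a KPI stream looks like this; the packet-loss figures are invented for the example:

```python
import statistics

def detect_anomalies(samples, threshold=3.0):
    """Return the indices of KPI samples that deviate more than `threshold`
    standard deviations from the mean -- a minimal stand-in for the models
    used in network anomaly detection."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # a perfectly flat series has no outliers
    return [i for i, x in enumerate(samples) if abs(x - mean) / stdev > threshold]

# Hypothetical per-minute packet-loss percentages from one network element
loss = [0.1, 0.2, 0.1, 0.15, 0.1, 9.5, 0.2, 0.1]
print(detect_anomalies(loss, threshold=2.0))  # [5]
```

A flagged index like this is what would, in an automated pipeline, trigger the configuration change or capacity reallocation described above.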

4. Generative AI for network anomaly description and resolution
Networks are complicated. They are multi-vendor, they mix telecom and IT standards across wireless, wireline and legacy models, and customers expect a single view of their operations. Finding and diagnosing network defects can be slow and difficult, and institutional knowledge often resides with a few key individuals, creating risk for the business should one of them leave. Generative AI can describe network issues in natural language as they arise, and even suggest likely causes and resolutions based on historical performance, vendor communities and other documentation. One of the largest operators in the Middle East is now using Cloudera to explore how generative AI can improve its operational performance by combining public services such as Hugging Face, a Cloudera partner, with on-prem local data for hyper-contextualised applications.
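One simple way to picture the pattern: structured anomaly records are rendered into natural language and fed to a model alongside contextual documents. The sketch below shows only that rendering step; the field names and the element ID are hypothetical, not any operator’s schema:

```python
def describe_anomaly(anomaly):
    """Turn a structured anomaly record into a natural-language summary
    that could seed an LLM prompt alongside vendor docs and historical
    performance data. All field names here are illustrative."""
    return (
        f"At {anomaly['time']}, element {anomaly['element']} reported "
        f"{anomaly['metric']} of {anomaly['value']} "
        f"(baseline {anomaly['baseline']}). "
        f"Describe likely causes and recommended remediation steps."
    )

prompt = describe_anomaly({
    "time": "2024-05-01T10:32Z",
    "element": "gNB-0042",   # hypothetical 5G base station ID
    "metric": "packet loss",
    "value": "9.5%",
    "baseline": "0.1%",
})
print(prompt)
```

Grounding the model’s answer in local, on-prem data (tickets, topology, vendor manuals) is what turns a generic reply into the hyper-contextualised response described above.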

5. Rationalise data mediation platforms with open source NiFi
Historically, data mediation platforms were used to collect data from the network and ‘translate’ it into consumable transactions for billing, service assurance and security operations. They were often vertically focused, serving only telecom operations, as those businesses had ‘unique needs’ in terms of scale and complexity. Open source tools like NiFi can replace those systems at a fraction of the cost, and scale is no longer a challenge that only vertical appliance solutions can meet, as LG uPlus found out. With Cloudera as the network data mediation layer for its entire wireline and 3G/4G/5G wireless service assurance functions, it is ingesting over 400TB of network telemetry per day.

6. Hybrid cloud data architecture for public cloud workload offload
While public cloud offers significant advantages over on-prem for data and AI, such solutions come with constraints. Regulatory, security and jurisdictional preferences and laws may each require that certain workloads or particular data sets not reside on public cloud infrastructure. Similarly, as mentioned above, certain workloads vary in resource requirements while being designed, so it’s cheaper to avoid sizing for the maximum requirement in that early phase; once mature, these workloads often become much more predictable and consistent in scale, making an on-prem solution more cost-effective. Over time, as workloads grow or shrink, and as regulations, privacy postures and company policies evolve, new options may become available. The capacity to move workloads around, from on-prem to public cloud, between public clouds, and back to the data center, helps ensure the best option is always available for each workload. One major European telco had been working on a plan to migrate on-prem workloads to the public cloud when the Schrems II decision on GDPR and related matters caused its privacy posture to change. Cloudera allowed it to keep its architecture on-prem while retaining the option to migrate workloads to the public cloud as and when that posture changes again.

7. Re-platform appliance-based processes for lower TCO
Appliances combine hardware and software into a dedicated solution for large-volume data workloads, and in the past they provided performant solutions for specific workloads. They were expensive, but the bang for the buck was great. Today, however, with open data lakehouse solutions from Cloudera, built on open source technologies like Apache Iceberg, those appliances cost far more than is necessary, though telcos are often stuck with them as legacy systems with too many dependent applications. The Cloudera platform can offload data from these appliances (which are usually charged for on a per-terabyte basis) while retaining the front-end interface and avoiding any disruption to the dependent applications. This can dramatically reduce the cost of operations, as hundreds of Cloudera clients have discovered, including Saudi Telecom.

8. Redeploy internal open-source data and AI support teams
When building open source solutions for your internal operations, your IT support team must develop capabilities to support internal customers on both the applications you have developed and the open source components inside them. As those applications scale and require enhanced availability, Cloudera can provide enterprise-grade support for core application components and relieve pressure on internal commitments. Nokia faced a similar dilemma for its AVA suite of products, and rather than recruit specialists in Iceberg, HBase, Ozone, and other Apache open source components, it partnered with Cloudera to deliver its telecom analytics solutions. Nokia focuses on the telecom vertical solutions, while Cloudera focuses on the data infrastructure powering them.

9. Use high-volume network streaming data and generative AI to construct real-time customer profiles for hyper-personalised agent response
Just as automation is required in the network, so too is it required in customer support. Traditionally, customer profiles derive from BSS data, focusing mainly on payment, interaction and billing information. Network data can add layers of rich passive experience data, building nuance into each customer interaction in ways that move the needle on customer experience. More and more telecom operators are using generative AI running on Cloudera to dynamically translate and interpret voice interactions with real-time recommendations, sentiment analysis and CSR guidance. MTN has been combining mobile money and telecom network data to drive digital campaigns to even greater levels of targeting across Africa, and in one recent case improved campaign uptake by 186%.
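The profile-enrichment idea can be sketched as a merge of a BSS record with network-derived signals. The field names and the ‘experience flag’ rule below are illustrative assumptions for the sake of the example, not a Cloudera schema:

```python
def build_profile(bss_record, network_stats):
    """Merge a billing-system record with network-derived experience
    signals into a single profile an agent (or an LLM prompt) can use.
    All field names and thresholds here are hypothetical."""
    dropped = network_stats["dropped_calls_7d"]
    avg_mbps = network_stats["avg_downlink_mbps_7d"]
    profile = dict(bss_record)
    profile["recent_dropped_calls"] = dropped
    profile["avg_downlink_mbps"] = avg_mbps
    # Rule-of-thumb experience score an agent could see at a glance
    profile["experience_flag"] = "degraded" if dropped > 3 or avg_mbps < 5 else "ok"
    return profile

bss = {"customer_id": "C-1001", "plan": "5G Unlimited", "overdue": False}
net = {"dropped_calls_7d": 5, "avg_downlink_mbps_7d": 42.0}
print(build_profile(bss, net)["experience_flag"])  # degraded
```

Surfacing a signal like this during a live call is what lets a CSR (or a generative assistant) open with the customer’s actual recent experience rather than just their billing history.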

10. Use Cloudera DataFlow to centralize data distribution and minimize data redundancy
While there is no shortage of data in telecom operations, neither is there any shortage of demand for that data. Network operations and customer management are both significant consumers, but within those groups there are multiple data consumers, and on top of that B2B, finance, IT, fleet, channels and countless other groups are all looking for data sets, often the same data. This can lead to redundancy, unnecessary spending, and a loss of control over how data is being used, which is a real security risk. Some of the largest telcos in the world are using Cloudera DataFlow and Cloudera Observability, both based on open source technologies, to optimize their data and resource consumption and get their arms around their vast reservoirs of enterprise and network data and their cloud and data center resource allocations.

For almost twenty years, Cloudera has led the evolution of open source data and AI as its leading innovator. With eighty of the top one hundred telecommunications service providers running Cloudera, and over 25 exabytes under management (as much as any of the hyperscalers), Cloudera is the industry powerhouse when it comes to high-end data and AI engineering.

Cloudera will be sponsoring this year’s TM Forum Digital Transformation World in Copenhagen, Denmark, June 18-20. Come and talk to us about your data workloads; we’d love to explore them with you! Click here to request a meeting, and we’ll be in touch!

Anthony Behan
Global Managing Director, Communications, Media & Entertainment at Cloudera