Although the mobile industry is booming, major Mobile Network Operators have had few opportunities to collaborate with Software Houses.
Most software used by Mobile Network Operators (MNOs), from Customer Experience Management to Core Network Operations, is developed as part of a bigger deal with Radio Access Network (RAN) vendors and forms a closed ecosystem. This leads to vendor lock-in (which is not that terrible, as only a handful of vendors exist anyway), but the bigger issue is that such a vendor needs to build its software to match the requirements of almost every telecom operator on the planet. These requirements are often contradictory, which leads to multiple compromises and roadmap delays.
Thus, tailor-made solutions are needed, as they simplify the daily workflow and reduce OPEX.
A New Approach To Developing Software
With the introduction of 5G comes a huge mindset switch. MNOs quickly noticed the business benefits of offloading less critical services out of their own data centres to the Cloud, and are opening up to modern IT practices such as DevOps and microservices. As a result, software developers from typical Cloud development shops are now exploring the world of mobile networks, and Network Engineers are learning how to better manage and automate the increasing complexity coming from yet another RAN technology.
In this article, I'll share my experience with machine learning and other disciplines hidden under the umbrella of data science, where I've seen the biggest benefits from MNO-Software House collaborations.
Here are three examples from my engagements with telecom operators from Q1 2020:
1. So much more data!
The fuel for every machine learning engineer is good and reliable data.
5G introduces many new data sources, significantly increasing data volumes and the speed at which they are generated. For some MNOs, handling this might be a challenge, but for an experienced data engineer it is bread and butter. By ensuring proper data collection, ETL, and data cleaning with tools like Scala, Spark, Snowflake, and Kafka, data engineers can quickly harness the huge amounts of data coming from the network.
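To make the cleaning step concrete, here is a minimal pure-Python sketch of the kind of validation a data engineer applies to raw network counter records before they reach any analysis. The field names and validity rules are illustrative assumptions, not a real 5G counter schema; in production this logic would typically run inside a Spark or Kafka pipeline.

```python
# Illustrative cleaning step for raw network counter records.
# Field names and rules are assumptions for the sketch, not a real schema.

def clean_records(records):
    """Drop records with missing cell IDs and clamp impossible counter values."""
    cleaned = []
    for rec in records:
        if not rec.get("cell_id"):           # a record without an ID is unusable
            continue
        throughput = rec.get("throughput_mbps", 0.0)
        if throughput < 0:                   # negative counters are collection glitches
            throughput = 0.0
        cleaned.append({"cell_id": rec["cell_id"],
                        "throughput_mbps": float(throughput)})
    return cleaned


raw = [
    {"cell_id": "gNB-001", "throughput_mbps": 812.5},
    {"cell_id": None,      "throughput_mbps": 55.0},   # dropped: no ID
    {"cell_id": "gNB-002", "throughput_mbps": -3.0},   # clamped to 0
]
print(clean_records(raw))
```

The same drop-or-repair decisions scale naturally to distributed tooling: each rule above maps directly onto a filter or map stage in a Spark job.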
Unfortunately, data engineering alone is not enough. To get insights out of this data, which can be used either to optimise the service or to increase customer satisfaction, additional help is needed. In initial engagements I usually see passionate network engineers turned data scientists working in their local notebooks; however, this usually means reinventing the wheel. What helps is introducing the ML Ops concept into such a way of working.
If you are not familiar with this approach, I would recommend starting with Martin Fowler's article on Continuous Delivery for Machine Learning (CD4ML).
The End-to-End CD4ML Process – source: martinfowler.com
Implementing ML Ops allowed one APAC operator to triple the number of analyses its data science team was able to produce, and the project took only two months to complete.
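One core ML Ops idea that moves teams beyond local notebooks is reproducibility: every trained model is tagged with the exact data and configuration that produced it, so any analysis can be rerun rather than rediscovered. The toy sketch below illustrates that idea; the hashing scheme is an illustrative assumption, not a standard.

```python
# Toy illustration of one ML Ops idea: bind each model artifact to a
# deterministic fingerprint of its training data and parameters.
# The fingerprint scheme here is an assumption for the sketch.
import hashlib
import json

def model_fingerprint(dataset_rows, training_params):
    """Deterministic ID tying a model to its training data and config."""
    payload = json.dumps({"data": dataset_rows, "params": training_params},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

fp1 = model_fingerprint([[1, 2], [3, 4]], {"lr": 0.1})
fp2 = model_fingerprint([[1, 2], [3, 4]], {"lr": 0.1})
fp3 = model_fingerprint([[1, 2], [3, 4]], {"lr": 0.2})
print(fp1 == fp2, fp1 == fp3)   # same inputs -> same ID; changed config -> new ID
```

Real CD4ML pipelines use dedicated tooling for this (experiment trackers, data version control), but the principle is the same: identical inputs yield an identical, auditable identifier.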
2. Self-Organising Networks (SON)
In the era of 5G, Self-Organising Networks are even more important than before, as there are more sites, antennas, and RAN features that need optimising.
Although equipment vendors provide SON solutions dedicated to their own network elements, there is still a strong need for network-level Self-Organising Networks, which may be described as SON Coordinator components. They gather data from all vendor-specific systems and coordinate them to achieve an optimal configuration from the entire network's perspective.
Increasing network complexity
Imagine that your d-SON (distributed SON) function running on a base station from VendorA operates on a neighbour cell from VendorB that needs handover optimisation triggered by its c-SON (centralised SON). Such an approach enables scenario-based SON, user-group or device-group-based SON, and integrations between network management and CEM that propose optimisations automatically based on user complaints. These advanced SON features require significant computing power; in one engagement, utilising public cloud ML services like AWS SageMaker lowered the OPEX of the SON coordination system fourfold.
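The essential job of a SON Coordinator is conflict resolution: several vendor-specific SON systems may propose different changes to the same cell parameter, and the coordinator must pick one winner from the whole network's perspective. The sketch below shows one simple rule for this; the priority ordering (network-wide c-SON overrides local d-SON) and the proposal format are illustrative assumptions, not a standardised scheme.

```python
# Minimal sketch of a SON-coordinator conflict rule: keep only the
# highest-priority proposal per (cell, parameter) pair.
# Priority ordering and proposal format are assumptions for the sketch.

PRIORITY = {"c-SON": 2, "d-SON": 1}   # assumed: centralised view outranks local view

def coordinate(proposals):
    """Resolve conflicting proposals, one winner per (cell, parameter)."""
    winners = {}
    for p in proposals:
        key = (p["cell"], p["param"])
        current = winners.get(key)
        if current is None or PRIORITY[p["source"]] > PRIORITY[current["source"]]:
            winners[key] = p
    return winners

props = [
    {"cell": "cell-7", "param": "tilt", "value": 4, "source": "d-SON"},
    {"cell": "cell-7", "param": "tilt", "value": 6, "source": "c-SON"},  # wins
]
print(coordinate(props)[("cell-7", "tilt")]["value"])   # -> 6
```

Real coordinators weigh far richer context (KPIs, scenarios, user groups), which is exactly where the cloud ML capacity mentioned above comes in.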
3. Smart Tokenisation
One might now ask whether any data is safe when offloaded to the public Cloud. Luckily, there is an IT solution for that: smart tokenisation. It amounts to storing the raw data in an encrypted place, while a tokenisation function residing on top generates anonymised yet still meaningful data to mask and protect the original, sensitive content.
- Name and Surname — real looking — but fake
- Address — valid only in terms of general location
- Email — a fake alias, but with a valid DNS domain
- MSISDN — totally fake
Tokenisation can store unique checksums, IDs, and encryption keys in separately secured places and monitor possible changes. It can catch, in near real time, any attempt at data tampering. All information is rotated quickly (with newly generated fake data). Such an approach enables:
- Data tracking — near-real-time monitoring and auditability, showing each change, source and difference compared to raw information. It provides on-the-go security of the data platforms.
- Data sharing — inside and among organisations — any sensitive information is masked or aliased, having meaning only to the initial data keeper.
- Subscriber protection — real-time monitoring over subscriber and aliases allows implementing additional security mechanisms like content filtering, attack protections, leakage blockers, etc.
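The tokenisation and tamper-detection ideas above can be sketched in a few lines. Here the MSISDN is replaced by a deterministic pseudonym derived with a secret key, and a separately stored checksum reveals any later modification of the tokenised record. The key handling, field layout, and number format are illustrative assumptions.

```python
# Sketch of smart tokenisation: keyed pseudonyms plus a separately
# stored checksum for tamper detection. Key handling and record layout
# are assumptions for the sketch, not a production design.
import hashlib
import hmac
import json

SECRET_KEY = b"kept-in-a-separate-secure-store"   # illustrative placeholder

def tokenise_msisdn(msisdn):
    """Fake but stable 11-digit number derived from the real MSISDN."""
    digest = hmac.new(SECRET_KEY, msisdn.encode(), hashlib.sha256).hexdigest()
    return "00" + str(int(digest, 16))[:9]        # format-valid, totally fake

def checksum(record):
    """Fingerprint of the tokenised record, to be stored separately."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

record = {"msisdn": tokenise_msisdn("+48123456789"), "plan": "5G-unlimited"}
stored_checksum = checksum(record)                 # kept in a secured place

record["plan"] = "free"                            # simulated tampering
print(checksum(record) == stored_checksum)         # -> False: tampering detected
```

Because the pseudonym is keyed, the same subscriber always maps to the same fake number for the data keeper, yet the mapping is meaningless to anyone without the key.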
Such an approach allowed operators to expose more valuable data via the 5G NEF (Network Exposure Function) and monetise insights coming from network traffic while preserving subscriber privacy. It can be used even now to share population-movement data with researchers fighting COVID-19 without privacy risks.
This concludes our list of practical examples from the last quarter. As you can see, combining the broad telecom network knowledge of operator engineers with the agile IT practices of Software Houses brings clear benefits on the way to a closer, fully interconnected world.
If you’d like to know more, don’t hesitate to contact us at email@example.com