Debunked: the 5 biggest data myths in telecoms
Telcos, while willing to devote resources to their data solutions, are seeing little of the return they need to justify that investment.
March 24, 2017
Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece Mark Samson, principal systems engineer at Hadoop specialist Cloudera, addresses five of the biggest concerns he thinks telcos have around big data.
Telecommunications is an industry that touches all of us, whether through our broadband networks, our mobile devices, or access to TV and entertainment. Telco companies allow us to speak to each other, to share our thoughts, and to do business with anyone in the world.
It has always been an information-rich industry, but shifts in technology have led to an explosion of new structured, unstructured and semi-structured data sources – everything from clickstream data and location data to mobile application logs, social media streams and sensor data.
Every operator knows that optimising the customer experience is a revenue opportunity. Offering a personalised product or service to a customer at the right time means increased relevance and a chance to improve engagement, revenue and brand loyalty. However, the data available to do so, while valuable, is highly siloed across the organisation and varied in format and schema. It is also expensive to store, process and manage, and therefore difficult to derive value from.
Telcos, while willing to devote resources to their data solutions, are seeing little of the return they need to justify that investment. This raises the question: when the opportunities are so huge, why aren’t telcos taking advantage? It’s high time to debunk the five biggest myths holding the telco industry back from realising the potential of its data:
Myth 1: My data is too diverse to analyse
My data resides across hundreds of systems and platforms. With so much complex structured, unstructured, and semi-structured data existing in silos across my organisation, there’s no way I can consolidate and centralise it all.
REALITY: “Big Data” is actually very diverse data coming from various sources, and it need not always be highly structured and ready to use. For telcos, the variety of data is particularly relevant: they work across many different business streams and have access not only to huge volumes of data, but also to very diverse datasets. This is where industry-leading Big Data platforms such as Apache Hadoop excel. With a growing set of open source components making up the Apache Hadoop ecosystem, telcos can now easily ingest, store, process and analyse large volumes of data, regardless of its source, format or size.
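By way of illustration, the minimal sketch below lands three very different sources – structured CDRs, semi-structured clickstream JSON and raw application logs – on a single Hadoop platform using Apache Spark. The paths, file layouts and formats are hypothetical examples, not a reference design.

```python
# A minimal, illustrative PySpark sketch of landing diverse telco data on one
# Hadoop platform. All paths and formats are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("telco-ingest").getOrCreate()

# Structured call detail records exported as CSV from a billing system
cdrs = spark.read.option("header", True).csv("hdfs:///landing/cdrs/*.csv")

# Semi-structured clickstream events as JSON from web and app front ends
clicks = spark.read.json("hdfs:///landing/clickstream/*.json")

# Unstructured application logs as raw text
logs = spark.read.text("hdfs:///landing/app_logs/")

# Persist everything in a common, query-friendly columnar format on HDFS
cdrs.write.mode("overwrite").parquet("hdfs:///warehouse/cdrs")
clicks.write.mode("overwrite").parquet("hdfs:///warehouse/clickstream")
logs.write.mode("overwrite").parquet("hdfs:///warehouse/app_logs")
```

Once the data sits in one place and one storage format, the same engines can query the billing records, the clickstream and the logs side by side, rather than in separate silos.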
Myth 2: Open source means open security
Any mention of “the cloud” or “open source” brings security to mind. Because Hadoop is open source, it’s perceived that security protocols governing data will naturally fall by the wayside.
REALITY: The Apache Hadoop ecosystem has come a long way since 2006. Powered by Apache Sentry, RecordService and other leading tools, Big Data platforms now provide comprehensive, enterprise-grade security and governance, enabling organisations to ensure data protection and compliance. These technologies deliver applications and frameworks for deploying, managing and integrating the data security controls demanded by today’s strictest regulatory environments and compliance policies.
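As a hedged sketch of what role-based access control with Sentry can look like in practice, the snippet below issues Sentry-style grant statements through HiveServer2 using the PyHive client. The host, role, group and database names are purely illustrative assumptions.

```python
# Illustrative sketch only: Sentry role-based access control issued via
# HiveServer2 with PyHive. Host, credentials, role, group and database
# names are placeholders, not a recommended configuration.
from pyhive import hive

conn = hive.connect(host="hiveserver2.example.com", port=10000, username="admin")
cur = conn.cursor()

# Define a role for fraud analysts and grant it read-only access to one database
cur.execute("CREATE ROLE fraud_analyst")
cur.execute("GRANT SELECT ON DATABASE telco_dw TO ROLE fraud_analyst")

# Map the role to an existing directory/OS group so membership is managed centrally
cur.execute("GRANT ROLE fraud_analyst TO GROUP fraud_team")
```

The point is that access is expressed as roles and groups at the platform level, rather than being left to each individual tool or dataset.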
Myth 3: Hadoop doesn’t provide adequate returns on investment
According to a Gartner survey, only 26% of respondents claimed to be deploying, piloting or experimenting with Hadoop. It’s seen as too costly and as “overkill” for the problems businesses face.
REALITY: This is probably the biggest myth of all. Depending on the scope and scale of operations, enterprises have been able to drive cost savings in the range of 35–90%, compared with traditional data management approaches, by moving to Hadoop. Telcos, in particular, are using Hadoop as the data management platform behind some of the most compelling use cases. One example of using Hadoop to derive value is reducing revenue leakage and fraud.
Based on industry estimates, telcos lose approximately 2.8% of their revenues to fraud annually – costing the industry approximately US$38 billion every year. Telcos could in effect be adding that amount to their bottom line without selling any additional products or services. To compete with and out-innovate cyberfraud criminals, leading telcos are using Apache Hadoop and machine learning algorithms to analyse large amounts of data and uncover fraud and revenue threats in real time.
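A simplified sketch of that kind of pipeline is shown below, scoring call detail records for fraud with Spark MLlib on Hadoop. The feature columns, table locations and the existence of a labelled training set are assumptions made for illustration; a real deployment would also handle streaming data, feature engineering and model validation.

```python
# Simplified, illustrative fraud-scoring sketch with Spark MLlib.
# Table paths, feature names and labels are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("fraud-scoring").getOrCreate()

# Historical CDRs with a known fraud / not-fraud label
train = spark.read.parquet("hdfs:///warehouse/cdrs_labelled")

features = ["call_duration", "intl_calls_per_hour",
            "distinct_destinations", "night_call_ratio"]
assembler = VectorAssembler(inputCols=features, outputCol="features")

# Train a simple classifier on the labelled history
model = LogisticRegression(labelCol="is_fraud", featuresCol="features") \
    .fit(assembler.transform(train))

# Score newly landed records; in practice this would run continuously
new_cdrs = assembler.transform(spark.read.parquet("hdfs:///warehouse/cdrs_new"))
scored = model.transform(new_cdrs).select("subscriber_id", "probability", "prediction")
scored.write.mode("overwrite").parquet("hdfs:///warehouse/fraud_scores")
```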
Myth 4: Hadoop is too difficult for my team to learn
Hadoop is very difficult to use and demands complex skill sets. Only a few skilled data scientists and analysts working for elite tech companies can derive real value from implementing Hadoop.
REALITY: Technological advancements have removed the perceived complexity. Every organisation should be able to realise the value of their data and learn the applicable skills to make the most of it. Telco companies are able to save time and money on headhunting data scientists by training up their existing teams. Today there are a variety of workshops and certification programs on offer, including inexpensive self-paced Hadoop Basics tutorials, so teams can quickly get up to speed.
Myth 5: Legacy systems are too ingrained and complex to work with yet another data solution
Legacy systems still determine much of how organisations manage, analyse and gain value from their data. Organisations are understandably reluctant to invest in open source tools as they don’t want to add more complexity to an unwieldy stack that has to work across multiple business units and respond to multiple platforms.
REALITY: Apache Hadoop works seamlessly with legacy systems to organise data across analytics and management platforms. This isn’t about throwing out legacy systems, but rather consolidating them and pulling in structured and unstructured data from unexpected sources. Hadoop-based solutions can help telcos process and analyse both structured and unstructured data going back several years, complementing, rather than jarring with, legacy systems. There may well be value in those legacy systems that isn’t being mined, simply because organisations don’t yet have the data platform in place to unlock it.
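One minimal sketch of that complementary approach – assuming a legacy billing database reachable over JDBC and clickstream data already on HDFS – is to read the legacy table in place and join it with the newer data in Spark. All connection details, table names and columns below are placeholders.

```python
# Illustrative sketch: complementing a legacy system rather than replacing it.
# The legacy customer table is read over JDBC (the appropriate JDBC driver jar
# must be on Spark's classpath) and joined with clickstream data on Hadoop.
# Connection string, credentials, tables and columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("legacy-join").getOrCreate()

# Read from the legacy relational system in place; no migration required
customers = (spark.read.format("jdbc")
             .option("url", "jdbc:oracle:thin:@legacy-db.example.com:1521/BILLING")
             .option("dbtable", "CUSTOMERS")
             .option("user", "readonly").option("password", "***")
             .load())

# Clickstream data already landed on the Hadoop platform
clicks = spark.read.parquet("hdfs:///warehouse/clickstream")

# Enrich recent digital activity with customer attributes held in the legacy system
activity = clicks.join(customers,
                       clicks.customer_id == customers.CUSTOMER_ID, "left")
activity.write.mode("overwrite").parquet("hdfs:///warehouse/customer_activity")
```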
Mark Samson is a principal systems engineer at Cloudera, helping customers solve their big data problems using enterprise data hubs based on Hadoop. He is a seasoned expert with over 18 years of experience. Prior to his role at Cloudera he was the big data software technical lead for Europe at IBM.