Inventing the future
With 4G only just reaching commercial maturity, 5G remains very much to be defined. But in this glimpse of the future, MCI looks at the access and core technologies that may shape the industry of tomorrow.
June 10, 2014
In 1982 a then little-known writer and director, Steven Lisberger, captured public attention with a groundbreaking movie called Tron. The premise of the film was a warning that computers would eventually take over our lives, and that the best way to counter this eventuality would be through inside knowledge: by understanding computers so well that they would not be able to dominate humanity.
Lisberger had no way of knowing just how portentous his storyline would prove some 30 years later. But in a press interview at the film’s launch, Lisberger referenced a 1963 quote from the Hungarian scientist who first developed holography, Dennis Gabor, who said that “the best way to predict the future is to invent it.”
The communications industry is not short of predictions. By the end of this year, Informa’s WCIS+ forecasts almost 386 million LTE connections around the world. By the start of 2020, this number will have leapt to over 2.3 billion. This means 27 per cent of the predicted 8.4 billion mobile connections will be running on 4G technologies by 2020, with the balance largely on 3G.
In terms of consumption, the Cisco Visual Networking Index predicts that monthly global mobile data traffic will surpass 15 exabytes by 2018; that’s 1.8GB per connection per month and more than half of that will be over LTE, Cisco says. Indeed, by 2020, industry players like Nokia have said they expect to be helping operators to deliver 1GB to every subscriber every day.
The future, it would seem, is in mobile broadband. And while this prediction would seem elementary, a cynic might say that in the case of the mobile communications industry, invention is the mother of necessity.
Roughly every ten years, the industry goes through a technology step change, where ad hoc developments can be taken no further and a new generation of technology must be rolled out. The innovation here is driven by the vendor community, responsible for developing, installing, maintaining and upgrading that infrastructure to meet demand. In order to capitalise on their R&D investment, the vendors must convince the operators to buy into the latest generation of equipment. The fine line, as we witness the commercial maturity of LTE, lies between highlighting the shortcomings of previous technologies such as 3G and making future developments seem too far away to be worth holding out for. With this in mind, the vendor community was understandably cautious when MCI approached with questions on the status of 5G development.
But by happy coincidence there was good reason to prompt the industry heavyweights: in mid-May, Japanese carrier NTT Docomo farmed out contracts to several vendors to pilot 5G technologies.
Japan, like South Korea, has a very advanced mobile market and while not always the first to deploy new technologies, typically sees very rapid adoption and subsequent saturation. South Korea is expected to have a pilot 5G network available for the Winter Olympics in 2018 and commercial offerings by 2020, while Japan is also aiming for a 2020 launch of the technology.
“5G studies are starting to gain real momentum as we point toward 2020. We appreciate that 5G will provide significant performance enhancements to support future new applications that will impact both users and industry,” said Seizo Onoe, executive vice president and CTO at NTT Docomo, commenting on the pilot.
Alcatel-Lucent, Ericsson, Fujitsu, NEC, Nokia and Samsung have all been selected to work on a 5G proof-of-concept system, using the 15GHz frequency band for the air interface as well as exploring the potential of millimetre wave technology in the 70GHz spectrum band.
With 5G standards so undefined at present, there is a considerable focus on access network technologies, mainly as an evolution of what has been seen in LTE and LTE-Advanced. According to Dr Shahram G Niri, general manager of the 5GIC (5G Innovation Centre) at the University of Surrey in the UK, driving more data through scarce, finite and expensive radio spectrum is the real challenge. “I therefore believe a new RAN becomes the main agenda for 5G,” he says.
The 5GIC is one of several academic and vendor initiatives announced this year. The University of Surrey received funding from the UK government, major infrastructure vendors and mobile operators to establish the world’s first dedicated 5G programme: an international hub for telecommunication research and innovation with a unique, large-scale 5G test bed.
As Niri points out, an evolution in radio access has normally defined each new generation and standard: from analogue 1G to digital 2G with TDMA, to 3G with WCDMA and finally 4G with OFDMA and SC-FDMA. Each standard also brought with it major attributes, such as 2G for mobility and roaming, 3G for multimedia services and 4G for full IP and greater spectral efficiency.
“But it may not be an easy task to exactly define 5G just yet. Our industry is a fast moving and dynamic industry. What we could however say is that the future wireless broadband will be provided not by one standard but by a combination of several standards and technology families,” Niri says. “Depending on the services required and subject to underlying infrastructure and available spectrum, integrated 2G/3G/4G together with future releases of 3G and 4G (LTE-A and beyond) and 5G will be the pillars of future broadband service.”
Niri believes that 5G will be driven by capacity and quality with an evolution in radio access, with new waveforms and a leap forward primarily in spectral efficiency and to some extent latency.
But at the moment, the inclusion of brand new technologies is still up for debate.
Professor Simon Saunders, director of spectrum specialist Real Wireless, argues that we are long overdue a radio innovation, as 4G was not really a ‘new’ technology. “It was very much based on antenna developments as well as 3G and Wimax. It was really a decade-old technology when it came to commercialisation,” he says.
“So will 5G introduce new technology? What we’ve seen so far—even more massive MIMO and wider bandwidths—are not really anything new. I’m not sure the pace of innovation has been that exciting to date.”
Saunders does grant, however, that very recent and as yet unverified technologies “might be something different”. Here he refers to Californian startup Kumu Networks, which is pitching wireless full duplex technology that is neither TDD nor FDD: instead it transmits and receives simultaneously on the same frequency through the same antenna.
Kumu claims to have developed technology that cancels self-interference, the “unwanted” energy that leaks into a radio’s receiver while transmitting. As a result of the cancellation, the receiver hears no noise from its transmitter, freeing it to cleanly receive external signals.
The physics are theoretically sound, as the waves simply pass through each other, so why has this not been done before? It was an engineering problem, one that Kumu only recently claims to have solved. Because a device has to transmit at very high power while listening for a very weak signal, it effectively deafens itself. It’s like trying to listen while shouting.
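In digital terms the principle is straightforward, even if the engineering is not: the radio already knows exactly what it is transmitting, so it can estimate how that waveform leaks into its own receiver and subtract the estimate, leaving the weak far-end signal behind. The following minimal numerical sketch illustrates only that cancellation idea; it makes no claim to represent Kumu’s actual, unpublished implementation, and the signal levels are arbitrary.

import numpy as np

# Toy illustration of digital self-interference cancellation: the radio knows
# its own transmit waveform, estimates the leakage path into its receiver by
# least squares, and subtracts the predicted leakage. Not Kumu's method.
rng = np.random.default_rng(0)
n = 10_000

tx = rng.standard_normal(n)                                     # known transmit waveform
leak = 0.9 * tx + 0.3 * np.roll(tx, 1) + 0.1 * np.roll(tx, 2)   # self-interference path
wanted = 1e-3 * np.sin(2 * np.pi * 0.01 * np.arange(n))         # weak far-end signal
rx = leak + wanted + 1e-4 * rng.standard_normal(n)              # what the receiver hears

# Estimate the leakage taps from the known transmit signal, then cancel
X = np.column_stack([np.roll(tx, k) for k in range(3)])
h_hat, *_ = np.linalg.lstsq(X, rx, rcond=None)
cleaned = rx - X @ h_hat

print("interference-to-signal ratio before:", np.var(leak) / np.var(wanted))
print("interference-to-signal ratio after: ", np.var(cleaned - wanted) / np.var(wanted))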
At Bell Labs, Alcatel-Lucent’s R&D subsidiary, Tod Sizer, the head of wireless R&D, has been looking at a similar technology dubbed Universal Filtered Multi-Carrier (UFMC). “It used to be that you used a separate frequency for each application, for example M2M or video downloading, and this is a very inefficient use of spectrum resources,” he says. “So with this new type of air interface you can have devices that send data once in a while sitting with those that are streaming downloads in the same spectrum band. It’s much more scalable and you can dynamically change the way you treat different traffic flows.”
So with full duplex, which is a key attribute of UFMC, any given bit of spectrum could be used for both sending and receiving. According to Saunders: “If you had a small cell on a lamppost feeding users on the ground, it used to be that an operator needed separate spectrum to backhaul that connectivity. With full duplex however, that connectivity could, theoretically, be backhauled over the same spectrum.”
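The resource-sharing principle Sizer describes, with sporadic machine traffic and streaming traffic coexisting in one band and allocations revisited every scheduling interval, can be sketched with a toy allocator like the one below. It illustrates only the scheduling idea; UFMC itself is a physical-layer waveform, and the flow names and sub-band counts here are invented for illustration.

from dataclasses import dataclass

@dataclass
class Flow:
    name: str
    kind: str        # "m2m" or "streaming"
    backlog_kb: int  # data waiting to be sent

SUBBANDS = 50        # e.g. 50 narrow sub-bands sharing one carrier

def allocate(flows):
    # Give each active M2M flow a single narrow sub-band; split the remainder
    # among streaming flows in proportion to how much data they have queued.
    alloc = {}
    m2m = [f for f in flows if f.kind == "m2m" and f.backlog_kb > 0]
    for f in m2m:
        alloc[f.name] = 1
    left = SUBBANDS - len(m2m)
    streams = [f for f in flows if f.kind == "streaming" and f.backlog_kb > 0]
    total = sum(f.backlog_kb for f in streams) or 1
    for f in streams:
        alloc[f.name] = max(1, round(left * f.backlog_kb / total))
    return alloc

flows = [Flow("meter-17", "m2m", 2), Flow("car-telemetry", "m2m", 0),
         Flow("video-a", "streaming", 4000), Flow("video-b", "streaming", 1000)]
print(allocate(flows))   # {'meter-17': 1, 'video-a': 39, 'video-b': 10}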
But Lauri Oksanen, VP of research and technology for networks at Nokia, is more sceptical about full duplex. “It’s a well known technology but until now the implementation has been so costly and complex that it has not been worthwhile. Although there have been some advances and we are considering whether this could be one of the advances to consider,” he says.
“The main issue is that if you want to deploy it on existing bands then you need to introduce new devices as you can’t have legacy devices and new devices operating in the same band. It still needs to be proven how much of a gain you think you get and at what cost.”
While wifi is a mature technology with a sound use case, it deals terribly with interference and, when a wifi network gets flooded, it slows down. “But this could be avoided with antennas that are transmitting at the same time as listening, and this can make a big difference to the network,” says Saunders. “Regulators would also have less trouble deciding on FDD and TDD packages of spectrum as operators would be able to use both previously allocated swathes at once,” he adds.
It’s no secret that technical specifications are largely irrelevant to end users. As long as they have a connection, it doesn’t matter what technology the bearer is. So in this vein carriers need to create the illusion for the user that capacity is unlimited, and the trick is to figure out how to move resources around the network and get them into close proximity to the users.
“We need the network to be flexible to different types of applications,” says Sizer. “So with an application that requires low latency such as car to car connectivity or gaming, time matters—so the network would adapt for the flow from that particular user to handle the processing and put it geographically closer in the network in a nearby data centre.
“But for a video download where latency isn’t a concern then you can route that connection to a more inefficient data centre further away and it still wouldn’t affect performance,” he says.
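A crude sketch of that placement logic might look like the following: pick the cheapest data centre that still meets the flow’s latency budget. The site names, round-trip times and relative costs here are hypothetical.

# Sketch of latency-aware placement: serve a flow from the cheapest data
# centre that still meets its latency budget. Site names, round-trip times
# and costs are hypothetical.
SITES = [
    {"name": "metro-edge",  "rtt_ms": 5,  "cost": 3.0},
    {"name": "regional-dc", "rtt_ms": 25, "cost": 1.5},
    {"name": "national-dc", "rtt_ms": 60, "cost": 1.0},
]

def place(latency_budget_ms):
    candidates = [s for s in SITES if s["rtt_ms"] <= latency_budget_ms]
    if not candidates:
        raise ValueError("no site can meet this latency budget")
    return min(candidates, key=lambda s: s["cost"])["name"]

print(place(10))   # car-to-car or gaming flow  -> metro-edge
print(place(200))  # background video download  -> national-dc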
With 4G developments the industry’s key focus was on capacity, but with 5G the key focus is expected to be on capacity density. The ‘personal cell’ is a term that gets a lot of air time in this context.
A real-world device has maybe six radios built into it, and around three are active at any time: wifi, cellular and Bluetooth, for example. So it’s feasible that the device should be able to connect to multiple radios at once in order to increase capacity, operating on a dynamic, application-aware basis, which returns to the premise that the user shouldn’t care whether the connection is wifi, cellular or any other bearer.
In this vision, every user gets their own cell, so there are no limitations on bandwidth and one subscriber’s usage doesn’t hurt anyone else’s usage of the spectrum because everybody is reusing the same spectrum. To increase the capacity on wired networks the carrier would simply add another wire, so to increase the capacity on wireless networks the carrier simply adds another access point. If you don’t run out of spectrum, a network strategy that allows you to keep building more cells suddenly becomes viable.
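The arithmetic behind that claim is simple: if every cell reuses the same spectrum, area capacity grows roughly in proportion to the number of cells, so per-user throughput climbs as access points are added. A back-of-envelope illustration, with all figures assumed rather than measured:

# Why densification scales capacity: every cell reuses the same spectrum, so
# area capacity grows with the cell count. All figures below are assumptions.
bandwidth_mhz = 20
spectral_efficiency = 1.5   # bit/s/Hz, assumed average
users = 1000                # users in the covered area

for cells in (1, 10, 100, 1000):
    area_capacity_mbps = cells * bandwidth_mhz * spectral_efficiency
    print(f"{cells:5d} cells -> {area_capacity_mbps / users:7.2f} Mbit/s per user")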
Sara Mazur, vice president and head of Ericsson Research, expects that for this reason small cells will become more important in terms of network topology, but also notes that changes in the core network are starting to happen now as well. “We need short latency and the ability to execute functionality close to end users out on the access network, but we also need the ability to execute functionality in the core when needed,” she says. “The evolution of the core is starting now with the adoption of cloud technology in the execution environment, and we’re seeing more of the network environment virtualised.”
In this respect, Japanese carrier NTT Docomo is once again at the forefront, following up its 5G trial announcement with the successful virtualisation of the Evolved Packet Core (EPC) in joint verification tests with Alcatel-Lucent, Cisco and NEC, to support the functions required for Network Functions Virtualisation (NFV).
The aim is to enable faster delivery of new telecom services and boost performance by applying virtualisation technology to the EPC software that handles LTE data communication functions. The test results, according to Docomo, confirmed the EPC’s ability to adaptively boost processing capacity, through controls from the system that manages the EPC, in response to how much data customers use, lending credence to the aforementioned concept of the reactive network.
Docomo also checked the platform’s performance during a hardware failure, when a backup structure was quickly and automatically constructed using different hardware in order to sustain stable data communications, effectively rerouting the connection.
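In outline, the behaviour Docomo describes amounts to a reconciliation loop: add packet-core instances when per-instance load climbs, remove them when it falls, and respawn any failed instance on different hardware. The sketch below shows that loop in schematic form; the thresholds and data structures are assumptions, not details of the trial system.

# Sketch of the control behaviour described for a virtualised EPC: scale
# packet-core instances with load and respawn failed instances on different
# hardware. Thresholds and data structures are assumptions.
def reconcile(instances, load_per_instance, max_load=0.7, min_load=0.3):
    # Return the desired number of healthy instances and the hosts to avoid.
    healthy = [i for i in instances if i["healthy"]]
    failed_hosts = {i["host"] for i in instances if not i["healthy"]}

    desired = len(healthy)
    if load_per_instance > max_load:
        desired += 1                              # scale out as traffic grows
    elif load_per_instance < min_load and desired > 1:
        desired -= 1                              # scale in as traffic falls
    desired += len(instances) - len(healthy)      # replace every failed instance

    return desired, failed_hosts                  # place new VMs off failed_hosts

instances = [{"host": "hw-1", "healthy": True},
             {"host": "hw-2", "healthy": False}]
print(reconcile(instances, load_per_instance=0.85))   # (3, {'hw-2'})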
“NFV is highly expected to bring changes in the ecosystem of network industries,” said Seizo Onoe. “Nonetheless, unless there’s a high degree of collaboration between players, this would end up being pie in the sky.”
Collaboration has long been acknowledged as central to technology development, as evidenced by the patent rushes accompanying any new generation of wireless, but never before has collaboration between stakeholders been so important. This time around it’s because the interested parties extend beyond the traditional wireless sector.
5G is not only expected to act as a driver for development in the telco sector but will also provide impetus to deeper integration with other industry verticals.
METIS, the EC-funded Mobile and Wireless Communications Enablers for the Twenty-twenty Information Society, is another European consortium aiming to lay the foundations for 5G, through collaborations spanning telecommunications manufacturers, network operators, the automotive industry and academia.
Essential services that fall under the project’s remit include ebanking, elearning and ehealth, areas in which METIS expects an avalanche of mobile and wireless traffic, increasing a thousand-fold over the next decade. Traffic will be driven by a mix of communication between humans and machines that need to access and share information efficiently, comfortably and safely, as the advent of the Internet of Things ushers in tens of billions of connected devices.
Simon Saunders notes that the customer base for his consultancy is expanding into enterprise territory because “every business needs wireless but most businesses don’t understand it. So we need to decode things both ways between the different communities. Businesses are not sure what they need wireless for or what they need to deliver a specific application. Although we work with operators, vendors and regulators we also get lots of business from wireless users like Wembley Stadium,” he says.
Whereas 3G and 4G were both designed with multimedia consumption in mind, the true potential of the Internet of Things will not be realised until it gets a carrier designed with its unique features in mind.
“If you want a joined up technology that is harmonised and operators could play a role in, then the last thing to do is start from LTE in terms of using it as the optimum technology for M2M, because it’s designed for high data rates and speeds, capacity density, and high-end devices with short battery life,” says Saunders.
M2M channels carry only bytes of data at the moment, and given the nature of signalling on LTE, all that chatter means there is more traffic on the signalling channel than on the payload, which is inefficient in terms of resource usage. LTE works with many frequencies, time slots and resource allocations, and work is now underway to ensure that, from time to time, some of those resources are reserved for a different air interface suited to M2M, and that devices with a battery life of ten to 15 years know when to wake up and go back to sleep in order to conserve power.
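The battery arithmetic shows why this matters: a device that spends almost all of its life asleep and wakes only briefly to report can stretch a small battery across a decade or more. A back-of-envelope sketch, with all currents and timings assumed for illustration:

# Back-of-envelope duty cycle for a reporting device that sleeps almost all
# the time. All currents and timings are assumptions, not measured figures.
battery_mah = 2000          # e.g. a pair of AA-sized lithium cells
sleep_ma = 0.005            # deep-sleep current
active_ma = 100             # radio awake, sending a short report
active_s = 1.5              # seconds awake per report
reports_per_day = 8

awake_s = reports_per_day * active_s
avg_ma = (active_ma * awake_s + sleep_ma * (86_400 - awake_s)) / 86_400
years = battery_mah / avg_ma / 24 / 365
print(f"average draw {avg_ma:.4f} mA -> roughly {years:.1f} years on one battery")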
“By the end of this decade there will be more connected machines than connected humans, maybe ten times as many, with very different requirements,” says Nokia’s Oksanen. “We’re working in 3GPP to make LTE more M2M friendly but keeping backwards compatibility in mind there is only so much you can do, so M2M will be a big feature of 5G,” he says.
Despite the lack of a defined 5G standard, what’s clear is that mobile networks are changing. Techniques that originate from voice-driven networks are no longer viable, and LTE is providing just a glimpse of ubiquitous broadband availability. While new breakthroughs are being made in access technology, disruptive innovations in IT are poised to radically change networks from the core outwards through SDN and NFV, intelligent network technologies that at one time seemed as incomprehensible as the digital world depicted in Tron.
The major challenge will be to seamlessly and gradually migrate today’s fragmented, multi-generation and multivendor networks to these new technologies and concepts whilst still running several generations of legacy equipment. But as the industry has made clear, the future is unwritten.