Ericsson and Telstra claim ‘world-first’ RAN compute platform deployment
Ericsson and Australian telco Telstra are claiming a ‘world-first’ with the deployment of the fourth generation of Ericsson’s purpose-built RAN compute platform.
August 30, 2024
Telstra deployed Ericsson’s RAN Processor 6672 in a baseband pooling, or Centralised RAN (C-RAN), configuration, which it says delivers more than three times the capacity of the previous generation.
Ericsson says RAN Compute units can handle all the digital signal processing tasks of the RAN, including modulation, demodulation, encoding, decoding and scheduling of a user’s LTE and NR traffic. A more advanced RAN Compute platform allows more data to be processed simultaneously, we’re told.
Telstra is apparently the first telco globally to test, validate, and operate commercial traffic on this RAN Compute platform within the C-RAN hubs that service multiple radio sites. In a C-RAN configuration, the new RAN processors ‘offer up to 60% lower energy consumption compared to a distributed deployment.’
“The deployment of our latest generation RAN compute platform with Telstra represents a significant global milestone in mobile technology,” said Emilio Romeo, Head of Ericsson Australia and New Zealand. “This breakthrough not only enhances current services but also prepares the network for future innovations, providing a more reliable, sustainable experience.”
Telstra’s Executive for Wireless Network Engineering, Sri Amirthalingam, added: “We aspire to give our customers a world-leading mobile experience and this technology will unlock new capabilities and support increased capacity in the network. With Ericsson’s support, it will help us meet our customers’ data needs more efficiently as they rely on their mobile for day-to-day tasks and is an important step in laying the foundations for 6G.”
A key part of the tech seems to be about supporting ‘advanced automation and AI/ML capabilities’. Compared to previous generations, the new RAN processors can host up to 20 times more pre-loaded AI models with higher inference capacity, so says the release.