Scientists are developing artificial intelligence (AI) models that could help next-generation wireless networks such as 6G deliver faster and more reliable connections.

In a study published in the December 2024 issue of IEEE Transactions on Wireless Communications, researchers detailed an AI system that reduces the amount of information that needs to be sent between a device and a wireless base station, such as a cell tower, by focusing on key parameters such as angles, delays and signal strength.

By optimizing signal data in wireless networks that use the high-frequency millimeter-wave (mmWave) bands of the electromagnetic spectrum, the researchers significantly reduced connectivity errors. The AI system also improved data reliability and connectivity in diverse environments, such as urban areas with moving traffic and pedestrians.
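The payoff of the parametric approach can be illustrated with a toy model. In the sketch below (which is only an illustration, not the authors' actual system, and uses a simplified geometric channel with made-up dimensions), the device feeds back one angle, delay and complex gain per dominant path instead of every channel coefficient, and the base station rebuilds the channel from those few numbers:

```python
import numpy as np

N_ANT = 64       # base-station antennas (illustrative)
N_SUB = 128      # OFDM subcarriers (illustrative)
N_PATHS = 3      # dominant mmWave propagation paths

def steering(angle, n=N_ANT):
    # Uniform-linear-array response toward one arrival angle.
    return np.exp(1j * np.pi * np.arange(n) * np.sin(angle))

def channel(angles, delays, gains):
    # Geometric channel: a sum of a few paths over antennas x subcarriers.
    h = np.zeros((N_ANT, N_SUB), dtype=complex)
    freqs = np.arange(N_SUB) / N_SUB
    for a, d, g in zip(angles, delays, gains):
        h += g * np.outer(steering(a), np.exp(-2j * np.pi * freqs * d))
    return h

rng = np.random.default_rng(0)
angles = rng.uniform(-np.pi / 3, np.pi / 3, N_PATHS)
delays = rng.uniform(0, 16, N_PATHS)
gains = rng.normal(size=N_PATHS) + 1j * rng.normal(size=N_PATHS)

h_true = channel(angles, delays, gains)

# Full CSI feedback: every complex coefficient (real + imaginary parts).
full_feedback = h_true.size * 2
# Parametric feedback: angle, delay and complex gain for each path.
param_feedback = N_PATHS * 4

# The base station reconstructs the channel from the parameters alone.
h_rebuilt = channel(angles, delays, gains)
err = np.linalg.norm(h_true - h_rebuilt) / np.linalg.norm(h_true)

print(full_feedback, param_feedback, err)
```

Under these toy assumptions the feedback shrinks by a factor of more than a thousand while the channel is reconstructed exactly; in practice the parameters must themselves be estimated and quantized, which is where the learning comes in.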

“To address the rapidly growing data demand in next-generation wireless networks, it is essential to leverage the abundant frequency resource in the mmWave bands,” said the lead author of the study, Byungju Lee, a professor in the telecommunications department at Incheon National University, South Korea.


“Our method ensures precise beamforming, which allows signals to connect seamlessly with devices, even when users are in motion,” said Lee.

Smarter ways to shape waves

The current challenge for networks that use high-frequency radio spectrum like mmWaves is that they rely on a large group of antennas working together through massive multiple-input multiple-output (MIMO). The process needs precise information — referred to as “channel state information” (CSI) — to deliver connectivity between base stations and mobile devices with compatible antennas.

This situation is further complicated by changes to a network's environment, such as antennas moving with people and traffic, or obstructions in the line of sight between devices and cell towers. This leads to "channel aging" — a mismatch between the predicted channel state and its actual state, which degrades performance by reducing data throughput and signal quality.
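A small numerical sketch shows why channel aging hurts. In the toy example below (illustrative only, with made-up angles and antenna counts), a base station forms a beam using the angle it measured earlier, but the user has since moved a few degrees, so much of the beamforming gain is lost:

```python
import numpy as np

N_ANT = 64  # base-station antennas (illustrative)

def steering(angle):
    # Unit-norm uniform-linear-array steering vector.
    v = np.exp(1j * np.pi * np.arange(N_ANT) * np.sin(angle))
    return v / np.sqrt(N_ANT)

def beam_gain(beam, chan):
    # Received power when `beam` is pointed at channel `chan`.
    return abs(np.vdot(beam, chan)) ** 2

angle_old = np.deg2rad(30.0)   # user's angle when CSI was measured
angle_now = np.deg2rad(33.0)   # user's angle after moving

h_now = steering(angle_now) * np.sqrt(N_ANT)   # current channel

fresh = beam_gain(steering(angle_now), h_now)  # up-to-date CSI
stale = beam_gain(steering(angle_old), h_now)  # aged CSI

print(fresh, stale, stale / fresh)
```

With 64 antennas the beam is so narrow that a 3-degree shift costs most of the received power, which is why predicting how the channel evolves matters as much as measuring it.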

To overcome such challenges, the study's authors used a newer kind of AI model known as a transformer. Convolutional neural networks (CNNs) have commonly been used to help predict and optimize wireless network traffic by recognizing and classifying signal patterns.

But the researchers took a different approach: by using a transformer model instead of a CNN in their network analysis method, both short- and long-term patterns in signal changes could be tracked. As a result, the AI system, dubbed “transformer-assisted parametric CSI feedback”, could make real-time adjustments in the wireless network to improve the connection quality between a base station and a user, even if the latter was moving quickly.

The improvement is explained by the difference between CNNs and transformers. Both are neural network models that analyze visual patterns such as images — in this case, patterns on the electromagnetic spectrum — but CNNs tend to be trained on smaller datasets and focus on “local” features, whereas transformer models use larger datasets and have a self-attention mechanism that enables them to determine the importance of different input elements and their relationships at a global and local level.

In simple terms, a transformer model will learn about an image as a whole, while a CNN has a bias toward features like edges and textures. Transformers see the bigger picture, so to speak.
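The self-attention mechanism behind that "bigger picture" view can be sketched in a few lines. The toy example below (a generic scaled dot-product attention demonstration with random weights, not the authors' model) shows that every output time step draws on every input time step — each row of the attention map is a full set of positive weights summing to 1, whereas a convolution would only mix a few neighboring steps:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(1)
T, D = 8, 4                    # time steps, feature dimension (illustrative)
X = rng.normal(size=(T, D))    # e.g. a short history of channel snapshots

# Random matrices stand in for the learned query/key/value projections.
Wq, Wk, Wv = (rng.normal(size=(D, D)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

A = softmax(Q @ K.T / np.sqrt(D))  # T x T attention map
out = A @ V                        # each output mixes ALL time steps

print(A.shape, A.sum(axis=1))      # rows sum to 1; every entry is nonzero
```

Because every entry of the attention map is nonzero, a change anywhere in the input history can influence every output — the global view that lets a transformer track both short- and long-term signal trends.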

Transformer models are more computationally demanding than CNNs, however. But if they can deliver robust next-generation wireless networks, they could be the key to high-speed wireless communication in the near future.
