Abstract

6G – sixth generation – is the latest cellular technology, currently under development for wireless communication systems. In recent years, machine learning (ML) algorithms have been widely applied in fields such as healthcare, transportation, energy, and autonomous vehicles. These algorithms have also been used in communication technologies to improve system performance in terms of frequency spectrum usage, latency, and security. With the rapid development of ML techniques, especially deep learning (DL), it is critical to consider security when applying these algorithms. While ML algorithms offer significant advantages for 6G networks, the security of artificial intelligence (AI) models has so far been largely ignored by the scientific community. However, security is a vital aspect of AI algorithms, because attackers can poison the AI model itself. This paper proposes a mitigation method, based on adversarial training, against adversarial attacks on 6G ML models for millimeter-wave (mmWave) beam prediction. The main idea behind adversarial attacks on ML models is to produce faulty results by manipulating trained DL models, here DL models for 6G mmWave beam prediction. We evaluate the proposed adversarial-learning mitigation method's performance for 6G security in the mmWave beam prediction application under a fast gradient sign method (FGSM) attack. The results show that the mean squared error (i.e., the prediction accuracy) of the defended model under attack is very close to that of the undefended model without attack.
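The attack and defense described above can be illustrated with a minimal, hypothetical sketch: a toy linear "beam predictor" trained with MSE stands in for the paper's DL model (which is not reproduced here), FGSM perturbs the inputs along the sign of the input gradient, and adversarial training mixes clean and perturbed samples at each step. All names, shapes, and the `eps` value are illustrative assumptions, not the paper's actual architecture or parameters.

```python
import numpy as np

# Toy stand-in for a DL beam predictor: a linear model trained with MSE.
rng = np.random.default_rng(0)
n, d = 200, 8
true_w = rng.normal(size=d)                  # "ground truth" mapping (assumed)
X = rng.normal(size=(n, d))                  # input features (e.g., channel data)
y = X @ true_w + 0.01 * rng.normal(size=n)   # targets (e.g., best-beam scores)

def mse(w, X, y):
    return float(np.mean((X @ w - y) ** 2))

def fgsm(w, X, y, eps):
    """FGSM: shift inputs by eps along the sign of the input gradient of MSE."""
    residual = X @ w - y                                     # per-sample error
    grad_X = 2.0 * residual[:, None] * w[None, :] / len(y)   # dMSE/dX
    return X + eps * np.sign(grad_X)

def train(X, y, adversarial=False, eps=0.3, lr=0.05, steps=500):
    """Gradient descent; optionally mixes FGSM-perturbed copies into each step."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        if adversarial:
            # Adversarial training: fit clean and attacked inputs jointly.
            Xb = np.vstack([X, fgsm(w, X, y, eps)])
            yb = np.concatenate([y, y])
        else:
            Xb, yb = X, y
        w -= lr * 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)       # dMSE/dw step
    return w

eps = 0.3
w_plain = train(X, y)                               # undefended model
w_robust = train(X, y, adversarial=True, eps=eps)   # defended model
print("undefended, clean MSE   :", mse(w_plain, X, y))
print("undefended, under attack:", mse(w_plain, fgsm(w_plain, X, y, eps), y))
print("defended, under attack  :", mse(w_robust, fgsm(w_robust, X, y, eps), y))
```

In this sketch the attack sharply raises the undefended model's MSE, while adversarial training trades a little clean-data accuracy for a much lower MSE under attack, mirroring the paper's qualitative finding.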

Highlights

  • We propose an adversarial-training-based mmWave beam prediction model that protects against fast gradient sign method (FGSM) adversarial machine learning (ML) attacks

  • We show that an undefended deep learning (DL)-based mmWave beam prediction model is vulnerable to carefully designed adversarial noise


Introduction

Cellular networking has been the most popular wireless communication technology of the last three decades (1G-2G in the early 1990s, 3G in the early 2000s, 4G in the 2010s, 5G in the 2020s), supporting high data rates over long distances for voice and data. The latest cellular technologies (4G/5G/6G) support increasingly high data rates, approximately 33.88 Mbps, 1,100 Mbps, and 1 Tbps, respectively, with latencies on the order of milliseconds. However, they still suffer from congestion and reduced network performance because the frequency spectrum is shared among mobile users. The introduction of 5G, with its super-fast data speeds, is a breakthrough and represents a significant transformation in mobile networking and data communication. It offers data transmission speeds up to 20 times faster than 4G networks and delivers less than a millisecond of data latency [2], [3], [4]. A key distinguishing feature of 5G is its use of a new technology called massive multiple-input multiple-output (MIMO), which spotlights users with multiple targeted beams [5], [6].
