Video here. Ahmed Alkhateeb and his students have advanced mmWave beam prediction, which they are using to track and communicate with drones. That isn't easy; 3GPP 5G requires a robust back channel to conform the cell signal to the receiver, and that's hard with a small, fast-moving drone.
Three years ago, Vodafone, Saudi Telecom, and (I believe) Verizon all promised drone tracking in the near future. None of them seems close to delivering it. But look at the video of the work a team of grad students has done.
They use visual and GPS tracking to predict the best beam, sidestepping most of the beam training sweep. This reduces the computation required and the heat generated.
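To make the idea concrete, here is a minimal Python sketch of the simplest position-aided baseline: given a GPS fix for the drone, pick the codebook beam whose steering angle points closest to it, instead of sweeping every beam. The function name, the 64-beam codebook, and the angles are illustrative assumptions of mine, not the team's actual pipeline (which learns the mapping with a neural network).

```python
import numpy as np

def predict_beam_from_position(bs_pos, drone_pos, codebook_angles):
    """Hypothetical position-aided lookup: choose the codebook beam
    whose steering angle is closest to the azimuth toward the drone."""
    dx, dy = drone_pos[0] - bs_pos[0], drone_pos[1] - bs_pos[1]
    azimuth = np.degrees(np.arctan2(dy, dx))  # bearing from base station to drone
    return int(np.argmin(np.abs(codebook_angles - azimuth)))

# Illustrative 64-beam codebook sweeping -90..90 degrees of azimuth.
codebook = np.linspace(-90.0, 90.0, 64)
beam = predict_beam_from_position((0.0, 0.0), (50.0, 50.0), codebook)  # drone at 45 deg
```

With a coarse GPS fix this narrows the search from 64 beams to a handful of candidates around the predicted index, which is where the computation (and heat) savings come from.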
The drone footage is dramatic but the dataset the Arizona team developed may be the larger contribution.
DeepSense 6G is a real-world multi-modal dataset of coexisting sensing and communication data, including mmWave wireless measurements, camera images, GPS positions, LiDAR, and radar, collected in realistic wireless environments.
Towards Real-World 6G Drone Communication: Position and Camera Aided Beam Prediction
Millimeter-wave (mmWave) and terahertz (THz) communication systems typically deploy large antenna arrays to guarantee sufficient receive signal power. The beam training overhead associated with these arrays, however, makes it hard for these systems to support highly-mobile applications such as drone communication. To overcome this challenge, this paper proposes a machine learning-based approach that leverages additional sensory data, such as visual and positional data, for fast and accurate mmWave/THz beam prediction. The developed framework is evaluated on a real-world multi-modal mmWave drone communication dataset comprising co-existing camera, practical GPS, and mmWave beam training data. The proposed sensing-aided solution achieves a top-1 beam prediction accuracy of 86.32% and close to 100% top-3 and top-5 accuracies, while considerably reducing the beam training overhead. This highlights a promising solution for enabling highly mobile 6G drone communications.
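The top-1/top-3/top-5 figures in the abstract are standard classification metrics: a prediction counts as correct if the true best beam is among the model's k highest-scoring beams. A generic sketch of that evaluation, with toy scores rather than the DeepSense data:

```python
import numpy as np

def top_k_accuracy(scores, true_beams, k):
    """Fraction of samples whose ground-truth beam index is among
    the k highest-scoring predicted beams."""
    topk = np.argsort(scores, axis=1)[:, -k:]  # indices of the k best beams per sample
    hits = [t in row for t, row in zip(true_beams, topk)]
    return float(np.mean(hits))

# Toy example: 1 sample, 3 beams (illustrative numbers only).
scores = np.array([[0.1, 0.9, 0.5]])   # model's per-beam scores
true_beams = np.array([0])             # ground-truth best beam
print(top_k_accuracy(scores, true_beams, 1))  # → 0.0 (beam 0 not in top-1)
print(top_k_accuracy(scores, true_beams, 3))  # → 1.0 (beam 0 is in top-3)
```

Near-100% top-3 accuracy matters in practice: it means the system only has to beam-train over three candidate beams rather than the whole codebook.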